Sample records for complete methodology based

  1. Completion and Attrition Rates for Apprentices and Trainees, 2016. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2017

    2017-01-01

    This publication presents completion and attrition rates for apprentices and trainees using three different methodologies: (1) contract completion and attrition rates: based on the outcomes of contracts of training; (2) individual completion rates: based on contract completion rates and adjusted for factors representing average recommencements by…
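    As a rough sketch of the first of these methodologies, a contract-based rate is simply a share of contract outcomes; the outcome labels below are hypothetical, and NCVER's recommencement adjustments for individual rates are not modeled:

```python
# Minimal sketch of contract-based completion/attrition rates, assuming
# hypothetical outcome labels; NCVER's adjustment factors are not modeled.
from collections import Counter

contracts = ["completed", "withdrawn", "completed", "cancelled", "completed"]

counts = Counter(contracts)
total = sum(counts.values())
completion_rate = counts["completed"] / total                     # share completed
attrition_rate = (counts["withdrawn"] + counts["cancelled"]) / total

print(f"contract completion rate: {completion_rate:.1%}")
print(f"contract attrition rate:  {attrition_rate:.1%}")
```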

  2. Completion and Attrition Rates for Apprentices and Trainees 2014. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2015

    2015-01-01

    This publication presents completion and attrition rates for apprentices and trainees using three different methodologies: (1) contract completion and attrition rates: based on the outcomes of contracts of training; (2) individual completion rates: based on contract completion rates and adjusted for factors representing average recommencements by…

  3. Development of an Integrated Team Training Design and Assessment Architecture to Support Adaptability in Healthcare Teams

    DTIC Science & Technology

    2017-10-01

    to patient safety by addressing key methodological and conceptual gaps in healthcare simulation-based team training. The investigators are developing...primary outcome of Aim 1a is a conceptually and methodologically sound training design architecture that supports the development and integration of team...should be delivered. This subtask was delayed by approximately 1 month and is now completed. Completed Evaluation of existing experimental dataset to

  4. A top-down design methodology and its implementation for VCSEL-based optical links design

    NASA Astrophysics Data System (ADS)

    Li, Jiguang; Cao, Mingcui; Cai, Zilong

    2005-01-01

    In order to find the optimal design for a given specification of an optical communication link, an integrated simulation of the electronic, optoelectronic, and optical components of a complete system is required. It is very important to be able to simulate at both the system level and the detailed model level. This kind of model is feasible due to the high potential of the Verilog-AMS language. In this paper, we propose an effective top-down design methodology and employ it in the development of a complete VCSEL-based optical link simulation. The principle of the top-down methodology is that development proceeds from the system level to the device level. To design a hierarchical model for VCSEL-based optical links, the design framework is organized in three levels of hierarchy. The models are developed and implemented in Verilog-AMS, and the model parameters are fitted to measured data. A sample transient simulation demonstrates the functioning of our implementation. Suggestions for future directions in top-down methodology for optoelectronic systems technology are also presented.
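    For illustration, a device-level model of the kind such a top-down flow would refine at the bottom of the hierarchy can be sketched with textbook single-mode laser rate equations; this is not the paper's Verilog-AMS VCSEL model, and all parameter values are illustrative assumptions:

```python
# Sketch of a device-level laser model: textbook single-mode rate equations,
# NOT the paper's Verilog-AMS VCSEL model. Parameter values are illustrative.
from scipy.integrate import solve_ivp

q = 1.602e-19                # electron charge [C]
V = 1e-18                    # active volume [m^3]
tau_n, tau_p = 2e-9, 2e-12   # carrier / photon lifetimes [s]
g0, N_tr = 1e-12, 1e24       # gain slope [m^3/s], transparency density [m^-3]
Gamma, beta = 0.4, 1e-4      # confinement factor, spontaneous-emission factor

def rates(t, y, I):
    N, S = y
    G = g0 * (N - N_tr)                      # linear gain
    dN = I / (q * V) - N / tau_n - G * S     # carrier density
    dS = Gamma * G * S - S / tau_p + beta * Gamma * N / tau_n  # photon density
    return [dN, dS]

# drive with 2 mA, well above the ~0.2 mA threshold implied by these numbers
sol = solve_ivp(rates, (0, 5e-9), [N_tr, 1e18], args=(2e-3,), method="LSODA")
print(f"steady-state photon density ~ {sol.y[1, -1]:.3e} m^-3")
```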

  5. Physical Therapy-Related Child Outcomes in School: An Example of Practice-Based Evidence Methodology.

    PubMed

    Effgen, Susan K; McCoy, Sarah Westcott; Chiarello, Lisa A; Jeffries, Lynn M; Bush, Heather

    2016-01-01

    To describe the use of practice-based evidence research methodology in a prospective, multisite observational study to investigate changes in students' participation in school activity, self-care, posture/mobility, recreation/fitness, and academic outcomes, and the relationships of these changes to characteristics of school-based physical therapy. One hundred nine physical therapists completed the training and data collection, and 296 students, 5 to 12 years of age (mean age = 7.3 years), had 6 months of complete data. Therapists completed individualized (Goal Attainment Scaling) and standardized (School Function Assessment) outcome measures for students at the beginning and end of the school year, and during the year they collected weekly data on services to and on behalf of the students. This research design enabled the investigation of complex research questions related to school-based practice. The findings of this study, to be reported later, should influence school-based therapy by providing guidance related to what activities, interventions, and services influence student outcomes.
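    The abstract names Goal Attainment Scaling (GAS) as the individualized measure; a common way to summarize GAS is the Kiresuk-Sherman T-score, sketched below (the study's exact scoring procedure is not stated, so this formula is an assumption):

```python
# Hedged sketch: Goal Attainment Scaling summarized with the conventional
# Kiresuk-Sherman T-score; not necessarily the study's scoring procedure.
import math

def gas_t_score(attainments, weights, rho=0.3):
    """attainments: goal scores on the -2..+2 GAS scale; rho: assumed
    inter-goal correlation (0.3 is the conventional default)."""
    wx = sum(w * x for w, x in zip(weights, attainments))
    denom = math.sqrt((1 - rho) * sum(w * w for w in weights)
                      + rho * sum(weights) ** 2)
    return 50 + 10 * wx / denom

print(gas_t_score([1, 0, 2], [1, 1, 1]))  # > 50 means goals exceeded on average
```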

  6. Software Risk Identification for Interplanetary Probes

    NASA Technical Reports Server (NTRS)

    Dougherty, Robert J.; Papadopoulos, Periklis E.

    2005-01-01

    The need for a systematic and effective software risk identification methodology is critical for interplanetary probes that are using increasingly complex and critical software. Several probe failures are examined that suggest more attention and resources need to be dedicated to identifying software risks. The direct causes of these failures can often be traced to systemic problems in all phases of the software engineering process. These failures have led to the development of a practical methodology to identify risks for interplanetary probes. The proposed methodology is based upon the tailoring of the Software Engineering Institute's (SEI) method of taxonomy-based risk identification. The use of this methodology will ensure a more consistent and complete identification of software risks in these probes.
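    A minimal sketch of what taxonomy-based risk identification looks like in practice follows; the classes, elements, and questions are abbreviated stand-ins, not the full SEI taxonomy-based questionnaire:

```python
# Minimal sketch of taxonomy-based risk identification in the spirit of the
# SEI taxonomy; classes and questions are abbreviated/hypothetical.
taxonomy = {
    "Product Engineering": {
        "Requirements": ["Are requirements stable?", "Are they feasible?"],
        "Design": ["Are interfaces to flight hardware fully specified?"],
    },
    "Development Environment": {
        "Process": ["Is the development process enforced and measured?"],
    },
    "Program Constraints": {
        "Resources": ["Is the schedule realistic for the software scope?"],
    },
}

def elicit(taxonomy):
    """Walk class -> element -> question; a 'no' answer flags a risk."""
    for cls, elements in taxonomy.items():
        for element, questions in elements.items():
            for q in questions:
                yield cls, element, q

for cls, element, q in elicit(taxonomy):
    print(f"[{cls} / {element}] {q}")
```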

  7. Methodological Innovation in Practice-Based Design Doctorates

    ERIC Educational Resources Information Center

    Yee, Joyce S. R.

    2010-01-01

    This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD…

  8. Progress in multirate digital control system design

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1991-01-01

    A new methodology for multirate sampled-data control design based on a new generalized control law structure, two new parameter-optimization-based control law synthesis methods, and a new singular-value-based robustness analysis method is described. The control law structure can represent multirate sampled-data control laws of arbitrary structure and dynamic order, with arbitrarily prescribed sampling rates for all sensors and update rates for all processor states and actuators. The two control law synthesis methods employ numerical optimization to determine values for the control law parameters. The robustness analysis method is based on the multivariable Nyquist criterion applied to the loop transfer function for the sampling period equal to the period of repetition of the system's complete sampling/update schedule. The complete methodology is demonstrated by application to the design of a combination yaw damper and modal suppression system for a commercial aircraft.
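    The robustness check described above can be sketched as follows: evaluate the eigenvalue loci of the discrete-time loop transfer matrix around the unit circle and examine their encirclements of -1. The 2x2 L(z) below is a made-up example, not the paper's aircraft model:

```python
# Sketch of a multivariable Nyquist check: eigenvalue loci of a loop transfer
# matrix evaluated around the unit circle. The 2x2 L(z) here is invented.
import numpy as np

def loop_tf(z):
    # hypothetical discrete-time loop transfer matrix L(z)
    return np.array([[0.5 / (z - 0.9), 0.1 / (z - 0.5)],
                     [0.0,             0.4 / (z - 0.8)]])

thetas = np.linspace(1e-3, 2 * np.pi - 1e-3, 2000)
loci = np.array([np.linalg.eigvals(loop_tf(np.exp(1j * th))) for th in thetas])

# Multivariable Nyquist: count encirclements of -1 by the eigenloci of L(z);
# here we just report the closest approach as a rough robustness margin.
closest = np.min(np.abs(loci + 1))
print(f"closest approach of eigenloci to -1: {closest:.3f}")
```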

  9. Systematic investigation of gastrointestinal diseases in China (SILC): validation of survey methodology

    PubMed Central

    2009-01-01

    Background: Symptom-based surveys suggest that the prevalence of gastrointestinal diseases is lower in China than in Western countries. The aim of this study was to validate a methodology for the epidemiological investigation of gastrointestinal symptoms and endoscopic findings in China. Methods: A randomized, stratified, multi-stage sampling methodology was used to select 18 000 adults aged 18-80 years from Shanghai, Beijing, Xi'an, Wuhan and Guangzhou. Participants from Shanghai were invited to provide blood samples and undergo upper gastrointestinal endoscopy. All participants completed Chinese versions of the Reflux Disease Questionnaire (RDQ) and the modified Rome II questionnaire; 20% were also invited to complete the 36-item Short Form Health Survey (SF-36) and Epworth Sleepiness Scale (ESS). The psychometric properties of the questionnaires were evaluated statistically. Results: The study was completed by 16 091 individuals (response rate: 89.4%), with 3219 (89.4% of those invited) completing the SF-36 and ESS. All 3153 participants in Shanghai provided blood samples and 1030 (32.7%) underwent endoscopy. Cronbach's alpha coefficients were 0.89, 0.89, 0.80 and 0.91, respectively, for the RDQ, modified Rome II questionnaire, ESS and SF-36, supporting internal consistency. Factor analysis supported construct validity of all questionnaire dimensions except SF-36 psychosocial dimensions. Conclusion: This population-based study has great potential to characterize the relationship between gastrointestinal symptoms and endoscopic findings in China. PMID:19925662

  10. Systematic investigation of gastrointestinal diseases in China (SILC): validation of survey methodology.

    PubMed

    Yan, Xiaoyan; Wang, Rui; Zhao, Yanfang; Ma, Xiuqiang; Fang, Jiqian; Yan, Hong; Kang, Xiaoping; Yin, Ping; Hao, Yuantao; Li, Qiang; Dent, John; Sung, Joseph; Zou, Duowu; Johansson, Saga; Halling, Katarina; Liu, Wenbin; He, Jia

    2009-11-19

    Symptom-based surveys suggest that the prevalence of gastrointestinal diseases is lower in China than in Western countries. The aim of this study was to validate a methodology for the epidemiological investigation of gastrointestinal symptoms and endoscopic findings in China. A randomized, stratified, multi-stage sampling methodology was used to select 18,000 adults aged 18-80 years from Shanghai, Beijing, Xi'an, Wuhan and Guangzhou. Participants from Shanghai were invited to provide blood samples and undergo upper gastrointestinal endoscopy. All participants completed Chinese versions of the Reflux Disease Questionnaire (RDQ) and the modified Rome II questionnaire; 20% were also invited to complete the 36-item Short Form Health Survey (SF-36) and Epworth Sleepiness Scale (ESS). The psychometric properties of the questionnaires were evaluated statistically. The study was completed by 16,091 individuals (response rate: 89.4%), with 3219 (89.4% of those invited) completing the SF-36 and ESS. All 3153 participants in Shanghai provided blood samples and 1030 (32.7%) underwent endoscopy. Cronbach's alpha coefficients were 0.89, 0.89, 0.80 and 0.91, respectively, for the RDQ, modified Rome II questionnaire, ESS and SF-36, supporting internal consistency. Factor analysis supported construct validity of all questionnaire dimensions except SF-36 psychosocial dimensions. This population-based study has great potential to characterize the relationship between gastrointestinal symptoms and endoscopic findings in China.
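    Both SILC records report internal consistency via Cronbach's alpha (0.89, 0.89, 0.80 and 0.91). A minimal sketch of that computation, on invented toy data:

```python
# Sketch of the Cronbach's alpha computation reported above; toy data invented.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
responses = latent + rng.normal(scale=0.8, size=(200, 8))  # 8 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```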

  11. Relationships between abstract features and methodological quality explained variations of social media activity derived from systematic reviews about psoriasis interventions.

    PubMed

    Ruano, J; Aguilar-Luque, M; Isla-Tejera, B; Alcalde-Mellado, P; Gay-Mimbrera, J; Hernandez-Romero, José Luis; Sanz-Cabanillas, J L; Maestre-López, B; González-Padilla, M; Carmona-Fernández, P J; Gómez-García, F; García-Nieto, A Vélez

    2018-05-24

    The aim of this study was to describe the relationship among abstract structure, readability, and completeness, and how these features may influence social media activity and bibliometric results, considering systematic reviews (SRs) about interventions in psoriasis classified by methodological quality. Systematic literature searches about psoriasis interventions were undertaken on relevant databases. For each review, methodological quality was evaluated using the Assessing the Methodological Quality of Systematic Reviews (AMSTAR) tool. Abstract length, structure, readability, and quality and completeness of reporting were analyzed. Social media activity (Twitter and Facebook mention counts), as well as Mendeley reader counts and Google Scholar citations, was obtained for each article. Analyses were conducted to describe any potential influence of abstract characteristics on reviews' social media diffusion. We classified 139 intervention SRs as displaying high/moderate/low methodological quality. We observed that the abstract readability of SRs has remained high over the last 20 years, although there are some differences based on their methodological quality. Free-format abstracts were the most sensitive to increases in text readability, as compared with more structured abstracts (IMRAD or 8-headings), yielding opposite effects on quality and completeness depending on methodological quality: a worsening in low-quality reviews and an improvement in high-quality ones. Both readability indices and PRISMA for Abstract total scores showed an inverse relationship with social media activity and bibliometric results in high methodological quality reviews but not in those of lower quality. Our results suggest that increasing abstract readability should be given special consideration when writing free-format summaries of high-quality reviews, because it correlates with improved completeness and quality, and this may help to achieve broader social media visibility and article usage. Copyright © 2018 Elsevier Inc. All rights reserved.
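    The record leans on readability indices without naming one; as an illustration, a rough Flesch Reading Ease score can be computed as below (the choice of index and the crude syllable counter are assumptions):

```python
# Illustrative readability computation: Flesch Reading Ease. The paper does
# not say which index it used, and the syllable counter is a crude heuristic.
import re

def count_syllables(word):
    # vowel-group heuristic, good enough for a sketch
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = len(words)
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

print(flesch_reading_ease("This abstract is short. It is easy to read."))
```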

  12. Army Base Realignment Methodology. Volume II.

    DTIC Science & Technology

    1981-08-01

    deficiencies because they were reconstructed after the fact. c. The Craig action was very similar to the Fort Wolters closure. It was completed ...7 / provided cross-checks on the Fort Wolters After Action Report and substantiated the completion of equipment shipments and proposed construction...DATES AND MILESTONES OF USAMPS RELOCATIONA / Event Date CSJF completed --USAMPS relocation to Fort Devens not justified 22 July 1971 Criminal

  13. Conjugate gradient based projection - A new explicit methodology for frictional contact

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Li, Maocheng; Sha, Desong

    1993-01-01

    With special attention towards the applicability to parallel computation or vectorization, a new and effective explicit approach for linear complementary formulations involving a conjugate gradient based projection methodology is proposed in this study for contact problems with Coulomb friction. The overall objectives are focussed towards providing an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementary formulations stems from an established search direction which is projected to a feasible region determined by the non-negative constraint condition; this direction is then applied to the Fletcher-Reeves conjugate gradient method resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics, fast computational speed and is relatively simple to implement for contact problems involving Coulomb friction.
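    A minimal sketch of a conjugate gradient based projection iteration for the underlying linear complementarity problem (w = Mz + q, z >= 0, w >= 0, z·w = 0) follows; this is a generic Fletcher-Reeves variant with projection onto the non-negative orthant, not the authors' exact scheme:

```python
# Sketch of conjugate gradient based projection for an LCP; a generic
# textbook variant, not the authors' exact formulation.
import numpy as np

def projected_cg(M, q, iters=200):
    z = np.zeros_like(q)
    d = np.zeros_like(q)
    g_prev_sq = None
    for _ in range(iters):
        g = M @ z + q                       # gradient of 0.5 z'Mz + q'z
        g_sq = g @ g
        beta = 0.0 if g_prev_sq is None else g_sq / g_prev_sq  # Fletcher-Reeves
        d = -g + beta * d
        dMd = d @ (M @ d)
        if dMd <= 0:
            break
        alpha = -(g @ d) / dMd              # exact line search
        z = np.maximum(z + alpha * d, 0.0)  # projection onto z >= 0
        g_prev_sq = g_sq
    return z

M = np.array([[4.0, 1.0], [1.0, 3.0]])
q = np.array([-1.0, -2.0])
z = projected_cg(M, q)
print(z, M @ z + q)  # z >= 0 and complementarity approximately hold
```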

  14. Educating Laboratory Science Learners at a Distance Using Interactive Television

    ERIC Educational Resources Information Center

    Reddy, Christopher

    2014-01-01

    Laboratory science classes offered to students learning at a distance require a methodology that allows for the completion of tactile activities. Literature describes three different methods of solving the distance laboratory dilemma: kit-based laboratory experience, computer-based laboratory experience, and campus-based laboratory experience,…

  15. Theorizing E-Learning Participation: A Study of the HRD Online Communities in the USA

    ERIC Educational Resources Information Center

    Wang, Greg G.

    2010-01-01

    Purpose: This study sets out to investigate the e-learning participation and completion phenomenon in the US corporate HRD online communities and to explore determinants of e-learning completion. Design/methodology/approach: Based on the HRD Learning Participation Theory (LPT), this study takes a two-stage approach. Stage one adopts an interview…

  16. Paper-based and web-based intervention modeling experiments identified the same predictors of general practitioners' antibiotic-prescribing behavior.

    PubMed

    Treweek, Shaun; Bonetti, Debbie; Maclennan, Graeme; Barnett, Karen; Eccles, Martin P; Jones, Claire; Pitts, Nigel B; Ricketts, Ian W; Sullivan, Frank; Weal, Mark; Francis, Jill J

    2014-03-01

    To evaluate the robustness of the intervention modeling experiment (IME) methodology as a way of developing and testing behavioral change interventions before a full-scale trial by replicating an earlier paper-based IME. Web-based questionnaire and clinical scenario study. General practitioners across Scotland were invited to complete the questionnaire and scenarios, which were then used to identify predictors of antibiotic-prescribing behavior. These predictors were compared with the predictors identified in an earlier paper-based IME and used to develop a new intervention. Two hundred seventy general practitioners completed the questionnaires and scenarios. The constructs that predicted simulated behavior and intention were attitude, perceived behavioral control, risk perception/anticipated consequences, and self-efficacy, which match the targets identified in the earlier paper-based IME. The choice of persuasive communication as an intervention in the earlier IME was also confirmed. Additionally, a new intervention, an action plan, was developed. A web-based IME replicated the findings of an earlier paper-based IME, which provides confidence in the IME methodology. The interventions will now be evaluated in the next stage of the IME, a web-based randomized controlled trial. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. A flight-test methodology for identification of an aerodynamic model for a V/STOL aircraft

    NASA Technical Reports Server (NTRS)

    Bach, Ralph E., Jr.; Mcnally, B. David

    1988-01-01

    Described is a flight test methodology for developing a data base to be used to identify an aerodynamic model of a vertical and short takeoff and landing (V/STOL) fighter aircraft. The aircraft serves as a test bed at Ames for ongoing research in advanced V/STOL control and display concepts. The flight envelope to be modeled includes hover, transition to conventional flight, and back to hover, STOL operation, and normal cruise. Although the aerodynamic model is highly nonlinear, it has been formulated to be linear in the parameters to be identified. Motivation for the flight test methodology advocated in this paper is based on the choice of a linear least-squares method for model identification. The paper covers elements of the methodology from maneuver design to the completed data base. Major emphasis is placed on the use of state estimation with tracking data to ensure consistency among maneuver variables prior to their entry into the data base. The design and processing of a typical maneuver is illustrated.
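    The key point above is that the model is nonlinear in the states but linear in the parameters, so identification reduces to linear least squares; the regressors and coefficients below are invented for illustration:

```python
# Sketch of "linear in the parameters" identification: even a nonlinear
# aerodynamic model can be written y = X(states) @ theta and solved by
# linear least squares. Regressors and coefficients here are invented.
import numpy as np

rng = np.random.default_rng(1)
alpha = rng.uniform(-0.2, 0.3, 500)      # angle of attack [rad]
q_pitch = rng.uniform(-0.5, 0.5, 500)    # pitch rate [rad/s]
delta = rng.uniform(-0.3, 0.3, 500)      # control deflection [rad]

# nonlinear-in-states, linear-in-parameters regressor matrix
X = np.column_stack([np.ones_like(alpha), alpha, alpha**2, q_pitch, delta])
theta_true = np.array([0.05, 4.8, -6.0, 1.2, 0.9])
Cm = X @ theta_true + rng.normal(scale=0.01, size=500)   # "measured" coefficient

theta_hat, *_ = np.linalg.lstsq(X, Cm, rcond=None)
print(theta_hat)   # recovers theta_true to within the noise
```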

  18. A Study of the Use of a Monolingual Pedagogical Dictionary by Learners of English Engaged in Writing.

    ERIC Educational Resources Information Center

    Harvey, Keith; Yuill, Deborah

    1997-01-01

    Presents an account of a study of the role played by a dictionary in the completion of written (encoding) tasks by students of English as a foreign language. The study uses an introspective methodology based on the completion of flowcharts. Results indicate the importance of information on spelling and meanings and the neglect of coded syntactic…

  19. Management of the aging of critical safety-related concrete structures in light-water reactor plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naus, D.J.; Oland, C.B.; Arndt, E.G.

    1990-01-01

    The Structural Aging Program has the overall objective of providing the USNRC with an improved basis for evaluating nuclear power plant safety-related structures for continued service. The program consists of a management task and three technical tasks: materials property data base, structural component assessment/repair technology, and quantitative methodology for continued-service determinations. Objectives, accomplishments, and planned activities under each of these tasks are presented. Major program accomplishments include development of a materials property data base for structural materials as well as an aging assessment methodology for concrete structures in nuclear power plants. Furthermore, a review and assessment of inservice inspection techniques for concrete materials and structures has been completed, and work on development of a methodology which can be used for performing current as well as reliability-based future condition assessments of concrete structures is well under way. 43 refs., 3 tabs.

  20. Adaptive Architectures for Effects Based Operations

    DTIC Science & Technology

    2006-08-12

    [Figure 3: One-Point Crossover] 6.4. ECAD-EA Methodology. The previous two...that accomplishes this task is termed ECAD-EA (Effective Courses of Action Determination Using Evolutionary Algorithms). Besides a completely...items are given below followed by their explanations, while Figure 4 shows the inputs and outputs of the ECAD-EA methodology in the form of a block

  1. Case-Based Learning as Pedagogy for Teaching Information Ethics Based on the Dervin Sense-Making Methodology

    ERIC Educational Resources Information Center

    Dow, Mirah J.; Boettcher, Carrie A.; Diego, Juana F.; Karch, Marziah E.; Todd-Diaz, Ashley; Woods, Kristine M.

    2015-01-01

    The purpose of this mixed methods study is to determine the effectiveness of case-based pedagogy in teaching basic principles of information ethics and ethical decision making. Study reports results of pre- and post-assessment completed by 49 library and information science (LIS) graduate students at a Midwestern university. Using Creswell's…

  2. Base heating methodology improvements, volume 1

    NASA Technical Reports Server (NTRS)

    Bender, Robert L.; Reardon, John E.; Somers, Richard E.; Fulton, Michael S.; Smith, Sheldon D.; Pergament, Harold

    1992-01-01

    This document is the final report for NASA MSFC Contract NAS8-38141. The contracted effort had the broad objective of improving the ascent base heating methodology for launch vehicles, to improve and simplify the determination of that environment for Advanced Launch System (ALS) concepts. It was pursued as an Advanced Development Plan (ADP) for the Joint DoD/NASA ALS program office, with project management assigned to NASA/MSFC. The original study was to be completed in 26 months beginning Sep. 1989. Because of several program changes and emphasis on evolving launch vehicle concepts, the period of performance was extended to the current completion date of Nov. 1992. A computer code incorporating the methodology improvements into a quick prediction tool was developed and is operational for basic configuration and propulsion concepts. The code and its user's guide are also provided as part of the contract documentation. Background information describing the specific objectives, limitations, and goals of the contract is summarized. A brief chronology of the ALS/NLS program history is also presented to provide the reader with an overview of the many variables influencing the development of the code over the past three years.

  3. Screening Methodologies to Support Risk and Technology ...

    EPA Pesticide Factsheets

    The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have largely completed the required “Maximum Achievable Control Technology” (MACT) standards. In the second stage of the regulatory process, EPA must review each MACT standard at least every eight years and revise them as necessary, “taking into account developments in practices, processes and control technologies.” We call this requirement the “technology review.” EPA is also required to complete a one-time assessment of the health and environmental risks that remain after sources come into compliance with MACT. This residual risk review also must be done within 8 years of setting the initial MACT standard. If additional risk reductions are necessary to protect public health with an ample margin of safety or to prevent adverse environmental effects, EPA must develop standards to address these remaining risks. Because the risk review is an important component of the RTR process, EPA is seeking SAB input on the scientific credibility of specific enhancements made to our risk assessment methodologies, particularly with respect to screening methodologies, since the last SAB review was completed in 2010. These enhancements to our risk methodologies are outlined in the document title

  4. Disruptive technologies for Massachusetts Bay Transportation Authority business strategy exploration.

    DOT National Transportation Integrated Search

    2013-04-01

    There are three tasks for this research: 1. Methodology to extract Road Usage Patterns from Phone Data: We combined the most complete record of daily mobility, based on large-scale mobile phone data, with detailed Geographic Information System (...

  5. GPS based pilot survey of freight movements in the Midwest region.

    DOT National Transportation Integrated Search

    2013-05-01

    This report explains the methodology and results surrounding a recently completed study of a major grocery trucking firm's travel patterns. The research group used Global Positioning System (GPS) logging devices to trace the temporal and spatial mo...

  6. Test Methodology Development for Experimental Structural Assessment of ASC Planar Spring Material for Long-Term Durability

    NASA Technical Reports Server (NTRS)

    Yun, Gunjin; Abdullah, A. B. M.; Binienda, Wieslaw; Krause, David L.; Kalluri, Sreeramesh

    2014-01-01

    A vibration-based testing methodology has been developed that will assess fatigue behavior of the metallic material of construction for the Advanced Stirling Convertor displacer (planar) spring component. To minimize the testing duration, the test setup is designed for base-excitation of a multiple-specimen arrangement, driven in a high-frequency resonant mode; this allows completion of fatigue testing in an accelerated period. A high-performance electro-dynamic exciter (shaker) is used to generate harmonic oscillation of cantilever beam specimens, which are clasped on the shaker armature with specially-designed clamp fixtures. The shaker operates in closed-loop control with dynamic specimen response feedback provided by a scanning laser vibrometer. A test coordinator function synchronizes the shaker controller and the laser vibrometer to complete the closed-loop scheme. The test coordinator also monitors the structural health of the test specimens throughout the test period, recognizing any change in specimen dynamic behavior. As this may be due to fatigue crack initiation, the test coordinator terminates test progression and then acquires test data in an orderly manner. Design of the specimen and fixture geometry was completed by finite element analysis such that peak stress does not occur at the clamping fixture attachment points. Experimental stress evaluation was conducted to verify the specimen stress predictions. A successful application of the experimental methodology was demonstrated by validation tests with carbon steel specimens subjected to fully-reversed bending stress; high-cycle fatigue failures were induced in such specimens using higher-than-prototypical stresses.
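    A sketch of the sizing calculation behind such a resonant bending test: the first natural frequency of an Euler-Bernoulli cantilever specimen. The material and dimensions are illustrative assumptions, not the actual specimen geometry:

```python
# Sketch: first bending natural frequency of a cantilever specimen
# (Euler-Bernoulli theory). Material and dimensions are illustrative only.
import math

E = 200e9            # steel Young's modulus [Pa]
rho = 7850.0         # density [kg/m^3]
L, b, h = 0.08, 0.012, 0.001   # length, width, thickness [m]

A = b * h                       # cross-section area
I = b * h**3 / 12.0             # second moment of area
lam1 = 1.87510407               # first-mode eigenvalue of a cantilever

f1 = (lam1**2 / (2 * math.pi)) * math.sqrt(E * I / (rho * A * L**4))
print(f"first bending mode ~ {f1:.0f} Hz")   # ~127 Hz for these numbers
```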

  7. Completing the Task Procedure or Focusing on Form: Contextualizing Grammar Instruction via Task-Based Teaching

    ERIC Educational Resources Information Center

    Saraç, Hatice Sezgi

    2018-01-01

    In this study, it was aimed to compare two distinct methodologies of grammar instruction: task-based and form-focused teaching. Within the application procedure, which lasted for one academic term, two groups of tertiary level learners (N = 53) were exposed to the same sequence of target structures, extensive writing activities and evaluation…

  8. 24 CFR Appendix to Part 972 - Methodology of Comparing Cost of Public Housing With the Cost of Tenant-Based Assistance

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... justified by a newly created property-based needs assessment (a life-cycle physical needs assessments... calculated as the sum of total operating cost, modernization cost, and costs to address accrual needs. Costs... assist PHAs in completing the assessments. The spreadsheet calculator is designed to walk housing...

  9. 24 CFR Appendix to Part 972 - Methodology of Comparing Cost of Public Housing With the Cost of Tenant-Based Assistance

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... justified by a newly created property-based needs assessment (a life-cycle physical needs assessments... calculated as the sum of total operating cost, modernization cost, and costs to address accrual needs. Costs... assist PHAs in completing the assessments. The spreadsheet calculator is designed to walk housing...

  10. 78 FR 21162 - Notice of Intent to Establish an Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-09

    ... Programs. NCSES, under generic clearance (OMB 3145-0174), has conducted a methodological study to test a.... Estimate of Burden: In the methodological study, HAs required 1 hour on average to complete these tasks...,206 hours. Most ECs were able to complete this task in less than 30 minutes in the methodological...

  11. NLS cycle 1 and NLS 2 base heating technical notes. Appendix 3: Preliminary cycle 1 NLS base heating environments. Cycle 1 NLS base heating environments. NLS 2 650K STME base heating environments

    NASA Technical Reports Server (NTRS)

    Bender, Robert L.; Reardon, John E.; Prendergast, Maurice J.; Schmitz, Craig P.; Brown, John R.

    1992-01-01

    A preliminary analysis of National Launch System ascent plume-induced base heating environments has been completed to support the Induced Environments Panel's objective to assist in maturing the NLS vehicle (1.5-stage and heavy-lift launch vehicle) design. Environments during ascent have been determined from this analysis for a few selected locations on the engine nozzles and base heat shield for both vehicles. The environments reflect early summer 1991 configurations and performance data and conservative methodology. A more complete and thorough analysis is under way to update these environments for the cycle 1 review in January 1992.

  12. Using experts feedback in clinical case resolution and arbitration as accuracy diagnosis methodology.

    PubMed

    Rodríguez-González, Alejandro; Torres-Niño, Javier; Valencia-Garcia, Rafael; Mayer, Miguel A; Alor-Hernandez, Giner

    2013-09-01

    This paper proposes a new methodology for assessing the efficiency of medical diagnostic systems and clinical decision support systems by using the feedback/opinions of medical experts. The methodology behind this work is based on a comparison between the expert feedback that has helped solve different clinical cases and the expert system that has evaluated these same cases. Once the results are returned, an arbitration process is carried out in order to ensure the correctness of the results provided by both methods. Once this process has been completed, the results are analyzed using Precision, Recall, Accuracy, Specificity and Matthews Correlation Coefficient (MCC) (PRAS-M) metrics. When the methodology is applied, the results obtained from a real diagnostic system allow researchers to establish the accuracy of the system based on objective facts. The methodology returns enough information to analyze the system's behavior for each disease in the knowledge base or across the entire knowledge base. It also returns data on the efficiency of the different assessors involved in the evaluation process, analyzing their behavior in the diagnostic process. The proposed work facilitates the evaluation of medical diagnostic systems, providing a reliable process based on objective facts. The methodology presented in this research makes it possible to identify the main characteristics that define a medical diagnostic system and their values, allowing for system improvement. A good example of the results provided by the application of the methodology is shown in this paper. A diagnostic system was evaluated by means of this methodology, yielding positive (statistically significant) results when comparing the system with the assessors that participated in its evaluation, through metrics such as recall (+27.54%) and MCC (+32.19%). These results demonstrate the real applicability of the methodology used. Copyright © 2013 Elsevier Ltd. All rights reserved.
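    A minimal sketch of the PRAS-M metrics named above, computed from a binary confusion matrix with invented counts:

```python
# Sketch of the PRAS-M metrics (Precision, Recall, Accuracy, Specificity,
# MCC) from a binary confusion matrix; the counts are invented.
import math

def pras_m(tp, fp, tn, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                 # sensitivity
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp)
    mcc = ((tp * tn - fp * fn) /
           math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    return precision, recall, accuracy, specificity, mcc

print(pras_m(tp=80, fp=10, tn=95, fn=15))
```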

  13. Seismic Hazard Estimates Using Ill-defined Macroseismic Data at Site

    NASA Astrophysics Data System (ADS)

    Albarello, D.; Mucciarelli, M.

    A new approach is proposed to the seismic hazard estimate based on documentary data concerning the local history of seismic effects. The adopted methodology allows for the use of "poor" data, such as the macroseismic ones, within a formally coherent approach that permits overcoming a number of problems connected to forcing the available information into the frame of "standard" methodologies calibrated on the use of instrumental data. The use of the proposed methodology allows full exploitation of all the available information (which for many towns in Italy covers several centuries), making possible a correct use of macroseismic data characterized by different levels of completeness and reliability. As an application of the proposed methodology, seismic hazard estimates are presented for two towns located in Northern Italy: Bologna and Carpi.

  14. Everyday Innovation--Pushing Boundaries While Maintaining Stability

    ERIC Educational Resources Information Center

    Lippke, Lena; Wegener, Charlotte

    2014-01-01

    Purpose: The purpose of this paper is to explore how vocational teachers' everyday practices can constitute innovative learning spaces that help students to experience engagement and commitment towards education and thus increase their possibilities for completing their studies despite notable difficulties. Design/methodology/approach: Based on…

  15. Practical Example of Introductory Engineering Education Based on the Design Process and Teaching Methodology Using a Gyro Bicycle

    ERIC Educational Resources Information Center

    Higa, Yoshikazu; Shimojima, Ken

    2018-01-01

    This report describes a workshop on the Dynamics of Machinery based on the fabrication of a gyro-bicycle in a summer school program for junior high school students. The workshop was conducted by engineering students who had completed "Creative Research", an engineering design course at the National Institute of Technology, Okinawa…

  16. Academic English Teaching for Postgraduates Based on Self-Regulated Learning Environment: A Case Study of Academic Reading Course

    ERIC Educational Resources Information Center

    Zhao, Wei

    2016-01-01

    This study selects first-year postgraduate students as the participants. Based on their needs analysis, classroom presentations, and performance in completing assignments, and using a case study methodology, the results show that students at the university level, and even at the graduate level, still struggle with academic English. Thus, this…

  17. Evaluating Three School-Based Integrated Health Centres Established by a Partnership in Cornwall to Inform Future Provision and Practice

    ERIC Educational Resources Information Center

    Macpherson, Reynold

    2013-01-01

    Purpose: The aim of this paper is to report the process, findings and implications of a three-year evaluation of integrated health centres (IHCs) established in three secondary schools in Cornwall by the School-Based Integrated Health Centres (SBIHC) partnership. Design/methodology/approach: When the partners had completed the capital works, an…

  18. Preliminary comparative assessment of PM10 hourly measurement results from new monitoring stations type using stochastic and exploratory methodology and models

    NASA Astrophysics Data System (ADS)

    Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł

    2018-01-01

    The paper presents key issues from the preliminary stage of a proposed extended equivalence assessment for measurement results from a new type of portable device: the comparability of hourly PM10 concentration series with reference station measurements, evaluated with statistical methods. Technical aspects of the new portable meters are presented. Emphasis is placed on assessing the comparability of the results using a methodology of stochastic and exploratory methods. The concept is based on the observation that a simple comparison of result series in the time domain is insufficient. The comparison of regularity should be done in three complementary fields of statistical modeling: time, frequency and space. The proposal is based on modeling results for five annual series of measurements from the new mobile devices and from the WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The obtained results indicate both the completeness of the comparison methodology and the high correspondence between the new devices' measurements and the reference results.
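    A sketch of the multi-domain comparison idea, using synthetic hourly series in place of the portable and reference instruments: agreement in the time domain via correlation and in the frequency domain via spectral coherence (the space domain is omitted here):

```python
# Sketch of time- and frequency-domain agreement between two hourly PM10
# series; synthetic data stand in for the portable and reference instruments.
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
t = np.arange(24 * 365)                               # one year, hourly
truth = 20 + 10 * np.sin(2 * np.pi * t / 24)          # daily PM10 cycle
reference = truth + rng.normal(scale=2.0, size=t.size)
portable = 1.1 * truth + rng.normal(scale=3.0, size=t.size)

r = np.corrcoef(reference, portable)[0, 1]            # time domain
f, coh = signal.coherence(reference, portable, fs=1.0, nperseg=1024)
print(f"correlation r = {r:.3f}")
print(f"coherence at the diurnal frequency ~ {coh[np.argmin(np.abs(f - 1/24))]:.2f}")
```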

  19. A methodological, task-based approach to Procedure-Specific Simulations training.

    PubMed

    Setty, Yaki; Salzman, Oren

    2016-12-01

    Procedure-Specific Simulations (PSS) are 3D realistic simulations that provide a platform to practice complete surgical procedures in a virtual-reality environment. While PSS have the potential to improve surgeons' proficiency, there are no existing standards or guidelines for PSS development in a structured manner. We employ a unique platform inspired by game design to develop three-dimensional virtual-reality simulations of urethrovesical anastomosis during radical prostatectomy. 3D visualization is supported by stereo vision, providing a fully realistic view of the simulation. The software can be executed on any robotic surgery platform. Specifically, we tested the simulation under a Windows environment on the RobotiX Mentor. Using the urethrovesical anastomosis during radical prostatectomy simulation as a representative example, we present a task-based methodological approach to PSS training. The methodology provides tasks in increasing levels of difficulty, from a novice level of basic anatomy identification to an expert level that permits testing new surgical approaches. The modular methodology presented here can be easily extended to support more complex tasks. We foresee this methodology as a tool used to integrate PSS as a complementary training process for surgical procedures.

  20. Averting Uncertainty: A Practical Guide to Physical Activity Research in Australian Schools

    ERIC Educational Resources Information Center

    Rachele, Jerome N.; Cuddihy, Thomas F.; Washington, Tracy L.; McPhail, Steven M.

    2013-01-01

    Preventative health has become central to contemporary health care, identifying youth physical activity as a key factor in determining health and functioning. Schools offer a unique research setting due to distinctive methodological circumstances. However, school-based researchers face several obstacles in their endeavour to complete successful…

  1. Re-Storying an Entrepreneurial Identity: Education, Experience and Self-Narrative

    ERIC Educational Resources Information Center

    Harmeling, Susan S.

    2011-01-01

    Purpose: This paper aims to explore the ways in which entrepreneurship education may serve as an identity workspace. Design/methodology/approach: This is a conceptual/theoretical paper based on previously completed empirical work. Findings: The paper makes the connection between worldmaking, experience, action and identity. Practical implications:…

  2. A Comparison of Online and Classroom-Based Developmental Math Courses

    ERIC Educational Resources Information Center

    Eggert, Jeanette Gibeson

    2009-01-01

    Effectiveness was operationalized as a combination of successful developmental course completion, high student satisfaction at the end of the course, and high academic achievement in a subsequent college-level math course. Instructional methodologies were similar to the extent that the instructional delivery systems allowed. With a sample size of…

  3. Automating Formative and Summative Feedback for Individualised Assignments

    ERIC Educational Resources Information Center

    Hamilton, Ian Robert

    2009-01-01

    Purpose: The purpose of this paper is to report on the rationale behind the use of a unique paper-based individualised accounting assignment, which automated the provision to students of immediate formative and timely summative feedback. Design/methodology/approach: As students worked towards completing their assignment, the package provided…

  4. Comment on Birgegard and Sohlberg's (1999) suggestions for research in subliminal psychodynamic activation.

    PubMed

    Fudin, R

    2000-06-01

    Methodological changes in subliminal psychodynamic activation experiments based on the assumption that multiletter messages can be encoded automatically (Birgegard & Sohlberg, 1999) are questioned. Their contention that partial experimental messages and appropriate nonsense anagram controls (Fudin, 1986) need not be presented in every experiment is supported, with a reservation. If the difference between responses to the complete message and its control is significant in the predicted direction, then Fudin's procedure should be used. A nonsignificant difference between the response to each partial message and its control is needed to support the assumption of proponents of subliminal psychodynamic activation that successful outcomes are effected by the encoding of the meaning of a complete message. Experiments in subliminal psychodynamic activation can be improved if their methodologies take into account variables that may operate when subliminal stimuli are presented and encoded.

  5. A Case Study of Two Regional State Universities Qualifying as Learning Organizations Based on Administration and Staff Viewpoints

    ERIC Educational Resources Information Center

    Rich, Tammy Morrison

    2011-01-01

    This case study of 2 state universities qualifying as learning organizations, based on administration and staff viewpoints, was completed using a qualitative methodology. The idea of what a learning organization is can be different depending on who or what is being analyzed. For this study, the work of theorists including W. Edwards Deming,…

  6. A primary care Web-based Intervention Modeling Experiment replicated behavior changes seen in earlier paper-based experiment.

    PubMed

    Treweek, Shaun; Francis, Jill J; Bonetti, Debbie; Barnett, Karen; Eccles, Martin P; Hudson, Jemma; Jones, Claire; Pitts, Nigel B; Ricketts, Ian W; Sullivan, Frank; Weal, Mark; MacLennan, Graeme

    2016-12-01

    Intervention Modeling Experiments (IMEs) are a way of developing and testing behavior change interventions before a trial. We aimed to test this methodology in a Web-based IME that replicated the trial component of an earlier, paper-based IME. Three-arm, Web-based randomized evaluation of two interventions (persuasive communication and action plan) and a "no intervention" comparator. The interventions were designed to reduce the number of antibiotic prescriptions in the management of uncomplicated upper respiratory tract infection. General practitioners (GPs) were invited to complete an online questionnaire and eight clinical scenarios where an antibiotic might be considered. One hundred twenty-nine GPs completed the questionnaire. GPs receiving the persuasive communication did not prescribe an antibiotic in 0.70 more scenarios (95% confidence interval [CI] = 0.17-1.24) than those in the control arm. For the action plan, GPs did not prescribe an antibiotic in 0.63 (95% CI = 0.11-1.15) more scenarios than those in the control arm. Unlike the earlier IME, behavioral intention was unaffected by the interventions; this may be due to a smaller sample size than intended. A Web-based IME largely replicated the findings of an earlier paper-based study, providing some grounds for confidence in the IME methodology. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Black youth suicide: literature review with a focus on prevention.

    PubMed Central

    Baker, F. M.

    1990-01-01

    The national rates of completed suicide in the black population between 1950 and 1981 are presented, including age-adjusted rates. Specific studies of black suicide attempters and completed suicides by blacks in several cities are discussed. Methodological problems with existing studies and national suicide statistics are presented. Proposed theories of black suicide are reviewed. Based on a summary of the characteristics of black suicide attempters reported by the literature, preventive strategies--primary, secondary, and tertiary--are presented. PMID:2204709

  8. Structural mapping from MSS-LANDSAT imagery: A proposed methodology for international geological correlation studies

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Crepani, E.; Martini, P. R.

    1980-01-01

    A methodology is proposed for international geological correlation studies based on LANDSAT-MSS imagery, Bullard's model of continental fit, and compatible structural trends between Northeast Brazil and the West African counterpart. Six extensive lineaments in the Brazilian study area are mapped and discussed according to their regional behavior and in relation to the adjacent continental margin. Among the first conclusions, correlations were found between the Sobral Pedro II Lineament and the megafaults that surround the West African craton, and between the Pernambuco Lineament and the Ngaoundere Lineament in Cameroon. Ongoing research to complete the methodological stages includes the mapping of the West African structural framework, reconstruction of the pre-drift puzzle, and an analysis of the counterpart correlations.

  9. A simplified approach to determine the carbon footprint of a region: Key learning points from a Galician study.

    PubMed

    Roibás, Laura; Loiseau, Eléonore; Hospido, Almudena

    2018-07-01

    In a previous study, the carbon footprint (CF) of all production and consumption activities of Galicia, an Autonomous Community located in the north-west of Spain, was determined, and the results were used to devise strategies aimed at the reduction and mitigation of greenhouse gas (GHG) emissions. The territorial LCA methodology was used there to perform the calculations. However, that methodology was initially designed to compute the emissions of all types of polluting substances to the environment (several thousands of substances considered in the life cycle inventories), with the aim of performing complete LCA studies. This requirement implies the use of specific modelling approaches and databases that in turn raised some difficulties, i.e., the need for large amounts of data (which increased gathering times), low temporal, geographical and technological representativeness of the study, lack of data, and double counting issues when trying to combine the sectorial CF results into those of the total economy. In view of these difficulties, and considering the need to focus only on GHG emissions, it seems important to improve the robustness of the CF computation while proposing a simplified methodology. This study is the result of those efforts to improve the aforementioned methodology. In addition to the territorial LCA approach, several Input-Output (IO) based alternatives have been used here to compute the direct and indirect GHG emissions of all Galician production and consumption activities. The results of the different alternatives were compared and evaluated under a multi-criteria approach considering reliability, completeness, temporal and geographical correlation, applicability and consistency. Based on that, an improved and simplified methodology was proposed to determine the CF of the Galician consumption and production activities from a total responsibility perspective. This methodology adequately reflects the current characteristics of the Galician economy, thus increasing the representativeness of the results, and can be applied to any region for which IO tables and environmental vectors are available. This methodology could thus provide useful information in decision-making processes to reduce and prevent GHG emissions. Copyright © 2018 Elsevier Ltd. All rights reserved.
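    A minimal sketch of the environmentally extended input-output calculation such IO-based alternatives rest on: total (direct plus indirect) emissions via the Leontief inverse, e_total = e (I - A)^(-1) y. The three-sector numbers are invented:

```python
# Sketch of an environmentally extended input-output footprint:
# total GHG = direct intensities @ Leontief inverse @ final demand.
# The 3-sector coefficients and demands are invented.
import numpy as np

A = np.array([[0.10, 0.05, 0.02],      # inter-industry technical coefficients
              [0.20, 0.15, 0.10],
              [0.05, 0.10, 0.05]])
e = np.array([0.8, 0.3, 0.1])          # direct emissions per unit output
y = np.array([100.0, 250.0, 400.0])    # final demand by sector

L = np.linalg.inv(np.eye(3) - A)       # Leontief inverse
footprint = e @ L @ y                  # consumption-based GHG total
print(f"total footprint ~ {footprint:.0f} emission units")
```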

  10. Fully Associative, Nonisothermal, Potential-Based Unified Viscoplastic Model for Titanium-Based Matrices

    NASA Technical Reports Server (NTRS)

    2005-01-01

    A number of titanium matrix composite (TMC) systems are currently being investigated for high-temperature air frame and propulsion system applications. As a result, numerous computational methodologies for predicting both deformation and life for this class of materials are under development. An integral part of these methodologies is an accurate and computationally efficient constitutive model for the metallic matrix constituent. Furthermore, because these systems are designed to operate at elevated temperatures, the required constitutive models must account for both time-dependent and time-independent deformations. To accomplish this, the NASA Lewis Research Center is employing a recently developed, complete, potential-based framework. This framework, which utilizes internal state variables, was put forth for the derivation of reversible and irreversible constitutive equations. The framework, and consequently the resulting constitutive model, is termed complete because the existence of the total (integrated) form of the Gibbs complementary free energy and complementary dissipation potentials are assumed a priori. The specific forms selected here for both the Gibbs and complementary dissipation potentials result in a fully associative, multiaxial, nonisothermal, unified viscoplastic model with nonlinear kinematic hardening. This model constitutes one of many models in the Generalized Viscoplasticity with Potential Structure (GVIPS) class of inelastic constitutive equations.
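    As a schematic of what "fully associative, potential-based" means here, both the inelastic strain rate and the internal state evolution derive from a single complementary dissipation potential; the generic associative laws below are an assumption for illustration, since the specific GVIPS potential forms are given only in the source report:

```latex
% Generic fully associative flow/evolution laws (schematic; not the exact
% GVIPS forms): both rates derive from one dissipation potential \Omega.
\dot{\varepsilon}^{I}_{ij} = \frac{\partial \Omega}{\partial \sigma_{ij}},
\qquad
\dot{\alpha}_{ij} = -\,\frac{\partial \Omega}{\partial a_{ij}}
```

    Here sigma_ij is the applied stress and a_ij the internal (back) stress conjugate to the state variable alpha_ij.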

  11. AN INTEGRATION OF COPEPOD-BASED BAFS, LIFECYCLE TOXICITY TESTING, AND ENDOCRINE DISRUPTION METHODOLOGIES FOR RAPID POPULATION-LEVEL RISK ASSESSMENT OF PERSISTENT BIOACCUMULATIVE TOXICANTS

    EPA Science Inventory

    Extensive multi-generational microplate culturing (copepod hatching stage through two broods) experiments were completed with the POPs lindane, DDD and fipronil sulfide.  Identical tandem microplate experiments were run concurrently to yield sufficient copepod biomass for li...

  12. 44 CFR 65.6 - Revision of base flood elevation determinations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... when discharges change as a result of the use of an alternative methodology or data for computing flood... land use regulation. (ii) It must be well-documented including source codes and user's manuals. (iii... projects that may effect map changes when they are completed. (4) The datum and date of releveling of...

  13. Preliminary results of the global forest biomass survey

    Treesearch

    S. Healey; E. Lindquist

    2014-01-01

    Many countries do not yet have well-established national forest inventories, and among those that do, significant methodological differences exist, particularly in the estimation of standing forest biomass. Global space-based LiDAR (Light Detection and Ranging) from NASA’s now-completed ICESat mission provided consistent, high-quality measures of canopy height and...

  14. Examining Foundations of Qualitative Research: A Review of Social Work Dissertations, 2008-2010

    ERIC Educational Resources Information Center

    Gringeri, Christina; Barusch, Amanda; Cambron, Christopher

    2013-01-01

    This study examined the treatment of epistemology and methodological rigor in qualitative social work dissertations. Template-based review was conducted on a random sample of 75 dissertations completed between 2008 and 2010. For each dissertation, we noted the presence or absence of four markers of epistemology: theory, paradigm, reflexivity, and…

  15. Mapping of Supply Chain Learning: A Framework for SMEs

    ERIC Educational Resources Information Center

    Thakkar, Jitesh; Kanda, Arun; Deshmukh, S. G.

    2011-01-01

    Purpose: The aim of this paper is to propose a mapping framework for evaluating supply chain learning potential for the context of small- to medium-sized enterprises (SMEs). Design/methodology/approach: The extracts of recently completed case-based research for ten manufacturing SME units and facts reported in the previous research are utilized…

  16. Piecing the Puzzle: A Framework for Developing Intercultural Online Communication Projects in Business Education

    ERIC Educational Resources Information Center

    Crossman, Joanna; Bordia, Sarbari

    2012-01-01

    Purpose: The purpose of this paper is to present a framework based on lessons learnt from a recently completed project aimed at developing intercultural online communication competencies in business students. Design/methodology/approach: The project entailed collaboration between students and staff in business communication courses from an…

  17. Philosophical Roots of Cosmology

    NASA Astrophysics Data System (ADS)

    Ivanovic, M.

    2008-10-01

    We shall consider the philosophical roots of cosmology in earlier Greek philosophy. Our goal is to answer the question: are the earlier Greek theories of a purely philosophical-mythological character, as philosophers have often claimed, or do they have a scientific character? On the basis of methodological criteria, we shall contend that the latter is the case. In order to answer the question about the contemporary relation between philosophy and cosmology, we shall consider the next question: is contemporary cosmology completely independent of philosophical conjectures? The answer demands consideration of methodological questions about the scientific status of contemporary cosmology. We also consider some aspects of the relation between contemporary philosophy and cosmology.

  18. System cost/performance analysis (study 2.3). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Kazangey, T.

    1973-01-01

    The relationships between performance, safety, cost, and schedule parameters were identified and quantified in support of an overall effort to generate program models and methodology that provide insight into a total space vehicle program. A specific space vehicle system, the attitude control system (ACS), was used, and a modeling methodology was selected that develops a consistent set of quantitative relationships among performance, safety, cost, and schedule, based on the characteristics of the components utilized in candidate mechanisms. These descriptive equations were developed for a three-axis, earth-pointing, mass expulsion ACS. A data base describing typical candidate ACS components was implemented, along with a computer program to perform sample calculations. This approach, implemented on a computer, is capable of determining the effect of a change in functional requirements to the ACS mechanization and the resulting cost and schedule. By a simple extension of this modeling methodology to the other systems in a space vehicle, a complete space vehicle model can be developed. Study results and recommendations are presented.

  19. Model-Based Thermal System Design Optimization for the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.

    2017-01-01

    Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.

  20. Model-based thermal system design optimization for the James Webb Space Telescope

    NASA Astrophysics Data System (ADS)

    Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.

    2017-10-01

    Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.

  1. Reversed inverse regression for the univariate linear calibration and its statistical properties derived using a new methodology

    NASA Astrophysics Data System (ADS)

    Kang, Pilsang; Koo, Changhoi; Roh, Hokyu

    2017-11-01

    Since simple linear regression theory was established at the beginning of the 1900s, it has been used in a variety of fields. Unfortunately, it cannot be used directly for calibration. In practical calibrations, the observed measurements (the inputs) are subject to errors, and hence they vary, thus violating the assumption that the inputs are fixed. Therefore, in the case of calibration, the regression line fitted using the method of least squares is not consistent with the statistical properties of simple linear regression as already established based on this assumption. To resolve this problem, "classical regression" and "inverse regression" have been proposed. However, they do not completely resolve the problem. As a fundamental solution, we introduce "reversed inverse regression" along with a new methodology for deriving its statistical properties. In this study, the statistical properties of this regression are derived using the "error propagation rule" and the "method of simultaneous error equations" and are compared with those of the existing regression approaches. The accuracy of the statistical properties thus derived is investigated in a simulation study. We conclude that the newly proposed regression and methodology constitute the complete regression approach for univariate linear calibrations.
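    For context, the sketch below contrasts the two existing approaches the paper builds on, classical and inverse regression, on a synthetic calibration data set. The data and fits are illustrative; the paper's reversed inverse regression and its error-propagation derivations are not reproduced here.

        # Minimal sketch contrasting "classical" and "inverse" regression for
        # univariate linear calibration on synthetic data.
        import numpy as np

        rng = np.random.default_rng(1)
        x_ref = np.linspace(0.0, 10.0, 20)                 # reference standards
        y_obs = 2.0 + 0.5 * x_ref + rng.normal(0, 0.05, x_ref.size)

        # Classical: regress observed response y on reference x, then invert.
        b1, b0 = np.polyfit(x_ref, y_obs, 1)
        y_new = 4.6                                        # new measurement
        x_classical = (y_new - b0) / b1

        # Inverse: regress reference x directly on observed response y.
        c1, c0 = np.polyfit(y_obs, x_ref, 1)
        x_inverse = c0 + c1 * y_new

        print(x_classical, x_inverse)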

  2. Using Concept Mapping as a Tool for Program Theory Development

    ERIC Educational Resources Information Center

    Orsi, Rebecca

    2011-01-01

    The purpose of this methodological study is to explore how well a process called "concept mapping" (Trochim, 1989) can articulate the theory which underlies a social program. Articulation of a program's theory is a key step in completing a sound theory based evaluation (Weiss, 1997a). In this study, concept mapping is used to…

  3. Capturing security requirements for software systems.

    PubMed

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-07-01

    Security is often an afterthought during software development. Realizing security early, especially in the requirements phase, is important so that security problems can be tackled early enough before going further in the process, thereby avoiding rework. A more effective approach to security requirements engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirements elicitation based on problem frames. The methodology aims at early integration of security with software development. Its main goal is to assist developers in eliciting adequate security requirements in a more systematic way during the requirements engineering process. A security catalog, based on problem frames, is constructed in order to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We use evaluation criteria to assess the resulting security requirements, concentrating on the identification of conflicts among requirements. We show that more complete security requirements can be elicited by this methodology, in addition to the assistance it offers developers in eliciting security requirements in a more systematic way.

  4. Capturing security requirements for software systems

    PubMed Central

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-01-01

    Security is often an afterthought during software development. Realizing security early, especially in the requirements phase, is important so that security problems can be tackled early enough before going further in the process, thereby avoiding rework. A more effective approach to security requirements engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirements elicitation based on problem frames. The methodology aims at early integration of security with software development. Its main goal is to assist developers in eliciting adequate security requirements in a more systematic way during the requirements engineering process. A security catalog, based on problem frames, is constructed in order to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We use evaluation criteria to assess the resulting security requirements, concentrating on the identification of conflicts among requirements. We show that more complete security requirements can be elicited by this methodology, in addition to the assistance it offers developers in eliciting security requirements in a more systematic way. PMID:25685514

  5. Developing dementia prevention trials: baseline report of the Home-Based Assessment study.

    PubMed

    Sano, Mary; Egelko, Susan; Donohue, Michael; Ferris, Steven; Kaye, Jeffrey; Hayes, Tamara L; Mundt, James C; Sun, Chung-Kai; Paparello, Silvia; Aisen, Paul S

    2013-01-01

    This report describes the baseline experience of the multicenter, Home-Based Assessment study, designed to develop methods for dementia prevention trials using novel technologies for test administration and data collection. Nondemented individuals of 75 years of age or more were recruited and evaluated in-person using established clinical trial outcomes of cognition and function, and randomized to one of 3 assessment methodologies: (1) mail-in questionnaire/live telephone interviews [mail-in/phone (MIP)]; (2) automated telephone with interactive voice recognition; and (3) internet-based computer Kiosk. Brief versions of cognitive and noncognitive outcomes were adapted to each methodology and administered at baseline and repeatedly over a 4-year period. "Efficiency" measures assessed the time from screening to baseline, and staff time required for each methodology. A total of 713 individuals signed consent and were screened; 640 met eligibility and were randomized to one of 3 assessment arms; and 581 completed baseline. Dropout, time from screening to baseline, and total staff time were highest among those assigned to internet-based computer Kiosk. However, efficiency measures were driven by nonrecurring start-up activities suggesting that differences may be mitigated over a long trial. Performance among Home-Based Assessment instruments collected through different technologies will be compared with established outcomes over this 4-year study.

  6. Integrating field methodology and web-based data collection to assess the reliability of the Alcohol Use Disorders Identification Test (AUDIT).

    PubMed

    Celio, Mark A; Vetter-O'Hagen, Courtney S; Lisman, Stephen A; Johansen, Gerard E; Spear, Linda P

    2011-12-01

    Field methodologies offer a unique opportunity to collect ecologically valid data on alcohol use and its associated problems within natural drinking environments. However, limitations in follow-up data collection methods have left unanswered questions regarding the psychometric properties of field-based measures. The aim of the current study is to evaluate the reliability of self-report data collected in a naturally occurring environment - as indexed by the Alcohol Use Disorders Identification Test (AUDIT) - compared to self-report data obtained through an innovative web-based follow-up procedure. Individuals recruited outside of bars (N=170; mean age=21; range 18-32) provided a BAC sample and completed a self-administered survey packet that included the AUDIT. BAC feedback was provided anonymously through a dedicated web page. Upon sign in, follow-up participants (n=89; 52%) were again asked to complete the AUDIT before receiving their BAC feedback. Reliability analyses demonstrated that AUDIT scores - both continuous and dichotomized at the standard cut-point - were stable across field- and web-based administrations. These results suggest that self-report data obtained from acutely intoxicated individuals in naturally occurring environments are reliable when compared to web-based data obtained after a brief follow-up interval. Furthermore, the results demonstrate the feasibility, utility, and potential of integrating field methods and web-based data collection procedures. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  7. Calculation of the exchange coupling constants of copper binuclear systems based on spin-flip constricted variational density functional theory.

    PubMed

    Zhekova, Hristina R; Seth, Michael; Ziegler, Tom

    2011-11-14

    We have recently developed a methodology for the calculation of exchange coupling constants J in weakly interacting polynuclear metal clusters. The method is based on unrestricted and restricted second-order spin-flip constricted variational density functional theory (SF-CV(2)-DFT) and is here applied to eight binuclear copper systems. Comparison of the SF-CV(2)-DFT results with experiment and with results obtained from other DFT and wave-function-based methods has been made. Restricted SF-CV(2)-DFT with the BH&HLYP functional consistently yields J values in excellent agreement with experiment. The results acquired from this scheme are comparable in quality to those obtained by accurate multi-reference wave function methodologies such as difference-dedicated configuration interaction and complete active space second-order perturbation theory. © 2011 American Institute of Physics.

  8. Crossing trend analysis methodology and application for Turkish rainfall records

    NASA Astrophysics Data System (ADS)

    Şen, Zekâi

    2018-01-01

    Trend analyses are necessary tools for depicting a possible general increase or decrease in a given time series. There are many versions of trend identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, regression lines, and Şen's innovative trend analysis. The literature has many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series is the one with the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of the probability distribution function. The validity of this method is demonstrated through extensive Monte Carlo simulations and comparison with other existing trend identification methodologies. The methodology is applied to a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
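    A minimal sketch of the crossing criterion follows: among candidate trend lines through the series centroid, prefer the slope whose detrended residuals cross zero most often. This is an illustrative reading of the criterion only; Şen's full procedure and its significance testing are not reproduced.

        # Count sign changes of the residuals about each candidate trend line
        # through the centroid; pick the slope maximizing the crossings.
        import numpy as np

        def crossings(residuals):
            """Number of sign changes (up- plus down-crossings) about zero."""
            s = np.sign(residuals)
            s = s[s != 0]
            return int(np.count_nonzero(s[1:] != s[:-1]))

        def crossing_trend_slope(t, x, slopes):
            tc, xc = t.mean(), x.mean()                   # centroid
            return max(slopes,
                       key=lambda m: crossings(x - (xc + m * (t - tc))))

        t = np.arange(100, dtype=float)
        x = 0.03 * t + np.random.default_rng(0).normal(0, 1, t.size)
        cands = np.linspace(-0.1, 0.1, 201)
        print(crossing_trend_slope(t, x, cands))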

  9. Best evidence on the educational effects of undergraduate portfolios.

    PubMed

    Buckley, Sharon; Coleman, Jamie; Khan, Khalid

    2010-09-01

    The great variety of portfolio types and schemes used in the education of health professionals is reflected in the extensive and diverse educational literature relating to portfolio use. We have recently completed a Best Evidence Medical Education (BEME) systematic review of the literature relating to the use of portfolios in the undergraduate setting that offers clinical teachers insights into both their effects on learning and issues to consider in portfolio implementation. Using a methodology based on BEME recommendations, we searched the literature relating to a range of health professions, identifying evidence for the effects of portfolios on undergraduate student learning, and assessing the methodological quality of each study. The higher quality studies in our review report that, when implemented appropriately, portfolios can improve students' ability to integrate theory with practice, can encourage their self-awareness and reflection, and can offer support for students facing difficult emotional situations. Portfolios can also enhance student-tutor relationships and prepare students for the rigours of postgraduate training. However, the time required to complete a portfolio may detract from students' clinical learning. An analysis of methodological quality against year of publication suggests that, across a range of health professions, the quality of the literature relating to the educational effects of portfolios is improving. However, further work is still required to build the evidence base for the educational effects of portfolios, particularly comparative studies that assess effects on learning directly. Our findings have implications for the design and implementation of portfolios in the undergraduate setting. © Blackwell Publishing Ltd 2010.

  10. Pilot Study to Show the Feasibility of a Multicenter Trial of Home-based Assessment of People Over 75 Years Old

    PubMed Central

    Sano, Mary; Egelko, Susan; Ferris, Steven; Kaye, Jeffrey; Hayes, Tamara L.; Mundt, James C.; Donohue, Michael; Walter, Sarah; Sun, Shelly; Sauceda-Cerda, Luis

    2012-01-01

    This report describes a pilot study to evaluate the feasibility of new home-based assessment technologies applicable to clinical trials for prevention of cognitive loss and Alzheimer disease. Methods: Community-dwelling nondemented individuals ≥ 75 years old were recruited and randomized to 1 of 3 assessment methodologies: (1) mail-in questionnaire/live telephone interviews (MIP); (2) automated telephone with interactive voice recognition (IVR); and (3) internet-based computer Kiosk (KIO). Brief versions of cognitive and noncognitive outcomes were adapted to the different methodologies and administered at baseline and 1 month. An efficiency measure, consisting of the direct staff-to-participant time required to complete assessments, was also compared across arms. Results: Forty-eight out of 60 screened participants were randomized. The dropout rate across arms from randomization through 1 month differed: 33% for KIO, 25% for IVR, and 0% for MIP (Fisher exact test, P = 0.04). Nearly all participants who completed baseline also completed the 1-month assessment (38 out of 39). The 1-way ANOVA across arms for total staff-to-participant direct contact time (i.e., training, baseline, and 1 month) was significant: F(2,33) = 4.588; P = 0.017, with the lowest overall direct time in minutes for IVR (Mn = 44.4; SD = 21.5), followed by MIP (Mn = 74.9; SD = 29.9) and KIO (Mn = 129.4; SD = 117.0). Conclusions: In this sample of older individuals, a higher dropout rate occurred in those assigned to the high-technology assessment techniques; however, once participants had completed baseline in all 3 arms, they continued participation through 1 month. High-technology home-based assessment methods, which do not require live testers, began to emerge as more time-efficient over the brief duration of this pilot, despite initially time-intensive participant training. PMID:20592583

  11. 3-D Survey Applied to Industrial Archaeology by Tls Methodology

    NASA Astrophysics Data System (ADS)

    Monego, M.; Fabris, M.; Menin, A.; Achilli, V.

    2017-05-01

    This work describes the three-dimensional survey of the "Ex Stazione Frigorifera Specializzata": initially used for agricultural storage, over the years the building was put to different uses until it fell into complete neglect. Its historical relevance and architectural heritage prompted a recent renovation and functional restoration project. This required a global 3-D survey based on the application and integration of different geomatic methodologies (mainly terrestrial laser scanning, classical topography, and GNSS). The point clouds were acquired using laser scanners with time-of-flight (TOF) and phase-shift technologies for the distance measurements. The topographic reference network, needed to align the scans in the same system, was measured with a total station. For the complete survey of the building, 122 scans were acquired and 346 targets were measured from 79 vertices of the reference network; 3 further vertices were measured with GNSS in order to georeference the network. For the detailed survey of the machine room, 14 scans with 23 targets were acquired. The global 3-D model of the building has less than one centimeter of alignment error (for the machine room the alignment error is no greater than 6 mm) and was used to extract products such as longitudinal and transversal sections, plans, architectural perspectives, and virtual scans. The processed data provide a complete spatial knowledge of the building, supplying basic information for the restoration project, structural analysis, and the valorization of industrial and architectural heritage.

  12. A GIS-based time-dependent seismic source modeling of Northern Iran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2017-01-01

    The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling, and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear (fault) sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 by 400 kilometers around Tehran. Previous research and reports were studied to compile an earthquake/fault catalog that is as complete as possible. All events were transformed to a uniform magnitude scale, and duplicate events and dependent shocks were removed. The completeness and time distribution of the compiled catalog are taken into account. The proposed area and linear seismic sources, in conjunction with the defined recurrence relationships, can be used to develop a time-dependent probabilistic seismic hazard analysis of Northern Iran.
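    As a pointer to the time-independent side of such a model, the sketch below fits a maximum-likelihood Gutenberg-Richter b-value to a toy catalog and converts the annual rate into a Poisson exceedance probability. The catalog, duration, and magnitudes are synthetic illustrations, not the study's data.

        # Minimal sketch of Poissonian recurrence for an area source:
        # Gutenberg-Richter b-value (Aki's maximum-likelihood form) and the
        # probability of at least one exceedance in a time window.
        import numpy as np

        mags = np.array([4.1, 4.3, 4.0, 5.2, 4.8, 4.4, 6.1, 4.6, 5.0, 4.2])
        years = 40.0                                  # catalog duration
        m_min = 4.0

        b = np.log10(np.e) / (mags.mean() - m_min)    # Aki (1965) b-value

        def annual_rate(m):
            """G-R extrapolated annual rate of events with magnitude >= m."""
            rate_mmin = np.count_nonzero(mags >= m_min) / years
            return rate_mmin * 10.0 ** (-b * (m - m_min))

        def poisson_exceedance(m, t_years):
            """P(at least one event >= m in t_years) for a Poisson process."""
            return 1.0 - np.exp(-annual_rate(m) * t_years)

        print("b-value:", round(b, 2))
        print("P(M>=5 in 50 yr):", round(poisson_exceedance(5.0, 50.0), 2))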

  13. Using artificial intelligence to bring evidence-based medicine a step closer to making the individual difference.

    PubMed

    Sissons, B; Gray, W A; Bater, A; Morrey, D

    2007-03-01

    The vision of evidence-based medicine is that of experienced clinicians systematically using the best research evidence to meet the individual patient's needs. This vision remains distant from clinical reality, as no complete methodology exists to apply objective, population-based research evidence to the needs of an individual real-world patient. We describe an approach, based on techniques from machine learning, to bridge this gap between evidence and individual patients in oncology. We examine existing proposals for tackling this gap and the relative benefits and challenges of our proposed, k-nearest-neighbour-based, approach.
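    As an illustration of the proposed k-nearest-neighbour step, the sketch below retrieves the most similar past patients for a new case and summarizes their outcomes. The cohort, features, and outcome variable are hypothetical; the paper's actual feature set and distance measure are not reproduced.

        # Minimal sketch of kNN retrieval of similar patients as individual
        # evidence. All clinical features and outcomes are hypothetical.
        import numpy as np
        from sklearn.neighbors import NearestNeighbors

        # columns: age, tumour stage, biomarker level (hypothetical cohort)
        cohort = np.array([[64, 2, 1.3], [71, 3, 2.2], [58, 1, 0.7],
                           [66, 2, 1.9], [75, 3, 2.8], [61, 2, 1.1]])
        survived_2yr = np.array([1, 0, 1, 1, 0, 1])

        nn = NearestNeighbors(n_neighbors=3).fit(cohort)
        new_patient = np.array([[65, 2, 1.5]])
        _, idx = nn.kneighbors(new_patient)
        print("evidence from similar patients:", survived_2yr[idx[0]].mean())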

  14. A Quality Improvement Activity to Promote Interprofessional Collaboration Among Health Professions Students

    PubMed Central

    Stevenson, Katherine; Busch, Angela; Scott, Darlene J.; Henry, Carol; Wall, Patricia A.

    2009-01-01

    Objectives: To develop and evaluate a classroom-based curriculum designed to promote interprofessional competencies by having undergraduate students from various health professions work together on system-based problems, using quality improvement (QI) methods and tools to improve patient-centered care. Design: Students from 4 health care programs (nursing, nutrition, pharmacy, and physical therapy) participated in an interprofessional QI activity. In groups of 6 or 7, students completed pre-intervention and post-intervention reflection tools on attitudes relating to interprofessional teams, and a tool designed to evaluate group process. Assessment: One hundred thirty-four students (76.6%) completed both self-reflection instruments, and 132 (74.2%) completed the post-course group evaluation instrument. Although already high prior to the activity, students' mean post-intervention reflection scores increased for 12 of 16 items. Post-intervention group evaluation scores reflected a high level of satisfaction with the experience. Conclusion: The use of a quality-based case study and QI methodology was an effective approach to enhancing interprofessional experiences among students. PMID:19657497

  15. An enhanced methodology for spacecraft correlation activity using virtual testing tools

    NASA Astrophysics Data System (ADS)

    Remedia, Marcello; Aglietti, Guglielmo S.; Appolloni, Matteo; Cozzani, Alessandro; Kiley, Andrew

    2017-11-01

    Test planning and post-test correlation activity have been issues of growing importance in the last few decades and many methodologies have been developed to either quantify or improve the correlation between computational and experimental results. In this article the methodologies established so far are enhanced with the implementation of a recently developed procedure called Virtual Testing. In the context of fixed-base sinusoidal tests (commonly used in the space sector for correlation), there are several factors in the test campaign that affect the behaviour of the satellite and are not normally taken into account when performing analyses: different boundary conditions created by the shaker's own dynamics, non-perfect control system, signal delays etc. All these factors are the core of the Virtual Testing implementation, which will be thoroughly explained in this article and applied to the specific case of Bepi-Colombo spacecraft tested on the ESA QUAD Shaker. Correlation activity will be performed in the various stages of the process, showing important improvements observed after applying the final complete methodology.

  16. Adaptation of a software development methodology to the implementation of a large-scale data acquisition and control system. [for Deep Space Network

    NASA Technical Reports Server (NTRS)

    Madrid, G. A.; Westmoreland, P. T.

    1983-01-01

    A progress report is presented on a program to upgrade the existing NASA Deep Space Network with a redesigned computer-controlled data acquisition system for channelling tracking, telemetry, and command data between a California-based control center and three signal processing centers in Australia, California, and Spain. The methodology for the improvements is oriented towards single-subsystem development, with consideration for a multi-system and multi-subsystem network of operational software. Details of the existing hardware configurations and data transmission links are provided. The program methodology includes data flow design, interface design and coordination, incremental capability availability, increased inter-subsystem developmental synthesis and testing, system- and network-level synthesis and testing, and system verification and validation. The software has so far been implemented to a 65 percent completion level, and the methodology used to effect the changes, which will permit enhanced tracking of and communication with spacecraft, has proved effective.

  17. A new methodology for automated diagnosis of mild cognitive impairment (MCI) using magnetoencephalography (MEG).

    PubMed

    Amezquita-Sanchez, Juan P; Adeli, Anahita; Adeli, Hojjat

    2016-05-15

    Mild cognitive impairment (MCI) is a cognitive disorder characterized by memory impairment greater than expected for age. A new methodology is presented to identify MCI patients during a working memory task using MEG signals. The methodology consists of four steps. In step 1, complete ensemble empirical mode decomposition (CEEMD) is used to decompose the MEG signal into a set of adaptive sub-bands according to its frequency content. In step 2, a nonlinear dynamics measure based on permutation entropy (PE) analysis is employed to analyze the sub-bands and extract features for MCI detection. In step 3, an analysis of variance (ANOVA) is used for feature selection. In step 4, an enhanced probabilistic neural network (EPNN) classifier is applied to the selected features to distinguish between MCI patients and healthy controls. The usefulness and effectiveness of the proposed methodology are validated using MEG data obtained experimentally from 18 MCI patients and 19 controls. Copyright © 2016 Elsevier B.V. All rights reserved.
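    Step 2 is the most self-contained piece, so a minimal sketch of permutation entropy follows. The embedding order and delay are common illustrative choices; the CEEMD decomposition and EPNN classifier are not reproduced here.

        # Minimal sketch of permutation entropy: the normalized Shannon
        # entropy of ordinal patterns in a (sub-band) signal.
        import numpy as np
        from math import factorial

        def permutation_entropy(x, order=3, delay=1):
            """Normalized Shannon entropy of ordinal patterns in x."""
            patterns = {}
            n = len(x) - (order - 1) * delay
            for i in range(n):
                window = x[i:i + order * delay:delay]
                key = tuple(np.argsort(window))
                patterns[key] = patterns.get(key, 0) + 1
            p = np.array(list(patterns.values()), dtype=float) / n
            h = -np.sum(p * np.log2(p))
            return h / np.log2(factorial(order))       # normalize to [0, 1]

        rng = np.random.default_rng(0)
        print(permutation_entropy(rng.normal(size=1000)))         # ~1, white noise
        print(permutation_entropy(np.sin(np.arange(1000) / 10)))  # much lower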

  18. Creating a Safe Climate for Active Learning and Student Engagement: An Example from an Introductory Social Work Module

    ERIC Educational Resources Information Center

    Ni Raghallaigh, M.; Cunniffe, R.

    2013-01-01

    This article explores the experiences of students who participated in a series of seminars that employed active learning methodologies. The study on which the article is based involved two parts. First, students completed a questionnaire after each seminar, resulting in 468 questionnaires. Second, nine students participated in a focus group where…

  19. A New Net to Go Fishing: Messages from International Evidence-Based Research and "Kaupapa" Maori Research

    ERIC Educational Resources Information Center

    Manning, Richard F.; Macfarlane, Angus H.; Skerrett, Mere; Cooper, Garrick; De Oliveira, Vanessa; Emery, Tepora

    2011-01-01

    This article draws upon a Maori metaphor to describe the theoretical framework underpinning the methodology and findings of a research project completed by researchers from the University of Canterbury, New Zealand, in 2010. It explains how and why the project required the research team to synthesise key information from four New Zealand Ministry…

  20. The ACVD task force on canine atopic dermatitis (XVI): laboratory evaluation of dogs with atopic dermatitis with serum-based "allergy" tests.

    PubMed

    DeBoer, D J; Hillier, A

    2001-09-20

    Serum-based in vitro "allergy tests" are commercially available to veterinarians, and are widely used in diagnostic evaluation of a canine atopic patient. Following initial clinical diagnosis, panels of allergen-specific IgE measurements may be performed in an attempt to identify to which allergens the atopic dog is hypersensitive. Methodology for these tests varies by laboratory; few critical studies have evaluated performance of these tests, and current inter-laboratory standardization and quality control measures are inadequate. Other areas where information is critically limited include the usefulness of these tests in diagnosis of food allergy, the effect of extrinsic factors such as season of the year on results, and the influence of corticosteroid treatment on test results. Allergen-specific IgE serological tests are never completely sensitive, nor completely specific. There is only partial correlation between the serum tests and intradermal testing; however, the significance of discrepant results is unknown and unstudied. Variation in test methodologies along with the absence of universal standardization and reporting procedures have created confusion, varying study results, and an inability to compare between studies performed by different investigators.

  1. Development of a case tool to support decision based software development

    NASA Technical Reports Server (NTRS)

    Wild, Christian J.

    1993-01-01

    A summary of the research accomplishments over the past year is presented. Achievements include: demonstrations of DHC, a prototype supporting the decision-based software development (DBSD) methodology, for Paramax personnel at ODU; meetings with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery, and the porting process model; completion and submission of a paper describing the DBSD paradigm to IFIP '92; completion and presentation of a paper describing the approach to software reuse at the Software Reuse Workshop in April 1993; continued extension of DHC with a project agenda, a facility necessary for better project management; completion of a first draft of the re-engineering process model for porting; creation of a logging form to trace all activities involved in solving the re-engineering problem; and development of a preliminary chart of the problems involved in the re-engineering process.

  2. A novel neural network based image reconstruction model with scale and rotation invariance for target identification and classification for Active millimetre wave imaging

    NASA Astrophysics Data System (ADS)

    Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad

    2014-12-01

    Millimetre wave (MMW) imaging is gaining tremendous interest among researchers, with potential applications in security checks, standoff personal screening, automotive collision avoidance, and more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from lower resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, hence medically safe. Despite these favourable attributes, MMW imaging faces various challenges: it is still a relatively unexplored area and lacks a suitable imaging methodology for extracting complete target information. Keeping these challenges in view, an MMW active imaging radar system at 60 GHz was designed for standoff imaging applications. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology. For the identification of regular-shape targets, a mean/standard-deviation-based segmentation technique was formulated and validated using a different target shape. For classification, a probability density function based target material discrimination methodology was proposed and validated on a different dataset. Lastly, a novel artificial neural network based, scale- and rotation-invariant image reconstruction methodology is proposed to counter distortions in the image caused by noise, rotation, or scale variations. The designed neural network, once trained with sample images, automatically takes care of these deformations and successfully reconstructs the corrected image for the test targets. The techniques developed in this paper are tested and validated using four different regular shapes, viz. rectangle, square, triangle, and circle.
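    The segmentation step lends itself to a compact illustration: the sketch below keeps pixels deviating from the image mean by more than k standard deviations, one plain reading of a mean/standard-deviation-based technique. The synthetic image and threshold factor are assumptions.

        # Minimal sketch of mean/standard-deviation-based segmentation for
        # regular-shape target extraction from a cluttered image.
        import numpy as np

        def mean_std_segment(img, k=2.0):
            mu, sigma = img.mean(), img.std()
            return img > mu + k * sigma               # boolean target mask

        rng = np.random.default_rng(0)
        img = rng.normal(0.2, 0.05, (64, 64))         # background clutter
        img[20:40, 25:45] += 0.5                      # bright square target
        mask = mean_std_segment(img)
        print("target pixels found:", int(mask.sum()))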

  3. Improving Efficiency Using Time-Driven Activity-Based Costing Methodology.

    PubMed

    Tibor, Laura C; Schultz, Stacy R; Menaker, Ronald; Weber, Bradley D; Ness, Jay; Smith, Paula; Young, Phillip M

    2017-03-01

    The aim of this study was to increase efficiency in MR enterography using a time-driven activity-based costing methodology. In February 2015, a multidisciplinary team was formed to identify the personnel, equipment, space, and supply costs of providing outpatient MR enterography. The team mapped the current state, completed observations, performed timings, and calculated costs associated with each element of the process. The team used Pareto charts to understand the highest cost and most time-consuming activities, brainstormed opportunities, and assessed impact. Plan-do-study-act cycles were developed to test the changes, and run charts were used to monitor progress. The process changes consisted of revising the workflow associated with the preparation and administration of glucagon, with completed implementation in November 2015. The time-driven activity-based costing methodology allowed the radiology department to develop a process to more accurately identify the costs of providing MR enterography. The primary process modification was reassigning responsibility for the administration of glucagon from nurses to technologists. After implementation, the improvements demonstrated success by reducing non-value-added steps and cost by 13%, staff time by 16%, and patient process time by 17%. The saved process time was used to augment existing examination time slots to more accurately accommodate the entire enterographic examination. Anecdotal comments were captured to validate improved staff satisfaction within the multidisciplinary team. This process provided a successful outcome to address daily workflow frustrations that could not previously be improved. A multidisciplinary team was necessary to achieve success, in addition to the use of a structured problem-solving approach. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
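    The arithmetic at the heart of time-driven activity-based costing is compact enough to show directly: a capacity cost rate (cost per staff-minute) multiplied by the minutes each process step consumes. The dollar figures and step timings below are hypothetical placeholders, not the study's data.

        # Minimal sketch of the TDABC arithmetic:
        # cost of an activity = capacity cost rate x minutes consumed.
        total_capacity_cost = 50_000.0      # monthly cost of staff/equipment, $
        practical_capacity_min = 8_000.0    # usable staff-minutes per month
        rate_per_min = total_capacity_cost / practical_capacity_min

        # process steps for one MR enterography visit (hypothetical timings)
        steps = {"check-in": 6, "glucagon prep": 12, "scan": 45, "post": 10}
        visit_cost = sum(minutes * rate_per_min for minutes in steps.values())
        print(f"${rate_per_min:.2f}/min -> visit cost ${visit_cost:.2f}")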

  4. Calculating complete and exact Pareto front for multiobjective optimization: a new deterministic approach for discrete problems.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel

    2013-06-01

    Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
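    For reference, the sketch below computes the exact non-dominated set of a tiny discrete solution set by enumeration, which is the object a complete Pareto-front method must return. The route data are illustrative, and the paper's k-best construction and ripple-spreading algorithm are not reproduced; enumeration is only viable for very small problems.

        # Minimal sketch: exact Pareto front of a discrete set by enumeration
        # (minimization in both objectives).
        def dominates(a, b):
            """a dominates b if a is <= everywhere and < somewhere."""
            return (all(x <= y for x, y in zip(a, b))
                    and any(x < y for x, y in zip(a, b)))

        def pareto_front(points):
            return [p for p in points
                    if not any(dominates(q, p) for q in points if q != p)]

        # (cost, travel time) for candidate routes -- illustrative values
        routes = [(4, 9), (5, 7), (7, 7), (6, 5), (9, 4), (8, 6)]
        print(pareto_front(routes))   # [(4, 9), (5, 7), (6, 5), (9, 4)]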

  5. Applications of aerospace technology in biology and medicine

    NASA Technical Reports Server (NTRS)

    Rouse, D. J.

    1983-01-01

    Utilization of NASA technology and its application to medicine is discussed. The introduction of new or improved commercially available medical products incorporating aerospace technology is outlined. A bipolar donor-recipient model of medical technology transfer is presented to provide a basis for the methodology. The methodology is designed to: (1) identify medical problems and NASA technology that, in combination, constitute opportunities for successful medical products; (2) obtain the early participation of industry in the transfer process; and (3) obtain acceptance by the medical community of new medical products based on NASA technology. Two commercial transfers were completed: the ocular screening device, a system for quick detection of vision problems in preschool children, and Porta-Fib III, a hospital monitoring unit. Two institutional transfers were completed: implant materials testing, the application of NASA fracture control technology to improve the reliability of metallic prostheses, and incinerator monitoring, a quadrupole mass spectrometer to monitor combustion products of municipal incinerators. Mobility aids for the blind and ultrasound diagnosis of burn depth are also studied.

  6. Computational models for the analysis/design of hypersonic scramjet components. I - Combustor and nozzle models

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Sinha, N.; Wolf, D. E.; York, B. J.

    1986-01-01

    An overview of computational models developed for the complete, design-oriented analysis of a scramjet propulsion system is provided. The modular approach taken involves the use of different PNS models to analyze the individual propulsion system components. The external compression and internal inlet flowfields are analyzed by the SCRAMP and SCRINT components discussed in Part II of this paper. The combustor is analyzed by the SCORCH code, which is based upon the SPLITP PNS pressure-split methodology formulated by Dash and Sinha. The nozzle is analyzed by the SCHNOZ code, which is based upon the SCIPVIS PNS shock-capturing methodology formulated by Dash and Wolf. The current status of these models, the previous developments leading to this status, and progress towards future hybrid and 3D versions are discussed in this paper.

  7. Advancing scoping study methodology: a web-based survey and consultation of perceptions on terminology, definition and methodological steps.

    PubMed

    O'Brien, Kelly K; Colquhoun, Heather; Levac, Danielle; Baxter, Larry; Tricco, Andrea C; Straus, Sharon; Wickerson, Lisa; Nayar, Ayesha; Moher, David; O'Malley, Lisa

    2016-07-26

    Scoping studies (or reviews) are a method used to comprehensively map evidence across a range of study designs in an area, with the aim of informing future research practice, programs, and policy. However, no universal agreement exists on terminology, definition, or methodological steps. Our aim was to understand the experiences of, and considerations for, conducting scoping studies from the perspective of academic and community partners. The primary objectives were to 1) describe experiences conducting scoping studies, including strengths and challenges; and 2) describe perspectives on terminology, definition, and methodological steps. We conducted a cross-sectional web-based survey with clinicians, educators, researchers, knowledge users, representatives from community-based organizations, graduate students, and policy stakeholders with experience and/or interest in conducting scoping studies to gain an understanding of experiences and perspectives on the conduct and reporting of scoping studies. We administered an electronic self-reported questionnaire comprising 22 items related to experiences with scoping studies, strengths and challenges, opinions on terminology, and methodological steps. We analyzed questionnaire data using descriptive statistics and content-analytical techniques. Survey results were discussed during a multi-stakeholder consultation to identify key considerations in the conduct and reporting of scoping studies. Of the 83 invitations, 54 individuals (65%) completed the scoping questionnaire, and 48 (58%) attended the scoping study meeting from Canada, the United Kingdom, and the United States. Many scoping study strengths were dually identified as challenges, including breadth of scope and iterative process. No consensus on terminology emerged; however, key defining features that comprised a working definition of scoping studies included the exploratory mapping of literature in a field; an iterative process; inclusion of grey literature; no quality assessment of included studies; and an optional consultation phase. We offer considerations for the conduct and reporting of scoping studies for researchers, clinicians, and knowledge users engaging in this methodology. Lack of consensus on scoping terminology, definition, and methodological steps persists. Reasons for this may be attributed to the diversity of disciplines adopting this methodology for differing purposes. Further work is needed to establish guidelines on the reporting and methodological quality assessment of scoping studies.

  8. Computerized LCC/ORLA methodology. [Life cycle cost/optimum repair level analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, J.T.

    1979-01-01

    The effort by Sandia Laboratories in developing CDC6600 computer programs for Optimum Repair Level Analysis (ORLA) and Life Cycle Cost (LCC) analysis is described. Investigation of the three repair-level strategies referenced in AFLCM/AFSCM 800-4 (base discard of subassemblies, base repair of subassemblies, and depot repair of subassemblies) was expanded to include an additional three repair-level strategies (base discard of complete assemblies and, upon shipment of complete assemblies to the depot, depot repair of assemblies by subassembly repair, and depot repair of assemblies by subassembly discard). The expanded ORLA was used directly in an LCC model that was procedurally altered to accommodate the ORLA input data. Available from the LCC computer run was an LCC value corresponding to the strategy chosen from the ORLA. 2 figures.

  9. Methodology for assessing laser-based equipment

    NASA Astrophysics Data System (ADS)

    Pelegrina-Bonilla, Gabriel; Hermsdorf, Jörg; Thombansen, Ulrich; Abels, Peter; Kaierle, Stefan; Neumann, Jörg

    2017-10-01

    Methodologies for assessing a technology's maturity are widely used in industry and research. Probably the best known are technology readiness levels (TRLs), pioneered by the National Aeronautics and Space Administration (NASA). At the beginning, only descriptively defined TRLs existed, but over time automated assessment techniques in the form of questionnaires emerged to determine TRLs. TRLs originally targeted equipment for space applications, but the demands on industrially relevant equipment are partly different in terms of, for example, overall costs, product quantities, or the presence of competitors. Therefore, we present a generally valid assessment methodology with the aim of assessing laser-based equipment for industrial use. The assessment is carried out with the help of a questionnaire, which provides a user-friendly and easily accessible way to monitor progress from the lab-proven state to the application-ready product throughout the complete development period. The assessment result is presented in a multidimensional metric in order to reveal the current specific strengths and weaknesses of the equipment development process, which can be used to steer the remaining development of the equipment in the right direction.

  10. Abstract analysis method facilitates filtering low-methodological quality and high-bias risk systematic reviews on psoriasis interventions.

    PubMed

    Gómez-García, Francisco; Ruano, Juan; Aguilar-Luque, Macarena; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesús; Hernández-Romero, José Luis; Sanz-Cabanillas, Juan Luis; Maestre-López, Beatriz; González-Padilla, Marcelino; Carmona-Fernández, Pedro J; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-29

    Article summaries' information and structure may influence researchers' and clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MA) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone. Systematic literature searches for SRs and/or MA about psoriasis were undertaken on MEDLINE, EMBASE, and the Cochrane database. For each review, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using the Preferred Reporting Items for Systematic Reviews and Meta-analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company. This study analysed 139 SRs on psoriasis interventions. On average, they reported 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low bias risk showed higher total PRISMA-A values than reviews with high bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95% CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95% CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95% CI: 1.785-10.98) predicted PRISMA-A variability. Reviews with a total PRISMA-A score < 6 that lacked identification as an SR or MA in the title and lacked an explanation of the bias risk assessment methods were classified as being of low methodological quality. Abstracts with a total PRISMA-A score ≥ 9 that included main outcome results and an explanation of the bias risk assessment method were classified as having low bias risk. The methodological quality and bias risk of SRs may thus be determined from analyses of abstract quality and completeness. Our proposal aims to facilitate the evaluation of synthesized evidence by clinical professionals lacking methodological skills. External validation is necessary.
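    The reported decision rules are simple enough to state as code. The sketch below encodes the two classification rules quoted above; the boolean field names are hypothetical stand-ins for the corresponding PRISMA-A items, and the thresholds are those reported in the abstract.

        # Minimal sketch of the abstract-level classification rules.
        def classify_review(total_score, sr_in_title, bias_method_explained,
                            main_outcomes_reported):
            low_quality = (total_score < 6 and not sr_in_title
                           and not bias_method_explained)
            low_bias_risk = (total_score >= 9 and main_outcomes_reported
                             and bias_method_explained)
            return {"low_methodological_quality": low_quality,
                    "low_bias_risk": low_bias_risk}

        print(classify_review(10, True, True, True))
        print(classify_review(4, False, False, True))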

  11. Evaluating the feasibility of utilizing the Automated Self-administered 24-hour (ASA24) dietary recall in a sample of multiethnic older adults

    PubMed Central

    Ettienne-Gittens, Reynolette; Boushey, Carol J.; Au, Donna; Murphy, Suzanne P.; Lim, Unhee; Wilkens, Lynne

    2016-01-01

    The ASA24 is a web application which enables the collection of self-administered dietary recalls, thus utilizing technology to overcome some of the limitations of traditional assessment methodologies. Older adults, particularly those from certain ethnic groups, may have less access to and may be less receptive to technology. This research sought to determine the level of access to the internet as well as to evaluate the feasibility of using a web-based alternative dietary data collection method in older, multiethnic adults. Participants completed three telephone-administered diet recalls (n=347), and were asked to complete a one-day recall via the ASA24. They were also asked to evaluate their experience with using the ASA24 system. Almost 60% of the participants reported no access to a computer or the internet, with African Americans and Latinos less likely than non-Hispanic Whites and Japanese-Americans to have access. Of those with access to the internet (n=100), 44% accessed the ASA24 system and 37% successfully launched the ASA24 program. However, most respondents preferred the traditional diet recall methodology over the ASA24. Further research is needed to investigate recruitment and the use of electronic data collection methodologies in older adults. PMID:28149712

  12. Self-Calibration and Optimal Response in Intelligent Sensors Design Based on Artificial Neural Networks

    PubMed Central

    Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto

    2007-01-01

    The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems should ideally spend the least possible amount of time on their calibration. An autocalibration algorithm for intelligent sensors should be able to fix major problems such as offset, variation of gain, and lack of linearity as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. The comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. The paper shows that the proposed method turned out to have better overall accuracy than the other two methods. Besides the experimental results and the analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). In order to illustrate the method's capability to build autocalibrating and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over classic autocalibration methodologies because it impacts the design process of intelligent sensors, autocalibration methodologies, and their associated factors, like time and cost.
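    As a rough illustration of the idea (not the paper's topology or training algorithm), the sketch below trains a small network to invert a synthetic sensor exhibiting offset, gain error, and nonlinearity from a handful of calibration points; the sensor model, network size, and data are assumptions.

        # Minimal sketch of ANN-based autocalibration: learn the mapping from
        # a nonlinear raw sensor reading back to the true stimulus.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        true_vals = np.linspace(0.0, 100.0, 21)             # calibration stimuli
        raw = 0.8 * true_vals + 0.002 * true_vals**2 + 5.0  # gain/offset/nonlin.

        net = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                           solver="lbfgs", max_iter=5000, random_state=0)
        net.fit(raw.reshape(-1, 1) / raw.max(), true_vals)  # scale inputs

        test_raw = np.array([[30.0], [60.0]]) / raw.max()
        print(net.predict(test_raw))                        # corrected readings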

  13. Space Engineering Projects in Design Methodology

    NASA Technical Reports Server (NTRS)

    Crawford, R.; Wood, K.; Nichols, S.; Hearn, C.; Corrier, S.; DeKunder, G.; George, S.; Hysinger, C.; Johnson, C.; Kubasta, K.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design courses of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, focusing on the first-semester design methodology course. The philosophical basis and pedagogical structure of this course is summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper includes a summary of the projects completed during the 1992-93 Academic Year in the methodology course, and concludes with an example of two projects completed by student design teams.

  14. Using Modern Methodologies with Maintenance Software

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). Scheduling tasks is difficult because mission needs must be addressed before any other tasks, and those needs often spring up unexpectedly. Keeping track of what everyone is working on is also difficult because each person works on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick, based on priority, the tasks to be completed within a sprint. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. In the Scrum methodology there is a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology: MPS has many software applications in maintenance, team members working on disparate applications, many users, and work that is interruptible based on mission needs, issues, and requirements. In order to use Scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate, interruptible tasks across multiple software applications.

  15. Assessment of undiscovered oil and gas resources of the Susitna Basin, southern Alaska, 2017

    USGS Publications Warehouse

    Stanley, Richard G.; Potter, Christopher J.; Lewis, Kristen A.; Lillis, Paul G.; Shah, Anjana K.; Haeussler, Peter J.; Phillips, Jeffrey D.; Valin, Zenon C.; Schenk, Christopher J.; Klett, Timothy R.; Brownfield, Michael E.; Drake II, Ronald M.; Finn, Thomas M.; Haines, Seth S.; Higley, Debra K.; Houseknecht, David W.; Le, Phuong A.; Marra, Kristen R.; Mercier, Tracey J.; Leathers-Miller, Heidi M.; Paxton, Stanley T.; Pearson, Ofori N.; Tennyson, Marilyn E.; Woodall, Cheryl A.; Zyrianova, Margarita V.

    2018-05-01

    The U.S. Geological Survey (USGS) recently completed an assessment of undiscovered, technically recoverable oil and gas resources in the Susitna Basin of southern Alaska. Using a geology-based methodology, the USGS estimates that mean undiscovered volumes of about 2 million barrels of oil and nearly 1.7 trillion cubic feet of gas may be found in this area.

  16. Testing the Intelligence of Unmanned Autonomous Systems

    DTIC Science & Technology

    2008-01-01

decisions without the operator. The term autonomous is also used interchangeably with intelligent, giving rise to the name unmanned autonomous system (UAS)...For the purposes of this article, UAS describes an unmanned system that makes decisions based on gathered information. Because testers should not...make assumptions about the decision process within a UAS, there is a need for a methodology that completely tests this decision process without biasing

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, Joshua Daniel; Carr, Christina; Pettit, Erin C.

    We apply a fully autonomous icequake detection methodology to a single day of high-sample-rate (200 Hz) seismic network data recorded from the terminus of Taylor Glacier, Antarctica, that temporally coincided with a brine release episode near Blood Falls (May 13, 2014). We demonstrate a statistically validated procedure to assemble waveforms triggered by icequakes into populations of clusters linked by intra-event waveform similarity. Our processing methodology implements a noise-adaptive power detector coupled with a complete-linkage clustering algorithm and a noise-adaptive correlation detector. This detector chain reveals a population of 20 multiplet sequences that includes ~150 icequakes and produces zero false alarms on the concurrent, diurnally variable noise. Our results are very promising for identifying changes in background seismicity associated with the presence or absence of brine release episodes. We thereby suggest that our methodology could be applied to longer time periods to establish a brine-release monitoring program for Blood Falls based on icequake detections.
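    A minimal sketch of the first stage, a noise-adaptive power detector, follows: windows whose short-term power exceeds a threshold scaled by a running noise estimate are flagged. The window lengths, threshold factor, and synthetic data are illustrative assumptions, not the study's settings.

        # Minimal sketch of a noise-adaptive power detector: trigger where
        # short-term power exceeds k times a running (median) noise floor.
        import numpy as np

        def power_detector(x, fs, win_s=0.5, noise_s=30.0, k=5.0):
            win, noise_win = int(win_s * fs), int(noise_s * fs)
            power = np.convolve(x**2, np.ones(win) / win, mode="same")
            triggers = []
            for i in range(noise_win, len(x)):
                noise_level = np.median(power[i - noise_win:i])  # adaptive floor
                if power[i] > k * noise_level:
                    triggers.append(i)
            return np.array(triggers)

        fs = 200.0                                  # 200 Hz, as in the network
        rng = np.random.default_rng(0)
        x = rng.normal(0, 1, int(60 * fs))
        x[6000:6100] += 8 * rng.normal(0, 1, 100)   # synthetic icequake burst
        print(len(power_detector(x, fs)))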

  18. Specializing architectures for the type 2 diabetes mellitus care use cases with a focus on process management.

    PubMed

    Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Ruiz, Alonso A

    2015-01-01

    The development of software supporting inter-disciplinary systems such as type 2 diabetes mellitus care requires methodologies designed for this type of interoperability. The GCM framework allows the architectural description of such systems and the development of software solutions based on it. The first step of the GCM methodology is the definition of a generic architecture, followed by its specialization for specific use cases. This paper describes the specialization of the generic architecture of a system supporting type 2 diabetes mellitus glycemic control for a pharmacotherapy use case. It focuses on the behavioral aspect of the system, i.e., the policy domain and the definition of the rules governing the system. The design of this architecture reflects the inter-disciplinary character of the methodology. Finally, the resulting architecture allows building adaptive, intelligent, and complete systems.

  19. High redshift galaxies in the ALHAMBRA survey. I. Selection method and number counts based on redshift PDFs

    NASA Astrophysics Data System (ADS)

    Viironen, K.; Marín-Franch, A.; López-Sanjuan, C.; Varela, J.; Chaves-Montero, J.; Cristóbal-Hornillos, D.; Molino, A.; Fernández-Soto, A.; Vilella-Rojo, G.; Ascaso, B.; Cenarro, A. J.; Cerviño, M.; Cepa, J.; Ederoclite, A.; Márquez, I.; Masegosa, J.; Moles, M.; Oteo, I.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Benítez, N.; Broadhurst, T.; Cabrera-Caño, J.; Castander, J. F.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Martínez, V. J.; Perea, J.; Prada, F.; Quintana, J. M.

    2015-04-01

    Context. Most observational results on the high redshift restframe UV-bright galaxies are based on samples pinpointed using the so-called dropout technique or Ly-α selection. However, the availability of multifilter data now allows the dropout selections to be replaced by direct methods based on photometric redshifts. In this paper we present the methodology to select and study the population of high redshift galaxies in the ALHAMBRA survey data. Aims: Our aim is to develop a less biased methodology than the traditional dropout technique to study the high redshift galaxies in ALHAMBRA and other multifilter data. Thanks to the wide area ALHAMBRA covers, we especially aim at contributing to the study of the brightest, least frequent, high redshift galaxies. Methods: The methodology is based on redshift probability distribution functions (zPDFs). It is shown how a clean galaxy sample can be obtained by selecting the galaxies with high integrated probability of being within a given redshift interval. However, reaching both a complete and clean sample with this method is challenging. Hence, a method to derive statistical properties by summing the zPDFs of all the galaxies in the redshift bin of interest is introduced. Results: Using this methodology we derive the galaxy rest frame UV number counts in five redshift bins centred at z = 2.5, 3.0, 3.5, 4.0, and 4.5, being complete up to the limiting magnitude at mUV(AB) = 24, where mUV refers to the first ALHAMBRA filter redwards of the Ly-α line. With the wide field ALHAMBRA data we especially contribute to the study of the brightest ends of these counts, accurately sampling the surface densities down to mUV(AB) = 21-22. Conclusions: We show that using the zPDFs it is easy to select a very clean sample of high redshift galaxies. We also show that it is better to do statistical analysis of the properties of galaxies using a probabilistic approach, which takes into account both the incompleteness and contamination issues in a natural way. Based on observations collected at the German-Spanish Astronomical Center, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC).
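
    The two selection modes the abstract describes (a clean sample from each galaxy's integrated probability, and statistical counts from summing zPDFs) reduce to a few array operations. The sketch below is an illustrative reading under stated assumptions, not the survey pipeline; the 0.9 probability cutoff and the array names are placeholders.

    ```python
    import numpy as np
    from scipy.integrate import trapezoid

    def zpdf_selection(z_grid, zpdfs, z_lo, z_hi, p_min=0.9):
        """zpdfs: (n_gal, n_z) redshift PDFs sampled on z_grid, each normalized
        so its integral over the full grid equals 1."""
        in_bin = (z_grid >= z_lo) & (z_grid <= z_hi)
        # integrated probability of each galaxy lying in [z_lo, z_hi]
        p_bin = trapezoid(zpdfs[:, in_bin], z_grid[in_bin], axis=1)
        clean = np.where(p_bin >= p_min)[0]   # clean sample, but not complete
        n_eff = p_bin.sum()                   # statistical count: sum of zPDFs
        return clean, n_eff
    ```

    Summing p_bin over all galaxies weighs each object by its probability of lying in the bin, which is how incompleteness and contamination enter the counts in a natural, probabilistic way.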

  20. Commerce Laboratory: Mission analysis payload integration study

    NASA Technical Reports Server (NTRS)

    Bannister, T. C.

    1984-01-01

    A mission model which will accommodate commercial users and provide a basic data base for further mission planning is reported. The data bases to be developed are: (1) user requirements; (2) apparatus capabilities and availabilities; and (3) carrier capabilities. These data bases are synthesized in a trades and analysis phase along with the STS flight apparatus, and optimum missions will be identified. The completed work is reported. The user requirements data base was expanded to identify within the six scientific disciplines the areas of investigation, investigation categories and status, potential commercial application, interested parties, process, and experiment requirements. The scope of the apparatus data base was expanded to indicate apparatus status as to whether it is ground or flight equipment and, within both categories, whether the apparatus is: (1) existing, (2) under development, (3) planned, or (4) needed. Applications for the apparatus are listed. The methodology is revised in the areas of trades and analysis and mission planning. The carrier capabilities data base was updated and completed.

  1. Predictions of CD4 lymphocytes’ count in HIV patients from complete blood count

    PubMed Central

    2013-01-01

    Background HIV diagnosis, prognosis and treatment require the T CD4 lymphocyte count from flow cytometry, an expensive technique often not available to people in developing countries. The aim of this work is to apply a previously developed methodology that predicts the T CD4 lymphocyte value from the total white blood cell (WBC) count and lymphocyte count, applying set theory to information taken from the Complete Blood Count (CBC). Methods Set theory was used to classify into groups named A, B, C and D the number of leucocytes/mm3, lymphocytes/mm3, and the CD4/μL subpopulation per flow cytometry of 800 HIV-diagnosed patients. Unions between sets A and C, and B and D were assessed, and the intersection between both unions was described in order to establish the belonging percentage to these sets. Results were classified into eight ranges of 1000 leucocytes/mm3 each, calculating the belonging percentage of each range with respect to the whole sample. Results The intersection (A ∪ C) ∩ (B ∪ D) showed a prediction effectiveness of 81.44% for the range between 4000 and 4999 leukocytes, 91.89% for the range between 3000 and 3999, and 100% for the range below 3000. Conclusions The usefulness and clinical applicability of a methodology based on set theory were confirmed for predicting the T CD4 lymphocyte value, beginning with the WBC and lymphocyte counts from the CBC. This methodology is new, objective, and has lower costs than flow cytometry, which is currently considered the gold standard. PMID:24034560

  2. Accuracy and Calibration of Computational Approaches for Inpatient Mortality Predictive Modeling.

    PubMed

    Nakas, Christos T; Schütz, Narayan; Werners, Marcus; Leichtle, Alexander B

    2016-01-01

    Electronic Health Record (EHR) data can be a key resource for decision-making support in clinical practice in the "big data" era. The complete database of hospital admissions from early 2012 to late 2015 at Inselspital Bern, the largest Swiss university hospital, was used in this study, involving over 100,000 admissions. Age, sex, and initial laboratory test results were the features/variables of interest for each admission, the outcome being inpatient mortality. Computational decision support systems were utilized for the calculation of the risk of inpatient mortality. We assessed the recently proposed Acute Laboratory Risk of Mortality Score (ALaRMS) model, and further built generalized linear models, generalized estimating equations, artificial neural networks, and decision tree systems for the predictive modeling of the risk of inpatient mortality. The Area Under the ROC Curve (AUC) for ALaRMS marginally corresponded to the anticipated accuracy (AUC = 0.858). Penalized logistic regression methodology provided a better result (AUC = 0.872). Decision tree and neural network-based methodology provided even higher predictive performance (up to AUC = 0.912 and 0.906, respectively). Additionally, decision tree-based methods can efficiently handle EHR data with a significant amount of missing records (in up to >50% of the studied features), eliminating the need for imputation in order to have complete data. In conclusion, we show that statistical learning methodology can provide superior predictive performance in comparison to existing methods and can also be production ready. Statistical modeling procedures provided unbiased, well-calibrated models that can be efficient decision support tools for predicting inpatient mortality and assigning preventive measures.
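
    As an illustration of the penalized regression baseline the abstract reports, the sketch below fits an L2-penalized logistic regression and scores it by AUC. The synthetic matrix stands in for the age/sex/laboratory features (the real Inselspital data are not public), and none of the parameter choices here come from the paper.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # toy stand-in for age, sex and initial laboratory results per admission
    rng = np.random.default_rng(1)
    X = rng.standard_normal((5000, 20))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(5000) > 2.0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    # L2-penalized logistic regression; C is the inverse regularization strength
    model = make_pipeline(StandardScaler(),
                          LogisticRegression(penalty="l2", C=1.0, max_iter=1000))
    model.fit(X_tr, y_tr)
    print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
    ```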

  3. Adapting total quality management for general practice: evaluation of a programme.

    PubMed Central

    Lawrence, M; Packwood, T

    1996-01-01

    OBJECTIVE: Assessment of the benefits and limitations of a quality improvement programme based on total quality management principles in general practice over a period of one year (October 1993-4). DESIGN: Questionnaires to practice team members before any intervention and after one year. Three progress reports completed by facilitators at four month intervals. Semistructured interviews with a sample of staff from each practice towards the end of the year. SETTING: 18 self selected practices from across the former Oxford Region. Three members of each practice received an initial residential course and three one day seminars during the year. Each practice was supported by a facilitator from their Medical Audit Advisory Group. MEASURES: Extent of understanding and implementation of quality improvement methodology. Number, completeness, and evaluation of quality improvement projects. Practice team members' attitudes to and involvement in team working and quality improvement. RESULTS: 16 of the 18 practices succeeded in implementing the quality improvement methods. 48 initiatives were considered and staff involvement was broad. Practice members showed increased involvement in, and appreciation of, strategic planning and team working, and satisfaction from improved patient services. 11 of the practices intend to continue with the methodology. The commonest barrier expressed was time. CONCLUSION: Quality improvement programmes based on total quality management principles produce beneficial changes in service delivery and team working in most general practices, but the approach is incompatible with traditional doctor-centred practice. The methodology needs to be adapted for primary care to avoid quality improvement being seen as separate from routine activity, and to save time. PMID:10161529

  4. Rolling element bearing fault diagnosis based on Over-Complete rational dilation wavelet transform and auto-correlation of analytic energy operator

    NASA Astrophysics Data System (ADS)

    Singh, Jaskaran; Darpe, A. K.; Singh, S. P.

    2018-02-01

    Local damage in rolling element bearings usually generates periodic impulses in vibration signals. The severity, repetition frequency and the resonance zone excited by these impulses are the key indicators for diagnosing bearing faults. In this paper, a methodology based on the over-complete rational dilation wavelet transform (ORDWT) is proposed, as it enjoys good shift invariance. ORDWT offers flexibility in partitioning the frequency spectrum to generate a number of subbands (filters) with diverse bandwidths. The selection of the optimal filter that best overlaps with the fault-excited resonance zone is based on the maximization of a proposed impulse detection measure, "temporal energy operated auto-correlated kurtosis". The proposed indicator is robust and consistent in evaluating the impulsiveness of fault signals in the presence of interfering vibration such as heavy background noise or sporadic shocks unrelated to the fault or normal operation. The structure of the proposed indicator enables it to be sensitive to fault severity. For enhanced fault classification, an autocorrelation of the energy time series of the signal filtered through the optimal subband is proposed. The application of the proposed methodology is validated on simulated and experimental data. The study shows that the performance of the proposed technique is more robust and consistent in comparison to the original fast kurtogram and the wavelet kurtogram.
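
    The indicator's building blocks (an energy operator, an autocorrelation, and a kurtosis) are standard pieces. The sketch below is one plausible reading of that chain using the discrete Teager-Kaiser operator; it is not the paper's exact formulation, and the ORDWT subband filtering is assumed to have happened upstream.

    ```python
    import numpy as np
    from scipy.stats import kurtosis

    def teager_kaiser(x):
        """Discrete Teager-Kaiser energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
        return x[1:-1] ** 2 - x[:-2] * x[2:]

    def energy_autocorr_kurtosis(x):
        """Kurtosis of the normalized autocorrelation of the signal's
        Teager-Kaiser energy series (one plausible reading of the paper's
        'temporal energy operated auto-correlated kurtosis' indicator)."""
        e = teager_kaiser(x)
        e = e - e.mean()
        ac = np.correlate(e, e, mode="full")[len(e) - 1:]  # non-negative lags
        ac = ac / ac[0]                                    # normalize by zero lag
        return kurtosis(ac, fisher=False)
    ```

    In use, the indicator would be evaluated on each candidate ORDWT subband and the subband with the highest value selected as the fault-excited resonance zone.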

  5. Designing prospective cohort studies for assessing reproductive and developmental toxicity during sensitive windows of human reproduction and development--the LIFE Study.

    PubMed

    Buck Louis, Germaine M; Schisterman, Enrique F; Sweeney, Anne M; Wilcosky, Timothy C; Gore-Langton, Robert E; Lynch, Courtney D; Boyd Barr, Dana; Schrader, Steven M; Kim, Sungduk; Chen, Zhen; Sundaram, Rajeshwari

    2011-09-01

    The relationship between the environment and human fecundity and fertility remains virtually unstudied from a couple-based perspective in which longitudinal exposure data and biospecimens are captured across sensitive windows. In response, we completed the LIFE Study, whose methodology was intended to empirically evaluate purported methodological challenges: implementation of population-based sampling frameworks suitable for recruiting couples planning pregnancy; obtaining environmental data across sensitive windows of reproduction and development; home-based biospecimen collection; and development of a data management system for hierarchical exposome data. We used two sampling frameworks (i.e., a fish/wildlife licence registry and a direct marketing database) for 16 targeted counties with presumed environmental exposures to persistent organochlorine chemicals to recruit 501 couples planning pregnancies for prospective longitudinal follow-up while trying to conceive and throughout pregnancy. Enrolment rates varied from <1% of the targeted population (n = 424,423) to 42% of eligible couples who were successfully screened; 84% of the targeted population could not be reached, while 36% refused screening. Among enrolled couples, ∼85% completed daily journals while trying; 82% of pregnant women completed daily early pregnancy journals, and 80% completed monthly pregnancy journals. All couples provided baseline blood/urine samples; 94% of men provided one or more semen samples and 98% of women provided one or more saliva samples. Women successfully used urinary fertility monitors for identifying ovulation and home pregnancy test kits. Couples can be recruited for preconception cohorts and will comply with intensive data collection across sensitive windows. However, appropriately sized sampling frameworks are critical, given the small percentage of couples contacted who were found eligible and reportedly planning pregnancy at any point in time. © Published 2011. This article is a US Government work and is in the public domain in the USA.

  6. Short-term forecasting of turbidity in trunk main networks.

    PubMed

    Meyers, Gregory; Kapelan, Zoran; Keedwell, Edward

    2017-11-01

    Water discolouration is an increasingly important and expensive issue due to rising customer expectations, tighter regulatory demands and ageing Water Distribution Systems (WDSs) in the UK and abroad. This paper presents a new turbidity forecasting methodology capable of aiding operational staff and enabling proactive management strategies. The turbidity forecasting methodology developed here is completely data-driven and does not require a hydraulic or water quality network model, which would be expensive to build and maintain. The methodology is tested and verified on a real trunk main network with observed turbidity measurement data. Results obtained show that the methodology can detect if discolouration material is mobilised, estimate whether sufficient turbidity will be generated to exceed a preselected threshold, and approximate how long the material will take to reach the downstream meter. Classification-based forecasts of turbidity can be reliably made up to 5 h ahead, although at the expense of increased false alarm rates. The methodology presented here could be used as an early warning system enabling a multitude of cost-beneficial proactive management strategies to be implemented as an alternative to expensive trunk mains cleaning programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
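
    The abstract does not name the learning algorithm, so the sketch below illustrates only the general shape of such a data-driven, classification-based forecast: lagged turbidity readings as features, threshold exceedance a few steps ahead as the label. The random forest, the 1.0 NTU threshold, the lag count and the toy series are all placeholders, not choices from the paper.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def make_lagged_features(turbidity, n_lags=12, horizon=5):
        """Predict threshold exceedance `horizon` steps ahead from the last
        n_lags observations of the turbidity meter."""
        X, y = [], []
        for t in range(n_lags, len(turbidity) - horizon):
            X.append(turbidity[t - n_lags:t])
            y.append(turbidity[t + horizon] > 1.0)  # hypothetical 1.0 NTU threshold
        return np.array(X), np.array(y, dtype=int)

    # toy series; a real deployment would use the trunk-main turbidity data
    rng = np.random.default_rng(2)
    series = np.abs(rng.standard_normal(2000)).cumsum() % 3
    X, y = make_lagged_features(series)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X[:1500], y[:1500])
    print("exceedance hit rate:", (clf.predict(X[1500:]) == y[1500:]).mean())
    ```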

  7. 77 FR 24221 - Agency Information Collection Activities: Proposed Collection; Comments Requested; Research To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-23

    ... collection: Extension of the time frame required to complete approved and ongoing methodological research on... methodological research on the National Crime Victimization Survey. (2) Title of the Form/Collection: National.... This generic clearance will cover methodological research that will use existing or new sampled...

  8. 75 FR 46942 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-04

    ... employers. Should any needed methodological changes be identified, NIOSH will submit a request for modification to OMB. If no substantive methodological changes are required, the phase II study will proceed and... complete the questionnaire on the web or by telephone at that time.) Assuming no methodological changes...

  9. Evaluation of Model-Based Training for Vertical Guidance Logic

    NASA Technical Reports Server (NTRS)

    Feary, Michael; Palmer, Everett; Sherry, Lance; Polson, Peter; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper summarizes the results of a study that introduces a structured, model-based approach to learning how the automated vertical guidance system works on a modern commercial air transport. The study proposes a framework to provide accurate and complete information in an attempt to eliminate confusion about 'what the system is doing'. It examines a structured methodology for organizing the ideas on which the system was designed, communicating this information through the training material, and displaying it in the airplane. Previous research on model-based, computer-aided instructional technology has shown reductions in the time needed to reach a specified level of competence. The lessons learned from the development of these technologies are well suited for use with the design methodology that was used to develop the vertical guidance logic for a large commercial air transport. The design methodology presents the model from which to derive the training material and the content of the information to be displayed to the operator. The study consists of a 2 x 2 factorial experiment comparing a new method of training vertical guidance logic and a new type of display. The format of the material used to derive both the training and the display is provided by the Operational Procedure Methodology. The training condition compares current training material to the new structured format. The display condition involves changing the content of the displayed information into pieces that agree with the concepts with which the system was designed.

  10. Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.

    PubMed

    Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin

    2017-08-16

    The objective consensus methodology has recently been applied to consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of the treatment algorithms of the participating centers, which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.

  11. FAME, a microprocessor based front-end analysis and modeling environment

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. D.; Kutin, E. B.

    1980-01-01

    Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor based system for interactively developing, analyzing, and displaying system models in a low cost user-friendly environment. The nature of the model is such that when completed it can be the basis for projection to a variety of forms such as structured design diagrams, Petri-nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used one can check on proper usage of data types, functions, and control structures thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.

  12. EHR-based disease registries to support integrated care in a health neighbourhood: an ontology-based methodology.

    PubMed

    Liaw, Siaw-Teng; Taggart, Jane; Yu, Hairong

    2014-01-01

    Disease registries derived from Electronic Health Records (EHRs) are widely used for chronic disease management. We approached registries from the perspective of integrated care in a health neighbourhood, considering data quality issues such as semantic interoperability (consistency), accuracy, completeness and duplication. Our proposition is that a realist ontological approach is required to accurately identify patients in an EHR or data repository, assess data quality and fitness for use by the multidisciplinary integrated care team. We report on this approach with routinely collected data in a practice based research network in Australia.

  13. Methodology for evaluating pattern transfer completeness in inkjet printing with irregular edges

    NASA Astrophysics Data System (ADS)

    Huang, Bo-Cin; Chan, Hui-Ju; Hong, Jian-Wei; Lo, Cheng-Yao

    2016-06-01

    A methodology for quantifying and qualifying pattern transfer completeness in inkjet printing is proposed that examines both pattern dimensions and pattern contour deviations from the reference design. It enables the scientific identification and evaluation of inkjet-printed lines, corners, circles, ellipses, and spirals with irregular edges (bulging, necking, and unpredictable distortions) resulting from different process conditions. This methodology not only avoids differences in individual perceptions of ambiguous pattern distortions but also reveals the systematic effects of mechanical stresses applied in different directions to a polymer substrate, and it is effective for both optical and electrical microscopy in direct and indirect lithography as well as lithography-free patterning.
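
    The abstract does not give the metric's formula, so the sketch below shows one generic way to score contour deviation from a reference design: the symmetric Hausdorff distance (worst-case edge excursion) plus the mean nearest-neighbour deviation. Both the function and the bulging-circle example are illustrative assumptions, not the authors' measure.

    ```python
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def contour_deviation(printed, reference):
        """Worst-case and mean deviation of a printed contour from its
        reference design; both inputs are (n, 2) arrays of edge points."""
        d_ab = directed_hausdorff(printed, reference)[0]
        d_ba = directed_hausdorff(reference, printed)[0]
        worst = max(d_ab, d_ba)  # symmetric Hausdorff distance
        # mean nearest-neighbour distance of printed points to the reference
        diffs = printed[:, None, :] - reference[None, :, :]
        mean_dev = np.linalg.norm(diffs, axis=2).min(axis=1).mean()
        return worst, mean_dev

    # toy usage: a unit circle versus a copy with periodic edge bulging
    theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    ref = np.c_[np.cos(theta), np.sin(theta)]
    printed = ref * (1 + 0.05 * np.sin(5 * theta))[:, None]
    print(contour_deviation(printed, ref))
    ```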

  14. Developing Dementia Prevention Trials: Baseline Report of the Home-Based Assessment Study

    PubMed Central

    Sano, Mary; Egelko, Susan; Donohue, Michael; Ferris, Steven; Kaye, Jeffrey; Hayes, Tamara L.; Mundt, James C.; Sun, C.K.; Paparello, Silvia; Aisen, Paul S.

    2014-01-01

    This report describes the baseline experience of the multi-center Home-Based Assessment (HBA) study, designed to develop methods for dementia prevention trials using novel technologies for test administration and data collection. Non-demented individuals ≥75 years old were recruited and evaluated in person using established clinical trial outcomes of cognition and function, and randomized to one of three assessment methodologies: 1) mail-in questionnaire/live telephone interviews (MIP); 2) automated telephone with interactive voice recognition (IVR); and 3) internet-based computer kiosk (KIO). Brief versions of cognitive and non-cognitive outcomes were adapted to each methodology and administered at baseline and repeatedly over a 4-year period. "Efficiency" measures assessed the time from screening to baseline and the staff time required for each methodology. 713 individuals signed consent and were screened; 640 met eligibility and were randomized to one of the three assessment arms, and 581 completed baseline. Dropout, time from screening to baseline, and total staff time were highest among those assigned to KIO. However, efficiency measures were driven by non-recurring start-up activities, suggesting that differences may be mitigated over a long trial. Performance among HBA instruments collected via different technologies will be compared to established outcomes over this 4-year study. PMID:23151596

  15. How completely are physiotherapy interventions described in reports of randomised trials?

    PubMed

    Yamato, Tiê P; Maher, Chris G; Saragiotto, Bruno T; Hoffmann, Tammy C; Moseley, Anne M

    2016-06-01

    Incomplete descriptions of interventions are a common problem in reports of randomised controlled trials. To date no study has evaluated the completeness of the descriptions of physiotherapy interventions. The aim was to evaluate the completeness of the descriptions of physiotherapy interventions in a random sample of 200 reports of randomised controlled trials (RCTs) from the PEDro database. We included full text papers, written in English, reporting trials with two arms, and evaluating any type of physiotherapy intervention across all subdisciplines. Methodological quality was evaluated using the PEDro scale, and completeness of intervention description was evaluated using the Template for Intervention Description and Replication (TIDieR) checklist. The proportion and 95% confidence interval were calculated for intervention and control groups, and used to examine the relationship between completeness and methodological quality, and between completeness and subdiscipline. Completeness of intervention reporting in physiotherapy RCTs was poor. For intervention groups, 46 (23%) trials did not describe at least half of the items. Reporting was worse for control groups: 149 (75%) trials described fewer than half of the items. There was no clear difference in completeness across subdisciplines or levels of methodological quality. Our sample was restricted to trials published in English in 2013. Descriptions of interventions in physiotherapy RCTs are typically incomplete. Authors and journals should aim for more complete descriptions of interventions in physiotherapy trials. Copyright © 2016 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
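
    The proportions with 95% confidence intervals that the abstract reports can be reproduced with a standard interval for a binomial proportion. The sketch below uses the Wilson score interval, a common choice; the paper does not state which interval its authors used, so this is an assumption, illustrated on the 46-of-200 figure.

    ```python
    import math

    def wilson_ci(successes, n, z=1.96):
        """Wilson score 95% confidence interval for a proportion, e.g. the
        share of trials failing to describe at least half of the TIDieR items."""
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
        return centre - half, centre + half

    print(wilson_ci(46, 200))  # roughly (0.177, 0.293)
    ```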

  16. SUPERFAMILY 1.75 including a domain-centric gene ontology method.

    PubMed

    de Lima Morais, David A; Fang, Hai; Rackham, Owen J L; Wilson, Derek; Pethica, Ralph; Chothia, Cyrus; Gough, Julian

    2011-01-01

    The SUPERFAMILY resource provides protein domain assignments at the structural classification of proteins (SCOP) superfamily level for over 1400 completely sequenced genomes, over 120 metagenomes, and other gene collections such as UniProt. All models and assignments are available to browse and download at http://supfam.org. A new hidden Markov model library based on SCOP 1.75 has been created, and a previously ignored class of SCOP, coiled coils, is now included. Our scoring component now uses HMMER3, which is orders of magnitude faster and produces superior results. A cloud-based pipeline was implemented and is publicly available on the Amazon Web Services Elastic Compute Cloud. The SUPERFAMILY reference tree of life has been improved, allowing the user to highlight a chosen superfamily, family or domain architecture on the tree of life. The most significant advance in SUPERFAMILY is that it now contains a domain-based gene ontology (GO) at the superfamily and family levels. A new methodology was developed to ensure a high-quality GO annotation. The new methodology is general purpose and has been used to produce domain-based phenotypic ontologies in addition to GO.

  17. Effectiveness of the Comprehensive Approach to Rehabilitation (CARe) methodology: design of a cluster randomized controlled trial.

    PubMed

    Bitter, Neis A; Roeg, Diana P K; van Nieuwenhuizen, Chijs; van Weeghel, Jaap

    2015-07-22

    There is an increasing amount of evidence for the effectiveness of rehabilitation interventions for people with severe mental illness (SMI). In the Netherlands, a rehabilitation methodology that is well known and often applied is the Comprehensive Approach to Rehabilitation (CARe) methodology. The overall goal of the CARe methodology is to improve the client's quality of life by supporting the client in realizing his/her goals and wishes, handling his/her vulnerability and improving the quality of his/her social environment. The methodology is strongly influenced by the concept of 'personal recovery' and the 'strengths case management model'. No controlled effect studies have been conducted hitherto regarding the CARe methodology. This study is a two-armed cluster randomized controlled trial (RCT) that will be executed in teams from three organizations for sheltered and supported housing, which provide services to people with long-term severe mental illness. Teams in the intervention group will receive the multiple-day CARe methodology training from a specialized institute and start working according to the CARe methodology guideline. Teams in the control group will continue working in their usual way. Standardized questionnaires will be completed at baseline (T0), and 10 (T1) and 20 months (T2) post baseline. Primary outcomes are recovery, social functioning and quality of life. The model fidelity of the CARe methodology will be assessed at T1 and T2. This study is the first controlled effect study on the CARe methodology and one of the few RCTs on a broad rehabilitation method or strengths-based approach. This study is relevant because mental health care organizations have become increasingly interested in recovery- and rehabilitation-oriented care. The trial registration number is ISRCTN77355880.

  18. Microfabrication and integration of a sol-gel PZT folded spring energy harvester.

    PubMed

    Lueke, Jonathan; Badr, Ahmed; Lou, Edmond; Moussa, Walied A

    2015-05-26

    This paper presents the methodology and challenges experienced in the microfabrication, packaging, and integration of a fixed-fixed folded spring piezoelectric energy harvester. A variety of challenges were overcome in the fabrication of the energy harvesters, such as the diagnosis and rectification of sol-gel PZT film quality and adhesion issues. A packaging and integration methodology was developed to allow for characterizing the harvesters under a base vibration. The conditioning circuitry developed allowed for a complete energy harvesting system, consisting of a harvester, a voltage doubler, a voltage regulator and a NiMH battery. A feasibility study was undertaken with the designed conditioning circuitry to determine the effect of the input parameters on the overall performance of the circuit. It was found that the maximum efficiency does not correlate with the maximum charging current supplied to the battery. The efficiency and charging current must be balanced to achieve a high output and a reasonable output current. The development of the complete energy harvesting system allows for the direct integration of the energy harvesting technology into existing power management schemes for wireless sensing.

  20. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    PubMed

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on the quality control of its products, because quality is critical for both the production process and consumer safety. Under the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology is proposed for selecting the calibration set, the "process spectrum", into which physical changes in the samples at each stage are algebraically incorporated. We also established a "model space", defined by Hotelling's T2 and Q-residual statistics, for outlier identification (inside/outside the defined space) in order to select objectively the factors to be used in constructing the calibration set. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry. Copyright © 2015 Elsevier B.V. All rights reserved.
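
    For the "model space" step, Hotelling's T2 and Q (squared prediction error) statistics are conventionally computed from a principal component model of the calibration spectra. The sketch below is a minimal version of that convention, assuming PCA as the latent-variable model and using placeholder data; the paper's actual component count and control limits are not specified here.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def t2_q_statistics(X, n_components=3):
        """Hotelling's T2 and Q (squared prediction error) of each spectrum
        relative to a PCA model space."""
        pca = PCA(n_components=n_components).fit(X)
        scores = pca.transform(X)
        # T2: Mahalanobis-like distance of the scores within the model plane
        t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
        # Q: squared residual distance off the model plane
        q = np.sum((X - pca.inverse_transform(scores)) ** 2, axis=1)
        return t2, q

    rng = np.random.default_rng(4)
    spectra = rng.standard_normal((50, 200))  # 50 toy spectra x 200 wavelengths
    t2, q = t2_q_statistics(spectra)
    # spectra with T2 or Q above chosen control limits fall outside the model space
    ```

    Control limits for T2 are commonly taken from an F-distribution and for Q from the Jackson-Mudholkar approximation, though any validated limit would serve the same in/out decision.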

  1. Building the Qualification File of EGNOS with DOORS

    NASA Astrophysics Data System (ADS)

    Fabre, J.

    2008-08-01

    EGNOS, the European Satellite-Based Augmentation System (SBAS) to GPS, is approaching its final deployment and initial operations, moving towards qualification and certification to reach operational capability by 2008/2009. A very important milestone in the development process is the System Qualification Review (QR). As the verification phase aims at demonstrating that the EGNOS system design meets the applicable requirements, the QR declares the completion of verification activities. The main document to present at the QR is a consolidated, consistent and complete Qualification file. The information included shall give confidence to the QR reviewers that the performed qualification activities are complete. An important issue for the project team is therefore to focus on concise and consistent information, and to make the presentation as clear as possible. Traceability to applicable requirements shall be systematically presented. Moreover, in order to support verification justification, references to details shall be available, and the reviewer shall be able to link automatically to the documents containing this detailed information. In that frame, Thales Alenia Space has implemented strong methodology and tool support, providing the System Engineering and Verification teams with a single reference technical database in which all team members consult the applicable requirements, compliance, justification and design data, and record the information necessary to build the final Qualification file. This paper presents the EGNOS context, the Qualification file contents, and the methodology implemented, based on Thales Alenia Space practices and in line with ECSS. Finally, it shows how the Qualification file is built in a DOORS environment.

  2. Assessment of perceptions of clinical management in courses oriented by competency.

    PubMed

    Gomes, Romeu; Padilha, Roberto de Queiroz; Lima, Valéria Vernaschi; Silva, Cosme Marcelo Furtado Passos da

    2018-01-01

    The study aims to assess perceptions of mastery of clinical management abilities among participants of competency-oriented courses based on active teaching and learning methodologies, before and after the training process offered. Three conceptual frameworks were utilized: clinical management, the expectation of self-efficacy, and the holistic concept of competency. Methodologically, an electronic instrument using a Likert scale was made available to students of the training courses in two stages: before the courses were undertaken and after their completion. The group of subjects that participated in both stages comprised 825 trainees. Means, standard deviations, and the Wilcoxon test were utilized in the analysis. Overall, the perception of mastery of clinical management abilities increased after the courses, indicating a positive contribution of the students' training process. Among other aspects of the results, it is concluded that the educational initiatives studied, oriented by competency and based on active teaching and learning methodologies, can increase participants' perception of their mastery of the abilities in the competency profile, confirming the study's hypothesis.

  3. Bayesian Hierarchical Models to Augment the Mediterranean Forecast System

    DTIC Science & Technology

    2010-09-30

    In part 2 (Bonazzi et al., 2010), the impact of the ensemble forecast methodology based on MFS-Wind-BHM perturbations is documented. Forecast...absence of dt data stage inputs, the forecast impact of MFS-Error-BHM is neutral. Experiments are underway now to introduce dt back into the MFS-Error...BHM and quantify forecast impacts at MFS. MFS-SuperEnsemble-BHM We have assembled all needed datasets and completed algorithmic development

  4. The Operational Equations of State, 3: Recovery of the EOS for Hydrocode From the Measured Heat Capacity, Isentrope, and Hugoniot Adiabat

    DTIC Science & Technology

    2012-07-01

    hydrocode from experimental data. It is assumed that the substance in question possesses only two thermodynamic degrees of freedom – the specific volume V...excludes the possibility of phase transformations). ...we gave several examples of generating complete thermodynamically consistent equations of state (EOS). The methodology used there was based on

  5. An Example of an Improvable Rao-Blackwell Improvement, Inefficient Maximum Likelihood Estimator, and Unbiased Generalized Bayes Estimator.

    PubMed

    Galili, Tal; Meilijson, Isaac

    2016-01-02

    The Rao-Blackwell theorem offers a procedure for converting a crude unbiased estimator of a parameter θ into a "better" one, in fact unique and optimal if the improvement is based on a minimal sufficient statistic that is complete. In contrast, behind every minimal sufficient statistic that is not complete, there is an improvable Rao-Blackwell improvement. This is illustrated via a simple example based on the uniform distribution, in which a rather natural Rao-Blackwell improvement is uniformly improvable. Furthermore, in this example the maximum likelihood estimator is inefficient, and an unbiased generalized Bayes estimator performs exceptionally well. Counterexamples of this sort can be useful didactic tools for explaining the true nature of a methodology and possible consequences when some of the assumptions are violated.
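
    For reference, the conditioning step the abstract builds on can be stated compactly. This is the textbook form of the theorem, not anything specific to the paper: for an unbiased estimator δ(X) of θ and a sufficient statistic T,

    ```latex
    \[
      \delta^{*}(T) = \mathbb{E}\left[\delta(X)\mid T\right],
      \qquad
      \mathbb{E}\left[\delta^{*}(T)\right] = \theta,
      \qquad
      \operatorname{Var}\left(\delta^{*}(T)\right) \le \operatorname{Var}\left(\delta(X)\right).
    \]
    ```

    When T is also complete, the Lehmann-Scheffe theorem makes δ*(T) the unique minimum-variance unbiased estimator; the paper's point is that without completeness, δ*(T) itself may admit a further improvement.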

  6. The Nigerian national blindness and visual impairment survey: Rationale, objectives and detailed methodology

    PubMed Central

    Dineen, Brendan; Gilbert, Clare E; Rabiu, Mansur; Kyari, Fatima; Mahdi, Abdull M; Abubakar, Tafida; Ezelum, Christian C; Gabriel, Entekume; Elhassan, Elizabeth; Abiose, Adenike; Faal, Hannah; Jiya, Jonathan Y; Ozemela, Chinenyem P; Lee, Pak Sang; Gudlavalleti, Murthy VS

    2008-01-01

    Background Despite having the largest population in Africa, Nigeria has no accurate population-based data to plan and evaluate eye care services. A national survey was undertaken to estimate the prevalence and determine the major causes of blindness and low vision. This paper presents the detailed methodology used during the survey. Methods A nationally representative sample of persons aged 40 years and above was selected. Children aged 10–15 years and individuals aged <10 or 16–39 years with visual impairment were also included if they lived in households with an eligible adult. All participants had their height, weight, and blood pressure measured, followed by assessment of presenting visual acuity, refractokeratometry, A-scan ultrasonography, visual fields and best corrected visual acuity. The anterior and posterior segments of each eye were examined with a torch and direct ophthalmoscope. Participants with visual acuity of ≤6/12 in one or both eyes underwent detailed examination including applanation tonometry, dilated slit lamp biomicroscopy, lens grading and fundus photography. All those who had undergone cataract surgery were refracted and best corrected vision recorded. Causes of visual impairment by eye and for the individual were determined using a clinical algorithm recommended by the World Health Organization. In addition, 1 in 7 adults also underwent a complete work-up, as described for those with vision ≤6/12, for constructing a normative database for Nigerians. Discussion The field work for the study was completed in 30 months over the period 2005–2007 and covered 305 clusters across the entire country. Concurrently, persons aged 40+ years were examined to form a normative database. Analysis of the data is currently underway. Conclusion The methodology used was robust and adequate to provide estimates of the prevalence and causes of blindness in Nigeria. The survey will also provide information on barriers to accessing services and the quality of life of visually impaired individuals, as well as normative data for Nigerian eyes. PMID:18808712

  7. Factors affecting reproducibility between genome-scale siRNA-based screens

    PubMed Central

    Barrows, Nicholas J.; Le Sommer, Caroline; Garcia-Blanco, Mariano A.; Pearson, James L.

    2011-01-01

    RNA interference-based screening is a powerful new genomic technology which addresses gene function en masse. To evaluate factors influencing hit list composition and reproducibility, we performed two identically designed small interfering RNA (siRNA)-based, whole genome screens for host factors supporting yellow fever virus infection. These screens represent two separate experiments completed five months apart and allow the direct assessment of the reproducibility of a given siRNA technology when performed in the same environment. Candidate hit lists generated by sum rank, median absolute deviation, z-score, and strictly standardized mean difference were compared within and between whole genome screens. Application of these analysis methodologies within a single screening dataset using a fixed threshold equivalent to a p-value ≤ 0.001 resulted in hit lists ranging from 82 to 1,140 members and highlighted the tremendous impact analysis methodology has on hit list composition. Intra- and inter-screen reproducibility was significantly influenced by the analysis methodology and ranged from 32% to 99%. This study also highlighted the power of testing at least two independent siRNAs for each gene product in primary screens. To facilitate validation we conclude by suggesting methods to reduce false discovery at the primary screening stage. In this study we present the first comprehensive comparison of multiple analysis strategies, and demonstrate the impact of the analysis methodology on the composition of the “hit list”. Therefore, we propose that the entire dataset derived from functional genome-scale screens, especially if publicly funded, should be made available as is done with data derived from gene expression and genome-wide association studies. PMID:20625183
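
    Two of the four hit-calling statistics the study compares, the z-score and SSMD, have simple closed forms. The sketch below applies them to one plate's worth of toy numbers; the negative-control framing, the array sizes, and the |z| >= 3 cutoff are illustrative assumptions, not the study's settings.

    ```python
    import numpy as np

    def hit_statistics(sample, neg_ctrl):
        """Per-well z-scores against the negative-control distribution, plus
        the SSMD (strictly standardized mean difference) of sample vs. control."""
        z = (sample - neg_ctrl.mean()) / neg_ctrl.std(ddof=1)
        ssmd = (sample.mean() - neg_ctrl.mean()) / np.sqrt(
            sample.var(ddof=1) + neg_ctrl.var(ddof=1)
        )
        return z, ssmd

    rng = np.random.default_rng(5)
    sample = rng.normal(0.8, 0.2, 384)    # toy infection readout per siRNA well
    neg_ctrl = rng.normal(1.0, 0.2, 384)  # toy non-targeting control wells
    z, ssmd = hit_statistics(sample, neg_ctrl)
    hits = np.flatnonzero(z <= -3)        # wells suppressing the readout
    print(len(hits), round(ssmd, 2))
    ```

    Because each statistic ranks wells differently and thresholds are arbitrary, the resulting hit lists can diverge sharply, which is exactly the sensitivity to analysis methodology the study documents.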

  8. Value-Based Assessment of New Medical Technologies: Towards a Robust Methodological Framework for the Application of Multiple Criteria Decision Analysis in the Context of Health Technology Assessment.

    PubMed

    Angelis, Aris; Kanavos, Panos

    2016-05-01

    In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making.
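
    For concreteness, the MAVT methods the authors build on typically score each option with an additive value model. This is the standard textbook form, not a formula from the paper itself:

    ```latex
    \[
      V(a) = \sum_{i=1}^{n} w_i \, v_i(a_i),
      \qquad
      w_i \ge 0, \quad \sum_{i=1}^{n} w_i = 1,
    \]
    ```

    where the v_i are single-attribute value functions scaled to a common range (e.g., 0 to 1) and the w_i are the elicited weights. The additive form is only defensible when the criteria are mutually preferentially independent, which is one reason the selection of criteria and attributes must satisfy the desired properties the authors describe.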

  9. Completion Mindsets and Contexts in Doctoral Supervision

    ERIC Educational Resources Information Center

    Green, Pam; Bowden, John

    2012-01-01

    Purpose: Doctoral candidates are now located within a research context of performativity where the push to successfully complete in a timely manner is central. The purpose of this paper is to develop a model of completion mindset within a completion context to assist research students and supervisors. Design/methodology/approach: The research was…

  10. Health systems around the world - a comparison of existing health system rankings.

    PubMed

    Schütte, Stefanie; Acevedo, Paula N Marin; Flahault, Antoine

    2018-06-01

    Existing health systems all over the world differ, owing to the different combinations of components that can be considered in their establishment. The ranking of health systems has been a focal point for many years, especially the issue of performance. In 2000 the World Health Organization (WHO) performed a ranking to compare the performance of the health systems of its member countries. Since then, other health system rankings have been performed and the subject has become a matter of public discussion. A point of contention regarding these rankings is the methodology employed by each of them, since no gold standard exists. Therefore, this review focuses on evaluating the methodologies of the existing health system performance rankings to assess their reproducibility and transparency. A search was conducted to identify existing health system rankings, and a questionnaire was developed for the comparison of the methodologies based on the following indicators: (1) general information, (2) statistical methods, (3) data, and (4) indicators. Overall, nine rankings were identified, of which six focused on the measurement of population health without any financial component and were therefore excluded. Finally, three health system rankings were selected for this review: "Health Systems: Improving Performance" by the WHO, "Mirror, Mirror on the Wall: How the Performance of the US Health Care System Compares Internationally" by the Commonwealth Fund, and "The Most Efficient Health Care" by Bloomberg. After the rankings were compared and scored according to these indicators, the ranking performed by the WHO was considered the most complete with regard to reproducibility and transparency of methodology. This review and comparison could help establish consensus in the field of health system research, inform recommendations for future health rankings, and address the current gap in the literature.

  11. Evidence-based medicine for neurosurgeons: introduction and methodology.

    PubMed

    Linskey, Mark E

    2006-01-01

    Evidence-based medicine is a tool of considerable value for medicine and neurosurgery that provides a secure base for clinical practice and practice improvement, but it is not without inherent drawbacks, weaknesses and limitations. EBM finds answers only to those questions open to its techniques, and the best available evidence can be a far cry from scientific truth. With the support and backing of governmental agencies, professional medical societies, the AAMC, the ACGME, and the ABMS, EBM is likely here to stay. The facts that: (1) EBM philosophy and critical appraisal techniques have become fully integrated into the training and culture of our younger colleagues; (2) maintenance of certification will require individuals to demonstrate personal evidence-based practice, based on tracking and critical analysis of personal practice outcomes, as part of the performance-based learning and improvement competency; and (3) progressively growing national healthcare expenditures will necessitate increasingly basing reimbursement and funding on evidence-based effectiveness and guidelines, all point to the likelihood that complete immersion of neurosurgical practice in EBM is inevitable. This article thoroughly explores the history of EBM in medicine in general and in neurosurgery in particular. Emphasis is placed on identifying the legislative and regulatory motive forces at work behind its promulgation and the role that organized medicine has taken to facilitate and foster its acceptance and implementation. An accounting of resources open to neurosurgeons and a detailed description of EBM clinical decision-making methodology are presented. Special emphasis is placed on outlining the methodology as well as the limitations of meta-analyses, randomized clinical trials, and clinical practice parameter guidelines. Commonly perceived objections, as well as substantive problems and limitations of EBM assumptions, tools, and approaches, both for individual clinical practice and for health policy design and implementation, are explored in detail.

  12. Applications of aerospace technology in biology and medicine

    NASA Technical Reports Server (NTRS)

    Beall, H. C.; Brown, J. N.; Rouse, D. J.; Ruddle, J. C.; Scearce, R. W.

    1978-01-01

    A bipolar, donor-recipient model of medical technology transfer is introduced to provide a basis for the team's methodology. That methodology is designed (1) to identify medical problems and NASA technology that in combination constitute opportunities for successful medical products, (2) to obtain the early participation of industry in the transfer process, and (3) to obtain acceptance by the medical community of new medical products based on NASA technology. Two commercial technology transfers and five institutional technology transfers were completed in 1977. A new, commercially available teaching manikin system uses NASA-developed concepts and techniques for effective visual presentation of information and data. Drugs shipped by the National Cancer Institute to locations throughout the world are maintained at low temperatures in shipping containers that incorporate recommendations made by NASA.

  13. Towards a general object-oriented software development methodology

    NASA Technical Reports Server (NTRS)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    Object diagrams were used to design a 5000-statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000-statement Ada system and a personal computer based system that will be written in Modula-2. The design methodology evolves out of these experiences as well as out of the limitations of other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework that encompasses concepts from Yourdon, Booch, and Cherry. This general object-oriented approach handles high-level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life cycle, such as specification and testing, is being studied concurrently.

  14. [Optimization of the process of icariin hydrolysis to Baohuoside I by cellulase based on Plackett-Burman design combined with CCD response surface methodology].

    PubMed

    Song, Chuan-xia; Chen, Hong-mei; Dai, Yu; Kang, Min; Hu, Jia; Deng, Yun

    2014-11-01

    To optimize the process of icariin hydrolysis to Baohuoside I by cellulase using a Plackett-Burman design combined with central composite design (CCD) response surface methodology. The main influencing factors were selected by the Plackett-Burman design, and CCD response surface methodology was then used to optimize the hydrolysis process. Taking substrate concentration, buffer pH and reaction time as independent variables, and the conversion rate of icariin as the dependent variable, a complete quadratic response surface was fitted between the independent and dependent variables; the optimum hydrolysis conditions were analyzed intuitively from 3D surface charts, and verification tests and predictive analysis were carried out. The best enzymatic hydrolysis conditions were as follows: substrate concentration 8.23 mg/mL, buffer pH 5.12, reaction time 35.34 h. The optimum process for the hydrolysis of icariin to Baohuoside I by cellulase was thus determined by the Plackett-Burman design combined with CCD response surface methodology. The optimized enzymatic hydrolysis process is simple, convenient, accurate, reproducible and predictable.
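
    The CCD analysis step, fitting a full quadratic response surface and reading off its optimum, can be sketched generically. The design points, response values and coded ranges below are fabricated placeholders purely to show the mechanics (a toy three-level grid stands in for the actual CCD runs); they are not the study's data.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import PolynomialFeatures

    # coded factors: substrate concentration, buffer pH, reaction time
    levels = [-1.0, 0.0, 1.0]
    X = np.array([[a, b, c] for a in levels for b in levels for c in levels])
    rng = np.random.default_rng(3)
    # hypothetical conversion-rate surface with an interior optimum, plus noise
    y = (0.9 - 0.10 * (X[:, 0] - 0.2) ** 2 - 0.15 * (X[:, 1] - 0.1) ** 2
         - 0.05 * X[:, 2] ** 2 + 0.01 * rng.standard_normal(len(X)))

    # fit the full quadratic response surface used in CCD analysis
    quad = PolynomialFeatures(degree=2, include_bias=False)
    model = LinearRegression().fit(quad.fit_transform(X), y)

    # locate the predicted optimum on a fine grid of the coded region
    g = np.linspace(-1, 1, 21)
    grid = np.array([[a, b, c] for a in g for b in g for c in g])
    best = grid[model.predict(quad.transform(grid)).argmax()]
    print("predicted optimum (coded units):", best)
    ```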

  15. Potential and Limitations of an Improved Method to Produce Dynamometric Wheels

    PubMed Central

    García de Jalón, Javier

    2018-01-01

    A new methodology for the estimation of tyre-contact forces is presented. The new procedure is an evolution of a previous method based on harmonic elimination techniques, developed with the aim of producing low-cost dynamometric wheels. While the original method required stress measurement along many rim radial lines and the fulfillment of some rigid conditions of symmetry, the new methodology described in this article significantly reduces the number of required measurement points and greatly relaxes the symmetry constraints. This can be done without compromising the estimation error level. The reduction in the number of measuring radial lines increases the ripple of demodulated signals due to non-eliminated higher-order harmonics, so the calibration procedure must be adapted to this new scenario. A new calibration procedure that takes into account the angular position of the wheel is completely described. This new methodology is tested on a standard commercial five-spoke car wheel. The results obtained are qualitatively compared with those derived from the application of the former methodology, leading to the conclusion that the new method is both simpler and more robust thanks to the reduction in the number of measuring points, while the contact-force estimation error remains at an acceptable level. PMID:29439427

  16. Intelligent Management Control for Unmanned Aircraft Navigation and Formation Keeping

    DTIC Science & Technology

    2003-06-01

    support of a variety of military and civilian applications has introduced basic and applied research challenges in areas such as levels of autonomy and...methodologies based on reliable and advanced basic research results. The present paper outlines some of these aspects, although not in an exhaustive manner...The flight controller (FC) takes ΔP′ as input; Figure 3 shows the complete control block diagram. The resulting control law is ΔT = K_FC(s)ΔP′, and the FC applies the corrections ΔT

  17. Mechanics Methodology for Textile Preform Composite Materials

    NASA Technical Reports Server (NTRS)

    Poe, Clarence C., Jr.

    1996-01-01

    NASA and its contractors have completed a program to develop a basic mechanics underpinning for textile composites. Three major deliverables were produced by the program: 1. A set of test methods for measuring material properties and design allowables; 2. Mechanics models to predict the effects of the fiber preform architecture and constituent properties on engineering moduli, strength, damage resistance, and fatigue life; and 3. An electronic data base of coupon type test data. This report describes these three deliverables.

  18. Assessment of undiscovered oil and gas resources of the Cook Inlet region, south-central Alaska, 2011

    USGS Publications Warehouse

    Stanley, Richard G.; Charpentier, Ronald R.; Cook, Troy A.; Houseknecht, David W.; Klett, Timothy R.; Lewis, Kristen A.; Lillis, Paul G.; Nelson, Philip H.; Phillips, Jeffrey D.; Pollastro, Richard M.; Potter, Christopher J.; Rouse, William A.; Saltus, Richard W.; Schenk, Christopher J.; Shah, Anjana K.; Valin, Zenon C.

    2011-01-01

    The U.S. Geological Survey (USGS) recently completed a new assessment of undiscovered, technically recoverable oil and gas resources in the Cook Inlet region of south-central Alaska. Using a geology-based assessment methodology, the USGS estimates that mean undiscovered volumes of nearly 600 million barrels of oil, about 19 trillion cubic feet of natural gas, and 46 million barrels of natural gas liquids remain to be found in this area.

  19. Tactical Implications of Air Blast Variations from Nuclear Tests

    DTIC Science & Technology

    1976-11-30

    work completed under Contract DNA 001-76-C-0284. The objective of this analysis was to assess the rationale for additional underground tests (UGT) to...applications were based, and additional applications of the methodology for a more complete assessment of the UGT rationale. This report summarizes work...corresponding to a 25 percent to 50 percent reduction in yield. The maximum improvement possible through UGT is, of course, when the variance in the weapon

  20. Atmospheric profiles from active space-based radio measurements

    NASA Technical Reports Server (NTRS)

    Hardy, Kenneth R.; Hinson, David P.; Tyler, G. L.; Kursinski, E. R.

    1992-01-01

    The paper describes determinations of atmospheric profiles from space-based radio measurements and the retrieval methodology used, with special attention given to the measurement procedure and the characteristics of the soundings. It is speculated that reliable profiles of the terrestrial atmosphere can be obtained by the occultation technique from the surface to a height of about 60 km. With the full complement of 21 Global Positioning System (GPS) satellites and one GPS receiver in sun-synchronous polar orbit, a maximum of 42 soundings could be obtained for each complete orbit, or about 670 per day, providing almost uniform global coverage.
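
    The quoted sounding counts follow from simple orbital arithmetic; a back-of-the-envelope check, assuming a roughly 90-minute low-Earth-orbit period (a figure not stated in the abstract):

        gps_satellites = 21
        soundings_per_orbit = 2 * gps_satellites     # each GPS satellite rises and sets once
        orbit_minutes = 90                           # assumed LEO orbital period
        orbits_per_day = 24 * 60 / orbit_minutes     # 16 orbits per day
        print(soundings_per_orbit)                   # 42 per orbit
        print(soundings_per_orbit * orbits_per_day)  # 672, i.e. about 670 per day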

  1. Computer-aided personal interviewing. A new technique for data collection in epidemiologic surveys.

    PubMed

    Birkett, N J

    1988-03-01

    Most epidemiologic studies involve the collection of data directly from selected respondents. Traditionally, interviewers are provided with the interview in booklet form on paper and answers are recorded therein. On receipt at the study office, the interview results are coded, transcribed, and keypunched for analysis. The author's team has developed a method of personal interviewing which uses a structured interview stored on a lap-sized computer. Responses are entered into the computer and are subject to immediate error-checking and correction. All skip-patterns are automatic. Data entry to the final data-base involves no manual data transcription. A pilot evaluation with a preliminary version of the system using tape-recorded interviews in a test/re-test methodology revealed a slightly higher error rate, probably related to weaknesses in the pilot system and the training process. Computer interviews tended to be longer but other features of the interview process were not affected by computer. The author's team has now completed 2,505 interviews using this system in a community-based blood pressure survey. It has been well accepted by both interviewers and respondents. Failure to complete an interview on the computer was uncommon (5 per cent) and well-handled by paper back-up questionnaires. The results show that computer-aided personal interviewing in the home is feasible but that further evaluation is needed to establish the impact of this methodology on overall data quality.

  2. Optimal Modality Selection for Cooperative Human-Robot Task Completion.

    PubMed

    Jacob, Mithun George; Wachs, Juan P

    2016-12-01

    Human-robot cooperation in complex environments must be fast, accurate, and resilient. This requires efficient communication channels where robots need to assimilate information using a plethora of verbal and nonverbal modalities such as hand gestures, speech, and gaze. However, even though hybrid human-robot communication frameworks and multimodal communication have been studied, a systematic methodology for designing multimodal interfaces does not exist. This paper addresses the gap by proposing a novel methodology to generate multimodal lexicons which maximizes multiple performance metrics over a wide range of communication modalities (i.e., lexicons). The metrics are obtained through a mixture of simulation and real-world experiments. The methodology is tested in a surgical setting where a robot cooperates with a surgeon to complete a mock abdominal incision and closure task by delivering surgical instruments. Experimental results show that predicted optimal lexicons significantly outperform predicted suboptimal lexicons (p < 0.05) in all metrics validating the predictability of the methodology. The methodology is validated in two scenarios (with and without modeling the risk of a human-robot collision) and the differences in the lexicons are analyzed.

  3. A time-responsive tool for informing policy making: rapid realist review.

    PubMed

    Saul, Jessie E; Willis, Cameron D; Bitz, Jennifer; Best, Allan

    2013-09-05

    A realist synthesis attempts to provide policy makers with a transferable theory that suggests a certain program is more or less likely to work in certain respects, for particular subjects, in specific kinds of situations. Yet realist reviews can require considerable and sustained investment over time, which does not always suit the time-sensitive demands of many policy decisions. 'Rapid Realist Review' methodology (RRR) has been developed as a tool for applying a realist approach to a knowledge synthesis process in order to produce a product that is useful to policy makers in responding to time-sensitive and/or emerging issues, while preserving the core elements of realist methodology. Using examples from completed RRRs, we describe key features of the RRR methodology, the resources required, and the strengths and limitations of the process. All aspects of an RRR are guided by both a local reference group, and a group of content experts. Involvement of knowledge users and external experts ensures both the usability of the review products, as well as their links to current practice. RRRs have proven useful in providing evidence for and making explicit what is known on a given topic, as well as articulating where knowledge gaps may exist. From the RRRs completed to date, findings broadly adhere to four (often overlapping) classifications: guiding rules for policy-making; knowledge quantification (i.e., the amount of literature available that identifies context, mechanisms, and outcomes for a given topic); understanding tensions/paradoxes in the evidence base; and, reinforcing or refuting beliefs and decisions taken. 'Traditional' realist reviews and RRRs have some key differences, which allow policy makers to apply each type of methodology strategically to maximize its utility within a particular local constellation of history, goals, resources, politics and environment. In particular, the RRR methodology is explicitly designed to engage knowledge users and review stakeholders to define the research questions, and to streamline the review process. In addition, results are presented with a focus on context-specific explanations for what works within a particular set of parameters rather than producing explanations that are potentially transferable across contexts and populations. For policy makers faced with making difficult decisions in short time frames for which there is sufficient (if limited) published/research and practice-based evidence available, RRR provides a practical, outcomes-focused knowledge synthesis method.

  4. A low power biomedical signal processor ASIC based on hardware software codesign.

    PubMed

    Nie, Z D; Wang, L; Chen, W G; Zhang, T; Zhang, Y T

    2009-01-01

    A low power biomedical digital signal processor ASIC based on a hardware/software codesign methodology is presented in this paper. The codesign methodology was used to achieve higher system performance and design flexibility. The hardware implementation included a low power 32-bit RISC CPU (ARM7TDMI), a low power AHB-compatible bus, and a scalable digital co-processor optimized for low power Fast Fourier Transform (FFT) calculations. The co-processor could be scaled for 8-point, 16-point and 32-point FFTs, taking approximately 50, 100 and 150 clock cycles, respectively. The complete design was intensively simulated using the ARM DSM model and emulated on the ARM Versatile platform before being committed to silicon. The multi-million-gate ASIC was fabricated using SMIC 0.18 microm mixed-signal CMOS 1P6M technology. The die measures 5,000 microm x 2,350 microm. Power consumption was approximately 3.6 mW at a 1.8 V supply and a 1 MHz clock rate. The power consumption for FFT calculations was less than 1.5% of that of a conventional embedded software-based solution.

  5. Symmetrical compression distance for arrhythmia discrimination in cloud-based big-data services.

    PubMed

    Lillo-Castellano, J M; Mora-Jiménez, I; Santiago-Mozos, R; Chavarría-Asso, F; Cano-González, A; García-Alberola, A; Rojo-Álvarez, J L

    2015-07-01

    The current development of cloud computing is completely changing the paradigm of knowledge extraction in huge databases. An example of this technology in the cardiac arrhythmia field is the SCOOP platform, a national-level scientific cloud-based big-data service for implantable cardioverter defibrillators. In this scenario, we propose a new methodology for automatic classification of intracardiac electrograms (EGMs) in a cloud computing system, designed for minimal signal preprocessing. A new compression-based similarity measure (CSM), the so-called weighted fast compression distance, is created for low computational burden and provides better performance when compared with other CSMs in the literature. Using simple machine learning techniques, a set of 6848 EGMs extracted from the SCOOP platform were classified into seven cardiac arrhythmia classes and one noise class, reaching nearly 90% accuracy when previous patient arrhythmia information was available and 63% otherwise, hence in all cases overcoming the classification provided by the majority class. Results show that this methodology can be used as a high-quality cloud computing service, providing support to physicians for improving knowledge on patient diagnosis.
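
    The weighted fast compression distance itself is not specified in the abstract, but the family of compression-based similarity measures it improves on follows the normalized compression distance pattern; a minimal zlib-based sketch (illustrative only, not the SCOOP implementation):

        import zlib

        def c(x: bytes) -> int:
            """Compressed length, a practical stand-in for Kolmogorov complexity."""
            return len(zlib.compress(x, 9))

        def ncd(x: bytes, y: bytes) -> float:
            """Normalized compression distance between two byte strings."""
            cx, cy, cxy = c(x), c(y), c(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        # Toy EGM-like sequences: similar signals compress well together,
        # so their distance is small; dissimilar ones stay far apart.
        a = bytes([10, 20, 30, 40] * 200)
        b = bytes([10, 20, 30, 41] * 200)
        noise = bytes(range(256)) * 3
        print(round(ncd(a, b), 3), round(ncd(a, noise), 3))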

  6. Heuristic evaluation of paper-based Web pages: a simplified inspection usability methodology.

    PubMed

    Allen, Mureen; Currie, Leanne M; Bakken, Suzanne; Patel, Vimla L; Cimino, James J

    2006-08-01

    Online medical information, when presented to clinicians, must be well-organized and intuitive to use, so that the clinicians can conduct their daily work efficiently and without error. It is essential to actively seek to produce good user interfaces that are acceptable to the user. This paper describes the methodology used to develop a simplified heuristic evaluation (HE) suitable for the evaluation of screen shots of Web pages, the development of an HE instrument used to conduct the evaluation, and the results of the evaluation of the aforementioned screen shots. In addition, this paper presents examples of the process of categorizing problems identified by the HE and the technological solutions identified to resolve these problems. Four usability experts reviewed 18 paper-based screen shots and made a total of 108 comments. Each expert completed the task in about an hour. We were able to implement solutions to approximately 70% of the violations. Our study found that a heuristic evaluation using paper-based screen shots of a user interface was expeditious, inexpensive, and straightforward to implement.

  7. CARES: Completely Automated Robust Edge Snapper for carotid ultrasound IMT measurement on a multi-institutional database of 300 images: a two stage system combining an intensity-based feature approach with first order absolute moments

    NASA Astrophysics Data System (ADS)

    Molinari, Filippo; Acharya, Rajendra; Zeng, Guang; Suri, Jasjit S.

    2011-03-01

    The carotid intima-media thickness (IMT) is the most used marker for the progression of atherosclerosis and onset of cardiovascular disease. Computer-aided measurements improve accuracy, but usually require user interaction. In this paper we characterized a new and completely automated technique for carotid segmentation and IMT measurement based on the merits of two previously developed techniques. We used an integrated approach of intelligent image feature extraction and line fitting for automatically locating the carotid artery in the image frame, followed by wall interface extraction based on a Gaussian edge operator. We called our system CARES. We validated CARES on a multi-institutional database of 300 carotid ultrasound images. IMT measurement bias was 0.032 +/- 0.141 mm, better than other automated techniques and comparable to that of user-driven methodologies. CARES processed 96% of the images, leading to a figure of merit of 95.7%. CARES ensured complete automation and high accuracy in IMT measurement; hence it could be a suitable clinical tool for processing of large datasets in multicenter studies involving atherosclerosis.

  8. Bayesian Estimation of the Spatially Varying Completeness Magnitude of Earthquake Catalogs

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Werner, M.; Wiemer, S.; Chen, C.; Wu, Y.

    2010-12-01

    Assessing the completeness magnitude Mc of earthquake catalogs is an essential prerequisite for any seismicity analysis. We employ a simple model to compute Mc in space, based on the proximity to seismic stations in a network. We show that a relationship of the form Mc_pred(d) = a*d^b + c, with d the distance to the 5th nearest seismic station, fits the observations well. We then propose a new Mc mapping approach, the Bayesian Magnitude of Completeness (BMC) method, based on a 2-step procedure: (1) a spatial resolution optimization to minimize spatial heterogeneities and uncertainties in Mc estimates and (2) a Bayesian approach that merges prior information about Mc based on the proximity to seismic stations with locally observed values weighted by their respective uncertainties. This new methodology eliminates most weaknesses associated with current Mc mapping procedures: the radius that defines which earthquakes to include in the local magnitude distribution is chosen according to an objective criterion, and there are no gaps in the spatial estimation of Mc. The method solely requires the coordinates of seismic stations. Here, we investigate the Taiwan Central Weather Bureau (CWB) earthquake catalog by computing an Mc map for the period 1994-2010.
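
    A small sketch of the prior step, using the stated functional form Mc_pred(d) = a*d^b + c with d the distance to the 5th nearest station; the coefficients below are placeholders, since the fitted values are not given in the abstract.

        import numpy as np

        a, b, c = 0.5, 0.6, 0.2   # placeholder fit of Mc_pred(d) = a*d**b + c

        def mc_prior(stations, grid_points):
            """Predicted completeness magnitude from station geometry alone."""
            # stations, grid_points: arrays of (x, y) coordinates in km.
            d = np.linalg.norm(grid_points[:, None, :] - stations[None, :, :],
                               axis=2)
            d5 = np.sort(d, axis=1)[:, 4]      # distance to 5th nearest station
            return a * d5**b + c

        stations = np.random.default_rng(0).uniform(0, 300, size=(40, 2))
        xs = np.arange(0.0, 300.0, 50.0)
        grid = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)
        print(mc_prior(stations, grid).round(2))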

  9. Roadway safety analysis methodology for Utah : final report.

    DOT National Transportation Integrated Search

    2016-12-01

    This research focuses on the creation of a three-part Roadway Safety Analysis methodology that applies and automates the cumulative work of recently-completed roadway safety research. The first part is to prepare the roadway and crash data for analys...

  10. Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application

    NASA Technical Reports Server (NTRS)

    DeBonis, J. R.; Yungster, S.

    1996-01-01

    A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to a RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.

  11. A methodology for enhancing implementation science proposals: comparison of face-to-face versus virtual workshops.

    PubMed

    Marriott, Brigid R; Rodriguez, Allison L; Landes, Sara J; Lewis, Cara C; Comtois, Katherine A

    2016-05-06

    With the current funding climate and need for advancements in implementation science, there is a growing demand for grantsmanship workshops to increase the quality and rigor of proposals. A group-based implementation science-focused grantsmanship workshop, the Implementation Development Workshop (IDW), is one methodology to address this need. This manuscript provides an overview of the IDW structure, format, and findings regarding its utility. The IDW methodology allows researchers to vet projects in the proposal stage in a structured format with a facilitator and two types of expert participants: presenters and attendees. The presenter uses a one-page handout and verbal presentation to present their proposal and questions. The facilitator elicits feedback from attendees using a format designed to maximize the number of unique points made. After each IDW, participants completed an anonymous survey assessing perceptions of the IDW. Presenters completed a funding survey measuring grant submission and funding success. Qualitative interviews were conducted with a subset of participants who participated in both delivery formats. Mixed method analyses were performed to evaluate the effectiveness and acceptability of the IDW and compare the delivery formats. Of those who participated in an IDW (N = 72), 40 participated in face-to-face only, 16 in virtual only, and 16 in both formats. Thirty-eight (face-to-face n = 12, 35 % response rate; virtual n = 26, 66.7 % response rate) responded to the surveys and seven (15.3 % response rate), who had attended both formats, completed an interview. Of 36 total presenters, 17 (face-to-face n = 12, 42.9 % response rate; virtual n = 5, 62.9 % response rate) responded to the funding survey. Mixed method analyses indicated that the IDW was effective for collaboration and growth, effective for enhancing success in obtaining grants, and acceptable. A third (35.3 %) of presenters ultimately received funding for their proposal, and more than 80 % of those who presented indicated they would present again in the future. The IDW structure and facilitation process were found to be acceptable, with both formats rated as equally strong. The IDW presents an acceptable and successful methodology for increasing competitiveness of implementation science grant proposals.

  12. An assessment of patient sign-outs conducted by University at Buffalo internal medicine residents.

    PubMed

    Wheat, Deirdre; Co, Christopher; Manochakian, Rami; Rich, Ellen

    2012-01-01

    Internal medicine residents were surveyed regarding patient sign-outs at shift change. Data were used to design and implement interventions aimed at improving sign-out quality. This quasi-experimental project incorporated the Plan, Do, Study, Act methodology. Residents completed an anonymous electronic survey regarding experiences during sign-outs. Survey questions assessed structure, process, and outcome of sign-outs. Analysis of qualitative and quantitative data was performed; interventions were implemented based on survey findings. A total of 120 surveys (89% response) and 115 surveys (83% response) were completed by residents of 4 postgraduate years in response to the first (2008) and second (2009) survey requests, respectively. Approximately 79% of the respondents to the second survey indicated that postintervention sign-out systems were superior to preintervention systems. Results indicated improvement in specific areas of structure, process, and outcome. Survey-based modifications to existing sign-out systems effected measurable quality improvement in structure, process, and outcome.

  13. Research in interactive scene analysis

    NASA Technical Reports Server (NTRS)

    Tenenbaum, J. M.; Garvey, T. D.; Weyl, S. A.; Wolf, H. C.

    1975-01-01

    An interactive scene interpretation system (ISIS) was developed as a tool for constructing and experimenting with man-machine and automatic scene analysis methods tailored for particular image domains. A recently developed region analysis subsystem based on the paradigm of Brice and Fennema is described. Using this subsystem a series of experiments was conducted to determine good criteria for initially partitioning a scene into atomic regions and for merging these regions into a final partition of the scene along object boundaries. Semantic (problem-dependent) knowledge is essential for complete, correct partitions of complex real-world scenes. An interactive approach to semantic scene segmentation was developed and demonstrated on both landscape and indoor scenes. This approach provides a reasonable methodology for segmenting scenes that cannot be processed completely automatically, and is a promising basis for a future automatic system. A program is described that can automatically generate strategies for finding specific objects in a scene based on manually designated pictorial examples.

  14. Air-borne shape measurement of parabolic trough collector fields

    NASA Astrophysics Data System (ADS)

    Prahl, Christoph; Röger, Marc; Hilgert, Christoph

    2017-06-01

    The optical and thermal efficiency of parabolic trough collector solar fields depends on the performance and assembly accuracy of components such as the concentrator and absorber. For the purposes of optical inspection/approval, yield analysis, localization of low-performing areas, and optimization of the solar field, it is essential to create a complete view of the optical properties of the field. Existing optical measurement tools are based on ground-based cameras and face restrictions concerning speed, volume and automation. QFly is an airborne qualification system which provides holistic and accurate information on the geometrical, optical, and thermal properties of the entire solar field. It consists of an unmanned aerial vehicle, cameras and related software for flight path planning, data acquisition and evaluation. This article presents recent advances in the QFly measurement system and proposes a methodology for holistic qualification of the complete solar field with minimal impact on plant operation.

  15. Architectural approaches for HL7-based health information systems implementation.

    PubMed

    López, D M; Blobel, B

    2010-01-01

    Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology addressing different aspects of systems architecture, such as the business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue in any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by HIS-DF and supported by HL7 v3 artifacts - is the more promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.
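
    A toy sketch of the mediator integration pattern the authors favor (plain Python, not HL7 v3 artifacts): systems exchange messages through a mediator that routes them, so each of N systems maintains one interface to the mediator rather than point-to-point links to every peer.

        from typing import Callable, Dict, List

        Handler = Callable[[dict], None]

        class Mediator:
            """Routes messages between systems by message type."""
            def __init__(self):
                self.routes: Dict[str, List[Handler]] = {}

            def subscribe(self, msg_type: str, handler: Handler) -> None:
                self.routes.setdefault(msg_type, []).append(handler)

            def publish(self, msg_type: str, payload: dict) -> None:
                for handler in self.routes.get(msg_type, []):
                    handler(payload)

        bus = Mediator()
        # A surveillance system receives lab results from any source system.
        bus.subscribe("lab_result", lambda m: print("surveillance got:", m))
        bus.publish("lab_result", {"patient": "123", "test": "culture",
                                   "result": "positive"})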

  16. An investigation of the efficacy of collaborative virtual reality systems for moderated remote usability testing.

    PubMed

    Chalil Madathil, Kapil; Greenstein, Joel S

    2017-11-01

    Collaborative virtual reality-based systems have integrated high fidelity voice-based communication, immersive audio and screen-sharing tools into virtual environments. Such three-dimensional collaborative virtual environments can mirror the collaboration among usability test participants and facilitators when they are physically collocated, potentially enabling moderated usability tests to be conducted effectively when the facilitator and participant are located in different places. We developed a virtual collaborative three-dimensional remote moderated usability testing laboratory and employed it in a controlled study to evaluate the effectiveness of moderated usability testing in a collaborative virtual reality-based environment with two other moderated usability testing methods: the traditional lab approach and Cisco WebEx, a web-based conferencing and screen sharing approach. Using a mixed methods experimental design, 36 test participants and 12 test facilitators were asked to complete representative tasks on a simulated online shopping website. The dependent variables included the time taken to complete the tasks; the usability defects identified and their severity; and the subjective ratings on the workload index, presence and satisfaction questionnaires. Remote moderated usability testing methodology using a collaborative virtual reality system performed similarly in terms of the total number of defects identified, the number of high severity defects identified and the time taken to complete the tasks with the other two methodologies. The overall workload experienced by the test participants and facilitators was the least with the traditional lab condition. No significant differences were identified for the workload experienced with the virtual reality and the WebEx conditions. However, test participants experienced greater involvement and a more immersive experience in the virtual environment than in the WebEx condition. The ratings for the virtual environment condition were not significantly different from those for the traditional lab condition. The results of this study suggest that participants were productive and enjoyed the virtual lab condition, indicating the potential of a virtual world based approach as an alternative to conventional approaches for synchronous usability testing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. A standardized curriculum to introduce novice health professional students to practice-based learning and improvement: a multi-institutional pilot study.

    PubMed

    Huntington, Jonathan T; Dycus, Paula; Hix, Carolyn; West, Rita; McKeon, Leslie; Coleman, Mary T; Hathaway, Donna; McCurren, Cynthia; Ogrinc, Greg

    2009-01-01

    Practice-based learning and improvement (PBLI) combines the science of continuous quality improvement with the pragmatics of day-to-day clinical care delivery. PBLI is a core-learning domain in nursing and medical education. We developed a workbook-based, project-focused curriculum to teach PBLI to novice health professional students. Evaluate the efficacy of a standardized curriculum to teach PBLI. Nonrandomized, controlled trial with medical and nursing students from 3 institutions. Faculty used the workbook to facilitate completion of an improvement project with 16 participants. Both participants and controls (N = 15) completed instruments to measure PBLI knowledge and self-efficacy. Participants also completed a satisfaction survey and presented project posters at a national conference. There was no significant difference in PBLI knowledge between groups. Self-efficacy of participants was higher than that of controls in identifying best practice, identifying measures, identifying successful local improvement work, implementing a structured change plan, and using Plan-Do-Study-Act methodology. Participant satisfaction with the curriculum was high. Although PBLI knowledge was similar between groups, participants had higher self-efficacy and confidently disseminated their findings via formal poster presentation. This pilot study suggests that using a workbook-based, project-focused approach may be effective in teaching PBLI to novice health professional students.

  18. Ranking methodology for determining the relative favorability for commercial development of US tar-sand deposits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aamodt, P.L.; Freiwald, J.G.

    1983-03-01

    As a part of the DOE's program to stimulate petroleum production from unconventional sources, the Los Alamos National Laboratory has developed a methodology to compare and rank tar sand deposits based on their suitability for commercial development. Major categories influencing favorability were identified and evaluated to determine their individual and collective impacts. To facilitate their evaluation, deposit characteristics, extraction technologies, environmental controls, and institutional constraints were broken down into their elements. The elements were assessed singly and in interactive groups to determine their influence on favorability for commercial development. A numerical value was assigned to each element to signify its estimated importance relative to the other elements. Eight tar sand deposits were evaluated using only one major category, deposit characteristics. This initial, and only partial, favorability assessment was solely a test of the methodology, and it was considered successful. Because only one of the four major categories was used for this initial favorability ranking, and also because the available deposit characteristic data were barely adequate for the test, these first results should be used only as an example of how the methodology is to be applied when more complete data are available. The eight deposits and their relative favorability rankings for commercial development, based only on deposit characteristics, are Sunnyside, Utah; Asphalt Ridge, Utah; Edna, California; Santa Rosa, New Mexico; Tar Sand Triangle, Utah; PR Spring, Utah; Uvalde, Texas; and Circle Cliffs, Utah.
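
    A minimal sketch of the weighted-scoring scheme described; the element names, weights, and scores below are invented placeholders, not the report's actual values.

        # Elements of one category (deposit characteristics), each weighted
        # by its estimated importance relative to the other elements.
        weights = {"resource_size": 0.4, "bitumen_grade": 0.3,
                   "overburden": 0.2, "data_quality": 0.1}

        deposits = {
            "Sunnyside, Utah":     {"resource_size": 9, "bitumen_grade": 7,
                                    "overburden": 6, "data_quality": 8},
            "Asphalt Ridge, Utah": {"resource_size": 7, "bitumen_grade": 8,
                                    "overburden": 7, "data_quality": 7},
            "Edna, California":    {"resource_size": 5, "bitumen_grade": 6,
                                    "overburden": 8, "data_quality": 6},
        }

        def favorability(scores):
            return sum(weights[e] * s for e, s in scores.items())

        ranked = sorted(deposits, key=lambda d: favorability(deposits[d]),
                        reverse=True)
        for name in ranked:
            print(f"{name}: {favorability(deposits[name]):.2f}")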

  19. National Institute for Petroleum and Energy Research quarterly technical report for April 1--June 30, 1993. Volume 1, Fuels research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Progress reports are presented for the following fuels research: development of analytical methodology for analysis of heavy crudes; and thermochemistry and thermophysical properties of organic nitrogen and diheteroatom-containing compounds. Some of the accomplishments are: topical reports summarizing GC/MS methodology for determination of amines in petroleum and the catalytic cracking behavior of compound types in Wilmington 650 °F+ resid were completed; density measurements between 320 K and 550 K were completed for 8-methylquinoline; high-temperature heat capacities and the critical temperature (near 800 K) for 8-methylquinoline were determined; vapor-pressure measurements were completed for 2,6-dimethylpyridine; and a series of enthalpy-of-combustion measurements was completed for 1,10-phenanthroline, phenazine, 2-methylquinoline, and 8-methylquinoline.

  20. A Meta-Analysis and Review of Holistic Face Processing

    PubMed Central

    Richler, Jennifer J.; Gauthier, Isabel

    2014-01-01

    The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, two different measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the two designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs, and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly three times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the first sections of our review—the complete design—and outline outstanding research questions in that new context. PMID:24956123

  1. Assessment of change in knowledge about research methods among delegates attending research methodology workshop.

    PubMed

    Shrivastava, Manisha; Shah, Nehal; Navaid, Seema

    2018-01-01

    In an era of evidence-based medicine, research is an essential part of the medical profession, whether clinical or academic. A research methodology workshop intends to help participants who are new to the research field as well as those already doing empirical research. The present study was conducted to assess the changes in knowledge of the participants of a research methodology workshop through a structured questionnaire. With administrative and ethical approval, a four-day research methodology workshop was planned. Before the commencement of the workshop, the participants were given a structured questionnaire (pre-test) containing 20 multiple choice questions (Q1-Q20) related to the topics to be covered, and a similar post-test questionnaire after the completion of the workshop. The mean values of pre- and post-test scores were calculated and the results were analyzed and compared. Of the total 153 delegates, 45 (29%) were males and 108 (71%) were females; 92 (60%) participants consented to fill in the pre-test questionnaire and 68 (44%) filled in the post-test questionnaire. The mean pre-test and post-test scores at 95% confidence interval were 7.62 (SD ±3.220) and 9.66 (SD ±2.477), respectively. The difference was significant by paired-sample t-test (P < 0.003). There was an increase in the knowledge of the delegates after attending the research methodology workshop. Participatory research methodology workshops are a good method of imparting knowledge, though the long-term effects need to be evaluated.
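
    The pre/post comparison is a standard paired design; a short scipy sketch with hypothetical score vectors (the abstract reports only group means and standard deviations, not individual pairs):

        from scipy import stats

        # Hypothetical paired scores out of 20 for delegates who completed
        # both questionnaires.
        pre  = [6, 8, 5, 9, 7, 10, 6, 8, 7, 9]
        post = [9, 10, 8, 11, 9, 12, 8, 11, 9, 10]

        t, p = stats.ttest_rel(post, pre)
        print(f"t = {t:.2f}, p = {p:.4f}")  # small p -> knowledge increased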

  2. A new methodology for hydro-abrasive erosion tests simulating penstock erosive flow

    NASA Astrophysics Data System (ADS)

    Aumelas, V.; Maj, G.; Le Calvé, P.; Smith, M.; Gambiez, B.; Mourrat, X.

    2016-11-01

    Hydro-abrasive resistance is an important property requirement for the hydroelectric power plant penstock coating systems used by EDF. The selection of durable coating systems requires an experimental characterization of coating performance, which can be achieved by performing accelerated and representative laboratory tests. For severe erosion induced by a penstock flow, there is no suitable method or standard representative of real erosive flow conditions. The presented study aims at developing a new methodology and an associated laboratory experimental device. The objective of the laboratory apparatus is to subject coated test specimens to wear conditions similar to those generated at the penstock lower generatrix under actual flow conditions. Thirteen preselected coating solutions were first tested in a 45-hour erosion test, and a ranking of the thirteen coating solutions was determined after characterization. To complete this first evaluation and to determine the wear kinetics of the four best coating solutions, additional erosion tests were conducted over a longer duration of 216 hours. A comparison of this new method with standardized tests and with real service operating flow conditions is also discussed. To complete the final ranking based on hydro-abrasive erosion tests, trial tests were carried out on penstock samples to check the application method of the selected coating systems. The paper gives some perspectives on erosion test methodologies for materials and coating solutions for hydraulic applications. The developed test method can also be applied in other fields.

  3. CONCEPTUAL DESIGNS FOR A NEW HIGHWAY VEHICLE EMISSIONS ESTIMATION METHODOLOGY

    EPA Science Inventory

    The report discusses six conceptual designs for a new highway vehicle emissions estimation methodology and summarizes the recommendations of each design for improving the emissions and activity factors in the emissions estimation process. The complete design reports are included a...

  4. 78 FR 34671 - Invitation for Membership on Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-10

    ... by successful completion of Joint Board examinations in basic actuarial mathematics and methodology and in actuarial mathematics and methodology relating to pension plans qualifying under ERISA. The... (ERISA), is responsible for the enrollment of individuals who wish to perform actuarial services under...

  5. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. This work is therefore concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows using high-fidelity analysis and sensitivity analysis techniques. Most research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations coupled with an eddy-viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy performs unsatisfactorily when applied to cases with strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified using realistic aeroelastic systems.
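
    A toy illustration of the nonlinear block Gauss-Seidel fixed point on which the proposed solution strategy rests, with scalar stand-ins for the fluid and structural solvers (not the Navier-Stokes system itself):

        # The fluid load depends on the structural displacement, and the
        # displacement depends on the load; Gauss-Seidel iterates the two
        # "disciplines" in sequence until a fixed point is reached.
        def fluid_load(u):
            return 1.0 / (1.0 + 0.5 * u)    # toy aerodynamic load model

        def displacement(f):
            return 0.8 * f                  # toy linear structural response

        u = 0.0
        for it in range(100):
            f = fluid_load(u)               # "fluid" solve with current shape
            u_new = displacement(f)         # "structure" solve with new load
            if abs(u_new - u) < 1e-12:
                break
            u = u_new
        print(f"converged in {it} iterations: u = {u:.6f}")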

  6. Extension of specification language for soundness and completeness of service workflow

    NASA Astrophysics Data System (ADS)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A service workflow is an aggregation of distributed services that fulfill specific functionalities. With the ever-increasing number of available services, methodologies for selecting services against given requirements have become a main research subject in multiple disciplines. A few researchers have contributed formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of workflow composition. To this end, we extend a specification language with formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec is demonstrated by examples addressing the stated soundness, completeness, and consistency.
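
    A toy illustration of logic-based compliance checking in the spirit of the extension (invented requires/provides notation, not actual SWSpec syntax): each service declares what it requires and provides, and the checker verifies that the composition is sound (every requirement holds when its service runs) and complete (the goal holds at the end).

        def check_workflow(services, initial, goal):
            """services: ordered list of (name, requires, provides) triples."""
            state = set(initial)
            for name, requires, provides in services:
                missing = set(requires) - state
                if missing:
                    return f"unsound: {name} is missing {missing}"
                state |= set(provides)
            if not set(goal) <= state:
                return f"incomplete: goal lacks {set(goal) - state}"
            return "sound and complete"

        workflow = [
            ("authenticate", {"credentials"}, {"session"}),
            ("fetch_order", {"session"}, {"order"}),
            ("bill", {"order", "session"}, {"invoice"}),
        ]
        print(check_workflow(workflow, {"credentials"}, {"invoice"}))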

  7. Visual performance-based image enhancement methodology: an investigation of contrast enhancement algorithms

    NASA Astrophysics Data System (ADS)

    Neriani, Kelly E.; Herbranson, Travis J.; Reis, George A.; Pinkus, Alan R.; Goodyear, Charles D.

    2006-05-01

    While vast numbers of image-enhancing algorithms have already been developed, the majority of these algorithms have not been assessed in terms of their visual performance-enhancing effects using militarily relevant scenarios. The goal of this research was to apply a visual performance-based assessment methodology to evaluate six algorithms that were specifically designed to enhance the contrast of digital images. The image-enhancing algorithms used in this study included three different histogram equalization algorithms, the Autolevels function, the Recursive Rational Filter technique described in Marsi, Ramponi, and Carrato1, and the multiscale Retinex algorithm described in Rahman, Jobson and Woodell2. The methodology used in the assessment was developed to acquire objective human visual performance data as a means of evaluating the contrast enhancement algorithms. Objective performance metrics, response time and error rate, were used to compare algorithm-enhanced images against two baseline conditions: original non-enhanced images and contrast-degraded images. Observers completed a visual search task using a spatial forced-choice paradigm: they searched images for a target (a military vehicle) hidden among foliage and then indicated in which quadrant of the screen the target was located. Response time and percent correct were measured for each observer. Results of the study and future directions are discussed.
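
    Of the algorithms under test, global histogram equalization is the canonical one; a compact numpy version for 8-bit grayscale images (a generic implementation, not the specific variants evaluated in the study):

        import numpy as np

        def histogram_equalize(img: np.ndarray) -> np.ndarray:
            """Spread the gray-level CDF uniformly over 0..255."""
            hist = np.bincount(img.ravel(), minlength=256)
            cdf = hist.cumsum()
            cdf_min = cdf[cdf > 0][0]
            lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255)
            return lut.astype(np.uint8)[img]

        img = np.random.default_rng(1).integers(60, 120, (64, 64),
                                                dtype=np.uint8)
        out = histogram_equalize(img)
        print(img.min(), img.max(), "->", out.min(), out.max())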

  8. Identification of anomalous motion of thunderstorms using daily rainfall fields

    NASA Astrophysics Data System (ADS)

    Moral, Anna del; Llasat, María del Carmen; Rigo, Tomeu

    2017-03-01

    Most of the adverse weather phenomena in Catalonia (northeast Iberian Peninsula) are caused by convective events, which can produce heavy rain, large hailstones, strong winds, lightning and/or tornadoes. These thunderstorms usually follow marked paths; however, their trajectories can change sharply at any given time, completely departing from the path previously followed. Furthermore, some thunderstorms split or merge with each other, creating new structures with different behaviour. In order to identify the potentially anomalous movements that some thunderstorms make, this paper presents a two-step methodology using a database of 8 years of daily rainfall fields for the Catalonia region (2008-2015). First, it classifies days into "no rain", "non-potentially convective rain" and "potentially convective rain", based on daily accumulated precipitation and extension thresholds. Second, it categorises convective structures within the rainfall fields and briefly identifies their main features, distinguishing whether there was any anomalous thunderstorm movement in each case. This methodology has been applied to the 2008-2015 period, and the main climatic features of convective and non-convective days were obtained. The methodology can be exported to other regions that lack the radar-based algorithms needed to detect convective cells but have a good rain gauge network in place.
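
    A sketch of the first classification step, with placeholder thresholds (the paper's actual accumulation and extension criteria are not reproduced in the abstract):

        import numpy as np

        def classify_day(field, rain_mm=1.0, conv_mm=35.0, conv_frac=0.02):
            """field: daily accumulated rainfall (mm) on the region's grid."""
            wet_frac = np.mean(field >= rain_mm)    # rainy fraction of region
            heavy_frac = np.mean(field >= conv_mm)  # heavily raining fraction
            if wet_frac == 0:
                return "no rain"
            # Placeholder rule: enough cells above a convective accumulation
            # threshold mark the day as potentially convective.
            if heavy_frac >= conv_frac:
                return "potentially convective rain"
            return "non-potentially convective rain"

        rng = np.random.default_rng(2)
        print(classify_day(np.zeros((50, 50))))
        print(classify_day(rng.gamma(0.5, 8.0, (50, 50))))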

  9. An approach to accidents modeling based on compounds road environments.

    PubMed

    Fernandes, Ana; Neves, Jose

    2013-04-01

    The most common approach to studying the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into fairly homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established where the pavement surface properties significantly influence the occurrence of accidents. Results have clearly shown that road environments where braking maneuvers are more common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. Copyright © 2013 Elsevier Ltd. All rights reserved.
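
    A condensed sketch of the pipeline on synthetic data (the paper's variables and model terms are richer): cluster road segments into compound environments, then fit a count model for accidents against surface properties within each environment.

        import numpy as np
        import statsmodels.api as sm
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        n = 300
        # Synthetic segment features: curvature, speed, skid resistance,
        # texture depth (all scaled to 0..1 for the toy example).
        feats = rng.uniform(0, 1, (n, 4))
        env = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(feats)
        # Synthetic accident counts that fall with skid resistance and texture.
        lam = np.exp(1.0 - 1.5*feats[:, 2] - 0.8*feats[:, 3])
        crashes = rng.poisson(lam)

        for k in range(3):
            m = env == k
            X = sm.add_constant(feats[m][:, 2:4])  # skid resistance, texture
            fit = sm.GLM(crashes[m], X, family=sm.families.Poisson()).fit()
            print(f"environment {k}: coefficients {fit.params.round(2)}")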

  10. Product development using process monitoring and NDE data fusion

    NASA Astrophysics Data System (ADS)

    Peterson, Todd; Bossi, Richard H.

    1998-03-01

    Composite process/product development relies on both process monitoring information and nondestructive evaluation (NDE) measurements for determining application suitability. In the past these activities have been performed and analyzed independently. Our present approach is to present the process monitoring and NDE data together in a data fusion workstation. This methodology leads to final product acceptance based on combined process monitoring and NDE criteria. The data fusion workstation combines process parameter and NDE data in a single workspace, enabling all the data to be used in the acceptance/rejection decision process. An example application is the induction welding process, a unique joining method for assembling primary composite structure that offers significant cost and weight advantages over traditionally fastened structure. The determination of the required time, temperature and pressure conditions used in the process to achieve a complete weld is being aided by the use of ultrasonic inspection techniques. Full-waveform ultrasonic inspection data are employed to evaluate the quality of spar cap to skin fit, an essential element of the welding process, and are processed to find a parameter that can be used for weld acceptance. Certification of the completed weld incorporates the data fusion methodology.

  11. Lesion registration for longitudinal disease tracking in an imaging informatics-based multiple sclerosis eFolder

    NASA Astrophysics Data System (ADS)

    Ma, Kevin; Liu, Joseph; Zhang, Xuejun; Lerner, Alex; Shiroishi, Mark; Amezcua, Lilyana; Liu, Brent

    2016-03-01

    We have designed and developed a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification results stored in DICOM-SR format. The web-based system aims to be integrated in DICOM-compliant clinical and research environments to aid clinicians in patient treatments and data analysis. The system needs to quantify lesion volumes, identify and register lesion locations to track shifts in volume and quantity of lesions in a longitudinal study. In order to perform lesion registration, we have developed a brain warping and normalizing methodology using Statistical Parametric Mapping (SPM) MATLAB toolkit for brain MRI. Patients' brain MR images are processed via SPM's normalization processes, and the brain images are analyzed and warped according to the tissue probability map. Lesion identification and contouring are completed by neuroradiologists, and lesion volume quantification is completed by the eFolder's CAD program. Lesion comparison results in longitudinal studies show key growth and active regions. The results display successful lesion registration and tracking over a longitudinal study. Lesion change results are graphically represented in the web-based user interface, and users are able to correlate patient progress and changes in the MRI images. The completed lesion and disease tracking tool would enable the eFolder to provide complete patient profiles, improve the efficiency of patient care, and perform comprehensive data analysis through an integrated imaging informatics system.

  12. A Call for a Community of Practice to Assess the Impact of Emerging Technologies on Undergraduate Biology Education †

    PubMed Central

    Jensen, Jamie L.; Dario-Becker, Juville; Hughes, Lee E.; Amburn, D. Sue Katz; Shaw, Joyce A.

    2012-01-01

    Recent recommendations for educational research encourage empirically tested, theory-based, completely transparent, and broadly applicable studies. In light of these recommendations, we call for a research standard and community of practice in the evaluation of technology use in the undergraduate life science classroom. We outline appropriate research methodology, review and critique the past research on technology usage and, lastly, suggest a new and improved focus for research on emerging technologies. PMID:23653777

  13. A call for a community of practice to assess the impact of emerging technologies on undergraduate biology education.

    PubMed

    Jensen, Jamie L; Dario-Becker, Juville; Hughes, Lee E; Amburn, D Sue Katz; Shaw, Joyce A

    2012-01-01

    Recent recommendations for educational research encourage empirically tested, theory-based, completely transparent, and broadly applicable studies. In light of these recommendations, we call for a research standard and community of practice in the evaluation of technology use in the undergraduate life science classroom. We outline appropriate research methodology, review and critique the past research on technology usage and, lastly, suggest a new and improved focus for research on emerging technologies.

  14. A Cost Model for Testing Unmanned and Autonomous Systems of Systems

    DTIC Science & Technology

    2011-02-01

    those risks. In addition, the fundamental methods presented by Aranha and Borba to include the complexity and sizing of tests for UASoS can be expanded...used as an input for test execution effort estimation models (Aranha & Borba, 2007). Such methodology is very relevant to this work because as a UASoS...calculate the test effort based on the complexity of the SoS. However, Aranha and Borba define test size as the number of steps required to complete

  15. Video-Based Intervention in Teaching Fraction Problem-Solving to Students with Autism Spectrum Disorder.

    PubMed

    Yakubova, Gulnoza; Hughes, Elizabeth M; Hornberger, Erin

    2015-09-01

    The purpose of this study was to determine the effectiveness of a point-of-view video modeling intervention for teaching mathematics problem-solving when working on word problems involving subtracting mixed fractions with uncommon denominators. Using a multiple-probe-across-students design of single-case methodology, three high school students with ASD completed the study. All three students demonstrated greater accuracy in solving fraction word problems and maintained accuracy levels at a 1-week follow-up.
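
    The taught skill is easy to check mechanically; a worked instance of the item type (subtracting mixed fractions with unlike denominators) using Python's exact-arithmetic fractions module:

        from fractions import Fraction

        # Example item: 3 1/2 - 1 2/3.
        a = 3 + Fraction(1, 2)   # 7/2
        b = 1 + Fraction(2, 3)   # 5/3
        diff = a - b             # 21/6 - 10/6 = 11/6
        whole, rem = divmod(diff.numerator, diff.denominator)
        print(f"{a} - {b} = {diff} = {whole} {rem}/{diff.denominator}")  # 1 5/6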

  16. Highly Stable Lyophilized Homogeneous Bead-Based Immunoassays for On-Site Detection of Bio Warfare Agents from Complex Matrices.

    PubMed

    Mechaly, Adva; Marx, Sharon; Levy, Orly; Yitzhaki, Shmuel; Fisher, Morly

    2016-06-21

    This study shows the development of dry, highly stable immunoassays for the detection of bio warfare agents in complex matrices. Thermal stability was achieved by lyophilization of the complete, homogeneous, bead-based immunoassay in a special stabilizing buffer, resulting in a ready-to-use, simple assay which exhibited long shelf life and high-temperature endurance (up to 1 week at 100 °C). The developed methodology was successfully implemented for the preservation of time-resolved fluorescence, Alexa fluorophore, and horseradish peroxidase-based bead assays, enabling multiplexed detection. The multiplexed assay was successfully implemented for the detection of Bacillus anthracis, botulinum B, and tularemia in complex matrices.

  17. Complete Hexose Isomer Identification with Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Nagy, Gabe; Pohl, Nicola L. B.

    2015-04-01

    The first analytical method is presented for the identification and absolute configuration determination of all 24 aldohexose and 2-ketohexose isomers, including the D and L enantiomers of allose, altrose, galactose, glucose, gulose, idose, mannose, talose, fructose, psicose, sorbose, and tagatose. Two unique fixed-ligand kinetic method combinations were discovered that create energetic differences large enough to achieve chiral discrimination among all 24 hexoses. Each of the 24 hexoses yields a unique ratio of a specific pair of fragment ions, allowing simultaneous identification and absolute configuration determination. This mass spectrometry-based methodology can be readily employed for accurate identification of any isolated monosaccharide from an unknown biological source. This work provides a key step towards the goal of complete de novo carbohydrate analysis.

  18. Hybrid plasma modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopkins, Matthew Morgan; DeChant, Lawrence Justin.; Piekos, Edward Stanley

    2009-02-01

    This report summarizes the work completed during FY2007 and FY2008 for the LDRD project "Hybrid Plasma Modeling". The goal of this project was to develop hybrid methods to model plasmas across the non-continuum-to-continuum collisionality spectrum. The primary methodology to span these regimes was to couple a kinetic method (e.g., Particle-In-Cell) in the non-continuum regions to a continuum PDE-based method (e.g., finite differences) in continuum regions. The interface between the two would be adjusted dynamically based on statistical sampling of the kinetic results. Although originally a three-year project, it became clear during the second year (FY2008) that there were not sufficient resources to complete the project, and it was terminated mid-year.
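    As a rough illustration of the coupling idea described above, the sketch below flags cells for kinetic versus continuum treatment using a gradient-length Knudsen criterion. The criterion, threshold value, toy profile, and function names are illustrative assumptions, not the project's actual interface-selection rule.

    ```python
    import numpy as np

    def classify_cells(grad_length_scale, mean_free_path, threshold=0.05):
        """Flag cells as kinetic or continuum using a gradient-length
        Knudsen number Kn_GL = lambda / L_gradient (illustrative criterion)."""
        kn_gl = mean_free_path / grad_length_scale
        return np.where(kn_gl > threshold, "kinetic", "continuum")

    # Toy 1D example: the mean free path grows as density drops, so
    # low-density cells fall back to the kinetic (e.g., PIC) solver.
    x = np.linspace(0.0, 1.0, 10)
    density = 1.0 - 0.9 * x                                   # decaying profile
    mean_free_path = 0.01 / density                           # lambda ~ 1/n
    grad_length = density / np.abs(np.gradient(density, x))   # L = n / |dn/dx|
    print(classify_cells(grad_length, mean_free_path))
    ```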

  19. Human Space Flight

    NASA Technical Reports Server (NTRS)

    Woolford, Barbara

    2006-01-01

    The performance of complex tasks on the International Space Station (ISS) requires significant preflight crew training commitments and frequent skill and knowledge refreshment. This report documents a recently developed just-in-time training methodology, which integrates preflight hardware familiarization and procedure training with on-orbit CD-ROM-based skill enhancement. This just-in-time concept was used to support real-time remote expert guidance to complete medical examinations using the ISS Human Research Facility (HRF). An American and a Russian ISS crewmember received 2 hours of hands-on ultrasound training 8 months prior to the on-orbit ultrasound exam. A CD-ROM-based Onboard Proficiency Enhancement (OPE) interactive multimedia program, consisting of memory-enhancing tutorials and skill-testing exercises, was completed by the crewmember six days prior to the on-orbit ultrasound exam. The crewmember was then remotely guided through a thoracic, vascular, and echocardiographic examination by ultrasound imaging experts. Results of the CD-ROM-based OPE session were used to modify the instructions during a complete 35-minute real-time thoracic, cardiac, and carotid/jugular ultrasound study. Following commands from the ground-based expert, the crewmember acquired all target views and images without difficulty. The anatomical content and fidelity of the ultrasound video were excellent and adequate for clinical decision-making. Complex ultrasound examinations with expert guidance were performed with high accuracy following limited preflight training and CD-ROM-based in-flight review, despite a 2-second communication latency.

  20. A Lean Six Sigma quality improvement project to increase discharge paperwork completeness for admission to a comprehensive integrated inpatient rehabilitation program.

    PubMed

    Neufeld, Nathan J; Hoyer, Erik H; Cabahug, Philippines; González-Fernández, Marlís; Mehta, Megha; Walker, N Colbey; Powers, Richard L; Mayer, R Samuel

    2013-01-01

    Lean Six Sigma (LSS) process analysis can be used to increase the completeness of discharge summary reports, a critical communication tool when a patient transitions between levels of care. The authors used the LSS methodology as an intervention to improve the discharge process. Over the course of the project, 8 required elements in the discharge paperwork were analyzed. The authors analyzed the discharge paperwork of patients (42 patients preintervention and 143 patients postintervention) of a comprehensive integrated inpatient rehabilitation program (CIIRP). Prior to this LSS project, 61.8% of required discharge elements were present. The intervention improved completeness to 94.2% of the required elements. The percentage of charts that were 100% complete increased from 11.9% to 67.8%. LSS is a well-established process improvement methodology that can be used to make significant improvements in complex health care workflow issues. Specifically, the completeness of discharge documentation required for transition of care to a CIIRP can be improved.

  1. Optimization as a Tool for Consistency Maintenance in Multi-Resolution Simulation

    NASA Technical Reports Server (NTRS)

    Drewry, Darren T; Reynolds, Jr , Paul F; Emanuel, William R

    2006-01-01

    The need for new approaches to the consistent simulation of related phenomena at multiple levels of resolution is great. While many fields of application would benefit from a complete and approachable solution to this problem, such solutions have proven extremely difficult to achieve. We present a multi-resolution simulation methodology that uses numerical optimization as a tool for maintaining external consistency between models of the same phenomena operating at different levels of temporal and/or spatial resolution. Our approach follows from previous work in the disparate fields of inverse modeling and spacetime constraint-based animation. As a case study, our methodology is applied to two environmental models of forest canopy processes that make overlapping predictions under unique sets of operating assumptions and that execute at different temporal resolutions. Experimental results are presented, and future directions are addressed.
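    The core of the approach, calibrating a coarse model against aggregated fine-model output by numerical optimization, can be sketched as follows. The toy flux models, the single parameter, and the objective function are invented for illustration and merely stand in for the paper's forest canopy models.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Toy "fine" model: hourly canopy flux over one day (illustrative).
    t_fine = np.linspace(0.0, 24.0, 97)
    fine_flux = np.maximum(0.0, np.sin((t_fine - 6.0) * np.pi / 12.0))

    # Toy "coarse" model: one daily-mean flux from a single parameter k;
    # 0.318 is roughly the daily mean of the unit half-sine (assumed form).
    def coarse_model(k):
        return k * 0.318

    # External consistency: choose k so the coarse prediction matches the
    # aggregated (daily-mean) fine-model output.
    target = fine_flux.mean()
    result = minimize_scalar(lambda k: (coarse_model(k) - target) ** 2,
                             bounds=(0.0, 10.0), method="bounded")
    print(f"calibrated k = {result.x:.3f}, coarse = {coarse_model(result.x):.3f}, "
          f"fine mean = {target:.3f}")
    ```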

  2. Exploring Applications of Radiomics in Magnetic Resonance Imaging of Head and Neck Cancer: A Systematic Review.

    PubMed

    Jethanandani, Amit; Lin, Timothy A; Volpe, Stefania; Elhalawani, Hesham; Mohamed, Abdallah S R; Yang, Pei; Fuller, Clifton D

    2018-01-01

    Radiomics has been widely investigated for non-invasive acquisition of quantitative textural information from anatomic structures. While the vast majority of radiomic analysis is performed on images obtained from computed tomography, magnetic resonance imaging (MRI)-based radiomics has generated increased attention. In head and neck cancer (HNC), however, attempts to perform consistent investigations are sparse, and it is unclear whether the resulting textural features can be reproduced. To address this unmet need, we systematically reviewed the quality of existing MRI radiomics research in HNC. The literature search was conducted in accordance with guidelines established by Preferred Reporting Items for Systematic Reviews and Meta-Analyses. Electronic databases were examined from January 1990 through November 2017 for common radiomic keywords. Eligible completed studies were then scored using a standardized checklist that we developed from Enhancing the Quality and Transparency of Health Research guidelines for reporting machine-learning predictive model specifications and results in biomedical research, defined by Luo et al. (1). Descriptive statistics of checklist scores were computed, and a subgroup analysis of methodology items alone was conducted in comparison to overall scores. Sixteen completed studies and four ongoing trials were selected for inclusion. Of the completed studies, the nasopharynx was the most common site of study (37.5%). MRI modalities varied, with only four of the completed studies (25%) extracting radiomic features from a single sequence. Study sample sizes ranged between 13 and 118 patients (median of 40), and final radiomic signatures ranged from 2 to 279 features. Analyzed endpoints included either segmentation or histopathological classification parameters (44%) or prognostic and predictive biomarkers (56%). Liu et al. (2) addressed the highest number of our checklist items (total score: 48), and a subgroup analysis of methodology checklist items alone did not demonstrate any difference in scoring trends between studies [Spearman's ρ = 0.94 (p < 0.0001)]. Although MRI radiomic applications demonstrate predictive potential in analyzing diverse HNC outcomes, methodological variances preclude accurate and collective interpretation of data.

  3. 75 FR 53716 - Invitation for Membership on Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-01

    ... successful completion of Joint Board examinations in basic actuarial mathematics and methodology and in actuarial mathematics and methodology relating to pension plans qualifying under ERISA. The Joint Board, the... Act of 1974 (ERISA), is responsible for the enrollment of individuals who wish to perform actuarial...

  4. An integrated dispersion preparation, characterization and in vitro dosimetry methodology for engineered nanomaterials

    PubMed Central

    DeLoid, Glen M.; Cohen, Joel M.; Pyrgiotakis, Georgios; Demokritou, Philip

    2018-01-01

    Evidence continues to grow of the importance of in vitro and in vivo dosimetry in the hazard assessment and ranking of engineered nanomaterials (ENMs). Accurate dose metrics are particularly important for in vitro cellular screening to assess the potential health risks or bioactivity of ENMs. In order to ensure meaningful and reproducible quantification of in vitro dose, with consistent measurement and reporting between laboratories, it is necessary to adopt standardized and integrated methodologies for (1) generation of stable ENM suspensions in cell culture media; (2) colloidal characterization of suspended ENMs, particularly properties that determine particle kinetics in an in vitro system (size distribution and formed agglomerate effective density); and (3) robust numerical fate and transport modeling for accurate determination of the ENM dose delivered to cells over the course of the in vitro exposure. Here we present a comprehensive integrated protocol for in vitro dosimetry, including detailed standardized procedures for each of these three critical steps. The entire protocol requires approximately 6-12 hours to complete. PMID:28102836

  5. Ultrasensitive low noise voltage amplifier for spectral analysis.

    PubMed

    Giusi, G; Crupi, F; Pace, C

    2008-08-01

    Recently we have proposed several voltage noise measurement methods that allow, at least in principle, the complete elimination of the noise introduced by the measurement amplifier. The most severe drawback of these methods is that they require a multistep measurement procedure. Since environmental conditions may change in the different measurement steps, the final result could be affected by these changes. This problem is solved by the one-step voltage noise measurement methodology based on a novel amplifier topology proposed in this paper. Circuit implementations for the amplifier building blocks based on operational amplifiers are critically discussed. The proposed approach is validated through measurements performed on a prototype circuit.

  6. The Scholarship of Teaching and Learning: Transformation and Transgression

    ERIC Educational Resources Information Center

    Bolf-Beliveau, Laura

    2013-01-01

    Chapter Five of "The Scholarship of Teaching and Learning Reconsidered" (2011) suggests that traditional research scholarship methodology can inform and reform the ways in which we value and evaluate teaching. The authors discuss applying research methodology as a way to complete this process. This article suggests that using theoretical…

  7. Assessing Pragmatics: DCTS and Retrospective Verbal Reports

    ERIC Educational Resources Information Center

    Beltrán-Palanques, Vicente

    2016-01-01

    Assessing pragmatic knowledge in the instructed setting is seen as a complex but necessary task, which requires the design of appropriate research methodologies to examine pragmatic performance. This study discusses the use of two different research methodologies, namely those of Discourse Completion Tests/Tasks (DCTs) and verbal reports. Research…

  8. Fundamental Use of Surgical Energy (FUSE) certification: validation and predictors of success.

    PubMed

    Robinson, Thomas N; Olasky, Jaisa; Young, Patricia; Feldman, Liane S; Fuchshuber, Pascal R; Jones, Stephanie B; Madani, Amin; Brunt, Michael; Mikami, Dean; Jackson, Gretchen P; Mischna, Jessica; Schwaitzberg, Steven; Jones, Daniel B

    2016-03-01

    The Fundamental Use of Surgical Energy (FUSE) program includes a Web-based didactic curriculum and a high-stakes multiple-choice examination, with the goal of certifying knowledge on the safe use of surgical energy-based devices. The purpose of this study was (1) to set a passing score through a psychometrically sound process and (2) to determine which pretest factors predicted passing the FUSE examination. Beta-testing of multiple-choice questions on 62 topics of importance to the safe use of surgical energy-based devices was performed. Eligible test takers were physicians with a minimum of 1 year of surgical training who were recruited by FUSE task force members. A pretest survey collected baseline information. A total of 227 individuals completed the FUSE beta-test, and 208 completed the pretest survey. The passing (cut) score for the first test form of the FUSE multiple-choice examination was determined using the modified Angoff methodology, and for the second test form using a linear equating methodology. The overall passing rate across the two examination forms was 81.5%. Self-reported study of the FUSE Web-based curriculum for more than 2 hours was associated with a passing examination score (p < 0.001). Performance did not differ based on increased years of surgical practice (p = 0.363), self-reported expertise with one or more types of energy-based devices (p = 0.683), participation in the FUSE postgraduate course (p = 0.426), or having reviewed the FUSE manual (p = 0.428). Logistic regression found that studying the FUSE didactics for more than 2 hours predicted a passing score (OR 3.61; 95% CI 1.44-9.05; p = 0.006) independent of the other baseline characteristics recorded. The development of the FUSE examination, including the passing score, followed a psychometrically sound process. Self-reported time studying the FUSE curriculum predicted a passing score independent of other pretest characteristics such as years in practice and self-reported expertise.

  9. Regression to fuzziness method for estimation of remaining useful life in power plant components

    NASA Astrophysics Data System (ADS)

    Alamaniotis, Miltiadis; Grelle, Austin; Tsoukalas, Lefteri H.

    2014-10-01

    Mitigation of severe accidents in power plants requires the reliable operation of all systems and the on-time replacement of mechanical components. Therefore, the continuous surveillance of power systems is a crucial concern for the overall safety, cost control, and on-time maintenance of a power plant. In this paper, a methodology called regression to fuzziness is presented that estimates the remaining useful life (RUL) of power plant components. The RUL is defined as the difference between the time that a measurement was taken and the estimated failure time of that component. The methodology aims to compensate for a potential lack of historical data by modeling an expert's operational experience and expertise applied to the system. It initially identifies critical degradation parameters and their associated value range. Once completed, the operator's experience is modeled through fuzzy sets which span the entire parameter range. This model is then synergistically used with linear regression and a component's failure point to estimate the RUL. The proposed methodology is tested on estimating the RUL of a turbine (the basic electrical generating component of a power plant) in three different cases. Results demonstrate the benefits of the methodology for components for which operational data are not readily available and emphasize the significance of the selection of fuzzy sets and the effect of knowledge representation on the predicted output. To verify the effectiveness of the methodology, it was benchmarked against a simple data-based linear regression model used for predictions, which was shown to perform equal to or worse than the presented methodology. Furthermore, the methodology comparison highlighted the improvement in estimation offered by the adoption of appropriate fuzzy sets for parameter representation.
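    A minimal sketch of the regression-and-extrapolation step is shown below, assuming a toy degradation history and a fixed failure threshold; in the paper's method, expert knowledge encoded as fuzzy sets over the parameter range would shape that threshold and the parameter representation.

    ```python
    import numpy as np

    # Toy degradation history for one component (illustrative units).
    t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])          # operating years
    wear = np.array([0.10, 0.18, 0.27, 0.33, 0.42])  # degradation parameter

    # Linear trend fit: the regression step of the methodology.
    slope, intercept = np.polyfit(t, wear, 1)

    # Failure point: here a fixed threshold; the expert's fuzzy sets over the
    # parameter range would inform this value in the full method.
    wear_at_failure = 1.0
    t_failure = (wear_at_failure - intercept) / slope
    rul = t_failure - t[-1]
    print(f"estimated failure at t = {t_failure:.1f} y, RUL = {rul:.1f} y")
    ```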

  10. Institutionalizing human-computer interaction for global health

    PubMed Central

    Gulliksen, Jan

    2017-01-01

    Digitalization is the societal change process in which new ICT-based solutions bring forward completely new ways of doing things, new businesses, and new movements in society. Digitalization also provides completely new ways of addressing issues related to global health. This paper provides an overview of the field of human-computer interaction (HCI) and the ways in which the field has contributed to international development in different regions of the world. Additionally, it outlines the United Nations' new sustainability goals from December 2015 and what these could contribute to the development of global health and its relationship to digitalization. Finally, it argues why and how HCI could be adopted and adapted to fit contextual needs, the need for localization, and the development of new digital innovations. The research methodology is mostly qualitative, following an action research paradigm in which the actual change process that the digitalization evokes is equally as important as the scientific conclusions that can be drawn. In conclusion, the paper argues that digitalization is fundamentally changing society through the development and use of digital technologies and may have a profound effect on the digital development of every country in the world, but it needs to be developed based on local practices, it needs international support, and it must not be limited by technological constraints. Digitalization to support global health, in particular, requires a profound understanding of the users and their context, arguing for user-centred systems design methodologies as particularly suitable. PMID:28838309

  11. Development of an Unstructured Mesh Code for Flows About Complete Vehicles

    NASA Technical Reports Server (NTRS)

    Peraire, Jaime; Gupta, K. K. (Technical Monitor)

    2001-01-01

    This report describes the research work undertaken at the Massachusetts Institute of Technology, under NASA Research Grant NAG4-157. The aim of this research is to identify effective algorithms and methodologies for the efficient and routine solution of flow simulations about complete vehicle configurations. For over ten years we have received support from NASA to develop unstructured mesh methods for Computational Fluid Dynamics. As a result of this effort, a methodology based on the use of unstructured adapted meshes of tetrahedra and finite volume flow solvers has been developed. A number of gridding algorithms, flow solvers, and adaptive strategies have been proposed. The most successful algorithms form the basis of the unstructured mesh system FELISA. The FELISA system has been used extensively for the analysis of transonic and hypersonic flows about complete vehicle configurations. The system is highly automatic and allows for the routine aerodynamic analysis of complex configurations starting from CAD data. The code has been parallelized and utilizes efficient solution algorithms. For hypersonic flows, a version of the code that incorporates real gas effects has been produced. The FELISA system is also a component of the STARS aeroservoelastic system developed at NASA Dryden. One of the latest developments before the start of this grant was to extend the system to include viscous effects. This required the development of viscous mesh generators, capable of generating the anisotropic grids required to represent boundary layers, and of viscous flow solvers. We show some sample hypersonic viscous computations using the developed viscous generators and solvers. Although these initial results were encouraging, it became apparent that, in order to develop a fully functional capability for viscous flows, several advances in solution accuracy, robustness, and efficiency were required. In this grant we set out to investigate some novel methodologies that could lead to the required improvements. In particular we focused on two fronts: (1) finite element methods and (2) iterative algebraic multigrid solution techniques.

  12. Tuberculosis Prevention in the Private Sector: Using Claims-Based Methods to Identify and Evaluate Latent Tuberculosis Infection Treatment With Isoniazid Among the Commercially Insured.

    PubMed

    Stockbridge, Erica L; Miller, Thaddeus L; Carlson, Erin K; Ho, Christine

    Targeted identification and treatment of people with latent tuberculosis infection (LTBI) are key components of the US tuberculosis elimination strategy. Because of recent policy changes, some LTBI treatment may shift from public health departments to the private sector. To (1) develop methodology to estimate initiation and completion of treatment with isoniazid for LTBI using claims data, and (2) estimate treatment completion rates for isoniazid regimens from commercial insurance claims. Medical and pharmacy claims data representing insurance-paid services rendered and prescriptions filled between January 2011 and March 2015 were analyzed. Four million commercially insured individuals 0 to 64 years of age. Six-month and 9-month treatment completion rates for isoniazid LTBI regimens. There was an annual isoniazid LTBI treatment initiation rate of 12.5/100 000 insured persons. Of 1074 unique courses of treatment with isoniazid for which treatment completion could be assessed, almost half (46.3%; confidence interval, 43.3-49.3) completed 6 or more months of therapy. Of those, approximately half (48.9%; confidence interval, 44.5-53.3) completed 9 months or more. Claims data can be used to identify and evaluate LTBI treatment with isoniazid occurring in the commercial sector. Completion rates were in the range of those found in public health settings. These findings suggest that the commercial sector may be a valuable adjunct to more traditional venues for tuberculosis prevention. In addition, these newly developed claims-based methods offer a means to gain important insights and open new avenues to monitor, evaluate, and coordinate tuberculosis prevention.
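    A sketch of how such completion rates might be derived from pharmacy claims is given below; the column names, toy data, and thresholds (180 and 270 days' supply as proxies for 6 and 9 months) are illustrative assumptions rather than the study's exact algorithm.

    ```python
    import pandas as pd

    # Toy pharmacy-claims extract; column names are illustrative assumptions.
    fills = pd.DataFrame({
        "patient_id":  [1, 1, 1, 1, 1, 1, 2, 2, 2],
        "drug":        ["isoniazid"] * 9,
        "days_supply": [30, 30, 30, 30, 30, 30, 30, 30, 30],
    })

    # Total isoniazid supplied per patient as a proxy for treatment duration.
    totals = (fills[fills["drug"] == "isoniazid"]
              .groupby("patient_id")["days_supply"].sum())

    # Completion flags analogous to the study's 6- and 9-month endpoints.
    summary = pd.DataFrame({
        "total_days": totals,
        "completed_6mo": totals >= 180,
        "completed_9mo": totals >= 270,
    })
    print(summary)
    print(f"6-month completion rate: {summary['completed_6mo'].mean():.1%}")
    ```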

  13. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation.

    PubMed

    Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A

    2015-01-08

    Identification of psychometrically strong instruments for the field of implementation science is a high priority underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project's objectives address these limitations by identifying and applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopt a systematic search process (using standard search strings), and engage an international team of experts to assess the full range of psychometric criteria (reliability, construct and criterion validity). Although this work focuses on implementation of psychosocial interventions in mental health and health-care settings, the methodology and results will likely be useful across a broad spectrum of settings. This effort has culminated in a centralized online open-access repository of instruments depicting graphical head-to-head comparisons of their psychometric properties. This article describes the methodology and preliminary outcomes. The seven stages of the review, synthesis, and evaluation methodology include (1) setting the scope for the review, (2) identifying frameworks to organize and complete the review, (3) generating a search protocol for the literature review of constructs, (4) literature review of specific instruments, (5) development of an evidence-based assessment rating criteria, (6) data extraction and rating instrument quality by a task force of implementation experts to inform knowledge synthesis, and (7) the creation of a website repository. To date, this multi-faceted and collaborative search and synthesis methodology has identified over 420 instruments related to 34 constructs (total 48 including subconstructs) that are relevant to implementation science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.

  14. A meta-analysis and review of holistic face processing.

    PubMed

    Richler, Jennifer J; Gauthier, Isabel

    2014-09-01

    The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, 2 measures of holistic processing based on the composite paradigm (the complete design and the partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the 2 designs, which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs and use this as further evidence for one design over the other. The meta-analytic effect size of holistic processing in the complete design is nearly 3 times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task and suggests that, in an individual-differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the first sections of our review (the complete design) and outline outstanding research questions in that new context. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  15. Participant dropout as a function of survey length in internet-mediated university studies: implications for study design and voluntary participation in psychological research.

    PubMed

    Hoerger, Michael

    2010-12-01

    Internet-mediated research has offered substantial advantages over traditional laboratory-based research in terms of efficiently and affordably allowing for the recruitment of large samples of participants for psychology studies. Core technical, ethical, and methodological issues have been addressed in recent years, but the important issue of participant dropout has received surprisingly little attention. Specifically, web-based psychology studies often involve undergraduates completing lengthy and time-consuming batteries of online personality questionnaires, but no known published studies to date have closely examined the natural course of participant dropout during attempted completion of these studies. The present investigation examined participant dropout among 1,963 undergraduates completing one of six web-based survey studies relatively representative of those conducted in university settings. Results indicated that 10% of participants could be expected to drop out of these studies nearly instantaneously, with an additional 2% dropping out per 100 survey items included in the study. For individual project investigators, these findings hold ramifications for study design considerations, such as conducting a priori power analyses. The present results also have broader ethical implications for understanding and improving voluntary participation in research involving human subjects. Nonetheless, the generalizability of these conclusions may be limited to studies involving similar design or survey content.
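    The reported relationship lends itself to a back-of-the-envelope estimate; the linear form below simply restates the study's two figures (10% near-instant dropout plus roughly 2 percentage points per 100 items) and is not a fitted model.

    ```python
    # Expected dropout from the reported findings: ~10% near-instant dropout
    # plus ~2 percentage points per 100 survey items (empirical, this study).
    def expected_dropout(n_items: int) -> float:
        return 0.10 + 0.02 * (n_items / 100.0)

    for n in (50, 200, 400):
        print(f"{n} items -> expected dropout ~{expected_dropout(n):.0%}")
    ```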

  16. A Validated Methodology for Genetic Identification of Tuna Species (Genus Thunnus)

    PubMed Central

    Viñas, Jordi; Tudela, Sergi

    2009-01-01

    Background Tuna species of the genus Thunnus, such as the bluefin tunas, are some of the most important and yet most endangered trade fish in the world. Identification of these species in traded forms, however, may be difficult depending on the presentation of the products, which may hamper conservation efforts on trade control. In this paper, we validated a genetic methodology that can fully distinguish between the eight Thunnus species from any kind of processed tissue. Methodology After testing several genetic markers, complete discrimination of the eight tuna species was achieved using Forensically Informative Nucleotide Sequencing, based primarily on the sequence variability of the hypervariable mitochondrial DNA control region (mtDNA CR), followed, in some specific cases, by a second validation with a nuclear marker, the rDNA first internal transcribed spacer (ITS1). This methodology was able to distinguish all tuna species, including those belonging to the subgenus Neothunnus, which are very closely related and consequently cannot be differentiated with other genetic markers of lower variability. This methodology also took into consideration the introgression that has been reported in past studies between T. thynnus, T. orientalis, and T. alalunga. Finally, we applied the methodology to cross-check the species identity of 26 processed tuna samples. Conclusions Using the combination of two genetic markers, one mitochondrial and one nuclear, allows full discrimination between all eight tuna species. Unexpectedly, the genetic marker traditionally used for DNA barcoding, cytochrome oxidase 1, could not differentiate all species; thus its use as a genetic marker for tuna species identification is questioned. PMID:19898615

  17. Reconstructing missing information on precipitation datasets: impact of tails on adopted statistical distributions.

    NASA Astrophysics Data System (ADS)

    Pedretti, Daniele; Beckie, Roger Daniel

    2014-05-01

    Missing data in hydrological time-series databases are ubiquitous in practical applications, yet it is of fundamental importance to make educated decisions in problems requiring exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications, directly involving the ratio between precipitation and some other quantity, lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best; rather, the selection should be based on the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is to discuss the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which have an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value, and Gamma distributions. The methods were first tested on synthetic examples, to have complete control of the impact of several variables, such as the minimum amount of data required to obtain reliable statistical distributions from the selected parametric functions. Then, we applied the methodology to precipitation datasets collected in the Vancouver area and at a mining site in Peru.
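    A minimal sketch of the probability-of-exceedance idea on synthetic data follows: a Gamma distribution is fitted at each station, and a value observed at a reference station is mapped to the target station at equal exceedance probability. The distribution choice, parameters, and data are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Synthetic wet-day rainfall at two nearby stations (mm), Gamma-distributed.
    target_station = rng.gamma(shape=2.0, scale=5.0, size=500)
    reference_station = rng.gamma(shape=2.0, scale=8.0, size=500)

    # Fit parametric distributions (Gamma here; GP/GEV fits use the same API).
    ka, _, sa = stats.gamma.fit(target_station, floc=0.0)
    kb, _, sb = stats.gamma.fit(reference_station, floc=0.0)

    # Probability-based reconstruction: a value observed at the reference
    # station maps to the target station at equal exceedance probability.
    observed_at_reference = 25.0
    p_exceed = stats.gamma.sf(observed_at_reference, kb, scale=sb)
    reconstructed = stats.gamma.isf(p_exceed, ka, scale=sa)
    print(f"P(exceed) = {p_exceed:.3f} -> reconstructed ~{reconstructed:.1f} mm")
    ```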

  18. Multi-atlas and label fusion approach for patient-specific MRI based skull estimation.

    PubMed

    Torrado-Carvajal, Angel; Herraiz, Joaquin L; Hernandez-Tamames, Juan A; San Jose-Estepar, Raul; Eryaman, Yigitcan; Rozenholc, Yves; Adalsteinsson, Elfar; Wald, Lawrence L; Malpica, Norberto

    2016-04-01

    MRI-based skull segmentation is a useful procedure for many imaging applications. This study describes a methodology for automatic segmentation of the complete skull from a single T1-weighted volume. The skull is estimated using a multi-atlas segmentation approach. Using a whole-head computed tomography (CT) scan database, the skull in a new MRI volume is detected by nonrigid image registration of the volume to every CT, and combination of the individual segmentations by label fusion. We have compared the Majority Voting, Simultaneous Truth and Performance Level Estimation (STAPLE), Shape Based Averaging (SBA), and Selective and Iterative Method for Performance Level Estimation (SIMPLE) algorithms. The pipeline has been evaluated quantitatively using images from the Retrospective Image Registration Evaluation database (reaching an overlap of 72.46 ± 6.99%), a clinical CT-MR dataset (maximum overlap of 78.31 ± 6.97%), and a whole-head CT-MRI pair (maximum overlap 78.68%). A qualitative evaluation has also been performed on MRI acquisitions of volunteers. It is possible to automatically segment the complete skull from MRI data using a multi-atlas and label fusion approach. This will allow the creation of complete MRI-based tissue models that can be used in electromagnetic dosimetry applications and attenuation correction in PET/MR. © 2015 Wiley Periodicals, Inc.
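    Of the compared label-fusion algorithms, majority voting is the simplest and can be sketched in a few lines; the toy label stack below stands in for masks propagated from registered CT atlases, while STAPLE, SBA, and SIMPLE weight the atlases in more elaborate ways.

    ```python
    import numpy as np

    # Toy stack of skull masks from N registered CT atlases (1 = bone).
    atlas_labels = np.array([
        [[0, 1, 1], [0, 1, 0]],
        [[0, 1, 1], [1, 1, 0]],
        [[0, 0, 1], [0, 1, 0]],
    ])  # shape: (n_atlases, H, W)

    # Majority voting: a voxel is bone if more than half the atlases agree.
    votes = atlas_labels.sum(axis=0)
    fused = (votes > atlas_labels.shape[0] / 2).astype(np.uint8)
    print(fused)
    ```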

  19. A Global Health Research Checklist for clinicians.

    PubMed

    Sawaya, Rasha D; Breslin, Kristen A; Abdulrahman, Eiman; Chapman, Jennifer I; Good, Dafina M; Moran, Lili; Mullan, Paul C; Badaki-Makun, Oluwakemi

    2018-04-19

    Global health research has become a priority in most international medical projects. However, it is a difficult endeavor, especially for a busy clinician. Navigating the ethics, methods, and local partnerships is essential yet daunting. To date, there are no published guidelines to help clinicians initiate and complete successful global health research projects. This Global Health Research Checklist was developed to be used by clinicians or other health professionals for developing, implementing, and completing a successful research project in an international and often low-resource setting. It consists of five sections: Objective, Methodology, Institutional Review Board and Ethics, Culture and partnerships, and Logistics. We used individual experiences and published literature to develop and emphasize the key concepts. The checklist was trialed in two workshops and adjusted based on participants' feedback.

  20. Adolescent and Young Adult Patient Engagement and Participation in Survey-Based Research: A Report From the "Resilience in Adolescents and Young Adults With Cancer" Study.

    PubMed

    Rosenberg, Abby R; Bona, Kira; Wharton, Claire M; Bradford, Miranda; Shaffer, Michele L; Wolfe, Joanne; Baker, Kevin Scott

    2016-04-01

    Conducting patient-reported outcomes research with adolescents and young adults (AYAs) is difficult due to low participation rates and high attrition. Forty-seven AYAs with newly diagnosed cancer at two large hospitals were prospectively surveyed at the time of diagnosis and 3-6 and 12-18 months later. A subset participated in 1:1 semistructured interviews. Attrition prompted early study closure at one site. The majority of patients preferred paper-pencil to online surveys. Interview participants were more likely to complete surveys (e.g., 93% vs. 58% completion of 3-6 month surveys, P = 0.02). Engaging patients through qualitative methodologies and using patient-preferred instruments may optimize future research success. © 2015 Wiley Periodicals, Inc.

  1. Comparative analysis of student self-reflections on course projects

    NASA Astrophysics Data System (ADS)

    Pomales-García, Cristina; Cortés Barreto, Kenneth

    2014-11-01

    This study presents the skills, experiences, and values identified in the project self-reflections of 161 undergraduate engineering students. Self-reflections from two different engineering design courses, which provide experiences in project-based learning (PBL), are analysed through the content analysis methodology. Results show that 'application', 'true life', 'satisfaction', and 'communication' are the common keywords shared across the reflections. Multiple hypothesis tests to identify differences between courses, project types, years, and gender suggest that there are no significant differences in the experiences, skills, and values self-reported by students who completed either a case study or an industry project. Based on the research findings, recommendations are provided to enhance PBL-based engineering curricula to support the development of relevant professional skills and experiences.

  2. School Psychology as a Relational Enterprise: The Role and Process of Qualitative Methodology

    ERIC Educational Resources Information Center

    Newman, Daniel S.; Clare, Mary M.

    2016-01-01

    The purpose of this article is to explore the application of qualitative research to establishing a more complete understanding of relational processes inherent in school psychology practice. We identify the building blocks of rigorous qualitative research design through a conceptual overview of qualitative paradigms, methodologies, methods (i.e.,…

  3. Validating the Octave Allegro Information Systems Risk Assessment Methodology: A Case Study

    ERIC Educational Resources Information Center

    Keating, Corland G.

    2014-01-01

    An information system (IS) risk assessment is an important part of any successful security management strategy. Risk assessments help organizations to identify mission-critical IS assets and prioritize risk mitigation efforts. Many risk assessment methodologies, however, are complex and can only be completed successfully by highly qualified and…

  4. Implementation of Recommendations from the One System Comparative Evaluation of the Hanford Tank Farms and Waste Treatment Plant Safety Bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garrett, Richard L.; Niemi, Belinda J.; Paik, Ingle K.

    2013-11-07

    A Comparative Evaluation was conducted for the One System Integrated Project Team by an Expert Review Team to compare the safety bases for the Hanford Waste Treatment and Immobilization Plant Project (WTP) and the Tank Operations Contract (TOC) (i.e., Tank Farms). The evaluation had the overarching purpose of facilitating effective integration between the WTP and TOC safety bases. It was to provide One System management with an objective evaluation of identified differences in safety basis process requirements, guidance, direction, procedures, and products (including safety controls, key safety basis inputs and assumptions, and consequence calculation methodologies) between WTP and TOC. The evaluation identified 25 recommendations (Opportunities for Integration). The resolution of these recommendations resulted in 16 implementation plans. The completion of these implementation plans will help ensure consistent safety bases for WTP and TOC, along with consistent safety basis processes, procedures, and analyses, and should increase the likelihood of a successful startup of the WTP. This early integration will result in long-term cost savings and significant operational improvements. In addition, the implementation plans led to the development of eight new safety analysis methodologies that can be used at other U.S. Department of Energy (US DOE) complex sites where URS Corporation is involved.

  5. FDTD-based optical simulations methodology for CMOS image sensors pixels architecture and process optimization

    NASA Astrophysics Data System (ADS)

    Hirigoyen, Flavien; Crocherie, Axel; Vaillant, Jérôme M.; Cazaux, Yvon

    2008-02-01

    This paper presents a new FDTD-based optical simulation model dedicated to describing the optical performance of CMOS image sensors, taking into account diffraction effects. Following the market trend and industrialization constraints, CMOS image sensors must be easily embedded into ever smaller packages, which are now equipped with auto-focus and, soon, zoom systems. Due to miniaturization, the ray-tracing models used to evaluate pixel optical performance are no longer accurate enough to describe the light propagation inside the sensor, because of diffraction effects. Thus we adopt a more fundamental description to take these diffraction effects into account: we chose to use Maxwell-Boltzmann based modeling to compute the propagation of light, and a software tool with an FDTD-based (Finite Difference Time Domain) engine to solve this propagation. We present in this article the complete methodology of this modeling: on one hand, incoherent plane waves are propagated to approximate a product-use diffuse-like source; on the other hand, we use periodic conditions to limit the size of the simulated model and both memory and computation time. After presenting the correlation of the model with measurements, we illustrate its use in the optimization of a 1.75 μm pixel.
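    The abstract's simulations rely on a 3D FDTD engine with periodic boundaries and superposed incoherent plane waves; purely as a much-reduced illustration of the underlying time-stepping scheme, here is a minimal 1D vacuum FDTD loop in normalized units with a toy source.

    ```python
    import numpy as np

    # Minimal 1D vacuum FDTD (Yee) update loop, the kernel behind such solvers.
    nx, nt = 200, 400
    ez = np.zeros(nx)        # electric field
    hy = np.zeros(nx - 1)    # magnetic field, staggered half a cell
    courant = 0.5            # normalized time step (stability requires <= 1)

    for n in range(nt):
        hy += courant * np.diff(ez)                     # update H from curl E
        ez[1:-1] += courant * np.diff(hy)               # update E from curl H
        ez[nx // 4] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

    print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")
    ```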

  6. Separation of GRACE geoid time-variations using Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Frappart, F.; Ramillien, G.; Maisongrande, P.; Bonnet, M.

    2009-12-01

    Independent Component Analysis (ICA) is a blind separation method based on the simple assumptions of the independence of the sources and the non-Gaussianity of the observations. An approach based on this numerical method is used here to extract hydrological signals over land and oceans from the polluting striping noise, due to orbit repetitiveness, present in the GRACE global mass anomalies. We took advantage of the availability of monthly Level-2 solutions from three official providers (i.e., CSR, JPL, and GFZ), which can be considered different observations of the same phenomenon. The efficiency of the methodology is first demonstrated on a synthetic case. Applied to one month of GRACE solutions, it clearly separates the total water storage change from the meridional-oriented spurious gravity signals on the continents, but not over the oceans. This technique gives results equivalent to those of the destriping method for continental water storage, recovering the hydrological patterns with less smoothing. This methodology is then used to filter the complete series of the 2002-2009 GRACE solutions.
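    A small sketch of the separation idea on synthetic data is shown below: three mixtures of a smooth "hydrology" signal and a stripe-like artifact stand in for the CSR/JPL/GFZ solutions, and FastICA recovers the independent sources. The mixing matrix and signals are invented for illustration.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(1)

    # Synthetic stand-ins for one month of CSR/JPL/GFZ gridded anomalies,
    # flattened to vectors: a shared hydrology signal plus stripe-like noise.
    npix = 2000
    hydrology = np.sin(np.linspace(0, 8 * np.pi, npix))           # "true" signal
    stripes = np.sign(np.sin(np.linspace(0, 200 * np.pi, npix)))  # N-S striping
    mix = np.array([[1.0, 0.6], [0.9, 0.8], [1.1, 0.5]])          # per-center mixing
    observations = mix @ np.vstack([hydrology, stripes])
    observations += 0.05 * rng.standard_normal(observations.shape)

    # ICA treats the three centers' solutions as mixtures of independent sources.
    ica = FastICA(n_components=2, random_state=0)
    sources = ica.fit_transform(observations.T)   # (npix, 2) recovered sources
    print("source-vs-hydrology correlations:",
          [f"{abs(np.corrcoef(sources[:, i], hydrology)[0, 1]):.2f}"
           for i in range(2)])
    ```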

  7. A spectral approach for the quantitative description of cardiac collagen network from nonlinear optical imaging.

    PubMed

    Masè, Michela; Cristoforetti, Alessandro; Avogaro, Laura; Tessarolo, Francesco; Piccoli, Federico; Caola, Iole; Pederzolli, Carlo; Graffigna, Angelo; Ravelli, Flavia

    2015-01-01

    The assessment of collagen structure in cardiac pathology, such as atrial fibrillation (AF), is essential for a complete understanding of the disease. This paper introduces a novel methodology for the quantitative description of collagen network properties, based on the combination of nonlinear optical microscopy with a spectral approach of image processing and analysis. Second-harmonic generation (SHG) microscopy was applied to atrial tissue samples from cardiac surgery patients, providing label-free, selective visualization of the collagen structure. The spectral analysis framework, based on 2D-FFT, was applied to the SHG images, yielding a multiparametric description of collagen fiber orientation (angle and anisotropy indexes) and texture scale (dominant wavelength and peak dispersion indexes). The proof-of-concept application of the methodology showed the capability of our approach to detect and quantify differences in the structural properties of the collagen network in AF versus sinus rhythm patients. These results suggest the potential of our approach in the assessment of collagen properties in cardiac pathologies related to a fibrotic structural component.
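    The orientation-from-spectrum idea can be illustrated compactly: for a synthetic striped image standing in for an SHG acquisition, the dominant peak of the 2D power spectrum lies perpendicular to the fiber direction. The image, scale, and angle-extraction rule below are simplifications of the paper's multiparametric indexes.

    ```python
    import numpy as np

    # Synthetic fiber-like image: stripes whose wavevector points at 30 deg,
    # i.e., the "fibers" themselves run at ~120 deg (SHG stand-in).
    h = w = 128
    y, x = np.mgrid[0:h, 0:w]
    theta = np.deg2rad(30.0)
    image = np.sin(2 * np.pi * (x * np.cos(theta) + y * np.sin(theta)) / 8.0)

    # 2D power spectrum: the core quantity of the spectral analysis framework.
    power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    cy, cx = h // 2, w // 2
    power[cy, cx] = 0.0  # suppress the DC component

    # Orientation of the dominant spectral peak; fibers run perpendicular to it.
    py, px = np.unravel_index(np.argmax(power), power.shape)
    spectral_angle = np.degrees(np.arctan2(py - cy, px - cx)) % 180.0
    print(f"dominant spectral angle ~{spectral_angle:.0f} deg; "
          f"estimated fiber orientation ~{(spectral_angle + 90.0) % 180.0:.0f} deg")
    ```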

  8. The Aeronautical Data Link: Taxonomy, Architectural Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Goode, Plesent W.

    2002-01-01

    The future Communication, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) system will rely on global satellite navigation and on ground-based and satellite-based communications via Multi-Protocol Networks (e.g. combined Aeronautical Telecommunications Network (ATN)/Internet Protocol (IP)) to bring about needed improvements in efficiency and safety of operations to meet increasing levels of air traffic. This paper discusses the development of an approach that completely describes optimal data link architecture configuration and behavior to meet the multiple, conflicting objectives of concurrent and differing operational functions. The practical application of the approach enables the design and assessment of configurations relative to airspace operations phases. The approach includes a formal taxonomic classification, an architectural analysis methodology, and optimization techniques. The formal taxonomic classification provides a multidimensional correlation of data link performance with data link service, information protocol, spectrum, and technology mode, and with flight operations phase and environment. The architectural analysis methodology assesses the impact of a specific architecture configuration and behavior on local ATM system performance. Deterministic and stochastic optimization techniques maximize architectural design effectiveness while addressing operational, technology, and policy constraints.

  9. Background qualitative analysis of the European Reference Life Cycle Database (ELCD) energy datasets - part I: fuel datasets.

    PubMed

    Garraín, Daniel; Fazio, Simone; de la Rúa, Cristina; Recchioni, Marco; Lechón, Yolanda; Mathieux, Fabrice

    2015-01-01

    The aim of this study is to identify areas of potential improvement in the European Reference Life Cycle Database (ELCD) fuel datasets. The revision is based on the data quality indicators described by the ILCD Handbook, applied on a sectorial basis. These indicators evaluate the technological, geographical, and time-related representativeness of the dataset and its appropriateness in terms of completeness, precision, and methodology. Results show that the ELCD fuel datasets are of very good quality in general terms; nevertheless, some findings and recommendations for improving the quality of Life Cycle Inventories have been derived. Moreover, these results give any LCA practitioner assurance of the quality of the fuel-related datasets and provide insights into the limitations and assumptions underlying the dataset modelling. Given this information, the LCA practitioner will be able to decide whether the use of the ELCD fuel datasets is appropriate based on the goal and scope of the analysis to be conducted. The methodological approach would also be useful for dataset developers and reviewers, in order to improve the overall data quality rating (DQR) of databases.

  10. Transient Region Coverage in the Propulsion IVHM Technology Experiment

    NASA Technical Reports Server (NTRS)

    Balaban, Edward; Sweet, Adam; Bajwa, Anupa; Maul, William; Fulton, Chris; Chicatelli, Amy

    2004-01-01

    Over the last several years researchers at NASA Glenn and Ames Research Centers have developed a real-time fault detection and isolation system for propulsion subsystems of future space vehicles. The Propulsion IVHM Technology Experiment (PITEX), as it is called, follows the model-based diagnostic methodology and employs Livingstone, developed at NASA Ames, as its reasoning engine. The system has been tested on flight-like hardware through a series of nominal and fault scenarios. These scenarios have been developed using a highly detailed simulation of the X-34 flight demonstrator main propulsion system and include realistic failures involving valves, regulators, microswitches, and sensors. This paper focuses on one of the recent research and development efforts under PITEX: providing more complete transient region coverage. It describes the development of the transient monitors, the corresponding modeling methodology, and the interface software responsible for coordinating the flow of information between the quantitative monitors and the qualitative, discrete-representation Livingstone.

  11. A Novel Strategy for Numerical Simulation of High-speed Turbulent Reacting Flows

    NASA Technical Reports Server (NTRS)

    Sheikhi, M. R. H.; Drozda, T. G.; Givi, P.

    2003-01-01

    The objective of this research is to improve and implement the filtered mass density function (FDF) methodology for large eddy simulation (LES) of high-speed reacting turbulent flows. We have just completed Year 1 of this research. This is the final report on our activities during the period January 1, 2003 to December 31, 2003. In the efforts during the past year, LES was conducted of the Sandia Flame D, which is a turbulent piloted nonpremixed methane jet flame. The subgrid-scale (SGS) closure is based on the scalar filtered mass density function (SFMDF) methodology. The SFMDF is basically the mass-weighted probability density function (PDF) of the SGS scalar quantities. For this flame (which exhibits little local extinction), a simple flamelet model is used to relate the instantaneous composition to the mixture fraction. The modelled SFMDF transport equation is solved by a hybrid finite-difference/Monte Carlo scheme.

  12. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    PubMed

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.

  13. The Evolutionary History of Protein Domains Viewed by Species Phylogeny

    PubMed Central

    Yang, Song; Bourne, Philip E.

    2009-01-01

    Background Protein structural domains are evolutionary units whose relationships can be detected over long evolutionary distances. The evolutionary history of protein domains, including the origin of protein domains, the identification of domain loss, transfer, duplication and combination with other domains to form new proteins, and the formation of the entire protein domain repertoire, are of great interest. Methodology/Principal Findings A methodology is presented for providing a parsimonious domain history based on gain, loss, vertical and horizontal transfer derived from the complete genomic domain assignments of 1015 organisms across the tree of life. When mapped to species trees the evolutionary history of domains and domain combinations is revealed, and the general evolutionary trend of domain and combination is analyzed. Conclusions/Significance We show that this approach provides a powerful tool to study how new proteins and functions emerged and to study such processes as horizontal gene transfer among more distant species. PMID:20041107

  14. Reduced-order modeling of soft robots

    PubMed Central

    Chenevier, Jean; González, David; Aguado, J. Vicente; Chinesta, Francisco

    2018-01-01

    We present a general strategy for the modeling and simulation-based control of soft robots. Although the presented methodology is completely general, we restrict ourselves to the analysis of a model robot made of hyperelastic materials and actuated by cables or tendons. To comply with the stringent real-time constraints imposed by control algorithms, a reduced-order modeling strategy is proposed that minimizes the online CPU cost. An offline training procedure is instead used to determine a response surface that characterizes the response of the robot. Contrary to existing strategies, the proposed methodology allows for fully non-linear modeling of the soft material in a hyperelastic setting, as well as a fully non-linear kinematic description of the movement, without any restriction or simplifying assumption. Examples of different configurations of the robot are analyzed that show the appeal of the method. PMID:29470496
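    The offline/online split can be sketched as follows: an expensive full model is sampled offline over the actuation range, a cheap surrogate is fitted, and only the surrogate is evaluated inside the control loop. The one-dimensional toy model and polynomial response surface below are illustrative stand-ins for the robot's hyperelastic finite-element model.

    ```python
    import numpy as np
    from numpy.polynomial import polynomial as P

    # Offline stage: sample the expensive full model over the actuation range
    # and fit a cheap surrogate (a toy 1D "response surface"; the real robot
    # is high-dimensional).
    def full_model(cable_tension):
        # Stand-in for a hyperelastic FE solve: tip angle vs cable tension.
        return 0.8 * np.tanh(0.5 * cable_tension)

    tensions = np.linspace(0.0, 10.0, 50)
    coeffs = P.polyfit(tensions, full_model(tensions), deg=5)

    # Online stage: evaluating the surrogate is cheap enough for a control loop.
    def surrogate(cable_tension):
        return P.polyval(cable_tension, coeffs)

    for t in (1.0, 4.0, 9.0):
        print(f"tension {t:4.1f}: full = {full_model(t):.4f}, "
              f"surrogate = {surrogate(t):.4f}")
    ```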

  15. Piloted Evaluation of an Integrated Methodology for Propulsion and Airframe Control Design

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.; Garg, Sanjay; Mattern, Duane L.; Ranaudo, Richard J.; Odonoghue, Dennis P.

    1994-01-01

    An integrated methodology for propulsion and airframe control has been developed and evaluated for a Short Take-Off and Vertical Landing (STOVL) aircraft using a fixed-base flight simulator at NASA Lewis Research Center. For this evaluation the flight simulator is configured for transition flight using a STOVL aircraft model, a full nonlinear turbofan engine model, a simulated cockpit and displays, and pilot effectors. The paper provides a brief description of the simulation models, the flight simulation environment, the displays and symbology, the integrated control design, and the piloted tasks used for control design evaluation. In the simulation, the pilots successfully completed typical transition-phase tasks such as combined constant deceleration with flight path tracking, and constant acceleration wave-off maneuvers. The pilots' comments on the integrated system performance and the display symbology are discussed and analyzed to identify potential areas of improvement.

  16. Drugs of Abuse and Addiction: An integrated approach to teaching.

    PubMed

    Miller, Lindsey N; Mercer, Susan L

    2017-05-01

    To describe the design, implementation, and student perceptions of a Drugs of Abuse and Addiction elective course utilizing an integrated teaching model. Third-year pharmacy students enrolled in the two-credit-hour elective. Teaching methodology included didactic lecture, journal club, a simulated addiction assignment with reflection, debates, external speakers, a site visit to a residential drug court program, and a research paper with presentation. A course objective survey was administered upon course completion. All students strongly agreed that having science- and clinical-based faculty members develop and deliver course content was beneficial. Additionally, all students agreed or strongly agreed that their research project helped them integrate and comprehend the science and practice surrounding drugs of abuse and addiction. Students enjoyed the integrated teaching approach and multiple teaching methodologies, leading to increased engagement and enhancement of student learning. Course enrollment was beneficial for personalized learning but limited student perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. A framework for an alternatives assessment dashboard for evaluating chemical alternatives applied to flame retardants for electronic applications.

    PubMed

    Martin, Todd M

    2017-05-01

    The goal of alternatives assessment (AA) is to facilitate a comparison of alternatives to a chemical of concern, resulting in the identification of safer alternatives. A two-stage methodology for comparing chemical alternatives was developed. In the first stage, alternatives are compared using a variety of human health effects, ecotoxicity, and physicochemical properties. Hazard profiles are completed using a variety of online sources and quantitative structure-activity relationship models. In the second stage, alternatives are evaluated using an exposure/risk assessment over the entire life cycle. Exposure values are calculated using screening-level near-field and far-field exposure models. The second stage allows one to compare potential exposure to each alternative more accurately and to consider additional factors that may not be obvious from separate binned persistence, bioaccumulation, and toxicity scores. The methodology was utilized to compare phosphate-based alternatives for decabromodiphenyl ether (decaBDE) in electronics applications.
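
    A toy sketch of the first-stage comparison: binned hazard scores per endpoint are rolled up into a screening summary. The chemical names, endpoints, and scores below are invented for illustration and do not reproduce the dashboard's actual criteria or data.

```python
# Hypothetical hazard profiles: endpoint scores binned 1 (low concern) to 3 (high concern).
profiles = {
    "decaBDE":        {"carcinogenicity": 2, "aquatic_toxicity": 3, "persistence": 3, "bioaccumulation": 3},
    "phosphate_alt_A": {"carcinogenicity": 1, "aquatic_toxicity": 2, "persistence": 2, "bioaccumulation": 1},
    "phosphate_alt_B": {"carcinogenicity": 2, "aquatic_toxicity": 1, "persistence": 2, "bioaccumulation": 1},
}

def summarize(profile):
    # Screening summary: count of high-concern endpoints first, then total score.
    return sum(1 for v in profile.values() if v == 3), sum(profile.values())

# Rank alternatives from least to most concerning under this simple roll-up.
for chem, prof in sorted(profiles.items(), key=lambda kv: summarize(kv[1])):
    highs, total = summarize(prof)
    print(f"{chem}: {highs} high-concern endpoint(s), total score {total}")
```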

  18. Use of evidence-based practice in an aid organisation: a proposal to deal with the variety in terminology and methodology.

    PubMed

    De Buck, Emmy; Pauwels, Nele S; Dieltjens, Tessa; Vandekerckhove, Philippe

    2014-03-01

    As part of its strategy, Belgian Red Cross-Flanders underpins all its activities with evidence-based guidelines and systematic reviews. The aim of this publication is to describe in detail the methodology used to achieve this goal within an action-oriented organisation, in a timely and cost-effective way. To demonstrate transparency in our methods, we wrote a methodological charter describing the way in which we develop evidence-based materials to support our activities. Criteria were drawn up for deciding on project priority and the choice of different types of projects (scoping reviews, systematic reviews, and evidence-based guidelines). While searching for rigorous and realistically attainable methodological standards, we encountered a wide variety in terminology and methodology used in the field of evidence-based practice. Terminologies currently being used by different organisations and institutions include systematic reviews, systematic literature searches, evidence-based guidelines, rapid reviews, pragmatic systematic reviews, and rapid response services. It is not always clear what definition and methodology lie behind these terms or whether they are used consistently. We therefore describe the terminology and methodology used by Belgian Red Cross-Flanders; criteria for making methodological choices and details on the methodology we use are given. In our search for an appropriate methodology, taking into account time and resource constraints, we encountered an enormous variety of methodological approaches and terminology used for evidence-based materials. In light of this, we recommend that authors of evidence-based guidelines and reviews be transparent and clear about the methodology used. To be transparent about our approach, we developed a methodological charter. This charter may inspire other organisations that want to use evidence-based methodology to support their activities.

  19. The added value of thorough economic evaluation of telemedicine networks.

    PubMed

    Le Goff-Pronost, Myriam; Sicotte, Claude

    2010-02-01

    This paper proposes a thorough framework for the economic evaluation of telemedicine networks. A standard cost analysis methodology, similar to the evaluation method currently applied to telemedicine, was used as the initial base, to which we suggest adding subsequent stages that enhance the scope and sophistication of the analytical methodology. We completed the methodology with a longitudinal and stakeholder analysis, followed by the calculation of a break-even threshold, a calculation of the economic outcome based on net present value (NPV), an estimate of the social gain through external effects, and an assessment of the probability of social benefits. In order to illustrate the advantages, constraints, and limitations of the proposed framework, we tested it in a paediatric cardiology tele-expertise network. The results demonstrate that the break-even threshold was not reached over the 4 years of the study, and the project's NPV remained negative. However, the additional analytical steps of the proposed framework allowed us to highlight alternatives that could make this service economically viable: use over an extended period of time, extending the network to other telemedicine specialties, or including it in the services offered by other community hospitals. In sum, the results presented here demonstrate the usefulness of an economic evaluation framework as a way of offering decision makers the tools they need to make comprehensive evaluations of telemedicine networks.
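
    The break-even and NPV steps of such a framework reduce to a few lines of arithmetic; the sketch below uses invented cost and benefit figures purely to show the mechanics.

```python
# Minimal sketch of the NPV and break-even calculations for a telemedicine
# network. All figures (costs, benefits, discount rate) are illustrative.
annual_costs = [120_000, 40_000, 40_000, 40_000]      # year 0 includes equipment
annual_benefits = [10_000, 45_000, 60_000, 75_000]    # avoided transfers, time saved
discount_rate = 0.05

def npv(costs, benefits, r):
    return sum((b - c) / (1 + r) ** t for t, (c, b) in enumerate(zip(costs, benefits)))

# Break-even check: find the first year the cumulative discounted net benefit >= 0.
cumulative = 0.0
for year, (c, b) in enumerate(zip(annual_costs, annual_benefits)):
    cumulative += (b - c) / (1 + discount_rate) ** year
    print(f"year {year}: cumulative discounted net benefit = {cumulative:,.0f}")

print("NPV:", round(npv(annual_costs, annual_benefits, discount_rate)))
# A negative NPV at the end of the horizon, as in the paediatric cardiology case,
# prompts testing longer horizons or pooling additional specialties.
```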

  20. Environmental exposure effects on composite materials for commercial aircraft

    NASA Technical Reports Server (NTRS)

    Hoffman, D. J.

    1978-01-01

    Activities reported include completion of the program design tasks, resolution of a high fiber volume problem and resumption of specimen fabrication, fixture fabrication, and progress on the analysis methodology and definition of the typical aircraft environment. Program design activities, including test specimens, specimen holding fixtures, flap-track fairing tailcones, and ground exposure racks, were completed. The problem experienced in obtaining acceptable fiber volume fraction results on two of the selected graphite epoxy material systems was resolved with an alteration to the bagging procedure called out in BAC 5562. The revised bagging procedure, involving fewer bleeder plies, produces acceptable results. All required laminates for the contract have now been laid up and cured. Progress in the area of analysis methodology has centered on defining the environment that a commercial transport aircraft experiences. The selected methodology is analogous to fatigue life assessment.

  1. Relationship Between Active Learning Methodologies and Community College Students' STEM Course Grades

    NASA Astrophysics Data System (ADS)

    Clark Lesko, Cherish Christina

    Active learning methodologies (ALM) are associated with student success, but little research on this topic has been pursued at the community college level. At a local community college, students in science, technology, engineering, and math (STEM) courses exhibited lower than average grades. The purpose of this study was to examine whether the use of ALM predicted STEM course grades while controlling for academic discipline, course level, and class size. The theoretical framework was Vygotsky's social constructivism. Descriptive statistics and multinomial logistic regression were performed on data collected through an anonymous survey of 74 instructors of 272 courses during the 2016 fall semester. Results indicated that students were more likely to achieve passing grades when instructors employed in-class ALM, highly structured activities, and writing-based ALM, and were less likely to achieve passing grades when instructors employed project-based or online ALM. The odds ratios indicated strong positive effects (greater likelihoods of receiving As, Bs, or Cs in comparison to the grade of F) for writing-based ALM (39.1-43.3%, 95% CI [10.7-80.3%]), highly structured activities (16.4-22.2%, 95% CI [1.8-33.7%]), and in-class ALM (5.0-9.0%, 95% CI [0.6-13.8%]). Project-based and online ALM showed negative effects (lower likelihoods of receiving As, Bs, or Cs in comparison to the grade of F), with odds ratios of 15.7-20.9%, 95% CI [9.7-30.6%], and 16.1-20.4%, 95% CI [5.9-25.2%], respectively. A white paper was developed with recommendations for faculty development, computer skills assessment and training, and active research on writing-based ALM. Improving student grades and STEM course completion rates could lead to higher graduation rates and lower college costs for at-risk students by reducing course repetition and time to degree completion.
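
    For readers unfamiliar with the analysis, the following sketch fits a multinomial logistic regression and converts coefficients to odds ratios, as the study does. The data are simulated and the variable names are hypothetical; this is not the study's dataset or exact model specification.

```python
import numpy as np
import statsmodels.api as sm

# Simulated stand-in: grade category (0=F, 1=C, 2=B, 3=A) modeled against
# ALM indicators with a class-size control.
rng = np.random.default_rng(1)
n = 500
writing_alm = rng.integers(0, 2, n)
in_class_alm = rng.integers(0, 2, n)
class_size = rng.normal(25, 6, n)
latent = 0.8 * writing_alm + 0.3 * in_class_alm - 0.02 * class_size + rng.normal(0, 1, n)
grade = np.digitize(latent, np.quantile(latent, [0.25, 0.5, 0.75]))  # 4 categories

X = sm.add_constant(np.column_stack([writing_alm, in_class_alm, class_size]))
fit = sm.MNLogit(grade, X).fit(disp=False)

# Odds ratios relative to the baseline grade category are the exponentiated coefficients.
print(np.exp(fit.params))
```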

  2. Final Report for X-ray Diffraction Sample Preparation Method Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ely, T. M.; Meznarich, H. K.; Valero, T.

    WRPS-1500790, “X-ray Diffraction Saltcake Sample Preparation Method Development Plan/Procedure,” was originally prepared with the intent of improving the specimen preparation methodology used to generate saltcake specimens suitable for XRD-based solid phase characterization. At the time that this test plan document was originally developed, packed powder in cavity supports with collodion binder was the established XRD specimen preparation method. An alternative specimen preparation method that is less vulnerable, if not completely invulnerable, to preferred orientation effects was desired as a replacement.

  3. U.S. Air Force Installation Restoration Program. Phase 1. Records Search for Suffolk County Air Force Base (Retired) Landfills 1 and 2. Suffolk County Airport, Westhampton Beach, New York.

    DTIC Science & Technology

    1987-09-20

    [Fragmentary scanned-report text. Recoverable content: common species at the two study sites (Appendix D) include the herring and ring-billed gull, mourning dove, tree swallow, chimney swift, and others; table-of-contents entries for Section III.G (Adjacent Land Use) and Section III.H (Summary of Environmental Features); a note that the report includes the methodology and a list of acronyms/abbreviations; and a decision tree for evaluating the complete list of locations/sites.]

  4. Analysis of Wind Turbine Simulation Models: Assessment of Simplified versus Complete Methodologies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Honrubia-Escribano, A.; Jimenez-Buendia, F.; Molina-Garcia, A.

    This paper presents the current status of simplified wind turbine models used for power system stability analysis. It is based on the ongoing work of IEC 61400-27. This international standard, for which a technical committee was convened in October 2009, focuses on defining generic (also known as simplified) simulation models for both wind turbines and wind power plants. The results of the paper provide an improved understanding of the usability of generic models to conduct power system simulations.

  5. A review of EO image information mining

    NASA Astrophysics Data System (ADS)

    Quartulli, Marco; Olaizola, Igor G.

    2013-01-01

    We analyze the state of the art of content-based retrieval in Earth observation image archives, focusing on complete systems showing promise for operational implementation. The different paradigms at the basis of the main system families are introduced. The approaches taken are considered, focusing in particular on the phases after primitive feature extraction. The solutions envisaged for the issues of feature simplification and synthesis, indexing, and semantic labeling are reviewed. The methodologies for query specification and execution are evaluated. Conclusions are drawn on the state of published research in Earth observation (EO) mining.

  6. A low-rank matrix recovery approach for energy efficient EEG acquisition for a wireless body area network.

    PubMed

    Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab

    2014-08-25

    We address the problem of acquiring and transmitting EEG signals in Wireless Body Area Networks (WBAN) in an energy-efficient fashion. In WBANs, energy is consumed by three operations: sensing (sampling), processing, and transmission. Previous studies addressed only the problem of reducing the transmission energy. For the first time, in this work, we propose a technique to reduce sensing and processing energy as well: this is achieved by randomly under-sampling the EEG signal. We depart from previous Compressed Sensing based approaches and formulate signal recovery (from under-sampled measurements) as a matrix completion problem. A new algorithm to solve the matrix completion problem is derived here. We test the proposed method and find that its reconstruction accuracy is significantly better than that of state-of-the-art techniques, and we achieve this while saving sensing, processing, and transmission energy. Simple power analysis shows that our proposed methodology consumes considerably less power compared to previous CS based techniques.
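
    The paper derives its own matrix completion algorithm; as a generic illustration of the underlying idea (recovering a low-rank signal matrix from a subset of samples), here is a standard singular-value-thresholding sketch, not the authors' algorithm.

```python
import numpy as np

def svt_complete(M, mask, tau=5.0, step=1.2, iters=200):
    """Generic singular-value-thresholding sketch for matrix completion:
    recover a low-rank matrix from the observed entries marked True in mask."""
    Y = np.zeros_like(M)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt      # shrink singular values
        Y += step * mask * (M - X)                   # correct on observed entries only
    return X

# Demo: a rank-4 stand-in for a multichannel EEG block, 50% of samples observed.
rng = np.random.default_rng(0)
true = rng.normal(size=(32, 4)) @ rng.normal(size=(4, 64))   # 32 channels, 64 samples
mask = rng.random(true.shape) < 0.5
est = svt_complete(true * mask, mask)
print("relative error:", np.linalg.norm(est - true) / np.linalg.norm(true))
```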

  7. Interventions to Improve the Quality of Outpatient Specialty Referral Requests: A Systematic Review.

    PubMed

    Hendrickson, Chase D; Lacourciere, Stacy L; Zanetti, Cole A; Donaldson, Patrick C; Larson, Robin J

    2016-09-01

    Requests for outpatient specialty consultations occur frequently but often are of poor quality because of incompleteness. The authors searched bibliographic databases, trial registries, and references during October 2014 for studies evaluating interventions to improve the quality of outpatient specialty referral requests compared to usual practice. Two reviewers independently extracted data and assessed quality. Findings on the completeness of information relayed in referral requests were qualitatively summarized within naturally emerging intervention categories. Of 3495 articles screened, 11 were eligible. All 3 studies evaluating software-based interventions found statistically significant improvements. Among 4 studies evaluating template/pro forma interventions, completeness was uniformly improved but with variable or unreported statistical significance. Of 4 studies evaluating educational interventions, 2 favored the intervention and 2 found no difference. One study evaluating referral management was negative. Current evidence for improving referral request quality is strongest for software-based interventions and templates, although methodological quality varied and findings may be setting specific. © The Author(s) 2015.

  8. Development and applications of two computational procedures for determining the vibration modes of structural systems. [aircraft structures - aerospaceplanes

    NASA Technical Reports Server (NTRS)

    Kvaternik, R. G.

    1975-01-01

    Two computational procedures for analyzing complex structural systems for their natural modes and frequencies of vibration are presented. Both procedures are based on a substructures methodology, and both employ the finite-element stiffness method to model the constituent substructures. The first procedure is a direct method based on solving the eigenvalue problem associated with a finite-element representation of the complete structure. The second procedure is a component-mode synthesis scheme in which the vibration modes of the complete structure are synthesized from modes of substructures into which the structure is divided. The analytical basis of the methods combines features that enhance the generality of the procedures, and the procedures are versatile, computationally convenient, and easy to implement. The computational procedures were implemented in two special-purpose computer programs. The results of applying these programs to several structural configurations are shown, and comparisons are made with experiment.
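
    Both procedures ultimately reduce the modal analysis to a generalized eigenvalue problem K x = w^2 M x on stiffness and mass matrices. A minimal sketch with an illustrative 4-DOF spring-mass chain (values invented):

```python
import numpy as np
from scipy.linalg import eigh

# Minimal stand-in for the modal analysis step: a 4-DOF spring-mass chain,
# fixed at both ends. Stiffness and mass values are illustrative only.
n, ks = 4, 1000.0
K = 2 * ks * np.eye(n) - ks * (np.eye(n, k=1) + np.eye(n, k=-1))  # stiffness (N/m)
M = np.diag([2.0, 1.0, 1.0, 2.0])                                 # lumped masses (kg)

# Natural modes and frequencies solve the generalized eigenproblem K x = w^2 M x.
w2, modes = eigh(K, M)
print("natural frequencies (rad/s):", np.sqrt(w2))
print("first mode shape:", modes[:, 0])
```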

  9. Quality evaluation of health information system's architectures developed using the HIS-DF methodology.

    PubMed

    López, Diego M; Blobel, Bernd; Gonzalez, Carolina

    2010-01-01

    Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach for HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF had not been demonstrated so far. Through an empirical experiment, the paper demonstrates that, using HIS-DF and HL7 information models, the semantic quality of a HIS architecture can be improved compared to architectures developed using the traditional RUP process. Semantic quality of the architecture was measured in terms of model completeness and validity metrics. The experimental results demonstrated an increase in completeness of 14.38% and an increase in validity of 16.63% when using HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in earlier stages of HIS development suggests an increased quality of the final HIS and, indirectly, an impact on patient care.

  10. Model based design introduction: modeling game controllers to microprocessor architectures

    NASA Astrophysics Data System (ADS)

    Jungwirth, Patrick; Badawy, Abdel-Hameed

    2017-04-01

    We present an introduction to model based design. Model based design is a visual representation, generally a block diagram, used to model and incrementally develop a complex system. It is a commonly used design methodology for digital signal processing, control systems, and embedded systems. The philosophy of model based design is to solve a problem one step at a time; the approach can be compared to a series of steps that converge to a solution. A block diagram simulation tool allows a design to be simulated with real-world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded, the digital control algorithm can be simulated with the real-world sensor data, and the output of the simulated digital control system can then be compared to that of the old analog control system. Model based design can be compared to Agile software development: the Agile goal is to develop working software in incremental steps, with progress measured in completed and tested code units, whereas in model based design progress is measured in completed and tested blocks. We present a concept for a video game controller and then use model based design to iterate the design towards a working system. We also describe a model based design effort to develop an OS Friendly Microprocessor Architecture based on RISC-V.

  11. Relationship between solitary pulmonary nodule lung cancer and CT image features based on gradual clustering

    NASA Astrophysics Data System (ADS)

    Zhang, Weipeng

    2017-06-01

    The relationship between the medical characteristics of lung cancers and computed tomography (CT) images is explored so as to improve the early diagnosis rate of lung cancers. This research collected CT images of patients with solitary pulmonary nodule lung cancer and used a gradual clustering methodology to classify them. Preliminary classifications were made, followed by continuous modification and iteration to determine the optimal condensation points, until iteration stability was achieved and reasonable classification results were obtained. The clustering results fell into three categories. The first type of patient was mostly female, aged between 50 and 65 years; CT images of solitary pulmonary nodule lung cancer for this group contain complete lobulation and burr, with pleural indentation. The second type was mostly male, aged between 50 and 80 years; CT images for this group contain complete lobulation and burr, but no pleural indentation. The third type was also mostly male, aged between 50 and 80 years; CT images for this group showed no abnormalities. The application of gradual clustering methodology can scientifically classify CT image features of patients with lung cancer in the initial lesion stage. These findings provide the basis for early detection and treatment of malignant lesions in patients with lung cancer.
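
    The gradual clustering described above (initial condensation points, assignment, modification, iteration to stability) is close in spirit to k-means. A generic sketch, with hypothetical standardized feature rows rather than the study's CT data:

```python
import numpy as np

def gradual_cluster(X, n_clusters=3, iters=100, seed=0):
    """K-means-style sketch of gradual clustering: choose initial condensation
    points, assign cases to the nearest point, then modify and iterate until
    stability. A generic illustration, not the paper's exact procedure."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(n_clusters)])
        if np.allclose(new, centers):          # iteration stability reached
            break
        centers = new
    return labels, centers

# Hypothetical feature rows: [age, sex, lobulation, burr, pleural indentation]
X = np.array([[58, 0, 1, 1, 1], [62, 0, 1, 1, 1], [56, 0, 1, 1, 1],
              [70, 1, 1, 1, 0], [66, 1, 1, 1, 0], [74, 1, 0, 0, 0],
              [69, 1, 0, 0, 0]], dtype=float)
X = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize so age does not dominate
labels, _ = gradual_cluster(X)
print(labels)
```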

  12. Building the Material Flow Networks of Aluminum in the 2007 U.S. Economy.

    PubMed

    Chen, Wei-Qiang; Graedel, T E; Nuss, Philip; Ohno, Hajime

    2016-04-05

    Based on the combination of the U.S. economic input-output table and the stocks and flows framework for characterizing anthropogenic metal cycles, this study presents a methodology for building material flow networks of bulk metals in the U.S. economy and applies it to aluminum. The results, which we term the Input-Output Material Flow Networks (IO-MFNs), provide a complete picture of aluminum flow in the entire U.S. economy and for any chosen industrial sector (illustrated for the Automobile Manufacturing sector). The results are compared with information from our former study on U.S. aluminum stocks and flows to demonstrate the robustness and value of this new methodology. We find that the IO-MFN approach has the following advantages: (1) it helps to uncover the network of material flows in the manufacturing stage of the life cycle of metals; (2) it provides a method that may be less time-consuming but more complete and accurate in estimating new scrap generation, process loss, domestic final demand, and trade of final products of metals than existing material flow analysis approaches; and, most importantly, (3) it enables the analysis of the material flows of metals in the U.S. economy from a network perspective, rather than merely that of a life cycle chain.
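
    A minimal sketch of the network view an IO-MFN enables, using networkx with invented sector names and flow magnitudes (not the paper's data):

```python
import networkx as nx

# Toy aluminum mass flows (kt) among a few sectors, in the spirit of combining
# an IO table with material flow analysis. All numbers are invented.
flows = [
    ("Alumina refining", "Primary smelting", 4200),
    ("Primary smelting", "Sheet/plate rolling", 1800),
    ("Sheet/plate rolling", "Automobile manufacturing", 900),
    ("Automobile manufacturing", "New scrap collection", 140),
    ("New scrap collection", "Secondary smelting", 130),
    ("Secondary smelting", "Sheet/plate rolling", 600),
]
G = nx.DiGraph()
G.add_weighted_edges_from(flows, weight="mass_kt")

# Network-perspective query: total aluminum entering a chosen sector.
auto_in = G.in_degree("Automobile manufacturing", weight="mass_kt")
print("inflow to Automobile manufacturing:", auto_in, "kt")
```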

  13. Dynamic Uncertain Causality Graph for Knowledge Representation and Probabilistic Reasoning: Directed Cyclic Graph and Joint Probability Distribution.

    PubMed

    Zhang, Qin

    2015-07-01

    Probabilistic graphical models (PGMs) such as the Bayesian network (BN) have been widely applied in uncertain causality representation and probabilistic reasoning. The dynamic uncertain causality graph (DUCG) is a newly presented model of PGMs that can be applied to fault diagnosis of large and complex industrial systems, disease diagnosis, and so on. The basic methodology of DUCG has been presented previously, but only the directed acyclic graph (DAG) case was addressed, and the mathematical meaning of DUCG was not discussed. In this paper, the DUCG with directed cyclic graphs (DCGs) is addressed. In contrast, BNs do not allow DCGs, since conditional independence would otherwise not be satisfied. The inference algorithm for the DUCG with DCGs is presented, which not only extends the capabilities of DUCG from DAGs to DCGs but also enables users to decompose a large and complex DUCG into a set of small, simple sub-DUCGs, so that a large and complex knowledge base can be easily constructed, understood, and maintained. The basic mathematical definition of a complete DUCG, with or without DCGs, is proved to be a joint probability distribution (JPD) over a set of random variables. An incomplete DUCG, as part of a complete DUCG, may represent a part of the JPD. Examples are provided to illustrate the methodology.

  14. Contemporary health care economics: an overview.

    PubMed

    McLaughlin, Nancy; Ong, Michael K; Tabbush, Victor; Hagigi, Farhad; Martin, Neil A

    2014-11-01

    Economic evaluations provide a decision-making framework in which outcomes (benefits) and costs are assessed for various alternative options. Although the interest in complete and partial economic evaluations has increased over the past 2 decades, the quality of studies has been marginal due to methodological challenges or incomplete cost determination. This paper provides an overview of the main types of complete and partial economic evaluations, reviews key methodological elements to be considered for any economic evaluation, and reviews concepts of cost determination. The goal is to provide the clinician neurosurgeon with the knowledge and tools needed to appraise published economic evaluations and to direct high-quality health economic evaluations.

  15. Robotic Mission to Mars: Hands-on, minds-on, web-based learning

    NASA Astrophysics Data System (ADS)

    Mathers, Naomi; Goktogen, Ali; Rankin, John; Anderson, Marion

    2012-11-01

    Problem-based learning has been demonstrated as an effective methodology for developing analytical skills and critical thinking. The use of scenario-based learning incorporates problem-based learning whilst encouraging students to collaborate with their colleagues and dynamically adapt to their environment. This increased interaction stimulates a deeper understanding and the generation of new knowledge. The Victorian Space Science Education Centre (VSSEC) uses scenario-based learning in its Mission to Mars, Mission to the Orbiting Space Laboratory and Primary Expedition to the M.A.R.S. Base programs. These programs utilize methodologies such as hands-on applications, immersive-learning, integrated technologies, critical thinking and mentoring to engage students in Science, Technology, Engineering and Mathematics (STEM) and highlight potential career paths in science and engineering. The immersive nature of the programs demands specialist environments such as a simulated Mars environment, Mission Control and Space Laboratory, thus restricting these programs to a physical location and limiting student access to the programs. To move beyond these limitations, VSSEC worked with its university partners to develop a web-based mission that delivered the benefits of scenario-based learning within a school environment. The Robotic Mission to Mars allows students to remotely control a real rover, developed by the Australian Centre for Field Robotics (ACFR), on the VSSEC Mars surface. After completing a pre-mission training program and site selection activity, students take on the roles of scientists and engineers in Mission Control to complete a mission and collect data for further analysis. Mission Control is established using software developed by the ACRI Games Technology Lab at La Trobe University using the principles of serious gaming. The software allows students to control the rover, monitor its systems and collect scientific data for analysis. This program encourages students to work scientifically and explores the interaction between scientists and engineers. This paper presents the development of the program, including the involvement of university students in the development of the rover, the software, and the collation of the scientific data. It also presents the results of the trial phase of this program including the impact on student engagement and learning outcomes.

  16. Investigation of Experimental Factors That Underlie BRCA1/2 mRNA Isoform Expression Variation: Recommendations for Utilizing Targeted RNA Sequencing to Evaluate Potential Spliceogenic Variants

    PubMed Central

    Lattimore, Vanessa L.; Pearson, John F.; Currie, Margaret J.; Spurdle, Amanda B.; Robinson, Bridget A.; Walker, Logan C.

    2018-01-01

    PCR-based RNA splicing assays are commonly used in diagnostic and research settings to assess the potential effects of variants of uncertain clinical significance in BRCA1 and BRCA2. The Evidence-based Network for the Interpretation of Germline Mutant Alleles (ENIGMA) consortium completed a multicentre investigation to evaluate differences in assay design and the integrity of published data, raising a number of methodological questions associated with cell culture conditions and PCR-based protocols. We utilized targeted RNA-seq to re-assess BRCA1 and BRCA2 mRNA isoform expression patterns in lymphoblastoid cell lines (LCLs) previously used in the multicentre ENIGMA study. Capture of the targeted cDNA sequences was carried out using 34 BRCA1 and 28 BRCA2 oligonucleotides from the Illumina Truseq Targeted RNA Expression platform. Our results show that targeted RNA-seq analysis of LCLs overcomes many of the methodology limitations associated with PCR-based assays, leading us to make the following observations and recommendations: (1) use technical replicates (n > 2) of variant carriers to capture methodology-induced variability associated with RNA-seq assays; (2) LCLs can undergo multiple freeze/thaw cycles and can be cultured up to 2 weeks without noticeably influencing isoform expression levels; (3) nonsense-mediated decay inhibitors are essential prior to splicing assays for comprehensive mRNA isoform detection; (4) quantitative assessment of exon:exon junction levels across BRCA1 and BRCA2 can help distinguish between normal and aberrant isoform expression patterns. Experimentally derived recommendations from this study will facilitate the application of targeted RNA-seq platforms for the quantitation of BRCA1 and BRCA2 mRNA aberrations associated with sequence variants of uncertain clinical significance. PMID:29774201

  17. Controlling bias and inflation in epigenome- and transcriptome-wide association studies using the empirical null distribution.

    PubMed

    van Iterson, Maarten; van Zwet, Erik W; Heijmans, Bastiaan T

    2017-01-27

    We show that epigenome- and transcriptome-wide association studies (EWAS and TWAS) are prone to significant inflation and bias of test statistics, an unrecognized phenomenon introducing spurious findings if left unaddressed. Neither GWAS-based methodology nor state-of-the-art confounder adjustment methods completely remove bias and inflation. We propose a Bayesian method to control bias and inflation in EWAS and TWAS based on estimation of the empirical null distribution. Using simulations and real data, we demonstrate that our method maximizes power while properly controlling the false positive rate. We illustrate the utility of our method in large-scale EWAS and TWAS meta-analyses of age and smoking.
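
    A much-simplified illustration of the empirical-null idea: estimate bias and inflation from the bulk of the test statistics, then rescale. The authors' method is Bayesian and more sophisticated than this median/IQR sketch.

```python
import numpy as np
from scipy.stats import norm

def empirical_null_rescale(z):
    """Estimate bias (location) and inflation (scale) of the null component
    from the bulk of the z-statistics, then rescale. A simplified sketch in
    the spirit of, but not identical to, the method described above."""
    bias = np.median(z)
    iqr_normal = norm.ppf(0.75) - norm.ppf(0.25)           # ~1.349 for N(0,1)
    inflation = (np.quantile(z, 0.75) - np.quantile(z, 0.25)) / iqr_normal
    return (z - bias) / inflation

rng = np.random.default_rng(0)
z = rng.normal(loc=0.3, scale=1.4, size=100_000)           # biased, inflated nulls
z[:50] += 6.0                                              # a few true signals
z_adj = empirical_null_rescale(z)
print(round(np.median(z_adj), 3), round(np.std(z_adj), 3)) # ~0 and ~1 after correction
```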

  18. Updated population metadata for United States historical climatology network stations

    USGS Publications Warehouse

    Owen, T.W.; Gallo, K.P.

    2000-01-01

    The United States Historical Climatology Network (HCN) serial temperature dataset comprises 1221 high-quality, long-term climate observing stations. The HCN dataset is available in several versions, one of which includes population-based temperature modifications to adjust urban temperatures for the "heat-island" effect. Unfortunately, the decennial population metadata file is not complete: missing values are present for 17.6% of the 12,210 population values associated with the 1221 individual stations during the 1900-90 interval. Retrospective grid-based populations within a fixed distance of an HCN station were estimated through the use of a gridded population density dataset and historically available U.S. Census county data. The grid-based populations for the HCN stations provide values derived from a consistent methodology, in contrast to the current HCN populations, which can vary as definitions of the area associated with a city change over time. The use of grid-based populations may, at a minimum, be appropriate to augment populations for HCN climate stations that lack any population data, and is recommended when consistent and complete population data are required. The recommended urban temperature adjustments based on the HCN and grid-based methods of estimating station population can be significantly different for individual stations within the HCN dataset.
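
    The grid-based estimate amounts to summing population density over grid cells within a fixed radius of a station. A minimal sketch with a synthetic density surface and an arbitrary radius:

```python
import numpy as np

def station_population(density, cell_km2, station_rc, radius_cells):
    """Sum gridded population within a fixed distance of a station.
    `density` is persons/km^2 on a regular grid; purely illustrative."""
    rows, cols = np.indices(density.shape)
    r0, c0 = station_rc
    within = (rows - r0) ** 2 + (cols - c0) ** 2 <= radius_cells ** 2
    return float((density[within] * cell_km2).sum())

# Synthetic density surface standing in for a gridded population dataset.
rng = np.random.default_rng(0)
density = rng.gamma(shape=0.5, scale=200.0, size=(50, 50))
print(round(station_population(density, cell_km2=1.0,
                               station_rc=(25, 25), radius_cells=10)))
```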

  19. A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis

    DTIC Science & Technology

    2012-01-01

    [Fragmentary report text. Recoverable content: polynomial chaos bases are chosen to match the probability distribution of the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre polynomials for uniformly distributed ones); …parameters and windfields will drive the simulations; uncertainty quantification methodology, polynomial chaos quadrature in combination with data integration, will complete the DDDAS loop.]
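
    As a generic illustration of polynomial chaos quadrature matched to the input distribution, the sketch below uses Gauss-Hermite nodes for a standard normal input to estimate output moments; the model function is an invented stand-in, not the ash-transport simulator.

```python
import numpy as np

# Gauss-Hermite quadrature for a standard normal input (probabilists' Hermite,
# matching the Hermite-for-normal pairing noted above).
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
weights = weights / weights.sum()          # normalize to a probability measure

def model(x):
    # Invented stand-in for a simulation output depending on a normal parameter.
    return np.exp(0.3 * x)

vals = model(nodes)
mean = np.sum(weights * vals)
var = np.sum(weights * (vals - mean) ** 2)
print(mean, var)   # check: E[exp(0.3 Z)] = exp(0.045) ~ 1.046
```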

  1. On Teaching the History of California Spanish to HLL Using Siri: Methodology and Procedures

    ERIC Educational Resources Information Center

    Lamar Prieto, Covadonga

    2016-01-01

    This article reports results from a study in which two groups of college level students were exposed to interactions with Apple's Siri in order to foster dialogue about their dialectal features. In this paper, the methodology and procedural challenges behind one of the activities that the participants completed are studied. These activities had…

  2. Using Indigenous Educational Research to Transform Mainstream Education: A Guide for P-12 School Leaders

    ERIC Educational Resources Information Center

    Harrington, Billie Graham; CHiXapkaid (Pavel, D. Michael)

    2013-01-01

    The principal assertion of this article is that Indigenous research methodologies should be used to develop educational policies and practices for Native students. The history of American educational research is marred by a near complete dismissal of Indigenous knowledge, as Western research methodologies continue to define the landscape of P-12…

  3. Transcending the Quantitative-Qualitative Divide with Mixed Methods Research: A Multidimensional Framework for Understanding Congruence and Completeness in the Study of Values

    ERIC Educational Resources Information Center

    McLafferty, Charles L., Jr.; Slate, John R.; Onwuegbuzie, Anthony J.

    2010-01-01

    Quantitative research dominates published literature in the helping professions. Mixed methods research, which integrates quantitative and qualitative methodologies, has received a lukewarm reception. The authors address the iterative separation that infuses theory, praxis, philosophy, methodology, training, and public perception and propose a…

  4. Development of a methodology for strategic environmental assessment: application to the assessment of golf course installation policy in Taiwan.

    PubMed

    Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min

    2009-01-01

    Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase in pursuit of sustainable development. However, Taiwan's system contained only some sketchy steps focusing on policy assessment. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on theoretical considerations from systems thinking and on the regulatory requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, the Delphi technique, and systems analysis to develop the sustainable assessment framework. An SEA management system was built on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of the golf course installation policy in 2001 as a case study, the first SEA in Taiwan. Most of the 82 golf courses existing in 2001 were installed on slope lands and caused serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remediated lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better, given the environmental characteristics and management regulations of Taiwan. The SEA procedure in the planning phase for this policy was completed, but the implementation phase was not begun because the related legislative procedure could not be arranged owing to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001, and 27 future courses could then be installed on marginal lands. The assessment value of this policy using data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can effectively and efficiently assist the related authorities with SEA.
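
    One step of the assessment framework, the analytic hierarchy process, derives criterion weights from a pairwise comparison matrix via its principal eigenvector. A minimal sketch with an invented 3x3 comparison of ecological, social, and economic criteria:

```python
import numpy as np

# Invented pairwise comparison matrix (ecology, society, economy): ecology
# judged 2x as important as society and 3x as important as economy, etc.
A = np.array([[1.0, 2.0, 3.0],
              [1/2, 1.0, 2.0],
              [1/3, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # normalized priority weights

# Consistency check: consistency ratio CR = CI / RI (random index RI = 0.58 for 3x3).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(weights, 3), "CR:", round(ci / 0.58, 3))
```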

  5. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  6. Methodological support for the further abstraction of and philosophical examination of empirical findings in the context of caring science.

    PubMed

    Lindberg, Elisabeth; Österberg, Sofia A; Hörberg, Ulrica

    2016-01-01

    Phenomena in caring science are often complex and laden with meanings. Empirical research with the aim of capturing lived experiences is one way of revealing the complexity. Sometimes, however, results from empirical research need to be further discussed. One way is to further abstract the result and/or philosophically examine it. This has previously been performed and presented in scientific journals and doctoral theses, contributing to a greater understanding of phenomena in caring science. Although the intentions in many of these publications are laudable, the lack of methodological descriptions as well as a theoretical and systematic foundation can contribute to an ambiguity concerning how the results have emerged during the analysis. The aim of this paper is to describe the methodological support for the further abstraction of and/or philosophical examination of empirical findings. When trying to systematize the support procedures, we have used a reflective lifeworld research (RLR) approach. Based on the assumptions in RLR, this article will present methodological support for a theoretical examination that can include two stages. In the first stage, data from several (two or more) empirical results on an essential level are synthesized into a general structure. Sometimes the analysis ends with the general structure, but sometimes there is a need to proceed further. The second stage can then be a philosophical examination, in which the general structure is discussed in relation to a philosophical text, theory, or concept. It is important that the theories are brought in as the final stage after the completion of the analysis. Core dimensions of the described methodological support are, in accordance with RLR, openness, bridling, and reflection. The methodological support cannot be understood as fixed stages, but rather as a guiding light in the search for further meanings.

  7. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
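
    The nested (double-loop) formulation that the unilevel and decoupled methods aim to improve on can be sketched in a few lines: an outer design search with an inner Monte Carlo reliability estimate. The limit state and numbers below are invented for illustration; the expense of the inner loop at every design candidate is exactly what motivates the reformulations.

```python
import numpy as np

# Toy nested RBDO: minimize cost(d) subject to P[g(d, X) < 0] <= p_target,
# with the inner reliability loop done by Monte Carlo.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 0.05, size=100_000)    # random deviation (e.g. manufacturing)
p_target = 0.01

def cost(d):
    return d                                # a bigger design is safer but costlier

def failure_prob(d):
    g = (d + X) - 1.0                       # fails when capacity d + X < demand 1.0
    return float(np.mean(g < 0.0))          # inner loop: the expensive part

candidates = np.linspace(1.0, 1.3, 301)     # outer loop: grid search over designs
feasible = [d for d in candidates if failure_prob(d) <= p_target]
best = min(feasible, key=cost)
print(f"optimal design: {best:.3f}, failure probability: {failure_prob(best):.4f}")
```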

  8. Assessing primary care data quality.

    PubMed

    Lim, Yvonne Mei Fong; Yusof, Maryati; Sivasampu, Sheamini

    2018-04-16

    Purpose The purpose of this paper is to assess National Medical Care Survey data quality. Design/methodology/approach Data completeness and representativeness were computed for all observations while other data quality measures were assessed using a 10 per cent sample from the National Medical Care Survey database; i.e., 12,569 primary care records from 189 public and private practices were included in the analysis. Findings Data field completion ranged from 69 to 100 per cent. Error rates for data transfer from paper to web-based application varied between 0.5 and 6.1 per cent. Error rates arising from diagnosis and clinical process coding were higher than medication coding. Data fields that involved free text entry were more prone to errors than those involving selection from menus. The authors found that completeness, accuracy, coding reliability and representativeness were generally good, while data timeliness needs to be improved. Research limitations/implications Only data entered into a web-based application were examined. Data omissions and errors in the original questionnaires were not covered. Practical implications Results from this study provided informative and practicable approaches to improve primary health care data completeness and accuracy especially in developing nations where resources are limited. Originality/value Primary care data quality studies in developing nations are limited. Understanding errors and missing data enables researchers and health service administrators to prevent quality-related problems in primary care data.
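
    Two of the measures reported above, field completeness and paper-to-web transfer error rate, reduce to simple counting. A minimal sketch over hypothetical records (not the survey's data):

```python
# Hypothetical primary care records; empty string / None mark missing values.
records = [
    {"diagnosis": "J06.9", "medication": "paracetamol", "age": 34},
    {"diagnosis": "",      "medication": "amoxicillin", "age": None},
    {"diagnosis": "K21.0", "medication": "",            "age": 58},
]

def completeness(records, field):
    filled = sum(1 for r in records if r.get(field) not in ("", None))
    return 100.0 * filled / len(records)

for field in ("diagnosis", "medication", "age"):
    print(f"{field}: {completeness(records, field):.0f}% complete")

# Transfer error rate: fraction of web-entered values disagreeing with the paper source.
paper = ["J06.9", "J06.9", "K21.0"]
web = ["J06.9", "J069", "K21.0"]
errors = sum(p != w for p, w in zip(paper, web))
print(f"transfer error rate: {100.0 * errors / len(paper):.1f}%")
```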

  9. Improved methodology to assess modification and completion of landfill gas management in the aftercare period

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Jeremy W.F., E-mail: jmorris@geosyntec.com; Crest, Marion, E-mail: marion.crest@suez-env.com; Barlaz, Morton A., E-mail: barlaz@ncsu.edu

    Highlights: • Performance-based evaluation of landfill gas control system. • Analytical framework to evaluate transition from active to passive gas control. • Focus on cover oxidation as an alternative means of passive gas control. • Integrates research on long-term landfill behavior with practical guidance. - Abstract: Municipal solid waste landfills represent the dominant option for waste disposal in many parts of the world. While some countries have greatly reduced their reliance on landfills, there remain thousands of landfills that require aftercare. The development of cost-effective strategies for landfill aftercare is in society's interest to protect human health and the environment and to prevent the emergence of landfills with exhausted aftercare funding. The Evaluation of Post-Closure Care (EPCC) methodology is a performance-based approach in which landfill performance is assessed in four modules: leachate, gas, groundwater, and final cover. In the methodology, the objective is to evaluate landfill performance to determine when aftercare monitoring and maintenance can be reduced or possibly eliminated. This study presents an improved gas module for the methodology. While the original version of the module focused narrowly on regulatory requirements for control of methane migration, the improved gas module also considers best available control technology for landfill gas in terms of greenhouse gas emissions, air quality, and emissions of odoriferous compounds. The improved module emphasizes the reduction or elimination of fugitive methane by considering the methane oxidation capacity of the cover system. The module also allows for the installation of biologically active covers or other features designed to enhance methane oxidation. A methane emissions model, CALMIM, was used to assist with an assessment of the methane oxidation capacity of landfill covers.

  10. Philosophy of clinical psychopharmacology.

    PubMed

    Aragona, Massimiliano

    2013-03-01

    The renewal of the philosophical debate in psychiatry is one exciting development of recent years. However, its use in psychopharmacology may be problematic, ranging from self-confinement into the realm of values (which leaves the evidence-based domain unchallenged) to complete rejection of scientific evidence. In this paper philosophy is conceived as a conceptual audit of clinical psychopharmacology. Its function is to criticise the epistemological and methodological problems of current neopositivist, ingenuously realist, and evidence-servant psychiatry from within the scientific stance, with the aim of aiding psychopharmacologists in practicing a more self-aware, critical, and possibly more useful clinical practice. Three examples are discussed to suggest that psychopharmacological practice needs conceptual clarification. At the diagnostic level, it is shown that the crisis of the current diagnostic system and the problem of comorbidity strongly influence psychopharmacological results, and that new conceptualizations more responsive to psychopharmacological requirements are needed. Heterogeneity of research samples, lack of specificity of psychotropic drugs, difficult generalizability of results, and the need for a phenomenological study of drug-induced psychopathological changes are discussed herein. At the methodological level, the merits and limits of evidence-based practice are considered, arguing that clinicians should know the best available evidence but that guidelines should not be constrictive (owing to several methodological biases and rhetorical tricks of which the clinician should be aware, sometimes responsive to extra-scientific, economic requests). At the epistemological level, it is shown that the clinical stance is shaped by implicit philosophical beliefs about the mind/body problem (reductionism, dualism, interactionism, pragmatism), and that philosophy can help physicians become more aware of their beliefs in order to choose the most useful view and to practice coherently. In conclusion, psychopharmacologists already use methodological audit (e.g. statistical audit); similarly, conceptual clarification is needed in both research planning/evaluation and everyday psychopharmacological practice.

  11. The Children’s Attention-deficit Hyperactivity Disorder (ADHD) Telemental Health Treatment Study: Methodology for Conducting a Trial of Telemental Health in Multiple Underserved Communities

    PubMed Central

    Stoep, Ann Vander; Myers, Kathleen

    2013-01-01

    Background Children who live in non-metropolitan communities are underserved by evidence-based mental health care and underrepresented in clinical trials of mental health services. Telemental Health (TMH), the use of videoteleconferencing (VTC) to provide care that is usually delivered in person, shows promise for helping to rectify these service disparities. Purpose The Children's ADHD Telemental Health Treatment Study (CATTS) is a randomized controlled trial designed to test the effectiveness of TMH in providing treatment to children diagnosed with attention-deficit hyperactivity disorder (ADHD) who are living in underserved communities. In this paper we describe the methodologies we developed for the trial and lessons learned. Methods Children 5.5-12 years of age with ADHD were referred to CATTS by their primary care physicians (PCPs). The test intervention group (Group A) received six telepsychiatry sessions followed by in-person caregiver behavioral training delivered by a local therapist who was trained and supervised remotely. A secure website was used to support decision-making by the telepsychiatrists, to facilitate real-time collaboration between the telepsychiatrists and community therapists, and to communicate with the PCPs. The control group (Group B) received a single telepsychiatry consultation followed by treatment with their PCPs, who implemented the telepsychiatrists' recommendations at their discretion. Caregivers completed five sets of questionnaires about children's symptoms and functioning and their own levels of distress. Older children (aged 10-12 years) completed questionnaires about their symptoms and functioning. Teachers completed ADHD rating scales. Questionnaires were completed online through a secure portal from personal computers. Results Eighty-eight PCPs in seven communities referred the 223 children who participated in the trial. Attrition was low (3%). Children in Group A completed an average of 5.3 of 6 scheduled sessions; 96% of children in Group B completed their telepsychiatry consultation. Parents in both groups completed an average of 4.8 of 5 assessments. Telepsychiatrists and therapists showed high adherence to treatment protocols. Lessons Learned TMH proved to be a viable means of providing evidence-based pharmacological services to children and of training local therapists in evidence-based caregiver behavioral management. Recruitment was enhanced by offering the control group a telepsychiatry consultation. To meet recruitment targets across multiple dispersed sites, we developed community-specific strategies. A dedicated scheduler was a critical staff role to coordinate the multiple sites, sessions, and clinicians. Trial implementation was easier with sites that shared an electronic medical record system with our research hub. Conclusions The CATTS study used methods and procedures to optimize inclusion of children living in multiple dispersed and underserved areas. These experiences should advance the development of technologies needed to recruit underserved populations into research projects with the goal of reducing disparities in access to quality mental health care. PMID:23897950

  12. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
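
    The full methodology verifies safety over hybrid (discrete plus continuous) dynamics; the discrete layer alone can be illustrated with a toy reachability check on the yellow-light example, with invented states and transitions:

```python
from collections import deque

# Toy discrete abstraction of the yellow-light example: states pair a driver
# action with the signal phase. States and transitions are invented for
# illustration; the paper's analysis also accounts for continuous dynamics.
transitions = {
    ("approach", "green"): [("cross", "green")],
    ("approach", "yellow"): [("brake", "yellow"), ("cross", "yellow")],
    ("cross", "yellow"): [("in_intersection", "red")],   # unsafe outcome
    ("brake", "yellow"): [("stopped", "red")],
}
unsafe = {("in_intersection", "red")}

def reachable(start):
    """Breadth-first search over the discrete transition system."""
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        for nxt in transitions.get(s, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Non-empty intersection means the interface must give the user enough
# information to avoid the unsafe branch.
print(unsafe & reachable(("approach", "yellow")))
```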

  13. How to survive (and enjoy) doing a thesis: the experiences of a methodological working group.

    PubMed

    Giddings, Lynne S; Wood, Pamela J

    2006-03-01

    'Doing a thesis', whether for Masters or PhD, can be a lonely and tortuous journey. This article offers a complementary process to the traditional apprenticeship supervision model. It describes the experiences of students who during their thesis research met monthly in a grounded theory working group. They reflected on their experiences during a focus group interview. After describing the background to how the group started in 1999 and exploring some of the ideas in the literature concerning the thesis experience, the article presents the interview. To focus the presentation, specific questions are used as category headings. Overall, the participants found attending the group was a "life-line" that gave them "hope" and was complementary to the supervision process. Through the support of peers, guidance from those ahead in the process, and consultancy with teachers and visiting methodological scholars, these students not only successfully completed their theses, but reported that they had some enjoyment along the way. This is the fifteenth in a series of articles which have been based on interviews with nursing and midwifery researchers, and were primarily designed to offer the beginning researcher a first-hand account of the experience of using particular methodologies.

  14. ADM1-based methodology for the characterisation of the influent sludge in anaerobic reactors.

    PubMed

    Huete, E; de Gracia, M; Ayesa, E; Garcia-Heras, J L

    2006-01-01

    This paper presents a systematic methodology to characterise the influent sludge in terms of the ADM1 components from the experimental measurements traditionally used in wastewater engineering. For this purpose, a complete characterisation of the model components in their elemental mass fractions and charge has been used, making a rigorous mass balance for all the process transformations and enabling the future connection with other unit-process models. It also makes possible the application of mathematical algorithms for the optimal characterisation of several components poorly defined in the ADM1 report. Additionally, decay and disintegration have been necessarily uncoupled so that the decay proceeds directly to hydrolysis instead of producing intermediate composites. The proposed methodology has been applied to the particular experimental work of a pilot-scale CSTR treating real sewage sludge, a mixture of primary and secondary sludge. The results obtained have shown a good characterisation of the influent reflected in good model predictions. However, its limitations for an appropriate prediction of alkalinity and carbon percentages in biogas suggest the convenience of including the elemental characterisation of the process in terms of carbon in the analytical program.
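    As a rough illustration of the characterisation step, the sketch below recovers non-negative component concentrations from aggregate measurements by inverting a small linear map, on the assumption that each traditional measurement is a known linear combination of the component fractions. All coefficients and measured values are invented placeholders, not the paper's calibrated elemental fractions or the ADM1 report's values.

        import numpy as np
        from scipy.optimize import nnls

        # Rows: measured aggregates (COD, TOC, TKN); columns: influent components
        # (carbohydrates, proteins, lipids). Coefficients are illustrative only.
        A = np.array([
            [1.19, 1.42, 2.90],   # g COD per g of component
            [0.40, 0.53, 0.76],   # g C per g of component
            [0.00, 0.16, 0.00],   # g N per g of component
        ])
        b = np.array([52.5, 17.1, 1.6])   # hypothetical measurements, g/m3

        conc, residual = nnls(A, b)       # non-negative least squares
        print(dict(zip(["X_ch", "X_pr", "X_li"], conc.round(2))), residual)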

  15. Intelligent systems engineering methodology

    NASA Technical Reports Server (NTRS)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  16. Functional-diversity indices can be driven by methodological choices and species richness.

    PubMed

    Poos, Mark S; Walker, Steven C; Jackson, Donald A

    2009-02-01

    Functional diversity is an important concept in community ecology because it captures information on functional traits that is absent from measures of species diversity. One popular method of measuring functional diversity is the dendrogram-based method, FD. Calculating FD requires a variety of methodological choices, and it has been debated whether biological conclusions are sensitive to such choices. We studied the probability that conclusions regarding FD were sensitive to these choices, and whether patterns in sensitivity were related to alpha and beta components of species richness. We developed a randomization procedure that iteratively assigned species to two assemblages, calculated FD under each methodological choice, and estimated the probability that the identity of the community with higher FD varied across methods. We found evidence of sensitivity in all five communities we examined, with the probability of sensitivity ranging from 0 (no sensitivity) to 0.976 (almost completely sensitive). Variation in these probabilities was driven by differences in alpha diversity between assemblages and not by beta diversity. Importantly, FD was most sensitive when it was most useful (i.e., when differences in alpha diversity were low). We demonstrate that trends in functional-diversity analyses can be largely driven by methodological choices or species richness, rather than by functional trait information alone.
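    A minimal sketch of this kind of sensitivity analysis is given below: synthetic species traits are repeatedly split into two assemblages, a dendrogram-based FD (total branch length) is computed for each assemblage under several distance-metric/linkage choices, and the fraction of splits on which the choices disagree about which assemblage has higher FD estimates the probability of sensitivity. It simplifies the published procedure, notably by rebuilding the dendrogram per assemblage rather than pruning a community-level tree, and all trait data are synthetic.

        import numpy as np
        from scipy.cluster.hierarchy import linkage
        from scipy.spatial.distance import pdist

        def total_branch_length(Z):
            """Sum of branch lengths of a SciPy linkage dendrogram (leaves at height 0)."""
            n = Z.shape[0] + 1
            height = {i: 0.0 for i in range(n)}
            total = 0.0
            for k, (a, b, h, _) in enumerate(Z):
                total += (h - height[int(a)]) + (h - height[int(b)])
                height[n + k] = h
            return total

        rng = np.random.default_rng(0)
        traits = rng.normal(size=(20, 4))        # 20 species x 4 synthetic traits

        choices = [("euclidean", "average"), ("euclidean", "single"),
                   ("cityblock", "complete"), ("cityblock", "average")]

        def fd(subset, metric, method):
            return total_branch_length(linkage(pdist(traits[subset], metric=metric), method=method))

        trials, disagreements = 200, 0
        for _ in range(trials):
            perm = rng.permutation(len(traits))
            a, b = perm[:10], perm[10:]
            verdicts = {fd(a, m, l) > fd(b, m, l) for m, l in choices}
            disagreements += len(verdicts) > 1   # choices disagree on which assemblage is higher
        print("estimated probability of sensitivity:", disagreements / trials)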

  17. Encouraging the pursuit of advanced degrees in science and engineering: Top-down and bottom-up methodologies

    NASA Technical Reports Server (NTRS)

    Maddox, Anthony B.; Smith-Maddox, Renee P.; Penick, Benson E.

    1989-01-01

    The MassPEP/NASA Graduate Research Development Program (GRDP) whose objective is to encourage Black Americans, Mexican Americans, American Indians, Puerto Ricans, and Pacific Islanders to pursue graduate degrees in science and engineering is described. The GRDP employs a top-down or goal driven methodology through five modules which focus on research, graduate school climate, technical writing, standardized examinations, and electronic networking. These modules are designed to develop and reinforce some of the skills necessary to seriously consider the goal of completing a graduate education. The GRDP is a community-based program which seeks to recruit twenty participants from a pool of Boston-area undergraduates enrolled in engineering and science curriculums and recent graduates with engineering and science degrees. The program emphasizes that with sufficient information, its participants can overcome most of the barriers perceived as preventing them from obtaining graduate science and engineering degrees. Experience has shown that the top-down modules may be complemented by a more bottom-up or event-driven methodology. This approach considers events in the academic and professional experiences of participants in order to develop the personal and leadership skills necessary for graduate school and similar endeavors.

  18. A Semi-Automated Point Cloud Processing Methodology for 3D Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. Conventional measurement techniques, however, require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation has become an active area of research; however, fully automated cultural heritage documentation remains an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We demonstrate the contribution of our methodology, which we implemented in an open-source software environment, on the example project of a 16th-century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  19. TPS In-Flight Health Monitoring Project Progress Report

    NASA Technical Reports Server (NTRS)

    Kostyk, Chris; Richards, Lance; Hudston, Larry; Prosser, William

    2007-01-01

    Progress in the development of new thermal protection systems (TPS) is reported. New approaches embed lightweight, sensitive, fiber optic strain and temperature sensors within the TPS. Goals of the program are to develop and demonstrate a prototype TPS health monitoring system, develop a thermal-based damage detection algorithm, characterize the limits of sensor/system performance, and develop a methodology transferable to new designs of TPS health monitoring systems. Tasks completed during the project helped establish confidence in the understanding of both the test setup and the model, and validated system/sensor performance in a simple TPS structure. Other progress included completion of initial system testing, commencement of the algorithm development effort, generation of a database of damaged thermal response characteristics, initial development of a test plan for integration testing of proven FBG sensors in a simple TPS structure, and development of partnerships to apply the technology.

  20. Improving Project Management with Simulation and Completion Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2004-01-01

    Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, to which most, if not all, projects are inherently subject. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation-based methodology for conducting quantitative project completion-time risk analysis, called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e., the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the ongoing NASA project to assemble the International Space Station. Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. NASA project stakeholders participated in determining and managing completion distribution functions produced from PAST. The first result was that project stakeholders improved project completion risk awareness. Secondly, using PAST, mitigation options were analyzed to improve project completion performance and reduce total project cost.
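    The core of such a simulation can be illustrated in a few lines: sample a duration-growth factor for each activity, propagate finish times through the precedence network, and accumulate the completion times into an empirical distribution. The four-activity network and lognormal growth distribution below are invented for illustration; PAST itself also models discrete risk events and resource constraints.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical 4-activity network: planned durations (days) and predecessors
        activities = {
            "design":    {"plan": 30, "pred": []},
            "fabricate": {"plan": 60, "pred": ["design"]},
            "software":  {"plan": 45, "pred": ["design"]},
            "integrate": {"plan": 20, "pred": ["fabricate", "software"]},
        }

        def one_run():
            finish = {}
            for name in ["design", "fabricate", "software", "integrate"]:  # topological order
                info = activities[name]
                start = max((finish[p] for p in info["pred"]), default=0.0)
                growth = rng.lognormal(mean=0.0, sigma=0.25)  # assumed duration-growth distribution
                finish[name] = start + info["plan"] * growth
            return finish["integrate"]

        completions = np.array([one_run() for _ in range(10_000)])
        print("P(finish within 160 days) =", (completions <= 160).mean())
        print("80th-percentile completion:", np.percentile(completions, 80).round(1), "days")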

  1. Building the Stock of College-Educated Labor

    ERIC Educational Resources Information Center

    Dynarski, Susan

    2008-01-01

    Half of college students drop out without completing a degree. This paper establishes a causal link between college costs and degree completion. I use quasi-experimental methodology to analyze two state scholarship programs. The programs increase the share of the exposed population with a college degree by three percentage points, with stronger…

  2. Lessons Learned for Successful Dissertation Completion from Social Work Doctoral Graduates

    ERIC Educational Resources Information Center

    Davis, Ashley; Wladkowski, Stephanie P.; Mirick, Rebecca G.

    2017-01-01

    A dissertation demonstrates a doctoral candidate's knowledge of a content area, mastery of research methodology, and readiness for future scholarship. Doctoral candidates, social work programs, and the profession as a whole are invested in ensuring that candidates successfully complete dissertations and enter academic, research, and leadership…

  3. What Factors Influence Vietnamese Students' Choice of University?

    ERIC Educational Resources Information Center

    Dao, Mai Thi Ngoc; Thorpe, Anthony

    2015-01-01

    Purpose: The purpose of this paper is to report the factors that influence Vietnamese students' choice of university in a little researched context where the effects of globalization and education reform are changing higher education. Design/methodology/approach: A quantitative survey was completed by 1,124 current or recently completed university…

  4. Bayesian methodology incorporating expert judgment for ranking countermeasure effectiveness under uncertainty: example applied to at grade railroad crossings in Korea.

    PubMed

    Washington, Simon; Oh, Jutaek

    2006-03-01

    Transportation professionals are sometimes required to make difficult transportation safety investment decisions in the face of uncertainty. In particular, an engineer may be expected to choose among an array of technologies and/or countermeasures to remediate perceived safety problems when: (1) little information is known about the countermeasure effects on safety; (2) information is known but comes from different regions, states, or countries where a direct generalization may not be appropriate; (3) the technologies and/or countermeasures are relatively untested; or (4) costs prohibit the full and careful testing of each of the candidate countermeasures via before-after studies. An informed and well-considered decision based on the best possible engineering knowledge and information is imperative due to the potential impact on the numbers of human injuries and deaths that may result from these investments. This paper describes the formalization and application of a methodology to evaluate the safety benefit of countermeasures in the face of uncertainty. To illustrate the methodology, 18 countermeasures for improving the safety of at-grade railroad crossings (AGRXs) in the Republic of Korea are considered. Akin to "stated preference" methods in travel survey research, the methodology applies random selection and laws of large numbers to derive accident modification factor (AMF) densities from expert opinions. In a full Bayesian analysis framework, the collective opinions in the form of AMF densities (data likelihood) are combined with prior knowledge (AMF density priors) for the 18 countermeasures to obtain 'best' estimates of AMFs (AMF posterior credible intervals). The countermeasures are then compared and recommended based on the largest safety returns with minimum risk (uncertainty). To the authors' knowledge the complete methodology is new and has not previously been applied or reported in the literature. The results demonstrate that the methodology is able to discern anticipated safety benefit differences across candidate countermeasures. For the 18 AGRX countermeasures considered in this analysis, the top three performing countermeasures for reducing crashes were found to be in-vehicle warning systems, obstacle detection systems, and constant warning time systems.
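    The numerical core of such an analysis can be sketched as follows: each expert's elicited opinion is encoded as a density over the accident modification factor (AMF), the product of those densities acts as the data likelihood, and multiplying by the prior and renormalising on a grid yields the posterior credible interval used to rank countermeasures. The lognormal shapes and parameters below are invented, not the paper's elicited values, and the pooling rule is a simplification of the published procedure.

        import numpy as np
        from scipy import stats

        theta = np.linspace(0.3, 1.5, 1000)                  # candidate AMF values
        prior = stats.lognorm(s=0.3, scale=1.0).pdf(theta)   # diffuse prior centred on "no effect"

        # Three hypothetical expert opinions, each encoded as a density over the AMF
        experts = [stats.lognorm(s=0.15, scale=0.80),
                   stats.lognorm(s=0.20, scale=0.70),
                   stats.lognorm(s=0.25, scale=0.85)]
        likelihood = np.prod([e.pdf(theta) for e in experts], axis=0)

        posterior = prior * likelihood
        posterior /= np.trapz(posterior, theta)              # normalise on the grid

        mean = np.trapz(theta * posterior, theta)
        cdf = np.cumsum(posterior) * (theta[1] - theta[0])
        lo, hi = theta[np.searchsorted(cdf, [0.025, 0.975])]
        print(f"posterior mean AMF = {mean:.2f}, 95% credible interval ({lo:.2f}, {hi:.2f})")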

  5. Moon and Mars Analog Mission Activities for Mauna Kea 2012

    NASA Technical Reports Server (NTRS)

    Graham, Lee D.; Morris, Richard V.; Graff, Trevor G.; Yingst, R. Aileen; tenKate, I. L.; Glavin, Daniel P.; Hedlund, Magnus; Malespin, Charles A.; Mumm, Erik

    2012-01-01

    Rover-based 2012 Moon and Mars Analog Mission Activities (MMAMA) scientific investigations were recently completed at Mauna Kea, Hawaii. Scientific investigations, scientific input, and science operations constraints were tested in the context of an existing project and protocols for the field activities designed to help NASA achieve the Vision for Space Exploration. Initial science operations were planned based on a model similar to the operations control of the Mars Exploration Rovers (MER). However, the operations process evolved as the analog mission progressed. We report here on the preliminary sensor data results, an applicable methodology for developing optimal science input based on productive engineering and science trade discussions, and the science operations approach for an investigation into the valley on the upper slopes of Mauna Kea identified as "Apollo Valley".

  6. Focus: a robust workflow for one-dimensional NMR spectral analysis.

    PubMed

    Alonso, Arnald; Rodríguez, Miguel A; Vinaixa, Maria; Tortosa, Raül; Correig, Xavier; Julià, Antonio; Marsal, Sara

    2014-01-21

    One-dimensional ¹H NMR represents one of the most commonly used analytical techniques in metabolomic studies. The increase in the number of samples analyzed, as well as technical improvements in instrumentation and spectral acquisition, demand increasingly accurate and efficient high-throughput data processing workflows. We present FOCUS, an integrated and innovative methodology that provides a complete data analysis workflow for one-dimensional NMR-based metabolomics. This tool allows users to easily obtain an NMR peak feature matrix ready for chemometric analysis, as well as metabolite identification scores for each peak that greatly simplify the biological interpretation of the results. Algorithm development has focused on solving the critical difficulties that appear at each data processing step and that can dramatically affect the quality of the results. Along with method integration, simplicity has been one of the main objectives in FOCUS development: very little user input is required to perform accurate peak alignment, peak picking, and metabolite identification. The new spectral alignment algorithm, RUNAS, allows peak alignment with no need for a reference spectrum, and therefore reduces the bias introduced by other alignment approaches. Spectral alignment has been tested against previous methodologies, obtaining substantial improvements in the case of moderately or highly unaligned spectra. Metabolite identification has also been significantly improved by matching positional and correlation peak patterns against a reference metabolite panel. Furthermore, the complete workflow has been tested using NMR data sets from 60 human urine samples and 120 aqueous liver extracts, reaching a successful identification of 42 metabolites from the two data sets. The open-source software implementation of this methodology is available at http://www.urr.cat/FOCUS.
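    To illustrate the alignment problem FOCUS addresses, the sketch below recovers an artificial shift between two synthetic 1D spectra by maximising their correlation. This is a deliberately crude, reference-based stand-in: the point of the RUNAS algorithm described in the paper is precisely to avoid the bias of aligning against a chosen reference spectrum.

        import numpy as np

        def best_shift(reference, spectrum, max_shift=50):
            """Return the integer shift (in points) that maximises the correlation
            between two 1D spectra. A crude stand-in for reference-free alignment."""
            shifts = range(-max_shift, max_shift + 1)
            scores = [np.dot(reference, np.roll(spectrum, s)) for s in shifts]
            return list(shifts)[int(np.argmax(scores))]

        rng = np.random.default_rng(1)
        axis = np.linspace(0, 10, 2000)
        peak = lambda c: np.exp(-((axis - c) ** 2) / 0.001)   # narrow synthetic peaks
        ref = peak(4.0) + peak(6.5)
        shifted = np.roll(ref, 17) + rng.normal(0, 0.01, axis.size)

        print(best_shift(ref, shifted))   # prints -17, undoing the artificial shift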

  7. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    NASA Astrophysics Data System (ADS)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed to identify a good standard methodology for surveying very narrow spaces during 3D investigation of Cultural Heritage. This is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural and archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are required. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant field-of-view advantage over rectilinear lenses. This advantage alone can be crucial in reducing the total number of photos and, as a consequence, in obtaining manageable data, simplifying the survey phase and significantly reducing the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, namely the lack of rules for designing the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano, with a total height of 25 meters, characterized by a narrow walkable space about 70 centimetres wide.
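    The paper's general GSD formulation is not reproduced in the abstract, but the flavour of the problem can be shown with textbook projection geometry (an assumption of this sketch, not the authors' exact formula): for a rectilinear lens the footprint of one pixel on a flat wall at perpendicular distance D stays near D*p/f, whereas for an equidistant fisheye (r = f*theta) it grows roughly as 1/cos^2(theta) off axis, which is why survey design rules change with the projection.

        import math

        def gsd_rectilinear(distance_m, focal_mm, pixel_pitch_um):
            """Near-axis GSD for a rectilinear lens: footprint = D * p / f."""
            return distance_m * (pixel_pitch_um * 1e-6) / (focal_mm * 1e-3)

        def gsd_equidistant_fisheye(distance_m, focal_mm, pixel_pitch_um, theta_rad):
            """Equidistant projection r = f*theta: one pixel subtends p/f rad.
            On a flat wall, the ray path lengthens by 1/cos(theta) and hits the
            wall obliquely (another 1/cos(theta)), hence the cos^2 factor."""
            angular = (pixel_pitch_um * 1e-6) / (focal_mm * 1e-3)
            return distance_m * angular / math.cos(theta_rad) ** 2

        # Assumed example: ~5.9 um pixel pitch, 8 mm fisheye, wall 0.7 m away
        print(round(gsd_rectilinear(0.7, 8, 5.9) * 1000, 2), "mm on axis")
        print(round(gsd_equidistant_fisheye(0.7, 8, 5.9, math.radians(60)) * 1000, 2), "mm at 60 deg")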

  8. Evaluating Cross-National Metrics of Tertiary Graduation Rates for OECD Countries: A Case for Increasing Methodological Congruence and Data Comparability

    ERIC Educational Resources Information Center

    Heuser, Brian L.; Drake, Timothy A.; Owens, Taya L.

    2013-01-01

    By examining the different methods and processes by which national data gathering agencies compile and submit their findings to the Organization for Economic Cooperation and Development (OECD), the authors (1) assess the methodological challenges of accurately reporting tertiary completion and graduation rates cross-nationally; (2) to examine the…

  9. The Brazilian National Curriculum for Foreign Languages Revisited through a Multiculturalism and Peace Studies Approach

    ERIC Educational Resources Information Center

    Costa, Rejane Pinto

    2011-01-01

    This study emerged from broader research completed during my Masters course. Theory and methodology were guided by critical multiculturalism as seen in McLaren (1997, 2000). In my doctoral thesis, this concept was deepened by and linked to the peace studies of Galtung (1990, 2005, 2006), to empower multicultural peace…

  10. Using physics-based pose predictions and free energy perturbation calculations to predict binding poses and relative binding affinities for FXR ligands in the D3R Grand Challenge 2

    NASA Astrophysics Data System (ADS)

    Athanasiou, Christina; Vasilakaki, Sofia; Dellis, Dimitris; Cournia, Zoe

    2018-01-01

    Computer-aided drug design has become an integral part of drug discovery and development in the pharmaceutical and biotechnology industry, and is nowadays extensively used in the lead identification and lead optimization phases. The drug design data resource (D3R) organizes challenges against blinded experimental data to prospectively test computational methodologies, as an opportunity for improved methods and algorithms to emerge. We participated in Grand Challenge 2 to predict the crystallographic poses of 36 Farnesoid X Receptor (FXR)-bound ligands and the relative binding affinities for two designated subsets of 18 and 15 FXR-bound ligands. Here, we present our methodology for pose and affinity predictions and its evaluation after the release of the experimental data. For predicting the crystallographic poses, we used docking and physics-based pose prediction methods guided by the binding poses of native ligands. For FXR ligands with known chemotypes in the PDB, we accurately predicted their binding modes, while for those with unknown chemotypes the predictions were more challenging. Our group ranked 1st (based on the median RMSD) out of the 46 groups that submitted complete entries for the binding pose prediction challenge. For the relative binding affinity prediction challenge, we performed free energy perturbation (FEP) calculations coupled with molecular dynamics (MD) simulations. FEP/MD calculations displayed a high success rate in identifying compounds with better or worse binding affinity than the reference (parent) compound. Our studies suggest that when ligands with chemical precedent are available in the literature, binding pose predictions using docking and physics-based methods are reliable; however, predictions are challenging for ligands with completely unknown chemotypes. We also show that FEP/MD calculations hold predictive value and can nowadays be used in high-throughput mode in a lead optimization project, provided that crystal structures of sufficiently high quality are available.
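    The estimator at the heart of FEP can be stated in two lines. For a single perturbation window, the Zwanzig relation gives dF = -kT ln <exp(-dU/kT)>_0, averaged over configurations sampled in the reference state. The sketch below applies it to synthetic energy differences; a production calculation of the kind described here would stage the transformation over many lambda windows and draw dU from MD snapshots.

        import numpy as np

        kT = 0.593  # kcal/mol at ~298 K

        def fep_zwanzig(delta_U):
            """Zwanzig free-energy perturbation for one window:
            dF = -kT * ln < exp(-dU / kT) >, dU sampled in the reference state."""
            return -kT * np.log(np.mean(np.exp(-np.asarray(delta_U) / kT)))

        rng = np.random.default_rng(7)
        # Synthetic stand-in for dU = U_modified - U_reference on MD snapshots
        delta_U = rng.normal(loc=1.2, scale=0.8, size=5000)
        print(f"dF ~ {fep_zwanzig(delta_U):.2f} kcal/mol")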

  11. Development and Attestation of Gamma-Ray Measurement Methodologies for use by Rostekhnadzor Inspectors in the Russian Federation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Sanders

    2006-09-01

    Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper will discuss the development and revision of these methodologies, the metrological characteristics of the final methodologies, as well as the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.

  12. A database and probabilistic assessment methodology for carbon dioxide enhanced oil recovery and associated carbon dioxide retention in the United States

    USGS Publications Warehouse

    Warwick, Peter D.; Verma, Mahendra K.; Attanasi, Emil; Olea, Ricardo A.; Blondes, Madalyn S.; Freeman, Philip; Brennan, Sean T.; Merrill, Matthew; Jahediesfanjani, Hossein; Roueche, Jacqueline; Lohr, Celeste D.

    2017-01-01

    The U.S. Geological Survey (USGS) has developed an assessment methodology for estimating the potential incremental technically recoverable oil resources resulting from carbon dioxide-enhanced oil recovery (CO2-EOR) in reservoirs with appropriate depth, pressure, and oil composition. The methodology also includes a procedure for estimating the CO2 that remains in the reservoir after the CO2-EOR process is complete. The methodology relies on a reservoir-level database that incorporates commercially available geologic and engineering data. The mathematical calculations of this assessment methodology were tested and produced realistic results for the Permian Basin Horseshoe Atoll, Upper Pennsylvanian-Wolfcampian Play (Texas, USA). The USGS plans to use the new methodology to conduct an assessment of technically recoverable hydrocarbons and associated CO2 sequestration resulting from CO2-EOR in the United States.

  13. Evaluation of the HARDMAN comparability methodology for manpower, personnel and training

    NASA Technical Reports Server (NTRS)

    Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.

    1984-01-01

    The methodology evaluation and recommendations are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting the manpower, personnel, and training (MPT) needed to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility to the Army. Recommendations are provided to address the problem areas revealed through the evaluation.

  14. A validated methodology for genetic identification of tuna species (genus Thunnus).

    PubMed

    Viñas, Jordi; Tudela, Sergi

    2009-10-27

    Tuna species of the genus Thunnus, such as the bluefin tunas, are some of the most important and yet most endangered trade fish in the world. Identification of these species in traded forms, however, may be difficult depending on the presentation of the products, which may hamper conservation efforts on trade control. In this paper, we validated a genetic methodology that can fully distinguish between the eight Thunnus species from any kind of processed tissue. After testing several genetic markers, a complete discrimination of the eight tuna species was achieved using Forensically Informative Nucleotide Sequencing, based primarily on the sequence variability of the hypervariable mitochondrial DNA control region (mtDNA CR), followed, in some specific cases, by a second validation with a nuclear marker, the rDNA first internal transcribed spacer (ITS1). This methodology was able to distinguish all tuna species, including those belonging to the subgenus Neothunnus that are very closely related and, in consequence, cannot be differentiated with other genetic markers of lower variability. The methodology also took into consideration the introgression that has been reported in past studies between T. thynnus, T. orientalis and T. alalunga. Finally, we applied the methodology to cross-check the species identity of 26 processed tuna samples. Using the combination of two genetic markers, one mitochondrial and one nuclear, allows a full discrimination between all eight tuna species. Unexpectedly, the genetic marker traditionally used for DNA barcoding, cytochrome oxidase 1, could not differentiate all species; thus its use as a genetic marker for tuna species identification is questioned.

  15. Nutrition management guideline for maple syrup urine disease: an evidence- and consensus-based approach.

    PubMed

    Frazier, Dianne M; Allgeier, Courtney; Homer, Caroline; Marriage, Barbara J; Ogata, Beth; Rohr, Frances; Splett, Patricia L; Stembridge, Adrya; Singh, Rani H

    2014-07-01

    In an effort to increase harmonization of care and enable outcome studies, the Genetic Metabolic Dietitians International (GMDI) and the Southeast Regional Newborn Screening and Genetics Collaborative (SERC) are partnering to develop nutrition management guidelines for inherited metabolic disorders (IMD) using a model combining both evidence- and consensus-based methodology. The first guideline to be completed is for maple syrup urine disease (MSUD). This report describes the methodology used in its development: formulation of five research questions; review, critical appraisal and abstraction of peer-reviewed studies and unpublished practice literature; and expert input through Delphi surveys and a nominal group process. This report includes the summary statements for each research question and the nutrition management recommendations they generated. Each recommendation is followed by a standardized rating based on the strength of the evidence and consensus used. The application of technology to build the infrastructure for this project allowed transparency during development of this guideline and will be a foundation for future guidelines. Online open access of the full, published guideline allows utilization by health care providers, researchers, and collaborators who advise, advocate and care for individuals with MSUD and their families. There will be future updates as warranted by developments in research and clinical practice. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  16. [Sex differentiation in plants. Terms and notions].

    PubMed

    Godin, V N

    2007-01-01

    There are two methodological approaches to the study of sex in plants: the descriptive-morphological approach and the quantitative approach. The former is based exclusively on the external morphological peculiarities of the generative organs of the flower; the latter is based on the functioning of individuals as parents of the coming generation. It has been suggested to recognize three flower types: staminate, pistillate, and complete. Depending on the distribution pattern of flowers of different sex types, there are monomorphic populations (all individuals form flowers of the same type) and heteromorphic populations (individuals have flowers of different types). Monomorphic populations include monoclinous, monoecious, gynomonoecious, andromonoecious, and polygamomonoecious ones. Among heteromorphic populations, dioecious, polygamodioecious, subdioecious, paradioecious, and trioecious ones are recognized. It is desirable to give up the usage of such terms as "bisexual", "polygamous", "functionally female", and "functionally male" flowers, "temporary dioecy" and some others. The notion "gender" has been established in English-language works for describing sex quantitatively; two additional terms have been proposed: "phenotypic gender" and "functional gender". The recently developed quantitative approach is at present in the process of accumulating material and in need of further elaboration of its methodological basis. Analysis of the principal notions shows the necessity of forming an integrated structure for them and of correcting the usage of existing and new terms.

  17. Accelerated Training at Mach 20: A Brief Communication Submitted from the International Space Station

    NASA Technical Reports Server (NTRS)

    Foale, C. Michael; Kaleri, Alexander Y.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Melton, Shannon; Martin, David; Dulchavsky, Scott A.

    2004-01-01

    The performance of complex tasks on the International Space Station (ISS) requires significant preflight crew training commitments and frequent skill and knowledge refreshment. This report documents a recently developed just-in-time training methodology, which integrates preflight hardware familiarization and procedure training with on-orbit CD-ROM-based skill enhancement. This just-in-time concept was used to support real-time remote expert guidance in completing medical examinations using the ISS Human Research Facility (HRF). An American and a Russian ISS crewmember received 2 hours of hands-on ultrasound training 8 months prior to the on-orbit ultrasound exam. A CD-ROM-based Onboard Proficiency Enhancement (OPE) interactive multimedia program, consisting of memory-enhancing tutorials and skill-testing exercises, was completed by the crewmember six days prior to the on-orbit ultrasound exam. The crewmember was then remotely guided through a thoracic, vascular, and echocardiographic examination by ultrasound imaging experts. Results of the CD-ROM-based OPE session were used to modify the instructions during a complete 35-minute real-time thoracic, cardiac, and carotid/jugular ultrasound study. Following commands from the ground-based expert, the crewmember acquired all target views and images without difficulty. The anatomical content and fidelity of the ultrasound video were excellent and adequate for clinical decision-making. Complex ultrasound experiments with expert guidance were performed with high accuracy following limited pre-flight training and CD-ROM-based in-flight review, despite a 2-second communication latency. In-flight application of multimedia proficiency enhancement software, coupled with real-time remote expert guidance, can facilitate the performance of complex, demanding tasks.

  18. [Evaluation of the first training on clinical research methodology in Chile].

    PubMed

    Espinoza, Manuel; Cabieses, Báltica; Pedreros, César; Zitko, Pedro

    2011-03-01

    This paper describes the evaluation of the first training on clinical research methodology in Chile (EMIC-Chile) 12 months after its completion. An online survey was conducted for students and the Delphi method was used for the teaching team. Among the students, the majority reported that the program had contributed to their professional development and that they had shared some of the knowledge acquired with colleagues in their workplace. Forty-one percent submitted a project to obtain research funding through a competitive grants process once they had completed the course. Among the teachers, the areas of greatest interest were the communication strategy, teaching methods, the characteristics of the teaching team, and potential strategies for making the EMIC-Chile permanent in the future. This experience could contribute to future research training initiatives for health professionals. Recognized challenges are the involvement of nonmedical professions in clinical research, the complexities associated with the distance learning methodology, and the continued presence of initiatives of this importance at the national and regional level.

  19. Context-specific selection of algorithms for recursive feature tracking in endoscopic image using a new methodology.

    PubMed

    Selka, F; Nicolau, S; Agnus, V; Bessaid, A; Marescaux, J; Soler, L

    2015-03-01

    In minimally invasive surgery, the tracking of deformable tissue is a critical component of image-guided applications. Deformation of the tissue can be recovered by tracking features using tissue surface information (texture, color, etc.). Recent work in this field has shown success in acquiring tissue motion. However, the performance evaluation of detection and tracking algorithms on such images is still difficult and is not standardized, mainly because of the lack of ground-truth data for real images. Moreover, no quantitative work has been undertaken to evaluate the benefit of a pre-process based on image filtering, which can improve feature-tracking robustness and avoid the need for supplementary outlier-removal techniques. In this paper, we propose a methodology to validate detection and feature-tracking algorithms using a trick based on forward-backward tracking that provides artificial ground-truth data. We describe a clear and complete methodology to evaluate and compare different detection and tracking algorithms. In addition, we extend our framework to propose a strategy to identify the best combinations from a set of detector, tracker and pre-process algorithms, according to the live intra-operative data. Experiments were performed on in vivo datasets and show that pre-processing can have a strong influence on tracking performance and that our strategy for finding the best combinations is relevant for a reasonable computation cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
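    The forward-backward trick is straightforward to implement with a standard tracker: track points from frame A to frame B, track the results back to A, and treat the round-trip displacement as an error score, since a perfect tracker returns every point to its origin. The sketch below uses OpenCV's pyramidal Lucas-Kanade tracker on synthetic frames; the authors' framework evaluates many detector/tracker/pre-process combinations, which this sketch does not attempt.

        import cv2
        import numpy as np

        def forward_backward_error(prev_img, next_img, points):
            """Track points forward then backward with pyramidal Lucas-Kanade;
            the distance between each original point and its round-trip position
            serves as an artificial ground-truth error for ranking trackers."""
            pts = points.astype(np.float32).reshape(-1, 1, 2)
            fwd, st1, _ = cv2.calcOpticalFlowPyrLK(prev_img, next_img, pts, None)
            bwd, st2, _ = cv2.calcOpticalFlowPyrLK(next_img, prev_img, fwd, None)
            err = np.linalg.norm(pts - bwd, axis=2).ravel()
            ok = (st1.ravel() == 1) & (st2.ravel() == 1)
            return err, ok

        # Synthetic frames: a shifted random texture stands in for endoscopic images
        rng = np.random.default_rng(0)
        frame0 = (rng.random((240, 320)) * 255).astype(np.uint8)
        frame1 = np.roll(frame0, (3, 5), axis=(0, 1))
        corners = cv2.goodFeaturesToTrack(frame0, maxCorners=50, qualityLevel=0.01, minDistance=10)
        errors, valid = forward_backward_error(frame0, frame1, corners.reshape(-1, 2))
        print(f"median round-trip error: {np.median(errors[valid]):.3f} px")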

  20. From diets to foods: using linear programming to formulate a nutritious, minimum-cost porridge mix for children aged 1 to 2 years.

    PubMed

    De Carvalho, Irene Stuart Torrié; Granfeldt, Yvonne; Dejmek, Petr; Håkansson, Andreas

    2015-03-01

    Linear programming has been used extensively as a tool for nutritional recommendations. Extending the methodology to food formulation presents new challenges, since not all combinations of nutritious ingredients will produce an acceptable food; doing so, however, would help in implementation and in ensuring the feasibility of the suggested recommendations. To extend the previously used linear programming methodology from diet optimization to food formulation using consistency constraints, and to exemplify its usability with the case of a porridge mix formulation for emergency situations in rural Mozambique. The linear programming method was extended with a consistency constraint based on previously published empirical studies of starch swelling in soft porridges. The new method was exemplified using the formulation of a nutritious, minimum-cost porridge mix for children aged 1 to 2 years for use as a complete relief food, based primarily on local ingredients, in rural Mozambique. A nutritious porridge fulfilling the consistency constraints was found; however, the minimum cost was infeasibly high with local ingredients only. This illustrates the challenges in formulating nutritious yet economically feasible foods from local ingredients; the high cost was driven by the cost of mineral-rich foods. A nutritious, low-cost porridge that fulfills the consistency constraints was obtained by including zinc and calcium salt supplements as ingredients. The optimizations were successful in fulfilling all constraints and provided a feasible porridge, showing that the extended constrained linear programming methodology provides a systematic tool for designing nutritious foods.
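    A toy version of the formulation is shown below using scipy.optimize.linprog: decision variables are ingredient amounts, the objective is cost, nutrient adequacy enters as lower-bound rows, and the paper's consistency requirement is mimicked by a single cap on total flour as a stand-in for the empirical starch-swelling constraint. All ingredient data, prices and requirements are invented for illustration.

        import numpy as np
        from scipy.optimize import linprog

        # Per 100 g of ingredient: cost, energy (kcal), protein (g), zinc (mg).
        # All numbers are illustrative placeholders.
        foods = {
            "maize flour": (0.05, 365, 9.4, 2.2),
            "bean flour":  (0.12, 341, 21.6, 2.8),
            "oil":         (0.15, 884, 0.0, 0.0),
            "sugar":       (0.08, 387, 0.0, 0.0),
        }
        cost = [v[0] for v in foods.values()]
        nutrients = np.array([[v[1] for v in foods.values()],
                              [v[2] for v in foods.values()],
                              [v[3] for v in foods.values()]])
        req = np.array([900, 13, 4])   # assumed energy, protein and zinc targets

        # Nutrient lower bounds become -A x <= -b; the last row caps total flour
        # (maize + bean) at 300 g as a stand-in for the consistency constraint.
        A_ub = np.vstack([-nutrients, [1, 1, 0, 0]])
        b_ub = np.concatenate([-req, [3.0]])
        res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 5)] * len(foods))

        print({k: f"{100 * x:.0f} g" for k, x in zip(foods, res.x)})
        print(f"minimum cost: {res.fun:.2f}")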

  1. UAV-Based Photogrammetry and Integrated Technologies for Architectural Applications—Methodological Strategies for the After-Quake Survey of Vertical Structures in Mantua (Italy)

    PubMed Central

    Achille, Cristiana; Adami, Andrea; Chiarini, Silvia; Cremonesi, Stefano; Fassi, Francesco; Fregonese, Luigi; Taffurelli, Laura

    2015-01-01

    This paper examines the survey of tall buildings in an emergency context, such as in the aftermath of seismic events. An after-earthquake survey has to guarantee time savings, high precision and security during the operational stages. The main goal is to optimize the application of methodologies based on the acquisition and automatic elaboration of photogrammetric data, including the use of Unmanned Aerial Vehicle (UAV) systems, in order to provide fast and low-cost operations. The suggested methods integrate new technologies with commonly used technologies like TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case based on the comparison of acquisition, calibration and 3D modeling results obtained with a laser scanner, a metric camera and an amateur reflex camera. The test demonstrates the efficiency of image-based methods in the acquisition of complex architecture. The case study is the Santa Barbara Bell Tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of UAV photogrammetry for the survey of vertical structures, complex buildings and architectural parts that are difficult to access, providing high-precision results. PMID:26134108

  2. Quasi-experimental designs in practice-based research settings: design and implementation considerations.

    PubMed

    Handley, Margaret A; Schillinger, Dean; Shiboski, Stephen

    2011-01-01

    Although randomized controlled trials are often the gold standard for determining intervention effects, in the area of practice-based research (PBR) there are many situations in which individual randomization is not possible. Alternative approaches to evaluating interventions have received increased attention, particularly those that can retain elements of randomization such that they can be considered "controlled" trials. Methodological design elements and practical implementation considerations for two quasi-experimental design approaches that have considerable promise in PBR settings--the stepped-wedge design and a variant of it, the wait-list cross-over design--are presented, along with a case study from a recent PBR intervention for patients with diabetes. PBR-relevant design features include: creation of a cohort over time that collects control data but allows all participants (clusters or patients) to receive the intervention; staggered introduction of clusters; multiple data collection points; and one-way cross-over into the intervention arm. Practical considerations include: randomization versus stratification; training run-in phases; and an extended time period for overall study completion. Several design features of practice-based research studies can be adapted to local circumstances yet retain elements that improve methodological rigor. Studies that utilize these methods, such as the stepped-wedge design and the wait-list cross-over design, can increase the evidence base for controlled studies conducted within the complex environment of PBR.
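    The design element that preserves randomization in a stepped-wedge study is the schedule itself: every cluster starts in the control condition, the step at which each cluster crosses over is randomized, and no cluster ever crosses back. A minimal generator for such a schedule (a cluster-by-period 0/1 matrix) might look like the following sketch; the cluster and step counts are arbitrary.

        import numpy as np

        def stepped_wedge(n_clusters, n_steps, seed=0):
            """Return a clusters x periods 0/1 matrix: each cluster crosses over
            to the intervention at a randomly assigned step and never crosses back."""
            rng = np.random.default_rng(seed)
            # Spread clusters evenly over the steps, then randomize the assignment
            crossover = np.repeat(np.arange(1, n_steps + 1),
                                  int(np.ceil(n_clusters / n_steps)))[:n_clusters]
            rng.shuffle(crossover)
            periods = np.arange(n_steps + 1)        # period 0 is the all-control baseline
            return (periods[None, :] >= crossover[:, None]).astype(int)

        print(stepped_wedge(n_clusters=6, n_steps=3))

    The wait-list cross-over variant discussed by the authors roughly corresponds to the two-period special case, in which the waiting (control) clusters receive the intervention after the comparison period ends.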

  3. UAV-Based Photogrammetry and Integrated Technologies for Architectural Applications--Methodological Strategies for the After-Quake Survey of Vertical Structures in Mantua (Italy).

    PubMed

    Achille, Cristiana; Adami, Andrea; Chiarini, Silvia; Cremonesi, Stefano; Fassi, Francesco; Fregonese, Luigi; Taffurelli, Laura

    2015-06-30

    This paper examines the survey of tall buildings in an emergency context, such as in the aftermath of seismic events. An after-earthquake survey has to guarantee time savings, high precision and security during the operational stages. The main goal is to optimize the application of methodologies based on the acquisition and automatic elaboration of photogrammetric data, including the use of Unmanned Aerial Vehicle (UAV) systems, in order to provide fast and low-cost operations. The suggested methods integrate new technologies with commonly used technologies like TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case based on the comparison of acquisition, calibration and 3D modeling results obtained with a laser scanner, a metric camera and an amateur reflex camera. The test demonstrates the efficiency of image-based methods in the acquisition of complex architecture. The case study is the Santa Barbara Bell Tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of UAV photogrammetry for the survey of vertical structures, complex buildings and architectural parts that are difficult to access, providing high-precision results.

  4. How EIA Estimates Natural Gas Production

    EIA Publications

    2004-01-01

    The Energy Information Administration (EIA) publishes monthly and annual estimates of natural gas production in the United States. The estimates are based on data EIA collects from gas-producing states and data collected by the U.S. Minerals Management Service (MMS) in the Department of the Interior. The states and MMS collect this information from natural gas producers for various reasons, most often for revenue purposes. Because the information is not sufficiently complete or timely for inclusion in EIA's Natural Gas Monthly (NGM), EIA has developed estimation methodologies to generate monthly production estimates; these are described in this document.

  5. Post-Doctoral Fellowship for Merton S. Krause. Final Report.

    ERIC Educational Resources Information Center

    Jackson, Philip W.

    The final quarter of Krause's fellowship year was spent in completing his interviews with political socialization researchers in the eastern United States and his work on methodological problems. Krause also completed a long essay on the nature and implications of the "matrix perspective" for research planning, pursued his study of measurement…

  6. Resistances to Scientific Knowledge Production of Comparative Measurements of Dropout and Completion in European Higher Education

    ERIC Educational Resources Information Center

    Carlhed, Carina

    2017-01-01

    The article is a critical sociological analysis of current transnational practices on creating comparable measurements of dropout and completion in higher education and the consequences for the conditions of scientific knowledge production on the topic. The analysis revolves around questions of epistemological, methodological and symbolic types…

  7. Application of process improvement principles to increase the frequency of complete airway management documentation.

    PubMed

    McCarty, L Kelsey; Saddawi-Konefka, Daniel; Gargan, Lauren M; Driscoll, William D; Walsh, John L; Peterfreund, Robert A

    2014-12-01

    Process improvement in healthcare delivery settings can be difficult, even when there is consensus among clinicians about a clinical practice or desired outcome. Airway management is a medical intervention fundamental to the delivery of anesthesia care. As with other medical interventions, a detailed description of the management methods should be documented; despite this expectation, airway documentation is often insufficient. The authors hypothesized that formal adoption of process improvement methods could be used to increase the rate of "complete" airway management documentation. The authors defined a set of criteria as a local practice standard of "complete" airway management documentation. The authors then employed selected process improvement methodologies over 13 months in three iterative and escalating phases to increase the percentage of records with complete documentation. The criteria were applied retrospectively to determine the baseline frequency of complete records, and prospectively to measure the impact of process improvement efforts over the three phases of implementation. Immediately before the initial intervention, a retrospective review of 23,011 general anesthesia cases over 6 months showed that 13.2% of patient records included complete documentation. At the conclusion of the 13-month improvement effort, documentation improved to a completion rate of 91.6% (P<0.0001). During the subsequent 21 months, the completion rate was sustained at an average of 90.7% (SD, 0.9%) across 82,571 general anesthetic records. Systematic application of process improvement methodologies can improve airway documentation and may be similarly effective in improving other areas of anesthesia clinical practice.

  8. Abu Sayyaf Group (ASG): An Al-Qaeda Associate Case Study

    DTIC Science & Technology

    2017-10-01

    This case study of the Abu Sayyaf Group (ASG) was completed in August 2017 by P. Kathleen Hammerberg and Pamela G. Faber, with contributions from Alexander Powell. In order to conduct the assessment, CNA used a comparative methodology that included eight case studies on groups affiliated or associated with Al-Qaeda.

  9. Manufacturing Technology for Shipbuilding

    DTIC Science & Technology

    1983-01-01

    during which time there was an interface of Japanese and American shipbuilding concepts and methodology. Those two years of work resulted in many...lanes is of paramount importance in maintaining a smooth, orderly flow of pre-fabricated steel. Occasionally, the process lanes may fall behind schedule...design methodology. The earlier the start that Engineering has, the better the chance that all required engineering work will be completed at start of

  10. Suggestions for Job and Curriculum Ladders in Health Center Ambulatory Care: A Pilot Test of the Health Services Mobility Study Methodology.

    ERIC Educational Resources Information Center

    Gilpatrick, Eleanor

    This report contains the results of a pilot test which represents the first complete field test of methodological work begun in October 1967 under a Federal grant for the purpose of job analysis in the health services. This 4-year Health Services Mobility Study permitted basic research, field testing, practical application, and policy involvement…

  11. Software Requirements Specification for an Ammunition Management System

    DTIC Science & Technology

    1986-09-01

    thesis takes the form of a software requirements specification. Such a specification, according to Pressman [Ref. 7], establishes a complete...defined by Pressman, is depicted in Figure 1.1 (Generalized Software Life Cycle). The common thread which binds the various phases together...application of software engineering principles requires an established methodology. This methodology, according to Pressman [Ref. 8: p. 15], is an

  12. Fusion of MultiSpectral and Panchromatic Images Based on Morphological Operators.

    PubMed

    Restaino, Rocco; Vivone, Gemine; Dalla Mura, Mauro; Chanussot, Jocelyn

    2016-04-20

    Nonlinear decomposition schemes constitute an alternative to classical approaches for facing the problem of data fusion. In this paper we discuss the application of this methodology to a popular remote sensing application called pansharpening, which consists of the fusion of a low-resolution multispectral image and a high-resolution panchromatic image. We design a complete pansharpening scheme based on the use of morphological half-gradient operators and demonstrate the suitability of this algorithm through comparison with state-of-the-art approaches. Four datasets acquired by the Pleiades, Worldview-2, Ikonos and Geoeye-1 satellites are employed for the performance assessment, testifying to the effectiveness of the proposed approach in producing top-class images with a setting independent of the specific sensor.
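    A minimal detail-injection sketch in this spirit: extract the panchromatic detail with a morphological low-pass built from erosion and dilation (whose differences from the image are the half-gradients the paper builds on) and inject it into the upsampled multispectral bands. This toy omits the multiscale decomposition and the sensor-specific gains of a real pansharpening scheme, and the data are random placeholders.

        import numpy as np
        import cv2

        def morpho_pansharpen(ms_lr, pan):
            """Simplified detail-injection pansharpening: add morphologically
            extracted PAN detail to the upsampled multispectral bands."""
            h, w = pan.shape
            ms_up = cv2.resize(ms_lr, (w, h), interpolation=cv2.INTER_CUBIC)
            k = np.ones((5, 5), np.uint8)
            pan_f = pan.astype(np.float32)
            # Mean of erosion and dilation acts as a crude morphological low-pass;
            # pan - erosion and dilation - pan are the half-gradients.
            low = 0.5 * (cv2.erode(pan_f, k) + cv2.dilate(pan_f, k))
            detail = pan_f - low
            return np.clip(ms_up.astype(np.float32) + detail[..., None], 0, 255)

        pan = (np.random.default_rng(0).random((128, 128)) * 255).astype(np.uint8)
        ms = (np.random.default_rng(1).random((32, 32, 3)) * 255).astype(np.uint8)
        print(morpho_pansharpen(ms, pan).shape)   # (128, 128, 3)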

  13. Geometrical-Based Navigation System Performance Assessment in the Space Service Volume Using a Multiglobal Navigation Satellite System Methodology

    NASA Technical Reports Server (NTRS)

    Welch, Bryan W.

    2016-01-01

    NASA is participating in the International Committee on Global Navigation Satellite Systems (GNSS) (ICG)'s efforts to demonstrate the benefits to space users in the Space Service Volume (SSV) when a multi-GNSS solution-space approach is utilized. The ICG Working Group: Enhancement of GNSS Performance, New Services and Capabilities has started a three-phase analysis initiative of increasing complexity and fidelity as an outcome of recommendations at the ICG-10 meeting, in preparation for the ICG-11 meeting. The first phase of that initiative is based on a purely geometrically-derived access technique. This first phase of analysis has been completed, and the results are documented in this paper.
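    A purely geometric access test of the kind the first phase implies needs only two checks per transmitter-receiver pair: the line of sight must clear the Earth limb, and the receiver must fall inside the transmit antenna beam. The orbit radii and the 23.5-degree half-beamwidth in the sketch below are assumptions for illustration, not ICG or GPS interface-specification values.

        import numpy as np

        R_EARTH = 6378.0   # km; blocking-body radius (atmosphere ignored for simplicity)

        def line_blocked(p1, p2, radius=R_EARTH):
            """True if the segment p1->p2 passes through a sphere of the given
            radius centred at the origin (Earth blocks the line of sight)."""
            d = p2 - p1
            t = np.clip(-np.dot(p1, d) / np.dot(d, d), 0.0, 1.0)  # closest approach
            return np.linalg.norm(p1 + t * d) < radius

        def within_beam(sat, user, half_angle_deg):
            """True if the user lies inside the transmitter's nadir-pointed beam."""
            to_nadir = -sat / np.linalg.norm(sat)
            to_user = (user - sat) / np.linalg.norm(user - sat)
            return np.degrees(np.arccos(np.clip(to_nadir @ to_user, -1, 1))) <= half_angle_deg

        sat = np.array([26560.0, 0.0, 0.0])        # MEO-like transmitter radius, km
        user = np.array([-36514.0, 21082.0, 0.0])  # GEO-radius user across the limb
        visible = (not line_blocked(sat, user)) and within_beam(sat, user, half_angle_deg=23.5)
        print("geometric access:", visible)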

  14. Multiscale Simulation of Gas Film Lubrication During Liquid Droplet Collision

    NASA Astrophysics Data System (ADS)

    Chen, Xiaodong; Khare, Prashant; Ma, Dongjun; Yang, Vigor

    2012-02-01

    Droplet collision plays an elementary role in the dense-spray combustion process. When two droplets approach each other, a gas film forms in between, and the pressure generated within the film resists the motion of the approaching droplets. This is fluid-film lubrication, which occurs when opposing bearing surfaces are completely separated by a fluid film. The lubrication flow in the gas film decides the collision outcome: coalescence or bouncing. The present study focuses on the gas-film drainage process over a wide range of Weber numbers during equal- and unequal-sized droplet collisions. The formulation is based on the complete set of conservation equations for both the liquid and the surrounding gas phases. An improved volume-of-fluid technique, augmented by an adaptive mesh refinement algorithm, is used to track liquid/gas interfaces. A unique thickness-based refinement algorithm based on the topology of the interfacial flow is developed and implemented to efficiently resolve the multiscale problem. The grid size on the interface is as small as O(10^-4) of the droplet size, with a maximum resolution of 0.015 μm. An advanced visualization technique using ray tracing is used to gain direct insight into the detailed physics. Theories are established by analyzing the characteristics of shape change and flow evolution.

  15. Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nowlen, Steven Patrick; Hyslop, J. S.

    2010-04-01

    Fire probabilistic risk assessment (PRA) methods utilize data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively based on insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper will discuss the potential methodology improvements that could be realized if more complete fire event reporting information were available. Areas that could benefit from more complete event reporting that will be discussed in the paper include fire event frequency analysis, analysis of fire detection and suppression system performance including incipient detection systems, analysis of manual fire fighting performance, treatment of fire growth from incipient stages to fully-involved fires, operator response to fire events, the impact of smoke on plant operations and equipment, and the impact of fire-induced cable failures on plant electrical circuits.

  16. Utility-based designs for randomized comparative trials with categorical outcomes

    PubMed Central

    Murray, Thomas A.; Thall, Peter F.; Yuan, Ying

    2016-01-01

    A general utility-based testing methodology for design and conduct of randomized comparative clinical trials with categorical outcomes is presented. Numerical utilities of all elementary events are elicited to quantify their desirabilities. These numerical values are used to map the categorical outcome probability vector of each treatment to a mean utility, which is used as a one-dimensional criterion for constructing comparative tests. Bayesian tests are presented, including fixed-sample and group sequential procedures, assuming Dirichlet-multinomial models for the priors and likelihoods. Guidelines are provided for establishing priors, eliciting utilities, and specifying hypotheses. Efficient posterior computation is discussed, and algorithms are provided for jointly calibrating test cutoffs and sample size to control overall type I error and achieve specified power. Asymptotic approximations for the power curve are used to initialize the algorithms. The methodology is applied to re-design a completed trial that compared two chemotherapy regimens for chronic lymphocytic leukemia, in which an ordinal efficacy outcome was dichotomized and toxicity was ignored to construct the trial's design. The Bayesian tests are also illustrated for several types of categorical outcomes arising in common clinical settings. Freely available computer software for implementation is provided. PMID:27189672
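    The central mapping in the abstract, from a categorical outcome probability vector to a one-dimensional mean utility, is easy to sketch. Below is a minimal Monte Carlo version of a fixed-sample Bayesian comparison under Dirichlet-multinomial conjugacy; the elicited utilities and outcome counts are hypothetical, and the calibration of cutoffs and sample size described in the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
utilities = np.array([100.0, 70.0, 40.0, 0.0])   # elicited desirability of each elementary event (hypothetical)
prior = np.ones(4)                                # Dirichlet(1,...,1) prior
counts_A = np.array([18, 9, 6, 3])                # observed category counts, arm A (hypothetical)
counts_B = np.array([12, 10, 8, 6])               # arm B

theta_A = rng.dirichlet(prior + counts_A, 100_000)   # posterior draws of outcome probabilities
theta_B = rng.dirichlet(prior + counts_B, 100_000)
mean_util_A = theta_A @ utilities                    # map probability vector -> mean utility
mean_util_B = theta_B @ utilities
print("Pr(mean utility A > B | data) =", (mean_util_A > mean_util_B).mean())
```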

  17. Background qualitative analysis of the European reference life cycle database (ELCD) energy datasets - part II: electricity datasets.

    PubMed

    Garraín, Daniel; Fazio, Simone; de la Rúa, Cristina; Recchioni, Marco; Lechón, Yolanda; Mathieux, Fabrice

    2015-01-01

    The aim of this paper is to identify areas of potential improvement in the European Reference Life Cycle Database (ELCD) electricity datasets. The revision is based on the data quality indicators described by the International Life Cycle Data system (ILCD) Handbook, applied on a sectorial basis. These indicators evaluate the technological, geographical and time-related representativeness of the dataset and its appropriateness in terms of completeness, precision and methodology. Results show that the ELCD electricity datasets are of very good quality in general terms; nevertheless, some findings and recommendations for improving the quality of the Life Cycle Inventories have been derived. Moreover, these results confirm the quality of the electricity-related datasets for any LCA practitioner and provide insights into the limitations and assumptions underlying the dataset modelling. Given this information, the LCA practitioner will be able to decide whether the use of the ELCD electricity datasets is appropriate for the goal and scope of the analysis to be conducted. The methodological approach would also be useful for dataset developers and reviewers seeking to improve the overall Data Quality Requirements of databases.

  18. Realistic evaluation of an emergency department-based mental health nurse practitioner outpatient service in Australia.

    PubMed

    Wand, Timothy; White, Kathryn; Patching, Joanna

    2011-06-01

    Evaluation of new models of care requires consideration of the complexity inherent within health care programs and their sensitivity to local contextual factors as well as broader community, social and political influences. Evaluation frameworks that are flexible and responsive while maintaining research rigor are therefore required. Realistic evaluation was adopted as the methodology for the implementation and evaluation of an emergency department-based mental health nurse practitioner outpatient service in Sydney, Australia. The aim of realistic evaluation is to generate, test and refine theories of how programs work within a given context. This paper represents the final methodological step from the completed evaluation. A summary of quantitative and qualitative findings from the mixed-methods evaluation is presented, which is transformed into a set of overarching statements or "middle range theories". Middle range theory statements seek to explain the success of a program and provide transferable lessons for practitioners wishing to implement similar programs elsewhere. For example, the research team consider that early consultation with key local stakeholders and emergency department ownership of the project was pivotal to the implementation process. © 2011 Blackwell Publishing Asia Pty Ltd.

  19. A stepwise, 'test-all-positives' methodology to assess gluten-kernel contamination at the serving-size level in gluten-free (GF) oat production.

    PubMed

    Chen, Yumin; Fritz, Ronald D; Kock, Lindsay; Garg, Dinesh; Davis, R Mark; Kasturi, Prabhakar

    2018-02-01

    A step-wise, 'test-all-positive-gluten' analytical methodology has been developed and verified to assess kernel-based gluten contamination (i.e., wheat, barley and rye kernels) during gluten-free (GF) oat production. It targets GF-claim compliance at the serving-size level (a pouch, or approximately 40-50 g). Oat groats are collected from GF oat production following a robust attribute-based sampling plan, then split into 75-g subsamples and ground. The R-Biopharm R5 sandwich ELISA R7001 is used to analyze the first 15-g portion of each ground sample. A >20-ppm result disqualifies the production lot, while a >5- to <20-ppm result triggers complete analysis of the remaining 60 g of ground sample, analyzed in 15-g portions. If all five 15-g test results are <20 ppm, and their average is <10.67 ppm (since a 20-ppm contaminant in 40 g of oats would dilute to 10.67 ppm in 75 g), the lot is passed. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
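    The decision rule is explicit enough in the abstract to restate as code. The sketch below implements it for a single 75-g subsample, assuming five 15-g ELISA results in ppm; the handling of a result of exactly 20 ppm is not specified in the abstract and is treated conservatively here.

```python
def lot_decision(first_ppm, remaining_ppm=()):
    """Stepwise 'test-all-positives' rule for one 75-g ground oat subsample,
    analyzed by R5 sandwich ELISA in 15-g portions (results in ppm gluten)."""
    if first_ppm > 20:
        return "FAIL"                       # >20 ppm in the first portion disqualifies the lot
    if first_ppm <= 5:
        return "PASS"                       # no trigger: at or below the 5-ppm threshold
    # a >5 to <20 ppm result triggers analysis of the remaining 60 g in 15-g portions
    results = [first_ppm, *remaining_ppm]
    if not all(r < 20 for r in results):
        return "FAIL"
    # a 20-ppm kernel contaminant in a 40-g serving dilutes to 20 * 40 / 75 = 10.67 ppm in 75 g
    return "PASS" if sum(results) / len(results) < 10.67 else "FAIL"

print(lot_decision(8.0, (4.0, 3.5, 6.0, 5.5)))   # mean 5.4 ppm -> PASS
```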

  20. Artistic image analysis using graph-based learning approaches.

    PubMed

    Carneiro, Gustavo

    2013-08-01

    We introduce a new methodology for the problem of artistic image analysis, which, among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation, which is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to more efficient inference and training procedures. The experiments are run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), for which we report inference and training running times and quantitative comparisons with respect to several retrieval and annotation performance measures.
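    As a rough illustration of the combined-graph idea, the sketch below fuses an appearance similarity and an annotation similarity into a single affinity matrix and runs standard closed-form label propagation on it. The convex fusion W = beta*W_app + (1 - beta)*W_ann is only an assumption for illustration; the paper's principled combination and its inverted label propagation formulation differ in detail.

```python
import numpy as np

def propagate(W, Y, alpha=0.9):
    """Closed-form label propagation (Zhou et al.): F = (I - alpha*S)^-1 (1-alpha) Y,
    with S the symmetrically normalized affinity matrix."""
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))
    return np.linalg.solve(np.eye(len(W)) - alpha * S, (1 - alpha) * Y)

rng = np.random.default_rng(1)
n, k = 6, 2
W_app = rng.random((n, n)); W_app = (W_app + W_app.T) / 2   # appearance similarity (hypothetical)
W_ann = rng.random((n, n)); W_ann = (W_ann + W_ann.T) / 2   # annotation similarity (hypothetical)
np.fill_diagonal(W_app, 0); np.fill_diagonal(W_ann, 0)
beta = 0.5
W = beta * W_app + (1 - beta) * W_ann     # one way to fuse the two similarities into a single graph
Y = np.zeros((n, k)); Y[0, 0] = 1; Y[5, 1] = 1   # two labeled images
print(propagate(W, Y).argmax(axis=1))             # predicted class per image
```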

  1. Application of the enterprise management tools Lean Six Sigma and PMBOK in developing a program of research management.

    PubMed

    Hors, Cora; Goldberg, Anna Carla; Almeida, Ederson Haroldo Pereira de; Babio Júnior, Fernando Galan; Rizzo, Luiz Vicente

    2012-01-01

    To introduce a program for the management of scientific research in a general hospital, employing the business-management tools Lean Six Sigma and PMBOK for project management in this area. The Lean Six Sigma methodology was used to improve the management of the institution's scientific research through a specific tool (DMAIC) for identification and implementation of solutions, followed by analysis of the solutions found based on PMBOK practices. We present our solutions for the management of institutional research projects at the Sociedade Beneficente Israelita Brasileira Albert Einstein. The solutions were classified under four headings: people, processes, systems and organizational culture. A preliminary analysis showed these solutions to be completely or partially compliant with the processes described in the PMBOK Guide. In this post facto study, we verified that the solutions drawn from a project using the Lean Six Sigma methodology and based on PMBOK enabled the improvement of our processes for managing the scientific research carried out in the institution, and they constitute a model that can contribute to the search for innovative science-management solutions by other institutions dealing with scientific research in Brazil.

  2. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8225 ● NOV 2017 ● US Army Research Laboratory. Report: Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques. [No abstract available; the record contains only report front matter.]

  3. Decision support methodology to establish priorities on the inspection of structures

    NASA Astrophysics Data System (ADS)

    Cortes, V. Juliette; Sterlacchini, Simone; Bogaard, Thom; Frigerio, Simone; Schenato, Luca; Pasuto, Alessandro

    2014-05-01

    For hydro-meteorological hazards in mountain areas, the regular inspection of check dams and bridges is important because of the effect of their functional status on water-sediment processes. Inspection of these structures is time-consuming for organizations owing to their extensive number in many regions. However, trained citizen-volunteers can support civil protection and technical services by improving the frequency, timeliness and coverage of monitoring the functional status of hydraulic structures. Technicians then evaluate and validate these reports to obtain an index for the status of the structure, so that preventive actions can be initiated, such as clearing obstructions or pre-screening potential problems for a second-level inspection. This study proposes a decision-support methodology that technicians can use to assess an index for three parameters representing the functional status of the structure: (a) condition of the structure at the opening of the stream flow, (b) level of obstruction at the structure, and (c) level of erosion in the stream bank. The calculation of each parameter's index is based on fuzzy logic theory, to handle ranges in the precision of the reports and to convert the linguistic rating scales into numbers representing the structure's status. A weighting method and multi-criteria methods (Analytic Hierarchy Process [AHP] and TOPSIS) can be used by technicians to combine the different ratings according to the component elements of the structure and the completeness of the reports, as sketched below. Finally, technicians can set decision rules based on the worst rating and a threshold for the functional indexes. The methodology was implemented as a prototype web-based tool to be tested with Civil Protection technicians in the Fella basin, Northern Italy. Results at this stage comprise the design and implementation of the web-based tool, with GIS interaction, to evaluate available reports and to set priorities for the inspection of structures. Keywords: Decision-making, Multi-criteria methods, Torrent control structures, Web-based tools.
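    As a hedged illustration of the rating-to-index step only: the sketch below maps linguistic volunteer ratings to triangular fuzzy numbers, defuzzifies them by the centroid, and aggregates over a structure's component elements with (e.g., AHP-derived) weights. The linguistic scale, membership functions, element names and weights are all hypothetical; the actual AHP/TOPSIS machinery and decision rules are not reproduced.

```python
# Triangular fuzzy numbers (a, b, c) for a hypothetical linguistic rating scale
TFN = {"good": (0.7, 0.9, 1.0), "fair": (0.4, 0.6, 0.8), "poor": (0.0, 0.2, 0.4)}

def centroid(tfn):
    a, b, c = tfn
    return (a + b + c) / 3.0            # centroid defuzzification of a triangular fuzzy number

def functional_index(ratings, weights):
    """Weighted aggregation of defuzzified volunteer ratings for one parameter
    (e.g., level of obstruction) over the structure's component elements."""
    total_w = sum(weights.values())
    return sum(weights[e] * centroid(TFN[r]) for e, r in ratings.items()) / total_w

ratings = {"opening": "fair", "wings": "good", "apron": "poor"}   # hypothetical elements
weights = {"opening": 0.5, "wings": 0.3, "apron": 0.2}            # e.g., AHP-derived
print(round(functional_index(ratings, weights), 3))
```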

  4. Assessing the oral health of an ageing population: methods, challenges and predictors of survey participation.

    PubMed

    Matthews, Debora C; Brillant, Martha G S; Clovis, Joanne B; McNally, Mary E; Filiaggi, Mark J; Kotzer, Robert D; Lawrence, Herenia P

    2012-06-01

    To examine predictors of participation and to describe the methodological considerations of conducting a two-stage population-based oral health survey. An observational, cross-sectional survey (telephone interview and clinical oral examination) of community-dwelling adults aged 45-64 and ≥65 living in Nova Scotia, Canada was conducted. The survey response rate was 21% for the interview and 13.5% for the examination. A total of 1141 participants completed one or both components of the survey. Both age groups had higher levels of education than the target population; the age 45-64 sample also had a higher proportion of females and lower levels of employment than the target population. Completers (participants who completed interview and examination) were compared with partial completers (who completed only the interview), and stepwise logistic regression was performed to examine predictors of completion. Identified predictors were as follows: not working, post-secondary education and frequent dental visits. Recruitment, communications and logistics present challenges in conducting a province-wide survey. Identification of employment, education and dental visit frequency as predictors of survey participation provide insight into possible non-response bias and suggest potential for underestimation of oral disease prevalence in this and similar surveys. This potential must be considered in analysis and in future recruitment strategies. © 2011 The Gerodontology Society and John Wiley & Sons A/S.

  5. Dense Velocity Field of Turkey

    NASA Astrophysics Data System (ADS)

    Ozener, H.; Aktug, B.; Dogru, A.; Tasci, L.

    2017-12-01

    While GNSS-based crustal deformation studies in Turkey date back to the early 1990s, a homogeneous velocity field utilizing all the available data is still missing. Regional studies employing different site distributions, observation plans, processing software and methodologies create not only reference frame variations but also heterogeneous stochastic models. While the reference frame effect between different velocity fields can easily be removed by estimating a set of rotations, homogenization of the stochastic models of the individual velocity fields requires a more detailed analysis. Using a rigorous Variance Component Estimation (VCE) methodology, we estimated the variance factors for each of the contributing velocity fields and combined them into a single homogeneous velocity field covering the whole of Turkey. Results show that variance factors between velocity fields, including survey-mode and continuous observations, can vary by a few orders of magnitude. In this study, we present the most complete velocity field for Turkey, rigorously combined from 20 individual velocity fields including the 146-station CORS network, with 1072 stations in total. In addition, three GPS campaigns were performed along the North Anatolian Fault and in the Aegean Region to fill the gaps between existing velocity fields. The homogeneously combined new velocity field is nearly complete in terms of geographic coverage and will serve as the basis for further analyses, such as estimation of deformation rates and determination of slip rates across the main fault zones.
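    A greatly simplified sketch of the variance-component idea is given below: each contributing velocity solution's variance factor is re-estimated from its residuals against the weighted combination, and the weights are iterated to a fixed point. Rigorous Helmert-type VCE additionally handles redundancy and reference-frame rotations, so this toy (with synthetic one-component velocities) is only indicative.

```python
import numpy as np

rng = np.random.default_rng(2)
true_v = rng.normal(0.0, 20.0, size=200)          # "true" station velocities, mm/yr (synthetic)
sigmas = (1.0, 3.0, 0.5)                          # unknown noise levels of three solutions
fields = [true_v + rng.normal(0.0, s, size=200) for s in sigmas]

var = np.ones(len(fields))                        # initial variance factors
for _ in range(50):
    w = 1.0 / var
    combined = np.average(fields, axis=0, weights=w)   # weighted combination per station
    var_comb = 1.0 / w.sum()                           # variance of the combined estimate
    # at the fixed point E[(f_i - combined)^2] = sigma_i^2 - Var(combined),
    # hence the correction term added back to the raw residual variance
    var = np.array([np.mean((f - combined) ** 2) for f in fields]) + var_comb

print("estimated std devs:", np.sqrt(var).round(2))    # approx. (1.0, 3.0, 0.5)
```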

  6. Magnetic fluid control for viscous loss reduction of high-speed MRF brakes and clutches with well-defined fail-safe behavior

    NASA Astrophysics Data System (ADS)

    Güth, Dirk; Schamoni, Markus; Maas, Jürgen

    2013-09-01

    No-load losses within brakes and clutches based on magnetorheological fluids are unavoidable and represent a major barrier to their widespread commercial adoption. Completely torque-free rotation is not yet possible because of persistent fluid contact within the shear gap. In this paper, a novel concept is presented that facilitates controlled movement of the magnetorheological fluid from an active, torque-transmitting region into an inactive region of the shear gap. This concept enables complete decoupling of the fluid-engaging surfaces, so that the viscous drag torque can be eliminated. To achieve the desired effect, motion in the magnetorheological fluid is induced by magnetic forces acting on the fluid, which requires an appropriate magnetic circuit design. In this investigation, we propose a methodology to determine suitable magnetic circuit designs with well-defined fail-safe behavior. The magnetically induced motion of magnetorheological fluids is modeled using the Kelvin body force, and a multi-physics domain simulation is performed to elucidate the transitions between the engaged and disengaged operating modes. The modeling approach is validated by captured high-speed video frames that show the induced motion of the magnetorheological fluid due to the magnetic field. Finally, measurements performed with a prototype actuator prove that the viscous drag torque can be reduced significantly by the proposed magnetic fluid control methodology.

  7. Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.

    PubMed

    Grdinić, Vladimir; Vuković, Jadranka

    2004-05-28

    A complete prevalidation, as a basic prevalidation strategy for quality control and standardization of analytical procedures, is introduced. A fast and simple prevalidation methodology based on mathematical/statistical evaluation of a reduced number of experiments (N ≤ 24) was elaborated, and guidelines as well as algorithms are given in detail. This strategy was produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which a linear calibration model, which occurs very often in practice, is the most appropriate fit to the experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of the analytical procedure. The complete prevalidation process includes characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values and extraction of prevalidation parameters. Moreover, a system of diagnosis for each prevalidation step is suggested. As an illustrative example of the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from among a great number of analytical procedures. The favourable metrological characteristics of this analytical procedure, obtained as prevalidation figures of merit, establish prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.

  8. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBSs using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBSs cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBSs, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBSs will be developed, and modification of the SSF to support these tools and methods will be addressed.

  9. Physics-based deformable organisms for medical image analysis

    NASA Astrophysics Data System (ADS)

    Hamarneh, Ghassan; McIntosh, Chris

    2005-04-01

    Previously, "Deformable organisms" were introduced as a novel paradigm for medical image analysis that uses artificial life modelling concepts. Deformable organisms were designed to complement the classical bottom-up deformable models methodologies (geometrical and physical layers), with top-down intelligent deformation control mechanisms (behavioral and cognitive layers). However, a true physical layer was absent and in order to complete medical image segmentation tasks, deformable organisms relied on pure geometry-based shape deformations guided by sensory data, prior structural knowledge, and expert-generated schedules of behaviors. In this paper we introduce the use of physics-based shape deformations within the deformable organisms framework yielding additional robustness by allowing intuitive real-time user guidance and interaction when necessary. We present the results of applying our physics-based deformable organisms, with an underlying dynamic spring-mass mesh model, to segmenting and labelling the corpus callosum in 2D midsagittal magnetic resonance images.

  10. An efficient and provable secure revocable identity-based encryption scheme.

    PubMed

    Wang, Changji; Li, Yuan; Xia, Xiaonan; Zheng, Kangjia

    2014-01-01

    Revocation functionality is necessary and crucial to identity-based cryptosystems. Revocable identity-based encryption (RIBE) has attracted a lot of attention in recent years; many RIBE schemes have been proposed in the literature but have been shown to be either insecure or inefficient. In this paper, we propose a new scalable RIBE scheme with decryption key exposure resilience by combining Lewko and Waters' identity-based encryption scheme with the complete subtree method, and we prove our RIBE scheme to be semantically secure using the dual system encryption methodology. Compared to existing scalable and semantically secure RIBE schemes, our proposed scheme is more efficient in terms of ciphertext size, public parameter size and decryption cost, at the price of a slightly looser security reduction. To the best of our knowledge, this is the first construction of a scalable and semantically secure RIBE scheme with constant-size public system parameters.
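    The complete subtree (CS) method the scheme borrows has a short, standard construction (Naor-Naor-Lotspiech): given the revoked leaves of a full binary tree of users, key updates are published for the minimal set of subtree roots covering exactly the non-revoked leaves. A sketch of that cover computation, with heap-style node indexing, follows; the cryptographic layers (Lewko-Waters IBE, dual system proof) are of course not reproduced.

```python
def cs_cover(depth, revoked_leaves):
    """Minimal set of subtree roots covering all non-revoked leaves of a
    full binary tree of the given depth (root = node 1)."""
    n_leaves = 1 << depth
    first_leaf = n_leaves                       # leaves occupy indices [2^d, 2^(d+1))
    steiner = set()                             # nodes on paths from revoked leaves to root
    for leaf in revoked_leaves:
        node = first_leaf + leaf
        while node >= 1:
            steiner.add(node)
            node //= 2
    if not steiner:
        return {1}                              # nobody revoked: the root covers everyone
    cover = set()
    for node in steiner:
        for child in (2 * node, 2 * node + 1):
            if child < 2 * n_leaves and child not in steiner:
                cover.add(child)                # hang a covering subtree off the Steiner tree
    return cover

print(sorted(cs_cover(3, [0, 5])))   # 8 users, revoke leaves 0 and 5 -> [5, 7, 9, 12]
```

    In the RIBE setting, the key-update size is proportional to this cover, which is what gives the scheme its logarithmic scalability in the number of users.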

  11. Carbon dioxide storage in unconventional reservoirs workshop: summary of recommendations

    USGS Publications Warehouse

    Jones, Kevin B.; Blondes, Madalyn S.

    2015-01-01

    The storage capacity for all unconventional reservoirs may be modeled using a volumetric equation starting with the extent of the rock unit and adjusted using these key factors and reaction terms. The ideas that were developed during this workshop can be used by USGS scientists to develop a methodology to assess the CO2 storage resource in unconventional reservoirs. This methodology could then be released for public comment and peer review. After completing this development process, the USGS could then use the methodology to assess the CO2 storage resource in unconventional reservoirs.

  12. The space station assembly phase: Flight telerobotic servicer feasibility. Volume 2: Methodology and case study

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Gyamfi, Max A.; Volkmer, Kent; Zimmerman, Wayne F.

    1987-01-01

    A methodology is described for examining the feasibility of a Flight Telerobotic Servicer (FTS) using two assembly scenarios, defined at the EVA task level, for the 30 shuttle flights (beginning with MB-1) over a four-year period. Performing all EVA tasks by crew only is compared to a scenario in which crew EVA is augmented by FTS. A reference FTS concept is used as a technology baseline and life-cycle cost analysis is performed to highlight cost tradeoffs. The methodology, procedure, and data used to complete the analysis are documented in detail.

  13. Complete Description of Forces Acting on a Flying Beach Volleyball

    NASA Astrophysics Data System (ADS)

    Dumek, Jan; Šafařík, Pavel

    2018-06-01

    A complete description of all forces acting on a flying beach volleyball was developed based on measurements in a wind tunnel. Forces (drag, lift and side force) were measured for angles of attack β from 0° to 47°, velocities from 10 to 25 m/s, and spin rates from 0 to 12.5 rps. Moments (roll, yaw, pitch) were also detected. Results are described by means of non-dimensional numbers, such as the Reynolds number Re, the spin ratio s, and the drag (CD), lift (CL) and side-force (CS) coefficients. Differences in CD, CL and CS were detected for various angles β and are described further in the article. Conclusions of the investigation can be utilized, first, by ball producers for practical use in development; second, by sports methodologists to build a more exact training methodology for beach volleyball; and third, in basic and applied aerodynamic research.

  14. Community Mental Health Providers' Beliefs About Addressing Weight Loss Among Youth Clients with Serious Emotional Disturbance and Overweight/Obesity: An Elicitation Study.

    PubMed

    Wykes, Thomas L; Bourassa, Katelynn A; Slosser, Andrea E; McKibbin, Christine L

    2018-02-09

    Youth with Serious Emotional Disturbance (SED) have high rates of overweight/obesity. Factors influencing mental health provider intentions to deliver weight-related advice are unclear. This study used qualitative methodology and Theory of Planned Behavior (TPB) constructs to examine these factors. Community mental health providers serving youth with SED were recruited via convenience sampling and an online provider list. Participants completed an open-ended TPB-based questionnaire online. Content analysis identified thematic beliefs. Twenty-one providers completed the questionnaire. Providers identified behavioral beliefs (e.g., client defensiveness), normative beliefs (e.g., medical professionals), and control beliefs (e.g., limited resources) that impact decisions to provide weight-related advice. Knowledge of factors that may influence providers' delivery of weight-related advice may lead to more effective healthy lifestyle programming for youth with SED.

  15. Prediction of SOFC Performance with or without Experiments: A Study on Minimum Requirements for Experimental Data

    DOE PAGES

    Yang, Tao; Sezer, Hayri; Celik, Ismail B.; ...

    2015-06-02

    In the present paper, a physics-based procedure combining experiments and multi-physics numerical simulations is developed for overall analysis of SOFC operational diagnostics and performance prediction. In this procedure, essential information about the fuel cell is first extracted by utilizing empirical polarization analysis in conjunction with experiments, and then refined by multi-physics numerical simulations via simultaneous analysis and calibration of the polarization curve and impedance behavior. The performance at different utilization cases and operating currents is also predicted to confirm the accuracy of the proposed model. It is demonstrated that, with the present electrochemical model, three air/fuel flow conditions are needed to produce a set of complete data for better understanding of the processes occurring within SOFCs. After calibration against button cell experiments, the methodology can be used to assess the performance of a planar cell without further calibration. The proposed methodology would accelerate the calibration process and improve the efficiency of design and diagnostics.

  16. Mark Making: Methodologies and methods (innovative practice).

    PubMed

    Zeilig, Hannah

    2016-09-01

    Mark Making is a recently completed AHRC-funded review exploring the role of the participative arts for people with dementia in the UK. Key concerns underlying Mark Making were both how to privilege the views and feelings of people with a dementia and also how best to understand the value of the arts for people with a dementia. These issues were tackled using a variety of qualitative methods. Methods included a rigorous literature review, the development of a unique web-based map locating many participative arts projects and above all working with people with a dementia to ascertain their views. This brief article will concentrate on some of the innovative methods that the Mark Making team used, with particular reference to comics as a mode of engagement as used in the Descartes project. The article will provide an insight into some of the methodological challenges confronted by Mark Making as well as the inspirations and successes that were enjoyed. © The Author(s) 2015.

  17. Q-FANs™ for general aviation aircraft

    NASA Technical Reports Server (NTRS)

    Worobel, R.; Mayo, M. G.

    1973-01-01

    Continued growth of general aviation over the next 10 to 15 years is dependent on continuing improvement in aircraft safety, utility, performance and cost. Moreover, these advanced aircraft will need to conform to expected government regulations controlling propulsion system emissions and noise levels. An attractive, compact, low-noise propulsor concept, the Q-FAN™, when matched to piston, rotary combustion, or gas turbine engines, opens up the exciting prospect of new, cleaner airframe designs for the next generation of general aviation aircraft that will provide these improvements and meet the expected noise and pollution restrictions of the 1980 time period. New methodology derived to predict Q-FAN noise, weight and cost is presented. Based on this methodology, Q-FAN propulsion system performance, weight, noise, and cost trends are discussed. The impact of this propulsion system type on the complete aircraft is then investigated for several representative aircraft size categories. Finally, example conceptual designs for Q-FAN/engine integration and aircraft installations are presented.

  18. Development of Boundary Condition Independent Reduced Order Thermal Models using Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Raghupathy, Arun; Ghia, Karman; Ghia, Urmila

    2008-11-01

    Compact Thermal Models (CTMs) representing IC packages have traditionally been developed using the DELPHI (DEvelopment of Libraries of PHysical models for an Integrated design) methodology. The drawbacks of this method are presented, and an alternative method is proposed. A reduced-order model that provides complete thermal information accurately with fewer computational resources can be used effectively in system-level simulations. Proper Orthogonal Decomposition (POD), a statistical method, can be used to reduce the number of degrees of freedom, or variables, in the computations for such a problem. POD together with Galerkin projection allows us to create reduced-order models that reproduce the characteristics of the system with a considerable reduction in computational resources while maintaining a high level of accuracy. The goal of this work is to show that this method can be applied to obtain a boundary-condition-independent reduced-order thermal model for complex components. The methodology is applied to the 1D transient heat equation.
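    The snapshot form of POD reduces to a singular value decomposition, which makes a compact sketch possible. The example below extracts a modal basis from synthetic 1D snapshot data and performs a Galerkin-style projection of a toy linear operator; the snapshot data, energy cutoff and operator are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 200)
snapshots = np.stack([np.sin(np.pi * x) * np.exp(-0.05 * t) +
                      0.1 * np.sin(3 * np.pi * x) * np.cos(t) +
                      0.001 * rng.normal(size=x.size)
                      for t in range(100)], axis=1)          # 200 DOF x 100 snapshots

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1                  # modes capturing 99.99% of energy
Phi = U[:, :r]                                                # POD basis
print(f"{r} modes retained out of {snapshots.shape[1]} snapshots")

# Galerkin-style reduction of a linear system du/dt = A u  ->  da/dt = (Phi^T A Phi) a
A = -0.05 * np.eye(200)                                       # toy linear operator (hypothetical)
A_r = Phi.T @ A @ Phi                                         # r x r reduced operator
print("reduced operator shape:", A_r.shape)
```

    The boundary-condition independence pursued in the paper comes from how the snapshot set is generated, not from the SVD step itself, which is why the snapshot ensemble above is only a stand-in.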

  19. Monitoring the Impact of Solution Concepts within a Given Problematic

    NASA Astrophysics Data System (ADS)

    Cavallucci, Denis; Rousselot, François; Zanni, Cecilia

    It is acknowledged that one of the most critical issues facing today's organizations concerns the substantial leaps required to structure innovation methodologically. Among other published work, some authors suggest that a complete rethinking of current practices is required. In this article, we propose a methodology that aims to provide controlled R&D choices based on monitoring the impact that Solution Concepts have on a problematic situation. This problematic situation is initially modeled in graph form, as a Problem Graph. The objective is to assist R&D managers in choosing which activities to support and to bring them concrete arguments to defend their choices. We postulate that by improving the robustness of such approaches we help decision makers switch from intuitive decisions (mostly built upon their past experiences, fear of risk, and awareness of the company's level of acceptance of novelties) to thoroughly constructed inventive problem-solving strategies. Our approach is discussed using a computer application that illustrates our hypothesis after being tested in several industrial applications.

  20. Development of Detonation Modeling Capabilities for Rocket Test Facilities: Hydrogen-Oxygen-Nitrogen Mixtures

    NASA Technical Reports Server (NTRS)

    Allgood, Daniel C.

    2016-01-01

    The objective of the presented work was to develop validated computational fluid dynamics (CFD) based methodologies for predicting propellant detonations and their associated blast environments. Applications of interest were scenarios relevant to rocket propulsion test and launch facilities. All model development was conducted within the framework of the Loci/CHEM CFD tool because of its reliability and robustness in predicting high-speed combusting flow fields associated with rocket engines and plumes. During the course of the project, verification and validation studies were completed for hydrogen-fueled detonation phenomena such as shock-induced combustion, confined detonation waves, vapor cloud explosions, and deflagration-to-detonation transition (DDT) processes. The DDT validation cases included predicting flame acceleration mechanisms associated with turbulent flame-jets and flow obstacles. Excellent agreement between test data and model predictions was observed. The proposed CFD methodology was then successfully applied to model a detonation event that occurred during liquid oxygen/gaseous hydrogen rocket diffuser testing at NASA Stennis Space Center.

  1. A framework for an alternatives assessment dashboard for evaluating chemical alternatives applied to flame retardants for electronic applications

    PubMed Central

    Martin, Todd M.

    2017-01-01

    The goal of alternatives assessment (AA) is to facilitate a comparison of alternatives to a chemical of concern, resulting in the identification of safer alternatives. A two-stage methodology for comparing chemical alternatives was developed. In the first stage, alternatives are compared using a variety of human health effects, ecotoxicity, and physicochemical properties. Hazard profiles are completed using a variety of online sources and quantitative structure activity relationship models. In the second stage, alternatives are evaluated using an exposure/risk assessment over the entire life cycle. Exposure values are calculated using screening-level near-field and far-field exposure models. The second stage allows one to compare potential exposure to each alternative more accurately and to consider additional factors that may not be obvious from separate binned persistence, bioaccumulation, and toxicity scores. The methodology was utilized to compare phosphate-based alternatives for decabromodiphenyl ether (decaBDE) in electronics applications. PMID:29333139

  2. A method to identify and analyze biological programs through automated reasoning

    PubMed Central

    Yordanov, Boyan; Dunn, Sara-Jane; Kugler, Hillel; Smith, Austin; Martello, Graziano; Emmott, Stephen

    2016-01-01

    Predictive biology is elusive because rigorous, data-constrained, mechanistic models of complex biological systems are difficult to derive and validate. Current approaches tend to construct and examine static interaction network models, which are descriptively rich, but often lack explanatory and predictive power, or dynamic models that can be simulated to reproduce known behavior. However, in such approaches implicit assumptions are introduced as typically only one mechanism is considered, and exhaustively investigating all scenarios is impractical using simulation. To address these limitations, we present a methodology based on automated formal reasoning, which permits the synthesis and analysis of the complete set of logical models consistent with experimental observations. We test hypotheses against all candidate models, and remove the need for simulation by characterizing and simultaneously analyzing all mechanistic explanations of observed behavior. Our methodology transforms knowledge of complex biological processes from sets of possible interactions and experimental observations to precise, predictive biological programs governing cell function. PMID:27668090

  3. Exploring nursing educators' use of theory and methods in search for evidence based credibility in nursing education.

    PubMed

    Beccaria, Lisa; Kek, Megan Y C A; Huijser, Henk

    2018-06-01

    In this paper, a review of nursing education literature is employed to ascertain the extent to which nursing educators apply theory to their research, as well as the types of theory they employ. In addition, the use of research methodologies in the nursing education literature is explored. An integrative review. A systematic search was conducted for English-language, peer reviewed publications of any research design via Academic Search Complete, Science Direct, CINAHL, and Health Source: Nursing/Academic Edition databases from 2001 to 2016, of which 140 were reviewed. The findings suggest that within current nursing education literature the scholarship of discovery, and the exploration of epistemologies other than nursing, in particular as they relate to teaching and learning, shows significant potential for expansion and diversification. The analysis highlights opportunities for nursing educators to incorporate broader theoretical, pedagogical, methodological and philosophical perspectives within teaching and the scholarship of teaching. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Assessment Methodology for Process Validation Lifecycle Stage 3A.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana

    2017-07-01

    The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small-molecule as well as bio-pharma products for which substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: determination of the number of Stage 3A batches; evaluation of critical material attributes, critical process parameters, and critical quality attributes; in vivo-in vitro correlation; estimation of inherent process variability (IPV) and the PaCS index; the process capability and quality dashboard (PCQd); and an enhanced control strategy. The US FDA guidance on Process Validation: General Principles and Practices (January 2011) encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. The elements of the Stage 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.
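    Of the Stage 3A elements listed, process capability has a standard textbook form that can be sketched directly; the IPV and PaCS metrics are the paper's own and are not reproduced here. The assay data and specification limits below are hypothetical.

```python
import statistics

def cpk(data, lsl, usl):
    """Process capability index: distance from the mean to the nearer
    specification limit, in units of three standard deviations."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    return min(usl - mu, mu - lsl) / (3 * sigma)

assay = [99.1, 100.2, 99.8, 100.5, 99.6, 100.0, 99.3, 100.7]   # % label claim (hypothetical)
print(round(cpk(assay, lsl=95.0, usl=105.0), 2))
```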

  5. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    PubMed

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
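    For orientation, the plain global-best PSO building block underlying such methods is sketched below on the Rastrigin function; the consensus mechanism and the Trust-Tech stage (systematic escape from local optimal solutions) that distinguish the paper's three-stage methodology are not reproduced.

```python
import numpy as np

def pso(f, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimization over [-5, 5]^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim)); v = np.zeros((n, dim))
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]      # personal bests
        gbest = pbest[pbest_f.argmin()].copy()                      # swarm best
    return gbest, pbest_f.min()

rastrigin = lambda z: 10 * len(z) + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))
print(pso(rastrigin, dim=5))
```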

  6. Markerless 3D motion capture for animal locomotion studies

    PubMed Central

    Sellers, William Irvin; Hirasaki, Eishi

    2014-01-01

    Obtaining quantitative data describing the movements of animals is an essential step in understanding their locomotor biology. Outside the laboratory, measuring animal locomotion often relies on video-based approaches, and analysis is hampered by difficulties in calibration and the often limited availability of possible camera positions. It is also usually restricted to two dimensions, which is often an undesirable over-simplification given the essentially three-dimensional nature of many locomotor performances. In this paper we demonstrate a fully three-dimensional approach based on 3D photogrammetric reconstruction using multiple, synchronised video cameras. This approach allows full calibration based on the separation of the individual cameras and will work fully automatically with completely unmarked and undisturbed animals. As such it has the potential to revolutionise work carried out on free-ranging animals in sanctuaries and zoological gardens, where ad hoc approaches are essential and access within enclosures is often severely restricted. The paper demonstrates the effectiveness of video-based 3D photogrammetry with examples from primates and birds, as well as discussing the current limitations of this technique and illustrating the accuracies that can be obtained. All the software required is open source, so this can be a very cost-effective approach that provides a methodology for obtaining data in situations where other approaches would be completely ineffective. PMID:24972869

  7. Experience-Sampling Methodology with a Mobile Device in Fibromyalgia

    PubMed Central

    Diana, Castilla; Cristina, Botella; Azucena, García-Palacios; Luis, Farfallini; Ignacio, Miralles

    2012-01-01

    This work describes the usability studies conducted in the development of an experience-sampling methodology (ESM) system running on a mobile device. The goal of the system is to improve the accuracy and ecological validity of gathering daily self-report data from individuals suffering from a chronic pain condition, fibromyalgia. The usability studies showed that the software developed to conduct ESM with mobile devices (smartphones, cell phones) can be used successfully by individuals with fibromyalgia of different ages and with little expertise in the use of information and communication technologies. 100% of users completed the tasks successfully, even though some were completely illiterate. There also seems to be a clear difference in the modes of interaction observed in the two studies carried out. PMID:23304132

  8. Methodological Investigation of the Curriculum Evaluation Theses Completed between the Years 2006-2015 in Turkey

    ERIC Educational Resources Information Center

    Aslan, Mecit; Saglam, Mustafa

    2017-01-01

    The aim of this research is to examine postgraduate theses on curriculum evaluation completed between the years 2006-2015 in Turkey in terms of various aspects such as university, year, curriculum which is evaluated, curriculum evaluation model, research method, design, sample type, data collection methods, data analysis technique. In order to…

  9. GED® Completers' Perceptions of College Readiness and Social Capital: Linking Adult Literacy to a Greater Quality of Life

    ERIC Educational Resources Information Center

    Lott, Donalyn; O'Dell, Jade

    2014-01-01

    This study examined the efficacy of General Educational Development (GED®) acquisition and GED® completers' perceptions of college readiness and social capital using a quantitative methodology. The study used a descriptive, cross-sectional research design framed by the social capital theoretical perspective. The conceptual framework developed…

  10. Benchmarking Course Completion Rates: A Method with an Example from the British Columbia Open University

    ERIC Educational Resources Information Center

    Giguere, Louis

    2007-01-01

    We report findings on the methodological phase of a research project designed to assess the progress of the British Columbia Open University (BCOU) toward a 1997 goal of increasing distance education course completion rates to British Columbia system levels by adapting existing "off-line" courses for online delivery (a virtualization…

  11. HTA decision support system for sustainable business continuity management in hospitals. The case of surgical activity at the University Hospital in Florence.

    PubMed

    Miniati, Roberto; Dori, Fabrizio; Cecconi, Giulio; Gusinu, Roberto; Niccolini, Fabrizio; Gentili, Guido Biffi

    2013-01-01

    A fundamental element of the social and safety function of a health structure is the need to guarantee continuity of clinical activity through continuity of technology. This paper aims to design a Decision Support System (DSS) for medical technology evaluations, based on the use of Key Performance Indicators (KPIs), in order to provide a multi-disciplinary evaluation of a technology in a health structure. The methodology used in planning the DSS followed these key steps: definition of relevant KPIs, development of a database to calculate the KPIs, calculation of the defined KPIs, and the resulting study report. Finally, clinical and economic validation of the system was conducted through a case study of business continuity applied in the operating department of the Florence University Hospital AOU Careggi in Italy. A web-based support system was designed for HTA in health structures. The case study enabled Business Continuity Management (BCM) to be implemented in a hospital department with respect to aspects of a single technology and the specific clinical process. Finally, an economic analysis of the procedure was carried out. The system is useful for decision makers in that it precisely defines which equipment to include in the BCM procedure, using a scale analysis of the specific clinical process in which the equipment is used. In addition, the economic analysis shows that the cost of the procedure is completely offset by the indirect costs that would otherwise result from a broken device, demonstrating that the methodology is fully self-sustaining.

  12. Estimating a Service-Life Distribution Based on Production Counts and a Failure Database

    DOE PAGES

    Ryan, Kenneth J.; Hamada, Michael Scott; Vardeman, Stephen B.

    2017-04-01

    A manufacturer wanted to compare the service-life distributions of two similar products. These concern product lifetimes after installation (not manufacture). For each product, there were available production counts and an imperfect database providing information on failing units. In the real case, these units were expensive repairable units warrantied against repairs. Failure (of interest here) was relatively rare and driven by a different mode/mechanism than ordinary repair events (not of interest here). Approach: Data models for the service life based on a standard parametric lifetime distribution and a related limited failure population were developed. These models were used to develop expressions for the likelihood of the available data that properly accounts for information missing in the failure database. Results: A Bayesian approach was employed to obtain estimates of model parameters (with associated uncertainty) in order to investigate characteristics of the service-life distribution. Custom software was developed and is included as Supplemental Material to this case study. One part of a responsible approach to the original case was a simulation experiment used to validate the correctness of the software and the behavior of the statistical methodology before using its results in the application, and an example of such an experiment is included here. Because of confidentiality issues that prevent use of the original data, simulated data with characteristics like the manufacturer's proprietary data are used to illustrate some aspects of our real analyses. Lastly, we also note that, although this case focuses on rare and complete product failure, the statistical methodology provided is directly applicable to more standard warranty data problems involving typically much larger warranty databases where entries are warranty claims (often for repairs) rather than reports of complete failures.
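    A hedged sketch of the "limited failure population" likelihood described above: only a fraction p of installed units will ever fail, failures follow a parametric (here Weibull) lifetime distribution, and units absent from the failure database contribute survival terms at their current age. Maximum-likelihood fitting stands in for the case study's Bayesian estimation, and the synthetic data are illustrative only.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_lik(params, fail_times, survivor_ages):
    """Limited failure population: F(t) = p * F_weibull(t)."""
    logit_p, log_shape, log_scale = params          # unconstrained parameterization
    p = 1 / (1 + np.exp(-logit_p)); k = np.exp(log_shape); lam = np.exp(log_scale)
    ll = np.sum(np.log(p) + weibull_min.logpdf(fail_times, k, scale=lam))       # observed failures
    ll += np.sum(np.log(1 - p * weibull_min.cdf(survivor_ages, k, scale=lam)))  # still in service
    return -ll

rng = np.random.default_rng(4)
fails = weibull_min.rvs(2.0, scale=8.0, size=40, random_state=rng)   # synthetic failure ages (years)
survivors = rng.uniform(0, 12, size=2000)                            # ages of units with no failure record
res = minimize(neg_log_lik, x0=[0.0, 0.0, 1.0], args=(fails, survivors))
p, k, lam = 1 / (1 + np.exp(-res.x[0])), np.exp(res.x[1]), np.exp(res.x[2])
print(f"p = {p:.3f}, shape = {k:.2f}, scale = {lam:.2f}")
```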

  13. Application of design for six sigma methodology on portable water filter that uses membrane filtration system: A preliminary study

    NASA Astrophysics Data System (ADS)

    Fahrul Hassan, Mohd; Jusoh, Suhada; Zaini Yunos, Muhamad; Arifin, A. M. T.; Ismail, A. E.; Rasidi Ibrahim, M.; Zulafif Rahim, M.

    2017-09-01

    Portable water filters have grown significantly in recent years. The use of refillable water bottles with a hand-pump water filtration unit has been suggested as a replacement for bottled water during outdoor recreational activities and for emergency supplies. However, water quality remains an issue, with contamination from residual plant waste, bacteria, and so on. Based on these issues, a study was carried out to design a portable water filter that uses a membrane filtration system by applying Design for Six Sigma. The Design for Six Sigma methodology consists of five stages: Define, Measure, Analyze, Design and Verify. Several tools were used in each stage, each with a specific objective. In the Define stage, a questionnaire approach was used to identify potential users' needs for future portable water filters. Next, the Quality Function Deployment (QFD) tool was used in the Measure stage to translate the users' needs into engineering characteristics. Based on the information from the Measure stage, morphological chart and weighted decision matrix tools were used in the Analyze stage, which covered concept generation and selection (a sketch of the weighted decision matrix follows below). Once selection of the final concept was completed, detail drawings were made in the Design stage. A prototype was then developed in the Verify stage for proof-of-concept testing. The results obtained from each stage are reported in this paper. From this study, it can be concluded that the application of Design for Six Sigma in designing a future portable water filter that uses a membrane filtration system is a good starting point in the search for a new alternative concept, with complete supporting documentation.
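    The weighted decision matrix used in the Analyze stage is simple enough to sketch; the criteria, weights and concept scores below are hypothetical.

```python
# Weighted decision matrix: each concept's total is the weight-scaled sum of its criterion scores
criteria_weights = {"filtration quality": 0.35, "portability": 0.25,
                    "ease of use": 0.20, "cost": 0.20}
scores = {                                            # 1 (poor) to 5 (excellent)
    "concept A": {"filtration quality": 4, "portability": 3, "ease of use": 4, "cost": 3},
    "concept B": {"filtration quality": 5, "portability": 2, "ease of use": 3, "cost": 2},
    "concept C": {"filtration quality": 3, "portability": 5, "ease of use": 4, "cost": 4},
}
totals = {c: sum(criteria_weights[k] * v for k, v in s.items()) for c, s in scores.items()}
print(max(totals, key=totals.get), totals)            # winning concept and all totals
```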

  15. Measuring the effect of improvement in methodological techniques on data collection in the Gharbiah population-based cancer registry in Egypt: Implications for other Low- and Middle-Income Countries.

    PubMed

    Smith, Brittney L; Ramadan, Mohamed; Corley, Brittany; Hablas, Ahmed; Seifeldein, Ibrahim A; Soliman, Amr S

    2015-12-01

    The purpose of this study was to describe and quantify procedures and methods that maximized the efficiency of the Gharbiah Population-based Cancer Registry (GPCR), the only population-based cancer registry in Egypt. The procedures and measures included a locally developed software program to translate names from Arabic to English, a new national ID number for demographic and occupational information, and linkage of cancer cases to the new electronic mortality records of the Ministry of Health. Data were compiled for the 34,058 cases in the registry for the years 1999-2007. Cases and registry variables covering demographic and clinical information were reviewed by year to assess trends associated with each new method or procedure during the study period. The introduction of the name-translation software, in conjunction with other demographic variables, increased the identification of detected duplicates from 23.4% to 78.1%. Use of the national ID increased the proportion of cases with occupation information from 27% to 89%. Records with complete mortality information increased from 18% to 43%. The proportion of cases registered from death certificates only decreased from 9.8% to 4.7%. Overall, the study revealed that introducing and utilizing local, culture-specific methodological changes, software, and electronic non-cancer databases had a significant impact on data quality and completeness. This study may have translational implications for improving the quality of cancer registries in LMICs, considering the emerging advances in electronic databases and the utilization of health software and computerization of data. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. A validation method for near-infrared spectroscopy based tissue oximeters for cerebral and somatic tissue oxygen saturation measurements.

    PubMed

    Benni, Paul B; MacLeod, David; Ikeda, Keita; Lin, Hung-Mo

    2018-04-01

    We describe the validation methodology for the NIRS-based FORE-SIGHT ELITE® (CAS Medical Systems, Inc., Branford, CT, USA) tissue oximeter for cerebral and somatic tissue oxygen saturation (StO2) measurements in adult subjects, submitted to the United States Food and Drug Administration (FDA) to obtain clearance for clinical use. This validation methodology evolved from a history of NIRS validations in the literature and the FDA-recommended use of Deming regression and bootstrapping statistical validation methods. For cerebral validation, forehead cerebral StO2 measurements were compared to a weighted 70:30 reference (REFCXB) of co-oximeter internal jugular venous and arterial blood saturations of healthy adult subjects during a controlled hypoxia sequence, with a sensor placed on the forehead. For somatic validation, somatic StO2 measurements were compared to a weighted 70:30 reference (REFCXS) of co-oximetry central venous and arterial saturation values following a similar protocol, with sensors placed on the flank, quadriceps muscle, and calf muscle. With informed consent, 25 subjects successfully completed the cerebral validation study. The bias and precision (1 SD) of cerebral StO2 compared to REFCXB were -0.14 ± 3.07%. With informed consent, 24 subjects successfully completed the somatic validation study. The bias and precision of somatic StO2 compared to REFCXS were 0.04 ± 4.22%, taken from the average of the flank, quadriceps, and calf StO2 measurements to best represent the global whole-body REFCXS. The NIRS validation methods presented potentially provide a reliable means to test NIRS monitors and qualify them for clinical use.
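    The headline statistics are straightforward to reproduce in miniature: form the 70:30 venous:arterial weighted reference and summarize the paired differences as bias and precision (1 SD). The arrays below are hypothetical; the Deming regression and bootstrapping steps recommended by the FDA are not shown.

```python
import numpy as np

sjvo2 = np.array([62.0, 55.0, 48.0, 41.0, 35.0])   # jugular venous saturation, % (hypothetical)
sao2  = np.array([97.0, 90.0, 82.0, 74.0, 66.0])   # arterial saturation, % (hypothetical)
sto2  = np.array([72.5, 65.0, 58.0, 51.5, 44.0])   # NIRS cerebral StO2 readings, % (hypothetical)

ref = 0.7 * sjvo2 + 0.3 * sao2                     # REFCXB: 70:30 venous:arterial weighting
diff = sto2 - ref                                  # paired device-minus-reference differences
print(f"bias = {diff.mean():+.2f}%, precision (1 SD) = {diff.std(ddof=1):.2f}%")
```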

  17. Analysis methods for Thematic Mapper data of urban regions

    NASA Technical Reports Server (NTRS)

    Wang, S. C.

    1984-01-01

    Studies have indicated the difficulty of deriving a detailed land-use/land-cover classification for heterogeneous metropolitan areas with Landsat MSS and TM data. The major methodological issues of digital analysis that may have affected the classification results are examined. In response to these methodological issues, a multichannel hierarchical clustering algorithm has been developed and tested for a more complete analysis of the data for urban areas.
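
    For orientation, the sketch below applies generic hierarchical clustering to multichannel pixel vectors; the seven-band stand-in cube, Ward linkage, and cluster count are assumptions for illustration, not the algorithm developed in the study.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical TM-like cube (rows x cols x bands) reshaped to pixel vectors.
    cube = np.random.rand(50, 50, 7)              # 7 spectral channels, stand-in data
    pixels = cube.reshape(-1, cube.shape[-1])

    # Agglomerative (Ward) clustering on the multichannel feature vectors;
    # subsample for tractability, since linkage is O(n^2) in memory.
    idx = np.random.choice(len(pixels), size=1000, replace=False)
    Z = linkage(pixels[idx], method="ward")
    labels = fcluster(Z, t=8, criterion="maxclust")   # e.g., 8 land-cover clusters
    ```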

  18. Complete Prevention of Dendrite Formation in Zn Metal Anodes by Means of Pulsed Charging Protocols.

    PubMed

    Garcia, Grecia; Ventosa, Edgar; Schuhmann, Wolfgang

    2017-06-07

    Zn metal anodes in rechargeable batteries, such as Zn/air or Zn/Ni, suffer from poor cyclability. The formation of Zn dendrites upon cycling is the key limiting step. We report a systematic study of the influence of pulsed electroplating protocols on the formation of Zn dendrites and, in turn, on strategies to completely prevent Zn dendrite formation. Because of the large number of variables in electroplating protocols, a scanning droplet cell technique was adapted as a high-throughput methodology in which a descriptor of the surface roughness can be derived in situ by means of electrochemical impedance spectroscopy. Upon optimizing the electroplating protocol by controlling nucleation, zincate ion depletion, and zincate ion diffusion, scanning electron microscopy and atomic force microscopy confirmed the growth of uniform and homogeneous Zn deposits with complete prevention of dendrite growth. The implementation of pulsed electroplating as the charging protocol for commercially available Ni-Zn batteries leads to substantially prolonged cyclability, demonstrating the benefits of pulsed charging in Zn metal-based batteries.
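
    For intuition, a pulsed plating (charging) current can be described by a simple square-wave profile, as in the sketch below. The peak current and on/off times are placeholders only, since the point of the study is precisely that these parameters were screened systematically rather than fixed a priori.

    ```python
    import numpy as np

    def pulsed_current(i_peak, t_on, t_off, n_cycles, dt=1e-4):
        """Square-wave plating current: i_peak during t_on, rest (0 A) during t_off."""
        t_cycle = t_on + t_off
        t = np.arange(0.0, n_cycles * t_cycle, dt)
        i = np.where((t % t_cycle) < t_on, i_peak, 0.0)
        return t, i

    # e.g., 20 mA pulses, 10 ms on / 40 ms off, 100 cycles (illustrative values)
    t, i = pulsed_current(0.020, 0.010, 0.040, 100)
    ```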

  19. Spectral Target Detection using Schroedinger Eigenmaps

    NASA Astrophysics Data System (ADS)

    Dorado-Munoz, Leidy P.

    Applications of optical remote sensing processes include environmental monitoring, military monitoring, meteorology, mapping, surveillance, etc. Many of these tasks include the detection of specific objects or materials, usually few or small, which are surrounded by other materials that clutter the scene and hide the relevant information. This target detection process has lately been boosted by the use of hyperspectral imagery (HSI), since its high spectral dimension provides the detailed spectral information that is desirable in data exploitation. Typical spectral target detectors rely on statistical or geometric models to characterize the spectral variability of the data. However, in many cases these parametric models do not fit HSI data well, which degrades detection performance. On the other hand, non-linear transformation methods, mainly based on manifold learning algorithms, have shown potential for HSI transformation, dimensionality reduction and classification. In target detection, non-linear transformation algorithms are used as preprocessing techniques that transform the data to a more suitable lower-dimensional space, where the statistical or geometric detectors are applied. One of these non-linear manifold methods is the Schroedinger Eigenmaps (SE) algorithm, which has been introduced as a technique for semi-supervised classification. The core tool of the SE algorithm is the Schroedinger operator, which includes a potential term that encodes prior information about the materials present in a scene and enables the embedding to be steered in convenient directions in order to cluster similar pixels together. A novel target detection methodology based on the SE algorithm is proposed in this thesis. The proposed methodology not only transforms the data to a lower-dimensional space but also defines a detector that capitalizes on the theory behind SE. The fact that target pixels and similar pixels are clustered in a predictable region of the low-dimensional representation is used to define a decision rule that distinguishes target pixels from the remaining pixels in a given image. In addition, a knowledge propagation scheme is used to combine spectral and spatial information as a means to propagate the "potential constraints" to nearby points. The propagation scheme is introduced to reinforce weak connections and improve the separability between most of the target pixels and the background. Experiments using different HSI data sets are carried out in order to test the proposed methodology. The assessment is performed from a quantitative and qualitative point of view, and by comparing the SE-based methodology against two other detection methodologies that use linear/non-linear algorithms as transformations and the well-known Adaptive Coherence/Cosine Estimator (ACE) detector. Overall results show that the SE-based detector outperforms the other two detection methodologies, which indicates the usefulness of the SE transformation in spectral target detection problems.
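
    A rough sketch of the core SE construction described above, assuming pixel spectra in a matrix X and a handful of labeled target indices: a kNN graph Laplacian is augmented with a diagonal potential on the target pixels, and the smallest eigenvectors of the resulting Schroedinger operator give the embedding. The alpha weight, neighborhood size, and function name are illustrative assumptions, not values from the thesis.

    ```python
    import numpy as np
    from sklearn.neighbors import kneighbors_graph
    from scipy.sparse import diags
    from scipy.sparse.csgraph import laplacian
    from scipy.sparse.linalg import eigsh

    def schroedinger_eigenmaps(X, target_idx, k=10, alpha=100.0, n_dims=10):
        """Minimal SE sketch: graph Laplacian plus a diagonal 'potential' that
        pins labeled target pixels, steering the embedding.  X is
        (n_pixels, n_bands); target_idx indexes known target spectra."""
        W = kneighbors_graph(X, n_neighbors=k, mode="connectivity",
                             include_self=False)
        W = 0.5 * (W + W.T)                    # symmetrize the adjacency
        L = laplacian(W, normed=True)
        v = np.zeros(X.shape[0])
        v[target_idx] = 1.0                    # potential on target pixels
        S = L + alpha * diags(v)               # Schroedinger operator
        vals, vecs = eigsh(S, k=n_dims, which="SM")
        return vecs                            # smallest eigenvectors = embedding
    ```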

  20. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using Ultrasound and "Braslet-M" Occlusion Cuffs

    NASA Technical Reports Server (NTRS)

    Bogomolov, V. V.; Duncan, J. M.; Alferova, I. V.; Dulchavsky, S. A.; Ebert, D.; Hamilton, D. R.; Matveev, V. P.; Sargsyan, A. E.

    2008-01-01

    Recent advances in remotely guided imaging techniques on ISS allow the acquisition of high quality ultrasound data by crewmember operators with no medical background and minimal training. However, ongoing efforts are required to develop and validate methodology for complex imaging protocols to ensure their repeatability, efficiency, and suitability for use aboard the ISS. This Station Developmental Test Objective (SDTO) tests a cardiovascular evaluation methodology that takes advantage of the ISS Ultrasound capability, the Braslet-M device, and modified respiratory maneuvers (Valsalva and Mueller) to broaden the spectrum of anatomical and functional information on the human cardiovascular system during long-duration space missions. The proposed methodology optimizes and combines new and previously demonstrated methods, and is expected to benefit medically indicated assessments, operational research protocols, and data collections for science. Braslet-M is a current Russian operational countermeasure that compresses the upper thigh to impede venous return from the lower extremities. The goal of the SDTO is to establish and validate a repeatable ultrasound-based methodology for the assessment of a number of cardiovascular criteria in microgravity. The Braslet-M device is used as a means to acutely alter volume distribution while focused ultrasound measurements are performed. Modified respiratory maneuvers are performed during volume manipulations to record commensurate changes in anatomical and functional parameters. The overall cardiovascular effects of the Braslet-M device are not completely understood, and although not a primary objective of this SDTO, this effort will provide pilot data regarding the suitability of Braslet-M for its intended purpose, its effects, and the indications for its use.

  1. Antimicrobial activity of biogenic silver nanoparticles, and silver chloride nanoparticles: an overview and comments.

    PubMed

    Durán, Nelson; Nakazato, Gerson; Seabra, Amedea B

    2016-08-01

    The antimicrobial impact of biogenically synthesized silver-based nanoparticles has been the focus of increasing interest. As the antimicrobial activity of nanoparticles is highly dependent on their size and surface, complete and adequate characterization of the nanoparticles is important. This review discusses the characterization and antimicrobial activity of biogenically synthesized silver nanoparticles and silver chloride nanoparticles. A review of the literature reveals confusion in the characterization of these two silver-based nanoparticles, which consequently affects the conclusions regarding their antimicrobial activities. This review critically analyzes recent publications on the synthesis of biogenic silver nanoparticles and silver chloride nanoparticles, attempting to correlate the characterization of the nanoparticles with their antimicrobial activity. It was difficult to correlate the size of biogenic nanoparticles with their antimicrobial activity, since different characterization techniques are employed across studies. Biogenically synthesized silver-based nanoparticles are often not completely characterized, particularly the nature of the capping proteins covering the nanomaterials. Moreover, the antimicrobial activity of these nanoparticles is assayed using different protocols and strains, which makes comparison among the published papers difficult. It is important to select standard bacterial strains, following international references (Pharmaceutical Microbiology Manual), and to determine minimal inhibitory concentrations by broth microdilution assays from the Clinical and Laboratory Standards Institute, the most common assay used for antibiotics. Therefore, we conclude that to obtain relevant results on the antimicrobial effects of biogenic silver-based nanoparticles, a complete and adequate characterization of these nanostructures is necessary, followed by standard microbiology protocols.

  2. An in vitro simulation method for the tribological assessment of complete natural hip joints

    PubMed Central

    Fisher, John; Williams, Sophie

    2017-01-01

    The use of hip joint simulators to evaluate the tribological performance of total hip replacements is widely reported in the literature; however, in vitro simulation studies investigating the tribology of the natural hip joint are limited, with heterogeneous methodologies reported. An in vitro simulation system for the complete natural hip joint, enabling the acetabulum and femoral head to be positioned with different orientations whilst maintaining the correct joint centre of rotation, was successfully developed for this study. The efficacy of the simulation system was assessed by testing complete, matched natural porcine hip joints and porcine hip hemiarthroplasty joints in a pendulum friction simulator. The results showed evidence of biphasic lubrication, with a non-linear increase in friction being observed in both groups. Lower overall mean friction factor values in the complete natural joint group, which increased at a lower rate over time, suggest that the exudation of fluid and transition to solid-phase lubrication occurred more slowly in the complete natural hip joint than in the hip hemiarthroplasty joint. It is envisaged that this methodology will be used to investigate morphological risk factors for developing hip osteoarthritis, as well as the effectiveness of early interventional treatments for degenerative hip disease. PMID:28886084

  3. Concurrent measurement of "real-world" stress and arousal in individuals with psychosis: assessing the feasibility and validity of a novel methodology.

    PubMed

    Kimhy, David; Delespaul, Philippe; Ahn, Hongshik; Cai, Shengnan; Shikhman, Marina; Lieberman, Jeffrey A; Malaspina, Dolores; Sloan, Richard P

    2010-11-01

    Psychosis has been repeatedly suggested to be affected by increases in stress and arousal. However, there is a dearth of evidence supporting the temporal link between stress, arousal, and psychosis during "real-world" functioning. This paucity of evidence may stem from limitations of current research methodologies. Our aim is to test the feasibility and validity of a novel methodology designed to measure concurrent stress and arousal in individuals with psychosis during "real-world" daily functioning. Twenty patients with psychosis completed a 36-hour ambulatory assessment of stress and arousal. We used the experience sampling method with palm computers to assess stress (10 times per day, 10 AM → 10 PM) along with concurrent ambulatory measurement of cardiac autonomic regulation using a Holter monitor. The clocks of the palm computer and Holter monitor were synchronized, allowing the temporal linking of the stress and arousal data. We used power spectral analysis to determine the parasympathetic contributions to autonomic regulation and sympathovagal balance during the 5 minutes before and after each experience sample. Patients completed 79% of the experience samples (75% with valid concurrent arousal data). Momentary increases in stress were inversely correlated with concurrent parasympathetic activity (ρ = -.27, P < .0001) and positively correlated with sympathovagal balance (ρ = .19, P = .0008). Stress and heart rate were not significantly related (ρ = -.05, P = .3875). The findings support the feasibility and validity of our methodology in individuals with psychosis. The methodology offers a novel way to study, in high time resolution, the concurrent "real-world" interactions between stress, arousal, and psychosis. The authors discuss the methodology's potential applications and future research directions.
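
    The parasympathetic and sympathovagal measures mentioned above are conventionally derived from power spectral analysis of heart-rate variability. The sketch below shows a generic version using Welch's method on a resampled RR-interval series; the band limits are standard conventions and the resampling rate is an assumption, not the study's exact pipeline.

    ```python
    import numpy as np
    from scipy.signal import welch
    from scipy.interpolate import interp1d

    def hrv_spectral_indices(rr_ms, fs=4.0):
        """HF power (0.15-0.40 Hz) indexes parasympathetic activity; the LF/HF
        ratio (0.04-0.15 over 0.15-0.40 Hz) is the usual sympathovagal proxy."""
        rr = np.asarray(rr_ms, float)
        t = np.cumsum(rr) / 1000.0                       # beat times (s)
        t_even = np.arange(t[0], t[-1], 1.0 / fs)        # evenly resampled axis
        rr_even = interp1d(t, rr, kind="cubic")(t_even)
        f, pxx = welch(rr_even - rr_even.mean(), fs=fs,
                       nperseg=min(256, len(rr_even)))
        band = lambda lo, hi: np.trapz(pxx[(f >= lo) & (f < hi)],
                                       f[(f >= lo) & (f < hi)])
        lf, hf = band(0.04, 0.15), band(0.15, 0.40)
        return hf, lf / hf
    ```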

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard to uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methods, taking into account small sample sizes, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results to make informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
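
    A toy Monte Carlo version of the timeline-distribution idea: each task time along an adversary path is drawn from a distribution instead of a fixed worst-case number, and the sum yields a delay distribution whose quantiles can feed risk decisions. The lognormal parameters and the stress factor below are illustrative stand-ins, not calibrated values from the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical path of three sequential barriers; each task time is
    # modeled as a distribution rather than a single worst-case number.
    tasks = [
        dict(median=30.0, sigma=0.4),   # breach fence (s)
        dict(median=90.0, sigma=0.6),   # defeat door
        dict(median=45.0, sigma=0.5),   # reach asset
    ]

    def simulate_path_delay(tasks, n=100_000, stress_factor=1.1):
        """Total-delay distribution for sequential tasks; a crude multiplicative
        'stress' factor stands in for degraded performance across tasks."""
        total = np.zeros(n)
        for task in tasks:
            total += rng.lognormal(np.log(task["median"]), task["sigma"], size=n)
        return total * stress_factor

    delay = simulate_path_delay(tasks)
    print(np.percentile(delay, [5, 50, 95]))  # delay quantiles for risk analysis
    ```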

  5. Methodology for fault detection in induction motors via sound and vibration signals

    NASA Astrophysics Data System (ADS)

    Delgado-Arredondo, Paulo Antonio; Morinigo-Sotelo, Daniel; Osornio-Rios, Roque Alfredo; Avina-Cervantes, Juan Gabriel; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene de Jesus

    2017-01-01

    Nowadays, timely maintenance of electric motors is vital to keep up the complex processes of industrial production. A variety of methodologies for fault diagnosis currently exist. Usually, the diagnosis is performed by analyzing current signals at steady-state motor operation or during a start-up transient. This method is known as motor current signature analysis, which identifies frequencies associated with faults in the frequency domain or through time-frequency decomposition of the current signals. Fault identification may also be possible by analyzing acoustic sound and vibration signals, which is useful because sometimes this is the only information available. The contribution of this work is a methodology for detecting faults in induction motors in steady-state operation based on the analysis of acoustic sound and vibration signals. The proposed approach uses the Complete Ensemble Empirical Mode Decomposition to decompose the signal into several intrinsic mode functions (IMFs). Subsequently, the frequency marginal of the Gabor representation is calculated to obtain the spectral content of each IMF in the frequency domain. This proposal provides good fault detectability compared with other published works, in addition to identifying more frequencies associated with the faults. The faults diagnosed in this work are two broken rotor bars, mechanical unbalance and bearing defects.
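
    A sketch of the signal-processing chain, under stated substitutions: the PyEMD package's CEEMDAN stands in for the paper's Complete Ensemble EMD variant, and a Welch PSD stands in for the frequency marginal of the Gabor representation. Sampling rate and the stand-in signal are assumptions.

    ```python
    import numpy as np
    from scipy.signal import welch
    from PyEMD import CEEMDAN   # pip install EMD-signal; illustrative substitution

    fs = 2000                                    # assumed sampling rate (Hz)
    t = np.arange(0, 1.0, 1.0 / fs)
    vib = np.sin(2 * np.pi * 60 * t) + 0.3 * np.sin(2 * np.pi * 123 * t)

    imfs = CEEMDAN()(vib)                        # intrinsic mode functions

    # Spectral content of each IMF; the dominant frequency of each mode is
    # what gets compared against known fault signatures.
    for k, imf in enumerate(imfs):
        f, pxx = welch(imf, fs=fs, nperseg=512)
        print(k, f[np.argmax(pxx)])              # dominant frequency of IMF k
    ```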

  6. Maxwell's contrived analogy: An early version of the methodology of modeling

    NASA Astrophysics Data System (ADS)

    Hon, Giora; Goldstein, Bernard R.

    2012-11-01

    The term "analogy" stands for a variety of methodological practices all related in one way or another to the idea of proportionality. We claim that in his first substantial contribution to electromagnetism James Clerk Maxwell developed a methodology of analogy which was completely new at the time or, to borrow John North's expression, Maxwell's methodology was a "newly contrived analogue". In his initial response to Michael Faraday's experimental researches in electromagnetism, Maxwell did not seek an analogy with some physical system in a domain different from electromagnetism as advocated by William Thomson; rather, he constructed an entirely artificial one to suit his needs. Following North, we claim that the modification which Maxwell introduced to the methodology of analogy has not been properly appreciated. In view of our examination of the evidence, we argue that Maxwell gave a new meaning to analogy; in fact, it comes close to modeling in current usage.

  7. Methodological quality assessment of paper-based systematic reviews published in oral health.

    PubMed

    Wasiak, J; Shen, A Y; Tan, H B; Mahar, R; Kan, G; Khoo, W R; Faggion, C M

    2016-04-01

    This study aimed to conduct a methodological assessment of paper-based systematic reviews (SR) published in oral health using a validated checklist. A secondary objective was to explore temporal trends in methodological quality. Two electronic databases (OVID Medline and OVID EMBASE) were searched for paper-based SR of interventions published in oral health from inception to October 2014. Manual searches of the reference lists of paper-based SR were also conducted. Methodological quality of the included paper-based SR was assessed using the 11-item Assessment of Multiple Systematic Reviews (AMSTAR) checklist. Methodological quality was summarized using the median and inter-quartile range (IQR) of the AMSTAR score over different categories and time periods. A total of 643 paper-based SR were included. The overall median AMSTAR score was 4 (IQR 2-6). The highest median score (5) was found in the pain dentistry and periodontology fields, while the lowest median score (3) was found in implant dentistry, restorative dentistry, oral medicine, and prosthodontics. The number of paper-based SR per year and the median AMSTAR score increased over time (the median score was 2 (IQR 2-3) in the 1990s, 4 (IQR 2-5) in the 2000s, and 5 (IQR 3-6) from 2010 onwards). Although the methodological quality of paper-based SR published in oral health has improved in recent years, there is still scope for improving quality in most evaluated dental specialties. Large-scale assessment of the methodological quality of dental SR highlights areas of methodological strength and weakness that can be targeted in future publications to encourage better-quality review methodology.

  8. A systematic review of the cost and cost-effectiveness of electronic discharge communications

    PubMed Central

    Sevick, Laura K; Esmail, Rosmin; Tang, Karen; Lorenzetti, Diane L; Ronksley, Paul; James, Matthew; Santana, Maria; Ghali, William A; Clement, Fiona

    2017-01-01

    Background: The transition between acute care and community care can be a vulnerable period in a patient's treatment due to the potential for postdischarge adverse events. The vulnerability of this period has been attributed to factors related to miscommunication between hospital-based and community-based physicians. Electronic discharge communication has been proposed as one solution to bridge this communication gap. Prior to widespread implementation of these tools, the costs and benefits should be considered. Objective: To establish the cost and cost-effectiveness of electronic discharge communications compared with traditional discharge systems for individuals who have completed care with one provider and are transitioning care to a new provider. Methods: We conducted a systematic review of the published literature, using best practices, to identify economic evaluations/cost analyses of electronic discharge communication tools. Inclusion criteria were: (1) economic analysis and (2) electronic discharge communication tool as the intervention. The quality of each article was assessed, and data were summarised using a component-based analysis. Results: One thousand unique abstracts were identified, and 57 full-text articles were assessed for eligibility. Four studies met final inclusion criteria. These studies varied in their primary objectives, methodology, costs reported and outcomes. All of the studies were of low to good quality. Three of the studies reported a cost-effectiveness measure, ranging from an incremental cost of $0.331 (2003 Canadian) per day of reduction in average discharge note completion time, to a cost of €9.51 per page per discharge letter, to a dynamic net present value of €31.1 million for a 5-year implementation of the intervention. None of the identified studies considered clinically meaningful patient or quality outcomes. Discussion: Economic analyses of electronic discharge communications are scarce and report inconsistent methodology and outcomes. Further studies are needed to understand the cost-effectiveness and value for patient care. PMID:28674136

  9. A Methodological Intercomparison of Topographic and Aerial Photographic Habitat Survey Techniques

    NASA Astrophysics Data System (ADS)

    Bangen, S. G.; Wheaton, J. M.; Bouwes, N.

    2011-12-01

    A severe decline in Columbia River salmonid populations and subsequent Federal listing of subpopulations has mandated both the monitoring of populations and evaluation of the status of available habitat. Numerous field and analytical methods exist to assist in the quantification of the abundance and quality of in-stream habitat for salmonids. These methods range from field 'stick and tape' surveys to spatially explicit topographic and aerial photographic surveys from a mix of ground-based and remotely sensed airborne platforms. Although several previous studies have assessed the quality of specific individual survey methods, the intercomparison of competing techniques across a diverse range of habitat conditions (wadeable headwater channels to non-wadeable mainstem channels) has not yet been elucidated. In this study, we seek to enumerate the relative quality (i.e. accuracy, precision, extent) of habitat metrics and inventories derived from an array of ground-based and remotely sensed surveys of varying degrees of sophistication, as well as quantify the effort and cost of conducting the surveys. Over the summer of 2010, seven sample reaches of varying habitat complexity were surveyed in the Lemhi River Basin, Idaho, USA. Complete topographic surveys were attempted at each site using rtkGPS, total station, ground-based LiDAR and traditional airborne LiDAR. Separate high spatial resolution aerial imagery surveys were acquired using a tethered blimp, a drone UAV, and a traditional fixed-wing aircraft. Here we also developed a relatively simplistic methodology for deriving bathymetry from aerial imagery that could be readily employed by instream habitat monitoring programs. The quality of bathymetric maps derived from aerial imagery was compared with rtkGPS topographic data. The results are helpful for understanding the strengths and weaknesses of different approaches in specific conditions, and how a hybrid of data acquisition methods can be used to build a more complete quantification of salmonid habitat conditions in streams.
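
    For readers unfamiliar with image-derived bathymetry, a classic log-ratio approach (Stumpf-style) is sketched below; it is a textbook method shown for orientation, not necessarily the methodology developed in this study. The band names and calibration coefficients are placeholders.

    ```python
    import numpy as np

    def log_ratio_depth(blue, green, m0, m1):
        """Depth is approximately linear in ln(blue)/ln(green) over optically
        shallow water; m0, m1 come from regressing the ratio against known
        depths (e.g., rtkGPS survey points)."""
        ratio = np.log(np.maximum(blue, 1)) / np.log(np.maximum(green, 1))
        return m0 + m1 * ratio

    # Calibration against surveyed depths z_known at sampled pixels:
    #   m1, m0 = np.polyfit(ratio_samples, z_known, 1)
    ```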

  10. Using default methodologies to derive an acceptable daily exposure (ADE).

    PubMed

    Faria, Ellen C; Bercu, Joel P; Dolan, David G; Morinello, Eric J; Pecquet, Alison M; Seaman, Christopher; Sehner, Claudia; Weideman, Patricia A

    2016-08-01

    This manuscript discusses the different historical and more recent default approaches that have been used to derive an acceptable daily exposure (ADE). While it is preferable to derive a health-based ADE based on a complete nonclinical and clinical data package, this is not always possible. For instance, for drug candidates in early development there may be no or limited nonclinical or clinical trial data. Alternative approaches that can support decision making with less complete data packages represent a variety of methods that rely on default assumptions or data inputs where chemical-specific data on health effects are lacking. A variety of default approaches are used including those based on certain toxicity estimates, a fraction of the therapeutic dose, cleaning-based limits, the threshold of toxicological concern (TTC), and application of hazard banding tools such as occupational exposure banding (OEB). Each of these default approaches is discussed in this manuscript, including their derivation, application, strengths, and limitations. In order to ensure patient safety when faced with toxicological and clinical data-gaps, default ADE methods should be purposefully as or more protective than ADEs derived from full data packages. Reliance on the subset of default approaches (e.g., TTC or OEB) that are based on toxicological data is preferred over other methods for establishing ADEs in early development while toxicology and clinical data are still being collected. Copyright © 2016. Published by Elsevier Inc.
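
    For orientation, a generic health-based ADE calculation divides a point of departure by a composite uncertainty factor and scales by body weight. The defaults below (50 kg body weight, 100-fold composite factor) are common illustrative choices, not the manuscript's specific recommendations.

    ```python
    def ade_mg_per_day(pod_mg_per_kg, body_weight_kg=50.0, uf_composite=100.0):
        """Generic ADE: point of departure (e.g., a NOAEL in mg/kg/day) scaled
        by body weight and divided by a composite uncertainty factor."""
        return pod_mg_per_kg * body_weight_kg / uf_composite

    # e.g., NOAEL of 5 mg/kg/day -> ADE of 2.5 mg/day under these defaults
    print(ade_mg_per_day(5.0))
    ```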

  11. Diagnostic instrumentation aboard ISS: just-in-time training for non-physician crewmembers.

    PubMed

    Foale, C Michael; Kaleri, Alexander Y; Sargsyan, Ashot E; Hamilton, Douglas R; Melton, Shannon; Martin, David; Dulchavsky, Scott A

    2005-06-01

    The performance of complex tasks on the International Space Station (ISS) requires significant preflight crew training commitments and frequent skill and knowledge refreshment. This report documents a recently developed "just-in-time" training methodology, which integrates preflight hardware familiarization and procedure training with an on-orbit CD-ROM-based skill enhancement. This "just-in-time" concept was used to support real-time remote expert guidance to complete ultrasound examinations using the ISS Human Research Facility (HRF). An American and Russian ISS crewmember received 2 h of "hands on" ultrasound training 8 mo prior to the on-orbit ultrasound exam. A CD-ROM-based Onboard Proficiency Enhancement (OPE) interactive multimedia program consisting of memory enhancing tutorials, and skill testing exercises, was completed by the crewmember 6 d prior to the on-orbit ultrasound exam. The crewmember was then remotely guided through a thoracic, vascular, and echocardiographic examination by ultrasound imaging experts. Results of the CD-ROM-based OPE session were used to modify the instructions during a complete 35-min real-time thoracic, cardiac, and carotid/jugular ultrasound study. Following commands from the ground-based expert, the crewmember acquired all target views and images without difficulty. The anatomical content and fidelity of ultrasound video were adequate for clinical decision making. Complex ultrasound experiments with expert guidance were performed with high accuracy following limited preflight training and multimedia based in-flight review, despite a 2-s communication latency. In-flight application of multimedia proficiency enhancement software, coupled with real-time remote expert guidance, facilitates the successful performance of ultrasound examinations on orbit and may have additional terrestrial and space applications.

  12. Diagnostic instrumentation aboard ISS: just-in-time training for non-physician crewmembers

    NASA Technical Reports Server (NTRS)

    Foale, C. Michael; Kaleri, Alexander Y.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Melton, Shannon; Martin, David; Dulchavsky, Scott A.

    2005-01-01

    INTRODUCTION: The performance of complex tasks on the International Space Station (ISS) requires significant preflight crew training commitments and frequent skill and knowledge refreshment. This report documents a recently developed "just-in-time" training methodology, which integrates preflight hardware familiarization and procedure training with an on-orbit CD-ROM-based skill enhancement. This "just-in-time" concept was used to support real-time remote expert guidance to complete ultrasound examinations using the ISS Human Research Facility (HRF). METHODS: An American and Russian ISS crewmember received 2 h of "hands on" ultrasound training 8 mo prior to the on-orbit ultrasound exam. A CD-ROM-based Onboard Proficiency Enhancement (OPE) interactive multimedia program consisting of memory enhancing tutorials, and skill testing exercises, was completed by the crewmember 6 d prior to the on-orbit ultrasound exam. The crewmember was then remotely guided through a thoracic, vascular, and echocardiographic examination by ultrasound imaging experts. RESULTS: Results of the CD-ROM-based OPE session were used to modify the instructions during a complete 35-min real-time thoracic, cardiac, and carotid/jugular ultrasound study. Following commands from the ground-based expert, the crewmember acquired all target views and images without difficulty. The anatomical content and fidelity of ultrasound video were adequate for clinical decision making. CONCLUSIONS: Complex ultrasound experiments with expert guidance were performed with high accuracy following limited preflight training and multimedia based in-flight review, despite a 2-s communication latency. In-flight application of multimedia proficiency enhancement software, coupled with real-time remote expert guidance, facilitates the successful performance of ultrasound examinations on orbit and may have additional terrestrial and space applications.

  13. New systematic methodology for incorporating dynamic heat transfer modelling in multi-phase biochemical reactors.

    PubMed

    Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E

    2014-09-01

    This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model component stoichiometry and formation enthalpies, the proposed modelling methodology successfully integrates the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in biochemical reactors. The methodology has been implemented in a plant-wide modelling framework in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, two case studies are described as illustrative examples of the capability of the methodology. In the first, a predenitrification-nitrification dynamic process is analysed, with the aim of demonstrating the easy integration of the methodology into any system. In the second case study, the simulation of a thermal model for an ATAD shows the potential of the proposed methodology for analysing the effect of ventilation and influent characterization. Copyright © 2014 Elsevier Ltd. All rights reserved.
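
    The Hess's-law bookkeeping at the heart of the methodology reduces to a stoichiometry-weighted sum of formation enthalpies. A minimal sketch, with an invented two-reaction, three-component system (the numbers are illustrative, not the paper's component set):

    ```python
    import numpy as np

    # Stoichiometric matrix S (reactions x components, negative = consumed)
    # and formation enthalpies hf (kJ/mol per component).
    S = np.array([
        [-1.0, -0.5, 1.0],      # reaction 1: A + 0.5 B -> C
        [-1.0,  0.0, 1.0],      # reaction 2: A -> C
    ])
    hf = np.array([-285.8, 0.0, -393.5])   # formation enthalpies of A, B, C

    dH_rxn = S @ hf                        # kJ/mol per reaction (negative = heat released)
    rates = np.array([0.02, 0.01])         # reaction rates (mol/s)
    heat_rate = dH_rxn @ rates             # net enthalpy flux (kJ/s = kW)
    ```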

  14. Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Joseph Daniel; Anderson, Robert Stephen

    Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can by-pass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.

  15. Shear-wave velocity profiling according to three alternative approaches: A comparative case study

    NASA Astrophysics Data System (ADS)

    Dal Moro, G.; Keller, L.; Al-Arifi, N. S.; Moustafa, S. S. R.

    2016-11-01

    This paper compares three different methodologies that can be used to analyze surface-wave propagation, thus eventually obtaining the vertical shear-wave velocity (VS) profile. The three presented methods (currently still quite unconventional) are characterized by different field procedures and data processing. The first methodology is an evolution of the classical Multi-channel Analysis of Surface Waves (MASW), here accomplished by jointly considering Rayleigh and Love waves (analyzed according to the Full Velocity Spectrum approach) and the Horizontal-to-Vertical Spectral Ratio (HVSR). The second method is based on the joint analysis of the HVSR curve together with the Rayleigh-wave dispersion determined via Miniature Array Analysis of Microtremors (MAAM), a passive methodology that relies on a small number (4 to 6) of vertical geophones deployed along a small circle (for common near-surface applications the radius usually ranges from 0.6 to 5 m). Finally, the third considered approach is based on active data acquired by a single 3-component geophone and relies on the joint inversion of the group-velocity spectra of the radial and vertical components of the Rayleigh waves, together with the Radial-to-Vertical Spectral Ratio (RVSR). The results of the analyses performed with these approaches (completely different both in terms of field procedures and data analysis) appear extremely consistent, thus mutually validating their performance. Pros and cons of each approach are summarized both in terms of computational aspects and with respect to practical considerations regarding the specific character of the pertinent field procedures.
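
    All three approaches lean on the HVSR curve, which is straightforward to compute from three-component ambient-noise records. A generic sketch follows; smoothing and windowing conventions vary between codes, so treat the details as illustrative rather than the paper's processing.

    ```python
    import numpy as np
    from scipy.signal import welch

    def hvsr(north, east, vertical, fs, nperseg=4096):
        """Horizontal-to-Vertical Spectral Ratio: geometric-mean horizontal
        amplitude spectrum over vertical amplitude spectrum."""
        f, pn = welch(north, fs=fs, nperseg=nperseg)
        _, pe = welch(east, fs=fs, nperseg=nperseg)
        _, pv = welch(vertical, fs=fs, nperseg=nperseg)
        h = np.sqrt(np.sqrt(pn * pe))   # geometric-mean horizontal amplitude
        v = np.sqrt(pv)
        return f, h / v                 # the peak frequency tracks site resonance
    ```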

  16. Adaptive Oceanographic Sampling in a Coastal Environment Using Autonomous Gliding Vehicles

    DTIC Science & Technology

    2003-08-01

    cost autonomous vehicles with near-global range and modular sensor payload. Particular emphasis is placed on the development of adaptive sampling...environment. Secondary objectives include continued development of adaptive sampling strategies suitable for large fleets of slow-moving autonomous vehicles, and development and implementation of new oceanographic sensors and sampling methodologies. The main task completed was a complete redesign of

  17. "I Never Heard the Word Methodology": Personal Accounts of Teacher Training in Ireland 1943-1980

    ERIC Educational Resources Information Center

    Walsh, Brendan

    2017-01-01

    This article examines the experiences of 27 retired secondary school teachers (respondents) who completed initial teacher education (ITE) courses between 1943 and 1980. The eldest respondent completed ITE in 1943 and the youngest in 1980. The timespan 1943-1980 is not purposeful but dependent on the cohort that volunteered to take part in the…

  18. PuReD-MCL: a graph-based PubMed document clustering methodology.

    PubMed

    Theodosiou, T; Darzentas, N; Angelis, L; Ouzounis, C A

    2008-09-01

    Biomedical literature is the principal repository of biomedical knowledge, with PubMed being the most complete database collecting, organizing and analyzing such textual knowledge. There are numerous efforts that attempt to exploit this information using text mining and machine learning techniques. We developed a novel approach, called PuReD-MCL (PubMed Related Documents-MCL), which is based on the graph clustering algorithm MCL and relevant resources from PubMed. PuReD-MCL avoids using natural language processing (NLP) techniques directly; instead, it takes advantage of existing resources available from PubMed. PuReD-MCL then clusters documents efficiently using the MCL graph clustering algorithm, which is based on graph flow simulation. This process allows users to analyse the results by highlighting important clues, and finally to visualize the clusters and all relevant information using an interactive graph layout algorithm, for instance BioLayout Express 3D. The methodology was applied to two different datasets, previously used for the validation of the document clustering tool TextQuest. The first dataset involves the organisms Escherichia coli and yeast, whereas the second is related to Drosophila development. PuReD-MCL successfully reproduces the annotated results obtained from TextQuest, while at the same time providing additional insights into the clusters and the corresponding documents. Source code in Perl and R is available from http://tartara.csd.auth.gr/~theodos/
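
    For readers unfamiliar with MCL, the algorithm alternates expansion (flow spreading via matrix powers) and inflation (elementwise powers that strengthen strong flows) until the flow matrix converges; attractor rows then define the clusters. A bare-bones dense-matrix sketch, not the PuReD-MCL implementation itself:

    ```python
    import numpy as np

    def mcl(adjacency, expansion=2, inflation=2.0, iters=50, eps=1e-6):
        """Markov Cluster algorithm on a dense adjacency matrix (small graphs)."""
        M = adjacency + np.eye(len(adjacency))        # add self-loops
        M = M / M.sum(axis=0, keepdims=True)          # column-stochastic
        for _ in range(iters):
            prev = M
            M = np.linalg.matrix_power(M, expansion)  # expansion
            M = M ** inflation                        # inflation
            M = M / M.sum(axis=0, keepdims=True)
            if np.abs(M - prev).max() < eps:
                break
        attractors = np.where(M.diagonal() > eps)[0]
        clusters = {tuple(np.where(M[a] > eps)[0]) for a in attractors}
        return [list(c) for c in clusters]
    ```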

  19. Psychotherapy Outcome Research: Issues and Questions.

    PubMed

    Shean, Glenn

    2016-03-01

    Emphasis on identifying evidence-based therapies (EBTs) has increased markedly. Lists of EBTs are the rationale for recommendations for how psychotherapy provider training programs should be evaluated, professional competence assessed, and licensure and reimbursement policies structured. There are, however, methodological concerns that limit the external validity of EBTs. Among the most salient is the circularity inherent in randomized control trials (RCTs) of psychotherapy, which constrains the manner in which psychological problems are defined, psychotherapy can be practiced, and change evaluated. RCT studies favor therapies that focus on specific symptoms and that can be described in a manual, administered reliably across patients, completed in relatively few sessions, and evaluated with short-term outcome measures. The epistemological assumptions of a natural science approach to psychotherapy research limit how studies are conducted and assessed in ways that advantage symptom-focused approaches and disadvantage approaches that seek to bring about broad recovery-based changes. Research methods that are not limited to RCTs and that include methodology to minimize the effects of "therapist allegiance" are necessary for valid evaluations of therapeutic approaches that seek to facilitate changes broader than symptom reduction. Recent proposals to adopt policies that dictate training, credentialing, and reimbursement based on lists of EBTs unduly limit how psychotherapy can be conceptualized and practiced, and are not in the best interests of the profession or of individuals seeking psychotherapy services.

  20. Sparse representation of whole-brain fMRI signals for identification of functional networks.

    PubMed

    Lv, Jinglei; Jiang, Xi; Li, Xiang; Zhu, Dajiang; Chen, Hanbo; Zhang, Tuo; Zhang, Shu; Hu, Xintao; Han, Junwei; Huang, Heng; Zhang, Jing; Guo, Lei; Liu, Tianming

    2015-02-01

    There have been several recent studies that used sparse representation for fMRI signal analysis and activation detection based on the assumption that each voxel's fMRI signal is linearly composed of sparse components. Previous studies have employed sparse coding to model functional networks in various modalities and scales. These prior contributions inspired the exploration of whether/how sparse representation can be used to identify functional networks in a voxel-wise way and on the whole brain scale. This paper presents a novel, alternative methodology of identifying multiple functional networks via sparse representation of whole-brain task-based fMRI signals. Our basic idea is that all fMRI signals within the whole brain of one subject are aggregated into a big data matrix, which is then factorized into an over-complete dictionary basis matrix and a reference weight matrix via an effective online dictionary learning algorithm. Our extensive experimental results have shown that this novel methodology can uncover multiple functional networks that can be well characterized and interpreted in spatial, temporal and frequency domains based on current brain science knowledge. Importantly, these well-characterized functional network components are quite reproducible in different brains. In general, our methods offer a novel, effective and unified solution to multiple fMRI data analysis tasks including activation detection, de-activation detection, and functional network identification. Copyright © 2014 Elsevier B.V. All rights reserved.
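
    A minimal sketch of the factorization step using scikit-learn's online dictionary learner; the matrix orientation (voxels by time points), component count, and sparsity penalty are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning

    # Whole-brain signal matrix X: each row is one voxel's time series
    # (voxels x time); random data stands in for preprocessed task fMRI.
    X = np.random.randn(5_000, 200)

    # Online dictionary learning factorizes X ~ coefficients @ dictionary with
    # an over-complete temporal dictionary; the L1 penalty encourages sparsity.
    dl = MiniBatchDictionaryLearning(n_components=300, alpha=1.0, batch_size=256,
                                     transform_algorithm="lasso_lars",
                                     random_state=0)
    coef = dl.fit_transform(X)      # (voxels x components) spatial loadings
    D = dl.components_              # (components x time) temporal atoms

    # Reshaping one column of `coef` back into brain space gives the spatial
    # map of one candidate "functional network" component.
    ```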

  1. Electronic health records and disease registries to support integrated care in a health neighbourhood: an ontology-based methodology.

    PubMed

    Liaw, Siaw-Teng; Taggart, Jane; Yu, Hairong; Rahimi, Alireza

    2014-01-01

    Disease registries derived from Electronic Health Records (EHRs) are widely used for chronic disease management (CDM). However, unlike national registries, which are specialised data collections, they are usually specific to an EHR or organization such as a medical home. We approached registries from the perspective of integrated care in a health neighbourhood, considering data quality issues such as semantic interoperability (consistency), accuracy, completeness and duplication. Our proposition is that a realist ontological approach is required to systematically and accurately identify patients in an EHR or data repository of EHRs, and to assess intrinsic data quality and fitness for use by members of the multidisciplinary integrated care team. We report on this approach as applied to routinely collected data in an electronic practice-based research network in Australia.

  2. Current methodologies on genotyping for nosocomial pathogen methicillin-resistant Staphylococcus aureus (MRSA).

    PubMed

    Miao, Jian; Chen, Lequn; Wang, Jingwen; Wang, Wenxin; Chen, Dingqiang; Li, Lin; Li, Bing; Deng, Yang; Xu, Zhenbo

    2017-06-01

    Methicillin-resistant Staphylococcus aureus (MRSA) is a common pathogen in hospitals and the community. With the rapid spread and wide distribution of antimicrobial resistance (such as MRSA), the treatment of infectious diseases caused by these microorganisms has become a serious challenge. Thus, early identification and genotyping are essential for further therapeutic treatment and for controlling the rapid expansion of MRSA. Drawing on applications and data feedback, this review focuses on the currently available molecular-based assays, their utility, and their performance for rapid typing of MRSA, especially effective molecular-based methods. In addition, the common mobile element SCCmec and the prevalence of HA-MRSA, LA-MRSA and CA-MRSA are introduced in order to provide a more complete profile of MRSA. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Cybernetic modeling based on pathway analysis for Penicillium chrysogenum fed-batch fermentation.

    PubMed

    Geng, Jun; Yuan, Jingqi

    2010-08-01

    A macrokinetic model employing cybernetic methodology is proposed to describe mycelium growth and penicillin production. Based on the primordial and complete metabolic network of Penicillium chrysogenum found in the literature, the modeling procedure is guided by metabolic flux analysis and the cybernetic modeling framework. The abstracted cybernetic model describes the transients of the consumption rates of the substrates, the assimilation rates of intermediates, the biomass growth rate, and the penicillin formation rate. Combined with the bioreactor model, these reaction rates are linked with the most important state variables, i.e., mycelium, substrate and product concentrations. The simplex method is used to estimate the sensitive parameters of the model. Finally, validation of the model is carried out with 20 batches of industrial-scale penicillin cultivation data.
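
    A toy version of the estimation loop: a Monod-type growth/production ODE stands in for the much richer cybernetic network, and scipy's Nelder-Mead (simplex) search recovers parameters from noisy synthetic observations. All parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import minimize

    def rhs(t, y, mu_max, Ks, Yxs, qp):
        """Monod growth on substrate S with non-growth-associated product P."""
        X, S, P = y
        mu = mu_max * S / (Ks + S)
        return [mu * X, -mu * X / Yxs, qp * X]

    t_obs = np.linspace(0, 100, 21)
    true, y0 = (0.11, 0.4, 0.45, 0.012), [0.1, 20.0, 0.0]
    y_obs = solve_ivp(rhs, (0, 100), y0, args=true, t_eval=t_obs).y.T
    y_obs += 0.02 * np.random.default_rng(1).standard_normal(y_obs.shape)

    def loss(theta):
        sol = solve_ivp(rhs, (0, 100), y0, args=tuple(theta), t_eval=t_obs)
        return np.sum((sol.y.T - y_obs) ** 2)

    # Simplex (Nelder-Mead) search, mirroring the paper's estimation step.
    res = minimize(loss, x0=[0.2, 1.0, 0.3, 0.02], method="Nelder-Mead")
    print(res.x)
    ```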

  4. Obtaining optic disc center and pixel region by automatic thresholding methods on morphologically processed fundus images.

    PubMed

    Marin, Diego; Gegundez-Arias, Manuel E; Suero, Angel; Bravo, Jose M

    2015-02-01

    Development of automatic retinal disease diagnosis systems based on retinal image computer analysis can provide remarkably quicker screening programs for early detection. Such systems are mainly focused on the detection of the earliest ophthalmic signs of illness and require previous identification of fundal landmark features such as optic disc (OD), fovea or blood vessels. A methodology for accurate center-position location and OD retinal region segmentation on digital fundus images is presented in this paper. The methodology performs a set of iterative opening-closing morphological operations on the original retinography intensity channel to produce a bright region-enhanced image. Taking blood vessel confluence at the OD into account, a 2-step automatic thresholding procedure is then applied to obtain a reduced region of interest, where the center and the OD pixel region are finally obtained by performing the circular Hough transform on a set of OD boundary candidates generated through the application of the Prewitt edge detector. The methodology was evaluated on 1200 and 1748 fundus images from the publicly available MESSIDOR and MESSIDOR-2 databases, acquired from diabetic patients and thus being clinical cases of interest within the framework of automated diagnosis of retinal diseases associated to diabetes mellitus. This methodology proved highly accurate in OD-center location: average Euclidean distance between the methodology-provided and actual OD-center position was 6.08, 9.22 and 9.72 pixels for retinas of 910, 1380 and 1455 pixels in size, respectively. On the other hand, OD segmentation evaluation was performed in terms of Jaccard and Dice coefficients, as well as the mean average distance between estimated and actual OD boundaries. Comparison with the results reported by other reviewed OD segmentation methodologies shows our proposal renders better overall performance. Its effectiveness and robustness make this proposed automated OD location and segmentation method a suitable tool to be integrated into a complete prescreening system for early diagnosis of retinal diseases. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
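
    A condensed sketch of the pipeline using scikit-image: iterative morphological opening/closing to enhance bright regions, automatic thresholding for a region of interest, Prewitt edges, and a circular Hough transform for the OD boundary. Structuring-element sizes, the edge-strength percentile, and the radius range are placeholders, not the paper's tuned values.

    ```python
    import numpy as np
    from skimage import morphology, filters, transform

    def locate_od(intensity, radius_range=(40, 90)):
        """Return (cx, cy, r) for the strongest circular OD boundary candidate."""
        selem = morphology.disk(15)
        enhanced = morphology.closing(morphology.opening(intensity, selem), selem)
        roi = enhanced > filters.threshold_otsu(enhanced)   # bright-region mask
        grad = filters.prewitt(enhanced * roi)              # Prewitt edge strength
        edges = grad > np.percentile(grad, 99)              # keep strongest edges
        radii = np.arange(*radius_range, 5)
        hough = transform.hough_circle(edges, radii)
        accums, cx, cy, r = transform.hough_circle_peaks(hough, radii,
                                                         total_num_peaks=1)
        return cx[0], cy[0], r[0]
    ```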

  5. Methodological support for the further abstraction of and philosophical examination of empirical findings in the context of caring science

    PubMed Central

    Lindberg, Elisabeth; Österberg, Sofia A.; Hörberg, Ulrica

    2016-01-01

    Phenomena in caring science are often complex and laden with meanings. Empirical research with the aim of capturing lived experiences is one way of revealing the complexity. Sometimes, however, results from empirical research need to be further discussed. One way is to further abstract the result and/or philosophically examine it. This has previously been performed and presented in scientific journals and doctoral theses, contributing to a greater understanding of phenomena in caring science. Although the intentions in many of these publications are laudable, the lack of methodological descriptions as well as a theoretical and systematic foundation can contribute to an ambiguity concerning how the results have emerged during the analysis. The aim of this paper is to describe the methodological support for the further abstraction of and/or philosophical examination of empirical findings. When trying to systematize the support procedures, we have used a reflective lifeworld research (RLR) approach. Based on the assumptions in RLR, this article will present methodological support for a theoretical examination that can include two stages. In the first stage, data from several (two or more) empirical results on an essential level are synthesized into a general structure. Sometimes the analysis ends with the general structure, but sometimes there is a need to proceed further. The second stage can then be a philosophical examination, in which the general structure is discussed in relation to a philosophical text, theory, or concept. It is important that the theories are brought in as the final stage after the completion of the analysis. Core dimensions of the described methodological support are, in accordance with RLR, openness, bridling, and reflection. The methodological support cannot be understood as fixed stages, but rather as a guiding light in the search for further meanings. PMID:26925926

  6. Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques

    DTIC Science & Technology

    2013-03-01

    Technical report, Polytechnic Institute of New York University; dates covered: October 2010 to October 2012. ...schemes for a memristor-based reconfigurable architecture design have not been fully explored yet. Therefore, in this project, we investigated

  7. Mechanism of realization economic strategy of transport organization

    NASA Astrophysics Data System (ADS)

    Palkina, E. S.

    2017-10-01

    In modern conditions of economic globalization and a highly dynamic external environment, the economic strategy of a transport organization plays an important role in maintaining its competitive advantages and long-term development. Effective achievement of the strategic goals that have been set requires an adequate mechanism based on the completeness and interrelation of its constituent instruments. The main objective of the study presented in this paper is to develop methodological provisions for forming the mechanism of realizing the economic strategy of transport organizations. The principles of its construction are proposed and the key components defined. Finally, an attempt is made to implement this mechanism in the transport organization's management system.

  8. Thermal Stability Testing of a Fischer-Tropsch Fuel and Various Blends with Jet A

    NASA Technical Reports Server (NTRS)

    Klettlinger, Jennifer Suder; Surgenor, Angela; Yen, Chia

    2010-01-01

    Fischer-Tropsch (F-T) jet fuel composition differs from petroleum-based, conventional commercial jet fuel because of differences in feedstock and production methodology. Fischer-Tropsch fuel typically has a lower aromatic and sulfur content and consists primarily of iso- and normal paraffins. The ASTM D3241 Jet Fuel Thermal Oxidation Test (JFTOT) break-point testing method was used to determine the breakpoint of a baseline conventional Jet A, a commercial-grade F-T jet fuel, and various blends of this F-T fuel in Jet A. The testing completed in this report was supported by the NASA Fundamental Aeronautics Subsonic Fixed Wing Project.

  9. Medical Refugees and the Modernisation of British Medicine, 1930–1960

    PubMed Central

    Weindling, Paul

    2015-01-01

    Summary This paper reappraises the position of medical refugees in Britain between the 1930s and 1950s. Advocates of reforming British medicine in terms of its knowledge base and social provision emerged as strongly supportive of the medical refugees. By way of contrast, an élite in the British Medical Association attempted to exercise a controlling regime through the Home Office Advisory Committee. The effects of these divisions are gauged by reconstructing the complete spectrum of refugees as a total population. Applying this methodology of population reconstruction provides a corrective to the notion of a cohesive ‘medical establishment’ exercising rigid and discriminatory controls. PMID:26166948

  10. Understanding parenting in Manitoba First nations: implications for program development.

    PubMed

    Eni, Rachel; Rowe, Gladys

    2011-01-01

    This qualitative study introduced the "Manitoba First Nation Strengthening Families Maternal Child Health Pilot Project" program and evaluation methodologies. The study provided a knowledge base for programmers, evaluators, and communities to develop relevant health promotion, prevention, and intervention programming to assist in meeting health needs of pregnant women and young families. Sixty-five open-ended, semistructured interviews were completed in 13 communities. Data analysis was through grounded theory. Three major themes emerged from the data: interpersonal support and relationships; socioeconomic factors; and community initiatives. Complex structural, historical events compromise parenting; capacity and resilience are supported through informal and formal health and social supports.

  11. Evidence-based Frameworks for Teaching and Learning in Classical Singing Training: A Systematic Review.

    PubMed

    Crocco, Laura; Madill, Catherine J; McCabe, Patricia

    2017-01-01

    The study systematically reviews evidence-based frameworks for teaching and learning of classical singing training. This is a systematic review. A systematic literature search of 15 electronic databases following the Preferred Reporting Items for Systematic Reviews (PRISMA) guidelines was conducted. Eligibility criteria included type of publication, participant characteristics, intervention, and report of outcomes. Quality rating scales were applied to support assessment of the included literature. Data analysis was conducted using meta-aggregation. Nine papers met the inclusion criteria. No complete evidence-based teaching and learning framework was found. Thematic content analysis showed that studies either (1) identified teaching practices in one-to-one lessons, (2) identified student learning strategies in one-to-one lessons or personal practice sessions, and (3) implemented a tool to enhance one specific area of teaching and learning in lessons. The included studies showed that research in music education is not always specific to musical genre or instrumental group, with four of the nine studies including participant teachers and students of classical voice training only. The overall methodological quality ratings were low. Research in classical singing training has not yet developed an evidence-based framework for classical singing training. This review has found that introductory information on teaching and learning practices has been provided, and tools have been suggested for use in the evaluation of the teaching-learning process. High-quality methodological research designs are needed. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  12. Examination of the equivalence of self-report survey-based paper-and-pencil and internet data collection methods.

    PubMed

    Weigold, Arne; Weigold, Ingrid K; Russell, Elizabeth J

    2013-03-01

    Self-report survey-based data collection is increasingly carried out using the Internet, as opposed to the traditional paper-and-pencil method. However, previous research on the equivalence of these methods has yielded inconsistent findings. This may be due to methodological and statistical issues present in much of the literature, such as nonequivalent samples in different conditions due to recruitment, participant self-selection to conditions, and data collection procedures, as well as incomplete or inappropriate statistical procedures for examining equivalence. We conducted 2 studies examining the equivalence of paper-and-pencil and Internet data collection that accounted for these issues. In both studies, we used measures of personality, social desirability, and computer self-efficacy, and, in Study 2, we used personal growth initiative to assess quantitative equivalence (i.e., mean equivalence), qualitative equivalence (i.e., internal consistency and intercorrelations), and auxiliary equivalence (i.e., response rates, missing data, completion time, and comfort completing questionnaires using paper-and-pencil and the Internet). Study 1 investigated the effects of completing surveys via paper-and-pencil or the Internet in both traditional (i.e., lab) and natural (i.e., take-home) settings. Results indicated equivalence across conditions, except for auxiliary equivalence aspects of missing data and completion time. Study 2 examined mailed paper-and-pencil and Internet surveys without contact between experimenter and participants. Results indicated equivalence between conditions, except for auxiliary equivalence aspects of response rate for providing an address and completion time. Overall, the findings show that paper-and-pencil and Internet data collection methods are generally equivalent, particularly for quantitative and qualitative equivalence, with nonequivalence only for some aspects of auxiliary equivalence. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  13. A Probabilistic and Observation Based Methodology to Estimate Small Craft Harbor Vulnerability to Tsunami Events

    NASA Astrophysics Data System (ADS)

    Keen, A. S.; Lynett, P. J.; Ayca, A.

    2016-12-01

    Because of the damage resulting from the 2010 Chile and 2011 Japan tele-tsunamis, the tsunami risk to small craft marinas in California has become an important concern. The talk will outline an assessment tool that can be used to evaluate the tsunami hazard to small craft harbors. The methodology is based on the demand on, and structural capacity of, the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology, a probabilistic computational tool suited to problems where the governing equations may be well known but the independent input variables (demand) and the resisting structural components (capacity) are not completely characterized. The Monte Carlo approach draws each variable at random from its assumed distribution to generate a single computation; the process then repeats hundreds or thousands of times. The numerical model Method of Splitting Tsunamis (MOST) has been used to determine the inputs for the small craft harbors within California. Hydrodynamic model results of current speed, direction, and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors have been inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables, which incorporate the basic specifications of each component (e.g., bolt size and configuration) and a reduction factor (accounting for the component's loss of capacity with age) to estimate in situ capacities. Like the demand term, these capacities enter the model probabilistically. To date, the model has been applied to Santa Cruz Harbor as well as Noyo River. Once calibrated, the model was able to hindcast the damage produced in Santa Cruz Harbor during the 2010 Chile and 2011 Japan events. Results of the Santa Cruz analysis will be presented and discussed.
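
    As a rough illustration of the demand-versus-capacity scheme described above, the sketch below draws random demands and capacities for a single mooring component and estimates a failure probability by Monte Carlo. The distributions, parameter values, and variable names are assumptions for illustration, not values from the study or from MOST output.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of Monte Carlo realizations

# Hypothetical demand: drag load on a dock component from tsunami currents.
# Current speed (m/s) drawn from an assumed lognormal, not from MOST output.
u = rng.lognormal(mean=0.5, sigma=0.4, size=N)
rho, Cd, A = 1025.0, 1.0, 8.0           # seawater density, drag coefficient, area
demand = 0.5 * rho * Cd * A * u**2      # drag equation, load in newtons

# Hypothetical capacity: nominal strength degraded by an age/corrosion factor.
nominal = rng.normal(60_000.0, 8_000.0, size=N)  # assumed spec strength (N)
age_factor = rng.uniform(0.6, 1.0, size=N)       # assumed reduction with age
capacity = nominal * age_factor

# The component fails in a realization whenever demand exceeds capacity.
p_fail = np.mean(demand > capacity)
print(f"estimated failure probability: {p_fail:.4f}")
```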

  14. Improving ED specimen TAT using Lean Six Sigma.

    PubMed

    Sanders, Janet H; Karr, Tedd

    2015-01-01

    Lean and Six Sigma are continuous improvement methodologies that have garnered international fame for improving manufacturing and service processes. Increasingly these methodologies are demonstrating their power to also improve healthcare processes. The purpose of this paper is to discuss a case study for the application of Lean and Six Sigma tools in the reduction of turnaround time (TAT) for Emergency Department (ED) specimens. This application of the scientific methodologies uncovered opportunities to improve the entire ED-to-lab system for the specimens. This case study provides details on the completion of a Lean Six Sigma project in a 1,000-bed tertiary care teaching hospital. Six Sigma's Define, Measure, Analyze, Improve, and Control (DMAIC) methodology is very similar to good medical practice: first, relevant information is obtained and assembled; second, a careful and thorough diagnosis is completed; third, a treatment is proposed and implemented; and fourth, checks are made to determine if the treatment was effective. Lean's primary goal is to do more with less work and waste. The Lean methodology was used to identify and eliminate waste through rapid implementation of change. The initial focus of this project was the reduction of turnaround times for ED specimens. However, the results led to better processes for both the internal and external customers of this and other processes. The project results included: a 50 percent decrease in vials used for testing, a 50 percent decrease in unused or extra specimens, a 90 percent decrease in ED specimens without orders, a 30 percent decrease in complete blood count analysis (CBCA) median TAT, a 50 percent decrease in CBCA TAT variation, a 10 percent decrease in Troponin TAT variation, an 18.2 percent decrease in URPN TAT variation, and a 2-5 minute decrease in ED registered nurses' rainbow draw time. This case study demonstrated how the quantitative power of Six Sigma and the speed of Lean worked in harmony to improve the blood draw process for a 1,000-bed tertiary care teaching hospital. The blood draw process is a standard process used in hospitals to collect blood chemistry and hematology information for clinicians. The methods used in this case study demonstrated valuable and practical applications of process improvement methodologies that can be used for any hospital process and/or service environment. While this is not the first case study to demonstrate the use of continuous process improvement methodologies to improve a hospital process, it is unique in the way it utilizes the strength of a project-focused approach that adheres more to the structure and rigor of Six Sigma and relies less on the speed of Lean. Additionally, the application of these methodologies in healthcare is emerging research.

  15. 77 FR 54884 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-06

    ... design and methodological research, cognitive interviews were completed with eighty-three respondents in... assessing and directing federal, state, and local government programs designed to promote the activities of...

  16. The Emerging Role of Meditation in Addressing Psychiatric Illness, with a Focus on Substance Use Disorders

    PubMed Central

    Dakwar, Elias; Levin, Frances R.

    2011-01-01

    Over the past 30 years the practice of meditation has become increasingly popular in clinical settings. In addition to evidence-based medical uses, meditation may have psychiatric benefits. In this review, the literature on the role of meditation in addressing psychiatric issues, and specifically substance use disorders, is discussed. Each of the three meditation modalities that have been most widely studied—transcendental meditation, Buddhist meditation, and mindfulness-based meditation—is critically examined in terms of its background, techniques, mechanisms of action, and evidence-based clinical applications, with special attention given to its emerging role in the treatment of substance use disorders. The unique methodological difficulties that beset the study of meditation are also considered. A brief discussion then integrates the research that has been completed thus far, elucidates the specific ways that meditation may be helpful for substance use disorders, and suggests new avenues for research. PMID:19637074

  17. A complete-pelvis segmentation framework for image-free total hip arthroplasty (THA): methodology and clinical study.

    PubMed

    Xie, Weiguo; Franke, Jochen; Chen, Cheng; Grützner, Paul A; Schumann, Steffen; Nolte, Lutz-P; Zheng, Guoyan

    2015-06-01

    Complete-pelvis segmentation in antero-posterior pelvic radiographs is required to create a patient-specific three-dimensional pelvis model for surgical planning and postoperative assessment in image-free navigation of total hip arthroplasty. A fast and robust framework for accurately segmenting the complete pelvis is presented, consisting of two consecutive modules. In the first module, a three-stage method was developed to delineate the left hemi-pelvis based on statistical appearance and shape models. To handle complex pelvic structures, anatomy-specific information processing techniques were employed. As the input to the second module, the delineated left hemi-pelvis was then reflected about an estimated symmetry line of the radiograph to initialize the right hemi-pelvis segmentation. The right hemi-pelvis was segmented by the same three-stage method. Two experiments, conducted on 143 and 40 AP radiographs respectively, demonstrated a mean segmentation accuracy of 1.61±0.68 mm. A clinical study to investigate the postoperative assessment of acetabular cup orientations based on the proposed framework revealed an average accuracy of 1.2°±0.9° and 1.6°±1.4° for anteversion and inclination, respectively. Delineation of each radiograph takes less than one minute. Although further validation is needed, the preliminary results imply the underlying clinical applicability of the proposed framework for image-free THA. Copyright © 2014 John Wiley & Sons, Ltd.

  18. 77 FR 33463 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ..., or methodological studies. Participation in NHANES is completely voluntary and confidential. NHANES programs produce descriptive statistics which measure the health and nutrition status of the general...

  19. Integration of Personal Digital Assistant (PDA) Devices into the Military Healthcare Clinic Environment

    DTIC Science & Technology

    2001-09-01

    [Table-of-contents fragments only; recoverable headings: Methodology; Patient Tracker 4.1; Sample Clinical...; Partially Completed or Missed?; Candidate Solutions; Success Story]

  20. Theory and procedures for finding a correct kinetic model for the bacteriorhodopsin photocycle.

    PubMed

    Hendler, R W; Shrager, R; Bose, S

    2001-04-26

    In this paper, we present the implementation and results of a new methodology based on linear algebra. The theory behind these methods is covered in detail in the Supporting Information, available electronically (Shrager and Hendler). In brief, the methods presented search through all possible forward sequential submodels in order to find candidates that can be used to construct a complete model for the BR photocycle. The methodology is limited to forward sequential models; if no such models are compatible with the experimental data, none will be found. The procedures apply objective tests and filters to eliminate possibilities that cannot be correct, thus cutting the total number of candidate sequences to be considered. In the current application, which uses six exponentials, the total number of sequences was cut from 1950 to 49. The remaining sequences were further screened using known experimental criteria. The approach led to a solution consisting of a pair of sequences, one with five exponentials (BR* → Lf → Mf → N → O → BR) and the other with three exponentials (BR* → Ls → Ms → BR). The deduced complete kinetic model for the BR photocycle is thus either a single photocycle branched at the L intermediate or a pair of two parallel photocycles. Reasons for preferring the parallel photocycles are presented. Synthetic data constructed on the basis of the parallel photocycles were indistinguishable from the experimental data in a number of analytical tests that were applied.
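
    For readers unfamiliar with forward sequential kinetic models, the following sketch builds the rate matrix for a hypothetical three-intermediate scheme and propagates the populations with a matrix exponential; the resulting traces are sums of exponentials, which is what such model-search procedures fit against measured data. The states and rate constants are illustrative, not fitted BR-photocycle values.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical forward sequential scheme: S0 -> S1 -> S2 -> ground state.
k = [5.0e4, 8.0e3, 1.2e3]  # assumed first-order rate constants (1/s)

# Rate matrix K for dc/dt = K c, where c holds the intermediate populations.
K = np.array([
    [-k[0],   0.0,    0.0],
    [ k[0], -k[1],    0.0],
    [  0.0,  k[1],  -k[2]],
])

c0 = np.array([1.0, 0.0, 0.0])      # all population starts in S0
times = np.logspace(-6, -2, 200)    # seconds
conc = np.array([expm(K * t) @ c0 for t in times])

# Each population trace is a sum of exponentials exp(-k_i * t); a model-search
# procedure compares such traces against measured decays and discards
# candidate sequences that are incompatible with the data.
print(conc[::50])
```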

  1. The WFIRST Microlensing Survey: Expectations and Unexpectations

    NASA Astrophysics Data System (ADS)

    Gaudi, B. Scott; Penny, Matthew

    2016-01-01

    The WFIRST microlensing survey will provide the definitive determination of the demographics of cool planets with semimajor axes > 1 AU and masses greater than that of the Earth, including free-floating planets. Together with the results from Kepler, TESS, and PLATO, WFIRST will complete the statistical census of planets in the Galaxy. These expectations are based on the most basic and conservative assumptions about the data quality, and assume that the analysis methodologies will be similar to those used for current ground-based microlensing. Yet, in fact, the data quality will be dramatically better, and the information content substantially richer, for the WFIRST microlensing survey as compared to current ground-based surveys. Thus WFIRST should allow for orders of magnitude improvement in both sensitivity and science yield. We will review some of these expected improvements and opportunities (the "known unknowns"), and provide a "to do list" of what tasks will need to be completed in order to take advantage of these opportunities. We will then speculate on the opportunities that we may not be aware of yet (the "unknown unknowns"), how we might go about determining what those opportunities are, and how we might figure out what we will need to do to take advantage of them. This work was partially supported by NASA grant NNX14AF63G.

  2. Consultative Committee on Road Traffic Fatalities: trauma audit methodology.

    PubMed

    McDermott, F T; Cordner, S M; Tremayne, A B

    2000-10-01

    Since 1992 the Consultative Committee on Road Traffic Fatalities in Victoria has identified deficiencies and errors in the management of 559 road traffic fatalities in which the patients were alive on arrival of ambulance services. The Committee also assessed the preventability of deaths. Reproducibility of results using its methodology has been shown to be statistically significant. The Committee's findings and recommendations, the latter made in association with the learned Colleges and specialist Societies, led to the establishment of a Ministerial Taskforce on Trauma and Emergency Services. As a consequence, in 2000, a new trauma care system will be implemented in Victoria. This paper presents a case example demonstrating the Committee's methodology. The Committee has two 12-member multidisciplinary evaluative panels. A retrospective evaluation was made of the complete ambulance, hospital, and autopsy records of eligible fatalities. The clinical and pathological findings were analysed using a comprehensive data proforma, a narrative summary, and the complete records. Through the resulting multidisciplinary discussion, problems were identified and the potential preventability of death was assessed. In the present case example the Committee identified 16 management deficiencies, of which 11 were assessed as having contributed to the patient's death; the death, however, was judged to be non-preventable. The presentation of this example demonstrating the Committee's methodology may be of assistance to hospital medical staff undertaking their own major trauma audit.

  3. Human-Centered Design Study: Enhancing the Usability of a Mobile Phone App in an Integrated Falls Risk Detection System for Use by Older Adult Users

    PubMed Central

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background Design processes such as human-centered design (HCD), which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of HCD can often conflict with the necessarily rapid product development life-cycles associated with the competitive connected health industry. Objective The aim of this study was to apply a structured HCD methodology to the development of a smartphone app that was to be used within a connected health fall risk detection system. Our methodology utilizes so-called discount usability engineering techniques to minimize the burden on resources during development and maintain a rapid pace of development. This study will provide prospective designers with a detailed description of the application of an HCD methodology. Methods A 3-phase methodology was applied. In the first phase, a descriptive "use case" was developed by the system designers and analyzed by both expert stakeholders and end users. The use case described the use of the app and how various actors would interact with it and in what context. A working app prototype and a user manual were then developed based on this feedback and were subjected to a rigorous usability inspection. Further changes were made both to the interface and support documentation. The now advanced prototype was exposed to user testing by end users, where further design recommendations were made. Results Combined expert and end-user analysis of a comprehensive use case originally identified 21 problems with the system interface; only 3 of these problems were observed in user testing, implying that 18 problems were eliminated between phase 1 and 3. Satisfactory ratings were obtained during validation testing by both experts and end users, and final testing by users shows the system requires low mental, physical, and temporal demands according to the NASA Task Load Index (NASA-TLX). Conclusions From our observation of older adults' interactions with smartphone interfaces, some recurring themes emerged. Clear and relevant feedback as the user attempts to complete a task is critical. Feedback should include pop-ups, sound tones, color or texture changes, or icon changes to indicate that a function has been completed successfully, such as for the connection sequence. For text feedback, clear and unambiguous language should be used so as not to create anxiety, particularly when it comes to saving data. Warning tones or symbols, such as caution symbols or shrill tones, should only be used if absolutely necessary. Our HCD methodology, designed and implemented based on the principles of the International Organization for Standardization (ISO) 9241-210 standard, produced a functional app interface within a short production cycle, which is now suitable for use by older adults in long-term clinical trials. PMID:28559227

  4. Phenetic Comparison of Prokaryotic Genomes Using k-mers

    PubMed Central

    Déraspe, Maxime; Raymond, Frédéric; Boisvert, Sébastien; Culley, Alexander; Roy, Paul H.; Laviolette, François; Corbeil, Jacques

    2017-01-01

    Abstract Bacterial genomics studies are getting more extensive and complex, requiring new ways to envision analyses. Using the Ray Surveyor software, we demonstrate that comparison of genomes based on their k-mer content allows reconstruction of phenetic trees without the need for prior data curation, such as core genome alignment of a species. We validated the methodology using simulated genomes and previously published phylogenomic studies of Streptococcus pneumoniae and Pseudomonas aeruginosa. We also investigated the relationship of specific genetic determinants with bacterial population structures. By comparing clusters derived from the complete genomic content of a genome population with clusters derived from specific functional categories of genes, we can determine how the population structures are correlated. Indeed, clustering strains on a subset of k-mers allows determination of that subset's similarity with the whole-genome clusters. We also applied this methodology to 42 species of bacteria to determine the correlational significance of five important bacterial genomic characteristics. For example, intrinsic resistance is more important in P. aeruginosa than in S. pneumoniae, and the former shows increased correlation of its population structure with antibiotic resistance genes. The global view of the pangenome of bacteria also demonstrated the taxa-dependent interaction of population structure with antibiotic resistance, bacteriophage, plasmid, and mobile element k-mer data sets. PMID:28957508
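
    The core operation in a k-mer-based comparison is reducing each genome to its k-mer set and scoring pairwise overlap. Ray Surveyor does this at scale; the minimal sketch below illustrates only the idea, with toy sequences and an assumed k, and is not the tool's actual algorithm.

```python
from itertools import combinations

def kmer_set(seq: str, k: int = 11) -> set:
    """Return the set of k-mers occurring in a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two k-mer sets."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Toy genomes; real inputs would be full assemblies read from FASTA files.
genomes = {
    "strain_A": "ATGCGTACGTTAGCATGCGTACGTTAGCATGCGTACG",
    "strain_B": "ATGCGTACGTTAGCATGCGTACCTTAGCATGCGTACG",
    "strain_C": "TTTTGGGGCCCCAAAATTTTGGGGCCCCAAAATTTTG",
}
sets = {name: kmer_set(seq) for name, seq in genomes.items()}

# Pairwise similarities; a clustering step over the corresponding distances
# (e.g., neighbor joining) would then yield the phenetic tree.
for a, b in combinations(genomes, 2):
    print(a, b, round(jaccard(sets[a], sets[b]), 3))
```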

  5. Resonance Parameter Adjustment Based on Integral Experiments

    DOE PAGES

    Sobes, Vladimir; Leal, Luiz; Arbanas, Goran; ...

    2016-06-02

    Our project seeks to allow coupling of differential and integral data evaluation in a continuous-energy framework and to use the generalized linear least-squares (GLLS) methodology in the TSURFER module of the SCALE code package to update the parameters of a resolved resonance region evaluation. Recognizing that the GLLS methodology in TSURFER is identical to the mathematical description of a Bayesian update in SAMMY, the SAMINT code was created to use the mathematical machinery of SAMMY to update resolved resonance parameters based on integral data. Traditionally, SAMMY used differential experimental data to adjust nuclear data parameters. Integral experimental data, such as in the International Criticality Safety Benchmark Experiments Project, remain a tool for validation of completed nuclear data evaluations. SAMINT extracts information from integral benchmarks to aid the nuclear data evaluation process. Later, integral data can be used to resolve any remaining ambiguity between differential data sets, highlight troublesome energy regions, determine key nuclear data parameters for integral benchmark calculations, and improve the nuclear data covariance matrix evaluation. Moreover, SAMINT is not intended to bias nuclear data toward specific integral experiments but should be used to supplement the evaluation of differential experimental data. Using GLLS ensures proper weight is given to the differential data.
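
    For reference, the GLLS update (equivalently, a linear Bayesian update) has the standard textbook form below, written in generic notation rather than in TSURFER's or SAMMY's internal variables.

```latex
% Generic GLLS / linear Bayesian update. Prior parameters p_0 with covariance M,
% integral responses d with covariance V, sensitivities S = dT/dp at p_0:
\begin{aligned}
  p_1 &= p_0 + M S^{\top}\left(S M S^{\top} + V\right)^{-1}\bigl(d - T(p_0)\bigr),\\
  M_1 &= M - M S^{\top}\left(S M S^{\top} + V\right)^{-1} S M .
\end{aligned}
```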

  6. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observations based on analysis leading to the Ground Systems Preliminary Design Review milestone.
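
    The availability estimates referred to above are typically built from standard steady-state relations of reliability engineering; as a generic reminder (not the project's specific models):

```latex
% Steady-state availability of a repairable subsystem (generic relation):
% MTBF = mean time between failures, MTTR = mean time to repair.
A = \frac{\mathrm{MTBF}}{\mathrm{MTBF} + \mathrm{MTTR}},
\qquad
A_{\text{series}} = \prod_{i=1}^{n} A_i \quad \text{(independent subsystems in series)}.
```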

  7. Optimization of Well Configuration for a Sedimentary Enhanced Geothermal Reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Mengnan; Cho, JaeKyoung; Zerpa, Luis E.

    The extraction of geothermal energy in the form of hot water from sedimentary rock formations could expand the current geothermal energy resources toward new regions. From previous work, we observed that sedimentary geothermal reservoirs with relatively low permeability would require the application of enhancement techniques (e.g., well hydraulic stimulation) to achieve commercial production/injection rates. In this paper we extend our previous work to develop a methodology to determine the optimum well configuration that maximizes the hydraulic performance of the geothermal system. The geothermal systems considered consist of one vertical well doublet system with hydraulic fractures, and three horizontal well configurations with open-hole completion, longitudinal fractures, and transverse fractures, respectively. A commercial thermal reservoir simulator is used to evaluate the geothermal reservoir performance, using as design parameters the well spacing and the length of the horizontal wells. The results obtained from the numerical simulations are used to build a response surface model based on the multiple linear regression method. The optimum configuration of the sedimentary geothermal systems is obtained from the analysis of the response surface model. The proposed methodology is applied to a case study based on a reservoir model of the Lyons sandstone formation, located in the Wattenberg field, Denver-Julesburg basin, Colorado.
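
    A response surface built by multiple linear regression, as described above, can be sketched as follows. The design variables, synthetic data, and quadratic form are assumptions for illustration, not the Lyons sandstone model; the optimum is found by solving the stationarity condition of the fitted quadratic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical simulation results: 30 design points and a performance metric.
# x1 = well spacing (m), x2 = horizontal well length (m), y = hydraulic performance.
x1 = rng.uniform(200.0, 800.0, 30)
x2 = rng.uniform(500.0, 2000.0, 30)
y = -(x1 - 550.0)**2 / 1e4 - (x2 - 1400.0)**2 / 1e5 + rng.normal(0.0, 0.5, 30)

# Quadratic response surface: y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2.
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted quadratic: solve grad f = 0 (a 2x2 system).
H = np.array([[2.0 * beta[3], beta[5]],
              [beta[5], 2.0 * beta[4]]])
g = -np.array([beta[1], beta[2]])
opt = np.linalg.solve(H, g)
print(f"fitted optimum: spacing ~ {opt[0]:.0f} m, length ~ {opt[1]:.0f} m")
```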

  8. A way forward for teaching and learning of Physiology: Students’ perception of the effectiveness of teaching methodologies

    PubMed Central

    Rehan, Rabiya; Ahmed, Khalid; Khan, Hira; Rehman, Rehana

    2016-01-01

    Objective: To compare the perception of medical students on the usefulness of interactive lectures, case-based lectures, and structured interactive sessions (SIS) in the teaching and learning of Physiology. Methods: A cross-sectional study was carried out from January to December 2012 at Bahria University Medical & Dental College, Karachi, with qualitative and quantitative aspects assessed by a self-reported questionnaire and focus group discussion (FGD). The questionnaire was distributed to 100 medical students after completion of the first year of teaching of MBBS Physiology. The data were analyzed using SPSS version 15. Differences were considered significant at p-values <0.05 after application of the Friedman test. Responses of the FGD were analyzed. Results: All the teaching methodologies helped in understanding of precise learning objectives. The comprehension of structure and functions with understanding of difficult concepts was made best possible by SIS (p=0.04, p<0.01). SIS enabled adult learning, self-directed learning, peer learning, and critical reasoning more than the other teaching strategies (p<0.01). Conclusion: SIS involved students who used reasoning skills and the power of discussion in a group to comprehend difficult concepts for better understanding of Physiology as compared with interactive and case-based lectures. PMID:28083047

  9. Need for a marginal methodology in assessing natural gas system methane emissions in response to incremental consumption.

    PubMed

    Mac Kinnon, Michael; Heydarzadeh, Zahra; Doan, Quy; Ngo, Cuong; Reed, Jeff; Brouwer, Jacob

    2018-05-17

    Accurate quantification of methane emissions from the natural gas system is important for establishing greenhouse gas inventories and understanding cause and effect for reducing emissions. Current carbon intensity methods generally assume methane emissions are proportional to gas throughput, so that increases in gas consumption yield linear increases in emitted methane. However, emissions sources are diverse and many are not proportional to throughput. Insights into the causal drivers of system methane emissions, and into how system-wide changes affect such drivers, are required. A novel cause-based methodology to assess marginal methane emissions per unit of fuel consumed is introduced. The carbon intensities of technologies consuming natural gas are critical metrics currently used in policy decisions for reaching environmental goals. For example, the low-carbon fuel standard in California uses carbon intensity to determine incentives provided. Current methods generally assume methane emissions from the natural gas system are completely proportional to throughput. The proposed cause-based marginal emissions method will provide a better understanding of the actual drivers of emissions to support development of more effective mitigation measures. Additionally, increasing the accuracy of carbon intensity calculations supports the development of policies that can maximize the environmental benefits of alternative fuels, including reducing greenhouse gas emissions.
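
    The distinction the authors draw can be stated compactly: a throughput-proportional method assigns the average emission intensity to every increment of consumption, whereas a cause-based method uses the marginal intensity. In generic notation (not the paper's symbols):

```latex
% E(Q): system methane emissions as a function of gas throughput Q.
I_{\text{avg}} = \frac{E(Q)}{Q},
\qquad
I_{\text{marg}} = \frac{dE}{dQ}.
% A throughput-proportional model assumes E(Q) = cQ, so I_avg = I_marg = c.
% With fixed, throughput-independent sources, E(Q) = E_0 + cQ gives
% I_marg = c < I_avg: incremental consumption emits less than the average implies.
```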

  10. Advanced Methodologies for NASA Science Missions

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculations of advanced physical models based on these datasets. But considerable thought is also needed on what computations are needed. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after it is telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers; and science analysis that is performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.

  11. Pragmatic estimation of a spatio-temporal air quality model with irregular monitoring data

    NASA Astrophysics Data System (ADS)

    Sampson, Paul D.; Szpiro, Adam A.; Sheppard, Lianne; Lindström, Johan; Kaufman, Joel D.

    2011-11-01

    Statistical analyses of health effects of air pollution have increasingly used GIS-based covariates for prediction of ambient air quality in "land use" regression models. More recently these spatial regression models have accounted for spatial correlation structure in combining monitoring data with land use covariates. We present a flexible spatio-temporal modeling framework and a pragmatic, multi-step estimation procedure that accommodates essentially arbitrary patterns of missing data with respect to an ideally complete space-by-time matrix of observations on a network of monitoring sites. The methodology incorporates a model for smooth temporal trends with coefficients varying in space according to Partial Least Squares regressions on a large set of geographic covariates, together with nonstationary modeling of the spatio-temporal residuals from these regressions. This work was developed to provide spatial point predictions of PM2.5 concentrations for the Multi-Ethnic Study of Atherosclerosis and Air Pollution (MESA Air) using irregular monitoring data derived from the AQS regulatory monitoring network and supplemental short-time-scale monitoring campaigns conducted to better predict intra-urban variation in air quality. We demonstrate the interpretation and accuracy of this methodology in modeling data from 2000 through 2006 in six U.S. metropolitan areas and establish a basis for likelihood-based estimation.
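
    A minimal sketch of the trend-coefficient step described above, regressing site-specific temporal-trend coefficients on many geographic covariates with Partial Least Squares, might look like the following. The covariates, dimensions, and component count are assumptions, and the spatio-temporal residual model is omitted entirely.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
n_sites, n_covariates = 60, 200   # many GIS covariates per monitoring site

# Hypothetical GIS covariates (road density, land use, elevation, ...) and a
# per-site coefficient of a smooth temporal trend basis function.
X = rng.normal(size=(n_sites, n_covariates))
true_w = np.zeros(n_covariates)
true_w[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]           # few truly informative covariates
beta = X @ true_w + rng.normal(0.0, 0.5, n_sites)  # trend coefficient per site

# PLS handles n_covariates >> n_sites by projecting onto a few latent components.
pls = PLSRegression(n_components=3)
pls.fit(X, beta)

# Predict the trend coefficient at unmonitored locations from covariates alone.
X_new = rng.normal(size=(5, n_covariates))
print(pls.predict(X_new).ravel())
```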

  12. Cu-Al-Ni-SMA-Based High-Damping Composites

    NASA Astrophysics Data System (ADS)

    López, Gabriel A.; Barrado, Mariano; San Juan, Jose; Nó, María Luisa

    2009-08-01

    Recently, absorption of vibration energy by mechanical damping has attracted much attention in several fields such as vibration reduction in the aircraft and automotive industries, nanoscale vibration isolation in high-precision electronics, building protection in civil engineering, etc. Typically, the most used high-damping materials are based on polymers due to their viscoelastic behavior. However, polymeric materials usually show a low elastic modulus and are not stable at relatively low temperatures (≈323 K). Therefore, alternative materials for damping applications are needed. In particular, shape memory alloys (SMAs), which intrinsically present high damping capacity thanks to the dissipative hysteretic movement of interfaces under external stresses, are very good candidates for high-damping applications. A completely new approach was applied to produce high-damping composites with relatively high stiffness. Cu-Al-Ni shape memory alloy powders were embedded in metallic matrices of pure In, an In-10wt.%Sn alloy, and an In-Sn eutectic alloy. The production methodology is described. The composite microstructures and damping properties were characterized. A good distribution of the Cu-Al-Ni particles in the matrices was observed. The composites exhibit very high damping capacities over relatively wide temperature ranges. The methodology introduced provides versatility to control the temperature of maximum damping by adjusting the shape memory alloy composition.

  13. Agile methodology selection criteria: IT start-up case study

    NASA Astrophysics Data System (ADS)

    Micic, Lj

    2017-05-01

    Project management in modern IT companies is often based on agile methodologies, which have several advantages compared with traditional methodologies such as waterfall. Given that clients sometimes change a project during development, it is crucial for an IT company to choose carefully which methodology it is going to implement and whether it will be based mostly on one methodology or on a combination of several. Among the many modern methodologies in use, Scrum, Kanban, and XP (extreme programming) are usually the most common. Companies often rely mostly on the tools and procedures of one methodology, but quite often they use some combination of several. Because these methodologies are just frameworks, they allow companies to adapt them to their specific projects as well as to other constraints. Agile methodologies are in limited use in Bosnia, but more and more IT companies are adopting them, not only because it is common practice among their clients abroad but also because it is increasingly the only way to deliver a quality product on time. It remains challenging, however, to decide which methodology or combination of several a company should implement and how to connect it to its own projects, organizational framework, and HR management. This paper presents a case study based on a local IT start-up and delivers a solution based on a theoretical framework and the practical limitations of the case company.

  14. Feasibility and Acceptability of Text Messaging to Assess Daily Substance Use and Sexual Behaviors among Urban Emerging Adults.

    PubMed

    Bonar, Erin E; Cunningham, Rebecca M; Collins, R Lorraine; Cranford, James A; Chermack, Stephen T; Zimmerman, Marc A; Blow, Frederic C; Walton, Maureen A

    2018-01-01

    Daily process research can help distinguish causal relationships between substance use and sexual risk behaviors in high-risk groups, such as urban emerging adults. We employed text messaging to assess 18-25 year-olds' daily substance use and sexual risk behaviors over 28 days. We describe the implementation of this method, attitudes regarding the daily surveys, and correlates of survey completion. We recruited 111 emerging adults from an urban Emergency Department in a resource-limited area who reported recent drug use and unprotected sex (M age = 22.0; 53.2% female; 45.1% African American; 43.2% receiving public assistance). Respondents completed M = 18.0 (SD = 8.7) of 28 daily surveys (27 items each). Participants completing a 1-month follow-up found the surveys not at all/only a little annoying (90.3%) and were comfortable with questions about drugs/alcohol (97.9%) and sex (94.6%). Completion was higher on weekdays versus weekends, and earlier in the study. Daily survey completion was unrelated to same-day substance use measured by the Timeline Follow Back at follow-up; polysubstance use and drinks consumed were associated with lower odds of next-day completion. School enrollment, public assistance, unlimited texting plan, lower baseline alcohol use, and depression symptoms at follow-up were associated with higher completion. Technology difficulties were commonly mentioned barriers to completion. Participants in this urban, resource-constrained sample found the daily text message methodology acceptable for reporting sensitive information. With rapid advancements in technologies and increased accessibility, text messaging remains a promising methodology for the study of daily processes in substance use and HIV risk behaviors. Keywords: text messaging; assessment; emerging adults; substance use; risky sex; mobile technology.

  15. Complete Sequence of the Intronless Mitochondrial Genome of the Saccharomyces cerevisiae Strain CW252

    PubMed Central

    2018-01-01

    ABSTRACT The mitochondrial genomes of Saccharomyces cerevisiae strains contain up to 13 introns. An intronless recombinant genome introduced into the nuclear background of S. cerevisiae strain W303 gave the S. cerevisiae CW252 strain, which is used to model mitochondrial respiratory pathologies. The complete sequence of this mitochondrial genome was obtained using a hybrid assembling methodology. PMID:29700138

  16. Complete all-optical processing polarization-based binary logic gates and optical processors.

    PubMed

    Zaghloul, Y A; Zaghloul, A R M

    2006-10-16

    We present a complete all-optical-processing polarization-based binary-logic system, by which any logic gate or processor can be implemented. Following the new polarization-based logic presented in [Opt. Express 14, 7253 (2006)], we develop a new parallel processing technique that allows for the creation of all-optical-processing gates that produce a unique output (either logic 1 or 0) only once in a truth table, and those that do not. This representation allows for the implementation of simple unforced OR, AND, XOR, XNOR, inverter, and, more importantly, NAND and NOR gates that can be used independently to represent any Boolean expression or function. In addition, the concept of a generalized gate is presented, which opens the door to reconfigurable optical processors and programmable optical logic gates. Furthermore, the new design is completely compatible with the old one presented in [Opt. Express 14, 7253 (2006)], and with current semiconductor-based devices. The gates can be cascaded, where the information is always on the laser beam. The polarization of the beam, and not its intensity, carries the information. The new methodology allows for the creation of multiple-input-multiple-output processors that by themselves implement any Boolean function, such as specialized or non-specialized microprocessors. Three all-optical architectures are presented: the orthoparallel optical logic architecture for all known and unknown binary gates, the single-branch architecture for only XOR and XNOR gates, and the railroad (RR) architecture for polarization optical processors (POP). All the control inputs are applied simultaneously, leading to a single time lag and hence a very fast and glitch-immune POP. A simple and easy-to-follow step-by-step algorithm is provided for the POP, and design reduction methodologies are briefly discussed. The algorithm lends itself systematically to software programming and computer-assisted design. As examples, designs of all binary gates, multiple-input gates, and sequential and non-sequential Boolean expressions are presented and discussed. The operation of each design is simply understood by a bullet train traveling at the speed of light on a railroad system preconditioned by the crossover states predetermined by the control inputs. The presented designs allow for optical processing of the information, eliminating the need to convert it, back and forth, to an electronic signal for processing purposes. All gates with a truth table, including for example Fredkin, Toffoli, testable reversible logic, and threshold logic gates, can be designed and implemented using the railroad architecture. That includes any future gates not known today. Those designs and the quantum gates are not discussed in this paper.

  17. Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows

    NASA Astrophysics Data System (ADS)

    Jittamai, Phongchai

    This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products through an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves, and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling areas, are incorporated in this study. The distribution of multiple products through an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and formulated mathematically. The main focus of this dissertation is the investigation of operating issues and problem complexity of single-source pipeline problems, together with a solution methodology to compute an input schedule that yields minimum total time violation of due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute an input schedule for the pipeline problem. This algorithm runs in no more than O(T·E) time. This dissertation also extends the study to examine some operating attributes and the problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used for single-source pipeline problems is introduced; it also runs in no more than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions: only 25% of the problems tested were more than 30% above optimal values, and approximately 40% of the tested problems were solved optimally by the algorithms.

  18. The medical educator, the discourse analyst, and the phonetician: a collaborative feedback methodology for clinical communication.

    PubMed

    Woodward-Kron, Robyn; Stevens, Mary; Flynn, Eleanor

    2011-05-01

    Frameworks for clinical communication assist educators in making explicit the principles of good communication and providing feedback to medical trainees. However, existing frameworks rarely take into account the roles of culture and language in communication, which can be important for international medical graduates (IMGs) whose first language is not English. This article describes the collaboration of a medical educator, a discourse analyst, and a phonetician to develop a communication and language feedback methodology to assist IMG trainees at a Victorian hospital in Australia with developing their doctor-patient communication skills. The Communication and Language Feedback (CaLF) methodology incorporates a written tool and video recording of role-plays of doctor-patient interactions in a classroom setting or in an objective structured clinical examination (OSCE) practice session with a simulated patient. IMG trainees receive verbal feedback from their hospital-based medical clinical educator, the simulated patient, and linguists. The CaLF tool was informed by a model of language in context, observation of IMG communication training, and process evaluation by IMG participants during January to August 2009. The authors provided participants with a feedback package containing their practice video (which included verbal feedback) and the completed CaLF tool. The CaLF methodology provides a tool for medical educators and language practitioners to work collaboratively with IMGs to enhance communication and language skills. The ongoing interdisciplinary collaboration also provides much-needed applied research opportunities in intercultural health communication, an area the authors believe cannot be adequately addressed from the perspective of one discipline alone. Copyright © by the Association of American Medical Colleges.

  19. 78 FR 43883 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ...,000 additional persons might participate in tests of procedures, special studies, or methodological... participants who have completed the NHANES examination. This information is designed to better understand...

  20. The effectiveness of evidence-based nursing on development of nursing students' critical thinking: A meta-analysis.

    PubMed

    Cui, Chuyun; Li, Yufeng; Geng, Dongrong; Zhang, Hui; Jin, Changde

    2018-06-01

    The aim of this meta-analysis was to assess the effectiveness of evidence-based nursing (EBN) on the development of critical thinking for nursing students. A systematic literature review of original studies on randomized controlled trials was conducted. The relevant randomized controlled trials were retrieved from multiple electronic databases including the Cochrane Central Register of Controlled Trials (CENTRAL), PubMed, EMBASE, Web of Science, Cumulative Index to Nursing and Allied Health (CINAHL), Chinese BioMed Database (CBM), China National Knowledge Infrastructure (CNKI), and WanFang Database. For a systematic evaluation, studies were selected according to inclusion and exclusion criteria; data were then extracted and quality assessed. The data extraction was completed by two independent reviewers, and the methodological quality assessment was completed by another two reviewers. All of the data were analyzed with the software RevMan 5.3. A total of nine studies with 1079 nursing students were included in this systematic literature review. The result of this meta-analysis showed that the effectiveness of evidence-based nursing was superior to that of traditional teaching for nursing students' critical thinking. The results of this meta-analysis indicate that evidence-based nursing could help nursing students to promote their development of critical thinking. Further studies of higher quality and with larger sample sizes can be analyzed in the future. Copyright © 2018. Published by Elsevier Ltd.
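
    The pooled estimate behind such a meta-analysis is typically an inverse-variance weighted mean, with a between-study variance term for a random-effects model. The sketch below uses the standard DerSimonian-Laird estimator with assumed effect sizes, not the review's data.

```python
import numpy as np

# Hypothetical per-study standardized mean differences and their variances.
effects = np.array([0.45, 0.30, 0.62, 0.28, 0.50])
variances = np.array([0.04, 0.06, 0.05, 0.03, 0.07])

# Fixed-effect weights, then the DerSimonian-Laird estimate of tau^2.
w = 1.0 / variances
mu_fe = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - mu_fe) ** 2)          # Cochran's heterogeneity statistic
df = len(effects) - 1
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)                   # between-study variance

# Random-effects pooled estimate and its standard error.
w_re = 1.0 / (variances + tau2)
mu_re = np.sum(w_re * effects) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled effect = {mu_re:.3f} +/- {1.96 * se_re:.3f} (95% CI half-width)")
```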

  1. Using a Root Cause Analysis Curriculum for Practice-Based Learning and Improvement in General Surgery Residency.

    PubMed

    Ramanathan, Rajesh; Duane, Therese M; Kaplan, Brian J; Farquhar, Doris; Kasirajan, Vigneshwar; Ferrada, Paula

    2015-01-01

    To describe and evaluate a root cause analysis (RCA)-based educational curriculum for quality improvement (QI) practice-based learning and implementation in general surgery residency. A QI curriculum was designed using RCA and spaced-learning approaches to education. The program included a didactic session about the RCA methodology. Resident teams comprising multiple postgraduate years then selected a personal complication, completed an RCA, and presented the findings to the Department of Surgery. Mixed methods consisting of quantitative assessment of performance and qualitative feedback about the program were used to assess the value, strengths, and limitations of the program. Urban tertiary academic medical center. General surgery residents, faculty, and medical students. An RCA was completed by 4 resident teams for the following 4 adverse outcomes: postoperative neck hematoma, suboptimal massive transfusion for trauma, venous thromboembolism, and decubitus ulcer complications. Quantitative peer assessment of their performance revealed proficiency in selecting an appropriate case, defining the central problem, identifying root causes, and proposing solutions. During the qualitative feedback assessment, residents noted value of the course, with the greatest limitation being time constraints and equal participation. An RCA-based curriculum can provide general surgery residents with QI exposure and training that they value. Barriers to successful implementation include time restrictions and equal participation from all involved members. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  2. Effects-Based Operations in the Cyber Domain

    DTIC Science & Technology

    2017-05-03

    as the joint targeting methodology. The description that Batschelet gave the traditional targeting methodology included a process of "Decide, Detect... technology, requires new planning and methodology to fight back. This paper evaluates current Department of Defense doctrine to look at ways to conduct... developing its cyber tactics, techniques, and procedures, which includes various targeting methodologies, such as the use of effects-based

  3. Two-thirds of methodological research remained unpublished after presentation at Cochrane Colloquia: an empirical analysis.

    PubMed

    Chapman, Sarah; Eisinga, Anne; Hopewell, Sally; Clarke, Mike

    2012-05-01

    To determine the extent to which abstracts of methodology research, initially presented at annual meetings of The Cochrane Collaboration, have been published as full reports and over what period of time. A secondary aim was to explore whether full publication varied in different methodological subject areas. The Cochrane Methodology Register (CMR) was searched for all abstracts reporting methodology research, presented at the 11 Cochrane Colloquia from 1997 to 2007. EMBASE, PubMed, and CMR were searched for full publications of the same research. We identified 908 eligible conference abstracts and found full publications for 312 (34.4%) of these, almost half of which (47.1%) had appeared by the end of the first year after the relevant Colloquium. The proportion of abstracts that had not been published by 3 years was 69.7%, falling to 66.2% at 5 years. Publication varied considerably between different methodological areas. Approximately two-thirds of methodological research studies presented at Cochrane Colloquia remain unpublished as full papers at least 5 years later. This highlights the importance of searching conference abstracts if one wishes to find as comprehensive and complete a sample of methodological research as possible. Crown Copyright © 2012. Published by Elsevier Inc. All rights reserved.

  4. Development and participant assessment of a practical quality improvement educational initiative for surgical residents.

    PubMed

    Sellers, Morgan M; Hanson, Kristi; Schuller, Mary; Sherman, Karen; Kelz, Rachel R; Fryer, Jonathan; DaRosa, Debra; Bilimoria, Karl Y

    2013-06-01

    As patient-safety and quality efforts spread throughout health care, the need for physician involvement is critical, yet structured training programs during surgical residency are still uncommon. Our objective was to develop an extended quality-improvement curriculum for surgical residents that included formal didactics and structured practical experience. Surgical trainees completed an 8-hour didactic program in quality-improvement methodology at the start of PGY3. Small teams developed practical quality-improvement projects based on needs identified during clinical experience. With the assistance of the hospital's process-improvement team and surgical faculty, residents worked through their selected projects during the following year. Residents were anonymously surveyed after their participation to assess the experience. During the first 3 years of the program, 17 residents participated, with 100% survey completion. Seven quality-improvement projects were developed, with 57% completing all DMAIC (Define, Measure, Analyze, Improve, Control) phases. Initial projects involved issues of clinical efficiency and later projects increasingly focused on clinical care questions. Residents found the experience educationally important (65%) and believed they were well equipped to lead similar initiatives in the future (70%). Based on feedback, the timeline was expanded from 12 to 24 months and changed to start in PGY2. Developing an extended curriculum using both didactic sessions and applied projects to teach residents the theory and implementation of quality improvement is possible and effective. It addresses the ACGME competencies of practice-based improvement and learning and systems-based practice. Our iterative experience during the past 3 years can serve as a guide for other programs. Copyright © 2013 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  5. Fleet management performance monitoring.

    DOT National Transportation Integrated Search

    2013-05-01

    The principal goal of this project was to enhance and expand the analytical modeling methodology previously developed as part of the Fleet Management Criteria: Disposal Points and Utilization Rates project completed in 2010. The enhanced and ex...

  6. A game-based decision support methodology for competitive systems design

    NASA Astrophysics Data System (ADS)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone, however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and the uncertainty associated with competitive reactions. A normal-form matrix is created to enumerate players, their moves, and payoffs, and to formulate a process by which an optimal decision can be achieved. The non-cooperative model is tested using the concept of a Nash equilibrium to identify potential strategies that are robust to uncertain market fluctuations (e.g., uncertainty in airline demand, airframe requirements, and competitor positioning). A first/second-mover advantage parameter is used as a scenario dial to adjust market rewards and firms' payoffs. The methodology is applied to a commercial aircraft engine selection study where engine firms must select an optimal engine project for development. An engine modeling and simulation framework is developed to generate a broad engine project portfolio. The creation of a customer value model enables designers to incorporate airline operation characteristics into the engine modeling and simulation process to improve the accuracy of engine/customer matching. In summary, several key findings provide recommendations on project selection strategies for firms uncertain as to when they will enter the market. The proposed study demonstrates that within a technical design environment, a rational and analytical means of modeling project development strategies is beneficial in high market risk situations.
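
    In the simplest two-firm case, the normal-form analysis described above reduces to enumerating payoff matrices and checking each strategy pair for unilateral-deviation incentives. The sketch below finds pure-strategy Nash equilibria for assumed payoffs; the numbers are illustrative, not the study's engine-market model, which also treats uncertainty and mover advantages.

```python
import numpy as np

# Hypothetical payoffs: each firm chooses one of two R&D projects.
# A[i, j] = firm 1's payoff, B[i, j] = firm 2's payoff for strategies (i, j).
A = np.array([[3, 1],
              [0, 2]])
B = np.array([[2, 1],
              [0, 3]])

def pure_nash(A, B):
    """All (i, j) at which neither firm gains by deviating unilaterally."""
    eqs = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max():
                eqs.append((i, j))
    return eqs

print(pure_nash(A, B))  # -> [(0, 0), (1, 1)]: two equilibria, as in coordination games
```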

  7. Network representations of angular regions for electromagnetic scattering

    PubMed Central

    2017-01-01

    Network modeling in electromagnetics is an effective technique for treating scattering problems involving canonical and complex structures. Geometries constituted of angular regions (wedges) together with planar layers can now be approached with the Generalized Wiener-Hopf Technique supported by network representation in the spectral domain. Although the network representations in spectral planes are of great importance in their own right, the aim of this paper is to present a theoretical basis and a general procedure for formulating complex scattering problems using network representations for the Generalized Wiener-Hopf Technique, starting from the wave equation. In particular, while spectral network representations are relatively well known for planar layers, network modeling of an angular region requires the new theory developed in this paper. With this theory we complete the formulation of a network methodology whose effectiveness is demonstrated by application to a complex scattering problem, with practical solutions given in terms of GTD/UTD diffraction coefficients and total far fields for engineering applications. The methodology can be applied to other fields of physics. PMID:28817573

  8. Children's Understanding of Behavioral Consequences of Epistemic States: A Comparison of Knowledge, Ignorance, and False Belief.

    PubMed

    Deneault, Joane

    2015-01-01

    The author addressed the issue of the simultaneity of false belief and knowledge understanding by investigating children's ability to predict the behavioral consequences of knowledge, ignorance, and false belief. The second aim of the study was to explore the role of counterfactuals in knowledge understanding. Ninety-nine children, aged 3 to 7 years, completed the unexpected transfer task and a newly designed task in which a protagonist experienced 1 of the following 4 situations: knowing a fact, not knowing a fact, knowing a procedure, and not knowing a procedure. The results showed that factual ignorance was as difficult as false belief for the children, whereas the other conditions were all easier than false belief, suggesting that the well-known lag between ignorance and false belief may be partly methodological in origin. The results provide support for a common underlying conceptual system for both knowing and believing, and evidence of the role of counterfactual reasoning in the development of epistemic state understanding. Methodological variations of the new task are proposed for future research.

  9. Rapid NMR Assignments of Proteins by Using Optimized Combinatorial Selective Unlabeling.

    PubMed

    Dubey, Abhinav; Kadumuri, Rajashekar Varma; Jaipuria, Garima; Vadrevu, Ramakrishna; Atreya, Hanudatta S

    2016-02-15

    A new approach for rapid resonance assignments in proteins based on amino acid selective unlabeling is presented. The method involves choosing a set of multiple amino acid types for selective unlabeling and identifying specific tripeptides surrounding the labeled residues from specific 2D NMR spectra in a combinatorial manner. The methodology directly yields sequence-specific assignments without requiring a contiguous stretch of amino acid residues to be linked, and is applicable to deuterated proteins. We show that a 2D [15N,1H] HSQC spectrum combined with two further 2D spectra can yield ∼50% of the assignments. The methodology was applied to two proteins: an intrinsically disordered protein (12 kDa) and the 29 kDa (268-residue) α-subunit of Escherichia coli tryptophan synthase, which presents a challenging case with spectral overlaps and missing peaks. The method can augment existing approaches and will be useful for applications such as identifying active-site residues involved in ligand binding, phosphorylation, or protein-protein interactions, even prior to complete resonance assignments. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Applications of aerospace technology in biology and medicine

    NASA Technical Reports Server (NTRS)

    Bass, B.; Beall, H. C.; Brown, J. N., Jr.; Clingman, W. H.; Eakes, R. E.; Kizakevich, P. N.; Mccartney, M.; Rouse, D. J.

    1982-01-01

    Utilization of National Aeronautics and Space Administration (NASA) technology in medicine is discussed. The objective is best achieved by stimulating the introduction of new or improved commercially available medical products incorporating aerospace technology. A bipolar donor/recipient model of medical technology transfer is presented to provide a basis for the team's methodology. That methodology is designed to: (1) identify medical problems and NASA technology that, in combination, constitute opportunities for successful medical products; (2) obtain the early participation of industry in the transfer process; and (3) obtain acceptance by the medical community of new medical products based on NASA technology. Two commercial transfers were completed: the Stowaway, a lightweight wheelchair that provides mobility for the disabled and elderly in the cabin of commercial aircraft, and the Micromed, a portable medication infusion pump for the reliable, continuous infusion of medications such as heparin or insulin. The marketing and manufacturing factors critical to the commercialization of a lightweight walker incorporating composite materials were studied. Progress was made in the development and commercialization of each of the 18 currently active projects.

  11. [Commitment and community participation towards health: knowledge creation from the systematization of social experiences].

    PubMed

    López-Bolaños, Lizbeth; Campos-Rivera, Marisol; Villanueva-Borbolla, María Ángeles

    2018-01-01

    Objective. To reflect on the process of committing to participation in the implementation of a strategic health plan, using Participative Systematization of Social Experiences as a tool. Our study was a qualitative research-intervention study based on the Dialectical Methodological Conception approach. We designed and implemented a two-day workshop, six hours daily, using Systematization methodology with a Community Work Group (CWG). During the workshop, women systematized their experience, with commitment as the axis of the process. Using Grounded Theory techniques, we applied micro-analysis to the data in order to identify and strengthen categories that emerged during the systematization process. We completed open and axial coding. The CWG identified that commitment and participation are influenced by group dynamics and structural determinants. They also reconsidered the way they understood and exercised commitment and participation, and generated knowledge that empowers them to improve their future practice. Commitment and participation were determined by group dynamics and structural factors such as socioeconomic conditions and gender roles. These determinants must be made visible and understood in order to generate proposals aimed at strengthening the participation and organization of groups.

  12. A Patient Focused Solution for Enrolling Clinical Trials in Rare and Selective Cancer Indications: A Landscape of Haystacks and Needles

    PubMed Central

    Lynam, Eric B.; Leaw, Jiin; Wiener, Matthew B.

    2013-01-01

    Participation of adult cancer patients in US-based clinical trials has remained near 3% for decades. Traditional research methodology reaches a small fraction of the target population with a fixed number of predetermined sites. Solutions are needed to ethically increase patient participation and accelerate cancer trial completion. We compared enrollment outcomes of traditional and patient-focused research methodologies. A patient-prioritized method (Just-In-Time, JIT) was implemented in parallel with traditionally managed sites in three cancer trials. JIT research sites were initiated after candidate patients presented, while traditional sites were initiated in advance. JIT sites enrolled at mean rates no less than, and up to 2.75-fold greater than, traditional sites. The mean number of patients enrolled per site was comparable (JIT: 1.82; traditional: 1.78). There were fewer non-enrolling JIT sites (2/28, 7%) compared to traditional sites (19/52, 37%). This retrospective analysis supports JIT as a prospective solution to increase cancer clinical trial enrollment and the efficiency of clinical trial administrative activities. PMID:23990689

  13. Image-guided endobronchial ultrasound

    NASA Astrophysics Data System (ADS)

    Higgins, William E.; Zang, Xiaonan; Cheirsilp, Ronnarit; Byrnes, Patrick; Kuhlengel, Trevor; Bascom, Rebecca; Toth, Jennifer

    2016-03-01

    Endobronchial ultrasound (EBUS) is now recommended as a standard procedure for in vivo verification of extraluminal diagnostic sites during cancer-staging bronchoscopy. Yet, physicians vary considerably in their skills at using EBUS effectively. Regarding existing bronchoscopy guidance systems, studies have shown their effectiveness in the lung-cancer management process. With such a system, a patient's X-ray computed tomography (CT) scan is used to plan a procedure to regions of interest (ROIs). This plan is then used during follow-on guided bronchoscopy. Recent clinical guidelines for lung cancer, however, also dictate using positron emission tomography (PET) imaging for identifying suspicious ROIs and aiding in the cancer-staging process. While researchers have attempted to use guided bronchoscopy systems in tandem with PET imaging and EBUS, no true EBUS-centric guidance system exists. We now propose a full multimodal image-based methodology for guiding EBUS. The complete methodology involves two components: 1) a procedure planning protocol that gives bronchoscope movements appropriate for live EBUS positioning; and 2) a guidance strategy and associated system graphical user interface (GUI) designed for image-guided EBUS. We present results demonstrating the operation of the system.

  14. Parametric assessment of climate change impacts of automotive material substitution.

    PubMed

    Geyer, Roland

    2008-09-15

    Quantifying the net climate change impact of automotive material substitution is not a trivial task. It requires the assessment of the mass reduction potential of automotive materials, the greenhouse gas (GHG) emissions from their production and recycling, and their impact on GHG emissions from vehicle use. The model presented in this paper is based on life cycle assessment (LCA) and is completely parameterized, i.e., its computational structure is separated from the required input data, which is not traditionally done in LCAs. The parameterization increases the scientific rigor and transparency of the assessment methodology, facilitates sensitivity and uncertainty analysis of the results, and also makes it possible to compare different studies and explain their disparities. The state of the art of the modeling methodology is reviewed and advanced. Assessment of the GHG emission impacts of material recycling through consequential system expansion shows that our understanding of this issue is still incomplete. This is a critical knowledge gap, since a case study shows that, for materials such as aluminum, the GHG emission impacts of material production and recycling are each of the same size as the use-phase savings from vehicle mass reduction.

  15. Instrumentation and methodology for quantifying GFP fluorescence in intact plant organs

    NASA Technical Reports Server (NTRS)

    Millwood, R. J.; Halfhill, M. D.; Harkins, D.; Russotti, R.; Stewart, C. N. Jr

    2003-01-01

    The General Fluorescence Plant Meter (GFP-Meter) is a portable spectrofluorometer that utilizes a fiber-optic cable and a leaf clip to gather spectrofluorescence data. In contrast to traditional analytical systems, this instrument allows for the rapid detection and fluorescence measurement of proteins under field conditions with no damage to plant tissue. Here we discuss the methodology of gathering and standardizing spectrofluorescence data from tobacco and canola plants expressing GFP, and we demonstrate the accuracy and effectiveness of the GFP-Meter. We first compared GFP fluorescence measurements taken by the GFP-Meter to those taken by a standard laboratory-based spectrofluorometer, the FluoroMax-2, with spectrofluorescence measurements taken from the same location on intact leaves. When these measurements were tested by simple linear regression analysis, we found a positive functional relationship between the instruments. Finally, to show that the GFP-Meter recorded accurate measurements over a span of time, we completed a time-course analysis of GFP fluorescence measurements. We found that only the initial measurements were accurate; subsequent measurements could, however, be used for qualitative purposes.

  16. Breaking the bottleneck: Use of molecular tailoring approach for the estimation of binding energies at MP2/CBS limit for large water clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Gurmeet; Nandi, Apurba; Gadre, Shridhar R., E-mail: gadre@iitk.ac.in

    2016-03-14

    A pragmatic method based on the molecular tailoring approach (MTA) for accurately estimating the complete basis set (CBS) limit at second-order Møller-Plesset perturbation (MP2) theory for large molecular clusters with limited computational resources is developed. It is applied to water clusters, (H2O)n (n = 7, 8, 10, 16, 17, and 25), optimized employing the aug-cc-pVDZ (aVDZ) basis set. Binding energies (BEs) of these clusters are estimated at the MP2/aug-cc-pVNZ (aVNZ) [N = T, Q, and 5 (whenever possible)] levels of theory employing the grafted MTA (GMTA) methodology and are found to lie within 0.2 kcal/mol of the corresponding full-calculation MP2 BE, wherever available. The results are extrapolated to the CBS limit using a three-point formula. The GMTA-MP2 calculations are feasible on off-the-shelf hardware and show around 50%-65% savings in computational time. The methodology has potential for application to molecular clusters containing ∼100 atoms.
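
    The three-point extrapolation mentioned above has a closed form when energies at consecutive basis-set cardinal numbers are used. A minimal sketch, assuming the common exponential model E(X) = E_CBS + A·exp(-B·X) with X = 3, 4, 5 for aVTZ/aVQZ/aV5Z; the example binding energies are invented for illustration, not taken from the paper.

      def cbs_three_point(e_x, e_y, e_z):
          """Three-point CBS extrapolation for energies at consecutive cardinal
          numbers X, X+1, X+2, assuming E(X) = E_CBS + A*exp(-B*X).

          The remaining decrements form a geometric series with ratio
          r = exp(-B), so E_CBS = E(X+2) - (E(X+1) - E(X+2)) * r / (1 - r).
          """
          r = (e_y - e_z) / (e_x - e_y)   # = exp(-B)
          return e_z - (e_y - e_z) * r / (1.0 - r)

      # Hypothetical binding energies (kcal/mol) at aVTZ, aVQZ, aV5Z:
      print(cbs_three_point(-90.50, -90.80, -90.92))   # -> -91.00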

  17. [Evaluation of quality of service in Early Intervention: A systematic review].

    PubMed

    Jemes Campaña, Inmaculada Concepción; Romero-Galisteo, Rita Pilar; Labajos Manzanares, María Teresa; Moreno Morales, Noelia

    2018-06-07

    Early Intervention (EI), as a paediatric service, has a duty to quantify the results and the quality of the services it provides. The availability of valid and reliable tools allows professionals to evaluate the quality of these services. The aim of this study is to review the scientific literature on tools used to measure methodological and service quality in EI. A search was made in different databases: Medline (from PubMed), Web of Science, PsycINFO, Cochrane, Scopus, ERIC and Scielo. The methodological quality of the studies was tested using the COSMIN scale. A total of 13 manuscripts met the criteria for inclusion in this review; ten of them received a "good" or "reasonable" score on the COSMIN scale. Despite its importance, there is no consensus among authors on how to measure service quality in EI. The families of the children attended in EI are usually the study population, although the opinion of professionals carries more weight and completes the information. Copyright © 2018. Publicado por Elsevier España, S.L.U.

  18. Physico-Chemical Alternatives in Lignocellulosic Materials in Relation to the Kind of Component for Fermenting Purposes

    PubMed Central

    Coz, Alberto; Llano, Tamara; Cifrián, Eva; Viguri, Javier; Maican, Edmond; Sixta, Herbert

    2016-01-01

    The complete bioconversion of the carbohydrate fraction is of great importance for a lignocellulosic-based biorefinery. However, due to the structure of lignocellulosic materials, and depending mainly on the parameters of the pretreatment steps, numerous byproducts are generated that act as inhibitors in the fermentation operations. The impact of inhibitory compounds derived from lignocellulosic materials is thus one of the major challenges for a sustainable biomass-to-biofuel and -bioproduct industry. In order to minimise the negative effects of these compounds, numerous methodologies have been tested, including physical, chemical, and biological processes. The main physical and chemical treatments are studied in this work in relation to the lignocellulosic material and the inhibitor, in order to point out the best mechanisms for fermentation purposes. In addition, special attention is paid to lignocellulosic hydrolysates obtained by chemical processes with SO2, due to the complex matrix of these materials and the growing role of these methodologies in future biorefinery markets. Recommendations for different detoxification methods are given. PMID:28773700

  19. Clinical practice guideline development manual: a quality-driven approach for translating evidence into action.

    PubMed

    Rosenfeld, Richard M; Shiffman, Richard N

    2009-06-01

    Guidelines translate best evidence into best practice. A well-crafted guideline promotes quality by reducing health-care variations, improving diagnostic accuracy, promoting effective therapy, and discouraging ineffective, or potentially harmful, interventions. Despite a plethora of published guidelines, methodology is often poorly defined and varies greatly within and among organizations. This manual describes the principles and practices used successfully by the American Academy of Otolaryngology-Head and Neck Surgery to produce quality-driven, evidence-based guidelines using an efficient and transparent methodology for action-ready recommendations with multidisciplinary applicability. The development process, which allows moving from conception to completion in 12 months, emphasizes a logical sequence of key action statements supported by amplifying text, evidence profiles, and recommendation grades that link action to evidence. As clinical practice guidelines become more prominent as a key metric of quality health care, organizations must develop efficient production strategies that balance rigor and pragmatism. Equally important, clinicians must become savvy in understanding what guidelines are (and are not) and how they are best utilized to improve care. The information in this manual should help clinicians and organizations achieve these goals.

  20. A complete equation of state for non-ideal condensed phase explosives

    NASA Astrophysics Data System (ADS)

    Wilkinson, S. D.; Braithwaite, M.; Nikiforakis, N.; Michael, L.

    2017-12-01

    The objective of this work is to improve the robustness and accuracy of numerical simulations of both ideal and non-ideal explosives by introducing temperature dependence in mechanical equations of state for reactants and products. To this end, we modify existing mechanical equations of state to appropriately approximate the temperature in the reaction zone. Mechanical equations of state of the Mie-Grüneisen form are developed with extensions, which allow the temperature to be evaluated appropriately and the temperature equilibrium condition to be applied robustly. Furthermore, the snow plow model is used to capture the effect of porosity on the reactant equation of state. We apply the methodology to predict the velocity of compliantly confined detonation waves. Once reaction rates are calibrated for unconfined detonation velocities, simulations of confined rate sticks and slabs are performed, and the experimental detonation velocities are matched without further parameter alteration, demonstrating the predictive capability of our simulations. We apply the same methodology to both ideal (PBX9502, a high explosive with principal ingredient TATB) and non-ideal (EM120D, an ANE or ammonium nitrate based emulsion) explosives.

  1. A method for the complete analysis of NORM building materials by γ-ray spectrometry using HPGe detectors.

    PubMed

    Quintana, B; Pedrosa, M C; Vázquez-Canelas, L; Santamaría, R; Sanjuán, M A; Puertas, F

    2018-04-01

    A methodology, including software tools, for analysing NORM building materials and residues by low-level gamma-ray spectrometry has been developed. It comprises deconvolution of gamma-ray spectra using the software GALEA, with a focus on the natural radionuclides, and Monte Carlo simulations for efficiency and true-coincidence-summing corrections. The methodology has been tested on a range of building materials and validated against reference materials. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Formalization and Validation of an SADT Specification Through Executable Simulation in VHDL

    DTIC Science & Technology

    1991-12-01

    be found in (39, 40, 41). One recent summary of the SADT methodology was written by Marca and McGowan in 1988 (32). SADT is a methodology to provide...that is required. Also, the presence of "all" inputs and controls may not be needed for the activity to proceed. Marca and McGowan (32) describe a...diagrams which describe a complete system. Marca and McGowan define an SADT Model as: "a collection of carefully coordinated descriptions, starting from a

  3. A low-power photovoltaic system with energy storage for radio communications: Description and design methodology

    NASA Technical Reports Server (NTRS)

    Chapman, C. P.; Chapman, P. D.; Lewison, A. H.

    1982-01-01

    A low-power photovoltaic system was constructed with approximately 500 amp-hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given, with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.
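
    The core of such a design methodology is the autonomy calculation: sizing the battery bank so the load survives the specified no-sun period. Below is a first-order sketch assuming a hypothetical 50 W communications load on a 12 V bus; the depth-of-discharge and efficiency values are illustrative, not taken from the report.

      def battery_capacity_ah(load_w, autonomy_h, bus_v, depth_of_discharge, efficiency):
          """Amp-hour battery capacity needed to ride through a no-sun period.

          First-order sizing only: energy demand divided by usable battery
          energy per amp-hour. Real designs also derate for temperature and age.
          """
          return load_w * autonomy_h / (bus_v * depth_of_discharge * efficiency)

      # Hypothetical 50 W radio load, 12 V bus, 72 h no-sun autonomy,
      # 80% allowable depth of discharge, 85% wiring/conversion efficiency:
      print(round(battery_capacity_ah(50.0, 72.0, 12.0, 0.80, 0.85)))   # ~441 Ah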

  4. Towards Methodologies for Building Knowledge-Based Instructional Systems.

    ERIC Educational Resources Information Center

    Duchastel, Philippe

    1992-01-01

    Examines the processes involved in building instructional systems that are based on artificial intelligence and hypermedia technologies. Traditional instructional systems design methodology is discussed; design issues including system architecture and learning strategies are addressed; and a new methodology for building knowledge-based…

  5. Depth-Based Selective Blurring in Stereo Images Using Accelerated Framework

    NASA Astrophysics Data System (ADS)

    Mukherjee, Subhayan; Guddeti, Ram Mohana Reddy

    2014-09-01

    We propose a hybrid method for stereo disparity estimation by combining block- and region-based stereo matching approaches. It generates dense depth maps from disparity measurements of only 18% of image pixels (left or right). The methodology involves segmenting pixel lightness values using a fast K-Means implementation, refining segment boundaries using morphological filtering and connected-components analysis, and then determining the boundaries' disparities using a sum of absolute differences (SAD) cost function. Complete disparity maps are reconstructed from the boundaries' disparities. We consider an application of our method to depth-based selective blurring of non-interest regions of stereo images, using Gaussian blur to de-focus users' non-interest regions. Experiments on the Middlebury dataset demonstrate that our method outperforms traditional disparity estimation approaches using SAD and normalized cross-correlation by up to 33.6% and some recent methods by up to 6.1%. Further, our method is highly parallelizable using a CPU-GPU framework based on Java Thread Pool and APARAPI, with a speed-up of 5.8 for 250 stereo video frames (4,096 × 2,304).
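
    The SAD cost function named above is simple to state in code. The sketch below computes a winner-takes-all disparity for one scanline of a rectified stereo pair; it is a generic block-matching baseline, not the paper's full segment-and-reconstruct pipeline, and the window size and disparity limit are placeholders.

      import numpy as np

      def sad_disparity_row(left, right, row, max_disp, win=3):
          """Winner-takes-all SAD disparity along one scanline (left image reference).

          For each pixel, compare a win x win patch against candidate disparities
          in the right image and keep the disparity with the smallest sum of
          absolute differences. Assumes rectified 2-D grayscale arrays and
          win//2 <= row < height - win//2.
          """
          half = win // 2
          width = left.shape[1]
          disp = np.zeros(width, dtype=np.int32)
          for x in range(max_disp + half, width - half):
              patch = left[row - half:row + half + 1, x - half:x + half + 1].astype(np.int32)
              costs = [np.abs(patch - right[row - half:row + half + 1,
                                            x - d - half:x - d + half + 1].astype(np.int32)).sum()
                       for d in range(max_disp + 1)]
              disp[x] = int(np.argmin(costs))
          return disp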

  6. Optimal methodologies for terahertz time-domain spectroscopic analysis of traditional pigments in powder form

    NASA Astrophysics Data System (ADS)

    Ha, Taewoo; Lee, Howon; Sim, Kyung Ik; Kim, Jonghyeon; Jo, Young Chan; Kim, Jae Hoon; Baek, Na Yeon; Kang, Dai-ill; Lee, Han Hyoung

    2017-05-01

    We have established optimal methods for terahertz time-domain spectroscopic analysis of highly absorbing pigments in powder form, based on our investigation of representative traditional Chinese pigments such as azurite (a blue pigment), Chinese vermilion (a red pigment), and arsenic yellow (a yellow pigment). To accurately extract the optical constants in the terahertz region of 0.1-3 THz, we carried out transmission measurements in such a way that intense absorption peaks did not completely suppress the transmission level. This required preparation of pellet samples with optimized thicknesses and material densities. In some cases, mixing the pigments with polyethylene powder was required to minimize absorption due to certain peak features. The resulting distortion-free terahertz spectra of the investigated set of pigment species exhibited well-defined, unique spectral fingerprints. Our study will be useful to future efforts to establish non-destructive analysis methods for traditional pigments, to construct their spectral databases, and to apply these tools to the restoration of cultural heritage materials.

  7. Nutrition Education for Pediatric Gastroenterology, Hepatology and Nutrition Fellows: A Survey of NASPGHAN Fellowship Training Programs

    PubMed Central

    Martinez, J. Andres; Koyama, Tatsuki; Acra, Sari; Mascarenhas, Maria R.; Shulman, Robert J.

    2012-01-01

    Objectives The aim of the study was to assess the methodology and content of nutrition education during gastroenterology fellowship training and the variability among the different programs. Methods A survey questionnaire was completed by 43 fellowship training directors of the 62 active programs affiliated with NASPGHAN, including sites in the United States, Canada and Mexico. The data were examined for patterns in teaching methodology and coverage of specific nutrition topics based on Level 1 training in nutrition, which is the minimum requirement according to published NASPGHAN fellowship training guidelines. Results The majority of the teaching was conducted by MD-degree faculty (61%), and most of the education was provided through clinical care experiences. Only 31% of Level 1 nutrition topics were consistently covered by more than 80% of programs, and coverage did not correlate with program size. Competency in nutrition training was primarily assessed through questions to individuals or groups of fellows (77% and 65%, respectively). Program directors cited a lack of faculty interested in nutrition and a high workload as common obstacles to teaching. Conclusions The methodology of nutrition education during gastroenterology fellowship training is for the most part unstructured and inconsistent among the different programs. The minimum Level 1 requirements are not consistently covered. The development of core curricula and learning modules may be beneficial in improving nutrition education. PMID:22343911

  8. Implementation of a formulary management process.

    PubMed

    Karel, Lauren I; Delisle, Dennis R; Anagnostis, Ellena A; Wordell, Cindy J

    2017-08-15

    The application of lean methodology in an initiative to redesign the formulary maintenance process at an academic medical center is described. Maintaining a hospital formulary requires clear communication and coordination among multiple members of the pharmacy department. Using principles of lean methodology, pharmacy department personnel within a multihospital health system launched a multifaceted initiative to optimize formulary management systemwide. The ongoing initiative began with creation of a formulary maintenance redesign committee consisting of pharmacy department personnel with expertise in informatics, automation, purchasing, drug information, and clinical pharmacy services. The committee met regularly and used lean methodology to design a standardized process for management of formulary additions and deletions and changes to medications' formulary status. Through value stream analysis, opportunities for process and performance improvement were identified; staff suggestions on process streamlining were gathered during a series of departmental kaizen events. A standardized template for development and dissemination of monographs associated with formulary additions and status changes was created. In addition, a shared Web-based checklist was developed to facilitate information sharing and timely initiation and completion of tasks involved in formulary status changes, and a permanent formulary maintenance committee was established to monitor and refine the formulary management process. A clearly defined, standardized process within the pharmacy department was developed for tracking necessary steps in enacting formulary changes to encourage safe and efficient workflow. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  9. Renewable Energy Assessment Methodology for Japanese OCONUS Army Installations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solana, Amy E.; Horner, Jacob A.; Russo, Bryan J.

    2010-08-30

    Since 2005, Pacific Northwest National Laboratory (PNNL) has been asked by Installation Management Command (IMCOM) to conduct strategic assessments at selected US Army installations of the potential use of renewable energy resources, including solar, wind, geothermal, biomass, waste, and ground source heat pumps (GSHPs). IMCOM has the same economic, security, and legal drivers to develop alternative, renewable energy resources overseas as it has for installations located in the US. The approach for continental US (CONUS) studies has been to use known, US-based renewable resource characterizations and information sources coupled with local, site-specific sources and interviews. However, the extent to which this sort of data might be available for outside the continental US (OCONUS) sites was unknown. An assessment at Camp Zama, Japan was completed as a trial to test the applicability of the CONUS methodology at OCONUS installations. It was found that, with some help from Camp Zama personnel in translating and locating a few Japanese sources, there was relatively little difficulty in finding sources that should provide a solid basis for conducting an assessment of comparable depth to those conducted for US installations. Project implementation will likely be more of a challenge, but the feasibility analysis will be able to use the same basic steps, with some adjusted inputs, as PNNL's established renewable resource assessment methodology.

  10. Methodological Framework for World Health Organization Estimates of the Global Burden of Foodborne Disease

    PubMed Central

    Devleesschauwer, Brecht; Haagsma, Juanita A.; Angulo, Frederick J.; Bellinger, David C.; Cole, Dana; Döpfer, Dörte; Fazil, Aamir; Fèvre, Eric M.; Gibb, Herman J.; Hald, Tine; Kirk, Martyn D.; Lake, Robin J.; Maertens de Noordhout, Charline; Mathers, Colin D.; McDonald, Scott A.; Pires, Sara M.; Speybroeck, Niko; Thomas, M. Kate; Torgerson, Paul R.; Wu, Felicia; Havelaar, Arie H.; Praet, Nicolas

    2015-01-01

    Background The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force to transform epidemiological information into FBD burden estimates. Methods and Findings The global and regional burden of 31 FBDs was quantified, along with limited estimates for 5 other FBDs, using Disability-Adjusted Life Years in a hazard- and incidence-based approach. To accomplish this task, the following workflow was defined: outline of disease models and collection of epidemiological data; design and completion of a database template; development of an imputation model; identification of disability weights; probabilistic burden assessment; and estimating the proportion of the disease burden by each hazard that is attributable to exposure by food (i.e., source attribution). All computations were performed in R and the different functions were compiled in the R package 'FERG'. Traceability and transparency were ensured by sharing results and methods in an interactive way with all FERG members throughout the process. Conclusions We developed a comprehensive framework for estimating the global burden of FBDs, in which methodological simplicity and transparency were key elements. All the tools developed have been made available and can be translated into a user-friendly national toolkit for studying and monitoring food safety at the local level. PMID:26633883
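
    The arithmetic at the core of such a framework, before imputation and uncertainty propagation, is the DALY sum. Below is a minimal sketch with invented numbers for a single hypothetical foodborne hazard; the FERG computations layer source attribution and probabilistic sampling on top of this.

      def daly(deaths, life_exp_at_death, cases, disability_weight, duration_yr):
          """Simplified DALY = YLL + YLD, without discounting or age weighting.

          YLL: years of life lost to premature mortality.
          YLD: years lived with disability, weighted by severity.
          """
          yll = deaths * life_exp_at_death
          yld = cases * disability_weight * duration_yr
          return yll + yld

      # Hypothetical hazard: 1,000 deaths losing 30 years each, plus
      # 2,000,000 cases at disability weight 0.1 lasting ~1 week (0.02 yr):
      print(daly(1_000, 30.0, 2_000_000, 0.1, 0.02))   # -> 34000.0 DALYs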

  11. Evaluation of near field atmospheric dispersion around nuclear facilities using a Lorentzian distribution methodology.

    PubMed

    Hawkley, Gavin

    2014-12-01

    Atmospheric dispersion modeling within the near field of a nuclear facility typically applies a building wake correction to the Gaussian plume model, whereby a point source is modeled as a plane source. The plane source results in greater near field dilution and reduces the far field effluent concentration. However, the correction does not account for the concentration profile within the near field. Receptors of interest, such as the maximally exposed individual, may exist within the near field and thus the realm of building wake effects. Furthermore, release parameters and displacement characteristics may be unknown, particularly during upset conditions. Therefore, emphasis is placed upon the need to analyze and estimate an enveloping concentration profile within the near field of a release. This investigation included the analysis of 64 air samples collected over 128 wk. Variables of importance were then derived from the measurement data, and a methodology was developed that allowed for the estimation of Lorentzian-based dispersion coefficients along the lateral axis of the near field recirculation cavity; the development of recirculation cavity boundaries; and conservative evaluation of the associated concentration profile. The results evaluated the effectiveness of the Lorentzian distribution methodology for estimating near field releases and emphasized the need to place air-monitoring stations appropriately for complete concentration characterization. Additionally, the importance of the sampling period and operational conditions were discussed to balance operational feedback and the reporting of public dose.
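
    As a sketch of the distribution family involved, the function below evaluates a Lorentzian concentration profile across the lateral axis of a recirculation cavity. The peak concentration and half-width are placeholders, not the paper's fitted dispersion coefficients, which were derived from the air-sample measurements.

      import numpy as np

      def lorentzian_profile(y_m, c_peak, gamma_m):
          """Lorentzian lateral concentration profile.

          c_peak is the centerline concentration and gamma_m the half-width at
          half-maximum; both are illustrative stand-ins for fitted coefficients.
          """
          return c_peak * gamma_m**2 / (y_m**2 + gamma_m**2)

      y = np.linspace(-50.0, 50.0, 101)   # meters off the plume centerline
      profile = lorentzian_profile(y, c_peak=1.0, gamma_m=15.0)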

  12. Soni-removal of nucleic acids from inclusion bodies.

    PubMed

    Neerathilingam, Muniasamy; Mysore, Sumukh; Gandham, Sai Hari A

    2014-05-23

    Inclusion bodies (IBs) are commonly formed in Escherichia coli due to overexpression of recombinant proteins in a non-native state. Isolation, denaturation and refolding of these IBs is generally performed to obtain functional protein. However, during this process IBs tend to form non-specific interactions with sheared nucleic acids from the genome, which are then carried over into downstream processes and may hinder the refolding of IBs into their native state. To circumvent this, we demonstrate a methodology termed soni-removal, which involves disruption of nucleic acid-inclusion body interactions using sonication, followed by solvent-based separation. As opposed to conventional techniques that use enzymes and column-based separations, soni-removal is a cost-effective alternative for complete elimination of buried and/or strongly bound short nucleic acid contaminants from IBs. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Unsupervised pattern recognition methods in ciders profiling based on GCE voltammetric signals.

    PubMed

    Jakubowska, Małgorzata; Sordoń, Wanda; Ciepiela, Filip

    2016-07-15

    This work presents a complete methodology for distinguishing between different brands of cider and degrees of ageing based on voltammetric signals, utilizing dedicated data preprocessing procedures and unsupervised multivariate analysis. It was demonstrated that voltammograms recorded on a glassy carbon electrode in Britton-Robinson buffer at pH 2 are reproducible for each brand. By application of clustering algorithms and principal component analysis, visibly homogeneous clusters were obtained. An advanced signal processing strategy, which included automatic baseline correction, interval scaling and a continuous wavelet transform with a dedicated mother wavelet, was a key step in the correct recognition of the objects. The results show that voltammetry combined with optimized univariate and multivariate data processing is a sufficient tool to distinguish between ciders from various brands and to evaluate their freshness. Copyright © 2016 Elsevier Ltd. All rights reserved.
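
    A minimal sketch of this kind of preprocessing-plus-projection pipeline: polynomial baseline subtraction followed by PCA scores. The random array stands in for real voltammograms, and the published method additionally applies interval scaling and a dedicated continuous wavelet transform before clustering.

      import numpy as np
      from sklearn.decomposition import PCA

      def baseline_correct(signal, degree=3):
          """Subtract a fitted low-order polynomial baseline from one voltammogram.

          A generic stand-in for the paper's automatic baseline correction.
          """
          x = np.arange(signal.size)
          return signal - np.polyval(np.polyfit(x, signal, degree), x)

      # Rows = cider samples, columns = current at each potential (placeholder data).
      voltammograms = np.random.default_rng(0).normal(size=(12, 400))
      corrected = np.apply_along_axis(baseline_correct, 1, voltammograms)
      scores = PCA(n_components=2).fit_transform(corrected)   # 2-D score plot input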

  14. Vocational rehabilitation: facilitating evidence based practice through participatory action research.

    PubMed

    Maciver, Donald; Prior, Susan; Forsyth, Kirsty; Walsh, Mike; Meiklejohn, Allison; Irvine, Linda; Pentland, Duncan

    2013-04-01

    Improving vocational rehabilitation in line with the current evidence base is an area of considerable interest. Aims: To describe the strategies used by a multidisciplinary team in the initial stages of a participatory action research (PAR) approach to improving a vocational rehabilitation service. A literature review and PAR process were completed. One hundred and fifteen participants engaged in multifaceted data collection and analysis, building consensus around key principles for a new vocational rehabilitation service. A synthesis of our literature review and PAR process was developed into a set of principles for practice, which we plan to implement across the service. We have developed methodologies in interdisciplinary collaborations spanning statutory and non-statutory services. We have developed a set of principles for practice, and detailed plans for implementation are being drawn up to inform provision in the future.

  15. Ceramic technology for advanced heat engines project. Semiannual progress report, April-September 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-05-01

    An assessment of needs was completed, and a five-year project plan was developed with input from private industry. The objective is to develop the industrial technology base required for reliable ceramics for application in advanced automotive heat engines. The focus is on structural ceramics for advanced gas turbine and diesel engines, ceramic bearings and attachments, and ceramic coatings for thermal barrier and wear applications in these engines. The work described in this report is organized according to the following WBS project elements: management and coordination; materials and processing (monolithics, ceramic composites, thermal and wear coatings, joining); materials design methodology (contact interfaces, new concepts); data base and life prediction (time-dependent behavior, environmental effects, fracture mechanics, NDE development); and technology transfer. This report includes contributions from all currently active project participants.

  16. Illusory correlation: a function of availability or representativeness heuristics?

    PubMed

    MacDonald, M G

    2000-08-01

    The present study sought to investigate the illusory correlation phenomenon by experimentally manipulating the availability of information through the use of the "lag" effect (Madigan, 1969). Seventy-four university students voluntarily participated in this study. Similar to Starr and Katkin's (1969) methodology, subjects were visually presented with each possible combination of four experimental problem descriptions and four sentence completions that were paired and shown twice at each of four lags (i.e., with 0, 2, 8 and 20 intervening variables). Subjects were required to make judgements concerning the frequency with which sentence completions and problem descriptions co-occurred. In agreement with previous research (Starr & Katkin, 1969), the illusory correlation effect was found for specific descriptions and sentence completions. Results also yielded a significant effect of lag for mean ratings between 0 and 2 lags; however, there was no reliable increase in judged co-occurrence at lags 8 and 20. Evidence failed to support the hypothesis that greater availability, through the experimental manipulation of lag, would result in increased frequency of co-occurrence judgements. Findings indicate that, in the present study, the illusory correlation effect is probably due to a situational bias based on the representativeness heuristic.

  17. Lessons learned: a pilot study on occupational therapy effectiveness for children with sensory modulation disorder.

    PubMed

    Miller, Lucy Jane; Schoen, Sarah A; James, Katherine; Schaaf, Roseann C

    2007-01-01

    The purpose of this pilot study was to prepare for a randomized controlled study of the effectiveness of occupational therapy using a sensory integration approach (OT-SI) with children who have sensory processing disorders (SPD). A one-group pretest-posttest design with 30 children was completed with a subset of children with SPD, those with sensory modulation disorder. Lessons learned relate to (a) identifying a homogeneous sample with quantifiable inclusion criteria, (b) developing an intervention manual for study replication and a fidelity-to-treatment measure, (c) determining which outcomes are sensitive to change and relate to parents' priorities, and (d) clarifying rigorous methodologies (e.g., blinded examiners, randomization, power). A comprehensive program of research is needed, including multiple pilot studies, to develop enough knowledge that high-quality effectiveness research in occupational therapy can be completed. Previous effectiveness studies of OT-SI have been single projects not based on a unified long-term program of research.

  18. Reedsport PB150 Deployment and Ocean Test Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Phil

    2016-06-03

    As the first utility-scale wave power project in the US, the Wave Power Demonstration Project at Reedsport (OR) was planned to consist of 10 PowerBuoys (Phase II), located 2.5 miles off the coast. U.S. Department of Energy (DOE) funding under a prior DOE grant (DE-FG36-08GO88017), along with funding from PNGC Power, an Oregon-based electric power cooperative, was utilized for the design completion, fabrication, assembly and factory testing of the first PowerBuoy for the Reedsport project. The design and fabrication of the first PowerBuoy and factory testing of the power take-off subsystem were completed, and the power take-off subsystem was successfully integrated into the spar at the fabricator's facility in Oregon. The objectives of this follow-on grant were to advance the PB150B design from TRL 5/6 to TRL 7/8; deploy a single PB150 and operate it autonomously for 2 years; establish O&M costs; collect environmental information; and establish manufacturing methodologies.

  19. Comprehensive analysis of a Metabolic Model for lipid production in Rhodosporidium toruloides.

    PubMed

    Castañeda, María Teresita; Nuñez, Sebastián; Garelli, Fabricio; Voget, Claudio; Battista, Hernán De

    2018-05-19

    The yeast Rhodosporidium toruloides has been extensively studied for its application in biolipid production. Knowledge of its metabolic capabilities and the application of constraint-based flux analysis methodology provide useful information for process prediction and optimization. The accuracy of the resulting predictions is highly dependent on the metabolic model. A metabolic reconstruction for R. toruloides metabolism has recently been published. On the basis of this model, we developed a curated version that unblocks the central nitrogen metabolism and, in addition, completes charge and mass balances in some reactions neglected in the former model. A comprehensive analysis of network capability was then performed with the curated model and compared with the published metabolic reconstruction. The flux distribution obtained by lipid optimization with Flux Balance Analysis was able to replicate the internal biochemical changes that lead to lipogenesis in oleaginous microorganisms. These results motivate the development of a genome-scale model for complete elucidation of R. toruloides metabolism. Copyright © 2018 Elsevier B.V. All rights reserved.
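
    A minimal sketch of the constraint-based flux analysis step on a toy network, not the published R. toruloides reconstruction: FBA maximizes an objective flux subject to steady-state mass balance S·v = 0 and flux bounds, which is a linear program.

      import numpy as np
      from scipy.optimize import linprog

      # Toy stoichiometric matrix S (metabolites x reactions); reaction 4 is a
      # hypothetical lipid sink consuming one unit each of metabolites B and C.
      S = np.array([[1, -1, -1,  0],    # A: uptake feeds two branches
                    [0,  1,  0, -1],    # B: branch 1 product
                    [0,  0,  1, -1]])   # C: branch 2 product
      c = np.array([0.0, 0.0, 0.0, -1.0])   # linprog minimizes, so negate lipid flux
      bounds = [(0, 10), (0, 10), (0, 10), (0, None)]

      res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
      print(res.x)   # optimal flux distribution; last entry is the lipid flux (5.0)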

  20. MSC/NASTRAN Stress Analysis of Complete Models Subjected to Random and Quasi-Static Loads

    NASA Technical Reports Server (NTRS)

    Hampton, Roy W.

    2000-01-01

    Space payloads, such as those which fly on the Space Shuttle in Spacelab, are designed to withstand dynamic loads consisting of combined acoustic random loads and quasi-static acceleration loads. Methods for computing the payload stresses due to these loads are well known and appear in texts and NASA documents, but typically involve approximations such as Miles' equation, as well as possible adjustments based on "modal participation factors." Alternatively, an existing capability in MSC/NASTRAN may be used to output exact root mean square [rms] stresses due to the random loads for any specified elements in the finite element model. However, it is time consuming to use this methodology to obtain the rms stresses for the complete structural model and then combine them with the stresses induced by quasi-static loading. Special processing was developed, as described here, to perform the stress analysis of all elements in the model using existing MSC/NASTRAN, MSC/PATRAN and UNIX utilities. Fail-safe and buckling analysis applications are also described.
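
    For context, the Miles' approximation mentioned above reduces to one line: a single-degree-of-freedom estimate of rms response from the input PSD at resonance. The sketch combines it with a quasi-static stress at 3-sigma, a common conservative practice; all numbers are illustrative, not from any specific payload analysis.

      import math

      def miles_rms_stress(fn_hz, q_factor, psd_g2_hz, stress_per_g):
          """Miles' equation estimate of rms stress under random base excitation.

          g_rms = sqrt((pi/2) * fn * Q * W(fn)), with W(fn) the input
          acceleration PSD (g^2/Hz) at the resonant frequency fn.
          """
          g_rms = math.sqrt(math.pi / 2.0 * fn_hz * q_factor * psd_g2_hz)
          return stress_per_g * g_rms

      # Illustrative values: 80 Hz mode, Q = 10, 0.04 g^2/Hz input,
      # 2.0 MPa of stress per g of quasi-static acceleration.
      rms = miles_rms_stress(80.0, 10.0, 0.04, 2.0)
      total = 50.0 + 3.0 * rms   # quasi-static 50 MPa plus 3-sigma random stress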

  1. Complete Genome Sequence of a Streptococcus pyogenes Serotype M12 Scarlet Fever Outbreak Isolate from China, Compiled Using Oxford Nanopore and Illumina Sequencing

    PubMed Central

    You, Yuanhai; Kou, Yongjun; Niu, Longfei; Jia, Qiong; Liu, Yahui; Walker, Mark J.; Zhu, Jiaqiang

    2018-01-01

    ABSTRACT The incidence of scarlet fever cases remains high in China. Here, we report the complete genome sequence of a Streptococcus pyogenes isolate of serotype M12, which has been confirmed as the predominant serotype in recent outbreaks. Genome sequencing was achieved by a combination of Oxford Nanopore MinION and Illumina methodologies. PMID:29724853

  2. Learning outcomes through the cooperative learning team assisted individualization on research methodology’ course

    NASA Astrophysics Data System (ADS)

    Pakpahan, N. F. D. B.

    2018-01-01

    The research methodology course covers material that must be understood by students who will go on to write a thesis. Implementation of learning should create conditions for learning that is active, interactive, and effective, here realized through Team Assisted Individualization (TAI) cooperative learning. The purposes of this study were: 1) to improve student learning outcomes in the research methodology course through TAI cooperative learning; 2) to improve teaching activities; and 3) to improve learning activities. This study is classroom action research conducted at the Department of Civil Engineering, Universitas Negeri Surabaya. The research subjects were 30 students and the course lecturer. In the first cycle, 20 students (67%) completed the course requirements and 10 students (33%) did not. In the second cycle, 26 students (87%) completed and 4 students (13%) did not, an increase in learning outcomes of 20%. Teaching activities scored 3.15 (adequate) in the first cycle and 4.22 (good) in the second. Learning activities scored 3.05 (adequate) in the first cycle and 3.95 (good) in the second.

  3. A Methodological Conundrum: Comparing Schools in Scotland and England

    ERIC Educational Resources Information Center

    Marshall, Bethan; Gibbons, Simon

    2015-01-01

    This article considers a conundrum in research methodology; the fact that, in the main, you have to use a social science-based research methodology if you want to look at what goes on in a classroom. This article proposes an alternative arts-based research method instead based on the work of Eisner, and before him Dewey, where one can use the more…

  4. School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.

    ERIC Educational Resources Information Center

    Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others

    1998-01-01

    Focuses on methodological issues involved in choosing instruments to monitor behavior, once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physical cooperation. Emerging school-based methodologies are discussed…

  5. Transaction based approach

    NASA Astrophysics Data System (ADS)

    Hunka, Frantisek; Matula, Jiri

    2017-07-01

    The transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings, usually captured by the notion of an agent or actor role. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is a domain-specific methodology with its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality, with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.

  6. Evaluating space station applications of automation and robotics technologies from a human productivity point of view

    NASA Technical Reports Server (NTRS)

    Bard, J. F.

    1986-01-01

    The role that automation, robotics, and artificial intelligence will play in Space Station operations is now beginning to take shape. Although there is only limited data on the precise nature of the payoffs that these technologies are likely to afford, there is a general consensus that, at a minimum, the following benefits will be realized: increased responsiveness to innovation, lower operating costs, and reduced exposure to hazards. Nevertheless, the question arises as to how much automation can be justified within the technical and economic constraints of the program. The purpose of this paper is to present a methodology which can be used to evaluate and rank different approaches to automating the functions and tasks planned for the Space Station, with special attention to the impact of advanced automation on human productivity. The methodology employed is based on the Analytic Hierarchy Process, which permits the introduction of individual judgements to resolve the conflict that normally arises when incomparable criteria underlie the selection process. Because of the large number of factors involved in the model, the overall problem is decomposed into four subproblems focusing on human productivity, economics, design, and operations, respectively; the results from each are then combined to yield the final rankings. To demonstrate the methodology, an example is developed based on the selection of an on-orbit assembly system. Five alternatives for performing this task are identified, ranging from an astronaut working in space to a dexterous manipulator with sensory feedback. Computational results are presented along with their implications. A final parametric analysis shows that the outcome is locally insensitive to all but complete reversals in preference.
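
    The Analytic Hierarchy Process step can be sketched compactly: weights are the normalized principal eigenvector of a reciprocal pairwise-comparison matrix. The three criteria and judgement values below are hypothetical, not the paper's; the full method repeats this across the four subproblems and combines the rankings.

      import numpy as np

      def ahp_weights(pairwise):
          """Criterion weights from an AHP pairwise-comparison matrix,
          via the normalized principal eigenvector."""
          vals, vecs = np.linalg.eig(pairwise)
          principal = np.real(vecs[:, np.argmax(np.real(vals))])
          return principal / principal.sum()

      # Reciprocal judgement matrix over three hypothetical criteria
      # (human productivity, cost, safety): entry (i, j) says how much
      # more important criterion i is than criterion j.
      A = np.array([[1.0, 3.0, 0.5],
                    [1/3, 1.0, 0.25],
                    [2.0, 4.0, 1.0]])
      print(ahp_weights(A).round(3))   # weights summing to 1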

  7. Preventing Harm in the ICU-Building a Culture of Safety and Engaging Patients and Families.

    PubMed

    Thornton, Kevin C; Schwarz, Jennifer J; Gross, A Kendall; Anderson, Wendy G; Liu, Kathleen D; Romig, Mark C; Schell-Chaple, Hildy; Pronovost, Peter J; Sapirstein, Adam; Gropper, Michael A; Lipshutz, Angela K M

    2017-09-01

    Preventing harm remains a persistent challenge in the ICU despite evidence-based practices known to reduce the prevalence of adverse events. This review seeks to describe the critical role of safety culture and patient and family engagement in successful quality improvement initiatives in the ICU. We review the evidence supporting the impact of safety culture and provide practical guidance for those wishing to implement initiatives aimed at improving safety culture and more effectively integrate patients and families in such efforts. Literature review using PubMed including evaluation of key studies assessing large-scale quality improvement efforts in the ICU, impact of safety culture on patient outcomes, methodologies for quality improvement commonly used in healthcare, and patient and family engagement. Print and web-based resources from leading patient safety organizations were also searched. Our group completed a review of original studies, review articles, book chapters, and recommendations from leading patient safety organizations. Our group determined by consensus which resources would best inform this review. A strong safety culture is associated with reduced adverse events, lower mortality rates, and lower costs. Quality improvement efforts have been shown to be more effective and sustainable when paired with a strong safety culture. Different methodologies exist for quality improvement in the ICU; a thoughtful approach to implementation that engages frontline providers and administrative leadership is essential for success. Efforts to substantively include patients and families in the processes of quality improvement work in the ICU should be expanded. Efforts to establish a culture of safety and meaningfully engage patients and families should form the foundation for all safety interventions in the ICU. This review describes an approach that integrates components of several proven quality improvement methodologies to enhance safety culture in the ICU and highlights opportunities to include patients and families.

  8. Implementation of an acoustic-based methane flux estimation methodology in the Eastern Siberian Arctic Sea

    NASA Astrophysics Data System (ADS)

    Weidner, E. F.; Weber, T. C.; Mayer, L. A.

    2017-12-01

    Quantifying methane flux originating from marine seep systems in climatically sensitive regions is of critical importance for current and future climate studies. Yet the methane contribution from these systems has been difficult to estimate, given the broad spatial scale of the ocean and the heterogeneity of seep activity. One such region is the Eastern Siberian Arctic Sea (ESAS), where bubble release into the shallow water column (<40 m average depth) facilitates transport of methane to the atmosphere without oxidation. Quantifying the current seep methane flux from the ESAS is necessary not only to understand the total ocean methane budget, but also to provide baseline estimates against which future climate-induced changes can be measured. At the 2016 AGU Fall Meeting, we presented a new acoustic-based flux methodology using a calibrated broadband split-beam echosounder. The broad (14-24 kHz) bandwidth provides a vertical resolution of 10 cm, making the identification of single bubbles possible. After calibration using a 64 mm copper sphere of known backscatter, the acoustic backscatter of individual bubbles is measured and compared to analytical models to estimate bubble radius. Additionally, bubbles are precisely located and traced upwards through the water column to estimate rise velocity. The combination of radius and rise velocity allows for gas flux estimation. Here, we follow up with the completed implementation of this methodology applied to the Herald Canyon region of the western ESAS. From the 68 recognized seeps, bubble radii and rise velocities were computed for more than 550 individual bubbles. The range of bubble radii, 1-6 mm, is comparable to those published by other investigators, while the radius-dependent rise velocities are consistent with published models. Methane flux for the Herald Canyon region was estimated by extrapolation from individual seep flux values.
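
    The last step, radius and rise velocity to flux, is simple in outline. The sketch below converts acoustically sized bubbles observed over a time window into a molar methane flux via the ideal gas law at in-situ pressure; the counts, depth, and window are invented, and dissolution during rise is ignored here.

      import math

      def bubble_volume_m3(radius_mm):
          """Spherical bubble volume from an acoustically estimated radius."""
          r = radius_mm / 1000.0
          return 4.0 / 3.0 * math.pi * r**3

      def seep_flux_mol_per_day(radii_mm, window_s, depth_m, temp_k=277.0):
          """First-order methane flux from one seep.

          Sums the volume of bubbles sized during the observation window,
          converts to moles with the ideal gas law at hydrostatic pressure,
          and scales to a day. Dissolution during rise is neglected.
          """
          pressure_pa = 101_325.0 + 1025.0 * 9.81 * depth_m   # atm + hydrostatic
          volume = sum(bubble_volume_m3(r) for r in radii_mm)
          moles = pressure_pa * volume / (8.314 * temp_k)
          return moles * 86_400.0 / window_s

      # Hypothetical seep: 40 bubbles of 1-6 mm radius in a 300 s window at 35 m.
      radii = [1.0, 2.5, 3.0, 6.0] * 10
      print(round(seep_flux_mol_per_day(radii, 300.0, 35.0), 2))   # ~0.62 mol/day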

  9. Use of operating room information system data to predict the impact of reducing turnover times on staffing costs.

    PubMed

    Dexter, Franklin; Abouleish, Amr E; Epstein, Richard H; Whitten, Charles W; Lubarsky, David A

    2003-10-01

    Potential benefits to reducing turnover times are both quantitative (e.g., complete more cases and reduce staffing costs) and qualitative (e.g., improve professional satisfaction). Analyses have shown the quantitative arguments to be unsound except for reducing staffing costs. We describe a methodology by which each surgical suite can use its own numbers to calculate its individual potential reduction in staffing costs from reducing its turnover times. Calculations estimate optimal allocated operating room (OR) time (based on maximizing OR efficiency) before and after reducing the maximum and average turnover times. At four academic tertiary hospitals, reductions in average turnover times of 3 to 9 min would result in 0.8% to 1.8% reductions in staffing cost. Reductions in average turnover times of 10 to 19 min would result in 2.5% to 4.0% reductions in staffing costs. These reductions in staffing cost are achieved predominantly by reducing allocated OR time, not by reducing the hours that staff work late. Heads of anesthesiology groups often serve on OR committees that are fixated on turnover times. Rather than having to argue based on scientific studies, this methodology provides the ability to show the specific quantitative effects (small decreases in staffing costs and allocated OR time) of reducing turnover time using a surgical suite's own data. Many anesthesiologists work at hospitals where surgeons and/or operating room (OR) committees focus repeatedly on turnover time reduction. We developed a methodology by which the reductions in staffing cost as a result of turnover time reduction can be calculated for each facility using its own data. Staffing cost reductions are generally very small and would be achieved predominantly by reducing allocated OR time to the surgeons.
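
    The underlying calculation can be illustrated with a small sketch. This is not the authors' implementation: the shift-length choices, the 2x overtime premium, and the example workloads are invented assumptions, and staffing cost is proxied by cost-weighted under- and over-utilized hours.

    ```python
    # Staffing cost proxy: expected under-utilized hours plus overtime hours,
    # with overtime assumed to cost twice as much per hour as allocated time.
    def inefficiency(allocated_h, workloads_h, overtime_premium=2.0):
        under = sum(max(allocated_h - w, 0.0) for w in workloads_h) / len(workloads_h)
        over = sum(max(w - allocated_h, 0.0) for w in workloads_h) / len(workloads_h)
        return under + overtime_premium * over

    def optimal_allocation(workloads_h, choices=(8.0, 10.0, 13.0)):
        """Pick the staffed shift length that maximizes OR efficiency."""
        return min(choices, key=lambda a: inefficiency(a, workloads_h))

    baseline = [8.6, 9.4, 8.9, 8.4, 9.1]   # historical daily workloads (h)
    reduced = [w - 0.5 for w in baseline]  # e.g., 3 turnovers/day x 10 min saved
    print(optimal_allocation(baseline), optimal_allocation(reduced))  # 10.0 -> 8.0
    ```

    In this toy example the shorter turnovers let the optimizer move from a 10 h to an 8 h staffed allocation, mirroring the point above that savings come predominantly from reduced allocated OR time rather than from reduced late hours.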

  10. The methodology of population surveys of headache prevalence, burden and cost: Principles and recommendations from the Global Campaign against Headache

    PubMed Central

    2014-01-01

    The global burden of headache is very large, but knowledge of it is far from complete and still needs to be gathered. Published population-based studies have used variable methodology, which has influenced findings and made comparisons difficult. Among the initiatives of the Global Campaign against Headache to improve and standardize methods in use for cross-sectional studies, the most important is the production of consensus-based methodological guidelines. This report describes the development of detailed principles and recommendations. For this purpose we brought together an expert consensus group to include experience and competence in headache epidemiology and/or epidemiology in general and drawn from all six WHO world regions. The recommendations presented are for anyone, of whatever background, with interests in designing, performing, understanding or assessing studies that measure or describe the burden of headache in populations. While aimed principally at researchers whose main interests are in the field of headache, they should also be useful, at least in parts, to those who are expert in public health or epidemiology and wish to extend their interest into the field of headache disorders. Most of all, these recommendations seek to encourage collaborations between specialists in headache disorders and epidemiologists. The focus is on migraine, tension-type headache and medication-overuse headache, but they are not intended to be exclusive to these. The burdens arising from secondary headaches are, in the majority of cases, more correctly attributed to the underlying disorders. Nevertheless, the principles outlined here are relevant for epidemiological studies on secondary headaches, provided that adequate definitions can be not only given but also applied in questionnaires or other survey instruments. PMID:24467862

  11. PAI-OFF: A new proposal for online flood forecasting in flash flood prone catchments

    NASA Astrophysics Data System (ADS)

    Schmitz, G. H.; Cullmann, J.

    2008-10-01

    The Process Modelling and Artificial Intelligence for Online Flood Forecasting (PAI-OFF) methodology combines the reliability of physically based, hydrologic/hydraulic modelling with the operational advantages of artificial intelligence, namely extremely low computation times and straightforward operation. The basic principle of the methodology is to portray process models by means of artificial neural networks (ANN). We propose to train ANN flood forecasting models with synthetic data that reflect the possible range of storm events. To this end, establishing PAI-OFF requires first setting up a physically based hydrologic model of the considered catchment and - optionally, if backwater effects have a significant impact on the flow regime - a hydrodynamic flood routing model of the river reach in question. Both models are subsequently used for simulating all meaningful and flood-relevant storm scenarios, which are obtained from a catchment-specific meteorological data analysis. This provides a database of corresponding input/output vectors, which is then completed by generally available hydrological and meteorological data characterizing the catchment state prior to each storm event. This database subsequently serves for training both a polynomial neural network (PoNN) - portraying the rainfall-runoff process - and a multilayer neural network (MLFN), which mirrors the hydrodynamic flood wave propagation in the river. These two ANN models replace the hydrological and hydrodynamic model in the operational mode. After presenting the theory, we apply PAI-OFF - essentially consisting of the coupled "hydrologic" PoNN and "hydrodynamic" MLFN - to the Freiberger Mulde catchment in the Erzgebirge (Ore Mountains) in East Germany (3000 km²). Both the demonstrated computational efficiency and the prediction reliability underline the potential of the new PAI-OFF methodology for online flood forecasting.
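
    The core idea - replacing an offline process model with a trained ANN for fast online use - can be sketched in a few lines. This is a toy illustration under stated assumptions, not PAI-OFF itself: the one-line "process model", the input ranges, and the network size are invented for the example.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    def process_model(rain_mm, soil_moisture):
        """Stand-in for the physically based catchment model (invented)."""
        return 0.6 * rain_mm * soil_moisture + 0.1 * rain_mm ** 1.2

    # Offline: simulate the meaningful range of storm scenarios...
    rain = rng.uniform(0.0, 120.0, 5000)
    moisture = rng.uniform(0.1, 1.0, 5000)
    X = np.column_stack([rain, moisture])
    y = process_model(rain, moisture)

    # ...and train the ANN surrogate once on the synthetic database.
    ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    ann.fit(X, y)

    # Online: evaluating the surrogate for a new storm is near-instantaneous.
    print(ann.predict([[80.0, 0.7]]))
    ```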

  12. How does study quality affect the results of a diagnostic meta-analysis?

    PubMed Central

    Westwood, Marie E; Whiting, Penny F; Kleijnen, Jos

    2005-01-01

    Background The use of systematic literature review to inform evidence based practice in diagnostics is rapidly expanding. Although the primary diagnostic literature is extensive, studies are often of low methodological quality or poorly reported. There has been no rigorously evaluated, evidence based tool to assess the methodological quality of diagnostic studies. The primary objective of this study was to determine the extent to which variations in the quality of primary studies impact the results of a diagnostic meta-analysis and whether this differs with diagnostic test type. A secondary objective was to contribute to the evaluation of QUADAS, an evidence-based tool for the assessment of quality in diagnostic accuracy studies. Methods This study was conducted as part of a large systematic review of tests used in the diagnosis and further investigation of urinary tract infection (UTI) in children. All studies included in this review were assessed using QUADAS. The impact of individual components of QUADAS on a summary measure of diagnostic accuracy was investigated using regression analysis. The review divided the diagnosis and further investigation of UTI into the following three clinical stages: diagnosis of UTI, localisation of infection, and further investigation of the UTI. Each stage used different types of diagnostic test, which were considered to involve different quality concerns. Results Many of the studies included in our review were poorly reported. The proportion of QUADAS items fulfilled was similar for studies in different sections of the review. However, as might be expected, the individual items fulfilled differed between the three clinical stages. Regression analysis found that different items showed a strong association with test performance for the different tests evaluated. These differences were observed both within and between the three clinical stages assessed by the review. The results of regression analyses were also affected by whether or not a weighting (by sample size) was applied. Our analysis was severely limited by the completeness of reporting and the differences between the index tests evaluated and the reference standards used to confirm diagnoses in the primary studies. Few tests were evaluated by sufficient studies to allow meaningful use of meta-analytic pooling and investigation of heterogeneity. This meant that further analysis to investigate heterogeneity could only be undertaken using a subset of studies, and that the findings are open to various interpretations. Conclusion Further work is needed to investigate the influence of methodological quality on the results of diagnostic meta-analyses. Large data sets of well-reported primary studies are needed to address this question. Without significant improvements in the completeness of reporting of primary studies, progress in this area will be limited. PMID:15943861
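
    The kind of meta-regression described - with and without sample-size weighting - can be sketched as follows. The data here are simulated and the three quality items are hypothetical; the sketch only illustrates why weighting can shift the estimated association between quality items and accuracy.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n_studies = 40
    items = rng.integers(0, 2, size=(n_studies, 3))   # 3 quality items, fulfilled or not
    sizes = rng.integers(20, 400, size=n_studies)     # study sample sizes
    # Simulated log diagnostic odds ratio: item 0 is associated with accuracy.
    log_dor = 2.0 - 0.5 * items[:, 0] + rng.normal(0.0, 0.8, n_studies)

    X = sm.add_constant(items.astype(float))
    unweighted = sm.OLS(log_dor, X).fit()
    weighted = sm.WLS(log_dor, X, weights=sizes).fit()   # weighting can shift estimates
    print(unweighted.params)
    print(weighted.params)
    ```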

  13. An optimized methodology for whole genome sequencing of RNA respiratory viruses from nasopharyngeal aspirates.

    PubMed

    Goya, Stephanie; Valinotto, Laura E; Tittarelli, Estefania; Rojo, Gabriel L; Nabaes Jodar, Mercedes S; Greninger, Alexander L; Zaiat, Jonathan J; Marti, Marcelo A; Mistchenko, Alicia S; Viegas, Mariana

    2018-01-01

    Over the last decade, the number of viral genome sequences deposited in available databases has grown exponentially. However, sequencing methodologies vary widely, and many published works have relied on viral enrichment by viral culture or nucleic acid amplification with specific primers rather than on unbiased techniques such as metagenomics. The genome of RNA viruses is highly variable, and these enrichment methodologies may be difficult to achieve or may bias the results. In order to obtain genomic sequences of human respiratory syncytial virus (HRSV) from positive nasopharyngeal aspirates, diverse methodologies were evaluated and compared. A total of 29 nearly complete and complete viral genomes were obtained. The best performance was achieved with a DNase I treatment of the RNA directly extracted from the nasopharyngeal aspirate (NPA), sequence-independent single-primer amplification (SISPA) and library preparation performed with the Nextera XT DNA Library Prep Kit with manual normalization. An average of 633,789 and 1,674,845 filtered reads per library were obtained with the MiSeq and NextSeq 500 platforms, respectively. The higher output of the NextSeq 500 was accompanied by an increase in the percentage of duplicated reads generated during SISPA (from an average of 1.5% duplicated viral reads on the MiSeq to an average of 74% on the NextSeq 500). HRSV genome recovery was not affected by the presence or absence of duplicated reads, but the computational demand during the analysis was increased. Considering that only samples with viral load ≥ E+06 copies/ml NPA were tested, no correlation was observed between sample viral load and the number of total filtered reads, nor with the mapped viral reads. The HRSV genomes showed a mean coverage of 98.46% with the best methodology. In addition, genomes of human metapneumovirus (HMPV), human rhinovirus (HRV) and human parainfluenza virus types 1-3 (HPIV1-3) were also obtained with the selected optimal methodology.

  14. Comparison between two methodologies for urban drainage decision aid.

    PubMed

    Moura, P M; Baptista, M B; Barraud, S

    2006-01-01

    The objective of the present work is to compare two methodologies based on multicriteria analysis for the evaluation of stormwater systems. The first methodology was developed in Brazil and is based on performance-cost analysis; the second is ELECTRE III. Both methodologies were applied to a case study, and sensitivity and robustness analyses were then carried out. These analyses show that both methodologies produce equivalent results, with low sensitivity and high robustness. The results indicate that the Brazilian methodology is consistent and can safely be used to select a good solution, or a small set of good solutions, for later comparison with more detailed methods.

  15. Mass selective separation applied to radioisotopes of cesium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dion, Michael; Eiden, Greg; Farmer, Orville

    2016-07-22

    A technique that uses the intrinsic mass-based separation capability of a quadrupole mass spectrometer was developed and used to resolve spectral radiometric interference between two isotopes of the same element. In this work, the starting sample was a combination of 137Cs and 134Cs whose activity was dominated by 137Cs; the methodology separated and "implanted" 134Cs that was later quantified for spectral features and activity with traditional radiometric techniques. This work demonstrated a 134Cs/137Cs activity ratio enhancement of >4 orders of magnitude and complete removal of 137Cs spectral features from the implanted target mass (i.e., 134).

  16. S3 Guideline for the treatment of psoriasis vulgaris, update - Short version part 1 - Systemic treatment.

    PubMed

    Nast, Alexander; Amelunxen, Lasse; Augustin, Matthias; Boehncke, Wolf-Henning; Dressler, Corinna; Gaskins, Matthew; Härle, Peter; Hoffstadt, Bernd; Klaus, Joachim; Koza, Joachim; Mrowietz, Ulrich; Ockenfels, Hans-Michael; Philipp, Sandra; Reich, Kristian; Rosenbach, Thomas; Rzany, Berthold; Schlaeger, Martin; Schmid-Ott, Gerhard; Sebastian, Michael; von Kiedrowski, Ralph; Weberschock, Tobias

    2018-05-01

    The German guideline for the treatment of psoriasis vulgaris was updated using GRADE methodology. The guideline is based on a systematic literature review completed on December 1, 2016, and on a formal consensus and approval process. The first section of this short version of the guideline covers systemic treatment options considered relevant by the expert panel and approved in Germany at the time of the consensus conference (acitretin, adalimumab, apremilast, cyclosporine, etanercept, fumaric acid esters, infliximab, methotrexate, secukinumab and ustekinumab). Detailed information is provided on the management and monitoring of the included treatment options. © 2018 The Authors | Journal compilation © Blackwell Verlag GmbH, Berlin.

  17. Thermal Stability Testing of Fischer-Tropsch Fuel and Various Blends with Jet A, as Well as Aromatic Blend Additives

    NASA Technical Reports Server (NTRS)

    Klettlinger, J.; Rich, R.; Yen, C.; Surgenor, A.

    2011-01-01

    Fischer-Tropsch (F-T) jet fuel composition differs from petroleum-based, conventional commercial jet fuel because of differences in feedstock and production methodology. Fischer-Tropsch fuel typically has a lower aromatic and sulfur content and consists primarily of iso- and normal paraffins. The ASTM D3241 Jet Fuel Thermal Oxidation Test (JFTOT) break point testing method was used to determine the break point of a baseline conventional Jet A, a commercial-grade F-T jet fuel, and various blends of this F-T fuel in Jet A. The testing completed in this report was supported by the NASA Fundamental Aeronautics Subsonics Fixed Wing Project.

  18. General Multivariate Linear Modeling of Surface Shapes Using SurfStat

    PubMed Central

    Chung, Moo K.; Worsley, Keith J.; Nacewicz, Brendon M.; Dalton, Kim M.; Davidson, Richard J.

    2010-01-01

    Although there are many imaging studies on traditional ROI-based amygdala volumetry, there are very few studies on modeling amygdala shape variations. This paper presents a unified computational and statistical framework for modeling amygdala shape variations in a clinical population. The weighted spherical harmonic representation is used to parameterize, smooth, and normalize amygdala surfaces. The representation is subsequently used as an input for multivariate linear models accounting for nuisance covariates such as age and brain size difference, using the SurfStat package, which completely avoids the complexity of specifying design matrices. The methodology has been applied to quantify abnormal local amygdala shape variations in 22 high-functioning autistic subjects. PMID:20620211

  19. Metabolic design of macroscopic bioreaction models: application to Chinese hamster ovary cells.

    PubMed

    Provost, A; Bastin, G; Agathos, S N; Schneider, Y-J

    2006-12-01

    The aim of this paper is to present a systematic methodology for designing macroscopic bioreaction models for cell cultures based upon metabolic networks. The cell culture is seen as a succession of phases; during each phase, a metabolic network represents the set of reactions occurring in the cell. Through the use of elementary flux modes, these metabolic networks are then used to derive macroscopic bioreactions linking the extracellular substrates and products. On this basis, as many separate models are obtained as there are phases, and a complete model is obtained by smoothly switching from model to model. This is illustrated with batch cultures of Chinese hamster ovary cells.

  20. U.S. Geological Survey 2011 assessment of undiscovered oil and gas resources of the Cook Inlet region, south-central Alaska

    USGS Publications Warehouse

    Stanley, Richard G.; Pierce, Brenda S.; Houseknecht, David W.

    2011-01-01

    The U.S. Geological Survey (USGS) has completed an assessment of the volumes of undiscovered, technically recoverable oil and gas resources in conventional and continuous accumulations in Cook Inlet. The assessment used a geology-based methodology and results from new scientific research by the USGS and the State of Alaska, Department of Natural Resources, Division of Geological and Geophysical Surveys and Division of Oil and Gas (DOG). In the Cook Inlet region, the USGS estimates mean undiscovered volumes of nearly 600 million barrels of oil, about 19 trillion cubic feet of gas, and about 46 million barrels of natural gas liquids.

  1. RECIST 1.1-Update and clarification: From the RECIST committee.

    PubMed

    Schwartz, Lawrence H; Litière, Saskia; de Vries, Elisabeth; Ford, Robert; Gwyther, Stephen; Mandrekar, Sumithra; Shankar, Lalitha; Bogaerts, Jan; Chen, Alice; Dancey, Janet; Hayes, Wendy; Hodi, F Stephen; Hoekstra, Otto S; Huang, Erich P; Lin, Nancy; Liu, Yan; Therasse, Patrick; Wolchok, Jedd D; Seymour, Lesley

    2016-07-01

    The Response Evaluation Criteria in Solid Tumours (RECIST) were developed and published in 2000, based on the original World Health Organisation guidelines first published in 1981. In 2009, revisions were made (RECIST 1.1) incorporating major changes, including a reduction in the number of lesions to be assessed, a new measurement method to classify lymph nodes as pathologic or normal, the clarification of the requirement to confirm a complete response or partial response and new methodologies for more appropriate measurement of disease progression. The purpose of this paper was to summarise the questions posed and the clarifications provided as an update to the 2009 publication. Copyright © 2016. Published by Elsevier Ltd.
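
    For readers unfamiliar with the criteria, the target-lesion categories referred to above follow fixed thresholds. The sketch below is a deliberately simplified rendering of the published RECIST 1.1 target-lesion rules; it ignores non-target and new lesions, lymph-node short-axis rules, and the response-confirmation requirement discussed in the paper.

    ```python
    def target_response(baseline_sum, nadir_sum, current_sum):
        """Sums of target-lesion diameters (mm) at baseline, nadir, and now."""
        if current_sum == 0:
            return "CR"   # complete response: disappearance of all target lesions
        if current_sum >= 1.2 * nadir_sum and current_sum - nadir_sum >= 5:
            return "PD"   # progression: >=20% and >=5 mm absolute increase over nadir
        if current_sum <= 0.7 * baseline_sum:
            return "PR"   # partial response: >=30% decrease from baseline
        return "SD"       # stable disease

    print(target_response(100, 60, 66))   # PR: 34% below baseline, not yet PD vs nadir
    ```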

  2. Identifying social factors amongst older individuals in linked electronic health records: An assessment in a population based study

    PubMed Central

    van Hoek, Albert J.; Walker, Jemma L.; Mathur, Rohini; Smeeth, Liam; Thomas, Sara L.

    2017-01-01

    Identification and quantification of health inequities amongst specific social groups is a pre-requisite for designing targeted healthcare interventions. This study investigated the recording of social factors in linked electronic health records (EHR) of individuals aged ≥65 years, to assess the potential of these data to identify the social determinants of disease burden and uptake of healthcare interventions. A methodology was developed for ascertaining social factors recorded on or before a pre-specified index date (01/01/2013) using primary care data from the Clinical Practice Research Datalink (CPRD) linked to hospitalisation and deprivation data in a cross-sectional study. Social factors included: religion, ethnicity, immigration status, small area-level deprivation, place of residence (including communal establishments such as care homes), marital status and living arrangements (e.g. living alone, cohabitation). Each social factor was examined for: completeness of recording, including improvements in completeness by using other linked EHR; timeliness of recording for factors that might change over time; and representativeness (compared with English 2011 Census data when available). Data for 591,037 individuals from 389 practices in England were analysed. The completeness of recording varied from 1.6% for immigration status to ~80% for ethnicity. Linkages provided the deprivation data (available for 82% of individuals) and improved completeness of ethnicity recording from 55% to 79% (when hospitalisation data were added). Data for ethnicity, deprivation, living arrangements and care home residence were comparable to the Census data. For time-varying variables such as residence and living alone, ~60% and ~35%, respectively, of those with available data had this information recorded within the 5 years before the index date. This work provides methods to identify social factors in EHR relevant to older individuals and shows that factors such as ethnicity, deprivation, not living alone, cohabitation and care home residence can be ascertained using these data. Applying these methodologies to routinely collected data could improve surveillance programmes and allow assessment of health equity in specific healthcare studies. PMID:29190680

  3. Predicting Failure Progression and Failure Loads in Composite Open-Hole Tension Coupons

    NASA Technical Reports Server (NTRS)

    Arunkumar, Satyanarayana; Przekop, Adam

    2010-01-01

    Failure types and failure loads in carbon-epoxy [45n/90n/-45n/0n]ms laminate coupons with central circular holes subjected to tensile load are simulated using a progressive failure analysis (PFA) methodology. The progressive failure methodology is implemented using a VUMAT subroutine within the ABAQUS/Explicit nonlinear finite element code. The degradation model adopted in the present PFA methodology uses an instantaneous complete stress reduction (COSTR) approach to simulate damage at a material point when failure occurs. In-plane modeling parameters such as element size and shape are held constant in the finite element models, irrespective of laminate thickness and hole size, to predict failure loads and failure progression. Comparison to published test data indicates that this methodology accurately simulates brittle, pull-out and delamination failure types. The sensitivity of the failure progression and the failure load to analysis loading rates and solver precision is demonstrated.

  4. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Handigund, Shivanand M.; Bhat, Shweta

    The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management to capture the organization's requirements. The requirements of the ongoing business/project development process involve software tools, hardware devices, manual procedures, application programs and communication commands. These components are ordered appropriately to achieve the mission of the process concerned, in both project development and ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in validating its correctness and completeness.

  5. Modified graphene oxide sensors for ultra-sensitive detection of nitrate ions in water.

    PubMed

    Ren, Wen; Mura, Stefania; Irudayaraj, Joseph M K

    2015-10-01

    Nitrate is a very common contaminant in drinking water with a significant impact on the environment, necessitating routine monitoring. Owing to its chemical and physical properties, nitrate is hard to detect directly with high sensitivity in a simple and inexpensive manner. Here, with amino-group-modified graphene oxide (GO) as the sensing element, we show a direct and ultra-sensitive method for detecting nitrate ions, with a lowest detected concentration of 5 nM in river water samples, much lower than reported methods based on absorption spectroscopy. Furthermore, unlike the reported absorption-spectroscopy strategies in which the nitrate concentration is determined by monitoring an increase in the aggregation of gold nanoparticles (GNPs), our method evaluates the concentration of nitrate ions in real samples from a reduction in GNP aggregation. To improve sensitivity, several parameters were optimized, including the amount of modified GO required, the concentration of GNPs, and the incubation time. The detection methodology was characterized by zeta potential, TEM and SEM. Our results indicate that enrichment of the modified GO with nitrate ions contributed to the excellent sensitivity, and the entire detection procedure could be completed within 75 min with only 20 μl of sample. This simple and rapid methodology was applied to monitor nitrate ions in real samples with excellent sensitivity and minimal pretreatment. The proposed approach paves the way for a novel means of detecting anions in real samples and highlights the potential of GO-based detection strategies for water quality monitoring. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Electrokinetic transport of rigid macroions in the thin double layer limit: a boundary element approach.

    PubMed

    Allison, Stuart A; Xin, Yao

    2005-08-15

    A boundary element (BE) procedure is developed to numerically calculate the electrophoretic mobility of highly charged, rigid model macroions in the thin double layer regime based on the continuum primitive model. The procedure is based on that of O'Brien (R.W. O'Brien, J. Colloid Interface Sci. 92 (1983) 204). The advantage of the present procedure over existing BE methodologies that are applicable to rigid model macroions in general (S. Allison, Macromolecules 29 (1996) 7391) is that computationally time consuming integrations over a large number of volume elements that surround the model particle are completely avoided. The procedure is tested by comparing the mobilities derived from it with independent theory of the mobility of spheres of radius a in a salt solution with Debye-Huckel screening parameter, kappa. The procedure is shown to yield accurate mobilities provided (kappa)a exceeds approximately 50. The methodology is most relevant to model macroions of mean linear dimension, L, with 1000>(kappa)L>100 and reduced absolute zeta potential (q|zeta|/k(B)T) greater than 1.0. The procedure is then applied to the compact form of high molecular weight, duplex DNA that is formed in the presence of the trivalent counterion, spermidine, under low salt conditions. For T4 DNA (166,000 base pairs), the compact form is modeled as a sphere (diameter=600 nm) and as a toroid (largest linear dimension=600 nm). In order to reconcile experimental and model mobilities, approximately 95% of the DNA phosphates must be neutralized by bound counterions. This interpretation, based on electrokinetics, is consistent with independent studies.
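
    As a point of reference for the (kappa)a >> 1 regime validated above, the mobility of a weakly charged sphere approaches the classical Smoluchowski limit, mu = epsilon*zeta/eta. A quick numerical check of that limit (with textbook water-at-25°C constants; this is background theory, not the paper's boundary element procedure):

    ```python
    EPS0 = 8.854e-12   # F m^-1, vacuum permittivity
    EPS_R = 78.5       # relative permittivity of water at 25 C
    ETA = 0.89e-3      # Pa s, viscosity of water at 25 C

    def smoluchowski_mobility(zeta_volts):
        """Electrophoretic mobility (m^2 V^-1 s^-1) in the (kappa)a >> 1, low-zeta limit."""
        return EPS_R * EPS0 * zeta_volts / ETA

    print(smoluchowski_mobility(-0.050))   # zeta = -50 mV -> about -3.9e-8 m^2/(V s)
    ```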

  7. Improving data retention in EEG research with children using child-centered eye tracking

    PubMed Central

    Maguire, Mandy J.; Magnon, Grant; Fitzhugh, Anna E.

    2014-01-01

    Background Event Related Potentials (ERPs) elicited by visual stimuli have increased our understanding of developmental disorders and adult cognitive abilities for decades; however, these studies are very difficult with populations who cannot sustain visual attention such as infants and young children. Current methods for studying such populations include requiring a button response, which may be impossible for some participants, and experimenter monitoring, which is subject to error, highly variable, and spatially imprecise. New Method We developed a child-centered methodology to integrate EEG data acquisition and eye-tracking technologies that uses “attention-getters” in which stimulus display is contingent upon the child’s gaze. The goal was to increase the number of trials retained. Additionally, we used the eye-tracker to categorize and analyze the EEG data based on gaze to specific areas of the visual display, compared to analyzing based on stimulus presentation. Results Compared with Existing Methods The number of trials retained was substantially improved using the child-centered methodology compared to a button-press response in 7–8 year olds. In contrast, analyzing the EEG based on eye gaze to specific points within the visual display as opposed to stimulus presentation provided too few trials for reliable interpretation. Conclusions By using the linked EEG-eye-tracker we significantly increased data retention. With this method, studies can be completed with fewer participants and a wider range of populations. However, caution should be used when epoching based on participants’ eye gaze because, in this case, this technique provided substantially fewer trials. PMID:25251555

  8. Communicating personal amnesty: a model for health promotion in an Australian disability context.

    PubMed

    Vogelpoel, Nicholas; Gattenhof, Sandra; Shakespeare-Finch, Jane

    2015-09-01

    Currently, pathological and illness-centric policy surrounds the evaluation of the health status of people experiencing disability. In this research, partnerships were built between disability service providers, community development organizations and disability arts organizations to build a translational evaluative methodology prior to implementation of an arts-based workshop program embedded in a strengths-based approach to health and well-being. The model consisted of three foci: participation in a pre-designed drama-based workshop program; individualized assessment and evaluation of changing health status; and longitudinal analysis of participants' changing health status in their public lives following the culmination of the workshop series. Participants (n = 15) were recruited through disability service providers and disability arts organizations to complete a 13-week workshop series and public performance. The study developed cumulative qualitative analysis tools and member-checking methods specific to the communication systems used by individual participants. Principal findings included increased confidence for verbal and non-verbal communicators; increased personal drive, ambition and goal-setting; increased arts-based skills, including professional engagements as artists; and demonstrated skills in communicating perceptions of health status to private and public spheres. Tangential positive observations were evident in the changing recreational, vocational and educational activities participants engaged with before and after the workshop series, in participants advocating for autonomous accommodation and health provision, and in changes in the disability service staff's culture. The research is an example of translational health methodologies in disability studies. © The Author (2013). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Flexible data integration and curation using a graph-based approach.

    PubMed

    Croset, Samuel; Rupp, Joachim; Romacker, Martin

    2016-03-15

    The increasing diversity of data available to the biomedical scientist holds promise for better understanding of diseases and discovery of new treatments for patients. In order to provide a complete picture of a biomedical question, data from many different origins needs to be combined into a unified representation. During this data integration process, inevitable errors and ambiguities present in the initial sources compromise the quality of the resulting data warehouse, and greatly diminish the scientific value of the content. Expensive and time-consuming manual curation is then required to improve the quality of the information. However, it becomes increasingly difficult to dedicate and optimize the resources for data integration projects as available repositories are growing both in size and in number every day. We present a new generic methodology to identify problematic records, causing what we describe as 'data hairball' structures. The approach is graph-based and relies on two metrics traditionally used in social sciences: the graph density and the betweenness centrality. We evaluate and discuss these measures and show their relevance for flexible, optimized and automated data curation and linkage. The methodology focuses on information coherence and correctness to improve the scientific meaningfulness of data integration endeavors, such as knowledge bases and large data warehouses. samuel.croset@roche.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
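
    The two metrics named are standard graph measures and are easy to compute. A minimal sketch with networkx follows; the toy records and the 0.5 betweenness threshold are illustrative, not values from the paper.

    ```python
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("geneA", "diseaseX"), ("geneA", "diseaseY"), ("geneA", "diseaseZ"),
        ("geneA", "drug1"), ("geneA", "drug2"),   # one ambiguous record links everything
        ("geneB", "diseaseX"),
    ])

    density = nx.density(G)                       # overall tangledness of the graph
    betweenness = nx.betweenness_centrality(G)    # nodes funnelling many shortest paths

    suspects = [n for n, b in betweenness.items() if b > 0.5]
    print(f"density={density:.2f}, suspect records={suspects}")   # flags 'geneA'
    ```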

  10. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
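
    A minimal sketch of the unscented transform as used here, with an invented one-line damage model standing in for the EOL simulation; the kappa parameter, state variables and threshold are illustrative assumptions, not the paper's valve model.

    ```python
    import numpy as np

    def unscented_transform(mean, cov, g, kappa=2.0):
        """Mean/variance of scalar g(x) for x ~ N(mean, cov), from 2n+1 sigma points."""
        n = mean.size
        L = np.linalg.cholesky((n + kappa) * cov)
        sigma = np.vstack([mean, mean + L.T, mean - L.T])   # 2n+1 sigma points
        w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
        w[0] = kappa / (n + kappa)
        y = np.array([g(x) for x in sigma])                 # only 2n+1 "simulations"
        y_mean = w @ y
        y_var = w @ (y - y_mean) ** 2
        return y_mean, y_var

    # State: [current damage, degradation rate]; EOL = time until damage reaches 1.
    g = lambda x: (1.0 - x[0]) / x[1]
    mean = np.array([0.3, 0.01])
    cov = np.diag([0.01 ** 2, 0.002 ** 2])
    print(unscented_transform(mean, cov, g))
    ```

    With two state variables, the transform needs only 2n+1 = 5 runs of the EOL simulation per prediction, which is the source of the computational savings claimed above.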

  11. Biomedical question answering using semantic relations.

    PubMed

    Hristovski, Dimitar; Dinevski, Dejan; Kastrin, Andrej; Rindflesch, Thomas C

    2015-01-16

    The proliferation of the scientific literature in the field of biomedicine makes it difficult to keep abreast of current knowledge, even for domain experts. While general Web search engines and specialized information retrieval (IR) systems have made important strides in recent decades, the problem of accurate knowledge extraction from the biomedical literature is far from solved. Classical IR systems usually return a list of documents that have to be read by the user to extract relevant information. This tedious and time-consuming work can be lessened with automatic Question Answering (QA) systems, which aim to provide users with direct and precise answers to their questions. In this work we propose a novel methodology for QA based on semantic relations extracted from the biomedical literature. We extracted semantic relations with the SemRep natural language processing system from 122,421,765 sentences, which came from 21,014,382 MEDLINE citations (i.e., the complete MEDLINE distribution up to the end of 2012). A total of 58,879,300 semantic relation instances were extracted and organized in a relational database. The QA process is implemented as a search in this database, which is accessed through a Web-based application, called SemBT (available at http://sembt.mf.uni-lj.si ). We conducted an extensive evaluation of the proposed methodology in order to estimate the accuracy of extracting a particular semantic relation from a particular sentence. Evaluation was performed by 80 domain experts. In total 7,510 semantic relation instances belonging to 2,675 distinct relations were evaluated 12,083 times. The instances were evaluated as correct 8,228 times (68%). In this work we propose an innovative methodology for biomedical QA. The system is implemented as a Web-based application that is able to provide precise answers to a wide range of questions. A typical question is answered within a few seconds. The tool has some extensions that make it especially useful for interpretation of DNA microarray results.

  12. Feasibility of ecological momentary assessment of hearing difficulties encountered by hearing aid users

    PubMed Central

    Galvez, Gino; Turbin, Mitchel B.; Thielman, Emily J.; Istvan, Joseph A.; Andrews, Judy A.; Henry, James A.

    2012-01-01

    Objectives Measurement of outcomes has become increasingly important to assess the benefit of audiologic rehabilitation, including hearing aids, in adults. Data from questionnaires, however, are based on retrospective recall of events and experiences, and often can be inaccurate. Questionnaires also do not capture the daily variation that typically occurs in relevant events and experiences. Clinical researchers in a variety of fields have turned to a methodology known as ecological momentary assessment (EMA) to assess quotidian experiences associated with health problems. The objective of this study was to determine the feasibility of using EMA to obtain real-time responses from hearing aid users describing their experiences with challenging hearing situations. Design This study required three phases: (1) develop EMA methodology to assess hearing difficulties experienced by hearing aid users; (2) utilize focus groups to refine the methodology; and (3) test the methodology with 24 hearing aid users. Phase 3 participants carried a personal digital assistant (PDA) 12 hr per day for 2 wk. The PDA alerted participants to respond to questions four times a day. Each assessment started with a question to determine if a hearing problem was experienced since the last alert. If “yes,” then up to 23 questions (depending on contingent response branching) obtained details about the situation. If “no,” then up to 11 questions obtained information that would help to explain why hearing was not a problem. Each participant completed the Hearing Handicap Inventory for the Elderly (HHIE) both before and after the 2-wk EMA testing period to evaluate for “reactivity” (exacerbation of self-perceived hearing problems that could result from the repeated assessments). Results Participants responded to the alerts with a 77% compliance rate, providing a total of 991 completed momentary assessments (mean = 43.1 per participant). A substantial amount of data was obtained with the methodology. Notably, participants reported a “hearing problem situation since the last alert” 37.6% of the time (372 responses). The most common problem situation involved “face-to-face conversation” (53.8% of the time). The next most common problem situation was “telephone conversation” (17.2%) followed by “TV, radio, iPod, etc.” (15.3%), “environmental sounds” (9.7%), and “movies, lecture, etc.” (4.0%). Comparison of pre- and post-EMA mean HHIE scores revealed no significant difference (p>.05), indicating that reactivity did not occur for this group. It should be noted, however, that 37.5% of participants reported a greater sense of awareness regarding their hearing loss and use of hearing aids. Conclusions Results showed participants were compliant, gave positive feedback, and did not demonstrate reactivity based on pre- and post-HHIE scores. We conclude that EMA methodology is feasible with patients who use hearing aids and could potentially inform hearing healthcare (HHC) services. The next step is to develop and evaluate EMA protocols that provide detailed daily patient information to audiologists at each stage of HHC. The advantages of such an approach would be to obtain real-life outcome measures, and to determine within- and between-day variability in outcomes and associated factors. Such information currently is not available from patients who seek and use HHC services. PMID:22531573

  13. Analysis of a Proposal to Implement the Readiness Based Sparing Process in the Brazilian Navy

    DTIC Science & Technology

    2017-06-01

    determine inventory levels. This research investigates whether implementing the U.S. DOD readiness-based sparing (RBS) methodology could provide the Brazilian Navy with greater benefit than its current approach; applying the methodology first for determining initial provisioning of reparable spares is suggested.

  14. The importance of growth factors for the treatment of chronic wounds in the case of diabetic foot ulcers.

    PubMed

    Buchberger, Barbara; Follmann, Markus; Freyer, Daniela; Huppertz, Hendrik; Ehm, Alexandra; Wasem, Jürgen

    2010-09-01

    Ulcers as a result of diabetes mellitus are a serious problem with an enormous impact on the overall global disease burden due to the increasing prevalence of diabetes. Because of long hospital stays, rehabilitation, often-required home care and the use of social services, diabetic foot complications are costly. Therapy with growth factors could be an effective and innovative add-on to standard wound care. What is the benefit of therapies with growth factors, alone or in combination with other technologies, in the treatment of diabetic foot ulcers, assessed with regard to medical, economic, social, ethical and juridical aspects? We systematically searched relevant databases, limited to English- and German-language publications since 1990. Cost values were adjusted to the price level of 2008 and converted into Euro. A review and an assessment of the quality of publications were conducted following approved methodological standards of evidence-based medicine and health economics. We identified 25 studies (14 randomized controlled trials (RCT), nine cost-effectiveness analyses, two meta-analyses). The RCT compared an add-on therapy to standard wound care with standard wound care/placebo alone or extracellular wound matrix: six studies used becaplermin, two rhEGF, one bFGF, and five the metabolically active skin grafts Dermagraft and Apligraf. The study duration ranged from 12 to 20 weeks, and the study population included between 17 and 382 patients (average 130). Treatment with becaplermin, rhEGF and the skin implants Dermagraft and Apligraf showed an advantage concerning complete wound closure and the time to complete wound healing in eight out of 13 studies. Evidence for a benefit of treatment with bFGF could not be found. In four out of 14 studies the proportion of adverse events was 30% per study group, with no difference between the treatment groups. The methodological quality of the studies was affected by significant deficiencies. The results showed becaplermin to be cost-effective, whereas no clear statement can be made regarding Dermagraft and Apligraf because of diverging cost bases and incremental cost-effectiveness ratios. Differences in standard wound care complicate the comparison of study results. Taking into consideration the small to very small sample sizes and other methodological flaws with a high potential of bias, the validity of the results with regard to effectiveness and cost-effectiveness has to be considered limited. The duration of treatment and follow-up is not long enough to assess the sustainability of the intervention or to capture ulcer recurrences and treatment-related adverse events such as the development of malignancy. There are indications of an advantage of add-on therapy with growth factors in diabetic foot ulcers concerning complete wound closure and the time to complete wound healing. Furthermore, studies of high methodological quality with adequate sample sizes and sufficient follow-up periods are necessary, also investigating patient-relevant parameters such as health-related quality of life and the acceptance and tolerance of the intervention, in addition to clinical outcomes.

  15. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability analysis in finite-element-based engineering practice.
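
    The Monte Carlo benchmark mentioned above can be sketched for a generic strength-minus-stress limit state. The distributions and parameters below are hypothetical, and the analytic reliability index in the comment serves only as a sanity check.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)
    n = 1_000_000
    strength = rng.normal(600.0, 40.0, n)   # MPa, component strength
    stress = rng.normal(450.0, 50.0, n)     # MPa, applied stress
    pf = np.mean(strength - stress < 0.0)   # failure when the limit state g < 0

    # Analytic check: beta = 150 / sqrt(40^2 + 50^2) ~ 2.34, Pf ~ 0.0096
    print(f"Pf ~ {pf:.4f}, beta ~ {-norm.ppf(pf):.2f}")
    ```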

  16. Multiple Methodologies: Using Community-Based Participatory Research and Decolonizing Methodologies in Kenya

    ERIC Educational Resources Information Center

    Elder, Brent C.; Odoyo, Kenneth O.

    2018-01-01

    In this project, we examined the development of a sustainable inclusive education system in western Kenya by combining community-based participatory research (CBPR) and decolonizing methodologies. Through three cycles of qualitative interviews with stakeholders in inclusive education, participants explained what they saw as foundational components…

  17. 40 CFR Appendix I to Subpart S of... - Vehicle Procurement Methodology

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) General Compliance Provisions for Control of Air Pollution From New and In-Use Light-Duty Vehicles, Light-Duty Trucks, and Complete Otto-Cycle Heavy-Duty Vehicles Pt. 86, Subpt. S, App. I Appendix...

  18. Complete Genome Sequence of a Streptococcus pyogenes Serotype M12 Scarlet Fever Outbreak Isolate from China, Compiled Using Oxford Nanopore and Illumina Sequencing.

    PubMed

    You, Yuanhai; Kou, Yongjun; Niu, Longfei; Jia, Qiong; Liu, Yahui; Davies, Mark R; Walker, Mark J; Zhu, Jiaqiang; Zhang, Jianzhong

    2018-05-03

    The incidence of scarlet fever remains high in China. Here, we report the complete genome sequence of a Streptococcus pyogenes isolate of serotype M12, which has been confirmed as the predominant serotype in recent outbreaks. Genome sequencing was achieved by a combination of Oxford Nanopore MinION and Illumina methodologies. Copyright © 2018 You et al.

  19. Precision of dual-energy X-ray absorptiometry of the knee and heel: methodology and implications for research to reduce bone mineral loss after spinal cord injury.

    PubMed

    Peppler, W T; Kim, W J; Ethans, K; Cowley, K C

    2017-05-01

    Methodological validation of dual-energy X-ray absorptiometry (DXA)-based measures of leg bone mineral density (BMD), based on the guidelines of the International Society for Clinical Densitometry. The primary objective of this study was to determine the precision of BMD estimates at the knee and heel using the manufacturer-provided DXA acquisition algorithm. The secondary objective was to determine the smallest change in DXA-based measurement of BMD that should be surpassed (least significant change (LSC)) before suggesting that a biological change has occurred in the distal femur, proximal tibia and calcaneus. Academic Research Centre, Canada. Ten people with motor-complete SCI of at least 2 years' duration and 10 people from the general population volunteered to have four DXA-based measurements taken of their femur, tibia and calcaneus. BMDs for seven regions of interest (RIs) were calculated, as were short-term precision (root-mean-square standard deviation (RMS-SD, g cm^-2) and RMS coefficient of variation (RMS-CV, %)) and LSC. Overall, RMS-CV values were similar between the SCI (3.63-10.20%, mean=5.3%) and able-bodied (1.85-5.73%, mean=4%) cohorts, despite lower absolute BMD values at each RI in those with SCI (35% lower at the heel to 54% lower at the knee; P<0.0001). Precision was highest at the calcaneus and lowest at the femur. Except at the femur, RMS-CV values were under 6%. For DXA-based estimates of BMD at the distal femur, proximal tibia and calcaneus, these precision values suggest that LSC values >10% are needed to detect differences between treated and untreated groups in studies aimed at reducing bone mineral loss after SCI.
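
    The precision statistics quoted above follow standard ISCD arithmetic: a pooled root-mean-square SD over each subject's repeated scans, the corresponding RMS-CV, and LSC = 2.77 x the precision error (95% confidence, two measurements). A minimal sketch with hypothetical scan data:

    ```python
    import numpy as np

    def precision(bmd):
        """bmd: (subjects x repeat scans) array of BMD values in g/cm^2."""
        bmd = np.asarray(bmd, dtype=float)
        sd = bmd.std(axis=1, ddof=1)                       # per-subject SD
        rms_sd = np.sqrt(np.mean(sd ** 2))                 # pooled precision, g/cm^2
        rms_cv = np.sqrt(np.mean((sd / bmd.mean(axis=1)) ** 2)) * 100.0   # %
        return rms_sd, rms_cv, 2.77 * rms_sd               # LSC, g/cm^2

    scans = [[0.95, 0.97, 0.94, 0.96],     # four repeat scans per subject
             [0.70, 0.73, 0.71, 0.69],
             [1.10, 1.08, 1.12, 1.11]]
    rms_sd, rms_cv, lsc = precision(scans)
    print(f"RMS-SD={rms_sd:.3f} g/cm2, RMS-CV={rms_cv:.1f}%, LSC={lsc:.3f} g/cm2")
    ```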

  1. Graceful Failure and Societal Resilience Analysis Via Agent-Based Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Schopf, P. S.; Cioffi-Revilla, C.; Rogers, J. D.; Bassett, J.; Hailegiorgis, A. B.

    2014-12-01

    Agent-based social modeling is opening up new methodologies for the study of societal response to weather and climate hazards, and providing measures of resiliency that can be studied in many contexts, particularly in coupled human and natural-technological systems (CHANTS). Since CHANTS are complex adaptive systems, societal resiliency may or may not occur, depending on dynamics that lack closed-form solutions. Agent-based modeling has been shown to provide a viable theoretical and methodological approach for analyzing and understanding disasters and societal resiliency in CHANTS. Our approach advances the science of societal resilience through computational modeling and simulation methods that complement earlier statistical and mathematical approaches. We present three case studies of social dynamics modeling that demonstrate the use of these agent-based models. In Central Asia, we examine multiple ensemble simulations with varying climate statistics to see how droughts and zuds affect populations, transmission of wealth across generations, and the overall structure of the social system. In Eastern Africa, we explore how successive episodes of drought affect the adaptive capacity of rural households. Human displacement, mainly rural-to-urban migration, and livelihood transition, particularly from pastoralism to farming, are observed as rural households interact dynamically with the biophysical environment and continually adjust their behavior to accommodate changes in climate. In the far-north case, we demonstrate one of the first successful attempts to model the complete climate-permafrost-infrastructure-societal interaction network as a complex adaptive system/CHANTS, implemented as a "federated" agent-based model using evolutionary computation. Analysis of population changes resulting from extreme weather across these and other cases provides evidence for the emergence of new steady states and shifting patterns of resilience.

  2. Improvement of sustainability of irrigation in olive by the accurate management of regulated deficit irrigation

    NASA Astrophysics Data System (ADS)

    Memmi, Houssem; Moreno, Marta M.; Gijón, M. Carmen; Pérez-López, David

    2015-04-01

    Regulated Deficit Irrigation (RDI) is a useful tool to balance productivity and water saving. The methodology aims to maintain maximum yield under deficit irrigation; the key is to impose the water deficit during a non-sensitive phenological period. In olive, this phenological period is pit hardening, although the accurate delimitation of the end of this period is still under research. Another interesting point in this methodology is how deep the water stress can be during the non-sensitive period. In this trial, three treatments were used in 2012 and 2013: a control treatment (T0), irrigated following the FAO methodology without water stress during the whole season, and two RDI treatments in which water stress was avoided only during stages I and III of fruit growth. During stage II, widely considered as pit hardening, irrigation was ceased until trees reached the stated water stress threshold. Water status was monitored by means of stem water potential (ψs) measurements. When the ψs value reached -2 MPa in the T1 treatment, trees were irrigated, but with a low amount of water, with the aim of keeping this water status for the whole of stage II. The same methodology was used for the T2 treatment, but with a threshold of -3 MPa. Water status was also monitored by leaf conductance measurements. Fruit size and yield were determined at the end of each season. The statistical design was a randomized complete block design with four replicates. The irrigation amounts in T1 and T2 were 50% and 65% less than in T0 at the end of the study. There were no significant differences among treatments in terms of yield in 2012 (off year) or 2013 (on year).

  3. New mission requirements methodologies for services provided by the Office of Space Communications

    NASA Technical Reports Server (NTRS)

    Holmes, Dwight P.; Hall, J. R.; Macoughtry, William; Spearing, Robert

    1993-01-01

    The Office of Space Communications, NASA Headquarters, has recently revised its methodology for receiving, accepting, and responding to customer requests for the use of that office's tracking and communications capabilities. This revision is the result of a process that had become over-burdened by the size of the currently active and proposed mission set, by requirements reviews that focus on single missions rather than on mission sets, and by negotiations most often not completed early enough to effect needed additions to capacity or capability prior to launch. The requirements-coverage methodology described here is more responsive to project/program needs, provides integrated input into the NASA budget process early enough to effect change, and describes the mechanisms and tools in place to ensure a value-added process that will benefit both NASA and its customers. Key features of the requirements methodology include a mechanism for early identification of, and systems trades with, new customers, and the delegation of review and approval of requirements documents from Headquarters to the NASA centers, thus empowering the system design teams to establish and negotiate detailed requirements with the user. A Mission Requirements Request (MRR) is introduced to facilitate early customer interaction; the expected result is that the time needed to reach an approved set of implementation requirements meeting the customer's needs can be greatly reduced. Finally, by increasing discipline in requirements management through baselining procedures, a tighter coupling between customer requirements and the budget is provided. A twice-yearly projection of customer requirements accommodation, designated the Capacity Projection Plan (CPP), provides customer feedback and allows the entire mission set to be serviced.

  4. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are performed using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components of these systems. They also need to include a variety of analysis techniques and orthogonal approaches: no single safety analysis or evaluation technique can handle all aspects of complex systems, and applying only one or two may make us feel satisfied but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  5. Economic evaluation in patient safety: a literature review of methods.

    PubMed

    de Rezende, Bruna Alves; Or, Zeynep; Com-Ruelle, Laure; Michel, Philippe

    2012-06-01

    Patient safety practices, targeting organisational changes for improving patient safety, are implemented worldwide, yet their costs are rarely evaluated. This paper provides a review of the methods used in the economic evaluation of such practices. International medical and economics databases were searched for peer-reviewed publications on economic evaluations of patient safety between 2000 and 2010 in English and French, complemented by a manual search of the reference lists of relevant papers. Grey literature was excluded. Studies were described using a standardised template and assessed independently by two researchers according to six quality criteria. Thirty-three articles were reviewed, representative of different patient safety domains, data types and evaluation methods: 18 estimated the economic burden of adverse events, 3 measured the costs of patient safety practices and 12 provided complete economic evaluations. Healthcare-associated infections were the most common subject of evaluation, followed by medication-related errors and all types of adverse events. Ten studies that adequately fulfilled one or several key quality criteria were selected for illustration. This review shows that full cost-benefit/utility evaluations are rarely completed, as they are resource intensive and often require unavailable data; some studies overcome these difficulties by performing stochastic modelling and by using secondary sources. Low methodological transparency can be a problem for building evidence from available economic evaluations. Investing in the economic design and reporting of studies, with more emphasis on defining study perspectives, data collection and methodological choices, could help strengthen our knowledge base on practices for improving patient safety.

  6. In silico gene expression analysis – an overview

    PubMed Central

    Murray, David; Doran, Peter; MacMathuna, Padraic; Moss, Alan C

    2007-01-01

    Efforts aimed at deciphering the molecular basis of complex disease are underpinned by the availability of high-throughput strategies for the identification of biomolecules that drive the disease process. The completion of the human genome-sequencing project, coupled with major technological developments, has afforded investigators myriad opportunities for multidimensional analysis of biological systems. Nowhere has this research explosion been more evident than in the field of transcriptomics. Affordable access to the technology that supports such investigations has led to a significant increase in the amount of data generated, and as most biological distinctions are now observed at a genomic level, a large amount of expression information is openly available via public databases. Furthermore, numerous computational methods have been developed to harness the power of these data. In this review we provide a brief overview of in silico methodologies for the analysis of differential gene expression, such as Serial Analysis of Gene Expression and Digital Differential Display. The performance of these strategies, at both an operational and a result/output level, is assessed and compared. The key considerations that must be made when completing an in silico expression analysis are also presented as a roadmap for biologists. Furthermore, to highlight the importance of these in silico methodologies in contemporary biomedical research, examples of current studies using these approaches are discussed. The overriding goal of this review is to present the scientific community with a critical overview of these strategies, so that they can be effectively added to the tool box of biomedical researchers focused on identifying the molecular mechanisms of disease. PMID:17683638
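
    As a concrete instance of the digital comparisons surveyed here, Digital Differential Display-style analyses ask whether a transcript's tag counts differ between two libraries; a Fisher exact test on the count table is a common choice. A minimal sketch with hypothetical counts:

        # Fisher exact test on transcript tag counts from two libraries
        # (all counts are hypothetical).
        from scipy.stats import fisher_exact

        tumor_hits, tumor_total = 48, 50000    # tags for one gene / library size
        normal_hits, normal_total = 12, 50000

        table = [[tumor_hits, tumor_total - tumor_hits],
                 [normal_hits, normal_total - normal_hits]]
        odds, p = fisher_exact(table)
        print(f"odds ratio = {odds:.2f}, p = {p:.2e}")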

  7. Emotional States of Athletes Prior to Performance-Induced Injury

    PubMed Central

    Devonport, Tracey J.; Lane, Andrew M.; Hanin, Yuri L.

    2005-01-01

    Psychological states experienced by athletes prior to injured, best and worst performances were investigated retrospectively using a mixed methodology. Fifty-nine athletes volunteered to complete an individualized assessment of performance states based on the Individual Zones of Optimal Functioning (IZOF) model. A subset (n = 30) of participants completed a standardized psychometric scale (Brunel Mood Rating Scale: BRUMS), retrospectively describing how they felt before best, worst, and injured performances. IZOF results showed similar emotion states being identified for injured and best performances. Analysis of BRUMS scores indicated a significant main effect for differences in mood by performance outcome, with post-hoc analyses showing that best performance was associated with lower scores on depression and fatigue and higher vigor than injured and worst performances. Worst performance was associated with higher fatigue and confusion than injured performance. Results indicate that retrospective emotional profiles before injured performance are closer to those of successful performance than of unsuccessful performance, and confirm differences between successful and unsuccessful performance. The qualitative and quantitative approaches used to retrospectively assess pre-performance emotional states before the three performance outcomes produced complementary findings. Practical implications of the study are discussed. Key Points: Psychological states experienced by athletes prior to injured, best and worst performances were investigated retrospectively using a mixed methodology. Results indicate that retrospective emotional profiles before injured performance are closer to successful than unsuccessful performance, and confirm differences between successful and unsuccessful performance, a finding obtained with both methods. Future research should further examine the emotional antecedents of injury, and applied sport psychologists should recognize the potential risk of injury associated with emotional profiles typically linked with best performance. PMID:24501552

  8. The association between anxiety disorders and suicidal behaviors: a systematic review and meta-analysis.

    PubMed

    Kanwar, Amrit; Malik, Shaista; Prokop, Larry J; Sim, Leslie A; Feldstein, David; Wang, Zhen; Murad, M Hassan

    2013-10-01

    Although anxiety has been proposed as a potentially modifiable risk factor for suicide, research examining the relationship between anxiety and suicidal behaviors has produced mixed results. We therefore aimed to test the hypothesis that anxiety disorders are associated with suicidal behaviors and to evaluate the magnitude and quality of the supporting evidence. A systematic literature search of multiple databases was conducted from database inception through August 2011. Two investigators independently reviewed the studies and determined their eligibility and quality based upon a priori established inclusion criteria. The outcomes of interest were suicidal ideation, suicide attempts, completed suicides, and a composite outcome of any suicidal behavior. We pooled odds ratios from the included studies using random-effects models. Forty-two observational studies were included. The studies were of variable methodological quality owing to inconsistent adjustment for confounders. Compared to those without anxiety, patients with anxiety were more likely to have suicidal ideation (OR = 2.89, 95% CI: 2.09, 4.00), attempted suicide (OR = 2.47, 95% CI: 1.96, 3.10), completed suicide (OR = 3.34, 95% CI: 2.13, 5.25), or any suicidal behavior (OR = 2.85, 95% CI: 2.35, 3.46). The increase in risk was demonstrated for each subtype of anxiety except obsessive-compulsive disorder (OCD). The quality of this evidence is considered low to moderate because of heterogeneity and methodological limitations. This systematic review and meta-analysis provides evidence that rates of suicide are higher in patients with any type of anxiety disorder other than OCD. © 2013 Wiley Periodicals, Inc.
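
    The pooled odds ratios reported above come from random-effects models. A minimal sketch of DerSimonian-Laird pooling of log odds ratios, the standard random-effects approach (the study data below are hypothetical, not the 42 included studies):

        import numpy as np

        log_or = np.log([2.1, 3.4, 2.7, 1.9, 3.9])     # per-study odds ratios
        se = np.array([0.30, 0.25, 0.40, 0.20, 0.35])  # their standard errors

        w = 1 / se**2                                  # inverse-variance weights
        theta_f = np.sum(w * log_or) / np.sum(w)       # fixed-effect estimate
        Q = np.sum(w * (log_or - theta_f) ** 2)        # Cochran's Q heterogeneity
        df = len(log_or) - 1
        tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_re = 1 / (se**2 + tau2)                      # random-effects weights
        theta = np.sum(w_re * log_or) / np.sum(w_re)
        se_theta = np.sqrt(1 / np.sum(w_re))
        lo, hi = theta - 1.96 * se_theta, theta + 1.96 * se_theta
        print(f"pooled OR = {np.exp(theta):.2f} (95% CI {np.exp(lo):.2f}, {np.exp(hi):.2f})")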

  9. Regional health care planning: a methodology to cluster facilities using community utilization patterns

    PubMed Central

    2013-01-01

    Background Community-based health care planning and regulation necessitate grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state's Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. Methods The clustering methodology employs a 2-step K-means + Ward's clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical measure of cluster fit and characteristics of the resulting Hospital Groups. Results Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Conclusions Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units. PMID:23964905
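
    A minimal sketch of such a 2-step K-means + Ward procedure on synthetic utilization profiles (the feature set, counts and cluster numbers below are illustrative, not the Michigan configuration):

        import numpy as np
        from sklearn.cluster import KMeans, AgglomerativeClustering

        rng = np.random.default_rng(42)
        X = rng.random((150, 8))            # 150 hospitals x 8 utilization features

        # Step 1: K-means compresses hospitals into many small prototypes.
        kmeans = KMeans(n_clusters=30, n_init=10, random_state=42).fit(X)
        # Step 2: Ward's hierarchical clustering merges the prototypes.
        ward = AgglomerativeClustering(n_clusters=10, linkage="ward")
        centroid_labels = ward.fit_predict(kmeans.cluster_centers_)

        hospital_groups = centroid_labels[kmeans.labels_]  # final group per hospital
        print(np.bincount(hospital_groups))                # hospitals per group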

  10. Regional health care planning: a methodology to cluster facilities using community utilization patterns.

    PubMed

    Delamater, Paul L; Shortridge, Ashton M; Messina, Joseph P

    2013-08-22

    Community-based health care planning and regulation necessitate grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state's Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. The clustering methodology employs a 2-step K-means + Ward's clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical measure of cluster fit and characteristics of the resulting Hospital Groups. Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units.

  11. Using Design-Based Research in Gifted Education

    ERIC Educational Resources Information Center

    Jen, Enyi; Moon, Sidney; Samarapungavan, Ala

    2015-01-01

    Design-based research (DBR) is a new methodological framework that was developed in the context of the learning sciences; however, it has not been used very often in the field of gifted education. Compared with other methodologies, DBR is more process-oriented and context-sensitive. In this methodological brief, the authors introduce DBR and…

  12. Systematic Review Methodology for the Fatigue in Emergency Medical Services Project

    DOT National Transportation Integrated Search

    2018-01-11

    Background: Guidance for managing fatigue in the Emergency Medical Services (EMS) setting is limited. The Fatigue in EMS Project sought to complete multiple systematic reviews guided by seven explicit research questions, assemble the best available e...

  13. 76 FR 70111 - Certain Frozen Fish Fillets from the Socialist Republic of Vietnam: Extension of Deadline for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-10

    ... completion of the preliminary results of a new shipper review to 300 days if it determines that the case is... shipper review involves extraordinarily complicated methodological issues. Interested parties have...

  14. Complete Chloroplast Genome Sequences of Mongolia Medicine Artemisia frigida and Phylogenetic Relationships with Other Plants

    PubMed Central

    Liu, Yue; Huo, Naxin; Dong, Lingli; Wang, Yi; Zhang, Shuixian; Young, Hugh A.; Feng, Xiaoxiao; Gu, Yong Qiang

    2013-01-01

    Background Artemisia frigida Willd. is an important Mongolian traditional medicinal plant with pharmacological functions of hemostasis and reduction of swelling. However, little sequence or genomic information is available for Artemisia frigida, which makes phylogenetic identification, evolutionary studies, and genetic improvement very difficult. We report the complete chloroplast genome sequence of Artemisia frigida based on 454 pyrosequencing. Methodology/Principal Findings The complete chloroplast genome of Artemisia frigida is 151,076 bp, including a large single copy (LSC) region of 82,740 bp, a small single copy (SSC) region of 18,394 bp and a pair of inverted repeats (IRs) of 24,971 bp. The genome contains 114 unique genes and 18 duplicated genes. The chloroplast genome of Artemisia frigida contains a small 3.4 kb inversion within a large 23 kb inversion in the LSC region, a unique feature in Asteraceae. The gene order in the SSC region of Artemisia frigida is inverted compared with the other 6 Asteraceae species whose chloroplast genomes have been sequenced. This inversion was likely caused by an intramolecular recombination event that occurred only in Artemisia frigida. The rich SSR loci in the Artemisia frigida chloroplast genome provide a rare opportunity to study the population genetics of this Mongolian medicinal plant. Phylogenetic analysis demonstrates a sister relationship between Artemisia frigida and four other species in Asteraceae, namely Ageratina adenophora, Helianthus annuus, Guizotia abyssinica and Lactuca sativa, based on 61 protein-coding sequences. Furthermore, Artemisia frigida was placed in the tribe Anthemideae in the subfamily Asteroideae (Asteraceae) based on ndhF and trnL-F sequence comparisons. Conclusion The chloroplast genome sequence of Artemisia frigida was assembled and analyzed in this study, representing the first plastid genome sequenced in the tribe Anthemideae. This complete chloroplast genome sequence will be useful for molecular ecology and molecular phylogeny studies within Artemisia species and within the Asteraceae family more broadly. PMID:23460871
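
    The quadripartite structure reported above can be checked directly: the LSC, the SSC and the two IR copies must sum to the total genome length.

        # Consistency check using the values from the abstract.
        lsc, ssc, ir = 82_740, 18_394, 24_971
        assert lsc + ssc + 2 * ir == 151_076   # matches the reported 151,076 bp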

  15. How can activity-based costing methodology be performed as a powerful tool to calculate costs and secure appropriate patient care?

    PubMed

    Lin, Blossom Yen-Ju; Chao, Te-Hsin; Yao, Yuh; Tu, Shu-Min; Wu, Chun-Ching; Chern, Jin-Yuan; Chao, Shiu-Hsiung; Shaw, Keh-Yuong

    2007-04-01

    Previous studies have shown the advantages of using activity-based costing (ABC) methodology in the health care industry. The potential value of ABC methodology in health care derives from more accurate cost calculation compared with traditional step-down costing, and from the potential to evaluate the quality or effectiveness of health care based on health care activities. This project used ABC methodology to profile the cost structure of inpatients undergoing surgical procedures at the Department of Colorectal Surgery in a public teaching hospital, and to identify missing or inappropriate clinical procedures. We found that ABC methodology was able to calculate costs accurately and to identify several missing pre- and post-surgical nursing education activities in the course of treatment.
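
    The core ABC computation is straightforward: each activity's cost pool is divided by its driver volume to give a rate, and an episode of care is costed by the driver units it consumes. A minimal sketch with hypothetical cost pools and drivers:

        # Hypothetical activity cost pools: (annual cost, annual driver volume, unit).
        activities = {
            "pre-op nursing education": (120_000, 2_000, "sessions"),
            "operating room":           (900_000, 3_000, "hours"),
            "post-op ward care":        (450_000, 15_000, "bed-days"),
        }
        # Driver units consumed by one (hypothetical) surgical inpatient episode.
        episode = {"pre-op nursing education": 2,
                   "operating room": 3.5,
                   "post-op ward care": 6}

        total = 0.0
        for name, units in episode.items():
            pool, volume, unit = activities[name]
            rate = pool / volume                 # cost per driver unit
            total += rate * units
            print(f"{name}: {units} {unit} x {rate:.2f} = {rate * units:.2f}")
        print(f"episode cost = {total:.2f}")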

  16. Psychological interventions for post-traumatic stress disorder and comorbid substance use disorder: A systematic review and meta-analysis.

    PubMed

    Roberts, Neil P; Roberts, Pamela A; Jones, Neil; Bisson, Jonathan I

    2015-06-01

    Co-morbid post-traumatic stress disorder (PTSD) and substance use disorder (SUD) are common, difficult to treat, and associated with poor prognosis. This review aimed to determine the efficacy of individual and group psychological interventions aimed at treating comorbid PTSD and SUD, based on evidence from randomised controlled trials. Our pre-specified primary outcomes were PTSD severity, drug/alcohol use, and treatment completion. We undertook a comprehensive search strategy. Included studies were rated for methodological quality. Available evidence was judged through GRADE. Fourteen studies were included. We found that individual trauma-focused cognitive-behavioural intervention, delivered alongside SUD intervention, was more effective than treatment as usual (TAU)/minimal intervention for PTSD severity post-treatment, and at subsequent follow-up. There was no evidence of an effect for level of drug/alcohol use post-treatment but there was an effect at 5-7 months. Fewer participants completed trauma-focused intervention than TAU. We found little evidence to support the use of individual or group-based non-trauma-focused interventions. All findings were judged as being of low/very low quality. We concluded that there is evidence that individual trauma-focused psychological intervention delivered alongside SUD intervention can reduce PTSD severity, and drug/alcohol use. There is very little evidence to support use of non-trauma-focused individual or group-based interventions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Identification of candidate MLO powdery mildew susceptibility genes in cultivated Solanaceae and functional characterization of tobacco NtMLO1.

    PubMed

    Appiano, Michela; Pavan, Stefano; Catalano, Domenico; Zheng, Zheng; Bracuto, Valentina; Lotti, Concetta; Visser, Richard G F; Ricciardi, Luigi; Bai, Yuling

    2015-10-01

    Specific homologs of the plant Mildew Locus O (MLO) gene family act as susceptibility factors towards the powdery mildew (PM) fungal disease, causing significant economic losses in agricultural settings. Thus, in order to obtain PM resistant phenotypes, a general breeding strategy has been proposed, based on the selective inactivation of MLO susceptibility genes across cultivated species. In this study, PCR-based methodologies were used in order to isolate MLO genes from cultivated solanaceous crops that are hosts for PM fungi, namely eggplant, potato and tobacco, which were named SmMLO1, StMLO1 and NtMLO1, respectively. Based on phylogenetic analysis and sequence alignment, these genes were predicted to be orthologs of tomato SlMLO1 and pepper CaMLO2, previously shown to be required for PM pathogenesis. Full-length sequence of the tobacco homolog NtMLO1 was used for a heterologous transgenic complementation assay, resulting in its characterization as a PM susceptibility gene. The same assay showed that a single nucleotide change in a mutated NtMLO1 allele leads to complete gene loss-of-function. Results here presented, also including a complete overview of the tobacco and potato MLO gene families, are valuable to study MLO gene evolution in Solanaceae and for molecular breeding approaches aimed at introducing PM resistance using strategies of reverse genetics.

  18. Adaptive Approximation-Based Regulation Control for a Class of Uncertain Nonlinear Systems Without Feedback Linearizability.

    PubMed

    Wang, Ning; Sun, Jing-Chao; Han, Min; Zheng, Zhongjiu; Er, Meng Joo

    2017-09-06

    In this paper, an innovative adaptive approximation-based regulation control (AARC) scheme is developed for a general class of uncertain nonlinear (cascade) systems with unknown dynamics that are not feedback linearizable and cannot be handled by existing approaches. Within the framework of adding a power integrator (API), by deriving adaptive laws for output weights and prediction-error compensation pertaining to a single-hidden-layer feedforward network (SLFN) from Lyapunov synthesis, a series of SLFN-based approximators is explicitly constructed to dominate the completely unknown dynamics. By virtue of significant advancements in the API technique, an adaptive API methodology is established in combination with the SLFN-based adaptive approximators, contributing a recursive mechanism to the AARC scheme. As a consequence, the output regulation error asymptotically converges to the origin, and all other signals of the closed-loop system are uniformly ultimately bounded. Simulation studies and comprehensive comparisons with backstepping- and API-based approaches demonstrate that the proposed AARC scheme achieves remarkable performance and superiority in dealing with unknown dynamics.
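
    The full AARC scheme is beyond a short example, but the underlying idea of adaptive approximation-based regulation can be illustrated on a scalar plant: an online function approximator learns the unknown dynamics through a Lyapunov-derived weight update while a feedback term drives the state to the origin. Everything below (the plant, the gains, and an RBF network standing in for the paper's SLFN) is an illustrative assumption:

        import numpy as np

        def phi(x, centers, width=1.0):
            """RBF features evaluated at the scalar state x."""
            return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

        f_true = lambda x: np.sin(x) + 0.5 * x  # "unknown" dynamics, used only to simulate
        centers = np.linspace(-3, 3, 15)        # RBF centers
        W = np.zeros_like(centers)              # adaptive output weights
        k, gamma, dt = 2.0, 5.0, 1e-3           # feedback gain, adaptation rate, step
        x = 2.0                                 # initial state

        for _ in range(20_000):
            e = x                                  # regulation error (target is the origin)
            u = -k * e - W @ phi(x, centers)       # feedback + estimated-dynamics cancellation
            W += gamma * e * phi(x, centers) * dt  # Lyapunov-derived adaptive law
            x += (f_true(x) + u) * dt              # Euler step of the true plant

        print(f"final |x| = {abs(x):.4f}")      # should be close to zero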

  19. A systematic review of the cost and cost-effectiveness of electronic discharge communications.

    PubMed

    Sevick, Laura K; Esmail, Rosmin; Tang, Karen; Lorenzetti, Diane L; Ronksley, Paul; James, Matthew; Santana, Maria; Ghali, William A; Clement, Fiona

    2017-07-02

    The transition between acute care and community care can be a vulnerable period in a patient's treatment because of the potential for post-discharge adverse events, a vulnerability that has been attributed to miscommunication between hospital-based and community-based physicians. Electronic discharge communication has been proposed as one solution to bridge this communication gap. Prior to widespread implementation of these tools, their costs and benefits should be considered. Our objective was to establish the cost and cost-effectiveness of electronic discharge communications compared with traditional discharge systems for individuals who have completed care with one provider and are transitioning to a new provider. We conducted a systematic review of the published literature, using best practices, to identify economic evaluations/cost analyses of electronic discharge communication tools. Inclusion criteria were: (1) economic analysis and (2) an electronic discharge communication tool as the intervention. The quality of each article was assessed, and data were summarised using a component-based analysis. One thousand unique abstracts were identified, and 57 full-text articles were assessed for eligibility. Four studies met the final inclusion criteria; they varied in their primary objectives, methodology, costs reported and outcomes, and all were of low to good quality. Three of the studies reported a cost-effectiveness measure, ranging from an incremental daily cost of $0.331 (2003 Canadian) to decrease average discharge note completion by 1 day, to a cost per page per discharge letter of €9.51, to a dynamic net present value of €31.1 million for a 5-year implementation of the intervention. None of the identified studies considered clinically meaningful patient or quality outcomes. Economic analyses of electronic discharge communications are thus scarce and methodologically inconsistent; further studies are needed to understand their cost-effectiveness and value for patient care. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. A phylotranscriptomic backbone of the orb-weaving spider family Araneidae (Arachnida, Araneae) supported by multiple methodological approaches.

    PubMed

    Kallal, Robert J; Fernández, Rosa; Giribet, Gonzalo; Hormiga, Gustavo

    2018-04-07

    The orb-weaving spider family Araneidae is extremely diverse (>3100 spp.) and its members include some of the most charismatic terrestrial arthropods, many of them recognizable by their iconic orbicular snare webs, such as the common garden spiders. Despite considerable effort to resolve their backbone relationships based on multiple sources of data (morphological, behavioral and molecular), pervasive low support remains in recent studies. In addition, no overarching phylogeny of araneids is available to date, hampering further comparative work. In this study, we analyze the transcriptomes of 33 taxa, including 19 araneids - 12 of them new to this study - representing most of the core family lineages, to examine relationships within the family using genomic-scale datasets produced under various methodological treatments, namely ortholog selection and gene occupancy as a measure of matrix completeness. Six matrices were constructed to assess these effects by varying the orthology inference method and the gene occupancy threshold. The orthology methods used are the benchmarking tool BUSCO and the tree-based method UPhO; three gene occupancy thresholds (45%, 65%, 85%) were used to assess the effect of missing data. Gene tree- and species tree-based methods (including multi-species coalescent and concatenation approaches, as well as maximum likelihood and Bayesian inference) were used, totalling 17 analytical treatments. The monophyly of Araneidae and the placement of core araneid lineages were supported, together with some previously poorly supported backbone divergences; these include high support for Zygiellinae as the earliest diverging subfamily (followed by Nephilinae), the placement of Gasteracanthinae as sister group to Cyclosa and close relatives, and close relationships between the Araneus + Neoscona clade and the Cyrtophorinae + Argiopinae clade. Incongruences were relegated to short branches in the clade comprising Cyclosa and its close relatives. We found congruence among most of the completed analyses, with minimal topological effects from occupancy/missing data and orthology assessment. The number of genes retained under particular combinations of orthology method and occupancy threshold had the greatest effect on the resulting trees, with anomalous outcomes recovered from analyses of lower numbers of genes. Copyright © 2018 Elsevier Inc. All rights reserved.
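
    Gene-occupancy filtering, one of the treatments varied here, simply drops genes recovered in too few taxa. A minimal sketch on a random presence/absence matrix, reusing the study's 45/65/85% thresholds:

        import numpy as np

        rng = np.random.default_rng(0)
        n_taxa, n_genes = 33, 2000
        presence = rng.random((n_taxa, n_genes)) < 0.7  # True = gene recovered for taxon

        occupancy = presence.mean(axis=0)               # fraction of taxa per gene
        for threshold in (0.45, 0.65, 0.85):
            kept = int((occupancy >= threshold).sum())
            print(f"occupancy >= {threshold:.0%}: {kept} genes retained")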
