Methodological Issues in Achieving School Accountability
ERIC Educational Resources Information Center
Linn, Robert L.
2008-01-01
Test-based educational accountability is widely used in many countries, but is pervasive in the US. Key features of test-based accountability required by the US No Child Left Behind Act are discussed. Particular attention is given to methodological issues such as the distinction between status and growth approaches, the setting of performance…
NASA Astrophysics Data System (ADS)
Peng, Xiang; Zhang, Peng; Cai, Lilong
In this paper, we present a virtual-optics-based information security system model with the aid of public-key-infrastructure (PKI) techniques. The proposed model employs a hybrid architecture in which our previously published encryption algorithm based on virtual-optics imaging methodology (VOIM) is used to encipher and decipher data, while an asymmetric algorithm, for example RSA, is applied for enciphering and deciphering the session key(s). For an asymmetric system, given an encryption key, it is computationally infeasible to determine the decryption key, and vice versa. The whole information security model runs within the framework of PKI, which is based on public-key cryptography and digital signatures. This PKI-based VOIM security approach additionally provides confidentiality, authentication, and integrity for data encryption in a networked environment.
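The hybrid pattern described above (a symmetric cipher for the bulk data, RSA for the session key) can be sketched as follows. This is a minimal illustration using Python's cryptography package; AES-GCM stands in for the VOIM cipher, which has no public implementation, and the PKI certificate handling is omitted.

```python
# Hybrid encryption sketch: a symmetric cipher protects the data while RSA
# protects the session key. AES-GCM stands in for the paper's VOIM cipher.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiver's key pair (in a real PKI the public key would come from a
# CA-issued certificate, not be generated locally).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: encipher the data with a fresh session key, then wrap that key.
session_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"plaintext payload", None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Receiver: unwrap the session key, then decipher the data.
recovered_key = private_key.decrypt(wrapped_key, oaep)
plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)
assert plaintext == b"plaintext payload"
```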
METHODOLOGICAL QUALITY OF ECONOMIC EVALUATIONS ALONGSIDE TRIALS OF KNEE PHYSIOTHERAPY.
García-Pérez, Lidia; Linertová, Renata; Arvelo-Martín, Alejandro; Guerra-Marrero, Carolina; Martínez-Alberto, Carlos Enrique; Cuéllar-Pompa, Leticia; Escobar, Antonio; Serrano-Aguilar, Pedro
2017-01-01
The methodological quality of an economic evaluation performed alongside a clinical trial can be underestimated if the paper does not report key methodological features. This study discusses methodological assessment issues using the example of a systematic review on the cost-effectiveness of physiotherapy for knee osteoarthritis. Six economic evaluation studies included in the systematic review, together with the related clinical trials, were assessed using the 10-question checklist by Drummond and the Physiotherapy Evidence Database (PEDro) scale. All economic evaluations were performed alongside a clinical trial, but the studied interventions were too heterogeneous to be synthesized. The methodological quality of the economic evaluations reported in the papers was not free of drawbacks, and in some cases it improved when information from the related clinical trial was taken into account. Economic evaluation papers dedicate little space to the methodological features of the related clinical trials; therefore, methodological quality can be underestimated if evaluated separately from the trials. Future economic evaluations should follow methodological recommendations more strictly, and authors should pay special attention to the quality of reporting.
An introduction to exemplar research: a definition, rationale, and conceptual issues.
Bronk, Kendall Cotton; King, Pamela Ebstyne; Matsuba, M Kyle
2013-01-01
The exemplar methodology represents a useful yet underutilized approach to studying developmental constructs. It features an approach to research whereby individuals, entities, or programs that exemplify the construct of interest in a particularly intense or highly developed manner compose the study sample. Accordingly, it reveals what the upper ends of development look like in practice. Utilizing the exemplar methodology allows researchers to glimpse not only what is but also what is possible with regard to the development of a particular characteristic. The present chapter includes a definition of the exemplar methodology, a discussion of some of the key conceptual issues to consider when employing it in empirical studies, and a brief overview of the other chapters featured in this volume. © Wiley Periodicals, Inc.
A Co-modeling Method Based on Component Features for Mechatronic Devices in Aero-engines
NASA Astrophysics Data System (ADS)
Wang, Bin; Zhao, Haocen; Ye, Zhifeng
2017-08-01
Data-fused and user-friendly design of aero-engine accessories is required because of their structural complexity and stringent reliability requirements. This paper gives an overview of a typical aero-engine control system and the development process of the key mechatronic devices used. Several essential aspects of modeling and simulation in the process are investigated. Considering the limitations of a single theoretic model, a feature-based co-modeling methodology is suggested to satisfy the design requirements and compensate for the diversity of component sub-models for these devices. As an example, a stepper-motor-controlled Fuel Metering Unit (FMU) is modeled in view of the component physical features using two different software tools. An interface is suggested to integrate the single-discipline models into a synthesized one. Performance simulation of this device using the co-model and parameter optimization for its key components are discussed. Comparison between delivery testing and the simulation shows that the co-model for the FMU has high accuracy and clear advantages over a single model. Together with its compatible interface with the engine mathematical model, the feature-based co-modeling methodology proves to be an effective technical measure in the development process of the device.
Cellular neural network-based hybrid approach toward automatic image registration
NASA Astrophysics Data System (ADS)
Arun, Pattathal VijayaKumar; Katiyar, Sunil Kumar
2013-01-01
Image registration is a key component of various image processing operations that involve the analysis of different image data sets. Automatic image registration domains have witnessed the application of many intelligent methodologies over the past decade; however, the inability to properly model object shape as well as contextual information has limited the attainable accuracy. A framework for accurate feature shape modeling and adaptive resampling using advanced techniques such as vector machines, cellular neural networks (CNN), the scale-invariant feature transform (SIFT), coresets, and cellular automata is proposed. CNN has been found to be effective in improving the feature matching and resampling stages of registration, and the complexity of the approach has been considerably reduced using coreset optimization. The salient features of this work are CNN-based SIFT feature point optimization, adaptive resampling, and intelligent object modelling. The developed methodology has been compared with contemporary methods using different statistical measures. Investigations over various satellite images revealed that considerable success was achieved with the approach. The system dynamically uses spectral and spatial information for representing contextual knowledge using a CNN-Prolog approach. The methodology is also shown to be effective in providing intelligent interpretation and adaptive resampling.
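The classical detect-match-estimate core that the framework above builds on can be sketched with OpenCV. This is a minimal illustration of the SIFT matching and resampling stages only; the paper's CNN-based optimization, coreset reduction, and cellular-automata components are omitted, and the file names are placeholders.

```python
# SIFT feature matching with RANSAC-based transform estimation, the
# conventional baseline that the paper's intelligent stages refine.
import cv2
import numpy as np

ref = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)   # placeholder paths
sensed = cv2.imread("sensed.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(ref, None)
kp2, des2 = sift.detectAndCompute(sensed, None)

# Lowe's ratio test keeps only distinctive correspondences.
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]

ref_pts = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
sen_pts = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# RANSAC rejects outlier matches; H maps sensed coordinates to the
# reference frame, so warping resamples the sensed image onto it.
H, mask = cv2.findHomography(sen_pts, ref_pts, cv2.RANSAC, 5.0)
registered = cv2.warpPerspective(sensed, H, (ref.shape[1], ref.shape[0]))
```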
Shukla, Nagesh; Keast, John E; Ceglarek, Darek
2014-10-01
The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Diagnostic methodology for incipient system disturbance based on a neural wavelet approach
NASA Astrophysics Data System (ADS)
Won, In-Ho
Since incipient system disturbances are easily mixed up with other events or noise sources, the signal from a system disturbance can be neglected or misidentified as noise. Because the available knowledge and information obtained from measurements is incomplete or inexact, the use of artificial intelligence (AI) tools was explored to overcome these uncertainties and limitations. A methodology integrating the feature extraction efficiency of the wavelet transform with the classification capabilities of neural networks is developed for signal classification in the context of detecting incipient system disturbances. The synergistic combination of wavelets and neural networks presents more strengths and fewer weaknesses than either technique taken alone. A wavelet feature extractor is developed to form concise feature vectors for neural network inputs. The feature vectors are calculated from wavelet coefficients to reduce redundancy and computational expense. In this procedure, statistical features that apply the fractal concept to the wavelet coefficients play a crucial role in the extractor. To verify the proposed methodology, two applications are investigated and successfully tested. The first involves pump cavitation detection using a dynamic pressure sensor. The second pertains to incipient pump cavitation detection using signals obtained from a current sensor. Comparisons among the three proposed feature vectors, and with statistical techniques, show that the variance feature extractor provides the better approach in the performed applications.
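A minimal sketch of a variance-based wavelet feature extractor of the kind described, feeding a small neural network classifier. The wavelet family, decomposition depth, network size, and the random placeholder data are illustrative assumptions, not the dissertation's actual settings.

```python
# Variance-of-wavelet-coefficients feature extraction plus a neural network
# classifier: decompose each signal, take one variance per level, classify.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wavelet_variance_features(signal, wavelet="db4", level=5):
    """Concise feature vector: variance of the coefficients at each level."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.var(c) for c in coeffs])

# X_raw: (n_signals, n_samples) sensor traces; y: 0 = normal,
# 1 = incipient disturbance (e.g., cavitation). Random placeholder data.
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(200, 1024))
y = rng.integers(0, 2, size=200)

X = np.vstack([wavelet_variance_features(s) for s in X_raw])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
```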
Defining competency-based evaluation objectives in family medicine
Lawrence, Kathrine; Allen, Tim; Brailovsky, Carlos; Crichton, Tom; Bethune, Cheri; Donoff, Michel; Laughlin, Tom; Wetmore, Stephen; Carpentier, Marie-Pierre; Visser, Shaun
2011-01-01
Objective: To develop key features for priority topics previously identified by the College of Family Physicians of Canada that, together with skill dimensions and phases of the clinical encounter, broadly describe competence in family medicine. Design: Modified nominal group methodology, used to develop key features for each priority topic through an iterative process. Setting: The College of Family Physicians of Canada. Participants: An expert group of 7 family physicians and 1 educational consultant, all of whom had experience in assessing competence in family medicine. Group members represented the Canadian family medicine context with respect to region, sex, language, community type, and experience. Methods: The group used a modified Delphi process to derive a detailed operational definition of competence, using multiple iterations until consensus was achieved for the items under discussion. The group met 3 to 4 times a year from 2000 to 2007. Main findings: The group analyzed 99 topics and generated 773 key features. There were 2 to 20 (average 7.8) key features per topic; 63% of the key features focused on the diagnostic phase of the clinical encounter. Conclusion: This project expands previous descriptions of the process of generating key features for assessment, and removes this process from the context of written examinations. A key-features analysis of topics focuses on higher-order cognitive processes of clinical competence. The project did not define all the skill dimensions of competence to the same degree, but it clearly identified those requiring further definition. This work generates part of a discipline-specific, competency-based definition of family medicine for assessment purposes. It limits the domain for assessment purposes, which is an advantage for the teaching and assessment of learners. A validation study on the content of this work would ensure that it truly reflects competence in family medicine. PMID:21998245
Making sense of grounded theory in medical education.
Kennedy, Tara J T; Lingard, Lorelei A
2006-02-01
Grounded theory is a research methodology designed to develop, through collection and analysis of data that is primarily (but not exclusively) qualitative, a well-integrated set of concepts that provide a theoretical explanation of a social phenomenon. This paper aims to provide an introduction to key features of grounded theory methodology within the context of medical education research. In this paper we include a discussion of the origins of grounded theory, a description of key methodological processes, a comment on pitfalls encountered commonly in the application of grounded theory research, and a summary of the strengths of grounded theory methodology with illustrations from the medical education domain. The significant strengths of grounded theory that have resulted in its enduring prominence in qualitative research include its clearly articulated analytical process and its emphasis on the generation of pragmatic theory that is grounded in the data of experience. When applied properly and thoughtfully, grounded theory can address research questions of significant relevance to the domain of medical education.
Martian polar geological studies
NASA Technical Reports Server (NTRS)
Cutts, J. A. J.
1977-01-01
Multiple arcs of rugged mountains and adjacent plains on the surface of Mars were examined. These features, located in the southern polar region, were photographed by Mariner 9. Comparisons are made with the characteristics of a lunar basin and mare, Mare Imbrium in particular. The martian feature is interpreted to have originated in the same way as its lunar analog: by volcanic flooding of a large impact basin. Key data and methodology leading to this conclusion are cited.
Handwriting: Feature Correlation Analysis for Biometric Hashes
NASA Astrophysics Data System (ADS)
Vielhauer, Claus; Steinmetz, Ralf
2004-12-01
In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.
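The three components of the evaluation methodology can be sketched numerically as follows. This is an illustrative reading of the abstract using placeholder data; the binning used for the entropy estimate is an assumption, not the authors' exact procedure.

```python
# Per-feature intrapersonal scatter, interpersonal entropy, and the
# correlation between the two, computed from handwriting feature vectors.
import numpy as np
from scipy.stats import entropy

# features[u, s, f]: feature f of sample s from writer u (placeholder data).
rng = np.random.default_rng(1)
features = rng.normal(size=(30, 10, 24))
n_users, n_samples, n_feats = features.shape

# Intrapersonal scatter: average within-writer deviation of each feature.
intra = features.std(axis=1).mean(axis=0)

# Interpersonal entropy: spread of each feature's per-writer means.
user_means = features.mean(axis=1)                 # (n_users, n_feats)
inter = np.array([entropy(np.histogram(user_means[:, f], bins=10)[0] + 1e-12)
                  for f in range(n_feats)])

# A feature is useful when it is stable within a writer (low scatter)
# yet varied across writers (high entropy).
corr = np.corrcoef(intra, inter)[0, 1]
print("scatter-entropy correlation:", round(corr, 3))
```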
Single-Subject Research in Gifted Education
ERIC Educational Resources Information Center
Simonsen, Brandi; Little, Catherine A.
2011-01-01
Single-subject research (SSR) is an experimental research tradition that is well established in other fields (e.g., special education, behavior analysis) but has rarely been applied to topics in gifted education. In this Methodological Brief, Brandi Simonsen and Catherine A. Little from the University of Connecticut highlight the key features of…
Co-Sleeping during Infancy and Early Childhood: Key Findings and Future Directions
ERIC Educational Resources Information Center
Goldberg, Wendy A.; Keller, Meret A.
2007-01-01
Emergent themes from this special issue on parent-child co-sleeping are featured in this concluding article. Each of the pieces in this collection addressed one or more of the following themes: methodologies for studying parent-infant co-sleeping, physical and social characteristics of the child's sleep environment, associations between sleep…
Student Achievement and Fidelity of Implementation of the Middle School Concept in Middle Schools
ERIC Educational Resources Information Center
Jackson, Delilah A.
2013-01-01
This study, using qualitative, multiple case methodology, examined four middle schools within a Local Education Agency (LEA) in eastern North Carolina to determine whether the implementation of key middle school features: (a) interdisciplinary teaming, (b) flexible scheduling, (c) advisor/advisee relationships, and (d) an integrative, exploratory…
Dynamics of Western Career Attributes in the Russian Context
ERIC Educational Resources Information Center
Khapova, Svetlana N.; Korotov, Konstantin
2007-01-01
Purpose: The purpose of this article is to raise awareness of the dynamic character of career and its key attributes, and the embeddedness of their definitions and meanings in national social, political and economic contexts. Design/methodology/approach: Features of three recent distinct social, political and economic situations in Russia are used…
Systemic Modelling for Relating Labour Market to Vocational Education
ERIC Educational Resources Information Center
Papakitsos, Evangelos C.
2016-01-01
The present study introduces a systemic model that demonstrates a description of the relationship between the labour-market and vocational education from the perspective of systemic theory. Based on the application of the relevant methodology, the two open social systems are identified and analyzed. Their key-features are presented and the points…
National All-Age Career Guidance Services: Evidence and Issues
ERIC Educational Resources Information Center
Watts, A. G.
2010-01-01
The three major national all-age career guidance services--in New Zealand, Scotland and Wales--have been reviewed using an adaptation of the methodology adopted in the OECD Career Guidance Policy Review. The main features of the three services are summarised, and some key differences and distinctive strengths are outlined. The alternative approach…
NASA Astrophysics Data System (ADS)
Leandro, J.; Schumann, A.; Pfister, A.
2016-04-01
Some of the major challenges in modelling rainfall-runoff in urbanised areas are the complex interaction between the sewer system and the overland surface, and the spatial heterogeneity of key urban features. The former requires the sewer network and the system of surface flow paths to be solved simultaneously. The latter is still an unresolved issue because the heterogeneity of runoff formation requires highly detailed information and includes a large variety of feature-specific rainfall-runoff dynamics. This paper presents a methodology for considering the variability of building types and the spatial heterogeneity of land surfaces. The former is achieved by developing a specific conceptual rainfall-runoff model and the latter by defining a fully distributed approach for infiltration processes in urban areas with limited storage capacity, dependent on OpenStreetMap (OSM) data. The model complexity is increased stepwise by adding components to an existing 2D overland flow model; the different steps are defined as modelling levels. The methodology is applied in a German case study. Results highlight that: (a) spatial heterogeneity of urban features has a medium to high impact on the estimated overland flood depths, (b) the addition of multiple urban features has a higher cumulative effect due to the dynamic effects simulated by the model, (c) connecting the runoff from buildings to the sewer contributes to the non-linear effects observed in the overland flood depths, and (d) OSM data are useful in identifying ponding areas (for which infiltration plays a decisive role) and permeable natural surface flow paths (which delay the flood propagation).
Sumner, T; Shephard, E; Bogle, I D L
2012-09-07
One of the main challenges in the development of mathematical and computational models of biological systems is the precise estimation of parameter values. Understanding the effects of uncertainties in parameter values on model behaviour is crucial to the successful use of these models. Global sensitivity analysis (SA) can be used to quantify the variability in model predictions resulting from the uncertainty in multiple parameters and to shed light on the biological mechanisms driving system behaviour. We present a new methodology for global SA in systems biology which is computationally efficient and can be used to identify the key parameters and their interactions which drive the dynamic behaviour of a complex biological model. The approach combines functional principal component analysis with established global SA techniques. The methodology is applied to a model of the insulin signalling pathway, defects of which are a major cause of type 2 diabetes, and a number of key features of the system are identified.
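The two-stage idea (summarize whole trajectories with a few principal-component scores, then attribute the variance of those scores to the inputs) can be sketched as below. The toy response function is a placeholder for the insulin signalling model, and standardized regression coefficients stand in for the established global SA techniques (e.g., Sobol indices) that the authors combine with functional PCA.

```python
# Functional-PCA-based global sensitivity analysis, simplified: Monte Carlo
# parameter samples -> simulated trajectories -> PCA scores -> per-parameter
# sensitivities of each score.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 100)

def toy_model(k1, k2, k3):
    """Placeholder dynamic response standing in for the signalling model."""
    return k1 * np.exp(-k2 * t) + k3 * t / (1 + t)

# Monte Carlo sample of uncertain parameters and resulting trajectories.
params = rng.uniform(0.5, 1.5, size=(500, 3))
Y = np.array([toy_model(*p) for p in params])      # (500, 100)

# Functional PCA step: a few scores summarize each whole trajectory.
scores = PCA(n_components=2).fit_transform(Y)

# Sensitivity step: standardized regression coefficients per PC score.
Z = (params - params.mean(0)) / params.std(0)
for j in range(scores.shape[1]):
    s = (scores[:, j] - scores[:, j].mean()) / scores[:, j].std()
    src = LinearRegression().fit(Z, s).coef_
    print(f"PC{j + 1} sensitivities:", np.round(src, 2))
```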
Methodological quality and descriptive characteristics of prosthodontic-related systematic reviews.
Aziz, T; Compton, S; Nassar, U; Matthews, D; Ansari, K; Flores-Mir, C
2013-04-01
Ideally, healthcare systematic reviews (SRs) should be beneficial to practicing professionals in making evidence-based clinical decisions. However, the conclusions drawn from SRs are directly related to the quality of the SR and of the included studies. The aim was to investigate the methodological quality and key descriptive characteristics of SRs published in prosthodontics. Methodological quality was analysed using the Assessment of Multiple Reviews (AMSTAR) tool. Several electronic resources (MEDLINE, EMBASE, Web of Science and the American Dental Association's Evidence-Based Dentistry website) were searched. In total, 106 SRs were located. Key descriptive characteristics and methodological quality features were gathered and assessed, and descriptive and inferential statistical testing was performed. Most SRs in this sample originated from the European continent, followed by North America. Most SRs were conducted by two to five authors, the majority of whom were affiliated with academic institutions and had prior experience publishing SRs. The majority of SRs were published in specialty dentistry journals, with implant or implant-related topics the primary topics of interest. According to AMSTAR, most quality aspects were adequately fulfilled by fewer than half of the reviews; publication bias assessment and grey literature searches were the most poorly addressed components. Overall, the methodological quality of the prosthodontic-related SRs was deemed limited. Future recommendations include that authors have prior training in conducting SRs and that journals adopt a universal checklist addressing all key characteristics of an unbiased SR process. © 2013 Blackwell Publishing Ltd.
HRD Practices and Talent Management in the Companies with the Employer Brand
ERIC Educational Resources Information Center
Kucherov, Dmitry; Zavyalova, Elena
2012-01-01
Purpose: The employer brand could be a key factor of competitiveness for a company in a contemporary labour market. The purpose of this paper is to identify the features of human resource development (HRD) practices and talent management in companies with employer brand (CEBs). Design/methodology/approach: The authors examined three economic…
Unruly Practices: What a Sociology of Translations Can Offer to Educational Policy Analysis
ERIC Educational Resources Information Center
Hamilton, Mary
2011-01-01
This paper argues for the utility of ANT as a philosophical and methodological approach to policy analysis. It introduces the key features of a recent educational policy reform initiative, Skills for Life and illustrates the argument by looking at three "moments" (in Callon's 1986 terminology) in the life of this initiative, applying the…
Stereoscopic Feature Tracking System for Retrieving Velocity of Surface Waters
NASA Astrophysics Data System (ADS)
Zuniga Zamalloa, C. C.; Landry, B. J.
2017-12-01
The present work is concerned with the surface velocity retrieval of flows using a stereoscopic setup and finding the correspondence in the images via feature tracking (FT). The feature tracking provides a key benefit of substantially reducing the level of user input. In contrast to other commonly used methods (e.g., normalized cross-correlation), FT does not require the user to prescribe interrogation window sizes and removes the need for masking when specularities are present. The results of the current FT methodology are comparable to those obtained via Large Scale Particle Image Velocimetry while requiring little to no user input which allowed for rapid, automated processing of imagery.
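A minimal sketch of the feature-tracking core for one camera of the stereo pair, using OpenCV's pyramidal Lucas-Kanade tracker. Stereo correspondence and the conversion from pixel displacements to metric surface velocities are omitted, and the parameter values and file names are illustrative.

```python
# Feature tracking between consecutive frames: detect corners in the first
# frame, track them into the next, keep the successfully tracked points.
import cv2
import numpy as np

frame0 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)  # placeholders
frame1 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

pts0 = cv2.goodFeaturesToTrack(frame0, maxCorners=500,
                               qualityLevel=0.01, minDistance=7)
pts1, status, _ = cv2.calcOpticalFlowPyrLK(frame0, frame1, pts0, None)

ok = status.ravel() == 1
displacements = (pts1[ok] - pts0[ok]).reshape(-1, 2)   # pixels per frame
# Dividing by the inter-frame time and applying the stereo calibration
# would convert these pixel displacements into surface velocities.
```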
ERIC Educational Resources Information Center
Dongxu, Wang; Yuhui, Shi; Stewart, Donald; Chun, Chang; Chaoyang, Li
2012-01-01
Purpose: The paper seeks to identify key features of prenatal care utilization and quality in western regions of China and to determine the factors affecting the quality of prenatal care. Design/methodology/approach: A descriptive, cross-sectional study was conducted. The instrument for the study was a 10-stem respondent-administered, structured…
ERIC Educational Resources Information Center
Bertolli, Jeanne; And Others
1995-01-01
This article discusses methodologic limitations of four observational study designs (ecologic, case-control, cross-sectional, and cohort) that dominate the child abuse and neglect literature and identifies key features of an "ideal" study of child maltreatment. It proposes a new mixed-design strategy, which improves ability to identify child…
ERIC Educational Resources Information Center
Kennelly, Brendan; Flannery, Darragh; Considine, John; Doherty, Edel; Hynes, Stephen
2014-01-01
This paper outlines how a discrete choice experiment (DCE) can be used to learn more about how students are willing to trade off various features of assignments such as the nature and timing of feedback and the method used to submit assignments. A DCE identifies plausible levels of the key attributes of a good or service and then presents the…
Employing an ethnographic approach: key characteristics.
Lambert, Veronica; Glacken, Michele; McCarron, Mary
2011-01-01
Nurses are increasingly embracing ethnography as a useful research methodology. This paper presents an overview of some of the main characteristics we considered and the challenges encountered when using ethnography to explore the nature of communication between children and health professionals in a children's hospital. There is no consensual definition or single procedure to follow when using ethnography. This is largely attributable to the re-contextualisation of ethnography over time through diversification in and across many disciplines. Thus, it is imperative to consider some of ethnography's trademark features. To identify core trademark features of ethnography, we collated data following a scoping review of pertinent ethnographic textbooks, journal articles, attendance at ethnographic workshops and discussions with principal ethnographers. This is a methodological paper. Essentially, ethnography is a field-orientated activity that has cultural interpretation at its core, although the levels of that interpretation vary. We identified six trademark features to be considered when embracing an ethnographic approach: naturalism; context; multiple data sources; small case numbers; 'emic' and 'etic' perspectives; and ethical considerations. Ethnography has an assortment of meanings, so it is not often used in a wholly orthodox way and does not fall under the auspices of one epistemological belief. Yet, there are core criteria and trademark features that researchers should take into account alongside their particular epistemological beliefs when embracing an ethnographic inquiry. We hope this paper promotes a clearer vision of the methodological processes to consider when embarking on ethnography and creates an avenue for others to disseminate their experiences of and challenges encountered when applying ethnography's trademark features in different healthcare contexts.
Vein matching using artificial neural network in vein authentication systems
NASA Astrophysics Data System (ADS)
Noori Hoshyar, Azadeh; Sulaiman, Riza
2011-10-01
Personal identification technology for security systems is developing rapidly. Traditional authentication modes such as keys, passwords, and cards are not safe enough because they can be stolen or easily forgotten. Biometrics, as a maturing technology, has been applied to a wide range of systems. According to different researchers, vein biometrics is a good candidate among biometric traits such as fingerprint, hand geometry, voice, and DNA for authentication systems. Vein authentication systems can be designed using different methodologies, all of which include a matching stage that is crucial for the final verification of the system. The neural network is an effective methodology for matching and recognizing individuals in authentication systems. Therefore, this paper explains and implements a neural network methodology for a finger vein authentication system. The network is trained in Matlab to match the vein features of the authentication system, and simulation shows a matching rate of 95%, which is good performance for an authentication system.
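The paper trains its matcher in Matlab; a minimal equivalent sketch in Python with scikit-learn is shown below. The feature dimension, number of enrolled identities, and network size are placeholder assumptions.

```python
# Neural-network matching stage for a vein authentication system: fit an
# MLP on labelled vein feature vectors and estimate held-out accuracy.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 64))        # extracted vein feature vectors
y = rng.integers(0, 20, size=400)     # enrolled identity labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X_tr, y_tr)
print("matching accuracy:", clf.score(X_te, y_te))
```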
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.
2009-01-01
Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.
Approaches to Children’s Exposure Assessment: Case Study with Diethylhexylphthalate (DEHP)
Ginsberg, Gary; Ginsberg, Justine; Foos, Brenda
2016-01-01
Children’s exposure assessment is a key input into epidemiology studies, risk assessment and source apportionment. The goals of this article are to describe a methodology for children’s exposure assessment that can be used for these purposes and to apply the methodology to source apportionment for the case study chemical, diethylhexylphthalate (DEHP). A key feature is the comparison of total (aggregate) exposure calculated via a pathways approach to that derived from a biomonitoring approach. The 4-step methodology and its results for DEHP are: (1) prioritization of life stages and exposure pathways, with pregnancy, breast-fed infants, and toddlers the focus of the case study and pathways selected that are relevant to these groups; (2) estimation of pathway-specific exposures by life stage, wherein diet was found to be the largest contributor for pregnant women, breast milk and mouthing behavior for the nursing infant, and diet, house dust, and mouthing for toddlers; (3) comparison of aggregate exposure by pathways versus biomonitoring-based approaches, wherein good concordance was found for toddlers and pregnant women, providing confidence in the exposure assessment; (4) source apportionment, in which DEHP presence in foods, children’s products, consumer products and the built environment is discussed with respect to early-life mouthing, house dust and dietary exposure. A potential fifth step of the method, the calculation of exposure doses for risk assessment, is described but is outside the scope of the current case study. In summary, the methodology has been used to synthesize the available information to identify key sources of early-life exposure to DEHP. PMID:27376320
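The pathway-aggregation step (steps 2 and 3 above) reduces to summing concentration times intake rate over body weight across pathways; a minimal sketch follows. All numeric values are illustrative placeholders, not the study's estimates.

```python
# Aggregate daily intake from pathway-specific exposures:
# dose_i = concentration_i * intake_rate_i / body_weight; aggregate = sum.
BODY_WEIGHT_KG = 12.0  # assumed toddler body weight (placeholder)

# pathway: (DEHP concentration, mg per g of medium; medium intake, g/day)
pathways = {
    "diet":       (5e-5, 500.0),
    "house_dust": (5e-4,   0.05),
    "mouthing":   (2e-3,  10.0),
}

intakes = {name: conc * rate / BODY_WEIGHT_KG
           for name, (conc, rate) in pathways.items()}
aggregate = sum(intakes.values())  # mg per kg body weight per day

# Report each pathway's share of the aggregate, largest first.
for name, dose in sorted(intakes.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {dose:.2e} mg/kg-day "
          f"({100 * dose / aggregate:.0f}%)")
```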
Mollison, Daisy; Sellar, Robin; Bastin, Mark; Mollison, Denis; Chandran, Siddharthan; Wardlaw, Joanna; Connick, Peter
2017-01-01
Moderate correlation exists between the imaging quantification of brain white matter lesions and cognitive performance in people with multiple sclerosis (MS). This may reflect the greater importance of other features, including subvisible pathology, or methodological limitations of the primary literature. To summarise the cognitive clinico-radiological paradox and explore the potential methodological factors that could influence the assessment of this relationship. Systematic review and meta-analysis of primary research relating cognitive function to white matter lesion burden. Fifty papers met eligibility criteria for review, and meta-analysis of overall results was possible in thirty-two (2050 participants). Aggregate correlation between cognition and T2 lesion burden was r = -0.30 (95% confidence interval: -0.34, -0.26). Wide methodological variability was seen, particularly related to key factors in the cognitive data capture and image analysis techniques. Resolving the persistent clinico-radiological paradox will likely require simultaneous evaluation of multiple components of the complex pathology using optimum measurement techniques for both cognitive and MRI feature quantification. We recommend a consensus initiative to support common standards for image analysis in MS, enabling benchmarking while also supporting ongoing innovation.
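An aggregate correlation with a confidence interval, as quoted above, is typically obtained by pooling Fisher z-transformed per-study correlations. The sketch below assumes a fixed-effect, inverse-variance-weighted model with placeholder study values; the review's exact pooling model is not stated in the abstract.

```python
# Fisher z meta-analysis of per-study correlations: transform, pool with
# inverse-variance weights, back-transform the pooled value and its CI.
import numpy as np

r = np.array([-0.25, -0.35, -0.28, -0.40, -0.22])  # placeholder correlations
n = np.array([60, 120, 45, 200, 80])               # placeholder sample sizes

z = np.arctanh(r)                  # Fisher z transform
w = n - 3                          # inverse of Var(z) = 1 / (n - 3)
z_bar = np.sum(w * z) / np.sum(w)
se = 1 / np.sqrt(np.sum(w))

lo, hi = np.tanh([z_bar - 1.96 * se, z_bar + 1.96 * se])
print(f"pooled r = {np.tanh(z_bar):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```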
On designing multicore-aware simulators for systems biology endowed with OnLine statistics.
Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo
2014-01-01
This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are demonstrated on a simulation tool for the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed by statistical and data-mining tools. In the considered approach, the two stages are pipelined so that the simulation stage streams the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological system behavior, on a multicore platform with representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed.
Conserve, Donaldson F; Jennings, Larissa; Aguiar, Carolina; Shin, Grace; Handler, Lara; Maman, Suzanne
2017-02-01
Introduction: This systematic narrative review examined the empirical evidence on the effectiveness of mobile health (mHealth) behavioural interventions designed to increase the uptake of HIV testing among vulnerable and key populations. Methods: MEDLINE/PubMed, Embase, Web of Science, and Global Health electronic databases were searched. Studies were eligible for inclusion if they were published between 2005 and 2015, evaluated an mHealth intervention, and reported an outcome relating to HIV testing. We also reviewed the bibliographies of retrieved studies for other relevant citations. The methodological rigor of selected articles was assessed, and narrative analyses were used to synthesize findings from mixed methodologies. Results: A total of seven articles met the inclusion criteria. Most mHealth interventions employed a text-messaging feature and were conducted in middle- and high-income countries. The methodological rigor was moderate among studies. The current literature suggests that mHealth interventions can have significant positive effects on HIV testing initiation among vulnerable and key populations, as well as the general public. In some cases, null results were observed. Qualitative themes relating to the use of mobile technologies to increase HIV testing included the benefits of having low-cost, confidential, and motivational communication. Reported barriers included cellular network restrictions, poor linkages with physical testing services, and limited knowledge of appropriate text-messaging dose. Discussion: mHealth interventions may prove beneficial in reducing the proportion of undiagnosed persons living with HIV, particularly among vulnerable and key populations. However, more rigorous and tailored interventions are needed to assess the effectiveness of widespread use.
Feasibility of an International Multiple Sclerosis Rehabilitation Data Repository
Bradford, Elissa Held; Baert, Ilse; Finlayson, Marcia; Feys, Peter
2018-01-01
Background: Multiple sclerosis (MS) rehabilitation evidence is limited due to methodological factors, which may be addressed by a data repository. We describe the perceived challenges of, motivators for, interest in participating in, and key features of an international MS rehabilitation data repository. Methods: A multimethod sequential investigation was performed with the results of two focus groups, using nominal group technique, and study aims informing the development of an online questionnaire. Percentage agreement and key quotations illustrated questionnaire findings. Subgroup comparisons were made between clinicians and researchers and between participants in North America and Europe. Results: Rehabilitation professionals from 25 countries participated (focus groups: n = 21; questionnaire: n = 166). The top ten challenges (C) and motivators (M) identified by the focus groups were database control/management (C); ethical/legal concerns (C); data quality (C); time, effort, and cost (C); best practice (M); uniformity (C); sustainability (C); deeper analysis (M); collaboration (M); and identifying research needs (M). Percentage agreement with questionnaire statements regarding challenges to, motivators for, interest in, and key features of a successful repository was at least 80%, 85%, 72%, and 83%, respectively, across each group of statements. Questionnaire subgroup analysis revealed a few differences (P < .05), including that clinicians more strongly identified with improving best practice as a motivator. Conclusions: Findings support clinician and researcher interest in and potential for success of an international MS rehabilitation data repository if prioritized challenges and motivators are addressed and key features are included. PMID:29507539
A geometric multigrid preconditioning strategy for DPG system matrices
Roberts, Nathan V.; Chan, Jesse
2017-08-23
The discontinuous Petrov–Galerkin (DPG) methodology of Demkowicz and Gopalakrishnan (2010, 2011) guarantees the optimality of the solution in an energy norm and provides several features facilitating adaptive schemes. A key question that has not yet been answered in general, though there are some results for Poisson's equation, is how best to precondition the DPG system matrix so that iterative solvers may be used to allow the solution of large-scale problems.
Methodological Challenges to Economic Evaluations of Vaccines: Is a Common Approach Still Possible?
Jit, Mark; Hutubessy, Raymond
2016-06-01
Economic evaluation of vaccination is a key tool to inform effective spending on vaccines. However, many evaluations have been criticised for failing to capture features of vaccines which are relevant to decision makers. These include broader societal benefits (such as improved educational achievement, economic growth and political stability), reduced health disparities, medical innovation, reduced pressure on hospital beds, greater peace of mind and synergies in economic benefits with non-vaccine interventions. Also, the fiscal implications of vaccination programmes are not always made explicit. Alternative methodological frameworks have been proposed to better capture these benefits. However, any broadening of the methodology for economic evaluation must also involve evaluations of non-vaccine interventions, and hence may not always benefit vaccines given a fixed health-care budget. The scope of an economic evaluation must consider the budget from which vaccines are funded, and the decision maker's stated aims for that spending.
No evidence for intervention-dependent influence of methodological features on treatment effect.
Jacobs, Wilco C H; Kruyt, Moyo C; Moojen, Wouter A; Verbout, Ab J; Oner, F Cumhur
2013-12-01
The goal of this systematic review was to evaluate whether the influence of methodological features on treatment effect differs between types of intervention. MEDLINE, Embase, Web of Science, the Cochrane methodology register, and reference lists were searched for meta-epidemiologic studies on the influence of methodological features on treatment effect. Studies analyzing the influence of methodological features related to internal validity were included. We distinguished among surgical, pharmaceutical, and therapeutic interventions as separate types. Heterogeneity was calculated to identify differences among these types. Fourteen meta-epidemiologic studies were found with 51 estimates of the influence of methodological features on treatment effect. Heterogeneity was observed among the intervention types for randomization: surgical intervention studies showed a larger treatment effect when randomized, whereas pharmaceutical studies found the opposite. For allocation concealment and double blinding, the influence of methodological features on the treatment effect was comparable across different types of intervention. For the remaining methodological features, there were insufficient observations. The influence of allocation concealment and double blinding on the treatment effect is thus consistent across studies of different intervention types; the influence of randomization, however, may differ between surgical and nonsurgical studies. Copyright © 2013 Elsevier Inc. All rights reserved.
Health Worker Focused Distributed Simulation for Improving Capability of Health Systems in Liberia.
Gale, Thomas C E; Chatterjee, Arunangsu; Mellor, Nicholas E; Allan, Richard J
2016-04-01
The main goal of this study was to produce an adaptable learning platform using virtual learning and distributed simulation, which can be used to train health care workers across a wide geographical area in key safety messages regarding infection prevention and control (IPC). A situationally responsive agile methodology, Scrum, was used to develop a distributed simulation module using short 1-week iterations and continuous synchronous and asynchronous communication including end users and IPC experts. The module contained content related to standard IPC precautions (including handwashing techniques) and was structured into 3 distinct sections related to donning, doffing, and hazard perception training. Using Scrum methodology, we were able to link concepts applied to best practices in simulation-based medical education (deliberate practice, continuous feedback, self-assessment, and exposure to uncommon events), pedagogic principles related to adult learning (clear goals, contextual awareness, motivational features), and key learning outcomes regarding IPC, as a rapid response initiative to the Ebola outbreak in West Africa. A gamification approach was used to map learning mechanics and enhance user engagement. The developed IPC module demonstrates how high-frequency, low-fidelity simulations can be rapidly designed using Scrum-based agile methodology. Analytics incorporated into the tool can help demonstrate improved confidence and competence of health care workers who are treating patients within an Ebola virus disease outbreak region. These concepts could be used in a range of evolving disasters where rapid development and communication of key learning messages are required.
Curated Collection for Educators: Five Key Papers about the Flipped Classroom Methodology.
King, Andrew; Boysen-Osborn, Megan; Cooney, Robert; Mitzman, Jennifer; Misra, Asit; Williams, Jennifer; Dulani, Tina; Gottlieb, Michael
2017-10-25
The flipped classroom (FC) pedagogy is becoming increasingly popular in medical education due to its appeal to the millennial learner and potential benefits in knowledge acquisition. Despite its popularity and effectiveness, the FC educational method is not without challenges. In this article, we identify and summarize several key papers relevant to medical educators interested in exploring the FC teaching methodology. The authors identified an extensive list of papers relevant to FC pedagogy via online discussions within the Academic Life in Emergency Medicine (ALiEM) Faculty Incubator. This list was augmented by an open call on Twitter (utilizing the #meded, #FOAMed, and #flippedclassroom hashtags) yielding a list of 33 papers. We then conducted a three-round modified Delphi process within the authorship group, which included both junior and senior clinician educators, to identify the most impactful papers for educators interested in FC pedagogy. The three-round modified Delphi process ranked all of the selected papers and selected the five most highly-rated papers for inclusion. The authorship group reviewed and summarized these papers with specific consideration given to their value to junior faculty educators and faculty developers interested in the flipped classroom approach. The list of papers featured in this article serves as a key reading list for junior clinician educators and faculty developers interested in the flipped classroom technique. The associated commentaries contextualize the importance of these papers for medical educators aiming to optimize their understanding and implementation of the flipped classroom methodology in their teaching and through faculty development.
Evaluation of the effectiveness of color attributes for video indexing
NASA Astrophysics Data System (ADS)
Chupeau, Bertrand; Forest, Ronan
2001-10-01
Color features are reviewed and their effectiveness assessed in the application framework of key-frame clustering for abstracting unconstrained video. Existing color spaces and associated quantization schemes are first studied. Description of global color distribution by means of histograms is then detailed. In our work, 12 combinations of color space and quantization were selected, together with 12 histogram metrics. Their respective effectiveness with respect to picture similarity measurement was evaluated through a query-by-example scenario. For that purpose, a set of still-picture databases was built by extracting key frames from several video clips, including news, documentaries, sports and cartoons. Classical retrieval performance evaluation criteria were adapted to the specificity of our testing methodology.
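One cell of the 12 x 12 evaluation grid described above can be sketched with OpenCV: one color space and quantization choice paired with one histogram metric. The HSV space, the 8x8x4 quantization, and the correlation metric are illustrative picks, not the combinations the authors ranked.

```python
# Global color histogram description and comparison for key-frame
# similarity: quantized HSV histogram plus a histogram metric.
import cv2

def hsv_histogram(path, bins=(8, 8, 4)):
    """Normalized HSV histogram with a coarse quantization."""
    img = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([img], [0, 1, 2], None, list(bins),
                        [0, 180, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

h_query = hsv_histogram("keyframe_query.png")      # placeholder paths
h_other = hsv_histogram("keyframe_other.png")

# Higher correlation = more similar global color distribution.
similarity = cv2.compareHist(h_query, h_other, cv2.HISTCMP_CORREL)
print("histogram correlation:", similarity)
```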
Modelling Aṣṭādhyāyī: An Approach Based on the Methodology of Ancillary Disciplines (Vedāṅga)
NASA Astrophysics Data System (ADS)
Mishra, Anand
This article proposes a general model based on the common methodological approach of the ancillary disciplines (Vedāṅga) associated with the Vedas, taking examples from Śikṣā, Chandas, Vyākaraṇa and Prātiśākhya texts. It develops and elaborates this model further to represent the contents and processes of Aṣṭādhyāyī. Certain key features are added to my earlier modelling of the Pāṇinian system of Sanskrit grammar, including broader coverage of the Pāṇinian meta-language, a mechanism for automatic application of rules, and the positioning of the grammatical system within the procedural complexes of the ancillary disciplines.
New technology and regional studies in human ecology: A Papua New Guinea example
NASA Technical Reports Server (NTRS)
Morren, George E. B., Jr.
1991-01-01
Two key issues in using technologies such as digital image processing and geographic information systems are a conceptually and methodologically valid research design and the exploitation of varied sources of data. With this realized, the new technologies offer anthropologists the opportunity to test hypotheses about spatial and temporal variations in the features of interest within a regionally coherent mosaic of social groups and landscapes. Current research on the Mountain Ok of Papua New Guinea is described with reference to these issues.
Human nonverbal courtship behavior--a brief historical review.
Moore, Monica M
2010-03-01
This article reviews research findings documenting the nature of nonverbal courtship behavior compiled through both observation and self-report methods. I briefly present the major theoretical perspectives guiding research methodologies used in the field and in the laboratory. Studies of verbal courtship, including those conducted via computer, via text messaging, or through personal advertisement, are not included in this review. The article ends by elucidating some key features of human nonverbal courtship behavior that have become apparent after scrutinizing these data.
Baicalein Reduces Airway Injury in Allergen and IL-13 Induced Airway Inflammation
Mabalirajan, Ulaganathan; Ahmad, Tanveer; Rehman, Rakhshinda; Leishangthem, Geeta Devi; Dinda, Amit Kumar; Agrawal, Anurag; Ghosh, Balaram; Sharma, Surendra Kumar
2013-01-01
Background Baicalein, a bioflavone present in the dry roots of Scutellaria baicalensis Georgi, is known to reduce eotaxin production in human fibroblasts. However, there are no reports of its anti-asthma activity or its effect on airway injury. Methodology/Principal Findings In a standard experimental asthma model, male Balb/c mice that were sensitized with ovalbumin (OVA) were treated with baicalein (10 mg/kg, ip) or a vehicle control, either during (preventive use) or after OVA challenge (therapeutic use). In an alternate model, baicalein was administered to male Balb/c mice which were given either IL-4 or IL-13 intranasally. Features of asthma were determined by estimating airway hyperresponsiveness (AHR), histopathological changes and biochemical assays of key inflammatory molecules. Airway injury was determined with apoptotic assays, transmission electron microscopy and assessment of key mitochondrial functions. Baicalein treatment reduced AHR and inflammation in both experimental models. TGF-β1, sub-epithelial fibrosis and goblet cell metaplasia were also reduced. Furthermore, baicalein treatment significantly reduced 12/15-LOX activity, features of mitochondrial dysfunction, and apoptosis of bronchial epithelia. Conclusion/Significance Our findings demonstrate that baicalein can attenuate important features of asthma, possibly through the reduction of airway injury and restoration of mitochondrial function. PMID:23646158
Assessing validity of observational intervention studies - the Benchmarking Controlled Trials.
Malmivaara, Antti
2016-09-01
Benchmarking Controlled Trial (BCT) is a concept which covers all observational studies aiming to assess the impact of interventions or health care system features on patients and populations. The aim was to create and pilot test a checklist for appraising the methodological validity of a BCT. The checklist was created by extracting the most essential elements from the comprehensive set of criteria in the previous paper on BCTs; checklists and scientific papers on observational studies and respective systematic reviews were also utilized. Ten BCTs published in the Lancet and in the New England Journal of Medicine were used to assess the feasibility of the created checklist. The appraised studies seem to have several methodological limitations, some of which could be avoided in the planning, conducting and reporting phases of the studies. The checklist can be used for planning, conducting, reporting, reviewing, and critical reading of observational intervention studies. However, the piloted checklist should be validated in further studies. Key messages: Benchmarking Controlled Trial (BCT) is a concept which covers all observational studies aiming to assess the impact of interventions or health care system features on patients and populations. This paper presents a checklist for appraising the methodological validity of BCTs and pilot-tests the checklist with ten BCTs published in leading medical journals. The appraised studies seem to have several methodological limitations, some of which could be avoided in the planning, conducting and reporting phases of the studies. The checklist can be used for planning, conducting, reporting, reviewing, and critical reading of observational intervention studies.
Setting conservation targets for sandy beach ecosystems
NASA Astrophysics Data System (ADS)
Harris, Linda; Nel, Ronel; Holness, Stephen; Sink, Kerry; Schoeman, David
2014-10-01
Representative and adequate reserve networks are key to conserving biodiversity. This raises the question: how much of which features needs to be placed in protected areas? Setting specifically-derived conservation targets for most ecosystems is common practice; however, this has never been done for sandy beaches. The aims of this paper, therefore, are to propose a methodology for setting conservation targets for sandy beach ecosystems; and to pilot the proposed method using data describing biodiversity patterns and processes from microtidal beaches in South Africa. First, a classification scheme of valued features of beaches is constructed, including: biodiversity features; unique features; and important processes. Second, methodologies for setting targets for each feature under different data-availability scenarios are described. From this framework, targets are set for features characteristic of microtidal beaches in South Africa, as follows. 1) Targets for dune vegetation types were adopted from a previous assessment, and ranged from 19% to 100%. 2) Targets for beach morphodynamic types (habitats) were set using species-area relationships (SARs). These SARs were derived from species richness data from 142 sampling events around the South African coast (extrapolated to total theoretical species richness estimates using previously-established species-accumulation curve relationships), plotted against the area of the beach (calculated from Google Earth imagery). The species-accumulation factor (z) was 0.22, suggesting a baseline habitat target of 27% is required to protect 75% of the species. This baseline target was modified by heuristic principles, based on habitat rarity and threat status, with final values ranging from 27% to 40%. 3) Species targets were fixed at 20%, modified using heuristic principles based on endemism, threat status, and whether or not beaches play an important role in the species' life history, with targets ranging from 20% to 100%. 4) Targets for processes and 5) important assemblages were set at 50%, following other studies. 6) Finally, a target for an outstanding feature (the Alexandria dunefield) was set at 80% because of its national, international and ecological importance. The greatest shortfall in the current target-setting process is in the lack of empirical models describing the key beach processes, from which robust ecological thresholds can be derived. As for many other studies, our results illustrate that the conservation target of 10% for coastal and marine systems proposed by the Convention on Biological Diversity is too low to conserve sandy beaches and their biota.
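The 27% baseline target quoted above follows directly from the species-area relationship S = cA^z: protecting a fraction a of habitat retains a^z of its species, so retaining 75% of species requires a = 0.75^(1/z). A quick check of the arithmetic:

```python
# Worked check of the baseline target: from S = c * A**z, protecting a
# fraction `a` of habitat retains a**z of species, so the fraction needed
# to keep 75% of species is a = 0.75**(1/z).
z = 0.22
target = 0.75 ** (1 / z)
print(f"baseline habitat target: {target:.2f}")  # ~0.27, i.e. 27%
```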
Parents' Verbal Communication and Childhood Anxiety: A Systematic Review.
Percy, Ray; Creswell, Cathy; Garner, Matt; O'Brien, Doireann; Murray, Lynne
2016-03-01
Parents' verbal communication to their child, particularly the expression of fear-relevant information (e.g., attributions of threat to the environment), is considered to play a key role in children's fears and anxiety. This review considers the extent to which parental verbal communication is associated with child anxiety by examining research that has employed objective observational methods. Using a systematic search strategy, we identified 15 studies that addressed this question. These studies provided some evidence that particular fear-relevant features of parental verbal communication are associated with child anxiety under certain conditions. However, the scope for drawing reliable, general conclusions was limited by extensive methodological variation between studies, particularly in terms of the features of parental verbal communication examined and the context in which communication took place, how child anxiety was measured, and inconsistent consideration of factors that may moderate the verbal communication-child anxiety relationship. We discuss ways in which future research can contribute to this developing evidence base and reduce further methodological inconsistency so as to inform interventions for children with anxiety problems.
Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base
NASA Technical Reports Server (NTRS)
Mcruer, Duane T.; Myers, Thomas T.
1988-01-01
The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.
O'Brien, Kelly K; Colquhoun, Heather; Levac, Danielle; Baxter, Larry; Tricco, Andrea C; Straus, Sharon; Wickerson, Lisa; Nayar, Ayesha; Moher, David; O'Malley, Lisa
2016-07-26
Scoping studies (or reviews) are a method used to comprehensively map evidence across a range of study designs in an area, with the aim of informing future research practice, programs and policy. However, no universal agreement exists on terminology, definition or methodological steps. Our aim was to understand the experiences of, and considerations for conducting scoping studies from the perspective of academic and community partners. Primary objectives were to 1) describe experiences conducting scoping studies including strengths and challenges; and 2) describe perspectives on terminology, definition, and methodological steps. We conducted a cross-sectional web-based survey with clinicians, educators, researchers, knowledge users, representatives from community-based organizations, graduate students, and policy stakeholders with experience and/or interest in conducting scoping studies to gain an understanding of experiences and perspectives on the conduct and reporting of scoping studies. We administered an electronic self-reported questionnaire comprised of 22 items related to experiences with scoping studies, strengths and challenges, opinions on terminology, and methodological steps. We analyzed questionnaire data using descriptive statistics and content analytical techniques. Survey results were discussed during a multi-stakeholder consultation to identify key considerations in the conduct and reporting of scoping studies. Of the 83 invitations, 54 individuals (65 %) completed the scoping questionnaire, and 48 (58 %) attended the scoping study meeting from Canada, the United Kingdom and United States. Many scoping study strengths were dually identified as challenges including breadth of scope, and iterative process. No consensus on terminology emerged, however key defining features that comprised a working definition of scoping studies included the exploratory mapping of literature in a field; iterative process, inclusion of grey literature; no quality assessment of included studies, and an optional consultation phase. We offer considerations for the conduct and reporting of scoping studies for researchers, clinicians and knowledge users engaging in this methodology. Lack of consensus on scoping terminology, definition and methodological steps persists. Reasons for this may be attributed to diversity of disciplines adopting this methodology for differing purposes. Further work is needed to establish guidelines on the reporting and methodological quality assessment of scoping studies.
Silvestri, Mark M.; Lewis, Jennifer M.; Borsari, Brian; Correia, Christopher J.
2014-01-01
Background Drinking games are prevalent among college students and are associated with increased alcohol use and negative alcohol-related consequences. There has been substantial growth in research on drinking games. However, the majority of published studies rely on retrospective self-reports of behavior and very few studies have made use of laboratory procedures to systematically observe drinking game behavior. Objectives The current paper draws on the authors’ experiences designing and implementing methods for the study of drinking games in the laboratory. Results The paper addresses the following key design features: (a) drinking game selection; (b) beverage selection; (c) standardizing game play; (d) selection of dependent and independent variables; and (e) creating a realistic drinking game environment. Conclusions The goal of this methodological review paper is to encourage other researchers to pursue laboratory research on drinking game behavior. Use of laboratory-based methodologies will facilitate a better understanding of the dynamics of risky drinking and inform prevention and intervention efforts. PMID:25192209
NASA Astrophysics Data System (ADS)
Dodick, Jeff; Argamon, Shlomo; Chase, Paul
2009-08-01
A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually do science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields. However, the question remains as to whether scientists in different fields fundamentally rely on different methodologies. Although many philosophers and historians of science do indeed assert that there is no single monolithic scientific method, this has never been tested empirically. We therefore approach this problem by analyzing patterns of language used by scientists in their published work. Our results demonstrate systematic variation in language use between types of science that are thought to differ in their characteristic methodologies. The features of language use that were found correspond closely to a proposed distinction between Experimental Sciences (e.g., chemistry) and Historical Sciences (e.g., paleontology); thus, different underlying rhetorical and conceptual mechanisms likely operate for scientific reasoning and communication in different contexts.
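A minimal sketch of the general approach described above, classifying texts by field from word-usage features; the toy corpus and model choice are assumptions, not the authors' actual pipeline.

```python
# Minimal sketch: classify texts by field from word-usage features.
# The toy corpus and model are stand-ins, not the study's pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "we measured the reaction rate under controlled conditions",
    "the sample was heated and the yield was recorded",
    "these strata suggest the basin subsided over millions of years",
    "fossil evidence indicates the lineage diverged in the cretaceous",
]
labels = ["experimental", "experimental", "historical", "historical"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)
print(model.predict(["the deposits record a gradual environmental change"]))
```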
Hexafluoroisopropyl alcohol mediated synthesis of 2,3-dihydro-4H-pyrido[1,2-a]pyrimidin-4-ones.
Alam, Mohammad A; Alsharif, Zakeyah; Alkhattabi, Hessa; Jones, Derika; Delancey, Evan; Gottsponer, Adam; Yang, Tianhong
2016-11-02
An efficient synthesis of novel 2,3-dihydro-4H-pyrido[1,2-a]pyrimidin-4-ones has been reported. Inexpensive and readily available substrates, environmentally benign reaction conditions, and product formation in up to quantitative yield are the key features of this methodology. Products are formed by aza-Michael addition followed by intramolecular acyl substitution in a domino process. The polar nature and strong hydrogen bond donor capability of 1,1,1,3,3,3-hexafluoropropan-2-ol is pivotal in this cascade protocol.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winkler, Mirko S.; Divall, Mark J.
2012-02-15
The quantitative assessment of health impacts has been identified as a crucial feature for realising the full potential of health impact assessment (HIA). In settings where demographic and health data are notoriously scarce, but there is a broad range of ascertainable ecological, environmental, epidemiological and socioeconomic information, a diverse toolkit of data collection strategies becomes relevant for the mainly small-area impacts of interest. We present a modular, cross-sectional baseline health survey study design, which has been developed for HIA of industrial development projects in the humid tropics. The modular nature of our toolkit allows our methodology to be readily adapted to the prevailing eco-epidemiological characteristics of a given project setting. Central to our design is a broad set of key performance indicators, covering a multiplicity of health outcomes and determinants at different levels and scales. We present experience and key findings from our modular baseline health survey methodology employed in 14 selected sentinel sites within an iron ore mining project in the Republic of Guinea. We argue that our methodology is a generic example of rapid evidence assembly in difficult-to-reach localities, where improvement of the predictive validity of the assessment and establishment of a benchmark for longitudinal monitoring of project impacts and mitigation efforts is needed.
phMRI: methodological considerations for mitigating potential confounding factors
Bourke, Julius H.; Wall, Matthew B.
2015-01-01
Pharmacological Magnetic Resonance Imaging (phMRI) is a variant of conventional MRI that adds pharmacological manipulations in order to study the effects of drugs, or uses pharmacological probes to investigate basic or applied (e.g., clinical) neuroscience questions. Issues that may confound the interpretation of results from various types of phMRI studies are briefly discussed, and a set of methodological strategies that can mitigate these problems are described. These include strategies that can be employed at every stage of investigation, from study design to interpretation of resulting data, and additional techniques suited for use with clinical populations are also featured. Pharmacological MRI is a challenging area of research that has both significant advantages and formidable difficulties, however with due consideration and use of these strategies many of the key obstacles can be overcome. PMID:25999812
Hyperbolic reformulation of a 1D viscoelastic blood flow model and ADER finite volume schemes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montecinos, Gino I.; Müller, Lucas O.; Toro, Eleuterio F.
2014-06-01
The applicability of ADER finite volume methods to solve hyperbolic balance laws with stiff source terms in the context of well-balanced and non-conservative schemes is extended to solve a one-dimensional blood flow model for viscoelastic vessels, reformulated as a hyperbolic system via a relaxation time. A criterion for selecting relaxation times is found and an empirical convergence rate assessment is carried out to support this result. The proposed methodology is validated by applying it to a network of viscoelastic vessels for which experimental and numerical results are available. The agreement between the results obtained in the present paper and those available in the literature is satisfactory. Key features of the present formulation and numerical methodologies, such as accuracy, efficiency and robustness, are fully discussed in the paper.
A feature selection approach towards progressive vector transmission over the Internet
NASA Astrophysics Data System (ADS)
Miao, Ru; Song, Jia; Feng, Min
2017-09-01
WebGIS has been applied widely for visualizing and sharing geospatial information over the Internet. In order to improve the efficiency of client applications, a web-based progressive vector transmission approach is proposed: important features should be selected and transferred first, so methods for measuring the importance of features must be considered in the progressive transmission. However, studies on progressive transmission for large-volume vector data have mostly focused on map generalization in the field of cartography, and have rarely discussed the quantitative selection of geographic features. This paper applies information theory to measure the importance of features in vector maps. A measurement model for the information content of vector features is defined to address feature selection issues; the model involves a geometry factor, a spatial distribution factor and a thematic attribute factor. Moreover, a real-time transport protocol (RTP)-based progressive transmission method is presented to improve the transmission of vector data. To demonstrate the essential methodology and key techniques, a prototype for web-based progressive vector transmission is presented, and an experiment on progressive selection and transmission of vector features is conducted. The experimental results indicate that our approach clearly improves the performance and end-user experience of delivering and manipulating large vector data over the Internet.
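As a rough sketch of the selection idea, assuming a simple weighted combination of the three factors named above (the paper's actual measurement model is more elaborate, and the weights here are invented):

```python
# Illustrative importance score for vector features in progressive
# transmission. The three factors and their weights are assumptions
# standing in for the paper's geometry, spatial-distribution and
# thematic-attribute terms.
def importance(feature, w_geom=0.4, w_dist=0.3, w_attr=0.3):
    return (w_geom * feature["geometry_complexity"]
            + w_dist * feature["distribution_weight"]
            + w_attr * feature["attribute_weight"])

features = [
    {"id": 1, "geometry_complexity": 0.9, "distribution_weight": 0.2, "attribute_weight": 0.5},
    {"id": 2, "geometry_complexity": 0.3, "distribution_weight": 0.8, "attribute_weight": 0.9},
    {"id": 3, "geometry_complexity": 0.5, "distribution_weight": 0.5, "attribute_weight": 0.1},
]
# Transmit the most informative features first.
for f in sorted(features, key=importance, reverse=True):
    print(f["id"], round(importance(f), 2))
```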
Learning Motion Features for Example-Based Finger Motion Estimation for Virtual Characters
NASA Astrophysics Data System (ADS)
Mousas, Christos; Anagnostopoulos, Christos-Nikolaos
2017-09-01
This paper presents a methodology for estimating the motion of a character's fingers based on the use of motion features provided by a virtual character's hand. In the presented methodology, firstly, the motion data is segmented into discrete phases. Then, a number of motion features are computed for each motion segment of a character's hand. The motion features are pre-processed using restricted Boltzmann machines, and by using the different variations of semantically similar finger gestures in a support vector machine learning mechanism, the optimal weights for each feature assigned to a metric are computed. The advantages of the presented methodology in comparison to previous solutions are the following: First, we automate the computation of optimal weights that are assigned to each motion feature counted in our metric. Second, the presented methodology achieves an increase (about 17%) in correctly estimated finger gestures in comparison to a previous method.
Snedeker, Kate G; Canning, Paisley; Totton, Sarah C; Sargeant, Jan M
2012-04-01
Abstracts are the most commonly read part of a journal article, and play an important role as summaries of the articles, and search and screening tools. However, research on abstracts in human biomedicine has shown that abstracts often do not report key methodological features and results. Little research has been done to examine reporting of such features in abstracts from papers detailing pre-harvest food safety trials. Thus, the objective of this study was to assess the quality of reporting of key factors in abstracts detailing trials of pre-harvest food safety interventions. A systematic search algorithm was used to identify all in vivo trials of pre-harvest interventions against foodborne pathogens in PubMed and CAB Direct published from 1999 to October 2009. References were screened for relevance, and 150 were randomly chosen for inclusion in the study. A checklist based on the CONSORT abstract extension and the REFLECT Statement was used to assess the reporting of methodological features and results. All screening and assessment was performed by two independent reviewers with disagreements resolved by consensus. The systematic search returned 3554 unique citations; 356 were found to be relevant and 150 were randomly selected for inclusion. The abstracts were from 51 different journals, and 13 out of 150 were structured. Of the 124 abstracts that reported whether the trial design was deliberate disease challenge or natural exposure, 113 were deliberate challenge and 11 natural exposure. 103 abstracts detailed studies involving poultry, 20 cattle and 15 swine. Most abstracts reported the production stage of the animals (135/150), a hypothesis or objective (123/150), and results for all treatment groups (136/150). However, few abstracts reported on how animals were grouped in housing (25/150), the location of the study (5/150), the primary outcome (2/126), level of treatment allocation (15/150), sample size (63/150) or whether study units were lost to follow up (4/150). Forty-eight (48/150) abstracts reported the name, mode of administration, dose and duration of the intervention(s), while 102 (102/150) reported at least one of these elements. Nine (9/150) abstracts specified that allocation of study units to treatments was randomized, and none of the abstracts reported whether blinding was used (0/150). These results reveal gaps in reporting of methodological features and results. Thus, improving reporting quality in abstracts should be a crucial goal to be pursued by authors, reviewers and journal editors. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Belgasam, Tarek M.; Zbib, Hussein M.
2018-06-01
The increase in use of dual-phase (DP) steel grades by vehicle manufacturers to enhance crash resistance and reduce body car weight requires the development of a clear understanding of the effect of various microstructural parameters on the energy absorption in these materials. Accordingly, DP steelmakers are interested in predicting the effect of various microscopic factors as well as optimizing microstructural properties for application in crash-relevant components of vehicle bodies. This study presents a microstructure-based approach using a multiscale material and structure model. In this approach, Digimat and LS-DYNA software were coupled and employed to provide a full micro-macro multiscale material model, which is then used to simulate tensile tests. Microstructures with varied ferrite grain sizes, martensite volume fractions, and carbon content in DP steels were studied. The impact of these microstructural features at different strain rates on energy absorption characteristics of DP steels is investigated numerically using an elasto-viscoplastic constitutive model. The model is implemented in a multiscale finite-element framework. A comprehensive statistical parametric study using response surface methodology is performed to determine the optimum microstructural features for a required tensile toughness at different strain rates. The simulation results are validated using experimental data found in the literature. The developed methodology proved to be effective for investigating the influence and interaction of key microscopic properties on the energy absorption characteristics of DP steels. Furthermore, it is shown that this method can be used to identify optimum microstructural conditions at different strain-rate conditions.
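A minimal illustration of the response-surface step on synthetic data, assuming a full quadratic model in two microstructural factors; the coefficients and data are invented for the example and are not the study's results.

```python
# Minimal response-surface step on synthetic data: fit a second-order model
# of toughness vs. two factors (martensite fraction, grain size) and pick
# the optimum on a grid. All numbers are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.uniform(0.1, 0.5, 40)           # martensite volume fraction
x2 = rng.uniform(2.0, 12.0, 40)          # ferrite grain size (um)
y = 50 + 80*x1 - 90*x1**2 - 1.5*x2 + 0.05*x2**2 + rng.normal(0, 1, 40)

# Design matrix for a full quadratic model in two factors.
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

g1, g2 = np.meshgrid(np.linspace(0.1, 0.5, 50), np.linspace(2, 12, 50))
G = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(),
                     (g1*g2).ravel(), (g1**2).ravel(), (g2**2).ravel()])
pred = G @ beta
i = np.argmax(pred)
print(f"optimum near x1={g1.ravel()[i]:.2f}, x2={g2.ravel()[i]:.1f} um")
```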
Health coaching to improve healthy lifestyle behaviors: an integrative review.
Olsen, Jeanette M; Nesbitt, Bonnie J
2010-01-01
Chronic diseases account for 70% of U.S. deaths. Health coaching may help patients adopt healthy lifestyle behaviors that prevent and control diseases. This integrative review analyzed health coaching studies for evidence of effectiveness and to identify key program features. Multiple electronic databases were utilized, yielding a final sample of 15 documents. The search was limited to peer-reviewed research articles published between 1999 and 2008. Studies were further analyzed if they (1) specifically cited coaching as a program intervention, and (2) applied the intervention to research. Articles describing various quantitative and qualitative methodologies were critically analyzed using a systematic method. Data were synthesized using a matrix format according to purpose, method, intervention, findings, critique, and quality rating. All 15 studies utilized nonprobability sampling, 7 (47%) with randomized intervention and control groups. Significant improvements in one or more of the behaviors of nutrition, physical activity, weight management, or medication adherence were identified in six (40%) of the studies. Common features of effective programs were goal setting (73%), motivational interviewing (27%), and collaboration with health care providers (20%). Health coaching studies with well-specified methodologies and more rigorous designs are needed to strengthen findings; however, this behavioral change intervention suggests promise.
Combining Feature Selection and Integration—A Neural Model for MT Motion Selectivity
Beck, Cornelia; Neumann, Heiko
2011-01-01
Background The computation of pattern motion in visual area MT based on motion input from area V1 has been investigated in many experiments and models attempting to replicate the main mechanisms. Two different core conceptual approaches were developed to explain the findings. In integrationist models the key mechanism to achieve pattern selectivity is the nonlinear integration of V1 motion activity. In contrast, selectionist models focus on the motion computation at positions with 2D features. Methodology/Principal Findings Recent experiments revealed that neither of the two concepts alone is sufficient to explain all experimental data and that most of the existing models cannot account for the complex behaviour found. MT pattern selectivity changes over time for stimuli like type II plaids from vector average to the direction computed with an intersection of constraint rule or by feature tracking. Also, the spatial arrangement of the stimulus within the receptive field of a MT cell plays a crucial role. We propose a recurrent neural model showing how feature integration and selection can be combined into one common architecture to explain these findings. The key features of the model are the computation of 1D and 2D motion in model area V1 subpopulations that are integrated in model MT cells using feedforward and feedback processing. Our results are also in line with findings concerning the solution of the aperture problem. Conclusions/Significance We propose a new neural model for MT pattern computation and motion disambiguation that is based on a combination of feature selection and integration. The model can explain a range of recent neurophysiological findings including temporally dynamic behaviour. PMID:21814543
Lauria, Antonino; Tutone, Marco; Almerico, Anna Maria
2011-09-01
In recent years, the application of computational methodologies in medicinal chemistry has developed rapidly. Efforts have focused on the search for new leads with close affinity for a specific biological target, and different molecular modeling approaches have been employed to simulate molecular behavior toward that target. Despite the increasing reliability of computational methodologies, designed leads, once synthesized and screened, are not always suitable for the chosen biological target. To give these compounds another chance, this work revisits the old concept of Fischer's lock-and-key model. The same can be done for the "re-purposing" of old drugs: since drugs may have many physiological targets, it can be useful to identify them. This aspect, called "polypharmacology", is known to be therapeutically essential in different treatments. The proposed protocol, the virtual lock-and-key approach (VLKA), consists in the "virtualization" of biological targets through their known inhibitors. To release a real lock, the key must fit the pins of the lock; molecular descriptors can be considered the pins. A tested compound can be considered a potential inhibitor of a biological target if the values of its molecular descriptors fall within the ranges calculated for the set of known inhibitors. The protocol thus transforms a biological target into a "lock model" starting from its known inhibitors, under the assumption that the higher the number of fitting pins, the higher the affinity for the considered biological target. Accordingly, each biological target was converted into a sequence of "weighted" molecular descriptor range values (locks) using the structural features of the known inhibitors. Each biological target lock was tested by "fitting" the molecular descriptors of known inhibitors not used in the model construction (keys, or test set). The results showed a good predictive capability of the protocol (confidence level 80%). The method gives interesting and convenient results because descriptors and biological targets are user-defined in the process of discovering new inhibitors. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
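A minimal sketch of the lock-and-key scoring idea: descriptor ranges learned from known inhibitors act as pins, and a candidate's score is the weighted fraction of pins it fits. Descriptor names and weights here are illustrative assumptions, not the published model.

```python
# Sketch of the "virtual lock-and-key" scoring: a target becomes a set of
# descriptor ranges from known inhibitors (the pins); a candidate scores by
# the weighted fraction of its descriptors inside those ranges.
def build_lock(inhibitor_descriptors):
    """inhibitor_descriptors: list of {descriptor: value} dicts."""
    keys = inhibitor_descriptors[0].keys()
    return {k: (min(d[k] for d in inhibitor_descriptors),
                max(d[k] for d in inhibitor_descriptors)) for k in keys}

def key_score(candidate, lock, weights):
    fit = sum(weights[k] for k, (lo, hi) in lock.items()
              if lo <= candidate[k] <= hi)
    return fit / sum(weights.values())

inhibitors = [{"logP": 2.1, "MW": 310, "HBD": 2},
              {"logP": 3.4, "MW": 355, "HBD": 1},
              {"logP": 2.8, "MW": 330, "HBD": 2}]
lock = build_lock(inhibitors)
weights = {"logP": 1.0, "MW": 0.5, "HBD": 0.8}  # illustrative pin weights
print(key_score({"logP": 2.5, "MW": 340, "HBD": 3}, lock, weights))
```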
A review of clinical practice guidelines for lung cancer
Ball, David; Silvestri, Gerard A.
2013-01-01
Clinical practice guidelines are important evidence-based resources to guide complex clinical decision making. However, it is challenging for health professionals to keep abreast of available guidelines and to know how and where to access relevant guidelines. This review examines currently available guidelines for lung cancer published in the English language. Key features are listed for each identified guideline. The methodology, approaches to dissemination and implementation, and associated resources are summarised. General challenges in the area of guideline development are highlighted. The potential to collaborate more widely across lung cancer guideline developers by sharing literature searches and assessments is discussed. PMID:24163752
Statechart-based design controllers for FPGA partial reconfiguration
NASA Astrophysics Data System (ADS)
Łabiak, Grzegorz; Wegrzyn, Marek; Rosado Muñoz, Alfredo
2015-09-01
Statechart diagrams and the UML technique can be a vital part of early conceptual modeling. At present, there is not much support in hardware design methodologies for the reconfiguration features of reprogrammable devices. The authors try to bridge the gap between an imprecise UML model and a formal HDL description. The key concept in the authors' proposal is to describe the behavior of the digital controller with statechart diagrams and to map parts of the behavior into reprogrammable logic by means of groups of states that form a sequential automaton. The whole process is illustrated by an example with experimental results.
Dubow, Eric F.; Huesmann, L. Rowell; Boxer, Paul
2015-01-01
The four studies in this special issue represent important advances in research on the intergenerational transmission of aggressive behavior. In this commentary, we review the key features and findings of these studies, as well as our own cross-generational study of aggression, the Columbia County Longitudinal Study. Next, we consider important theoretical issues (e.g., defining and operationalizing “aggression” and “parenting”; assessing reciprocal effects of parenting and child aggression; identifying the ages at which aggression should be assessed across generations; broadening the investigation of contextual and individual factors). We then discuss several methodological issues (e.g., determining the most informative measurement intervals for assessing prospective effects; sampling considerations; measuring potential moderating and mediating variables that might explain cross-generational continuities and discontinuities in parenting and aggression). Finally, we raise implications of cross-generational research for designing interventions targeting the reduction and prevention of child aggression. PMID:12735400
NASA Astrophysics Data System (ADS)
Plaimer, Martin; Breitfuß, Christoph; Sinz, Wolfgang; Heindl, Simon F.; Ellersdorfer, Christian; Steffan, Hermann; Wilkening, Martin; Hennige, Volker; Tatschl, Reinhard; Geier, Alexander; Schramm, Christian; Freunberger, Stefan A.
2016-02-01
Lithium-ion batteries are in widespread use in electric and hybrid vehicles. Besides features like energy density, cost, lifetime, and recyclability, the safety of a battery system is of prime importance. The separator material impacts all these properties and therefore requires an informed selection. The interplay between the mechanical and electrochemical properties as key selection criteria is investigated. Mechanical properties were investigated using tensile and puncture penetration tests at abuse-relevant conditions. To investigate the electrochemical performance in terms of effective conductivity, a method based on impedance spectroscopy was introduced. This methodology is applied to evaluate ten commercial separators, which allows for a trade-off analysis of mechanical versus electrochemical performance. Based on the results, and in combination with other factors, this offers an effective approach to select suitable separators for automotive applications.
NASA Astrophysics Data System (ADS)
Colantonio, Alessandro; di Pietro, Roberto; Ocello, Alberto; Verde, Nino Vincenzo
In this paper we address the problem of generating a candidate role-set for an RBAC configuration that enjoys two key features: it minimizes the administration cost, and it is stable. To achieve these goals, we implement a three-step methodology: first, we associate a weight with each role; second, we identify and remove the user-permission assignments that cannot belong to any role with a weight exceeding a given threshold; third, we restrict the problem of finding a candidate role-set for the given system configuration to the user-permission assignments that were not removed in the second step—that is, the assignments that belong to roles with a weight exceeding the given threshold. We formally show—proofs of our results are rooted in graph theory—that this methodology achieves the intended goals. Finally, we discuss practical applications of our approach to the role mining problem.
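A simplified sketch of the pruning step, with a deliberately crude stand-in for the role-weight function (here, the number of users sharing a permission); the paper's weighting is more general.

```python
# Sketch of the pruning step: assignments that can only belong to
# low-weight roles are discarded before role mining. A candidate role's
# weight is approximated by the number of users sharing a permission --
# a simplified stand-in for the paper's weighting function.
from collections import defaultdict

assignments = {("alice", "p1"), ("alice", "p2"), ("bob", "p1"),
               ("carol", "p1"), ("carol", "p3")}
threshold = 2

users_per_perm = defaultdict(set)
for user, perm in assignments:
    users_per_perm[perm].add(user)

kept = {(u, p) for (u, p) in assignments
        if len(users_per_perm[p]) >= threshold}
print(sorted(kept))  # only the widely shared permission p1 survives
```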
Tucker, Conrad; Han, Yixiang; Nembhard, Harriet Black; Lewis, Mechelle; Lee, Wang-Chien; Sterling, Nicholas W; Huang, Xuemei
2017-01-01
Parkinson’s disease (PD) is the second most common neurological disorder after Alzheimer’s disease. Key clinical features of PD are motor-related and are typically assessed by healthcare providers based on qualitative visual inspection of a patient’s movement/gait/posture. More advanced diagnostic techniques such as computed tomography scans that measure brain function, can be cost prohibitive and may expose patients to radiation and other harmful effects. To mitigate these challenges, and open a pathway to remote patient-physician assessment, the authors of this work propose a data mining driven methodology that uses low cost, non-invasive sensors to model and predict the presence (or lack therefore) of PD movement abnormalities and model clinical subtypes. The study presented here evaluates the discriminative ability of non-invasive hardware and data mining algorithms to classify PD cases and controls. A 10-fold cross validation approach is used to compare several data mining algorithms in order to determine that which provides the most consistent results when varying the subject gait data. Next, the predictive accuracy of the data mining model is quantified by testing it against unseen data captured from a test pool of subjects. The proposed methodology demonstrates the feasibility of using non-invasive, low cost, hardware and data mining models to monitor the progression of gait features outside of the traditional healthcare facility, which may ultimately lead to earlier diagnosis of emerging neurological diseases. PMID:29541376
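A brief sketch of the 10-fold cross-validation comparison described above, with synthetic stand-ins for the gait features and case/control labels; the classifiers shown are examples, not necessarily those the authors compared.

```python
# Sketch of comparing classifiers by 10-fold cross-validation on
# synthetic stand-ins for gait features and PD/control labels.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 8))      # e.g., stride time, cadence, sway, ...
y = rng.integers(0, 2, size=100)   # 1 = PD case, 0 = control

for clf in (RandomForestClassifier(random_state=0), SVC()):
    scores = cross_val_score(clf, X, y, cv=10)
    print(type(clf).__name__, scores.mean().round(3))
```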
NASA Astrophysics Data System (ADS)
Karimi-Fard, M.; Durlofsky, L. J.
2016-10-01
A comprehensive framework for modeling flow in porous media containing thin, discrete features, which could be high-permeability fractures or low-permeability deformation bands, is presented. The key steps of the methodology are mesh generation, fine-grid discretization, upscaling, and coarse-grid discretization. Our specialized gridding technique combines a set of intersecting triangulated surfaces by constructing approximate intersections using existing edges. This procedure creates a conforming mesh of all surfaces, which defines the internal boundaries for the volumetric mesh. The flow equations are discretized on this conforming fine mesh using an optimized two-point flux finite-volume approximation. The resulting discrete model is represented by a list of control-volumes with associated positions and pore-volumes, and a list of cell-to-cell connections with associated transmissibilities. Coarse models are then constructed by the aggregation of fine-grid cells, and the transmissibilities between adjacent coarse cells are obtained using flow-based upscaling procedures. Through appropriate computation of fracture-matrix transmissibilities, a dual-continuum representation is obtained on the coarse scale in regions with connected fracture networks. The fine and coarse discrete models generated within the framework are compatible with any connectivity-based simulator. The applicability of the methodology is illustrated for several two- and three-dimensional examples. In particular, we consider gas production from naturally fractured low-permeability formations, and transport through complex fracture networks. In all cases, highly accurate solutions are obtained with significant model reduction.
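For the cell-to-cell connection lists mentioned above, the standard two-point flux approximation gives each connection's transmissibility as a harmonic combination of half-cell terms. A minimal sketch under simplified orthogonal geometry (the framework's conforming meshes and upscaling are more general):

```python
# Standard two-point flux approximation (TPFA) transmissibility between two
# adjacent cells, as used in connectivity-list simulators: a harmonic
# combination of half-transmissibilities k*A/d. Geometry is simplified to
# orthogonal cells.
def half_transmissibility(perm, area, dist):
    return perm * area / dist

def transmissibility(k1, k2, area, d1, d2):
    t1 = half_transmissibility(k1, area, d1)
    t2 = half_transmissibility(k2, area, d2)
    return 1.0 / (1.0 / t1 + 1.0 / t2)

# Matrix cell (low perm) connected to a fracture cell (high perm):
# the tight matrix side dominates the connection, as expected.
print(transmissibility(k1=1e-15, k2=1e-12, area=1.0, d1=0.5, d2=0.0005))
```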
Institutions and national development in Latin America: a comparative study
Portes, Alejandro; Smith, Lori D.
2013-01-01
We review the theoretical and empirical literatures on the role of institutions in national development as a prelude to presenting a more rigorous and measurable definition of the concept and a methodology to study this relationship at the national and subnational levels. The existing research literature features conflicting definitions of the concept of “institutions” and empirical tests based mostly on reputational indices, with countries as units of analysis. The present study’s methodology is based on a set of five strategic organizations studied comparatively in five Latin American countries. These include key federal agencies, public administrative organizations, and stock exchanges. Systematic analysis of results shows a pattern of differences between economically-oriented institutions and those entrusted with providing basic services to the general population. Consistent differences in institutional quality also emerge across countries, despite similar levels of economic development. Using the algebraic methods developed by Ragin, we test six hypotheses about factors determining the developmental character of particular institutions. Implications of the results for theory and for the methodological practices of future studies in this field are discussed. PMID:26543407
NASA Astrophysics Data System (ADS)
D'silva, Oneil; Kerrison, Roger
2013-09-01
A key enabler for the increased utilization of space robotics is to automate extra-vehicular manned space activities and thus significantly reduce the potential for catastrophic hazards while simultaneously minimizing the overall costs associated with manned space. The principal scope of the paper is to evaluate the use of industry-standard probabilistic risk/safety assessment (PRA/PSA) methodologies and hazard risk frequency criteria as a hazard control. This paper illustrates the applicability of combining the selected probabilistic risk assessment methodology and hazard risk frequency criteria in order to apply the necessary safety controls that allow for the increased use of the Mobile Servicing System (MSS) robotic system on the International Space Station. This document considers factors such as component failure rate reliability, software reliability, periods of operation and dormancy, and fault tree analyses, and their effects on the probabilistic risk assessments. The paper concludes with suggestions for the incorporation of existing industry risk/safety plans to create an applicable safety process for future activities and programs.
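A hedged illustration of the basic PRA building blocks such an assessment rests on: constant failure rates converted into failure probabilities over an operating period, then combined through OR/AND gates in a fault tree. The rates and gate layout below are invented for the example, not MSS data.

```python
# Basic PRA building blocks: convert a constant failure rate over an
# operating period into a probability, then combine independent events
# through OR / AND gates. All numbers are illustrative.
import math

def p_fail(rate_per_hour, hours):
    return 1.0 - math.exp(-rate_per_hour * hours)

def gate_or(*ps):   # hazard occurs if any one event occurs
    prod = 1.0
    for p in ps:
        prod *= (1.0 - p)
    return 1.0 - prod

def gate_and(*ps):  # hazard occurs only if all redundant elements fail
    prod = 1.0
    for p in ps:
        prod *= p
    return prod

actuator = p_fail(1e-5, 1000)           # hypothetical joint actuator
software = 1e-4                         # hypothetical per-demand failure
redundant = gate_and(actuator, actuator)  # two redundant actuators
print(gate_or(redundant, software))
```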
Dynamic Assessment and Response to Intervention
Grigorenko, Elena L.
2013-01-01
This article compares and contrasts the main features of dynamic testing and assessment (DT/A) and response to intervention (RTI). The comparison is carried out along the following lines: (a) historical and empirical roots of both concepts, (b) premises underlying DT/A and RTI, (c) terms used in these concepts, (d) use of these concepts, (e) evidence in support of DT/A and RTI, and (f) expectations associated with each of the concepts. The main outcome of this comparison is a conclusion that both approaches belong to one family of methodologies in psychology and education whose key feature is in blending assessment and intervention in one holistic activity. Because DT/A has been around much longer than RTI, it makes sense for the proponents of RTI to consider both the accomplishments and frustrations that have accumulated in the field of DT/A. PMID:19073895
Current trends in the design of scaffolds for computer-aided tissue engineering.
Giannitelli, S M; Accoto, D; Trombetta, M; Rainer, A
2014-02-01
Advances introduced by additive manufacturing have significantly improved the ability to tailor scaffold architecture, enhancing the control over microstructural features. This has led to a growing interest in the development of innovative scaffold designs, as testified by the increasing amount of research activities devoted to the understanding of the correlation between topological features of scaffolds and their resulting properties, in order to find architectures capable of optimal trade-off between often conflicting requirements (such as biological and mechanical ones). The main aim of this paper is to provide a review and propose a classification of existing methodologies for scaffold design and optimization in order to address key issues and help in deciphering the complex link between design criteria and resulting scaffold properties. Copyright © 2013 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Message Variability and Heterogeneity: A Core Challenge for Communication Research
Slater, Michael D.; Peter, Jochen; Valkenberg, Patti
2015-01-01
Messages are central to human social experience, and pose key conceptual and methodological challenges in the study of communication. In response to these challenges, we outline a systematic approach to conceptualizing, operationalizing, and analyzing messages. At the conceptual level, we distinguish between two core aspects of messages: message variability (the defined and operationalized features of messages) and message heterogeneity (the undefined and unmeasured features of messages), and suggest preferred approaches to defining message variables. At the operational level, we identify message sampling, selection, and research design strategies responsive to issues of message variability and heterogeneity in experimental and survey research. At the analytical level, we highlight effective techniques to deal with message variability and heterogeneity. We conclude with seven recommendations to increase rigor in the study of communication through appropriately addressing the challenges presented by messages. PMID:26681816
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A
2017-01-15
Researchers often rely on simple methods to identify involvement of neurons in a particular motor task. The historical approach has been to inspect large groups of neurons and subjectively separate neurons into groups based on the expertise of the investigator. In cases where neuron populations are small it is reasonable to inspect these neuronal recordings and their firing rates carefully to avoid data omissions. In this paper, a new methodology is presented for automatic objective classification of neurons recorded in association with behavioral tasks into groups. By identifying characteristics of neurons in a particular group, the investigator can then identify functional classes of neurons based on their relationship to the task. The methodology is based on integration of a multiple signal classification (MUSIC) algorithm to extract relevant features from the firing rate and an expectation-maximization Gaussian mixture algorithm (EM-GMM) to cluster the extracted features. The methodology is capable of identifying and clustering similar firing rate profiles automatically based on specific signal features. An empirical wavelet transform (EWT) was used to validate the features found in the MUSIC pseudospectrum and the resulting signal features captured by the methodology. Additionally, this methodology was used to inspect behavioral elements of neurons to physiologically validate the model. This methodology was tested using a set of data collected from awake behaving non-human primates. Copyright © 2016 Elsevier B.V. All rights reserved.
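A compact sketch of the clustering stage: spectral features extracted per firing-rate profile, then grouped with an EM-fitted Gaussian mixture. A plain FFT magnitude stands in here for the MUSIC pseudospectrum, and the simulated firing rates are assumptions.

```python
# Sketch of the clustering stage: spectral features per firing-rate
# profile, grouped by an EM-fitted Gaussian mixture. A plain FFT
# magnitude stands in for the MUSIC pseudospectrum.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
t = np.linspace(0, 2, 200)
# Two simulated functional classes: slow- vs. fast-modulated firing rates.
rates = np.array([np.sin(2*np.pi*f*t) + rng.normal(0, 0.2, t.size)
                  for f in ([1]*10 + [4]*10)])

features = np.abs(np.fft.rfft(rates, axis=1))[:, 1:6]  # low-order spectrum
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(features)
print(labels)  # the two modulation classes separate into two clusters
```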
[Prediction of ETA oligopeptides antagonists from Glycine max based on in silico proteolysis].
Qiao, Lian-Sheng; Jiang, Lu-di; Luo, Gang-Gang; Lu, Fang; Chen, Yan-Kun; Wang, Ling-Zhi; Li, Gong-Yu; Zhang, Yan-Ling
2017-02-01
Oligopeptides are one of the key pharmaceutically effective constituents of traditional Chinese medicine (TCM). Systematic study of the composition and efficacy of TCM oligopeptides is essential for analyzing the material basis and mechanism of TCM. In this study, potential anti-hypertensive oligopeptides from Glycine max and their endothelin receptor A (ETA) antagonistic activity were discovered and predicted using in silico technologies. Main protein sequences of G. max were collected and oligopeptides were obtained using in silico gastrointestinal proteolysis. Then, the pharmacophore of ETA antagonistic peptides was constructed; it included one hydrophobic feature, one ionizable negative feature, one ring aromatic feature and five excluded volumes. Meanwhile, the three-dimensional structure of ETA was developed by homology modeling for further docking studies. According to docking analysis and consensus scoring, GLN165 was identified as the key amino acid for ETA antagonistic activity, and 27 oligopeptides from G. max were predicted as potential ETA antagonists by pharmacophore and docking studies. In silico proteolysis can thus be used to analyze protein sequences from TCM. By combining in silico proteolysis and molecular simulation, the biological activities of oligopeptides can be predicted rapidly from known TCM protein sequences, providing a methodological basis for rapid and efficient mechanism analysis of TCM oligopeptides. Copyright© by the Chinese Pharmaceutical Association.
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.
2015-12-01
Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented it as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named the Climate Model Diagnostic Analyzer (CMDA), and it is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; and (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution functions, conditional sampling, and time-lagged correlation maps. We have implemented the new methodology as web services and incorporated the system into the Cloud. We have also developed a provenance management system for CMDA in which CMDA service semantics modeling, service search and recommendation, and service execution history management are designed and implemented.
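Of the diagnostic techniques listed, random forest feature importance ranking is the most self-contained to illustrate. The sketch below ranks hypothetical climate variables by importance for a synthetic target; the variable names and data are stand-ins, not CMDA outputs.

```python
# Minimal sketch of one CMDA diagnostic: ranking candidate physical variables by
# random-forest feature importance. Variable names and data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = pd.DataFrame({
    "sst": rng.normal(300, 2, 1000),          # sea-surface temperature (K)
    "rh500": rng.uniform(0, 100, 1000),       # relative humidity at 500 hPa (%)
    "omega500": rng.normal(0, 0.1, 1000),     # vertical velocity (Pa/s)
})
y = 0.6 * X["rh500"] - 50 * X["omega500"] + rng.normal(0, 5, 1000)  # e.g. cloud fraction

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, score in sorted(zip(X.columns, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:10s} {score:.3f}")
```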
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messner, Mark C.; Sham, Sam; Wang, Yanli
This report summarizes the experiments performed in FY17 on Gr. 91 steels. The testing of Gr. 91 has technical significance because, currently, it is the only approved material for Class A construction that is strongly cyclic softening. Specific FY17 testing includes the following activities for Gr. 91 steel. First, two types of key feature testing have been initiated, including two-bar thermal ratcheting and Simplified Model Testing (SMT). The goal is to qualify the Elastic-Perfectly Plastic (EPP) design methodologies and to support incorporation of these rules for Gr. 91 into the ASME Division 5 Code. The preliminary SMT test results show that Gr. 91 is most damaging when tested with compression hold mode under the SMT creep-fatigue testing condition. Two-bar thermal ratcheting test results over a temperature range of 350 to 650 °C were compared with the EPP strain limits code case evaluation, and the results show that the EPP strain limits code case is conservative. The material information obtained from these key feature tests can also be used to verify the material model. Second, to provide experimental data in support of the viscoplastic material model development at Argonne National Laboratory, selective tests were performed to evaluate the effect of cyclic softening on strain rate sensitivity and creep rates. The results show that prior cyclic loading history decreases the strain rate sensitivity and increases creep rates. In addition, isothermal cyclic stress-strain curves were generated at six different temperatures, and nonisothermal thermomechanical testing was also performed to provide data to calibrate the viscoplastic material model.
Hu, Yongli; Hase, Takeshi; Li, Hui Peng; Prabhakar, Shyam; Kitano, Hiroaki; Ng, See Kiong; Ghosh, Samik; Wee, Lawrence Jin Kiat
2016-12-22
The ability to sequence the transcriptomes of single cells using single-cell RNA-seq technologies represents a shift in the scientific paradigm: scientists can now investigate the complex biology of a heterogeneous population of cells one cell at a time. To date, however, there has been no suitable computational methodology for analyzing such an intricate deluge of data, in particular techniques that aid the identification of the unique transcriptomic profiles that differentiate cellular subtypes. In this paper, we describe a novel methodology for the analysis of single-cell RNA-seq data, obtained from neocortical cells and neural progenitor cells, using machine learning algorithms (Support Vector Machine (SVM) and Random Forest (RF)). Thirty-eight key transcripts were identified, using the SVM-based recursive feature elimination (SVM-RFE) method of feature selection, that best differentiate developing neocortical cells from neural progenitor cells in the SVM and RF classifiers built. These genes also possessed higher discriminative power (enhanced prediction accuracy) compared with commonly used statistical techniques or geneset-based approaches. Further downstream network reconstruction analysis was carried out to unravel hidden regulatory networks in which novel interactions could be validated in wet-lab experimentation and could be useful candidate targets for the treatment of neuronal developmental diseases. The approach reported here identifies transcripts, with reported neuronal involvement, that optimally differentiate neocortical cells from neural progenitor cells. It is believed to be extensible and applicable to other single-cell RNA-seq expression profiles, such as those from studies of cancer progression and treatment within highly heterogeneous tumours.
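The feature-selection step described here corresponds closely to standard SVM-RFE as available in scikit-learn. A minimal sketch on a toy expression matrix (the cell labels and transcript counts are synthetic):

```python
# Minimal sketch of SVM-based recursive feature elimination (SVM-RFE) to pick
# discriminative transcripts; the gene matrix and labels here are toy data.
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 2000))              # 120 cells x 2000 transcripts (toy)
y = rng.integers(0, 2, 120)                   # 0 = progenitor, 1 = neocortical (toy)

svm = SVC(kernel="linear")                    # linear kernel exposes coef_ for ranking
rfe = RFE(svm, n_features_to_select=38, step=0.1).fit(X, y)
selected = np.flatnonzero(rfe.support_)
print(f"{len(selected)} transcripts kept, e.g. indices {selected[:5]}")
```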
'Setting the guinea pigs free': towards a new model of community-led social marketing.
Smith, A J; Henry, L
2009-09-01
To offer the opportunity to discuss the positive contribution of co-production approaches in the field of social marketing. Recognizing the ever-evolving theoretical base for social marketing, this article offers a brief commentary on the positive contribution of co-production approaches in this field. The authors outline their own move towards conceptualizing a community-led social marketing approach and describe some of its key features. This developing framework has been influenced by, and tested through, the Early Presentation of Cancer Symptoms Programme, a community-led social marketing approach to tackling health inequalities across priority neighbourhoods in North East Lincolnshire, UK. A blend of social marketing, community involvement and rapid improvement science methodologies is drawn upon. The approach involves not just a strong focus on involving communities in insight and consultation, but also adopts methods in which communities are in charge of the process of generating solutions. A series of monthly and pre/post measures has demonstrated improvements in awareness of symptoms, reported willingness to act and increases in presentation measured through service referrals. Key features of the approach involve shared ownership and a shift away from service-instigated change by enabling communities 'to do' through developing skills, confidence and the conditions to 'try out'. The approach highlights the contribution that co-production approaches have to offer social marketing activity. In order to maximize potential, it is important to consider ways of engaging communities effectively. Successful approaches include translating social marketing methodology into easy-to-use frameworks, involving communities in gathering and interpreting local data, and supporting communities to act as change agents by planning and carrying out activity. The range of impacts across organisational, health and social capital measures demonstrates that multiple and longer-lasting improvements can be achieved with successful approaches.
Testing earthquake source inversion methodologies
Page, M.; Mai, P.M.; Schorlemmer, D.
2011-01-01
Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010. Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, the assumed fault geometry and velocity structure, and the chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.
Krefeld-Schwalb, Antonia; Witte, Erich H.; Zenker, Frank
2018-01-01
In psychology as elsewhere, the main statistical inference strategy to establish empirical effects is null-hypothesis significance testing (NHST). The recent failure to replicate allegedly well-established NHST-results, however, implies that such results lack sufficient statistical power, and thus feature unacceptably high error-rates. Using data-simulation to estimate the error-rates of NHST-results, we advocate the research program strategy (RPS) as a superior methodology. RPS integrates Frequentist with Bayesian inference elements, and leads from a preliminary discovery against a (random) H0-hypothesis to a statistical H1-verification. Not only do RPS-results feature significantly lower error-rates than NHST-results, RPS also addresses key-deficits of a “pure” Frequentist and a standard Bayesian approach. In particular, RPS aggregates underpowered results safely. RPS therefore provides a tool to regain the trust the discipline had lost during the ongoing replicability-crisis. PMID:29740363
Thornton, Lukar E; Pearce, Jamie R; Kavanagh, Anne M
2011-07-01
Features of the built environment are increasingly being recognised as potentially important determinants of obesity. This has come about, in part, because of advances in methodological tools such as Geographic Information Systems (GIS). GIS has made the procurement of data related to the built environment easier and given researchers the flexibility to create a new generation of environmental exposure measures such as the travel time to the nearest supermarket or calculations of the amount of neighbourhood greenspace. Given the rapid advances in the availability of GIS data and the relative ease of use of GIS software, a glossary on the use of GIS to assess the built environment is timely. As a case study, we draw on aspects the food and physical activity environments as they might apply to obesity, to define key GIS terms related to data collection, concepts, and the measurement of environmental features.
Functional equivalency inferred from "authoritative sources" in networks of homologous proteins.
Natarajan, Shreedhar; Jakobsson, Eric
2009-06-12
A one-on-one mapping of protein functionality across different species is a critical component of comparative analysis. This paper presents a heuristic algorithm for discovering the Most Likely Functional Counterparts (MoLFunCs) of a protein, based on simple concepts from network theory. A key feature of our algorithm is utilization of the user's knowledge to assign high confidence to selected functional identifications. We show use of the algorithm to retrieve functional equivalents for 7 membrane proteins, from an exploration of almost 40 genomes from multiple online resources. We verify the functional equivalency of our dataset through a series of tests that include sequence, structure and function comparisons. Comparison is made to the OMA methodology, which also identifies one-on-one mappings between proteins from different species. Based on that comparison, we believe that incorporation of the user's knowledge as a key aspect of the technique adds value to purely statistical formal methods.
NASA Astrophysics Data System (ADS)
Phillion, A. B.; Cockcroft, S. L.; Lee, P. D.
2009-07-01
The methodology of direct finite element (FE) simulation was used to predict the semi-solid constitutive behavior of an industrially important aluminum-magnesium alloy, AA5182. Model microstructures were generated that detail key features of the as-cast semi-solid: equiaxed-globular grains of random size and shape, interconnected liquid films, and pores at the triple-junctions. Based on the results of over fifty different simulations, a model-based constitutive relationship that includes the effects of the key microstructure features (fraction solid, grain size and fraction porosity) was derived using regression analysis. This novel constitutive equation was then validated via comparison with both the FE simulations and experimental stress/strain data. Such an equation can now be used to incorporate the effects of microstructure on the bulk semi-solid flow stress within a macro-scale process model.
ERIC Educational Resources Information Center
Karpova, Natalia Konstantinovna; Uvarovsky, Alexander Pavlovich; Mareev, Vladimir Ivanovich; Petrova, Nina Petrovna; Borzilov, Yuri Petrovich
2016-01-01
The article is devoted to some methodological features of modernization in modern education in the context of "the knowledge economy" development which is aimed at shaping new mental potential of the modern state. Features and prerequisites for the emergence of the knowledge economy, the priorities of which include the development and…
System perspectives for mobile platform design in m-Health
NASA Astrophysics Data System (ADS)
Roveda, Janet M.; Fink, Wolfgang
2016-05-01
Advances in integrated circuit technologies have led to the integration of medical sensor front ends with data processing circuits, i.e., mobile platform design for wearable sensors. We discuss design methodologies for wearable sensor nodes and their applications in m-Health. From the user perspective, flexibility, comfort, appearance, fashion, ease-of-use, and visibility are key form factors. From the technology development point of view, high accuracy, low power consumption, and high signal to noise ratio are desirable features. From the embedded software design standpoint, real time data analysis algorithms, application and database interfaces are the critical components to create successful wearable sensor-based products.
NASA Astrophysics Data System (ADS)
Szuflitowska, B.; Orlowski, P.
2017-08-01
The automated detection system consists of two key steps: extraction of features from EEG signals and classification to detect pathological activity. The EEG sequences were analyzed using the Short-Time Fourier Transform and classification was performed using Linear Discriminant Analysis. The accuracy of the technique was tested on three sets of EEG signals: epilepsy, healthy and Alzheimer's disease. A classification error below 10% was considered a success. Higher accuracy was obtained for new data from unknown classes than for the testing data. The methodology can be helpful in differentiating epileptic seizures from EEG disturbances in Alzheimer's disease.
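A minimal sketch of this two-step design, using scipy's STFT and scikit-learn's LDA; the sampling rate, window length, and EEG epochs below are assumptions for illustration:

```python
# Minimal sketch: STFT features from EEG epochs, then LDA classification.
import numpy as np
from scipy.signal import stft
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 256                                       # sampling rate in Hz (assumed)
rng = np.random.default_rng(2)
epochs = rng.normal(size=(90, fs * 4))         # 90 four-second epochs (toy data)
labels = rng.integers(0, 3, 90)                # 0 epilepsy, 1 healthy, 2 AD (toy)

def stft_features(x):
    f, t, Z = stft(x, fs=fs, nperseg=fs)       # 1-second windows
    power = np.abs(Z) ** 2
    return power.mean(axis=1)                  # mean power per frequency bin

X = np.array([stft_features(e) for e in epochs])
lda = LinearDiscriminantAnalysis().fit(X[:60], labels[:60])
print("held-out accuracy:", lda.score(X[60:], labels[60:]))
```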
Ultrasonic characterization of the fiber-matrix interfacial bond in aerospace composites.
Aggelis, D G; Kleitsa, D; Matikas, T E
2013-01-01
The properties of advanced composites rely on the quality of the fiber-matrix bonding. Service-induced damage results in deterioration of bonding quality, seriously compromising the load-bearing capacity of the structure. While traditional methods to assess bonding are destructive, herein a nondestructive methodology based on shear wave reflection is numerically investigated. Reflection relies on the bonding quality and results in discernable changes in the received waveform. The key element is the "interphase" model material with varying stiffness. The study is an example of how computational methods enhance the understanding of delicate features concerning the nondestructive evaluation of materials used in advanced structures.
Improving applied roughness measurement of involute helical gears
NASA Astrophysics Data System (ADS)
Koulin, G.; Zhang, J.; Frazer, R. C.; Wilson, S. J.; Shaw, B. A.
2017-12-01
With improving gear design and manufacturing technology, improvement in metrology is necessary to provide reliable feedback to the designer and manufacturer. A recommended gear roughness measurement method is applied to a micropitting contact fatigue test gear. The development of wear and micropitting is reliably characterised at the sub-micron roughness level. Changes to the features of the localised surface texture are revealed and are related to key gear meshing positions. The application of the recommended methodology is shown to provide informative feedback to the gear designer in reference to the fundamental gear coordinate system, which is used in gear performance simulations such as tooth contact analysis.
Deterministic nonlinear phase gates induced by a single qubit
NASA Astrophysics Data System (ADS)
Park, Kimin; Marek, Petr; Filip, Radim
2018-05-01
We propose deterministic realizations of nonlinear phase gates by repeating a finite sequence of non-commuting Rabi interactions between a harmonic oscillator and only a single two-level ancillary qubit. We show explicitly that the key nonclassical features of the ideal cubic phase gate and the quartic phase gate are generated in the harmonic oscillator faithfully by our method. We numerically analyzed the performance of our scheme under realistic imperfections of the oscillator and the two-level system. The methodology is extended further to higher-order nonlinear phase gates. This theoretical proposal completes the set of operations required for continuous-variable quantum computation.
Health impact assessment of industrial development projects: a spatio-temporal visualization.
Winkler, Mirko S; Krieger, Gary R; Divall, Mark J; Singer, Burton H; Utzinger, Jürg
2012-05-01
Development and implementation of large-scale industrial projects in complex eco-epidemiological settings typically require combined environmental, social and health impact assessments. We present a generic, spatio-temporal health impact assessment (HIA) visualization, which can be readily adapted to specific projects and key stakeholders, including poorly literate communities that might be affected by consequences of a project. We illustrate how the occurrence of a variety of complex events can be utilized for stakeholder communication, awareness creation, interactive learning as well as formulating HIA research and implementation questions. Methodological features are highlighted in the context of an iron ore development in a rural part of Africa.
Iris recognition based on key image feature extraction.
Ren, X; Tian, Q; Zhang, J; Wu, S; Zeng, Y
2008-01-01
In iris recognition, feature extraction can be influenced by factors such as illumination and contrast, and thus the features extracted may be unreliable, which can cause a high rate of false results in iris pattern recognition. In order to obtain stable features, an algorithm was proposed in this paper to extract key features of a pattern from multiple images. The proposed algorithm built an iris feature template by extracting key features and performed iris identity enrolment. Simulation results showed that the selected key features have high recognition accuracy on the CASIA Iris Set, where both contrast and illumination variance exist.
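The abstract's idea of retaining only features that remain stable across multiple captures can be sketched as follows; the binary iris codes, noise rate, and stability rule below are illustrative assumptions, not the paper's algorithm.

```python
# Minimal sketch: build a feature template from the bits that stay stable
# across several enrolment images of the same iris (toy binary codes).
import numpy as np

rng = np.random.default_rng(8)
true_code = rng.integers(0, 2, 512)
# Five noisy captures of the same iris (illumination/contrast noise, ~10% flips).
captures = np.array([np.where(rng.random(512) < 0.1, 1 - true_code, true_code)
                     for _ in range(5)])

stable = captures.std(axis=0) == 0             # bits identical in every capture
template_bits = captures[0, stable]            # key features kept for matching
print(f"{stable.sum()} of 512 bits kept as key features")
```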
Accuracy and Calibration of Computational Approaches for Inpatient Mortality Predictive Modeling.
Nakas, Christos T; Schütz, Narayan; Werners, Marcus; Leichtle, Alexander B
2016-01-01
Electronic Health Record (EHR) data can be a key resource for decision-making support in clinical practice in the "big data" era. The complete database from early 2012 to late 2015 involving hospital admissions to Inselspital Bern, the largest Swiss University Hospital, was used in this study, involving over 100,000 admissions. Age, sex, and initial laboratory test results were the features/variables of interest for each admission, the outcome being inpatient mortality. Computational decision support systems were utilized for the calculation of the risk of inpatient mortality. We assessed the recently proposed Acute Laboratory Risk of Mortality Score (ALaRMS) model, and further built generalized linear models, generalized estimating equations, artificial neural networks, and decision tree systems for the predictive modeling of the risk of inpatient mortality. The Area Under the ROC Curve (AUC) for ALaRMS marginally corresponded to the anticipated accuracy (AUC = 0.858). Penalized logistic regression methodology provided a better result (AUC = 0.872). Decision tree and neural network-based methodology provided even higher predictive performance (up to AUC = 0.912 and 0.906, respectively). Additionally, decision tree-based methods can efficiently handle Electronic Health Record (EHR) data that have a significant amount of missing records (in up to >50% of the studied features) eliminating the need for imputation in order to have complete data. In conclusion, we show that statistical learning methodology can provide superior predictive performance in comparison to existing methods and can also be production ready. Statistical modeling procedures provided unbiased, well-calibrated models that can be efficient decision support tools for predicting inpatient mortality and assigning preventive measures.
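The penalized logistic regression baseline reported here is straightforward to reproduce in outline; the sketch below uses synthetic admission features (real inputs would be age, sex, and initial laboratory results) and reports the AUC on a held-out split.

```python
# Minimal sketch of a penalized-logistic-regression mortality model with AUC
# evaluation; the data are synthetic stand-ins, not the Inselspital records.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
X = rng.normal(size=(5000, 20))                   # age, sex, lab results (toy)
logit = X[:, 0] * 0.8 + X[:, 1] * 0.5 - 2.5
y = rng.random(5000) < 1 / (1 + np.exp(-logit))   # synthetic mortality outcome

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(Xtr, ytr)
print("AUC:", roc_auc_score(yte, clf.predict_proba(Xte)[:, 1]))
```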
Nuclear thermal propulsion engine system design analysis code development
NASA Astrophysics Data System (ADS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.; Ivanenok, Joseph F.
1992-01-01
A Nuclear Thermal Propulsion (NTP) Engine System Design Analysis Code has recently been developed to characterize key NTP engine system design features. Such a versatile, standalone NTP system performance and engine design code is required to support ongoing and future engine system and vehicle design efforts associated with proposed Space Exploration Initiative (SEI) missions of interest. Key areas of interest in the engine system modeling effort were the reactor, shielding, and inclusion of an engine multi-redundant propellant pump feed system design option. A solid-core nuclear thermal reactor and internal shielding code model was developed to estimate the reactor's thermal-hydraulic and physical parameters based on a prescribed thermal output; this model was integrated into a state-of-the-art engine system design model. The reactor code module has the capability to model graphite, composite, or carbide fuels. Key output from the model consists of reactor parameters such as thermal power, pressure drop, thermal profile, and heat generation in cooled structures (reflector, shield, and core supports), as well as engine system parameters such as weight, dimensions, pressures, temperatures, mass flows, and performance. The model's overall analysis methodology and its key assumptions and capabilities are summarized in this paper.
BATSE gamma-ray burst line search. 2: Bayesian consistency methodology
NASA Technical Reports Server (NTRS)
Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.
1994-01-01
We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.
From LCAs to simplified models: a generic methodology applied to wind power electricity.
Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle
2013-02-05
This study presents a generic methodology for producing simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify the key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse gas (GHG) performance has been plotted as a function of these identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, thus avoiding the undertaking of an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them with a robust but simple support tool for assessing the environmental performance of energy systems.
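The shape of such a simplified model can be illustrated directly: the GHG intensity of wind electricity follows from embodied emissions divided by lifetime electricity production, which is governed by the two key parameters identified (load factor and lifetime). The rated power and embodied-emissions figures below are illustrative assumptions, not the paper's fitted values.

```python
# Minimal sketch of a simplified life-cycle GHG model for onshore wind,
# parameterized by load factor and lifetime (illustrative constants).
def ghg_intensity(load_factor, lifetime_years,
                  rated_kw=2000.0, embodied_kg_co2e=1.5e6):
    """g CO2-eq per kWh = embodied emissions / lifetime electricity output."""
    kwh = rated_kw * load_factor * 8760.0 * lifetime_years
    return embodied_kg_co2e * 1000.0 / kwh

for lf in (0.20, 0.30, 0.40):
    print(f"load factor {lf:.0%}: {ghg_intensity(lf, 20):.1f} g CO2e/kWh")
```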
Reliability in content analysis: The case of semantic feature norms classification.
Bolognesi, Marianna; Pilgram, Roosmaryn; van den Heerik, Romy
2017-12-01
Semantic feature norms (e.g., STIMULUS: car → RESPONSE:
Methodology for Generating Conflict Scenarios by Time Shifting Recorded Traffic Data
NASA Technical Reports Server (NTRS)
Paglione, Mike; Oaks, Robert; Bilimoria, Karl D.
2003-01-01
A methodology is presented for generating conflict scenarios that can be used as test cases to estimate the operational performance of a conflict probe. Recorded air traffic data is time shifted to create traffic scenarios featuring conflicts with characteristic properties similar to those encountered in typical air traffic operations. First, a reference set of conflicts is obtained from trajectories that are computed using birth points and nominal flight plans extracted from recorded traffic data. Distributions are obtained for several primary properties (e.g., encounter angle) that are most likely to affect the performance of a conflict probe. A genetic algorithm is then utilized to determine the values of time shifts for the recorded track data so that the primary properties of conflicts generated by the time shifted data match those of the reference set. This methodology is successfully demonstrated using recorded traffic data for the Memphis Air Route Traffic Control Center; a key result is that the required time shifts are less than 5 min for 99% of the tracks. It is also observed that close matching of the primary properties used in this study additionally provides a good match for some other secondary properties.
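A minimal sketch of the time-shift search: a toy "conflict property" generator stands in for re-running conflict detection on shifted tracks, and a simple genetic algorithm evolves per-track shifts (bounded at 5 min, echoing the paper's result) so that the property histogram matches a reference distribution. The operators and the stand-in generator are simplifications.

```python
# Minimal sketch: GA evolving per-track time shifts to match a reference
# distribution of encounter angles. The conflict generator is hypothetical.
import numpy as np

rng = np.random.default_rng(4)
n_tracks, pop_size = 50, 40
reference = rng.uniform(0, 180, 500)           # reference encounter angles (toy)

def conflict_angles(shifts):
    # Stand-in for recomputing conflicts from time-shifted track data.
    return (shifts[:, None] * 7.3 + rng.uniform(0, 180, (len(shifts), 10))) % 180

def fitness(shifts):
    sample = conflict_angles(shifts).ravel()
    h1, _ = np.histogram(sample, bins=18, range=(0, 180), density=True)
    h2, _ = np.histogram(reference, bins=18, range=(0, 180), density=True)
    return -np.abs(h1 - h2).sum()              # simplified distribution distance

pop = rng.uniform(-300, 300, (pop_size, n_tracks))      # shifts in seconds, |s| <= 5 min
for _ in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]  # truncation selection
    children = parents + rng.normal(0, 10, parents.shape)   # Gaussian mutation
    pop = np.vstack([parents, np.clip(children, -300, 300)])
print("best fitness:", max(fitness(ind) for ind in pop))
```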
A quantitative metric to identify critical elements within seafood supply networks.
Plagányi, Éva E; van Putten, Ingrid; Thébaud, Olivier; Hobday, Alistair J; Innes, James; Lim-Camacho, Lilly; Norman-López, Ana; Bustamante, Rodrigo H; Farmery, Anna; Fleming, Aysha; Frusher, Stewart; Green, Bridget; Hoshino, Eriko; Jennings, Sarah; Pecl, Gretta; Pascoe, Sean; Schrobback, Peggy; Thomas, Linda
2014-01-01
A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those elements with large throughput rates, as well as greater connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions as well as under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains based on whether there is a fairly even spread in the individual scores of the top few key elements, compared with a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to an additional four real-world Australian commercial fishery and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically-based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical.
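Since the abstract describes the SCI only qualitatively (throughput combined with connectivity, summed over the chain), the sketch below assumes a simple product form for the element score; the element names echo the lobster case study, but the numbers are invented.

```python
# Minimal sketch of a Supply Chain Index: score each element by throughput
# times connectivity, then sum for a chain-level metric (assumed formula).
elements = {
    # name: (annual throughput in tonnes, number of connections) -- toy values
    "fishers":    (1200, 3),
    "processors": (1100, 5),
    "airports":   (1000, 6),
    "exporters":  (900, 4),
    "consumers":  (850, 2),
}

scores = {name: tput * links for name, (tput, links) in elements.items()}
sci = sum(scores.values())
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {s:7d}  ({s / sci:.0%} of chain SCI)")
print("chain SCI:", sci)
```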
Design and development of a virtual reality simulator for advanced cardiac life support training.
Vankipuram, Akshay; Khanal, Prabal; Ashby, Aaron; Vankipuram, Mithra; Gupta, Ashish; DrummGurnee, Denise; Josey, Karen; Smith, Marshall
2014-07-01
The use of virtual reality (VR) training tools for medical education could lead to improvements in the skills of clinicians while providing economic incentives for healthcare institutions. The use of VR tools can also mitigate some of the drawbacks currently associated with providing medical training in a traditional clinical environment, such as scheduling conflicts and the need for specialized equipment (e.g., high-fidelity manikins). This paper presents the details of the framework and the development methodology associated with a VR-based training simulator for advanced cardiac life support, a time-critical, team-based medical scenario. In addition, we also report the key findings of a usability study conducted to assess the efficacy of various features of this VR simulator through a post-use questionnaire administered to various care providers. The usability questionnaires were completed by two groups that used two different versions of the VR simulator: one group used the VR trainer with all its features, and the other used a minified version with certain immersive features disabled. We found an increase in usability scores from the minified group to the full VR group.
Automatic Feature Selection and Improved Classification in SICADA Counterfeit Electronics Detection
2017-03-20
The SICADA methodology was developed to detect such counterfeit microelectronics by collecting power side channel data and applying machine learning...to identify counterfeits. This methodology has been extended to include a two-step automated feature selection process and now uses a one-class SVM...classifier. We describe this methodology and show results for empirical data collected from several types of Microchip dsPIC33F microcontrollers
The colloquial approach: An active learning technique
NASA Astrophysics Data System (ADS)
Arce, Pedro
1994-09-01
This paper addresses the very important problem of the effectiveness of teaching methodologies in fundamental engineering courses such as transport phenomena. An active learning strategy, termed the colloquial approach, is proposed in order to increase student involvement in the learning process. This methodology is a considerable departure from traditional methods that use solo lecturing. It is based on guided discussions, and it promotes student understanding of new concepts by directing the student to construct new ideas by building upon the current knowledge and by focusing on key cases that capture the essential aspects of new concepts. The colloquial approach motivates the student to participate in discussions, to develop detailed notes, and to design (or construct) his or her own explanation for a given problem. This paper discusses the main features of the colloquial approach within the framework of other current and previous techniques. Problem-solving strategies and the need for new textbooks and for future investigations based on the colloquial approach are also outlined.
Technological advancements and their importance for nematode identification
NASA Astrophysics Data System (ADS)
Ahmed, Mohammed; Sapp, Melanie; Prior, Thomas; Karssen, Gerrit; Back, Matthew Alan
2016-06-01
Nematodes represent a species-rich and morphologically diverse group of metazoans known to inhabit both aquatic and terrestrial environments. Their role as biological indicators and as key players in nutrient cycling has been well documented. Some plant-parasitic species are also known to cause significant losses to crop production. In spite of this, there still exists a huge gap in our knowledge of their diversity, owing to the amount of time and expertise often involved in characterising species using phenotypic features. Molecular methodology provides a useful means of complementing the limited number of reliable diagnostic characters available for morphology-based identification. We discuss herein some of the limitations of traditional taxonomy and how molecular methodologies, especially the use of high-throughput sequencing, have assisted in carrying out large-scale nematode community studies and the characterisation of phytonematodes through rapid identification of multiple taxa. We also provide brief descriptions of some of the current and almost-outdated high-throughput sequencing platforms and their applications in both plant nematology and soil ecology.
A 3D model retrieval approach based on Bayesian networks lightfield descriptor
NASA Astrophysics Data System (ADS)
Xiao, Qinhan; Li, Yanjun
2009-12-01
A new 3D model retrieval methodology is proposed by exploiting a novel Bayesian networks lightfield descriptor (BNLD). There are two key novelties in our approach: (1) a BN-based method for building the lightfield descriptor; and (2) a 3D model retrieval scheme based on the proposed BNLD. To overcome the disadvantages of existing 3D model retrieval methods, we explore BNs for building a new lightfield descriptor. First, the 3D model is placed into a lightfield, where about 300 binary views are obtained along a sphere; Fourier descriptors and Zernike moment descriptors are then calculated from these binary views, and the shape feature sequence is learned into a BN model using a BN learning algorithm. Second, we propose a new 3D model retrieval method that calculates the Kullback-Leibler Divergence (KLD) between BNLDs. Benefiting from statistical learning, our BNLD is more robust to noise than existing methods. A comparison between our method and the lightfield descriptor-based approach is conducted to demonstrate the effectiveness of the proposed methodology.
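The retrieval step reduces to ranking stored descriptors by KLD against the query's descriptor. A minimal sketch with toy histogram descriptors standing in for the learned BN parameters:

```python
# Minimal sketch: rank stored models by Kullback-Leibler divergence between
# descriptor distributions (toy histograms, not actual BNLD parameters).
import numpy as np
from scipy.stats import entropy   # entropy(p, q) computes KL(p || q)

rng = np.random.default_rng(5)
def norm(v):
    return v / v.sum()

query = norm(rng.random(64))                     # query model's descriptor (toy)
database = {f"model_{i}": norm(rng.random(64)) for i in range(100)}

ranked = sorted(database.items(), key=lambda kv: entropy(query, kv[1]))
for name, desc in ranked[:3]:
    print(name, f"KLD = {entropy(query, desc):.4f}")
```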
Celluloid devils: a research study of male nurses in feature films.
Stanley, David
2012-11-01
To report a study of how male nurses are portrayed in feature films. It was hypothesized that male nurses are frequently portrayed negatively or stereotypically in the film media, potentially having a negative impact on male nurse recruitment and the public's perception of male nurses. An interpretive, qualitative methodology guided by insights into hegemonic masculinity and structured around a set of collective case studies (films) was used to examine the portrayal of male nurses in feature films made in the Western world from 1900 to 2007. Over 36,000 feature film synopses were reviewed (via CINAHL, ProQuest and relevant movie-specific literature) for the keywords 'nurse' and 'nursing', with an additional search of films from 1900 to 2010 for the term 'male nurse'. Identified films were labelled as 'cases' and analysed collectively to determine key attributes related to men in nursing and to explore the emergence of concepts and themes related to the image of male nurses in films. A total of 13 relevant cases (feature films) were identified, 12 of them made in the USA. Most films portrayed male nurses negatively and in ways opposed to hegemonic masculinity: as effeminate, homosexual, homicidal, corrupt or incompetent. Few film images of male nurses show them in traditional masculine roles or as clinically competent or self-confident professionals. Feature films predominantly portray male nurses negatively. Given the popularity of feature films, there may be negative effects on recruitment and on the public's perception of male nurses. © 2012 Blackwell Publishing Ltd.
A new approach for minimum phase output definition
NASA Astrophysics Data System (ADS)
Jahangiri, Fatemeh; Talebi, Heidar Ali; Menhaj, Mohammad Bagher; Ebenbauer, Christian
2017-01-01
This paper presents a novel method for output redefinition for linear systems. The approach also determines possible relative degrees for the system corresponding to any new output vector. To guarantee the minimum phase property with a prescribed relative degree, a set of new conditions is introduced. A key feature of these conditions is that no transformations of any kind are needed, which makes the scheme suitable for optimisation problems in control that must ensure the minimum phase property. Moreover, the results are useful for sensor placement problems and for obtaining minimum phase approximations of non-minimum phase systems. Numerical examples, including an example of unmanned aerial vehicle systems, are given to demonstrate the effectiveness of the methodology.
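For context, the minimum phase property at stake here can be checked numerically for a given transfer function by inspecting its zeros; a short sketch with an arbitrary illustrative system:

```python
# Minimal sketch: a continuous-time LTI output is minimum phase when all
# transmission zeros lie in the open left half-plane (illustrative system).
import numpy as np

num = [1, 3, 2]              # zeros of s^2 + 3s + 2 are s = -1, -2
den = [1, 5, 6, 1]           # an arbitrary stable denominator

zeros = np.roots(num)
poles = np.roots(den)
print("zeros:", zeros, "poles:", poles)
print("minimum phase:", bool(np.all(zeros.real < 0)))
```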
NASA Technical Reports Server (NTRS)
Moe, Karen L.; Perkins, Dorothy C.; Szczur, Martha R.
1987-01-01
The user support environment (USE) which is a set of software tools for a flexible standard interactive user interface to the Space Station systems, platforms, and payloads is described in detail. Included in the USE concept are a user interface language, a run time environment and user interface management system, support tools, and standards for human interaction methods. The goals and challenges of the USE are discussed as well as a methodology based on prototype demonstrations for involving users in the process of validating the USE concepts. By prototyping the key concepts and salient features of the proposed user interface standards, the user's ability to respond is greatly enhanced.
How Methodological Features Affect Effect Sizes in Education
ERIC Educational Resources Information Center
Cheung, Alan; Slavin, Robert
2016-01-01
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Qualitative Research in PBL in Health Sciences Education: A Review
ERIC Educational Resources Information Center
Jin, Jun; Bridges, Susan
2016-01-01
Context: Qualitative methodologies are relatively new in health sciences education research, especially in the area of problem-based learning (PBL). A key advantage of qualitative approaches is the ability to gain in-depth, textured insights into educational phenomena. Key methodological issues arise, however, in terms of the strategies of…
The impact of feature selection on one and two-class classification performance for plant microRNAs.
Khalifa, Waleed; Yousef, Malik; Saçar Demirci, Müşerref Duygu; Allmer, Jens
2016-01-01
MicroRNAs (miRNAs) are short nucleotide sequences that form a typical hairpin structure which is recognized by a complex enzyme machinery. This ultimately leads to the incorporation of 18-24 nt long mature miRNAs into RISC, where they act as recognition keys to aid in the regulation of target mRNAs. Determining miRNAs experimentally is laborious, and machine learning is therefore used to complement such endeavors. The success of machine learning mostly depends on proper input data and appropriate features for parameterization of the data. Two-class classification (TCC) is generally used in the field, but because negative examples are hard to come by, one-class classification (OCC) has been tried for pre-miRNA detection. Since both positive and negative examples are currently somewhat limited, feature selection can prove to be vital for furthering the field of pre-miRNA detection. In this study, we compare the performance of OCC and TCC using eight feature selection methods and seven different plant species providing positive pre-miRNA examples. Feature selection was very successful for OCC, where the best feature selection method achieved an average accuracy of 95.6%, thereby being ∼29% better than the worst method, which achieved 66.9% accuracy. While the performance is comparable to TCC, which performs up to 3% better than OCC, TCC is much less affected by feature selection and its largest performance gap is ∼13%, which only occurs for two of the feature selection methodologies. We conclude that feature selection is crucially important for OCC and that it can perform on par with TCC given the proper set of features.
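The OCC/TCC comparison can be sketched with scikit-learn's OneClassSVM and SVC on a shared, pre-selected feature subset; the data, the selection method (univariate F-test), and the accuracy bookkeeping below are illustrative simplifications of the study's eight methods.

```python
# Minimal sketch comparing one-class and two-class classification on the same
# selected feature subset; features and labels are synthetic stand-ins.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import OneClassSVM, SVC

rng = np.random.default_rng(6)
X_pos = rng.normal(1.0, 1.0, (200, 50))        # known pre-miRNAs (toy)
X_neg = rng.normal(0.0, 1.0, (200, 50))        # pseudo hairpins (toy)
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 200 + [0] * 200)

sel = SelectKBest(f_classif, k=10).fit(X, y)   # one possible selection method
Xs = sel.transform(X)

occ = OneClassSVM(nu=0.1).fit(sel.transform(X_pos))   # trained on positives only
tcc = SVC().fit(Xs, y)                                # trained on both classes
occ_acc = np.mean((occ.predict(Xs) == 1) == (y == 1)) # +1 = predicted positive
print(f"OCC accuracy {occ_acc:.2f}, TCC accuracy {tcc.score(Xs, y):.2f}")
```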
Ensemble of Chaotic and Naive Approaches for Performance Enhancement in Video Encryption.
Chandrasekaran, Jeyamala; Thiruvengadam, S J
2015-01-01
Owing to the growth of high performance network technologies, multimedia applications over the Internet are increasing exponentially. Applications like video conferencing, video-on-demand, and pay-per-view depend upon encryption algorithms for providing confidentiality. Video communication is characterized by distinct features such as large volume, high redundancy between adjacent frames, video codec compliance, syntax compliance, and application specific requirements. Naive approaches for video encryption encrypt the entire video stream with conventional text based cryptographic algorithms. Although naive approaches are the most secure for video encryption, the computational cost associated with them is very high. This research work aims at enhancing the speed of naive approaches through chaos based S-box design. Chaotic equations are popularly known for randomness, extreme sensitivity to initial conditions, and ergodicity. The proposed methodology employs two-dimensional discrete Henon map for (i) generation of dynamic and key-dependent S-box that could be integrated with symmetric algorithms like Blowfish and Data Encryption Standard (DES) and (ii) generation of one-time keys for simple substitution ciphers. The proposed design is tested for randomness, nonlinearity, avalanche effect, bit independence criterion, and key sensitivity. Experimental results confirm that chaos based S-box design and key generation significantly reduce the computational cost of video encryption with no compromise in security.
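One building block, the key-dependent S-box derived from the two-dimensional Henon map, can be sketched as follows. The classic parameters a = 1.4, b = 0.3 and the ranking construction are assumptions in the spirit of the abstract; key-derived initial conditions must lie in the attractor's basin to avoid divergence.

```python
# Minimal sketch of a key-dependent S-box built from the 2-D Henon map:
# x_{n+1} = 1 - a*x_n^2 + y_n,  y_{n+1} = b*x_n  (illustrative construction).
import numpy as np

def henon_sbox(key_x=0.1, key_y=0.1, a=1.4, b=0.3, burn_in=1000):
    x, y = key_x, key_y                        # key seeds the initial conditions
    for _ in range(burn_in):                   # discard the transient
        x, y = 1 - a * x * x + y, b * x
    samples = []
    for _ in range(256):
        x, y = 1 - a * x * x + y, b * x
        samples.append(x)
    # Rank the chaotic samples to obtain a bijective byte permutation.
    return np.argsort(samples).astype(np.uint8)

sbox = henon_sbox()
assert sorted(sbox) == list(range(256))        # bijective substitution table
print("S-box head:", sbox[:8])
```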
Secure image retrieval with multiple keys
NASA Astrophysics Data System (ADS)
Liang, Haihua; Zhang, Xinpeng; Wei, Qiuhan; Cheng, Hang
2018-03-01
This article proposes a secure image retrieval scheme under a multiuser scenario. In this scheme, the owner first encrypts and uploads images and their corresponding features to the cloud; then, the user submits the encrypted feature of the query image to the cloud; next, the cloud compares the encrypted features and returns encrypted images with similar content to the user. To find the nearest neighbor in the encrypted features, an encryption with multiple keys is proposed, in which the query feature of each user is encrypted by his/her own key. To improve the key security and space utilization, global optimization and Gaussian distribution are, respectively, employed to generate multiple keys. The experiments show that the proposed encryption can provide effective and secure image retrieval for each user and ensure confidentiality of the query feature of each user.
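A distance-preserving building block that such schemes commonly rely on is encrypting feature vectors with a secret orthogonal rotation, so the cloud can compare features it cannot read. The sketch below shows only this single-key primitive; the paper's multi-key machinery (per-user keys via global optimization and Gaussian distribution) is not reproduced.

```python
# Minimal sketch: a secret orthogonal rotation encrypts features while
# preserving Euclidean distances, enabling encrypted nearest-neighbor search.
import numpy as np

rng = np.random.default_rng(7)
d = 128
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))   # secret orthogonal key matrix

owner_feats = rng.normal(size=(1000, d))       # image features (toy)
query = owner_feats[42] + rng.normal(0, 0.01, d)

enc_db = owner_feats @ Q                       # encrypted database features
enc_q = query @ Q                              # encrypted query feature
dists = np.linalg.norm(enc_db - enc_q, axis=1) # distances unchanged by rotation
print("best match:", int(np.argmin(dists)))    # recovers image 42
```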
A practical guide to assessing clinical decision-making skills using the key features approach.
Farmer, Elizabeth A; Page, Gordon
2005-12-01
This paper in the series on professional assessment provides a practical guide to writing key features problems (KFPs). Key features problems test clinical decision-making skills in written or computer-based formats. They are based on the concept of critical steps or 'key features' in decision making and represent an advance on the older, less reliable patient management problem (PMP) formats. The practical steps in writing these problems are discussed and illustrated by examples. Steps include assembling problem-writing groups, selecting a suitable clinical scenario or problem and defining its key features, writing the questions, selecting question response formats, preparing scoring keys, reviewing item quality and item banking. The KFP format provides educators with a flexible approach to testing clinical decision-making skills with demonstrated validity and reliability when constructed according to the guidelines provided.
Ghost Hunting as a Means to Illustrate Scientific Methodology and Enhance Critical Thinking
ERIC Educational Resources Information Center
Rockwell, Steven C.
2012-01-01
The increasing popularity of television shows featuring paranormal investigations has led to a renewed enthusiasm in ghost hunting activities, and belief in the paranormal in general. These shows typically feature a group of investigators who, while claiming to utilize proper scientifically correct methodologies, violate many core scientific…
Eye-gaze and intent: Application in 3D interface control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, J.C.; Goldberg, J.H.
1993-06-01
Computer interface control is typically accomplished with an input 'device' such as a keyboard, mouse, trackball, etc. An input device translates a user's input actions, such as mouse clicks and key presses, into appropriate computer commands. To control the interface, the user must first convert intent into the syntax of the input device. A more natural means of computer control is possible when the computer can directly infer user intent, without need of intervening input devices. We describe an application of eye-gaze-contingent control of an interactive three-dimensional (3D) user interface. A salient feature of the user interface is natural input, with a heightened impression of controlling the computer directly by the mind. With this interface, input of rotation and translation are intuitive, whereas other abstract features, such as zoom, are more problematic to match with user intent. This paper describes successes with implementation to date, and ongoing efforts to develop a more sophisticated intent inferencing methodology.
Mapping cattle trade routes in southern Somalia: a method for mobile livestock keeping systems.
Tempia, S; Braidotti, F; Aden, H H; Abdulle, M H; Costagli, R; Otieno, F T
2010-12-01
The Somali economy is the only one in the world in which more than half the population is dependent on nomadic pastoralism. Trade typically involves drovers trekking animals over long distances to markets. A pilot approach for mapping trade routes was undertaken, using the Afmadow to Garissa routes in southern Somalia. The methodology included conducting a workshop with traders to gather preliminary information about the most-used routes and general husbandry practices and training selected drovers to collect data about key features along the routes, using hand-held global positioning system (GPS) devices, radio collar GPS and pictorial data forms. Collected data were then integrated into geographic information systems for analysis. The resultant spatial maps describe the Afmadow to Garissa routes, the speed of livestock movement along these routes and relevant environmental and social features affecting this speed. These data are useful for identifying critical control points for health screening along the routes, which may enable the establishment of a livestock certification system in nomadic pastoral environments.
Electromyogenic Artifacts and Electroencephalographic Inferences Revisited
McMenamin, Brenton W.; Shackman, Alexander J.; Greischar, Lawrence L.; Davidson, Richard J.
2010-01-01
Recent years have witnessed a renewed interest in using oscillatory brain electrical activity to understand the neural bases of cognition and emotion. Electrical signals originating from pericranial muscles represent a profound threat to the validity of such research. Recently, McMenamin et al. (2010) examined whether independent component analysis (ICA) provides a sensitive and specific means of correcting electromyogenic (EMG) artifacts. This report sparked the accompanying commentary (Olbrich, Jödicke, Sander, Himmerich & Hegerl, in press), and here we revisit the question of how EMG can alter inferences drawn from the EEG and what can be done to minimize its pernicious effects. Accordingly, we briefly summarize salient features of the EMG problem and review recent research investigating the utility of ICA for correcting EMG and other artifacts. We then directly address the key concerns articulated by Olbrich and colleagues and provide a critique of their efforts at validating ICA. We conclude by identifying key areas for future methodological work and offer some practical recommendations for intelligently addressing EMG artifact. PMID:20981275
Gyori, Miklos; Stefanik, Krisztina; Kanizsai-Nagy, Ildikó
2015-01-01
A growing body of evidence confirms that mobile digital devices have key potential as assistive/educational tools for people with autism spectrum disorders. The aim of this paper is to outline key aspects of development and evaluation methodologies that build on, and provide systematic evidence about, the effects of using such apps. We rely on the results of two R+D projects, both using quantitative and qualitative methods to support development and to evaluate the developed apps (n=54 and n=22). Analyzing the methodological conclusions from these studies, we outline some guidelines for an 'ideal' R+D methodology, but we also point to important trade-offs between the need for the best systematic evidence and the limitations on development time and costs. We see these trade-offs as a key issue to be resolved in this field.
2011-01-21
If the editors' intention was to produce a comprehensive textbook that will be of value to healthcare professionals interested in surgical research and improvements in health care, they have succeeded.
Self-adaptive MOEA feature selection for classification of bankruptcy prediction data.
Gaspar-Cunha, A; Recio, G; Costa, L; Estébanez, C
2014-01-01
Bankruptcy prediction is a vast area of finance and accounting whose importance lies in its relevance for creditors and investors in evaluating the likelihood that a company will go bankrupt. As companies become more complex, they develop sophisticated schemes to hide their real situation. In turn, estimating the credit risks associated with counterparties, or predicting bankruptcy, becomes harder. Evolutionary algorithms have been shown to be an excellent tool for dealing with complex problems in finance and economics where a large number of irrelevant features are involved. This paper provides a methodology for feature selection in the classification of bankruptcy data sets using an evolutionary multiobjective approach that simultaneously minimises the number of features and maximises the classifier quality measure (e.g., accuracy). The proposed methodology makes use of self-adaptation by applying the feature selection algorithm while simultaneously optimising the parameters of the classifier used. The methodology was applied to four different sets of data. The results obtained showed the utility of the self-adaptation of the classifier. PMID:24707201
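The entry above describes the two-objective trade-off at the heart of the approach: minimise the number of features while maximising classifier accuracy. As a rough, hypothetical illustration of that trade-off (not the authors' self-adaptive MOEA), the following Python sketch scores random feature subsets with a cross-validated SVM and keeps the Pareto-optimal ones; the synthetic dataset and classifier settings are placeholders.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=30, n_informative=6, random_state=0)

# Evaluate random feature subsets on two objectives:
# (number of selected features, cross-validated accuracy).
candidates = []
for _ in range(200):
    mask = rng.random(X.shape[1]) < rng.uniform(0.1, 0.9)
    if not mask.any():
        continue
    acc = cross_val_score(SVC(), X[:, mask], y, cv=3).mean()
    candidates.append((int(mask.sum()), acc, mask))

def dominates(a, b):
    # a dominates b: no more features, no less accuracy, strictly better in one.
    return a[0] <= b[0] and a[1] >= b[1] and (a[0] < b[0] or a[1] > b[1])

pareto = [c for i, c in enumerate(candidates)
          if not any(dominates(o, c) for j, o in enumerate(candidates) if j != i)]
for n_feat, acc, _ in sorted(pareto, key=lambda c: (c[0], -c[1])):
    print(f"{n_feat:2d} features -> CV accuracy {acc:.3f}")
```

A genuine MOEA would evolve the subsets and classifier parameters jointly rather than sampling them at random; this sketch only shows the shape of the resulting Pareto front.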
Marchand, Pascal; Garel, Mathieu; Bourgoin, Gilles; Duparc, Antoine; Dubray, Dominique; Maillard, Daniel; Loison, Anne
2017-03-01
Recent advances in animal ecology have enabled identification of certain mechanisms that lead to the emergence of territories and home ranges from movements considered as unbounded. Among them, memory and familiarity have been identified as key parameters in cognitive maps driving animal navigation, but have been only recently used in empirical analyses of animal movements. At the same time, the influence of landscape features on movements of numerous species and on space division in territorial animals has been highlighted. Despite their potential as exocentric information in cognitive maps and as boundaries for home ranges, few studies have investigated their role in the design of home ranges of non-territorial species. Using step selection analyses, we assessed the relative contribution of habitat characteristics, familiarity preferences and linear landscape features in movement step selection of 60 GPS-collared Mediterranean mouflon Ovis gmelini musimon × Ovis sp. monitored in southern France. Then, we evaluated the influence of these movement-impeding landscape features on the design of home ranges by testing for a non-random distribution of these behavioural barriers within sections of space differentially used by mouflon. We reveal that familiarity and landscape features are key determinants of movements, relegating to a lower level certain habitat constraints (e.g. food/cover trade-off) that we had previously identified as important for this species. Mouflon generally avoid crossing both anthropogenic (i.e. roads, tracks and hiking trails) and natural landscape features (i.e. ridges, talwegs and forest edges) while moving in the opposite direction, preferentially toward familiar areas. These specific behaviours largely depend on the relative position of each movement step regarding distance to the landscape features or level of familiarity in the surroundings. We also revealed cascading consequences on the design of home ranges in which most landscape features were excluded from cores and relegated to the peripheral areas. These results provide crucial information on landscape connectivity in a context of marked habitat fragmentation. They also call for more research on the role of landscape features in the emergence of home ranges in non-territorial species using recent methodological developments bridging the gap between movements and space use patterns. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.
Martin, François-Pierre J; Montoliu, Ivan; Kochhar, Sunil; Rezzi, Serge
2010-12-01
Over the past decade, the analysis of metabolic data with advanced chemometric techniques has offered the potential to explore functional relationships among biological compartments in relation to the structure and function of the intestine. However, the employed methodologies, generally based on regression modeling techniques, have given emphasis to region-specific metabolic patterns, while providing only limited insights into the spatiotemporal metabolic features of the complex gastrointestinal system. Hence, novel approaches are needed to analyze metabolic data to reconstruct the metabolic biological space associated with the evolving structures and functions of an organ such as the gastrointestinal tract. Here, we report the application of multivariate curve resolution (MCR) methodology to model metabolic relationships along the gastrointestinal compartments in relation to its structure and function using data from our previous metabonomic analysis. The method simultaneously summarizes metabolite occurrence and contribution to continuous metabolic signatures of the different biological compartments of the gut tract. This methodology sheds new light onto the complex web of metabolic interactions with gut symbionts that modulate host cell metabolism in surrounding gut tissues. In the future, such an approach will be key to provide new insights into the dynamic onset of metabolic deregulations involved in region-specific gastrointestinal disorders, such as Crohn's disease or ulcerative colitis.
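The abstract does not specify implementation details; as a minimal sketch of the MCR idea it describes (not the authors' pipeline), the following Python code factors a data matrix into non-negative "concentration" and "spectral" profiles by alternating least squares on synthetic data. Shapes, component count and the non-negativity constraint are assumptions.

```python
import numpy as np

# Minimal MCR-ALS sketch: factor a (samples x variables) data matrix D
# into non-negative concentration profiles C and spectral profiles S,
# D ~= C @ S.T, by alternating least squares with non-negativity clipping.
rng = np.random.default_rng(1)
n_samples, n_vars, n_comp = 40, 120, 3
C_true = np.abs(rng.normal(size=(n_samples, n_comp)))
S_true = np.abs(rng.normal(size=(n_vars, n_comp)))
D = C_true @ S_true.T + 0.01 * rng.normal(size=(n_samples, n_vars))

C = np.abs(rng.normal(size=(n_samples, n_comp)))  # random initial guess
for _ in range(200):
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)

residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"relative reconstruction error: {residual:.4f}")
```

In the gut application described above, the rows of D would be tissue samples ordered along the tract and the recovered C profiles would trace continuous metabolic signatures across compartments.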
A novel MALDI–TOF based methodology for genotyping single nucleotide polymorphisms
Blondal, Thorarinn; Waage, Benedikt G.; Smarason, Sigurdur V.; Jonsson, Frosti; Fjalldal, Sigridur B.; Stefansson, Kari; Gulcher, Jeffery; Smith, Albert V.
2003-01-01
A new MALDI–TOF based detection assay was developed for the analysis of single nucleotide polymorphisms (SNPs). It is a significant modification of the classic three-step minisequencing method, which includes a polymerase chain reaction (PCR), removal of excess nucleotides and primers, followed by primer extension in the presence of dideoxynucleotides using a modified thermostable DNA polymerase. The key feature of this novel assay is its reliance upon deoxynucleotide mixes lacking one of the nucleotides at the polymorphic position. During primer extension in the presence of depleted nucleotide mixes, standard thermostable DNA polymerases dissociate from the template at positions requiring a depleted nucleotide; this principle was harnessed to create a genotyping assay. The assay design requires a primer-extension primer having its 3′-end one nucleotide upstream of the interrogated site. The assay further utilizes the same DNA polymerase in both the PCR and the primer extension step. This not only simplifies the assay but also greatly reduces the cost per genotype compared with the minisequencing methodology. We demonstrate accurate genotyping using this methodology for two SNPs run in both singleplex and duplex reactions. We term this assay nucleotide depletion genotyping (NUDGE). Nucleotide depletion genotyping could be extended to other genotyping assays based on primer extension, such as detection by gel or capillary electrophoresis. PMID:14654708
Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This research addresses issues in simulation-based system-level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system-level analysis are discussed, and a methodology that addresses these issues is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, routing algorithms, and other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined, as is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage, when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms and hybrid simulation to reduce simulation time is introduced.
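As a toy sketch of the simulation-based fault injection idea described above (not DEPEND's actual interface, whose classes and policies are not given in the abstract), the following Python code injects random faults into simulated components under a simple cold-spare policy and reports when a system-level failure mode emerges.

```python
import random

# Toy fault-injection sketch: components process work each cycle; injected
# faults flip a component into a failed state, and a spare takes over while
# the sparing policy allows. Names and the policy are illustrative only.
class Component:
    def __init__(self, name):
        self.name, self.failed = name, False

def simulate(n_cycles=10000, fault_rate=0.01, n_spares=1, seed=0):
    rng = random.Random(seed)
    primaries = [Component(f"cpu{i}") for i in range(3)]
    spares = [Component(f"spare{i}") for i in range(n_spares)]
    for cycle in range(n_cycles):
        for i, comp in enumerate(primaries):
            if not comp.failed and rng.random() < fault_rate:
                comp.failed = True          # inject a fault
                if spares:                  # sparing policy: cold spare
                    primaries[i] = spares.pop()
        if all(c.failed for c in primaries):
            return cycle                    # system-level failure observed
    return None

print("cycle of system failure:", simulate())
```

The point mirrored from the abstract is that the failure modes are discovered by running the injected system, not declared in advance.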
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia
2015-04-26
Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.
Shao, Chaofeng; Tian, Xiaogang; Guan, Yang; Ju, Meiting; Xie, Qiang
2013-05-21
Selecting indicators based on the characteristics and development trends of a given study area is essential for building a framework for assessing urban ecological security. However, few studies have focused on how to select representative indicators systematically, and quantitative research is lacking. We developed an innovative quantitative modeling approach called the grey dynamic hierarchy analytic system (GDHAS) for both indicator selection and the quantitative assessment of urban ecological security. Next, a systematic methodology based on the GDHAS is developed to assess urban ecological security comprehensively and dynamically. This assessment includes indicator selection, driving force-pressure-state-impact-response (DPSIR) framework building, and quantitative evaluation. We applied this systematic methodology to assess the urban ecological security of Tianjin, a typical coastal super-megalopolis and industrial base in China. This case study highlights the key features of our approach. First, 39 representative indicators are selected for the evaluation index system from the 62 alternatives available, using the GDHAS. Second, the DPSIR framework is established based on the indicators selected, and a quantitative assessment of the eco-security of Tianjin is conducted. The results illustrate the following: the urban ecological security of Tianjin in 2008 was at the alert level but not very stable; the driving force and pressure subsystems were in good condition, but the eco-security levels of the remaining subsystems were relatively low; the pressure subsystem was the key to urban ecological security; and 10 indicators were identified as the key indicators for the five subsystems. These results can be used as the basis for urban eco-environmental management.
Model-based assist feature insertion for sub-40nm memory device
NASA Astrophysics Data System (ADS)
Suh, Sungsoo; Lee, Suk-joo; Choi, Seong-woon; Lee, Sung-Woo; Park, Chan-hoon
2009-04-01
Many issues need to be resolved before a production-worthy model-based assist feature insertion flow for single and double exposure patterning processes can extend low-k1 processing in 193 nm immersion technology. Model-based assist feature insertion is not trivial to implement for either single or double exposure patterning, compared with rule-based methods. As shown in Fig. 1, pixel-based mask inversion technology in itself has difficulties in mask writing and inspection, although it is one of the key technologies for extending single exposure for the contact layer. Thus far, inversion technology has been tried as a co-optimization of the target mask to simultaneously generate optimized main and sub-resolution assist features for a desired process window. Alternatively, the technology can also be used to optimize for a target feature after assist feature types are inserted, in order to simplify the mask complexity. Simplification of the inversion mask is a major issue in applying inversion technology to device development: even if smaller mask features can be fabricated, mask writing time is also a major factor. As shown in Figure 2, mask writing time may be a limiting factor in determining whether or not an inversion solution is viable. It can be reasoned that increased shot count relates to increased margin for the inversion methodology. On the other hand, there is a limit on how complex a mask can be in order to be production worthy. There is also source and mask co-optimization, which influences the final mask patterns and assist feature sizes and positions for a given target. In this study, we discuss assist feature insertion methods for sub-40 nm technology.
The role of health informatics in clinical audit: part of the problem or key to the solution?
Georgiou, Andrew; Pearson, Michael
2002-05-01
The concepts of quality assurance (of which clinical audit is an essential part), evaluation and clinical governance each depend on the ability to derive and record measurements that describe clinical performance. Rapid IT developments have raised many new possibilities for managing health care. They have allowed for easier collection and processing of data in greater quantities. These developments have encouraged the growth of quality assurance as a key feature of health care delivery. In the past, most of the emphasis has been on hospital information systems designed predominantly for the administration of patients and the management of financial performance. Large, hi-tech information system capacity does not guarantee quality information. The task of producing information that can be confidently used to monitor the quality of clinical care requires attention to key aspects of the design and operation of the audit. The Myocardial Infarction National Audit Project (MINAP) utilizes an IT-based system to collect and process data on large numbers of patients and makes them readily available to contributing hospitals. The project shows that IT systems that employ rigorous health informatics methodologies can do much to improve the monitoring and provision of health care.
Multi-modal image registration: matching MRI with histology
NASA Astrophysics Data System (ADS)
Alic, Lejla; Haeck, Joost C.; Klein, Stefan; Bol, Karin; van Tiel, Sandra T.; Wielopolski, Piotr A.; Bijster, Magda; Niessen, Wiro J.; Bernsen, Monique; Veenland, Jifke F.; de Jong, Marion
2010-03-01
Spatial correspondence between histology and multi-sequence MRI can provide information about the capabilities of non-invasive imaging to characterize cancerous tissue. However, the shrinkage and deformation occurring during the excision of the tumor and the histological processing complicate the co-registration of MR images with histological sections. This work proposes a methodology to establish a detailed 3D relation between histology sections and in vivo MRI tumor data. The key features of the methodology are a very dense histological sampling (up to 100 histology slices per tumor), mutual-information-based non-rigid B-spline registration, the utilization of whole 3D data sets, and the exploitation of an intermediate ex vivo MRI. In this proof-of-concept paper, the methodology was applied to one tumor. We found that, after registration, the visual alignment of tumor borders and internal structures was fairly accurate. Utilizing the intermediate ex vivo MRI, it was possible to account for changes caused by the excision of the tumor: we observed a tumor expansion of 20%. The effects of fixation, dehydration and histological sectioning could also be determined: 26% shrinkage of the tumor was found. The annotation of viable tissue, performed in histology and transformed to the in vivo MRI, matched clearly with high-intensity regions in MRI. With this methodology, histological annotation can be directly related to the corresponding in vivo MRI. This is a vital step for evaluating the feasibility of multi-spectral MRI to depict histological ground truth.
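A minimal sketch of the registration step described above, using SimpleITK's mutual-information metric with a B-spline transform, is given below. File names, mesh size and optimizer settings are placeholders, and the authors' full pipeline (the dense histological sampling and the intermediate ex vivo MRI stage) is not reproduced.

```python
import SimpleITK as sitk

# Mutual-information-driven non-rigid B-spline registration, in the spirit
# of the pipeline described above. All file names and settings are
# illustrative assumptions, not the authors' actual values.
fixed = sitk.ReadImage("invivo_mri.nii", sitk.sitkFloat32)
moving = sitk.ReadImage("histology_volume.nii", sitk.sitkFloat32)

transform = sitk.BSplineTransformInitializer(fixed, [8, 8, 8])
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.1)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsLBFGSB(numberOfIterations=100)
reg.SetInitialTransform(transform, inPlace=True)

final_tx = reg.Execute(fixed, moving)
aligned = sitk.Resample(moving, fixed, final_tx, sitk.sitkLinear, 0.0)
sitk.WriteImage(aligned, "histology_registered.nii")
```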
Pereira, Sérgio; Meier, Raphael; McKinley, Richard; Wiest, Roland; Alves, Victor; Silva, Carlos A; Reyes, Mauricio
2018-02-01
Machine learning systems are achieving better performances at the cost of becoming increasingly complex. However, because of that, they become less interpretable, which may cause some distrust by the end-user of the system. This is especially important as these systems are pervasively being introduced to critical domains, such as the medical field. Representation learning techniques are general methods for automatic feature computation. Nevertheless, these techniques are regarded as uninterpretable "black boxes". In this paper, we propose a methodology to enhance the interpretability of automatically extracted machine learning features. The proposed system is composed of a Restricted Boltzmann Machine for unsupervised feature learning, and a Random Forest classifier, which are combined to jointly consider existing correlations between imaging data, features, and target variables. We define two levels of interpretation: global and local. The former is devoted to understanding whether the system learned the relevant relations in the data correctly, while the latter is focused on predictions performed at the voxel and patient level. In addition, we propose a novel feature importance strategy that considers both imaging data and target variables, and we demonstrate the ability of the approach to leverage the interpretability of the obtained representation for the task at hand. We evaluated the proposed methodology in brain tumor segmentation and penumbra estimation in ischemic stroke lesions. We show the ability of the proposed methodology to unveil information regarding relationships between imaging modalities and extracted features, and their usefulness for the task at hand. In both clinical scenarios, we demonstrate that the proposed methodology enhances the interpretability of automatically learned features, highlighting specific learning patterns that resemble how an expert extracts relevant data from medical images. Copyright © 2017 Elsevier B.V. All rights reserved.
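As a hedged illustration of the two-stage design described above, the following scikit-learn sketch chains an RBM feature learner with a random forest and inspects the forest's importances over the learned features. The digits dataset and all hyperparameters are placeholders, not the authors' data or settings.

```python
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

# Two-stage stand-in: an RBM learns features without supervision, a random
# forest classifies and exposes feature importances over those features.
X, y = load_digits(return_X_y=True)
X = X / 16.0  # BernoulliRBM expects inputs scaled to [0, 1]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = Pipeline([
    ("rbm", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))

# "Global" interpretation hook: importance of each learned RBM feature.
importances = model.named_steps["rf"].feature_importances_
print("most important learned features:", importances.argsort()[-5:][::-1])
```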
Dedios, Maria Cecilia; Esperato, Alexo; De-Regil, Luz Maria; Peña-Rosas, Juan Pablo; Norris, Susan L
2017-03-21
Over the past decade, the World Health Organization (WHO) has implemented a standardized, evidence-informed guideline development process to assure technically sound and policy-relevant guidelines. This study is an independent evaluation of the adaptability of the guidelines produced by the Evidence and Programme Guidance unit at the Department of Nutrition for Health and Development (NHD). The study systematizes the lessons learned by the NHD group at WHO. We used a mixed methods approach to determine the adaptability of the nutrition guidelines. Adaptability was defined as having two components: methodological quality and implementability of guidelines. Additionally, we gathered recommendations to improve future guideline development in nutrition actions for health and development. Data sources for this evaluation were official documentation and feedback (both qualitative and quantitative) from key stakeholders involved in the development of nutrition guidelines. The qualitative data were collected through a desk review and two waves of semi-structured interviews (n = 12) and were analyzed through axial coding. Guideline adaptability was assessed quantitatively using two standardized instruments completed by key stakeholders. The Appraisal of Guidelines for Research and Evaluation instrument, version II, was used to assess guideline quality (n = 6), while implementability was assessed with the electronic version of the GuideLine Implementability Appraisal (n = 7). The nutrition evidence-informed guideline development process has several strengths, among them the appropriate management of conflicts of interest of guideline developers and the systematic use of high-quality evidence to inform the recommendations. These features contribute to increasing the methodological quality of the guidelines. The key areas for improvement are the limited implementability of the recommendations, the lack of explicit and precise implementation advice in the guidelines, and challenges related to collaborative work within interdisciplinary groups. Overall, our study found that the nutrition evidence-informed guidelines are of good methodological quality but that their implementability requires improvement. The recommendations to improve guideline adaptability address the guideline content, the dynamics shaping interdisciplinary work, and actions for implementation feasibility. As WHO relies heavily on a standardized procedure to develop guidelines, the lessons learned may be applicable to guideline development across the organization and to other groups developing guidelines.
Common Bolted Joint Analysis Tool
NASA Technical Reports Server (NTRS)
Imtiaz, Kauser
2011-01-01
Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that the program does not become a "black box" for the analyst. Built-in databases reduce the legwork required by the analyst. Each step is clearly identified and results are provided in numeric format, as well as color-coded spelled-out words, to draw the user's attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
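As a back-of-envelope illustration of one calculation a tool like comBAT automates, the sketch below estimates bolt preload from assembly torque via the common T = K·d·F relation and reports a tensile margin of safety. The torque coefficient, bolt geometry and strength values are illustrative assumptions, not values from the program.

```python
# Hypothetical single-bolt check: preload from torque, then tensile margin.
def bolt_margin(torque_Nm, K, d_m, area_m2, ult_strength_Pa, ext_load_N=0.0):
    preload = torque_Nm / (K * d_m)        # classic T = K * d * F relation
    bolt_load = preload + ext_load_N       # simplistic: full external load carried
    allowable = ult_strength_Pa * area_m2
    return allowable / bolt_load - 1.0     # MS >= 0 means positive margin

# Assumed 1/4 in class bolt: 8 N*m torque, K = 0.2 (dry), A ~ 2.4e-5 m^2,
# 1240 MPa ultimate strength, 2 kN external load.
print(f"margin of safety: {bolt_margin(8.0, 0.2, 0.00635, 2.4e-5, 1.24e9, 2000.0):.2f}")
```

A real joint analysis would also consider load-sharing stiffness, gapping, shear, and fatigue, which is precisely the bookkeeping the tool above packages.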
NASA Astrophysics Data System (ADS)
Mendel, Kayla R.; Li, Hui; Sheth, Deepa; Giger, Maryellen L.
2018-02-01
With the growing adoption of digital breast tomosynthesis (DBT) in breast cancer screening protocols, it is important to compare the performance of computer-aided diagnosis (CAD) of breast lesions on DBT images with that on conventional full-field digital mammography (FFDM). In this study, we retrospectively collected FFDM and DBT images of 78 lesions from 76 patients, each lesion biopsy-proven as either malignant or benign. A square region of interest (ROI) was placed to fully cover the lesion on each FFDM image, DBT synthesized 2D image, and DBT key slice image in the cranial-caudal (CC) and mediolateral-oblique (MLO) views. Features were extracted from each ROI using a pre-trained convolutional neural network (CNN). These features were then input to a support vector machine (SVM) classifier, and the area under the ROC curve (AUC) was used as the figure of merit. We found that in both the CC and MLO views, the synthesized 2D image performed best (AUC = 0.814 and AUC = 0.881, respectively) in the task of lesion characterization. The small database size was a key limitation of this study and could lead to overfitting in the application of the SVM classifier. In future work, we plan to expand this dataset and to explore more robust deep learning methodology such as fine-tuning.
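A minimal sketch of the classification stage is shown below, assuming the CNN features for each lesion ROI have already been extracted into an array; random data stands in for the real features. The cross-validated AUC computation mirrors the figure of merit named above.

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder stand-ins for pre-trained CNN features of 78 lesion ROIs.
rng = np.random.default_rng(0)
features = rng.normal(size=(78, 512))
labels = rng.integers(0, 2, size=78)             # 1 = malignant, 0 = benign

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
# Cross-validation matters with only 78 lesions; the small sample is exactly
# the overfitting risk the authors flag.
scores = cross_val_predict(clf, features, labels, cv=5, method="predict_proba")[:, 1]
print("AUC:", roc_auc_score(labels, scores))
```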
Medical cost analysis: application to colorectal cancer data from the SEER Medicare database.
Bang, Heejung
2005-10-01
Incompleteness is a key feature of most survival data. Numerous well-established statistical methodologies and algorithms exist for analyzing life or failure time data. However, induced censoring invalidates the use of those standard analytic tools for some survival-type data such as medical costs. In this paper, some valid methods currently available for analyzing censored medical cost data are reviewed. Some cautionary findings under different assumptions are illustrated through application to medical costs from colorectal cancer patients. Cost analysis should be suitably planned and carefully interpreted under various meaningful scenarios, even with judiciously selected statistical methods. This approach would be greatly helpful to policy makers who seek to prioritize health care expenditures and to assess the elements of resource use.
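One family of valid methods alluded to above weights complete (uncensored) cases by the estimated probability of remaining uncensored. The following numpy sketch, on simulated data, contrasts the naive complete-case mean cost with such an inverse-probability-weighted estimator; it is a generic illustration, not the paper's specific analysis.

```python
import numpy as np

# Weighted complete-case estimator for mean cost under censoring: complete
# cases are weighted by 1/K(T_i), where K is the Kaplan-Meier survivor
# function of the *censoring* time. All data are simulated.
rng = np.random.default_rng(0)
n = 2000
death = rng.exponential(3.0, n)
censor = rng.exponential(5.0, n)
followup = np.minimum(death, censor)
observed = death <= censor                    # True if cost fully observed
cost = 1000.0 * death                         # cost accrues with lifetime

# Kaplan-Meier estimate of K(t) = P(censoring time > t).
order = np.argsort(followup)
t, d = followup[order], (~observed)[order]    # "event" here is being censored
at_risk = n - np.arange(n)
K = np.cumprod(1.0 - d / at_risk)
K_at = np.interp(followup, t, K)

naive = cost[observed].mean()                 # biased toward short, cheap lives
ipw = np.sum(cost[observed] / K_at[observed]) / n
print(f"naive {naive:.0f}  IPW {ipw:.0f}  truth {1000 * 3.0:.0f}")
```

The naive mean understates the true cost because uncensored cases skew toward shorter survival, which is the induced-censoring pitfall the paper warns about.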
SURF Model Calibration Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menikoff, Ralph
2017-03-10
SURF and SURFplus are high explosive reactive burn models for shock initiation and propagation of detonation waves. They are engineering models motivated by the ignition & growth concept of hot spots and, for SURFplus, a second slow reaction for the energy release from carbon clustering. A key feature of the SURF model is a partial decoupling between model parameters and detonation properties. This enables reduced sets of independent parameters to be calibrated sequentially for the initiation and propagation regimes. Here we focus on a methodology for fitting the initiation parameters to Pop plot data, based on 1-D simulations to compute a numerical Pop plot. In addition, the strategy for fitting the remaining parameters for the propagation regime and failure diameter is discussed.
Advanced Propulsion and TPS for a Rapidly-Prototyped CEV
NASA Astrophysics Data System (ADS)
Hudson, Gary C.
2005-02-01
Transformational Space Corporation (t/Space) is developing for NASA the initial designs for the Crew Exploration Vehicle family, focusing on a Launch CEV for transporting NASA and civilian passengers from Earth to orbit. The t/Space methodology is rapid prototyping of major vehicle systems, and deriving detailed specifications from the resulting hardware, avoiding "written-in-advance" specs that can force the costly invention of new capabilities simply to meet such specs. A key technology shared by the CEV family is Vapor Pressurized propulsion (Vapak) for simplicity and reliability, which provides electrical power, life support gas and a heat sink in addition to propulsion. The CEV family also features active transpiration cooling of re-entry surfaces (for reusability) backed up by passive thermal protection.
Multiagent robotic systems' ambient light sensor
NASA Astrophysics Data System (ADS)
Iureva, Radda A.; Maslennikov, Oleg S.; Komarov, Igor I.
2017-05-01
Swarm robotics is one of the fastest growing areas of modern technology. As a subclass of multi-agent systems, it inherits much of the scientific and methodological apparatus for constructing and operating practically useful complexes that consist of relatively autonomous, independent agents. Ambient light sensors (ALS) are widely used in robotics. In swarm robotics, a developing technology with many specific features, it is important to equip each robot with such sensors not only to help it orient itself, but also to let it follow light emitted by a leader robot or find a goal more easily. Key words: ambient light sensor, swarm system, multi-agent system, robotic system, robotic complexes, simulation modelling
Aresi, Giovanni; Fattori, Francesco; Pozzi, Maura; Moore, Simon C
2016-09-01
The aim was to explore shared representations of alcohol use in students who were to travel abroad to study. Focus group data from Italian students (N = 69) were collected. Analyses used Grounded Theory Methodology and were informed by the four key components of Social Representation Theory (cognition, emotion, attitude and behavioural intentions). The study abroad experience was described as one that would involve an increase in alcohol consumption compared to pre-departure levels. Reasons given included greater social and leisure opportunities involving alcohol, reduced social control and features of the host country environment. Opportunities to intervene and address risky alcohol use in this group are discussed.
Structural zeros in high-dimensional data with applications to microbiome studies.
Kaul, Abhishek; Davidov, Ori; Peddada, Shyamal D
2017-07-01
This paper is motivated by the recent interest in the analysis of high-dimensional microbiome data. A key feature of these data is the presence of "structural zeros", which are microbes missing from an observation vector due to an underlying biological process and not due to error in measurement. Typical notions of missingness are unable to model these structural zeros. We define a general framework which allows for structural zeros in the model and propose methods of estimating sparse high-dimensional covariance and precision matrices under this setup. We establish error bounds in the spectral and Frobenius norms for the proposed estimators and empirically verify them with a simulation study. The proposed methodology is illustrated by applying it to the global gut microbiome data of Yatsunenko and others (2012. Human gut microbiome viewed across age and geography. Nature 486, 222-227). Using our methodology, we classify subjects according to geographical location on the basis of their gut microbiome. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Danon, Leon; Brooks-Pollock, Ellen
2016-09-01
In their review, Chowell et al. consider the ability of mathematical models to predict early epidemic growth [1]. In particular, they question the central prediction of classical differential equation models that the number of cases grows exponentially during the early stages of an epidemic. Using examples including HIV and Ebola, they argue that classical models fail to capture key qualitative features of early growth and describe a selection of models that do capture non-exponential epidemic growth. An implication of this failure is that predictions may be inaccurate and unusable, highlighting the need for care when embarking upon modelling using classical methodology. There remains a lack of understanding of the mechanisms driving many observed epidemic patterns; we argue that data science should form a fundamental component of epidemic modelling, providing a rigorous methodology for data-driven approaches, rather than trying to enforce established frameworks. The need for refinement of classical models provides a strong argument for the use of data science, to identify qualitative characteristics and pinpoint the mechanisms responsible for the observed epidemic patterns.
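The contrast drawn above can be made concrete with the generalized growth model dC/dt = r*C^p, where p = 1 recovers classical exponential growth and 0 < p < 1 gives the sub-exponential early growth under discussion. A minimal scipy sketch, with arbitrary parameter values:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generalized growth model dC/dt = r * C**p: p = 1 is classical exponential
# growth; 0 < p < 1 yields polynomial-like (sub-exponential) early growth.
def growth(p, r=0.5, c0=1.0, t_max=20.0):
    sol = solve_ivp(lambda t, c: r * c**p, (0, t_max), [c0],
                    t_eval=np.linspace(0, t_max, 5))
    return sol.y[0]

print("exponential     (p=1.0):", np.round(growth(1.0)))
print("sub-exponential (p=0.6):", np.round(growth(0.6)))
```

The diverging cumulative-case trajectories illustrate why fitting only an exponential-growth model to early data can badly mislead projections.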
Supplement to a Methodology for Succession Planning for Technical Experts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirk, Bernadette Lugue; Cain, Ronald A.; Agreda, Carla L.
This report complements A Methodology for Succession Planning for Technical Experts (Ron Cain, Shaheen Dewji, Carla Agreda, Bernadette Kirk, July 2017), which describes a draft methodology for identifying and evaluating the loss of key technical skills at nuclear operations facilities. The present report focuses on the methodology for identifying critical skills, which is tested through interviews with selected subject matter experts.
NASA Astrophysics Data System (ADS)
Lakshmi, A.; Faheema, A. G. J.; Deodhare, Dipti
2016-05-01
Pedestrian detection is a key problem in night vision processing, with dozens of applications that will positively impact the performance of autonomous systems. Despite significant progress, our study shows that the performance of state-of-the-art thermal image pedestrian detectors still has much room for improvement. The purpose of this paper is to overcome the challenges faced by thermal image pedestrian detectors that employ intensity-based Region Of Interest (ROI) extraction followed by feature-based validation. The most striking disadvantage of the first module, ROI extraction, is the failed detection of cloth-insulated parts. To overcome this setback, this paper employs a region-growing algorithm tuned to the scale of the pedestrian. The statistics subtended by a pedestrian vary drastically with scale, and a deviation-from-normality approach facilitates scale detection. Further, the paper offers an adaptive mathematical threshold to resolve the problem of subtracting the background while still extracting cloth-insulated parts. The inherent false positives of the ROI extraction module are limited by the choice of good features in the pedestrian validation step. One such feature is the curvelet feature, which has been used extensively in optical images but has no reported results in thermal images as yet. It is used here to arrive at a pedestrian detector with a reduced false positive rate. This work is the first venture to scrutinize the utility of curvelets for characterizing pedestrians in thermal images. An attempt has also been made to improve the speed of curvelet transform computation. The classification task is realized through the well known methodology of Support Vector Machines (SVMs). The proposed method is substantiated with qualified evaluation methodologies that permit probing and informative comparisons across state-of-the-art features, including deep learning methods, on six standard and in-house databases. With reference to deep learning, our algorithm exhibits comparable performance. More importantly, it has significantly lower compute and memory requirements, making it more relevant for deployment on resource-constrained platforms with significant size, weight and power constraints.
ERIC Educational Resources Information Center
Develaki, Maria
2016-01-01
Models and modeling are core elements of scientific methods and consequently also are of key importance for the conception and teaching of scientific methodology. The epistemology of models and its transfer and adaption to nature of science education are not, however, simple themes. We present some conceptual units in which school science models…
Second Language Listening Strategy Research: Methodological Challenges and Perspectives
ERIC Educational Resources Information Center
Santos, Denise; Graham, Suzanne; Vanderplank, Robert
2008-01-01
This paper explores methodological issues related to research into second language listening strategies. We argue that a number of central questions regarding research methodology in this line of enquiry are underexamined, and we engage in the discussion of three key methodological questions: (1) To what extent is a verbal report a valid and…
Automatic Road Sign Inventory Using Mobile Mapping Systems
NASA Astrophysics Data System (ADS)
Soilán, M.; Riveiro, B.; Martínez-Sánchez, J.; Arias, P.
2016-06-01
The periodic inspection of certain infrastructure features plays a key role in road network safety and preservation, and in developing optimal maintenance planning that minimizes the life-cycle cost of the inspected features. Mobile Mapping Systems (MMS) use laser scanner technology to collect dense and precise three-dimensional point clouds that gather both geometric and radiometric information of the road network. Furthermore, time-stamped RGB imagery synchronized with the MMS trajectory is also available. In this paper a methodology for the automatic detection and classification of road signs from point cloud and imagery data provided by a LYNX Mobile Mapper System is presented. First, road signs are detected in the point cloud. Subsequently, the inventory is enriched with geometrical and contextual data such as orientation or distance to the trajectory. Finally, semantic content is given to the detected road signs. As the point cloud resolution is insufficient for this step, RGB imagery is used: the 3D points are projected into the corresponding images and the RGB data within the bounding box defined by the projected points is analysed. The methodology was tested in urban and road environments in Spain, obtaining global recall results greater than 95% and F-scores greater than 90%. In this way, inventory data are obtained in a fast, reliable manner, and can be applied to improve the maintenance planning of the road network or to feed a Spatial Information System (SIS), so that road sign information is available for use in a Smart City context.
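A minimal sketch of the projection step described above, mapping detected 3D points into a synchronized RGB image through a pinhole camera model, is shown below. The intrinsic matrix and camera pose are placeholder values that an MMS would supply from calibration and its navigation solution.

```python
import numpy as np

# Project 3D point-cloud points (world frame) into a synchronized RGB image
# via a pinhole model. K, R and t are illustrative calibration placeholders.
K = np.array([[1400.0, 0, 960],
              [0, 1400.0, 540],
              [0, 0, 1]])
R, t = np.eye(3), np.array([0.0, 0.0, 0.0])     # camera pose in world frame

def project(points_world):
    cam = (points_world - t) @ R.T               # world -> camera coordinates
    cam = cam[cam[:, 2] > 0]                     # keep points in front of camera
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]                # pixel coordinates (u, v)

sign_points = np.array([[1.5, 0.2, 12.0], [1.6, 0.2, 12.0], [1.5, 0.9, 12.0]])
print(project(sign_points))  # min/max of these pixels gives the sign's bounding box
```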
Imai, Kosuke; Jiang, Zhichao
2018-04-29
The matched-pairs design enables researchers to efficiently infer causal effects from randomized experiments. In this paper, we exploit the key feature of the matched-pairs design and develop a sensitivity analysis for missing outcomes due to truncation by death, in which the outcomes of interest (e.g., quality of life measures) are not even well defined for some units (e.g., deceased patients). Our key idea is that if 2 nearly identical observations are paired prior to the randomization of the treatment, the missingness of one unit's outcome is informative about the potential missingness of the other unit's outcome under an alternative treatment condition. We consider the average treatment effect among always-observed pairs (ATOP) whose units exhibit no missing outcome regardless of their treatment status. The naive estimator based on available pairs is unbiased for the ATOP if 2 units of the same pair are identical in terms of their missingness patterns. The proposed sensitivity analysis characterizes how the bounds of the ATOP widen as the degree of the within-pair similarity decreases. We further extend the methodology to the matched-pairs design in observational studies. Our simulation studies show that informative bounds can be obtained under some scenarios when the proportion of missing data is not too large. The proposed methodology is also applied to the randomized evaluation of the Mexican universal health insurance program. An open-source software package is available for implementing the proposed research. Copyright © 2018 John Wiley & Sons, Ltd.
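As a small illustration of the naive available-pair estimator discussed above (which is unbiased for the ATOP when the two units of a pair share the same missingness pattern), the following numpy sketch computes it on simulated matched-pair data, with NaN marking outcomes truncated by death. The data-generating values are arbitrary.

```python
import numpy as np

# Within each matched pair, one unit is treated and one is control; a pair
# contributes only if both outcomes are observed (neither truncated by death).
rng = np.random.default_rng(0)
n_pairs = 500
y_treat = rng.normal(1.0, 1.0, n_pairs)
y_ctrl = rng.normal(0.0, 1.0, n_pairs)
y_treat[rng.random(n_pairs) < 0.10] = np.nan    # truncated treated outcomes
y_ctrl[rng.random(n_pairs) < 0.15] = np.nan     # truncated control outcomes

both = ~np.isnan(y_treat) & ~np.isnan(y_ctrl)   # the "available" pairs
atop_naive = (y_treat[both] - y_ctrl[both]).mean()
print(f"{both.sum()} available pairs, naive ATOP estimate {atop_naive:.3f}")
```

The paper's contribution is precisely what this sketch omits: bounds on the ATOP that widen as the within-pair similarity in missingness is relaxed.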
Amezquita-Sanchez, Juan P; Adeli, Anahita; Adeli, Hojjat
2016-05-15
Mild cognitive impairment (MCI) is a cognitive disorder characterized by memory impairment greater than expected for age. A new methodology is presented to identify MCI patients during a working memory task using MEG signals. The methodology consists of four steps: In step 1, the complete ensemble empirical mode decomposition (CEEMD) is used to decompose the MEG signal into a set of adaptive sub-bands according to its frequency content. In step 2, a nonlinear dynamics measure based on permutation entropy (PE) analysis is employed to analyze the sub-bands and detect features to be used for MCI detection. In step 3, an analysis of variance (ANOVA) is used for feature selection. In step 4, the enhanced probabilistic neural network (EPNN) classifier is applied to the selected features to distinguish between MCI and healthy patients. The usefulness and effectiveness of the proposed methodology are validated using MEG data obtained experimentally from 18 MCI patients and 19 controls. Copyright © 2016 Elsevier B.V. All rights reserved.
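Step 2 above rests on permutation entropy; a minimal numpy implementation of the standard Bandt-Pompe measure is sketched below, with order and delay chosen arbitrarily. White noise scores near the maximum while a regular signal scores low, which is the contrast the detection step exploits.

```python
import math
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy (Bandt-Pompe) of a 1-D signal."""
    x = np.asarray(x)
    n = len(x) - (order - 1) * delay
    # Ordinal pattern (ranking) of each embedded vector.
    patterns = np.array([np.argsort(x[i:i + order * delay:delay])
                         for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum() / np.log2(math.factorial(order)))

rng = np.random.default_rng(0)
print("white noise:", permutation_entropy(rng.normal(size=2000)))   # close to 1.0
print("sine wave  :", permutation_entropy(np.sin(np.linspace(0, 60, 2000))))
```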
Key Program Features to Enhance the School-to-Career Transition for Youth with Disabilities
ERIC Educational Resources Information Center
Doren, Bonnie; Yan, Min-Chi; Tu, Wei-Mo
2013-01-01
The purpose of the article was to identify key features within research-based school-to-career programs that were linked to positive employment outcomes for youth disabilities. Three key program features were identified and discussed that could be incorporated into the practices and programs of schools and communities to support the employment…
Methodological reviews of economic evaluations in health care: what do they target?
Hutter, Maria-Florencia; Rodríguez-Ibeas, Roberto; Antonanzas, Fernando
2014-11-01
An increasing number of published economic evaluations of health technologies have been reviewed and summarized for different purposes, among them to facilitate decision-making processes. These reviews have covered different aspects of economic evaluations, using a variety of methodological approaches. The aim of this study is to analyze the methodological characteristics of reviews of economic evaluations in health care published during the period 1990-2010, to identify their main features and potentially missing elements. This may help to develop a common procedure for elaborating these kinds of reviews. We performed systematic searches in electronic databases (Scopus, Medline and PubMed) for methodological reviews published in English during 1990-2010. We selected articles whose main purpose was to review and assess the methodology applied in economic evaluation studies. We classified the data according to the study objectives, period of the review, number of reviewed studies, methodological and non-methodological items assessed, medical specialty, type of disease and technology, databases used for the review, and main conclusions. We performed a descriptive statistical analysis and checked how generalizability issues were considered in the reviews. We identified 76 methodological reviews, 42 published in the period 1990-2001 and 34 during 2002-2010. The items assessed most frequently (by 70% of the reviews) were perspective, type of economic study, uncertainty and discounting. The reviews also described the type of intervention and disease, funding sources, country in which the evaluation took place, type of journal, and authors' characteristics. Regarding the intertemporal comparison, higher frequencies were found in the second period for two key methodological items: the source of effectiveness data and the models used in the studies. However, the generalizability issues that are apparently creating growing interest in the economic evaluation literature did not receive as much attention in the reviews of the second period. The remaining items showed similar frequencies in both periods. Increasingly more reviews of economic evaluation studies aim to analyze the application of methodological principles, and offer summaries of papers classified by either diseases or health technologies. These reviews are useful for finding literature trends, aims of studies, and possible deficiencies in the implementation of methods for specific health interventions. As no significant methodological improvement was clearly detected between the two periods analyzed, it would be advisable to pay more attention to the methodological aspects of the reviews.
Extension of Companion Modeling Using Classification Learning
NASA Astrophysics Data System (ADS)
Torii, Daisuke; Bousquet, François; Ishida, Toru
Companion Modeling is a methodology for refining initial models for understanding reality through a role-playing game (RPG) and a multiagent simulation. In this research, we propose a novel agent model construction methodology in which classification learning is applied to the RPG log data in Companion Modeling. This methodology enables systematic model construction that handles multiple parameters, independent of the modeler's ability. There are three problems in applying classification learning to RPG log data: 1) It is difficult to gather enough data for the number of features, because the cost of gathering data is high. 2) Noisy data can affect the learning results, because the amount of data may be insufficient. 3) The learning results should be explainable as a human decision-making model and should be recognized by the expert as reflecting reality. We realized an agent model construction system using the following two approaches, sketched in code below: 1) Using a feature selection method, the feature subset with the best prediction accuracy is identified. In this process, the important features chosen by the expert are always included. 2) The expert eliminates irrelevant features from the learning results after evaluating the learning model through a visualization of the results. Finally, using the RPG log data from the Companion Modeling of agricultural economics in northeastern Thailand, we confirm the capability of this methodology.
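A hedged scikit-learn sketch of the first approach follows: a wrapper-style search for an accurate feature subset, with expert-nominated features always retained. The RPG log data, the toy decision rule and the expert's feature list are placeholders, not the study's actual material.

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.tree import DecisionTreeClassifier

# Placeholder RPG log: 120 recorded decisions, 10 candidate features, and a
# toy rule standing in for the players' actual decision making.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
expert_features = [0]                             # expert insists feature 0 stays

# Wrapper search for the subset with the best cross-validated accuracy.
selector = SequentialFeatureSelector(
    DecisionTreeClassifier(max_depth=3, random_state=0),
    n_features_to_select=4, direction="forward", cv=5)
selector.fit(X, y)

# Union with the expert's must-keep features, mirroring approach (1) above.
selected = set(np.flatnonzero(selector.get_support())) | set(expert_features)
print("selected features:", sorted(selected))
```

Approach (2), the expert pruning irrelevant features after inspecting a visualization, would operate on the fitted tree produced here.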
Fitting methods to paradigms: are ergonomics methods fit for systems thinking?
Salmon, Paul M; Walker, Guy H; M Read, Gemma J; Goode, Natassia; Stanton, Neville A
2017-02-01
The issues being tackled within ergonomics problem spaces are shifting. Although existing paradigms appear relevant for modern day systems, it is worth questioning whether our methods are. This paper asks whether the complexities of systems thinking, a currently ubiquitous ergonomics paradigm, are outpacing the capabilities of our methodological toolkit. This is achieved through examining the contemporary ergonomics problem space and the extent to which ergonomics methods can meet the challenges posed. Specifically, five key areas within the ergonomics paradigm of systems thinking are focused on: normal performance as a cause of accidents, accident prediction, system migration, systems concepts and ergonomics in design. The methods available for pursuing each line of inquiry are discussed, along with their ability to respond to key requirements. In doing so, a series of new methodological requirements and capabilities are identified. It is argued that further methodological development is required to provide researchers and practitioners with appropriate tools to explore both contemporary and future problems. Practitioner Summary: Ergonomics methods are the cornerstone of our discipline. This paper examines whether our current methodological toolkit is fit for purpose given the changing nature of ergonomics problems. The findings provide key research and practice requirements for methodological development.
Transitioning Domain Analysis: An Industry Experience.
1996-06-01
…an academic and industry partnership took feature-oriented domain analysis (FODA) from a methodology that was still being defined to a well-documented one…to pilot the use of the Software Engineering Institute (SEI) domain analysis methodology known as feature-oriented domain analysis (FODA). Supported…
A time-responsive tool for informing policy making: rapid realist review.
Saul, Jessie E; Willis, Cameron D; Bitz, Jennifer; Best, Allan
2013-09-05
A realist synthesis attempts to provide policy makers with a transferable theory that suggests a certain program is more or less likely to work in certain respects, for particular subjects, in specific kinds of situations. Yet realist reviews can require considerable and sustained investment over time, which does not always suit the time-sensitive demands of many policy decisions. 'Rapid Realist Review' methodology (RRR) has been developed as a tool for applying a realist approach to a knowledge synthesis process in order to produce a product that is useful to policy makers in responding to time-sensitive and/or emerging issues, while preserving the core elements of realist methodology. Using examples from completed RRRs, we describe key features of the RRR methodology, the resources required, and the strengths and limitations of the process. All aspects of an RRR are guided by both a local reference group and a group of content experts. Involvement of knowledge users and external experts ensures both the usability of the review products and their links to current practice. RRRs have proven useful in providing evidence for, and making explicit, what is known on a given topic, as well as articulating where knowledge gaps may exist. From the RRRs completed to date, findings broadly adhere to four (often overlapping) classifications: guiding rules for policy-making; knowledge quantification (i.e., the amount of literature available that identifies context, mechanisms, and outcomes for a given topic); understanding tensions/paradoxes in the evidence base; and reinforcing or refuting beliefs and decisions taken. 'Traditional' realist reviews and RRRs have some key differences, which allow policy makers to apply each type of methodology strategically to maximize its utility within a particular local constellation of history, goals, resources, politics and environment. In particular, the RRR methodology is explicitly designed to engage knowledge users and review stakeholders to define the research questions, and to streamline the review process. In addition, results are presented with a focus on context-specific explanations for what works within a particular set of parameters, rather than producing explanations that are potentially transferable across contexts and populations. For policy makers faced with making difficult decisions in short time frames for which there is sufficient (if limited) published/research and practice-based evidence available, RRR provides a practical, outcomes-focused knowledge synthesis method.
2017-09-01
Sequencing, paired with new methodologies of intratumoral phylogenetic analysis, will yield pivotal information in elucidating the key genes involved in the evolution of PCa. Combined with both clinical and experimental genetic data produced by this study, this information may empower patients and doctors to make personalized treatment decisions.
Analysis of the Source Physics Experiment SPE4 Prime Using State-of-the-Art Parallel Numerical Tools.
NASA Astrophysics Data System (ADS)
Vorobiev, O.; Ezzedine, S. M.; Antoun, T.; Glenn, L.
2015-12-01
This work describes a methodology used for large-scale modeling of wave propagation from underground chemical explosions conducted in fractured granitic rock at the Nevada National Security Site (NNSS). We show that the discrete nature of rock masses, as well as the spatial variability in the fabric of rock properties, is very important for understanding ground motions induced by underground explosions. In order to build a credible conceptual model of the subsurface, we integrated the geological, geomechanical and geophysical characterizations conducted during recent tests at the NNSS, as well as historical data from characterization during the underground nuclear tests conducted at the NNSS. Because detailed site characterization is limited, expensive and, in some instances, impossible, we have numerically investigated the effects of the characterization gaps on the overall response of the system. We performed several computational studies to identify the key geologic features specific to fractured media, mainly the joints characterized at the NNSS. We have also explored features common to both geological environments, such as saturation and topography, and assessed which characteristics most affect the ground motion in the near-field and in the far-field. Stochastic representations of these features based on the field characterizations have been implemented into LLNL's Geodyn-L hydrocode. Simulations were used to guide site characterization efforts in order to provide the essential data to the modeling community. We validate our computational results by comparing the measured and computed ground motion at various ranges for the recently executed SPE4 prime experiment. We have also conducted a comparative study between SPE4 prime and the previous experiments SPE1 and SPE3 to assess similarities and differences and draw conclusions for designing SPE5.
Single Channel EEG Artifact Identification Using Two-Dimensional Multi-Resolution Analysis.
Taherisadr, Mojtaba; Dehzangi, Omid; Parsaei, Hossein
2017-12-13
As a diagnostic monitoring approach, electroencephalogram (EEG) signals can be decoded by signal processing methodologies for various health monitoring purposes. However, EEG recordings are contaminated by other interferences, particularly facial and ocular artifacts generated by the user. This is especially an issue during continuous EEG recording sessions, so identifying such artifacts among useful EEG components is a key step in using EEG signals for either physiological monitoring and diagnosis or brain-computer interfacing. In this study, we aim to design a new generic framework to process and characterize an EEG recording as a multi-component and non-stationary signal, with the aim of localizing and identifying its components (e.g., artifacts). In the proposed method, we bring three complementary algorithms together to enhance the efficiency of the system: time-frequency (TF) analysis and representation, two-dimensional multi-resolution analysis (2D MRA), and feature extraction and classification. A combination of spectro-temporal and geometric features is then extracted by combining key instantaneous TF space descriptors, which enables the system to characterize the non-stationarities in the EEG dynamics. We fit a curvelet transform (as an MRA method) to the 2D TF representation of EEG segments to decompose the given space into various levels of resolution. Such a decomposition efficiently improves the analysis of TF spaces with different characteristics (e.g., resolution). Our experimental results demonstrate that the combination of expansion to TF space, analysis using MRA, and extraction of a set of suitable features fed to a proper predictive model is effective in enhancing EEG artifact identification performance. We also compare the performance of the designed system with another common EEG signal processing technique, namely the 1D wavelet transform. Our experimental results reveal that the proposed method outperforms the 1D wavelet approach.
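As a rough illustration of this pipeline shape, the sketch below expands synthetic signals into a TF image, applies one level of 2D multi-resolution analysis, and classifies sub-band energies; a 2D discrete wavelet stands in for the curvelet transform, and all data and parameter choices are illustrative assumptions rather than the authors' implementation.

```python
# Sketch: TF representation -> 2D multi-resolution analysis -> features -> classifier.
# Synthetic EEG-like signals; pywt's 2D DWT stands in for the curvelet MRA.
import numpy as np
import pywt
from scipy.signal import spectrogram
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def tf_features(sig, fs=256):
    _, _, S = spectrogram(sig, fs=fs, nperseg=64)      # 2D time-frequency image
    cA, (cH, cV, cD) = pywt.dwt2(np.log1p(S), "db2")   # one MRA level
    # Spectro-temporal descriptors: energy per sub-band.
    return np.array([np.mean(c**2) for c in (cA, cH, cV, cD)])

# Toy data: class 0 = "clean" EEG-like noise, class 1 = with an ocular-artifact-like drift.
clean = [rng.standard_normal(1024) for _ in range(40)]
artif = [rng.standard_normal(1024) + 3*np.sin(2*np.pi*0.5*np.arange(1024)/256)
         for _ in range(40)]
X = np.array([tf_features(s) for s in clean + artif])
y = np.array([0]*40 + [1]*40)
print(cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())
```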
Soper, Bryony; Yaqub, Ohid; Hinrichs, Saba; Marjanovich, Sonja; Drabble, Samuel; Hanney, Stephen; Nolte, Ellen
2013-10-01
The nine NIHR CLAHRCs are collaborations between universities and local NHS organizations that seek to improve patient outcomes through the conduct and application of applied health research. The theoretical and practical context within which the CLAHRCs were set up was characterized by a considerable degree of uncertainty, and the CLAHRCs were established as a natural experiment. We adopted a formative and emergent evaluation approach. Drawing on in-depth, multi-method case studies of two CLAHRCs, we explored how they pursued their remit by supporting efforts to increase the relevance and use of health research, and building relationships. Both CLAHRCs: strengthened local networks and relationships; built capacity in their local academic and NHS communities to undertake and use research that meets the needs of the service; developed research and implementation methodologies; and added to understanding of the complex relation between research and implementation. There was evidence of impact of CLAHRC projects on health and social care services. Informed by the literature on implementing collaborative research initiatives, knowledge transfer and exchange, and cultural change, some key lessons can be drawn. The CLAHRCs pursued a strategy that can be categorized as one of flexible comprehensiveness; i.e., their programmes have been flexible and responsive and they have used a range of approaches that seek to match the diverse aspects of the complex issues they face. Key features include their work on combining a range of knowledge transfer and exchange strategies, their efforts to promote cultural change, and the freedom to experiment, learn and adapt. Although the CLAHRCs do not, by themselves, have the remit or resources to bring about wholesale service improvement in health care, they do have features that would allow them to play a key role in some of the wider initiatives that encourage innovation.
Improving the performance of univariate control charts for abnormal detection and classification
NASA Astrophysics Data System (ADS)
Yiakopoulos, Christos; Koutsoudaki, Maria; Gryllias, Konstantinos; Antoniadis, Ioannis
2017-03-01
Bearing failures in rotating machinery can cause machine breakdown and economic loss if no effective actions are taken in time. Therefore, it is of prime importance to detect the presence of faults accurately, especially at their early stage, to prevent subsequent damage and reduce costly downtime. Machinery fault diagnosis follows a roadmap of data acquisition, feature extraction and diagnostic decision making, in which mechanical vibration fault feature extraction is the foundation and the key to obtaining an accurate diagnostic result. A challenge in this area is the selection of the most sensitive features for various types of fault, especially when the characteristics of failures are difficult to extract. Thus, a plethora of complex data-driven fault diagnosis methods are fed by prominent features, which are extracted and reduced through traditional or modern algorithms. Since most of the available datasets are captured during normal operating conditions, a number of novelty detection methods, able to work when only normal data are available, have been developed over the last decade. In this study, a hybrid method combining univariate control charts and a feature extraction scheme is introduced, focusing on abnormal change detection and classification, under the assumption that measurements under normal operating conditions of the machinery are available. The feature extraction method integrates morphological operators and Morlet wavelets. The effectiveness of the proposed methodology is validated on two different experimental cases with bearing faults, demonstrating that the proposed approach can improve the fault detection and classification performance of conventional control charts.
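A minimal sketch of the control-chart half of such a scheme, with a plain RMS value standing in for the morphological/wavelet features and 3-sigma limits estimated from normal-condition records only; all values are synthetic:

```python
# Sketch: Shewhart-style individuals chart on a vibration feature, trained on
# normal-condition data only (novelty detection). The feature choice is illustrative.
import numpy as np

rng = np.random.default_rng(1)

def rms(x):                      # simple stand-in for the morphology/wavelet feature
    return np.sqrt(np.mean(x**2))

baseline = [rms(rng.standard_normal(2048)) for _ in range(50)]   # healthy records
mu, sigma = np.mean(baseline), np.std(baseline, ddof=1)
ucl, lcl = mu + 3*sigma, mu - 3*sigma                            # 3-sigma limits

new_record = rng.standard_normal(2048) * 1.5                     # amplified vibration
f = rms(new_record)
print("alarm" if not (lcl <= f <= ucl) else "in control")
```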
Nieri, Michele; Clauser, Carlo; Franceschi, Debora; Pagliaro, Umberto; Saletta, Daniele; Pini-Prato, Giovanpaolo
2007-08-01
The aim of the present study was to investigate the relationships among reported methodological, statistical, clinical and paratextual variables of randomized clinical trials (RCTs) in implant therapy, and their influence on subsequent research. The material consisted of the RCTs in implant therapy published through the end of the year 2000. Methodological, statistical, clinical and paratextual features of the articles were assessed and recorded. The perceived clinical relevance was subjectively evaluated by an experienced clinician on anonymous abstracts. The impact on research was measured by the number of citations found in the Science Citation Index. A new statistical technique (Structural learning of Bayesian Networks) was used to assess the relationships among the considered variables. Descriptive statistics revealed that the reported methodology and statistics of RCTs in implant therapy were defective. Follow-up of the studies was generally short. The perceived clinical relevance appeared to be associated with the objectives of the studies and with the number of published images in the original articles. The impact on research was related to the nationality of the involved institutions and to the number of published images. RCTs in implant therapy (until 2000) show important methodological and statistical flaws and may not be appropriate for guiding clinicians in their practice. The methodological and statistical quality of the studies did not appear to affect their impact on practice and research. Bayesian Networks suggest new and unexpected relationships among the methodological, statistical, clinical and paratextual features of RCTs.
Impact and Crashworthiness Characteristics of Venera Type Landers for Future Venus Missions
NASA Technical Reports Server (NTRS)
Schroeder, Kevin; Bayandor, Javid; Samareh, Jamshid
2016-01-01
In this paper an in-depth investigation of the structural design of the Venera 9-14 landers is presented. A complete reverse engineering of the Venera lander was required. The lander was broken down into its fundamental components and analyzed. This provided insights into the hidden features of the design. A trade study was performed to find the sensitivity of the lander's overall mass to the variation of several key parameters. For the lander's legs, the location, length, configuration, and number are all parameterized. The size of the impact ring, the radius of the drag plate, and other design features are also parameterized, and all of these features were correlated to the change of mass of the lander. A multi-fidelity design tool for further investigation of the parameterized lander was developed. As a design was passed down from one level to the next, the fidelity, complexity, accuracy, and run time of the model increased. The low-fidelity model was a highly nonlinear analytical model developed to rapidly predict the mass of each design. The medium- and high-fidelity models utilized an explicit finite element framework to investigate the performance of various landers upon impact with the surface under a range of landing conditions. This methodology allowed a large variety of designs to be investigated by the analytical model, which identified designs with the optimum structural mass to payload ratio. As promising designs emerged, investigations in the higher fidelity models focused on establishing their reliability and crashworthiness. The developed design tool efficiently modeled and tested the best concepts for any scenario based on critical Venusian mission requirements and constraints. Through this program, the strengths and weaknesses inherent in the Venera-type landers were thoroughly investigated. Key features identified for the design of robust landers will be used as foundations for the development of the next generation of landers for future exploration missions to Venus.
A neuromorphic approach to satellite image understanding
NASA Astrophysics Data System (ADS)
Partsinevelos, Panagiotis; Perakakis, Manolis
2014-05-01
Remote sensing satellite imagery provides high-altitude, top-down views of large geographic regions, and as such the depicted features are not always easily recognizable. Nevertheless, geoscientists familiar with remote sensing data gradually gain experience and enhance their satellite image interpretation skills. The aim of this study is to devise a novel computational, neuro-centered classification approach for feature extraction and image understanding. Object recognition through image processing practices relies on a series of known image/feature-based attributes including size, shape, association, texture, etc. The objective of the study is to weight these attribute values towards the enhancement of feature recognition. The key cognitive experimentation concern is to define the point at which a user recognizes a feature as it varies in terms of the above-mentioned attributes, and to relate that point to the corresponding attribute values. Towards this end, we have set up an experimentation methodology that utilizes cognitive data from brain signals (EEG) and eye gaze data (eye tracking) of subjects watching satellite images of varying attributes; this allows the collection of rich real-time data that will be used for designing the image classifier. Since the data are already labeled by users (using an input device), a first step is to compare the performance of various machine-learning algorithms on the collected data. In the long run, the aim of this work is to investigate the automatic classification of unlabeled images (unsupervised learning) based purely on image attributes. The outcome of this innovative process is twofold. First, in an abundance of remote sensing image datasets we may define the essential image specifications in order to collect the appropriate data for each application and improve processing and resource efficiency; for example, for a fault-extraction application at a given scale, a medium-resolution 4-band image may be more effective than costly, multispectral, very-high-resolution imagery. Second, we attempt to relate the experienced against the non-experienced user understanding in order to indirectly assess the possible limits of purely computational systems; in other words, to obtain the conceptual limits of computation versus human cognition concerning feature recognition from satellite imagery. Preliminary results of this pilot study show relations between the collected data and the differentiation of image attributes, which indicates that our methodology can lead to important results.
Effective Information Systems: What's the Secret?
ERIC Educational Resources Information Center
Kirkham, Sandi
1994-01-01
Argues that false assumptions about user needs implicit in methodologies for building information systems have resulted in inadequate and inflexible systems. Checkland's Soft Systems Methodology is examined as a useful alternative. Its fundamental features are described, and examples of models demonstrate how the methodology can facilitate…
Understanding Participatory Action Research: A Qualitative Research Methodology Option
ERIC Educational Resources Information Center
MacDonald, Cathy
2012-01-01
Participatory Action Research (PAR) is a qualitative research methodology option that requires further understanding and consideration. PAR is considered democratic, equitable, liberating, and life-enhancing qualitative inquiry that remains distinct from other qualitative methodologies (Kach & Kralik, 2006). Using PAR, qualitative features of an…
Mammographic phenotypes of breast cancer risk driven by breast anatomy
NASA Astrophysics Data System (ADS)
Gastounioti, Aimilia; Oustimov, Andrew; Hsieh, Meng-Kang; Pantalone, Lauren; Conant, Emily F.; Kontos, Despina
2017-03-01
Image-derived features of breast parenchymal texture patterns have emerged as promising risk factors for breast cancer, paving the way towards personalized recommendations regarding women's cancer risk evaluation and screening. The main steps to extract texture features of the breast parenchyma are the selection of regions of interest (ROIs) where texture analysis is performed, the texture feature calculation and the texture feature summarization in case of multiple ROIs. In this study, we incorporate breast anatomy in these three key steps by (a) introducing breast anatomical sampling for the definition of ROIs, (b) texture feature calculation aligned with the structure of the breast and (c) weighted texture feature summarization considering the spatial position and the underlying tissue composition of each ROI. We systematically optimize this novel framework for parenchymal tissue characterization in a case-control study with digital mammograms from 424 women. We also compare the proposed approach with a conventional methodology, not considering breast anatomy, recently shown to enhance the case-control discriminatory capacity of parenchymal texture analysis. The case-control classification performance is assessed using elastic-net regression with 5-fold cross validation, where the evaluation measure is the area under the curve (AUC) of the receiver operating characteristic. Upon optimization, the proposed breast-anatomy-driven approach demonstrated a promising case-control classification performance (AUC=0.87). In the same dataset, the performance of conventional texture characterization was found to be significantly lower (AUC=0.80, DeLong's test p-value<0.05). Our results suggest that breast anatomy may further leverage the associations of parenchymal texture features with breast cancer, and may therefore be a valuable addition in pipelines aiming to elucidate quantitative mammographic phenotypes of breast cancer risk.
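A minimal sketch of the weighted-summarization and classification steps described here; the features and weights are synthetic placeholders, and scikit-learn's elastic-net logistic regression stands in for the study's exact model:

```python
# Sketch: weighted summarization of per-ROI texture features followed by
# elastic-net logistic regression with 5-fold cross-validated AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_women, n_rois, n_feats = 120, 20, 10

roi_feats = rng.standard_normal((n_women, n_rois, n_feats))
w = rng.random(n_rois); w /= w.sum()          # e.g., weights from ROI position/composition
X = np.einsum("r,wrf->wf", w, roi_feats)      # weighted per-woman feature summary
y = rng.integers(0, 2, n_women)

clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=1.0, max_iter=5000)
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```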
Epidemiologic evidence on mobile phones and tumor risk: a review.
Ahlbom, Anders; Feychting, Maria; Green, Adele; Kheifets, Leeka; Savitz, David A; Swerdlow, Anthony J
2009-09-01
This review summarizes and interprets epidemiologic evidence bearing on a possible causal relation between radiofrequency field exposure from mobile phone use and tumor risk. In the last few years, epidemiologic evidence on mobile phone use and the risk of brain and other tumors of the head in adults has grown in volume, geographic diversity of study settings, and the amount of data on longer-term users. However, some key methodologic problems remain, particularly with regard to selective nonresponse and inaccuracy and bias in recall of phone use. Most studies of glioma show small increased or decreased risks among users, although a subset of studies show appreciably elevated risks. We considered methodologic features that might explain the deviant results, but found no clear explanation. Overall the studies published to date do not demonstrate an increased risk within approximately 10 years of use for any tumor of the brain or any other head tumor. Despite the methodologic shortcomings and the limited data on long latency and long-term use, the available data do not suggest a causal association between mobile phone use and fast-growing tumors such as malignant glioma in adults (at least for tumors with short induction periods). For slow-growing tumors such as meningioma and acoustic neuroma, as well as for glioma among long-term users, the absence of association reported thus far is less conclusive because the observation period has been too short.
Traditional chinese medicine: an update on clinical evidence.
Xue, Charlie C L; Zhang, Anthony L; Greenwood, Kenneth M; Lin, Vivian; Story, David F
2010-03-01
As an alternative medical system, Traditional Chinese Medicine (TCM) has been increasingly used over the last several decades. Such a consumer-driven development has resulted in introduction of education programs for practitioner training, development of product and practitioner regulation systems, and generation of an increasing interest in research. Significant efforts have been made in validating the quality, effectiveness, and safety of TCM interventions evidenced by a growing number of published trials and systematic reviews. Commonly, the results of these studies were inconclusive due to the lack of quality and quantity of the trials to answer specific and answerable clinical questions. The methodology of a randomized clinical trial (RCT) is not free from bias, and the unique features of TCM (such as individualization and holism) further complicate effective execution of RCTs in TCM therapies. Thus, data from limited RCTs and systematic reviews need to be interpreted with great caution. Nevertheless, until new and specific methodology is developed that can adequately address these methodology challenges for RCTs in TCM, evidence from quality RCTs and systematic reviews still holds the credibility of TCM in the scientific community. This article summarizes studies on TCM utilization, and regulatory and educational development with a focus on updating the TCM clinical evidence from RCTs and systematic reviews over the last decade. The key issues and challenges associated with evidence-based TCM developments are also explored.
Modeling Operations Other Than War: Non-Combatants in Combat Modeling
1994-09-01
The supposition is that non-combatants are an essential feature in OOTW. The model proposal includes a methodology for civilian unit decision making. … A numerical example demonstrated that the model appeared to perform in an acceptable manner, in that it produced output within a reasonable range.
Summarizing Monte Carlo Results in Methodological Research.
ERIC Educational Resources Information Center
Harwell, Michael R.
Monte Carlo studies of statistical tests are prominently featured in the methodological research literature. Unfortunately, the information from these studies does not appear to have significantly influenced methodological practice in educational and psychological research. One reason is that Monte Carlo studies lack an overarching theory to guide…
NASA Astrophysics Data System (ADS)
Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan
2017-09-01
Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems is mainly dependent on the selection of the most relevant features. This becomes harder when the dataset contains missing values for different features. Probabilistic Principal Component Analysis (PPCA) has a reputation for dealing with the problem of missing attribute values. This research presents a methodology which uses the results of medical tests as input, extracts a reduced-dimensional feature subset and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using PPCA. PPCA extracts projection vectors which contribute the highest covariance, and these projection vectors are used to reduce the feature dimension. The selection of projection vectors is done through Parallel Analysis (PA). The feature subset with the reduced dimension is provided to radial basis function (RBF) kernel based Support Vector Machines (SVM). The RBF-based SVM serves the purpose of classification into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison to the existing research, showing its impact. The proposed technique achieved an accuracy of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
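A rough sketch of the component-selection idea (Horn's parallel analysis) feeding an RBF-SVM; mean imputation stands in for a full PPCA treatment of missing values, and all data are synthetic rather than the UCI sets:

```python
# Sketch: parallel analysis to pick the number of components, then an RBF-SVM
# on the reduced features. Mean imputation is a stand-in for full PPCA.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 13))             # e.g., 13 Cleveland-style attributes
X[rng.random(X.shape) < 0.05] = np.nan         # sprinkle missing values
y = rng.integers(0, 2, 200)

col_mean = np.nanmean(X, axis=0)               # simple imputation (PPCA would go here)
X = np.where(np.isnan(X), col_mean, X)

obs_eig = np.linalg.eigvalsh(np.corrcoef(X.T))[::-1]
rand_eig = np.mean([np.linalg.eigvalsh(np.corrcoef(
    rng.standard_normal(X.shape).T))[::-1] for _ in range(100)], axis=0)
k = int(np.sum(obs_eig > rand_eig))            # keep components beating random data

Z = PCA(n_components=k).fit_transform(X)
print(k, cross_val_score(SVC(kernel="rbf"), Z, y, cv=5).mean())
```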
Towards a taxonomy for integrated care: a mixed-methods study.
Valentijn, Pim P; Boesveld, Inge C; van der Klauw, Denise M; Ruwaard, Dirk; Struijs, Jeroen N; Molema, Johanna J W; Bruijnzeels, Marc A; Vrijhoef, Hubertus JM
2015-01-01
Building integrated services in a primary care setting is considered an essential important strategy for establishing a high-quality and affordable health care system. The theoretical foundations of such integrated service models are described by the Rainbow Model of Integrated Care, which distinguishes six integration dimensions (clinical, professional, organisational, system, functional and normative integration). The aim of the present study is to refine the Rainbow Model of Integrated Care by developing a taxonomy that specifies the underlying key features of the six dimensions. First, a literature review was conducted to identify features for achieving integrated service delivery. Second, a thematic analysis method was used to develop a taxonomy of key features organised into the dimensions of the Rainbow Model of Integrated Care. Finally, the appropriateness of the key features was tested in a Delphi study among Dutch experts. The taxonomy consists of 59 key features distributed across the six integration dimensions of the Rainbow Model of Integrated Care. Key features associated with the clinical, professional, organisational and normative dimensions were considered appropriate by the experts. Key features linked to the functional and system dimensions were considered less appropriate. This study contributes to the ongoing debate of defining the concept and typology of integrated care. This taxonomy provides a development agenda for establishing an accepted scientific framework of integrated care from an end-user, professional, managerial and policy perspective.
FY16 Status Report on Development of Integrated EPP and SMT Design Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jetter, R. I.; Sham, T. -L.; Wang, Y.
2016-08-01
The goal of the Elastic-Perfectly Plastic (EPP) combined integrated creep-fatigue damage evaluation approach is to incorporate a Simplified Model Test (SMT) data-based approach for creep-fatigue damage evaluation into the EPP methodology, to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods, thus greatly simplifying evaluation of elevated temperature cyclic service. The EPP methodology is based on the idea that creep damage and strain accumulation can be bounded by a properly chosen "pseudo" yield strength used in an elastic-perfectly plastic analysis, thus avoiding the need for stress classification. The original SMT approach is based on the use of elastic analysis. The experimental data, cycles to failure, is correlated using the elastically calculated strain range in the test specimen, and the corresponding component strain is also calculated elastically. The advantage of this approach is that it is no longer necessary to use the damage interaction, or D-diagram, because the damage due to the combined effects of creep and fatigue is accounted for in the test data by means of a specimen that is designed to replicate or bound the stress and strain redistribution that occurs in actual components when loaded in the creep regime. The reference approach to combining the two methodologies and the corresponding uncertainties and validation plans are presented. Results from recent key feature tests are discussed to illustrate the applicability of the EPP methodology and the behavior of materials at elevated temperature when undergoing stress and strain redistribution due to plasticity and creep.
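A minimal sketch of the elastic-perfectly-plastic idealization at the heart of the bounding argument: a 1D return mapping in which stress is capped at a "pseudo" yield strength. The modulus, yield value, and strain history below are illustrative, not design values:

```python
# Sketch: 1D elastic-perfectly-plastic material response with a pseudo yield strength.
import numpy as np

E = 200e3            # Young's modulus, MPa (illustrative)
sigma_y = 150.0      # pseudo yield strength chosen to bound creep damage, MPa

def epp_stress(strain_history):
    """Stress for a strain history under 1D EPP behavior with plastic-strain state."""
    stress, eps_p = [], 0.0                    # eps_p = accumulated plastic strain
    for eps in strain_history:
        s = E * (eps - eps_p)                  # elastic trial stress
        if abs(s) > sigma_y:                   # yield: cap stress, grow plastic strain
            eps_p = eps - np.sign(s) * sigma_y / E
            s = np.sign(s) * sigma_y
        stress.append(s)
    return np.array(stress)

cycle = np.concatenate([np.linspace(0, 2e-3, 50), np.linspace(2e-3, -2e-3, 100)])
print(epp_stress(cycle).max(), epp_stress(cycle).min())   # bounded by +/- sigma_y
```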
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valiev, Marat; Yang, Jie; Adams, Joseph
2007-11-29
Protein kinases catalyze the transfer of the γ-phosphoryl group from ATP, a key regulatory process governing signalling pathways in eukaryotic cells. The structure of the active site in these enzymes is highly conserved, implying a common catalytic mechanism. In this work we investigate the reaction process in cAPK protein kinase (PKA) using a combined quantum mechanics and molecular mechanics approach. The novel computational features of our work include reaction pathway determination with nudged elastic band methodology and calculation of free energy profiles of the reaction process taking into account finite temperature fluctuations of the protein environment. We find that the transfer of the γ-phosphoryl group in the protein environment is an exothermic reaction with a reaction barrier of 15 kcal/mol.
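As a toy illustration of the nudged-elastic-band step mentioned here, the sketch below relaxes a band of images on a 2D double-well potential. The real study couples NEB to a QM/MM energy surface, so every function and parameter below is an illustrative assumption:

```python
# Sketch: nudged-elastic-band relaxation on a toy 2D double-well potential.
import numpy as np

def V(p):                                   # double well along x, harmonic in y
    x, y = p[..., 0], p[..., 1]
    return (x**2 - 1)**2 + 2*y**2

def gradV(p):
    x, y = p[..., 0], p[..., 1]
    return np.stack([4*x*(x**2 - 1), 4*y], axis=-1)

n_img, k, dt = 12, 5.0, 0.01
path = np.linspace([-1.0, 0.0], [1.0, 0.0], n_img)   # endpoints at the two minima
path[1:-1, 1] += 0.3                                 # perturb the initial band

for _ in range(2000):
    tau = path[2:] - path[:-2]                       # central-difference tangents
    tau /= np.linalg.norm(tau, axis=1, keepdims=True)
    g = gradV(path[1:-1])
    g_perp = g - (g*tau).sum(axis=1, keepdims=True)*tau          # true force, perp part
    d1 = np.linalg.norm(path[2:] - path[1:-1], axis=1)
    d0 = np.linalg.norm(path[1:-1] - path[:-2], axis=1)
    f_spring = (k*(d1 - d0))[:, None]*tau                        # spring force, along band
    path[1:-1] += dt*(-g_perp + f_spring)

print(V(path).max())    # barrier estimate along the band (saddle height is 1 here)
```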
SEACAS Theory Manuals: Part III. Finite Element Analysis in Nonlinear Solid Mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laursen, T.A.; Attaway, S.W.; Zadoks, R.I.
1999-03-01
This report outlines the application of finite element methodology to large deformation solid mechanics problems, detailing also some of the key technological issues that effective finite element formulations must address. The presentation is organized into three major portions: first, a discussion of finite element discretization from the global point of view, emphasizing the relationship between a virtual work principle and the associated fully discrete system; second, a discussion of finite element technology, emphasizing the important theoretical and practical features associated with an individual finite element; and third, a detailed description of specific elements that enjoy widespread use, providing some examples of the theoretical ideas already described. Descriptions of problem formulation in nonlinear solid mechanics, nonlinear continuum mechanics, and constitutive modeling are given in three companion reports.
NASA Technical Reports Server (NTRS)
Menzies, Robert T.; Spiers, Gary D.; Jacob, Joseph C.
2013-01-01
The JPL airborne Laser Absorption Spectrometer (LAS) instrument has been flown several times in the 2007-2011 time frame for the purpose of measuring CO2 mixing ratios in the lower atmosphere. This instrument employs CW laser transmitters and coherent detection receivers in the 2.05-μm spectral region. The Integrated Path Differential Absorption (IPDA) method is used to retrieve weighted CO2 column mixing ratios. We present key features of the evolving LAS signal processing and data analysis algorithms and the calibration/validation methodology. Results from 2011 flights in various U.S. locations include observed mid-day CO2 drawdown in the Midwest and high spatial resolution plume detection during a leg downwind of the Four Corners power plant in New Mexico.
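A toy sketch of the IPDA relation behind such retrievals: the log-ratio of off-line to on-line returns gives a differential optical depth, which, divided by the differential cross-section times the air column, yields a weighted column mixing ratio. All numbers below are illustrative assumptions, not LAS calibration values:

```python
# Sketch: two-wavelength IPDA column retrieval with toy numbers.
import numpy as np

P_on, P_off = 0.874, 1.00         # normalized received powers at on/off wavelengths
dsigma = 1.0e-27                  # differential absorption cross-section, m^2/molecule
n_air = 2.1e25                    # mean air number density over the path, molecules/m^3
R = 8000.0                        # one-way path length, m

dod = 0.5 * np.log(P_off / P_on)              # one-way differential optical depth
x_co2 = dod / (dsigma * n_air * R)            # column-averaged mixing ratio
print(f"{x_co2 * 1e6:.1f} ppm")               # ~401 ppm with these toy values
```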
Sideris, Costas; Alshurafa, Nabil; Pourhomayoun, Mohammad; Shahmohammadi, Farhad; Samy, Lauren; Sarrafzadeh, Majid
2015-01-01
In this paper, we propose a novel methodology for utilizing disease diagnostic information to predict severity of condition for Congestive Heart Failure (CHF) patients. Our methodology relies on a novel, clustering-based, feature extraction framework using disease diagnostic information. To reduce the dimensionality, we identify disease clusters using co-occurrence frequencies. We then utilize these clusters as features to predict patient severity of condition. We build our clustering and feature extraction algorithm using the 2012 National Inpatient Sample (NIS) of the Healthcare Cost and Utilization Project (HCUP), which contains 7 million discharge records with ICD-9-CM codes. The proposed framework is tested on Ronald Reagan UCLA Medical Center Electronic Health Records (EHR) from 3041 patients. We compare our cluster-based feature set with another that incorporates the Charlson comorbidity score as a feature, and demonstrate an accuracy improvement of up to 14% in the predictability of the severity of condition.
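A small sketch of the clustering-based feature idea, on toy ICD-9-style codes; KMeans on a co-occurrence matrix stands in for the paper's exact clustering, and the records are placeholders, not NIS/HCUP data:

```python
# Sketch: diagnosis co-occurrence matrix -> code clusters -> per-cluster count features.
import numpy as np
from sklearn.cluster import KMeans

codes = ["428.0", "584.9", "401.9", "250.00", "272.4", "414.01"]
records = [["428.0", "584.9"], ["428.0", "401.9", "414.01"],
           ["250.00", "272.4"], ["428.0", "584.9", "401.9"]]

idx = {c: i for i, c in enumerate(codes)}
C = np.zeros((len(codes), len(codes)))
for rec in records:                              # symmetric co-occurrence counts
    for a in rec:
        for b in rec:
            if a != b:
                C[idx[a], idx[b]] += 1

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(C)

def patient_features(rec):                       # counts of codes per disease cluster
    f = np.zeros(2)
    for c in rec:
        f[labels[idx[c]]] += 1
    return f

print([patient_features(r) for r in records])
```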
Multithreaded hybrid feature tracking for markerless augmented reality.
Lee, Taehee; Höllerer, Tobias
2009-01-01
We describe a novel markerless camera tracking approach and user interaction methodology for augmented reality (AR) on unprepared tabletop environments. We propose a real-time system architecture that combines two types of feature tracking. Distinctive image features of the scene are detected and tracked frame-to-frame by computing optical flow. In order to achieve real-time performance, multiple operations are processed in a synchronized multi-threaded manner: capturing a video frame, tracking features using optical flow, detecting distinctive invariant features, and rendering an output frame. We also introduce user interaction methodology for establishing a global coordinate system and for placing virtual objects in the AR environment by tracking a user's outstretched hand and estimating a camera pose relative to it. We evaluate the speed and accuracy of our hybrid feature tracking approach, and demonstrate a proof-of-concept application for enabling AR in unprepared tabletop environments, using bare hands for interaction.
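A compact sketch of the detect-then-track core of such a system (OpenCV's Shi-Tomasi corners plus pyramidal Lucas-Kanade optical flow), using synthetic frames; the multithreading, invariant-feature detection, and hand-based interaction layers are omitted:

```python
# Sketch: detect corners in one frame, track them into the next with LK optical flow.
import numpy as np
import cv2

prev = np.zeros((240, 320), np.uint8)
cv2.rectangle(prev, (100, 80), (160, 140), 255, -1)      # a simple bright object
curr = np.roll(prev, 3, axis=1)                          # object shifted 3 px right

pts = cv2.goodFeaturesToTrack(prev, maxCorners=50, qualityLevel=0.01, minDistance=5)
nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, pts, None)

flow = (nxt - pts)[status.ravel() == 1]                  # keep successfully tracked points
print("median displacement:", np.median(flow.reshape(-1, 2), axis=0))
```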
How Is This Flower Pollinated? A Polyclave Key to Use in Teaching.
ERIC Educational Resources Information Center
Tyrrell, Lucy
1989-01-01
Presents an identification method which uses the process of elimination to identify pollination systems. Provides the polyclave key, methodology for using the key, a sample worksheet, and abbreviation codes for pollination systems. (MVL)
Tevi, Giuliano; Tevi, Anca
2012-01-01
Traditional agricultural practices based on non-customized irrigation and soil fertilization are harmful to the environment and may pose a risk to human health. By continuing the use of these practices, it is not possible to ensure effective land management, which can be achieved by using advanced satellite technology configured for modern agricultural development. The paper presents a methodology based on the correlation between remote sensing data and field observations, aiming to identify the key features and to establish an interpretation pattern for the inhomogeneities highlighted by the remote sensing data. Instead of using classical methods for the evaluation of land features (field analysis, measurements and mapping), the approach is to use high-resolution multispectral and hyperspectral methods, in correlation with data processing and geographic information systems (GIS), in order to improve agricultural practices and mitigate their environmental impact (soil and shallow aquifer).
NASA Astrophysics Data System (ADS)
Guha, Anirban
2017-11-01
Theoretical studies on linear shear instabilities as well as different kinds of wave interactions often use simple velocity and/or density profiles (e.g., constant, piecewise) to obtain good qualitative and quantitative predictions of the initial disturbances. Moreover, such simple profiles provide a minimal model for a mechanistic understanding of shear instabilities. Here we have extended this minimal paradigm into the nonlinear domain using the vortex method. Making use of the unsteady Bernoulli equation in the presence of linear shear, and extending the Birkhoff-Rott equation to multiple interfaces, we have numerically simulated the interaction between multiple fully nonlinear waves. This methodology is quite general, and has allowed us to simulate diverse problems that can be essentially reduced to the minimal system of interacting waves, e.g., spilling and plunging breakers, stratified shear instabilities (Holmboe, Taylor-Caulfield, stratified Rayleigh), jet flows, and even wave-topography interaction problems like Bragg resonance. We found that the minimal models capture key nonlinear features (e.g., wave-breaking features like cusp formation and roll-ups) that are observed in experiments and/or extensive simulations with smooth, realistic profiles.
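A minimal numerical sketch of a desingularized Birkhoff-Rott step for a single interface, following Krasny-style δ-regularization; the resolution, δ, circulation values, and explicit Euler step are illustrative simplifications:

```python
# Sketch: one time step of a vortex-blob (desingularized Birkhoff-Rott) evolution.
import numpy as np

N, delta, dt = 200, 0.1, 0.01
x = np.linspace(0, 2*np.pi, N, endpoint=False)
z = x + 0.01j*np.sin(x)                 # slightly perturbed vortex sheet (complex positions)
gamma = np.full(N, 2*np.pi/N)           # circulation carried by each marker

def velocity(z, gamma, delta):
    dz = z[:, None] - z[None, :]        # pairwise separations
    kern = np.conj(dz) / (np.abs(dz)**2 + delta**2)   # blob-regularized kernel
    w_conj = (gamma[None, :] * kern).sum(axis=1) / (2j*np.pi)
    return np.conj(w_conj)              # conjugate of the Birkhoff-Rott integral

z = z + dt * velocity(z, gamma, delta)  # explicit Euler step (RK4 in practice)
print(z[:3])
```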
Benedek, C; Descombes, X; Zerubia, J
2012-01-01
In this paper, we introduce a new probabilistic method which integrates building extraction with change detection in remotely sensed image pairs. A global optimization process attempts to find the optimal configuration of buildings, considering the observed data, prior knowledge, and interactions between the neighboring building parts. We present methodological contributions in three key issues: 1) We implement a novel object-change modeling approach based on Multitemporal Marked Point Processes, which simultaneously exploits low-level change information between the time layers and object-level building description to recognize and separate changed and unaltered buildings. 2) To answer the challenges of data heterogeneity in aerial and satellite image repositories, we construct a flexible hierarchical framework which can create various building appearance models from different elementary feature-based modules. 3) To simultaneously ensure the convergence, optimality, and computation complexity constraints raised by the increased data quantity, we adopt the quick Multiple Birth and Death optimization technique for change detection purposes, and propose a novel nonuniform stochastic object birth process which generates relevant objects with higher probability based on low-level image features.
Field Test of the Methodology for Succession Planning for Technical Experts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cain, Ronald A.; Kirk, Bernadette Lugue; Agreda, Carla L.
This report complements A Methodology for Succession Planning for Technical Experts (Ron Cain, Shaheen Dewji, Carla Agreda, Bernadette Kirk, July 2017), which describes a methodology for identifying and evaluating the loss of key technical skills at nuclear operations facilities. This report targets the methodology for identifying critical skills, hereafter referred to as “core competencies”. The methodology has been field tested by interviewing selected retiring subject matter experts (SMEs).
Evolution of regional to global paddy rice mapping methods
NASA Astrophysics Data System (ADS)
Dong, J.; Xiao, X.
2016-12-01
Paddy rice agriculture plays an important role in various environmental issues including food security, water use, climate change, and disease transmission. However, regional and global paddy rice maps are surprisingly scarce and sporadic despite numerous efforts in paddy rice mapping algorithms and applications. In this presentation we review existing paddy rice mapping methods from the literature ranging from the 1980s to 2015. In particular, we illustrate the evolution of these paddy rice mapping efforts, looking specifically at the future trajectory of paddy rice mapping methodologies. The biophysical features and growth phases of paddy rice are analyzed first, and feature selection for paddy rice mapping is considered from spectral, polarimetric, temporal, spatial, and textural aspects. We sort paddy rice mapping algorithms into four categories: 1) reflectance data and image statistic-based approaches; 2) vegetation index (VI) data and enhanced image statistic-based approaches; 3) VI or RADAR backscatter-based temporal analysis approaches; and 4) phenology-based approaches through remote sensing recognition of key growth phases. Phenology-based approaches, which exploit unique features of paddy rice (e.g., transplanting) for mapping, have been increasingly used. Based on the literature review, we discuss a series of issues for large-scale operational paddy rice mapping.
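A minimal sketch of the flooding test behind many phenology-based mappers: transplanting is flagged where the water-sensitive index (LSWI) approaches or exceeds the greenness index (EVI). The 0.05 buffer and the reflectance values below are illustrative assumptions:

```python
# Sketch: phenology-based flooding/transplanting flag from a pixel time series.
import numpy as np

def evi(blue, red, nir):
    return 2.5 * (nir - red) / (nir + 6*red - 7.5*blue + 1)

def lswi(nir, swir):
    return (nir - swir) / (nir + swir)

# Toy surface reflectances for one pixel time series (blue, red, nir, swir):
bands = np.array([[0.04, 0.05, 0.30, 0.25],    # dry vegetated
                  [0.05, 0.06, 0.20, 0.12],    # flooding/transplanting signal
                  [0.04, 0.04, 0.45, 0.28]])   # peak growth
b, r, n, s = bands.T
flooded = lswi(n, s) + 0.05 >= evi(b, r, n)
print(flooded)                                 # [False  True False]
```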
Contributions to the Characterization and Mitigation of Rotorcraft Brownout
NASA Astrophysics Data System (ADS)
Tritschler, John Kirwin
Rotorcraft brownout, the condition in which the flow field of a rotorcraft mobilizes sediment from the ground to generate a cloud that obscures the pilot's field of view, continues to be a significant hazard to civil and military rotorcraft operations. This dissertation presents methodologies for: (i) the systematic mitigation of rotorcraft brownout through operational and design strategies and (ii) the quantitative characterization of the visual degradation caused by a brownout cloud. In Part I of the dissertation, brownout mitigation strategies are developed through simulation-based brownout studies that are mathematically formulated within a numerical optimization framework. Two optimization studies are presented. The first study involves the determination of approach-to-landing maneuvers that result in reduced brownout severity. The second study presents a potential methodology for the design of helicopter rotors with improved brownout characteristics. The results of both studies indicate that the fundamental mechanisms underlying brownout mitigation are aerodynamic in nature, and the evolution of a ground vortex ahead of the rotor disk is seen to be a key element in the development of a brownout cloud. In Part II of the dissertation, brownout cloud characterizations are based upon the Modulation Transfer Function (MTF), a metric commonly used in the optics community for the characterization of imaging systems. The use of the MTF in experimentation is examined first, and the application of MTF calculation and interpretation methods to actual flight test data is described. The potential for predicting the MTF from numerical simulations is examined second, and an initial methodology is presented for the prediction of the MTF of a brownout cloud. Results from the experimental and analytical studies rigorously quantify the intuitively-known facts that the visual degradation caused by brownout is a space and time-dependent phenomenon, and that high spatial frequency features, i.e., fine-grained detail, are obscured before low spatial frequency features, i.e., large objects. As such, the MTF is a metric that is amenable to Handling Qualities (HQ) analyses.
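As a rough illustration of the MTF chain referenced here, the sketch below recovers an MTF from a synthetic blurred edge (edge spread function to line spread function to normalized FFT magnitude); real brownout imagery and the flight-test specifics are beyond this toy example:

```python
# Sketch: MTF estimation from an edge -- ESF -> LSF (derivative) -> |FFT|, normalized.
import numpy as np

x = np.linspace(-5, 5, 512)
esf = 1/(1 + np.exp(-x/0.4))          # blurred step edge (toy degraded edge)
lsf = np.gradient(esf, x)             # line spread function is the ESF derivative
lsf /= lsf.sum()

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                         # normalize so MTF(0) == 1
freqs = np.fft.rfftfreq(len(lsf), d=x[1]-x[0])
print(freqs[mtf < 0.5][0])            # spatial frequency where contrast halves
```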
Leça, João M; Pereira, Ana C; Vieira, Ana C; Reis, Marco S; Marques, José C
2015-08-05
Vicinal diketones, namely diacetyl (DC) and pentanedione (PN), are compounds naturally found in beer that play a key role in the definition of its aroma. In lager beer, they are responsible for off-flavors (buttery flavor) and therefore their presence and quantification is of paramount importance to beer producers. Aiming at developing an accurate quantitative monitoring scheme to follow these off-flavor compounds during beer production and in the final product, the headspace solid-phase microextraction (HS-SPME) analytical procedure was tuned through optimally planned experiments and the final settings were fully validated. Optimal design of experiments (O-DOE) is a computational, statistically oriented approach for designing experiments that are most informative according to a well-defined criterion. This methodology was applied for HS-SPME optimization, leading to the following optimal extraction conditions for the quantification of VDK: use of a CAR/PDMS fiber, 5 mL of sample in a 20 mL vial, and 5 min of pre-incubation followed by 25 min of extraction at 30 °C, with agitation. The validation of the final analytical methodology was performed using a matrix-matched calibration, in order to minimize matrix effects. The following key features were obtained: linearity (R² > 0.999, both for diacetyl and 2,3-pentanedione), high sensitivity (LOD of 0.92 μg L⁻¹ and 2.80 μg L⁻¹, and LOQ of 3.30 μg L⁻¹ and 10.01 μg L⁻¹, for diacetyl and 2,3-pentanedione, respectively), recoveries of approximately 100%, and suitable precision (repeatability and reproducibility lower than 3% and 7.5%, respectively). The applicability of the methodology was fully confirmed through an independent analysis of several beer samples, with analyte concentrations ranging from 4 to 200 μg L⁻¹. Copyright © 2015 Elsevier B.V. All rights reserved.
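A short sketch of the calibration arithmetic behind such validation figures, using the common LOD = 3.3·s/slope and LOQ = 10·s/slope conventions; the concentrations and peak areas are toy numbers, not the study's data:

```python
# Sketch: linear calibration with LOD/LOQ from the residual standard error.
import numpy as np

conc = np.array([5, 25, 50, 100, 150, 200.0])      # spiked levels, ug/L (toy)
area = 41.8*conc + np.array([3.1, -2.4, 5.0, -4.2, 6.3, -1.9])  # peak areas (toy)

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope*conc + intercept)
s = np.sqrt(np.sum(resid**2) / (len(conc) - 2))    # residual standard error

lod = 3.3 * s / slope
loq = 10.0 * s / slope
r2 = 1 - np.sum(resid**2)/np.sum((area - area.mean())**2)
print(f"R2={r2:.4f}  LOD={lod:.2f} ug/L  LOQ={loq:.2f} ug/L")
```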
Development of a standardized training course for laparoscopic procedures using Delphi methodology.
Bethlehem, Martijn S; Kramp, Kelvin H; van Det, Marc J; ten Cate Hoedemaker, Henk O; Veeger, Nicolaas J G M; Pierie, Jean Pierre E N
2014-01-01
Content, evaluation, and certification of laparoscopic skills and procedure training lack uniformity among different hospitals in The Netherlands. Within the process of developing a new regional laparoscopic training curriculum, a uniform and transferable curriculum was constructed for a series of laparoscopic procedures. The aim of this study was to determine regional expert consensus regarding the key steps for laparoscopic appendectomy and cholecystectomy using Delphi methodology. Lists of suggested key steps for laparoscopic appendectomy and cholecystectomy were created using surgical textbooks, available guidelines, and local practice. A total of 22 experts, working for teaching hospitals throughout the region, were asked to rate the suggested key steps for both procedures on a Likert scale from 1-5. Consensus was reached with Cronbach's α ≥ 0.90. Of the 22 experts, 21 completed and returned the survey (95%). Data analysis already showed consensus after the first round of Delphi on the key steps for laparoscopic appendectomy (Cronbach's α = 0.92) and laparoscopic cholecystectomy (Cronbach's α = 0.90). After the second round, 15 proposed key steps for laparoscopic appendectomy and 30 proposed key steps for laparoscopic cholecystectomy were rated as important (≥4 by at least 80% of the expert panel). These key steps were used for the further development of the training curriculum. By using the Delphi methodology, regional consensus was reached on the key steps for laparoscopic appendectomy and cholecystectomy. These key steps are going to be used for standardized training and evaluation purposes in a new regional laparoscopic curriculum. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
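For reference, Cronbach's α as used for the consensus criterion can be computed directly; the sketch below uses toy expert-by-step ratings, not the study's survey data:

```python
# Sketch: Cronbach's alpha over expert ratings (rows = experts, cols = key steps).
import numpy as np

def cronbach_alpha(ratings):
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                          # number of items (key steps)
    item_var = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k/(k - 1) * (1 - item_var/total_var)

ratings = [[5, 5, 4, 5, 5],
           [4, 4, 4, 4, 5],
           [3, 3, 2, 3, 4],
           [4, 5, 4, 4, 5]]
print(round(cronbach_alpha(ratings), 3))          # ~0.96 for these toy ratings
```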
Tonkin, Emma; Brimblecombe, Julie; Wycherley, Thomas Philip
2017-03-01
Smartphone applications are increasingly being used to support nutrition improvement in community settings. However, there is a scarcity of practical literature to support researchers and practitioners in choosing or developing health applications. This work maps the features, key content, theoretical approaches, and methods of consumer testing of applications intended for nutrition improvement in community settings. A systematic, scoping review methodology was used to map published, peer-reviewed literature reporting on applications with a specific nutrition-improvement focus intended for use in the community setting. After screening, articles were grouped into 4 categories: dietary self-monitoring trials, nutrition improvement trials, application description articles, and qualitative application development studies. For mapping, studies were also grouped into categories based on the target population and aim of the application or program. Of the 4818 titles identified from the database search, 64 articles were included. The broad categories of features found to be included in applications generally corresponded to different behavior change support strategies common to many classic behavioral change models. Key content of applications generally focused on food composition, with tailored feedback most commonly used to deliver educational content. Consumer testing before application deployment was reported in just over half of the studies. Collaboration between practitioners and application developers promotes an appropriate balance of evidence-based content and functionality. This work provides a unique resource for program development teams and practitioners seeking to use an application for nutrition improvement in community settings. © 2017 American Society for Nutrition.
A hierarchical anatomical classification schema for prediction of phenotypic side effects.
Wadhwa, Somin; Gupta, Aishwarya; Dokania, Shubham; Kanji, Rakesh; Bagler, Ganesh
2018-01-01
Prediction of adverse drug reactions is an important problem in drug discovery endeavors which can be addressed with data-driven strategies. SIDER is one of the most reliable and frequently used datasets for identification of key features as well as for building machine learning models for side effects prediction. The inherently unbalanced nature of this data presents a difficult multi-label, multi-class problem for prediction of drug side effects. We highlight the intrinsic issue with SIDER data and methodological flaws in relying on performance measures such as AUC while attempting to predict side effects. We argue for the use of metrics that are robust to class imbalance for evaluation of classifiers. Importantly, we present a 'hierarchical anatomical classification schema' which aggregates side effects into organs, sub-systems, and systems. With the help of a weighted performance measure, using 5-fold cross-validation we show that this strategy facilitates biologically meaningful side effects prediction at different levels of the anatomical hierarchy. By implementing various machine learning classifiers we show that a Random Forest model yields the best classification accuracy at each level of coarse-graining. The manually curated, hierarchical schema for side effects can also serve as the basis of future studies towards prediction of adverse reactions and identification of key features linked to specific organ systems. Our study provides a strategy for hierarchical classification of side effects rooted in the anatomy and can pave the way for calibrated expert systems for multi-level prediction of side effects.
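A compact sketch of the evaluation argument: a Random Forest on a deliberately imbalanced toy label, scored with Matthews correlation rather than accuracy or plain AUC. The data, descriptors, and parameters are illustrative, not SIDER:

```python
# Sketch: imbalance-robust scoring (MCC) of a Random Forest under 5-fold CV.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import make_scorer, matthews_corrcoef
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.standard_normal((300, 40))               # e.g., drug descriptors (toy)
y = (X[:, 0] + 0.5*rng.standard_normal(300) > 1.2).astype(int)   # minority label (~14%)

clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
mcc = cross_val_score(clf, X, y, cv=5, scoring=make_scorer(matthews_corrcoef))
print(y.mean(), mcc.mean())
```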
CMOS Active Pixel Sensor Technology and Reliability Characterization Methodology
NASA Technical Reports Server (NTRS)
Chen, Yuan; Guertin, Steven M.; Pain, Bedabrata; Kayaii, Sammy
2006-01-01
This paper describes the technology, design features and reliability characterization methodology of a CMOS Active Pixel Sensor. Both overall chip reliability and pixel reliability are projected for the imagers.
Røislien, Jo; Winje, Brita
2013-09-20
Clinical studies frequently include repeated measurements of individuals, often for long periods. We present a methodology for extracting common temporal features across a set of individual time series observations. In particular, the methodology explores extreme observations within the time series, such as spikes, as a possible common temporal phenomenon. Wavelet basis functions are attractive in this sense, as they are localized in both time and frequency domains simultaneously, allowing for localized feature extraction from a time-varying signal. We apply wavelet basis function decomposition of individual time series, with corresponding wavelet shrinkage to remove noise. We then extract common temporal features using linear principal component analysis on the wavelet coefficients, before inverse transformation back to the time domain for clinical interpretation. We demonstrate the methodology on a subset of a large fetal activity study aiming to identify temporal patterns in fetal movement (FM) count data in order to explore formal FM counting as a screening tool for identifying fetal compromise and thus preventing adverse birth outcomes. Copyright © 2013 John Wiley & Sons, Ltd.
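A compact sketch of this wavelet-then-PCA idea on toy spike-containing count series (PyWavelets soft-threshold shrinkage, then linear PCA on the retained coefficients); thresholds and data are illustrative, not the fetal-movement study's settings:

```python
# Sketch: per-individual wavelet shrinkage, then PCA across individuals.
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
T, n = 128, 30
series = rng.poisson(3, (n, T)).astype(float)
series[:, 60:64] += rng.poisson(10, (n, 4))      # a shared spike-like feature

def shrunk_coeffs(x, wavelet="db4", level=3):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Universal threshold from the MAD of the finest detail coefficients.
    thr = np.median(np.abs(coeffs[-1])) / 0.6745 * np.sqrt(2*np.log(len(x)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return np.concatenate(coeffs)

W = np.array([shrunk_coeffs(x) for x in series])
scores = PCA(n_components=2).fit_transform(W)    # common temporal features per person
print(scores.shape)
```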
NASA Astrophysics Data System (ADS)
Ikeno, Rimon; Maruyama, Satoshi; Mita, Yoshio; Ikeda, Makoto; Asada, Kunihiro
2016-03-01
Among the various electron-beam lithography (EBL) techniques, the variable-shaped beam (VSB) and character projection (CP) methods have attracted many EBL users for their high throughput, but they are considered better suited to small-featured VLSI fabrication with regularly arranged layouts such as standard-cell logic and memory arrays. On the other hand, non-VLSI applications such as photonics, MEMS, and MOEMS have not fully utilized the benefits of the CP method because of the wide variety of their layout patterns. In addition, the stepwise edge shapes produced by the VSB method often cause intolerable edge roughness, degrading device characteristics from the performance intended with smooth edges. We propose an overall EBL methodology applicable to a wide variety of EBL applications utilizing the VSB and CP methods. Its key idea is a layout data conversion algorithm that decomposes curved or oblique edges of arbitrary layout patterns into CP shots. We expect a significant reduction in EB shot count with CP-bordered exposure data compared with the corresponding VSB-only conversion result. Several CP conversion parameters are used to optimize EB exposure throughput, edge quality, and the resulting device characteristics. We demonstrated our methodology using a leading-edge VSB/CP EBL tool, the ADVANTEST F7000S-VD02, with high-resolution hydrogen silsesquioxane (HSQ) resist. Through experiments on curved- and oblique-edge lithography under various data conversion conditions, we learned the correspondence between the conversion parameters and the resulting edge roughness and other conditions. These results will serve as fundamental data for further enhancement of our EBL strategy for optimized EB exposure.
The Device Centric Communication System for 5G Networks
NASA Astrophysics Data System (ADS)
Biswash, S. K.; Jayakody, D. N. K.
2017-01-01
Fifth Generation (5G) communication networks have several functional features, such as massive Multiple Input Multiple Output (MIMO), device-centric data and voice support, and smarter-device communications. The objective for 5G networks is to achieve 1000x more throughput, 10x higher spectral efficiency, and 100x more energy efficiency than existing technologies. The 5G system will balance the Quality of Experience (QoE) and the Quality of Service (QoS) without compromising user benefit. The data rate has been the key metric for wireless QoS, while QoE deals with delay and throughput. To realize a balance between QoS and QoE, we propose a cellular device-centric communication methodology for overlapping network coverage areas in the 5G communication system. Multiple beacon signals from mobile towers indicate an overlapping network area, in which a user must be forwarded to the next location area. To resolve this issue, we suggest a user-centric methodology (without a Base Station interface) to hand the device over to the next area until the user finalizes the communication. The proposed method will reduce the signalling cost and overheads of the communication.
Abdur-Rashid, Khalil; Furber, Steven Woodward; Abdul-Basser, Taha
2013-04-01
We survey the meta-ethical tools and institutional processes that traditional Islamic ethicists apply when deliberating on bioethical issues. We present a typology of these methodological elements, giving particular attention to the meta-ethical techniques and devices that traditional Islamic ethicists employ in the absence of decisive or univocal authoritative texts or in the absence of established transmitted cases. In describing how traditional Islamic ethicists work, we demonstrate that these experts possess a variety of discursive tools. We find that the ethical responsa (i.e., the products of the application of the tools that we describe) are generally characterized by internal consistency. We also conclude that Islamic ethical reasoning on bioethical issues, while clearly scripture-based, is also characterized by strong consequentialist elements and possesses clear principles-based characteristics. The paper contributes to the study of bioethics by familiarizing non-specialists in Islamic ethics with the role, scope, and applicability of key Islamic ethical concepts, such as "aims" (maqāṣid), "universals" (kulliyyāt), "interest" (maṣlaḥa), "maxims" (qawā`id), "controls" (ḍawābit), "differentiators" (furūq), "preponderization" (tarjīḥ), and "extension" (tafrī`).
Key features of an EU health information system: a concept mapping study.
Rosenkötter, Nicole; Achterberg, Peter W; van Bon-Martens, Marja J H; Michelsen, Kai; van Oers, Hans A M; Brand, Helmut
2016-02-01
Despite the acknowledged value of an EU health information system (EU-HISys) and the many achievements in this field, the landscape is still heavily fragmented and incomplete. Through a systematic analysis of the opinions and valuations of public health stakeholders, this study aims to conceptualize key features of an EU-HISys. Public health professionals and policymakers were invited to participate in a concept mapping procedure. First, participants (N = 34) formulated statements that reflected their vision of an EU-HISys. Second, participants (N = 28) rated the relative importance of each statement and grouped conceptually similar ones. Principal Component and cluster analyses were used to condense these results to EU-HISys key features in a concept map. The number of key features and the labelling of the concept map were determined by expert consensus. The concept map contains 10 key features that summarize 93 statements. The map consists of a horizontal axis that represents the relevance of an 'organizational strategy', which deals with the 'efforts' to design and develop an EU-HISys and the 'achievements' gained by a functioning EU-HISys. The vertical axis represents the 'professional orientation' of the EU-HISys, ranging from the 'scientific' through to the 'policy' perspective. The top ranking statement expressed the need to establish a system that is permanent and sustainable. The top ranking key feature focuses on data and information quality. This study provides insights into key features of an EU-HISys. The results can be used to guide future planning and to support the development of a health information system for Europe. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
Brown, Matt A; Bishnoi, Ram J; Dholakia, Sara; Velligan, Dawn I
2016-01-20
Recent failures to detect efficacy in clinical trials investigating pharmacological treatments for schizophrenia raise concerns regarding the potential contribution of methodological shortcomings to this research. This review examines two key methodological issues currently suspected of hampering schizophrenia drug development: 1) limitations on the translational utility of preclinical development models, and 2) methodological challenges posed by increased placebo effects. Recommendations for strategies to address these methodological issues are provided.
NASA Astrophysics Data System (ADS)
Tarolli, Paolo; Fuller, Ian C.; Basso, Federica; Cavalli, Marco; Sofia, Giulia
2017-04-01
Hydro-geomorphic connectivity has emerged as a significant new concept for understanding the transfer of surface water and sediment through landscapes. A further scientific challenge is determining how the concept can be used to enable sustainable land and water management. This research proposes an approach that integrates remote sensing techniques, connectivity theory, and geomorphometry based on high-resolution digital terrain models (HR-DTMs) to automatically extract landslide crowns and gully erosion, to determine the different rates of connectivity between the main extracted features and the river network, and thus to derive a possible categorization of hazardous areas. The study takes place in two mountainous catchments in the Wellington Region (New Zealand). The methodology is a three-step approach. First, we performed an automatic detection of likely landslide crowns using thresholds obtained from a statistical analysis of the variability of landform curvature. Second, we used the Connectivity Index to analyse how a complex and rugged topography induces large variations in erosion and sediment delivery in the two catchments. Last, the two methods were integrated into a single procedure able to classify the different rates of connectivity between the main features and the river network, thereby identifying potential threats and hazardous areas. The methodology is fast, and it can produce a detailed and updated inventory map that could be a key tool for mitigating erosion and sediment delivery hazards. This fast and simple method can be a useful tool for managing emergencies by giving priority to the most failure-prone zones. Furthermore, it could support preliminary interpretation of geomorphological phenomena and, more generally, serve as a basis for developing inventory maps.
References: Cavalli M, Trevisani S, Comiti F, Marchi L. 2013. Geomorphometric assessment of spatial sediment connectivity in small Alpine catchments. Geomorphology 188: 31-41. DOI: 10.1016/j.geomorph.2012.05.007. Sofia G, Dalla Fontana G, Tarolli P. 2014. High-resolution topography and anthropogenic feature extraction: testing geomorphometric parameters in floodplains. Hydrological Processes 28(4): 2046-2061. DOI: 10.1002/hyp.9727. Tarolli P, Sofia G, Dalla Fontana G. 2012. Geomorphic features extraction from high-resolution topography: landslide crowns and bank erosion. Natural Hazards 61(1): 65-83. DOI: 10.1007/s11069-010-9695-2.
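To make the first step concrete, here is a minimal sketch of statistical curvature thresholding on a synthetic DTM; the Laplacian curvature proxy and the mean-plus-k-standard-deviations threshold are assumptions standing in for the paper's landform-curvature statistics.

```python
# Sketch: flag likely landslide crowns as strongly convex breaks in a DTM.
# The terrain and threshold form are illustrative, not the authors' exact method.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
dtm = ndimage.gaussian_filter(rng.random((200, 200)) * 50.0, sigma=5)  # synthetic terrain

curvature = ndimage.laplace(dtm)          # discrete Laplacian as a curvature proxy
k = 2.0                                   # dispersion multiplier, tuned per catchment
threshold = curvature.mean() + k * curvature.std()
crown_mask = curvature > threshold        # convex breaks flagged as candidate crowns

labels, n_features = ndimage.label(crown_mask)
print(f"{n_features} candidate landslide-crown features")
```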
On the effect of model parameters on forecast objects
NASA Astrophysics Data System (ADS)
Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott
2018-04-01
Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. For some quantities, the field generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
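A compact sketch of ingredients (1) and (3), using SciPy's Latin hypercube sampler and a toy multivariate regression; the parameter bounds and "object features" below are invented for illustration.

```python
# Sketch: Latin hypercube sampling of model parameters, then a multivariate
# multiple regression of (placeholder) object features on those parameters.
import numpy as np
from scipy.stats import qmc
from sklearn.linear_model import LinearRegression

sampler = qmc.LatinHypercube(d=3, seed=0)       # 3 model parameters
unit = sampler.random(n=50)                     # 50 model runs
params = qmc.scale(unit, l_bounds=[0.1, 1.0, 0.0], u_bounds=[1.0, 5.0, 2.0])

# placeholder "forecast object features" (e.g., mean object size, intensity)
rng = np.random.default_rng(0)
features = params @ rng.random((3, 2)) + 0.1 * rng.standard_normal((50, 2))

# each regression coefficient is a sensitivity of one object feature
# to one model parameter
reg = LinearRegression().fit(params, features)
print(reg.coef_)    # rows: object features, columns: model parameters
```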
ERIC Educational Resources Information Center
Walker, Allan; Qian, Haiyan
2015-01-01
Purpose: The purpose of this paper is to review English-language publications about school principalship in China published between 1998 and 2013 and to present an overview of the authorship, topics, methodologies and key findings of these publications. Design/methodology/approach: The methodology includes an exhaustive review of journal articles…
Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data
NASA Astrophysics Data System (ADS)
Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.
2017-09-01
The key microstructural feature providing strength to age-hardenable Al alloys is nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enables the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with a conventional method based on manual counting and measurements. By virtue of its data analysis objectivity, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.
Microengineering neocartilage scaffolds.
Petersen, Erik F; Spencer, Richard G S; McFarland, Eric W
2002-06-30
Advances in micropatterning methodologies have made it possible to create structures with precise architecture on the surface of cell culture substrata. We applied these techniques to fabricate microfeatures (15-65 µm wide; 40 µm deep) on the surface of a flexible, biocompatible polysaccharide gel. The micropatterned polymer gels were subsequently applied as scaffolds for chondrocyte culture and proved effective in maintaining key aspects of the chondrogenic phenotype: rounded cell morphology and a positive and statistically significant (p < 0.0001) immunofluorescence assay for the production of type II collagen throughout the maximum culture time of 10 days after cell seeding. Further, cells housed within individual surface features were observed to proliferate, while serial application of chondrocytes resulted in the formation of cellular aggregates. These methods represent a novel approach to the problem of engineering reparative cartilage in vitro. Copyright 2002 Wiley Periodicals, Inc.
Ablation, Thermal Response, and Chemistry Program for Analysis of Thermal Protection Systems
NASA Technical Reports Server (NTRS)
Milos, Frank S.; Chen, Yih-Kanq
2010-01-01
In previous work, the authors documented the Multicomponent Ablation Thermochemistry (MAT) and Fully Implicit Ablation and Thermal response (FIAT) programs. In this work, key features from MAT and FIAT were combined to create the new Fully Implicit Ablation, Thermal response, and Chemistry (FIATC) program. FIATC is fully compatible with FIAT (version 2.5) but has expanded capabilities to compute the multispecies surface chemistry and ablation rate as part of the surface energy balance. This new methodology eliminates B' tables, provides blown species fractions as a function of time, and enables calculations that would otherwise be impractical (e.g. 4+ dimensional tables) such as pyrolysis and ablation with kinetic rates or unequal diffusion coefficients. Equations and solution procedures are presented, then representative calculations of equilibrium and finite-rate ablation in flight and ground-test environments are discussed.
Fajans, Peter; Simmons, Ruth; Ghiron, Laura
2006-03-01
Public sector health systems that provide services to poor and marginalized populations in developing countries face great challenges. Change associated with health sector reform and structural adjustment often leaves these already-strained institutions with fewer resources and insufficient capacity to relieve health burdens. The Strategic Approach to Strengthening Reproductive Health Policies and Programs is a methodological innovation developed by the World Health Organization and its partners to help countries identify and prioritize their reproductive health service needs, test appropriate interventions, and scale up successful innovations to a subnational or national level. The participatory, interdisciplinary, and country-owned process can set in motion much-needed change. We describe key features of this approach, provide illustrations from country experiences, and use insights from the diffusion of innovation literature to explain the approach's dissemination and sustainability.
Transport in Dynamical Astronomy and Multibody Problems
NASA Astrophysics Data System (ADS)
Dellnitz, Michael; Junge, Oliver; Koon, Wang Sang; Lekien, Francois; Lo, Martin W.; Marsden, Jerrold E.; Padberg, Kathrin; Preis, Robert; Ross, Shane D.; Thiere, Bianca
We combine the techniques of almost invariant sets (using tree structured box elimination and graph partitioning algorithms) with invariant manifold and lobe dynamics techniques. The result is a new computational technique for computing key dynamical features, including almost invariant sets, resonance regions as well as transport rates and bottlenecks between regions in dynamical systems. This methodology can be applied to a variety of multibody problems, including those in molecular modeling, chemical reaction rates and dynamical astronomy. In this paper we focus on problems in dynamical astronomy to illustrate the power of the combination of these different numerical tools and their applicability. In particular, we compute transport rates between two resonance regions for the three-body system consisting of the Sun, Jupiter and a third body (such as an asteroid). These resonance regions are appropriate for certain comets and asteroids.
Bartholomew, Theodore T; Lockard, Allison J
2018-06-13
Mixed methods can foster depth and breadth in psychological research. However, their use remains in development in psychotherapy research. Our purpose was to review the use of mixed methods in psychotherapy research. Thirty-one studies were identified via the PRISMA systematic review method. Using Creswell & Plano Clark's typologies to identify design characteristics, we assessed each study for rigor and for how it used mixed methods. Key features of mixed methods designs were assessed, and the following common patterns were identified: (a) integration of clients' perceptions via mixing; (b) understanding group psychotherapy; (c) integrating methods with cases and small samples; (d) analyzing clinical data as qualitative data; and (e) exploring cultural identities in psychotherapy through mixed methods. The review is discussed with respect to the value of integrating multiple data sources in single studies to enhance psychotherapy research. © 2018 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
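The core idea, probabilistic branching sampled by Monte Carlo so that each generated scenario carries a likelihood estimate, can be illustrated with a toy sketch; the events and probabilities below are invented and are not the OBEST aviation model.

```python
# Toy scenario tree with probabilistic branching, sampled by Monte Carlo.
# Event names and branch probabilities are hypothetical.
import random
from collections import Counter

def runway_scenario(rng: random.Random) -> str:
    events = []
    if rng.random() < 0.02:              # aircraft misses hold-short line
        events.append("incursion-initiated")
        if rng.random() < 0.7:           # controller detects in time
            events.append("controller-intervenes")
        else:
            events.append("undetected")
    else:
        events.append("normal-taxi")
    return " -> ".join(events)

rng = random.Random(42)
counts = Counter(runway_scenario(rng) for _ in range(100_000))
for scenario, n in counts.most_common():
    print(f"{scenario}: estimated probability {n / 100_000:.4f}")
```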
An economic analysis methodology for project evaluation and programming.
DOT National Transportation Integrated Search
2013-08-01
Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...
New mission requirements methodologies for services provided by the Office of Space Communications
NASA Technical Reports Server (NTRS)
Holmes, Dwight P.; Hall, J. R.; Macoughtry, William; Spearing, Robert
1993-01-01
The Office of Space Communications, NASA Headquarters, has recently revised its methodology for receiving, accepting, and responding to customer requests for the use of that office's tracking and communications capabilities. This revision is the result of a process which had become over-burdened by the size of the currently active and proposed mission set, requirements reviews that focused on single missions rather than on mission sets, and negotiations most often not completed early enough to effect needed additions to capacity or capability prior to launch. The requirements-coverage methodology described is more responsive to project/program needs, provides integrated input into the NASA budget process early enough to effect change, and describes the mechanisms and tools in place to ensure a value-added process which will benefit both NASA and its customers. Key features of the requirements methodology include establishing a mechanism for early identification of, and system trades with, new customers, and delegating the review and approval of requirements documents from Headquarters to the NASA centers, thus empowering the system design teams to establish and negotiate the detailed requirements with the user. A Mission Requirements Request (MRR) is introduced to facilitate early customer interaction. The expected result is that the time to achieve an approved set of implementation requirements that meets the customer's needs can be greatly reduced. Finally, by increasing the discipline in requirements management through the use of baselining procedures, a tighter coupling between customer requirements and the budget is provided. A twice-yearly projection of customer requirements accommodation, designated the Capacity Projection Plan (CPP), provides customer feedback allowing the entire mission set to be serviced.
Boger, Elin; Ewing, Pär; Eriksson, Ulf G; Fihn, Britt-Marie; Chappell, Michael; Evans, Neil; Fridén, Markus
2015-05-01
Investigation of pharmacokinetic/pharmacodynamic (PK/PD) relationships for inhaled drugs is challenging because of the limited possibilities of measuring tissue exposure and target engagement in the lung. The aim of this study was to develop a methodology for measuring receptor occupancy in vivo in the rat for the glucocorticoid receptor (GR) to allow more informative inhalation PK/PD studies. From AstraZeneca's chemical library of GR binders, compound 1 [N-(2-amino-2-oxo-ethyl)-3-[5-[(1R,2S)-2-(2,2-difluoropropanoylamino)-1-(2,3-dihydro-1,4-benzodioxin-6-yl)propoxy]indazol-1-yl]-N-methyl-benzamide] was identified to have properties that are useful as a tracer for GR in vitro. When given at an appropriate dose (30 nmol/kg) to rats, compound 1 functioned as a tracer in the lung and spleen in vivo using liquid chromatography-tandem mass spectrometry bioanalysis. The methodology was successfully used to show the dose-receptor occupancy relationship measured at 1.5 hours after intravenous administration of fluticasone propionate (20, 150, and 750 nmol/kg) as well as to characterize the time profile for receptor occupancy after a dose of 90 nmol/kg i.v. The dose giving 50% occupancy was estimated as 47 nmol/kg. The methodology is novel in terms of measuring occupancy strictly in vivo and by using an unlabeled tracer. This feature confers key advantages, including occupancy estimation not being influenced by drug particle dissolution or binding/dissociation taking place postmortem. In addition, the tracer may be labeled for use in positron emission tomography imaging, thus enabling occupancy estimation in humans as a translatable biomarker of target engagement. Copyright © 2015 by The American Society for Pharmacology and Experimental Therapeutics.
Yovel, Galit
2009-11-01
It is often argued that picture-plane face inversion impairs discrimination of the spacing among face features to a greater extent than the identity of the facial features. However, several recent studies have reported similar inversion effects for both types of face manipulations. In a recent review, Rossion (2008) claimed that similar inversion effects for spacing and features are due to methodological and conceptual shortcomings and that data still support the idea that inversion impairs the discrimination of features less than that of the spacing among them. Here I will claim that when facial features differ primarily in shape, the effect of inversion on features is not smaller than the one on spacing. It is when color/contrast information is added to facial features that the inversion effect on features decreases. This obvious observation accounts for the discrepancy in the literature and suggests that the large inversion effect that was found for features that differ in shape is not a methodological artifact. These findings together with other data that are discussed are consistent with the idea that the shape of facial features and the spacing among them are integrated rather than dissociated in the holistic representation of faces.
Young, Jacy L; Green, Christopher D
2013-11-01
In this article, we present the results of an exploratory digital analysis of the contents of the two journals founded in the late 19th century by American psychologist G. Stanley Hall. Using the methods of the increasingly popular digital humanities, some key attributes of the American Journal of Psychology (AJP) and the Pedagogical Seminary (PS) are identified. Our analysis reaffirms some of Hall's explicit aims for the two periodicals, while also revealing a number of other features of the journals, as well as of the people who published within their pages, the methodologies they employed, and the institutions at which they worked. Notably, despite Hall's intent that his psychological journal be strictly an outlet for scientific research, the journal, like its sister pedagogically focused publication, included an array of methodologically diverse research. The multiplicity of research styles that characterize the content of Hall's journals in their initial years is, in part, a consequence of individual researchers at times crossing methodological lines and producing a diverse body of research. Along with such variety within each periodical, it is evident that the line between content appropriate to one periodical rather than the other was fluid rather than absolute. The full results of this digitally informed analysis of Hall's two journals suggest a number of novel avenues for future research and demonstrate the utility of digital methods as applied to the history of psychology. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Practical Issues of Conducting a Q Methodology Study: Lessons Learned From a Cross-cultural Study.
Stone, Teresa Elizabeth; Maguire, Jane; Kang, Sook Jung; Cha, Chiyoung
This article advances nursing research by presenting the methodological challenges experienced in conducting a multination Q-methodology study. This article critically analyzes the relevance of the methodology for cross-cultural and nursing research and the challenges that led to specific responses by the investigators. The use of focus groups with key stakeholders supplemented the Q-analysis results. The authors discuss practical issues and shared innovative approaches and provide best-practice suggestions on the use of this flexible methodology. Q methodology has the versatility to explore complexities of contemporary nursing practice and cross-cultural health research.
Georgoulas, George; Georgopoulos, Voula C; Stylios, Chrysostomos D
2006-01-01
This paper proposes a novel integrated methodology to extract features and classify speech sounds, with the intent to detect the possible existence of a speech articulation disorder in a speaker. Articulation, in effect, is the specific and characteristic way that an individual produces speech sounds. A methodology to process the speech signal, extract features, and finally classify the signal and detect articulation problems in a speaker is presented. The use of support vector machines (SVMs) for the classification of speech sounds and the detection of articulation disorders is introduced. The proposed method is implemented on a dataset where different sets of features and different SVM schemes are tested, leading to satisfactory performance.
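A schematic of the classification stage is sketched below, with random placeholder feature vectors standing in for the paper's extracted speech features; labels and dimensions are illustrative only.

```python
# Schematic SVM classification of speech-sound feature vectors.
# Features and labels are synthetic placeholders, not the paper's data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
X = rng.random((160, 24))      # e.g., 24 spectral features per utterance
y = rng.integers(0, 2, 160)    # 0 = typical, 1 = possible articulation disorder

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(f"5-fold accuracy: {cross_val_score(svm, X, y, cv=5).mean():.3f}")
```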
A hybrid approach to select features and classify diseases based on medical data
NASA Astrophysics Data System (ADS)
AbdelLatif, Hisham; Luo, Jiawei
2018-03-01
Feature selection is a popular problem in the classification of diseases in clinical medicine. Here, we develop a hybrid methodology to classify diseases based on three medical datasets: the Arrhythmia, Breast Cancer, and Hepatitis datasets. This methodology, called k-means ANOVA Support Vector Machine (K-ANOVA-SVM), uses k-means clustering with the ANOVA statistic to preprocess the data and select significant features, and Support Vector Machines in the classification process. To compare and evaluate performance, we chose three classification algorithms (decision tree, Naïve Bayes, and Support Vector Machines) and applied the medical datasets directly to these algorithms. Our methodology achieved much better classification accuracy, 98% on the Arrhythmia dataset, 92% on the Breast Cancer dataset, and 88% on the Hepatitis dataset, compared with applying the medical data directly to the decision tree, Naïve Bayes, and Support Vector Machine classifiers. The ROC curve and precision results with K-ANOVA-SVM were also better than those of the other algorithms.
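One possible reading of the K-ANOVA-SVM pipeline is sketched below, using a public dataset as a stand-in; the use of cluster labels as a feature, the number of selected features, and the SVM settings are assumptions, and the authors' exact procedure may differ.

```python
# Sketch of a k-means + ANOVA feature selection + SVM pipeline.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.cluster import KMeans
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# k-means step: append each sample's (unsupervised) cluster id as an extra
# feature -- one plausible use of the clustering stage
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
X_aug = np.hstack([X, km.labels_[:, None].astype(float)])

pipe = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=10),   # ANOVA F-test picks significant features
    SVC(kernel="rbf", C=1.0),
)
print(f"5-fold accuracy: {cross_val_score(pipe, X_aug, y, cv=5).mean():.3f}")
```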
Biosynthesis of Inorganic Nanoparticles: A Fresh Look at the Control of Shape, Size and Composition
Dahoumane, Si Amar; Jeffryes, Clayton; Mechouet, Mourad; Agathos, Spiros N.
2017-01-01
Several methodologies have been devised for the design of nanomaterials. The “Holy Grail” for materials scientists is the cost-effective, eco-friendly synthesis of nanomaterials with controlled sizes, shapes and compositions, as these features confer to the as-produced nanocrystals unique properties making them appropriate candidates for valuable bio-applications. The present review summarizes published data regarding the production of nanomaterials with special features via sustainable methodologies based on the utilization of natural bioresources. The richness of the latter, the diversity of the routes adopted and the tuned experimental parameters have led to the fabrication of nanomaterials belonging to different chemical families with appropriate compositions and displaying interesting sizes and shapes. It is expected that these outstanding findings will encourage researchers and attract newcomers to continue and extend the exploration of possibilities offered by nature and the design of innovative and safer methodologies towards the synthesis of unique nanomaterials, possessing desired features and exhibiting valuable properties that can be exploited in a profusion of fields. PMID:28952493
NASA Astrophysics Data System (ADS)
Revollo Sarmiento, G. N.; Cipolletti, M. P.; Perillo, M. M.; Delrieux, C. A.; Perillo, Gerardo M. E.
2016-03-01
Tidal flats generally exhibit ponds of diverse size, shape, orientation, and origin. Studying the genesis, evolution, stability, and erosive mechanisms of these geographic features is critical to understanding the dynamics of coastal wetlands. However, monitoring these locations through direct access is hard and expensive, not always feasible, and environmentally damaging. Processing remote sensing images is a natural alternative for the extraction of qualitative and quantitative data due to their non-invasive nature. In this work, a robust methodology for the automatic classification of ponds and tidal creeks in tidal flats using Google Earth images is proposed. The applicability of our method is tested in nine zones with different morphological settings. Each zone is processed by a segmentation stage, where ponds and tidal creeks are identified. Next, each geographical feature is measured and a set of shape descriptors is calculated. This dataset, together with an a priori classification of each geographical feature, is used to define a regression model, which allows extensive automatic classification of large volumes of data, discriminating ponds and tidal creeks from various other geographical features. In all cases, we identified and automatically classified the different geographic features with an average accuracy over 90% (89.7% in the worst case and 99.4% in the best case). These results show the feasibility of using freely available Google Earth imagery for the automatic identification and classification of complex geographical features. The presented methodology may also be easily applied in other wetlands of the world, perhaps using other remote sensing imagery.
Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P
2016-04-01
To identify the key methodological challenges for public health economic modelling and set an agenda for future research. An iterative literature search identified papers describing methodological challenges for developing the structure of public health economic models. Additional multidisciplinary literature searches helped expand upon important ideas raised within the review. Fifteen articles were identified within the formal literature search, highlighting three key challenges: inclusion of non-healthcare costs and outcomes; inclusion of equity; and modelling complex systems and multi-component interventions. Based upon these and multidisciplinary searches about dynamic complexity, the social determinants of health, and models of human behaviour, six areas for future research were specified. Future research should focus on: the use of systems approaches within health economic modelling; approaches to assist the systematic consideration of the social determinants of health; methods for incorporating models of behaviour and social interactions; consideration of equity; and methodology to help modellers develop valid, credible and transparent public health economic model structures.
An investigation into creative design methodologies for textiles and fashion
NASA Astrophysics Data System (ADS)
Gault, Alison
2017-10-01
Understanding market intelligence, trends, influences, and personal approaches is essential for design students developing their ideas in textiles and fashion. Identifying different personal approaches, including visual, process-led, or concept-led, by employing creative methodologies is key to developing a brief. A series of ideas or themes starts to emerge and, through the design process, serves to underpin and inform an entire collection. These investigations ensure that the design collections are able to produce a diverse range of outcomes. Following key structures and coherent stages in the design process creates authentic collections in textiles and fashion. A range of undergraduate students presented their design portfolios (180 in total), and the methodologies employed were mapped against success at module level, industry response, and graduate employment.
ERIC Educational Resources Information Center
Brown, Robert D.; Gortmaker, Valerie J.
2009-01-01
Methodological and political issues arise during the designing, conducting, and reporting of campus-climate studies for LGBT students. These issues interact; making a decision about a methodological issue (e.g., sample size) has an impact on a political issue (e.g., how well the findings will be received). Ten key questions that must be addressed…
Speed-Accuracy Tradeoffs in Speech Production
2017-06-01
imaging data of speech production. A theoretical framework for considering Fitts’ law in the domain of speech production is elucidated. Methodological ...articulatory kinematics conform to Fitts’ law. A second, associated goal is to address the methodological challenges inherent in performing Fitts-style...analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor tasks, defining key
Container Surface Evaluation by Function Estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, James G.
Container images are analyzed for specific surface features, such as pits, cracks, and corrosion. The detection of these features is confounded by complicating features, including shape/curvature, welds, edges, scratches, and foreign objects, among others. A method is provided to discriminate between the various features. The method consists of estimating the image background, determining a residual image, and post-processing to determine the features present. The methodology is not finalized but demonstrates the feasibility of a method to determine the kind and size of the features present.
Larriba, Yolanda; Rueda, Cristina; Fernández, Miguel A; Peddada, Shyamal D
2018-01-01
Motivation: Gene-expression data obtained from high throughput technologies are subject to various sources of noise and accordingly the raw data are pre-processed before formally analyzed. Normalization of the data is a key pre-processing step, since it removes systematic variations across arrays. There are numerous normalization methods available in the literature. Based on our experience, in the context of oscillatory systems, such as cell-cycle, circadian clock, etc., the choice of the normalization method may substantially impact the determination of a gene to be rhythmic. Thus rhythmicity of a gene can purely be an artifact of how the data were normalized. Since the determination of rhythmic genes is an important component of modern toxicological and pharmacological studies, it is important to determine truly rhythmic genes that are robust to the choice of a normalization method. Results: In this paper we introduce a rhythmicity measure and a bootstrap methodology to detect rhythmic genes in an oscillatory system. Although the proposed methodology can be used for any high-throughput gene expression data, in this paper we illustrate the proposed methodology using several publicly available circadian clock microarray gene-expression datasets. We demonstrate that the choice of normalization method has very little effect on the proposed methodology. Specifically, for any pair of normalization methods considered in this paper, the resulting values of the rhythmicity measure are highly correlated. Thus it suggests that the proposed measure is robust to the choice of a normalization method. Consequently, the rhythmicity of a gene is potentially not a mere artifact of the normalization method used. Lastly, as demonstrated in the paper, the proposed bootstrap methodology can also be used for simulating data for genes participating in an oscillatory system using a reference dataset. Availability: A user friendly code implemented in R language can be downloaded from http://www.eio.uva.es/~miguel/robustdetectionprocedure.html.
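The authors' rhythmicity measure and bootstrap are specific to their paper; as a generic illustration of the idea, the sketch below scores rhythmicity as the R^2 of a 24-hour cosinor fit and builds a resampling null by permuting time points.

```python
# Illustrative resampling test for rhythmicity in an expression series.
# The cosinor R^2 measure here is a simple stand-in for the paper's measure.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 48, 2.0)    # sampling times (hours)
expr = 1.0 + 0.8 * np.cos(2 * np.pi * t / 24) + 0.3 * rng.standard_normal(t.size)

def cosinor_r2(y, t, period=24.0):
    # least-squares fit of mean + cosine + sine at the given period
    X = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t / period),
                         np.sin(2 * np.pi * t / period)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

obs = cosinor_r2(expr, t)
null = np.array([cosinor_r2(rng.permutation(expr), t) for _ in range(2000)])
p_value = (null >= obs).mean()
print(f"rhythmicity R^2 = {obs:.2f}, resampling p = {p_value:.4f}")
```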
Casado, Monica Rivas; Gonzalez, Rocio Ballesteros; Kriechbaumer, Thomas; Veal, Amanda
2015-11-04
European legislation is driving the development of methods for river ecosystem protection in light of concerns over water quality and ecology. Key to their success is the accurate and rapid characterisation of physical features (i.e., hydromorphology) along the river. Image pattern recognition techniques have been successfully used for this purpose. The reliability of the methodology depends on both the quality of the aerial imagery and the pattern recognition technique used. Recent studies have proved the potential of Unmanned Aerial Vehicles (UAVs) to increase the quality of the imagery by capturing high resolution photography. Similarly, Artificial Neural Networks (ANN) have been shown to be a high precision tool for automated recognition of environmental patterns. This paper presents a UAV based framework for the identification of hydromorphological features from high resolution RGB aerial imagery using a novel classification technique based on ANNs. The framework is developed for a 1.4 km river reach along the river Dee in Wales, United Kingdom. For this purpose, a Falcon 8 octocopter was used to gather 2.5 cm resolution imagery. The results show that the accuracy of the framework is above 81%, performing particularly well at recognising vegetation. These results leverage the use of UAVs for environmental policy implementation and demonstrate the potential of ANNs and RGB imagery for high precision river monitoring and river management.
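A schematic of the ANN classification step, with simulated per-patch RGB statistics in place of the Falcon 8 imagery; the feature design, class list, and network size are placeholders rather than the paper's configuration.

```python
# Schematic ANN classification of river-corridor image patches.
# All data below are simulated placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.random((1000, 6))          # hypothetical per-patch features: mean/std of R, G, B
y = rng.integers(0, 4, 1000)       # e.g., water, vegetation, gravel bar, shadow

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
ann.fit(X_tr, y_tr)
# on random placeholder data this is near chance; on real patch features the
# network learns the hydromorphological classes
print(f"held-out accuracy: {ann.score(X_te, y_te):.2f}")
```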
NASA Astrophysics Data System (ADS)
Singh, Leeth; Mutanga, Onisimo; Mafongoya, Paramu; Peerbhay, Kabir
2017-07-01
The concentration of forage fiber content is critical in explaining the palatability of forage quality for livestock grazers in tropical grasslands. Traditional methods of determining forage fiber content are usually time consuming, costly, and require specialized laboratory analysis. With the potential of remote sensing technologies, determination of key fiber attributes can be made more accurately. This study aims to determine the effectiveness of known absorption wavelengths for detecting forage fiber biochemicals, neutral detergent fiber, acid detergent fiber, and lignin using hyperspectral data. Hyperspectral reflectance spectral measurements (350 to 2500 nm) of grass were collected and implemented within the random forest (RF) ensemble. Results show successful correlations between the known absorption features and the biochemicals with coefficients of determination (R2) ranging from 0.57 to 0.81 and root mean square errors ranging from 6.97 to 3.03 g/kg. In comparison, using the entire dataset, the study identified additional wavelengths for detecting fiber biochemicals, which contributes to the accurate determination of forage quality in a grassland environment. Overall, the results showed that hyperspectral remote sensing in conjunction with the competent RF ensemble could discriminate each key biochemical evaluated. This study shows the potential to upscale the methodology to a space-borne multispectral platform with similar spectral configurations for an accurate and cost effective mapping analysis of forage quality.
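A sketch of the regression step under stated assumptions: synthetic reflectance at candidate bands stands in for the field spectra, with a random forest regressor scored by R^2 and RMSE as in the study.

```python
# Sketch: random forest regression of a fiber constituent (e.g., NDF, g/kg)
# on reflectance at candidate wavelengths. Spectra and targets are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(5)
X = rng.random((120, 20))    # reflectance at 20 candidate absorption bands
ndf = 400 + 100 * X[:, 3] - 80 * X[:, 11] + 10 * rng.standard_normal(120)

rf = RandomForestRegressor(n_estimators=500, random_state=0)
pred = cross_val_predict(rf, X, ndf, cv=5)
rmse = mean_squared_error(ndf, pred) ** 0.5
print(f"R^2 = {r2_score(ndf, pred):.2f}, RMSE = {rmse:.1f} g/kg")
# after fitting, rf.feature_importances_ ranks wavelengths, mirroring the
# study's identification of additional informative bands
```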
NASA Astrophysics Data System (ADS)
Beliakov, Sergei
2018-03-01
Investment projects in high-rise construction have a number of features that create specific risks and additional opportunities, which require analysis and accounting in the formation of an effective project concept. The most significant features of high-rise construction include long construction times, complex technical and technological solutions, complex decisions on the organization of construction and operation, high construction and operating costs, and complexity in determining the ratio of areas designed to accommodate different functional zones and in organizing and coordinating the operation of a facility with internal zoning. Taking into account the specificity of high-rise construction, it is advisable to consider organizational, technological, and investment factors as the key factors determining the effectiveness of projects. Within the framework of this article, the author singles out key particular functions for each group of factors under consideration and develops a system of principles for forming an effective concept for multifunctional high-rise construction investment projects, including the principle of logistic efficiency, the principle of optimal functional zoning, the principle of efficient equipment use, the principle of optimizing technological processes, the principle of income maximization, the principle of fund management, and the principle of risk management. The model developed by the author for forming an effective concept of multifunctional high-rise construction investment projects can contribute to the development of methodological tools for managing the implementation of high-rise construction projects, taking into account their specificity under current economic conditions.
QUALITY ASSURANCE MEASURES ASSOCIATED WITH CORAL REEF MONITORING
Systematic efforts began in 1997 to assess the incidence of coral diseases in the Florida Keys. Protocols were developed for the selection of permanent stations and for data collection methodology. Permanent stations were es...
Fast Localization in Large-Scale Environments Using Supervised Indexing of Binary Features.
Youji Feng; Lixin Fan; Yihong Wu
2016-01-01
The essence of image-based localization lies in matching 2D key points in the query image with 3D points in the database. State-of-the-art methods mostly employ sophisticated key point detectors and feature descriptors, e.g., Difference of Gaussian (DoG) and Scale Invariant Feature Transform (SIFT), to ensure robust matching. While a high registration rate is attained, the registration speed is impeded by the expensive key point detection and descriptor extraction. In this paper, we propose to use efficient key point detectors along with binary feature descriptors, since the extraction of such binary features is extremely fast. The naive usage of binary features, however, does not lend itself to significant speedup of localization, since existing indexing approaches, such as hierarchical clustering trees and locality sensitive hashing, are not efficient enough at indexing binary features, and matching binary features turns out to be much slower than matching SIFT features. To overcome this, we propose a much more efficient indexing approach for approximate nearest neighbor search of binary features. This approach resorts to randomized trees that are constructed in a supervised training process by exploiting the label information derived from the fact that multiple features correspond to a common 3D point. In the tree construction process, node tests are selected in a way such that the trees have uniform leaf sizes and low error rates, which are two desired properties for efficient approximate nearest neighbor search. To further improve the search efficiency, a probabilistic priority search strategy is adopted. Apart from the label information, this strategy also uses the non-binary pixel intensity differences available during descriptor extraction. By using the proposed indexing approach, matching binary features is no longer much slower but slightly faster than matching SIFT features. Consequently, the overall localization speed is significantly improved due to the much faster key point detection and descriptor extraction. It is empirically demonstrated that the localization speed is improved by an order of magnitude compared with state-of-the-art methods, while a comparable registration rate and localization accuracy are still maintained.
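The supervised randomized trees proposed here are not available in standard libraries; for contrast, this sketch shows the kind of generic baseline the paper improves upon: matching ORB binary descriptors with FLANN's multi-probe LSH index in OpenCV (the image paths are hypothetical).

```python
# Baseline binary-feature matching: ORB descriptors + FLANN LSH index.
# This is the generic approach the paper's supervised trees outperform.
import cv2

img1 = cv2.imread("query.png", cv2.IMREAD_GRAYSCALE)          # hypothetical paths
img2 = cv2.imread("database_view.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)    # fast detector + binary descriptor
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# FLANN_INDEX_LSH = 6; parameters control the multi-probe LSH tables
index_params = dict(algorithm=6, table_number=12, key_size=20, multi_probe_level=2)
matcher = cv2.FlannBasedMatcher(index_params, dict(checks=50))
matches = matcher.knnMatch(des1, des2, k=2)

# Lowe-style ratio test to keep distinctive matches (LSH may return <2 neighbors)
good = [pair[0] for pair in matches
        if len(pair) == 2 and pair[0].distance < 0.8 * pair[1].distance]
print(f"{len(good)} putative matches")
```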
Wu, Jin; Liu, Yayuan; Guo, Yuanyuan; Feng, Shuanglong; Zou, Binghua; Mao, Hui; Yu, Cheng-han; Tian, Danbi; Huang, Wei; Huo, Fengwei
2015-05-05
By coating polydimethylsiloxane (PDMS) relief structures with a layer of opaque metal such as gold, the incident light is strictly limited to passing through the nanoscopic apertures at the sidewalls of the PDMS reliefs to expose the underlying photoresist at nanoscale regions, thus producing subwavelength nanopatterns covering centimeter-scale areas. It was found that the sidewalls were slightly oblique, which was the key to forming the nanoscale apertures. Two-sided and one-sided subwavelength apertures can be constructed by employing vertical and oblique metal evaporation directions, respectively. Consequently, two-line and one-line subwavelength nanopatterns with programmable feature shapes, sizes, and periodicities could be produced using the obtained photomasks. The smallest aperture size and line width achieved were 80 nm. In contrast to the generation of raised positive photoresist nanopatterns in phase-shifting photolithography, the recessed positive photoresist nanopatterns produced in this study provide a convenient route to transferring the resist nanopatterns to metal nanopatterns. This nanolithography methodology possesses the distinctive advantages of simplicity, low cost, high throughput, and nanoscale feature size and shape controllability, making it a potent nanofabrication technique to enable functional nanostructures for various potential applications.
NASA Astrophysics Data System (ADS)
Chen, Junxun; Cheng, Longsheng; Yu, Hui; Hu, Shaolin
2018-01-01
MonoSLAM: real-time single camera SLAM.
Davison, Andrew J; Reid, Ian D; Molton, Nicholas D; Stasse, Olivier
2007-06-01
We present a real-time algorithm which can recover the 3D trajectory of a monocular camera, moving rapidly through a previously unknown scene. Our system, which we dub MonoSLAM, is the first successful application of the SLAM methodology from mobile robotics to the "pure vision" domain of a single uncontrolled camera, achieving real time but drift-free performance inaccessible to Structure from Motion approaches. The core of the approach is the online creation of a sparse but persistent map of natural landmarks within a probabilistic framework. Our key novel contributions include an active approach to mapping and measurement, the use of a general motion model for smooth camera movement, and solutions for monocular feature initialization and feature orientation estimation. Together, these add up to an extremely efficient and robust algorithm which runs at 30 Hz with standard PC and camera hardware. This work extends the range of robotic systems in which SLAM can be usefully applied, but also opens up new areas. We present applications of MonoSLAM to real-time 3D localization and mapping for a high-performance full-size humanoid robot and live augmented reality with a hand-held camera.
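MonoSLAM's core is an extended Kalman filter over the camera state and landmark map. The toy sketch below shows only the generic EKF predict/update structure on a one-dimensional analogue (position, velocity, one landmark); the real system uses a 3D pose with quaternions and image-plane measurements, so this is an illustration of the filter pattern, not the authors' implementation.

```python
# Generic EKF predict/update cycle on a 1D camera + landmark toy state.
import numpy as np

rng = np.random.default_rng(0)
x = np.array([0.0, 1.0, 5.0])   # state: [camera position, velocity, landmark position]
P = np.eye(3) * 0.5             # state covariance
dt, q, r = 0.1, 0.01, 0.05      # time step, process noise, measurement noise

F = np.array([[1.0, dt, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])  # smooth-motion model
Q = np.diag([0.0, q, 0.0])
H = np.array([[-1.0, 0.0, 1.0]])   # measurement: landmark position relative to camera

for step in range(50):
    # predict: propagate state and covariance through the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # simulated noisy relative measurement (true landmark at 5.0, camera moving)
    z = (5.0 - 0.1 * (step + 1)) + np.sqrt(r) * rng.standard_normal()
    # update: fuse the measurement
    y = z - (H @ x)[0]
    S = (H @ P @ H.T)[0, 0] + r
    K = (P @ H.T) / S              # Kalman gain, shape (3, 1)
    x = x + K[:, 0] * y
    P = (np.eye(3) - K @ H) @ P

print(np.round(x, 2))  # estimated camera position, velocity, landmark position
```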
Model-driven development of covariances for spatiotemporal environmental health assessment.
Kolovos, Alexander; Angulo, José Miguel; Modis, Konstantinos; Papantonopoulos, George; Wang, Jin-Feng; Christakos, George
2013-01-01
Known conceptual and technical limitations of mainstream environmental health data analysis have directed research to new avenues. The goal is to deal more efficiently with the inherent uncertainty and composite space-time heterogeneity of key attributes, account for multi-sourced knowledge bases (health models, survey data, empirical relationships etc.), and generate more accurate predictions across space-time. Based on a versatile, knowledge synthesis methodological framework, we introduce new space-time covariance functions built by integrating epidemic propagation models and we apply them in the analysis of existing flu datasets. Within the knowledge synthesis framework, the Bayesian maximum entropy theory is our method of choice for the spatiotemporal prediction of the ratio of new infectives (RNI) for a case study of flu in France. The space-time analysis is based on observations during a period of 15 weeks in 1998-1999. We present general features of the proposed covariance functions, and use these functions to explore the composite space-time RNI dependency. We then implement the findings to generate sufficiently detailed and informative maps of the RNI patterns across space and time. The predicted distributions of RNI suggest substantive relationships in accordance with the typical physiographic and climatologic features of the country.
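The paper's covariances are built from epidemic-propagation models and are not reproduced here; as a generic point of reference, this sketch evaluates a simple separable exponential space-time covariance C(h, u) = sigma^2 exp(-|h|/a_s) exp(-|u|/a_t), with invented range parameters.

```python
# Generic separable space-time covariance, evaluated on a lag grid.
# Parameters are illustrative, not fitted to the flu data.
import numpy as np

def spacetime_cov(h, u, sigma2=1.0, a_s=100.0, a_t=3.0):
    """h: spatial lag (km); u: temporal lag (weeks)."""
    return sigma2 * np.exp(-np.abs(h) / a_s) * np.exp(-np.abs(u) / a_t)

h = np.linspace(0, 500, 6)   # km
u = np.linspace(0, 15, 6)    # weeks, matching the 15-week study window
H, U = np.meshgrid(h, u)
print(np.round(spacetime_cov(H, U), 3))
```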
Conceptions of Height and Verticality in the History of Skyscrapers and Skylines
NASA Astrophysics Data System (ADS)
Maslovskaya, Oksana; Ignatov, Grigoriy
2018-03-01
The main goal of this article is to reveal the significance of height and verticality in the history of skyscrapers and skylines. The objectives are as follows: 1. trace the origin of design concepts related to the skyscraper; 2. discuss the perceived experience of the cultural aspects of skyscrapers and skylines; 3. describe the differences and similarities of the profiles of cities with comparable skylines. The methodology of the study is designed to explore the perceived theory and principles of the skyscraper and skyline development phenomenon and its key features. The skyscraper reveals an assertive creative form of vertical design. Skyscraper construction also relates to ancient cultural symbolism, in which the dominant vertical element is a main feature of an ordered space. The historical idea of height reaches back to the earliest civilizations, such as the Tower of Babel. Philosophical approaches, including elements of post-structuralism, have been included in the study of the skyscraper phenomenon. Skyscrapers and their resulting skylines are examined to show the connection of their origins with concepts of height and verticality. From a historical perspective, cities with skyscrapers and a skyline turn out to be an assertive manifestation of common ideas of height and verticality.
Measurement and Modeling of Job Stress of Electric Overhead Traveling Crane Operators
Krishna, Obilisetty B.; Maiti, Jhareswar; Ray, Pradip K.; Samanta, Biswajit; Mandal, Saptarshi; Sarkar, Sobhan
2015-01-01
Background In this study, the measurement of job stress of electric overhead traveling crane operators and quantification of the effects of operator and workplace characteristics on job stress were assessed. Methods Job stress was measured on five subscales: employee empowerment, role overload, role ambiguity, rule violation, and job hazard. The characteristics of the operators that were studied were age, experience, body weight, and body height. The workplace characteristics considered were hours of exposure, cabin type, cabin feature, and crane height. The proposed methodology included administration of a questionnaire survey to 76 electric overhead traveling crane operators followed by analysis using analysis of variance and a classification and regression tree. Results The key findings were: (1) the five subscales can be used to measure job stress; (2) employee empowerment was the most significant factor followed by the role overload; (3) workplace characteristics contributed more towards job stress than operator's characteristics; and (4) of the workplace characteristics, crane height was the major contributor. Conclusion The issues related to crane height and cabin feature can be fixed by providing engineering or foolproof solutions than relying on interventions related to the demographic factors. PMID:26929839
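A rough sketch of the analysis pair used in the study (one-way ANOVA for a workplace factor, then a regression tree over pooled predictors), on simulated survey data with illustrative variable names.

```python
# Sketch: ANOVA across crane-height groups, then CART on pooled predictors.
# The 76 simulated records mirror the study's sample size; data are invented.
import numpy as np
import pandas as pd
from scipy import stats
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "crane_height_m": rng.choice([10, 20, 30], 76),
    "exposure_h": rng.uniform(4, 12, 76),
    "age": rng.integers(25, 55, 76),
})
df["job_stress"] = 0.08 * df.crane_height_m + 0.3 * df.exposure_h + rng.normal(0, 1, 76)

# ANOVA: does job stress differ across crane-height groups?
groups = [g.job_stress.values for _, g in df.groupby("crane_height_m")]
print(stats.f_oneway(*groups))

# CART: which characteristics best split the operators by stress level?
tree = DecisionTreeRegressor(max_depth=3, random_state=0)
tree.fit(df[["crane_height_m", "exposure_h", "age"]], df.job_stress)
print(dict(zip(["crane_height_m", "exposure_h", "age"], tree.feature_importances_)))
```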
Multi-scale computational modeling of developmental biology.
Setty, Yaki
2012-08-01
Normal development of multicellular organisms is regulated by a highly complex process in which a set of precursor cells proliferate, differentiate and move, forming over time a functioning tissue. To handle their complexity, developmental systems can be studied over distinct scales. The dynamics of each scale is determined by the collective activity of entities at the scale below it. I describe a multi-scale computational approach for modeling developmental systems and detail the methodology through a synthetic example of a developmental system that retains key features of real developmental systems. I discuss the simulation of the system as it emerges from cross-scale and intra-scale interactions and describe how an in silico study can be carried out by modifying these interactions in a way that mimics in vivo experiments. I highlight biological features of the results through a comparison with findings in Caenorhabditis elegans germline development and finally discuss about the applications of the approach in real developmental systems and propose future extensions. The source code of the model of the synthetic developmental system can be found in www.wisdom.weizmann.ac.il/~yaki/MultiScaleModel. yaki.setty@gmail.com Supplementary data are available at Bioinformatics online.
Sjekavica, Mariela; Haller, Herman; Cerić, Anita
2015-01-01
Building usage is the phase in the building life cycle that is the most time-consuming, the most functional, the most significant due to the building's purpose, and often systematically ignored. Maintenance is the set of activities that ensure the planned duration of the facility exploitation phase in accordance with the requirements for quality maintenance of a large number of important building features, as well as other elements immanent to the nature of a facility's life. The aim of the study is to analyse the current state of the organized, planned and comprehensive managerial approach to hospital utilization and maintenance in the Republic of Croatia, based on a case study of the Clinical hospital center in Rijeka. The methodology used consists of a review of relevant literature on the theory of facility utilization, maintenance and management in general and of hospital buildings in particular; a display of practice in the case study; and a comparison of key performance indicator values obtained through interviews with those that the author Igal M. Shohet defined in his study through field surveys and statistical analyses. Despite many positive indicators of Clinical hospital center Rijeka maintenance, additional research is needed in order to define a more complete national hospital maintenance strategy.
What are the Starting Points? Evaluating Base-Year Assumptions in the Asian Modeling Exercise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chaturvedi, Vaibhav; Waldhoff, Stephanie; Clarke, Leon E.
2012-12-01
A common feature of model inter-comparison efforts is that the base year numbers for important parameters such as population and GDP can differ substantially across models. This paper explores the sources and implications of this variation in Asian countries across the models participating in the Asian Modeling Exercise (AME). Because the models do not all have a common base year, each team was required to provide data for 2005 for comparison purposes. This paper compares the year 2005 information for different models, noting the degree of variation in important parameters, including population, GDP, primary energy, electricity, and CO2 emissions. It then explores the difference in these key parameters across different sources of base-year information. The analysis confirms that the sources provide different values for many key parameters. This variation across data sources and additional reasons why models might provide different base-year numbers, including differences in regional definitions, differences in model base year, and differences in GDP transformation methodologies, are then discussed in the context of the AME scenarios. Finally, the paper explores the implications of base-year variation on long-term model results.
Development of a Stochastically-driven, Forward Predictive Performance Model for PEMFCs
NASA Astrophysics Data System (ADS)
Harvey, David Benjamin Paul
A one-dimensional, multi-scale, coupled, transient, and mechanistic performance model for a PEMFC membrane electrode assembly has been developed. The model explicitly includes each of the 5 layers within a membrane electrode assembly and solves for the transport of charge, heat, mass, species, dissolved water, and liquid water. Key features of the model include the use of a multi-step implementation of the HOR reaction on the anode, agglomerate catalyst sub-models for both the anode and cathode catalyst layers, a unique approach that links the composition of the catalyst layer to key properties within the agglomerate model, and the implementation of a stochastic input-based approach for component material properties. The model employs a new methodology for validation using statistically varying input parameters and statistically-based experimental performance data; it represents the first stochastic-input-driven unit cell performance model. The stochastic-input-driven performance model was used to identify optimal ionomer content within the cathode catalyst layer, demonstrate the role of material variation in potentially low-performing MEA materials, provide an explanation for the performance of low-Pt-loaded MEAs, and investigate the validity of transient-sweep experimental diagnostic methods.
Assessing the Impact of Lesson Study on the Teaching Practice of Middle School Science Teachers
NASA Astrophysics Data System (ADS)
Grove, Michael C.
Despite wave after wave of educational reform in the United States our students continue to lag behind their peers in other industrialized countries on virtually all measures of academic achievement. Effective professional development (PD) is seen as a key to improving instructional practice and therefore student learning, but traditional forms of PD have been wholly unsuccessful in changing teaching practice. Over the last two decades an emerging body of research has identified some key features of effective PD that seem to create meaningful change and improvement in instructional practice. Some of this research highlights the promise of adapting Japanese lesson study (LS) to the American context as a means of incrementally improving instruction. Much of the existing research around LS is descriptive in nature and offers little insight into if and how participation in LS impacts subsequent instructional practice. This study utilized case study methodology to examine the instructional practice of one group of four middle school science teachers before, during, and after participation in LS. The study attempted to identify specific learning outcomes of a LS process, to identify influences on teacher learning during LS, and to identify subsequent changes in the instructional practice of participants resulting from participation in LS. Key findings from the study include significant teacher learning derived from the LS process, the identification of influences that enhanced or inhibited teacher learning, and clear evidence that participants successfully integrated learning from the LS into subsequent instructional practice. Learning outcomes included deepening of subject matter knowledge, increased understanding of student thinking and abilities, clarity of expectations for student performance, recognition of the ineffectiveness of past instructional practice, specific instructional strategies, shared student learning goals, and an increased commitment to future development of student learning. Influences supporting teacher learning were trust and honest dialogue among participants, focused collaboration, examination of student work, and the opportunity to watch other teachers deliver instruction. Influences inhibiting teacher learning related to failure to adhere to key features of the LS protocol. The study offers initial evidence confirming the promise of LS as a model of effective PD.
Wollenweber, Scott D; Kemp, Brad J
2016-11-01
This investigation aimed to develop a scanner quantification performance methodology and compare multiple metrics between two scanners under different imaging conditions. Most PET scanners are designed to work over a wide dynamic range of patient imaging conditions. Clinical constraints, however, often impact the realization of the entitlement performance for a particular scanner design. Using less injected dose and imaging for a shorter time are often key considerations, all while maintaining "acceptable" image quality and quantitative capability. A dual phantom measurement including resolution inserts was used to measure the effects of in-plane (x, y) and axial (z) system resolution between two PET/CT systems with different block detector crystal dimensions. One of the scanners had significantly thinner slices. Several quantitative measures, including feature contrast recovery, max/min value, and feature profile accuracy were derived from the resulting data and compared between the two scanners and multiple phantoms and alignments. At the clinically relevant count levels used, the scanner with thinner slices had improved performance of approximately 2%, averaged over phantom alignments, measures, and reconstruction methods, for the head-sized phantom, mainly demonstrated with the rods aligned perpendicular to the scanner axis. That same scanner had a slightly decreased performance of -1% for the larger body-size phantom, mostly due to an apparent noise increase in the images. Most of the differences in the metrics between the two scanners were less than 10%. Using the proposed scanner performance methodology, it was shown that smaller detector elements and a larger number of image voxels require higher count density in order to demonstrate improved image quality and quantitation. In a body imaging scenario under typical clinical conditions, the potential advantages of the design must overcome increases in noise due to lower count density.
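As an illustration of one of the quantitative measures mentioned, feature contrast recovery can be computed from mean voxel values in a feature and its background. The formula and numbers below are a generic sketch of this class of PET image-quality metric, not the paper's exact definition or data.

```python
# Generic contrast recovery sketch for a hot feature in a phantom.
# The true activity ratio and measured means are assumed example values.
def contrast_recovery(mean_feature: float, mean_background: float,
                      true_ratio: float) -> float:
    """CRC = (measured ratio - 1) / (true ratio - 1), a common PET IQ metric."""
    return (mean_feature / mean_background - 1.0) / (true_ratio - 1.0)

# Example: a rod filled at 4:1 activity versus background, measured at 3.1:1.
print(f"CRC = {contrast_recovery(3.1, 1.0, 4.0):.2f}")  # ~0.70
```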
Simulating smokers' acceptance of modifications in a cessation program.
Spoth, R
1992-01-01
Recent research has underscored the importance of assessing barriers to smokers' acceptance of cessation programs. This paper illustrates the use of computer simulations to gauge smokers' response to program modifications which may produce barriers to participation. It also highlights methodological issues encountered in conducting this work. Computer simulations were based on conjoint analysis, a consumer research method which enables measurement of smokers' relative preference for various modifications of cessation programs. Results from two studies are presented in this paper. The primary study used a randomly selected sample of 218 adult smokers who participated in a computer-assisted phone interview. Initially, the study assessed smokers' relative utility ratings of 30 features of cessation programs. Utility data were used in computer-simulated comparisons of a low-cost, self-help oriented program under development and five other existing programs. A baseline version of the program under development and two modifications (for example, use of a support group with a higher level of cost) were simulated. Both the baseline version and the modifications received a favorable response vis-à-vis comparison programs. Modifications requiring higher program costs were, however, associated with moderately reduced levels of favorable consumer response. The second study used a sample of 70 smokers who responded to an expanded set of smoking cessation program features focusing on program packaging. This secondary study incorporated in-person, computer-assisted interviews at a shopping mall, with smokers viewing an artist's mock-up of various program options on display. A similar pattern of responses to simulated program modifications emerged, with monetary cost apparently playing a key role. The significance of conjoint-based computer simulation as a tool in program development or dissemination, salient methodological issues, and implications for further research are discussed. PMID:1738813
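Conjoint-based simulation of program acceptance follows a standard recipe: estimate part-worth utilities from rated profiles, then compare simulated offerings by total utility. The sketch below uses a dummy-coded linear model and a first-choice rule; all attribute names and ratings are invented for illustration and are not the study's instrument.

```python
# Minimal conjoint-analysis sketch: part-worth estimation + first-choice
# simulation. Attributes, levels, and ratings are invented for illustration.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Rated program profiles (one respondent's data, hypothetical).
profiles = pd.DataFrame({
    "format": ["self_help", "group", "self_help", "group"],
    "cost":   ["low", "low", "high", "high"],
    "rating": [8, 6, 5, 2],   # stated preference, 0-10
})
X = pd.get_dummies(profiles[["format", "cost"]], drop_first=True)
model = LinearRegression().fit(X, profiles["rating"])
print("part-worths:", dict(zip(X.columns, model.coef_)))

# First-choice simulation: a simulated smoker picks the highest-utility
# program among the competing offerings.
offers = pd.DataFrame({"format": ["self_help", "group"], "cost": ["low", "high"]})
U = model.predict(pd.get_dummies(offers, drop_first=True)
                  .reindex(columns=X.columns, fill_value=0))
print("predicted choice:", offers.iloc[int(np.argmax(U))].to_dict())
```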
Annual land cover change mapping using MODIS time series to improve emissions inventories.
NASA Astrophysics Data System (ADS)
López Saldaña, G.; Quaife, T. L.; Clifford, D.
2014-12-01
Understanding and quantifying land surface changes is necessary for estimating greenhouse gas and ammonia emissions, and for meeting air quality limits and targets. More sophisticated inventory methodologies, at least for key emission sources, are needed to satisfy policy-driven air quality directives. Quantifying land cover changes on an annual basis requires greater spatial and temporal disaggregation of input data. The main aim of this study is to develop a methodology for using Earth Observation (EO) to identify annual land surface changes that will improve emissions inventories from agriculture and land use/land use change and forestry (LULUCF) in the UK. The first goal is to find the sets of input features that most accurately describe the surface dynamics. In order to identify annual and inter-annual land surface changes, a time series of surface reflectance was used to capture seasonal variability. Daily surface reflectance images from the Moderate Resolution Imaging Spectroradiometer (MODIS) at 500 m resolution were used to invert a Bidirectional Reflectance Distribution Function (BRDF) model to create the seamless time series. Given the limited number of cloud-free observations, a BRDF climatology was used to constrain the model inversion and, where no high-quality observations were available at all, as a gap filler. The Land Cover Map 2007 (LC2007) produced by the Centre for Ecology & Hydrology (CEH) was used for training and testing purposes. A prototype land cover product was created for 2006 to 2008. Several machine learning classifiers were tested, as well as different sets of input features ranging from the BRDF parameters to spectral albedo. We will present the results of the time series development and of the first exercises in creating the prototype land cover product.
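A per-pixel time-series classification of the kind outlined here can be prototyped with a random forest. The feature layout below (one reflectance value per composite period per band) and the class labels are stand-in assumptions for illustration, not the LC2007 data or the authors' classifier choice.

```python
# Sketch of per-pixel annual land cover classification from a reflectance
# time series. Feature layout and labels are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_pixels, n_periods, n_bands = 5000, 23, 7   # e.g. 16-day composites, 7 bands
X = rng.random((n_pixels, n_periods * n_bands))  # stand-in for BRDF-corrected data
y = rng.integers(0, 10, n_pixels)                # stand-in for land cover classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")  # ~chance on pure noise
```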
Atun, Rifat A.; McKee, Martin; Drobniewski, Francis; Coker, Richard
2005-01-01
OBJECTIVE: To develop a methodology and an instrument that allow the simultaneous rapid and systematic examination of the broad public health context, the health care systems, and the features of disease-specific programmes. METHODS: Drawing on methodologies used for rapid situational assessments of vertical programmes for tackling communicable disease, we analysed programmes for the control of human immunodeficiency virus (HIV) and their health systems context in three regions in the Russian Federation. The analysis was conducted in three phases: first, analysis of published literature, documents and routine data from the regions; second, interviews with key informants; and third, further data collection and analysis. Synthesis of findings through exploration of emergent themes, with iteration, resulted in the identification of the key systems issues that influenced programme delivery. FINDINGS: We observed a complex political economy within which efforts to control HIV sit, an intricate legal environment, and a high degree of decentralization of financing and operational responsibility. Although each region displays some commonalities arising from the Soviet traditions of public health control, there are considerable variations in the epidemiological trajectories, cultural responses, the political environment, financing, organization and service delivery, and the extent of multisectoral work in response to HIV epidemics. CONCLUSION: Within a centralized, post-Soviet health system, centrally directed measures to enhance HIV control may have varying degrees of impact at the regional level. Although the central tenets of effective vertical HIV programmes may be present, local imperatives substantially influence their interpretation, operationalization and effectiveness. Systematic analysis of the context within which vertical programmes are embedded is necessary to enhance understanding of how the relevant policies are prioritized and translated to action. PMID:16283049
Feminist methodologies and engineering education research
NASA Astrophysics Data System (ADS)
Beddoes, Kacey
2013-03-01
This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory. The paper begins with a literature review that covers a broad range of topics featured in the literature on feminist methodologies. Next, data from interviews with engineering educators and researchers who have engaged with feminist methodologies are presented. The ways in which feminist methodologies shape their research topics, questions, frameworks of analysis, methods, practices and reporting are each discussed. The challenges and barriers they have faced are then discussed. Finally, the benefits of further and broader engagement with feminist methodologies within the engineering education community are identified.
2018-03-01
We apply our methodology to the criticism text written in the flight-training program student evaluations in order to construct a model that ... factors.
An overview of key technology thrusts at Bell Helicopter Textron
NASA Technical Reports Server (NTRS)
Harse, James H.; Yen, Jing G.; Taylor, Rodney S.
1988-01-01
Insight is provided into several key technologies at Bell. Specific topics include the results of ongoing research and development in advanced rotors, methodology development, and new configurations. The discussion of advanced rotors highlights developments on the composite bearingless rotor, including the development and testing of full-scale flight hardware as well as some of the design support analyses and verification testing. The discussion of methodology development concentrates on analytical development in aeromechanics, including correlation studies and design application. The discussion of new configurations presents the results of some advanced configuration studies, including hardware development.
Preprocessing Structured Clinical Data for Predictive Modeling and Decision Support
Oliveira, Mónica Duarte; Janela, Filipe; Martins, Henrique M. G.
2016-01-01
Background EHR systems have high potential to improve healthcare delivery and management. Although structured EHR data generates information in machine-readable formats, their use for decision support still poses technical challenges for researchers due to the need to preprocess and convert data into a matrix format. During our research, we observed that the clinical informatics literature does not provide guidance for researchers on how to build this matrix while avoiding potential pitfalls. Objectives This article aims to provide researchers with a roadmap of the main technical challenges of preprocessing structured EHR data and possible strategies to overcome them. Methods Along the standard data processing stages of extracting database entries, defining features, processing data, assessing feature values, and integrating data elements (an EDPAI framework), we identified the main challenges faced by researchers and reflect on how to address those challenges based on lessons learned from our research experience and on best practices from the related literature. We highlight the main potential sources of error, present strategies to approach those challenges, and discuss the implications of these strategies. Results Following the EDPAI framework, researchers face five key challenges: (1) gathering and integrating data, (2) identifying and handling different feature types, (3) combining features to handle redundancy and granularity, (4) addressing data missingness, and (5) handling multiple feature values. Strategies to address these challenges include: cross-checking identifiers for robust data retrieval and integration; applying clinical knowledge in identifying feature types, in addressing redundancy and granularity, and in accommodating multiple feature values; and investigating missing patterns adequately. Conclusions This article contributes to the literature by providing a roadmap to inform structured EHR data preprocessing. It may advise researchers on potential pitfalls and implications of methodological decisions in handling structured data, so as to avoid biases and help realize the benefits of the secondary use of EHR data. PMID:27924347
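To make challenges (4) and (5) concrete, the pandas sketch below turns long-format EHR entries into a patient-by-feature matrix, aggregates multiple values per feature, and adds explicit missingness indicators. The table layout, feature names, and values are invented for illustration; they are not taken from the paper.

```python
# Sketch of structured-EHR preprocessing into a model-ready matrix.
# Table layout, feature names, and values are invented for illustration.
import pandas as pd

# Long-format extract: one row per (patient, feature, value) observation.
ehr = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 3],
    "feature":    ["sbp", "sbp", "hba1c", "sbp", "hba1c", "sbp"],
    "value":      [142.0, 138.0, 7.1, 120.0, 5.9, 131.0],
})

# Challenge (5): collapse multiple values per feature with an aggregate
# (mean here; min/max/last are common clinically motivated alternatives).
matrix = ehr.pivot_table(index="patient_id", columns="feature",
                         values="value", aggfunc="mean")

# Challenge (4): keep explicit missingness indicators before imputing,
# since EHR values are rarely missing at random.
indicators = matrix.isna().astype(int).add_suffix("_missing")
matrix = matrix.fillna(matrix.mean()).join(indicators)
print(matrix)
```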
A data-driven multi-model methodology with deep feature selection for short-term wind forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias
With the growing wind penetration into the power system worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated for 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves forecasting accuracy by up to 30%.
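The two-layer scheme described (several base learners whose out-of-fold forecasts are blended by a second-layer model) corresponds to standard stacked generalization. Below is a minimal scikit-learn sketch with generic regressors and synthetic data, standing in for the paper's specific models and meteorological inputs.

```python
# Minimal stacked (two-layer) regression sketch in the spirit of the
# multi-model methodology above; models and data are generic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.random((2000, 8))                                  # stand-in weather features
y = X @ rng.random(8) + 0.1 * rng.standard_normal(2000)    # stand-in wind speed

# First layer: statistically different learners; second layer: a blender
# fit on out-of-fold first-layer forecasts (handled by StackingRegressor).
stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
        ("mlp", MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000,
                             random_state=0)),
    ],
    final_estimator=Ridge(),
    cv=5,
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
stack.fit(X_tr, y_tr)
print(f"R^2 on held-out data: {stack.score(X_te, y_te):.3f}")
```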
KeyWare: an open wireless distributed computing environment
NASA Astrophysics Data System (ADS)
Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir
1995-12-01
Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple-client multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.
ERIC Educational Resources Information Center
Mitchell, Claudia
2008-01-01
At the risk of seeming to make exaggerated claims for visual methodologies, what I set out to do is lay bare some of the key elements of working with the visual as a set of methodologies and practices. In particular, I address educational research in South Africa at a time when questions of the social responsibility of the academic researcher…
A CNN based Hybrid approach towards automatic image registration
NASA Astrophysics Data System (ADS)
Arun, Pattathal V.; Katiyar, Sunil K.
2013-06-01
Image registration is a key component of various image processing operations which involve the analysis of different image data sets. Automatic image registration domains have witnessed the application of many intelligent methodologies over the past decade; however, the inability to properly model object shape as well as contextual information has limited the attainable accuracy. In this paper, we propose a framework for accurate feature shape modeling and adaptive resampling using advanced techniques such as Vector Machines, Cellular Neural Networks (CNN), SIFT, coresets, and Cellular Automata. CNN has been found to be effective in improving the feature matching as well as the resampling stages of registration, and the complexity of the approach has been considerably reduced using coreset optimization. The salient features of this work are cellular neural network based SIFT feature point optimisation, adaptive resampling, and intelligent object modelling. The developed methodology has been compared with contemporary methods using different statistical measures. Investigations over various satellite images revealed that considerable success was achieved with the approach. The system dynamically uses spectral and spatial information for representing contextual knowledge using a CNN-prolog approach. The methodology is also shown to be effective in providing intelligent interpretation and adaptive resampling.
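The feature-matching stage of a registration pipeline like this one can be illustrated with OpenCV's SIFT detector, a ratio-test match, and a RANSAC transform estimate. This is a generic sketch of the SIFT step only, on synthetic images with a known shift; it does not reproduce the authors' CNN-based components.

```python
# Generic SIFT keypoint matching sketch (the feature-matching stage of an
# image registration pipeline); the CNN components are not reproduced here.
import cv2
import numpy as np

rng = np.random.default_rng(6)
img1 = (rng.random((256, 256)) * 255).astype(np.uint8)
img1 = cv2.GaussianBlur(img1, (5, 5), 0)          # give SIFT some structure
M = np.float32([[1, 0, 12], [0, 1, 7]])           # known 12 px / 7 px shift
img2 = cv2.warpAffine(img1, M, (256, 256))        # "sensed" image

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Lowe's ratio test keeps only distinctive correspondences.
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]

if len(good) >= 4:
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    print(f"{len(good)} matches, {int(mask.sum())} inliers; "
          f"recovered shift ~ ({H[0, 2]:.1f}, {H[1, 2]:.1f})")
```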
Finite element model updating and damage detection for bridges using vibration measurement.
DOT National Transportation Integrated Search
2013-12-01
In this report, the results of a study on developing a damage detection methodology based on Statistical Pattern Recognition are presented. This methodology uses a new damage-sensitive feature developed in this study that relies entirely on modal ...
Evaluation Methods Sourcebook.
ERIC Educational Resources Information Center
Love, Arnold J., Ed.
The chapters commissioned for this book describe key aspects of evaluation methodology as they are practiced in a Canadian context, providing representative illustrations of recent developments in evaluation methodology as it is currently applied. The following chapters are included: (1) "Program Evaluation with Limited Fiscal and Human…
NASA Technical Reports Server (NTRS)
Neukum, G.; Hiller, K.
1981-01-01
Four discussions are conducted: (1) the methodology of relative age determination by impact crater statistics, (2) a comparison of proposed Martian impact chronologies for the determination of absolute ages from crater frequencies, (3) a report on work dating Martian volcanoes and erosional features by impact crater statistics, and (4) an attempt to understand the main features of Martian history through a synthesis of crater frequency data. Two cratering chronology models are presented and used for inference of absolute ages from crater frequency data, and it is shown that the interpretation of all data available and tractable by the methodology presented leads to a global Martian geological history that is characterized by two epochs of activity. It is concluded that Mars is an ancient planet with respect to its surface features.
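The core of relative age determination by crater statistics is fitting a cumulative size-frequency distribution of the form N(>D) = a·D^(-b) and comparing the fitted crater density between surfaces. The sketch below performs that fit on synthetic crater diameters; it is a generic illustration, not tied to the Martian datasets discussed here.

```python
# Sketch of relative surface dating from crater counts: fit the cumulative
# size-frequency distribution N(>D) = a * D**(-b) on synthetic diameters.
import numpy as np

rng = np.random.default_rng(3)
# Synthetic crater diameters (km) drawn from a power law, standing in for
# a mapped crater population on one geologic unit.
diameters = (1.0 - rng.random(500)) ** (-1.0 / 2.0)   # Pareto-like, b ~ 2

d_sorted = np.sort(diameters)
n_cum = np.arange(len(d_sorted), 0, -1)               # N(>D) at each diameter

# Log-log least-squares fit: log N = log a - b log D.
b, log_a = np.polyfit(np.log(d_sorted), np.log(n_cum), 1) * np.array([-1, 1])
print(f"fitted slope b = {b:.2f}, N(>1 km) ~ {np.exp(log_a):.0f}")
# A higher N(>1 km) per unit area indicates an older (more cratered) surface.
```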
An image-processing methodology for extracting bloodstain pattern features.
Arthur, Ravishka M; Humburg, Philomena J; Hoogenboom, Jerry; Baiker, Martin; Taylor, Michael C; de Bruin, Karla G
2017-08-01
There is a growing trend in forensic science to develop methods to make forensic pattern comparison tasks more objective. This has generally involved the application of suitable image-processing methods to provide numerical data for identification or comparison. This paper outlines a unique image-processing methodology that can be utilised by analysts to generate reliable pattern data that will assist them in forming objective conclusions about a pattern. A range of features were defined and extracted from a laboratory-generated impact spatter pattern. These features were based in part on bloodstain properties commonly used in the analysis of spatter bloodstain patterns. The values of these features were consistent with properties reported qualitatively for such patterns. The image-processing method developed shows considerable promise as a way to establish measurable discriminating pattern criteria that are lacking in current bloodstain pattern taxonomies. Copyright © 2017 Elsevier B.V. All rights reserved.
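A feature-extraction pass of this general kind can be sketched with scikit-image: segment the stains, then measure per-stain shape properties such as area, eccentricity, and orientation. The synthetic image and the chosen properties below are illustrative assumptions, not the paper's feature set.

```python
# Sketch of extracting per-stain shape features from a binary pattern image.
# The synthetic image and chosen properties are illustrative assumptions.
import numpy as np
from skimage.draw import ellipse
from skimage.measure import label, regionprops

# Synthetic "pattern": a few elliptical stains on a blank canvas.
img = np.zeros((200, 200), dtype=bool)
for r, c, a, b_ax in [(50, 60, 8, 4), (120, 90, 6, 6), (160, 150, 10, 3)]:
    rr, cc = ellipse(r, c, a, b_ax)
    img[rr, cc] = True

# Label connected stains and extract commonly used shape descriptors.
for region in regionprops(label(img)):
    print(f"area={region.area:4d}  ecc={region.eccentricity:.2f}  "
          f"orientation={np.degrees(region.orientation):6.1f} deg")
```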
A robust dataset-agnostic heart disease classifier from Phonocardiogram.
Banerjee, Rohan; Dutta Choudhury, Anirban; Deshpande, Parijat; Bhattacharya, Sakyajit; Pal, Arpan; Mandana, K M
2017-07-01
Automatic classification of normal and abnormal heart sounds is a popular area of research. However, building a robust algorithm unaffected by signal quality and patient demography is a challenge. In this paper we have analysed a wide list of Phonocardiogram (PCG) features in the time and frequency domains, along with morphological and statistical features, to construct a robust and discriminative feature set for dataset-agnostic classification of normal and cardiac patients. The large and open-access database made available in the Physionet 2016 challenge was used for feature selection, internal validation and creation of training models. A second dataset of 41 PCG segments, collected using our in-house smartphone-based digital stethoscope in an Indian hospital, was used for performance evaluation. Our proposed methodology yielded sensitivity and specificity scores of 0.76 and 0.75 respectively on the test dataset in classifying cardiovascular diseases. The methodology also outperformed three popular prior-art approaches when applied on the same dataset.
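The pipeline described (time- and frequency-domain features from PCG segments feeding a conventional classifier) can be sketched as below. The specific features, the SVM choice, and the synthetic signals are assumptions for illustration rather than the authors' exact feature set.

```python
# Sketch of a PCG segment classifier: simple time/frequency features + SVM.
# Features, classifier choice, and synthetic signals are illustrative.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 2000  # assumed sampling rate (Hz)

def pcg_features(x: np.ndarray) -> np.ndarray:
    """A few generic descriptors of a heart-sound segment."""
    zero_crossings = np.mean(np.abs(np.diff(np.sign(x)))) / 2.0
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    return np.array([x.std(), zero_crossings, centroid])

rng = np.random.default_rng(4)
t = np.arange(FS) / FS
# Synthetic stand-ins: "normal" = low-frequency energy, "abnormal" = higher.
low = [np.sin(2 * np.pi * f * t) + 0.3 * rng.standard_normal(FS)
       for f in rng.uniform(20, 60, 60)]
high = [np.sin(2 * np.pi * f * t) + 0.3 * rng.standard_normal(FS)
        for f in rng.uniform(150, 300, 60)]
X = np.array([pcg_features(s) for s in low + high])
y = np.array([0] * 60 + [1] * 60)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(f"cross-validated accuracy: {cross_val_score(clf, X, y, cv=5).mean():.2f}")
```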
Merly, Corinne; Chapman, Antony; Mouvet, Christophe
2012-01-01
Research results in environmental and socio-economic sciences are often under-used by stakeholders involved in the management of natural resources. To minimise this gap, the FP6 EU interdisciplinary project AquaTerra (AT) developed an end-users' integration methodology in order to ensure that the data, knowledge and tools related to the soil-water-sediment system that were generated by the project were delivered in a meaningful way for end-users, thus improving their uptake. The methodology and examples of its application are presented in this paper. From the 408 project deliverables, 96 key findings were identified, 53 related to data and knowledge, and 43 describing advanced tools. River Basin Management (RBM) stakeholders workshops identified 8 main RBM issues and 25 specific stakeholders' questions related to RBM which were classified into seven groups of cross-cutting issues, namely scale, climate change, non-climatic change, the need for systemic approaches, communication and participation, international and inter-basin coordination and collaboration, and the implementation of the Water Framework Directive. The integration methodology enabled an assessment of how AT key findings meet stakeholders' demands, and for each main RBM issue and for each specific question, described the added-value of the AT project in terms of knowledge and tools generated, key parameters to consider, and recommendations that can be made to stakeholders and the wider scientific community. Added value and limitations of the integration methodology and its outcomes are discussed and recommendations are provided to further improve integration methodology and bridge the gaps between scientific research data and their potential uptake by end-users.
Event-related brain potentials and the study of reward processing: Methodological considerations.
Krigolson, Olave E
2017-11-14
There is growing interest in using electroencephalography, and specifically the event-related brain potential (ERP) methodology, to study human reward processing. Since the discovery of the feedback related negativity (Miltner et al., 1997) and the development of theories associating the feedback related negativity, and more recently the reward positivity, with reinforcement learning, midbrain dopamine function, and the anterior cingulate cortex (i.e., Holroyd and Coles, 2002), researchers have used the ERP methodology to probe the neural basis of reward learning in humans. However, examination of the feedback related negativity and the reward positivity cannot be done without an understanding of some key methodological issues that must be taken into account when using ERPs and examining these ERP components. For example, even the component name - the feedback related negativity - is a source of debate within the research community, as some now strongly feel that the component should be named the reward positivity (Proudfit, 2015). Here, ten key methodological issues are discussed: confusion in component naming, the reward positivity, component identification, peak quantification and the use of difference waveforms, frequency (the N200) and component contamination (the P300), the impact of feedback timing, action, and task learnability, and how learning results in changes in the amplitude of the feedback-related negativity/reward positivity. The hope here is not to provide a definitive approach for examining the feedback related negativity/reward positivity, but instead to outline the key issues that must be taken into account when examining this component, to assist researchers in their study of human reward processing with the ERP methodology. Copyright © 2017 Elsevier B.V. All rights reserved.
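Two of the issues raised (difference waveforms and peak quantification) reduce to simple array operations once single-trial epochs are averaged. The sketch below computes a reward-positivity difference wave and both a peak and a mean-amplitude measure; the epoch layout and scoring window are illustrative assumptions, not prescriptions.

```python
# Sketch: difference-waveform construction and peak/mean-amplitude
# quantification for feedback-locked ERPs. Epoch layout and the scoring
# window are illustrative assumptions.
import numpy as np

FS = 500                           # assumed sampling rate (Hz)
t = np.arange(-0.2, 0.8, 1 / FS)   # epoch: -200 ms to 800 ms around feedback

rng = np.random.default_rng(5)
def make_epochs(n_trials: int, gain: float) -> np.ndarray:
    """Synthetic epochs with a positivity peaking ~275 ms post-feedback."""
    component = gain * np.exp(-((t - 0.275) ** 2) / (2 * 0.04 ** 2))
    return component + rng.standard_normal((n_trials, t.size)) * 2.0

win_avg = make_epochs(100, gain=4.0).mean(axis=0)    # reward feedback ERP
loss_avg = make_epochs(100, gain=1.0).mean(axis=0)   # non-reward feedback ERP

diff_wave = win_avg - loss_avg                       # reward positivity
window = (t >= 0.2) & (t <= 0.35)                    # assumed scoring window
print(f"peak: {diff_wave[window].max():.2f} uV, "
      f"mean amplitude: {diff_wave[window].mean():.2f} uV")
```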
Thomas, Paul; McDonnell, Juliet; McCulloch, Janette; While, Alison; Bosanquet, Nick; Ferlie, Ewan
2005-01-01
PURPOSE We wanted to identify what organizational features support innovation in Primary Care Groups (PCGs). METHODS Our study used a whole system participatory action research model. Four research teams provided complementary insights. Four case study PCGs were analyzed. Two had an intervention to help local facilitators reflect on their work. Data included 70 key informant interviews, observations of clinical governance interventions and committee meetings, analysis of written materials, surveys and telephone interviews of London Primary Care Organizations, interviews with 20 nurses, and interviews with 6 finance directors. A broad range of stakeholders reviewed data at annual conferences and formed conclusions about trustworthy principles. Sequential research phases were refocused in the light of these conclusions and in response to the changing political context. RESULTS Five features were associated with increased organizational capacity for innovation: (1) clear structures and a vision for corporate and clinical governance; (2) multiple opportunities for people to reflect and learn at all levels of the organization, and connections between these “learning spaces”; (3) both clinicians and managers in leadership roles that encourage participation; (4) the right timing for an initiative and its adaptation to the local context; and (5) external facilitation that provides opportunities for people to make sense of their experiences. Low morale was commonly attributed to 3 features: (1) overwhelming pace of reform, (2) inadequate staff experience and supportive infrastructure, and (3) financial deficits. CONCLUSIONS These features together may support innovation in other primary care bureaucracies. The research methodology enabled people from different backgrounds to make sense of diverse research insights. PMID:16046563
3D scanning and printing skeletal tissues for anatomy education.
Thomas, Daniel B; Hiscox, Jessica D; Dixon, Blair J; Potgieter, Johan
2016-09-01
Detailed anatomical models can be produced with consumer-level 3D scanning and printing systems. 3D replication techniques are significant advances for anatomical education as they allow practitioners to more easily introduce diverse or numerous specimens into classrooms. Here we present a methodology for producing anatomical models in-house, with the chondrocranium cartilage from a spiny dogfish (Squalus acanthias) and the skeleton of a cane toad (Rhinella marina) as case studies. 3D digital replicas were produced using two consumer-level scanners and specimens were 3D-printed with selective laser sintering. The fidelity of the two case study models was determined with respect to key anatomical features. Larger-scale features of the dogfish chondrocranium and frog skeleton were all well-resolved and distinct in the 3D digital models, and many finer-scale features were also well-resolved, but some more subtle features were absent from the digital models (e.g. endolymphatic foramina in the chondrocranium). All characters identified in the digital chondrocranium could be identified in the subsequent 3D print; however, three characters in the 3D-printed frog skeleton could not be clearly delimited (palatines, parasphenoid and pubis). Characters that were absent in the digital models or 3D prints had low relief in the original scanned specimen and represent a minor loss of fidelity. Our method description and case studies show that minimal equipment and training are needed to produce durable skeletal specimens. These technologies support the tailored production of models for specific classes or research aims. © 2016 Anatomical Society.
NASA Astrophysics Data System (ADS)
Wang, Yukun; Chen, Charles H.; Hu, Dan; Ulmschneider, Martin B.; Ulmschneider, Jakob P.
2016-11-01
Many antimicrobial peptides (AMPs) selectively target and form pores in microbial membranes. However, the mechanisms of membrane targeting, pore formation and function remain elusive. Here we report an experimentally guided unbiased simulation methodology that yields the mechanism of spontaneous pore assembly for the AMP maculatin at atomic resolution. Rather than a single pore, maculatin forms an ensemble of structurally diverse, temporarily functional, low-oligomeric pores, which mimic integral membrane protein channels in structure. These pores continuously form and dissociate in the membrane. Membrane permeabilization is dominated by hexa-, hepta- and octamers, which conduct water, ions and small dyes. Pores form by consecutive addition of individual helices to a transmembrane helix or helix bundle, in contrast to current poration models. The diversity of the pore architectures, formed by a single sequence, may be a key feature in preventing bacterial resistance and could explain why sequence-function relationships in AMPs remain elusive.
An Efficient Scheme for Crystal Structure Prediction Based on Structural Motifs
Zhu, Zizhong; Wu, Ping; Wu, Shunqing; ...
2017-05-15
An efficient scheme based on structural motifs is proposed for the crystal structure prediction of materials. The key advantage of the present method is two-fold: first, the degrees of freedom of the system are greatly reduced, since each structural motif, regardless of its size, can always be described by a set of parameters (R, θ, φ) with five degrees of freedom; second, the motifs could always appear in the predicted structures when the energies of the structures are relatively low. Both features make the present scheme a very efficient method for predicting desired materials. The method has been applied to the case of LiFePO4, an important cathode material for lithium-ion batteries. Numerous new structures of LiFePO4 have been found, compared to those currently available, demonstrating the reliability of the present methodology and illustrating the promise of the concept of structural motifs.
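The five-degree-of-freedom idea can be made concrete with a small geometry routine: a rigid motif is placed by a centre position plus two orientation angles, and only those parameters are searched. The tetrahedral motif and the angle convention below are assumptions chosen for illustration, not the authors' implementation.

```python
# Sketch of the motif parameterization: place a rigid tetrahedral motif
# (e.g. a PO4-like unit) from a centre R and two orientation angles.
# The motif geometry and angle convention are illustrative assumptions.
import numpy as np

# Ideal tetrahedron vertices around the origin (arbitrary bond length 1.5).
MOTIF = 1.5 / np.sqrt(3) * np.array([
    [1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], dtype=float)

def place_motif(center: np.ndarray, theta: float, phi: float) -> np.ndarray:
    """Rotate the rigid motif by (theta, phi) and translate it to `center`.

    Together with the 3 coordinates of `center`, this gives the five
    degrees of freedom per motif described above.
    """
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(phi), np.sin(phi)
    rot_y = np.array([[ct, 0, st], [0, 1, 0], [-st, 0, ct]])
    rot_z = np.array([[cp, -sp, 0], [sp, cp, 0], [0, 0, 1]])
    return MOTIF @ (rot_z @ rot_y).T + center

atoms = place_motif(np.array([2.0, 2.0, 2.0]), theta=0.6, phi=1.1)
print(np.round(atoms, 3))   # four ligand sites of one placed motif
```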
Page, Zachariah A; Narupai, Benjaporn; Pester, Christian W; Bou Zerdan, Raghida; Sokolov, Anatoliy; Laitar, David S; Mukhopadhyay, Sukrit; Sprague, Scott; McGrath, Alaina J; Kramer, John W; Trefonas, Peter; Hawker, Craig J
2017-06-28
A light-mediated methodology to grow patterned, emissive polymer brushes with micron feature resolution is reported and applied to organic light emitting diode (OLED) displays. Light is used for both initiator functionalization of indium tin oxide and subsequent atom transfer radical polymerization of methacrylate-based fluorescent and phosphorescent iridium monomers. The iridium centers play key roles in photocatalyzing and mediating polymer growth while also emitting light in the final OLED structure. The scope of the presented procedure enables the synthesis of a library of polymers with emissive colors spanning the visible spectrum where the dopant incorporation, position of brush growth, and brush thickness are readily controlled. The chain-ends of the polymer brushes remain intact, affording subsequent chain extension and formation of well-defined diblock architectures. This high level of structure and function control allows for the facile preparation of random ternary copolymers and red-green-blue arrays to yield white emission.
Multimission helicopter cockpit displays
NASA Astrophysics Data System (ADS)
Terry, William S.; Terry, Jody K.; Lovelace, Nancy D.
1996-05-01
A new operator display subsystem is being incorporated as part of the next generation United States Navy (USN) helicopter avionics system to be integrated into the multi-mission helicopter (MMH) that replaces both the SH-60B and the SH-60F in 2001. This subsystem exploits state-of-the-art technology for the display hardware, the display driver hardware, information presentation methodologies, and software architecture. Both of the existing SH-60 helicopter display systems are based on monochrome CRT technology; a key feature of the MMH cockpit is the integration of color AMLCD multifunction displays. The MMH program is one of the first military programs to use modified commercial AMLCD elements in a tactical aircraft. This paper presents the general configuration of the MMH cockpit and multifunction display subsystem and discusses the approach taken for presenting helicopter flight information to the pilots as well as presentation of mission sensor data for use by the copilot.
Majdinasab, Marjan; Yaqub, Mustansara; Rahim, Abdur; Catanante, Gaelle; Hayat, Akhtar; Marty, Jean Louis
2017-01-01
Anti-microbial drugs are widely employed for the treatment and cure of diseases in animals, promotion of animal growth, and feed efficiency. However, the scientific literature has indicated the possible presence of antimicrobial drug residues in animal-derived food, making it one of the key public concerns for food safety. Therefore, it is highly desirable to design fast and accurate methodologies to monitor antimicrobial drug residues in animal-derived food. Legislation is in place in many countries to ensure antimicrobial drug residue quantities are less than the maximum residue limits (MRL) defined on the basis of food safety. In this context, the recent years have witnessed a special interest in the field of electrochemical biosensors for food safety, based on their unique analytical features. This review article is focused on the recent progress in the domain of electrochemical biosensors to monitor antimicrobial drug residues in animal-derived food. PMID:28837093
Practical approaches to the ESI-MS analysis of catalytic reactions.
Yunker, Lars P E; Stoddard, Rhonda L; McIndoe, J Scott
2014-01-01
Electrospray ionization mass spectrometry (ESI-MS) is a soft ionization technique commonly coupled with liquid or gas chromatography for the identification of compounds in a one-time view of a mixture (for example, the resulting mixture generated by a synthesis). Over the past decade, Scott McIndoe and his research group at the University of Victoria have developed various methodologies to enhance the ability of ESI-MS to continuously monitor catalytic reactions as they proceed. The power, sensitivity and large dynamic range of ESI-MS have allowed for the refinement of several homogenous catalytic mechanisms and could potentially be applied to a wide range of reactions (catalytic or otherwise) for the determination of their mechanistic pathways. In this special feature article, some of the key challenges encountered and the adaptations employed to counter them are briefly reviewed. Copyright © 2014 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tunuguntla, Ramya H.; Chen, Xi; Belliveau, Allison
Carbon nanotube porins (CNTPs) are a convenient membrane-based model system for studying nanofluidic transport that replicates a number of key structural features of biological membrane channels. We present a generalized approach for CNTP synthesis using sonochemistry-assisted segmenting of carbon nanotubes. Prolonged tip sonication in the presence of lipid molecules debundles and fragments long carbon nanotube aggregates into stable and water-soluble individual CNTPs with lengths in the range 5–20 nm. We discuss the main parameters that determine the efficiency and the yield of this process, describe the optimized conditions for high-yield CNTP synthesis, and demonstrate that this methodology can be adapted for synthesis of CNTPs of different diameters. We also present the optical properties of CNTPs and show that a combination of Raman and UV–vis–NIR spectroscopy can be used to monitor the quality of the CNTP synthesis. Altogether, CNTPs represent a versatile nanopore building block for creating higher-order functional biomimetic materials.
NASA Astrophysics Data System (ADS)
Obermayer, Richard W.; Nugent, William A.
2000-11-01
The SPAWAR Systems Center San Diego is currently developing an advanced Multi-Modal Watchstation (MMWS); design concepts and software from this effort are intended for transition to future United States Navy surface combatants. The MMWS features multiple flat panel displays and several modes of user interaction, including voice input and output, natural language recognition, 3D audio, stylus and gestural inputs. In 1999, an extensive literature review was conducted on basic and applied research concerned with alerting and warning systems. After summarizing that literature, a human computer interaction (HCI) designer's guide was prepared to support the design of an attention allocation subsystem (AAS) for the MMWS. The resultant HCI guidelines are being applied in the design of a fully interactive AAS prototype. An overview of key findings from the literature review, a proposed design methodology with illustrative examples, and an assessment of progress made in implementing the HCI designer's guide are presented.
Natural neural projection dynamics underlying social behavior
Gunaydin, Lisa A.; Grosenick, Logan; Finkelstein, Joel C.; Kauvar, Isaac V.; Fenno, Lief E.; Adhikari, Avishek; Lammel, Stephan; Mirzabekov, Julie J.; Airan, Raag D.; Zalocusky, Kelly A.; Tye, Kay M.; Anikeeva, Polina; Malenka, Robert C.; Deisseroth, Karl
2014-01-01
Social interaction is a complex behavior essential for many species, and is impaired in major neuropsychiatric disorders. Pharmacological studies have implicated certain neurotransmitter systems in social behavior, but circuit-level understanding of endogenous neural activity during social interaction is lacking. We therefore developed and applied a new methodology, termed fiber photometry, to optically record natural neural activity in genetically- and connectivity-defined projections to elucidate the real-time role of specified pathways in mammalian behavior. Fiber photometry revealed that activity dynamics of a ventral tegmental area (VTA)-to-nucleus accumbens (NAc) projection could encode and predict key features of social but not novel-object interaction. Consistent with this observation, optogenetic control of cells specifically contributing to this projection was sufficient to modulate social behavior, which was mediated by type-1 dopamine receptor signaling downstream in the NAc. Direct observation of projection-specific activity in this way captures a fundamental and previously inaccessible dimension of circuit dynamics. PMID:24949967
Minimalistic Liquid-Assisted Route to Highly Crystalline α-Zirconium Phosphate.
Cheng, Yu; Wang, Xiaodong Tony; Jaenicke, Stephan; Chuah, Gaik-Khuan
2017-08-24
Zirconium phosphates have potential applications in areas of ion exchange, catalysis, photochemistry, and biotechnology. However, synthesis methodologies to form crystalline α-zirconium phosphate (Zr(HPO4)2·H2O) typically involve the use of excess phosphoric acid, addition of HF or oxalic acid, and long reflux times or hydrothermal conditions. A minimalistic sustainable route to its synthesis has been developed by using only zirconium oxychloride and concentrated phosphoric acid to form highly crystalline α-zirconium phosphate within hours. The morphology can be changed from platelets to rod-shaped particles by fluoride addition. By varying the temperature and time, α-zirconium phosphate with particle sizes from nanometers to microns can be obtained. Key features of this minimal-solvent synthesis are the excellent yields obtained with high atom economy under mild conditions and the ease of scalability. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
User-Defined Data Distributions in High-Level Programming Languages
NASA Technical Reports Server (NTRS)
Diaconescu, Roxana E.; Zima, Hans P.
2006-01-01
One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
Behavior analysis and social constructionism: Some points of contact and departure
Roche, Bryan; Barnes-Holmes, Dermot
2003-01-01
Social constructionists occasionally single out behavior analysis as the field of psychology that most closely resembles the natural sciences in its commitment to empiricism, and accuse it of suffering from many of the limitations to science identified by the postmodernist movement (e.g., K. J. Gergen, 1985a; Soyland, 1994). Indeed, behavior analysis is a natural science in many respects. However, it also shares with social constructionism important epistemological features such as a rejection of mentalism, a functional-analytic approach to language, the use of interpretive methodologies, and a reflexive stance on analysis. The current paper briefly outlines the key tenets of the behavior-analytic and social constructionist perspectives before examining a number of commonalities between these approaches. The paper aims to show that far from being a nemesis to social constructionism, behavior analysis may in fact be its close ally. PMID:22478403
An Efficient Scheme for Crystal Structure Prediction Based on Structural Motifs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Zizhong; Wu, Ping; Wu, Shunqing
An efficient scheme based on structural motifs is proposed for the crystal structure prediction of materials. The key advantage of the present method is twofold: first, the degrees of freedom of the system are greatly reduced, since each structural motif, regardless of its size, can always be described by a set of parameters (R, θ, φ) with five degrees of freedom; second, the motifs could always appear in the predicted structures when the energies of the structures are relatively low. Both features make the present scheme a very efficient method for predicting desired materials. The method has been applied to the case of LiFePO₄, an important cathode material for lithium-ion batteries. Numerous new structures of LiFePO₄ have been found, compared to those currently available, demonstrating the reliability of the present methodology and illustrating the promise of the concept of structural motifs.
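The five degrees of freedom per motif, three for the center position R and two for the orientation angles (θ, φ), can be made concrete with a toy reconstruction step: a candidate structure is generated by rotating a rigid motif template and translating it to its center. This is a schematic illustration, not the authors' implementation; the motif geometry below is hypothetical.

```python
import numpy as np

def place_motif(template, center, theta, phi):
    """Rotate a rigid motif by polar angle theta and azimuth phi, then translate to center."""
    rz = np.array([[np.cos(phi), -np.sin(phi), 0],
                   [np.sin(phi),  np.cos(phi), 0],
                   [0, 0, 1]])
    ry = np.array([[np.cos(theta), 0, np.sin(theta)],
                   [0, 1, 0],
                   [-np.sin(theta), 0, np.cos(theta)]])
    return template @ (rz @ ry).T + np.asarray(center)

# A tetrahedral PO4-like motif: central atom plus four vertices (arbitrary units).
po4 = np.array([[0, 0, 0], [1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) * 0.9
print(place_motif(po4, center=[2.0, 0.0, 0.5], theta=0.3, phi=1.2).round(3))
```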
Recent advances in synchrotron-based hard x-ray phase contrast imaging
NASA Astrophysics Data System (ADS)
Liu, Y.; Nelson, J.; Holzner, C.; Andrews, J. C.; Pianetta, P.
2013-12-01
Ever since the first demonstration of phase contrast imaging (PCI) in the 1930s by Frits Zernike, people have realized the significant advantage of phase contrast over conventional absorption-based imaging in terms of sensitivity to ‘transparent’ features within specimens. Thus, x-ray phase contrast imaging (XPCI) holds great potential in studies of soft biological tissues, typically containing low Z elements such as C, H, O and N. Particularly when synchrotron hard x-rays are employed, the favourable brightness, energy tunability, monochromatic characteristics and penetration depth have dramatically enhanced the quality and variety of XPCI methods, which permit detection of the phase shift associated with 3D geometry of relatively large samples in a non-destructive manner. In this paper, we review recent advances in several synchrotron-based hard x-ray XPCI methods. Challenges and key factors in methodological development are discussed, and biological and medical applications are presented.
ERIC Educational Resources Information Center
Dusenbury, Linda; Yoder, Nick
2017-01-01
The current document serves two purposes. First, it provides an overview of six key features of a high-quality, comprehensive package of policies and guidance to support student social and emotional learning (SEL). These features are based on the Collaborative for Academic, Social, and Emotional Learning's (CASEL's) review of the research literature on…
Identifying the features of an exercise addiction: A Delphi study
Macfarlane, Lucy; Owens, Glynn; Cruz, Borja del Pozo
2016-01-01
Objectives There remains limited consensus regarding the definition and conceptual basis of exercise addiction. An understanding of the factors motivating maintenance of addictive exercise behavior is important for appropriately targeting intervention. The aims of this study were twofold: first, to establish consensus on features of an exercise addiction using Delphi methodology and second, to identify whether these features are congruous with a conceptual model of exercise addiction adapted from the Work Craving Model. Methods A three-round Delphi process explored the views of participants regarding the features of an exercise addiction. The participants were selected from sport and exercise relevant domains, including physicians, physiotherapists, coaches, trainers, and athletes. Suggestions meeting consensus were considered with regard to the proposed conceptual model. Results and discussion Sixty-three items reached consensus. There was concordance of opinion that exercising excessively is an addiction, and therefore it was appropriate to consider the suggestions in light of the addiction-based conceptual model. Statements reaching consensus were consistent with all three components of the model: learned (negative perfectionism), behavioral (obsessive–compulsive drive), and hedonic (self-worth compensation and reduction of negative affect and withdrawal). Conclusions Delphi methodology allowed consensus to be reached regarding the features of an exercise addiction, and these features were consistent with our hypothesized conceptual model of exercise addiction. This study is the first to have applied Delphi methodology to the exercise addiction field, and therefore introduces a novel approach to exercise addiction research that can be used as a template to stimulate future examination using this technique. PMID:27554504
Sosedova, L M
2014-01-01
This paper presents features of methodological approaches to experimental studies investigating the impact of environmental factors on the human body. Results are shown from experiments performed at the Institute on modeling the biological effects of antimicrobial nanobiocomposites with nanosilver particles and of toxic encephalopathy, and on the combined effects of factors of biological and chemical nature. The importance of intracellular proteomics in assessing the effects of nanoparticles and nanomaterials on the body was demonstrated. Key stages in the progressive long-term course of mercury poisoning were identified. A special section presents the study of long-term effects of anthropogenic environmental factors on subsequent generations. Results are presented indicating a deterioration of the functional state of the central nervous system in first- and second-generation rats whose parents were exposed to neurotoxicants. The aggravating role of prenatal hypoxia in the development of toxicity in sexually mature rats was demonstrated. Experimental biomodeling is aimed at pathogenetically substantiated treatment and preventive measures: initially under experimental conditions, and subsequently in the rehabilitation of sick or injured patients.
Sanchez-Segado, Sergio; Monti, Tamara; Katrib, Juliano; Kingman, Samuel; Dodds, Chris; Jha, Animesh
2017-12-21
Current methodologies for the extraction of tantalum and niobium pose a serious threat to human beings and the environment due to the use of hydrofluoric acid (HF). Niobium and tantalum metal powders and pentoxides are widely used for energy efficient devices and components. However, the current processing methods for niobium and tantalum metals and oxides are energy inefficient. This dichotomy between materials use for energy applications and their inefficient processing is the main motivation for exploring a new methodology for the extraction of these two oxides, investigating the microwave absorption properties of the reaction products formed during the alkali roasting of niobium-tantalum bearing minerals with sodium bicarbonate. The experimental findings from dielectric measurements at elevated temperatures demonstrate an exponential increase in the values of the dielectric properties as a result of the formation of NaNbO₃-NaTaO₃ solid solutions at temperatures above 700 °C. The investigation of the evolution of the dielectric properties during the roasting reaction is a key feature in underpinning the mechanism for designing a new microwave assisted high-temperature process for the selective separation of niobium and tantalum oxides from the remainder of the mineral crystalline lattice.
A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network
Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing
2015-01-01
This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with the conventional fault diagnosis with only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of intrinsic mode functions (IMFs) are calculated as fault features. These features are added into the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data is used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information. PMID:25938760
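The feature-extraction stage of such a pipeline, EEMD decomposition followed by per-IMF energies, can be sketched in a few lines; the third-party PyEMD package is assumed for the decomposition, and the Bayesian-network fusion stage is omitted.

```python
import numpy as np
from PyEMD import EEMD  # assumed third-party package providing ensemble EMD

def imf_energy_features(signal):
    """Decompose a vibration signal with EEMD and return normalized IMF energies."""
    imfs = EEMD()(signal)                     # rows are intrinsic mode functions
    energies = np.array([np.sum(imf ** 2) for imf in imfs])
    return energies / energies.sum()          # energy ratios used as fault features

fs = 1000
t = np.arange(0, 1, 1 / fs)
# Synthetic vibration: two tones plus noise, standing in for pump sensor data.
vibration = (np.sin(2 * np.pi * 50 * t)
             + 0.3 * np.sin(2 * np.pi * 300 * t)
             + 0.1 * np.random.randn(len(t)))
print(imf_energy_features(vibration))
```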
Methodological Behaviorism from the Standpoint of a Radical Behaviorist.
Moore, J
2013-01-01
Methodological behaviorism is the name for a prescriptive orientation to psychological science. Its first and original feature is that the terms and concepts deployed in psychological theories and explanations should be based on observable stimuli and behavior. I argue that the interpretation of the phrase "based on" has changed over the years because of the influence of operationism. Its second feature, which developed after the first and is prominent in contemporary psychology, is that research should emphasize formal testing of a theory that involves mediating theoretical entities from a nonbehavioral dimension according to the hypothetico-deductive method. I argue that for contemporary methodological behaviorism, explanations of the behavior of both participants and scientists appeal to the mediating entities as mental causes, if only indirectly. In contrast to methodological behaviorism is the radical behaviorism of B. F. Skinner. Unlike methodological behaviorism, radical behaviorism conceives of verbal behavior in terms of an operant process that involves antecedent circumstances and reinforcing consequences, rather than in terms of a nonbehavioral process that involves reference and symbolism. In addition, radical behaviorism recognizes private behavioral events and subscribes to research and explanatory practices that do not include testing hypotheses about supposed mediating entities from another dimension. I conclude that methodological behaviorism is actually closer to mentalism than to Skinner's radical behaviorism.
Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A
2012-04-01
Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ²(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ²(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. © Health Research and Educational Trust.
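The reported comparisons are chi-square tests on counts of methodological components. A minimal reproduction of this kind of test, with made-up counts rather than the study's data, using scipy:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of methodological components present vs. absent
# in mixed methods articles and in quantitative articles.
table = [[90, 320],    # mixed methods: present, absent
         [470, 530]]   # quantitative: present, absent
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```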
Predicting Vessel Trajectories from AIS Data Using R
2017-06-01
future position at the expectation level set by the user, therefore producing a valid methodology for both estimating the future vessel location and for assessing anomalous vessel behavior. A key idea in the current literature is that the series of vessel locations…
Venkatesh, Santosh S; Levenback, Benjamin J; Sultan, Laith R; Bouzghar, Ghizlane; Sehgal, Chandra M
2015-12-01
The goal of this study was to devise a machine learning methodology as a viable low-cost alternative to a second reader to help augment physicians' interpretations of breast ultrasound images in differentiating benign and malignant masses. Two independent feature sets consisting of visual features based on a radiologist's interpretation of images and computer-extracted features when used as first and second readers and combined by adaptive boosting (AdaBoost) and a pruning classifier resulted in a very high level of diagnostic performance (area under the receiver operating characteristic curve = 0.98) at a cost of pruning a fraction (20%) of the cases for further evaluation by independent methods. AdaBoost also improved the diagnostic performance of the individual human observers and increased the agreement between their analyses. Pairing AdaBoost with selective pruning is a principled methodology for achieving high diagnostic performance without the added cost of an additional reader for differentiating solid breast masses by ultrasound. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
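A simplified version of the pair-and-prune idea, boosting over combined features and deferring the least-confident 20% of cases, can be sketched with scikit-learn; the synthetic dataset and the confidence measure are placeholders, not the study's data or exact procedure.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Placeholder stand-in for combined visual + computer-extracted features.
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
proba = clf.predict_proba(Xte)[:, 1]

# Prune: defer the 20% of cases closest to the decision boundary (p near 0.5).
confidence = np.abs(proba - 0.5)
keep = confidence >= np.quantile(confidence, 0.20)
acc = (clf.predict(Xte)[keep] == yte[keep]).mean()
print(f"kept {keep.mean():.0%} of cases, accuracy on kept cases = {acc:.3f}")
```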
Greedy feature selection for glycan chromatography data with the generalized Dirichlet distribution
2013-01-01
Background Glycoproteins are involved in a diverse range of biochemical and biological processes. Changes in protein glycosylation are believed to occur in many diseases, particularly during cancer initiation and progression. The identification of biomarkers for human disease states is becoming increasingly important, as early detection is key to improving survival and recovery rates. To this end, the serum glycome has been proposed as a potential source of biomarkers for different types of cancers. High-throughput hydrophilic interaction liquid chromatography (HILIC) technology for glycan analysis allows for the detailed quantification of the glycan content in human serum. However, the experimental data from this analysis is compositional by nature. Compositional data are subject to a constant-sum constraint, which restricts the sample space to a simplex. Statistical analysis of glycan chromatography datasets should account for their unusual mathematical properties. As the volume of glycan HILIC data being produced increases, there is a considerable need for a framework to support appropriate statistical analysis. Proposed here is a methodology for feature selection in compositional data. The principal objective is to provide a template for the analysis of glycan chromatography data that may be used to identify potential glycan biomarkers. Results A greedy search algorithm, based on the generalized Dirichlet distribution, is carried out over the feature space to search for the set of “grouping variables” that best discriminate between known group structures in the data, modelling the compositional variables using beta distributions. The algorithm is applied to two glycan chromatography datasets. Statistical classification methods are used to test the ability of the selected features to differentiate between known groups in the data. Two well-known methods are used for comparison: correlation-based feature selection (CFS) and recursive partitioning (rpart). CFS is a feature selection method, while recursive partitioning is a learning tree algorithm that has been used for feature selection in the past. Conclusions The proposed feature selection method performs well for both glycan chromatography datasets. It is computationally slower, but results in a lower misclassification rate and a higher sensitivity rate than both correlation-based feature selection and the classification tree method. PMID:23651459
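The greedy search itself is generic even though the paper scores candidate feature sets with a generalized Dirichlet/beta model; the sketch below substitutes a cross-validated classifier score, so it illustrates only the forward-selection strategy, not the authors' distributional scoring.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

def greedy_select(X, y, k):
    """Forward selection: repeatedly add the feature that most improves CV score."""
    selected = []
    for _ in range(k):
        candidates = [j for j in range(X.shape[1]) if j not in selected]
        scores = [cross_val_score(LogisticRegression(max_iter=1000),
                                  X[:, selected + [j]], y, cv=5).mean()
                  for j in candidates]
        selected.append(candidates[int(np.argmax(scores))])
    return selected

print(greedy_select(X, y, k=2))
```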
DOT National Transportation Integrated Search
2014-06-01
The objective of this project focused on the development of a hybrid nondestructive testing and evaluation (NDT&E) methodology that combines the benefits of microwave NDT and thermography into one new technique. In this way, unique features of both N...
ERIC Educational Resources Information Center
Mansfield, John; Stanford, James
2017-01-01
Documenting sociolinguistic variation in lesser-studied languages presents methodological challenges, but also offers important research opportunities. In this paper we examine three key methodological challenges commonly faced by researchers who are outsiders to the community. We then present practical solutions for successful variationist…
Missouri Program Highlights How Standards Make a Difference
ERIC Educational Resources Information Center
Killion, Joellen
2017-01-01
Professional development designed to integrate key features of research-based professional learning has positive and significant effects on teacher practice and student achievement in mathematics when implemented in schools that meet specified technology-readiness criteria. Key features of research-based professional learning include intensive…
An Initial Multi-Domain Modeling of an Actively Cooled Structure
NASA Technical Reports Server (NTRS)
Steinthorsson, Erlendur
1997-01-01
A methodology for the simulation of turbine cooling flows is being developed. The methodology seeks to combine numerical techniques that optimize both accuracy and computational efficiency. Key components of the methodology include the use of multiblock grid systems for modeling complex geometries, and multigrid convergence acceleration for enhancing computational efficiency in highly resolved fluid flow simulations. The use of the methodology has been demonstrated in several turbomachinery flow and heat transfer studies. Ongoing and future work involves implementing additional turbulence models, improving computational efficiency, and adding adaptive mesh refinement (AMR).
Examining emotional expressions in discourse: methodological considerations
NASA Astrophysics Data System (ADS)
Hufnagel, Elizabeth; Kelly, Gregory J.
2017-10-01
This methodological paper presents an approach for examining emotional expressions through discourse analysis and ethnographic methods. Drawing on trends in the current literature in science education, we briefly explain the importance of emotions in science education and examine the current research methodologies used in interactional emotion studies. We put forth and substantiate a methodological approach that attends to the interactional, contextual, intertextual, and consequential aspects of emotional expressions. By examining emotional expressions in the discourse in which they are constructed, emotional expressions are identified through semantics, contextualization, and linguistic features. These features make salient four dimensions of emotional expressions: aboutness, frequency, type, and ownership. Drawing on data from a large empirical study of pre-service elementary teachers' emotional expressions about climate change in a science course, we provide illustrative examples to describe what counts as emotional expressions in situ. In doing so we explain how our approach makes salient the nuanced nature of such expressions as well as the broader discourse in which they are constructed and the implications for researching emotional expressions in science education discourse. We suggest reasons why this discourse-oriented research methodology can contribute to the interactional study of emotions in science education contexts.
Setnik, Beatrice; Schoedel, Kerri A; Levy-Cooperman, Naama; Shram, Megan; Pixton, Glenn C; Roland, Carl L
With the development of opioid abuse-deterrent formulations (ADFs), there is a need to conduct well-designed human abuse potential studies to evaluate the effectiveness of their deterrent properties. Although these types of studies have been conducted for many years, largely to evaluate inherent abuse potential of a molecule and inform drug scheduling, methodological approaches have varied across studies. The focus of this review is to describe current "best practices" and methodological adaptations required to assess abuse-deterrent opioid formulations for regulatory submissions. A literature search was conducted in PubMed® to review methodological approaches (study conduct and analysis) used in opioid human abuse potential studies. Search terms included a combination of "opioid," "opiate," "abuse potential," "abuse liability," "liking," AND "pharmacodynamic," and only studies that evaluated single doses of opioids in healthy, nondependent individuals with or without prior opioid experience were included. Seventy-one human abuse potential studies meeting the prespecified criteria were identified, of which 21 studies evaluated a purported opioid ADF. Based on these studies, key methodological considerations were reviewed and summarized according to participant demographics, study prequalification, comparator and dose selection, route of administration and drug manipulation, study blinding, outcome measures and training, safety, and statistical analyses. The authors recommend careful consideration of key elements (eg, a standardized definition of a "nondependent recreational user"), as applicable, and offer key principles and "best practices" when conducting human abuse potential studies for opioid ADFs. Careful selection of appropriate study conditions is dependent on the type of ADF technology being evaluated.
Historic Landscape Inventory for Marietta National Cemetery
2017-11-14
development context, a description of current conditions, and an analysis of changes over time to the cultural landscape. All landscape features were… Methodology (Background): The U.S. Congress codified the National Historic Preservation Act of…
ASPEN Plus in the Chemical Engineering Curriculum: Suitable Course Content and Teaching Methodology
ERIC Educational Resources Information Center
Rockstraw, David A.
2005-01-01
An established methodology involving the sequential presentation of five skills on ASPEN Plus to undergraduate seniors majoring in ChE is presented in this document: (1) specifying unit operations; (2) manipulating physical properties; (3) accessing variables; (4) specifying nonstandard components; and (5) applying advanced features. This…
Barriers and Coping Mechanisms Relating to Agroforestry Adoption by Smallholder Farmers in Zimbabwe
ERIC Educational Resources Information Center
Chitakira, Munyaradzi; Torquebiau, Emmanuel
2010-01-01
Purpose: The purpose of the present study was to investigate agroforestry adoption by smallholder farmers in Gutu District, Zimbabwe. Design/Methodology/Approach: The methodology was based on field data collected through household questionnaires, key informant interviews and direct observations. Findings: Major findings reveal that traditional…
Incorporating Sustainability Content and Pedagogy through Faculty Development
ERIC Educational Resources Information Center
Hurney, Carol A.; Nash, Carole; Hartman, Christie-Joy B.; Brantmeier, Edward J.
2016-01-01
Purpose: Key elements of a curriculum are presented for a faculty development program that integrated sustainability content with effective course design methodology across a variety of disciplines. The study aims to present self-reported impacts for a small number of faculty participants and their courses. Design/methodology/approach: A yearlong…
The Methodological Underdog: A Review of Quantitative Research in the Key Adult Education Journals
ERIC Educational Resources Information Center
Boeren, Ellen
2018-01-01
An examination of articles published in leading adult education journals demonstrates that qualitative research dominates. To better understand this situation, a review of journal articles reporting on quantitative research has been undertaken by the author of this article. Differences in methodological strengths and weaknesses between…
MULTI-MEDIA MICROBIOLOGICAL RISK ASSESSMENT METHODOLOGY FOR MUNICIPAL WASTEWATER SLUDGES
In order to reduce the risk of municipal sludge to acceptable levels, the U.S. EPA has undertaken a regulatory program based on risk assessment and risk management. The key to such a program is the development of a methodology which allows the regulatory agency to quantify the re...
Global-local methodologies and their application to nonlinear analysis
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1989-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
#eVALUate: Monetizing Service Acquisition Trade-offs Using the QUALITY-INFUSED Price Methodology
2016-04-01
methodologies that are incompatible with the characteristics of services. These methodologies involve best-value source selection and contractor… defense contractors for services, it lacks the key elements at the strategic and tactical levels to make service contracts a managed outcome (U.S.… Regulation (FAR) defines a service contract as a "contract that directly engages the time and effort of a contractor whose primary purpose is to
Object-Oriented Scientific Programming with Fortran 90
NASA Technical Reports Server (NTRS)
Norton, C.
1998-01-01
Fortran 90 is a modern language that introduces many important new features beneficial for scientific programming. We discuss our experiences in plasma particle simulation and unstructured adaptive mesh refinement on supercomputers, illustrating the features of Fortran 90 that support the object-oriented methodology.
ERIC Educational Resources Information Center
Tam, Winnie; Cox, Andrew M.; Bussey, Andy
2009-01-01
Purpose: The purpose of this paper is to identify the features that international student users prefer for next generation OPACs. Design/methodology/approach: A total of 16 international students of the University of Sheffield were interviewed in July 2008 to explore their preferences among potential features in next generation OPACs. A…
Feature Mining and Health Assessment for Gearboxes Using Run-Up/Coast-Down Signals
Zhao, Ming; Lin, Jing; Miao, Yonghao; Xu, Xiaoqiang
2016-01-01
Vibration signals measured in the run-up/coast-down (R/C) processes usually carry rich information about the health status of machinery. However, a major challenge in R/C signals analysis lies in how to exploit more diagnostic information, and how this information could be properly integrated to achieve a more reliable maintenance decision. Aiming at this problem, a framework of R/C signals analysis is presented for the health assessment of gearbox. In the proposed methodology, we first investigate the data preprocessing and feature selection issues for R/C signals. Based on that, a sparsity-guided feature enhancement scheme is then proposed to extract the weak phase jitter associated with gear defect. In order for an effective feature mining and integration under R/C, a generalized phase demodulation technique is further established to reveal the evolution of modulation feature with operating speed and rotation angle. The experimental results indicate that the proposed methodology could not only detect the presence of gear damage, but also offer a novel insight into the dynamic behavior of gearbox. PMID:27827831
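Phase demodulation of a gear-mesh component is conventionally done through the analytic signal; the paper's generalized, speed-and-angle-resolved variant is not reproduced here, so the sketch below shows only the standard Hilbert-transform step of extracting residual phase (the jitter) around a mesh harmonic, with all signal parameters synthetic.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 5000
t = np.arange(0, 2, 1 / fs)
f_mesh = 200.0
# Synthetic gear-mesh tone carrying weak phase jitter (the defect signature).
x = np.cos(2 * np.pi * f_mesh * t + 0.05 * np.sin(2 * np.pi * 3 * t))

# Band-pass around the mesh harmonic, then demodulate via the analytic signal.
b, a = butter(4, [180 / (fs / 2), 220 / (fs / 2)], btype="band")
analytic = hilbert(filtfilt(b, a, x))
phase = np.unwrap(np.angle(analytic)) - 2 * np.pi * f_mesh * t  # residual phase
print(phase.std())
```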
Nagarajan, Mahesh B.; Huber, Markus B.; Schlossbauer, Thomas; Leinsinger, Gerda; Krol, Andrzej; Wismüller, Axel
2014-01-01
Objective While dimension reduction has been previously explored in computer aided diagnosis (CADx) as an alternative to feature selection, previous implementations of its integration into CADx do not ensure strict separation between training and test data required for the machine learning task. This compromises the integrity of the independent test set, which serves as the basis for evaluating classifier performance. Methods and Materials We propose, implement and evaluate an improved CADx methodology where strict separation is maintained. This is achieved by subjecting the training data alone to dimension reduction; the test data is subsequently processed with out-of-sample extension methods. Our approach is demonstrated in the research context of classifying small diagnostically challenging lesions annotated on dynamic breast magnetic resonance imaging (MRI) studies. The lesions were dynamically characterized through topological feature vectors derived from Minkowski functionals. These feature vectors were then subject to dimension reduction with different linear and non-linear algorithms applied in conjunction with out-of-sample extension techniques. This was followed by classification through supervised learning with support vector regression. Area under the receiver-operating characteristic curve (AUC) was evaluated as the metric of classifier performance. Results Of the feature vectors investigated, the best performance was observed with the Minkowski functional 'perimeter', while comparable performance was observed with 'area'. Of the dimension reduction algorithms tested with 'perimeter', the best performance was observed with Sammon's mapping (0.84 ± 0.10), while comparable performance was achieved with the exploratory observation machine (0.82 ± 0.09) and principal component analysis (0.80 ± 0.10). Conclusions The results reported in this study with the proposed CADx methodology present a significant improvement over previous results reported with such small lesions on dynamic breast MRI. In particular, non-linear algorithms for dimension reduction exhibited better classification performance than linear approaches when integrated into our CADx methodology. We also note that while dimension reduction techniques may not necessarily provide an improvement in classification performance over feature selection, they do allow for a higher degree of feature compaction. PMID:24355697
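The methodological point, that the dimension-reduction mapping must be fit on training data only and then extended to test samples, translates directly into the fit/transform idiom. A minimal sketch with PCA standing in for the Sammon mapping and out-of-sample-extension machinery of the paper:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

X, y = make_classification(n_samples=300, n_features=50, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

pca = PCA(n_components=10).fit(Xtr)    # mapping learned from training data only
Ztr = pca.transform(Xtr)
Zte = pca.transform(Xte)               # out-of-sample extension for test data

model = SVR().fit(Ztr, ytr)            # supervised stage, as in the paper's SVR step
print(model.predict(Zte)[:5])
```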
NASA Astrophysics Data System (ADS)
Martín López, Alejandro
2015-08-01
This presentation discusses the results of 18 years of ethnographic and ethnohistorical studies on Chaco astronomies. The main features of the systems of astronomical knowledge of the Chaco Aboriginal groups will be discussed. In particular, we will discuss the relevance of the Milky Way, the role of the visibility of the Pleiades, the ways in which celestial space is represented, the constitution of astronomical orientations in geographic space, etc. We also address a key feature of their vision of the cosmos: the universe is seen by these groups as a socio-cosmos, where humans and non-humans are related. These are therefore actually socio-cosmologies. We will link this to the theories of Chaco Aboriginal groups about power and political relations. We will discuss how the study of Aboriginal astronomies must be performed alongside studies of the astronomies of Creole people and European migrants, as well as anthropological studies of science teaching in the formal education system and by the mass media. In this way we will discuss the relevance of a very complex system of interethnic relations in shaping these astronomical representations and practices. We will also discuss the general methodological implications of this case for ethnoastronomy studies. In particular, we will talk about the advantages of a study of regional scope and about the key importance of putting ethnoastronomy in contact with contemporary issues in the social sciences. We also analyze the importance of ethnoastronomy studies in relation to studies of the sociology of science, especially astronomy. We also study the potential impact on improving formal and informal science curricula and on shaping effective policies to protect the tangible and intangible astronomical heritage in a context of respect for the rights of Aboriginal groups.
Teede, Helena; Gibson-Helm, Melanie; Norman, Robert J; Boyle, Jacqueline
2014-01-01
Polycystic ovary syndrome (PCOS) is an under-recognized, common, and complex endocrinopathy. The name PCOS is a misnomer, and there have been calls for a change to reflect the broader clinical syndrome. The aim of the study was to determine perceptions held by women and primary health care physicians around key clinical features of PCOS and attitudes toward current and alternative names for the syndrome. We conducted a cross-sectional study utilizing a devised questionnaire. Participants were recruited throughout Australia via professional associations, women's health organizations, and a PCOS support group. Fifty-seven women with PCOS and 105 primary care physicians participated in the study. Perceptions of key clinical PCOS features and attitudes toward current and alternative syndrome names were investigated. Irregular periods were identified as a key clinical feature of PCOS by 86% of the women with PCOS and 90% of the primary care physicians. In both groups, 60% also identified hormone imbalance as a key feature. Among women with PCOS, 47% incorrectly identified ovarian cysts as key, 48% felt the current name is confusing, and 51% supported a change. Most primary care physicians agreed that the name is confusing (74%) and needs changing (81%); however, opinions on specific alternative names were divided. The name "polycystic ovary syndrome" is perceived as confusing, and there is general support for a change to reflect the broader clinical syndrome. Engagement of primary health care physicians and consumers is strongly recommended to ensure that an alternative name enhances understanding and recognition of the syndrome and its complex features.
Predicting Key Events in the Popularity Evolution of Online Information.
Hu, Ying; Hu, Changjun; Fu, Shushen; Fang, Mingzhe; Xu, Wenwen
2017-01-01
The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task: predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and containment of rumors. However, it is very challenging to solve this new prediction task due to two issues. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we accurately yet promptly predict them? In this paper we address these two issues. To handle the first one, we use a simple moving average to smooth variation, and then a universal method is presented for different patterns to identify the key events in popularity evolution. To deal with the second one, we extract different types of features that may have an impact on the key events, and then a correlation analysis is conducted in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and in order to emphasize prediction promptness, we design a new evaluation metric which considers both accuracy and promptness to evaluate our prediction task. Experimental and comparative results show the superiority of our prediction solution.
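The first steps described here, smoothing the popularity series with a simple moving average and then locating burst, peak, and fade, admit a compact sketch; the threshold fractions used below are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def key_events(series, window=5, burst_frac=0.5, fade_frac=0.2):
    """Locate burst, peak, and fade times on a smoothed popularity curve."""
    kernel = np.ones(window) / window
    smooth = np.convolve(series, kernel, mode="same")   # simple moving average
    peak = int(np.argmax(smooth))
    top = smooth[peak]
    burst = int(np.argmax(smooth >= burst_frac * top))  # first crossing of burst level
    after = smooth[peak:] <= fade_frac * top
    fade = peak + int(np.argmax(after)) if after.any() else len(series) - 1
    return burst, peak, fade

# Synthetic popularity curve: fast rise, slow decay.
pop = np.concatenate([np.linspace(0, 100, 30), np.linspace(100, 5, 70)])
print(key_events(pop))
```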
Persistence and uncertainty in the academic career
Petersen, Alexander M.; Riccaboni, Massimo; Stanley, H. Eugene; Pammolli, Fabio
2012-01-01
Understanding how institutional changes within academia may affect the overall potential of science requires a better quantitative representation of how careers evolve over time. Because knowledge spillovers, cumulative advantage, competition, and collaboration are distinctive features of the academic profession, both the employment relationship and the procedures for assigning recognition and allocating funding should be designed to account for these factors. We study the annual production n_i(t) of a given scientist i by analyzing longitudinal career data for 200 leading scientists and 100 assistant professors from the physics community. Our empirical analysis of individual productivity dynamics shows that (i) there are increasing returns for the top individuals within the competitive cohort, and that (ii) the distribution of production growth is a leptokurtic "tent-shaped" distribution that is remarkably symmetric. Our methodology is general, and we speculate that similar features appear in other disciplines where academic publication is essential and collaboration is a key feature. We introduce a model of proportional growth which reproduces these two observations, and additionally accounts for the significantly right-skewed distributions of career longevity and achievement in science. Using this theoretical model, we show that short-term contracts can amplify the effects of competition and uncertainty, making careers more vulnerable to early termination, not necessarily due to lack of individual talent and persistence, but because of random negative production shocks. We show that fluctuations in scientific production are quantitatively related to a scientist's collaboration radius and team efficiency. PMID:22431620
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoogmartens, Rob, E-mail: rob.hoogmartens@uhasselt.be; Van Passel, Steven, E-mail: steven.vanpassel@uhasselt.be; Van Acker, Karel, E-mail: karel.vanacker@lrd.kuleuven.be
Increasing interest in sustainability has led to the development of sustainability assessment tools such as Life Cycle Analysis (LCA), Life Cycle Costing (LCC) and Cost–Benefit Analysis (CBA). Due to methodological disparity of these three tools, conflicting assessment results generate confusion for many policy and business decisions. In order to interpret and integrate assessment results, the paper provides a framework that clarifies the connections and coherence between the included assessment methodologies. Building on this framework, the paper further focuses on key aspects to adapt any of the methodologies to full sustainability assessments. Aspects dealt with in the review are, for example, the reported metrics, the scope, data requirements, discounting, product- or project-relatedness, and approaches with respect to scarcity and labor requirements. In addition to these key aspects, the review shows that important connections exist: (i) the three tools can cope with social inequality, (ii) processes such as valuation techniques for LCC and CBA are common, (iii) Environmental Impact Assessment (EIA) is used as input in both LCA and CBA and (iv) LCA can be used in parallel with LCC. Furthermore, the most integrated sustainability approach combines elements of LCA and LCC to achieve the Life Cycle Sustainability Assessment (LCSA). The key aspects and the connections referred to in the review are illustrated with a case study on the treatment of end-of-life automotive glass. - Highlights: • Proliferation of assessment tools creates ambiguity and confusion. • The developed assessment framework clarifies connections between assessment tools. • Broadening LCA, key aspects are metric and data requirements. • Broadening LCC, key aspects are scope, time frame and discounting. • Broadening CBA, focus point, timespan, references, labor and scarcity are key.
3D Reconstruction and Approximation of Vegetation Geometry for Modeling of Within-canopy Flows
NASA Astrophysics Data System (ADS)
Henderson, S. M.; Lynn, K.; Lienard, J.; Strigul, N.; Mullarney, J. C.; Norris, B. K.; Bryan, K. R.
2016-02-01
Aquatic vegetation can shelter coastlines from waves and currents, sometimes resulting in accretion of fine sediments. We developed a photogrammetric technique for estimating the key geometric vegetation parameters that are required for modeling of within-canopy flows. Accurate estimates of vegetation geometry and density are essential to refine hydrodynamic models, but accurate, convenient, and time-efficient methodologies for measuring complex canopy geometries have been lacking. The novel approach presented here builds on recent progress in photogrammetry and computer vision. We analyzed the geometry of aerial mangrove roots, called pneumatophores, in Vietnam's Mekong River Delta. Although comparatively thin, pneumatophores are more numerous than mangrove trunks, and thus influence near bed flow and sediment transport. Quadrats (1 m²) were placed at low tide among pneumatophores. Roots were counted and measured for height and diameter. Photos were taken from multiple angles around each quadrat. Relative camera locations and orientations were estimated from key features identified in multiple images using open-source software (VisualSfM). Next, a dense 3D point cloud was produced. Finally, algorithms were developed for automated estimation of pneumatophore geometry from the 3D point cloud. We found good agreement between hand-measured and photogrammetric estimates of key geometric parameters, including mean stem diameter, total number of stems, and frontal area density. These methods can reduce time spent measuring in the field, thereby enabling future studies to refine models of water flows and sediment transport within heterogeneous vegetation canopies.
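One way stem-level parameters can be recovered from a dense point cloud is to slice the cloud at a fixed height and cluster the horizontal coordinates, one cluster per pneumatophore. A rough sketch of that idea (not the authors' algorithm) on a synthetic cloud, using scikit-learn's DBSCAN:

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Synthetic point cloud: three vertical stems plus scatter (x, y, z in meters).
stems = np.array([[0.2, 0.3], [0.6, 0.7], [0.8, 0.2]])
cloud = np.vstack([
    np.column_stack([rng.normal(cx, 0.005, 200),
                     rng.normal(cy, 0.005, 200),
                     rng.uniform(0, 0.3, 200)])
    for cx, cy in stems
])

# Slice at 10-15 cm height and cluster points in the horizontal plane.
sl = cloud[(cloud[:, 2] > 0.10) & (cloud[:, 2] < 0.15)]
labels = DBSCAN(eps=0.02, min_samples=5).fit_predict(sl[:, :2])
n_stems = len(set(labels)) - (1 if -1 in labels else 0)  # ignore noise label -1
print(f"stems detected: {n_stems}")
```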
NASA Astrophysics Data System (ADS)
Leijenaar, Ralph T. H.; Nalbantov, Georgi; Carvalho, Sara; van Elmpt, Wouter J. C.; Troost, Esther G. C.; Boellaard, Ronald; Aerts, Hugo J. W. L.; Gillies, Robert J.; Lambin, Philippe
2015-08-01
FDG-PET-derived textural features describing intra-tumor heterogeneity are increasingly investigated as imaging biomarkers. As part of the process of quantifying heterogeneity, image intensities (SUVs) are typically resampled into a reduced number of discrete bins. We focused on the implications of the manner in which this discretization is implemented. Two methods were evaluated: (1) R_D, dividing the SUV range into D equally spaced bins, where the intensity resolution (i.e. bin size) varies per image; and (2) R_B, maintaining a constant intensity resolution B. Clinical feasibility was assessed on 35 lung cancer patients, imaged before and in the second week of radiotherapy. Forty-four textural features were determined for different D and B for both imaging time points. Feature values depended on the intensity resolution and, of both assessed methods, R_B was shown to allow for a meaningful inter- and intra-patient comparison of feature values. Overall, patients ranked differently according to feature values (used as a surrogate for textural feature interpretation) between both discretization methods. Our study shows that the manner of SUV discretization has a crucial effect on the resulting textural features and the interpretation thereof, emphasizing the importance of standardized methodology in tumor texture analysis.
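The two discretization schemes differ only in what is held fixed: the number of bins (R_D) or the bin width (R_B). A small numpy sketch of both, with illustrative parameter values:

```python
import numpy as np

def discretize_RD(suv, D=64):
    """R_D: divide each image's own SUV range into D equal bins (bin size varies)."""
    edges = np.linspace(suv.min(), suv.max(), D + 1)
    return np.clip(np.digitize(suv, edges[1:-1]), 0, D - 1) + 1

def discretize_RB(suv, B=0.5):
    """R_B: fixed bin width B in SUV units (number of bins varies per image)."""
    return np.floor(suv / B).astype(int) + 1

suv = np.random.default_rng(0).uniform(0.5, 12.0, size=1000)  # synthetic tumor SUVs
print(discretize_RD(suv).max(), discretize_RB(suv).max())
```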
Feature Selection and Effective Classifiers.
ERIC Educational Resources Information Center
Deogun, Jitender S.; Choubey, Suresh K.; Raghavan, Vijay V.; Sever, Hayri
1998-01-01
Develops and analyzes four algorithms for feature selection in the context of rough set methodology. Experimental results confirm the expected relationship between the time complexity of these algorithms and the classification accuracy of the resulting upper classifiers. When compared, results of upper classifiers perform better than lower…
Survey of Key Concepts in Enactivist Theory and Methodology
ERIC Educational Resources Information Center
Reid, David A.; Mgombelo, Joyce
2015-01-01
This article discusses key concepts within enactivist writing, focussing especially on concepts involved in the enactivist description of cognition as embodied action: perceptually guided action, embodiment, and structural coupling through recurrent sensorimotor patterns. Other concepts on which these concepts depend are also discussed, including…
Compressive Feedback Control Design for Spatially Distributed Systems
2017-01-03
NecSys 2015 & 2016. The goal of this research is the development of new fundamental insights and methodologies to exploit structural properties of… One of the simplest classes of dynamical networks in which our proposed methodology can be explained in a simple setting is the class of first-order… We developed a novel methodology to obtain tight lower and upper bounds for the class of systemic measures. In the following, some of the key ideas behind our…
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1986-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
Evaluation of Image Segmentation and Object Recognition Algorithms for Image Parsing
2013-09-01
generation of the features from the key points. OpenCV uses Euclidean distance to match the key points and has the option to use Manhattan distance... feature vector includes polarity and intensity information. The final step is matching the key points. In OpenCV, Euclidean distance or Manhattan... the code below is one way, and OpenCV offers the function radiusMatch (a pair must have a distance less than a given maximum distance). OpenCV's...
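The radiusMatch call mentioned in this excerpt can be sketched as follows; the detector choice (SIFT), the file names, and the distance threshold are assumptions for illustration:

```python
import cv2

# Hypothetical input images; any pair of overlapping views will do.
img1 = cv2.imread("view_a.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view_b.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# NORM_L2 is Euclidean distance; NORM_L1 would give Manhattan distance.
matcher = cv2.BFMatcher(cv2.NORM_L2)

# radiusMatch keeps only candidate pairs whose descriptor distance is
# below the given maximum, as described in the excerpt above.
matches = matcher.radiusMatch(des1, des2, maxDistance=200.0)
good = [m[0] for m in matches if m]   # nearest neighbour within the radius
```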
Willis, Danny G; Sullivan-Bolyai, Susan; Knafl, Kathleen; Cohen, Marlene Z
2016-09-01
Scholars who research phenomena of concern to the discipline of nursing are challenged with making wise choices about different qualitative research approaches. Ultimately, they want to choose an approach that is best suited to answer their research questions. Such choices are predicated on having made distinctions between qualitative methodology, methods, and analytic frames. In this article, we distinguish two qualitative research approaches widely used for descriptive studies: descriptive phenomenological and qualitative description. Providing a clear basis that highlights the distinguishing features and similarities between descriptive phenomenological and qualitative description research will help students and researchers make more informed choices in deciding upon the most appropriate methodology in qualitative research. We orient the reader to distinguishing features and similarities associated with each approach and the kinds of research questions descriptive phenomenological and qualitative description research address. © The Author(s) 2016.
Quality and Rigor of the Concept Mapping Methodology: A Pooled Study Analysis
ERIC Educational Resources Information Center
Rosas, Scott R.; Kane, Mary
2012-01-01
The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative…
A Synthesis of a Quality Management Model for Education in Universities
ERIC Educational Resources Information Center
Srikanthan, G.; Dalrymple, John
2004-01-01
The paper attempts to synthesise the features of the model for quality management in education based on the approaches spelt out in four well-articulated methodologies for the practice of quality in higher education. Each methodology contributes to different views of education from the learners' and the institution's perspectives, providing…
Have Cognitive Diagnostic Models Delivered Their Goods? Some Substantial and Methodological Concerns
ERIC Educational Resources Information Center
Wilhelm, Oliver; Robitzsch, Alexander
2009-01-01
The paper by Rupp and Templin (2008) is an excellent work on the characteristics and features of cognitive diagnostic models (CDM). In this article, the authors comment on some substantial and methodological aspects of this focus paper. They organize their comments by going through issues associated with the terms "cognitive,"…
Wang, Xiaohua; Li, Xi; Rong, Mingzhe; Xie, Dingli; Ding, Dan; Wang, Zhixiang
2017-01-01
The ultra-high frequency (UHF) method is widely used in insulation condition assessment. However, UHF signal processing algorithms are complicated and their output is large, which hinders feature extraction and the recognition of partial discharge (PD) patterns. This article investigates chromatic methodology, which is novel in PD detection. The principles of chromatic processing in color science are introduced; chromatic processing represents UHF signals sparsely. The UHF signals obtained from PD experiments were processed using the chromatic methodology and characterized by three parameters in chromatic space (H, L, and S, representing dominant wavelength, signal strength, and saturation, respectively). The features of the UHF signals were studied hierarchically. The results showed that the chromatic parameters were consistent with conventional frequency-domain parameters. The global chromatic parameters can be used to distinguish UHF signals acquired by different sensors, and they reveal the propagation properties of the UHF signal in the L-shaped gas-insulated switchgear (GIS). Finally, typical PD defect patterns were recognized using the novel chromatic parameters in an actual GIS tank, with good recognition performance. PMID:28106806
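A rough sketch of the tristimulus idea behind chromatic processing is shown below; the Gaussian detector placement and widths are assumptions, and colorsys supplies a generic RGB-to-HLS conversion rather than the authors' exact mapping:

```python
import numpy as np
import colorsys

def chromatic_hls(freqs, spectrum):
    """Collapse a UHF magnitude spectrum into chromatic (H, L, S) values
    using three overlapping Gaussian 'detectors' (R, G, B) that span the
    band, in the spirit of tristimulus processing in color science."""
    lo, hi = freqs.min(), freqs.max()
    responses = []
    for center in np.linspace(lo, hi, 3):            # R, G, B detectors
        weight = np.exp(-0.5 * ((freqs - center) / ((hi - lo) / 4)) ** 2)
        responses.append(float(np.sum(spectrum * weight)))
    peak = max(responses) or 1.0                     # guard against all-zero input
    r, g, b = (v / peak for v in responses)
    # H ~ dominant wavelength, L ~ signal strength, S ~ saturation
    return colorsys.rgb_to_hls(r, g, b)
```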
Methodological Choices in Peer Nomination Research
ERIC Educational Resources Information Center
Cillessen, Antonius H. N.; Marks, Peter E. L.
2017-01-01
Although peer nomination measures have been used by researchers for nearly a century, common methodological practices and rules of thumb (e.g., which variables to measure; use of limited vs. unlimited nomination methods) have continued to develop in recent decades. At the same time, other key aspects of the basic nomination procedure (e.g.,…
Key Methodological Aspects of Translators' Training in Ukraine and in the USA
ERIC Educational Resources Information Center
Skyba, Kateryna
2015-01-01
The diversity of international relations in the globalized world has influenced the role of a translator that is becoming more and more important. Translators' training institutions today are to work out and to implement the best teaching methodology taking into consideration the new challenges of modern multinational and multicultural society.…
ERIC Educational Resources Information Center
Savvides, Nicola; Al-Youssef, Joanna; Colin, Mindy; Garrido, Cecilia
2014-01-01
This article highlights key theoretical and methodological issues and implications of being an insider/outsider when undertaking qualitative research in international educational settings. It first addresses discourses of "self" and "other," noting that identity and belonging emerge from fluid engagement between researchers and…
ERIC Educational Resources Information Center
Azevedo, Roger
2015-01-01
Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…
Forgetski Vygotsky: Or, a Plea for Bootstrapping Accounts of Learning
ERIC Educational Resources Information Center
Luntley, Michael
2017-01-01
This paper argues that sociocultural accounts of learning fail to answer the key question about learning--how is it possible? Accordingly, we should adopt an individualist bootstrapping methodology in providing a theory of learning. Such a methodology takes seriously the idea that learning is staged and distinguishes between a non-comprehending…
Multiple Cultures of Doing Geography Facilitate Global Studies
ERIC Educational Resources Information Center
Ahamer, Gilbert
2013-01-01
Purpose: This article aims to explain why geography is a prime discipline for analysing globalisation and a multicultural view of Global Studies. The generic approach of human geography to first select an appropriate methodology is taken as a key approach. Design/methodology/approach: Concepts from aggregate disciplines such as history, economics,…
Appraising the Ethos of Experiential Narratives: Key Aspects of a Methodological Challenge
ERIC Educational Resources Information Center
Conle, Carola; deBeyer, Michael
2009-01-01
In this essay, Carola Conle and Michael deBeyer describe their efforts to find a conceptual approach and methodology for the appraisal of the ethos of experiential narratives presented in a particular curriculum context. The language of "implied authorship," "the patterning of desire," and "friendships offered and received," first introduced by…
The Relationship between Ethical Positions and Methodological Approaches: A Scandinavian Perspective
ERIC Educational Resources Information Center
Beach, Dennis; Eriksson, Anita
2010-01-01
In this article, based on reading ethnographic theses, books and articles and conversations with nine key informants, we have tried to describe how research ethics are approached and written about in educational ethnography in Scandinavia. The article confirms findings from previous research that there are different methodological forms of…
Telephone-quality pathological speech classification using empirical mode decomposition.
Kaleem, M F; Ghoraani, B; Guergachi, A; Krishnan, S
2011-01-01
This paper presents a computationally simple and effective methodology based on empirical mode decomposition (EMD) for the classification of telephone-quality normal and pathological speech signals. EMD is used to decompose continuous normal and pathological speech signals into intrinsic mode functions, which are analyzed to extract physically meaningful and unique temporal and spectral features. Using continuous speech samples from a database of 51 normal and 161 pathological speakers, modified to simulate telephone-quality speech under different levels of noise, a linear classifier applied to the resulting feature vector achieves high classification accuracy, demonstrating the effectiveness of the methodology. The classification accuracy reported in this paper (89.7% for a signal-to-noise ratio of 30 dB) is a significant improvement over previously reported results for the same task and demonstrates the utility of our methodology for cost-effective remote voice pathology assessment over telephone channels.
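A hedged sketch of the EMD-based feature pipeline follows, using the PyEMD package; the per-IMF descriptors here (energy and zero-crossing rate) are simple stand-ins for the paper's temporal and spectral features:

```python
import numpy as np
from PyEMD import EMD    # pip install EMD-signal
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def emd_features(frame, max_imf=5):
    """Decompose a speech frame into intrinsic mode functions and compute
    simple per-IMF descriptors. The number of IMFs can vary per frame, so
    pad or truncate to a fixed length before classification."""
    imfs = EMD()(frame, max_imf=max_imf)
    feats = []
    for imf in imfs:
        feats.append(np.sum(imf ** 2))                      # IMF energy
        feats.append(np.mean(np.diff(np.sign(imf)) != 0))   # zero-crossing rate
    return np.array(feats)

# X: one feature vector per speaker, y: 0 = normal, 1 = pathological
# clf = LinearDiscriminantAnalysis().fit(X, y)   # a generic "linear classifier"
```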
Incremental Upgrade of Legacy Systems (IULS)
2001-04-01
analysis task employed SEI's Feature-Oriented Domain Analysis methodology (see FODA reference) and included several phases: • Context Analysis • Establish... Legacy, new Host, and upgrade system and software. The Feature-Oriented Domain Analysis approach (FODA, see SUM References) was used for this step... Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-21, ESD-90-TR-222); Software Engineering Institute, Carnegie Mellon University
A methodology for designing robust multivariable nonlinear control systems. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Grunberg, D. B.
1986-01-01
A new methodology is described for the design of nonlinear dynamic controllers for nonlinear multivariable systems providing guarantees of closed-loop stability, performance, and robustness. The methodology is an extension of the Linear-Quadratic-Gaussian with Loop-Transfer-Recovery (LQG/LTR) methodology for linear systems, thus hinging upon the idea of constructing an approximate inverse operator for the plant. A major feature of the methodology is a unification of both the state-space and input-output formulations. In addition, new results on stability theory, nonlinear state estimation, and optimal nonlinear regulator theory are presented, including the guaranteed global properties of the extended Kalman filter and optimal nonlinear regulators.
ERIC Educational Resources Information Center
Texas State Dept. of Commerce, Austin.
In 1993, Texas' 24 quality work force planning committees used a state-developed targeted occupations planning methodology to identify key industries and targeted occupations with the greatest potential for job openings in their respective regions. Between 11 and 20 key industries (13.5 on average) were identified for each region. The following 10…
Towards a Transferable UAV-Based Framework for River Hydromorphological Characterization.
Rivas Casado, Mónica; González, Rocío Ballesteros; Ortega, José Fernando; Leinster, Paul; Wright, Ros
2017-09-26
The multiple protocols that have been developed to characterize river hydromorphology, partly in response to legislative drivers such as the European Union Water Framework Directive (EU WFD), make the comparison of results obtained in different countries challenging. Recent studies have analyzed the comparability of existing methods, with remote sensing based approaches being proposed as a potential means of harmonizing hydromorphological characterization protocols. However, the resolution achieved by remote sensing products may not be sufficient to assess some of the key hydromorphological features that are required to allow an accurate characterization. Methodologies based on high resolution aerial photography taken from Unmanned Aerial Vehicles (UAVs) have been proposed by several authors as potential approaches to overcome these limitations. Here, we explore the applicability of an existing UAV based framework for hydromorphological characterization to three different fluvial settings representing some of the distinct ecoregions defined by the WFD geographical intercalibration groups (GIGs). The framework is based on the automated recognition of hydromorphological features via tested and validated Artificial Neural Networks (ANNs). Results show that the framework is transferable to the Central-Baltic and Mediterranean GIGs with accuracies in feature identification above 70%. Accuracies of 50% are achieved when the framework is implemented in the Very Large Rivers GIG. The framework successfully identified vegetation, deep water, shallow water, riffles, side bars and shadows for the majority of the reaches. However, further algorithm development is required to ensure a wider range of features (e.g., chutes, structures and erosion) are accurately identified. This study also highlights the need to develop an objective and fit for purpose hydromorphological characterization framework to be adopted within all EU member states to facilitate comparison of results.
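The abstract does not specify the ANN design, so the following sketch uses a generic scikit-learn multilayer perceptron on hypothetical per-pixel features as a stand-in:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical training arrays: X holds per-pixel predictors derived from
# the UAV orthoimagery (e.g. RGB values plus local texture statistics);
# y holds labels such as vegetation, deep water, shallow water, riffle.
X = np.load("pixel_features.npy")
y = np.load("pixel_labels.npy")

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500))
clf.fit(X, y)
print(clf.score(X, y))   # training accuracy; use held-out reaches in practice
```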
Using Multispectral False Color Imaging to Characterize Tropical Cyclone Structure and Environment
NASA Astrophysics Data System (ADS)
Cossuth, J.; Bankert, R.; Richardson, K.; Surratt, M. L.
2016-12-01
The Naval Research Laboratory's (NRL) tropical cyclone (TC) web page (http://www.nrlmry.navy.mil/TC.html) has provided nearly two decades of near-real-time access to TC-centric images and products for TC forecasters and enthusiasts around the world. In particular, the microwave imager and sounder information featured on this site provides crucial internal storm structure information by allowing users to perceive hydrometeor structure, providing key details beyond the cloud-top information available from visible and infrared channels. Toward improving TC analysis techniques and advancing the utility of the NRL TC web page, new research efforts are presented. This work demonstrates results, as well as the methodology used, to develop new automated, objective, satellite-based TC structure and intensity guidance and enhanced data-fusion imagery products that aim to bolster and streamline TC forecast operations. This presentation focuses on the creation and interpretation of false-color RGB composite imagery that leverages the different emissive and scattering properties of atmospheric ice, liquid, and vapor water, as well as ocean surface roughness, as seen by microwave radiometers. Specifically, a combination of near-real-time data and a standardized digital database of global TCs in microwave imagery from 1987-2012 is employed as a climatology of TC structures. The broad range of TC structures, from pinhole eyes through multiple-eyewall configurations, is characterized as resolved by passive microwave sensors. The extraction of these characteristic features from historical data also lends itself to statistical analysis. For example, histograms of brightness temperature distributions allow a rigorous examination of how structural features are conveyed in image products, enabling a better representation of colors and breakpoints as they relate to physical features. Such climatological work also suggests steps to better inform the near-real-time application of upcoming satellite datasets to TC analyses.
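A minimal sketch of how three microwave channels can be stretched and stacked into a false-color RGB composite follows; the channel assignment and stretch limits are invented for illustration and are not NRL's recipe:

```python
import numpy as np

def false_color(ch_r, ch_g, ch_b, limits):
    """Stack three microwave brightness-temperature channels into an RGB
    composite. limits is a list of (lo, hi) stretch bounds per channel."""
    def stretch(band, lo, hi):
        return np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    return np.dstack([stretch(band, lo, hi)
                      for band, (lo, hi) in zip((ch_r, ch_g, ch_b), limits)])

# e.g. rgb = false_color(tb89, tb37, tb19, [(180, 280), (200, 290), (220, 310)])
```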
Recognition of large scale deep-seated landslides in vegetated areas of Taiwan
NASA Astrophysics Data System (ADS)
Lin, C. W.; Tarolli, P.; Tseng, C. M.; Tseng, Y. H.
2012-04-01
In August 2009, Typhoon Morakot triggered thousands of landslides and debris flows; according to government reports, 619 people died, 76 went missing, and the economic loss was estimated at hundreds of millions of USD. Large deep-seated landslides are particularly critical and deserve attention, since they can reactivate during intense events and evolve into destructive failures. They are also difficult to recognize in the field, especially under dense forest cover. A detailed and constantly updated inventory map of such phenomena, together with the recognition of their topographic signatures, represents a key tool for landslide risk mitigation and mapping. The aim of this work is to test the performance of a recently developed method for the automatic extraction of geomorphic features related to landslide crowns (Tarolli et al., 2010), in support of field surveys, in order to develop a detailed and accurate inventory map of such phenomena. The methodology is based on the detection of thresholds derived from the statistical analysis of the variability of landform curvature in high-resolution LiDAR-derived topography. The analysis suggested that the method performed well, relative to field analysis, in localizing and extracting features related to deep-seated landslides. Thanks to the capability of LiDAR to detect bare-ground elevation even in forested areas, it was possible to recognize landslide features in detail, including in remote regions that are difficult to access. Reference: Tarolli, P., Sofia, G., Dalla Fontana, G. (2010). Geomorphic features extraction from high-resolution topography: landslide crowns and bank erosion. Natural Hazards, doi:10.1007/s11069-010-9695-2.
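The thresholding idea can be sketched as below; the Laplacian curvature proxy and the 1.5-sigma threshold are assumptions for illustration, not the exact statistics of Tarolli et al. (2010):

```python
import numpy as np

def crown_candidates(dtm, cell=1.0, n_sigma=1.5):
    """Flag grid cells whose landform curvature exceeds a threshold
    derived from the statistical variability of curvature across the
    LiDAR-derived terrain model (dtm, a 2D elevation array)."""
    g_row, g_col = np.gradient(dtm, cell)
    curv = np.gradient(g_row, cell, axis=0) + np.gradient(g_col, cell, axis=1)
    threshold = n_sigma * np.nanstd(curv)
    return curv > threshold       # boolean mask of candidate crown cells
```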
An overview on the emerging area of identification, characterization, and assessment of health apps.
Paglialonga, Alessia; Lugo, Alessandra; Santoro, Eugenio
2018-05-28
The need to characterize and assess health apps has inspired a significant amount of research in recent years, in search of methods able to provide potential app users with relevant, meaningful knowledge. This article presents an overview of the recent literature in this field and categorizes, by discussing specific examples, the various methodologies introduced so far for the identification, characterization, and assessment of health apps. Specifically, this article outlines the most significant web-based resources for app identification, relevant frameworks for the descriptive characterization of apps' features, and a number of methods for the assessment of quality along its various components (e.g., evidence base, trustworthiness, privacy, or user engagement). The development of methods to characterize apps' features and to assess their quality is important for defining benchmarks and minimum requirements. Similarly, such methods are important for categorizing potential risks and challenges in the field so that risks can be minimized, whenever possible, by design. Understanding methods for assessing apps is key to raising the standards of quality of health apps on the market, towards the final goal of delivering apps built on the pillars of evidence base, reliability, long-term effectiveness, and user-oriented quality. Copyright © 2018. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Soilán, M.; Riveiro, B.; Sánchez-Rodríguez, A.; González-deSantos, L. M.
2018-05-01
During the last few years, there has been substantial methodological development in the automatic processing of 3D point cloud data acquired by both terrestrial and aerial mobile mapping systems, motivated by improvements in surveying technologies and hardware performance. This paper presents a methodology that first extracts geometric and semantic information about the road markings within the surveyed area from Mobile Laser Scanning (MLS) data, and then employs it to isolate street areas where pedestrian crossings are found and, therefore, pedestrians are more likely to cross the road. Different safety-related features can then be extracted in order to assess the adequacy of the pedestrian crossing with regard to safety, and the results can be displayed in a Geographical Information System (GIS) layer. These features are defined in four processing modules: accessibility analysis, traffic light classification, traffic sign classification, and visibility analysis. The proposed methodology was validated in two cities in the northwest of Spain, obtaining both quantitative and qualitative results for pedestrian crossing classification and for each processing module of the safety assessment of pedestrian crossing environments.
Pathways from Cannabis to Psychosis: A Review of the Evidence
Burns, Jonathan K.
2013-01-01
The nature of the relationship between cannabis use (CU) and psychosis is complex and remains unclear. Researchers and clinicians remain divided regarding key issues such as whether or not cannabis is an independent cause of psychosis and schizophrenia. This paper reviews the field in detail, examining questions of causality, the neurobiological basis for such causality and for differential inter-individual risk, the clinical and cognitive features of psychosis in cannabis users, and patterns of course and outcome of psychosis in the context of CU. The author proposes two major pathways from cannabis to psychosis based on a differentiation between early-initiated lifelong CU and a scenario where vulnerable individuals without a lifelong pattern of use consume cannabis over a relatively brief period of time just prior to psychosis onset. Additional key factors determining the clinical and neurobiological manifestation of psychosis as well as course and outcome in cannabis users include: underlying genetic and developmental vulnerability to schizophrenia-spectrum disorders; and whether or not CU ceases or continues after the onset of psychosis. Finally, methodological guidelines are presented for future research aimed at both elucidating the pathways that lead from cannabis to psychosis and clarifying the long-term outcome of the disorder in those who have a history of using cannabis. PMID:24133460
Novel methodology for pharmaceutical expenditure forecast.
Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making.
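The probabilistic sensitivity analysis described in step 5 amounts to Monte Carlo sampling over the uncertain inputs; the following sketch uses invented parameter distributions, not the report's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000   # Monte Carlo draws

# Illustrative uncertainty distributions for one brand going off-patent
# with EUR 100M annual originator sales (all values are assumptions):
sales = 100.0
discount = rng.beta(4, 6, n)       # generic price discount vs the brand
penetration = rng.beta(6, 3, n)    # generic market share at peak sales

savings = sales * penetration * discount
print(f"mean savings: {savings.mean():.1f}M EUR, "
      f"95% interval: {np.percentile(savings, [2.5, 97.5])}")
```

Repeating this over every off-patent product and summing the draws yields the statistical distribution of the budget impact mentioned in the abstract.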
2012-01-01
Background In recent years, biological event extraction has emerged as a key natural language processing task, aiming to address the information overload problem in accessing the molecular biology literature. The BioNLP shared task competitions have contributed considerably to this recent interest. The first competition (BioNLP'09) focused on extracting biological events from Medline abstracts from a narrow domain, while the theme of the latest competition (BioNLP-ST'11) was generalization, and a wider range of text types, event types, and subject domains were considered. We view event extraction as a building block in larger discourse interpretation and propose a two-phase, linguistically grounded, rule-based methodology. In the first phase, a general, underspecified semantic interpretation is composed from syntactic dependency relations in a bottom-up manner. The notion of embedding underpins this phase, and it is informed by a trigger dictionary and argument identification rules. Coreference resolution is also performed at this step, allowing extraction of inter-sentential relations. The second phase is concerned with constraining the resulting semantic interpretation according to the shared task specifications. We evaluated our general methodology on core biological event extraction and speculation/negation tasks in three main tracks of BioNLP-ST'11 (GENIA, EPI, and ID). Results We achieved competitive results in the GENIA and ID tracks, while our results in the EPI track leave room for improvement. One notable feature of our system is that its performance is stable across abstracts and article bodies. Coreference resolution results in a minor improvement in system performance. Due to our interest in discourse-level elements, such as speculation/negation and coreference, we provide a more detailed analysis of our system performance in these subtasks. Conclusions The results demonstrate the viability of a robust, linguistically oriented methodology, which clearly distinguishes general semantic interpretation from shared-task-specific aspects, for biological event extraction. Our error analysis pinpoints some shortcomings, which we plan to address in future work within our incremental system development methodology. PMID:22759461
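A toy illustration of the first, trigger-dictionary phase is given below; the dictionary entries and the dependency-relation rule are invented for this sketch:

```python
# Minimal sketch of a trigger-dictionary pass over dependency relations.
TRIGGERS = {"expression": "Gene_expression",
            "phosphorylation": "Phosphorylation"}

def extract_events(dependencies):
    """dependencies: iterable of (head, relation, dependent) triples
    produced by a syntactic dependency parser."""
    events = []
    for head, relation, dependent in dependencies:
        event_type = TRIGGERS.get(head.lower())
        if event_type and relation in ("dobj", "nmod:of"):  # assumed rule
            events.append({"type": event_type, "theme": dependent})
    return events

print(extract_events([("expression", "nmod:of", "p53")]))
# [{'type': 'Gene_expression', 'theme': 'p53'}]
```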
Development of 3D pseudo pin-by-pin calculation methodology in ANC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, B.; Mayhue, L.; Huria, H.
2012-07-01
Advanced core and fuel assembly designs have been developed to improve operational flexibility and economic performance and to further enhance the safety features of nuclear power plants. The simulation of these new designs, along with strongly heterogeneous fuel loading, has brought new challenges to the reactor physics methodologies currently employed in industrial codes for core analyses. Control rod insertion during normal operation is one operational feature of the AP1000® plant, Westinghouse's next-generation Pressurized Water Reactor (PWR) design. This design improves operational flexibility and efficiency but significantly challenges conventional reactor physics methods, especially in pin power calculations. The mixed loading of fuel assemblies with significantly different neutron spectra causes a strong interaction between different fuel assembly types that is not fully captured by current core design codes. To overcome the weaknesses of the conventional methods, Westinghouse has developed a state-of-the-art 3D Pin-by-Pin Calculation Methodology (P3C) and successfully implemented it in the Westinghouse core design code ANC. The new methodology has been qualified and licensed for pin power prediction. The 3D P3C methodology, along with its application and validation, is discussed in the paper. (authors)
Qiao, Hong; Li, Yinlin; Li, Fengfu; Xi, Xuanyang; Wu, Wei
2016-10-01
Recently, many biologically inspired visual computational models have been proposed. The design of these models follows the related biological mechanisms and structures, and the models provide new solutions for visual recognition tasks. In this paper, based on recent biological evidence, we propose a framework to mimic the active and dynamic learning and recognition process of the primate visual cortex. In terms of principle, the main contribution is that the framework can achieve unsupervised learning of episodic features (including key components and their spatial relations) and semantic features (semantic descriptions of the key components), which support higher-level cognition of an object. In terms of performance, the advantages of the framework are as follows: 1) learning episodic features without supervision: for a class of objects without prior knowledge, the key components, their spatial relations, and their cover regions can be learned automatically through a deep neural network (DNN); 2) learning semantic features based on episodic features: within the cover regions of the key components, the semantic geometrical values of these components can be computed based on contour detection; 3) forming general knowledge of a class of objects: the general knowledge, mainly the key components, their spatial relations, and their average semantic values, can be formed as a concise description of the class; and 4) achieving higher-level cognition and dynamic updating: for a test image, the model can produce a classification and subclass semantic descriptions, and test samples with high confidence are selected to dynamically update the whole model. Experiments are conducted on face images, and good performance is achieved in each layer of the DNN and in the semantic description learning process. Furthermore, the model can be generalized to recognition tasks of other objects with learning ability.
NASA Technical Reports Server (NTRS)
Anderson, B. H.
1983-01-01
A broad program to develop advanced, reliable, and user oriented three-dimensional viscous design techniques for supersonic inlet systems, and encourage their transfer into the general user community is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
Myopic Loss Aversion: Demystifying the Key Factors Influencing Decision Problem Framing
ERIC Educational Resources Information Center
Hardin, Andrew M.; Looney, Clayton Arlen
2012-01-01
Advancement of myopic loss aversion theory has been hamstrung by conflicting results, methodological inconsistencies, and a piecemeal approach toward understanding the key factors influencing decision problem framing. A series of controlled experiments provides a more holistic view of the variables promoting myopia. Extending the information…
Perceptions on TRIST: Implications for INSET.
ERIC Educational Resources Information Center
Saunders, Murray
1987-01-01
The article explores strategic responses of key participants to the design and implementation of inservice education related to a technical and vocational education initiative developed in England and Wales. Data collected from interviews with the participants are used to discuss their aims, their roles as key informants, and methodological stance…
Q Methodology as a Tool for Program Assessment
ERIC Educational Resources Information Center
Ramlo, Susan E.
2015-01-01
Program assessment is now commonplace at most colleges and universities and is required for accreditation of specific degree programs. Key aspects of program assessment include program improvement, improved student learning, and adequate student preparation for the workforce. Thus, program assessment is a key ingredient to program health. Although…
ERIC Educational Resources Information Center
Cheung, Alan C. K.; Slavin, Robert E.
2013-01-01
The present review examines research on the effects of educational technology applications on mathematics achievement in K-12 classrooms. Unlike previous reviews, this review applies consistent inclusion standards to focus on studies that met high methodological standards. In addition, methodological and substantive features of the studies are…
Working in the Methodological "Outfield": The Case of Bourdieu and Occupational Therapy
ERIC Educational Resources Information Center
Watson, Jo; Grenfell, Michael
2016-01-01
The article reports on a study of methodological innovation involving occupational therapy (OT) students in higher education (HE). It is based on an original project which examined the experiences and outcomes of non-traditional entrants to pre-registration OT education. A feature of the original project was the application of the epistemological…
ERIC Educational Resources Information Center
Karabenick, Stuart A.; Zusho, Akane
2015-01-01
We provide a conceptual commentary on the articles in this special issue, first by describing the unique features of each study, focusing on what we consider to be their theoretical and methodological contributions, and then by highlighting significant crosscutting themes and future directions in the study of SRL. Specifically, we define SRL to be…
The Methodological Framework of Occupational Training in Culture and Art High Schools of Kazakhstan
ERIC Educational Resources Information Center
Kulbekova, Aigul K.; Tleubayeva, Balzhan S.; Tleubayev, Seraly Sh.; Saparova, Yulduz A.; Dildebayeva, Gulmira R.; Daribayeva, Raushan D.; Omar, Esen O.
2016-01-01
The purpose of this study is to examine specific features of the traditional Kazakh dances as the methodological foundation of training specialists in the culture and art universities. The article describes the main typologies of Kazakh dances, such as ritual and ceremonial, combative-hunting, work dances, household-imitative dances, festive and…
On Teaching the History of California Spanish to HLL Using Siri: Methodology and Procedures
ERIC Educational Resources Information Center
Lamar Prieto, Covadonga
2016-01-01
This article reports results from a study in which two groups of college level students were exposed to interactions with Apple's Siri in order to foster dialogue about their dialectal features. In this paper, the methodology and procedural challenges behind one of the activities that the participants completed are studied. These activities had…
ERIC Educational Resources Information Center
Ermeling, Bradley Alan
2012-01-01
Past and contemporary scholars have emphasized the importance of job-embedded, systematic instructional inquiry for educators. A recent review of the literature highlights four key features shared by several well documented inquiry approaches for classroom teachers. Interestingly, another line of research suggests that these key features also…
ERIC Educational Resources Information Center
Dunst, Carl J.
2015-01-01
A model for designing and implementing evidence-based in-service professional development in early childhood intervention as well as the key features of the model are described. The key features include professional development specialist (PDS) description and demonstration of an intervention practice, active and authentic job-embedded…
Salient Key Features of Actual English Instructional Practices in Saudi Arabia
ERIC Educational Resources Information Center
Al-Seghayer, Khalid
2015-01-01
This is a comprehensive review of the salient key features of the actual English instructional practices in Saudi Arabia. The goal of this work is to gain insights into the practices and pedagogic approaches to English as a foreign language (EFL) teaching currently employed in this country. In particular, we identify the following central features…
ERIC Educational Resources Information Center
Jung, Youngok; Zuniga, Stephen; Howes, Carollee; Jeon, Hyun-Joo; Parrish, Deborah; Quick, Heather; Manship, Karen; Hauser, Alison
2016-01-01
Noting the lack of research on how early childhood education (ECE) programmes within family literacy programmes influence Latino children's early language and literacy development, this study examined key features of ECE programmes, specifically teacher-child interactions and child engagement in language and literacy activities and how these…
Moving towards Optimising Demand-Led Learning: The 2005-2007 ECUANET Leonardo Da Vinci Project
ERIC Educational Resources Information Center
Dealtry, Richard; Howard, Keith
2008-01-01
Purpose: The purpose of this paper is to present the key project learning points and outcomes as a guideline for the future quality management of demand-led learning and development. Design/methodology/approach: The research methodology was based upon a corporate university blueprint architecture and browser toolkit developed by a member of the…
Analyzing Media: Metaphors as Methodologies.
ERIC Educational Resources Information Center
Meyrowitz, Joshua
Students have little intuitive insight into the process of thinking and structuring ideas. The image of metaphor for a phenomenon acts as a kind of methodology for the study of the phenomenon by (1) defining the key issues or problems; (2) shaping the type of research questions that are asked; (3) defining the type of data that are searched out;…
Evaluation of Adult Literacy Education in the United States: A Review of Methodological Issues
ERIC Educational Resources Information Center
Shi, Yan; Tsang, Mun C.
2008-01-01
This is a critical review of methodological issues in the evaluation of adult literacy education programs in the United States. It addresses the key research question: What are the appropriate methods for evaluating these programs under given circumstances? It identifies 15 evaluation studies that are representative of a range of adult literacy…
Organizing Education: Schools, School Districts, and the Study of Organizational History
ERIC Educational Resources Information Center
Duke, Daniel L.
2015-01-01
Purpose: The purpose of this paper is to present a rationale for organizational histories of schools and school districts and discuss the findings of selected examples of the genre. Design/methodology/approach: The author presents a vignette of an organizational history, discusses key elements of the methodology, and offers seven ways in which…
A Reflection on the Methodology Used for a Qualitative Longitudinal Study
ERIC Educational Resources Information Center
Evangelinou-Yiannakis, Angela
2017-01-01
This paper presents a reflection on the methodology used for a qualitative longitudinal study of the teaching of Modern Greek (Greek) in Western Australia under the Seconded Teachers from Greece Scheme (STGS). The study, a first of its kind, addressed an area of need in the teaching of Greek, investigating the perspectives of the key stakeholders…
Aguilar-Arredondo, Andrea; Arias, Clorinda; Zepeda, Angélica
2015-01-01
Hippocampal neurogenesis occurs in the adult brain in various species, including humans. A compelling question that arose when neurogenesis was accepted to occur in the adult dentate gyrus (DG) is whether new neurons become functionally relevant over time, which is key for interpreting their potential contributions to synaptic circuitry. The functional state of adult-born neurons has been evaluated using various methodological approaches, which have, in turn, yielded seemingly conflicting results regarding the timing of maturation and functional integration. Here, we review the contributions of different methodological approaches to addressing the maturation process of adult-born neurons and their functional state, discussing the contributions and limitations of each method. We aim to provide a framework for interpreting results based on the approaches currently used in neuroscience for evaluating functional integration. As shown by the experimental evidence, adult-born neurons are prone to respond from early stages, even when they are not yet fully integrated into circuits. The ongoing integration process for the newborn neurons is characterised by different features. However, they may contribute differently to the network depending on their maturation stage. When combined, the strategies used to date convey a comprehensive view of the functional development of newly born neurons while providing a framework for approaching the critical time at which new neurons become functionally integrated and influence brain function.
NASA Astrophysics Data System (ADS)
Ozevin, Didem; Fazel, Hossein; Cox, Justin; Hardman, William; Kessler, Seth S.; Timmons, Alan
2014-04-01
Gearbox components of aerospace structures are typically made of brittle materials with high fracture toughness, but susceptible to fatigue failure due to continuous cyclic loading. Structural Health Monitoring (SHM) methods are used to monitor the crack growth in gearbox components. Damage detection methodologies developed in laboratory-scale experiments may not represent the actual gearbox structural configuration, and are usually not applicable to real application as the vibration and wave properties depend on the material, structural layers and thicknesses. Also, the sensor types and locations are key factors for frequency content of ultrasonic waves, which are essential features for pattern recognition algorithm development in noisy environments. Therefore, a deterministic damage detection methodology that considers all the variables influencing the waveform signature should be considered in the preliminary computation before any experimental test matrix. In order to achieve this goal, we developed two dimensional finite element models of a gearbox cross section from front view and shaft section. The cross section model consists of steel revolving teeth, a thin layer of oil, and retention plate. An ultrasonic wave up to 1 MHz frequency is generated, and waveform histories along the gearbox are recorded. The received waveforms under pristine and cracked conditions are compared in order to analyze the crack influence on the wave propagation in gearbox, which can be utilized by both active and passive SHM methods.
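One simple way to compare pristine and cracked waveform histories is a normalized cross-correlation damage index, sketched below as a generic stand-in for the paper's waveform comparison:

```python
import numpy as np

def damage_index(baseline, current):
    """Normalized cross-correlation between the pristine (baseline) and
    cracked (current) waveforms recorded at the same location; returns
    0 for identical signals and grows as the crack alters the waveform."""
    b = (baseline - baseline.mean()) / baseline.std()
    c = (current - current.mean()) / current.std()
    rho = np.correlate(b, c, mode="full").max() / len(b)
    return 1.0 - rho
```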
Change management methodologies trained for automotive infotainment projects
NASA Astrophysics Data System (ADS)
Prostean, G.; Volker, S.; Hutanu, A.
2017-01-01
An automotive Electronic Control Unit (ECU) development project embedded within a car environment is constantly under attack from a continuous flow of modifications to specifications throughout the life cycle. Root causes for these modifications include, for instance, software or hardware implementation errors, or requirement changes to satisfy the forthcoming demands of the market and ensure later commercial success. It is unavoidable that from the very beginning until the end of the project, “requirement changes” will “expose” the agreed objectives defined by contract specifications, namely product features, budget, schedule, and quality. The key discussions focus on an automotive radio-navigation (infotainment) unit, which competes with aftermarket devices such as smartphones. This competition especially stresses currently used automotive development processes, which fit a four-year car development (introduction) cycle, against the one-year update cycle of a smartphone. The research investigates the possible impacts of changes during all phases of the project: the concept-validation, development, and debugging phases. Building a thorough understanding of prospective threats is of paramount importance in order to establish an adequate project management process for handling requirement changes. The authors' automotive development experience and a literature review of change- and configuration-management software development methodologies led to new conceptual models, which integrate into the structure of traditional development models used in automotive projects, more concretely radio-navigation projects.
Animal models for studying homeopathy and high dilutions: conceptual critical review.
Bonamin, Leoni Villano; Endler, Peter Christian
2010-01-01
This is a systematic review of the animal models used in studies of high dilutions. The objectives are to analyze the methodological quality of the papers and their reported results, and to highlight key conceptual aspects of high dilution in order to suggest clues concerning putative mechanisms of action. Papers for inclusion were identified systematically from the Pubmed-Medline database, using 'Homeopathy' and 'Animal' as keywords. Only original full papers in English published between January 1999 and June 2009 were included; reviews, scientific reports, theses, older papers, papers extracted from Medline using similar keywords, papers about mixed commercial formulas, and books were considered for discussion only. 31 papers describing 33 experiments were identified for the main analysis, and a total of 89 items were cited. Systematic analysis of the selected papers yielded evidence of some important intrinsic features of high dilution studies performed in animal models: a) methodological quality was generally adequate, although some aspects could be improved; b) convergence between results and materia medica is seen in some studies, pointing toward the possibility of systematic study of the Similia principle; c) both isopathic and Similia models seem useful for understanding some complex biological phenomena, such as parasite-host interactions; d) the effects of high dilutions seem to stimulate restoration of a 'stable state', as seen in several experimental models from both descriptive and mathematical points of view. Copyright 2009 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.
Identifying the determinants of premature mortality in Russia: overcoming a methodological challenge
Tomkins, Susannah; Shkolnikov, Vladimir; Andreev, Evgueni; Kiryanov, Nikolay; Leon, David A; McKee, Martin; Saburova, Lyudmila
2007-01-01
Background It is thought that excessive alcohol consumption is related to the high mortality among working age men in Russia. Moreover it has been suggested that alcohol is a key proximate driver of the very sharp fluctuations in mortality seen in this group since the mid-1980s. Designing an individual-level study suitable to address the potential acute effects of alcohol consumption on mortality in Russia has posed a challenge to epidemiologists, especially because of the need to identify factors that could underlie the rapid changes up and down in mortality rates that have been such a distinctive feature of the Russian mortality crisis. In order to address this study question which focuses on exposures acting shortly before sudden death, a cohort would be unfeasibly large and would suffer from recruitment bias. Methods Although the situation in Russia is unusual, with a very high death rate characterised by many sudden and apparently unexpected deaths in young men, the methodological problem is common to research on any cause of death where many deaths are sudden. Results We describe the development of an innovative approach that has overcome some of these challenges: a case-control study employing proxy informants and external data sources to collect information about proximate determinants of mortality. Conclusion This offers a set of principles that can be adopted by epidemiologists studying sudden and unexpected deaths in other settings. PMID:18045487
A systematic review of the therapeutic effects of Reiki.
vanderVaart, Sondra; Gijsen, Violette M G J; de Wildt, Saskia N; Koren, Gideon
2009-11-01
Reiki is an ancient form of Japanese healing. While this healing method is widely used for a variety of psychologic and physical symptoms, evidence of its effectiveness is scarce and conflicting. The purpose of this systematic review was to try to evaluate whether Reiki produces a significant treatment effect. Studies were identified using an electronic search of Medline, EMBASE, Cochrane Library, and Google Scholar. Quality of reporting was evaluated using a modified CONSORT Criteria for Herbal Interventions, while methodological quality was assessed using the Jadad Quality score. Two (2) researchers selected articles based on the following features: placebo or other adequate control, clinical investigation on humans, intervention using a Reiki practitioner, and published in English. They independently extracted data on study design, inclusion criteria, type of control, sample size, result, and nature of outcome measures. The modified CONSORT Criteria indicated that all 12 trials meeting the inclusion criteria were lacking in at least one of the three key areas of randomization, blinding, and accountability of all patients, indicating a low quality of reporting. Nine (9) of the 12 trials detected a significant therapeutic effect of the Reiki intervention; however, using the Jadad Quality score, 11 of the 12 studies ranked "poor." The serious methodological and reporting limitations of limited existing Reiki studies preclude a definitive conclusion on its effectiveness. High-quality randomized controlled trials are needed to address the effectiveness of Reiki over placebo.
Bjornsson, Christopher S; Lin, Gang; Al-Kofahi, Yousef; Narayanaswamy, Arunachalam; Smith, Karen L; Shain, William; Roysam, Badrinath
2009-01-01
Brain structural complexity has confounded prior efforts to extract quantitative image-based measurements. We present a systematic ‘divide and conquer’ methodology for analyzing three-dimensional (3D) multi-parameter images of brain tissue to delineate and classify key structures, and compute quantitative associations among them. To demonstrate the method, thick (~100 μm) slices of rat brain tissue were labeled using 3 – 5 fluorescent signals, and imaged using spectral confocal microscopy and unmixing algorithms. Automated 3D segmentation and tracing algorithms were used to delineate cell nuclei, vasculature, and cell processes. From these segmentations, a set of 23 intrinsic and 8 associative image-based measurements was computed for each cell. These features were used to classify astrocytes, microglia, neurons, and endothelial cells. Associations among cells and between cells and vasculature were computed and represented as graphical networks to enable further analysis. The automated results were validated using a graphical interface that permits investigator inspection and corrective editing of each cell in 3D. Nuclear counting accuracy was >89%, and cell classification accuracy ranged from 81–92% depending on cell type. We present a software system named FARSIGHT implementing our methodology. Its output is a detailed XML file containing measurements that may be used for diverse quantitative hypothesis-driven and exploratory studies of the central nervous system. PMID:18294697
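As a sketch of the classification step, a generic scikit-learn classifier over the 31 per-cell measurements could look like this; the file names and classifier choice are assumptions, and FARSIGHT's own classifier may differ:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical arrays: X has one row of 31 measurements per cell
# (23 intrinsic + 8 associative); y holds the cell-type labels
# (astrocyte, microglia, neuron, endothelial).
X = np.load("cell_features.npy")
y = np.load("cell_types.npy")

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # classification accuracy
```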
Collins, Brian D.; Brown, Kristin M.; Fairley, Helen C.
2008-01-01
This report presents the results of an evaluation of terrestrial light detection and ranging (LIDAR) for monitoring geomorphic change at archeological sites located within Grand Canyon National Park, Ariz. Traditionally, topographic change-detection studies have used total station methods for the collection of data related to key measurable features of site erosion such as the location of thalwegs and knickpoints of gullies that traverse archeological sites (for example, Pederson and others, 2003). Total station methods require survey teams to walk within and on the features of interest within the archeological sites to take accurate measurements. As a result, site impacts may develop such as trailing, damage to cryptogamic crusts, and surface compaction that can exacerbate future erosion of the sites. National Park Service (NPS) resource managers have become increasingly concerned that repeated surveys for research and monitoring purposes may have a detrimental impact on the resources that researchers are trying to study and protect. Beginning in 2006, the Sociocultural Program of the U.S. Geological Survey's (USGS) Grand Canyon Monitoring and Research Center (GCMRC) initiated an evaluation of terrestrial LIDAR as a new monitoring tool that might enhance data quality and reduce site impacts. This evaluation was conducted as one part of an ongoing study to develop objective, replicable, quantifiable monitoring protocols for tracking the status and trend of variables affecting archeological site condition along the Colorado River corridor. The overall study consists of two elements: (1) an evaluation of the methodology through direct comparison to geomorphologic metrics already being collected by total station methods (this report) and (2) an evaluation of terrestrial LIDAR's ability to detect topographic change through the collection of temporally different datasets (a report on this portion of the study is anticipated early in 2009). The main goals of the first element of study were to (1) test the methodology and survey protocols of terrestrial LIDAR surveying under actual archeological site field conditions, (2) examine the ability to collect topographic data of entire archeological sites given such constraints as vegetation and rough topography, and (3) evaluate the ability of terrestrial LIDAR to accurately map the locations of key geomorphic features already being collected by total station methods such as gully thalweg and knickpoint locations. This report focuses on the ability of terrestrial LIDAR to duplicate total station methods, including typical erosion-related change features such as the plan view gully thalweg location and the gully thalweg long profile. The report also presents information concerning the use of terrestrial LIDAR for archeological site monitoring in a general sense. In addition, a detailed comparison of the site impacts caused by both total station and terrestrial LIDAR survey methods is presented using a suite of indicators, including total field survey time, field footstep count, and data-processing time. A thorough discussion of the relative benefits and limitations of using terrestrial LIDAR for monitoring erosion-induced changes at archeological sites in Grand Canyon National Park concludes this report.
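One simple way to quantify how closely a terrestrial LIDAR survey reproduces total-station measurements of a feature such as a gully thalweg is a nearest-neighbor comparison between the two datasets; the sketch below uses synthetic coordinates, since the report's actual comparison procedure is more involved:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
lidar_xyz = rng.uniform(0, 50, size=(100_000, 3))  # stand-in for a dense LIDAR point cloud (m)
thalweg_xyz = rng.uniform(0, 50, size=(40, 3))     # stand-in for total-station thalweg points (m)

tree = cKDTree(lidar_xyz)
dist, _ = tree.query(thalweg_xyz)  # distance from each surveyed point to its nearest LIDAR return
print(f"median offset: {np.median(dist):.3f} m; 95th percentile: {np.percentile(dist, 95):.3f} m")
```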
CNN based approach for activity recognition using a wrist-worn accelerometer.
Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R
2017-07-01
In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, conventional methods have been dominated by feature engineering, with its difficult process of optimal feature selection. This problem has been mitigated by using a novel methodology based on a deep learning framework, which automatically extracts the useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for recognition of three fundamental movements of the human forearm performed in daily life, where data are collected from four different subjects using a single wrist-worn accelerometer sensor. The proposed model is validated under different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8%, as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machine.
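A minimal sketch of this kind of model is shown below (PyTorch; the layer sizes, window length, and hyperparameters are illustrative assumptions, not the paper's reported architecture): fixed-length windows of tri-axial accelerometer data are mapped to one of three forearm-movement classes, with the convolutional layers learning the features that would otherwise be hand-engineered.

```python
import torch
import torch.nn as nn

class ActivityCNN(nn.Module):
    """1D CNN over raw accelerometer windows: conv layers learn the
    features, a linear head classifies the movement."""
    def __init__(self, n_classes: int = 3, window: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(32 * (window // 4), n_classes)

    def forward(self, x):              # x: (batch, 3 axes, window samples)
        return self.classifier(self.features(x).flatten(1))

model = ActivityCNN()
logits = model(torch.randn(8, 3, 128))  # 8 windows -> (8, 3) class scores
```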
NASA Astrophysics Data System (ADS)
Penkov, V. B.; Levina, L. V.; Novikova, O. S.; Shulmin, A. S.
2018-03-01
Herein we propose a methodology for constructing fully parametric analytical solutions to problems of elastostatic media, based on state-of-the-art computing facilities that support computer algebra. The methodology includes: direct and reverse application of the P-Theorem; methods of accounting for the physical properties of media; and accounting for variable geometrical parameters of bodies, parameters of boundary states, independent parameters of volume forces, and remote stress factors. An efficient tool for this task is the method of boundary states, originally designed for the purposes of computer algebra and based on the isomorphism of the Hilbert spaces of internal states and boundary states of bodies. We obtained fully parametric solutions of basic problems featuring a ball with a nonconcentric spherical cavity, a ball with a near-surface flaw, and an unbounded medium with two spherical cavities.
Cytomorphology of Circulating Colorectal Tumor Cells: A Small Case Series
Marrinucci, Dena; Bethel, Kelly; Lazar, Daniel; Fisher, Jennifer; Huynh, Edward; Clark, Peter; Bruce, Richard; Nieva, Jorge; Kuhn, Peter
2010-01-01
Several methodologies exist to enumerate circulating tumor cells (CTCs) from the blood of cancer patients; however, most methodologies lack high-resolution imaging, and thus, little is known about the cytomorphologic features of these cells. In this study of metastatic colorectal cancer patients, we used immunofluorescent staining with fiber-optic array scanning technology to identify CTCs, with subsequent Wright-Giemsa and Papanicolaou staining. The CTCs were compared to the corresponding primary and metastatic tumors. The colorectal CTCs showed marked intrapatient pleomorphism. In comparison to the corresponding tissue biopsies, cells from all sites showed similar pleomorphism, demonstrating that colorectal CTCs retain the pleomorphism present in regions of solid growth. They also often retain particular cytomorphologic features present in the patient's primary and/or metastatic tumor tissue. This study provides an initial analysis of the cytomorphologic features of circulating colon cancer cells, providing a foundation for further investigation into the significance and metastatic potential of CTCs. PMID:20111743
DOE Office of Scientific and Technical Information (OSTI.GOV)
Symons, Christopher T; Arel, Itamar
2011-01-01
Budgeted learning under constraints on both the amount of labeled information and the availability of features at test time pertains to a large number of real world problems. Ideas from multi-view learning, semi-supervised learning, and even active learning have applicability, but a common framework whose assumptions fit these problem spaces is non-trivial to construct. We leverage ideas from these fields based on graph regularizers to construct a robust framework for learning from labeled and unlabeled samples in multiple views that are non-independent and include features that are inaccessible at the time the model would need to be applied. We describe examples of applications that fit this scenario, and we provide experimental results to demonstrate the effectiveness of knowledge carryover from training-only views. As learning algorithms are applied to more complex applications, relevant information can be found in a wider variety of forms, and the relationships between these information sources are often quite complex. The assumptions that underlie most learning algorithms do not readily or realistically permit the incorporation of many of the data sources that are available, despite an implicit understanding that useful information exists in these sources. When multiple information sources are available, they are often partially redundant, highly interdependent, and contain noise as well as other information that is irrelevant to the problem under study. In this paper, we are focused on a framework whose assumptions match this reality, as well as the reality that labeled information is usually sparse. Most significantly, we are interested in a framework that can also leverage information in scenarios where many features that would be useful for learning a model are not available when the resulting model will be applied. As with constraints on labels, there are many practical limitations on the acquisition of potentially useful features. A key difference in the case of feature acquisition is that the same constraints often don't pertain to the training samples. This difference provides an opportunity to allow features that are impractical in an applied setting to nevertheless add value during the model-building process. Unfortunately, there are few machine learning frameworks built on assumptions that allow effective utilization of features that are only available at training time. In this paper we formulate a knowledge carryover framework for the budgeted learning scenario with constraints on features and labels. The approach is based on multi-view and semi-supervised learning methods that use graph-encoded regularization. Our main contributions are the following: (1) we propose and provide justification for a methodology for ensuring that changes in the graph regularizer using alternate views are performed in a manner that is target-concept specific, allowing value to be obtained from noisy views; and (2) we demonstrate how this general set-up can be used to effectively improve models by leveraging features unavailable at test time. The rest of the paper is structured as follows. In Section 2, we outline real-world problems to motivate the approach and describe relevant prior work. Section 3 describes the graph construction process and the learning methodologies that are employed. Section 4 provides preliminary discussion regarding theoretical motivation for the method.
In Section 5, the effectiveness of the approach is demonstrated in a series of experiments employing modified versions of two well-known semi-supervised learning algorithms. Section 6 concludes the paper.
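The graph-regularized machinery the abstract refers to can be illustrated with the simplest member of the family, Laplacian-regularized label propagation; the toy graph and labels below are placeholders, and the paper's multi-view construction is more elaborate:

```python
import numpy as np

def propagate_labels(W, y, labeled_mask, lam=1.0):
    """Minimize f'Lf + lam * sum over labeled nodes of (f_i - y_i)^2,
    i.e. solve (L + C) f = C y with C = lam * diag(labeled_mask)."""
    L = np.diag(W.sum(axis=1)) - W                 # combinatorial graph Laplacian
    C = lam * np.diag(labeled_mask.astype(float))  # fit only the labeled nodes
    return np.linalg.solve(L + C + 1e-9 * np.eye(len(W)), C @ y)

W = np.array([[0, 1, 1, 0],   # toy 4-node similarity graph; in the paper the
              [1, 0, 1, 0],   # edges could come from a view available only
              [1, 1, 0, 1],   # at training time
              [0, 0, 1, 0]], dtype=float)
y = np.array([1.0, 0.0, 0.0, -1.0])   # labels known at nodes 0 and 3 only
print(propagate_labels(W, y, labeled_mask=np.array([1, 0, 0, 1])))
```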
Ju, Seung-hwan; Seo, Hee-suk; Han, Sung-hyu; Ryou, Jae-cheol; Kwak, Jin
2013-01-01
The prevalence of computers and the development of the Internet have made it easy to access information. As concern about the security of user information has grown, so has interest in user authentication methods. The most common computer authentication method is the use of alphanumeric usernames and passwords. Password authentication is convenient, but it is vulnerable: anyone who knows the password can authenticate as the user. Fingerprint-based authentication is strong, because only the user holds the biometric information needed to authenticate, but it has the disadvantage that the user cannot change the authentication key. In this study, we propose an authentication methodology that combines a numeric-based password with a biometric fingerprint authentication system. Information in the user's fingerprint is used to derive secure authentication keys, while the numeric-based password can be changed easily, so the authentication keys were designed to provide flexibility. PMID:24151601
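A minimal sketch of the combination described above: a changeable numeric password and a fingerprint-derived value are folded into one authentication key. Real fingerprint templates are noisy, so a production design would need a fuzzy extractor to obtain a stable biometric digest; here the template bytes are assumed stable for illustration, and all names are hypothetical.

```python
import hashlib

def derive_auth_key(password: str, fingerprint_template: bytes) -> bytes:
    """Key = PBKDF2(password, salt derived from the fingerprint digest).
    Changing the password re-keys without re-enrolling the finger."""
    bio_salt = hashlib.sha256(fingerprint_template).digest()  # user-specific salt
    return hashlib.pbkdf2_hmac("sha256", password.encode(), bio_salt, 200_000)

key = derive_auth_key("482913", b"stand-in minutiae template bytes")
print(key.hex())
```

Neither factor suffices alone: the password by itself lacks the biometric salt, and the fingerprint by itself lacks the (changeable) password.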
Darwinism and positivism as methodological influences on the development of psychology.
Mackenzie, B
1976-10-01
The methodological significance of evolutionary theory for psychology may be distinguished from its substantive or theoretical significance. The methodological significance was that evolutionary theory broadened the current conceptions of scientific method and rendered them relatively independent of physics. It thereby made the application of the "scientific method" to psychology much more feasible than it had been previously, and thus established the possibility of a wide-ranging scientific psychology for the first time. The methodological eclecticism that made scientific psychology possible did not, however, remain a feature of psychology for very long. Psychology's methodology rapidly became restricted and codified through the influence of, and in imitation of, the rigorously positivistic orientation of physics around the turn of the twentieth century.
Learning representations for the early detection of sepsis with deep neural networks.
Kam, Hye Jin; Kim, Ha Young
2017-10-01
Sepsis is one of the leading causes of death in intensive care unit patients. Early detection of sepsis is vital because mortality increases as the sepsis stage worsens. This study aimed to develop detection models for the early stage of sepsis using deep learning methodologies, and to compare the feasibility and performance of the new deep learning methodology with those of the regression method with conventional temporal feature extraction. Study group selection adhered to the InSight model. The results of the deep learning-based models and the InSight model were compared. With deep feedforward networks, the areas under the ROC curve (AUC) were 0.887 and 0.915 for the InSight and the new feature sets, respectively. For the model with the combined feature set, the AUC was the same as that of the basic feature set (0.915). For the long short-term memory model, only the basic feature set was applied, and the AUC improved to 0.929 compared with the existing 0.887 of the InSight model. The contributions of this paper can be summarized in three ways: (i) improved performance without feature extraction using domain knowledge, (ii) verification of the feature extraction capability of deep neural networks through comparison with reference features, and (iii) improved performance over feedforward neural networks by using long short-term memory, a neural network architecture that can learn sequential patterns. Copyright © 2017 Elsevier Ltd. All rights reserved.
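A minimal sketch of the sequential model family used above (PyTorch; the feature count, window length, and sizes are placeholders rather than the paper's configuration): each patient is a sequence of vital-sign vectors, and the last hidden state is mapped to a sepsis-risk probability.

```python
import torch
import torch.nn as nn

class SepsisLSTM(nn.Module):
    def __init__(self, n_features: int = 9, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                        # x: (batch, time, features)
        _, (h, _) = self.lstm(x)                 # h[-1]: last hidden state
        return torch.sigmoid(self.head(h[-1]))   # probability of early sepsis

model = SepsisLSTM()
risk = model(torch.randn(4, 24, 9))              # 4 patients, 24 hourly vectors
```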
Selka, F; Nicolau, S; Agnus, V; Bessaid, A; Marescaux, J; Soler, L
2015-03-01
In minimally invasive surgery, the tracking of deformable tissue is a critical component for image-guided applications. Deformation of the tissue can be recovered by tracking features using tissue surface information (texture, color, ...). Recent work in this field has shown success in acquiring tissue motion. However, the performance evaluation of detection and tracking algorithms on such images is still difficult and is not standardized, mainly due to the lack of ground truth for real data. Moreover, in order to avoid supplementary techniques to remove outliers, no quantitative work has been undertaken to evaluate the benefit of a pre-process based on image filtering, which can improve feature tracking robustness. In this paper, we propose a methodology to validate detection and feature tracking algorithms, using a trick based on forward-backward tracking that provides artificial ground truth data. We describe a clear and complete methodology to evaluate and compare different detection and tracking algorithms. In addition, we extend our framework to propose a strategy to identify the best combinations from a set of detector, tracker and pre-process algorithms, according to the live intra-operative data. Experiments have been performed on in vivo datasets and show that the pre-process can have a strong influence on tracking performance and that our strategy to find the best combinations is relevant for a reasonable computation cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
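The forward-backward idea is straightforward to sketch: track points from frame A to frame B, track the results back to A, and use the round-trip displacement as an error measure that needs no manual ground truth. The sketch below uses OpenCV's pyramidal Lucas-Kanade tracker on synthetic frames (a pure 2-pixel shift) as a stand-in for laparoscopic images:

```python
import cv2
import numpy as np

def forward_backward_error(frame_a, frame_b, pts_a):
    pts_b, st1, _ = cv2.calcOpticalFlowPyrLK(frame_a, frame_b, pts_a, None)
    pts_a2, st2, _ = cv2.calcOpticalFlowPyrLK(frame_b, frame_a, pts_b, None)
    ok = (st1.ravel() == 1) & (st2.ravel() == 1)          # valid in both passes
    err = np.linalg.norm(pts_a - pts_a2, axis=2).ravel()  # round-trip drift, px
    return err, ok

rng = np.random.default_rng(0)
frame_a = (rng.random((240, 320)) * 255).astype(np.uint8)
frame_b = np.roll(frame_a, 2, axis=1)                     # known 2 px motion
pts = cv2.goodFeaturesToTrack(frame_a, maxCorners=100, qualityLevel=0.01, minDistance=7)
err, ok = forward_backward_error(frame_a, frame_b, pts.astype(np.float32))
print(f"median forward-backward error: {np.median(err[ok]):.2f} px")
```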
NASA Astrophysics Data System (ADS)
Sahoo, Madhumita; Sahoo, Satiprasad; Dhar, Anirban; Pradhan, Biswajeet
2016-10-01
Groundwater vulnerability assessment has been an accepted practice to identify the zones with relatively increased potential for groundwater contamination. DRASTIC is the most popular secondary-information-based vulnerability assessment approach. The original DRASTIC approach considers the relative importance of features/sub-features based on subjective weighting/rating values. However, variability of features at a smaller scale is not reflected in this subjective vulnerability assessment process. In contrast to the subjective approach, objective weighting-based methods provide flexibility in weight assignment depending on the variation of the local system. However, experts' opinion is not directly considered in objective weighting-based methods. Thus the effectiveness of both subjective and objective weighting-based approaches needs to be evaluated. In the present study, three methods - the entropy information method (E-DRASTIC), the fuzzy pattern recognition method (F-DRASTIC) and single parameter sensitivity analysis (SA-DRASTIC) - were used to modify the weights of the original DRASTIC features to include local variability. Moreover, a grey incidence analysis was used to evaluate the relative performance of the subjective (DRASTIC and SA-DRASTIC) and objective (E-DRASTIC and F-DRASTIC) weighting-based methods. The performance of the developed methodology was tested in an urban area of Kanpur City, India. Relative performance of the subjective and objective methods varies with the choice of water quality parameters. This methodology can be applied elsewhere, with or without suitable modification. These evaluations establish the potential applicability of the methodology for general vulnerability assessment in urban contexts.
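The entropy-weighting step behind E-DRASTIC admits a compact sketch: features whose values vary more across the study area carry more information and receive larger objective weights. The data matrix below is a random placeholder with rows as map cells and columns as the seven DRASTIC features:

```python
import numpy as np

def entropy_weights(X):
    P = X / X.sum(axis=0)                               # normalise each feature column
    P = np.where(P == 0, 1e-12, P)                      # avoid log(0)
    e = -(P * np.log(P)).sum(axis=0) / np.log(len(X))   # per-feature entropy in [0, 1]
    d = 1.0 - e                                         # degree of diversification
    return d / d.sum()                                  # objective weights, sum to 1

X = np.random.default_rng(1).uniform(1, 10, size=(1000, 7))  # 1000 cells x 7 features
print(entropy_weights(X).round(3))
```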
Dzialak, Matthew R.; Olson, Chad V.; Harju, Seth M.; Webb, Stephen L.; Mudd, James P.; Winstead, Jeffrey B.; Hayden-Wing, L.D.
2011-01-01
Background Balancing animal conservation and human use of the landscape is an ongoing scientific and practical challenge throughout the world. We investigated reproductive success in female greater sage-grouse (Centrocercus urophasianus) relative to seasonal patterns of resource selection, with the larger goal of developing a spatially-explicit framework for managing human activity and sage-grouse conservation at the landscape level. Methodology/Principal Findings We integrated field-observation, Global Positioning Systems telemetry, and statistical modeling to quantify the spatial pattern of occurrence and risk during nesting and brood-rearing. We linked occurrence and risk models to provide spatially-explicit indices of habitat-performance relationships. As part of the analysis, we offer novel biological information on resource selection during egg-laying, incubation, and night. The spatial pattern of occurrence during all reproductive phases was driven largely by selection or avoidance of terrain features and vegetation, with little variation explained by anthropogenic features. Specifically, sage-grouse consistently avoided rough terrain, selected for moderate shrub cover at the patch level (within 90 m²), and selected for mesic habitat in mid and late brood-rearing phases. In contrast, risk of nest and brood failure was structured by proximity to anthropogenic features including natural gas wells and human-created mesic areas, as well as vegetation features such as shrub cover. Conclusions/Significance Risk in this and perhaps other human-modified landscapes is a top-down (i.e., human-mediated) process that would most effectively be minimized by developing a better understanding of specific mechanisms (e.g., predator subsidization) driving observed patterns, and using habitat-performance indices such as those developed herein for spatially-explicit guidance of conservation intervention. Working under the hypothesis that industrial activity structures risk by enhancing predator abundance or effectiveness, we offer specific recommendations for maintaining high-performance habitat and reducing low-performance habitat, particularly relative to the nesting phase, by managing key high-risk anthropogenic features such as industrial infrastructure and water developments. PMID:22022587
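The occurrence models described above belong to the used-versus-available resource selection family; a generic sketch (with illustrative covariates and a placeholder response, since the paper's exact model specification is not given in the abstract) is:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 2000
X = np.column_stack([
    rng.normal(size=n),       # terrain ruggedness index
    rng.uniform(0, 1, n),     # shrub cover at the patch level
    rng.exponential(1.0, n),  # distance to nearest gas well (km)
])
used = rng.integers(0, 2, n)  # placeholder; real data are GPS fixes vs random points

rsf = LogisticRegression().fit(X, used)
print(dict(zip(["ruggedness", "shrub", "dist_well"], rsf.coef_[0].round(2))))
```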
A Framework for Developing the Structure of Public Health Economic Models.
Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P
2016-01-01
A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of Public Health economic models, supporting efficient allocation of scarce resources. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Kerr, Deirdre; Chung, Gregory K. W. K.
2012-01-01
The assessment cycle of "evidence-centered design" (ECD) provides a framework for treating an educational video game or simulation as an assessment. One of the main steps in the assessment cycle of ECD is the identification of the key features of student performance. While this process is relatively simple for multiple choice tests, when…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
... (EPO) as the lead, to propose a revised standard for the filing of nucleotide and/or amino acid.... ST.25 uses a controlled vocabulary of feature keys to describe nucleic acid and amino acid sequences... patent data purposes. The XML standard also includes four qualifiers for amino acids. These feature keys...
Crafting your Elevator Pitch: Key Features of an Elevator Speech to Help You Reach the Top Floor
You never know when you will end up talking to someone who will end up helping to shape your career. Many of these chance meetings are brief and when you only get 2-3 minutes to make your case everything that you say has to count. This presentation will cover the key features o...
Modeling the Time Course of Feature Perception and Feature Information Retrieval
ERIC Educational Resources Information Center
Kent, Christopher; Lamberts, Koen
2006-01-01
Three experiments investigated whether retrieval of information about different dimensions of a visual object varies as a function of the perceptual properties of those dimensions. The experiments involved two perception-based matching tasks and two retrieval-based matching tasks. A signal-to-respond methodology was used in all tasks. A stochastic…
ERIC Educational Resources Information Center
Lee, Jang Ho
2012-01-01
Experimental methods have played a significant role in the growth of English teaching and learning studies. The paper presented here outlines basic features of experimental design, including the manipulation of independent variables, the role and practicality of randomised controlled trials (RCTs) in educational research, and alternative methods…
ERIC Educational Resources Information Center
Topolovcan, Tomislav
2016-01-01
This paper provides a critical analysis of art-based research in education, that is, in constructivist learning and teaching. It presents the methodological features and advantages of art-based research in terms of the axiological, ontological and epistemological features of the constructivist, participatory and critical scientific paradigm, and…
Living at Sea: Learning from Communal Life Aboard Sail Training Vessels
ERIC Educational Resources Information Center
McCulloch, Ken
2007-01-01
This paper considers features of domestic and social life aboard sail training vessels, exploring the particular character of life at sea, and how these features contribute to the distinctive character of sail training experience as a context for learning. Methodologically, the study lies in the sociological tradition of ethnography, focusing on…
Not Just the Taste: Why Adolescents Drink Alcopops
ERIC Educational Resources Information Center
Jones, Sandra C.; Reis, Samantha
2012-01-01
Purpose: The purpose of this paper is to determine the features of alcopops which make them attractive to Australian adolescents, which features are most important in determining choice of ready-to-drinks (RTDs) over other alcoholic drinks, and whether these vary by age and gender. Design/methodology/approach: Mixed methods study. Participants in…
NASA Astrophysics Data System (ADS)
Caruso, Fabio; Verdi, Carla; Poncé, Samuel; Giustino, Feliciano
2018-04-01
We develop a first-principles approach based on many-body perturbation theory to investigate the effects of the interaction between electrons and carrier plasmons on the electronic properties of highly doped semiconductors and oxides. Through the evaluation of the electron self-energy, we account simultaneously for electron-plasmon and electron-phonon coupling in theoretical calculations of angle-resolved photoemission spectra, electron linewidths, and relaxation times. We apply this methodology to electron-doped anatase TiO2 as an illustrative example. The simulated spectra indicate that electron-plasmon coupling in TiO2 underpins the formation of satellites at energies comparable to those of polaronic spectral features. At variance with phonons, however, the energy of plasmons and their spectral fingerprints depends strongly on the carrier concentration, revealing a complex interplay between plasmon and phonon satellites. The electron-plasmon interaction accounts for approximately 40% of the total electron-boson interaction strength, and it is key to improve the agreement with measured quasiparticle spectra.
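For context, the electron self-energy evaluated in this family of calculations typically takes the standard Fan-Migdal form, with the carrier plasmon entering as an additional bosonic mode alongside the phonons; the generic textbook expression (in common notation, not necessarily the authors' exact implementation) is:

```latex
\Sigma_{n\mathbf{k}}(\omega) = \sum_{m\nu} \int_{\mathrm{BZ}}
\frac{d\mathbf{q}}{\Omega_{\mathrm{BZ}}}\,
\lvert g_{mn\nu}(\mathbf{k},\mathbf{q})\rvert^{2}
\left[
\frac{n_{\mathbf{q}\nu} + f_{m\mathbf{k}+\mathbf{q}}}
     {\omega - \varepsilon_{m\mathbf{k}+\mathbf{q}} + \omega_{\mathbf{q}\nu} + i\eta}
+
\frac{n_{\mathbf{q}\nu} + 1 - f_{m\mathbf{k}+\mathbf{q}}}
     {\omega - \varepsilon_{m\mathbf{k}+\mathbf{q}} - \omega_{\mathbf{q}\nu} + i\eta}
\right]
```

Here g is the electron-boson coupling matrix element, ε and ω are electron and boson energies, n and f are Bose and Fermi occupations, and ν runs over the phonon branches plus the plasmon mode.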
Experimental platform for intra-uterine needle placement procedures
NASA Astrophysics Data System (ADS)
Madjidi, Yashar; Haidegger, Tamás.; Ptacek, Wolfgang; Berger, Daniel; Kirisits, Christian; Kronreif, Gernot; Fichtinger, Gabor
2013-03-01
A framework has been investigated to enable a variety of comparative studies in the context of needle-based gynaecological brachytherapy. Our aim was to create an anthropomorphic phantom-based platform. The three main elements of the platform are the organ model, needle guide, and needle drive. These have been studied and designed to replicate the close environment of brachytherapy treatment for cervical cancer. Key features were created with the help of collaborating interventional radio-oncologists and the observations made in the operating room. A phantom box, representing the uterus model, has been developed considering available surgical analogies and operational limitations, such as organs at risk. A modular phantom-based platform has been designed and prototyped with the capability of providing various boundary conditions for the target organ. By mimicking the female pelvic floor, this framework has been used to compare a variety of needle insertion techniques and configurations for cervical and uterine interventions. The results showed that the proposed methodology is useful for the investigation of quantifiable experiments in the intraabdominal and pelvic regions.
a Conceptual Framework for Virtual Geographic Environments Knowledge Engineering
NASA Astrophysics Data System (ADS)
You, Lan; Lin, Hui
2016-06-01
VGE geographic knowledge refers to the abstract and repeatable geo-information related to geo-science problems, geographical phenomena and geographical laws supported by VGE. This includes expert experience, evolution rules, simulation processes and prediction results in VGE. This paper proposes a conceptual framework for VGE knowledge engineering in order to effectively manage and use geographic knowledge in VGE. Our approach relies on previously well-established theories on knowledge engineering and VGE. The main contributions of this report are the following: (1) clear definitions of the concepts of VGE knowledge and VGE knowledge engineering; (2) the features that distinguish VGE knowledge from common knowledge; (3) a geographic knowledge evolution process that helps users rapidly acquire knowledge in VGE; and (4) a conceptual framework for VGE knowledge engineering providing the supporting methodological system for building an intelligent VGE. This conceptual framework systematically describes the related VGE knowledge theories and key technologies. It will promote the rapid transformation from geodata to geographic knowledge, and further reduce the gap between the data explosion and knowledge absence.
Analysis of Big Data from Space
NASA Astrophysics Data System (ADS)
Tan, J.; Osborne, B.
2017-09-01
Massive data have been collected through various space missions. To maximize the investment, these data need to be exploited to the fullest. In this paper, we address key topics concerning big data from space, covering their current status and future development, using a systems engineering method. First, we summarize space data, including operations data and mission data: their sources, access methods, '5V' characteristics, and application models based on the concept of big data, as well as the challenges they face in application. Second, we give proposals on platform design and architecture to meet the demands and challenges of space data application, taking into account the features of space data and their application models; the design emphasizes high scalability and flexibility in storage, computing and data mining. Third, we suggest typical and promising practices for space data application that demonstrate valuable methodologies for improving intelligence in space application, engineering, and science. Our work offers interdisciplinary knowledge to space engineers and information engineers.
Uncovering the features of negotiation in developing the patient-nurse relationship.
Stoddart, Kathleen; Bugge, Carol
2012-02-01
This article describes a study that set out to explore the interaction between patients and nurses in community practice settings, in order to understand the social meanings and understandings brought to the interaction and at play within it. The study used a grounded theory methodology with traditional procedures. Driven by constant comparative analysis, data were collected by non-participant observation and informal and semi-structured interviews in four community health centres. Eighteen patients and 18 registered practice nurses participated. Negotiation was found to be a fundamental process in patient-nurse interaction. Navigation, socio-cultural characteristics and power and control were found to be key properties of negotiation. The negotiation processes for developing understanding required patients and nurses to draw upon social meanings and understandings generated from within and beyond their current interaction. Social meanings and understandings created within and beyond the health-care setting influence negotiation. The developmental nature of negotiation in interaction is an important dimension of the patient-nurse relationship in community practice.
Cockpit Resource Management (CRM): A tool for improved flight safety (United Airlines CRM training)
NASA Technical Reports Server (NTRS)
Carroll, J. E.; Taggart, William R.
1987-01-01
The approach and methodology used in developing cockpit management skills is effective because of the following features: (1) A comparative method of learning is used enabling crewmembers to study different forms of teamwork. (2) The learning comes about as a result of crewmembers learning from one another instead of from an expert instructor. (3) Key elements of cockpit teamwork and effective management are studied so that crewmembers can determine how these elements can improve safety and problem solving. (4) Critique among the crewmembers themselves rather than from outsiders is used as a common focusing point for crews to provide feedback to one another on how each can be a more effective crewmember. (5) The training is continuous in the sense that it becomes part of recurrent, upgrade, and other forms of crewmember training and development. And (6) the training results in sound and genuine insights that come about through solid education as opposed to tutoring, coaching, or telling crewmembers how to behave more effectively.
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.; Connolly, Joseph W.
2016-01-01
This paper discusses the design and application of model-based engine control (MBEC) for use during emergency operation of the aircraft. The MBEC methodology is applied to the Commercial Modular Aero-Propulsion System Simulation 40k (CMAPSS40k) and features an optimal tuner Kalman Filter (OTKF) to estimate unmeasured engine parameters, which can then be used for control. During an emergency scenario, normally-conservative engine operating limits may be relaxed to increase the performance of the engine and overall survivability of the aircraft; this comes at the cost of additional risk of an engine failure. The MBEC architecture offers the advantage of estimating key engine parameters that are not directly measurable. Estimating the unknown parameters allows for tighter control over these parameters, and over the level of risk at which the engine will operate. This will allow the engine to achieve better performance than possible when operating to more conservative limits on a related, measurable parameter.
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.; Connolly, Joseph W.
2015-01-01
This paper discusses the design and application of model-based engine control (MBEC) for use during emergency operation of the aircraft. The MBEC methodology is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40,000) and features an optimal tuner Kalman Filter (OTKF) to estimate unmeasured engine parameters, which can then be used for control. During an emergency scenario, normally-conservative engine operating limits may be relaxed to increase the performance of the engine and overall survivability of the aircraft; this comes at the cost of additional risk of an engine failure. The MBEC architecture offers the advantage of estimating key engine parameters that are not directly measurable. Estimating the unknown parameters allows for tighter control over these parameters, and over the level of risk at which the engine will operate. This will allow the engine to achieve better performance than possible when operating to more conservative limits on a related, measurable parameter.
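The estimation machinery at the heart of MBEC can be illustrated with a generic discrete-time Kalman filter that fuses a linear engine model with noisy sensor readings to estimate states that are not directly measured; this is the textbook filter, not the OTKF itself, and the matrices are placeholders:

```python
import numpy as np

def kalman_step(x, P, z, A, H, Q, R):
    x_pred = A @ x                       # predict state with the engine model
    P_pred = A @ P @ A.T + Q
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

A = np.array([[0.98, 0.05], [0.0, 0.99]])  # placeholder engine-state dynamics
H = np.array([[1.0, 0.0]])                 # only the first state is sensed
x, P = np.zeros(2), np.eye(2)
x, P = kalman_step(x, P, z=np.array([1.2]), A=A, H=H,
                   Q=0.01 * np.eye(2), R=np.array([[0.1]]))
print(x)  # the estimate now carries information about the unmeasured state
```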
Cadogan, Cathal A; Ryan, Cristín; Hughes, Carmel
2016-01-01
There is a growing emphasis on behavior change in intervention development programmes aimed at improving public health and healthcare professionals' practice. A number of frameworks and methodological tools have been established to assist researchers in developing interventions seeking to change healthcare professionals' behaviors. The key features of behavior change intervention design involve specifying the target group (i.e. healthcare professional or patient cohort), the target behavior and identifying mediators (i.e. barriers and facilitators) of behavior change. Once the target behavior is clearly specified and understood, specific behavior change techniques can then be used as the basis of the intervention to target identified mediators of behavior change. This commentary outlines the challenges for pharmacy practice-based researchers in targeting dispensing as a behavior when developing behavior change interventions aimed at pharmacists and proposes a definition of dispensing to consider in future research. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA's Software Safety Standard
NASA Technical Reports Server (NTRS)
Ramsay, Christopher M.
2005-01-01
NASA (National Aeronautics and Space Administration) relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft (manned or unmanned) launched that did not have a computer on board that provided vital command and control services. Despite this growing dependence on software control and monitoring, there has been no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard (STD-8719.13B) has recently undergone a significant update in an attempt to provide that consistency. This paper will discuss the key features of the new NASA Software Safety Standard. It will start with a brief history of the use and development of software in safety critical applications at NASA. It will then give a brief overview of the NASA Software Working Group and the approach it took to revise the software engineering process across the Agency.
Model-Based Anomaly Detection for a Transparent Optical Transmission System
NASA Astrophysics Data System (ADS)
Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.
In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.
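Reduced to its essentials, the alarm rule compares measurements against a physics-model prediction and flags residuals that exceed a threshold calibrated to the measurement uncertainty; the sketch below uses made-up span losses and an assumed Gaussian error model, far simpler than the statistical estimation described in the chapter:

```python
import numpy as np

def alarm(measured_db, predicted_db, sigma_db=0.2, k=3.0):
    """Flag spans whose loss residual exceeds k standard deviations."""
    residual = measured_db - predicted_db
    return np.abs(residual) > k * sigma_db

measured = np.array([10.1, 10.3, 12.9, 10.0])   # measured span losses (dB)
predicted = np.array([10.0, 10.2, 10.1, 10.1])  # amplifier-physics model prediction
print(alarm(measured, predicted))               # third span triggers an alarm
```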
Atomically flat single-crystalline gold nanostructures for plasmonic nanocircuitry.
Huang, Jer-Shing; Callegari, Victor; Geisler, Peter; Brüning, Christoph; Kern, Johannes; Prangsma, Jord C; Wu, Xiaofei; Feichtner, Thorsten; Ziegler, Johannes; Weinmann, Pia; Kamp, Martin; Forchel, Alfred; Biagioni, Paolo; Sennhauser, Urs; Hecht, Bert
2010-01-01
Deep subwavelength integration of high-definition plasmonic nanostructures is of key importance in the development of future optical nanocircuitry for high-speed communication, quantum computation and lab-on-a-chip applications. To date, the experimental realization of proposed extended plasmonic networks consisting of multiple functional elements remains challenging, mainly because of the multi-crystallinity of commonly used thermally evaporated gold layers. This can produce structural imperfections in individual circuit elements that drastically reduce the yield of functional integrated nanocircuits. In this paper we demonstrate the use of large (>100 μm²) but thin (<80 nm) chemically grown single-crystalline gold flakes that, after immobilization, serve as an ideal basis for focused ion beam milling and other top-down nanofabrication techniques on any desired substrate. Using this methodology we obtain high-definition ultrasmooth gold nanostructures with superior optical properties and reproducible nano-sized features over micrometre-length scales. Our approach provides a possible solution to overcome the current fabrication bottleneck and realize high-definition plasmonic nanocircuitry.
The normative structure of mathematization in systematic biology.
Sterner, Beckett; Lidgard, Scott
2014-06-01
We argue that the mathematization of science should be understood as a normative activity of advocating for a particular methodology with its own criteria for evaluating good research. As a case study, we examine the mathematization of taxonomic classification in systematic biology. We show how mathematization is a normative activity by contrasting its distinctive features in numerical taxonomy in the 1960s with an earlier reform advocated by Ernst Mayr starting in the 1940s. Both Mayr and the numerical taxonomists sought to formalize the work of classification, but Mayr introduced a qualitative formalism based on human judgment for determining the taxonomic rank of populations, while the numerical taxonomists introduced a quantitative formalism based on automated procedures for computing classifications. The key contrast between Mayr and the numerical taxonomists is how they conceptualized the temporal structure of the workflow of classification, specifically where they allowed meta-level discourse about difficulties in producing the classification. Copyright © 2014. Published by Elsevier Ltd.
[Patient safety in management contracts].
Campillo-Artero, C
2012-01-01
Patient safety is becoming commonplace in management contracts. Since our experience in patient safety still falls short of other clinical areas, it is advisable to review some of its characteristics in order to improve its inclusion in these contracts. In this paper, opinions and recommendations concerning the design and review of contractual clauses on safety are given, as well as reflections drawn from methodological papers and the informal opinions of clinicians, who are most familiar with the nuances of safe and unsafe practices. After reviewing some features of these contracts, criteria for prioritizing and including safety objectives and activities in them, and key points for their evaluation, are described. The need to replace isolated activities with systemic and multifaceted ones is emphasized. Errors, limitations and improvement opportunities observed when contracts are linked to indicators, information and adverse event reporting systems are analysed. Finally, the influence of the rules of the game and of clinicians' behaviour is emphasised. Copyright © 2011 SECA. Published by Elsevier Espana. All rights reserved.
Bainbridge, Daryl; Brazil, Kevin; Krueger, Paul; Ploeg, Jenny; Taniguchi, Alan; Darnay, Julie
2011-01-01
There is increasing global interest in using regional palliative care networks (PCNs) to integrate care and create systems that are more cost-effective and responsive. We examined a PCN that used a community development approach to build capacity for palliative care in each distinct community in a region of southern Ontario, Canada, with the goal of achieving a competent integrated system. Using a case study methodology, we examined a PCN at the structural level through a document review, a survey of 20 organizational administrators, and an interview with the network director. The PCN identified 14 distinct communities at different stages of development within the region. Despite the lack of some key features that would facilitate efficient palliative care delivery across these communities, administrators largely viewed the network partnership as beneficial and collaborative. The PCN has attempted to recognize specific needs in each local area. Change is gradual but participatory. There remain structural issues that may negatively affect the functioning of the PCN.
Evidence and gaps in the literature on orthorexia nervosa.
Varga, Márta; Dukay-Szabó, Szilvia; Túry, Ferenc; van Furth, Eric F
2013-06-01
To review the literature on the prevalence, risk groups and risk factors of the alleged eating disorder orthorexia nervosa. We searched Medline and Pubmed using several key terms relating to orthorexia nervosa (ON) and checked the reference list of the articles that we found. Attention was given to methodological problems in these studies, such as the use of non-validated assessment instruments, small sample size and sample characteristics, which make generalization of the results impossible. Eleven studies were found. The average prevalence rate for orthorexia was 6.9% for the general population and 35–57.8% for high-risk groups (healthcare professionals, artists). Dieticians and other healthcare professionals are at high risk of ON. Risk factors include obsessive-compulsive features, eating-related disturbances and higher socioeconomic status. Relevant clinical experience, published literature and research data have increased in the last few years. The definition and diagnostic criteria of ON remain unclear. Further studies are needed to clarify appropriate diagnostic methods and the place of ON among psychopathological categories.
High-Yield Synthesis and Optical Properties of Carbon Nanotube Porins
Tunuguntla, Ramya H.; Chen, Xi; Belliveau, Allison; ...
2017-01-18
Carbon nanotube porins (CNTPs) are a convenient membrane-based model system for studying nanofluidic transport that replicates a number of key structural features of biological membrane channels. We present a generalized approach for CNTP synthesis using sonochemistry-assisted segmenting of carbon nanotubes. Prolonged tip sonication in the presence of lipid molecules debundles and fragments long carbon nanotube aggregates into stable and water-soluble individual CNTPs with lengths in the range 5–20 nm. We discuss the main parameters that determine the efficiency and the yield of this process, describe the optimized conditions for high-yield CNTP synthesis, and demonstrate that this methodology can be adapted for synthesis of CNTPs of different diameters. We also present the optical properties of CNTPs and show that a combination of Raman and UV–vis–NIR spectroscopy can be used to monitor the quality of the CNTP synthesis. Altogether, CNTPs represent a versatile nanopore building block for creating higher-order functional biomimetic materials.
Heyland, Daren K; Rooyakers, Olav; Mourtzakis, Marina; Stapleton, Renee D
2017-02-01
Recent literature has created considerable confusion about the optimal amount of protein/amino acids that should be provided to the critically ill patient. In fact, the evidentiary basis that directly tries to answer this question is relatively small. As a clinical nutrition research community, there is an urgent need to develop the optimal methods to assess the impact of exogenous protein/amino acid administration in the intensive care unit setting. That assessment can be conducted at various levels: (1) impact on stress response pathways, (2) impact on muscle synthesis and protein balance, (3) impact on muscle mass and function, and (4) impact on the patient's recovery. The objective of this research workshop was to review current literature relating to protein/amino acid administration for the critically ill patient and clinical outcomes and to discuss the key measurement and methodological features of future studies that should be done to inform the optimal protein/amino acid dose provided to critically ill patients.
ERIC Educational Resources Information Center
Uetake, Tetsuya
2015-01-01
Purpose: Large-scale collective action is necessary when managing agricultural natural resources such as biodiversity and water quality. This paper determines the key factors to the success of such action. Design/Methodology/Approach: This paper analyses four large-scale collective actions used to manage agri-environmental resources in Canada and…
Mixed Methods Research: What Are the Key Issues to Consider?
ERIC Educational Resources Information Center
Ghosh, Rajashi
2016-01-01
Mixed methods research (MMR) is increasingly becoming a popular methodological approach in several fields due to the promise it holds for comprehensive understanding of complex problems being researched. However, researchers interested in MMR often lack reference to a guide that can explain the key issues pertaining to the paradigm wars…
Setting up School Partnerships: Some Insights from Birmingham's Collegiate Academies
ERIC Educational Resources Information Center
Rutherford, Desmond; Jackson, Lindsay
2006-01-01
This article explores the key issues or dilemmas secondary schools face when considering collaborating to form a school partnership, in particular a Collegiate Academy. The methodology is an evaluation based on interviews with 15 headteachers and other key staff from three Collegiate Academies in Birmingham during the autumn of 2005. Seven key…
ERIC Educational Resources Information Center
Taylor, Tony; Collins, Sue
2012-01-01
This article critiques popular assumptions that underlie the ongoing politicisation of school history curriculum as an agent of social identity and behaviour. It raises some key research questions which need further investigation and suggests a potential methodology for establishing evidence-based understanding of the relationship between history…
Adolphus, Katie; Bellissimo, Nick; Lawton, Clare L; Ford, Nikki A; Rains, Tia M; Totosy de Zepetnek, Julia; Dye, Louise
2017-01-01
Breakfast is purported to confer a number of benefits on diet quality, health, appetite regulation, and cognitive performance. However, new evidence has challenged the long-held belief that breakfast is the most important meal of the day. This review aims to provide a comprehensive discussion of the key methodological challenges and considerations in studies assessing the effect of breakfast on cognitive performance and appetite control, along with recommendations for future research. This review focuses on the myriad challenges involved in studying children and adolescents specifically. Key methodological challenges and considerations include study design and location, sampling and sample selection, choice of objective cognitive tests, choice of objective and subjective appetite measures, merits of providing a fixed breakfast compared with ad libitum, assessment and definition of habitual breakfast consumption, transparency of treatment condition, difficulty of isolating the direct effects of breakfast consumption, untangling acute and chronic effects, and influence of confounding variables. These methodological challenges have hampered a clear substantiation of the potential positive effects of breakfast on cognition and appetite control and contributed to the debate questioning the notion that breakfast is the most important meal of the day. © 2017 American Society for Nutrition.
Graves, John A; Mishra, Pranita
2016-10-01
To highlight key methodological issues in studying insurance dynamics and to compare estimates across two commonly used surveys. Nonelderly uninsured adults and children sampled between 2001 and 2011 in the Medical Expenditure Panel Survey and the Survey of Income and Program Participation. We utilized nonparametric Kaplan-Meier methods to estimate quantiles (25th, 50th, and 75th percentiles) in the distribution of uninsured spells. We compared estimates obtained across surveys and across different methodological approaches to address issues like attrition, seam bias, censoring and truncation, and survey weighting method. All data were drawn from publicly available household surveys. Estimated uninsured spell durations in the MEPS were longer than those observed in the SIPP. There were few changes in spell durations between 2001 and 2011, with median durations of 14 months among adults and 5-7 months among children in the MEPS, and 8 months (adults) and 4 months (children) in the SIPP. The use of panel survey data to study insurance dynamics presents a unique set of methodological challenges. Researchers should consider key analytic and survey design trade-offs when choosing which survey can best suit their research goals. © Health Research and Educational Trust.
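The Kaplan-Meier quantiles used above can be sketched directly: spell durations in months, right-censored for spells still in progress at the end of the panel. The data below are illustrative, not drawn from either survey:

```python
import numpy as np

def km_quantile(durations, observed, q):
    """Smallest time at which the Kaplan-Meier survival curve falls to <= 1 - q."""
    surv = 1.0
    for t in np.sort(np.unique(durations[observed])):
        at_risk = np.sum(durations >= t)
        events = np.sum((durations == t) & observed)  # spells ending at t
        surv *= 1.0 - events / at_risk
        if surv <= 1.0 - q:
            return t
    return np.inf                                     # quantile not reached (censoring)

durations = np.array([2, 5, 8, 8, 14, 20, 26, 30])    # uninsured months per spell
observed = np.array([1, 1, 1, 0, 1, 1, 0, 0], bool)   # 0 = censored (still uninsured)
print(km_quantile(durations, observed, 0.50))         # median spell length -> 14
```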
Evaluating the Safety Profile of Non-Active Implantable Medical Devices Compared with Medicines.
Pane, Josep; Coloma, Preciosa M; Verhamme, Katia M C; Sturkenboom, Miriam C J M; Rebollo, Irene
2017-01-01
Recent safety issues involving non-active implantable medical devices (NAIMDs) have highlighted the need for better pre-market and post-market evaluation. Some stakeholders have argued that certain features of medicine safety evaluation should also be applied to medical devices. Our objectives were to compare the current processes and methodologies for the assessment of NAIMD safety profiles with those for medicines, identify potential gaps, and make recommendations for the adoption of new methodologies for the ongoing benefit-risk monitoring of these devices throughout their entire life cycle. A literature review served to examine the current tools for the safety evaluation of NAIMDs and those for medicines. We searched MEDLINE using these two categories. We supplemented this search with Google searches using the same key terms used in the MEDLINE search. Using a comparative approach, we summarized the new product design, development cycle (preclinical and clinical phases), and post-market phases for NAIMDs and drugs. We also evaluated and compared the respective processes to integrate and assess safety data during the life cycle of the products, including signal detection, signal management, and subsequent potential regulatory actions. The search identified a gap in NAIMD safety signal generation: no global program exists that collects and analyzes adverse events and product quality issues. Data sources in real-world settings, such as electronic health records, need to be effectively identified and explored as additional sources of safety information, particularly in some areas such as the EU and USA where there are plans to implement the unique device identifier (UDI). The UDI and other initiatives will enable more robust follow-up and assessment of long-term patient outcomes. The safety evaluation system for NAIMDs differs in many ways from those for drugs, but both systems face analogous challenges with respect to monitoring real-world usage. Certain features of the drug safety evaluation process could, if adopted and adapted for NAIMDs, lead to better and more systematic evaluations of the latter.
Chaisiri, Kittipong; Anantatat, Tippawan; Stekolnikov, Alexandr A.; Morand, Serge; Prasartvit, Anchana; Makepeace, Benjamin L.; Sungvornyothin, Sungsit; Paris, Daniel H.
2018-01-01
Background: Conventional gold standard characterization of chigger mites involves chemical preparation procedures (i.e. specimen clearing) for visualization of morphological features, which, however, contributes to destruction of the arthropod host DNA and any endosymbiont or pathogen DNA harbored within the specimen. Methodology/Principal findings: In this study, a novel workflow based on autofluorescence microscopy was developed to enable identification of trombiculid mites to the species level on the basis of morphological traits without any special preparation, while preserving the mite DNA for subsequent genotyping. A panel of 16 specifically selected fluorescence microscopy images of mite features from available identification keys served for complete chigger morphological identification to the species level, and was paired with corresponding genotype data. We evaluated and validated this method for paired chigger morphological and genotypic ID using the mitochondrial cytochrome c oxidase subunit I gene (coi) in 113 chigger specimens representing 12 species and 7 genera (Leptotrombidium, Ascoschoengastia, Gahrliepia, Walchia, Blankaartia, Schoengastia and Schoutedenichia) from the Lao People’s Democratic Republic (Lao PDR) to the species level (complete characterization), and 153 chiggers from 5 genera (Leptotrombidium, Ascoschoengastia, Helenicula, Schoengastiella and Walchia) from Thailand, Cambodia and Lao PDR to the genus level. A phylogenetic tree constructed from 77 coi gene sequences (approximately 640 bp length, n = 52 new coi sequences and n = 25 downloaded from GenBank) demonstrated clear grouping of assigned morphotypes at the genus level, although evidence of both genetic polymorphism and morphological plasticity was found. Conclusions/Significance: With this new methodology, we provided the largest collection of characterized coi gene sequences for trombiculid mites to date, and almost doubled the number of available characterized coi gene sequences with a single study. The ability to provide paired phenotypic-genotypic data is of central importance for future characterization of mites and dissecting the molecular epidemiology of mites transmitting diseases like scrub typhus. PMID:29494599
Janero, David R; Korde, Anisha; Makriyannis, Alexandros
2017-01-01
Detailed characterization of the ligand-binding motifs and structure-function correlates of the principal GPCRs of the endocannabinoid-signaling system, the cannabinoid 1 (CB1R) and cannabinoid 2 (CB2R) receptors, is essential to inform the rational design of drugs that modulate CB1R- and CB2R-dependent biosignaling for therapeutic gain. We discuss herein an experimental paradigm termed "ligand-assisted protein structure" (LAPS) that affords a means of characterizing, at the amino acid level, CB1R and CB2R structural features key to ligand engagement and receptor-dependent information transmission. For this purpose, LAPS integrates three key disciplines and methodologies: (a) medicinal chemistry: design and synthesis of high-affinity, pharmacologically active probes as reporters capable of reacting irreversibly with particular amino acids at (or in the immediate vicinity of) the ligand-binding domain of the functionally active receptor; (b) molecular and cellular biology: introduction of discrete, conservative point mutations into the target GPCR and determination of their effect on probe binding and pharmacological activity; (c) analytical chemistry: identification of the site(s) of probe-GPCR interaction through focused, bottom-up, amino acid-level proteomic identification of the probe-receptor complex using liquid chromatography tandem mass spectrometry. Subsequent in silico methods including ligand docking and computational modeling provide supplementary data on the probe-receptor interaction as defined by LAPS. Examples of LAPS as applied to human CB2R orthosteric binding site characterization for a biarylpyrazole antagonist/inverse agonist and a classical cannabinoid agonist belonging to distinct chemical classes of cannabinergic compounds are given as paradigms for further application of this methodology to other therapeutic protein targets. LAPS is well positioned to complement other experimental and in silico methods in contemporary structural biology such as X-ray crystallography. © 2017 Elsevier Inc. All rights reserved.
Predicting plant biomass accumulation from image-derived parameters
Chen, Dijun; Shi, Rongli; Pape, Jean-Michel; Neumann, Kerstin; Graner, Andreas; Chen, Ming; Klukas, Christian
2018-01-01
Background: Image-based high-throughput phenotyping technologies have been rapidly developed in plant science recently, and they offer great potential to gain more valuable information than traditional destructive methods. Predicting plant biomass is regarded as a key purpose for plant breeders and ecologists. However, it is a great challenge to find a predictive biomass model across experiments. Results: In the present study, we constructed 4 predictive models to examine the quantitative relationship between image-based features and plant biomass accumulation. Our methodology has been applied to 3 consecutive barley (Hordeum vulgare) experiments with control and stress treatments. The results showed that plant biomass can be accurately predicted from image-based parameters using a random forest model. The high prediction accuracy based on this model will contribute to relieving the phenotyping bottleneck in biomass measurement in breeding applications. The prediction performance is still relatively high across experiments under similar conditions. The relative contribution of individual features for predicting biomass was further quantified, revealing new insights into the phenotypic determinants of the plant biomass outcome. Furthermore, these methods could also be used to determine the most important image-based features related to plant biomass accumulation, which would be promising for subsequent genetic mapping to uncover the genetic basis of biomass. Conclusions: We have developed quantitative models to accurately predict plant biomass accumulation from image data. We anticipate that the analysis results will be useful to advance our views of the phenotypic determinants of plant biomass outcome, and the statistical methods can be broadly used for other plant species. PMID:29346559
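As a rough illustration of the modeling step described above, the sketch below fits a random forest regressor to image-derived features and inspects relative feature contributions. All data here are synthetic stand-ins (the paper's features come from barley phenotyping images), and scikit-learn is assumed; this is not the authors' pipeline.

```python
# Minimal sketch: predicting biomass from image-derived features with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-ins for image-derived features (e.g. projected leaf area,
# plant height, colour indices); real studies extract these from images.
n_plants, n_features = 200, 15
X = rng.random((n_plants, n_features))
biomass = 3.0 * X[:, 0] + 1.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, n_plants)

model = RandomForestRegressor(n_estimators=500, random_state=0)
print(cross_val_score(model, X, biomass, cv=5, scoring="r2").mean())

# Relative feature contributions, analogous to the paper's importance analysis.
model.fit(X, biomass)
print(model.feature_importances_[:5])
```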
Sequence Bundles: a novel method for visualising, discovering and exploring sequence motifs
2014-01-01
Background: We introduce Sequence Bundles, a novel data visualisation method for representing multiple sequence alignments (MSAs). We identify and address key limitations of existing bioinformatics data visualisation methods (i.e. the Sequence Logo) by enabling Sequence Bundles to give salient visual expression to sequence motifs and other data features, which would otherwise remain hidden. Methods: For the development of Sequence Bundles we employed research-led information design methodologies. Sequences are encoded as uninterrupted, semi-opaque lines plotted on a 2-dimensional reconfigurable grid. Each line represents a single sequence. The thickness and opacity of the stack at each residue in each position indicate the level of conservation, while the lines' curved paths expose patterns in correlation and functionality. Several MSAs can be visualised in a composite image. The Sequence Bundles method is designed to favour a tangible, continuous and intuitive display of information. Results: We have developed a software demonstration application for generating a Sequence Bundles visualisation of MSAs provided for the BioVis 2013 redesign contest. A subsequent exploration of the visualised line patterns allowed for the discovery of a number of interesting features in the dataset. Reported features include the extreme conservation of sequences displaying a specific residue and bifurcations of the consensus sequence. Conclusions: Sequence Bundles is a novel method for the visualisation of MSAs and the discovery of sequence motifs. It can aid in generating new insight and hypothesis making. Sequence Bundles is well suited to future implementation as interactive visual analytics software, which can complement existing visualisation tools. PMID:25237395
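To make the encoding concrete, here is a minimal, assumption-laden sketch of the Sequence Bundles idea in matplotlib: each aligned sequence becomes one semi-opaque line over a position-by-residue grid, so conserved columns appear as dense line knots. The toy alignment and the residue ordering on the y-axis are invented; the authors' software will differ in detail.

```python
# Toy Sequence Bundles-style view: one semi-transparent line per aligned sequence.
import matplotlib.pyplot as plt

alignment = ["MKVLAT", "MKVLGT", "MRVLAT", "MKILAT", "MKVMAT"]
residues = sorted({aa for seq in alignment for aa in seq})  # y-axis ordering
y_of = {aa: i for i, aa in enumerate(residues)}

for seq in alignment:
    plt.plot(range(len(seq)), [y_of[aa] for aa in seq],
             color="steelblue", alpha=0.3, linewidth=3)
plt.yticks(range(len(residues)), residues)
plt.xlabel("Alignment position")
plt.ylabel("Residue")
plt.title("Sequence Bundles-style view (toy alignment)")
plt.show()
```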
The longevity of lava dome eruptions
NASA Astrophysics Data System (ADS)
Wolpert, Robert L.; Ogburn, Sarah E.; Calder, Eliza S.
2016-02-01
Understanding the duration of past, ongoing, and future volcanic eruptions is an important scientific goal and a key societal need. We present a new methodology for forecasting the duration of ongoing and future lava dome eruptions based on a database (DomeHaz) recently compiled by the authors. The database includes duration and composition for 177 such eruptions, with "eruption" defined as the period encompassing individual episodes of dome growth along with associated quiescent periods during which extrusion pauses but unrest continues. In a key finding, we show that probability distributions for dome eruption durations are both heavy tailed and composition dependent. We construct objective Bayesian statistical models featuring heavy-tailed Generalized Pareto distributions with composition-specific parameters to make forecasts about the durations of new and ongoing eruptions that depend on both eruption duration to date and composition. Our Bayesian predictive distributions reflect both uncertainty about model parameter values (epistemic uncertainty) and the natural variability of the geologic processes (aleatoric uncertainty). The results are illustrated by presenting likely trajectories for 14 dome-building eruptions ongoing in 2015. Full representation of the uncertainty is presented for two key eruptions, Soufrière Hills Volcano in Montserrat (10-139 years, median 35 years) and Sinabung, Indonesia (1-17 years, median 4 years). Uncertainties are high but, importantly, quantifiable. This work provides for the first time a quantitative and transferable method and rationale on which to base long-term planning decisions for lava dome-forming volcanoes, with wide potential use and transferability to forecasts of other types of eruptions and other adverse events across the geohazard spectrum.
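The paper's forecasting logic can be sketched with SciPy's Generalized Pareto distribution: fit composition-specific shape and scale parameters, then condition on the eruption duration observed so far. The durations below are simulated placeholders rather than the DomeHaz records, and the fully Bayesian treatment of parameter uncertainty is omitted for brevity.

```python
# Hedged sketch: heavy-tailed duration forecasting with a Generalized Pareto fit.
from scipy.stats import genpareto

# Synthetic stand-in for composition-specific eruption durations (years).
durations_years = genpareto.rvs(c=0.5, scale=5.0, size=177, random_state=1)

# Fit shape (c) and scale; a positive shape implies a heavy tail.
c_hat, loc_hat, scale_hat = genpareto.fit(durations_years, floc=0.0)
print(f"shape={c_hat:.2f}, scale={scale_hat:.2f}")

# Conditional forecast: P(total duration > t + s | already erupting for t years).
t, s = 5.0, 10.0
p = (genpareto.sf(t + s, c_hat, loc_hat, scale_hat)
     / genpareto.sf(t, c_hat, loc_hat, scale_hat))
print(f"P(lasts {s} more years | {t} years so far) = {p:.2f}")
```

For heavy-tailed distributions this conditional probability stays surprisingly large as t grows, which is why long-running dome eruptions get long forecast horizons.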
One Controller at a Time (1-CAT): A mimo design methodology
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Lucas, J. C.
1987-01-01
The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.
Using mixed methods in health research
Woodman, Jenny
2013-01-01
Summary Mixed methods research is the use of quantitative and qualitative methods in a single study or series of studies. It is an emergent methodology which is increasingly used by health researchers, especially within health services research. There is a growing literature on the theory, design and critical appraisal of mixed methods research. However, there are few papers that summarize this methodological approach for health practitioners who wish to conduct or critically engage with mixed methods studies. The objective of this paper is to provide an accessible introduction to mixed methods for clinicians and researchers unfamiliar with this approach. We present a synthesis of key methodological literature on mixed methods research, with examples from our own work and that of others, to illustrate the practical applications of this approach within health research. We summarize definitions of mixed methods research, the value of this approach, key aspects of study design and analysis, and discuss the potential challenges of combining quantitative and qualitative methods and data. One of the key challenges within mixed methods research is the successful integration of quantitative and qualitative data during analysis and interpretation. However, the integration of different types of data can generate insights into a research question, resulting in enriched understanding of complex health research problems. PMID:23885291
Peláez-Coca, M. D.; Orini, M.; Lázaro, J.; Bailón, R.; Gil, E.
2013-01-01
A methodology that combines information from several nonstationary biological signals is presented. This methodology is based on time-frequency coherence, which quantifies the similarity of two signals in the time-frequency domain. A cross time-frequency analysis method, based on quadratic time-frequency distributions, has been used for combining information from several nonstationary biomedical signals. In order to evaluate this methodology, the respiratory rate is estimated from the photoplethysmographic (PPG) signal. Respiration provokes simultaneous changes in the pulse interval, amplitude, and width of the PPG signal, which suggests that combining information from these sources will improve the accuracy of the respiratory rate estimation. Another aim of this paper is to implement an algorithm that provides a robust estimation. Therefore, the respiratory rate was estimated only in those intervals where the features extracted from the PPG signal are linearly coupled. In 38 spontaneously breathing subjects, among which 7 were characterized by a respiratory rate lower than 0.15 Hz, this methodology provided accurate estimates, with a median error of {0.00; 0.98} mHz ({0.00; 0.31}%) and an interquartile range error of {4.88; 6.59} mHz ({1.60; 1.92}%). The estimation error of the presented methodology was considerably lower than the estimation error obtained without combining different PPG features related to respiration. PMID:24363777
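As a simplified stand-in for the cross time-frequency analysis, the sketch below uses ordinary magnitude-squared coherence (scipy.signal.coherence) between two synthetic PPG-derived feature series sharing a 0.25 Hz respiratory modulation, and reads off the respiratory rate where the two are coupled. The paper's quadratic time-frequency distributions are replaced by a stationary estimate, and the 4 Hz feature-sampling rate is an assumption.

```python
# Coherence between two PPG-derived feature series to locate the respiratory rate.
import numpy as np
from scipy.signal import coherence

fs = 4.0                          # feature series resampled at 4 Hz (assumption)
t = np.arange(0, 300, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)      # shared 0.25 Hz respiratory component
rng = np.random.default_rng(2)
amplitude = resp + 0.5 * rng.standard_normal(t.size)   # PPG amplitude feature
width = 0.8 * resp + 0.5 * rng.standard_normal(t.size) # PPG width feature

f, Cxy = coherence(amplitude, width, fs=fs, nperseg=256)
band = (f > 0.1) & (f < 0.5)              # plausible respiratory band
f_resp = f[band][np.argmax(Cxy[band])]
print(f"estimated respiratory rate ~ {f_resp:.2f} Hz")   # ~0.25 Hz
```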
Narrative review of yoga intervention clinical trials including weight-related outcomes.
Rioux, Jennifer Grace; Ritenbaugh, Cheryl
2013-01-01
Medical authorities have identified obesity as a causal factor in the development of diabetes, hypertension, and cardiovascular disease (CVD), and more broadly, of metabolic syndrome/insulin resistance syndrome. To provide solutions that can modify this risk factor, researchers need to identify methods of effective risk reduction and primary prevention of obesity. Research on the effectiveness of yoga as a treatment for obesity is limited, and studies vary in overall quality and methodological rigor. This narrative review assessed the quantity and quality of clinical trials of yoga as an intervention for weight loss or as a means of risk reduction or treatment for obesity and diseases in which obesity is a causal factor. This review summarized the studies' research designs and evaluated the efficacy of yoga for weight loss via the current evidence base. The research team evaluated published studies to determine the appropriateness of research designs, comparability of programs' intervention elements, and standardization of outcome measures. The research team's literature search used the key terms yoga and obesity or yoga and weight loss in three primary medical-literature databases (PubMed, PsychInfo, and Web of Science). The study excluded clinical trials with no quantitative obesity-related measure. Extracted data included each study's (1) design; (2) setting and population; (3) nature, duration, and frequency of interventions; (4) comparison groups; (5) recruitment strategies; (6) outcome measures; (7) data analysis and presentation; and (8) results and conclusions. The research team developed an overall evaluation parameter to compare disparate trials. The research team reviewed each study to determine its key features, each worth a specified number of points, with a maximum total of 20 points. The features included a study's (1) duration, (2) frequency of yoga practice, (3) intensity of (length of) each practice, (4) number of yogic elements, (5) inclusion of dietary modification, (6) inclusion of a residential component, (7) the number of weight-related outcome measures, and (8) a discussion of the details of the yogic elements. Overall, therapeutic yoga programs are frequently effective in promoting weight loss and/or improvements in body composition. The effectiveness of yoga for weight loss is related to the following key features: (1) an increased frequency of practice; (2) a longer intervention duration; (3) a yogic dietary component; (4) a residential component; (5) the comprehensive inclusion of yogic components; and (6) a home-practice component. Yoga appears to be an appropriate and potentially successful intervention for weight maintenance, prevention of obesity, and risk reduction for diseases in which obesity plays a significant causal role.
A Wavelet-Based Methodology for Grinding Wheel Condition Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, T. W.; Ting, C.F.; Qu, Jun
2007-01-01
Grinding wheel surface condition changes as more material is removed. This paper presents a wavelet-based methodology for grinding wheel condition monitoring based on acoustic emission (AE) signals. Grinding experiments in creep feed mode were conducted to grind alumina specimens with a resinoid-bonded diamond wheel using two different conditions. During the experiments, AE signals were collected when the wheel was 'sharp' and when the wheel was 'dull'. Discriminant features were then extracted from each raw AE signal segment using the discrete wavelet decomposition procedure. An adaptive genetic clustering algorithm was finally applied to the extracted features in order to distinguish different states of grinding wheel condition. The test results indicate that the proposed methodology can achieve 97% clustering accuracy for the high material removal rate condition, 86.7% for the low material removal rate condition, and 76.7% for the combined grinding conditions if the base wavelet, the decomposition level, and the GA parameters are properly selected.
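The wavelet feature-extraction step can be sketched with PyWavelets: decompose each AE segment and use per-sub-band energies as the discriminant features fed to the clustering stage. The wavelet ('db4') and decomposition level are illustrative choices rather than the paper's tuned settings, and the signal is random stand-in data.

```python
# Hedged sketch of the wavelet feature step for AE-based condition monitoring.
import numpy as np
import pywt

rng = np.random.default_rng(3)
ae_segment = rng.standard_normal(4096)        # stand-in for a raw AE segment

# Discrete wavelet decomposition; one approximation + five detail sub-bands.
coeffs = pywt.wavedec(ae_segment, wavelet="db4", level=5)
features = [float(np.sum(c ** 2)) for c in coeffs]   # energy per sub-band
print(features)   # feature vector that would be passed to the clustering step
```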
Kemp, Candace L.; Ball, Mary M.; Morgan, Jennifer Craft; Doyle, Patrick J.; Burgess, Elisabeth O.; Dillard, Joy A.; Barmon, Christina E.; Fitzroy, Andrea F.; Helmly, Victoria E.; Avent, Elizabeth S.; Perkins, Molly M.
2018-01-01
In this article, we analyze the research experiences associated with a longitudinal qualitative study of residents’ care networks in assisted living. Using data from researcher meetings, field notes, and memos, we critically examine our design and decision making and accompanying methodological implications. We focus on one complete wave of data collection involving 28 residents and 114 care network members in four diverse settings followed for 2 years. We identify study features that make our research innovative, but that also represent significant challenges. They include the focus and topic; settings and participants; scope and design complexity; nature, modes, frequency, and duration of data collection; and analytic approach. Each feature has methodological implications, including benefits and challenges pertaining to recruitment, retention, data collection, quality, and management, research team work, researcher roles, ethics, and dissemination. Our analysis demonstrates the value of our approach and of reflecting on and sharing methodological processes for cumulative knowledge building. PMID:27651072
Imai, Takeshi; Hayakawa, Masayo; Ohe, Kazuhiko
2013-01-01
Prediction of synergistic or antagonistic effects of drug-drug interaction (DDI) in vivo has been of considerable interest over the years. Formal representation of pharmacological knowledge such as ontology is indispensable for machine reasoning of possible DDIs. However, current pharmacology knowledge bases are not sufficient to provide formal representation of DDI information. With this background, this paper presents: (1) a description framework of pharmacodynamics ontology; and (2) a methodology to utilize pharmacodynamics ontology to detect different types of possible DDI pairs with supporting information such as underlying pharmacodynamics mechanisms. We also evaluated our methodology in the field of drugs related to noradrenaline signal transduction process and 11 different types of possible DDI pairs were detected. The main features of our methodology are the explanation capability of the reason for possible DDIs and the distinguishability of different types of DDIs. These features will not only be useful for providing supporting information to prescribers, but also for large-scale monitoring of drug safety.
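The flavor of the detection step can be conveyed with a toy structure (invented, and far simpler than a real pharmacodynamics ontology): drugs annotated with their effect on a shared signal-transduction process, with same-process pairs flagged as possible DDIs and the mechanism reported as supporting information.

```python
# Toy pharmacodynamic DDI detection over invented drug/process annotations.
from itertools import combinations

drug_actions = {
    "drugA": {("noradrenaline_reuptake", "inhibits")},
    "drugB": {("noradrenaline_reuptake", "inhibits")},
    "drugC": {("beta1_receptor", "blocks")},
    "drugD": {("beta1_receptor", "activates")},
}

for d1, d2 in combinations(drug_actions, 2):
    for proc1, eff1 in drug_actions[d1]:
        for proc2, eff2 in drug_actions[d2]:
            if proc1 == proc2:
                kind = "synergistic" if eff1 == eff2 else "antagonistic"
                # The shared process is the "supporting information" reported.
                print(f"{d1} + {d2}: possible {kind} DDI via {proc1}")
```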
Booth, Andrew
2016-05-04
Qualitative systematic reviews or qualitative evidence syntheses (QES) are increasingly recognised as a way to enhance the value of systematic reviews (SRs) of clinical trials. They can explain the mechanisms by which interventions, evaluated within trials, might achieve their effect. They can investigate differences in effects between different population groups. They can identify which outcomes are most important to patients, carers, health professionals and other stakeholders. QES can explore the impact of acceptance, feasibility, meaningfulness and implementation-related factors within a real world setting and thus contribute to the design and further refinement of future interventions. To produce valid, reliable and meaningful QES requires systematic identification of relevant qualitative evidence. Although the methodologies of QES, including methods for information retrieval, are well-documented, little empirical evidence exists to inform their conduct and reporting. This structured methodological overview examines papers on searching for qualitative research identified from the Cochrane Qualitative and Implementation Methods Group Methodology Register and from citation searches of 15 key papers. A single reviewer reviewed 1299 references. Papers reporting methodological guidance, use of innovative methodologies or empirical studies of retrieval methods were categorised under eight topical headings: overviews and methodological guidance, sampling, sources, structured questions, search procedures, search strategies and filters, supplementary strategies and standards. This structured overview presents a contemporaneous view of information retrieval for qualitative research and identifies a future research agenda. This review concludes that poor empirical evidence underpins current information practice in information retrieval of qualitative research. A trend towards improved transparency of search methods and further evaluation of key search procedures offers the prospect of rapid development of search methods.
ERIC Educational Resources Information Center
Work Keys USA, 1998
1998-01-01
"Work Keys" is a comprehensive program for assessing and teaching workplace skills. This serial "special issue" features 18 first-hand reports on Work Keys projects in action in states across North America. They show how the Work Keys is helping businesses and educators solve the challenge of building a world-class work force.…
Multiple Paths to Mathematics Practice in Al-Kashi's "Key to Arithmetic"
ERIC Educational Resources Information Center
Taani, Osama
2014-01-01
In this paper, I discuss one of the most distinguishing features of Jamshid al-Kashi's pedagogy from his "Key to Arithmetic", a well-known Arabic mathematics textbook from the fifteenth century. This feature is the multiple paths that he includes to find a desired result. In the first section light is shed on al-Kashi's life…
An Analysis of the Contents and Pedagogy of Al-Kashi's 1427 "Key to Arithmetic" (Miftah Al-Hisab)
ERIC Educational Resources Information Center
Ta'ani, Osama Hekmat
2011-01-01
Al-Kashi's 1427 "Key to Arithmetic" had important use over several hundred years in mathematics teaching in Medieval Islam throughout the time of the Ottoman Empire. Its pedagogical features have never been studied before. In this dissertation I have made a close pedagogical analysis of these features and discovered several teaching…
Estimation of end point foot clearance points from inertial sensor data.
Santhiranayagam, Braveena K; Lai, Daniel T H; Begg, Rezaul K; Palaniswami, Marimuthu
2011-01-01
Foot clearance parameters provide useful insight into tripping risks during walking. This paper proposes a technique for the estimation of key foot clearance parameters using inertial sensor (accelerometers and gyroscopes) data. Fifteen features were extracted from raw inertial sensor measurements, and a regression model was used to estimate two key foot clearance parameters: the first maximum vertical clearance (mx1) after toe-off and the Minimum Toe Clearance (MTC) of the swing foot. Comparisons are made against measurements obtained using an optoelectronic motion capture system (Optotrak), at 4 different walking speeds. General Regression Neural Networks (GRNN) were used to estimate the desired parameters from the sensor features. Eight subjects' foot clearance data were examined and a Leave-one-subject-out (LOSO) method was used to select the best model. The best average Root Mean Square Errors (RMSE) across all subjects obtained using all sensor features at the maximum speed for mx1 was 5.32 mm and for MTC was 4.04 mm. Further application of a hill-climbing feature selection technique resulted in 0.54-21.93% improvement in RMSE and required fewer input features. The results demonstrated that using raw inertial sensor data with regression models and feature selection could accurately estimate key foot clearance parameters.
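A GRNN is essentially Nadaraya-Watson kernel regression, so the estimation step can be sketched in a few lines of NumPy. The features, targets, and kernel bandwidth below are synthetic placeholders, not the study's inertial-sensor data or tuned model.

```python
# Minimal GRNN (Nadaraya-Watson kernel regression) sketch for estimating MTC.
import numpy as np

def grnn_predict(X_train, y_train, X_test, sigma=0.5):
    # Gaussian-weighted average of training targets (the GRNN estimator).
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(4)
X_train = rng.random((100, 15))                  # 15 inertial-sensor features
mtc_train = 20 + 10 * X_train[:, 0] + rng.normal(0, 1, 100)  # toy MTC in mm
X_test = rng.random((20, 15))
print(grnn_predict(X_train, mtc_train, X_test)[:5])
```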
NASA Technical Reports Server (NTRS)
Logston, R. G.; Budris, G. D.
1977-01-01
A methodology was developed to optimize the utilization of Spacelab racks and pallets, and it was applied to the early STS Spacelab missions. A review was made of Spacelab Program requirements and flow plans, generic flow plans for racks and pallets were examined, and the principal optimization criteria and methodology were established. Interactions between schedule, inventory, and key optimization factors; schedule and cost sensitivity to optional approaches; and the development of tradeoff methodology were addressed. This methodology was then applied to early Spacelab missions (1980-1982). Rack and pallet requirements and duty cycles were defined, a utilization assessment was made, and several trade studies were performed involving varying degrees of Level IV integration, inventory level, and shared versus dedicated Spacelab racks and pallets.
Review of evaluation on ecological carrying capacity: The progress and trend of methodology
NASA Astrophysics Data System (ADS)
Wang, S. F.; Xu, Y.; Liu, T. J.; Ye, J. M.; Pan, B. L.; Chu, C.; Peng, Z. L.
2018-02-01
The ecological carrying capacity (ECC) has been regarded as an important reference for indicating the level of regional sustainable development since the beginning of the twenty-first century. Through a brief review of the main progress in ECC evaluation methodologies over the past five years, this paper systematically discusses the features and differences of these methods and expounds the current state and future development trend of ECC methodology. The result shows that further exploration of dynamic, comprehensive and intelligent assessment technologies is needed in order to form a unified and scientific ECC methodology system and to produce a reliable basis for environmental-economic decision-making.
A multi-criteria decision aid methodology to design electric vehicles public charging networks
NASA Astrophysics Data System (ADS)
Raposo, João; Rodrigues, Ana; Silva, Carlos; Dentinho, Tomaz
2015-05-01
This article presents a new multi-criteria decision aid methodology, dynamic-PROMETHEE, here used to design electric vehicle charging networks. In applying this methodology to a Portuguese city, results suggest that it is effective in designing electric vehicle charging networks, generating time- and policy-based scenarios, considering offer and demand and the city's urban structure. Dynamic-PROMETHEE adds to PROMETHEE's known characteristics other useful features, such as decision memory over time, versatility and adaptability. The case study, used here to present dynamic-PROMETHEE, served as the inspiration and basis for creating this new methodology. It can be used to model different problems and scenarios that may present similar requirement characteristics.
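For readers unfamiliar with the underlying method, the sketch below computes classical PROMETHEE II net outranking flows, the static core that dynamic-PROMETHEE extends with memory over time. The candidate sites, criteria, weights, and the simple "usual" preference function are all invented for illustration.

```python
# Classical PROMETHEE II net flows for ranking candidate charging sites.
import numpy as np

A = np.array([[3.0, 120, 0.8],     # candidate sites x criteria (toy values)
              [4.5,  90, 0.6],
              [2.0, 150, 0.9]])
weights = np.array([0.5, 0.2, 0.3])
maximize = np.array([True, False, True])   # minimize the second criterion

signed = np.where(maximize, A, -A)          # flip minimized criteria
n = len(A)
pi = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        # "Usual" preference function: full preference if strictly better.
        pi[i, j] = weights[signed[i] > signed[j]].sum()

phi = pi.sum(axis=1) / (n - 1) - pi.sum(axis=0) / (n - 1)   # net flows
print(np.argsort(-phi))   # site ranking, best first
```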
Cost-Utility Analysis: Current Methodological Issues and Future Perspectives
Nuijten, Mark J. C.; Dubois, Dominique J.
2011-01-01
The use of cost–effectiveness as the final criterion in the reimbursement process for the listing of new pharmaceuticals can be questioned from a scientific and policy point of view. There is a lack of consensus on main methodological issues, and consequently we may question the appropriateness of the use of cost–effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost–effectiveness threshold (Cost/QALY). In this review, we focus on some key methodological concerns relating to discounting, the utility concept, cost assessment, and modeling methodologies. Finally, we consider the relevance of some other important decision criteria, such as social values and equity. PMID:21713127
Das, D K; Maiti, A K; Chakraborty, C
2015-03-01
In this paper, we propose a comprehensive image characterization cum classification framework for malaria-infected stage detection using microscopic images of thin blood smears. The methodology mainly includes microscopic imaging of Leishman stained blood slides, noise reduction and illumination correction, erythrocyte segmentation, and feature selection followed by machine classification. Amongst three image segmentation algorithms (namely, rule-based, Chan-Vese-based and marker-controlled watershed methods), the marker-controlled watershed technique provides better boundary detection of erythrocytes, especially in overlapping situations. Microscopic features at intensity, texture and morphology levels are extracted to discriminate infected and noninfected erythrocytes. In order to achieve a subgroup of potential features, feature selection techniques, namely F-statistic and information gain criteria, are considered here for ranking. Finally, five different classifiers, namely Naive Bayes, multilayer perceptron neural network, logistic regression, classification and regression tree (CART), and RBF neural network, have been trained and tested with 888 erythrocytes (infected and noninfected) for each features' subset. Performance evaluation of the proposed methodology shows that the multilayer perceptron network provides higher accuracy for malaria-infected erythrocyte recognition and infected stage classification. Results show that the top 90 features ranked by F-statistic (specificity: 98.64%, sensitivity: 100%, PPV: 99.73% and overall accuracy: 96.84%) and the top 60 features ranked by information gain (specificity: 97.29%, sensitivity: 100%, PPV: 99.46% and overall accuracy: 96.73%) provide the best results for malaria-infected stage classification. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.
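The ranking-plus-classification stage maps naturally onto scikit-learn, sketched below with F-statistic ranking (f_classif) feeding a multilayer perceptron. The 888-cell feature matrix here is synthetic; real inputs would be the intensity, texture, and morphology features extracted from segmented erythrocytes.

```python
# F-statistic feature ranking followed by an MLP, on synthetic stand-in data.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
X = rng.standard_normal((888, 150))      # 888 cells x 150 extracted features
y = rng.integers(0, 2, 888)              # infected vs noninfected (toy labels)
X[y == 1, :10] += 1.0                    # make the first 10 features informative

clf = make_pipeline(SelectKBest(f_classif, k=90),
                    MLPClassifier(hidden_layer_sizes=(50,), max_iter=500))
print(cross_val_score(clf, X, y, cv=5).mean())
```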
[Radiotherapy phase I trials' methodology: Features].
Rivoirard, R; Vallard, A; Langrand-Escure, J; Guy, J-B; Ben Mrad, M; Yaoxiong, X; Diao, P; Méry, B; Pigne, G; Rancoule, C; Magné, N
2016-12-01
In clinical research, biostatistical methods allow the rigorous analysis of data collection and should be defined from the trial design to obtain the appropriate experimental approach. Thus, if the main purpose of phase I is to determine the dose to use during phase II, methodology should be finely adjusted to the experimental treatment(s). Today, the methodology for chemotherapy and targeted therapy is well known. For radiotherapy and chemoradiotherapy phase I trials, the primary endpoint must reflect both effectiveness and potential treatment toxicities. Methodology should probably be complex to limit failures in the following phases. However, there are very few data about methodology design in the literature. The present study focuses on these particular trials and their characteristics. It should help to expose the shortcomings of existing methodological patterns in order to propose new and better-suited designs. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
Rogers, J L; Stoms, G B; Phifer, J L
1989-01-01
A systematic "roadmap" through the medical literature that empirically examines the incidence of psychological sequelae of induced abortion is presented. Because outcome incidence rates and methodological profiles vary substantially across studies, selective use of articles from this literature without an accompanying rationale for that selectivity could foster erroneous conclusions. Information compiled here can facilitate a rapid methodological critique of citations in abortion-related materials. Investigations published in English between January 1966 and April 1988 that quantitatively examined psychological sequelae using prospective, retrospective, or comparative methodologies are summarized in tables to produce a synopsis of the demographics, methodological limitations, and gross statistical features of each article. This quantitative guide is designed to facilitate appropriate use of the current literature, provide needed background to assess positions arising from the currently available data, and provide methodological focus for planning better studies in the future.
10 CFR 1045.17 - Classification levels.
Code of Federal Regulations, 2014 CFR
2014-01-01
... classification include detailed technical descriptions of critical features of a nuclear explosive design that... classification include designs for specific weapon components (not revealing critical features), key features of uranium enrichment technologies, or specifications of weapon materials. (3) Confidential. The Director of...
10 CFR 1045.17 - Classification levels.
Code of Federal Regulations, 2013 CFR
2013-01-01
... classification include detailed technical descriptions of critical features of a nuclear explosive design that... classification include designs for specific weapon components (not revealing critical features), key features of uranium enrichment technologies, or specifications of weapon materials. (3) Confidential. The Director of...
10 CFR 1045.17 - Classification levels.
Code of Federal Regulations, 2011 CFR
2011-01-01
... classification include detailed technical descriptions of critical features of a nuclear explosive design that... classification include designs for specific weapon components (not revealing critical features), key features of uranium enrichment technologies, or specifications of weapon materials. (3) Confidential. The Director of...
10 CFR 1045.17 - Classification levels.
Code of Federal Regulations, 2012 CFR
2012-01-01
... classification include detailed technical descriptions of critical features of a nuclear explosive design that... classification include designs for specific weapon components (not revealing critical features), key features of uranium enrichment technologies, or specifications of weapon materials. (3) Confidential. The Director of...
A Methodology to Separate and Analyze a Seismic Wide Angle Profile
NASA Astrophysics Data System (ADS)
Weinzierl, Wolfgang; Kopp, Heidrun
2010-05-01
General solutions of inverse problems can often be obtained through the introduction of probability distributions to sample the model space. We present a simple approach of defining an a priori space in a tomographic study and retrieve the velocity-depth posterior distribution by a Monte Carlo method. Utilizing a fitting routine designed for very low statistics to set up and analyze the obtained tomography results, it is possible to statistically separate the velocity-depth model space derived from the inversion of seismic refraction data. An example of a profile acquired in the Lesser Antilles subduction zone reveals the effectiveness of this approach. The resolution analysis of the structural heterogeneity includes a divergence analysis which proves to be capable of dissecting long wide-angle profiles for deep crust and upper mantle studies. The complete information of any parameterised physical system is contained in the a posteriori distribution. Methods for analyzing and displaying key properties of the a posteriori distributions of highly nonlinear inverse problems are therefore essential to any interpretation. From this study we infer several conclusions concerning the interpretation of the tomographic approach. By calculating global as well as singular misfits of velocities, we are able to map different geological units along a profile. Comparing velocity distributions with the result of a tomographic inversion along the profile, we can mimic the subsurface structures in their extent and composition. The possibility of gaining a priori information for seismic refraction analysis by a simple solution to an inverse problem, and of subsequently resolving structural heterogeneities through a divergence analysis, is a new and simple way of defining the a priori space and estimating the a posteriori mean and covariance in singular and general form. The major advantage of a Monte Carlo based approach in our case study is the obtained knowledge of velocity-depth distributions. Certainly the decision of where to extract velocity information on the profile for setting up a Monte Carlo ensemble limits the a priori space. However, the general conclusion of analyzing the velocity field according to distinct reference distributions gives us the possibility to define the covariance according to any geological unit if we have a priori information on the velocity-depth distributions. Using the wide-angle data recorded across the Lesser Antilles arc, we are able to resolve a shallow feature like the backstop by a robust and simple divergence analysis. We demonstrate the effectiveness of the new methodology to extract some key features and properties from the inversion results by including information concerning the confidence level of results.
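The Monte Carlo sampling of a velocity-depth posterior can be illustrated with a toy Metropolis sampler over a three-layer 1-D model. The "forward model" is collapsed to a direct comparison with noisy velocities and the prior is a simple bounded uniform, so this is a conceptual sketch, not the tomographic inversion used on the Lesser Antilles profile.

```python
# Toy Metropolis sampling of a 1-D velocity-depth posterior.
import numpy as np

rng = np.random.default_rng(6)
v_true = np.array([2.0, 4.5, 6.8])                 # km/s in three layers
obs = v_true + rng.normal(0, 0.1, 3)               # noisy "observed" data

def log_post(v):
    if np.any(v < 1.0) or np.any(v > 9.0):          # bounded uniform prior
        return -np.inf
    return -0.5 * np.sum((v - obs) ** 2) / 0.1 ** 2  # Gaussian likelihood

v = np.array([3.0, 3.0, 3.0])
samples = []
for _ in range(20000):
    prop = v + rng.normal(0, 0.05, 3)               # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(v):
        v = prop
    samples.append(v.copy())
samples = np.array(samples[5000:])                  # discard burn-in
print(samples.mean(axis=0), samples.std(axis=0))    # posterior mean and spread
```

The per-layer spread of the ensemble is the kind of a posteriori covariance information the divergence analysis then dissects along the profile.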
Cognitive Models for Integrating Testing and Instruction, Phase II. Methodology Program.
ERIC Educational Resources Information Center
Quellmalz, Edys S.; Shaha, Steven
The potential of a cognitive model task analysis scheme (CMS) that specifies features of test problems shown by research to affect performance is explored. CMS describes the general skill area and the generic task or problem type. It elaborates features of the problem situation and required responses found by research to influence performance.…
ERIC Educational Resources Information Center
Böhm, Stephan; Constantine, Georges Philip
2016-01-01
Purpose: This paper aims to focus on contextualized features for mobile language learning apps. The scope of this paper is to explore students' perceptions of contextualized mobile language learning. Design/Methodology/Approach: An extended Technology Acceptance Model was developed to analyze the effect of contextual app features on students'…
ERIC Educational Resources Information Center
Suki, Norazah Mohd
2013-01-01
Purpose: The study aims to examine structural relationships of product features, brand name, product price and social influence with demand for Smartphones among Malaysian students'. Design/methodology/approach: Data collected from 320 valid pre-screened university students studying at the pubic higher learning institution in Federal Territory of…
ERIC Educational Resources Information Center
Damboeck, Johanna
2012-01-01
Purpose: The aim of this article is to provide an analysis of the features that have shaped the state's decision-making process in the United Nations, with regard to the humanitarian intervention in Darfur from 2003 onwards. Design/methodology/approach: The methodological approach to the study is a review of political statement papers grounded in…
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Communication Theory and Methodology section of the proceedings contains the following 12 selected papers: "Innovativeness and Perceptions of Faculty Innovation Champions on the Diffusion of World Wide Web Course Features" (Patrick J. Sutherland); "A Communication 'Mr. Fit'? Living with No Significant Difference" (Fiona…
ERIC Educational Resources Information Center
Singer, Judith D.; Willett, John B.
The National Center for Education Statistics (NCES) is exploring the possibility of conducting a large-scale multi-year study of teachers' careers. The proposed new study is intended to follow a national probability sample of teachers over an extended period of time. A number of methodological issues need to be addressed before the study can be…
Factors in Human-Computer Interface Design (A Pilot Study).
1994-12-01
This study used a pretest-posttest control group experimental design to test the effect of consistency on speed, retention, and user satisfaction. The methodology used different prototypes to test for features of the human-computer interface.
Old Wine in New Skins: The Sensitivity of Established Findings to New Methods
ERIC Educational Resources Information Center
Foster, E. Michael; Wiley-Exley, Elizabeth; Bickman, Leonard
2009-01-01
Findings from an evaluation of a model system for delivering mental health services to youth were reassessed to determine the robustness of key findings to the use of methodologies unavailable to the original analysts. These analyses address a key concern about earlier findings--that the quasi-experimental design involved the comparison of two…
A Retention Assessment Process: Utilizing Total Quality Management Principles and Focus Groups
ERIC Educational Resources Information Center
Codjoe, Henry M.; Helms, Marilyn M.
2005-01-01
Retaining students is a critical topic in higher education. Methodologies abound to gather attrition data as well as key variables important to retention. Using the theories of total quality management and focus groups, this case study gathers and reports data from current college students. Key results, suggestions for replication, and areas for…
A Case Study of Issues of Strategy Implementation in Internationalization of Higher Education
ERIC Educational Resources Information Center
Jiang, Nan; Carpenter, Victoria
2013-01-01
Purpose: The purpose of this research is to identify and critically evaluate key issues faced by an institution in the quest to implement higher education internationalization. Design/methodology/approach: A qualitative research is conducted in a post-1992 UK university. A total of 20 interviewees from three key departments participated in this…
ERIC Educational Resources Information Center
Saunders, Mark N. K.; Skinner, Denise; Beresford, Richard
2005-01-01
Purpose: To explore potential mismatches between stakeholders' perceptions and expectations of key and technical skills needed for an advanced modern apprentice within the UK. Design/methodology/approach: Using data collected from the automotive sector, the template process is used to establish lecturer, student and employee stakeholder group's…
Designing and Evaluating an Online Role Play in Conflict Management
ERIC Educational Resources Information Center
Hrastinski, Stefan; Watson, Jason
2009-01-01
Purpose: This paper aims to identify, through a literature review, key issues regarding how online role plays can be designed and to apply them when designing a role play on conflict management. Design/methodology/approach: By drawing on the key issues identified in the literature review, a role play on conflict management was designed and…
Automatic Inference of Cryptographic Key Length Based on Analysis of Proof Tightness
2016-06-01
within an attack tree structure, then expand attack tree methodology to include cryptographic reductions. We then provide the algorithms for...maintaining and automatically reasoning about these expanded attack trees. We provide a software tool that utilizes machine-readable proof and attack metadata...and the attack tree methodology to provide rapid and precise answers regarding security parameters and effective security. This eliminates the need
Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases
1992-09-29
Design problem solving is a complex activity involving a number of subtasks, with a number of alternative methods potentially available for each...
Development of Pain Endpoint Models for Use in Prostate Cancer Clinical Trials and Drug Approval
2017-10-01
publication delineating key methodological components of pain studies in prostate cancer. Keywords: pain, metastatic castrate-resistant prostate cancer. ...pain palliation and pain progression in prostate cancer clinical trials that are feasible, methodologically rigorous, and meet regulatory...requirements for drug approval and labeling. The primary aim of this award is to conduct an observational longitudinal study in men with castrate-resistant
Keys to Successful Implementation and Sustainment of Managed Maintenance for Healthcare Facilities
2004-03-23
second they involve studying those phenomena in all their complexity (Leedy and Ormrod, 2001). According to Denzin and Lincoln (1994), qualitative...people being studied (Leedy and Ormrod, 2001). Research Design: Methodological Triangulation. Denzin and Lincoln (1994) suggest because different...the setting. This dual view is referred to as methodological triangulation (Denzin and Lincoln, 1994). A research design develops a logical plan for
Automatic programming of arc welding robots
NASA Astrophysics Data System (ADS)
Padmanabhan, Srikanth
Automatic programming of arc welding robots requires the geometric description of a part from a solid modeling system, expert weld process knowledge, and the kinematic arrangement of the robot and positioner. Current commercial solid modelers are incapable of explicitly storing product and process definitions of weld features. This work presents a paradigm to develop a computer-aided engineering environment that supports complete weld feature information in a solid model and to create an automatic programming system for robotic arc welding. In the first part, welding features are treated as properties or attributes of an object, features that are portions of the object surface (the topological boundary). The structure for representing the features and attributes is a graph called the Welding Attribute Graph (WAGRAPH). The method associates appropriate weld features with geometric primitives, adds welding attributes, and checks the validity of welding specifications. A systematic structure is provided to incorporate welding attributes and coordinate system information in a CSG tree. The specific implementation of this structure using a hybrid solid modeler (IDEAS) and an object-oriented programming paradigm is described. The second part provides a comprehensive methodology to acquire and represent weld process knowledge required for the proper selection of welding schedules. A methodology of knowledge acquisition using statistical methods is proposed. It is shown that these procedures did little to capture the private knowledge of experts (heuristics), but helped in determining general dependencies and trends. A need was established for building the knowledge-based system from handbook knowledge and for allowing the experts to extend the system further. A methodology to check the consistency and validity of such knowledge additions is proposed. A mapping shell designed to transform the design features to application-specific weld process schedules is described. A new approach using fixed path modified continuation methods is proposed in the final section to continuously plan the trajectory of weld seams in an integrated welding robot and positioner environment. The joint displacement, velocity, and acceleration histories along the path, as a function of the path parameter for the best possible welding condition, are provided for the robot and the positioner to track various paths normally encountered in arc welding.
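A toy rendering of the representational idea follows, with invented fields: weld features attached to faces of the topological boundary, carrying welding attributes, plus a minimal validity check. The real WAGRAPH sits inside a CSG tree in a hybrid solid modeler and is far richer than this sketch.

```python
# Invented, minimal stand-in for attaching weld features to model faces.
weld_model = {
    "faces": {"F1": {"type": "planar"}, "F2": {"type": "planar"}},
    "weld_features": [
        {"feature": "fillet_weld",
         "faces": ("F1", "F2"),                 # joint defined on two faces
         "attributes": {"leg_size_mm": 6.0, "process": "GMAW",
                        "position": "horizontal"}},
    ],
}

def validate(model):
    # Minimal validity check: every weld feature must reference existing faces.
    for wf in model["weld_features"]:
        for face in wf["faces"]:
            assert face in model["faces"], f"unknown face {face}"
    return True

print(validate(weld_model))
```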
Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter
2017-02-01
Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans were shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.
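The dependency on extraction settings is easy to demonstrate: the sketch below computes an intensity-histogram entropy, a common first-order radiomics feature, from a toy ROI under three different bin widths, giving three different "values" for the same tumour. The feature and discretization choices are illustrative, not any specific package's defaults.

```python
# Same ROI, three discretization settings, three different entropy values.
import numpy as np

rng = np.random.default_rng(7)
roi = rng.normal(50, 10, (30, 30, 10))              # toy tumour intensities

def entropy(roi, bin_width):
    bins = np.arange(roi.min(), roi.max() + bin_width, bin_width)
    counts, _ = np.histogram(roi, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

print("mean intensity:", roi.mean())                # robust to discretization
for bw in (1.0, 5.0, 25.0):
    print(f"entropy @ bin width {bw}: {entropy(roi, bw):.2f}")
```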
A flexible data-driven comorbidity feature extraction framework.
Sideris, Costas; Pourhomayoun, Mohammad; Kalantarian, Haik; Sarrafzadeh, Majid
2016-06-01
Disease and symptom diagnostic codes are a valuable resource for classifying and predicting patient outcomes. In this paper, we propose a novel methodology for utilizing disease diagnostic information in a predictive machine learning framework. Our methodology relies on a novel, clustering-based feature extraction framework using disease diagnostic information. To reduce the data dimensionality, we identify disease clusters using co-occurrence statistics. We optimize the number of generated clusters in the training set and then utilize these clusters as features to predict patient severity of condition and patient readmission risk. We build our clustering and feature extraction algorithm using the 2012 National Inpatient Sample (NIS), Healthcare Cost and Utilization Project (HCUP), which contains 7 million hospital discharge records and ICD-9-CM codes. The proposed framework is tested on Ronald Reagan UCLA Medical Center Electronic Health Records (EHR) from 3041 Congestive Heart Failure (CHF) patients and the UCI 130-US diabetes dataset that includes admissions from 69,980 diabetic patients. We compare our cluster-based feature set with commonly used comorbidity frameworks including Charlson's index, Elixhauser's comorbidities and their variations. The proposed approach was shown to yield significant gains of 10.7-22.1% in predictive accuracy for CHF severity of condition prediction and 4.65-5.75% in diabetes readmission prediction. Copyright © 2016 Elsevier Ltd. All rights reserved.
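A compressed sketch of the clustering-based feature extraction idea follows: build a diagnosis-code co-occurrence matrix across admissions, cluster codes on a co-occurrence-derived dissimilarity, and then represent each patient by diagnosis counts per cluster. The ICD-9 codes and admissions are invented, and the clustering choices (average linkage, two clusters) stand in for the paper's optimized cluster count.

```python
# Co-occurrence clustering of diagnosis codes, then per-cluster patient features.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

admissions = [{"4280", "5849"}, {"4280", "42731"}, {"25000", "4019"},
              {"25000", "5849"}, {"4019", "42731"}, {"4280", "4019"}]
codes = sorted(set().union(*admissions))
idx = {c: i for i, c in enumerate(codes)}

co = np.zeros((len(codes), len(codes)))          # co-occurrence counts
for adm in admissions:
    for a in adm:
        for b in adm:
            if a != b:
                co[idx[a], idx[b]] += 1

# Hierarchical clustering on a co-occurrence-derived dissimilarity.
dissim = 1.0 / (1.0 + co)
Z = linkage(dissim[np.triu_indices(len(codes), 1)], method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(dict(zip(codes, labels)))

# Patient-level feature vector: number of diagnoses falling in each cluster.
patient = {"4280", "5849"}
features = np.bincount([labels[idx[c]] for c in patient],
                       minlength=labels.max() + 1)[1:]
print(features)
```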
Method for Controlling Space Transportation System Life Cycle Costs
NASA Technical Reports Server (NTRS)
McCleskey, Carey M.; Bartine, David E.
2006-01-01
A structured, disciplined methodology is required to control major cost-influencing metrics of space transportation systems during design and continuing through the test and operations phases. This paper proposes controlling key space system design metrics that specifically influence life cycle costs. These include flight and ground operations, test, manufacturing, and infrastructure. The proposed technique builds on today's configuration and mass properties control techniques and takes on all the characteristics of a classical control system. While the paper does not lay out a complete math model, key elements of the proposed methodology are explored and explained with both historical and contemporary examples. Finally, the paper encourages modular design approaches and technology investments compatible with the proposed method.
Methodology or method? A critical review of qualitative case study reports.
Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia
2014-01-01
Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.
VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.
Little, Todd D; Wang, Eugene W; Gorrall, Britt K
2017-06-01
This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.
Elastin-like polypeptides: the power of design for smart cell encapsulation.
Bandiera, Antonella
2017-01-01
Cell encapsulation technology is still a challenging issue. Innovative methodologies, such as additive manufacturing, and alternative bioprocesses, such as cell therapeutic delivery, where cell encapsulation is a key tool, are rapidly gaining importance for their potential in regenerative medicine. Responsive materials such as elastin-based recombinant expression products have features that are particularly attractive for cell encapsulation. They can be designed and tailored to meet desired requirements. Thus, they represent promising candidates for the development of new concept-based materials that can be employed in this field. Areas covered: An overview of the design and employment of elastin-like polypeptides for cell encapsulation is given to outline the state of the art. Special attention is paid to the design of the macromolecule employed as well as to the method of matrix formation and the biological system involved. Expert opinion: As a result of recent progress in regenerative medicine there is a compelling need for materials that provide specific properties and demonstrate defined functional features. Rationally designed materials that may adapt according to applied external stimuli and that are responsive to biological systems, such as elastin-like polypeptides, belong to this class of smart materials. A run through the components described to date represents a good starting point for further advancement in this area. Employment of these components in cell encapsulation applications will promote its advance toward 'smart cell encapsulation technology'.
NASA Astrophysics Data System (ADS)
Marcu, Laura
2017-02-01
The surgeon's limited ability to accurately delineate the tumor margin during surgical interventions is one key challenge in clinical management of cancer. New methods for guiding tumor resection decisions are needed. Numerous studies have shown that tissue autofluorescence properties have the potential to assess biochemical features associated with distinct pathologies in tissue and to distinguish various cancers from normal tissues. However, despite these promising reports, autofluorescence techniques have been sparsely adopted in clinical settings. Moreover, when adopted they were primarily used for pre-operative diagnosis rather than guiding interventions. To address this need, we have researched and engineered instrumentation that utilizes label-free fluorescence lifetime contrast to characterize tissue biochemical features in vivo in patients and methodologies conducive to real-time (few seconds) diagnosis of tissue pathologies during surgical procedures. This presentation overviews clinically-compatible multispectral fluorescence lifetime imaging techniques developed in our laboratory and their ability to operate as stand-alone tools, integrated in a biopsy needle and in conjunction with the da Vinci surgical robot. We present pre-clinical and clinical studies in patients that demonstrate the potential of these techniques for intraoperative assessment of brain tumors and head and neck cancer. Current results demonstrate that intrinsic fluorescence signals can provide useful contrast for delineating distinct tissue types, including tumors, intraoperatively. Challenges and solutions in the clinical implementation of these techniques are discussed.
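As background on the contrast mechanism, the sketch below recovers a fluorescence lifetime by least-squares fitting of a single-exponential decay, the per-channel parameter that lifetime imaging maps across tissue. The values and noise are synthetic, and real instruments must additionally deconvolve the instrument response, which is omitted here.

```python
# Estimating a fluorescence lifetime from a single-exponential decay fit.
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 25, 250)                     # time axis in ns
tau_true = 4.0                                  # "true" lifetime (toy value)
rng = np.random.default_rng(8)
signal = np.exp(-t / tau_true) + rng.normal(0, 0.02, t.size)

def decay(t, a, tau):
    return a * np.exp(-t / tau)

(a_hat, tau_hat), _ = curve_fit(decay, t, signal, p0=(1.0, 1.0))
print(f"estimated lifetime: {tau_hat:.2f} ns")  # the contrast parameter per pixel/channel
```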