Sample records for sophisticated analysis methods

  1. Naive vs. Sophisticated Methods of Forecasting Public Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Two sophisticated--autoregressive integrated moving average (ARIMA), straight-line regression--and two naive--simple average, monthly average--forecasting techniques were used to forecast monthly circulation totals of 34 public libraries. Comparisons of forecasts and actual totals revealed that ARIMA and monthly average methods had smallest mean…
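
    A minimal sketch of such a comparison, assuming synthetic monthly circulation data and an arbitrarily chosen seasonal ARIMA order (the study's 34 library series and fitted models are not reproduced here):

      # Compare naive and "sophisticated" monthly-circulation forecasts.
      # Data, ARIMA order, and error metric are illustrative assumptions.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(0)
      months = pd.date_range("2000-01", periods=120, freq="MS")
      circ = pd.Series(5000 + 500 * np.sin(np.arange(120) * 2 * np.pi / 12)
                       + rng.normal(0, 100, 120), index=months)
      train, test = circ[:-12], circ[-12:]

      # Naive method 1: simple average of the training series.
      simple_avg = np.repeat(train.mean(), 12)
      # Naive method 2: average of each calendar month in the training data.
      monthly_avg = test.index.month.map(train.groupby(train.index.month).mean())
      # Sophisticated method: seasonal ARIMA (order chosen for illustration).
      arima_fc = ARIMA(train, order=(1, 0, 0),
                       seasonal_order=(1, 0, 0, 12)).fit().forecast(12)

      for name, fc in [("simple average", simple_avg),
                       ("monthly average", monthly_avg),
                       ("ARIMA", arima_fc)]:
          print(name, "MAE:", np.mean(np.abs(np.asarray(fc) - test.values)))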

  2. Neo-Sophistic Rhetorical Theory: Sophistic Precedents for Contemporary Epistemic Rhetoric.

    ERIC Educational Resources Information Center

    McComiskey, Bruce

    Interest in the sophists has recently intensified among rhetorical theorists, culminating in the notion that rhetoric is epistemic. Epistemic rhetoric has its first and deepest roots in sophistic epistemological and rhetorical traditions, so that the view of rhetoric as epistemic is now being dubbed "neo-sophistic." In epistemic…

  3. The Sophistical Attitude and the Invention of Rhetoric

    ERIC Educational Resources Information Center

    Crick, Nathan

    2010-01-01

    Traditionally, the Older Sophists were conceived as philosophical skeptics who rejected speculative inquiry to focus on rhetorical methods of being successful in practical life. More recently, this view has been complicated by studies revealing the Sophists to be a diverse group of intellectuals who practiced their art prior to the categorization…

  4. Political Trust and Sophistication: Taking Measurement Seriously.

    PubMed

    Turper, Sedef; Aarts, Kees

    2017-01-01

    Political trust is an important indicator of political legitimacy. Hence, seemingly decreasing levels of political trust in Western democracies have stimulated a growing body of research on the causes and consequences of political trust. However, the neglect of potential measurement problems of political trust raises doubts about the findings of earlier studies. The current study revisits the measurement of political trust and re-examines the relationship between political trust and sophistication in the Netherlands by utilizing European Social Survey (ESS) data across five time points and four-wave panel data from the Panel Component of ESS. Our findings illustrate that high- and low-political-sophistication groups display different levels of political trust even when measurement characteristics of political trust are taken into consideration. However, the relationship between political sophistication and political trust is weaker than earlier research often suggests. Our findings also provide partial support for the argument that the gap between sophistication groups is widening over time. Furthermore, we demonstrate that, although the between-method differences between the latent means and the composite score means of political trust for high- and low-sophistication groups are relatively minor, it is important to analyze the measurement characteristics of the political trust construct.

  5. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    ERIC Educational Resources Information Center

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…
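
    A toy illustration of one classic family of indices such tools compute, mean log reference-corpus frequency of the words in a text, where rarer words signal a more sophisticated vocabulary. The tiny frequency table below is an invented placeholder, not TAALES's actual norms:

      # Toy lexical sophistication index: mean log reference frequency.
      # FREQ values are invented placeholders on a log scale.
      import re

      FREQ = {"the": 6.0, "cat": 4.2, "sat": 3.9, "on": 5.8,
              "mat": 3.5, "ubiquitous": 2.1, "feline": 2.4}

      def mean_log_frequency(text, default=1.0):
          words = re.findall(r"[a-z]+", text.lower())
          scores = [FREQ.get(w, default) for w in words]  # unknown = rare
          return sum(scores) / len(scores) if scores else float("nan")

      print(mean_log_frequency("The cat sat on the mat"))         # frequent words
      print(mean_log_frequency("The ubiquitous feline reposed"))  # rarer words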

  6. The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership

    ERIC Educational Resources Information Center

    Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.

    2011-01-01

    The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…

  7. In Praise of the Sophists.

    ERIC Educational Resources Information Center

    Gibson, Walker

    1993-01-01

    Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)

  8. Roman sophisticated surface modification methods to manufacture silver counterfeited coins

    NASA Astrophysics Data System (ADS)

    Ingo, G. M.; Riccucci, C.; Faraldi, F.; Pascucci, M.; Messina, E.; Fierro, G.; Di Carlo, G.

    2017-11-01

    By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS), the surface and subsurface chemical and metallurgical features of silver counterfeited Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that over 2000 years ago important advances in the technology of thin layer deposition on metal substrates were attained by the Romans. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods with tailored surface chemical modification based on the mercury-silvering process. The results reveal that the Romans were able to manipulate alloys chemically and metallurgically at the micro scale in a systematic way, producing adherent precious metal layers with a uniform thickness of up to a few micrometers. The results converge to reveal that the production of forgeries was aimed primarily at saving as much precious metal as possible, allowing profitable large-scale production at lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade, and/or a combination of social, political and economic factors that required a change in the money supply. Finally, some information on corrosion products has been obtained that is useful for selecting materials and methods for the conservation of these important witnesses of technology and economy.

  9. Sophistry, the Sophists and modern medical education.

    PubMed

    Macsuibhne, S P

    2010-01-01

    The term 'sophist' has become a term of intellectual abuse in both general discourse and that of educational theory. However the actual thought of the fifth century BC Athenian-based philosophers who were the original Sophists was very different from the caricature. In this essay, I draw parallels between trends in modern medical educational practice and the thought of the Sophists. Specific areas discussed are the professionalisation of medical education, the teaching of higher-order characterological attributes such as personal development skills, and evidence-based medical education. Using the specific example of the Sophist Protagoras, it is argued that the Sophists were precursors of philosophical approaches and practices of enquiry underlying modern medical education.

  10. Information technology sophistication in nursing homes.

    PubMed

    Alexander, Gregory L; Wakefield, Douglas S

    2009-07-01

    There is growing recognition that a more sophisticated information technology (IT) infrastructure is needed to improve the quality of nursing home care in the United States. The purpose of this study was to explore the concept of IT sophistication in nursing homes considering the level of technological diversity, maturity and level of integration in resident care, clinical support, and administration. Twelve IT stakeholders from 4 nursing homes considered to have high IT sophistication were interviewed using focus groups and key informant interviews. Common themes were derived using qualitative analytics and axial coding from field notes collected during interviews and focus groups. Respondents echoed the diversity of the innovative IT systems being implemented; these included resident alerting mechanisms for clinical decision support, enhanced reporting capabilities of patient-provider interactions, remote monitoring, and networking among affiliated providers. Nursing home IT is in its early stages of adoption; early adopters are beginning to realize benefits across clinical domains including resident care, clinical support, and administrative activities. The most important thread emerging from these discussions was the need for further interface development between IT systems to enhance integrity and connectivity. The study shows that some early adopters of sophisticated IT systems in nursing homes are beginning to achieve added benefit for resident care, clinical support, and administrative activities.

  11. On the substance of a sophisticated epistemology

    NASA Astrophysics Data System (ADS)

    Elby, Andrew; Hammer, David

    2001-09-01

    Among researchers who study students' epistemologies, a consensus has emerged about what constitutes a sophisticated stance toward scientific knowledge. According to this community consensus, students should understand scientific knowledge as tentative and evolving, rather than certain and unchanging; subjectively tied to scientists' perspectives, rather than objectively inherent in nature; and individually or socially constructed, rather than discovered. Surveys, interview protocols, and other methods used to probe students' beliefs about scientific knowledge broadly reflect this outlook. This article questions the community consensus about epistemological sophistication. We do not suggest that scientific knowledge is objective and fixed; if forced to choose whether knowledge is certain or tentative, with no opportunity to elaborate, we would choose tentative. Instead, our critique consists of two lines of argument. First, the literature fails to distinguish between the correctness and productivity of an epistemological belief. For instance, elementary school students who believe that science is about discovering objective truths to questions, such as whether the earth is round or flat, or whether an asteroid led to the extinction of the dinosaurs, may be more likely to succeed in science than students who believe science is about telling stories that vary with one's perspective. Naïve realism, although incorrect (according to a broad consensus of philosophers and social scientists), may nonetheless be productive for helping those students learn. Second, according to the consensus view as reflected in commonly used surveys, epistemological sophistication consists of believing certain blanket generalizations about the nature of knowledge and learning, generalizations that do not attend to context. These generalizations are neither correct nor productive. For example, it would be unsophisticated for students to view as tentative the idea that the earth is round…

  12. Purification through Emotions: The Role of Shame in Plato's "Sophist" 230B4-E5

    ERIC Educational Resources Information Center

    Candiotto, Laura

    2018-01-01

    This article proposes an analysis of Plato's "Sophist" (230b4--e5) that underlines the bond between the logical and the emotional components of the Socratic "elenchus", with the aim of depicting the social valence of this philosophical practice. The use of emotions characterizing the 'elenctic' method described by Plato is…

  13. Assessing epistemic sophistication by considering domain-specific absolute and multiplicistic beliefs separately.

    PubMed

    Peter, Johannes; Rosman, Tom; Mayer, Anne-Kathrin; Leichner, Nikolas; Krampen, Günter

    2016-06-01

    Particularly in higher education, not only a view of science as a means of finding absolute truths (absolutism), but also a view of science as generally tentative (multiplicism) can be unsophisticated and obstructive for learning. Most quantitative epistemic belief inventories neglect this and understand epistemic sophistication as disagreement with absolute statements. This article suggests considering absolutism and multiplicism as separate dimensions. Following our understanding of epistemic sophistication as a cautious and reluctant endorsement of both positions, we assume evaluativism (a contextually adaptive view of knowledge as personally constructed and evidence-based) to be reflected by low agreement with both generalized absolute and generalized multiplicistic statements. Three studies with a total sample size of N = 416 psychology students were conducted. A domain-specific inventory containing both absolute and multiplicistic statements was developed. Expectations were tested by exploratory factor analysis, confirmatory factor analysis, and correlational analyses. Results revealed a two-factor solution with an absolute and a multiplicistic factor. Criterion validity of both factors was confirmed. Cross-sectional analyses revealed that agreement to generalized multiplicistic statements decreases with study progress. Moreover, consistent with our understanding of epistemic sophistication as a reluctant attitude towards generalized epistemic statements, evidence for a negative relationship between epistemic sophistication and need for cognitive closure was found. We recommend including multiplicistic statements into epistemic belief questionnaires and considering them as a separate dimension, especially when investigating individuals in later stages of epistemic development (i.e., in higher education). © 2015 The British Psychological Society.
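
    A minimal sketch of the factor-analytic logic on simulated questionnaire data; sklearn's FactorAnalysis stands in for the EFA/CFA pipeline the authors used, and the item structure is an assumption:

      # Recover separate "absolutism" and "multiplicism" factors from items.
      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(1)
      n = 400
      absolutism = rng.normal(size=n)     # independent latent dimensions,
      multiplicism = rng.normal(size=n)   # not opposite poles of one scale

      # Six questionnaire items: three load on each latent factor.
      items = np.column_stack(
          [absolutism + rng.normal(0, 0.5, n) for _ in range(3)]
          + [multiplicism + rng.normal(0, 0.5, n) for _ in range(3)])

      fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
      print(np.round(fa.components_, 2))  # loadings form two clean blocks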

  14. Lexical Sophistication as a Multidimensional Phenomenon: Relations to Second Language Lexical Proficiency, Development, and Writing Quality

    ERIC Educational Resources Information Center

    Kim, Minkyung; Crossley, Scott A.; Kyle, Kristopher

    2018-01-01

    This study conceptualizes lexical sophistication as a multidimensional phenomenon by reducing numerous lexical features of lexical sophistication into 12 aggregated components (i.e., dimensions) via a principal component analysis approach. These components were then used to predict second language (L2) writing proficiency levels, holistic lexical…

  15. The First Sophists and the Uses of History.

    ERIC Educational Resources Information Center

    Jarratt, Susan C.

    1987-01-01

    Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…

  16. LIFE CYCLE IMPACT ASSESSMENT SOPHISTICATION

    EPA Science Inventory

    An international workshop was held in Brussels on 11/29-30/1998 to discuss LCIA Sophistication. LCA experts from North America, Europe, and Asia attended. Critical reviews of associated factors, including current limitations of available assessment methodologies, and comparison...

  17. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    PubMed

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.

  18. The conceptualization and measurement of cognitive health sophistication.

    PubMed

    Bodie, Graham D; Collins, William B; Jensen, Jakob D; Davis, Lashara A; Guntzviller, Lisa M; King, Andy J

    2013-01-01

    This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.

  19. Predicting Second Language Writing Proficiency: The Roles of Cohesion and Linguistic Sophistication

    ERIC Educational Resources Information Center

    Crossley, Scott A.; McNamara, Danielle S.

    2012-01-01

    This study addresses research gaps in predicting second language (L2) writing proficiency using linguistic features. Key to this analysis is the inclusion of linguistic measures at the surface, textbase and situation model level that assess text cohesion and linguistic sophistication. The results of this study demonstrate that five variables…

  20. Financial Literacy and Financial Sophistication in the Older Population

    PubMed Central

    Lusardi, Annamaria; Mitchell, Olivia S.; Curto, Vilsa

    2017-01-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications. PMID:28553191

  21. Financial Literacy and Financial Sophistication in the Older Population.

    PubMed

    Lusardi, Annamaria; Mitchell, Olivia S; Curto, Vilsa

    2014-10-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications.

  22. Moral foundations and political attitudes: The moderating role of political sophistication.

    PubMed

    Milesi, Patrizia

    2016-08-01

    Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates. © 2015 International Union of Psychological Science.
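
    The moderation claim reduces to an interaction term in a regression; a sketch on simulated data follows, with variable names and effect size as assumptions rather than the study's values:

      # Does the effect of Sanctity endorsement on a policy judgement
      # depend on political sophistication? Test via an interaction term.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      n = 500
      df = pd.DataFrame({"sanctity": rng.normal(size=n),
                         "sophistication": rng.integers(0, 2, n)})  # 0=low, 1=high
      # Simulate: sanctity predicts the judgement only for sophisticates.
      df["judgement"] = 0.6 * df.sanctity * df.sophistication + rng.normal(0, 1, n)

      model = smf.ols("judgement ~ sanctity * sophistication", data=df).fit()
      print(model.summary().tables[1])  # interaction coefficient near 0.6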

  23. Aristotle and Social-Epistemic Rhetoric: The Systematizing of the Sophistic Legacy.

    ERIC Educational Resources Information Center

    Allen, James E.

    While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…

  24. High deductible health plans: does cost sharing stimulate increased consumer sophistication?

    PubMed

    Gupta, Neal; Polsky, Daniel

    2015-06-01

    To determine whether increased cost sharing in health insurance plans induces higher levels of consumer sophistication in a non-elderly population. This analysis is based on the collection of survey and demographic data collected from enrollees in the RAND health insurance experiment (HIE). During the RAND HIE, enrollees were randomly assigned to different levels of cost sharing (0, 25, 50 and 95%). The study population comprises about 2000 people enrolled in the RAND HIE, between the years 1974 and 1982. Effects on health-care decision making were measured using the results of a standardized questionnaire, administered at the beginning and end of the experiment. Points of enquiry included whether or not enrollees (i) recognized the need for second opinions, (ii) questioned the effectiveness of certain therapies and (iii) researched the background/skill of their medical providers. Consumer sophistication was also measured for regular health-care consumers, as indicated by the presence of a chronic disease. We found no statistically significant changes (P < 0.05) in the health-care decision-making strategies between individuals randomized to high cost sharing plans and low cost sharing plans. Furthermore, we did not find a stronger effect for patients with a chronic disease. The evidence from the RAND HIE does not support the hypothesis that a higher level of cost sharing incentivizes the development of consumer sophistication. As a result, cost sharing alone will not promote individuals to become more selective in their health-care decision-making. © 2012 Blackwell Publishing Ltd.

  25. Sensorless control for a sophisticated artificial myocardial contraction by using shape memory alloy fibre.

    PubMed

    Shiraishi, Y; Yambe, T; Saijo, Y; Sato, F; Tanaka, A; Yoshizawa, M; Sugai, T K; Sakata, R; Luo, Y; Park, Y; Uematsu, M; Umezu, M; Fujimoto, T; Masumoto, N; Liu, H; Baba, A; Konno, S; Nitta, S; Imachi, K; Tabayashi, K; Sasada, H; Homma, D

    2008-01-01

    The authors have been developing an artificial myocardium, which is capable of supporting natural contractile function from the outside of the ventricle. The system was originally designed by using sophisticated covalent shape memory alloy fibres, and the surface did not implicate blood compatibility. The purpose of our study on the development of artificial myocardium was to achieve the assistance of myocardial functional reproduction by the integrative small mechanical elements without sensors, so that the effective circulatory support could be accomplished. In this study, the authors fabricated the prototype artificial myocardial assist unit composed of the sophisticated shape memory alloy fibre (Biometal), the diameter of which was 100 microns, and examined the mechanical response by using the pulse width modulation (PWM) control method in each unit. Prior to the evaluation of dynamic characteristics, the relationship between strain and electric resistance and also the initial response of each unit were obtained. The component for the PWM control was designed in order to regulate the myocardial contractile function; it consisted of an originally-designed RISC microcomputer with the input of displacement, and its output signal was controlled by the pulse width modulation method. As a result, the optimal PWM parameters were confirmed and the fibrous displacement was successfully regulated under the different heat transfer conditions simulating internal body temperature as well as bias tensile loading. It was then indicated that this control theory might be applied for more sophisticated ventricular passive or active restraint by the artificial myocardium on physiological demand.
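
    A crude sketch of displacement-feedback PWM control of a contractile fibre; the first-order heating/cooling plant model and the gain are illustrative assumptions, not the authors' parameters:

      # Proportional control of duty cycle from displacement (strain) error.
      def sma_plant(strain, duty, dt=0.01, tau_heat=0.5, tau_cool=1.5,
                    max_strain=0.04):
          """Toy model: duty cycle heats the fibre toward contraction."""
          target = max_strain * duty
          tau = tau_heat if target > strain else tau_cool
          return strain + (target - strain) * dt / tau

      strain, setpoint, kp = 0.0, 0.02, 200.0
      for _ in range(1000):                         # 10 s at dt = 0.01 s
          duty = min(max(kp * (setpoint - strain), 0.0), 1.0)
          strain = sma_plant(strain, duty)
      # Pure proportional control leaves a steady-state error; the unit
      # described above uses a more elaborate PWM regulator.
      print(f"final strain: {strain:.4f} (setpoint {setpoint})")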

  26. Characteristics and Levels of Sophistication: An Analysis of Chemistry Students' Ability to Think with Mental Models

    ERIC Educational Resources Information Center

    Wang, Chia-Yu; Barrow, Lloyd H.

    2011-01-01

    This study employed a case-study approach to reveal how an ability to think with mental models contributes to differences in students' understanding of molecular geometry and polarity. We were interested in characterizing features and levels of sophistication regarding first-year university chemistry learners' mental modeling behaviors while the…

  27. Isocratean Discourse Theory and Neo-Sophistic Pedagogy: Implications for the Composition Classroom.

    ERIC Educational Resources Information Center

    Blair, Kristine L.

    With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum.…

  28. From Poetry to Prose: Sophistic Rhetoric and the Epistemic Music of Language.

    ERIC Educational Resources Information Center

    Katz, Steven B.

    Much revisionist scholarship has focused on sophistic epistemology and its relationship to the current revival of epistemic rhetoric in the academy. However, few scholars have recognized the sensuous substance of words as sounds, and the role it played in sophistic philosophy and rhetoric. Before the invention of the Greek alphabet, poetry was…

  9. The "Virtual ChemLab" Project: A Realistic and Sophisticated Simulation of Organic Synthesis and Organic Qualitative Analysis

    ERIC Educational Resources Information Center

    Woodfield, Brian F.; Andrus, Merritt B.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Allen, Rob; Bodily, Greg; Andersen, Tricia; Miller, Jordan; Simmons, Bryon; Stanger, Richard

    2005-01-01

    A set of sophisticated and realistic laboratory simulations is created for use in freshman- and sophomore-level chemistry classes and laboratories called 'Virtual ChemLab'. A detailed assessment of student responses is provided and the simulation's pedagogical utility is described using the organic simulation.

  30. Sophisticated Approval Voting, Ignorance Priors, and Plurality Heuristics: A Behavioral Social Choice Analysis in a Thurstonian Framework

    ERIC Educational Resources Information Center

    Regenwetter, Michel; Ho, Moon-Ho R.; Tsetlin, Ilia

    2007-01-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two…

  31. A regional assessment of information technology sophistication in Missouri nursing homes.

    PubMed

    Alexander, Gregory L; Madsen, Richard; Wakefield, Douglas

    2010-08-01

    To provide a state profile of information technology (IT) sophistication in Missouri nursing homes. Primary survey data were collected from December 2006 to August 2007. A descriptive, exploratory cross-sectional design was used to investigate dimensions of IT sophistication (technological, functional, and integration) related to resident care, clinical support, and administrative processes. Each dimension was used to describe the clinical domains and demographics (ownership, regional location, and bed size). The final sample included 185 nursing homes. A wide range of IT sophistication is being used in administrative and resident care management processes, but very little in clinical support activities. Evidence suggests nursing homes in Missouri are expanding use of IT beyond traditional administrative and billing applications to patient care and clinical applications. This trend is important to provide support for capabilities which have been implemented to achieve national initiatives for meaningful use of IT in health care settings.

  32. Experience with a sophisticated computer based authoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, P.R.

    1984-04-01

    In the November 1982 issue of ADCIS SIG CBT Newsletter the editor arrives at two conclusions regarding Computer Based Authoring Systems (CBAS): (1) CBAS drastically reduces programming time and the need for expert programmers, and (2) CBAS appears to have minimal impact on initial lesson design. Both of these comments have significant impact on any Cost-Benefit analysis for Computer-Based Training. The first tends to improve cost-effectiveness but only toward the limits imposed by the second. Westinghouse Hanford Company (WHC) recently purchased a sophisticated CBAS, the WISE/SMART system from Wicat (Orem, UT), for use in the Nuclear Power Industry. This report details our experience with this system relative to Items (1) and (2) above; lesson design time will be compared with lesson input time. Also provided will be the WHC experience in the use of subject matter experts (though computer neophytes) for the design and inputting of CBT materials.

  33. AN INTERNATIONAL WORKSHOP ON LIFE CYCLE IMPACT ASSESSMENT SOPHISTICATION

    EPA Science Inventory

    On November 29-30, 1998 in Brussels, an international workshop was held to discuss Life Cycle Impact Assessment (LCIA) Sophistication. Approximately 50 LCA experts attended the workshop from North America, Europe, and Asia. Prominent practitioners and researchers were invited to ...

  34. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    PubMed

    Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren

    2014-01-01

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.
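
    One basic psychometric check of the kind reported for such instruments, Cronbach's alpha for a single self-report subscale, sketched on simulated Likert responses (the Gold-MSI itself is multidimensional):

      # Cronbach's alpha for k items answered by n respondents.
      import numpy as np

      rng = np.random.default_rng(4)
      n, k = 300, 7
      trait = rng.normal(size=(n, 1))               # one latent trait
      items = np.clip(np.round(4 + trait + rng.normal(0, 0.8, (n, k))), 1, 7)

      item_var_sum = items.var(axis=0, ddof=1).sum()
      total_var = items.sum(axis=1).var(ddof=1)
      alpha = k / (k - 1) * (1 - item_var_sum / total_var)
      print(f"Cronbach's alpha: {alpha:.2f}")       # high for coherent items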

  35. The New Toxicology of Sophisticated Materials: Nanotoxicology and Beyond

    PubMed Central

    Maynard, Andrew D.; Warheit, David B.; Philbert, Martin A.

    2011-01-01

    It has long been recognized that the physical form of materials can mediate their toxicity—the health impacts of asbestiform materials, industrial aerosols, and ambient particulate matter are prime examples. Yet over the past 20 years, toxicology research has suggested complex and previously unrecognized associations between material physicochemistry at the nanoscale and biological interactions. With the rapid rise of the field of nanotechnology and the design and production of increasingly complex nanoscale materials, it has become ever more important to understand how the physical form and chemical composition of these materials interact synergistically to determine toxicity. As a result, a new field of research has emerged—nanotoxicology. Research within this field is highlighting the importance of material physicochemical properties in how dose is understood, how materials are characterized in a manner that enables quantitative data interpretation and comparison, and how materials move within, interact with, and are transformed by biological systems. Yet many of the substances that are the focus of current nanotoxicology studies are relatively simple materials that are at the vanguard of a new era of complex materials. Over the next 50 years, there will be a need to understand the toxicology of increasingly sophisticated materials that exhibit novel, dynamic and multifaceted functionality. If the toxicology community is to meet the challenge of ensuring the safe use of this new generation of substances, it will need to move beyond “nano” toxicology and toward a new toxicology of sophisticated materials. Here, we present a brief overview of the current state of the science on the toxicology of nanoscale materials and focus on three emerging toxicology-based challenges presented by sophisticated materials that will become increasingly important over the next 50 years: identifying relevant materials for study, physicochemical characterization, and…

  36. Viscous-Inviscid Methods in Unsteady Aerodynamic Analysis of Bio-Inspired Morphing Wings

    NASA Astrophysics Data System (ADS)

    Dhruv, Akash V.

    Flight has been one of the greatest realizations of human imagination, revolutionizing communication and transportation over the years. This has greatly influenced the growth of technology itself, enabling researchers to communicate and share their ideas more effectively, extending the human potential to create more sophisticated systems. While the end product of a sophisticated technology makes our lives easier, its development process presents an array of challenges in itself. In the last decade, scientists and engineers have turned towards bio-inspiration to design more efficient and robust aerodynamic systems to enhance the ability of Unmanned Aerial Vehicles (UAVs) to be operated in cluttered environments, where tight maneuverability and controllability are necessary. Effective use of UAVs in domestic airspace will mark the beginning of a new age in communication and transportation. The design of such complex systems necessitates faster and more effective tools to perform preliminary investigations in design, thereby streamlining the design process. This thesis explores the implementation of numerical panel methods for aerodynamic analysis of bio-inspired morphing wings. Numerical panel methods have been one of the earliest forms of computational methods for aerodynamic analysis to be developed. Although the early editions of this method performed only inviscid analysis, the algorithm has matured over the years as a result of contributions made by prominent aerodynamicists. The method discussed in this thesis is influenced by recent advancements in panel methods and incorporates both viscous and inviscid analysis of multi-flap wings. The surface calculation of aerodynamic coefficients makes this method less computationally expensive than traditional Computational Fluid Dynamics (CFD) solvers available, and thus is effective when both speed and accuracy are desired. The morphing wing design, which consists of sequential feather-like flaps installed…

  37. Rational thinking and cognitive sophistication: development, cognitive abilities, and thinking dispositions.

    PubMed

    Toplak, Maggie E; West, Richard F; Stanovich, Keith E

    2014-04-01

    We studied developmental trends in 5 important reasoning tasks that are critical components of the operational definition of rational thinking. The tasks measured denominator neglect, belief bias, base rate sensitivity, resistance to framing, and the tendency toward otherside thinking. In addition to age, we examined 2 other individual difference domains that index cognitive sophistication: cognitive ability (intelligence and executive functioning) and thinking dispositions (actively open-minded thinking, superstitious thinking, and need for cognition). All 5 reasoning domains were consistently related to cognitive sophistication regardless of how it was indexed (age, cognitive ability, thinking dispositions). The implications of these findings for taxonomies of developmental trends in rational thinking tasks are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  38. Primary Cilia: Highly Sophisticated Biological Sensors

    PubMed Central

    Abou Alaiwi, Wissam A.; Lo, Shao T.; Nauli, Surya M.

    2009-01-01

    Primary cilia, thin hair-like structures protruding from the apical surface of most mammalian cells, have gained the attention of many researchers over the past decade. Primary cilia are microtubule-filled sensory organelles that are enclosed within the ciliary membrane. They originate at the cell surface from the mother centriole that becomes the mature basal body. In this review, we will discuss recent literatures on the roles of cilia as sophisticated sensory organelles. With particular emphasis on vascular endothelia and renal epithelia, the mechanosensory role of cilia in sensing fluid shear stress will be discussed. Also highlighted is the ciliary involvement in cell cycle regulation, development, cell signaling and cancer. Finally, primary cilia-related disorders will be briefly described. PMID:22423203

  39. An Excel‐based implementation of the spectral method of action potential alternans analysis

    PubMed Central

    Pearman, Charles M.

    2014-01-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro‐arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T‐wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
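
    The core of the spectral method fits in a few lines: the power at 0.5 cycles/beat indexes alternans, and a k-score compares it with a nearby noise band. A sketch on a simulated action potential duration (APD) series follows; the published tool's windowing and thresholds are more elaborate:

      # Spectral detection of beat-to-beat alternans in an APD series.
      import numpy as np

      rng = np.random.default_rng(3)
      n_beats = 128
      apd = 200 + rng.normal(0, 1, n_beats)        # ~200 ms APD plus noise
      apd += 3 * (-1) ** np.arange(n_beats)        # 3 ms 2:1 alternans

      spectrum = np.abs(np.fft.rfft(apd - apd.mean())) ** 2 / n_beats
      freqs = np.fft.rfftfreq(n_beats, d=1.0)      # cycles per beat

      alt_power = spectrum[-1]                     # bin at 0.5 cycles/beat
      noise = spectrum[(freqs > 0.40) & (freqs < 0.46)]
      k_score = (alt_power - noise.mean()) / noise.std()
      print(f"alternans power {alt_power:.1f}, k-score {k_score:.1f}")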

  40. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species

    PubMed Central

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien

    2017-01-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities. PMID:29112973

  41. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    PubMed

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien; Masi, Shelly; Daunizeau, Jean

    2017-11-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.

  42. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part II. Statistical Methods of Meta-Analysis

    PubMed Central

    Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi

    2015-01-01

    Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires the simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and could be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models including the bivariate model and the hierarchical summary receiver operating characteristic model are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
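
    For contrast with the bivariate approach recommended above, a deliberately simplified sketch: DerSimonian-Laird random-effects pooling of logit sensitivities alone, which ignores the sensitivity-specificity correlation that bivariate/HSROC models handle. The study counts are invented:

      # Univariate random-effects pooling of logit sensitivity (simplified).
      import numpy as np

      tp = np.array([45, 30, 60, 20])              # true positives per study
      fn = np.array([5, 10, 8, 9])                 # false negatives per study

      logit = np.log((tp + 0.5) / (fn + 0.5))      # continuity-corrected
      var = 1 / (tp + 0.5) + 1 / (fn + 0.5)        # approximate variance
      w = 1 / var
      fixed = (w * logit).sum() / w.sum()
      q = (w * (logit - fixed) ** 2).sum()
      tau2 = max(0.0, (q - (len(tp) - 1))
                 / (w.sum() - (w ** 2).sum() / w.sum()))   # DL estimator
      w_re = 1 / (var + tau2)
      pooled = (w_re * logit).sum() / w_re.sum()
      print("pooled sensitivity:", 1 / (1 + np.exp(-pooled)))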

  43. An Excel-based implementation of the spectral method of action potential alternans analysis.

    PubMed

    Pearman, Charles M

    2014-12-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.

  44. A sophisticated simulation for the fracture behavior of concrete material using XFEM

    NASA Astrophysics Data System (ADS)

    Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili

    2017-10-01

    The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.

  45. The Musicality of Non-Musicians: An Index for Assessing Musical Sophistication in the General Population

    PubMed Central

    Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren

    2014-01-01

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of ‘musical sophistication’ which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement. PMID:24586929

  46. Strategic sophistication of individuals and teams. Experimental evidence

    PubMed Central

    Sutter, Matthias; Czermak, Simon; Feri, Francesco

    2013-01-01

    Many important decisions require strategic sophistication. We examine experimentally whether teams act more strategically than individuals. We let individuals and teams make choices in simple games, and also elicit first- and second-order beliefs. We find that teams play the Nash equilibrium strategy significantly more often, and their choices are more often a best response to stated first order beliefs. Distributional preferences make equilibrium play less likely. Using a mixture model, the estimated probability to play strategically is 62% for teams, but only 40% for individuals. A model of noisy introspection reveals that teams differ from individuals in higher order beliefs. PMID:24926100

  47. Differential ethnic associations between maternal flexibility and play sophistication in toddlers born very low birth weight

    PubMed Central

    Erickson, Sarah J.; Montague, Erica Q.; Maclean, Peggy C.; Bancroft, Mary E.; Lowe, Jean R.

    2013-01-01

    Children born very low birth weight (<1500 grams, VLBW) are at increased risk for developmental delays. Play is an important developmental outcome to the extent that child’s play and social communication are related to later development of self-regulation and effective functional skills, and play serves as an important avenue of early intervention. The current study investigated associations between maternal flexibility and toddler play sophistication in Caucasian, Spanish speaking Hispanic, English speaking Hispanic, and Native American toddlers (18-22 months adjusted age) in a cross-sectional cohort of 73 toddlers born VLBW and their mothers. We found that the association between maternal flexibility and toddler play sophistication differed by ethnicity (F(3,65) = 3.34, p = .02). In particular, Spanish speaking Hispanic dyads evidenced a significant positive association between maternal flexibility and play sophistication of medium effect size. Results for Native Americans were parallel to those of Spanish speaking Hispanic dyads: the relationship between flexibility and play sophistication was positive and of small-medium effect size. Findings indicate that for Caucasians and English speaking Hispanics, flexibility evidenced a non-significant (negative and small effect size) association with toddler play sophistication. Significant follow-up contrasts revealed that the associations for Caucasian and English speaking Hispanic dyads were significantly different from those of the other two ethnic groups. Results remained unchanged after adjusting for the amount of maternal language, an index of maternal engagement and stimulation; and after adjusting for birth weight, gestational age, gender, test age, cognitive ability, as well as maternal age, education, and income. Our results provide preliminary evidence that ethnicity and acculturation may mediate the association between maternal interactive behavior such as flexibility and toddler developmental outcomes, as…

  48. Differential ethnic associations between maternal flexibility and play sophistication in toddlers born very low birth weight.

    PubMed

    Erickson, Sarah J; Montague, Erica Q; Maclean, Peggy C; Bancroft, Mary E; Lowe, Jean R

    2012-12-01

    Children born very low birth weight (<1500 g, VLBW) are at increased risk for developmental delays. Play is an important developmental outcome to the extent that child's play and social communication are related to later development of self-regulation and effective functional skills, and play serves as an important avenue of early intervention. The current study investigated associations between maternal flexibility and toddler play sophistication in Caucasian, Spanish speaking Hispanic, English speaking Hispanic, and Native American toddlers (18-22 months adjusted age) in a cross-sectional cohort of 73 toddlers born VLBW and their mothers. We found that the association between maternal flexibility and toddler play sophistication differed by ethnicity (F(3,65) = 3.34, p = .02). In particular, Spanish speaking Hispanic dyads evidenced a significant positive association between maternal flexibility and play sophistication of medium effect size. Results for Native Americans were parallel to those of Spanish speaking Hispanic dyads: the relationship between flexibility and play sophistication was positive and of small-medium effect size. Findings indicate that for Caucasians and English speaking Hispanics, flexibility evidenced a non-significant (negative and small effect size) association with toddler play sophistication. Significant follow-up contrasts revealed that the associations for Caucasian and English speaking Hispanic dyads were significantly different from those of the other two ethnic groups. Results remained unchanged after adjusting for the amount of maternal language, an index of maternal engagement and stimulation; and after adjusting for birth weight, gestational age, gender, test age, cognitive ability, as well as maternal age, education, and income. Our results provide preliminary evidence that ethnicity and acculturation may mediate the association between maternal interactive behavior such as flexibility and toddler developmental outcomes, as indexed…

  49. Assessing Syntactic Sophistication in L2 Writing: A Usage-Based Approach

    ERIC Educational Resources Information Center

    Kyle, Kristopher; Crossley, Scott

    2017-01-01

    Over the past 45 years, the construct of syntactic sophistication has been assessed in L2 writing using what Bulté and Housen (2012) refer to as absolute complexity (Lu, 2011; Ortega, 2003; Wolfe-Quintero, Inagaki, & Kim, 1998). However, it has been argued that making inferences about learners based on absolute complexity indices (e.g., mean…

  10. Examining Candidate Information Search Processes: The Impact of Processing Goals and Sophistication.

    ERIC Educational Resources Information Center

    Huang, Li-Ning

    2000-01-01

    Investigates how 4 different information-processing goals, varying on the dimensions of effortful versus effortless and impression-driven versus non-impression-driven processing, and individual differences in political sophistication affect the depth at which undergraduate students process candidate information and their decision-making strategies.…

  11. Sophistic Ethics in the Technical Writing Classroom: Teaching "Nomos," Deliberation, and Action.

    ERIC Educational Resources Information Center

    Scott, J. Blake

    1995-01-01

    Claims that teaching ethics is particularly important to technical writing. Outlines a classical, sophistic approach to ethics based on the theories and pedagogies of Protagoras, Gorgias, and Isocrates, which emphasizes the Greek concept of "nomos," internal and external deliberation, and responsible action. Discusses problems and…

  12. The sophisticated visual system of a tiny Cambrian crustacean: analysis of a stalked fossil compound eye

    PubMed Central

    Schoenemann, Brigitte; Castellani, Christopher; Clarkson, Euan N. K.; Haug, Joachim T.; Maas, Andreas; Haug, Carolin; Waloszek, Dieter

    2012-01-01

    Fossilized compound eyes from the Cambrian, isolated and three-dimensionally preserved, provide remarkable insights into the lifestyle and habitat of their owners. The tiny stalked compound eyes described here probably possessed too few facets to form a proper image, but they represent a sophisticated system for detecting moving objects. The eyes are preserved as almost solid, mace-shaped blocks of phosphate, in which the original positions of the rhabdoms in one specimen are retained as deep cavities. Analysis of the optical axes reveals four visual areas, each with different properties in acuity of vision. They are surveyed by lenses directed forwards, laterally, backwards and inwards, respectively. The most intriguing of these is the putatively inwardly orientated zone, where the optical axes, like those orientated to the front, interfere with axes of the other eye of the contralateral side. The result is a three-dimensional visual net that covers not only the front, but extends also far laterally to either side. Thus, a moving object could be perceived by a two-dimensional coordinate (which is formed by two axes of those facets, one of the left and one of the right eye, which are orientated towards the moving object) in a wide three-dimensional space. This compound eye system enables small arthropods equipped with an eye of low acuity to estimate velocity, size or distance of possible food items efficiently. The eyes are interpreted as having been derived from individuals of the early crustacean Henningsmoenicaris scutula pointing to the existence of highly efficiently developed eyes in the early evolutionary lineage leading towards the modern Crustacea. PMID:22048954

  13. A Macro-Level Analysis of SRL Processes and Their Relations to the Acquisition of a Sophisticated Mental Model of a Complex System

    ERIC Educational Resources Information Center

    Greene, Jeffrey Alan; Azevedo, Roger

    2009-01-01

    In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…

  14. Assessing Epistemic Sophistication by Considering Domain-Specific Absolute and Multiplicistic Beliefs Separately

    ERIC Educational Resources Information Center

    Peter, Johannes; Rosman, Tom; Mayer, Anne-Kathrin; Leichner, Nikolas; Krampen, Günter

    2016-01-01

    Background: Particularly in higher education, not only a view of science as a means of finding absolute truths (absolutism), but also a view of science as generally tentative (multiplicism) can be unsophisticated and obstructive for learning. Most quantitative epistemic belief inventories neglect this and understand epistemic sophistication as…

  15. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.

  16. Sophisticated Clean Air Strategies Required to Mitigate Against Particulate Organic Pollution

    PubMed Central

    Grigas, T.; Ovadnevaite, J.; Ceburnis, D.; Moran, E.; McGovern, F. M.; Jennings, S. G.; O’Dowd, C.

    2017-01-01

    Since the 1980’s, measures mitigating the impact of transboundary air pollution have been implemented successfully as evidenced in the 1980–2014 record of atmospheric sulphur pollution over the NE-Atlantic, a key region for monitoring background northern-hemisphere pollution levels. The record reveals a 72–79% reduction in annual-average airborne sulphur pollution (SO4 and SO2, respectively) over the 35-year period. The NE-Atlantic, as observed from the Mace Head research station on the Irish coast, can be considered clean for 64% of the time during which sulphate dominates PM1 levels, contributing 42% of the mass, and for the remainder of the time, under polluted conditions, a carbonaceous (organic matter and Black Carbon) aerosol prevails, contributing 60% to 90% of the PM1 mass and exhibiting a trend whereby its contribution increases with increasing pollution levels. The carbonaceous aerosol is known to be diverse in source and nature and requires sophisticated air pollution policies underpinned by sophisticated characterisation and source apportionment capabilities to inform selective emissions-reduction strategies. Inauspiciously, however, this carbonaceous concoction is not measured in regulatory Air Quality networks. PMID:28303958

  17. Sophisticated Clean Air Strategies Required to Mitigate Against Particulate Organic Pollution

    NASA Astrophysics Data System (ADS)

    Grigas, T.; Ovadnevaite, J.; Ceburnis, D.; Moran, E.; McGovern, F. M.; Jennings, S. G.; O'Dowd, C.

    2017-03-01

    Since the 1980’s, measures mitigating the impact of transboundary air pollution have been implemented successfully as evidenced in the 1980-2014 record of atmospheric sulphur pollution over the NE-Atlantic, a key region for monitoring background northern-hemisphere pollution levels. The record reveals a 72-79% reduction in annual-average airborne sulphur pollution (SO4 and SO2, respectively) over the 35-year period. The NE-Atlantic, as observed from the Mace Head research station on the Irish coast, can be considered clean for 64% of the time during which sulphate dominates PM1 levels, contributing 42% of the mass, and for the remainder of the time, under polluted conditions, a carbonaceous (organic matter and Black Carbon) aerosol prevails, contributing 60% to 90% of the PM1 mass and exhibiting a trend whereby its contribution increases with increasing pollution levels. The carbonaceous aerosol is known to be diverse in source and nature and requires sophisticated air pollution policies underpinned by sophisticated characterisation and source apportionment capabilities to inform selective emissions-reduction strategies. Inauspiciously, however, this carbonaceous concoction is not measured in regulatory Air Quality networks.

  18. Sophisticated Clean Air Strategies Required to Mitigate Against Particulate Organic Pollution.

    PubMed

    Grigas, T; Ovadnevaite, J; Ceburnis, D; Moran, E; McGovern, F M; Jennings, S G; O'Dowd, C

    2017-03-17

    Since the 1980s, measures mitigating the impact of transboundary air pollution have been implemented successfully as evidenced in the 1980-2014 record of atmospheric sulphur pollution over the NE-Atlantic, a key region for monitoring background northern-hemisphere pollution levels. The record reveals a 72-79% reduction in annual-average airborne sulphur pollution (SO4 and SO2, respectively) over the 35-year period. The NE-Atlantic, as observed from the Mace Head research station on the Irish coast, can be considered clean for 64% of the time during which sulphate dominates PM1 levels, contributing 42% of the mass, and for the remainder of the time, under polluted conditions, a carbonaceous (organic matter and Black Carbon) aerosol prevails, contributing 60% to 90% of the PM1 mass and exhibiting a trend whereby its contribution increases with increasing pollution levels. The carbonaceous aerosol is known to be diverse in source and nature and requires sophisticated air pollution policies underpinned by sophisticated characterisation and source apportionment capabilities to inform selective emissions-reduction strategies. Inauspiciously, however, this carbonaceous concoction is not measured in regulatory Air Quality networks.

  19. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers.

    PubMed

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues since in Finnish, vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers either in the auditory brainstem response or in behavioral tasks, but they do show an enhanced pitch discrimination compared to Finnish speakers with less musical experience and show greater duration modulation in a complex task. These results are consistent with a ceiling effect set for certain sound features which corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real world musical situation. These results have implications for research into the specificity of plasticity in the auditory system as well as to the effects of interaction of specific language features with musical experiences.

  20. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers

    PubMed Central

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues since in Finnish, vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers either in the auditory brainstem response or in behavioral tasks, but they do show an enhanced pitch discrimination compared to Finnish speakers with less musical experience and show greater duration modulation in a complex task. These results are consistent with a ceiling effect set for certain sound features which corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real world musical situation. These results have implications for research into the specificity of plasticity in the auditory system as well as to the effects of interaction of specific language features with musical experiences. PMID:28450829

  1. Creativity Research: More Studies, Greater Sophistication and the Importance of "Big" Questions

    ERIC Educational Resources Information Center

    Ward, Thomas B.; Kennedy, Evan S.

    2017-01-01

    In the past 20 years, there has been a strong and steady increase in the number of publications concerned with creativity and in the number of outlets for that work. More importantly, there has been an increase in the level of detail and sophistication of answers provided for the most fundamental questions in the field. We illustrate that…

  2. Analysis of Time Filters in Multistep Methods

    NASA Astrophysics Data System (ADS)

    Hurl, Nicholas

    Geophysical flow simulations have evolved sophisticated implicit-explicit time stepping methods (based on fast-slow wave splittings) followed by time filters to control any unstable modes that result. Time filters are modular and parallel. Their effect on stability of the overall process has been tested in numerous simulations, but never analyzed. Stability is proven herein, by energy methods, for the Crank-Nicolson Leapfrog (CNLF) method with the Robert-Asselin (RA) time filter and for the Crank-Nicolson Leapfrog method with the Robert-Asselin-Williams (RAW) time filter for systems. We derive an equivalent multistep method for CNLF+RA and CNLF+RAW, and stability regions are obtained. The time step restriction for energy stability of CNLF+RA is smaller than that of CNLF, and the CNLF+RAW time step restriction is even smaller. Numerical tests find that RA and RAW add numerical dissipation. This thesis also shows that all modes of the CNLF method are asymptotically stable under the standard time step condition.
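
    For readers unfamiliar with the filters analyzed here, the following minimal Python sketch applies the Robert-Asselin filter inside a leapfrog integration of the test problem u' = -u. The step size and filter coefficient are illustrative choices, not values from the thesis.

      import numpy as np

      dt, nu, n_steps = 0.01, 0.1, 1000   # illustrative step size and RA coefficient
      u = np.zeros(n_steps)
      u[0] = 1.0
      u[1] = u[0] + dt * (-u[0])          # one forward Euler step to start leapfrog

      for n in range(1, n_steps - 1):
          u[n + 1] = u[n - 1] + 2.0 * dt * (-u[n])   # leapfrog step for u' = -u
          # RA filter: smooth u[n] to damp the spurious computational mode;
          # u[n - 1] already holds its filtered value from the previous pass.
          u[n] = u[n] + nu * (u[n + 1] - 2.0 * u[n] + u[n - 1])

      print(u[-1], np.exp(-dt * (n_steps - 1)))      # filtered vs exact decay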

  3. Sagacious, Sophisticated, and Sedulous: The Importance of Discussing 50-Cent Words with Preschoolers

    ERIC Educational Resources Information Center

    Collins, Molly F.

    2012-01-01

    Adults often use simple words instead of complex words when talking to young children. Reasons vary from teachers' beliefs that young children cannot understand sophisticated vocabulary because they are too young or have limited language skills, to teachers' unfamiliarity with complex words or with strategies for supporting vocabulary. As a…

  4. Quantitative Proteomic Analysis Reveals Populus cathayana Females Are More Sensitive and Respond More Sophisticatedly to Iron Deficiency than Males.

    PubMed

    Zhang, Sheng; Zhang, Yunxiang; Cao, Yanchun; Lei, Yanbao; Jiang, Hao

    2016-03-04

    Previous studies have shown that there are significant sexual differences in the morphological and physiological responses of Populus cathayana Rehder to nitrogen and phosphorus deficiencies, but little is known about the sex-specific differences in responses to iron deficiency. In this study, the effects of iron deficiency on the morphology, physiology, and proteome of P. cathayana males and females were investigated. The results showed that iron deficiency (25 days) significantly decreased height growth, photosynthetic rate, chlorophyll content, and tissue iron concentration in both sexes. A comparison between the sexes indicated that iron-deficient males had less height inhibition and photosynthesis system II or chloroplast ultrastructural damage than iron-deficient females. iTRAQ-based quantitative proteomic analysis revealed that 144 and 68 proteins were decreased in abundance (e.g., proteins involved in photosynthesis, carbohydrate and energy metabolism, and gene expression regulation) and 78 and 39 proteins were increased in abundance (e.g., proteins involved in amino acid metabolism and stress response) according to the criterion of ratio ≥1.5 in females and males, respectively. A comparison between the sexes indicated that iron-deficient females exhibited a greater change in the proteins involved in photosynthesis, carbon and energy metabolism, the redox system, and stress responsive proteins. This study reveals females are more sensitive and have a more sophisticated response to iron deficiency compared with males and provides new insights into differential sexual responses to nutrient deficiency.

  5. Machine cost analysis using the traditional machine-rate method and ChargeOut!

    Treesearch

    E. M. (Ted) Bilek

    2009-01-01

    Forestry operations require ever more use of expensive capital equipment. Mechanization is frequently necessary to perform cost-effective and safe operations. Increased capital should mean more sophisticated capital costing methodologies. However the machine rate method, which is the costing methodology most frequently used, dates back to 1942. CHARGEOUT!, a recently...

  6. When to Pull the Trigger for the Counterattack: Simplicity versus Sophistication.

    DTIC Science & Technology

    1985-12-02

    School of Advanced Military Studies, U.S. Army Command and General Staff College, Fort Leavenworth, Kansas, 2 December 1985. Approved for public release; distribution unlimited.

  7. Survival analysis and classification methods for forest fire size.

    PubMed

    Tremblay, Pier-Olivier; Duchesne, Thierry; Cumming, Steven G

    2018-01-01

    Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at "being held" (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at "being held" exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances.
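
    A minimal sketch of the two-stage workflow described above, using scikit-learn for the growth classifier and the lifelines implementation of the Cox proportional hazards model; all file and column names are hypothetical placeholders, not the study's covariates.

      import pandas as pd
      from lifelines import CoxPHFitter
      from sklearn.linear_model import LogisticRegression

      fires = pd.read_csv("fires.csv")   # hypothetical: one row per lightning-caused fire

      # Stage 1: classify whether size at "being held" exceeded size at assessment.
      clf = LogisticRegression().fit(fires[["fire_weather", "fuel_code"]], fires["grew"])

      # Stage 2: proportional hazards regression on the fires that grew.
      grown = fires[fires["grew"] == 1]
      cph = CoxPHFitter()
      cph.fit(grown[["final_size", "event", "fire_weather", "fuel_code", "attack_code"]],
              duration_col="final_size", event_col="event")
      cph.print_summary()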

  8. Recent advances in methods for the analysis of protein O-glycosylation at proteome level.

    PubMed

    You, Xin; Qin, Hongqiang; Ye, Mingliang

    2018-01-01

    O-Glycosylation, which refers to the glycosylation of the hydroxyl group of side chains of Serine/Threonine/Tyrosine residues, is one of the most common post-translational modifications. Compared with N-linked glycosylation, O-glycosylation is less explored because of its complex structure and relatively low abundance. Recently, O-glycosylation has drawn more and more attention for its various functions in many sophisticated biological processes. To obtain a deep understanding of O-glycosylation, many efforts have been devoted to develop effective strategies to analyze the two most abundant types of O-glycosylation, i.e. O-N-acetylgalactosamine and O-N-acetylglucosamine glycosylation. In this review, we summarize the proteomics workflows to analyze these two types of O-glycosylation. For the large-scale analysis of mucin-type glycosylation, the glycan simplification strategies including the "SimpleCell" technology were introduced. A variety of enrichment methods including lectin affinity chromatography, hydrophilic interaction chromatography, hydrazide chemistry, and chemoenzymatic method were introduced for the proteomics analysis of O-N-acetylgalactosamine and O-N-acetylglucosamine glycosylation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. The State of Nursing Home Information Technology Sophistication in Rural and Nonrural US Markets.

    PubMed

    Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Wakefield, Douglas S; Wise, Keely K; Alexander, Rachel L

    2017-06-01

    To test for significant differences in information technology sophistication (ITS) in US nursing homes (NH) based on location. We administered a primary survey from January 2014 to July 2015 to NH in each US state. The survey was cross-sectional and examined 3 dimensions (IT capabilities, extent of IT use, degree of IT integration) among 3 domains (resident care, clinical support, administrative activities) of ITS. ITS was broken down by NH location. Mean responses were compared across 4 NH categories (Metropolitan, Micropolitan, Small Town, and Rural) for all 9 ITS dimensions and domains. Least squares means and Tukey's method were used for multiple comparisons. These methods yielded 815/1,799 surveys (45% response rate). In every health care domain (resident care, clinical support, and administrative activities), statistical differences in facility ITS occurred between more populated (metropolitan or micropolitan) and less populated (small town or rural) areas. This study represents the most current national assessment of NH IT since 2004. Historically, NH IT has been used solely for administrative activities and much less for resident care and clinical support. However, results are encouraging as ITS in other domains appears to be greater than previously imagined. © 2016 National Rural Health Association.
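
    The comparison scheme named above (least squares means with Tukey's method) takes only a few lines of Python; the data file and column names below are hypothetical placeholders.

      import pandas as pd
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      nh = pd.read_csv("nh_its.csv")   # hypothetical extract: one row per facility

      # Pairwise Tukey comparisons of an ITS score across the four location categories.
      print(pairwise_tukeyhsd(nh["its_score"], nh["location"]))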

  10. New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences

    NASA Astrophysics Data System (ADS)

    Lungaroni, M.; Murari, A.; Peluso, E.; Gelfusa, M.; Malizia, A.; Vega, J.; Talebzadeh, S.; Gaudio, P.

    2016-04-01

    In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Initial data processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression (SVR) is a method devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a method of filtering is investigated first, comparing it with the most popular alternative techniques. Then Support Vector Regression is applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives on the interpretation of the data.
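
    As a rough indication of how SVR serves as a non-parametric filter, the sketch below smooths a synthetic noisy signal with scikit-learn; the hyperparameters are illustrative, not those tuned in the paper.

      import numpy as np
      from sklearn.svm import SVR

      x = np.linspace(0, 10, 500)[:, None]
      y = np.sin(x).ravel() + 0.1 * np.random.randn(500)   # synthetic noisy measurement

      svr = SVR(kernel="rbf", C=10.0, epsilon=0.05)        # illustrative hyperparameters
      y_filtered = svr.fit(x, y).predict(x)                # smoothed signal estimate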

  11. Multi-disciplinary communication networks for skin risk assessment in nursing homes with high IT sophistication.

    PubMed

    Alexander, Gregory L; Pasupathy, Kalyan S; Steege, Linsey M; Strecker, E Bradley; Carley, Kathleen M

    2014-08-01

    The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impacts on communication between care givers and patient outcomes in these settings deserve further attention. In this research, we describe a mixed method approach to explore communication strategies used by healthcare providers for resident skin risk in NH with high IT sophistication (ITS). Sample included NH participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-h shifts, and focus groups to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with number of unique interactions. As more processes in resident care and administrative activities are supported by technology, the lower the number of observed unique interactions. Additionally, four thematic areas emerged from staff focus groups that demonstrate how important IT is to resident care in these facilities including providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings in this study confirm prior research that as technology support (resident care and administrative activities) and overall ITS increases, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery. These results provide evidence for improving the design and implementation of IT in long term care systems to support

  12. Centrifugal compressor surge detecting method based on wavelet analysis of unsteady pressure fluctuations in typical stages

    NASA Astrophysics Data System (ADS)

    Izmaylov, R.; Lebedev, A.

    2015-08-01

    Centrifugal compressors are complex energy equipment, and their automatic control and protection systems should meet requirements of operational reliability and durability. In turbocompressors there are at least two dangerous regimes: surge and rotating stall. Anti-surge protection systems usually use parametric or feature methods; as a rule, industrial systems are parametric. The main disadvantage of parametric anti-surge systems is the difficulty of measuring mass flow in natural gas pipeline compressors. The principal idea of the feature method is based on an experimental fact: as a rule, just before the onset of surge, rotating (precursor) stall becomes established in the compressor. The problem then consists in detecting the characteristic signals of unsteady pressure or velocity fluctuations. Wavelet analysis is the best method for detecting the onset of rotating stall in spite of a high level of spurious signals (rotating wakes, turbulence, etc.), and it is compatible with state-of-the-art DSP systems for industrial control. Examples of the application of wavelet analysis for detecting the onset of rotating stall in typical centrifugal compressor stages are presented. The experimental investigations included unsteady pressure measurements and a sophisticated data acquisition system. The wavelet transforms used biorthogonal wavelets in MATLAB.
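
    A minimal sketch of the screening idea, assuming the PyWavelets library rather than the authors' MATLAB code: decompose a pressure trace with a biorthogonal wavelet and flag a jump in detail-coefficient energy as a crude stall-precursor indicator. The wavelet choice and threshold are illustrative.

      import numpy as np
      import pywt

      pressure = np.load("pressure_trace.npy")   # hypothetical sampled pressure signal

      coeffs = pywt.wavedec(pressure, "bior3.5", level=5)   # biorthogonal DWT
      detail_energy = [float(np.sum(c ** 2)) for c in coeffs[1:]]

      # A sudden rise of energy in one detail band can signal precursor (rotating) stall.
      if max(detail_energy) > 10.0 * np.median(detail_energy):
          print("possible stall precursor detected")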

  13. Survival analysis and classification methods for forest fire size

    PubMed Central

    2018-01-01

    Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at “being held” (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at “being held” exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances. PMID:29320497

  14. A Snapshot of Serial Rape: An Investigation of Criminal Sophistication and Use of Force on Victim Injury and Severity of the Assault.

    PubMed

    de Heer, Brooke

    2016-02-01

    Prior research on rapes reported to law enforcement has identified criminal sophistication and the use of force against the victim as possible unique identifiers of serial rape versus one-time rape. This study sought to contribute to the current literature on reported serial rape by investigating how the level of criminal sophistication of the rapist and the use of force were associated with two important outcomes of rape: victim injury and overall severity of the assault. In addition, it was evaluated whether rapist and victim ethnicity affected these relationships. A nation-wide sample of serial rape cases reported to law enforcement collected by the Federal Bureau of Investigation (FBI) was analyzed (108 rapists, 543 victims). Results indicated that serial rapists typically used a limited amount of force against the victim and displayed a high degree of criminal sophistication. In addition, the more criminally sophisticated the perpetrator was, the more sexual acts he performed on his victim. Finally, rapes between a White rapist and White victim were found to exhibit higher levels of criminal sophistication and were more severe in terms of number and types of sexual acts committed. These findings provide a more in-depth understanding of serial rape that can inform both academics and practitioners in the field about contributors to victim injury and severity of the assault. © The Author(s) 2014.

  15. Differential Variance Analysis: a direct method to quantify and visualize dynamic heterogeneities

    NASA Astrophysics Data System (ADS)

    Pastore, Raffaele; Pesce, Giuseppe; Caggioni, Marco

    2017-03-01

    Many amorphous materials show spatially heterogeneous dynamics, as different regions of the same system relax at different rates. Such a signature, known as Dynamic Heterogeneity, has been crucial to understand the nature of the jamming transition in simple model systems and is currently considered very promising to characterize more complex fluids of industrial and biological relevance. Unfortunately, measurements of dynamic heterogeneities typically require sophisticated experimental set-ups and are performed by few specialized groups. It is now possible to quantitatively characterize the relaxation process and the emergence of dynamic heterogeneities using a straightforward method, here validated on video microscopy data of hard-sphere colloidal glasses. We call this method Differential Variance Analysis (DVA), since it focuses on the variance of the differential frames, obtained by subtracting images at different time-lags. Moreover, direct visualization of dynamic heterogeneities naturally appears in the differential frames when the time-lag is set to the one corresponding to the maximum dynamic susceptibility. This approach opens the way to effectively characterize and tailor a wide variety of soft materials, from complex formulated products to biological tissues.
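
    Since the method is essentially a variance computation over difference images, it can be sketched directly in numpy; the (T, H, W) video stack below is a hypothetical stand-in for the microscopy data.

      import numpy as np

      def differential_variance(frames, lags):
          """Mean variance of the differential frames at each time-lag."""
          out = {}
          for lag in lags:
              diffs = frames[lag:] - frames[:-lag]            # differential frames
              out[lag] = float(np.mean(np.var(diffs, axis=(1, 2))))
          return out

      frames = np.load("video_stack.npy").astype(float)       # hypothetical (T, H, W) stack
      print(differential_variance(frames, lags=[1, 2, 5, 10, 20]))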

  16. Naive Analysis of Variance

    ERIC Educational Resources Information Center

    Braun, W. John

    2012-01-01

    The Analysis of Variance is often taught in introductory statistics courses, but it is not clear that students really understand the method. This is because the derivation of the test statistic and p-value requires a relatively sophisticated mathematical background which may not be well-remembered or understood. Thus, the essential concept behind…
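
    For readers who want to see the method rather than the derivation, a one-way ANOVA takes one call in scipy; the three samples below are made-up numbers.

      from scipy import stats

      g1 = [23.1, 24.5, 22.8, 25.0]
      g2 = [26.2, 27.1, 25.8, 26.9]
      g3 = [22.0, 23.3, 21.7, 22.9]

      f_stat, p_value = stats.f_oneway(g1, g2, g3)   # one-way ANOVA across 3 groups
      print(f"F = {f_stat:.2f}, p = {p_value:.4f}")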

  17. Sophisticated Calculation of the 1oo4-architecture for Safety-related Systems Conforming to IEC61508

    NASA Astrophysics Data System (ADS)

    Hayek, A.; Bokhaiti, M. Al; Schwarz, M. H.; Boercsoek, J.

    2012-05-01

    With the publication and enforcement of the standard IEC 61508 for safety-related systems, recent system architectures have been presented and evaluated. Among the techniques and measures available for evaluating the safety integrity level (SIL) of safety-related systems, measures such as reliability block diagrams and Markov models are used to analyze the probability of failure on demand (PFD) and mean time to failure (MTTF) in conformance with IEC 61508. The current paper deals with the quantitative analysis of the novel 1oo4 (one-out-of-four) architecture presented in recent work, and sophisticated calculations for the required parameters are therefore introduced. The 1oo4 architecture represents an advanced safety architecture based on on-chip redundancy and is 3-failure safe: at least one of the four channels has to work correctly in order to trigger the safety function.
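
    To give a sense of scale, the sketch below uses the textbook simplification PFD_avg ≈ (λ_DU·T)^N / (N+1) for a 1ooN group of independent, identical channels. This ignores the common-cause (beta-factor) and diagnostic-coverage terms that the paper's full IEC 61508 calculation includes; the failure rate and proof-test interval are illustrative values.

      # Simplified 1ooN average probability of failure on demand (independent channels,
      # no common-cause or diagnostic terms): PFD_avg ~ (lambda_du * t_proof)**n / (n + 1).
      lambda_du = 1e-6    # dangerous undetected failure rate per hour (illustrative)
      t_proof = 8760.0    # proof-test interval in hours (one year, illustrative)

      for n in (1, 2, 3, 4):
          pfd = (lambda_du * t_proof) ** n / (n + 1)
          print(f"1oo{n}: PFD_avg ~ {pfd:.3e}")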

  18. A critical analysis of methods for rapid and nondestructive determination of wood density in standing trees

    Treesearch

    Shan Gao; Xiping Wang; Michael C. Wiemann; Brian K. Brashaw; Robert J. Ross; Lihai Wang

    2017-01-01

    Key message: Field methods for rapid determination of wood density in trees have evolved from increment borer, torsiometer, Pilodyn, and nail withdrawal into sophisticated electronic tools of resistance drilling measurement. A partial resistance drilling approach coupled with knowledge of internal tree density distribution may...

  19. Rapid deletion plasmid construction methods for protoplast and Agrobacterium based fungal transformation systems

    USDA-ARS?s Scientific Manuscript database

    Increasing availability of genomic data and sophistication of analytical methodology in fungi have elevated the need for functional genomics tools in these organisms. Gene deletion is a critical tool for functional analysis. The targeted deletion of genes requires both a suitable method for the trans...

  20. Analysis Method for Laterally Loaded Pile Groups Using an Advanced Modeling of Reinforced Concrete Sections.

    PubMed

    Stacul, Stefano; Squeglia, Nunziante

    2018-02-15

    A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction by increasing the stiffness of shallow portions of soil and modeled using the Modified Kovacs model; pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile groups analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing results from data from full scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed by reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ.
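
    The hyperbolic modulus reduction curve named above has a compact closed form; a minimal sketch follows, with an illustrative reference strain rather than the paper's calibrated value.

      import numpy as np

      def secant_modulus(g0, gamma, gamma_ref):
          """Hyperbolic degradation: G(gamma) = G0 / (1 + |gamma| / gamma_ref)."""
          return g0 / (1.0 + np.abs(gamma) / gamma_ref)

      gamma = np.logspace(-6, -2, 5)             # shear strain levels
      print(secant_modulus(50e6, gamma, 1e-4))   # G0 = 50 MPa, gamma_ref illustrative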

  1. Analysis Method for Laterally Loaded Pile Groups Using an Advanced Modeling of Reinforced Concrete Sections

    PubMed Central

    2018-01-01

    A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction by increasing the stiffness of shallow portions of soil and modeled using the Modified Kovacs model; pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile groups analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing results from data from full scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed by reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ. PMID:29462857

  2. Sophisticated approval voting, ignorance priors, and plurality heuristics: a behavioral social choice analysis in a Thurstonian framework.

    PubMed

    Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia

    2007-10-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.

  3. Real-time color-based texture analysis for sophisticated defect detection on wooden surfaces

    NASA Astrophysics Data System (ADS)

    Polzleitner, Wolfgang; Schwingshakl, Gert

    2004-10-01

    We describe a scanning system developed for the classification and grading of surfaces of wooden tiles. The system uses color imaging sensors to analyse the surfaces of either hard- or softwood material in terms of the texture formed by grain lines (orientation, spatial frequency, and color), various types of colorization, and other defects like knots, heart wood, cracks, holes, etc. The analysis requires two major tracks: the assignment of a tile to its texture class (like A, B, C, 1, 2, 3, Waste), and the detection of defects that decrease the commercial value of the tile (heart wood, knots, etc.). The system was initially developed under the international IMS program (Intelligent Manufacturing Systems) by an industry consortium. During the last two years it has been further developed, and several industrial systems have been installed, and are presently used in production of hardwood flooring. The methods implemented reflect some of the latest developments in the field of pattern recognition: genetic feature selection, two-dimensional second order statistics, special color space transforms, and classification by neural networks. In the industrial scenario we describe, many of the features defining a class cannot be described mathematically. Consequently a focus was the design of a learning architecture, where prototype texture samples are presented to the system, which then automatically finds the internal representation necessary for classification. The methods used in this approach have a wide applicability to problems of inspection, sorting, and optimization of high-value material typically used in the furniture, flooring, and related wood manufacturing industries.

  4. A method for the automated detection of phishing websites through both site characteristics and image analysis

    NASA Astrophysics Data System (ADS)

    White, Joshua S.; Matthews, Jeanna N.; Stacy, John L.

    2012-06-01

    Phishing website analysis is largely still a time-consuming manual process of discovering potential phishing sites, verifying if suspicious sites truly are malicious spoofs and, if so, distributing their URLs to the appropriate blacklisting services. Attackers increasingly use sophisticated systems for bringing phishing sites up and down rapidly at new locations, making automated response essential. In this paper, we present a method for rapid, automated detection and analysis of phishing websites. Our method relies on near real-time gathering and analysis of URLs posted on social media sites. We fetch the pages pointed to by each URL and characterize each page with a set of easily computed values such as number of images and links. We also capture a screen-shot of the rendered page image, compute a hash of the image and use the Hamming distance between these image hashes as a form of visual comparison. We provide initial results that demonstrate the feasibility of our techniques by comparing legitimate sites to known fraudulent versions from Phishtank.com, by actively introducing a series of minor changes to a phishing toolkit captured in a local honeypot, and by performing some initial analysis on a set of over 2.8 million URLs posted to Twitter over 4 days in August 2011. We discuss the issues encountered during our testing, such as resolvability and legitimacy of URLs posted on Twitter, the data sets used, the characteristics of the phishing sites we discovered, and our plans for future work.
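
    The screenshot-comparison step can be reproduced with off-the-shelf perceptual hashing. Below is a minimal sketch using the third-party imagehash library (not the authors' code); the file names and threshold are hypothetical.

      from PIL import Image
      import imagehash

      h_suspect = imagehash.average_hash(Image.open("suspect_page.png"))
      h_known = imagehash.average_hash(Image.open("known_phish.png"))

      # imagehash overloads subtraction to return the Hamming distance between hashes.
      if h_suspect - h_known <= 5:   # illustrative threshold
          print("visually near-identical: likely the same phishing kit")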

  5. Descriptive sensory analysis in different classes of orange juice by a robust free-choice profile method.

    PubMed

    Pérez Aparicio, Jesús; Toledano Medina, M Angeles; Lafuente Rosales, Victoria

    2007-07-09

    Free-choice profile (FCP), developed in the 1980s, is a sensory analysis method that can be carried out by untrained panels. The participants need only to be able to use a scale and be consumers of the product under evaluation. The data are analysed by sophisticated statistical methodologies like Generalized Procrustean Analysis (GPA) or STATIS. To facilitate a wider use of the free-choice profiling procedure, different authors have advocated simpler methods based on principal components analysis (PCA) of merged data sets. The purpose of this work was to apply another easy procedure to this type of data by means of a robust PCA. The most important characteristic of the proposed method is that managers responsible for quality could use this methodology without any scale evaluation. Only the free terms generated by the assessors are necessary to apply the script, thus avoiding the error associated with scale utilization by inexpert assessors. Also, it is possible to use the application with missing data and with differences in the assessors' attendance at sessions. An example was performed to generate the descriptors from different orange juice types. The results were compared with the STATIS method and with the PCA on the merged data sets. The samples evaluated were fresh orange juices with differences in storage days and pasteurized, concentrated and orange nectar drinks from different brands. Eighteen assessors with a low-level training program were used in a six-session free-choice profile framework. The results proved that this script could be of use in marketing decisions and product quality program development.
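
    The "PCA on merged data sets" route that the authors compare against can be sketched in a few lines. Note this uses ordinary PCA, whereas the paper advocates a robust variant, and the term matrix is a hypothetical placeholder.

      import numpy as np
      from sklearn.decomposition import PCA

      # Rows = juice samples, columns = free-choice terms pooled across assessors.
      term_matrix = np.load("fcp_terms.npy")    # hypothetical indicator/count matrix

      scores = PCA(n_components=2).fit_transform(term_matrix)
      print(scores)                             # sample map in the first two components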

  6. The Model Optimization, Uncertainty, and SEnsitivity analysis (MOUSE) toolbox: overview and application

    USDA-ARS?s Scientific Manuscript database

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  7. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task.

    PubMed

    Akam, Thomas; Costa, Rui; Dayan, Peter

    2015-12-01

    The recently developed 'two-step' behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects' investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues.
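
    A minimal sketch of the kind of purely model-free agent whose behaviour can masquerade as model-based on this task: a Q-learning rule that credits only the chosen first-stage action and ignores the transition structure. The transition and reward probabilities are illustrative, not the published task settings.

      import numpy as np

      rng = np.random.default_rng(0)
      q = np.zeros(2)                  # values of the two first-stage actions
      alpha, epsilon = 0.1, 0.1
      p_common = 0.7                   # action i leads to state i with probability 0.7
      p_reward = np.array([0.8, 0.2])  # reward probability in each second-stage state

      for _ in range(1000):
          a = int(rng.integers(2)) if rng.random() < epsilon else int(np.argmax(q))
          state = a if rng.random() < p_common else 1 - a   # common vs rare transition
          r = float(rng.random() < p_reward[state])
          q[a] += alpha * (r - q[a])   # model-free update: no use of the transition model

      print(q)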

  8. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task

    PubMed Central

    Akam, Thomas; Costa, Rui; Dayan, Peter

    2015-01-01

    The recently developed ‘two-step’ behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects’ investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues. PMID:26657806

  9. Urban Image Classification: Per-Pixel Classifiers, Sub-Pixel Analysis, Object-Based Image Analysis, and Geospatial Methods (Chapter 10)

    NASA Technical Reports Server (NTRS)

    Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.

    2013-01-01

    Remote sensing methods used to generate base maps to analyze the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within that pixel. They are used for simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems to assign urban land covers. Researchers recognize, however, that even with the smallest pixel size the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process. The primary difference though is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post
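
    As a concrete instance of the per-pixel indices mentioned above, NDVI is a one-line computation per pixel; the band rasters below are hypothetical placeholders.

      import numpy as np

      red = np.load("red_band.npy").astype(float)   # hypothetical reflectance rasters
      nir = np.load("nir_band.npy").astype(float)

      ndvi = (nir - red) / (nir + red + 1e-9)       # epsilon avoids division by zero
      vegetation_mask = ndvi > 0.3                  # illustrative threshold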

  10. Overview and application of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) toolbox

    USDA-ARS?s Scientific Manuscript database

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  11. Analysis of Qualitative Interviews about the Impact of Information Technology on Pressure Ulcer Prevention Programs: Implications for the Wound Ostomy Continence Nurse

    PubMed Central

    Shepherd, Marilyn Murphy; Wipke-Tevis, Deidre D.; Alexander, Gregory L.

    2015-01-01

    Purpose: The purpose of this study was to compare pressure ulcer prevention programs in 2 long-term care (LTC) facilities with diverse Information Technology Sophistication (ITS), one with high sophistication and one with low sophistication, and to identify implications for the Wound Ostomy Continence (WOC) Nurse. Design: Secondary analysis of narrative data obtained from a mixed-methods study. Subjects and Setting: The study setting was 2 LTC facilities in the Midwestern United States. The sample comprised 39 staff from the 2 facilities, including 26 from the high-ITS facility and 13 from the low-ITS facility. Respondents included Certified Nurse Assistants, Certified Medical Technicians, Restorative Medical Technicians, Social Workers, Registered Nurses, Licensed Practical Nurses, Information Technology staff, Administrators, and Directors. Methods: This study is a secondary analysis of interviews regarding communication and education strategies in two long-term care agencies. This analysis focused on focus group interviews, which included both direct and non-direct care providers. Results: Eight themes (codes) were identified in the analysis. Three themes are presented individually with exemplars of communication and education strategies. The analysis revealed specific differences between the high-ITS and low-ITS facilities in regard to education and communication involving pressure ulcer prevention. These differences have direct implications for WOC nurses consulting in the LTC setting. Conclusions: Findings from this study suggest that effective strategies for staff education and communication regarding PU prevention differ based on the level of ITS within a given facility. Specific strategies for education and communication are suggested for agencies with high ITS and agencies with low ITS sophistication. PMID:25945822

  12. BOOK REVIEW: Vortex Methods: Theory and Practice

    NASA Astrophysics Data System (ADS)

    Cottet, G.-H.; Koumoutsakos, P. D.

    2001-03-01

    The book Vortex Methods: Theory and Practice presents a comprehensive account of the numerical technique for solving fluid flow problems. It provides a very nice balance between the theoretical development and analysis of the various techniques and their practical implementation. In fact, the presentation of the rigorous mathematical analysis of these methods instills confidence in their implementation. The book goes into some detail on the more recent developments that attempt to account for viscous effects, in particular the presence of viscous boundary layers in some flows of interest. The presentation is very readable, with most points illustrated with well-chosen examples, some quite sophisticated. It is a very worthy reference book that should appeal to a large body of readers, from those interested in the mathematical analysis of the methods to practitioners of computational fluid dynamics. The use of the book as a text is compromised by its lack of exercises for students, but it could form the basis of a graduate special topics course. Juan Lopez

  13. A Sophisticated Architecture Is Indeed Necessary for the Implementation of Health in All Policies but not Enough

    PubMed Central

    Breton, Eric

    2016-01-01

    In this commentary, I argue that beyond a sophisticated supportive architecture to facilitate implementation of actions on the social determinants of health (SDOH) and health inequities, the Health in All Policies (HiAP) project faces two main barriers: lack of awareness within policy networks on the social determinants of population health, and a tendency of health actors to neglect investing in other sectors’ complex problems. PMID:27285517

  14. Highly Sophisticated Virtual Laboratory Instruments in Education

    NASA Astrophysics Data System (ADS)

    Gaskins, T.

    2006-12-01

    Many areas of science have advanced or stalled according to the ability to see what cannot normally be seen. Visual understanding has been key to many of the world's greatest breakthroughs, such as the discovery of DNA's double helix. Scientists use sophisticated instruments to see what the human eye cannot. Light microscopes, scanning electron microscopes (SEM), spectrometers, and atomic force microscopes are employed to examine and learn the details of the extremely minute. It is rare that students prior to university have access to such instruments, or are granted full ability to probe and magnify as desired. Virtual Lab, by providing highly authentic software instruments and comprehensive imagery of real specimens, provides them this opportunity. Virtual Lab's instruments let explorers operate virtual devices on a personal computer to examine real specimens. Exhaustive sets of images, systematically and robotically photographed at thousands of positions and multiple magnifications and focal points, allow students to zoom in and focus on the most minute detail of each specimen. Controls on each Virtual Lab device interactively and smoothly move the viewer through these images to display the specimen as the instrument saw it. Users control position, magnification, focal length, filters, and other parameters. Energy dispersion spectrometry is combined with SEM imagery to enable exploration of chemical composition at minute scale and arbitrary location. Annotation capabilities allow scientists, teachers, and students to indicate important features or areas. Virtual Lab is a joint project of NASA and the Beckman Institute at the University of Illinois at Urbana-Champaign. Four instruments currently compose the Virtual Lab suite: a scanning electron microscope and companion energy dispersion spectrometer, a high-power light microscope, and a scanning probe microscope that captures surface properties to the level of atoms. Descriptions of instrument operating principles and

  15. Yersinia virulence factors - a sophisticated arsenal for combating host defences

    PubMed Central

    Atkinson, Steve; Williams, Paul

    2016-01-01

    The human pathogens Yersinia pseudotuberculosis and Yersinia enterocolitica cause enterocolitis, while Yersinia pestis is responsible for pneumonic, bubonic, and septicaemic plague. All three share an infection strategy that relies on a virulence factor arsenal to enable them to enter, adhere to, and colonise the host while evading host defences to avoid untimely clearance. Their arsenal includes a number of adhesins that allow the invading pathogens to establish a foothold in the host and to adhere to specific tissues later during infection. When the host innate immune system has been activated, all three pathogens produce a structure analogous to a hypodermic needle. In conjunction with the translocon, which forms a pore in the host membrane, the channel that is formed enables the transfer of six ‘effector’ proteins into the host cell cytoplasm. These proteins mimic host cell proteins but are more efficient than their native counterparts at modifying the host cell cytoskeleton, triggering the host cell suicide response. Such a sophisticated arsenal ensures that yersiniae maintain the upper hand despite the best efforts of the host to counteract the infecting pathogen. PMID:27347390

  16. Aspects of Lexical Sophistication in Advanced Learners' Oral Production: Vocabulary Acquisition and Use in L2 French and Italian

    ERIC Educational Resources Information Center

    Bardel, Camilla; Gudmundson, Anna; Lindqvist, Christina

    2012-01-01

    This article reports on the design and use of a profiler for lexical sophistication (i.e., use of advanced vocabulary), which was created to assess the lexical richness of intermediate and advanced Swedish second language (L2) learners' French and Italian. It discusses how teachers' judgments (TJs) of word difficulty can contribute to the…

  17. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969
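
    A minimal sketch of the kind of simulation the first study describes: recovering known clusters from synthetic binary "code" data with a sample of 50, comparing hierarchical clustering and K-means. The data-generation settings, libraries (NumPy, SciPy, scikit-learn), and fidelity level are assumptions of this illustration, not the authors' code; latent class analysis is omitted because it is not available in these libraries.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from sklearn.cluster import KMeans
      from sklearn.metrics import adjusted_rand_score

      rng = np.random.default_rng(0)
      n, p, k = 50, 12, 3                      # participants, binary codes, clusters
      true = rng.integers(0, k, n)             # true cluster memberships
      profiles = rng.random((k, p)) < 0.5      # prototype code profiles per cluster
      # each participant endorses each code with 85% fidelity to their prototype
      X = np.where(rng.random((n, p)) < 0.85,
                   profiles[true], ~profiles[true]).astype(int)

      # hierarchical clustering on binary profiles (Jaccard distance, average linkage)
      Z = linkage(X, method="average", metric="jaccard")
      h_labels = fcluster(Z, t=k, criterion="maxclust")

      # K-means simply treats the 0/1 codes as coordinates
      km_labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

      print("hierarchical ARI:", adjusted_rand_score(true, h_labels))
      print("k-means ARI     :", adjusted_rand_score(true, km_labels))

    Adjusted Rand index against the known memberships stands in for the "accuracy of cluster assignment" the abstract reports; rerunning with different seeds gives a feel for how stable the recovery is at n = 50.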

  18. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.

  19. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  20. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  1. Turning up the heat on aircraft structures. [design and analysis for high-temperature conditions]

    NASA Technical Reports Server (NTRS)

    Dobyns, Alan; Saff, Charles; Johns, Robert

    1992-01-01

    An overview is presented of the current effort in design and development of aircraft structures to achieve the lowest cost for best performance. Enhancements in this area are focused on integrated design, improved design analysis tools, low-cost fabrication techniques, and more sophisticated test methods. 3D CAD/CAM data are becoming the medium through which design, manufacturing, and engineering communicate.

  2. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi

    PubMed Central

    de Sá, Fábio P.; Zina, Juliana; Haddad, Célio F. B.

    2016-01-01

    Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity. PMID:26760304

  3. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    PubMed

    de Sá, Fábio P; Zina, Juliana; Haddad, Célio F B

    2016-01-01

    Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  4. Background field removal technique based on non-regularized variable kernels sophisticated harmonic artifact reduction for phase data for quantitative susceptibility mapping.

    PubMed

    Kan, Hirohito; Arai, Nobuyuki; Takizawa, Masahiro; Omori, Kazuyoshi; Kasai, Harumasa; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2018-06-11

    We developed a non-regularized, variable kernel, sophisticated harmonic artifact reduction for phase data (NR-VSHARP) method to accurately estimate local tissue fields without regularization for quantitative susceptibility mapping (QSM). We then used a digital brain phantom to evaluate the accuracy of the NR-VSHARP method, and compared it with the VSHARP and iterative spherical mean value (iSMV) methods through in vivo human brain experiments. Our proposed NR-VSHARP method, which uses variable spherical mean value (SMV) kernels, minimizes L2 norms only within the volume of interest to reduce phase errors and save cortical information without regularization. In a numerical phantom study, relative local field and susceptibility map errors were determined using NR-VSHARP, VSHARP, and iSMV. Additionally, various background field elimination methods were used to image the human brain. In a numerical phantom study, the use of NR-VSHARP considerably reduced the relative local field and susceptibility map errors throughout a digital whole brain phantom, compared with VSHARP and iSMV. In the in vivo experiment, the NR-VSHARP-estimated local field could sufficiently achieve minimal boundary losses and phase error suppression throughout the brain. Moreover, the susceptibility map generated using NR-VSHARP minimized the occurrence of streaking artifacts caused by insufficient background field removal. Our proposed NR-VSHARP method yields minimal boundary losses and highly precise phase data. Our results suggest that this technique may facilitate high-quality QSM. Copyright © 2017. Published by Elsevier Inc.

  5. Background field removal technique using regularization enabled sophisticated harmonic artifact reduction for phase data with varying kernel sizes.

    PubMed

    Kan, Hirohito; Kasai, Harumasa; Arai, Nobuyuki; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2016-09-01

    An effective background field removal technique is desired for more accurate quantitative susceptibility mapping (QSM) prior to dipole inversion. The aim of this study was to evaluate the accuracy of the regularization enabled sophisticated harmonic artifact reduction for phase data with varying spherical kernel sizes (REV-SHARP) method using a three-dimensional head phantom and human brain data. The proposed REV-SHARP method used the spherical mean value operation and Tikhonov regularization in the deconvolution process, with varying 2-14mm kernel sizes. The kernel sizes were gradually reduced, similar to the SHARP with varying spherical kernel (VSHARP) method. We determined the relative errors and relationships between the true local field and estimated local field in REV-SHARP, VSHARP, projection onto dipole fields (PDF), and regularization enabled SHARP (RESHARP). A human experiment was also conducted using REV-SHARP, VSHARP, PDF, and RESHARP. The relative errors in the numerical phantom study were 0.386, 0.448, 0.838, and 0.452 for REV-SHARP, VSHARP, PDF, and RESHARP. The REV-SHARP result exhibited the highest correlation between the true local field and estimated local field. The linear regression slopes were 1.005, 1.124, 0.988, and 0.536 for REV-SHARP, VSHARP, PDF, and RESHARP in regions of interest on the three-dimensional head phantom. In human experiments, no obvious errors due to artifacts were present in REV-SHARP. The proposed REV-SHARP is a new method combining variable spherical kernel sizes with Tikhonov regularization. This technique may enable more accurate background field removal and help achieve better QSM accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
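
    The SHARP-family records above (NR-VSHARP, REV-SHARP, VSHARP) all build on the same spherical mean value (SMV) operation. The sketch below illustrates only that shared core step on a toy volume; it is not the published pipelines, which add varying kernel sizes, masking near the brain boundary, and deconvolution (with or without Tikhonov regularization). Function names and the kernel radius are illustrative assumptions.

      import numpy as np
      from scipy.ndimage import convolve

      def smv_kernel(radius):
          """Normalized binary sphere used for the spherical mean value."""
          r = int(radius)
          z, y, x = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1]
          k = ((x**2 + y**2 + z**2) <= r**2).astype(float)
          return k / k.sum()

      def smv_filter(total_field, radius=4):
          """Background suppression: f - SMV(f) annihilates harmonic background
          fields (mean value property); the deconvolution step that restores the
          unfiltered local field is omitted in this sketch."""
          smv = convolve(total_field, smv_kernel(radius), mode="nearest")
          return total_field - smv

      # toy example: smooth "background" gradient plus a compact "local" source
      field = np.zeros((32, 32, 32))
      field += np.linspace(-1, 1, 32)[:, None, None]
      field[14:18, 14:18, 14:18] += 0.5
      local = smv_filter(field, radius=4)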

  6. The architecture challenge: Future artificial-intelligence systems will require sophisticated architectures, and knowledge of the brain might guide their construction.

    PubMed

    Baldassarre, Gianluca; Santucci, Vieri Giuliano; Cartoni, Emilio; Caligiore, Daniele

    2017-01-01

    In this commentary, we highlight a crucial challenge posed by the proposal of Lake et al. to introduce key elements of human cognition into deep neural networks and future artificial-intelligence systems: the need to design effective sophisticated architectures. We propose that looking at the brain is an important means of facing this great challenge.

  7. Path Similarity Analysis: A Method for Quantifying Macromolecular Pathways

    PubMed Central

    Seyler, Sean L.; Kumar, Avishek; Thorpe, M. F.; Beckstein, Oliver

    2015-01-01

    Diverse classes of proteins function through large-scale conformational changes and various sophisticated computational algorithms have been proposed to enhance sampling of these macromolecular transition paths. Because such paths are curves in a high-dimensional space, it has been difficult to quantitatively compare multiple paths, a necessary prerequisite to, for instance, assess the quality of different algorithms. We introduce a method named Path Similarity Analysis (PSA) that enables us to quantify the similarity between two arbitrary paths and extract the atomic-scale determinants responsible for their differences. PSA utilizes the full information available in 3N-dimensional configuration space trajectories by employing the Hausdorff or Fréchet metrics (adopted from computational geometry) to quantify the degree of similarity between piecewise-linear curves. It thus completely avoids relying on projections into low dimensional spaces, as used in traditional approaches. To elucidate the principles of PSA, we quantified the effect of path roughness induced by thermal fluctuations using a toy model system. Using, as an example, the closed-to-open transitions of the enzyme adenylate kinase (AdK) in its substrate-free form, we compared a range of protein transition path-generating algorithms. Molecular dynamics-based dynamic importance sampling (DIMS) MD and targeted MD (TMD) and the purely geometric FRODA (Framework Rigidity Optimized Dynamics Algorithm) were tested along with seven other methods publicly available on servers, including several based on the popular elastic network model (ENM). PSA with clustering revealed that paths produced by a given method are more similar to each other than to those from another method and, for instance, that the ENM-based methods produced relatively similar paths. PSA applied to ensembles of DIMS MD and FRODA trajectories of the conformational transition of diphtheria toxin, a particularly challenging example, showed that
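
    A small sketch of the symmetric Hausdorff distance that PSA applies to trajectories, using SciPy's directed Hausdorff routine. The two toy 2-D paths below are illustrative stand-ins for 3N-dimensional MD trajectories; nothing here reproduces the PSA package itself.

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      t = np.linspace(0, 1, 100)
      path_a = np.column_stack([t, t**2])                            # one transition path
      path_b = np.column_stack([t, t**2 + 0.1 * np.sin(6 * np.pi * t)])  # a rougher variant

      # the Hausdorff metric is the max of the two directed distances
      d_ab = directed_hausdorff(path_a, path_b)[0]
      d_ba = directed_hausdorff(path_b, path_a)[0]
      hausdorff = max(d_ab, d_ba)
      print(f"Hausdorff distance between paths: {hausdorff:.4f}")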

  8. The Matrix Element Method: Past, Present, and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainer, James S.; Lykken, Joseph; Matchev, Konstantin T.

    2013-07-12

    The increasing use of multivariate methods, and in particular the Matrix Element Method (MEM), represents a revolution in experimental particle physics. With continued exponential growth in computing capabilities, the use of sophisticated multivariate methods--already common--will soon become ubiquitous and ultimately almost compulsory. While the existence of sophisticated algorithms for disentangling signal and background might naively suggest a diminished role for theorists, the use of the MEM, with its inherent connection to the calculation of differential cross sections, will benefit from collaboration between theorists and experimentalists. In this white paper, we will briefly describe the MEM and some of its recent uses, note some current issues and potential resolutions, and speculate about exciting future opportunities.

  9. Ontology-based coupled optimisation design method using state-space analysis for the spindle box system of large ultra-precision optical grinding machine

    NASA Astrophysics Data System (ADS)

    Wang, Qianren; Chen, Xing; Yin, Yuehong; Lu, Jian

    2017-08-01

    With the increasing complexity of mechatronic products, traditional empirical or step-by-step design methods are facing great challenges, with various factors and different stages having become inevitably coupled during the design process. Management of massive information or big data, as well as the efficient operation of information flow, is deeply involved in the process of coupled design. Designers have to address increasingly sophisticated situations when coupled optimisation is also engaged. Aiming at overcoming the difficulties involved in the design of the spindle box system of a large ultra-precision optical grinding machine, this paper proposes a coupled optimisation design method based on state-space analysis, with the design knowledge represented by ontologies and their semantic networks. An electromechanical coupled model integrating the mechanical structure, control system, and motor driving system is established, mainly concerning the stiffness matrices of the hydrostatic bearings, ball screw nut, and rolling guide sliders. The effectiveness and precision of the method are validated by simulation results for the natural frequency and deformation of the spindle box when an impact force is applied to the grinding wheel.
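
    The paper's coupled electromechanical model is far richer than can be shown here; this sketch illustrates only the core computation it validates against, natural frequencies of a lumped mass/stiffness model solved as a generalized eigenvalue problem. All matrix values are illustrative assumptions, not data from the paper.

      import numpy as np
      from scipy.linalg import eigh

      M = np.diag([120.0, 80.0, 45.0])             # kg: illustrative lumped masses
      K = np.array([[ 9.0e7, -4.0e7,  0.0   ],     # N/m: combined hydrostatic-bearing,
                    [-4.0e7,  7.0e7, -3.0e7 ],     # ball-screw-nut, and guide-slider
                    [ 0.0,   -3.0e7,  3.0e7 ]])    # stiffnesses (made-up values)

      # K v = w^2 M v; eigh solves the symmetric generalized eigenproblem
      w2, modes = eigh(K, M)
      freqs_hz = np.sqrt(w2) / (2 * np.pi)
      print("natural frequencies [Hz]:", np.round(freqs_hz, 1))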

  10. A rapid method for the sampling of atmospheric water vapour for isotopic analysis.

    PubMed

    Peters, Leon I; Yakir, Dan

    2010-01-01

    Analysis of the stable isotopic composition of atmospheric moisture is widely applied in the environmental sciences. Traditional methods for obtaining isotopic compositional data from ambient moisture have required complicated sampling procedures, expensive and sophisticated distillation lines, hazardous consumables, and lengthy treatments prior to analysis. Newer laser-based techniques are expensive and usually not suitable for large-scale field campaigns, especially in cases where access to mains power is not feasible or high spatial coverage is required. Here we outline the construction and usage of a novel vapour-sampling system based on a battery-operated Stirling cycle cooler, which is simple to operate, does not require any consumables or post-collection distillation, and is light-weight and highly portable. We demonstrate the ability of this system to reproduce δ18O isotopic compositions of ambient water vapour, with samples taken simultaneously by a traditional cryogenic collection technique. Samples were collected over 1 h directly into autosampler vials and were analysed by mass spectrometry after pyrolysis of 1 μL aliquots to CO. This yielded an average error of < ±0.5 per thousand, approximately equal to the signal-to-noise ratio of traditional approaches. This new system provides a rapid and reliable alternative to conventional cryogenic techniques, particularly in cases requiring high sample throughput or where access to distillation lines, slurry maintenance or mains power is not feasible. Copyright 2009 John Wiley & Sons, Ltd.

  11. A modified precise integration method for transient dynamic analysis in structural systems with multiple damping models

    NASA Astrophysics Data System (ADS)

    Ding, Zhe; Li, Li; Hu, Yujin

    2018-01-01

    Sophisticated engineering systems are usually assembled from subcomponents with significantly different levels of energy dissipation. Such damped systems therefore often contain multiple damping models, which leads to great difficulties in analysis. This paper aims at developing a time integration method for structural systems with multiple damping models. The dynamical system is first represented by a generally damped model. Based on this, a new extended state-space method for the damped system is derived. A modified precise integration method with Gauss-Legendre quadrature is then proposed. The numerical stability and accuracy of the proposed integration method are discussed in detail. It is verified that the method is conditionally stable and has inherent algorithmic damping, period error, and amplitude decay. Numerical examples are provided to assess the performance of the proposed method compared with other methods. It is demonstrated that the method is more accurate than other methods with rather good efficiency, and that the stability condition is easy to satisfy in practice.
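
    For orientation, here is a hedged sketch of the classic precise integration kernel for a homogeneous state-space system x' = Hx: the transition matrix exp(H·τ) is built by 2^N subdivision while tracking only the increment, to avoid round-off when adding to the identity. The paper's modified method additionally handles forcing via Gauss-Legendre quadrature and an extended state space for multiple damping models, neither of which is reproduced here; all numeric values are illustrative.

      import numpy as np

      def pim_exponential(H, tau, N=20, terms=4):
          """exp(H*tau) via 2^N subdivision, tracking Ta = exp(H*dt) - I."""
          m = 2 ** N
          dH = H * (tau / m)
          Ta = np.zeros_like(H)
          term = np.eye(H.shape[0])
          for k in range(1, terms + 1):        # truncated Taylor series of exp(dH) - I
              term = term @ dH / k
              Ta = Ta + term
          for _ in range(N):                   # doubling: (I + Ta)^2 = I + 2Ta + Ta^2
              Ta = 2 * Ta + Ta @ Ta
          return np.eye(H.shape[0]) + Ta

      # single-DOF damped oscillator in state-space form, x = [displacement, velocity]
      wn, zeta = 10.0, 0.05
      H = np.array([[0.0, 1.0], [-wn**2, -2 * zeta * wn]])
      T = pim_exponential(H, tau=0.01)         # one-step transition matrix
      x = np.array([1.0, 0.0])
      for _ in range(100):                     # march the free response 1 s forward
          x = T @ x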

  12. Statistical Analysis of Individual Participant Data Meta-Analyses: A Comparison of Methods and Recommendations for Practice

    PubMed Central

    Stewart, Gavin B.; Altman, Douglas G.; Askie, Lisa M.; Duley, Lelia; Simmonds, Mark C.; Stewart, Lesley A.

    2012-01-01

    Researchers considering undertaking an IPD meta-analysis should not necessarily be deterred by a perceived need for sophisticated statistical methods when combining information from large randomised trials. PMID:23056232

  13. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  14. Using Argument-Driven Inquiry to enhance students' argument sophistication when supporting a stance in the context of Socioscientific Issues

    NASA Astrophysics Data System (ADS)

    Grooms, Jonathon A.

    This quasi-experimental study assesses the extent to which the Argument-Driven Inquiry (ADI) instructional model enhances undergraduate students' abilities to generate quality arguments supporting their stance in the context of a Socioscientific Issue (SSI), as compared to students experiencing a traditional style of instruction. Enhancing the quality of undergraduate students' arguments in the context of SSI can serve as an indirect measure of their scientific literacy and their ability to make sound decisions on issues that are inherently scientific but also involve social implications. Data collected in this study suggest that the undergraduate students experiencing the ADI instruction more readily provide rationales in their arguments supporting their decisions regarding two SSI tasks as compared to a group of undergraduate students experiencing traditional instruction. This improvement in argument quality and gain in scientific literacy were achieved despite the overall lower SSI-related content knowledge of the ADI students. Furthermore, the gap between the argument quality of those students with high versus low SSI-related content knowledge was closed within the ADI group, while the same gap persisted post-intervention within the traditional instruction group. The role of students' epistemological sophistication was also investigated, which showed that neither instructional strategy was effective at shifting students' epistemological sophistication toward an evaluativist stance. However, the multiplists within the ADI group were able to significantly increase the sophistication of their arguments, whereas the traditional students were not. There were no differences in the quality of arguments generated by the evaluativist students in either the treatment or comparison group. Finally, the nature of the justifications used by the students revealed that the students (both comparison and treatment groups) did not invoke science-based justifications when

  15. On the Analysis Methods for the Time Domain and Frequency Domain Response of Buried Objects

    NASA Astrophysics Data System (ADS)

    Poljak, Dragan; Šesnić, Silvestar; Cvetković, Mario

    2014-05-01

    There has been a continuous interest in the analysis of ground-penetrating radar systems and related applications in civil engineering [1]. Consequently, a deeper insight into the scattering phenomena occurring in a lossy half-space, as well as the development of sophisticated numerical methods based on the Finite Difference Time Domain (FDTD) method, Finite Element Method (FEM), Boundary Element Method (BEM), Method of Moments (MoM), and various hybrid methods, is required, e.g. [2], [3]. The present paper deals with certain techniques for time and frequency domain analysis, respectively, of buried conducting and dielectric objects. Time domain analysis is related to the assessment of the transient response of a horizontal straight thin wire buried in a lossy half-space using a rigorous antenna theory (AT) approach. The AT approach is based on the space-time integral equation of the Pocklington type (the time domain electric field integral equation for thin wires). The influence of the earth-air interface is taken into account via the simplified reflection coefficient arising from the Modified Image Theory (MIT). The obtained results for the transient current induced along the electrode due to the transmitted plane wave excitation are compared to the numerical results calculated via an approximate transmission line (TL) approach and the AT approach based on the space-frequency variant of the Pocklington integro-differential equation, respectively. It is worth noting that the space-frequency Pocklington equation is numerically solved via the Galerkin-Bubnov variant of the Indirect Boundary Element Method (GB-IBEM) and the corresponding transient response is obtained by the aid of the inverse fast Fourier transform (IFFT). The results calculated by means of the different approaches agree satisfactorily. Frequency domain analysis is related to the assessment of the frequency domain response of a dielectric sphere using the full wave model based on the set of coupled electric field integral

  16. Sophisticated epistemologies of physics versus high-stakes tests: How do elite high school students respond to competing influences about how to learn physics?

    NASA Astrophysics Data System (ADS)

    Yerdelen-Damar, Sevda; Elby, Andrew

    2016-06-01

    This study investigates how elite Turkish high school physics students claim to approach learning physics when they are simultaneously (i) engaged in a curriculum that led to significant gains in their epistemological sophistication and (ii) subject to a high-stakes college entrance exam. Students reported taking surface (rote) approaches to learning physics, largely driven by college entrance exam preparation and therefore focused on algorithmic problem solving at the expense of exploring concepts and real-life examples more deeply. By contrast, in recommending study strategies to "Arzu," a hypothetical student who doesn't need to take a college entrance exam and just wants to understand physics deeply, the students focused more on linking concepts and real-life examples and on making sense of the formulas and concepts—deep approaches to learning that reflect somewhat sophisticated epistemologies. These results illustrate how students can epistemically compartmentalize, consciously taking different epistemic stances—different views of what counts as knowing and learning—in different contexts even within the same discipline.

  17. Factorization method of quadratic template

    NASA Astrophysics Data System (ADS)

    Kotyrba, Martin

    2017-07-01

    Multiplication of two numbers is a one-way function in mathematics. Any attempt to decompose the product back into its factors is called factorization. There are many methods, such as Fermat's factorization, Dixon's method, the quadratic sieve, and GNFS, which use sophisticated techniques for fast factorization. All of the above methods use the same basic formula, differing only in how it is used. This article discusses a newly designed factorization method. Effective implementation of this method in programs is not the focus here; the article only presents the method and clearly defines its properties.
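
    The article's own quadratic-template method is only summarized above, so as a point of reference this sketch implements Fermat's factorization, one of the classical methods it cites: write an odd composite n as a^2 - b^2 = (a - b)(a + b). The test value is illustrative.

      from math import isqrt

      def fermat_factor(n):
          """Return a nontrivial factor pair of an odd composite n."""
          a = isqrt(n)
          if a * a < n:
              a += 1
          while True:
              b2 = a * a - n
              b = isqrt(b2)
              if b * b == b2:                  # a^2 - n is a perfect square
                  return a - b, a + b
              a += 1

      print(fermat_factor(5959))               # (59, 101)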

  18. Preliminary design methods for fiber reinforced composite structures employing a personal computer

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1986-01-01

    The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
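
    A hedged sketch of the classical lamination theory step the report's subroutine performs: computing an effective in-plane modulus for a symmetric laminate from ply properties. The ply data below are illustrative carbon/epoxy values and the layup is an assumption, not figures from the report.

      import numpy as np

      E1, E2, G12, nu12 = 140e9, 10e9, 5e9, 0.3    # Pa, ply-level properties
      nu21 = nu12 * E2 / E1
      den = 1 - nu12 * nu21
      Q = np.array([[E1 / den, nu12 * E2 / den, 0],  # reduced ply stiffness
                    [nu12 * E2 / den, E2 / den, 0],
                    [0, 0, G12]])

      def qbar(theta_deg):
          """Transform the reduced stiffness Q to ply angle theta (Jones's convention)."""
          c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
          T = np.array([[c*c, s*s, 2*c*s],
                        [s*s, c*c, -2*c*s],
                        [-c*s, c*s, c*c - s*s]])
          R = np.diag([1.0, 1.0, 2.0])             # engineering-strain correction
          return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

      plies = [0, 45, -45, 90, 90, -45, 45, 0]     # symmetric layup (illustrative)
      t_ply = 0.125e-3                             # m, ply thickness
      A = sum(qbar(th) for th in plies) * t_ply    # in-plane stiffness matrix
      h = t_ply * len(plies)
      a = np.linalg.inv(A)
      Ex_eff = 1.0 / (h * a[0, 0])                 # effective laminate modulus
      print(f"effective Ex = {Ex_eff / 1e9:.1f} GPa")

    The effective modulus can then feed the familiar isotropic formulas the report describes, with the usual caveat that the result is a preliminary approximation.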

  19. Sophisticated Epistemologies of Physics versus High-Stakes Tests: How Do Elite High School Students Respond to Competing Influences about How to Learn Physics?

    ERIC Educational Resources Information Center

    Yerdelen-Damar, Sevda; Elby, Andrew

    2016-01-01

    This study investigates how elite Turkish high school physics students claim to approach learning physics when they are simultaneously (i) engaged in a curriculum that led to significant gains in their epistemological sophistication and (ii) subject to a high-stakes college entrance exam. Students reported taking surface (rote) approaches to…

  20. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in the...

  1. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in the...

  2. Analysis of the safety evaluation for premarketing clinical trials of hemodialyzer and of postmarketing safety reports of hemodialyzer in Japan and the US: insights into the construction of a sophisticated premarketing evaluation.

    PubMed

    Saito, Masami; Iwasaki, Kiyotaka

    2017-03-01

    Our aim was to conduct a scoping review of the regulations for the safety evaluation of hemodialyzers in Japan and the United States, and to evaluate the criteria for premarketing clinical trials and postmarketing safety reports to inform the development of a sophisticated premarketing evaluation in Japan. Regulations for approval of hemodialyzers were identified from the databases of the Ministry of Health, Labour and Welfare in Japan and the Food and Drug Administration (FDA) in the United States (US). The criteria for premarketing clinical trials and postmarketing safety reports were evaluated for both countries. Standards in Japan required evaluation of blood compatibility and reporting of acute adverse effects in a premarketing clinical trial in 6 of 86 applications with semipermeable membrane materials deemed to be different from those of previously approved devices from 1983 to 31 August 2015. By comparison, a clinical trial was required in only 1 of 545 approvals in the US from 1976 to 29 January 2016, and blood compatibility was not the focus of evaluation. All postmarketing adverse effects identified in Japan were included in the set of 'warnings'. The more stringent requirements for evaluation of blood compatibility and acute adverse effects in Japan seemed to be related to differences between the two countries in the history of quality management systems for medical devices. This study revealed that Japan and the US differ in when they require premarketing clinical trials for hemodialyzers. Our findings could be useful for constructing a sophisticated premarketing safety evaluation.

  3. When not to copy: female fruit flies use sophisticated public information to avoid mated males

    NASA Astrophysics Data System (ADS)

    Loyau, Adeline; Blanchet, Simon; van Laere, Pauline; Clobert, Jean; Danchin, Etienne

    2012-10-01

    Semen limitation (lack of semen to fertilize all of a female's eggs) imposes high fitness costs on female partners. Females should therefore avoid mating with semen-limited males. This can be achieved by using public information extracted from watching individual males' previous copulating activities. This adaptive preference should be flexible, given that semen limitation is temporary. We first demonstrate that the number of offspring produced by male Drosophila melanogaster gradually decreases over successive copulations. We then show that females avoid mating with males they have just watched copulating and that visual public cues are sufficient to elicit this response. Finally, after males were given time to replenish their sperm reserves, females no longer avoided the males they had previously seen copulating. These results suggest that female fruit flies may have evolved sophisticated behavioural processes of resistance to semen-limited males, and demonstrate unsuspected adaptive context-dependent mate choice in an invertebrate.

  4. A Method for Cognitive Task Analysis

    DTIC Science & Technology

    1992-07-01

    A method for cognitive task analysis is described based on the notion of 'generic tasks'. The method distinguishes three layers of analysis. At the...model for applied areas such as the development of knowledge-based systems and training, are discussed. Keywords: Problem solving, Cognitive Task Analysis, Knowledge, Strategies.

  5. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Methods of analysis. 163.5 Section 163.5 Food and... CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by the following methods of analysis prescribed in “Official Methods...

  6. Trial Sequential Methods for Meta-Analysis

    ERIC Educational Resources Information Center

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  7. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    PubMed

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.
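
    The paper's three-step network approach is considerably more elaborate than can be shown here; as a simpler point of reference, this sketch derives a rank order from a win-loss conflict matrix using David's scores, a standard dominance index (not the authors' method). The 4x4 conflict counts are made up.

      import numpy as np

      W = np.array([[0, 6, 8, 9],                  # W[i, j] = wins of i over j
                    [2, 0, 5, 7],
                    [1, 3, 0, 4],
                    [0, 1, 2, 0]], dtype=float)

      n = W + W.T                                  # total contests per dyad
      with np.errstate(invalid="ignore", divide="ignore"):
          P = np.where(n > 0, W / n, 0.0)          # dyadic win proportions

      w = P.sum(axis=1)                            # first-order success
      w2 = P @ w                                   # success weighted by rivals' success
      loss = P.sum(axis=0)                         # first-order losses
      loss2 = P.T @ loss
      DS = w + w2 - loss - loss2                   # David's score
      print("rank order (best first):", np.argsort(-DS))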

  8. Increasing signal processing sophistication in the calculation of the respiratory modulation of the photoplethysmogram (DPOP).

    PubMed

    Addison, Paul S; Wang, Rui; Uribe, Alberto A; Bergese, Sergio D

    2015-06-01

    DPOP (∆POP or Delta-POP) is a non-invasive parameter which measures the strength of respiratory modulations present in the pulse oximetry photoplethysmogram (pleth) waveform. It has been proposed as a non-invasive surrogate parameter for pulse pressure variation (PPV) used in the prediction of the response to volume expansion in hypovolemic patients. Many groups have reported on the DPOP parameter and its correlation with PPV using various semi-automated algorithmic implementations. The study reported here demonstrates the performance gains made by adding increasingly sophisticated signal processing components to a fully automated DPOP algorithm. A DPOP algorithm was coded and its performance systematically enhanced through a series of code module alterations and additions. Each algorithm iteration was tested on data from 20 mechanically ventilated OR patients. Correlation coefficients and ROC curve statistics were computed at each stage. For the purposes of the analysis we split the data into a manually selected 'stable' region subset of the data containing relatively noise free segments and a 'global' set incorporating the whole data record. Performance gains were measured in terms of correlation against PPV measurements in OR patients undergoing controlled mechanical ventilation. Through increasingly advanced pre-processing and post-processing enhancements to the algorithm, the correlation coefficient between DPOP and PPV improved from a baseline value of R = 0.347 to R = 0.852 for the stable data set, and, correspondingly, R = 0.225 to R = 0.728 for the more challenging global data set. Marked gains in algorithm performance are achievable for manually selected stable regions of the signals using relatively simple algorithm enhancements. Significant additional algorithm enhancements, including a correction for low perfusion values, were required before similar gains were realised for the more challenging global data set.
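
    The pre- and post-processing enhancements the study reports are not described in enough detail to reproduce; this sketch shows only the core DPOP calculation commonly given in the literature, applied to synthetic beat-by-beat pleth pulse amplitudes ("POP") over one respiratory cycle. The beat count and modulation depth are illustrative assumptions.

      import numpy as np

      def dpop(pop_amplitudes):
          """DPOP = (POPmax - POPmin) / mean(POPmax, POPmin), in fractional units."""
          pop = np.asarray(pop_amplitudes, dtype=float)
          return (pop.max() - pop.min()) / ((pop.max() + pop.min()) / 2.0)

      # synthetic respiratory modulation: 5 beats per breath, ~8% amplitude swing
      beats_per_breath = 5
      phase = 2 * np.pi * np.arange(beats_per_breath) / beats_per_breath
      pop = 1.0 + 0.08 * np.sin(phase)
      print(f"DPOP = {100 * dpop(pop):.1f}%")      # ~15% modulation here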

  9. Fabrication of sophisticated two-dimensional organic nanoarchitectures through hydrogen bond mediated molecular self-assembly

    NASA Astrophysics Data System (ADS)

    Silly, Fabien

    2012-02-01

    Complex supramolecular two-dimensional (2D) networks are attracting considerable interest as highly ordered functional materials for applications in nanotechnology. The challenge consists in tailoring the ordering of one or more molecular species into specific architectures over an extended length scale with molecular precision. Highly organized supramolecular arrays can be obtained through self-assembly of complementary molecules which can interlock via intermolecular interactions. Molecules forming hydrogen bonds (H-bonds) are especially interesting building blocks for creating sophisticated organic architectures due to the high selectivity and directionality of these bonds. We used scanning tunnelling microscopy to investigate at the atomic scale the formation of H-bonded 2D organic nanoarchitectures on surfaces. We mixed perylene derivatives, which have a rectangular shape, with melamine and a DNA base, which have triangular and nonsymmetric shapes, respectively. We observe that molecular substituents play a key role in the formation of multicomponent H-bonded architectures. We show that the 2D self-assembly of these molecules can be tailored by adjusting the temperature and molecular ratio. We used these stimuli to successfully create numerous close-packed and porous 2D multicomponent structures.

  10. Platonic Dialogue, Maieutic Method and Critical Thinking

    ERIC Educational Resources Information Center

    Leigh, Fiona

    2007-01-01

    In this paper I offer a reading of one of Plato's later works, the "Sophist", that reveals it to be informed by principles comparable on the face of it with those that have emerged recently in the field of critical thinking. As a development of the famous Socratic method of his teacher, I argue, Plato deployed his own pedagogical method, a…

  11. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.
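
    The patent abstract does not disclose its algorithm, so as an illustration of the same task, a common baseline flags an incoming document as a "new topic" when its best TF-IDF cosine similarity to the existing collection falls below a threshold. The corpus, query, and threshold below are all made up for the example.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      collection = [
          "reactor coolant pump vibration analysis",
          "steam generator tube inspection results",
          "coolant pump seal replacement schedule",
      ]
      incoming = "spear-phishing campaign targets plant operators"

      # vectorize the collection and the incoming document in one vocabulary
      vec = TfidfVectorizer().fit(collection + [incoming])
      sims = cosine_similarity(vec.transform([incoming]), vec.transform(collection))
      if sims.max() < 0.2:                         # illustrative novelty threshold
          print("new topic detected:", incoming)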

  12. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Methods of analysis. 133.5 Section 133.5 Food and... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis from...

  13. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis...

  14. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis...

  15. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis...

  16. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  17. Experimental design and quantitative analysis of microbial community multiomics.

    PubMed

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  18. Thermal image analysis using the serpentine method

    NASA Astrophysics Data System (ADS)

    Koprowski, Robert; Wilczyński, Sławomir

    2018-03-01

    Thermal imaging is an increasingly widespread alternative to other imaging methods. As a supplementary method in diagnostics, it can be used both statically and with dynamic temperature changes. The paper proposes a new image analysis method that allows for the acquisition of new diagnostic information as well as object segmentation. The proposed serpentine analysis uses known methods of image analysis and processing together with new ones proposed by the authors. Affine transformations of an image and subsequent Fourier analysis provide a new diagnostic quality. The method is fully repeatable and automatic and independent of inter-individual variability in patients. The segmentation results are 10% better than those obtained from the watershed method and the hybrid segmentation method based on the Canny detector. The first and second harmonics of serpentine analysis make it possible to determine the type of temperature changes in the region of interest (gradient, number of heat sources, etc.). The presented serpentine method provides new quantitative information from thermal imaging. Since it allows for image segmentation and designation of the contact points of two or more heat sources (local minima), it can be used to support medical diagnostics in many areas of medicine.
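
    A minimal sketch of the basic idea, under the assumption (stated above) that the analysis unrolls an image region along a serpentine path and reads off Fourier harmonics; the published method also applies affine transformations and further processing not reproduced here. The synthetic thermogram is illustrative.

      import numpy as np

      def serpentine_profile(img):
          """Concatenate rows left-to-right and right-to-left, alternately."""
          rows = [r if i % 2 == 0 else r[::-1] for i, r in enumerate(img)]
          return np.concatenate(rows)

      # synthetic 64x64 "thermogram": smooth gradient plus one warm spot
      y, x = np.mgrid[0:64, 0:64]
      img = 0.01 * x + np.exp(-((x - 40)**2 + (y - 20)**2) / 50.0)

      signal = serpentine_profile(img)
      spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
      print("1st harmonic:", spectrum[1], " 2nd harmonic:", spectrum[2])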

  19. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large-scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustics, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  20. The Delphi Method in Rehabilitation Counseling Research

    ERIC Educational Resources Information Center

    Vazquez-Ramos, Robinson; Leahy, Michael; Estrada Hernandez, Noel

    2007-01-01

    Rehabilitation researchers have found in the application of the Delphi method a more sophisticated way of obtaining consensus from experts in the field on certain matters. The application of this research methodology has affected and certainly advanced the body of knowledge of the rehabilitation counseling practice. However, the rehabilitation…

  1. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    PubMed Central

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-01-01

    Background: Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. Results: We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. Conclusion: SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877

  2. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with a lack of user-friendly graphical analysis tools, and therefore require sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations for sophisticated genetic analysis, but lack a graphical user interface, leaving them difficult for anyone but a professional statistician to utilise effectively. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses, and a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  3. Nonclinical dose formulation analysis method validation and sample analysis.

    PubMed

    Whitmire, Monica Lee; Bryan, Peter; Henry, Teresa R; Holbrook, John; Lehmann, Paul; Mollitor, Thomas; Ohorodnik, Susan; Reed, David; Wietgrefe, Holly D

    2010-12-01

    Nonclinical dose formulation analysis methods are used to confirm test article concentration and homogeneity in formulations and determine formulation stability in support of regulated nonclinical studies. There is currently no regulatory guidance for nonclinical dose formulation analysis method validation or sample analysis. Regulatory guidance for the validation of analytical procedures has been developed for drug product/formulation testing; however, verification of the formulation concentrations falls under the framework of GLP regulations (not GMP). The only current related regulatory guidance is the bioanalytical guidance for method validation. The fundamental parameters that overlap between bioanalysis and formulation analysis validations include: recovery, accuracy, precision, specificity, selectivity, carryover, sensitivity, and stability. Divergence in bioanalytical and drug product validations typically centers on the acceptance criteria used. Because dose formulation samples are not true "unknowns", quality control samples covering the entire range of the standard curve, which serve as the indication of confidence in the data generated from the "unknown" study samples, may not always be necessary. Also, the standard bioanalytical acceptance criteria may not be directly applicable, especially when the determined concentration does not match the target concentration. This paper attempts to reconcile the different practices being performed in the community and to provide recommendations of best practices and proposed acceptance criteria for nonclinical dose formulation method validation and sample analysis.
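
    The overlapping validation parameters above lend themselves to a simple worked example. The sketch below computes accuracy (as percent of nominal) and precision (as percent relative standard deviation) for hypothetical formulation replicates; the numbers and acceptance limits are illustrative assumptions, not values from the paper, which argues such criteria should be set case by case.

      import numpy as np

      # Hypothetical replicate concentrations (mg/mL) measured for one
      # dose formulation prepared at a nominal 5.0 mg/mL.
      nominal = 5.0
      replicates = np.array([4.82, 4.95, 5.10, 4.88, 5.02])

      accuracy_pct = replicates.mean() / nominal * 100                  # % of nominal
      precision_rsd = replicates.std(ddof=1) / replicates.mean() * 100  # %RSD

      # Hypothetical acceptance criteria (illustrative only).
      ACC_LIMITS = (90.0, 110.0)   # accuracy window, % of nominal
      MAX_RSD = 5.0                # precision limit, %RSD

      print(f"accuracy = {accuracy_pct:.1f}% of nominal")
      print(f"precision = {precision_rsd:.1f}% RSD")
      print("PASS" if ACC_LIMITS[0] <= accuracy_pct <= ACC_LIMITS[1]
            and precision_rsd <= MAX_RSD else "FAIL")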

  4. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
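
    The core of a distance-based readout is a calibration of signal length against concentration, inverted to quantify unknowns by eye and ruler. A minimal sketch follows, assuming a linear response over the working range; the lengths and concentrations are hypothetical.

      import numpy as np

      # Hypothetical calibration: visual bar lengths (mm) read from a
      # distance-based microfluidic device at known concentrations (uM).
      conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
      length_mm = np.array([3.1, 5.8, 11.5, 22.6, 44.9])

      # Fit a linear calibration length = a*conc + b (many such devices
      # are approximately linear over their working range).
      a, b = np.polyfit(conc, length_mm, 1)

      def concentration_from_length(mm):
          # Invert the calibration to quantify an unknown sample.
          return (mm - b) / a

      print(f"calibration: length = {a:.2f}*conc + {b:.2f}")
      print(f"unknown with 15.0 mm bar = {concentration_from_length(15.0):.2f} uM")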

  5. Was Euclid an Unnecessarily Sophisticated Psychologist?

    ERIC Educational Resources Information Center

    Arabie, Phipps

    1991-01-01

    The current state of multidimensional scaling using the city-block metric is reviewed, with attention to (1) substantive and theoretical issues; (2) recent algorithmic developments and their implications for analysis; (3) isometries with other metrics; (4) links to graph-theoretic models; and (5) prospects for future development. (SLD)

  6. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  7. The Impact of Services on Economic Complexity: Service Sophistication as Route for Economic Growth.

    PubMed

    Stojkoski, Viktor; Utkovski, Zoran; Kocarev, Ljupco

    2016-01-01

    Economic complexity reflects the amount of knowledge that is embedded in the productive structure of an economy. By combining tools from network science and econometrics, a robust and stable relationship between a country's productive structure and its economic growth has been established. Here we report that not only goods but also services are important for predicting the rate at which countries will grow. By adopting a terminology which classifies manufactured goods and delivered services as products, we investigate the influence of services on the country's productive structure. In particular, we provide evidence that complexity indices for services are in general higher than those for goods, which is reflected in a general tendency to rank countries with a developed service sector higher than countries whose economies centre on the manufacturing of goods. By focusing on country dynamics based on experimental data, we investigate the impact of services on the economic complexity of countries measured in the product space (consisting of both goods and services). Importantly, we show that the diversification of service exports and their sophistication can provide an additional route for economic growth in both developing and developed countries.
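
    Complexity indices of the kind used here are commonly computed with the Hidalgo-Hausmann "method of reflections", which alternately averages the diversity of countries and the ubiquity of products over a binary country-product matrix. The sketch below applies that generic recipe to toy data; it is not the authors' exact pipeline, and in the paper the product space includes services as well as goods.

      import numpy as np

      # Toy binary country x product matrix M (1 = country exports the
      # product with revealed comparative advantage). Rows: countries,
      # columns: products (here "products" would include services too).
      M = np.array([[1, 1, 1, 1, 0],
                    [1, 1, 0, 0, 0],
                    [0, 1, 1, 0, 1],
                    [1, 0, 0, 0, 0]], dtype=float)

      diversity = M.sum(axis=1)   # k_c,0: products each country exports
      ubiquity = M.sum(axis=0)    # k_p,0: countries exporting each product

      # Method of reflections: alternately average partner ubiquity and
      # diversity; a small even number of iterations is conventional.
      kc, kp = diversity.copy(), ubiquity.copy()
      for _ in range(8):
          kc, kp = M @ kp / diversity, M.T @ kc / ubiquity

      # Standardize to get a complexity ranking (higher = more complex).
      eci = (kc - kc.mean()) / kc.std()
      print("complexity scores per country:", np.round(eci, 2))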

  8. The Impact of Services on Economic Complexity: Service Sophistication as Route for Economic Growth

    PubMed Central

    Stojkoski, Viktor; Utkovski, Zoran; Kocarev, Ljupco

    2016-01-01

    Economic complexity reflects the amount of knowledge that is embedded in the productive structure of an economy. By combining tools from network science and econometrics, a robust and stable relationship between a country’s productive structure and its economic growth has been established. Here we report that not only goods but also services are important for predicting the rate at which countries will grow. By adopting a terminology which classifies manufactured goods and delivered services as products, we investigate the influence of services on the country’s productive structure. In particular, we provide evidence that complexity indices for services are in general higher than those for goods, which is reflected in a general tendency to rank countries with a developed service sector higher than countries whose economies centre on the manufacturing of goods. By focusing on country dynamics based on experimental data, we investigate the impact of services on the economic complexity of countries measured in the product space (consisting of both goods and services). Importantly, we show that the diversification of service exports and their sophistication can provide an additional route for economic growth in both developing and developed countries. PMID:27560133

  9. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P; Cowell, Andrew J; Gregory, Michelle L

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
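
    Read as an algorithm, the claim describes a weighted evidence-accumulation loop: indicators support or refute the hypothesis, each association carries a weight, and the weighted sum measures confidence in the hypothesis. A minimal sketch of that flow follows; the indicators and weights are hypothetical, not the patented implementation.

      # Minimal sketch of weighted evidence scoring for a hypothesis.
      # Each indicator either supports (+1) or refutes (-1) the
      # hypothesis; the weight encodes the strength of the association.
      evidence = [
          # (description, direction, weight in [0, 1])
          ("sensor reading matches prediction", +1, 0.8),
          ("independent report agrees",         +1, 0.6),
          ("control experiment disagrees",      -1, 0.9),
      ]

      score = sum(direction * weight for _, direction, weight in evidence)
      total = sum(weight for _, _, weight in evidence)

      # Normalized score in [-1, 1]: >0 leans supporting, <0 refuting.
      print(f"hypothesis support score: {score / total:+.2f}")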

  10. Thermal Analysis

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The University of Georgia used NASTRAN, a COSMIC program that predicts how a design will stand up under stress, to develop a model for monitoring the transient cooling of vegetables. The winter use of passive solar heating for poultry houses is also under investigation by the Agricultural Engineering Dept. Another study involved thermal analysis of black and green nursery containers. The use of NASTRAN has encouraged student appreciation of sophisticated computer analysis.

  11. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
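
    The static (lower-bound) formulation mentioned here is a linear program: maximize the load factor subject to equilibrium and plastic moment capacities. A minimal sketch follows, assuming a generic equilibrium matrix with illustrative numbers rather than a real frame.

      import numpy as np
      from scipy.optimize import linprog

      # Static (lower-bound) limit analysis as a linear program:
      #   maximize load factor lam
      #   subject to  B @ m = lam * f   (equilibrium)
      #               |m_i| <= mp_i     (plastic moment capacities)
      # Variables x = [m_1, m_2, m_3, lam]; numbers are illustrative.
      B = np.array([[1.0, -1.0, 0.0],
                    [0.0,  1.0, -1.0]])
      f = np.array([1.0, 0.5])          # reference load pattern
      mp = np.array([10.0, 8.0, 10.0])  # plastic capacities

      n = B.shape[1]
      c = np.zeros(n + 1)
      c[-1] = -1.0                          # minimize -lam = maximize lam
      A_eq = np.hstack([B, -f[:, None]])    # B m - lam f = 0
      b_eq = np.zeros(B.shape[0])
      bounds = [(-m, m) for m in mp] + [(0, None)]

      res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print(f"collapse load factor (lower bound): {res.x[-1]:.3f}")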

  12. Digital Movement Analysis in Physical Education

    ERIC Educational Resources Information Center

    Trout, Josh

    2013-01-01

    Mobile devices such as smartphones and tablets offer applications (apps) that make digital movement analysis simple and efficient in physical education. Highly sophisticated movement analysis software has been available for many years but has mainly appealed to coaches of elite athletes and biomechanists. Apps on mobile devices are less expensive…

  13. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides for analyzing the density of a ceramic by exciting a component on a surface/subsurface of the ceramic through exposure to excitation energy. The method may further include obtaining a measurement of the energy emitted from the component. The method may additionally include comparing the measurement of the emitted energy with a predetermined reference measurement so as to obtain a density for said ceramic.

  14. Overview of Non-Volatile Testing and Screening Methods

    NASA Technical Reports Server (NTRS)

    Irom, Farokh

    2001-01-01

    Testing methods for memories and non-volatile memories have become increasingly sophisticated as the devices become denser and more complex. Higher frequencies and faster rewrite times, as well as smaller feature sizes, have led to many testing challenges. This paper outlines several testing issues posed by novel memories and approaches to testing for radiation and reliability effects. We discuss methods for measurement of Total Ionizing Dose (TID).

  15. WebArray: an online platform for microarray data analysis

    PubMed Central

    Xia, Xiaoqin; McClelland, Michael; Wang, Yipeng

    2005-01-01

    Background Many cutting-edge microarray analysis tools and algorithms, including the commonly used limma and affy packages in Bioconductor, need sophisticated knowledge of mathematics, statistics and computer skills for implementation. Commercially available software can provide a user-friendly interface at considerable cost. To facilitate the use of these tools for microarray data analysis on an open platform, we developed an online microarray data analysis platform, WebArray, for bench biologists to use to explore data from single/dual color microarray experiments. Results The currently implemented functions are based on the limma and affy packages from Bioconductor, the spacings LOESS histogram (SPLOSH) method, a PCA-assisted normalization method and a genome mapping method. WebArray incorporates these packages and provides a user-friendly interface for accessing a wide range of key functions of limma and others, such as spot quality weighting, background correction, graphical plotting, normalization, linear modeling, empirical Bayes statistical analysis, false discovery rate (FDR) estimation, and chromosomal mapping for genome comparison. Conclusion WebArray offers a convenient platform for bench biologists to access several cutting-edge microarray data analysis tools. The website is freely available online. It runs on a Linux server with Apache and MySQL. PMID:16371165
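
    Of the listed functions, false discovery rate estimation is the easiest to show in miniature. The sketch below implements the standard Benjamini-Hochberg procedure on hypothetical per-gene p-values; WebArray's own FDR routine (via limma) may differ in detail.

      import numpy as np

      def benjamini_hochberg(pvals, alpha=0.05):
          # Benjamini-Hochberg FDR: return a boolean "significant" mask.
          p = np.asarray(pvals)
          order = np.argsort(p)
          ranked = p[order]
          m = len(p)
          # Largest k with p_(k) <= (k/m)*alpha; reject hypotheses 1..k.
          thresholds = (np.arange(1, m + 1) / m) * alpha
          below = ranked <= thresholds
          reject = np.zeros(m, dtype=bool)
          if below.any():
              k = np.nonzero(below)[0].max()
              reject[order[:k + 1]] = True
          return reject

      # Hypothetical per-gene p-values from a differential expression test.
      pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.5, 0.9]
      print(benjamini_hochberg(pvals))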

  16. Bioanalytical methods for food contaminant analysis.

    PubMed

    Van Emon, Jeanette M

    2010-01-01

    Foods are complex mixtures of lipids, carbohydrates, proteins, vitamins, organic compounds, and other naturally occurring substances. Sometimes added to this mixture are residues of pesticides, veterinary and human drugs, microbial toxins, preservatives, contaminants from food processing and packaging, and other residues. This milieu of compounds can pose difficulties in the analysis of food contaminants. There is an expanding need for rapid and cost-effective residue methods for difficult food matrixes to safeguard our food supply. Bioanalytical methods are established for many food contaminants such as mycotoxins and are the method of choice for many food allergens. Bioanalytical methods are often more cost-effective and sensitive than instrumental procedures. Recent developments in bioanalytical methods may provide more applications for their use in food analysis.

  17. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
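
    For orientation, the eigenvalue-based instability criterion can be checked directly: write the second-order equation in state-space form and test whether any eigenvalue has a positive real part. The sketch below estimates the probability of instability by brute-force Monte Carlo over hypothetical parameter distributions; the paper's contribution is precisely the faster alternatives (fast probability integration and adaptive importance sampling) to this baseline.

      import numpy as np

      rng = np.random.default_rng(0)

      def unstable(m, c, k):
          # The system m*x'' + c*x' + k*x = 0 is unstable when any
          # eigenvalue of the state-space (companion) matrix has Re > 0.
          A = np.array([[0.0, 1.0],
                        [-k / m, -c / m]])
          return np.any(np.linalg.eigvals(A).real > 0)

      # Hypothetical uncertainty: damping and stiffness scatter around
      # nominal values (illustrative distributions, not the paper's).
      N = 20_000
      c = rng.normal(0.05, 0.10, N)   # damping may go negative (e.g. cross-coupling)
      k = rng.normal(4.0, 0.5, N)

      p_inst = np.mean([unstable(1.0, ci, ki) for ci, ki in zip(c, k)])
      print(f"estimated probability of instability: {p_inst:.4f}")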

  18. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA, Agricultural...

  19. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural Marketing...

  20. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require tens of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
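
    A concrete example of a frugal method is one-at-a-time finite-difference sensitivity analysis, which needs only one extra model run per parameter. The sketch below uses a stand-in function and hypothetical parameter values; it illustrates the run budget, not any specific model from the paper.

      import numpy as np

      def model(params):
          # Stand-in for an expensive environmental model (hypothetical).
          k, s = params
          return k * np.exp(-s * np.linspace(0.0, 10.0, 50))

      base = np.array([2.0, 0.3])
      y0 = model(base)

      # One-at-a-time finite differences: n_params + 1 model runs total,
      # the kind of frugal budget (tens of runs) the abstract has in mind.
      rel = 0.01  # 1% relative perturbation
      scaled_sens = []
      for i in range(len(base)):
          p = base.copy()
          p[i] *= 1.0 + rel
          # Scaled sensitivity x_i * dy/dx_i = (model(p) - y0) / rel
          scaled_sens.append((model(p) - y0) / rel)

      for name, s in zip(["k", "s"], scaled_sens):
          print(f"mean |scaled sensitivity| wrt {name}: {np.abs(s).mean():.3f}")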

  1. Assessing semantic similarity of texts - Methods and algorithms

    NASA Astrophysics Data System (ADS)

    Rozeva, Anna; Zerkova, Silvia

    2017-12-01

    Assessing the semantic similarity of texts is an important part of many text-related applications, such as educational systems, information retrieval and text summarization. The task is performed by sophisticated analysis that implements text-mining techniques. Text mining involves several pre-processing steps, which yield a structured, representative model of the documents in a corpus by extracting and selecting the features that characterize their content. Generally the model is vector-based and enables further analysis with knowledge discovery approaches. Algorithms and measures are used for assessing texts at the syntactic and semantic level. An important text-mining method and similarity measure is latent semantic analysis (LSA). It provides for reducing the dimensionality of the document vector space and better capturing the text semantics. The mathematical background of LSA for deriving the meaning of words in a given text by exploring their co-occurrence is examined. The algorithm for obtaining the vector representation of words and their corresponding latent concepts in a reduced multidimensional space, as well as the similarity calculation, are presented.
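
    The LSA pipeline sketched above reduces to a truncated singular value decomposition of the term-document matrix followed by cosine similarity in the latent space. A minimal sketch with toy counts follows; real systems would first apply the pre-processing and weighting steps the authors describe.

      import numpy as np

      # Toy term-document count matrix (rows: terms, cols: documents).
      # Real pipelines would build this from pre-processed, weighted text.
      A = np.array([[2, 1, 0, 0],
                    [1, 2, 0, 0],
                    [0, 1, 1, 0],
                    [0, 0, 2, 1],
                    [0, 0, 1, 2]], dtype=float)

      # LSA: truncated SVD projects documents into a k-dimensional
      # latent concept space, reducing dimensionality and noise.
      U, s, Vt = np.linalg.svd(A, full_matrices=False)
      k = 2
      doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # documents in latent space

      def cosine(u, v):
          return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

      print("sim(doc0, doc1):", round(cosine(doc_vecs[0], doc_vecs[1]), 3))
      print("sim(doc0, doc3):", round(cosine(doc_vecs[0], doc_vecs[3]), 3))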

  2. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    NASA Astrophysics Data System (ADS)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphics user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

  3. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help insure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

  4. Implementation of physicochemical and sensory analysis in conjunction with multivariate analysis towards assessing olive oil authentication/adulteration.

    PubMed

    Arvanitoyannis, Ioannis S; Vlachos, Antonios

    2007-01-01

    The authenticity of products labeled as olive oils, and in particular as virgin olive oils, is a very important issue in terms of both health and commercial aspects. In view of the continuously increasing interest in the therapeutic properties of virgin olive oil, the traditional methods of characterization and physical and sensory analysis have been enriched with more advanced and sophisticated methods such as HPLC-MS, HPLC-GC/C/IRMS, RPLC-GC, DEPT, and CSIA, among others. The results of both traditional and "novel" methods were treated both by means of classical multivariate analysis (cluster, principal component, correspondence, canonical, and discriminant) and by artificial intelligence methods, showing that nowadays the adulteration of virgin olive oil with seed oil is detectable at very low percentages, sometimes even at less than 1%. Furthermore, the detection of the geographical origin of olive oil is equally feasible and much more accurate in countries like Italy and Spain where databases of physical/chemical properties exist. However, this geographical origin classification can also be accomplished in the absence of such databases, provided that an adequate number of oil samples are used and the parameters studied have "discriminating power."

  5. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = C S^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
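
    The factorization step can be illustrated compactly: alternating least squares solves for C and S under a nonnegativity constraint until D ≈ C S^T. The sketch below uses synthetic two-component data and plain projected ALS; the patent's weighting and unweighting steps, and its exact constraint handling, are omitted.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic data: D (n_pixels x n_channels) built from 2 known
      # components so the factorization can be checked; real inputs
      # would be weighted spectral images (e.g. SEM/EDS data).
      S_true = np.array([np.exp(-((np.arange(50) - 15) ** 2) / 20.0),
                         np.exp(-((np.arange(50) - 35) ** 2) / 30.0)]).T
      C_true = rng.uniform(0, 1, (200, 2))
      D = C_true @ S_true.T + rng.normal(0, 0.01, (200, 50))

      # Alternating least squares for D = C @ S.T with nonnegativity
      # imposed on both factors (a common constraint choice).
      S = rng.uniform(0, 1, (50, 2))   # random initial spectral shapes
      for _ in range(100):
          C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
          S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)

      print("relative residual:",
            np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))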

  6. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 1: Analysis methods

    NASA Technical Reports Server (NTRS)

    Waszak, M. R.; Schmidt, D. S.

    1985-01-01

    As aircraft become larger and lighter due to design requirements for increased payload and improved fuel efficiency, they will also become more flexible. For highly flexible vehicles, the handling qualities may not be accurately predicted by conventional methods. This study applies two analysis methods to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first type of analysis is an open loop model analysis technique. This method considers the effects of modal residue magnitudes on determining vehicle handling qualities. The second method is a pilot in the loop analysis procedure that considers several closed loop system characteristics. Volume 1 consists of the development and application of the two analysis methods described above.

  7. SWECS tower dynamics analysis methods and results

    NASA Technical Reports Server (NTRS)

    Wright, A. D.; Sexton, J. H.; Butterfield, C. P.; Thresher, R. M.

    1981-01-01

    Several different tower dynamics analysis methods and computer codes were used to determine the natural frequencies and mode shapes of both guyed and freestanding wind turbine towers. These analysis methods are described and the results for two types of towers, a guyed tower and a freestanding tower, are shown. The advantages and disadvantages in the use of and the accuracy of each method are also described.

  8. Quantum algorithms for topological and geometric analysis of data

    PubMed Central

    Lloyd, Seth; Garnerone, Silvano; Zanardi, Paolo

    2016-01-01

    Extracting useful information from large data sets can be a daunting task. Topological methods for analysing data sets provide a powerful technique for extracting such information. Persistent homology is a sophisticated tool for identifying topological features and for determining how such features persist as the data is viewed at different scales. Here we present quantum machine learning algorithms for calculating Betti numbers—the numbers of connected components, holes and voids—in persistent homology, and for finding eigenvectors and eigenvalues of the combinatorial Laplacian. The algorithms provide an exponential speed-up over the best currently known classical algorithms for topological data analysis. PMID:26806491
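
    For orientation, the simplest Betti number, Betti-0 at a fixed scale, is just the number of connected components of the neighborhood graph, which a classical union-find computes directly. The toy sketch below is that classical baseline, not the quantum algorithm; it illustrates the quantity being computed, not the exponential speed-up.

      import numpy as np

      def betti0(points, scale):
          # Betti-0 (number of connected components) of the Vietoris-Rips
          # graph at a given scale, via union-find.
          n = len(points)
          parent = list(range(n))

          def find(i):
              while parent[i] != i:
                  parent[i] = parent[parent[i]]  # path halving
                  i = parent[i]
              return i

          for i in range(n):
              for j in range(i + 1, n):
                  if np.linalg.norm(points[i] - points[j]) <= scale:
                      parent[find(i)] = find(j)
          return len({find(i) for i in range(n)})

      # Two well-separated clusters: Betti-0 is 2 at small scale, 1 at large.
      pts = np.array([[0, 0], [0.1, 0], [0, 0.1], [5, 5], [5.1, 5]])
      print(betti0(pts, 0.5), betti0(pts, 10.0))  # -> 2 1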

  9. A hybrid technique for speech segregation and classification using a sophisticated deep neural network

    PubMed Central

    Nawaz, Tabassam; Mehmood, Zahid; Rashid, Muhammad; Habib, Hafiz Adnan

    2018-01-01

    Recent research on speech segregation and music fingerprinting has led to improvements in speech segregation and music identification algorithms. Speech and music segregation generally involves the identification of music followed by speech segregation. However, music segregation becomes a challenging task in the presence of noise. This paper proposes a novel method of speech segregation for unlabelled stationary noisy audio signals using the deep belief network (DBN) model. The proposed method successfully segregates a music signal from noisy audio streams. A recurrent neural network (RNN)-based hidden layer segregation model is applied to remove stationary noise. Dictionary-based Fisher algorithms are employed for speech classification. The proposed method is tested on three datasets (TIMIT, MIR-1K, and MusicBrainz), and the results indicate the robustness of the proposed method for speech segregation. The qualitative and quantitative analyses carried out on the three datasets demonstrate the efficiency of the proposed method compared to state-of-the-art speech segregation and classification methods. PMID:29558485

  10. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  11. A catalog of automated analysis methods for enterprise models.

    PubMed

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated, so omissions or miscalculations are very likely. This situation has fostered research on automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  12. Harnessing psychoanalytical methods for a phenomenological neuroscience

    PubMed Central

    Cusumano, Emma P.; Raz, Amir

    2014-01-01

    Psychoanalysis proffers a wealth of phenomenological tools to advance the study of consciousness. Techniques for elucidating the structures of subjective life are sorely lacking in the cognitive sciences; as such, experiential reporting techniques must rise to meet both complex theories of brain function and increasingly sophisticated neuroimaging technologies. Analysis may offer valuable methods for bridging the gap between first-person and third-person accounts of the mind. Using both systematic observational approaches alongside unstructured narrative interactions, psychoanalysts help patients articulate their experience and bring unconscious mental contents into awareness. Similar to seasoned meditators or phenomenologists, individuals who have undergone analysis are experts in discerning and describing their subjective experience, thus making them ideal candidates for neurophenomenology. Moreover, analytic techniques may provide a means of guiding untrained experimental participants to greater awareness of their mental continuum, as well as gathering subjective reports about fundamental yet elusive aspects of experience including selfhood, temporality, and inter-subjectivity. Mining psychoanalysis for its methodological innovations provides a fresh turn for the neuropsychoanalysis movement and cognitive science as a whole – showcasing the integrity of analysis alongside the irreducibility of human experience. PMID:24808869

  13. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Methods of analysis. 133.5 Section 133.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture...

  14. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Methods of analysis. 133.5 Section 133.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture...

  15. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Methods of analysis. 133.5 Section 133.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture...

  16. Beyond mind-reading: multi-voxel pattern analysis of fMRI data.

    PubMed

    Norman, Kenneth A; Polyn, Sean M; Detre, Greg J; Haxby, James V

    2006-09-01

    A key challenge for cognitive neuroscience is determining how mental representations map onto patterns of neural activity. Recently, researchers have started to address this question by applying sophisticated pattern-classification algorithms to distributed (multi-voxel) patterns of functional MRI data, with the goal of decoding the information that is represented in the subject's brain at a particular point in time. This multi-voxel pattern analysis (MVPA) approach has led to several impressive feats of mind reading. More importantly, MVPA methods constitute a useful new tool for advancing our understanding of neural information processing. We review how researchers are using MVPA methods to characterize neural coding and information processing in domains ranging from visual perception to memory search.
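
    In practice, MVPA decoding is often a cross-validated linear classifier applied to trial-by-voxel matrices. The sketch below does this on synthetic data with a weak signal spread across many voxels; the sizes, the classifier choice, and the data are illustrative assumptions, not a specific study design.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import LinearSVC

      rng = np.random.default_rng(0)

      # Synthetic stand-in for fMRI data: 100 trials x 200 voxels, two
      # conditions whose difference is spread weakly across many voxels,
      # the regime where multi-voxel patterns beat single voxels.
      n_trials, n_voxels = 100, 200
      y = np.repeat([0, 1], n_trials // 2)
      X = rng.normal(0, 1, (n_trials, n_voxels))
      X[y == 1] += 0.15  # small distributed signal

      # Cross-validated decoding accuracy; > 0.5 means the condition is
      # decodable from the distributed activity pattern.
      acc = cross_val_score(LinearSVC(), X, y, cv=5).mean()
      print(f"decoding accuracy: {acc:.2f}")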

  17. CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.

    PubMed

    Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng

    2017-01-01

    Standardized terminology is the prerequisite of data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidence model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different healthcare organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
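
    One plausible reading of the aggregation step is majority voting with the agreement fraction as a confidence score, routing low-agreement mappings to human raters. The sketch below illustrates that reading with hypothetical submissions; it is not the authors' confidence model.

      from collections import Counter

      # Hypothetical mappings of one local term to ICD-10 codes, as
      # submitted by different healthcare organizations (the "crowd").
      submissions = {
          "myocardial infarct, acute": ["I21", "I21", "I21", "I22", "I21"],
      }

      for term, codes in submissions.items():
          counts = Counter(codes)
          code, votes = counts.most_common(1)[0]
          confidence = votes / len(codes)   # agreement fraction
          print(f"{term!r} -> {code} (confidence {confidence:.2f})")
          # Low-confidence mappings (diverse submissions) would be routed
          # to human raters, matching the rate-and-interact loop above.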

  18. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Methods of analysis. 163.5 Section 163.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in...

  19. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Methods of analysis. 163.5 Section 163.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in...

  20. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Methods of analysis. 163.5 Section 163.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in...

  1. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Methods of analysis. 163.5 Section 163.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in...

  2. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and gamma-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) and high-energy gamma rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The gamma-ray portion of each spectrum is analyzed by a standard Ge gamma-ray analysis program. This method can be applied to any analysis involving x-ray and gamma-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the gamma-ray analysis and accommodated during the x-ray analysis.
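
    The linear least-squares fitting step admits a compact illustration: model the measured spectrum as a linear combination of known component shapes and solve for the amplitudes. The sketch below uses synthetic Gaussian peaks plus a flat background as stand-ins for real library shapes; it is an illustration of the fitting principle, not the patented analysis chain.

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic spectral region: measured counts modeled as a linear
      # combination of known component shapes (two Gaussian peaks plus
      # a flat background), illustrative stand-ins for library shapes.
      chan = np.arange(100)
      shapes = np.array([
          np.exp(-((chan - 30) ** 2) / 18.0),   # component 1
          np.exp(-((chan - 55) ** 2) / 25.0),   # component 2
          np.ones_like(chan, dtype=float),      # background
      ]).T

      true_amps = np.array([500.0, 120.0, 10.0])
      measured = rng.poisson(shapes @ true_amps).astype(float)

      # Linear least-squares fit of component amplitudes to the spectrum.
      amps, *_ = np.linalg.lstsq(shapes, measured, rcond=None)
      print("fitted amplitudes:", np.round(amps, 1))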

  3. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and gamma-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) and high-energy gamma rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The gamma-ray portion of each spectrum is analyzed by a standard Ge gamma-ray analysis program. This method can be applied to any analysis involving x-ray and gamma-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the gamma-ray analysis and accommodated during the x-ray analysis.

  4. Graphical methods for the sensitivity analysis in discriminant analysis

    DOE PAGES

    Kim, Youngil; Anderson-Cook, Christine M.; Dae-Heung, Jang

    2015-09-30

    Similar to regression, many measures have been developed to detect influential data points in discriminant analysis. Many follow principles similar to the diagnostic measures used in linear regression, recast in the context of discriminant analysis. Here we focus on the impact on the predicted classification posterior probability when a data point is omitted. The new method is intuitive and easily interpretable compared to existing methods. We also propose a graphical display to show the individual movement of the posterior probabilities of other data points when a specific data point is omitted. This enables the summaries to capture the overall pattern of the change.
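
    The case-deletion idea can be sketched directly: fit the discriminant model with and without observation i and record how the posterior probabilities of the remaining points move. The snippet below does this with a standard linear discriminant on synthetic blobs; it mimics the spirit of the proposed display, not the authors' exact measure.

      import numpy as np
      from sklearn.datasets import make_blobs
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Case-deletion influence on posterior probabilities (toy data).
      X, y = make_blobs(n_samples=40, centers=2, cluster_std=2.0,
                        random_state=0)

      full = LinearDiscriminantAnalysis().fit(X, y)
      post_full = full.predict_proba(X)[:, 1]

      i = 0  # candidate influential observation
      mask = np.arange(len(y)) != i
      loo = LinearDiscriminantAnalysis().fit(X[mask], y[mask])
      post_loo = loo.predict_proba(X[mask])[:, 1]

      shift = post_loo - post_full[mask]
      print(f"max |posterior shift| omitting point {i}: "
            f"{np.abs(shift).max():.3f}")
      # Plotting these per-point shifts yields the kind of graphical
      # display the paper proposes.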

  5. An integrated platform for directly widely-targeted quantitative analysis of feces part I: Platform configuration and method validation.

    PubMed

    Song, Yuelin; Song, Qingqing; Li, Jun; Zheng, Jiao; Li, Chun; Zhang, Yuan; Zhang, Lingling; Jiang, Yong; Tu, Pengfei

    2016-07-08

    Direct analysis is of great importance for understanding the real chemical profile of a given sample, notably biological materials, because chemical degradation as well as diverse errors and uncertainties can result from sophisticated protocols. In comparison with biofluids, direct analysis of solid biological samples using high performance liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) remains challenging. Herein, a new analytical platform was configured by online hyphenation of pressurized liquid extraction (PLE), turbulent flow chromatography (TFC), and LC-MS/MS. A facile but robust PLE module was constructed based on the phenomenon that noticeable back-pressure can be generated by rapid fluid passing through a narrow tube. A TFC column, which is advantageous for extracting low molecular weight analytes from a rushing fluid, was employed at the outlet of the PLE module to capture constituents of interest. An electronic 6-port/2-position valve was introduced between the TFC column and LC-MS/MS to divide each measurement into extraction and elution phases, whereas LC-MS/MS handled analyte separation and monitoring. As a proof of concept, simultaneous determination of 24 endogenous substances, including eighteen steroids, five eicosanoids, and one porphyrin, in feces was carried out in this paper. Method validation assays demonstrated the analytical platform to be qualified for directly and simultaneously measuring diverse endogenous analytes in fecal matrices. Application of this integrated platform to homolog-focused profiling of feces is discussed in a companion paper. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. A Computer Analysis of Library Postcards. (CALP)

    ERIC Educational Resources Information Center

    Stevens, Norman D.

    1974-01-01

    A description of a sophisticated application of computer techniques to the analysis of a collection of picture postcards of library buildings in an attempt to establish the minimum architectural requirements needed to distinguish one style of library building from another. (Author)

  7. A sophisticated cad tool for the creation of complex models for electromagnetic interaction analysis

    NASA Astrophysics Data System (ADS)

    Dion, Marc; Kashyap, Satish; Louie, Aloisius

    1991-06-01

    This report describes the essential features of the MS-DOS version of DIDEC-DREO, an interactive program for creating wire grid, surface patch, and cell models of complex structures for electromagnetic interaction analysis. It uses the device-independent graphics library DIGRAF and the graphics kernel system HALO, and can be executed on systems with various graphics devices. Complicated structures can be created by direct alphanumeric keyboard entry, digitization of blueprints, conversion from existing geometric structure files, and merging of simple geometric shapes. A completed DIDEC geometric file may then be converted to the format required for input to a variety of time domain and frequency domain electromagnetic interaction codes. This report gives a detailed description of the program DIDEC-DREO, its installation, and its theoretical background. Each available interactive command is described. The associated program HEDRON, which generates simple geometric shapes, and other programs that extract current amplitude data from electromagnetic interaction code outputs are also discussed.

  8. Multivariate Methods for Meta-Analysis of Genetic Association Studies.

    PubMed

    Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G

    2018-01-01

    Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention because it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we briefly present univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links for all available software implementing multivariate meta-analysis methods are also provided.
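
    As a point of reference, the univariate building block that these multivariate models generalize is inverse-variance pooling of per-study effects. A minimal fixed-effect sketch with hypothetical log odds ratios follows; multivariate methods extend this to vectors of correlated effects with a joint covariance matrix.

      import numpy as np

      # Hypothetical per-study log odds ratios and standard errors for
      # one gene-disease association (illustrative numbers).
      log_or = np.array([0.25, 0.10, 0.40, 0.18])
      se = np.array([0.12, 0.20, 0.15, 0.10])

      # Fixed-effect inverse-variance pooling.
      w = 1.0 / se**2
      pooled = np.sum(w * log_or) / np.sum(w)
      pooled_se = np.sqrt(1.0 / np.sum(w))
      ci = pooled + np.array([-1.96, 1.96]) * pooled_se

      print(f"pooled log OR = {pooled:.3f}, "
            f"95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
      print(f"pooled OR = {np.exp(pooled):.2f}")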

  9. Fast automated analysis of strong gravitational lenses with convolutional neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hezaveh, Yashar D.; Levasseur, Laurence Perreault; Marshall, Philip J.

    Quantifying image distortions caused by strong gravitational lensing—the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures—and estimating the corresponding matter distribution of these structures (the ‘gravitational lens’) has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. We report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the ‘singular isothermal ellipsoid’ density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.

  10. Fast automated analysis of strong gravitational lenses with convolutional neural networks

    DOE PAGES

    Hezaveh, Yashar D.; Levasseur, Laurence Perreault; Marshall, Philip J.

    2017-08-30

    Quantifying image distortions caused by strong gravitational lensing—the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures—and estimating the corresponding matter distribution of these structures (the ‘gravitational lens’) has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. We report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the ‘singular isothermal ellipsoid’ density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.

  11. Fast automated analysis of strong gravitational lenses with convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Hezaveh, Yashar D.; Levasseur, Laurence Perreault; Marshall, Philip J.

    2017-08-01

    Quantifying image distortions caused by strong gravitational lensing—the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures—and estimating the corresponding matter distribution of these structures (the ‘gravitational lens’) has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. Here we report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the ‘singular isothermal ellipsoid’ density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.

  12. Fast automated analysis of strong gravitational lenses with convolutional neural networks.

    PubMed

    Hezaveh, Yashar D; Levasseur, Laurence Perreault; Marshall, Philip J

    2017-08-30

    Quantifying image distortions caused by strong gravitational lensing-the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures-and estimating the corresponding matter distribution of these structures (the 'gravitational lens') has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. Here we report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the 'singular isothermal ellipsoid' density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.
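
    In outline, the approach is supervised regression: a convolutional network maps a lens image to the parameters of the density profile, trained on simulations. The sketch below shows such a network in PyTorch with hypothetical layer sizes; it is an architectural illustration, not the authors' network.

      import torch
      import torch.nn as nn

      class LensParamNet(nn.Module):
          # Minimal CNN regressing lens-model parameters from an image,
          # e.g. 5 SIE parameters: Einstein radius, ellipticity (2),
          # center (2). Sizes here are hypothetical.
          def __init__(self, n_params=5):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.head = nn.Linear(32 * 16 * 16, n_params)

          def forward(self, x):
              return self.head(self.features(x).flatten(1))

      net = LensParamNet()
      images = torch.randn(8, 1, 64, 64)   # batch of simulated lens images
      params = net(images)                 # predicted parameters, (8, 5)
      print(params.shape)
      # Training would minimize, e.g., nn.MSELoss() against the
      # parameters of the simulations that generated the images.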

  13. Determining the Effectiveness of Various Delivery Methods in an Information Technology/Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Kovacs, Paul J.; Scarpino, John; Turchek, John C.

    2010-01-01

    The emergence of increasingly sophisticated communication technologies and the media-rich extensions of the World Wide Web have prompted universities to use alternatives to the traditional classroom teaching and learning methods. This demand for alternative delivery methods has led to the development of a wide range of eLearning techniques.…

  14. Internet of Things technology-based management methods for environmental specimen banks.

    PubMed

    Peng, Lihong; Wang, Qian; Yu, Ang

    2015-02-01

    The establishment and management of environmental specimen banks (ESBs) has long been a problem worldwide. The complexity of the specimen environment makes the management of ESBs likewise complex. Through an analysis of the development and management of ESBs worldwide, and in light of sophisticated Internet of Things (IOT) technology, this paper presents IOT technology-based ESB management methods. An IOT technology-based ESB management system can significantly facilitate ESB ingress and egress management as well as long-term storage management under quality control. This paper elaborates on the design of IOT technology-based modules, which can be used in ESB management to achieve standardized, smart, information-based ESB management. ESB management has far-reaching implications for environmental management and for research in environmental science.

  15. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  16. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  17. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Reid, Ray D. (Inventor); Hug, William F. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  18. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  19. The role of motion analysis in elite soccer: contemporary performance measurement techniques and work rate data.

    PubMed

    Carling, Christopher; Bloomfield, Jonathan; Nelsen, Lee; Reilly, Thomas

    2008-01-01

    The optimal physical preparation of elite soccer (association football) players has become an indispensable part of the professional game, especially due to the increased physical demands of match-play. The monitoring of players' work rate profiles during competition is now feasible through computer-aided motion analysis. Traditional methods of motion analysis were extremely labour intensive and were largely restricted to university-based research projects. Recent technological developments have meant that sophisticated systems, capable of quickly recording and processing the data of all players' physical contributions throughout an entire match, are now being used in elite club environments. In recognition of the important role that motion analysis now plays as a tool for measuring the physical performance of soccer players, this review critically appraises various motion analysis methods currently employed in elite soccer and explores research conducted using these methods. This review therefore aims to increase the awareness of both practitioners and researchers of the various motion analysis systems available, and identify practical implications of the established body of knowledge, while highlighting areas that require further exploration.

  20. Optical alignment of the Chromospheric Lyman-Alpha Spectro-Polarimeter using sophisticated methods to minimize activities under vacuum

    NASA Astrophysics Data System (ADS)

    Giono, G.; Katsukawa, Y.; Ishikawa, R.; Narukage, N.; Kano, R.; Kubo, M.; Ishikawa, S.; Bando, T.; Hara, H.; Suematsu, Y.; Winebarger, A.; Kobayashi, K.; Auchère, F.; Trujillo Bueno, J.

    2016-07-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding-rocket instrument developed at the National Astronomical Observatory of Japan (NAOJ) as a part of an international collaboration. The instrument's main scientific goal is to achieve polarization measurement of the Lyman-α line at 121.56 nm emitted from the solar upper-chromosphere and transition region with an unprecedented 0.1% accuracy. The optics are composed of a Cassegrain telescope coated with a "cold mirror" coating optimized for UV reflection and a dual-channel spectrograph allowing for simultaneous observation of the two orthogonal states of polarization. Although the polarization sensitivity is the most important aspect of the instrument, the spatial and spectral resolutions are also crucial to observe the chromospheric features and resolve the Ly-α profiles. A precise alignment of the optics is required to ensure the resolutions, but experiments under vacuum conditions are needed since Ly-α is absorbed by air, making the alignment experiments difficult. To bypass this issue, we developed methods to align the telescope and the spectrograph separately in visible light. We explain these methods and present the results for the optical alignment of the CLASP telescope and spectrograph. We then discuss the combined performances of both parts to derive the expected resolutions of the instrument, and compare them with the flight observations performed on September 3rd, 2015.

  1. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Lane, Arthur L. (Inventor); Bhartia, Rohit (Inventor); Reid, Ray D. (Inventor)

    2017-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  2. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Lane, Arthur L. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2018-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  3. Network modelling methods for FMRI.

    PubMed

    Smith, Stephen M; Miller, Karla L; Salimi-Khorshidi, Gholamreza; Webster, Matthew; Beckmann, Christian F; Nichols, Thomas E; Ramsey, Joseph D; Woolrich, Mark W

    2011-01-15

    There is great interest in estimating brain "networks" from FMRI data. This is often attempted by identifying a set of functional "nodes" (e.g., spatial ROIs or ICA maps) and then conducting a connectivity analysis between the nodes, based on the FMRI timeseries associated with the nodes. Analysis methods range from very simple measures that consider just two nodes at a time (e.g., correlation between two nodes' timeseries) to sophisticated approaches that consider all nodes simultaneously and estimate one global network model (e.g., Bayes net models). Many different methods are being used in the literature, but almost none has been carefully validated or compared for use on FMRI timeseries data. In this work we generate rich, realistic simulated FMRI data for a wide range of underlying networks, experimental protocols and problematic confounds in the data, in order to compare different connectivity estimation approaches. Our results show that in general correlation-based approaches can be quite successful, methods based on higher-order statistics are less sensitive, and lag-based approaches perform very poorly. More specifically: there are several methods that can give high sensitivity to network connection detection on good quality FMRI data, in particular, partial correlation, regularised inverse covariance estimation and several Bayes net methods; however, accurate estimation of connection directionality is more difficult to achieve, though Patel's τ can be reasonably successful. With respect to the various confounds added to the data, the most striking result was that the use of functionally inaccurate ROIs (when defining the network nodes and extracting their associated timeseries) is extremely damaging to network estimation; hence, results derived from inappropriate ROI definition (such as via structural atlases) should be regarded with great caution. Copyright © 2010 Elsevier Inc. All rights reserved.
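
    For concreteness, the snippet below computes two of the simpler measures discussed (full correlation, and partial correlation via the inverse covariance matrix, one of the approaches the study finds most sensitive) on simulated node timeseries; the data are random stand-ins for FMRI node timeseries.

    ```python
    import numpy as np

    # Simulated node timeseries: T timepoints x N nodes.
    rng = np.random.default_rng(0)
    ts = rng.standard_normal((200, 10))

    # Full correlation: a pairwise measure considering two nodes at a time.
    full_corr = np.corrcoef(ts, rowvar=False)

    # Partial correlation via the inverse covariance (precision) matrix:
    # rho_ij = -p_ij / sqrt(p_ii * p_jj).
    prec = np.linalg.inv(np.cov(ts, rowvar=False))
    d = np.sqrt(np.diag(prec))
    partial_corr = -prec / np.outer(d, d)
    np.fill_diagonal(partial_corr, 1.0)
    ```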

  4. Sensitive and inexpensive digital DNA analysis by microfluidic enrichment of rolling circle amplified single-molecules

    PubMed Central

    Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A.M.

    2017-01-01

    Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques, i.e. droplet PCR, require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. PMID:28077562
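
    The digital read-out itself is simple arithmetic: count discrete RCA products in the field of view and convert to a molar concentration. The sketch below assumes a hypothetical capture efficiency and imaged volume; both would need per-assay calibration and are not the paper's values.

    ```python
    AVOGADRO = 6.022e23

    def digital_concentration(n_spots, volume_litres, capture_efficiency):
        """Convert a count of single-molecule RCA products to mol/L."""
        molecules = n_spots / capture_efficiency
        return molecules / (AVOGADRO * volume_litres)

    # e.g. 1500 counted products in 10 uL at an assumed 80% enrichment efficiency
    conc = digital_concentration(1500, 10e-6, 0.8)
    print(f"{conc:.3e} M")  # ~3.1e-16 M, i.e. sub-femtomolar
    ```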

  5. Methods for Analysis and Simulation of Ballistic Impact

    DTIC Science & Technology

    2017-04-01

    ARL-RP-0597 ● Apr 2017 ● US Army Research Laboratory. Methods for Analysis and Simulation of Ballistic Impact, by John D Clayton, Weapons and Materials Research Directorate, ARL. …analytical, and numerical methods of ballistics research. Similar lengthy references dealing with pertinent aspects include [8, 9]. In contrast, the…

  6. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
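
    A stripped-down sketch of the augmentation idea follows: a classical least squares fit is repeated with the calibration matrix augmented by the spectral shape of an uncalibrated effect (here, a synthetic linear drift). This shows only the augmentation step, not the patent's full hybrid calibration/prediction workflow, and all spectra are synthetic.

    ```python
    import numpy as np

    # Two calibrated component spectra (Gaussian bands) on a wavelength axis.
    wavelengths = np.linspace(0, 1, 200)
    K = np.vstack([np.exp(-((wavelengths - c) ** 2) / 0.005)
                   for c in (0.3, 0.6)]).T
    drift = wavelengths[:, None]           # uncalibrated linear-drift shape

    measured = K @ np.array([1.0, 0.5]) + 0.2 * drift.ravel()

    # Ordinary CLS (drift unmodelled, biased) vs shape-augmented ("hybrid") fit.
    c_cls, *_ = np.linalg.lstsq(K, measured, rcond=None)
    c_hyb, *_ = np.linalg.lstsq(np.hstack([K, drift]), measured, rcond=None)
    print(c_cls, c_hyb[:2])   # augmented fit recovers ~[1.0, 0.5]
    ```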

  7. Analysis of Classes of Superlinear Semipositone Problems with Nonlinear Boundary Conditions

    NASA Astrophysics Data System (ADS)

    Morris, Quinn A.

    We study positive radial solutions for classes of steady state reaction diffusion problems on the exterior of a ball with both Dirichlet and nonlinear boundary conditions. We consider p-Laplacian problems (p > 1) with reaction terms which are superlinear at infinity and semipositone. In the case p = 2, using variational methods, we establish the existence of a solution, and via detailed analysis of the Green's function, we prove the positivity of the solution. In the case p ≠ 2, we again use variational methods to establish the existence of a solution, but since the Green's function analysis is no longer available, the positivity of the solution is achieved via sophisticated a priori estimates. Our results significantly enhance the literature on superlinear semipositone problems. Finally, we provide algorithms for the numerical generation of exact bifurcation curves for one-dimensional problems. In the autonomous case, we extend and analyze a quadrature method, and using nonlinear solvers in Mathematica, generate bifurcation curves. In the nonautonomous case, we employ shooting methods in Mathematica to generate bifurcation curves.
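
    For the autonomous one-dimensional case with p = 2, the quadrature method reduces the bifurcation curve to a one-dimensional integral: for u'' + λf(u) = 0, u(0) = u(1) = 0, a positive solution with maximum ρ satisfies λ(ρ) = 2(∫₀^ρ du/√(F(ρ) − F(u)))², with F the antiderivative of f. The sketch below (in Python rather than the paper's Mathematica) uses a simple positone f so the integrand stays well defined; the paper's semipositone and p ≠ 2 cases need its more careful treatment.

    ```python
    import numpy as np
    from scipy.integrate import quad

    f = lambda u: u + u**3             # illustrative reaction term
    F = lambda u: u**2 / 2 + u**4 / 4  # antiderivative of f

    def lam(rho):
        # substitute u = rho*s; endpoint singularity at s=1 is integrable
        integrand = lambda s: rho / np.sqrt(F(rho) - F(rho * s))
        val, _ = quad(integrand, 0.0, 1.0)
        return 2.0 * val**2

    for rho in (0.5, 1.0, 2.0, 4.0):
        print(rho, lam(rho))   # (lam(rho), rho) pairs trace the bifurcation curve
    ```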

  8. Multiscale methods for gore curvature calculations from FSI modeling of spacecraft parachutes

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kolesar, Ryan; Boswell, Cody; Kanai, Taro; Montel, Kenneth

    2014-12-01

    There are now some sophisticated and powerful methods for computer modeling of parachutes. These methods are capable of addressing some of the most formidable computational challenges encountered in parachute modeling, including fluid-structure interaction (FSI) between the parachute and air flow, design complexities such as those seen in spacecraft parachutes, and operational complexities such as use in clusters and disreefing. One should be able to extract from a reliable full-scale parachute modeling any data or analysis needed. In some cases, however, the parachute engineers may want to perform quickly an extended or repetitive analysis with methods based on simplified models. Some of the data needed by a simplified model can very effectively be extracted from a full-scale computer modeling that serves as a pilot. A good example of such data is the circumferential curvature of a parachute gore, where a gore is the slice of the parachute canopy between two radial reinforcement cables running from the parachute vent to the skirt. We present the multiscale methods we devised for gore curvature calculation from FSI modeling of spacecraft parachutes. The methods include those based on the multiscale sequentially-coupled FSI technique and using NURBS meshes. We show how the methods work for the fully-open and two reefed stages of the Orion spacecraft main and drogue parachutes.
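
    Once gore points are extracted from an FSI solution, a discrete circumferential curvature can be estimated in several ways; one of the simplest is a circumscribed-circle fit through consecutive point triples, sketched below on toy data. This is a generic estimate, not the paper's multiscale NURBS-based procedure.

    ```python
    import numpy as np

    def curvature(p0, p1, p2):
        """Curvature 1/R of the circle through three points (Heron's formula)."""
        a = np.linalg.norm(p1 - p0)
        b = np.linalg.norm(p2 - p1)
        c = np.linalg.norm(p2 - p0)
        s = 0.5 * (a + b + c)
        area = np.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))
        return 4.0 * area / (a * b * c)

    # Toy points sampled circumferentially on a circle of radius 5,
    # so every estimate should be ~0.2.
    theta = np.linspace(0, 0.5, 9)
    pts = np.c_[5 * np.cos(theta), 5 * np.sin(theta), np.zeros_like(theta)]
    ks = [curvature(pts[i - 1], pts[i], pts[i + 1]) for i in range(1, 8)]
    ```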

  9. Improved dynamic analysis method using load-dependent Ritz vectors

    NASA Technical Reports Server (NTRS)

    Escobedo-Torres, J.; Ricles, J. M.

    1993-01-01

    The dynamic analysis of large space structures is important in order to predict their behavior under operating conditions. Computer models of large space structures are characterized by having a large number of degrees of freedom, and the computational effort required to carry out the analysis is very large. Conventional methods of solution utilize a subset of the eigenvectors of the system, but for systems with many degrees of freedom, the solution of the eigenproblem is in many cases the most costly phase of the analysis. For this reason, alternate solution methods need to be considered. It is important that the method chosen for the analysis be efficient and that accurate results be obtainable. The load dependent Ritz vector method is presented as an alternative to the classical normal mode methods for obtaining dynamic responses of large space structures. A simplified model of a space station is used to compare results. Results show that the load dependent Ritz vector method predicts the dynamic response better than the classical normal mode method. Even though this alternate method is very promising, further studies are necessary to fully understand its attributes and limitations.
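
    The classical recurrence for load-dependent Ritz vectors is short enough to sketch: start from the static response to the spatial load pattern, then repeatedly apply K⁻¹M with M-orthonormalisation. The matrices below are toy stand-ins (symmetric K and M, single load pattern g assumed).

    ```python
    import numpy as np

    def ritz_vectors(K, M, g, n_vec):
        """Generate n_vec load-dependent Ritz vectors for M x'' + K x = f(t) g."""
        vecs = []
        x = np.linalg.solve(K, g)              # static response to load pattern
        for _ in range(n_vec):
            for v in vecs:                     # Gram-Schmidt in the M-inner product
                x = x - (v @ (M @ x)) * v
            x = x / np.sqrt(x @ (M @ x))       # M-normalise
            vecs.append(x)
            x = np.linalg.solve(K, M @ x)      # next Krylov-like vector
        return np.column_stack(vecs)

    ndof = 20                                  # toy spring-mass chain
    K = 2.0 * np.eye(ndof) - np.eye(ndof, k=1) - np.eye(ndof, k=-1)
    M = np.eye(ndof)
    R = ritz_vectors(K, M, g=np.ones(ndof), n_vec=4)
    # Reduced system: R.T @ K @ R and R.T @ M @ R replace the full eigenproblem.
    ```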

  10. A study on the influence of eWOM using content analysis: how do comments on value for money, product sophistication and experiential feeling affect our choices?

    NASA Astrophysics Data System (ADS)

    Cho, Vincent; Chan, Alpha

    2017-07-01

    The influence of electronic word of mouth (eWOM) has been heavily investigated in relation to online ratings. However, only a few studies examined the content of eWOM. From the perspective of the consideration sets model, consumers formulate an awareness set, a consideration set and a choice set before making a purchase. We argue that the formulation of these sets is influenced by eWOM based on its volume, valence and content relating to product attributes such as value for money, product sophistication and experiential feeling. In this study, the content of posts relating to Shure professional earphones in the online forum Mingo (www.mingo-hmw.com/forum) was captured and annotated. During the data collection period, Mingo was the sole online forum relating to professional earphones. Without much interference from other online forums, the circumstances of this study closely approximate a laboratory setting. In addition, we collected the actual sales, marketing costs, fault rates and number of retail stores selling the Shure professional earphones for 126 weeks. Our findings show that the weekly volume of posts, their relative number of positive (negative) comments, especially regarding value for money and sound quality, and those posts from the earlier week impinged strongly on weekly sales of Shure products. From the regression models, the explained variance in sales jumps from 0.236 to 0.732 due to the influence of eWOM.
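
    A regression of the general shape described (weekly sales on lagged eWOM volume and valence) can be sketched as follows; all variable names and data are fabricated for illustration and are not the study's.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    weeks = 126
    volume = rng.poisson(30, weeks)                 # weekly post volume
    pos_ratio = rng.uniform(0, 1, weeks)            # share of positive comments
    sales = 50 + 2.0 * volume + 80 * pos_ratio + rng.normal(0, 10, weeks)

    X = sm.add_constant(np.column_stack([volume, pos_ratio])[:-1])  # week t-1
    model = sm.OLS(sales[1:], X).fit()              # predict week t sales
    print(model.rsquared)
    ```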

  11. Optical Alignment of the Chromospheric Lyman-Alpha SpectroPolarimeter using Sophisticated Methods to Minimize Activities under Vacuum

    NASA Technical Reports Server (NTRS)

    Giono, G.; Katsukawa, Y.; Ishikawa, R.; Narukage, N.; Kano, R.; Kubo, M.; Ishikawa, S.; Bando, T.; Hara, H.; Suematsu, Y.; et al.

    2016-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding-rocket instrument developed at the National Astronomical Observatory of Japan (NAOJ) as a part of an international collaboration. The instrument's main scientific goal is to achieve polarization measurement of the Lyman-alpha line at 121.56 nm emitted from the solar upper-chromosphere and transition region with an unprecedented 0.1% accuracy. For this purpose, the optics are composed of a Cassegrain telescope coated with a "cold mirror" coating optimized for UV reflection and a dual-channel spectrograph allowing for simultaneous observation of the two orthogonal states of polarization. Although the polarization sensitivity is the most important aspect of the instrument, the spatial and spectral resolutions of the instrument are also crucial to observe the chromospheric features and resolve the Ly-alpha profiles. A precise alignment of the optics is required to ensure the resolutions, but experiments under vacuum conditions are needed since Ly-alpha is absorbed by air, making the alignment experiments difficult. To bypass this issue, we developed methods to align the telescope and the spectrograph separately in visible light. We will explain these methods and present the results for the optical alignment of the CLASP telescope and spectrograph. We will then discuss the combined performances of both parts to derive the expected resolutions of the instrument, and compare them with the flight observations performed on September 3rd, 2015.

  12. Handbook of capture-recapture analysis

    USGS Publications Warehouse

    Amstrup, Steven C.; McDonald, Trent L.; Manly, Bryan F.J.

    2005-01-01

    Every day, biologists in parkas, raincoats, and rubber boots go into the field to capture and mark a variety of animal species. Back in the office, statisticians create analytical models for the field biologists' data. But many times, representatives of the two professions do not fully understand one another's roles. This book bridges this gap by helping biologists understand state-of-the-art statistical methods for analyzing capture-recapture data. In so doing, statisticians will also become more familiar with the design of field studies and with the real-life issues facing biologists. Reliable outcomes of capture-recapture studies are vital to answering key ecological questions. Is the population increasing or decreasing? Do more or fewer animals have a particular characteristic? In answering these questions, biologists cannot hope to capture and mark entire populations. And frequently, the populations change unpredictably during a study. Thus, increasingly sophisticated models have been employed to convert data into answers to ecological questions. This book, by experts in capture-recapture analysis, introduces the most up-to-date methods for data analysis while explaining the theory behind those methods. Thorough, concise, and portable, it will be immensely useful to biologists, biometricians, and statisticians, students in both fields, and anyone else engaged in the capture-recapture process.
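
    As a baseline for the "increasingly sophisticated models" the book covers, the oldest capture-recapture estimator fits in a few lines (Lincoln-Petersen with the Chapman correction; the counts below are invented).

    ```python
    def chapman_estimate(n1, n2, m2):
        """n1 marked in session 1, n2 caught in session 2, m2 of them marked."""
        return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

    print(chapman_estimate(n1=120, n2=100, m2=15))  # ~763 animals
    ```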

  13. Applying phylogenetic analysis to viral livestock diseases: moving beyond molecular typing.

    PubMed

    Olvera, Alex; Busquets, Núria; Cortey, Marti; de Deus, Nilsa; Ganges, Llilianne; Núñez, José Ignacio; Peralta, Bibiana; Toskano, Jennifer; Dolz, Roser

    2010-05-01

    Changes in livestock production systems in recent years have altered the presentation of many diseases, resulting in the need for more sophisticated control measures. At the same time, new molecular assays have been developed to support the diagnosis of animal viral disease. Nucleotide sequences generated by these diagnostic techniques can be used in phylogenetic analysis to infer phenotypes by sequence homology and to perform molecular epidemiology studies. In this review, some key elements of phylogenetic analysis are highlighted, such as the selection of the appropriate neutral phylogenetic marker, the proper phylogenetic method and different techniques to test the reliability of the resulting tree. Examples are given of current and future applications of phylogenetic reconstructions in viral livestock diseases. Copyright 2009 Elsevier Ltd. All rights reserved.

  14. Epistemic beliefs of middle and high school students in a problem-based, scientific inquiry unit: An exploratory, mixed methods study

    NASA Astrophysics Data System (ADS)

    Gu, Jiangyue

    Epistemic beliefs are individuals' beliefs about the nature of knowledge, how knowledge is constructed, and how knowledge can be justified. This study employed a mixed-methods approach to examine: (a) middle and high school students' self-reported epistemic beliefs (quantitative) and epistemic beliefs revealed from practice (qualitative) during a problem-based, scientific inquiry unit, (b) how students' epistemic beliefs contributed to the construction of their problem-solving processes, and (c) how and why students' epistemic beliefs changed by engaging in PBL. Twenty-one middle and high school students participated in a summer science class to investigate local water quality in a 2-week long problem-based learning (PBL) unit. The students worked in small groups to conduct water quality tests in their local watershed and visited several stakeholders for their investigation. Pretest and posttest versions of the Epistemological Beliefs Questionnaire were conducted to assess students' self-reported epistemic beliefs before and after the unit. I videotaped and interviewed three groups of students during the unit and conducted discourse analysis to examine their epistemic beliefs revealed from scientific inquiry activities and triangulate with their self-reported data. There are three main findings from this study. First, students in this study self-reported relatively sophisticated epistemic beliefs on the pretest. However, the comparison between their self-reported beliefs and beliefs revealed from practice indicated that some students were able to apply sophisticated beliefs during the unit while others failed to do so. The inconsistency between these two types of epistemic beliefs may be due to students' inadequate cognitive ability, the low validity of the self-report measure, and the influence of contextual factors. Second, qualitative analysis indicated that students' epistemic beliefs of the nature of knowing influenced their problem

  15. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.
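
    In miniature, the force method's simultaneous use of equilibrium and compatibility looks like this for a two-spring, one-node statically indeterminate system; this toy system stands in for the general matrix formulation, and the numbers are arbitrary.

    ```python
    import numpy as np

    # Shaft fixed at both ends; springs k1, k2 meet at a node loaded by P.
    # Unknowns are the two spring forces F1, F2 (tension positive).
    k1, k2, P = 2.0, 3.0, 10.0
    A = np.array([[1.0, -1.0],             # equilibrium:   F1 - F2 = P
                  [1.0 / k1, 1.0 / k2]])   # compatibility: e1 + e2 = 0
    b = np.array([P, 0.0])
    F1, F2 = np.linalg.solve(A, b)         # forces first, displacements after
    u = F1 / k1                            # node displacement = P / (k1 + k2)
    ```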

  16. Designs and methods used in published Australian health promotion evaluations 1992-2011.

    PubMed

    Chambers, Alana Hulme; Murphy, Kylie; Kolbe, Anthony

    2015-06-01

    To describe the designs and methods used in published Australian health promotion evaluation articles between 1992 and 2011. Using a content analysis approach, we reviewed 157 articles to analyse patterns and trends in designs and methods in Australian health promotion evaluation articles. The purpose was to provide empirical evidence about the types of designs and methods used. The most common type of evaluation conducted was impact evaluation. Quantitative designs were used exclusively in more than half of the articles analysed. Almost half the evaluations utilised only one data collection method. Surveys were the most common data collection method used. Few articles referred explicitly to an intended evaluation outcome or benefit and references to published evaluation models or frameworks were rare. This is the first time Australian-published health promotion evaluation articles have been empirically investigated in relation to designs and methods. There appears to be little change in the purposes, overall designs and methods of published evaluations since 1992. More methodologically transparent and sophisticated published evaluation articles might be instructional, and even motivational, for improving evaluation practice and result in better public health interventions and outcomes. © 2015 Public Health Association of Australia.

  17. Integration of Research Studies: Meta-Analysis of Research. Methods of Integrative Analysis; Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; And Others

    Integrative analysis, or what is coming to be known as meta-analysis, is the integration of the findings of many empirical research studies of a topic. Meta-analysis differs from traditional narrative forms of research reviewing in that it is more quantitative and statistical. Thus, the methods of meta-analysis are merely statistical methods,…
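
    The statistical core of a fixed-effect meta-analysis is an inverse-variance weighted mean of per-study effect sizes; a sketch with invented study values:

    ```python
    import numpy as np

    effects = np.array([0.30, 0.55, 0.12, 0.41])   # per-study effect sizes
    se = np.array([0.10, 0.20, 0.15, 0.12])        # their standard errors

    w = 1.0 / se**2                                # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    print(pooled, pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    ```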

  18. A comparison of analysis methods to estimate contingency strength.

    PubMed

    Lloyd, Blair P; Staubitz, Johanna L; Tapp, Jon T

    2018-05-09

    To date, several data analysis methods have been used to estimate contingency strength, yet few studies have compared these methods directly. To compare the relative precision and sensitivity of four analysis methods (i.e., exhaustive event-based, nonexhaustive event-based, concurrent interval, concurrent+lag interval), we applied all methods to a simulated data set in which several response-dependent and response-independent schedules of reinforcement were programmed. We evaluated the degree to which contingency strength estimates produced from each method (a) corresponded with expected values for response-dependent schedules and (b) showed sensitivity to parametric manipulations of response-independent reinforcement. Results indicated both event-based methods produced contingency strength estimates that aligned with expected values for response-dependent schedules, but differed in sensitivity to response-independent reinforcement. The precision of interval-based methods varied by analysis method (concurrent vs. concurrent+lag) and schedule type (continuous vs. partial), and showed similar sensitivities to response-independent reinforcement. Recommendations and considerations for measuring contingencies are identified. © 2018 Society for the Experimental Analysis of Behavior.
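
    For flavour, one generic interval-based contingency estimate is sketched below (Yule's Q over binned response/reinforcer occurrences). The four methods the study compares differ in exactly how events and intervals are counted; this shows only the shared idea, on simulated data.

    ```python
    import numpy as np

    def yules_q(resp, rft):
        a = np.sum(resp & rft)         # both occur in a bin
        b = np.sum(resp & ~rft)
        c = np.sum(~resp & rft)
        d = np.sum(~resp & ~rft)
        return (a * d - b * c) / (a * d + b * c)

    rng = np.random.default_rng(2)
    resp = rng.random(600) < 0.3                   # responses per 1-s bin
    dependent = resp & (rng.random(600) < 0.5)     # reinforcers follow responses
    independent = rng.random(600) < 0.15           # response-independent schedule
    print(yules_q(resp, dependent))                # 1.0: perfect contingency
    print(yules_q(resp, independent))              # near 0: no contingency
    ```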

  19. Interaction Analysis: Theory, Research and Application.

    ERIC Educational Resources Information Center

    Amidon, Edmund J., Ed.; Hough, John J., Ed.

    This volume of selected readings developed for students and practitioners at various levels of sophistication is intended to be representative of work done to date on interaction analysis. The contents include journal articles, papers read at professional meetings, abstracts of doctoral dissertations, and selections from larger monographs, plus 12…

  20. Comparability of river suspended-sediment sampling and laboratory analysis methods

    USGS Publications Warehouse

    Groten, Joel T.; Johnson, Gregory D.

    2018-03-06

    Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused grab and TSS methods to be biased substantially low. The difference in laboratory analysis methods was slightly greater than field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods. These results indicated that grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream. The results indicate there is less of a difference among samples collected with grab field sampling and analyzed for TSS and concentration of fines in SSC. Even though differences are present, the presence of strong correlations between SSC and TSS concentrations provides the opportunity to develop site-specific relations to address transport processes not captured by grab field sampling and TSS laboratory analysis methods.
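
    A site-specific relation of the kind mentioned is, at its simplest, an ordinary least squares fit of SSC on TSS from paired samples; the numbers below are made up for illustration.

    ```python
    import numpy as np

    tss = np.array([12, 25, 40, 66, 90, 140, 210], dtype=float)   # mg/L
    ssc = np.array([18, 35, 61, 98, 130, 205, 320], dtype=float)  # mg/L

    slope, intercept = np.polyfit(tss, ssc, 1)
    r = np.corrcoef(tss, ssc)[0, 1]
    print(f"SSC ~ {slope:.2f}*TSS + {intercept:.1f}, r = {r:.3f}")
    ```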

  1. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.

  2. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
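
    Combining independent relative uncertainty components mathematically usually means adding them in quadrature; a sketch with invented component values (not the paper's budget):

    ```python
    import math

    components = {
        "microorganism type": 0.20,
        "product matrix": 0.15,
        "reading/interpretation": 0.18,
    }
    u_combined = math.sqrt(sum(u**2 for u in components.values()))
    print(f"{u_combined:.1%}")   # ~30.9%, below the 35% figure cited
    ```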

  3. A strategy for evaluating pathway analysis methods.

    PubMed

    Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques

    2017-10-13

    Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency of the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method to a sub-dataset of the original dataset. In contrast, discrimination measures specificity-the degree to which the perturbed pathways identified by a particular method to a dataset from one experiment differ from those identifying by the same method to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth
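
    The two metrics translate naturally into set operations over the pathways a method identifies; a schematic reading (the paper's exact formulas may weight or threshold differently, and the pathway names are invented) is:

    ```python
    def recall(full_run, sub_run):
        """Consistency: fraction of pathways found on the full dataset that are
        re-identified by the same PA method on a sub-dataset."""
        return len(full_run & sub_run) / len(full_run)

    def discrimination(run_a, run_b):
        """Specificity: how different the identified pathways are across
        datasets from two unrelated experiments."""
        return 1 - len(run_a & run_b) / len(run_a | run_b)

    full = {"p53", "apoptosis", "MAPK", "cell cycle"}
    sub = {"p53", "MAPK", "cell cycle"}
    other = {"olfaction", "MAPK"}
    print(recall(full, sub), discrimination(full, other))  # 0.75, 0.8
    ```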

  4. Global/local methods research using a common structural analysis framework

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  5. Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method

    ERIC Educational Resources Information Center

    Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo

    2012-01-01

    This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)
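
    In the same simulation-over-formalism spirit, a toy sum over paths for a free particle (units with hbar = m = 1) shows the stationary-phase intuition numerically: phases of paths near the classical straight line reinforce, while wilder paths nearly cancel. This is an illustrative exercise, not material from the article.

    ```python
    import numpy as np

    steps = 20
    t = np.linspace(0.0, 1.0, steps + 1)
    dt = t[1] - t[0]
    classical = t * 1.0                      # straight line from x=0 to x=1

    def action(path):
        v = np.diff(path) / dt
        return np.sum(0.5 * v**2) * dt       # S = integral of (1/2) v^2

    rng = np.random.default_rng(3)
    for sigma in (0.01, 0.1, 0.5):           # increasing distance from classical
        phases = []
        for _ in range(2000):
            wiggle = np.zeros(steps + 1)
            wiggle[1:-1] = rng.normal(0, sigma, steps - 1)
            phases.append(np.exp(1j * action(classical + wiggle)))
        print(sigma, abs(np.mean(phases)))   # coherence drops as sigma grows
    ```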

  6. Methods for geochemical analysis

    USGS Publications Warehouse

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  7. Toward a methodical framework for comprehensively assessing forest multifunctionality.

    PubMed

    Trogisch, Stefan; Schuldt, Andreas; Bauhus, Jürgen; Blum, Juliet A; Both, Sabine; Buscot, François; Castro-Izaguirre, Nadia; Chesters, Douglas; Durka, Walter; Eichenberg, David; Erfmeier, Alexandra; Fischer, Markus; Geißler, Christian; Germany, Markus S; Goebes, Philipp; Gutknecht, Jessica; Hahn, Christoph Zacharias; Haider, Sylvia; Härdtle, Werner; He, Jin-Sheng; Hector, Andy; Hönig, Lydia; Huang, Yuanyuan; Klein, Alexandra-Maria; Kühn, Peter; Kunz, Matthias; Leppert, Katrin N; Li, Ying; Liu, Xiaojuan; Niklaus, Pascal A; Pei, Zhiqin; Pietsch, Katherina A; Prinz, Ricarda; Proß, Tobias; Scherer-Lorenzen, Michael; Schmidt, Karsten; Scholten, Thomas; Seitz, Steffen; Song, Zhengshan; Staab, Michael; von Oheimb, Goddert; Weißbecker, Christina; Welk, Erik; Wirth, Christian; Wubet, Tesfaye; Yang, Bo; Yang, Xuefei; Zhu, Chao-Dong; Schmid, Bernhard; Ma, Keping; Bruelheide, Helge

    2017-12-01

    Biodiversity-ecosystem functioning (BEF) research has extended its scope from communities that are short-lived or reshape their structure annually to structurally complex forest ecosystems. The establishment of tree diversity experiments poses specific methodological challenges for assessing the multiple functions provided by forest ecosystems. In particular, methodological inconsistencies and nonstandardized protocols impede the analysis of multifunctionality within, and comparability across the increasing number of tree diversity experiments. By providing an overview on key methods currently applied in one of the largest forest biodiversity experiments, we show how methods differing in scale and simplicity can be combined to retrieve consistent data allowing novel insights into forest ecosystem functioning. Furthermore, we discuss and develop recommendations for the integration and transferability of diverse methodical approaches to present and future forest biodiversity experiments. We identified four principles that should guide basic decisions concerning method selection for tree diversity experiments and forest BEF research: (1) method selection should be directed toward maximizing data density to increase the number of measured variables in each plot. (2) Methods should cover all relevant scales of the experiment to consider scale dependencies of biodiversity effects. (3) The same variable should be evaluated with the same method across space and time for adequate larger-scale and longer-time data analysis and to reduce errors due to changing measurement protocols. (4) Standardized, practical and rapid methods for assessing biodiversity and ecosystem functions should be promoted to increase comparability among forest BEF experiments. We demonstrate that currently available methods provide us with a sophisticated toolbox to improve a synergistic understanding of forest multifunctionality. However, these methods require further adjustment to the specific

  8. Two MIS Analysis Methods: An Experimental Comparison.

    ERIC Educational Resources Information Center

    Wang, Shouhong

    1996-01-01

    In China, 24 undergraduate business students applied data flow diagrams (DFD) to a mini-case, and 20 used object-oriented analysis (OOA). DFD seemed easier to learn, but after training, those using the OOA method for systems analysis made fewer errors. (SK)

  9. Analysis of concrete beams using applied element method

    NASA Astrophysics Data System (ADS)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, the structure is analysed by dividing it into several elements similar to FEM. But, in AEM, elements are connected by springs instead of nodes as in the case of FEM. In this paper, the background to AEM is discussed and the necessary equations are derived. For illustrating the application of AEM, it has been used to analyse a plain concrete beam with fixed support conditions. The analysis is limited to 2-dimensional structures. It was found that the number of springs does not have much influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.
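
    A one-dimensional flavour of the spring-connectivity idea: rigid elements joined by axial springs of stiffness k = EA/d, fixed at one end and pulled at the other. Real AEM uses normal plus shear springs over 2-D interfaces; this sketch, with toy numbers, shows only the assembly pattern.

    ```python
    import numpy as np

    E, A, L, n_elem = 30e9, 0.01, 1.0, 10     # concrete-like bar, toy values
    d = L / n_elem
    k = E * A / d                              # interface spring stiffness

    K = np.zeros((n_elem, n_elem))             # one axial DOF per element
    K[0, 0] += k                               # spring to the fixed wall
    for i in range(1, n_elem):                 # springs between neighbours
        K[i - 1, i - 1] += k
        K[i, i] += k
        K[i - 1, i] -= k
        K[i, i - 1] -= k

    f = np.zeros(n_elem)
    f[-1] = 1e5                                # 100 kN axial pull at free end
    u = np.linalg.solve(K, f)                  # tip displacement ~ P*L/(E*A)
    ```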

  10. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). This method can be used in a microscope or macroscope to provide measurement of Raman and/or native fluorescence emission spectra either by point-by-point measurement, or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses

  11. Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones

    DTIC Science & Technology

    2015-07-01

    COMPREHENSION-DRIVEN PROGRAM ANALYSIS (CPA) FOR MALWARE DETECTION IN ANDROID PHONES. Iowa State University, July 2015, final report. …machine analysis system to detect novel, sophisticated Android malware. (c) An innovative library summarization technique and its incorporation in…

  12. ToF-SIMS PCA analysis of Myrtus communis L.

    NASA Astrophysics Data System (ADS)

    Piras, F. M.; Dettori, M. F.; Magnani, A.

    2009-06-01

    Nowadays there is growing interest among researchers in applying sophisticated analytical techniques, in conjunction with statistical data analysis methods, to the characterization of natural products to assure their authenticity and quality, and in the possibility of direct analysis of food to obtain maximum information. In this work, time-of-flight secondary ion mass spectrometry (ToF-SIMS) in conjunction with principal components analysis (PCA) is applied to study the chemical composition and variability of Sardinian myrtle (Myrtus communis L.) through the analysis of both berries alcoholic extracts and berries epicarp. ToF-SIMS spectra of berries epicarp show that the epicuticular waxes consist mainly of carboxylic acids with chain lengths ranging from C20 to C30, or identical species formed from fragmentation of long-chain esters. PCA of ToF-SIMS data from myrtle berries epicarp distinguishes two groups characterized by a different surface concentration of triacontanoic acid. Variability in anthocyanins, flavonols, α-tocopherol, and myrtucommulone contents is shown by ToF-SIMS PCA analysis of myrtle berries alcoholic extracts.
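
    The usual workflow behind such a result is PCA on an autoscaled peak-intensity matrix (samples x selected peaks). A sketch on simulated data, with two groups differing mainly in one peak (loosely mirroring the triacontanoic acid separation; the data are not the paper's):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(4)
    group_a = rng.normal(1.0, 0.1, (10, 30))   # 10 samples x 30 peak intensities
    group_b = rng.normal(1.0, 0.1, (10, 30))
    group_b[:, 5] += 1.5                       # one discriminating peak

    X = np.vstack([group_a, group_b])
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # autoscale each peak

    scores = PCA(n_components=2).fit_transform(X)
    # scores[:, 0] separates the two groups along PC1
    ```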

  13. Application of integrated fluid-thermal-structural analysis methods

    NASA Technical Reports Server (NTRS)

    Wieting, Allan R.; Dechaumphai, Pramote; Bey, Kim S.; Thornton, Earl A.; Morgan, Ken

    1988-01-01

    Hypersonic vehicles operate in a hostile aerothermal environment which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient, hence interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center is developing such methods in an analyzer called LIFTS (Langley Integrated Fluid-Thermal-Structural) analyzer. The evolution and status of LIFTS is reviewed and illustrated through applications.

  14. Cell-fusion method to visualize interphase nuclear pore formation.

    PubMed

    Maeshima, Kazuhiro; Funakoshi, Tomoko; Imamoto, Naoko

    2014-01-01

    In eukaryotic cells, the nucleus is a complex and sophisticated organelle that organizes genomic DNA to support essential cellular functions. The nuclear surface contains many nuclear pore complexes (NPCs), channels for macromolecular transport between the cytoplasm and nucleus. It is well known that the number of NPCs almost doubles during interphase in cycling cells. However, the mechanism of NPC formation is poorly understood, presumably because a practical system for analysis does not exist. The most difficult obstacle in the visualization of interphase NPC formation is that NPCs already exist after nuclear envelope formation, and these existing NPCs interfere with the observation of nascent NPCs. To overcome this obstacle, we developed a novel system using the cell-fusion technique (heterokaryon method), previously also used to analyze the shuttling of macromolecules between the cytoplasm and the nucleus, to visualize the newly synthesized interphase NPCs. In addition, we used a photobleaching approach that validated the cell-fusion method. We recently used these methods to demonstrate the role of cyclin-dependent protein kinases and of Pom121 in interphase NPC formation in cycling human cells. Here, we describe the details of the cell-fusion approach and compare the system with other NPC formation visualization methods. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  16. Sensitive and inexpensive digital DNA analysis by microfluidic enrichment of rolling circle amplified single-molecules.

    PubMed

    Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A M; Nilsson, Mats

    2017-05-05

    Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques, i.e. droplet PCR, require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  17. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
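
    Schematically, for independent standard normal inputs the AMV correction amounts to re-evaluating the performance function at the most probable point of its linearisation. The sketch below uses an arbitrary illustrative g and a one-shot (non-iterated) version of the procedure; the full method handles general distributions and implicit finite-element responses.

    ```python
    import numpy as np
    from scipy.stats import norm

    def g(x):
        return x[0] ** 2 + 2 * x[1] + 3            # illustrative performance fn

    def grad(x, h=1e-6):
        x = np.asarray(x, float)
        return np.array([(g(x + h * e) - g(x - h * e)) / (2 * h)
                         for e in np.eye(len(x))])

    mu = np.ones(2)                                # mean point of the inputs
    G = grad(mu)
    for p in (0.5, 0.9, 0.99):
        beta = norm.ppf(p)
        z_mv = g(mu) + beta * np.linalg.norm(G)    # mean-value (linearised) quantile
        x_star = mu + beta * G / np.linalg.norm(G) # most probable point
        z_amv = g(x_star)                          # AMV-corrected quantile
        print(p, z_mv, z_amv)
    ```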

  18. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    PubMed

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven e.g. by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for the experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only then can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance.
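
    A minimal sketch of the core habit the authors recommend: reuse one and the same decoding pipeline for the main analysis and for simulated null data. The classifier, data shapes, and effect size below are arbitrary choices for illustration, not the authors' setup.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    rng = np.random.default_rng(0)

    def decode(X, y):
        """The single analysis pipeline, reused verbatim for every check."""
        cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
        clf = LogisticRegression(max_iter=1000)
        return cross_val_score(clf, X, y, cv=cv).mean()

    # 'experimental' data with a real class difference in the first features
    X = rng.normal(size=(80, 50))
    y = np.repeat([0, 1], 40)
    X[y == 1, :5] += 0.8
    print("main analysis:      ", decode(X, y))

    # the same analysis on simulated null data: accuracy should hover near
    # chance (0.5); systematic deviations flag a pitfall in design or pipeline
    X_null = rng.normal(size=(80, 50))
    print("simulated null data:", decode(X_null, y))
    ```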

  19. Dual ant colony operational modal analysis parameter estimation method

    NASA Astrophysics Data System (ADS)

    Sitarz, Piotr; Powałka, Bartosz

    2018-01-01

    Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Unlike in experimental modal analysis, the input signal is generated by the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification; some operate in the time domain, others in the frequency domain. The former use correlation functions, the latter spectral density functions. Moreover, while some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding the issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.
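
    The authors' dedicated DAC-OMA software is not reproduced here; the sketch below is a generic continuous ant colony optimiser in the ACOR style, minimising an objective over fixed parameter intervals, which conveys the optimisation step the method relies on. All constants and the toy objective are illustrative.

    ```python
    import numpy as np

    def aco_continuous(f, bounds, n_ants=20, n_archive=10, q=0.5, xi=0.85,
                       n_iter=200, seed=0):
        """Generic continuous ant colony optimisation (ACOR-style sketch)."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, float).T
        dim = len(lo)
        # archive of candidate solutions, kept sorted by objective value
        X = lo + rng.random((n_archive, dim)) * (hi - lo)
        fX = np.array([f(x) for x in X])
        order = np.argsort(fX)
        X, fX = X[order], fX[order]
        # rank-based weights: better archive members guide more ants
        ranks = np.arange(n_archive)
        w = np.exp(-ranks ** 2 / (2 * (q * n_archive) ** 2))
        w /= w.sum()
        for _ in range(n_iter):
            new = np.empty((n_ants, dim))
            for a in range(n_ants):
                k = rng.choice(n_archive, p=w)              # guiding solution
                sigma = xi * np.abs(X - X[k]).mean(axis=0)  # per-dim spread
                new[a] = np.clip(rng.normal(X[k], sigma + 1e-12), lo, hi)
            fnew = np.array([f(x) for x in new])
            keep = np.argsort(np.concatenate([fX, fnew]))[:n_archive]
            X = np.vstack([X, new])[keep]
            fX = np.concatenate([fX, fnew])[keep]
        return X[0], fX[0]

    # toy objective standing in for a modal-parameter misfit function
    best_x, best_f = aco_continuous(lambda x: np.sum((x - 0.3) ** 2),
                                    bounds=[(-1.0, 1.0)] * 4)
    print(best_x, best_f)
    ```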

  20. Analysis of Biomass Sugars Using a Novel HPLC Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agblevor, F. A.; Hames, B. R.; Schell, D.

    The precise quantitative analysis of biomass sugars is a very important step in the conversion of biomass feedstocks to fuels and chemicals. However, the most accurate method of biomass sugar analysis is based on gas chromatography of derivatized sugars, either as alditol acetates or as trimethylsilanes. The derivatization method is time consuming, and the alternative high-performance liquid chromatography (HPLC) method cannot resolve most sugars found in biomass hydrolysates. We have demonstrated for the first time that by careful manipulation of the HPLC mobile phase, the biomass monomeric sugars (arabinose, xylose, fructose, glucose, mannose, and galactose) can be analyzed quantitatively with excellent baseline resolution of all the sugars. This method was demonstrated for standard sugars and for pretreated corn stover liquid and solid fractions. Our method can also be used to analyze dimeric sugars (cellobiose and sucrose).

  1. Improving Instructions Using a Data Analysis Collaborative Model

    ERIC Educational Resources Information Center

    Good, Rebecca B.; Jackson, Sherion H.

    2007-01-01

    As student data analysis reports become more sophisticated, these reports reveal greater details on low performance skills. Availability of models and programs depicting detailed instructions or guidance for utilizing data to impact classroom instruction, in an effort to increase student achievement, has been lacking. This study examines the…

  2. Detection of feigned mental disorders on the personality assessment inventory: a discriminant analysis.

    PubMed

    Rogers, R; Sewell, K W; Morey, L C; Ustad, K L

    1996-12-01

    Psychological assessment with multiscale inventories is largely dependent on the honesty and forthrightness of those persons evaluated. We investigated the effectiveness of the Personality Assessment Inventory (PAI) in detecting participants feigning three specific disorders: schizophrenia, major depression, and generalized anxiety disorder. With a simulation design, we tested the PAI validity scales on 166 naive (undergraduates with minimal preparation) and 80 sophisticated (doctoral psychology students with 1 week preparation) participants. We compared their results to persons with the designated disorders: schizophrenia (n = 45), major depression (n = 136), and generalized anxiety disorder (n = 40). Although moderately effective with naive simulators, the validity scales evidenced only modest positive predictive power with their sophisticated counterparts. Therefore, we performed a two-stage discriminant analysis that yielded a moderately high hit rate (> 80%) that was maintained in the cross-validation sample, irrespective of the feigned disorder or the sophistication of the simulators.

  3. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture...: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html): (a) Moisture content—section 16.233 “Method I (52)—Official Final Action”, under the heading “Moisture”. (b) Milkfat...

  4. Bayesian data analysis in population ecology: motivations, methods, and benefits

    USGS Publications Warehouse

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
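
    As a concrete taste of the "essentials of modern Bayesian computation" the article refers to, here is a minimal random-walk Metropolis sampler for a detection probability with a flat prior; the simulated survey data and tuning constants are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # simulated survey data: y[i] detections out of n[i] visits per site
    n = np.full(30, 10)
    true_p = 0.35
    y = rng.binomial(n, true_p)

    def log_post(p):
        """Log posterior for detection probability p with a Beta(1,1) prior."""
        if not 0 < p < 1:
            return -np.inf
        return np.sum(y * np.log(p) + (n - y) * np.log(1 - p))

    # random-walk Metropolis: the workhorse behind modern Bayesian computation
    samples, p = [], 0.5
    for _ in range(20000):
        prop = p + rng.normal(0, 0.05)
        if np.log(rng.random()) < log_post(prop) - log_post(p):
            p = prop
        samples.append(p)

    post = np.array(samples[5000:])          # discard burn-in
    print(post.mean(), np.percentile(post, [2.5, 97.5]))
    ```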

  5. Measuring the efficacy of advertising communication with neuroscience methods: an experiment performed by Telecom Italia.

    PubMed

    Grimaldi, Loredana

    2012-01-01

    Recently, there has been a concentrated effort by companies to better understand the needs and desires of their consumers. Such efforts usually employ different and sophisticated analysis techniques for monitoring consumers' preferences and how consumers perceive a specific company's advertising communication campaign.

  6. Reactive polymer coatings: A robust platform towards sophisticated surface engineering for biotechnology

    NASA Astrophysics Data System (ADS)

    Chen, Hsien-Yeh

    Functionalized poly(p-xylylenes), or so-called reactive polymers, can be synthesized via chemical vapor deposition (CVD) polymerization. The resulting ultra-thin coatings are pinhole-free and can be conformally deposited to a wide range of substrates and materials. More importantly, the equipped functional groups can serve as anchoring sites for tailoring the surface properties, making these reactive coatings a robust platform that can deal with sophisticated challenges faced at biointerfaces. In the work presented herein, surface coatings presenting various functional groups were prepared by the CVD process. Such surfaces include an aldehyde-functionalized coating to precisely immobilize saccharide molecules onto well-defined areas and an alkyne-functionalized coating to click azide-modified molecules via the Huisgen 1,3-dipolar cycloaddition reaction. Moreover, CVD copolymerization was conducted to prepare multifunctional coatings, and their specific functions were demonstrated by the immobilization of biotin and NHS-ester molecules. By using a photodefinable coating, polyethylene oxides were immobilized onto a wide range of substrates through photo-immobilization. Spatially controlled protein-resistant properties were characterized by the selective adsorption of fibrinogen and bovine serum albumin as model systems. Alternatively, surface-initiator coatings were used for polymer grafting of poly(ethylene glycol) methyl ether methacrylate, and the resultant protein- and cell-resistant properties were characterized by the adsorption of kinesin motor proteins, fibrinogen, and murine fibroblasts (NIH3T3). The accessibility of reactive coatings within confined microgeometries was systematically studied, and the preparation of homogeneous polymer thin films on the inner surface of microchannels was demonstrated. Moreover, these advanced coatings were applied to develop a dry adhesion process for microfluidic devices. This process provides (i) excellent bonding strength, (ii) extended

  7. Probabilistic structural analysis methods for space transportation propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.

    1991-01-01

    Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for 2nd stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans.

  8. Improved microarray methods for profiling the yeast knockout strain collection

    PubMed Central

    Yuan, Daniel S.; Pan, Xuewen; Ooi, Siew Loon; Peyser, Brian D.; Spencer, Forrest A.; Irizarry, Rafael A.; Boeke, Jef D.

    2005-01-01

    A remarkable feature of the Yeast Knockout strain collection is the presence of two unique 20mer TAG sequences in almost every strain. In principle, the relative abundances of strains in a complex mixture can be profiled swiftly and quantitatively by amplifying these sequences and hybridizing them to microarrays, but TAG microarrays have not been widely used. Here, we introduce a TAG microarray design with sophisticated controls and describe a robust method for hybridizing high concentrations of dye-labeled TAGs in single-stranded form. We also highlight the importance of avoiding PCR contamination and provide procedures for detection and eradication. Validation experiments using these methods yielded false positive (FP) and false negative (FN) rates for individual TAG detection of 3–6% and 15–18%, respectively. Analysis demonstrated that cross-hybridization was the chief source of FPs, while TAG amplification defects were the main cause of FNs. The materials, protocols, data and associated software described here comprise a suite of experimental resources that should facilitate the use of TAG microarrays for a wide variety of genetic screens. PMID:15994458

  9. Construction and Analysis of Multi-Rate Partitioned Runge-Kutta Methods

    DTIC Science & Technology

    2012-06-01

    Construction and Analysis of Multi-Rate Partitioned Runge-Kutta Methods, by Patrick R. Mugg, June 2012 (Master's thesis; Thesis Advisor: Francis Giraldo; Second Reader: Hong…). The most widely known and used procedure for analyzing stability is the Von Neumann method, in which Von Neumann's stability analysis looks at

  10. Thermal Analysis Methods for Aerobraking Heating

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

    2005-01-01

    As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. Run times on

  11. Visceral fat estimation method by bioelectrical impedance analysis and causal analysis

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Tasaki, Hiroshi; Tsuchiya, Naoki; Hamaguchi, Takehiro; Shiga, Toshikazu

    2011-06-01

    It has been established that abdominal visceral fat accumulation is closely associated with lifestyle diseases and metabolic syndrome. The gold standard in medicine is the visceral fat area measured by an X-ray computed tomography (CT) scan or magnetic resonance imaging. However, these measurements are highly invasive and costly, and a CT scan in particular involves X-ray exposure. This is why the medical field needs an instrument for visceral fat measurement that is minimally invasive, easy to use, and inexpensive. This article proposes a simple and practical method of visceral fat estimation employing bioelectrical impedance analysis and causal analysis. In the method, the abdominal shape and the dual impedances of the abdominal surface and the whole body are measured to estimate the visceral fat area based on a cause-effect structure. The structure is designed according to the nature of abdominal body composition and fine-tuned by statistical analysis. Experiments were conducted to investigate the proposed model: 180 subjects were recruited and measured by both a CT scan and the proposed method. The acquired model explained the measurement principle well, and the correlation coefficient with the CT scan measurements is 0.88.
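
    The paper's cause-effect structure and coefficients are not public; purely to illustrate the final calibration step, the sketch below fits a plain least-squares model mapping abdominal shape and the two impedances to a CT-measured visceral fat area on synthetic data. All variables and constants are invented.

    ```python
    import numpy as np

    # toy calibration set: abdominal girth plus surface/total impedances
    rng = np.random.default_rng(2)
    waist = rng.normal(95, 10, 180)          # cm
    z_surf = rng.normal(40, 5, 180)          # ohm, abdominal surface path
    z_total = rng.normal(500, 50, 180)       # ohm, whole body
    vfa_ct = 2.2 * waist - 1.5 * z_surf + 0.05 * z_total + rng.normal(0, 12, 180)

    # ordinary least squares: predictors plus an intercept column
    X = np.column_stack([waist, z_surf, z_total, np.ones_like(waist)])
    coef, *_ = np.linalg.lstsq(X, vfa_ct, rcond=None)
    pred = X @ coef
    r = np.corrcoef(pred, vfa_ct)[0, 1]
    print(f"correlation with CT reference: {r:.2f}")
    ```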

  12. METHOD OF CHEMICAL ANALYSIS FOR OIL SHALE WASTES

    EPA Science Inventory

    Several methods of chemical analysis are described for oil shale wastewaters and retort gases. These methods are designed to support the field testing of various pollution control systems. As such, emphasis has been placed on methods which are rapid and sufficiently rugged to per...

  13. Finite Volume Methods: Foundation and Analysis

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the obtention of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
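
    A minimal concrete instance of the ideas above: a first-order upwind finite volume scheme for linear advection with a monotone flux, whose solution respects the discrete maximum principle. Grid size, CFL number, and initial data are arbitrary.

    ```python
    import numpy as np

    # first-order upwind finite volume scheme for u_t + a u_x = 0 on [0, 1],
    # periodic boundaries; the upwind flux is monotone, so no new extrema
    # are created (discrete maximum principle)
    a, nx, cfl = 1.0, 200, 0.9
    dx = 1.0 / nx
    dt = cfl * dx / abs(a)
    x = (np.arange(nx) + 0.5) * dx
    u = np.where((0.25 < x) & (x < 0.5), 1.0, 0.0)   # square-wave initial data

    def upwind_flux(ul, ur):
        """Numerical flux at an interface for the linear flux f(u) = a*u."""
        return a * ul if a >= 0 else a * ur

    for _ in range(int(0.5 / dt)):
        ul = u                    # left state at each interface i+1/2
        ur = np.roll(u, -1)       # right state
        F = upwind_flux(ul, ur)   # flux through each right interface
        u = u - dt / dx * (F - np.roll(F, 1))   # conservative update
    print(u.min(), u.max())       # stays within [0, 1]: maximum principle
    ```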

  14. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  15. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282

  16. Static Aeroelastic Analysis with an Inviscid Cartesian Method

    NASA Technical Reports Server (NTRS)

    Rodriguez, David L.; Aftosmis, Michael J.; Nemec, Marian; Smith, Stephen C.

    2014-01-01

    An embedded-boundary, Cartesian-mesh flow solver is coupled with a three degree-of-freedom structural model to perform static, aeroelastic analysis of complex aircraft geometries. The approach solves a nonlinear, aerostructural system of equations using a loosely-coupled strategy. An open-source, 3-D discrete-geometry engine is utilized to deform a triangulated surface geometry according to the shape predicted by the structural model under the computed aerodynamic loads. The deformation scheme is capable of modeling large deflections and is applicable to the design of modern, very-flexible transport wings. The coupling interface is modular so that aerodynamic or structural analysis methods can be easily swapped or enhanced. After verifying the structural model with comparisons to Euler beam theory, two applications of the analysis method are presented as validation. The first is a relatively stiff, transport wing model which was a subject of a recent workshop on aeroelasticity. The second is a very flexible model recently tested in a low speed wind tunnel. Both cases show that the aeroelastic analysis method produces results in excellent agreement with experimental data.
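
    The flavour of the loosely-coupled strategy can be shown on a textbook one-degree-of-freedom case: a wing section on a torsional spring, alternating an aerodynamic solve and a structural solve until the coupled system converges. All constants are illustrative and unrelated to the paper's models.

    ```python
    import numpy as np

    # minimal loosely-coupled static aeroelastic loop for a rigid wing
    # section on a torsional spring (one structural DOF)
    q = 4000.0        # dynamic pressure, Pa
    S, c = 2.0, 0.5   # area m^2, chord m
    e = 0.1 * c       # offset of aerodynamic centre ahead of elastic axis, m
    CLa = 5.7         # lift-curve slope, 1/rad
    K = 6000.0        # torsional stiffness, N*m/rad
    alpha0 = np.radians(3.0)

    theta = 0.0
    for it in range(100):
        L = q * S * CLa * (alpha0 + theta)   # aerodynamic solve
        theta_new = L * e / K                # structural solve (spring)
        if abs(theta_new - theta) < 1e-10:   # coupled-system convergence
            break
        theta = theta_new                    # feed deformation back to aero
    print(it, np.degrees(theta))
    ```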

  17. Structural analysis of jewelry from the Moche tomb of the `lady of Cao' by X-ray digital radiography

    NASA Astrophysics Data System (ADS)

    Azeredo, S. R.; Cesareo, R.; Franco, R.; Fernandez, A.; Bustamante, A.; Lopes, R. T.

    2018-04-01

    Nose ornaments from the tomb of the `Lady of Cao', a mummified woman representative of the Moche culture and dated to the third or fourth century AD, were analyzed by X-ray digital radiography. These spectacular gold and silver jewels represent some of the most sophisticated metalwork ever produced in ancient America. The Moche civilization flourished along the north coast of present-day Peru, between the Andes and the Pacific Ocean, approximately between 100 and 600 AD. The Moche were very sophisticated artisans and metalsmiths, considered the finest producers of jewels and artifacts of the region. A portable X-ray digital radiography (XDR) system, consisting of a flat-panel detector with high-resolution imaging and a mini X-ray tube, was used for the structural analysis of the Moche jewels with the aim of inferring the different methods used to join the silver and gold sheets. The radiographic analysis showed some differences in the joints of the silver and gold sheets. The presence of filler material and adhesive for joining the sheets was visible, as well as silver-gold junctions without filler material (or with a material invisible in radiography). Furthermore, the technique demonstrated the advantage of using a portable XDR micro system when the sample cannot be brought to the laboratory.

  18. Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)

    2001-01-01

    Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each of the method's strengths are utilized. The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are

  19. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, T. H. H.

    1985-01-01

    Advanced stress analysis methods applicable to turbine engine structures are investigated. The construction of special elements containing traction-free circular boundaries is investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for the suppression of kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed-stress hybrid method. An elastic-plastic analysis is conducted using viscoplasticity theory with the mechanical subelement model.

  20. Interrupted time-series analysis: studying trends in neurosurgery.

    PubMed

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K

    2015-12-01

    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
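
    A minimal sketch of the ITSA regression itself, assuming the standard segmented-regression design matrix (pre-intervention trend, level change, slope change) and simulated monthly data; statsmodels is used for the fit.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # simulated monthly outcome with an intervention at month 36:
    # a level drop plus a change in slope
    rng = np.random.default_rng(3)
    t = np.arange(72)
    post = (t >= 36).astype(float)
    y = 50 + 0.2 * t - 5 * post - 0.3 * (t - 36) * post + rng.normal(0, 1.5, 72)

    # classic ITSA design matrix: time, level change, slope change
    X = sm.add_constant(np.column_stack([t, post, (t - 36) * post]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)   # [baseline, pre-trend, level shift, trend change]
    ```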

  1. Using Horn's Parallel Analysis Method in Exploratory Factor Analysis for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Çokluk, Ömay; Koçak, Duygu

    2016-01-01

    In this study, the number of factors obtained from parallel analysis, a method used for determining the number of factors in exploratory factor analysis, was compared to that of the factors obtained from eigenvalue and scree plot--two traditional methods for determining the number of factors--in terms of consistency. Parallel analysis is based on…
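
    A minimal sketch of Horn's parallel analysis as described above: observed eigenvalues of the correlation matrix are retained only while they exceed a chosen percentile of eigenvalues obtained from random data of the same shape. The simulated two-factor data set is illustrative.

    ```python
    import numpy as np

    def parallel_analysis(data, n_sims=500, percentile=95, seed=0):
        """Horn's parallel analysis: retain factors whose observed
        eigenvalues exceed those of same-shaped random data."""
        rng = np.random.default_rng(seed)
        n, p = data.shape
        obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
        rand = np.empty((n_sims, p))
        for s in range(n_sims):
            R = np.corrcoef(rng.normal(size=(n, p)), rowvar=False)
            rand[s] = np.sort(np.linalg.eigvalsh(R))[::-1]
        threshold = np.percentile(rand, percentile, axis=0)
        return int(np.sum(obs > threshold)), obs, threshold

    # two correlated blocks of variables -> expect two retained factors
    rng = np.random.default_rng(1)
    f = rng.normal(size=(300, 2))
    X = np.column_stack([f[:, 0]] * 3 + [f[:, 1]] * 3) + rng.normal(0, 0.7, (300, 6))
    k, obs, thr = parallel_analysis(X)
    print(k)
    ```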

  2. Methods for Force Analysis of Overconstrained Parallel Mechanisms: A Review

    NASA Astrophysics Data System (ADS)

    Liu, Wen-Lan; Xu, Yun-Dou; Yao, Jian-Tao; Zhao, Yong-Sheng

    2017-11-01

    The force analysis of overconstrained parallel mechanisms (PMs) is relatively complex and difficult, and the methods for it have long been a research hotspot. However, few publications analyze the characteristics and application scopes of the various methods, which makes it difficult for researchers and engineers to master and adopt them properly. A review of the methods for force analysis of both passive and active overconstrained PMs is presented. The existing force analysis methods for these two kinds of overconstrained PMs are classified according to their main ideas. Each category is briefly demonstrated and evaluated with respect to such aspects as the amount of calculation, how comprehensively limb deformation is considered, and whether explicit expressions for the solutions exist, which provides an important reference for researchers and engineers seeking a suitable method. The similarities and differences between the statically indeterminate problem of passive overconstrained PMs and that of active overconstrained PMs are discussed, and a universal method for both kinds is pointed out. The existing deficiencies and development directions of force analysis methods for overconstrained systems are indicated based on this overview.

  3. Nonlinear multivariate and time series analysis by neural network methods

    NASA Astrophysics Data System (ADS)

    Hsieh, William W.

    2004-03-01

    Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.
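
    NLPCA is commonly realised as a bottleneck autoencoder; the sketch below uses scikit-learn's MLPRegressor as a stand-in for the paper's networks, recovering a one-dimensional nonlinear component from a curved data cloud that linear PCA cannot capture with one component. Layer sizes and the toy data are arbitrary.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # a 1-D curve embedded nonlinearly in 2-D
    rng = np.random.default_rng(0)
    t = rng.uniform(-1, 1, 500)
    X = np.column_stack([t, t ** 2]) + rng.normal(0, 0.02, (500, 2))

    # autoencoder with a one-unit bottleneck: 2 -> 8 -> 1 -> 8 -> 2
    ae = MLPRegressor(hidden_layer_sizes=(8, 1, 8), activation="tanh",
                      max_iter=5000, random_state=0)
    ae.fit(X, X)   # train the network to reproduce its own input

    def bottleneck(X, model):
        """Activation of the single bottleneck unit = nonlinear PC score."""
        A = X
        for W, b in list(zip(model.coefs_, model.intercepts_))[:2]:
            A = np.tanh(A @ W + b)   # encoder layers only
        return A.ravel()

    pc1 = bottleneck(X, ae)
    print(abs(np.corrcoef(pc1, t)[0, 1]))   # tracks the curve parameter
    ```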

  4. Security analysis and improvements to the PsychoPass method.

    PubMed

    Brumen, Bostjan; Heričko, Marjan; Rozman, Ivan; Hölbl, Marko

    2013-08-13

    In a recent paper, Pietro Cipresso et al proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Our objective was to perform a security analysis of the PsychoPass method and to outline its limitations and possible improvements. We used brute-force analysis and dictionary-attack analysis of the PsychoPass method to outline its weaknesses. The first issue with the PsychoPass method is that it requires the password to be reproduced on the same keyboard layout as was used to generate it. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 distances apart. The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing power. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength.
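
    The gist of the brute-force argument can be made quantitative: a 24-character password is only as strong as the choices actually available at each step. The branching factor of 4 and the guess rate below are illustrative assumptions, not the authors' figures.

    ```python
    import math

    def years_to_break(bits, guesses_per_second=1e12):
        """Worst-case exhaustive-search time for a given entropy."""
        return 2 ** bits / guesses_per_second / (3600 * 24 * 365)

    # naive view: 24 independent characters from a 94-symbol alphabet
    naive_bits = 24 * math.log2(94)

    # constrained view: the first key is free, but each following key is one
    # of only a few neighbouring keys (illustrative branching factor of 4)
    effective_bits = math.log2(94) + 23 * math.log2(4)

    for label, bits in [("naive", naive_bits), ("effective", effective_bits)]:
        print(f"{label:9s}: {bits:5.1f} bits, {years_to_break(bits):.3e} years")
    ```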

  5. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    The computational methods used to predict and optimize the thermal structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for linear quadrilateral elements is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.

  6. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1983-01-01

    The computational methods used to predict and optimize the thermal-structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for linear quadrilateral elements is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.

  7. Synchronisation and coupling analysis: applied cardiovascular physics in sleep medicine.

    PubMed

    Wessel, Niels; Riedl, Maik; Kramer, Jan; Muller, Andreas; Penzel, Thomas; Kurths, Jurgen

    2013-01-01

    Sleep is a physiological process with an internal program of a number of well-defined sleep stages and intermediate wakefulness periods. The sleep stages modulate the autonomic nervous system, and thereby the sleep stages are accompanied by different regulation regimes for the cardiovascular and respiratory systems. The differences in regulation can be distinguished by new techniques of cardiovascular physics. The number of patients suffering from sleep disorders is increasing disproportionately as the population grows and ages, leading to very high expenses in the public health system. Therefore, the challenge of cardiovascular physics is to develop highly sophisticated methods which are able, on the one hand, to supplement and replace expensive medical devices and, on the other hand, to improve medical diagnostics while decreasing the patient's risk. Methods of cardiovascular physics are used to analyze heart rate, blood pressure and respiration to detect changes of the autonomic nervous system in different diseases. Data-driven modeling analysis, synchronization and coupling analysis, and their applications to biosignals in healthy subjects and patients with different sleep disorders are presented. Newly derived methods of cardiovascular physics can help to find indicators for these health risks.

  8. Electromagnetic Imaging Methods for Nondestructive Evaluation Applications

    PubMed Central

    Deng, Yiming; Liu, Xin

    2011-01-01

    Electromagnetic nondestructive tests are important and widely used within the field of nondestructive evaluation (NDE). Recent advances in sensing technology, in hardware and software dedicated to imaging and image processing, and in materials science have greatly expanded the application fields, made system designs more sophisticated, and made the potential of electromagnetic NDE imaging seem unlimited. This review provides a comprehensive summary of research on electromagnetic imaging methods for NDE applications, followed by a summary and discussion of future directions. PMID:22247693

  9. Comparison of detrending methods for fluctuation analysis in hydrology

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Chen, Yongqin David

    2011-03-01

    Trends within a hydrologic time series can significantly influence the scaling results of fluctuation analysis, such as rescaled range (RS) analysis and (multifractal) detrended fluctuation analysis (MF-DFA). Therefore, removal of trends is important in the study of scaling properties of the time series. In this study, three detrending methods, including the adaptive detrending algorithm (ADA), a Fourier-based method, and the average removing technique, were evaluated by analyzing numerically generated series and observed streamflow series with an obvious, relatively regular periodic trend. Results indicated that: (1) the Fourier-based detrending method and ADA are similar in detrending practice, and given proper parameters, these two methods can produce similarly satisfactory results; (2) series detrended by the Fourier-based method and ADA lose the fluctuation information at larger time scales, and the location of crossover points is heavily impacted by the chosen parameters of these two methods; and (3) the average removing method has an advantage over the other two methods, i.e., the fluctuation information at larger time scales is kept well, an indication of relatively reliable performance in detrending. In addition, the average removing method performed reasonably well in detrending a time series with regular periods or trends. In this sense, the average removing method should be preferred in the study of scaling properties of hydrometeorological series with relatively regular periodic trends using MF-DFA.
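
    Since the detrending step feeds directly into (MF-)DFA, a minimal monofractal DFA implementation helps fix ideas; window sizes, polynomial order, and the white-noise test signal are arbitrary choices.

    ```python
    import numpy as np

    def dfa(x, scales, order=1):
        """Detrended fluctuation analysis of a 1-D series.

        Returns the fluctuation function F(s); the slope of log F versus
        log s is the scaling exponent alpha.
        """
        y = np.cumsum(x - np.mean(x))          # integrated profile
        F = []
        for s in scales:
            n_seg = len(y) // s
            segs = y[:n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            rms = []
            for seg in segs:
                coef = np.polyfit(t, seg, order)   # local polynomial trend
                rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            F.append(np.sqrt(np.mean(rms)))
        return np.array(F)

    rng = np.random.default_rng(0)
    white = rng.normal(size=4096)
    scales = np.unique(np.logspace(2, 9, 20, base=2).astype(int))
    F = dfa(white, scales)
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    print(round(alpha, 2))   # ~0.5 for uncorrelated noise
    ```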

  10. Security Analysis and Improvements to the PsychoPass Method

    PubMed Central

    2013-01-01

    Background In a recent paper, Pietro Cipresso et al proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Objective To perform a security analysis on the PsychoPass method and outline the limitations of and possible improvements to the method. Methods We used the brute force analysis and dictionary attack analysis of the PsychoPass method to outline its weaknesses. Results The first issue with the Psychopass method is that it requires the password reproduction on the same keyboard layout as was used to generate the password. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 distances apart. Conclusions The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing powers. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength. PMID:23942458

  11. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  12. Digital dream analysis: a revised method.

    PubMed

    Bulkeley, Kelly

    2014-10-01

    This article demonstrates the use of a digital word search method designed to provide greater accuracy, objectivity, and speed in the study of dreams. A revised template of 40 word search categories, built into the website of the Sleep and Dream Database (SDDb), is applied to four "classic" sets of dreams: The male and female "Norm" dreams of Hall and Van de Castle (1966), the "Engine Man" dreams discussed by Hobson (1988), and the "Barb Sanders Baseline 250" dreams examined by Domhoff (2003). A word search analysis of these original dream reports shows that a digital approach can accurately identify many of the same distinctive patterns of content found by previous investigators using much more laborious and time-consuming methods. The results of this study emphasize the compatibility of word search technologies with traditional approaches to dream content analysis.
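
    The mechanics of a digital word search reduce to counting category-vocabulary matches per report; the sketch below uses a three-category stand-in for the SDDb's 40-category template, whose real word lists are far larger.

    ```python
    import re
    from collections import Counter

    # tiny stand-in for the SDDb's 40-category template
    categories = {
        "emotion": {"afraid", "happy", "angry", "sad", "fear"},
        "movement": {"run", "running", "fall", "falling", "fly", "flying"},
        "water": {"ocean", "river", "rain", "swim", "swimming"},
    }

    def profile(report):
        """Count category-vocabulary hits in a single dream report."""
        words = re.findall(r"[a-z']+", report.lower())
        counts = Counter()
        for w in words:
            for cat, vocab in categories.items():
                if w in vocab:
                    counts[cat] += 1
        return counts

    dream = "I was running from something and fell into a river, afraid."
    print(profile(dream))   # movement, water, and emotion each matched once
    ```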

  13. Detrended Fluctuation Analysis and Adaptive Fractal Analysis of Stride Time Data in Parkinson's Disease: Stitching Together Short Gait Trials

    PubMed Central

    Liebherr, Magnus; Haas, Christian T.

    2014-01-01

    Variability indicates motor control disturbances and is suitable to identify gait pathologies. It can be quantified by linear parameters (amplitude estimators) and by more sophisticated nonlinear methods (structural information). Detrended Fluctuation Analysis (DFA) is one method to measure structural information, e.g., from stride time series. Recently, an improved method, Adaptive Fractal Analysis (AFA), has been proposed. This method has not been applied to gait data before. Fractal scaling (FS) methods require long stride-to-stride data to obtain valid results. However, in clinical studies it is not usual to measure a large number of strides. Amongst other constraints, clinical gait analysis is limited by short walkways; thus, FS methods seem to be inapplicable. The purpose of the present study was to evaluate FS methods under clinical conditions. Stride time data of five self-paced walking trials (… strides each) of subjects with PD and a healthy control group (CG) were measured. To generate longer time series, the stride time sequences were stitched together. The coefficient of variation (CV) and the fractal scaling exponents α (DFA) and α (AFA) were calculated. Two surrogate tests were performed: A) the whole time series was randomly shuffled; B) the single trials were randomly shuffled separately and afterwards stitched together. CV did not discriminate between PD and CG. However, significant differences between PD and CG were found concerning α (DFA) and α (AFA). Surrogate version B yielded a higher mean squared error and empirical quantiles than version A. Hence, we conclude that the stitching procedure creates an artificial structure resulting in an overestimation of the true α. The method of stitching together sections of gait seems to be appropriate in order to distinguish between PD and CG with FS methods. It provides an approach to integrate FS methods as a standard in clinical gait analysis and to overcome limitations such as short walkways. PMID:24465708

  14. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  15. Method for combined biometric and chemical analysis of human fingerprints.

    PubMed

    Staymates, Jessica L; Orandi, Shahram; Staymates, Matthew E; Gillen, Greg

    This paper describes a method for combining direct chemical analysis of latent fingerprints with subsequent biometric analysis within a single sample. The method described here uses ion mobility spectrometry (IMS) as a chemical detection method for explosives and narcotics trace contamination. A collection swab coated with a high-temperature adhesive has been developed to lift latent fingerprints from various surfaces. The swab is then directly inserted into an IMS instrument for a quick chemical analysis. After the IMS analysis, the lifted print remains intact for subsequent biometric scanning and analysis using matching algorithms. Several samples of explosive-laden fingerprints were successfully lifted and the explosives detected with IMS. Following explosive detection, the lifted fingerprints remained of sufficient quality for positive match scores using a prepared gallery consisting of 60 fingerprints. Based on our results (n = 1200), there was no significant decrease in the quality of the lifted print post IMS analysis. In fact, for a small subset of lifted prints, the quality was improved after IMS analysis. The described method can be readily applied to domestic criminal investigations, transportation security, terrorist and bombing threats, and military in-theatre settings.

  16. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, Theodore H. H.

    1991-01-01

    The following tasks on the study of advanced stress analysis methods applicable to turbine engine structures are described: (1) constructions of special elements which contain traction-free circular boundaries; (2) formulation of new version of mixed variational principles and new version of hybrid stress elements; (3) establishment of methods for suppression of kinematic deformation modes; (4) construction of semiLoof plate and shell elements by assumed stress hybrid method; and (5) elastic-plastic analysis by viscoplasticity theory using the mechanical subelement model.

  17. MHOST: An efficient finite element program for inelastic analysis of solids and structures

    NASA Technical Reports Server (NTRS)

    Nakazawa, S.

    1988-01-01

    An efficient finite element program for 3-D inelastic analysis of gas turbine hot section components was constructed and validated. A novel mixed iterative solution strategy is derived from the augmented Hu-Washizu variational principle in order to nodally interpolate coordinates, displacements, deformation, strains, stresses and material properties. The series of increasingly sophisticated material models incorporated in MHOST includes elasticity, secant plasticity, infinitesimal and finite deformation plasticity, creep, and the unified viscoplastic constitutive model proposed by Walker. A library of high-performance elements is built into this computer program, utilizing the concepts of selective reduced integration and independent strain interpolation. A family of efficient solution algorithms is implemented in MHOST for linear and nonlinear equation solution, including the classical Newton-Raphson, modified, quasi and secant Newton methods with optional line search, and the conjugate gradient method.
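
    Of the solution algorithms listed, the classical Newton-Raphson iteration with an optional line search is easy to sketch in isolation; the sample nonlinear system below is illustrative and unrelated to MHOST's finite element equations.

    ```python
    import numpy as np

    def newton(F, J, x0, tol=1e-10, max_iter=50):
        """Newton-Raphson for F(x) = 0 with simple backtracking line search."""
        x = np.asarray(x0, float)
        for _ in range(max_iter):
            f = F(x)
            if np.linalg.norm(f) < tol:
                break
            dx = np.linalg.solve(J(x), -f)   # Newton direction
            step = 1.0
            # backtrack until the residual actually decreases
            while (np.linalg.norm(F(x + step * dx)) > np.linalg.norm(f)
                   and step > 1e-4):
                step *= 0.5
            x = x + step * dx
        return x

    # small nonlinear system: x^2 + y^2 = 4, e^x + y = 1
    F = lambda v: np.array([v[0]**2 + v[1]**2 - 4, np.exp(v[0]) + v[1] - 1])
    J = lambda v: np.array([[2 * v[0], 2 * v[1]], [np.exp(v[0]), 1.0]])
    print(newton(F, J, [1.0, -1.0]))
    ```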

  18. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas Downar; E. Lewis

    2005-08-31

    Methods were developed for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code PARCS.

  19. Systems and methods for sample analysis

    DOEpatents

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-01-13

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  20. Analysis methods for tocopherols and tocotrienols

    USDA-ARS?s Scientific Manuscript database

    Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...

  1. Application of the Bootstrap Methods in Factor Analysis.

    ERIC Educational Resources Information Center

    Ichikawa, Masanori; Konishi, Sadanori

    1995-01-01

    A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)
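
    A minimal sketch of the bootstrap idea in a factor-analysis setting: resample observations with replacement and track the sampling variability of the correlation-matrix eigenvalues on which retention decisions rest. The one-factor synthetic data set is illustrative, not the study's design.

    ```python
    import numpy as np

    def bootstrap_eigs(X, n_boot=1000, seed=0):
        """Bootstrap the eigenvalues of the correlation matrix by
        resampling rows (observations) with replacement."""
        rng = np.random.default_rng(seed)
        n = len(X)
        eigs = np.empty((n_boot, X.shape[1]))
        for b in range(n_boot):
            idx = rng.integers(0, n, n)
            R = np.corrcoef(X[idx], rowvar=False)
            eigs[b] = np.sort(np.linalg.eigvalsh(R))[::-1]
        return eigs

    # four indicators loading on one common factor
    rng = np.random.default_rng(42)
    f = rng.normal(size=(200, 1))
    X = np.hstack([f + rng.normal(0, 0.5, (200, 1)) for _ in range(4)])
    eigs = bootstrap_eigs(X)
    lo, hi = np.percentile(eigs[:, 0], [2.5, 97.5])
    print(f"first eigenvalue 95% CI: [{lo:.2f}, {hi:.2f}]")
    ```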

  2. Prospective risk analysis prior to retrospective incident reporting and analysis as a means to enhance incident reporting behaviour: a quasi-experimental field study.

    PubMed

    Kessels-Habraken, Marieke; De Jonge, Jan; Van der Schaaf, Tjerk; Rutte, Christel

    2010-05-01

    Hospitals can apply prospective and retrospective methods to reduce the large number of medical errors. Retrospective methods are used to identify errors after they occur and to facilitate learning. Prospective methods aim to determine, assess and minimise risks before incidents happen. This paper questions whether the order of implementation of those two methods influences the resultant impact on incident reporting behaviour. From November 2007 until June 2008, twelve wards of two Dutch general hospitals participated in a quasi-experimental reversed-treatment non-equivalent control group design. The six units of Hospital 1 first conducted a prospective analysis, after which a sophisticated incident reporting and analysis system was implemented. On the six units of Hospital 2 the two methods were implemented in reverse order. Data from the incident reporting and analysis system and from a questionnaire were used to assess between-hospital differences regarding the number of reported incidents, the spectrum of reported incident types, and the profession of reporters. The results show that carrying out a prospective analysis first can improve incident reporting behaviour in terms of a wider spectrum of reported incident types and a larger proportion of incidents reported by doctors. However, the proposed order does not necessarily yield a larger number of reported incidents. This study fills an important gap in safety management research regarding the order of the implementation of prospective and retrospective methods, and contributes to literature on incident reporting. This research also builds on the network theory of social contagion. The results might indicate that health care employees can disseminate their risk perceptions through communication with their direct colleagues.

  3. Data Analysis and Data Mining: Current Issues in Biomedical Informatics

    PubMed Central

    Bellazzi, Riccardo; Diomidous, Marianna; Sarkar, Indra Neil; Takabayashi, Katsuhiko; Ziegler, Andreas; McCray, Alexa T.

    2011-01-01

    Background Medicine and biomedical sciences have become data-intensive fields, which, at the same time, enable the application of data-driven approaches and require sophisticated data analysis and data mining methods. Biomedical informatics provides a proper interdisciplinary context to integrate data and knowledge when processing available information, with the aim of giving effective decision-making support in clinics and translational research. Objectives To reflect on different perspectives related to the role of data analysis and data mining in biomedical informatics. Methods On the occasion of the 50th year of Methods of Information in Medicine a symposium was organized, that reflected on opportunities, challenges and priorities of organizing, representing and analysing data, information and knowledge in biomedicine and health care. The contributions of experts with a variety of backgrounds in the area of biomedical data analysis have been collected as one outcome of this symposium, in order to provide a broad, though coherent, overview of some of the most interesting aspects of the field. Results The paper presents sections on data accumulation and data-driven approaches in medical informatics, data and knowledge integration, statistical issues for the evaluation of data mining models, translational bioinformatics and bioinformatics aspects of genetic epidemiology. Conclusions Biomedical informatics represents a natural framework to properly and effectively apply data analysis and data mining methods in a decision-making context. In the future, it will be necessary to preserve the inclusive nature of the field and to foster an increasing sharing of data and methods between researchers. PMID:22146916

  4. Two implementations of the Expert System for the Flight Analysis System (ESFAS) project

    NASA Technical Reports Server (NTRS)

    Wang, Lui

    1988-01-01

    A comparison is made between the two most sophisticated expert system building tools, the Automated Reasoning Tool (ART) and the Knowledge Engineering Environment (KEE). The same problem domain (ESFAS) was used in making the comparison. The Expert System for the Flight Analysis System (ESFAS) acts as an intelligent front end for the Flight Analysis System (FAS). FAS is a complex, configuration-controlled set of interrelated processors (FORTRAN routines) which will be used by the Mission Planning and Analysis Div. (MPAD) to design and analyze Shuttle and potential Space Station missions. Implementations of ESFAS are described. The two versions represent very different programming paradigms; ART uses rules and KEE uses objects. Due to the tools' philosophical differences, KEE is implemented using a depth-first traversal algorithm, whereas ART uses a user-directed traversal method. Either tool could be used to solve this particular problem.

  5. Method of Real-Time Principal-Component Analysis

    NASA Technical Reports Server (NTRS)

    Duong, Tuan; Duong, Vu

    2005-01-01

    Dominant-element-based gradient descent and dynamic initial learning rate (DOGEDYN) is a method of sequential principal-component analysis (PCA) that is well suited for such applications as data compression and extraction of features from sets of data. In comparison with a prior method of gradient-descent-based sequential PCA, this method offers a greater rate of learning convergence. Like the prior method, DOGEDYN can be implemented in software. However, the main advantage of DOGEDYN over the prior method lies in the fact that it requires less computation and can be implemented in simpler hardware. It should be possible to implement DOGEDYN in compact, low-power, very-large-scale integrated (VLSI) circuitry that could process data in real time.
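
    As a point of reference, the sketch below shows a generic gradient-descent sequential PCA of the kind DOGEDYN improves upon: components are learned one at a time with Oja's rule, and a simple decaying learning rate stands in for the dynamic initial learning rate, whose exact schedule the abstract does not give. All names and parameters are illustrative, not the published algorithm.

      import numpy as np

      def sequential_pca(X, n_components, lr0=0.1, n_epochs=50):
          """Gradient-descent sequential PCA (Oja-rule sketch, not DOGEDYN itself)."""
          X = X - X.mean(axis=0)            # zero-mean data, samples in rows
          R = X.copy()                      # residual after deflation
          components = []
          for _ in range(n_components):
              w = np.random.randn(R.shape[1])
              w /= np.linalg.norm(w)
              for epoch in range(n_epochs):
                  lr = lr0 / (1.0 + epoch)  # decaying learning rate
                  for x in R:
                      y = w @ x
                      w += lr * y * (x - y * w)   # Oja's rule update
                  w /= np.linalg.norm(w)
              components.append(w.copy())
              R = R - np.outer(R @ w, w)    # deflate: remove found component
          return np.array(components)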

  6. Reconnecting Stochastic Methods With Hydrogeological Applications: A Utilitarian Uncertainty Analysis and Risk Assessment Approach for the Design of Optimal Monitoring Networks

    NASA Astrophysics Data System (ADS)

    Bode, Felix; Ferré, Ty; Zigelli, Niklas; Emmert, Martin; Nowak, Wolfgang

    2018-03-01

    Collaboration between academics and practitioners promotes knowledge transfer between research and industry, with both sides benefiting greatly. However, academic approaches are often not feasible given real-world limits on time, cost and data availability, especially for risk and uncertainty analyses. Although the need for uncertainty quantification and risk assessment is clear, there are few published studies examining how scientific methods can be used in practice. In this work, we introduce possible strategies for transferring and communicating academic approaches to real-world applications, countering the current disconnect between increasingly sophisticated academic methods and methods that work and are accepted in practice. We analyze a collaboration between academics and water suppliers in Germany who wanted to design optimal groundwater monitoring networks for drinking-water well catchments. Our key conclusions are: to prefer multiobjective over single-objective optimization; to replace Monte Carlo analyses with scenario methods; and to replace data-hungry quantitative risk assessment with easy-to-communicate qualitative methods. For improved communication, it is critical to set up common glossaries of terms to avoid misunderstandings, to use striking visualization to communicate key concepts, and to jointly and continually revisit the project objectives. Ultimately, these approaches and recommendations are simple and utilitarian enough to be transferred directly to other practical water-resource problems.

  7. Advanced Methods of Protein Crystallization.

    PubMed

    Moreno, Abel

    2017-01-01

    This chapter provides a review of different advanced methods that help to increase the success rate of a crystallization project, by producing larger and higher quality single crystals for determination of macromolecular structures by crystallographic methods. For this purpose, the chapter is divided into three parts. The first part deals with the fundamentals for understanding the crystallization process through different strategies based on physical and chemical approaches. The second part presents new approaches involved in more sophisticated methods not only for growing protein crystals but also for controlling the size and orientation of crystals through utilization of electromagnetic fields and other advanced techniques. The last section deals with three different aspects: the importance of microgravity, the use of ligands to stabilize proteins, and the use of microfluidics to obtain protein crystals. All these advanced methods will allow the readers to obtain suitable crystalline samples for high-resolution X-ray and neutron crystallography.

  8. Wilsonian methods of concept analysis: a critique.

    PubMed

    Hupcey, J E; Morse, J M; Lenz, E R; Tasón, M C

    1996-01-01

    Wilsonian methods of concept analysis--that is, the method proposed by Wilson and the Wilson-derived methods in nursing (as described by Walker and Avant; Chinn and Kramer [Jacobs]; Schwartz-Barcott and Kim; and Rodgers)--are discussed and compared in this article. The evolution and modifications of Wilson's method in nursing are described, and research that has used these methods is assessed. The transformation of Wilson's method is traced as each author has adopted his techniques and attempted to modify the method to correct for limitations. We suggest that these adaptations and modifications ultimately erode Wilson's method. Further, the Wilson-derived methods have been overly simplified and used by nurse researchers in a prescriptive manner, and the results often do not serve the purpose of expanding nursing knowledge. We conclude that, considering the significance of concept development for the nursing profession, the development of new methods and a means for evaluating conceptual inquiry must be given priority.

  9. On the Analysis of Output Information of S-tree Method

    NASA Astrophysics Data System (ADS)

    Bekaryan, Karen M.; Melkonyan, Anahit A.

    2007-08-01

    One of the most popular and effective methods for analysing the hierarchical structure of N-body gravitating systems is the method of S-tree diagrams. Despite many interesting features, the method is unfortunately not free from disadvantages, the most important of which is the extreme complexity of analysing its output information. To solve this problem a number of methods have been suggested. From our point of view, the most effective approach is to apply all of these methods simultaneously. This yields a more complete and objective picture of the final distribution.

  10. Analysis of Electrowetting Dynamics with Level Set Method

    NASA Astrophysics Data System (ADS)

    Park, Jun Kwon; Hong, Jiwoo; Kang, Kwan Hyoung

    2009-11-01

    Electrowetting is a versatile tool to handle tiny droplets and forms a backbone of digital microfluidics. Numerical analysis is necessary to fully understand the dynamics of electrowetting, especially in designing electrowetting-based liquid lenses and reflective displays. We developed a numerical method to analyze the general contact-line problems, incorporating dynamic contact angle models. The method was applied to the analysis of spreading process of a sessile droplet for step input voltages in electrowetting. The result was compared with experimental data and analytical result which is based on the spectral method. It is shown that contact line friction significantly affects the contact line motion and the oscillation amplitude. The pinning process of contact line was well represented by including the hysteresis effect in the contact angle models.

  11. Risk Analysis Methods for Deepwater Port Oil Transfer Systems

    DOT National Transportation Integrated Search

    1976-06-01

    This report deals with the risk analysis methodology for oil spills from the oil transfer systems in deepwater ports. Failure mode and effect analysis in combination with fault tree analysis are identified as the methods best suited for the assessmen...

  12. C-Depth Method to Determine Diffusion Coefficient and Partition Coefficient of PCB in Building Materials.

    PubMed

    Liu, Cong; Kolarik, Barbara; Gunnarsen, Lars; Zhang, Yinping

    2015-10-20

    Polychlorinated biphenyls (PCBs) have been found to be persistent in the environment and possibly harmful. Many buildings are characterized by high PCB concentrations. Knowledge about partitioning between primary sources and building materials is critical for exposure assessment and practical remediation of PCB contamination. This study develops a C-depth method to determine the diffusion coefficient (D) and partition coefficient (K), two key parameters governing the partitioning process. For concrete, the primary material studied here, relative standard deviations of results among five data sets are 5-22% for K and 42-66% for D. Compared with existing methods, the C-depth method overcomes the inability of nonlinear regression to yield unique estimates and does not require assumed correlations of D and K among congeners. Comparison with a more sophisticated two-term approach implies significant uncertainty for D and smaller uncertainty for K. However, considering the uncertainties associated with sampling and chemical analysis and the impact of environmental factors, the results are acceptable for engineering applications. This is supported by good agreement between model predictions and measurements. Sensitivity analysis indicated that the effective diffusion distance, the contact time of materials with primary sources, and the depth of measured concentrations are critical for determining D, while the PCB concentration in primary sources is critical for K.

  13. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method are described for the electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  14. Bioinformatics/biostatistics: microarray analysis.

    PubMed

    Eichler, Gabriel S

    2012-01-01

    The quantity and complexity of the molecular-level data generated in both research and clinical settings require the use of sophisticated, powerful computational interpretation techniques. It is for this reason that bioinformatic analysis of complex molecular profiling data has become a fundamental technology in the development of personalized medicine. This chapter provides a high-level overview of the field of bioinformatics and outlines several classic bioinformatic approaches. The highlighted approaches can be aptly applied to nearly any sort of high-dimensional genomic, proteomic, or metabolomic experiment. Technologies reviewed in this chapter include traditional clustering analysis, the Gene Expression Dynamics Inspector (GEDI), GoMiner, Gene Set Enrichment Analysis (GSEA), and the Learner of Functional Enrichment (LeFE).

  15. Classical and all-floating FETI methods for the simulation of arterial tissues

    PubMed Central

    Augustin, Christoph M.; Holzapfel, Gerhard A.; Steinbach, Olaf

    2015-01-01

    High-resolution and anatomically realistic computer models of biological soft tissues play a significant role in understanding the function of cardiovascular components in health and disease. However, the computational effort of handling fine grids to resolve the geometries as well as sophisticated tissue models is very challenging. One possibility for deriving a strongly scalable parallel solution algorithm is to consider finite element tearing and interconnecting (FETI) methods. In this study we propose and investigate the application of FETI methods to simulate the elastic behavior of biological soft tissues. As one particular example we choose the artery, which is – like most other biological tissues – characterized by anisotropic and nonlinear material properties. We compare two specific approaches of FETI methods, classical and all-floating, and investigate the numerical behavior of different preconditioning techniques. In comparison to classical FETI, the all-floating approach has advantages not only concerning the implementation but in many cases also concerning the convergence of the global iterative solution method. This behavior is illustrated with numerical examples. We present results of linear elastic simulations to show convergence rates, as expected from the theory, and results from the more sophisticated nonlinear case, where we apply a well-known anisotropic model to the realistic geometry of an artery. Although FETI methods are well suited to artery simulations, we also discuss some limitations concerning the dependence on material parameters. PMID:26751957

  16. Comparison of histomorphometrical data obtained with two different image analysis methods.

    PubMed

    Ballerini, Lucia; Franke-Stenport, Victoria; Borgefors, Gunilla; Johansson, Carina B

    2007-08-01

    A common way to determine tissue acceptance of biomaterials is to perform histomorphometrical analysis on histologically stained sections from retrieved samples with surrounding tissue, using various methods. The "time and money consuming" methods and techniques used are often "in house standards". We address light microscopic investigations of bone tissue reactions on un-decalcified cut and ground sections of threaded implants. In order to screen sections and generate results faster, the aim of this pilot project was to compare results generated with the in-house standard visual image analysis tool (i.e., quantifications and judgements done by the naked eye) with a custom-made automatic image analysis program. The histomorphometrical bone area measurements revealed no significant differences between the methods, but the results for bony contacts varied significantly. The raw results were in relative agreement, i.e., the values from the two methods were proportional to each other: low bony contact values in the visual method corresponded to low values with the automatic method. With similar-resolution images and further improvements of the automatic method this difference should become insignificant. A great advantage of the new automatic image analysis method is that it saves time: analysis time can be significantly reduced.

  17. Diffraction as a Method of Critical Policy Analysis

    ERIC Educational Resources Information Center

    Ulmer, Jasmine B.

    2016-01-01

    Recent developments in critical policy analysis have occurred alongside the new materialisms in qualitative research. These lines of scholarship have unfolded along two separate, but related, tracks. In particular, the new materialist method of "diffraction" aligns with many elements of critical policy analysis. Both involve critical…

  18. Validity and consistency assessment of accident analysis methods in the petroleum industry.

    PubMed

    Ahmadi, Omran; Mortazavi, Seyed Bagher; Khavanin, Ali; Mokarami, Hamidreza

    2017-11-17

    Accident analysis is the main aspect of accident investigation. It includes the method of connecting different causes in a procedural way. Therefore, it is important to use valid and reliable methods for investigating the different causal factors of accidents, especially the noteworthy ones. This study aimed to assess the accuracy (sensitivity index [SI]) and consistency of the six accident analysis methods most commonly used in the petroleum industry. In order to evaluate the methods, two real case studies (a process safety accident and a personal accident) from the petroleum industry were analyzed by 10 assessors, and the accuracy and consistency of the methods were then evaluated. The assessors were trained in a workshop on accident analysis methods. The systematic cause analysis technique and bowtie methods obtained the highest SI scores for the personal and process safety accidents, respectively. The best average consistency for a single method (based on 10 independent assessors) was in the region of 70%. This study confirmed that the application of methods with pre-defined causes and a logic tree can enhance the sensitivity and consistency of accident analysis.

  19. Analysis and optimization of cyclic methods in orbit computation

    NASA Technical Reports Server (NTRS)

    Pierce, S.

    1973-01-01

    The mathematical analysis and computation of the K=3, order 4; K=4, order 6; and K=5, order 7 cyclic methods and the K=5, order 6 Cowell method and some results of optimizing the 3 backpoint cyclic multistep methods for solving ordinary differential equations are presented. Cyclic methods have the advantage over traditional methods of having higher order for a given number of backpoints while at the same time having more free parameters. After considering several error sources the primary source for the cyclic methods has been isolated. The free parameters for three backpoint methods were used to minimize the effects of some of these error sources. They now yield more accuracy with the same computing time as Cowell's method on selected problems. This work is being extended to the five backpoint methods. The analysis and optimization are more difficult here since the matrices are larger and the dimension of the optimizing space is larger. Indications are that the primary error source can be reduced. This will still leave several parameters free to minimize other sources.

  20. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    Nuclear reaction analysis techniques are mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for calculated cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the concentrations of the various constituents of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. Called the absolute method, it allows measurements as accurate as those of the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
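
    For reference, a textbook form of the fundamental activation equation alluded to above (symbol conventions vary; this is a standard formulation, not necessarily the authors' exact expression) relates the induced activity A of an element of mass m to its activation cross section:

      A = \frac{m N_A \theta}{M} \, \sigma \, \varphi \, \left(1 - e^{-\lambda t_i}\right) e^{-\lambda t_d}

    where N_A is Avogadro's number, \theta the isotopic abundance, M the molar mass, \sigma the activation cross section, \varphi the neutron flux, \lambda the decay constant, and t_i and t_d the irradiation and decay times. Solving this equation for m yields the concentration directly, which is why the absolute method needs no standard sample.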

  1. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity such that they are frequently employed for specific real-world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data are compared to available experimental data and to other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition, and the causality method.
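
    As an illustration of one of the analysis techniques named above, the following is a minimal snapshot-based proper orthogonal decomposition via the SVD; it is a generic sketch of the method, not the authors' post-processing code, and all variable names are assumptions.

      import numpy as np

      def pod_modes(snapshots, n_modes):
          """POD of a snapshot matrix (n_points x n_times) via thin SVD."""
          mean = snapshots.mean(axis=1, keepdims=True)
          fluct = snapshots - mean                 # remove temporal mean
          U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
          # spatial modes, modal energies, temporal coefficients
          return U[:, :n_modes], s[:n_modes] ** 2, Vt[:n_modes]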

  2. Methods for Conducting Cognitive Task Analysis for a Decision Making Task.

    DTIC Science & Technology

    1996-01-01

    Cognitive task analysis (CTA) improves traditional task analysis procedures by analyzing the thought processes of performers while they complete a...for using these methods to conduct a CTA for domains which involve critical decision making tasks in naturalistic settings. The cognitive task analysis methods

  3. Sensitivity analysis and approximation methods for general eigenvalue problems

    NASA Technical Reports Server (NTRS)

    Murthy, D. V.; Haftka, R. T.

    1986-01-01

    Optimization of dynamic systems involving complex non-hermitian matrices is often computationally expensive. Major contributors to the computational expense are the sensitivity analysis and reanalysis of a modified design. The present work seeks to alleviate this computational burden by identifying efficient sensitivity analysis and approximate reanalysis methods. For the algebraic eigenvalue problem involving non-hermitian matrices, algorithms for sensitivity analysis and approximate reanalysis are classified, compared and evaluated for efficiency and accuracy. Proper eigenvector normalization is discussed. An improved method for calculating derivatives of eigenvectors is proposed, based on a more rational normalization condition and taking advantage of matrix sparsity. Important numerical aspects of this method are also discussed. To alleviate the problem of reanalysis, various approximation methods for eigenvalues are proposed and evaluated. Linear and quadratic approximations are based directly on the Taylor series. Several approximation methods are developed based on the generalized Rayleigh quotient for the eigenvalue problem. Approximation methods based on the trace theorem give high accuracy without needing any derivatives. Operation counts for the computation of the approximations are given. General recommendations are made for the selection of the appropriate approximation technique as a function of the matrix size, the number of design variables, the number of eigenvalues of interest, and the number of design points at which approximation is sought.
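
    To make the reanalysis idea concrete, one standard Rayleigh-quotient approximation (a generic first-order form; the paper's generalized variants are not reproduced here) estimates the eigenvalue of a modified pencil (A + \Delta A, B + \Delta B) from the baseline right and left eigenvectors \phi and \psi:

      \lambda \approx \frac{\psi^{H} (A + \Delta A)\, \phi}{\psi^{H} (B + \Delta B)\, \phi}

    The estimate requires no new eigensolution and is accurate to first order in the design perturbations, which is what makes such approximations attractive for repeated reanalysis inside an optimization loop.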

  4. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    With terrorist events occurring more and more frequently throughout the world, improving the capability to respond to public security incidents has become an important test of a government's ability to govern. Visual analysis has become an important method of event analysis because of its intuitiveness and effectiveness. To analyse the spatio-temporal distribution characteristics of events, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the validity of the methods.

  5. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.

  6. Actuarial analysis of surgical results: rationale and method.

    PubMed

    Grunkemeier, G L; Starr, A

    1977-11-01

    The use of time-related methods of statistical analysis is essential for valid evaluation of the long-term results of a surgical procedure. Accurate comparison of two procedures or two prosthetic devices is possible only when the length of follow-up is properly accounted for. The purpose of this report is to make the technical aspects of the actuarial, or life table, method easily accessible to the surgeon, with emphasis on the motivation for and the rationale behind it. The topic is illustrated in terms of heart valve prostheses, a field that is rapidly developing. Both the authors and the readers of articles must be aware that controversies surrounding the relative merits of various prosthetic designs or operative procedures can be settled only if proper time-related methods of analysis are utilized.
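
    For readers unfamiliar with the technique, the sketch below computes the standard life-table (actuarial) survival estimate, in which subjects withdrawn during an interval are counted as at risk for half of it; this is the textbook form of the method the authors advocate, not code from the paper.

      def actuarial_survival(intervals):
          """Life-table survival estimate.

          intervals: list of (n_at_risk, deaths, withdrawals) tuples,
          one per follow-up interval, in chronological order.
          Returns cumulative survival at the end of each interval.
          """
          surv, s = [], 1.0
          for n, d, w in intervals:
              eff = n - w / 2.0        # effective number at risk
              s *= 1.0 - d / eff       # conditional survival this interval
              surv.append(s)
          return surv

      # e.g., yearly follow-up of a hypothetical valve cohort:
      # actuarial_survival([(100, 3, 10), (87, 2, 8), (77, 4, 6)])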

  7. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with Hotelling's T2 statistical analysis to compare, qualify, and detect faults in the tested systems.

  8. Nonlinear analysis of structures. [within framework of finite element method

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.

    1974-01-01

    The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is on nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity, is included, and applications are presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) a membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) a general three-dimensional analysis, and (5) analysis of laminated composites. Applications of the methods are made to a number of sample structures. Correlation with available analytic or experimental data ranges from good to excellent.

  9. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.

  10. Social network analysis: Presenting an underused method for nursing research.

    PubMed

    Parnell, James Michael; Robinson, Jennifer C

    2018-06-01

    This paper introduces social network analysis as a versatile method with many applications in nursing research. Social networks have been studied for years in many social science fields. The methods continue to advance but remain unknown to most nursing scholars. Discussion paper. English-language and interpreted literature was searched from Ovid Healthstar, CINAHL, PubMed Central, Scopus and hard-copy texts from 1965-2017. Social network analysis first emerged in nursing literature in 1995 and appears minimally through the present day. To convey the versatility and applicability of social network analysis in nursing, hypothetical scenarios are presented. The scenarios are illustrative of three approaches to social network analysis and include key elements of social network research design. The methods of social network analysis are underused in nursing research, primarily because they are unknown to most scholars. However, there is methodological flexibility and epistemological versatility capable of supporting quantitative and qualitative research. The analytic techniques of social network analysis can add new insight into many areas of nursing inquiry, especially those influenced by cultural norms. Furthermore, visualization techniques associated with social network analysis can be used to generate new hypotheses. Social network analysis can potentially uncover findings not accessible through methods commonly used in nursing research. Social networks can be analysed based on individual-level attributes, whole networks and subgroups within networks. Computations derived from social network analysis may stand alone to answer a research question or be incorporated as variables into robust statistical models. © 2018 John Wiley & Sons Ltd.

  11. Simple gas chromatographic method for furfural analysis.

    PubMed

    Gaspar, Elvira M S M; Lopes, João F

    2009-04-03

    A new, simple, gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water soluble foods, using direct immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate (RSD<8%), showed good recoveries (77-107%) and good limits of detection (GC-FID: 1.37 microgL(-1) for 2-F, 8.96 microgL(-1) for 5-MF, 6.52 microgL(-1) for 5-HMF; GC-TOF-MS: 0.3, 1.2 and 0.9 ngmL(-1) for 2-F, 5-MF and 5-HMF, respectively). It was applied to different commercial food matrices: honey, white, demerara, brown and yellow table sugars, and white and red balsamic vinegars. This one-step, sensitive and direct method for the analysis of furfurals will contribute to characterise and quantify their presence in the human diet.

  12. Global/local methods for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.

    1993-01-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.

  13. Global/local methods for probabilistic structural analysis

    NASA Astrophysics Data System (ADS)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.

  14. Earthquake Hazard Analysis Methods: A Review

    NASA Astrophysics Data System (ADS)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    One of the natural disasters with the most significant impact in terms of risk and damage is the earthquake. Countries such as China, Japan, and Indonesia are located on active continental plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and using remote sensing. In application, it is necessary to review the effectiveness of each technique first. Considering time efficiency and data accuracy, remote sensing is used as a reference for assessing earthquake hazard accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and areas potentially vulnerable to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing such as GeoEye-1 provide added value and excellence in the use of remote sensing as one of the methods for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing disaster management policies in particular and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  15. Study on Collision of Ship Side Structure by Simplified Plastic Analysis Method

    NASA Astrophysics Data System (ADS)

    Sun, C. J.; Zhou, J. H.; Wu, W.

    2017-10-01

    During its lifetime, a ship may encounter collision or grounding and sustain permanent damage from these types of accidents. Crashworthiness analysis has been based on two main kinds of methods: simplified plastic analysis and numerical simulation. A simplified plastic analysis method is presented in this paper. Numerical simulations using the non-linear finite-element software LS-DYNA are conducted to validate the method. The results show that the simplified plastic analysis is in good agreement with the finite element simulation, which reveals that the simplified plastic analysis method can quickly and accurately estimate the crashworthiness of the side structure during the collision process and can be used as a reliable risk assessment method.

  16. A Sociolinguistic Analysis of English Borrowings in Japanese Advertising Texts.

    ERIC Educational Resources Information Center

    Takashi, Kyoko

    1990-01-01

    Sociolinguistic analysis of English borrowings in Japanese television and print advertising supported hypotheses that the primary reason for loanword use was to make the product seem more modern and sophisticated and that there was a relationship between loan functions and such audience characteristics as gender, age, occupation, and background.…

  17. An operational modal analysis method in frequency and spatial domain

    NASA Astrophysics Data System (ADS)

    Wang, Tong; Zhang, Lingmi; Tamura, Yukio

    2005-12-01

    This paper presents a frequency and spatial domain decomposition (FSDD) method for operational modal analysis (OMA), an extension of the complex mode indicator function (CMIF) method used in experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectral density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
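
    The decomposition step shared by CMIF and FSDD can be sketched as follows: take the singular values of the output PSD matrix at each frequency line and read modes off the peaks of the first singular value. This is a generic illustration of that step only; the FSDD enhancements (the enhanced PSD and the frequency-domain curve fitting) are not shown.

      import numpy as np

      def cmif(psd_matrices):
          """Singular values of the (n_out x n_out) output PSD matrix at
          each of n_freq frequency lines; peaks of column 0 indicate modes.
          psd_matrices: complex array of shape (n_freq, n_out, n_out)."""
          return np.array([np.linalg.svd(G, compute_uv=False)
                           for G in psd_matrices])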

  18. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  19. A Practical Method of Policy Analysis by Simulating Policy Options

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

  20. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where more and more complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing how to apply the immersed boundary method to this moving boundary problem, we provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, are considered in this analysis, corresponding to the nominal take-off and cruise flow conditions. The simulation data are compared to available experimental data and to other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.

  1. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2015-03-31

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes a display configured to depict visible images, and processing circuitry coupled with the display and wherein the processing circuitry is configured to access a first vector of a text item and which comprises a plurality of components, to access a second vector of the text item and which comprises a plurality of components, to weight the components of the first vector providing a plurality of weighted values, to weight the components of the second vector providing a plurality of weighted values, and to combine the weighted values of the first vector with the weighted values of the second vector to provide a third vector.
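
    A minimal sketch of the claimed operation follows: weight the components of two vectors of a text item and combine them into a third. Element-wise weighting and addition are assumptions made for illustration; the patent text does not fix the combination rule.

      def combine_weighted(v1, w1, v2, w2):
          """Weight two text-item vectors component-wise and combine them."""
          assert len(v1) == len(w1) == len(v2) == len(w2)
          return [a * wa + b * wb
                  for a, wa, b, wb in zip(v1, w1, v2, w2)]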

  2. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been known for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences between these methods. We consequently put to the test our two methods of analysis and the FWHM approach. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and suggest conclusions on which could be defined as the best one.
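
    Since the FWHM method anchors the comparison, here is a minimal sketch of its measurement principle on a 1-D intensity profile cast across the airway wall; real pipelines add sub-pixel interpolation at the crossings, and nothing here reproduces any of the compared implementations.

      import numpy as np

      def fwhm(xs, profile):
          """Width of a profile at half of (peak - background).

          xs: sample positions; profile: intensity values (one peak).
          Walks outward from the peak to the first half-maximum
          crossings on each side (nearest-sample, no interpolation)."""
          base = float(np.min(profile))
          peak = int(np.argmax(profile))
          half = base + 0.5 * (profile[peak] - base)
          left = peak
          while left > 0 and profile[left] > half:
              left -= 1
          right = peak
          while right < len(profile) - 1 and profile[right] > half:
              right += 1
          return xs[right] - xs[left]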

  3. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    NASA Astrophysics Data System (ADS)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

    In bearing procurement analysis, both price and reliability must be considered as decision criteria, since price determines the direct cost (the acquisition cost) while bearing reliability determines indirect costs such as maintenance. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred, so the indirect cost of reliability must be considered when making a bearing procurement analysis. This paper explains a bearing evaluation method that uses total cost of ownership analysis to consider both price and maintenance cost as decision criteria. Furthermore, since failure data are lacking at the evaluation stage, a reliability prediction method is used to predict bearing reliability from the dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, whereas for short-term planning the cheaper but less reliable one is preferable. This contextuality can give rise to conflict between stakeholders; thus, the planning horizon needs to be agreed by all stakeholders before making a procurement decision.
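
    The sketch below illustrates the shape of such an evaluation: predict bearing life from the dynamic load rating using the standard L10 formula for ball bearings, then fold expected replacements into a total cost of ownership. The cost structure and every parameter are illustrative assumptions, not the paper's model.

      def bearing_tco(price, C, P, rpm, hours_per_year, years, repl_labor):
          """Total cost of ownership for one bearing choice.

          L10 life (revolutions to 10% failure probability) is computed
          from dynamic load rating C and equivalent dynamic load P with
          exponent 3 (ball bearings); replacements are approximated by
          the crude rate service_hours / L10_hours.
          """
          l10_hours = (C / P) ** 3 * 1e6 / (rpm * 60.0)
          service_hours = hours_per_year * years
          expected_replacements = service_hours / l10_hours
          return price + expected_replacements * (repl_labor + price)

    On this model a dearer bearing with a higher rating C wins once the planning horizon is long enough for avoided replacements to offset its price, which mirrors the paper's long-term versus short-term conclusion.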

  4. Beyond linear methods of data analysis: time series analysis and its applications in renal research.

    PubMed

    Gupta, Ashwani K; Udrea, Andreea

    2013-01-01

    Analysis of temporal trends in medicine is needed to understand normal physiology and to study the evolution of disease processes. It is also useful for monitoring response to drugs and interventions, and for accountability and tracking of health care resources. In this review, we discuss what makes time series analysis unique for the purposes of renal research and its limitations. We also introduce nonlinear time series analysis methods and provide examples where these have advantages over linear methods. We review areas where these computational methods have found applications in nephrology ranging from basic physiology to health services research. Some examples include noninvasive assessment of autonomic function in patients with chronic kidney disease, dialysis-dependent renal failure and renal transplantation. Time series models and analysis methods have been utilized in the characterization of mechanisms of renal autoregulation and to identify the interaction between different rhythms of nephron pressure flow regulation. They have also been used in the study of trends in health care delivery. Time series are everywhere in nephrology and analyzing them can lead to valuable knowledge discovery. The study of time trends of vital signs, laboratory parameters and the health status of patients is inherent to our everyday clinical practice, yet formal models and methods for time series analysis are not fully utilized. With this review, we hope to familiarize the reader with these techniques in order to assist in their proper use where appropriate.

  5. Electrochemical Sensors for Clinic Analysis

    PubMed Central

    Wang, You; Xu, Hui; Zhang, Jianming; Li, Guang

    2008-01-01

    Demanded by modern medical diagnosis, advances in microfabrication technology have led to the development of fast, sensitive and selective electrochemical sensors for clinic analysis. This review addresses the principles behind electrochemical sensor design and fabrication, and introduces recent progress in the application of electrochemical sensors to analysis of clinical chemicals such as blood gases, electrolytes, metabolites, DNA and antibodies, including basic and applied research. Miniaturized commercial electrochemical biosensors will form the basis of inexpensive and easy to use devices for acquiring chemical information to bring sophisticated analytical capabilities to the non-specialist and general public alike in the future. PMID:27879810

  6. Vectorized Monte Carlo methods for reactor lattice analysis

    NASA Technical Reports Server (NTRS)

    Brown, F. B.

    1984-01-01

    Some of the new computational methods and equivalent mathematical representations of physics models used in the MCV code, a vectorized continuous-energy Monte Carlo code for use on the CYBER-205 computer, are discussed. While the principal application of MCV is the neutronics analysis of repeating reactor lattices, the new methods used in MCV should be generally useful for vectorizing Monte Carlo for other applications. For background, a brief overview of the vector processing features of the CYBER-205 is included, followed by a discussion of the fundamentals of Monte Carlo vectorization. The physics models used in the MCV vectorized Monte Carlo code are then summarized. The new methods used in scattering analysis are presented along with details of several key, highly specialized computational routines. Finally, speedups relative to CDC-7600 scalar Monte Carlo are discussed.

  7. Astrophysics and Big Data: Challenges, Methods, and Tools

    NASA Astrophysics Data System (ADS)

    Garofalo, Mauro; Botta, Alessio; Ventre, Giorgio

    2017-06-01

    Nowadays there is no field of research that is not flooded with data. Among the sciences, astrophysics has always been driven by the analysis of massive amounts of data. The development of new and more sophisticated observation facilities, both ground-based and spaceborne, has made data more and more complex (Variety) and has led to exponential growth in both data Volume (i.e., of the order of petabytes) and Velocity, in terms of production and transmission. Therefore, new and advanced processing solutions will be needed to handle this huge amount of data. We investigate some of these solutions, based on machine learning models as well as tools and architectures for Big Data analysis, that can be exploited in the astrophysical context.

  8. Analysis Method for Non-Nominal First Acquisition

    NASA Technical Reports Server (NTRS)

    Sieg, Detlef; Mugellesi-Dow, Roberta

    2007-01-01

    First, this paper describes how the trajectory of the launcher can be modelled for contingency analysis without much information about the launch vehicle itself. From a dense sequence of state vectors a velocity profile is derived which is sufficiently accurate to enable the Flight Dynamics Team to integrate parts of the launcher trajectory on its own and to simulate contingency cases by modifying the velocity profile. The paper then focuses on the thorough visibility analysis which has to follow the contingency case or burn performance simulations. In the ideal case it is possible to identify a ground station which is able to acquire the satellite independently of the burn performance. The correlations between the burn performance and the pointing at subsequent ground stations are derived with the aim of establishing simple guidelines which can be applied quickly and which significantly improve the chance of acquisition at subsequent ground stations. In the paper the method is applied to the Soyuz/Fregat launch with the MetOp satellite. Overall, the paper shows that launcher trajectory modelling with the simulation of contingency cases, in connection with a ground station visibility analysis, leads to a proper selection of ground stations and acquisition methods. In the MetOp case this ensured successful contact with all ground stations during the first hour after separation without having to rely on any early orbit determination result or state vector update.

  9. Does a better model yield a better argument? An info-gap analysis

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2017-04-01

    Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.
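
    In generic info-gap terms (a standard formulation of the robustness function, not necessarily the paper's specific model), the robustness of an argument or model q with error-probability requirement P_c is the greatest horizon of uncertainty h up to which the requirement holds for every realization in the uncertainty set U(h):

      \hat{h}(q) = \max \left\{ h \ge 0 \;:\; \max_{u \in \mathcal{U}(h)} P_{\mathrm{err}}(q, u) \le P_c \right\}

    The innovation dilemma arises when a sophisticated argument promises a lower nominal error probability than a standard one but has lower robustness \hat{h}, so its apparent advantage evaporates under surprise.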

  10. Cognitive Pathways: Analysis of Students' Written Texts for Science Understanding

    ERIC Educational Resources Information Center

    Grimberg, Bruna Irene; Hand, Brian

    2009-01-01

    The purpose of this study was to reconstruct writers' reasoning process as reflected in their written texts. The codes resulting from the text analysis were related to cognitive operations, ranging from simple to more sophisticated ones. The sequence of the cognitive operations as the text unfolded represents the writer's cognitive pathway at the…

  11. A validated fast difference spectrophotometric method for 5-hydroxymethyl-2-furfural (HMF) determination in corn syrups.

    PubMed

    de Andrade, Jucimara Kulek; de Andrade, Camila Kulek; Komatsu, Emy; Perreault, Hélène; Torres, Yohandra Reyes; da Rosa, Marcos Roberto; Felsner, Maria Lurdes

    2017-08-01

    Corn syrups, important ingredients used in the food and beverage industries, often contain high levels of 5-hydroxymethyl-2-furfural (HMF), a toxic contaminant. In this work, an in-house validation of a difference spectrophotometric method for HMF analysis in corn syrups was developed, using sophisticated statistical tools for the first time. The methodology showed excellent analytical performance, with good selectivity, linearity (R^2 = 99.9%, r > 0.99), accuracy and low limits (LOD = 0.10 mg L^-1 and LOQ = 0.34 mg L^-1). Excellent precision was confirmed by repeatability (RSD = 0.30%) and intermediate precision (RSD = 0.36%) estimates and by the HorRat value (0.07). A detailed study of method precision using a nested design demonstrated that variation sources such as instruments, operators and time did not contribute to the variability of results within the laboratory and consequently did not affect its intermediate precision. The developed method is environmentally friendly, fast, cheap and easy to implement, making it an attractive alternative for corn syrup quality control in industry and official laboratories. Copyright © 2017 Elsevier Ltd. All rights reserved.
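
    The detection and quantification limits quoted above are conventionally estimated from the residual standard deviation \sigma and slope S of the calibration line (the ICH-style convention; the paper's exact procedure is not stated in the abstract):

      \mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad \mathrm{LOQ} = \frac{10\,\sigma}{S}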

  12. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. A Review of Classical Methods of Item Analysis.

    ERIC Educational Resources Information Center

    French, Christine L.

    Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…

  14. High Performance Liquid Chromatography of Some Analgesic Compounds: An Instrumental Analysis Experiment.

    ERIC Educational Resources Information Center

    Haddad, Paul; And Others

    1983-01-01

    Background information, procedures, and results are provided for an experiment demonstrating techniques of solvent selection, gradient elution, pH control, and ion-pairing in the analysis of an analgesic mixture using reversed-phase liquid chromatography on an octadecylsilane column. Although developed using sophisticated/expensive equipment, less…

  15. Orthopedic workforce planning in Germany – an analysis of orthopedic accessibility

    PubMed Central

    Müller, Peter; Maier, Werner; Groneberg, David A.

    2017-01-01

    In Germany, orthopedic workforce planning relies on population-to-provider ratios represented by the 'official degree of care provision'. However, with geographic information systems (GIS), more sophisticated measurements are available. Utilizing GIS-based technologies, we analyzed the current state of demand and supply of the orthopedic workforce in Germany (orthopedic accessibility) with the integrated Floating Catchment Area method. The analysis of n = 153,352,220 distances revealed significant geographical variations on a national scale: 5,617,595 people (6.9% of the total population) lived in an area with significantly low orthopedic accessibility (average z-score = -4.0), whereas 31,748,161 people (39.0% of the total population) lived in an area with significantly high orthopedic accessibility (average z-score = 8.0). Accessibility was positively correlated with the degree of urbanization (r = 0.49; p < 0.001) and the official degree of care provision (r = 0.33; p < 0.001), and negatively correlated with regional social deprivation (r = -0.47; p < 0.001). Despite the advantages of simpler measures regarding implementation and acceptance in health policy, more sophisticated measures of accessibility have the potential to reduce costs as well as improve health care. This study revealed significant geographical variations that show the need to reduce oversupply in less deprived urban areas in order to enable adequate care in more deprived rural areas. PMID:28178335

  16. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  17. [Technical aspects of measurement for optically sophisticated eyeglasses].

    PubMed

    Guilino, G

    1988-07-01

    This paper deals with the question of how aspherical ophthalmic lenses can be measured outside the reference point given by the manufacturer, in order to compare lenses or to test how faithfully the lens surface reproduces its intended shape. Three procedures are presented with measurement examples: vertex power measurement with a swivelled lens mount, probe scanning on a three-coordinate measuring machine, and interferometric measurement using a non-ideal reference lens. The basic problem inherent in an application-oriented interpretation of the measurement data sets obtained by these methods is shown.

  18. Multiaxial Cyclic Thermoplasticity Analysis with Besseling's Subvolume Method

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.

    1983-01-01

    A modification was formulated to Besseling's Subvolume Method to allow it to use temperature-dependent multilinear stress-strain curves to perform cyclic thermoplasticity analyses. This method automatically reproduces certain aspects of real material behavior important in the analysis of Aircraft Gas Turbine Engine (AGTE) components, including the Bauschinger effect, cross-hardening, and memory. This constitutive equation was implemented in a finite element computer program called CYANIDE. Subsequently, classical time-dependent plasticity (creep) was added to the program. Since its inception, the program has been assessed against laboratory and component testing and engine experience. This experience has verified the ability of the program to simulate AGTE material response characteristics and demonstrated its utility in providing data for life analyses. In this area of life analysis, the multiaxial thermoplasticity capabilities of the method have proved a match for actual AGTE life experience.

  19. Structural Analysis Made 'NESSUSary'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and from elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Because computers were still in their infancy, however, architects and engineers at the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for the then-current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.

  20. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  1. Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis

    PubMed Central

    Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.

    2006-01-01

    In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
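
    A minimal numpy sketch of the voxel-wise idea behind BPM, not the toolbox's MATLAB implementation: at each voxel, regress the primary modality on group covariates plus the co-registered value from a second modality, and read a t-statistic off the multimodal regressor. All names and the toy data are illustrative assumptions.

    ```python
    import numpy as np

    def voxelwise_glm(primary, secondary, covariates):
        """At each voxel, fit primary ~ intercept + covariates + secondary value.

        primary, secondary: (n_subjects, n_voxels) co-registered images.
        covariates: (n_subjects, n_cov) design columns (e.g., age, group).
        Returns per-voxel t-statistics for the secondary-modality regressor.
        """
        n_subj, n_vox = primary.shape
        tvals = np.empty(n_vox)
        for v in range(n_vox):
            X = np.column_stack([np.ones(n_subj), covariates, secondary[:, v]])
            beta, *_ = np.linalg.lstsq(X, primary[:, v], rcond=None)
            resid = primary[:, v] - X @ beta
            sigma2 = resid @ resid / (n_subj - X.shape[1])
            cov = sigma2 * np.linalg.inv(X.T @ X)
            tvals[v] = beta[-1] / np.sqrt(cov[-1, -1])
        return tvals

    # Toy usage: 20 subjects, 1000 voxels, one additional covariate.
    rng = np.random.default_rng(0)
    t = voxelwise_glm(rng.normal(size=(20, 1000)),
                      rng.normal(size=(20, 1000)),
                      rng.normal(size=(20, 1)))
    ```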

  2. The stress analysis method for three-dimensional composite materials

    NASA Astrophysics Data System (ADS)

    Nagai, Kanehiro; Yokoyama, Atsushi; Maekawa, Zen'ichiro; Hamada, Hiroyuki

    1994-05-01

    This study proposes a stress analysis method for three-dimensionally fiber-reinforced composite materials. In this method, the rule of mixtures for composites is applied to 3-D space in which material properties vary three-dimensionally. The fundamental formulas for Young's modulus, shear modulus, and Poisson's ratio are derived. We also discuss a strength estimation and an optimum material design technique for 3-D composite materials. The analysis is executed for a triaxial orthogonally woven fabric, and the results are compared with experimental data in order to verify the accuracy of the method. The present methodology can be understood with basic material mechanics and elementary mathematics, so a computer program implementing the theory can be written without difficulty. Furthermore, the method can be applied to various types of 3-D composites because of its general-purpose character.
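
    For orientation, a sketch of the classical rule-of-mixtures estimates that such methods generalize to 3-D; the paper's own formulas are more elaborate, and the material values below are illustrative assumptions.

    ```python
    def young_modulus_bounds(E_fiber, E_matrix, v_fiber):
        """Classical rule-of-mixtures estimates for a unidirectional composite.

        Voigt (iso-strain) estimate applies along the fibre direction;
        Reuss (iso-stress) estimate applies transverse to it.
        """
        E_long = v_fiber * E_fiber + (1.0 - v_fiber) * E_matrix          # Voigt
        E_trans = 1.0 / (v_fiber / E_fiber + (1.0 - v_fiber) / E_matrix)  # Reuss
        return E_long, E_trans

    # Example: carbon fibre (230 GPa) in epoxy (3.5 GPa), 60% fibre volume.
    E_L, E_T = young_modulus_bounds(230.0, 3.5, 0.6)
    print(f"E_L = {E_L:.1f} GPa, E_T = {E_T:.1f} GPa")
    ```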

  3. New method for stock-tank oil compositional analysis.

    PubMed

    McAndrews, Kristine; Nighswander, John; Kotzakoulakis, Konstantin; Ross, Paul; Schroeder, Helmut

    2009-01-01

    A new method for accurately determining stock-tank oil composition to normal pentatriacontane using gas chromatography is developed and validated. The new method addresses the potential errors associated with the traditional equipment and technique employed for extended hydrocarbon gas chromatography outside a controlled laboratory environment, such as on an offshore oil platform. In particular, the experimental measurement of stock-tank oil molecular weight with the freezing point depression technique and the use of an internal standard to find the unrecovered sample fraction are replaced with correlations for estimating these properties. The use of correlations reduces the number of necessary experimental steps in completing the required sample preparation and analysis, resulting in reduced uncertainty in the analysis.

  4. A Quantitative Analysis Method Of Trabecular Pattern In A Bone

    NASA Astrophysics Data System (ADS)

    Idesawa, Masanor; Yatagai, Toyohiko

    1982-11-01

    The orientation and density of the trabecular pattern observed in a bone are closely related to its mechanical properties, and diseases of a bone appear as changes in the orientation and/or density distribution of its trabecular patterns. These have so far been treated from a qualitative point of view because no quantitative analysis method has been established. In this paper, the authors propose and investigate some quantitative methods for analysing the density and orientation of trabecular patterns observed in a bone. These methods give an index for quantitatively evaluating the orientation of a trabecular pattern; they have been applied to the trabecular pattern observed in a femoral head, and their usefulness is confirmed. Key Words: Index of pattern orientation, Trabecular pattern, Pattern density, Quantitative analysis

  5. Vulnerability analysis methods for road networks

    NASA Astrophysics Data System (ADS)

    Bíl, Michal; Vodák, Rostislav; Kubeček, Jan; Rebok, Tomáš; Svoboda, Tomáš

    2014-05-01

    steps can be taken in order to make it more resilient. Performing such an analysis of network break-ups requires considering the network as a whole, ideally identifying all the cases generated by simultaneous closure of multiple links and evaluating them using various criteria. The spatial distribution of settlements, important companies and the overall population in the nodes of the network are several factors, apart from the topology of the network, which could be taken into account when computing vulnerability indices and identifying the weakest links and/or weakest link combinations. However, even for small networks (i.e., hundreds of nodes and links), the problem of break-up identification becomes extremely difficult to resolve. Naive brute-force examination consequently fails, and more elaborate algorithms have to be applied. We address the problem of evaluating the vulnerability of road networks by simulating the impacts of the simultaneous closure of multiple roads/links. We present ongoing work on a sophisticated algorithm focused on identifying network break-ups and evaluating them by various criteria.
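
    A minimal networkx sketch of the naive brute-force enumeration the abstract says fails to scale, ranking break-ups by cut-off population; the graph, the 'pop' node attribute and the ranking criterion are illustrative assumptions, not the authors' algorithm.

    ```python
    import itertools
    import networkx as nx

    def find_breakups(G, k=2):
        """Enumerate all simultaneous closures of k links that disconnect
        the road network. Brute force: feasible only for small networks,
        which is exactly the scaling problem described above."""
        breakups = []
        for links in itertools.combinations(G.edges(), k):
            H = G.copy()
            H.remove_edges_from(links)
            if not nx.is_connected(H):
                # Rank by population cut off from the most populous component.
                parts = list(nx.connected_components(H))
                main = max(parts, key=lambda c: sum(G.nodes[n]["pop"] for n in c))
                cut = sum(G.nodes[n]["pop"] for n in G.nodes if n not in main)
                breakups.append((links, cut))
        return sorted(breakups, key=lambda x: -x[1])

    # Toy network: four settlements in a ring plus one dead-end village.
    G = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0), (3, 4)])
    nx.set_node_attributes(G, {0: 500, 1: 300, 2: 200, 3: 800, 4: 50}, "pop")
    for links, cut in find_breakups(G, k=2)[:3]:
        print(links, "cuts off", cut, "people")
    ```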

  6. How Commercial Banks Use the World Wide Web: A Content Analysis.

    ERIC Educational Resources Information Center

    Leovic, Lydia K.

    New telecommunications vehicles expand the possible ways that business is conducted. The hypermedia portion of the Internet, the World Wide Web, is such a telecommunications device. The Web is presently one of the most flexible and dynamic methods for electronic information dissemination. The level of technological sophistication necessary to…

  7. Data Analysis Methods for Library Marketing

    NASA Astrophysics Data System (ADS)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information, and to achieve this role libraries have to know the profiles of their patrons. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained as the results of these methods. Our research is a first step towards a future in which library marketing is an indispensable tool.

  8. Emergency First Responders' Experience with Colorimetric Detection Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandra L. Fox; Keith A. Daum; Carla J. Miller

    2007-10-01

    Nationwide, first responders from state and federal support teams respond to hazardous materials incidents, industrial chemical spills, and potential weapons of mass destruction (WMD) attacks. Although first responders have sophisticated chemical, biological, radiological, and explosive detectors available for assessment of the incident scene, simple colorimetric detectors have a role in response actions. The large number of colorimetric chemical detection methods available on the market can make the selection of the proper methods difficult. Although each detector has unique aspects to provide qualitative or quantitative data about the unknown chemicals present, not all detectors provide consistent, accurate, and reliable results. Included here, in a consumer-report-style format, we provide "boots on the ground" information directly from first responders about how well colorimetric chemical detection methods meet their needs in the field and how they procure these methods.

  9. Measurement methods for human exposure analysis.

    PubMed Central

    Lioy, P J

    1995-01-01

    The general methods used to complete measurements of human exposures are identified and illustrations are provided for the cases of indirect and direct methods used for exposure analysis. The application of the techniques for external measurements of exposure, microenvironmental and personal monitors, are placed in the context of the need to test hypotheses concerning the biological effects of concern. The linkage of external measurements to measurements made in biological fluids is explored for a suite of contaminants. This information is placed in the context of the scientific framework used to conduct exposure assessment. Examples are taken from research on volatile organics and for a large scale problem: hazardous waste sites. PMID:7635110

  10. Method Analysis of Microbial-Resistant Gypsum Products

    EPA Science Inventory

    Method Analysis of Microbial-Resistant Gypsum Products. D.A. Betancourt (1), T.R. Dean (1), A. Evans (2), and G. Byfield (2). (1) US Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, RTP, NC 27711. (2) RTI International, RTP, NC. Several...

  11. A Unified Development of Basis Reduction Methods for Rotor Blade Analysis

    NASA Technical Reports Server (NTRS)

    Ruzicka, Gene C.; Hodges, Dewey H.; Rutkowski, Michael (Technical Monitor)

    2001-01-01

    The axial foreshortening effect plays a key role in rotor blade dynamics, but approximating it accurately in reduced basis models has long posed a difficult problem for analysts. Recently, though, several methods have been shown to be effective in obtaining accurate, reduced basis models for rotor blades. These methods are the axial elongation method, the mixed finite element method, and the nonlinear normal mode method. The main objective of this paper is to demonstrate the close relationships among these methods, which are seemingly disparate at first glance. First, the difficulties inherent in obtaining reduced basis models of rotor blades are illustrated by examining the modal reduction accuracy of several blade analysis formulations. It is shown that classical, displacement-based finite elements are ill-suited for rotor blade analysis because they cannot accurately represent the axial strain in modal space, and that this problem may be solved by employing the axial force as a variable in the analysis. It is shown that the mixed finite element method is a convenient means for accomplishing this, and the derivation of a mixed finite element for rotor blade analysis is outlined. A shortcoming of the mixed finite element method is that it increases the number of variables in the analysis. It is demonstrated that this problem may be rectified by solving for the axial displacements in terms of the axial forces and the bending displacements. Effectively, this procedure constitutes a generalization of the widely used axial elongation method to blades of arbitrary topology. The procedure is developed first for a single element, and then extended to an arbitrary assemblage of elements of arbitrary type. Finally, it is shown that the generalized axial elongation method is essentially an approximate solution for an invariant manifold that can be used as the basis for a nonlinear normal mode.

  12. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    PubMed

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose a FCA methodology that uses standard cost and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. Our methodology allows benchmarking and variance analysis that can be used to identify the causes of off-standards performance and guide managers to deploy resources more efficiently. Our methodology can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Video markers tracking methods for bike fitting

    NASA Astrophysics Data System (ADS)

    Rajkiewicz, Piotr; Łepkowska, Katarzyna; Cygan, Szymon

    2015-09-01

    Sports cycling has become increasingly popular in recent years. Obtaining and maintaining a proper position on the bike has been shown to be crucial for performance, comfort and injury avoidance. Various bike-fitting techniques are available, from rough settings based on body dimensions to professional services making use of sophisticated equipment and expert knowledge. Modern fitting techniques mainly use joint angles as the criterion of proper position. In this work we examine the performance of two proposed methods for dynamic assessment of cyclist position based on video data recorded during stationary cycling. The proposed methods are intended for home use, to help amateur cyclists improve their position on the bike, and therefore no professional equipment is used. As a result of the data processing, ranges of angles in selected joints are provided. Finally, the strengths and weaknesses of both proposed methods are discussed.

  14. A refined method for multivariate meta-analysis and meta-regression

    PubMed Central

    Jackson, Daniel; Riley, Richard D

    2014-01-01

    Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies. This is because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects’ standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd. PMID:23996351

  15. A refined method for multivariate meta-analysis and meta-regression.

    PubMed

    Jackson, Daniel; Riley, Richard D

    2014-02-20

    Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies. This is because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects' standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples. Copyright © 2013 John Wiley & Sons, Ltd.

  16. Phased-mission system analysis using Boolean algebraic methods

    NASA Technical Reports Server (NTRS)

    Somani, Arun K.; Trivedi, Kishor S.

    1993-01-01

    Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, system configuration, and success criteria may vary from phase to phase. In addition, the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids the state-space explosion that commonly plagues Markov chain-based analysis. A phase algebra was developed to account for the effects of variable configurations and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate the use of the technique by means of an example and present numerical results to show the effects of mission phases on system reliability.

  17. Chapter 11. Community analysis-based methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Y.; Wu, C.H.; Andersen, G.L.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  18. Robust Methods for Moderation Analysis with a Two-Level Regression Model.

    PubMed

    Yang, Miao; Yuan, Ke-Hai

    2016-01-01

    Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
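
    A hedged sketch of the Huber-weight flavour of robust moderation analysis using statsmodels' M-estimation; the paper's own two-level algorithm is more involved, and the simulated data here are illustrative assumptions.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 300
    x = rng.normal(size=n)   # predictor
    z = rng.normal(size=n)   # moderator
    # True model: the effect of x on y depends on z (moderation), with
    # heavy-tailed errors that violate the normality assumption.
    y = 0.5 * x + 0.3 * z + 0.4 * x * z + rng.standard_t(df=3, size=n)

    X = sm.add_constant(np.column_stack([x, z, x * z]))

    ols_fit = sm.OLS(y, X).fit()                                 # normal-theory
    huber_fit = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()   # M-estimation

    # The coefficient on the x*z product term carries the moderation effect.
    print("OLS interaction:  ", ols_fit.params[3])
    print("Huber interaction:", huber_fit.params[3])
    ```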

  19. Analysis of Brick Masonry Wall using Applied Element Method

    NASA Astrophysics Data System (ADS)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure, as in the Finite Element Method (FEM); in AEM, however, elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. A brick masonry wall can be effectively analyzed within the framework of AEM, since the composite nature of the wall is easily modelled using springs: the brick springs and mortar springs are assumed to be connected in series. The brick masonry wall is analyzed and the failure load is determined for different loading cases. The results were used to find the brick aspect ratio that best strengthens a brick masonry wall.
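
    A minimal sketch of the series-spring idea from basic mechanics (k = EA/L per material segment, then series combination); the material values and tributary geometry are illustrative assumptions, not the paper's model.

    ```python
    def spring_stiffness(E, area, length):
        """Axial stiffness k = E*A/L of a material segment along a spring line."""
        return E * area / length

    def series(k1, k2):
        """Equivalent stiffness of two springs connected in series."""
        return 1.0 / (1.0 / k1 + 1.0 / k2)

    # Illustrative values: clay brick (E ~ 10 GPa), mortar joint (E ~ 1 GPa),
    # a 10 mm x 10 mm spring tributary area, 50 mm of brick, 10 mm of mortar.
    k_brick = spring_stiffness(10e9, 1e-4, 0.05)
    k_mortar = spring_stiffness(1e9, 1e-4, 0.01)
    print(f"equivalent joint spring = {series(k_brick, k_mortar):.3e} N/m")
    ```

    The mortar spring dominates the series combination, which is how the composite (and typically weaker) joint behaviour enters the AEM model.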

  20. A brain-region-based meta-analysis method utilizing the Apriori algorithm.

    PubMed

    Niu, Zhendong; Nie, Yaoxin; Zhou, Qian; Zhu, Linlin; Wei, Jieyao

    2016-05-18

    Brain network connectivity modeling is a crucial method for studying the brain's cognitive functions. Meta-analyses can unearth reliable results from individual studies. Meta-analytic connectivity modeling is a connectivity analysis method based on regions of interest (ROIs) which showed that meta-analyses could be used to discover brain network connectivity. In this paper, we propose a new meta-analysis method that can be used to find network connectivity models based on the Apriori algorithm, and that has the potential to derive brain network connectivity models from activation information in the literature without requiring ROIs. This method first extracts activation information from experimental studies that use cognitive tasks of the same category, and then maps the activation information to corresponding brain areas using the automated anatomical labeling (AAL) atlas, after which the activation rate of these brain areas is calculated. Finally, using these brain areas, a potential brain network connectivity model is calculated based on the Apriori algorithm. The present study used this method to conduct a mining analysis of the citations in a language review article by Price (Neuroimage 62(2):816-847, 2012). The results showed that the obtained network connectivity model was consistent with that reported by Price. The proposed method is helpful for finding brain network connectivity by mining the co-activation relationships among brain regions. Furthermore, results of the co-activation relationship analysis can be used as a priori knowledge for the corresponding dynamic causal modeling analysis, possibly achieving a significant dimension-reducing effect and thus increasing the efficiency of the dynamic causal modeling analysis.
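
    A hand-rolled sketch of the Apriori pattern the abstract describes: each study contributes a set of activated regions, frequent single regions are kept first, and candidate co-activation pairs are built only from those (the Apriori pruning step). The region labels, studies and support threshold are toy assumptions.

    ```python
    from itertools import combinations
    from collections import Counter

    # Each study contributes the set of brain areas it reports as activated
    # (toy labels; the paper maps coordinates to AAL atlas regions).
    studies = [
        {"IFG_L", "STG_L", "MTG_L"},
        {"IFG_L", "STG_L"},
        {"IFG_L", "MTG_L", "SMA"},
        {"STG_L", "MTG_L"},
        {"IFG_L", "STG_L", "MTG_L"},
    ]

    min_support = 0.4  # pair must co-activate in >= 40% of studies

    # First pass: keep frequent single regions.
    counts = Counter(r for s in studies for r in s)
    frequent = {r for r, c in counts.items() if c / len(studies) >= min_support}

    # Second pass: candidate pairs come only from frequent regions.
    pair_counts = Counter()
    for s in studies:
        for pair in combinations(sorted(s & frequent), 2):
            pair_counts[pair] += 1

    edges = {p: c / len(studies) for p, c in pair_counts.items()
             if c / len(studies) >= min_support}
    print(edges)  # candidate co-activation links for the connectivity model
    ```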

  1. Methods for the survey and genetic analysis of populations

    DOEpatents

    Ashby, Matthew

    2003-09-02

    The present invention relates to methods for performing surveys of the genetic diversity of a population. The invention also relates to methods for performing genetic analyses of a population. The invention further relates to methods for the creation of databases comprising the survey information and the databases created by these methods. The invention also relates to methods for analyzing the information to correlate the presence of nucleic acid markers with desired parameters in a sample. These methods have application in the fields of geochemical exploration, agriculture, bioremediation, environmental analysis, clinical microbiology, forensic science and medicine.

  2. The Fusion of Financial Analysis and Seismology: Statistical Methods from Financial Market Analysis Applied to Earthquake Data

    NASA Astrophysics Data System (ADS)

    Ohyanagi, S.; Dileonardo, C.

    2013-12-01

    As a natural phenomenon, earthquake occurrence is difficult to predict. Statistical analysis of earthquake data was performed using the candlestick chart and Bollinger Band methods. These statistical methods, commonly used in the financial world to analyze market trends, were tested against earthquake data. Earthquakes above Mw 4.0 located off the shore of Sanriku (37.75°N-41.00°N, 143.00°E-144.50°E) from February 1973 to May 2013 were selected for analysis. Two specific patterns in earthquake occurrence were recognized through the analysis. One is a spreading of the candlesticks prior to the occurrence of events greater than Mw 6.0. A second pattern is a convergence of the Bollinger Band, which implies a positive or negative change in the trend of earthquakes. Both patterns match general models for the buildup and release of strain through the earthquake cycle, and agree with the characteristics of the candlestick chart and Bollinger Band analysis. These results show a high correlation between patterns in earthquake occurrence and trend analysis by these two statistical methods, and support the appropriateness of applying these financial analysis methods to the analysis of earthquake occurrence.
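
    A minimal pandas sketch of the Bollinger Band part of the analysis, applied to a synthetic magnitude series standing in for the Sanriku catalogue; window length and the ±2σ bands are the conventional financial defaults, assumed here rather than taken from the paper.

    ```python
    import numpy as np
    import pandas as pd

    # Toy monthly series of maximum magnitudes in the study region.
    rng = np.random.default_rng(2)
    mag = pd.Series(4.0 + rng.gamma(2.0, 0.3, size=120))

    window = 20
    mid = mag.rolling(window).mean()
    std = mag.rolling(window).std()
    upper, lower = mid + 2 * std, mid - 2 * std

    # Band convergence (a narrowing "squeeze") is the pattern the authors
    # read as a possible change in the trend of earthquake occurrence.
    bandwidth = (upper - lower) / mid
    print(bandwidth.tail())
    ```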

  3. Optimal Multicomponent Analysis Using the Generalized Standard Addition Method.

    ERIC Educational Resources Information Center

    Raymond, Margaret; And Others

    1983-01-01

    Describes an experiment on the simultaneous determination of chromium and magnesium by spectrophotometry, modified to include the Generalized Standard Addition Method computer program, a multivariate calibration method that provides optimal multicomponent analysis in the presence of interference and matrix effects. Provides instructions for…

  4. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
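
    A sketch of the standard Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration for locating the MPP in standard normal space, with a numeric gradient; FORM then reads the reliability index β off the distance to the origin. The limit-state function below is a toy assumption, and this is the textbook search rather than the paper's sensitivity derivation.

    ```python
    import numpy as np

    def grad(g, u, h=1e-6):
        """Forward-difference gradient of the limit-state function g at u."""
        g0 = g(u)
        return np.array([(g(u + h * e) - g0) / h for e in np.eye(len(u))])

    def hlrf(g, u0, tol=1e-8, max_iter=100):
        """HL-RF search for the most probable point (MPP).

        Iterates u <- ((grad.u - g(u)) / |grad|^2) * grad; at convergence,
        beta = |u*| is the reliability index used by FORM.
        """
        u = np.asarray(u0, dtype=float)
        for _ in range(max_iter):
            dg = grad(g, u)
            u_new = (dg @ u - g(u)) / (dg @ dg) * dg
            if np.linalg.norm(u_new - u) < tol:
                break
            u = u_new
        return u, np.linalg.norm(u)

    # Example limit state in standard normal space: g(u) = 3 - u1 - u2.
    g = lambda u: 3.0 - u[0] - u[1]
    u_star, beta = hlrf(g, [0.0, 0.0])
    print(u_star, beta)  # MPP at (1.5, 1.5), beta = 3/sqrt(2) ~ 2.121
    ```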

  5. Market segmentation for multiple option healthcare delivery systems--an application of cluster analysis.

    PubMed

    Jarboe, G R; Gates, R H; McDaniel, C D

    1990-01-01

    Healthcare providers of multiple option plans may be confronted with special market segmentation problems. This study demonstrates how cluster analysis may be used for discovering distinct patterns of preference for multiple option plans. The availability of metric, as opposed to categorical or ordinal, data provides the ability to use sophisticated analysis techniques which may be superior to frequency distributions and cross-tabulations in revealing preference patterns.
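
    A hedged scikit-learn sketch of the kind of segmentation the abstract describes, clustering metric preference ratings into segments; the attributes, sample sizes and two-segment structure are toy assumptions.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Toy data: 200 respondents rating 4 plan attributes on a metric scale
    # (e.g., premium, provider choice, coverage breadth, paperwork burden).
    rng = np.random.default_rng(3)
    price_sensitive = rng.normal([9, 3, 4, 5], 1.0, size=(100, 4))
    choice_seeking = rng.normal([4, 9, 8, 3], 1.0, size=(100, 4))
    ratings = np.vstack([price_sensitive, choice_seeking])

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(ratings)
    for k in range(2):
        profile = km.cluster_centers_[k].round(1)
        print(f"segment {k}: n={np.sum(km.labels_ == k)}, profile={profile}")
    ```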

  6. Analysis of live cell images: Methods, tools and opportunities.

    PubMed

    Nketia, Thomas A; Sailem, Heba; Rohde, Gustavo; Machiraju, Raghu; Rittscher, Jens

    2017-02-15

    Advances in optical microscopy, biosensors and cell culturing technologies have transformed live cell imaging. Thanks to these advances, live cell imaging plays an increasingly important role in basic biology research as well as at all stages of drug development. Image analysis methods are needed to extract quantitative information from these vast and complex data sets. The aim of this review is to provide an overview of available image analysis methods for live cell imaging, in particular the required preprocessing, image segmentation, cell tracking and data visualisation methods. The potential opportunities that recent advances in machine learning, especially deep learning, and computer vision provide are discussed. The review includes an overview of the different available software packages and toolkits. Copyright © 2017. Published by Elsevier Inc.

  7. Efficient sensitivity analysis method for chaotic dynamical systems

    NASA Astrophysics Data System (ADS)

    Liao, Haitao

    2016-05-01

    The direct differentiation and improved least squares shadowing methods are both developed for accurately and efficiently calculating the sensitivity coefficients of time-averaged quantities for chaotic dynamical systems. The key idea is to recast the time-averaged integration term in the form of a differential equation before applying the sensitivity analysis method. An additional constraint-based equation, which forms the augmented equations of motion, is proposed to calculate the time-averaged integration variable, and the sensitivity coefficients are obtained by solving the augmented differential equations. Applying the least squares shadowing formulation to the augmented equations yields an explicit expression for the sensitivity coefficient that depends on the final state of the Lagrange multipliers. The LU factorization technique used to calculate the Lagrange multipliers improves both convergence and computational expense. Numerical experiments on a set of problems selected from the literature illustrate the developed methods. The numerical results demonstrate the correctness and effectiveness of the present approaches, and some short impulsive sensitivity coefficients are observed with the direct differentiation sensitivity analysis method.

  8. Discrimination of particulate matter emission sources using stochastic methods

    NASA Astrophysics Data System (ADS)

    Szczurek, Andrzej; Maciejewska, Monika; Wyłomańska, Agnieszka; Sikora, Grzegorz; Balcerek, Michał; Teuerle, Marek

    2016-12-01

    Particulate matter (PM) is one of the criteria pollutants determined to be harmful to public health and the environment, so the ability to recognize its emission sources is very important. A number of measurement methods allow PM to be characterized in terms of concentration, particle size distribution, and chemical composition. All this information is useful for establishing a link between the dust found in the air, its emission sources, and its influence on humans as well as the environment. However, the methods are typically quite sophisticated and not applicable outside laboratories. In this work, we considered a PM emission source discrimination method based on continuous measurements of PM concentration with a relatively cheap instrument and stochastic analysis of the obtained data. The stochastic analysis focuses on the temporal variation of PM concentration and involves two steps: (1) recognition of the category of distribution for the data, i.e., stable or the domain of attraction of a stable distribution, and (2) finding the best-matching distribution out of the Gaussian, stable and normal-inverse Gaussian (NIG) distributions. We examined six PM emission sources, associated with material processing in an industrial environment, namely machining and welding aluminum, forged carbon steel and plastic with various tools. As the obtained results show, PM emission sources may be distinguished based on the statistical distribution of PM concentration variations. The major factor responsible for the differences detectable with our method was the type of material processing and the tool applied; when different materials were processed with the same tool, distinguishing the emission sources was difficult. For successful discrimination it was crucial to consider size-segregated mass fraction concentrations. In our opinion the presented approach is very promising and deserves further study and development.
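
    A hedged scipy sketch of the second step, comparing candidate distributions for concentration increments by AIC; the stable fit is omitted here because scipy's levy_stable fitting is computationally heavy, and the simulated data are a toy stand-in for real PM measurements.

    ```python
    import numpy as np
    from scipy import stats

    # Toy stand-in for increments of a PM concentration time series.
    rng = np.random.default_rng(4)
    increments = stats.norminvgauss.rvs(a=1.5, b=0.3, size=2000, random_state=rng)

    candidates = {
        "gaussian": stats.norm,
        "normal-inverse gaussian": stats.norminvgauss,
    }

    for name, dist in candidates.items():
        params = dist.fit(increments)                   # maximum likelihood fit
        loglik = np.sum(dist.logpdf(increments, *params))
        aic = 2 * len(params) - 2 * loglik              # lower AIC = better fit
        print(f"{name:25s} AIC = {aic:.1f}")
    ```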

  9. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal-to-electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-Fl technique, is presented in detail.

  10. ARBAN-A new method for analysis of ergonomic effort.

    PubMed

    Holzmann, P

    1982-06-01

    ARBAN is a method for the ergonomic analysis of work, including work situations that involve widely differing body postures and loads. The idea of the method is that all phases of the analysis process requiring specific ergonomics knowledge are taken over by filming equipment and a computer routine, while all tasks that must be carried out by the investigator are designed so that they can be performed with systematic common sense. The ARBAN analysis method contains four steps: 1. Recording of the workplace situation on video or film. 2. Coding of the posture and load situation at a number of closely spaced 'frozen' situations. 3. Computerisation. 4. Evaluation of the results. The computer calculates figures for the total ergonomic stress on the whole body as well as on different parts of the body separately. They are presented as 'ergonomic stress/time curves', in which heavy-load situations appear as peaks of the curve. The work cycle may also be divided into different tasks, for which the stress and duration patterns can be compared. The integrals of the curves are calculated for single-figure comparison of different tasks as well as different work situations.

  11. Development of Educational Methods and Techniques Adapted to the Specific Conditions of the Developing Countries. Peer Tutoring: Operational Description of Various Systems and Their Applications.

    ERIC Educational Resources Information Center

    Charconnet, Marie-George

    This study describes various patterns of peer tutoring and is based on the use of cultural traditions and endogenous methods, on techniques and equipment acquired from other cultures, on problems presented by the adoption of educational technologies, and on methods needing little sophisticated equipment. A dozen peer tutoring systems are…

  12. Some New Mathematical Methods for Variational Objective Analysis

    NASA Technical Reports Server (NTRS)

    Wahba, G.; Johnson, D. R.

    1984-01-01

    New and/or improved variational methods for simultaneously combining forecast, heterogeneous observational data, a priori climatology, and physics to obtain improved estimates of the initial state of the atmosphere for the purpose of numerical weather prediction are developed. Cross validated spline methods are applied to atmospheric data for the purpose of improved description and analysis of atmospheric phenomena such as the tropopause and frontal boundary surfaces.

  13. Better Crunching: Recommendations for Multivariate Data Analysis Approaches for Program Impact Evaluations

    ERIC Educational Resources Information Center

    Braverman, Marc T.

    2016-01-01

    Extension program evaluations often present opportunities to analyze data in multiple ways. This article suggests that program evaluations can involve more sophisticated data analysis approaches than are often used. On the basis of a hypothetical program scenario and corresponding data set, two approaches to testing for evidence of program impact…

  14. Selenium analysis in waters. Part 2: Speciation methods.

    PubMed

    LeBlanc, Kelly L; Kumkrong, Paramee; Mercier, Patrick H J; Mester, Zoltán

    2018-06-21

    In aquatic ecosystems, there is often no correlation between the total concentration of selenium present in the water column and the toxic effects observed in that environment. This is due, in part, to the variation in the bioavailability of different selenium species to organisms at the base of the aquatic food chain. The first part of this review (Kumkrong et al., 2018) discusses regulatory framework and standard methodologies for selenium analysis in waters. In this second article, we are reviewing the state of speciation analysis and importance of speciation data for decision makers in industry and regulators. We look in detail at fractionation methods for speciation, including the popular selective sequential hydride generation. We examine advantages and limitations of these methods, in terms of achievable detection limits and interferences from other matrix species, as well as the potential to over- or under-estimate operationally-defined fractions based on the various conversion steps involved in fractionation processes. Additionally, we discuss methods of discrete speciation (through separation methods), their importance in analyzing individual selenium species, difficulties associated with their implementation, as well as ways to overcome these difficulties. We also provide a brief overview of biological treatment methods for the remediation of selenium-contaminated waters. We discuss the importance of selenium speciation in the application of these methods and their potential to actually increase the bioavailability of selenium despite decreasing its total waterborne concentration. Copyright © 2018. Published by Elsevier B.V.

  15. Application of econometric and ecology analysis methods in physics software

    NASA Astrophysics Data System (ADS)

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.

  16. An Ultra-Sensitive Method for the Analysis of Perfluorinated ...

    EPA Pesticide Factsheets

    In epidemiological research, it has become increasingly important to assess subjects' exposure to different classes of chemicals in multiple environmental media. It is a common practice to aliquot limited volumes of samples into smaller quantities for specific trace level chemical analysis. A novel method was developed for the determination of 14 perfluorinated alkyl acids (PFAAs) in small volumes (10 mL) of drinking water using off-line solid phase extraction (SPE) pre-treatment followed by on-line pre-concentration on WAX column before analysis on column-switching high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS). In general, large volumes (100 - 1000 mL) have been used for the analysis of PFAAs in drinking water. The current method requires approximately 10 mL of drinking water concentrated by using an SPE cartridge and eluted with methanol. A large volume injection of the extract was introduced on to a column-switching HPLC-MS/MS using a mix-mode SPE column for the trace level analysis of PFAAs in water. The recoveries for most of the analytes in the fortified laboratory blanks ranged from 73±14% to 128±5%. The lowest concentration minimum reporting levels (LCMRL) for the 14 PFAAs ranged from 0.59 to 3.4 ng/L. The optimized method was applied to a pilot-scale analysis of a subset of drinking water samples from an epidemiological study. These samples were collected directly from the taps in the households of Ohio and Nor

  17. The Design of Lessons Using Mathematics Analysis Software to Support Multiple Representations in Secondary School Mathematics

    ERIC Educational Resources Information Center

    Pierce, Robyn; Stacey, Kaye; Wander, Roger; Ball, Lynda

    2011-01-01

    Current technologies incorporating sophisticated mathematical analysis software (calculation, graphing, dynamic geometry, tables, and more) provide easy access to multiple representations of mathematical problems. Realising the affordances of such technology for students' learning requires carefully designed lessons. This paper reports on design…

  18. Participant Interaction in Asynchronous Learning Environments: Evaluating Interaction Analysis Methods

    ERIC Educational Resources Information Center

    Blanchette, Judith

    2012-01-01

    The purpose of this empirical study was to determine the extent to which three different objective analytical methods--sequence analysis, surface cohesion analysis, and lexical cohesion analysis--can most accurately identify specific characteristics of online interaction. Statistically significant differences were found in all points of…

  19. Application of principal component analysis to distinguish patients with schizophrenia from healthy controls based on fractional anisotropy measurements.

    PubMed

    Caprihan, A; Pearlson, G D; Calhoun, V D

    2008-08-15

    Principal component analysis (PCA) is often used to reduce the dimension of data before applying more sophisticated data analysis methods, such as non-linear classification algorithms or independent component analysis. This practice is based on selecting components corresponding to the largest eigenvalues. If the ultimate goal is separation of the data into two groups, however, this set of components need not have the most discriminatory power. We measured the distance between two such populations using the Mahalanobis distance and chose the eigenvectors to maximize it, a modified PCA method which we call discriminant PCA (DPCA). DPCA was applied to diffusion-tensor-based fractional anisotropy images to distinguish age-matched schizophrenia subjects from healthy controls. The performance of the proposed method was evaluated by the leave-one-out method. We show that, for this fractional anisotropy data set, the classification error with 60 components was close to the minimum error, and that the Mahalanobis distance was twice as large with DPCA as with PCA. Finally, by masking the discriminant function with the white matter tracts of the Johns Hopkins University atlas, we identified the left superior longitudinal fasciculus as the tract giving the least classification error. In addition, with six optimally chosen tracts the classification error was zero.
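
    A numpy sketch of the DPCA idea: rank PCA components by their contribution to the between-group Mahalanobis distance (squared mean gap over component variance) instead of by eigenvalue. For simplicity it uses the total covariance where the paper's criterion would use the pooled within-group covariance; data and names are toy assumptions.

    ```python
    import numpy as np

    def dpca_rank(X, y, n_keep=10):
        """Rank PCA components by between-group Mahalanobis contribution.

        X: (n_samples, n_features) data (e.g., vectorised FA maps);
        y: binary group labels. Returns indices of the n_keep best
        components and the projected data.
        """
        Xc = X - X.mean(axis=0)
        # Eigen-decomposition of the (total) covariance via SVD.
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        eigvals = s**2 / (len(X) - 1)
        scores = Xc @ Vt.T                        # component scores
        d = scores[y == 1].mean(0) - scores[y == 0].mean(0)
        contrib = d**2 / eigvals                  # Mahalanobis contribution
        order = np.argsort(contrib)[::-1][:n_keep]
        return order, scores[:, order]

    # Toy example: 40 subjects, 500 'voxels', weak shared group offset.
    rng = np.random.default_rng(5)
    X = rng.normal(size=(40, 500))
    y = np.repeat([0, 1], 20)
    X[y == 1] += 0.4 * rng.normal(size=500)
    idx, Z = dpca_rank(X, y, n_keep=5)
    print("selected components:", idx)
    ```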

  20. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218

  1. Comparison of urine analysis using manual and sedimentation methods.

    PubMed

    Kurup, R; Leich, M

    2012-06-01

    Microscopic examination of urine sediment is an essential part of the evaluation of renal and urinary tract diseases. Traditionally, urine sediments are assessed by microscopic examination of centrifuged urine; however, the current method used by the Georgetown Public Hospital Corporation Medical Laboratory uses uncentrifuged urine. To support a high level of care, the results provided to the physician must be accurate and reliable for proper diagnosis. The aim of this study is to determine whether the centrifuged method is more clinically useful than the uncentrifuged method. In this study, the results obtained from the centrifuged and uncentrifuged methods were compared. A total of 167 urine samples were randomly collected and analysed during April-May 2010 at the Medical Laboratory, Georgetown Public Hospital Corporation. The urine samples were first analysed microscopically by the uncentrifuged method, and then by the centrifuged method. The results from both methods were recorded in a log book, entered into a database created in Microsoft Excel, and analysed for differences and similarities; the results were further compared in SPSS software using Pearson's correlation. When compared using Pearson's correlation coefficient analysis, both methods showed good correlation for urinary sediments, with the exception of white blood cells. The centrifuged method had a slightly higher identification rate for all of the parameters. There is substantial agreement between the centrifuged and uncentrifuged methods; however, the uncentrifuged method provides a rapid turnaround time.

  2. Establishment of analysis method for methane detection by gas chromatography

    NASA Astrophysics Data System (ADS)

    Liu, Xinyuan; Yang, Jie; Ye, Tianyi; Han, Zeyu

    2018-02-01

    The study focused on establishing an analysis method for methane determination by gas chromatography. Methane was detected with a hydrogen flame ionization detector, and the quantitative relationship was determined from the working curve y = 2041.2x + 2187, with a correlation coefficient of 0.9979. A relative standard deviation of 2.60-6.33% and a recovery rate of 96.36%-105.89% were obtained in parallel determinations of standard gas. This method is not well suited to biogas content analysis, because the methane content of biogas would exceed the method's measurement range.
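
    A minimal numpy sketch of building such a working curve from calibration standards and inverting it for an unknown, mirroring the form y = 2041.2x + 2187; the standards and peak areas below are invented for illustration.

    ```python
    import numpy as np

    # Calibration standards: methane concentration vs. detector peak area.
    conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    area = np.array([3210, 4230, 6280, 10350, 18520])

    slope, intercept = np.polyfit(conc, area, 1)   # working curve y = a*x + b
    r = np.corrcoef(conc, area)[0, 1]
    print(f"y = {slope:.1f} x + {intercept:.1f}, r = {r:.4f}")

    # Inverse prediction: concentration of an unknown from its peak area,
    # valid only inside the calibrated range (cf. the biogas caveat above).
    unknown_area = 7500.0
    print("estimated concentration:", (unknown_area - intercept) / slope)
    ```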

  3. Mendelian randomization analysis of a time-varying exposure for binary disease outcomes using functional data analysis methods.

    PubMed

    Cao, Ying; Rajan, Suja S; Wei, Peng

    2016-12-01

    A Mendelian randomization (MR) analysis is performed to analyze the causal effect of an exposure variable on a disease outcome in observational studies, by using genetic variants that affect the disease outcome only through the exposure variable. This method has recently gained popularity among epidemiologists given the success of genetic association studies. Many exposure variables of interest in epidemiological studies are time varying, for example, body mass index (BMI). Although longitudinal data have been collected in many cohort studies, current MR studies only use one measurement of a time-varying exposure variable, which cannot adequately capture the long-term time-varying information. We propose using the functional principal component analysis method to recover the underlying individual trajectory of the time-varying exposure from the sparsely and irregularly observed longitudinal data, and then conduct MR analysis using the recovered curves. We further propose two MR analysis methods. The first assumes a cumulative effect of the time-varying exposure variable on the disease risk, while the second assumes a time-varying genetic effect and employs functional regression models. We focus on statistical testing for a causal effect. Our simulation studies mimicking the real data show that the proposed functional data analysis based methods incorporating longitudinal data have substantial power gains compared to standard MR analysis using only one measurement. We used the Framingham Heart Study data to demonstrate the promising performance of the new methods as well as inconsistent results produced by the standard MR analysis that relies on a single measurement of the exposure at some arbitrary time point. © 2016 WILEY PERIODICALS, INC.
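
    As a toy stand-in for the functional PCA step (the paper's method handles sparse, irregular longitudinal data more carefully), one can fit each subject's measurements with a low-order basis and run PCA on the fitted coefficients; the resulting per-subject scores could then serve as the exposure in the MR regression. A hedged numpy sketch, assuming each subject has more observations than the polynomial degree:

```python
import numpy as np

def trajectory_scores(times, values, degree=2, n_modes=2):
    """Toy stand-in for functional PCA: fit each subject's longitudinal
    measurements with a low-order polynomial, then run PCA on the fitted
    coefficients. `times`/`values` are lists of per-subject arrays
    (irregular grids allowed, but each needs > degree observations)."""
    coefs = np.array([np.polyfit(t, y, degree) for t, y in zip(times, values)])
    coefs -= coefs.mean(axis=0)
    # PCA on the coefficient matrix via SVD; rows of vt are trajectory modes.
    _, _, vt = np.linalg.svd(coefs, full_matrices=False)
    return coefs @ vt[:n_modes].T   # per-subject scores on dominant modes
```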

  4. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  5. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  6. QCL spectroscopy combined with the least squares method for substance analysis

    NASA Astrophysics Data System (ADS)

    Samsonov, D. A.; Tabalina, A. S.; Fufurin, I. L.

    2017-11-01

    The article briefly describes distinctive features of quantum cascade lasers (QCL). It also describes an experimental set-up for acquiring mid-infrared absorption spectra using QCL. The paper demonstrates experimental results in the form of normed spectra. We tested the application of the least squares method for spectrum analysis. We used this method for substance identification and extraction of concentration data. We compare the results with more common methods of absorption spectroscopy. Eventually, we prove the feasibility of using this simple method for quantitative and qualitative analysis of experimental data acquired with QCL.
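
    The core of such an analysis can be sketched as ordinary least squares over a library of reference spectra: the measured spectrum is modeled as a linear combination of references, the fitted coefficients estimate concentrations, and the residual supports identification. A minimal numpy sketch (names illustrative):

```python
import numpy as np

def unmix(measured, references):
    """Model a measured absorption spectrum as a linear combination of
    reference spectra: measured ≈ references @ c. `references` has shape
    (n_wavenumbers, n_components); the coefficients c estimate concentrations."""
    c, residual, _, _ = np.linalg.lstsq(references, measured, rcond=None)
    return c, residual

# Identification: fit each candidate library spectrum in turn and keep the
# candidates whose fits leave the smallest residual.
```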

  7. Scalable Kernel Methods and Algorithms for General Sequence Analysis

    ERIC Educational Resources Information Center

    Kuksa, Pavel

    2011-01-01

    Analysis of large-scale sequential data has become an important task in machine learning and pattern recognition, inspired in part by numerous scientific and technological applications such as document and text classification or the analysis of biological sequences. However, current computational methods for sequence comparison still lack…
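
    The classic starting point for such kernel methods is the spectrum kernel, the inner product of k-mer count vectors; the dissertation's contribution concerns scalable generalizations, but a minimal sketch of the basic kernel looks like this:

```python
from collections import Counter

def kmer_counts(seq, k=3):
    """Count all overlapping substrings of length k."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(s, t, k=3):
    """Inner product of k-mer count vectors (the classic spectrum kernel)."""
    cs, ct = kmer_counts(s, k), kmer_counts(t, k)
    return sum(cs[m] * ct[m] for m in cs if m in ct)

print(spectrum_kernel("ACGTACGT", "ACGTTT", k=3))   # -> 4
```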

  8. Compositional Analysis of Lignocellulosic Feedstocks. 1. Review and Description of Methods

    PubMed Central

    2010-01-01

    As interest in lignocellulosic biomass feedstocks for conversion into transportation fuels grows, the summative compositional analysis of biomass, or plant-derived material, becomes ever more important. The sulfuric acid hydrolysis of biomass has been used to measure lignin and structural carbohydrate content for more than 100 years. Researchers have applied these methods to measure the lignin and structural carbohydrate contents of woody materials, estimate the nutritional value of animal feed, analyze the dietary fiber content of human food, compare potential biofuels feedstocks, and measure the efficiency of biomass-to-biofuels processes. The purpose of this paper is to review the history and lineage of biomass compositional analysis methods based on a sulfuric acid hydrolysis. These methods have become the de facto procedure for biomass compositional analysis. The paper traces changes to the biomass compositional analysis methods through time to the biomass methods currently used at the National Renewable Energy Laboratory (NREL). The current suite of laboratory analytical procedures (LAPs) offered by NREL is described, including an overview of the procedures and methodologies and some common pitfalls. Suggestions are made for continuing improvement to the suite of analyses. PMID:20669951

  9. Method for factor analysis of GC/MS data

    DOEpatents

    Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R

    2012-09-11

    The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectrometry (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation followed by refinement with MCR-ALS to yield highly interpretable results.

  10. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diprete, D.; McCabe, D.

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S lab could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful to investigate the compatibility, separation efficiency, interference removal efficacy, and method sensitivity.

  11. Modified electrokinetic sample injection method in chromatography and electrophoresis analysis

    DOEpatents

    Davidson, J. Courtney; Balch, Joseph W.

    2001-01-01

    A sample injection method for horizontally configured multiple chromatography or electrophoresis units, each containing a number of separation/analysis channels, that enables efficient introduction of analyte samples. This loading method, when used in conjunction with horizontal microchannels, allows much reduced sample volumes and provides a means of sample stacking to greatly increase the concentration of the analyte zone. This reduction in the amount of sample can lead to great cost savings in sample preparation, particularly in massively parallel applications such as DNA sequencing. The essence of this method lies in the preparation of the input of the separation channel, the physical sample introduction, and subsequent removal of excess material. By this method, sample volumes of 100 nanoliters to 2 microliters have been used successfully, compared with the typical 5 microliters of sample required by the prior separation/analysis method.

  12. Application of the probabilistic approximate analysis method to a turbopump blade analysis. [for Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.

    1990-01-01

    An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.

  13. Static analysis of a sonar dome rubber window

    NASA Technical Reports Server (NTRS)

    Lai, J. L.

    1978-01-01

    The application of NASTRAN (level 16.0.1) to the static analysis of a sonar dome rubber window (SDRW) was demonstrated. The assessment of the conventional model (neglecting the enclosed fluid) for the stress analysis of the SDRW was made by comparing its results to those based on a sophisticated model (including the enclosed fluid). The fluid was modeled with isoparametric linear hexahedron elements with approximate material properties whose shear modulus was much smaller than its bulk modulus. The effect of the chosen material property for the fluid is discussed.

  14. Factor Retention in Exploratory Factor Analysis: A Comparison of Alternative Methods.

    ERIC Educational Resources Information Center

    Mumford, Karen R.; Ferron, John M.; Hines, Constance V.; Hogarty, Kristine Y.; Kromrey, Jeffery D.

    This study compared the effectiveness of 10 methods of determining the number of factors to retain in exploratory common factor analysis. The 10 methods included the Kaiser rule and a modified Kaiser criterion, 3 variations of parallel analysis, 4 regression-based variations of the scree procedure, and the minimum average partial procedure. The…

  15. Robust and reliable banknote authentification and print flaw detection with opto-acoustical sensor fusion methods

    NASA Astrophysics Data System (ADS)

    Lohweg, Volker; Schaede, Johannes; Türke, Thomas

    2006-02-01

    The authenticity checking and inspection of banknotes is a highly labour-intensive process in which, traditionally, every note on every sheet is inspected manually. However, with the advent of more and more sophisticated security features, both visible and invisible, and the requirement of cost reduction in the printing process, it is clear that automation is required. As new printing techniques and security features are established, total quality in security, authenticity, and banknote printing must be assured, which calls for a broader sensorial concept in general. We propose a concept covering both authenticity checking and inspection methods for pattern recognition and classification of securities and banknotes, based on sensor fusion and fuzzy interpretation of data measures. The approach combines different methods of authenticity analysis and print flaw detection, and can be used in vending or sorting machines as well as in printing machines. Usually only the existence or appearance of colours and their textures are checked by cameras. Our method combines visible camera images with IR-spectral sensitive sensors, acoustical measurements, and other quantities such as the temperature and pressure of the printing machines.

  16. Pesticide analysis using nanoceria-coated paper-based devices as a detection platform.

    PubMed

    Nouanthavong, Souksanh; Nacapricha, Duangjai; Henry, Charles S; Sameenoi, Yupaporn

    2016-03-07

    We report the first use of a paper-based device coated with nanoceria as a simple, low-cost and rapid detection platform for the analysis of organophosphate (OP) pesticides using an enzyme inhibition assay with acetylcholinesterase (AChE) and choline oxidase (ChOX). In the presence of acetylcholine, AChE and ChOX catalyze the formation of H2O2, which is detected colorimetrically by a nanoceria-coated device resulting in the formation of a yellow color. After incubation with OP pesticides, the AChE activity was inhibited, producing less H2O2, and a reduction in the yellow intensity. The assay is able to analyze OP pesticides without the use of sophisticated instruments and gives detection limits of 18 ng mL(-1) and 5.3 ng mL(-1) for methyl-paraoxon and chlorpyrifos-oxon, respectively. The developed method was successfully applied to detect methyl-paraoxon in spiked vegetables (cabbage) and a dried seafood product (dried green mussel), obtaining ∼95% recovery values for both sample types. The spiked samples were also analyzed using LC-MS/MS as a comparison to the developed method and similar values were obtained, indicating that the developed method gives accurate results and is suitable for OP analysis in real samples.

  17. Investigating Convergence Patterns for Numerical Methods Using Data Analysis

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.

    2013-01-01

    The article investigates the patterns that arise in the convergence of numerical methods, particularly those in the errors involved in successive iterations, using data analysis and curve fitting methods. In particular, the results obtained are used to convey a deeper level of understanding of the concepts of linear, quadratic, and cubic…
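
    The standard curve-fitting device for this kind of investigation is to estimate the order p from successive errors: assuming e(n+1) ≈ C·e(n)^p, the points lie on a line of slope p in log-log coordinates. A minimal sketch (the error values below are illustrative):

```python
import numpy as np

def convergence_order(errors):
    """Estimate the order p from successive iteration errors, assuming
    e(n+1) ≈ C * e(n)**p, i.e. a line of slope p in log-log coordinates."""
    x, y = np.log(errors[:-1]), np.log(errors[1:])
    p, _log_C = np.polyfit(x, y, 1)
    return p

# Errors from Newton's method roughly square each step, so p should be ≈ 2.
print(convergence_order([1e-1, 5e-3, 1.2e-5, 7.5e-11]))
```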

  18. Robust gene selection methods using weighting schemes for microarray data analysis.

    PubMed

    Kang, Suyeon; Song, Jongwoo

    2017-09-02

    A common task in microarray data analysis is to identify informative genes that are differentially expressed between two different states. Owing to the high-dimensional nature of microarray data, identification of significant genes has been essential in analyzing the data. However, the performances of many gene selection techniques are highly dependent on the experimental conditions, such as the presence of measurement error or a limited number of sample replicates. We have proposed new filter-based gene selection techniques, by applying a simple modification to significance analysis of microarrays (SAM). To prove the effectiveness of the proposed method, we considered a series of synthetic datasets with different noise levels and sample sizes along with two real datasets. The following findings were made. First, our proposed methods outperform conventional methods for all simulation set-ups. In particular, our methods are much better when the given data are noisy and sample size is small. They showed relatively robust performance regardless of noise level and sample size, whereas the performance of SAM became significantly worse as the noise level became high or sample size decreased. When sufficient sample replicates were available, SAM and our methods showed similar performance. Finally, our proposed methods are competitive with traditional methods in classification tasks for microarrays. The results of simulation study and real data analysis have demonstrated that our proposed methods are effective for detecting significant genes and classification tasks, especially when the given data are noisy or have few sample replicates. By employing weighting schemes, we can obtain robust and reliable results for microarray data analysis.

  19. Sentiment Analysis of Health Care Tweets: Review of the Methods Used.

    PubMed

    Gohil, Sunir; Vuik, Sabine; Darzi, Ara

    2018-04-23

    Twitter is a microblogging service where users can send and read short 140-character messages called "tweets." Many unstructured, free-text tweets relating to health care are being shared on Twitter, which is becoming a popular area for health care research. Sentiment is a metric commonly used to investigate the positive or negative opinion within these messages. Exploring the methods used for sentiment analysis in Twitter health care research may allow us to better understand the options available for future research in this growing field. The first objective of this study was to understand which tools are available for sentiment analysis of Twitter health care research, by reviewing existing studies in this area and the methods they used. The second objective was to determine which method would work best in health care settings, by analyzing how the methods were used to answer specific health care questions, how the tools were produced, and how their accuracy was analyzed. A review of the literature was conducted pertaining to Twitter and health care research that used a quantitative method of sentiment analysis for the free-text messages (tweets). The study compared the types of tools used in each case and examined methods for tool production, tool training, and analysis of accuracy. A total of 12 papers studying the quantitative measurement of sentiment in the health care setting were found. More than half of these studies produced tools specifically for their research, 4 used open source tools available freely, and 2 used commercially available software. Moreover, 4 of the 12 tools were trained using a smaller sample of the study's final data. The sentiment method was trained against, on average, 0.45% (2816/627,024) of the total sample data. One of the 12 papers commented on the analysis of the accuracy of the tool used. Multiple methods are used for sentiment analysis of tweets in the health care setting. These range from self-produced basic
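
    As an example of the freely available open-source route mentioned above, NLTK's VADER analyzer assigns polarity scores to a short text; this is an illustration of the tool class, not necessarily one of the 12 reviewed tools, and the tweet text is invented:

```python
import nltk
nltk.download("vader_lexicon", quiet=True)   # one-time lexicon download
from nltk.sentiment import SentimentIntensityAnalyzer

sia = SentimentIntensityAnalyzer()
tweet = "Great experience at the clinic today, staff were wonderful!"
print(sia.polarity_scores(tweet))
# -> {'neg': ..., 'neu': ..., 'pos': ..., 'compound': ...}
```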

  20. New insight in quantitative analysis of vascular permeability during immune reaction (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon

    2016-03-01

    The use of fluorescence imaging of vascular permeability has become a gold standard for assessing the inflammation process during experimental immune response in vivo. Optical fluorescence imaging provides a very useful and simple tool for this purpose. The motivation comes from the need for robust and simple quantification and data presentation of inflammation based on vascular permeability. The change of fluorescence intensity as a function of time is a widely accepted way to assess vascular permeability during inflammation related to the immune response. In the present study we propose to bring a new dimension by applying a more sophisticated approach to the analysis of the vascular reaction, using quantitative methods derived from astronomical observations, in particular a space-time Fourier filtering analysis followed by a polynomial orthogonal mode decomposition. We demonstrate that the temporal evolution of the fluorescence intensity observed at certain pixels correlates quantitatively with the blood flow circulation under normal conditions. The approach allows us to determine the regions of permeability and to monitor both the fast kinetics related to the distribution of the contrast material in the circulatory system and the slow kinetics associated with its extravasation. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.
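
    The space-time Fourier filtering step can be sketched in a few lines of numpy: transform the image stack along time and both spatial axes, retain a band of frequencies, and transform back. The authors' full pipeline adds a polynomial orthogonal mode decomposition, which is not shown, and the parameters here are illustrative:

```python
import numpy as np

def spatiotemporal_lowpass(stack, keep_fraction=0.1):
    """Crude space-time Fourier filter for a fluorescence image stack of shape
    (time, height, width): keep only low frequencies along every axis."""
    F = np.fft.fftshift(np.fft.fftn(stack))
    mask = np.zeros_like(F, dtype=bool)
    slices = tuple(
        slice(int(n / 2 - keep_fraction * n / 2), int(n / 2 + keep_fraction * n / 2))
        for n in stack.shape
    )
    mask[slices] = True                           # central (low-frequency) band
    return np.fft.ifftn(np.fft.ifftshift(F * mask)).real

stack = np.random.rand(64, 32, 32)   # synthetic stand-in for imaging data
smooth = spatiotemporal_lowpass(stack)
```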

  1. The Constant Comparative Analysis Method Outside of Grounded Theory

    ERIC Educational Resources Information Center

    Fram, Sheila M.

    2013-01-01

    This commentary addresses the gap in the literature regarding discussion of the legitimate use of Constant Comparative Analysis Method (CCA) outside of Grounded Theory. The purpose is to show the strength of using CCA to maintain the emic perspective and how theoretical frameworks can maintain the etic perspective throughout the analysis. My…

  2. Analysis method for Thomson scattering diagnostics in GAMMA 10/PDX.

    PubMed

    Ohta, K; Yoshikawa, M; Yasuhara, R; Chikatsu, M; Shima, Y; Kohagura, J; Sakamoto, M; Nakasima, Y; Imai, T; Ichimura, M; Yamada, I; Funaba, H; Minami, T

    2016-11-01

    We have developed an analysis method to improve the accuracy of electron temperature measurements by employing a fitting technique for the raw Thomson scattering (TS) signals. Least-squares fitting of the raw TS signals enabled reduction of the error in the electron temperature measurement. We applied the analysis method to a multi-pass (MP) TS system. Because the interval between the MPTS signals is very short, it is difficult to analyze each Thomson scattering signal intensity separately using the raw signals. We used the fitting method to recover the original TS signals from the measured raw MPTS signals and thereby obtain the electron temperatures on each pass.

  3. Probabilistic structural analysis methods and applications

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.

  4. Regional frequency analysis of extreme rainfalls using partial L moments method

    NASA Astrophysics Data System (ADS)

    Zakaria, Zahrahtul Amani; Shabri, Ani

    2013-07-01

    An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the methods of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, was chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L and LH moments methods for the estimation of large-return-period events.
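
    For orientation, the first few sample L-moments, the quantities that PL moments generalize by censoring small observations, can be computed from probability-weighted moments as in this sketch (requires at least four observations):

```python
import numpy as np

def sample_l_moments(x):
    """First four sample L-moments via probability-weighted moments (PWMs).
    Returns location, scale, L-skewness, and L-kurtosis."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2
```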

  5. Analytical quality assurance in veterinary drug residue analysis methods: matrix effects determination and monitoring for sulfonamides analysis.

    PubMed

    Hoff, Rodrigo Barcellos; Rübensam, Gabriel; Jank, Louise; Barreto, Fabiano; Peralba, Maria do Carmo Ruaro; Pizzolato, Tânia Mara; Silvia Díaz-Cruz, M; Barceló, Damià

    2015-01-01

    In residue analysis of veterinary drugs in foodstuffs, matrix effects are one of the most critical points. This work presents a discussion of approaches used to estimate, minimize, and monitor matrix effects in bioanalytical methods. Qualitative and quantitative methods for the estimation of matrix effects, such as post-column infusion, analysis of slope ratios, calibration curves (mathematical and statistical analysis), and control chart monitoring, are discussed using real data. Matrix effects vary over a wide range depending on the analyte and the sample preparation method: pressurized liquid extraction of liver samples showed matrix effects from 15.5 to 59.2%, while ultrasound-assisted extraction gave values from 21.7 to 64.3%. The influence of the matrix itself was also evaluated: for sulfamethazine analysis, signal losses ranged from -37 to -96% for fish and eggs, respectively. Advantages and drawbacks are also discussed in terms of a proposed workflow for matrix effect assessment, applied to real data from sulfonamide residue analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
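
    The slope-ratio estimate mentioned above reduces to a one-line computation: the matrix effect is the percentage deviation of the matrix-matched calibration slope from the solvent calibration slope. A sketch with invented slopes:

```python
def matrix_effect(slope_matrix, slope_solvent):
    """Matrix effect (%) from the ratio of calibration slopes; negative values
    indicate ion suppression, positive values signal enhancement."""
    return (slope_matrix / slope_solvent - 1.0) * 100.0

print(matrix_effect(0.55, 1.0))   # -> -45.0, i.e. 45% signal suppression
```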

  6. Quantitative mass spectrometry methods for pharmaceutical analysis

    PubMed Central

    Loos, Glenn; Van Schepdael, Ann

    2016-01-01

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally aimed for to provide automated multi-residue analysis, (less sensitive) miniaturized set-ups have a great potential due to their ability for in-field usage. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644982

  7. Applications of modern statistical methods to analysis of data in physical science

    NASA Astrophysics Data System (ADS)

    Wicker, James Eric

    Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970s, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960s, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching strategy, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960s and 1970s respectively, formed the basis of multivariate cluster analysis methodology for many years. However, shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data depart seriously from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance
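
    Information scoring of regression models can be illustrated with AIC-based best-subset selection, which replaces stepwise entry/exit thresholds with a single score per candidate model. This numpy sketch is a generic stand-in, not the dissertation's exact procedure, and exhaustive search is only feasible for modest numbers of predictors:

```python
import itertools
import numpy as np

def aic_ols(X, y):
    """AIC for an ordinary least-squares fit (k coefficients + error variance)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + 2 * (k + 1)

def best_subset(X, y, names):
    """Score every predictor subset by AIC instead of stepwise entry/exit rules."""
    best_score, best_names = np.inf, None
    for r in range(1, len(names) + 1):
        for cols in itertools.combinations(range(len(names)), r):
            score = aic_ols(X[:, list(cols)], y)
            if score < best_score:
                best_score, best_names = score, [names[c] for c in cols]
    return best_score, best_names
```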

  8. [Development and application of morphological analysis method in Aspergillus niger fermentation].

    PubMed

    Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang

    2015-02-01

    Filamentous fungi are widely used in industrial fermentation, and a particular fungal morphology acts as a critical index of a successful fermentation. To break the bottleneck in morphological analysis, we have developed a reliable method for fungal morphological analysis. With this method, we can prepare hundreds of pellet samples simultaneously and quickly obtain quantitative morphological information at large scale. This method can greatly increase the accuracy and reliability of morphological analysis results. On this basis, studies of Aspergillus niger morphology under different oxygen supply and shear rate conditions were carried out. As a result, the morphological response patterns of A. niger to these conditions were quantitatively demonstrated, laying a solid foundation for further scale-up.

  9. Sampling methods for microbiological analysis of red meat and poultry carcasses.

    PubMed

    Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos

    2004-06-01

    Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes, as needed for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when a correlation between the destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.

  10. An empirical method for dynamic camouflage assessment

    NASA Astrophysics Data System (ADS)

    Blitch, John G.

    2011-06-01

    As camouflage systems become increasingly sophisticated in their potential to conceal military personnel and precious cargo, evaluation methods need to evolve as well. This paper presents an overview of one such attempt to explore alternative methods for empirical evaluation of dynamic camouflage systems which aspire to keep pace with a soldier's movement through rapidly changing environments that are typical of urban terrain. Motivating factors are covered first, followed by a description of the Blitz Camouflage Assessment (BCA) process and results from an initial proof of concept experiment conducted in November 2006. The conclusion drawn from these results, related literature and the author's personal experience suggest that operational evaluation of personal camouflage needs to be expanded beyond its foundation in signal detection theory and embrace the challenges posed by high levels of cognitive processing.

  11. Multiple methods integration for structural mechanics analysis and design

    NASA Technical Reports Server (NTRS)

    Housner, J. M.; Aminpour, M. A.

    1991-01-01

    A new research area of multiple methods integration is proposed for joining diverse methods of structural mechanics analysis which interact with one another. Three categories of multiple methods are defined: those in which a physical interface is well defined; those in which a physical interface is not well defined, but selected; and those in which the interface is a mathematical transformation. Two fundamental integration procedures are presented that can be extended to integrate various methods (e.g., finite elements, Rayleigh-Ritz, Galerkin, and integral methods) with one another. Since the finite element method will likely be the major method to be integrated, its enhanced robustness under element distortion is also examined and a new robust shell element is demonstrated.

  12. Self-adaptive method for high frequency multi-channel analysis of surface wave method

    USDA-ARS?s Scientific Manuscript database

    When the high frequency multi-channel analysis of surface waves (MASW) method is conducted to explore soil properties in the vadose zone, existing rules for selecting the near offset and spread lengths cannot satisfy the requirements of planar dominant Rayleigh waves for all frequencies of interest ...

  13. The Impact of Normalization Methods on RNA-Seq Data Analysis

    PubMed Central

    Zyprych-Walczak, J.; Szabelska, A.; Handschuh, L.; Górczak, K.; Klamecka, K.; Figlerowicz, M.; Siatkowski, I.

    2015-01-01

    High-throughput sequencing technologies, such as the Illumina HiSeq, are powerful new tools for investigating a wide range of biological and medical problems. The massive and complex data sets produced by the sequencers create a need for the development of statistical and computational methods that can tackle the analysis and management of data. Data normalization is one of the most crucial steps of data processing, and this process must be carefully considered as it has a profound effect on the results of the analysis. In this work, we focus on a comprehensive comparison of five normalization methods related to sequencing depth, widely used for transcriptome sequencing (RNA-seq) data, and their impact on the results of gene expression analysis. Based on this study, we suggest a universal workflow that can be applied for the selection of the optimal normalization procedure for any particular data set. The described workflow includes calculation of the bias and variance values for the control genes, sensitivity and specificity of the methods, and classification errors, as well as generation of diagnostic plots. Combining the above information facilitates the selection of the most appropriate normalization method for the studied data sets and determines which methods can be used interchangeably. PMID:26176014
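
    One widely used sequencing-depth normalization of the kind compared in such studies is the DESeq-style median-of-ratios method; a hedged numpy sketch (not the paper's implementation):

```python
import numpy as np

def median_of_ratios(counts):
    """DESeq-style size factors: each sample's median ratio to a gene-wise
    geometric-mean reference. `counts` is a (genes x samples) array of raw
    read counts; genes with a zero count in any sample are ignored."""
    with np.errstate(divide="ignore"):
        log_counts = np.log(counts.astype(float))
    log_ref = log_counts.mean(axis=1)            # log geometric-mean reference
    keep = np.isfinite(log_ref)                  # drops genes containing zeros
    size_factors = np.exp(np.median(log_counts[keep] - log_ref[keep, None], axis=0))
    return counts / size_factors                 # depth-normalized counts
```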

  14. Strength Analysis on Ship Ladder Using Finite Element Method

    NASA Astrophysics Data System (ADS)

    Budianto; Wahyudi, M. T.; Dinata, U.; Ruddianto; Eko P., M. M.

    2018-01-01

    In designing a ship's structure, the applicable classification rules and standards must be followed. In this case, the design of a ladder (staircase) on a ferry ship must be reviewed against the loads arising during ship operations, both while sailing and during port operations. The classification rules for ship design specify how the structural components are to be calculated, and the design can be analysed using the finite element method. The classification regulations used in the design of this ferry ship were those of BKI (Bureau of Classification Indonesia), so the material composition and mechanical properties must follow the classification of the vessel. The structural analysis used a finite-element-based structural analysis package. The analysis showed that the ladder structure can withstand a 140 kg load under static, dynamic, and impact conditions. The computed safety factors indicate that the structure is safe without being excessively strong.

  15. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  16. Proteomic methods for analysis of S-nitrosation

    PubMed Central

    Kettenhofen, Nicholas; Broniowska, Katarzyna; Keszler, Agnes; Zhang, Yanhong; Hogg, Neil

    2007-01-01

    This review discusses proteomic methods to detect and identify S-nitrosated proteins. Protein S-nitrosation, the post-translational modification of thiol residues to form S-nitrosothiols, has been suggested to be a mechanism of cellular redox signaling by which nitric oxide can alter cellular function through modification of protein thiol residues. It has become apparent that methods that will detect and identify low levels of S-nitrosated protein in complex protein mixtures are required in order to fully appreciate the range, extent and selectivity of this modification in both physiological and pathological conditions. While many advances have been made in the detection of either total cellular S-nitrosation or individual S-nitrosothiols, proteomic methods for the detection of S-nitrosation are in relative infancy. This review will discuss the major methods that have been used for the proteomic analysis of protein S-nitrosation and discuss the pros and cons of this methodology. PMID:17360249

  17. Low-Cost Method for Quantifying Sodium in Coconut Water and Seawater for the Undergraduate Analytical Chemistry Laboratory: Flame Test, a Mobile Phone Camera, and Image Processing

    ERIC Educational Resources Information Center

    Moraes, Edgar P.; da Silva, Nilbert S. A.; de Morais, Camilo de L. M.; das Neves, Luiz S.; de Lima, Kassio M. G.

    2014-01-01

    The flame test is a classical analytical method that is often used to teach students how to identify specific metals. However, some universities in developing countries have difficulties acquiring the sophisticated instrumentation needed to demonstrate how to identify and quantify metals. In this context, a method was developed based on the flame…

  18. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.

  19. Improved methods of vibration analysis of pretwisted, airfoil blades

    NASA Technical Reports Server (NTRS)

    Subrahmanyam, K. B.; Kaza, K. R. V.

    1984-01-01

    Vibration analysis of pretwisted blades of asymmetric airfoil cross section is performed by using two mixed variational approaches. Numerical results obtained from these two methods are compared to those obtained from an improved finite difference method and also to those given by the ordinary finite difference method. The relative merits, convergence properties and accuracies of all four methods are studied and discussed. The effects of asymmetry and pretwist on natural frequencies and mode shapes are investigated. The improved finite difference method is shown to be far superior to the conventional finite difference method in several respects. Close lower bound solutions are provided by the improved finite difference method for untwisted blades with a relatively coarse mesh while the mixed methods have not indicated any specific bound.

  20. Study on color difference estimation method of medicine biochemical analysis

    NASA Astrophysics Data System (ADS)

    Wang, Chunhong; Zhou, Yue; Zhao, Hongxia; Sun, Jiashi; Zhou, Fengkun

    2006-01-01

    Biochemical analysis is an important inspection and diagnostic method in hospital clinics, and the biochemical analysis of urine is one important item. Urine test paper shows a corresponding color for each detection target and degree of illness. The color difference between a standard threshold and the color of the urine test paper can be used to judge the degree of illness, enabling further analysis and diagnosis. Color is a three-dimensional perceptual variable, while reflectance is one-dimensional; therefore, estimating color difference in a urine test can offer better precision and convenience than the conventional test based on one-dimensional reflectance, and can support an accurate diagnosis. A digital camera can easily capture an image of the urine test paper and is therefore convenient for urine biochemical analysis. In the experiment, color images of urine test paper were taken with a popular color digital camera and saved on a computer running simple color space conversion (RGB -> XYZ -> L*a*b*) and calculation software. Test samples are graded according to intelligent detection of quantitative color. The images taken each time were saved on the computer, so the whole course of the illness can be monitored. This method can also be used in other medical biochemical analyses that involve color. Experimental results show that this test method is quick and accurate; it can be used in hospitals, calibration organizations, and homes, so its application prospects are extensive.
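
    The color-difference computation itself is simple once patch colors are in L*a*b*: convert RGB through XYZ to L*a*b* and take the Euclidean distance (the CIE76 ΔE*ab). The paper used its own conversion software; the sketch below instead uses scikit-image, and the patch values are invented:

```python
import numpy as np
from skimage import color

def delta_e(rgb_patch, rgb_reference):
    """CIE76 color difference between a test-paper patch and a reference color.
    Inputs are float RGB triples in [0, 1]; conversion follows RGB -> XYZ -> L*a*b*."""
    lab1 = color.rgb2lab(rgb_patch.reshape(1, 1, 3)).ravel()
    lab2 = color.rgb2lab(rgb_reference.reshape(1, 1, 3)).ravel()
    return np.linalg.norm(lab1 - lab2)

patch = np.array([0.85, 0.75, 0.20])   # mean RGB of the imaged test pad
ref = np.array([0.90, 0.85, 0.30])     # standard color for one grade
print(delta_e(patch, ref))
```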

  1. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.

  2. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  3. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  4. Analysis of Social Cohesion in Health Data by Factor Analysis Method: The Ghanaian Perspective

    ERIC Educational Resources Information Center

    Saeed, Bashiru I. I.; Xicang, Zhao; Musah, A. A. I.; Abdul-Aziz, A. R.; Yawson, Alfred; Karim, Azumah

    2013-01-01

    We investigated the overall social cohesion of Ghanaians. In this study, we considered the paramount interest of the involvement of Ghanaians in their communities, their views of other people and institutions, and their level of interest in both local and national politics. The factor analysis method was employed for analysis using R…

  5. Methods for collection and analysis of aquatic biological and microbiological samples

    USGS Publications Warehouse

    Britton, L.J.; Greeson, P.E.

    1989-01-01

    The series of chapters on techniques describes methods used by the U.S. Geological Survey for planning and conducting water-resources investigations. The material is arranged under major subject headings called books and is further subdivided into sections and chapters. Book 5 is on laboratory analysis. Section A is on water. The unit of publication, the chapter, is limited to a narrow field of subject matter. "Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" is the fourth chapter to be published under Section A of Book 5. The chapter number includes the letter of the section.This chapter was prepared by several aquatic biologists and microbiologists of the U.S. Geological Survey to provide accurate and precise methods for the collection and analysis of aquatic biological and microbiological samples.Use of brand, firm, and trade names in this chapter is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey.This chapter supersedes "Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" edited by P.E. Greeson, T.A. Ehlke, G.A. Irwin, B.W. Lium, and K.V. Slack (U.S. Geological Survey Techniques of Water-Resources Investigations, Book 5, Chapter A4, 1977) and also supersedes "A Supplement to-Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" by P.E. Greeson (U.S. Geological Survey Techniques of Water-Resources Investigations, Book 5, Chapter A4), Open-File Report 79-1279, 1979.

  6. Big Data Analysis Framework for Healthcare and Social Sectors in Korea

    PubMed Central

    Song, Tae-Min

    2015-01-01

    Objectives We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. Methods We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Results Developed countries (e.g., the United States, the UK, Singapore, Australia, and even OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of current government and the strategic goals of the Ministry of Health and Welfare. We suggest a framework of big data analysis in the healthcare and welfare service sectors separately and assigned them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. Conclusions There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached. PMID:25705552

  7. Attitudes about high school physics in relationship to gender and ethnicity: A mixed method analysis

    NASA Astrophysics Data System (ADS)

    Hafza, Rabieh Jamal

    There is an achievement gap and lack of participation in science, technology, engineering, and math (STEM) by minority females. The number of minority females majoring in STEM related fields and earning advanced degrees in these fields has not significantly increased over the past 40 years. Previous research has evaluated the relationship between self-identity concept and factors that promote the academic achievement as well the motivation of students to study different subject areas. This study examined the interaction between gender and ethnicity in terms of physics attitudes in the context of real world connections, personal interest, sense making/effort, problem solving confidence, and problem solving sophistication. The Colorado Learning Attitudes about Science Survey (CLASS) was given to 131 students enrolled in physics classes. There was a statistically significant Gender*Ethnicity interaction for attitude in the context of Real World Connections, Personal Interest, Sense Making/Effort, Problem Solving Confidence, and Problem Solving Sophistication as a whole. There was also a statistically significant Gender*Ethnicity interaction for attitude in the context of Real World Connections, Personal Interest, and Sense Making/Effort individually. Five Black females were interviewed to triangulate the quantitative results and to describe the experiences of minority females taking physics classes. There were four themes that emerged from the interviews and supported the findings from the quantitative results. The data supported previous research done on attitudes about STEM. The results reported that Real World Connections and Personal Interest could be possible factors that explain the lack of participation and achievement gaps that exists among minority females.

  8. Factor Analysis of Drawings: Application to college student models of the greenhouse effect

    NASA Astrophysics Data System (ADS)

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-09-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance, suggesting that 4 archetype models of the greenhouse effect dominate thinking within this population. Factor scores, indicating the extent to which each student's drawing aligned with representative models, were compared to performance on conceptual understanding and attitudes measures, demographics, and non-cognitive features of drawings. Student drawings were also compared to drawings made by scientists to ascertain the extent to which models reflect more sophisticated and accurate models. Results indicate that student and scientist drawings share some similarities, most notably the presence of some features of the most sophisticated non-scientific model held among the study population. Prior knowledge, prior attitudes, gender, and non-cognitive components are also predictive of an individual student's model. This work presents a new technique for analyzing drawings, with general implications for the use of drawings in investigating student conceptions.

  9. Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy

    NASA Astrophysics Data System (ADS)

    Sharma, Sanjib

    2017-08-01

    Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
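
    The workhorse of such analyses is the random-walk Metropolis sampler, which needs only the log posterior density up to an additive constant. A minimal sketch for a one-dimensional target (this is the generic algorithm, not code from the distributed software):

```python
import numpy as np

def metropolis(log_post, x0, n_steps=10000, step=0.5, rng=None):
    """Random-walk Metropolis sampler for a 1-D posterior given its log density."""
    rng = np.random.default_rng() if rng is None else rng
    chain = np.empty(n_steps)
    x, lp = x0, log_post(x0)
    for i in range(n_steps):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # accept with prob min(1, ratio)
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Sample a unit Gaussian posterior and check the chain's moments after burn-in.
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0)
print(chain[1000:].mean(), chain[1000:].std())
```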

  10. Application of Bounded Linear Stability Analysis Method for Metrics-Driven Adaptive Control

    NASA Technical Reports Server (NTRS)

    Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje

    2009-01-01

    This paper presents the application of Bounded Linear Stability Analysis (BLSA) method for metrics-driven adaptive control. The bounded linear stability analysis method is used for analyzing stability of adaptive control models, without linearizing the adaptive laws. Metrics-driven adaptive control introduces a notion that adaptation should be driven by some stability metrics to achieve robustness. By the application of bounded linear stability analysis method the adaptive gain is adjusted during the adaptation in order to meet certain phase margin requirements. Analysis of metrics-driven adaptive control is evaluated for a second order system that represents a pitch attitude control of a generic transport aircraft. The analysis shows that the system with the metrics-conforming variable adaptive gain becomes more robust to unmodeled dynamics or time delay. The effect of analysis time-window for BLSA is also evaluated in order to meet the stability margin criteria.

  11. Benchmarking Foot Trajectory Estimation Methods for Mobile Gait Analysis

    PubMed Central

    Ollenschläger, Malte; Roth, Nils; Klucken, Jochen

    2017-01-01

    Mobile gait analysis systems based on inertial sensing on the shoe are applied in a wide range of applications. Especially for medical applications, they can give new insights into motor impairment in, e.g., neurodegenerative disease and help objectify patient assessment. One key component in these systems is the reconstruction of the foot trajectories from inertial data. In the literature, various methods for this task have been proposed. However, performance is evaluated on a variety of datasets due to the lack of large, generally accepted benchmark datasets, and this hinders a fair comparison of methods. In this work, we implement three orientation estimation and three double integration schemes for use in a foot trajectory estimation pipeline. All methods are drawn from the literature and evaluated against a marker-based motion capture reference. We provide a fair comparison on the same dataset consisting of 735 strides from 16 healthy subjects. As a result, the implemented methods are ranked and we identify the most suitable processing pipeline for foot trajectory estimation in the context of mobile gait analysis. PMID:28832511
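    As an illustration of the double-integration stage, here is a common minimal scheme under stated assumptions: the acceleration is already rotated to the world frame with gravity removed, and the foot is at rest at both stride boundaries, so velocity drift can be removed linearly (a zero-velocity update). This is a generic sketch, not one of the benchmarked pipelines:

      import numpy as np

      def integrate_stride(acc, fs):
          """acc: (N, 3) world-frame free acceleration [m/s^2]; fs: sample rate [Hz]."""
          dt = 1.0 / fs
          vel = np.cumsum(acc, axis=0) * dt
          # Zero-velocity update: remove linear drift so velocity is zero
          # at the start and end of the stride.
          drift = np.outer(np.linspace(0.0, 1.0, len(vel)), vel[-1])
          vel -= drift
          pos = np.cumsum(vel, axis=0) * dt
          return vel, pos

      # Toy stride: accelerate forward for half the stride, brake for the rest
      acc = np.zeros((200, 3)); acc[:100, 0] = 1.0; acc[100:, 0] = -1.0
      vel, pos = integrate_stride(acc, fs=200.0)
      print("stride length ~", round(pos[-1, 0], 2), "m")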

  12. Methods for the Joint Meta-Analysis of Multiple Tests

    ERIC Educational Resources Information Center

    Trikalinos, Thomas A.; Hoaglin, David C.; Small, Kevin M.; Terrin, Norma; Schmid, Christopher H.

    2014-01-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests'…

  13. Regional analysis of annual maximum rainfall using TL-moments method

    NASA Astrophysics Data System (ADS)

    Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd

    2011-06-01

    Information related to distributions of rainfall amounts is of great importance for the design of water-related structures. One of the concerns of hydrologists and engineers is the choice of probability distribution for modeling regional data. In this study, regional frequency analysis using the L-moments method is first revisited; an alternative regional frequency analysis using the TL-moments method is then employed, and the results from both methods are compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and the Z-test were employed to determine the best-fit distribution. Comparison between the two approaches showed that the L-moments and TL-moments produced equivalent results. The GLO and GEV distributions were identified as the most suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the TL-moments method was more efficient for lower-quantile estimation than the L-moments method.
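    For reference, sample L-moments and trimmed TL-moments can be computed directly from order statistics (the direct estimators of Elamir and Seheult); the rainfall series below is a synthetic placeholder:

      import numpy as np
      from scipy.special import comb

      def l_moments(x):
          x = np.sort(x); n = len(x); i = np.arange(1, n + 1)
          b0 = x.mean()
          b1 = np.sum((i - 1) / (n - 1) * x) / n
          return b0, 2 * b1 - b0                       # lambda_1, lambda_2

      def tl_moments(x, t=1):
          # TL-moments trim the t smallest and t largest observations
          x = np.sort(x); n = len(x); i = np.arange(1, n + 1)
          l1 = np.sum(comb(i - 1, t) * comb(n - i, t) * x) / comb(n, 2 * t + 1)
          l2 = np.sum((comb(i - 1, t + 1) * comb(n - i, t)
                       - comb(i - 1, t) * comb(n - i, t + 1)) * x) / (2 * comb(n, 2 * t + 2))
          return l1, l2

      rain = np.random.default_rng(2).gumbel(80, 25, size=40)   # 40 years of maxima
      print(l_moments(rain), tl_moments(rain))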

  14. Singular boundary method for wave propagation analysis in periodic structures

    NASA Astrophysics Data System (ADS)

    Fu, Zhuojia; Chen, Wen; Wen, Pihua; Zhang, Chuanzeng

    2018-07-01

    A strong-form boundary collocation method, the singular boundary method (SBM), is developed in this paper for wave propagation analysis at low and moderate wavenumbers in periodic structures. The SBM has several advantages: it is mathematically simple, easy to program, and meshless, and it applies the concept of origin intensity factors to eliminate the singularity of the fundamental solutions, thereby avoiding the numerical evaluation of the singular integrals encountered in the boundary element method. Due to the periodic behavior of the structures, the SBM coefficient matrix can be represented as a block Toeplitz matrix. By employing three different fast Toeplitz-matrix solvers, the computational time and storage requirements of the proposed SBM analysis are significantly reduced. To demonstrate the effectiveness of the proposed SBM formulation for wave propagation analysis in periodic structures, several benchmark examples are presented and discussed. The SBM results are compared with analytical solutions, reference results, and results from the COMSOL software.
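    The saving from Toeplitz structure can be illustrated in the scalar case with SciPy's Levinson-type solver; the paper's block Toeplitz solvers generalize this idea, and the matrix below is an arbitrary example:

      import numpy as np
      from scipy.linalg import solve_toeplitz, toeplitz

      n = 2000
      c = 1.0 / (1.0 + np.arange(n))          # first column of the Toeplitz matrix
      r = 1.0 / (1.0 + 2.0 * np.arange(n))    # first row
      r[0] = c[0]
      b = np.ones(n)

      x_fast = solve_toeplitz((c, r), b)            # O(n^2) Levinson recursion
      x_dense = np.linalg.solve(toeplitz(c, r), b)  # O(n^3) dense solve
      print(np.allclose(x_fast, x_dense))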

  15. The reliability of an instrumented start block analysis system.

    PubMed

    Tor, Elaine; Pease, David L; Ball, Kevin A

    2015-02-01

    The swimming start is highly influential to overall competition performance. Therefore, it is paramount to develop reliable methods for accurate biomechanical analysis of start performance for training and research. The Wetplate Analysis System is a custom-made force plate system developed by the Australian Institute of Sport--Aquatic Testing, Training and Research Unit (AIS ATTRU). This sophisticated system combines force data and 2D digitization to measure a number of kinetic and kinematic parameter values in order to evaluate start performance. Fourteen elite swimmers performed two maximal effort dives (performance was defined as time from start signal to 15 m) over two separate testing sessions. Intraclass correlation coefficients (ICCs) were used to determine each parameter's reliability. The kinetic parameters all had ICCs greater than 0.9 except the time of peak vertical force (0.742); this may have been due to variations in movement initiation after the starting signal between trials. The kinematic and time parameters also had ICCs greater than 0.9, apart from the time of maximum depth (0.719); this parameter was lower because the swimmers varied their depth between trials. Based on the high ICC scores for all parameters, the Wetplate Analysis System is suitable for biomechanical analysis of swimming starts.

  16. Identification of the isomers using principal component analysis (PCA) method

    NASA Astrophysics Data System (ADS)

    Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur

    2016-03-01

    In this work, we have carried out a detailed statistical analysis of experimental mass spectra of xylene isomers. Principal Component Analysis (PCA) was used to identify isomers that cannot be distinguished using conventional statistical methods for the interpretation of their mass spectra. Experiments were carried out using a linear TOF-MS coupled to a femtosecond laser system as the energy source for the ionisation processes. The collected data were analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method for distinguishing isomers that cannot be identified through conventional mass analysis of the dissociative ionisation products of these molecules. PCA results as a function of the laser pulse energy and the background pressure in the spectrometer are presented.
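    A minimal sketch of the multivariate step, assuming each mass spectrum has been binned into a fixed-length vector; the spectra and labels below are placeholders:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      spectra = rng.random((60, 500))              # 60 spectra x 500 m/z bins
      labels = np.repeat(["o-xylene", "m-xylene", "p-xylene"], 20)

      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
      for iso in np.unique(labels):
          print(iso, scores[labels == iso].mean(axis=0))  # cluster centres in PC space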

  17. Who's in and why? A typology of stakeholder analysis methods for natural resource management.

    PubMed

    Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C

    2009-04-01

    Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.

  18. Application of the pulsed fast/thermal neutron method for soil elemental analysis

    USDA-ARS?s Scientific Manuscript database

    Soil science is a research field where physics concepts and experimental methods are widely used, particularly in agro-chemistry and soil elemental analysis. Different methods of analysis are currently available. The evolution of nuclear physics (methodology and instrumentation) combined with the ava...

  19. Method of analysis of asbestiform minerals by thermoluminescence

    DOEpatents

    Fisher, Gerald L.; Bradley, Edward W.

    1980-01-01

    A method for the qualitative and quantitative analysis of asbestiform minerals, including the steps of subjecting a sample to be analyzed to a first thermoluminescent analysis, annealing the sample, subjecting the sample to ionizing radiation, and subjecting the sample to a second thermoluminescent analysis. Glow curves are derived from the two thermoluminescent analyses and their shapes are then compared with established glow curves of known asbestiform minerals to identify the type of asbestiform mineral in the sample. Also, during at least one of the analyses, the thermoluminescent response of the sample is integrated over a linear heating period of the analysis in order to derive the total thermoluminescence per milligram of sample. This total is a measure of the quantity of asbestiform mineral in the sample and may also be used to identify the source of the sample.

  20. Design sensitivity analysis with Applicon IFAD using the adjoint variable method

    NASA Technical Reports Server (NTRS)

    Frederick, Marjorie C.; Choi, Kyung K.

    1984-01-01

    A numerical method is presented to implement structural design sensitivity analysis using the versatility and convenience of an existing finite element structural analysis program together with the theoretical foundations of structural design sensitivity analysis. Conventional design variables, such as thickness and cross-sectional areas, are considered. Structural performance functionals considered include compliance, displacement, and stress. It is shown that the calculations can be carried out outside existing finite element codes, using postprocessing data only; that is, design sensitivity analysis software does not have to be embedded in an existing finite element code. The finite element structural analysis program used in the implementation presented is IFAD. Feasibility of the method is shown through analysis of several problems, including built-up structures. Accurate design sensitivity results are obtained without the uncertainty of numerical accuracy associated with the selection of a finite difference perturbation.

  1. [Scenario analysis--a method for long-term planning].

    PubMed

    Stavem, K

    2000-01-10

    Scenarios are known from the film industry as detailed descriptions of films; this has given its name to scenario analysis, a method for long-term planning that uses descriptions of composite pictures of the future. This article is an introduction to the scenario method. Scenarios describe plausible, not necessarily probable, developments. They focus on problems and questions that decision makers must be aware of and prepare to deal with, and on the consequences of alternative decisions. Scenarios are used in corporate and governmental planning, and they can be a useful complement to traditional planning and extrapolation of past experience. The method is particularly useful in a rapidly changing world with shifting external conditions.

  2. Shear Lag in Box Beams Methods of Analysis and Experimental Investigations

    NASA Technical Reports Server (NTRS)

    Kuhn, Paul; Chiarito, Patrick T

    1942-01-01

    The bending stresses in the covers of box beams or wide-flange beams differ appreciably from the stresses predicted by the ordinary bending theory on account of shear deformation of the flanges. The problem of predicting these differences has become known as the shear-lag problem. The first part of this paper deals with methods of shear-lag analysis suitable for practical use. The second part of the paper describes strain-gage tests made by the NACA to verify the theory. Three tests published by other investigators are also analyzed by the proposed method. The third part of the paper gives numerical examples illustrating the methods of analysis. An appendix gives comparisons with other methods, particularly with the method of Ebner and Koller.

  3. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu; Jablonowski, Christopher; Lake, Larry

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
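    A toy sketch of the Monte Carlo side of such a comparison, with an invented one-parameter design and NPV model (none of the numbers are from the paper): evaluate each candidate design over sampled realizations of the uncertain parameter and keep the design with the best expected value:

      import numpy as np

      rng = np.random.default_rng(4)
      reserves = rng.lognormal(mean=3.0, sigma=0.4, size=5000)  # uncertain parameter

      def npv(capacity, reserve, price=3.0, capex_per_unit=8.0):
          produced = np.minimum(capacity * 10.0, reserve)  # 10-year plateau, toy model
          return price * produced - capex_per_unit * capacity

      designs = np.linspace(0.5, 5.0, 46)
      expected = [npv(c, reserves).mean() for c in designs]
      best = designs[int(np.argmax(expected))]
      print(f"optimal capacity ~ {best:.1f}")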

  4. Effect of the absolute statistic on gene-sampling gene-set analysis methods.

    PubMed

    Nam, Dougu

    2017-06-01

    Gene-set enrichment analysis and its modified versions have commonly been used for identifying altered functions or pathways in disease from microarray data. In particular, the simple gene-sampling gene-set analysis methods have been heavily used for datasets with only a few sample replicates. The biggest problem with this approach is a highly inflated false-positive rate. In this paper, the effect of the absolute gene statistic on gene-sampling gene-set analysis methods is systematically investigated. Thus far, the absolute gene statistic has merely been regarded as a supplementary method for capturing bidirectional changes in each gene set. Here, it is shown that incorporating the absolute gene statistic in gene-sampling gene-set analysis substantially reduces the false-positive rate and improves the overall discriminatory ability. Its effect was investigated through power, false-positive rate, and receiver operating characteristic curves for a number of simulated and real datasets. The performances of gene-set analysis methods in one-tailed (genome-wide association study) and two-tailed (gene expression data) tests are also compared and discussed.
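    A minimal sketch of a gene-sampling test with an absolute gene statistic, assuming a simple two-group design: the set score is the mean |t| over the set, and the null distribution comes from randomly sampled gene sets of the same size:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      expr = rng.normal(size=(10000, 6))            # 10000 genes, 3 vs 3 samples
      group = np.array([0, 0, 0, 1, 1, 1])
      t, _ = stats.ttest_ind(expr[:, group == 0], expr[:, group == 1], axis=1)

      gene_set = rng.choice(10000, size=50, replace=False)
      score = np.abs(t[gene_set]).mean()            # absolute gene statistic

      null = np.array([np.abs(t[rng.choice(10000, 50, replace=False)]).mean()
                       for _ in range(2000)])
      print("p =", (np.sum(null >= score) + 1) / (len(null) + 1))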

  5. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. The steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in those nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
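    One commonly used nonlinear measure of this kind is the correlation sum of a time-delay embedding, whose trend over successive data windows can flag a change of state; the parameters and data below are illustrative, not from the patent:

      import numpy as np
      from scipy.spatial.distance import pdist

      def correlation_sum(x, dim=3, delay=5, r=0.2):
          # fraction of embedded point pairs closer than radius r
          n = len(x) - (dim - 1) * delay
          emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
          return np.mean(pdist(emb) < r)

      x = np.sin(np.linspace(0, 100, 2000))
      x += 0.05 * np.random.default_rng(11).normal(size=2000)
      print(correlation_sum(x))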

  6. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  7. A single-loop optimization method for reliability analysis with second order uncertainty

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2015-08-01

    Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.
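    The inner reliability loop can be sketched as a standard first order reliability method step: the reliability index beta is the distance from the origin to the limit state g(u) = 0 in standard normal space. The limit state below is an invented example; the parameter theta stands in for an interval distribution parameter that the outer loop would vary:

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def g(u, theta=3.0):
          # illustrative limit state; theta is the interval parameter
          return theta - u[0] - 0.5 * u[1] ** 2

      res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
                     constraints={"type": "eq", "fun": g})
      beta = np.sqrt(res.fun)
      print("beta =", round(beta, 3), " Pf ~", norm.cdf(-beta))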

  8. The SNPforID Assay as a Supplementary Method in Kinship and Trace Analysis

    PubMed Central

    Schwark, Thorsten; Meyer, Patrick; Harder, Melanie; Modrow, Jan-Hendrick; von Wurmb-Schwark, Nicole

    2012-01-01

    Objective Short tandem repeat (STR) analysis using commercial multiplex PCR kits is the method of choice for kinship testing and trace analysis. However, under certain circumstances (deficiency testing, mutations, minute DNA amounts), STRs alone may not suffice. Methods We present a 50-plex single nucleotide polymorphism (SNP) assay based on the SNPs chosen by the SNPforID consortium as an additional method for paternity and for trace analysis. The new assay was applied to selected routine paternity and trace cases from our laboratory. Results and Conclusions Our investigation shows that the new SNP multiplex assay is a valuable method to supplement STR analysis, and is a powerful means to solve complicated genetic analyses. PMID:22851934

  9. Application of computational aerodynamics methods to the design and analysis of transport aircraft

    NASA Technical Reports Server (NTRS)

    Da Costa, A. L.

    1978-01-01

    The application and validation of several computational aerodynamic methods in the design and analysis of transport aircraft are established. An assessment is made concerning more recently developed methods that solve three-dimensional transonic flow and boundary layers on wings. Capabilities of subsonic aerodynamic methods are demonstrated by several design and analysis efforts. Among the examples cited are the B747 Space Shuttle Carrier Aircraft analysis, nacelle integration for transport aircraft, and winglet optimization. The accuracy and applicability of a new three-dimensional viscous transonic method is demonstrated by comparison of computed results to experimental data.

  10. Dynamic Analysis With Stress Mode Animation by the Integrated Force Method

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1997-01-01

    Dynamic animation of stresses and displacements, which complement each other, can be a useful tool in the analysis and design of structural components. At the present time only displacement-mode animation is available through the popular stiffness formulation. This paper attempts to complete this valuable visualization tool by augmenting the existing art with stress mode animation. The reformulated method of forces, which in the literature is known as the integrated force method (IFM), became the analyzer of choice for the development of stress mode animation because stresses are the primary unknowns of its dynamic analysis. Animation of stresses and displacements, which have been developed successfully through the IFM analyzers, is illustrated in several examples along with a brief introduction to IFM dynamic analysis. The usefulness of animation in design optimization is illustrated considering the spacer structure component of the International Space Station as an example. An overview of the integrated force method analysis code (IFM/ANALYZERS) is provided in the appendix.

  11. A method for communication analysis in prosthodontics.

    PubMed

    Sondell, K; Söderfeldt, B; Palmqvist, S

    1998-02-01

    Particularly in prosthodontics, where questions of esthetic preferences and possibilities abound, improved knowledge about dentist-patient communication during clinical encounters is important. Because previous studies on communication used different methods and patient materials, their results are difficult to evaluate; there is, therefore, a need for methodologic development. One method that makes it possible to quantitatively describe different interaction behaviors during clinical encounters is the Roter Method of Interaction Process Analysis (RIAS). Since the method was developed in the USA for use in the medical context, a translation of the method into Swedish and a modification of the categories for use in prosthodontics were necessary. The revised manual was used to code 10 audio recordings of dentist-patient encounters at a specialist clinic for prosthodontics. No major alterations of the RIAS manual were made during the translation and modification. The study shows that it is possible to distinguish patterns of communication in audio-recorded dentist-patient encounters. The method also made the identification of different interaction profiles possible; these profiles distinguished well among the audio-recorded encounters. The coding procedures were tested for intra-rater reliability, which was 97% for utterance classification and lambda = 0.76 for category definition. It was concluded that the revised RIAS method is applicable to communication studies in prosthodontics.

  12. Material nonlinear analysis via mixed-iterative finite element method

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1992-01-01

    The performance of elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors are tested using 4-node quadrilateral finite elements. The membrane result is excellent, which indicates the implementation of elastic-plastic mixed-iterative analysis is appropriate. On the other hand, further research to improve bending performance of the method seems to be warranted.

  13. Apparatus and method for fluid analysis

    DOEpatents

    Wilson, Bary W.; Peters, Timothy J.; Shepard, Chester L.; Reeves, James H.

    2004-11-02

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  14. Method of multi-dimensional moment analysis for the characterization of signal peaks

    DOEpatents

    Pfeifer, Kent B; Yelton, William G; Kerr, Dayle R; Bouchier, Francis A

    2012-10-23

    A method of multi-dimensional moment analysis for the characterization of signal peaks can be used to optimize the operation of an analytical system. With a two-dimensional Peclet analysis, the quality and signal fidelity of peaks in a two-dimensional experimental space can be analyzed and scored. This method is particularly useful in determining optimum operational parameters for an analytical system which requires the automated analysis of large numbers of analyte data peaks. For example, the method can be used to optimize analytical systems including an ion mobility spectrometer that uses a temperature stepped desorption technique for the detection of explosive mixtures.
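    A one-dimensional sketch of signal-peak moment analysis on a synthetic Gaussian peak; the centroid-squared-over-variance score is one common peak-quality figure and is used here only as an illustrative stand-in for the patented two-dimensional Peclet analysis:

      import numpy as np

      t = np.linspace(0, 10, 1000)
      peak = np.exp(-0.5 * ((t - 4.0) / 0.3) ** 2)      # synthetic analyte peak

      m0 = np.trapz(peak, t)                  # zeroth moment: area
      m1 = np.trapz(t * peak, t) / m0         # first moment: centroid
      var = np.trapz((t - m1) ** 2 * peak, t) / m0      # central second moment
      print("centroid:", m1, "width:", np.sqrt(var), "score:", m1**2 / var)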

  15. Comparative analysis of methods and sources of financing of the transport organizations activity

    NASA Astrophysics Data System (ADS)

    Gorshkov, Roman

    2017-10-01

    The article analyzes methods of financing transport organizations under conditions of limited investment resources. A comparative analysis of these methods is carried out, and a classification of investments and of the methods and sources of financial support for projects implemented to date is presented. In order to select the optimal sources of financing for the projects, various methods of financial management and financial support for the activities of a transport organization were analyzed from the perspective of their advantages and limitations. The result of the study is a set of recommendations on the selection of optimal sources and methods of financing for transport organizations.

  16. Social network analysis using k-Path centrality method

    NASA Astrophysics Data System (ADS)

    Taniarza, Natya; Adiwijaya; Maharani, Warih

    2018-03-01

    k-Path centrality is deemed one of the effective methods for centrality measurement, in which an influential node is estimated as a node that is frequently passed by information paths. In this paper, k-Path centrality with a random-algorithm approach is employed in order to: (1) determine the ranking of influential users in the social medium Twitter; and (2) ascertain the influence of the parameter α on the computation of k-Path centrality. The findings show that k-Path centrality with the random-algorithm approach can be used to rank users by their influence on the dissemination of information in Twitter. Furthermore, the parameter α influences both the runtime and the ranking results: the smaller the α value, the longer the runtime, but the more stable the ranking results.
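    A sketch of the random-approximation idea, assuming an undirected interaction graph: sample random simple paths of length at most k and count node traversals. The sample count below plays the role of the accuracy parameter α in the original algorithm (more samples: longer runtime, stabler ranking):

      import random
      import networkx as nx

      def k_path_centrality(G, k=5, samples=20000, seed=0):
          rng = random.Random(seed)
          counts = dict.fromkeys(G, 0)
          nodes = list(G)
          for _ in range(samples):
              v = rng.choice(nodes)
              visited = {v}
              for _ in range(k):              # extend a random simple path
                  nbrs = [u for u in G[v] if u not in visited]
                  if not nbrs:
                      break
                  v = rng.choice(nbrs)
                  visited.add(v)
                  counts[v] += 1              # node traversed by the path
          return counts

      G = nx.karate_club_graph()
      top = sorted(k_path_centrality(G).items(), key=lambda kv: -kv[1])[:5]
      print(top)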

  17. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    PubMed

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years) were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by manual method and by computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  18. Generalized fictitious methods for fluid-structure interactions: Analysis and simulations

    NASA Astrophysics Data System (ADS)

    Yu, Yue; Baek, Hyoungsu; Karniadakis, George Em

    2013-07-01

    We present a new fictitious pressure method for fluid-structure interaction (FSI) problems in incompressible flow by generalizing the fictitious mass and damping methods we published previously in [1]. The fictitious pressure method involves modification of the fluid solver whereas the fictitious mass and damping methods modify the structure solver. We analyze all fictitious methods for simplified problems and obtain explicit expressions for the optimal reduction factor (convergence rate index) at the FSI interface [2]. This analysis also demonstrates an apparent similarity of fictitious methods to the FSI approach based on Robin boundary conditions, which have been found to be very effective in FSI problems. We implement all methods, including the semi-implicit Robin based coupling method, in the context of spectral element discretization, which is more sensitive to temporal instabilities than low-order methods. However, the methods we present here are simple and general, and hence applicable to FSI based on any other spatial discretization. In numerical tests, we verify the selection of optimal values for the fictitious parameters for simplified problems and for vortex-induced vibrations (VIV) even at zero mass ratio ("for-ever-resonance"). We also develop an empirical a posteriori analysis for complex geometries and apply it to 3D patient-specific flexible brain arteries with aneurysms for very large deformations. We demonstrate that the fictitious pressure method enhances stability and convergence, and is comparable or better in most cases to the Robin approach or the other fictitious methods.

  19. An XML-based method for astronomy software designing

    NASA Astrophysics Data System (ADS)

    Liao, Mingxue; Aili, Yusupu; Zhang, Jin

    An XML-based method and standard for software design is introduced, analyzed, and successfully applied to renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for eliciting time information from the new digital clock FT206 in the antenna control program is introduced. With FT206, the need to compute with sophisticated formulas how many centuries have passed since a certain day is eliminated, and it is no longer necessary to set the correct UT time on the computer controlling the antenna, because the year, month, and day are all deduced from the Julian day kept in FT206 rather than from the computer time. With an XML-based method and standard for software design, various existing design methods are unified, communication and collaboration between developers are facilitated, and an Internet-based mode of software development thus becomes possible. The trend of development of the XML-based design method is predicted.
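    The calendar bookkeeping alluded to above is simple once a known epoch is anchored; a sketch (using the standard equivalence JD 2451545.0 = 2000-01-01 12:00 UT):

      from datetime import datetime, timedelta, timezone

      def jd_to_utc(jd):
          # anchor the conversion at the J2000.0 epoch
          epoch = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)  # JD 2451545.0
          return epoch + timedelta(days=jd - 2451545.0)

      print(jd_to_utc(2451545.0))   # 2000-01-01 12:00:00+00:00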

  20. Iodine-filter-based mobile Doppler lidar to make continuous and full-azimuth-scanned wind measurements: data acquisition and analysis system, data retrieval methods, and error analysis.

    PubMed

    Wang, Zhangjun; Liu, Zhishen; Liu, Liping; Wu, Songhua; Liu, Bingyi; Li, Zhigang; Chu, Xinzhao

    2010-12-20

    An incoherent Doppler wind lidar based on iodine edge filters has been developed at the Ocean University of China for remote measurements of atmospheric wind fields. The lidar is compact enough to fit in a minivan for mobile deployment. With its sophisticated and user-friendly data acquisition and analysis system (DAAS), this lidar has made a variety of line-of-sight (LOS) wind measurements in different operational modes. Through carefully developed data retrieval procedures, various wind products are provided by the lidar, including wind profile, LOS wind velocities in plan position indicator (PPI) and range height indicator (RHI) modes, and sea surface wind. Data are processed and displayed in real time, and continuous wind measurements have been demonstrated for as many as 16 days. Full-azimuth-scanned wind measurements in PPI mode and full-elevation-scanned wind measurements in RHI mode have been achieved with this lidar. The detection range of LOS wind velocity PPI and RHI reaches 8-10 km at night and 6-8 km during daytime with range resolution of 10 m and temporal resolution of 3 min. In this paper, we introduce the DAAS architecture and describe the data retrieval methods for various operation modes. We present the measurement procedures and results of LOS wind velocities in PPI and RHI scans along with wind profiles obtained by Doppler beam swing. The sea surface wind measured for the sailing competition during the 2008 Beijing Olympics is also presented. The precision and accuracy of wind measurements are estimated through analysis of the random errors associated with photon noise and the systematic errors introduced by the assumptions made in data retrieval. The three assumptions of horizontal homogeneity of atmosphere, close-to-zero vertical wind, and uniform sensitivity are made in order to experimentally determine the zero wind ratio and the measurement sensitivity, which are important factors in LOS wind retrieval. Deviations may occur under certain
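    The Doppler-beam-swing retrieval mentioned above can be sketched as follows, assuming the common four-tilted-beams-plus-zenith geometry and the horizontal-homogeneity assumption the paper discusses; variable names are illustrative:

      import numpy as np

      def dbs_wind(v_e, v_w, v_n, v_s, v_z, tilt_deg=30.0):
          """LOS velocities [m/s], positive away from the lidar; tilt from zenith."""
          s = np.sin(np.radians(tilt_deg))
          u = (v_e - v_w) / (2.0 * s)       # zonal wind
          v = (v_n - v_s) / (2.0 * s)       # meridional wind
          w = v_z                           # zenith beam gives w directly
          return u, v, w

      print(dbs_wind(5.2, -4.8, 2.1, -1.9, 0.3))   # -> (10.0, 4.0, 0.3)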

  1. Why Map Issues? On Controversy Analysis as a Digital Method

    PubMed Central

    2015-01-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital “move beyond impartiality.” I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter. PMID:26336325

  2. [Development of sample pretreatment techniques-rapid detection coupling methods for food security analysis].

    PubMed

    Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke

    2013-07-01

    This paper summarizes the recent developments of the rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assay, portable gas chromatograph, etc. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. The coupling technique has the potential to provide references to establish the selective, precise and quantitative rapid detection methods in food security analysis.

  3. Reliability and cost analysis methods

    NASA Technical Reports Server (NTRS)

    Suich, Ronald C.

    1991-01-01

    In the design phase of a system, how does a design engineer or manager choose between a subsystem with .990 reliability and a more costly subsystem with .995 reliability? When is the increased cost justified? High reliability is not necessarily an end in itself but may be desirable in order to reduce the expected cost due to subsystem failure. However, this may not be the wisest use of funds since the expected cost due to subsystem failure is not the only cost involved. The subsystem itself may be very costly. We should not consider either the cost of the subsystem or the expected cost due to subsystem failure separately but should minimize the total of the two costs, i.e., the total of the cost of the subsystem plus the expected cost due to subsystem failure. This final report discusses the Combined Analysis of Reliability, Redundancy, and Cost (CARRAC) methods which were developed under Grant Number NAG 3-1100 from the NASA Lewis Research Center. CARRAC methods and a CARRAC computer program employ five models which can be used to cover a wide range of problems. The models contain an option which can include repair of failed modules.
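    A worked version of this argument with invented numbers (not from the report): with a $50M failure consequence, the costlier 0.995 subsystem minimizes the total expected cost:

      # Choose the subsystem minimizing subsystem cost + expected failure cost.
      cost_of_failure = 50_000_000  # illustrative consequence of failure [$]
      options = {
          "0.990 reliability": (200_000, 0.990),
          "0.995 reliability": (350_000, 0.995),
      }
      for name, (unit_cost, reliability) in options.items():
          total = unit_cost + (1.0 - reliability) * cost_of_failure
          print(f"{name}: total expected cost ${total:,.0f}")
      # 0.995 wins here: 350k + 250k < 200k + 500k.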

  4. A sensitive continuum analysis method for gamma ray spectra

    NASA Technical Reports Server (NTRS)

    Thakur, Alakh N.; Arnold, James R.

    1993-01-01

    In this work we examine ways to improve the sensitivity of the analysis procedure for gamma ray spectra with respect to small differences in the continuum (Compton) spectra. The method developed is applied to analyze gamma ray spectra obtained from planetary mapping by the Mars Observer spacecraft launched in September 1992. Calculated Mars simulation spectra and actual thick target bombardment spectra have been taken as test cases. The principle of the method rests on the extraction of continuum information from Fourier transforms of the spectra. We study how a better estimate of the spectrum from larger regions of the Mars surface will improve the analysis for smaller regions with poorer statistics. Estimation of signal within the continuum is done in the frequency domain which enables efficient and sensitive discrimination of subtle differences between two spectra. The process is compared to other methods for the extraction of information from the continuum. Finally we explore briefly the possible uses of this technique in other applications of continuum spectra.
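    The frequency-domain idea can be sketched simply: the smooth continuum occupies the low-frequency part of the spectrum's Fourier transform, so a low-pass mask yields a continuum estimate. The spectrum and cutoff below are synthetic illustrations, not the paper's procedure:

      import numpy as np

      channels = np.arange(1024)
      continuum = 50.0 * np.exp(-channels / 400.0)           # smooth Compton-like shape
      lines = 20.0 * np.exp(-0.5 * ((channels - 300) / 2.0) ** 2)
      spectrum = np.random.default_rng(6).poisson(continuum + lines)

      F = np.fft.rfft(spectrum)
      F[20:] = 0.0                          # keep only the lowest 20 frequencies
      continuum_est = np.fft.irfft(F, n=len(spectrum))
      print(np.abs(continuum_est - continuum).mean())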

  5. Test versus analysis: A discussion of methods

    NASA Technical Reports Server (NTRS)

    Butler, T. G.

    1986-01-01

    Some techniques for comparing structural vibration data determined from test and analysis are discussed. Orthogonality is a general category of one group, correlation is a second, synthesis is a third, and matrix improvement is a fourth. Advantages and shortcomings of the methods are explored, with suggestions as to how they can complement one another. The purpose of comparing vibration data from test and analysis for a given structure is to find out whether each represents the dynamic properties of the structure in the same way: specifically, whether the mode shapes are alike; whether the frequencies of the modes are alike; whether modes appear in the same frequency sequence; and, if they are not alike, how to judge which to believe.
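    One widely used correlation measure for such comparisons is the modal assurance criterion (MAC), where values near 1 on the diagonal indicate matching test and analysis modes; this is a generic sketch, not necessarily the paper's formulation:

      import numpy as np

      def mac(phi_test, phi_fem):
          """Columns are mode shapes; returns the MAC matrix."""
          num = np.abs(phi_test.T @ phi_fem) ** 2
          den = np.outer(np.sum(phi_test**2, axis=0), np.sum(phi_fem**2, axis=0))
          return num / den

      rng = np.random.default_rng(7)
      phi = rng.normal(size=(30, 4))          # 4 "analysis" modes at 30 DOFs
      print(np.round(mac(phi, phi + 0.05 * rng.normal(size=phi.shape)), 2))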

  6. Novel Method of Production Decline Analysis

    NASA Astrophysics Data System (ADS)

    Xie, Shan; Lan, Yifei; He, Lei; Jiao, Yang; Wu, Yong

    2018-02-01

    Arps decline curves are the most commonly used tool in oil and gas fields because of their minimal data requirements and ease of application. Prediction of production decline based on Arps analysis, however, relies on a known decline type, and when the decline exponents of different decline types are very close, it is difficult to recognize the decline trend of the matched curves directly. To overcome this difficulty, a new dynamic decline prediction model based on multiple linear regression of the influencing factors is introduced, built on the simulation results of multi-factor response experiments. First, based on a study of the factors affecting production decline, interaction experimental schemes are designed; from the simulated results, the annual decline rate is predicted by the model. The new method is then applied to gas field A of the Ordos Basin as an example to illustrate its reliability. The results confirm that the new model can predict the decline tendency directly, without needing to recognize the decline type, and that it achieves high accuracy. The new method improves the evaluation of gas well production decline in low-permeability gas reservoirs and provides technical support for further understanding of tight gas field development.
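    For context, the Arps hyperbolic family referred to above is governed by the decline exponent b (b = 0 exponential, b = 1 harmonic); similar-looking fits for different b are what make the decline type hard to recognize directly:

      import numpy as np

      def arps_rate(t, qi, di, b):
          """qi: initial rate, di: initial decline rate, b: decline exponent."""
          if b == 0:
              return qi * np.exp(-di * t)               # exponential decline
          return qi / (1.0 + b * di * t) ** (1.0 / b)   # hyperbolic/harmonic

      t = np.linspace(0, 10, 11)                        # years
      for b in (0.0, 0.5, 1.0):
          print(b, np.round(arps_rate(t, qi=100.0, di=0.3, b=b), 1))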

  7. A comparison of cosegregation analysis methods for the clinical setting.

    PubMed

    Rañola, John Michael O; Liu, Quanhui; Rosenthal, Elisabeth A; Shirts, Brian H

    2018-04-01

    Quantitative cosegregation analysis can help evaluate the pathogenicity of genetic variants. However, genetics professionals without statistical training often use simple methods, reporting only qualitative findings. We evaluate the potential utility of quantitative cosegregation in the clinical setting by comparing three methods. One thousand pedigrees each were simulated for benign and pathogenic variants in BRCA1 and MLH1 using United States historical demographic data to produce pedigrees similar to those seen in the clinic. These pedigrees were analyzed using two robust methods, full likelihood Bayes factors (FLB) and cosegregation likelihood ratios (CSLR), and a simpler method, counting meioses. Both FLB and CSLR outperform counting meioses when dealing with pathogenic variants, though counting meioses is not far behind. For benign variants, FLB and CSLR greatly outperform as counting meioses is unable to generate evidence for benign variants. Comparing FLB and CSLR, we find that the two methods perform similarly, indicating that quantitative results from either of these methods could be combined in multifactorial calculations. Combining quantitative information will be important as isolated use of cosegregation in single families will yield classification for less than 1% of variants. To encourage wider use of robust cosegregation analysis, we present a website ( http://www.analyze.myvariant.org ) which implements the CSLR, FLB, and Counting Meioses methods for ATM, BRCA1, BRCA2, CHEK2, MEN1, MLH1, MSH2, MSH6, and PMS2. We also present an R package, CoSeg, which performs the CSLR analysis on any gene with user supplied parameters. Future variant classification guidelines should allow nuanced inclusion of cosegregation evidence against pathogenicity.

  8. A Hybrid On-line Verification Method of Relay Setting

    NASA Astrophysics Data System (ADS)

    Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin

    2017-05-01

    Along with the rapid development of the power industry, grid structures are becoming more sophisticated. The validity and rationality of protective relay settings are vital to the security of power systems, so it is essential to verify relay setting values online. Traditional verification methods mainly include comparison of protection ranges and comparison of calculated setting values; for on-line verification, verification speed is the key. Comparing protection ranges gives accurate results, but the computational burden is heavy and verification is slow. Comparing calculated setting values is much faster, but the results are conservative and inaccurate. Taking overcurrent protection as an example, this paper analyzes the advantages and disadvantages of the two traditional methods and proposes a hybrid on-line verification method that synthesizes their advantages and can meet the requirements of accurate on-line verification.

  9. Lattice Boltzmann methods for global linear instability analysis

    NASA Astrophysics Data System (ADS)

    Pérez, José Miguel; Aguilar, Alfonso; Theofilis, Vassilis

    2017-12-01

    Modal global linear instability analysis is performed using, for the first time ever, the lattice Boltzmann method (LBM) to analyze incompressible flows with two and three inhomogeneous spatial directions. Four linearization models have been implemented in order to recover the linearized Navier-Stokes equations in the incompressible limit. Two of those models employ the single relaxation time and have been proposed previously in the literature as linearizations of the collision operator of the lattice Boltzmann equation. Two additional models are derived herein for the first time by linearizing the local equilibrium probability distribution function. Instability analysis results are obtained in three benchmark problems, two in closed geometries and one in open flow, namely the square and cubic lid-driven cavity flows and flow in the wake of the circular cylinder. Comparisons with results delivered by classic spectral element methods verify the accuracy of the proposed new methodologies and point out potential limitations particular to the LBM approach. The known issue of numerical instabilities appearing when the SRT model is used in direct numerical simulations employing the LBM is shown to be reflected in a spurious global eigenmode when the SRT model is used in the instability analysis. Although this mode is absent in the multiple-relaxation-times model, other spurious instabilities can also arise and are documented herein. Areas of potential improvement to make the proposed methodology competitive with established approaches for global instability analysis are discussed.

  10. Single-Cell RNA-Sequencing: Assessment of Differential Expression Analysis Methods.

    PubMed

    Dal Molin, Alessandra; Baruzzo, Giacomo; Di Camillo, Barbara

    2017-01-01

    The sequencing of the transcriptomes of single cells, or single-cell RNA-sequencing, has now become the dominant technology for the identification of novel cell types and for the study of stochastic gene expression. In recent years, various tools for analyzing single-cell RNA-sequencing data have been proposed, many of them with the purpose of performing differential expression analysis. In this work, we compare four different tools for single-cell RNA-sequencing differential expression, together with two popular methods originally developed for the analysis of bulk RNA-sequencing data but largely applied to single-cell data. We discuss results obtained on two real and one synthetic dataset, along with considerations about the perspectives of single-cell differential expression analysis. In particular, we explore the methods' performance in four different scenarios, mimicking different unimodal or bimodal distributions of the data, as is characteristic of single-cell transcriptomics. We observed marked differences between the selected methods in terms of precision and recall, the number of detected differentially expressed genes, and the overall performance. Globally, the results obtained in our study suggest that it is difficult to identify a best-performing tool and that efforts are needed to improve the methodologies for single-cell RNA-sequencing data analysis and to achieve better accuracy of results.
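    A minimal baseline of the kind such tools improve upon is a per-gene nonparametric test between two groups of cells; the counts below are synthetic:

      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(8)
      counts = rng.negative_binomial(2, 0.2, size=(1000, 80))  # genes x cells
      cells_a, cells_b = counts[:, :40], counts[:, 40:]

      pvals = np.array([mannwhitneyu(a, b).pvalue
                        for a, b in zip(cells_a, cells_b)])
      print("genes with p < 0.05:", int((pvals < 0.05).sum()))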

  11. Utilization of nuclear methods for materials analysis and the determination of concentration gradients

    NASA Technical Reports Server (NTRS)

    Darras, R.

    1979-01-01

    The various types of nuclear chemical analysis methods are discussed. The possibilities of analysis through activation and through direct observation of nuclear reactions are described. Such methods make it possible to analyze trace elements and impurities with selectivity, accuracy, and a high degree of sensitivity; they are also used to measure major elements in materials that are available for analysis only in small quantities. These methods are well suited to surface analyses and to the determination of concentration gradients, provided the nature and energy of the incident particles are chosen judiciously. Typical examples involving steels, pure iron, and refractory metals are presented.

  12. Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance

    NASA Technical Reports Server (NTRS)

    Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.

    2016-01-01

    Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.

  13. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    EPA Science Inventory

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-Agency-sponsored study, authorized under the...

  14. Variational Methods in Sensitivity Analysis and Optimization for Aerodynamic Applications

    NASA Technical Reports Server (NTRS)

    Ibrahim, A. H.; Hou, G. J.-W.; Tiwari, S. N. (Principal Investigator)

    1996-01-01

    Variational methods (VM) sensitivity analysis, which is the continuous alternative to the discrete sensitivity analysis, is employed to derive the costate (adjoint) equations, the transversality conditions, and the functional sensitivity derivatives. In the derivation of the sensitivity equations, the variational methods use the generalized calculus of variations, in which the variable boundary is considered as the design function. The converged solution of the state equations together with the converged solution of the costate equations are integrated along the domain boundary to uniquely determine the functional sensitivity derivatives with respect to the design function. The determination of the sensitivity derivatives of the performance index or functional entails the coupled solutions of the state and costate equations. As the stable and converged numerical solution of the costate equations with their boundary conditions are a priori unknown, numerical stability analysis is performed on both the state and costate equations. Thereafter, based on the amplification factors obtained by solving the generalized eigenvalue equations, the stability behavior of the costate equations is discussed and compared with the state (Euler) equations. The stability analysis of the costate equations suggests that the converged and stable solution of the costate equation is possible only if the computational domain of the costate equations is transformed to take into account the reverse flow nature of the costate equations. The application of the variational methods to aerodynamic shape optimization problems is demonstrated for internal flow problems at supersonic Mach number range. The study shows, that while maintaining the accuracy of the functional sensitivity derivatives within the reasonable range for engineering prediction purposes, the variational methods show a substantial gain in computational efficiency, i.e., computer time and memory, when compared with the finite

  15. The power-proportion method for intracranial volume correction in volumetric imaging analysis.

    PubMed

    Liu, Dawei; Johnson, Hans J; Long, Jeffrey D; Magnotta, Vincent A; Paulsen, Jane S

    2014-01-01

    In volumetric brain imaging analysis, volumes of brain structures are typically assumed to be proportional or linearly related to intracranial volume (ICV). However, evidence abounds that many brain structures have power law relationships with ICV. To take this relationship into account in volumetric imaging analysis, we propose a power law based method-the power-proportion method-for ICV correction. The performance of the new method is demonstrated using data from the PREDICT-HD study.
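    A sketch of the idea with synthetic data generated under a power law: estimate the exponent b in volume proportional to ICV^b on log scales, then normalize by ICV raised to the power b rather than by ICV itself:

      import numpy as np

      rng = np.random.default_rng(9)
      icv = rng.normal(1500, 150, size=200)                   # cm^3
      vol = 0.004 * icv**0.8 * rng.lognormal(0, 0.05, 200)    # true exponent 0.8

      b, log_a = np.polyfit(np.log(icv), np.log(vol), 1)      # slope = exponent
      corrected = vol / icv**b                                # power-proportion correction
      print("estimated exponent:", round(b, 2))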

  16. Numerical realization of the variational method for generating self-trapped beams.

    PubMed

    Duque, Erick I; Lopez-Aguayo, Servando; Malomed, Boris A

    2018-03-19

    We introduce a numerical variational method based on the Rayleigh-Ritz optimization principle for predicting two-dimensional self-trapped beams in nonlinear media. This technique overcomes the limitation of the traditional variational approximation in performing analytical Lagrangian integration and differentiation. Approximate soliton solutions of a generalized nonlinear Schrödinger equation are obtained, demonstrating robustness of the beams of various types (fundamental, vortices, multipoles, azimuthons) in the course of their propagation. The algorithm offers possibilities to produce more sophisticated soliton profiles in general nonlinear models.

  17. Numerical realization of the variational method for generating self-trapped beams

    NASA Astrophysics Data System (ADS)

    Duque, Erick I.; Lopez-Aguayo, Servando; Malomed, Boris A.

    2018-03-01

    We introduce a numerical variational method based on the Rayleigh-Ritz optimization principle for predicting two-dimensional self-trapped beams in nonlinear media. This technique overcomes the limitation of the traditional variational approximation in performing analytical Lagrangian integration and differentiation. Approximate soliton solutions of a generalized nonlinear Schrödinger equation are obtained, demonstrating robustness of the beams of various types (fundamental, vortices, multipoles, azimuthons) in the course of their propagation. The algorithm offers possibilities to produce more sophisticated soliton profiles in general nonlinear models.
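    A numerical Rayleigh-Ritz sketch in one dimension (simpler than the paper's two-dimensional setting): for the focusing NLS i u_z + u_xx/2 + |u|^2 u = 0, minimize the Hamiltonian over the width of a sech trial beam at fixed power P; the optimum recovers the exact soliton width 2/P:

      import numpy as np
      from scipy.optimize import minimize_scalar

      x = np.linspace(-40, 40, 4001)
      P = 2.0                                     # fixed beam power

      def hamiltonian(w):
          A = np.sqrt(P / (2.0 * w))              # amplitude fixed by the power
          u = A / np.cosh(x / w)                  # sech trial profile
          ux = np.gradient(u, x)
          return np.trapz(0.5 * ux**2 - 0.5 * u**4, x)

      res = minimize_scalar(hamiltonian, bounds=(0.2, 10.0), method="bounded")
      print("variational width:", round(res.x, 3), " exact:", 2.0 / P)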

  18. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    ERIC Educational Resources Information Center

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  19. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit and transmits a signal that varies with the characteristics of the flow in the conduit. The signal is amplified and passed through a filter, responsive to the sensor signal, tuned to pass a narrow band of frequencies near the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal, and a number of flow indicator quantities are calculated from variations in the amplitude of that envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.
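
    The processing chain described (band-pass around the sensor resonance, envelope demodulation, indicator quantities, then a learned mapping to flow rate) translates naturally into a short sketch. Everything below, including the sample rate, resonant frequency, and choice of indicators, is an assumption for illustration; the patent does not prescribe these values.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      FS = 50_000.0   # sample rate, Hz (assumed)
      F0 = 5_000.0    # sensor resonant frequency, Hz (assumed)

      def amplitude_envelope(signal, fs=FS, f0=F0, half_band=200.0):
          """Band-pass the raw sensor signal near its resonance, then
          demodulate to an amplitude envelope."""
          lo, hi = (f0 - half_band) / (fs / 2), (f0 + half_band) / (fs / 2)
          b, a = butter(4, [lo, hi], btype="band")
          return np.abs(hilbert(filtfilt(b, a, signal)))

      def flow_indicators(env):
          """Illustrative 'flow indicator quantities' from envelope variation."""
          return np.array([env.mean(), env.std(), np.abs(np.diff(env)).mean()])

      # A regressor (e.g., a small neural network) would then map the indicator
      # vector to a flow rate, standing in for the patent's neural network stage.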

  20. Data-driven and hybrid coastal morphological prediction methods for mesoscale forecasting

    NASA Astrophysics Data System (ADS)

    Reeve, Dominic E.; Karunarathna, Harshinie; Pan, Shunqi; Horrillo-Caraballo, Jose M.; Różyński, Grzegorz; Ranasinghe, Roshanka

    2016-03-01

    It is now common for coastal planning to anticipate changes anywhere from 70 to 100 years into the future. The process models developed and used for scheme design or for large-scale oceanography are currently inadequate for this task, which has prompted the development of a plethora of alternative methods. Some, such as reduced-complexity or hybrid models, simplify the governing equations, retaining the processes considered to govern observed morphological behaviour. The computational cost of these models is low, and they have proven effective in exploring morphodynamic trends and improving our understanding of mesoscale behaviour. One drawback is that there is no generally agreed set of principles on which to base the simplifying assumptions, and predictions can vary considerably between models. An alternative approach is data-driven techniques that are based entirely on analysis and extrapolation of observations. Here, we discuss the application of some of the better-known and emerging methods in this category to argue that, with the increasing availability of observations from coastal monitoring programmes and the development of more sophisticated statistical analysis techniques, data-driven models provide a valuable addition to the armoury of methods available for mesoscale prediction. The continuation of established monitoring programmes is paramount, and those that provide contemporaneous records of the driving forces and the shoreline response are the most valuable in this regard. In the second part of the paper we discuss some recent research that combines hybrid techniques with data analysis methods in order to synthesise a more consistent means of predicting mesoscale coastal morphological evolution. While encouraging in certain applications, a universally applicable approach has yet to be found. The route to linking different model types is highlighted as a major challenge that requires further research to establish its viability. We argue that
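
    As one concrete example of the data-driven family discussed here, empirical orthogonal function (EOF) analysis decomposes a space-time record of shoreline position into spatial modes and temporal amplitudes that can be extrapolated. The sketch below is a generic illustration on synthetic data, not any of the authors' models.

      import numpy as np

      rng = np.random.default_rng(2)
      t = np.arange(120)                    # monthly surveys over 10 years
      x = np.linspace(0.0, 5000.0, 50)      # alongshore transect positions (m)
      # Synthetic record: slow trend + seasonal alongshore-varying signal + noise.
      shore = (0.02 * t[:, None]
               + np.sin(2 * np.pi * t[:, None] / 12) * np.cos(2 * np.pi * x / 2500)
               + 0.1 * rng.normal(size=(120, 50)))

      # EOF analysis via SVD: rows of Vt are spatial modes, U*s the amplitudes.
      mean = shore.mean(axis=0)
      U, s, Vt = np.linalg.svd(shore - mean, full_matrices=False)
      amp1 = U[:, 0] * s[0]                 # leading temporal amplitude

      # Naive data-driven forecast: extrapolate the leading amplitude linearly.
      coeff = np.polyfit(t, amp1, 1)
      forecast = mean + np.polyval(coeff, 132) * Vt[0]   # one year ahead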

  1. Comprehensive comparative analysis of 5'-end RNA-sequencing methods.

    PubMed

    Adiconis, Xian; Haber, Adam L; Simmons, Sean K; Levy Moonshine, Ami; Ji, Zhe; Busby, Michele A; Shi, Xi; Jacques, Justin; Lancaster, Madeline A; Pan, Jen Q; Regev, Aviv; Levin, Joshua Z

    2018-06-04

    Specialized RNA-seq methods are required to identify the 5' ends of transcripts, which are critical for studies of gene regulation, but these methods have not been systematically benchmarked. We directly compared six such methods, including the performance of five methods on a single human cellular RNA sample and a new spike-in RNA assay that helps circumvent challenges resulting from uncertainties in annotation and RNA processing. We found that the 'cap analysis of gene expression' (CAGE) method performed best for mRNA and that most of its unannotated peaks were supported by evidence from other genomic methods. We applied CAGE to eight brain-related samples and determined sample-specific transcription start site (TSS) usage, as well as a transcriptome-wide shift in TSS usage between fetal and adult brain.

  2. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    NASA Astrophysics Data System (ADS)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is an ever-increasing challenge as advances in computing technology have enabled scientists to build high-resolution climate models that produce petabytes of climate data. Interrogating and analyzing these large datasets in real time is a task that pushes the boundaries of computing hardware and software. Moreover, integrating climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allowing easy integration of climate datasets with geospatial datasets and providing sophisticated visualization and analysis capabilities. The objective of TrikeND-iGlobe is the continued development of an open-source 4D virtual globe application, built on NASA World Wind technology, that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of extreme climate events. The critical aim is real-time interactive interrogation: at the data-centric level, the primary aim is to enable the user to interact with the data in real time for the purpose of analysis, locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time-series analysis methods, and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc
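
    The data-centric interactions listed (sub-setting, aggregation, time-series analysis of geolocated scientific data) look roughly like the following when expressed against a NetCDF climate file; the library, file name, and variable name are illustrative assumptions and are not part of TrikeND-iGlobe itself.

      import xarray as xr

      # Open a hypothetical climate-model output file (NetCDF).
      ds = xr.open_dataset("tas_model_output.nc")

      # Sub-setting: a latitude/longitude box and a time window.
      box = ds["tas"].sel(lat=slice(30, 60), lon=slice(-10, 40),
                          time=slice("2000-01", "2009-12"))

      # Aggregation and a time series: area mean, then annual means.
      series = box.mean(dim=("lat", "lon"))
      annual = series.groupby("time.year").mean()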

  3. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
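
    In its simplest form, the probabilistic framework described here propagates assumed uncertainties in primitive variables through a response model and reads off cumulative distribution functions and failure probabilities. The Monte Carlo sketch below is a toy illustration; the response model and every number in it are invented and have nothing to do with SSME data.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 100_000

      # Assumed uncertain primitive variables (illustrative only):
      pressure = rng.normal(10.0, 0.8, N)        # MPa
      temperature = rng.normal(900.0, 25.0, N)   # K
      area = rng.normal(2.0e-4, 1.0e-5, N)       # m^2

      # Toy structural response standing in for the real engine model:
      stress = pressure * 1e6 / area * (1 + 1e-4 * (temperature - 900.0))

      # Empirical cumulative distribution function of the response variable:
      x = np.sort(stress)
      cdf = np.arange(1, N + 1) / N

      # Failure probability against an assumed strength limit:
      LIMIT = 6.0e10  # Pa
      print("P(failure) ~", np.mean(stress > LIMIT))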

  4. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    NASA Technical Reports Server (NTRS)

    Nielson, Gregory M.

    1997-01-01

    This is a request for a second renewal (3rd year of funding) of a research project on the topic of multiresolution and explicit methods for vector field analysis and visualization. In this report, we describe the progress made on this research project during the second year and give a statement of the planned research for the third year. There are two aspects to this research project. The first is concerned with the development of techniques for computing tangent curves for use in visualizing flow fields. The second is concerned with the development of multiresolution methods for curvilinear grids and their use as tools for visualization, analysis, and archiving of flow data. We report first on our work on the development of numerical methods for tangent curve computation.
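
    Tangent-curve computation amounts to integrating trajectories of the vector field; a standard fourth-order Runge-Kutta tracer is sketched below on an analytic field. This is generic streamline tracing, not the project's specific algorithms for curvilinear grids.

      import numpy as np

      def velocity(p):
          """Sample analytic vector field (a simple vortex); in practice this
          would be interpolated from gridded flow data."""
          x, y = p
          return np.array([-y, x])

      def tangent_curve(p0, h=0.01, steps=500):
          """Trace a tangent curve (streamline) with classical RK4."""
          pts = [np.asarray(p0, dtype=float)]
          for _ in range(steps):
              p = pts[-1]
              k1 = velocity(p)
              k2 = velocity(p + 0.5 * h * k1)
              k3 = velocity(p + 0.5 * h * k2)
              k4 = velocity(p + h * k3)
              pts.append(p + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
          return np.array(pts)

      curve = tangent_curve([1.0, 0.0])   # traces (approximately) a unit circle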

  5. MPQ-cytometry: a magnetism-based method for quantification of nanoparticle-cell interactions

    NASA Astrophysics Data System (ADS)

    Shipunova, V. O.; Nikitin, M. P.; Nikitin, P. I.; Deyev, S. M.

    2016-06-01

    Precise quantification of interactions between nanoparticles and living cells is among the imperative tasks for research in nanobiotechnology, nanotoxicology and biomedicine. To meet the challenge, a rapid method called MPQ-cytometry is developed, which measures the integral non-linear response produced by magnetically labeled nanoparticles in a cell sample with an original magnetic particle quantification (MPQ) technique. MPQ-cytometry provides a sensitivity limit of 0.33 ng of nanoparticles and is devoid of the background signal present in many label-based assays. Each measurement takes only a few seconds, and no complicated sample preparation or data processing is required. The capabilities of the method have been demonstrated by quantification of interactions of iron oxide nanoparticles with eukaryotic cells. The total amount of targeted nanoparticles that specifically recognized the HER2/neu oncomarker on the human cancer cell surface was successfully measured, the specificity of interaction permitting the detection of HER2/neu-positive cells in a cell mixture. Moreover, it has been shown that MPQ-cytometry analysis of HER2/neu-specific iron oxide nanoparticle interactions with six cell lines of different tissue origins quantitatively reflects the HER2/neu status of the cells. High correlation of MPQ-cytometry data with those obtained by three other methods commonly used in molecular and cell biology supports consideration of this method as a prospective alternative both for quantifying cell-bound nanoparticles and for estimating the expression level of cell surface antigens. The proposed method does not require expensive sophisticated equipment or highly skilled personnel, and it can be easily applied for rapid diagnostics, especially under field conditions.

  6. Design component method for sensitivity analysis of built-up structures

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Seong, Hwai G.

    1986-01-01

    A 'design component method' that provides a unified and systematic organization of design sensitivity analysis for built-up structures is developed and implemented. Both conventional design variables, such as thickness and cross-sectional area, and shape design variables of components of built-up structures are considered. It is shown that design of components of built-up structures can be characterized and system design sensitivity expressions obtained by simply adding contributions from each component. The method leads to a systematic organization of computations for design sensitivity analysis that is similar to the way in which computations are organized within a finite element code.

  7. Methods for the visualization and analysis of extracellular matrix protein structure and degradation.

    PubMed

    Leonard, Annemarie K; Loughran, Elizabeth A; Klymenko, Yuliya; Liu, Yueying; Kim, Oleg; Asem, Marwa; McAbee, Kevin; Ravosa, Matthew J; Stack, M Sharon

    2018-01-01

    This chapter highlights methods for visualization and analysis of extracellular matrix (ECM) proteins, with particular emphasis on collagen type I, the most abundant protein in mammals. Protocols described range from advanced imaging of complex in vivo matrices to simple biochemical analysis of individual ECM proteins. The first section of this chapter describes common methods to image ECM components and includes protocols for second harmonic generation, scanning electron microscopy, and several histological methods of ECM localization and degradation analysis, including immunohistochemistry, Trichrome staining, and in situ zymography. The second section of this chapter details both a common transwell invasion assay and a novel live imaging method to investigate cellular behavior with respect to collagen and other ECM proteins of interest. The final section consists of common electrophoresis-based biochemical methods that are used in analysis of ECM proteins. Use of the methods described herein will enable researchers to gain a greater understanding of the role of ECM structure and degradation in development and matrix-related diseases such as cancer and connective tissue disorders. © 2018 Elsevier Inc. All rights reserved.

  8. NECAP: NASA's Energy-Cost Analysis Program. Part 1: User's manual

    NASA Technical Reports Server (NTRS)

    Henninger, R. H. (Editor)

    1975-01-01

    NECAP is a sophisticated building design and energy analysis tool that embodies the latest ASHRAE state-of-the-art techniques for thermal load calculation and energy usage prediction. It is a set of six individual computer programs: a response factor program, data verification program, thermal load analysis program, variable temperature program, system and equipment simulation program, and owning and operating cost program. Each segment of NECAP is described, and instructions are set forth for preparing the required input data and for interpreting the resulting reports.

  9. Advancement of Analysis Method for Electromagnetic Screening Effect of Mountain Tunnel

    NASA Astrophysics Data System (ADS)

    Okutani, Tamio; Nakamura, Nobuyuki; Terada, Natsuki; Fukuda, Mitsuyoshi; Tate, Yutaka; Inada, Satoshi; Itoh, Hidenori; Wakao, Shinji

    In this paper we report on the advancement of an analysis method for the electromagnetic screening effect of mountain tunnels, based on a multiple-conductor circuit model. On A.C.-electrified railways, managing the influence of electromagnetic induction caused by the feeding circuits is a major issue. Tunnels are said to have a screening effect that reduces this electromagnetic induction, because a large amount of steel is used in their construction. Recently, however, less screening effect can be expected, because the New Austrian Tunneling Method (NATM), which uses less steel than conventional methods, has been adopted as the standard method for constructing mountain tunnels. We therefore measured and analyzed the actual screening effect of mountain tunnels constructed with NATM. In the course of this analysis we developed a method to analyze the screening effect more precisely, in which the tunnel structure can be adequately modeled as part of the multiple-conductor circuit.

  10. Analysis of rubber supply in Sri Lanka

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartley, M.J.; Nerlove, M.; Peters, R.K. Jr.

    1987-11-01

    An analysis of the supply response for perennial crops is undertaken for rubber in Sri Lanka, focusing on the uprooting-replanting decision and disaggregating the typical reduced-form supply response equation into several structural relationships. This approach is compared and contrasted with Dowling's analysis of supply response for rubber in Thailand, which is based upon a sophisticated reduced-form supply function developed by Wickens and Greenfield for Brazilian coffee. Because the uprooting-replanting decision is central to understanding rubber supply response in Sri Lanka and for other perennial crops where replanting activities dominate new planting, the standard approaches do not adequately capture supply response.

  11. Reliability analysis method of a solar array by using fault tree analysis and fuzzy reasoning Petri net

    NASA Astrophysics Data System (ADS)

    Wu, Jianing; Yan, Shaoze; Xie, Liyang

    2011-12-01

    To address the impact of solar array anomalies, it is important to analyze solar array reliability. This paper establishes fault tree analysis (FTA) and fuzzy reasoning Petri net (FRPN) models of a solar array mechanical system and analyzes their reliability to uncover the mechanisms of solar array faults. The indices final truth degree (FTD) and cosine matching function (CMF) are employed to evaluate the importance and influence of different faults, and an improved reliability analysis method is developed by sorting on FTD and CMF. An example is analyzed using the proposed method. The results show that a harsh thermal environment and impacts from particles in space are the most important causes of solar array faults. Other fault modes and the corresponding improvement methods are also discussed. The results reported in this paper could be useful for spacecraft designers, particularly in redesigning the solar array and scheduling its reliability growth plan.
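
    For readers unfamiliar with fault tree arithmetic, the basic gate algebra under an independence assumption is only a few lines of code; the tree and probabilities below are hypothetical stand-ins, not the authors' solar-array model or their FRPN extension.

      def and_gate(*ps):
          """All inputs must fail (independent events)."""
          out = 1.0
          for p in ps:
              out *= p
          return out

      def or_gate(*ps):
          """At least one input fails (independent events)."""
          out = 1.0
          for p in ps:
              out *= (1.0 - p)
          return 1.0 - out

      # Hypothetical basic events and tree structure (illustrative only):
      p_thermal = 0.02    # damage from harsh thermal environment
      p_impact = 0.01     # damage from particle impact
      p_latch = 0.005     # deployment latch jams

      # Top event: array fails if the latch jams, or both stressors co-occur.
      p_top = or_gate(p_latch, and_gate(p_thermal, p_impact))
      print(f"top-event probability = {p_top:.6f}")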

  12. A calibration-free electrode compensation method

    PubMed Central

    Rossant, Cyrille; Fontaine, Bertrand; Magnusson, Anna K.

    2012-01-01

    In a single-electrode current-clamp recording, the measured potential includes both the response of the membrane and that of the measuring electrode. The electrode response is traditionally removed using bridge balance, where the response of an ideal resistor representing the electrode is subtracted from the measurement. Because the electrode is not an ideal resistor, this procedure produces capacitive transients in response to fast or discontinuous currents. More sophisticated methods exist, but they all require a preliminary calibration phase, to estimate the properties of the electrode. If these properties change after calibration, the measurements are corrupted. We propose a compensation method that does not require preliminary calibration. Measurements are compensated offline by fitting a model of the neuron and electrode to the trace and subtracting the predicted electrode response. The error criterion is designed to avoid the distortion of compensated traces by spikes. The technique allows electrode properties to be tracked over time and can be extended to arbitrary models of electrode and neuron. We demonstrate the method using biophysical models and whole cell recordings in cortical and brain-stem neurons. PMID:22896724
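
    The heart of the method, fitting a joint neuron-plus-electrode model to the recorded trace and subtracting the predicted electrode component, can be sketched compactly. The first-order models, parameter values, and noise level below are assumptions for illustration; the published method also uses a spike-robust error criterion omitted here.

      import numpy as np
      from scipy.optimize import least_squares

      DT = 1e-4  # time step, s (assumed)

      def simulate(params, I):
          """Passive membrane + RC electrode driven by injected current I;
          returns (measured trace, electrode component)."""
          Rm, taum, Re, taue = params
          vm = np.zeros_like(I)
          ve = np.zeros_like(I)
          for t in range(1, len(I)):
              vm[t] = vm[t-1] + DT * (Rm * I[t-1] - vm[t-1]) / taum
              ve[t] = ve[t-1] + DT * (Re * I[t-1] - ve[t-1]) / taue
          return vm + ve, ve

      # Synthetic "recording" with assumed true parameters and noise:
      rng = np.random.default_rng(1)
      I = (rng.uniform(size=2000) > 0.5) * 0.1
      trace, _ = simulate((100.0, 0.02, 50.0, 0.001), I)
      trace = trace + rng.normal(0.0, 0.05, trace.shape)

      # Fit the joint model to the trace, then subtract the predicted
      # electrode response -- the compensation step itself.
      fit = least_squares(lambda p: simulate(p, I)[0] - trace,
                          x0=[80.0, 0.01, 30.0, 0.002])
      compensated = trace - simulate(fit.x, I)[1]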

  13. Investigation of safety analysis methods using computer vision techniques

    NASA Astrophysics Data System (ADS)

    Shirazi, Mohammad Shokrolah; Morris, Brendan Tran

    2017-09-01

    This work investigates safety analysis methods using computer vision techniques. A vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and postencroachment time (PET), two important safety measurements. The corresponding algorithms are presented, and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1 h of monitoring of a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values for two different intersections are estimated for one day from 8:00 a.m. to 6:00 p.m.
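
    Both safety measures have simple operational definitions, sketched below with hypothetical inputs: TTC assumes constant speeds along a shared path, and PET is the gap between one user leaving a conflict area and the other arriving at it.

      import math

      def time_to_collision(gap_m, v_follower, v_leader):
          """TTC in seconds; defined only while the follower is closing the gap."""
          closing = v_follower - v_leader
          return gap_m / closing if closing > 0 else math.inf

      def post_encroachment_time(t_first_leaves, t_second_arrives):
          """PET in seconds between occupancies of the conflict area."""
          return t_second_arrives - t_first_leaves

      print(time_to_collision(20.0, 15.0, 10.0))    # 4.0 s
      print(post_encroachment_time(12.3, 13.1))     # 0.8 s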

  14. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amenable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  15. A method for data base management and analysis for wind tunnel data

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1987-01-01

    To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data, and a new method was developed. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed, and examples of its use are shown.

  16. Method and Apparatus for Concentrating Vapors for Analysis

    DOEpatents

    Grate, Jay W.; Baldwin, David L.; Anheier, Jr., Norman C.

    2008-10-07

    An apparatus and method are disclosed for pre-concentrating gaseous vapors for analysis. The invention finds application in conjunction with, e.g., analytical instruments where low detection limits for gaseous vapors are desirable. Vapors sorbed and concentrated within the bed of the apparatus can be thermally desorbed achieving at least partial separation of vapor mixtures. The apparatus is suitable, e.g., for preconcentration and sample injection, and provides greater resolution of peaks for vapors within vapor mixtures, yielding detection levels that are 10-10,000 times better than for direct sampling and analysis systems. Features are particularly useful for continuous unattended monitoring applications.

  17. Methods for Analyzing the Benefits and Costs of Distributed Photovoltaic Generation to the U.S. Electric Utility System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denholm, P.; Margolis, R.; Palmintier, B.

    This report outlines the methods, data, and tools that could be used at different levels of sophistication and effort to estimate the benefits and costs of DGPV. In so doing, we identify the gaps in current benefit-cost-analysis methods, which we hope will inform the ongoing research agenda in this area. The focus of this report is primarily on benefits and costs from the utility or electricity generation system perspective. It is intended to provide useful background information to utility and regulatory decision makers and their staff, who are often being asked to use or evaluate estimates of the benefits and costs of DGPV in regulatory proceedings. Understanding the technical rigor of the range of methods, and how they might need to evolve as DGPV becomes a more significant contributor of energy to the electricity system, will help them be better consumers of this type of information. This report is also intended to provide information to utilities, policy makers, PV technology developers, and other stakeholders, which might help them maximize the benefits and minimize the costs of integrating DGPV into a changing electricity system.

  18. Passive Infrared Surveillance: New Methods of Analysis

    DTIC Science & Technology

    1979-09-24

    NRL Memorandum Report 4078 (EOTPO Report 55). Passive Infrared Surveillance: New Methods of Analysis. Richard A. Stinberg, Electro-Optical Technology Program Office, Management Information and Special Programs Organization, September 24, 1979. [Remainder of the scanned report documentation page is illegible.]

  19. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test, and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  20. SIMS: A Hybrid Method for Rapid Conformational Analysis

    PubMed Central

    Gipson, Bryant; Moll, Mark; Kavraki, Lydia E.

    2013-01-01

    Proteins are at the root of many biological functions, often performing complex tasks as the result of large changes in their structure. Describing the exact details of these conformational changes, however, remains a central challenge for computational biology due to the enormous computational requirements of the problem. This has engendered the development of a rich variety of useful methods designed to answer specific questions at different levels of spatial, temporal, and energetic resolution. These methods fall largely into two classes: physically accurate but computationally demanding methods, and fast, approximate methods. We introduce here a new hybrid modeling tool, the Structured Intuitive Move Selector (sims), designed to bridge the divide between these two classes while allowing the benefits of both to be seamlessly integrated into a single framework. This is achieved by applying a modern motion planning algorithm, borrowed from the field of robotics, in tandem with a well-established protein modeling library. sims can combine precise energy calculations with approximate or specialized conformational sampling routines to produce rapid yet accurate analysis of the large-scale conformational variability of protein systems. Several key advancements are shown, including the abstract use of generically defined moves (conformational sampling methods) and an expansive probabilistic conformational exploration. We present three example problems to which sims is applied and demonstrate a rapid solution for each. These include the automatic determination of “active” residues for the hinge-based system Cyanovirin-N, exploring conformational changes involving long-range coordinated motion between non-sequential residues in Ribose-Binding Protein, and the rapid discovery of a transient conformational state of Maltose-Binding Protein, previously only determined by Molecular Dynamics. For all cases we provide energetic validations using well-established energy