Relative contributions of three descriptive methods: implications for behavioral assessment.
Pence, Sacha T; Roscoe, Eileen M; Bourret, Jason C; Ahearn, William H
2009-01-01
This study compared the outcomes of three descriptive analysis methods-the ABC method, the conditional probability method, and the conditional and background probability method-to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior participated. Functional analyses indicated that participants' problem behavior was maintained by social positive reinforcement (n = 2), social negative reinforcement (n = 2), or automatic reinforcement (n = 2). Results showed that for all but 1 participant, descriptive analysis outcomes were similar across methods. In addition, for all but 1 participant, the descriptive analysis outcome differed substantially from the functional analysis outcome. This supports the general finding that descriptive analysis is a poor means of determining functional relations.
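As a rough illustration of the quantities named in this abstract, the sketch below computes a conditional probability (consequence given behavior) and a background probability from hypothetical event-coded observation bins; the bin coding, window length, and function name are assumptions for illustration, not the authors' procedure.

```python
# Sketch of a conditional vs. background probability comparison.
# Inputs are hypothetical 0/1 codes per observation bin, not the study's data format.
def conditional_and_background_probability(behavior, consequence, window=5):
    """behavior, consequence: equal-length lists of 0/1 per observation bin."""
    n = len(behavior)
    followed = sum(
        1 for i, b in enumerate(behavior)
        if b and any(consequence[i:i + window + 1])
    )
    behavior_total = sum(behavior)
    # P(consequence in the same bin or the next `window` bins | behavior occurred)
    conditional = followed / behavior_total if behavior_total else 0.0
    # Background probability: overall consequence rate across all bins
    background = sum(consequence) / n if n else 0.0
    return conditional, background

cond, back = conditional_and_background_probability(
    behavior=[0, 1, 0, 0, 1, 0, 0, 0], consequence=[0, 0, 1, 0, 0, 0, 1, 0]
)
print(f"conditional={cond:.2f}, background={back:.2f}")
```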
Position Description Analysis: A Method for Describing Academic Roles and Functions.
ERIC Educational Resources Information Center
Renner, K. Edward; Skibbens, Ronald J.
1990-01-01
The Position Description Analysis method for assessing the discrepancy between status quo and specializations needed by institutions to meet new demands and expectations is presented using Dalhousie University (Nova Scotia) as a case study. Dramatic realignment of fields of specialization and change strategies accommodating the aging professoriate…
ERIC Educational Resources Information Center
Folsom, Jessica Sidler; Osborne-Lampkin, La'Tara; Herrington, Carolyn D.
2014-01-01
This document is a companion guide to "A Descriptive Analysis of the Principal Workforce in Florida Schools" (Folsom, Osborne-Lampkin, & Herrington, in press). It describes the methods used to extract information from the Florida Department of Education database in order to conduct a descriptive analysis of the demographic…
Relative Contributions of Three Descriptive Methods: Implications for Behavioral Assessment
ERIC Educational Resources Information Center
Pence, Sacha T.; Roscoe, Eileen M.; Bourret, Jason C.; Ahearn, William H.
2009-01-01
This study compared the outcomes of three descriptive analysis methods--the ABC method, the conditional probability method, and the conditional and background probability method--to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior…
The use of cognitive task analysis to improve instructional descriptions of procedures.
Clark, Richard E; Pugh, Carla M; Yates, Kenneth A; Inaba, Kenji; Green, Donald J; Sullivan, Maura E
2012-03-01
Surgical training relies heavily on the ability of expert surgeons to provide complete and accurate descriptions of a complex procedure. However, research from a variety of domains suggests that experts often omit critical information about the judgments, analysis, and decisions they make when solving a difficult problem or performing a complex task. In this study, we compared three methods for capturing surgeons' descriptions of how to perform the procedure for inserting a femoral artery shunt (unaided free-recall, unaided free-recall with simulation, and cognitive task analysis methods) to determine which method produced more accurate and complete results. Cognitive task analysis was approximately 70% more complete and accurate than either unaided free-recall or free-recall during a simulation of the procedure. Ten expert trauma surgeons at a major urban trauma center were interviewed separately and asked to describe how to perform an emergency shunt procedure. Four surgeons provided an unaided free-recall description of the shunt procedure, five surgeons provided an unaided free-recall description of the procedure using visual aids and surgical instruments (simulation), and one (chosen randomly) was interviewed using cognitive task analysis (CTA) methods. An 11th vascular surgeon approved the final CTA protocol. The CTA interview with only one expert surgeon resulted in significantly greater accuracy and completeness of the descriptions compared with the unaided free-recall interviews with multiple expert surgeons. Surgeons in the unaided group omitted nearly 70% of necessary decision steps. In the free-recall group, heavy use of simulation improved surgeons' completeness when describing the steps of the procedure. CTA significantly increases the completeness and accuracy of surgeons' instructional descriptions of surgical procedures. In addition, simulation during unaided free-recall interviews may improve the completeness of interview data. Copyright © 2012 Elsevier Inc. All rights reserved.
Systematic text condensation: a strategy for qualitative analysis.
Malterud, Kirsti
2012-12-01
To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies. Giorgi's psychological phenomenological analysis is the point of departure and inspiration for systematic text condensation. The basic elements of Giorgi's method and the elaboration of these in systematic text condensation are presented, followed by a detailed description of procedures for analysis according to systematic text condensation. Finally, similarities and differences compared with other frequently applied methods for qualitative analysis are identified, as the foundation of a discussion of strengths and limitations of systematic text condensation. Systematic text condensation is a descriptive and explorative method for thematic cross-case analysis of different types of qualitative data, such as interview studies, observational studies, and analysis of written texts. The method represents a pragmatic approach, although inspired by phenomenological ideas, and various theoretical frameworks can be applied. The procedure consists of the following steps: 1) total impression - from chaos to themes; 2) identifying and sorting meaning units - from themes to codes; 3) condensation - from code to meaning; 4) synthesizing - from condensation to descriptions and concepts. Similarities and differences comparing systematic text condensation with other frequently applied qualitative methods regarding thematic analysis, theoretical methodological framework, analysis procedures, and taxonomy are discussed. Systematic text condensation is a strategy for analysis developed from traditions shared by most of the methods for analysis of qualitative data. The method offers the novice researcher a process of intersubjectivity, reflexivity, and feasibility, while maintaining a responsible level of methodological rigour.
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components
NASA Technical Reports Server (NTRS)
1991-01-01
Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.
SEP thrust subsystem performance sensitivity analysis
NASA Technical Reports Server (NTRS)
Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.
1973-01-01
This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.
NASA Technical Reports Server (NTRS)
Middleton, W. D.; Lundry, J. L.
1975-01-01
An integrated system of computer programs has been developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. This part presents a general description of the system and describes the theoretical methods used.
Alvin H. Yu; Garry Chick
2010-01-01
This study compared the utility of two different post-hoc tests after detecting significant differences within factors on multiple dependent variables using multivariate analysis of variance (MANOVA). We compared the univariate F test (the Scheffé method) to descriptive discriminant analysis (DDA) using an educational-tour survey of university study-...
Description of textures by a structural analysis.
Tomita, F; Shirai, Y; Tsuji, S
1982-02-01
A structural analysis system for describing natural textures is introduced. The analyzer automatically extracts the texture elements in an input image, measures their properties, classifies them into some distinctive classes (one "ground" class and some "figure" classes), and computes the distributions of the gray level, the shape, and the placement of the texture elements in each class. These descriptions are used for classification of texture images. An analysis-by-synthesis method for evaluating texture analyzers is also presented. We propose a synthesizer which generates a texture image based on the descriptions. By comparing the reconstructed image with the original one, we can see what information is preserved and what is lost in the descriptions.
Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
[Scenario analysis--a method for long-term planning].
Stavem, K
2000-01-10
Scenarios are known from the film industry as detailed descriptions of films; this has given its name to scenario analysis, a method for long-term planning that uses descriptions of composite pictures of the future. This article is an introduction to the scenario method. Scenarios describe plausible, not necessarily probable, developments. They focus on problems and questions that decision makers must be aware of and prepare to deal with, and on the consequences of alternative decisions. Scenarios are used in corporate and governmental planning, and they can be a useful complement to traditional planning and extrapolation of past experience. The method is particularly useful in a rapidly changing world with shifting external conditions.
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Testing deformation hypotheses by constraints on a time series of geodetic observations
NASA Astrophysics Data System (ADS)
Velsink, Hiddo
2018-01-01
In geodetic deformation analysis observations are used to identify form and size changes of a geodetic network, representing objects on the earth's surface. The network points are monitored, often continuously, because of suspected deformations. A deformation may affect many points during many epochs. The problem is that the best description of the deformation is, in general, unknown. To find it, different hypothesised deformation models have to be tested systematically for agreement with the observations. The tests have to be capable of stating with a certain probability the size of detectable deformations, and to be datum invariant. A statistical criterion is needed to find the best deformation model. Existing methods do not fulfil these requirements. Here we propose a method that formulates the different hypotheses as sets of constraints on the parameters of a least-squares adjustment model. The constraints can relate to subsets of epochs and to subsets of points, thus combining time series analysis and congruence model analysis. The constraints are formulated as nonstochastic observations in an adjustment model of observation equations. This gives an easy way to test the constraints and to get a quality description. The proposed method aims at providing a good discriminating method to find the best description of a deformation. The method is expected to improve the quality of geodetic deformation analysis. We demonstrate the method with an elaborate example.
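A minimal numerical sketch of the core idea, a deformation hypothesis imposed as near-nonstochastic pseudo-observations in a least-squares adjustment, is given below; the two-epoch toy network and the weighting scheme are invented for illustration and are not the paper's formulation.

```python
# Testing a "no deformation" hypothesis by adding a constraint as a heavily
# weighted pseudo-observation in a least-squares adjustment (toy example).
import numpy as np

A = np.array([[1.0, 0.0],   # epoch-1 observation of coordinate x1
              [0.0, 1.0],   # epoch-2 observation of coordinate x2
              [1.0, 0.0],
              [0.0, 1.0]])
y = np.array([10.002, 10.011, 10.001, 10.013])   # observed coordinates [m]
C = np.array([[1.0, -1.0]])                       # hypothesis H0: x1 - x2 = 0
c = np.array([0.0])

w_obs, w_con = 1.0, 1e8                           # constraint gets a very large weight
A_aug = np.vstack([A, C])
y_aug = np.concatenate([y, c])
W = np.diag([w_obs] * len(y) + [w_con] * len(c))

x_hat, *_ = np.linalg.lstsq(np.sqrt(W) @ A_aug, np.sqrt(W) @ y_aug, rcond=None)
residuals = y - A @ x_hat
print("estimate under H0 (no deformation):", x_hat)
print("sum of squared residuals:", residuals @ residuals)
```

The increase in the sum of squared residuals relative to the unconstrained adjustment is what a formal test statistic would be built on.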
Schiek, Richard [Albuquerque, NM]
2006-06-20
A method of generating two-dimensional masks from a three-dimensional model comprises providing a three-dimensional model representing a micro-electro-mechanical structure for manufacture and a description of process mask requirements, reducing the three-dimensional model to a topological description of unique cross sections, and selecting candidate masks from the unique cross sections and the cross section topology. The method further can comprise reconciling the candidate masks based on the process mask requirements description to produce two-dimensional process masks.
Advanced Productivity Analysis Methods for Air Traffic Control Operations.
DOT National Transportation Integrated Search
1976-12-01
This report gives a description of the Air Traffic Control (ATC) productivity analysis methods developed, implemented, and refined by the Stanford Research Institute (SRI) under the sponsorship of FAA and TSC. Two models are included in the productiv...
Atkins, Salla; Launiala, Annika; Kagaha, Alexander; Smith, Helen
2012-04-30
Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages in the policy making process, including reviews of qualitative research. The inclusion of mixed methods studies in systematic reviews is increasing, but these studies pose particular challenges to methods of review. This article examines the quality of the reporting of mixed methods and qualitative-only studies. We used two completed systematic reviews to generate a sample of qualitative studies and mixed method studies in order to make an assessment of how the quality of reporting and rigor of qualitative-only studies compares with that of mixed-methods studies. Overall, the reporting of qualitative studies in our sample was consistently better when compared with the reporting of mixed methods studies. We found that mixed methods studies are less likely to provide a description of the research conduct or qualitative data analysis procedures and less likely to be judged credible or provide rich data and thick description compared with standalone qualitative studies. Our time-related analysis shows that for both types of study, papers published since 2003 are more likely to report on the study context, describe analysis procedures, and be judged credible and provide rich data. However, the reporting of other aspects of research conduct (i.e. descriptions of the research question, the sampling strategy, and data collection methods) in mixed methods studies does not appear to have improved over time. Mixed methods research makes an important contribution to health research in general, and could make a more substantial contribution to systematic reviews. Through our careful analysis of the quality of reporting of mixed methods and qualitative-only research, we have identified areas that deserve more attention in the conduct and reporting of mixed methods research.
The Analysis of Organizational Diagnosis on Based Six Box Model in Universities
ERIC Educational Resources Information Center
Hamid, Rahimi; Siadat, Sayyed Ali; Reza, Hoveida; Arash, Shahin; Ali, Nasrabadi Hasan; Azizollah, Arbabisarjou
2011-01-01
Purpose: The analysis of organizational diagnosis based on the six box model at universities. Research method: The research method was a descriptive survey. The statistical population consisted of 1544 faculty members of universities, from which 218 persons were chosen as the sample through a stratified random sampling method. The research instruments were organizational…
Prevalence of Evaluation Method Courses in Education Leader Doctoral Preparation
ERIC Educational Resources Information Center
Shepperson, Tara L.
2013-01-01
This exploratory study investigated the prevalence of single evaluation methods courses in doctoral education leadership programs. Analysis of websites of 132 leading U.S. university programs found 62 evaluation methods courses in 54 programs. Content analysis of 49 course catalog descriptions resulted in five categories: survey, planning and…
Objective analysis of observational data from the FGGE observing systems
NASA Technical Reports Server (NTRS)
Baker, W.; Edelmann, D.; Iredell, M.; Han, D.; Jakkempudi, S.
1981-01-01
An objective analysis procedure for updating the GLAS second and fourth order general atmospheric circulation models using observational data from the first GARP global experiment is described. The objective analysis procedure is based on a successive corrections method and the model is updated in a data assimilation cycle. Preparation of the observational data for analysis and the objective analysis scheme are described. The organization of the program and description of the required data sets are presented. The program logic and detailed descriptions of each subroutine are given.
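The successive corrections method mentioned above can be illustrated with a one-dimensional toy analysis; the Cressman-style weights, grid, observations, and shrinking radii below are assumptions chosen for clarity, not the GLAS implementation.

```python
# Toy successive-corrections update on a 1-D grid: each pass pulls the field
# toward nearby observations using distance-dependent weights and a shrinking
# influence radius.
import numpy as np

grid = np.linspace(0.0, 10.0, 21)
analysis = np.zeros_like(grid)               # first-guess (background) field
obs_x = np.array([2.0, 5.5, 8.0])            # observation locations
obs_val = np.array([1.0, -0.5, 2.0])         # observed values

for radius in (4.0, 2.0, 1.0):               # successive passes, shrinking radius
    # innovation = observation minus current analysis interpolated to obs point
    innov = obs_val - np.interp(obs_x, grid, analysis)
    d2 = (grid[:, None] - obs_x[None, :]) ** 2
    w = np.clip((radius**2 - d2) / (radius**2 + d2), 0.0, None)   # Cressman-type weight
    num = (w * innov).sum(axis=1)
    den = w.sum(axis=1)
    analysis += np.divide(num, den, out=np.zeros_like(den), where=den > 0)

print(np.round(analysis, 2))
```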
System safety engineering analysis handbook
NASA Technical Reports Server (NTRS)
Ijams, T. E.
1972-01-01
The basic requirements and guidelines for the preparation of System Safety Engineering Analysis are presented. The philosophy of System Safety and the various analytic methods available to the engineering profession are discussed. A text-book description of each of the methods is included.
LES, DNS and RANS for the analysis of high-speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Givi, Peyman; Taulbee, Dale B.; Adumitroaie, Virgil; Sabini, George J.; Shieh, Geoffrey S.
1994-01-01
The purpose of this research is to continue our efforts in advancing the state of knowledge in large eddy simulation (LES), direct numerical simulation (DNS), and Reynolds averaged Navier Stokes (RANS) methods for the computational analysis of high-speed reacting turbulent flows. In the second phase of this work, covering the period 1 Sep. 1993 - 1 Sep. 1994, we have focused our efforts on two research problems: (1) developments of 'algebraic' moment closures for statistical descriptions of nonpremixed reacting systems, and (2) assessments of the Dirichlet frequency in presumed scalar probability density function (PDF) methods in stochastic description of turbulent reacting flows. This report provides a complete description of our efforts during this past year as supported by the NASA Langley Research Center under Grant NAG1-1122.
Prison Radicalization: The New Extremist Training Grounds?
2007-09-01
…distributing and collecting survey data, and the data analysis. The analytical methodology includes descriptive and inferential statistical methods, in… statistical analysis of the responses to identify significant correlations and relationships. B. SURVEY DATA COLLECTION: To effectively access a… Q18, Q19, Q20, and Q21. Due to the exploratory nature of this small survey, data analyses were confined mostly to descriptive statistics and…
Modal analysis applied to circular, rectangular, and coaxial waveguides
NASA Technical Reports Server (NTRS)
Hoppe, D. J.
1988-01-01
Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.
Artificial intelligence techniques used in respiratory sound analysis--a systematic review.
Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian
2014-02-01
Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.
Data-analysis issues in a phenomenographic investigation of information literacy in nursing.
Forster, Marc
2013-11-01
To explore two contrasting methods of phenomenographic data analysis. Phenomenography is a still-uncommon but increasingly used methodology based on qualitative interviews that allows experiences to be categorised and put into a descriptive structure for use in developing educational interventions. There are two different approaches in the literature to analysing data: the Marton and Åkerlind methods. A doctoral research project investigating the role of information literacy in evidence-based practice in nursing. The phenomenographic study involves open-ended interviews in which participants are asked to describe their 'life-world' where the phenomenon is experienced, covering the contexts in which it is experienced and how it is experienced. The researcher attempts to develop statements from the interview transcripts that describe representative ways of experiencing the phenomenon in the form of 'categories of description'. A category of description represents a qualitatively different way of experiencing a phenomenon. This article discusses the reasons for adopting phenomenography, phenomenography's epistemological assumptions, and the strengths and weaknesses of the two different data-analysis methods. Phenomenography's strength is its ability to develop logical structures that give a picture of the experience of a phenomenon while being able to read into the structure as much of the complexity of that experience as is consciously and practically possible. One method, described as the 'Åkerlind' method, emerged as the appropriate method for phenomenographic studies in nursing.
ERIC Educational Resources Information Center
Brennan, Tim
1980-01-01
A review of prior classification systems of runaways is followed by a descriptive taxonomy of runaways developed using cluster-analytic methods. The empirical types illustrate patterns of weakness in bonds between runaways and families, schools, or peer relationships. (Author)
Syntactic methods of shape feature description and its application in analysis of medical images
NASA Astrophysics Data System (ADS)
Ogiela, Marek R.; Tadeusiewicz, Ryszard
2000-02-01
The paper presents specialist algorithms for morphologic analysis of the shapes of selected organs of the abdominal cavity, proposed in order to diagnose disease symptoms occurring in the main pancreatic ducts and upper segments of the ureters. Analysis of the correct morphology of these structures has been conducted with the use of syntactic methods of pattern recognition. Its main objective is computer-aided support for early diagnosis of neoplastic lesions and pancreatitis, based on images taken in the course of examination with the endoscopic retrograde cholangiopancreatography (ERCP) method, and diagnosis of morphological lesions in the ureter based on kidney radiogram analysis. In the analysis of ERCP images, the main objective is to recognize morphological lesions in pancreatic ducts characteristic of carcinoma and chronic pancreatitis. In the case of kidney radiogram analysis, the aim is to diagnose local irregularity of the ureter lumen. Diagnosis of the above-mentioned lesions has been conducted with the use of syntactic methods of pattern recognition, in particular languages of shape feature description and context-free attributed grammars. These methods allow the aforementioned lesions to be recognized and described very efficiently on images obtained as a result of initial image processing into diagrams of widths of the examined structures.
A survey of functional behavior assessment methods used by behavior analysts in practice.
Oliver, Anthony C; Pratt, Leigh A; Normand, Matthew P
2015-12-01
To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially descriptive assessments. Moreover, the data suggest that the majority of students are being formally taught about the various FBA methods and that educators are emphasizing the range of FBA methods in their teaching. However, less than half of the respondents reported using functional analyses in practice, although many considered descriptive assessments and functional analyses to be the most useful FBA methods. Most respondents reported using informant and descriptive assessments more frequently than functional analyses, and a majority of respondents indicated that they "never" or "almost never" used functional analyses to identify the function of behavior. © Society for the Experimental Analysis of Behavior.
Developing a model for the adequate description of electronic communication in hospitals.
Saboor, Samrend; Ammenwerth, Elske
2011-01-01
Adequate information and communication technology (ICT) systems can help to improve communication in hospitals. Changes to the ICT infrastructure of hospitals must be planned carefully. In order to support comprehensive planning, we presented a classification of 81 common errors of electronic communication at the MIE 2008 congress. Our objective now was to develop a data model that defines specific requirements for an adequate description of electronic communication processes. We first applied the method of explicating qualitative content analysis to the error categorization in order to determine the essential process details. After this, we applied the method of subsuming qualitative content analysis to the results of the first step. The result is a data model for the adequate description of electronic communication, comprising 61 entities and 91 relationships. The data model comprises and organizes all details that are necessary for the detection of the respective errors. It can either be used to extend the capabilities of existing modeling methods or serve as a basis for the development of a new approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Andrew; Haass, Michael; Rintoul, Mark Daniel
GazeAppraise advances the state of the art of gaze pattern analysis using methods that simultaneously analyze spatial and temporal characteristics of gaze patterns. GazeAppraise enables novel research in visual perception and cognition; for example, using shape features as distinguishing elements to assess individual differences in visual search strategy. Given a set of point-to-point gaze sequences, hereafter referred to as scanpaths, the method constructs multiple descriptive features for each scanpath. Once the scanpath features have been calculated, they are used to form a multidimensional vector representing each scanpath, and cluster analysis is performed on the set of vectors from all scanpaths. An additional benefit of this method is the identification of causal or correlated characteristics of the stimuli, subjects, and visual task through statistical analysis of descriptive metadata distributions within and across clusters.
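A hypothetical sketch of the general approach described here, per-scanpath descriptive features followed by cluster analysis, might look like the following; the specific features and the use of scikit-learn's KMeans are assumptions, not the GazeAppraise implementation.

```python
# Build a small descriptive feature vector per scanpath, then cluster the vectors.
import numpy as np
from sklearn.cluster import KMeans

def scanpath_features(points):
    """points: (n, 2) array of fixation coordinates in temporal order."""
    pts = np.asarray(points, dtype=float)
    steps = np.diff(pts, axis=0)
    lengths = np.linalg.norm(steps, axis=1)
    angles = np.arctan2(steps[:, 1], steps[:, 0])
    return np.array([
        lengths.sum(),                          # total path length
        lengths.mean(),                         # mean saccade amplitude
        np.abs(np.diff(angles)).mean(),         # mean turning angle (shape proxy)
        np.ptp(pts[:, 0]) * np.ptp(pts[:, 1]),  # bounding-box area
    ])

# Synthetic scanpaths stand in for recorded gaze sequences
scanpaths = [np.cumsum(np.random.default_rng(s).normal(size=(20, 2)), axis=0)
             for s in range(30)]
X = np.vstack([scanpath_features(p) for p in scanpaths])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)
```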
ERIC Educational Resources Information Center
Karadag, Engin
2010-01-01
To assess research methods and analysis of statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertation from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…
Choi, Ji-Hye; Gwak, Mi-Jin; Chung, Seo-Jin; Kim, Kwang-Ok; O'Mahony, Michael; Ishii, Rie; Bae, Ye-Won
2015-06-01
The present study cross-culturally investigated the drivers of liking for traditional and ethnic chicken marinades using descriptive analysis and consumer taste tests incorporating the check-all-that-apply (CATA) method. Seventy-three Koreans and 86 US consumers participated. The tested sauces comprised three tomato-based sauces, a teriyaki-based sauce and a Korean spicy seasoning-based sauce. Chicken breasts were marinated with each of the five barbecue sauces, grilled and served for evaluation. Descriptive analysis and consumer taste tests were conducted. Consumers rated the acceptance on a hedonic scale and checked the reasons for (dis)liking by the CATA method for each sauce. A general linear model, multiple factor analysis and chi-square analysis were conducted using the data. The results showed that the preference orders of the samples between Koreans and US consumers were strikingly similar to each other. However, the reasons for (dis)liking the samples differed cross-culturally. The drivers of liking of two sauces sharing relatively similar sensory profiles but differing significantly in hedonic ratings were effectively delineated by reasons of (dis)liking CATA results. Reasons for (dis)liking CATA proved to be a powerful supporting method to understand the internal drivers of liking which can be overlooked by generic descriptive analysis. © 2014 Society of Chemical Industry.
Research Methods in School Psychology: An Overview.
ERIC Educational Resources Information Center
Keith, Timothy Z.
1988-01-01
This article introduces a mini-series on research methods in school psychology. A conceptual overview of research methods is presented, emphasizing the degree to which each method allows the inference that treatment affects outcome. Experimental and nonexperimental, psychometric, descriptive, and meta-analysis research methods are outlined. (SLD)
Frequency-Domain Identification Of Aeroelastic Modes
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.; Tischler, Mark B.
1991-01-01
Report describes flight measurements and frequency-domain analyses of aeroelastic vibrational modes of wings of XV-15 tilt-rotor aircraft. Begins with description of flight-test methods. Followed by brief discussion of methods of analysis, which include Fourier-transform computations using chirp z-transforms, use of coherence and other spectral functions, and methods and computer programs to obtain frequencies and damping coefficients from measurements. Includes brief description of results of flight tests and comparisons among various experimental and theoretical results. Ends with section on conclusions and recommended improvements in techniques.
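The kind of frequency-domain identification summarized above can be sketched with standard spectral estimates; the synthetic sweep, the simulated 6 Hz resonance, and the SciPy calls below are illustrative assumptions, not the report's actual processing chain.

```python
# Estimate coherence and a frequency-response function between an excitation
# sweep and a simulated lightly damped structural response.
import numpy as np
from scipy import signal

fs = 200.0
t = np.arange(0, 60, 1 / fs)
excitation = signal.chirp(t, f0=0.5, f1=20.0, t1=t[-1])          # frequency sweep
b, a = signal.iirpeak(w0=6.0, Q=20.0, fs=fs)                     # fake 6 Hz "mode"
response = signal.lfilter(b, a, excitation) \
           + 0.05 * np.random.default_rng(0).normal(size=t.size)

f, Cxy = signal.coherence(excitation, response, fs=fs, nperseg=1024)
_, Pxy = signal.csd(excitation, response, fs=fs, nperseg=1024)
_, Pxx = signal.welch(excitation, fs=fs, nperseg=1024)
H = Pxy / Pxx                                                    # frequency-response estimate
k = np.argmax(np.abs(H))
print(f"estimated modal frequency ~ {f[k]:.1f} Hz, coherence there = {Cxy[k]:.2f}")
```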
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, W. James; Albertson, R Craig; Jacob, Rick E.
Here we present a re-description of Abudefduf luridus and reassign it to the genus Similiparma. We supplement traditional diagnoses and descriptions of this species with quantitative anatomical data collected from a family-wide geometric morphometric analysis of head morphology (44 species representing all 30 damselfish genera) and data from cranial micro-CT scans of fishes in the genus Similiparma. The use of geometric morphometric analyses (and other methods of shape analysis) permits detailed comparisons between the morphology of specific taxa and the anatomical diversity that has arisen in an entire lineage. This provides a particularly useful supplement to traditional description methods and we recommend the use of such techniques by systematists. Similiparma and its close relatives constitute a branch of the damselfish phylogenetic tree that predominantly inhabits rocky reefs in the Atlantic and Eastern Pacific, as opposed to the more commonly studied damselfishes that constitute a large portion of the ichthyofauna on all coral-reef communities.
Qualitative Descriptive Methods in Health Science Research.
Colorafi, Karen Jiggins; Evans, Bronwynne
2016-07-01
The purpose of this methodology paper is to describe an approach to qualitative design known as qualitative descriptive that is well suited to junior health sciences researchers because it can be used with a variety of theoretical approaches, sampling techniques, and data collection strategies. It is often difficult for junior qualitative researchers to pull together the tools and resources they need to embark on a high-quality qualitative research study and to manage the volumes of data they collect during qualitative studies. This paper seeks to pull together much needed resources and provide an overview of methods. A step-by-step guide to planning a qualitative descriptive study and analyzing the data is provided, utilizing exemplars from the authors' research. This paper presents steps to conducting a qualitative descriptive study under the following headings: describing the qualitative descriptive approach, designing a qualitative descriptive study, steps to data analysis, and ensuring rigor of findings. The qualitative descriptive approach results in a summary in everyday, factual language that facilitates understanding of a selected phenomenon across disciplines of health science researchers. © The Author(s) 2016.
ERIC Educational Resources Information Center
Serebryakova, Tat'yana A.; Morozova, Lyudmila B.; Kochneva, Elena M.; Zharova, Darya V.; Kostyleva, Elena A.; Kolarkova, Oxana G.
2016-01-01
Background/Objectives: The objective of the paper is analysis and description of findings of an empiric study on the issue of social and psychological adaptation of first year students to studying in a higher educational institution. Methods/Statistical analysis: Using the methods of theoretical analysis the paper's authors plan and carry out an…
Evaluation of Yogurt Microstructure Using Confocal Laser Scanning Microscopy and Image Analysis.
Skytte, Jacob L; Ghita, Ovidiu; Whelan, Paul F; Andersen, Ulf; Møller, Flemming; Dahl, Anders B; Larsen, Rasmus
2015-06-01
The microstructure of protein networks in yogurts defines important physical properties of the yogurt and hereby partly its quality. Imaging this protein network using confocal scanning laser microscopy (CSLM) has shown good results, and CSLM has become a standard measuring technique for fermented dairy products. When studying such networks, hundreds of images can be obtained, and here image analysis methods are essential for using the images in statistical analysis. Previously, methods including gray level co-occurrence matrix analysis and fractal analysis have been used with success. However, a range of other image texture characterization methods exists. These methods describe an image by a frequency distribution of predefined image features (denoted textons). Our contribution is an investigation of the choice of image analysis methods by performing a comparative study of 7 major approaches to image texture description. Here, CSLM images from a yogurt fermentation study are investigated, where production factors including fat content, protein content, heat treatment, and incubation temperature are varied. The descriptors are evaluated through nearest neighbor classification, variance analysis, and cluster analysis. Our investigation suggests that the texton-based descriptors provide a fuller description of the images compared to gray-level co-occurrence matrix descriptors and fractal analysis, while still being as applicable and in some cases as easy to tune. © 2015 Institute of Food Technologists®
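One of the classical descriptors compared in this study, gray-level co-occurrence matrix (GLCM) features, can be computed as in the sketch below; the random placeholder image stands in for a CSLM micrograph, and the function names follow scikit-image 0.19+ (earlier releases spell them greycomatrix/greycoprops).

```python
# GLCM texture features for a grayscale image using scikit-image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(1)
image = (rng.random((128, 128)) * 255).astype(np.uint8)   # placeholder micrograph

glcm = graycomatrix(image, distances=[1, 2, 4], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```

Feature vectors like this one (or texton histograms) would then feed the nearest-neighbor classification and cluster analyses described in the abstract.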
Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell
2018-04-01
Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as they change throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel to the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use TDS methodology to assess the dominant sensations for each sample as they change over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology however was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the limitations of the methodology to ignore all attributes other than those dominant, provide a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicate that TDS provides additional information to QDA™ regarding the lingering component of eating. The QDA™ results however provide more precise detail regarding singular attributes. Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, this methodology should not be employed as a replacement to traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.
A New View of Earthquake Ground Motion Data: The Hilbert Spectral Analysis
NASA Technical Reports Server (NTRS)
Huang, Norden; Busalacchi, Antonio J. (Technical Monitor)
2000-01-01
A brief description of the newly developed Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA) method will be given. The decomposition is adaptive and can be applied to both nonlinear and nonstationary data. An example of the method applied to a sample earthquake record is given. The results indicate that low-frequency components totally missed by the Fourier analysis are clearly identified by the new method. Comparisons with wavelet and windowed Fourier analysis show the new method offers much better temporal and frequency resolutions.
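The Hilbert step of the method can be illustrated as follows; the synthetic chirp input is an assumption, and a full analysis would first decompose the record into intrinsic mode functions via EMD, which this sketch omits.

```python
# Instantaneous amplitude and frequency from the analytic signal.
import numpy as np
from scipy.signal import hilbert, chirp

fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = chirp(t, f0=0.5, f1=5.0, t1=t[-1]) * np.exp(-0.1 * t)   # sweeping, decaying signal

analytic = hilbert(x)
amplitude = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs               # Hz, one sample shorter than t

print("frequency sweeps from ~%.1f Hz to ~%.1f Hz" % (inst_freq[10], inst_freq[-10]))
```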
LITERATURE REVIEW OF REMEDIATION METHODS FOR PCBS IN BUILDINGS
This literature review contains a description and analysis of existing methods for management of PCBs in construction materials. Information on the strengths and limitations, efficacy, cost, and byproducts of each remediation method is presented, where available. The report is ba...
Chivukula, V; Mousel, J; Lu, J; Vigmostad, S
2014-12-01
The current research presents a novel method in which blood particulates - biconcave red blood cells (RBCs) and spherical cells are modeled using isogeometric analysis, specifically Non-Uniform Rational B-Splines (NURBS) in 3-D. The use of NURBS ensures that even with a coarse representation, the geometry of the blood particulates maintains an accurate description when subjected to large deformations. The fundamental advantage of this method is the coupling of the geometrical description and the stress analysis of the cell membrane into a single, unified framework. Details on the modeling approach, implementation of boundary conditions and the membrane mechanics analysis using isogeometric modeling are presented, along with validation cases for spherical and biconcave cells. Using NURBS - based isogeometric analysis, the behavior of individual cells in fluid flow is presented and analyzed in different flow regimes using as few as 176 elements for a spherical cell and 220 elements for a biconcave RBC. This work provides a framework for modeling a large number of 3-D deformable biological cells, each with its own geometric description and membrane properties. To the best knowledge of the authors, this is the first application of the NURBS - based isogeometric analysis to model and simulate blood particulates in flow in 3D. Copyright © 2014 John Wiley & Sons, Ltd.
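The rational B-spline (NURBS) description underlying this approach can be sketched by evaluating a small curve from control points and weights; the quadratic 2-D geometry below is arbitrary and unrelated to any actual cell membrane model.

```python
# Evaluate a NURBS curve as a ratio of two B-splines: one built from weighted
# control points and one from the weights alone.
import numpy as np
from scipy.interpolate import BSpline

degree = 2
ctrl = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, -1.0], [3.0, 0.5]])   # control points
w = np.array([1.0, 2.0, 0.5, 1.0])                                   # weights
knots = np.array([0, 0, 0, 0.5, 1, 1, 1], dtype=float)               # clamped knot vector

u = np.linspace(0.0, 1.0, 50)
num = BSpline(knots, w[:, None] * ctrl, degree)(u)    # sum_i N_i(u) w_i P_i
den = BSpline(knots, w, degree)(u)                    # sum_i N_i(u) w_i
curve = num / den[:, None]
print(curve[:3])
```

Because the same basis functions describe both the geometry and the unknown fields, refining the curve does not change its shape, which is the coupling of geometry and analysis the abstract refers to.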
Use of modeling to identify vulnerabilities to human error in laparoscopy.
Funk, Kenneth H; Bauer, James D; Doolen, Toni L; Telasha, David; Nicolalde, R Javier; Reeber, Miriam; Yodpijit, Nantakrit; Long, Myra
2010-01-01
This article describes an exercise to investigate the utility of modeling and human factors analysis in understanding surgical processes and their vulnerabilities to medical error. A formal method to identify error vulnerabilities was developed and applied to a test case of Veress needle insertion during closed laparoscopy. A team of 2 surgeons, a medical assistant, and 3 engineers used hierarchical task analysis and Integrated DEFinition language 0 (IDEF0) modeling to create rich models of the processes used in initial port creation. Using terminology from a standardized human performance database, detailed task descriptions were written for 4 tasks executed in the process of inserting the Veress needle. Key terms from the descriptions were used to extract from the database generic errors that could occur. Task descriptions with potential errors were translated back into surgical terminology. Referring to the process models and task descriptions, the team used a modified failure modes and effects analysis (FMEA) to consider each potential error for its probability of occurrence, its consequences if it should occur and be undetected, and its probability of detection. The resulting likely and consequential errors were prioritized for intervention. A literature-based validation study confirmed the significance of the top error vulnerabilities identified using the method. Ongoing work includes design and evaluation of procedures to correct the identified vulnerabilities and improvements to the modeling and vulnerability identification methods. Copyright 2010 AAGL. Published by Elsevier Inc. All rights reserved.
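A toy version of the modified FMEA scoring described above might combine the three estimates per error as follows; the error names, scores, and scoring formula are invented for illustration rather than taken from the study.

```python
# Rank candidate error modes by likelihood, severity, and (lack of) detectability.
from dataclasses import dataclass

@dataclass
class ErrorMode:
    name: str
    p_occurrence: float   # 0..1, estimated likelihood
    severity: float       # 0..1, consequence if it occurs undetected
    p_detection: float    # 0..1, chance it is caught before harm

    def priority(self) -> float:
        # Higher = more likely, more harmful, and less likely to be detected
        return self.p_occurrence * self.severity * (1.0 - self.p_detection)

errors = [
    ErrorMode("Needle angle too steep", 0.20, 0.9, 0.5),
    ErrorMode("Insufflation line not checked", 0.10, 0.6, 0.8),
    ErrorMode("Abdominal wall not elevated", 0.30, 0.7, 0.4),
]
for e in sorted(errors, key=ErrorMode.priority, reverse=True):
    print(f"{e.name}: priority {e.priority():.3f}")
```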
ERIC Educational Resources Information Center
Friman, Margareta; Nyberg, Claes; Norlander, Torsten
2004-01-01
A descriptive qualitative analysis of in-depth interviews involving seven provincial Soccer Association referees was carried out in order to find out how referees experience threats and aggression directed to soccer referees. The Empirical Phenomenological Psychological method (EPP-method) was used. The analysis resulted in thirty categories which…
ERIC Educational Resources Information Center
Yung-Kuan, Chan; Hsieh, Ming-Yuan; Lee, Chin-Feng; Huang, Chih-Cheng; Ho, Li-Chih
2017-01-01
Under the hyper-dynamic education situation, this research, in order to comprehensively explore the interplays between Teacher Competence Demands (TCD) and Learning Organization Requests (LOR), cross-employs the data refinement methods of Descriptive Statistics (DS), Analysis of Variance (ANOVA), and Principal Components Analysis (PCA)…
A laboratory study of nonlinear changes in the directionality of extreme seas
NASA Astrophysics Data System (ADS)
Latheef, M.; Swan, C.; Spinneken, J.
2017-03-01
This paper concerns the description of surface water waves, specifically nonlinear changes in the directionality. Supporting calculations are provided to establish the best method of directional wave generation, the preferred method of directional analysis and the inputs on which such a method should be based. These calculations show that a random directional method, in which the phasing, amplitude and direction of propagation of individual wave components are chosen randomly, has benefits in achieving the required ergodicity. In terms of analysis procedures, the extended maximum entropy principle, with inputs based upon vector quantities, produces the best description of directionality. With laboratory data describing the water surface elevation and the two horizontal velocity components at a single point, several steep sea states are considered. The results confirm that, as the steepness of a sea state increases, the overall directionality of the sea state reduces. More importantly, it is also shown that the largest waves become less spread or more unidirectional than the sea state as a whole. This provides an important link to earlier descriptions of deterministic wave groups produced by frequency focusing, helps to explain recent field observations and has important practical implications for the design of marine structures and vessels.
Some New Mathematical Methods for Variational Objective Analysis
NASA Technical Reports Server (NTRS)
Wahba, G.; Johnson, D. R.
1984-01-01
New and/or improved variational methods for simultaneously combining forecast, heterogeneous observational data, a priori climatology, and physics to obtain improved estimates of the initial state of the atmosphere for the purpose of numerical weather prediction are developed. Cross validated spline methods are applied to atmospheric data for the purpose of improved description and analysis of atmospheric phenomena such as the tropopause and frontal boundary surfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jochen, J.E.; Hopkins, C.W.
1993-12-31
Contents: Naturally fractured reservoir description; Geologic considerations; Shale-specific log model; Stress profiles; Berea research; Benefits analysis; Summary of technologies; Novel well test methods; Natural fracture identification; Reverse drilling; Production data analysis; Fracture treatment quality control; Novel core analysis methods; and Shale well cleanouts.
Method of Analysis for Determining and Correcting Mirror Deformation due to Gravity
Clark, James H., III; Penado, F. Ernesto
2014-01-01
The as-built beam compressor assembly consists of primary and secondary Zerodur® mirrors held…
ERIC Educational Resources Information Center
Brattin, Barbara C.
Content analysis was performed on the top six core journals for 1990 in library and information science to determine the extent of research in the field. Articles (n=186) were examined for descriptive or inferential statistics and separately for the presence of mathematical models. Results show a marked (14%) increase in research for 1990,…
ERIC Educational Resources Information Center
Hudson, Barclay M.
Descriptions of models for policy analysis in future studies are presented. Separate sections of the paper focus on the need for appropriate technologies of social science in future studies, a description of "compact policy assessment" (CPA), and a comparison of two CPA methods, Compass and Delphi. Compact policy assessment refers to any low-cost,…
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11(22%) were randomised controlled trials, 18(35%) were cohort studies, 11(22%) were case control studies and 11(22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49(96%) articles. Data dispersion was calculated by standard deviation in 30(59%). Standard error of the mean was quoted in 19(37%). The statistical software product was named in 33(65%). Of the 49 articles that used inferential statistics, the tests were named in 47(96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-square test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43(88%) and the exact significance levels were reported in 28(57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
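For reference, the handful of tests that dominate the survey above can all be run from scipy.stats; the two-group data and contingency table below are simulated, not drawn from any burns study.

```python
# The most frequently reported tests in the survey, applied to simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(14.0, 3.0, size=30)    # e.g. length of stay, treatment A
group_b = rng.normal(16.0, 3.0, size=30)    # treatment B

t, p_t = stats.ttest_ind(group_a, group_b)                        # Student's t-test
f, p_f = stats.f_oneway(group_a, group_b)                         # one-way ANOVA
u, p_u = stats.mannwhitneyu(group_a, group_b)                     # Mann-Whitney U
chi2, p_c, _, _ = stats.chi2_contingency([[12, 18], [20, 10]])    # chi-square test
odds, p_fisher = stats.fisher_exact([[12, 18], [20, 10]])         # Fisher's exact test

for name, p in [("t-test", p_t), ("ANOVA", p_f), ("Mann-Whitney", p_u),
                ("chi-square", p_c), ("Fisher exact", p_fisher)]:
    print(f"{name}: p = {p:.3f}")
```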
Safety diagnosis: are we doing a good job?
Park, Peter Y; Sahaji, Rajib
2013-03-01
Collision diagnosis is the second step in the six-step road safety management process described in the AASHTO Highway Safety Manual (HSM). Diagnosis is designed to identify a dominant or abnormally high proportion of particular collision configurations (e.g., rear end, right angle, etc.) at a target location. The primary diagnosis method suggested in the HSM is descriptive data analysis. This type of analysis relies on, for example, pie charts, histograms, and/or collision diagrams. Using location specific collision data (e.g., collision frequency per collision configuration for a target location), safety engineers identify (the most) frequent collision configurations. Safety countermeasures are then likely to concentrate on preventing the selected collision configurations. Although its real-world application in engineering practice is limited, an additional collision diagnosis method, known as the beta-binomial (BB) test, is also presented as the secondary diagnosis tool in the HSM. The BB test compares the proportion of a particular collision configuration observed at one location with the proportion of the same collision configuration found at other reference locations which are similar to the target location in terms of selected traffic and roadway characteristics (e.g., traffic volume, traffic control, and number of lanes). This study compared the outcomes obtained from descriptive data analysis and the BB test, and investigates two questions: (1) Do descriptive data analysis and the BB tests produce the same results (i.e., do they select the same collision configurations at the same locations)? and (2) If the tests produce different results, which result should be adopted in engineering practice? This study's analysis was based on a sample of the most recent five years (2005-2009) of collision and roadway configuration data for 143 signalized intersections in the City of Saskatoon, Saskatchewan. The study results show that the BB test's role in diagnosing safety concerns in road safety engineering projects such as safety review projects for existing roadways may be just as important as the descriptive data analysis method. Copyright © 2012 Elsevier Ltd. All rights reserved.
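A simplified sketch of the beta-binomial diagnosis idea is shown below: a beta distribution is fitted to reference-site proportions by the method of moments, and the target site's count is assessed against the implied beta-binomial; the counts are invented and this is not the HSM's exact procedure.

```python
# How unusual is the target site's rear-end collision count, given the spread
# of rear-end proportions at similar reference intersections?
import numpy as np
from scipy import stats

ref_props = np.array([0.20, 0.25, 0.18, 0.30, 0.22, 0.27, 0.15, 0.24])  # reference-site shares
m, v = ref_props.mean(), ref_props.var(ddof=1)
ab = m * (1 - m) / v - 1                 # method-of-moments estimate of alpha + beta
alpha, beta = m * ab, (1 - m) * ab

n_target, k_target = 40, 17              # total and rear-end collisions at the target site
p_tail = stats.betabinom.sf(k_target - 1, n_target, alpha, beta)   # P(count >= k_target)
print(f"alpha={alpha:.1f}, beta={beta:.1f}, P(count >= {k_target}) = {p_tail:.3f}")
```

A small tail probability flags the configuration as over-represented at the target site relative to the reference group.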
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A software description and program listings for the modification of the data analysis software are included.
the published methods to increase LAP applicability. The adaptations must be tested to ensure the same concentrations approximate the corresponding sugar concentrations in the sample. Methods for optimizing "Compositional Analysis of Lignocellulosic Feedstocks. 1. Review and Description of Methods," J. Agric. Food
2012-01-01
Background Multi attribute utility (MAU) instruments are used to include the health related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL)-6D, MAU instrument. Methods The AQoL program introduced the use of psychometric methods in the construction of health related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. Results and Discussion The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. Conclusions The AQoL-6D descriptive system has good psychometric properties. They imply that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means that it may be used with confidence for measuring health related quality of life and that it is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs. PMID:22507254
Jähnig, P; Jobert, M
1995-01-01
Quantitative EEG is a sensitive method for measuring pharmacological effects on the central nervous system. Nowadays, computers enable EEG data to be stored and spectral parameters to be computed for signals obtained from a large number of electrode locations. However, the statistical analysis of such vast amounts of EEG data is complicated due to the limited number of subjects usually involved in pharmacological studies. In the present study, data from a trial aimed at comparing diazepam and placebo were used to investigate different properties of EEG mapping data and to compare different methods of data analysis. Both the topography and the temporal changes of EEG activity were investigated using descriptive data analysis, which is based on an inspection of patterns of pd values (descriptive p values) assessed for all pair-wise tests for differences in time or treatment. An empirical measure (tri-mean) for the computation of group maps is suggested, allowing a better description of group effects with skewed data from small samples. Finally, both the investigation of maps based on principal component analysis and the notion of distance between maps are discussed and applied to the analysis of the data collected under diazepam treatment, exemplifying the evaluation of pharmacodynamic drug effects.
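The abstract does not spell out the tri-mean formula; the sketch below assumes Tukey's trimean, a common robust location estimate, and uses made-up spectral power values to show why such a measure is less sensitive to a single outlying subject than the arithmetic mean.

```python
import numpy as np

def trimean(x):
    """Tukey's trimean: (Q1 + 2*median + Q3) / 4, a robust location estimate."""
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    return (q1 + 2 * med + q3) / 4

# Hypothetical absolute alpha power at one electrode across a small group.
alpha_power = np.array([3.1, 2.8, 3.5, 9.7, 3.0, 2.6, 3.3, 4.0])
print(f"trimean {trimean(alpha_power):.2f} vs mean {alpha_power.mean():.2f}")
```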
ERIC Educational Resources Information Center
Hoffman, John L.; Bresciani, Marilee J.
2012-01-01
This mixed method study explored the professional competencies that administrators expect from entry-, mid-, and senior-level professionals as reflected in 1,759 job openings posted in 2008. Knowledge, skill, and dispositional competencies were identified during the qualitative phase of the study. Statistical analysis of the prevalence of…
ERIC Educational Resources Information Center
Hofmann, Fabian
2016-01-01
Social phenomenological analysis is presented as a research method for museum and art education. After explaining its methodological background, it is shown how this method has been applied in a study of gallery talks or guided tours in art museums: Analyzing the situation by description and interpretation, a model for understanding gallery talks…
This Paper covers the basics of passive sampler design, compares passive samplers to conventional methods of air sampling, and discusses considerations when implementing a passive sampling program. The Paper also discusses field sampling and sample analysis considerations to ensu...
EHR Improvement Using Incident Reports.
Teame, Tesfay; Stålhane, Tor; Nytrø, Øystein
2017-01-01
This paper discusses reactive improvement of clinical software using methods for incident analysis. We used the "Five Whys" method because we had only descriptive data and depended on a domain expert for the analysis. The analysis showed that there are two major root causes for EHR software failure, and that they are related to human and organizational errors. A main identified improvement is allocating more resources to system maintenance and user training.
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal
2016-01-01
This article compares the study design and statistical methods used in the 2005, 2010 and 2015 issues of the Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015) in which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency of use increased from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design.
Martens, Brian K; DiGennaro, Florence D; Reed, Derek D; Szczech, Frances M; Rosenthal, Blair D
2008-01-01
Descriptive assessment methods have been used in applied settings to identify consequences for problem behavior, thereby aiding in the design of effective treatment programs. Consensus has not been reached, however, regarding the types of data or analytic strategies that are most useful for describing behavior–consequence relations. One promising approach involves the analysis of conditional probabilities from sequential recordings of behavior and events that follow its occurrence. In this paper we review several strategies for identifying contingent relations from conditional probabilities, and propose an alternative strategy known as a contingency space analysis (CSA). Step-by-step procedures for conducting and interpreting a CSA using sample data are presented, followed by discussion of the potential use of a CSA for conducting descriptive assessments, informing intervention design, and evaluating changes in reinforcement contingencies following treatment. PMID:18468280
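As a concrete illustration of the contingency comparison that underlies a CSA, the sketch below computes the two conditional probabilities from hypothetical partial-interval records; the interval coding and the attention consequence are assumptions for illustration, not the authors' sample data.

```python
import numpy as np

# 1 = event occurred in that interval (hypothetical 10-s partial-interval data).
behavior = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0], dtype=bool)
attention = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0], dtype=bool)

p_c_given_b = attention[behavior].mean()        # P(attention | behavior)
p_c_given_not_b = attention[~behavior].mean()   # P(attention | no behavior)

# In a contingency space, points where the first probability exceeds the
# second suggest a positive contingency between behavior and attention.
print(f"P(C|B) = {p_c_given_b:.2f}, P(C|~B) = {p_c_given_not_b:.2f}")
```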
A Case Study on Teaching of Energy as a Subject for 9th Graders
ERIC Educational Resources Information Center
Bezen, Sevim; Bayrak, Celal; Aykutlu, Isil
2017-01-01
This study aims to describe how the subject of energy is taught in the 9th grade. The study is designed as a descriptive case study with the participation of 3 physics teachers and 85 students. Data were obtained through observation, interviews, and documents, and they were analyzed through the descriptive analysis method. In the observations made at the…
This compendium includes descriptions of methods for analyzing metals, pesticides and volatile organic compounds (VOCs) in water. The individual methods covered are these: (1) Method 200.8: determination of trace elements in waters and wastes by inductively coupled plasma-mass s...
Towards health care process description framework: an XML DTD design.
Staccini, P.; Joubert, M.; Quaranta, J. F.; Aymard, S.; Fieschi, D.; Fieschi, M.
2001-01-01
The development of health care and hospital information systems has to meet users' needs as well as requirements such as the tracking of all care activities and the support of quality improvement. The use of process-oriented analysis is of value to provide analysts with: (i) a systematic description of activities; (ii) the elicitation of the useful data to perform and record care tasks; (iii) the selection of relevant decision-making support. But paper-based tools are not a very suitable way to manage and share the documentation produced during this step. The purpose of this work is to propose a method to implement the results of process analysis according to XML techniques (eXtensible Markup Language). It is based on the IDEF0 activity modeling language (Integration DEfinition for Function modeling). A hierarchical description of a process and its components has been defined through a flat XML file with a grammar of proper metadata tags. Perspectives of this method are discussed. PMID:11825265
NASA Astrophysics Data System (ADS)
Matsuzaki, F.; Yoshikawa, N.; Tanaka, M.; Fujimaki, A.; Takai, Y.
2003-10-01
Recently many single flux quantum (SFQ) logic circuits containing several thousands of Josephson junctions have been designed successfully by using digital domain simulation based on the hardware description language (HDL). In the present HDL-based design of SFQ circuits, a structure-level HDL description has been used, where circuits are made up of basic gate cells. However, in order to analyze large-scale SFQ digital systems, such as a microprocessor, a higher level of circuit abstraction is necessary to reduce the circuit simulation time. In this paper we have investigated the way to describe functionality of the large-scale SFQ digital circuits by a behavior-level HDL description. In this method, the functionality and the timing of the circuit block are defined directly by describing their behavior by the HDL. Using this method, we can dramatically reduce the simulation time of large-scale SFQ digital circuits.
Midilli, Tulay Sagkal; Yasar, Eda; Baysal, Ebru
2015-01-01
The purpose of this study was to examine the menstruation and dysmenorrhea characteristics and the factors affecting dysmenorrhea of health school students, and the knowledge and use of the methods of complementary and alternative medicine (CAM) on the part of those students with dysmenorrhea. This is a descriptive study. A descriptive analysis was made by calculating the number, percentage, mean, Pearson χ2, and logistic regression analysis. A total of 488 female students participated in the research and 87.7% (n = 428) of all students experienced dysmenorrhea. It was detected that a family history of dysmenorrhea and regular menstrual cycles of the students were dysmenorrhea-affecting factors (P < .05). Seven of 10 students with dysmenorrhea used CAM methods. Among the CAM methods for dysmenorrhea management, heat application was the most commonly used and the best known by the students. The students who experienced severe pain used analgesics (P < .05) and CAM methods (P < .05).
Multiscale multifractal detrended cross-correlation analysis of financial time series
NASA Astrophysics Data System (ADS)
Shi, Wenbin; Shang, Pengjian; Wang, Jing; Lin, Aijing
2014-06-01
In this paper, we introduce a method called multiscale multifractal detrended cross-correlation analysis (MM-DCCA). The method allows us to extend the description of the cross-correlation properties between two time series. MM-DCCA may provide new ways of measuring the nonlinearity of two signals, and it helps to present much richer information than multifractal detrended cross-correlation analysis (MF-DCCA) by sweeping the full range of scales at which the multifractal structures of a complex system are discussed. Moreover, to illustrate the advantages of this approach we make use of the MM-DCCA to analyze the cross-correlation properties between financial time series. We show that this new method can be adapted to the stock markets under investigation. It can provide a more faithful and more interpretable description of the dynamic mechanism between financial time series than traditional MF-DCCA. We also propose to reduce the scale ranges to analyze short time series; some inherent properties that remain hidden when a wide range is used can be exhibited clearly in this way.
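To indicate the kind of computation involved, here is a minimal single-scale DCCA fluctuation function in Python; MM-DCCA additionally sweeps the window size over many scale ranges and moment orders q, and the synthetic series below are assumptions for illustration only.

```python
import numpy as np

def dcca_fluctuation(x, y, s):
    """Detrended cross-covariance fluctuation at a single window size s."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    t = np.arange(s)
    f2 = []
    for i in range(len(X) // s):
        seg = slice(i * s, (i + 1) * s)
        # Linear detrending of both integrated profiles within the window.
        rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
        ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
        f2.append(np.mean(rx * ry))
    return np.sqrt(np.abs(np.mean(f2)))

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)                       # hypothetical return series
y = 0.6 * x + 0.8 * rng.standard_normal(2000)
print({s: round(dcca_fluctuation(x, y, s), 3) for s in (16, 32, 64, 128)})
```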
Cognitive Approaches for Medicine in Cloud Computing.
Ogiela, Urszula; Takizawa, Makoto; Ogiela, Lidia
2018-03-03
This paper will present the application potential of the cognitive approach to data interpretation, with special reference to medical areas. The possibilities of using the meaning approach to data description and analysis will be proposed for data analysis tasks in Cloud Computing. The methods of cognitive data management in Cloud Computing are aimed at supporting the processes of protecting data against unauthorised takeover and they serve to enhance the data management processes. The accomplishment of the proposed tasks will be the definition of algorithms for the execution of meaning data interpretation processes in safe Cloud Computing. • We proposed cognitive methods for data description. • We proposed techniques for securing data in Cloud Computing. • We described the application of cognitive approaches to medicine.
Analysis of Pre-Service Science Teachers' Views about the Methods Which Develop Reflective Thinking
ERIC Educational Resources Information Center
Töman, Ufuk; Odabasi Çimer, Sabiha; Çimer, Atilla
2014-01-01
In this study, we investigate science and technology pre-service teachers' opinions about the methods that develop reflective thinking and determine their level of reflective thinking. This study is a descriptive study. Open-ended questions were used to determine the views of pre-service teachers. Questions used in the statistical analysis of…
Turbofan forced mixer lobe flow modeling. 2: Three-dimensional inviscid mixer analysis (FLOMIX)
NASA Technical Reports Server (NTRS)
Barber, T.
1988-01-01
A three-dimensional potential analysis (FLOMIX) was formulated and applied to the inviscid flow over a turbofan forced mixer. The method uses a small disturbance formulation to analytically uncouple the circumferential flow from the radial and axial flow problem, thereby reducing the analysis to the solution of a series of axisymmetric problems. These equations are discretized using a flux volume formulation along a Cartesian grid. The method extends earlier applications of the Cartesian method to complex cambered geometries. The effects of power addition are also included within the potential formulation. Good agreement is obtained with an alternate small disturbance analysis for a high penetration symmetric mixer in a planar duct. In addition, calculations showing pressure distributions and induced secondary vorticity fields are presented for practical turbofan mixer configurations, and where possible, comparison was made with available experimental data. A detailed description of the required data input and coordinate definition is presented along with a sample data set for a practical forced mixer configuration. A brief description of the program structure and subroutines is also provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brink, A.; Kilpinen, P.; Hupa, M.
1996-01-01
Two methods to improve the modeling of NOx emissions in numerical flow simulation of combustion are investigated. The models used are a reduced mechanism for nitrogen chemistry in methane combustion and a new model based on regression analysis of perfectly stirred reactor simulations using detailed comprehensive reaction kinetics. The applicability of the methods to numerical flow simulation of practical furnaces, especially in the near burner region, is tested against experimental data from a pulverized coal fired single burner furnace. The results are also compared to those obtained using a commonly used description for the overall reaction rate of NO.
Automatic control system generation for robot design validation
NASA Technical Reports Server (NTRS)
Bacon, James A. (Inventor); English, James D. (Inventor)
2012-01-01
The specification and drawings present a new method, system, software product, and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting a robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description and finally updating robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dréan, Gaël; Acosta, Oscar, E-mail: Oscar.Acosta@univ-rennes1.fr; Simon, Antoine
2016-06-15
Purpose: Recent studies revealed a trend toward voxelwise population analysis in order to understand the local dose/toxicity relationships in prostate cancer radiotherapy. Such approaches require, however, an accurate interindividual mapping of the anatomies and 3D dose distributions toward a common coordinate system. This step is challenging due to the high interindividual variability. In this paper, the authors propose a method designed for interindividual nonrigid registration of the rectum and dose mapping for population analysis. Methods: The method is based on the computation of a normalized structural description of the rectum using a Laplacian-based model. This description takes advantage of the tubular structure of the rectum and its centerline to be embedded in a nonrigid registration-based scheme. The performances of the method were evaluated on 30 individuals treated for prostate cancer in a leave-one-out cross validation. Results: Performance was measured using classical metrics (Dice score and Hausdorff distance), along with new metrics devised to better assess dose mapping in relation with structural deformation (dose-organ overlap). Considering these scores, the proposed method outperforms intensity-based and distance maps-based registration methods. Conclusions: The proposed method allows for accurately mapping interindividual 3D dose distributions toward a single anatomical template, opening the way for further voxelwise statistical analysis.
Pickup, William; Bremer, Phil; Peng, Mei
2018-03-01
The extensive time and cost associated with conventional sensory profiling methods has spurred sensory researchers to develop rapid method alternatives, such as Napping® with Ultra-Flash Profiling (UFP). Napping®-UFP generates sensory maps by requiring untrained panellists to separate samples based on perceived sensory similarities. Evaluations of this method have been restricted to manufactured/formulated food models, and predominantly structured on comparisons against the conventional descriptive method. The present study aims to extend the validation of Napping®-UFP (N = 72) to natural biological products; and to evaluate this method against Descriptive Analysis (DA; N = 8) with physicochemical measurements as an additional evaluative criterion. The results revealed that sample configurations generated by DA and Napping®-UFP were not significantly correlated (RV = 0.425, P = 0.077); however, they were both correlated with the product map generated based on the instrumental measures (P < 0.05). The findings also indicated that sample characterisations from DA and Napping®-UFP were driven by different sensory attributes, indicating potential structural differences between these two methods in configuring samples. Overall, these findings lent support for the extended use of Napping®-UFP for evaluations of natural biological products. Although DA was shown to be a better method for establishing sensory-instrumental relationships, Napping®-UFP exhibited strengths in generating informative sample configurations based on holistic perception of products. © 2017 Society of Chemical Industry.
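For readers unfamiliar with the RV coefficient used above to compare the two product maps, the sketch below computes it for two hypothetical sample configurations; the coordinates and their relationship are invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((6, 2))                      # e.g. DA map, 6 samples in 2-D
Y = X @ np.array([[0.8, 0.2], [-0.3, 1.1]]) + 0.3 * rng.standard_normal((6, 2))

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
Sx, Sy = Xc @ Xc.T, Yc @ Yc.T                        # sample-by-sample configuration matrices
rv = np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))
print(f"RV = {rv:.3f}")                              # 1 means identical configurations
```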
NASA Astrophysics Data System (ADS)
Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai
2015-06-01
Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
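As a worked illustration of the two statistical procedures compared above, the sketch below computes graphic (Folk & Ward) and weight-moment parameters for one hypothetical grain-size distribution; the class limits, weights, and the linear interpolation of percentiles are assumptions for illustration, not the Qiantang data.

```python
import numpy as np

phi = np.array([0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5])     # class midpoints (phi units)
wt = np.array([2.0, 10.0, 35.0, 30.0, 13.0, 7.0, 3.0])  # weight % per class

w = wt / wt.sum()
# Weight-moment statistics (mean, sorting, skewness).
m1 = np.sum(w * phi)
m2 = np.sqrt(np.sum(w * (phi - m1) ** 2))
m3 = np.sum(w * (phi - m1) ** 3) / m2 ** 3

# Percentiles read off the cumulative curve (linear interpolation on class limits).
cum = np.concatenate(([0.0], np.cumsum(wt)))
edges = np.concatenate(([phi[0] - 0.5], phi + 0.5))
p5, p16, p50, p84, p95 = np.interp([5, 16, 50, 84, 95], cum, edges)

# Folk & Ward graphic statistics.
fw_mean = (p16 + p50 + p84) / 3
fw_sort = (p84 - p16) / 4 + (p95 - p5) / 6.6
fw_skew = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
           + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))

print(f"mean     moment {m1:.2f}  FW {fw_mean:.2f}")
print(f"sorting  moment {m2:.2f}  FW {fw_sort:.2f}")
print(f"skewness moment {m3:.2f}  FW {fw_skew:.2f}")
```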
Methods for the design and analysis of power optimized finite-state machines using clock gating
NASA Astrophysics Data System (ADS)
Chodorowski, Piotr
2017-11-01
The paper discusses two methods for designing power-optimized FSMs. Both methods use clock gating techniques. The main objective of the research was to write a program capable of automatically generating hardware descriptions of finite-state machines in VHDL, together with testbenches to support power analysis. The creation of relevant output files is detailed step by step. The program was tested using the LGSynth91 FSM benchmark package. An analysis of the generated circuits shows that the second method presented in this paper leads to a significant reduction of power consumption.
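As a rough indication of what such a generator produces, here is a Python sketch that emits an AND-based clock-gating wrapper as VHDL text; the entity and signal names are hypothetical, this is not the paper's generator, and production low-power flows usually insert latch-based gating cells to avoid glitches on the gated clock.

```python
# Minimal sketch: emit a simple AND-based clock-gating wrapper as VHDL text.
# Names are hypothetical; latch-based gating is preferred in real designs.
def clock_gate_vhdl(name: str) -> str:
    return f"""library ieee;
use ieee.std_logic_1164.all;

entity {name}_clkgate is
  port (clk, enable : in  std_logic;
        gclk        : out std_logic);
end entity;

architecture rtl of {name}_clkgate is
begin
  gclk <= clk and enable;  -- FSM clock is suppressed while enable = '0'
end architecture;
"""

print(clock_gate_vhdl("traffic_fsm"))
```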
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal
2016-01-01
Objective: This article compares the study design and statistical methods used in the 2005, 2010 and 2015 issues of the Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015) in which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency of use increased from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design. PMID:27022365
Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B
2012-01-20
Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
2012-01-01
Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. Findings We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software. PMID:22264277
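To show the arithmetic that such a spreadsheet guide automates, here is a minimal inverse-variance fixed-effect pooling sketch in Python; the study prevalences and sample sizes are invented, and a random-effects model would add a between-study variance term to the weights.

```python
import numpy as np

prev = np.array([0.12, 0.18, 0.09, 0.15])     # hypothetical study prevalences
n = np.array([250, 180, 400, 320])            # hypothetical study sample sizes

var = prev * (1 - prev) / n                   # variance of each prevalence estimate
w = 1 / var                                   # inverse-variance weights
pooled = np.sum(w * prev) / np.sum(w)
se = np.sqrt(1 / np.sum(w))

print(f"pooled prevalence {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f})")
```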
Morales, Daniel R.; Pacurariu, Alexandra; Kurz, Xavier
2017-01-01
Aims Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. Methods We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non‐European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. Results From 1246 screened articles, 229 were eligible for full‐text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. Interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill‐over effects were rarely evaluated. Two‐thirds used before–after time series and 15.7% before–after cross‐sectional study designs. Various analytical approaches were applied including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Conclusion Whilst impact evaluation of pharmacovigilance and product‐specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. PMID:29105853
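Interrupted time-series regression is the most common analytical approach reported in this review; the sketch below fits the standard segmented model (baseline level, pre-intervention trend, level change, slope change) to simulated monthly counts, all of which are assumptions for illustration only.

```python
import numpy as np

months = np.arange(24)
post = (months >= 12).astype(float)                 # regulatory action at month 12
time_after = np.clip(months - 12, 0, None)          # slope-change term

rng = np.random.default_rng(1)
y = 100 + 0.5 * months - 15 * post - 1.2 * time_after + rng.normal(0, 2, 24)

# Ordinary least squares fit of the segmented-regression design matrix.
X = np.column_stack([np.ones_like(months), months, post, time_after])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["baseline", "pre_trend", "level_change", "slope_change"],
               np.round(beta, 2))))
```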
ERIC Educational Resources Information Center
Bailey, Charles-James N.
The author aims: (1) to show that generative phonology uses essentially the method of internal reconstruction which has previously been employed only in diachronic studies in setting up synchronic underlying phonological representations; (2) to show why synchronic analysis should add the comparative method to its arsenal, together with whatever…
The Analysis of Discourse as Evaluation of Productive Thinking.
ERIC Educational Resources Information Center
Tripp, D. H.
This paper provides a thorough description of a method of analyzing and scoring group discussions from a particular point of view. After discussing shortcomings of traditional methods of reporting data from group discussions and problems inherent in the use of paper-and-pencil creativity tests, the author describes a method which was developed as…
A Comparison of Descriptive and Functional Analyses of Inappropriate Mealtime Behavior.
Borrero, Carrie S W; England, Jennie D; Sarcia, Ben; Woods, Julia N
2016-12-01
In recent years, rather than being used to assess the potential function of a response, descriptive assessment methods have been applied to evaluate potential consequences or contingencies for problem behavior (Borrero, Woods, Borrero, Masler, & Lesser in Journal of Applied Behavior Analysis, 43, 71-88. doi: 10.1901/jaba.2010.43-71, 2010) or to assist with designing baseline conditions to approximate caregiver behavior (Casey et al. in Behavior Modification, 33, 537-558. doi: 10.1177/0145445509341457, 2009). It has been shown that descriptive assessments of some forms of problem behavior (e.g., self-injury, aggression) are not good indicators of behavioral function and should not be used exclusively when conducting functional behavior assessments (Thompson & Iwata in Journal of Applied Behavior Analysis, 40, 333-338. doi: 10.1901/jaba.2007.56.06/epdf, 2007). However, the extent to which descriptive assessments of inappropriate mealtime behavior can predict behavioral function is not yet clear. We conducted descriptive assessments of inappropriate mealtime behavior and compared the results to functional analyses for ten children with severe food refusal. Results showed that, for 71% of participants, the descriptive and functional analyses matched. These results suggest that the correspondence between descriptive and functional analyses, at least for inappropriate mealtime behavior, may be higher than that for other forms of problem behavior.
Survey of Existing and Promising New Methods of Surface Preparation
1982-04-01
and abroad, a description and analysis are given of applicable methods including: • Equipment employing recycled steel shot and grit. • Wet blast...requirements that must be met by these methods. 23. Barrillom, P., “Preservation of Materials in the Marine Environment— Analysis of Replies to the Enquiry on...conditions, can hydrolyze or give sulfuric acid, causing renewed corrosion. Wet blasting or the use of high pressure water jets appears to be useful in
Wirihana, Lisa; Welch, Anthony; Williamson, Moira; Christensen, Martin; Bakon, Shannon; Craft, Judy
2018-03-16
Phenomenology is a useful methodological approach in qualitative nursing research. It enables researchers to put aside their perceptions of a phenomenon and give meaning to a participant's experiences. Exploring the experiences of others enables previously unavailable insights to be discovered. To delineate the implementation of Colaizzi's (1978) method of data analysis in descriptive phenomenological nursing research. The use of Colaizzi's method of data analysis enabled new knowledge to be revealed and provided insights into the experiences of nurse academics teaching on satellite campuses. Local adaptation of the nursing curriculum and additional unnoticed responsibilities had not been identified previously and warrant further research. Colaizzi's (1978) method of data analysis is rigorous and robust, and therefore a qualitative method that ensures the credibility and reliability of its results. It allows researchers to reveal emergent themes and their interwoven relationships. Researchers using a descriptive phenomenological approach should consider using this method as a clear and logical process through which the fundamental structure of an experience can be explored. Colaizzi's phenomenological methodology can be used reliably to understand people's experiences. This may prove beneficial in the development of therapeutic policy and the provision of patient-centred care. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
DOT National Transportation Integrated Search
1994-02-01
This report describes the data collection procedures, the data analysis methods, and the results gained from the on-site evaluations. The content of the report is as follows: Chapter 2 - State Profiles. This chapter includes descriptions of the organ...
A Descriptive Analysis of Overviews of Reviews Published between 2000 and 2011
Hartling, Lisa; Chisholm, Annabritt; Thomson, Denise; Dryden, Donna M.
2012-01-01
Background Overviews of systematic reviews compile data from multiple systematic reviews (SRs) and are a new method of evidence synthesis. Objectives To describe the methodological approaches in overviews of interventions. Design Descriptive study. Methods We searched 4 databases from 2000 to July 2011; we handsearched Evidence-based Child Health: A Cochrane Review Journal. We defined an overview as a study that: stated a clear objective; examined an intervention; used explicit methods to identify SRs; collected and synthesized outcome data from the SRs; and intended to include only SRs. We did not restrict inclusion by population characteristics (e.g., adult or children only). Two researchers independently screened studies and applied eligibility criteria. One researcher extracted data with verification by a second. We conducted a descriptive analysis. Results From 2,245 citations, 75 overviews were included. The number of overviews increased from 1 in 2000 to 14 in 2010. The interventions were pharmacological (n = 20, 26.7%), non-pharmacological (n = 26, 34.7%), or both (n = 29, 38.7%). Inclusion criteria were clearly stated in 65 overviews. Thirty-three (44%) overviews searched at least 2 databases. The majority reported the years and databases searched (n = 46, 61%), and provided key words (n = 58, 77%). Thirty-nine (52%) overviews included Cochrane SRs only. Two reviewers independently screened and completed full text review in 29 overviews (39%). Methods of data extraction were reported in 45 (60%). Information on quality of individual studies was extracted from the original SRs in 27 (36%) overviews. Quality assessment of the SRs was performed in 28 (37%) overviews; at least 9 different tools were used. Quality of the body of evidence was assessed in 13 (17%) overviews. Most overviews provided a narrative or descriptive analysis of the included SRs. One overview conducted indirect analyses and the other conducted mixed treatment comparisons. Publication bias was discussed in 18 (24%) overviews. Conclusions This study shows considerable variation in the methods used for overviews. There is a need for methodological rigor and consistency in overviews, as well as empirical evidence to support the methods employed. PMID:23166744
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting to CONSORT and to explore associated trial-level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (Mean ± SD). No RCT reported descriptions and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
Applying Molecular Bonding Concepts to the Solid State
NASA Astrophysics Data System (ADS)
Dunnington, Benjamin D.
In this thesis, we describe the extension and application of Natural Bond Orbital (NBO) analysis to periodic systems. This enables the translation of rigorous, quantum mechanical calculation results of solid systems into the localized lone pairs and two-center bonds of Lewis structures. Such localized bonding descriptions form the basic language of chemistry, and application of these ideas to solids allows for the understanding of complex phenomena in bulk systems using readily accessible concepts from molecular science. In addition to the algorithmic adjustments needed to account for periodic boundary conditions in the NBO process, we also discuss methodology to interface the ubiquitous plane wave basis sets of the solid state with the atom-centered basis functions needed as input for NBO analysis. We will describe one method using projection of the plane wave eigenstates, and a second projection-free method that involves the direct calculation of matrix elements of the plane wave Hamiltonian in an atom-centered basis. The reliance of many localized, post-computational analysis techniques on an atom-centered description of the orbitals means these interfaces will have applicability beyond our NBO development. An ideal area for application of such molecular descriptions of periodic systems is heterogeneous catalysis, where reactants from a gas/liquid phase react on a solid catalyst surface. Previous studies of these systems have originated from the delocalized perspective of the bulk catalyst. NBO provides an explicit description of the perturbative effect of the catalyst on the covalent bonds of the reactant, which is correlated with the catalytic activity of the material. Such a shift to an adsorbate-focused description of surface reactivity will enable understanding of catalysis across a variety of materials.
2016-11-15
participants who were followed for the development of back pain for an average of 3.9 years. Methods. Descriptive statistics and longitudinal...health, military personnel, occupational health, outcome assessment, statistics, survey methodology. Level of Evidence: 3 Spine 2016;41:1754–1763 ...based on the National Health and Nutrition Examination Survey.21 Statistical Analysis Descriptive and univariate analyses compared characteristics
Methodological challenges in qualitative content analysis: A discussion paper.
Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit
2017-09-01
This discussion paper aims to map content analysis in the qualitative paradigm and explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic in how categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.
Towards an Interoperability Ontology for Software Development Tools
2003-03-01
The description of feature models was tied to the introduction of the Feature-Oriented Domain Analysis (FODA*) [KANG90] approach in the late eighties...Feature-oriented domain analysis (FODA) is a domain analysis method developed at the Software...ese obstacles was to construct a “pilot” ontology that is extensible. We applied the Feature-Oriented Domain Analysis approach to capture the
Electronic Analysis of Communication.
ERIC Educational Resources Information Center
Baggaley, Jon
1982-01-01
Discusses the use of microcomputer-based testing methods in media and community research, with descriptions of the Programme Evaluation Analysis Computer (PEAC) developed for the Ontario Education Communications Authority and of the application of the PEAC system in a study of second-by-second responses to Orson Welles'"War of the…
Frishkoff, Gwen; Sydes, Jason; Mueller, Kurt; Frank, Robert; Curran, Tim; Connolly, John; Kilborn, Kerry; Molfese, Dennis; Perfetti, Charles; Malony, Allen
2011-01-01
We present MINEMO (Minimal Information for Neural ElectroMagnetic Ontologies), a checklist for the description of event-related potentials (ERP) studies. MINEMO extends MINI (Minimal Information for Neuroscience Investigations) to the ERP domain. Checklist terms are explicated in NEMO, a formal ontology that is designed to support ERP data sharing and integration. MINEMO is also linked to an ERP database and web application (the NEMO portal). Users upload their data and enter MINEMO information through the portal. The database then stores these entries in RDF (Resource Description Framework), along with summary metrics, i.e., spatial and temporal metadata. Together these spatial, temporal, and functional metadata provide a complete description of ERP data and the context in which these data were acquired. The RDF files then serve as inputs to ontology-based labeling and meta-analysis. Our ultimate goal is to represent ERPs using a rich semantic structure, so results can be queried at multiple levels, to stimulate novel hypotheses and to promote a high-level, integrative account of ERP results across diverse study methods and paradigms. PMID:22180824
The purpose of this SOP is to describe the collection, storage, and shipment of tap and drinking water samples for analysis by EPA method 524.2 (revision 4.0). This SOP provides a brief description of the sample containers, collection, preservation, storage, shipping, and custod...
NASA Technical Reports Server (NTRS)
Parse, Joseph B.; Wert, J. A.
1991-01-01
Inhomogeneities in the spatial distribution of second phase particles in engineering materials are known to affect certain mechanical properties. Progress in this area has been hampered by the lack of a convenient method for quantitative description of the spatial distribution of the second phase. This study intends to develop a broadly applicable method for the quantitative analysis and description of the spatial distribution of second phase particles. The method was designed to operate on a desktop computer. The Dirichlet tessellation technique (geometrical method for dividing an area containing an array of points into a set of polygons uniquely associated with the individual particles) was selected as the basis of an analysis technique implemented on a PC. This technique is being applied to the production of Al sheet by PM processing methods: vacuum hot pressing, forging, and rolling. The effect of varying hot working parameters on the spatial distribution of aluminum oxide particles in consolidated sheet is being studied. Changes in distributions of properties such as through-thickness near-neighbor distance correlate with hot-working reduction.
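A present-day equivalent of the tessellation step is easy to sketch with SciPy's Voronoi routines; in the sketch below the particle centres are random points, and the coefficient of variation of bounded cell areas is just one possible homogeneity measure, not necessarily the statistic the authors used.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

rng = np.random.default_rng(2)
centers = rng.random((200, 2))                 # hypothetical particle centres in a unit square
vor = Voronoi(centers)

areas = []
for region_index in vor.point_region:
    region = vor.regions[region_index]
    if -1 in region or len(region) == 0:       # skip unbounded cells at the boundary
        continue
    areas.append(ConvexHull(vor.vertices[region]).volume)  # 2-D "volume" is the area

areas = np.array(areas)
print(f"{areas.size} bounded cells, mean area {areas.mean():.4f}, "
      f"CV {areas.std() / areas.mean():.2f}")
```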
NASA Astrophysics Data System (ADS)
Ganiev, R. F.; Reviznikov, D. L.; Rogoza, A. N.; Slastushenskiy, Yu. V.; Ukrainskiy, L. E.
2017-03-01
A description of a complex approach to investigation of nonlinear wave processes in the human cardiovascular system based on a combination of high-precision methods of measuring a pulse wave, mathematical methods of processing the empirical data, and methods of direct numerical modeling of hemodynamic processes in an arterial tree is given.
Fourier Descriptor Analysis and Unification of Voice Range Profile Contours: Method and Applications
ERIC Educational Resources Information Center
Pabon, Peter; Ternstrom, Sten; Lamarche, Anick
2011-01-01
Purpose: To describe a method for unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. Method: A morphologic modeling technique, which is based on Fourier descriptors (FDs), is applied to the VRP contour. The technique, which essentially involves resampling of the curve of the…
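The core of a Fourier-descriptor representation can be sketched in a few lines: the closed contour is treated as a complex-valued sequence and truncated to a small number of harmonics. The ellipse-like contour and the number of retained harmonics below are assumptions for illustration, not the authors' VRP parameterisation.

```python
import numpy as np

N = 64
t = np.linspace(0, 2 * np.pi, N, endpoint=False)
# Hypothetical closed VRP-like contour (fundamental frequency vs. level), as x + iy.
contour = (50 + 20 * np.cos(t)) + 1j * (70 + 35 * np.sin(t) + 5 * np.sin(3 * t))

fd = np.fft.fft(contour) / N                  # Fourier descriptors
keep = 2                                      # retain only the lowest-order harmonics
fd_smooth = np.zeros_like(fd)
fd_smooth[:keep + 1], fd_smooth[-keep:] = fd[:keep + 1], fd[-keep:]
smooth = np.fft.ifft(fd_smooth) * N           # smoothed, unified contour description

print(f"max deviation of the {keep}-harmonic model from the original contour: "
      f"{np.abs(smooth - contour).max():.2f}")
```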
Descriptive and experimental analyses of variables maintaining self-injurious behavior.
Lerman, D C; Iwata, B A
1993-01-01
Independent descriptive (correlational) and functional (experimental) analyses were conducted to determine the extent to which the two methods would yield data supporting similar conclusions about variables maintaining the self-injurious behavior (SIB) of 6 subjects. For the descriptive analyses, subjects were observed in their residences and at training sites at various times each day while observers recorded naturally occurring sequences of specified subject and staff behaviors. The subjects also participated in a day program for the assessment and treatment of SIB, in which they were exposed to functional analyses that manipulated potential maintaining variables in multielement designs. Both sets of data were analyzed via conditional probabilities to identify relevant antecedent and consequent events for subjects' SIB. Using outcomes of the experimental analysis as the standard for comparison, results indicated that the descriptive analysis was useful in identifying the extent to which SIB was related to social versus nonsocial contingencies, but was limited in its ability to distinguish between positive and negative reinforcement (i.e., attention versus escape). PMID:8407680
Development Plans in Turkey and a Managerial Analysis on Education (1963-2013 Years)
ERIC Educational Resources Information Center
Çakir, Rahman
2015-01-01
The purpose of this study is to evaluate nine educational management plans between the years 1963 and 2013 and one plan in the process of implementation, in terms of educational management. The document analysis technique, one of the qualitative research methods, was used in this research. Data were analyzed in three stages: description, analysis and interpretation.…
Cautions on the Use of Investigative Case Studies in Meta-Evaluation.
ERIC Educational Resources Information Center
Smith, Nick L.
1990-01-01
A meta-analysis combining expert evaluation with naturalistic case study methods indicates that such investigations must use special methods to render evaluative judgments of worth. It is demonstrated that descriptive, interpretive, and evaluative aspects of such a study must be combined to yield justifiable conclusions. (TJH)
The Tracer Method of Curriculum Analysis in Cancer Education
ERIC Educational Resources Information Center
Mahan, J. Maurice; And Others
1976-01-01
To assist faculty involved in cancer education in various courses in the curriculum, rather than instituting a new course in oncology, a method was developed for identifying and assessing cancer-related content (a clinical clerk attended lectures, interviewed instructors, reviewed syllabi, etc.) and a comprehensive description was produced and…
Implementation of radiation shielding calculation methods. Volume 2: Seminar/Workshop notes
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
Detailed descriptions are presented of the input data for each of the MSFC computer codes applied to the analysis of a realistic nuclear propelled vehicle. The analytical techniques employed include cross-section data preparation, one- and two-dimensional discrete ordinates transport, point kernel, and single-scatter methods.
2015-03-26
to my reader, Lieutenant Colonel Robert Overstreet, for helping solidify my research, coaching me through the statistical analysis, and positive... Descriptive Statistics ...common-method bias requires careful assessment of potential sources of bias and implementing procedural and statistical control methods. Podsakoff
Jacques, Eveline; Wells, Darren M; Bennett, Malcolm J; Vissenberg, Kris
2015-01-01
High-resolution imaging of cytoskeletal structures paves the way for standardized methods to quantify cytoskeletal organization. Here we provide a detailed description of the analysis performed to determine the microtubule patterns in gravistimulated roots, using the recently developed software tool MicroFilament Analyzer.
The Analysis of Iranian Students' Persistence in Online Education
ERIC Educational Resources Information Center
Mahmodi, Mahdi; Ebrahimzade, Issa
2015-01-01
In the following research, the relationship between instructional interaction and student persistence in e-learning has been analyzed. In order to conduct a descriptive-analytic survey, 744 undergraduate e-students were selected by stratified random sampling method to examine not only the frequency and the methods of establishing an instructional…
Transient flow thrust prediction for an ejector propulsion concept
NASA Technical Reports Server (NTRS)
Drummond, Colin K.
1989-01-01
A method for predicting transient thrust augmenting ejector characteristics is introduced. The analysis blends classic self-similar turbulent jet descriptions with a mixing region control volume analysis to predict transient effects in a new way. Details of the theoretical foundation, the solution algorithm, and sample calculations are given.
Generating Three-Dimensional Surface Models of Solid Objects from Multiple Projections.
1982-10-01
volume descriptions. The surface models are composed of curved, topologically rectangular, parametric patches. The data required to define these patches...geometry directly from image data. This method generates 3D surface descriptions of only those parts of the object that are illuminated by the projected...objects. Generation of such models inherently requires the acquisition and analysis of 3D surface data. In this context, acquisition refers to the
NASA Technical Reports Server (NTRS)
Pera, R. J.; Onat, E.; Klees, G. W.; Tjonneland, E.
1977-01-01
Weight and envelope dimensions of aircraft gas turbine engines are estimated within plus or minus 5% to 10% using a computer method based on correlations of component weight and design features of 29 data base engines. Rotating components are estimated by a preliminary design procedure where blade geometry, operating conditions, material properties, shaft speed, hub-tip ratio, etc., are the primary independent variables used. The development and justification of the method selected, the various methods of analysis, the use of the program, and a description of the input/output data are discussed.
Richardson, Jeffrey R J; Peacock, Stuart J; Hawthorne, Graeme; Iezzi, Angelo; Elsworth, Gerald; Day, Neil A
2012-04-17
Multi attribute utility (MAU) instruments are used to include the health related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL)-6D, MAU instrument. The AQoL program introduced the use of psychometric methods in the construction of health related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. The AQoL-6D descriptive system has good psychometric properties. They imply that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means that it may be used with confidence for measuring health related quality of life and that it is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs.
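As a rough illustration of the psychometric step described above, the sketch below (Python; not the AQoL study's actual code, and the simulated item responses and the choice of six factors are assumptions made only for the example) runs an exploratory factor analysis on an item-by-respondent matrix to obtain latent dimension scores and item loadings:

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(1)
    items = rng.integers(1, 6, size=(300, 20)).astype(float)  # 300 respondents x 20 items (simulated)

    fa = FactorAnalysis(n_components=6, random_state=0)       # six hypothesised dimensions
    scores = fa.fit_transform(items)                           # latent dimension scores per respondent
    loadings = fa.components_.T                                # item loadings on each dimension

In the study itself the factor structure was then refined and tested with structural equation modelling, a step this sketch does not attempt.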
Integration of the primary health care approach into a community nursing science curriculum.
Vilakazi, S S; Chabeli, M M; Roos, S D
2000-12-01
The purpose of this article is to explore and describe guidelines for integration of the primary health care approach into a Community Nursing Science Curriculum in a Nursing College in Gauteng. A qualitative, exploratory, descriptive and contextual research design was utilized. The focus group interviews were conducted with community nurses and nurse educators as respondents. Data were analysed by a qualitative descriptive method of analysis as described in Creswell (1994: 155). Respondents in both groups held similar perceptions regarding integration of primary health care approach into a Community Nursing Science Curriculum. Five categories, which are in line with the curriculum cycle, were identified as follows: situation analysis, selection and organisation of objectives/goals, content, teaching methods and evaluation. Guidelines and recommendations for the integration of the primary health care approach into a Community Nursing Science Curriculum were described.
A descriptive analysis of overviews of reviews published between 2000 and 2011.
Hartling, Lisa; Chisholm, Annabritt; Thomson, Denise; Dryden, Donna M
2012-01-01
Overviews of systematic reviews compile data from multiple systematic reviews (SRs) and are a new method of evidence synthesis. To describe the methodological approaches in overviews of interventions. Descriptive study. We searched 4 databases from 2000 to July 2011; we handsearched Evidence-based Child Health: A Cochrane Review Journal. We defined an overview as a study that: stated a clear objective; examined an intervention; used explicit methods to identify SRs; collected and synthesized outcome data from the SRs; and intended to include only SRs. We did not restrict inclusion by population characteristics (e.g., adult or children only). Two researchers independently screened studies and applied eligibility criteria. One researcher extracted data with verification by a second. We conducted a descriptive analysis. From 2,245 citations, 75 overviews were included. The number of overviews increased from 1 in 2000 to 14 in 2010. The interventions were pharmacological (n = 20, 26.7%), non-pharmacological (n = 26, 34.7%), or both (n = 29, 38.7%). Inclusion criteria were clearly stated in 65 overviews. Thirty-three (44%) overviews searched at least 2 databases. The majority reported the years and databases searched (n = 46, 61%), and provided key words (n = 58, 77%). Thirty-nine (52%) overviews included Cochrane SRs only. Two reviewers independently screened and completed full text review in 29 overviews (39%). Methods of data extraction were reported in 45 (60%). Information on quality of individual studies was extracted from the original SRs in 27 (36%) overviews. Quality assessment of the SRs was performed in 28 (37%) overviews; at least 9 different tools were used. Quality of the body of evidence was assessed in 13 (17%) overviews. Most overviews provided a narrative or descriptive analysis of the included SRs. One overview conducted indirect analyses and the other conducted mixed treatment comparisons. Publication bias was discussed in 18 (24%) overviews. This study shows considerable variation in the methods used for overviews. There is a need for methodological rigor and consistency in overviews, as well as empirical evidence to support the methods employed.
Fleming, Erin E.; Ziegler, Gregory R.; Hayes, John E.
2015-01-01
Multiple rapid sensory profiling techniques have been developed as more efficient alternatives to traditional sensory descriptive analysis. Here, we compare the results of three rapid sensory profiling techniques – check-all-that-apply (CATA), sorting, and polarized sensory positioning (PSP) – using a diverse range of astringent stimuli. These rapid methods differ in their theoretical basis, implementation, and data analyses, and the relative advantages and limitations are largely unexplored. Additionally, we were interested in using these methods to compare varied astringent stimuli, as these compounds are difficult to characterize using traditional descriptive analysis due to high fatigue and potential carry-over. In the CATA experiment, subjects (n=41) were asked to rate the overall intensity of each stimulus as well as to endorse any relevant terms (from a list of 13) which characterized the sample. In the sorting experiment, subjects (n=30) assigned intensity-matched stimuli into groups 1-on-1 with the experimenter. In the PSP experiment, (n=41) subjects first sampled and took notes on three blind references (‘poles’) before rating each stimulus for its similarity to each of the 3 poles. Two-dimensional perceptual maps from correspondence analysis (CATA), multidimensional scaling (sorting), and multiple factor analysis (PSP) were remarkably similar, with normalized RV coefficients indicating significantly similar plots, regardless of method. Agglomerative hierarchical clustering of all data sets using Ward’s minimum variance as the linkage criteria showed the clusters of astringent stimuli were approximately based on the respective class of astringent agent. Based on the descriptive CATA data, it appears these differences may be due to the presence of side tastes such as bitterness and sourness, rather than astringent sub-qualities per se. Although all three methods are considered ‘rapid,’ our prior experience with sorting suggests it is best performed 1:1 with the experimenter, which makes sorting relatively less efficient than CATA or PSP. Based on the evaluation criteria used here, the choice of method depends on the time constraints of the experimenter and the need for descriptive terms to understand the sensory space of the samples. Accordingly, we recommend a mixed approach that combines CATA with a subsequent PSP task so that the product space can be well characterized before choosing poles for PSP. PMID:26113771
The purpose of this SOP is to describe how to collect, store, and ship tap and drinking water samples for analysis by EPA Method 200.8 (revision 4.4) for the NHEXAS Arizona project. This SOP provides a brief description of the sample containers, collection, preservation, storage...
Complex networks as a unified framework for descriptive analysis and predictive modeling in climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinhaeuser, Karsten J K; Chawla, Nitesh; Ganguly, Auroop R
The analysis of climate data has relied heavily on hypothesis-driven statistical methods, while projections of future climate are based primarily on physics-based computational models. However, in recent years a wealth of new datasets has become available. Therefore, we take a more data-centric approach and propose a unified framework for studying climate, with an aim towards characterizing observed phenomena as well as discovering new knowledge in the climate domain. Specifically, we posit that complex networks are well-suited for both descriptive analysis and predictive modeling tasks. We show that the structural properties of climate networks have useful interpretation within the domain. Further, we extract clusters from these networks and demonstrate their predictive power as climate indices. Our experimental results establish that the network clusters are statistically significantly better predictors than clusters derived using a more traditional clustering approach. Using complex networks as data representation thus enables the unique opportunity for descriptive and predictive modeling to inform each other.
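To make the workflow concrete, here is a minimal Python sketch (not the authors' implementation; the simulated grid-point series, the 0.5 correlation threshold, and the modularity-based clustering routine are illustrative assumptions) of building a correlation network over climate time series, extracting clusters, and averaging each cluster into a candidate climate index:

    import numpy as np
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    rng = np.random.default_rng(0)
    modes = rng.standard_normal((5, 240))                     # 5 latent climate modes, 240 months
    series = np.repeat(modes, 10, axis=0) + 0.5 * rng.standard_normal((50, 240))  # 50 grid points

    corr = np.corrcoef(series)                                # pairwise correlations between grid points
    graph = nx.Graph()
    graph.add_nodes_from(range(len(series)))
    for i in range(len(series)):
        for j in range(i + 1, len(series)):
            if abs(corr[i, j]) > 0.5:                         # keep only strong links (assumed cutoff)
                graph.add_edge(i, j, weight=abs(corr[i, j]))

    clusters = greedy_modularity_communities(graph)           # network clusters
    indices = [series[list(c)].mean(axis=0) for c in clusters]  # cluster means as candidate indices

Each cluster-mean series could then be tested as a predictor of a quantity of interest, which is the spirit of the comparison the abstract reports.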
Case Method Teaching as Science and Art: A Metaphoric Approach and Curricular Application
ERIC Educational Resources Information Center
Greenhalgh, Anne M.
2007-01-01
The following article takes a metaphoric approach to case method teaching to shed light on one of our most important practices. The article hinges on the dual comparison of case method as science and as art. The dominant, scientific view of cases is that they are neutral descriptions of real-life business problems, subject to rigorous analysis.…
Jepson, Marcus; Elliott, Daisy; Conefrey, Carmel; Wade, Julia; Rooshenas, Leila; Wilson, Caroline; Beard, David; Blazeby, Jane M; Birtle, Alison; Halliday, Alison; Stein, Rob; Donovan, Jenny L
2018-07-01
To explore how the concept of randomization is described by clinicians and understood by patients in randomized controlled trials (RCTs) and how it contributes to patient understanding and recruitment. Qualitative analysis of 73 audio recordings of recruitment consultations from five, multicenter, UK-based RCTs with identified or anticipated recruitment difficulties. One in 10 appointments did not include any mention of randomization. Most included a description of the method or process of allocation. Descriptions often made reference to gambling-related metaphors or similes, or referred to allocation by a computer. Where reference was made to a computer, some patients assumed that they would receive the treatment that was "best for them". Descriptions of the rationale for randomization were rarely present and often only came about as a consequence of patients questioning the reason for a random allocation. The methods and processes of randomization were usually described by recruiters, but often without clarity, which could lead to patient misunderstanding. The rationale for randomization was rarely mentioned. Recruiters should avoid problematic gambling metaphors and illusions of agency in their explanations and instead focus on clearer descriptions of the rationale and method of randomization to ensure patients are better informed about randomization and RCT participation. Copyright © 2018 University of Bristol. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian
2016-04-01
Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that the use of a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted that are aimed at shielding original execution differences to create services which can be reused in the web environment. Although some model service standards (such as Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service into another computer still encounter problems (e.g., they cannot access the model deployment dependencies information). This study presents a strategy for encapsulating geo-analysis models to reduce problems encountered when sharing models between model providers and model users and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, the methods for encapsulating the model-execution program to model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.
NASA Technical Reports Server (NTRS)
Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.;
2016-01-01
This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2-600 Gpc(exp -3) yr(exp -1). Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.
High Performance Descriptive Semantic Analysis of Semantic Graph Databases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan
As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
NASA Astrophysics Data System (ADS)
Sarmini; Suyanto, Totok; Nadiroh, Ulin
2018-01-01
In general, corruption is very harmful to society. One effort to prevent corruption is to cultivate anti-corruption education in the younger generation through teaching materials in schools. The research method used is qualitative description. The sample in this research is 60 junior high school teachers of Citizenship Education in Surabaya. The data analysis technique used in this research is descriptive statistics with percentages. The results show that embedding the character values of anti-corruption education in teaching materials is very important for instilling those values in the younger generation.
REMARK checklist elaborated to improve tumor prognostication
Experts have elaborated on a previously published checklist of 20 items -- including descriptions of design, methods, and analysis -- that researchers should address when publishing studies of prognostic markers. These markers are indicators that enable d
Teaching Crystallography to Noncrystallographers.
ERIC Educational Resources Information Center
Glusker, Jenny P.
1988-01-01
Addresses the requirements of high school students and noncrystallographers in lectures on crystals, diffraction, and structure analysis. Discusses basic understanding and a sequence that addresses these requirements. Suggests visual and descriptive teaching methods used in this effort. (CW)
Risky Business: An Ecological Analysis of Intimate Partner Violence Disclosure
ERIC Educational Resources Information Center
Alaggia, Ramona; Regehr, Cheryl; Jenney, Angelique
2012-01-01
Objective: A multistage, mixed-methods study using grounded theory with descriptive data was conducted to examine factors in disclosure of intimate partner violence (IPV). Method: In-depth interviews with individuals and focus groups were undertaken to collect data from 98 IPV survivors and service providers to identify influential factors.…
School Help Professionals' Ideas on Child Abuse and Neglect
ERIC Educational Resources Information Center
Usakli, Hakan
2012-01-01
Method: In this study, qualitative research was carried out; interviews were conducted with 50 school counselors working in Sinop, who stated their ideas on child abuse and neglect. Analysis: Data collected via semi-structured interviews were subjected to descriptive and content analysis. The participant counselors were asked three…
ERIC Educational Resources Information Center
Hill, Susan C.; Lindsay, Gordon B.; Thomsen, Steve R.; Olsen, Astrid M.
2003-01-01
Media literacy education helps individuals become discriminating consumers of health information. Informed consumers are less likely to purchase useless health products if informed of misleading and deceptive advertising methods. The purpose of this study was to conduct a content analysis of health-related TV infomercials. An instrument…
A Computer Assisted Language Analysis System.
ERIC Educational Resources Information Center
Rush, J. E.; And Others
A description is presented of a computer-assisted language analysis system (CALAS) which can serve as a method for isolating and displaying language utterances found in conversation. The purpose of CALAS is stated as being to deal with the question of whether it is possible to detect, isolate, and display information indicative of what is…
SSME structural dynamic model development, phase 2
NASA Technical Reports Server (NTRS)
Foley, M. J.; Wilson, V. L.
1985-01-01
A set of test correlated mathematical models of the SSME High Pressure Oxygen Turbopump (HPOTP) housing and rotor assembly was produced. New analysis methods within the EISI/EAL and SPAR systems were investigated and runstreams for future use were developed. The LOX pump models have undergone extensive modification since the first phase of this effort was completed. The rotor assembly from the original model was abandoned and a new, more detailed model constructed. A description of the new rotor math model is presented. Also, the pump housing model was continually modified as additional test data have become available. This model is documented along with measured test results. Many of the more advanced features of the EAL/SPAR finite element analysis system were exercised. These included the cyclic symmetry option, the macro-element procedures, and the fluid analysis capability. In addition, a new tool was developed that allows an automated analysis of a disjoint structure in terms of its component modes. A complete description of the implementation of the Craig-Bampton method is given along with two worked examples.
Development of a structured sensory honey analysis: application to artisanal Madrid honeys.
González, M M; de Lorenzo, C; Pérez, R A
2010-02-01
In this work a methodology to evaluate the sensory properties of honeys has been developed. The sensory analysis was carried out by means of a quantitative descriptive analysis (QDA) method, based on several reference scales, for the coverage of the designed range for each descriptor. The peculiarity of this sensory analysis is that the reference scales have been constituted by common foodstuffs agreed upon by consensus of the panel. The main sensory attributes evaluated in the analyses were: adhesiveness, viscosity, bitterness, aroma, sweetness, acidity, color and granularity. Both the intensity and persistence of honey aromas have also been estimated, together with the classification of the identified aromatic attributes into different groups. The method was applied to 55 artisanal honeys from Madrid (Spain) with the following results: (i) the developed sensory profile sheet allowed a satisfactory description of Madrid honeys; (ii) correlations between sensory attributes of three broad groups of Madrid honeys were obtained and (iii) aroma persistence, sweetness, bitterness, color and granularity appeared as the main sensorial characteristics of honey with discrimination power between floral and honeydew honeys.
The purpose of this SOP is to describe how to collect, store, and ship tap and drinking water samples for analysis by EPA Method 525.2 (revision 1.0) and EPA method 531.1 (revision 3). This SOP provides a brief description of the sample containers, collection, preservation, stor...
The TMDL Program Results Analysis Project: Matching Results Measures with Program Expectations
The paper provides a detailed description of the aims, methods and outputs of the program evaluation project undertaken by EPA in order to generate the insights needed to make TMDL program improvements.
NASA Technical Reports Server (NTRS)
Keith, J. S.; Ferguson, D. R.; Heck, P. H.
1972-01-01
The computer program, Streamtube Curvature Analysis, is described for the engineering user and for the programmer. The user-oriented documentation includes a description of the mathematical governing equations, their use in the solution, and the method of solution. The general logical flow of the program is outlined and detailed instructions for program usage and operation are explained. General procedures for program use and the program capabilities and limitations are described. From the standpoint of the programmer, the overlay structure of the program is described. The various storage tables are defined and their uses explained. The input and output are discussed in detail. The program listing includes numerous comments so that the logical flow within the program is easily followed. A test case showing input data and output format is included as well as an error printout description.
NASA Astrophysics Data System (ADS)
Syerliana, L.; Muslim; Setiawan, W.
2018-05-01
This study aims to profile the argumentation skills of high school students in Kabupaten Subang. To achieve this goal, the researcher conducted a descriptive study analyzing the argumentation-skill test results of 35 grade XII students at SMAN. Data were collected using an argumentation test validated by experts and were analyzed using TAP (Toulmin Argumentation Pattern), which consists of components such as data, claim, warrant, backing, and rebuttal, on the topic of hydrostatic pressure. The method used in this research is the descriptive method. The results show that the students' scientific argumentation skill is still low, as shown by average scores of 54% for claim, 38% for data, 29% for warrant, 35% for backing, and 35% for rebuttal. These findings will serve as a basis for further research on innovative learning models that can improve students' argumentation skill.
ERIC Educational Resources Information Center
Güngör, Sema Nur; Özkan, Muhlis
2016-01-01
The aim of this study is to teach enzymes, a biology topic that students have great difficulty understanding, to pre-service teachers through the POE method, using the case of catalase, an oxidoreductase. The descriptive analysis method was employed in this study, in which 38 second-grade pre-service teachers attending Uludag…
Music viewed by its entropy content: A novel window for comparative analysis
Febres, Gerardo; Jaffe, Klaus
2017-01-01
Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the ‘2nd Order Entropy’. Applying these methods to a variety of musical pieces showed how the space of ‘symbolic specific diversity-entropy’ and that of ‘2nd order entropy’ captures characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historic trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning. PMID:29040288
Music viewed by its entropy content: A novel window for comparative analysis.
Febres, Gerardo; Jaffe, Klaus
2017-01-01
Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the '2nd Order Entropy'. Applying these methods to a variety of musical pieces showed how the space of 'symbolic specific diversity-entropy' and that of '2nd order entropy' captures characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historic trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning.
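The following short Python sketch is only a loose analogue of the approach described (it is not the authors' Fundamental Scale algorithm): it builds a frequency-ranked symbol profile, computes its Shannon entropy, and measures the deviation of the profile from a perfect Zipfian profile, which is the intuition behind the paper's '2nd Order Entropy':

    import collections
    import math

    def rank_profile(symbols):
        counts = collections.Counter(symbols)
        freqs = sorted(counts.values(), reverse=True)
        total = sum(freqs)
        return [f / total for f in freqs]                 # frequency-ranked probability profile

    def shannon_entropy(profile):
        return -sum(p * math.log2(p) for p in profile if p > 0)

    def zipf_deviation(profile):
        # mean absolute deviation from a perfect Zipfian profile of the same length
        n = len(profile)
        harmonic = sum(1.0 / r for r in range(1, n + 1))
        zipf = [1.0 / (r * harmonic) for r in range(1, n + 1)]
        return sum(abs(p - z) for p, z in zip(profile, zipf)) / n

    sequence = list("abracadabra abracadabra alakazam")    # toy symbol sequence
    profile = rank_profile(sequence)
    print(shannon_entropy(profile), zipf_deviation(profile))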
Analysis of the shapes of hemocytes of Callista brevisiphonata in vitro (Bivalvia, Veneridae).
Karetin, Yu A; Pushchin, I I
2015-08-01
Fractal formalism in conjunction with linear methods of image analysis is suitable for the comparative analysis of such "irregular" shapes (from the point of view of classical Euclidean geometry) as flattened amoeboid cells of invertebrates in vitro. Cell morphology of in vitro spreading hemocytes from the bivalve mollusc Callista brevisiphonata was analyzed using correlation, factor and cluster analysis. Four significantly different cell types were identified on the basis of 36 linear and nonlinear parameters. The analysis confirmed the adequacy of the selected methodology for numerical description of the shape and the adequacy of classification of nonlinear shapes of spread hemocytes belonging to the same species. Investigation has practical significance for the description of the morphology of cultured cells, since cell shape is a result of summation of a number of extracellular and intracellular factors. © 2015 International Society for Advancement of Cytometry.
Ghosh, Debasree; Chattopadhyay, Parimal
2012-06-01
The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of the fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes specially color and appearance, body texture, flavor, overall acceptability and acidity of the fermented food products like cow milk curd and soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified the six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of principal components using multiple least squares regression (R (2) = 0.8). The result from PCA was statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.
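A compact Python sketch of the statistical pipeline named in the abstract (PCA of panel ratings followed by least-squares modelling of overall quality on the components) is shown below; the ratings matrix, attribute count, and synthetic acceptability scores are invented purely for illustration:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(2)
    ratings = rng.uniform(1, 9, size=(40, 6))                 # 40 samples x 6 sensory attributes (simulated)
    overall = ratings.mean(axis=1) + rng.normal(0, 0.3, 40)   # synthetic overall acceptability scores

    pca = PCA(n_components=6)
    components = pca.fit_transform(ratings)                   # principal-component scores
    model = LinearRegression().fit(components, overall)       # overall quality ~ principal components
    print(pca.explained_variance_ratio_.cumsum(), model.score(components, overall))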
Methodological Reporting of Randomized Trials in Five Leading Chinese Nursing Journals
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Background Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. Methods In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. Results In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34±0.97 (Mean ± SD). No RCT reported descriptions and changes in “trial design,” changes in “outcomes” and “implementation,” or descriptions of the similarity of interventions for “blinding.” Poor reporting was found in detailing the “settings of participants” (13.1%), “type of randomization sequence generation” (1.8%), calculation methods of “sample size” (0.4%), explanation of any interim analyses and stopping guidelines for “sample size” (0.3%), “allocation concealment mechanism” (0.3%), additional analyses in “statistical methods” (2.1%), and targeted subjects and methods of “blinding” (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of “participants,” “interventions,” and definitions of the “outcomes” and “statistical methods.” The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. Conclusions The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods. PMID:25415382
Conducting qualitative research in mental health: Thematic and content analyses.
Crowe, Marie; Inder, Maree; Porter, Richard
2015-07-01
The objective of this paper is to describe two methods of qualitative analysis - thematic analysis and content analysis - and to examine their use in a mental health context. A description of the processes of thematic analysis and content analysis is provided. These processes are then illustrated by conducting two analyses of the same qualitative data. Transcripts of qualitative interviews are analysed using each method to illustrate these processes. The illustration of the processes highlights the different outcomes from the same set of data. Thematic and content analyses are qualitative methods that serve different research purposes. Thematic analysis provides an interpretation of participants' meanings, while content analysis is a direct representation of participants' responses. These methods provide two ways of understanding meanings and experiences and provide important knowledge in a mental health context. © The Royal Australian and New Zealand College of Psychiatrists 2015.
A compilation and analysis of helicopter handling qualities data. Volume 2: Data analysis
NASA Technical Reports Server (NTRS)
Heffley, R. K.
1979-01-01
A compilation and an analysis of helicopter handling qualities data are presented. Multiloop manual control methods are used to analyze the descriptive data, stability derivatives, and transfer functions for a six degrees of freedom, quasi static model. A compensatory loop structure is applied to coupled longitudinal, lateral and directional equations in such a way that key handling qualities features are examined directly.
2012-05-01
with HPLC and PCBs with GC-ECD. Details of the chemical analysis are not included in this description but standard methods are referenced. Other...Analysis of samples to get the accumulated uptake in the fiber...Determination of pore water...QC samples for chemical analysis
Tertiary structure-based analysis of microRNA–target interactions
Gan, Hin Hark; Gunsalus, Kristin C.
2013-01-01
Current computational analysis of microRNA interactions is based largely on primary and secondary structure analysis. Computationally efficient tertiary structure-based methods are needed to enable more realistic modeling of the molecular interactions underlying miRNA-mediated translational repression. We incorporate algorithms for predicting duplex RNA structures, ionic strength effects, duplex entropy and free energy, and docking of duplex–Argonaute protein complexes into a pipeline to model and predict miRNA–target duplex binding energies. To ensure modeling accuracy and computational efficiency, we use an all-atom description of RNA and a continuum description of ionic interactions using the Poisson–Boltzmann equation. Our method predicts the conformations of two constructs of Caenorhabditis elegans let-7 miRNA–target duplexes to an accuracy of ∼3.8 Å root mean square distance of their NMR structures. We also show that the computed duplex formation enthalpies, entropies, and free energies for eight miRNA–target duplexes agree with titration calorimetry data. Analysis of duplex–Argonaute docking shows that structural distortions arising from single-base-pair mismatches in the seed region influence the activity of the complex by destabilizing both duplex hybridization and its association with Argonaute. Collectively, these results demonstrate that tertiary structure-based modeling of miRNA interactions can reveal structural mechanisms not accessible with current secondary structure-based methods. PMID:23417009
Picture grammars in classification and semantic interpretation of 3D coronary vessels visualisations
NASA Astrophysics Data System (ADS)
Ogiela, M. R.; Tadeusiewicz, R.; Trzupek, M.
2009-09-01
The work presents the new opportunity for making semantic descriptions and analysis of medical structures, especially coronary vessels CT spatial reconstructions, with the use of AI graph-based linguistic formalisms. In the paper there will be discussed the manners of applying methods of computational intelligence to the development of a syntactic semantic description of spatial visualisations of the heart's coronary vessels. Such descriptions may be used for both smart ordering of images while archiving them and for their semantic searches in medical multimedia databases. Presented methodology of analysis can furthermore be used for attaining other goals related performance of computer-assisted semantic interpretation of selected elements and/or the entire 3D structure of the coronary vascular tree. These goals are achieved through the use of graph-based image formalisms based on IE graphs generating grammars that allow discovering and automatic semantic interpretation of irregularities visualised on the images obtained during diagnostic examinations of the heart muscle. The basis for the construction of 3D reconstructions of biological objects used in this work are visualisations obtained from helical CT scans, yet the method itself may be applied also for other methods of medical 3D images acquisition. The obtained semantic information makes it possible to make a description of the structure focused on the semantics of various morphological forms of the visualised vessels from the point of view of the operation of coronary circulation and the blood supply of the heart muscle. Thanks to these, the analysis conducted allows fast and — to a great degree — automated interpretation of the semantics of various morphological changes in the coronary vascular tree, and especially makes it possible to detect these stenoses in the lumen of the vessels that can cause critical decrease in blood supply to extensive or especially important fragments of the heart muscle.
The Use of Non-Standard Devices in Finite Element Analysis
NASA Technical Reports Server (NTRS)
Schur, Willi W.; Broduer, Steve (Technical Monitor)
2001-01-01
A general mathematical description of the response behavior of thin-skin pneumatic envelopes and many other membrane and cable structures produces under-constrained systems that pose severe difficulties to analysis. These systems are mobile, and the general mathematical description exposes the mobility. Yet the response behavior of special under-constrained structures under special loadings can be accurately predicted using a constrained mathematical description. The static response behavior of systems that are infinitesimally mobile, such as a non-slack membrane subtended from a rigid or elastic boundary frame, can be easily analyzed using such general mathematical description as afforded by the non-linear, finite element method using an implicit solution scheme if the incremental uploading is guided through a suitable path. Similarly, if such structures are assembled with structural lack of fit that provides suitable self-stress, then dynamic response behavior can be predicted by the non-linear, finite element method and an implicit solution scheme. An explicit solution scheme is available for evolution problems. Such scheme can be used via the method of dynamic relaxation to obtain the solution to a static problem. In some sense, pneumatic envelopes and many other compliant structures can be said to have destiny under a specified loading system. What that means to the analyst is that what happens on the evolution path of the solution is irrelevant as long as equilibrium is achieved at destiny under full load and that the equilibrium is stable in the vicinity of that load. The purpose of this paper is to alert practitioners to the fact that non-standard procedures in finite element analysis are useful and can be legitimate although they burden their users with the requirement to use special caution. Some interesting findings that are useful to the US Scientific Balloon Program and that could not be obtained without non-standard techniques are presented.
Getting past first base: Going all the way with Cognitive Work Analysis.
McIlroy, Rich C; Stanton, Neville A
2011-01-01
This paper reports the application of Cognitive Work Analysis (CWA) to the problem of communications planning in military aviation. Applications of CWA rarely get beyond the first one or two phases; this paper presents an analysis in which all five phases have been completed. The method offers a formative description of the system, defining the set of boundaries and constraints that shape system activity in terms of work domain, recurring activities, decision making, social organisation and worker competency requirements. It is an analysis that is well suited to environments in which the occurrence of unanticipated events can have serious implications for both safety and productivity. Communications planning in military aviation is such an environment. The outputs of the analysis provided an extensive and exhaustive description of the system, highlighting the uneven spread of activity, across actors involved in communications planning and across the situations in which planning can occur. In addition, a new method for informing worker competency requirements based on abstract functions rather than specific decision steps is proposed and discussed in terms of job design, interface design, and person specification. Copyright © 2010 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Management and Feasibility Analysis of Smoked Fish Business in Ambon
NASA Astrophysics Data System (ADS)
Nanlohy, Hellen; Apituley, Yolanda M. T. N.; Tapotubun, Alfonsina M.; Reiuwpassa, Frederik; Matrutty, Theodora E. A. A.
2017-10-01
This research aims to examine the management and feasibility aspects of the smoked fish business in Ambon. Using a survey, this research focused on smoked fish businesses in Negeri Hative Kecil and Silale, known as dried fish producer villages. Primary and secondary data were collected through interviews, observation, and recording. The analysis methods used were qualitative descriptive analysis and business feasibility analysis covering NPV, Payback Period (PP), and Break-Even Point (BEP). The results show that most of the smoked fish businessmen do not apply proper management in their business. Two to three people do all the work (from production to marketing) without a clear job description. The feasibility analysis for the smoked fish business in Negeri Hative Kecil shows that NPV is IDR 21,501,053, PP is 58 days, and the Benefit Cost Ratio (B/C) is 1.06; BEP Production is 1,455 kg and BEP Price is IDR 19,941. The feasibility analysis for the smoked fish business in Desa Silale shows that NPV is IDR 30,745,837, PP is 24 days, and the Benefit Cost Ratio (B/C) is 1.41; BEP Production is 988 kg and BEP Price is IDR 7,966. Based on these results, both smoked fish businesses in Ambon are feasible to develop. However, good management with a clear job description should be applied to improve the business.
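For readers who want the feasibility measures spelled out, the small Python sketch below implements NPV, payback period, and break-even point in their textbook forms; the cash flows, discount rate, prices, and costs are hypothetical and are not figures from the study:

    def npv(rate, cashflows):
        # cashflows[0] is the (negative) initial investment at time zero
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    def payback_period(cashflows):
        cumulative = 0.0
        for t, cf in enumerate(cashflows):
            cumulative += cf
            if cumulative >= 0:
                return t                                   # first period in which the outlay is recovered
        return None

    def break_even_units(fixed_cost, price, variable_cost):
        return fixed_cost / (price - variable_cost)

    flows = [-50_000_000, 12_000_000, 15_000_000, 18_000_000, 20_000_000]  # IDR, hypothetical
    print(npv(0.10, flows), payback_period(flows), break_even_units(8_000_000, 20_000, 12_000))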
ERIC Educational Resources Information Center
Roth-Yousey, Lori; Chu, Yen Li; Reicks, Marla
2012-01-01
Objective: To understand parent beverage expectations for early adolescents (EAs) by eating occasion at home and in various settings. Methods: Descriptive study using focus group interviews and the constant comparative method for qualitative data analysis. Results: Six focus groups were completed, and 2 were conducted in Spanish. Participants (n =…
Ritualizing Expertise: Non-Montessorian View of the Montessori Method
ERIC Educational Resources Information Center
Cossentino, Jacqueline
2005-01-01
This article examines the practice of Montessori education through the lens of ritual. Anchored by description and analysis of a lesson in an elementary classroom, the lesson is viewed as a series of ritualized interactions in which both teacher and student act out multiple layers of expertise within the cultural frame of the Montessori method.…
Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance
Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield
2013-01-01
The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...
Ju, Lining; Wang, Yijie Dylan; Hung, Ying; Wu, Chien-Fu Jeff; Zhu, Cheng
2013-01-01
Motivation: Abrupt reduction/resumption of thermal fluctuations of a force probe has been used to identify association/dissociation events of protein–ligand bonds. We show that off-rate of molecular dissociation can be estimated by the analysis of the bond lifetime, while the on-rate of molecular association can be estimated by the analysis of the waiting time between two neighboring bond events. However, the analysis relies heavily on subjective judgments and is time-consuming. To automate the process of mapping out bond events from thermal fluctuation data, we develop a hidden Markov model (HMM)-based method. Results: The HMM method represents the bond state by a hidden variable with two values: bound and unbound. The bond association/dissociation is visualized and pinpointed. We apply the method to analyze a key receptor–ligand interaction in the early stage of hemostasis and thrombosis: the von Willebrand factor (VWF) binding to platelet glycoprotein Ibα (GPIbα). The numbers of bond lifetime and waiting time events estimated by the HMM are much more than those estimated by a descriptive statistical method from the same set of raw data. The kinetic parameters estimated by the HMM are in excellent agreement with those by a descriptive statistical analysis, but have much smaller errors for both wild-type and two mutant VWF-A1 domains. Thus, the computerized analysis allows us to speed up the analysis and improve the quality of estimates of receptor–ligand binding kinetics. Contact: jeffwu@isye.gatech.edu or cheng.zhu@bme.gatech.edu PMID:23599504
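A hedged sketch of the two-state idea described above is given below in Python using the hmmlearn package; it is not the authors' pipeline, and the synthetic trace, the state count, and the run-length bookkeeping are assumptions made for illustration:

    import itertools
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(3)
    # synthetic force-probe trace: low-variance stretches mimic bound intervals,
    # high-variance stretches mimic unbound (freely fluctuating) intervals
    trace = np.concatenate([rng.normal(0, 0.2, 300),
                            rng.normal(0, 1.0, 300),
                            rng.normal(0, 0.2, 300)]).reshape(-1, 1)

    model = GaussianHMM(n_components=2, covariance_type="full", n_iter=100, random_state=0)
    model.fit(trace)
    states = model.predict(trace)                 # bound/unbound label for each time point

    # run lengths of consecutive identical states give lifetime and waiting-time samples
    runs = [(state, sum(1 for _ in group)) for state, group in itertools.groupby(states)]

Off-rates and on-rates would then be estimated from the distributions of the two kinds of run lengths, as the abstract describes.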
Phase walk analysis of leptokurtic time series.
Schreiber, Korbinian; Modest, Heike I; Räth, Christoph
2018-06-01
The Fourier phase information plays a key role in the quantified description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are now applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. Hereby, we investigate the properties of leptokurtic time series and their influence on the Fourier phases of time series. The phase walk analysis is applied to measured and simulated intermittent time series, whose probability density distribution is approximated by power laws. We use the day-to-day returns of the Dow-Jones industrial average, a synthetic time series with tailored nonlinearities mimicking the power law behavior of the Dow-Jones, and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significances for nonlinear behavior. Due to the drastically decreased computing time as compared to embedding space methods, the number of surrogate realizations can be increased by orders of magnitude. Thereby, the probability distribution of the test statistics can very accurately be derived and parameterized, which allows for much more precise tests on nonlinearities.
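The rough Python sketch below conveys the idea rather than the published algorithm; the specific walk statistic, the surrogate construction, and the toy leptokurtic series are all assumptions made for illustration:

    import numpy as np

    def phase_walk_statistic(x):
        phases = np.unwrap(np.angle(np.fft.rfft(x)))
        steps = np.diff(phases)                            # phase increments as walk steps
        walk = np.cumsum(steps - steps.mean())             # de-trended cumulative phase walk
        return np.max(np.abs(walk)) / np.sqrt(len(steps))

    def phase_randomised_surrogate(x, rng):
        spectrum = np.fft.rfft(x)
        random_phases = rng.uniform(0, 2 * np.pi, len(spectrum))
        return np.fft.irfft(np.abs(spectrum) * np.exp(1j * random_phases), n=len(x))

    rng = np.random.default_rng(4)
    data = rng.standard_t(df=3, size=4096)                 # heavy-tailed (leptokurtic) toy series
    observed = phase_walk_statistic(data)
    null = [phase_walk_statistic(phase_randomised_surrogate(data, rng)) for _ in range(200)]
    p_value = (np.sum(np.array(null) >= observed) + 1) / (len(null) + 1)

Surrogates preserve the amplitude spectrum while destroying phase correlations, so a small p-value points to structure carried in the phases, which is the kind of test the abstract describes.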
Exploring Ways to Implement the Health Services Mobility Study: A Feasibility Study.
ERIC Educational Resources Information Center
Lavine, Eileen M.; Moore, Audrey
A feasibility study was aimed at developing a strategy for implementing and utilizing the job analysis methodology which resulted from the Health Services Mobility Study (HSMS), particularly as it can be applied to the field of diagnostic radiology. (The HSMS method of job analysis starts with task descriptions analyzing the tasks that make up a…
ERIC Educational Resources Information Center
Muller, Veronica; Brooks, Jessica; Tu, Wei-Mo; Moser, Erin; Lo, Chu-Ling; Chan, Fong
2015-01-01
Purpose: The main objective of this study was to determine the extent to which physical and cognitive-affective factors are associated with fibromyalgia (FM) fatigue. Method: A quantitative descriptive design using correlation techniques and multiple regression analysis. The participants consisted of 302 members of the National Fibromyalgia &…
An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis
ERIC Educational Resources Information Center
Diwakar, Rekha
2017-01-01
Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data which does not contain negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…
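A brief Python example of the point being made follows; the simulated data and the use of maximum-likelihood fits and a Shapiro-Wilk check are choices made only for illustration:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    data = rng.lognormal(mean=1.0, sigma=0.8, size=500)    # positive, right-skewed sample

    loc, scale = stats.norm.fit(data)
    ll_normal = stats.norm.logpdf(data, loc, scale).sum()
    shape, loc0, scale0 = stats.lognorm.fit(data, floc=0)
    ll_lognormal = stats.lognorm.logpdf(data, shape, loc0, scale0).sum()
    print(ll_normal, ll_lognormal)                         # the lognormal log-likelihood is higher

    # equivalently, the log-transformed data pass a normality check that the raw data fail
    print(stats.shapiro(data).pvalue, stats.shapiro(np.log(data)).pvalue)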
ERIC Educational Resources Information Center
Moallem, Mahnaz
A study was conducted to analyze current job announcements in the field of instructional design and technology and to produce descriptive information that portrays the required skills and areas of knowledge for instructional technology graduates. Content analysis, in its general terms, was used as the research method for this study. One hundred…
A Programmer-Oriented Approach to Safe Concurrency
2003-05-01
and leaving a synchronized block additionally has effects on the management of memory values in the JMM. The practical outcome of these effects is...object-oriented effects system; (3) analysis to track the association of locks with regions; (4) policy descriptions for allowable method...Analysis...An Object-Oriented Effects System...Regions Identify State
Quantitative and Descriptive Comparison of Four Acoustic Analysis Systems: Vowel Measurements
ERIC Educational Resources Information Center
Burris, Carlyn; Vorperian, Houri K.; Fourakis, Marios; Kent, Ray D.; Bolt, Daniel M.
2014-01-01
Purpose: This study examines accuracy and comparability of 4 trademarked acoustic analysis software packages (AASPs): Praat, WaveSurfer, TF32, and CSL by using synthesized and natural vowels. Features of AASPs are also described. Method: Synthesized and natural vowels were analyzed using each of the AASP's default settings to secure 9…
Secondary Analysis of the "Love Me...Never Shake Me" SBS Education Program
ERIC Educational Resources Information Center
Deyo, Grace; Skybo, Theresa; Carroll, Alisa
2008-01-01
Objective: Shaken baby syndrome (SBS) is preventable; however, an estimated 21-74 per 100,000 children worldwide are victims annually. This study examined the effectiveness of an SBS prevention program in the US. Methods: A descriptive, secondary analysis of the Prevent Child Abuse Ohio (PCAO) "Love Me...Never Shake Me" SBS education program…
ERIC Educational Resources Information Center
Sandefur, James T.
1991-01-01
Discussed is the process of translating situations involving changing quantities into mathematical relationships. This process, called dynamical modeling, allows students to learn new mathematics while sharpening their algebraic skills. A description of dynamical systems, problem-solving methods, a graphical analysis, and available classroom…
Transonic propulsion system integration analysis at McDonnell Aircraft Company
NASA Technical Reports Server (NTRS)
Cosner, Raymond R.
1989-01-01
The technology of Computational Fluid Dynamics (CFD) is becoming an important tool in the development of aircraft propulsion systems. Two of the most valuable features of CFD are: (1) quick acquisition of flow field data; and (2) complete description of flow fields, allowing detailed investigation of interactions. Current analysis methods complement wind tunnel testing in several ways. Herein, the discussion is focused on CFD methods. However, aircraft design studies need data from both CFD and wind tunnel testing. Each approach complements the other.
Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.
2017-01-01
Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968
Behavior dynamics: One perspective
Marr, M. Jackson
1992-01-01
Behavior dynamics is a field devoted to analytic descriptions of behavior change. A principal source of both models and methods for these descriptions is found in physics. This approach is an extension of a long conceptual association between behavior analysis and physics. A theme common to both is the role of molar versus molecular events in description and prediction. Similarities and differences in how these events are treated are discussed. Two examples are presented that illustrate possible correspondence between mechanical and behavioral systems. The first demonstrates the use of a mechanical model to describe the molar properties of behavior under changing reinforcement conditions. The second, dealing with some features of concurrent schedules, focuses on the possible utility of nonlinear dynamical systems to the description of both molar and molecular behavioral events as the outcome of a deterministic, but chaotic, process. PMID:16812655
Lindhardt, Christina Louise; Rubak, Sune; Mogensen, Ole; Hansen, Helle Ploug; Goldstein, Henri; Lamont, Ronald F; Joergensen, Jan Stener
2015-07-01
to explore and describe how healthcare professionals in the Southern Region of Denmark experienced motivational interviewing as a communication method when working with pregnant women with obesity. a qualitative, descriptive study based on face-to-face interviews with 11 obstetric healthcare professionals working in a perinatal setting. a thematic descriptive method was applied to semi-structured interviews. The healthcare professional's experiences were recorded verbatim during individual semi-structured qualitative interviews, transcribed, and analysed using a descriptive analysis methodology. motivational interviewing was found to be a useful method when communicating with obese pregnant women. The method made the healthcare professionals more aware of their own communication style both when encountering pregnant women and in their interaction with colleagues. However, most of the healthcare professionals emphasised that time was crucial and they had to be dedicated to the motivational interviewing method. The healthcare professionals further stated that it enabled them to become more professional in their daily work and made some of them feel less 'burned out', 'powerless' and 'stressed' as they felt they had a communication method in handling difficult workloads. healthcare professionals experienced motivational interviewing to be a useful method when working perinatally. The motivational interviewing method permitted heightened awareness of the healthcare professionals communication method with the patients and increased their ability to handle a difficult workload. Overall, lack of time restricted the use of the motivational interviewing method on a daily basis. Copyright © 2015 Elsevier Ltd. All rights reserved.
Ayeleke, Reuben Olugbenga; North, Nicola; Wallis, Katharine Ann; Liang, Zhanming; Dunham, Annette
2016-01-01
Background: The need for competence training and development in health management and leadership workforces has been emphasised. However, evidence of the outcomes and impact of such training and development has not been systematically assessed. The aim of this review is to synthesise the available evidence of the outcomes and impact of training and development in relation to the competence of health management and leadership workforces. This is with a view to enhancing the development of evidence-informed programmes to improve competence. Methods and Analysis: A systematic review will be undertaken using a mixed-methods research synthesis to identify, assess and synthesise relevant empirical studies. We will search relevant electronic databases and other sources for eligible studies. The eligibility of studies for inclusion will be assessed independently by two review authors. Similarly, the methodological quality of the included studies will be assessed independently by two review authors using appropriate validated instruments. Data from qualitative studies will be synthesised using thematic analysis. For quantitative studies, an appropriate effect size estimate will be calculated for each of the interventions. Where studies are sufficiently similar, their findings will be combined in meta-analyses or meta-syntheses. Findings from quantitative syntheses will be converted into textual descriptions (qualitative themes) using a Bayesian method. Textual descriptions and results of the initial qualitative syntheses that are mutually compatible will be combined in mixed-methods syntheses. Discussion: The outcome of data collection and analysis will lead, first, to a descriptive account of training and development programmes used to improve the competence of health management and leadership workforces and the acceptability of such programmes to participants. Secondly, the outcomes and impact of such programmes in relation to participants’ competence as well as individual and organisational performance will be identified. If possible, the relationship between health contexts and the interventions required to improve management and leadership competence will be examined. PMID:28005551
Demonstration Advanced Avionics System (DAAS), Phase 1
NASA Technical Reports Server (NTRS)
Bailey, A. J.; Bailey, D. G.; Gaabo, R. J.; Lahn, T. G.; Larson, J. C.; Peterson, E. M.; Schuck, J. W.; Rodgers, D. L.; Wroblewski, K. A.
1981-01-01
Demonstration advanced avionics system (DAAS) function description, hardware description, operational evaluation, and failure mode and effects analysis (FMEA) are provided. Projected advanced avionics system (PAAS) description, reliability analysis, cost analysis, maintainability analysis, and modularity analysis are discussed.
Resonating group method as applied to the spectroscopy of α-transfer reactions
NASA Astrophysics Data System (ADS)
Subbotin, V. B.; Semjonov, V. M.; Gridnev, K. A.; Hefter, E. F.
1983-10-01
In the conventional approach to α-transfer reactions the finite- and/or zero-range distorted-wave Born approximation is used in liaison with a macroscopic description of the captured α particle in the residual nucleus. Here the specific example of 16O(6Li,d)20Ne reactions at different projectile energies is taken to present a microscopic resonating group method analysis of the α particle in the final nucleus (for the reaction part the simple zero-range distorted-wave Born approximation is employed). In the discussion of suitable nucleon-nucleon interactions, force number one of the effective interactions presented by Volkov is shown to be most appropriate for the system considered. Application of the continuous analog of Newton's method to the evaluation of the resonating group method equations yields an increased accuracy with respect to traditional methods. The resonating group method description induces only minor changes in the structures of the angular distributions, but it does serve its purpose in yielding reliable and consistent spectroscopic information. NUCLEAR STRUCTURE 16O(6Li,d)20Ne; E=20 to 32 MeV; calculated B(E2), reduced widths, dσ/dΩ; extracted α-spectroscopic factors. ZRDWBA with microscopic RGM description of residual α particle in 20Ne; application of continuous analog of Newton's method; tested and applied Volkov force No. 1; direct mechanism.
Methods for collection and analysis of aquatic biological and microbiological samples
Greeson, Phillip E.; Ehlke, T.A.; Irwin, G.A.; Lium, B.W.; Slack, K.V.
1977-01-01
Chapter A4 contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 discusses biological sampling and sampling statistics. The statistical procedures are accompanied by examples. Part 2 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity, and bioassays. Each method is summarized, and the application, interferences, apparatus, reagents, collection, analysis, calculations, reporting of results, precision and references are given. Part 3 consists of a glossary. Part 4 is a list of taxonomic references.
A Descriptive Guide to Trade Space Analysis
2015-09-01
… QFD (Quality Function Deployment), RSM (Response Surface Method), RSE (Response Surface Equation), SE (Systems Engineering), SME (Subject Matter Expert) … The report uses response surface equations (RSEs) as surrogate models, applying the RSEs with Monte Carlo simulation to quantitatively explore changes across the surfaces …
DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.
2004-03-24
Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
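In methods of this family, the gel measurement is typically reduced to a lesion frequency by comparing the number-average fragment lengths of treated and control DNA, assuming lesions are converted to strand breaks and are randomly (Poisson) distributed. The sketch below illustrates that conversion only; it is not the authors' software, and the fragment lengths are invented.

```python
# Minimal sketch (not the authors' code): convert number-average fragment
# lengths measured on denaturing (alkaline) gels into a lesion frequency,
# assuming randomly distributed breaks.

def lesion_frequency_per_kb(ln_treated_kb: float, ln_control_kb: float) -> float:
    """Lesions per kilobase from number-average lengths (kb) of treated and
    control DNA populations: phi = 1/Ln(treated) - 1/Ln(control)."""
    return 1.0 / ln_treated_kb - 1.0 / ln_control_kb

# Example with made-up numbers: control averages 48 kb, treated averages 40 kb.
phi = lesion_frequency_per_kb(40.0, 48.0)
print(f"{phi:.5f} lesions/kb  (~{phi * 5e3:.0f} lesions per 5 Mb)")
```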
A control-volume method for analysis of unsteady thrust augmenting ejector flows
NASA Technical Reports Server (NTRS)
Drummond, Colin K.
1988-01-01
A method for predicting transient thrust augmenting ejector characteristics is presented. The analysis blends classic self-similar turbulent jet descriptions with a control volume mixing region discretization to solicit transient effects in a new way. Division of the ejector into an inlet, diffuser, and mixing region corresponds with the assumption of viscous-dominated phenomenon in the latter. Inlet and diffuser analyses are simplified by a quasi-steady analysis, justified by the assumptions that pressure is the forcing function in those regions. Details of the theoretical foundation, the solution algorithm, and sample calculations are given.
Computer program for preliminary design analysis of axial-flow turbines
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1972-01-01
The program method is based on a mean-diameter flow analysis. Input design requirements include power or pressure ratio, flow, temperature, pressure, and speed. Turbine designs are generated for any specified number of stages and for any of three types of velocity diagrams (symmetrical, zero exit swirl, or impulse). Exit turning vanes can be included in the design. Program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, blading angles, and last-stage critical velocity ratios. The report presents the analysis method, a description of input and output with sample cases, and the program listing.
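As a rough illustration of what a mean-diameter preliminary analysis involves (not the NASA program itself), the sketch below estimates the number of stages needed from the Euler work equation and an assumed stage-loading coefficient; all input values and the loading choice are hypothetical.

```python
# Illustrative sketch only (not the NASA program): estimate the number of
# axial turbine stages from the required specific work and an assumed
# stage-loading coefficient psi = dh0 / U^2. All numbers are hypothetical.
import math

def stages_required(power_w, mass_flow_kg_s, mean_diameter_m, speed_rpm,
                    stage_loading=2.0):
    specific_work = power_w / mass_flow_kg_s                    # J/kg overall
    blade_speed = math.pi * mean_diameter_m * speed_rpm / 60.0  # U at mean diameter, m/s
    work_per_stage = stage_loading * blade_speed ** 2           # J/kg per stage
    return math.ceil(specific_work / work_per_stage)

print(stages_required(power_w=2.0e6, mass_flow_kg_s=20.0,
                      mean_diameter_m=0.6, speed_rpm=12000))
```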
Qualitative Research in Palliative Care: Applications to Clinical Trials Work.
Lim, Christopher T; Tadmor, Avia; Fujisawa, Daisuke; MacDonald, James J; Gallagher, Emily R; Eusebio, Justin; Jackson, Vicki A; Temel, Jennifer S; Greer, Joseph A; Hagan, Teresa; Park, Elyse R
2017-08-01
While vast opportunities for using qualitative methods exist within palliative care research, few studies provide practical advice for researchers and clinicians as a roadmap to identify and utilize such opportunities. To provide palliative care clinicians and researchers descriptions of qualitative methodology applied to innovative research questions relative to palliative care research and define basic concepts in qualitative research. Body: We describe three qualitative projects as exemplars to describe major concepts in qualitative analysis of early palliative care: (1) a descriptive analysis of clinician documentation in the electronic health record, (2) a thematic content analysis of palliative care clinician focus groups, and (3) a framework analysis of audio-recorded encounters between patients and clinicians as part of a clinical trial. This study provides a foundation for undertaking qualitative research within palliative care and serves as a framework for use by other palliative care researchers interested in qualitative methodologies.
ERIC Educational Resources Information Center
Guven, Meral; Kurum, Dilruba; Saglam, Mustafa
2012-01-01
The aim of this study was to determine the distance education pre-service teachers' opinions about the teaching practice course. The study was conducted with descriptive method. For data collection, analysis and interpretation, qualitative research method was used. Out of the students enrolled at Open Education Faculty, Department of Pre-school…
Fractal-Based Image Analysis In Radiological Applications
NASA Astrophysics Data System (ADS)
Dellepiane, S.; Serpico, S. B.; Vernazza, G.; Viviani, R.
1987-10-01
We present some preliminary results of a study aimed to assess the actual effectiveness of fractal theory and to define its limitations in the area of medical image analysis for texture description, in particular, in radiological applications. A general analysis to select appropriate parameters (mask size, tolerance on fractal dimension estimation, etc.) has been performed on synthetically generated images of known fractal dimensions. Moreover, we analyzed some radiological images of human organs in which pathological areas can be observed. Input images were subdivided into blocks of 6x6 pixels; then, for each block, the fractal dimension was computed in order to create fractal images whose intensity was related to the D value, i.e., texture behaviour. Results revealed that the fractal images could point out the differences between normal and pathological tissues. By applying histogram-splitting segmentation to the fractal images, pathological areas were isolated. Two different techniques (i.e., the method developed by Pentland and the "blanket" method) were employed to obtain fractal dimension values, and the results were compared; in both cases, the appropriateness of the fractal description of the original images was verified.
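One of the two estimators named above, the "blanket" method, can be sketched as follows; this is a generic illustration rather than the authors' implementation, and the patch size, neighbourhood and range of epsilon are arbitrary choices here.

```python
# Hedged sketch of a "blanket"-style fractal-dimension estimate for a
# grayscale patch (after Peleg et al.); illustrative only.
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def blanket_fractal_dimension(patch: np.ndarray, max_eps: int = 5) -> float:
    upper = patch.astype(float)
    lower = patch.astype(float)
    prev_volume = 0.0
    log_eps, log_area = [], []
    for eps in range(1, max_eps + 1):
        upper = np.maximum(upper + 1, grey_dilation(upper, size=(3, 3)))
        lower = np.minimum(lower - 1, grey_erosion(lower, size=(3, 3)))
        volume = np.sum(upper - lower)
        area = (volume - prev_volume) / 2.0       # surface area at scale eps
        prev_volume = volume
        log_eps.append(np.log(eps))
        log_area.append(np.log(area))
    slope = np.polyfit(log_eps, log_area, 1)[0]   # area ~ eps**(2 - D)
    return 2.0 - slope

rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(32, 32))       # stand-in for an image block
print(blanket_fractal_dimension(patch))
```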
Aboagye-Sarfo, Patrick; Mai, Qun; Sanfilippo, Frank M; Preen, David B; Stewart, Louise M; Fatovich, Daniel M
2015-10-01
To develop multivariate vector-ARMA (VARMA) forecast models for predicting emergency department (ED) demand in Western Australia (WA) and compare them to the benchmark univariate autoregressive moving average (ARMA) and Winters' models. Seven-year monthly WA state-wide public hospital ED presentation data from 2006/07 to 2012/13 were modelled. Graphical and VARMA modelling methods were used for descriptive analysis and model fitting. The VARMA models were compared to the benchmark univariate ARMA and Winters' models to determine their accuracy to predict ED demand. The best models were evaluated by using error correction methods for accuracy. Descriptive analysis of all the dependent variables showed an increasing pattern of ED use with seasonal trends over time. The VARMA models provided a more precise and accurate forecast with smaller confidence intervals and better measures of accuracy in predicting ED demand in WA than the ARMA and Winters' method. VARMA models are a reliable forecasting method to predict ED demand for strategic planning and resource allocation. While the ARMA models are a closely competing alternative, they under-estimated future ED demand. Copyright © 2015 Elsevier Inc. All rights reserved.
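A minimal sketch of the modelling comparison, using synthetic monthly data rather than the WA dataset, might look like the following with the statsmodels library; the series names, model orders and data are illustrative assumptions.

```python
# Illustrative sketch: fit a multivariate VARMA model and a univariate ARMA
# benchmark to synthetic monthly data, then compare one-step-ahead forecasts.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.varmax import VARMAX
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
n = 84                                   # seven years of monthly observations
trend = np.linspace(100, 160, n)
season = 10 * np.sin(2 * np.pi * np.arange(n) / 12)
demand = pd.DataFrame({
    "ed_presentations": trend + season + rng.normal(0, 5, n),
    "admissions":       0.3 * trend + 0.5 * season + rng.normal(0, 3, n),
})

train, test = demand.iloc[:-12], demand.iloc[-12:]

varma = VARMAX(train, order=(2, 1)).fit(disp=False)               # VARMA(2,1)
arma = ARIMA(train["ed_presentations"], order=(2, 0, 1)).fit()     # ARMA(2,1)

varma_fc = varma.forecast(steps=12)["ed_presentations"]
arma_fc = arma.forecast(steps=12)

mae = lambda f: np.mean(np.abs(f.to_numpy() - test["ed_presentations"].to_numpy()))
print("VARMA MAE:", mae(varma_fc), " ARMA MAE:", mae(arma_fc))
```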
ERIC Educational Resources Information Center
Kosko, Karl W.; Herbst, Patricio
2012-01-01
Analysis of teacher-to-teacher talk provides researchers with useful information regarding the teaching profession and teachers' perspectives. This article provides a description of a method, with accompanying example, examining teacher-to-teacher talk by incorporating semantic modality and examining trends of its usage in a quantitative manner.…
A Metaphor Analysis of the Fifth Grade Students' Perceptions about Writing
ERIC Educational Resources Information Center
Erdogan, Tolga; Erdogan, Özge
2013-01-01
The aim of this study is to examine the fifth grade students' perceptions about writing through metaphor analysis. The study is descriptive in nature, and a qualitative research method was employed. The participants of the study are a total of 594 fifth graders in the city of Ankara. The students are asked to complete the…
Index in Alexandre Dumas' Novel the Man in the Iron Mask: A Semiotic Analysis
ERIC Educational Resources Information Center
Syarifuddin, Salmia; Yahya, Andi Rukayah Alim; Jusoff, Kamaruzaman; Makhsud, Abdul
2013-01-01
Novel as a literary work can be analyzed by using semiotic analysis. This article aims to analyze the meaning of index found in characterizations in the novel "The Man in the Iron Mask" by Alexandre Dumas. This article involved the descriptive qualitative method. The results revealed that there are many causal relations between the index…
Technology assessment of solar energy utilization
NASA Astrophysics Data System (ADS)
Jaeger, F.
1985-11-01
The general objectives and methods of Technology Assessment (TA) are outlined. Typical analysis steps of a TA for solar energy are reviewed: description of the technology and its further development; identification of impact areas; analysis of boundary conditions and definition of scenarios; market penetration of solar technologies; projection of consequences in areas of impact; and assessment of impacts and identification of options for action.
The solution of linear systems of equations with a structural analysis code on the NAS CRAY-2
NASA Technical Reports Server (NTRS)
Poole, Eugene L.; Overman, Andrea L.
1988-01-01
Two methods for solving linear systems of equations on the NAS Cray-2 are described. One is a direct method; the other is an iterative method. Both methods exploit the architecture of the Cray-2, particularly the vectorization, and are aimed at structural analysis applications. To demonstrate and evaluate the methods, they were installed in a finite element structural analysis code denoted the Computational Structural Mechanics (CSM) Testbed. A description of the techniques used to integrate the two solvers into the Testbed is given. Storage schemes, memory requirements, operation counts, and reformatting procedures are discussed. Finally, results from the new methods are compared with results from the initial Testbed sparse Choleski equation solver for three structural analysis problems. The new direct solvers described achieve the highest computational rates of the methods compared. The new iterative methods are not able to achieve as high computation rates as the vectorized direct solvers but are best for well conditioned problems which require fewer iterations to converge to the solution.
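The two solver families can be illustrated on a small symmetric positive-definite system; the sketch below is generic (a Cholesky direct solve versus conjugate-gradient iteration) and is not the CSM Testbed code.

```python
# Minimal sketch of the two solver families compared in the paper, applied to
# a small SPD system: a direct (Cholesky) solve and an iterative (CG) solve.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
m = 200
A = rng.standard_normal((m, m))
A = A @ A.T + m * np.eye(m)              # well-conditioned SPD "stiffness" matrix
b = rng.standard_normal(m)

x_direct = cho_solve(cho_factor(A), b)   # direct method
x_iter, info = cg(A, b, atol=1e-10)      # iterative method (conjugate gradients)

print(info, np.linalg.norm(x_direct - x_iter))
```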
Sensory properties of Californian and imported extra virgin olive oils.
Delgado, Claudia; Guinard, Jean-Xavier
2011-04-01
Production and consumption of extra-virgin olive oil have been increasing in the United States, particularly in California. The objective of this study was to compare the sensory characteristics of 22 extra virgin olive oils (EVOO) from California, Italy, Spain, Chile, and Australia using a generic descriptive analysis. A total of 22 sensory attributes were identified and defined by the descriptive panel. With the exception of thick and citrus, all sensory attributes were significantly different among the oils. Canonical Variate Analysis (CVA) showed that California oils differed from some imported EVOOs, mainly by their absence of defects. A second analysis, of only those attributes included in the International Olive Council (IOC) official scorecard, provided a less detailed description of the samples and did not allow for a full characterization of the oils. While the IOC attributes allowed for faster classification in terms of clean versus defective EVOOs, the more comprehensive descriptive analysis provided both more information and a more refined classification of the samples. Variety and region of origin were important factors in the classification of both Californian and imported EVOOs. Measuring olive oil sensory quality using the IOC method (positive attributes of fruitiness, bitterness, and pungency, and defects including fusty, musty, winey, and rancid) allows for the certification of oils as extra virgin, but it provides limited information on the sensory characteristics of the oils. A full descriptive profile, on the other hand, provides information that can be used by producers in the processing and marketing of their oils, and is a useful tool in the education of consumers about the wide range of (positive) sensory attributes in EVOO and the various sensory styles of EVOO.
Beda, Alessandro; Simpson, David M; Faes, Luca
2017-01-01
The growing interest in personalized medicine requires making inferences from descriptive indexes estimated from individual recordings of physiological signals, with statistical analyses focused on individual differences between/within subjects, rather than comparing supposedly homogeneous cohorts. To this end, methods to compute confidence limits of individual estimates of descriptive indexes are needed. This study introduces numerical methods to compute such confidence limits and perform statistical comparisons between indexes derived from autoregressive (AR) modeling of individual time series. Analytical approaches are generally not viable, because the indexes are usually nonlinear functions of the AR parameters. We exploit Monte Carlo (MC) and Bootstrap (BS) methods to reproduce the sampling distribution of the AR parameters and indexes computed from them. Here, these methods are implemented for spectral and information-theoretic indexes of heart-rate variability (HRV) estimated from AR models of heart-period time series. First, the MC and BS methods are tested in a wide range of synthetic HRV time series, showing good agreement with a gold-standard approach (i.e. multiple realizations of the "true" process driving the simulation). Then, real HRV time series measured from volunteers performing cognitive tasks are considered, documenting (i) the strong variability of confidence limits' width across recordings, (ii) the diversity of individual responses to the same task, and (iii) frequent disagreement between the cohort-average response and that of many individuals. We conclude that MC and BS methods are robust in estimating confidence limits of these AR-based indexes and thus recommended for short-term HRV analysis. Moreover, the strong inter-individual differences in the response to tasks shown by AR-based indexes evidence the need of individual-by-individual assessments of HRV features. Given their generality, MC and BS methods are promising for applications in biomedical signal processing and beyond, providing a powerful new tool for assessing the confidence limits of indexes estimated from individual recordings. PMID:28968394
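A hedged sketch of the general workflow (not the authors' implementation): fit an AR model to a single synthetic series, derive a band-power index from its spectral density, and bootstrap the residuals to obtain individual confidence limits. The model order, band edges and data below are illustrative assumptions.

```python
# Illustrative residual-bootstrap confidence limits for an AR-based spectral index.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def ar_band_power(series, order=8, fs=4.0, band=(0.04, 0.15)):
    """Band power from the AR spectral density; returns (power, fitted model)."""
    fit = AutoReg(series, lags=order).fit()
    phi = fit.params[1:]                            # AR coefficients (const is params[0])
    sigma2 = np.mean(fit.resid ** 2)
    freqs = np.linspace(0.0, fs / 2.0, 512)
    z = np.exp(-2j * np.pi * freqs / fs)
    denom = np.polyval(np.r_[1.0, -phi][::-1], z)   # 1 - sum(phi_k z^-k)
    psd = sigma2 / fs / np.abs(denom) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() * (freqs[1] - freqs[0]), fit

def resample_series(series, fit, rng):
    """Residual bootstrap: regenerate a series from the fitted AR model."""
    const, phi = fit.params[0], fit.params[1:]
    resid = fit.resid - fit.resid.mean()
    p, n = len(phi), len(series)
    x = np.array(series, dtype=float)               # keep first p values as start-up
    for t in range(p, n):
        x[t] = const + phi @ x[t - p:t][::-1] + rng.choice(resid)
    return x

rng = np.random.default_rng(2)
n = 300
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(n) / 4.0) \
    + 0.02 * rng.standard_normal(n)                 # synthetic "RR interval" series

point, fit = ar_band_power(rr)
boot = [ar_band_power(resample_series(rr, fit, rng))[0] for _ in range(200)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"LF power {point:.2e}, 95% bootstrap CI [{lo:.2e}, {hi:.2e}]")
```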
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
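Two of the simple approximations in this family can be sketched as follows; the exact formulas evaluated in the review may differ, so these are offered only as commonly cited versions (SD from the range, mean from the three quartiles).

```python
# Commonly cited approximations for missing summary statistics (illustrative;
# possibly not the exact formulas assessed in the review).

def sd_from_range(minimum: float, maximum: float) -> float:
    """Practical approximation: SD ~ range / 4."""
    return (maximum - minimum) / 4.0

def mean_from_quartiles(q1: float, median: float, q3: float) -> float:
    """Approximate mean from lower quartile, median and upper quartile."""
    return (q1 + median + q3) / 3.0

print(sd_from_range(2.0, 18.0))             # -> 4.0
print(mean_from_quartiles(4.0, 7.0, 13.0))  # -> 8.0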
Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
NASA Astrophysics Data System (ADS)
Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D’Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. 
S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.; Fong, H.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; K, Haris; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kéfélian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.-M.; King, E. J.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krishnan, B.; Królak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lück, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Magee, R. 
M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R. M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P. G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O’Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O’Reilly, B.; O’Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Porter, E. K.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sampson, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. 
S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shahriar, M. S.; Shaltev, M.; Shao, Z.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, A. D.; Simakov, D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, J. R.; Smith, N. D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stevenson, S.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. V.; Torrie, C. I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; Vallisneri, M.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Wesels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J. L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrożny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration
2016-12-01
This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.
[Review of research design and statistical methods in Chinese Journal of Cardiology].
Zhang, Li-jun; Yu, Jin-ming
2009-07-01
To evaluate the research design and the use of statistical methods in the Chinese Journal of Cardiology. We reviewed the research design and statistical methods in all of the original papers published in the Chinese Journal of Cardiology from December 2007 to November 2008. The most frequently used research designs were cross-sectional design (34%), prospective design (21%) and experimental design (25%). Of all the articles, 49 (25%) used wrong statistical methods, 29 (15%) lacked some form of statistical analysis, and 23 (12%) had inconsistencies in the description of methods. There were significant differences between different statistical methods (P < 0.001). The rate of correct use of multifactor analysis was low, and repeated measurements data were not analysed with repeated measures methods. Many problems exist in the Chinese Journal of Cardiology. Better research design and correct use of statistical methods are still needed. Stricter review by statisticians and epidemiologists is also required to improve the quality of the literature.
NASA Astrophysics Data System (ADS)
Fujimoto, Kazuhiro J.
2012-07-01
A transition-density-fragment interaction (TDFI) combined with a transfer integral (TI) method is proposed. The TDFI method was previously developed for describing electronic Coulomb interaction, which was applied to excitation-energy transfer (EET) [K. J. Fujimoto and S. Hayashi, J. Am. Chem. Soc. 131, 14152 (2009)] and exciton-coupled circular dichroism spectra [K. J. Fujimoto, J. Chem. Phys. 133, 124101 (2010)]. In the present study, the TDFI method is extended to the exchange interaction, and hence it is combined with the TI method for applying to the EET via charge-transfer (CT) states. In this scheme, the overlap correction is also taken into account. To check the TDFI-TI accuracy, several test calculations are performed to an ethylene dimer. As a result, the TDFI-TI method gives a much improved description of the electronic coupling, compared with the previous TDFI method. Based on the successful description of the electronic coupling, the decomposition analysis is also performed with the TDFI-TI method. The present analysis clearly shows a large contribution from the Coulomb interaction in most of the cases, and a significant influence of the CT states at the small separation. In addition, the exchange interaction is found to be small in this system. The present approach is useful for analyzing and understanding the mechanism of EET.
Asker, Dalal; Awad, Tarek S; Beppu, Teruhiko; Ueda, Kenji
2012-01-01
Astaxanthin is a red ketocarotenoid that exhibits extraordinary health-promoting activities such as antioxidant, anti-inflammatory, antitumor, and immune booster. The recent discovery of the beneficial roles of astaxanthin against many degenerative diseases such as cancers, heart diseases, and exercise-induced fatigue has raised its market demand as a nutraceutical and medicinal ingredient in aquaculture, food, and pharmaceutical industries. To satisfy the growing demand for this high-value nutraceutical ingredient and consumer interest in natural products, many research efforts are being made to discover novel microbial producers with effective biotechnological production of astaxanthin. Using a rapid screening method based on 16S rRNA gene, and effective HPLC-Diodearray-MS methods for carotenoids analysis, we succeeded in isolating a unique astaxanthin-producing bacterium (strain TDMA-17(T)) that belongs to the family Sphingomonadaceae (Asker et al., Appl Microbiol Biotechnol 77: 383-392, 2007). In this chapter, we provide a detailed description of effective HPLC-Diodearray-MS methods for rapid analysis and identification of the carotenoids produced by strain TDMA-17(T). We also describe the methods of isolation and identification for a novel bacterial carotenoid (astaxanthin derivative), a major carotenoid that is produced by strain TDMA-17(T). Finally, we describe the polyphasic taxonomic analysis of strain TDMA-17(T) and the description of a novel species belonging to genus Sphingomonas.
Anxiety (Low Ego Strength) And Intelligence Among Students Of High School Mathematics
NASA Astrophysics Data System (ADS)
Naderi, Habibollah
2008-01-01
The aim of this study was to investigate the relationship between anxiety (low ego strength) and intelligence among high school mathematics students. The effects of anxiety were studied within a sample of 112 subjects (boys): 56 regular students (RS) and 56 intelligent students (IS) from high schools. For the RS, drawn from 3 regular high school mathematics classes, the mean age was 17.1 years (SD = 0.454, range 16-18 years). For the IS, drawn from 4 classes of students with exceptional talent for high school mathematics, the mean age was 16.75 years (SD = 0.436, range 16-17 years). The sampling method was simple randomization. Both descriptive and inferential methods were used for analysis: means, analysis of variance and analysis of covariance for description, and t-tests between the two groups of students for inference. The Cattell Anxiety Test (1958) (CTAT) has been used in a number of studies to measure trait anxiety in Iran. In general, no statistically significant difference was found between the RS and the IS on the low ego strength factor (C-). Further research is needed to investigate whether the current findings hold for student populations assessed with other anxiety tests.
The integration of system specifications and program coding
NASA Technical Reports Server (NTRS)
Luebke, W. R.
1970-01-01
Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that has long and highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.
The geometry of structural equilibrium
2017-01-01
Building on a long tradition from Maxwell, Rankine, Klein and others, this paper puts forward a geometrical description of structural equilibrium which contains a procedure for the graphic analysis of stress resultants within general three-dimensional frames. The method is a natural generalization of Rankine’s reciprocal diagrams for three-dimensional trusses. The vertices and edges of dual abstract 4-polytopes are embedded within dual four-dimensional vector spaces, wherein the oriented area of generalized polygons give all six components (axial and shear forces with torsion and bending moments) of the stress resultants. The relevant quantities may be readily calculated using four-dimensional Clifford algebra. As well as giving access to frame analysis and design, the description resolves a number of long-standing problems with the incompleteness of Rankine’s description of three-dimensional trusses. Examples are given of how the procedure may be applied to structures of engineering interest, including an outline of a two-stage procedure for addressing the equilibrium of loaded gridshell rooves. PMID:28405361
Determining the 40K radioactivity in rocks using x-ray spectrometry
NASA Astrophysics Data System (ADS)
Pilakouta, M.; Kallithrakas-Kontos, N.; Nikolaou, G.
2017-09-01
In this paper we propose an experimental method for the determination of potassium-40 (40K) radioactivity in commercial granite samples using x-ray fluorescence (XRF). The method correlates the total potassium concentration (yield) in samples deduced by XRF analysis with the radioactivity of the sample due to the 40K radionuclide. This method can be used in an undergraduate student laboratory. A brief theoretical background and description of the method, as well as some results and their interpretation, are presented.
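The conversion the method relies on can be illustrated numerically: the specific activity of natural potassium due to 40K is roughly 31 Bq per gram of potassium (from the isotopic abundance and half-life of 40K), so a total-potassium mass fraction from XRF maps directly to an expected activity. The sketch below is illustrative only and the sample values are invented.

```python
# Illustrative conversion from an XRF potassium yield to expected 40K activity.
# The ~31 Bq/g specific activity of natural K is an approximate literature value.

SPECIFIC_ACTIVITY_K = 31.0          # Bq of 40K per gram of natural potassium (approx.)

def k40_activity_bq(sample_mass_g: float, potassium_fraction: float) -> float:
    """Expected 40K activity of a sample from its total-K mass fraction."""
    return sample_mass_g * potassium_fraction * SPECIFIC_ACTIVITY_K

# e.g. a 500 g granite piece containing 3.5 % potassium by mass:
print(f"{k40_activity_bq(500.0, 0.035):.0f} Bq")
```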
Otwombe, Kennedy N.; Petzold, Max; Martinson, Neil; Chirwa, Tobias
2014-01-01
Background Research in the predictors of all-cause mortality in HIV-infected people has been widely reported in the literature. Making an informed decision requires understanding the methods used. Objectives We present a review on study designs, statistical methods and their appropriateness in original articles reporting on predictors of all-cause mortality in HIV-infected people between January 2002 and December 2011. Statistical methods were compared between 2002–2006 and 2007–2011. Time-to-event analysis techniques were considered appropriate. Data Sources Pubmed/Medline. Study Eligibility Criteria Original English-language articles were abstracted. Letters to the editor, editorials, reviews, systematic reviews, meta-analyses, case reports and any other ineligible articles were excluded. Results A total of 189 studies were identified (n = 91 in 2002–2006 and n = 98 in 2007–2011) out of which 130 (69%) were prospective and 56 (30%) were retrospective. One hundred and eighty-two (96%) studies described their sample using descriptive statistics while 32 (17%) made comparisons using t-tests. Kaplan-Meier methods for time-to-event analysis were commonly used in the earlier period (n = 69, 76% vs. n = 53, 54%, p = 0.002). Predictors of mortality in the two periods were commonly determined using Cox regression analysis (n = 67, 75% vs. n = 63, 64%, p = 0.12). Only 7 (4%) used advanced survival analysis methods of Cox regression analysis with frailty in which 6 (3%) were used in the later period. Thirty-two (17%) used logistic regression while 8 (4%) used other methods. There were significantly more articles from the first period using appropriate methods compared to the second (n = 80, 88% vs. n = 69, 70%, p-value = 0.003). Conclusion Descriptive statistics and survival analysis techniques remain the most common methods of analysis in publications on predictors of all-cause mortality in HIV-infected cohorts while prospective research designs are favoured. Sophisticated techniques of time-dependent Cox regression and Cox regression with frailty are scarce. This motivates more training in the use of advanced time-to-event methods. PMID:24498313
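As a reminder of what the time-to-event techniques the review treats as appropriate look like in practice, here is a minimal sketch using the lifelines package on made-up data; the column names and values are hypothetical and unrelated to the reviewed studies.

```python
# Minimal Kaplan-Meier and Cox proportional-hazards sketch on invented data.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "time_months":  [6, 12, 18, 24, 30, 36, 9, 15, 27, 33],
    "died":         [1,  0,  1,  0,  1,  0, 1,  1,  0,  0],
    "baseline_cd4": [150, 420, 90, 210, 500, 480, 120, 180, 160, 450],
})

kmf = KaplanMeierFitter().fit(df["time_months"], event_observed=df["died"])
print(kmf.median_survival_time_)

cph = CoxPHFitter().fit(df, duration_col="time_months", event_col="died")
cph.print_summary()
```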
Toledo, Cíntia Matsuda; Cunha, Andre; Scarton, Carolina; Aluísio, Sandra
2014-01-01
Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. Objective The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. Methods The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described – simple or complex; presentation order – which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo18 were used, which included 200 healthy Brazilians of both genders. Results and Conclusion A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods. PMID:29213908
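A minimal sketch of the recommended classifier (an SVM with an RBF kernel over extracted linguistic features) might look like this with scikit-learn; the feature matrix and labels are synthetic stand-ins, not the study's data.

```python
# Illustrative RBF-kernel SVM over synthetic "linguistic feature" vectors.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 144
X = rng.normal(size=(n, 12))                  # stand-in for extracted features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.8, n) > 0).astype(int)  # two classes

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())
```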
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neilson, James R.; McQueen, Tyrel M.
With the increased availability of high-intensity time-of-flight neutron and synchrotron X-ray scattering sources that can access wide ranges of momentum transfer, the pair distribution function method has become a standard analysis technique for studying disorder of local coordination spheres and at intermediate atomic separations. In some cases, rational modeling of the total scattering data (Bragg and diffuse) becomes intractable with least-squares approaches, necessitating reverse Monte Carlo simulations using large atomistic ensembles. However, the extraction of meaningful information from the resulting atomistic ensembles is challenging, especially at intermediate length scales. Representational analysis is used here to describe the displacements of atoms in reverse Monte Carlo ensembles from an ideal crystallographic structure in an approach analogous to tight-binding methods. Rewriting the displacements in terms of a local basis that is descriptive of the ideal crystallographic symmetry provides a robust approach to characterizing medium-range order (and disorder) and symmetry breaking in complex and disordered crystalline materials. Lastly, this method enables the extraction of statistically relevant displacement modes (orientation, amplitude and distribution) of the crystalline disorder and provides directly meaningful information in a locally symmetry-adapted basis set that is most descriptive of the crystal chemistry and physics.
Neilson, James R.; McQueen, Tyrel M.
2015-09-20
With the increased availability of high-intensity time-of-flight neutron and synchrotron X-ray scattering sources that can access wide ranges of momentum transfer, the pair distribution function method has become a standard analysis technique for studying disorder of local coordination spheres and at intermediate atomic separations. In some cases, rational modeling of the total scattering data (Bragg and diffuse) becomes intractable with least-squares approaches, necessitating reverse Monte Carlo simulations using large atomistic ensembles. However, the extraction of meaningful information from the resulting atomistic ensembles is challenging, especially at intermediate length scales. Representational analysis is used here to describe the displacements of atoms in reverse Monte Carlo ensembles from an ideal crystallographic structure in an approach analogous to tight-binding methods. Rewriting the displacements in terms of a local basis that is descriptive of the ideal crystallographic symmetry provides a robust approach to characterizing medium-range order (and disorder) and symmetry breaking in complex and disordered crystalline materials. Lastly, this method enables the extraction of statistically relevant displacement modes (orientation, amplitude and distribution) of the crystalline disorder and provides directly meaningful information in a locally symmetry-adapted basis set that is most descriptive of the crystal chemistry and physics.
De-biasing the dynamic mode decomposition for applied Koopman spectral analysis of noisy datasets
NASA Astrophysics Data System (ADS)
Hemati, Maziar S.; Rowley, Clarence W.; Deem, Eric A.; Cattafesta, Louis N.
2017-08-01
The dynamic mode decomposition (DMD)—a popular method for performing data-driven Koopman spectral analysis—has gained increased popularity for extracting dynamically meaningful spatiotemporal descriptions of fluid flows from snapshot measurements. Often times, DMD descriptions can be used for predictive purposes as well, which enables informed decision-making based on DMD model forecasts. Despite its widespread use and utility, DMD can fail to yield accurate dynamical descriptions when the measured snapshot data are imprecise due to, e.g., sensor noise. Here, we express DMD as a two-stage algorithm in order to isolate a source of systematic error. We show that DMD's first stage, a subspace projection step, systematically introduces bias errors by processing snapshots asymmetrically. To remove this systematic error, we propose utilizing an augmented snapshot matrix in a subspace projection step, as in problems of total least-squares, in order to account for the error present in all snapshots. The resulting unbiased and noise-aware total DMD (TDMD) formulation reduces to standard DMD in the absence of snapshot errors, while the two-stage perspective generalizes the de-biasing framework to other related methods as well. TDMD's performance is demonstrated in numerical and experimental fluids examples. In particular, in the analysis of time-resolved particle image velocimetry data for a separated flow, TDMD outperforms standard DMD by providing dynamical interpretations that are consistent with alternative analysis techniques. Further, TDMD extracts modes that reveal detailed spatial structures missed by standard DMD.
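The de-biasing step can be sketched at a high level: project both snapshot matrices onto the leading right-singular subspace of the augmented matrix formed by stacking X and Y before the usual DMD computation. The following is an illustrative toy example, not the authors' code; the rank choice and the synthetic data are assumptions.

```python
# Hedged sketch of standard DMD versus a total-DMD style de-biasing step.
import numpy as np

def dmd_eigs(X, Y, r):
    """Standard (projected) DMD eigenvalues from snapshot pairs X -> Y, rank r."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, V = U[:, :r], s[:r], Vh.conj().T[:, :r]
    A_tilde = U.conj().T @ Y @ V / s
    return np.linalg.eigvals(A_tilde)

def tdmd_eigs(X, Y, r):
    """Total DMD: project X and Y onto the leading right-singular subspace of
    the augmented matrix [X; Y] before the usual DMD step (de-biasing)."""
    _, _, Vh = np.linalg.svd(np.vstack([X, Y]), full_matrices=False)
    P = Vh[:r].conj().T @ Vh[:r]              # projector onto leading subspace
    return dmd_eigs(X @ P, Y @ P, r)

# Toy example: two decaying modes observed through 20 noisy "sensors".
rng = np.random.default_rng(3)
eigs_true = np.array([0.99, 0.90])
modes = rng.normal(size=(20, 2))
states = eigs_true ** np.arange(61)[:, None]          # modal amplitudes over time
data = modes @ states.T                               # (20, 61) snapshot matrix
noisy = data + 0.02 * rng.normal(size=data.shape)
X, Y = noisy[:, :-1], noisy[:, 1:]

print("true:", eigs_true)
print("DMD :", np.sort(dmd_eigs(X, Y, 2).real)[::-1])
print("TDMD:", np.sort(tdmd_eigs(X, Y, 2).real)[::-1])
```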
Ice Growth Measurements from Image Data to Support Ice Crystal and Mixed-Phase Accretion Testing
NASA Technical Reports Server (NTRS)
Struk, Peter M.; Lynch, Christopher J.
2012-01-01
This paper describes the imaging techniques as well as the analysis methods used to measure the ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as of the analysis methods using the NASA Spotlight software, is presented. Two cases, one from each of two different test entries, showing significant ice growth are analyzed in detail, describing the ice thickness and growth rate, which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally, some of the challenges related to the imaging and analysis methods are discussed, as well as the methods used to overcome them.
Ice Growth Measurements from Image Data to Support Ice-Crystal and Mixed-Phase Accretion Testing
NASA Technical Reports Server (NTRS)
Struk, Peter M.; Lynch, Christopher J.
2012-01-01
This paper describes the imaging techniques as well as the analysis methods used to measure the ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as of the analysis methods using the NASA Spotlight software, is presented. Two cases, one from each of two different test entries, showing significant ice growth are analyzed in detail, describing the ice thickness and growth rate, which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally, some of the challenges related to the imaging and analysis methods are discussed, as well as the methods used to overcome them.
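Because the abstracts above describe ice thickness growth that is generally linear in time, a growth rate can be estimated from image-derived thickness measurements with a simple least-squares line fit. The snippet below uses made-up thickness data and is only an illustration, not the NASA Spotlight workflow.

```python
import numpy as np

# Hypothetical thickness measurements (mm) extracted from images at times t (s).
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
thickness = np.array([0.0, 0.9, 2.1, 2.9, 4.2, 5.0])

# Linear fit: thickness ~ rate * t + offset.
rate, offset = np.polyfit(t, thickness, 1)

# Residual standard error gives a rough feel for measurement scatter.
residuals = thickness - (rate * t + offset)
rse = np.sqrt(np.sum(residuals**2) / (len(t) - 2))

print(f"estimated growth rate: {rate:.3f} mm/s, residual std error: {rse:.3f} mm")
```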
SaaS Platform for Time Series Data Handling
NASA Astrophysics Data System (ADS)
Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail
2018-02-01
The paper is devoted to the description of MathBrain, a cloud-based resource that works on a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, principal component analysis and independent component analysis decompositions, quantitative analysis, and magnetoencephalography inverse problem solution in a single-dipole model based on multichannel spectral data.
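The analysis methods listed for MathBrain (Fourier transforms, PCA and ICA decompositions) are standard time series operations. The sketch below shows offline equivalents with NumPy and scikit-learn on synthetic multichannel data; it illustrates the generic methods only and assumes nothing about MathBrain's actual API.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(1)
fs = 250.0                                 # assumed sampling rate, Hz
t = np.arange(0, 4, 1 / fs)

# Two synthetic sources mixed into four "channels" (stand-in for multichannel data).
sources = np.c_[np.sin(2 * np.pi * 10 * t), np.sign(np.sin(2 * np.pi * 3 * t))]
mixing = rng.standard_normal((2, 4))
channels = sources @ mixing + 0.05 * rng.standard_normal((t.size, 4))

# Direct Fourier transform of one channel (and the inverse as a round trip).
spectrum = np.fft.rfft(channels[:, 0])
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
reconstructed = np.fft.irfft(spectrum, n=t.size)

# PCA and ICA decompositions of the multichannel record.
pca_components = PCA(n_components=2).fit_transform(channels)
ica_components = FastICA(n_components=2, random_state=0).fit_transform(channels)

print(freqs[np.argmax(np.abs(spectrum))], pca_components.shape, ica_components.shape)
```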
NASA Technical Reports Server (NTRS)
Rummler, D. R.
1976-01-01
Results are presented from investigations applying regression techniques to the development of a methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.
The description of cough sounds by healthcare professionals
Smith, Jaclyn A; Ashurst, H Louise; Jack, Sandy; Woodcock, Ashley A; Earis, John E
2006-01-01
Background Little is known of the language healthcare professionals use to describe cough sounds. We aimed to examine how they describe cough sounds and to assess whether these descriptions suggested they appreciate the basic sound qualities (as assessed by acoustic analysis) and the underlying diagnosis of the patient coughing. Methods 53 health professionals from two large respiratory tertiary referral centres were recruited; 22 doctors and 31 staff from professions allied to medicine. Participants listened to 9 sequences of spontaneous cough sounds from common respiratory diseases. For each cough they selected patient gender, the most appropriate descriptors and a diagnosis. Cluster analysis was performed to assess which cough sounds attracted similar descriptions. Results Gender was correctly identified in 93% of cases. The presence or absence of mucus was correct in 76.1% and wheeze in 39.3% of cases. However, identifying clinical diagnosis from cough was poor at 34.0%. Cluster analysis showed that coughs with the same acoustic properties, rather than the same diagnoses, attracted the same descriptions. Conclusion These results suggest that healthcare professionals can recognise some of the qualities of cough sounds but are poor at making diagnoses from them. It remains to be seen whether in the future cough sound acoustics will provide useful clinical information and whether their study will lead to the development of useful new outcome measures in cough monitoring. PMID:16436200
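The cluster analysis step mentioned in the Methods above, grouping coughs that attracted similar descriptions, can be illustrated with a small descriptor-citation matrix and agglomerative clustering. The SciPy sketch below uses invented counts and descriptor labels and is not the authors' analysis.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Rows: 9 cough sequences; columns: how often each descriptor was chosen
# (e.g., "wet", "dry", "wheezy", "barking") -- all values are hypothetical.
citation_counts = np.array([
    [20,  2,  1,  0],
    [18,  3,  2,  1],
    [ 1, 22,  0,  4],
    [ 2, 19,  1,  5],
    [ 0,  3, 21,  2],
    [ 1,  2, 18,  3],
    [19,  1,  2,  0],
    [ 3, 20,  0,  6],
    [ 2,  1, 19,  1],
])

# Agglomerative clustering on descriptor profiles (Ward linkage, Euclidean distance).
tree = linkage(citation_counts, method="ward")
labels = fcluster(tree, t=3, criterion="maxclust")
print(labels)   # coughs sharing a label attracted similar descriptions
```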
A discourse on sensitivity analysis for discretely-modeled structures
NASA Technical Reports Server (NTRS)
Adelman, Howard M.; Haftka, Raphael T.
1991-01-01
A descriptive review is presented of the most recent methods for performing sensitivity analysis of the structural behavior of discretely-modeled systems. The methods are generally but not exclusively aimed at finite element modeled structures. Topics included are: selections of finite difference step sizes; special consideration for finite difference sensitivity of iteratively-solved response problems; first and second derivatives of static structural response; sensitivity of stresses; nonlinear static response sensitivity; eigenvalue and eigenvector sensitivities for both distinct and repeated eigenvalues; and sensitivity of transient response for both linear and nonlinear structural response.
School, State and Sangha in Burma.
ERIC Educational Resources Information Center
Cheesman, Nick
2003-01-01
The value of historical descriptive analysis in comparative education is highlighted by the method's application to schooling in Burma, demonstrating how control over schooling relates to state legitimacy. Supervision of Burmese education by Theravada Buddhist monasteries--Sangha--was undermined by 19th-century British colonial administration. The…
Fostering Spiritual Formation of Millennials in Christian Schools
ERIC Educational Resources Information Center
Horan, Anne Puidk
2017-01-01
Christian education seeks to foster millennials' spiritual formation to equip them for future challenges and to benefit society. Using nonexperimental mixed methods, 504 secondary educators revealed what spiritual formation programs their schools implement and their perceptions about millennial spiritual formation. Descriptive analysis showed that…
A Constrained-Clustering Approach to the Analysis of Remote Sensing Data.
1983-01-01
One old and two new clustering methods were applied to the constrained-clustering problem of separating different agricultural fields based on multispectral remote sensing satellite data. (Constrained-clustering involves double clustering in multispectral measurement similarity and geographical location.) The results of applying the three methods are provided along with a discussion of their relative strengths and weaknesses and a detailed description of their algorithms.
The purpose of this SOP is to describe the collection, storage, and shipment of tap and drinking water samples for analysis by EPA method 524.2 (revision 4.0) for the NHEXAS Arizona project. This SOP provides a brief description of the sample containers, collection, preservation...
Vedula, S. Swaroop; Li, Tianjing; Dickersin, Kay
2013-01-01
Background Details about the type of analysis (e.g., intent to treat [ITT]) and definitions (i.e., criteria for including participants in the analysis) are necessary for interpreting a clinical trial's findings. Our objective was to compare the description of types of analyses and criteria for including participants in the publication (i.e., what was reported) with descriptions in the corresponding internal company documents (i.e., what was planned and what was done). Trials were for off-label uses of gabapentin sponsored by Pfizer and Parke-Davis, and documents were obtained through litigation. Methods and Findings For each trial, we compared internal company documents (protocols, statistical analysis plans, and research reports, all unpublished), with publications. One author extracted data and another verified, with a third person verifying discordant items and a sample of the rest. Extracted data included the number of participants randomized and analyzed for efficacy, and types of analyses for efficacy and safety and their definitions (i.e., criteria for including participants in each type of analysis). We identified 21 trials, 11 of which were published randomized controlled trials, and that provided the documents needed for planned comparisons. For three trials, there was disagreement on the number of randomized participants between the research report and publication. Seven types of efficacy analyses were described in the protocols, statistical analysis plans, and publications, including ITT and six others. The protocol or publication described ITT using six different definitions, resulting in frequent disagreements between the two documents (i.e., different numbers of participants were included in the analyses). Conclusions Descriptions of analyses conducted did not agree between internal company documents and what was publicly reported. Internal company documents provide extensive documentation of methods planned and used, and trial findings, and should be publicly accessible. Reporting standards for randomized controlled trials should recommend transparent descriptions and definitions of analyses performed and which study participants are excluded. Please see later in the article for the Editors' Summary PMID:23382656
Lifetime Measurement in the Yrast Band of 119I
NASA Astrophysics Data System (ADS)
Lobach, Yu. N.; Pasternak, A. A.; Srebrny, J.; Droste, Ch.; Hagemann, G. B.; Juutinen, S.; Morek, T.; Piiparinen, M.; Podsvirova, E. O.; Toermaenen, S.; Starosta, K.; Virtanen, A.; Wasilewski, A. A.
1999-05-01
The lifetimes of levels in the yrast band of 119I were measured by DSAM and RDM using the 109Ag (13C,3n) reaction at E=54 MeV. A detailed description of the data analysis, including the stopping power determination and the estimation of side feeding time, is given. A modified method of RDM data analysis, Recoil Distance Doppler Shape Attenuation (RDDSA), is used.
Quantitation and detection of vanadium in biologic and pollution materials
NASA Technical Reports Server (NTRS)
Gordon, W. A.
1974-01-01
A review is presented of special considerations and methodology for determining vanadium in biological and air pollution materials. In addition to descriptions of specific analysis procedures, general sections are included on quantitation of analysis procedures, sample preparation, blanks, and methods of detection of vanadium. Most of the information presented is applicable to the determination of other trace elements in addition to vanadium.
Transportable, Low-Dose Active Fast-Neutron Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mihalczo, John T.; Wright, Michael C.; McConchie, Seth M.
2017-08-01
This document contains a description of the method of transportable, low-dose active fast-neutron imaging as developed by ORNL. The discussion begins with the technique and instrumentation and continues with the image reconstruction and analysis. The analysis discussion includes an example of how a gap smaller than the neutron production spot size and detector size can be detected and characterized depending upon the measurement time.
ERIC Educational Resources Information Center
Alaei, Mahya; Ahangari, Saeideh
2016-01-01
The linguistic study of literature or critical analysis of literary discourse is no different from any other textual description; it is not a new branch or a new level or a new kind of linguistics but the application of existing theories and methods (Halliday, 2002). This study intends to determine how ideology or opinion is expressed in Joseph…
ERIC Educational Resources Information Center
Garcia, Jorge; Zeglin, Robert J.; Matray, Shari; Froehlich, Robert; Marable, Ronica; McGuire-Kuletz, Maureen
2016-01-01
Purpose: The purpose of this article was to gather descriptive data on the professional use of social media in public rehabilitation settings and to analyze existing social media policies in those agencies through content analysis. Methods: The authors sent a survey to all state administrators or directors of these agencies (N = 50) in the United…
ERIC Educational Resources Information Center
Finch, Holmes
2010-01-01
Discriminant Analysis (DA) is a tool commonly used for differentiating among 2 or more groups based on 2 or more predictor variables. DA works by finding 1 or more linear combinations of the predictors that yield maximal difference among the groups. One common goal of researchers using DA is to characterize the nature of group difference by…
ERIC Educational Resources Information Center
Gottschalk, Louis A.
This paper examines the use of content analysis of speech in the objective recording and measurement of changes in emotional and cognitive function of humans in whom natural or experimental changes in neural status have occurred. A brief description of the data gathering process, details of numerous physiological effects, an anxiety scale, and a…
Irons, R D
1981-01-01
A detailed description of flow cytofluorometric DNA cell cycle analysis is presented. A number of studies by the author and other investigators are reviewed in which a method is developed for the analysis of cell cycle phase in bone marrow of experimental animals. Bone marrow cell cycle analysis is a sensitive indicator of changes in bone marrow proliferative activity occurring early in chemically-induced myelotoxicity. Cell cycle analysis, used together with other hematologic methods, has revealed benzene-induced toxicity in proliferating bone marrow cells to be cycle specific, appearing to affect a population in late S phase which then accumulate in G2/M. PMID:7016521
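As a very rough illustration of cell cycle analysis from a DNA-content histogram, the sketch below gates synthetic per-cell fluorescence values around assumed 2N and 4N positions to estimate the G0/G1, S, and G2/M fractions. Real flow cytometric analyses typically fit overlapping distributions rather than hard thresholds, and all numbers here are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic per-cell DNA fluorescence: a 2N peak, a 4N peak, and cells in S phase.
dna = np.concatenate([
    rng.normal(100, 6, 6000),          # G0/G1 (2N)
    rng.normal(200, 10, 1500),         # G2/M (4N)
    rng.uniform(115, 185, 2500),       # S phase (between 2N and 4N)
])

# Hard gates around the assumed 2N and 4N positions (illustrative thresholds only).
g1 = np.mean(dna < 115)
s = np.mean((dna >= 115) & (dna <= 185))
g2m = np.mean(dna > 185)

print(f"G0/G1: {g1:.1%}  S: {s:.1%}  G2/M: {g2m:.1%}")
```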
Analytical method of waste allocation in waste management systems: Concept, method and case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergeron, Francis C., E-mail: francis.b.c@videotron.ca
Waste is not a rejected item to dispose anymore but increasingly a secondary resource to exploit, influencing waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process” defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP and the development of the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland). It demonstrates that this method provides an in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste management in Geneva.
Lee, Youngjin; Choo, Jina; Cho, Jeonghyun; Kim, So-Nam; Lee, Hye-Eun; Yoon, Seok-Jun; Seomun, GyeongAe
2014-03-01
This study aimed to develop a job description for healthcare managers of metabolic syndrome management programs using task analysis. Exploratory research was performed by using the Developing a Curriculum method, the Intervention Wheel model, and focus group discussions. Subsequently, we conducted a survey of 215 healthcare workers from 25 community health centers to verify that the job description we created was accurate. We defined the role of healthcare managers. Next, we elucidated the tasks of healthcare managers and performed needs analysis to examine the frequency, importance, and difficulty of each of their duties. Finally, we verified that our job description was accurate. Based on the 8 duties, 30 tasks, and 44 task elements assigned to healthcare managers, we found that the healthcare managers functioned both as team coordinators responsible for providing multidisciplinary health services and nurse specialists providing health promotion services. In terms of importance and difficulty of tasks performed by the healthcare managers, which were measured using a determinant coefficient, the highest-ranked task was planning social marketing (15.4), while the lowest-ranked task was managing human resources (9.9). A job description for healthcare managers may provide basic data essential for the development of a job training program for healthcare managers working in community health promotion programs. Copyright © 2014. Published by Elsevier B.V.
Characteristics of Qualitative Descriptive Studies: A Systematic Review
Kim, Hyejin; Sefcik, Justine S.; Bradway, Christine
2016-01-01
Qualitative description (QD) is a term that is widely used to describe qualitative studies of health care and nursing-related phenomena. However, limited discussions regarding QD are found in the existing literature. In this systematic review, we identified characteristics of methods and findings reported in research articles published in 2014 whose authors identified the work as QD. After searching and screening, data were extracted from the sample of 55 QD articles and examined to characterize research objectives, design justification, theoretical/philosophical frameworks, sampling and sample size, data collection and sources, data analysis, and presentation of findings. In this review, three primary findings were identified. First, despite inconsistencies, most articles included characteristics consistent with limited, available QD definitions and descriptions. Next, flexibility or variability of methods was common and desirable for obtaining rich data and achieving understanding of a phenomenon. Finally, justification for how a QD approach was chosen and why it would be an appropriate fit for a particular study was limited in the sample and, therefore, in need of increased attention. Based on these findings, recommendations include encouragement to researchers to provide as many details as possible regarding the methods of their QD study so that readers can determine whether the methods used were reasonable and effective in producing useful findings. PMID:27686751
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skiles, S. K.
1994-12-22
An inductive double-contingency analysis (DCA) method, developed by the criticality safety function at the Savannah River Site, was applied in Criticality Safety Evaluations (CSEs) of five major plant process systems at the Westinghouse Electric Corporation's Commercial Nuclear Fuel Manufacturing Plant in Columbia, South Carolina (WEC-Cola.). The method emphasizes a thorough evaluation of the controls intended to provide barriers against criticality for postulated initiating events, and has been demonstrated effective at identifying common mode failure potential and interdependence among multiple controls. A description of the method and an example of its application are provided.
A New Methodology for the Extension of the Impact of Data Assimilation on Ocean Wave Prediction
2008-07-01
Assimilation method: The analysis fields used were corrected by an assimilation method developed at the Norwegian Meteorological Institute (Breivik and Reistad... becomes equal to the solution obtained by optimal interpolation (see Bratseth 1986 and Breivik and Reistad 1994). The iterations begin with... updated accordingly. A more detailed description of the assimilation method is given in Breivik and Reistad (1994). 2.3 Kolmogorov–Zurbenko filters
Portfolio assessment and evaluation: implications and guidelines for clinical nursing education.
Chabeli, M M
2002-08-01
With the advent of Outcomes-Based Education in South Africa, the quality of nursing education is debatable, especially with regard to the assessment and evaluation of clinical nursing education, which is complex and renders the validity and reliability of the methods used questionable. This paper seeks to explore and describe the use of portfolio assessment and evaluation, its implications and guidelines for its effective use in nursing education. Firstly, the concepts of assessment, evaluation, portfolio and alternative methods of evaluation are defined. Secondly, a comparison of the characteristics of the old (traditional) methods and the new alternative methods of evaluation is made. Thirdly, through deductive analysis, synthesis and inference, implications and guidelines for the effective use of portfolio assessment and evaluation are described. In view of the qualitative, descriptive and exploratory nature of the study, a focus group interview with twenty students following a post-basic degree at a university in Gauteng regarding their perceptions of the use of the portfolio assessment and evaluation method in clinical nursing education was used. A descriptive method of qualitative data analysis of open coding in accordance with Tesch's protocol (in Creswell 1994:155) was used. Resultant implications and guidelines were conceptualised and described within the existing theoretical framework. Principles of trustworthiness were maintained as described by Lincoln and Guba (1985:290-327). Ethical considerations were in accordance with DENOSA's standards of research (1998:7).
NASA Technical Reports Server (NTRS)
Junkin, B. G. (Principal Investigator)
1979-01-01
A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.
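Deriving slope and aspect from gridded elevation data, as described above, reduces to finite-difference gradients of the elevation surface. The NumPy sketch below runs on a synthetic grid with an assumed spacing; it does not reproduce the UTM registration or slope-length steps of the original processing.

```python
import numpy as np

# Synthetic elevation grid (metres) on a regular grid with 30 m spacing (assumed).
spacing = 30.0
y, x = np.mgrid[0:100, 0:100] * spacing
elevation = 500 + 0.02 * x + 0.01 * y + 20 * np.sin(x / 400.0)

# Finite-difference gradients with respect to northing (rows) and easting (columns).
dz_dy, dz_dx = np.gradient(elevation, spacing, spacing)

# Slope in degrees and gradient orientation (aspect; sign conventions vary by tool).
slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
aspect = (np.degrees(np.arctan2(dz_dx, dz_dy)) + 360.0) % 360.0

print(slope.mean(), aspect[50, 50])
```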
Barnes, Stephen; Benton, H Paul; Casazza, Krista; Cooper, Sara J; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K; Renfrow, Matthew B; Tiwari, Hemant K
2016-08-01
Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. Copyright © 2016 John Wiley & Sons, Ltd.
An evaluation of authentication methods for smartphone based on users’ preferences
NASA Astrophysics Data System (ADS)
Sari, P. K.; Ratnasari, G. S.; Prasetio, A.
2016-04-01
This study discusses smartphone screen lock preferences across several types of authentication methods. The purpose is to determine user behaviours based on perceived security and convenience, as well as preferences for different types of authentication methods. The variables used are the considerations for locking the screens and the types of authentication methods. The population consists of smartphone users, with a total sample of 400 respondents selected by a nonprobability sampling method. The data analysis method used is descriptive analysis. The results showed that the convenience factor is still the major consideration for locking smartphone screens. The majority of users chose pattern unlock as the most convenient method to use. Meanwhile, fingerprint unlock is perceived by users as the most secure method and as the method they would choose to use in the future.
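Descriptive analysis of a preference survey of this kind is essentially tabulation of response frequencies. The pandas sketch below uses invented responses; the category labels follow the abstract, but the counts do not come from the study.

```python
import pandas as pd

# Hypothetical responses from a screen-lock preference survey.
responses = pd.DataFrame({
    "chosen_method": ["pattern", "pattern", "pin", "fingerprint", "pattern",
                      "fingerprint", "none", "pin", "pattern", "fingerprint"],
    "main_consideration": ["convenience", "convenience", "security", "security",
                           "convenience", "security", "convenience", "convenience",
                           "convenience", "security"],
})

# Frequency and percentage tables, the core of a descriptive analysis.
method_counts = responses["chosen_method"].value_counts()
consideration_share = responses["main_consideration"].value_counts(normalize=True)

print(method_counts)
print((consideration_share * 100).round(1))
```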
An exploration of function analysis and function allocation in the commercial flight domain
NASA Technical Reports Server (NTRS)
Mcguire, James C.; Zich, John A.; Goins, Richard T.; Erickson, Jeffery B.; Dwyer, John P.; Cody, William J.; Rouse, William B.
1991-01-01
The applicability of functional analysis methods to support cockpit design is explored. Specifically, alternative techniques are studied for ensuring an effective division of responsibility between the flight crew and automation. A functional decomposition of the commercial flight domain is performed to provide the information necessary to support allocation decisions and to demonstrate a methodology for allocating functions to the flight crew or to automation. The function analysis employed 'bottom up' and 'top down' analyses and demonstrated the comparability of identified functions, using the 'lift off' segment of the 'take off' phase as a test case. The normal flight mission and selected contingencies were addressed. Two alternative methods for using the functional description in the allocation of functions between man and machine were investigated. The two methods were compared in order to ascertain their relative strengths and weaknesses. Finally, conclusions were drawn regarding the practical utility of function analysis methods.
2008-01-01
A new method for extracting common themes from written text is introduced and applied to 1,165 open-ended self-descriptive narratives. Drawing on a lexical approach to personality, the most commonly-used adjectives within narratives written by college students were identified using computerized text analytic tools. A factor analysis on the use of these adjectives in the self-descriptions produced a 7-factor solution consisting of psychologically meaningful dimensions. Some dimensions were unipolar (e.g., Negativity factor, wherein most loaded items were negatively valenced adjectives); others were dimensional in that semantically opposite words clustered together (e.g., Sociability factor, wherein terms such as shy, outgoing, reserved, and loud all loaded in the same direction). The factors exhibited modest reliability across different types of writing samples and were correlated with self-reports and behaviors consistent with the dimensions. Similar analyses with additional content words (adjectives, adverbs, nouns, and verbs) yielded additional psychological dimensions associated with physical appearance, school, relationships, etc. in which people contextualize their self-concepts. The results suggest that the meaning extraction method is a promising strategy that determines the dimensions along which people think about themselves. PMID:18802499
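The meaning extraction strategy summarized above, counting frequently used adjectives and factor-analyzing the usage matrix, can be sketched with a small document-term pipeline. The scikit-learn example below operates on toy narratives and a hypothetical adjective list; it is a schematic of the general approach, not the authors' text analytic tools.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import FactorAnalysis

# Toy self-descriptive narratives (stand-ins for the open-ended essays).
narratives = [
    "I am shy and quiet but kind and caring",
    "I am outgoing loud and friendly",
    "I am anxious worried and often sad",
    "I am calm relaxed and happy",
    "I am reserved quiet and thoughtful",
    "I am loud outgoing and confident",
]

# Restrict the vocabulary to a fixed adjective list (assumed, by analogy with the method).
adjectives = ["shy", "quiet", "kind", "outgoing", "loud", "friendly",
              "anxious", "sad", "calm", "happy", "reserved", "confident"]
counts = CountVectorizer(vocabulary=adjectives).fit_transform(narratives).toarray()

# Factor analysis of adjective usage; two factors for this toy example.
loadings = FactorAnalysis(n_components=2, random_state=0).fit(counts).components_
for name, row in zip(["factor 1", "factor 2"], loadings):
    top = [adjectives[i] for i in np.argsort(-np.abs(row))[:4]]
    print(name, top)
```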
Luijkx, Katrien; Calciolari, Stefano; González-Ortiz, Laura G.
2017-01-01
Introduction: In this paper, we provide a detailed and explicit description of the processes and decisions underlying and shaping the emergent multimethod research design of our study on workforce changes in integrated chronic care. Theory and methods: The study was originally planned as mixed method research consisting of a preliminary literature review and quantitative check of these findings via a Delphi panel. However, when the findings of the literature review were not appropriate for quantitative confirmation, we chose to continue our qualitative exploration of the topic via qualitative questionnaires and secondary analysis of two best practice case reports. Results: The resulting research design is schematically described as an emergent and interactive multimethod design with multiphase combination timing. In doing so, we provide other researchers with a set of theory- and experience-based options to develop their own multimethod research and provide an example for more detailed and structured reporting of emergent designs. Conclusion and discussion: We argue that the terminology developed for the description of mixed methods designs should also be used for multimethod designs such as the one presented here. PMID:29042843
Larson, S.J.; Capel, P.D.; VanderLoop, A.G.
1996-01-01
Laboratory and quality assurance procedures for the analysis of ground-water samples for herbicides at the Management Systems Evaluation Area near Princeton, Minnesota are described. The target herbicides include atrazine, de-ethylatrazine, de-isopropylatrazine, metribuzin, alachlor, 2,6-diethylaniline, and metolachlor. The analytical techniques used are solid-phase extraction, and analysis by gas chromatography with mass-selective detection. Descriptions of cleaning procedures, preparation of standard solutions, isolation of analytes from water, sample transfer methods, instrumental analysis, and data analysis are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abbott, B. P.; Abbott, R.; Abernathy, M. R.
This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.
Can improvised somatic dance reduce acute pain for young people in hospital?
Dowler, Lisa
2016-11-08
Aim This study explores the effects of improvised somatic dance (ISD) on children and young people experiencing acute pain following orthopaedic or cardiac surgery, or post-acquired brain injury. Methods The study involved 25 children and young people and adopted a mixed methods approach. This included a descriptive qualitative approach to help the participants and witnesses verbalise their experience of ISD, and pain scores were assessed before and after ISD using validated pain assessment tools. Data were analysed using descriptive statistical analysis. Findings A total of 92% of participants experienced a reduction in pain, with 80% experiencing a >50% reduction. There was an improved sense of well-being for all. Conclusion Although not a replacement for pharmacological treatments, a multidimensional, child-centred and inclusive approach with ISD can be a useful complementary, non-pharmacological method of pain management in children and young people.
A new theoretical approach to analyze complex processes in cytoskeleton proteins.
Li, Xin; Kolomeisky, Anatoly B
2014-03-20
Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because utilized theoretical mean-field models neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on analysis of probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.
Nominal Group as Qualifier to "Someone"
ERIC Educational Resources Information Center
Sujatna, Eva Tuckyta Sari; Wahyuni, Sri
2017-01-01
The paper titled "Nominal Group as Qualifier to 'Someone'" investigated types of qualifiers which are embedded to the head "someone" in a nominal group. This research was conducted in the light of Systemic Functional Linguistics analysis. The data was analyzed, classified then described using descriptive qualitative method.…
Curriculum for Young Deaf Children.
ERIC Educational Resources Information Center
Restaino, Lillian C. R.; And Others
Presented is a curriculum designed to provide the teacher of the young deaf child with learning disabilities with a description of developmental objectives and methods for fulfilling these objectives in the areas of gross motor development, sensory motor integration, visual analysis, attention and memory, and conceptualization. The objectives are…
González-Álvarez, Mariana; Noguerol-Pato, Raquel; González-Barreiro, Carmen; Cancho-Grande, Beatriz; Simal-Gándara, Jesús
2014-02-15
The effect of winemaking procedures on the sensory modification of sweet wines was investigated. Garnacha Tintorera-based sweet wines were obtained by two different processes: by using raisins for vinification to obtain a naturally sweet wine and by using freshly harvested grapes with the stoppage of the fermentation by the addition of alcohol. Eight international sweet wines were also subjected to sensory analysis for comparative description purposes. Wines were described with a sensory profile by 12 trained panellists on 70 sensory attributes by employing the frequency of citation method. Analysis of variance of the descriptive data confirmed the existence of subtle sensory differences among Garnacha Tintorera-based sweet wines depending on the procedure used for their production. Cluster analysis emphasised discriminated attributes between the Garnacha Tintorera-based and the commercial groups of sweet wines for both those obtained by raisining and by fortification. Several kinds of discriminant functions were used to separate groups of sweet wines--obtained by botrytisation, raisining and fortification--to show the key descriptors that contribute to their separation and define the sensory perception of each type of wine. Copyright © 2013 Elsevier Ltd. All rights reserved.
Medical emergencies on board commercial airlines: is documentation as expected?
2012-01-01
Introduction The purpose of this study was to perform a descriptive, content-based analysis on the different forms of documentation for in-flight medical emergencies that are currently provided in the emergency medical kits on board commercial airlines. Methods Passenger airlines in the World Airline Directory were contacted between March and May 2011. For each participating airline, sample in-flight medical emergency documentation forms were obtained. All items in the sample documentation forms were subjected to a descriptive analysis and compared to a sample "medical incident report" form published by the International Air Transport Association (IATA). Results A total of 1,318 airlines were contacted. Ten airlines agreed to participate in the study and provided a copy of their documentation forms. A descriptive analysis revealed a total of 199 different items, which were summarized into five sub-categories: non-medical data (63), signs and symptoms (68), diagnosis (26), treatment (22) and outcome (20). Conclusions The data in this study illustrate a large variation in the documentation of in-flight medical emergencies by different airlines. A higher degree of standardization is preferable to increase the data quality in epidemiologic aeromedical research in the future. PMID:22397530
Interoperability between phenotype and anatomy ontologies.
Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich
2010-12-15
Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.
Liu, Jing; Bredie, Wender L P; Sherman, Emma; Harbertson, James F; Heymann, Hildegarde
2018-04-01
Rapid sensory methods have been developed as alternatives to traditional sensory descriptive analysis methods. Among them, Free-Choice Profiling (FCP) and Flash Profile (FP) are two that have been known for many years. The objectives of this work were to compare the rating-based FCP and ranking-based FP method; to evaluate the impact of adding adjustments to the FP approach; and to investigate the influence of the number of assessors on the outcome of the modified FP. To achieve these aims, a conventional descriptive analysis (DA), FCP, FP and a modified version of FP were carried out. Red wines made with different grape maturity and ethanol concentration were used for sensory testing. This study showed that DA provided more detailed and accurate information on products through a quantitative measure of the intensity of sensory attributes than FCP and FP. However, the panel hours for conducting DA were higher than those for the rapid methods, and FP was even able to separate the samples to a higher degree than DA. When comparing FCP and FP, this study showed that the ranking-based FP provided a clearer separation of samples than the rating-based FCP, but the latter was an easier task for most assessors. When assessors were restricted in their use of attributes in FP, the sample space became clearer and the ranking task was simplified. The FP protocol with restricted attribute sets seems to be a promising approach for efficient screening of sensory properties in wine. When increasing the number of assessors from 10 to 20 for conducting the modified FP, the outcome tended to be slightly more stable; however, one should consider the degree of panel training when deciding the optimal number of assessors for conducting FP. Copyright © 2018 Elsevier Ltd. All rights reserved.
Analysis of laparoscopic port site complications: A descriptive study
Karthik, Somu; Augustine, Alfred Joseph; Shibumon, Mundunadackal Madhavan; Pai, Manohar Varadaraya
2013-01-01
CONTEXT: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. AIMS: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. SETTINGS AND DESIGN: Prospective descriptive study. MATERIALS AND METHODS: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. STATISTICAL ANALYSIS USED: Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. RESULTS: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). CONCLUSIONS: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit. PMID:23741110
NASA Technical Reports Server (NTRS)
Miller, R. D.; Anderson, L. R.
1979-01-01
The LOADS program L218, a digital computer program that calculates dynamic load coefficient matrices utilizing the force summation method, is described. The load equations are derived for a flight vehicle in straight and level flight and excited by gusts and/or control motions. In addition, sensor equations are calculated for use with an active control system. The load coefficient matrices are calculated for the following types of loads: translational and rotational accelerations, velocities, and displacements; panel aerodynamic forces; net panel forces; shears and moments. Program usage and a brief description of the analysis used are presented. A description of the design and structure of the program to aid those who will maintain and/or modify the program in the future is included.
A review of in situ propellant production techniques for solar system exploration
NASA Technical Reports Server (NTRS)
Hoffman, S. J.
1983-01-01
Representative studies done in the area of extraterrestrial chemical production as it applies to solar system exploration are presented. A description of the In Situ Propellant Production (ISPP) system is presented. Various propellant combinations and direct applications, along with the previously mentioned benefits and liens, are discussed. A series of mission scenarios that are studied in the greatest detail is presented. A general description of the method(s) of analysis used to study each mission is provided. Each section closes with an assessment of the performance advantage, if any, that can be provided by ISPP. A final section briefly summarizes those missions which, as a result of the studies completed thus far, should see a sizable benefit from the use of ISPP.
The purpose of this SOP is to describe how to collect, store, and ship tap and drinking water samples for analysis by EPA Method 200.8 (revision 4.4) for the NHEXAS Arizona project. This SOP provides a brief description of the sample containers, collection, preservation, storage...
“Magnitude-based Inference”: A Statistical Review
Welsh, Alan H.; Knight, Emma J.
2015-01-01
ABSTRACT Purpose We consider “magnitude-based inference” and its interpretation by examining in detail its use in the problem of comparing two means. Methods We extract from the spreadsheets, which are provided to users of the analysis (http://www.sportsci.org/), a precise description of how “magnitude-based inference” is implemented. We compare the implemented version of the method with general descriptions of it and interpret the method in familiar statistical terms. Results and Conclusions We show that “magnitude-based inference” is not a progressive improvement on modern statistics. The additional probabilities introduced are not directly related to the confidence interval but, rather, are interpretable either as P values for two different nonstandard tests (for different null hypotheses) or as approximate Bayesian calculations, which also lead to a type of test. We also discuss sample size calculations associated with “magnitude-based inference” and show that the substantial reduction in sample sizes claimed for the method (30% of the sample size obtained from standard frequentist calculations) is not justifiable so the sample size calculations should not be used. Rather than using “magnitude-based inference,” a better solution is to be realistic about the limitations of the data and use either confidence intervals or a fully Bayesian analysis. PMID:25051387
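To make the two-means setting concrete, the sketch below computes an ordinary 90% confidence interval for a difference of means together with the beneficial/trivial/harmful probabilities that magnitude-based inference reports, which, as the review argues, can be read as one-sided P values against shifted null hypotheses. The data and the smallest worthwhile change are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
group_a = rng.normal(25.0, 4.0, 20)   # hypothetical treatment scores
group_b = rng.normal(23.0, 4.0, 20)   # hypothetical control scores
swc = 1.0                             # assumed smallest worthwhile change

diff = group_a.mean() - group_b.mean()
se = np.sqrt(group_a.var(ddof=1) / group_a.size + group_b.var(ddof=1) / group_b.size)
df = group_a.size + group_b.size - 2  # simple approximation to the degrees of freedom

# Conventional 90% confidence interval for the difference in means.
ci = diff + np.array([-1, 1]) * stats.t.ppf(0.95, df) * se

# "Magnitude-based" probabilities: chance the true difference exceeds +swc,
# lies within +/-swc, or falls below -swc (equivalently, shifted one-sided P values).
p_beneficial = 1 - stats.t.cdf((swc - diff) / se, df)
p_harmful = stats.t.cdf((-swc - diff) / se, df)
p_trivial = 1 - p_beneficial - p_harmful

print(f"diff = {diff:.2f}, 90% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
print(f"beneficial {p_beneficial:.1%}, trivial {p_trivial:.1%}, harmful {p_harmful:.1%}")
```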
Analog computation of auto and cross-correlation functions
NASA Technical Reports Server (NTRS)
1974-01-01
For analysis of the data obtained from the cross beam systems it was deemed desirable to compute the auto- and cross-correlation functions by both digital and analog methods to provide a cross-check of the analysis methods and an indication as to which of the two methods would be most suitable for routine use in the analysis of such data. It is the purpose of this appendix to provide a concise description of the equipment and procedures used for the electronic analog analysis of the cross beam data. A block diagram showing the signal processing and computation set-up used for most of the analog data analysis is provided. The data obtained at the field test sites were recorded on magnetic tape using wide-band FM recording techniques. The data as recorded were band-pass filtered by electronic signal processing in the data acquisition systems.
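A digital counterpart of the analog correlation computation described above is straightforward and could serve as the kind of cross-check mentioned in this appendix. The NumPy sketch below estimates auto- and cross-correlation functions from sampled signals; the signals, noise level, and sampling rate are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 1000.0                        # assumed sample rate, Hz
n = 4000
t = np.arange(n) / fs

# Two noisy signals sharing a common component, the second shifted by 25 samples.
common = np.sin(2 * np.pi * 7 * t)
x = common + 0.5 * rng.standard_normal(n)
y = np.roll(common, 25) + 0.5 * rng.standard_normal(n)

def correlation(a, b):
    """Normalized sample correlation function of zero-mean copies of a and b."""
    a = a - a.mean()
    b = b - b.mean()
    c = np.correlate(a, b, mode="full") / (np.std(a) * np.std(b) * len(a))
    lags = np.arange(-len(a) + 1, len(a))
    return lags, c

lags, auto = correlation(x, x)       # auto-correlation of x
lags, cross = correlation(x, y)      # cross-correlation of x and y
print("peak cross-correlation lag (samples):", lags[np.argmax(cross)])
```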
Defining Human Failure Events for Petroleum Risk Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Knut Øien
2014-06-01
In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.
Krzysztof, Naus; Aleksander, Nowak
2016-01-01
The article presents a study of the accuracy of estimating the position coordinates of BAUV (Biomimetic Autonomous Underwater Vehicle) by the extended Kalman filter (EKF) method. The fusion of movement parameters measurements and position coordinates fixes was applied. The movement parameters measurements are carried out by on-board navigation devices, while the position coordinates fixes are done by the USBL (Ultra Short Base Line) system. The problem of underwater positioning and the conceptual design of the BAUV navigation system constructed at the Naval Academy (Polish Naval Academy—PNA) are presented in the first part of the paper. The second part consists of description of the evaluation results of positioning accuracy, the genesis of the problem of selecting method for underwater positioning, and the mathematical description of the method of estimating the position coordinates using the EKF method by the fusion of measurements with on-board navigation and measurements obtained with the USBL system. The main part contains a description of experimental research. It consists of a simulation program of navigational parameter measurements carried out during the BAUV passage along the test section. Next, the article covers the determination of position coordinates on the basis of simulated parameters, using EKF and DR methods and the USBL system, which are then subjected to a comparative analysis of accuracy. The final part contains systemic conclusions justifying the desirability of applying the proposed fusion method of navigation parameters for the BAUV positioning. PMID:27537884
Krzysztof, Naus; Aleksander, Nowak
2016-08-15
The article presents a study of the accuracy of estimating the position coordinates of BAUV (Biomimetic Autonomous Underwater Vehicle) by the extended Kalman filter (EKF) method. The fusion of movement parameters measurements and position coordinates fixes was applied. The movement parameters measurements are carried out by on-board navigation devices, while the position coordinates fixes are done by the USBL (Ultra Short Base Line) system. The problem of underwater positioning and the conceptual design of the BAUV navigation system constructed at the Naval Academy (Polish Naval Academy-PNA) are presented in the first part of the paper. The second part consists of description of the evaluation results of positioning accuracy, the genesis of the problem of selecting method for underwater positioning, and the mathematical description of the method of estimating the position coordinates using the EKF method by the fusion of measurements with on-board navigation and measurements obtained with the USBL system. The main part contains a description of experimental research. It consists of a simulation program of navigational parameter measurements carried out during the BAUV passage along the test section. Next, the article covers the determination of position coordinates on the basis of simulated parameters, using EKF and DR methods and the USBL system, which are then subjected to a comparative analysis of accuracy. The final part contains systemic conclusions justifying the desirability of applying the proposed fusion method of navigation parameters for the BAUV positioning.
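The fusion described in the two records above, dead-reckoned motion parameters corrected by intermittent USBL position fixes, follows the usual Kalman predict/update cycle. The sketch below is a deliberately simplified, nearly linear 2-D position-only example with assumed noise levels and motion; it is not the PNA vehicle's navigation code, and a full EKF would additionally linearize a nonlinear motion and measurement model at each step.

```python
import numpy as np

def predict(x, P, speed, heading, dt, q):
    """Dead-reckoning prediction: advance position using measured speed and heading."""
    F = np.eye(2)                                   # position-only state, identity model
    x = x + dt * speed * np.array([np.cos(heading), np.sin(heading)])
    P = F @ P @ F.T + q * np.eye(2)                 # inflate uncertainty by process noise
    return x, P

def update(x, P, z, r):
    """Correction with a USBL position fix z (direct observation of the state)."""
    H = np.eye(2)
    S = H @ P @ H.T + r * np.eye(2)
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(5)
x, P = np.zeros(2), np.eye(2)
truth = np.zeros(2)
for step in range(1, 101):
    truth = truth + 0.5 * np.array([np.cos(0.02 * step), np.sin(0.02 * step)])
    x, P = predict(x, P, speed=0.5 + 0.02 * rng.standard_normal(),
                   heading=0.02 * step + 0.01 * rng.standard_normal(), dt=1.0, q=0.01)
    if step % 10 == 0:                              # USBL fix arrives intermittently
        fix = truth + 0.3 * rng.standard_normal(2)
        x, P = update(x, P, fix, r=0.09)

print("final position error:", np.linalg.norm(x - truth))
```

Between fixes the position uncertainty grows with the process noise; each USBL update shrinks it again, which is the behaviour the fusion is intended to provide.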
Connected Learning: Evaluating and Refining an Academic Community Blogging Platform
ERIC Educational Resources Information Center
Stephens, Michael
2016-01-01
This study investigates the benefits of a community blogging platform for students in an online LIS program. Using a web survey and descriptive content analysis methods, this paper empirically addresses how student blogging communities can effectively foster connections amongst instructors and students, and enhance perceptions of learning…
49 CFR Appendix F to Part 229 - Recommended Practices for Design and Safety Analysis
Code of Federal Regulations, 2014 CFR
2014-10-01
... expected order of use; (v) Group similar controls together; (vi) Design for high stimulus-response compatibility (geometric and conceptual); (vii) Design safety-critical controls to require more than one... description of all backup methods of operation; and (s) The configuration/revision control measures designed...
49 CFR Appendix F to Part 229 - Recommended Practices for Design and Safety Analysis
Code of Federal Regulations, 2012 CFR
2012-10-01
... expected order of use; (v) Group similar controls together; (vi) Design for high stimulus-response compatibility (geometric and conceptual); (vii) Design safety-critical controls to require more than one... description of all backup methods of operation; and (s) The configuration/revision control measures designed...
49 CFR Appendix F to Part 229 - Recommended Practices for Design and Safety Analysis
Code of Federal Regulations, 2013 CFR
2013-10-01
... expected order of use; (v) Group similar controls together; (vi) Design for high stimulus-response compatibility (geometric and conceptual); (vii) Design safety-critical controls to require more than one... description of all backup methods of operation; and (s) The configuration/revision control measures designed...
Empowering Students with Special Needs through Service-Learning.
ERIC Educational Resources Information Center
Karayan, Silva; Gathercoal, Paul
This paper reports on a qualitative and quantitative study in which the service learning projects of college students in special education teacher training were analyzed using elements of quality service learning as criteria. The study used the "portraiture" method of analysis, which attempts to combine empirical and aesthetic description and…
Creativity, Bipolar Disorder Vulnerability and Psychological Well-Being: A Preliminary Study
ERIC Educational Resources Information Center
Gostoli, Sara; Cerini, Veronica; Piolanti, Antonio; Rafanelli, Chiara
2017-01-01
The aim of this research was to investigate the relationships between creativity, subclinical bipolar disorder symptomatology, and psychological well-being. The study method was of a descriptive, correlational type. Significance tests were performed using multivariate regression analysis. Students of the 4th grade of 6 different Italian colleges…
ERIC Educational Resources Information Center
Plakhotnik, Maria S.
2016-01-01
The purpose of this perspective on practice is to share my experience conducting an organizational change evaluation using qualitative methodology at a multinational transportation company Global Logistics. I provide a detailed description of the three phase approach to data analysis and my reflections on the process.
Computational Prosodic Markers for Autism
ERIC Educational Resources Information Center
Van Santen, Jan P.H.; Prud'hommeaux, Emily T.; Black, Lois M.; Mitchell, Margaret
2010-01-01
We present results obtained with new instrumental methods for the acoustic analysis of prosody to evaluate prosody production by children with Autism Spectrum Disorder (ASD) and Typical Development (TD). Two tasks elicit focal stress--one in a vocal imitation paradigm, the other in a picture-description paradigm; a third task also uses a vocal…
Internal Labor Markets: An Empirical Investigation.
ERIC Educational Resources Information Center
Mahoney, Thomas A.; Milkovich, George T.
Methods of internal labor market analysis for three organizational areas are presented, along with some evidence about the validity and utility of conceptual descriptions of such markets. The general concept of an internal labor market refers to the process of pricing and allocation of manpower resources within an employing organization and rests…
Validating a Lifestyle Physical Activity Measure for People with Serious Mental Illness
ERIC Educational Resources Information Center
Bezyak, Jill L.; Chan, Fong; Chiu, Chung-Yi; Kaya, Cahit; Huck, Garrett
2014-01-01
Purpose: To evaluate the measurement structure of the "Physical Activity Scale for Individuals With Physical Disabilities" (PASIPD) as an assessment tool of lifestyle physical activities for people with severe mental illness. Method: A quantitative descriptive research design using factor analysis was employed. A sample of 72 individuals…
47 CFR 101.129 - Transmitter location.
Code of Federal Regulations, 2013 CFR
2013-10-01
... survey tests to be made pursuant to an experimental license under part 5 of this chapter. In such cases... data obtained from such surveys and its analysis, including a description of the methods used and the name, address and qualifications of the engineer making the survey, must be supplied to the Commission...
47 CFR 101.129 - Transmitter location.
Code of Federal Regulations, 2014 CFR
2014-10-01
... survey tests to be made pursuant to an experimental license under part 5 of this chapter. In such cases... data obtained from such surveys and its analysis, including a description of the methods used and the name, address and qualifications of the engineer making the survey, must be supplied to the Commission...
Circuit Riding: A Method for Providing Reference Services.
ERIC Educational Resources Information Center
Plunket, Linda; And Others
1983-01-01
Discussion of the design and implementation of the Circuit Rider Librarian Program, a shared services project for delivering reference services to eight hospitals in Maine, includes a cost analysis of services and description of user evaluation survey. Five references, composite results of the survey, and postgrant options proposal are appended.…
Communicative Competence of the Fourth Year Students: Basis for Proposed English Language Program
ERIC Educational Resources Information Center
Tuan, Vu Van
2017-01-01
This study of communicative competence, covering linguistic/grammatical and discourse competence, aimed at constructing a proposed English language program for 5 key universities in Vietnam. The descriptive method was employed together with comparative techniques and correlational analysis. The researcher treated the surveyed data…
Analysis of Utah Career Ladder Plans.
ERIC Educational Resources Information Center
Murphy, Michael J.; And Others
This report analyzes the content and development of the 45 school district career ladder plans submitted in 1984 to the Utah State Office of Education. Descriptive commentary and data tables are used to examine (1) the structure and composition of planning committees; (2) teacher evaluation provisions, including changes in evaluation methods, the…
Product Description: To understand how some chemicals affect the endocrine system, controlled lab experiments often monitor how chemicals impact natural steroid hormones in fish. Current methods can target only one or two hormones in a single sample, limiting the information that...
"Magnitude-based inference": a statistical review.
Welsh, Alan H; Knight, Emma J
2015-04-01
We consider "magnitude-based inference" and its interpretation by examining in detail its use in the problem of comparing two means. We extract from the spreadsheets, which are provided to users of the analysis (http://www.sportsci.org/), a precise description of how "magnitude-based inference" is implemented. We compare the implemented version of the method with general descriptions of it and interpret the method in familiar statistical terms. We show that "magnitude-based inference" is not a progressive improvement on modern statistics. The additional probabilities introduced are not directly related to the confidence interval but, rather, are interpretable either as P values for two different nonstandard tests (for different null hypotheses) or as approximate Bayesian calculations, which also lead to a type of test. We also discuss sample size calculations associated with "magnitude-based inference" and show that the substantial reduction in sample sizes claimed for the method (30% of the sample size obtained from standard frequentist calculations) is not justifiable so the sample size calculations should not be used. Rather than using "magnitude-based inference," a better solution is to be realistic about the limitations of the data and use either confidence intervals or a fully Bayesian analysis.
Rosier, Arnaud; Mabo, Philippe; Chauvin, Michel; Burgun, Anita
2015-05-01
The patient population benefitting from cardiac implantable electronic devices (CIEDs) is increasing. This study introduces a device annotation method that supports the consistent description of the functional attributes of cardiac devices and evaluates how this method can detect device changes from a CIED registry. We designed the Cardiac Device Ontology, an ontology of CIEDs and device functions. We annotated 146 cardiac devices with this ontology and used it to detect therapy changes with respect to atrioventricular pacing, cardiac resynchronization therapy, and defibrillation capability in a French national registry of patients with implants (STIDEFIX). We then analyzed a set of 6905 device replacements from the STIDEFIX registry. Ontology-based identification of therapy changes (upgraded, downgraded, or similar) was accurate (6905 cases) and performed better than straightforward analysis of the registry codes (F-measure 1.00 versus 0.75 to 0.97). This study demonstrates the feasibility and effectiveness of ontology-based functional annotation of devices in the cardiac domain. Such annotation allowed a better description and in-depth analysis of STIDEFIX. This method was useful for the automatic detection of therapy changes and may be reused for analyzing data from other device registries.
Investigating System Dependability Modeling Using AADL
NASA Technical Reports Server (NTRS)
Hall, Brendan; Driscoll, Kevin R.; Madl, Gabor
2013-01-01
This report describes Architecture Analysis & Design Language (AADL) models for a diverse set of fault-tolerant, embedded data networks and describes the methods and tools used to create these models. It also includes error models per the AADL Error Annex. Some networks were modeled using Error Detection Isolation Containment Types (EDICT). This report gives a brief description of each of the networks, a description of its modeling, the model itself, and evaluations of the tools used for creating the models. The methodology includes a naming convention that supports a systematic way to enumerate all of the potential failure modes.
Descriptive Statistics and Cluster Analysis for Extreme Rainfall in Java Island
NASA Astrophysics Data System (ADS)
E Komalasari, K.; Pawitan, H.; Faqih, A.
2017-03-01
This study aims to describe the regional pattern of extreme rainfall based on maximum daily rainfall for the period 1983 to 2012 in Java Island. Descriptive statistical analysis was performed to obtain the centralization, variation, and distribution of the maximum precipitation data. Mean and median are used to measure the central tendency of the data, while the interquartile range (IQR) and standard deviation are used to measure its variation. In addition, skewness and kurtosis are used to describe the shape of the distribution of the rainfall data. Cluster analysis using squared Euclidean distance and Ward's method is applied to perform regional grouping. Results of this study show that the mean of the maximum daily rainfall in the Java region during the period 1983-2012 is around 80-181 mm, with a median between 75-160 mm and a standard deviation between 17 and 82. Cluster analysis produces four clusters and shows that the western area of Java tends to have higher annual maxima of daily rainfall than the northern area, and more variability in the annual maximum values.
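A minimal Python sketch of the two analysis steps described in this abstract (descriptive statistics of annual maximum daily rainfall, followed by Ward clustering) is shown below; the station names and rainfall values are hypothetical, not the study's data.

    # Descriptive statistics per station, then Ward clustering of stations.
    import numpy as np
    from scipy import stats
    from scipy.cluster.hierarchy import linkage, fcluster

    annual_maxima = {                      # mm, one series per hypothetical station
        "station_A": [120, 135, 98, 160, 142],
        "station_B": [80, 95, 110, 88, 102],
        "station_C": [150, 170, 140, 181, 158],
    }

    features = []
    for name, values in annual_maxima.items():
        x = np.asarray(values, dtype=float)
        print(name, "mean", x.mean(), "median", np.median(x),
              "IQR", stats.iqr(x), "std", x.std(ddof=1),
              "skew", stats.skew(x), "kurtosis", stats.kurtosis(x))
        features.append([x.mean(), np.median(x), stats.iqr(x), x.std(ddof=1)])

    # Ward's method groups stations by minimizing within-cluster variance
    # (equivalently, squared Euclidean distances between feature vectors).
    Z = linkage(np.asarray(features), method="ward")
    clusters = fcluster(Z, t=2, criterion="maxclust")
    print(dict(zip(annual_maxima, clusters)))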
Rotation covariant image processing for biomedical applications.
Skibbe, Henrik; Reisert, Marco
2013-01-01
With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we focus on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial-like descriptions to fulfill this demand. This paper proposes a general mathematical framework based on mathematical concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied to a variety of different 3D data modalities stemming from the medical and biological sciences.
The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis.
Hachaj, Tomasz; Ogiela, Marek R
2016-06-01
The main novelty of this paper is the adaptation of the Gesture Description Language (GDL) methodology to sport and rehabilitation data analysis and classification. In this paper we show that the Lua language can be successfully used to adapt the GDL classifier to these tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. The execution speed obtained allows the methodology to be used in real-time motion capture data processing, where the capture frequency ranges from 100 Hz to as much as 500 Hz depending on the number of features or classes to be calculated and recognized. Because of this, the proposed methodology can be used with high-end motion capture systems. We anticipate that this novel, efficient and effective method will greatly help both sport trainers and physiotherapists in their practice. The proposed approach can be directly applied to the kinematic analysis of motion capture data (evaluation of motion without regard to the forces that cause that motion). The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments and used for sport training or rehabilitation treatment.
14 CFR 161.9 - Designation of noise description methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the following...
14 CFR 161.9 - Designation of noise description methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the following...
[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' opinions and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
Sensory Characterization of Odors in Used Disposable Absorbent Incontinence Products
Widén, Heléne; Forsgren-Brusk, Ulla; Hall, Gunnar
2017-01-01
PURPOSE: The objectives of this study were to characterize the odors of used incontinence products by descriptive analysis and to define attributes to be used in the analysis. A further objective was to investigate to what extent the odor profiles of used incontinence products differed from each other and, if possible, to group these profiles into classes. SUBJECTS AND SETTING: Used incontinence products were collected from 14 residents with urinary incontinence living in geriatric nursing homes in the Gothenburg area, Sweden. METHODS: Pieces were cut from the wet area of used incontinence products. They were placed in glass bottles and kept frozen until odor analysis was completed. A trained panel consisting of 8 judges experienced in this area of investigation defined terminology for odor attributes. The intensities of these attributes in the used products were determined by descriptive odor analysis. Data were analyzed both by analysis of variance (ANOVA) followed by the Tukey post hoc test and by principal component analysis and cluster analysis. RESULTS: An odor wheel, with 10 descriptive attributes, was developed. The total odor intensity, and the intensities of the attributes, varied considerably between different, used incontinence products. The typical odors varied from “sweetish” to “urinal,” “ammonia,” and “smoked.” Cluster analysis showed that the used products, based on the quantitative odor data, could be divided into 5 odor classes with different profiles. CONCLUSIONS: The used products varied considerably in odor character and intensity. Findings suggest that odors in used absorptive products are caused by different types of compounds that may vary in concentration. PMID:28328646
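The statistical workflow named in this abstract (ANOVA with a Tukey post hoc test, followed by principal component analysis) can be sketched in Python as below; the attribute names and intensity ratings are hypothetical, not the panel's data.

    # One-way ANOVA with Tukey HSD on one attribute, then PCA on full profiles.
    import numpy as np
    from scipy.stats import f_oneway
    from statsmodels.stats.multicomp import pairwise_tukeyhsd
    from sklearn.decomposition import PCA

    # Rows: products; columns: intensity of hypothetical attributes
    # "ammonia", "sweetish", "smoked"
    ratings = np.array([
        [6.0, 1.0, 2.0],
        [5.5, 1.5, 2.5],
        [2.0, 4.0, 1.0],
        [1.5, 4.5, 1.5],
        [3.0, 2.0, 5.0],
        [2.5, 2.5, 5.5],
    ])
    groups = np.array(["A", "A", "B", "B", "C", "C"])   # hypothetical product groups

    ammonia = ratings[:, 0]
    print(f_oneway(*[ammonia[groups == g] for g in "ABC"]))
    print(pairwise_tukeyhsd(endog=ammonia, groups=groups))

    # PCA of the full attribute profiles, as used to group products into odor classes
    scores = PCA(n_components=2).fit_transform(ratings)
    print(scores)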
What correlation effects are covered by density functional theory?
NASA Astrophysics Data System (ADS)
He, Yuan; Grafenstein, Jurgen; Kraka, Elfi; Cremer, Dieter
The electron density distribution rho(r) generated by a DFT calculation was systematically studied by comparison with a series of reference densities obtained by wavefunction theory (WFT) methods that cover typical electron correlation effects. As a sensitive indicator for correlation effects the dipole moment of the CO molecule was used. The analysis reveals that typical LDA and GGA exchange functionals already simulate effects that are actually reminiscent of pair and three-electron correlation effects covered by MP2, MP4, and CCSD(T) in WFT. Correlation functionals contract the density towards the bond and the valence region thus taking negative charge out of the van der Waals region. It is shown that these improvements are relevant for the description of van der Waals interactions. Similar to certain correlated single-determinant WFT methods, BLYP and other GGA functionals underestimate ionic terms needed for a correct description of polar bonds. This is compensated for in hybrid functionals by mixing in HF exchange. The balanced mixing of local and non-local exchange and correlation effects leads to the correct description of polar bonds as in the B3LYP description of the CO molecule. The density obtained with B3LYP is closer to CCSD and CCSD(T) than to MP2 or MP4, which indicates that the B3LYP hybrid functional mimics those pair and three-electron correlation effects, which in WFT are only covered by coupled cluster methods.
Kostakis, George E; Blatov, Vladislav A; Proserpio, Davide M
2012-04-21
A novel method for the topological description of high nuclearity coordination clusters (CCs) was improved and applied to all compounds containing only manganese as a metal center, the data on which are collected in the CCDC (CCDC 5.33 Nov. 2011). Using the TOPOS program package that supports this method, we identified 539 CCs with five or more Mn centers adopting 159 topologically different graphs. In the present database all the Mn CCs are collected and illustrated in such a way that they can be searched by cluster topological symbol and nuclearity, compound name and Refcode. The main principles for such an analysis are described herein, as well as useful applications of this method.
Improvements in surface singularity analysis and design methods. [applicable to airfoils
NASA Technical Reports Server (NTRS)
Bristow, D. R.
1979-01-01
The coupling of the combined source vortex distribution of Green's potential flow function with contemporary numerical techniques is shown to provide accurate, efficient, and stable solutions to subsonic inviscid analysis and design problems for multi-element airfoils. The analysis problem is solved by direct calculation of the surface singularity distribution required to satisfy the flow tangency boundary condition. The design or inverse problem is solved by an iteration process. In this process, the geometry and the associated pressure distribution are iterated until the pressure distribution most nearly corresponding to the prescribed design distribution is obtained. Typically, five iteration cycles are required for convergence. A description of the analysis and design method is presented, along with supporting examples.
Lindsey, David A.; Tysdal, Russell G.; Taggart, Joseph E.
2002-01-01
The principal purpose of this report is to provide a reference archive for results of a statistical analysis of geochemical data for metasedimentary rocks of Mesoproterozoic age of the Salmon River Mountains and Lemhi Range, central Idaho. Descriptions of geochemical data sets, statistical methods, rationale for interpretations, and references to the literature are provided. Three methods of analysis are used: R-mode factor analysis of major oxide and trace element data for identifying petrochemical processes, analysis of variance for effects of rock type and stratigraphic position on chemical composition, and major-oxide ratio plots for comparison with the chemical composition of common clastic sedimentary rocks.
NASA Astrophysics Data System (ADS)
Bohmann, Jonathan A.; Weinhold, Frank; Farrar, Thomas C.
1997-07-01
Nuclear magnetic shielding tensors computed by the gauge including atomic orbital (GIAO) method in the Hartree-Fock self-consistent-field (HF-SCF) framework are partitioned into magnetic contributions from chemical bonds and lone pairs by means of natural chemical shielding (NCS) analysis, an extension of natural bond orbital (NBO) analysis. NCS analysis complements the description provided by alternative localized orbital methods by directly calculating chemical shieldings due to delocalized features in the electronic structure, such as bond conjugation and hyperconjugation. Examples of NCS tensor decomposition are reported for CH4, CO, and H2CO, for which a graphical mnemonic due to Cornwell is used to illustrate the effect of hyperconjugative delocalization on the carbon shielding.
A Comparison of Functional Models for Use in the Function-Failure Design Method
NASA Technical Reports Server (NTRS)
Stock, Michael E.; Stone, Robert B.; Tumer, Irem Y.
2006-01-01
When failure analysis and prevention, guided by historical design knowledge, are coupled with product design at its conception, shorter design cycles are possible. By decreasing the design time of a product in this manner, design costs are reduced and the product will better suit the customer's needs. Prior work indicates that similar failure modes occur with products (or components) with similar functionality. To capitalize on this finding, a knowledge base of historical failure information linked to functionality is assembled for use by designers. One possible use for this knowledge base is within the Elemental Function-Failure Design Method (EFDM). This design methodology and failure analysis tool begins at conceptual design and keeps the designer cognizant of failures that are likely to occur based on the product's functionality. The EFDM offers potential improvement over current failure analysis methods, such as FMEA, FMECA, and Fault Tree Analysis, because it can be implemented hand in hand with other conceptual design steps and carried throughout a product's design cycle. These other failure analysis methods can only truly be effective after a physical design has been completed. The EFDM, however, is only as good as the knowledge base that it draws from, and therefore it is of utmost importance to develop a knowledge base that will be suitable for use across a wide spectrum of products. One fundamental question that arises in using the EFDM is: At what level of detail should functional descriptions of components be encoded? This paper explores two approaches to populating a knowledge base with actual failure occurrence information from Bell 206 helicopters. Functional models expressed at various levels of detail are investigated to determine the necessary detail for an applicable knowledge base that can be used by designers in both new designs as well as redesigns. High level and more detailed functional descriptions are derived for each failed component based on NTSB accident reports. To best record this data, standardized functional and failure mode vocabularies are used. Two separate function-failure knowledge bases are then created and compared. Results indicate that encoding failure data using more detailed functional models allows for a more robust knowledge base. Interestingly, however, when applying the EFDM, high level descriptions continue to produce useful results when using the knowledge base generated from the detailed functional models.
The FTA Method And A Possibility Of Its Application In The Area Of Road Freight Transport
NASA Astrophysics Data System (ADS)
Poliaková, Adela
2015-06-01
The Fault Tree process utilizes logic diagrams to portray and analyse potentially hazardous events. Three basic symbols (logic gates) are adequate for diagramming any fault tree. However, additional recently developed symbols can be used to reduce the time and effort required for analysis. A fault tree is a graphical representation of the relationship between certain specific events and the ultimate undesired event (2). This paper gives a basic description of the Fault Tree Analysis method and provides a practical view of its possible application to quality improvement in a road freight transport company.
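A minimal sketch of the quantitative side of Fault Tree Analysis is given below: basic-event probabilities are combined through AND/OR gates to estimate the probability of the undesired top event. The events and probability values are hypothetical and assumed independent.

    # Combining independent basic-event probabilities through logic gates.
    from math import prod

    def and_gate(*p):        # top of the gate occurs only if all inputs occur
        return prod(p)

    def or_gate(*p):         # top of the gate occurs if at least one input occurs
        return 1.0 - prod(1.0 - x for x in p)

    p_brake_failure = 0.001      # hypothetical basic-event probabilities
    p_driver_error = 0.01
    p_overloading = 0.005

    # Hypothetical top event: cargo damage if (brake failure AND driver error) OR overloading
    p_top = or_gate(and_gate(p_brake_failure, p_driver_error), p_overloading)
    print(f"Top event probability: {p_top:.6f}")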
NASA Technical Reports Server (NTRS)
Coen, Peter G.
1991-01-01
A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.
Colour measurements of pigmented rice grain using flatbed scanning and image analysis
NASA Astrophysics Data System (ADS)
Kaisaat, Khotchakorn; Keawdonree, Nuttapong; Chomkokard, Sakchai; Jinuntuya, Noparit; Pattanasiri, Busara
2017-09-01
Recently, the National Bureau of Agricultural Commodity and Food Standards (ACFS) have drafted a manual of Thai colour rice standards. However, there are no quantitative descriptions of rice colour and its measurement method. These drawbacks might lead to misunderstanding for people who use the manual. In this work, we proposed an inexpensive method, using flatbed scanning together with image analysis, to quantitatively measure rice colour and colour uniformity. To demonstrate its general applicability for colour differentiation of rice, we applied it to different kinds of pigmented rice, including Riceberry rice with and without uniform colour and Chinese black rice.
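A minimal sketch of the kind of flatbed-scanning measurement proposed above is shown below: a scanned image is loaded and its mean colour and per-channel colour uniformity are reported. The file name is hypothetical and the code is illustrative, not the authors' implementation.

    # Mean colour and colour uniformity from a scanned rice image.
    import numpy as np
    from PIL import Image

    # Hypothetical file name for a flatbed scan of rice grains
    img = np.asarray(Image.open("riceberry_scan.png").convert("RGB"), dtype=float)

    mean_rgb = img.reshape(-1, 3).mean(axis=0)
    std_rgb = img.reshape(-1, 3).std(axis=0)   # lower values indicate more uniform colour
    print("mean RGB:", mean_rgb, "std RGB:", std_rgb)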
LES, DNS and RANS for the analysis of high-speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Adumitroaie, V.; Colucci, P. J.; Taulbee, D. B.; Givi, P.
1995-01-01
The purpose of this research is to continue our efforts in advancing the state of knowledge in large eddy simulation (LES), direct numerical simulation (DNS), and Reynolds averaged Navier Stokes (RANS) methods for the computational analysis of high-speed reacting turbulent flows. In the second phase of this work, covering the period 1 Aug. 1994 - 31 Jul. 1995, we have focused our efforts on two programs: (1) developments of explicit algebraic moment closures for statistical descriptions of compressible reacting flows and (2) development of Monte Carlo numerical methods for LES of chemically reacting flows.
Busetto, Loraine; Luijkx, Katrien; Calciolari, Stefano; González-Ortiz, Laura G; Vrijhoef, Hubertus J M
2017-03-08
In this paper, we provide a detailed and explicit description of the processes and decisions underlying and shaping the emergent multimethod research design of our study on workforce changes in integrated chronic care. The study was originally planned as mixed method research consisting of a preliminary literature review and quantitative check of these findings via a Delphi panel. However, when the findings of the literature review were not appropriate for quantitative confirmation, we chose to continue our qualitative exploration of the topic via qualitative questionnaires and secondary analysis of two best practice case reports. The resulting research design is schematically described as an emergent and interactive multimethod design with multiphase combination timing. In doing so, we provide other researchers with a set of theory- and experience-based options to develop their own multimethod research and provide an example for more detailed and structured reporting of emergent designs. We argue that the terminology developed for the description of mixed methods designs should also be used for multimethod designs such as the one presented here.
Toledo, Cíntia Matsuda; Cunha, Andre; Scarton, Carolina; Aluísio, Sandra
2014-01-01
Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described - simple or complex; presentation order - which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo 18 were used, which included 200 healthy Brazilians of both genders. A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods.
Wickner, Reed B.; Kryndushkin, Dmitry; Shewmaker, Frank; McGlinchey, Ryan; Edskes, Herman K.
2012-01-01
Saccharomyces cerevisiae has been a useful model organism in such fields as the cell cycle, regulation of transcription, protein trafficking and cell biology, primarily because of its ease of genetic manipulation. This is no less so in the area of amyloid studies. The endogenous yeast amyloids described to date include prions, infectious proteins (Table 1), and some cell wall proteins (1). Amyloids of humans and a fungal prion have also been studied using the yeast system. Accordingly, the emphasis of this chapter will be on genetic, biochemical, cell biological and physical methods particularly useful in the study of yeast prions and other amyloids studied in yeast. We limit our description of these methods to those aspects which have been most useful in studying yeast prions, citing more detailed expositions in the literature. Volumes on yeast genetics methods (2–4), and on amyloids and prions (5, 6) are useful, and Masison has edited a volume of Methods on "Identification, analysis and characterization of fungal prions" which covers some of this territory (7). We also outline some useful physical methods, pointing the reader to more extensive and authoritative descriptions. PMID:22528100
An artificial viscosity method for the design of supercritical airfoils
NASA Technical Reports Server (NTRS)
Mcfadden, G. B.
1979-01-01
A numerical technique is presented for the design of two-dimensional supercritical wing sections with low wave drag. The method is a design mode of the analysis code H which gives excellent agreement with experimental results and is widely used in the aircraft industry. Topics covered include the partial differential equations of transonic flow; the computational procedure and results; the design procedure; a convergence theorem; and a description of the code.
IMP: Interactive mass properties program. Volume 1: Program description
NASA Technical Reports Server (NTRS)
Stewart, W. A.
1976-01-01
A method for computing a weight and center-of-gravity analysis of a flight vehicle using the interactive graphical capabilities of the Adage 340 computer is described. The equations used to calculate area, volume, and mass properties are based on elemental surface characteristics. The input/output methods employ the graphic support of the Adage computer. Several interactive program options are available for analyzing the mass properties of a vehicle. These options are explained.
Vandenabeele-Trambouze, O; Claeys-Bruno, M; Dobrijevic, M; Rodier, C; Borruat, G; Commeyras, A; Garrelly, L
2005-02-01
The need for criteria to compare different analytical methods for measuring extraterrestrial organic matter at ultra-trace levels in relatively small and unique samples (e.g., fragments of meteorites, micrometeorites, planetary samples) is discussed. We emphasize the need to standardize the description of future analyses, and take the first step toward a proposed international laboratory network for performance testing.
Delaney, Aogán; Tamás, Peter A; Crane, Todd A; Chesterman, Sabrina
2016-01-01
There is increasing interest in using systematic review to synthesize evidence on the social and environmental effects of and adaptations to climate change. Use of systematic review for evidence in this field is complicated by the heterogeneity of methods used and by uneven reporting. In order to facilitate synthesis of results and design of subsequent research a method, construct-centered methods aggregation, was designed to 1) provide a transparent, valid and reliable description of research methods, 2) support comparability of primary studies and 3) contribute to a shared empirical basis for improving research practice. Rather than taking research reports at face value, research designs are reviewed through inductive analysis. This involves bottom-up identification of constructs, definitions and operationalizations; assessment of concepts' commensurability through comparison of definitions; identification of theoretical frameworks through patterns of construct use; and integration of transparently reported and valid operationalizations into ideal-type research frameworks. Through the integration of reliable bottom-up inductive coding from operationalizations and top-down coding driven from stated theory with expert interpretation, construct-centered methods aggregation enabled both resolution of heterogeneity within identically named constructs and merging of differently labeled but identical constructs. These two processes allowed transparent, rigorous and contextually sensitive synthesis of the research presented in an uneven set of reports undertaken in a heterogenous field. If adopted more broadly, construct-centered methods aggregation may contribute to the emergence of a valid, empirically-grounded description of methods used in primary research. These descriptions may function as a set of expectations that improves the transparency of reporting and as an evolving comprehensive framework that supports both interpretation of existing and design of future research.
Characteristics of Qualitative Descriptive Studies: A Systematic Review.
Kim, Hyejin; Sefcik, Justine S; Bradway, Christine
2017-02-01
Qualitative description (QD) is a term that is widely used to describe qualitative studies of health care and nursing-related phenomena. However, limited discussions regarding QD are found in the existing literature. In this systematic review, we identified characteristics of methods and findings reported in research articles published in 2014 whose authors identified the work as QD. After searching and screening, data were extracted from the sample of 55 QD articles and examined to characterize research objectives, design justification, theoretical/philosophical frameworks, sampling and sample size, data collection and sources, data analysis, and presentation of findings. In this review, three primary findings were identified. First, although there were some inconsistencies, most articles included characteristics consistent with the limited available QD definitions and descriptions. Next, flexibility or variability of methods was common and effective for obtaining rich data and achieving understanding of a phenomenon. Finally, justification for how a QD approach was chosen and why it would be an appropriate fit for a particular study was limited in the sample and, therefore, in need of increased attention. Based on these findings, recommendations include encouragement to researchers to provide as many details as possible regarding the methods of their QD studies so that readers can determine whether the methods used were reasonable and effective in producing useful findings. © 2016 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Middleton, W. D.; Lundry, J. L.
1976-01-01
An integrated system of computer programs was developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. Schematics of the program structure and the individual overlays and subroutines are described.
ERIC Educational Resources Information Center
Ling, Guangming; Rijmen, Frank
2011-01-01
The factorial structure of the Time Management (TM) scale of the Student 360: Insight Program (S360) was evaluated based on a national sample. A general procedure with a variety of methods was introduced and implemented, including the computation of descriptive statistics, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA).…
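A minimal Python sketch of the descriptive-statistics and exploratory factor analysis steps mentioned above is given below, run on simulated Likert-type item responses; a confirmatory model would normally be fit in a dedicated SEM package and is not shown here.

    # Descriptive statistics and a single-factor exploratory factor analysis
    # on simulated item responses (all data and loadings are hypothetical).
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(200, 1))                  # one hypothetical time-management factor
    items = latent @ rng.uniform(0.5, 1.0, size=(1, 6)) + rng.normal(scale=0.5, size=(200, 6))

    print("item means:", items.mean(axis=0).round(2))
    print("item SDs:  ", items.std(axis=0, ddof=1).round(2))

    fa = FactorAnalysis(n_components=1).fit(items)
    print("loadings:", fa.components_.round(2))         # pattern of loadings on the single factor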
Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces the indices used for multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. The paper uses an example to describe how to do principal component regression analysis with SPSS 10.0, including all calculation processes of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable, and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity. A simplified, faster, and accurate statistical result is achieved through principal component regression analysis with SPSS.
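Outside SPSS, the same principal component regression idea can be sketched as a PCA step followed by ordinary least squares; the simulated predictors below are deliberately collinear to mimic the multicollinearity problem discussed above, and the code is illustrative rather than a reproduction of the SPSS procedure.

    # Principal component regression: PCA followed by linear regression.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(2)
    x1 = rng.normal(size=100)
    x2 = x1 + rng.normal(scale=0.05, size=100)     # nearly collinear with x1
    x3 = rng.normal(size=100)
    X = np.column_stack([x1, x2, x3])
    y = 2.0 * x1 + 0.5 * x3 + rng.normal(scale=0.3, size=100)

    # Regress on the first two principal components instead of the raw predictors
    pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, y)
    print("R^2 on the retained components:", round(pcr.score(X, y), 3))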
Qualitative Secondary Analysis: A Case Exemplar.
Tate, Judith Ann; Happ, Mary Beth
Qualitative secondary analysis (QSA) is the use of qualitative data that was collected by someone else or was collected to answer a different research question. Secondary analysis of qualitative data provides an opportunity to maximize data utility, particularly with difficult-to-reach patient populations. However, qualitative secondary analysis methods require careful consideration and explicit description to best understand, contextualize, and evaluate the research results. In this article, we describe methodologic considerations using a case exemplar to illustrate challenges specific to qualitative secondary analysis and strategies to overcome them. Copyright © 2017 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.
Research methods in nursing students' Bachelor's theses in Sweden: A descriptive study.
Johansson, Linda; Silén, Marit
2018-07-01
During the nursing programme in Sweden, students complete an independent project that allows them to receive both a professional qualification as a nurse and a Bachelor's degree. This project gives students the opportunity to develop and apply skills such as critical thinking, problem-solving and decision-making, thus preparing them for their future work. However, only a few, small-scale studies have analysed the independent project to gain more insight into how nursing students carry out this task. The aim of the present study was to describe the methods, including ethical considerations and assessment of data quality, applied in nursing students' independent Bachelor's degree projects in a Swedish context. A descriptive study with a quantitative approach. A total of 490 independent projects were analysed using descriptive statistics. Literature reviews were the predominant project form. References were often used to support the analysis method. They were not, however, always relevant to the method. This was also true of ethical considerations. When a qualitative approach was used, and data collected through interviews, the participants were typically professionals. In qualitative projects involving analysis of biographies/autobiographies or blogs, participants were either persons with a disease or next of kin of a person with a disease. Although most of the projects were literature reviews, it seemed unclear to the nursing students how the data should be analysed as well as what ethical issues should be raised in relation to the method. Consequently, further research and guidance are needed. In Sweden, independent projects are not considered research and are therefore not required to undergo ethics vetting. However, it is important that they be designed so as to avoid possible research ethics problems. Asking persons about their health, which occurred in some of the empirical projects, may therefore be considered questionable. Copyright © 2018 Elsevier Ltd. All rights reserved.
Video Analysis of Anterior Cruciate Ligament (ACL) Injuries
Carlson, Victor R.; Sheehan, Frances T.; Boden, Barry P.
2016-01-01
Background: As the most viable method for investigating in vivo anterior cruciate ligament (ACL) rupture, video analysis is critical for understanding ACL injury mechanisms and advancing preventative training programs. Despite the limited number of published studies involving video analysis, much has been gained through evaluating actual injury scenarios. Methods: Studies meeting criteria for this systematic review were collected by performing a broad search of the ACL literature with use of variations and combinations of video recordings and ACL injuries. Both descriptive and analytical studies were included. Results: Descriptive studies have identified specific conditions that increase the likelihood of an ACL injury. These conditions include close proximity to opposing players or other perturbations, high shoe-surface friction, and landing on the heel or the flat portion of the foot. Analytical studies have identified high-risk joint angles on landing, such as a combination of decreased ankle plantar flexion, decreased knee flexion, and increased hip flexion. Conclusions: The high-risk landing position appears to influence the likelihood of ACL injury to a much greater extent than inherent risk factors. As such, on the basis of the results of video analysis, preventative training should be applied broadly. Kinematic data from video analysis have provided insights into the dominant forces that are responsible for the injury (i.e., axial compression with potential contributions from quadriceps contraction and valgus loading). With the advances in video technology currently underway, video analysis will likely lead to enhanced understanding of non-contact ACL injury. PMID:27922985
Role of Media Rumors in the Modern Society
ERIC Educational Resources Information Center
Zheltukhina, Marina R.; Slyshkin, Gennady G.; Ponomarenko, Elena B.; Busygina, Maryana V.; Omelchenko, Anatoly V.
2016-01-01
The article examines the use of media rumors as a mechanism of pragmatic influence in modern communication. Printed and electronic messages containing rumors constitute the research material. Complex methods are used to analyze the role of rumors in modern society. The inductive, descriptive and comparative, cognitive and discursive,…
ERIC Educational Resources Information Center
Showanasai, Parinya; Lu, Jiafang; Hallinger, Philip
2013-01-01
Purpose: The extant literature on school leadership development is dominated by conceptual analysis, descriptive studies of current practice, critiques of current practice, and prescriptions for better ways to approach practice. Relatively few studies have examined impact of leadership development using experimental methods, among which even fewer…
ERIC Educational Resources Information Center
Campbell, Throy A.
2015-01-01
A phenomenological method was used to analyze ten international doctoral students' descriptions of their lived experiences at a United States (U.S.) university. The analysis was based on the theoretical premise of how students acculturate to their new educational settings. Three broad overlapping themes emerged: (1) participants' past experiences…
Inside the Black Box: Revealing the Process in Applying a Grounded Theory Analysis
ERIC Educational Resources Information Center
Rich, Peter
2012-01-01
Qualitative research methods have long set an example of rich description, in which data and researchers' hermeneutics work together to inform readers of findings in specific contexts. Among published works, insight into the analytical process is most often represented in the form of methodological propositions or research results. This paper…
Introduction to Multilevel Item Response Theory Analysis: Descriptive and Explanatory Models
ERIC Educational Resources Information Center
Sulis, Isabella; Toland, Michael D.
2017-01-01
Item response theory (IRT) models are the main psychometric approach for the development, evaluation, and refinement of multi-item instruments and scaling of latent traits, whereas multilevel models are the primary statistical method when considering the dependence between person responses when primary units (e.g., students) are nested within…
Preservice Teachers' Developing Conceptions of Teaching English Learners
ERIC Educational Resources Information Center
Kelly, Laura Beth
2018-01-01
In this study, 12 preservice teachers in a community college English as a second language (ESL) K-12 teacher education program drew pictures and wrote descriptions of teachers teaching English language learners (ELLs) at the beginning and end of an ESL methods course. Using content analysis, the researcher analyzed the drawings and descriptions…
Impact of E-Learning and Digitalization in Primary and Secondary Schools
ERIC Educational Resources Information Center
Tunmibi, Sunday; Aregbesola, Ayooluwa; Adejobi, Pascal; Ibrahim, Olaniyi
2015-01-01
This study examines the impact of e-learning and digitalization in primary and secondary schools, using Greensprings School in Lagos State, Nigeria as a case study. A questionnaire was used as the data collection instrument, and a descriptive statistical method was adopted for analysis. Responses from students and teachers reveal that application…
ERIC Educational Resources Information Center
Allen, Kelly-Ann; Kern, Margaret L.; Vella-Brodrick, Dianne; Waters, Lea
2018-01-01
Purpose: The vision or mission statement of a school outlines the school's purpose and defines the context, goals, and aspirations that govern the institution. Using vision and mission statements, the present descriptive research study investigated trends in Australian secondary schools' priorities. Research Methods: A stratified sample of…
A Descriptive Analysis of School Connectedness: The Views of School Personnel
ERIC Educational Resources Information Center
Biag, Manuelito
2016-01-01
Few studies have investigated school connectedness from the perspectives of the adults working in the school. Using qualitative methods, the present study examined three dimensions of school connectedness in one urban, low-income middle school. Analyses revealed that school personnel cared for students' needs, sometimes at the expense of holding…
An Analysis of Bullying Legislation among the Various States
ERIC Educational Resources Information Center
Hallford, Abby Jane Swanson
2009-01-01
Scope and method of study. The purpose of this study is to understand the existing state legislation concerning bullying in schools to determine whether the development, structure, and content of these state mandates parallel any change in reported incidents of bullying by public schools in each of those states. This is a descriptive and…
Stereotypes of Latin Americans Perpetuated in Secondary School History Textbooks.
ERIC Educational Resources Information Center
Cruz, Barbara C.
1994-01-01
This study reviewed six history textbooks widely used in grades 7-12 across the U.S. Using a story-line analysis, the findings of this study suggest: (1) textbooks reinforce negative stereotypes of Latin Americans as lazy, passive, irresponsible, and, somewhat paradoxically, lustful, animalistic and violent; (2) the method of description employed…
A summary and evaluation of semi-empirical methods for the prediction of helicopter rotor noise
NASA Technical Reports Server (NTRS)
Pegg, R. J.
1979-01-01
Existing prediction techniques are compiled and described. The descriptions include input and output parameter lists, required equations and graphs, and the range of validity for each part of the prediction procedures. Examples are provided illustrating the analysis procedure and the degree of agreement with experimental results.
Early Child Disaster Mental Health Interventions: A Review of the Empirical Evidence
ERIC Educational Resources Information Center
Pfefferbaum, Betty; Nitiéma, Pascal; Tucker, Phebe; Newman, Elana
2017-01-01
Background: The need to establish an evidence base for early child disaster interventions has been long recognized. Objective: This paper presents a descriptive analysis of the empirical research on early disaster mental health interventions delivered to children within the first 3 months post event. Methods: Characteristics and findings of the…
Farmers as Consumers of Agricultural Education Services: Willingness to Pay and Spend Time
ERIC Educational Resources Information Center
Charatsari, Chrysanthi; Papadaki-Klavdianou, Afroditi; Michailidis, Anastasios
2011-01-01
This study assessed farmers' willingness to pay for and spend time attending an Agricultural Educational Program (AEP). Primary data on the demographic and socio-economic variables of farmers were collected from 355 farmers selected randomly from Northern Greece. Descriptive statistics and multivariate analysis methods were used in order to meet…
Interviewing Youthful Suspects in Alleged Sex Crimes: A Descriptive Analysis
ERIC Educational Resources Information Center
Hershkowitz, Irit; Horowitz, Dvora; Lamb, Michael E.; Orbach, Yael; Sternberg, Kathleen J.
2004-01-01
Objective: To introduce and evaluate a structured interview protocol designed for investigative interviews of youthful alleged perpetrators of child sexual abuse. Method: Seventy-two alleged perpetrators ranging from 9 to 14 years of age (M=12 years) were interviewed by 1 of 13 experienced youth investigators, employed by the Israeli Ministry of…
An analytical approach to customer requirement information processing
NASA Astrophysics Data System (ADS)
Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong
2013-11-01
'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.
Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler
2016-01-01
This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
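A minimal sketch of Monte Carlo uncertainty propagation in the spirit of the analysis described above is shown below: elemental uncertainties on measured pressures are sampled and pushed through the isentropic Mach-number relation. All numerical values are hypothetical, not facility data.

    # Monte Carlo propagation of pressure-measurement uncertainty to Mach number.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    gamma = 1.4

    # Hypothetical measured values and 1-sigma elemental uncertainties
    p_total = rng.normal(101_325.0, 150.0, n)    # Pa
    p_static = rng.normal(18_800.0, 100.0, n)    # Pa

    # Isentropic relation: M = sqrt( 2/(gamma-1) * ((p0/p)^((gamma-1)/gamma) - 1) )
    mach = np.sqrt(2.0 / (gamma - 1.0) * ((p_total / p_static) ** ((gamma - 1.0) / gamma) - 1.0))
    print(f"Mach = {mach.mean():.4f} +/- {mach.std(ddof=1):.4f} (1-sigma, Monte Carlo)")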
Research accomplished at the Knowledge Based Systems Lab: IDEF3, version 1.0
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Menzel, Christopher P.; Mayer, Paula S. D.
1991-01-01
An overview is presented of the foundations and content of the evolving IDEF3 process flow and object state description capture method. This method is currently in beta test. Ongoing efforts in the formulation of formal semantics models for descriptions captured in the outlined form and in the actual application of this method can be expected to cause an evolution in the method language. A language is described for the representation of process and object state centered system description. IDEF3 is a scenario driven process flow modeling methodology created specifically for these types of descriptive activities.
Fotina, I; Lütgendorf-Caucig, C; Stock, M; Pötter, R; Georg, D
2012-02-01
Inter-observer studies represent a valid method for the evaluation of target definition uncertainties and contouring guidelines. However, data from the literature do not yet give clear guidelines for reporting contouring variability. Thus, the purpose of this work was to compare and discuss various methods to determine variability on the basis of clinical cases and a literature review. In this study, 7 prostate and 8 lung cases were contoured on CT images by 8 experienced observers. Analysis of variability included descriptive statistics, calculation of overlap measures, and statistical measures of agreement. Cross tables with ratios and correlations were established for overlap parameters. It was shown that the minimal set of parameters to be reported should include at least one of three volume overlap measures (i.e., generalized conformity index, Jaccard coefficient, or conformation number). High correlation between these parameters and scatter of the results was observed. A combination of descriptive statistics, overlap measure, and statistical measure of agreement or reliability analysis is required to fully report the interrater variability in delineation.
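One of the overlap measures named above, the Jaccard coefficient, can be sketched in a few lines of Python for two hypothetical binary delineation masks of the same CT slice:

    # Jaccard coefficient between two observers' binary contour masks.
    import numpy as np

    observer_1 = np.zeros((64, 64), dtype=bool)
    observer_2 = np.zeros((64, 64), dtype=bool)
    observer_1[20:45, 20:45] = True           # hypothetical target contour, observer 1
    observer_2[22:47, 18:43] = True           # slightly shifted contour, observer 2

    intersection = np.logical_and(observer_1, observer_2).sum()
    union = np.logical_or(observer_1, observer_2).sum()
    print(f"Jaccard coefficient: {intersection / union:.3f}")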
Eastwood, John Graeme; Jalaludin, Bin Badrudin; Kemp, Lynn Ann; Phung, Hai Ngoc
2014-01-01
We have previously reported in this journal on an ecological study of perinatal depressive symptoms in South Western Sydney. In that article, we briefly reported on a factor analysis that was utilized to identify empirical indicators for analysis. In this article, we report on the mixed method approach that was used to identify those latent variables. Social epidemiology has been slow to embrace a latent variable approach to the study of social, political, economic, and cultural structures and mechanisms, partly for philosophical reasons. Critical realist ontology and epistemology have been advocated as an appropriate methodological approach to both theory building and theory testing in the health sciences. We describe here an emergent mixed method approach that uses qualitative methods to identify latent constructs followed by factor analysis using empirical indicators chosen to measure identified qualitative codes. Comparative analysis of the findings is reported together with a limited description of realist approaches to abstract reasoning.
Moving from Descriptive to Causal Analytics: Case Study of the Health Indicators Warehouse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, Jack C.; Shankar, Mallikarjun; Xu, Songhua
The KDD community has described a multitude of methods for knowledge discovery on large datasets. We consider some of these methods and integrate them into an analyst's workflow that proceeds from the data-centric descriptive level to the model-centric causal level. Examples of the workflow are shown for the Health Indicators Warehouse, which is a public database for community health information that is a potent resource for conducting data science on a medium scale. We demonstrate the potential of HIW as a source of serious visual analytics efforts by showing correlation matrix visualizations, multivariate outlier analysis, multiple linear regression of Medicare costs, and scatterplot matrices for a broad set of health indicators. We conclude by sketching the first steps toward a causal dependence hypothesis.
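The descriptive-level steps mentioned above (correlation matrix, multivariate outlier analysis, multiple linear regression) can be sketched in Python as below; the indicator names and values are hypothetical and do not reproduce the Health Indicators Warehouse variables.

    # Correlation matrix, Mahalanobis-distance outlier screening, and
    # multiple linear regression on hypothetical community health indicators.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(5)
    df = pd.DataFrame({
        "obesity_rate": rng.normal(30, 5, 200),
        "smoking_rate": rng.normal(20, 4, 200),
        "median_income": rng.normal(55, 10, 200),
    })
    df["medicare_cost"] = 0.8 * df["obesity_rate"] + 0.5 * df["smoking_rate"] + rng.normal(0, 3, 200)

    print(df.corr().round(2))                           # correlation matrix

    X = df[["obesity_rate", "smoking_rate", "median_income"]].to_numpy()
    centered = X - X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    mahal = np.sqrt(np.einsum("ij,jk,ik->i", centered, cov_inv, centered))
    print("potential multivariate outliers:", np.where(mahal > 3.5)[0])

    coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(df)), X]), df["medicare_cost"], rcond=None)
    print("regression coefficients (intercept first):", coef.round(2))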
Reasons for leaving nursing: a study among Turkish nurses.
Gök, Ayşen Uğur; Kocaman, Gülseren
2011-08-01
Reasons for the growing nursing shortage are often complex and multidimensional. To explore the phenomenon of why Turkish nurses leave nursing. The sample in this descriptive study was 134 nurses who had left the profession. A snowball sampling method was used to identify subjects and multiple methods were used to elicit reasons for leaving. Data analysis included descriptive statistics. The main reasons for leaving nursing were related to unsatisfactory working conditions and a negative perception of nursing. Of the respondents, 69.4% received education in a non-nursing field. The most popular career choice was teaching (27.6%). The results of this study indicate that working conditions and public opinion adversely affect a nurse's interest in the profession. The results of the study indicate a need to improve working conditions and to approach this subject from a multidimensional perspective.
NASA Technical Reports Server (NTRS)
Kim, H.; Crawford, F. W.
1977-01-01
It is pointed out that the conventional iterative analysis of nonlinear plasma wave phenomena, which involves a direct use of Maxwell's equations and the equations describing the particle dynamics, leads to formidable theoretical and algebraic complexities, especially for warm plasmas. As an effective alternative, the Lagrangian method may be applied. It is shown how this method may be used in the microscopic description of small-signal wave propagation and in the study of nonlinear wave interactions. The linear theory is developed for an infinite, homogeneous, collisionless, warm magnetoplasma. A summary is presented of a perturbation expansion scheme described by Galloway and Kim (1971), and Lagrangians to third order in perturbation are considered. Attention is given to the averaged-Lagrangian density, the action-transfer and coupled-mode equations, and the general solution of the coupled-mode equations.
Crosta, Fernando; Nishiwaki-Dantas, Maria Cristina; Silvino, Wilmar; Dantas, Paulo Elias Correa
2005-01-01
To verify the frequency of study design, applied statistical analysis, and approval by institutional review offices (Ethics Committees) of articles published in the "Arquivos Brasileiros de Oftalmologia" during a 10-year interval, with subsequent comparative and critical analysis against some of the main international journals in the field of ophthalmology. A systematic review without meta-analysis was performed. Scientific papers published in the "Arquivos Brasileiros de Oftalmologia" between January 1993 and December 2002 were reviewed by two independent reviewers and classified according to the applied study design, statistical analysis, and approval by institutional review offices. Descriptive statistical analysis was used to categorize these variables. After applying inclusion and exclusion criteria, 584 articles were reviewed for evaluation of statistical analysis and 725 articles for evaluation of study design. Contingency tables (23.10%) were the most frequently applied statistical method, followed by non-parametric tests (18.19%), Student's t test (12.65%), central tendency measures (10.60%), and analysis of variance (9.81%). Of the 584 reviewed articles, 291 (49.82%) presented no statistical analysis. Observational case series (26.48%) was the most frequently used study design, followed by interventional case series (18.48%), observational case description (13.37%), non-random clinical study (8.96%), and experimental study (8.55%). We found a predominance of observational clinical studies and a lack of statistical analysis in almost half of the published papers. An increase in studies with Ethics Committee approval was noted after it became mandatory in 1996.
Nursing management of sensory overload in psychiatry – development of a theoretical framework model
Scheydt, Stefan; Needham, Ian; Nielsen, Gunnar H; Behrens, Johann
2016-09-01
Background: The concept of “removal from stimuli” has already been examined in a Delphi study. However, some knowledge gaps remained, which have now been further investigated. Aim: Examination of the concept “management of sensory overload in inpatient psychiatry,” including its sub-concepts and specific measures. Method: Analysis of qualitative data about “removal from stimuli” by content analysis according to Mayring. Results: A theoretical description and definition of the concept was achieved. In addition, sub-concepts (removal from stimuli, modulation of environmental factors, helping patients to help themselves) could be identified, theoretically defined, and complemented by possible specific measures. Conclusions: The conceptual descriptions provide a further step toward raising awareness among professionals in this subject area. Furthermore, they create a theoretical basis for further empirical studies.
Goedecke, Thomas; Morales, Daniel R; Pacurariu, Alexandra; Kurz, Xavier
2018-03-01
Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non-European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. From 1246 screened articles, 229 were eligible for full-text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. Interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill-over effects were rarely evaluated. Two-thirds used before-after time series and 15.7% before-after cross-sectional study designs. Various analytical approaches were applied including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Whilst impact evaluation of pharmacovigilance and product-specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. © 2017 The Authors. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.
Development of Clinical Vignettes to Describe Alzheimer's Disease Health States: A Qualitative Study
Oremus, Mark; Xie, Feng; Gaebel, Kathryn
2016-01-01
Aims: To develop clinical descriptions (vignettes) of life with Alzheimer’s disease (AD), we conducted focus groups of persons with AD (n = 14), family caregivers of persons with AD (n = 20), and clinicians who see persons with AD in their practices (n = 5). Methods: Group participants read existing descriptions of AD and commented on the realism and comprehensibility of the descriptions. We used thematic framework analysis to code the comments into themes and develop three new vignettes to describe mild, moderate, and severe AD. Results: Themes included the types of symptoms to mention in the new vignettes, plus the manner in which the vignettes should be written. Since the vignette descriptions were based on focus group participants’ first-hand knowledge of AD, the descriptions can be said to demonstrate content validity. Conclusion: Members of the general public can read the vignettes and estimate their health-related quality-of-life (HRQoL) as if they had AD based on the vignette descriptions. This is especially important for economic evaluations of new AD medications, which require HRQoL to be assessed in a manner that persons with AD often find difficult to undertake. The vignettes will allow the general public to serve as a proxy and provide HRQoL estimates in place of persons with AD. PMID:27589604
NASA Astrophysics Data System (ADS)
Zhukotsky, Alexander V.; Kogan, Emmanuil M.; Kopylov, Victor F.; Marchenko, Oleg V.; Lomakin, O. A.
1994-07-01
A new method for morphodensitometric analysis of blood cells was applied to medical screening for pathologies of ecological influence and infection. A complex algorithm of computational image processing was created for research on supramolecular restructuring of the interphase chromatin of lymphocytes. It includes specific staining methods and unifies different quantitative analysis methods. Our experience with the use of a television image analyzer in cytological and immunological studies made it possible to carry out research in the morphometric analysis of chromatin structure in interphase lymphocyte nuclei in genetic and viral pathologies. In our study, to characterize lymphocytes as an image-forming system by a rigorous mathematical description, we used an approach involving concomitant evaluation of the topography of the chromatin network in intact and affected (victims') lymphocytes. The digitized data revealed significant distinctions between control and experiment. The method allows us to observe minute structural changes in chromatin, especially eu- and heterochromatin, which were previously studied by geneticists only in chromosomes.
Methods for structural design at elevated temperatures
NASA Technical Reports Server (NTRS)
Ellison, A. M.; Jones, W. E., Jr.; Leimbach, K. R.
1973-01-01
A procedure which can be used to design elevated temperature structures is discussed. The desired goal is to have the same confidence in the structural integrity at elevated temperature as the factor of safety gives on mechanical loads at room temperature. Methods of design and analysis for creep, creep rupture, and creep buckling are presented. Example problems are included to illustrate the analytical methods. Creep data for some common structural materials are presented. Appendix B contains a description, user's manual, and listing for the creep analysis program. The program predicts time to a given creep or to creep rupture for a material subjected to a specified stress-temperature-time spectrum. Fatigue at elevated temperature is discussed. Methods of analysis for high stress-low cycle fatigue, fatigue below the creep range, and fatigue in the creep range are included. The interaction of thermal fatigue and mechanical loads is considered, and a detailed approach to fatigue analysis is given for structures operating below the creep range.
The Description of Shale Reservoir Pore Structure Based on Method of Moments Estimation
Li, Wenjie; Wang, Changcheng; Shi, Zejin; Wei, Yi; Zhou, Huailai; Deng, Kun
2016-01-01
Shale has been considered a good gas reservoir due to its abundant interior nanoscale pores. Thus, the study of the pore structure of shale is of great significance for the evaluation and development of shale oil and gas. To date, the most widely used approaches for studying the shale pore structure include image analysis, radiation and fluid invasion methods. The detailed pore structures can be studied intuitively by image analysis and radiation methods, but the results obtained are quite sensitive to sample preparation, equipment performance and experimental operation. In contrast, the fluid invasion method can be used to obtain information on pore size distribution and pore structure, but the relatively simple parameters derived cannot be used to evaluate the pore structure of shale comprehensively and quantitatively. To characterize the nanoscale pore structure of shale reservoir more effectively and expand the current research techniques, we proposed a new method based on gas adsorption experimental data and the method of moments to describe the pore structure parameters of shale reservoir. Combined with the geological mixture empirical distribution and the method of moments estimation principle, the new method calculates the characteristic parameters of shale, including the mean pore size (x¯), standard deviation (σ), skewness (Sk) and variation coefficient (c). These values are found by reconstructing the grouping intervals of observation values and optimizing algorithms for eigenvalues. This approach assures a more effective description of the characteristics of nanoscale pore structures. Finally, the new method has been applied to analyze the Yanchang shale in the Ordos Basin (China) and Longmaxi shale from the Sichuan Basin (China). The results obtained reveal the pore characteristics of shale well, indicating the feasibility of this new method in the study of the pore structure of shale reservoir. PMID:26992168
The Description of Shale Reservoir Pore Structure Based on Method of Moments Estimation.
Li, Wenjie; Wang, Changcheng; Shi, Zejin; Wei, Yi; Zhou, Huailai; Deng, Kun
2016-01-01
Shale has been considered a good gas reservoir due to its abundant interior nanoscale pores. Thus, the study of the pore structure of shale is of great significance for the evaluation and development of shale oil and gas. To date, the most widely used approaches for studying the shale pore structure include image analysis, radiation and fluid invasion methods. The detailed pore structures can be studied intuitively by image analysis and radiation methods, but the results obtained are quite sensitive to sample preparation, equipment performance and experimental operation. In contrast, the fluid invasion method can be used to obtain information on pore size distribution and pore structure, but the relatively simple parameters derived cannot be used to evaluate the pore structure of shale comprehensively and quantitatively. To characterize the nanoscale pore structure of shale reservoir more effectively and expand the current research techniques, we proposed a new method based on gas adsorption experimental data and the method of moments to describe the pore structure parameters of shale reservoir. Combined with the geological mixture empirical distribution and the method of moments estimation principle, the new method calculates the characteristic parameters of shale, including the mean pore size (mean), standard deviation (σ), skewness (Sk) and variation coefficient (c). These values are found by reconstructing the grouping intervals of observation values and optimizing algorithms for eigenvalues. This approach assures a more effective description of the characteristics of nanoscale pore structures. Finally, the new method has been applied to analyze the Yanchang shale in the Ordos Basin (China) and Longmaxi shale from the Sichuan Basin (China). The results obtained reveal the pore characteristics of shale well, indicating the feasibility of this new method in the study of the pore structure of shale reservoir.
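The characteristic parameters named in the abstract can be illustrated with a short moment calculation over grouped pore-size intervals. The sketch below uses assumed interval midpoints and volume fractions, not data from the Yanchang or Longmaxi shales, and computes the weighted mean pore size, standard deviation, skewness, and coefficient of variation in the spirit of the method of moments.

```python
import numpy as np

# Grouped gas-adsorption data: interval midpoints (nm) and volume fractions.
# Values are illustrative only.
midpoints = np.array([2.0, 4.0, 8.0, 16.0, 32.0, 64.0])
fractions = np.array([0.25, 0.30, 0.20, 0.15, 0.07, 0.03])
weights = fractions / fractions.sum()

mean = np.sum(weights * midpoints)                                  # mean pore size
variance = np.sum(weights * (midpoints - mean) ** 2)
sigma = np.sqrt(variance)                                           # standard deviation
skewness = np.sum(weights * (midpoints - mean) ** 3) / sigma ** 3   # Sk
variation = sigma / mean                                            # coefficient of variation c

print(f"mean={mean:.2f} nm, sigma={sigma:.2f} nm, Sk={skewness:.2f}, c={variation:.2f}")
```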
[Preoperative psychoprophylaxis in childhood. Results of a hospital program].
Admetlla i Admetlla, I A; Jover i Fulgueira, S
1988-05-01
Results of a surgical psychoprophylaxis program, theoretically and technically framed within psychoanalytic theory, are presented. The report also comprises a description of the method used, as well as the criteria by which the authors determined whether or not a child is ready for surgery. Results obtained with 134 children and a description of those who showed post-surgical disturbances are presented. The percentage of disorders is analyzed by age group, showing that the highest risk is among children up to five years of age, coinciding with findings put forth by other authors. Finally, some conclusions in relation to the prevention of psychological iatrogenic disorders in pediatric surgery are drawn.
Methods for the Study of Gonadal Development.
Piprek, Rafal P
2016-01-01
Current knowledge on gonadal development and sex determination is the product of many decades of research involving a variety of scientific methods from different biological disciplines such as histology, genetics, biochemistry, and molecular biology. The earliest embryological investigations, followed by the invention of microscopy and staining methods, were based on histological examinations. The most robust development of histological staining techniques occurred in the second half of the nineteenth century and resulted in structural descriptions of gonadogenesis. These first studies on gonadal development were conducted on domesticated animals; however, currently the mouse is the most extensively studied species. The next key point in the study of gonadogenesis was the advancement of methods allowing for the in vitro culture of fetal gonads. For instance, this led to the description of the origin of cell lines forming the gonads. Protein detection using antibodies and immunolabeling methods and the use of reporter genes were also invaluable for developmental studies, enabling the visualization of the formation of gonadal structure. Recently, genetic and molecular biology techniques, especially gene expression analysis, have revolutionized studies on gonadogenesis and have provided insight into the molecular mechanisms that govern this process. The successive invention of new methods is reflected in the progress of research on gonadal development.
The purpose of this SOP is to describe how to collect, store, and ship tap and drinking water samples for analysis by EPA Method 525.2 (revision 1.0) and EPA Method 531.1 (revision 3) for the NHEXAS Arizona project. This SOP provides a brief description of the sample containers,...
[A rapid methodology for drug intelligence using profiling of illicit heroin samples].
Zhang, Jianxin; Chen, Cunyi
2012-07-01
The aim of the paper was to evaluate the link between two heroin seizures using a descriptive method. The system involved derivatization and gas chromatographic separation of samples, followed by fully automatic data analysis and transfer to a database. Comparisons used the squared cosine function between two chromatograms treated as vectors. The method showed good discriminatory capability, and the probability of false positives was extremely low. In conclusion, this method proved to be efficient and reliable and appears suitable for estimating the links between illicit heroin samples.
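The squared-cosine comparison mentioned above amounts to treating each chromatogram as a vector of peak areas and scoring the angle between two such vectors. The following sketch is a generic illustration of that similarity score, not the authors' database system; the peak-area values are invented.

```python
import numpy as np

def squared_cosine(x: np.ndarray, y: np.ndarray) -> float:
    """Squared cosine similarity between two chromatogram vectors."""
    num = float(np.dot(x, y)) ** 2
    den = float(np.dot(x, x)) * float(np.dot(y, y))
    return num / den if den else 0.0

# Illustrative peak-area vectors for heroin seizures (same target compounds).
seizure_a = np.array([12.1, 3.4, 0.8, 7.6, 5.2])
seizure_b = np.array([11.7, 3.1, 0.9, 7.9, 4.8])
seizure_c = np.array([2.0, 9.5, 4.4, 0.3, 1.1])

print(squared_cosine(seizure_a, seizure_b))  # close to 1 -> likely linked
print(squared_cosine(seizure_a, seizure_c))  # much lower  -> likely unlinked
```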
Small sample estimation of the reliability function for technical products
NASA Astrophysics Data System (ADS)
Lyamets, L. L.; Yakimenko, I. V.; Kanishchev, O. A.; Bliznyuk, O. A.
2017-12-01
It is demonstrated that, in the absence of large statistical samples obtained from testing complex technical products to failure, statistical estimation of the reliability function of initial elements can be made by the method of moments. A formal description of the method of moments is given and its advantages in the analysis of small censored samples are discussed. A modified algorithm is proposed for implementing the method of moments using only the moments at which the failures of initial elements occur.
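As a minimal, hedged illustration of moment-based reliability estimation from a small failure sample, the sketch below assumes an exponential lifetime model (not necessarily the distribution used by the authors), in which matching the first sample moment fixes the rate parameter and hence the reliability function.

```python
import numpy as np

# Small sample of observed failure times (hours); values are illustrative only.
failure_times = np.array([120.0, 340.0, 95.0, 410.0, 260.0])

# Method of moments for an exponential model: match E[T] = 1/lambda to the sample mean.
lam = 1.0 / failure_times.mean()

def reliability(t: float) -> float:
    """Estimated reliability function R(t) = exp(-lambda * t)."""
    return float(np.exp(-lam * t))

for t in (50, 100, 250, 500):
    print(f"R({t} h) = {reliability(t):.3f}")
```

Censored observations, which the abstract emphasizes, would require adjusted moment equations rather than the plain sample mean used here.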
Safety Culture Assessment in Petrochemical Industry: A Comparative Study of Two Algerian Plants
Boughaba, Assia; Hassane, Chabane; Roukia, Ouddai
2014-01-01
Background: To elucidate the relationship between safety culture maturity and safety performance of a particular company. Methods: To identify the factors that contribute to a safety culture, a survey questionnaire was created based mainly on the studies of Fernández-Muñiz et al. The survey was randomly distributed to 1000 employees of two oil companies and achieved a valid response rate of 51%. Minitab 16 software was used, and several tests, including descriptive statistical analysis, factor analysis, reliability analysis, mean analysis, and correlation, were applied to the data. Ten factors were extracted using factor analysis to represent safety culture and safety performance. Results: The results of this study showed that managers' commitment, training, incentives, communication, and employee involvement are the priority domains on which improvement efforts should be focused, as all of their descriptive average values were lower than 3.0 at Company B. Furthermore, the results also showed that safety culture influences the safety performance of the company: Company A, with a good safety culture (descriptive average values above 4.0), is more successful than Company B in terms of accident rates. Conclusion: The comparison between the two petrochemical plants of the Sonatrach group confirms these results, in which Company A, whose managers are English and Norwegian and which is distinguished by the maturity of its safety culture, has significantly higher evaluations than Company B, which is staffed by Algerian personnel, in terms of safety management practices and safety performance. PMID:25180135
Compositional Analysis of Lignocellulosic Feedstocks. 1. Review and Description of Methods
2010-01-01
As interest in lignocellulosic biomass feedstocks for conversion into transportation fuels grows, the summative compositional analysis of biomass, or plant-derived material, becomes ever more important. The sulfuric acid hydrolysis of biomass has been used to measure lignin and structural carbohydrate content for more than 100 years. Researchers have applied these methods to measure the lignin and structural carbohydrate contents of woody materials, estimate the nutritional value of animal feed, analyze the dietary fiber content of human food, compare potential biofuels feedstocks, and measure the efficiency of biomass-to-biofuels processes. The purpose of this paper is to review the history and lineage of biomass compositional analysis methods based on a sulfuric acid hydrolysis. These methods have become the de facto procedure for biomass compositional analysis. The paper traces changes to the biomass compositional analysis methods through time to the biomass methods currently used at the National Renewable Energy Laboratory (NREL). The current suite of laboratory analytical procedures (LAPs) offered by NREL is described, including an overview of the procedures and methodologies and some common pitfalls. Suggestions are made for continuing improvement to the suite of analyses. PMID:20669951
NASA Technical Reports Server (NTRS)
Middleton, W. D.; Lundry, J. L.; Coleman, R. G.
1976-01-01
An integrated system of computer programs was developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. This user's manual contains a description of the system, an explanation of its usage, the input definition, and example output.
NASA Technical Reports Server (NTRS)
Wolsko, T.; Buehring, W.; Cirillo, R.; Gasper, J.; Habegger, L.; Hub, K.; Newsom, D.; Samsa, M.; Stenehjem, E.; Whitfield, R.
1980-01-01
The energy systems concerned are the satellite power system, several coal technologies, geothermal energy, fission, fusion, terrestrial solar systems, and ocean thermal energy conversion. Guidelines are suggested for the characterization of these systems, side-by-side analysis, alternative futures analysis, and integration and aggregation of data. A description of the methods for assessing the technical, economic, environmental, societal, and institutional issues surrounding the development of the selected energy technologies is presented.
Analysis of supersonic combustion flow fields with embedded subsonic regions
NASA Technical Reports Server (NTRS)
Dash, S.; Delguidice, P.
1972-01-01
The viscous characteristic analysis for supersonic chemically reacting flows was extended to include provisions for analyzing embedded subsonic regions. The numerical method developed to analyze these mixed subsonic-supersonic flow fields is described. The boundary conditions related to the supersonic-subsonic and subsonic-supersonic transitions are discussed, along with a heuristic description of several other numerical schemes for analyzing this problem. An analysis of shock waves generated either by pressure mismatch between the injected fluid and the surrounding flow or by chemical heat release is also described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgans, D. L.; Lindberg, S. L.
The purpose of this technical approach document (TAD) is to document the assumptions, equations, and methods used to perform the groundwater pathway radiological dose calculations for the revised Hanford Site Composite Analysis (CA). DOE M 435.1-1 states: “The composite analysis results shall be used for planning, radiation protection activities, and future use commitments to minimize the likelihood that current low-level waste disposal activities will result in the need for future corrective or remedial actions to adequately protect the public and the environment.”
General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.
Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng
2017-05-02
As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and a unified description for the nonlinear behaviour and the systematic and quantitative analysis of the underlying mechanisms of these lasers have not been developed. Here, we present a complexity science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system designs of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.
Atomic characterization of Si nanoclusters embedded in SiO2 by atom probe tomography
2011-01-01
Silicon nanoclusters are of prime interest for new generations of optoelectronic and microelectronic components. Physical properties (light emission, carrier storage, etc.) of systems using such nanoclusters depend strongly on nanostructural characteristics. These characteristics (size, composition, distribution, and interface nature) have until now been obtained using conventional high-resolution analytical methods, such as high-resolution transmission electron microscopy, EFTEM, or EELS. In this article, a complementary technique, atom probe tomography, was used to study a multilayer (ML) system containing silicon clusters. This technique and its analysis give information on the structure at the atomic level and provide information complementary to that from other techniques. The different steps of such an analysis (sample preparation, atom probe analysis, and data treatment) are detailed. An atomic-scale description of the Si nanocluster/SiO2 ML is given. This system is composed of 3.8-nm-thick SiO layers and 4-nm-thick SiO2 layers annealed for 1 h at 900°C. PMID:21711666
The U.S. Market for Broadband Over Powerline, 3rd edition
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2007-07-15
The report provides a study of the potential market for BPL technology in the U.S., including a look at the past, present, and future state of Broadband over Powerline (BPL) in the U.S. The scope of the report includes the following topics: a description of the history of powerline communications (PLC) and broadband over powerline (BPL) technology; an analysis of key drivers of BPL within the electric utility and internet access industries; an overview of BPL technology and architecture; a comparison of BPL with alternative broadband access methods; an analysis of technological, regulatory, and business barriers to BPL's success; identification of the key applications and markets for BPL; a description of business models for BPL; an analysis of key market trends in broadband internet access; a review of the market development of cable modem broadband access; profiles of major U.S. BPL market participants; and profiles of major U.S. BPL projects.
ERIC Educational Resources Information Center
Hull, Daniel M.; Lovett, James E.
This task analysis report for the Robotics/Automated Systems Technician (RAST) curriculum project first provides a RAST job description. It then discusses the task analysis, including the identification of tasks, the grouping of tasks according to major areas of specialty, and the comparison of the competencies to existing or new courses to…
Recent Advances in the Analysis of Spiral Bevel Gears
NASA Technical Reports Server (NTRS)
Handschuh, Robert F.
1997-01-01
A review of recent progress in the analysis of spiral bevel gears is presented. The foundation of this work relies on the description of the gear geometry of face-milled spiral bevel gears via the approach developed by Litvin. This methodology was extended by combining the basic gear design data with the manufactured surfaces using a differential geometry approach, providing the data necessary for assembling three-dimensional finite element models. The finite element models have been utilized to conduct thermal and structural analyses of the gear system. Examples of the methods developed for thermal and structural/contact analysis are presented.
Gebresenbet, Girma
2018-01-01
Consumers’ demand for locally produced and organic foods has increased in Sweden. This paper presents the results obtained from the analysis of data acquired from 100 consumers in Sweden who participated in an online survey during March to June 2016. The objective was to identify consumers’ demand in relation to organic food and sustainable food production, and to understand how the consumers evaluate food quality and make buying decisions. Qualitative descriptions, descriptive statistics and Pearson’s Chi-square test (with alpha value of p < 0.05 as level of significance), and Pearson’s correlation coefficient were used for analysis. About 72% of participants have the perception that organic food production method is more sustainable than conventional methods. Female consumers have more positive attitudes than men towards organic food. However, age difference, household size and income level do not significantly influence the consumers’ perception of sustainable food production concepts. Regionality, sustainable methods of production and organic production are the most important parameters to characterize the food as high quality and make buying decisions. On the other hand, product uniformity, appearance, and price were found to be relatively less important parameters. Food buying decisions and food quality were found to be highly related with Pearson’s correlation coefficient of r = 0.99. PMID:29614785
Bosona, Techane; Gebresenbet, Girma
2018-04-01
Consumers' demand for locally produced and organic foods has increased in Sweden. This paper presents the results obtained from the analysis of data acquired from 100 consumers in Sweden who participated in an online survey during March to June 2016. The objective was to identify consumers' demand in relation to organic food and sustainable food production, and to understand how the consumers evaluate food quality and make buying decisions. Qualitative descriptions, descriptive statistics and Pearson's Chi-square test (with alpha value of p < 0.05 as level of significance), and Pearson's correlation coefficient were used for analysis. About 72% of participants have the perception that organic food production method is more sustainable than conventional methods. Female consumers have more positive attitudes than men towards organic food. However, age difference, household size and income level do not significantly influence the consumers' perception of sustainable food production concepts. Regionality, sustainable methods of production and organic production are the most important parameters to characterize the food as high quality and make buying decisions. On the other hand, product uniformity, appearance, and price were found to be relatively less important parameters. Food buying decisions and food quality were found to be highly related with Pearson's correlation coefficient of r = 0.99.
Age-differentiated Risk Factors of Suicidal Ideation among Young and Middle-aged Korean Adults
Jo, Ahra; Jeon, Minho; Oh, Heeyoung
2017-01-01
Objectives: This study aimed to determine the prevalence of suicidal ideation among young and middle-aged adults, and explore the risk factors that affect suicidal ideation. Methods: A descriptive study design was used for secondary data analysis. A total sample of 5,214 was drawn from two waves (2012–2013) of the 7th Korea Health Panel (KHP) survey. The KHP data were collected by a well-trained interviewer using the face-to-face method during home visits as well as a self-report method. Descriptive statistics of frequency, percentage, chi-square test, and logistic regression analysis were performed using SPSS 22.0. Results: The prevalence of suicidal ideation in young and middle-aged adults was 4.4% and 5.6%, respectively. For young adults, suicidal ideation risk was higher among those with low income or heavy drinking habits. In middle-aged adults, low income, poor perceived health status, negative perception of peer-compared health status, and negative social perspective were the major risk factors. Conclusion: There is considerable risk of suicidal ideation in adulthood. Opportunities for increased income, avoidance of heavy drinking, and the construction of positive subjective health status and social perspective should be considered in suicide prevention interventions for Korean young and middle-aged adults. PMID:28781943
Navarrete, J; Magliano, J; Martínez, M; Bazzano, C
2018-04-01
The primary goal of Mohs micrographic surgery (MMS) is to completely excise a cancerous lesion, and a wide range of reconstructive techniques of varying complexity are used to close the resulting wound. In this study, we performed a descriptive analysis of patients who underwent MMS, with a focus on wound closure methods. We conducted a bidirectional descriptive cohort analysis of all MMS procedures performed by a single surgeon between November 2013 and April 2016. Cosmetic outcomes were photographically assessed by a dermatologist after a minimum follow-up of 90 days. We analyzed 100 MMS procedures in 71 patients with a median age of 73 years. The tumors were basal cell carcinoma (70%), squamous cell carcinoma (29%), and dermatofibrosarcoma protuberans (1%); 75% were located on the head and neck. The reconstructive techniques used were flap closure (48%), simple closure (36%), closure by second intention (11%), and other (5%). Cosmetic outcomes were assessed for 70 procedures (47 patients), and the results were rated as excellent in 20% of cases, very good in 40%, good in 20%, moderate in 17%, and bad/very bad in 2.9%. No significant associations were observed between cosmetic outcome and sex, Fitzpatrick skin type, hypertension, diabetes mellitus, or smoking. Worse outcomes, however, were significantly associated with larger tumor areas and defects, location on the trunk, and flap and second-intention closure. Although there was a tendency to use simple wound closure for lesions located on the trunk and surgical defects of under 4.4 cm², the choice of reconstructive technique should be determined by individual circumstances, with consideration of clinical and tumor-related factors and the preference and experience of the surgeon. Copyright © 2017 AEDV. Published by Elsevier España, S.L.U. All rights reserved.
Laccourreye, O; Bonfils, P; Denoyelle, F; Garrel, R; Jankowski, R; Karkas, A; Makeieff, M; Righini, C; Vincent, C; Martin, C
2015-09-01
To evaluate characteristics, suggested modifications and reasons for rejection in scientific articles submitted for publication in the European Annals of Otorhinolaryngology, Head and Neck Diseases. A prospective study analyzed the flaws noted by reviewers in 52 scientific articles submitted to the European Annals of Otorhinolaryngology, Head and Neck Diseases between August 31, 2014 and February 28, 2015. Fifteen flaws concerning content and 7 concerning form were identified. In more than 25% of submissions, major flaws were noted: purely descriptive paper; lack of contribution to existing state of knowledge; failure to define a clear study objective and/or analyze the impact of major variables; poorly structured Materials and methods section, lacking description of study population, objective and/or variables; lack of or inappropriate statistical analysis; Introduction verbose and/or misrepresenting the literature; excessively heterogeneous and/or poorly described study population; imprecise discussion, straying from the point, overstating the significance of results and/or introducing new results not mentioned in the Results section; description of the study population placed in the Results section instead of under Materials and methods; serious mistakes of syntax, spelling and/or tense; and failure to follow the Instructions to Authors. After review, 21.1% of articles were published, 65.3% rejected and 13.4% non-resubmitted within 3 months of review. On univariate analysis, the only variable increasing the percentage of articles accepted was the topic not being devoted to head and neck surgery (P=0.03). These results document the excessive flaw rate still to be found in manuscripts and demonstrate the continuing need for authors to master and implement the rules of scientific medical writing. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Novel Method for Low-Rate DDoS Attack Detection
NASA Astrophysics Data System (ADS)
Chistokhodova, A. A.; Sidorov, I. D.
2018-05-01
The relevance of this work stems from the increasing number of advanced types of DDoS attacks, in particular low-rate HTTP flood. Over the last year, the power and complexity of such attacks increased significantly. The article is devoted to the analysis of DDoS attack detection methods and their modifications with the purpose of increasing the accuracy of DDoS attack detection. The article details the features of low-rate attacks in comparison with conventional DDoS attacks. During the analysis, significant shortcomings of the available methods for detecting low-rate DDoS attacks were found. Thus, the result of the study is an informal description of a new method for detecting low-rate denial-of-service attacks. The architecture of a test stand for evaluating the method is developed. At the current stage of the study, it is possible to improve the efficiency of an existing method by using a classifier with memory, as well as additional information.
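The abstract gives only an informal description of the new method, so the sketch below is not that method; it merely illustrates the kind of per-client "memory" a low-rate flood detector can maintain: an exponentially weighted moving average of request rates per source, flagged when it stays above a threshold. The window length, smoothing factor, and threshold are assumed values.

```python
from collections import defaultdict

ALPHA = 0.5        # smoothing factor for the exponential moving average (assumed)
THRESHOLD = 6.0    # smoothed requests-per-window considered suspicious (assumed)

ewma = defaultdict(float)   # per-source "memory" of recent request rates

def update(source_ip: str, requests_in_window: int) -> bool:
    """Update the per-source EWMA and return True if the source looks suspicious."""
    ewma[source_ip] = ALPHA * requests_in_window + (1 - ALPHA) * ewma[source_ip]
    return ewma[source_ip] > THRESHOLD

# Simulated one-second windows: a low-rate attacker keeps a steady moderate rate
# that accumulates in the EWMA, while a normal client's single burst decays quickly.
traffic = [("10.0.0.5", 8), ("10.0.0.9", 12), ("10.0.0.5", 8), ("10.0.0.9", 0),
           ("10.0.0.5", 8), ("10.0.0.9", 0), ("10.0.0.5", 8)]

for ip, count in traffic:
    flag = "suspicious" if update(ip, count) else "ok"
    print(ip, count, flag)
```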
Parameter identification for structural dynamics based on interval analysis algorithm
NASA Astrophysics Data System (ADS)
Yang, Chen; Lu, Zixing; Yang, Zhenyu; Liang, Ke
2018-04-01
A parameter identification method using an interval analysis algorithm for structural dynamics is presented in this paper. The proposed uncertain identification method is investigated using the central difference method and an ARMA system. With the help of the fixed-memory least squares method and the matrix inversion lemma, a set-membership identification technique is applied to obtain the best estimate of the identified parameters within a tight and accurate region. To overcome the lack of sufficient statistical description of the uncertain parameters, this paper treats uncertainties as non-probabilistic intervals. As long as the bounds of the uncertainties are known, this algorithm can obtain not only the center estimates of the parameters but also the bounds of the errors. To improve the efficiency of the proposed method, a time-saving algorithm based on a recursive formula is presented. Finally, to verify the accuracy of the proposed method, two numerical examples are evaluated using three identification criteria.
Statistical methods in personality assessment research.
Schinka, J A; LaLone, L; Broeckel, J A
1997-06-01
Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2015-08-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
NASA Astrophysics Data System (ADS)
Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2016-04-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
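As an illustration of the Sobol' global sensitivity analysis mentioned above (not the sequential screening method itself, and not the mHM setup), the sketch below uses the SALib package with its built-in Ishigami test function; the parameter names and bounds are the standard test-function ones.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol
from SALib.test_functions import Ishigami

# Standard three-parameter test problem.
problem = {
    "num_vars": 3,
    "names": ["x1", "x2", "x3"],
    "bounds": [[-np.pi, np.pi]] * 3,
}

# Saltelli sampling: N * (2D + 2) model evaluations for N base samples.
param_values = saltelli.sample(problem, 1024)
Y = Ishigami.evaluate(param_values)

# First-order (S1) and total-order (ST) Sobol' indices per parameter.
Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: S1={s1:.2f}, ST={st:.2f}")
```

In a screening-then-analysis workflow like the one described, parameters whose total-order index is negligible would be fixed before the expensive calibration step.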
Rotation Covariant Image Processing for Biomedical Applications
Reisert, Marco
2013-01-01
With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we focus on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial-like descriptions to fulfill this demand. This paper proposes a general mathematical framework based on mathematical concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied to a variety of different 3D data modalities stemming from the medical and biological sciences. PMID:23710255
Motamed-Jahromi, Mohadeseh; Dehghani, Seyedeh Leila
2014-01-01
Aim: A thesis is an important part of nursing graduate students’ education, and it is also their first systematic and scientific attempt to learn the ABCs of research. Articles derived from theses are important for the dissemination of science and the improvement of nursing as a field. Therefore, the goal of the present research is to analyze the different aspects of nursing MSc theses and the number of published articles derived from them. Methods: This was a descriptive study carried out on 145 nursing MSc theses defended in the Razi Faculty of Nursing and Midwifery in Kerman between 1990 and 2010. All of the extracted data were put into an Excel file (2007 version), followed by data analysis. Results: The results of this study were presented through descriptive statistics and figures. The research findings showed that most of the theses used a descriptive or analytical-descriptive method, and 42% of them had patients as their participants. They were usually on the subject of health care, and only 58 articles were extracted from all 145 theses. Conclusion: The process of writing nursing MSc theses and thesis research articles is improving gradually. However, there is a growing need for empirical and semi-empirical research to bridge the gap between theory and practice, which is also a major concern among nurses. PMID:25168988
In Search of a Pony: Sources, Methods, Outcomes, and Motivated Reasoning.
Stone, Marc B
2018-05-01
It is highly desirable to be able to evaluate the effect of policy interventions. Such evaluations should have expected outcomes based upon sound theory and be carefully planned, objectively evaluated and prospectively executed. In many cases, however, assessments originate with investigators' poorly substantiated beliefs about the effects of a policy. Instead of designing studies that test falsifiable hypotheses, these investigators adopt methods and data sources that serve as little more than descriptions of these beliefs in the guise of analysis. Interrupted time series analysis is one of the most popular forms of analysis used to present these beliefs. It is intuitively appealing but, in most cases, it is based upon false analogies, fallacious assumptions and analytical errors.
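Interrupted time series analysis, as referred to above, is usually implemented as a segmented regression with a level-change and a slope-change term at the intervention date. The sketch below shows that standard specification with statsmodels on simulated monthly data; it is a generic illustration, not a reanalysis of any policy discussed by the author.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_months, intervention = 60, 36            # intervention after month 36 (assumed)

time = np.arange(n_months)
post = (time >= intervention).astype(float)                 # level-change indicator
time_since = np.where(post == 1, time - intervention, 0)    # slope-change term

# Simulated outcome with a drop in level and a change in slope after the intervention.
y = 50 + 0.4 * time - 6 * post - 0.3 * time_since + rng.normal(0, 2, n_months)

X = sm.add_constant(np.column_stack([time, post, time_since]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # [baseline level, pre-trend, level change, slope change]
```

In practice serial autocorrelation should also be handled (for example with Newey-West standard errors), which is part of why such analyses are easy to over-interpret when assumptions are not examined.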
Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier
2018-06-01
Since its first description, Western blot has been widely used in molecular labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. Western blot quantification constitutes a critical step in order to obtain accurate and reproducible results. Due to the technical knowledge required for densitometry analysis, together with resource availability, standard office scanners are often used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approximation that could be used where resource availability is limited. Copyright © 2018 Elsevier B.V. All rights reserved.
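The article's specific background-subtraction procedure is not reproduced here; as a hedged illustration of the general idea (estimating and removing a smooth background from a scanned blot before densitometry), the sketch below uses scikit-image's rolling-ball filter, which is conceptually similar to ImageJ's "Subtract Background" tool. The file name, ball radius, and band coordinates are assumptions.

```python
import numpy as np
from skimage import io, restoration, util

# Load a scanned Western blot film (path is illustrative) as grayscale.
blot = io.imread("western_blot_scan.tif", as_gray=True)

# Scanned films usually show dark bands on a light background; invert so that
# bands are bright and the background is dark, as the rolling-ball filter expects.
inverted = util.invert(blot)

# Estimate a smooth background with a rolling-ball filter and subtract it.
background = restoration.rolling_ball(inverted, radius=50)
corrected = inverted - background

# Densitometry: integrate intensity over a rectangular band region (coordinates assumed).
band = corrected[120:160, 40:200]
print("integrated band intensity:", band.sum())
```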
Grande, Marion; Meffert, Elisabeth; Schoenberger, Eva; Jung, Stefanie; Frauenrath, Tobias; Huber, Walter; Hussmann, Katja; Moormann, Mareike; Heim, Stefan
2012-07-02
Spontaneous language has rarely been subjected to neuroimaging studies. This study therefore introduces a newly developed method for the analysis of linguistic phenomena observed in continuous language production during fMRI. Most neuroimaging studies investigating language have so far focussed on single word or - to a smaller extent - sentence processing, mostly due to methodological considerations. Natural language production, however, is far more than the mere combination of words to larger units. Therefore, the present study aimed at relating brain activation to linguistic phenomena like word-finding difficulties or syntactic completeness in a continuous language fMRI paradigm. A picture description task with special constraints was used to provoke hesitation phenomena and speech errors. The transcribed speech sample was segmented into events of one second and each event was assigned to one category of a complex schema especially developed for this purpose. The main results were: conceptual planning engages bilateral activation of the precuneus. Successful lexical retrieval is accompanied - particularly in comparison to unsolved word-finding difficulties - by the left middle and superior temporal gyrus. Syntactic completeness is reflected in activation of the left inferior frontal gyrus (IFG) (area 44). In sum, the method has proven to be useful for investigating the neural correlates of lexical and syntactic phenomena in an overt picture description task. This opens up new prospects for the analysis of spontaneous language production during fMRI. Copyright © 2012 Elsevier Inc. All rights reserved.
Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J
2009-04-01
An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency in results across multiple subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1 and 99 percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.
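A probabilistic analysis of the kind described can be sketched as a Monte Carlo loop that perturbs landmark coordinates with the stated 4 mm standard deviation, rebuilds the segment coordinate system, and records the resulting Euler angles. The landmark coordinates, axis construction, and rotation sequence below are simplified assumptions for illustration, not the authors' protocol.

```python
import numpy as np
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(7)
SD_MM = 4.0          # landmark location uncertainty (mm), as in the study
N_SAMPLES = 5000

# Nominal positions (mm) of three landmarks defining a segment frame (illustrative).
origin_lm = np.array([0.0, 0.0, 0.0])
lateral_lm = np.array([100.0, 0.0, 0.0])
superior_lm = np.array([0.0, 80.0, 10.0])

def frame_angles(p0, p1, p2):
    """Build an orthonormal frame from three landmarks and return ZYX Euler angles (deg)."""
    x = p1 - p0
    x /= np.linalg.norm(x)
    z = np.cross(x, p2 - p0)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return Rotation.from_matrix(np.column_stack([x, y, z])).as_euler("ZYX", degrees=True)

angles = np.array([
    frame_angles(*(lm + rng.normal(0.0, SD_MM, 3)
                   for lm in (origin_lm, lateral_lm, superior_lm)))
    for _ in range(N_SAMPLES)
])

# 1st-99th percentile envelope of each Euler angle, analogous to the reported bounds.
print(np.percentile(angles, [1, 99], axis=0))
```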
A simplified fuel control approach for low cost aircraft gas turbines
NASA Technical Reports Server (NTRS)
Gold, H.
1973-01-01
Reduction in the complexity of gas turbine fuel controls without loss of control accuracy, reliability, or effectiveness is discussed as a method for reducing engine costs. A description and analysis of the hydromechanical approach are presented. A computer simulation of the control mechanism is given, and the performance of a physical model in engine tests is reported.
Verification of aerial photo stand volume tables for southeast Alaska.
Theodore S. Setzer; Bert R. Mead
1988-01-01
Aerial photo volume tables are used in the multilevel sampling system of Alaska Forest Inventory and Analysis. These volume tables are presented with a description of the data base and methods used to construct the tables. Volume estimates compiled from the aerial photo stand volume tables and associated ground-measured values are compared and evaluated.
ERIC Educational Resources Information Center
Degeneffe, Charles Edmund; Green, Richard; Jones, Clair
2016-01-01
Purpose: The study aimed to understand how use and satisfaction with services following discharge from an acquired brain injury (ABI) acute-care facility related to family caregiver outcomes. Methods: A correlational and descriptive study design was used. Nineteen primary family caregivers of persons recently discharged from an ABI acute-care…
ERIC Educational Resources Information Center
Park, Hyeran; Nielsen, Wendy; Woodruff, Earl
2014-01-01
This study examined and compared students' understanding of nature of science (NOS) with 521 Grade 8 Canadian and Korean students using a mixed methods approach. The concepts of NOS were measured using a survey that had both quantitative and qualitative elements. Descriptive statistics and one-way multivariate analysis of variances examined the…
School Nurses' Descriptions of Concerns Arising during Pupils' Health Check-Ups: A Qualitative Study
ERIC Educational Resources Information Center
Poutiainen, Hannele; Holopainen, Arja; Hakulinen-Viitanen, Tuovi; Laatikainen, Tiina
2015-01-01
Objective: To describe the concerns and modes of action of Finnish school nurses during pupils' health check-ups. Methods: Focus group interviews with 17 school nurses were performed in 2011 and again in 2013. Data were analysed using inductive content analysis. Results: School nurses' concerns were mostly associated with the psychosocial…
Receipt of the Human Papillomavirus Vaccine among Female College Students in the United States, 2009
ERIC Educational Resources Information Center
Lindley, Lisa L.; Elkind, Julia S.; Landi, Suzanne N.; Brandt, Heather M.
2013-01-01
Objective: To determine receipt of the human papillomavirus (HPV) vaccine among female college students by demographic/descriptive characteristics and sexual behaviors. Methods: A secondary analysis of the Spring 2009 National College Health Assessment-II was conducted with 40,610 female college students (aged 18 to 24 years) attending 4-year…
Values of Local Wisdom: A Potential to Develop an Assessment and Remedial
ERIC Educational Resources Information Center
Toharudin, Uus; Kurniawan, Iwan Setia
2017-01-01
Developing assessment and remedial needs to be done because it is an important part of a learning process. This study aimed to describe the ability of student teachers of biology to develop assessment and remedial based on local wisdom, using a quasi-experimental research method with quantitative descriptive analysis techniques. The research…
ERIC Educational Resources Information Center
Harry, Melissa L.; MacDonald, Lynn; McLuckie, Althea; Battista, Christina; Mahoney, Ellen K.; Mahoney, Kevin J.
2017-01-01
Background: Our aim was to explore previously unknown long-term outcomes of self-directed personal care services for young adults with intellectual disabilities and limitations in activities of daily living. Materials and Methods: The present authors utilized participatory action research and qualitative content analysis in interviewing 11 unpaid…
ERIC Educational Resources Information Center
Wahyuddin, Wawan
2016-01-01
This study examines the relationship between teacher competence and the emotional intelligence held by teachers in order to increase teacher performance at Madrasah Tsanawiyah in the district of Serang, Banten. The research was conducted with a quantitative method, through descriptive and inferential analysis. The research samples were teachers…
ERIC Educational Resources Information Center
Kandeel, Refat A. A.
2016-01-01
The purpose of this study was to determine the multiple intelligences patterns of students at King Saud University and their relationship with academic achievement in mathematics courses. The study sample consisted of 917 students selected in a stratified random manner; the descriptive analysis method and Pearson correlation were used, the…
Technical Report on Manpower Planning. Technical Group Report No. 7.
ERIC Educational Resources Information Center
Montana Commission on Post-Secondary Education, Helena.
The report is one of a series containing data and recommendations relevant to the task of developing future plans for Montana's post-secondary education. A brief introduction outlines the methods used in gathering information on manpower needs, and is followed by a review and summarization of the data collected, including a description, analysis,…
ERIC Educational Resources Information Center
Boiani, James A.
1986-01-01
Describes an experiment which uses the Gran plot for analyzing free ions as well as those involved in an equilibrium. Discusses the benefits of using Gran plots in the study of acids, as well as other analytes in solutions. Presents background theory along with a description of the experimental procedures. (TW)
Gender Differentiation in the New York "Times": 1885 and 1985.
ERIC Educational Resources Information Center
Jolliffe, Lee
A study examined the descriptive language and sex-linked roles ascribed to women and men in articles of the New York "Times" from 1885 and 1985. Seven content analysis methods were applied to four random samples from the "Times"; one sample each for women and men from both years. Samples were drawn using randomly constructed…
ERIC Educational Resources Information Center
Smedema, Susan Miller; Pfaller, Joseph S.; Yaghmaian, Rana A.; Weaver, Hayley; da Silva Cardoso, Elizabeth; Chan, Fong
2015-01-01
Purpose: To examine the mediational effect of core self-evaluations (CSE) on the relationship between functional disability and life satisfaction. Methods: A quantitative descriptive design using multiple regression analysis. The participants were 97 college students with disabilities receiving services through Hunter College's Minority-Disability…
How Counselors Are Trained to Work with Bisexual Clients in CACREP-Accredited Programs
ERIC Educational Resources Information Center
Bonjo, Laurie Anne
2013-01-01
In spite of recent progress toward addressing the need for cultural competence with lesbian and gay-identified clients, bisexual-identified clients continue to be marginalized in the principles, theories, and methods of studying sexuality as well as in the training provided by counselor educators. A descriptive content analysis was conducted to…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-13
...) evaluation design, which will involve baseline surveys and two follow-up surveys. This will allow short- and... descriptive analysis of how States designed and implemented PREP programs. The study will use multiple methods... ``Design Survey'', will focus on how states designed programs, and the second round of interviews, known as...
ERIC Educational Resources Information Center
Akan, Durdagi
2015-01-01
The purpose of this study is to determine the relations between the organizational creativity perceptions and life satisfaction levels of teachers. The study was conducted using the descriptive survey method. The Satisfaction with Life Scale and the Organizational Creativity Scale were used to collect data from 233 primary and secondary school teachers…
ERIC Educational Resources Information Center
Tao, Fumiyo; And Others
This volume contains technical and supporting materials that supplement Volume I, which describes upward mobility programs for disadvantaged and dislocated workers in the service sector. Appendix A is a detailed description of the project methodology, including data collection methods and information on data compilation, processing, and analysis.…
Methods for collection and analysis of aquatic biological and microbiological samples
Britton, L.J.; Greeson, P.E.
1988-01-01
Chapter A4, methods for collection and analyses of aquatic biological and microbiological samples, contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity and bioassay. Each method is summarized, and the applications, interferences, apparatus, reagents, analyses, calculations, reporting of results, precisions, and references are given. Part 2 consists of a glossary. Part 3 is a list of taxonomic references. (USGS)
NASA Technical Reports Server (NTRS)
Lutzky, D.; Bjorkman, W. S.
1973-01-01
The Mission Analysis Evaluation and Space Trajectory Operations program known as MAESTRO is described. MAESTRO is an all-FORTRAN, block-style computer program designed to perform various mission control tasks. This manual is a guide to MAESTRO, giving individuals the capability to modify the program to suit their needs. Descriptions are presented of each of the subroutines; each description consists of an input/output description, theory, a subroutine description, and a flow chart where applicable. The programmer's manual also contains a detailed description of the common blocks, a subroutine cross reference map, and a general description of the program structure.
A Description for Rock Joint Roughness Based on Terrestrial Laser Scanner and Image Analysis
Ge, Yunfeng; Tang, Huiming; Eldin, M. A. M Ez; Chen, Pengyu; Wang, Liangqing; Wang, Jinge
2015-01-01
The shear behavior of a rock mass greatly depends upon rock joint roughness, which is generally characterized by anisotropy, a scale effect and an interval effect. A new index able to capture all three features, namely brightness area percentage (BAP), is presented to express roughness based on synthetic illumination of a digital terrain model derived from a terrestrial laser scanner (TLS). Since only the small planes facing opposite to the shear direction contribute to resistance during shear failure, these planes are recognized through an image processing technique that takes advantage of the fact that they appear brighter than other planes under the same light source. A comparison with existing roughness indexes and two case studies are presented to test the performance of the BAP description. The results reveal that the rock joint roughness estimated by the presented description matches existing roughness methods well and displays wider applicability. PMID:26585247
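As a rough illustration only (not the authors' TLS and image-processing pipeline), the sketch below computes a brightness-area-percentage style index on a synthetic gridded joint surface: the share of facets whose slope opposes an assumed shear direction. The grid, spacing, and shear direction are hypothetical.

import numpy as np

rng = np.random.default_rng(1)
z = rng.normal(0.0, 0.5e-3, (200, 200))   # synthetic joint-surface heights (m)
dx = dy = 1e-3                            # grid spacing (m)

# Facet slopes from finite differences (axis 0 treated as x, axis 1 as y).
dzdx, dzdy = np.gradient(z, dx, dy)

shear_dir = np.array([1.0, 0.0])          # assumed shear direction in the joint plane

# Facets whose upslope component opposes the shear motion are the ones that
# would appear "bright" under illumination from the opposing side.
slope_against_shear = dzdx * shear_dir[0] + dzdy * shear_dir[1]
bap_like = 100.0 * np.mean(slope_against_shear > 0.0)
print(f"BAP-style index: {bap_like:.1f}% of facets face against the shear direction")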
Analysis of Parasite and Other Skewed Counts
Alexander, Neal
2012-01-01
Objective To review methods for the statistical analysis of parasite and other skewed count data. Methods Statistical methods for skewed count data are described and compared, with reference to those used over a ten year period of Tropical Medicine and International Health. Two parasitological datasets are used for illustration. Results Ninety papers were identified, 89 with descriptive and 60 with inferential analysis. A lack of clarity is noted in identifying measures of location, in particular the Williams and geometric mean. The different measures are compared, emphasizing the legitimacy of the arithmetic mean for skewed data. In the published papers, the t test and related methods were often used on untransformed data, which is likely to be invalid. Several approaches to inferential analysis are described, emphasizing 1) non-parametric methods, while noting that they are not simply comparisons of medians, and 2) generalized linear modelling, in particular with the negative binomial distribution. Additional methods, such as the bootstrap, with potential for greater use are described. Conclusions Clarity is recommended when describing transformations and measures of location. It is suggested that non-parametric methods and generalized linear models are likely to be sufficient for most analyses. PMID:22943299
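A short Python sketch of one of the recommended approaches, a negative binomial GLM for over-dispersed counts, shown alongside arithmetic and geometric means, follows; the data are simulated and the group labels are hypothetical.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
group = rng.integers(0, 2, n)                        # 0 = control, 1 = treated
mu = np.exp(1.5 - 0.7 * group)                       # treated group has lower mean counts
counts = rng.negative_binomial(n=2, p=2 / (2 + mu))  # over-dispersed counts

df = pd.DataFrame({"count": counts, "group": group})
print("arithmetic means:\n", df.groupby("group")["count"].mean())
print("geometric means (of count + 1):\n",
      np.exp(np.log1p(df["count"]).groupby(df["group"]).mean()) - 1)

# Negative binomial GLM of count on group membership.
X = sm.add_constant(df["group"])
model = sm.GLM(df["count"], X, family=sm.families.NegativeBinomial()).fit()
print(model.summary())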
NASA Technical Reports Server (NTRS)
Lund, T. S.; Tavella, D. A.; Roberts, L.
1985-01-01
A viscous-inviscid interaction methodology based on a zonal description of the flowfield is developed as a means of predicting the performance of two-dimensional thrust augmenting ejectors. An inviscid zone comprising the irrotational flow about the device is patched together with a viscous zone containing the turbulent mixing flow. The inviscid region is computed by a higher order panel method, while an integral method is used for the description of the viscous part. A non-linear, constrained optimization study is undertaken for the design of the inlet region. In this study, the viscous-inviscid analysis is complemented with a boundary layer calculation to account for flow separation from the walls of the inlet region. The thrust-based Reynolds number as well as the free stream velocity are shown to be important parameters in the design of a thrust augmentor inlet.
Vedula, S Swaroop; Li, Tianjing; Dickersin, Kay
2013-01-01
Details about the type of analysis (e.g., intent to treat [ITT]) and definitions (i.e., criteria for including participants in the analysis) are necessary for interpreting a clinical trial's findings. Our objective was to compare the description of types of analyses and criteria for including participants in the publication (i.e., what was reported) with descriptions in the corresponding internal company documents (i.e., what was planned and what was done). Trials were for off-label uses of gabapentin sponsored by Pfizer and Parke-Davis, and documents were obtained through litigation. For each trial, we compared internal company documents (protocols, statistical analysis plans, and research reports, all unpublished), with publications. One author extracted data and another verified, with a third person verifying discordant items and a sample of the rest. Extracted data included the number of participants randomized and analyzed for efficacy, and types of analyses for efficacy and safety and their definitions (i.e., criteria for including participants in each type of analysis). We identified 21 trials, 11 of which were published randomized controlled trials, and that provided the documents needed for planned comparisons. For three trials, there was disagreement on the number of randomized participants between the research report and publication. Seven types of efficacy analyses were described in the protocols, statistical analysis plans, and publications, including ITT and six others. The protocol or publication described ITT using six different definitions, resulting in frequent disagreements between the two documents (i.e., different numbers of participants were included in the analyses). Descriptions of analyses conducted did not agree between internal company documents and what was publicly reported. Internal company documents provide extensive documentation of methods planned and used, and trial findings, and should be publicly accessible. Reporting standards for randomized controlled trials should recommend transparent descriptions and definitions of analyses performed and which study participants are excluded.
Steele Gray, Carolyn; Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl
2016-02-18
Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs) who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14) along with a PICK analysis-Possible, Implementable, (to be) Challenged, (to be) Killed-guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Interpretive descriptive methods allow for an understanding of user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kercel, S.W.
1999-11-07
For several reasons, Bayesian parameter estimation is superior to other methods for inductively learning a model for an anticipatory system. Since it exploits prior knowledge, the analysis begins from a more advantageous starting point than other methods. Also, since "nuisance parameters" can be removed from the Bayesian analysis, the description of the model need not be as complete as is necessary for such methods as matched filtering. In the limit of perfectly random noise and a perfect description of the model, the signal-to-noise ratio improves as the square root of the number of samples in the data. Even with the imperfections of real-world data, Bayesian methods approach this ideal limit of performance more closely than other methods. These capabilities provide a strategy for addressing a major unsolved problem in pump operation: the identification of precursors of cavitation. Cavitation causes immediate degradation of pump performance and ultimate destruction of the pump. However, the most efficient point to operate a pump is just below the threshold of cavitation. It might be hoped that a straightforward method to minimize pump cavitation damage would be to simply adjust the operating point until the inception of cavitation is detected and then to slightly readjust the operating point to let the cavitation vanish. However, due to the continuously evolving state of the fluid moving through the pump, the threshold of cavitation tends to wander. What is needed is to anticipate cavitation, and this requires the detection and identification of precursor features that occur just before cavitation starts.
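The square-root-of-N behaviour referred to above can be checked with a tiny conjugate-Gaussian example (illustrative only, not the report's cavitation-precursor analysis; the signal level, noise level, and prior are made-up values):

import numpy as np

rng = np.random.default_rng(0)
true_level, noise_sd, prior_sd = 1.0, 2.0, 10.0

for n in (10, 100, 1000, 10000):
    data = true_level + rng.normal(0.0, noise_sd, n)
    # Conjugate normal-normal update (known noise variance, zero-mean prior):
    # the posterior SD of the signal level shrinks roughly as 1/sqrt(n).
    post_var = 1.0 / (1.0 / prior_sd**2 + n / noise_sd**2)
    post_mean = post_var * data.sum() / noise_sd**2
    print(f"N={n:6d}  posterior mean={post_mean:5.2f}  posterior SD={np.sqrt(post_var):.3f}")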
Molecular counting of membrane receptor subunits with single-molecule localization microscopy
NASA Astrophysics Data System (ADS)
Krüger, Carmen; Fricke, Franziska; Karathanasis, Christos; Dietz, Marina S.; Malkusch, Sebastian; Hummer, Gerhard; Heilemann, Mike
2017-02-01
We report on quantitative single-molecule localization microscopy, a method that, in addition to super-resolved images of cellular structures, provides information on protein copy numbers in protein clusters. This approach is based on the analysis of blinking cycles of single fluorophores, and on a model-free description of the distribution of the number of blinking events. We describe the experimental and analytical procedures, present cellular data of plasma membrane proteins and discuss the applicability of this method.
Longitudinal Relationships Among Perceived Injunctive and Descriptive Norms and Marijuana Use
Napper, Lucy E.; Kenney, Shannon R.; Hummer, Justin F.; Fiorot, Sara; LaBrie, Joseph W.
2016-01-01
Objective: The current study uses longitudinal data to examine the relative influence of perceived descriptive and injunctive norms for proximal and distal referents on marijuana use. Method: Participants were 740 undergraduate students (67% female) who completed web-based surveys at two time points 12 months apart. Time 1 measures included reports of marijuana use, approval, perceived descriptive norms, and perceived injunctive norms for the typical student, close friends, and parents. At Time 2, students reported on their marijuana use. Results: Results of a path analysis suggest that, after we controlled for Time 1 marijuana use, greater perceived friend approval indirectly predicted Time 2 marijuana use as mediated by personal approval. Greater perceived parental approval was both indirectly and directly associated with greater marijuana use at follow-up. Perceived typical-student descriptive norms were neither directly nor indirectly related to Time 2 marijuana use. Conclusions: The findings support the role of proximal injunctive norms in predicting college student marijuana use up to 12 months later. The results indicate the potential importance of developing normative interventions that incorporate the social influences of proximal referents. PMID:27172578
NASA Astrophysics Data System (ADS)
Menthe, R. W.; McColgan, C. J.; Ladden, R. M.
1991-05-01
The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory, to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three dimensional compressible panel method which considers the effects of thin, cambered, multiple blades which may be highly swept. These airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The users manual for the UAAP code is divided into five sections: general code description; input description; output description; system description; and error codes. The user must have access to IMSL10 libraries (MATH and SFUN) for numerous calls made for Bessel functions and matrix inversion. For plotted output users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.
NASA Technical Reports Server (NTRS)
Menthe, R. W.; Mccolgan, C. J.; Ladden, R. M.
1991-01-01
The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory, to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three dimensional compressible panel method which considers the effects of thin, cambered, multiple blades which may be highly swept. These airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The users manual for the UAAP code is divided into five sections: general code description; input description; output description; system description; and error codes. The user must have access to IMSL10 libraries (MATH and SFUN) for numerous calls made for Bessel functions and matrix inversion. For plotted output users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.
Factors affecting quality of social interaction park in Jakarta
NASA Astrophysics Data System (ADS)
Mangunsong, N. I.
2018-01-01
Social interaction parks in Jakarta are an oasis in the middle of a concrete jungle. Parks are a response to the need for open space as a place of recreation and community interaction. Often the social interaction parks built by the government do not function as expected but take on other functions, such as places to sell goods or dump trash, or become unsafe, so they are rarely visited. The purpose of this study was to analyze the factors that affect the quality of social interaction parks in Jakarta by conducting descriptive analysis and correlation analysis of the assessment variables. The results of the analysis can give a picture of social interaction parks based on community needs and support proposals for developing social interaction city parks. The objects of study are 25 social interaction parks in the 5 municipalities of Jakarta. The methods used are descriptive analysis and correlation analysis with SPSS 19, using crosstabs and chi-square tests. The variables cover 5 aspects: design and plant composition/selection of plant types (D); beauty and harmony (Ind); maintenance and fertility (P); cleanliness and environmental health (BS); and specificity (drainage, multi-function garden, amenities, concern/mutual cooperation, location in dense settlements) (K). The results of the analysis show that beauty has the most significant correlation with the value of the park, followed by specificity, cleanliness and maintenance. Design was not the most significant variable affecting the quality of the park. The results of this study can be used by the Department of Parks and Cemeteries as input in managing existing or planned parks and to improve the quality of social interaction parks in Jakarta.
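The crosstab and chi-square step described above can be reproduced in a few lines (here with pandas and SciPy rather than SPSS 19; the variable names and toy data are hypothetical):

import pandas as pd
from scipy.stats import chi2_contingency

df = pd.DataFrame({
    "park_quality": ["good", "good", "poor", "good", "poor", "poor", "good", "poor"],
    "beauty":       ["high", "high", "low",  "high", "low",  "high", "low",  "low"],
})

# Cross-tabulation of park quality against the beauty rating.
table = pd.crosstab(df["park_quality"], df["beauty"])
chi2, p, dof, expected = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")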
Job Analysis and the Preparation of Job Descriptions. Mendip Papers MP 037.
ERIC Educational Resources Information Center
Saunders, Bob
This document provides guidelines for conducting job analyses and writing job descriptions. It covers the following topics: the rationale for job descriptions, the terminology of job descriptions, who should write job descriptions, getting the information to write job descriptions, preparing for staff interviews, conducting interviews, writing the…
Wilson, Lynda; Moran, Laura; Zarate, Rosa; Warren, Nicole; Ventura, Carla Aparecida Arena; Tamí-Maury, Irene; Mendes, Isabel Amélia Costa
2016-01-01
Abstract Objective: to analyze qualitative comments from four surveys asking nursing faculty to rate the importance of 30 global health competencies for undergraduate nursing programs. Method: qualitative descriptive study that included 591 individuals who responded to the survey in English (49 from Africa and 542 from the Americas), 163 who responded to the survey in Spanish (all from Latin America), and 222 Brazilian faculty who responded to the survey in Portuguese. Qualitative comments were recorded at the end of the surveys by 175 respondents to the English survey, 75 to the Spanish survey, and 70 to the Portuguese survey. Qualitative description and a committee approach guided data analysis. Results: ten new categories of global health competencies emerged from the analysis. Faculty also demonstrated concern about how and when these competencies could be integrated into nursing curricula. Conclusion: the additional categories should be considered for addition to the previously identified global health competencies. These, in addition to the guidance about integration into existing curricula, can be used to guide refinement of the original list of global health competencies. Further research is needed to seek consensus about these competencies and to develop recommendations and standards to guide nursing curriculum development. PMID:27276020
Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A
2014-05-19
Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identification of the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa distinguished healthy tissue from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in terms of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide a quantitative description of tissue biochemical composition.
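The authors' blind unmixing algorithm is not reproduced here; as a generic stand-in, the sketch below factorizes synthetic multi-exponential decays into non-negative end-member profiles and abundances with scikit-learn's NMF. The lifetimes, pixel count, and noise level are invented.

import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 64)                                       # hypothetical time axis (ns)
endmembers = np.exp(-t[None, :] / np.array([[0.5], [2.0], [5.0]]))   # three decay profiles

abund = rng.dirichlet(np.ones(3), size=500)                          # 500 pixels, abundances sum to 1
data = np.clip(abund @ endmembers + rng.normal(0.0, 0.01, (500, 64)), 0.0, None)

model = NMF(n_components=3, init="nndsvda", max_iter=500)
est_abund = model.fit_transform(data)   # estimated per-pixel abundances (up to scaling)
est_decays = model.components_          # estimated end-member decay profiles
print(est_abund.shape, est_decays.shape)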
Multibeam antenna study, phase 1
NASA Technical Reports Server (NTRS)
Bellamy, J. L.
1972-01-01
A multibeam antenna concept was developed for providing spot beam coverage of the contiguous 48 states. The selection of a suitable antenna concept for the multibeam application and an experimental evaluation of the antenna concept selected are described. The final analysis indicates that the preferred concept is a dual-antenna, circular artificial dielectric lens. A description of the analytical methods is provided, as well as a discussion of the absolute requirements placed on the antenna concepts. Finally, a comparative analysis of reflector antenna off-axis beam performance is presented.
African Primary Care Research: Quantitative analysis and presentation of results
Ogunbanjo, Gboyega A.
2014-01-01
Abstract This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report. PMID:26245435
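A minimal Python version of the descriptive-then-inferential workflow the article teaches (a continuous outcome compared across two categories) might look like the sketch below; the clinic labels, blood-pressure values, and group sizes are made-up.

import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "clinic": np.repeat(["A", "B"], 50),
    "systolic_bp": np.concatenate([rng.normal(132, 12, 50), rng.normal(126, 12, 50)]),
})

# Descriptive statistics by category.
print(df.groupby("clinic")["systolic_bp"].describe())

# Inferential step: two-sample t test with an approximate 95% confidence interval.
a = df.loc[df.clinic == "A", "systolic_bp"]
b = df.loc[df.clinic == "B", "systolic_bp"]
t, p = stats.ttest_ind(a, b)
diff = a.mean() - b.mean()
se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
print(f"mean difference = {diff:.1f} "
      f"(95% CI {diff - 1.96 * se:.1f} to {diff + 1.96 * se:.1f}), p = {p:.3f}")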
Computer applications in scientific balloon quality control
NASA Astrophysics Data System (ADS)
Seely, Loren G.; Smith, Michael S.
Seal defects and seal tensile strength are primary determinants of product quality in scientific balloon manufacturing; they therefore require a unit of quality measure. The availability of inexpensive and powerful data-processing tools can serve as the basis of a quality-trends-discerning analysis of products. The results of one such analysis are presently given in graphic form for use on the production floor. Software descriptions and their sample outputs are presented, together with a summary of overall and long-term effects of these methods on product quality.
NASA Technical Reports Server (NTRS)
Smith, C. C.; Warner, D. B.; Dajani, J. S.
1977-01-01
The technical, economic, and environmental problems restricting commercial helicopter passenger operations are reviewed. The key considerations for effective assessment procedures are outlined and a preliminary model for the environmental analysis of helicopters is developed. It is recommended that this model, or some similar approach, be used as a common base for the development of comprehensive environmental assessment methods for each of the federal agencies concerned with helicopters. A description of the critical environmental research issues applicable to helicopters is also presented.
NASA Astrophysics Data System (ADS)
Eichinger, M.; Tavan, P.; Hutter, J.; Parrinello, M.
1999-06-01
We present a hybrid method for molecular dynamics simulations of solutes in complex solvents as represented, for example, by substrates within enzymes. The method combines a quantum mechanical (QM) description of the solute with a molecular mechanics (MM) approach for the solvent. The QM fragment of a simulation system is treated by ab initio density functional theory (DFT) based on plane-wave expansions. Long-range Coulomb interactions within the MM fragment and between the QM and the MM fragment are treated by a computationally efficient fast multipole method. For the description of covalent bonds between the two fragments, we introduce the scaled position link atom method (SPLAM), which removes the shortcomings of related procedures. The various aspects of the hybrid method are scrutinized through test calculations on liquid water, the water dimer, ethane and a small molecule related to the retinal Schiff base. In particular, the extent to which vibrational spectra obtained by DFT for the solute can be spoiled by the lower quality force field of the solvent is checked, including cases in which the two fragments are covalently joined. The results demonstrate that our QM/MM hybrid method is especially well suited for the vibrational analysis of molecules in condensed phase.
Molina, Martin; Sanchez-Soriano, Javier; Corcho, Oscar
2015-07-03
Providing descriptions of isolated sensors and sensor networks in natural language, understandable by the general public, is useful to help users find relevant sensors and analyze sensor data. In this paper, we discuss the feasibility of using geographic knowledge from public databases available on the Web (such as OpenStreetMap, Geonames, or DBpedia) to automatically construct such descriptions. We present a general method that uses such information to generate sensor descriptions in natural language. The results of the evaluation of our method in a hydrologic national sensor network showed that this approach is feasible and capable of generating adequate sensor descriptions with a lower development effort compared to other approaches. In the paper we also analyze certain problems that we found in public databases (e.g., heterogeneity, non-standard use of labels, or rigid search methods) and their impact in the generation of sensor descriptions.
Molina, Martin; Sanchez-Soriano, Javier; Corcho, Oscar
2015-01-01
Providing descriptions of isolated sensors and sensor networks in natural language, understandable by the general public, is useful to help users find relevant sensors and analyze sensor data. In this paper, we discuss the feasibility of using geographic knowledge from public databases available on the Web (such as OpenStreetMap, Geonames, or DBpedia) to automatically construct such descriptions. We present a general method that uses such information to generate sensor descriptions in natural language. The results of the evaluation of our method in a hydrologic national sensor network showed that this approach is feasible and capable of generating adequate sensor descriptions with a lower development effort compared to other approaches. In the paper we also analyze certain problems that we found in public databases (e.g., heterogeneity, non-standard use of labels, or rigid search methods) and their impact in the generation of sensor descriptions. PMID:26151211
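The generation method itself is not reproduced here; a minimal template-based sketch of turning a structured sensor record (of the kind obtainable from public geographic databases) into a natural-language description could look like this, with all field names and values hypothetical:

from dataclasses import dataclass

@dataclass
class Sensor:
    sensor_id: str
    variable: str
    river: str
    town: str
    distance_km: float

def describe(s: Sensor) -> str:
    # Fill a fixed sentence template from the structured record.
    return (f"Sensor {s.sensor_id} measures {s.variable} on the {s.river} river, "
            f"about {s.distance_km:.1f} km from {s.town}.")

print(describe(Sensor("H-042", "water level", "Ebro", "Zaragoza", 3.2)))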
PWR PRELIMINARY DESIGN FOR PL-3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, G. E.
1962-02-28
The pressurized water reactor preliminary design, the preferred design developed under Phase I of the PL-3 contract, is presented. Plant design criteria, summary of plant selection, plant description, reactor and primary system description, thermal and hydraulic analysis, nuclear analysis, control and instrumentation description, shielding description, auxiliary systems, power plant equipment, waste disposal, buildings and tunnels, services, operation and maintenance, logistics, erection, cost information, and a training program outline are given. (auth)
Analysis of Multiallelic CNVs by Emulsion Haplotype Fusion PCR.
Tyson, Jess; Armour, John A L
2017-01-01
Emulsion-fusion PCR recovers long-range sequence information by combining products in cis from individual genomic DNA molecules. Emulsion droplets act as very numerous small reaction chambers in which different PCR products from a single genomic DNA molecule are condensed into short joint products, to unite sequences in cis from widely separated genomic sites. These products can therefore provide information about the arrangement of sequences and variants at a larger scale than established long-read sequencing methods. The method has been useful in defining the phase of variants in haplotypes, the typing of inversions, and determining the configuration of sequence variants in multiallelic CNVs. In this description we outline the rationale for the application of emulsion-fusion PCR methods to the analysis of multiallelic CNVs, and give practical details for our own implementation of the method in that context.
NASA Technical Reports Server (NTRS)
Stoll, Frederick
1993-01-01
The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.
Visibility Graph Based Time Series Analysis.
Stephen, Mutua; Gu, Changgui; Yang, Huijie
2015-01-01
Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
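A short sketch of the underlying mapping, assuming the standard natural-visibility criterion (two samples are linked when no intermediate sample blocks the straight line between them), is shown below; the input series is arbitrary.

import itertools
import networkx as nx

def visibility_graph(series):
    g = nx.Graph()
    g.add_nodes_from(range(len(series)))
    for a, b in itertools.combinations(range(len(series)), 2):
        ya, yb = series[a], series[b]
        # Connect a and b if every intermediate sample lies strictly below
        # the straight line joining (a, ya) and (b, yb).
        if all(series[c] < yb + (ya - yb) * (b - c) / (b - a) for c in range(a + 1, b)):
            g.add_edge(a, b)
    return g

g = visibility_graph([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])
print(sorted(g.edges()))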
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
Fusing modeling techniques to support domain analysis for reuse opportunities identification
NASA Technical Reports Server (NTRS)
Hall, Susan Main; Mcguire, Eileen
1993-01-01
Functional modeling techniques or object-oriented graphical representations, which are more useful to someone trying to understand the general design or high level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function oriented software development, while taking advantage of the descriptive power available in object oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.
Network Analysis: Applications for the Developing Brain
Chu-Shore, Catherine J.; Kramer, Mark A.; Bianchi, Matt T.; Caviness, Verne S.; Cash, Sydney S.
2011-01-01
Development of the human brain follows a complex trajectory of age-specific anatomical and physiological changes. The application of network analysis provides an illuminating perspective on the dynamic interregional and global properties of this intricate and complex system. Here, we provide a critical synopsis of methods of network analysis with a focus on developing brain networks. After discussing basic concepts and approaches to network analysis, we explore the primary events of anatomical cortical development from gestation through adolescence. Upon this framework, we describe early work revealing the evolution of age-specific functional brain networks in normal neurodevelopment. Finally, we review how these relationships can be altered in disease and perhaps even rectified with treatment. While this method of description and inquiry remains in early form, there is already substantial evidence that the application of network models and analysis to understanding normal and abnormal human neural development holds tremendous promise for future discovery. PMID:21303762
A Descriptive Analysis of Oral Health Systematic Reviews Published 1991–2012: Cross Sectional Study
Saltaji, Humam; Cummings, Greta G.; Armijo-Olivo, Susan; Major, Michael P.; Amin, Maryam; Major, Paul W.; Hartling, Lisa; Flores-Mir, Carlos
2013-01-01
Objectives To identify all systematic reviews (SRs) published in the domain of oral health research and describe them in terms of their epidemiological and descriptive characteristics. Design Cross sectional, descriptive study. Methods An electronic search of seven databases was performed from inception through May 2012; bibliographies of relevant publications were also reviewed. Studies were considered for inclusion if they were oral health SRs defined as therapeutic or non-therapeutic investigations that studied a topic or an intervention related to dental, oral or craniofacial diseases/disorders. Data were extracted from all the SRs based on a number of epidemiological and descriptive characteristics. Data were analysed descriptively for all the SRs, within each of the nine dental specialities, and for Cochrane and non-Cochrane SRs separately. Results 1,188 oral health (126 Cochrane and 1062 non-Cochrane) SRs published from 1991 through May 2012 were identified, encompassing the nine dental specialties. Over half (n = 676; 56.9%) of the SRs were published in specialty oral health journals, with almost all (n = 1,178; 99.2%) of the SRs published in English and almost none of the non-Cochrane SRs (n = 11; 0.9%) consisting of updates of previously published SRs. 75.3% of the SRs were categorized as therapeutic, with 64.5% examining non-drug interventions, while approximately half (n = 150/294; 51%) of the non-therapeutic SRs were classified as epidemiological SRs. The SRs included a median of 15 studies, with a meta-analysis conducted in 43.6%, in which a median of 9 studies/1 randomized trial were included in the largest meta-analysis conducted. Funding was received for 25.1% of the SRs, including nearly three-quarters (n = 96; 76.2%) of the Cochrane SRs. Conclusion Epidemiological and descriptive characteristics of the 1,188 oral health SRs varied across the nine dental specialties and by SR category (Cochrane vs. non-Cochrane). There is a clear need for more updates of SRs in all the dental specialties. PMID:24098657
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sosnovsky, Denis V.; Ivanov, Konstantin L., E-mail: ivanov@tomo.nsc.ru; Novosibirsk State University, Pirogova 2, 630090, Novosibirsk
Chemically Induced Dynamic Nuclear Polarization (CIDNP) is an efficient method of creating non-equilibrium polarization of nuclear spins by using chemical reactions, which have radical pairs as intermediates. The CIDNP effect originates from (i) electron spin-selective recombination of radical pairs and (ii) the dependence of the inter-system crossing rate in radical pairs on the state of magnetic nuclei. The CIDNP effect can be investigated by using Nuclear Magnetic Resonance (NMR) methods. The gain from CIDNP is then two-fold: it allows one to obtain considerable amplification of NMR signals; in addition, it provides a very useful tool for investigating elusive radicals and radical pairs. While the mechanisms of the CIDNP effect in liquids are well established and understood, detailed analysis of solid-state CIDNP mechanisms still remains challenging; likewise a common theoretical frame for the description of CIDNP in both solids and liquids is missing. Difficulties in understanding the spin dynamics that lead to the CIDNP effect in the solid-state case are caused by the anisotropy of spin interactions, which increase the complexity of spin evolution. In this work, we propose to analyze CIDNP in terms of level crossing phenomena, namely, to attribute features in the CIDNP magnetic field dependence to Level Crossings (LCs) and Level Anti-Crossings (LACs) in a radical pair. This approach allows one to describe liquid-state CIDNP; the same holds for the solid-state case where anisotropic interactions play a significant role in CIDNP formation. In solids, features arise predominantly from LACs, since in most cases anisotropic couplings result in perturbations, which turn LCs into LACs. We have interpreted the CIDNP mechanisms in terms of the LC/LAC concept. This consideration allows one to find analytical expressions for a wide magnetic field range, where several different mechanisms are operative; furthermore, the LAC description gives a way to determine CIDNP sign rules. Thus, LCs/LACs provide a consistent description of CIDNP in both liquids and solids with the prospect of exploiting it for the analysis of short-lived radicals and for optimizing the polarization level.
2012-01-01
Background Computer-based analysis of digitalized histological images has been gaining increasing attention, due to their extensive use in research and routine practice. The article aims to contribute towards the description and retrieval of histological images by employing a structural method using graphs. Due to their expressive ability, graphs are considered as a powerful and versatile representation formalism and have obtained a growing consideration especially by the image processing and computer vision community. Methods The article describes a novel method for determining similarity between histological images through graph-theoretic description and matching, for the purpose of content-based retrieval. A higher order (region-based) graph-based representation of breast biopsy images has been attained and a tree-search based inexact graph matching technique has been employed that facilitates the automatic retrieval of images structurally similar to a given image from large databases. Results The results obtained and evaluation performed demonstrate the effectiveness and superiority of graph-based image retrieval over a common histogram-based technique. The employed graph matching complexity has been reduced compared to the state-of-the-art optimal inexact matching methods by applying a pre-requisite criterion for matching of nodes and a sophisticated design of the estimation function, especially the prognosis function. Conclusion The proposed method is suitable for the retrieval of similar histological images, as suggested by the experimental and evaluation results obtained in the study. It is intended for the use in Content Based Image Retrieval (CBIR)-requiring applications in the areas of medical diagnostics and research, and can also be generalized for retrieval of different types of complex images. Virtual Slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/1224798882787923. PMID:23035717
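The paper's tree-search matcher, node-matching criterion, and prognosis function are not reproduced here; as a generic illustration of ranking by inexact graph matching, the sketch below scores small region graphs against a query by graph edit distance using networkx. Graph contents are hypothetical.

import networkx as nx

def region_graph(edges):
    # Build a simple undirected region-adjacency graph from an edge list.
    g = nx.Graph()
    g.add_edges_from(edges)
    return g

query = region_graph([(0, 1), (1, 2), (2, 0)])
database = {
    "image_a": region_graph([(0, 1), (1, 2), (2, 0), (2, 3)]),
    "image_b": region_graph([(0, 1), (1, 2), (2, 3), (3, 4)]),
}

# Rank database graphs by their edit distance to the query graph.
ranked = sorted(database.items(),
                key=lambda kv: nx.graph_edit_distance(query, kv[1]))
for name, g in ranked:
    print(name, nx.graph_edit_distance(query, g))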
Antropov, K M; Varaksin, A N
2013-01-01
This paper provides a description of Land Use Regression (LUR) modeling and the results of its application in a study of nitrogen dioxide air pollution in Ekaterinburg. The paper describes the difficulties of modeling air pollution caused by motor vehicle exhaust and the ways to address these challenges. To create the LUR model of NO2 air pollution in Ekaterinburg, concentrations of NO2 were measured, data on factors affecting air pollution were collected, and a statistical analysis of the data was performed. A statistical model of NO2 air pollution (coefficient of determination R2 = 0.70) and a map of pollution were created.
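A minimal sketch of the LUR fitting step (ordinary least squares of measured NO2 on land-use predictors, reporting R2) is given below; the predictor names and data are hypothetical, not the Ekaterinburg measurements.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "traffic_within_100m": rng.uniform(0, 5000, n),   # vehicles/day (hypothetical)
    "dist_to_major_road":  rng.uniform(10, 2000, n),  # metres (hypothetical)
})
df["no2"] = (12 + 0.004 * df.traffic_within_100m
             - 0.003 * df.dist_to_major_road + rng.normal(0, 3, n))

# Ordinary least squares regression of NO2 on the land-use predictors.
X = sm.add_constant(df[["traffic_within_100m", "dist_to_major_road"]])
fit = sm.OLS(df["no2"], X).fit()
print(fit.summary())   # the summary includes the coefficient of determination R2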
Contact Kinetics in Fractal Macromolecules.
Dolgushev, Maxim; Guérin, Thomas; Blumen, Alexander; Bénichou, Olivier; Voituriez, Raphaël
2015-11-13
We consider the kinetics of first contact between two monomers of the same macromolecule. Relying on a fractal description of the macromolecule, we develop an analytical method to compute the mean first contact time for various molecular sizes. In our theoretical description, the non-Markovian feature of monomer motion, arising from the interactions with the other monomers, is captured by accounting for the nonequilibrium conformations of the macromolecule at the very instant of first contact. This analysis reveals a simple scaling relation for the mean first contact time between two monomers, which involves only their equilibrium distance and the spectral dimension of the macromolecule, independently of its microscopic details. Our theoretical predictions are in excellent agreement with numerical stochastic simulations.
Laser-enhanced dynamics in molecular rate processes
NASA Technical Reports Server (NTRS)
George, T. F.; Zimmerman, I. H.; Devries, P. L.; Yuan, J.-M.; Lam, K.-S.; Bellum, J. C.; Lee, H.-W.; Slutsky, M. S.
1978-01-01
The present discussion deals with some theoretical aspects associated with the description of molecular rate processes in the presence of intense laser radiation, where the radiation actually interacts with the molecular dynamics. Whereas for weak and even moderately intense radiation, the absorption and stimulated emission of photons by a molecular system can be described by perturbative methods, for intense radiation, perturbation theory is usually not adequate. Limiting the analysis to the gas phase, an attempt is made to describe nonperturbative approaches applicable to the description of such processes (in the presence of intense laser radiation) as electronic energy transfer in molecular (in particular atom-atom) collisions; collision-induced ionization and emission; and unimolecular dissociation.
Recent research in data description of the measurement property resource on common data dictionary
NASA Astrophysics Data System (ADS)
Lu, Tielin; Fan, Zitian; Wang, Chunxi; Liu, Xiaojing; Wang, Shuo; Zhao, Hua
2018-03-01
A method for measurement equipment data description has been proposed based on property resource analysis. The application of the common data dictionary (CDD) to devices and equipment is mainly used in the digital factory to advance management, not only within a single enterprise but also across different enterprises sharing the same data environment. In this paper, we give a brief overview of the data flow in the whole manufacturing enterprise and of the automatic triggering of the data exchange process. Furthermore, the data dictionary is applicable to measurement and control equipment and can also be used in other industries in smart manufacturing.
Global Seabed Materials and Habitats Mapped: The Computational Methods
NASA Astrophysics Data System (ADS)
Jenkins, C. J.
2016-02-01
What the seabed is made of has proven difficult to map on the scale of whole ocean basins. Direct sampling and observation can be augmented with proxy-parameter methods such as acoustics. Both avenues are essential to obtain enough detail and coverage, and also to validate the mapping methods. We focus on the direct observations, such as samplings, photo and video, probes, diver and sub reports, and surveyed features. These are often in word-descriptive form: over 85% of the records for site materials are in this form, whether as sample/view descriptions or classifications, or described parameters such as consolidation, color, odor, structures and components. Descriptions are absolutely necessary for unusual materials and for processes - in other words, for research. This project, dbSEABED, not only has the largest collection of seafloor materials data worldwide, but it uses advanced computational mathematics to obtain the best possible coverages and detail. Included in those techniques are linguistic text analysis (e.g., Natural Language Processing, NLP), fuzzy set theory (FST), and machine learning (ML, e.g., Random Forest). These techniques allow efficient and accurate import of huge datasets, thereby optimizing the data that exists. They merge quantitative and qualitative types of data for rich parameter sets, and extrapolate where the data are sparse for best map production. The dbSEABED data resources are now very widely used worldwide in oceanographic research, environmental management, the geosciences, engineering and survey.
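As an illustration of the machine-learning component mentioned above, the sketch below trains a Random Forest on made-up seabed-material features; the feature names, classes, and labelling rule are hypothetical and are not the dbSEABED pipeline.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.uniform(0, 100, n),    # e.g. gravel percentage (hypothetical feature)
    rng.uniform(0, 100, n),    # e.g. sand percentage (hypothetical feature)
    rng.uniform(0, 12, n),     # e.g. water-depth class (hypothetical feature)
])
# Hypothetical labelling rule used only to create a toy training set.
y = np.where(X[:, 1] > 60, "sand", np.where(X[:, 0] > 40, "gravel", "mud"))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", clf.score(X_te, y_te))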
Recommended procedures and techniques for the petrographic description of bituminous coals
Chao, E.C.T.; Minkin, J.A.; Thompson, C.L.
1982-01-01
Modern coal petrology requires rapid and precise description of great numbers of coal core or bench samples in order to acquire the information required to understand and predict vertical and lateral variation of coal quality for correlation with coal-bed thickness, depositional environment, suitability for technological uses, etc. Procedures for coal description vary in accordance with the objectives of the description. To achieve our aim of acquiring the maximum amount of quantitative information within the shortest period of time, we have adopted a combined megascopic-microscopic procedure. Megascopic analysis is used to identify the distinctive lithologies present, and microscopic analysis is required only to describe representative examples of the mixed lithologies observed. This procedure greatly decreases the number of microscopic analyses needed for adequate description of a sample. For quantitative megascopic description of coal microlithotypes, microlithotype assemblages, and lithotypes, we use (V) for vitrite or vitrain, (E) for liptite, (I) for inertite or fusain, (M) for mineral layers or lenses other than iron sulfide, (S) for iron sulfide, and (X1), (X2), etc. for mixed lithologies. Microscopic description is expressed in terms of V representing the vitrinite maceral group, E the exinite group, I the inertinite group, and M mineral components. Volume percentages are expressed as subscripts. Thus (V)20(V80E10I5M5)80 indicates a lithotype or assemblage of microlithotypes consisting of 20 vol. % vitrite and 80% of a mixed lithology having a modal maceral composition V80E10I5M5. This bulk composition can alternatively be recalculated and described as V84E8I4M4. To generate these quantitative data rapidly and accurately, we utilize an automated image analysis system (AIAS). Plots of VEIM data on easily constructed ternary diagrams provide readily comprehended illustrations of the range of modal composition of the lithologic units making up a given coal bed. The use of bulk-specific-gravity determinations is also recommended for identification and characterization of the distinctive lithologic units. The availability of an AIAS also enhances the capability to acquire textural information. Ranges of size of maceral and mineral grains can be quickly and precisely determined by use of an AIAS. We assume that shape characteristics of coal particles can also be readily evaluated by automated image analysis, although this evaluation has not yet been attempted in our laboratory. Definitive data on the particulate mineral content of coal constitute another important segment of petrographic description. Characterization of mineral content may be accomplished by optical identification, electron microprobe analysis, X-ray diffraction, and scanning and transmission electron microscopy. Individual mineral grains in place in polished blocks or polished thin sections, or separated from the coal matrix by sink-float methods, are studied by analytical techniques appropriate to the conditions of sampling. Finally, whenever possible, identification of the probable genus or plant species from which a given coal component is derived will add valuable information and meaning to the petrographic description. © 1982.
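The bulk-composition recalculation quoted above can be verified with a few lines of arithmetic (in Python), treating vitrite as 100% vitrinite:

# Worked check: 20 vol.% vitrite plus 80 vol.% of a mixed lithology with modal
# composition V80 E10 I5 M5 recalculates to roughly V84 E8 I4 M4.
fractions = {
    "vitrite": (0.20, {"V": 100, "E": 0,  "I": 0, "M": 0}),
    "mixed":   (0.80, {"V": 80,  "E": 10, "I": 5, "M": 5}),
}

bulk = {k: 0.0 for k in "VEIM"}
for share, modal in fractions.values():
    for maceral, pct in modal.items():
        bulk[maceral] += share * pct

print(bulk)   # {'V': 84.0, 'E': 8.0, 'I': 4.0, 'M': 4.0}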
Vela, Luzita I.; Denegar, Craig
2010-01-01
Context: Disablement theory has been characterized as the sequence of events that occurs after an injury, but little research has been conducted to establish how disablement is experienced and described by physically active persons. Objective: To describe the disablement process in physically active persons with musculoskeletal injuries. Design: Concurrent, embedded mixed-methods study. For the qualitative portion, interviews were conducted to create descriptive disablement themes. For the quantitative portion, frequency analysis was used to identify common terminology. Setting: National Collegiate Athletic Association Division I collegiate and club sports, collegiate intramural program, large high school athletics program, and outpatient orthopaedic center. Patients or Other Participants: Thirty-one physically active volunteers (15 males, 16 females; mean age = 21.2 years; range, 14–53 years) with a current injury (18 lower extremity injuries, 13 upper extremity injuries) participated in individual interviews. Six physically active volunteers (3 males, 3 females; mean age = 22.2 years; range, 16–28 years) participated in the group interview to assess trustworthiness. Data Collection and Analysis: We analyzed interviews through a constant-comparison method, and data were collected until saturation occurred. Common limitations were transformed into descriptive themes and were confirmed during the group interview. Disablement descriptors were identified with frequencies and fit to the themes. Results: A total of 15 overall descriptive themes emerged within the 4 disablement components, and descriptive terms were identified for each theme. Impairments were marked by 4 complaints: pain, decreased motion, decreased muscle function, and instability. Functional limitations were denoted by problems with skill performance, daily actions, maintaining positions, fitness, and changing directions. Disability consisted of problems with participation in desired activities. Lastly, problems in quality of life encompassed uncertainty and fear, stress and pressure, mood and frustration, overall energy, and altered relationships. A preliminary generic outcomes instrument was generated from the findings. Conclusions: Our results will help clinicians understand how disablement is described by the physically active. The findings also have implications for how disablement outcomes are measured. PMID:21062186
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donnelly, H.; Fullwood, R.; Glancy, J.
This is the second volume of a two-volume report on the VISA method for evaluating safeguards at fixed-site facilities. This volume contains appendices that support the description of the VISA concept and the initial working version of the method, VISA-1, presented in Volume I. The information is separated into four appendices, each describing details of one of the four analysis modules that comprise the analysis sections of the method. The first appendix discusses Path Analysis methodology, applies it to a Model Fuel Facility, and describes the computer codes that are being used. Introductory material on Path Analysis is given in Chapter 3.2.1 and Chapter 4.2.1 of Volume I. The second appendix deals with Detection Analysis, specifically the schemes used in VISA-1 for classifying adversaries and the methods proposed for evaluating individual detection mechanisms in order to build the data base required for detection analysis. Examples of evaluations on identity-access systems, SNM portal monitors, and intrusion devices are provided. The third appendix describes the Containment Analysis overt-segment path ranking, the Monte Carlo engagement model, the network simulation code, the delay mechanism data base, and the results of a sensitivity analysis. The last appendix presents general equations used in Interruption Analysis for combining covert-overt segments and compares them with equations given in Volume I, Chapter 3.
Selecting foils for identification lineups: matching suspects or descriptions?
Tunnicliff, J L; Clark, S E
2000-04-01
Two experiments directly compare two methods of selecting foils for identification lineups. The suspect-matched method selects foils based on their match to the suspect, whereas the description-matched method selects foils based on their match to the witness's description of the perpetrator. Theoretical analyses and previous results predict an advantage for description-matched lineups both in terms of correctly identifying the perpetrator and minimizing false identification of innocent suspects. The advantage for description-matched lineups should be particularly pronounced if the foils selected in suspect-matched lineups are too similar to the suspect. In Experiment 1, the lineups were created by trained police officers, and in Experiment 2, the lineups were constructed by undergraduate college students. The results of both experiments showed higher suspect-to-foil similarity for suspect-matched lineups than for description-matched lineups. However, neither experiment showed a difference in correct or false identification rates. Both experiments did, however, show that there may be an advantage for suspect-matched lineups in terms of no-pick and rejection responses. From these results, the endorsement of one method over the other seems premature.
NASA Technical Reports Server (NTRS)
Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.
1974-01-01
The Shuttle Electric Power System (SEPS) computer program is considered in terms of the program manual, programmer guide, and program utilization. The main objective is to provide the information necessary to interpret and use the routines comprising the SEPS program. Subroutine descriptions including the name, purpose, method, variable definitions, and logic flow are presented.
Birth Outcomes across Three Rural-Urban Typologies in the Finger Lakes Region of New York
ERIC Educational Resources Information Center
Strutz, Kelly L.; Dozier, Ann M.; van Wijngaarden, Edwin; Glantz, J. Christopher
2012-01-01
Purpose: The study is a descriptive, population-based analysis of birth outcomes in the New York State Finger Lakes region designed to determine whether perinatal outcomes differed across 3 rural typologies. Methods: Hospital birth data for the Finger Lakes region from 2006 to 2007 were used to identify births classified as low birthweight (LBW),…
The Examining Reading Motivation of Primary Students in the Terms of Some Variables
ERIC Educational Resources Information Center
Biyik, Merve Atas; Erdogan, Tolga; Yildiz, Mustafa
2017-01-01
The purpose of this research is to examine the reading motivation of primary school students in grades 2, 3, and 4 in terms of gender, class, and socioeconomic status. The research is structured as a descriptive survey. A mixed-method approach was used in the collection, analysis, and interpretation of the data. The sample consists of…
Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,
1975-12-01
Table-of-contents and summary fragments: Adaptive Single Exponential Smoothing; Choosing the Smoothing Constant; ... the methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data ... Chapter IV, Detailed Summary, presents a detailed summary of the findings and lists the limitations inherent in the research methodology.
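For background, a minimal sketch of single exponential smoothing, the forecasting technique named in the fragments above, is given below; the smoothing constant and demand history are illustrative, and this is not the report's adaptive variant.

```python
# Minimal sketch of single exponential smoothing for demand forecasting:
# S_t = alpha * x_t + (1 - alpha) * S_{t-1}; the one-step-ahead forecast is
# the latest smoothed value. alpha is the smoothing constant discussed above.

def exponential_smoothing(demand, alpha=0.2):
    """Return one-step-ahead forecasts for a demand history."""
    forecast = [demand[0]]                      # initialize with the first observation
    for x in demand[1:]:
        forecast.append(alpha * x + (1 - alpha) * forecast[-1])
    return forecast

history = [12, 15, 11, 14, 18, 16]
print(exponential_smoothing(history, alpha=0.3))
```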
Defense Small Business Innovation Research Program (SBIR) FY 1985.
1985-01-31
Topic description fragments: EO/Infrared (IR) Countermeasures (category: Exploratory Development) - analysis needs to be performed to determine how to counter weapon ... MHz to 2 GHz; electrically conducting surfaces of interest are cable shields and braids, and optically transparent (to visible and infrared) screens ... in conjunction with a particular method of low temperature depositions from organometallics. Another topic title: Tunable Infrared Laser.
Analysis of 4th Grade Students' Problem Solving Skills in Terms of Several Variables
ERIC Educational Resources Information Center
Sungur, Gülcan; Bal, Pervin Nedim
2016-01-01
The aim of this study is to examine whether primary school students' problem-solving levels differ according to some demographic variables. The research is a descriptive study using the general survey method and was carried out with quantitative research techniques. The sample of the study consisted of 587 primary school students in Grade 4. The…
ERIC Educational Resources Information Center
Stephens, Michael; Jones, Kyle M. L.
2014-01-01
Beyond for-credit offerings, some library and information science (LIS) schools are exploring MOOCs as a means to promote lifelong learning and professional development. Using web surveys and descriptive content analysis methods, this paper empirically addresses if, in LIS programs, MOOCs can fill a role and serve new populations of learners…
ERIC Educational Resources Information Center
Atkinson, Kayla M.; Koenka, Alison C.; Sanchez, Carmen E.; Moshontz, Hannah; Cooper, Harris
2015-01-01
A complete description of the literature search, including the criteria used for the inclusion of reports after they have been located, used in a research synthesis or meta-analysis is critical if subsequent researchers are to accurately evaluate and reproduce a synthesis' methods and results. Based on previous guidelines and new suggestions, we…
Medical Conditions and Medication Use in Adults with Down Syndrome: A Descriptive Analysis
ERIC Educational Resources Information Center
Kerins, Gerard; Petrovic, Kimberly; Bruder, Mary Beth; Gruman, Cynthia
2008-01-01
Background: We examined the presence of medical conditions and medication use within a sample of adults with Down syndrome. Methods: Retrospective chart review using a sample of 141 adults with Down syndrome and age range of 30 to 65 years. Results: We identify 23 categories of commonly occurring medical conditions and 24 categories of medications…
ERIC Educational Resources Information Center
Haider, Zubair; Latif, Farah; Akhtar, Samina; Mushtaq, Maria
2012-01-01
Validity, reliability and item analysis are critical to the process of evaluating the quality of an educational measurement. The present study evaluates the quality of an assessment constructed to measure elementary school student's achievement in English. In this study, the survey model of descriptive research was used as a research method.…
ERIC Educational Resources Information Center
Smedema, Susan Miller; Kesselmayer, Rachel Friefeld; Peterson, Lauren
2018-01-01
Purpose: To test a mediation model of the relationship between core self-evaluations (CSE) and job satisfaction in employed individuals with disabilities. Method: A quantitative descriptive design using Hayes's (2012) PROCESS macro for SPSS and multiple regression analysis. Two-hundred fifty-nine employed persons with disabilities were recruited…
Test and Evaluation Management Guide
1988-03-01
Fragments from the guide: acknowledgment of the acquisition community at large, whose comments, suggestions, and materials were helpful in completing this project; ... analysis are particularly useful in the early stages of development to provide early projections before system hardware is available, and these methods are also ...; the Mission Need Statement is prescribed by DoD Instruction 5000.2 and, in summary, contains a description of the mission need and the projected ...
Effective Management Selection: The Analysis of Behavior by Simulation Techniques.
ERIC Educational Resources Information Center
Jaffee, Cabot L.
This book presents a system by which feedback might be generated and used as a basis for organizational change. The major areas covered consist of the development of a rationale for the use of simulation in the selection of supervisors, a description of actual techniques, and a method for training individuals in the use of the material. The…
ERIC Educational Resources Information Center
Alter, Peter J.; Conroy, Maureen A.; Mancil, G. Rich; Haydon, Todd
2008-01-01
The use of functional behavior assessment (FBA) to guide the development of behavior intervention plans continues to increase since they were first mandated in IDEA (Individuals with Disabilities Education Act Amendments of 1997, 20 U.S.C. Section 1400 et seq, 1997). A variety of indirect and direct instruments have been developed to facilitate…
Iravani, Mina; Janghorbani, Mohsen; Zarean, Ellahe; Bahrami, Masod
2016-01-01
Background: Evidence-based practice is an effective strategy for improving the quality of obstetric care. Identification of barriers to the adoption of evidence-based intrapartum care is necessary and crucial to deliver high-quality care to parturient women. Objectives: The current study aimed to explore barriers to the adoption of evidence-based intrapartum care from the perspective of the clinical groups that provide obstetric care in Iran. Materials and Methods: This descriptive exploratory qualitative research was conducted from 2013 to 2014 in fourteen state medical training centers in Iran. Participants were selected from midwives, specialists, and residents of obstetrics and gynecology through purposive sampling and the snowball method. Data were collected through face-to-face semi-structured in-depth interviews and analyzed according to conventional content analysis. Results: Data analysis identified twenty subcategories and four main categories. The main categories of barriers related to laboring women, persons providing care, the organizational environment, and the health system. Conclusions: The adoption of evidence-based intrapartum care is a complex process. In this regard, identifying potential barriers is the first step toward determining and applying effective strategies to encourage compliance with evidence-based obstetric care and improve maternity care quality. PMID:27175303
I'll take that to go: Big data bags and minimal identifiers for exchange of large, complex datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chard, Kyle; D'Arcy, Mike; Heavner, Benjamin D.
Big data workflows often require the assembly and exchange of complex, multi-element datasets. For example, in biomedical applications, the input to an analytic pipeline can be a dataset consisting of thousands of images and genome sequences assembled from diverse repositories, requiring a description of the contents of the dataset in a concise and unambiguous form. Typical approaches to creating datasets for big data workflows assume that all data reside in a single location, requiring costly data marshaling and permitting errors of omission and commission because dataset members are not explicitly specified. We address these issues by proposing simple methods and tools for assembling, sharing, and analyzing large and complex datasets that scientists can easily integrate into their daily workflows. These tools combine a simple and robust method for describing data collections (BDBags), data descriptions (Research Objects), and simple persistent identifiers (Minids) to create a powerful ecosystem of tools and services for big data analysis and sharing. We present these tools and use biomedical case studies to illustrate their use for the rapid assembly, sharing, and analysis of large datasets.
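The core idea of explicitly enumerating dataset members, with checksums, so that omissions and alterations are detectable, can be sketched with the standard library alone; this illustrates the concept only and does not use the actual BDBag/Minid tooling, and the directory name is hypothetical.

```python
# Standard-library sketch of a bag-style manifest: list every member file with
# a checksum so a consumer can verify the dataset is complete and unaltered.
import hashlib, json, pathlib

def build_manifest(data_dir):
    manifest = {}
    for path in sorted(pathlib.Path(data_dir).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(data_dir))] = digest
    return manifest

def verify(data_dir, manifest):
    # Any missing, extra, or modified member breaks equality.
    return build_manifest(data_dir) == manifest

m = build_manifest("my_dataset")                     # hypothetical directory
pathlib.Path("manifest.json").write_text(json.dumps(m, indent=2))
print(verify("my_dataset", m))                       # True while nothing changes
```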
Cancer Survivorship in the Age of YouTube and Social Media: A Narrative Analysis
Hunt, Yvonne; Folkers, Anna
2011-01-01
Background As evidenced by the increasing popularity of YouTube (www.youtube.com), personal narratives shared through social media are an area of rapid development in communication among cancer survivors. Identifying the thematic and linguistic characteristics of YouTube cancer stories can provide a better understanding of this naturally occurring communication channel and inform social media communication efforts aiming to use personal stories to reach individuals with serious illnesses. Objective The objective of our study was to provide an in-depth description of authentic personal cancer stories. Through a linguistically based narrative analysis of YouTube stories, the analysis explicates the common attributes of these narratives. Methods Informed by narrative theories, we conducted an iterative, bottom-up analysis of 35 YouTube videos identified by the search terms “cancer survivor” and “cancer stories”. A list of shared thematic and linguistic characteristics was identified and analyzed. Results A subnarrative on the cancer diagnosis was present in 86% (30/35) of the stories under analysis. These diagnostic narratives were characterized by dramatic tension, emotional engagement, markers of the loss of agency or control, depersonalized reference to the medical personnel, and the unexpectedness of a cancer diagnosis. The analysis highlights the themes of story authenticity and emotional engagement in this online communication medium. Conclusions Internet advances have enabled new and efficient exchange of personal stories, including the sharing of personal cancer experience among cancer survivors and their caregivers. The analytic results of this descriptive study point to the common characteristics of authentic cancer survivorship stories online. Furthermore, the results of this descriptive study may inform development of narrative-based communication, particularly in maintaining authenticity and emotional engagement. PMID:21247864
Computer-Aided Design of Low-Noise Microwave Circuits
NASA Astrophysics Data System (ADS)
Wedge, Scott William
1991-02-01
Devoid of most natural and manmade noise, microwave frequencies have detection sensitivities limited by internally generated receiver noise. Low-noise amplifiers are therefore critical components in radio astronomical antennas, communications links, radar systems, and even home satellite dishes. A general technique to accurately predict the noise performance of microwave circuits has been lacking. Current noise analysis methods have been limited to specific circuit topologies or neglect correlation, a strong effect in microwave devices. Presented here are generalized methods, developed for computer-aided design implementation, for the analysis of linear noisy microwave circuits comprised of arbitrarily interconnected components. Included are descriptions of efficient algorithms for the simultaneous analysis of noisy and deterministic circuit parameters based on a wave variable approach. The methods are therefore particularly suited to microwave and millimeter-wave circuits. Noise contributions from lossy passive components and active components with electronic noise are considered. Also presented is a new technique for the measurement of device noise characteristics that offers several advantages over current measurement methods.
Capodaglio, E M; Facioli, M; Bazzini, G
2001-01-01
Pathologies due to repetitive activity of the upper limbs constitute a growing part of work-related musculoskeletal disorders. At the moment, there are no universally accepted and validated methods for describing and assessing the work-related risks. Yet the criteria that fundamentally characterize the exposure are rather clear and consistent. This study reports a practical example of the application of some recent risk assessment methods proposed in the literature, combining objective and subjective measures obtained in the field with traditional activity analysis.
In Vitro Electrochemistry of Biological Systems
Adams, Kelly L.; Puchades, Maja; Ewing, Andrew G.
2009-01-01
This article reviews recent work involving electrochemical methods for in vitro analysis of biomolecules, with an emphasis on detection and manipulation at and of single cells and cultures of cells. The techniques discussed include constant potential amperometry, chronoamperometry, cellular electroporation, scanning electrochemical microscopy, and microfluidic platforms integrated with electrochemical detection. The principles of these methods are briefly described, followed in most cases with a short description of an analytical or biological application and its significance. The use of electrochemical methods to examine specific mechanistic issues in exocytosis is highlighted, as a great deal of recent work has been devoted to this application. PMID:20151038
Antenna pattern interpolation by generalized Whittaker reconstruction
NASA Astrophysics Data System (ADS)
Tjonneland, K.; Lindley, A.; Balling, P.
Whittaker reconstruction is an effective tool for interpolation of band limited data. Whittaker originally introduced the interpolation formula termed the cardinal function as the function that represents a set of equispaced samples but has no periodic components of period less than twice the sample spacing. It appears that its use for reflector antennas was pioneered in France. The method is now a useful tool in the analysis and design of multiple beam reflector antenna systems. A good description of the method has been given by Bucci et al. This paper discusses some problems encountered with the method and their solution.
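The following is a minimal sketch of the cardinal-function (Whittaker) interpolation of equispaced samples on which the reconstruction is based; the test signal, sample spacing, and truncation to a finite sum are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of Whittaker (cardinal-function / sinc) interpolation of
# equispaced samples. Real antenna-pattern tools add windowing and 2-D handling.
import numpy as np

def whittaker_interpolate(samples, dx, x):
    """Reconstruct f(x) from samples f(n*dx), n = 0..N-1, via the cardinal series."""
    n = np.arange(len(samples))
    # np.sinc is the normalized sinc: sinc(t) = sin(pi*t) / (pi*t)
    return np.sum(samples * np.sinc((x - n * dx) / dx))

xs = np.arange(0, 11, 1.0)                  # equispaced sample points, dx = 1
samples = np.cos(0.4 * np.pi * xs)          # band-limited test signal
print(whittaker_interpolate(samples, 1.0, 3.3))
print(np.cos(0.4 * np.pi * 3.3))            # close to the interpolated value
```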
NASA Astrophysics Data System (ADS)
Obuchowski, Nancy A.; Bullen, Jennifer A.
2018-04-01
Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
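As a minimal illustration of the basic ROC summary discussed above, the sketch below computes the empirical area under the curve (AUC) as the Mann-Whitney probability that a diseased case scores higher than a non-diseased one; the score values are invented, not taken from the article.

```python
# Empirical AUC: probability a diseased case outscores a non-diseased case,
# with ties counted as one half. Data below are made up for illustration.
def empirical_auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

diseased     = [0.9, 0.8, 0.75, 0.6]
non_diseased = [0.7, 0.5, 0.4, 0.3, 0.2]
print(empirical_auc(diseased, non_diseased))   # 0.95
```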
Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Jong, Jen-Yi
1986-01-01
An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
Processing and analysis of cardiac optical mapping data obtained with potentiometric dyes
Laughner, Jacob I.; Ng, Fu Siong; Sulkin, Matthew S.; Arthur, R. Martin
2012-01-01
Optical mapping has become an increasingly important tool to study cardiac electrophysiology in the past 20 years. Multiple methods are used to process and analyze cardiac optical mapping data, and no consensus currently exists regarding the optimum methods. The specific methods chosen to process optical mapping data are important because inappropriate data processing can affect the content of the data and thus alter the conclusions of the studies. Details of the different steps in processing optical imaging data, including image segmentation, spatial filtering, temporal filtering, and baseline drift removal, are provided in this review. We also provide descriptions of the common analyses performed on data obtained from cardiac optical imaging, including activation mapping, action potential duration mapping, repolarization mapping, conduction velocity measurements, and optical action potential upstroke analysis. Optical mapping is often used to study complex arrhythmias, and we also discuss dominant frequency analysis and phase mapping techniques used for the analysis of cardiac fibrillation. PMID:22821993
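Two of the processing steps named above, baseline drift removal and activation-time detection, can be sketched as follows; this is an illustrative toy example, not the authors' pipeline, and the synthetic trace and parameter choices are assumptions.

```python
# Minimal sketch: remove slow baseline drift by polynomial detrending, then
# find the activation time as the point of maximum upstroke (dV/dt) of an
# optical action potential.
import numpy as np

def remove_drift(signal, t, order=3):
    coeffs = np.polyfit(t, signal, order)        # slow drift modeled as a polynomial
    return signal - np.polyval(coeffs, t)

def activation_time(signal, t):
    return t[np.argmax(np.diff(signal) / np.diff(t))]

t = np.linspace(0, 1, 1000)                      # 1 s trace, arbitrary units
ap = np.exp(-((t - 0.4) / 0.05) ** 2)            # toy optical action potential
trace = ap + 0.2 * t                             # add linear baseline drift
clean = remove_drift(trace, t)
print(activation_time(clean, t))                 # near the upstroke, ~0.36 s
```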
Suzuki, Shigeru
2014-01-01
The techniques and measurement methods developed in the Environmental Survey and Monitoring of Chemicals by Japan's Ministry of the Environment, as well as the large amount of knowledge archived in the survey, have led to the advancement of environmental analysis. Recently, technologies such as non-target liquid chromatography/high-resolution mass spectrometry and liquid chromatography with microbore columns have further developed the field. Here, the general strategy of a method developed for the liquid chromatography/mass spectrometry (LC/MS) analysis of environmental chemicals is presented with a brief description. Also presented is a non-target analysis for the identification of environmental pollutants using a provisional fragment database and "MsMsFilter," an elemental composition elucidation tool. This analytical method is shown to be highly effective in the identification of a model chemical, the pesticide Bendiocarb. Our improved micro-liquid chromatography injection system showed substantially enhanced sensitivity to perfluoroalkyl substances, with peak areas 32–71 times larger than those observed in conventional LC/MS. PMID:26819891
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-17
...; Information Collection; Freight Classification Description AGENCIES: Department of Defense (DOD), General... collection requirement concerning freight classification description. Public comments are particularly... Information Collection 9000- 0055, Freight Classification Description, by any of the following methods...
Sayadi, Saeed; Safdarian, Ali; Khayeri, Behnaz
2015-01-01
Introduction: Training manpower is an inevitable necessity for organizations to survive and develop in today's changing world. Aims: The aim of the present study is to identify the relationship between the components of on-site training and emotional intelligence in librarians of Isfahan University of Medical Sciences and Isfahan University, with the moderating role of personality characteristics. Settings and Design: A descriptive correlational method was used in the present study. The statistical population consisted of all 157 librarians of Isfahan University of Medical Sciences and Isfahan University, from whom participants were selected through random sampling. Subjects and Methods: The research tools included a researcher-made questionnaire investigating the effectiveness of the on-site training system and two standard questionnaires, the Shrink emotional intelligence questionnaire and the NEO personality questionnaire, all of which had the needed reliability and validity. Statistical Analysis: Descriptive indices (distribution and mean) and inferential methods (Pearson correlation, regression analysis, and analysis of variance) were used, applying version 20 of the SPSS software, to analyze the obtained data. Results: There was a significant relationship, at the 95% confidence level, between the components of on-site training and emotional intelligence in those who obtained low scores on extraversion, and between the individual aspects of on-site training and emotional intelligence in those who obtained higher scores on extraversion. Conclusion: Emotional intelligence is a promotable skill, and considering the existence of a significant relationship between some components of emotional intelligence and on-site training, these skills can be institutionalized through conducting the mentioned educational courses. PMID:27462631
Boiling process modelling peculiarities analysis of the vacuum boiler
NASA Astrophysics Data System (ADS)
Slobodina, E. N.; Mikhailov, A. G.
2017-06-01
An analysis of the development of low- and medium-power boiler equipment was carried out, and possible development directions for boiler units aimed at improving energy efficiency were identified. Engineering studies on the application of vacuum boilers are presented. Heat-exchange processes in vacuum boilers, where boiling water is the working body, are considered. A method of heat-exchange intensification under boiling at the maximum heat-transfer coefficient is examined. As a result of the calculation studies, curves of the heat-transfer coefficient as a function of pressure, calculated by analytical and numerical methodologies, were obtained. It was concluded that the numerical computing method via RPI ANSYS CFX can be applied to describe the boiling process in the boiler vacuum volume.
NASA Technical Reports Server (NTRS)
1972-01-01
The Performance Analysis and Design Synthesis (PADS) computer program has a two-fold purpose. It can size launch vehicles in conjunction with calculus-of-variations optimal trajectories and can also be used as a general-purpose branched trajectory optimization program. In the former use, it has the Space Shuttle Synthesis Program as well as a simplified stage weight module for optimally sizing manned recoverable launch vehicles. For trajectory optimization alone or with sizing, PADS has two trajectory modules. The first trajectory module uses the method of steepest descent; the second employs the method of quasilinearization, which requires a starting solution from the first trajectory module. For Volume 1 see N73-13199.
Primal-dual methods of shape sensitivity analysis for curvilinear cracks with nonpenetration
NASA Astrophysics Data System (ADS)
Kovtunenko, V. A.
2006-10-01
Based on a level-set description of a crack moving with a given velocity, the problem of shape perturbation of the crack is considered. Nonpenetration conditions are imposed between opposite crack surfaces, which result in a constrained minimization problem describing equilibrium of a solid with the crack. We suggest a minimax formulation of the state problem, thus allowing curvilinear (nonplanar) cracks to be considered. Utilizing primal-dual methods of shape sensitivity analysis, we obtain the general formula for a shape derivative of the potential energy, which describes an energy-release rate for curvilinear cracks. The conditions sufficient to rewrite it in the form of a path-independent integral (J-integral) are derived.
The role of qualitative research in psychological journals.
Kidd, Sean A
2002-03-01
The acceptance of qualitative research in 15 journals published and distributed by the American Psychological Association (APA) was investigated. This investigation included a PsycINFO search using the keyword qualitative, an analysis of 15 APA journals for frequency of qualitative publication, a content analysis of the journal descriptions, and the results of qualitative interviews with 10 of the chief editors of those journals. The results indicate that there exists a substantial amount of interest in the potential contribution of qualitative methods in major psychological journals, although this interest is not ubiquitous, well defined, or communicated. These findings highlight the need for APA to state its position regarding the applicability of qualitative methods in the study of psychology.
Public Experiments and their Analysis with the Replication Method
NASA Astrophysics Data System (ADS)
Heering, Peter
2007-06-01
One of those who failed to establish himself as a natural philosopher in 18th-century Paris was the future revolutionary Jean Paul Marat. He not only published several monographs on heat, optics and electricity, in which he attempted to characterise his work as purely empirical, but also tried to establish himself as a public lecturer. From the analysis of his experiments using the replication method, it became obvious that the written descriptions omit several relevant aspects of the experiments. In this paper, I discuss the experience gained in analysing these experiments and suggest possible relations between these publications and the public demonstrations.
Extended GTST-MLD for aerospace system safety analysis.
Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo
2012-06-01
The hazards caused by complex interactions in aerospace systems have become a problem that urgently needs to be settled. This article introduces a method for identifying hazard interactions in aerospace systems during the design stage, based on extended GTST-MLD (goal tree-success tree-master logic diagram). GTST-MLD is a functional modeling framework with a simple architecture. Ontology is used to extend GTST-MLD's ability to describe system interactions by adding system design knowledge and past accident experience. At the levels of functionality and equipment, respectively, this approach can help the technician detect potential hazard interactions. Finally, a case study is used to demonstrate the method. © 2011 Society for Risk Analysis.
Wind adaptive modeling of transmission lines using minimum description length
NASA Astrophysics Data System (ADS)
Jaw, Yoonseok; Sohn, Gunho
2017-03-01
Transmission lines are moving objects whose positions are dynamically affected by wind-induced conductor motion while they are acquired by airborne laser scanners. This wind effect results in a noisy distribution of laser points, which often hinders accurate representation of transmission lines and thus leads to various types of modeling errors. This paper presents a new method for complete 3D transmission line model reconstruction in a framework of inner-span and across-span analysis. A highlighted feature is that the proposed method can indirectly estimate, through a linear regression analysis, the noise scales that corrupt the quality of laser observations affected by different wind speeds. In the inner-span analysis, individual transmission line models of each span are evaluated based on Minimum Description Length theory, and erroneous transmission line segments are subsequently replaced by precise transmission line models with a wind-adaptive noise scale estimated. In the subsequent across-span analysis, detecting the precise start and end positions of the transmission line models, known as Points of Attachment, is the key issue for correcting partial modeling errors as well as refining the transmission line models. Finally, geometric and topological completion of the transmission line models is achieved over the entire network. A performance evaluation was conducted over 138.5 km of corridor data. In a modest wind condition, the results demonstrate that the proposed method can improve non-wind-adaptive initial models from an average success rate of 48% to between 85% and 99.5% in producing complete transmission line models, with root-mean-square positional accuracies of 9.55 cm for the transmission line models and 28 cm for the Points of Attachment.
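A minimal sketch of Minimum Description Length model selection in the spirit of the inner-span analysis is given below; this is not the authors' implementation, a BIC-style two-part code length is used as the MDL approximation, and the span data are simulated.

```python
# Choose the polynomial order for a span of line points by minimizing an
# approximate two-part description length DL(k) ~ (n/2)*log(RSS_k/n) + (k/2)*log(n).
import numpy as np

def mdl_order(x, y, max_order=6):
    n = len(x)
    best = None
    for k in range(1, max_order + 1):
        coeffs = np.polyfit(x, y, k)
        rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
        dl = 0.5 * n * np.log(rss / n) + 0.5 * (k + 1) * np.log(n)
        if best is None or dl < best[0]:
            best = (dl, k)
    return best[1]

x = np.linspace(-1, 1, 200)                            # normalized span coordinate
y = 5 * x**2 + np.random.normal(0, 0.05, x.size)       # sagging line plus sensor noise
print(mdl_order(x, y))                                 # typically selects order 2
```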
2014-01-01
Background This study was conducted in the Pacific island nation of Vanuatu. Our objective was to assess knowledge, attitudes and practice of traditional healers who treat lung diseases and tuberculosis (TB), including their willingness to collaborate with the national TB programme. Methods This was a descriptive study using both qualitative and quantitative methods. Quantitative analysis was based on the responses provided to closed-ended questions, and we used descriptive analysis (frequencies) to describe the knowledge, attitudes and practice of the traditional healers towards TB. Qualitative analysis was based on open-ended questions permitting fuller explanations. We used thematic analysis and developed a posteriori inductive categories to draw original and unbiased conclusions. Results Nineteen traditional healers were interviewed; 18 were male. Fifteen of the healers reported treating short wind (a local term to describe lung, chest or breathing illnesses) which they attributed to food, alcohol, smoking or pollution from contact with menstrual blood, and a range of other physical and spiritual causes. Ten said that they would treat TB with leaf medicine. Four traditional healers said that they would not treat TB. Twelve of the healers had referred someone to a hospital for a strong wet-cough and just over half of the healers (9) reported a previous collaboration with the Government health care system. Eighteen of the traditional healers would be willing to collaborate with the national TB programme, with or without compensation. Conclusions Traditional healers in Vanuatu treat lung diseases including TB. Many have previously collaborated with the Government funded health care system, and almost all of them indicated a willingness to collaborate with the national TB programme. The engagement of traditional healers in TB management should be considered, using an evidence based and culturally sensitive approach. PMID:24758174
Independent Component Analysis of Textures
NASA Technical Reports Server (NTRS)
Manduchi, Roberto; Portilla, Javier
2000-01-01
A common method for texture representation is to use the marginal probability densities over the outputs of a set of multi-orientation, multi-scale filters as a description of the texture. We propose a technique, based on Independent Components Analysis, for choosing the set of filters that yield the most informative marginals, meaning that the product over the marginals most closely approximates the joint probability density function of the filter outputs. The algorithm is implemented using a steerable filter space. Experiments involving both texture classification and synthesis show that compared to Principal Components Analysis, ICA provides superior performance for modeling of natural and synthetic textures.
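A rough sketch of the ICA-based filter-learning idea follows, using scikit-learn's FastICA on image patches rather than the paper's steerable-filter space; the patch size, number of filters, and random texture stand-in are illustrative assumptions.

```python
# Learn texture "filters" as ICA basis vectors over image patches: their outputs
# have maximally independent (hence informative) marginal distributions.
import numpy as np
from sklearn.decomposition import FastICA

def learn_ica_filters(image, patch=8, n_filters=16, n_patches=2000, seed=0):
    rng = np.random.default_rng(seed)
    h, w = image.shape
    rows = rng.integers(0, h - patch, n_patches)
    cols = rng.integers(0, w - patch, n_patches)
    X = np.stack([image[r:r + patch, c:c + patch].ravel() for r, c in zip(rows, cols)])
    X -= X.mean(axis=0)                               # remove the mean patch
    ica = FastICA(n_components=n_filters, random_state=seed)
    ica.fit(X)
    return ica.components_.reshape(n_filters, patch, patch)

texture = np.random.rand(256, 256)                    # stand-in for a texture image
filters = learn_ica_filters(texture)
print(filters.shape)                                  # (16, 8, 8)
```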
Performance analysis of the ascent propulsion system of the Apollo spacecraft
NASA Technical Reports Server (NTRS)
Hooper, J. C., III
1973-01-01
Activities involved in the performance analysis of the Apollo lunar module ascent propulsion system are discussed. A description of the ascent propulsion system, including hardware, instrumentation, and system characteristics, is included. The methods used to predict the inflight performance and to establish performance uncertainties of the ascent propulsion system are discussed. The techniques of processing the telemetered flight data and performing postflight performance reconstruction to determine actual inflight performance are discussed. Problems that have been encountered and results from the analysis of the ascent propulsion system performance during the Apollo 9, 10, and 11 missions are presented.
Descriptive Analysis in Education: A Guide for Researchers. NCEE 2017-4023
ERIC Educational Resources Information Center
Loeb, Susanna; Dynarski, Susan; McFarland, Daniel; Morris, Pamela; Reardon, Sean; Reber, Sarah
2017-01-01
Whether the goal is to identify and describe trends and variation in populations, create new measures of key phenomena, or describe samples in studies aimed at identifying causal effects, description plays a critical role in the scientific process in general and education research in particular. Descriptive analysis identifies patterns in data to…
ERIC Educational Resources Information Center
Friedman, Lee; Harvey, Robert J.
1986-01-01
Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…
Resolving the biophysics of axon transmembrane polarization in a single closed-form description
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melendy, Robert F., E-mail: rfmelendy@liberty.edu
2015-12-28
When a depolarizing event occurs across a cell membrane there is a remarkable change in its electrical properties. A complete depolarization event produces a considerably rapid increase in voltage that propagates longitudinally along the axon and is accompanied by changes in axial conductance. A dynamically changing magnetic field is associated with the passage of the action potential down the axon. Over 75 years of research has gone into the quantification of this phenomenon. To date, no unified model exists that resolves transmembrane polarization in a closed-form description. Here, a simple but formative description of propagated signaling phenomena in the membrane of an axon is presented in closed form. The focus is on using both biophysics and mathematical methods for elucidating the fundamental mechanisms governing transmembrane polarization. The results presented demonstrate how to resolve electromagnetic and thermodynamic factors that govern transmembrane potential. Computational results are supported by well-established quantitative descriptions of propagated signaling phenomena in the membrane of an axon. The findings demonstrate how intracellular conductance, the thermodynamics of magnetization, and current modulation function together in generating an action potential in a unified closed-form description. The work presented in this paper provides compelling evidence that three basic factors contribute to propagated signaling in the membrane of an axon. It is anticipated this work will compel those in biophysics, physical biology, and the computational neurosciences to probe deeper into the classical and quantum features of membrane magnetization and signaling. It is hoped that subsequent investigations of this sort will be advanced by the computational features of this model without having to resort to numerical methods of analysis.
Skill components of task analysis
Rogers, Wendy A.; Fisk, Arthur D.
2017-01-01
Some task analysis methods break down a task into a hierarchy of subgoals. Although an important tool of many fields of study, learning to create such a hierarchy (redescription) is not trivial. To further the understanding of what makes task analysis a skill, the present research examined novices’ problems with learning Hierarchical Task Analysis and captured practitioners’ performance. All participants received a task description and analyzed three cooking and three communication tasks by drawing on their knowledge of those tasks. Thirty six younger adults (18–28 years) in Study 1 analyzed one task before training and five afterwards. Training consisted of a general handout that all participants received and an additional handout that differed between three conditions: a list of steps, a flow-diagram, and concept map. In Study 2, eight experienced task analysts received the same task descriptions as in Study 1 and demonstrated their understanding of task analysis while thinking aloud. Novices’ initial task analysis scored low on all coding criteria. Performance improved on some criteria but was well below 100 % on others. Practitioners’ task analyses were 2–3 levels deep but also scored low on some criteria. A task analyst’s purpose of analysis may be the reason for higher specificity of analysis. This research furthers the understanding of Hierarchical Task Analysis and provides insights into the varying nature of task analyses as a function of experience. The derived skill components can inform training objectives. PMID:29075044
Multivariate data analysis methods for the interpretation of microbial flow cytometric data.
Davey, Hazel M; Davey, Christopher L
2011-01-01
Flow cytometry is an important technique in cell biology and immunology and has been applied by many groups to the analysis of microorganisms. This has been made possible by developments in hardware that is now sensitive enough to be used routinely for the analysis of microbes. However, in contrast to advances in the technology that underpins flow cytometry, there has not been concomitant progress in the software tools required to analyse, display and disseminate the data, and manual analysis of individual samples remains a limiting aspect of the technology. We present two new data sets that illustrate common applications of flow cytometry in microbiology and demonstrate the application of manual data analysis, automated visualisation (including the first description of a new piece of software we are developing to facilitate this), genetic programming, principal components analysis and artificial neural nets to these data. The data analysis methods described here are equally applicable to flow cytometric applications with other cell types.
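As an illustration of one of the multivariate approaches mentioned above, the sketch below applies principal components analysis to synthetic multi-channel event data; the channel layout and population parameters are invented for illustration, not drawn from the paper's data sets.

```python
# PCA applied to multi-parameter flow cytometry events to visualise population
# structure in a low-dimensional projection. Data are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Two synthetic "populations" of events measured on 5 channels
pop_a = rng.normal(loc=[2, 2, 5, 1, 1], scale=0.3, size=(500, 5))
pop_b = rng.normal(loc=[4, 3, 1, 4, 2], scale=0.3, size=(500, 5))
events = np.vstack([pop_a, pop_b])

pca = PCA(n_components=2)
scores = pca.fit_transform(events)
print(scores.shape)                      # (1000, 2): events in the PC1-PC2 plane
print(pca.explained_variance_ratio_)     # PC1 carries most of the between-population variance
```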
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Global Deployment Analysis System Algorithm Description (With Updates)
1998-09-01
Global Deployment Analysis System Algorithm Description (with Updates), prepared by Noetics, Inc. for the U.S. Army Concepts Analysis Agency under contract ... This Algorithm Description for the Global Deployment Analysis System (GDAS) was prepared by Noetics ... Support for Paradox Runtime will be provided by the GDAS developers, CAA and Noetics Inc., and not by Borland International. GDAS for Windows has
48 CFR 232.102 - Description of contract financing methods.
Code of Federal Regulations, 2010 CFR
2010-10-01
... financing methods. 232.102 Section 232.102 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Non-Commercial Item Purchase Financing 232.102 Description of contract financing methods. (e)(2) Progress payments...
Maikusa, Norihide; Yamashita, Fumio; Tanaka, Kenichiro; Abe, Osamu; Kawaguchi, Atsushi; Kabasawa, Hiroyuki; Chiba, Shoma; Kasahara, Akihiro; Kobayashi, Nobuhisa; Yuasa, Tetsuya; Sato, Noriko; Matsuda, Hiroshi; Iwatsubo, Takeshi
2013-06-01
Serial magnetic resonance imaging (MRI) images acquired from multisite and multivendor MRI scanners are widely used in measuring longitudinal structural changes in the brain. Precise and accurate measurements are important in understanding the natural progression of neurodegenerative disorders such as Alzheimer's disease. However, geometric distortions in MRI images decrease the accuracy and precision of volumetric or morphometric measurements. To solve this problem, the authors suggest a commercially available phantom-based distortion correction method that accommodates the variation in geometric distortion within MRI images obtained with multivendor MRI scanners. The authors' method is based on image warping using a polynomial function. The method detects fiducial points within a phantom image using phantom analysis software developed by the Mayo Clinic and calculates warping functions for distortion correction. To quantify the effectiveness of the authors' method, the authors corrected phantom images obtained from multivendor MRI scanners and calculated the root-mean-square (RMS) of fiducial errors and the circularity ratio as evaluation values. The authors also compared the performance of the authors' method with that of a distortion correction method based on a spherical harmonics description of the generic gradient design parameters. Moreover, the authors evaluated whether this correction improves the test-retest reproducibility of voxel-based morphometry in human studies. A Wilcoxon signed-rank test with uncorrected and corrected images was performed. The root-mean-square errors and circularity ratios for all slices improved significantly (p < 0.0001) after the authors' distortion correction. Additionally, the authors' method was significantly better than the distortion correction method based on a spherical harmonics description in reducing the distortion root-mean-square errors (p < 0.001 and 0.0337, respectively). Moreover, the authors' method reduced the RMS error arising from gradient nonlinearity more than gradwarp methods. In human studies, the coefficient of variation of voxel-based morphometry analysis of the whole brain improved significantly from 3.46% to 2.70% after distortion correction of the whole gray matter using the authors' method (Wilcoxon signed-rank test, p < 0.05). The authors proposed a phantom-based distortion correction method to improve reproducibility in longitudinal structural brain analysis using multivendor MRI. The authors evaluated the method on phantom images in terms of two geometrical values and on human images in terms of test-retest reproducibility. The results showed that distortion was corrected significantly using the authors' method. In human studies, the reproducibility of voxel-based morphometry analysis for the whole gray matter significantly improved after distortion correction using the authors' method.
Liu, Jian-ping; Xia, Yun
2007-04-01
To critically assess the quality of literature on systematic reviews or meta-analyses of traditional Chinese medicine (TCM) published in Chinese journals. Electronic searches in the CNKI, VIP and Wanfang databases were conducted to retrieve systematic review or meta-analysis reports on TCM, including herbal medicine, needling, acupuncture and moxibustion, as well as integrative medicine; these were identified and extracted according to the 18 items of the QUOROM (quality of reporting of meta-analyses) Statement and related information. The appraisal mainly covered the objectives, source of data, methods of data extraction, quality assessment of the included studies, measurement data synthesis, etc. Eighty-two systematic reviews were identified; 6 reviews were excluded because they were published repeatedly or did not comply with the inclusion criteria, and 76 reviews concerning 51 kinds of diseases were enrolled for appraisal. Among them, 70 reviews evaluated the efficacy of TCM, mainly Chinese herbs, and 9 evaluated acupuncture and moxibustion. In the majority of the reviews, randomised controlled trials were included and the data sources were described, but in 26 reviews only Chinese databases were searched, and the descriptions of data extraction and analysis methods were too brief; 70% of reviews assessed the quality of the included studies; none used a flow chart to show the process of selection, inclusion and exclusion of studies. Few reviews or meta-analysis reports reached the international standard, and the methodology for conducting the systematic reviews is insufficiently described, so they can hardly be reproduced. The authors suggest that advanced methodological training is necessary for reviewers.
Shin, Jae Hyuk; Lee, Boreom; Park, Kwang Suk
2011-05-01
In this study, we developed an automated behavior analysis system using infrared (IR) motion sensors to assist the independent living of elderly people who live alone and to improve the efficiency of their healthcare. An IR motion-sensor-based activity-monitoring system was installed in the houses of the elderly subjects to collect motion signals, from which three feature values were calculated: activity level, mobility level, and nonresponse interval (NRI). The support vector data description (SVDD) method was used to classify normal behavior patterns and to detect abnormal behavior patterns based on these three feature values. Simulation data and real data were used to verify the proposed method in the individual analyses. A robust scheme is presented in this paper for optimally selecting the values of different parameters, especially the scale parameter of the Gaussian kernel function involved in training the SVDD and the window length T of the circadian rhythmic approach, with the aim of applying the SVDD to daily behavior patterns calculated over 24 h. Accuracies by positive predictive value (PPV) were 95.8% and 90.5% for the simulation and real data, respectively. The results suggest that a monitoring system using IR motion sensors together with abnormal-behavior-pattern detection by SVDD is an effective method for the home healthcare of elderly people living alone.
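A rough sketch of the one-class classification idea underlying SVDD follows, approximated here with scikit-learn's OneClassSVM, which with a Gaussian kernel is closely related to SVDD; the feature values and thresholds are invented and do not come from the authors' system.

```python
# SVDD encloses normal daily feature vectors in a minimal hypersphere in kernel
# space; a Gaussian-kernel one-class SVM is used here as a stand-in.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Daily features: [activity level, mobility level, nonresponse interval (h)]
normal_days = rng.normal(loc=[0.6, 0.5, 2.0], scale=[0.05, 0.05, 0.3], size=(60, 3))
test_days = np.array([[0.58, 0.52, 2.1],    # typical day
                      [0.10, 0.05, 9.0]])   # abnormal: low activity, long NRI

model = OneClassSVM(kernel="rbf", nu=0.05).fit(normal_days)
print(model.predict(test_days))             # +1 = normal pattern, -1 = abnormal
```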
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wunschel, David S.; Kreuzer-Martin, Helen W.; Antolick, Kathryn C.
2009-12-01
This report describes method development and preliminary evaluation for analyzing castor samples for signatures of purifying ricin. Ricin purification from the source castor seeds is essentially a problem of protein purification using common biochemical methods. Indications of protein purification will likely manifest themselves as removal of the non-protein fractions of the seed. Two major, non-protein types of biochemical constituents in the seed are the castor oil and various carbohydrates. The oil comprises roughly half the seed weight, while the carbohydrate component comprises roughly half of the remaining "mash" left after oil and hull removal. Different castor oil and carbohydrate components can serve as indicators of specific toxin processing steps. Ricinoleic acid is a relatively unique fatty acid in nature and is the most abundant component of castor oil. The loss of ricinoleic acid indicates a step to remove oil from the seeds. The relative amounts of carbohydrates and carbohydrate-like compounds, including arabinose, xylose, myo-inositol, fucose, rhamnose, glucosamine and mannose, detected in the sample can also indicate specific processing steps. For instance, the differential loss of arabinose relative to mannose and N-acetyl glucosamine indicates enrichment for the protein fraction of the seed using protein precipitation. The methods developed in this project center on fatty acid and carbohydrate extraction from castor samples followed by derivatization to permit analysis by gas chromatography-mass spectrometry (GC-MS). Method descriptions herein include: the source and preparation of castor materials used for method evaluation, the equipment and description of the procedure required for chemical derivatization, and the instrument parameters used in the analysis. Two types of derivatization methods are described for the analysis of carbohydrates and one procedure for the analysis of fatty acids. Two types of GC-MS analysis are included in the method development, one employing a quadrupole MS system for compound identification and one an isotope ratio MS for measuring the stable isotope ratios of deuterium and hydrogen (D/H) in fatty acids. Finally, the method for analyzing the compound abundance data is included. This study indicates that removal of ricinoleic acid is a conserved consequence of each processing step we tested. Furthermore, the stable isotope D/H ratio of ricinoleic acid distinguished between two of the three castor seed sources. Concentrations of arabinose, xylose, mannose, glucosamine and myo-inositol differentiated between crude or acetone-extracted samples and samples produced by protein precipitation. Taken together, these data illustrate the ability to distinguish between processes used to purify a ricin sample as well as, potentially, the source seeds.
Chadeau-Hyam, Marc; Campanella, Gianluca; Jombart, Thibaut; Bottolo, Leonardo; Portengen, Lutzen; Vineis, Paolo; Liquet, Benoit; Vermeulen, Roel C H
2013-08-01
Recent technological advances in molecular biology have given rise to numerous large-scale datasets whose analysis imposes serious methodological challenges mainly relating to the size and complex structure of the data. Considerable experience in analyzing such data has been gained over the past decade, mainly in genetics, from the Genome-Wide Association Study era, and more recently in transcriptomics and metabolomics. Building upon the corresponding literature, we provide here a nontechnical overview of well-established methods used to analyze OMICS data within three main types of regression-based approaches: univariate models including multiple testing correction strategies, dimension reduction techniques, and variable selection models. Our methodological description focuses on methods for which ready-to-use implementations are available. We describe the main underlying assumptions, the main features, and advantages and limitations of each of the models. This descriptive summary constitutes a useful tool for driving methodological choices while analyzing OMICS data, especially in environmental epidemiology, where the emergence of the exposome concept clearly calls for unified methods to analyze marginally and jointly complex exposure and OMICS datasets. Copyright © 2013 Wiley Periodicals, Inc.
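As a small illustration of the first family of approaches mentioned above, univariate models with multiple testing correction, the sketch below runs feature-wise t-tests and applies a Benjamini-Hochberg false discovery rate correction; the data are simulated, not drawn from any study discussed in the review.

```python
# Feature-wise two-group t-tests on simulated OMICS data, followed by a
# Benjamini-Hochberg step-up procedure to control the false discovery rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_features, n_samples = 1000, 40
X = rng.normal(size=(n_samples, n_features))      # e.g. metabolite intensities
y = rng.integers(0, 2, n_samples)                 # exposed vs. unexposed
X[:, :20] += y[:, None] * 1.5                     # plant 20 true signals

pvals = np.array([stats.ttest_ind(X[y == 1, j], X[y == 0, j]).pvalue
                  for j in range(n_features)])

def benjamini_hochberg(p, alpha=0.05):
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, len(p) + 1) / len(p)
    passed = p[order] <= thresholds
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    rejected = np.zeros(len(p), dtype=bool)
    rejected[order[:k]] = True                    # reject the k smallest p-values
    return rejected

print(benjamini_hochberg(pvals).sum())            # roughly the 20 planted signals
```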
Analysis of Waves in Space Plasma (WISP) near field simulation and experiment
NASA Technical Reports Server (NTRS)
Richie, James E.
1992-01-01
The WISP payload, scheduled for a 1995 space transportation system (shuttle) flight, will include a large power transmitter on board operating over a wide range of frequencies. The levels of electromagnetic interference/electromagnetic compatibility (EMI/EMC) must be addressed to ensure the safety of the shuttle crew. This report is concerned with the simulation and experimental verification of EMI/EMC for the WISP payload in the shuttle cargo bay. The simulations have been carried out using the method of moments for both thin wires and patches to simulate closed solids. Data obtained from simulation are compared with experimental results. An investigation of the accuracy of the modeling approach is also included. The report begins with a description of the WISP experiment, followed by a description of the model used to simulate the cargo bay. The results of the simulation are compared to experimental data on the input impedance of the WISP antenna with the cargo bay present. The methods used to verify the accuracy of the model are discussed to illustrate appropriate ways of obtaining this information. Finally, suggestions for future work are provided.
ERIC Educational Resources Information Center
Rule, Audrey C.; Crisafulli, Sherry; DeCare, Heather; DeLeo, Tonya; Eastman, Keri; Farrell, Liz; Geblein, Jennifer; Gioia, Chelsea; Joyce, Ashley; Killian, Kali; Knoop, Kelly; LaRocca, Alison; Meyer, Katie; Miller, Julianne; Roth, Vicki; Throo, Julie; Van Arsdale, Jim; Walker, Malissa
2007-01-01
Descriptive vocabulary is needed for communication and mental processing of science observations. Elementary preservice teachers in a science methods class at a mid-sized public college in central New York State increased their descriptive vocabularies through a course assignment of making a descriptive adjective object box. This teaching material…
48 CFR 1432.102 - Description of contract financing methods.
Code of Federal Regulations, 2010 CFR
2010-10-01
... financing methods. 1432.102 Section 1432.102 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Non-Commercial Item Purchase Financing 1432.102 Description of contract financing methods. Use of progress payments based on a percentage or stage...
48 CFR 432.102 - Description of contract financing methods.
Code of Federal Regulations, 2010 CFR
2010-10-01
... financing methods. 432.102 Section 432.102 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Non-Commercial Item Purchase Financing 432.102 Description of contract financing methods. Progress payments based on a percentage or stage of completion are...
48 CFR 1532.102 - Description of contract financing methods.
Code of Federal Regulations, 2010 CFR
2010-10-01
... financing methods. 1532.102 Section 1532.102 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING General 1532.102 Description of contract financing methods. Progress payments based on a percentage or stage of completion are authorized for use as...
48 CFR 932.102 - Description of contract financing methods.
Code of Federal Regulations, 2010 CFR
2010-10-01
... financing methods. 932.102 Section 932.102 Federal Acquisition Regulations System DEPARTMENT OF ENERGY GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Non-Commercial Item Purchase Financing 932.102 Description of contract financing methods. (e)(2) Progress payments based on a percentage or stage of...
Lesion Border Detection in Dermoscopy Images
Celebi, M. Emre; Schaefer, Gerald; Iyatomi, Hitoshi; Stoecker, William V.
2009-01-01
Background Dermoscopy is one of the major imaging modalities used in the diagnosis of melanoma and other pigmented skin lesions. Due to the difficulty and subjectivity of human interpretation, computerized analysis of dermoscopy images has become an important research area. One of the most important steps in dermoscopy image analysis is the automated detection of lesion borders. Methods In this article, we present a systematic overview of the recent border detection methods in the literature paying particular attention to computational issues and evaluation aspects. Conclusion Common problems with the existing approaches include the acquisition, size, and diagnostic distribution of the test image set, the evaluation of the results, and the inadequate description of the employed methods. Border determination by dermatologists appears to depend upon higher-level knowledge, therefore it is likely that the incorporation of domain knowledge in automated methods will enable them to perform better, especially in sets of images with a variety of diagnoses. PMID:19121917
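For orientation only, the sketch below implements one of the simplest classes of border detection the survey covers (global thresholding followed by morphological cleanup), assuming scikit-image is available; real dermoscopy pipelines add hair and artifact removal, color-space choices, and postprocessing.

```python
# A minimal thresholding-based sketch; not any specific method from the survey.
import numpy as np
from skimage import color, filters, measure, morphology

def lesion_border(rgb_image: np.ndarray) -> np.ndarray:
    """Return the longest closed contour of the thresholded lesion mask."""
    gray = color.rgb2gray(rgb_image)
    mask = gray < filters.threshold_otsu(gray)        # lesions are darker than skin
    mask = morphology.remove_small_objects(mask, min_size=500)
    mask = morphology.binary_closing(mask, morphology.disk(5))
    contours = measure.find_contours(mask.astype(float), 0.5)
    return max(contours, key=len)                      # (row, col) border points
```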
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.
2005-01-01
The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.
Thermal APU/hydraulics analysis program. User's guide and programmer's manual
NASA Technical Reports Server (NTRS)
Deluna, T. A.
1976-01-01
This document provides the user's guide information and program description needed to run, and gain a general understanding of, the Thermal APU/Hydraulics Analysis Program (TAHAP). This information consists of general descriptions of the APU/hydraulic system and the TAHAP model, input and output data descriptions, and specific subroutine requirements. Deck setups and input data formats are included, and other necessary or helpful information for using TAHAP is given. The math model descriptions for the driver program and each of its supporting subroutines are outlined.
Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario
2014-01-01
Background: selecting the correct statistical test and data mining method depends strongly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are examined in detail, and statistical comparison, modeling, and data mining methods are discussed using several medical examples. We present two ordinal-variable clustering examples, ordinal variables being more challenging to analyze, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold-standard groups of malignant and benign cases that had been identified by clinical tests. Results: the sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: by using an appropriate clustering algorithm based on the measurement scale of the variables in the study, high performance is achieved. Moreover, descriptive and inferential statistics, as well as the modeling approach, must be selected based on the scale of the variables. PMID:24672565
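As an illustration of the clustering-versus-clinical-label comparison described (not the paper's ordinal-specific algorithms), the sketch below clusters nine 1-10 ordinal features with k-means and scores sensitivity and specificity against benign/malignant labels; the data here are synthetic stand-ins for the 683 WBCD cases.

```python
# Rough sketch with synthetic ordinal data; real analysis would use the WBCD cases.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = np.vstack([rng.integers(1, 5, (100, 9)),        # benign-like toy cases
               rng.integers(5, 11, (100, 9))])       # malignant-like toy cases
y = np.array([0] * 100 + [1] * 100)                  # 1 = malignant

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
if (labels == y).mean() < 0.5:                       # clusters have arbitrary numbering
    labels = 1 - labels

tp = np.sum((labels == 1) & (y == 1)); fn = np.sum((labels == 0) & (y == 1))
tn = np.sum((labels == 0) & (y == 0)); fp = np.sum((labels == 1) & (y == 0))
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```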
Cervical vertebral maturation as a biologic indicator of skeletal maturity.
Santiago, Rodrigo César; de Miranda Costa, Luiz Felipe; Vitral, Robert Willer Farinazzo; Fraga, Marcelo Reis; Bolognese, Ana Maria; Maia, Lucianne Cople
2012-11-01
To identify and review the literature regarding the reliability of cervical vertebral maturation (CVM) staging in predicting the pubertal growth spurt. The selection criteria included cross-sectional and longitudinal descriptive studies in humans that evaluated, qualitatively or quantitatively, the accuracy and reproducibility of the CVM method on lateral cephalometric radiographs, as well as its correlation with the standard method established by hand-wrist radiographs. The searches retrieved 343 unique citations. Twenty-three studies met the inclusion criteria. Six articles had moderate to high scores, while 17 of 23 had low scores. Analysis also showed a moderate to high statistically significant correlation between the CVM and hand-wrist maturation methods. There was moderate to high reproducibility of the CVM method, and only one study investigated the accuracy of the CVM index in detecting peak pubertal growth. This systematic review has shown that studies on the CVM method for radiographic assessment of skeletal maturation stages suffer from serious methodological failures. Better-designed studies with adequate accuracy, reproducibility, and correlation analyses, including studies with appropriate sensitivity-specificity analysis, should be performed.
Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin
2017-03-01
Current methods describe the formation process of algae inaccurately and predict water blooms with low precision. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted by chemical measurement. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series analysis and intelligent models is put forward. First, through the process of photosynthesis, the main factors that affect the reproduction of the algae are analyzed. A compensation prediction method for multivariate time series analysis, based on a neural network and a Support Vector Machine, is then proposed and combined with Kernel Principal Component Analysis to reduce the dimensionality of the factors influencing blooms. A Genetic Algorithm is then applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method better compensates the multivariate time series prediction model and is an effective way to improve the descriptive accuracy of algae growth and the precision of water bloom prediction.
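A hedged sketch of the general shape of such a pipeline, using scikit-learn stand-ins: Kernel PCA reduces the correlated bloom-influencing factors, and a small neural network regressor replaces the GA-tuned BP network and LS-SVM described in the paper; the data are synthetic placeholders.

```python
# Sketch only: stand-ins for the paper's GA-tuned BP network / LS-SVM components.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 8))            # e.g. temperature, nutrients, flow, light ...
y = 0.5 * X[:, 0] + X[:, 1] ** 2 + rng.normal(scale=0.1, size=300)  # chlorophyll-a proxy

model = make_pipeline(
    StandardScaler(),
    KernelPCA(n_components=4, kernel="rbf"),   # reduce the correlated drivers
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X[:200], y[:200])
print("held-out R^2:", model.score(X[200:], y[200:]))
```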
NASA Astrophysics Data System (ADS)
Hristian, L.; Ostafe, M. M.; Manea, L. R.; Apostol, L. L.
2017-06-01
This work examined the distribution of combed wool fabrics intended for the manufacture of outerwear in terms of their durability and physiological comfort indices, using Principal Component Analysis (PCA). PCA, as applied in this study, is a descriptive method for multivariate (multidimensional) data and aims to reduce, in a controlled way, the number of variables (columns) in the data matrix as far as possible, ideally to two or three. Therefore, based on the information about each group/assortment of fabrics, the goal is to replace the nine inter-correlated variables with only two or three new variables, called components. The aim of PCA is to extract the smallest number of components that recover most of the total information contained in the initial data.
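A minimal illustration of the PCA step described, using synthetic stand-in data in place of the nine measured fabric indices: nine inter-correlated variables are reduced to a few components that retain most of the total variance.

```python
# Synthetic stand-in for nine inter-correlated fabric indices across 40 assortments.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
latent = rng.normal(size=(40, 2))                                       # two hidden factors
X = latent @ rng.normal(size=(2, 9)) + 0.1 * rng.normal(size=(40, 9))   # nine observed indices

pca = PCA(n_components=3)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("variance recovered by 3 components:", pca.explained_variance_ratio_.sum())
```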
The use of copula functions for predictive analysis of correlations between extreme storm tides
NASA Astrophysics Data System (ADS)
Domino, Krzysztof; Błachowicz, Tomasz; Ciupak, Maurycy
2014-11-01
In this paper we present a method for the quantitative description of weakly predictable extreme hydrological events at an inland sea. Correlations between variations at individual measuring points were investigated using combined statistical methods. As the main tool for this analysis we used a two-dimensional copula function sensitive to correlated extreme effects. Additionally, a newly proposed methodology based on Detrended Fluctuation Analysis (DFA) and Anomalous Diffusion (AD) was used to predict negative and positive auto-correlations and to guide the optimum choice of copula functions. As a practical example we analysed maximum storm tide data recorded at five spatially separated locations on the Baltic Sea. For the analysis we used Gumbel, Clayton, and Frank copula functions and introduced the reversed Clayton copula. The application of our research model is associated with modelling the risk of high storm tides and possible storm flooding.
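The sketch below illustrates the general copula workflow only (it omits the paper's DFA/AD selection step): pseudo-observations are formed from two stations' maxima, a Clayton copula parameter is fitted by Kendall's-tau inversion, and the joint CDF is evaluated; the input series are synthetic.

```python
# General copula workflow sketch; not the authors' full procedure or data.
import numpy as np
from scipy.stats import kendalltau, rankdata

def clayton_cdf(u, v, theta):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    return np.maximum(u ** -theta + v ** -theta - 1.0, 0.0) ** (-1.0 / theta)

rng = np.random.default_rng(4)
x = rng.gumbel(size=500)                    # stand-ins for storm-tide maxima at
y = 0.6 * x + rng.gumbel(size=500)          # two correlated measuring points

u = rankdata(x) / (len(x) + 1)              # pseudo-observations in (0, 1)
v = rankdata(y) / (len(y) + 1)

tau, _ = kendalltau(x, y)
theta = 2 * tau / (1 - tau)                 # Kendall's-tau inversion for Clayton
print("fitted theta:", theta)
print("P(both stations below their 90th percentile):", clayton_cdf(0.9, 0.9, theta))
```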
Sharma, Harshita; Alekseychuk, Alexander; Leskovsky, Peter; Hellwich, Olaf; Anand, R S; Zerbe, Norman; Hufnagl, Peter
2012-10-04
Computer-based analysis of digitized histological images has been gaining increasing attention due to its extensive use in research and routine practice. The article aims to contribute towards the description and retrieval of histological images by employing a structural method using graphs. Owing to their expressive ability, graphs are considered a powerful and versatile representation formalism and have received growing attention, especially from the image processing and computer vision communities. The article describes a novel method for determining similarity between histological images through graph-theoretic description and matching, for the purpose of content-based retrieval. A higher-order (region-based) graph representation of breast biopsy images is obtained, and a tree-search-based inexact graph matching technique is employed that facilitates the automatic retrieval of images structurally similar to a given image from large databases. The results obtained and the evaluation performed demonstrate the effectiveness and superiority of graph-based image retrieval over a common histogram-based technique. The complexity of the graph matching is reduced compared to state-of-the-art optimal inexact matching methods by applying a prerequisite criterion for the matching of nodes and a sophisticated design of the estimation function, especially the prognosis function. The proposed method is suitable for the retrieval of similar histological images, as suggested by the experimental and evaluation results obtained in the study. It is intended for use in Content-Based Image Retrieval (CBIR) applications in the areas of medical diagnostics and research, and can also be generalized to the retrieval of other types of complex images. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/1224798882787923.
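As a small stand-in for the structural-similarity idea (not the authors' tree-search matching with prognosis functions), the sketch below builds two attributed region graphs and compares them with networkx's graph edit distance; the node attributes are hypothetical region descriptors.

```python
# Stand-in for region-graph similarity; uses networkx's exact edit distance on tiny graphs.
import networkx as nx

def region_graph(regions, adjacency):
    """regions: {id: attribute dict}; adjacency: iterable of (id, id) pairs."""
    g = nx.Graph()
    for rid, attrs in regions.items():
        g.add_node(rid, **attrs)
    g.add_edges_from(adjacency)
    return g

g1 = region_graph({0: {"kind": "stroma"}, 1: {"kind": "duct"}, 2: {"kind": "duct"}},
                  [(0, 1), (0, 2)])
g2 = region_graph({0: {"kind": "stroma"}, 1: {"kind": "duct"}},
                  [(0, 1)])

# Lower edit distance = structurally more similar; substituting nodes of the
# same "kind" is free, other substitutions cost 1.
dist = nx.graph_edit_distance(
    g1, g2,
    node_subst_cost=lambda a, b: 0 if a["kind"] == b["kind"] else 1,
)
print("graph edit distance:", dist)
```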
Fishman, M. J.
1993-01-01
Methods to be used to analyze samples of water, suspended sediment and bottom material for their content of inorganic and organic constituents are presented. Technology continually changes, and so this laboratory manual includes new and revised methods for determining the concentration of dissolved constituents in water, whole water recoverable constituents in water-suspended sediment samples, and recoverable concentration of constituents in bottom material. For each method, the general topics covered are the application, the principle of the method, interferences, the apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data. Included in this manual are 30 methods.
[Reconstituting evaluation methods based on both qualitative and quantitative paradigms].
Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro
2011-01-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods for qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.
The value of job analysis, job description and performance.
Wolfe, M N; Coggins, S
1997-01-01
All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.
Concentrating Solar Power Projects - Olivenza 1
Manufacturer: Siemens
Turbine Description: 5 extractions
Output Type: Steam Rankine
Power Cycle Pressure: 100.0 bar
Cooling Method: Wet cooling
Cooling Method Description: Cooling Towers