Sample records for modeling SEM software

  1. Three-Dimensional (3D) Nanometrology Based on Scanning Electron Microscope (SEM) Stereophotogrammetry.

    PubMed

    Tondare, Vipin N; Villarrubia, John S; Vladár, András E

    2017-10-01

    Three-dimensional (3D) reconstruction of a sample surface from scanning electron microscope (SEM) images taken at two perspectives has been known for decades. Nowadays, there exist several commercially available stereophotogrammetry software packages. For testing these software packages, in this study we used Monte Carlo simulated SEM images of virtual samples. A virtual sample is a model in a computer, and its true dimensions are known exactly, which is impossible for real SEM samples due to measurement uncertainty. The simulated SEM images can be used for algorithm testing, development, and validation. We tested two stereophotogrammetry software packages and compared their reconstructed 3D models with the known geometry of the virtual samples used to create the simulated SEM images. Both packages performed relatively well with simulated SEM images of a sample with a rough surface. However, in a sample containing nearly uniform and therefore low-contrast zones, the height reconstruction error was ≈46%. The present stereophotogrammetry software packages need further improvement before they can be used reliably with SEM images with uniform zones.
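
Because a virtual sample's dimensions are known exactly, the reconstruction error of a stereophotogrammetry package can be computed by direct comparison against the ground truth. A minimal Python sketch of that comparison, with made-up height values chosen to illustrate the ≈46% error scale reported above (the function name and numbers are hypothetical, not from the paper):

```python
def mean_height_error(true_heights, recon_heights):
    """Mean relative height reconstruction error against a virtual sample.

    A virtual sample's true heights are known exactly, so the error of a
    stereophotogrammetry reconstruction can be computed point by point.
    """
    if len(true_heights) != len(recon_heights):
        raise ValueError("height maps must be the same size")
    errs = [abs(r - t) / abs(t)
            for t, r in zip(true_heights, recon_heights) if t != 0]
    return sum(errs) / len(errs)

true_h = [100.0, 100.0, 100.0]   # nm, exact by construction (virtual sample)
recon_h = [54.0, 55.0, 53.0]     # nm, underestimated as in a low-contrast zone
print(round(mean_height_error(true_h, recon_h), 2))  # 0.46, i.e. ~46%
```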

  2. Multiple-Group Analysis Using the sem Package in the R System

    ERIC Educational Resources Information Center

    Evermann, Joerg

    2010-01-01

    Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…

  3. Structural Equation Modeling: A Framework for Ocular and Other Medical Sciences Research

    PubMed Central

    Christ, Sharon L.; Lee, David J.; Lam, Byron L.; Zheng, D. Diane

    2017-01-01

    Structural equation modeling (SEM) is a modeling framework that encompasses many types of statistical models and can accommodate a variety of estimation and testing methods. SEM has been used primarily in social sciences but is increasingly used in epidemiology, public health, and the medical sciences. SEM provides many advantages for the analysis of survey and clinical data, including the ability to model latent constructs that may not be directly observable. Another major feature is simultaneous estimation of parameters in systems of equations that may include mediated relationships, correlated dependent variables, and in some instances feedback relationships. SEM allows for the specification of theoretically holistic models because multiple and varied relationships may be estimated together in the same model. SEM has recently expanded by adding generalized linear modeling capabilities that include the simultaneous estimation of parameters of different functional form for outcomes with different distributions in the same model. Therefore, mortality modeling and other relevant health outcomes may be evaluated. Random effects estimation using latent variables has been advanced in the SEM literature and software. In addition, SEM software has increased estimation options. Therefore, modern SEM is quite general and includes model types frequently used by health researchers, including generalized linear modeling, mixed effects linear modeling, and population average modeling. This article does not present any new information. It is meant as an introduction to SEM and its uses in ocular and other health research. PMID:24467557
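
The simultaneous estimation of mediated relationships described above can be illustrated with a small sketch: a mediation system M = aX + e1, Y = bM + cX + e2 fitted equation by equation with ordinary least squares. This is an illustration of the idea only, not of any particular SEM package; all variable names and data are hypothetical, and Y is generated without noise so its paths are recovered exactly:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y.

    X is a list of rows, each starting with 1.0 for the intercept. Solved
    with plain Gaussian elimination; illustrative, not numerically robust.
    """
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(k)]
    for col in range(k):                      # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    b = [0.0] * k
    for i in reversed(range(k)):              # back substitution
        b[i] = (A[i][k] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

# Hypothetical data: M = 2*X + disturbance, Y = 3*M + 1*X with no noise in Y,
# so the b (M -> Y) and c (X -> Y) paths are recovered exactly.
X_var = [float(x) for x in range(1, 9)]
dist = [0.5, -0.3, 0.8, -0.1, 0.4, -0.6, 0.2, -0.9]
M_var = [2.0 * x + d for x, d in zip(X_var, dist)]
Y_var = [3.0 * m + 1.0 * x for m, x in zip(M_var, X_var)]

a_path = ols([[1.0, x] for x in X_var], M_var)[1]                 # X -> M
b_int, b_path, c_path = ols([[1.0, m, x] for m, x in zip(M_var, X_var)], Y_var)
indirect = a_path * b_path                                        # mediated effect
```

Full SEM software estimates all equations jointly (and handles latent variables); the equation-by-equation fit here is only the simplest special case.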

  4. On the Nature of SEM Estimates of ARMA Parameters.

    ERIC Educational Resources Information Center

    Hamaker, Ellen L.; Dolan, Conor V.; Molenaar, Peter C. M.

    2002-01-01

    Reexamined the nature of structural equation modeling (SEM) estimates of autoregressive moving average (ARMA) models, replicated the simulation experiments of P. Molenaar, and examined the behavior of the log-likelihood ratio test. Simulation studies indicate that estimates of ARMA parameters observed with SEM software are identical to those…

  5. Software analysis in the semantic web

    NASA Astrophysics Data System (ADS)

    Taylor, Joshua; Hall, Robert T.

    2013-05-01

    Many approaches in software analysis, particularly dynamic malware analysis, benefit greatly from the use of linked data and other Semantic Web technology. In this paper, we describe AIS, Inc.'s Semantic Extractor (SemEx) component from the Malware Analysis and Attribution through Genetic Information (MAAGI) effort, funded under DARPA's Cyber Genome program. The SemEx generates OWL-based semantic models of high- and low-level behaviors in malware samples from system call traces generated by AIS's introspective hypervisor, IntroVirt(TM). Within MAAGI, these semantic models were used by modules that cluster malware samples by functionality and construct "genealogical" malware lineages. Herein, we describe the design, implementation, and use of the SemEx, as well as the C2DB, an OWL ontology used for representing software behavior and cyber-environments.

  6. Software Technology for Adaptable, Reliable Systems (STARS)

    DTIC Science & Technology

    1994-03-25

    Timeline (3), SECOMO (3), SEER (3), GSFC Software Engineering Lab Model (1), SLIM (4), SEER-SEM (1), SPQR (2), PRICE-S (2), internally-developed models (3), APMSS (1)... Timeline - 3; SASET (Software Architecture Sizing Estimating Tool) - 2; MicroMan II - 2; LCM (Logistics Cost Model) - 2; SPQR - 2; PRICE-S - 2

  7. An SAS Macro for Implementing the Modified Bollen-Stine Bootstrap for Missing Data: Implementing the Bootstrap Using Existing Structural Equation Modeling Software

    ERIC Educational Resources Information Center

    Enders, Craig K.

    2005-01-01

    The Bollen-Stine bootstrap can be used to correct for standard error and fit statistic bias that occurs in structural equation modeling (SEM) applications due to nonnormal data. The purpose of this article is to demonstrate the use of a custom SAS macro program that can be used to implement the Bollen-Stine bootstrap with existing SEM software.…

  8. Graphical Tools for Linear Structural Equation Modeling

    DTIC Science & Technology

    2014-06-01

    others. Kenny and Milan (2011) write, "Identification is perhaps the most difficult concept for SEM researchers to understand. We have seen SEM...model to using typical SEM software to determine model identifiability. Kenny and Milan (2011) list the following drawbacks: (i) If poor starting...the well-known recursive and null rules (Bollen, 1989) and the regression rule (Kenny and Milan, 2011). A Simple Criterion for Identifying Individual

  9. [Sem: a suitable statistical software adaptated for research in oncology].

    PubMed

    Kwiatkowski, F; Girard, M; Hacene, K; Berlie, J

    2000-10-01

    Many software packages have been adapted for medical use, but they rarely combine convenient data management with statistics. A recent cooperative effort produced a new software package, Sem (Statistics Epidemiology Medicine), which supports both the data management of trials and statistical analysis of the resulting data. Being very convenient, it can be used by non-professionals in statistics (biologists, physicians, researchers, data managers), since in most cases (except with multivariate models) the software itself selects the most appropriate test, after which complementary tests can be requested if needed. The Sem database manager (DBM) is not compatible with the usual DBMs, which constitutes a first protection against loss of privacy. Other safeguards (passwords, encryption, etc.) strengthen data security, all the more necessary today since Sem can be run on computer networks. The data organization supports multiplicity: forms can be duplicated per patient. Dates are handled in a special but transparent manner (sorting, date and delay calculations, etc.). Sem communicates with common desktop applications, often with a simple copy/paste, so statistics can easily be performed on data stored in external spreadsheets, and slides can be produced by pasting graphs (survival curves, etc.) with a single mouse click. Already in daily use at over fifty sites in different hospitals, this product, combining data management and statistics, appears to be a convenient and innovative solution.

  10. Virtual Levels and Role Models: N-Level Structural Equations Model of Reciprocal Ratings Data.

    PubMed

    Mehta, Paras D

    2018-01-01

    A general latent variable modeling framework called n-Level Structural Equations Modeling (NL-SEM) for dependent data structures is introduced. NL-SEM is applicable to a wide range of complex multilevel data structures (e.g., cross-classified, switching membership, etc.). Reciprocal dyadic ratings obtained in a round-robin design involve a complex set of dependencies that cannot be modeled within the Multilevel Modeling (MLM) or Structural Equations Modeling (SEM) frameworks. The Social Relations Model (SRM) for round-robin data is used as an example to illustrate key aspects of the NL-SEM framework. NL-SEM introduces novel constructs such as 'virtual levels' that allow a natural specification of latent variable SRMs. An empirical application of an explanatory SRM for personality, using xxM, a software package implementing NL-SEM, is presented. Results show that person perceptions are an integral aspect of personality. Methodological implications of NL-SEM for the analyses of an emerging class of contextual- and relational-SEMs are discussed.

  11. Teacher's Corner: Structural Equation Modeling with the Sem Package in R

    ERIC Educational Resources Information Center

    Fox, John

    2006-01-01

    R is free, open-source, cooperatively developed software that implements the S statistical programming language and computing environment. The current capabilities of R are extensive, and it is in wide use, especially among statisticians. The sem package provides basic structural equation modeling facilities in R, including the ability to fit…

  12. 3D reconstruction of SEM images by use of optical photogrammetry software.

    PubMed

    Eulitz, Mona; Reiss, Gebhard

    2015-08-01

    Reconstruction of the three-dimensional (3D) surface of an object to be examined is widely used for structure analysis in science, and many biological questions require information about the true 3D structure. For Scanning Electron Microscopy (SEM) there has been no efficient non-destructive solution for reconstruction of the surface morphology to date. The well-known method of recording stereo pair images generates a 3D stereoscopic reconstruction of a section, but not of the complete sample surface. We present a simple and non-destructive method of 3D surface reconstruction from SEM samples based on the principles of optical close-range photogrammetry, in which a series of overlapping photos is used to generate a 3D model of the surface of an object. We adapted this method to the special requirements of the SEM. Instead of moving a detector around the object, the object itself was rotated. A series of overlapping photos was stitched and converted into a 3D model using software commonly used for optical photogrammetry. A rabbit kidney glomerulus was used to demonstrate the workflow of this adaptation. The reconstruction produced a realistic and high-resolution 3D mesh model of the glomerular surface. The study showed that SEM micrographs are suitable for 3D reconstruction by optical photogrammetry. This new approach is a simple and useful method of 3D surface reconstruction and suitable for various applications in research and teaching. Copyright © 2015 Elsevier Inc. All rights reserved.
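
Photogrammetric stitching needs a minimum overlap between neighbouring images, so the rotation of the object must be planned accordingly. A hedged sketch of that planning step, assuming each SEM image captures a fixed angular sector of the rotating sample (the sector size, overlap requirement, and function name are illustrative assumptions, not values from the paper):

```python
import math

def rotation_steps(sector_deg, overlap_frac):
    """Number of evenly spaced rotation stops for full 360-degree coverage.

    sector_deg: angular extent of the surface captured per image (assumed).
    overlap_frac: overlap required between neighbouring images (0 <= f < 1),
    as needed by photogrammetry stitching software.
    """
    if not 0 <= overlap_frac < 1:
        raise ValueError("overlap_frac must be in [0, 1)")
    step = sector_deg * (1.0 - overlap_frac)   # fresh angle gained per stop
    return math.ceil(360.0 / step)

# e.g. each image spans ~30 degrees and stitching needs ~60% overlap:
print(rotation_steps(30.0, 0.6))  # 30 * 0.4 = 12 degrees per stop -> 30 stops
```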

  13. Simulation based mask defect repair verification and disposition

    NASA Astrophysics Data System (ADS)

    Guo, Eric; Zhao, Shirley; Zhang, Skin; Qian, Sandy; Cheng, Guojie; Vikram, Abhishek; Li, Ling; Chen, Ye; Hsiang, Chingyun; Zhang, Gary; Su, Bo

    2009-10-01

    As the industry moves towards sub-65nm technology nodes, mask inspection, with increased sensitivity and shrinking critical defect size, catches more and more nuisance and false defects. Increased defect counts pose great challenges in post-inspection defect classification and disposition: which defects are real, and among the real defects, which should be repaired and how should the post-repair defects be verified? In this paper, we address the challenges in mask defect verification and disposition, in particular post-repair defect verification, with an efficient methodology that uses SEM mask defect images and optical inspection mask defect images (the latter only for verification of phase- and transmission-related defects). We demonstrate the flow using programmed mask defects in a sub-65nm technology node design. In total, 20 types of defects were designed, including defects found in typical real circuit environments, with 30 different sizes designed for each type. An SEM image was taken for each programmed defect after the test mask was made. Selected defects were repaired and SEM images from the test mask were taken again. Wafers were printed with the test mask before and after repair as defect printability references. A software tool, SMDD (Simulation-based Mask Defect Disposition), has been used in this study. The software extracts edges from the mask SEM images and converts them into polygons saved in GDSII format. The converted polygons from the SEM images are then filled with the correct tone to form mask patterns and merged back into the original GDSII design file. This merge is needed for contour simulation: normally the SEM images cover only a small area (~1 μm), while accurate simulation requires including a larger area to capture optical proximity effects. With a lithography process model, the resist contour of the area of interest (AOI, the area surrounding a mask defect) can be simulated. If such a full process model is not available, a simple optical model can be used to obtain the simulated aerial image intensity in the AOI. With built-in contour analysis functions, the SMDD software can easily compare the contour (or intensity) differences between the defect pattern and the normal pattern. With user-provided judging criteria, the software can then disposition the defect based on the contour comparison. In addition, process sensitivity properties, such as MEEF and NILS, can be readily obtained in the AOI with a lithography model, which makes the mask defect disposition criteria more intelligent.
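
The contour-comparison step can be sketched concretely: given a reference contour and a simulated defect contour as polygons, compare their enclosed areas and flag the defect if the deviation exceeds a judging criterion. This is a minimal stand-in for SMDD's contour analysis; the 10% tolerance and the example contours are hypothetical:

```python
def polygon_area(pts):
    """Unsigned polygon area via the shoelace formula; pts = [(x, y), ...]."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def disposition(reference, defect, tolerance=0.10):
    """Flag for repair if the simulated defect contour's area deviates from
    the reference contour by more than `tolerance` (hypothetical 10% rule)."""
    ref_a, def_a = polygon_area(reference), polygon_area(defect)
    deviation = abs(def_a - ref_a) / ref_a
    return ("repair" if deviation > tolerance else "pass", deviation)

# Hypothetical resist contours in nm: a nominal line vs. a pinched (necked) one.
ref = [(0, 0), (100, 0), (100, 40), (0, 40)]
pinched = [(0, 0), (100, 0), (100, 40), (50, 40), (50, 25), (0, 25)]
verdict, dev = disposition(ref, pinched)
print(verdict, dev)  # repair 0.1875
```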

  14. Modelling and analysis of FMS productivity variables by ISM, SEM and GTMA approach

    NASA Astrophysics Data System (ADS)

    Jain, Vineet; Raj, Tilak

    2014-09-01

    Productivity has often been cited as a key factor in flexible manufacturing system (FMS) performance, and actions to increase it are said to improve profitability and the wage-earning capacity of employees. Improving productivity is seen as a key issue for the long-term survival and success of a manufacturing system. The purpose of this paper is to model and analyse the productivity variables of FMS. The study combines several approaches, namely interpretive structural modelling (ISM), structural equation modelling (SEM), the graph theory and matrix approach (GTMA), and a cross-sectional survey within manufacturing firms in India. ISM has been used to develop a model of productivity variables, which is then analysed. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are powerful statistical techniques; CFA is carried out via SEM. EFA is applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20) software, and these factors are confirmed by CFA through the Analysis of Moment Structures (AMOS 20) software. Twenty productivity variables were identified from the literature, and four factors were extracted that bear on the productivity of FMS: people, quality, machine, and flexibility. SEM using AMOS 20 was used to estimate the first-order four-factor structure. GTMA is a multiple attribute decision making (MADM) methodology used to quantify the intensity of productivity variables in an organization. An FMS productivity index is proposed to quantify the factors that affect FMS.
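
In GTMA, a single index is typically obtained as the permanent of an attribute matrix whose diagonal holds factor scores and whose off-diagonal entries hold relative inter-factor influences. A hedged sketch of that computation (the permanent is like a determinant with all terms added, no sign alternation); the 4x4 matrix values below are invented for illustration, not taken from the paper:

```python
def permanent(m):
    """Matrix permanent by recursive expansion along the first row.

    Identical to Laplace expansion of a determinant except that every term
    is added. Exponential time, but fine for the small matrices GTMA uses.
    """
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += m[0][j] * permanent(minor)
    return total

# Hypothetical 4-factor matrix: diagonal = scores for people, quality,
# machine, flexibility; off-diagonal = assumed relative influences.
F = [[5, 3, 2, 3],
     [2, 4, 3, 2],
     [3, 2, 6, 3],
     [2, 3, 2, 5]]
index = permanent(F)   # single FMS productivity index for comparison across firms
```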

  15. Using digital colour to increase the realistic appearance of SEM micrographs of bloodstains.

    PubMed

    Hortolà, Policarp

    2010-10-01

    Although micrographs from scanning electron microscopes (SEMs) are usually displayed in greyscale in the scientific-research literature, the colour resources provided by SEM-coupled image-acquiring systems and, secondarily, by free image-manipulation software deserve to be explored as a tool for colouring SEM micrographs of bloodstains. After greyscale SEM micrographs of a human blood smear (dark red to the naked eye) on grey chert were acquired, red-tone versions were produced manually using both the SEM-coupled image-acquiring system and a free image-manipulation program, and thermal-tone versions were generated automatically using the SEM-coupled system. The red images obtained with the SEM-coupled system showed lower visual-discrimination capability than the other coloured images, whereas the red images generated with the free software conveyed more scopic information than those from the SEM-coupled system. The thermal-tone images, although further from the real sample colour than the red ones, not only increased the realistic appearance relative to the greyscale images but also yielded the best visual-discrimination capability among all the coloured SEM micrographs, and markedly enhanced the relief effect relative to both the greyscale and the red images. Applying digital colour with the facilities of an SEM-coupled image-acquiring system or, when required, free image-manipulation software provides a user-friendly, quick, and inexpensive way of obtaining coloured SEM micrographs of bloodstains, avoiding sophisticated, time-consuming colouring procedures. Although this work focused on bloodstains, other monochromatic or quasi-monochromatic samples may well be amenable to the same simple colouring methods.
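
Red-tone colouring of a greyscale micrograph amounts to mapping each grey level into the red channel while leaving green and blue at zero, so luminance variation is preserved as variation in red. A minimal sketch of that mapping; this is a generic stand-in, not the specific tools used in the study:

```python
def grey_to_red(level):
    """Map an 8-bit grey level to an (R, G, B) red-tone pixel.

    Keeps all luminance variation in the red channel, a simple stand-in for
    red-tone colouring done with an image-manipulation program.
    """
    if not 0 <= level <= 255:
        raise ValueError("grey level must be 0-255")
    return (level, 0, 0)

def colour_micrograph(pixels):
    """Apply the mapping to a row-major list of greyscale rows."""
    return [[grey_to_red(p) for p in row] for row in pixels]

row = [0, 128, 255]
print(colour_micrograph([row])[0])  # [(0, 0, 0), (128, 0, 0), (255, 0, 0)]
```

A thermal (false-colour) tone would instead map grey levels through a blue-to-red lookup table; the structure of the code is the same, only the per-level mapping changes.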

  16. Analyzing latent state-trait and multiple-indicator latent growth curve models as multilevel structural equation models

    PubMed Central

    Geiser, Christian; Bishop, Jacob; Lockhart, Ginger; Shiffman, Saul; Grenard, Jerry L.

    2013-01-01

    Latent state-trait (LST) and latent growth curve (LGC) models are frequently used in the analysis of longitudinal data. Although it is well-known that standard single-indicator LGC models can be analyzed within either the structural equation modeling (SEM) or multilevel (ML; hierarchical linear modeling) frameworks, few researchers realize that LST and multivariate LGC models, which use multiple indicators at each time point, can also be specified as ML models. In the present paper, we demonstrate that using the ML-SEM rather than the single-level SEM (SL-SEM) framework to estimate the parameters of these models can be practical when the study involves (1) a large number of time points, (2) individually varying times of observation, (3) unequally spaced time intervals, and/or (4) incomplete data. Despite the practical advantages of the ML-SEM approach under these circumstances, there are also some limitations that researchers should consider. We present an application to an ecological momentary assessment study (N = 158 youths with an average of 23.49 observations of positive mood per person) using the software Mplus (Muthén and Muthén, 1998–2012) and discuss advantages and disadvantages of using the ML-SEM approach to estimate the parameters of LST and multiple-indicator LGC models. PMID:24416023

  17. Ensuring Positiveness of the Scaled Difference Chi-square Test Statistic.

    PubMed

    Satorra, Albert; Bentler, Peter M

    2010-06-01

    A scaled difference test statistic [Formula: see text] that can be computed by hand calculations from the standard output of structural equation modeling (SEM) software was proposed in Satorra and Bentler (2001). The statistic [Formula: see text] is asymptotically equivalent to the scaled difference test statistic T̄(d) introduced in Satorra (2000), which requires more involved computations beyond the standard output of SEM software. The test statistic [Formula: see text] has been widely used in practice, but in some applications it is negative due to negativity of its associated scaling correction. Using the implicit function theorem, this note develops an improved scaling correction leading to a new scaled difference statistic T̄(d) that avoids negative chi-square values.
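
The hand calculation from Satorra and Bentler (2001) takes the ML chi-squares, degrees of freedom, and robust (mean-scaled) chi-squares of two nested model fits and forms the scaled difference. A sketch with hypothetical fit values (the input numbers are invented; the formula is the published one):

```python
def scaled_diff_test(t0, df0, tsc0, t1, df1, tsc1):
    """Satorra-Bentler (2001) scaled difference chi-square, computed by hand.

    t0, t1: ML chi-square statistics of the more restrictive (t0) and less
    restrictive (t1) nested models; tsc0, tsc1: their robust (mean-scaled)
    counterparts as reported by SEM software.
    Returns (scaled difference statistic, its degrees of freedom).
    Note: cd can turn negative in some applications, which is exactly the
    problem the improved correction in this paper addresses.
    """
    c0, c1 = t0 / tsc0, t1 / tsc1        # scaling corrections of each model
    ddf = df0 - df1
    cd = (df0 * c0 - df1 * c1) / ddf     # scaling correction of the difference
    return (t0 - t1) / cd, ddf

# hypothetical output from two nested model fits:
td, ddf = scaled_diff_test(t0=90.0, df0=40, tsc0=75.0, t1=60.0, df1=35, tsc1=50.0)
print(td, ddf)  # 25.0 5
```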

  18. Modeling Latent Interactions at Level 2 in Multilevel Structural Equation Models: An Evaluation of Mean-Centered and Residual-Centered Unconstrained Approaches

    ERIC Educational Resources Information Center

    Leite, Walter L.; Zuo, Youzhen

    2011-01-01

    Among the many methods currently available for estimating latent variable interactions, the unconstrained approach is attractive to applied researchers because of its relatively easy implementation with any structural equation modeling (SEM) software. Using a Monte Carlo simulation study, we extended and evaluated the unconstrained approach to…

  19. OpenMx: An Open Source Extended Structural Equation Modeling Framework

    ERIC Educational Resources Information Center

    Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John

    2011-01-01

    OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are…

  20. Using genetic markers to orient the edges in quantitative trait networks: the NEO software.

    PubMed

    Aten, Jason E; Fuller, Tova F; Lusis, Aldons J; Horvath, Steve

    2008-04-15

    Systems genetic studies have been used to identify genetic loci that affect transcript abundances and clinical traits such as body weight. The pairwise correlations between gene expression traits and/or clinical traits can be used to define undirected trait networks. Several authors have argued that genetic markers (e.g., expression quantitative trait loci, eQTLs) can serve as causal anchors for orienting the edges of a trait network. The availability of hundreds of thousands of genetic markers poses new challenges: how to relate (anchor) traits to multiple genetic markers, how to score the genetic evidence in favor of an edge orientation, and how to weigh the information from multiple markers. We develop and implement Network Edge Orienting (NEO) methods and software that address the challenges of inferring unconfounded and directed gene networks from microarray-derived gene expression data by integrating mRNA levels with genetic marker data and Structural Equation Model (SEM) comparisons. The NEO software implements several manual and automatic methods for incorporating genetic information to anchor traits. The networks are oriented by considering each edge separately, thus reducing error propagation. To summarize the genetic evidence in favor of a given edge orientation, we propose Local SEM-based Edge Orienting (LEO) scores that compare the fit of several competing causal graphs. SEM fitting indices allow the user to assess local and overall model fit. The NEO software allows the user to carry out a robustness analysis with regard to genetic marker selection. We demonstrate the utility of NEO by recovering known causal relationships in the sterol homeostasis pathway using liver gene expression data from an F2 mouse cross. Further, we use NEO to study the relationship between a disease gene and a biologically important gene co-expression module in liver tissue. 
The NEO software can be used to orient the edges of gene co-expression networks or quantitative trait networks if the edges can be anchored to genetic marker data. R software tutorials, data, and supplementary material can be downloaded from: http://www.genetics.ucla.edu/labs/horvath/aten/NEO.
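
The idea behind a LEO score can be sketched as a log10 ratio of model-fit p-values: the favoured causal orientation is compared against the best-fitting competing orientation, so scores above zero favour the causal model. This is a hedged sketch of the scoring idea only, with invented p-values; it is not the NEO implementation:

```python
import math

def leo_score(p_causal, p_alternatives):
    """LEO-style edge-orienting score (sketch of the idea in Aten et al.).

    log10 ratio of the SEM model p-value of the favoured causal orientation
    (e.g. marker -> A -> B) to the best-fitting competing orientation.
    Larger positive scores mean stronger relative evidence for the causal model.
    """
    best_alt = max(p_alternatives)
    return math.log10(p_causal / best_alt)

# hypothetical model-fit p-values for one candidate edge orientation:
score = leo_score(0.30, [0.003, 0.001, 0.0005])
print(round(score, 1))  # 2.0 -> causal model fits ~100x better than next best
```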

  1. Massive metrology using fast e-beam technology improves OPC model accuracy by >2x at faster turnaround time

    NASA Astrophysics Data System (ADS)

    Zhao, Qian; Wang, Lei; Wang, Jazer; Wang, ChangAn; Shi, Hong-Fei; Guerrero, James; Feng, Mu; Zhang, Qiang; Liang, Jiao; Guo, Yunbo; Zhang, Chen; Wallow, Tom; Rio, David; Wang, Lester; Wang, Alvin; Wang, Jen-Shiang; Gronlund, Keith; Lang, Jun; Koh, Kar Kit; Zhang, Dong Qing; Zhang, Hongxin; Krishnamurthy, Subramanian; Fei, Ray; Lin, Chiawen; Fang, Wei; Wang, Fei

    2018-03-01

    Classical SEM metrology, CD-SEM, uses a low data rate and extensive frame averaging to achieve the high-quality SEM imaging needed for high-precision metrology. The drawbacks include prolonged data collection time and larger photoresist shrinkage due to excess electron dosage. This paper introduces a novel e-beam metrology system based on a high data rate, large probe current, and ultra-low-noise electron optics design. At the same level of metrology precision, this high-speed e-beam metrology system can significantly shorten data collection time and reduce electron dosage. In this work, the data collection speed is higher than 7,000 images per hour. Moreover, a novel large field of view (LFOV) capability at high resolution was enabled by an advanced electron deflection system design. The area covered by LFOV is >100x larger than classical SEM. Superior metrology precision throughout the whole image has been achieved, and high-quality metrology data can be extracted from the full field. This new metrology capability will further improve data collection speed to support the need for large volumes of metrology data for OPC model calibration at next-generation technology nodes. The shrinking EPE (Edge Placement Error) budget places more stringent requirements on OPC model accuracy, which is increasingly limited by metrology errors. In the current metrology data collection, data processing, and model calibration flow, CD-SEM throughput becomes a bottleneck that limits the amount of metrology measurements available for OPC model calibration, impacting pattern coverage and model accuracy, especially for 2D pattern prediction. To address the trade-off between metrology sampling and model accuracy under cycle time constraints, this paper employs the high-speed e-beam metrology system and a new computational software solution to take full advantage of the large data volume and significantly reduce both systematic and random metrology errors. 
    The new computational software enables users to generate a large quantity of highly accurate EP (Edge Placement) gauges and significantly improve design pattern coverage, with up to 5X gain in model prediction accuracy on complex 2D patterns. Overall, this work showed a >2x improvement in OPC model accuracy at a faster model turnaround time.
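
The stated rate of >7,000 images per hour translates directly into collection time for a given calibration set. A back-of-the-envelope sketch; the gauge count and the classical CD-SEM rate used for comparison are assumptions for illustration, not figures from the paper:

```python
def collection_hours(n_images, images_per_hour):
    """Wall-clock hours to collect a metrology image set at a given rate."""
    return n_images / images_per_hour

n = 70_000                            # assumed gauge count for a 2D-heavy OPC calibration
fast = collection_hours(n, 7_000)     # high-speed e-beam rate quoted in the paper
classical = collection_hours(n, 700)  # assumed classical CD-SEM rate, 10x slower
print(fast, classical)  # 10.0 100.0
```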

  2. Validity and reliability of balance assessment software using the Nintendo Wii balance board: usability and validation

    PubMed Central

    2014-01-01

    Background: A balance test provides important information, such as the standard by which to judge an individual's functional recovery or to predict falls. A balance-testing tool that is inexpensive and widely available is needed, especially in clinical settings. The Wii Balance Board (WBB) is designed to test balance, but little software is available for balance tests, and there are few studies on their reliability and validity. Thus, we developed balance assessment software using the Nintendo Wii Balance Board, investigated its reliability and validity, and compared it with a laboratory-grade force platform. Methods: Twenty healthy adults participated in our study. The participants took part in tests of inter-rater reliability, intra-rater reliability, and concurrent validity. The tests were performed with the balance assessment software using the Nintendo Wii Balance Board and with a laboratory-grade force platform. Data such as Center of Pressure (COP) path length and COP velocity were acquired from the assessment systems. Inter-rater reliability, intra-rater reliability, and concurrent validity were analyzed using intraclass correlation coefficient (ICC) values and the standard error of measurement (SEM). Results: The inter-rater reliability (ICC: 0.89-0.79, SEM in path length: 7.14-1.90, SEM in velocity: 0.74-0.07), intra-rater reliability (ICC: 0.92-0.70, SEM in path length: 7.59-2.04, SEM in velocity: 0.80-0.07), and concurrent validity (ICC: 0.87-0.73, SEM in path length: 5.94-0.32, SEM in velocity: 0.62-0.08) were high in terms of COP path length and COP velocity. Conclusion: The balance assessment software incorporating the Nintendo Wii Balance Board was found to be a reliable assessment device. In clinical settings, the device can be remarkably inexpensive, portable, and convenient for balance assessment. PMID:24912769

  3. Validity and reliability of balance assessment software using the Nintendo Wii balance board: usability and validation.

    PubMed

    Park, Dae-Sung; Lee, GyuChang

    2014-06-10

    A balance test provides important information, such as the standard by which to judge an individual's functional recovery or to predict falls. A balance-testing tool that is inexpensive and widely available is needed, especially in clinical settings. The Wii Balance Board (WBB) is designed to test balance, but little software is available for balance tests, and there are few studies on their reliability and validity. Thus, we developed balance assessment software using the Nintendo Wii Balance Board, investigated its reliability and validity, and compared it with a laboratory-grade force platform. Twenty healthy adults participated in our study. The participants took part in tests of inter-rater reliability, intra-rater reliability, and concurrent validity. The tests were performed with the balance assessment software using the Nintendo Wii Balance Board and with a laboratory-grade force platform. Data such as Center of Pressure (COP) path length and COP velocity were acquired from the assessment systems. Inter-rater reliability, intra-rater reliability, and concurrent validity were analyzed using intraclass correlation coefficient (ICC) values and the standard error of measurement (SEM). The inter-rater reliability (ICC: 0.89-0.79, SEM in path length: 7.14-1.90, SEM in velocity: 0.74-0.07), intra-rater reliability (ICC: 0.92-0.70, SEM in path length: 7.59-2.04, SEM in velocity: 0.80-0.07), and concurrent validity (ICC: 0.87-0.73, SEM in path length: 5.94-0.32, SEM in velocity: 0.62-0.08) were high in terms of COP path length and COP velocity. The balance assessment software incorporating the Nintendo Wii Balance Board was found to be a reliable assessment device. In clinical settings, the device can be remarkably inexpensive, portable, and convenient for balance assessment.
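
Here "SEM" is the standard error of measurement, which is commonly computed from the reliability coefficient as SEM = SD * sqrt(1 - ICC). A sketch of that standard psychometric relation; the SD value below is hypothetical, since the abstract reports only the resulting SEM ranges:

```python
import math

def sem_from_icc(sd, icc):
    """Standard error of measurement: SEM = SD * sqrt(1 - ICC).

    sd: between-subject standard deviation of the measure (e.g. COP path
    length); icc: reliability coefficient in [0, 1]. Higher reliability
    means a smaller measurement error band around an individual's score.
    """
    if not 0 <= icc <= 1:
        raise ValueError("ICC must lie in [0, 1]")
    return sd * math.sqrt(1.0 - icc)

# hypothetical: SD of 16 (cm) in COP path length with ICC = 0.89
print(round(sem_from_icc(16.0, 0.89), 2))  # 5.31
```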

  4. Investigating the Relationships among Metacognitive Strategy Training, Willingness to Read English Medical Texts, and Reading Comprehension Ability Using Structural Equation Modeling

    ERIC Educational Resources Information Center

    Hassanpour, Masoumeh; Ghonsooly, Behzad; Nooghabi, Mehdi Jabbari; Shafiee, Mohammad Naser

    2017-01-01

    This quasi-experimental study examined the relationship between students' metacognitive awareness and willingness to read English medical texts. So, a model was proposed and tested using structural equation modeling (SEM) with R software. Participants included 98 medical students of two classes. One class was assigned as the control group and the…

  5. Faults and foibles of quantitative scanning electron microscopy/energy dispersive x-ray spectrometry (SEM/EDS)

    NASA Astrophysics Data System (ADS)

    Newbury, Dale E.; Ritchie, Nicholas W. M.

    2012-06-01

    Scanning electron microscopy with energy dispersive x-ray spectrometry (SEM/EDS) is a powerful and flexible elemental analysis method that can identify and quantify elements with atomic numbers > 4 (Be) present as major constituents (concentration C > 0.1 mass fraction, or 10 weight percent), minor constituents (0.01 <= C <= 0.1), and trace constituents (C < 0.01, with a minimum detectable limit of ~±0.0005-0.001 under routine measurement conditions, a level that is analyte and matrix dependent). SEM/EDS can select specimen volumes with linear dimensions from ~500 nm to 5 μm depending on composition (masses ranging from ~10 pg to 100 pg) and can provide compositional maps that depict lateral elemental distributions. Despite the maturity of SEM/EDS, which has a history of more than 40 years, and the sophistication of modern analytical software, the method is vulnerable to serious shortcomings that can lead to incorrect elemental identifications and quantification errors that significantly exceed reasonable expectations. This paper describes shortcomings in peak identification procedures, limitations on the accuracy of quantitative analysis due to specimen topography or failures in physical models for matrix corrections, and quantitative artifacts encountered in x-ray elemental mapping. Effective solutions to these problems are based on understanding the causes and then establishing appropriate measurement science protocols. NIST DTSA II and Lispix are open-source analytical software, available free at www.nist.gov, that can aid the analyst in overcoming significant limitations of SEM/EDS.
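    The major/minor/trace ranges quoted above map directly onto a small classification helper (an illustrative sketch; the function name and interface are my own, not from any EDS package):

```python
def classify_constituent(c):
    """Classify a mass-fraction concentration using the common SEM/EDS
    conventions cited above: major (C > 0.1), minor (0.01 <= C <= 0.1),
    trace (C < 0.01)."""
    if not 0.0 <= c <= 1.0:
        raise ValueError("mass fraction must lie in [0, 1]")
    if c > 0.1:
        return "major"
    if c >= 0.01:
        return "minor"
    return "trace"
```

    So a constituent at 10 weight percent (C = 0.1) is classed as minor, and anything below the ~0.0005-0.001 detection limit would simply not be reported at all.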

  6. SEM-microphotogrammetry, a new take on an old method for generating high-resolution 3D models from SEM images.

    PubMed

    Ball, A D; Job, P A; Walker, A E L

    2017-08-01

    The method we present here uses a scanning electron microscope programmed via macros to automatically capture dozens of images at suitable angles to generate accurate, detailed three-dimensional (3D) surface models with micron-scale resolution. We demonstrate that it is possible to use these Scanning Electron Microscope (SEM) images in conjunction with commercially available software originally developed for photogrammetry reconstructions from Digital Single Lens Reflex (DSLR) cameras and to reconstruct 3D models of the specimen. These 3D models can then be exported as polygon meshes and eventually 3D printed. This technique offers the potential to obtain data suitable to reconstruct very tiny features (e.g. diatoms, butterfly scales and mineral fabrics) at nanometre resolution. Ultimately, we foresee this as being a useful tool for better understanding spatial relationships at very high resolution. However, our motivation is also to use it to produce 3D models to be used in public outreach events and exhibitions, especially for the blind or partially sighted. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  7. Photogrammetry of the three-dimensional shape and texture of a nanoscale particle using scanning electron microscopy and free software.

    PubMed

    Gontard, Lionel C; Schierholz, Roland; Yu, Shicheng; Cintas, Jesús; Dunin-Borkowski, Rafal E

    2016-10-01

    We apply photogrammetry in a scanning electron microscope (SEM) to study the three-dimensional shape and surface texture of a nanoscale LiTi2(PO4)3 particle. We highlight the fact that the technique can be applied non-invasively in any SEM using free software (freeware) and does not require special sample preparation. Three-dimensional information is obtained in the form of a surface mesh, with the texture of the sample stored as a separate two-dimensional image (referred to as a UV Map). The mesh can be used to measure parameters such as surface area, volume, moment of inertia and center of mass, while the UV map can be used to study the surface texture using conventional image processing techniques. We also illustrate the use of 3D printing to visualize the reconstructed model. Copyright © 2016 Elsevier B.V. All rights reserved.
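    As a sketch of how quantities like volume and center of mass fall out of such a surface mesh (not the authors' code): for a closed, consistently oriented triangle mesh, summing signed tetrahedra against the origin (the divergence theorem) gives both in one pass:

```python
import numpy as np

def mesh_volume_and_com(vertices, faces):
    """Volume and center of mass of a closed triangulated surface mesh,
    computed from signed tetrahedra (origin, p0, p1, p2).
    Faces must be consistently oriented with outward normals."""
    v = np.asarray(vertices, dtype=float)
    total_vol = 0.0
    com = np.zeros(3)
    for a, b, c in faces:
        p0, p1, p2 = v[a], v[b], v[c]
        vol = np.dot(p0, np.cross(p1, p2)) / 6.0  # signed tetrahedron volume
        total_vol += vol
        com += vol * (p0 + p1 + p2) / 4.0         # tetrahedron centroid (4th vertex is the origin)
    return total_vol, com / total_vol
```

    Running this on a triangulated unit cube returns a volume of 1 and a center of mass at (0.5, 0.5, 0.5); surface area and moments of inertia can be accumulated in the same loop.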

  8. Ensuring Positiveness of the Scaled Difference Chi-Square Test Statistic

    ERIC Educational Resources Information Center

    Satorra, Albert; Bentler, Peter M.

    2010-01-01

    A scaled difference test statistic T̃_d that can be computed from standard software for structural equation models (SEM) by hand calculations was proposed in Satorra and Bentler (Psychometrika 66:507-514, 2001). The statistic T̃_d is asymptotically equivalent to the scaled difference test statistic T̄…
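    The "hand calculation" referred to above is short. A sketch of the 2001 formula (variable names are mine): T0 and T1 are the uncorrected chi-square statistics of the nested and comparison models, c0 and c1 their scaling corrections, and d0 > d1 their degrees of freedom:

```python
def scaled_diff_chisq(T0, df0, c0, T1, df1, c1):
    """Satorra-Bentler (2001) scaled difference chi-square.

    T0, df0, c0: unscaled chi-square, df, scaling correction of the
                 nested (more restrictive) model.
    T1, df1, c1: the same for the comparison model.
    Returns (T_d, df_d). Note that the scaling factor c_d computed this
    way can come out negative in small samples, which motivates the
    strictly positive variant derived in the 2010 paper.
    """
    df_d = df0 - df1
    c_d = (df0 * c0 - df1 * c1) / df_d
    return (T0 - T1) / c_d, df_d
```

    For example, with T0 = 50 (df 10, correction 1.2) and T1 = 30 (df 8, correction 1.1), c_d = (12 − 8.8)/2 = 1.6 and the scaled difference statistic is 20/1.6 = 12.5 on 2 degrees of freedom.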

  9. Generating 3D and 3D-like animations of strongly uneven surface microareas of bloodstains from small series of partially out-of-focus digital SEM micrographs.

    PubMed

    Hortolà, Policarp

    2010-01-01

    When dealing with microscopic still images of some kinds of samples, the out-of-focus problem is a particularly serious limiting factor for the subsequent generation of fully sharp 3D animations. In order to produce fully focused 3D animations of strongly uneven surface microareas, a vertical stack of six digital secondary-electron SEM micrographs of a human bloodstain microarea was acquired. Afterwards, single combined images were generated using macrophotography and light-microscope image post-processing software. Subsequently, 3D animations of texture and topography were obtained in different formats using a combination of software tools. Finally, a 3D-like animation of a texture-topography composite was obtained in different formats using another combination of software tools. On the one hand, the results indicate that image post-processing software not primarily concerned with electron micrographs makes it easy to obtain fully focused images of strongly uneven surface microareas of bloodstains from small series of partially out-of-focus digital SEM micrographs. On the other hand, the results also indicate that such small series of electron micrographs can be used to generate 3D and 3D-like animations that can subsequently be converted into different formats, using user-friendly software facilities not originally designed for use in SEM that are easily available from the Internet. Although the focus of this study was on bloodstains, its methods are probably also relevant for studying the surface microstructures of other organic or inorganic materials whose sharp display is difficult to obtain from a single SEM micrograph.
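    The core of the focus-stacking step described above can be illustrated in a few lines (a minimal sketch of the general technique, not the software the study used): for each pixel, keep the frame of the z-stack with the strongest local detail.

```python
import numpy as np

def focus_stack(images):
    """Merge a z-stack of grayscale images into one all-in-focus image by
    keeping, per pixel, the frame with the largest local sharpness
    (absolute discrete-Laplacian response). Real stacking software adds
    frame alignment and smooth blending on top of this idea."""
    stack = np.asarray(images, dtype=float)   # shape (n_frames, h, w)
    sharp = np.empty_like(stack)
    for i, img in enumerate(stack):
        # 4-neighbour discrete Laplacian as a crude sharpness measure
        lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
               + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
        sharp[i] = np.abs(lap)
    best = np.argmax(sharp, axis=0)           # (h, w) index of sharpest frame
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]
```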

  10. Semantics-Based Composition of Integrated Cardiomyocyte Models Motivated by Real-World Use Cases.

    PubMed

    Neal, Maxwell L; Carlson, Brian E; Thompson, Christopher T; James, Ryan C; Kim, Karam G; Tran, Kenneth; Crampin, Edmund J; Cook, Daniel L; Gennari, John H

    2015-01-01

    Semantics-based model composition is an approach for generating complex biosimulation models from existing components that relies on capturing the biological meaning of model elements in a machine-readable fashion. This approach allows the user to work at the biological rather than computational level of abstraction and helps minimize the amount of manual effort required for model composition. To support this compositional approach, we have developed the SemGen software, and here report on SemGen's semantics-based merging capabilities using real-world modeling use cases. We successfully reproduced a large, manually-encoded, multi-model merge: the "Pandit-Hinch-Niederer" (PHN) cardiomyocyte excitation-contraction model, previously developed using CellML. We describe our approach for annotating the three component models used in the PHN composition and for merging them at the biological level of abstraction within SemGen. We demonstrate that we were able to reproduce the original PHN model results in a semi-automated, semantics-based fashion and also rapidly generate a second, novel cardiomyocyte model composed using an alternative, independently-developed tension generation component. We discuss the time-saving features of our compositional approach in the context of these merging exercises, the limitations we encountered, and potential solutions for enhancing the approach.

  11. Semantics-Based Composition of Integrated Cardiomyocyte Models Motivated by Real-World Use Cases

    PubMed Central

    Neal, Maxwell L.; Carlson, Brian E.; Thompson, Christopher T.; James, Ryan C.; Kim, Karam G.; Tran, Kenneth; Crampin, Edmund J.; Cook, Daniel L.; Gennari, John H.

    2015-01-01

    Semantics-based model composition is an approach for generating complex biosimulation models from existing components that relies on capturing the biological meaning of model elements in a machine-readable fashion. This approach allows the user to work at the biological rather than computational level of abstraction and helps minimize the amount of manual effort required for model composition. To support this compositional approach, we have developed the SemGen software, and here report on SemGen’s semantics-based merging capabilities using real-world modeling use cases. We successfully reproduced a large, manually-encoded, multi-model merge: the “Pandit-Hinch-Niederer” (PHN) cardiomyocyte excitation-contraction model, previously developed using CellML. We describe our approach for annotating the three component models used in the PHN composition and for merging them at the biological level of abstraction within SemGen. We demonstrate that we were able to reproduce the original PHN model results in a semi-automated, semantics-based fashion and also rapidly generate a second, novel cardiomyocyte model composed using an alternative, independently-developed tension generation component. We discuss the time-saving features of our compositional approach in the context of these merging exercises, the limitations we encountered, and potential solutions for enhancing the approach. PMID:26716837

  12. Modeling of Individual and Organizational Factors Affecting Traumatic Occupational Injuries Based on the Structural Equation Modeling: A Case Study in Large Construction Industries.

    PubMed

    Mohammadfam, Iraj; Soltanzadeh, Ahmad; Moghimbeigi, Abbas; Akbarzadeh, Mehdi

    2016-09-01

    Individual and organizational factors influence traumatic occupational injuries. The aim of the present study was a short path analysis of the severity of occupational injuries based on individual and organizational factors. This cross-sectional analytical study examined traumatic occupational injuries over a ten-year timeframe in 13 large Iranian construction industries. Modeling and data analysis were done using the structural equation modeling (SEM) approach and the IBM SPSS AMOS statistical software, version 22.0, respectively. The mean age and working experience of the injured workers were 28.03 ± 5.33 and 4.53 ± 3.82 years, respectively. Construction and installation activities accounted for 64.4% and 18.1% of traumatic occupational injuries, respectively. The SEM findings showed that individual, organizational, and accident-type factors were significant determinants of occupational injury severity (P < 0.05). Path analysis of occupational injuries based on the SEM reveals that individual and organizational factors and their indicator variables strongly influence the severity of traumatic occupational injuries, so they should be considered in efforts to reduce the severity of occupational accidents in large construction industries.

  13. EQS Goes R: Simulations for SEM Using the Package REQS

    ERIC Educational Resources Information Center

    Mair, Patrick; Wu, Eric; Bentler, Peter M.

    2010-01-01

    The REQS package is an interface between the R environment of statistical computing and the EQS software for structural equation modeling. The package consists of 3 main functions that read EQS script files and import the results into R, call EQS script files from R, and run EQS script files from R and import the results after EQS computations.…

  14. Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Daniel S.; Tandon, Lav

    The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.
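    As a sketch of the kind of particle morphology measurement at issue (illustrative only; the metric names and interface are my own, not MAMA's actual output), standard shape descriptors follow from pixel counts and the pixel calibration, and each carries uncertainty inherited from segmentation and calibration:

```python
import math

def particle_metrics(area_px, perimeter_px, nm_per_px):
    """Common morphological descriptors from a segmented particle:
    physical area, equivalent-circle diameter, and circularity
    (4*pi*A / P^2, which equals 1 for a perfect circle)."""
    area = area_px * nm_per_px ** 2
    perim = perimeter_px * nm_per_px
    return {
        "area_nm2": area,
        "ecd_nm": 2.0 * math.sqrt(area / math.pi),
        "circularity": 4.0 * math.pi * area / perim ** 2,
    }
```

    A disc of radius 10 px gives circularity 1.0 and an equivalent-circle diameter of 20 pixel-widths; propagating the uncertainty in `nm_per_px` and in the segmented boundary through these formulas is exactly the kind of analysis the abstract calls for.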

  15. Modeling of Individual and Organizational Factors Affecting Traumatic Occupational Injuries Based on the Structural Equation Modeling: A Case Study in Large Construction Industries

    PubMed Central

    Mohammadfam, Iraj; Soltanzadeh, Ahmad; Moghimbeigi, Abbas; Akbarzadeh, Mehdi

    2016-01-01

    Background Individual and organizational factors influence traumatic occupational injuries. Objectives The aim of the present study was a short path analysis of the severity of occupational injuries based on individual and organizational factors. Materials and Methods This cross-sectional analytical study examined traumatic occupational injuries over a ten-year timeframe in 13 large Iranian construction industries. Modeling and data analysis were done using the structural equation modeling (SEM) approach and the IBM SPSS AMOS statistical software, version 22.0, respectively. Results The mean age and working experience of the injured workers were 28.03 ± 5.33 and 4.53 ± 3.82 years, respectively. Construction and installation activities accounted for 64.4% and 18.1% of traumatic occupational injuries, respectively. The SEM findings showed that individual, organizational, and accident-type factors were significant determinants of occupational injury severity (P < 0.05). Conclusions Path analysis of occupational injuries based on the SEM reveals that individual and organizational factors and their indicator variables strongly influence the severity of traumatic occupational injuries, so they should be considered in efforts to reduce the severity of occupational accidents in large construction industries. PMID:27800465

  16. Deriving the Cost of Software Maintenance for Software Intensive Systems

    DTIC Science & Technology

    2011-08-29

    more of software maintenance). Figure 4. SEER-SEM Maintenance Effort by Year Report (Reifer, Allen, Fersch, Hitchings, Judy , & Rosa, 2010...understand the linear relationship between two variables. The formula for the simple Pearson product-moment correlation is represented in Equation 5...standardization is required across the software maintenance community in order to ensure that the data being recorded can be employed beyond the agency or

  17. Multivariate normal maximum likelihood with both ordinal and continuous variables, and data missing at random.

    PubMed

    Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C

    2018-04-01

    A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.
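    The ordinal side of such a model can be sketched briefly (an illustration of the general probit idea, not OpenMx code): under a probit model, each ordinal indicator's category probabilities come from cutting a latent normal variable at estimated thresholds, and these cell probabilities are the factors the conditional split p(continuous, ordinal) = p(continuous) · p(ordinal | continuous) contributes for the ordinal part.

```python
from math import erf, sqrt, inf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def ordinal_probs(mu, sigma, thresholds):
    """Category probabilities of an ordinal probit indicator: the latent
    response N(mu, sigma^2) is cut at the given (sorted) thresholds,
    yielding len(thresholds) + 1 categories."""
    cuts = [-inf] + list(thresholds) + [inf]
    return [norm_cdf((b - mu) / sigma) - norm_cdf((a - mu) / sigma)
            for a, b in zip(cuts[:-1], cuts[1:])]
```

    For a latent N(0, 1) with a single threshold at 0, this returns probabilities (0.5, 0.5); in the full multivariate case the same cell probabilities become rectangle integrals of a multivariate normal, which is where the integration cost over up to 13 ordinal variables comes from.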

  18. Deep machine learning based Image classification in hard disk drive manufacturing (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Rana, Narender; Chien, Chester

    2018-03-01

    A key sensor element in a hard disk drive (HDD) is the read-write head device. The device has a complex 3D shape, and its fabrication requires over a thousand process steps, many of them various types of image inspection and critical dimension (CD) metrology. In order to achieve high device yield across a wafer, very tight inspection and metrology specifications are implemented. Many images are collected on a wafer and inspected for various types of defects, and in CD metrology the quality of the image affects the CD measurements. Metrology noise must be minimized in CD metrology to better estimate process-related variations for implementing robust process controls. Specialized tools are available for defect inspection and review that support classification and statistics; however, when such advanced tools are unavailable, or for other reasons, images must often be inspected manually. SEM image inspection and CD-SEM metrology tools are separate tools, differing in both software and purpose. There have been cases where a significant number of CD-SEM images were blurred or contained some artefact, creating a need for image inspection alongside the CD measurement. The tool may not report a practical metric highlighting image quality, and not filtering out CDs measured from blurred images adds metrology noise to the CD measurement. An image classifier can help filter such data. This paper presents the use of artificial intelligence to classify SEM images: deep machine learning is used to train a neural network, which then classifies new images as blurred or not blurred. Figure 1 shows the image-blur artefact and the contingency table of classification results from the trained deep neural network. A prediction accuracy of 94.9% was achieved with the first model. The paper covers other such applications of deep neural networks in image classification for inspection, review, and metrology.
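    A classic non-ML baseline for the blurred/not-blurred filtering described above is a gradient-energy sharpness score (this is a simple illustrative stand-in; the paper itself trains a deep neural network for the task):

```python
import numpy as np

def blur_score(img):
    """Mean squared image gradient: a simple, non-ML sharpness score.

    Low scores suggest a blurred image; a threshold tuned on known-good
    frames could flag candidates for exclusion from CD statistics.
    """
    img = np.asarray(img, dtype=float)
    gx = np.diff(img, axis=1)   # horizontal finite differences
    gy = np.diff(img, axis=0)   # vertical finite differences
    return (gx ** 2).mean() + (gy ** 2).mean()
```

    A sharp, high-contrast image scores far higher than a flat or defocused one, so even this crude metric separates the obvious cases that would otherwise inject metrology noise.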

  19. A computer-guided minimally-invasive technique for orthodontic forced eruption of impacted canines.

    PubMed

    Bertelè, Matteo; Minniti, Paola P; Dalessandri, Domenico; Bonetti, Stefano; Visconti, Luca; Paganelli, Corrado

    2016-06-01

    The aim of this study was to develop a computer-guided, minimally-invasive protocol for applying orthodontic traction during the forced eruption of an impacted canine. 3Diagnosys® software was used to evaluate the position of impacted canines and to plan the surgical access, taking into account soft- and hard-tissue thickness, the orthodontic traction path, and the presence of possible obstacles. Geomagic® software was used for reverse engineering, and Rhinoceros™ software was employed as a three-dimensional modeller in preparing individualized surgical guides. Surgical access was gained flapless through the use of a mucosal punch for soft tissues, followed by a trephine bur with a pre-adjusted stop for bone path creation. A diamond bur mounted on a SONICflex® 2003/L handpiece was used to prepare a 2-mm-deep calibrated hole in the canine enamel, into which a titanium screw connected to a stainless steel ligature was screwed. In vitro pull-out tests and radiological and SEM analyses were performed to investigate screw stability and position. In two out of ten samples the screw was removed after the application of a 1-kg pull-out force. Radiological and SEM analyses demonstrated that all the screws were inserted into the enamel without affecting dentine integrity. This computer-guided, minimally-invasive technique allowed precise and reliable positioning of the screws used during the orthodontic traction of impacted canines.

  20. Automated Transmission-Mode Scanning Electron Microscopy (tSEM) for Large Volume Analysis at Nanoscale Resolution

    PubMed Central

    Kuwajima, Masaaki; Mendenhall, John M.; Lindsey, Laurence F.; Harris, Kristen M.

    2013-01-01

    Transmission-mode scanning electron microscopy (tSEM) on a field emission SEM platform was developed for efficient and cost-effective imaging of circuit-scale volumes from brain at nanoscale resolution. Image area was maximized while optimizing the resolution and dynamic range necessary for discriminating key subcellular structures, such as small axonal, dendritic and glial processes, synapses, smooth endoplasmic reticulum, vesicles, microtubules, polyribosomes, and endosomes which are critical for neuronal function. Individual image fields from the tSEM system were up to 4,295 µm2 (65.54 µm per side) at 2 nm pixel size, contrasting with image fields from a modern transmission electron microscope (TEM) system, which were only 66.59 µm2 (8.160 µm per side) at the same pixel size. The tSEM produced outstanding images and had reduced distortion and drift relative to TEM. Automated stage and scan control in tSEM easily provided unattended serial section imaging and montaging. Lens and scan properties on both TEM and SEM platforms revealed no significant nonlinear distortions within a central field of ∼100 µm2 and produced near-perfect image registration across serial sections using the computational elastic alignment tool in Fiji/TrakEM2 software, and reliable geometric measurements from RECONSTRUCT™ or Fiji/TrakEM2 software. Axial resolution limits the analysis of small structures contained within a section (∼45 nm). Since this new tSEM is non-destructive, objects within a section can be explored at finer axial resolution in TEM tomography with current methods. Future development of tSEM tomography promises thinner axial resolution producing nearly isotropic voxels and should provide within-section analyses of structures without changing platforms. Brain was the test system given our interest in synaptic connectivity and plasticity; however, the new tSEM system is readily applicable to other biological systems. PMID:23555711
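    The field-size figures quoted above can be sanity-checked with simple arithmetic (a 65.54 µm side at 2 nm per pixel is 32,770 pixels per side):

```python
# Sanity-check the image-field figures quoted in the abstract,
# both at a 2 nm pixel size.
pixel_nm = 2.0
tsem_side_um = 65.54   # tSEM field side length
tem_side_um = 8.160    # TEM field side length

tsem_area_um2 = tsem_side_um ** 2                # ~4295.49 ("up to 4,295 um^2")
tem_area_um2 = tem_side_um ** 2                  # ~66.59 um^2
tsem_side_px = tsem_side_um * 1000.0 / pixel_nm  # 32,770 pixels per side
```

    The tSEM field is thus roughly 65 times the area of the TEM field at the same pixel size, which is the basis of the efficiency claim.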

  1. Determinants of quality of life in patients with fibromyalgia: A structural equation modeling approach.

    PubMed

    Lee, Jeong-Won; Lee, Kyung-Eun; Park, Dong-Jin; Kim, Seong-Ho; Nah, Seong-Su; Lee, Ji Hyun; Kim, Seong-Kyu; Lee, Yeon-Ah; Hong, Seung-Jae; Kim, Hyun-Sook; Lee, Hye-Soon; Kim, Hyoun Ah; Joung, Chung-Il; Kim, Sang-Hyon; Lee, Shin-Seok

    2017-01-01

    Health-related quality of life (HRQOL) in patients with fibromyalgia (FM) is lower than in patients with other chronic diseases and the general population. Although various factors affect HRQOL, no study has examined a structural equation model of HRQOL as an outcome variable in FM patients. The present study assessed relationships among physical function, social factors, psychological factors, and HRQOL, and the effects of these variables on HRQOL in a hypothesized model using structural equation modeling (SEM). HRQOL was measured using SF-36, and the Fibromyalgia Impact Questionnaire (FIQ) was used to assess physical dysfunction. Social and psychological statuses were assessed using the Beck Depression Inventory (BDI), the State-Trait Anxiety Inventory (STAI), the Arthritis Self-Efficacy Scale (ASES), and the Social Support Scale. SEM analysis was used to test the structural relationships of the model using the AMOS software. Of the 336 patients, 301 (89.6%) were women with an average age of 47.9±10.9 years. The SEM results supported the hypothesized structural model (χ2 = 2.336, df = 3, p = 0.506). The final model showed that Physical Component Summary (PCS) was directly related to self-efficacy and inversely related to FIQ, and that Mental Component Summary (MCS) was inversely related to FIQ, BDI, and STAI. In our model of FM patients, HRQOL was affected by physical, social, and psychological variables. In these patients, higher levels of physical function and self-efficacy can improve the PCS of HRQOL, while physical function, depression, and anxiety negatively affect the MCS of HRQOL.

  2. Determinants of quality of life in patients with fibromyalgia: A structural equation modeling approach

    PubMed Central

    Lee, Jeong-Won; Lee, Kyung-Eun; Park, Dong-Jin; Kim, Seong-Ho; Nah, Seong-Su; Lee, Ji Hyun; Kim, Seong-Kyu; Lee, Yeon-Ah; Hong, Seung-Jae; Kim, Hyun-Sook; Lee, Hye-Soon; Kim, Hyoun Ah; Joung, Chung-Il; Kim, Sang-Hyon

    2017-01-01

    Objective Health-related quality of life (HRQOL) in patients with fibromyalgia (FM) is lower than in patients with other chronic diseases and the general population. Although various factors affect HRQOL, no study has examined a structural equation model of HRQOL as an outcome variable in FM patients. The present study assessed relationships among physical function, social factors, psychological factors, and HRQOL, and the effects of these variables on HRQOL in a hypothesized model using structural equation modeling (SEM). Methods HRQOL was measured using SF-36, and the Fibromyalgia Impact Questionnaire (FIQ) was used to assess physical dysfunction. Social and psychological statuses were assessed using the Beck Depression Inventory (BDI), the State-Trait Anxiety Inventory (STAI), the Arthritis Self-Efficacy Scale (ASES), and the Social Support Scale. SEM analysis was used to test the structural relationships of the model using the AMOS software. Results Of the 336 patients, 301 (89.6%) were women with an average age of 47.9±10.9 years. The SEM results supported the hypothesized structural model (χ2 = 2.336, df = 3, p = 0.506). The final model showed that Physical Component Summary (PCS) was directly related to self-efficacy and inversely related to FIQ, and that Mental Component Summary (MCS) was inversely related to FIQ, BDI, and STAI. Conclusions In our model of FM patients, HRQOL was affected by physical, social, and psychological variables. In these patients, higher levels of physical function and self-efficacy can improve the PCS of HRQOL, while physical function, depression, and anxiety negatively affect the MCS of HRQOL. PMID:28158289

  3. Using Statechart Assertion for the Formal Validation and Verification of a Real-Time Software System: A Case Study

    DTIC Science & Technology

    2011-03-01

    could be an entry point into a repeated task (or thread). The following example uses binary semaphores. The VxWorks operating system utilizes binary... semaphores via system calls: SemTake and SemGive. These semaphores are used primarily for mutual exclusion to protect resources from being accessed
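    The SemTake/SemGive pattern mentioned in the excerpt can be sketched with Python's standard threading primitives (a stand-in for the VxWorks API, not VxWorks code):

```python
import threading

# A binary semaphore used for mutual exclusion, mirroring the
# VxWorks SemTake/SemGive pattern: take before touching the shared
# resource, give when done.
sem = threading.Semaphore(1)
counter = 0  # the shared resource being protected

def worker(n):
    global counter
    for _ in range(n):
        sem.acquire()    # SemTake: enter the critical section
        counter += 1     # protected update of the shared resource
        sem.release()    # SemGive: leave the critical section

threads = [threading.Thread(target=worker, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 4000: no increments were lost
```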

  4. High Resolution X-Ray Micro-CT of Ultra-Thin Wall Space Components

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Rauser, R. W.; Bowman, Randy R.; Bonacuse, Peter; Martin, Richard E.; Locci, I. E.; Kelley, M.

    2012-01-01

    A high resolution micro-CT system has been assembled and is being used to provide optimal characterization for ultra-thin wall space components. The Glenn Research Center NDE Sciences Team, using this CT system, has assumed the role of inspection vendor for the Advanced Stirling Convertor (ASC) project at NASA. This article will discuss many aspects of the development of the CT scanning for this type of component, including CT system overview; inspection requirements; process development, software utilized and developed to visualize, process, and analyze results; calibration sample development; results on actual samples; correlation with optical/SEM characterization; CT modeling; and development of automatic flaw recognition software. Keywords: Nondestructive Evaluation, NDE, Computed Tomography, Imaging, X-ray, Metallic Components, Thin Wall Inspection

  5. Sensitivity Analysis of Multiple Informant Models When Data are Not Missing at Random

    PubMed Central

    Blozis, Shelley A.; Ge, Xiaojia; Xu, Shu; Natsuaki, Misaki N.; Shaw, Daniel S.; Neiderhiser, Jenae; Scaramella, Laura; Leve, Leslie; Reiss, David

    2014-01-01

    Missing data are common in studies that rely on multiple informant data to evaluate relationships among variables for distinguishable individuals clustered within groups. Estimation of structural equation models using raw data allows for incomplete data, and so all groups may be retained even if only one member of a group contributes data. Statistical inference is based on the assumption that data are missing completely at random or missing at random. Importantly, whether or not data are missing is assumed to be independent of the missing data. A saturated correlates model that incorporates correlates of the missingness or the missing data into an analysis and multiple imputation that may also use such correlates offer advantages over the standard implementation of SEM when data are not missing at random because these approaches may result in a data analysis problem for which the missingness is ignorable. This paper considers these approaches in an analysis of family data to assess the sensitivity of parameter estimates to assumptions about missing data, a strategy that may be easily implemented using SEM software. PMID:25221420

  6. Ag2S atomic switch-based `tug of war' for decision making

    NASA Astrophysics Data System (ADS)

    Lutz, C.; Hasegawa, T.; Chikyow, T.

    2016-07-01

    For a computing process such as making a decision, a software-controlled chip of several transistors is necessary. Inspired by how a single-cell amoeba decides its movements, the theoretical `tug of war' computing model was proposed but has not yet been implemented in an analogue device suitable for integrated circuits. Based on this model, we have now developed a new electronic element for decision-making processes that requires no prior programming. The devices are based on the growth and shrinkage of Ag filaments in α-Ag2+δS gap-type atomic switches. Here we present the adapted device design and the new materials, and demonstrate the basic `tug of war' operation by IV-measurements and scanning electron microscopy (SEM) observation. These devices could be the basis for a CMOS-free new computer architecture. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr00690f

  7. A hybrid method for the computation of quasi-3D seismograms.

    NASA Astrophysics Data System (ADS)

    Masson, Yder; Romanowicz, Barbara

    2013-04-01

    The development of powerful computer clusters and efficient numerical methods, such as the Spectral Element Method (SEM), has made possible the computation of seismic wave propagation in a heterogeneous 3D Earth. However, the cost of these computations is still problematic for global-scale tomography, which requires hundreds of such simulations. Part of the ongoing research effort is dedicated to the development of faster modeling methods based on the spectral element method. Capdeville et al. (2002) proposed coupling SEM simulations with normal-mode calculations (C-SEM). Nissen-Meyer et al. (2007) used 2D SEM simulations to compute 3D seismograms in a 1D earth model. Thanks to these developments, and for the first time, Lekic et al. (2011) developed a 3D global model of the upper mantle using SEM simulations. At the local and continental scale, adjoint tomography, which requires many SEM simulations, can be implemented on current computers (Tape et al. 2009). Due to their smaller size, these models offer higher resolution and provide images of the crust and the upper part of the mantle. In an attempt to extend such local adjoint tomographic inversions into the deep Earth, we are developing a hybrid method in which SEM computations are limited to a region of interest within the Earth. That region can have an arbitrary shape and size. Outside this region, the seismic wavefield is extrapolated to obtain synthetic data at the Earth's surface. A key feature of the method is the use of a time-reversal mirror to inject the wavefield induced by a distant seismic source into the region of interest (Robertsson and Chapman 2000). We compute synthetic seismograms as follows: inside the region of interest, we use the regional spectral element software RegSEM to compute wave propagation in 3D; outside this region, the wavefield is extrapolated to the surface by convolution with the Green's functions from the mirror to the seismic stations.
    For now, these Green's functions are computed using 2D SEM simulations in a 1D Earth model. Such seismograms account for the 3D structure inside the region of interest in a quasi-exact manner. Later we plan to extrapolate the misfit function computed from such seismograms at the stations back into the SEM region in order to compute local adjoint kernels. This opens a new path toward regional adjoint tomography of the deep Earth. Capdeville, Y., et al. (2002). "Coupling the spectral element method with a modal solution for elastic wave propagation in global Earth models." Geophysical Journal International 152(1): 34-67. Lekic, V. and B. Romanowicz (2011). "Inferring upper-mantle structure by full waveform tomography with the spectral element method." Geophysical Journal International 185(2): 799-831. Nissen-Meyer, T., et al. (2007). "A two-dimensional spectral-element method for computing spherical-earth seismograms-I. Moment-tensor source." Geophysical Journal International 168(3): 1067-1092. Robertsson, J. O. A. and C. H. Chapman (2000). "An efficient method for calculating finite-difference seismograms after model alterations." Geophysics 65(3): 907-918. Tape, C., et al. (2009). "Adjoint tomography of the southern California crust." Science 325(5943): 988-992.
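
    The extrapolation step the abstract describes, convolving the wavefield recorded on the mirror with Green's functions toward the stations, can be sketched in 1-D (toy example with an invented Ricker wavelet and a made-up two-arrival Green's function; RegSEM itself is not involved):

    ```python
    import numpy as np

    dt = 0.01                      # time step in seconds
    t = np.arange(0, 4, dt)

    # hypothetical wavefield recorded on the mirror: a Ricker wavelet centred at 0.5 s
    f0, t0 = 5.0, 0.5
    a = (np.pi * f0 * (t - t0)) ** 2
    wavefield = (1 - 2 * a) * np.exp(-a)

    # hypothetical Green's function mirror -> station: a direct arrival at 1.0 s
    # plus a weaker reflected arrival at 1.8 s
    green = np.zeros_like(t)
    green[int(1.0 / dt)] = 1.0
    green[int(1.8 / dt)] = 0.4

    # synthetic seismogram at the station: discrete convolution (scaled by dt)
    seismogram = np.convolve(wavefield, green)[: t.size] * dt
    ```

    The main arrival lands at 0.5 s + 1.0 s = 1.5 s, i.e. the source pulse delayed by the Green's function travel time, with a weaker echo 0.8 s later.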

  8. Hierarchical Structure of Articular Bone-Cartilage Interface and Its Potential Application for Osteochondral Tissue Engineering

    NASA Astrophysics Data System (ADS)

    Bian, Weiguo; Qin, Lian; Li, Dichen; Wang, Jin; Jin, Zhongmin

    2010-09-01

    An artificial biodegradable osteochondral construct is one of the most promising lifetime substitutes in joint replacement, and the complex hierarchical structure of the natural joint is important in developing such a construct. However, the architectural features of the interface between cartilage and bone, in particular those at the micro- and nano-structural level, remain poorly understood. This paper investigates the structure of the cartilage-bone interface by micro computerized tomography (μCT) and Scanning Electron Microscopy (SEM). The μCT results show that important bone parameters and the density of articular cartilage are all related to position in the hierarchical structure. The junctions of bone and cartilage were characterized by SEM. These results should be useful for the design of osteochondral constructs further manufactured by nanotechnology. A three-dimensional model with a gradient porous structure was constructed in the Pro/ENGINEER software environment.

  9. Behavioral change theories can inform the prediction of young adults' adoption of a plant-based diet.

    PubMed

    Wyker, Brett A; Davison, Kirsten K

    2010-01-01

    Drawing on the Theory of Planned Behavior (TPB) and the Transtheoretical Model (TTM), this study (1) examines links between stages of change for following a plant-based diet (PBD) and consuming more fruits and vegetables (FV); (2) tests an integrated theoretical model predicting intention to follow a PBD; and (3) identifies associated salient beliefs. Design: cross-sectional. Setting: a large public university in the northeastern United States. Participants: 204 college students. TPB and TTM constructs were assessed using validated scales. Outcome, normative, and control beliefs were measured using open-ended questions. The overlap between stages of change for FV consumption and adopting a PBD was assessed using Spearman rank correlation analysis and cross-tab comparisons. The proposed model predicting adoption of a PBD was tested using structural equation modeling (SEM). Salient beliefs were coded using automatic response coding software. No association was found between stages of change for FV consumption and following a PBD. Results from the SEM analyses provided support for the proposed model predicting intention to follow a PBD. Gender differences in salient beliefs for following a PBD were found. The results demonstrate the potential for effective theory-driven, stage-tailored public health interventions to promote PBDs. Copyright 2010 Society for Nutrition Education. Published by Elsevier Inc. All rights reserved.
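
    The stage-of-change overlap test above uses Spearman rank correlation. A minimal version (ignoring ties, with hypothetical 1-5 stage scores rather than the study's data) is just the Pearson correlation of the ranks:

    ```python
    import numpy as np

    def spearman_rho(a, b):
        """Spearman rank correlation via Pearson correlation of the ranks (no tie handling)."""
        rank = lambda v: np.argsort(np.argsort(v)).astype(float)
        return float(np.corrcoef(rank(a), rank(b))[0, 1])

    # hypothetical stages of change (1-5) for FV consumption and for following a PBD
    fv_stage = np.array([1, 4, 2, 5, 3])
    pbd_stage = np.array([2, 1, 5, 3, 4])
    rho = spearman_rho(fv_stage, pbd_stage)  # -0.2 for this toy data
    ```

    A rho near zero, as in this toy example, corresponds to the paper's finding of no association between the two stage-of-change measures.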

  10. COMODI: an ontology to characterise differences in versions of computational models in biology.

    PubMed

    Scharm, Martin; Waltemath, Dagmar; Mendes, Pedro; Wolkenhauer, Olaf

    2016-07-11

    Open model repositories provide ready-to-reuse computational models of biological systems. Models within those repositories evolve over time, leading to different model versions. Taken together, the underlying changes reflect a model's provenance and thus can give valuable insights into the studied biology. Currently, however, changes cannot be semantically interpreted. To improve this situation, we developed an ontology of terms describing changes in models. The ontology can be used by scientists and within software to characterise model updates at the level of single changes. When studying or reusing a model, these annotations help with determining the relevance of a change in a given context. We manually studied changes in selected models from BioModels and the Physiome Model Repository. Using the BiVeS tool for difference detection, we then performed an automatic analysis of changes in all models published in these repositories. The resulting set of concepts led us to define candidate terms for the ontology. In a final step, we aggregated and classified these terms and built the first version of the ontology. We present COMODI, an ontology needed because COmputational MOdels DIffer. It empowers users and software to describe changes in a model on the semantic level. COMODI also enables software to implement user-specific filter options for the display of model changes. Finally, COMODI is a step towards predicting how a change in a model influences the simulation results. COMODI, coupled with our algorithm for difference detection, ensures the transparency of a model's evolution, and it enhances the traceability of updates and error corrections. COMODI is encoded in OWL. It is openly available at http://comodi.sems.uni-rostock.de/ .
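
    A toy illustration of the difference-detection idea (COMODI and BiVeS operate on XML model encodings with far more structure; this sketch merely line-diffs two hypothetical parameter lists using the standard library):

    ```python
    import difflib

    # two hypothetical versions of a model's parameter section
    v1 = [
        '<parameter id="k1" value="0.10"/>',
        '<parameter id="k2" value="2.00"/>',
    ]
    v2 = [
        '<parameter id="k1" value="0.15"/>',   # k1 updated
        '<parameter id="k2" value="2.00"/>',
        '<parameter id="k3" value="7.50"/>',   # k3 inserted
    ]

    # keep only the +/- change lines, dropping the file headers
    diff = difflib.unified_diff(v1, v2, lineterm="")
    changes = [line for line in diff
               if line[:1] in "+-" and not line.startswith(("+++", "---"))]
    ```

    An ontology such as COMODI would then let each raw edit be annotated semantically, e.g. distinguishing the update of k1 from the insertion of k3.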

  11. Automated SEM-EDS GSR Analysis for Turkish Ammunitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cakir, Ismail; Uner, H. Bulent

    2007-04-23

    In this work, Automated Scanning Electron Microscopy with Energy Dispersive X-ray Spectrometry (SEM-EDS) was used to characterize Turkish ammunition from 7.65 and 9 mm cartridges. All samples were analyzed in a JEOL JSM-5600LV SEM equipped with a BSE detector and a Link ISIS 300 EDS system. A working distance of 20 mm, an accelerating voltage of 20 kV, and gunshot residue software were used in all analyses. The automated search resulted in a high number of analyzed particles containing the elements unique to gunshot residue (GSR) (Pb, Ba, Sb). The data obtained on the definition of characteristic GSR particles were concordant with other studies on this topic.

  12. Knowledge work productivity effect on quality of knowledge work in software development process in SME

    NASA Astrophysics Data System (ADS)

    Yusoff, Mohd Zairol; Mahmuddin, Massudi; Ahmad, Mazida

    2016-08-01

    Knowledge and skill are necessary to develop the capability of knowledge workers. However, there is very little understanding of what the necessary knowledge work (KW) is, and of how it influences the quality of knowledge work, or knowledge work productivity (KWP), in the software development process, including in small and medium-sized enterprises (SMEs). SMEs constitute a major part of the economy yet have been relatively unsuccessful in developing KWP. Accordingly, this paper seeks to explore the dimensions of KWP that affect the quality of KW in the SME environment. First, based on an analysis of the existing literature, the key characteristics of KWP are defined. Second, a conceptual model is proposed that explores the dimensions of KWP and its quality. This study analyses data collected from 150 respondents (based on [1]) involved in SMEs in Malaysia and validates the model using structural equation modeling (SEM). The results provide an analysis of the effect of KWP on the quality of KW and business success, and have significant relevance for both research and practice in SMEs.

  13. High-resolution 3D analyses of the shape and internal constituents of small volcanic ash particles: The contribution of SEM micro-computed tomography (SEM micro-CT)

    NASA Astrophysics Data System (ADS)

    Vonlanthen, Pierre; Rausch, Juanita; Ketcham, Richard A.; Putlitz, Benita; Baumgartner, Lukas P.; Grobéty, Bernard

    2015-02-01

    The morphology of small volcanic ash particles is fundamental to our understanding of magma fragmentation, and in transport modeling of volcanic plumes and clouds. Until recently, the analysis of 3D features in small objects (< 250 μm) was either restricted to extrapolations from 2D approaches, partial stereo-imaging, or CT methods having limited spatial resolution and/or accessibility. In this study, an X-ray computed-tomography technique known as SEM micro-CT, also called 3D X-ray ultramicroscopy (3D XuM), was used to investigate the 3D morphology of small volcanic ash particles (125-250 μm sieve fraction), as well as their vesicle and microcrystal distribution. The samples were selected from four stratigraphically well-established tephra layers of the Meerfelder Maar (West Eifel Volcanic Field, Germany). Resolution tests performed on a Beametr v1 pattern sample along with Monte Carlo simulations of X-ray emission volumes indicated that a spatial resolution of 0.65 μm was obtained for X-ray shadow projections using a standard thermionic SEM and a bulk brass target as X-ray source. Analysis of a smaller volcanic ash particle (64-125 μm sieve fraction) showed that features with volumes > 20 μm3 (~ 3.5 μm in diameter) can be successfully reconstructed and quantified. In addition, new functionalities of the Blob3D software were developed to allow the particle shape factors frequently used as input parameters in ash transport and dispersion models to be calculated. This study indicates that SEM micro-CT is very well suited to quantify the various aspects of shape in fine volcanic ash, and potentially also to investigate the 3D morphology and internal structure of any object < 0.1 mm3.
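
    Shape factors of the kind exported by Blob3D-style tools can be computed directly from a particle's measured volume and surface area; a common one in ash transport and dispersion modeling is Wadell sphericity. A sketch with invented numbers (not measurements from this study):

    ```python
    import math

    def wadell_sphericity(volume, surface_area):
        """Surface area of the equal-volume sphere divided by the particle's surface area."""
        return math.pi ** (1 / 3) * (6 * volume) ** (2 / 3) / surface_area

    # sanity check: a sphere has sphericity exactly 1
    r = 3.0
    sphere = wadell_sphericity(4 / 3 * math.pi * r**3, 4 * math.pi * r**2)

    # hypothetical ash particle reconstructed from SEM micro-CT voxels (μm³, μm²)
    particle = wadell_sphericity(volume=4.2e4, surface_area=9.1e3)
    ```

    Rougher, more vesicular particles have more surface area at fixed volume, so their sphericity falls further below 1.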

  14. Use of fluorescence and scanning electron microscopy as tools in teaching biology

    NASA Astrophysics Data System (ADS)

    Ghosh, Nabarun; Silva, Jessica; Vazquez, Aracely; Das, A. B.; Smith, Don W.

    2011-06-01

    Recent nationwide surveys reveal a significant decline in students' interest in math and the sciences. The objective of this project was to inspire young minds through various scientific techniques, including scanning electron microscopy. We used a scanning electron microscope to demonstrate various types of biological samples. In the past decade, tabletop SEM models have revolutionized the use of scanning electron microscopes. Using the tabletop SEM model TM 1000, we studied biological specimens of fungal spores, pollen grains, diatoms, plant fibers, dust mites, insect parts, and leaf surfaces. We also used fluorescence microscopy to view, record, and analyze various specimens with an Olympus BX40 microscope equipped with FITC and TRITC fluorescent filters, a mercury lamp source, and a DP-70 digital camera with Image Pro 6.0 software. Micrographs were captured using bright-field microscopy and the fluorescein isothiocyanate (FITC) and tetramethylrhodamine (TRITC) filter settings at 40X. A high-pressure mercury lamp or UV source was used to excite the storage molecules or proteins, which exhibited autofluorescence. We used fluorescence microscopy to confirm the localization of sugar beet viruses in plant organs by viewing the vascular bundles in thin sections of the leaves and other tissues. We worked with the REU summer students on sample preparation and observation of various samples using the SEM. Critical point drying (CPD) and metal coating with a sputter coater were performed before observing cultured specimens and samples that were soft in texture with high water content. The tabletop SEM allowed investigation of the detailed morphological features that can be used for classroom teaching. Undergraduate and graduate researchers studied biological samples of arthropods, pollen grains, and teeth collected from four species of snakes using the SEM.
    This project inspired the research students to pursue careers in higher studies in science, and 45% of the undergraduates who participated in this project entered graduate school.

  15. Successful aging in Spanish older adults: the role of psychosocial resources.

    PubMed

    Dumitrache, Cristina G; Rubio, Laura; Cordón-Pozo, Eulogio

    2018-05-25

    Background: Psychological and social resources such as extraversion, optimism, social support, and social networks contribute to adaptation and to successful aging. Building on assumptions derived from successful aging and developmental adaptation models, this study aims to analyze the joint impact of different psychosocial resources, such as personality, social relations, health, and socio-demographic characteristics, on life satisfaction in a group of people aged 65 years and older from Spain. A cross-sectional survey using non-proportional quota sampling was carried out. The sample comprised 406 community-dwelling older adults (M = 74.88, SD = 6.75). Data were collected through individual face-to-face interviews. A structural equation model (SEM) was estimated using PLS software. The SEM results showed that, within this sample, psychosocial variables explain 47.4% of the variance in life satisfaction. Social relations and personality, specifically optimism, were strongly related to life satisfaction, while health status and socio-demographic characteristics were modestly associated with it. The findings support the view that psychosocial resources are important for successful aging and should therefore be included in successful aging models. Furthermore, interventions aimed at fostering successful aging should take into account the role of psychosocial variables.

  16. Three-dimensional intracellular structure of a whole rice mesophyll cell observed with FIB-SEM.

    PubMed

    Oi, Takao; Enomoto, Sakiko; Nakao, Tomoyo; Arai, Shigeo; Yamane, Koji; Taniguchi, Mitsutaka

    2017-07-01

    Ultrathin sections of rice leaf blades observed two-dimensionally using a transmission electron microscope (TEM) show that the chlorenchyma is composed of lobed mesophyll cells, with intricate cell boundaries, and lined with chloroplasts. The lobed cell shape and chloroplast positioning are believed to enhance the area available for the gas exchange surface for photosynthesis in rice leaves. However, a cell image revealing the three-dimensional (3-D) ultrastructure of rice mesophyll cells has not been visualized. In this study, a whole rice mesophyll cell was observed using a focused ion beam scanning electron microscope (FIB-SEM), which provides many serial sections automatically, rapidly and correctly, thereby enabling 3-D cell structure reconstruction. Rice leaf blades were fixed chemically using the method for conventional TEM observation, embedded in resin and subsequently set in the FIB-SEM chamber. Specimen blocks were sectioned transversely using the FIB, and block-face images were captured using the SEM. The sectioning and imaging were repeated overnight for 200-500 slices (each 50 nm thick). The resultant large-volume image stacks ( x = 25 μm, y = 25 μm, z = 10-25 μm) contained one or two whole mesophyll cells. The 3-D models of whole mesophyll cells were reconstructed using image processing software. The reconstructed cell models were discoid shaped with several lobes around the cell periphery. The cell shape increased the surface area, and the ratio of surface area to volume was twice that of a cylinder having the same volume. The chloroplasts occupied half the cell volume and spread as sheets along the cell lobes, covering most of the inner cell surface, with adjacent chloroplasts in close contact with each other. Cellular and sub-cellular ultrastructures of a whole mesophyll cell in a rice leaf blade are demonstrated three-dimensionally using a FIB-SEM. 
    The 3-D models and numerical information support the hypothesis that rice mesophyll cells enhance their CO2 absorption with increased cell surface area and sheet-shaped chloroplasts. © The Author 2017. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com
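
    The surface-to-volume comparison reported above is easy to reproduce in outline: for a measured cell volume, compute the SA:V ratio of a same-volume reference cylinder of a chosen height (hedged sketch; the dimensions below are invented, not the paper's measurements):

    ```python
    import math

    def cylinder_sa_to_v(volume, height):
        """SA:V ratio of a cylinder with the given volume and height."""
        radius = math.sqrt(volume / (math.pi * height))
        area = 2 * math.pi * radius**2 + 2 * math.pi * radius * height
        return area / volume

    # hypothetical mesophyll-cell volume (μm³) and a same-volume reference cylinder
    v_cell = 1000.0
    ratio_cylinder = cylinder_sa_to_v(v_cell, height=5.0)

    # a lobed cell with twice that ratio (as the paper reports) implies
    # twice the surface area at the same volume
    area_cell = 2 * ratio_cylinder * v_cell
    ```

    The doubling of SA:V at fixed volume is exactly the geometric advantage the lobed, discoid cell shape provides for gas exchange.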

  17. Multitemporal Three Dimensional Imaging of Volcanic Products on the Macro- and Micro- Scale

    NASA Astrophysics Data System (ADS)

    Carter, A. J.; Ramsey, M. S.; Durant, A. J.; Skilling, I. P.

    2006-12-01

    Satellite data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) can be processed using a nadir- and backward-viewing band at the same wavelength to generate a Digital Elevation Model (DEM) at a maximum spatial resolution of 15 metres. Bezymianny Volcano (Kamchatka Peninsula, Russia) was chosen as a test target for multitemporal DEM generation. DEMs were used to generate a layer stack and calculate coarse topographic changes from 2000 to 2006, the most significant of which was a new crater that formed in spring 2005. The eruption that occurred on 11 January 2005 produced a pyroclastic deposit on the east flank, which was mapped and from which samples were collected in August 2005. A comparison was made between field-based observations of the deposit and micron-scale roughness (analogous to vesicularity) derived from ASTER thermal infrared data following the model described in Ramsey and Fink (1999) on lava domes. In order to investigate applying this technique to the pyroclastic deposits, 18 small samples from Bezymianny were selected for Scanning Electron Microscope (SEM) micron-scale analysis. The SEM image data were processed using software capable of calculating surface roughness and vesicle volume from stereo pairs: a statistical analysis of samples is presented using a high resolution grid of surface profiles. The results allow for a direct comparison to field, laboratory, and satellite-based estimates of micron-scale roughness. Prior to SEM processing, laboratory thermal emission spectra of the microsamples were collected and modelled to estimate vesicularity. Each data set was compared and assessed for coherence within the limitations of each technique. This study outlines the value of initially imaging at the macro-scale to assess major topographic changes over time at the volcano. 
This is followed by an example of the application of micro-scale SEM imaging and spectral deconvolution, highlighting the advantages of using multiple resolutions to analyse frequently overlapping products at Bezymianny.

  18. Symposium on Automation, Robotics and Advanced Computing for the National Space Program (2nd) Held in Arlington, Virginia on 9-11 March 1987

    DTIC Science & Technology

    1988-02-28

    [De-interleaved OCR excerpts] ...enormous investment in software. This is an extremely important objective. We need... better methodologies, tools and theories... | ...scanning electron microscopy (SEM) and optical microscopy. Current activities include the study of SEM images... | [13] Hanson, A., et al., "A Methodology for the Development..." | ...through a phased knowledge engineering methodology consisting of: prototype knowledge base development... Center (ARC) and NASA Johnson Space Center (JSC)

  19. A novel approach to TEM preparation with a (7-axis stage) triple-beam FIB-SEM system

    NASA Astrophysics Data System (ADS)

    Clarke, Jamil J.

    2015-10-01

    Preparation of lamellae from bulk to grid for Cs-corrected Transmission Electron Microscope (TEM) observation has become mostly routine work on the latest FIB-SEM systems, with standardized techniques whose initial steps are often left to automation. The finalization of lamellae, however, remains non-routine and non-repeatable, and is often driven by user experience in producing a high-quality, damage-free cross section. Materials processing of the latest technologies, with ever-shrinking nano-sized structures, poses challenges to modern FIB-SEM systems. This often leads to specialized techniques and hyper-specific functions for producing ultra-thin, high-quality lamellae that are lab specific, preventing practical use of such techniques across multiple materials and applications. Several factors should be considered in processing fine-structured materials successfully: how electron and ion scan conditions can affect a thin section during ion milling; the ion species applied during the finalization of lamellae, whether gallium ions or a smaller species such as Ar/Xe; the orientation of the lamella during thinning, which is directly linked to the ion beam incident angle in the creation of waterfall or curtain effects; and how software can be employed to reduce these artifacts with reproducible results, regardless of FIB-SEM experience, for site-specific lift-outs. A traditional TEM preparation of a fine-structured specimen was performed in pursuit of a process technique that produces a high-quality TEM lamella while addressing all of these factors.
    These new capabilities have been refined and improved during the FIB-SEM design and development stages. The end result is a new approach that improves quality by reducing common ion milling artifacts such as curtain effects and amorphous material, allows better pinpointing of the area of interest, reduces overall processing time for TEM sample preparation, and enhances repeatability through ease of use via software controls. The new technologies incorporate a third Ar/Xe ion beam column alongside the electron and gallium ion beam columns; a 7-axis stage for enhanced sample orientation, with tilt functions in two axes and automated swing control; and a host of additional functions addressing the factors mentioned above, such as electron and ion scan techniques and curtain-effect removal through hardware and software components that are key to reducing typical FIB-related artifacts. Together these are called "ACE [Anti-Curtaining Effect] Technologies". The overall aim of these developments is to address the significant point that productivity, throughput, and repeatability depend on the synergy between the user, application, software, and hardware within a FIB-SEM system. The latest Hitachi FIB-SEM platform offers these innovations for reliable, repeatable, high-quality lamella preparation for Cs-corrected (S)TEMs.

  20. Reliability of four models for clinical gait analysis.

    PubMed

    Kainz, Hans; Graham, David; Edwards, Julie; Walsh, Henry P J; Maine, Sheanna; Boyd, Roslyn N; Lloyd, David G; Modenese, Luca; Carty, Christopher P

    2017-05-01

    Three-dimensional gait analysis (3DGA) has become a common clinical tool for treatment planning in children with cerebral palsy (CP). Many clinical gait laboratories use the conventional gait analysis model (e.g. the Plug-in-Gait model), which uses Direct Kinematics (DK) for joint kinematic calculations, whereas musculoskeletal models, mainly used for research, use Inverse Kinematics (IK). Musculoskeletal IK models have the advantage of enabling additional analyses which might improve clinical decision-making in children with CP. Before any new model can be used in a clinical setting, its reliability has to be evaluated and compared to a commonly used clinical gait model (e.g. the Plug-in-Gait model), which was the purpose of this study. Two testers performed 3DGA in eleven CP and seven typically developing participants on two occasions. Intra- and inter-tester standard deviations (SD) and standard error of measurement (SEM) were used to compare the reliability of two DK models (Plug-in-Gait and a six-degrees-of-freedom model solved using Vicon software) and two IK models (two modifications of 'gait2392' solved using OpenSim). All models showed good reliability (mean SEM of 3.0° over all analysed models and joint angles). Variations in joint kinetics were smaller in typically developing than in CP participants. The modified 'gait2392' model, which included all the joint rotations commonly reported in clinical 3DGA, showed reasonably reliable joint kinematic and kinetic estimates and allows additional musculoskeletal analyses of surgically adjustable parameters, e.g. muscle-tendon lengths, and is therefore a suitable model for clinical gait analysis. Copyright © 2017. Published by Elsevier B.V.
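
    Here SEM denotes the standard error of measurement (not structural equation modeling). Given a between-subject SD and a reliability coefficient such as an ICC, it follows the standard formula SEM = SD · sqrt(1 − ICC); the numbers below are invented, not this study's:

    ```python
    import math

    def standard_error_of_measurement(sd, reliability):
        """SEM = SD * sqrt(1 - reliability), with reliability e.g. an ICC in [0, 1]."""
        return sd * math.sqrt(1 - reliability)

    # hypothetical joint-angle data: between-subject SD of 6.0 deg, ICC of 0.75
    sem_deg = standard_error_of_measurement(6.0, 0.75)
    ```

    A perfectly reliable measure (ICC = 1) gives SEM = 0; lower reliability inflates the measurement error expressed in the units of the joint angle itself.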

  1. Enhanced EDX images by fusion of multimodal SEM images using pansharpening techniques.

    PubMed

    Franchi, G; Angulo, J; Moreaud, M; Sorbier, L

    2018-01-01

    The goal of this paper is to explore the potential interest of image fusion in the context of multimodal scanning electron microscope (SEM) imaging. In particular, we aim at merging backscattered electron images, which usually have a high spatial resolution but do not provide enough discriminative information to physically classify the nature of the sample, with energy-dispersive X-ray spectroscopy (EDX) images, which have discriminative information but a lower spatial resolution. The produced images are named enhanced EDX. To achieve this goal, we have compared the results obtained with classical pansharpening techniques for image fusion with an original approach tailored for multimodal SEM fusion of information. Quantitative assessment is obtained by means of two SEM images and a simulated dataset produced by software based on PENELOPE. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
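
    One of the classical pansharpening baselines such work compares against can be sketched in a few lines. The Brovey-style transform below rescales each upsampled low-resolution band by the high-resolution (here BSE-like) intensity; the arrays are toy random data, not real SEM/EDX images:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    pan = rng.random((64, 64)) + 0.1          # high-resolution BSE-like image
    edx = rng.random((32, 32, 3)) + 0.1       # low-resolution EDX maps (3 elements)

    # upsample the EDX maps to the BSE grid by nearest-neighbour repetition
    up = edx.repeat(2, axis=0).repeat(2, axis=1)

    # Brovey-style fusion: scale each band so the mean over bands matches pan
    intensity = up.mean(axis=2, keepdims=True)
    fused = up * pan[..., None] / intensity
    ```

    The fused bands keep the EDX band ratios, and hence the elemental contrast, while inheriting the spatial detail of the high-resolution image.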

  2. CD-SEM real time bias correction using reference metrology based modeling

    NASA Astrophysics Data System (ADS)

    Ukraintsev, V.; Banke, W.; Zagorodnev, G.; Archie, C.; Rana, N.; Pavlovsky, V.; Smirnov, V.; Briginas, I.; Katnani, A.; Vaid, A.

    2018-03-01

    Accuracy of patterning impacts yield, IC performance, and technology time to market. Accuracy of patterning relies on optical proximity correction (OPC) models built using CD-SEM inputs and on intra-die critical dimension (CD) control based on CD-SEM. Sub-nanometer measurement uncertainty (MU) of CD-SEM is required for current technologies. The reported design- and process-related bias variation of CD-SEM is in the range of several nanometers. Reference metrology and numerical modeling are used to correct SEM, but both methods are too slow for real-time bias correction. We report on real-time CD-SEM bias correction using empirical models based on reference metrology (RM) data. A significant amount of currently untapped information (sidewall angle, corner rounding, etc.) is obtainable from SEM waveforms. Using additional RM information provided for a specific technology (design rules, materials, processes), CD extraction algorithms can be pre-built and then used in real time for accurate CD extraction from regular CD-SEM images. The art and challenge of SEM modeling lie in finding a robust correlation between SEM waveform features and CD-SEM bias, and in minimizing the RM inputs needed to create a model that is accurate within the design and process space. The new approach was applied to improve the CD-SEM accuracy of 45 nm GATE and 32 nm MET1 OPC 1D models. In both cases the MU of the state-of-the-art CD-SEM was improved by 3x and reduced to the nanometer level. A similar approach can be applied to 2D (end of line, contours, etc.) and 3D (sidewall angle, corner rounding, etc.) cases.
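
    The empirical-model idea, regressing CD-SEM bias on waveform features using reference-metrology data and then subtracting the predicted bias in real time, can be sketched with ordinary least squares. The features, coefficients, and readings below are synthetic stand-ins; the paper's actual models and features are technology-specific:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 200

    # hypothetical waveform features extracted from SEM images
    # (e.g. edge slope, peak width), plus a synthetic "true" bias law
    features = rng.random((n, 2))
    true_w = np.array([1.5, -0.8])
    bias_rm = features @ true_w + 0.3 + rng.normal(scale=0.01, size=n)  # RM-measured bias

    # calibrate: fit bias = X @ w with an intercept column
    X = np.column_stack([features, np.ones(n)])
    w, *_ = np.linalg.lstsq(X, bias_rm, rcond=None)

    # real-time correction: subtract the modeled bias from raw CD readings (nm)
    cd_raw = 30.0 + 10.0 * rng.random(n)
    cd_corrected = cd_raw - X @ w
    ```

    Once calibrated offline against RM, applying the model is a single dot product per measurement, which is what makes the correction fast enough for real time.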

  3. Investigation of hidden periodic structures on SEM images of opal-like materials using FFT and IFFT.

    PubMed

    Stephant, Nicolas; Rondeau, Benjamin; Gauthier, Jean-Pierre; Cody, Jason A; Fritsch, Emmanuel

    2014-01-01

    We have developed a method that uses fast Fourier transformation (FFT) and inverse fast Fourier transformation (IFFT) to investigate hidden periodic structures in SEM images. We focused on samples of natural play-of-color opals, which diffract visible light and hence are periodically structured. Conventional sample preparation by hydrofluoric acid etch was not used; untreated, freshly broken surfaces were examined at low magnification relative to the expected period of the structural features, and the SEM was adjusted to obtain a very high number of pixels in the images. These SEM images were processed by software to calculate autocorrelation, FFT, and IFFT. We present how we adjusted the SEM acquisition parameters for best results. We first applied our procedure to an SEM image in which the structure was obvious. Then we applied the same procedure to a sample that must contain a periodic structure, because it diffracts visible light, but in which no structure was visible in the SEM image. In both cases we obtained clearly periodic patterns that allowed measurements of structural parameters. We also investigated how the irregularly broken surface interfered with the periodic structure to produce additional periodicity. We tested the limits of our methodology with the help of simulated images. © 2014 Wiley Periodicals, Inc.
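
    The core of the FFT approach, a periodic structure invisible in the image becoming a sharp peak in the spectrum, can be demonstrated in 1-D (the paper works on 2-D SEM images; this is a toy signal with the period buried under noise):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 1024
    x = np.arange(n)
    period = 64  # pixels; hard to see by eye once noise is added
    signal = np.sin(2 * np.pi * x / period) + 2.0 * rng.normal(size=n)

    spectrum = np.abs(np.fft.rfft(signal))
    spectrum[0] = 0.0                          # discard the DC component
    k = int(np.argmax(spectrum))               # dominant frequency (cycles per n samples)
    estimated_period = n / k
    ```

    Inverse-transforming only the neighbourhood of that spectral peak (the IFFT step) would render the hidden lattice, mirroring the paper's procedure on real images.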

  4. Structural Equation Model Trees

    ERIC Educational Resources Information Center

    Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman

    2013-01-01

    In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree…

  5. Hippocampal atrophy and memory dysfunction associated with physical inactivity in community-dwelling elderly subjects: The Sefuri study.

    PubMed

    Hashimoto, Manabu; Araki, Yuko; Takashima, Yuki; Nogami, Kohjiro; Uchino, Akira; Yuzuriha, Takefumi; Yao, Hiroshi

    2017-02-01

Physical inactivity is one of the modifiable risk factors for hippocampal atrophy and Alzheimer's disease. We investigated the relationship between physical activity, hippocampal atrophy, and memory using structural equation modeling (SEM). We examined 213 community-dwelling elderly subjects (99 men and 114 women with a mean age of 68.9 years) without dementia or clinically apparent depression. All participants underwent the Mini-Mental State Examination (MMSE) and the Rivermead Behavioral Memory Test (RBMT). Physical activities were assessed with a structured questionnaire. We evaluated the degree of hippocampal atrophy (a z-score, referred to as ZAdvance hereafter) using a free software program, the voxel-based specific regional analysis system for Alzheimer's disease (VSRAD), based on statistical parametric mapping 8 plus Diffeomorphic Anatomical Registration Through Exponentiated Lie algebra. Routine magnetic resonance imaging findings were as follows: silent brain infarction, n = 24 (11.3%); deep white matter lesions, n = 72 (33.8%); periventricular hyperintensities, n = 35 (16.4%); and cerebral microbleeds, n = 14 (6.6%). Path analysis based on SEM indicated that the direct paths from leisure-time activity to hippocampal atrophy (β = -.18, p < .01) and from hippocampal atrophy to memory dysfunction (RBMT) (β = -.20, p < .01) were significant. Direct paths from "hippocampus" gray matter volume to RBMT and MMSE were highly significant, while direct paths from "whole brain" gray matter volume to RBMT and MMSE were not. The presented SEM model fit the data reasonably well. Based on the present SEM analysis, we found that hippocampal atrophy was associated with age and leisure-time physical inactivity, and hippocampal atrophy appeared to cause memory dysfunction, although we are unable to infer a causal or temporal association between hippocampal atrophy and memory dysfunction from the present observational study.
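The reported path structure (activity -> atrophy -> memory) can be illustrated with a stripped-down path analysis in which each path is a standardized bivariate regression on synthetic data. A real SEM fits all paths and the measurement model jointly, so this numpy sketch is only a didactic approximation:

```python
import numpy as np

def standardized_path(x, y):
    """Standardized simple-regression coefficient (equals Pearson r here)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(x @ y) / len(x)

# Toy data mimicking the reported sign pattern:
# more leisure activity -> less atrophy -> better memory.
rng = np.random.default_rng(1)
activity = rng.standard_normal(5000)
atrophy = -0.4 * activity + rng.standard_normal(5000)
memory = -0.5 * atrophy + rng.standard_normal(5000)

beta1 = standardized_path(activity, atrophy)  # negative, like the reported -.18
beta2 = standardized_path(atrophy, memory)    # negative, like the reported -.20
print(beta1 < 0, beta2 < 0)  # -> True True
```

The generating coefficients here are arbitrary; the point is only that standardized path coefficients recover the direction of the simulated effects.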

  6. Evaluation of an Approximate Method for Synthesizing Covariance Matrices for Use in Meta-Analytic SEM

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Furlow, Carolyn F.

    2006-01-01

    Meta-analytic structural equation modeling (MA-SEM) is increasingly being used to assess model-fit for variables' interrelations synthesized across studies. MA-SEM researchers have analyzed synthesized correlation matrices using structural equation modeling (SEM) estimation that is designed for covariance matrices. This can produce incorrect…
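One common first stage of MA-SEM is synthesizing a pooled correlation matrix across studies, weighted by sample size; the article's concern is what happens when that pooled correlation matrix is then analyzed as if it were a covariance matrix. The pooling step can be sketched as (function name and two-study example illustrative):

```python
import numpy as np

def pool_correlations(r_matrices, ns):
    """Sample-size-weighted pooled correlation matrix across studies."""
    ns = np.asarray(ns, dtype=float)
    stacked = np.array(r_matrices)
    # Weight each study's matrix by its n, then normalize.
    return (stacked * ns[:, None, None]).sum(axis=0) / ns.sum()

r1 = np.array([[1.0, 0.30], [0.30, 1.0]])  # study 1, n = 100
r2 = np.array([[1.0, 0.50], [0.50, 1.0]])  # study 2, n = 300
pooled = pool_correlations([r1, r2], ns=[100, 300])
print(round(pooled[0, 1], 2))  # -> 0.45
```

The larger study pulls the pooled correlation toward its own value, which is the intended behavior of sample-size weighting.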

  7. Structural Equation Model Trees

    PubMed Central

    Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman

    2015-01-01

    In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree structures that separate a data set recursively into subsets with significantly different parameter estimates in a SEM. SEM Trees provide means for finding covariates and covariate interactions that predict differences in structural parameters in observed as well as in latent space and facilitate theory-guided exploration of empirical data. We describe the methodology, discuss theoretical and practical implications, and demonstrate applications to a factor model and a linear growth curve model. PMID:22984789
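The recursive-partitioning idea behind SEM Trees can be demonstrated with the simplest possible "model", a normal mean with fixed variance: a split on a covariate is worthwhile when fitting the model separately in the two groups raises the log-likelihood enough. This sketch (hypothetical `split_gain` helper, synthetic data) substitutes that toy model for the full SEM likelihood:

```python
import numpy as np

def split_gain(y, covariate):
    """Log-likelihood gain (normal model, unit variance) from splitting y
    by a binary covariate -- the test a SEM Tree applies at each node,
    with a full SEM in place of this one-parameter mean model."""
    def loglik(v):
        # Up to constants, the normal log-likelihood at the MLE mean.
        return -0.5 * np.sum((v - v.mean()) ** 2)
    left, right = y[covariate == 0], y[covariate == 1]
    return (loglik(left) + loglik(right)) - loglik(y)

rng = np.random.default_rng(2)
group = rng.integers(0, 2, 1000)
y = 1.5 * group + rng.standard_normal(1000)  # group means differ by 1.5
print(split_gain(y, group) > 50)  # -> True
```

A SEM Tree repeats this comparison over all candidate covariates and recurses into the resulting subsets, so parameter differences are found in both observed and latent space.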

  8. The capability of lithography simulation based on MVM-SEM® system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-10-01

Lithography at the 1X-nm technology node uses SMO-ILT, NTD, and other complex patterns. In mask defect inspection, defect verification therefore becomes more difficult, because many nuisance defects are detected in aggressive mask features. One key technology of mask manufacturing is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is difficult to use for verifying a hundred or more defects. We previously reported the capability of defect verification based on lithography simulation with a SEM system whose architecture and software correlate excellently for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD, and other complex-pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM system. Finally, we confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.
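A minimal stand-in for the printability simulation discussed above is a coherent-illumination aerial image model: low-pass the mask spectrum with a circular pupil and square the resulting field. This is a toy (no partial coherence, resist model, or calibration), not the commercial simulator:

```python
import numpy as np

def coherent_aerial_image(mask, na_cutoff):
    """Toy coherent-imaging model: filter the mask spectrum with a circular
    pupil of normalized frequency radius na_cutoff, then take |field|^2."""
    spectrum = np.fft.fft2(mask)
    fy = np.fft.fftfreq(mask.shape[0])[:, None]
    fx = np.fft.fftfreq(mask.shape[1])[None, :]
    pupil = (fy ** 2 + fx ** 2) <= na_cutoff ** 2
    field = np.fft.ifft2(spectrum * pupil)
    return np.abs(field) ** 2

# Dense line/space mask (16-pixel pitch): the filtered image keeps only the
# DC term and fundamental harmonic, so it stays periodic but loses contrast.
mask = np.tile(np.repeat([1.0, 0.0], 8), 16)[None, :].repeat(32, axis=0)
image = coherent_aerial_image(mask, na_cutoff=0.1)
print(image.min() >= 0.0 and image.max() <= 1.5)  # -> True
```

Cutting the pupil radius below the pattern's fundamental frequency would wipe the lines out entirely, which is the resolution-limit behavior a printability check looks for.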

  9. A Thermal Precipitator for Fire Characterization Research

    NASA Technical Reports Server (NTRS)

    Meyer, Marit; Bryg, Vicky

    2008-01-01

    Characterization of the smoke from pyrolysis of common spacecraft materials provides insight for the design of future smoke detectors and post-fire clean-up equipment on the International Space Station. A thermal precipitator was designed to collect smoke aerosol particles for microscopic analysis in fire characterization research. Information on particle morphology, size and agglomerate structure obtained from these tests supplements additional aerosol data collected. Initial modeling for the thermal precipitator design was performed with the finite element software COMSOL Multiphysics, and includes the flow field and heat transfer in the device. The COMSOL Particle Tracing Module was used to determine particle deposition on SEM stubs which include TEM grids. Modeling provided optimized design parameters such as geometry, flow rate and temperatures. Microscopy results from fire characterization research using the thermal precipitator are presented.

  10. SEM AutoAnalysis: enhancing photomask and NIL defect disposition and review

    NASA Astrophysics Data System (ADS)

    Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Ehrlich, Christian; Garetto, Anthony

    2017-06-01

For defect disposition and repair verification regarding printability, AIMS™ is the state-of-the-art measurement tool in industry. With its unique capability of capturing aerial images of photomasks, it is the one method that comes closest to emulating the printing behaviour of a scanner. However, for nanoimprint lithography (NIL) templates, aerial images cannot be applied to evaluate the success of a repair process. Hence, for NIL defect dispositioning, scanning electron microscopy (SEM) imaging is the method of choice. In addition, it has been a standard imaging method for further root cause analysis of defects and defect review on optical photomasks, enabling 2D or even 3D mask profiling at high resolutions. In recent years, a trend observed in mask shops has been the automation of processes that traditionally were driven by operators. This has brought many advantages, one of which is freeing cost-intensive labour from repetitive and tedious work. Furthermore, it reduces variability in processes due to different operator skill and experience levels, which ultimately contributes to eliminating the human factor. Taking these factors into consideration, one of the software-based solutions available under the FAVOR® brand to support customer needs is the aerial image evaluation software AIMS™ AutoAnalysis (AAA). It provides fully automated analysis of AIMS™ images and runs in parallel to measurements, enabled by its direct connection and communication with the AIMS™ tools. As one of many positive outcomes, generating automated result reports is facilitated, standardizing the mask manufacturing workflow. Today, AAA has been successfully introduced into production at multiple customers and is supporting the workflow as described above. These trends have indeed triggered demand for similar automation of SEM measurements, leading to the development of SEM AutoAnalysis (SAA).
It aims towards a fully automated SEM image evaluation process, utilizing a completely different algorithm due to the different nature of SEM and aerial images. Both AAA and SAA are building blocks of an image evaluation suite for the mask shop industry.

  11. The SEM Risk Behavior (SRB) Model: A New Conceptual Model of how Pornography Influences the Sexual Intentions and HIV Risk Behavior of MSM.

    PubMed

    Wilkerson, J Michael; Iantaffi, Alex; Smolenski, Derek J; Brady, Sonya S; Horvath, Keith J; Grey, Jeremy A; Rosser, B R Simon

    2012-01-01

    While the effects of sexually explicit media (SEM) on heterosexuals' sexual intentions and behaviors have been studied, little is known about the consumption and possible influence of SEM among men who have sex with men (MSM). Importantly, conceptual models of how Internet-based SEM influences behavior are lacking. Seventy-nine MSM participated in online focus groups about their SEM viewing preferences and sexual behavior. Twenty-three participants reported recent exposure to a new behavior via SEM. Whether participants modified their sexual intentions and/or engaged in the new behavior depended on three factors: arousal when imagining the behavior, pleasure when attempting the behavior, and trust between sex partners. Based on MSM's experience, we advance a model of how viewing a new sexual behavior in SEM influences sexual intentions and behaviors. The model includes five paths. Three paths result in the maintenance of sexual intentions and behaviors. One path results in a modification of sexual intentions while maintaining previous sexual behaviors, and one path results in a modification of both sexual intentions and behaviors. With this model, researchers have a framework to test associations between SEM consumption and sexual intentions and behavior, and public health programs have a framework to conceptualize SEM-based HIV/STI prevention programs.

  12. An overview of structural equation modeling: its beginnings, historical development, usefulness and controversies in the social sciences.

    PubMed

    Tarka, Piotr

    2018-01-01

This paper is a tribute to researchers who have significantly contributed to improving and advancing structural equation modeling (SEM). It is a brief overview of SEM that presents its beginnings, historical development, usefulness in the social sciences, and the statistical and philosophical (theoretical) controversies that have often appeared in the literature pertaining to SEM. Having described the essence of SEM in the context of causal analysis, the author discusses the development of structural modeling as a response to the systematically growing needs of researchers, particularly in the social sciences, who strove to effectively understand the structure and interactions of latent phenomena. The early beginnings of SEM models were related to the work of Spearman and Wright, and to that of other prominent researchers who contributed to SEM's development. The importance and predominance of theoretical assumptions over technical issues for the successful construction of SEM models are also described. Then, controversies regarding the use of SEM in the social sciences are presented. Finally, the opportunities and threats of this type of analytical strategy, as well as selected areas of SEM applications in the social sciences, are discussed.

  13. Modeling a Miniaturized Scanning Electron Microscope Focusing Column - Lessons Learned in Electron Optics Simulation

    NASA Technical Reports Server (NTRS)

    Loyd, Jody; Gregory, Don; Gaskin, Jessica

    2016-01-01

This presentation discusses work done to assess the design of a focusing column in a miniaturized Scanning Electron Microscope (SEM) developed at the NASA Marshall Space Flight Center (MSFC) for use in situ on the Moon, in particular for mineralogical analysis. The MSFC beam column design uses purely electrostatic fields for focusing, because of the severe constraints on mass and electrical power consumption imposed by the goals of lunar exploration and of spaceflight in general. The resolution of an SEM ultimately depends on the size of the focused spot of the scanning beam probe, for which the stated goal here is a diameter of 10 nanometers. Optical aberrations are the main challenge to this performance goal, because they blur the ideal geometrical optical image of the electron source, effectively widening the ideal spot size of the beam probe. In the present work the optical aberrations of the mini SEM focusing column were assessed using direct tracing of non-paraxial rays, as opposed to mathematical estimates of aberrations based on paraxial ray-traces. The geometrical ray-tracing employed here is completely analogous to ray-tracing as conventionally understood in the realm of photon optics, with the major difference being that in electron optics the lens is simply a smoothly varying electric field in vacuum, formed by precisely machined electrodes. Ray-tracing in this context, therefore, relies upon a model of the electrostatic field inside the focusing column to provide the mathematical description of the "lens" being traced. This work relied fundamentally on the boundary element method (BEM) for this electric field model. In carrying out this research the authors discovered that higher accuracy in the field model was essential if aberrations were to be reliably assessed using direct ray-tracing. This led to some work in testing alternative techniques for modeling the electrostatic field.
Ultimately, the necessary accuracy was attained using a BEM/Fourier series hybrid approach. The presentation will give background remarks about the MSFC mini Lunar SEM concept and electron optics modeling, followed by a description of the alternate field modeling techniques that were tried, along with their incorporation into a ray-trace simulation. Next, the validation of this simulation against commercially available software will be discussed using an example lens as a test case. Then, the efficacy of aberration assessment using direct ray-tracing will be demonstrated, using this same validation case. The discussion will include practical error checks of the field solution. Finally, the ray-trace assessment of the MSFC mini Lunar SEM concept will be shown and discussed. The authors believe this presentation will be of general interest to practitioners of modeling and simulation, as well as those with a general optics background. Because electron optics and photon optics share many basic concepts (e.g., lenses, images, aberrations, etc.), the appeal of this presentation need not be restricted to just those interested in charged particle optics.
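Direct ray-tracing of the kind described above amounts to integrating the electron's equation of motion through the field model. A minimal sketch, assuming a uniform transverse field so the numeric trace can be checked against the analytic parabola (explicit Euler stepping here; production tracers use higher-order integrators and realistic field models):

```python
def trace_ray(z_end, vz, ex_field, q_over_m, dt=1e-12):
    """Trace an electron through a uniform transverse field Ex by explicit
    time-stepping of the Coulomb force (non-relativistic, no B field)."""
    x, vx, z = 0.0, 0.0, 0.0
    while z < z_end:
        vx += q_over_m * ex_field * dt  # acceleration from the field model
        x += vx * dt
        z += vz * dt
    return x

# Deflection over 1 cm of flight at 1e7 m/s in a 1 kV/m field, compared
# with the analytic parabola x = 0.5 * (qE/m) * (L / vz)**2.
Q_OVER_M = -1.758820e11  # electron charge-to-mass ratio, C/kg
numeric = trace_ray(0.01, 1e7, -1000.0, Q_OVER_M)
analytic = 0.5 * (Q_OVER_M * -1000.0) * (0.01 / 1e7) ** 2
print(abs(numeric - analytic) / abs(analytic) < 0.01)  # -> True
```

In the real column the field varies in space, so each step must re-evaluate the BEM field model at the current position; that is exactly why the authors found field-model accuracy to be the limiting factor.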

  14. The development of comparative bias index

    NASA Astrophysics Data System (ADS)

    Aimran, Ahmad Nazim; Ahmad, Sabri; Afthanorhan, Asyraf; Awang, Zainudin

    2017-08-01

Structural Equation Modeling (SEM) is a second-generation statistical analysis technique developed for analyzing the inter-relationships among multiple variables in a model simultaneously. The two most commonly used methods in SEM are Covariance-Based Structural Equation Modeling (CB-SEM) and Partial Least Squares Path Modeling (PLS-PM). There have been continuous debates among researchers about the use of PLS-PM over CB-SEM, yet few studies have tested the bias of CB-SEM and PLS-PM in estimating simulated data. This study intends to address this gap by a) developing the Comparative Bias Index and b) testing the performance of CB-SEM and PLS-PM using the developed index. Based on a balanced experimental design, two multivariate normal simulated data sets with distinct specifications, of sizes 50, 100, 200, and 500, are generated and analyzed using CB-SEM and PLS-PM.
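The abstract does not give the Comparative Bias Index formula. One plausible reading, the mean absolute relative deviation of replicated estimates from the known population parameters, can be sketched as follows (the function name, its definition, and the numbers are hypothetical illustrations, not the authors' index):

```python
import numpy as np

def comparative_bias_index(estimates, truth):
    """Mean absolute relative bias of replicated parameter estimates
    against known population values (a hypothetical reading of the index)."""
    estimates = np.asarray(estimates, dtype=float)
    truth = np.asarray(truth, dtype=float)
    return float(np.mean(np.abs(estimates - truth) / np.abs(truth)))

# Known simulation loadings and two replications of (made-up) estimates.
true_loadings = np.array([0.7, 0.8, 0.6])
cb_sem_estimates = np.array([[0.69, 0.81, 0.61],
                             [0.71, 0.79, 0.58]])
print(round(comparative_bias_index(cb_sem_estimates, true_loadings), 4))  # -> 0.0173
```

Computed for both CB-SEM and PLS-PM over the same simulated data sets, such an index would let the two estimators' biases be compared on a common scale.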

  15. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and make modeling software available to non-modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now attainable. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general-purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks, which are not generally met by existing frameworks, and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together.
The following codes are now being reengineered within the context of Pyre: Tecton, a 3-D FE visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic earthquake rupture; SNAC, a developing 3-D code based on the FLAC method for visco-elastoplastic deformation; SNARK, a 3-D FE-PIC method for viscoplastic deformation; and gPLATES, an open-source paleogeographic/plate tectonics modeling package. We will demonstrate how codes can be linked with one another, such as a regional and a global model of mantle convection, and a visco-elastoplastic representation of the crust within viscous mantle flow. Finally, we will describe how http://GeoFramework.org has become a distribution site for a suite of modeling software in geophysics.

  16. Documentation of operational protocol for the use of MAMA software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Daniel S.

    2016-01-21

Image analysis of Scanning Electron Microscope (SEM) micrographs is a complex process that can vary significantly between analysts. The factors causing the variation are numerous, and the purpose of Task 2b is to develop and test a set of protocols designed to minimize variation in image analysis between different analysts and laboratories, specifically using the MAMA software package, Version 2.1. The protocols were designed to be "minimally invasive", so that expert SEM operators will not be overly constrained in the way they analyze particle samples. The protocols will be tested using a round-robin approach in which results from expert SEM users at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, Pacific Northwest National Laboratory, Savannah River National Laboratory, and the National Institute of Standards and Technology will be compared. The variation of the results will be used to quantify uncertainty in the particle image analysis process. The round-robin exercise will proceed with 3 levels of rigor, each with its own set of protocols, as described below in Tasks 2b.1, 2b.2, and 2b.3. The uncertainty will be developed using NIST standard reference material SRM 1984 "Thermal Spray Powder – Particle Size Distribution, Tungsten Carbide/Cobalt (Acicular)" [Reference 1]. Full details are available in the Certificate of Analysis, posted on the NIST website (http://www.nist.gov/srm/).
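A typical quantity behind such a round-robin comparison is the particle size distribution derived from segmented micrographs. The standard equivalent-circular-diameter (ECD) transform from measured particle areas can be sketched as follows (illustrative only, not MAMA's actual code):

```python
import numpy as np

def equivalent_circular_diameters(areas_um2):
    """Equivalent circular diameter (ECD) for measured particle areas:
    the diameter of a circle with the same area, a standard size metric
    in particle image analysis."""
    areas = np.asarray(areas_um2, dtype=float)
    return 2.0 * np.sqrt(areas / np.pi)

# Areas of circles with diameters 1, 2, and 4 um recover those diameters.
areas = [np.pi * 0.25, np.pi, 4 * np.pi]
print(equivalent_circular_diameters(areas))  # -> [1. 2. 4.]
```

Because different analysts may segment particle boundaries differently, the same micrograph can yield different area sets, and hence different ECD distributions; that analyst-to-analyst spread is what the round-robin uncertainty quantifies.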

  17. Public Response to a Near-Miss Nuclear Accident Scenario Varying in Causal Attributions and Outcome Uncertainty.

    PubMed

    Cui, Jinshu; Rosoff, Heather; John, Richard S

    2018-05-01

    Many studies have investigated public reactions to nuclear accidents. However, few studies focused on more common events when a serious accident could have happened but did not. This study evaluated public response (emotional, cognitive, and behavioral) over three phases of a near-miss nuclear accident. Simulating a loss-of-coolant accident (LOCA) scenario, we manipulated (1) attribution for the initial cause of the incident (software failure vs. cyber terrorist attack vs. earthquake), (2) attribution for halting the incident (fail-safe system design vs. an intervention by an individual expert vs. a chance coincidence), and (3) level of uncertainty (certain vs. uncertain) about risk of a future radiation leak after the LOCA is halted. A total of 773 respondents were sampled using a 3 × 3 × 2 between-subjects design. Results from both MANCOVA and structural equation modeling (SEM) indicate that respondents experienced more negative affect, perceived more risk, and expressed more avoidance behavioral intention when the near-miss event was initiated by an external attributed source (e.g., earthquake) compared to an internally attributed source (e.g., software failure). Similarly, respondents also indicated greater negative affect, perceived risk, and avoidance behavioral intentions when the future impact of the near-miss incident on people and the environment remained uncertain. Results from SEM analyses also suggested that negative affect predicted risk perception, and both predicted avoidance behavior. Affect, risk perception, and avoidance behavior demonstrated high stability (i.e., reliability) from one phase to the next. © 2017 Society for Risk Analysis.

  18. Wear at the titanium-titanium and the titanium-zirconia implant-abutment interface: a comparative in vitro study.

    PubMed

    Stimmelmayr, Michael; Edelhoff, Daniel; Güth, Jan-Frederik; Erdelt, Kurt; Happe, Arndt; Beuer, Florian

    2012-12-01

The purpose of this study was to determine and measure the wear of the interface between titanium implants and one-piece zirconia abutments in comparison to titanium abutments. Six implants were secured into epoxy resin blocks. The interface of these implants and of 6 corresponding abutments (group Zr: three one-piece zirconia abutments; group Ti: three titanium abutments) was examined by microscope and scanning electron microscopy (SEM). The implants and the abutments were also scanned by 3D micro-computed tomography (CT). The abutments were connected to the implants and cyclically loaded with 1,200,000 cycles at 100 N in a two-axis fatigue testing machine. Afterwards, all specimens were unscrewed, and the implants and abutments were again scanned by microscope, SEM, and CT. The microscope and SEM images were compared, the CT data were superimposed, and the wear was calculated by inspection software. The statistical analysis was carried out with an unpaired t-test. Abutment fracture or screw loosening was not observed during cyclic loading. Comparing the microscope and SEM images, more wear was observed on the implants connected to zirconia abutments. The maximum wear on the implant shoulder calculated by the inspection software was 10.2 μm for group Zr and 0.7 μm for group Ti. The influence of the abutment material on the measured wear was statistically significant (p ≤ 0.001; Levene test). Titanium implants showed higher wear at the implant interface following cyclic loading when connected to one-piece zirconia abutments than when connected to titanium abutments. The clinical relevance is not yet clear; however, damage to the internal implant connection could result in prosthetic failures, up to and including the need for implant removal. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
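The wear calculation from superimposed scans reduces to a pointwise height difference between pre- and post-loading surface maps. A minimal sketch with hypothetical numbers (the study's inspection software and data format are not specified beyond this):

```python
import numpy as np

def max_wear(before, after):
    """Maximum wear depth between superimposed pre- and post-loading
    surface height maps (same units as the inputs, e.g. micrometres)."""
    return float(np.max(before - after))

# Toy 2x2 height maps of an implant shoulder, in micrometres:
before = np.array([[10.0, 10.0],
                   [10.0, 10.0]])   # pristine surface
after = np.array([[10.0, 9.99],
                  [9.8, 10.0]])     # localized material loss after loading
print(round(max_wear(before, after), 2))  # -> 0.2
```

In practice the two scans must first be registered (superimposed) in a common coordinate frame, which is the step the CT superposition in the study performs before the subtraction.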

  19. The SEM Risk Behavior (SRB) Model: A New Conceptual Model of how Pornography Influences the Sexual Intentions and HIV Risk Behavior of MSM

    PubMed Central

    Wilkerson, J. Michael; Iantaffi, Alex; Smolenski, Derek J.; Brady, Sonya S.; Horvath, Keith J.; Grey, Jeremy A.; Rosser, B. R. Simon

    2012-01-01

    While the effects of sexually explicit media (SEM) on heterosexuals’ sexual intentions and behaviors have been studied, little is known about the consumption and possible influence of SEM among men who have sex with men (MSM). Importantly, conceptual models of how Internet-based SEM influences behavior are lacking. Seventy-nine MSM participated in online focus groups about their SEM viewing preferences and sexual behavior. Twenty-three participants reported recent exposure to a new behavior via SEM. Whether participants modified their sexual intentions and/or engaged in the new behavior depended on three factors: arousal when imagining the behavior, pleasure when attempting the behavior, and trust between sex partners. Based on MSM’s experience, we advance a model of how viewing a new sexual behavior in SEM influences sexual intentions and behaviors. The model includes five paths. Three paths result in the maintenance of sexual intentions and behaviors. One path results in a modification of sexual intentions while maintaining previous sexual behaviors, and one path results in a modification of both sexual intentions and behaviors. With this model, researchers have a framework to test associations between SEM consumption and sexual intentions and behavior, and public health programs have a framework to conceptualize SEM-based HIV/STI prevention programs. PMID:23185126

  20. Information or resolution: Which is required from an SEM to study bulk inorganic materials?: Evaluate SEMs’ practical performance

    DOE PAGES

    Xing, Q.

    2016-07-11

Significant technological advances in scanning electron microscopy (SEM) have been achieved over the past years. Different SEMs can have significant differences in functionality and performance. This work presents the perspectives on selecting an SEM for research on bulk inorganic materials. Understanding materials demands quantitative composition and orientation information, and informative and interpretable images that reveal subtle differences in chemistry, orientation/structure, topography, and electronic structure. The capability to yield informative and interpretable images with high signal-to-noise ratios and spatial resolutions is an overall result of the SEM system as a whole, from the electron optical column to the detection system. The electron optical column determines probe performance. The roles of the detection system are to capture, filter or discriminate, and convert signal electrons to imaging information. The capability to control practical operating parameters including electron probe size and current, acceleration voltage or landing voltage, working distance, detector selection, and signal filtration is inherently determined by the SEM itself. As a platform for various accessories, e.g. an energy-dispersive spectrometer and an electron backscatter diffraction detector, the properties of the electron optical column, specimen chamber, and stage greatly affect the performance of accessories. Ease-of-use and ease-of-maintenance are of practical importance. It is practically important to select appropriate test specimens, design suitable imaging conditions, and analyze the specimen chamber geometry and dimensions to assess the overall functionality and performance of an SEM. Finally, for an SEM that is controlled/operated with a computer, the stable software and user-friendly interface significantly affect the usability of the SEM.

  1. Information or resolution: Which is required from an SEM to study bulk inorganic materials?

    PubMed

    Xing, Q

    2016-11-01

Significant technological advances in scanning electron microscopy (SEM) have been achieved over the past years. Different SEMs can have significant differences in functionality and performance. This work presents the perspectives on selecting an SEM for research on bulk inorganic materials. Understanding materials demands quantitative composition and orientation information, and informative and interpretable images that reveal subtle differences in chemistry, orientation/structure, topography, and electronic structure. The capability to yield informative and interpretable images with high signal-to-noise ratios and spatial resolutions is an overall result of the SEM system as a whole, from the electron optical column to the detection system. The electron optical column determines probe performance. The roles of the detection system are to capture, filter or discriminate, and convert signal electrons to imaging information. The capability to control practical operating parameters including electron probe size and current, acceleration voltage or landing voltage, working distance, detector selection, and signal filtration is inherently determined by the SEM itself. As a platform for various accessories, e.g. an energy-dispersive spectrometer and an electron backscatter diffraction detector, the properties of the electron optical column, specimen chamber, and stage greatly affect the performance of accessories. Ease-of-use and ease-of-maintenance are of practical importance. It is practically important to select appropriate test specimens, design suitable imaging conditions, and analyze the specimen chamber geometry and dimensions to assess the overall functionality and performance of an SEM. For an SEM that is controlled/operated with a computer, the stable software and user-friendly interface significantly improve the usability of the SEM. SCANNING 38:864-879, 2016. © 2016 Wiley Periodicals, Inc.

  3. Software Intensive Systems Cost and Schedule Estimation

    DTIC Science & Technology

    2013-06-13


  4. Reporting Subscores Using R: A Software Review

    ERIC Educational Resources Information Center

    Dai, Shenghai; Svetina, Dubravka; Wang, Xiaolin

    2017-01-01

    There is an increasing interest in reporting test subscores for diagnostic purposes. In this article, we review nine popular R packages (subscore, mirt, TAM, sirt, CDM, NPCD, lavaan, sem, and OpenMx) that are capable of implementing subscore-reporting methods within one or more frameworks including classical test theory, multidimensional item…

  5. Implementing Restricted Maximum Likelihood Estimation in Structural Equation Models

    ERIC Educational Resources Information Center

    Cheung, Mike W.-L.

    2013-01-01

    Structural equation modeling (SEM) is now a generic modeling framework for many multivariate techniques applied in the social and behavioral sciences. Many statistical models can be considered either as special cases of SEM or as part of the latent variable modeling framework. One popular extension is the use of SEM to conduct linear mixed-effects…

  6. Spectral-element Method for 3D Marine Controlled-source EM Modeling

    NASA Astrophysics Data System (ADS)

    Liu, L.; Yin, C.; Zhang, B., Sr.; Liu, Y.; Qiu, C.; Huang, X.; Zhu, J.

    2017-12-01

    As one of the predrill reservoir appraisal methods, marine controlled-source EM (MCSEM) has been widely used in mapping oil reservoirs to reduce the risk of deep-water exploration. With the technical development of MCSEM, the need for improved forward-modeling tools has become evident. In this paper we introduce the spectral-element method (SEM) for 3D MCSEM modeling; it combines the flexibility of the finite-element method with the high accuracy of spectral methods. We use the Galerkin weighted-residual method to discretize the vector Helmholtz equation, with curl-conforming Gauss-Lobatto-Chebyshev (GLC) polynomials as vector basis functions. As high-order complete orthogonal polynomials, the GLC basis converges exponentially; this allows the matrix elements to be derived analytically and improves modeling accuracy. Numerical 1D models using SEM at different orders show that the method delivers accurate results, with accuracy improving markedly as the order increases. We further compare SEM with the finite-difference (FD) method for a 3D reservoir model (Figure 1). The results show that SEM is more effective than FD: only when the mesh is fine enough can FD achieve the same accuracy, so for the same precision SEM greatly reduces the degrees of freedom and cost. Numerical experiments with different models (not shown here) demonstrate that SEM is an efficient and effective tool for MCSEM modeling with significant advantages over traditional numerical methods. This research is supported by the Key Program of the National Natural Science Foundation of China (41530320), the China Natural Science Foundation for Young Scientists (41404093), and the Key National Research Project of China (2016YFC0303100, 2017YFC0601900).
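    The Gauss-Lobatto-Chebyshev collocation points mentioned above, and the exponential convergence the abstract attributes to them, can be illustrated with a short sketch. This is a minimal 1D interpolation demonstration, not the paper's electromagnetic solver; the function names are illustrative.

```python
import numpy as np

def glc_points(n):
    """Gauss-Lobatto-Chebyshev collocation points x_j = cos(pi*j/n) on [-1, 1]."""
    return np.cos(np.pi * np.arange(n + 1) / n)

def interp_error(f, n):
    """Max error of the degree-n polynomial interpolant of f at GLC points,
    evaluated on a dense grid."""
    dense = np.linspace(-1.0, 1.0, 1001)
    x = glc_points(n)
    coeffs = np.polyfit(x, f(x), n)          # interpolant through the GLC nodes
    return np.max(np.abs(np.polyval(coeffs, dense) - f(dense)))

# For a smooth function the error falls off rapidly with increasing order,
# the "exponential convergence" property exploited by spectral elements.
err4 = interp_error(np.exp, 4)
err8 = interp_error(np.exp, 8)
```

    Note that the GLC nodes include both endpoints, which is what makes it straightforward to enforce continuity between adjacent spectral elements.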

  7. Reliability of measuring sciatic and tibial nerve movement with diagnostic ultrasound during a neural mobilisation technique.

    PubMed

    Ellis, Richard; Hing, Wayne; Dilley, Andrew; McNair, Peter

    2008-08-01

    Diagnostic ultrasound provides a technique whereby real-time, in vivo analysis of peripheral nerve movement is possible. This study measured sciatic nerve movement during a "slider" neural mobilisation technique (ankle dorsiflexion/plantar flexion and cervical extension/flexion). Transverse and longitudinal movement was assessed from still ultrasound images and video sequences by using frame-by-frame cross-correlation software. Sciatic nerve movement was recorded in the transverse and longitudinal planes. For transverse movement, at the posterior midthigh (PMT) the mean value of lateral sciatic nerve movement was 3.54 mm (standard error of measurement [SEM] +/- 1.18 mm) compared with anterior-posterior/vertical (AP) movement of 1.61 mm (SEM +/- 0.78 mm). At the popliteal crease (PC) scanning location, lateral movement was 6.62 mm (SEM +/- 1.10 mm) compared with AP movement of 3.26 mm (SEM +/- 0.99 mm). Mean longitudinal sciatic nerve movement at the PMT was 3.47 mm (SEM +/- 0.79 mm; n = 27) compared with the PC of 5.22 mm (SEM +/- 0.05 mm; n = 3). The reliability of ultrasound measurement of transverse sciatic nerve movement was fair to excellent (Intraclass correlation coefficient [ICC] = 0.39-0.76) compared with excellent (ICC = 0.75) for analysis of longitudinal movement. Diagnostic ultrasound presents a reliable, noninvasive, real-time, in vivo method for analysis of sciatic nerve movement.

  8. Synthesis and characterization of nanocrystalline graphite from coconut shell with heating process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wachid, Frischa M.; Perkasa, Adhi Y.; Prasetya, Fandi A.

    Graphite was synthesized by heating coconut shell at varying temperatures (400, 800, and 1000°C) and holding times (3 and 5 hours). After the heating process, the samples were characterized by X-ray diffraction (XRD), analyzed with X'Pert HighScore Plus software, and examined by scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM-EDX) and transmission electron microscopy with energy-dispersive X-ray spectroscopy (TEM-EDX). Graphite and lonsdaleite phases were identified by XRD. According to the EDX analysis, the sample heated at 1000°C had the highest carbon content. Amorphous carbon and nanocrystalline graphite were observed by SEM-EDX and TEM-EDX.

  9. The relationship between cost estimates reliability and BIM adoption: SEM analysis

    NASA Astrophysics Data System (ADS)

    Ismail, N. A. A.; Idris, N. H.; Ramli, H.; Rooshdi, R. R. Raja Muhammad; Sahamir, S. R.

    2018-02-01

    This paper presents the use of a Structural Equation Modelling (SEM) approach to analyse the effects of Building Information Modelling (BIM) technology adoption on the reliability of cost estimates. Based on questionnaire survey results, SEM analysis using the SPSS-AMOS application examined the relationships between BIM-improved information and cost-estimate reliability factors leading to BIM technology adoption. Six hypotheses were established prior to SEM analysis, employing two types of SEM models: the Confirmatory Factor Analysis (CFA) model and the full structural model. The SEM models were then validated by assessing their uni-dimensionality, validity, reliability, and fitness indices, in line with the hypotheses tested. The final SEM model fit measures are: P-value=0.000, RMSEA=0.079<0.08, GFI=0.824, CFI=0.962>0.90, TLI=0.956>0.90, NFI=0.935>0.90, and ChiSq/df=2.259, indicating that the overall index values achieved the required level of model fitness. The model supports all the hypotheses evaluated, confirming that all relationships among the constructs are positive and significant. Ultimately, the analysis verified that most respondents foresee a better understanding of project input information through BIM visualization, its reliable database, and coordinated data in developing more reliable cost estimates. They also expect BIM adoption to accelerate their cost-estimating tasks.
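    The fit-index cutoffs quoted above can be expressed as a small illustrative check. The helper names are hypothetical, and cutoff values vary across SEM references; this sketch only encodes the thresholds the abstract itself cites (RMSEA < 0.08; CFI, TLI, NFI > 0.90).

```python
# Cutoffs as cited in the abstract; other SEM texts use different values.
CUTOFFS = {
    "RMSEA": lambda v: v < 0.08,
    "CFI":   lambda v: v > 0.90,
    "TLI":   lambda v: v > 0.90,
    "NFI":   lambda v: v > 0.90,
}

def assess_fit(indices):
    """Return {index_name: passes_cutoff} for each recognized fit index."""
    return {k: CUTOFFS[k](v) for k, v in indices.items() if k in CUTOFFS}

# The final model's reported indices:
reported = {"RMSEA": 0.079, "CFI": 0.962, "TLI": 0.956, "NFI": 0.935}
verdict = assess_fit(reported)
```

    Such a helper makes the acceptance logic explicit and repeatable when several candidate models are screened against the same thresholds.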

  10. Assessing the utility of FIB-SEM images for shale digital rock physics

    NASA Astrophysics Data System (ADS)

    Kelly, Shaina; El-Sobky, Hesham; Torres-Verdín, Carlos; Balhoff, Matthew T.

    2016-09-01

    Shales and other unconventional or low-permeability (tight) reservoirs house vast quantities of hydrocarbons, often demonstrate considerable water uptake, and are potential repositories for fluid sequestration. The pore-scale topology and fluid transport mechanisms within these nanoporous sedimentary rocks remain to be fully understood. Image-informed pore-scale models are useful tools for studying porous media; a debated question in shale pore-scale petrophysics is whether there is a representative elementary volume (REV) for shale models and, if an REV exists, how it differs among petrophysical properties. We obtain three-dimensional (3D) models of the topology of microscale shale volumes from image analysis of focused ion beam-scanning electron microscope (FIB-SEM) image stacks and investigate the utility of these models as a potential REV for shale. The data used in this work include multiple local groups of neighboring FIB-SEM images of different microscale sizes, corresponding core-scale (milli- and centimeter) laboratory data, and, for comparison, series of two-dimensional (2D) cross sections from broad ion beam SEM (BIB-SEM) images, which capture a larger microscale field of view than the FIB-SEM images; this array of data is larger than that of the majority of investigations with FIB-SEM-derived microscale models of shale. Properties such as porosity, organic matter content, and pore connectivity are extracted from each model. Permeability is assessed with single-phase, pressure-driven flow simulations in the connected pore space of the models using the lattice-Boltzmann method. Calculated petrophysical properties are compared to those of neighboring FIB-SEM images and to core-scale measurements of the sample associated with the FIB-SEM sites. Results indicate that FIB-SEM images below ∼5000 μm3 in volume (the largest volume analyzed) are not a suitable REV for shale permeability and pore-scale networks; i.e., field of view is sacrificed in favor of detailed, but often unconnected, nanopore morphology. Further, we find it necessary to acquire several local FIB-SEM or BIB-SEM images and correlate their extracted geometric properties to improve the likelihood of achieving representative values of porosity and organic matter volume. Our work indicates that FIB-SEM images of microscale volumes of shale are a qualitative tool for petrophysical and transport analysis. Finally, we offer alternatives for quantitative pore-scale assessments of shale.

  11. The assessment of the performance of covariance-based structural equation modeling and partial least square path modeling

    NASA Astrophysics Data System (ADS)

    Aimran, Ahmad Nazim; Ahmad, Sabri; Afthanorhan, Asyraf; Awang, Zainudin

    2017-05-01

    Structural equation modeling (SEM) is a second-generation statistical analysis technique developed for analyzing the inter-relationships among multiple variables in a model. Previous studies suggest at least an implicit agreement about the factors that should drive the choice between covariance-based structural equation modeling (CB-SEM) and partial least squares path modeling (PLS-PM). PLS-PM appears to be preferred by previous scholars because of its less stringent assumptions and the desire to avoid the perceived difficulties of CB-SEM, and this has been accompanied by increasing debate among researchers on the use of CB-SEM versus PLS-PM. The present confirmatory study assesses the performance of CB-SEM and PLS-PM, with findings intended to contribute to the body of knowledge of SEM. Maximum likelihood (ML) was chosen as the estimator for CB-SEM and was expected to be more powerful than PLS-PM. Based on a balanced experimental design, multivariate normal data with specified population parameters and sample sizes were generated using Pro-Active Monte Carlo simulation, and the data were analyzed using AMOS for CB-SEM and SmartPLS for PLS-PM. The Comparative Bias Index (CBI), construct relationships, average variance extracted (AVE), composite reliability (CR), and the Fornell-Larcker criterion were used to compare the estimators. The findings conclude that CB-SEM performed notably better than PLS-PM for large sample sizes (100 and above), particularly in terms of estimation accuracy and consistency.

  12. On the Benefits of Latent Variable Modeling for Norming Scales: The Case of the "Supports Intensity Scale-Children's Version"

    ERIC Educational Resources Information Center

    Seo, Hyojeong; Little, Todd D.; Shogren, Karrie A.; Lang, Kyle M.

    2016-01-01

    Structural equation modeling (SEM) is a powerful and flexible analytic tool to model latent constructs and their relations with observed variables and other constructs. SEM applications offer advantages over classical models in dealing with statistical assumptions and in adjusting for measurement error. So far, however, SEM has not been fully used…

  13. Bootstrap Estimation of Sample Statistic Bias in Structural Equation Modeling.

    ERIC Educational Resources Information Center

    Thompson, Bruce; Fan, Xitao

    This study empirically investigated bootstrap bias estimation in the area of structural equation modeling (SEM). Three correctly specified SEM models were used under four different sample size conditions. Monte Carlo experiments were carried out to generate the criteria against which bootstrap bias estimation should be judged. For SEM fit indices,…

  15. Single-entry models (SEMs) for scheduled services: Towards a roadmap for the implementation of recommended practices.

    PubMed

    Lopatina, Elena; Damani, Zaheed; Bohm, Eric; Noseworthy, Tom W; Conner-Spady, Barbara; MacKean, Gail; Simpson, Chris S; Marshall, Deborah A

    2017-09-01

    Long waiting times for elective services continue to be a challenging issue. Single-entry models (SEMs) are used to increase access to and flow through the healthcare system. This paper provides a roadmap for healthcare decision-makers, managers, physicians, and researchers to guide implementation and management of successful and sustainable SEMs. The roadmap was informed by an inductive qualitative synthesis of the findings from a deliberative process (a symposium on SEMs, with clinicians, researchers, senior policy-makers, healthcare managers, and patient representatives) and focus groups with the symposium participants. SEMs are a promising strategy to improve the management of referrals and represent one approach to reduce waiting times. The SEMs roadmap outlines current knowledge about SEMs and critical success factors for SEMs' implementation and management. This SEM roadmap is intended to help clinicians, decision-makers, managers, and researchers interested in developing new or strengthening existing SEMs. We consider this roadmap to be a living document that will continue to evolve as we learn more about implementing and managing sustainable SEMs. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. A Two-Stage Approach to Synthesizing Covariance Matrices in Meta-Analytic Structural Equation Modeling

    ERIC Educational Resources Information Center

    Cheung, Mike W. L.; Chan, Wai

    2009-01-01

    Structural equation modeling (SEM) is widely used as a statistical framework to test complex models in behavioral and social sciences. When the number of publications increases, there is a need to systematically synthesize them. Methodology of synthesizing findings in the context of SEM is known as meta-analytic SEM (MASEM). Although correlation…

  17. Multimodal Hierarchical Imaging of Serial Sections for Finding Specific Cellular Targets within Large Volumes

    PubMed Central

    Wacker, Irene U.; Veith, Lisa; Spomer, Waldemar; Hofmann, Andreas; Thaler, Marlene; Hillmer, Stefan; Gengenbach, Ulrich; Schröder, Rasmus R.

    2018-01-01

    Targeting specific cells at ultrastructural resolution within a mixed cell population or a tissue can be achieved by hierarchical imaging using a combination of light and electron microscopy. Samples embedded in resin are sectioned into arrays consisting of ribbons of hundreds of ultrathin sections and deposited on pieces of silicon wafer or conductively coated coverslips. Arrays are imaged at low resolution using a consumer digital camera (e.g., a smartphone camera) or a light microscope (LM) for a rapid large-area overview, or a widefield fluorescence light microscope (FLM) after labeling with fluorophores. After post-staining with heavy metals, arrays are imaged in a scanning electron microscope (SEM). Targets can be selected from 3D reconstructions generated by FLM, or from 3D reconstructions made from the SEM image stacks at intermediate resolution if no fluorescent markers are available. For ultrastructural analysis, selected targets are finally recorded in the SEM at high resolution (image pixels of a few nanometers). A ribbon-handling tool that can be retrofitted to any ultramicrotome is demonstrated; it helps with array production and substrate removal from the sectioning knife boat. A software platform that allows automated imaging of arrays in the SEM is discussed. Compared to other methods generating large-volume EM data, such as serial block-face SEM (SBF-SEM) or focused ion beam SEM (FIB-SEM), this approach has two major advantages: (1) the resin-embedded sample is conserved, albeit in a sliced-up version, and can be stained in different ways and imaged at different resolutions; (2) as the sections can be post-stained, it is not necessary to use samples strongly block-stained with heavy metals to introduce contrast for SEM imaging or to render the tissue blocks conductive. This makes the method applicable to a wide variety of materials and biological questions. In particular, pre-fixed materials, e.g., from biopsy banks and pathology labs, can be directly embedded and reconstructed in 3D. PMID:29630046

  18. Development of X-ray micro-focus computed tomography to image and quantify biofilms in central venous catheter models in vitro.

    PubMed

    Niehaus, Wilmari L; Howlin, Robert P; Johnston, David A; Bull, Daniel J; Jones, Gareth L; Calton, Elizabeth; Mavrogordato, Mark N; Clarke, Stuart C; Thurner, Philipp J; Faust, Saul N; Stoodley, Paul

    2016-09-01

    Bacterial infections of central venous catheters (CVCs) cause much morbidity and mortality and are usually diagnosed by concordant culture of blood and catheter tip. However, studies suggest that culture often fails to detect biofilm bacteria. This study optimizes X-ray micro-focus computed tomography (X-ray µCT) for quantifying biofilms and determining their distribution and heterogeneity in in vitro CVC model systems. Bacterial culture and scanning electron microscopy (SEM) were used to detect Staphylococcus epidermidis ATCC 35984 biofilms grown on catheters in vitro in both flow and static biofilm models. Alongside this, X-ray µCT techniques were developed to detect biofilms inside CVCs. Various contrast-agent stains were evaluated using energy-dispersive X-ray spectroscopy (EDS) to further optimize these methods. Catheter material and biofilm were segmented using a semi-automated MATLAB script and quantified using the Avizo Fire software package. X-ray µCT was capable of distinguishing the degree of biofilm formation across different segments of a CVC flow model. EDS screening of single- and dual-compound contrast stains identified 10 nm gold and silver nitrate as the optimum contrast agents for X-ray µCT. The optimized method was then shown to quantify biofilms in an in vitro static biofilm-formation model, with a strong correlation between biofilm detection via SEM and culture. X-ray µCT has good potential as a direct, non-invasive, non-destructive technology to image biofilms in CVCs, as well as in other in vivo medical components in which biofilms accumulate in concealed areas.

  19. Structural equation modeling in pediatric psychology: overview and review of applications.

    PubMed

    Nelson, Timothy D; Aylward, Brandon S; Steele, Ric G

    2008-08-01

    To describe the use of structural equation modeling (SEM) in the Journal of Pediatric Psychology (JPP) and to discuss the usefulness of SEM applications in pediatric psychology research. The use of SEM in JPP between 1997 and 2006 was examined and compared to leading journals in clinical psychology, clinical child psychology, and child development. SEM techniques were used in <4% of the empirical articles appearing in JPP between 1997 and 2006. SEM was used less frequently in JPP than in other clinically relevant journals over the past 10 years. However, results indicated a recent increase in JPP studies employing SEM techniques. SEM is an under-utilized class of techniques within pediatric psychology research, although investigations employing these methods are becoming more prevalent. Despite its infrequent use to date, SEM is a potentially useful tool for advancing pediatric psychology research with a number of advantages over traditional statistical methods.

  20. Cross-instrument Analysis Correlation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, Timothy R.

    This program has been designed to assist with tracking a sample from one analytical instrument to another, such as SEMs, microscopes, micro X-ray diffraction, and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software makes it easy to enter the positions of fiducials and locations of interest so that, in a future session on the same or a different instrument, the positions of interest can be re-found: the known fiducial locations in the current and reference sessions are used to transform each point into the current session's coordinate system. The software is dialog-box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text-based Extensible Markup Language (XML) files.
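    The fiducial-based re-location described above can be sketched as a least-squares affine transform between sessions. This is a minimal illustration under assumed conventions, not the program's actual implementation; all names and coordinates are hypothetical.

```python
import numpy as np

def fit_affine(ref_pts, cur_pts):
    """Least-squares 2D affine transform mapping ref_pts -> cur_pts.
    Each argument is an (N, 2) list/array of matching fiducial coordinates, N >= 3."""
    ref = np.asarray(ref_pts, dtype=float)
    cur = np.asarray(cur_pts, dtype=float)
    A = np.hstack([ref, np.ones((len(ref), 1))])   # rows of [x, y, 1]
    M, *_ = np.linalg.lstsq(A, cur, rcond=None)    # 3x2 transform matrix
    return M

def apply_affine(M, pt):
    """Map a point of interest from the reference session into the current one."""
    x, y = pt
    return np.array([x, y, 1.0]) @ M

# Hypothetical fiducials: the current session is rotated 90 degrees and
# shifted by (+5, -2) relative to the reference session.
ref = [(0, 0), (10, 0), (0, 10)]
cur = [(5, -2), (5, 8), (-5, -2)]
M = fit_affine(ref, cur)
poi = apply_affine(M, (10, 10))   # re-locate a stored point of interest
```

    With three or more non-collinear fiducials the affine fit absorbs translation, rotation, and stage-scale differences between sessions; extra fiducials are averaged out by the least-squares solve.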

  1. 3D profilometric characterization of the aged skin surface using a skin replica and alicona Mex software.

    PubMed

    Pirisinu, Marco; Mazzarello, Vittorio

    2016-05-01

    The skin's surface is characterized by a network of furrows and wrinkles of differing height and depth. Various studies have shown that processes such as aging, photoaging, and cancer may alter the dermal surface ultrastructure, and the quantitative analysis of skin topography is a key point for understanding the health condition of the skin. Here, for the first time, the fine structure of the skin was studied via a new approach in which the replica method was combined with Alicona MeX software and scanning electron microscopy (SEM). The skin texture of the cheek and forearm was studied in 120 healthy Sardinian volunteers, divided into three age groups. The skin areas of interest were reproduced by the silicone replica method; each replica was examined by SEM and digital images were taken. Using Alicona MeX software, 3D images were created and a list of 24 surface texture parameters was obtained, from which the most representative were chosen to assess changes between groups. The skin texture of the forearm and cheek showed a gradual loss of its typical polyhedric mesh with increasing age group; in particular, photoexposure increased the loss of dermal texture. Until now, Alicona MeX technology has been used exclusively in palaeontology studies; our results show that a detailed analysis of skin texture can be performed with it and support Alicona MeX software as a promising new tool in dermatological research. This new analytical approach provides an easy and fast process for assessing skin texture and its changes using high-quality 3D images. SCANNING 38:213-220, 2016. © 2015 Wiley Periodicals, Inc.

  2. Pornography, sexual socialization, and satisfaction among young men.

    PubMed

    Stulhofer, Aleksandar; Busko, Vesna; Landripet, Ivan

    2010-02-01

    In spite of the growing presence of pornography in contemporary life, little is known about its potential effects on young people's sexual socialization and sexual satisfaction. In this article, we present a theoretical model of the effects of sexually explicit materials (SEM) mediated by sexual scripting and moderated by the type of SEM used. An online survey dataset of 650 young Croatian men aged 18-25 years was used to explore the model empirically. Descriptive findings pointed to significant differences between mainstream and paraphilic SEM users in frequency of SEM use at the age of 14, current SEM use, frequency of masturbation, sexual boredom, acceptance of sex myths, and sexual compulsiveness. In testing the model, a novel instrument was used, the Sexual Scripts Overlap Scale, designed to measure the influence of SEM on sexual socialization. Structural equation analyses suggested that the negative effects of early exposure to SEM on young men's sexual satisfaction, albeit small, could be stronger than the positive effects. Both positive and negative effects (the latter expressed through suppression of intimacy) were observed only among users of paraphilic SEM; no effect of early exposure was found among mainstream SEM users. To counterbalance moral panic, but also the glamorization of pornography, sex education programs should incorporate content that increases media literacy and assists young people in critically interpreting pornographic imagery.

  3. Measurements of Weight Bearing Asymmetry Using the Nintendo Wii Fit Balance Board Are Not Reliable for Older Adults and Individuals With Stroke.

    PubMed

    Liuzzo, Derek M; Peters, Denise M; Middleton, Addie; Lanier, Wes; Chain, Rebecca; Barksdale, Brittany; Fritz, Stacy L

    Clinicians and researchers have used bathroom scales, balance performance monitors with feedback, postural scale analysis, and force platforms to evaluate weight bearing asymmetry (WBA). Now video game consoles offer a novel alternative for assessing this construct. By using specialized software, the Nintendo Wii Fit balance board can provide reliable measurements of WBA in healthy, young adults. However, the reliability of measurements obtained using only the factory settings to assess WBA in older adults and individuals with stroke has not been established. To determine whether measurements of WBA obtained using the Nintendo Wii Fit balance board and default settings are reliable in older adults and individuals with stroke. Weight bearing asymmetry was assessed using the Nintendo Wii Fit balance board in 2 groups of participants: individuals older than 65 years (n = 41) and individuals with stroke (n = 41). Participants were given a standardized set of instructions and were not provided auditory or visual feedback. Two trials were performed. Intraclass correlation coefficients (ICC), standard error of measure (SEM), and minimal detectable change (MDC) scores were determined for each group. The ICC for the older adults sample was 0.59 (0.35-0.76) with SEM95 = 6.2% and MDC95 = 8.8%. The ICC for the sample including individuals with stroke was 0.60 (0.47-0.70) with SEM95 = 9.6% and MDC95 = 13.6%. Although measurements of WBA obtained using the Nintendo Wii Fit balance board and its default factory settings demonstrate moderate reliability in older adults and individuals with stroke, the relatively high associated SEM and MDC values substantially reduce the clinical utility of the Nintendo Wii Fit balance board as an assessment tool for WBA. Weight bearing asymmetry cannot be measured reliably in older adults and individuals with stroke using the Nintendo Wii Fit balance board without the use of specialized software.
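    The SEM and MDC statistics reported above are conventionally derived from the ICC and the sample standard deviation. A minimal sketch, assuming the common definitions SEM = SD·sqrt(1 − ICC) and MDC95 = 1.96·sqrt(2)·SEM; note that the abstract's SEM95/MDC95 notation suggests the 1.96 factor was already folded into its SEM95 values, so this sketch illustrates the definitions rather than reproducing the reported numbers. The SD value below is hypothetical.

```python
import math

def sem_from_icc(sd, icc):
    """Standard error of measurement: SEM = SD * sqrt(1 - ICC)."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sem):
    """Minimal detectable change at 95% confidence: 1.96 * sqrt(2) * SEM."""
    return 1.96 * math.sqrt(2.0) * sem

# Hypothetical SD of WBA scores (in %) paired with the study's older-adult ICC.
sem = sem_from_icc(sd=9.7, icc=0.59)
change_needed = mdc95(sem)   # smallest change exceeding measurement noise
```

    The practical reading is the same as the abstract's: the lower the ICC, the larger the SEM and MDC, and the bigger a true change must be before it can be distinguished from measurement error.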

  4. Measurements of Weight Bearing Asymmetry Using the Nintendo Wii Fit Balance Board Are Not Reliable for Older Adults and Individuals With Stroke

    PubMed Central

    Liuzzo, Derek M.; Peters, Denise M.; Middleton, Addie; Lanier, Wes; Chain, Rebecca; Barksdale, Brittany; Fritz, Stacy L.

    2015-01-01

    Background Clinicians and researchers have used bathroom scales, balance performance monitors with feedback, postural scale analysis, and force platforms to evaluate weight bearing asymmetry (WBA). Now video game consoles offer a novel alternative for assessing this construct. By using specialized software, the Nintendo Wii Fit balance board can provide reliable measurements of WBA in healthy, young adults. However, reliability of measurements obtained using only the factory settings to assess WBA in older adults and individuals with stroke has not been established. Purpose To determine whether measurements of WBA obtained using the Nintendo Wii Fit balance board and default settings are reliable in older adults and individuals with stroke. Methods Weight bearing asymmetry was assessed using the Nintendo Wii Fit balance board in 2 groups of participants—individuals older than 65 years (n = 41) and individuals with stroke (n = 41). Participants were given a standardized set of instructions and were not provided auditory or visual feedback. Two trials were performed. Intraclass correlation coefficients (ICC), standard error of measure (SEM), and minimal detectable change (MDC) scores were determined for each group. Results The ICC for the older adults sample was 0.59 (0.35–0.76) with SEM95= 6.2% and MDC95= 8.8%. The ICC for the sample including individuals with stroke was 0.60 (0.47–0.70) with SEM95= 9.6% and MDC95= 13.6%. Discussion Although measurements of WBA obtained using the Nintendo Wii Fit balance board, and its default factory settings, demonstrate moderate reliability in older adults and individuals with stroke, the relatively high associated SEM and MDC values substantially reduce the clinical utility of the Nintendo Wii Fit balance board as an assessment tool for WBA. 
Conclusions Weight bearing asymmetry cannot be measured reliably in older adults and individuals with stroke using the Nintendo Wii Fit balance board without the use of specialized software. PMID:26288237

  5. SEM and AFM studies of dip-coated CuO nanofilms.

    PubMed

    Dhanasekaran, V; Mahalingam, T; Ganesan, V

    2013-01-01

    Cupric oxide (CuO) semiconducting thin films were prepared at various copper sulfate concentrations by dip coating. Varying the copper sulfate concentration yielded films with thicknesses in the range 445-685 nm, as measured by surface profilometry. X-ray diffraction patterns revealed that the deposited films were polycrystalline, with a monoclinic structure and a (-111) preferred orientation. The surface morphology and topography of the monoclinic-phase CuO thin films were examined using scanning electron microscopy (SEM) and atomic force microscopy (AFM), respectively. The surface roughness profile was plotted using WSxM software, and the estimated surface roughness was ∼19.4 nm at 30 mM molar concentration. Nanosheet-shaped grains were observed in the SEM and AFM studies. EDX showed stoichiometric compound formation for the film prepared at a 30 mM copper sulfate concentration. The indirect band gap energy of the CuO films increased from 1.08 to 1.20 eV with increasing copper sulfate concentration. Copyright © 2012 Wiley Periodicals, Inc.
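    The ∼19.4 nm figure above is the kind of value produced by an RMS roughness calculation on an AFM height map. A minimal sketch assuming a plane-fit-then-RMS definition of roughness (an illustration of the general technique, not WSxM's actual implementation):

```python
import numpy as np

def rms_roughness(z):
    """RMS roughness of a 2D height map after removing the best-fit mean plane
    (so sample tilt does not inflate the roughness value)."""
    z = np.asarray(z, dtype=float)
    ny, nx = z.shape
    X, Y = np.meshgrid(np.arange(nx), np.arange(ny))
    A = np.column_stack([X.ravel(), Y.ravel(), np.ones(z.size)])
    plane, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)  # fit a*x + b*y + c
    resid = z.ravel() - A @ plane                          # tilt-corrected heights
    return np.sqrt(np.mean(resid ** 2))

# A perfectly tilted plane has zero roughness once the tilt is removed.
tilted = np.add.outer(np.arange(8) * 0.5, np.arange(8) * 0.2)
flat_rq = rms_roughness(tilted)
```

    In practice the input would be the AFM height array (in nm) exported from the instrument; the plane subtraction step is what makes the result comparable between scans taken at different sample tilts.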

  6. Acquisition of a High Resolution Field Emission Scanning Electron Microscope for the Analysis of Returned Samples

    NASA Technical Reports Server (NTRS)

    Nittler, Larry R.

    2003-01-01

This grant furnished funds to purchase a state-of-the-art scanning electron microscope (SEM) to support our analytical facilities for extraterrestrial samples. After evaluating several instruments, we purchased a JEOL 6500F thermal field emission SEM with the following analytical accessories: EDAX energy-dispersive x-ray analysis system with fully automated control of instrument and sample stage; EDAX LEXS wavelength-dispersive x-ray spectrometer for high sensitivity light-element analysis; EDAX/TSL electron backscatter diffraction (EBSD) system with software for phase identification and crystal orientation mapping; Robinson backscatter electron detector; and an in situ micro-manipulator (Kleindiek). The total price was $550,000 (with $150,000 of the purchase supported by Carnegie Institution matching funds). The microscope was delivered in October 2002, and most of the analytical accessories were installed by January 2003. With the exception of the wavelength spectrometer (which has been undergoing design changes) everything is working well and the SEM is in routine use in our laboratory.

  7. Some Esoteric Aspects of SEM that Its Practitioners Should Want to Know

    ERIC Educational Resources Information Center

    Rozeboom, William W.

    2009-01-01

    The topic of this article is the interpretation of structural equation modeling (SEM) solutions. Its purpose is to augment structural modeling's metatheoretic resources while enhancing awareness of how problematic is the causal significance of SEM-parameter solutions. Part I focuses on the nonuniqueness and consequent dubious interpretability of…

  8. Experimental sharp force injuries to ribs: Multimodal morphological and geometric morphometric analyses using micro-CT, macro photography and SEM.

    PubMed

    Komo, Larissa; Grassberger, Martin

    2018-07-01

Tool marks on bones induced by knife blades can be analysed morphometrically in order to enable allocation of the suspected "inflicting weapon" to the particular morphology of the bone lesions. Until now, geometric morphometrics has not been used to analyse the morphology of knife lesions on fleshed bones in detail. Using twelve experimental knives and a drop weight tower, stab/cut injuries were inflicted on untreated pig ribs. The morphology of the experimentally produced lesions was subsequently recorded with three imaging techniques (μCT, macro photography and SEM) and analysed with different morphometric software (Amira, tps and Morpheus). Based on the measured distances between the walls of the kerf marks, which corresponded to the thickness of the blade, the respective blade thickness could be inferred with a maximum deviation of ±0.35 mm, allowing the injuries to be matched to the knives. Upon reanalysis after maceration, an average shrinkage factor of up to 8.6% was observed. Among the three imaging techniques used in this study, μCT was the most accurate and efficient, particularly because it was the only non-destructive modality able to document injuries without maceration, even though μCT is more expensive and time-consuming, as well as less accessible, than a macro SLR camera or an SEM. For optimal characterization of the blade and kerf shapes, the software tps proved to be the best choice. Accordingly, geometric morphometrics could serve as a tool in forensic investigations concerning kerf marks. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Bayesian Data-Model Fit Assessment for Structural Equation Modeling

    ERIC Educational Resources Information Center

    Levy, Roy

    2011-01-01

    Bayesian approaches to modeling are receiving an increasing amount of attention in the areas of model construction and estimation in factor analysis, structural equation modeling (SEM), and related latent variable models. However, model diagnostics and model criticism remain relatively understudied aspects of Bayesian SEM. This article describes…

  10. Overlay of multiframe SEM images including nonlinear field distortions

    NASA Astrophysics Data System (ADS)

    Babin, S.; Borisov, S.; Ivonin, I.; Nakazawa, S.; Yamazaki, Y.

    2018-03-01

To reduce charging and shrinkage, CD-SEMs utilize low electron energies and multiframe imaging. As a result, each successive frame is altered by stage and beam instability, as well as by charging. Simple averaging of the frames blurs the edges, which directly affects the extracted values of critical dimensions. A technique was developed to overlay multiframe images without loss of quality. The method accounts for drift, rotation, and magnification corrections, as well as nonlinear distortions due to wafer charging. A significant improvement in the signal-to-noise ratio and overall image quality was achieved without degradation of feature edge quality. The developed software is capable of working with regular and large-size images up to 32K pixels in each direction.
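The overlay method described above corrects drift, rotation, magnification, and nonlinear charging distortion; the translational (drift) component alone can be sketched with phase correlation before averaging. This is an illustrative simplification, not the authors' full algorithm, and handles only integer-pixel circular shifts:

```python
import numpy as np

def estimate_shift(ref, frame):
    """Integer-pixel shift of `frame` relative to `ref` via phase correlation."""
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(frame)
    cross /= np.abs(cross) + 1e-12           # keep only the phase information
    corr = np.fft.ifft2(cross).real          # delta-like peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks past the midpoint back to negative shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

def align_and_average(frames):
    """Undo each frame's drift relative to the first frame, then average."""
    ref = np.asarray(frames[0], dtype=float)
    acc = np.zeros_like(ref)
    for f in frames:
        dy, dx = estimate_shift(ref, f)
        acc += np.roll(f, (-dy, -dx), axis=(0, 1))
    return acc / len(frames)
```

Aligning before averaging preserves edge sharpness that plain frame averaging would blur, which is the failure mode the abstract describes.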

  11. USING STRUCTURAL EQUATION MODELING TO INVESTIGATE RELATIONSHIPS AMONG ECOLOGICAL VARIABLES

    EPA Science Inventory

This paper gives an introductory account of Structural Equation Modeling (SEM) and demonstrates its application using LISREL with a model utilizing environmental data. Using nine EMAP data variables, we analyzed their correlation matrix with an SEM model. The model characterized...

  12. Using structural equation modeling for network meta-analysis.

    PubMed

    Tu, Yu-Kang; Wu, Yun-Chun

    2017-07-14

Network meta-analysis overcomes the limitations of traditional pair-wise meta-analysis by incorporating all available evidence into a general statistical framework for simultaneous comparisons of several treatments. Currently, network meta-analyses are undertaken either within Bayesian hierarchical linear models or frequentist generalized linear mixed models. Structural equation modeling (SEM) is a statistical method originally developed for modeling causal relations among observed and latent variables. Because the random effect is explicitly modeled as a latent variable in SEM, analysts have great flexibility to specify complex random effect structures and to impose linear and nonlinear constraints on parameters. The aim of this article is to show how to undertake a network meta-analysis within the statistical framework of SEM. We used an example dataset to demonstrate that the standard fixed and random effect network meta-analysis models can be easily implemented in SEM. It contains the results of 26 studies that directly compared three treatment groups, A, B and C, for prevention of first bleeding in patients with liver cirrhosis. We also showed that a new approach to network meta-analysis based on the unrestricted weighted least squares (UWLS) method can be undertaken using SEM. For both the fixed and random effect network meta-analyses, SEM yielded coefficients and confidence intervals similar to those reported in the previous literature. The point estimates of the two UWLS models were identical to those in the fixed effect model, but the confidence intervals were greater. This is consistent with results from the traditional pairwise meta-analyses. Compared to the UWLS model with a common variance adjustment factor, the UWLS model with unique variance adjustment factors had greater confidence intervals when the heterogeneity was larger in the pairwise comparison. The UWLS model with unique variance adjustment factors reflects the difference in heterogeneity within each comparison. SEM provides a very flexible framework for univariate and multivariate meta-analysis, and its potential as a powerful tool for advanced meta-analysis remains to be explored.
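The fixed effect network meta-analysis that SEM reproduces can also be written as a weighted least squares problem over treatment contrasts. A sketch with synthetic contrasts (not the 26-study cirrhosis dataset from the article), expressing B vs. A and C vs. B comparisons in terms of basic parameters d_AB and d_AC:

```python
import numpy as np

# Each row: an observed treatment contrast (e.g. a log odds ratio), its
# variance, and a design row in the basic parameters (d_AB, d_AC).
# Numbers are synthetic and purely illustrative.
y = np.array([-0.5, -0.6, -0.2, 0.3])   # B-A, B-A, C-A, C-B contrasts
v = np.array([0.10, 0.15, 0.12, 0.20])  # contrast variances
X = np.array([[1, 0],                   # B vs A  -> d_AB
              [1, 0],
              [0, 1],                   # C vs A  -> d_AC
              [-1, 1]])                 # C vs B  -> d_AC - d_AB

W = np.diag(1.0 / v)                    # inverse-variance weights
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # fixed effect estimates
cov = np.linalg.inv(X.T @ W @ X)                  # covariance of estimates
se = np.sqrt(np.diag(cov))
print("d_AB = %.3f (SE %.3f), d_AC = %.3f (SE %.3f)"
      % (beta[0], se[0], beta[1], se[1]))
```

The C-vs-B design row (d_AC minus d_AB) is what enforces consistency across the network: indirect evidence about d_AB flows through it.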

  13. Structural equation modeling: building and evaluating causal models: Chapter 8

    USGS Publications Warehouse

    Grace, James B.; Scheiner, Samuel M.; Schoolmaster, Donald R.

    2015-01-01

    Scientists frequently wish to study hypotheses about causal relationships, rather than just statistical associations. This chapter addresses the question of how scientists might approach this ambitious task. Here we describe structural equation modeling (SEM), a general modeling framework for the study of causal hypotheses. Our goals are to (a) concisely describe the methodology, (b) illustrate its utility for investigating ecological systems, and (c) provide guidance for its application. Throughout our presentation, we rely on a study of the effects of human activities on wetland ecosystems to make our description of methodology more tangible. We begin by presenting the fundamental principles of SEM, including both its distinguishing characteristics and the requirements for modeling hypotheses about causal networks. We then illustrate SEM procedures and offer guidelines for conducting SEM analyses. Our focus in this presentation is on basic modeling objectives and core techniques. Pointers to additional modeling options are also given.

  14. From patterns to causal understanding: Structural equation modeling (SEM) in soil ecology

    USGS Publications Warehouse

    Eisenhauer, Nico; Powell, Jeff R; Grace, James B.; Bowker, Matthew A.

    2015-01-01

    In this perspectives paper we highlight a heretofore underused statistical method in soil ecological research, structural equation modeling (SEM). SEM is commonly used in the general ecological literature to develop causal understanding from observational data, but has been more slowly adopted by soil ecologists. We provide some basic information on the many advantages and possibilities associated with using SEM and provide some examples of how SEM can be used by soil ecologists to shift focus from describing patterns to developing causal understanding and inspiring new types of experimental tests. SEM is a promising tool to aid the growth of soil ecology as a discipline, particularly by supporting research that is increasingly hypothesis-driven and interdisciplinary, thus shining light into the black box of interactions belowground.

  15. Sample Size Requirements for Structural Equation Models: An Evaluation of Power, Bias, and Solution Propriety

    ERIC Educational Resources Information Center

    Wolf, Erika J.; Harrington, Kelly M.; Clark, Shaunna L.; Miller, Mark W.

    2013-01-01

    Determining sample size requirements for structural equation modeling (SEM) is a challenge often faced by investigators, peer reviewers, and grant writers. Recent years have seen a large increase in SEMs in the behavioral science literature, but consideration of sample size requirements for applied SEMs often relies on outdated rules-of-thumb.…

  16. A structural equation modeling approach for the adoption of cloud computing to enhance the Malaysian healthcare sector.

    PubMed

    Ratnam, Kalai Anand; Dominic, P D D; Ramayah, T

    2014-08-01

The investments and costs of infrastructure, communication, medical equipment, and software within the global healthcare ecosystem have increased significantly, and this growth is expected to continue. As a result, information sharing and cross-system communication have become challenging because independent systems and subsystems remain detached and unconnected. The overall model fit, over a sample size of 320, was tested with structural equation modelling (SEM) using AMOS 20.0 as the modelling tool. SPSS 20.0 was used to analyse the descriptive statistics and dimension reliability. Results of the study show that the system utilisation and system impact dimensions influence the overall level of services of the healthcare providers. In addition, the findings suggest that systems integration and security play a pivotal role for IT resources in healthcare organisations. Through this study, a basis has been successfully established for investigating the need to improve the Malaysian healthcare ecosystem and to introduce a cloud computing platform to host the national healthcare information exchange.

  17. Screening and vaccination as determined by the Social Ecological Model and the Theory of Triadic Influence: a systematic review.

    PubMed

    Nyambe, Anayawa; Van Hal, Guido; Kampen, Jarl K

    2016-11-17

Vaccination and screening are forms of primary and secondary prevention methods. These methods are recommended for controlling the spread of a vast number of diseases and conditions. To determine the most effective preventive methods to be used by a society, multi-level models have been shown to be more effective than models that focus solely on individual-level characteristics. The Social Ecological Model (SEM) and the Theory of Triadic Influence (TTI) are such models. The purpose of this systematic review was to identify the main differences and similarities of the SEM and TTI regarding screening and vaccination in order to prepare potentially successful prevention programs for practice. A systematic review was conducted. Separate literature searches were performed during January and February 2015 using Medline, Ovid, Proquest, PubMed, University of Antwerp Discovery Service and Web of Science, for articles that apply the SEM and TTI. A Data Extraction Form with mostly closed-ended questions was developed to assist with data extraction. Aggregate descriptive statistics were utilized to summarize the general characteristics of the SEM and TTI as documented in the scientific literature. A total of 290 potentially relevant articles referencing the SEM were found. As for the TTI, a total of 131 potentially relevant articles were found. After strict evaluation against the inclusion and exclusion criteria, 40 SEM studies and 46 TTI studies were included in the systematic review. The SEM and TTI are theoretical frameworks that share many concepts and are relevant for several types of health behaviors. However, they differ in the structure of the model and in how the variables are thought to interact with each other, the TTI being a matrix while the SEM has a ring structure. The main difference consists of the division of the TTI into levels of causation (ultimate, distal and proximal), which are not considered within the levels of the SEM.
It was further found that in the articles studied in this systematic review, both models are often considered effective, while the empirical basis of these (and other) conclusions reached by their authors is in many cases unclear or incompletely specified.

  18. Structural Equation Modeling in Language Testing and Learning Research: A Review

    ERIC Educational Resources Information Center

    In'nami, Yo; Koizumi, Rie

    2011-01-01

    Despite the recent increase of structural equation modeling (SEM) in language testing and learning research and Kunnan's (1998) call for the proper use of SEM to produce useful findings, there seem to be no reviews about how SEM is applied in these areas or about the extent to which the current application accords with appropriate practices. To…

  19. Structural equation modeling for observational studies

    USGS Publications Warehouse

    Grace, J.B.

    2008-01-01

    Structural equation modeling (SEM) represents a framework for developing and evaluating complex hypotheses about systems. This method of data analysis differs from conventional univariate and multivariate approaches familiar to most biologists in several ways. First, SEMs are multiequational and capable of representing a wide array of complex hypotheses about how system components interrelate. Second, models are typically developed based on theoretical knowledge and designed to represent competing hypotheses about the processes responsible for data structure. Third, SEM is conceptually based on the analysis of covariance relations. Most commonly, solutions are obtained using maximum-likelihood solution procedures, although a variety of solution procedures are used, including Bayesian estimation. Numerous extensions give SEM a very high degree of flexibility in dealing with nonnormal data, categorical responses, latent variables, hierarchical structure, multigroup comparisons, nonlinearities, and other complicating factors. Structural equation modeling allows researchers to address a variety of questions about systems, such as how different processes work in concert, how the influences of perturbations cascade through systems, and about the relative importance of different influences. I present 2 example applications of SEM, one involving interactions among lynx (Lynx pardinus), mongooses (Herpestes ichneumon), and rabbits (Oryctolagus cuniculus), and the second involving anuran species richness. Many wildlife ecologists may find SEM useful for understanding how populations function within their environments. Along with the capability of the methodology comes a need for care in the proper application of SEM.

  20. Fatigue lifetime prediction of a reduced-diameter dental implant system: Numerical and experimental study.

    PubMed

    Duan, Yuanyuan; Gonzalez, Jorge A; Kulkarni, Pratim A; Nagy, William W; Griggs, Jason A

    2018-06-16

To validate the fatigue lifetime of a reduced-diameter dental implant system predicted by three-dimensional finite element analysis (FEA) by testing physical implant specimens using an accelerated lifetime testing (ALT) strategy with the apparatus specified by ISO 14801. A commercially available reduced-diameter titanium dental implant system (Straumann Standard Plus NN) was digitized using a micro-CT scanner. Axial slices were processed using interactive medical image processing software (Mimics) to create 3D models. FEA was performed in ABAQUS, and fatigue lifetime was predicted using fe-safe® software. Physical specimens of the same implant system (n=15) were tested at a frequency of 2 Hz on load frames using the apparatus specified by ISO 14801 and ALT. Multiple step-stress load profiles of varying aggressiveness were used to improve testing efficiency. Fatigue lifetime statistics of the physical specimens were estimated in reliability analysis software (ALTA PRO). Fractured specimens were examined using SEM with fractographic techniques to determine the failure mode. The FEA-predicted lifetime was within the 95% confidence interval of the lifetime estimated from the experimental results, which suggests that the FEA prediction was accurate for this implant system. The highest probability of failure was located at the root of the implant body screw thread adjacent to the simulated bone level, which also agreed with the failure origin in the physical specimens. Fatigue lifetime predictions based on finite element modeling could yield similar results in lieu of physical testing, allowing the use of virtual testing in the early stages of future research projects on implant fatigue. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.

  1. Characterization of aeroallergen of Texas panhandle using scanning and fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Ghosh, Nabarun; Whiteside, Mandy; Ridner, Chris; Celik, Yasemin; Saadeh, C.; Bennert, Jeff

    2010-06-01

Aeroallergens cause serious allergic and asthmatic reactions. Characterizing the aeroallergens provides information regarding the onset, duration, and severity of the pollen season that clinicians use to guide allergen selection for skin testing and treatment. Fluorescence microscopy offers useful approaches for understanding the structure and function of microscopic objects. Prepared slides of the pollen were observed under an Olympus BX40 microscope equipped with FITC and TRITC fluorescent filters, a mercury lamp source, and an Olympus DP-70 digital camera connected to a computer running Image Pro 6.0 software. Aeroallergens were viewed, recorded, and analyzed with DP Manager and the Image Pro 6.0 software. Photographs were taken in bright field and at the fluorescein-isothiocyanate (FITC) and tetramethylrhodamine (TRITC) filter settings at 40X. A high-pressure mercury lamp or UV source was used to excite the storage molecules or proteins, which exhibited autofluorescence. The FITC filter reveals green fluorescent proteins (GFP and EGFP), and the TRITC filter reveals red fluorescent proteins (DsRed). SEM proved useful for observing ultrastructural details such as pores, colpi, sulci and ornamentations on the pollen surface. Samples were examined with an SEM (TM-1000) after gold coating and critical point drying. Pollen grains were measured using the TM-1000 imaging software, which revealed specific information on the size of colpi or sulci and the distances between the microstructures. This information can be used for classification and circumscription in angiosperm taxonomy. Data were correlated with clinical studies established at the Allergy A.R.T.S. Clinical Research Laboratory.

  2. A structural equation model of soil metal bioavailability to earthworms: confronting causal theory and observations using a laboratory exposure to field-contaminated soils.

    PubMed

    Beaumelle, Léa; Vile, Denis; Lamy, Isabelle; Vandenbulcke, Franck; Gimbert, Frédéric; Hedde, Mickaël

    2016-11-01

Structural equation models (SEM) are increasingly used in ecology as a multivariate analysis method that can represent theoretical variables and address complex sets of hypotheses. Here we demonstrate the interest of SEM in ecotoxicology, more precisely to test the three-step concept of metal bioavailability to earthworms. The SEM modeled the three-step causal chain between environmental availability, environmental bioavailability and toxicological bioavailability. In the model, each step is an unmeasured (latent) variable reflected by several observed variables. In an exposure experiment designed specifically to test this SEM for Cd, Pb and Zn, Aporrectodea caliginosa was exposed to 31 agricultural field-contaminated soils. Chemical and biological measurements included CaCl2-extractable metal concentrations in soils, free ion concentrations in soil solution as predicted by a geochemical model, dissolved metal concentrations as predicted by a semi-mechanistic model, internal metal concentrations in whole earthworms and in subcellular fractions, and several biomarkers. The observations verified the causal definition of Cd and Pb bioavailability in the SEM, but not that of Zn. Several indicators consistently reflected the hypothesized causal definition and could thus be pertinent measurements of Cd and Pb bioavailability to earthworms in field-contaminated soils. The SEM highlights that the metals present in the soil solution and easily extractable are not the main source of available metals for earthworms. This study further highlights SEM as a powerful tool that can handle natural ecosystem complexity, thus contributing to the paradigm shift in ecotoxicology from a bottom-up to a top-down approach. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Vector Autoregression, Structural Equation Modeling, and Their Synthesis in Neuroimaging Data Analysis

    PubMed Central

    Chen, Gang; Glen, Daniel R.; Saad, Ziad S.; Hamilton, J. Paul; Thomason, Moriah E.; Gotlib, Ian H.; Cox, Robert W.

    2011-01-01

    Vector autoregression (VAR) and structural equation modeling (SEM) are two popular brain-network modeling tools. VAR, which is a data-driven approach, assumes that connected regions exert time-lagged influences on one another. In contrast, the hypothesis-driven SEM is used to validate an existing connectivity model where connected regions have contemporaneous interactions among them. We present the two models in detail and discuss their applicability to FMRI data, and interpretational limits. We also propose a unified approach that models both lagged and contemporaneous effects. The unifying model, structural vector autoregression (SVAR), may improve statistical and explanatory power, and avoids some prevalent pitfalls that can occur when VAR and SEM are utilized separately. PMID:21975109

  4. Intrafamilial aggregation and heritability of left ventricular geometric remodeling is independent of cardiac mass in families of African ancestry.

    PubMed

    Peterson, Vernice R; Norton, Gavin R; Redelinghuys, Michelle; Libhaber, Carlos D; Maseko, Muzi J; Majane, Olebogeng H I; Brooksbank, Richard; Woodiwiss, Angela J

    2015-05-01

Whether left ventricular (LV) geometric remodeling, as indexed by relative wall thickness (RWT), aggregates in families and is inherited independent of LV mass (LVM) and additional confounders is uncertain. We determined whether RWT as assessed from 2D targeted M-mode echocardiography shows intrafamilial aggregation and heritability independent of LVM in 181 nuclear families (73 spouse pairs, 403 parent-child pairs, and 177 sibling-sibling pairs) with 16 families including 3 generations from an urban developing community of black Africans. Intrafamilial aggregation and heritability estimates (S.A.G.E. software) were assessed independent of confounders, including central aortic systolic blood pressure (SBPc) (radial applanation tonometry and SphygmoCor software). Independent of confounders including SBPc, LV RWT was correlated in parent-child (r = 0.32, P < 0.0001) and sibling-sibling (r = 0.29, P < 0.0001), but not in spouse (r = 0.11, P = 0.33) pairs. The relationships between parent-child (r = 0.28, P < 0.0001) and sibling-sibling (r = 0.24, P < 0.001) pairs persisted with further adjustments for LVM or LVM indexed to height(2.7) (LVMI). Similarly, independent of confounders, LV RWT showed significant heritability (h(2) ± SEM = 0.56 ± 0.09, P < 0.0001) and this persisted with further adjustments for LVM (h(2) ± SEM = 0.48 ± 0.09, P < 0.0001) or LVMI (h(2) ± SEM = 0.49 ± 0.09, P < 0.0001). In a group of African ancestry, independent of LVM, LV geometric remodeling shows significant intrafamilial aggregation and heritability. Genetic factors may in part determine the LV geometric remodeling process independent of the extent of cardiac hypertrophy. © American Journal of Hypertension, Ltd 2014. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. A large dataset of synthetic SEM images of powder materials and their ground truth 3D structures.

    PubMed

    DeCost, Brian L; Holm, Elizabeth A

    2016-12-01

This data article presents a data set comprising 2048 synthetic scanning electron microscope (SEM) images of powder materials and descriptions of the corresponding 3D structures that they represent. These images were created using open source rendering software, and the generating scripts are included with the data set. Eight particle size distributions are represented with 256 independent images from each. The particle size distributions are relatively similar to each other, so that the dataset offers a useful benchmark to assess the fidelity of image analysis techniques. The characteristics of the PSDs and the resulting images are described and analyzed in more detail in the research article "Characterizing powder materials using keypoint-based computer vision methods" (B.L. DeCost, E.A. Holm, 2016) [1]. These data are freely available in a Mendeley Data archive "A large dataset of synthetic SEM images of powder materials and their ground truth 3D structures" (B.L. DeCost, E.A. Holm, 2016) located at http://dx.doi.org/10.17632/tj4syyj9mr.1 [2] for any academic, educational, or research purposes.

  6. The advancement of the built environment research through employment of structural equation modeling (SEM)

    NASA Astrophysics Data System (ADS)

    Wasilah, S.; Fahmyddin, T.

    2018-03-01

The employment of structural equation modeling (SEM) in research has attracted increasing attention among researchers in the built environment. There is a gap in understanding the attributes, application, and importance of this approach to data analysis in built environment studies. This paper intends to provide a fundamental comprehension of the SEM method in data analysis, unveiling its attributes, employment and significance, and to present cases assessing associations amongst variables and constructs. The study uses some main literature to grasp the essence of SEM with regard to built environment research. A better acknowledgment of this analytical tool may assist researchers in the built environment in analyzing data under complex research questions and in testing multivariate models in a single study.

  7. Modelling the structure of sludge aggregates

    PubMed Central

    Smoczyński, Lech; Ratnaweera, Harsha; Kosobucka, Marta; Smoczyński, Michał; Kalinowski, Sławomir; Kvaal, Knut

    2016-01-01

ABSTRACT The structure of sludge is closely associated with the process of wastewater treatment. Synthetic dyestuff wastewater and sewage were coagulated using the PAX and PIX methods, and electro-coagulated on aluminium electrodes. The wastewater treatment processes were supported with an organic polymer. Images of the surface structures of the investigated sludge were obtained using scanning electron microscopy (SEM). Software image analysis permitted obtaining plots of log A vs. log P, where A is the surface area and P is the perimeter of an object, for the individual objects comprised in the structure of the sludge. The resulting database confirmed the ‘self-similarity’ of the structural objects in the studied groups of sludge, which enabled calculating their fractal dimension and proposing models for these objects. A quantitative description of the sludge aggregates permitted proposing a mechanism for the processes responsible for their formation. The paper also discusses the impact of the structure of the investigated sludge on the process of sedimentation, and on dehydration of the thickened sludge after sedimentation. PMID:26549812
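For the perimeter-area relation used above, self-similar objects satisfy P ∝ A^(D/2), so the fractal dimension D is twice the slope of log P vs. log A. A minimal sketch of that fit (function name is illustrative), with smooth circles as a sanity check, since non-fractal boundaries should give D = 1:

```python
import numpy as np

def perimeter_area_dimension(areas, perimeters):
    """Fractal dimension from the perimeter-area relation P ~ A^(D/2):
    D = 2 * slope of the regression of log P on log A."""
    slope, _ = np.polyfit(np.log(areas), np.log(perimeters), 1)
    return 2.0 * slope

# Sanity check on smooth (non-fractal) objects: circles of varying radius.
r = np.linspace(1.0, 50.0, 20)
print(perimeter_area_dimension(np.pi * r**2, 2 * np.pi * r))  # ~1.0
```

For real sludge aggregates, the areas and perimeters would come from segmented SEM images; D between 1 and 2 then quantifies boundary irregularity.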

  8. Beyond logistic regression: structural equations modelling for binary variables and its application to investigating unobserved confounders.

    PubMed

    Kupek, Emil

    2006-03-15

    Structural equation modelling (SEM) has been increasingly used in medical statistics for solving a system of related regression equations. However, a great obstacle for its wider use has been its difficulty in handling categorical variables within the framework of generalised linear models. A large data set with a known structure among two related outcomes and three independent variables was generated to investigate the use of Yule's transformation of odds ratio (OR) into Q-metric by (OR-1)/(OR+1) to approximate Pearson's correlation coefficients between binary variables whose covariance structure can be further analysed by SEM. Percent of correctly classified events and non-events was compared with the classification obtained by logistic regression. The performance of SEM based on Q-metric was also checked on a small (N = 100) random sample of the data generated and on a real data set. SEM successfully recovered the generated model structure. SEM of real data suggested a significant influence of a latent confounding variable which would have not been detectable by standard logistic regression. SEM classification performance was broadly similar to that of the logistic regression. The analysis of binary data can be greatly enhanced by Yule's transformation of odds ratios into estimated correlation matrix that can be further analysed by SEM. The interpretation of results is aided by expressing them as odds ratios which are the most frequently used measure of effect in medical statistics.
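Yule's transformation described in this record is a one-line mapping from an odds ratio to a correlation-like quantity. A minimal sketch (function name is illustrative):

```python
def yule_q(odds_ratio: float) -> float:
    """Yule's Q: maps an odds ratio in (0, inf) onto (-1, 1),
    approximating a correlation coefficient for binary variables."""
    return (odds_ratio - 1.0) / (odds_ratio + 1.0)

print(yule_q(1.0))   # no association -> 0.0
print(yule_q(3.0))   # positive association -> 0.5
print(yule_q(0.25))  # negative association -> -0.6
```

A matrix of such Q values, one per variable pair, is the approximate correlation matrix that the abstract proposes feeding into SEM.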

  9. Comparison of methods for the analysis of relatively simple mediation models.

    PubMed

    Rijnhart, Judith J M; Twisk, Jos W R; Chinapaw, Mai J M; de Boer, Michiel R; Heymans, Martijn W

    2017-09-01

    Statistical mediation analysis is an often used method in trials, to unravel the pathways underlying the effect of an intervention on a particular outcome variable. Throughout the years, several methods have been proposed, such as ordinary least square (OLS) regression, structural equation modeling (SEM), and the potential outcomes framework. Most applied researchers do not know that these methods are mathematically equivalent when applied to mediation models with a continuous mediator and outcome variable. Therefore, the aim of this paper was to demonstrate the similarities between OLS regression, SEM, and the potential outcomes framework in three mediation models: 1) a crude model, 2) a confounder-adjusted model, and 3) a model with an interaction term for exposure-mediator interaction. Secondary data analysis of a randomized controlled trial that included 546 schoolchildren. In our data example, the mediator and outcome variable were both continuous. We compared the estimates of the total, direct and indirect effects, proportion mediated, and 95% confidence intervals (CIs) for the indirect effect across OLS regression, SEM, and the potential outcomes framework. OLS regression, SEM, and the potential outcomes framework yielded the same effect estimates in the crude mediation model, the confounder-adjusted mediation model, and the mediation model with an interaction term for exposure-mediator interaction. Since OLS regression, SEM, and the potential outcomes framework yield the same results in three mediation models with a continuous mediator and outcome variable, researchers can continue using the method that is most convenient to them.
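The equivalence this record reports can be checked numerically: with a continuous mediator and outcome, the OLS total effect decomposes exactly into the direct effect plus the product-of-coefficients indirect effect. A sketch on simulated data (the trial data are not available here; only the sample size of 546 is taken from the abstract, and all coefficients are invented):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 546                                       # matches the trial's sample size
x = rng.normal(size=n)                        # exposure
m = 0.5 * x + rng.normal(size=n)              # continuous mediator
y = 0.3 * x + 0.4 * m + rng.normal(size=n)    # continuous outcome

def ols(design, target):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(design, target, rcond=None)[0]

ones = np.ones(n)
a = ols(np.column_stack([ones, x]), m)[1]         # X -> M path
c = ols(np.column_stack([ones, x]), y)[1]         # total effect of X on Y
coef = ols(np.column_stack([ones, x, m]), y)
c_prime, b = coef[1], coef[2]                     # direct effect, M -> Y path

indirect = a * b                                  # product of coefficients
print(f"total {c:.3f} = direct {c_prime:.3f} + indirect {indirect:.3f}")
```

The identity c = c' + a·b holds exactly for OLS in this linear, continuous setting, which is why the three frameworks compared in the paper agree.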

  10. Equivalence and Differences between Structural Equation Modeling and State-Space Modeling Techniques

    ERIC Educational Resources Information Center

    Chow, Sy-Miin; Ho, Moon-ho R.; Hamaker, Ellen L.; Dolan, Conor V.

    2010-01-01

    State-space modeling techniques have been compared to structural equation modeling (SEM) techniques in various contexts but their unique strengths have often been overshadowed by their similarities to SEM. In this article, we provide a comprehensive discussion of these 2 approaches' similarities and differences through analytic comparisons and…

  11. Bayesian Semiparametric Structural Equation Models with Latent Variables

    ERIC Educational Resources Information Center

    Yang, Mingan; Dunson, David B.

    2010-01-01

    Structural equation models (SEMs) with latent variables are widely useful for sparse covariance structure modeling and for inferring relationships among latent variables. Bayesian SEMs are appealing in allowing for the incorporation of prior information and in providing exact posterior distributions of unknowns, including the latent variables. In…

  12. Test and Evaluation of TRUST: Tools for Recognizing Useful Signals of Trustworthiness

    DTIC Science & Technology

    2016-04-01

    guaranteed, social exchange requires trust—the belief that others will follow through on their obligations. The model includes the beliefs that...current reflection could be measured based on properties of the skin, and (2) skin conductance response (SCR), where the fastest could be measured and...SEM prediction (H4d). The results of the LF HRV signals indicate the SEM model predicts distrust based on the experimental SS paradigm and SEM

  13. Multiplicity Control in Structural Equation Modeling: Incorporating Parameter Dependencies

    ERIC Educational Resources Information Center

    Smith, Carrie E.; Cribbie, Robert A.

    2013-01-01

    When structural equation modeling (SEM) analyses are conducted, significance tests for all important model relationships (parameters including factor loadings, covariances, etc.) are typically conducted at a specified nominal Type I error rate ([alpha]). Despite the fact that many significance tests are often conducted in SEM, rarely is…

  14. Three Approaches to Using Lengthy Ordinal Scales in Structural Equation Models: Parceling, Latent Scoring, and Shortening Scales

    ERIC Educational Resources Information Center

    Yang, Chongming; Nay, Sandra; Hoyle, Rick H.

    2010-01-01

    Lengthy scales or testlets pose certain challenges for structural equation modeling (SEM) if all the items are included as indicators of a latent construct. Three general approaches to modeling lengthy scales in SEM (parceling, latent scoring, and shortening) have been reviewed and evaluated. A hypothetical population model is simulated containing…

  15. Adapting the Sport Education Model for Children with Disabilities

    ERIC Educational Resources Information Center

    Presse, Cindy; Block, Martin E.; Horton, Mel; Harvey, William J.

    2011-01-01

    The sport education model (SEM) has been widely used as a curriculum and instructional model to provide children with authentic and active sport experiences in physical education. In this model, students are assigned various roles to gain a deeper understanding of the sport or activity. This article provides a brief overview of the SEM and…

  16. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…

  17. Maximum Likelihood Dynamic Factor Modeling for Arbitrary "N" and "T" Using SEM

    ERIC Educational Resources Information Center

    Voelkle, Manuel C.; Oud, Johan H. L.; von Oertzen, Timo; Lindenberger, Ulman

    2012-01-01

    This article has 3 objectives that build on each other. First, we demonstrate how to obtain maximum likelihood estimates for dynamic factor models (the direct autoregressive factor score model) with arbitrary "T" and "N" by means of structural equation modeling (SEM) and compare the approach to existing methods. Second, we go beyond standard time…

  18. COMPARATIVE SEM EVALUATION OF THREE SOLVENTS USED IN ENDODONTIC RETREATMENT: AN EX VIVO STUDY

    PubMed Central

    Scelza, Miriam F. Zaccaro; Coil, Jeffrey M.; Maciel, Ana Carolina de Carvalho; Oliveira, Lílian Rachel L.; Scelza, Pantaleo

    2008-01-01

    This study compared, by scanning electron microscopy (SEM), the efficacy of three solvents on the removal of filling materials from dentinal tubules during endodontic retreatment. Forty human maxillary canines with straight canals were prepared according to a crown-down technique and enlarged to a #30 apical file size, before obturation with gutta-percha and a zinc-oxide-eugenol based sealer. The samples were stored for 3 months before being randomly assigned to four groups: chloroform (n=10), orange oil (n=10), eucalyptol (n=10) and control (n=10). Solvents were applied to a reservoir created on the coronal root third using Gates Glidden drills. The total time for retreatment using the solvents was 5 minutes per tooth. Following retreatment, the roots were split longitudinally for SEM evaluation. SEM images were digitized, analyzed using Image ProPlus 4.5 software, and the number of dentinal tubules free of filling material in the middle and apical thirds was recorded. No significant difference was found among the solvent groups regarding the number of dentinal tubules free of root filling remnants in the middle and apical root thirds (p>0.05). However, the control group had fewer dentinal tubules free of filling material (p<0.05). Under the tested conditions, it may be concluded that there was no significant difference among the solvents used to obtain dentinal tubules free of filling material remnants. PMID:19089285

  19. Towards Automated Nanomanipulation under Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Ye, Xutao

    Robotic nanomaterial manipulation inside scanning electron microscopes (SEMs) is useful for prototyping functional devices and characterizing the properties of one-dimensional nanomaterials. Conventionally, manipulation of nanowires has been performed via teleoperation, which is time-consuming and highly skill-dependent. Manual manipulation also has the limitations of low success rates and poor reproducibility. This research focuses on a robotic system capable of automated pick-and-place of single nanowires. Through SEM visual detection and vision-based motion control, the system transferred individual silicon nanowires from their growth substrate to a microelectromechanical systems (MEMS) device that characterized the nanowires' electromechanical properties. The performance of the nanorobotic pick-up and placement procedures was quantified by experiments. The system demonstrated automated nanowire pick-up and placement with high reliability. A software system for a load-lock-compatible nanomanipulation system was also designed and developed in this research.

  20. Consumer Adoption of Future MyData-Based Preventive eHealth Services: An Acceptance Model and Survey Study.

    PubMed

    Koivumäki, Timo; Pekkarinen, Saara; Lappi, Minna; Väisänen, Jere; Juntunen, Jouni; Pikkarainen, Minna

    2017-12-22

    Constantly increasing health care costs have led countries and health care providers to the point where health care systems must be reinvented. Consequently, electronic health (eHealth) has recently received a great deal of attention in social sciences in the domain of Internet studies. However, only a fraction of these studies focuses on the acceptability of eHealth, making consumers' subjective evaluation an understudied field. This study will address this gap by focusing on the acceptance of MyData-based preventive eHealth services from the consumer point of view. We are adopting the term "MyData", which according to a White Paper of the Finnish Ministry of Transport and Communication refers to "1) a new approach, a paradigm shift in personal data management and processing that seeks to transform the current organization centric system to a human centric system, 2) to personal data as a resource that the individual can access and control." The aim of this study was to investigate what factors influence consumers' intentions to use a MyData-based preventive eHealth service before use. We applied a new adoption model combining Venkatesh's unified theory of acceptance and use of technology 2 (UTAUT2) in a consumer context and three constructs from health behavior theories, namely threat appraisals, self-efficacy, and perceived barriers. To test the research model, we applied structural equation modeling (SEM) with Mplus software, version 7.4. A Web-based survey was administered. We collected 855 responses. We first applied traditional SEM for the research model, which was not statistically significant. We then tested for possible heterogeneity in the data by running a mixture analysis. We found that heterogeneity was not the cause for the poor performance of the research model. 
Thus, we moved on to model-generating SEM and ended up with a statistically significant empirical model (root mean square error of approximation [RMSEA] 0.051, Tucker-Lewis index [TLI] 0.906, comparative fit index [CFI] 0.915, and standardized root mean square residual 0.062). According to our empirical model, the statistically significant drivers for behavioral intention were effort expectancy (beta=.191, P<.001), self-efficacy (beta=.449, P<.001), threat appraisals (beta=.416, P<.001), and perceived barriers (beta=-.212, P=.009). Our research highlighted the importance of health-related factors when it comes to eHealth technology adoption in the consumer context. Emphasis should especially be placed on efforts to increase consumers' self-efficacy in eHealth technology use and in supporting healthy behavior. ©Timo Koivumäki, Saara Pekkarinen, Minna Lappi, Jere Väisänen, Jouni Juntunen, Minna Pikkarainen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 22.12.2017.

  1. Consumer Adoption of Future MyData-Based Preventive eHealth Services: An Acceptance Model and Survey Study

    PubMed Central

    Pekkarinen, Saara; Lappi, Minna; Väisänen, Jere; Juntunen, Jouni; Pikkarainen, Minna

    2017-01-01

    Background Constantly increasing health care costs have led countries and health care providers to the point where health care systems must be reinvented. Consequently, electronic health (eHealth) has recently received a great deal of attention in social sciences in the domain of Internet studies. However, only a fraction of these studies focuses on the acceptability of eHealth, making consumers’ subjective evaluation an understudied field. This study will address this gap by focusing on the acceptance of MyData-based preventive eHealth services from the consumer point of view. We are adopting the term "MyData", which according to a White Paper of the Finnish Ministry of Transport and Communication refers to "1) a new approach, a paradigm shift in personal data management and processing that seeks to transform the current organization centric system to a human centric system, 2) to personal data as a resource that the individual can access and control." Objective The aim of this study was to investigate what factors influence consumers’ intentions to use a MyData-based preventive eHealth service before use. Methods We applied a new adoption model combining Venkatesh’s unified theory of acceptance and use of technology 2 (UTAUT2) in a consumer context and three constructs from health behavior theories, namely threat appraisals, self-efficacy, and perceived barriers. To test the research model, we applied structural equation modeling (SEM) with Mplus software, version 7.4. A Web-based survey was administered. We collected 855 responses. Results We first applied traditional SEM for the research model, which was not statistically significant. We then tested for possible heterogeneity in the data by running a mixture analysis. We found that heterogeneity was not the cause for the poor performance of the research model. 
Thus, we moved on to model-generating SEM and ended up with a statistically significant empirical model (root mean square error of approximation [RMSEA] 0.051, Tucker-Lewis index [TLI] 0.906, comparative fit index [CFI] 0.915, and standardized root mean square residual 0.062). According to our empirical model, the statistically significant drivers for behavioral intention were effort expectancy (beta=.191, P<.001), self-efficacy (beta=.449, P<.001), threat appraisals (beta=.416, P<.001), and perceived barriers (beta=−.212, P=.009). Conclusions Our research highlighted the importance of health-related factors when it comes to eHealth technology adoption in the consumer context. Emphasis should especially be placed on efforts to increase consumers’ self-efficacy in eHealth technology use and in supporting healthy behavior. PMID:29273574

  2. Designing a model for critical thinking development in AJA University of Medical Sciences.

    PubMed

    Mafakheri Laleh, Mahyar; Mohammadimehr, Mojgan; Zargar Balaye Jame, Sanaz

    2016-10-01

    In the new concept of medical education, creativity development is an important goal. The aim of this research was to identify a model for developing critical thinking among students with the special focus on learning environment and learning style. This applied and cross-sectional study was conducted among all students studying in undergraduate and professional doctorate programs in Fall Semester 2013-2014 in AJA University of Medical Sciences (N=777). The sample consisted of 257 students selected based on the proportional stratified random sampling method. To collect data, three questionnaires including Critical Thinking, Perception of Learning Environment and Learning Style were employed. The data were analyzed using Pearson's correlation statistical test, and one-sample t-test. The Structural Equation Model (SEM) was used to test the research model. SPSS software, version 14 and the LISREL software were used for data analysis. The results showed that students had significantly assessed the teaching-learning environment and two components of "perception of teachers" and "perception of emotional-psychological climate" at the desirable level (p<0.05). Also learning style and two components of "the study method" and "motivation for studying" were considered significantly desirable (p<0.05). The level of critical thinking among students in terms of components of "commitment", "creativity" and "cognitive maturity" was at the relatively desirable level (p<0.05). In addition, perception of the learning environment can impact the critical thinking through learning style. One of the factors which can significantly impact the quality improvement of the teaching and learning process in AJA University of Medical Sciences is to develop critical thinking among learners. This issue requires providing the proper situation for teaching and learning critical thinking in the educational environment.

  3. Designing a model for critical thinking development in AJA University of Medical Sciences

    PubMed Central

    MAFAKHERI LALEH, MAHYAR; MOHAMMADIMEHR, MOJGAN; ZARGAR BALAYE JAME, SANAZ

    2016-01-01

    Introduction: In the new concept of medical education, creativity development is an important goal. The aim of this research was to identify a model for developing critical thinking among students with the special focus on learning environment and learning style. Methods: This applied and cross-sectional study was conducted among all students studying in undergraduate and professional doctorate programs in Fall Semester 2013-2014 in AJA University of Medical Sciences (N=777). The sample consisted of 257 students selected based on the proportional stratified random sampling method. To collect data, three questionnaires including Critical Thinking, Perception of Learning Environment and Learning Style were employed. The data were analyzed using Pearson's correlation statistical test, and one-sample t-test. The Structural Equation Model (SEM) was used to test the research model. SPSS software, version 14 and the LISREL software were used for data analysis. Results: The results showed that students had significantly assessed the teaching-learning environment and two components of "perception of teachers" and "perception of emotional-psychological climate" at the desirable level (p<0.05). Also learning style and two components of "the study method" and "motivation for studying" were considered significantly desirable (p<0.05). The level of critical thinking among students in terms of components of "commitment", "creativity" and "cognitive maturity" was at the relatively desirable level (p<0.05). In addition, perception of the learning environment can impact the critical thinking through learning style. Conclusion: One of the factors which can significantly impact the quality improvement of the teaching and learning process in AJA University of Medical Sciences is to develop critical thinking among learners. This issue requires providing the proper situation for teaching and learning critical thinking in the educational environment. PMID:27795968

  4. Structural Equations and Causal Explanations: Some Challenges for Causal SEM

    ERIC Educational Resources Information Center

    Markus, Keith A.

    2010-01-01

    One common application of structural equation modeling (SEM) involves expressing and empirically investigating causal explanations. Nonetheless, several aspects of causal explanation that have an impact on behavioral science methodology remain poorly understood. It remains unclear whether applications of SEM should attempt to provide complete…

  5. Prescriptive Statements and Educational Practice: What Can Structural Equation Modeling (SEM) Offer?

    ERIC Educational Resources Information Center

    Martin, Andrew J.

    2011-01-01

    Longitudinal structural equation modeling (SEM) can be a basis for making prescriptive statements on educational practice and offers yields over "traditional" statistical techniques under the general linear model. The extent to which prescriptive statements can be made will rely on the appropriate accommodation of key elements of research design,…

  6. A Note on Structural Equation Modeling Estimates of Reliability

    ERIC Educational Resources Information Center

    Yang, Yanyun; Green, Samuel B.

    2010-01-01

    Reliability can be estimated using structural equation modeling (SEM). Two potential problems with this approach are that estimates may be unstable with small sample sizes and biased with misspecified models. A Monte Carlo study was conducted to investigate the quality of SEM estimates of reliability by themselves and relative to coefficient…

  7. The Schoolwide Enrichment Model: A Focus on Student Strengths and Interests

    ERIC Educational Resources Information Center

    Renzulli, Joseph S.; Renzulli, Sally Reis

    2010-01-01

    This article includes an introduction to the Schoolwide Enrichment Model (SEM), with its three components: a total talent portfolio for each child, curriculum differentiation and modification, and enrichment opportunities from the Enrichment Triad Model. Also included is a brief history of the SEM and a summary of 30 years of research underlying…

  8. A Multilevel CFA-MTMM Model for Nested Structurally Different Methods

    ERIC Educational Resources Information Center

    Koch, Tobias; Schultze, Martin; Burrus, Jeremy; Roberts, Richard D.; Eid, Michael

    2015-01-01

    The numerous advantages of structural equation modeling (SEM) for the analysis of multitrait-multimethod (MTMM) data are well known. MTMM-SEMs allow researchers to explicitly model the measurement error, to examine the true convergent and discriminant validity of the given measures, and to relate external variables to the latent trait as well as…

  9. Frequency domain finite-element and spectral-element acoustic wave modeling using absorbing boundaries and perfectly matched layer

    NASA Astrophysics Data System (ADS)

    Rahimi Dalkhani, Amin; Javaherian, Abdolrahim; Mahdavi Basir, Hadi

    2018-04-01

    Wave propagation modeling, a vital tool in seismology, can be done via several numerical methods, among them the finite-difference, finite-element, and spectral-element methods (FDM, FEM, and SEM). Some advanced applications in seismic exploration benefit from frequency-domain modeling. Given their flexibility with complex geological models and their handling of the free-surface boundary condition, we studied the frequency-domain acoustic wave equation using FEM and SEM. The results demonstrated that the frequency-domain FEM and SEM have good accuracy and numerical efficiency with second-order interpolation polynomials. Furthermore, we developed the second-order Clayton and Engquist absorbing boundary condition (CE-ABC2) and compared it with the perfectly matched layer (PML) for the frequency-domain FEM and SEM. Unlike the PML method, CE-ABC2 does not add any computational cost to the modeling beyond assembling the boundary matrices. As a result, CE-ABC2 is more efficient than PML for frequency-domain acoustic wave propagation modeling, especially when the computational cost is high and a high level of absorbing performance is unnecessary.

  10. Spatial modeling on the upperstream of the Citarum watershed: An application of geoinformatics

    NASA Astrophysics Data System (ADS)

    Ningrum, Windy Setia; Widyaningsih, Yekti; Indra, Tito Latif

    2017-03-01

    The Citarum watershed is the longest and largest watershed in West Java, Indonesia, located at 106°51'36''-107°51' E and 7°19'-6°24'S across 10 districts, and serves as the water supply for over 15 million people. In this area, the water criticality index is used to assess the balance between water supply and water demand, so that in the dry season the watershed is still able to meet the water needs of the communities along the Citarum river. The objective of this research is to evaluate the water criticality index of the Citarum watershed area using spatial models to account for the spatial dependencies in the data. The Lagrange multiplier diagnostics for spatial dependence are LM-err = 34.6 (p-value = 4.1e-09) and LM-lag = 8.05 (p-value = 0.005), so modeling using the Spatial Lag Model (SLM) and the Spatial Error Model (SEM) was conducted. The likelihood ratio tests show that both the SLM and SEM models are better than the OLS model in modeling the water criticality index in the Citarum watershed. The AIC values of the SLM and SEM models are 78.9 and 51.4, respectively, so the SEM model is better than the SLM model at predicting the water criticality index in the Citarum watershed.

  11. Surgery on spinal epidural metastases (SEM) in renal cell carcinoma: a plea for a new paradigm.

    PubMed

    Bakker, Nicolaas A; Coppes, Maarten H; Vergeer, Rob A; Kuijlen, Jos M A; Groen, Rob J M

    2014-09-01

    Prediction models for outcome of decompressive surgical resection of spinal epidural metastases (SEM) have in common that they have been developed for all types of SEM, irrespective of the type of primary tumor. It is our experience in clinical practice, however, that these models often fail to accurately predict outcome in the individual patient. To investigate whether decision making could be optimized by applying tumor-specific prediction models, as a proof of concept we analyzed patients with SEM from renal cell carcinoma whom we have operated on. Retrospective chart analysis, 2006 to 2012. Twenty-one consecutive patients with symptomatic SEM of renal cell carcinoma. Predictive factors for survival. In addition to established predictive factors for survival, we analyzed the predictive value of the Motzer criteria in these patients. The Motzer criteria comprise a specific and validated risk model for survival in patients with renal cell carcinoma. After multivariable analysis, only Motzer intermediate risk (hazard ratio [HR] 17.4, 95% confidence interval [CI] 1.82-166, p=.01) and high risk (HR 39.3, 95% CI 3.10-499, p=.005) turned out to be significantly associated with survival in the patients with renal cell carcinoma whom we operated on. In this study, we have demonstrated that decision making could have been optimized by implementing the Motzer criteria alongside established prediction models. We therefore suggest that, in the future, the Motzer criteria also be taken into account in patients with SEM from renal cell carcinoma. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Spatially explicit modeling in ecology: A review

    USGS Publications Warehouse

    DeAngelis, Donald L.; Yurek, Simeon

    2017-01-01

    The use of spatially explicit models (SEMs) in ecology has grown enormously in the past two decades. One major advancement has been that fine-scale details of landscapes, and of spatially dependent biological processes, such as dispersal and invasion, can now be simulated with great precision, due to improvements in computer technology. Many areas of modeling have shifted toward a focus on capturing these fine-scale details, to improve mechanistic understanding of ecosystems. However, spatially implicit models (SIMs) have played a dominant role in ecology, and arguments have been made that SIMs, which account for the effects of space without specifying spatial positions, have an advantage of being simpler and more broadly applicable, perhaps contributing more to understanding. We address this debate by comparing SEMs and SIMs in examples from the past few decades of modeling research. We argue that, although SIMs have been the dominant approach in the incorporation of space in theoretical ecology, SEMs have unique advantages for addressing pragmatic questions concerning species populations or communities in specific places, because local conditions, such as spatial heterogeneities, organism behaviors, and other contingencies, produce dynamics and patterns that usually cannot be incorporated into simpler SIMs. SEMs are also able to describe mechanisms at the local scale that can create amplifying positive feedbacks at that scale, creating emergent patterns at larger scales, and therefore are important to basic ecological theory. We review the use of SEMs at the level of populations, interacting populations, food webs, and ecosystems and argue that SEMs are not only essential in pragmatic issues, but must play a role in the understanding of causal relationships on landscapes.

  13. Measuring surface topography with scanning electron microscopy. I. EZEImage: a program to obtain 3D surface data.

    PubMed

    Ponz, Ezequiel; Ladaga, Juan Luis; Bonetto, Rita Dominga

    2006-04-01

    Scanning electron microscopy (SEM) is widely used in materials science, and different parameters have been developed to characterize surface roughness. In a previous work, we studied surface topography with the fractal dimension at low scale and two parameters at high scale by using the variogram, that is, the log-log graph of variance vs. step, of an SEM image. Those studies were carried out with the FERImage program, previously developed by us. To verify the previously accepted hypothesis while working with only one image, it is indispensable to have reliable three-dimensional (3D) surface data. In this work, a new program (EZEImage) to characterize 3D surface topography in SEM has been developed. It uses fast cross-correlation and dynamic programming to obtain reliable dense height maps in a few seconds, which can be displayed as an image where each gray level represents a height value. This image can be used with the FERImage program or any other software to obtain surface topography characteristics. EZEImage also generates anaglyph images and characterizes 3D surface topography by means of a parameter set describing amplitude properties and three functional indices characterizing bearing and fluid properties.
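The height-map step described above depends on matching corresponding features between two views by cross-correlation. A toy 1D normalized cross-correlation matcher gives the flavor; this is an illustrative sketch, not EZEImage's actual algorithm.

```python
# Toy 1D normalized cross-correlation matcher, in the spirit of the
# stereo-matching step the record describes (illustrative sketch only).
import numpy as np

def best_shift(ref, tgt, max_shift):
    """Integer shift of `tgt` maximizing normalized cross-correlation with `ref`."""
    ref = (ref - ref.mean()) / ref.std()
    best_s, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        t = np.roll(tgt, s)
        t = (t - t.mean()) / t.std()
        score = float(np.mean(ref * t))
        if score > best_score:
            best_s, best_score = s, score
    return best_s

# A feature displaced by 3 samples between the two "views" is recovered:
x = np.zeros(64)
x[20:25] = 1.0
y = np.roll(x, 3)                 # second view: same feature, shifted
assert best_shift(x, y, 8) == -3  # rolling y by -3 realigns the views
```

In stereophotogrammetry such per-pixel disparities, found densely across the image pair, are what get converted into the height values of the dense map.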

  14. Development of uncertainty-based work injury model using Bayesian structural equation modelling.

    PubMed

    Chatterjee, Snehamoy

    2014-01-01

    This paper proposed a Bayesian structural equation model (SEM) of miners' work injury for an underground coal mine in India. The environmental and behavioural variables for work injury were identified and causal relationships were developed. For Bayesian modelling, prior distributions of the SEM parameters are necessary to develop the model. In this paper, two approaches were adopted to obtain prior distributions for the factor loading and structural parameters of the SEM. In the first approach, the prior distributions were taken as fixed distribution functions with specific parameter values, whereas in the second approach, prior distributions of the parameters were generated from experts' opinions. The posterior distributions of these parameters were obtained by applying Bayes' rule. Markov chain Monte Carlo sampling, in the form of Gibbs sampling, was applied to sample from the posterior distribution. The results revealed that all coefficients of the structural and measurement model parameters are statistically significant under the experts' opinion-based priors, whereas two coefficients are not statistically significant when the fixed priors are applied. The error statistics reveal that the Bayesian structural model provides a reasonably good fit of work injury, with a high coefficient of determination (0.91) and a lower mean squared error compared to traditional SEM.
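The record's central observation, that the choice between fixed and expert-informed priors changes which estimates look significant, can be illustrated with a much simpler conjugate model. This is a toy normal-mean update with known variance, not the paper's Gibbs-sampled SEM, and every number below is made up.

```python
# Toy conjugate normal-mean update (known observation variance) showing how a
# vague fixed prior vs. a tight expert prior shifts the posterior mean.
# All numbers are hypothetical; this is not the paper's Gibbs-sampled SEM.

def posterior(prior_mu, prior_var, data_mean, data_var, n):
    """Posterior mean and variance of a normal mean with known variance."""
    prec = 1.0 / prior_var + n / data_var
    mu = (prior_mu / prior_var + n * data_mean / data_var) / prec
    return mu, 1.0 / prec

vague_mu, _ = posterior(0.0, 100.0, 0.5, 1.0, 20)  # fixed, nearly flat prior
expert_mu, _ = posterior(0.8, 0.05, 0.5, 1.0, 20)  # tight expert prior at 0.8
print(vague_mu, expert_mu)  # vague stays near the data mean; expert pulls up
```

The same mechanism, at SEM scale, is why the two prior-elicitation strategies in the paper lead to different significance conclusions for two of the coefficients.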

  15. Using multiple group modeling to test moderators in meta-analysis.

    PubMed

    Schoemann, Alexander M

    2016-12-01

    Meta-analysis is a popular and flexible analysis that can be fit in many modeling frameworks. Two methods of fitting meta-analyses that are growing in popularity are structural equation modeling (SEM) and multilevel modeling (MLM). By using SEM or MLM to fit a meta-analysis, researchers gain access to the powerful techniques associated with SEM and MLM. This paper details how to use one such technique, multiple group analysis, to test categorical moderators in meta-analysis. In a multiple group meta-analysis, a model is fit to each level of the moderator simultaneously. By constraining parameters across groups, any model parameter can be tested for equality. Using multiple groups to test for moderators is especially relevant in random-effects meta-analysis, where both the mean and the between-studies variance of the effect size may be compared across groups. A simulation study and the analysis of a real data set are used to illustrate multiple group modeling with both SEM and MLM. Issues related to multiple group meta-analysis and future directions for research are discussed. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Reliability of Summed Item Scores Using Structural Equation Modeling: An Alternative to Coefficient Alpha

    ERIC Educational Resources Information Center

    Green, Samuel B.; Yang, Yanyun

    2009-01-01

    A method is presented for estimating reliability using structural equation modeling (SEM) that allows for nonlinearity between factors and item scores. Assuming the focus is on consistency of summed item scores, this method for estimating reliability is preferred to those based on linear SEM models and to the most commonly reported estimate of…

  17. Structural Equation Modeling (SEM) for Satisfaction and Dissatisfaction Ratings; Multiple Group Invariance Analysis across Scales with Different Response Format

    ERIC Educational Resources Information Center

    Mazaheri, Mehrdad; Theuns, Peter

    2009-01-01

    The current study evaluates three hypothesized models on subjective well-being, comprising life domain ratings (LDR), overall satisfaction with life (OSWL), and overall dissatisfaction with life (ODWL), using structural equation modeling (SEM). A sample of 1,310 volunteering students, randomly assigned to six conditions, rated their overall life…

  18. Comparing Indirect Effects in SEM: A Sequential Model Fitting Method Using Covariance-Equivalent Specifications

    ERIC Educational Resources Information Center

    Chan, Wai

    2007-01-01

    In social science research, an indirect effect occurs when the influence of an antecedent variable on the effect variable is mediated by an intervening variable. To compare indirect effects within a sample or across different samples, structural equation modeling (SEM) can be used if the computer program supports model fitting with nonlinear…

  19. Self-Concealment, Social Network Sites Usage, Social Appearance Anxiety, Loneliness of High School Students: A Model Testing

    ERIC Educational Resources Information Center

    Dogan, Ugur; Çolak, Tugba Seda

    2016-01-01

    This study tested a model explaining social networking site (SNS) usage with structural equation modeling (SEM). Using SEM on a sample of 475 high school students (35% male, 65% female), the model examined the relationships between self-concealment, social appearance anxiety, loneliness, and usage of SNS such as Twitter and Facebook.…

  20. CD-SEM metrology and OPC modeling for 2D patterning in advanced technology nodes (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wallow, Thomas I.; Zhang, Chen; Fumar-Pici, Anita; Chen, Jun; Laenens, Bart; Spence, Christopher A.; Rio, David; van Adrichem, Paul; Dillen, Harm; Wang, Jing; Yang, Peng-Cheng; Gillijns, Werner; Jaenen, Patrick; van Roey, Frieda; van de Kerkhove, Jeroen; Babin, Sergey

    2017-03-01

    In the course of assessing OPC compact modeling capabilities and future requirements, we chose to investigate the interface between CD-SEM metrology methods and OPC modeling in some detail. Two linked observations motivated our study: 1) OPC modeling is, in principle, agnostic of metrology methods and best practice implementation. 2) Metrology teams across the industry use a wide variety of equipment, hardware settings, and image/data analysis methods to generate the large volumes of CD-SEM measurement data that are required for OPC in advanced technology nodes. Initial analyses led to the conclusion that many independent best practice metrology choices based on systematic study as well as accumulated institutional knowledge and experience can be reasonably made. Furthermore, these choices can result in substantial variations in measurement of otherwise identical model calibration and verification patterns. We will describe several experimental 2D test cases (i.e., metal, via/cut layers) that examine how systematic changes in metrology practice impact both the metrology data itself and the resulting full chip compact model behavior. Assessment of specific methodology choices will include: • CD-SEM hardware configurations and settings: these may range from SEM beam conditions (voltage, current, etc.) to magnification, to frame integration optimizations that balance signal-to-noise vs. resist damage. • Image and measurement optimization: these may include choice of smoothing filters for noise suppression, threshold settings, etc. • Pattern measurement methodologies: these may include sampling strategies, CD- and contour-based approaches, and various strategies to optimize the measurement of complex 2D shapes. In addition, we will present conceptual frameworks and experimental methods that allow practitioners of OPC metrology to assess impacts of metrology best practice choices on model behavior. Finally, we will also assess requirements posed by node scaling on OPC model accuracy, and evaluate potential consequences for CD-SEM metrology capabilities and practices.

  1. The Role of Sexually Explicit Material (SEM) in the Sexual Development of Black Young Same-Sex-Attracted Men

    PubMed Central

    Morgan, Anthony; Ogunbajo, Adedotun; Trent, Maria; Harper, Gary W.; Fortenberry, J. Dennis

    2015-01-01

    Sexually explicit material (SEM) (including Internet, video, and print) may play a key role in the lives of Black same-sex sexually active youth by providing the only information from which to learn about sexual development. There is limited school- and/or family-based sex education to serve as a model for sexual behaviors for Black youth. We describe the role SEM plays in the sexual development of a sample of Black same-sex attracted (SSA) young adolescent men ages 15–19. Adolescents recruited from clinics, social networking sites, and through snowball sampling were invited to participate in a 90-min, semi-structured qualitative interview. Most participants described using SEM prior to their first same-sex sexual experience. Participants described using SEM primarily for sexual development, including learning about sexual organs and function, the mechanics of same-gender sex, and to negotiate one’s sexual identity. Secondary functions were to determine readiness for sex; to learn about sexual performance, including understanding sexual roles and responsibilities (e.g., “top” or “bottom”); to introduce sexual performance scripts; and to develop models for how sex should feel (e.g., pleasure and pain). Youth also described engaging in sexual behaviors (including condom non-use and/or swallowing ejaculate) that were modeled on SEM. Comprehensive sexuality education programs should be designed to address the unmet needs of young, Black SSA young men, with explicit focus on sexual roles and behaviors that may be inaccurately portrayed and/or involve sexual risk-taking (such as unprotected anal intercourse and swallowing ejaculate) in SEM. This work also calls for development of Internet-based HIV/STI prevention strategies targeting young Black SSA men who may be accessing SEM. PMID:25677334

  2. Causal modelling applied to the risk assessment of a wastewater discharge.

    PubMed

    Paul, Warren L; Rokahr, Pat A; Webb, Jeff M; Rees, Gavin N; Clune, Tim S

    2016-03-01

    Bayesian networks (BNs), or causal Bayesian networks, have become quite popular in ecological risk assessment and natural resource management because of their utility as a communication and decision-support tool. Since their development in the field of artificial intelligence in the 1980s, however, Bayesian networks have evolved and merged with structural equation modelling (SEM). Unlike BNs, which are constrained to encode causal knowledge in conditional probability tables, SEMs encode this knowledge in structural equations, which is thought to be a more natural language for expressing causal information. This merger has clarified the causal content of SEMs and generalised the method such that it can now be performed using standard statistical techniques. As it was with BNs, the utility of this new generation of SEM in ecological risk assessment will need to be demonstrated with examples to foster an understanding and acceptance of the method. Here, we applied SEM to the risk assessment of a wastewater discharge to a stream, with a particular focus on the process of translating a causal diagram (conceptual model) into a statistical model which might then be used in the decision-making and evaluation stages of the risk assessment. The process of building and testing a spatial causal model is demonstrated using data from a spatial sampling design, and the implications of the resulting model are discussed in terms of the risk assessment. It is argued that a spatiotemporal causal model would have greater external validity than the spatial model, enabling broader generalisations to be made regarding the impact of a discharge, and greater value as a tool for evaluating the effects of potential treatment plant upgrades. Suggestions are made on how the causal model could be augmented to include temporal as well as spatial information, including suggestions for appropriate statistical models and analyses.

  3. Measurement of the Position Angle and Separation of HJ 1924

    NASA Astrophysics Data System (ADS)

    Badami, Umar Ahmed; Tock, Kalée.; Carpenter, Steve; Kruger, Kurt; Freed, Rachel; Genet, Russell

    2018-01-01

    The position angle and separation of the binary HJ 1924 have been measured and noted in 10 publications since John Herschel's initial observation in 1828. Measurement techniques have improved in both precision and accuracy since that time. Although Herschel's initial measurement was slightly different, the position angle and separation of these stars have remained relatively constant for the past 122 years. The system was observed using the Skynet Robotic Telescope Network, and AstroImageJ software was used to contribute a new data point. Our measurement of 8.12" ± 0.0127" (±1 SEM) separation and 225.1° ± 0.0298° (±1 SEM) position angle was in agreement with the 10 most recent published measurements, but not the initial one, implying that Herschel's measurement may have been inaccurate. While these stars appear to exhibit similar proper motion, and may therefore share a common origin, they are unlikely to be gravitationally bound.
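The "± SEM" uncertainties quoted above are standard errors of the mean over repeated measurements (SEM = s/√n, with s the sample standard deviation). A minimal sketch with hypothetical separation values, not the paper's actual data:

```python
import math

# Hypothetical repeated separation measurements of a double star, in arcseconds
separations = [8.10, 8.14, 8.11, 8.13, 8.12, 8.12]

n = len(separations)
mean = sum(separations) / n
# Sample standard deviation (n-1 denominator), then SEM = s / sqrt(n)
s = math.sqrt(sum((x - mean) ** 2 for x in separations) / (n - 1))
sem = s / math.sqrt(n)
print(f'{mean:.2f}" ± {sem:.4f} (±1 SEM)')
```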

  4. KLASS: Kennedy Launch Academy Simulation System

    NASA Technical Reports Server (NTRS)

    Garner, Lesley C.

    2007-01-01

    The software provides access to simulations of sophisticated scientific instruments: a Scanning Electron Microscope (SEM), a Light Microscope, a Scanning Probe Microscope (covering Scanning Tunneling, Atomic Force, and Magnetic Force microscopy), and an Energy Dispersive Spectrometer for the SEM. Flash animation videos explain how each of the instruments works, how they are used at NASA, and how samples are prepared. Measuring and labeling tools are provided with each instrument, and users get hands-on experience controlling the virtual instruments to conduct investigations, much as real scientists at NASA do. The system has a very open architecture, is open source on SourceForge, and makes extensive use of XML. The target audience is high school and entry-level college students. "Many beginning students never get closer to an electron microscope than the photos in their textbooks. But anyone can get a sense of what the instrument can do by downloading this simulator from NASA's Kennedy Space Center." (Science Magazine, April 8th, 2005)

  5. Dealing with Multiple Solutions in Structural Vector Autoregressive Models.

    PubMed

    Beltz, Adriene M; Molenaar, Peter C M

    2016-01-01

    Structural vector autoregressive models (VARs) hold great potential for psychological science, particularly for time series data analysis. They capture the magnitude, direction of influence, and temporal (lagged and contemporaneous) nature of relations among variables. Unified structural equation modeling (uSEM) is an optimal structural VAR instantiation, according to large-scale simulation studies, and it is implemented within an SEM framework. However, little is known about the uniqueness of uSEM results. Thus, the goal of this study was to investigate whether multiple solutions result from uSEM analysis and, if so, to demonstrate ways to select an optimal solution. This was accomplished with two simulated data sets, an empirical data set concerning children's dyadic play, and modifications to the group iterative multiple model estimation (GIMME) program, which implements uSEMs with group- and individual-level relations in a data-driven manner. Results revealed multiple solutions when there were large contemporaneous relations among variables. Results also verified several ways to select the correct solution when the complete solution set was generated, such as the use of cross-validation, maximum standardized residuals, and information criteria. This work has immediate and direct implications for the analysis of time series data and for the inferences drawn from those data concerning human behavior.
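The lagged relations that a structural VAR captures can be estimated by ordinary least squares; the sketch below simulates a bivariate VAR(1) and recovers its coefficient matrix. It omits the contemporaneous paths that uSEM adds, and A_true, T, and the noise scale are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1): x_t = A x_{t-1} + noise
A_true = np.array([[0.5, 0.2],
                   [0.0, 0.4]])
T = 2000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + rng.normal(scale=0.5, size=2)

# OLS estimate of the lagged coefficient matrix: regress x_t on x_{t-1}
Y, X = x[1:], x[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 2))
```

The multiple-solutions issue the paper studies arises when contemporaneous relations are added: different orderings of contemporaneous paths can fit the same covariance structure equally well, which a pure lagged regression like this one does not exhibit.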

  6. Toward a Model of Strategies and Summary Writing Performance

    ERIC Educational Resources Information Center

    Yang, Hui-Chun

    2014-01-01

    This study explores the construct of a summarization test task by means of single-group and multigroup structural equation modeling (SEM). It examines the interrelationships between strategy use and performance, drawing on data from 298 Taiwanese undergraduates' summary essays and their self-reported strategy use. Single-group SEM analyses…

  7. Modeling the Relationships between Subdimensions of Environmental Literacy

    ERIC Educational Resources Information Center

    Genc, Murat; Akilli, Mustafa

    2016-01-01

    The aim of this study is to demonstrate the relationships between subdimensions of environmental literacy using Structural Equation Modeling (SEM). The study was conducted by analyzing students' questionnaire responses using SEM. Initially, Kaiser-Meyer-Olkin and Bartlett's tests were performed to test the appropriateness of the subdimensions to…

  8. Errors of Inference in Structural Equation Modeling

    ERIC Educational Resources Information Center

    McCoach, D. Betsy; Black, Anne C.; O'Connell, Ann A.

    2007-01-01

    Although structural equation modeling (SEM) is one of the most comprehensive and flexible approaches to data analysis currently available, it is nonetheless prone to researcher misuse and misconceptions. This article offers a brief overview of the unique capabilities of SEM and discusses common sources of user error in drawing conclusions from…

  9. A Methodological Review of Structural Equation Modelling in Higher Education Research

    ERIC Educational Resources Information Center

    Green, Teegan

    2016-01-01

    Despite increases in the number of articles published in higher education journals using structural equation modelling (SEM), research addressing their statistical sufficiency, methodological appropriateness and quantitative rigour is sparse. In response, this article provides a census of all covariance-based SEM articles published up until 2013…

  10. Is scanning electron microscopy/energy dispersive X-ray spectrometry (SEM/EDS) quantitative?

    PubMed

    Newbury, Dale E; Ritchie, Nicholas W M

    2013-01-01

    Scanning electron microscopy/energy dispersive X-ray spectrometry (SEM/EDS) is a widely applied elemental microanalysis method capable of identifying and quantifying all elements in the periodic table except H, He, and Li. By following the "k-ratio" (unknown/standard) measurement protocol developed for electron-excited wavelength dispersive spectrometry (WDS), SEM/EDS can achieve accuracy and precision equivalent to WDS at substantially lower electron dose, even when severe X-ray peak overlaps occur, provided sufficient counts are recorded. Achieving this level of performance is now much more practical with the advent of the high-throughput silicon drift detector energy dispersive X-ray spectrometer (SDD-EDS). However, three measurement issues continue to diminish the impact of SEM/EDS: (1) In the qualitative analysis (i.e., element identification) that must precede quantitative analysis, at least some current and many legacy software systems are vulnerable to occasional misidentification of major constituent peaks, with the frequency of misidentifications rising significantly for minor and trace constituents. (2) The use of standardless analysis, which is subject to much broader systematic errors, leads to quantitative results that, while useful, do not have sufficient accuracy to solve critical problems, e.g., determining the formula of a compound. (3) EDS spectrometers have such a large volume of acceptance that apparently credible spectra can be obtained from specimens with complex topography, which introduces uncontrolled geometric factors that modify X-ray generation and propagation, resulting in very large systematic errors, often a factor of ten or more. © Wiley Periodicals, Inc.
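The k-ratio protocol divides the measured characteristic X-ray intensity from the unknown by that from a standard of known composition; Castaing's first approximation then scales the standard's weight fraction by this ratio, before matrix (ZAF) corrections refine the estimate. A toy sketch with made-up intensities, not real spectra:

```python
# Castaing's first approximation: the measured k-ratio times the standard's
# weight fraction estimates the unknown's weight fraction, before matrix
# (ZAF) corrections are applied. Illustrative numbers only.
def first_approximation(i_unknown, i_standard, c_standard):
    """Estimate weight fraction from a k-ratio measurement."""
    k = i_unknown / i_standard
    return k, k * c_standard

# Hypothetical Fe K-alpha intensities against a pure-Fe standard (C = 1.0)
k, c_est = first_approximation(i_unknown=4200.0, i_standard=10000.0, c_standard=1.0)
print(f"k-ratio = {k:.2f}, estimated Fe weight fraction ~ {c_est:.2f}")
```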

  11. AxiSEM3D: broadband seismic wavefields in 3-D aspherical Earth models

    NASA Astrophysics Data System (ADS)

    Leng, K.; Nissen-Meyer, T.; Zad, K. H.; van Driel, M.; Al-Attar, D.

    2017-12-01

    Seismology is the primary tool for data-informed inference of Earth structure and dynamics. Simulating seismic wave propagation at a global scale is fundamental to seismology, but remains one of the most challenging problems in scientific computing, because of both the multiscale nature of Earth's interior and the observable frequency band of seismic data. We present a novel numerical method to simulate global seismic wave propagation in realistic 3-D Earth models. Our method, named AxiSEM3D, is a hybrid of the spectral element method and the pseudospectral method. It reduces the azimuthal dimension of wavefields by means of a global Fourier series parameterization, in which the number of terms can be locally adapted to the inherent azimuthal smoothness of the wavefields. AxiSEM3D allows not only for material heterogeneities, such as velocity, density, anisotropy and attenuation, but also for finite undulations on radial discontinuities, both solid-solid and solid-fluid, and thereby a variety of aspherical Earth features such as ellipticity, topography, variable crustal thickness, and core-mantle boundary topography. Such interface undulations are equivalently interpreted as material perturbations of the contiguous media, based on the "particle relabelling transformation". Efficiency comparisons show that AxiSEM3D can be 1 to 3 orders of magnitude faster than conventional 3-D methods, with the speedup increasing with simulation frequency and decreasing with model complexity, but for all realistic structures the speedup remains at least one order of magnitude. The observable frequency range of global seismic data (up to 1 Hz) has been covered for wavefield modelling on a 3-D Earth model with reasonable computing resources. We show an application of surface wave modelling within a state-of-the-art global crustal model (Crust1.0), with the synthetics compared to real data. The high-performance C++ code is released at github.com/AxiSEM3D/AxiSEM3D.

  12. SEM-PLS Analysis of Inhibiting Factors of Cost Performance for Large Construction Projects in Malaysia: Perspective of Clients and Consultants

    PubMed Central

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

    This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. Questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating significant level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using advanced multivariate statistical software of Structural Equation Modelling (SmartPLS v2). The analysis involved three iteration processes where several of the factors were deleted in order to make the model acceptable. The result of the analysis found that the R² value of the model is 0.422, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, contractor's site management category is the most prominent in exhibiting effect on cost performance of large construction projects. This finding is validated using advanced techniques of power analysis. This rigorous multivariate analysis has explicitly found the significant category which consists of several causative factors to poor cost performance in large construction projects. This will benefit all parties involved in construction projects for controlling cost overrun. PMID:24693227

  13. SEM-PLS analysis of inhibiting factors of cost performance for large construction projects in Malaysia: perspective of clients and consultants.

    PubMed

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

    This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. Questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating significant level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using advanced multivariate statistical software of Structural Equation Modelling (SmartPLS v2). The analysis involved three iteration processes where several of the factors were deleted in order to make the model acceptable. The result of the analysis found that the R² value of the model is 0.422, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, contractor's site management category is the most prominent in exhibiting effect on cost performance of large construction projects. This finding is validated using advanced techniques of power analysis. This rigorous multivariate analysis has explicitly found the significant category which consists of several causative factors to poor cost performance in large construction projects. This will benefit all parties involved in construction projects for controlling cost overrun.

  14. Expanding the Conversation about SEM: Advancing SEM Efforts to Improve Student Learning and Persistence--Part II

    ERIC Educational Resources Information Center

    Yale, Amanda

    2010-01-01

    The first article in this two-part series focused on the need for enrollment management conceptual and organizational models to focus more intentionally and purposefully on efforts related to improving student learning, success, and persistence. Time and again, SEM is viewed from a conventional lens comprising marketing, recruitment and …

  15. Minimal resin embedding of multicellular specimens for targeted FIB-SEM imaging.

    PubMed

    Schieber, Nicole L; Machado, Pedro; Markert, Sebastian M; Stigloher, Christian; Schwab, Yannick; Steyer, Anna M

    2017-01-01

    Correlative light and electron microscopy (CLEM) is a powerful tool to perform ultrastructural analysis of targeted tissues or cells. The large field of view of the light microscope (LM) enables quick and efficient surveys of the whole specimen. It is also compatible with live imaging, giving access to functional assays. CLEM protocols take advantage of these features to efficiently retrace the position of targeted sites when switching from one modality to the other. They most often rely on anatomical cues that are visible by both light and electron microscopy. We present here a simple workflow where multicellular specimens are embedded in minimal amounts of resin, exposing their surface topology that can be imaged by scanning electron microscopy (SEM). LM and SEM both benefit from a large field of view that can cover whole model organisms. As a result, targeting specific anatomic locations by focused ion beam-SEM (FIB-SEM) tomography becomes straightforward. We illustrate this application on three different model organisms, used in our laboratory: the zebrafish embryo Danio rerio, the marine worm Platynereis dumerilii, and the dauer larva of the nematode Caenorhabditis elegans. Here we focus on the experimental steps to reduce the amount of resin covering the samples and to image the specimens inside an FIB-SEM. We expect this approach to have widespread applications for volume electron microscopy on multiple model organisms. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Rule-Based Flight Software Cost Estimation

    NASA Technical Reports Server (NTRS)

    Stukes, Sherry A.; Spagnuolo, John N. Jr.

    2015-01-01

    This paper discusses the fundamental process for the computation of Flight Software (FSW) cost estimates. This process has been incorporated in a rule-based expert system [1] that can be used for Independent Cost Estimates (ICEs), Proposals, and for the validation of Cost Analysis Data Requirements (CADRe) submissions. A high-level directed graph (referred to here as a decision graph) illustrates the steps taken in the production of these estimated costs and serves as a basis of design for the expert system described in this paper. Detailed discussions are subsequently given elaborating upon the methodology, tools, charts, and caveats related to the various nodes of the graph. We present general principles for the estimation of FSW using SEER-SEM as an illustration of these principles when appropriate. Since Source Lines of Code (SLOC) is a major cost driver, a discussion of various SLOC data sources for the preparation of the estimates is given together with an explanation of how contractor SLOC estimates compare with the SLOC estimates used by JPL. Obtaining consistency in code counting will be presented as well as factors used in reconciling SLOC estimates from different code counters. When sufficient data is obtained, a mapping into the JPL Work Breakdown Structure (WBS) from the SEER-SEM output is illustrated. For across the board FSW estimates, as was done for the NASA Discovery Mission proposal estimates performed at JPL, a comparative high-level summary sheet for all missions with the SLOC, data description, brief mission description and the most relevant SEER-SEM parameter values is given to illustrate an encapsulation of the used and calculated data involved in the estimates. The rule-based expert system described provides the user with inputs useful or sufficient to run generic cost estimation programs. This system's incarnation is achieved via the C Language Integrated Production System (CLIPS) and will be addressed at the end of this paper.
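SEER-SEM's internal cost model is proprietary, so as a stand-in for how a SLOC count drives a parametric effort estimate, here is the published basic COCOMO-81 equation (effort = a·KSLOC^b person-months, with Boehm's coefficients); the 32 KSLOC input is invented and none of this reproduces SEER-SEM's actual parameters.

```python
# Basic COCOMO-81 effort equation (Boehm), shown only to illustrate how a
# SLOC count drives a parametric estimate; SEER-SEM's internal model differs.
def cocomo_basic(ksloc, mode="organic"):
    """Effort in person-months for the basic COCOMO-81 model."""
    coeffs = {"organic": (2.4, 1.05),
              "semi-detached": (3.0, 1.12),
              "embedded": (3.6, 1.20)}
    a, b = coeffs[mode]
    return a * ksloc ** b

effort = cocomo_basic(32.0, mode="embedded")  # e.g. 32 KSLOC of flight software
print(f"estimated effort ~ {effort:.0f} person-months")
```

The superlinear exponent (b > 1) is why consistent code counting matters so much in the paper's process: a modest disagreement between code counters compounds into a large effort delta.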

  17. Three-Dimensional FIB/EBSD Characterization of Irradiated HfAl3-Al Composite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hua, Zilong; Guillen, Donna Post; Harris, William

    2016-09-01

    A thermal neutron absorbing material, comprised of 28.4 vol% HfAl3 in an Al matrix, was developed to serve as a conductively cooled thermal neutron filter to enable fast flux materials and fuels testing in a pressurized water reactor. In order to observe the microstructural change of the HfAl3-Al composite due to neutron irradiation, an EBSD-FIB characterization approach is developed and presented in this paper. Using the focused ion beam (FIB), the sample was fabricated to 25 µm × 25 µm × 20 µm and mounted on the grid. A series of operations were carried out repetitively on the sample top surface to prepare it for scanning electron microscopy (SEM). First, a ~100-nm layer was removed by high voltage FIB milling. Then, several cleaning passes were performed on the newly exposed surface using low voltage FIB milling to improve the SEM image quality. Last, the surface was scanned by Electron Backscattering Diffraction (EBSD) to obtain the two-dimensional image. After 50 to 100 two-dimensional images were collected, the images were stacked to reconstruct a three-dimensional model using DREAM.3D software. Two such reconstructed three-dimensional models were obtained from samples of the original and post-irradiation HfAl3-Al composite respectively, from which the most significant microstructural change caused by neutron irradiation appears to be the size reduction of both HfAl3 and Al grains. The possible reason is the thermal expansion and related thermal strain from the thermal neutron absorption. This technique can be applied to three-dimensional microstructure characterization of irradiated materials.

  18. Effect of cobalt doping on structural and dielectric properties of nanocrystalline LaCrO3

    NASA Astrophysics Data System (ADS)

    Zarrin, Naima; Husain, Shahid

    2018-05-01

    Pure and Co doped Lanthanum chromite (LaCrO3) nanoparticles, LaCr1-xCoxO3 (0≤x≤0.3), have been synthesized through a sol-gel process and their structural, morphological and dielectric properties have been studied. X-ray diffraction patterns reveal that the samples are single phase, with an orthorhombic structure in the Pnma space group. Structural parameters are refined by Rietveld refinement using Fullprof software. Lattice parameters and unit cell volume are found to decrease with increasing Co doping. Crystallite size is calculated using the Scherrer equation and is also found to decrease with increasing Co concentration. Surface morphology is examined using SEM-EDX analysis, which confirms the formation of regular and homogeneous samples without any impurities. The value of the dielectric constant (ɛ') decreases with increasing frequency while it increases with increasing Co concentration. The log (ɛ'×f) versus log (f) graphs have been plotted to verify the universal dielectric response (UDR) model. All the samples follow the UDR model in the low frequency range.
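The Scherrer calculation mentioned above is D = Kλ/(β·cosθ), with K a shape factor (~0.9), λ the X-ray wavelength, β the peak FWHM in radians, and θ the Bragg angle. A minimal sketch assuming a Cu Kα source and invented peak parameters (the paper's actual 2θ and FWHM values are not given here):

```python
import math

def scherrer_size(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite size (nm) from XRD peak broadening via the Scherrer equation."""
    theta = math.radians(two_theta_deg / 2.0)  # Bragg angle is half of 2-theta
    beta = math.radians(fwhm_deg)              # FWHM must be in radians
    return k * wavelength_nm / (beta * math.cos(theta))

# Hypothetical reflection near 2-theta = 32.5 deg with 0.4 deg FWHM
size = scherrer_size(two_theta_deg=32.5, fwhm_deg=0.4)
print(f"crystallite size ~ {size:.1f} nm")
```

Broader peaks (larger β) give smaller crystallite sizes, which is the trend the abstract reports with increasing Co concentration.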

  19. Synthesis of zeolite/nickel ferrite/sodium alginate bionanocomposite via a co-precipitation technique for efficient removal of water-soluble methylene blue dye.

    PubMed

    Bayat, Mahsa; Javanbakht, Vahid; Esmaili, Javad

    2018-05-05

    In this study, we sought to synthesize a magnetic zeolite/nickel ferrite nanocomposite through a co-precipitation method and modify its surface with sodium alginate to enhance its methylene blue adsorption capacity and to prevent its oxidation. Nanocomposite characteristics were investigated by SEM, VSM, XRD and FTIR analyses. The results indicate that nanocomposite synthesis and modification were successful. Adsorption thermodynamics, kinetics, and isotherms were examined and parameters were optimized in Minitab software using an experimental design method, response surface methodology and a Box-Behnken design. The highest methylene blue adsorption capacity from aqueous solution, about 54.05 mg/g, was obtained at an optimal pH of 5, an initial dye concentration of 10 mg/L, and an adsorbent amount of 0.03 g. Analyzing kinetic data of the adsorption experiments confirmed that the adsorption process follows the pseudo-second-order kinetic model. Assessing equilibrium isotherm data at different temperatures showed that these data are in good agreement with the Langmuir isotherm model. Copyright © 2018 Elsevier B.V. All rights reserved.
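The pseudo-second-order check can be sketched by fitting the model's linearized form, t/qt = 1/(k2·qe²) + t/qe, to uptake data: the slope gives qe and the intercept gives k2. The data below are synthetic and noise-free, with qe_true and k2_true invented (qe_true loosely echoes the ~54 mg/g capacity), not the paper's measurements.

```python
import numpy as np

# Pseudo-second-order model: q_t = (k2 * qe**2 * t) / (1 + k2 * qe * t).
# Its linearized form, t/q_t = 1/(k2*qe**2) + t/qe, lets qe and k2 be read
# off a straight-line fit of t/q_t against t.
qe_true, k2_true = 54.0, 0.004          # mg/g and g/(mg*min), illustrative
t = np.array([5.0, 10, 20, 40, 60, 90, 120, 180])
qt = (k2_true * qe_true**2 * t) / (1 + k2_true * qe_true * t)

slope, intercept = np.polyfit(t, t / qt, 1)
qe_fit = 1.0 / slope
k2_fit = slope**2 / intercept
print(f"qe ~ {qe_fit:.1f} mg/g, k2 ~ {k2_fit:.4f}")
```

With real (noisy) data the quality of the straight-line fit (R²) is what is compared against the pseudo-first-order alternative.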

  20. The contributions of human factors on human error in Malaysia aviation maintenance industries

    NASA Astrophysics Data System (ADS)

    Padil, H.; Said, M. N.; Azizan, A.

    2018-05-01

    Aviation maintenance is a multitasking activity in which individuals perform varied tasks under constant pressure to meet deadlines as well as challenging work conditions. These situational characteristics combined with human factors can lead to various types of human related errors. The primary objective of this research is to develop a structural relationship model that incorporates human factors, organizational factors, and their impact on human errors in aviation maintenance. Towards that end, a questionnaire was developed and administered to Malaysian aviation maintenance professionals. A Structural Equation Modelling (SEM) approach was used in this study utilizing AMOS software. Results showed a significant relationship between human factors and human errors, as tested in the model. Human factors had a partial effect on organizational factors, while organizational factors had a direct and positive impact on human errors. It was also revealed that organizational factors contributed to human errors when coupled with the human factors construct. This study has contributed to the advancement of knowledge on human factors affecting safety, has provided guidelines for improving human factors performance relating to aviation maintenance activities, and could be used as a reference for improving safety performance in Malaysian aviation maintenance companies.

  1. Mathematical model of the seismic electromagnetic signals (SEMS) in non crystalline substances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis, L. C. C.; Yahya, N.; Daud, H.

The mathematical model of seismic electromagnetic waves in non-crystalline substances is developed, and the solutions are discussed to show the possibility of improving the electromagnetic waves, especially the electric field. The shear stress of the medium, a fourth-order tensor, gives the equation of motion. Analytic methods are selected for the solutions, written in Hansen vector form. From the simulated SEMS, the frequency of the seismic waves has significant effects on the SEMS propagation characteristics. EM waves transform into SEMS or energized seismic waves. Traveling distance increases as the frequency of the seismic waves increases from 100% to 1000%. SEMS with greater seismic frequency give seismic-like waves, but greater energy is embedded by the EM waves and hence the waves travel a further distance.

  2. Subjective Values of Quality of Life Dimensions in Elderly People. A SEM Preference Model Approach

    ERIC Educational Resources Information Center

    Elosua, Paula

    2011-01-01

    This article proposes a Thurstonian model in the framework of Structural Equation Modelling (SEM) to assess preferences among quality of life dimensions for the elderly. Data were gathered by a paired comparison design in a sample comprised of 323 people aged from 65 to 94 years old. Five dimensions of quality of life were evaluated: Health,…

  3. Clinical Development of Gamitrinib, a Novel Mitochondrial-Targeted Small Molecule Hsp90 Inhibitor

    DTIC Science & Technology

    2015-09-01

Group 2 and Group 3 animals examined at the end of the 7-repeated doses was comparable to those in control Group 1 animals (Figure 2). (7) Despite... posttest (for more than two-group comparisons) using a GraphPad software package (Prism 6.0) for Windows. Data are expressed as mean ± SD or mean ± SEM...Benjamini Y, Hochberg Y (1995) Controlling the false discovery rate: A practical and powerful approach to multiple testing. J R Stat Soc Series B Stat

  4. A Review of Structural Equation Modeling Applications in Turkish Educational Science Literature, 2010-2015

    ERIC Educational Resources Information Center

    Karakaya-Ozyer, Kubra; Aksu-Dunya, Beyza

    2018-01-01

    Structural equation modeling (SEM) is one of the most popular multivariate statistical techniques in Turkish educational research. This study elaborates the SEM procedures employed by 75 educational research articles which were published from 2010 to 2015 in Turkey. After documenting and coding 75 academic papers, categorical frequencies and…

  5. Solutions for Missing Data in Structural Equation Modeling

    ERIC Educational Resources Information Center

    Carter, Rufus Lynn

    2006-01-01

    Many times in both educational and social science research it is impossible to collect data that is complete. When administering a survey, for example, people may answer some questions and not others. This missing data causes a problem for researchers using structural equation modeling (SEM) techniques for data analyses. Because SEM and…

  6. Meta-Analytic Structural Equation Modeling: A Two-Stage Approach

    ERIC Educational Resources Information Center

    Cheung, Mike W. L.; Chan, Wai

    2005-01-01

    To synthesize studies that use structural equation modeling (SEM), researchers usually use Pearson correlations (univariate r), Fisher z scores (univariate z), or generalized least squares (GLS) to combine the correlation matrices. The pooled correlation matrix is then analyzed by the use of SEM. Questionable inferences may occur for these ad hoc…

  7. Comparison of non-Gaussian and Gaussian diffusion models of diffusion weighted imaging of rectal cancer at 3.0 T MRI.

    PubMed

    Zhang, Guangwen; Wang, Shuangshuang; Wen, Didi; Zhang, Jing; Wei, Xiaocheng; Ma, Wanling; Zhao, Weiwei; Wang, Mian; Wu, Guosheng; Zhang, Jinsong

    2016-12-09

Water molecular diffusion in vivo is much more complicated than a simple Gaussian process. We aimed to compare non-Gaussian diffusion models of diffusion-weighted imaging (DWI), including intra-voxel incoherent motion (IVIM) and the stretched-exponential model (SEM), with the Gaussian diffusion model at 3.0 T MRI in patients with rectal cancer, and to determine the optimal model for investigating water diffusion properties and characterizing rectal carcinoma. Fifty-nine consecutive patients with pathologically confirmed rectal adenocarcinoma underwent DWI with 16 b-values on a 3.0 T MRI system. DWI signals were fitted to the mono-exponential and non-Gaussian diffusion models (IVIM-mono, IVIM-bi and SEM) for the primary tumor and adjacent normal rectal tissue. Parameters of standard apparent diffusion coefficient (ADC), slow- and fast-ADC, fraction of fast ADC (f), α value and distributed diffusion coefficient (DDC) were generated and compared between the tumor and normal tissues. The SEM exhibited the best fit to the actual DWI signal in rectal cancer and the normal rectal wall (R² = 0.998 and 0.999, respectively). The DDC achieved a relatively high area under the curve (AUC = 0.980) in differentiating tumor from normal rectal wall. Non-Gaussian diffusion models could assess tissue properties more accurately than the ADC derived from the Gaussian diffusion model. SEM may be used as a potential optimal model for characterization of rectal cancer.
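The stretched-exponential model (SEM) referenced in this abstract is commonly written S(b) = S0·exp(−(b·DDC)^α), where α = 1 recovers mono-exponential (Gaussian) decay. A hedged sketch of fitting it to a multi-b-value signal follows; the b-values and parameter values are illustrative, not the study's data.

```python
# Stretched-exponential (SEM) fit to a synthetic multi-b-value DWI signal.
# b-values and parameter values are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def sem_signal(b, s0, ddc, alpha):
    # S(b) = S0 * exp(-(b * DDC)^alpha); alpha = 1 is mono-exponential decay
    return s0 * np.exp(-(b * ddc) ** alpha)

b = np.array([0, 25, 50, 75, 100, 150, 200, 400, 600,
              800, 1000, 1200, 1500, 2000, 2500, 3000], dtype=float)  # s/mm^2
signal = sem_signal(b, 1.0, 1.2e-3, 0.85)           # synthetic "measurements"
popt, _ = curve_fit(sem_signal, b, signal, p0=[0.9, 1.0e-3, 0.9])
s0_fit, ddc_fit, alpha_fit = popt
```

With real data the fitted α quantifies intravoxel heterogeneity (α < 1 indicating deviation from Gaussian diffusion) and DDC plays the role the abstract describes in tumor/normal discrimination.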

  8. Associations Between Croatian Adolescents' Use of Sexually Explicit Material and Sexual Behavior: Does Parental Monitoring Play a Role?

    PubMed

    Tomić, Ivan; Burić, Jakov; Štulhofer, Aleksandar

    2017-10-25

The use of sexually explicit material (SEM) has become a part of adolescent sexual socialization, at least in the Western world. Adolescent and young people's SEM use has been associated with risky sexual behaviors, which has recently resulted in policy debates about restricting access to SEM. Such development seems to suggest a crisis of the preventive role of parental oversight. Based on the Differential Susceptibility to Media Effects Model, this study assessed the role of parental monitoring in the context of adolescent vulnerability to SEM-associated risky or potentially adverse outcomes (sexual activity, sexual aggressiveness, and sexting). Using an online sample of Croatian 16-year-olds (N = 1265) and a structural equation modeling approach, parental monitoring was found to be consistently and negatively related to the problematic behavioral outcomes, regardless of participants' gender. While SEM use was related to sexual experience and sexting, higher levels of parental monitoring were associated with less frequent SEM use and lower acceptance of sexual permissiveness. Despite parents' fears about losing the ability to monitor their adolescent children's lives in the Internet era, there is evidence that parental engagement remains an important protective factor.

  9. Perceived threat and corroboration: key factors that improve a predictive model of trust in internet-based health information and advice.

    PubMed

    Harris, Peter R; Sillence, Elizabeth; Briggs, Pam

    2011-07-27

    How do people decide which sites to use when seeking health advice online? We can assume, from related work in e-commerce, that general design factors known to affect trust in the site are important, but in this paper we also address the impact of factors specific to the health domain. The current study aimed to (1) assess the factorial structure of a general measure of Web trust, (2) model how the resultant factors predicted trust in, and readiness to act on, the advice found on health-related websites, and (3) test whether adding variables from social cognition models to capture elements of the response to threatening, online health-risk information enhanced the prediction of these outcomes. Participants were asked to recall a site they had used to search for health-related information and to think of that site when answering an online questionnaire. The questionnaire consisted of a general Web trust questionnaire plus items assessing appraisals of the site, including threat appraisals, information checking, and corroboration. It was promoted on the hungersite.com website. The URL was distributed via Yahoo and local print media. We assessed the factorial structure of the measures using principal components analysis and modeled how well they predicted the outcome measures using structural equation modeling (SEM) with EQS software. We report an analysis of the responses of participants who searched for health advice for themselves (N = 561). Analysis of the general Web trust questionnaire revealed 4 factors: information quality, personalization, impartiality, and credible design. In the final SEM model, information quality and impartiality were direct predictors of trust. However, variables specific to eHealth (perceived threat, coping, and corroboration) added substantially to the ability of the model to predict variance in trust and readiness to act on advice on the site. 
The final model achieved a satisfactory fit: χ²(5) = 10.8 (P = .21), comparative fit index = .99, root mean square error of approximation = .052. The model accounted for 66% of the variance in trust and 49% of the variance in readiness to act on the advice. Adding variables specific to eHealth enhanced the ability of a model of trust to predict trust and readiness to act on advice.
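The fit indices quoted in this abstract follow standard definitions built from the model and baseline chi-square statistics. A small sketch of the textbook formulas follows; conventions differ slightly across SEM packages such as EQS (e.g. N versus N − 1 in the RMSEA denominator), so these formulas may not reproduce published software output exactly.

```python
# Textbook formulas for two common SEM fit indices. Package conventions
# (N vs. N-1, baseline model choice) vary, so values may differ slightly
# from published software output.
import math

def rmsea(chi2_m, df_m, n):
    # Root mean square error of approximation, N-1 convention
    return math.sqrt(max(chi2_m - df_m, 0.0) / (df_m * (n - 1)))

def cfi(chi2_m, df_m, chi2_b, df_b):
    # Comparative fit index from model (m) and baseline (b) chi-squares
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

# Model chi-square from the abstract: chi^2(5) = 10.8, N = 561
print(round(rmsea(10.8, 5, 561), 3))   # → 0.046
```

Values below roughly .06 for RMSEA and above .95 for CFI are conventionally read as good fit, consistent with the abstract's conclusion.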

  10. Perceived Threat and Corroboration: Key Factors That Improve a Predictive Model of Trust in Internet-based Health Information and Advice

    PubMed Central

    Harris, Peter R; Briggs, Pam

    2011-01-01

    Background How do people decide which sites to use when seeking health advice online? We can assume, from related work in e-commerce, that general design factors known to affect trust in the site are important, but in this paper we also address the impact of factors specific to the health domain. Objective The current study aimed to (1) assess the factorial structure of a general measure of Web trust, (2) model how the resultant factors predicted trust in, and readiness to act on, the advice found on health-related websites, and (3) test whether adding variables from social cognition models to capture elements of the response to threatening, online health-risk information enhanced the prediction of these outcomes. Methods Participants were asked to recall a site they had used to search for health-related information and to think of that site when answering an online questionnaire. The questionnaire consisted of a general Web trust questionnaire plus items assessing appraisals of the site, including threat appraisals, information checking, and corroboration. It was promoted on the hungersite.com website. The URL was distributed via Yahoo and local print media. We assessed the factorial structure of the measures using principal components analysis and modeled how well they predicted the outcome measures using structural equation modeling (SEM) with EQS software. Results We report an analysis of the responses of participants who searched for health advice for themselves (N = 561). Analysis of the general Web trust questionnaire revealed 4 factors: information quality, personalization, impartiality, and credible design. In the final SEM model, information quality and impartiality were direct predictors of trust. However, variables specific to eHealth (perceived threat, coping, and corroboration) added substantially to the ability of the model to predict variance in trust and readiness to act on advice on the site. 
The final model achieved a satisfactory fit: χ²(5) = 10.8 (P = .21), comparative fit index = .99, root mean square error of approximation = .052. The model accounted for 66% of the variance in trust and 49% of the variance in readiness to act on the advice. Conclusions Adding variables specific to eHealth enhanced the ability of a model of trust to predict trust and readiness to act on advice. PMID:21795237

  11. Nutrition, Balance and Fear of Falling as Predictors of Risk for Falls among Filipino Elderly in Nursing Homes: A Structural Equation Model (SEM)

    ERIC Educational Resources Information Center

    de Guzman, Allan B.; Ines, Joanna Louise C.; Inofinada, Nina Josefa A.; Ituralde, Nielson Louie J.; Janolo, John Robert E.; Jerezo, Jnyv L.; Jhun, Hyae Suk J.

    2013-01-01

    While a number of empirical studies have been conducted regarding risk for falls among the elderly, there is still a paucity of similar studies in a developing country like the Philippines. This study purports to test through Structural Equation Modeling (SEM) a model that shows the interaction between and among nutrition, balance, fear of…

  12. A Model for Assessing the Liability of Seemingly Correct Software

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Voas, Larry K.; Miller, Keith W.

    1991-01-01

    Current research on software reliability does not lend itself to quantitatively assessing the risk posed by a piece of life-critical software. Black-box software reliability models are too general and make too many assumptions to be applied confidently to assessing the risk of life-critical software. We present a model for assessing the risk caused by a piece of software; this model combines software testing results and Hamlet's probable correctness model. We show how this model can assess software risk for those who insure against a loss that can occur if life-critical software fails.

  13. MAMA User Guide v2.0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaschen, Brian Keith; Bloch, Jeffrey Joseph; Porter, Reid

Morphological signatures of bulk SNM materials have significant promise, but these potential signatures are not fully utilized. This document describes software tools, collectively called the MAMA (Morphological Analysis for Material Attribution) software, that can help provide robust and accurate quantification of morphological features in bulk material microscopy images (optical, SEM). Although many of the specific tools are not unique to MAMA, the software package has been designed specifically for nuclear material morphological analysis, and is at a point where it can be easily adapted (by Los Alamos or by collaborators) in response to new, different, or changing forensics needs. The current release of the MAMA software includes only the image quantification, description, and annotation functionality. Only limited information on a sample, its pedigree, and its chemistry is recorded inside this part of the software. This decision was based on initial feedback and the fact that there are several analytical chemistry databases being developed within the community. Currently MAMA is a standalone program that can export quantification results in a basic text format that can be imported into other programs such as Excel and Access. There is also a basic report-generating feature that produces HTML-formatted pages of the same information. We will be working with collaborators to provide better integration of MAMA into their particular systems, databases and workflows.

  14. Optimization of the fabrication of novel stealth PLA-based nanoparticles by dispersion polymerization using D-optimal mixture design

    PubMed Central

    Adesina, Simeon K.; Wight, Scott A.; Akala, Emmanuel O.

    2015-01-01

Purpose Nanoparticle size is important in drug delivery. Clearance of nanoparticles by cells of the reticuloendothelial system has been reported to increase with particle size. Further, nanoparticles should be small enough to avoid lung or spleen filtering effects. Endocytosis and accumulation in tumor tissue by the enhanced permeability and retention effect are also processes that are influenced by particle size. We present the results of studies designed to optimize crosslinked biodegradable stealth polymeric nanoparticles fabricated by dispersion polymerization. Methods Nanoparticles were fabricated using different amounts of macromonomer, initiators, crosslinking agent and stabilizer in a dioxane/DMSO/water solvent system. Confirmation of nanoparticle formation was by scanning electron microscopy (SEM). Particle size was measured by dynamic light scattering (DLS). D-optimal mixture statistical experimental design was used for the experimental runs, followed by model generation (Scheffe polynomial) and optimization with the aid of computer software. Model verification was done by comparing particle size data of some suggested solutions to the predicted particle sizes. Results and Conclusion Data showed that average particle sizes follow the same trend as predicted by the model. Negative terms in the model corresponding to the crosslinking agent and stabilizer indicate the important factors for minimizing particle size. PMID:24059281

  15. Optimization of the fabrication of novel stealth PLA-based nanoparticles by dispersion polymerization using D-optimal mixture design.

    PubMed

    Adesina, Simeon K; Wight, Scott A; Akala, Emmanuel O

    2014-11-01

Nanoparticle size is important in drug delivery. Clearance of nanoparticles by cells of the reticuloendothelial system has been reported to increase with particle size. Further, nanoparticles should be small enough to avoid lung or spleen filtering effects. Endocytosis and accumulation in tumor tissue by the enhanced permeability and retention effect are also processes that are influenced by particle size. We present the results of studies designed to optimize cross-linked biodegradable stealth polymeric nanoparticles fabricated by dispersion polymerization. Nanoparticles were fabricated using different amounts of macromonomer, initiators, crosslinking agent and stabilizer in a dioxane/DMSO/water solvent system. Confirmation of nanoparticle formation was by scanning electron microscopy (SEM). Particle size was measured by dynamic light scattering (DLS). D-optimal mixture statistical experimental design was used for the experimental runs, followed by model generation (Scheffe polynomial) and optimization with the aid of computer software. Model verification was done by comparing particle size data of some suggested solutions to the predicted particle sizes. Data showed that average particle sizes follow the same trend as predicted by the model. Negative terms in the model corresponding to the cross-linking agent and stabilizer indicate the important factors for minimizing particle size.
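A Scheffé polynomial for a mixture design omits the intercept and uses the component proportions plus their pairwise products as terms. A minimal illustration of fitting a quadratic Scheffé model by least squares follows; the components, design points and coefficients are hypothetical, not the study's formulation data.

```python
# Least-squares fit of a quadratic Scheffe mixture polynomial (no intercept;
# terms are component proportions and their pairwise products). Design
# points and coefficients are hypothetical illustrations.
import numpy as np

def scheffe_quadratic(X):
    # Columns: x1, x2, x3, x1*x2, x1*x3, x2*x3
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

# Simplex-centroid style points for a 3-component mixture (rows sum to 1)
X = np.array([
    [1, 0, 0], [0, 1, 0], [0, 0, 1],
    [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5],
    [1/3, 1/3, 1/3],
])
beta_true = np.array([120.0, 90.0, 150.0, -40.0, 25.0, -10.0])  # e.g. size, nm
y = scheffe_quadratic(X) @ beta_true            # synthetic responses
beta_hat, *_ = np.linalg.lstsq(scheffe_quadratic(X), y, rcond=None)
```

Negative coefficients, like those the abstract attributes to the crosslinking agent and stabilizer, mark terms whose increase drives the response (particle size) downward.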

  16. Examination of Self-Determination within the Sport Education Model

    ERIC Educational Resources Information Center

    Perlman, Dana J.

    2011-01-01

    The purpose of this study was to examine the influence of the Sport Education Model (SEM) on students' self-determined motivation and underlying psychological need(s) in physical education. A total of 182 Year-9 students were engaged in 20 lesson units of volleyball, using either the SEM or a traditional approach. Data was collected using a…

  17. Meta-Analytic Methods of Pooling Correlation Matrices for Structural Equation Modeling under Different Patterns of Missing Data

    ERIC Educational Resources Information Center

    Furlow, Carolyn F.; Beretvas, S. Natasha

    2005-01-01

    Three methods of synthesizing correlations for meta-analytic structural equation modeling (SEM) under different degrees and mechanisms of missingness were compared for the estimation of correlation and SEM parameters and goodness-of-fit indices by using Monte Carlo simulation techniques. A revised generalized least squares (GLS) method for…

  18. A SEM Model in Assessing the Effect of Convergent, Divergent and Logical Thinking on Students' Understanding of Chemical Phenomena

    ERIC Educational Resources Information Center

    Stamovlasis, D.; Kypraios, N.; Papageorgiou, G.

    2015-01-01

    In this study, structural equation modeling (SEM) is applied to an instrument assessing students' understanding of chemical change. The instrument comprised items on understanding the structure of substances, chemical changes and their interpretation. The structural relationships among particular groups of items are investigated and analyzed using…

  19. Modelling of electron beam induced nanowire attraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bitzer, Lucas A.; Benson, Niels, E-mail: niels.benson@uni-due.de; Schmechel, Roland

    2016-04-14

Scanning electron microscope (SEM) induced nanowire (NW) attraction or bundling is a well-known effect, which is mainly ascribed to structural or material-dependent properties. However, there have also been recent reports of electron beam induced nanowire bending during SEM imaging, which is not fully explained by the current models, especially when considering the electro-dynamic interaction between NWs. In this article, we contribute to the understanding of this phenomenon by introducing an electro-dynamic model based on capacitor and Lorentz force interaction, in which the active NW bending is stimulated by an electromagnetic force between individual wires. The model includes geometrical, electrical, and mechanical NW parameters, as well as the influence of the electron beam source parameters, and is validated using in-situ observations of electron beam induced GaAs NW bending by SEM imaging.

  20. Modeling SMAP Spacecraft Attitude Control Estimation Error Using Signal Generation Model

    NASA Technical Reports Server (NTRS)

    Rizvi, Farheen

    2016-01-01

Two ground simulation software packages are used to model the SMAP spacecraft dynamics. The CAST software uses a higher-fidelity model than the ADAMS software. The ADAMS software models the spacecraft plant, controller and actuator models, and assumes a perfect sensor and estimator model. In this simulation study, the spacecraft dynamics results from the ADAMS software are used as the CAST software is unavailable. The main source of spacecraft dynamics error in the higher-fidelity CAST software is the estimation error. A signal generation model is developed to capture the effect of this estimation error on the overall spacecraft dynamics. This signal generation model is then included in the ADAMS software spacecraft dynamics estimate such that the results are similar to CAST. The signal generation model has mean, variance and power spectral density characteristics similar to the true CAST estimation error. In this way, the ADAMS software can still be used while capturing the higher-fidelity spacecraft dynamics modeling from the CAST software.
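One generic way to build a signal generation model that matches the mean, variance and power spectral density of a reference error record is a phase-randomized surrogate: keep the amplitude spectrum and redraw the phases. This is a common technique, not necessarily the one used for SMAP, and the "estimation error" record below is synthetic.

```python
# Phase-randomized surrogate: generates a signal whose mean, variance and
# power spectral density match a reference record. Generic sketch; the
# "error" record here is synthetic, not SMAP data.
import numpy as np

def surrogate(x, rng):
    X = np.fft.rfft(x - x.mean())            # spectrum of deviations from mean
    phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
    phases[0] = 0.0                          # keep the DC bin real
    if x.size % 2 == 0:
        phases[-1] = 0.0                     # keep the Nyquist bin real
    Xs = np.abs(X) * np.exp(1j * phases)     # same magnitudes, new phases
    return np.fft.irfft(Xs, n=x.size) + x.mean()

rng = np.random.default_rng(0)
# Illustrative "estimation error": low-pass filtered white noise
err = np.convolve(rng.standard_normal(4096), np.ones(16) / 16, mode="same")
sim = surrogate(err, rng)
```

Because the amplitude spectrum is preserved bin by bin, Parseval's theorem guarantees the surrogate's variance (and PSD) equals the reference's, while the redrawn phases make it a new realization.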

  1. Fitting direct covariance structures by the MSTRUCT modeling language of the CALIS procedure.

    PubMed

    Yung, Yiu-Fai; Browne, Michael W; Zhang, Wei

    2015-02-01

    This paper demonstrates the usefulness and flexibility of the general structural equation modelling (SEM) approach to fitting direct covariance patterns or structures (as opposed to fitting implied covariance structures from functional relationships among variables). In particular, the MSTRUCT modelling language (or syntax) of the CALIS procedure (SAS/STAT version 9.22 or later: SAS Institute, 2010) is used to illustrate the SEM approach. The MSTRUCT modelling language supports a direct covariance pattern specification of each covariance element. It also supports the input of additional independent and dependent parameters. Model tests, fit statistics, estimates, and their standard errors are then produced under the general SEM framework. By using numerical and computational examples, the following tests of basic covariance patterns are illustrated: sphericity, compound symmetry, and multiple-group covariance patterns. Specification and testing of two complex correlation structures, the circumplex pattern and the composite direct product models with or without composite errors and scales, are also illustrated by the MSTRUCT syntax. It is concluded that the SEM approach offers a general and flexible modelling of direct covariance and correlation patterns. In conjunction with the use of SAS macros, the MSTRUCT syntax provides an easy-to-use interface for specifying and fitting complex covariance and correlation structures, even when the number of variables or parameters becomes large. © 2014 The British Psychological Society.
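Outside SAS, the compound-symmetry test described above can be sketched directly from the maximum-likelihood discrepancy function that general SEM software minimizes. The closed-form averages used for the constrained estimates and the degrees-of-freedom count are standard for this pattern, but this is an illustrative sketch on synthetic data, not the CALIS implementation.

```python
# Likelihood-ratio test of a compound-symmetry covariance pattern via the
# ML discrepancy function. Illustrative sketch on synthetic data; not the
# SAS PROC CALIS / MSTRUCT implementation.
import numpy as np
from scipy.stats import chi2

def cs_matrix(var, cov, p):
    # Compound symmetry: common variance on the diagonal, common covariance off it
    return cov * np.ones((p, p)) + (var - cov) * np.eye(p)

def ml_discrepancy(sigma, s):
    # F_ML = ln|Sigma| + tr(S Sigma^-1) - ln|S| - p  (>= 0; zero iff Sigma == S)
    p = s.shape[0]
    return (np.linalg.slogdet(sigma)[1] + np.trace(s @ np.linalg.inv(sigma))
            - np.linalg.slogdet(s)[1] - p)

rng = np.random.default_rng(1)
p, n = 4, 500
x = rng.multivariate_normal(np.zeros(p), cs_matrix(2.0, 0.8, p), size=n)
s = np.cov(x, rowvar=False)

# Under compound symmetry the constrained estimates reduce to simple averages
var_hat = np.mean(np.diag(s))
cov_hat = (s.sum() - np.trace(s)) / (p * (p - 1))
t_stat = (n - 1) * ml_discrepancy(cs_matrix(var_hat, cov_hat, p), s)
df = p * (p + 1) // 2 - 2        # 10 distinct sample moments, 2 free parameters
p_value = chi2.sf(t_stat, df)
```

The same recipe generalizes to sphericity or multiple-group patterns by changing the constrained structure and recounting the free parameters, which is exactly the flexibility the MSTRUCT language provides declaratively.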

  2. A Linear Empirical Model of Self-Regulation on Flourishing, Health, Procrastination, and Achievement, Among University Students

    PubMed Central

    Garzón-Umerenkova, Angélica; de la Fuente, Jesús; Amate, Jorge; Paoloni, Paola V.; Fadda, Salvatore; Pérez, Javier Fiz

    2018-01-01

This research aimed to analyze the linear bivariate correlations and structural relations between self-regulation, as a central construct, and flourishing, health, procrastination and academic performance in an academic context. A total of 363 college students took part: 101 men (27.8%) and 262 women (72.2%). Participants had an average age of 22 years, were between the first and fifth year of studies, and came from five different programs at two universities in Bogotá, Colombia. A validated ad hoc questionnaire of physical and psychological health was applied along with a battery of tests to measure self-regulation, procrastination, and flourishing. Pearson bivariate correlations were computed using SPSS software (v. 22.0), and predictive structural relationships were analyzed using an SEM in AMOS software (v. 22.0). Regarding the linear associations, it was established that (1) self-regulation has a significant positive association with flourishing and overall health, and a negative association with procrastination. Regarding the structural relations, the model confirmed that (2) self-regulation is a direct and positive predictor of flourishing and health; (3) self-regulation predicts procrastination directly and negatively, and academic performance indirectly and positively; and (4) age and gender have prediction effects on the analyzed variables. Implications, limitations and future research scope are discussed. PMID:29706922

  3. A Linear Empirical Model of Self-Regulation on Flourishing, Health, Procrastination, and Achievement, Among University Students.

    PubMed

    Garzón-Umerenkova, Angélica; de la Fuente, Jesús; Amate, Jorge; Paoloni, Paola V; Fadda, Salvatore; Pérez, Javier Fiz

    2018-01-01

This research aimed to analyze the linear bivariate correlations and structural relations between self-regulation, as a central construct, and flourishing, health, procrastination and academic performance in an academic context. A total of 363 college students took part: 101 men (27.8%) and 262 women (72.2%). Participants had an average age of 22 years, were between the first and fifth year of studies, and came from five different programs at two universities in Bogotá, Colombia. A validated ad hoc questionnaire of physical and psychological health was applied along with a battery of tests to measure self-regulation, procrastination, and flourishing. Pearson bivariate correlations were computed using SPSS software (v. 22.0), and predictive structural relationships were analyzed using an SEM in AMOS software (v. 22.0). Regarding the linear associations, it was established that (1) self-regulation has a significant positive association with flourishing and overall health, and a negative association with procrastination. Regarding the structural relations, the model confirmed that (2) self-regulation is a direct and positive predictor of flourishing and health; (3) self-regulation predicts procrastination directly and negatively, and academic performance indirectly and positively; and (4) age and gender have prediction effects on the analyzed variables. Implications, limitations and future research scope are discussed.

  4. Functionalized silica nanoparticles as a carrier for Betamethasone Sodium Phosphate: Drug release study and statistical optimization of drug loading by response surface method.

    PubMed

    Ghasemnejad, M; Ahmadi, E; Mohamadnia, Z; Doustgani, A; Hashemikia, S

    2015-11-01

Mesoporous silica nanoparticles with a hexagonal structure (SBA-15) were synthesized and modified with (3-aminopropyl) triethoxysilane (APTES), and their performance as a carrier for a drug delivery system was studied. The chemical structure and morphology of the synthesized and modified SBA-15 were characterized by SEM, BET, TEM, FT-IR and CHN techniques. Betamethasone Sodium Phosphate (BSP), a water-soluble drug, was loaded on the mesoporous silica particles for the first time. The response surface method was employed to obtain the optimum conditions for drug/silica nanoparticle preparation, using Design-Expert software. The effects of time, pH of the preparative media, and drug/silica ratio on the drug loading efficiency were investigated with the software. The maximum loading (33.69%) was achieved under the optimized conditions (pH 1.8, time 3.54 h, and drug/silica ratio 1.7). The in vitro release behavior of the drug-loaded particles under various pH values was evaluated. Finally, the release kinetics of the drug were investigated using the Higuchi and Korsmeyer-Peppas models. Cell culture and cytotoxicity assays revealed that the synthesized product has no cytotoxicity against the human bladder cell line 5637. Accordingly, the produced drug-loaded nanostructures can be applied via different routes, such as implantation and topical or oral administration. Copyright © 2015 Elsevier B.V. All rights reserved.
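The Korsmeyer-Peppas model (Mt/M∞ = k·t^n) is linear in log-log space, and the Higuchi model corresponds to release proportional to √t (the n = 0.5 special case). A brief sketch of fitting both on synthetic release data follows; the times and release fractions are illustrative, not the study's.

```python
# Log-log fit of the Korsmeyer-Peppas release model and a Higuchi fit for
# comparison; release times and fractions are illustrative, not the study's.
import numpy as np

t = np.array([0.5, 1, 2, 4, 6, 8, 12, 24], dtype=float)   # hours
frac = 0.22 * t ** 0.45                                   # synthetic Mt/Minf
# Korsmeyer-Peppas: log(Mt/Minf) = log(k) + n*log(t), linear in log-log space
n_hat, log_k = np.polyfit(np.log(t), np.log(frac), 1)
k_hat = np.exp(log_k)
# Higuchi is the n = 0.5 special case: release proportional to sqrt(t)
kH, intercept = np.polyfit(np.sqrt(t), frac, 1)
```

The fitted release exponent n is what characterizes the transport mechanism (values near 0.5 indicating diffusion-controlled, Higuchi-type release); in practice the Korsmeyer-Peppas fit is restricted to the early portion of the release curve (Mt/M∞ ≲ 0.6).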

  5. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  6. Toward Open Science at the European Scale: Geospatial Semantic Array Programming for Integrated Environmental Modelling

    NASA Astrophysics Data System (ADS)

    de Rigo, Daniele; Corti, Paolo; Caudullo, Giovanni; McInerney, Daniel; Di Leo, Margherita; San-Miguel-Ayanz, Jesús

    2013-04-01

Interfacing science and policy raises challenging issues when large spatial-scale (regional, continental, global) environmental problems need transdisciplinary integration within a context of modelling complexity and multiple sources of uncertainty [1]. This is characteristic of science-based support for environmental policy at European scale [1], and key aspects have also long been investigated by European Commission transnational research [2-5]. Wide-scale transdisciplinary modelling for environment. Approaches (either of computational science or of policy-making) suitable at a given domain-specific scale may not be appropriate for wide-scale transdisciplinary modelling for environment (WSTMe) and corresponding policy-making [6-10]. In WSTMe, the characteristic heterogeneity of available spatial information and the complexity of the required data-transformation modelling (D-TM) call for a paradigm shift in how computational science supports such peculiarly extensive integration processes. In particular, emerging wide-scale integration requirements of typical currently available domain-specific modelling strategies may include increased robustness and scalability along with enhanced transparency and reproducibility [11-15]. This challenging shift toward open data [16] and reproducible research [11] (open science) is also strongly suggested by the potential - sometimes neglected - huge impact of cascading effects of errors [1,14,17-19] within the impressively growing interconnection among domain-specific computational models and frameworks. From a computational science perspective, transdisciplinary approaches to integrated natural resources modelling and management (INRMM) [20] can exploit advanced geospatial modelling techniques with an awesome battery of free scientific software [21,22] for generating new information and knowledge from the plethora of composite data [23-26].
    From the perspective of the science-policy interface, INRMM should be able to provide citizens and policy-makers with a clear, accurate understanding of the implications of the technical apparatus for collective environmental decision-making [1]. Complexity, of course, should not be taken as an excuse for obscurity [27-29]. Geospatial Semantic Array Programming. Concise array-based mathematical formulation and implementation (with array programming tools, see (b)) have proved helpful in supporting and mitigating the complexity of WSTMe [40-47] when complemented with generalized modularization and terse array-oriented semantic constraints. This defines the paradigm of Semantic Array Programming (SemAP) [35,36], where semantic transparency also implies free software use (although black boxes [12] - e.g. legacy code - might easily be semantically interfaced). A new approach for WSTMe has emerged by formalizing unorganized best practices and experience-driven informal patterns. The approach introduces a lightweight (non-intrusive) integration of SemAP and geospatial tools (c), called Geospatial Semantic Array Programming (GeoSemAP). GeoSemAP (d) exploits the joint semantics provided by SemAP and geospatial tools to split a complex D-TM into logical blocks which are easier to check by means of mathematical array-based and geospatial constraints. Those constraints take the form of precondition, invariant and postcondition semantic checks. This way, even a complex WSTMe may be described as the composition of simpler GeoSemAP blocks, each of them structured as in (d). GeoSemAP allows intermediate data and information layers to be more easily and formally described semantically, so as to increase the fault tolerance [17], transparency and reproducibility of WSTMe. This might also help to better communicate part of the policy-relevant knowledge, which is often difficult to transfer from technical WSTMe to the science-policy interface [1,15].
    References
    de Rigo, D., 2013. 
    Behind the horizon of reproducible integrated environmental modelling at European scale: ethics and practice of scientific knowledge freedom. F1000 Research. To appear as discussion paper.
    Funtowicz, S. O., Ravetz, J. R., 1994. Uncertainty, complexity and post-normal science. Environmental Toxicology and Chemistry 13 (12), 1881-1885. http://dx.doi.org/10.1002/etc.5620131203
    Funtowicz, S. O., Ravetz, J. R., 1994. The worth of a songbird: ecological economics as a post-normal science. Ecological Economics 10 (3), 197-207. http://dx.doi.org/10.1016/0921-8009(94)90108-2
    Funtowicz, S. O., Ravetz, J. R., 2003. Post-normal science. International Society for Ecological Economics, Internet Encyclopaedia of Ecological Economics.
    Ravetz, J., 2004. The post-normal science of precaution. Futures 36 (3), 347-357. http://dx.doi.org/10.1016/S0016-3287(03)00160-5
    van der Sluijs, J. P., 2012. Uncertainty and dissent in climate risk assessment: a Post-Normal perspective. Nature and Culture 7 (2), 174-195. http://dx.doi.org/10.3167/nc.2012.070204
    Ulieru, M., Doursat, R., 2011. Emergent engineering: a radical paradigm shift. International Journal of Autonomous and Adaptive Communications Systems 4 (1), 39-60. http://dx.doi.org/10.1504/IJAACS.2011.037748
    Turner, M. G., Dale, V. H., Gardner, R. H., 1989. Predicting across scales: theory development and testing. Landscape Ecology 3 (3), 245-252. http://dx.doi.org/10.1007/BF00131542
    Zhang, X., Drake, N. A., Wainwright, J., 2004. Scaling issues in environmental modelling. In: Wainwright, J., Mulligan, M. (Eds.), Environmental Modelling: Finding Simplicity in Complexity. Wiley. ISBN: 9780471496182
    Bankes, S. C., 2002. Tools and techniques for developing policies for complex and uncertain systems. Proceedings of the National Academy of Sciences of the United States of America 99 (Suppl 3), 7263-7266. http://dx.doi.org/10.1073/pnas.092081399
    Peng, R. D., 2011. Reproducible research in computational science. Science 334 (6060), 1226-1227. http://dx.doi.org/10.1126/science.1213847
    Morin, A., Urban, J., Adams, P. D., Foster, I., Sali, A., Baker, D., Sliz, P., 2012. Shining light into black boxes. Science 336 (6078), 159-160. http://dx.doi.org/10.1126/science.1218263
    Nature, 2011. Devil in the details. Nature 470 (7334), 305-306. http://dx.doi.org/10.1038/470305b
    Stodden, V., 2012. Reproducible research: tools and strategies for scientific computing. Computing in Science and Engineering 14, 11-12. http://dx.doi.org/10.1109/MCSE.2012.82
    de Rigo, D., Corti, P., Caudullo, G., McInerney, D., Di Leo, M., San-Miguel-Ayanz, J., (exp. 2013). Supporting Environmental Modelling and Science-Policy Interface at European Scale with Geospatial Semantic Array Programming. In prep.
    Molloy, J. C., 2011. The Open Knowledge Foundation: open data means better science. PLoS Biology 9 (12), e1001195+. http://dx.doi.org/10.1371/journal.pbio.1001195
    de Rigo, D., 2013. Software Uncertainty in Integrated Environmental Modelling: the role of Semantics and Open Science. Geophysical Research Abstracts 15, EGU General Assembly 2013.
    Cerf, V. G., 2012. Where is the science in computer science? Communications of the ACM 55 (10), 5. http://dx.doi.org/10.1145/2347736.2347737
    Wilson, G., 2006. Where's the real bottleneck in scientific computing? American Scientist 94 (1), 5+. http://dx.doi.org/10.1511/2006.1.5
    de Rigo, D., 2012. Integrated Natural Resources Modelling and Management: minimal redefinition of a known challenge for environmental modelling. Excerpt from the Call for a shared research agenda toward scientific knowledge freedom, Maieutike Research Initiative. http://www.citeulike.org/groupfunc/15400/home
    Stallman, R. M., 2005. Free community science and the free development of science. PLoS Med 2 (2), e47+. http://dx.doi.org/10.1371/journal.pmed.0020047
    Stallman, R. M., 2009. Viewpoint: why "open source" misses the point of free software. Communications of the ACM 52 (6), 31-33. http://dx.doi.org/10.1145/1516046.1516058 (free access version: http://www.gnu.org/philosophy/open-source-misses-the-point.html )
    Rodriguez Aseretto, D., Di Leo, M., de Rigo, D., Corti, P., McInerney, D., Camia, A., San Miguel-Ayanz, J., 2013. Free and Open Source Software underpinning the European Forest Data Centre. Geophysical Research Abstracts 15, EGU General Assembly 2013.
    Giovando, C., Whitmore, C., Camia, A., San-Miguel-Ayanz, J., 2010. Enhancing the European Forest Fire Information System (EFFIS) with open source software. In: FOSS4G 2010. http://2010.foss4g.org/presentations_show.php?id=3693
    Corti, P., San-Miguel-Ayanz, J., Camia, A., McInerney, D., Boca, R., Di Leo, M., 2012. Fire news management in the context of the European Forest Fire Information System (EFFIS). In: Proceedings of "Quinta conferenza italiana sul software geografico e sui dati geografici liberi" (GFOSS DAY 2012). http://files.figshare.com/229492/Fire_news_management_in_the_context_of_EFFIS.pdf
    McInerney, D., Bastin, L., Diaz, L., Figueiredo, C., Barredo, J. I., San-Miguel-Ayanz, J., 2012. Developing a forest data portal to support multi-scale decision making. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 5 (6), 1-8. http://dx.doi.org/10.1109/JSTARS.2012.2194136
    Stodden, V., 2011. Trust your science? Open your data and code. Amstat News July 2011, 21-22. http://www.stanford.edu/ vcs/papers/TrustYourScience-STODDEN.pdf
    van der Sluijs, J., 2005. Uncertainty as a monster in the science-policy interface: four coping strategies. Water Science & Technology 52 (6), 87-92. http://www.iwaponline.com/wst/05206/wst052060087.htm
    Iverson, K. E., 1980. Notation as a tool of thought. Communications of the ACM 23 (8), 444-465. http://awards.acm.org/images/awards/140/articles/9147499.pdf
    Eaton, J. W., Bateman, D., Hauberg, S., 2008. GNU Octave: a high-level interactive language for numerical computations. Network Theory. ISBN: 9780954612061
    Eaton, J. W., 2012. GNU Octave and reproducible research. Journal of Process Control 22 (8), 1433-1438. http://dx.doi.org/10.1016/j.jprocont.2012.04.006
    R Development Core Team, 2011. The R reference manual. Network Theory Ltd. Vol. 1, ISBN: 978-1-906966-09-6. Vol. 2, ISBN: 978-1-906966-10-2. Vol. 3, ISBN: 978-1-906966-11-9. Vol. 4, ISBN: 978-1-906966-12-6.
    Ramey, C., Fox, B., 2006. Bash Reference Manual: reference documentation for Bash edition 2.5b, for Bash version 2.05b. Network Theory Limited. ISBN: 978-0-9541617-7-4.
    de Rigo, D., 2012. Semantic array programming for environmental modelling: application of the Mastrave library. In: Seppelt, R., Voinov, A. A., Lange, S., Bankamp, D. (Eds.), International Environmental Modelling and Software Society (iEMSs) 2012 International Congress on Environmental Modelling and Software. Managing Resources of a Limited Planet: Pathways and Visions under Uncertainty, Sixth Biennial Meeting. pp. 1167-1176. http://www.iemss.org/iemss2012/proceedings/D3_1_0715_deRigo.pdf
    de Rigo, D., 2012. Semantic Array Programming with Mastrave - Introduction to Semantic Computational Modelling. http://mastrave.org/doc/MTV-1.012-1.htm
    Van Rossum, G., Drake, F. J., 2011. Python Language Reference Manual. Network Theory Ltd. ISBN: 0954161785. http://www.network-theory.co.uk/docs/pylang/
    The SciPy community, 2012. NumPy Reference Guide. SciPy.org. http://docs.scipy.org/doc/numpy/reference/
    The SciPy community, 2012. SciPy Reference Guide. SciPy.org. http://docs.scipy.org/doc/scipy/reference/
    de Rigo, D., Castelletti, A., Rizzoli, A. E., Soncini-Sessa, R., Weber, E., 2005. A selective improvement technique for fastening neuro-dynamic programming in water resources network management. In: Zítek, P. (Ed.), Proceedings of the 16th IFAC World Congress. Vol. 16. International Federation of Automatic Control (IFAC), pp. 7-12. http://dx.doi.org/10.3182/20050703-6-CZ-1902.02172
    de Rigo, D., Bosco, C., 2011. Architecture of a Pan-European Framework for Integrated Soil Water Erosion Assessment. Vol. 359 of IFIP Advances in Information and Communication Technology. Springer, Berlin, Heidelberg, Ch. 34, pp. 310-318. http://dx.doi.org/10.1007/978-3-642-22285-6_34
    San-Miguel-Ayanz, J., Schulte, E., Schmuck, G., Camia, A., Strobl, P., Liberta, G., Giovando, C., Boca, R., Sedano, F., Kempeneers, P., McInerney, D., Withmore, C., de Oliveira, S. S., Rodrigues, M., Durrant, T., Corti, P., Oehler, F., Vilar, L., Amatulli, G., 2012. Comprehensive monitoring of wildfires in Europe: the European Forest Fire Information System (EFFIS). In: Tiefenbacher, J. (Ed.), Approaches to Managing Disaster - Assessing Hazards, Emergencies and Disaster Impacts. InTech, Ch. 5. http://dx.doi.org/10.5772/28441
    de Rigo, D., Caudullo, G., San-Miguel-Ayanz, J., Stancanelli, G., 2012. Mapping European forest tree species distribution to support pest risk assessment. In: Baker, R., Koch, F., Kriticos, D., Rafoss, T., Venette, R., van der Werf, W. (Eds.), Advancing risk assessment models for invasive alien species in the food chain: contending with climate change, economics and uncertainty. Bioforsk FOKUS 7. OECD Co-operative Research Programme on Biological Resource Management for Sustainable Agricultural Systems; Bioforsk - Norwegian Institute for Agricultural and Environmental Research. http://www.pestrisk.org/2012/BioforskFOKUS7-10_IPRMW-VI.pdf
    Estreguil, C., Caudullo, G., de Rigo, D., Whitmore, C., San-Miguel-Ayanz, J., 2012. Reporting on European forest fragmentation: standardized indices and web map services. IEEE Earthzine. http://www.earthzine.org/2012/07/05/reporting-on-european-forest-fragmentation-standardized-indices-and-web-map-services/
    Estreguil, C., de Rigo, D., Caudullo, G., (exp. 2013). Towards an integrated and reproducible characterisation of habitat pattern. Submitted to Environmental Modelling & Software.
    Amatulli, G., Camia, A., San-Miguel-Ayanz, J., 2009. Projecting future burnt area in the EU-Mediterranean countries under IPCC SRES A2/B2 climate change scenarios (JRC55149), 33-38.
    de Rigo, D., Caudullo, G., Amatulli, G., Strobl, P., San-Miguel-Ayanz, J., (exp. 2013). Modelling tree species distribution in Europe with constrained spatial multi-frequency analysis. In prep.
    GRASS Development Team, 2012. Geographic Resources Analysis Support System (GRASS) Software. Open Source Geospatial Foundation. http://grass.osgeo.org http://www.spatial-ecology.net/dokuwiki/doku.php?id=wiki:firemod
    Neteler, M., Bowman, M. H., Landa, M., Metz, M., 2012. GRASS GIS: a multi-purpose open source GIS. Environmental Modelling & Software 31, 124-130. http://dx.doi.org/10.1016/j.envsoft.2011.11.014
    Neteler, M., Mitasova, H., 2008. Open Source GIS: A GRASS GIS Approach. ISBN: 978-0-387-35767-6
    Warmerdam, F., 2008. The Geospatial Data Abstraction Library. In: Hall, G. B., Leahy, M. G. (Eds.), Open Source Approaches in Spatial Data Handling. Vol. 2 of Advances in Geographic Information Science. Springer Berlin Heidelberg, pp. 87-104. http://dx.doi.org/10.1007/978-3-540-74831-15
    Open Geospatial Consortium, 2007. OpenGIS Web Processing Service version 1.0.0. No. OGC 05-007r7 in OpenGIS Standard. Open Geospatial Consortium (OGC). http://portal.opengeospatial.org/files/?artifact_id=24151
    Hazzard, E., 2011. OpenLayers 2.10 Beginner's Guide. Packt Publishing. ISBN: 1849514127
    Obe, R., Hsu, L., 2011. PostGIS in Action. Manning Publications. http://dl.acm.org/citation.cfm?id=2018871
    Sutton, T., 2009. Clipping data from PostGIS. linfiniti.com Open Source Geospatial Solutions. http://linfiniti.com/2009/09/clipping-data-from-postgis/
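
    The precondition/invariant/postcondition idea behind GeoSemAP can be made concrete with a small sketch. This is not code from the record: the `normalize_layer` transform and its checks are hypothetical, shown only to illustrate array-based semantic constraints guarding one D-TM block (Python with NumPy assumed).

```python
import numpy as np

def check(condition, message):
    """Semantic check helper: raise with a clear message if violated."""
    if not condition:
        raise ValueError(message)

def normalize_layer(layer):
    """Rescale a raster layer to [0, 1].

    Each block validates its own inputs and outputs instead of
    trusting upstream modules, in the spirit of Semantic Array
    Programming.
    """
    layer = np.asarray(layer, dtype=float)
    # Preconditions: 2-D grid, finite values, non-degenerate range.
    check(layer.ndim == 2, "expected a 2-D raster layer")
    check(np.isfinite(layer).all(), "layer contains NaN or infinite cells")
    lo, hi = layer.min(), layer.max()
    check(hi > lo, "layer is constant; normalization undefined")
    out = (layer - lo) / (hi - lo)
    # Postconditions: result stays on the same grid, within [0, 1].
    check(out.shape == layer.shape, "shape changed unexpectedly")
    check(out.min() >= 0.0 and out.max() <= 1.0, "result out of [0, 1]")
    return out

elevation = np.array([[10.0, 20.0], [30.0, 50.0]])
print(normalize_layer(elevation))
```

    A violated precondition stops the composition early with a meaningful diagnostic, which is the fault-tolerance benefit the abstract describes for cascades of interconnected models.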

  7. The Influence of the Sport Education Model on Amotivated Students' In-Class Physical Activity

    ERIC Educational Resources Information Center

    Perlman, Dana

    2012-01-01

    The Sport Education Model (SEM) was designed by Siedentop to provide students with a holistic sport-based experience. As research on the SEM continues, an aspect that has gained interest is the influence on (a) students with low levels of motivation and (b) opportunities to engage in health-enhancing levels of physical activity. The purpose of…

  8. Case Studies of Successful Schoolwide Enrichment Model-Reading (SEM-R) Classroom Implementations. Research Monograph Series. RM10204

    ERIC Educational Resources Information Center

    Reis, Sally M.; Little, Catherine A.; Fogarty, Elizabeth; Housand, Angela M.; Housand, Brian C.; Sweeny, Sheelah M.; Eckert, Rebecca D.; Muller, Lisa M.

    2010-01-01

    The purpose of this qualitative study was to examine the scaling up of the Schoolwide Enrichment Model in Reading (SEM-R) in 11 elementary and middle schools in geographically diverse sites across the country. Qualitative comparative analysis was used in this study, with multiple data sources compiled into 11 in-depth school case studies…

  9. Classroom versus Societal Willingness to Communicate: Investigating French as a Second Language in Flanders

    ERIC Educational Resources Information Center

    Denies, Katrijn; Yashima, Tomoko; Janssen, Rianne

    2015-01-01

    This study investigates willingness to communicate (WTC) and its determinants through structural equation modelling (SEM). Building on models by MacIntyre and Charos (1996) and Yashima (2002), it addresses 3 apparent gaps in the current knowledge base: It is the first SEM-based WTC study in a Western European context, investigating French as a…

  10. A Simulation of the Topographic Contrast in the SEM

    NASA Astrophysics Data System (ADS)

    Kotera, Masatoshi; Fujiwara, Takafumi; Suga, Hiroshi; Wittry, David B.

    1990-10-01

    A simulation model is presented to analyze the topographic contrast in the scanning electron microscope (SEM). This simulation takes into account all major mechanisms from signal generation to signal detection in the SEM. The calculated result shows that the resolution of the secondary electron image is better than that of the backscattered electron image for 1 and 3 keV primary electrons incident on an Al target. An asymmetric intensity profile of a signal at a topographic pattern, usually found in SEMs equipped with the Everhart-Thornley detector, is mainly due to the asymmetric profile of the backscattered electron signal.
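
    A large part of SEM topographic contrast is commonly explained by the increase of secondary-electron yield on tilted surfaces. The toy calculation below uses only the textbook 1/cos(θ) approximation, not the full signal-generation and detection simulation of the record; `delta0` is an arbitrary normalization.

```python
import math

def secondary_yield(delta0, theta_deg):
    """Secondary-electron yield under the common 1/cos(theta)
    approximation: on a tilted facet the primary beam deposits its
    energy closer to the surface, so more secondaries escape."""
    theta = math.radians(theta_deg)
    return delta0 / math.cos(theta)

# Relative brightness of tilted facets versus a flat surface.
for tilt in (0, 30, 60):
    print(tilt, round(secondary_yield(1.0, tilt), 3))
```

    The monotone rise with tilt is what makes edges and slopes appear bright in secondary-electron images.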

  11. Remote Internet access to advanced analytical facilities: a new approach with Web-based services.

    PubMed

    Sherry, N; Qin, J; Fuller, M Suominen; Xie, Y; Mola, O; Bauer, M; McIntyre, N S; Maxwell, D; Liu, D; Matias, E; Armstrong, C

    2012-09-04

    Over the past decade, the increasing availability of the World Wide Web has held out the possibility that the efficiency of scientific measurements could be enhanced in cases where experiments were being conducted at distant facilities. Examples of early successes have included X-ray diffraction (XRD) experimental measurements of protein crystal structures at synchrotrons and access to scanning electron microscopy (SEM) and NMR facilities by users from institutions that do not possess such advanced capabilities. Experimental control, visual contact, and receipt of results have relied on some form of X forwarding and/or VNC (virtual network computing) software that transfers the screen image of a server at the experimental site to that of the users' home site. A more recent development is a web services platform called Science Studio that provides teams of scientists with secure links to experiments at one or more advanced research facilities. The software provides a widely distributed team with a set of controls and screens to operate, observe, and record essential parts of the experiment. As well, Science Studio provides high speed network access to computing resources to process the large data sets that are often involved in complex experiments. The simple web browser and the rapid transfer of experimental data to a processing site allow efficient use of the facility and assist decision making during the acquisition of the experimental results. The software provides users with a comprehensive overview and record of all parts of the experimental process. A prototype network is described involving X-ray beamlines at two different synchrotrons and an SEM facility. An online parallel processing facility has been developed that analyzes the data in near-real time using stream processing. 
Science Studio can be expanded to include many other analytical applications, providing teams of users with rapid access to processed results along with the means for detailed discussion of their significance.

  12. Model Driven Engineering with Ontology Technologies

    NASA Astrophysics Data System (ADS)

    Staab, Steffen; Walter, Tobias; Gröner, Gerd; Parreiras, Fernando Silva

    Ontologies constitute formal models of some aspect of the world that may be used for drawing interesting logical conclusions even for large models. Software models capture relevant characteristics of a software artifact to be developed, yet, most often these software models have limited formal semantics, or the underlying (often graphical) software language varies from case to case in a way that makes it hard if not impossible to fix its semantics. In this contribution, we survey the use of ontology technologies for software modeling in order to carry over advantages from ontology technologies to the software modeling domain. It will turn out that ontology-based metamodels constitute a core means for exploiting expressive ontology reasoning in the software modeling domain while remaining flexible enough to accommodate varying needs of software modelers.

  13. SOFC Microstructures (PFIB-SEM and synthetic) from JPS 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, Tim; Epting, William K; Mahbub, Rubayyat

    This is the microstructural data used in the publication "Mesoscale characterization of local property distributions in heterogeneous electrodes" by Tim Hsu, William K. Epting, Rubayyat Mahbub, et al., published in the Journal of Power Sources in 2018 (DOI 10.1016/j.jpowsour.2018.03.025). Included are a commercial cathode and anode active layer (Materials and Systems Research, Inc., Salt Lake City, UT) imaged by Xe plasma FIB-SEM (FEI, Hillsboro, OR), and four synthetic microstructures of varying particle size distribution widths generated by DREAM3D (BlueQuartz Software, Springboro, OH). For the MSRI electrodes, both the original greyscale and the segmented versions are provided. Each .zip file contains a "stack" of .tif image files in the Z dimension, and an .info ascii text file containing useful information like voxel sizes and phase IDs. More details can be found in the pertinent publication at http://dx.doi.org/10.1016/j.jpowsour.2018.03.025.

  14. The factors that influence job satisfaction among royal Malaysian customs department employee

    NASA Astrophysics Data System (ADS)

    Ammar Shafi, Muhammad; Saifullah Rusiman, Mohd; Nor, Maria Elena; Khamis, Azme; Nabilah Syuhada Abdullah, Siti; Syafiq Azmi, Mohd; Sakinah Zainal Abidin, Munirah; Ali, Maselan

    2018-04-01

    This research aims to identify the factors that influence job satisfaction among Royal Malaysian Customs Department employees. Primary data were used in this research, collected from employees working in five different departments at the Royal Malaysian Customs Department Tower Johor: Customs, Internal Taxes, Technical Services, Management and Prevention. The research used stratified random sampling to collect the sample and Structural Equation Modelling (SEM), implemented in the AMOS software, to measure the relationships between variables. A total of 127 employees from the five departments were selected as respondents. The results showed that ‘Organizational Commitment’ (p-value = 0.001) had a significant and direct effect on job satisfaction, whereas ‘Stress Condition’ (p-value = 0.819) and ‘Motivation’ (p-value = 0.978) did not. It was concluded that ‘Organizational Commitment’ was the most influential factor in job satisfaction among Royal Malaysian Customs Department employees at Tower Custom Johor, Johor Bahru.
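
    The kind of conclusion reported here (one significant path, two non-significant ones) can be mimicked on synthetic data. The sketch below is not the study's AMOS model: it fits a plain least-squares path from three simulated factors to a satisfaction score and derives normal-approximation p-values, purely to illustrate how such p-values arise (NumPy assumed; all data fabricated).

```python
import math
import numpy as np

rng = np.random.default_rng(42)
n = 127  # sample size matching the study

# Hypothetical data: only commitment truly drives satisfaction.
commitment = rng.normal(size=n)
stress = rng.normal(size=n)
motivation = rng.normal(size=n)
satisfaction = 0.8 * commitment + rng.normal(scale=0.5, size=n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), commitment, stress, motivation])
beta, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
resid = satisfaction - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
t = beta / se
# Two-sided p-values via the normal approximation (fine for n = 127).
p = [2 * (1 - 0.5 * (1 + math.erf(abs(ti) / math.sqrt(2)))) for ti in t]

names = ["intercept", "commitment", "stress", "motivation"]
for name, b, pv in zip(names, beta, p):
    print(f"{name:11s} beta={b:+.3f} p={pv:.3f}")
```

    A full SEM additionally models latent constructs and measurement error, but the significance logic for each structural path is the same.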

  15. Change in Affect and Needs Satisfaction for Amotivated Students within the Sport Education Model

    ERIC Educational Resources Information Center

    Perlman, Dana

    2010-01-01

    The purpose of this study is to examine the influence of the Sport Education Model ("SEM") on amotivated students' affect and needs satisfaction. 78 amotivated students were identified from an original pool of 1,176 students enrolled in one of 32 physical education classes. Classes were randomly assigned to either the "SEM" (N = 16) or a traditional class (N = 16).…

  16. Situational Effects May Account for Gain Scores in Cognitive Ability Testing: A Longitudinal SEM Approach

    ERIC Educational Resources Information Center

    Matton, Nadine; Vautier, Stephane; Raufaste, Eric

    2009-01-01

    Mean gain scores for cognitive ability tests between two sessions in a selection setting are now a robust finding, yet not fully understood. Many authors do not attribute such gain scores to an increase in the target abilities. Our approach consists of testing a longitudinal SEM model suitable to this view. We propose to model the scores' changes…

  17. Review of Sample Size for Structural Equation Models in Second Language Testing and Learning Research: A Monte Carlo Approach

    ERIC Educational Resources Information Center

    In'nami, Yo; Koizumi, Rie

    2013-01-01

    The importance of sample size, although widely discussed in the literature on structural equation modeling (SEM), has not been widely recognized among applied SEM researchers. To narrow this gap, we focus on second language testing and learning studies and examine the following: (a) Is the sample size sufficient in terms of precision and power of…
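
    A Monte Carlo check of sample-size adequacy, of the kind this article advocates, can be sketched in miniature. Instead of a full SEM, this hypothetical example estimates the power to detect a single correlation of 0.3 with the Fisher z-test; the same simulate-fit-count loop generalizes to SEM parameters.

```python
import math
import numpy as np

def power_for_r(n, rho, reps=2000, seed=1):
    """Monte Carlo power: simulate `reps` datasets of size `n` with true
    correlation `rho`, test each with the Fisher z-test at alpha = .05,
    and return the fraction of significant results."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = rng.normal(size=n)
        y = rho * x + math.sqrt(1 - rho ** 2) * rng.normal(size=n)
        r = np.corrcoef(x, y)[0, 1]
        z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n - 3)
        hits += abs(z) > 1.96  # two-sided 5% normal critical value
    return hits / reps

for n in (50, 100, 200):
    print(n, power_for_r(n, rho=0.3))
```

    Running the loop over candidate sample sizes shows directly where power crosses a target such as 0.80, which is the practical question behind "is the sample size sufficient".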

  18. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  19. Representing general theoretical concepts in structural equation models: The role of composite variables

    USGS Publications Warehouse

    Grace, J.B.; Bollen, K.A.

    2008-01-01

    Structural equation modeling (SEM) holds the promise of providing natural scientists the capacity to evaluate complex multivariate hypotheses about ecological systems. Building on its predecessors, path analysis and factor analysis, SEM allows for the incorporation of both observed and unobserved (latent) variables into theoretically-based probabilistic models. In this paper we discuss the interface between theory and data in SEM and the use of an additional variable type, the composite. In simple terms, composite variables specify the influences of collections of other variables and can be helpful in modeling heterogeneous concepts of the sort commonly of interest to ecologists. While long recognized as a potentially important element of SEM, composite variables have received very limited use, in part because of a lack of theoretical consideration, but also because of difficulties that arise in parameter estimation when using conventional solution procedures. In this paper we present a framework for discussing composites and demonstrate how the use of partially-reduced-form models can help to overcome some of the parameter estimation and evaluation problems associated with models containing composites. Diagnostic procedures for evaluating the most appropriate and effective use of composites are illustrated with an example from the ecological literature. It is argued that an ability to incorporate composite variables into structural equation models may be particularly valuable in the study of natural systems, where concepts are frequently multifaceted and the influence of suites of variables are often of interest. © Springer Science+Business Media, LLC 2007.

  20. The Interface Between Theory and Data in Structural Equation Models

    USGS Publications Warehouse

    Grace, James B.; Bollen, Kenneth A.

    2006-01-01

    Structural equation modeling (SEM) holds the promise of providing natural scientists the capacity to evaluate complex multivariate hypotheses about ecological systems. Building on its predecessors, path analysis and factor analysis, SEM allows for the incorporation of both observed and unobserved (latent) variables into theoretically based probabilistic models. In this paper we discuss the interface between theory and data in SEM and the use of an additional variable type, the composite, for representing general concepts. In simple terms, composite variables specify the influences of collections of other variables and can be helpful in modeling general relationships of the sort commonly of interest to ecologists. While long recognized as a potentially important element of SEM, composite variables have received very limited use, in part because of a lack of theoretical consideration, but also because of difficulties that arise in parameter estimation when using conventional solution procedures. In this paper we present a framework for discussing composites and demonstrate how the use of partially reduced form models can help to overcome some of the parameter estimation and evaluation problems associated with models containing composites. Diagnostic procedures for evaluating the most appropriate and effective use of composites are illustrated with an example from the ecological literature. It is argued that an ability to incorporate composite variables into structural equation models may be particularly valuable in the study of natural systems, where concepts are frequently multifaceted and the influences of suites of variables are often of interest.
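
    In code, a composite variable is simply a single derived score standing in for the joint influence of several indicators. The fragment below is an illustrative stand-in, not the authors' partially reduced form procedure: hypothetical "fire" and "grazing" indicators are collapsed into one "disturbance" composite whose weights are estimated from the data (NumPy assumed; data simulated).

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300

# Hypothetical indicators jointly forming a "disturbance" concept,
# influencing a simulated species-richness response.
fire = rng.normal(size=n)
grazing = rng.normal(size=n)
richness = -0.6 * fire - 0.3 * grazing + rng.normal(scale=0.5, size=n)

# Estimate composite weights from a regression with intercept, then
# collapse the two indicators into a single composite score.
X = np.column_stack([np.ones(n), fire, grazing])
coef, *_ = np.linalg.lstsq(X, richness, rcond=None)
weights = coef[1:]
composite = np.column_stack([fire, grazing]) @ weights

# One composite now carries (nearly) the full explanatory power of
# the two separate indicators.
r = np.corrcoef(composite, richness)[0, 1]
print("weights:", np.round(weights, 2), "r =", round(r, 3))
```

    This mirrors the paper's point that a composite's weights are estimated rather than assumed, which is also what makes identification delicate in conventional SEM estimation.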

  1. A New Look at Genetic and Environmental Architecture on Lipids Using Non-Normal Structural Equation Modeling in Male Twins: The NHLBI Twin Study.

    PubMed

    Wu, Sheng-Hui; Ozaki, Koken; Reed, Terry; Krasnow, Ruth E; Dai, Jun

    2017-07-01

    This study examined genetic and environmental influences on the lipid concentrations of 1028 male twins using the novel univariate non-normal structural equation modeling (nnSEM) ADCE and ACE models. In the best fitting nnSEM ADCE model that was also better than the nnSEM ACE model, additive genetic factors (A) explained 4%, dominant genetic factors (D) explained 17%, and common (C) and unique (E) environmental factors explained 47% and 33% of the total variance of high-density lipoprotein cholesterol (HDL-C). The percentage of variation explained for other lipids was 0% (A), 30% (D), 34% (C) and 37% (E) for low-density lipoprotein cholesterol (LDL-C); 30, 0, 31 and 39% for total cholesterol; and 0, 31, 12 and 57% for triglycerides. It was concluded that additive and dominant genetic factors simultaneously affected HDL-C concentrations but not other lipids. Common and unique environmental factors influenced concentrations of all lipids.
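
    For contrast with the SEM fits reported above, the classical Falconer moment estimates for a plain ACE model take only a few lines. This is just the textbook approximation (and it cannot separate D from C, which is precisely why fitting ADCE models via nnSEM is of interest); the twin correlations below are made up for illustration.

```python
def falconer_ace(r_mz, r_dz):
    """Classical Falconer estimates from twin correlations:
    A = 2 * (rMZ - rDZ), C = 2 * rDZ - rMZ, E = 1 - rMZ."""
    a = 2 * (r_mz - r_dz)
    c = 2 * r_dz - r_mz
    e = 1 - r_mz
    return a, c, e

# Hypothetical MZ and DZ twin correlations for a lipid-like trait.
print(tuple(round(v, 2) for v in falconer_ace(0.67, 0.50)))  # -> (0.34, 0.33, 0.33)
```

    The three estimates sum to 1 by construction, mirroring the variance decomposition percentages quoted in the abstract.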

  2. Analysis on the precision of the dimensions of self-ligating brackets.

    PubMed

    Erduran, Rackel Hatice Milhomens Gualberto; Maeda, Fernando Akio; Ortiz, Sandra Regina Mota; Triviño, Tarcila; Fuziy, Acácio; Carvalho, Paulo Eduardo Guedes

    2016-12-01

    The present study aimed to evaluate the precision of the torque applied by 0.022" self-ligating brackets of different brands, the precision of parallelism between the inner walls of their slots, and precision of their slot height. Eighty brackets for upper central incisors of eight trademarked models were selected: Abzil, GAC, American Orthodontics, Morelli, Orthometric, Ormco, Forestadent, and Ortho Organizers. Images of the brackets were obtained using a scanning electron microscope (SEM) and these were measured using the AutoCAD 2011 software. The tolerance parameters stated in the ISO 27020 standard were used as references. The results showed that only the Orthometric, Morelli, and Ormco groups showed results inconsistent with the ISO standard. Regarding the parallelism of the internal walls of the slots, most of the models studied had results in line with the ISO prescription, except the Morelli group. In assessing bracket slot height, only the Forestadent, GAC, American Orthodontics, and Ormco groups presented results in accordance with the ISO standard. The GAC, Forestadent, and American Orthodontics groups did not differ in relation to the three factors of the ISO 27020 standard. Great variability of results is observed in relation to all the variables. © 2016 Wiley Periodicals, Inc.

  3. Consumer's Buying Decision-Making Process in E-Commerce

    NASA Astrophysics Data System (ADS)

    Puspitasari, Nia Budi; Susatyo, Nugroho W. P.; Amyhorsea, Deya Nilan; Susanty, Aries

    2018-02-01

    The e-commerce growth and development in Indonesia is very rapid as well as the internet grows, but it is not well-balanced with the number of online buying transaction which is still relatively low. Even the today's biggest B2C e-commerce people in Indonesia, Lazada, has continually decreased online purchasing. This research is aimed to describe factors affecting online buying decision- making in the e-commerce Lazada. The type of this research is confirmatory research. The variable used is following conceptual model i.e. Electronic Word of Mouth (EWOM), social identity, risk perception, trust, and purchase intention. The data were obtained through the questionnaire with Likert scale 1-5. There are 104 people researching sample who meets the criteria as Lazada consumer that, at least do a transaction in recent six months. Data analyzing were done using Structural Equation Modelling (SEM) method by Analysis of Moment Structures (AMOS) software. The results showed that the purchase intention has positively related to the purchase decision. Variable EWOM toward trust has positive relation, variable social identity and risk perception have no any significant relation to trust. Variable risk perception toward purchase intention has no significant relation, while the variable trust has significant positive relation to purchase intention.

  4. DEVELOPMENT OF A PORTABLE SOFTWARE LANGUAGE FOR PHYSIOLOGICALLY-BASED PHARMACOKINETIC (PBPK) MODELS

    EPA Science Inventory

    The PBPK modeling community has had a long-standing problem with modeling software compatibility. The numerous software packages used for PBPK models are, at best, minimally compatible. This creates problems ranging from model obsolescence due to software support discontinuation...

  5. Cost Estimation of Software Development and the Implications for the Program Manager

    DTIC Science & Technology

    1992-06-01

    Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive...function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and...Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome

  6. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  8. Relationship between Family-Work and Work-Family Conflict with Organizational Commitment and Desertion Intention among Nurses and Paramedical Staff at Hospitals

    PubMed Central

    Hatam, Nahid; Jalali, Marzie Tajik; Askarian, Mehrdad; Kharazmi, Erfan

    2016-01-01

    Background: A high turnover intention rate is one of the most common problems in healthcare organizations throughout the world. Several factors can potentially affect individuals’ turnover intention, including work-family conflict, family-work conflict, and organizational commitment. The aim of this research was to determine the relationship between family-work and work-family conflicts, organizational commitment, and turnover intention among nurses and paramedical staff at hospitals affiliated to Shiraz University of Medical Sciences (SUMS), and to present a model using SEM. Methods: This is a questionnaire-based cross-sectional study among 400 nurses and paramedical staff of hospitals affiliated to SUMS, using a random-proportional (quota) sampling method. Data collection was performed using four standard questionnaires. SPSS software was used for data analysis and SmartPLS software for modeling variables. Results: Mean scores of work-family conflict and desertion intention were 2.6 and 2.77, respectively. There was a significant relationship between gender and family-work conflict (P=0.02). Family-work conflict was significantly higher in married participants (P=0.001). Based on the findings of this study, there was a significant positive relationship between work-family and family-work conflict (P=0.001). Also, work-family conflict had a significant inverse relationship with organizational commitment (P=0.001). An inverse relationship was seen between organizational commitment and turnover intentions (P=0.001). Conclusion: Given the prominent and preventative role of organizational commitment in employees’ desertion intentions, policies to increase employees’ organizational commitment must be considered by health system managers more than ever in order to prevent the negative effects of staff desertion in the health sector. PMID:27218108

  10. Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Sindern, Sven; Meyer, F. Michael

    2016-09-01

    Increasing industrial demand of rare earth elements (REEs) stems from the central role they play for advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical as well as textural complexity of the ores with a need for better understanding of their salient properties. This is not only essential for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy and cost-efficient processing of REE ores depends heavily on information about REE element deportment that can be made available employing automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for acquisition of backscatter electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four different SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, which are commercially available, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown. 
SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become increasingly important for supply of REEs in the future.

  11. Evaluation of Human Corneal Lenticule Quality After SMILE With Different Cap Thicknesses Using Scanning Electron Microscopy.

    PubMed

    Weng, Shengbei; Liu, Manli; Yang, Xiaonan; Liu, Fang; Zhou, Yugui; Lin, Haiqin; Liu, Quan

    2018-01-01

    To evaluate the surface characteristics of lenticules created by small-incision lenticule extraction (SMILE) with different cap thicknesses. This prospective study included 20 consecutive patients who underwent bilateral SMILE. Surface regularity of the extracted corneal lenticule was analyzed using scanning electron microscopy (SEM) combined with 2 methods: qualitative and quantitative regularity. Qualitative regularity of SEM images was graded by masked observers using an established scoring system. Quantitative regularity of SEM images was assessed by counting the total number and areas of tissue bridges using Image-Pro Plus software. Four different cap thicknesses of 120, 130, 140, and 150 μm were compared. Refractive outcomes of patients were measured at baseline and 1 month after surgery. As 10 specimens were not analyzable, only 30 eyes were included. Postoperatively, all eyes had uncorrected distance visual acuity of 20/20 or better; 43% had an unchanged corrected distance visual acuity; 43% gained 1 line; 10% lost 1 line. Ultrastructurally, surface irregularity was primarily caused by tissue bridges. The average surface regularity score was 10.87 ± 2.40 for 120 μm, 10.78 ± 2.60 for 130 μm, 8.76 ± 2.16 for 140 μm, and 8.70 ± 2.66 for 150 μm (P < 0.001). The total number and areas of tissue bridges for 120 to 130 μm were significantly less than for 140 to 150 μm (P < 0.05). Surface regularity decreased as cap thickness increased (P < 0.05). Both qualitatively and quantitatively, SEM showed a smoother lenticular surface when a thin cap was created than when a thick cap was created.
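    Counting tissue bridges in a thresholded SEM image is essentially a connected-component count. The study used Image-Pro Plus; the idea can be sketched with a generic 4-connected labeling routine on a toy binary image.

```python
import numpy as np
from collections import deque

def count_regions(binary):
    """Count 4-connected foreground regions and return (count, areas).
    `binary` is a 2-D boolean array, e.g. a thresholded SEM image."""
    visited = np.zeros_like(binary, dtype=bool)
    areas = []
    rows, cols = binary.shape
    for r in range(rows):
        for c in range(cols):
            if binary[r, c] and not visited[r, c]:
                area, q = 0, deque([(r, c)])
                visited[r, c] = True
                while q:                      # breadth-first flood fill
                    y, x = q.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and binary[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            q.append((ny, nx))
                areas.append(area)
    return len(areas), areas

# Toy image with two separate "tissue bridges"
img = np.array([[1, 1, 0, 0],
                [0, 0, 0, 1],
                [0, 0, 0, 1]], dtype=bool)
n, areas = count_regions(img)
print(n, sorted(areas))   # 2 regions, areas [2, 2]
```

    The returned region count and per-region areas correspond to the "total number and areas of tissue bridges" compared across cap thicknesses.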

  12. Roughness-controlled self-assembly of mannitol/LB agar microparticles by polymorphic transformation for pulmonary drug delivery.

    PubMed

    Zhang, Fengying; Ngoc, Nguyen Thi Quynh; Tay, Bao Hui; Mendyk, Aleksander; Shao, Yu-Hsuan; Lau, Raymond

    2015-01-05

    Novel roughness-controlled mannitol/LB agar microparticles were synthesized by a polymorphic transformation and self-assembly method, using hexane as the polymorphic transformation reagent and spray-dried mannitol/LB agar microparticles as the starting material. As-prepared microparticles were characterized by Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), differential scanning calorimetry (DSC), scanning electron microscopy (SEM), thermogravimetric analysis (TGA), and an Andersen Cascade Impactor (ACI). The XRD and DSC results indicate that after immersing spray-dried mannitol/LB agar microparticles in hexane, β-mannitol was completely transformed to α-mannitol in 1 h, and all the δ-mannitol was transformed to the α form after 14 days. SEM shows that during the transformation the nanobelts on the spray-dried mannitol/LB agar microparticles become more dispersed and the contour of the individual nanobelts becomes more noticeable. Afterward, the nanobelts self-assemble into nanorods, resulting in rod-covered mannitol/LB agar microparticles. FTIR indicates that new hydrogen bonds were formed among mannitol, LB agar, and hexane. SEM images coupled with image analysis software reveal that microparticles with different surface morphologies exhibit different drug adhesion mechanisms. Comparison of the ACI results with image analysis of the SEM images shows that an increase in particle surface roughness can increase the fine particle fractions (FPFs) when the rod-covered mannitol microparticles are used as drug carriers. Transformed microparticles show higher FPFs than commercially available lactose carriers. An FPF of 28.6 ± 2.4% was achieved by microparticles transformed from spray-dried microparticles using 2% mannitol (w/v)/LB agar as the feed solution. This is comparable to the highest FPF reported in the literature using lactose and spray-dried mannitol as carriers.

  13. Posttraumatic stress symptoms and the diathesis-stress model of chronic pain and disability in patients undergoing major surgery.

    PubMed

    Martin, Andrea L; Halket, Eileen; Asmundson, Gordon J G; Flora, David B; Katz, Joel

    2010-01-01

    To (1) use structural equation modeling (SEM) to examine relationships proposed in Turk's diathesis-stress model of chronic pain and disability as well as (2) investigate what role, if any, posttraumatic stress symptoms (PTSS) play in predicting pain disability, relative to some of the other factors in the model. The study sample consisted of 208 patients scheduled for general surgery, 21 to 60 years of age (mean age=47.18 y, SD=9.72 y), who reported experiencing persistent pain for an average of 5.56 years (SD=7.90 y). At their preadmission hospital visit, patients completed the Anxiety Sensitivity Index, Pain Catastrophizing Scale, Pain Anxiety Symptoms Scale-20, Pain Disability Index, posttraumatic stress disorder Checklist, and rated the average intensity of their pain (0 to 10 numeric rating scale). SEM was used to test a model of chronic pain disability and to explore potential relationships between PTSS and factors in the diathesis-stress model. SEM results provided support for a model in which anxiety sensitivity predicted fear of pain and catastrophizing, fear of pain predicted escape/avoidance, and escape/avoidance predicted pain disability. Results also provided support for a feedback loop between disability and fear of pain. SEM analyses provided preliminary support for the inclusion of PTSS in the diathesis-stress model, with PTSS accounting for a significant proportion of the variance in pain disability. Results provide empirical support for aspects of Turk's diathesis-stress model in a sample of patients with persistent pain. Findings also offer preliminary support for the role of PTSS in fear-avoidance models of chronic pain.

  14. Structural equation modeling of the relationships between pesticide poisoning, depressive symptoms and safety behaviors among Colorado farm residents.

    PubMed

    Beseler, Cheryl Lynn; Stallones, Lorann

    2006-01-01

    To use structural equation modeling (SEM) to test the theory that a past pesticide poisoning may act as a mediator in the relationship between depression and safety practices. Depression has been associated with pesticide poisoning and was more strongly associated with safety behaviors than workload, social support or health status of farm residents in a previously published report. A cross-sectional survey of farmers and their spouses was conducted in eight counties in northeastern Colorado. Depressive symptoms were assessed using the Center for Epidemiologic Studies-Depression (CES-D) scale. Exploratory and confirmatory factor analyses were used to identify symptoms most correlated with risk factors for depression and safety practices. SEM was used to examine theoretical causal models of the relationship between depression and poor health, financial difficulties, a history of pesticide poisoning, and safety practices. Exploratory factor analysis identified three factors in the CES-D scale. The SEM showed that poor health, financial difficulties and a history of pesticide poisoning significantly explained the depressive symptoms. Models with an excellent fit for the safety behaviors resulted when modeling the probability that the pesticide poisoning preceded depression, but no fit was possible when reversing the direction and modeling depression preceding pesticide poisoning. Specific depressive symptoms appeared to be significantly associated with primarily animal handling and farm machinery. The order of events, based on SEM results, was a pesticide poisoning preceding depressed mood in relation to safety behaviors.

  15. Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model

    NASA Astrophysics Data System (ADS)

    Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna

    2017-06-01

    Evaluation of software quality is an important aspect of controlling and managing software. Through such evaluation, improvements in the software process can be made. Software quality depends significantly on software usability. Many researchers have proposed usability models. Each model considers a set of usability factors but does not cover all usability aspects. Practical implementation of these models is still missing, as there is no precise definition of usability. It is also very difficult to integrate these models into current software engineering practices. To overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, bringing together factors, attributes, and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system is named the fuzzy hierarchical usability model and can be easily integrated into current software engineering practices. To validate the work, a dataset of six software development life cycle models is created and employed. These models are ranked according to their predicted usability values. This research also provides a detailed comparison of the proposed model with existing usability models.
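    As an illustration of the fuzzy side of such a model (not the paper's actual taxonomy or rules), a crisp factor score can be fuzzified with triangular membership functions and the hierarchy aggregated by a weighted, defuzzified average. All factors, weights, and membership parameters below are invented.

```python
# Illustrative sketch of fuzzy scoring over a usability factor hierarchy.
# The membership functions, linguistic terms, and weights are invented;
# the paper's fuzzy hierarchical model defines its own taxonomy and rules.

def tri(x, a, b, c):
    """Triangular membership function peaking at b on the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(score):
    """Map a crisp 0-10 factor score to fuzzy memberships."""
    return {"low": tri(score, -1, 0, 5),
            "medium": tri(score, 2, 5, 8),
            "high": tri(score, 5, 10, 11)}

def usability(factors):
    """Weighted-average defuzzified usability on a 0-10 scale.
    `factors` is a list of (crisp score, weight) pairs."""
    centers = {"low": 0.0, "medium": 5.0, "high": 10.0}
    total = 0.0
    for score, weight in factors:
        m = fuzzify(score)
        s = sum(m.values())
        total += weight * sum(centers[k] * v for k, v in m.items()) / s
    return total / sum(w for _, w in factors)

# e.g. learnability=8, efficiency=6, satisfaction=9 with unequal weights
print(round(usability([(8, 0.4), (6, 0.3), (9, 0.3)]), 2))  # ≈ 8.85
```

    Ranking SDLC models then amounts to computing this aggregate for each model's factor scores and sorting the results.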

  16. Observation of Live Ticks (Haemaphysalis flava) by Scanning Electron Microscopy under High Vacuum Pressure

    PubMed Central

    Ishigaki, Yasuhito; Nakamura, Yuka; Oikawa, Yosaburo; Yano, Yasuhiro; Kuwabata, Susumu; Nakagawa, Hideaki; Tomosugi, Naohisa; Takegami, Tsutomu

    2012-01-01

    Scanning electron microscopes (SEM), which image sample surfaces by scanning with an electron beam, are widely used for steric observations of resting samples in basic and applied biology. Various conventional methods exist for SEM sample preparation. However, conventional SEM is not a good tool for observing living organisms because of the associated exposure to high vacuum pressure and electron beam radiation. Here we attempted SEM observations of live ticks. Under 1.5×10−3 Pa vacuum pressure and electron beam irradiation at accelerating voltages of 2–5 kV, many ticks remained alive and moved their legs. After 30 min of observation, we removed the ticks from the SEM stage; they could walk actively under atmospheric pressure. When we tested 20 ticks (8 female adults and 12 nymphs), they survived for two days after SEM observation. These results indicate the resistance of ticks to SEM observation. Our second survival test showed that the electron beam, not the vacuum conditions, causes tick death. Moreover, we describe the reaction of their legs to electron beam exposure. These findings open the new possibility of SEM observation of living organisms and show the resistance of living ticks to the vacuum conditions in the SEM. These data also indicate, for the first time, the usefulness of ticks as a model system for biology under extreme conditions. PMID:22431980

  17. Guidelines for a graph-theoretic implementation of structural equation modeling

    USGS Publications Warehouse

    Grace, James B.; Schoolmaster, Donald R.; Guntenspergen, Glenn R.; Little, Amanda M.; Mitchell, Brian R.; Miller, Kathryn M.; Schweiger, E. William

    2012-01-01

    Structural equation modeling (SEM) is increasingly being chosen by researchers as a framework for gaining scientific insights from the quantitative analyses of data. New ideas and methods emerging from the study of causality, influences from the field of graphical modeling, and advances in statistics are expanding the rigor, capability, and even purpose of SEM. Guidelines for implementing the expanded capabilities of SEM are currently lacking. In this paper we describe new developments in SEM that we believe constitute a third-generation of the methodology. Most characteristic of this new approach is the generalization of the structural equation model as a causal graph. In this generalization, analyses are based on graph theoretic principles rather than analyses of matrices. Also, new devices such as metamodels and causal diagrams, as well as an increased emphasis on queries and probabilistic reasoning, are now included. Estimation under a graph theory framework permits the use of Bayesian or likelihood methods. The guidelines presented start from a declaration of the goals of the analysis. We then discuss how theory frames the modeling process, requirements for causal interpretation, model specification choices, selection of estimation method, model evaluation options, and use of queries, both to summarize retrospective results and for prospective analyses. The illustrative example presented involves monitoring data from wetlands on Mount Desert Island, home of Acadia National Park. Our presentation walks through the decision process involved in developing and evaluating models, as well as drawing inferences from the resulting prediction equations. In addition to evaluating hypotheses about the connections between human activities and biotic responses, we illustrate how the structural equation (SE) model can be queried to understand how interventions might take advantage of an environmental threshold to limit Typha invasions. 
The guidelines presented provide for an updated definition of the SEM process that subsumes the historical matrix approach under a graph-theory implementation. The implementation is also designed to permit complex specifications and to be compatible with various estimation methods. Finally, they are meant to foster the use of probabilistic reasoning in both retrospective and prospective considerations of the quantitative implications of the results.
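    The prospective-query idea can be sketched for a linear SE model represented as a causal graph: each node is a linear function of its parents, so a do()-style intervention propagates through topological order. The node names and path coefficients below are hypothetical stand-ins loosely echoing the wetland example, not values from the study.

```python
# Sketch of querying a linear SE model represented as a causal graph.
# Node names and coefficients are invented for illustration.

edges = {  # child: {parent: path coefficient}
    "salinity": {"alteration": -0.5},
    "typha": {"alteration": 0.6, "salinity": -0.4},
}

def topo_order(edges):
    """Return nodes with every parent listed before its children."""
    nodes = set(edges) | {p for ps in edges.values() for p in ps}
    order, seen = [], set()
    def visit(n):
        if n in seen:
            return
        seen.add(n)
        for p in edges.get(n, {}):
            visit(p)
        order.append(n)
    for n in nodes:
        visit(n)
    return order

def query(edges, interventions):
    """Predict all node values given do()-style settings of root nodes."""
    values = dict(interventions)
    for node in topo_order(edges):
        if node not in values:
            values[node] = sum(c * values[p] for p, c in edges[node].items())
    return values

print(query(edges, {"alteration": 1.0}))
```

    In a full graph-theoretic SEM the coefficients would be estimated by Bayesian or likelihood methods and queries could also be probabilistic; this sketch only shows deterministic propagation along the graph.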

  18. Evaluation of clinical information modeling tools.

    PubMed

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified as developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.

  19. Electron-beam-induced potentials in semiconductors: calculation and measurement with an SEM/SPM hybrid system

    NASA Astrophysics Data System (ADS)

    Thomas, Ch; Joachimsthaler, I.; Heiderhoff, R.; Balk, L. J.

    2004-10-01

    In this work, electron-beam-induced potentials in semiconductors are analysed theoretically and experimentally. A theoretical model is developed to describe the surface potential distribution produced by an electron beam. The distribution of generated carriers is calculated using the semiconductor equations. This distribution causes a local change in surface potential, which is derived with the help of quasi-Fermi energies. The potential distribution is simulated using the model developed and measured with a scanning probe microscope (SPM) built inside a scanning electron microscope (SEM), for different samples, beam excitations, and SPM cantilever voltages. Finally, some fields of application are shown in which material properties can be determined using an SEM/SPM hybrid system.
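    A drastically simplified version of such a carrier-generation model is a 1-D steady-state excess-carrier balance, D*n''(x) - n(x)/tau + g(x) = 0 with n(0) = n(L) = 0, solved by finite differences. The parameters and Gaussian beam source below are illustrative only; the paper's model is fully 3-D and couples the potential through quasi-Fermi levels.

```python
import numpy as np

# Simplified 1-D steady-state excess-carrier balance under beam injection:
#   D * n''(x) - n(x)/tau + g(x) = 0,  n(0) = n(L) = 0
# discretized with central finite differences. All parameters are
# illustrative stand-ins, not values from the paper.
D, tau, L, N = 25.0, 1e-6, 50e-4, 201       # cm^2/s, s, cm, grid points
x = np.linspace(0.0, L, N)
h = x[1] - x[0]
g = 1e20 * np.exp(-((x - L / 2) ** 2) / (2 * (2e-4) ** 2))  # Gaussian source

# Assemble the tridiagonal system A n = -g on the interior points
main = -2.0 * D / h**2 - 1.0 / tau
off = D / h**2
A = (np.diag(np.full(N - 2, main))
     + np.diag(np.full(N - 3, off), 1)
     + np.diag(np.full(N - 3, off), -1))
n = np.zeros(N)
n[1:-1] = np.linalg.solve(A, -g[1:-1])

print(f"peak excess density: {n.max():.3e} cm^-3 at x = {x[n.argmax()]*1e4:.1f} um")
```

    The excess-carrier profile n(x) is what would then feed into the quasi-Fermi-level step to obtain the local surface potential change.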

  20. A new way of measuring wiggling pattern in SADP for 3D NAND technology

    NASA Astrophysics Data System (ADS)

    Mi, Jian; Chen, Ziqi; Tu, Li Ming; Mao, Xiaoming; Liu, Gong Cai; Kawada, Hiroki

    2018-03-01

    A new metrology method for quantitatively measuring wiggling patterns in a Self-Aligned Double Patterning (SADP) process for 3D NAND technology has been developed, using a CD-SEM metrology program on images from a Review-SEM system. The metrology program provided accurate modeling of various wiggling patterns. The Review-SEM system provided a Field of View (FOV) several micrometers wide, which exceeds the precision-guaranteed FOV of a conventional CD-SEM. The results were verified by comparing the Wiggling Index produced by the new method against visual inspection of vertically compressed images. A best-known-method (BKM) system with connected hardware and software has been developed to measure wiggling patterns automatically.
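    The exact Wiggling Index is defined by the CD-SEM metrology program, but an analogous measure can be sketched generically: detrend the measured edge positions along the line and report the residual RMS. The synthetic edge below is invented.

```python
import numpy as np

# Generic sketch of quantifying line-edge wiggling: fit the slow trend of
# the edge trace and report the residual RMS. This is only an analogous
# measure; the paper's Wiggling Index is defined by its CD-SEM program.
def wiggling_index(edge, detrend_deg=1):
    """RMS deviation of edge positions from a low-order trend (same units
    as `edge`, e.g. nm)."""
    y = np.arange(len(edge))
    trend = np.polyval(np.polyfit(y, edge, detrend_deg), y)
    return float(np.sqrt(np.mean((edge - trend) ** 2)))

y = np.arange(512)
straight = 100.0 + 0.01 * y                           # nearly straight edge
wiggly = straight + 3.0 * np.sin(2 * np.pi * y / 64)  # 3 nm sinusoidal wiggle
print(round(wiggling_index(straight), 3))  # ≈ 0
print(round(wiggling_index(wiggly), 3))    # ≈ 3/sqrt(2)
```

    A wide FOV matters for such a measure because low-frequency wiggling components are only visible when the trace is long compared with their period.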

  1. The relationship between use of sexually explicit media and sexual risk behavior in men who have sex with men: exploring the mediating effects of sexual self-esteem and condom use self-efficacy

    PubMed Central

    Træen, Bente; Hald, Gert Martin; Noor, Syed W.; Iantaffi, Alex; Grey, Jeremy; Rosser, B. R. Simon

    2014-01-01

    This study tests three hypotheses: (1) there is a direct association between consumption of sexually explicit media (SEM) depicting non-condom use and STI-related sexual risk behavior among men who have sex with men (MSM); (2) the association between SEM consumption and STI-related sexual risk behavior is mediated by men’s sexual self-esteem; and (3) the relationship between SEM consumption and sexual risk behavior is mediated by condom use self-efficacy. A cross-sectional, Internet-based survey on exposure to SEM and sexual behavior of 1,391 MSM in the USA was conducted in 2011. The results confirmed hypotheses 1 and 3, while hypothesis 2 was rejected. Accordingly, a significant association between the use of SEM picturing condom use and STI-related sexual risk behavior among MSM was found. Likewise, the association between the use of SEM and sexual risk behavior was mediated by condom use self-efficacy in an indirect path. However, SEM did not influence sexual risk behavior via sexual self-esteem. To promote STI prevention, the actors in SEM may be used as role models in managing condom use in sexual contexts. PMID:24904709
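    The mediation logic of hypothesis 3, an indirect path a*b through condom use self-efficacy, can be sketched with ordinary regressions on simulated data. All coefficients and the data-generating process below are invented, not the study's estimates.

```python
import numpy as np

# Minimal mediation sketch: exposure -> mediator (path a), then
# mediator -> outcome controlling for exposure (paths b and c').
# The indirect effect is a*b. Simulated data; invented coefficients.
rng = np.random.default_rng(1)
n = 2000
sem_use = rng.standard_normal(n)
efficacy = -0.4 * sem_use + rng.standard_normal(n)               # a path
risk = -0.5 * efficacy + 0.1 * sem_use + rng.standard_normal(n)  # b and c'

def ols(y, *xs):
    """Return OLS slope coefficients of y on the given predictors."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a = ols(efficacy, sem_use)[0]              # exposure -> mediator
b, c_prime = ols(risk, efficacy, sem_use)  # mediator -> outcome, direct path
print(f"indirect effect a*b = {a * b:.2f}, direct effect c' = {c_prime:.2f}")
```

    In practice the indirect effect's significance is usually assessed with bootstrapped confidence intervals inside an SEM framework rather than by two separate regressions.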

  2. AxiSEM3D: a new fast method for global wave propagation in 3-D Earth models with undulating discontinuities

    NASA Astrophysics Data System (ADS)

    Leng, K.; Nissen-Meyer, T.; van Driel, M.; Al-Attar, D.

    2016-12-01

    We present a new, computationally efficient numerical method to simulate global seismic wave propagation in realistic 3-D Earth models with laterally heterogeneous media and finite boundary perturbations. Our method is a hybrid of pseudo-spectral and spectral element methods (SEM). We characterize the azimuthal dependence of 3-D wavefields in terms of Fourier series, such that the 3-D equations of motion reduce to an algebraic system of coupled 2-D meridional equations, which can be solved by a 2-D spectral element method (based on www.axisem.info). The computational efficiency of our method stems from the lateral smoothness of global Earth models (with respect to wavelength) as well as the axial singularity of seismic point sources, which jointly confine the Fourier modes of wavefields to a few lower orders. All boundary perturbations that violate geometric spherical symmetry, including Earth's ellipticity, topography and bathymetry, and undulations of internal discontinuities such as the Moho and CMB, are uniformly treated by means of a particle relabeling transformation. The MPI-based high-performance C++ code AxiSEM3D is now available for forward simulations of 3-D Earth models with a fluid outer core, ellipticity, and both mantle and crustal structures. We show novel benchmarks of global wave solutions in 3-D mantle structures between our method and an independent, fully discretized 3-D SEM, with remarkable agreement. Performance comparisons are carried out on three state-of-the-art tomography models, with seismic periods going down to 5 s. Our method runs up to two orders of magnitude faster than the 3-D SEM for such settings, and this computational advantage scales favourably with seismic frequency. By examining wavefields passing through hypothetical Gaussian plumes of varying sharpness, we identify in model-wavelength space the limits where our method may lose its advantage.
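    The efficiency argument, that laterally smooth models confine the wavefield to a few azimuthal Fourier modes, can be illustrated with a toy azimuthal field and its Fourier power spectrum. The example fields below are invented and have nothing to do with any particular tomography model.

```python
import numpy as np

# Sketch of the core idea behind the method's efficiency: a laterally
# smooth field varies slowly in azimuth phi, so its Fourier series
# converges after a few modes, while a sharp field needs many.
phi = np.linspace(0, 2 * np.pi, 256, endpoint=False)
smooth = 1.0 + 0.1 * np.cos(phi) + 0.05 * np.cos(2 * phi)  # smooth model
rough = 1.0 + 0.1 * np.sign(np.cos(7 * phi))               # sharp model

def modes_needed(f, frac=0.999):
    """Smallest max azimuthal order capturing `frac` of the field's power."""
    power = np.abs(np.fft.rfft(f)) ** 2
    cum = np.cumsum(power) / power.sum()
    return int(np.searchsorted(cum, frac))

print("smooth field:", modes_needed(smooth), "modes")
print("sharp field: ", modes_needed(rough), "modes")
```

    The cost of the meridional solves grows with the number of retained azimuthal orders, which is why smooth models run far faster than sharp ones in this scheme.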

  3. Structural equation modeling and natural systems

    USGS Publications Warehouse

    Grace, James B.

    2006-01-01

    This book, first published in 2006, presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems.

  4. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  5. Stent migration following endoscopic suture fixation of esophageal self-expandable metal stents: a systematic review and meta-analysis.

    PubMed

    Law, Ryan; Prabhu, Anoop; Fujii-Lau, Larissa; Shannon, Carol; Singh, Siddharth

    2018-02-01

    Covered self-expandable metal stents (SEMS) are utilized for the management of benign and malignant esophageal conditions; however, covered SEMS are prone to migration. Endoscopic suture fixation may mitigate the migration risk of covered esophageal SEMS. Hence, we conducted a systematic review and meta-analysis to evaluate the effectiveness and safety of endoscopic suture fixation for covered esophageal SEMS. Following PRISMA guidelines, we performed a systematic review from 2011 to 2016 to identify studies (case control/case series) reporting the technical success and migration rate of covered esophageal SEMS following endoscopic suture fixation. We searched multiple electronic databases and conference proceedings. We calculated pooled rates (and 95% confidence intervals [CI]) of technical success and stent migration using a random effects model. We identified 14 studies (212 patients) describing covered esophageal SEMS placement with endoscopic suture fixation. When reported, SEMS indications included leak/fistula (n = 75), stricture (n = 65), perforation (n = 10), and achalasia (n = 4). The pooled technical success rate was 96.7% (95% CI 92.3-98.6), without heterogeneity (I² = 0%). We identified 29 SEMS migrations, at a rate of 15.9% (95% CI 11.4-21.6), without heterogeneity (I² = 0%). Publication bias was observed, and using the trim-and-fill method, a more conservative estimate for stent migration was 17.0%. Suture-related adverse events were estimated to occur in 3.7% (95% CI 1.6-8.2) of cases. Endoscopic suture fixation of covered esophageal SEMS appears to reduce stent migration when compared to published rates of non-anchored SEMS. However, SEMS migration still occurs in approximately 1 out of 6 cases despite excellent immediate technical success and low risk of suture-related adverse events.
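The pooling the abstract describes can be sketched with inverse-variance pooling of proportions on the logit scale, a standard meta-analytic approach; with I² = 0%, a random-effects model coincides with this fixed-effect computation. The per-study counts below are hypothetical, not the paper's data.

```python
import math

def pooled_proportion(events, totals):
    """Inverse-variance pooling of proportions on the logit scale.
    Returns the pooled proportion and a 95% CI (back-transformed)."""
    logits, variances = [], []
    for e, n in zip(events, totals):
        # 0.5 continuity correction guards against zero cells.
        p = (e + 0.5) / (n + 1.0)
        logits.append(math.log(p / (1.0 - p)))
        variances.append(1.0 / (e + 0.5) + 1.0 / (n - e + 0.5))
    weights = [1.0 / v for v in variances]
    pooled = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    inv = lambda x: 1.0 / (1.0 + math.exp(-x))  # inverse logit
    return inv(pooled), inv(pooled - 1.96 * se), inv(pooled + 1.96 * se)

# Hypothetical migration counts per study (events, patients):
est, lo, hi = pooled_proportion([3, 5, 2, 4], [20, 30, 15, 25])
print(round(est, 3), round(lo, 3), round(hi, 3))
```

Real meta-analysis software adds a between-study variance estimate (e.g. DerSimonian-Laird) to the weights when I² > 0.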

  6. Vibration band gaps for elastic metamaterial rods using wave finite element method

    NASA Astrophysics Data System (ADS)

    Nobrega, E. D.; Gautier, F.; Pelat, A.; Dos Santos, J. M. C.

    2016-10-01

    Band gaps in elastic metamaterial rods with spatial periodic distribution and periodically attached local resonators are investigated. New techniques for analyzing metamaterial systems combine an analytical or numerical method with wave propagation. One of them, called here the wave spectral element method (WSEM), combines the spectral element method (SEM) with Floquet-Bloch's theorem. A modern methodology called the wave finite element method (WFEM), developed to calculate dynamic behavior in periodic acoustic and structural systems, uses a similar approach in which SEM is substituted by the conventional finite element method (FEM). In this paper, it is proposed to use WFEM to calculate band gaps in elastic metamaterial rods with spatial periodic distribution and periodically attached local resonators of multi-degree-of-freedom (M-DOF). Simulated examples with band gaps generated by Bragg scattering and local resonators are calculated by WFEM and verified with WSEM, which is used as a reference method. Results are presented in the form of attenuation constant, vibration transmittance, and frequency response function (FRF). For all cases, WFEM and WSEM results are in agreement, provided that the number of elements used in WFEM is sufficient for convergence. An experimental test was conducted with a real elastic metamaterial rod, manufactured from plastic in a 3D printer, without a local resonance-type effect. The experimental results for the metamaterial rod with band gaps generated by Bragg scattering are compared with the simulated ones. Both numerical methods (WSEM and WFEM) localize the band gap position and width very close to the experimental results. A hybrid approach combining WFEM with the commercial finite element software ANSYS is proposed to model complex metamaterial systems. Two examples illustrating its efficiency and accuracy in modeling an elastic metamaterial rod unit cell, using a 1D simple rod element and a 3D solid element, are demonstrated, and the results show good agreement with the experimental data.

  7. Oceanic Whitecaps and Associated, Bubble-Mediated, Air-Sea Exchange Processes

    DTIC Science & Technology

    1992-10-01

    experiments performed in laboratory conditions using Air-Sea Exchange Monitoring System (A-SEMS). EXPERIMENTAL SET-UP In a first look, the Air-Sea Exchange...Model 225, equipped with a Model 519 plug-in module. Other complementary information on A-SEMS along with results from first tests and calibration...between 9.50C and 22.40C within the first 24 hours after transferring the water sample into laboratory conditions. The results show an enhancement of

  8. Characterization of fission gas bubbles in irradiated U-10Mo fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, Andrew M.; Burkes, Douglas E.; MacFarlan, Paul J.

    2017-09-01

    Irradiated U-10Mo fuel samples were prepared using traditional mechanical potting and polishing methods within a hot cell. They were then removed and imaged with an SEM located outside the hot cell. The images were processed with basic imaging techniques from three separate software packages. The results were compared, and a baseline method for characterization of fission gas bubbles in the samples is proposed. It is hoped that, through adoption of or comparison to this baseline method, sample characterization can be standardized across the field of post-irradiation examination of metal fuels.

  9. State-of-the-Art Resources (SOAR) for Software Vulnerability Detection, Test, and Evaluation

    DTIC Science & Technology

    2014-07-01

    preclude in-depth analysis, and widespread use of a Software-as-a-Service (SaaS) model that limits data availability and application to DoD systems...provide mobile application analysis using a Software-as-a-Service (SaaS) model. In this case, any software to be analyzed must be sent to the...tools are only available through a SaaS model. The widespread use of a Software-as-a-Service (SaaS) model as a sole evaluation model limits data

  10. Linking Structural Equation Modelling with Bayesian Network and Coastal Phytoplankton Dynamics in Bohai Bay

    NASA Astrophysics Data System (ADS)

    Chu, Jiangtao; Yang, Yue

    2018-06-01

    Bayesian networks (BN) have many advantages over other methods in ecological modelling and have become an increasingly popular modelling tool. However, BN are limited when model structures must be built from inadequate existing knowledge. To overcome this limitation, we propose a new method that links BN with structural equation modelling (SEM). In this method, SEM is used to improve the model structure for BN. This method was used to simulate coastal phytoplankton dynamics in Bohai Bay. We demonstrate that this hybrid approach minimizes the need for expert elicitation, generates more reasonable structures for BN models, and increases the BN model's accuracy and reliability. These results suggest that the inclusion of SEM for testing and verifying the theoretical structure during the initial construction stage improves the effectiveness of BN models, especially for complex eco-environment systems. The results also demonstrate that in Bohai Bay, while phytoplankton biomass has the greatest influence on phytoplankton dynamics, the impact of nutrients on phytoplankton dynamics is larger than the influence of the physical environment in summer. Furthermore, despite the Redfield ratio indicating that phosphorus should be the primary nutrient limiting factor, our results indicate that silicate plays the most important role in regulating phytoplankton dynamics in Bohai Bay.

  11. A UML-based metamodel for software evolution process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing

    2014-04-01

    A software evolution process is a set of interrelated software processes under which the corresponding software evolves. An object-oriented software evolution process meta-model (OO-EPMM), its abstract syntax, and formal OCL constraints on the meta-model are presented in this paper. OO-EPMM can represent not only the software development process but also software evolution.

  12. New airtight transfer box for SEM experiments: Application to lithium and sodium metals observation and analyses.

    PubMed

    Stephant, Nicolas; Grissa, Rabeb; Guillou, Fanch; Bretaudeau, Mickaël; Borjon-Piron, Yann; Guillet, Jacques; Moreau, Philippe

    2018-04-18

    The surface of some materials reacts very quickly on contact with air, either because it oxidizes or because it takes up humidity from the air. To enable observation of pristine surfaces by scanning electron microscopy (SEM), we designed an airtight transfer box that keeps samples under vacuum from the place of manufacture to the SEM chamber. The box is designed to fit all models of SEM, including those equipped with an airlock chamber. The design is deliberately simple so that the box can be manufactured by a standard mechanical workshop. The transfer box can be easily opened by gravity inside the SEM, preserving the best possible vacuum inside before opening. SEM images and energy dispersive spectroscopy (EDX) analyses of metallic lithium and sodium samples are presented before and after exposure to air. X-ray photoelectron spectroscopy (XPS) analyses of all samples are also discussed in order to investigate the chemical environments of the detected elements. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. In Situ Characterization of Boehmite Particles in Water Using Liquid SEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, Juan; Arey, Bruce W.; Yang, Li

    In situ imaging and elemental analysis of boehmite (AlOOH) particles in water is realized using the System for Analysis at the Liquid Vacuum Interface (SALVI) and Scanning Electron Microscopy (SEM). This paper describes the method and key steps in integrating the vacuum-compatible SALVI with SEM and obtaining secondary electron (SE) images of particles in liquid under high vacuum. Energy dispersive x-ray spectroscopy (EDX) is used to obtain elemental analysis of particles in liquid. A synthesized AlOOH particle is used as a model in the liquid SEM illustration. Our results demonstrate that particles can be imaged in the SE mode with good resolution. The AlOOH EDX spectrum shows a significant Al signal compared with the deionized water and empty channel controls. In situ liquid SEM is a powerful technique for studying particles in liquid, with many exciting applications. This procedure aims to provide technical details on how to conduct liquid SEM imaging and EDX analysis using SALVI and to reduce potential pitfalls of this approach for other researchers.

  14. Fabrication of a silver particle-integrated silicone polymer-covered metal stent against sludge and biofilm formation and stent-induced tissue inflammation

    PubMed Central

    Lee, Tae Hoon; Jang, Bong Seok; Jung, Min Kyo; Pack, Chan Gi; Choi, Jun-Ho; Park, Do Hyun

    2016-01-01

    To reduce tissue or tumor ingrowth, covered self-expandable metal stents (SEMSs) have been developed. The effectiveness of covered SEMSs may be attenuated by sludge or stone formation or by stent clogging due to the formation of biofilm on the covering membrane. In this study, we tested the hypothesis that a silicone membrane containing silver particles (Ag-P) would prevent sludge and biofilm formation on the covered SEMS. In vitro, the Ag-P-integrated silicone polymer-covered membrane exhibited sustained antibacterial activity, and there was no definite release of silver ions from the Ag-P-integrated silicone polymer membrane at any time point. Using a porcine stent model, in vivo analysis demonstrated that the Ag-P-integrated silicone polymer-covered SEMS reduced the thickness of the biofilm and the quantity of sludge formed, compared with a conventional silicone-covered SEMS. In vivo, the release of silver ions from an Ag-P-integrated silicone polymer-covered SEMS was not detected in porcine serum. The Ag-P-integrated silicone polymer-covered SEMS also resulted in significantly less stent-related bile duct and subepithelium tissue inflammation than a conventional silicone polymer-covered SEMS. Therefore, the Ag-P-integrated silicone polymer-covered SEMS reduced sludge and biofilm formation and stent-induced pathological changes in tissue. This novel SEMS may prolong the stent patency in clinical application. PMID:27739486

  15. Focused ion beam (FIB)/scanning electron microscopy (SEM) in tissue structural research.

    PubMed

    Leser, Vladka; Milani, Marziale; Tatti, Francesco; Tkalec, Ziva Pipan; Strus, Jasna; Drobne, Damjana

    2010-10-01

    The focused ion beam (FIB) and scanning electron microscope (SEM) are commonly used in material sciences for imaging and analysis of materials. Over the last decade, the combined FIB/SEM system has proven to be also applicable in the life sciences. We have examined the potential of the focused ion beam/scanning electron microscope system for the investigation of biological tissues of the model organism Porcellio scaber (Crustacea: Isopoda). Tissue from digestive glands was prepared as for conventional SEM or as for transmission electron microscopy (TEM). The samples were transferred into FIB/SEM for FIB milling and an imaging operation. FIB-milled regions were secondary electron imaged, back-scattered electron imaged, or energy dispersive X-ray (EDX) analyzed. Our results demonstrated that FIB/SEM enables simultaneous investigation of sample gross morphology, cell surface characteristics, and subsurface structures. The same FIB-exposed regions were analyzed by EDX to provide basic compositional data. When samples were prepared as for TEM, the information obtained with FIB/SEM is comparable, though at limited magnification, to that obtained from TEM. A combination of imaging, micro-manipulation, and compositional analysis appears of particular interest in the investigation of epithelial tissues, which are subjected to various endogenous and exogenous conditions affecting their structure and function. The FIB/SEM is a promising tool for an overall examination of epithelial tissue under normal, stressed, or pathological conditions.

  16. Fabrication of a silver particle-integrated silicone polymer-covered metal stent against sludge and biofilm formation and stent-induced tissue inflammation.

    PubMed

    Lee, Tae Hoon; Jang, Bong Seok; Jung, Min Kyo; Pack, Chan Gi; Choi, Jun-Ho; Park, Do Hyun

    2016-10-14

    To reduce tissue or tumor ingrowth, covered self-expandable metal stents (SEMSs) have been developed. The effectiveness of covered SEMSs may be attenuated by sludge or stone formation or by stent clogging due to the formation of biofilm on the covering membrane. In this study, we tested the hypothesis that a silicone membrane containing silver particles (Ag-P) would prevent sludge and biofilm formation on the covered SEMS. In vitro, the Ag-P-integrated silicone polymer-covered membrane exhibited sustained antibacterial activity, and there was no definite release of silver ions from the Ag-P-integrated silicone polymer membrane at any time point. Using a porcine stent model, in vivo analysis demonstrated that the Ag-P-integrated silicone polymer-covered SEMS reduced the thickness of the biofilm and the quantity of sludge formed, compared with a conventional silicone-covered SEMS. In vivo, the release of silver ions from an Ag-P-integrated silicone polymer-covered SEMS was not detected in porcine serum. The Ag-P-integrated silicone polymer-covered SEMS also resulted in significantly less stent-related bile duct and subepithelium tissue inflammation than a conventional silicone polymer-covered SEMS. Therefore, the Ag-P-integrated silicone polymer-covered SEMS reduced sludge and biofilm formation and stent-induced pathological changes in tissue. This novel SEMS may prolong the stent patency in clinical application.

  17. CCD Astrometric Measurements of WDS 00420-5547 MLO 1

    NASA Astrophysics Data System (ADS)

    Kith, Camerin; Wilson, Jake; Agro, Sam; Toms, Sarah; Andreski, Bella; Torrance, Emily; Tock, Kalée

    2018-01-01

    The position angle and separation of WDS 00420-5547 MLO 1 have been measured and reported in 20 publications since Robert Lewis Ellery's initial observation in 1877. This system was observed using the R-COP robotic telescope in Australia, which is part of the Skynet Robotic Telescope Network. The small separation made the two stars difficult to resolve except in the lowest-exposure-time images (5 seconds and 10 seconds) using a small measuring aperture (3-4 pixel aperture radius). AstroImageJ software was used to reduce the data and contribute a new measurement: position angle 165° ± 0.63° (±1 SEM) and separation ρ = 6.0 arcsec ± 0.12 arcsec (±1 SEM) on 2017.093 (Besselian date). The observation was plotted along with the past observations using the Desmos plotting tool, which allows the date to be displayed next to each position of the secondary. Although these stars are a common proper motion pair, the data and plot do not currently support classification of this system as gravitationally bound.
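The ±1 SEM uncertainties quoted above come from the standard error of the mean over repeated measurements, SEM = s / √n with s the sample standard deviation. A minimal sketch, using hypothetical separation measurements rather than the paper's data:

```python
import math

def mean_and_sem(values):
    """Mean and standard error of the mean (SEM = s / sqrt(n))."""
    n = len(values)
    mean = sum(values) / n
    # Sample standard deviation (n - 1 in the denominator).
    s = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return mean, s / math.sqrt(n)

# Hypothetical repeated separation measurements in arcsec:
mean, sem = mean_and_sem([5.9, 6.1, 6.0, 6.2, 5.8])
print(round(mean, 2), round(sem, 3))  # 6.0 0.071
```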

  18. Rigorous quantitative elemental microanalysis by scanning electron microscopy/energy dispersive x-ray spectrometry (SEM/EDS) with spectrum processing by NIST DTSA-II

    NASA Astrophysics Data System (ADS)

    Newbury, Dale E.; Ritchie, Nicholas W. M.

    2014-09-01

    Quantitative electron-excited x-ray microanalysis by scanning electron microscopy/silicon drift detector energy dispersive x-ray spectrometry (SEM/SDD-EDS) is capable of achieving high accuracy and high precision equivalent to that of the high spectral resolution wavelength dispersive x-ray spectrometer, even when severe peak interference occurs. The throughput of the SDD-EDS enables high-count spectra to be measured that are stable in calibration and resolution (peak shape) across the full deadtime range. With this high spectral stability, multiple linear least squares peak fitting is successful for separating overlapping peaks and spectral background. Careful specimen preparation is necessary to remove topography on unknowns and standards. The standards-based matrix correction procedure embedded in the NIST DTSA-II software engine returns quantitative results supported by a complete error budget, including estimates of the uncertainties from measurement statistics and from the physical basis of the matrix corrections. NIST DTSA-II is available free for Java platforms at http://www.cstl.nist.gov/div837/837.02/epq/dtsa2/index.html.
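The linear least squares peak fitting mentioned above can be illustrated on a toy problem: a measured spectrum modeled as a linear combination of two reference peak shapes, with the coefficients found from the normal equations. This is a two-peak, noiseless sketch under assumed Gaussian peak shapes, not DTSA-II's actual fitting engine.

```python
import math

def lstsq_2(refs, y):
    """Fit y ~ a*refs[0] + b*refs[1] by linear least squares,
    solving the 2x2 normal equations directly."""
    r0, r1 = refs
    s00 = sum(x * x for x in r0)
    s01 = sum(x * z for x, z in zip(r0, r1))
    s11 = sum(x * x for x in r1)
    b0 = sum(x * z for x, z in zip(r0, y))
    b1 = sum(x * z for x, z in zip(r1, y))
    det = s00 * s11 - s01 * s01
    return (s11 * b0 - s01 * b1) / det, (s00 * b1 - s01 * b0) / det

# Two strongly overlapping Gaussian reference peaks on a toy energy axis.
E = [i * 0.01 for i in range(200)]  # hypothetical keV grid
gauss = lambda mu, sig: [math.exp(-0.5 * ((e - mu) / sig) ** 2) for e in E]
peakA, peakB = gauss(0.9, 0.07), gauss(1.0, 0.07)
y = [2.0 * a + 0.5 * b for a, b in zip(peakA, peakB)]  # noiseless "spectrum"
a, b = lstsq_2((peakA, peakB), y)
print(round(a, 3), round(b, 3))  # recovers 2.0 and 0.5 despite the overlap
```

Real analyses fit many peaks plus a background model and propagate counting statistics into the coefficient uncertainties, as DTSA-II's error budget does.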

  19. Structure, microstructure and infrared studies of Ba0.06(Na1/2Bi1/2)0.94TiO3-NaNbO3 ceramics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy, Sumit K., E-mail: sumit.sxc13@gmail.com; Singh, S. N., E-mail: snsphyru@gmail.com; Prasad, K., E-mail: k.prasad65@gmail.com

    2016-05-06

    Lead-free solid solutions (1-x)Ba0.06(Na1/2Bi1/2)0.94TiO3-xNaNbO3 (0 ≤ x ≤ 1.0) were prepared by the conventional ceramic fabrication technique. X-ray diffraction and Rietveld refinement analyses of these ceramics were carried out using X'Pert HighScore Plus software to determine the crystal symmetry, space group, and unit cell dimensions. Rietveld refinement revealed that NaNbO3, with its orthorhombic structure, was completely diffused into the Ba0.06(Na1/2Bi1/2)0.94TiO3 lattice having rhombohedral-tetragonal symmetry. EDS and SEM studies were carried out in order to evaluate the quality and purity of the compounds. SEM images showed a change in grain shape with increasing NaNbO3 content. FTIR spectra confirmed the formation of the solid solution.

  20. Filling the gap: adding super-resolution to array tomography for correlated ultrastructural and molecular identification of electrical synapses at the C. elegans connectome.

    PubMed

    Markert, Sebastian Matthias; Britz, Sebastian; Proppert, Sven; Lang, Marietta; Witvliet, Daniel; Mulcahy, Ben; Sauer, Markus; Zhen, Mei; Bessereau, Jean-Louis; Stigloher, Christian

    2016-10-01

    Correlating molecular labeling at the ultrastructural level with high confidence remains challenging. Array tomography (AT) allows for a combination of fluorescence and electron microscopy (EM) to visualize subcellular protein localization on serial EM sections. Here, we describe an application for AT that combines near-native tissue preservation via high-pressure freezing and freeze substitution with super-resolution light microscopy and high-resolution scanning electron microscopy (SEM) analysis on the same section. We established protocols that combine SEM with structured illumination microscopy (SIM) and direct stochastic optical reconstruction microscopy (dSTORM). We devised a method for easy, precise, and unbiased correlation of EM images and super-resolution imaging data using endogenous cellular landmarks and freely available image processing software. We demonstrate that these methods allow us to identify and label gap junctions in Caenorhabditis elegans with precision and confidence, and imaging of even smaller structures is feasible. With the emergence of connectomics, these methods will allow us to fill in the gap: acquiring the correlated ultrastructural and molecular identity of electrical synapses.

  1. A Simulation Study of Threats to Validity in Quasi-Experimental Designs: Interrelationship between Design, Measurement, and Analysis.

    PubMed

    Holgado-Tello, Fco P; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A

    2016-01-01

    The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) A multistate model in the control condition (pre-test); and (2) A single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both the models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity.

  2. A Simulation Study of Threats to Validity in Quasi-Experimental Designs: Interrelationship between Design, Measurement, and Analysis

    PubMed Central

    Holgado-Tello, Fco. P.; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A.

    2016-01-01

    The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) A multistate model in the control condition (pre-test); and (2) A single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both the models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity. PMID:27378991

  3. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    PubMed

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made Bayesian approaches appear unattractive, and thus underdeveloped, in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using publicly available benchmark data sets.

  4. Measurement invariance via multigroup SEM: Issues and solutions with chi-square-difference tests.

    PubMed

    Yuan, Ke-Hai; Chan, Wai

    2016-09-01

    Multigroup structural equation modeling (SEM) plays a key role in studying measurement invariance and in group comparison. When population covariance matrices are deemed not equal across groups, the next step to substantiate measurement invariance is to see whether the sample covariance matrices in all the groups can be adequately fitted by the same factor model, called configural invariance. After configural invariance is established, cross-group equalities of factor loadings, error variances, and factor variances-covariances are then examined in sequence. With mean structures, cross-group equalities of intercepts and factor means are also examined. The established rule is that if the statistic for the current model is not significant at the .05 level, one then moves on to testing the next more restricted model using a chi-square-difference statistic. This article argues that such an established rule is unable to control either Type I or Type II errors. Analysis, an example, and Monte Carlo results show why and how chi-square-difference tests are easily misused. The fundamental issue is that chi-square-difference tests are developed under the assumption that the base model is sufficiently close to the population, and a nonsignificant chi-square statistic tells little about how good the model is. To overcome this issue, this article further proposes that null hypothesis testing in multigroup SEM be replaced by equivalence testing, which allows researchers to effectively control the size of misspecification before moving on to testing a more restricted model. R code is also provided to facilitate the applications of equivalence testing for multigroup SEM. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
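The mechanics of the chi-square-difference test that the article critiques can be sketched as follows. The fit statistics and degrees of freedom below are hypothetical, and the .05 critical value for df = 10 comes from a standard chi-square table; this illustrates only the conventional rule, not the equivalence-testing replacement the article proposes.

```python
def chisq_diff_test(t_restricted, df_restricted, t_base, df_base, crit):
    """Chi-square-difference test for nested SEMs: the difference of the
    two fit statistics is referred to a chi-square distribution whose df
    is the difference in model degrees of freedom."""
    t_diff = t_restricted - t_base
    df_diff = df_restricted - df_base
    return t_diff, df_diff, t_diff > crit  # reject equality constraints?

# Hypothetical fits: base (configural) model vs. equal-loadings model.
# 18.31 is the .05 critical value of chi-square with df = 10.
T, df, reject = chisq_diff_test(152.4, 58, 130.1, 48, crit=18.31)
print(round(T, 1), df, reject)  # 22.3 10 True
```

As the abstract notes, a nonsignificant difference under this rule says little when the base model itself is misspecified, which motivates the equivalence-testing alternative.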

  5. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  6. The Feasibility of Using Causal Indicators in Educational Measurement

    ERIC Educational Resources Information Center

    Wang, Jue; Engelhard, George, Jr.

    2016-01-01

    The authors of the focus article describe an important issue related to the use and interpretation of causal indicators within the context of structural equation modeling (SEM). In the focus article, the authors illustrate with simulated data the effects of omitting a causal indicator. Since SEMs are used extensively in the social and behavioral…

  7. Improving access in gastroenterology: The single point of entry model for referrals

    PubMed Central

    Novak, Kerri L; Van Zanten, Sander Veldhuyzen; Pendharkar, Sachin R

    2013-01-01

    In 2005, a group of academic gastroenterologists in Calgary (Alberta) adopted a centralized referral intake system known as central triage. This system provided a single point of entry model (SEM) for referrals rather than the traditional system of individual practitioners managing their own referrals and queues. The goal of central triage was to improve wait times and referral management. In 2008, a similar system was developed in Edmonton at the University of Alberta Hospital (Edmonton, Alberta). SEMs have subsequently been adopted by numerous subspecialties throughout Alberta. There are many benefits of SEMs including improved access and reduced wait times. Understanding and measuring complex patient flow systems is key to improving access, and centralized intake systems provide an opportunity to better understand total demand and system bottlenecks. This knowledge is particularly important for specialties such as gastroenterology (GI), in which demand exceeds supply. While it is anticipated that SEMs will reduce wait times for GI care in Canada, the lack of sufficient resources to meet the demand for GI care necessitates additional strategies. PMID:24040629

  9. Dependability modeling and assessment in UML-based software development.

    PubMed

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  11. Software reliability models for fault-tolerant avionics computers and related topics

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1987-01-01

    Software reliability research is briefly described. General research topics are reliability growth models, quality of software reliability prediction, the complete monotonicity property of reliability growth, conceptual modelling of software failure behavior, assurance of ultrahigh reliability, and analysis techniques for fault-tolerant systems.

  12. Using a Structural Equation Modelling Approach (SEM) to Examine Leadership of Heads of Subject Departments (HODs) as Perceived by Principals and Vice-Principals, Heads of Subject Departments and Teachers within "School Based Management" (SBM) Secondary Schools: Some Evidence from Hong Kong

    ERIC Educational Resources Information Center

    Au, Loretta; Wright, Nigel; Botton, Christopher

    2003-01-01

    This article reports the use of a Structural Equation Modelling (SEM) technique as a means of exploring our understanding of the leadership of Heads of Subject Departments within School Based Management (SBM) secondary schools in Hong Kong. Arguments made by Gronn (1999, 2000), Spillane et al. (2001) suggest that studies of leadership need to…

  13. Access to primary care for socio-economically disadvantaged older people in rural areas: exploring realist theory using structural equation modelling in a linked dataset.

    PubMed

    Ford, John A; Jones, Andy; Wong, Geoff; Clark, Allan; Porter, Tom; Steel, Nick

    2018-06-19

    Realist approaches seek to answer questions such as 'how?', 'why?', 'for whom?', 'in what circumstances?' and 'to what extent?' interventions 'work' using context-mechanism-outcome (CMO) configurations. Quantitative methods are not well-established in realist approaches, but structural equation modelling (SEM) may be useful to explore CMO configurations. Our aim was to assess the feasibility and appropriateness of SEM to explore CMO configurations and, if appropriate, make recommendations based on our access to primary care research. Our specific objectives were to map variables from two large population datasets to CMO configurations from our realist review looking at access to primary care, generate latent variables where needed, and use SEM to quantitatively test the CMO configurations. A linked dataset was created by merging individual patient data from the English Longitudinal Study of Ageing and practice data from the GP Patient Survey. Patients registered in rural practices and who were in the highest deprivation tertile were included. Three latent variables were defined using confirmatory factor analysis. SEM was used to explore the nine full CMOs. All models were estimated using robust maximum likelihoods and accounted for clustering at practice level. Ordinal variables were treated as continuous to ensure convergence. We successfully explored our CMO configurations, but analysis was limited because of data availability. Two hundred seventy-six participants were included. We found a statistically significant direct (context to outcome) or indirect effect (context to outcome via mechanism) for two of nine CMOs. The strongest association was between 'ease of getting through to the surgery' and 'being able to get an appointment' with an indirect mediated effect through convenience (proportion of the indirect effect of the total was 21%). 
Healthcare experience was not directly associated with getting an appointment, but there was a statistically significant indirect effect through convenience (53% mediated effect). Model fit indices showed adequate fit. SEM allowed quantification of CMO configurations and could complement other qualitative and quantitative techniques in realist evaluations to support inferences about strengths of relationships. Future research exploring CMO configurations with SEM should aim to collect, preferably continuous, primary data.
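    The mediated-effect arithmetic behind figures like the 21% proportion above is simple to sketch. The path coefficients below are invented, not the study's estimates; they merely happen to reproduce a similar proportion.

```python
def mediation_effects(a, b, c_prime):
    """Single-mediator decomposition for a context (C) -> mechanism (M) ->
    outcome (O) path model: a is the C->M path, b is the M->O path, and
    c_prime is the direct C->O path."""
    indirect = a * b
    total = c_prime + indirect
    return indirect, total, indirect / total

# Invented standardized path coefficients (not the study's estimates).
indirect, total, proportion = mediation_effects(0.5, 0.3, 0.55)
print(round(proportion, 2))  # about 0.21, i.e. ~21% of the total effect is mediated
```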

  14. Software For Computing Reliability Of Other Software

    NASA Technical Reports Server (NTRS)

    Nikora, Allen; Antczak, Thomas M.; Lyu, Michael

    1995-01-01

    Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.
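    As a hedged illustration of the kind of model CASRE and SMERFS execute against failure data (not their actual code), the classic Goel-Okumoto NHPP reliability-growth model can be written as:

```python
import math

def go_mean_failures(t, a, b):
    """Goel-Okumoto NHPP: expected cumulative failures by test time t,
    with a = total expected failures and b = per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def go_reliability(x, s, a, b):
    """Probability of no failure in the interval (s, s + x] after testing
    up to time s, under the same NHPP assumptions."""
    return math.exp(-(go_mean_failures(s + x, a, b) - go_mean_failures(s, a, b)))
```

Estimating a and b from observed failure times (e.g., by maximum likelihood) is what such tools automate; here the parameters are simply assumed.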

  15. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
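    SEPS itself is a full system-dynamics model; as a loose, invented illustration of the stock-and-flow feedback style it builds on (every parameter below is a placeholder), a toy project loop might look like:

```python
def simulate_project(tasks=100.0, staff=5.0, rate_per_person=0.2,
                     comm_overhead=0.06, rework_fraction=0.0, dt=1.0):
    """Toy stock-and-flow loop (invented, not SEPS): remaining work is a stock
    drained by a completion flow that communication overhead and rework
    attenuate -- the kind of feedback a system-dynamics model formalizes."""
    flow = staff * rate_per_person * (1.0 - comm_overhead * staff)
    if flow <= 0:
        raise ValueError("staffing level saturates on communication overhead")
    remaining, weeks = tasks, 0
    while remaining > 0:
        remaining -= flow * (1.0 - rework_fraction) * dt
        weeks += 1
    return weeks

print(simulate_project())  # overhead stretches the ideal 100-week schedule
```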

  16. Scanning electron microscopy analysis of hair index on Karachi's population for social and professional appearance enhancement.

    PubMed

    Ali, N; Zohra, R R; Qader, S A U; Mumtaz, M

    2015-06-01

    Hair texture, appearance and pigment play an important role in social and professional communication and in maintaining an overall appearance. This study was designed for morphological assessment of hair damage in Karachi's population due to natural factors and cosmetic treatments, using the scanning electron microscopy (SEM) technique. Hair samples used to study the effects of synthetic factors were given several cosmetic treatments (hot straightening, bleaching, synthetic dyeing and henna dyeing), whereas samples used to study natural factors (variation in gender, age and pigmentation) were left untreated. Morphological assessment was performed using the SEM technique, and the results were statistically analysed using Minitab 16 and SPSS 18 software. SEM images revealed fewer cuticular scales in males than in females of the same age, although the cuticular scales were larger in males than in females. The mean hair index of white hair was greater than that of black hair from the same head, as white hair is of comparatively recent origin. Tukey's method revealed that, among the cosmetic treatments, bleaching and synthetic dyeing caused the most damage to the hair. Statistical evaluation of the SEM results revealed that the human scalp hair index shows morphological variation with respect to age, gender, hair pigmentation, and chemical and physical treatments. Individuals opting for cosmetic treatments can thus clearly visualize the extent of hair damage these may cause in the long run. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  17. The reliability of a segmentation methodology for assessing intramuscular adipose tissue and other soft-tissue compartments of lower leg MRI images.

    PubMed

    Karampatos, Sarah; Papaioannou, Alexandra; Beattie, Karen A; Maly, Monica R; Chan, Adrian; Adachi, Jonathan D; Pritchard, Janet M

    2016-04-01

    Determine the reliability of a magnetic resonance (MR) image segmentation protocol for quantifying intramuscular adipose tissue (IntraMAT), subcutaneous adipose tissue, total muscle and intermuscular adipose tissue (InterMAT) of the lower leg. Ten axial lower leg MRI slices were obtained from 21 postmenopausal women using a 1 Tesla peripheral MRI system. Images were analyzed using sliceOmatic™ software. The average cross-sectional areas of the tissues were computed for the ten slices. Intra-rater and inter-rater reliability were determined and expressed as the standard error of measurement (SEM) (absolute reliability) and intraclass coefficient (ICC) (relative reliability). Intra-rater and inter-rater reliability for IntraMAT were 0.991 (95% confidence interval [CI] 0.978-0.996, p < 0.05) and 0.983 (95% CI 0.958-0.993, p < 0.05), respectively. For the other soft tissue compartments, the ICCs were all >0.90 (p < 0.05). The absolute intra-rater and inter-rater reliability (expressed as SEM) for segmenting IntraMAT were 22.19 mm(2) (95% CI 16.97-32.04) and 78.89 mm(2) (95% CI 60.36-113.92), respectively. This is a reliable segmentation protocol for quantifying IntraMAT and other soft-tissue compartments of the lower leg. A standard operating procedure manual is provided to assist users, and SEM values can be used to estimate sample size and determine confidence in repeated measurements in future research.
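    The last sentence's use of SEM values rests on two standard reliability formulas; a minimal sketch with illustrative numbers (not taken from the study's tables):

```python
import math

def sem_from_icc(sd, icc):
    """Absolute reliability from relative reliability: SEM = SD * sqrt(1 - ICC)."""
    return sd * math.sqrt(1.0 - icc)

def minimal_detectable_change(sem, z=1.96):
    """Smallest change exceeding measurement error at 95% confidence:
    MDC95 = z * sqrt(2) * SEM (the sqrt(2) reflects two measurements)."""
    return z * math.sqrt(2.0) * sem

# Illustrative numbers only (not the study's): SD = 10 mm^2, ICC = 0.91.
sem = sem_from_icc(sd=10.0, icc=0.91)
mdc = minimal_detectable_change(sem)
```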

  18. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT-purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published between 2005 and 2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software, and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  19. Measurement properties of the WOMAC LK 3.1 pain scale.

    PubMed

    Stratford, P W; Kennedy, D M; Woodhouse, L J; Spadoni, G F

    2007-03-01

    The Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) is applied extensively to patients with osteoarthritis of the hip or knee. Previous work has challenged the validity of its physical function scale; however, an extensive evaluation of its pain scale has not been reported. Our purpose was to estimate internal consistency, factorial validity, test-retest reliability, and the standard error of measurement (SEM) of the WOMAC LK 3.1 pain scale. Four hundred and seventy-four patients with osteoarthritis of the hip or knee awaiting arthroplasty were administered the WOMAC. Estimates of internal consistency (coefficient alpha), factorial validity (confirmatory factor analysis), and the SEM based on internal consistency (SEM(IC)) were obtained. Test-retest reliability [Type 2,1 intraclass correlation coefficients (ICC)] and a corresponding SEM(TRT) were estimated on a subsample of 36 patients. Our estimates were: internal consistency alpha=0.84; SEM(IC)=1.48; Type 2,1 ICC=0.77; SEM(TRT)=1.69. Confirmatory factor analysis failed to support a single factor structure of the pain scale with uncorrelated error terms. Two comparable models provided excellent fit: (1) a model with correlated error terms between the walking and stairs items, and between night and sit items (chi2=0.18, P=0.98); (2) a two factor model with walking and stairs items loading on one factor, night and sit items loading on a second factor, and the standing item loading on both factors (chi2=0.18, P=0.98). Our examination of the factorial structure of the WOMAC pain scale failed to support a single factor, and internal consistency analysis yielded a coefficient less than optimal for individual patient use. An alternate strategy to summing the five-item responses when considering individual patient application would be to interpret item responses separately or to sum only those items which display homogeneity.
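    The internal-consistency quantities reported here (coefficient alpha and SEM(IC)) follow standard formulas; a minimal sketch with toy data, in which perfectly consistent responses drive alpha to 1 and the SEM to zero:

```python
from statistics import variance

def cronbach_alpha(scores):
    """Coefficient alpha: scores is a list of per-person item-response lists
    (persons x items); alpha = k/(k-1) * (1 - sum(item vars)/var(totals))."""
    k = len(scores[0])
    item_vars = [variance([person[i] for person in scores]) for i in range(k)]
    total_var = variance([sum(person) for person in scores])
    return (k / (k - 1)) * (1.0 - sum(item_vars) / total_var)

def sem_internal_consistency(sd_total, alpha):
    """SEM based on internal consistency: SD * sqrt(1 - alpha)."""
    return sd_total * (1.0 - alpha) ** 0.5

# Toy data: four respondents answering five items with perfect consistency.
toy = [[1] * 5, [2] * 5, [3] * 5, [4] * 5]
alpha = cronbach_alpha(toy)
```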

  20. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  1. MicroCT analysis of a retrieved root restored with a bonded fiber-reinforced composite dowel: a pilot study.

    PubMed

    Lorenzoni, Fabio Cesar; Bonfante, Estevam A; Bonfante, Gerson; Martins, Leandro M; Witek, Lukasz; Silva, Nelson R F A

    2013-08-01

    This evaluation aimed to (1) validate micro-computed tomography (microCT) findings using scanning electron microscopy (SEM) imaging, and (2) quantify the volume of voids and the bonded surface area resulting from fiber-reinforced composite (FRC) dowel cementation technique using microCT scanning technology/3D reconstructing software. A fiberglass dowel was cemented in a condemned maxillary lateral incisor prior to its extraction. A microCT scan was performed of the extracted tooth creating a large volume of data in DICOM format. This set of images was imported to image-processing software to inspect the internal architecture of structures. The outer surface and the spatial relationship of dentin, FRC dowel, cement layer, and voids were reconstructed. Three-dimensional spatial architecture of structures and volumetric analysis revealed that 9.89% of the resin cement was composed of voids and that the bonded area between root dentin and cement was 60.63% larger than that between cement and FRC dowel. SEM imaging demonstrated the presence of voids similarly observed using microCT technology (aim 1). MicroCT technology was able to nondestructively measure the volume of voids within the cement layer and the bonded surface area at the root/cement/FRC interfaces (aim 2). The interfaces at the root dentin/cement/dowel represent a timely and relevant topic where several efforts have been conducted in the past few years to understand their inherent features. MicroCT technology combined with 3D reconstruction allows for not only inspecting the internal arrangement rendered by fiberglass adhesively bonded to root dentin, but also estimating the volume of voids and contacted bond area between the dentin and cement layer. © 2013 by the American College of Prosthodontists.

  2. Preparation, Characterization, and Optimization of Folic Acid-Chitosan-Methotrexate Core-Shell Nanoparticles by Box-Behnken Design for Tumor-Targeted Drug Delivery.

    PubMed

    Naghibi Beidokhti, Hamid Reza; Ghaffarzadegan, Reza; Mirzakhanlouei, Sasan; Ghazizadeh, Leila; Dorkoosh, Farid Abedin

    2017-01-01

    The objective of this study was to investigate the combined influence of independent variables in the preparation of folic acid-chitosan-methotrexate nanoparticles (FA-Chi-MTX NPs). These NPs were designed and prepared for targeted drug delivery in tumor. The NPs of each batch were prepared by coaxial electrospray atomization method and evaluated for particle size (PS) and particle size distribution (PSD). The independent variables were selected to be concentration of FA-chitosan, ratio of shell solution flow rate to core solution flow rate, and applied voltage. The design of experiments (DOE) was generated with three factors at three levels using Design-Expert software. Box-Behnken design was used to select 15 batches of experiments randomly. The chemical structure of FA-chitosan was examined by FTIR. The NPs of each batch were collected separately, and morphologies of NPs were investigated by field emission scanning electron microscope (FE-SEM). The captured pictures of all batches were analyzed by ImageJ software. Mean PS and PSD were calculated for each batch. Polynomial equation was produced for each response. The FE-SEM results showed the mean diameter of the core-shell NPs was around 304 nm, and nearly 30% of the produced NPs were in the desirable range. Optimum formulations were selected. The validation of DOE optimization results showed errors around 2.5 and 2.3% for PS and PSD, respectively. Moreover, the feasibility of using prepared NPs to target tumor extracellular pH was shown, as drug release was greater in the pH of endosome (acidic medium). Finally, our results proved that FA-Chi-MTX NPs were active against the human epithelial cervical cancer (HeLa) cells.
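    Assuming the standard coded construction (the study itself used dedicated DOE software), a Box-Behnken design for three factors with three center points yields exactly the 15 batches reported:

```python
from itertools import combinations, product

def box_behnken(n_factors, center_points=3):
    """Coded Box-Behnken design: each pair of factors takes all (+/-1)
    combinations while the remaining factors sit at the center level (0);
    center-point runs (all zeros) are appended for pure-error estimation."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for lo_hi in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = lo_hi
            runs.append(run)
    runs.extend([0] * n_factors for _ in range(center_points))
    return runs

design = box_behnken(3)
print(len(design))  # 3 factors -> 12 edge runs + 3 center points = 15 batches
```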

  3. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.
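    The general shape of such a parametric model can be sketched; the constants and the environment multiplier below are placeholders for illustration, not JPL's calibration or Tausworthe's actual questionnaire-driven parameters:

```python
def parametric_effort(ksloc, a=2.8, b=1.05, multipliers=()):
    """Generic parametric cost form: effort = a * size^b * product of
    adjustment factors (each factor answering one calibration question).
    All constants here are placeholder values."""
    effort = a * ksloc ** b
    for m in multipliers:
        effort *= m
    return effort

# A 10 KSLOC task in a slightly harsh environment (multiplier 1.15, invented).
print(round(parametric_effort(10, multipliers=(1.15,)), 1))
```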

  4. Dynamic predictive model for growth of Salmonella spp. in scrambled egg mix.

    PubMed

    Li, Lin; Cepeda, Jihan; Subbiah, Jeyamkondan; Froning, Glenn; Juneja, Vijay K; Thippareddi, Harshavardhan

    2017-06-01

    Liquid egg products can be contaminated with Salmonella spp. during processing. A dynamic model for the growth of Salmonella spp. in scrambled egg mix - high solids (SEM) was developed and validated. SEM was prepared and inoculated with ca. 2 log CFU/mL of a five serovar Salmonella spp. cocktail. Salmonella spp. growth data at isothermal temperatures (10, 15, 20, 25, 30, 35, 37, 39, 41, 43, 45, and 47 °C) in SEM were collected. Baranyi model was used (primary model) to fit growth data and the maximum growth rate and lag phase duration for each temperature were determined. A secondary model was developed with maximum growth rate as a function of temperature. The model performance measures, root mean squared error (RMSE, 0.09) and pseudo-R2 (1.00) indicated good fit for both primary and secondary models. A dynamic model was developed by integrating the primary and secondary models and validated using two sinusoidal temperature profiles, 5-15 °C (low temperature) for 480 h and 10-40 °C (high temperature) for 48 h. The RMSE values for the sinusoidal low and high temperature profiles were 0.47 and 0.42 log CFU/mL, respectively. The model can be used to predict Salmonella spp. growth in case of temperature abuse during liquid egg processing. Copyright © 2016. Published by Elsevier Ltd.
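    The abstract does not name its secondary model; assuming a common choice, the square-root (Ratkowsky-type) form, the secondary-model and dynamic-integration ideas can be sketched as follows (all parameter values invented, and lag phase ignored for brevity):

```python
def sqrt_model_rate(temp_c, b=0.02, t_min=4.0):
    """Square-root (Ratkowsky-type) secondary model:
    sqrt(mu_max) = b * (T - Tmin); the growth rate is zero at or below Tmin.
    b and t_min here are illustrative, not fitted values."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

def predict_growth(log_n0, temp_profile, dt_h=1.0):
    """Euler integration of log-linear growth over a time-temperature
    profile, ignoring lag -- the dynamic-model idea in miniature."""
    log_n = log_n0
    for temp_c in temp_profile:
        log_n += sqrt_model_rate(temp_c) * dt_h
    return log_n
```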

  5. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  6. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Scientific Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of its software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
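    Analogy (K-nearest-neighbor) estimation of the kind evaluated here can be sketched minimally; the feature vectors and effort values below are invented, and real estimators would add feature weighting and normalization:

```python
def analogy_estimate(target, history, k=2):
    """Effort by analogy: mean effort of the k most similar past projects.
    history is a list of (feature_vector, observed_effort) pairs; features
    should be pre-normalized so no attribute dominates the distance."""
    def distance(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v)) ** 0.5
    nearest = sorted(history, key=lambda rec: distance(rec[0], target))[:k]
    return sum(effort for _, effort in nearest) / k

# Invented normalized features (size, complexity) and observed efforts.
past = [((0.1, 0.2), 12.0), ((0.2, 0.2), 14.0), ((0.9, 0.8), 90.0)]
print(analogy_estimate((0.15, 0.2), past))  # averages the two small analogues
```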

  7. Exploring the influence of a social ecological model on school-based physical activity.

    PubMed

    Langille, Jessie-Lee D; Rodgers, Wendy M

    2010-12-01

    Among rising rates of overweight and obesity, schools have become essential settings to promote health behaviors, such as physical activity (PA). As schools exist within a broader environment, the social ecological model (SEM) provided a framework to consider how different levels interact and influence PA. The purpose of this study was to provide insight on school-based PA promotion by investigating the integration between different levels of Emmons's SEM within one public school board in a large Canadian city. Interviews were conducted with participants from the government (n = 4), the public school board (n = 3), principals (n = 3), and teachers (n = 4) and analyzed to explore perspectives on the various levels of the model. The results suggested that higher level policies "trickled down" into the organizational level of the SEM but there was pivotal responsibility for schools to determine how to implement PA strategies. Furthermore, schools have difficulty implementing PA because of the continued priority of academic achievement.

  8. Analysis on factors affecting household customers decision in using electricity at peak time and its correlation towards saving electricity

    NASA Astrophysics Data System (ADS)

    Pasasa, Linus; Marbun, Parlin; Mariza, Ita

    2015-09-01

    The purpose of this paper is to study and analyse the factors affecting customer decisions to use electricity at peak-load hours (17.00 to 22.00 WIB) and their behavior towards electricity conservation in Indonesian households. The underlying rationale is to influence a reduction in energy consumption by stimulating energy-saving behaviors, thereby reducing the impact of energy use on the environment. What is the correlation between decisions to use electricity during peak-load hours and household customers' behavior towards saving electricity? The primary data were obtained by distributing questionnaires to household-segment customers of the PT. PLN Jakarta Raya and Tangerang Distribution. The data were analysed using a Structural Equation Model (SEM) and AMOS software. The research finds that all factors (personal, social, PLN services, psychological, and cultural) positively influence customer decisions to use electricity at peak-load hours. There is a correlation between decisions to use electricity during peak-load hours and household customers' behavior towards saving electricity.

  9. [Characteristics of social supportive network serving the older female sex workers in Qingdao].

    PubMed

    Xu, Y Q; Li, Y F; Jiang, Z X; Zhang, X J; Yuan, X; Zhang, N; Li, X F; Jiang, B F

    2016-02-01

    To describe the status of social support among older female sex workers (OFSWs) in Qingdao and to better understand the characteristics of their egocentric social support networks. Ucinet 6 software was used to analyze the characteristics of the egocentric social networks of 400 OFSWs recruited by the respondent-driven sampling (RDS) method in Qingdao from March to June 2014. A structural equation model (SEM) was used for data analysis, model fitting, and estimation. The 400 OFSWs nominated 1 617 social support members, and the average size of their egocentric social networks was 4.0 ± 1.5. Among the alters (social support network members of the egos), 613 were fellow female sex workers, the largest share of all social ties (37.91%). Small network size and non-relative relationships were more pronounced among OFSWs with non-local registration, and their ratings of emotional support (4.42 ± 2.38) were significantly lower than those of tangible support (5.73 ± 1.69) (P<0.05). The SEM showed that homogeneity, tie strength and network structure were significantly related to the rating of average support, with total standardized effects of 0.110, 0.925 and -0.069, respectively. Homogeneity appeared to affect the degree of support both directly and indirectly. OFSWs in Qingdao tended to seek social support from friends who were also female sex workers. The stronger the ties between egos and alters, the greater the homogeneity between the two; the tighter the relations among the alters, the higher the degree of average social support acquired by the egos.
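
    Measures such as ego-network size and the tightness of relations among alters, which the abstract links to support ratings, can be computed directly. Below is a minimal pure-Python sketch of two standard egocentric measures; the example ties are hypothetical, and the actual study used Ucinet 6.

```python
# Sketch of basic egocentric-network measures of the kind Ucinet reports:
# ego-network size and density among alters. The data below are invented.

def ego_network_size(alters):
    """Number of alters nominated by the ego."""
    return len(alters)

def ego_network_density(alters, alter_ties):
    """Density among alters: observed ties / possible undirected ties."""
    n = len(alters)
    possible = n * (n - 1) / 2
    if possible == 0:
        return 0.0
    observed = sum(1 for a, b in alter_ties if a in alters and b in alters)
    return observed / possible

alters = {"fellow_fsw_1", "fellow_fsw_2", "friend", "relative"}
ties = [("fellow_fsw_1", "fellow_fsw_2"), ("fellow_fsw_1", "friend")]
print(ego_network_size(alters))                      # 4
print(round(ego_network_density(alters, ties), 3))   # 0.333
```

    A denser ego network (more ties among the alters) corresponds to the "tighter relations" that the abstract associates with higher average support.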

  10. Presenting an evaluation model of the trauma registry software.

    PubMed

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

    Trauma accounts for about 10% of deaths worldwide and is considered a global concern. This problem has led healthcare policy makers and managers to adopt a basic strategy in this context. Trauma registries play an important role in decreasing the mortality and disabilities caused by traumatic injuries. Today, various software packages are designed for trauma registry. Evaluating this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. General and specific criteria of trauma registry software were identified by reviewing the literature, including books, articles, scientific documents, valid websites and related software in this domain. Based on these criteria and the related software, a model for evaluating trauma registry software was proposed. From the proposed model, a checklist was designed and its validity and reliability evaluated. The model was presented to 12 experts and specialists using the Delphi technique. To analyze the results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved by the experts and professionals, the final version of the evaluation model for trauma registry software was presented. The criteria of trauma registry software were organized into two groups: 1- general criteria, 2- specific criteria. General criteria were classified into four main categories: 1- usability, 2- security, 3- maintainability, and 4- interoperability. Specific criteria were divided into four main categories: 1- data submission and entry, 2- reporting, 3- quality control, and 4- decision and research support.
The model presented in this research introduces important general and specific criteria of trauma registry software, together with the sub-criteria of each main criterion. This model was validated by experts in the field. Therefore, it can be used as a comprehensive model and a standard evaluation tool for measuring the efficiency, effectiveness and performance improvement of trauma registry software. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Development of the UTAUT2 model to measure the acceptance of medical laboratory portals by patients in Shiraz.

    PubMed

    Ravangard, Ramin; Kazemi, Zhila; Abbasali, Somaye Zaker; Sharifian, Roxana; Monem, Hossein

    2017-02-01

    One of the main stages in achieving success is the acceptance of a technology by its users. Hence, identifying the factors behind successful acceptance of information technology is necessary and vital. One such factor is usability. This study investigated software usability within the "Unified Theory of Acceptance and Use of Technology 2 (UTAUT2)" model as applied to patients' use of medical diagnosis laboratories' electronic portals in 2015. This cross-sectional study was carried out on 170 patients. A 27-item questionnaire adopted from previous research and the Usability Evaluation questionnaire were used for data collection. Data were analyzed using Structural Equation Modeling (SEM) with the Partial Least Squares approach in SPSS 20.0 and Smart-PLS V3.0. The results showed that the construct of intention to use had significant associations with price value (t-value=2.77), hedonic motivation (t-value=4.46), habit (t-value=1.99) and usability (t-value=5.2), and that the construct of usage behavior was associated with usability (t-value=3.45) and intention to use (t-value=2.03). Based on these results, the following recommendations can be made to increase patients' use of the portals: informing patients about the advantages of using these portals, designing portals in a simple and understandable form, increasing the portals' attractiveness, etc.

  12. Template construction grammar: from visual scene description to language comprehension and agrammatism.

    PubMed

    Barrès, Victor; Lee, Jinyong

    2014-01-01

    How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuo-motor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community.

  13. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
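
    The abstract's central construction can be sketched numerically. Assuming the idea is to apply the equilibrium distribution F_e(t) = (1/μ)∫₀ᵗ(1−F(x))dx of the fault-detection time distribution F inside an NHPP mean value function m(t) = a·F_e(t), the illustrative code below computes F_e by quadrature; for an exponential F, the known identity F_e = F provides a check. All parameter values are made up, not taken from the paper.

```python
# Hedged sketch: equilibrium distribution of a fault-detection time
# distribution, plugged into an NHPP mean value function m(t) = a * F_e(t).
import math

def equilibrium_cdf(survival, mean, t, steps=10000):
    """F_e(t) = (1/mean) * integral_0^t (1 - F(x)) dx, trapezoidal rule."""
    h = t / steps
    total = 0.5 * (survival(0.0) + survival(t))
    for i in range(1, steps):
        total += survival(i * h)
    return (h * total) / mean

# Exponential F with rate b: F_e equals F itself (a standard identity).
b = 0.5
surv = lambda x: math.exp(-b * x)
fe = equilibrium_cdf(surv, 1.0 / b, 4.0)
f = 1.0 - math.exp(-b * 4.0)

a = 100.0        # illustrative expected total number of faults
m_t = a * fe     # NHPP mean value function evaluated at t = 4
```

    With a non-exponential F (e.g., a gamma-shaped distribution), F_e differs from F, which is what gives the equilibrium-distribution SRMs their distinct fault-detection behavior.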

  14. An information model for use in software management estimation and prediction

    NASA Technical Reports Server (NTRS)

    Li, Ningda R.; Zelkowitz, Marvin V.

    1993-01-01

    This paper describes the use of cluster analysis for determining the information model within collected software engineering development data at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool that allows managers to predict development attributes during early phases of a software project and the modifications we propose to allow it to develop dynamic models for better predictions of these attributes.

  15. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
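
    The cause-effect structure described (team skill, process maturity, and problem complexity driving suitability) can be illustrated with a toy Bayesian network queried by enumeration. The network shape follows the abstract, but the probabilities below are invented for illustration and are not the paper's.

```python
# Toy Bayesian Belief Network: three binary driving factors and one
# binary outcome ("suitable"), with posterior queried by enumeration.
# All probabilities are illustrative, not taken from the paper.
import itertools

# Priors P(X = True) for each driving factor.
priors = {"skill": 0.7, "maturity": 0.6, "complexity": 0.5}

# CPT: P(suitable = True | skill, maturity, complexity) -- made-up numbers.
cpt = {
    (True, True, True): 0.70,   (True, True, False): 0.95,
    (True, False, True): 0.50,  (True, False, False): 0.80,
    (False, True, True): 0.40,  (False, True, False): 0.70,
    (False, False, True): 0.10, (False, False, False): 0.40,
}

def p_suitable(evidence=None):
    """P(suitable = True | evidence) by enumerating the hidden factors."""
    evidence = evidence or {}
    num = den = 0.0
    for s, m, c in itertools.product([True, False], repeat=3):
        assign = {"skill": s, "maturity": m, "complexity": c}
        if any(evidence.get(k, v) != v for k, v in assign.items()):
            continue  # inconsistent with the observed evidence
        w = 1.0
        for k, v in assign.items():
            w *= priors[k] if v else 1 - priors[k]
        num += w * cpt[(s, m, c)]
        den += w
    return num / den

print(p_suitable())                                      # prior suitability
print(p_suitable({"skill": True, "complexity": False}))  # with evidence
```

    As evidence about the driving factors accumulates during development, the posterior sharpens, which is the mechanism behind the forecasts the abstract describes.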

  16. Why Isn't Talent Development on the IEP? SEM and the Twice Exceptional Learner

    ERIC Educational Resources Information Center

    Baum, Susan; Novak, Cynthia

    2010-01-01

    Why isn't talent development included on the Individual Educational Plan of 2E students? Twice exceptional students have unique issues that respond especially well to a talent development approach especially within the context of the Schoolwide Enrichment Model. Through case studies and a review of successful projects using SEM with at risk…

  17. The Application of SEM to Behavioral Research in Oncology: Past Accomplishments and Future Opportunities

    ERIC Educational Resources Information Center

    Schnoll, Robert A.; Fang, Carolyn Y.; Manne, Sharon L.

    2004-01-01

    The past decade has seen a tremendous growth in the use of structural equation modeling (SEM) to address research questions in 2 subfields of behavioral science: cancer prevention and control (e.g., determinants of cancer screening adherence) and behavioral oncology (e.g., determinants of psychosocial adjustment among cancer patients or…

  18. Lessons Learned from My Students: The Impact of SEM Teaching and Learning on Affective Development

    ERIC Educational Resources Information Center

    Hebert, Thomas P.

    2010-01-01

    Through reflection on his years as an enrichment teacher in Schoolwide Enrichment Model (SEM) programs, the author describes significant ways the social and emotional development of his students was shaped by their involvement in enriched teaching and learning. Through portraits of his students engaged in Type II and Type III enrichment, the…

  19. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g., OO) have appeared, and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that products meet NASA requirements for reliability measurement. For the new software-component models of the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability models to accommodate major changes in software development technology may also yield substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability models, which could then be incorporated into a tool such as SMERFS'3. This tool, with better models, would add great value in assessing GSFC projects.

  20. Are Earth System model software engineering practices fit for purpose? A case study.

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.; Johns, T. C.

    2009-04-01

    We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements when making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques, driven strongly by scientific research goals, have evolved for verification and validation of such models. In a formal software engineering context, these represent costly but invaluable software integration tests with considerable benefits. The software engineering practices observed also exhibit recognisable features of both agile and open source software development projects - self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.

  1. Students' Different Understandings of Class Diagrams

    ERIC Educational Resources Information Center

    Boustedt, Jonas

    2012-01-01

    The software industry needs well-trained software designers and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task to many students. This article reports empirical findings from a…

  2. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  3. Assessing statistical differences between parameters estimates in Partial Least Squares path modeling.

    PubMed

    Rodríguez-Entrena, Macario; Schuberth, Florian; Gelhard, Carsten

    2018-01-01

    Structural equation modeling using partial least squares (PLS-SEM) has become a mainstream modeling approach in various disciplines. Nevertheless, the prior literature still lacks practical guidance on how to properly test for differences between parameter estimates. Whereas existing techniques, such as parametric and non-parametric approaches in PLS multi-group analysis, only allow assessing differences between parameters estimated for different subpopulations, the study at hand introduces a technique for assessing whether two parameter estimates derived from the same sample are statistically different. To illustrate this advancement to PLS-SEM, we refer to a reduced version of the well-established technology acceptance model.
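
    The kind of test the abstract introduces, assessing whether two parameter estimates from the same sample differ, can be sketched with a percentile bootstrap. The sketch below is a hedged stand-in: a two-predictor least-squares model on synthetic data replaces the PLS path model, and the key point is that both coefficients are re-estimated from the *same* resample in each bootstrap draw.

```python
# Bootstrap test for a difference between two coefficients estimated
# from the same sample; synthetic data, no-intercept OLS as a stand-in.
import random

random.seed(7)
n = 200
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [random.gauss(0, 1) for _ in range(n)]
y = [0.8 * a + 0.1 * b + random.gauss(0, 1) for a, b in zip(x1, x2)]

def fit(idx):
    """No-intercept OLS slopes for y ~ x1 + x2 via the 2x2 normal equations."""
    s11 = sum(x1[i] * x1[i] for i in idx)
    s12 = sum(x1[i] * x2[i] for i in idx)
    s22 = sum(x2[i] * x2[i] for i in idx)
    s1y = sum(x1[i] * y[i] for i in idx)
    s2y = sum(x2[i] * y[i] for i in idx)
    det = s11 * s22 - s12 * s12
    return ((s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det)

diffs = []
for _ in range(1000):
    idx = [random.randrange(n) for _ in range(n)]  # resample with replacement
    b1, b2 = fit(idx)                              # both estimates, same draw
    diffs.append(b1 - b2)
diffs.sort()
lo, hi = diffs[25], diffs[974]        # approximate 95% percentile interval
significant = not (lo <= 0.0 <= hi)   # True if the two parameters differ
```

    Because both estimates come from the same resample, their sampling covariance is carried into the bootstrap distribution of the difference, which is what distinguishes this from a multi-group comparison.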

  4. Increasing the reliability of ecological models using modern software engineering techniques

    Treesearch

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  5. Tips on Creating Complex Geometry Using Solid Modeling Software

    ERIC Educational Resources Information Center

    Gow, George

    2008-01-01

    Three-dimensional computer-aided drafting (CAD) software, sometimes referred to as "solid modeling" software, is easy to learn, fun to use, and becoming the standard in industry. However, many users have difficulty creating complex geometry with the solid modeling software. And the problem is not entirely a student problem. Even some teachers and…

  6. B Layers and Adhesion on Armco Iron Substrate

    NASA Astrophysics Data System (ADS)

    Elias-Espinosa, M.; Ortiz-Domínguez, M.; Keddam, M.; Flores-Rentería, M. A.; Damián-Mejía, O.; Zuno-Silva, J.; Hernández-Ávila, J.; Cardoso-Legorreta, E.; Arenas-Flores, A.

    2014-08-01

    In this work, a kinetic model was suggested to evaluate the boron diffusion coefficient in the Fe2B layers grown on the Armco iron substrate by the powder-pack boriding. This thermochemical treatment was carried out in the temperature range of 1123-1273 K for treatment times ranging from 2 to 8 h. The boron diffusion coefficient in the Fe2B layers was estimated by solving the mass balance equation at the (Fe2B/substrate) interface with an inclusion of boride incubation time. To validate the present model, the simulated value of Fe2B layer thickness was compared with the experimental value obtained at 1253 K for a treatment time of 5 h. The morphology of Fe2B layers was observed by SEM and optical microscopy. Metallographic studies showed that the boride layer has a saw-tooth morphology in all the samples. The layer thickness measurements were done with the help of MSQ PLUS software. The Fe2B phase was identified by x-ray diffraction method. Finally, the adherence of Fe2B layers on the Armco iron substrate was qualitatively evaluated by using the Daimler-Benz Rockwell-C indentation technique. In addition, the estimated value of boron activation energy was compared to the literature data.

  7. Software engineering and the role of Ada: Executive seminar

    NASA Technical Reports Server (NTRS)

    Freedman, Glenn B.

    1987-01-01

    The objective was to introduce the basic terminology and concepts of software engineering and Ada. The life cycle model is reviewed, and the goals and principles of software engineering are applied. An introductory understanding of the features of the Ada language is gained. Topics addressed include: the software crisis; the mandate of the Space Station Program; the software life cycle model; software engineering; and Ada under the software engineering umbrella.

  8. A two scale analysis of tight sandstones

    NASA Astrophysics Data System (ADS)

    Adler, P. M.; Davy, C. A.; Song, Y.; Troadec, D.; Hauss, G.; Skoczylas, F.

    2015-12-01

    Tight sandstones have a low porosity and a very small permeability K. Available models for K do not compare well with measurements. These sandstones are made of SiO_2 grains with a typical size of several hundred microns. The grains are separated by a network of micro-cracks with sizes ranging from microns down to tens of nm. The structure can therefore be schematized as Voronoi polyhedra separated by plane, permeable polygonal micro-cracks. Our goal is to estimate K by a two-scale analysis and to compare the results to measurements. For a particular sample [2], local measurements on several scales include FIB/SEM [3], CMT and 2D SEM. FIB/SEM is selected because the peak pore size given by Mercury Intrusion Porosimetry is 350 nm. FIB/SEM imaging (with 50 nm voxel size) identifies an individual crack of 180 nm average opening, whereas CMT (with 60 nm voxel size) provides a connected porosity (an individual crack) of 4 microns average opening. Numerical modelling is performed by combining the micro-crack network scale (given by 2D SEM) and the 3D micro-crack scale (given by either FIB/SEM or CMT). Estimates of the micro-crack density are derived from 2D SEM trace maps by counting intersections with scanlines, the surface density of traces, and the number of fracture intersections. K is deduced using a semi-empirical formula valid for identical, isotropic and uniformly distributed fractures [1]. This value is proportional to the micro-crack transmissivity sigma, which is determined by solving the Stokes equation in the micro-cracks measured by FIB/SEM or CMT. K is obtained by combining the two previous results. Good correlation with values measured on centimetric plugs is found when using sigma from the CMT data. The results are discussed and further research is proposed. [1] Adler et al, Fractured porous media, Oxford Univ. Press, 2012. [2] Duan et al, Int. J. Rock Mech. Mining Sci., 65, p75, 2014.
[3] Song et al, Marine and Petroleum Eng., 65, p63, 2015.
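
    The abstract does not give the semi-empirical network formula of [1], so as a hedged stand-in the sketch below uses the classical cubic law for the transmissivity of a parallel-plate crack and a simple parallel-fracture estimate of K. The opening is taken loosely from the FIB/SEM measurement quoted above; the crack spacing (set to the grain-size scale) is an assumption.

```python
# Hedged sketch: cubic-law crack transmissivity, sigma = w**3 / 12,
# and an equivalent permeability K = sigma / s for parallel cracks of
# opening w [m] spaced s [m] apart. Not the paper's network formula.

def crack_transmissivity(w):
    """Cubic-law transmissivity of a parallel-plate crack of opening w [m]."""
    return w ** 3 / 12.0

def permeability_parallel_cracks(w, s):
    """Equivalent permeability [m^2] of parallel cracks with spacing s [m]."""
    return crack_transmissivity(w) / s

w = 180e-9   # 180 nm average crack opening (FIB/SEM value from the abstract)
s = 300e-6   # assumed spacing at the grain-size scale (several hundred microns)
K = permeability_parallel_cracks(w, s)
print(f"K ~ {K:.2e} m^2")   # of the order of 1e-18 m^2, i.e. "tight"
```

    The cubic dependence on the opening w explains why sigma computed from the CMT crack (micron-scale opening) and from the FIB/SEM crack (sub-micron opening) can differ by orders of magnitude.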

  9. 3DSEM++: Adaptive and intelligent 3D SEM surface reconstruction.

    PubMed

    Tafti, Ahmad P; Holz, Jessica D; Baghaie, Ahmadreza; Owen, Heather A; He, Max M; Yu, Zeyun

    2016-08-01

    Structural analysis of microscopic objects is a longstanding topic in several scientific disciplines, such as the biological, mechanical, and materials sciences. The scanning electron microscope (SEM), a promising imaging instrument, has been used for decades to determine the surface properties (e.g., compositions or geometries) of specimens, achieving increased magnification and contrast and resolution finer than one nanometer. Whereas SEM micrographs remain two-dimensional (2D), many research and educational questions truly require knowledge of their three-dimensional (3D) structures. 3D surface reconstruction from SEM images leads to remarkable understanding of microscopic surfaces, allowing informative and qualitative visualization of the samples being investigated. In this contribution, we integrate several computational technologies, including machine learning, the a contrario methodology, and epipolar geometry, to design and develop a novel and efficient method called 3DSEM++ for multi-view 3D SEM surface reconstruction in an adaptive and intelligent fashion. Experiments performed on real and synthetic data show that the approach achieves significant precision in both SEM extrinsic calibration and 3D surface modeling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Recent advances in 3D SEM surface reconstruction.

    PubMed

    Tafti, Ahmad P; Kirkpatrick, Andrew B; Alavi, Zahrasadat; Owen, Heather A; Yu, Zeyun

    2015-11-01

    The scanning electron microscope (SEM), one of the most commonly used instruments in the biological and material sciences, employs electrons instead of light to determine the surface properties of specimens. However, SEM micrographs remain 2D images. To effectively measure and visualize surface attributes, we need to restore the 3D shape model from the SEM images. 3D surface reconstruction is a longstanding topic in microscopy vision, as it offers quantitative and visual information for a variety of applications including medicine, pharmacology, chemistry, and mechanics. In this paper, we survey the expanding body of work in this area, including a discussion of recent techniques and algorithms. We also enhance the reliability, accuracy, and speed of 3D SEM surface reconstruction by designing and developing an optimized multi-view framework. We then consider several real-world experiments as well as synthetic data to examine the qualitative and quantitative attributes of our proposed framework. Furthermore, we present a taxonomy of 3D SEM surface reconstruction approaches and address several challenging issues as part of our future work. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. In Situ Characterization of Boehmite Particles in Water Using Liquid SEM.

    PubMed

    Yao, Juan; Arey, Bruce W; Yang, Li; Zhang, Fei; Komorek, Rachel; Chun, Jaehun; Yu, Xiao-Ying

    2017-09-27

    In situ imaging and elemental analysis of boehmite (AlOOH) particles in water is realized using the System for Analysis at the Liquid Vacuum Interface (SALVI) and Scanning Electron Microscopy (SEM). This paper describes the method and key steps in integrating the vacuum-compatible SALVI with SEM and obtaining secondary electron (SE) images of particles in liquid under high vacuum. Energy dispersive x-ray spectroscopy (EDX) is used to obtain elemental analysis of particles in liquid, along with control samples including deionized (DI) water only and an empty channel. Synthesized boehmite (AlOOH) particles suspended in liquid are used as a model in the liquid SEM illustration. The results demonstrate that the particles can be imaged in the SE mode with good resolution (i.e., 400 nm). The AlOOH EDX spectrum shows a significant aluminum (Al) signal when compared with the DI water and empty channel controls. In situ liquid SEM is a powerful technique for studying particles in liquid, with many exciting applications. This procedure aims to provide the technical know-how needed to conduct liquid SEM imaging and EDX analysis using SALVI and to reduce potential pitfalls when using this approach.

  12. Presenting an Evaluation Model for the Cancer Registry Software.

    PubMed

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer is increasingly prevalent, cancer registries are of great importance as the core of cancer control programs, and many different software packages have been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential to evaluate and compare a wide range of such software. In this study, the criteria of cancer registry software were determined by studying the documentation and two functional software packages in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. The evaluation method, chosen on the basis of the findings, is a criteria-based evaluation. The model encompasses the various dimensions of cancer registry software and a proper method for evaluating them. The strong point of this evaluation model is the separation between general criteria and specific ones, while maintaining the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  13. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed with which the processes used in software development have changed makes forecasting the overall cost of a software project very difficult. Many researchers have considered this task unachievable, but others hold that it can be solved using well-known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. This paper presents an approach to building cost estimation models for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. The first part of the paper summarizes the major achievements in the search for a model to estimate overall project costs, together with a description of existing software development process models. The last part proposes a basic mathematical model for genetic programming, including the chosen fitness function and chromosome representation. The model described is linked to the current reality of software development, taking the software product life cycle as a basis alongside current challenges and innovations in the field. Based on the authors' experience and an analysis of existing models and the product life cycle, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
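
    For context on the COCOMO 81 model named above: in its basic form, COCOMO 81 estimates effort E = a·KLOC^b (person-months) and schedule T = c·E^d (months), with the coefficients fixed per project class. The sketch below uses the standard published coefficients; the paper itself fits richer models with genetic algorithms rather than applying these fixed constants.

```python
# Basic COCOMO 81: effort and schedule from size in KLOC, per project class.
COCOMO81_BASIC = {
    # mode: (a, b, c, d)
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    """Return (effort in person-months, schedule in months)."""
    a, b, c, d = COCOMO81_BASIC[mode]
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

effort, schedule = basic_cocomo(32, "organic")
print(f"{effort:.1f} person-months over {schedule:.1f} months")
```

    Datasets such as PROMISE's COCOMO 81 collection record actual project sizes and efforts, against which learned estimation models can be benchmarked.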

  14. Comparison of two dental implant surface modifications on implants with same macrodesign: an experimental study in the pelvic sheep model.

    PubMed

    Ernst, Sabrina; Stübinger, Stefan; Schüpbach, Peter; Sidler, Michéle; Klein, Karina; Ferguson, Stephen J; von Rechenberg, Brigitte

    2015-08-01

    The aim of this study was to compare two different surfaces on one uniform macro-implant design in order to focus exclusively on osseointegration properties after 2, 4 and 8 weeks, and to discuss the animal model chosen. In six mature sheep, n = 36 implants with a highly crystalline, phosphate-enriched anodized titanium oxide surface (TiU) and n = 36 implants with a hydrophilic, sandblasted, large-grit and acid-etched surface (SLA) were placed in the pelvic bone. TiU implants were custom-made to match the SLA implant design. Implant stability and bone-to-implant contact (BIC) were assessed by resonance frequency analysis (ISQ), backscatter scanning electron microscopy (B-SEM), light microscopy (LM), micro-CT and intravital fluorochrome staining. Biomechanical removal torque testing was performed. Overall, no statistically significant differences in total BIC (trabecular + cortical) between TiU and SLA were found via LM or B-SEM. BIC values (B-SEM; LM) in both groups revealed a steady rise in trabecular bone attachment to the implant surface after 2, 4 and 8 weeks. In the 2- to 4-week interval, a statistically significant increase in trabecular BIC was observed via LM in both the TiU group (P = 0.005) and the SLA group (P = 0.01). B-SEM values confirmed the statistically significant increase for TiU (P = 0.001). In both groups, trabecular BIC values after 8 weeks were significantly higher (P ≤ 0.05) than after 2 weeks (B-SEM; LM). Biomechanical data confirmed the histological data. The two surfaces showed comparable osseointegration in this sheep model. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Choices in higher education: Majoring in and changing from the sciences

    NASA Astrophysics Data System (ADS)

    Minear, Nancy Ann

    This dissertation addresses patterns of retention of undergraduate science, engineering and mathematics (SEM) students, with special attention paid to female and underrepresented minority students. As such, the study is focused on issues related to academic discipline and institutional retention, rather than the retention of students in the overall system of higher education. While previous retention studies have little to say about rates of retention that are specific to the sciences (or any other specific area of study) or employ models that rely on students' performance at the college level, this work addresses both points by identifying the postsecondary academic performance characteristics of persisters and non-persisters in the sciences by gender, ethnicity and matriculating major, as well as identifying introductory SEM course requirements that prevent students from persisting in science majors. A secondary goal of investigating the usefulness of institutional records for retention research is addressed. Models produced for the entire population and selected subpopulations consistently classified higher-performing (both SEM and non-SEM grade point averages) students into Bachelor of Science categories using the number of Introductory Chemistry courses attempted at the university. For lower-performing students, those with more introductory chemistry courses were classified as changing majors out of the sciences, and in general as completing a Bachelor of Arts degree. Performance in gatekeeper courses as a predictor of terminal academic status was limited to Introductory Physics for a small number of cases. Performance in Introductory Calculus and Introductory Chemistry were not consistently utilized as predictor variables. 
The models produced for various subpopulations (women, ethnic groups and matriculation major) utilized the same set of predictor variables with varying cutpoints for classification.

  16. Spectral-element global waveform tomography: A second-generation upper-mantle model

    NASA Astrophysics Data System (ADS)

    French, S. W.; Lekic, V.; Romanowicz, B. A.

    2012-12-01

    The SEMum model of Lekic and Romanowicz (2011a) was the first global upper-mantle VS model obtained using whole-waveform inversion with spectral element (SEM: Komatitsch and Vilotte, 1998) forward modeling of time domain three component waveforms. SEMum exhibits stronger amplitudes of heterogeneity in the upper 200km of the mantle compared to previous global models - particularly with respect to low-velocity anomalies. To make SEM-based waveform inversion tractable at global scales, SEMum was developed using: (1) a version of SEM coupled to 1D mode computation in the earth's core (C-SEM, Capdeville et al., 2003); (2) asymptotic normal-mode sensitivity kernels, incorporating multiple forward scattering and finite-frequency effects in the great-circle plane (NACT: Li and Romanowicz, 1995); and (3) a smooth anisotropic crustal layer of uniform 60km thickness, designed to match global surface-wave dispersion while reducing the cost of time integration in the SEM. The use of asymptotic kernels reduced the number of SEM computations considerably (≥ 3x) relative to purely numerical approaches (e.g. Tarantola, 1984), while remaining sufficiently accurate at the periods of interest (down to 60s). However, while the choice of a 60km crustal-layer thickness is justifiable in the continents, it can complicate interpretation of shallow oceanic upper-mantle structure. We here present an update to the SEMum model, designed primarily to address these concerns. The resulting model, SEMum2, was derived using a crustal layer that again fits global surface-wave dispersion, but with a more geologically consistent laterally varying thickness: approximately honoring Crust2.0 (Bassin, et al., 2000) Moho depth in the continents, while saturating at 30km in the oceans. We demonstrate that this approach does not bias our upper mantle model, which is constrained not only by fundamental mode surface waves, but also by overtone waveforms. 
We have also improved our data-selection and assimilation scheme, more readily allowing for additional and higher-quality data to be incorporated into our inversion as the model improves. Further, we have been able to refine the parameterization of the isotropic component of our model, previously limited by our ability to solve the large dense linear system that governs model updates (Tarantola and Valette, 1982). The construction of SEMum2 involved 3 additional inversion iterations away from SEMum. Overall, the combined effect of these improvements confirms and validates the general structure of the original SEMum. Model amplitudes remain an impressive feature in SEMum2, wherein peak-to-peak variation in VS can exceed 15% in close lateral juxtaposition. Further, many intriguing structures present in SEMum are now imaged with improved resolution in the updated model. In particular, the geographic extents of the anomalous oceanic cluster identified by Lekic and Romanowicz (2011b) are consistent with our findings and now allow us to further identify alternating bands of lower and higher velocities in the 200-300km depth range beneath the Pacific basin, with a characteristic spacing of ˜2000km normal to absolute plate motion. Possible dynamic interpretation of these and other features in the ocean basins is explored in a companion presentation (Romanowicz et al., this meeting).

  17. Cognitive aging on latent constructs for visual processing capacity: a novel structural equation modeling framework with causal assumptions based on a theory of visual attention.

    PubMed

    Nielsen, Simon; Wilms, L Inge

    2014-01-01

    We examined the effects of normal aging on visual cognition in a sample of 112 healthy adults aged 60-75. A test battery was designed to capture high-level measures of visual working memory and low-level measures of visuospatial attention and memory. To answer questions of how cognitive aging affects specific aspects of visual processing capacity, we used confirmatory factor analyses in Structural Equation Modeling (SEM; Model 2), informed by functional structures that were modeled with path analyses in SEM (Model 1). The results show that aging effects were selective to measures of visual processing speed compared to visual short-term memory (VSTM) capacity (Model 2). These results are consistent with some studies reporting selective aging effects on processing speed, and inconsistent with other studies reporting aging effects on both processing speed and VSTM capacity. In the discussion we argue that this discrepancy may be mediated by differences in age ranges and demographic variables. The study demonstrates that SEM is a sensitive method for detecting cognitive aging effects even within a narrow age range, and a useful approach for structuring the relationships between measured variables and the cognitive functional foundation they supposedly represent.

  18. Using structural equation modeling to detect response shifts and true change in discrete variables: an application to the items of the SF-36.

    PubMed

    Verdam, Mathilde G E; Oort, Frans J; Sprangers, Mirjam A G

    2016-06-01

    The structural equation modeling (SEM) approach for detection of response shift (Oort in Qual Life Res 14:587-598, 2005. doi: 10.1007/s11136-004-0830-y ) is especially suited for continuous data, e.g., questionnaire scales. The present objective is to explain how the SEM approach can be applied to discrete data and to illustrate response shift detection in items measuring health-related quality of life (HRQL) of cancer patients. The SEM approach for discrete data includes two stages: (1) establishing a model of underlying continuous variables that represent the observed discrete variables, (2) using these underlying continuous variables to establish a common factor model for the detection of response shift and to assess true change. The proposed SEM approach was illustrated with data of 485 cancer patients whose HRQL was measured with the SF-36, before and after start of antineoplastic treatment. Response shift effects were detected in items of the subscales mental health, physical functioning, role limitations due to physical health, and bodily pain. Recalibration response shifts indicated that patients experienced relatively fewer limitations with "bathing or dressing yourself" (effect size d = 0.51) and less "nervousness" (d = 0.30), but more "pain" (d = -0.23) and less "happiness" (d = -0.16) after antineoplastic treatment as compared to the other symptoms of the same subscale. Overall, patients' mental health improved, while their physical health, vitality, and social functioning deteriorated. No change was found for the other subscales of the SF-36. The proposed SEM approach to discrete data enables response shift detection at the item level. This will lead to a better understanding of the response shift phenomena at the item level and therefore enhances interpretation of change in the area of HRQL.

  19. Program Model Checking as a New Trend

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing like technologies and static analysis.

  20. Knowledge-based approach for generating target system specifications from a domain model

    NASA Technical Reports Server (NTRS)

    Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan

    1992-01-01

    Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.

  1. Software cost/resource modeling: Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. J.

    1980-01-01

    A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.

  2. Investigation of multiferroic behavior on flakes-like BiFeO{sub 3}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheikh, Javed R.; Gaikwad, Vishwajit M.; Acharya, Smita A., E-mail: saha275@yahoo.com

    2016-05-23

    In the present work, multiferroic BiFeO{sub 3} was synthesized by a hydrothermal route. The rhombohedral structure was confirmed by the X-ray diffraction pattern, and the data were fitted by Rietveld refinement using the FullProf software suite. SEM micrographs show a flake-like morphology. The frequency and temperature dependence of the dielectric constant and dielectric loss were studied, and an enhancement in the dielectric constant was detected. The magnetic measurements indicate the antiferromagnetic nature of BFO. The P-E curve shows a ferroelectric hysteresis loop with remanent polarization (2Pr) of 0.3518 µC/cm{sup 2}. The dielectric anomaly observed near T{sub N} can be assigned to magnetoelectric coupling, which is useful in device applications.

  3. Neural Network for Nanoscience Scanning Electron Microscope Image Recognition.

    PubMed

    Modarres, Mohammad Hadi; Aversa, Rossella; Cozzini, Stefano; Ciancio, Regina; Leto, Angelo; Brandino, Giuseppe Piero

    2017-10-16

    In this paper we applied transfer learning techniques for image recognition, automatic categorization, and labeling of nanoscience images obtained by scanning electron microscope (SEM). Roughly 20,000 SEM images were manually classified into 10 categories to form a labeled training set, which can be used as a reference set for future applications of deep learning enhanced algorithms in the nanoscience domain. The categories chosen spanned the range of 0-Dimensional (0D) objects such as particles, 1D nanowires and fibres, 2D films and coated surfaces, and 3D patterned surfaces such as pillars. The training set was used to retrain and compare several convolutional neural network models (Inception-v3, Inception-v4, ResNet) on the SEM dataset. We obtained compatible results by performing a feature extraction of the different models on the same dataset. We performed additional analysis of the classifier on a second test set to further investigate the results both on particular cases and from a statistical point of view. Our algorithm was able to successfully classify around 90% of a test dataset consisting of SEM images, while reduced accuracy was found in the case of images at the boundary between two categories or containing elements of multiple categories. In these cases, the image classification did not identify a predominant category with a high score. We used the statistical outcomes from testing to deploy a semi-automatic workflow able to classify and label images generated by the SEM. Finally, a separate training was performed to determine the volume fraction of coherently aligned nanowires in SEM images. The results were compared with what was obtained using the Local Gradient Orientation method. This example demonstrates the versatility and the potential of transfer learning to address specific tasks of interest in nanoscience applications.
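The "feature extraction" variant of transfer learning mentioned in this abstract can be sketched in a few lines: a frozen pretrained CNN maps each image to a fixed-length feature vector, and only a lightweight softmax classifier is trained on top. In this minimal sketch the CNN features are simulated with category-dependent Gaussian clusters; in practice they would come from a network such as Inception-v3 or ResNet, and the dimensions and class counts below are arbitrary assumptions.

```python
# Sketch of training a softmax classifier on top of frozen CNN features.
# The "features" here are synthetic stand-ins for pretrained-network outputs.
import numpy as np

rng = np.random.default_rng(0)
n_classes, feat_dim, n_per_class = 4, 32, 50

# Simulated "extracted features": one Gaussian cluster per image category.
centers = rng.normal(0, 3, size=(n_classes, feat_dim))
X = np.vstack([rng.normal(c, 1.0, size=(n_per_class, feat_dim)) for c in centers])
y = np.repeat(np.arange(n_classes), n_per_class)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Train only this linear layer by gradient descent; the "CNN" stays frozen.
W = np.zeros((feat_dim, n_classes))
onehot = np.eye(n_classes)[y]
for _ in range(300):
    p = softmax(X @ W)
    W -= 0.01 * X.T @ (p - onehot) / len(X)

accuracy = (softmax(X @ W).argmax(axis=1) == y).mean()
```

Because only the small matrix W is learned, this approach needs far fewer labeled images than training a full network, which is the practical appeal of transfer learning for domain-specific datasets like SEM images.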

  4. Alpha decay calculations with a new formula

    NASA Astrophysics Data System (ADS)

    Akrawy, D. T.; Poenaru, D. N.

    2017-10-01

    A new semi-empirical formula for calculations of α decay half-lives is presented. It was derived from the Royer relationship by introducing new parameters which are fixed by fit to a set of experimental data. We are using three sets: set A with 130 e-e (even-even), 119 e-o (even-odd), 109 o-e, and 96 o-o, set B with 188 e-e, 147 e-o, 131 o-e and 114 o-o, and set C with 136 e-e, 84 e-o, 76 o-e and 48 o-o alpha emitters. A comparison of results obtained with the new formula (newF) and the following well-known relationships is made in terms of the rms standard deviation: the semiempirical relationship based on fission theory (semFIS), the analytical superasymmetric fission (ASAF) model and the universal formula (UNIV). We also introduce a weighted mean value of this quantity, allowing us to compare the global properties of a given model. For set B the order of the four models is the following: semFIS, UNIV, newF and ASAF. Nevertheless, for even-even alpha emitters UNIV gives the second best result after semFIS, and for odd-even parents the second is newF. Despite its simplicity in comparison with semFIS, newF, presented in this article, behaves quite well, competing with the other well-known relationships.
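For reference, a Royer-type relationship of the kind this formula is derived from has the general form (shown as a sketch; the coefficients $a$, $b$, $c$ are fitted separately for each parity class, and the paper's exact parametrization may differ):

$$\log_{10} T_{1/2}(\mathrm{s}) \;=\; a \;+\; b\,A^{1/6}\sqrt{Z} \;+\; \frac{c\,Z}{\sqrt{Q_\alpha}},$$

where $A$ and $Z$ are the mass and charge numbers of the parent nucleus and $Q_\alpha$ is the α-decay energy.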

  5. A structural equation modeling of executive functions, IQ and mathematical skills in primary students: Differential effects on number production, mental calculus and arithmetical problems.

    PubMed

    Arán Filippetti, Vanessa; Richaud, María Cristina

    2017-10-01

    Though the relationship between executive functions (EFs) and mathematical skills has been well documented, little is known about how both EFs and IQ differentially support diverse math domains in primary students. Inconsistency of results may be due to the statistical techniques employed, specifically, whether the analysis is conducted with observed variables, i.e., regression analysis, or at the latent level, i.e., structural equation modeling (SEM). The current study explores the contribution of both EFs and IQ in mathematics through an SEM approach. A total of 118 8- to 12-year-olds were administered measures of EFs, crystallized (Gc) and fluid (Gf) intelligence, and math abilities (i.e., number production, mental calculus and arithmetical problem-solving). Confirmatory factor analysis (CFA) offered support for the three-factor solution of EFs: (1) working memory (WM), (2) shifting, and (3) inhibition. Regarding the relationship among EFs, IQ and math abilities, the results of the SEM analysis showed that (i) WM and age predict number production and mental calculus, and (ii) shifting and sex predict arithmetical problem-solving. In all of the SEM models, EFs partially or totally mediated the relationship between IQ, age and math achievement. These results suggest that EFs differentially support math abilities in primary-school children and are a more significant predictor of math achievement than IQ level.

  6. Preferences for Condomless Sex in Sexually Explicit Media Among Black/African American Men Who Have Sex with Men: Implications for HIV Prevention.

    PubMed

    Nelson, Kimberly M; Eaton, Lisa A; Gamarel, Kristi E

    2017-05-01

    Accumulating evidence suggests that viewing sexually explicit media (SEM; i.e., pornography) may be related to the sexual behaviors of men who have sex with men (MSM). Furthermore, stereotypical depictions of Black/African American MSM engaging in sexual risk behaviors in SEM may serve to normalize condomless sex, reinforce low peer norms around condom use, and facilitate HIV risk taking among Black/African American MSM. Despite this evidence, very little is known about the correlates of SEM consumption among Black/African American MSM, including HIV risk behaviors and their relation to preferences for viewing condomless sex in SEM. Participants were 653 HIV-seronegative Black-identified MSM ages 18-62 (M 33.58, SD 11.01) who completed a cross-sectional survey as a part of a HIV prevention trial in Atlanta, Georgia. Over three-quarters of the men (n = 514) reported a preference for condomless sex in SEM. In multivariate models, engaging in serodiscordant condomless sex was not significantly associated with preferences for condomless sex in SEM; however, men who self-identified as bisexual, engaged in transactional sex, and reported greater agreement with sexual risk cognitions (i.e., heat-of-the-moment thoughts about condom use) had significantly greater odds of reporting a preference for condomless sex in SEM. Study findings highlight the need for future research exploring the role of SEM in the sexual health of Black/African American MSM, including the extent to which SEM exposure alters norms and expectations about sexual behaviors among Black/African American MSM and how this might be addressed in HIV prevention programs.

  7. Preferences for condomless sex in sexually explicit media among Black/African American men who have sex with men: Implications for HIV prevention

    PubMed Central

    Nelson, Kimberly M.; Eaton, Lisa A.; Gamarel, Kristi E.

    2016-01-01

    Accumulating evidence suggests that viewing sexually explicit media (SEM; i.e., pornography) may be related to the sexual behaviors of men who have sex with men (MSM). Furthermore, stereotypical depictions of Black/African American MSM engaging in sexual risk behaviors in SEM may serve to normalize condomless sex, reinforce low peer norms around condom use, and facilitate HIV risk-taking among Black/African American MSM. Despite this evidence, very little is known about the correlates of SEM consumption among Black/African American MSM, including HIV risk behaviors and their relation to preferences for viewing condomless sex in SEM. Participants were 653 HIV-seronegative Black-identified MSM ages 18 to 62 (M = 33.58, SD = 11.01) who completed a cross-sectional survey as a part of a HIV-prevention trial in Atlanta, Georgia. Over three-quarters of the men (n = 514) reported a preference for condomless sex in SEM. In multivariate models, engaging in serodiscordant condomless sex was not significantly associated with preferences for condomless sex in SEM; however, men who self-identified as bisexual, engaged in transactional sex, and reported greater agreement with sexual risk cognitions (i.e., heat of the moment thoughts about condom use) had significantly greater odds of reporting a preference for condomless sex in SEM. Study findings highlight the need for future research exploring the role of SEM in the sexual health of Black/African American MSM, including the extent to which SEM exposure alters norms and expectations about sexual behaviors among Black/African American MSM and how this might be addressed in HIV prevention programs. PMID:27987085

  8. Software Cost-Estimation Model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    Software Cost Estimation Model SOFTCOST provides automated resource and schedule model for software development. Combines several cost models found in open literature into one comprehensive set of algorithms. Compensates for nearly fifty implementation factors relative to size of task, inherited baseline, organizational and system environment and difficulty of task.

  9. Software Program: Software Management Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  10. Young's modulus and SEM analysis of leg bones exposed to simulated microgravity by hind limb suspension (HLS)

    NASA Astrophysics Data System (ADS)

    Patel, Niravkumar D.; Mehta, Rahul; Ali, Nawab; Soulsby, Michael; Chowdhury, Parimal

    2013-04-01

    The aim of this study was to determine the composition of the leg bone tissue of rats that were exposed to simulated microgravity by hind-limb suspension (HLS) by the tail for one week. The leg bones were cross-sectioned, cleaned of soft tissues, dried and sputter coated, and then placed horizontally on the stage of a Scanning Electron Microscope (SEM) for analysis. Interaction of a 17.5 keV electron beam, incident from the vertical direction on the sample, generated images using two detectors. X-rays emitted from the sample during electron bombardment were measured with an Energy Dispersive Spectroscopy (EDS) feature of the SEM using a liquid-nitrogen-cooled Si(Li) detector with a resolution of 144 eV at 5.9 keV (25Mn Kα x-ray). Kα x-rays from carbon, oxygen, phosphorus and calcium formed the major peaks in the spectrum. Relative percentages of these elements were determined using software that also corrects for the ZAF factors, namely Z (atomic number), A (X-ray absorption) and F (characteristic fluorescence). The x-rays from the control groups and from the experimental (HLS) groups were analyzed on well-defined parts (femur, tibia and knee) of the leg bone. The SEM analysis shows that there are definite changes in the hydroxyl or phosphate group of the main component of the bone structure, hydroxyapatite [Ca10(PO4)6(OH)2], due to hind limb suspension. In a separate experiment, entire leg bones (both from HLS and control rats) were subjected to mechanical stress by means of a variable force. The stress vs. strain graph was fitted with linear and polynomial functions, and the parameters reflecting the mechanical strength of the bone under increasing stress were calculated. From the slope of the linear part of the graph, the Young's modulus for HLS bones was calculated and found to be 2.49 times smaller than that for control bones.
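Extracting Young's modulus from the linear part of a stress-strain curve, as described above, amounts to a linear fit over the elastic region. The sketch below uses synthetic data (an assumed modulus of 20 GPa with a quadratic softening term), not the study's measurements.

```python
# Sketch: Young's modulus as the slope of the initial linear region
# of a stress-strain curve. All numbers are illustrative assumptions.
import numpy as np

E_true = 20e9  # Pa; hypothetical modulus, not a value from the study
strain = np.linspace(0, 0.01, 50)
stress = E_true * strain - 5e11 * strain**2  # softening at larger strain

# Fit only the initial, approximately linear region; its slope is E.
linear = strain < 0.002
E_fit, intercept = np.polyfit(strain[linear], stress[linear], 1)
```

The key design choice is restricting the fit window: including the nonlinear (plastic or softening) part of the curve biases the slope, so the modulus is taken only from the small-strain region.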

  11. Using the Partial Credit Model to Evaluate the Student Engagement in Mathematics Scale

    ERIC Educational Resources Information Center

    Leis, Micela; Schmidt, Karen M.; Rimm-Kaufman, Sara E.

    2015-01-01

    The Student Engagement in Mathematics Scale (SEMS) is a self-report measure that was created to assess three dimensions of student engagement (social, emotional, and cognitive) in mathematics based on a single day of class. In the current study, the SEMS was administered to a sample of 360 fifth graders from a large Mid-Atlantic district. The…

  12. Effects of Two Instructional Approaches on Skill Development, Knowledge, and Game Performance

    ERIC Educational Resources Information Center

    Pritchard, Tony; Hawkins, Andrew; Wiegand, Robert; Metzler, Jonathan N.

    2008-01-01

    Two instructional approaches that have been of interest in promoting sport have been the Sport Education Model (SEM) and the Traditional Style (TS) of teaching physical education. The purpose of this study was to investigate how SEM and TS would affect skill development, knowledge, and game performance for volleyball at the secondary level. A 2 x…

  13. SEM with Missing Data and Unknown Population Distributions Using Two-Stage ML: Theory and Its Application

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Lu, Laura

    2008-01-01

    This article provides the theory and application of the 2-stage maximum likelihood (ML) procedure for structural equation modeling (SEM) with missing data. The validity of this procedure does not require the assumption of a normally distributed population. When the population is normally distributed and all missing data are missing at random…

  14. The Structure of Memory in Infants and Toddlers: An SEM Study with Full-Terms and Preterms

    ERIC Educational Resources Information Center

    Rose, Susan A.; Feldman, Judith F.; Jankowski, Jeffery J.; Van Rossem, Ronan

    2011-01-01

    There is considerable dispute about the nature of infant memory. Using SEM models, we examined whether popular characterizations of the structure of adult memory, including the two-process theory of recognition, are applicable in the infant and toddler years. The participants were a cohort of preterms and full-terms assessed longitudinally--at 1,…

  15. Symposium N: Materials and Devices for Thermal-to-Electric Energy Conversion

    DTIC Science & Technology

    2010-08-24

    X-ray diffraction, transmission electron microscopy, scanning electron microscopy, and dynamic light scattering. Thermal conductivity measurements... SEM), X-ray diffraction (XRD) measurements as well as Raman spectroscopy. The results from these techniques indicate a clear modification... was examined by using a scanning electron microscope (SEM; HITACHI S-4500 model) attached with an energy dispersive X-ray spectroscopy. The electrical

  16. Empirical studies of software design: Implications for SSEs

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.

  17. Software Metrics

    DTIC Science & Technology

    1988-12-01

    software development scene is often characterized by: schedule and cost estimates that are grossly inaccurate... c. SPQR Model (Jones); d. COPMO (Thebaut)... time (in seconds) is simply derived from E by dividing by the Stroud number, S: T = E/S... T. Capers Jones has developed a software cost estimation model called the Software Productivity, Quality, and Reliability (SPQR) model. The basic approach is similar to that of Boehm's. The value

  18. Studying the Accuracy of Software Process Elicitation: The User Articulated Model

    ERIC Educational Resources Information Center

    Crabtree, Carlton A.

    2010-01-01

    Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…

  19. Overview of the TriBITS Lifecycle Model: Lean/Agile Software Lifecycle Model for Research-based Computational Science and Engineering Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  20. A conceptual model for megaprogramming

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

    Megaprogramming is component-based software engineering and life-cycle management. Megaprogramming and its relationship to other research initiatives (common prototyping system/common prototyping language, domain specific software architectures, and software understanding) are analyzed. The desirable attributes of megaprogramming software components are identified, and a software development model and resulting prototype megaprogramming system (library interconnection language extended by annotated Ada) are described.

  1. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
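The parametric form described in this record can be sketched in a few lines. The coefficients, exponent, and cost-driver values below are illustrative assumptions in the style of COCOMO-family estimators, not the DSN model's calibrated parameters.

```python
# Hypothetical sketch of a parametric cost estimator: effort scales with
# task magnitude and is adjusted by multipliers derived from questionnaire
# responses. All numeric constants here are illustrative only.

def estimate_effort(ksloc, multipliers, a=2.8, b=1.05):
    """Effort (person-months) = a * KSLOC^b * product of cost drivers."""
    effort = a * ksloc ** b
    for m in multipliers:
        effort *= m
    return effort

# Example: a 10 KSLOC task in a difficult environment (driver 1.2) with
# experienced staff (driver 0.9).
effort = estimate_effort(10.0, [1.2, 0.9])
```

The resulting effort figure would then scale a Work Breakdown Structure skeleton, as the record describes.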

  2. Consistent Evolution of Software Artifacts and Non-Functional Models

    DTIC Science & Technology

    2014-11-14

Subject terms: EOARD, Nano particles, Photo-Acoustic Sensors, Model-Driven Engineering (MDE), Software Performance Engineering (SPE), Change Propagation, Performance Antipatterns. Contact: Vittorio Cortellessa, Università degli Studi dell'Aquila, Via Vetoio, 67100 L'Aquila, Italy; email: vittorio.cortellessa@univaq.it; web: http://www.di.univaq.it/cortelle/

  3. THE EPA MULTIMEDIA INTEGRATED MODELING SYSTEM SOFTWARE SUITE

    EPA Science Inventory

    The U.S. EPA is developing a Multimedia Integrated Modeling System (MIMS) framework that will provide a software infrastructure or environment to support constructing, composing, executing, and evaluating complex modeling studies. The framework will include (1) common software ...

  4. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  5. Development and application of new quality model for software projects.

    PubMed

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  6. Improving contact layer patterning using SEM contour based etch model

    NASA Astrophysics Data System (ADS)

    Weisbuch, François; Lutich, Andrey; Schatz, Jirka; Hertzsch, Tino; Moll, Hans-Peter

    2016-10-01

The patterning of the contact layer is modulated by strong etch effects that are highly dependent on the geometry of the contacts. Such litho-etch biases need to be corrected to ensure good pattern fidelity. However, aggressive designs contain complex shapes that can hardly be compensated with an etch bias table and are difficult to characterize with standard CD metrology. In this work we propose to implement a model-based etch compensation method able to deal with any contact configuration. With the help of SEM contours, it was possible to obtain reliable 2D measurements that are particularly helpful for calibrating the etch model. The selection of calibration structures was optimized in combination with the model form to achieve an overall RMS error of 3 nm, allowing the implementation of the model in production.

  7. Theoretical and measured electric field distributions within an annular phased array: consideration of source antennas.

    PubMed

    Zhang, Y; Joines, W T; Jirtle, R L; Samulski, T V

    1993-08-01

    The magnitude of E-field patterns generated by an annular array prototype device has been calculated and measured. Two models were used to describe the radiating sources: a simple linear dipole and a stripline antenna model. The stripline model includes detailed geometry of the actual antennas used in the prototype and an estimate of the antenna current based on microstrip transmission line theory. This more detailed model yields better agreement with the measured field patterns, reducing the rms discrepancy by a factor of about 6 (from approximately 23 to 4%) in the central region of interest where the SEM is within 25% of the maximum. We conclude that accurate modeling of source current distributions is important for determining SEM distributions associated with such heating devices.

  8. Predicting physical activity and fruit and vegetable intake in adolescents: a test of the information, motivation, behavioral skills model.

    PubMed

    Kelly, Stephanie; Melnyk, Bernadette Mazurek; Belyea, Michael

    2012-04-01

    Most adolescents do not meet national recommendations regarding physical activity and/or the intake of fruits and vegetables. The purpose of this study was to explore whether variables in the information, motivation, behavioral skills (IMB) model of health promotion predicted physical activity and fruit and vegetable intake in 404 adolescents from 2 high schools in the Southwest United States using structural equation modeling (SEM). The SEM models included theoretical constructs, contextual variables, and moderators. The theoretical relationships in the IMB model were confirmed and were moderated by gender and race. Interventions that incorporate cognitive-behavioral skills building may be a key factor for promoting physical activity as well as fruit and vegetable intake in adolescents. Copyright © 2012 Wiley Periodicals, Inc.

  9. Visualization Skills: A Prerequisite to Advanced Solid Modeling

    ERIC Educational Resources Information Center

    Gow, George

    2007-01-01

    Many educators believe that solid modeling software has made teaching two- and three-dimensional visualization skills obsolete. They claim that the visual tools built into the solid modeling software serve as a replacement for the CAD operator's personal visualization skills. They also claim that because solid modeling software can produce…

  10. Software engineering the mixed model for genome-wide association studies on large samples.

    PubMed

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
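The core computation the record reviews, testing a marker with generalized least squares once variance components and kinship are in hand, can be sketched as follows. This is a toy illustration in plain NumPy, not the API of any of the packages evaluated; the variance components are assumed known rather than estimated.

```python
import numpy as np

# Minimal sketch of the GLS step at the heart of mixed-model association:
# each marker is tested under the covariance V = sg*K + se*I, where K is
# the kinship matrix and (sg, se) are the variance components.

def gls_beta(y, X, K, sg, se):
    """GLS estimate of fixed effects given kinship K and variance components."""
    V = sg * K + se * np.eye(len(y))
    Vi = np.linalg.inv(V)
    return np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)

rng = np.random.default_rng(0)
n = 50
K = np.eye(n)                              # unrelated individuals in this toy case
g = rng.integers(0, 3, n).astype(float)    # marker genotypes coded 0/1/2
X = np.column_stack([np.ones(n), g])       # intercept + marker
y = 1.0 + 0.5 * g + rng.normal(0, 0.1, n)  # phenotype with true effect 0.5
beta = gls_beta(y, X, K, sg=0.0, se=1.0)   # reduces to OLS when sg = 0
```

With a non-trivial K and positive sg, the same function downweights correlated individuals, which is what protects against stratification and relatedness.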

  11. Are Systemic Manifestations Ascribable to COPD in Smokers? A Structural Equation Modeling Approach.

    PubMed

    Boyer, Laurent; Bastuji-Garin, Sylvie; Chouaid, Christos; Housset, Bruno; Le Corvoisier, Philippe; Derumeaux, Geneviève; Boczkowski, Jorge; Maitre, Bernard; Adnot, Serge; Audureau, Etienne

    2018-06-05

Whether the systemic manifestations observed in Chronic Obstructive Pulmonary Disease (COPD) are ascribable to lung dysfunction or to direct effects of smoking remains in debate. Structural Equation Modeling (SEM), a causally oriented statistical approach, could help unravel the pathways involved by enabling estimation of direct and indirect associations between variables. The objective of the study was to investigate the relative impact of smoking and COPD on systemic manifestations, inflammation, and telomere length. In 292 individuals (103 women; 97 smokers with COPD, 96 smokers without COPD, 99 non-smokers), we used SEM to explore the pathways between smoking (pack-years), lung disease (FEV1, KCO), and the following parameters: arterial stiffness (aortic pulse wave velocity, PWV), bone mineral density (BMD), appendicular skeletal muscle mass (ASMM), grip strength, insulin resistance (HOMA-IR), creatinine clearance (eGFR), blood leukocyte telomere length, and inflammatory markers (Luminex assay). All models were adjusted for age and gender. Latent variables were created for systemic inflammation (inflammatory markers) and musculoskeletal parameters (ASMM, grip strength, BMD). SEM showed that most effects of smoking were indirectly mediated by lung dysfunction: e.g., via FEV1 on the musculoskeletal factor, eGFR, HOMA-IR, PWV, telomere length, CRP, white blood cell count (WBC), and the inflammation factor, and via KCO on the musculoskeletal factor, eGFR, and PWV. Direct effects of smoking were limited to CRP and WBC. Models had excellent fit. In conclusion, SEM highlighted the major role of COPD in the occurrence of systemic manifestations, while smoking effects were mostly mediated by lung function.
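The direct-versus-indirect distinction at the center of this record is the product-of-coefficients idea from path analysis. The toy sketch below illustrates it with synthetic data and placeholder variable names (smoking, fev1, outcome); it does not reproduce the study's model or data.

```python
import numpy as np

# Illustration of decomposing a total effect into direct and mediated parts,
# the basic mechanism behind the SEM pathway analysis described above.

def ols_slope(x, y):
    """OLS slope of y on x (both centered)."""
    x = x - x.mean()
    y = y - y.mean()
    return float(x @ y / (x @ x))

rng = np.random.default_rng(1)
n = 2000
smoking = rng.normal(size=n)
fev1 = -0.8 * smoking + rng.normal(scale=0.1, size=n)  # lung function
outcome = 0.5 * fev1 + rng.normal(scale=0.1, size=n)   # purely mediated effect

a = ols_slope(smoking, fev1)        # path smoking -> FEV1
b = ols_slope(fev1, outcome)        # path FEV1 -> outcome
total = ols_slope(smoking, outcome)
indirect = a * b                    # matches the total effect when no direct path exists
```

When a direct path is present, the difference between `total` and `indirect` estimates it, which is how a fitted SEM attributes systemic effects to smoking versus lung function.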

  12. Novel characteristics of traction force in biliary self-expandable metallic stents.

    PubMed

    Hori, Yasuki; Hayashi, Kazuki; Yoshida, Michihiro; Naitoh, Itaru; Ban, Tesshin; Miyabe, Katsuyuki; Kondo, Hiromu; Nishi, Yuji; Umemura, Shuichiro; Fujita, Yasuaki; Natsume, Makoto; Kato, Akihisa; Ohara, Hirotaka; Joh, Takashi

    2017-05-01

    In recent years, knowledge concerning the mechanical properties of self-expandable metallic stents (SEMS) has increased. In a previous study, we defined traction force and traction momentum and reported that these characteristics are important for optimal stent deployment. However, traction force and traction momentum were represented as relative values and were not evaluated in various conditions. The purpose of the present study was to measure traction force in various situations assumed during SEMS placement. Traction force and traction momentum were measured in non-stricture, stricture, and angled stricture models using in-house equipment. Stricture and angled stricture models had significantly higher traction force and traction momentum than those of the non-stricture model (stricture vs non-stricture: traction force, 7.2 N vs 1.4 N, P < 0.001; traction momentum, 237.8 Ns vs 62.3 Ns, P = 0.001; angled stricture vs non-stricture: traction force, 7.4 N vs 1.4 N, P < 0.001; traction momentum, 307.2 Ns vs 62.3 Ns, P < 0.001). Traction force was variable during SEMS placement and was categorized into five different stages, which were similar in both the stricture and angled stricture models. We measured traction force and traction momentum under simulated clinical conditions and demonstrated that strictures and the angular positioning of the stent influenced the traction force. Clinicians should be aware of the transition of the traction force and should schedule X-ray imaging during SEMS placement. © 2017 Japan Gastroenterological Endoscopy Society.

  13. Development of a calibrated software reliability model for flight and supporting ground software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1991-01-01

The objective of this project was to develop and calibrate quantitative models for predicting the quality of software. Reliable flight and supporting ground software is a highly important factor in the successful operation of the space shuttle program. The models used in the present study came from SMERFS (Statistical Modeling and Estimation of Reliability Functions for Software), which contains ten models. For a first run, the results obtained in modeling the cumulative number of failures versus execution time were fairly good for our data. Plots of cumulative software failures versus calendar weeks were made, and the model results were compared with the historical data on the same graph. If a model agrees with actual historical behavior for a set of data, then there is confidence in future predictions for that data. Considering the quality of the data, the models have given some significant results, even at this early stage. With better care in data collection, data analysis, recording of the fixing of failures, and CPU execution times, the models should prove extremely helpful in making predictions regarding the future pattern of failures, including an estimate of the number of errors remaining in the software and the additional testing time required for the software quality to reach acceptable levels. It appears that there is no one 'best' model for all cases; it is for this reason that the aim of this project was to test several models. One of the recommendations resulting from this study is that great care must be taken in the collection of data. When using a model, the data should satisfy the model's assumptions.

  14. Revealing the ISO/IEC 9126-1 Clique Tree for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2007-01-01

    Previous research has shown that acyclic dependency models, if they exist, can be extracted from software quality standards and that these models can be used to assess software safety and product quality. In the case of commercial off-the-shelf (COTS) software, the extracted dependency model can be used in a probabilistic Bayesian network context for COTS software evaluation. Furthermore, while experts typically employ Bayesian networks to encode domain knowledge, secondary structures (clique trees) from Bayesian network graphs can be used to determine the probabilistic distribution of any software variable (attribute) using any clique that contains that variable. Secondary structures, therefore, provide insight into the fundamental nature of graphical networks. This paper will apply secondary structure calculations to reveal the clique tree of the acyclic dependency model extracted from the ISO/IEC 9126-1 software quality standard. Suggestions will be provided to describe how the clique tree may be exploited to aid efficient transformation of an evaluation model.

  15. Collected software engineering papers, volume 9

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.

  16. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

  17. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative "Reducing Software Security Risk (RSSR) Through an Integrated Approach" offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that automates the mechanical portions of the analysis process. This paper discusses: the need for formal analysis to assure software systems with respect to security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties of the Secure Socket Layer (SSL) communication protocol as a demonstration.

  18. Software-defined Quantum Networking Ecosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Sadlier, Ronald

The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in Python that make use of the software library interfaces. A user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.

  19. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  20. Supervised Learning Based Hypothesis Generation from Biomedical Literature.

    PubMed

    Sang, Shengtian; Yang, Zhihao; Li, Zongyao; Lin, Hongfei

    2015-01-01

Nowadays, the biomedical literature is growing at an explosive rate, and much useful knowledge remains undiscovered in it. Researchers can form biomedical hypotheses by mining this literature. In this paper, we propose a supervised learning based approach to generate hypotheses from biomedical literature. This approach splits the traditional processing of hypothesis generation with the classic ABC model into an AB model and a BC model, each constructed with supervised learning methods. Compared with concept co-occurrence and grammar engineering-based approaches such as SemRep, machine learning based models usually achieve better performance in information extraction (IE) from texts. By combining the two models, the approach then reconstructs the ABC model and generates biomedical hypotheses from the literature. The experimental results on the three classic Swanson hypotheses show that our approach outperforms the SemRep system.
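The classic ABC co-occurrence idea that this record's supervised models build on can be sketched very compactly: if term A co-occurs with B, and B with C, then A-C is a candidate hypothesis. The three-document "literature" below is invented for illustration (it echoes Swanson's fish oil/Raynaud example), and the function is a naive sketch, not the paper's method.

```python
# Naive ABC-model sketch: propose C terms reachable from a starting A term
# through a shared intermediate B term in a tiny toy corpus.

abstracts = [
    {"fish oil", "blood viscosity"},
    {"blood viscosity", "raynaud"},
    {"aspirin", "headache"},
]

def abc_candidates(a_term, docs):
    """Return candidate C terms linked to a_term via some shared B term."""
    # B terms: everything co-occurring with A.
    b_terms = set().union(*(d for d in docs if a_term in d)) - {a_term}
    cands = set()
    for d in docs:
        if d & b_terms:                      # document mentions a B term
            cands |= d - b_terms - {a_term}  # its other terms are C candidates
    return cands

hyps = abc_candidates("fish oil", abstracts)
```

The supervised AB and BC models described in the record replace this raw co-occurrence test with learned classifiers for each link.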

  1. Idea Paper: The Lifecycle of Software for Scientific Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubey, Anshu; McInnes, Lois C.

The software lifecycle is a well-researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.

  2. Spatio-Temporal Change Modeling of Lulc: a Semantic Kriging Approach

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, S.; Ghosh, S. K.

    2015-07-01

Spatio-temporal land-use/land-cover (LULC) change modeling is important for forecasting the future LULC distribution, which may facilitate natural resource management, urban planning, etc. The spatio-temporal change in LULC trend often exhibits non-linear behavior due to various dynamic factors, such as human intervention (e.g., urbanization) and environmental factors. Hence, proper forecasting of LULC distribution should involve the study and trend modeling of historical data. The existing literature reports that meteorological attributes (e.g., NDVI, LST, MSI) are semantically related to the terrain. Being influenced by terrestrial dynamics, the temporal changes of these attributes depend on the LULC properties. Hence, incorporating meteorological knowledge into the temporal prediction process may help in developing an accurate forecasting model. This work attempts to study the change in inter-annual LULC pattern and the distribution of different meteorological attributes of a region in Kolkata (a metropolitan city in India) during the years 2000-2010 and to forecast the future spread of LULC using a semantic kriging (SemK) approach. A new variant of time-series SemK, namely Rev-SemKts, is proposed to capture the multivariate semantic associations between different attributes. From empirical analysis, it may be observed that the augmentation of semantic knowledge in spatio-temporal modeling of meteorological attributes facilitates more precise forecasting of the LULC pattern.
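The geostatistical base that SemK extends with semantic knowledge is ordinary kriging. The sketch below shows the bare-bones 1D version; the exponential covariance and its parameters are illustrative choices, not those of the paper.

```python
import numpy as np

# Bare-bones ordinary kriging: solve the kriging system (with a Lagrange
# multiplier enforcing weights that sum to 1) and predict at a new location.

def kriging_predict(xs, zs, x0, sill=1.0, length=1.0):
    """Ordinary kriging prediction at x0 from observations (xs, zs)."""
    n = len(xs)
    cov = lambda h: sill * np.exp(-np.abs(h) / length)  # exponential covariance
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(xs[:, None] - xs[None, :])
    A[n, n] = 0.0                      # Lagrange-multiplier corner
    b = np.ones(n + 1)
    b[:n] = cov(xs - x0)
    w = np.linalg.solve(A, b)[:n]      # kriging weights
    return float(w @ zs)

xs = np.array([0.0, 1.0, 2.0])
zs = np.array([10.0, 12.0, 14.0])
z_hat = kriging_predict(xs, zs, 1.0)   # kriging interpolates exactly at data points
```

SemK-style approaches would additionally weight the covariance by semantic similarity between LULC classes; here the covariance is purely spatial.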

  3. Assessing pollution in a Mediterranean lagoon using acid volatile sulfides and estimations of simultaneously extracted metals.

    PubMed

    Zaaboub, Noureddine; Helali, Mohamed Amine; Martins, Maria Virgínia Alves; Ennouri, Rym; Béjaoui, Béchir; da Silva, Eduardo Ferreira; El Bour, Monia; Aleya, Lotfi

    2016-11-01

Bizerte Lagoon is a southern Mediterranean semi-enclosed lagoon with a maximum depth of 12 m. After assessing sediment quality, the authors report on the physicochemical characteristics of the lagoon's surface sediment using SEM (simultaneously extracted metals) and AVS (acid volatile sulfides) as proxies. Biogeochemical tools are used to investigate the environmental disturbance at the water-sediment interface by means of SEM and AVS to draw conclusions concerning the study area's pollution status. Results confirm the accumulation of trace elements in sediment. The use of the SEM-AVS model with organic matter in sediment (ƒOC) confirms the possible bioavailability of accumulated trace elements, especially Zn, in the southern part of the lagoon, with organic matter playing an important role in correcting the SEM excess and confirming a nontoxic total-metal sediment state. The toxicity of individual trace elements depends on the bioavailable SEM fraction in the sediment, as does the influence of lagoon inflow from southern water sources on element bioavailability. Appropriate management strategies are highly recommended to mitigate any potential harmful health effects from this heavy-metal pollution.

  4. Software development predictors, error analysis, reliability models and software metric analysis

    NASA Technical Reports Server (NTRS)

    Basili, Victor

    1983-01-01

    The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.

  5. School Emphasis on Academic Success: Exploring Changes in Science Performance in Norway between 2007 and 2011 Employing Two-Level SEM

    ERIC Educational Resources Information Center

    Nilsen, Trude; Gustafsson, Jan-Eric

    2014-01-01

    We study whether changes in school emphasis on academic success (SEAS) and safe schools (SAFE) may explain the increased science performance in Norway between TIMSS 2007 and 2011. Two-level structural equation modelling (SEM) of merged TIMSS data was used to investigate whether changes in levels of SEAS and SAFE mediate the changes in science…

  6. Construction of a Virtual Scanning Electron Microscope (VSEM)

    NASA Technical Reports Server (NTRS)

    Fried, Glenn; Grosser, Benjamin

    2004-01-01

The Imaging Technology Group (ITG) proposed to develop a Virtual SEM (VSEM) application and supporting materials as the first installed instrument in NASA's Virtual Laboratory Project. The instrument was to be a simulator modeled after an existing SEM, and was to mimic that real instrument as closely as possible. Virtual samples would be developed and provided along with the instrument, which would be written in Java.

  7. Resveratrol given intraperitoneally does not inhibit growth of high-risk t(4;11) acute lymphoblastic leukemia cells in NOD/SCID mouse model

    USDA-ARS?s Scientific Manuscript database

    The efficacy of the phytochemical resveratrol as a preventive agent against the growth of t(4;11) acute lymphoblastic leukemia (ALL) was evaluated in NOD.CB17-Prkdcscid/J mice engrafted with the human t(4;11) ALL line SEM. SEM cells were injected into the tail vein and engraftment was monitored by ...

  8. Is the ML Chi-Square Ever Robust to Nonnormality? A Cautionary Note with Missing Data

    ERIC Educational Resources Information Center

    Savalei, Victoria

    2008-01-01

    Normal theory maximum likelihood (ML) is by far the most popular estimation and testing method used in structural equation modeling (SEM), and it is the default in most SEM programs. Even though this approach assumes multivariate normality of the data, its use can be justified on the grounds that it is fairly robust to the violations of the…

  9. Dry Snow Metamorphism

    DTIC Science & Technology

    2012-09-19

The behavior of snow during metamorphism and grain sintering was studied using mathematical models. The approach involved the collection and examination of both types of specimens at various stages of metamorphism using the SEM and micro-CT. High-resolution images and X-ray spectra of snow specimens at various metamorphism stages were obtained using an SEM and EDS.

  10. Resource utilization during software development

    NASA Technical Reports Server (NTRS)

    Zelkowitz, Marvin V.

    1988-01-01

This paper discusses resource utilization over the life cycle of software development and the role that the current 'waterfall' model plays in the actual software life cycle. Software production in the NASA environment was analyzed to measure these differences. Data from 13 different projects were collected by the Software Engineering Laboratory at NASA Goddard Space Flight Center and analyzed for similarities and differences. The results indicate that the waterfall model is not very realistic in practice, and that as technology introduces further perturbations to this model with concepts like executable specifications, rapid prototyping, and wide-spectrum languages, we need to modify our model of this process.

  11. Method variation in the impact of missing data on response shift detection.

    PubMed

    Schwartz, Carolyn E; Sajobi, Tolulope T; Verdam, Mathilde G E; Sebille, Veronique; Lix, Lisa M; Guilleux, Alice; Sprangers, Mirjam A G

    2015-03-01

    Missing data due to attrition or item non-response can result in biased estimates and loss of power in longitudinal quality-of-life (QOL) research. The impact of missing data on response shift (RS) detection is relatively unknown. This overview article synthesizes the findings of three methods tested in this special section regarding the impact of missing data patterns on RS detection in incomplete longitudinal data. The RS detection methods investigated include: (1) Relative importance analysis to detect reprioritization RS in stroke caregivers; (2) Oort's structural equation modeling (SEM) to detect recalibration, reprioritization, and reconceptualization RS in cancer patients; and (3) Rasch-based item-response theory (IRT) models as compared to SEM models to detect recalibration and reprioritization RS in hospitalized chronic disease patients. Each method dealt with missing data differently, either with imputation (1), attrition-based multi-group analysis (2), or probabilistic analysis that is robust to missingness due to the specific objectivity property (3). Relative importance analyses were sensitive to the type and amount of missing data and imputation method, with multiple imputation showing the largest RS effects. The attrition-based multi-group SEM revealed differential effects of both the changes in health-related QOL and the occurrence of response shift by attrition stratum, and enabled a more complete interpretation of findings. The IRT RS algorithm found evidence of small recalibration and reprioritization effects in General Health, whereas SEM mostly evidenced small recalibration effects. These differences may be due to differences between the two methods in handling of missing data. Missing data imputation techniques result in different conclusions about the presence of reprioritization RS using the relative importance method, while the attrition-based SEM approach highlighted different recalibration and reprioritization RS effects by attrition group. The IRT analyses detected more recalibration and reprioritization RS effects than SEM, presumably due to IRT's robustness to missing data. Future research should apply simulation techniques in order to make conclusive statements about the impacts of missing data according to the type and amount of RS.
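
    The direction of attrition bias discussed above can be sketched with a toy simulation (all numbers hypothetical): when sicker patients drop out before follow-up, a complete-case (listwise deletion) estimate of the follow-up score is biased upward.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical longitudinal QOL scores with attrition: patients with low
# baseline scores (sicker) drop out before the follow-up assessment.
baseline = rng.normal(60, 10, 200)
follow_up = baseline + rng.normal(-2, 5, 200)
observed = baseline >= 55                    # illustrative attrition rule

complete_case = follow_up[observed].mean()   # listwise deletion
true_mean = follow_up.mean()                 # what complete data would give
print(complete_case > true_mean)             # True: attrition biases upward
```

    Imputation methods differ precisely in how well they recover `true_mean` under such non-random missingness, which is why the three approaches above can reach different conclusions.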

  12. Generalized Multilevel Structural Equation Modeling

    ERIC Educational Resources Information Center

    Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew

    2004-01-01

    A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…

  13. Mechanical Sensing with Flexible Metallic Nanowires

    NASA Astrophysics Data System (ADS)

    Dobrokhotov, Vladimir; Yazdanpanah, Mehdi; Pabba, Santosh; Safir, Abdelilah; Cohn, Robert

    2008-03-01

    A calibrated method of force sensing is demonstrated in which the buckled shape of a long flexible metallic nanowire is interpreted to determine the applied force. Using a nanomanipulator the nanowire is buckled in the chamber of a scanning electron microscope (SEM) and the buckled shapes are recorded in SEM images. Force is determined as a function of deflection for an assumed elastic modulus by fitting the shapes using the generalized elastica model. In this calibration the elastic modulus was determined using an auxiliary AFM measurement, with the needle in the same orientation as in the SEM. Following this calibration the needle was used as a sensor in a different orientation than the AFM coordinates to deflect a suspended PLLA polymer fiber from which the elastic modulus (2.96 GPa) was determined. In this study the same needle remained rigidly secured to the AFM cantilever throughout the entire SEM/AFM calibration procedure and the characterization of the nanofiber.
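
    As a rough order-of-magnitude check on such measurements, the classical Euler formula gives the critical buckling load of a pinned cylindrical wire. This is only a first-order sketch (the study fits the full post-buckling elastica shape), and the modulus and dimensions below are hypothetical.

```python
import math

def euler_critical_force(E, d, L, K=1.0):
    """Euler buckling load of a pinned cylindrical wire:
    F_c = pi^2 * E * I / (K * L)^2, with I = pi * d^4 / 64."""
    I = math.pi * d**4 / 64.0                # second moment of area (m^4)
    return math.pi**2 * E * I / (K * L)**2

# Hypothetical wire: E = 75 GPa, diameter 100 nm, length 10 um
F_c = euler_critical_force(75e9, 100e-9, 10e-6)
print(f"{F_c * 1e9:.1f} nN")                 # 36.3 nN, i.e. tens of nanonewtons
```

    The nanonewton force scale explains why an SEM-visible buckled shape, rather than a direct force gauge, is a practical readout.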

  14. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    NASA Astrophysics Data System (ADS)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of the mandible is important for pre-operative planning, diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using commercial Materialise Mimics software and open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512×512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and the 3D models of the mandible were reconstructed using both commercial Materialise Mimics and open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. Both models were compared using the Wilcoxon signed-rank test and the Hausdorff distance. No significant differences were obtained between the 3D models of the mandible produced using Mimics and MITK software. The 3D model of the mandible produced using the open-source MITK software is comparable to that from the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise the operational cost.
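
    The two comparison measures used in the study, the Hausdorff distance and the Wilcoxon signed-rank test, can be sketched with SciPy; the point clouds and paired landmark values below are synthetic stand-ins for the exported STL data.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)

# Stand-in vertex clouds for the two reconstructed mandible surfaces
mimics_pts = rng.random((500, 3))
mitk_pts = mimics_pts + rng.normal(0, 0.001, (500, 3))  # nearly identical model

# Symmetric Hausdorff distance = max of the two directed distances
hd = max(directed_hausdorff(mimics_pts, mitk_pts)[0],
         directed_hausdorff(mitk_pts, mimics_pts)[0])

# Wilcoxon signed-rank test on paired morphometric measurements
m1 = rng.normal(50.0, 5.0, 20)      # e.g. landmark distances, model 1
m2 = m1 + rng.normal(0.0, 0.1, 20)  # same landmarks, model 2
stat, p = wilcoxon(m1, m2)
print(hd, p)
```

    A small Hausdorff distance together with a non-significant signed-rank p-value is the pattern that would support the paper's "no significant differences" conclusion.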

  15. 3D modeling based on CityEngine

    NASA Astrophysics Data System (ADS)

    Jia, Guangyin; Liao, Kaiju

    2017-03-01

    Currently, there are many 3D modeling software packages, like 3DMAX, AUTOCAD, and the more popular BIM software represented by REVIT. The CityEngine modeling software introduced in this paper can fully utilize existing GIS data and combine them with other built models to perform 3D modeling of the internal and external parts of buildings in a rapid and batch manner, so as to improve 3D modeling efficiency.

  16. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or use as checks for the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.

  17. Effect of Autoclave Cycles on Surface Characteristics of S-File Evaluated by Scanning Electron Microscopy.

    PubMed

    Razavian, Hamid; Iranmanesh, Pedram; Mojtahedi, Hamid; Nazeri, Rahman

    2016-01-01

    Presence of surface defects in endodontic instruments can lead to unwanted complications such as instrument fracture and incomplete preparation of the canal. The current study was conducted to evaluate the effect of autoclave cycles on the surface characteristics of the S-File by scanning electron microscopy (SEM). In this experimental study, 17 brand new S-Files (#30) were used. The surface characteristics of the files were examined in four steps (without autoclave, 1 autoclave cycle, 5 autoclave cycles and 10 autoclave cycles) by SEM under 200× and 1000× magnifications. Data were analyzed using the SPSS software and the paired sample t-test, independent sample t-test and multifactorial repeated measures ANOVA. The level of significance was set at 0.05. New files had debris and pitting on their surfaces. As the number of autoclave cycles increased, the mean surface roughness also increased at both magnifications (P<0.05). Moreover, under 1000× magnification the multifactorial repeated measures ANOVA showed greater surface roughness (P<0.001). Sterilization by autoclave increased the surface roughness of the files, and this increase was directly related to the number of autoclave cycles.

  18. Effect of Autoclave Cycles on Surface Characteristics of S-File Evaluated by Scanning Electron Microscopy

    PubMed Central

    Razavian, Hamid; Iranmanesh, Pedram; Mojtahedi, Hamid; Nazeri, Rahman

    2016-01-01

    Introduction: Presence of surface defects in endodontic instruments can lead to unwanted complications such as instrument fracture and incomplete preparation of the canal. The current study was conducted to evaluate the effect of autoclave cycles on the surface characteristics of the S-File by scanning electron microscopy (SEM). Methods and Materials: In this experimental study, 17 brand new S-Files (#30) were used. The surface characteristics of the files were examined in four steps (without autoclave, 1 autoclave cycle, 5 autoclave cycles and 10 autoclave cycles) by SEM under 200× and 1000× magnifications. Data were analyzed using the SPSS software and the paired sample t-test, independent sample t-test and multifactorial repeated measures ANOVA. The level of significance was set at 0.05. Results: New files had debris and pitting on their surfaces. As the number of autoclave cycles increased, the mean surface roughness also increased at both magnifications (P<0.05). Moreover, under 1000× magnification the multifactorial repeated measures ANOVA showed greater surface roughness (P<0.001). Conclusion: Sterilization by autoclave increased the surface roughness of the files, and this increase was directly related to the number of autoclave cycles. PMID:26843874

  19. Characterization and extraction of the synaptic apposition surface for synaptic geometry analysis

    PubMed Central

    Morales, Juan; Rodríguez, Angel; Rodríguez, José-Rodrigo; DeFelipe, Javier; Merchán-Pérez, Angel

    2013-01-01

    Geometrical features of chemical synapses are relevant to their function. Two critical components of the synaptic junction are the active zone (AZ) and the postsynaptic density (PSD), as they are related to the probability of synaptic release and the number of postsynaptic receptors, respectively. Morphological studies of these structures are greatly facilitated by the use of recent electron microscopy techniques, such as combined focused ion beam milling and scanning electron microscopy (FIB/SEM), and software tools that permit reconstruction of large numbers of synapses in three dimensions. Since the AZ and the PSD are in close apposition and have a similar surface area, they can be represented by a single surface—the synaptic apposition surface (SAS). We have developed an efficient computational technique to automatically extract this surface from synaptic junctions that have previously been three-dimensionally reconstructed from actual tissue samples imaged by automated FIB/SEM. Given its relationship with the release probability and the number of postsynaptic receptors, the surface area of the SAS is a functionally relevant measure of the size of a synapse that can complement other geometrical features like the volume of the reconstructed synaptic junction, the equivalent ellipsoid size and the Feret's diameter. PMID:23847474
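
    The SAS surface area itself reduces to summing triangle areas over the reconstructed mesh; a minimal sketch (the actual SAS extraction algorithm is more involved):

```python
import numpy as np

def mesh_area(vertices, faces):
    """Total surface area of a triangle mesh: each face contributes
    0.5 * |(b - a) x (c - a)|."""
    a = vertices[faces[:, 0]]
    b = vertices[faces[:, 1]]
    c = vertices[faces[:, 2]]
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()

# Unit square split into two triangles -> area 1.0
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
faces = np.array([[0, 1, 2], [0, 2, 3]])
print(mesh_area(verts, faces))   # 1.0
```

    Applied to a reconstructed SAS mesh, this sum is the area measure compared against junction volume, equivalent ellipsoid size and Feret's diameter.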

  20. Solution and Aging of MAR-M246 Nickel-Based Superalloy

    NASA Astrophysics Data System (ADS)

    Baldan, Renato; da Silva, Antonio Augusto Araújo Pinto; Nunes, Carlos Angelo; Couto, Antonio Augusto; Gabriel, Sinara Borborema; Alkmin, Luciano Braga

    2017-02-01

    Solution and aging heat-treatments play a key role in the application of superalloys. The aim of this work is to evaluate the microstructure of the MAR-M246 nickel-based superalloy solutioned at 1200 and 1250 °C for 330 min and aged at 780, 880 and 980 °C for 5, 20 and 80 h. The γ' solvus, solidus and liquidus temperatures were calculated with the aid of the JMatPro software (Ni database). The as-cast and heat-treated samples were characterized by SEM/EDS and SEM-FEG. The size of the γ' precipitates in the aged samples was measured and compared with JMatPro simulations. The results have shown that the sample solutioned at 1250 °C for 330 min exhibited a very homogeneous γ matrix with carbides and cubic γ' precipitates uniformly distributed. The mean γ' size of samples aged at 780 and 880 °C for 5, 20 and 80 h did not present significant differences when compared to the solutioned sample. However, a significant increase in the γ' particle size was observed at 980 °C, evidenced by the large mean size of these particles after 80 h of aging heat-treatment.

  1. [Preliminary study of bonding strength between diatomite-based dental ceramic and veneering porcelains].

    PubMed

    Lu, Xiao-li; Gao, Mei-qin; Cheng, Yu-ye; Zhang, Fei-min

    2015-04-01

    In order to choose the best veneering porcelain for a diatomite-based dental ceramic substrate, the bonding strength between diatomite-based dental ceramics and veneering porcelains was measured, and the microstructure and element distribution of the interface were analyzed. The coefficient of thermal expansion (CTE) of the diatomite-based dental ceramics was measured by dilatometry. Three veneering porcelain materials were selected with the best CTE matching, including alumina veneering porcelain (group A), titanium veneering porcelain (group B), and E-max veneering porcelain (group C). Shear bonding strength was measured. SEM and EDS were used to observe the interface microstructure and element distribution. Statistical analysis was performed using the SPSS 17.0 software package. The CTE of the diatomite-based dental ceramics at 25-500 °C was 8.85×10⁻⁶ K⁻¹. The diatomite-based substrate ceramics combined best with group C. Shear bonding strength between groups A and C and between groups B and C both showed significant differences (P<0.05). SEM and EDS showed that the interface of group C sintered tightly and elements permeated on both sides of the interface. The diatomite-based substrate ceramics combine best with the E-max veneering porcelain.
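
    The reported CTE follows from the defining dilatometric relation α = ΔL/(L0·ΔT); the 10 mm specimen length below is a hypothetical value, used only to round-trip the reported coefficient.

```python
def linear_cte(L0, dL, dT):
    """Mean linear coefficient of thermal expansion: alpha = dL / (L0 * dT)."""
    return dL / (L0 * dT)

# A CTE of 8.85e-6 K^-1 over 25-500 °C implies, for a hypothetical 10 mm
# specimen, a total elongation of about 42 µm:
dL = 8.85e-6 * 10e-3 * (500 - 25)        # ≈ 4.2e-5 m
alpha = linear_cte(10e-3, dL, 500 - 25)  # recovers 8.85e-06 K^-1
print(dL, alpha)
```

    Matching this coefficient between substrate and veneer is what motivates the CTE screening step before the shear-bond testing.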

  2. Correlative Light-Electron Fractography of Interlaminar Fracture in a Carbon-Epoxy Composite.

    PubMed

    Hein, Luis Rogerio de O; Campos, Kamila A de

    2015-12-01

    This work evaluates the use of light microscopes (LMs) as a tool for interlaminar fracture of polymer composite investigation with the aid of correlative fractography. Correlative fractography consists of an association of the extended depth of focus (EDF) method, based on reflected LM, with scanning electron microscopy (SEM) to evaluate interlaminar fractures. The use of these combined techniques is exemplified here for the mode I fracture of carbon-epoxy plain-weave reinforced composite. The EDF-LM is a digital image-processing method that consists of the extraction of in-focus pixels for each x-y coordinate in an image from a stack of Z-ordered digital pictures from an LM, resulting in a fully focused picture and a height elevation map for each stack. SEM is the most used tool for the identification of fracture mechanisms in a qualitative approach, with the combined advantages of a large focus depth and fine lateral resolution. However, LMs, with EDF software, may bypass the restriction on focus depth and present enough lateral resolution at low magnification. Finally, correlative fractography can provide the general comprehension of fracture processes, with the benefits of the association of different resolution scales and contrast modes.

  3. Fabrication of digital rainbow holograms and 3-D imaging using SEM based e-beam lithography.

    PubMed

    Firsov, An; Firsov, A; Loechel, B; Erko, A; Svintsov, A; Zaitsev, S

    2014-11-17

    Here we present an approach for creating full-color digital rainbow holograms based on mixing three basic colors. Much like in a color TV with three luminescent points per single screen pixel, each color pixel of the initial image is represented by three (R, G, B) distinct diffractive gratings in the hologram structure. Changes of either the duty cycle or the area of the gratings are used to provide the proper R, G, B intensities. Special algorithms allow one to design rather complicated 3D images (that might even be replacing each other with hologram rotation). The software developed ("RainBow") provides stable colorization of the rotated image by equalizing the angular blur from the gratings responsible for the R, G, B basic colors. The approach based on R, G, B color synthesis allows one to fabricate a gray-tone rainbow hologram containing white color, which is hardly possible in traditional dot-matrix technology. Low-cost electron beam lithography based on an SEM column was used to fabricate practical examples of digital rainbow holograms. The results of fabrication of large rainbow holograms, from design to imprinting, are presented. Advantages of EBL in comparison to traditional optical (dot-matrix) technology are considered.
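
    The idea of one grating per primary color can be illustrated with the first-order grating equation; the wavelengths and viewing angle below are illustrative assumptions, not the paper's design values.

```python
import math

# First-order grating equation d * sin(theta) = m * lambda: the period each
# of the three gratings needs to steer its color to a common 30° viewing
# angle (m = 1; wavelengths in nm are typical R/G/B choices).
theta = math.radians(30)
periods_nm = {name: wl / math.sin(theta)
              for name, wl in [("R", 630), ("G", 530), ("B", 470)]}
print({k: round(v) for k, v in periods_nm.items()})
# {'R': 1260, 'G': 1060, 'B': 940}
```

    Because the required period differs per color, each pixel needs three distinct gratings, and intensity is then tuned via their duty cycle or area as described above.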

  4. Nuclear forensics investigation of morphological signatures in the thermal decomposition of uranyl peroxide.

    PubMed

    Schwerdt, Ian J; Olsen, Adam; Lusk, Robert; Heffernan, Sean; Klosterman, Michael; Collins, Bryce; Martinson, Sean; Kirkham, Trenton; McDonald, Luther W

    2018-01-01

    The analytical techniques typically utilized in a nuclear forensic investigation often provide limited information regarding the process history and production conditions of interdicted nuclear material. In this study, scanning electron microscopy (SEM) analysis of the surface morphology of amorphous UO3 samples calcined at 250, 300, 350, 400, and 450°C from uranyl peroxide was performed to determine if the morphology was indicative of the synthesis route and thermal history of the samples. Thermogravimetric analysis-mass spectrometry (TGA-MS) and differential scanning calorimetry (DSC) were used to correlate transitions in the calcined material to morphological transformations. The high-resolution SEM images were processed using the Morphological Analysis for Material Attribution (MAMA) software. Morphological attributes, particle area and circularity, showed significant trends as a function of calcination temperature. The quantitative morphological analysis was able to track the process of particle fragmentation and subsequent sintering as the calcination temperature was increased. At the 90% confidence interval, with 1000 segmented particles, the use of Kolmogorov-Smirnov statistical comparisons allowed discernment between all calcination temperatures for the uranyl peroxide route. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
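
    The Kolmogorov-Smirnov comparison of particle-attribute distributions can be sketched as follows; the lognormal area distributions are synthetic stand-ins for MAMA segmentation output at two calcination temperatures.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Synthetic particle-area distributions (arbitrary units); sintering at the
# higher temperature shifts areas upward (planted shift, n = 1000 each).
areas_350C = rng.lognormal(mean=0.0, sigma=0.5, size=1000)
areas_450C = rng.lognormal(mean=0.3, sigma=0.5, size=1000)

stat, p = ks_2samp(areas_350C, areas_450C)
print(p < 0.10)   # True: distinguishable at the 90% confidence level
```

    With 1000 segmented particles per sample, even modest shifts in the area distribution yield decisive KS p-values, which is what enables discernment between all calcination temperatures.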

  5. Models and metrics for software management and engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1988-01-01

    This paper attempts to characterize and present a state of the art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resources allocation and estimation, changes and errors, size, complexity and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.

  6. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts are, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  7. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  8. Experiences in Teaching a Graduate Course on Model-Driven Software Development

    ERIC Educational Resources Information Center

    Tekinerdogan, Bedir

    2011-01-01

    Model-driven software development (MDSD) aims to support the development and evolution of software intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with the ongoing academic research, MDSD is more and more applied in industrial practices. After being accepted both by a broad community of…

  9. Industry Software Trustworthiness Criterion Research Based on Business Trustworthiness

    NASA Astrophysics Data System (ADS)

    Zhang, Jin; Liu, Jun-fei; Jiao, Hai-xing; Shen, Yi; Liu, Shu-yuan

    To address the trustworthiness problem of industry software, the idea of constructing an industry software trustworthiness criterion oriented to business is proposed. Based on the triangle model of "trustworthy grade definition-trustworthy evidence model-trustworthy evaluation", the idea of business trustworthiness is embodied in the different aspects of the trustworthy triangle model for a specific industry software system, the power producing management system (PPMS). Business trustworthiness is the center of the constructed industry trustworthy software criterion. By fusing international standards and industry rules, the constructed trustworthy criterion strengthens operability and reliability. A quantitative evaluation method makes the evaluation results intuitive and comparable.

  10. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.

  11. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRM). The diagrams, called dimods, are a combination of visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.

  12. Insights into chromatographic separation using core-shell metal-organic frameworks: Size exclusion and polarity effects.

    PubMed

    Qin, Weiwei; Silvestre, Martin E; Kirschhöfer, Frank; Brenner-Weiss, Gerald; Franzreb, Matthias

    2015-09-11

    Porous metal-organic frameworks (MOFs) [Cu3(BTC)2(H2O)3]n (also known as HKUST-1; BTC, benzene-1,3,5-tricarboxylic acid) were synthesized as a homogeneous shell onto carboxyl-functionalized magnetic microparticles through a liquid phase epitaxy (LPE) process. The as-synthesized core-shell HKUST-1 magnetic microparticle composites were characterized by XRD and SEM, and used as stationary phase in high performance liquid chromatography (HPLC). The effects of the unique properties of MOFs on the chromatographic performance are demonstrated by the experiments. First, remarkable separation of pyridine and bipyridine is achieved, although both molecules show a strong interaction between the Cu-ions in HKUST-1 and the nitrogen atoms in their heterocycles. The difference can be explained by size exclusion of bipyridine from the well-defined pore structure of crystalline HKUST-1. Second, the enormous variety of possible interactions of sample molecules with the metal ions and linkers within MOFs allows for specifically tailored solid phases for challenging separation tasks. For example, baseline separation of the three chloroaniline (CLA) isomers tested can be achieved without the need for gradient elution modes. Along with the experimental HPLC runs, in-depth modelling with a recently developed chromatography modelling software (ChromX) was applied, proving the software to be a powerful tool for exploring the separation potential of thin MOF films. The pore diffusivities of pyridine and the CLA isomers within HKUST-1 are found to be around 2.3×10⁻¹⁵ m² s⁻¹. While the affinity of HKUST-1 to the tested molecules strongly differs, the maximum capacities are in the same range, with 0.37 mol L⁻¹ for pyridine and 0.23 mol L⁻¹ for CLA isomers, corresponding to 4.0 and 2.5 molecules per MOF unit cell, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.
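
    The conversion from volumetric capacity to molecules per unit cell can be reproduced assuming the cubic HKUST-1 lattice parameter of about 26.34 Å (an assumed literature value, not stated in the abstract).

```python
N_A = 6.022e23              # Avogadro constant (1/mol)
a_cell = 26.34e-10          # assumed cubic HKUST-1 lattice parameter (m)
V_cell_L = a_cell**3 * 1e3  # unit-cell volume in litres (1 m^3 = 1000 L)

def molecules_per_cell(c_mol_per_L):
    """Convert a volumetric capacity (mol/L) to molecules per unit cell."""
    return c_mol_per_L * N_A * V_cell_L

print(molecules_per_cell(0.37))   # ≈ 4.1, close to the reported 4.0 (pyridine)
print(molecules_per_cell(0.23))   # ≈ 2.5 (CLA isomers)
```

    The round-trip lands close to the paper's 4.0 and 2.5 molecules per cell, which is a useful sanity check on the fitted capacities.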

  13. Capability Maturity Model (CMM) for Software Process Improvements

    NASA Technical Reports Server (NTRS)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  14. Structural equation modeling of the inflammatory response to traffic air pollution

    PubMed Central

    Baja, Emmanuel S.; Schwartz, Joel D.; Coull, Brent A.; Wellenius, Gregory A.; Vokonas, Pantel S.; Suh, Helen H.

    2015-01-01

    Several epidemiological studies have reported conflicting results on the effect of traffic-related pollutants on markers of inflammation. In a Bayesian framework, we examined the effect of traffic pollution on inflammation using structural equation models (SEMs). We studied measurements of C-reactive protein (CRP), soluble vascular cell adhesion molecule-1 (sVCAM-1), and soluble intracellular adhesion molecule-1 (sICAM-1) for 749 elderly men from the Normative Aging Study. Using repeated measures SEMs, we fit a latent variable for traffic pollution that is reflected by levels of black carbon, carbon monoxide, nitrogen monoxide and nitrogen dioxide to estimate its effect on a latent variable for inflammation that included sICAM-1, sVCAM-1 and CRP. Exposure periods were assessed using 1-, 2-, 3-, 7-, 14- and 30-day moving averages previsit. We compared our findings using SEMs with those obtained using linear mixed models. Traffic pollution was related to increased inflammation for 3-, 7-, 14- and 30-day exposure periods. An inter-quartile range increase in traffic pollution was associated with a 2.3% (95% posterior interval (PI): 0.0–4.7%) increase in inflammation for the 3-day moving average, with the most significant association observed for the 30-day moving average (23.9%; 95% PI: 13.9–36.7%). Traffic pollution adversely impacts inflammation in the elderly. SEMs in a Bayesian framework can comprehensively incorporate multiple pollutants and health outcomes simultaneously in air pollution–cardiovascular epidemiological studies. PMID:23232970
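
    The exposure windows described above are trailing moving averages of the daily pollutant series ending at the visit day; a minimal sketch:

```python
import numpy as np

def trailing_mean(x, k):
    """k-day trailing moving average: entry i is the mean of days
    i-k+1 .. i of the input (the first k-1 days are dropped)."""
    kernel = np.ones(k) / k
    return np.convolve(np.asarray(x, dtype=float), kernel, mode="valid")

bc = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]   # illustrative daily black carbon
ma3 = trailing_mean(bc, 3)                  # 3-day exposure window
print(ma3)   # [2. 3. 4. 5. 6.]
```

    Applying this with k = 1, 2, 3, 7, 14 and 30 to each pollutant yields the exposure variables that feed the latent traffic-pollution construct.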

  15. A bridge role metric model for nodes in software networks.

    PubMed

    Li, Bo; Feng, Yanli; Ge, Shiyu; Li, Dashe

    2014-01-01

    A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, rather than only the centrality used in previous metric models. Two previous metric models are described for comparison. In addition, the fitting curve between the bridge role metric results and node degrees is well described by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper contributes to an accurate understanding of the module design of software systems and is expected to be beneficial to software engineering practices.
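
    The power-law relationship reported above can be illustrated with a quick log-log least-squares fit. The sketch below uses synthetic data, not the paper's, purely to show the fitting technique:

```python
import numpy as np

# Hypothetical illustration: fit a power law y = c * k^alpha between a node
# metric and node degree via linear least squares in log-log space.
# All data here are synthetic, not from the paper.
rng = np.random.default_rng(0)
degrees = np.arange(1, 101)                     # node degrees k
true_alpha, true_c = 1.5, 0.8
metric = true_c * degrees**true_alpha * rng.lognormal(0.0, 0.05, degrees.size)

# A straight-line fit in log-log space recovers the exponent and prefactor.
alpha, log_c = np.polyfit(np.log(degrees), np.log(metric), 1)

print(f"fitted exponent alpha = {alpha:.2f}, prefactor c = {np.exp(log_c):.2f}")
```

    A roughly straight scatter of log(metric) against log(degree) is the visual signature of such a fit.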

  16. A Bridge Role Metric Model for Nodes in Software Networks

    PubMed Central

    Li, Bo; Feng, Yanli; Ge, Shiyu; Li, Dashe

    2014-01-01

    A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, rather than only the centrality used in previous metric models. Two previous metric models are described for comparison. In addition, the fitting curve between the metric results and node degrees is well described by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper contributes to an accurate understanding of the module design of software systems and is expected to be beneficial to software engineering practices. PMID:25364938

  17. Development of an Environment for Software Reliability Model Selection

    DTIC Science & Technology

    1992-09-01

    now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling... multiversion programming... Hardware can be repaired by spare modules, which is not the case for software... Preventive maintenance is very important

  18. GeoTess: A generalized Earth model software utility

    DOE PAGES

    Ballard, Sanford; Hipp, James; Kraus, Brian; ...

    2016-03-23

    GeoTess is a model parameterization and software support library that manages the construction, population, storage, and interrogation of data stored in 2D and 3D Earth models. Here, the software is available in Java and C++, with a C interface to the C++ library.

  19. Mental Models of Software Forecasting

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Griesel, A.; Bruno, K.; Fouser, T.; Tausworthe, R.

    1993-01-01

    The majority of software engineers resist the use of the currently available cost models. One problem is that the mathematical and statistical models currently available do not correspond with the mental models of the software engineers. An earlier JPL-funded study (Hihn and Habib-agahi, 1991) found that software engineers prefer to use analogical or analogy-like techniques to derive size and cost estimates, whereas current CERs (cost estimating relationships) hide any analogy in the regression equations. In addition, the currently available models depend upon information which is not available during early planning, when the most important forecasts must be made.

  20. Model Driven Engineering

    NASA Astrophysics Data System (ADS)

    Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan

    A relevant initiative from the software engineering community, called Model Driven Engineering (MDE), is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.

  1. Advanced metrology by offline SEM data processing

    NASA Astrophysics Data System (ADS)

    Lakcher, Amine; Schneider, Loïc.; Le-Gratiet, Bertrand; Ducoté, Julien; Farys, Vincent; Besacier, Maxime

    2017-06-01

    Today's technology nodes contain more and more complex designs, bringing increasing challenges to chip manufacturing process steps. An efficient metrology is necessary to assess the process variability of these complex patterns and thus extract relevant data to generate process-aware design rules and to improve OPC models. Today, process variability is mostly addressed through the analysis of in-line monitoring features, which are often designed to support robust measurements and as a consequence are not always very representative of critical design rules. CD-SEM is the main CD metrology technique used in the chip manufacturing process, but it is challenged when it comes to measuring metrics like tip to tip, tip to line, areas or necking in high quantity and with robustness. CD-SEM images contain a lot of information that is not always used in metrology. Suppliers have provided tools that allow engineers to extract the SEM contours of their features and to convert them into a GDS. A contour can be seen as the signature of the shape, as it contains all the dimensional data. The methodology is thus to use the CD-SEM to take high-quality images, then generate SEM contours and create a database out of them. The contours are used to feed an offline metrology tool that processes them to extract different metrics. It was shown in two previous papers that it is possible to perform complex measurements on hotspots at different process steps (lithography, etch, copper CMP) by using SEM contours with an in-house offline metrology tool. In the current paper, the methodology presented previously is expanded to improve its robustness and combined with the use of phylogeny to classify the SEM images according to their geometrical proximities.
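
    One of the contour-based metrics mentioned above, tip to tip, reduces to a minimum point-to-point distance between two extracted contours. A minimal sketch with invented coordinates, not the in-house tool's actual algorithm:

```python
import numpy as np

# Hypothetical sketch of one contour-based metric: a "tip to tip" distance
# computed as the minimum point-to-point distance between two extracted SEM
# contours (each an array of (x, y) points). Contours here are synthetic.
def tip_to_tip(contour_a: np.ndarray, contour_b: np.ndarray) -> float:
    # Pairwise Euclidean distances between all points of the two contours.
    diffs = contour_a[:, None, :] - contour_b[None, :, :]
    dists = np.hypot(diffs[..., 0], diffs[..., 1])
    return float(dists.min())

# Two line ends facing each other across a 12 nm gap (coordinates in nm).
line_end_a = np.array([[x, 0.0] for x in np.linspace(0, 50, 51)])
line_end_b = np.array([[x, 0.0] for x in np.linspace(62, 112, 51)])
print(tip_to_tip(line_end_a, line_end_b))  # smallest gap between the contours
```

    Real tools would first denoise and resample the contours; the distance kernel itself stays this simple.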

  2. Modeling Latent Growth Curves With Incomplete Data Using Different Types of Structural Equation Modeling and Multilevel Software

    ERIC Educational Resources Information Center

    Ferrer, Emilio; Hamagami, Fumiaki; McArdle, John J.

    2004-01-01

    This article offers different examples of how to fit latent growth curve (LGC) models to longitudinal data using a variety of different software programs (i.e., LISREL, Mx, Mplus, AMOS, SAS). The article shows how the same model can be fitted using both structural equation modeling and multilevel software, with nearly identical results, even in…

  3. Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions

    DTIC Science & Technology

    2012-07-01

    Capability Maturity Model Integration℠ (CMMI®) [Davis 2009]. Team Software Process (TSP) and Capability Maturity Model Integration are service... STP Software Test Plan; TEP Test and Evaluation Plan; TSP Team Software Process; V&V verification and validation. CMU/SEI-2012-TN-016... Supporting the Use of CERT® Secure Coding Standards in DoD Acquisitions, Tim Morrow (Software Engineering Institute), Robert Seacord (Software

  4. Modeling of electron-specimen interaction in scanning electron microscope for e-beam metrology and inspection: challenges and perspectives

    NASA Astrophysics Data System (ADS)

    Suzuki, Makoto; Kameda, Toshimasa; Doi, Ayumi; Borisov, Sergey; Babin, Sergey

    2018-03-01

    The interpretation of scanning electron microscopy (SEM) images of the latest semiconductor devices is not intuitive and requires comparison with computed images based on theoretical modeling and simulations. For quantitative image prediction and geometrical reconstruction of the specimen structure, the accuracy of the physical model is essential. In this paper, we review the current models of electron-solid interaction and discuss their accuracy. We perform the comparison of the simulated results with our experiments of SEM overlay of under-layer, grain imaging of copper interconnect, and hole bottom visualization by angular selective detectors, and show that our model well reproduces the experimental results. Remaining issues for quantitative simulation are also discussed, including the accuracy of the charge dynamics, treatment of beam skirt, and explosive increase in computing time.

  5. Spectral element modelling of fault-plane reflections arising from fluid pressure distributions

    USGS Publications Warehouse

    Haney, M.; Snieder, R.; Ampuero, J.-P.; Hofmann, R.

    2007-01-01

    The presence of fault-plane reflections in seismic images, besides indicating the locations of faults, offers a possible source of information on the properties of these poorly understood zones. To better understand the physical mechanism giving rise to fault-plane reflections in compacting sedimentary basins, we numerically model the full elastic wavefield via the spectral element method (SEM) for several different fault models. Using well log data from the South Eugene Island field, offshore Louisiana, we derive empirical relationships between the elastic parameters (e.g. P-wave velocity and density) and the effective stress along both normal compaction and unloading paths. These empirical relationships guide the numerical modelling and allow the investigation of how differences in fluid pressure modify the elastic wavefield. We choose to simulate the elastic wave equation via SEM since irregular model geometries can be accommodated and slip boundary conditions at an interface, such as a fault or fracture, are implemented naturally. The method we employ for including a slip interface retains the desirable qualities of SEM in that it is explicit in time and, therefore, does not require the inversion of a large matrix. We perform a complete numerical study by forward modelling seismic shot gathers over a faulted earth model using SEM, followed by seismic processing of the simulated data. With this procedure, we construct post-stack time-migrated images of the kind that are routinely interpreted in the seismic exploration industry. We dip filter the seismic images to highlight the fault-plane reflections prior to making amplitude maps along the fault plane. With these amplitude maps, we compare the reflectivity from the different fault models to diagnose which physical mechanism contributes most to observed fault reflectivity.
    To lend physical meaning to the properties of a locally weak fault zone characterized as a slip interface, we propose an equivalent-layer model under the assumption of weak scattering. This allows us to use the empirical relationships between density, velocity and effective stress from the South Eugene Island field to relate a slip interface to an amount of excess pore pressure in a fault zone. © 2007 The Authors; Journal compilation © 2007 RAS.

  6. Defect measurement and analysis of JPL ground software: a case study

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Spagnuolo, John N., Jr.

    2004-01-01

    Ground software systems at JPL must meet high assurance standards while remaining on schedule, due to relatively immovable launch dates for the spacecraft that such systems will control. Toward this end, the Software Quality Improvement (SQI) project's Measurement and Benchmarking (M&B) team is collecting and analyzing defect data from JPL ground system software projects to build software defect prediction models. The aim of these models is to improve predictability with regard to software quality activities. Predictive models will quantitatively define typical trends for JPL ground systems as well as Critical Discriminators (CDs) to provide explanations for atypical deviations from the norm at JPL. CDs are software characteristics that can be estimated or foreseen early in a software project's planning. Thus, these CDs will assist in planning for the degree to which software quality activities for a project are likely to deviate from the typical JPL ground system, based on past experience across the laboratory.

  7. Co Modeling and Co Synthesis of Safety Critical Multi threaded Embedded Software for Multi Core Embedded Platforms

    DTIC Science & Technology

    2017-03-20

    computation, Prime Implicates, Boolean Abstraction, real-time embedded software, software synthesis, correct-by-construction software design, model... "types for time-dependent data-flow networks". J.-P. Talpin, P. Jouvelot, S. Shukla. ACM-IEEE Conference on Methods and Models for System Design...

  8. Adaptive Long-Term Monitoring at Environmental Restoration Sites (ER-0629)

    DTIC Science & Technology

    2009-05-01

    Figures: Figure 2-1 General Flowchart of Software Application; Figure 2-2 Overview of the Genetic Algorithm Approach; Figure 2-3 Example of a... and Model Builder) are highlighted on Figure 2-1, which is a general flowchart illustrating the application of the software. The software is applied... monitoring event (e.g., contaminant mass based on interpolation); that modeling is provided by Model Builder.

  9. Software forecasting as it is really done: A study of JPL software engineers

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.

    1993-01-01

    This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation among the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and the mental models were found to cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting life cycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subjects used risk reduction steps in combination. The results of the analysis include the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.

  10. A comparative approach to computer aided design model of a dog femur.

    PubMed

    Turamanlar, O; Verim, O; Karabulut, A

    2016-01-01

    Computer-assisted technologies offer new opportunities in medical imaging and rapid prototyping in biomechanical engineering. Three-dimensional (3D) modelling of soft tissues and bones is becoming more important. The accuracy of the analysis in modelling processes depends on the outline of the tissues derived from medical images. The aim of this study is to evaluate the accuracy of 3D models of a dog femur derived from computed tomography data, using the point cloud method and the boundary line method in several modelling software packages. Solidworks, Rapidform and 3DSMax were used to create the 3D models, and the outcomes were evaluated statistically. The most accurate 3D prototype of the dog femur was created with the stereolithography method using a rapid prototyping device. Furthermore, the linearity of the model volumes was investigated between the software packages and the constructed models. The differences between the software models and the real models reflect the sensitivity of the software and the devices used.

  11. Integrated Functional and Executional Modelling of Software Using Web-Based Databases

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Marietta, Roberta

    1998-01-01

    NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that modified software satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, the use of automatic information extraction tools, web technology, and databases.

  12. Experimental Evaluation of a Serious Game for Teaching Software Process Modeling

    ERIC Educational Resources Information Center

    Chaves, Rafael Oliveira; von Wangenheim, Christiane Gresse; Furtado, Julio Cezar Costa; Oliveira, Sandro Ronaldo Bezerra; Santos, Alex; Favero, Eloi Luiz

    2015-01-01

    Software process modeling (SPM) is an important area of software engineering because it provides a basis for managing, automating, and supporting software process improvement (SPI). Teaching SPM is a challenging task, mainly because it lays great emphasis on theory and offers few practical exercises. Furthermore, as yet few teaching approaches…

  13. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development- effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
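
    The calibration idea described above can be sketched in miniature: fit a COCOMO-style effort model, effort = a * KLOC**b person-months, by exhaustively searching the parameter space against historical project data. This is an illustrative toy, not COCOMOST's actual "2cee" methodology, and the project data below are invented:

```python
import numpy as np

# Illustrative sketch (not COCOMOST itself): calibrate a COCOMO-style effort
# model, effort = a * KLOC**b person-months, by an exhaustive grid search
# over (a, b) against historical data, in the spirit of the parameter search
# the abstract describes. The project data are invented.
kloc   = np.array([10.0, 25.0, 50.0, 100.0])   # project sizes in KLOC
effort = np.array([26.0, 70.0, 145.0, 300.0])  # observed person-months

best = None
for a in np.linspace(1.0, 4.0, 61):
    for b in np.linspace(0.9, 1.3, 81):
        err = np.sum((a * kloc**b - effort) ** 2)  # squared estimation error
        if best is None or err < best[0]:
            best = (err, a, b)

err, a, b = best
print(f"calibrated a={a:.2f}, b={b:.2f}; predicted effort(30 KLOC)={a*30**b:.0f} PM")
```

    Searching exhaustively also yields the shape of the error surface, which is where an estimation-error report (rather than a point estimate alone) can come from.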

  14. SWIFT MODELLER: a Java based GUI for molecular modeling.

    PubMed

    Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S

    2011-10-01

    MODELLER is command-line-driven software that requires tedious formatting of inputs and the writing of Python scripts, which most people are not comfortable with. The visualization of output is also cumbersome due to verbose files. This makes the whole software protocol complex and requires extensive study of the MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow depicts its steps. It eliminates the formatting of inputs, the scripting process and the analysis of verbose output files through automation, making pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI; it opens and displays the Protein Data Bank files created by MODELLER. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software through automation of many of the steps in the original software protocol, saving an enormous amount of time per instance and making MODELLER very easy to work with.

  15. Oral or parenteral administration of curcumin does not prevent the growth of high-risk t(4;11) acute lymphoblastic leukemia cells engrafted into a NOD/SCID mouse model

    USDA-ARS?s Scientific Manuscript database

    The efficacy of orally and parenterally administered curcumin was evaluated in NOD.CB17-Prkdcscid/J mice engrafted with the human t(4;11) acute lymphoblastic leukemia line SEM. SEM cells were injected into the tail vein and engraftment was monitored by flow cytometry. Once engraftment was observed...

  16. Simulation of High-Latitude Hydrological Processes in the Torne-Kalix Basin: PILPS Phase 2(e). 3; Equivalent Model Representation and Sensitivity Experiments

    NASA Technical Reports Server (NTRS)

    Bowling, Laura C.; Lettenmaier, Dennis P.; Nijssen, Bart; Polcher, Jan; Koster, Randal D.; Lohmann, Dag; Houser, Paul R. (Technical Monitor)

    2002-01-01

    The Project for Intercomparison of Land Surface Parameterization Schemes (PILPS) Phase 2(e) showed that in cold regions the annual runoff production in Land Surface Schemes (LSSs) is closely related to the maximum snow accumulation, which in turn is controlled in large part by winter sublimation. To help further explain the relationship between snow cover, turbulent exchanges and runoff production, a simple equivalent model (SEM) was devised to reproduce the seasonal and annual fluxes simulated by 13 LSSs that participated in PILPS Phase 2(e). The design of the SEM relates the annual partitioning of precipitation and energy in the LSSs to three primary parameters: snow albedo, effective aerodynamic resistance and evaporation efficiency. Isolation of each of the parameters showed that the annual runoff production was most sensitive to the aerodynamic resistance. The SEM was somewhat successful in reproducing the observed LSS response to a decrease in shortwave radiation and changes in wind speed forcings. SEM parameters derived from the reduced shortwave forcings suggested that increased winter stability suppressed turbulent heat fluxes over snow. Because winter sensible heat fluxes were largely negative, reductions in winter shortwave radiation imply an increase in annual average sensible heat.

  17. GPU accelerated Monte-Carlo simulation of SEM images for metrology

    NASA Astrophysics Data System (ADS)

    Verduin, T.; Lokhorst, S. R.; Hagen, C. W.

    2016-03-01

    In this work we address the computation times of numerical studies in dimensional metrology. In particular, full Monte-Carlo simulation programs for scanning electron microscopy (SEM) image acquisition are known to be notoriously slow. Our quest to reduce the computation time of SEM image simulation has led us to investigate the use of graphics processing units (GPUs) for metrology. We have succeeded in creating a full Monte-Carlo simulation program for SEM images, which runs entirely on a GPU. The physical scattering models of this GPU simulator are identical to a previous CPU-based simulator, which includes the dielectric function model for inelastic scattering and also refinements for low-voltage SEM applications. As a case study for the performance, we considered the simulated exposure of a complex feature: an isolated silicon line with rough sidewalls located on a flat silicon substrate. The surface of the rough feature is decomposed into 408 012 triangles. We have used an exposure dose of 6 mC/cm2, which corresponds to 6 553 600 primary electrons on average (Poisson distributed). We repeat the simulation for various primary electron energies: 300 eV, 500 eV, 800 eV, 1 keV, 3 keV and 5 keV. At first we run the simulation on a GeForce GTX480 from NVIDIA. The very same simulation is duplicated on our CPU-based program, for which we have used an Intel Xeon X5650. Apart from statistics in the simulation, no difference is found between the CPU and GPU simulated results. The GTX480 generates the images (depending on the primary electron energy) 350 to 425 times faster than a single-threaded Intel X5650 CPU. Although this is a tremendous speedup, we actually have not reached the maximum throughput because of the limited amount of available memory on the GTX480. Nevertheless, the speedup enables the fast acquisition of simulated SEM images for metrology. We now have the potential to investigate case studies in CD-SEM metrology, which otherwise would take unreasonable amounts of computation time.
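
    SEM image simulation maps well onto a GPU because each primary electron is an independent Monte-Carlo sample. A deliberately toy 1D sketch of that embarrassingly parallel structure (not the simulator's physical scattering model):

```python
import numpy as np

# Toy illustration only: a drastically simplified 1D Monte-Carlo estimate of
# a backscattering fraction. Real SEM simulators (like the one described)
# track full 3D trajectories with physical scattering models; this sketch
# only shows why the problem parallelizes: every primary electron is an
# independent random sample, here advanced as a vectorized batch.
rng = np.random.default_rng(42)
n_electrons = 100_000
n_steps = 50
depth = np.zeros(n_electrons)                 # depth below surface (a.u.)
alive = np.ones(n_electrons, dtype=bool)
escaped = np.zeros(n_electrons, dtype=bool)

for _ in range(n_steps):
    step = rng.exponential(1.0, n_electrons)          # free-path length
    direction = rng.choice([-1.0, 1.0], n_electrons)  # toward/away from surface
    depth[alive] += (direction * step)[alive]
    newly_escaped = alive & (depth < 0.0)             # crossed back out
    escaped |= newly_escaped
    alive &= ~newly_escaped

print(f"backscattered fraction ~ {escaped.mean():.2f}")
```

    On a GPU the per-step updates become one kernel launch over all electrons, which is where the reported 350-425x speedups come from.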

  18. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  19. Modified multiblock partial least squares path modeling algorithm with backpropagation neural networks approach

    NASA Astrophysics Data System (ADS)

    Yuniarto, Budi; Kurniawan, Robert

    2017-03-01

    PLS Path Modeling (PLS-PM) differs from covariance-based SEM in that PLS-PM uses an approach based on variance or components; therefore, PLS-PM is also known as component-based SEM. Multiblock Partial Least Squares (MBPLS) is a PLS regression method that can be used in PLS Path Modeling, where it is known as Multiblock PLS Path Modeling (MBPLS-PM). This method uses an iterative procedure in its algorithm. This research aims to modify MBPLS-PM with a Back Propagation Neural Network approach. The result is that the MBPLS-PM algorithm can be modified using the Back Propagation Neural Network approach to replace the iterative backward and forward steps used to obtain the matrix t and the matrix u in the algorithm. With this modification, the model parameters obtained are not significantly different from those obtained by the original MBPLS-PM algorithm.
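
    For context, the iterative procedure that the modification replaces resembles the classic two-block NIPALS alternation, which produces the score vectors t (for block X) and u (for block Y). A minimal sketch on synthetic data, not the authors' MBPLS-PM code:

```python
import numpy as np

# Minimal sketch of a two-block NIPALS-style PLS iteration of the kind the
# paper proposes to replace with a neural network: alternate between the X
# and Y blocks until the Y-score vector u stabilizes. Data are synthetic.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
Y = X @ rng.normal(size=(4, 3)) + 0.1 * rng.normal(size=(100, 3))

u = Y[:, [0]]                                          # initialize u from Y
for _ in range(100):
    w = X.T @ u / (u.T @ u); w /= np.linalg.norm(w)    # X weights
    t = X @ w                                          # X scores
    q = Y.T @ t / (t.T @ t); q /= np.linalg.norm(q)    # Y loadings
    u_new = Y @ q                                      # Y scores
    converged = np.linalg.norm(u_new - u) < 1e-10
    u = u_new
    if converged:
        break

# t and u should be strongly correlated when X predicts Y well.
corr = np.corrcoef(t.ravel(), u.ravel())[0, 1]
print(f"corr(t, u) = {corr:.3f}")
```

    The backward/forward alternation above is the fixed-point computation that the backpropagation approach approximates in one shot.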

  20. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  1. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as a basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  2. Implementing a modeling software for animated protein-complex interactions using a physics simulation library.

    PubMed

    Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko

    2014-12-01

    To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study describes our method, developed from earlier prototypes, for detecting collisions and examining the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system included a basic molecular modeling environment, collision detection in the molecular models, and physical simulation of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and an interpreted scripting language, the functions required for accurate and meaningful molecular animation were implemented efficiently.
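
    The simplest form of the collision detection mentioned above treats atoms as spheres and flags pairs whose centers are closer than the sum of their radii; production tools delegate this to the physics engine. A hypothetical sketch:

```python
import numpy as np

# Hypothetical sketch of the simplest collision test in molecular modeling:
# treat atoms as spheres and report pairs whose center distance is smaller
# than the sum of their radii. Coordinates and radii below are invented.
def colliding_pairs(centers: np.ndarray, radii: np.ndarray):
    n = len(centers)
    pairs = []
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(centers[i] - centers[j]) < radii[i] + radii[j]:
                pairs.append((i, j))
    return pairs

centers = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [5.0, 0.0, 0.0]])
radii = np.array([1.0, 1.0, 1.0])
print(colliding_pairs(centers, radii))  # → [(0, 1)]
```

    A physics engine replaces this O(n²) loop with broad-phase spatial partitioning, which is one reason such toolkits are worth reusing.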

  3. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  4. A General Water Resources Regulation Software System in China

    NASA Astrophysics Data System (ADS)

    LEI, X.

    2017-12-01

    To avoid the repeated development of core modules for routine and emergency water resources regulation, and to improve the maintainability and upgradability of regulation models and business logic, a general water resources regulation software framework was developed based on the collection and analysis of common requirements for water resources regulation and emergency management. It provides a customizable and extensible software framework, open to secondary development, for the three-level platform "MWR-Basin-Province". Meanwhile, this general software system enables business collaboration and information sharing of water resources regulation schemes among the three-level platforms, so as to improve national water resources regulation decision-making. The general software system involves four main modules: 1) a complete set of general water resources regulation modules that allows secondary developers to custom-develop water resources regulation decision-making systems; 2) a complete set of model bases and model computing software released in the form of cloud services; 3) a complete set of tools to build the concept map and model system of basin water resources regulation, as well as a model management system to calibrate and configure model parameters; 4) a database which satisfies the business and functional requirements of the general water resources regulation software and can thus provide technical support for building basin or regional water resources regulation models.

  5. The discounting model selector: Statistical software for delay discounting applications.

    PubMed

    Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A

    2017-05-01

    Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods on user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best-performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). Independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 values were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
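
    The approximate Bayesian model selection described above can be sketched in a few lines: fit competing discounting models, convert their BIC scores to approximate model probabilities, and derive ED50 from the winner. The sketch below uses hypothetical indifference-point data and a crude grid search; it illustrates the idea only and is not the Discounting Model Selector itself, which compares a larger model set with proper optimization.

```python
import math

# Hypothetical indifference-point data: delay (days) -> subjective value (fraction)
delays = [1, 7, 30, 90, 180, 365]
values = [0.95, 0.85, 0.60, 0.40, 0.25, 0.15]

exponential = lambda d, k: math.exp(-k * d)    # exponential discounting
hyperbolic = lambda d, k: 1.0 / (1.0 + k * d)  # Mazur's hyperbolic discounting

def sse(model, k):
    return sum((v - model(d, k)) ** 2 for d, v in zip(delays, values))

def fit_k(model):
    # Crude log-spaced grid search for k (a real tool would use a proper optimizer)
    return min((sse(model, 10 ** (e / 50)), 10 ** (e / 50)) for e in range(-250, 50))

n = len(delays)
results = {}
for name, model in [("exponential", exponential), ("hyperbolic", hyperbolic)]:
    s, k = fit_k(model)
    bic = n * math.log(s / n) + math.log(n)  # one free parameter (k) per model
    results[name] = (k, bic)

# Convert BIC differences into approximate model probabilities
min_bic = min(b for _, b in results.values())
weights = {m: math.exp(-0.5 * (b - min_bic)) for m, (_, b) in results.items()}
probs = {m: w / sum(weights.values()) for m, w in weights.items()}

best = max(probs, key=probs.get)
k_best = results[best][0]
# ED50 = delay at which value falls to 0.5: 1/k (hyperbolic) or ln(2)/k (exponential)
ed50 = 1.0 / k_best if best == "hyperbolic" else math.log(2) / k_best
print(best, round(ed50, 1), round(probs[best], 3))
```

    For this toy data set the hyperbolic model wins, and ED50 follows directly from its fitted k; the published software applies the same weighting logic across a wider family of discounting models.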

  6. Development of the UTAUT2 model to measure the acceptance of medical laboratory portals by patients in Shiraz

    PubMed Central

    Ravangard, Ramin; Kazemi, Zhila; Abbasali, Somaye Zaker; Sharifian, Roxana; Monem, Hossein

    2017-01-01

    Introduction One of the main stages in achieving success with a technology is its acceptance by users. Hence, identifying the factors that contribute to successful acceptance of information technology is vital. One such factor is usability. This study aimed to investigate software usability within the “Unified Theory of Acceptance and Use of Technology 2 (UTAUT2)” model in patients’ use of medical diagnosis laboratories’ electronic portals in 2015. Methods This cross-sectional study was carried out on 170 patients in 2015. A 27-item questionnaire adapted from previous research and the Usability Evaluation questionnaire were used for data collection. Data were analyzed using Structural Equation Modeling (SEM) with the Partial Least Squares approach, in SPSS 20.0 and Smart-PLS V3.0. Results The results showed that the construct of intention to use had significant associations with price value (t-value=2.77), hedonic motivation (t-value=4.46), habit (t-value=1.99) and usability (t-value=5.2), as did the construct of usage behavior with usability (t-value=3.45) and intention to use (t-value=2.03). Conclusion Considering the results of this study, the following recommendations can be made to increase patients’ use of the portals: informing patients about the advantages of using these portals, designing portals in a simple and understandable form, increasing the portals’ attractiveness, etc. PMID:28465819

  7. Dual FIB-SEM 3D imaging and lattice boltzmann modeling of porosimetry and multiphase flow in chalk.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rinehart, Alex; Petrusak, Robin; Heath, Jason E.

    2010-12-01

    Mercury intrusion porosimetry (MIP) is an often-applied technique for determining pore throat distributions and seal analysis of fine-grained rocks. Due to closure effects, potential pore collapse, and complex pore network topologies, MIP data interpretation can be ambiguous, and often biased toward smaller pores in the distribution. We apply 3D imaging techniques and lattice-Boltzmann modeling in interpreting MIP data for samples of the Cretaceous Selma Group Chalk. In the Mississippi Interior Salt Basin, the Selma Chalk is the apparent seal for oil and gas fields in the underlying Eutaw Fm., and, where unfractured, the Selma Chalk is one of the regional-scale seals identified by the Southeast Regional Carbon Sequestration Partnership for CO2 injection sites. Dual focused ion - scanning electron beam and laser scanning confocal microscopy methods are used for 3D imaging of nanometer-to-micron scale microcrack and pore distributions in the Selma Chalk. A combination of image analysis software is used to obtain geometric pore body and throat distributions and other topological properties, which are compared to MIP results. 3D data sets of pore-microfracture networks are used in Lattice Boltzmann simulations of drainage (wetting fluid displaced by non-wetting fluid via the Shan-Chen algorithm), which in turn are used to model MIP procedures. Results are used in interpreting MIP results, understanding microfracture-matrix interaction during multiphase flow, and seal analysis for underground CO2 storage.
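
    For context, MIP infers throat sizes from intrusion pressure through the Washburn equation, d = -4γcos(θ)/P, and it is this equivalent-cylinder inversion that closure effects and pore topology make ambiguous. Below is a minimal sketch of the conversion, assuming typical mercury parameters (surface tension 0.485 N/m, contact angle 140°); the pressures are hypothetical.

```python
import math

# Washburn equation: largest intruded throat diameter d = -4*gamma*cos(theta)/P
GAMMA = 0.485              # surface tension of mercury, N/m
THETA = math.radians(140)  # mercury contact angle (a common assumption, 130-140 deg)

def throat_diameter_nm(pressure_pa):
    """Equivalent cylindrical throat diameter (nm) intruded at pressure P (Pa)."""
    d_m = -4 * GAMMA * math.cos(THETA) / pressure_pa
    return d_m * 1e9

# Hypothetical intrusion pressures spanning a chalk-like range (MPa)
for p_mpa in [1, 10, 100, 400]:
    print(p_mpa, "MPa ->", round(throat_diameter_nm(p_mpa * 1e6), 1), "nm")
```

    Because smaller throats guard larger pore bodies, the 3D imaging and lattice-Boltzmann drainage simulations described above supply the topological information that this simple interpretation lacks.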

  8. A Comparison and Evaluation of Real-Time Software Systems Modeling Languages

    NASA Technical Reports Server (NTRS)

    Evensen, Kenneth D.; Weiss, Kathryn Anne

    2010-01-01

    A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.

  9. Seismic Wave Propagation in Fully Anisotropic Axisymmetric Media: Applications and Practical Considerations

    NASA Astrophysics Data System (ADS)

    van Driel, Martin; Nissen-Meyer, Tarje; Stähler, Simon; Waszek, Lauren; Hempel, Stefanie; Auer, Ludwig; Deuss, Arwen

    2014-05-01

    We present a numerical method to compute high-frequency 3D elastic waves in fully anisotropic axisymmetric media. The method is based on a decomposition of the wavefield into a series of uncoupled 2D equations, for which the dependence of the wavefield on the azimuth can be solved analytically. The remaining 2D problems are then solved using a spectral element method (AxiSEM). AxiSEM was recently published open-source (Nissen-Meyer et al. 2014) as a production-ready code capable of computing global seismic wave propagation up to frequencies of ~2 Hz. It accurately models visco-elastic dissipation and anisotropy (van Driel et al., submitted to GJI) and runs efficiently on HPC resources using up to 10K cores. At very short periods, the Fresnel zone of body waves is narrow and sensitivity is focused around the geometrical ray. In cases where the azimuthal variations of structural heterogeneity exhibit long spatial wavelengths, so-called 2.5D simulations (3D wavefields in 2D models) provide a good approximation. In AxiSEM, two-dimensional variations in the source-receiver plane are effectively modelled as ring-like structures extending in the out-of-plane direction. In contrast to ray theory, which is widely used in high-frequency applications, AxiSEM provides complete waveforms, thus giving access to frequency dependence, amplitude variations, and peculiar wave effects such as diffraction and caustics. Here we focus on the practical implications of the inherent axisymmetric geometry and show how the 2.5D features of our method can be used to model realistic anisotropic structures, by applying it to problems such as the D" region and the inner core.
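
    The analytic treatment of azimuth rests on expanding the wavefield in azimuthal Fourier modes, f(φ) = Σ_m c_m e^{imφ}, each of which satisfies an independent 2D equation. A minimal sketch of that decomposition and of its exactness for a band-limited field (the sampled field here is hypothetical):

```python
import cmath
import math

# Sample a field f(phi) on an azimuthal ring and expand it in Fourier modes:
# f(phi) = sum_m c_m * exp(i*m*phi); each mode m evolves independently in 2D.
N = 16
phis = [2 * math.pi * k / N for k in range(N)]
f = [3.0 + 2.0 * math.cos(p) + 0.5 * math.sin(2 * p) for p in phis]  # modes 0, ±1, ±2

def mode(m):
    """Azimuthal Fourier coefficient c_m by discrete transform over the ring."""
    return sum(f[k] * cmath.exp(-1j * m * phis[k]) for k in range(N)) / N

# Reconstruct the field from the low-order modes alone: exact for band-limited f
recon = [sum((mode(m) * cmath.exp(1j * m * p)).real for m in range(-2, 3))
         for p in phis]
err = max(abs(a - b) for a, b in zip(f, recon))
print(err < 1e-9)
```

    In the full method each c_m(s, z) is propagated by its own 2D spectral-element solve; the sketch only shows why the azimuthal dependence separates cleanly.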

  10. Modeling Particle Exposure in US Trucking Terminals

    PubMed Central

    Davis, ME; Smith, TJ; Laden, F; Hart, JE; Ryan, LM; Garshick, E

    2007-01-01

    Multi-tiered sampling approaches are common in environmental and occupational exposure assessment, where exposures for a given individual are often modeled based on simultaneous measurements taken at multiple indoor and outdoor sites. The monitoring data from such studies are hierarchical by design, imposing a complex covariance structure that must be accounted for in order to obtain unbiased estimates of exposure. Statistical methods such as structural equation modeling (SEM) represent a useful alternative to simple linear regression in these cases, providing simultaneous and unbiased predictions of each level of exposure based on a set of covariates specific to the exposure setting. We test the SEM approach using data from a large exposure assessment of diesel and combustion particles in the US trucking industry. The exposure assessment includes data from 36 different trucking terminals across the United States sampled between 2001 and 2005, measuring PM2.5 and its elemental carbon (EC) and organic carbon (OC) components by personal monitoring and by sampling at two indoor work locations and an outdoor “background” location. Using the SEM method, we predict: 1) personal exposures as a function of work-related exposure and smoking status; 2) work-related exposure as a function of terminal characteristics, indoor ventilation, job location, and background exposure conditions; and 3) background exposure conditions as a function of weather, nearby source pollution, and other regional differences across terminal sites. The primary advantage of SEMs in this setting is the ability to simultaneously predict exposures at each of the sampling locations while accounting for the complex covariance structure among the measurements and descriptive variables. The statistically significant results and high R2 values observed in the trucking industry application support the broader use of this approach in exposure assessment modeling. PMID:16856739
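
    The chained structure that the SEM exploits (background exposure drives work-area exposure, which drives personal exposure) can be illustrated with a toy path analysis on synthetic data. The real analysis fits all equations simultaneously and handles the hierarchical covariance; this sketch, with hypothetical units and coefficients, only shows the path-chaining idea.

```python
import random

random.seed(0)

def ols_slope(x, y):
    """Slope of a simple least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Synthetic exposure chain (hypothetical values):
# background EC -> work-area EC -> personal EC
background = [random.uniform(0.5, 2.0) for _ in range(200)]
work_area = [1.5 * b + random.gauss(0, 0.2) for b in background]
personal = [0.8 * w + random.gauss(0, 0.2) for w in work_area]

# Estimate each structural path (a full SEM fits both equations jointly)
b1 = ols_slope(background, work_area)  # background -> work-area path
b2 = ols_slope(work_area, personal)    # work-area -> personal path

# Indirect effect of background on personal exposure = product of the paths
print(round(b1, 2), round(b2, 2), round(b1 * b2, 2))
```

    The product-of-paths rule for indirect effects is what lets the fitted model trace how a change in background conditions propagates through to personal exposure.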

  11. Nuclear forensics of a non-traditional sample: Neptunium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, Jamie L.; Schwartz, Daniel; Tandon, Lav

    Recent nuclear forensics cases have focused primarily on plutonium (Pu) and uranium (U) materials. By definition however, nuclear forensics can apply to any diverted nuclear material. This includes neptunium (Np), an internationally safeguarded material like Pu and U, that could offer a nuclear security concern if significant quantities were found outside of regulatory control. This case study couples scanning electron microscopy (SEM) with quantitative analysis using newly developed specialized software, to evaluate a non-traditional nuclear forensic sample of Np. Here, the results of the morphological analyses were compared with another Np sample of known pedigree, as well as other traditional actinide materials in order to determine potential processing and point-of-origin.

  12. Nuclear forensics of a non-traditional sample: Neptunium

    DOE PAGES

    Doyle, Jamie L.; Schwartz, Daniel; Tandon, Lav

    2016-05-16

    Recent nuclear forensics cases have focused primarily on plutonium (Pu) and uranium (U) materials. By definition however, nuclear forensics can apply to any diverted nuclear material. This includes neptunium (Np), an internationally safeguarded material like Pu and U, that could offer a nuclear security concern if significant quantities were found outside of regulatory control. This case study couples scanning electron microscopy (SEM) with quantitative analysis using newly developed specialized software, to evaluate a non-traditional nuclear forensic sample of Np. Here, the results of the morphological analyses were compared with another Np sample of known pedigree, as well as other traditional actinide materials in order to determine potential processing and point-of-origin.

  13. Structure and optical properties of TiO2 thin films deposited by ALD method

    NASA Astrophysics Data System (ADS)

    Szindler, Marek; Szindler, Magdalena M.; Boryło, Paulina; Jung, Tymoteusz

    2017-12-01

    This paper presents the results of a study on titanium dioxide thin films prepared by the atomic layer deposition method on a silicon substrate. Changes in surface morphology were observed in topographic images acquired with an atomic force microscope (AFM) and a scanning electron microscope (SEM). Roughness parameters were calculated with XEI Park Systems software. Qualitative studies of the chemical composition were also performed using an energy dispersive spectrometer (EDS). The structure of the titanium dioxide was investigated by X-ray crystallography, and the crystalline phase of TiO2 was also confirmed using a Raman spectrometer. The optical reflection spectra were measured with UV-Vis spectrophotometry.

  14. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. 
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  15. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems, the software engineering process has to be improved in at least two aspects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can be partially automated. The basic idea behind the presented study was to start from a formal model (e.g., state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
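
    Generating abstract test cases from a state machine can be sketched as a breadth-first search that extends event sequences until every transition is exercised at least once (transition coverage). The mode machine below is hypothetical, not the on-board software from the study, and real model-based testing tools use richer models and coverage criteria.

```python
from collections import deque

# Hypothetical on-board-software mode machine: state -> {event: next_state}
fsm = {
    "OFF":     {"power_on": "STANDBY"},
    "STANDBY": {"activate": "NOMINAL", "power_off": "OFF"},
    "NOMINAL": {"fault": "SAFE", "deactivate": "STANDBY"},
    "SAFE":    {"recover": "STANDBY"},
}

def transition_cover(start):
    """Breadth-first search emitting one abstract test per newly covered transition."""
    uncovered = {(s, e) for s, events in fsm.items() for e in events}
    tests, queue = [], deque([(start, [])])
    while uncovered and queue:
        state, path = queue.popleft()
        for event, nxt in fsm[state].items():
            if (state, event) in uncovered:
                uncovered.discard((state, event))
                tests.append(path + [event])  # abstract test: event sequence from start
            if len(path) < 8:  # simple depth bound to keep the search finite
                queue.append((nxt, path + [event]))
    return tests

suite = transition_cover("OFF")
for case in suite:
    print("->".join(case))
```

    Each abstract test would then be concretized into an executable input/expected-output pair, which is the step the study evaluates for cost-efficiency and fault-finding effectiveness.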

  16. Service quality, trust, and patient satisfaction in interpersonal-based medical service encounters.

    PubMed

    Chang, Ching-Sheng; Chen, Su-Yueh; Lan, Yi-Ting

    2013-01-16

    Interaction between service provider and customer is the primary core of service businesses of different natures, and the influence of trust on service quality and customer satisfaction cannot be ignored in interpersonal-based service encounters. However, the lack of existing literature on the correlation between service quality, patient trust, and satisfaction from the perspective of interpersonal-based medical service encounters has created a research gap in previous studies. Therefore, this study attempts to bridge such a gap with an evidence-based practice study. We adopted a cross-sectional design using a questionnaire survey of outpatients in seven medical centers of Taiwan. Three hundred and fifty copies of the questionnaire were distributed, and 285 valid copies were retrieved, a valid response rate of 81.43%. The SPSS 14.0 and AMOS 14.0 (structural equation modeling) statistical software packages were used for analysis. Structural equation modeling clarifies the extent of relationships between variables as well as the chain of cause and effect; SEM results do not merely show empirical relationships between variables, but help define the practical situation. For this reason, SEM was used to test the hypotheses. Perception of interpersonal-based medical service encounters positively influences service quality and patient satisfaction. Perception of service quality among patients positively influences their trust. Perception of trust among patients positively influences their satisfaction. 
According to the findings, interpersonal-based medical service encounters positively influence service quality and patient satisfaction, and differences in patients' perceptions of the professional skill and communication attitude of personnel in such encounters influence patients' overall satisfaction in two ways: (A) the interpersonal-based medical service encounter directly affects patient satisfaction, a direct effect; and (B) service quality and patient trust act as intervening variables affecting patient satisfaction, an indirect effect. Given the differences in scale, resources, and costs among medical institutions of different levels, understanding customers' demands and preferences and adopting sound marketing concepts under intense competition is a most urgent issue for satisfying the public and building a competitive edge for medical institutions.

  17. Software dependability in the Tandem GUARDIAN system

    NASA Technical Reports Server (NTRS)

    Lee, Inhwan; Iyer, Ravishankar K.

    1995-01-01

    Based on extensive field failure data for Tandem's GUARDIAN operating system, this paper evaluates the dependability of operational software. The software faults considered are major defects that result in processor failures and invoke backup processes to take over. The paper categorizes the underlying causes of software failures and evaluates the effectiveness of the process-pair technique in tolerating software faults. A model describing the impact of software faults on the reliability of the overall system is proposed and used to evaluate the significance of key factors that determine software dependability and to identify areas for improvement. An analysis of the data shows that about 77% of processor failures initially attributed to software are confirmed as software problems. The analysis also shows that the use of process pairs to provide checkpointing and restart (originally intended for tolerating hardware faults) allows the system to tolerate about 75% of reported software faults that result in processor failures. The loose coupling between processors, which results in the backup execution (the processor state and the sequence of events) differing from the original execution, is a major reason for the measured software fault tolerance. Over two-thirds (72%) of measured software failures are recurrences of previously reported faults. Modeling based on the data shows that, in addition to reducing the number of software faults, software dependability can be enhanced by reducing the recurrence rate.

  18. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
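
    The modular idea, interchangeable process options sharing a common interface and assembled into one simulation loop, can be sketched as follows. The component formulas and parameter values here are simplified placeholders (a toy PET option, degree-day snowmelt, and a linear-reservoir runoff), not the package's actual algorithms.

```python
# Interchangeable process components: each stage is a function with a common signature
def pet_simple(temp_c):
    """Toy potential-ET option (placeholder, not a published PET formula)."""
    return max(0.0, 0.55 * (temp_c / 10.0))

def snow_degree_day(temp_c, swe, ddf=3.0, t_melt=0.0):
    """Degree-day melt: melt = ddf * max(T - T_melt, 0), capped by available SWE."""
    melt = min(swe, ddf * max(0.0, temp_c - t_melt))
    return melt, swe - melt

def simulate(temps, precip, pet=pet_simple, snow=snow_degree_day):
    """Compose the selected components into a daily water-balance loop."""
    swe, storage, flows = 0.0, 10.0, []
    for t, p in zip(temps, precip):
        if t <= 0.0:
            swe += p           # precipitation falls as snow and accumulates
            rain = 0.0
        else:
            rain = p
        melt, swe = snow(t, swe)
        storage += rain + melt - min(storage, pet(t))
        q = 0.1 * storage      # linear-reservoir runoff
        storage -= q
        flows.append(q)
    return flows

flows = simulate([-5, -2, 1, 4, 8], [5, 3, 0, 2, 0])
print([round(q, 2) for q in flows])
```

    Swapping in a different PET or snow option is just passing a different function, which is the kind of assembly-by-stages the modular system automates behind its graphical interface.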

  19. Evaluating the contribution of genetics and familial shared environment to common disease using the UK Biobank.

    PubMed

    Muñoz, María; Pong-Wong, Ricardo; Canela-Xandri, Oriol; Rawlik, Konrad; Haley, Chris S; Tenesa, Albert

    2016-09-01

    Genome-wide association studies have detected many loci underlying susceptibility to disease, but most of the genetic factors that contribute to disease susceptibility remain unknown. Here we provide evidence that part of the 'missing heritability' can be explained by an overestimation of heritability. We estimated the heritability of 12 complex human diseases using family history of disease in 1,555,906 individuals of white ancestry from the UK Biobank. Estimates using simple family-based statistical models were inflated on average by ∼47% when compared with those from structural equation modeling (SEM), which specifically accounted for shared familial environmental factors. In addition, heritabilities estimated using SNP data explained an average of 44.2% of the simple family-based estimates across diseases and an average of 57.3% of the SEM-estimated heritabilities, accounting for almost all of the SEM heritability for hypertension. Our results show that both genetics and familial environment make substantial contributions to familial clustering of disease.

  20. Mapping temporal dynamics in social interactions with unified structural equation modeling: A description and demonstration revealing time-dependent sex differences in play behavior

    PubMed Central

    Beltz, Adriene M.; Beekman, Charles; Molenaar, Peter C. M.; Buss, Kristin A.

    2013-01-01

    Developmental science is rich with observations of social interactions, but few available methodological and statistical approaches take full advantage of the information provided by these data. The authors propose implementation of the unified structural equation model (uSEM), a network analysis technique, for observational data coded repeatedly across time; uSEM captures the temporal dynamics underlying changes in behavior at the individual level by revealing the ways in which a single person influences – concurrently and in the future – other people. To demonstrate the utility of uSEM, the authors applied it to ratings of positive affect and vigor of activity during children’s unstructured laboratory play with unfamiliar, same-sex peers. Results revealed the time-dependent nature of sex differences in play behavior. For girls more than boys, positive affect was dependent upon peers’ prior positive affect. For boys more than girls, vigor of activity was dependent upon peers’ current vigor of activity. PMID:24039386
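
    The lagged (time-dependent) paths that uSEM adds to a conventional SEM can be illustrated by estimating a single cross-lagged coefficient, how strongly a child's current behavior depends on the peer's prior behavior, from synthetic dyad data. This sketch estimates one path by OLS; a full uSEM fits contemporaneous and lagged paths for all variables jointly.

```python
import random

random.seed(1)

# Synthetic dyad time series: child's affect depends on the peer's PRIOR affect
T = 300
peer = [random.gauss(0, 1) for _ in range(T)]
child = [0.0]
for t in range(1, T):
    child.append(0.6 * peer[t - 1] + random.gauss(0, 0.3))  # true lagged path = 0.6

# Estimate the lag-1 cross-path by OLS: regress child[t] on peer[t-1]
x, y = peer[:-1], child[1:]
mx, my = sum(x) / len(x), sum(y) / len(y)
beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
        / sum((a - mx) ** 2 for a in x))
print(round(beta, 2))
```

    Recovering a coefficient like this for each directed pair, at lag 0 and lag 1, yields the person-specific network of concurrent and future influences the article describes.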

  1. RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.

    PubMed

    Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z

    2017-04-01

    We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses maximum likelihood estimation to obtain parameter estimates, under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written in the statistical software R, and the underlying computations are performed by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated with an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on the human lung cancer cell line A549 after ionizing radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
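
    The core computation, maximum likelihood fitting of the linear-quadratic survival model S(D) = exp(-(αD + βD²)) to Poisson-distributed colony counts, can be sketched with a crude grid search. The assay data below are hypothetical, and RAD-ADAPT itself performs proper optimization through ADAPT rather than this grid.

```python
import math

# Hypothetical clonogenic assay: dose (Gy), cells plated, colonies counted
doses  = [0, 1, 2, 4, 6, 8]
plated = [100, 100, 200, 500, 2000, 10000]
counts = [55, 45, 60, 70, 80, 50]

PE = counts[0] / plated[0]  # plating efficiency from the unirradiated control

def loglik(alpha, beta):
    """Poisson log-likelihood of the colony counts under the LQ survival model."""
    ll = 0.0
    for d, n, c in zip(doses, plated, counts):
        mu = n * PE * math.exp(-(alpha * d + beta * d * d))  # expected colonies
        ll += c * math.log(mu) - mu - math.lgamma(c + 1)
    return ll

# Crude grid-search MLE over (alpha, beta); a real tool uses a proper optimizer
grid_a = [i / 200 for i in range(1, 101)]  # alpha in 0.005 .. 0.5 per Gy
grid_b = [i / 500 for i in range(1, 101)]  # beta  in 0.002 .. 0.2 per Gy^2
alpha, beta = max(((a, b) for a in grid_a for b in grid_b),
                  key=lambda ab: loglik(*ab))
print(round(alpha, 3), round(beta, 3))
```

    Treating the counts as Poisson, rather than least-squares fitting log survival, is what lets high-dose points with few colonies carry appropriately reduced weight.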

  2. Using UML Modeling to Facilitate Three-Tier Architecture Projects in Software Engineering Courses

    ERIC Educational Resources Information Center

    Mitra, Sandeep

    2014-01-01

    This article presents the use of a model-centric approach to facilitate software development projects conforming to the three-tier architecture in undergraduate software engineering courses. Many instructors intend that such projects create software applications for use by real-world customers. While it is important that the first version of these…

  3. 75 FR 30387 - Improving Market and Planning Efficiency Through Improved Software; Notice of Agenda and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-01

    ... Market and Planning Efficiency Through Improved Software; Notice of Agenda and Procedures for Staff... planning models and software. The technical conference will be held from 8 a.m. to 5:30 p.m. (EDT) on June.... Agenda for AD10-12 Staff Technical Conference on Planning Models and Software Federal Energy Regulatory...

  4. The Emergence of Open-Source Software in North America

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…

  5. The Evolution of Software Pricing: From Box Licenses to Application Service Provider Models.

    ERIC Educational Resources Information Center

    Bontis, Nick; Chung, Honsan

    2000-01-01

    Describes three different pricing models for software. Findings of this case study support the proposition that software pricing is a complex and subjective process. The key determinant of alignment between vendor and user is the nature of value in the software to the buyer. This value proposition may range from increased cost reduction to…

  6. A structural equation modelling approach to explore the role of B vitamins and immune markers in lung cancer risk.

    PubMed

    Baltar, Valéria Troncoso; Xun, Wei W; Johansson, Mattias; Ferrari, Pietro; Chuang, Shu-Chun; Relton, Caroline; Ueland, Per Magne; Midttun, Øivind; Slimani, Nadia; Jenab, Mazda; Clavel-Chapelon, Françoise; Boutron-Ruault, Marie-Christine; Fagherazzi, Guy; Kaaks, Rudolf; Rohrmann, Sabine; Boeing, Heiner; Weikert, Cornelia; Bueno-de-Mesquita, Bas; Boshuizen, Hendriek; van Gils, Carla H; Onland-Moret, N Charlotte; Agudo, Antonio; Barricarte, Aurelio; Navarro, Carmen; Rodríguez, Laudina; Castaño, José Maria Huerta; Larrañaga, Nerea; Khaw, Kay-Tee; Wareham, Nick; Allen, Naomi E; Crowe, Francesca; Gallo, Valentina; Norat, Teresa; Krogh, Vittorio; Masala, Giovanna; Panico, Salvatore; Sacerdote, Carlotta; Tumino, Rosario; Trichopoulou, Antonia; Lagiou, Pagona; Trichopoulos, Dimitrios; Rasmuson, Torgny; Hallmans, Göran; Roswall, Nina; Tjønneland, Anne; Riboli, Elio; Brennan, Paul; Vineis, Paolo

    2013-08-01

    The one-carbon metabolism (OCM) is considered key in maintaining DNA integrity and regulating gene expression, and may be involved in the process of carcinogenesis. Several B vitamins and amino acids have been implicated in lung cancer risk, via the OCM directly as well as via immune system activation. However, it is unclear whether these factors act independently or through complex mechanisms. The current study applies structural equation modelling (SEM) to further disentangle the mechanisms involved in lung carcinogenesis. SEM allows simultaneous estimation of linear relations in which a variable can be the outcome in one equation and the predictor in another, as well as estimation using latent variables (factors estimated from a correlation matrix). A large number of biomarkers were analysed from 891 lung cancer cases and 1,747 controls nested within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort. Four putative mechanisms in the OCM and immunity were investigated in relation to lung cancer risk: methionine-homocysteine metabolism, the folate cycle, transsulfuration, and mechanisms involved in inflammation and immune activation, all adjusted for tobacco exposure. The hypothesized SEM model confirmed a direct and protective effect for factors representing methionine-homocysteine metabolism (p = 0.020) and immune activation (p = 0.021), and an indirect protective effect of the folate cycle (p = 0.019), after adjustment for tobacco smoking. In conclusion, our results show that when investigating the involvement of the OCM, the folate cycle and the immune system in lung carcinogenesis, it is important to consider complex pathways (by applying SEM) rather than the effects of single vitamins or nutrients (e.g. using traditional multiple regression). In our study, SEM suggested a greater role for methionine-homocysteine metabolism and immune activation than for the other potential mechanisms.

  7. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

When users install or download software, they are presented with a lengthy document stating rights and obligations, which few people have the patience to read or understand. This can make users distrust the software. In this paper, we propose an ontology-based trust verification for software license agreements. First, this work proposes an ontology model for the domain of software license agreements. The domain ontology is constructed by the proposed methodology according to copyright law and 30 software license agreements. The License Ontology can act as part of a generalized copyright-law knowledge model, and can also serve as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it improves the accuracy of summarizing software licenses. Based on the summarization, the underlying purpose of a software license can be explicitly explored for trust verification.

  8. Structural Equation Modeling: Applications in ecological and evolutionary biology research

    USGS Publications Warehouse

    Pugesek, Bruce H.; von Eye, Alexander; Tomer, Adrian

    2003-01-01

This book presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems. Supplementary information can be found at the author's website, http://www.jamesbgrace.com/. • Details why multivariate analyses should be used to study ecological systems • Exposes unappreciated weaknesses in many current popular analyses • Emphasizes the future methodological developments needed to advance our understanding of ecological systems.

  9. Does Depressive Affect Mediate the Relationship between Self-Care Capacity and Nutritional Status Among Rural Older Adults? : A Structural Equation Modeling Approach.

    PubMed

    Jung, Seung Eun; Bishop, Alex J; Kim, Minjung; Hermann, Janice; Kim, Giyeon; Lawrence, Jeannine

    2017-01-01

This study examined the effects of self-care capacity and depressive affect on nutritional status, and whether depressive affect mediated the relationship between self-care capacity and nutritional status. A convenience sample of 171 rural community-dwelling older adults, 65 years and above, participated. Structural equation modeling (SEM) was conducted to test a mediation model. The hypothesized SEM model was supported with adequate fit (χ²(1) = 1.87, p = 0.17; CFI = 0.94; RMSEA = 0.07; SRMR = 0.03). SEM analysis revealed a significant positive direct effect of self-care capacity on nutritional status (γ = 0.14, p = 0.042). Significant negative direct effects were observed for self-care capacity on depressive affect (γ = -0.15, p = 0.027) and for depressive affect on nutritional status (β = -0.27, p < 0.01). Depressive affect was also observed to partially mediate the relationship between self-care capacity and nutritional status (γ = 0.04, p = 0.046). Findings highlight the importance of emotional well-being for rural older adults' nutritional status, particularly those with decreased ability to engage in self-care practices.
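For readers unfamiliar with mediation models of this kind, the decomposition can be sketched numerically: the indirect effect is the product of the self-care → depressive-affect path (a) and the depressive-affect → nutrition path (b), and in linear OLS the total effect equals direct + indirect exactly. The sketch below uses simulated data with invented coefficients loosely echoing the reported estimates; it is not the study's data or software.

```python
# Mediation sketch on synthetic data (all values illustrative):
# self_care -> depressive (path a), depressive -> nutrition (path b),
# plus a direct path self_care -> nutrition. Indirect effect = a*b.
import numpy as np

rng = np.random.default_rng(0)
n = 171                                        # same n as the study, data invented
self_care = rng.normal(size=n)
depressive = -0.15 * self_care + rng.normal(size=n)
nutrition = 0.14 * self_care - 0.27 * depressive + rng.normal(size=n)

def ols(X, y):
    """OLS coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

a = ols(self_care, depressive)[1]              # self-care -> depressive affect
_, direct, b = ols(np.column_stack([self_care, depressive]), nutrition)
indirect = a * b                               # mediated (indirect) effect
total = direct + indirect                      # equals slope of nutrition ~ self_care
```

In a full SEM these paths are estimated simultaneously, with fit indices such as CFI and RMSEA; the regression-based decomposition above only illustrates the direct/indirect logic.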

  10. Estimation of Bid Curves in Power Exchanges using Time-varying Simultaneous-Equations Models

    NASA Astrophysics Data System (ADS)

    Ofuji, Kenta; Yamaguchi, Nobuyuki

Simultaneous-equations models (SEM) are generally used in economics to estimate interdependent endogenous variables, such as price and quantity, in a competitive equilibrium market. In this paper, we apply SEM to the JEPX (Japan Electric Power eXchange) spot market, a single-price auction market, using the publicly available data on selling and buying bid volumes, system price and traded quantity. The aim of this analysis is to understand the magnitude of the influences of the selling and buying bids on the auctioned price and quantity, rather than to forecast prices and quantity for risk management purposes. In contrast to Ordinary Least Squares (OLS) estimation, whose results represent time-independent average values, we employ a time-varying simultaneous-equations model (TV-SEM) to capture structural changes inherent in those influences, using state space models with stepwise Kalman filter estimation. The results showed that the buying bid volume has the highest magnitude of influence among the factors considered, exhibiting time-dependent changes ranging as widely as about 240% of its average. The slope of the supply curve also varies across time, implying an elastic supply, while the demand curve remains comparatively inelastic and stable over time.
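The time-varying coefficient idea can be illustrated with a minimal Kalman filter in which each regression coefficient follows a random walk. This is only one ingredient of the TV-SEM described above (a single equation, with invented data and noise levels), not the authors' full simultaneous-equations specification.

```python
# Minimal time-varying regression via Kalman filtering (illustrative only):
# state beta_t follows a random walk, observation y_t = x_t . beta_t + e_t.
import numpy as np

def kalman_tv_coeffs(X, y, q=1e-3, r=1.0):
    """Filtered estimates of time-varying coefficients beta_t.
    q: random-walk (state) variance, r: observation noise variance."""
    n, k = X.shape
    beta = np.zeros(k)
    P = np.eye(k)                       # state covariance
    betas = np.empty((n, k))
    for t in range(n):
        x = X[t]
        P = P + q * np.eye(k)           # predict: random-walk drift
        S = x @ P @ x + r               # innovation variance
        K = P @ x / S                   # Kalman gain
        beta = beta + K * (y[t] - x @ beta)   # update with residual
        P = P - np.outer(K, x @ P)
        betas[t] = beta
    return betas
```

With q = 0 the filter reduces to recursive least squares, converging to the time-independent OLS-style estimate; q > 0 lets the coefficients drift, which is what exposes the structural changes discussed in the abstract.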

  11. Avoidable Software Procurements

    DTIC Science & Technology

    2012-09-01

software license, software usage, ELA, Software as a Service, SaaS, Software Asset...PaaS Platform as a Service, SaaS Software as a Service, SAM Software Asset Management, SMS System Management Server, SEWP Solutions for Enterprise Wide...delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service. Software

  12. An alternative model to distribute VO software to WLCG sites based on CernVM-FS: a prototype at PIC Tier1

    NASA Astrophysics Data System (ADS)

    Lanciotti, E.; Merino, G.; Bria, A.; Blomer, J.

    2011-12-01

In a distributed computing model such as WLCG, experiment-specific application software has to be efficiently distributed to every site of the Grid. Application software is currently installed in a shared area of the site, visible to all Worker Nodes (WNs) through some protocol (NFS, AFS or other). The software is installed at the site by jobs which run on a privileged node of the computing farm where the shared area is mounted in write mode. This model presents several drawbacks which cause a non-negligible rate of job failures. An alternative model for software distribution based on the CERN Virtual Machine File System (CernVM-FS) has been tried at PIC, the Spanish Tier1 site of WLCG. The test bed used and the results are presented in this paper.

  13. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  14. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
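As a rough illustration of the two data structures named above, the hypothetical sketch below walks an identification tree over a design element and looks up mitigations for each matched threat. Node names, predicates, and the flat mitigation table are invented; AutSEC's actual trees encode far richer threat knowledge derived from data flow diagrams.

```python
# Hypothetical sketch of an identification tree (predicates over a design
# element) paired with a mitigation lookup. All names/rules are invented.
from dataclasses import dataclass, field

@dataclass
class IdNode:
    """Identification-tree node: a named predicate over a design element."""
    name: str
    test: callable
    children: list = field(default_factory=list)

def identify(node, element, found=None):
    """Depth-first walk: collect names of all matching nodes."""
    if found is None:
        found = []
    if node.test(element):
        found.append(node.name)
        for child in node.children:
            identify(child, element, found)
    return found

MITIGATIONS = {  # flat stand-in for a mitigation tree
    "external-input": ["validate input", "authenticate source"],
    "unencrypted-channel": ["use TLS"],
}

tree = IdNode("external-input", lambda e: e.get("external", False),
              [IdNode("unencrypted-channel", lambda e: not e.get("tls", True))])

element = {"name": "login endpoint", "external": True, "tls": False}
threats = identify(tree, element)
advice = {t: MITIGATIONS[t] for t in threats}
```

A real implementation would also weigh specification requirements and mitigation cost when ranking the advice, as the paper describes.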

  15. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 1: Concepts and activity descriptions

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).

  16. The Capabilities and Applications of FY-3A/B SEM on Monitoring Space Weather Events

    NASA Astrophysics Data System (ADS)

    Huang, C.; Li, J.; Yu, T.; Xue, B.; Wang, C.; Zhang, X.; Cao, G.; Liu, D.; Tang, W.

    2012-12-01

The Space Environment Monitor (SEM), on board the Chinese meteorological satellites FengYun-3A/B, measures proton flux in the 3-300 MeV energy range and electron flux in the 0.15-5.7 MeV energy range. SEM can also detect heavy ion composition, satellite surface potential, the radiation dose in sensors, and single events. The space environment information derived from SEM can be used for satellite security design, scientific studies, development of radiation belt models, and space weather monitoring and disaster warning. In this study, the SEM instrument characteristics are introduced and the post-launch calibration algorithm is presented. Applications in monitoring space weather events and the service for manned spaceflights are also demonstrated. Protons with particle energy over 10 MeV are called "killer particles"; these particles may damage the satellite and cause disruption of the satellite's systems. The proton flux in the 10-26 MeV energy band reached 5000 in the SPE caused by a solar flare with CME during the period 2012.01.23 to 2012.01.27, as shown in the figure. (Figure: comparisons of heavy ions, 2010.11.11-2010.12.15.)

  17. Effect of Layer Thickness and Printing Orientation on Mechanical Properties and Dimensional Accuracy of 3D Printed Porous Samples for Bone Tissue Engineering

    PubMed Central

    Farzadi, Arghavan; Solati-Hashjin, Mehran; Asadi-Eydivand, Mitra; Abu Osman, Noor Azuan

    2014-01-01

The powder-based inkjet 3D printing method is one of the most attractive solid freeform techniques. It involves a sequential layering process through which 3D porous scaffolds can be directly produced from computer-generated models. The quality of 3D printed products is controlled by the build parameters. In this study, calcium sulfate based powders were used for porous scaffold fabrication. The printed scaffolds, with 0.8 mm pore size and different layer thicknesses and printing orientations, were subjected to a depowdering step. The effects of four layer thicknesses and of printing orientation (parallel to X, Y and Z) on the physical and mechanical properties of the printed scaffolds were investigated. It was observed that the compressive strength, toughness and Young's modulus of samples with 0.1125 and 0.125 mm layer thickness were higher than those of the others. Furthermore, the results of SEM and μCT analyses showed that samples with 0.1125 mm layer thickness printed in the X direction have the highest dimensional accuracy and are significantly closer to the CAD software based designs with predefined pore size, porosity and pore interconnectivity. PMID:25233468

  18. Development of pH sensitive microparticles of Karaya gum: By response surface methodology.

    PubMed

    Raizaday, Abhay; Yadav, Hemant K S; Kumar, S Hemanth; Kasina, Susmitha; Navya, M; Tashi, C

    2015-12-10

    The objective of the proposed work was to prepare pH sensitive microparticles (MP) of Karaya gum using distilled water as a solvent by spray drying technique. Different formulations were designed, prepared and evaluated by employing response surface methodology and optimal design of experiment technique using Design Expert(®) ver 8.0.1 software. SEM photographs showed that MP were roughly spherical in shape and free from cracks. The particle size and encapsulation efficiency for optimized MP was found to be between 3.89 and 6.5 μm and 81-94% respectively with good flow properties. At the end of the 12th hour the in vitro drug release was found to be 96.9% for the optimized formulation in pH 5.6 phosphate buffer. Low prediction errors were observed for Cmax and AUC0-∞ which demonstrated that the Frusemide IVIVC model was valid. Hence it can be concluded that pH sensitive MP of Karaya gum were effectively prepared by spray drying technique using aqueous solvents and can be used for treating various diseases like chronic hypertension, Ulcerative Colitis and Diverticulitis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Framework programmable platform for the advanced software development workstation. Integration mechanism design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated.

  20. An approach to developing user interfaces for space systems

    NASA Astrophysics Data System (ADS)

    Shackelford, Keith; McKinney, Karen

    1993-08-01

Inherent weaknesses in the traditional waterfall model of software development have led to the definition of the spiral model. The spiral model lifecycle, however, has not been applied to NASA projects. This paper describes its use in developing real-time user interface software for an Environmental Control and Life Support System (ECLSS) Process Control Prototype at NASA's Marshall Space Flight Center.

  1. Investing in Software Sustainment

    DTIC Science & Technology

    2015-04-30

colored arrows simply represent a reinforcing loop called the "Bandwagon Effect". This effect simply means that a series of successful missions will...the Software Engineering Institute (SEI) developed a simulation model for analyzing the effects of changes in demand for software sustainment and the corresponding funding decisions. The model

  2. Simplifying the interaction between cognitive models and task environments with the JSON Network Interface.

    PubMed

    Hope, Ryan M; Schoelles, Michael J; Gray, Wayne D

    2014-12-01

    Process models of cognition, written in architectures such as ACT-R and EPIC, should be able to interact with the same software with which human subjects interact. By eliminating the need to simulate the experiment, this approach would simplify the modeler's effort, while ensuring that all steps required of the human are also required by the model. In practice, the difficulties of allowing one software system to interact with another present a significant barrier to any modeler who is not also skilled at this type of programming. The barrier increases if the programming language used by the modeling software differs from that used by the experimental software. The JSON Network Interface simplifies this problem for ACT-R modelers, and potentially, modelers using other systems.
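The general pattern the JNI simplifies can be sketched as newline-delimited JSON exchanged over a TCP socket between a model process and a task environment. The message fields below ("action", "percept", "in-reply-to") are invented for illustration; the actual JNI defines its own message schema.

```python
# Toy model/environment exchange over a loopback socket, with newline-
# delimited JSON messages. Message fields are invented for illustration.
import json, socket, threading

def task_environment(server_sock):
    """Toy environment: answer one model action with a percept."""
    conn, _ = server_sock.accept()
    with conn, conn.makefile("rw") as f:
        msg = json.loads(f.readline())          # e.g. {"action": "click"}
        f.write(json.dumps({"percept": "button-pressed",
                            "in-reply-to": msg["action"]}) + "\n")
        f.flush()

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=task_environment, args=(server,), daemon=True).start()

model = socket.create_connection(server.getsockname())  # the "cognitive model"
with model, model.makefile("rw") as f:
    f.write(json.dumps({"action": "click"}) + "\n")
    f.flush()
    reply = json.loads(f.readline())
```

Because both sides speak only JSON over a socket, the model and the experiment software can be written in different languages, which is the barrier the abstract describes.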

  3. Testing Software Development Project Productivity Model

    NASA Astrophysics Data System (ADS)

    Lipkin, Ilya

Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in estimating software development, and existing estimation models often underestimate software development effort by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with the resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and a theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on the specific constructs of interest that provide the best value for the least amount of time. This study outlines the key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, these being the constructs with the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers.
Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains, such as IT, Command and Control, and Simulation. This research validates findings from previous work concerning software project productivity and leverages those results in this study. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.

  4. Exploring biological, chemical and geomorphological patterns in fluvial ecosystems with Structural Equation Modelling

    NASA Astrophysics Data System (ADS)

Bizzi, S.; Surridge, B.; Lerner, D. N.

    2009-04-01

    River ecosystems represent complex networks of interacting biological, chemical and geomorphological processes. These processes generate spatial and temporal patterns in biological, chemical and geomorphological variables, and a growing number of these variables are now being used to characterise the status of rivers. However, integrated analyses of these biological-chemical-geomorphological networks have rarely been undertaken, and as a result our knowledge of the underlying processes and how they generate the resulting patterns remains weak. The apparent complexity of the networks involved, and the lack of coherent datasets, represent two key challenges to such analyses. In this paper we describe the application of a novel technique, Structural Equation Modelling (SEM), to the investigation of biological, chemical and geomorphological data collected from rivers across England and Wales. The SEM approach is a multivariate statistical technique enabling simultaneous examination of direct and indirect relationships across a network of variables. Further, SEM allows a-priori conceptual or theoretical models to be tested against available data. This is a significant departure from the solely exploratory analyses which characterise other multivariate techniques. We took biological, chemical and river habitat survey data collected by the Environment Agency for 400 sites in rivers spread across England and Wales, and created a single, coherent dataset suitable for SEM analyses. Biological data cover benthic macroinvertebrates, chemical data relate to a range of standard parameters (e.g. BOD, dissolved oxygen and phosphate concentration), and geomorphological data cover factors such as river typology, substrate material and degree of physical modification. 
We developed a number of a-priori conceptual models, reflecting current research questions or existing knowledge, and tested the ability of these conceptual models to explain the variance and covariance within the dataset. The conceptual models we developed were able to correctly explain the variance and covariance shown by the datasets, proving to be a relevant representation of the processes involved. The models explained 65% of the variance in indices describing benthic macroinvertebrate communities. Dissolved oxygen was of primary importance, but geomorphological factors, including river habitat type and degree of habitat degradation, also had significant explanatory power. The addition of spatial variables, such as latitude or longitude, did not provide additional explanatory power. This suggests that the variables already included in the models effectively represented the eco-regions across which our data were distributed. The models produced new insights into the relative importance of chemical and geomorphological factors for river macroinvertebrate communities. The SEM technique proved a powerful tool for exploring complex biological-chemical-geomorphological networks; for example, it was able to deal with the co-correlations that are common in rivers due to multiple feedback mechanisms.

  5. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.

  6. Semi-empirical model for retrieval of soil moisture using RISAT-1 C-Band SAR data over a sub-tropical semi-arid area of Rewari district, Haryana (India)

    NASA Astrophysics Data System (ADS)

    Rawat, Kishan Singh; Sehgal, Vinay Kumar; Pradhan, Sanatan; Ray, Shibendu S.

    2018-03-01

We have estimated soil moisture (SM) using the circular horizontal polarization backscattering coefficient (σ°RH), the difference of the circular vertical and horizontal backscattering coefficients (σ°RV − σ°RH) from FRS-1 data of the Radar Imaging Satellite (RISAT-1), and surface roughness in terms of RMS height (RMSheight). We examined the performance of FRS-1 in retrieving SM under a wheat crop at the tillering stage. Results revealed that it is possible to develop a good semi-empirical model (SEM) to estimate SM of the upper soil layer using RISAT-1 SAR data, rather than using the existing empirical model based on only a single parameter, i.e., σ°. Near-surface SM measurements were related to σ°RH and σ°RV − σ°RH derived from the 5.35 GHz (C-band) image of RISAT-1, and to RMSheight. The roughness component, derived in terms of RMSheight, showed a good positive correlation with σ°RV − σ°RH (R² = 0.65). By considering all the major influencing factors (σ°RH, σ°RV − σ°RH, and RMSheight), an SEM was developed in which the predicted (volumetric) SM depends on σ°RH, σ°RV − σ°RH, and RMSheight. This SEM showed R² of 0.87, adjusted R² of 0.85, and multiple R = 0.94, with a standard error of 0.05 at the 95% confidence level. Validation of the SM derived from the semi-empirical model against observed measurements (SMObserved) showed root mean square error (RMSE) = 0.06, relative RMSE (R-RMSE) = 0.18, mean absolute error (MAE) = 0.04, normalized RMSE (NRMSE) = 0.17, Nash-Sutcliffe efficiency (NSE) = 0.91 (≈ 1), index of agreement (d) = 1, coefficient of determination (R²) = 0.87, mean bias error (MBE) = 0.04, standard error of estimate (SEE) = 0.10, volume error (VE) = 0.15, and variance of the distribution of differences (S²d) = 0.004. The developed SEM performed better in estimating SM than the Topp empirical model, which is based only on σ°. By using the developed SEM, top-soil SM can be estimated with a low mean absolute percent error (MAPE) = 1.39, and it can be used for operational applications.
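A minimal version of such a semi-empirical fit, with synthetic data standing in for the RISAT-1 observations, is a multiple linear regression scored with the same kinds of validation statistics (RMSE, NSE). All numbers below are invented for illustration.

```python
# Illustrative linear semi-empirical fit on synthetic data:
# SM = b0 + b1*sigma0_RH + b2*(sigma0_RV - sigma0_RH) + b3*rms_height,
# scored with RMSE and Nash-Sutcliffe efficiency (NSE).
import numpy as np

rng = np.random.default_rng(1)
n = 60
sigma_rh = rng.normal(-12, 2, n)          # backscatter, dB (synthetic)
sigma_diff = rng.normal(3, 1, n)          # sigma0_RV - sigma0_RH (synthetic)
rms_h = rng.uniform(0.5, 2.0, n)          # surface roughness, cm (synthetic)
sm_obs = (0.05 + 0.01 * sigma_rh + 0.02 * sigma_diff + 0.05 * rms_h
          + rng.normal(0, 0.02, n))       # "observed" volumetric SM

X = np.column_stack([np.ones(n), sigma_rh, sigma_diff, rms_h])
coef, *_ = np.linalg.lstsq(X, sm_obs, rcond=None)
sm_pred = X @ coef

rmse = np.sqrt(np.mean((sm_pred - sm_obs) ** 2))
nse = 1 - np.sum((sm_obs - sm_pred) ** 2) / np.sum((sm_obs - sm_obs.mean()) ** 2)
```

NSE near 1 means the model clearly beats the observed mean as a predictor, which is the sense in which the abstract reports NSE = 0.91.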

  7. Treatment with Saccharomyces boulardii reduces the inflammation and dysfunction of the gastrointestinal tract in 5-fluorouracil-induced intestinal mucositis in mice.

    PubMed

    Justino, Priscilla F C; Melo, Luis F M; Nogueira, Andre F; Costa, Jose V G; Silva, Luara M N; Santos, Cecila M; Mendes, Walber O; Costa, Marina R; Franco, Alvaro X; Lima, Aldo A; Ribeiro, Ronaldo A; Souza, Marcellus H L P; Soares, Pedro M G

    2014-05-01

Intestinal mucositis is an important toxic side effect of 5-fluorouracil (5-FU) treatment. Saccharomyces boulardii is known to protect from intestinal injury via an effect on the gastrointestinal microbiota. The objective of the present study was to evaluate the effect of S. boulardii on intestinal mucositis induced by 5-FU in a murine model. Mice were divided into saline, saline (control)+5-FU or 5-FU+S. boulardii (16 × 10⁹ colony-forming units/kg) treatment groups, and the jejunum and ileum were removed after killing of mice for the evaluation of histopathology, myeloperoxidase (MPO) activity, and non-protein sulfhydryl group (mainly reduced glutathione; GSH), nitrite and cytokine concentrations. To determine gastric emptying, phenol red was administered orally, mice were killed 20 min after administration, and the absorbance of samples collected from the mice was measured by spectrophotometry. Intestinal permeability was measured by the urinary excretion rate of lactulose and mannitol following oral administration. S. boulardii significantly reversed the histopathological changes in intestinal mucositis induced by 5-FU and reduced the inflammatory parameters: neutrophil infiltration (control 1·73 (SEM 0·37) ultrastructural MPO (UMPO)/mg, 5-FU 7·37 (SEM 1·77) UMPO/mg and 5-FU+S. boulardii 4·15 (SEM 0·73) UMPO/mg); nitrite concentration (control 37·00 (SEM 2·39) μm, 5-FU 59·04 (SEM 11·41) μm and 5-FU+S. boulardii 37·90 (SEM 5·78) μm); GSH concentration (control 477·60 (SEM 25·25) μg/mg, 5-FU 270·90 (SEM 38·50) μg/mg and 5-FU+S. boulardii 514·00 (SEM 38·64) μg/mg). Treatment with S. boulardii significantly reduced the concentrations of TNF-α and IL-1β by 48·92 and 32·21 % in the jejunum and 38·92 and 61·79 % in the ileum. In addition, S. boulardii decreased the concentrations of chemokine (C-X-C motif) ligand 1 by 5-fold in the jejunum and 3-fold in the ileum. Interestingly, S. boulardii reduced the delay in gastric emptying (control 25·21 (SEM 2·55) %, 5-FU 54·91 (SEM 3·43) % and 5-FU+S. boulardii 31·38 (SEM 2·80) %) and induced the recovery of intestinal permeability (lactulose:mannitol ratio: control 0·52 (SEM 0·03), 5-FU 1·38 (SEM 0·24) and 5-FU+S. boulardii 0·62 (SEM 0·03)). In conclusion, S. boulardii reduces the inflammation and dysfunction of the gastrointestinal tract in intestinal mucositis induced by 5-FU.

  8. Software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1993-01-01

Strategies and tools for the testing, risk assessment and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-conscious systems. Theoretical investigations of the Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.

  9. Unbiased roughness measurements: the key to better etch performance

    NASA Astrophysics Data System (ADS)

    Liang, Andrew; Mack, Chris; Sirard, Stephen; Liang, Chen-wei; Yang, Liu; Jiang, Justin; Shamma, Nader; Wise, Rich; Yu, Jengyi; Hymes, Diane

    2018-03-01

    Edge placement error (EPE) has become an increasingly critical metric to enable Moore's Law scaling. Stochastic variations, as characterized for lines by line width roughness (LWR) and line edge roughness (LER), are dominant factors in EPE and known to increase with the introduction of EUV lithography. However, despite recommendations from ITRS, NIST, and SEMI standards, the industry has not agreed upon a methodology to quantify these properties. Thus, differing methodologies applied to the same image often result in different roughness measurements and conclusions. To standardize LWR and LER measurements, Fractilia has developed an unbiased measurement that uses a raw unfiltered line scan to subtract out image noise and distortions. By using Fractilia's inverse linescan model (FILM) to guide development, we will highlight the key influences of roughness metrology on plasma-based resist smoothing processes. Test wafers were deposited to represent a 5 nm node EUV logic stack. The patterning stack consists of a core Si target layer with spin-on carbon (SOC) as the hardmask and spin-on glass (SOG) as the cap. Next, these wafers were exposed through an ASML NXE 3350B EUV scanner with an advanced chemically amplified resist (CAR). Afterwards, these wafers were etched through a variety of plasma-based resist smoothing techniques using a Lam Kiyo conductor etch system. Dense line and space patterns on the etched samples were imaged through advanced Hitachi CDSEMs and the LER and LWR were measured through both Fractilia and an industry standard roughness measurement software. By employing Fractilia to guide plasma-based etch development, we demonstrate that Fractilia produces accurate roughness measurements on resist in contrast to an industry standard measurement software. These results highlight the importance of subtracting out SEM image noise to obtain quicker developmental cycle times and lower target layer roughness.
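
    The noise-subtraction principle behind unbiased roughness measurement can be illustrated with the standard variance relation: if SEM image noise is uncorrelated with the true edge positions, then σ²_unbiased = σ²_biased − σ²_noise. A minimal sketch on synthetic data (this is an illustration of the principle only, not Fractilia's actual algorithm, which estimates the noise floor from the raw linescan power spectral density):

```python
import numpy as np

def unbiased_roughness(measured_edges, noise_sigma):
    """Estimate unbiased LER by subtracting the uncorrelated SEM-noise
    variance from the measured (biased) edge-position variance."""
    biased_var = np.var(measured_edges)
    unbiased_var = max(biased_var - noise_sigma**2, 0.0)  # clamp at zero
    return 3.0 * np.sqrt(unbiased_var)  # report 3-sigma, as is conventional

# Synthetic example: true 3-sigma LER of 3 nm plus SEM noise of sigma = 1 nm
rng = np.random.default_rng(0)
true_edges = rng.normal(0.0, 1.0, 10000)                 # sigma = 1 nm
noisy_edges = true_edges + rng.normal(0.0, 1.0, 10000)   # add image noise
print(round(unbiased_roughness(noisy_edges, noise_sigma=1.0), 2))  # close to the true 3-sigma LER of 3 nm
```

    Measuring roughness directly on the noisy edges would overestimate the 3-sigma LER by roughly 40 % in this example, which is the bias the abstract refers to.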

  10. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric method?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating the reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.
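
    One standard distribution-free tool of the kind the abstract alludes to is the Kaplan-Meier (product-limit) estimator, which estimates reliability R(t) directly from failure and censoring times without assuming any parametric failure-time distribution. A minimal sketch with hypothetical failure data (an illustration of the technique, not the GSFC models themselves):

```python
def km_reliability(failure_times, censored_times=()):
    """Distribution-free (Kaplan-Meier) reliability estimate R(t):
    no parametric failure-time distribution is assumed."""
    events = sorted([(t, True) for t in failure_times] +
                    [(t, False) for t in censored_times])
    at_risk, surv, curve = len(events), 1.0, []
    for t, failed in events:
        if failed:
            surv *= (at_risk - 1) / at_risk  # survive this failure time
            curve.append((t, surv))
        at_risk -= 1                         # failures and censorings both leave
    return curve

# Failure times (hours) from a hypothetical test campaign; one run censored at 30 h
print(km_reliability([10, 25, 40, 60], censored_times=[30]))
```

    The censored run still counts in the at-risk set before 30 hours, which is exactly the information a parametric fit would struggle to use when subsystems finish testing at very different times.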

  11. Partial Least Squares Structural Equation Modeling with R

    ERIC Educational Resources Information Center

    Ravand, Hamdollah; Baghaei, Purya

    2016-01-01

    Structural equation modeling (SEM) has become widespread in educational and psychological research. Its flexibility in addressing complex theoretical models and the proper treatment of measurement error has made it the model of choice for many researchers in the social sciences. Nevertheless, the model imposes some daunting assumptions and…

  12. A Practical Guide to Calibration of a GSSHA Hydrologic Model Using ERDC Automated Model Calibration Software - Effective and Efficient Stochastic Global Optimization

    DTIC Science & Technology

    2012-02-01

    parameter estimation method, but rather to carefully describe how to use the ERDC software implementation of MLSL that accommodates the PEST model...model independent LM method based parameter estimation software PEST (Doherty, 2004, 2007a, 2007b), which quantifies model to measurement misfit...et al. (2011) focused on one drawback associated with LM-based model independent parameter estimation as implemented in PEST; viz., that it requires

  13. Software Assurance Competency Model

    DTIC Science & Technology

    2013-03-01

    COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS ...2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  14. A Framework of the Use of Information in Software Testing

    ERIC Educational Resources Information Center

    Kaveh, Payman

    2010-01-01

    With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…

  15. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model is changed from one to another, all functions of a search technique must be reimplemented because the model types differ, even if the same search technique is applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model. PMID:25302314

  16. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of software maintenance expected cost, long before software is delivered to users or customers. It has been estimated that, on the average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point to aid in alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
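
    The general shape of a relative complexity metric is a composite of standardized raw metrics, used only to rank modules against each other. A minimal sketch with hypothetical module names and metric values (illustrating the idea of a composite z-score ranking, not the paper's exact model):

```python
import statistics

def relative_complexity(modules, metrics):
    """Rank modules by a composite of z-scored metrics: each raw metric
    (e.g. LOC, cyclomatic complexity) is standardized across modules,
    then averaged, so modules are scored only relative to one another."""
    scores = {name: 0.0 for name in modules}
    for metric in metrics:
        values = [modules[m][metric] for m in modules]
        mu, sd = statistics.mean(values), statistics.pstdev(values) or 1.0
        for m in modules:
            scores[m] += (modules[m][metric] - mu) / sd
    n = len(metrics)
    return sorted(((s / n, m) for m, s in scores.items()), reverse=True)

# Hypothetical modules with two raw complexity metrics
mods = {
    "parser":  {"loc": 1200, "cyclomatic": 45},
    "logger":  {"loc": 150,  "cyclomatic": 6},
    "planner": {"loc": 1100, "cyclomatic": 70},
}
ranking = relative_complexity(mods, ["loc", "cyclomatic"])
print(ranking[0][1])  # the most maintenance-prone module in this toy set
```

    Standardizing each metric first keeps a large-valued metric like LOC from dominating a small-valued one like cyclomatic complexity in the composite.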

  17. Hemodynamics model of fluid–solid interaction in internal carotid artery aneurysms

    PubMed Central

    Fu-Yu, Wang; Lei, Liu; Xiao-Jun, Zhang; Hai-Yue, Ju

    2010-01-01

    The objective of this study is to present a relatively simple method to reconstruct cerebral aneurysms as 3D numerical grids. The method accurately duplicates the geometry to provide computer simulations of the blood flow. Initial images were obtained by using CT angiography and 3D digital subtraction angiography in DICOM format. The image was processed by using MIMICS software, and the 3D fluid model (blood flow) and 3D solid model (wall) were generated. The subsequent output was exported to the ANSYS workbench software to generate the volumetric mesh for further hemodynamic study. The fluid model was defined and simulated in CFX software while the solid model was calculated in ANSYS software. The force data calculated first in the CFX software were transferred to the ANSYS software, and after receiving the force data, total mesh displacement data were calculated in the ANSYS software. Then, the mesh displacement data were transferred back to the CFX software. The data exchange was processed in workbench software. The results of simulation could be visualized in CFX-post. Two examples of grid reconstruction and blood flow simulation for patients with internal carotid artery aneurysms were presented. The wall shear stress, wall total pressure, and von Mises stress could be visualized. This method seems to be relatively simple and suitable for direct use by neurosurgeons or neuroradiologists, and may be a practical tool for planning treatment and follow-up of patients after neurosurgical or endovascular interventions with 3D angiography. PMID:20812022

  18. Hemodynamics model of fluid-solid interaction in internal carotid artery aneurysms.

    PubMed

    Bai-Nan, Xu; Fu-Yu, Wang; Lei, Liu; Xiao-Jun, Zhang; Hai-Yue, Ju

    2011-01-01

    The objective of this study is to present a relatively simple method to reconstruct cerebral aneurysms as 3D numerical grids. The method accurately duplicates the geometry to provide computer simulations of the blood flow. Initial images were obtained by using CT angiography and 3D digital subtraction angiography in DICOM format. The image was processed by using MIMICS software, and the 3D fluid model (blood flow) and 3D solid model (wall) were generated. The subsequent output was exported to the ANSYS workbench software to generate the volumetric mesh for further hemodynamic study. The fluid model was defined and simulated in CFX software while the solid model was calculated in ANSYS software. The force data calculated first in the CFX software were transferred to the ANSYS software, and after receiving the force data, total mesh displacement data were calculated in the ANSYS software. Then, the mesh displacement data were transferred back to the CFX software. The data exchange was processed in workbench software. The results of simulation could be visualized in CFX-post. Two examples of grid reconstruction and blood flow simulation for patients with internal carotid artery aneurysms were presented. The wall shear stress, wall total pressure, and von Mises stress could be visualized. This method seems to be relatively simple and suitable for direct use by neurosurgeons or neuroradiologists, and may be a practical tool for planning treatment and follow-up of patients after neurosurgical or endovascular interventions with 3D angiography.

  19. Framework for SEM contour analysis

    NASA Astrophysics Data System (ADS)

    Schneider, L.; Farys, V.; Serret, E.; Fenouillet-Beranger, C.

    2017-03-01

    SEM images provide valuable information about patterning capability. Geometrical properties such as Critical Dimension (CD) can be extracted from them and are used to calibrate OPC models, thus making OPC more robust and reliable. However, there is currently a shortage of appropriate metrology tools to inspect complex two-dimensional patterns in the same way as one would work with simple one-dimensional patterns. In this article we present a full framework for the analysis of SEM images. It has been proven to be fast, reliable and robust for every type of structure, and particularly for two-dimensional structures. To achieve this result, several innovative solutions have been developed and will be presented in the following pages. Firstly, we will present a new noise filter which is used to reduce noise on SEM images, followed by an efficient topography identifier, and finally we will describe the use of a topological skeleton as a measurement tool that can extend CD measurements on all kinds of patterns.

  20. A light and scanning electron microscopic evaluation of electro-discharge-compacted porous titanium implants in rabbit tibia.

    PubMed

    Drummond, J F; Dominici, J T; Sammon, P J; Okazaki, K; Geissler, R; Lifland, M I; Anderson, S A; Renshaw, W

    1995-01-01

    This study used light and scanning electron microscopic (SEM) histomorphometric methods to quantitate the rate of osseointegration of totally porous titanium alloy (Ti-6Al-4V) implants prepared by a novel fabrication technique--electrodischarge compaction (EDC). EDC was used to fuse 150-250-micrometer spherical titanium alloy beads into 4 X 6 mm cylindrical implants through application of a 300-microsecond pulse of high-voltage/high-current density. Two sterilized implants were surgically placed into each tibia of 20 New Zealand white rabbits and left in situ for periods corresponding to 2, 4, 8, 12, and 24 weeks. At each time point, 4 rabbits were humanely killed, and the implants with surrounding bone were removed, fixed, and sectioned for light and SEM studies. The degree of osseointegration was quantitated by means of a True Grid Digitizing Pad and Jandel Scan Version 3.9 software on an IBM PS/2 computer. The total pore area occupied by bone was divided by the total pore area available for bone ingrowth, and a Bone Ingrowth Factor (BIF) was calculated as a percent. The light microscopic results showed BIFs of 4% at week 2, 47% at week 4, 62% at week 8, 84% at week 12, and greater than 90% at week 24. The SEM results showed BIFs of 5% at week 2, 34% at week 4, 69% at week 8, 75% at week 12, and in excess of 90% at week 24. The results of this study show that EDC implants are biocompatible and support rapid osseointegration in the rabbit tibia and suggest that, after additional studies, they may be suitable for use as dental implants in humans.
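
    The Bone Ingrowth Factor defined above is a simple area ratio expressed as a percent; a one-function sketch with hypothetical digitized areas:

```python
def bone_ingrowth_factor(pore_area_with_bone, total_pore_area):
    """BIF (%): total pore area occupied by bone divided by the total
    pore area available for bone ingrowth, expressed as a percent."""
    return 100.0 * pore_area_with_bone / total_pore_area

# Hypothetical digitized areas (mm^2) from one 12-week section
print(round(bone_ingrowth_factor(1.68, 2.0), 1))  # → 84.0
```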

  1. Scanning electron microscopy/energy dispersive spectrometry fixedbeam or overscan x-ray microanalysis of particles can miss the real structure: x-ray spectrum image mapping reveals the true nature

    NASA Astrophysics Data System (ADS)

    Newbury, Dale E.; Ritchie, Nicholas W. M.

    2013-05-01

    The typical strategy for analysis of a microscopic particle by scanning electron microscopy/energy dispersive spectrometry x-ray microanalysis (SEM/EDS) is to use a fixed beam placed at the particle center or to continuously overscan to gather an "averaged" x-ray spectrum. While useful, such strategies inevitably concede any possibility of recognizing microstructure within the particle, and such fine scale structure is often critical for understanding the origins, behavior, and fate of particles. Elemental imaging by x-ray mapping has been a mainstay of SEM/EDS analytical practice for many years, but the time penalty associated with mapping with older EDS technology has discouraged its general use and reserved it more for detailed studies that justified the time investment. The emergence of the high throughput, high peak stability silicon drift detector (SDD-EDS) has enabled a more effective particle mapping strategy: "flash" x-ray spectrum image maps can now be recorded in seconds that capture the spatial distribution of major (concentration, C > 0.1 mass fraction) and minor (0.01 <= C <= 0.1) constituents. New SEM/SDD-EDS instrument configurations feature multiple SDDs that view the specimen from widely spaced azimuthal angles. Multiple, simultaneous measurements from different angles enable x-ray spectrometry and mapping that can minimize the strong geometric effects of particles. The NIST DTSA-II software engine is a powerful aid for quantitatively analyzing EDS spectra measured individually as well as for mapping information (available free for Java platforms at: http://www.cstl.nist.gov/div837/837.02/epq/dtsa2/index.html).

  2. Unraveling the intrafamilial correlations and heritability of tumor types in MEN1: a Groupe d'étude des Tumeurs Endocrines study.

    PubMed

    Thevenon, J; Bourredjem, A; Faivre, L; Cardot-Bauters, C; Calender, A; Le Bras, M; Giraud, S; Niccoli, P; Odou, M F; Borson-Chazot, F; Barlier, A; Lombard-Bohas, C; Clauser, E; Tabarin, A; Pasmant, E; Chabre, O; Castermans, E; Ruszniewski, P; Bertherat, J; Delemer, B; Christin-Maitre, S; Beckers, A; Guilhem, I; Rohmer, V; Goichot, B; Caron, P; Baudin, E; Chanson, P; Groussin, L; Du Boullay, H; Weryha, G; Lecomte, P; Schillo, F; Bihan, H; Archambeaud, F; Kerlan, V; Bourcigaux, N; Kuhn, J M; Vergès, B; Rodier, M; Renard, M; Sadoul, J L; Binquet, C; Goudet, P

    2015-12-01

    MEN1, which is secondary to the mutation of the MEN1 gene, is a rare autosomal-dominant disease that predisposes mutation carriers to endocrine tumors. Most studies demonstrated the absence of direct genotype-phenotype correlations. The existence of a higher risk of death in the Groupe d'étude des Tumeurs Endocrines-cohort associated with a mutation in the JunD interacting domain suggests heterogeneity across families in disease expressivity. This study aims to assess the existence of modifying genetic factors by estimating the intrafamilial correlations and heritability of the six main tumor types in MEN1. The study included 797 patients from 265 kindred and studied seven phenotypic criteria: parathyroid and pancreatic neuroendocrine tumors (NETs) and pituitary, adrenal, bronchial, and thymic (thNET) tumors and the presence of metastasis. Intrafamilial correlations and heritability estimates were calculated from family tree data using specific validated statistical analysis software. Intrafamilial correlations were significant and decreased along parental degrees distance for pituitary, adrenal and thNETs. The heritability of these three tumor types was consistently strong and significant with 64% (s.e.m.=0.13; P<0.001) for pituitary tumor, 65% (s.e.m.=0.21; P<0.001) for adrenal tumors, and 97% (s.e.m.=0.41; P=0.006) for thNETs. The present study shows the existence of modifying genetic factors for thymus, adrenal, and pituitary MEN1 tumor types. The identification of at-risk subgroups of individuals within cohorts is the first step toward personalization of care. Next generation sequencing on this subset of tumors will help identify the molecular basis of MEN1 variable genetic expressivity. © 2015 European Society of Endocrinology.

  3. Preliminary description of the area navigation software for a microcomputer-based Loran-C receiver

    NASA Technical Reports Server (NTRS)

    Oguri, F.

    1983-01-01

    The development of new software and its implementation on a microcomputer (MOS 6502) to provide high-quality navigation information is described. This software development provides Area/Route Navigation (RNAV) information from Time Differences (TDs) in raw form, using an elliptical Earth model and a spherical model. The software is prepared for the microcomputer-based Loran-C receiver. To compute navigation information, a (MOS 6502) microcomputer and a mathematical chip (AM 9511A) were combined with the Loran-C receiver. The final data reveal that this software does indeed provide accurate information with reasonable execution times.

  4. NCAR global model topography generation software for unstructured grids

    NASA Astrophysics Data System (ADS)

    Lauritzen, P. H.; Bacmeister, J. T.; Callaghan, P. F.; Taylor, M. A.

    2015-06-01

    It is the purpose of this paper to document the NCAR global model topography generation software for unstructured grids. Given a model grid, the software computes the fraction of the grid box covered by land, the gridbox mean elevation, and associated sub-grid scale variances commonly used for gravity wave and turbulent mountain stress parameterizations. The software supports regular latitude-longitude grids as well as unstructured grids; e.g. icosahedral, Voronoi, cubed-sphere and variable resolution grids. As an example application and in the spirit of documenting model development, exploratory simulations illustrating the impacts of topographic smoothing with the NCAR-DOE CESM (Community Earth System Model) CAM5.2-SE (Community Atmosphere Model version 5.2 - Spectral Elements dynamical core) are shown.
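
    The per-grid-box quantities the software computes can be illustrated on the points of a high-resolution source dataset falling inside one model grid box. A minimal sketch with hypothetical 1-km source points (an illustration of the quantities, not the NCAR code itself):

```python
import numpy as np

def gridbox_topography(elevations, is_land):
    """For the high-resolution source points inside one model grid box,
    compute the land fraction, the grid-box mean elevation, and the
    sub-grid elevation variance used by gravity-wave and turbulent
    mountain stress parameterizations."""
    elevations = np.asarray(elevations, dtype=float)
    is_land = np.asarray(is_land, dtype=bool)
    land_frac = is_land.mean()
    mean_elev = elevations.mean()
    subgrid_var = ((elevations - mean_elev) ** 2).mean()
    return land_frac, mean_elev, subgrid_var

# Hypothetical source points in one coarse grid box: two ocean, two mountain
elev = [0.0, 0.0, 800.0, 1200.0]   # metres
land = [False, False, True, True]
land_frac, mean_elev, subgrid_var = gridbox_topography(elev, land)
print(land_frac, mean_elev, subgrid_var)  # 0.5 500.0 270000.0
```

    Topographic smoothing of the kind explored in the paper reduces mean_elev gradients between neighbouring boxes while the removed variability shows up in subgrid_var.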

  5. Requirements model for an e-Health awareness portal

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Nawi, Mohd Nasrun M.

    2016-08-01

    Requirements engineering is at the heart and foundation of the software engineering process. Poor-quality requirements inevitably lead to poor-quality software solutions, and poor requirements modeling is tantamount to designing a poor-quality product. Quality-assured requirements development thus goes hand in hand with usable products, giving the software product the quality it demands. In light of the foregoing, the requirements for an e-Ebola Awareness Portal were modeled with careful attention to these software engineering concerns. The requirements for the e-Health Awareness Portal are modeled as a contribution to the fight against Ebola and help in the fulfillment of the United Nations' Millennium Development Goal No. 6. In this study, requirements were modeled using the UML 2.0 modeling technique.

  6. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  7. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in the fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modelling software are black box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from UAV.
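
    Accuracy evaluation against 3D coordinate check points, as described above, typically reduces to per-axis and total RMSE between surveyed coordinates and those read from the reconstructed model. A minimal sketch with hypothetical coordinates (not the paper's actual data):

```python
import math

def rmse_3d(check_points, model_points):
    """Per-axis and total 3D RMSE between surveyed check points and the
    corresponding coordinates measured on the reconstructed 3D model.
    Returns [rmse_x, rmse_y, rmse_z, rmse_total]."""
    n = len(check_points)
    sq = [0.0, 0.0, 0.0]
    for (x, y, z), (mx, my, mz) in zip(check_points, model_points):
        for i, d in enumerate((mx - x, my - y, mz - z)):
            sq[i] += d * d
    rmse = [math.sqrt(s / n) for s in sq]
    return rmse + [math.sqrt(sum(r * r for r in rmse))]

# Hypothetical surveyed vs reconstructed check-point coordinates (metres)
gcp   = [(0.0, 0.0, 10.0), (5.0, 5.0, 12.0)]
model = [(0.1, 0.0, 10.2), (4.9, 5.0, 11.8)]
print(rmse_3d(gcp, model))  # [x, y, z, total] RMSE
```

    Reporting the vertical component separately matters for UAV photogrammetry, since height errors from black-box dense-matching algorithms are usually larger than planimetric ones.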

  8. Highly Sophisticated Virtual Laboratory Instruments in Education

    NASA Astrophysics Data System (ADS)

    Gaskins, T.

    2006-12-01

    Many areas of science have advanced or stalled according to the ability to see what cannot normally be seen. Visual understanding has been key to many of the world's greatest breakthroughs, such as the discovery of DNA's double helix. Scientists use sophisticated instruments to see what the human eye cannot. Light microscopes, scanning electron microscopes (SEM), spectrometers and atomic force microscopes are employed to examine and learn the details of the extremely minute. It is rare that students prior to university have access to such instruments, or are granted full ability to probe and magnify as desired. Virtual Lab, by providing highly authentic software instruments and comprehensive imagery of real specimens, provides them this opportunity. Virtual Lab's instruments let explorers operate virtual devices on a personal computer to examine real specimens. Exhaustive sets of images systematically and robotically photographed at thousands of positions and multiple magnifications and focal points allow students to zoom in and focus on the most minute detail of each specimen. Controls on each Virtual Lab device interactively and smoothly move the viewer through these images to display the specimen as the instrument saw it. Users control position, magnification, focal length, filters and other parameters. Energy dispersion spectrometry is combined with SEM imagery to enable exploration of chemical composition at minute scale and arbitrary location. Annotation capabilities allow scientists, teachers and students to indicate important features or areas. Virtual Lab is a joint project of NASA and the Beckman Institute at the University of Illinois at Urbana-Champaign. Four instruments currently compose the Virtual Lab suite: a scanning electron microscope and companion energy dispersion spectrometer, a high-power light microscope, and a scanning probe microscope that captures surface properties to the level of atoms. 
Descriptions of instrument operating principles and uses are also part of Virtual Lab. The Virtual Lab software and its increasingly rich collection of specimens are free to anyone. This presentation describes Virtual Lab and its uses in formal and informal education.

  9. Metrological characterization of X-ray diffraction methods at different acquisition geometries for determination of crystallite size in nano-scale materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uvarov, Vladimir, E-mail: vladimiru@savion.huji.ac.il; Popov, Inna

    2013-11-15

    Crystallite size values were determined by X-ray diffraction methods for 183 powder samples. The tested size range was from a few to about several hundred nanometers. Crystallite size was calculated with direct use of the Scherrer equation, the Williamson–Hall method and the Rietveld procedure via the application of a series of commercial and free software. The results were statistically treated to estimate the significance of the difference in size resulting from these methods. We also estimated the effect of acquisition conditions (Bragg–Brentano, parallel-beam geometry, step size, counting time) and data processing on the calculated crystallite size values. On the basis of the obtained results it is possible to conclude that direct use of the Scherrer equation, the Williamson–Hall method and the Rietveld refinement employed by a series of software (EVA, PCW and TOPAS respectively) yield very close results for crystallite sizes less than 60 nm for parallel-beam geometry and less than 100 nm for Bragg–Brentano geometry. However, we found that despite the fact that the differences between the crystallite sizes calculated by various methods are small in absolute value, they are statistically significant in some cases. The values of crystallite size determined from XRD were compared with those obtained by imaging in transmission (TEM) and scanning electron microscopes (SEM). It was found that there was a good correlation in size only for crystallites smaller than 50–60 nm. Highlights: • The crystallite sizes for 183 nanopowders were calculated using different XRD methods • Obtained results were subject to statistical treatment • Results obtained with Bragg–Brentano and parallel-beam geometries were compared • Influence of conditions of XRD pattern acquisition on results was estimated • Crystallite sizes calculated by XRD were compared with those obtained by TEM and SEM.
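
    The "direct use of the Scherrer equation" mentioned above is D = Kλ/(β cos θ), with β the peak FWHM in radians and θ the Bragg angle (half of 2θ). A minimal sketch, assuming Cu Kα radiation and the common shape factor K = 0.9 (the peak values are hypothetical, and no instrumental-broadening correction is applied):

```python
import math

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Scherrer equation: D = K * lambda / (beta * cos(theta)),
    with beta the peak FWHM converted to radians and theta = 2theta / 2.
    Returns the crystallite size in the wavelength's units (nm here)."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))

# A hypothetical 0.5-degree-wide peak at 2-theta = 38 degrees
print(round(scherrer_size(0.5, 38.0), 1))  # crystallite size in nm
```

    The equation's sensitivity to β is one reason the abstract finds the different software packages diverge for larger crystallites, where the physical broadening becomes comparable to the instrumental broadening.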

  10. Bushland Evapotranspiration and Agricultural Remote Sensing System (BEARS) software

    NASA Astrophysics Data System (ADS)

    Gowda, P. H.; Moorhead, J.; Brauer, D. K.

    2017-12-01

    Evapotranspiration (ET) is a major component of the hydrologic cycle. ET data are used for a variety of water management and research purposes such as irrigation scheduling, water and crop modeling, streamflow, water availability, and many more. Remote sensing products have been widely used to create spatially representative ET data sets which provide important information from field to regional scales. As UAV capabilities increase, remote sensing use is likely to also increase. For that purpose, scientists at the USDA-ARS research laboratory in Bushland, TX developed the Bushland Evapotranspiration and Agricultural Remote Sensing System (BEARS) software. The BEARS software is a Java based software that allows users to process remote sensing data to generate ET outputs using predefined models, or enter custom equations and models. The capability to define new equations and build new models expands the applicability of the BEARS software beyond ET mapping to any remote sensing application. The software also includes an image viewing tool that allows users to visualize outputs, as well as draw an area of interest using various shapes. This software is freely available from the USDA-ARS Conservation and Production Research Laboratory website.

  11. Modeling of a 3DTV service in the software-defined networking architecture

    NASA Astrophysics Data System (ADS)

    Wilczewski, Grzegorz

    2014-11-01

    In this article a newly developed concept for modeling a multimedia service offering stereoscopic motion imagery is presented. The proposed model is based on the Software-Defined Networking (SDN) architecture. The definition of a 3DTV service spanning the SDN concept is identified, exposing the basic characteristics of a 3DTV service in a modern networking organization layout. Furthermore, exemplary functionalities of the proposed 3DTV model are depicted. It is indicated that modeling a 3DTV service in the Software-Defined Networking architecture leads to a multiplicity of improvements, especially towards flexibility of a service supporting heterogeneous end-user devices.

  12. An object-oriented description method of EPMM process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Yang, Fan

    2017-06-01

    In order to use mature object-oriented tools and languages in software process modeling, and to make software process models better conform to industrial standards, it is necessary to study object-oriented modelling of the software process. Starting from the formal process definition in EPMM, and considering that Petri nets are primarily a formal modelling tool, this paper combines Petri net modelling with object-oriented modelling ideas and provides an implementation method for converting Petri-net-based EPMM models into object-oriented models.
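The kind of mapping described above can be illustrated with a minimal object-oriented encoding of a Petri net. Class and method names here are illustrative assumptions, not the EPMM notation itself.

```python
# A minimal object-oriented encoding of a Petri net: places hold tokens,
# transitions consume from input places and produce to output places.
# Class names and methods are illustrative, not EPMM's own conventions.
class Place:
    def __init__(self, name, tokens=0):
        self.name = name
        self.tokens = tokens

class Transition:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = inputs    # places consumed from
        self.outputs = outputs  # places produced to

    def enabled(self):
        return all(p.tokens > 0 for p in self.inputs)

    def fire(self):
        if not self.enabled():
            raise RuntimeError(f"{self.name} is not enabled")
        for p in self.inputs:
            p.tokens -= 1
        for p in self.outputs:
            p.tokens += 1

# A two-step software process fragment: a design artifact is implemented.
designed = Place("designed", tokens=1)
coded = Place("coded")
implement = Transition("implement", inputs=[designed], outputs=[coded])
implement.fire()
print(designed.tokens, coded.tokens)  # token moved from designed to coded
```

Each Petri net element becomes an object, so standard object-oriented tooling (inheritance, serialization, UML) applies directly to the process model.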

  13. Theoretical and software considerations for nonlinear dynamic analysis

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

    In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends multilevel substructure modeling to dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.
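The core operation behind substructuring is static condensation: interior degrees of freedom of each substructure are eliminated, leaving a smaller stiffness matrix on the boundary. A minimal sketch (not the paper's implementation) on a chain of springs:

```python
import numpy as np

# Static condensation: partition the stiffness matrix into boundary (b)
# and interior (i) blocks, then eliminate the interior unknowns:
#   K_cond = K_bb - K_bi * inv(K_ii) * K_ib
def condense(K, boundary, interior):
    Kbb = K[np.ix_(boundary, boundary)]
    Kbi = K[np.ix_(boundary, interior)]
    Kib = K[np.ix_(interior, boundary)]
    Kii = K[np.ix_(interior, interior)]
    return Kbb - Kbi @ np.linalg.solve(Kii, Kib)

# Three springs in series (stiffness k = 2 each), four nodes; condense out
# the two interior nodes and keep the two end nodes.
k = 2.0
K = np.array([[ k,   -k,    0.0,  0.0],
              [-k,    2*k, -k,    0.0],
              [ 0.0, -k,    2*k, -k  ],
              [ 0.0,  0.0, -k,    k  ]])
Kc = condense(K, boundary=[0, 3], interior=[1, 2])
print(Kc)  # equivalent single spring of stiffness k/3 between the ends
```

Applied recursively over levels of substructures, this is exactly what lets very large models fit the hardware: only boundary unknowns propagate upward.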

  14. The Effect of Microstructure On Transport Properties of Porous Electrodes

    NASA Astrophysics Data System (ADS)

    Peterson, Serena W.

    The goal of this work is to further understand the relationships between porous electrode microstructure and mass transport properties. This understanding allows us to predict and improve cell performance from fundamental principles. The investigated battery systems are the widely used rechargeable Li-ion battery and the non-rechargeable alkaline battery. This work includes three main contributions in the battery field listed below. Direct Measurement of Effective Electronic Transport in Porous Li-ion Electrodes. An accurate assessment of the electronic conductivity of electrodes is necessary for understanding and optimizing battery performance. The bulk electronic conductivity of porous LiCoO2-based cathodes was measured as a function of porosity, pressure, carbon fraction, and the presence of an electrolyte. The measurements were performed by delamination of thin-film electrodes from their aluminum current collectors and by use of a four-line probe. Imaging and Correlating Microstructure To Conductivity. Transport properties of porous electrodes are strongly related to microstructure. An experimental 3D microstructure is needed not only for computation of direct transport properties, but also for a detailed electrode microstructure characterization. This work utilized X-ray tomography and focused ion beam (FIB)/scanning electron microscopy (SEM) to obtain the 3D structures of alkaline battery cathodes. FIB/SEM has the advantage of detecting carbon additives; thus, it was the main tomography tool employed. Additionally, protocols and techniques for acquiring, processing and segmenting series of FIB/SEM images were developed as part of this work. FIB/SEM images were also used to correlate electrodes' microstructure to their respective conductivities for both Li-ion and alkaline batteries. Electrode Microstructure Metrics and the 3D Stochastic Grid Model. 
A detailed characterization of microstructure was conducted in this work, including the volume fraction, nearest-neighbor probability, domain size distribution, shape factor, and Fourier transform coefficients. These metrics are compared between 2D FIB/SEM, 3D FIB/SEM, and X-ray structures. The first three metrics serve as the basis for parameterizing the 3D stochastic grid (SG) model, which is based on Monte Carlo techniques and uses a small set of fundamental inter-domain parameters to generate structures. This allows us to predict electrode microstructure and its effects on both electronic and ionic transport properties.
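A toy version of the stochastic-grid idea can be sketched as follows: assign each cell of a grid to a phase so that the realized volume fractions match targets. This sketch reproduces only the volume-fraction metric; the dissertation's SG model also matches nearest-neighbor and domain-size statistics, which are omitted here, and all names are illustrative.

```python
import random

# Generate an nx-by-ny grid whose cells are assigned to phases so the
# realized volume fractions match the targets exactly (up to rounding).
def stochastic_grid(nx, ny, fractions, seed=0):
    phases = list(fractions)
    n = nx * ny
    cells = []
    for ph in phases:
        cells += [ph] * round(fractions[ph] * n)
    cells += [phases[-1]] * (n - len(cells))  # absorb any rounding remainder
    random.Random(seed).shuffle(cells)        # randomize spatial placement
    return [cells[i * nx:(i + 1) * nx] for i in range(ny)]

# Hypothetical cathode composition: active material, carbon additive, pore.
grid = stochastic_grid(10, 10, {"active": 0.6, "carbon": 0.1, "pore": 0.3})
flat = [ph for row in grid for ph in row]
print({ph: flat.count(ph) / len(flat) for ph in ("active", "carbon", "pore")})
```

A real SG model would follow this initialization with Monte Carlo swap moves that drive the grid toward the measured inter-domain statistics.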

  15. The relationship between quality management practices and organisational performance: A structural equation modelling approach

    NASA Astrophysics Data System (ADS)

    Jamaluddin, Z.; Razali, A. M.; Mustafa, Z.

    2015-02-01

    The purpose of this paper is to examine the relationship between quality management practices (QMPs) and organisational performance in the Malaysian manufacturing industry. A QMPs and organisational performance framework is developed from a comprehensive literature review covering both hard and soft quality factors in manufacturing process environments. A total of 11 hypotheses were put forward to test the relationships amongst six constructs: management commitment, training, process management, quality tools, continuous improvement and organisational performance. The model is analysed using Structural Equation Modeling (SEM) with AMOS software version 18.0 and Maximum Likelihood (ML) estimation. A total of 480 questionnaires were distributed, of which 210 were valid for analysis. The ML fit statistics indicate that the QMPs and organisational performance model for the manufacturing industry is admissible. The results show that management commitment has a significant impact on training and process management. Similarly, training has significant effects on quality tools, process management and continuous improvement. Furthermore, quality tools have a significant influence on process management and continuous improvement. Likewise, process management has a significant impact on continuous improvement, and continuous improvement has a significant influence on organisational performance. However, no significant relationship was found between management commitment and quality tools, or between management commitment and continuous improvement. Managers can use these results to prioritise the implementation of QMPs: practices found to have a positive impact on organisational performance can be allocated more resources to obtain better performance.
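The kind of structural relationship such a study estimates can be illustrated with a toy path analysis on synthetic data. Full SEM with latent constructs requires dedicated software (AMOS, lavaan, and the like); this sketch only estimates paths among observed variables by ordinary least squares, and the variable names and true coefficients are made up.

```python
import numpy as np

# Synthetic data for a hypothetical causal chain resembling the study's
# constructs: management commitment -> training -> continuous improvement,
# with true path coefficients 0.5 and 0.7 plus noise.
rng = np.random.default_rng(0)
n = 500
commitment = rng.normal(size=n)
training = 0.5 * commitment + rng.normal(scale=0.5, size=n)
improvement = 0.7 * training + rng.normal(scale=0.5, size=n)

def path_coef(x, y):
    """OLS slope of y on x (with an intercept), as a simple path estimate."""
    design = np.c_[x, np.ones_like(x)]
    return float(np.linalg.lstsq(design, y, rcond=None)[0][0])

a = path_coef(commitment, training)
b = path_coef(training, improvement)
print(round(a, 2), round(b, 2))  # estimates near the true 0.5 and 0.7
```

With enough cases the estimated slopes recover the generating coefficients, which is the logic behind testing each hypothesized path for significance.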

  16. Relationship of organizational culture, teamwork and job satisfaction in interprofessional teams.

    PubMed

    Körner, Mirjam; Wirtz, Markus A; Bengel, Jürgen; Göritz, Anja S

    2015-06-23

    Team effectiveness is often explained on the basis of input-process-output (IPO) models. According to these models, a relationship between organizational culture (input = I), interprofessional teamwork (process = P) and job satisfaction (output = O) is postulated. The aim of this study was to examine the relationship between these three aspects using structural analysis. A multi-center cross-sectional study with a survey of 272 employees was conducted in fifteen rehabilitation clinics with different indication fields in Germany. Structural equation modeling (SEM) was carried out using AMOS software version 20.0 (maximum-likelihood method). Of 661 questionnaires sent out to members of the health care teams in the medical rehabilitation clinics, 275 were returned (41.6%). Three questionnaires were excluded (missing data greater than 30%), yielding a total of 272 employees whose data could be analyzed. The confirmatory models were supported by the data. The results showed that 35% of job satisfaction is predicted by a structural equation model that includes both organizational culture and teamwork. The comparison of this predictive IPO model (organizational culture (I), interprofessional teamwork (P), job satisfaction (O)) with the predictive IO model (organizational culture (I), job satisfaction (O)) showed that the effect of organizational culture is completely mediated by interprofessional teamwork. The global fit indices are a little better for the IO model (TLI: .967, CFI: .972, RMSEA: .052) than for the IPO model (TLI: .934, CFI: .943, RMSEA: .061), but the prediction of job satisfaction is better in the IPO model (R(2) = 35%) than in the IO model (R(2) = 24%). Our study results underpin the importance of interprofessional teamwork in health care organizations. To enhance interprofessional teamwork, team interventions can be recommended and should be supported. Further studies investigating organizational culture and its impact on interprofessional teamwork and team effectiveness in health care are important.
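The global fit indices quoted in abstracts like this one (RMSEA, CFI, TLI) all derive from the model's chi-square statistic. A short sketch of the standard formulas, using made-up chi-square values rather than the study's actual numbers:

```python
import math

# Standard SEM fit-index formulas. chi2/df refer to the fitted model,
# chi2_b/df_b to the baseline (independence) model; n is the sample size.
def rmsea(chi2, df, n):
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_m, df_m, chi2_b, df_b):
    num = max(chi2_m - df_m, 0.0)
    den = max(chi2_b - df_b, chi2_m - df_m, 0.0)
    return 1.0 - num / den

def tli(chi2_m, df_m, chi2_b, df_b):
    return ((chi2_b / df_b) - (chi2_m / df_m)) / ((chi2_b / df_b) - 1.0)

# Hypothetical model vs. baseline chi-squares for a sample of n = 272.
print(round(rmsea(chi2=120.0, df=60, n=272), 3))               # 0.061
print(round(cfi(chi2_m=120.0, df_m=60, chi2_b=900.0, df_b=78), 3))  # 0.927
```

This is why two models fitted to the same data can be compared on these indices: each index is just a different normalization of the chi-square badness of fit.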

  17. Integrated Functional and Executional Modelling of Software Using Web-Based Databases

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Marietta, Roberta

    1998-01-01

    NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that modified software continue to satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, automatic information extraction tools, web technology, and databases. To appear in the Journal of Database Management.

  18. From Product- to Service-Oriented Strategies in the Enterprise Software Market

    ERIC Educational Resources Information Center

    Xin, Mingdi

    2009-01-01

    The enterprise software market is seeing the rise of a new business model--selling Software-as-a-Service (SaaS), in which a standard piece of software is owned and managed remotely by the vendor and delivered as a service over the Internet. Despite the hype, questions remain regarding the rise of this new service model and how it would impact the…

  19. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction.

    PubMed

    Kralj, Damir; Kern, Josipa; Tonkovic, Stanko; Koncar, Miroslav

    2015-09-09

    Family medicine practices (FMPs) form the basis of the Croatian health care system. Use of electronic health record (EHR) software is mandatory and plays an important role in running these practices, but important functional features remain uneven and largely left to the will of the software developers. The objective of this study was to develop a novel and comprehensive model for functional evaluation of EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Based on previous theoretical and experimental research in this area, we built an initial framework model consisting of six basic categories as the basis for an online survey questionnaire. Family doctors assessed perceived software quality on a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods on the collected data, the final optimal structure of the novel model was formed. Special attention was paid to the validity and quality of the novel model. The online survey collected a total of 384 cases. The obtained results indicate both the quality of the assessed software and the quality in use of the novel model. The strong ergonomic orientation of the novel measurement model was particularly emphasised. The resulting model is validated in multiple ways, comprehensive and universal: it could be used to assess the user-perceived quality of almost all forms of ambulatory EHR software and is therefore useful to all stakeholders in this area of health care informatisation.
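The exploratory-factor-analysis step described above typically starts from the correlation matrix of the Likert-item responses. A minimal sketch on synthetic data (the item structure and loadings are invented for illustration; real EFA adds proper factor extraction and rotation):

```python
import numpy as np

# Synthetic Likert-style responses: three items load on one latent factor,
# three are pure noise. Retain factors by the common Kaiser rule
# (eigenvalue of the correlation matrix > 1).
rng = np.random.default_rng(1)
n = 384  # same number of cases as the survey; the data itself is synthetic
latent = rng.normal(size=n)
items = np.column_stack(
    [0.8 * latent + rng.normal(scale=0.5, size=n) for _ in range(3)]
    + [rng.normal(size=n) for _ in range(3)]
)
corr = np.corrcoef(items, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending eigenvalues
n_factors = int((eigvals > 1.0).sum())
# The block of three correlated items produces one dominant eigenvalue.
print(round(eigvals[0], 2), n_factors)
```

Grouping items by the factors that emerge from such an analysis is how the six-category framework model could be reduced to its final optimal structure.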

  20. Software Past, Present, and Future: Views from Government, Industry and Academia

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Page, Jerry; Evangelist, Michael

    2000-01-01

    Views on software development past, present, and future from the NASA Software Engineering Workshop are presented. The topics include: 1) Software Past; 2) Software Present; 3) NASA's Largest Software Challenges; 4) 8330 Software Projects in Industry: Standish Group's 1994 Report; 5) Software Future; 6) Capability Maturity Model (CMM): Software Engineering Institute (SEI) levels; 7) System Engineering Quality Also Part of the Problem; 8) University Environment Trends Will Increase the Problem in Software Engineering; and 9) NASA Software Engineering Goals.
