Automating Structural Analysis of Spacecraft Vehicles
NASA Technical Reports Server (NTRS)
Hrinda, Glenn A.
2004-01-01
A major effort within NASA's vehicle analysis discipline has been to automate structural analysis and sizing optimization during conceptual design studies of advanced spacecraft. Traditional spacecraft structural sizing has involved detailed finite element analysis (FEA) requiring large degree-of-freedom (DOF) finite element models (FEM). Creating and analyzing these models can be time consuming and can limit model size during conceptual design. The goal is to find an optimal design that meets the mission requirements while producing the lightest structure. A structural sizing tool called HyperSizer has been successfully used in the conceptual design phase of a reusable launch vehicle and a planetary exploration spacecraft. The program couples with FEA to enable system-level performance assessments and weight predictions, including design optimization of material selections and sizing of spacecraft members. The software's analysis capabilities are based on established aerospace structural methods for strength, stability, and stiffness that produce adequately sized members and reliable structural weight estimates. The software also helps identify potential structural deficiencies early in the conceptual design so changes can be made without wasted effort. HyperSizer's automated analysis and sizing optimization increase productivity and bring standardization to a systems study. These benefits are illustrated by examining two different types of conceptual spacecraft designed using the software: a hypersonic air-breathing, single-stage-to-orbit (SSTO), reusable launch vehicle (RLV), and an aeroshell for a planetary exploration vehicle used for aerocapture at Mars. Showing the two different types of vehicles demonstrates the software's flexibility, with an emphasis on reducing aeroshell structural weight. Member sizes, panel concepts, and material selections are discussed, as well as the analysis methods used in optimizing the structure and the design trades required to minimize structural weight.
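A minimal sketch of the FEA-coupled sizing iteration this abstract describes, in Python. The load and sizing models below are hypothetical stand-ins, not HyperSizer's methods: the point is that internal loads redistribute as gauges change, so sizing and analysis must be iterated (here with under-relaxation) until the design closes.

```python
# Hypothetical FEA-coupled sizing loop; all functions and numbers are invented.

def run_fea(t):
    """Placeholder FEM: running load on each panel depends on its gauge."""
    return [120.0 / ti for ti in t]            # fake load redistribution

def size_panel(load, allowable=300.0, min_gauge=0.1):
    """Resize one panel so stress = load / t meets the strength allowable."""
    return max(load / allowable, min_gauge)

t = [1.0, 1.0, 1.0]                            # starting gauges, mm
for it in range(50):
    sized = [size_panel(q) for q in run_fea(t)]
    new = [0.5 * (a + b) for a, b in zip(t, sized)]   # under-relax the update
    if max(abs(a - b) for a, b in zip(new, t)) < 1e-6:
        break                                  # loads and gauges are consistent
    t = new
print(f"sizing closed in {it} iterations at gauges {t}")
```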
Efficient, Multi-Scale Designs Take Flight
NASA Technical Reports Server (NTRS)
2003-01-01
Engineers can solve aerospace design problems faster and more efficiently with a versatile software product that performs automated structural analysis and sizing optimization. Collier Research Corporation's HyperSizer Structural Sizing Software is a design, analysis, and documentation tool that increases productivity and standardization for a design team. Based on established aerospace structural methods for strength, stability, and stiffness, HyperSizer can be used all the way from conceptual design to in-service support. The software originated from NASA's efforts to automate its capability to perform aircraft strength analyses, structural sizing, and weight prediction and reduction. With a strategy to combine finite element analysis with an automated design procedure, NASA's Langley Research Center led the development of a software code known as ST-SIZE from 1988 to 1995. Collier Research employees were principal developers of the code along with Langley researchers. The code evolved into one that could analyze the strength and stability of stiffened panels constructed of any material, including lightweight, fiber-reinforced composites.
Structural Analysis and Design Software
NASA Technical Reports Server (NTRS)
1997-01-01
Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft, called ST-SIZE, in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with private-sector finite element modeling and finite element analysis structural programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.
Multi-Scale Sizing of Lightweight Multifunctional Spacecraft Structural Components
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.
2005-01-01
This document is the final report for the project entitled "Multi-Scale Sizing of Lightweight Multifunctional Spacecraft Structural Components," funded under the NRA entitled "Cross-Enterprise Technology Development Program" issued by the NASA Office of Space Science in 2000. The project was funded in 2001 and spanned a four-year period from March 2001 to February 2005. Through enhancements to and synthesis of unique, state-of-the-art structural mechanics and micromechanics analysis software, a new multi-scale tool has been developed that enables design, analysis, and sizing of advanced lightweight composite and smart materials and structures from the full vehicle, to the stiffened structure, to the micro (fiber and matrix) scales. The new software tool has broad, cross-cutting value to current and future NASA missions that will rely on advanced composite and smart materials and structures.
CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.
Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi
2015-10-26
Orientation and the degree of isotropy are important in many biological systems, such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image-based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability, and reliability of the analyses. Software tools are not readily available for this purpose, and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of the orientation as well as the size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and, based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy, and size of target structures. The analysis can be further tuned to focus on targets of particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool, and by comparisons with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant against noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated generally good agreement between computational and manual results, while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images of different cell types obtained using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment and observed changes in cellular orientation in response to stretching. CytoSpectre, a versatile, easy-to-use software tool for spectral analysis of microscopy images, was developed. The tool is compatible with most 2D images and can be used to analyze targets at different scales. We expect the tool to be useful in diverse applications dealing with structures whose orientation and size distributions are of interest. While designed for the biological field, the software could also be useful in non-biological applications.
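A minimal sketch of the spectral idea behind the tool: the 2-D power spectrum of an image concentrates energy perpendicular to elongated structures, so a power-weighted circular mean of spectral angles recovers the dominant orientation. This is an illustration only, not CytoSpectre's code; the band limits and test image are invented.

```python
import numpy as np

def dominant_orientation(img):
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2   # power spectrum
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w]
    fy, fx = y - h // 2, x - w // 2
    r = np.hypot(fx, fy)
    mask = (r > 2) & (r < min(h, w) // 4)     # drop DC and high-freq noise
    theta = np.arctan2(fy, fx)[mask]
    wgt = spec[mask]
    # circular mean on doubled angles: orientation is 180-degree periodic
    mean2 = np.arctan2((wgt * np.sin(2 * theta)).sum(),
                       (wgt * np.cos(2 * theta)).sum())
    return np.degrees(0.5 * mean2) + 90.0     # structures lie 90 deg from spectrum

# vertical stripes -> expected dominant orientation near 90 degrees
stripes = np.sin(np.linspace(0, 20 * np.pi, 128))[None, :] * np.ones((128, 1))
print(dominant_orientation(stripes))
```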
Combining analysis with optimization at Langley Research Center. An evolutionary process
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1982-01-01
The evolutionary process of combining analysis and optimization codes is traced with a view toward providing insight into the long-term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. The process is traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization to define a near-term goal for combining analysis and optimization codes. Development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program is required. Incorporating a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program will facilitate this effort. The resulting software system is controlled with a special-purpose language, communicates with a data management system, and is easily modified for adding new programs and capabilities. A 337 degree-of-freedom finite element model is used in verifying the accuracy of this system.
CHIME: A Metadata-Based Distributed Software Development Environment
2005-01-01
structures by using typography, graphics, and animation. The Software Immersion in our conceptual model for CHIME can be seen as a form of Software... Even small- to medium-sized development efforts may involve hundreds of artifacts -- design documents, change requests, test cases and results, code... for managing and organizing information from all phases of the software lifecycle. CHIME is designed around an XML-based metadata architecture, in...
A controlled experiment on the impact of software structure on maintainability
NASA Technical Reports Server (NTRS)
Rombach, Dieter H.
1987-01-01
The impact of software structure on maintainability aspects including comprehensibility, locality, modifiability, and reusability in a distributed system environment is studied in a controlled maintenance experiment involving six medium-size distributed software systems implemented in LADY (language for distributed systems) and six in an extended version of sequential PASCAL. For all maintenance aspects except reusability, the results were quantitatively given in terms of complexity metrics which could be automated. The results showed LADY to be better suited to the development of maintainable software than the extension of sequential PASCAL. The strong typing combined with high parametrization of units is suggested to improve the reusability of units in LADY.
Advanced Structural Optimization Under Consideration of Cost Tracking
NASA Astrophysics Data System (ADS)
Zell, D.; Link, T.; Bickelmaier, S.; Albinger, J.; Weikert, S.; Cremaschi, F.; Wiegand, A.
2014-06-01
In order to improve the design process of launcher configurations in the early development phase, the Multidisciplinary Optimization (MDO) software was developed. The tool combines efficient software tools such as Optimal Design Investigations (ODIN) for structural optimization and the Aerospace Trajectory Optimization Software (ASTOS) for trajectory and vehicle design optimization for a defined payload and mission. The present paper focuses on the integration and validation of ODIN. ODIN enables the user to optimize typical axisymmetric structures by sizing the stiffening designs for strength and stability while minimizing the structural mass. In addition, a fully automatic finite element model (FEM) generator module creates ready-to-run FEM models of a complete stage or launcher assembly. Cost tracking and future improvements concerning cost optimization are also indicated.
Interactive computer graphics system for structural sizing and analysis of aircraft structures
NASA Technical Reports Server (NTRS)
Bendavid, D.; Pipano, A.; Raibstein, A.; Somekh, E.
1975-01-01
A computerized system for preliminary sizing and analysis of aircraft wing and fuselage structures was described. The system is based upon repeated application of analytical program modules, which are interactively interfaced and sequence-controlled during the iterative design process with the aid of design-oriented graphics software modules. The entire process is initiated and controlled via low-cost interactive graphics terminals driven by a remote computer in a time-sharing mode.
Implementing large projects in software engineering courses
NASA Astrophysics Data System (ADS)
Coppit, David
2006-03-01
In software engineering education, large projects are widely recognized as a useful way of exposing students to the real-world difficulties of team software development. But large projects are difficult to put into practice. First, educators rarely have additional time to manage software projects. Second, classrooms have inherent limitations that threaten the realism of large projects. Third, quantitative evaluation of individuals who work in groups is notoriously difficult. As a result, many software engineering courses compromise the project experience by reducing the team sizes, project scope, and risk. In this paper, we present an approach to teaching a one-semester software engineering course in which 20 to 30 students work together to construct a moderately sized (15 KLOC) software system. The approach combines carefully coordinated lectures and homework, a hierarchical project management structure, modern communication technologies, and a web-based project tracking and individual assessment system. Our approach provides a more realistic project experience for the students without incurring significant additional overhead for the instructor. We present our experiences using the approach over the last two years for the software engineering course at The College of William and Mary. Although the approach has some weaknesses, we believe that they are strongly outweighed by the pedagogical benefits.
Jiřík, Miroslav; Bartoš, Martin; Tomášek, Petr; Malečková, Anna; Kural, Tomáš; Horáková, Jana; Lukáš, David; Suchý, Tomáš; Kochová, Petra; Hubálek Kalbáčová, Marie; Králíčková, Milena; Tonar, Zbyněk
2018-06-01
Quantification of the structure and composition of biomaterials using micro-CT requires image segmentation due to the low contrast and overlapping radiopacity of biological materials. The amount of bias introduced by segmentation procedures is generally unknown. We aim to develop software that generates three-dimensional models of fibrous and porous structures with known volumes, surfaces, lengths, and object counts in fibrous materials and to provide a software tool that calibrates quantitative micro-CT assessments. Virtual image stacks were generated using the newly developed software TeIGen, enabling the simulation of micro-CT scans of unconnected tubes, connected tubes, and porosities. A realistic noise generator was incorporated. Forty image stacks were evaluated using micro-CT, and the error between the true known and estimated data was quantified. Starting with geometric primitives, the error of the numerical estimation of surfaces and volumes was eliminated, thereby enabling the quantification of volumes and surfaces of colliding objects. Analysis of the sensitivity of the thresholding upon parameters of the generated testing image sets revealed the effects of decreasing resolution and increasing noise on the accuracy of the micro-CT quantification. The size of the error increased with decreasing resolution once the voxel size exceeded 1/10 of the typical object size, which indicated the smallest details that could still be reliably quantified. Open-source software for calibrating quantitative micro-CT assessments by producing and saving virtually generated image data sets with known morphometric data was made freely available to researchers involved in morphometry of three-dimensional fibrillar and porous structures in micro-CT scans.
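A toy version of the calibration idea described above: generate a volume containing an analytically known structure (here a cylinder of known radius), add noise, and compare the thresholded voxel count with the true volume. TeIGen's generator and noise model are more sophisticated; every constant below is invented, and the run illustrates how noise alone biases a global threshold.

```python
import numpy as np

n, radius = 64, 10.0
z, y, x = np.mgrid[0:n, 0:n, 0:n]
tube = (np.hypot(x - n / 2, y - n / 2) < radius).astype(float)  # axis along z
true_volume = np.pi * radius**2 * n                   # voxels, analytically

rng = np.random.default_rng(0)
noisy = tube + rng.normal(0.0, 0.3, tube.shape)       # simulated CT noise
estimated = (noisy > 0.5).sum()                       # naive global threshold

# noise pushes many background voxels above the threshold, inflating the volume
print(f"true {true_volume:.0f}  estimated {estimated}  "
      f"bias {100 * (estimated / true_volume - 1):+.1f}%")
```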
LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.
2017-08-01
MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.
NASA Astrophysics Data System (ADS)
Ozbasaran, Hakan
Trusses have an important place among engineering structures due to advantages such as high structural efficiency, fast assembly, and easy maintenance. Iterative truss design procedures that require analysis of a large number of candidate structural systems, such as size, shape, and topology optimization with stochastic methods, usually force the engineer to establish a link between the development platform and external structural analysis software. As the number of structural analyses increases, this (probably slow-response) link may climb to the top of the list of performance issues. This paper introduces software for static, global member buckling, and frequency analysis of 2D and 3D trusses to overcome this problem for Mathematica users.
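A minimal direct-stiffness solver for a 2-D truss, sketching the static-analysis core that such software wraps (member buckling and frequency analysis are omitted). The geometry, stiffness, and load values are arbitrary illustrations.

```python
import numpy as np

nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.8]])   # node x, y coordinates
bars = [(0, 2), (1, 2), (0, 1)]                          # member connectivity
EA = 1.0e7                                               # axial stiffness

K = np.zeros((6, 6))                                     # 2 DOF per node
for i, j in bars:
    d = nodes[j] - nodes[i]
    L = np.hypot(*d)
    c, s = d / L                                         # direction cosines
    k = EA / L * np.outer([-c, -s, c, s], [-c, -s, c, s])
    dofs = [2 * i, 2 * i + 1, 2 * j, 2 * j + 1]
    K[np.ix_(dofs, dofs)] += k                           # assemble member stiffness

F = np.zeros(6); F[5] = -1000.0          # downward load at the apex node
fixed = [0, 1, 3]                        # pin node 0, roller node 1
free = [d for d in range(6) if d not in fixed]
u = np.linalg.solve(K[np.ix_(free, free)], F[free])
print("free-DOF displacements:", u)
```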
Acoustic Emission Analysis Applet (AEAA) Software
NASA Technical Reports Server (NTRS)
Nichols, Charles T.; Roth, Don J.
2013-01-01
NASA Glenn Research Center and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected with Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. The software can be made to work with any data once the data format is known. The applet computes basic AE statistics, as well as statistics as a function of time and pressure. AEAA provides value beyond the analysis offered by the respective vendors' analysis software, and it can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will otherwise have little impact on missions. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.
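A sketch of the kind of value-added statistic described above: AE hit counts and energy binned against time. The data layout is hypothetical; real Digital Wave / Physical Acoustics exports differ, and pressure binning would follow the same pattern.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 600, 500))        # hit arrival times, s (synthetic)
energy = rng.exponential(1.0, t.size)        # hit energies, arbitrary units

edges = np.arange(0, 601, 60)                # one-minute bins
hits, _ = np.histogram(t, edges)
e_sum, _ = np.histogram(t, edges, weights=energy)

for lo, n, e in zip(edges[:-1], hits, e_sum):
    print(f"{lo:4d}-{lo + 60:<4d}s  hits={n:3d}  energy={e:7.2f}")
print("cumulative energy:", round(energy.sum(), 2))
```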
OPTICON: Pro-Matlab software for large order controlled structure design
NASA Technical Reports Server (NTRS)
Peterson, Lee D.
1989-01-01
A software package for large-order controlled structure design is described and demonstrated. The primary program, called OPTICON, uses both Pro-Matlab M-file routines and selected compiled FORTRAN routines linked into the Pro-Matlab structure. The program accepts structural model information in the form of state-space matrices and performs three basic design functions on the model: (1) open-loop analyses; (2) closed-loop reduced-order controller synthesis; and (3) closed-loop stability and performance assessment. The controller synthesis methods currently implemented in this software are based on the Generalized Linear Quadratic Gaussian theory of Bernstein. In particular, a reduced-order Optimal Projection synthesis algorithm based on a homotopy solution method was successfully applied to an experimental truss structure using a 58-state dynamic model. These results are presented and discussed. Current plans to expand the practical size of the design model to several hundred states and the intention to interface Pro-Matlab to a supercomputing environment are discussed.
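A sketch of the open-loop-analysis and closed-loop-assessment steps on a tiny state-space model, with a full-state LQR standing in for controller synthesis. The paper's reduced-order Optimal Projection / homotopy method is far more involved and is not reproduced here; the model below is an invented, lightly damped single mode.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# lightly damped 1-DOF structural mode: state x = [position, velocity]
A = np.array([[0.0, 1.0], [-4.0, -0.02]])
B = np.array([[0.0], [1.0]])

print("open-loop poles:", np.linalg.eigvals(A))    # (1) open-loop analysis

Q = np.diag([10.0, 1.0])                           # state weighting
R = np.array([[0.1]])                              # control effort weighting
P = solve_continuous_are(A, B, Q, R)               # Riccati solution
K = np.linalg.solve(R, B.T @ P)                    # LQR gain u = -K x

print("closed-loop poles:", np.linalg.eigvals(A - B @ K))  # (3) assessment
```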
Nema, Vijay; Pal, Sudhir Kumar
2013-01-01
This study was conducted to find the best-suited freely available software for modelling of proteins by taking a few sample proteins. The proteins used ranged from small to large in size, with available crystal structures for the purpose of benchmarking. Key players like Phyre2, Swiss-Model, CPHmodels-3.0, Homer, (PS)2, (PS)2-V2, and Modweb were used for the comparison and model generation. The benchmarking process was done for four proteins, Icl, InhA, and KatG of Mycobacterium tuberculosis and RpoB of Thermus thermophilus, to identify the most suitable software. Parameters compared during the analysis gave relatively better values for Phyre2 and Swiss-Model. This comparative study showed that Phyre2 and Swiss-Model make good models of small and large proteins as compared to the other screened software. The other software packages were also good but often not as efficient in providing a full-length and properly folded structure.
An optimized video system for augmented reality in endodontics: a feasibility study.
Bruellmann, D D; Tjaden, H; Schwanecke, U; Barth, P
2013-03-01
We propose an augmented reality system for the reliable detection of root canals in video sequences based on a k-nearest neighbor color classification and introduce a simple geometric criterion for teeth. The new software was implemented using C++, Qt, and the image processing library OpenCV. Teeth are detected in video images to restrict the segmentation of the root canal orifices using a k-nearest neighbor algorithm. The locations of the root canal orifices were determined using Euclidean distance-based image segmentation. A set of 126 human teeth with known and verified locations of the root canal orifices was used for evaluation. The software detects root canal orifices for automatic classification of the teeth in video images and stores the location and size of the found structures. Overall, 287 of 305 root canals were correctly detected; the overall sensitivity was about 94%. Classification accuracy ranged from 65.0 to 81.2% for molars and from 85.7 to 96.7% for premolars. The realized software shows that observations made in anatomical studies can be exploited to automate real-time detection of root canal orifices and tooth classification with a software system. Automatic storage of the location, size, and orientation of the found structures with this software can be used for future anatomical studies. Thus, statistical tables with canal locations can be derived, which can improve anatomical knowledge of the teeth to alleviate root canal detection in the future. For this purpose the software is freely available at: http://www.dental-imaging.zahnmedizin.uni-mainz.de/.
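A minimal sketch of the k-NN color-classification step described above, using OpenCV's machine-learning module: pixels are labeled tooth or background from a handful of labeled training colors. The training values and the stand-in frame are invented; the published system was trained on real video frames and adds geometric tests on the detected regions.

```python
import numpy as np
import cv2

train = np.array([[200, 190, 170], [210, 200, 185],    # tooth-like colors
                  [40, 30, 30],    [90, 60, 50]],      # dark background colors
                 dtype=np.float32)
labels = np.array([[1], [1], [0], [0]], dtype=np.int32)

knn = cv2.ml.KNearest_create()
knn.train(train, cv2.ml.ROW_SAMPLE, labels)

frame = np.random.randint(0, 255, (4, 4, 3), np.uint8)  # stand-in video frame
pixels = frame.reshape(-1, 3).astype(np.float32)
_, result, _, _ = knn.findNearest(pixels, 3)             # classify each pixel
mask = result.reshape(frame.shape[:2]).astype(np.uint8)  # 1 = tooth pixel
print(mask)
```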
Efficient Design and Analysis of Lightweight Reinforced Core Sandwich and PRSEUS Structures
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Yarrington, Phillip W.; Lucking, Ryan C.; Collier, Craig S.; Ainsworth, James J.; Toubia, Elias A.
2012-01-01
Design, analysis, and sizing methods for two novel structural panel concepts have been developed and incorporated into the HyperSizer Structural Sizing Software. Reinforced Core Sandwich (RCS) panels consist of a foam core with reinforcing composite webs connecting composite facesheets. Boeing's Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) panels use a pultruded unidirectional composite rod to provide axial stiffness, along with integrated transverse frames and stitching. Both of these structural concepts are oven-cured and have shown great promise for applications in lightweight structures, but they have suffered from the lack of efficient sizing capabilities similar to those that exist for honeycomb sandwich, foam sandwich, hat-stiffened, and other more traditional concepts. Now, with accurate design methods for RCS and PRSEUS panels available in HyperSizer, these concepts can be traded and used in designs as is done with the more traditional structural concepts. The methods developed to enable sizing of RCS and PRSEUS are outlined, as are results showing the validity and utility of the methods. Applications include several large NASA heavy-lift launch vehicle structures.
Cost Estimation Techniques for C3I System Software.
1984-07-01
...opment manmonths have been determined for maxi, midi, and mini type computers. Small to median size timeshared developments used 0.2 to 1.5 hours... development schedule 1.23 1.00 1.10... 2.1.3 Detailed Model: The final codification of the COCOMO regressions was the development of separate effort... regardless of the software structure level being estimated: DEVC -- the expected development computer (maxi, midi, mini, micro); MODE -- the expected...
Effect of restoration volume on stresses in a mandibular molar: a finite element study.
Wayne, Jennifer S; Chande, Ruchi; Porter, H Christian; Janus, Charles
2014-10-01
There can be significant disagreement among dentists when planning treatment for a tooth with a failing medium-to-large-sized restoration. The clinician must determine whether the restoration should be replaced or treated with a crown, which covers and protects the remaining weakened tooth structure during function. The purpose of this study was to evaluate the stresses generated in different sized amalgam restorations via a computational modeling approach and reveal whether a predictable pattern emerges. A computed tomography scan was performed of an extracted mandibular first molar, and the resulting images were imported into a medical imaging software package for tissue segmentation. The software was used to separate the enamel, dentin, and pulp cavity through density thresholding and surface rendering. These tissue structures were then imported into 3-dimensional computer-aided design software in which material properties appropriate to the tissues in the model were assigned. A static finite element analysis was conducted to investigate the stresses that result from normal occlusal forces. Five models were analyzed, one with no restoration and four with increasingly larger restoration volume proportions: a normal-sized tooth, a small restoration, two medium-sized restorations, and one large restoration, as determined from bitewing radiographs and occlusal surface digital photographs. The resulting von Mises stresses for the dentin-enamel of the loaded portion of the tooth grew progressively greater as the size of the restoration increased. The average stress in the normal, unrestored tooth was 4.13 MPa, whereas the smallest restoration size increased this stress to 5.52 MPa. The largest restoration had a dentin-enamel stress of 6.47 MPa. A linear correlation existed between restoration size and dentin-enamel stress, with an R² of 0.97. A larger restoration volume proportion resulted in higher dentin-enamel stresses under static loading. A comparison of the von Mises stresses to the yield strengths of the materials revealed a relationship between a tooth's restoration volume proportion and the potential for failure, although factors other than restoration volume proportion may also impact the stresses generated in moderate-sized restorations.
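A quick numerical illustration of the linear correlation reported above, using the three stress values quoted in the abstract; the restoration volume proportions paired with them are assumed for the example, since the paper's actual proportions are not given here.

```python
import numpy as np

vol = np.array([0.00, 0.15, 0.55])          # assumed volume proportions
stress = np.array([4.13, 5.52, 6.47])       # von Mises stress, MPa (from abstract)

slope, intercept = np.polyfit(vol, stress, 1)
pred = slope * vol + intercept
r2 = 1 - ((stress - pred) ** 2).sum() / ((stress - stress.mean()) ** 2).sum()
print(f"stress ~ {slope:.2f}*vol + {intercept:.2f} MPa,  R^2 = {r2:.3f}")
```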
Integrated design optimization research and development in an industrial environment
NASA Astrophysics Data System (ADS)
Kumar, V.; German, Marjorie D.; Lee, S.-J.
1989-04-01
An overview is given of a design optimization project that has been in progress at the GE Research and Development Center for the past few years. The objective of this project is to develop a methodology and a software system for design automation and optimization of structural/mechanical components and systems. The effort focuses on research and development issues and also on optimization applications that can be related to real-life industrial design problems. The overall technical approach is based on the integration of numerical optimization techniques, finite element methods, CAE and software engineering, and artificial intelligence/expert systems (AI/ES) concepts. The role of each of these engineering technologies in the development of a unified design methodology is illustrated. A software system, DESIGN-OPT, has been developed for both size and shape optimization of structural components subjected to static as well as dynamic loadings. By integrating this software with an automatic mesh generator, a geometric modeler, and an attribute specification computer code, a software module, SHAPE-OPT, has been developed for shape optimization. Details of these software packages together with their applications to some 2- and 3-dimensional design problems are described.
Integrated design optimization research and development in an industrial environment
NASA Technical Reports Server (NTRS)
Kumar, V.; German, Marjorie D.; Lee, S.-J.
1989-01-01
An overview is given of a design optimization project that has been in progress at the GE Research and Development Center for the past few years. The objective of this project is to develop a methodology and a software system for design automation and optimization of structural/mechanical components and systems. The effort focuses on research and development issues and also on optimization applications that can be related to real-life industrial design problems. The overall technical approach is based on the integration of numerical optimization techniques, finite element methods, CAE and software engineering, and artificial intelligence/expert systems (AI/ES) concepts. The role of each of these engineering technologies in the development of a unified design methodology is illustrated. A software system, DESIGN-OPT, has been developed for both size and shape optimization of structural components subjected to static as well as dynamic loadings. By integrating this software with an automatic mesh generator, a geometric modeler, and an attribute specification computer code, a software module, SHAPE-OPT, has been developed for shape optimization. Details of these software packages together with their applications to some 2- and 3-dimensional design problems are described.
Genetic structure and conservation of Mountain Lions in the South-Brazilian Atlantic Rain Forest.
Castilho, Camila S; Marins-Sá, Luiz G; Benedet, Rodrigo C; Freitas, Thales R O
2012-01-01
The Brazilian Atlantic Rain Forest, one of the most endangered ecosystems worldwide, is also among the most important biodiversity hotspots. Through intensive logging, the initial area has been reduced to around 12% of its original size. In this study we investigated the genetic variability and structure of the mountain lion, Puma concolor. Using 18 microsatellite loci, we analyzed evidence of allele dropout, null alleles, and stuttering; calculated the number of alleles/locus, PIC, observed and expected heterozygosity, linkage disequilibrium, Hardy-Weinberg equilibrium, FIS, effective population size, and genetic structure (MICROCHECKER, CERVUS, GENEPOP, FSTAT, ARLEQUIN, ONESAMP, LDNe, PCAGEN, and GENECLASS software); and also determined whether there was evidence of a bottleneck (HYBRIDLAB, BOTTLENECK software) that might influence the future viability of the population in south Brazil. In total, 106 alleles were identified, with the number of alleles/locus ranging from 2 to 11. Mean observed heterozygosity, mean number of alleles, and polymorphism information content were 0.609, 5.89, and 0.6255, respectively. This population presented evidence of a recent bottleneck and loss of genetic variation. Persistent regional poaching further increases the extinction risk.
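A sketch of two of the summary statistics listed above, computed from toy genotype data: observed heterozygosity (the fraction of heterozygous individuals) and expected heterozygosity under Hardy-Weinberg, He = 1 - sum(p_i^2). The allele values are invented.

```python
from collections import Counter

# genotypes at one microsatellite locus: (allele, allele) per individual
genotypes = [(151, 155), (151, 151), (155, 159), (151, 159), (155, 155)]

ho = sum(a != b for a, b in genotypes) / len(genotypes)   # observed

alleles = [a for g in genotypes for a in g]
freqs = [n / len(alleles) for n in Counter(alleles).values()]
he = 1 - sum(p * p for p in freqs)                        # expected (HWE)

print(f"Ho = {ho:.3f}, He = {he:.3f}")
```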
Nema, Vijay; Pal, Sudhir Kumar
2013-01-01
Aim: This study was conducted to find the best-suited freely available software for modelling of proteins by taking a few sample proteins. The proteins used ranged from small to large in size, with available crystal structures for the purpose of benchmarking. Key players like Phyre2, Swiss-Model, CPHmodels-3.0, Homer, (PS)2, (PS)2-V2, and Modweb were used for the comparison and model generation. Results: The benchmarking process was done for four proteins, Icl, InhA, and KatG of Mycobacterium tuberculosis and RpoB of Thermus thermophilus, to identify the most suitable software. Parameters compared during the analysis gave relatively better values for Phyre2 and Swiss-Model. Conclusion: This comparative study showed that Phyre2 and Swiss-Model make good models of small and large proteins as compared to the other screened software. The other software packages were also good but often not as efficient in providing a full-length and properly folded structure. PMID:24023424
Structural Design and Analysis of the Upper Pressure Shell Section of a Composite Crew Module
NASA Technical Reports Server (NTRS)
Sleight, David W.; Paddock, David; Jeans, Jim W.; Hudeck, John D.
2008-01-01
This paper presents the results of the structural design and analysis of the upper pressure shell section of a carbon composite demonstration structure for the Composite Crew Module (CCM) Project. The project is managed by the NASA Engineering and Safety Center with participants from eight NASA Centers, the Air Force Research Laboratory, and multiple aerospace contractors including ATK/Swales, Northrop Grumman, Lockheed Martin, Collier Research Corporation, Genesis Engineering, and Janicki Industries. The paper discusses details of the upper pressure shell section design of the CCM and presents the structural analysis results using the HyperSizer structural sizing software and the MSC Nastran finite element analysis software. The HyperSizer results showed that the controlling load case driving most of the sizing in the upper pressure shell section was the internal pressure load case. The regions around the cutouts were controlled by internal pressure and the main parachute load cases. The global finite element analysis results showed that the majority of the elements of the CCM had a positive margin of safety with the exception of a few hot spots around the cutouts. These hot spots are currently being investigated with a more detailed analysis. Local finite element models of the Low Impact Docking System (LIDS) interface ring and the forward bay gussets with greater mesh fidelity were created for local sizing and analysis. The sizing of the LIDS interface ring was driven by the drogue parachute loads, Trans-Lunar Insertion (TLI) loads, and internal pressure. The drogue parachute loads controlled the sizing of the gusset cap on the drogue gusset and TLI loads controlled the sizing of the other five gusset caps. The main parachute loads controlled the sizing of the lower ends of the gusset caps on the main parachute fittings. The results showed that the gusset web/pressure shell and gusset web/gusset cap interfaces bonded using Pi-preform joints had local hot spots in the Pi-preform termination regions. These regions require a detailed three-dimensional analysis, which is currently being performed, to accurately address the load distribution near the Pi-preform termination in the upper and lower gusset caps.
Bradley, Anthony R; Rose, Alexander S; Pavelka, Antonín; Valasatava, Yana; Duarte, Jose M; Prlić, Andreas; Rose, Peter W
2017-06-01
Recent advances in experimental techniques have led to a rapid growth in the complexity, size, and number of macromolecular structures that are made available through the Protein Data Bank. This creates a challenge for macromolecular visualization and analysis. Macromolecular structure files, such as PDB or PDBx/mmCIF files, can be slow to transfer and parse, and hard to incorporate into third-party software tools. Here, we present a new binary and compressed data representation, the MacroMolecular Transmission Format (MMTF), as well as software implementations in several languages that have been developed around it, which address these issues. We describe the new format and its APIs and demonstrate that it is several times faster to parse, and about a quarter of the file size of the current standard format, PDBx/mmCIF. As a consequence of the new data representation, it is now possible to visualize structures with millions of atoms in a web browser, to keep the whole PDB archive in memory, or to parse it within a few minutes on average computers, which opens up a new way of thinking about how to design and implement efficient algorithms in structural bioinformatics. The PDB archive is available in the MMTF file format through web services and data that are updated on a weekly basis.
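A minimal sketch of consuming MMTF from Python, assuming the mmtf-python reference implementation (pip install mmtf-python); the attribute names follow that package's decoder and should be verified against its documentation. Note the continued availability of the RCSB MMTF web service is also an assumption; a local .mmtf file can be parsed instead.

```python
from mmtf import fetch, parse

structure = fetch("4HHB")            # decode straight from the web service
# structure = parse("4HHB.mmtf")     # ...or decode a local MMTF file instead

print(structure.structure_id, structure.num_models,
      structure.num_chains, structure.num_atoms)
print("first atom x/y/z:",
      structure.x_coord_list[0],
      structure.y_coord_list[0],
      structure.z_coord_list[0])
```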
Three-dimensional biofilm structure quantification.
Beyenal, Haluk; Donovan, Conrad; Lewandowski, Zbigniew; Harkin, Gary
2004-12-01
Quantitative parameters describing biofilm physical structure have been extracted from three-dimensional confocal laser scanning microscopy images and used to compare biofilm structures, monitor biofilm development, and quantify environmental factors affecting biofilm structure. Researchers have previously used biovolume, volume to surface ratio, roughness coefficient, and mean and maximum thicknesses to compare biofilm structures. The selection of these parameters is dependent on the availability of software to perform calculations. We believe it is necessary to develop more comprehensive parameters to describe heterogeneous biofilm morphology in three dimensions. This research presents parameters describing three-dimensional biofilm heterogeneity, size, and morphology of biomass calculated from confocal laser scanning microscopy images. This study extends previous work which extracted quantitative parameters regarding morphological features from two-dimensional biofilm images to three-dimensional biofilm images. We describe two types of parameters: (1) textural parameters showing microscale heterogeneity of biofilms and (2) volumetric parameters describing size and morphology of biomass. The three-dimensional features presented are average (ADD) and maximum diffusion distances (MDD), fractal dimension, average run lengths (in X, Y and Z directions), aspect ratio, textural entropy, energy and homogeneity. We discuss the meaning of each parameter and present the calculations in detail. The developed algorithms, including automatic thresholding, are implemented in software as MATLAB programs which will be available at site prior to publication of the paper.
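A sketch of one of the volumetric bookkeeping steps behind the parameters defined above: the average run length of biomass voxels along the X direction of a thresholded 3-D stack. The stack here is random stand-in data; the published parameter set (diffusion distances, fractal dimension, entropy, and so on) builds on the same kind of voxel traversal.

```python
import numpy as np

rng = np.random.default_rng(2)
stack = rng.random((8, 32, 32)) > 0.6       # binary biofilm stack (z, y, x)

runs = []
for row in stack.reshape(-1, stack.shape[-1]):   # every line along X
    padded = np.diff(np.r_[0, row.astype(int), 0])
    starts = np.where(padded == 1)[0]            # run begins: 0 -> 1
    ends = np.where(padded == -1)[0]             # run ends:   1 -> 0
    runs.extend(ends - starts)                   # lengths of biomass runs

print(f"average X run length: {np.mean(runs):.2f} voxels over {len(runs)} runs")
```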
Pavelka, Antonín; Valasatava, Yana; Prlić, Andreas
2017-01-01
Recent advances in experimental techniques have led to a rapid growth in the complexity, size, and number of macromolecular structures that are made available through the Protein Data Bank. This creates a challenge for macromolecular visualization and analysis. Macromolecular structure files, such as PDB or PDBx/mmCIF files, can be slow to transfer and parse, and hard to incorporate into third-party software tools. Here, we present a new binary and compressed data representation, the MacroMolecular Transmission Format (MMTF), as well as software implementations in several languages that have been developed around it, which address these issues. We describe the new format and its APIs and demonstrate that it is several times faster to parse, and about a quarter of the file size of the current standard format, PDBx/mmCIF. As a consequence of the new data representation, it is now possible to visualize structures with millions of atoms in a web browser, to keep the whole PDB archive in memory, or to parse it within a few minutes on average computers, which opens up a new way of thinking about how to design and implement efficient algorithms in structural bioinformatics. The PDB archive is available in the MMTF file format through web services and data that are updated on a weekly basis. PMID:28574982
Preliminary Structural Sizing and Alternative Material Trade Study of CEV Crew Module
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steve M.; Collier, Craig S.; Yarrington, Phillip W.
2007-01-01
This paper presents the results of a preliminary structural sizing and alternate material trade study for NASA's Crew Exploration Vehicle (CEV) Crew Module (CM). This critical CEV component will house the astronauts during ascent, docking with the International Space Station, reentry, and landing. The alternate material design study considers three materials beyond the standard metallic (aluminum alloy) design that resulted from an earlier NASA Smart Buyer Team analysis. These materials are graphite/epoxy composite laminates, discontinuously reinforced SiC/Al (DRA) composites, and a novel integrated panel material/concept known as WebCore. Using the HyperSizer (Collier Research and Development Corporation) structural sizing software and the NASTRAN finite element analysis code, a comparison is made among these materials for the three composite CM concepts considered by the 2006 NASA Engineering and Safety Center Composite Crew Module project.
NASA Technical Reports Server (NTRS)
Roche, Joseph M.
2002-01-01
Single-stage-to-orbit (SSTO) propulsion remains an elusive goal for launch vehicles. The physics of the problem is leading developers to a search for higher propulsion performance than is available with all-rocket power. Rocket-based combined cycle (RBCC) technology provides additional propulsion performance that may enable SSTO flight. Structural efficiency is also a major driving force in enabling SSTO flight. Increases in performance with RBCC propulsion are offset with the added size of the propulsion system. Geometrical considerations must be exploited to minimize the weight. Integration of the propulsion system with the vehicle must be carefully planned such that aeroperformance is not degraded and the air-breathing performance is enhanced. Consequently, the vehicle's structural architecture becomes one with the propulsion system architecture. Geometrical considerations applied to the integrated vehicle lead to low drag and high structural and volumetric efficiency. Sizing of the SSTO launch vehicle (GTX) is itself an elusive task. The weight of the vehicle depends strongly on the propellant required to meet the mission requirements. Changes in propellant requirements result in changes in the size of the vehicle, which in turn, affect the weight of the vehicle and change the propellant requirements. An iterative approach is necessary to size the vehicle to meet the flight requirements. GTX Sizer was developed to do exactly this. The governing geometry was built into a spreadsheet model along with scaling relationships. The scaling laws attempt to maintain structural integrity as the vehicle size is changed. Key aerodynamic relationships are maintained as the vehicle size is changed. The closed weight and center of gravity are displayed graphically on a plot of the synthesized vehicle. In addition, comprehensive tabular data of the subsystem weights and centers of gravity are generated. The model has been verified for accuracy with finite element analysis. The final trajectory was rerun using OTIS (Boeing Corporation's trajectory optimization software package), and the sizing output was incorporated into a solid model of the vehicle using PRO/Engineer computer-aided design software (Parametric Technology Corporation, Waltham, MA).
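A toy version of the iterative closure loop described above: dry mass scales with gross mass, the propellant demand follows from the rocket equation, and the loop repeats until the gross mass stops changing. The scaling constants and mission numbers below are invented; GTX Sizer's geometric and structural scaling laws are far richer.

```python
import math

DV, ISP, G0 = 9000.0, 440.0, 9.81          # mission delta-v (m/s), Isp (s)
PAYLOAD = 10_000.0                         # kg
mass_ratio = math.exp(DV / (ISP * G0))     # rocket equation: m0 / mf

gross = 200_000.0                          # initial guess, kg
for i in range(100):
    dry = 0.10 * gross + 15_000.0          # assumed structural scaling law
    final = dry + PAYLOAD                  # burnout mass
    new_gross = mass_ratio * final         # propellant demand resets gross mass
    if abs(new_gross - gross) < 1.0:
        break                              # vehicle has "closed"
    gross = new_gross

print(f"closed in {i} iterations: gross = {gross:,.0f} kg, "
      f"propellant = {gross - final:,.0f} kg")
```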
Structural Design of Ares V Interstage Composite Structure
NASA Technical Reports Server (NTRS)
Sleigh, David W.; Sreekantamurthy, Thammaiah; Kosareo, Daniel N.; Martin, Robert A.; Johnson, Theodore F.
2011-01-01
Preliminary and detailed design studies were performed to mature composite structural design concepts for the Ares V Interstage structure as a part of NASA s Advanced Composite Technologies Project. Aluminum honeycomb sandwich and hat-stiffened composite panel structural concepts were considered. The structural design and analysis studies were performed using HyperSizer design sizing software and MSC Nastran finite element analysis software. System-level design trade studies were carried out to predict weight and margins of safety for composite honeycomb-core sandwich and composite hat-stiffened skin design concepts. Details of both preliminary and detailed design studies are presented in the paper. For the range of loads and geometry considered in this work, the hat-stiffened designs were found to be approximately 11-16 percent lighter than the sandwich designs. A down-select process was used to choose the most favorable structural concept based on a set of figures of merit, and the honeycomb sandwich design was selected as the best concept based on advantages in manufacturing cost.
Accessible and informative sectioned images, color-coded images, and surface models of the ear.
Park, Hyo Seok; Chung, Min Suk; Shin, Dong Sun; Jung, Yong Wook; Park, Jin Seo
2013-08-01
In our previous research, we created state-of-the-art sectioned images, color-coded images, and surface models of the human ear. Our ear data would be more beneficial and informative if they were more easily accessible. Therefore, the purpose of this study was to distribute the browsing software and the PDF file in which ear images are to be readily obtainable and freely explored. Another goal was to inform other researchers of our methods for establishing the browsing software and the PDF file. To achieve this, sectioned images and color-coded images of ear were prepared (voxel size 0.1 mm). In the color-coded images, structures related to hearing, equilibrium, and structures originated from the first and second pharyngeal arches were segmented supplementarily. The sectioned and color-coded images of right ear were added to the browsing software, which displayed the images serially along with structure names. The surface models were reconstructed to be combined into the PDF file where they could be freely manipulated. Using the browsing software and PDF file, sectional and three-dimensional shapes of ear structures could be comprehended in detail. Furthermore, using the PDF file, clinical knowledge could be identified through virtual otoscopy. Therefore, the presented educational tools will be helpful to medical students and otologists by improving their knowledge of ear anatomy. The browsing software and PDF file can be downloaded without charge and registration at our homepage (http://anatomy.dongguk.ac.kr/ear/). Copyright © 2013 Wiley Periodicals, Inc.
Design sensitivity analysis and optimization tool (DSO) for sizing design applications
NASA Technical Reports Server (NTRS)
Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa
1992-01-01
The DSO tool, a structural design software system that provides the designer with a graphics-based menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including data base, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software/packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.
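A generic illustration of what a design sensitivity coefficient is, shown here by central finite differences on a closed-form cantilever bending stress. This is only a conceptual stand-in: DSO itself implements continuum design sensitivity analysis on the geometric model, which avoids differencing and re-analysis; all dimensions below are invented.

```python
def bending_stress(t, width=40.0, load=500.0, length=300.0):
    """Max bending stress of a cantilever with a rectangular section (MPa)."""
    inertia = width * t**3 / 12.0          # second moment of area, mm^4
    return load * length * (t / 2.0) / inertia

t0, dt = 5.0, 1e-4                         # design variable: thickness, mm
sens = (bending_stress(t0 + dt) - bending_stress(t0 - dt)) / (2 * dt)
print(f"stress = {bending_stress(t0):.1f} MPa, "
      f"d(stress)/d(thickness) = {sens:.2f} MPa/mm")
```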
Structator: fast index-based search for RNA sequence-structure patterns
2011-01-01
Background: The secondary structure of RNA molecules is intimately related to their function and is often more conserved than the sequence. Hence, the important task of searching databases for RNAs requires matching sequence-structure patterns. Unfortunately, current tools for this task have, in the best case, a running time that is only linear in the size of sequence databases. Furthermore, established index data structures for fast sequence matching, like suffix trees or arrays, cannot benefit from the complementarity constraints introduced by the secondary structure of RNAs. Results: We present a novel method and readily applicable software for time-efficient matching of RNA sequence-structure patterns in sequence databases. Our approach is based on affix arrays, a recently introduced index data structure, preprocessed from the target database. Affix arrays support bidirectional pattern search, which is required for efficiently handling the structural constraints of the pattern. Structural patterns like stem-loops can be matched inside out, such that the loop region is matched first and the pairing bases on the boundaries are then matched consecutively. This allows base pairing information to be exploited for search space reduction and leads to an expected running time that is sublinear in the size of the sequence database. The incorporation of a new chaining approach in the search for RNA sequence-structure patterns enables the description of molecules folding into complex secondary structures with multiple ordered patterns. The chaining approach removes spurious matches from the set of intermediate results, in particular matches of patterns with little specificity. In benchmark experiments on the Rfam database, our method runs up to two orders of magnitude faster than previous methods. Conclusions: The presented method's sublinear expected running time makes it well suited for RNA sequence-structure pattern matching in large sequence databases. RNA molecules containing several stem-loop substructures can be described by multiple sequence-structure patterns, and their matches are efficiently handled by a novel chaining method. Beyond our algorithmic contributions, we provide with Structator a complete and robust open-source software solution for index-based search of RNA sequence-structure patterns. The Structator software is available at http://www.zbh.uni-hamburg.de/Structator. PMID:21619640
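A naive inside-out matcher for a single stem-loop pattern, illustrating the search strategy described above: the loop motif is located first, then the pairing stem positions are checked outward. Structator's affix-array index achieves this sublinearly; this linear scan is only a functional sketch, and the example sequence is invented.

```python
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def find_stem_loops(seq, loop="GNRA", stem=4):
    """Yield (position, substring) of hairpins whose loop matches `loop`."""
    wild = {"N": "ACGU", "R": "AG", "Y": "CU"}     # IUPAC wildcards used here
    def loop_match(window):
        return all(c in wild.get(p, p) for p, c in zip(loop, window))

    for i in range(stem, len(seq) - len(loop) - stem + 1):
        if not loop_match(seq[i:i + len(loop)]):
            continue                                   # 1) anchor on the loop
        j = i + len(loop)
        if all((seq[i - k - 1], seq[j + k]) in PAIRS for k in range(stem)):
            yield i - stem, seq[i - stem:j + stem]     # 2) pair outward

rna = "AAGGGCGAAAGCCCUUAAAGCUCGGAAAGAGCAA"
for pos, hit in find_stem_loops(rna):
    print(pos, hit)      # expect the GNRA tetraloop hairpin at position 2
```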
Influence of local meshing size on stress intensity factor of orthopedic lag screw
NASA Astrophysics Data System (ADS)
Husain, M. N.; Daud, R.; Basaruddin, K. S.; Mat, F.; Bajuri, M. Y.; Arifin, A. K.
2017-09-01
The linear elastic fracture mechanics (LEFM) concept is generally used to study the influence of a crack on the performance of structures. To study the LEFM concept on a damaged structure, finite element analysis software is used to simulate the structure. Mesh generation is one of the most crucial procedures in the finite element method. For a cracked or damaged structure, it is very important to determine an accurate local meshing size at the crack tip in order to obtain an accurate value of the stress intensity factor, KI. A pre-crack was introduced in the lag screw based on the von Mises stress results obtained in previous research. This paper shows the influence of the local mesh arrangement on the numerical value of the stress intensity factor, KI, obtained by the displacement method. The study simulates the effect of local meshing in the singularity region on the stress intensity factor, KI, at the critical point of failure in the screw. Five different sets of wedge meshing sizes were used in the finite element simulations, with 8, 10, 14, 16, and 20 wedges. Three numerical equations were used to validate the results: the Brown and Srawley, Gross and Brown, and Tada equations. The results obtained from the finite element software (ANSYS APDL) showed the best agreement with the Brown and Srawley equation compared to the other numerical formulas. A first-row radius size of 0.014 and singularity elements with 14 wedges proved to be the best local meshing for this study.
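A sketch of the kind of closed-form check used to validate such FE results: the Brown-Srawley polynomial for a single-edge crack in a tension strip, K_I = sigma * sqrt(pi * a) * f(a/W), as quoted in standard fracture handbooks. The inputs are illustrative, not the screw geometry from the paper, and the specimen configuration analyzed there may differ.

```python
import math

def k1_sen_tension(sigma, a, W):
    """Mode-I SIF, single-edge crack in a tension strip (valid for a/W < 0.6)."""
    r = a / W
    f = 1.12 - 0.231 * r + 10.55 * r**2 - 21.72 * r**3 + 30.39 * r**4
    return sigma * math.sqrt(math.pi * a) * f

sigma, a, W = 100.0, 0.002, 0.010     # MPa, m, m  (a/W = 0.2)
print(f"K_I = {k1_sen_tension(sigma, a, W):.2f} MPa*sqrt(m)")
```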
P-TRAP: a Panicle TRAit Phenotyping tool.
A L-Tam, Faroq; Adam, Helene; Anjos, António dos; Lorieux, Mathias; Larmande, Pierre; Ghesquière, Alain; Jouannic, Stefan; Shahbazkia, Hamid Reza
2013-08-29
In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time-consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. This paper presents the P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used on different platforms (the user-friendly Graphical User Interface (GUI) uses the Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful and practical, and it collects much more data than human operators. P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field operators, expert verification, and well-known academic methods.
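A minimal sketch of the grain-counting idea: threshold the image and count connected components above a size cutoff. P-TRAP (written in Java) adds panicle-structure recognition and shape measurements on top of this kind of primitive; the sketch below uses OpenCV for brevity, and the synthetic "grains" are invented.

```python
import numpy as np
import cv2

img = np.zeros((120, 120), np.uint8)            # stand-in panicle photo
for cx, cy in [(30, 30), (60, 80), (90, 40)]:   # three fake grains
    cv2.ellipse(img, (cx, cy), (9, 5), 30, 0, 360, 255, -1)

_, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)
n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)

grains = [s for s in stats[1:] if s[cv2.CC_STAT_AREA] > 50]  # row 0 = background
print(f"{len(grains)} grains detected")
for s in grains:
    print("area:", s[cv2.CC_STAT_AREA])
```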
P-TRAP: a Panicle Trait Phenotyping tool
2013-01-01
Background: In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time-consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. Results: This paper presents the P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used on different platforms (the user-friendly Graphical User Interface (GUI) uses the Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful and practical, and it collects much more data than human operators. Conclusions: P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field operators, expert verification, and well-known academic methods. PMID:23987653
Genetic structure and conservation of Mountain Lions in the South-Brazilian Atlantic Rain Forest
Castilho, Camila S.; Marins-Sá, Luiz G.; Benedet, Rodrigo C.; Freitas, Thales R.O.
2012-01-01
The Brazilian Atlantic Rain Forest, one of the most endangered ecosystems worldwide, is also among the most important hotspots as regards biodiversity. Through intensive logging, the initial area has been reduced to around 12% of its original size. In this study we investigated the genetic variability and structure of the mountain lion, Puma concolor. Using 18 microsatellite loci, we analyzed evidence of allele dropout, null alleles and stuttering, and calculated the number of alleles/locus, PIC, observed and expected heterozygosity, linkage disequilibrium, Hardy-Weinberg equilibrium, FIS, effective population size and genetic structure (MICROCHECKER, CERVUS, GENEPOP, FSTAT, ARLEQUIN, ONESAMP, LDNe, PCAGEN, GENECLASS software). We also determined whether there was evidence of a bottleneck (HYBRIDLAB, BOTTLENECK software) that might influence the future viability of the population in south Brazil. In total, 106 alleles were identified, with the number of alleles/locus ranging from 2 to 11. Mean observed heterozygosity, mean number of alleles and polymorphism information content were 0.609, 5.89, and 0.6255, respectively. This population presented evidence of a recent bottleneck and loss of genetic variation. Persistent regional poaching constitutes an increasing extinction risk. PMID:22481876
ELSI: A unified software interface for Kohn–Sham electronic structure solvers
Yu, Victor Wen-zhe; Corsetti, Fabiano; Garcia, Alberto; ...
2017-09-15
Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. As a result, comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.
ELSI: A unified software interface for Kohn-Sham electronic structure solvers
NASA Astrophysics Data System (ADS)
Yu, Victor Wen-zhe; Corsetti, Fabiano; García, Alberto; Huhn, William P.; Jacquelin, Mathias; Jia, Weile; Lange, Björn; Lin, Lin; Lu, Jianfeng; Mi, Wenhui; Seifitokaldani, Ali; Vázquez-Mayagoitia, Álvaro; Yang, Chao; Yang, Haizhao; Blum, Volker
2018-01-01
Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. Comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.
Endoscopic Stone Measurement During Ureteroscopy.
Ludwig, Wesley W; Lim, Sunghwan; Stoianovici, Dan; Matlaga, Brian R
2018-01-01
Currently, stone size cannot be accurately measured while performing flexible ureteroscopy (URS). We developed novel software for ureteroscopic stone size measurement and then evaluated its performance. A novel application capable of measuring stone fragment size, based on the known distance of the basket tip in the ureteroscope's visual field, was designed and calibrated in a laboratory setting. Complete URS procedures were recorded and 30 stone fragments were extracted and measured using digital calipers. The novel software program was applied to the recorded URS footage to obtain ureteroscope-derived stone size measurements. These ureteroscope-derived measurements were then compared with the actual measured fragment size. The median longitudinal and transversal errors were 0.14 mm (95% confidence interval [CI] 0.1, 0.18) and 0.09 mm (95% CI 0.02, 0.15), respectively. The overall software accuracy and precision were 0.17 and 0.15 mm, respectively. The longitudinal and transversal measurements obtained by the software and digital calipers were highly correlated (r = 0.97 and 0.93). Neither stone size nor stone type was correlated with measurement error. This novel method and software reliably measured stone fragment size during URS. The software ultimately has the potential to make URS safer and more efficient.
Radiation breakage of DNA: a model based on random-walk chromatin structure
NASA Technical Reports Server (NTRS)
Ponomarev, A. L.; Sachs, R. K.
2001-01-01
Monte Carlo computer software, called DNAbreak, has recently been developed to analyze observed non-random clustering of DNA double strand breaks in chromatin after exposure to densely ionizing radiation. The software models coarse-grained configurations of chromatin and radiation tracks, small-scale details being suppressed in order to obtain statistical results for larger scales, up to the size of a whole chromosome. We here give an analytic counterpart of the numerical model, useful for benchmarks, for elucidating the numerical results, for analyzing the assumptions of a more general but less mechanistic "randomly-located-clusters" formalism, and, potentially, for speeding up the calculations. The equations characterize multi-track DNA fragment-size distributions in terms of one-track action; an important step in extrapolating high-dose laboratory results to the much lower doses of main interest in environmental or occupational risk estimation. The approach can utilize the experimental information on DNA fragment-size distributions to draw inferences about large-scale chromatin geometry during cell-cycle interphase.
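As a point of reference for the model above (a textbook baseline, not an equation taken from the paper): if breaks were placed completely at random, i.e. as a Poisson process of density \lambda per unit length, the resulting fragment-size distribution would be exponential,

f(s) = \lambda e^{-\lambda s}, \qquad \langle s \rangle = 1/\lambda ,

and the non-random clustering analyzed by DNAbreak appears as an excess of short fragments relative to this baseline.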
Park, Jin Seo; Jung, Yong Wook; Choi, Hyung-Do; Lee, Ae-Kyoung
2018-05-01
The anatomical structures in most phantoms are classified according to tissue properties rather than according to their detailed structures, because the tissue properties, not the detailed structures, are what is considered important. However, if a phantom does not have detailed structures, the phantom will be unreliable because different tissues can be regarded as the same. Thus, we produced the Visible Korean (VK)-phantoms with detailed structures (male, 583 structures; female, 459 structures) based on segmented images of the whole male body (interval, 1.0 mm; pixel size, 1.0 mm²) and the whole female body (interval, 1.0 mm; pixel size, 1.0 mm²), using house-developed software to analyze the text string and voxel information for each of the structures. The density of each structure in the VK-phantom was calculated based on Virtual Population and a publication of the International Commission on Radiological Protection. In the future, we will standardize the size of each structure in the VK-phantoms. If the VK-phantoms are standardized and the mass density of each structure is precisely known, researchers will be able to measure the exact absorption rate of electromagnetic radiation in specific organs and tissues of the whole body.
Hail Size Distribution Mapping
NASA Technical Reports Server (NTRS)
2008-01-01
A 3-D weather radar visualization software program was developed and implemented as part of an experimental Launch Pad 39 Hail Monitor System. 3DRadPlot, a radar plotting program, is one of several software modules that form building blocks of the hail data processing and analysis system (the complete software processing system under development). The spatial and temporal mapping algorithms were originally developed through research at the University of Central Florida, funded by NASA's Tropical Rainfall Measurement Mission (TRMM), where the goal was to merge National Weather Service (NWS) Next-Generation Weather Radar (NEXRAD) volume reflectivity data with drop size distribution data acquired from a cluster of raindrop disdrometers. In this current work, we adapted these algorithms to process data from a cluster of hail disdrometers positioned around Launch Pads 39A or 39B, along with the corresponding NWS radar data. Radar data from all NWS NEXRAD sites is archived at the National Climatic Data Center (NCDC). That data can be readily accessed at
Reiter, Rachel; Viehdorfer, Matt; Hescock, Kimmy; Clark, Terri; Nemanic, Sarah
The goal of this study was to determine the effectiveness of an interactive radiology software application that we developed to enhance learning of normal canine radiographic anatomy. All first-year veterinary medical students were eligible to participate in this pre-test-post-test experimental design. When presented with the software application, all students had completed two terms of gross anatomy in which the complete anatomy of the dog had been taught using a combination of lectures and laboratory dissections, including radiographic examples. The software application was divided into four body regions: front limb, hind limb, skull/spine, and thorax/abdomen, each with a learning mode and a quiz mode. Quizzes were composed of 15 questions drawn pseudo-randomly without repeat from all structures within a region (median 206 structures). Students were initially given the software application with only the quiz mode activated. After completing four quizzes, one for each body region, students were given access to the software application with both learning mode and quiz mode activated. Students were instructed to spend 30 minutes using the learning mode to study the radiographic anatomy of each region and to retake each quiz. Quiz scores after using the learning mode were significantly higher for each body region (p<.001), with a large effect size for all four regions (Cohen's d=0.83-1.56). These results suggest that this radiographic anatomy software application is an effective tool for students to use to learn normal radiographic anatomy.
Criteria for software modularization
NASA Technical Reports Server (NTRS)
Card, David N.; Page, Gerald T.; Mcgarry, Frank E.
1985-01-01
A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.
Tian, Bian; Zhao, Yulong; Jiang, Zhuangde; Zhang, Ling; Liao, Nansheng; Liu, Yuanhao; Meng, Chao
2009-01-01
In this paper we describe the design and testing of a micro piezoresistive pressure sensor for a Tire Pressure Measurement System (TPMS), which has the advantages of a minimized structure, high sensitivity, linearity and accuracy. Through analysis of the stress distribution of the diaphragm using the ANSYS software, a model of the structure was established. The fabrication on a single silicon substrate utilizes the technologies of anisotropic chemical etching and packaging through glass anodic bonding. The performance of this type of piezoresistive sensor, including size, sensitivity, and long-term stability, was investigated. The results indicate that the accuracy is 0.5% FS; therefore this design meets the requirements for a TPMS, and not only offers a smaller size and simplicity of preparation, but also high sensitivity and accuracy.
An expert system based software sizing tool, phase 2
NASA Technical Reports Server (NTRS)
Friedlander, David
1990-01-01
A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
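One plausible reading of the two accuracy figures above (the report's exact definitions may differ): with r_i the ratio of predicted to actual size for program i, a multiplicative bias and fluctuation factor can be defined geometrically as

\mathrm{bias} = \exp\Big(\frac{1}{n}\sum_{i=1}^{n}\ln r_i\Big), \qquad \mathrm{fluctuation} = \exp\sqrt{\frac{1}{n}\sum_{i=1}^{n}\big(\ln r_i - \ln(\mathrm{bias})\big)^2} ,

so a bias of 1.06 would mean predictions run about 6% high on average, and a fluctuation of 1.77 would mean a typical prediction lies within a factor of roughly 1.8 of the true size.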
Nano sized La2Co2O6 double perovskite synthesized by sol gel method
NASA Astrophysics Data System (ADS)
Solanki, Neha; Lodhi, Pavitra Devi; Choudhary, K. K.; Kaurav, Netram
2018-05-01
We report here the synthesis of the double perovskite La2Co2O6 (LCO) compound by a sol-gel route. The double perovskite structure of the LCO system was confirmed via X-ray diffraction (XRD) analysis. Further, the lattice parameters, unit cell volume and bond lengths were refined by means of Rietveld analysis using the FullProf software. The Debye-Scherrer formula was used to determine the particle size. The compound crystallized in a triclinic structure with space group P-1 under ambient conditions. We also obtained Raman modes from XRD spectra of the poly-crystalline LCO sample. These results were interpreted in terms of the observation of phonon excitations in this compound.
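The Debye-Scherrer estimate mentioned above is the standard relation between XRD peak broadening and crystallite size:

D = \frac{K\lambda}{\beta\cos\theta} ,

where D is the crystallite size, K \approx 0.9 is a shape factor, \lambda is the X-ray wavelength, \beta is the full width at half maximum of the diffraction peak in radians, and \theta is the Bragg angle.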
Hardware/software codesign for embedded RISC core
NASA Astrophysics Data System (ADS)
Liu, Peng
2001-12-01
This paper describes a hardware/software codesign method for the extendible embedded RISC core VIRGO, which is based on the MIPS-I instruction set architecture. VIRGO is described in the Verilog hardware description language, has a five-stage pipeline with a shared 32-bit cache/memory interface, and is controlled by a distributed control scheme. Every pipeline stage has one small controller, which controls the pipeline stage status and the cooperation among the pipeline phases. Since the description uses a high-level language and the control structure is distributed, the VIRGO core is highly extensible and can meet the requirements of an application. Taking the high-definition television MPEG2 MP@HL decoder chip as an example, we constructed a hardware/software codesign virtual prototyping machine that can be used to study the VIRGO core instruction set architecture, system-on-chip memory size requirements, system-on-chip software, etc. We can also evaluate the system-on-chip design and the RISC instruction set based on the virtual prototyping machine platform.
Design of Fiber Reinforced Foam Sandwich Panels for Large Ares V Structural Applications
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.; Hopkins, Dale A.
2010-01-01
The preliminary design of three major structural components within NASA's Ares V heavy lift vehicle using a novel fiber reinforced foam composite sandwich panel concept is presented. The Ares V payload shroud, interstage, and core intertank are designed for minimum mass using this panel concept, which consists of integral composite webs separated by structural foam between two composite facesheets. The HyperSizer structural sizing software, in conjunction with NASTRAN finite element analyses, is used. However, since HyperSizer does not currently include a panel concept for fiber reinforced foam, the sizing was performed using two separate approaches. In the first, the panel core is treated as an effective (homogenized) material, whose properties are provided by the vendor. In the second approach, the panel is treated as a blade stiffened sandwich panel, with the mass of the foam added after completion of the panel sizing. Details of the sizing for each of the three Ares V components are given, and it is demonstrated that the two panel sizing approaches are in reasonable agreement for thinner panel designs, but as the panel thickness increases, the blade stiffened sandwich panel approach yields heavier panel designs. This is due to the effects of local buckling, which are not considered in the effective core property approach.
Algorithm Diversity for Resilient Systems
2016-06-27
A systematic method is described for transforming Datalog rules with general universal and existential quantification into efficient algorithms with precise complexity guarantees, worst case in the size of the ground rules. There are numerous choices during the transformation that lead to diverse algorithms and different data structures. Subject terms: computer security, software diversity, program transformation.
The size effect on O2−-Ce4+ charge transfer emission and band gap structure of Sr2CeO4.
Wang, Wenjun; Pan, Yu; Zhang, Wenying; Liu, Xiaoguang; Li, Ling
2018-04-24
Sr2CeO4 phosphors with different crystalline sizes were synthesized by the sol-gel method or the solid-state reaction. Their crystalline size, luminescence intensity of O2−-Ce4+ charge transfer and energy gaps were obtained through characterization by X-ray diffraction, photoluminescence spectra, as well as UV-visible diffuse reflectance measurements. An inverse relationship between photoluminescence (PL) spectra and crystalline size was observed when the heating temperature was from 1000°C to 1300°C. In addition, band energies calculated for all samples showed that a reaction temperature of 1200°C for the solid-state method and 1100°C for the sol-gel method gave the largest values, which corresponded with the smallest crystalline size. Correlation between PL intensity and crystalline size showed an inverse relationship. Band structure, density of states and partial density of states of the crystal were calculated to analyze the mechanism using the Cambridge Sequential Total Energy Package (CASTEP) module integrated with Materials Studio software. Copyright © 2018 John Wiley & Sons, Ltd.
A computerized self-compensating system for ultrasonic inspection of airplane structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Komsky, I.N.; Achenbach, J.D.; Hagemaier, D.
1993-12-31
Application of a self-compensating technique for ultrasonic inspection of airplane structures makes it possible not only to detect cracks in the different layers of joints but also to obtain information on crack sizes. A prototype computerized ultrasonic system, which utilizes the self-compensating method, has been developed for non-destructive inspection of multilayered airplane structures with in-between sealants, such as bolted joints in tail connections. Industrial applications of the system would require deployment of commercially available portable modules for data acquisition and processing. A portable ultrasonic flaw detector (EPOCH II), manual scanners (HandiScan), and the SQL and FCS software modules from the PC-based TestPro system have been selected for initial tests. A pair of contact angle-beam transducers were used to generate shear waves in the material. Both hardware and software components of the system have been modified for application in conjunction with the self-compensating technique. The system has been tested on two calibration specimens with artificial flaws of different sizes in internal layers of multilayered structures. Ultrasonic signals transmitted through and reflected from the artificial flaws have been discriminated and characterized using multiple time-domain amplitude gates. Then the ratios of the reflection and transmission coefficients, R/T, were calculated for several positions of the transducers. Inspection of measured R/T curves shows it is difficult to visually associate curve shapes with corresponding flaw sizes and orientations. Hence, for online classification of these curve shapes, application of an adaptive signal classifier was considered. Several different types and configurations of classifiers, including a neural network, have been tested. Test results showed that improved performance of the classifier can be achieved by combining a back-propagation neural network with a signal pre-processing module.
Paiva, Anthony; Shou, Wilson Z
2016-08-01
The last several years have seen the rapid adoption of the high-resolution MS (HRMS) for bioanalytical support of high throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses.
A General Sparse Tensor Framework for Electronic Structure Theory
Manzer, Samuel; Epifanovsky, Evgeny; Krylov, Anna I.; ...
2017-01-24
Linear-scaling algorithms must be developed in order to extend the domain of applicability of electronic structure theory to molecules of any desired size. But the increasing complexity of modern linear-scaling methods makes code development and maintenance a significant challenge. A major contributor to this difficulty is the lack of robust software abstractions for handling block-sparse tensor operations. We therefore report the development of a highly efficient symbolic block-sparse tensor library in order to provide access to high-level software constructs to treat such problems. Our implementation supports arbitrary multi-dimensional sparsity in all input and output tensors. We then avoid cumbersome machine-generated code by implementing all functionality as a high-level symbolic C++ language library and demonstrate that our implementation attains very high performance for linear-scaling sparse tensor contractions.
NASA Astrophysics Data System (ADS)
Syafiqah Syahirah Mohamed, Nor; Amalina Banu Mohamat Adek, Noor; Hamid, Nurul Farhana Abd
2018-03-01
This paper presents the development of Graphical User Interface (GUI) software for sizing the main components in an AC-coupled photovoltaic (PV) hybrid power system based on the Malaysian climate. The software provides a guideline for PV system integrators to effectively design the size of components and the system configuration to match the system and load requirements with the geographical conditions. The concept of the proposed software is balancing the annual average renewable energy generation and the load demand. In this study, the PV to diesel generator (DG) ratio is introduced by considering the hybrid system energy contribution. The GUI software is able to size the main components in the PV hybrid system to meet the set target of the energy contribution ratio. The rated powers of the components to be defined are the PV array, grid-tie inverter, bi-directional inverter, battery storage and DG. A GUI is used to perform all the system sizing procedures, making it a user-friendly sizing tool for AC-coupled PV hybrid systems. The GUI was developed using Visual Studio 2015, based on real data for the Malaysian climate.
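A minimal Python sketch of the energy-balance sizing idea described above; the function name, the peak-sun-hours figure and the performance ratio are illustrative assumptions, not values or names from the paper.

# Sketch of AC-coupled PV hybrid sizing by annual average energy balance.
# All parameter values below are illustrative assumptions.
PEAK_SUN_HOURS = 4.5      # assumed daily average peak sun hours (tropical climate)
PERFORMANCE_RATIO = 0.75  # assumed system losses (inverters, temperature, wiring)

def size_pv_array(daily_load_kwh, pv_fraction):
    """Rated PV power (kWp) so that PV supplies pv_fraction of the daily load,
    with the diesel generator (DG) covering the remainder."""
    pv_energy_kwh = daily_load_kwh * pv_fraction
    return pv_energy_kwh / (PEAK_SUN_HOURS * PERFORMANCE_RATIO)

daily_load = 120.0   # assumed average daily load, kWh
pv_to_dg = 0.6       # target PV share of the energy contribution
print("PV array: %.1f kWp" % size_pv_array(daily_load, pv_to_dg))
print("DG share: %.0f kWh/day" % (daily_load * (1.0 - pv_to_dg)))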
Belwalkar, A; Grasing, E; Van Geertruyden, W; Huang, Z; Misiolek, W Z
2008-07-01
Nanoporous anodic aluminum oxide (AAO) tubular membranes were fabricated from aluminum alloy tubes in sulfuric and oxalic acid electrolytes using a two-step anodization process. The membranes were investigated for characteristics such as pore size, interpore distance and thickness by varying the applied voltage and electrolyte concentration. The morphology of the membranes was examined using light optical and scanning electron microscopy and characterized using ImageJ software. Results showed that membranes having narrow pore size and uniform pore distribution with parallel channel arrays were obtained. The pore sizes ranged from 14 to 24 nm and the wall thicknesses were as high as 76 µm. It was found that the pore size increased in direct proportion with the applied voltage and inversely with the electrolyte concentration, while the interpore distance increased linearly with the applied voltage. It was also observed that an increase in acid concentration increased the tubular membrane wall thickness, which improved mechanical handling. By using anodic alumina technology, robust ceramic tubes with uniformly distributed pore structure and parallel nano-channels of lengths and sizes practical for industrial applications were reliably produced in quantity.
Automation of NMR structure determination of proteins.
Altieri, Amanda S; Byrd, R Andrew
2004-10-01
The automation of protein structure determination using NMR is coming of age. The tedious processes of resonance assignment, followed by assignment of NOE (nuclear Overhauser enhancement) interactions (now intertwined with structure calculation), assembly of input files for structure calculation, intermediate analyses of incorrect assignments and bad input data, and finally structure validation are all being automated with sophisticated software tools. The robustness of the different approaches continues to deal with problems of completeness and uniqueness; nevertheless, the future is very bright for automation of NMR structure generation to approach the levels found in X-ray crystallography. Currently, near completely automated structure determination is possible for small proteins, and the prospect for medium-sized and large proteins is good. Copyright 2004 Elsevier Ltd.
NASA Technical Reports Server (NTRS)
Martinovic, Zoran N.; Cerro, Jeffrey A.
2002-01-01
This is an interim user's manual for current procedures used in the Vehicle Analysis Branch at NASA Langley Research Center, Hampton, Virginia, for launch vehicle structural subsystem weight estimation based on finite element modeling and structural analysis. The process is intended to complement traditional methods of conceptual and early preliminary structural design such as the application of empirical weight estimation or application of classical engineering design equations and criteria on one dimensional "line" models. Functions of two commercially available software codes are coupled together. Vehicle modeling and analysis are done using SDRC/I-DEAS, and structural sizing is performed with the Collier Research Corp. HyperSizer program.
Study on light weight design of truss structures of spacecrafts
NASA Astrophysics Data System (ADS)
Zeng, Fuming; Yang, Jianzhong; Wang, Jian
2015-08-01
Truss structures are usually adopted as the main structural form for spacecraft due to their high efficiency in supporting concentrated loads. Light-weight design is now the primary concern during the conceptual design of spacecraft. Implementation of light-weight design on a truss structure typically goes through three processes: topology optimization, size optimization and composites optimization. During each optimization process, an appropriate algorithm is selected, such as the traditional optimality criterion method, a mathematical programming method, or an intelligent algorithm that simulates growth and evolution processes in nature. Based on these processes and algorithms, and combined with engineering practice and commercial software, a summary is given of the implementation of light-weight design on truss structures for spacecraft.
Method for analyzing soil structure according to the size of structural elements
NASA Astrophysics Data System (ADS)
Wieland, Ralf; Rogasik, Helmut
2015-02-01
The soil structure in situ is the result of cropping history and soil development over time. It can be assessed by the size distribution of soil structural elements such as air-filled macro-pores, aggregates and stones, which are responsible for important water and solute transport processes, gas exchange, and the stability of the soil against compacting and shearing forces exerted by agricultural machinery. A method was developed to detect structural elements of the soil in selected horizontal slices of soil core samples with different soil structures in order for them to be implemented accordingly. In the second step, a fitting tool (Eureqa) based on artificial programming was used to find a general function to describe ordered sets of detected structural elements. It was shown that all the samples obey a hyperbolic function: Y(k) = A/(B + k), k ∈ {0, 1, 2, …}. This general behavior can be used to develop a classification method based on parameters {A and B}. An open source software program in Python was developed, which can be downloaded together with a selection of soil samples.
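The reported hyperbolic law can be fitted per sample in a few lines; a sketch using SciPy, with made-up counts of structural elements standing in for real data.

import numpy as np
from scipy.optimize import curve_fit

def hyperbola(k, A, B):
    # Y(k) = A / (B + k), the general law reported for all samples
    return A / (B + k)

k = np.arange(10)  # ordered index of detected structural elements
y = np.array([50.0, 26.0, 17.0, 12.5, 10.0, 8.5, 7.0, 6.3, 5.5, 5.0])  # illustrative

(A, B), _ = curve_fit(hyperbola, k, y, p0=(50.0, 1.0))
print("A = %.2f, B = %.2f" % (A, B))  # {A, B} feed the proposed classification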
Lyerla, R; Gouws, E; García-Calleja, J M; Zaniewski, E
2006-06-01
This paper describes improvements and updates to an established approach to making epidemiological estimates of HIV prevalence in countries with low level and concentrated epidemics. The structure of the software used to make estimates is briefly described, with particular attention to changes and improvements. The approach focuses on identifying populations which, through their behaviour, are at high risk of infection with HIV or who are exposed through the risk behaviour of their sexual partners. Estimates of size and HIV prevalence of these populations allow the total number of HIV infected people in a country or region to be estimated. Major changes in the software focus on the move away from short term projections and towards developing an epidemiological curve that more accurately represents the change in prevalence of HIV over time. The software continues to provide an output file for use in the Spectrum software so as to estimate the demographic impact of HIV infection at country level.
A new software for dimensional measurements in 3D endodontic root canal instrumentation.
Sinibaldi, Raffaele; Pecci, Raffaella; Somma, Francesco; Della Penna, Stefania; Bedini, Rossella
2012-01-01
The main issue to be faced in obtaining size estimates of 3D modifications of the dental canal after endodontic treatment is the co-registration of the image stacks obtained through micro computed tomography (micro-CT) scans before and after treatment. Here, quantitative analysis of micro-CT images has been performed by means of new dedicated software targeted to the analysis of the root canal after endodontic instrumentation. This software analytically calculates the best superposition between the pre and post structures using the inertia tensor of the tooth. This strategy avoids minimization procedures, which can be user-dependent and time-consuming. Once the co-registration has been achieved, dimensional measurements are performed by simultaneous evaluation of quantitative parameters over the two superimposed stacks of micro-CT images. The software automatically calculates the changes in volume, surface and symmetry axes in 3D occurring after the instrumentation. The calculation is based on direct comparison of the canal and canal branches selected by the user on the pre-treatment image stack.
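A sketch of the inertia-tensor alignment idea in Python/NumPy, assuming the pre- and post-treatment teeth are available as binary voxel masks; this is an illustration of the principle, not the authors' code.

import numpy as np

def inertia_axes(mask):
    """Centroid and principal axes of a binary voxel volume."""
    pts = np.argwhere(mask > 0).astype(float)        # voxel coordinates
    c = pts.mean(axis=0)                             # centroid
    q = pts - c
    inertia = (q ** 2).sum() * np.eye(3) - q.T @ q   # I = sum(|q|^2 Id - q q^T)
    _, vecs = np.linalg.eigh(inertia)                # columns are the principal axes
    return c, vecs

def rigid_alignment(pre, post):
    """Rotation R and translation t mapping post coordinates onto pre.
    In practice the sign/order ambiguity of eigenvectors must also be resolved."""
    c_pre, v_pre = inertia_axes(pre)
    c_post, v_post = inertia_axes(post)
    R = v_pre @ v_post.T
    t = c_pre - R @ c_post
    return R, t

pre = np.zeros((32, 32, 32)); pre[8:24, 12:20, 14:18] = 1
post = np.transpose(pre, (0, 2, 1))   # same volume with two axes swapped
R, t = rigid_alignment(pre, post)
print(np.round(R, 2), np.round(t, 2))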
Structured Light-Based 3D Reconstruction System for Plants.
Nguyen, Thuy Tuong; Slaughter, David C; Max, Nelson; Maloof, Julin N; Sinha, Neelima
2015-07-29
Camera-based 3D reconstruction of physical objects is one of the most popular computer vision trends in recent years. Many systems have been built to model different real-world subjects, but there is a lack of a completely robust system for plants. This paper presents a full 3D reconstruction system that incorporates both hardware structures (including the proposed structured light system to enhance textures on object surfaces) and software algorithms (including the proposed 3D point cloud registration and plant feature measurement). This paper demonstrates the ability to produce 3D models of whole plants created from multiple pairs of stereo images taken at different viewing angles, without the need to destructively cut away any parts of a plant. The ability to accurately predict phenotyping features, such as the number of leaves, plant height, leaf size and internode distances, is also demonstrated. Experimental results show that, for plants having a range of leaf sizes and a distance between leaves appropriate for the hardware design, the algorithms successfully predict phenotyping features in the target crops, with a recall of 0.97 and a precision of 0.89 for leaf detection and less than a 13-mm error for plant size, leaf size and internode distance.
[Optical Design of Miniature Infrared Gratings Spectrometer Based on Planar Waveguide].
Li, Yang-yu; Fang, Yong-hua; Li, Da-cheng; Liu, Yang
2015-03-01
In order to miniaturize an infrared spectrometer, we analyze current optical designs of miniature spectrometers and propose a method for designing a miniature infrared grating spectrometer based on a planar waveguide. A common miniature spectrometer uses miniature optical elements to reduce the size of the system, which also shrinks the effective aperture, so the performance of the spectrometer drops. The miniaturization principle of the planar waveguide spectrometer is different from that of a common miniature spectrometer. In a planar waveguide spectrometer, the propagation of light is confined to a thin planar waveguide, as if the whole optical system had been squashed flat. In the direction parallel to the planar waveguide, the light through the slit is collimated, dispersed and focused, and a spectral image is formed in the detector plane; this propagation is similar to that in a common miniature spectrometer. In the direction perpendicular to the planar waveguide, light is repeatedly reflected by the upper and lower surfaces of the waveguide as it propagates, so the corresponding optical elements can be very small in the vertical direction, which reduces the size of the optical system while the performance of the spectrometer remains good. The design method for the planar waveguide spectrometer can be separated into two parts: Czerny-Turner structure design and planar waveguide structure design. First, using aberration theory, an aberration-corrected (spherical aberration, coma, focal curve) Czerny-Turner structure is obtained, and the operating wavelength range and spectral resolution are fixed. Then, using geometrical optics, a planar waveguide structure is designed to reduce the system size and correct the astigmatism. The planar waveguide structure includes a planar waveguide and two cylindrical lenses. Finally, they are modeled together in optical design software and optimized as a whole. An infrared planar waveguide spectrometer was designed using this method. The operating wavelength range is 8-12 μm, the numerical aperture is 0.22, and the linear array detector contains 64 elements. Using Zemax software, the design was optimized and analyzed. The results indicate that the size of the optical system is 130 mm x 125 mm x 20 mm and the spectral resolution of the spectrometer is 80 nm, which satisfies the requirements of the design index. Thus this method can be used to design a miniature spectrometer without movable parts, with a size in the range of several cubic centimeters.
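Underlying any such Czerny-Turner layout is the standard grating equation (general background, not specific to this design):

m\lambda = d(\sin\alpha + \sin\beta) ,

where m is the diffraction order, d the groove spacing, and \alpha and \beta the incidence and diffraction angles; dispersing 8-12 \mu m light across a 64-element array then fixes the required angular spread of the focusing optics.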
Many-level multilevel structural equation modeling: An efficient evaluation strategy.
Pritikin, Joshua N; Hunter, Michael D; von Oertzen, Timo; Brick, Timothy R; Boker, Steven M
2017-01-01
Structural equation models are increasingly used for clustered or multilevel data in cases where mixed regression is too inflexible. However, when there are many levels of nesting, these models can become difficult to estimate. We introduce a novel evaluation strategy, Rampart, that applies an orthogonal rotation to the parts of a model that conform to commonly met requirements. This rotation dramatically simplifies fit evaluation in a way that becomes more potent as the size of the data set increases. We validate and evaluate the implementation using a 3-level latent regression simulation study. Then we analyze data from a state-wide child behavioral health measure administered by the Oklahoma Department of Human Services. We demonstrate the efficiency of Rampart compared to other similar software using a latent factor model with a 5-level decomposition of latent variance. Rampart is implemented in OpenMx, a free and open source software.
NASA Astrophysics Data System (ADS)
Alexander, K.; Easterbrook, S. M.
2015-04-01
We analyze the source code of eight coupled climate models, selected from those that participated in the CMIP5 (Taylor et al., 2012) or EMICAR5 (Eby et al., 2013; Zickfeld et al., 2013) intercomparison projects. For each model, we sort the preprocessed code into components and subcomponents based on dependency structure. We then create software architecture diagrams that show the relative sizes of these components/subcomponents and the flow of data between them. The diagrams also illustrate several major classes of climate model design; the distribution of complexity between components, which depends on historical development paths as well as the conscious goals of each institution; and the sharing of components between different modeling groups. These diagrams offer insights into the similarities and differences in structure between climate models, and have the potential to be useful tools for communication between scientists, scientific institutions, and the public.
Spherical Cryogenic Hydrogen Tank Preliminary Design Trade Studies
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Bednarcyk, Brett A.; Collier, Craig S.; Yarrington, Phillip W.
2007-01-01
A structural analysis, sizing optimization, and weight prediction study was performed by Collier Research Corporation and NASA Glenn on a spherical cryogenic hydrogen tank. The tank consisted of an inner and outer wall separated by a vacuum for thermal insulation purposes. HyperSizer (Collier Research and Development Corporation), a commercial automated structural analysis and sizing software package was used to design the lightest feasible tank for a given overall size and thermomechanical loading environment. Weight trade studies were completed for different panel concepts and metallic and composite material systems. Extensive failure analyses were performed for each combination of dimensional variables, materials, and layups to establish the structural integrity of tank designs. Detailed stress and strain fields were computed from operational temperature changes and pressure loads. The inner tank wall is sized by the resulting biaxial tensile stresses which cause it to be strength driven, and leads to an optimum panel concept that need not be stiffened. Conversely, the outer tank wall is sized by a biaxial compressive stress field, induced by the pressure differential between atmospheric pressure and the vacuum between the tanks, thereby causing the design to be stability driven and thus stiffened to prevent buckling. Induced thermal stresses become a major sizing driver when a composite or hybrid composite/metallic material systems are used for the inner tank wall for purposes such as liners to contain the fuel and reduce hydrogen permeation.
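The stress states described above follow from the thin-walled spherical shell (membrane) result; for a wall of radius r and thickness t under pressure difference p, the two in-plane principal stresses are equal and biaxial:

\sigma_1 = \sigma_2 = \frac{p r}{2 t} ,

tensile on the inner wall under internal pressure, and compressive on the outer wall, where the vacuum gap puts roughly one atmosphere of external pressure on the shell.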
NASA Technical Reports Server (NTRS)
Adams, J. R.; Hawley, S. W.; Peterson, G. R.; Salinger, S. S.; Workman, R. A.
1971-01-01
A hardware and software specification covering requirements for the computer enhancement of structural weld radiographs was considered. Three scanning systems were used to digitize more than 15 weld radiographs. The performance of these systems was evaluated by determining modulation transfer functions and noise characteristics. Enhancement techniques were developed and applied to the digitized radiographs. The scanning parameters of spot size and spacing and film density were studied to optimize the information content of the digital representation of the image.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salama, A.; Mikhail, M.
Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation, (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, evaluation of density and size partition characteristics and attrition curves, and (3) generation of graphics output. The Separation ChARacteristics Estimation software packages (SCARE) are developed to balance raw density or size separation data. The cases of density and size separation data are considered. The generated balanced data can take the balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The computer programs described in this paper are valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).
A VxD-based automatic blending system using multithreaded programming.
Wang, L; Jiang, X; Chen, Y; Tan, K C
2004-01-01
This paper discusses the object-oriented software design for an automatic blending system. By combining the advantages of a programmable logic controller (PLC) and an industrial control PC (ICPC), an automatic blending control system is developed for a chemical plant. The system structure and multithread-based communication approach are first presented in this paper. The overall software design issues, such as system requirements and functionalities, are then discussed in detail. Furthermore, by replacing the conventional dynamic link library (DLL) with virtual X device drivers (VxD's), a practical and cost-effective solution is provided to improve the robustness of the Windows platform-based automatic blending system in small- and medium-sized plants.
1982-05-13
Size Of The Software. A favourite measure for software system size is lines of operational code, or deliverable code (operational code plus ... regression models, these conversions are either derived from productivity measures using the "cost per instruction" type of equation or they are ... appropriate to different development organisations, different project types, different sets of units for measuring e and s, and different items
King, Michael A; Scotty, Nicole; Klein, Ronald L; Meyer, Edwin M
2002-10-01
Assessing the efficacy of in vivo gene transfer often requires a quantitative determination of the number, size, shape, or histological visualization characteristics of biological objects. The optical fractionator has become a choice stereological method for estimating the number of objects, such as neurons, in a structure, such as a brain subregion. Digital image processing and analytic methods can increase detection sensitivity and quantify structural and/or spectral features located in histological specimens. We describe a hardware and software system that we have developed for conducting the optical fractionator process. A microscope equipped with a video camera and motorized stage and focus controls is interfaced with a desktop computer. The computer contains a combination live video/computer graphics adapter with a video frame grabber and controls the stage, focus, and video via a commercial imaging software package. Specialized macro programs have been constructed with this software to execute command sequences requisite to the optical fractionator method: defining regions of interest, positioning specimens in a systematic uniform random manner, and stepping through known volumes of tissue for interactive object identification (optical dissectors). The system affords the flexibility to work with count regions that exceed the microscope image field size at low magnifications and to adjust the parameters of the fractionator sampling to best match the demands of particular specimens and object types. Digital image processing can be used to facilitate object detection and identification, and objects that meet criteria for counting can be analyzed for a variety of morphometric and optical properties. Copyright 2002 Elsevier Science (USA)
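For reference, the optical fractionator estimate of total object number takes the standard form

\hat{N} = \sum Q^{-} \cdot \frac{1}{ssf} \cdot \frac{1}{asf} \cdot \frac{1}{tsf} ,

where \sum Q^{-} is the total count of objects in the optical dissectors and ssf, asf and tsf are the section, area and thickness sampling fractions of the systematic uniform random design.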
Neugebauer, Tomasz; Bordeleau, Eric; Burrus, Vincent; Brzezinski, Ryszard
2015-01-01
Data visualization methods are necessary during the exploration and analysis activities of an increasingly data-intensive scientific process. There are few existing visualization methods for raw nucleotide sequences of a whole genome or chromosome. Software for data visualization should allow the researchers to create accessible data visualization interfaces that can be exported and shared with others on the web. Herein, novel software developed for generating DNA data visualization interfaces is described. The software converts DNA data sets into images that are further processed as multi-scale images to be accessed through a web-based interface that supports zooming, panning and sequence fragment selection. Nucleotide composition frequencies and GC skew of a selected sequence segment can be obtained through the interface. The software was used to generate DNA data visualization of human and bacterial chromosomes. Examples of visually detectable features such as short and long direct repeats, long terminal repeats, mobile genetic elements, heterochromatic segments in microbial and human chromosomes, are presented. The software and its source code are available for download and further development. The visualization interfaces generated with the software allow for the immediate identification and observation of several types of sequence patterns in genomes of various sizes and origins. The visualization interfaces generated with the software are readily accessible through a web browser. This software is a useful research and teaching tool for genetics and structural genomics.
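GC skew, one of the quantities the interface reports for a selected fragment, is the normalized G-C imbalance; a minimal sliding-window sketch in Python (window and step sizes are arbitrary choices).

def gc_skew(seq, window=10000, step=1000):
    """(G - C) / (G + C) over sliding windows of a nucleotide sequence."""
    skews = []
    for start in range(0, max(len(seq) - window, 0) + 1, step):
        s = seq[start:start + window].upper()
        g, c = s.count("G"), s.count("C")
        skews.append((g - c) / (g + c) if g + c else 0.0)
    return skews

print(gc_skew("ATGCGGGCCATATGCCCGGG" * 500, window=2000, step=500)[:5])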
Tuszynski, Tobias; Rullmann, Michael; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Gertz, Hermann-Josef; Hesse, Swen; Seese, Anita; Lobsien, Donald; Sabri, Osama; Barthel, Henryk
2016-06-01
For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of those tools are poorly investigated so far in analyzing amyloid PET data, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis.
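The two quantities at the heart of this comparison are defined as

\mathrm{SUVR} = \frac{\bar{C}_{\mathrm{VOI}}}{\bar{C}_{\mathrm{cerebellar\;cortex}}}, \qquad d = \frac{\bar{x}_{\mathrm{AD}} - \bar{x}_{\mathrm{HC}}}{s_{\mathrm{pooled}}} ,

i.e. the mean tracer uptake in a target VOI normalized by the cerebellar cortex reference, and Cohen's effect size for the AD-versus-HC group separation.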
Surface models of the male urogenital organs built from the Visible Korean using popular software
Shin, Dong Sun; Park, Jin Seo; Shin, Byeong-Seok
2011-01-01
Unlike volume models, surface models, which are empty three-dimensional images, have a small file size, so they can be displayed, rotated, and modified in real time. Thus, surface models of male urogenital organs can be effectively applied to an interactive computer simulation and contribute to the clinical practice of urologists. To create high-quality surface models, the urogenital organs and other neighboring structures were outlined in 464 sectioned images of the Visible Korean male using Adobe Photoshop; the outlines were interpolated on Discreet Combustion; then an almost automatic volume reconstruction followed by surface reconstruction was performed on 3D-DOCTOR. The surface models were refined and assembled in their proper positions on Maya, and a surface model was coated with actual surface texture acquired from the volume model of the structure on specially programmed software. In total, 95 surface models were prepared, particularly complete models of the urinary and genital tracts. These surface models will be distributed to encourage other investigators to develop various kinds of medical training simulations. Increasingly automated surface reconstruction technology using commercial software will enable other researchers to produce their own surface models more effectively. PMID:21829759
Structures for the 3rd Generation Reusable Concept Vehicle
NASA Technical Reports Server (NTRS)
Hrinda, Glenn A.
2001-01-01
A major goal of NASA is to create an advanced space transportation system that provides a safe, affordable highway through the air and into space. The long-term plans are to reduce the risk of crew loss to 1 in 1,000,000 missions and to reduce the cost of reaching low-Earth orbit by a factor of 100 from today's costs. A third-generation reusable concept vehicle (RCV) was developed to assess the technologies required to meet NASA's space access goals. The vehicle will launch from Cape Kennedy carrying a 25,000 lb payload to the International Space Station (ISS). The system is an air-breathing launch vehicle (ABLV): a hypersonic lifting body augmented with rockets, using triple-point hydrogen and liquid oxygen propellant. The focus of this paper is on the structural concepts and analysis methods used in developing the third-generation reusable launch vehicle (RLV). Member sizes, concepts and material selections will be discussed as well as analysis methods used in optimizing the structure. Analysis based on the HyperSizer structural sizing software will be discussed. Design trades required to optimize structural weight will be presented.
NASA Technical Reports Server (NTRS)
Palmer, Peter T.; Wong, C. M.; Salmonson, J. D.; Yost, R. A.; Griffin, T. P.; Yates, N. A.; Lawless, James G. (Technical Monitor)
1994-01-01
The utility of MS/MS for both target compound analysis and the structure elucidation of unknowns has been described in a number of references. A broader acceptance of this technique has not yet been realized, as it requires large, complex, and costly instrumentation that has not been competitive with more conventional techniques. Recent advancements in ion trap mass spectrometry promise to change this situation. Although the ion trap's small size, sensitivity, and ability to perform multiple stages of mass spectrometry have made it eminently suitable for on-line, real-time monitoring applications, advanced automation techniques are required to make these capabilities more accessible to non-experts. Toward this end, we have developed custom software for the design and implementation of MS/MS experiments. This software allows the user to take full advantage of the ion trap's versatility with respect to ionization techniques, scan functions, and ion accumulation/ejection methods. Additionally, expert system software has been developed for autonomous target compound analysis. This software has been linked to the ion trap control software and a commercial data system to bring all of the steps in the analysis cycle under control of the expert system. These software development efforts and their utilization for a number of trace analysis applications will be described.
Villoria, Eduardo M; Lenzi, Antônio R; Soares, Rodrigo V; Souki, Bernardo Q; Sigurdsson, Asgeir; Marques, Alexandre P; Fidel, Sandra R
2017-01-01
To describe the use of open-source software for the post-processing of CBCT imaging in the assessment of periapical lesion development after endodontic treatment. CBCT scans were retrieved from the endodontic records of two patients. Three-dimensional virtual models, voxel counting, volumetric measurement (mm³) and mean intensity of the periapical lesion were obtained with the ITK-SNAP v. 3.0 software. Three-dimensional models of the lesions were aligned and overlapped in the MeshLab software, which performed an automatic registration of the anatomical structures based on best fit. Qualitative and quantitative analyses of the changes in lesion size after treatment were performed with the 3DMeshMetric software. ITK-SNAP v. 3.0 showed smaller values for the voxel count and volume of the lesion segmented in yellow, indicating a reduction in lesion volume after treatment. A higher mean intensity of the image segmented in yellow was also observed, suggesting new bone formation. Colour mapping and the "point value" tool allowed visualization of the reduction of periapical lesions in several regions. Open-source software thus gives researchers and clinicians a practical option for monitoring endodontic periapical lesions.
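A minimal sketch of the voxel-count-to-volume step reported above, assuming a segmentation label volume loaded as a NumPy array (hypothetical names and voxel size; not the actual ITK-SNAP API):

```python
import numpy as np

def lesion_volume_mm3(labels, lesion_label, voxel_size_mm):
    """Volume = number of voxels carrying the lesion label
    multiplied by the volume of a single voxel."""
    n_voxels = int(np.count_nonzero(labels == lesion_label))
    voxel_volume = voxel_size_mm[0] * voxel_size_mm[1] * voxel_size_mm[2]
    return n_voxels * voxel_volume

# Hypothetical usage: labels exported from a segmentation tool,
# voxel_size_mm = (0.3, 0.3, 0.3) for a typical small-FOV CBCT scan.
```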
NASA Technical Reports Server (NTRS)
Barnes, Jeffrey M.
2011-01-01
All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.
Velu, Juliëtte F; Groot Jebbink, Erik; de Vries, Jean-Paul Pm; van der Palen, Job Am; Slump, Cornelis H; Geelkerken, Robert H
2018-04-01
Objectives Correct sizing of endoprostheses used for the treatment of abdominal aortic aneurysms is important to prevent endoleaks and migration. Sizing requires several steps, and each step introduces a possible sizing error. The goal of this study was to investigate the magnitude of these errors compared to the gold standard: a vessel phantom. This study focuses on the errors in sizing with three different brands of computed tomography angiography scanners in combination with three reconstruction software packages. Methods Three phantoms with different diameters, altitudes and azimuths were scanned with three computed tomography scanners: Toshiba Aquilion 64-slice, Philips Brilliance iCT 256-slice and Siemens Somatom Sensation 64-slice. The phantom diameters were determined in the stretched view after central lumen line reconstruction by three observers using Simbionix PROcedure Rehearsal Studio, 3mensio and TeraRecon planning software. The observers, all novices in sizing endoprostheses using planning software, measured 108 slices each. Two senior vascular surgeons set the tolerated error margin of sizing at ±1.0 mm. Results In total, 11.3% of the measurements (73/648) were outside the set margin of ±1.0 mm from the phantom diameter, with significant differences between the scanner types (14.8%, 12.1% and 6.9% for the Siemens, Philips and Toshiba scanners, respectively, p-value = 0.032), but not between the software packages (8.3%, 11.1%, 14.4%, p-value = 0.141) or the observers (10.6%, 9.7%, 13.4%, p-value = 0.448). Conclusions It can be concluded that the errors in sizing were independent of the software package used, but phantoms scanned with the Siemens scanner were measured incorrectly significantly more often than phantoms scanned with the Toshiba scanner. Consequently, awareness of the computed tomography scanner type and scanner settings is necessary, especially in complex abdominal aortic aneurysm sizing for fenestrated or branched endovascular aneurysm repair, where appropriate sizing is of utmost importance.
Structured Light-Based 3D Reconstruction System for Plants
Nguyen, Thuy Tuong; Slaughter, David C.; Max, Nelson; Maloof, Julin N.; Sinha, Neelima
2015-01-01
Camera-based 3D reconstruction of physical objects is one of the most popular computer vision trends in recent years. Many systems have been built to model different real-world subjects, but there is a lack of a completely robust system for plants. This paper presents a full 3D reconstruction system that incorporates both hardware structures (including the proposed structured light system to enhance textures on object surfaces) and software algorithms (including the proposed 3D point cloud registration and plant feature measurement). This paper demonstrates the ability to produce 3D models of whole plants created from multiple pairs of stereo images taken at different viewing angles, without the need to destructively cut away any parts of a plant. The ability to accurately predict phenotyping features, such as the number of leaves, plant height, leaf size and internode distances, is also demonstrated. Experimental results show that, for plants having a range of leaf sizes and a distance between leaves appropriate for the hardware design, the algorithms successfully predict phenotyping features in the target crops, with a recall of 0.97 and a precision of 0.89 for leaf detection and less than a 13-mm error for plant size, leaf size and internode distance. PMID:26230701
NASA Astrophysics Data System (ADS)
Marcoin, W.; Pasterny, K.; Wrzalik, R.
2005-05-01
Theoretical calculations of the magnesium aspartate-arginine (Mg[Asp-Arg]) structure and spectroscopic characteristics have been performed in the gas phase with the GAUSSIAN 98 software package using density functional theory (DFT) at the B3PW91 level. The 6-31+G* basis set was selected for its reasonable quality and size. A comparison with corresponding results for magnesium aspartate-glycine (Mg[Asp-Gly]) is presented. NMR and IR measurements were carried out, and the experimental 1H and 13C chemical shifts and IR spectra are compared with the calculated spectral parameters.
StochKit2: software for discrete stochastic simulation of biochemical systems with events.
Sanft, Kevin R; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R
2011-09-01
StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), as well as tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/. Contact: petzold@engineering.ucsb.edu.
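A minimal sketch of Gillespie's direct-method SSA, the algorithm family StochKit2 implements, for a single decay reaction A -> B (hypothetical rate constant and counts; StochKit2 itself is driven through model files and command-line tools, not this function):

```python
import math
import random

def ssa_decay(n_a=100, k=0.5, t_end=10.0, seed=42):
    """Direct-method SSA for the single reaction A -> B with propensity k*A."""
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, n_a)]
    while n_a > 0:
        a0 = k * n_a                              # total propensity
        tau = -math.log(1.0 - rng.random()) / a0  # exponential waiting time
        if t + tau > t_end:
            break
        t += tau
        n_a -= 1                                  # fire the only reaction
        traj.append((t, n_a))
    return traj
```

For networks with many reactions, the direct method also draws a second random number to pick which reaction fires, weighted by the individual propensities; tau-leaping then trades exactness for speed by firing many reactions per step.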
Interlinking backscatter, grain size and benthic community structure
NASA Astrophysics Data System (ADS)
McGonigle, Chris; Collier, Jenny S.
2014-06-01
The relationship between acoustic backscatter, sediment grain size and benthic community structure is examined using three different quantitative methods, covering image- and angular-response-based approaches. Multibeam time-series backscatter (300 kHz) data acquired in 2008 off the coast of East Anglia (UK) are compared with grain size properties, macrofaunal abundance and biomass from 130 Hamon and 16 Clamshell grab samples. Three predictive methods are used: 1) image-based (mean backscatter intensity); 2) angular-response-based (predicted mean grain size); and 3) image-based (1st principal component and classification) from Quester Tangent Corporation Multiview software. Relationships between grain size and backscatter are explored using linear regression. Differences in grain size and benthic community structure between acoustically defined groups are examined using ANOVA and PERMANOVA+. Results for the Hamon grab stations indicate significant correlations between measured mean grain size and mean backscatter intensity, angular-response-predicted mean grain size, and the 1st principal component of the QTC analysis (all p < 0.001). Results for the Clamshell grab show stronger positive correlations for two of the methods: mean backscatter intensity (r2 = 0.619; p < 0.001) and angular-response-predicted mean grain size (r2 = 0.692; p < 0.001). ANOVA reveals significant differences in mean grain size (Hamon) within acoustic groups for all methods: mean backscatter (p < 0.001), angular-response-predicted grain size (p < 0.001), and QTC class (p = 0.009). Mean grain size (Clamshell) shows a significant difference between groups for mean backscatter (p = 0.001); the other methods were not significant. PERMANOVA on the Hamon abundance data shows that benthic community structure was significantly different between acoustic groups for all methods (p ≤ 0.001). Overall these results show considerable promise, in that more than 60% of the variance in the mean grain size of the Clamshell grab samples can be explained by mean backscatter or acoustically predicted grain size. These results show that there is significant predictive capacity for sediment characteristics from multibeam backscatter and that these acoustic classifications can have ecological validity.
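A sketch of the simple linear-regression step used to relate grain size to backscatter, assuming paired per-station values in NumPy arrays (hypothetical data; the study's actual processing used instrument software and QTC Multiview):

```python
import numpy as np

def fit_grain_size(backscatter_db, mean_grain_size_phi):
    """Ordinary least-squares fit and r^2 for grain size against backscatter.
    Both inputs are 1-D NumPy arrays of per-station values."""
    slope, intercept = np.polyfit(backscatter_db, mean_grain_size_phi, 1)
    predicted = slope * backscatter_db + intercept
    residual = mean_grain_size_phi - predicted
    r2 = 1.0 - residual.var() / mean_grain_size_phi.var()
    return slope, intercept, r2
```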
Software Testing for Evolutionary Iterative Rapid Prototyping
1990-12-01
...A basic dictionary definition describes prototyping as "an original type, form, or instance that serves as a model on which later..." ...on program size. Asset instruments the subject procedure and produces a graph of the structure for the type of data flow testing conducted.
Analysis of seismic stability of large-sized tank VST-20000 with software package ANSYS
NASA Astrophysics Data System (ADS)
Tarasenko, A. A.; Chepur, P. V.; Gruchenkova, A. A.
2018-05-01
The work is devoted to the study of the seismic stability of the vertical steel tank VST-20000, with due consideration of the response of the coupled "foundation-tank-liquid" system, conducted on the basis of the finite element method, modal analysis and linear spectral theory. The calculations are performed for a tank model with a high degree of detail in the metallic structures: the shell, fixed roof, bottom and reinforcing ring.
NASA Technical Reports Server (NTRS)
Holloway, Sidney E., III; Crossley, Edward A.; Miller, James B.; Jones, Irby W.; Davis, C. Calvin; Behun, Vaughn D.; Goodrich, Lewis R., Sr.
1995-01-01
Linear proof-mass actuator (LPMA) is friction-driven linear mass actuator capable of applying controlled force to structure in outer space to damp out oscillations. Capable of high accelerations and provides smooth, bidirectional travel of mass. Design eliminates gears and belts. LPMA strong enough to be used terrestrially where linear actuators needed to excite or damp out oscillations. High flexibility designed into LPMA by varying size of motors, mass, and length of stroke, and by modifying control software.
Sizing and Lifecycle Cost Analysis of an Ares V Composite Interstage
NASA Technical Reports Server (NTRS)
Mann, Troy; Smeltzer, Stan; Grenoble, Ray; Mason, Brian; Rosario, Sev; Fairbairn, Bob
2012-01-01
The Interstage Element of the Ares V launch vehicle was sized using a commercially available structural sizing software tool. Two different concepts were considered: a metallic design and a composite design. Both concepts were sized using similar levels of analysis fidelity, including the influence of design details on each concept. Additionally, the impact of the different manufacturing techniques and failure mechanisms for composite and metallic construction was considered. Significant details were included in the analysis models of each concept, including penetrations for human access, joint connections, and secondary loading effects. The designs and the results of the analysis were used to determine lifecycle cost estimates for the two Interstage designs. Lifecycle cost estimates were based on industry-provided cost data for similar launch vehicle components. The results indicated that significant mass and cost savings are attainable for the chosen composite concept as compared with the metallic option.
Osmani, Feroz A; Thakkar, Savyasachi; Ramme, Austin; Elbuluk, Ameer; Wojack, Paul; Vigdorchik, Jonathan M
2017-12-01
Preoperative total hip arthroplasty templating can be performed on radiographs using acetate prints or digital viewing software, or on computed tomography (CT) images. Our hypothesis was that 3D templating is more precise and accurate in cup size prediction than 2D templating with acetate prints or digital templating software. Data collected from 45 patients undergoing robotic-assisted total hip arthroplasty compared cup sizes templated on acetate prints and in OrthoView software with the MAKOplasty software that uses CT scans. Kappa analysis determined the strength of agreement between each templating modality and the final size used. t tests compared mean cup-size variance from the final size for each templating technique. The intraclass correlation coefficient (ICC) determined the reliability of digital and acetate planning by comparing the predictions of the operating surgeon and a blinded adult reconstructive fellow. The Kappa values for CT-guided, digital, and acetate templating against the final size were 0.974, 0.233, and 0.262, respectively. Both digital and acetate templating significantly overpredicted cup size compared to CT-guided methods (P < .001). There was no significant difference between digital and acetate templating (P = .117). The ICC values for digital and acetate templating were 0.928 and 0.931, respectively. CT-guided planning more accurately predicts hip implant cup size than the significantly overpredicting digital and acetate templating. CT-guided templating may also lead to better outcomes due to bone stock preservation, since a smaller and more accurate cup size is predicted than with digital and acetate methods.
[Simulation and data analysis of stereological modeling based on virtual slices].
Wang, Hao; Shen, Hong; Bai, Xiao-yan
2008-05-01
To establish a computer-assisted stereological model for simulating the process of slice sectioning and to evaluate the relationship between the section surface and the estimated three-dimensional structure. The model was designed mathematically and implemented as Win32 software based on MFC, using Microsoft Visual Studio as the IDE, to simulate an unlimited number of sections and to analyze the data derived from the model. The linearity of the model's fit was evaluated by comparison with the traditional formula. The Win32 software based on this algorithm allowed random sectioning of particles distributed randomly in an ideal virtual cube. The stereological parameters showed very high rates (>94.5% and 92%) in homogeneity and independence tests. The density, shape and size data of the sections were tested and conformed to a normal distribution. The output of the model and that of the image analysis system showed statistical correlation and consistency. The algorithm described can be used for evaluating the stereological parameters of the structure of tissue slices.
Velu, Juliëtte F; Groot Jebbink, Erik; de Vries, Jean-Paul P M; Slump, Cornelis H; Geelkerken, Robert H
2017-02-01
An important determinant of successful endovascular aortic aneurysm repair is proper sizing of the dimensions of the aortic-iliac vessels. The goal of the present study was to determine the concurrent validity (a method for comparing test scores) of the recently introduced Simbionix PROcedure Rehearsal Studio (PRORS) for EVAR sizing and planning. Seven vascular specialists analyzed anonymized computed tomography angiography scans of 70 patients with an infrarenal aneurysm of the abdominal aorta, using three different sizing software packages: Simbionix PRORS (Simbionix USA Corp., Cleveland, OH, USA), 3mensio (Pie Medical Imaging BV, Maastricht, The Netherlands), and TeraRecon (Aquarius, Foster City, CA, USA). The following measurements were included in the protocol: the diameter 1 mm below the most distal main renal artery, the diameter 15 mm below the lowest renal artery, the maximum aneurysm diameter, and the length from the most distal renal artery to the left iliac artery bifurcation. Averaged over the locations, the intraclass correlation coefficient was 0.83 for Simbionix versus 3mensio, 0.81 for Simbionix versus TeraRecon, and 0.86 for 3mensio versus TeraRecon. It can be concluded that the Simbionix sizing software is as precise as the two other validated and commercially available software packages.
A novel method to characterize silica bodies in grasses.
Dabney, Clemon; Ostergaard, Jason; Watkins, Eric; Chen, Changbin
2016-01-01
The deposition of silicon into the epidermal cells of grass species is thought to be an important mechanism that plants use in defense against pests and environmental stresses. A number of techniques are available to study the size, density and distribution pattern of silica bodies in grass leaves; however, none of them provides high-throughput analysis, especially for large numbers of samples. We developed a method utilizing the autofluorescence of silica bodies to investigate their size and distribution, along with the number of carbon inclusions within the silica bodies, in the perennial grass species Koeleria macrantha. Fluorescence images were analyzed with the image software Adobe Photoshop CS5 or ImageJ, which remarkably facilitated the quantification of silica bodies in the dry ash. We observed three types of silica bodies or silica-body-related mineral structures. Silica bodies were detected on both the abaxial and adaxial epidermis of K. macrantha leaves, although their sizes, density, and distribution patterns differed. No autofluorescence was detected from carbon inclusions. The combination of fluorescence microscopy and image processing software proved efficient for identifying and quantifying silica bodies in K. macrantha leaf tissues, and should be applicable to biological, ecological and geological studies of grasses, including forage grasses, turf grasses and cereal crops.
Self-assembling software generator
Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM
2011-11-25
A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
Specification-based software sizing: An empirical investigation of function metrics
NASA Technical Reports Server (NTRS)
Jeffery, Ross; Stathis, John
1993-01-01
For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, in themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.
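For readers unfamiliar with function counts, the sketch below totals unadjusted function points using the published IFPUG average complexity weights (the counts themselves are hypothetical):

```python
# IFPUG average complexity weights for the five function types.
FP_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

def unadjusted_function_points(counts):
    """Sum of each function-type count times its average weight."""
    return sum(FP_WEIGHTS[kind] * n for kind, n in counts.items())

# Hypothetical specification-derived counts for a small system:
print(unadjusted_function_points({
    "external_inputs": 20, "external_outputs": 12,
    "external_inquiries": 8, "internal_logical_files": 6,
    "external_interface_files": 2,
}))  # -> 246
```

Rater error of the kind the study reports enters through the counts dictionary: two raters reading the same specification may classify or count the five function types differently, which the weights then amplify.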
System Model for MEMS based Laser Ultrasonic Receiver
NASA Technical Reports Server (NTRS)
Wilson, William C.
2002-01-01
A need has been identified for more advanced nondestructive evaluation technologies for assuring the integrity of airframe structures, wiring, etc. Laser ultrasonic inspection instruments have been shown to detect flaws in structures. However, these instruments are generally too bulky to be used in the confined spaces that are typical of aerospace vehicles. Microsystems technology is one key to reducing the size of current instruments and enabling increased inspection coverage in areas that were previously inaccessible due to instrument size and weight. This paper investigates the system modeling of a Micro OptoElectroMechanical System (MOEMS) based laser ultrasonic receiver. The system model is constructed in software using MATLAB's dynamical simulator, Simulink. The optical components are modeled using geometrical matrix methods and include some image processing. The system model includes a test bench which simulates input stimuli and models the behavior of the material under test.
Smith, Nicholas; Witham, Shawn; Sarkar, Subhra; Zhang, Jie; Li, Lin; Li, Chuan; Alexov, Emil
2012-06-15
A new edition of the DelPhi web server, DelPhi web server v2, has been released to include atomic-style presentation of geometrical figures. These geometrical objects can be used to model nano-size objects together with real biological macromolecules. The position and size of an object can be manipulated by the user in real time until the desired results are achieved. The server fixes structural defects, adds hydrogen atoms and calculates electrostatic energies and the corresponding electrostatic potential and ionic distributions. The web server follows a client-server architecture built on PHP and HTML and utilizes the DelPhi software. The computation is carried out on a supercomputer cluster and the results are returned to the user via the HTTP protocol, including the ability to visualize the structure and the corresponding electrostatic potential via a Jmol implementation. The DelPhi web server is available from http://compbio.clemson.edu/delphi_webserver.
Finite-Difference Time-Domain Analysis of Tapered Photonic Crystal Fiber
NASA Astrophysics Data System (ADS)
Ali, M. I. Md; Sanusidin, S. N.; Yusof, M. H. M.
2018-03-01
This paper describes the simulation of a tapered photonic crystal fiber (PCF) of the single-mode LMA-8 type, based on the correlation of the scattering pattern at a wavelength of 1.55 μm, analysis of the transmission spectrum over the wavelength range of 1.0 to 2.5 μm, and the correlation of the transmission spectrum with the refractive index change in the photonic crystal holes for taper sizes of 0.1 to 1.0, using Optiwave simulation software. The main objective is to simulate the tapered LMA-8 PCF with the Finite-Difference Time-Domain (FDTD) technique for sensing applications, improving the capabilities of the PCF without collapsing the crystal holes. The FDTD analyses used are the scattering pattern and the transverse transmission, with principal component analysis (PCA) applied as a mathematical tool in MathCad software to model the obtained data. The simulation results showed no obvious correlation for the scattering pattern at a wavelength of 1.55 μm, a correlation between taper size and transverse transmission, and a parabolic relationship between taper size and the refractive index change inside the crystal structure.
Research on tomato seed vigor based on X-ray digital image
NASA Astrophysics Data System (ADS)
Zhao, Xueguan; Gao, Yuanyuan; Wang, Xiu; Li, Cuiling; Wang, Songlin; Feng, Qinghun
2016-10-01
Seed size, internal abnormalities and damage affect the germination of tomato seeds. The purpose of this paper was to study the relationship between the internal morphology and size of tomato seeds and their germination. A preprocessing algorithm for X-ray images of tomato seeds was studied, and the internal structural characteristics of the seeds were extracted by image processing algorithms. Image processing software was developed to determine the cavity area between embryo and endosperm and the whole seed area. According to differences in embryo and endosperm area and internal structural condition, the seeds were divided into six categories. Germination tests were then performed on three kinds of tomato seed, and the relationship of seed vigor to seed size and internal free cavity was explored through the germination experiments. Seedling evaluation tests showed that X-ray image analysis provides a clear view of the inside of the seed and a sound method for seed morphology research. Among seeds of the same size, the larger the area of the endosperm and embryo, the greater the probability that a healthy seedling sprouts. Mechanical damage adversely affects seed germination, and deteriorated tissue is prone to produce weak and abnormal seedlings.
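A minimal sketch of the kind of segmentation step described above, using Otsu thresholding (via scikit-image) to separate the seed from the background and then darker internal cavities within it, assuming a grayscale X-ray image as a NumPy array. The authors' software is custom, so this is only an illustrative stand-in, and it assumes the seed appears brighter than the background:

```python
import numpy as np
from skimage.filters import threshold_otsu

def seed_and_cavity_areas(xray, pixel_area_mm2):
    """Return (seed area, cavity area) in mm^2 from a grayscale X-ray image.
    First Otsu split: seed vs. background; second Otsu split inside the
    seed: dark pockets (free cavity) vs. tissue."""
    seed_mask = xray > threshold_otsu(xray)
    interior_values = xray[seed_mask]
    cavity_mask = seed_mask & (xray < threshold_otsu(interior_values))
    return (seed_mask.sum() * pixel_area_mm2,
            cavity_mask.sum() * pixel_area_mm2)
```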
NASA Astrophysics Data System (ADS)
Khanal, Lokendra Raj; Williams, Thomas; Qiang, You
2018-06-01
Iron/iron-oxide (Fe–Fe3O4) core–shell nanoclusters (NCs) synthesized by a cluster deposition technique were subjected to a study of their high-temperature structural and morphological behavior. Annealing effects were investigated up to 800 °C in vacuum, oxygen and argon environments. The ~18 nm average size of the as-prepared NCs increases slowly at temperatures up to 500 °C in all three environments. The size increases abruptly in the argon environment but slowly in vacuum and oxygen when annealed at 800 °C. X-ray diffraction (XRD) studies showed that the iron core remains in the core–shell NCs only when they were annealed in vacuum. A dramatic change in the surface morphology, an island-like structure and/or a network-like pattern, was observed at elevated temperature. The as-prepared and annealed samples were analyzed using XRD, scanning electron microscopy and ImageJ software for a close inspection of the temperature-induced properties. This work presents the temperature-induced size growth mechanism, oxidation kinetics and phase transformation of the NCs, accompanied by cluster aggregation, particle coalescence and diffusion.
Software and the future of programming languages.
Aho, Alfred V
2004-02-27
Although software is the key enabler of the global information infrastructure, the amount and extent of software in use in the world today are not widely understood, nor are the programming languages and paradigms that have been used to create the software. The vast size of the embedded base of existing software and the increasing costs of software maintenance, poor security, and limited functionality are posing significant challenges for the software R&D community.
Prediction and Estimation of Scaffold Strength with different pore size
NASA Astrophysics Data System (ADS)
Muthu, P.; Mishra, Shubhanvit; Sri Sai Shilpa, R.; Veerendranath, B.; Latha, S.
2018-04-01
This paper emphasizes the significance of predicting and estimating the mechanical strength of 3D functional scaffolds before the manufacturing process. Prior evaluation of the mechanical strength and structural properties of a scaffold reduces fabrication cost and eases the design process. Detailed analysis and investigation of various mechanical properties, including shear stress equivalence, helped to estimate the effect of porosity and pore size on the functionality of the scaffold. The influence of variation in porosity was examined by a computational approach via finite element analysis (FEA) using the ANSYS application software. The results indicate the adequacy of this computational method for regulating and optimizing the intricate engineering design process.
NASA Astrophysics Data System (ADS)
Sangiorgi, Pierluca; Capalbi, Milvia; Gimenes, Renato; La Rosa, Giovanni; Russo, Francesco; Segreto, Alberto; Sottile, Giuseppe; Catalano, Osvaldo
2016-07-01
The purpose of this contribution is to present the current status of the software architecture of the ASTRI SST-2M Cherenkov camera. The ASTRI SST-2M telescope is an end-to-end prototype for the Small Size Telescope of the Cherenkov Telescope Array. The ASTRI camera is an innovative instrument based on SiPM detectors and contains several internal hardware components. In this contribution we will give a brief description of the hardware components of the camera of the ASTRI SST-2M prototype and of their interconnections. Then we will present the outcome of the software architectural design process that we carried out in order to identify the main structural components of the camera software system and the relationships among them. We will analyze the architectural model that describes how the camera software is organized as a set of communicating blocks. Finally, we will show where these blocks are deployed in the hardware components and how they interact. We will describe in some detail the management of the physical communication ports and external ancillary devices, the high-precision time-tag management, the fast data collection and fast data exchange between different camera subsystems, and the interfacing with external systems.
Power calculation for overall hypothesis testing with high-dimensional commensurate outcomes.
Chi, Yueh-Yun; Gribbin, Matthew J; Johnson, Jacqueline L; Muller, Keith E
2014-02-28
The complexity of system biology means that any metabolic, genetic, or proteomic pathway typically includes so many components (e.g., molecules) that statistical methods specialized for overall testing of high-dimensional and commensurate outcomes are required. While many overall tests have been proposed, very few have power and sample size methods. We develop accurate power and sample size methods and software to facilitate study planning for high-dimensional pathway analysis. With an account of any complex correlation structure between high-dimensional outcomes, the new methods allow power calculation even when the sample size is less than the number of variables. We derive the exact (finite-sample) and approximate non-null distributions of the 'univariate' approach to repeated measures test statistic, as well as power-equivalent scenarios useful to generalize our numerical evaluations. Extensive simulations of group comparisons support the accuracy of the approximations even when the ratio of number of variables to sample size is large. We derive a minimum set of constants and parameters sufficient and practical for power calculation. Using the new methods and specifying the minimum set to determine power for a study of metabolic consequences of vitamin B6 deficiency helps illustrate the practical value of the new results. Free software implementing the power and sample size methods applies to a wide range of designs, including one group pre-intervention and post-intervention comparisons, multiple parallel group comparisons with one-way or factorial designs, and the adjustment and evaluation of covariate effects. Copyright © 2013 John Wiley & Sons, Ltd.
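As a deliberately simplified, simulation-based stand-in for the analytic power methods derived in the paper (a univariate two-group comparison rather than the high-dimensional UNIREP case), a Monte Carlo power estimate can be sketched as follows:

```python
import numpy as np
from scipy import stats

def mc_power(n_per_group, effect_size, n_sims=2000, alpha=0.05, seed=1):
    """Monte Carlo power for a two-sample t-test at a given
    standardized effect size (Cohen's d), unit-variance normals."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n_per_group)
        b = rng.normal(effect_size, 1.0, n_per_group)
        if stats.ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / n_sims
```

The paper's contribution is precisely to avoid this kind of brute-force simulation for planning: its analytic distributions give power directly, with an account of the correlation structure, even when the number of outcome variables exceeds the sample size.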
Re-designing a mechanism for higher speed: A case history from textile machinery
NASA Astrophysics Data System (ADS)
Douglas, S. S.; Rooney, G. T.
A key issue in creating general mechanism design software, the formulation of suitable objective functions, is discussed. A consistent drive towards higher speeds in the development of industrial sewing machines has led to experimental analyses of dynamic performance and to a search for improved design methods. The experimental work highlighted the importance of smoothness of motion at high speed, component inertias, and frame structural stiffness. Smoothness is associated with transmission properties and harmonic analysis. These requirements are added to the other design requirements of synchronization, mechanism size, and function. Some of the mechanism trains in overedge sewing machines are shown. All these trains are designed by digital optimization. The design software combines analysis of the sewing machine mechanisms, formulation of objectives in numerical terms, and suitable mathematical optimization techniques.
Optimization of Gate, Runner and Sprue in Two-Plate Family Plastic Injection Mould
NASA Astrophysics Data System (ADS)
Amran, M. A.; Hadzley, M.; Amri, S.; Izamshah, R.; Hassan, A.; Samsi, S.; Shahir, K.
2010-03-01
This paper describes the optimization of the size of the gate, runner and sprue in a two-plate family plastic injection mould. An Electronic Cash Register (ECR) plastic product, consisting of three components (a top casing, a bottom casing and a paper holder), was used in this study. The objectives of this paper are to find the optimum size of the gate, runner and sprue, to locate the optimum layout of the cavities, and to recognize the defect problems caused by incorrectly sized gates, runners and sprues. Three types of software were used in this study: Unigraphics as the CAD tool for 3D modeling, Rhinoceros as the post-processing tool for designing the gate, runner and sprue, and Moldex as the simulation tool for analyzing the plastic flow. As a result, some modifications were made to the size of the feeding system and the location of the cavities to eliminate the short-shot, over-filling and weld line problems in the two-plate family plastic injection mould.
Cost Estimation of Software Development and the Implications for the Program Manager
1992-06-01
...the Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive... function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and... Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome...
Real-Time Simulation of Aeroheating of the Hyper-X Airplane
NASA Technical Reports Server (NTRS)
Gong, Les
2005-01-01
A capability for real-time computational simulation of aeroheating has been developed in support of the Hyper-X program, which is directed toward demonstrating the feasibility of operating an air-breathing ramjet/scramjet engine at Mach 5, Mach 7, and Mach 10. The simulation software will serve as a valuable design tool for initial trajectory studies in which aerodynamic heating is expected to exert a major influence on the design of the Hyper-X airplane; this tool will aid in the selection of materials, sizing of structural skin thicknesses, and selection of components of a thermal-protection system (TPS) for structures that must be insulated against aeroheating.
ELM: super-resolution analysis of wide-field images of fluorescent shell structures
NASA Astrophysics Data System (ADS)
Manton, James D.; Xiao, Yao; Turner, Robert D.; Christie, Graham; Rees, Eric J.
2018-07-01
It is often necessary to precisely quantify the size of specimens in biological studies. When measuring feature size in fluorescence microscopy, significant biases can arise due to blurring of its edges if the feature is smaller than the diffraction limit of resolution. This problem is avoided if an equation describing the feature’s entire image is fitted to its image data. In this paper we present open-source software, ELM, which uses this approach to measure the size of spheroidal or cylindrical fluorescent shells with a precision of around 10 nm. This has been used to measure coat protein locations in bacterial spores and cell wall diameter in vegetative bacilli, and may also be valuable in microbiological studies of algae, fungi and viruses. ELM is available for download at https://github.com/quantitativeimaging/ELM.
Missileborne Artificial Vision System (MAVIS)
NASA Technical Reports Server (NTRS)
Andes, David K.; Witham, James C.; Miles, Michael D.
1994-01-01
Several years ago, when INTEL and China Lake designed the ETANN chip, analog VLSI appeared to be the only way to do high-density neural computing. In the last five years, however, digital parallel processing chips capable of performing neural computation functions have evolved to the point of rough equality with analog chips in system-level computational density. The Naval Air Warfare Center, China Lake, has developed a real-time hardware and software system designed to implement and evaluate biologically inspired retinal and cortical models. The hardware is based on the Adaptive Solutions Inc. massively parallel CNAPS system COHO boards. Each COHO board is a standard-size 6U VME card featuring 256 fixed-point RISC processors running at 20 MHz in a SIMD configuration. Each COHO board has a companion board built to support a real-time VSB interface to an imaging seeker, an NTSC camera, and other COHO boards. The system is designed to have multiple SIMD machines, each performing different corticomorphic functions. System-level software has been developed that allows a high-level description of corticomorphic structures to be translated into the native microcode of the CNAPS chips. Corticomorphic structures are those neural structures with a form similar to that of the retina, the lateral geniculate nucleus, or the visual cortex. This real-time hardware system is designed to be shrunk into a volume compatible with air-launched tactical missiles. Initial versions of the software and hardware have been completed and are in the early stages of integration with a missile seeker.
User's Manual and Final Report for Hot-SMAC GUI Development
NASA Technical Reports Server (NTRS)
Yarrington, Phil
2001-01-01
A new software package called the Higher Order Theory-Structural/Micro Analysis Code (HOT-SMAC) has been developed as an effective alternative to the finite element approach for Functionally Graded Material (FGM) modeling. HOT-SMAC is a self-contained package including pre- and post-processing through an intuitive graphical user interface, along with the well-established Higher Order Theory for Functionally Graded Materials (HOTFGM) thermomechanical analysis engine. This document serves as a Getting Started/User's Manual for HOT-SMAC and as a final report on its development. First, the features of the software are presented in a simple step-by-step example in which a HOT-SMAC model representing a functionally graded material is created, mechanical and thermal boundary conditions are applied, the model is analyzed and the results are reviewed. In a second step-by-step example, a HOT-SMAC model of an actively cooled metallic channel with a ceramic thermal barrier coating is built and analyzed. HOT-SMAC results from this model are compared to recently published results (NASA/TM-2001-210702) for two grid densities. Finally, a prototype integration of HOT-SMAC with the commercially available HyperSizer(R) structural analysis and sizing software is presented. In this integration, local strain results from HyperSizer's structural analysis are fed to a detailed HOT-SMAC model of the flange-to-facesheet bond region of a stiffened panel. HOT-SMAC is then used to determine the peak shear and peel (normal) stresses between the facesheet and bonded flange of the panel and to determine the "free edge" effects.
NASA Technical Reports Server (NTRS)
Plante, I; Wu, H
2014-01-01
The code RITRACKS (Relativistic Ion Tracks) has been developed over the last few years at the NASA Johnson Space Center to simulate the effects of ionizing radiation at the microscopic scale and to understand the effects of space radiation at the biological level. The fundamental part of this code is the stochastic simulation of the radiation track structure of heavy ions, an important component of space radiation. The code can calculate many relevant quantities such as the radial dose and voxel dose, and may also be used to calculate the dose in spherical and cylindrical targets of various sizes. Recently, we have incorporated DNA structure and damage simulations at the molecular scale into RITRACKS. The direct effect of radiation is simulated by introducing a slight modification of the existing particle transport algorithms, using the Binary-Encounter-Bethe model of ionization cross sections for each molecular orbital of DNA. The simulation of radiation chemistry is done by a step-by-step diffusion-reaction program based on the Green's functions of the diffusion equation. This approach is also used to simulate the indirect effect of ionizing radiation on DNA. The software can be installed independently on PCs and tablets running the Windows operating system and does not require any coding by the user. It includes a Graphic User Interface (GUI) and a 3D OpenGL visualization interface. The calculations are executed simultaneously (in parallel) on multiple CPUs. The main features of the software will be presented.
The endothelial sample size analysis in corneal specular microscopy clinical examinations.
Abib, Fernando C; Holzchuh, Ricardo; Schaefer, Artur; Schaefer, Tania; Godois, Ronialci
2012-05-01
To evaluate endothelial cell sample size and statistical error in corneal specular microscopy (CSM) examinations. One hundred twenty examinations were conducted with 4 types of corneal specular microscopes: 30 each with Bio-Optics, CSO, Konan, and Topcon instruments. All endothelial image data were analyzed by the respective instrument software and also by the Cells Analyzer software, with a method developed in our lab. A reliability degree (RD) of 95% and a relative error (RE) of 0.05 were used as cut-off values to analyze the images of counted endothelial cells, called samples. The mean sample size was the number of cells evaluated on the images obtained with each device. Only examinations with RE < 0.05 were considered statistically correct and suitable for comparison with future examinations. The Cells Analyzer software was used to calculate the RE and a customized sample size for all examinations. Bio-Optics: sample size, 97 ± 22 cells; RE, 6.52 ± 0.86; only 10% of the examinations had a sufficient number of endothelial cells (RE < 0.05); customized sample size, 162 ± 34 cells. CSO: sample size, 110 ± 20 cells; RE, 5.98 ± 0.98; only 16.6% of the examinations had a sufficient number of endothelial cells (RE < 0.05); customized sample size, 157 ± 45 cells. Konan: sample size, 80 ± 27 cells; RE, 10.6 ± 3.67; none of the examinations had a sufficient number of endothelial cells (RE > 0.05); customized sample size, 336 ± 131 cells. Topcon: sample size, 87 ± 17 cells; RE, 10.1 ± 2.52; none of the examinations had a sufficient number of endothelial cells (RE > 0.05); customized sample size, 382 ± 159 cells. A very high number of CSM examinations had sample errors according to the Cells Analyzer software. Endothelial samples (examinations) need to include more cells to be reliable and reproducible. The Cells Analyzer tutorial routine will be useful for CSM examination reliability and reproducibility.
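The relative-error criterion above follows the textbook relationship for the mean of n cells, RE ≈ z·CV/√n, so a target RE can be inverted into a required sample size. A sketch (a standard formula, not necessarily the Cells Analyzer's exact routine, with a hypothetical CV):

```python
import math

def required_sample_size(cv, target_re=0.05, z=1.96):
    """Cells needed so that z * CV / sqrt(n) <= the target relative error.
    CV = SD of cell area / mean cell area; z = 1.96 for 95% reliability."""
    return math.ceil((z * cv / target_re) ** 2)

# Hypothetical: with CV = 0.30, (1.96 * 0.30 / 0.05)^2 gives about 139 cells.
print(required_sample_size(0.30))  # -> 139
```

This inversion explains why the customized sample sizes reported above run to hundreds of cells: a larger cell-area CV or a tighter RE target drives n up quadratically.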
Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
2011-01-01
Systems analysis of planetary entry, descent, and landing (EDL) is by nature a multidisciplinary activity. SAPE, a tool for systems analysis of planetary EDL, improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., on the Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and an interface for structural sizing.
Simulation of cooling efficiency via miniaturised channels in multilayer LTCC for power electronics
NASA Astrophysics Data System (ADS)
Pietrikova, Alena; Girasek, Tomas; Lukacs, Peter; Welker, Tilo; Müller, Jens
2017-03-01
The aim of this paper is a detailed investigation, by simulation software, of the thermal resistance, flow behavior and distribution of coolant, as well as the thermal distribution, inside multilayer LTCC substrates with embedded channels for power electronic devices. For this purpose, four different structures of internal channels in the multilayer LTCC substrates were designed and simulated. The impact of the volume flow, the channel structures, and the power loss of the chip was simulated, calculated and analyzed using the simulation software Mentor Graphics FloEFD. The structure, size and location of the channels have a significant impact on the thermal resistance, the pressure of the coolant, and the effectiveness of cooling the power components (chips) placed on top of the LTCC substrate. The main contribution of this paper is the thermal analysis, optimization and comparison of four different cooling-channel layouts embedded in the LTCC multilayer structure. The paper investigates the effect of the volume flow in the cooling channels on achieving the lowest thermal resistance of an LTCC substrate loaded with power chips, and also shows the impact of the first chip's thermal load on the second chip. If practically realized, this new technology could ensure effective cooling and increase the reliability of high-power modules.
Sekiguchi, Yuki; Oroguchi, Tomotaka; Takayama, Yuki; Nakasako, Masayoshi
2014-05-01
Coherent X-ray diffraction imaging is a promising technique for visualizing the structures of non-crystalline particles with dimensions of micrometers to sub-micrometers. Recently, X-ray free-electron laser sources have enabled efficient experiments in the 'diffraction before destruction' scheme. Diffraction experiments have been conducted at the SPring-8 Angstrom Compact free-electron LAser (SACLA) facility using the custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors. In these experiments, tens of thousands of single-shot diffraction patterns can be collected within several hours. Then, diffraction patterns with intensity levels suitable for structural analysis must be found, direct-beam positions in the diffraction patterns determined, diffraction patterns from the two CCD detectors merged, and phase-retrieval calculations for structural analyses performed. A software suite named SITENNO has been developed to apply this four-step processing semi-automatically to a huge number of diffraction patterns. Here, details of the algorithms used in the suite are described and its performance for approximately 9000 diffraction patterns collected from cuboid-shaped copper oxide particles is reported. Using the SITENNO suite, it is possible to conduct experiments with data processing immediately after data collection, and to characterize the size distribution and internal structures of the non-crystalline particles.
Open Source Software in Medium Size Organizations: Key Factors for Adoption
ERIC Educational Resources Information Center
Solomon, Jerry T.
2010-01-01
For-profit organizations are constantly evaluating new technologies to gain competitive advantage. One such technology, application software, has changed significantly over the past 25 years with the introduction of Open Source Software (OSS). In contrast to commercial software that is developed by private companies and sold to organizations, OSS…
Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach
ERIC Educational Resources Information Center
Stevenson, Glenn A.
2012-01-01
For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…
SOFTCOST - DEEP SPACE NETWORK SOFTWARE COST MODEL
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1994-01-01
The early-on estimation of required resources and a schedule for the development and maintenance of software is usually the least precise aspect of the software life cycle. However, it is desirable to make some sort of an orderly and rational attempt at estimation in order to plan and organize an implementation effort. The Software Cost Estimation Model program, SOFTCOST, was developed to provide a consistent automated resource and schedule model which is more formalized than the often used guesswork model based on experience, intuition, and luck. SOFTCOST was developed after the evaluation of a number of existing cost estimation programs indicated that there was a need for a cost estimation program with a wide range of application and adaptability to diverse kinds of software. SOFTCOST combines several software cost models found in the open literature into one comprehensive set of algorithms that compensate for nearly fifty implementation factors relative to size of the task, inherited baseline, organizational and system environment, and difficulty of the task. SOFTCOST produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Since the confidence level for a project using mean estimates is small, the user is given the opportunity to enter risk-biased values for effort, duration, and staffing, to achieve higher confidence levels. SOFTCOST then produces a PERT/CPM file with subtask efforts, durations, and precedences defined so as to produce the Work Breakdown Structure (WBS) and schedule having the asked-for overall effort and duration. The SOFTCOST program operates in an interactive environment prompting the user for all of the required input. The program builds the supporting PERT data base in a file for later report generation or revision. The PERT schedule and the WBS schedule may be printed and stored in a file for later use. The SOFTCOST program is written in Microsoft BASIC for interactive execution and has been implemented on an IBM PC-XT/AT operating MS-DOS 2.1 or higher with 256K bytes of memory. SOFTCOST was originally developed for the Zylog Z80 system running under CP/M in 1981. It was converted to run on the IBM PC XT/AT in 1986. SOFTCOST is a copyrighted work with all copyright vested in NASA.
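For flavor, here is the simplest member of the parametric cost-model family that SOFTCOST draws on, basic COCOMO in "organic" mode, using the published COCOMO coefficients (this is not SOFTCOST's own fifty-factor algorithm, which also produces variances and a PERT/CPM breakdown):

```python
def basic_cocomo_organic(ksloc):
    """Basic COCOMO, organic mode: effort in person-months,
    schedule in elapsed months, and average staffing level."""
    effort = 2.4 * ksloc ** 1.05        # person-months
    schedule = 2.5 * effort ** 0.38     # elapsed months
    staff = effort / schedule           # average headcount
    return effort, schedule, staff

# Hypothetical 32 KSLOC task:
effort, schedule, staff = basic_cocomo_organic(32.0)
print(f"{effort:.0f} PM over {schedule:.0f} months with ~{staff:.1f} staff")
```

SOFTCOST's distinguishing feature relative to a one-formula model like this is exactly what the abstract describes: dozens of adjustment factors, variance estimates alongside the means, and risk-biased inputs to raise the confidence level of the plan.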
A communication channel model of the software process
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1988-01-01
Reported here is beginning research into a noisy communication channel analogy of software development process productivity, undertaken in order to establish quantifiable behavior and theoretical bounds. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. Also derived is an upper bound on productivity showing that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of the size and cost of reusable components may reduce this to a finite bound.
ERIC Educational Resources Information Center
Lui, Joseph P.
2013-01-01
Identifying appropriate international distributors for small and medium-sized enterprises (SMEs) in the software industry for overseas markets can determine a firm's future endeavors in international expansion. SMEs lack the complex skills in market research and decision analysis to identify suitable partners to engage in global market entry.…
Top down, bottom up structured programming and program structuring
NASA Technical Reports Server (NTRS)
Hamilton, M.; Zeldin, S.
1972-01-01
New design and programming techniques for shuttle software are presented. Based on previous Apollo experience, recommendations are made to apply top-down structured programming techniques to shuttle software. New software verification techniques for large software systems are recommended. HAL, the higher order language selected for the shuttle flight code, is discussed and found to be adequate for implementing these techniques. Recommendations are made to apply a workable combination of top-down and bottom-up methods in the management of shuttle software. Program structuring is discussed as relevant to both programming and management techniques.
NASA Astrophysics Data System (ADS)
Nag, A.; Mahapatra, D. Roy; Gopalakrishnan, S.
2003-10-01
A hierarchical Genetic Algorithm (GA) is implemented in a high-performance spectral finite element software package for identification of delaminations in laminated composite beams. In smart structural health monitoring, the number of delaminations (or of any other mode of damage), as well as their locations and sizes, is not completely known. Known are only the healthy structural configuration (mass, stiffness and damping matrices updated from previous phases of monitoring), the sensor measurements, and some information about the load environment. To handle such enormous complexity, a hierarchical GA is used to represent a heterogeneous population consisting of damaged structures with different numbers of delaminations, and their evolution process is used to identify the correct damage configuration in the structures under monitoring. We exploit this similarity with the evolution of heterogeneous populations of species in nature to develop an automated procedure that decides which possible damage configuration might have produced the deviation in the measured signals. Computational efficiency of the identification task is demonstrated by considering a single delamination. The behavior of the fitness function in the GA, an important factor for fast convergence, is studied for single and multiple delaminations. Several advantages of the approach in terms of computational cost are discussed. Besides the tackling of other types of damage configurations, further scope for development of hybrid soft-computing modules is highlighted.
Ground Software Maintenance Facility (GSMF) system manual
NASA Technical Reports Server (NTRS)
Derrig, D.; Griffith, G.
1986-01-01
The Ground Software Maintenance Facility (GSMF) is designed to support development and maintenance of Spacelab ground support software. The GSMF consists of a Perkin Elmer 3250 (host computer) and a MITRA 125s (ATE computer), with appropriate interface devices and software to simulate the Electrical Ground Support Equipment (EGSE). This document is presented in three sections: (1) GSMF Overview; (2) Software Structure; and (3) Fault Isolation Capability. The overview contains information on hardware and software organization along with their corresponding block diagrams. The Software Structure section describes the modes of software structure, including source files, link information, and database files. The Fault Isolation section describes the capabilities of the Ground Computer Interface Device, the Perkin Elmer host, and the MITRA ATE.
An experimental investigation of fault tolerant software structures in an avionics application
NASA Technical Reports Server (NTRS)
Caglayan, Alper K.; Eckhardt, Dave E., Jr.
1989-01-01
The objective of this experimental investigation is to compare the functional performance and software reliability of competing fault tolerant software structures utilizing software diversity. In this experiment, three versions of the redundancy management software for a skewed sensor array have been developed using three diverse failure detection and isolation algorithms and incorporated into various N-version, recovery block and hybrid software structures. The empirical results show that, for maximum functional performance improvement in the selected application domain, the results of diverse algorithms should be voted before being processed by multiple versions without enforced diversity. Results also suggest that when the reliability gain with an N-version structure is modest, recovery block structures are more feasible since higher reliability can be obtained using an acceptance check with a modest reliability.
A large dataset of synthetic SEM images of powder materials and their ground truth 3D structures.
DeCost, Brian L; Holm, Elizabeth A
2016-12-01
This data article presents a data set comprising 2048 synthetic scanning electron microscope (SEM) images of powder materials and descriptions of the corresponding 3D structures that they represent. These images were created using open source rendering software, and the generating scripts are included with the data set. Eight particle size distributions are represented, with 256 independent images from each. The particle size distributions are relatively similar to each other, so the dataset offers a useful benchmark for assessing the fidelity of image analysis techniques. The characteristics of the PSDs and the resulting images are described and analyzed in more detail in the research article "Characterizing powder materials using keypoint-based computer vision methods" (B.L. DeCost, E.A. Holm, 2016) [1]. These data are freely available in a Mendeley Data archive "A large dataset of synthetic SEM images of powder materials and their ground truth 3D structures" (B.L. DeCost, E.A. Holm, 2016) located at http://dx.doi.org/10.17632/tj4syyj9mr.1 [2] for any academic, educational, or research purposes.
ERIC Educational Resources Information Center
Tanner-Smith, Emily E.; Tipton, Elizabeth
2014-01-01
Methodologists have recently proposed robust variance estimation as one way to handle dependent effect sizes in meta-analysis. Software macros for robust variance estimation in meta-analysis are currently available for Stata (StataCorp LP, College Station, TX, USA) and SPSS (IBM, Armonk, NY, USA), yet there is little guidance for authors regarding…
NASA Astrophysics Data System (ADS)
Möller, Thomas; Bellin, Knut; Creutzburg, Reiner
2015-03-01
The aim of this paper is to show the recent progress in the design and prototypical development of a software suite, Copra Breeder, for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small and medium-sized enterprises.
A Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing (SAPE)
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
2009-01-01
SAPE is a Python-based multidisciplinary analysis tool for systems analysis of planetary entry, descent, and landing (EDL) for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. The purpose of SAPE is to provide a variable-fidelity capability for conceptual and preliminary analysis within the same framework. SAPE includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and structural sizing. SAPE uses the Python language, platform-independent open-source software, for integration and for the user interface. The development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE runs on Microsoft Windows and Apple Mac OS X and has been partially tested on Linux.
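The abstract describes an object-oriented integration of discipline modules into one variable-fidelity pipeline. The sketch below shows one generic way such a pipeline can be wired together in Python; the class names, module contents, and shared-state dictionary are illustrative assumptions, not SAPE's actual API.

```python
# Illustrative module-pipeline pattern for a multidisciplinary EDL analysis.
# Class names, module behavior, and the shared-state dict are assumptions
# for this sketch; they are not taken from the actual SAPE source.
from abc import ABC, abstractmethod

class AnalysisModule(ABC):
    @abstractmethod
    def run(self, state: dict) -> dict:
        """Read inputs from the shared state and write results back."""

class Geometry(AnalysisModule):
    def run(self, state):
        state["area_m2"] = 3.14159 * state["diameter_m"] ** 2 / 4.0
        return state

class Aerodynamics(AnalysisModule):
    def run(self, state):
        # Hypothetical constant drag coefficient for a blunt aeroshell.
        state["drag_coeff"] = 1.6
        return state

class Pipeline:
    def __init__(self, modules):
        self.modules = modules

    def run(self, state):
        for module in self.modules:
            state = module.run(state)   # each discipline updates shared state
        return state

state = Pipeline([Geometry(), Aerodynamics()]).run({"diameter_m": 4.5})
print(state)
```

Swapping a module for a higher-fidelity implementation of the same interface is what makes such a design variable-fidelity.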
Applications of large-scale density functional theory in biology
NASA Astrophysics Data System (ADS)
Cole, Daniel J.; Hine, Nicholas D. M.
2016-10-01
Density functional theory (DFT) has become a routine tool for the computation of electronic structure in the physics, materials and chemistry fields. Yet the application of traditional DFT to problems in the biological sciences is hindered, to a large extent, by the unfavourable scaling of the computational effort with system size. Here, we review some of the major software and functionality advances that enable insightful electronic structure calculations to be performed on systems comprising many thousands of atoms. We describe some of the early applications of large-scale DFT to the computation of the electronic properties and structure of biomolecules, as well as to paradigmatic problems in enzymology, metalloproteins, photosynthesis and computer-aided drug design. With this review, we hope to demonstrate that first-principles modelling of biological structure-function relationships is approaching a reality.
Software ion scan functions in analysis of glycomic and lipidomic MS/MS datasets.
Haramija, Marko
2018-03-01
Hardware ion scan functions unique to tandem mass spectrometry (MS/MS) mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. Software ion scan functions can be easily coded for additional functionalities, such as software multiple precursor ion scan, software no ion scan, and software variable ion scan functions. These are often necessary, since they allow more efficient analysis of complex MS/MS datasets, often encountered in glycomics and lipidomics. Software ion scan functions can be easily coded by using modern script languages and can be independent of instrument manufacturer. Here we demonstrate the utility of SIS functions on a medium-size glycomic MS/MS dataset. Knowledge of sample properties, as well as of diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on the tables constructed with the output data from the SIS functions performed, a detailed analysis of a complex MS/MS glycomic dataset could be carried out in a quick, accurate, and efficient manner. Glycomic research is progressing slowly, and with respect to the MS experiments, one of the key obstacles for moving forward is the lack of appropriate bioinformatic tools necessary for fast analysis of glycomic MS/MS datasets. Adding novel SIS functionalities to the glycomic MS/MS toolbox has a potential to significantly speed up the glycomic data analysis process. Similar tools are useful for analysis of lipidomic MS/MS datasets as well, as will be discussed briefly. Copyright © 2017 John Wiley & Sons, Ltd.
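To make the idea of a software ion scan function concrete, the sketch below implements a neutral-loss filter over a list of MS/MS spectra. The data layout, tolerance, and the 146.06 Da loss (a deoxyhexose such as fucose, common in glycomics) are assumptions for the example, not details taken from the paper; real data would come from an mzML parser.

```python
# Sketch of a software neutral-loss scan (NLS) over MS/MS spectra.
# The spectrum layout and the chosen neutral-loss mass are assumptions
# for illustration; singly charged precursors are assumed.

spectra = [
    {"precursor_mz": 733.3, "fragments": [587.2, 569.2, 433.1]},
    {"precursor_mz": 912.4, "fragments": [894.4, 750.3, 528.2]},
]

def neutral_loss_scan(spectra, loss, tol=0.05):
    """Return spectra containing a fragment at (precursor - loss) +/- tol."""
    hits = []
    for s in spectra:
        target = s["precursor_mz"] - loss
        if any(abs(f - target) <= tol for f in s["fragments"]):
            hits.append(s)
    return hits

# 146.06 ~ loss of a deoxyhexose (e.g. fucose), a common glycan neutral loss.
for hit in neutral_loss_scan(spectra, loss=146.06):
    print(hit["precursor_mz"])
```

A software precursor ion scan is the mirror image of this filter: keep spectra whose fragment list contains a chosen diagnostic ion.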
Dong, Ling-Bo; Liu, Zhao-Gang; Li, Feng-Ri; Jiang, Li-Chun
2013-09-01
By using branch analysis data of 955 standard branches from 60 sampled trees in 12 sampling plots of a Pinus koraiensis plantation at Mengjiagang Forest Farm in Heilongjiang Province, Northeast China, and based on linear mixed-effects model theory and methods, models were developed for predicting branch variables, including primary branch diameter, length, and angle. To account for the tree effect, the MIXED module of the SAS software was used to fit the prediction models. The results indicated that the fitting precision of the models could be improved by choosing appropriate random-effect parameters and an appropriate variance-covariance structure. Correlation structures, including the compound symmetry structure (CS), the first-order autoregressive structure [AR(1)], and the first-order autoregressive and moving average structure [ARMA(1,1)], were then added to the optimal branch size mixed-effects model. AR(1) significantly improved the fitting precision of the branch diameter and length mixed-effects models, but none of the three structures improved the precision of the branch angle mixed-effects model. In order to describe the heteroscedasticity in the mixed-effects models, the CF1 and CF2 functions were added to the branch mixed-effects models; CF1 significantly improved the fitting of the branch angle mixed model, whereas CF2 significantly improved the fitting of the branch diameter and length mixed models. Model validation confirmed that the mixed-effects models could improve prediction precision, compared with traditional regression models, for branch size prediction in Pinus koraiensis plantations.
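The study fitted its models with SAS PROC MIXED. For readers working in Python, a roughly analogous random-intercept fit is sketched below with statsmodels; the column names and input file are hypothetical, and the paper's AR(1)/ARMA(1,1) residual correlation and variance functions are not reproduced by this simple fit.

```python
# Random-intercept mixed model sketch, loosely analogous to the paper's
# branch-diameter model fit with SAS PROC MIXED. Column names and the
# input file are hypothetical; the AR(1)/ARMA(1,1) residual structures
# used in the paper are not reproduced here.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("branches.csv")   # hypothetical file: one row per branch

# Fixed effects: tree size and branch position; random intercept per tree.
model = smf.mixedlm(
    "branch_diameter ~ dbh + tree_height + relative_height",
    data=df,
    groups=df["tree_id"],
)
result = model.fit()
print(result.summary())
```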
Efficient Hardware Implementation of the Lightweight Block Encryption Algorithm LEA
Lee, Donggeon; Kim, Dong-Chan; Kwon, Daesung; Kim, Howon
2014-01-01
Recently, due to the advent of resource-constrained trends, such as smartphones and smart devices, the computing environment is changing. Because our daily life is deeply intertwined with ubiquitous networks, the importance of security is growing. A lightweight encryption algorithm is essential for secure communication between these kinds of resource-constrained devices, and many researchers have been investigating this field. Recently, a lightweight block cipher called LEA was proposed. LEA was originally targeted for efficient implementation on microprocessors, as it is fast when implemented in software and, furthermore, has a small memory footprint. To reflect recent technology, all required calculations utilize 32-bit wide operations. In addition, the algorithm is composed not of complex S-Box-like structures but of simple addition, rotation, and XOR operations. To the best of our knowledge, this paper is the first report on a comprehensive hardware implementation of LEA. We present various hardware structures and their implementation results according to key sizes. Even though LEA was originally targeted at software efficiency, it also shows high efficiency when implemented as hardware. PMID:24406859
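LEA's round function is built from 32-bit addition, rotation, and XOR (ARX). The sketch below shows a generic ARX mixing step on 32-bit words to make that structure concrete; the key usage, round count, and test values are placeholders for illustration, not LEA's actual specification or key schedule.

```python
# Generic ARX (Addition-Rotation-XOR) round sketch on 32-bit words.
# The round-key usage and test values are placeholders for illustration;
# this is NOT the actual LEA specification.
MASK32 = 0xFFFFFFFF

def rol32(x, r):
    """Rotate a 32-bit word left by r bits."""
    return ((x << r) | (x >> (32 - r))) & MASK32

def arx_round(state, round_keys):
    x0, x1, x2, x3 = state
    k0, k1, k2 = round_keys
    # Each new word mixes a neighbour: XOR with key, modular add, rotate.
    y0 = rol32(((x0 ^ k0) + (x1 ^ k1)) & MASK32, 9)
    y1 = rol32(((x1 ^ k1) + (x2 ^ k2)) & MASK32, 27)  # 27 = rotate right 5
    y2 = rol32(((x2 ^ k0) + (x3 ^ k2)) & MASK32, 29)  # 29 = rotate right 3
    return (y0, y1, y2, x0)

state = (0x13121110, 0x17161514, 0x1B1A1918, 0x1F1E1D1C)
keys = (0x3C2D1E0F, 0x01234567, 0x89ABCDEF)
for _ in range(24):
    state = arx_round(state, keys)
print([hex(w) for w in state])
```

The absence of S-boxes is exactly what makes ARX designs compact in both software and hardware: the datapath reduces to adders, fixed rotations (free wiring in hardware), and XOR gates.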
Software Cost-Estimation Model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1985-01-01
Software Cost Estimation Model SOFTCOST provides automated resource and schedule model for software development. Combines several cost models found in open literature into one comprehensive set of algorithms. Compensates for nearly fifty implementation factors relative to size of task, inherited baseline, organizational and system environment and difficulty of task.
NASA Technical Reports Server (NTRS)
1976-01-01
A software analysis was performed of known STS sortie payload elements and their associated experiments. This provided basic data for STS payload software characteristics and sizes. A set of technology drivers was identified based on a survey of future technology needs and an assessment of current software technology. The results will be used to evolve a planned approach to software technology development. The purpose of this plan is to ensure that software technology is advanced at a pace and a depth sufficient to fulfill the identified future needs.
Kobayashi, Masanao; Asada, Yasuki; Matsubara, Kosuke; Suzuki, Shouichi; Matsunaga, Yuta; Haba, Tomonobu; Kawaguchi, Ai; Daioku, Tomihiko; Toyama, Hiroshi; Kato, Ryoichi
2017-05-01
Adequate dose management during computed tomography is important. In the present study, the dosimetric application software ImPACT was extended with a calculator of the size-specific dose estimate and with scan settings for the auto exposure control (AEC) technique. This study aimed to assess the practicality and accuracy of the modified ImPACT software for dose estimation. We compared the conversion factors identified by the software with the values reported by the American Association of Physicists in Medicine Task Group 204, and we noted similar results. Moreover, doses were calculated with the AEC technique and a fixed tube current of 200 mA for the chest-pelvis region. The modified ImPACT software could estimate each organ dose based on the modulated tube current. The ability to perform beneficial modifications indicates the flexibility of the ImPACT software, which can be further modified for estimation of other doses. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Software reliability experiments data analysis and investigation
NASA Technical Reports Server (NTRS)
Walker, J. Leslie; Caglayan, Alper K.
1991-01-01
The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
Heo, Gwanghee; Jeon, Joonryong
2017-07-12
In this paper, a data compression technology-based intelligent data acquisition (IDAQ) system was developed for structural health monitoring of civil structures, and its validity was tested using random signals (El-Centro seismic waveform). The IDAQ system was structured to include a high-performance CPU with large dynamic memory for multi-input and output in a radio frequency (RF) manner. In addition, the embedded software technology (EST) has been applied to it to implement diverse logics needed in the process of acquiring, processing and transmitting data. In order to utilize IDAQ system for the structural health monitoring of civil structures, this study developed an artificial filter bank by which structural dynamic responses (acceleration) were efficiently acquired, and also optimized it on the random El-Centro seismic waveform. All techniques developed in this study have been embedded to our system. The data compression technology-based IDAQ system was proven valid in acquiring valid signals in a compressed size.
Scaling Semantic Graph Databases in Size and Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morari, Alessandro; Castellana, Vito G.; Villa, Oreste
In this paper we present SGEM, a full software system for accelerating large-scale semantic graph databases on commodity clusters. Unlike current approaches, SGEM addresses semantic graph databases by employing only graph methods at all levels of the stack. On one hand, this allows exploiting the space efficiency of graph data structures and the inherent parallelism of graph algorithms. These features adapt well to the increasing system memory and core counts of modern commodity clusters. On the other hand, however, these systems are optimized for regular computation and batched data transfers, while graph methods usually are irregular and generate fine-grained data accesses with poor spatial and temporal locality. Our framework comprises a SPARQL to data parallel C compiler, a library of parallel graph methods and a custom, multithreaded runtime system. We introduce our stack, motivate its advantages with respect to other solutions and show how we solved the challenges posed by irregular behaviors. We present the result of our software stack on the Berlin SPARQL benchmarks with datasets up to 10 billion triples (a triple corresponds to a graph edge), demonstrating scaling in dataset size and in performance as more nodes are added to the cluster.
ERIC Educational Resources Information Center
Lafferty, Mark T.
2010-01-01
The number of project failures and those projects completed over cost and over schedule has been a significant issue for software project managers. Among the many reasons for failure, inaccuracy in software estimation--the basis for project bidding, budgeting, planning, and probability estimates--has been identified as a root cause of a high…
Quantitative software models for the estimation of cost, size, and defects
NASA Technical Reports Server (NTRS)
Hihn, J.; Bright, L.; Decker, B.; Lum, K.; Mikulski, C.; Powell, J.
2002-01-01
The presentation will provide a brief overview of the SQI measurement program as well as describe each of these models and how they are currently being used in supporting JPL project, task and software managers to estimate and plan future software systems and subsystems.
Characterization of aeroallergen of Texas panhandle using scanning and fluorescence microscopy
NASA Astrophysics Data System (ADS)
Ghosh, Nabarun; Whiteside, Mandy; Ridner, Chris; Celik, Yasemin; Saadeh, C.; Bennert, Jeff
2010-06-01
Aeroallergens cause serious allergic and asthmatic reactions. Characterizing the aeroallergens provides information on the onset, duration, and severity of the pollen season, which clinicians use to guide allergen selection for skin testing and treatment. Fluorescence microscopy offers useful approaches for understanding the structure and function of microscopic objects. Prepared slides of the pollen were observed under an Olympus BX40 microscope equipped with FITC and TRITC fluorescent filters, a mercury lamp source, and an Olympus DP-70 digital camera connected to a computer with Image Pro 6.0 software. Aeroallergens were viewed, recorded, and analyzed with DP Manager and the Image Pro 6.0 software. Photographs were taken in bright field and with the fluorescein-isothiocyanate (FITC) and tetramethylrhodamine (TRITC) filter settings at 40X. A high-pressure mercury lamp or UV source was used to excite the storage molecules or proteins, which exhibited autofluorescence. The FITC filter reveals green fluorescent proteins (GFP and EGFP), and the TRITC filter reveals red fluorescent proteins (DsRed). SEM proved useful for observing ultra-structural details such as pores, colpi, sulci, and ornamentations on the pollen surface. Samples were examined with an SEM (TM-1000) after gold coating and critical point drying. Pollen grains were measured using the TM-1000 imaging software, which revealed specific information on the size of colpi or sulci and the distance between micro-structures. This information can be used for classification and circumscription in angiosperm taxonomy. Data were correlated with clinical studies established at the Allergy A.R.T.S. Clinical Research Laboratory.
Ferro, Myriam; Tardif, Marianne; Reguer, Erwan; Cahuzac, Romain; Bruley, Christophe; Vermat, Thierry; Nugues, Estelle; Vigouroux, Marielle; Vandenbrouck, Yves; Garin, Jérôme; Viari, Alain
2008-05-01
PepLine is a fully automated software which maps MS/MS fragmentation spectra of trypsic peptides to genomic DNA sequences. The approach is based on Peptide Sequence Tags (PSTs) obtained from partial interpretation of QTOF MS/MS spectra (first module). PSTs are then mapped on the six-frame translations of genomic sequences (second module) giving hits. Hits are then clustered to detect potential coding regions (third module). Our work aimed at optimizing the algorithms of each component to allow the whole pipeline to proceed in a fully automated manner using raw nucleic acid sequences (i.e., genomes that have not been "reduced" to a database of ORFs or putative exons sequences). The whole pipeline was tested on controlled MS/MS spectra sets from standard proteins and from Arabidopsis thaliana envelope chloroplast samples. Our results demonstrate that PepLine competed with protein database searching softwares and was fast enough to potentially tackle large data sets and/or high size genomes. We also illustrate the potential of this approach for the detection of the intron/exon structure of genes.
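PepLine's second module maps PSTs onto the six-frame translation of raw genomic sequence. A minimal six-frame translation, sketched here with Biopython as a generic illustration of that search space (not PepLine's own code), looks like this:

```python
# Six-frame translation sketch using Biopython; illustrates the search
# space PepLine's mapping module works over. Not PepLine code.
from Bio.Seq import Seq

def six_frame_translations(dna: str):
    """Yield (frame_label, protein) for all six reading frames."""
    seq = Seq(dna)
    rc = seq.reverse_complement()
    for offset in range(3):
        for label, s in ((f"+{offset + 1}", seq), (f"-{offset + 1}", rc)):
            frame = s[offset:]
            frame = frame[: len(frame) - len(frame) % 3]  # whole codons only
            yield label, str(frame.translate())

for label, protein in six_frame_translations(
        "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"):
    print(label, protein)
```

Working directly on the six frames is what lets the pipeline skip any precomputed ORF or exon database.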
NASA Astrophysics Data System (ADS)
Alfianto, E.; Rusydi, F.; Aisyah, N. D.; Fadilla, R. N.; Dipojono, H. K.; Martoprawiro, M. A.
2017-05-01
This study implemented the DFT method in the C++ programming language with object-oriented programming rules (expressive software). The use of expressive software results in a simple programming structure that closely resembles the mathematical formulation. This will make it easier for the scientific community to develop the software further. We validate our software by calculating the energy band structures of silica, carbon, and germanium in the FCC structure using the Projector Augmented Wave (PAW) method, and then compare the results with those of Quantum Espresso. This study shows that the accuracy of the software is 85% compared with Quantum Espresso.
SEPAC flight software detailed design specifications, volume 1
NASA Technical Reports Server (NTRS)
1982-01-01
The detailed design specifications (as built) for the SEPAC flight software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software-to-hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.
MHEC Academic Scheduling Software Survey Results.
ERIC Educational Resources Information Center
Midwestern Higher Education Commission Academic Software Committee Research Bulletin, 1995
1995-01-01
This bulletin summarizes the chief quantitative findings of a survey of 264 small and medium sized colleges and universities in the midwest concerning their use of and interest in academic scheduling software. This type of software assists in planning course offerings, assigning instructors and course functions to facilities and time slots, and…
Ramezani, Alireza; Ahmadieh, Hamid; Azarmina, Mohsen; Soheilian, Masoud; Dehghan, Mohammad H; Mohebbi, Mohammad R
2009-12-01
To evaluate the validity of a new method for the quantitative analysis of fundus or angiographic images using Photoshop 7.0 (Adobe, USA) software by comparison with clinical evaluation. Four hundred and eighteen fundus and angiographic images of diabetic patients were evaluated by three retina specialists and then by computation using the Photoshop 7.0 software. Four variables were selected for comparison: amount of hard exudates (HE) on color pictures, amount of HE on red-free pictures, severity of leakage, and size of the foveal avascular zone (FAZ). The coefficients of agreement (kappa) between the two methods for the amount of HE on color and red-free photographs were 85% (0.69) and 79% (0.59), respectively. The agreement for severity of leakage was 72% (0.46). For the evaluation of FAZ size using the magic wand and lasso software tools, the agreement was 54% (0.09) and 89% (0.77), respectively. Agreement in the estimation of FAZ size with the magnetic lasso tool was excellent and was almost as good in the quantification of HE on color and on red-free images. Considering the agreement of this new technique for the measurement of variables in fundus images using Photoshop software with the clinical evaluation, the method seems to have sufficient validity for the quantitative analysis of HE, leakage, and FAZ size on the angiograms of diabetic patients.
Scalable PGAS Metadata Management on Extreme Scale Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Agarwal, Khushbu; Straatsma, TP
Programming models intended to run on exascale systems have a number of challenges to overcome, especially the sheer size of the system as measured by the number of concurrent software entities created and managed by the underlying runtime. It is clear from the size of these systems that any state maintained by the programming model has to be strictly sub-linear in size, in order not to overwhelm memory usage with pure overhead. A principal feature of Partitioned Global Address Space (PGAS) models is providing easy access to global-view distributed data structures. In order to provide efficient access to these distributed data structures, PGAS models must keep track of metadata such as where array sections are located with respect to processes/threads running on the HPC system. As PGAS models and applications become ubiquitous on very large trans-petascale systems, a key component of their performance and scalability will be efficient and judicious use of memory for model overhead (metadata) compared to application data. We present an evaluation of several strategies to manage PGAS metadata that exhibit different space/time tradeoffs. We use two real-world PGAS applications to capture metadata usage patterns and gain insight into their communication behavior.
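The core metadata question the abstract raises, mapping a global index to its owning process, illustrates the space/time trade-off: under a simple block distribution the mapping is O(1) arithmetic with constant-size metadata per process, rather than a per-element lookup table that grows with the array. The sketch below is a generic illustration of that idea, not the runtime evaluated in the paper.

```python
# O(1) owner/offset computation for a block-distributed global array.
# Generic PGAS-style metadata sketch, not the paper's actual runtime.
from math import ceil

class BlockDist:
    def __init__(self, global_size: int, nranks: int):
        self.block = ceil(global_size / nranks)   # elements per rank
        self.global_size = global_size
        self.nranks = nranks

    def locate(self, i: int):
        """Map global index -> (owning rank, local offset) in O(1)."""
        if not 0 <= i < self.global_size:
            raise IndexError(i)
        return i // self.block, i % self.block

dist = BlockDist(global_size=10**9, nranks=4096)
print(dist.locate(123_456_789))   # (505, 165584); metadata stays constant
```

Irregular distributions buy load balance at the cost of replacing this arithmetic with stored tables, which is exactly the kind of strategy trade-off the paper evaluates.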
A Structure for Creating Quality Software.
ERIC Educational Resources Information Center
Christensen, Larry C.; Bodey, Michael R.
1990-01-01
Addresses the issue of assuring quality software for use in computer-aided instruction and presents a structure by which developers can create quality courseware. Differences between courseware and computer-aided instruction software are discussed, methods for testing software are described, and human factors issues as well as instructional design…
NASA Tech Briefs, September 2003
NASA Technical Reports Server (NTRS)
2003-01-01
Topics include: Oxygen-Partial-Pressure Sensor for Aircraft Oxygen Mask; Three-Dimensional Venturi Sensor for Measuring Extreme Winds; Swarms of Micron-Sized Sensors; Monitoring Volcanoes by Use of Air-Dropped Sensor Packages; Capacitive Sensors for Measuring Masses of Cryogenic Fluids; UHF Microstrip Antenna Array for Synthetic- Aperture Radar; Multimode Broad-Band Patch Antennas; 164-GHz MMIC HEMT Frequency Doubler; GPS Position and Heading Circuitry for Ships; Software for Managing Parametric Studies; Software Aids Visualization of Computed Unsteady Flow; Software for Testing Electroactive Structural Components; Advanced Software for Analysis of High-Speed Rolling-Element Bearings; Web Program for Development of GUIs for Cluster Computers; XML-Based Generator of C++ Code for Integration With GUIs; Oxide Protective Coats for Ir/Re Rocket Combustion Chambers; Simplified Waterproofing of Aerogels; Improved Thermal-Insulation Systems for Low Temperatures; Device for Automated Cutting and Transfer of Plant Shoots; Extension of Liouville Formalism to Postinstability Dynamics; Advances in Thrust-Based Emergency Control of an Airplane; Ultrasonic/Sonic Mechanisms for Drilling and Coring; Exercise Device Would Exert Selectable Constant Resistance; Improved Apparatus for Measuring Distance Between Axles; Six Classes of Diffraction-Based Optoelectronic Instruments; Modernizing Fortran 77 Legacy Codes; Active State Model for Autonomous Systems; Shields for Enhanced Protection Against High-Speed Debris; Scaling of Two-Phase Flows to Partial-Earth Gravity; Neutral-Axis Springs for Thin-Wall Integral Boom Hinges.
NASA Technical Reports Server (NTRS)
Farley, Rodger
2007-01-01
PlanetaryBalloon Version 5.0 is a software package for the design of meridionally lobed planetary balloons. It operates in a Windows environment, and programming was done in Visual Basic 6. By including the effects of circular lobes with load tapes, skin mass, hoop and meridional stress, and elasticity in the structural elements, a more accurate balloon shape of practical construction can be determined as well as the room-temperature cut pattern for the gore shapes. The computer algorithm is formulated for sizing meridionally lobed balloons for any generalized atmosphere or planet. This also covers zero-pressure, over-pressure, and super-pressure balloons. Low circumferential loads with meridionally reinforced load tapes will produce shapes close to what are known as the "natural shape." The software allows for the design of constant angle, constant radius, or constant hoop stress balloons. It uses the desired payload capacity for given atmospheric conditions and determines the required volume, allowing users to design exactly to their requirements. The formulations are generalized to use any lift gas (or mixture of gases), any atmosphere, or any planet as described by the local acceleration of gravity. PlanetaryBalloon software has a comprehensive user manual that covers features ranging from, but not limited to, buoyancy and super-pressure, convenient design equations, shape formulation, and orthotropic stress/strain.
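The core sizing relation, determining required volume from desired lift in a given atmosphere, follows from simple buoyancy. A first-order sketch is given below (ideal-gas densities, no skin-mass, load-tape, or pressure iteration, unlike the full software); the Mars-like numbers are illustrative assumptions.

```python
# First-order balloon volume sizing from buoyancy with ideal-gas densities.
# Simplified sketch only: the real software iterates on skin mass, load
# tapes, lobe geometry, and pressure. Atmosphere numbers below are
# illustrative, roughly Mars-like assumptions.
R_UNIV = 8.314          # J/(mol*K)

def gas_density(pressure_pa, temp_k, molar_mass_kg):
    return pressure_pa * molar_mass_kg / (R_UNIV * temp_k)

def required_volume(payload_kg, pressure_pa, temp_k,
                    m_atm=0.04401, m_lift=0.004003):
    """Volume (m^3) so buoyancy lifts the payload; CO2 atmosphere, helium
    lift gas at ambient pressure and temperature (zero-pressure balloon)."""
    rho_atm = gas_density(pressure_pa, temp_k, m_atm)
    rho_gas = gas_density(pressure_pa, temp_k, m_lift)
    return payload_kg / (rho_atm - rho_gas)

# ~600 Pa and 210 K near the Martian surface (illustrative values).
print(round(required_volume(10.0, 600.0, 210.0)), "m^3")   # ~727 m^3
```

The thin Martian atmosphere is why the computed volume is so large for a 10 kg payload; the same payload near Earth's surface needs roughly a thousandth of that volume.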
NASA PC software evaluation project
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Kuan, Julie C.
1986-01-01
The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The product categories covered are Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.
An overview of STRUCTURE: applications, parameter settings, and supporting software
Porras-Hurtado, Liliana; Ruiz, Yarimar; Santos, Carla; Phillips, Christopher; Carracedo, Ángel; Lareu, Maria V.
2013-01-01
Objectives: We present an up-to-date review of STRUCTURE software: one of the most widely used population analysis tools that allows researchers to assess patterns of genetic structure in a set of samples. STRUCTURE can identify subsets of the whole sample by detecting allele frequency differences within the data and can assign individuals to those sub-populations based on analysis of likelihoods. The review covers STRUCTURE's most commonly used ancestry and frequency models, plus an overview of the main applications of the software in human genetics including case-control association studies (CCAS), population genetics, and forensic analysis. The review is accompanied by supplementary material providing a step-by-step guide to running STRUCTURE. Methods: With reference to a worked example, we explore the effects of changing the principal analysis parameters on STRUCTURE results when analyzing a uniform set of human genetic data. Use of the supporting software: CLUMPP and distruct is detailed and we provide an overview and worked example of STRAT software, applicable to CCAS. Conclusion: The guide offers a simplified view of how STRUCTURE, CLUMPP, distruct, and STRAT can be applied to provide researchers with an informed choice of parameter settings and supporting software when analyzing their own genetic data. PMID:23755071
Zero Launch Mass Three Dimensional Print Head
NASA Technical Reports Server (NTRS)
Mueller, Robert P.; Gelino, Nathan J.; Smith, Jonathan D.; Buckles, Brad C.; Lippitt, Thomas; Schuler, Jason M.; Nick, Andrew J.; Nugent, Matt W.; Townsend, Ivan I.
2018-01-01
NASA's strategic goal is to put humans on Mars in the 2030's. The NASA Human Spaceflight Architecture Team (HAT) and NASA Mars Design Reference Architecture (DRA) 5.0 has determined that in-situ resource utilization (ISRU) is an essential technology to accomplish this mission. Additive construction technology using in-situ materials from planetary surfaces will reduce launch mass, allow structures to be three dimensionally (3D) printed on demand, and will allow building designs to be transmitted digitally from Earth and printed in space. This will ultimately lead to elimination of reliance on structural materials launched from Earth (zero launch mass of construction consumables). The zero launch mass (ZLM) 3D print head project addressed this need by developing a system that 3D prints using a mixture of in-situ regolith and polymer as feedstock, determining the optimum mixture ratio and regolith particle size distribution, developing software to convert g-code into motion instructions for a FANUC robotic arm, printing test samples, performing materials testing, and printing a reduced scale habitable structure concept. This paper will focus on the ZLM 3D Print Head design, materials selection, software development, and lessons learned from operating the system in the NASA KSC Swamp Works Granular Mechanics & Regolith Operations (GMRO) Laboratory.
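One software step the project describes is converting g-code into motion instructions for the FANUC arm. A minimal sketch of the parsing half of that step is given below; it is a hypothetical illustration, not the project's actual FANUC interface code.

```python
# Minimal g-code parsing sketch: turns G0/G1 moves into (x, y, z, feed)
# targets that a robot-motion layer could consume. Hypothetical example,
# not the project's actual robot driver.
import re

WORD = re.compile(r"([GXYZF])(-?\d+\.?\d*)")

def parse_gcode(lines):
    pos = {"X": 0.0, "Y": 0.0, "Z": 0.0, "F": 0.0}
    for line in lines:
        words = dict(WORD.findall(line.split(";")[0].upper()))
        if words.get("G") in ("0", "1", "00", "01"):      # linear moves only
            for axis in pos:
                if axis in words:
                    pos[axis] = float(words[axis])        # modal: keep rest
            yield pos["X"], pos["Y"], pos["Z"], pos["F"]

program = ["G1 X10.0 Y5.0 F1200 ; perimeter", "G1 X10.0 Y25.0", "G0 Z2.0"]
for target in parse_gcode(program):
    print(target)
```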
Raster-scanning serial protein crystallography using micro- and nano-focused synchrotron beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coquelle, Nicolas; CNRS, IBS, 38044 Grenoble; CEA, IBS, 38044 Grenoble
A raster-scanning serial protein crystallography approach is presented that consumes as little as ∼200–700 nl of sedimented crystals. New serial data pre-analysis software, NanoPeakCell, is introduced. High-resolution structural information was obtained from lysozyme microcrystals (20 µm in the largest dimension) using raster-scanning serial protein crystallography on micro- and nano-focused beamlines at the ESRF. Data were collected at room temperature (RT) from crystals sandwiched between two silicon nitride wafers, thereby preventing their drying, while limiting background scattering and sample consumption. In order to identify crystal hits, new multi-processing and GUI-driven Python-based pre-analysis software was developed, named NanoPeakCell, that is able to read data from a variety of crystallographic image formats. Further data processing was carried out using CrystFEL, and the resultant structures were refined to 1.7 Å resolution. The data demonstrate the feasibility of RT raster-scanning serial micro- and nano-protein crystallography at synchrotrons and validate it as an alternative approach for the collection of high-resolution structural data from micro-sized crystals. Advantages of the proposed approach are its thriftiness, its handling-free nature, the reduced amount of sample required, the adjustable hit rate, the high indexing rate and the minimization of background scattering.
[C57BL/6 mice open field behaviour qualitatively depends on arena size].
Lebedev, I V; Pleskacheva, M G; Anokhin, K V
2012-01-01
Open field behavior is well known to depend on the physical characteristics of the apparatus, but many of these effects are poorly described, especially with the use of modern methods of behavioral registration and analysis. Previous experimental results on the effect of arena size on behavior are few and contradictory. We compared the behavioral scores of four groups of C57BL/6 mice in round open field arenas of four different sizes (diameters 35, 75, 150 and 220 cm). The behavior was registered and analyzed using Noldus EthoVision, WinTrack and SegmentAnalyzer software. A significant effect of arena size was found. Traveled distance and velocity increased, but not in proportion to the increase in arena size. Moreover, a significant effect on segment characteristics of the trajectory was revealed. Detailed behavior analysis revealed drastic differences in trajectory structure and number of rears between the smaller (35 and 75 cm) and bigger (150 and 220 cm) arenas. We conclude that the character of exploration in smaller and bigger arenas depends on the relative size of the central open zone of the arena. Apparently its extension increases the motivational heterogeneity of the space, which requires a different exploration strategy than in smaller arenas.
The structural stability of lunar lava tubes
NASA Astrophysics Data System (ADS)
Blair, David M.; Chappaz, Loic; Sood, Rohan; Milbury, Colleen; Bobet, Antonio; Melosh, H. Jay; Howell, Kathleen C.; Freed, Andrew M.
2017-01-01
Mounting evidence from the SELENE, LRO, and GRAIL spacecraft suggests the presence of vacant lava tubes under the surface of the Moon. GRAIL evidence, in particular, suggests that some may be more than a kilometer in width. Such large sublunarean structures would be of great benefit to future human exploration of the Moon, providing shelter from the harsh environment at the surface. But could empty lava tubes of this size be stable under lunar conditions? And what is the largest size at which they could remain structurally sound? We address these questions by creating elasto-plastic finite element models of lava tubes using the Abaqus modeling software and examining where there is local material failure in the tube's roof. We assess the strength of the rock body using the Geological Strength Index method with values appropriate to the Moon, assign it a basaltic density derived from a modern re-analysis of lunar samples, and assume a 3:1 width-to-height ratio for the lava tube. Our results show that the stability of a lava tube depends on its width, its roof thickness, and whether the rock comprising the structure begins in a lithostatic or Poisson stress state. With a roof 2 m thick, lava tubes a kilometer or more in width can remain stable, supporting inferences from GRAIL observations. The theoretical maximum size of a lunar lava tube depends on a variety of factors, but given sufficient burial depth (500 m) and an initial lithostatic stress state, our results show that lava tubes up to 5 km wide may be able to remain structurally stable.
Damage Detection Sensor System for Aerospace and Multiple Applications
NASA Technical Reports Server (NTRS)
Williams, Martha; Lewis, Mark; Gibson, Tracy L.; Lane, John; Medelius, Pedro
2017-01-01
NASA has identified structural health monitoring and damage detection and verification as critical needs in multiple technology roadmaps. The sensor systems can be customized for detecting location, damage size, and depth, with velocity options, and can be designed for particular environments for monitoring of impact or physical damage to a structure. The damage detection system has been successfully demonstrated in a harsh environment, and remote integration has been tested across sites over 1000 miles apart. Multiple applications include: Spacecraft and Aircraft; Inflatable, Deployable and Expandable Structures; Space Debris Monitoring; Space Habitats; Military Shelters; Solar Arrays; Smart Garments and Wearables; Extravehicular Activity (EVA) Suits; Critical Hardware Enclosures; Embedded Composite Structures; and Flexible Hybrid Printed Electronics and Systems. For better implementation and infusion into more flexible architectures, work is currently under way on improved designs for the embedded software and GUI interface and on increasing the flexibility, modularity, and configurability of the system.
Marathon: An Open Source Software Library for the Analysis of Markov-Chain Monte Carlo Algorithms
Rechner, Steffen; Berger, Annabell
2016-01-01
We present the software library marathon, which is designed to support the analysis of sampling algorithms that are based on the Markov-Chain Monte Carlo principle. The main application of this library is the computation of properties of so-called state graphs, which represent the structure of Markov chains. We demonstrate applications and the usefulness of marathon by investigating the quality of several bounding methods on four well-known Markov chains for sampling perfect matchings and bipartite graphs. In a set of experiments, we compute the total mixing time and several of its bounds for a large number of input instances. We find that the upper bound gained by the famous canonical path method is often several orders of magnitude larger than the total mixing time and deteriorates with growing input size. In contrast, the spectral bound is found to be a precise approximation of the total mixing time. PMID:26824442
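The spectral bound referenced above relates the mixing time of a reversible chain to its second-largest eigenvalue modulus, via t_mix(eps) <= ln(1/(eps * pi_min)) / (1 - lambda*). The numpy sketch below computes that bound for an explicit transition matrix; it is a generic textbook illustration, not code from the marathon library.

```python
# Spectral upper bound on mixing time for a small reversible chain:
#   t_mix(eps) <= ln(1 / (eps * pi_min)) / (1 - lambda_star)
# Generic numpy illustration, not code from the marathon library.
import numpy as np

P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])      # lazy random walk on a 3-path

eigvals = np.linalg.eigvals(P)
lambda_star = sorted(abs(eigvals))[-2]  # second-largest eigenvalue modulus

# Stationary distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(abs(w - 1))])
pi = pi / pi.sum()

eps = 0.25
bound = np.log(1.0 / (eps * pi.min())) / (1.0 - lambda_star)
print(f"lambda* = {lambda_star:.3f}, spectral bound ~ {bound:.1f} steps")
```

For explicit state graphs like this, the bound is cheap to evaluate, which is why it serves as a useful yardstick against looser combinatorial bounds such as canonical paths.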
Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation
NASA Technical Reports Server (NTRS)
Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.
2000-01-01
An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.
NASA Technical Reports Server (NTRS)
Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.
2000-01-01
An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.
Multiscale Fatigue Life Prediction for Composite Panels
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Yarrington, Phillip W.; Arnold, Steven M.
2012-01-01
Fatigue life prediction capabilities have been incorporated into the HyperSizer Composite Analysis and Structural Sizing Software. The fatigue damage model is introduced at the fiber/matrix constituent scale through HyperSizer's coupling with NASA's MAC/GMC micromechanics software. This enables prediction of micro-scale damage progression throughout stiffened and sandwich panels as a function of cycles, leading ultimately to simulated panel failure. The fatigue model implementation uses a cycle-jumping technique: rather than applying a specified number of additional cycles, a local damage increment is specified and the number of additional cycles needed to reach this increment is calculated. In this way, the effect of stress redistribution due to damage-induced stiffness change is captured, but the fatigue simulations remain computationally efficient. The model is compared to experimental fatigue life data for two composite facesheet/foam core sandwich panels, demonstrating very good agreement.
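The cycle-jumping idea can be made concrete: instead of marching cycle by cycle, solve for the number of cycles that produces a target damage increment at the current damage rate, apply the jump, then re-evaluate. The sketch below is a schematic illustration with a made-up damage-rate law; it is not HyperSizer or MAC/GMC code.

```python
# Schematic cycle-jumping loop for fatigue damage accumulation.
# The power-law damage rate is a made-up stand-in for a real constituent
# fatigue model; this is not HyperSizer or MAC/GMC code.

def damage_rate(d, stress_ratio=0.8):
    """Hypothetical damage growth per cycle, rising as damage accumulates
    (mimics stress redistribution onto the surviving material)."""
    return 1e-6 * stress_ratio * (1.0 + 5.0 * d)

def simulate(d_increment=0.05, d_fail=1.0):
    d, cycles = 0.0, 0.0
    while d < d_fail:
        rate = damage_rate(d)        # re-evaluated after each jump,
        dN = d_increment / rate      # capturing the updated stress state
        d += d_increment
        cycles += dN
        print(f"d = {d:4.2f}  after ~{cycles:12.0f} cycles")
    return cycles

simulate()
```

The damage increment controls the accuracy/cost trade-off: smaller increments track the stiffness change more closely at the price of more re-analyses.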
NASA Astrophysics Data System (ADS)
Alexander, K.; Easterbrook, S. M.
2015-01-01
We analyse the source code of eight coupled climate models, selected from those that participated in the CMIP5 (Taylor et al., 2012) or EMICAR5 (Eby et al., 2013; Zickfeld et al., 2013) intercomparison projects. For each model, we sort the preprocessed code into components and subcomponents based on dependency structure. We then create software architecture diagrams which show the relative sizes of these components/subcomponents and the flow of data between them. The diagrams also illustrate several major classes of climate model design; the distribution of complexity between components, which depends on historical development paths as well as the conscious goals of each institution; and the sharing of components between different modelling groups. These diagrams offer insights into the similarities and differences between models, and have the potential to be useful tools for communication between scientists, scientific institutions, and the public.
Impact Damage and Strain Rate Effects for Toughened Epoxy Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Minnetyan, Levon
2006-01-01
Structural integrity of composite systems under dynamic impact loading is investigated herein. The GENOA virtual testing software environment is used to implement the effects of dynamic loading on fracture progression and damage tolerance. Combinations of graphite and glass fibers with a toughened epoxy matrix are investigated. The effect of a ceramic coating for the absorption of impact energy is also included. Impact and post-impact simulations include verification and prediction of (1) load and impact energy, (2) impact damage size, (3) maximum impact peak load, (4) residual strength, (5) maximum displacement, (6) contribution of failure modes to failure mechanisms, (7) impact load versus time, and (8) damage and fracture pattern. A computer model is utilized for the assessment of structural response, progressive fracture, and defect/damage tolerance characteristics. Results show the damage progression sequence and the changes in the structural response characteristics due to dynamic impact. The fundamental premise of computational simulation is that the complete evaluation of composite fracture requires an assessment of ply and subply level damage/fracture processes as the structure is subjected to loads. Simulation results for the graphite/epoxy composite were compared with impact and tension failure test data; correlation and verification were obtained for (1) impact energy, (2) damage size, (3) maximum impact peak load, (4) residual strength, (5) maximum displacement, and (6) failure mechanisms of the composite structure.
Development problem analysis of correlation leak detector’s software
NASA Astrophysics Data System (ADS)
Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.
2018-05-01
In this article, the practical application and structure of correlation leak detector software are studied, and the task of designing such software is analyzed. The first part of the paper shows why developing correlation leak detectors is worthwhile for improving the operating efficiency of public utility networks. The functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. The second part of the paper examines several steps in the development of the software package, namely requirements definition, program structure definition, and software concept creation, in the context of experience gained with a hardware-software prototype of a correlation leak detector.
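The central computation in a correlation leak detector is estimating the arrival-time difference of the leak noise at two pipe-mounted sensors and converting it to a position along the pipe. The numpy sketch below shows that standard textbook computation on synthetic signals; the sampling rate, propagation speed, and sensor spacing are assumed values, and this is not the prototype software from the paper.

```python
# Cross-correlation time-delay estimation for leak location: the delay of
# the leak noise between two pipe-mounted sensors gives the leak position.
# Generic textbook computation with assumed parameters, not the paper's
# prototype software.
import numpy as np

FS = 8000.0        # sampling rate, Hz (assumed)
V = 1200.0         # acoustic propagation speed in the pipe, m/s (assumed)
L = 100.0          # sensor-to-sensor distance, m (assumed)

rng = np.random.default_rng(0)
leak = rng.normal(size=4096)                      # synthetic leak noise
delay_true = 120                                  # samples: A hears it first
a = np.concatenate([leak, np.zeros(delay_true)])
b = np.concatenate([np.zeros(delay_true), leak])  # same noise, delayed

xcorr = np.correlate(a - a.mean(), b - b.mean(), mode="full")
lag = np.argmax(xcorr) - (len(a) - 1)             # samples; negative: b lags a

tau = abs(lag) / FS                               # delay in seconds
x_from_A = (L - V * tau) / 2.0                    # standard location formula
print(f"lag = {lag} samples, leak ~ {x_from_A:.1f} m from sensor A")
```

The location formula follows from the two travel paths: the noise travels x to one sensor and L - x to the other, so the measured delay is (L - 2x)/V.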
The Effectiveness of Software Project Management Practices: A Quantitative Measurement
2011-03-01
Assessment (SPMMA) model (Ramli, 2007). The purpose of the SPMMA was to help a company measure the strengths and weaknesses of its software project...Practices," Fuazi and Ramli presented a model to assess software project management practices using their Software Project Management Maturity...Analysis: The SPMMA was carried out on one mid-size Information Technology (IT) company. Based on the questionnaire responses, interviews and discussions
Software Technology for Adaptable, Reliable Systems (STARS)
1994-03-25
Cost and sizing models cited (with number of mentions): Timeline (3), SECOMO (3), SEER (3), GSFC Software Engineering Lab Model (1), SLIM (4), SEER-SEM (1), SPQR (2), PRICE-S (2), internally developed models (3), APMSS (1), SASET (Software Architecture Sizing Estimating Tool) (2), MicroMan II (2), LCM (Logistics Cost Model) (2)
Vidal-García, Marta; Bandara, Lashi; Keogh, J Scott
2018-05-01
The quantification of complex morphological patterns typically involves comprehensive shape and size analyses, usually obtained by gathering morphological data from all the structures that capture the phenotypic diversity of an organism or object. Articulated structures are a critical component of overall phenotypic diversity, but data gathered from these structures are difficult to incorporate into modern analyses because of the complexities associated with jointly quantifying 3D shape in multiple structures. While there are existing methods for analyzing shape variation in articulated structures in two-dimensional (2D) space, these methods do not work in 3D, a rapidly growing area of capability and research. Here, we describe a simple geometric rigid rotation approach that removes the effect of random translation and rotation, enabling the morphological analysis of 3D articulated structures. Our method is based on Cartesian coordinates in 3D space, so it can be applied to any morphometric problem that also uses 3D coordinates (e.g., spherical harmonics). We demonstrate the method by applying it to a landmark-based dataset for analyzing shape variation using geometric morphometrics. We have developed an R tool (ShapeRotator) so that the method can be easily implemented in the commonly used R package geomorph and MorphoJ software. This method will be a valuable tool for 3D morphological analyses in articulated structures by allowing an exhaustive examination of shape and size diversity.
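Removing arbitrary translation and rotation from 3D landmark sets is typically done by centering both configurations and solving for the optimal rotation. The numpy sketch below uses the standard Kabsch SVD solution as a generic illustration of that step; it is not the ShapeRotator R implementation.

```python
# Kabsch alignment sketch: remove translation and rotation between two
# 3D landmark configurations. Generic SVD method for illustration; this
# is not the ShapeRotator R code.
import numpy as np

def kabsch_align(P, Q):
    """Rigidly align P (n x 3) onto Q (n x 3); returns the aligned copy."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)   # remove translation
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(U @ Vt))                # avoid reflections
    R = U @ np.diag([1.0, 1.0, d]) @ Vt               # optimal rotation
    return Pc @ R + Q.mean(axis=0)

rng = np.random.default_rng(1)
Q = rng.normal(size=(10, 3))                          # reference landmarks
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
P = Q @ Rz.T + np.array([5.0, -2.0, 1.0])             # rotated, translated copy
print(np.allclose(kabsch_align(P, Q), Q, atol=1e-8))  # True
```

Applying such an alignment separately to each rigid unit of an articulated structure, about its joint, is what makes the joint configurations comparable across specimens.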
Leader Delegation and Trust in Global Software Teams
ERIC Educational Resources Information Center
Zhang, Suling
2008-01-01
Virtual teams are an important work structure in global software development. The distributed team structure enables access to a diverse set of expertise which is often not available in one location, to a cheaper labor force, and to a potentially accelerated development process that uses a twenty-four hour work structure. Many software teams…
SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.
Smith, Lucas R; Barton, Elisabeth R
2014-01-01
Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, the software to freely, efficiently, and consistently analyze them is not readily available. In order to provide this service to the muscle research community we developed an open source MATLAB script to analyze immunofluorescent muscle sections incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Establishing parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. The program allows for semi-automated fiber detection along with user correction. The output of the code provides data in accordance with established standards of practice. The results of the program have been validated using a small set of wild-type and mdx muscle sections. This program is the first freely available and open source image processing program designed to automate analysis of skeletal muscle histological sections.
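A minimal sketch of the segment-then-filter workflow the abstract describes, transposed from MATLAB to Python for illustration; threshold and min_area_px are hypothetical parameters, and SMASH's actual pipeline (fiber splitting, user correction, typing) is considerably richer than this.

```python
import numpy as np
from scipy import ndimage

def fiber_sizes(membrane_channel, threshold, min_area_px):
    """Segment muscle fibers from an immunofluorescent membrane stain and
    return per-fiber areas in pixels. Fibers appear as dark interiors
    bounded by bright membrane signal."""
    interior = membrane_channel < threshold
    labels, n = ndimage.label(interior)                   # connected components
    areas = ndimage.sum(interior, labels, range(1, n + 1))
    return areas[areas >= min_area_px]                    # drop non-fiber specks
```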
Numerical simulation of a novel expanded metal tubular structure for crashworthiness application
NASA Astrophysics Data System (ADS)
Abdelaal, A. H. A.; Tarlochan, F.
2015-12-01
The search for new geometries and materials that would serve in crashworthiness applications is a cumulative process. Recent studies investigated the performance of expanded metal tubes and possible ways to enhance their energy absorption capability. The aim of this work is to investigate the crashworthiness characteristics of a new concept in which an expanded metal tube is fitted inside a double-walled tube made of the same material to form a single structure. The tube was then numerically tested with a verified model using the ABAQUS software. The influence of the expanded metal cell size was also investigated in the present study. The new concept showed enhanced energy absorption characteristics relative to the change in the mass of the tubular structure. The enhancement was related to both the change in deformation pattern and the increase in crushed mass.
Software implementation of the SKIPSM paradigm under PIP
NASA Astrophysics Data System (ADS)
Hack, Ralf; Waltz, Frederick M.; Batchelor, Bruce G.
1997-09-01
SKIPSM (separated-kernel image processing using finite state machines) is a technique for implementing large-kernel binary-morphology operators and many other operations. While earlier papers on SKIPSM concentrated mainly on implementations using pipelined hardware, there is considerable scope for achieving major speed improvements in software systems. Using identical control software, one-pass binary erosion and dilation with structuring elements (SEs) ranging from the trivial (3 by 3) to the gigantic (51 by 51, or even larger) are readily available. Processing speed is independent of the size of the SE, making the SKIPSM approach practical for work with very large SEs on ordinary desktop computers. PIP (Prolog image processing) is an interactive machine vision prototyping environment developed at the University of Wales Cardiff. It consists of a large number of image processing operators embedded within the standard AI language Prolog. This paper describes the SKIPSM implementation of binary morphology operators within PIP. A large set of binary erosion and dilation operations (circles, squares, diamonds, octagons, etc.) is available to the user through a command-line driven dialogue, via pull-down menus, or incorporated into standard (Prolog) programs. Little has been done thus far to optimize speed in this first software implementation of SKIPSM. Nevertheless, the results are impressive. The paper describes sample applications and presents timing figures. Readers have the opportunity to try out these operations on demonstration software written by the University of Wales, or via their WWW home page at http://bruce.cs.cf.ac.uk/bruce/index.html.
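SKIPSM's headline property, speed independent of SE size, can be glimpsed with a related decomposition: a k-by-k square SE factors into two 1D min-filter passes. The Python sketch below shows that separable form only; it does not reproduce the finite-state-machine formulation itself.

```python
import numpy as np
from scipy.ndimage import minimum_filter1d

def erode_square(binary_img, k):
    """Binary erosion with a k-by-k square structuring element done as two
    1D passes (rows, then columns). Erosion of a binary image is a minimum
    filter, and a square SE is separable into two 1D SEs."""
    img = binary_img.astype(np.uint8)
    rows = minimum_filter1d(img, size=k, axis=1)   # 1D erosion along rows
    return minimum_filter1d(rows, size=k, axis=0)  # then along columns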
Heliostat cost optimization study
NASA Astrophysics Data System (ADS)
von Reeken, Finn; Weinrebe, Gerhard; Keck, Thomas; Balz, Markus
2016-05-01
This paper presents a methodology for a heliostat cost optimization study. First, different variants of small, medium-sized and large heliostats are designed. Then the respective costs, tracking and optical quality are determined. For the calculation of optical quality, a structural model of the heliostat is programmed and analyzed using finite element software. The costs are determined based on inquiries and from experience with similar structures. Eventually the levelised electricity costs for a reference power tower plant are calculated. Before each annual simulation run the heliostat field is optimized. Calculated LCOEs are then used to identify the most suitable option(s). Finally, the conclusions and findings of this extensive cost study are used to define the concept of a new cost-efficient heliostat called `Stellio'.
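For reference, the levelised cost of electricity such a study optimizes is conventionally computed with a capital recovery factor. A generic textbook sketch in Python, not the authors' cost model:

```python
def lcoe(capex, opex_per_year, annual_energy_mwh, discount_rate, lifetime_years):
    """Levelised cost of electricity per MWh using a capital recovery factor."""
    r, n = discount_rate, lifetime_years
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)   # annualises the investment
    return (capex * crf + opex_per_year) / annual_energy_mwh

# Example: $140M plant, $3M/yr O&M, 500 GWh/yr, 7% over 25 years.
print(f"{lcoe(140e6, 3e6, 500e3, 0.07, 25):.1f} $/MWh")
```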
NASA Astrophysics Data System (ADS)
Prasad, Sandeep; Choudhary, B. S.; Mishra, A. K.
2017-08-01
Rock fragment size is a very important parameter from an economic point of view in any surface mine. Fragment size directly affects the costs of drilling, blasting, loading, secondary blasting and crushing. The main purpose of this study is to investigate the effect of blast design parameters such as burden, blast hole length, stemming length, and powder factor on rock fragmentation. The mean fragment size (MFS, K50, m) and maximum fragment size (K95, m) of the rock were determined using computer software. After every blasting operation, images of the whole muck pile were captured, and these images were used for fragmentation analysis with the Fragalyst software. It was observed that the mean fragment size (MFS, K50, m) and maximum fragment size (K95, m) of the rock depend strongly on the blast design and explosive parameters.
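As a rough analytical cross-check on image-based fragment sizing, the Kuznetsov equation of the Kuz-Ram model relates mean fragment size to the same blast design parameters. A hedged sketch assuming the common Cunningham form of the equation; the paper itself derived sizes from muckpile images, not from this relation.

```python
def kuznetsov_x50(rock_factor, volume_per_hole_m3, charge_kg, rws=100.0):
    """Mean fragment size X50 (cm) from the Kuznetsov equation
    (Cunningham's Kuz-Ram form). volume_per_hole_m3 is burden x spacing x
    bench height; rws is the explosive's relative weight strength (ANFO = 100)."""
    return (rock_factor
            * (volume_per_hole_m3 / charge_kg) ** 0.8
            * charge_kg ** (1.0 / 6.0)
            * (115.0 / rws) ** (19.0 / 30.0))
```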
Lab-on-chip platform for circulating tumor cells isolation
NASA Astrophysics Data System (ADS)
Maurya, D. K.; Fooladvand, M.; Gray, E.; Ziman, M.; Alameh, K.
2015-12-01
We design, develop and demonstrate the principle of a continuous, non-intrusive, low power microfluidics-based lab-on-a-chip (LOC) structure for Circulating Tumor Cell (CTC) separation. Cell separation is achieved through 80 cascaded contraction and expansion microchannels of widths 60 μm and 300 μm, respectively, and depth 60 μm, which enable momentum-change-induced inertial forces to be exerted on the cells, thus routing them to desired destinations. The total length of the developed LOC is 72 mm. The LOC structure is simulated using the COMSOL multiphysics software, which enables the optimization of the dimensions of the various components of the LOC structure, namely the three inlets, three filters, three contraction and expansion microchannel segments and five outlets. Simulation results show that the LOC can isolate CTCs of sizes ranging from 15 to 30 μm with a recovery rate in excess of 90%. Fluorescent microparticles of two different sizes (5 μm and 15 μm), emulating blood and CTC cells, respectively, are used to demonstrate the principle of the developed LOC. A mixture of these microparticles is injected into the primary LOC inlet via an electronically-controlled syringe pump, and the large-size particles are routed to the primary LOC outlet through the contraction and expansion microchannels. Experimental results demonstrate the ability of the developed LOC to isolate particles by size exclusion with an accuracy of 80%. Ongoing research is focusing on the LOC design improvement for better separation efficiency and testing of biological samples for isolation of CTCs.
An Educational Software for Simulating the Sample Size of Molecular Marker Experiments
ERIC Educational Resources Information Center
Helms, T. C.; Doetkott, C.
2007-01-01
We developed educational software to show graduate students how to plan molecular marker experiments. These computer simulations give the students feedback on the precision of their experiments. The objective of the software was to show students using a hands-on approach how: (1) environmental variation influences the range of the estimates of the…
A Computerized Cataloging System for an Outdoor Program Library or Resource Center.
ERIC Educational Resources Information Center
Watters, Ron
The Outdoor Resource Library Cataloging System is a computer software program designed primarily for outdoor programs with small to medium-sized resource centers. The software is free to nonprofit organizations and is available from the Idaho State University Outdoor Program. The software is used to construct a database of library materials, which…
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
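The substructuring idea condenses each substructure's interior degrees of freedom onto its boundary. A minimal static-condensation (Guyan reduction) sketch in Python; dynamic substructuring, as discussed in the paper, additionally condenses the mass matrix.

```python
import numpy as np

def guyan_reduce(K, boundary_dofs):
    """Condense a stiffness matrix onto boundary DOFs:
    K_reduced = K_bb - K_bi @ inv(K_ii) @ K_ib."""
    n = K.shape[0]
    b = np.asarray(boundary_dofs)
    i = np.setdiff1d(np.arange(n), b)            # interior DOFs
    Kbb, Kbi = K[np.ix_(b, b)], K[np.ix_(b, i)]
    Kib, Kii = K[np.ix_(i, b)], K[np.ix_(i, i)]
    return Kbb - Kbi @ np.linalg.solve(Kii, Kib) # exact for static loads on b
```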
Evaluation of software maintainability with openEHR - a comparison of architectures.
Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, James R
2014-11-01
To assess whether it is easier to maintain a clinical information system developed using openEHR model-driven development versus mainstream methods. A new open source application (GastrOS) has been developed following openEHR's multi-level modelling approach using .Net/C#, based on the same requirements as an existing clinically used application developed using Microsoft Visual Basic and an Access database. In the latter, almost all the domain knowledge was embedded in the software code and data model. The same domain knowledge has been expressed as a set of openEHR archetypes in GastrOS. We then introduced eight real-world change requests that had accumulated during live clinical usage, and implemented these in both systems while measuring time for various development tasks and change in software size for each change request. Overall it took half the time to implement changes in GastrOS. However, it was the more difficult application to modify for one change request, suggesting the nature of change is also important. It was not possible to implement changes by modelling only. Comparison of relative measures of time and software size change within each application highlights how architectural differences affected maintainability across change requests. The use of openEHR model-driven development can result in better software maintainability. The degree to which openEHR affects software maintainability depends on the extent and nature of domain knowledge involved in changes. Although we used relative measures for time and software size, confounding factors could not be totally excluded, as a controlled study design was not feasible. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Object-oriented microcomputer software for earthquake seismology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroeger, G.C.
1993-02-01
A suite of graphically interactive applications for the retrieval, editing and modeling of earthquake seismograms has been developed using object-oriented programming methodology and the C++ language. Retriever is an application which allows the user to search for, browse, and extract seismic data from CD-ROMs produced by the National Earthquake Information Center (NEIC). The user can restrict the date, size, location and depth of desired earthquakes and extract selected data into a variety of common seismic file formats. Reformer is an application that allows the user to edit seismic data and data headers, and perform a variety of signal processing operations on that data. Synthesizer is a program for the generation and analysis of teleseismic P and SH synthetic seismograms. The program provides graphical manipulation of source parameters, crustal structures and seismograms, as well as near real-time response in generating synthetics for arbitrary flat-layered crustal structures. All three applications use class libraries developed for implementing geologic and seismic objects and views. Standard seismogram view objects and objects that encapsulate the reading and writing of different seismic data file formats are shared by all three applications. The focal mechanism views in Synthesizer are based on a generic stereonet view object. Interaction with the native graphical user interface is encapsulated in a class library in order to simplify the porting of the software to different operating systems and application programming interfaces. The software was developed on the Apple Macintosh and is being ported to UNIX/X-Window platforms.
Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements
NASA Astrophysics Data System (ADS)
Kassab, Mohamed; Daneva, Maya; Ormandjieva, Olga
The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient attention to this need. This paper presents a flexible, yet systematic approach to the early requirements-based effort estimation, based on Non-Functional Requirements ontology. It complementarily uses one standard functional size measurement model and a linear regression technique. We report on a case study which illustrates the application of our solution approach in context and also helps evaluate our experiences in using it.
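The paper couples a functional size measurement with linear regression; the essence of that second ingredient fits in a few lines of Python. The historical project data below are invented purely for illustration.

```python
import numpy as np

# Hypothetical history: functional size (e.g., COSMIC function points)
# versus actual effort in person-hours. Values are made up.
size = np.array([120, 250, 310, 480, 600, 720])
effort = np.array([800, 1500, 2100, 3200, 3900, 5100])

slope, intercept = np.polyfit(size, effort, 1)   # effort = slope*size + intercept
predicted = slope * 400 + intercept              # estimate a 400-point project
print(f"effort ~ {slope:.2f}*size + {intercept:.1f} -> {predicted:.0f} person-hours")
```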
First trimester size charts of embryonic brain structures.
Gijtenbeek, M; Bogers, H; Groenenberg, I A L; Exalto, N; Willemsen, S P; Steegers, E A P; Eilers, P H C; Steegers-Theunissen, R P M
2014-02-01
Can reliable size charts of human embryonic brain structures be created from three-dimensional ultrasound (3D-US) visualizations? Reliable size charts of human embryonic brain structures can be created from high-quality images. Previous studies on the visualization of both the cavities and the walls of the brain compartments were performed using 2D-US, 3D-US or invasive intrauterine sonography. However, the walls of the diencephalon, mesencephalon and telencephalon have not been measured non-invasively before. Last-decade improvements in transvaginal ultrasound techniques allow a better visualization and offer the tools to measure these human embryonic brain structures with precision. This study is embedded in a prospective periconceptional cohort study. A total of 141 pregnancies were included before the sixth week of gestation and were monitored until delivery to assess complications and adverse outcomes. For the analysis of embryonic growth, 596 3D-US scans encompassing the entire embryo were obtained from 106 singleton non-malformed live birth pregnancies between 7(+0) and 12(+6) weeks' gestational age (GA). Using 4D View (3D software), the measured embryonic brain structures comprised the thickness of the diencephalon, mesencephalon and telencephalon, and the total diameter of the diencephalon and mesencephalon. Of 596 3D scans, 161 (27%) high-quality scans of 79 pregnancies were eligible for analysis. The reliability of all embryonic brain structure measurements, based on the intra-class correlation coefficients (ICCs) (all above 0.98), was excellent. Bland-Altman plots showed moderate agreement for measurements of the telencephalon, but for all other measurements the agreement was good. Size charts were constructed according to crown-rump length (CRL). The percentage of high-quality scans suitable for analysis of these brain structures was low (27%). The size charts can be used to study normal and abnormal human embryonic brain development in the future. Also, the effects of periconceptional maternal exposures, such as folic acid supplement use and smoking, on human embryonic brain development can be a topic of future research. This study was supported by the Department of Obstetrics and Gynaecology of the Erasmus University Medical Center. M.G. was supported by an additional grant from the Sophia Foundation for Medical Research (SSWO grant number 644). No competing interests are declared.
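The Bland-Altman agreement analysis mentioned above reduces to a bias and 95% limits of agreement between paired measurements. A minimal Python sketch, assuming two raters measured the same structures; the study also reports ICCs, which are not computed here.

```python
import numpy as np

def bland_altman(measure_1, measure_2):
    """Bias and 95% limits of agreement between paired measurements."""
    diff = np.asarray(measure_1) - np.asarray(measure_2)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)
```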
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
Functional description of the ISIS system
NASA Technical Reports Server (NTRS)
Berman, W. J.
1979-01-01
Development of software for avionic and aerospace applications (flight software) is influenced by a unique combination of factors which includes: (1) the length of the life cycle of each project; (2) the necessity for cooperation between the aerospace industry and NASA; (3) the need for flight software that is highly reliable; (4) the increasing complexity and size of flight software; and (5) the high quality of the programmers and the tightening of project budgets. The interactive software invocation system (ISIS) described here is designed to overcome the problems created by this combination of factors.
Software errors and complexity: An empirical investigation
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Perricone, Berry T.
1983-01-01
The distributions and relationships derived from the change data collected during the development of a medium scale satellite software project show that meaningful results can be obtained which allow an insight into software traits and the environment in which it is developed. Modified and new modules were shown to behave similarly. An abstract classification scheme for errors which allows a better understanding of the overall traits of a software project is also shown. Finally, various size and complexity metrics are examined with respect to errors detected within the software yielding some interesting results.
Numerical estimation of cavitation intensity
NASA Astrophysics Data System (ADS)
Krumenacker, L.; Fortes-Patella, R.; Archer, A.
2014-03-01
Cavitation may appear in turbomachinery and in hydraulic orifices, venturis or valves, leading to performance losses, vibrations and material erosion. This study proposes a new method to predict the cavitation intensity of the flow, based on a post-processing of unsteady CFD calculations. The paper presents the analyses of cavitating structures' evolution at two different scales: • A macroscopic one, in which the growth of cavitating structures is calculated using URANS software based on a homogeneous model. Simulations of cavitating flows are computed using a barotropic law considering the presence of air and interfacial tension, and Reboud's correction on the turbulence model. • Then a small one, where a Rayleigh-Plesset software calculates the acoustic energy generated by the implosion of the vapor/gas bubbles with input parameters from the macroscopic scale. The volume damage rate of the material during the incubation time is assumed to be a fraction of the cumulative acoustic energy received by the solid wall. The proposed analysis method is applied to calculations on hydrofoil and orifice geometries. Comparisons between model results and experimental works concerning flow characteristics (size of cavity, pressure, velocity) as well as pitting (erosion area, relative cavitation intensity) are presented.
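The bubble-scale step rests on the Rayleigh-Plesset equation. Below is a minimal sketch integrating its simplified inviscid form (no viscosity or surface tension) for a collapsing bubble; the constants and stopping tolerance are illustrative, and the paper's model is more complete.

```python
import numpy as np
from scipy.integrate import solve_ivp

RHO, P_INF, P_V = 1000.0, 101325.0, 2339.0   # water at ~20 C (illustrative)

def rayleigh_plesset(t, y):
    """Simplified Rayleigh-Plesset: R*R'' + 1.5*R'^2 = (p_v - p_inf)/rho."""
    R, Rdot = y
    return [Rdot, ((P_V - P_INF) / RHO - 1.5 * Rdot**2) / R]

def collapsed(t, y):          # stop the solver just before R reaches zero
    return y[0] - 1e-6
collapsed.terminal = True

# Collapse of a 1 mm bubble initially at rest; the classical Rayleigh
# collapse time for these values is roughly 92 microseconds.
sol = solve_ivp(rayleigh_plesset, (0.0, 2e-4), [1e-3, 0.0],
                events=collapsed, max_step=1e-7)
```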
Multiscale analysis of river networks using the R package linbin
Welty, Ethan Z.; Torgersen, Christian E.; Brenkman, Samuel J.; Duda, Jeffrey J.; Armstrong, Jonathan B.
2015-01-01
Analytical tools are needed in riverine science and management to bridge the gap between GIS and statistical packages that were not designed for the directional and dendritic structure of streams. We introduce linbin, an R package developed for the analysis of riverscapes at multiple scales. With this software, riverine data on aquatic habitat and species distribution can be scaled and plotted automatically with respect to their position in the stream network or—in the case of temporal data—their position in time. The linbin package aggregates data into bins of different sizes as specified by the user. We provide case studies illustrating the use of the software for (1) exploring patterns at different scales by aggregating variables at a range of bin sizes, (2) comparing repeat observations by aggregating surveys into bins of common coverage, and (3) tailoring analysis to data with custom bin designs. Furthermore, we demonstrate the utility of linbin for summarizing patterns throughout an entire stream network, and we analyze the diel and seasonal movements of tagged fish past a stationary receiver to illustrate how linbin can be used with temporal data. In short, linbin enables more rapid analysis of complex data sets by fisheries managers and stream ecologists and can reveal underlying spatial and temporal patterns of fish distribution and habitat throughout a riverscape.
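The core linbin operation, aggregating observations positioned along a stream into bins of a chosen size, looks roughly like this in Python (linbin itself is an R package; the names here are illustrative):

```python
import numpy as np

def bin_by_position(position_m, value, bin_size_m):
    """Aggregate point observations along a stream into fixed-size bins.
    Returns bin start positions and the mean value per bin (NaN if empty)."""
    position_m, value = np.asarray(position_m), np.asarray(value)
    edges = np.arange(0.0, position_m.max() + bin_size_m, bin_size_m)
    idx = np.digitize(position_m, edges) - 1
    means = np.array([value[idx == i].mean() if np.any(idx == i) else np.nan
                      for i in range(len(edges) - 1)])
    return edges[:-1], means
```

Running this at several bin sizes reproduces the package's multi-scale view of the same data.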
Thermal modeling and analysis of structurally complex spacecraft using the IDEAS system
NASA Technical Reports Server (NTRS)
Garrett, L. B.
1983-01-01
Large antenna satellites of unprecedented sizes are needed for a number of applications. Antenna diameters on the order of 50 meters and upward are required. Such antennas involve the use of large expanses of lattice structures with hundreds or thousands of individual connecting members. In the design of such structures, thermal effects are a crucial consideration. Software capabilities have emerged which are coded to include the major first-order thermal effects and to purposely ignore, in the interest of computational efficiency, the secondary effects. The Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system is one such capability. It was developed for use in thermal-structural interaction analyses related to the design of large, structurally complex classes of future spacecraft. An IDEAS overview is presented. Attention is given to a typical antenna analysis using IDEAS, the thermal and loading analyses of a tetrahedral truss spacecraft, and ecliptic and polar orbit analyses.
NASA Astrophysics Data System (ADS)
Arif Shah, Muhammad; Hashim, Rathiah; Shah, Adil Ali; Farooq Khattak, Umar
2016-11-01
Developing software through Global Software Development (GSD) has become very common in the software industry. Pakistan is one of the countries where projects are taken on from clients in different countries, including Afghanistan. The purpose of this paper is to identify and analyze several communication barriers that can have a negative impact on a project, and to provide management guidelines for medium-size software organizations working in Pakistan with clients from Afghanistan, so that they can overcome the communication barriers and challenges they face when coordinating with clients. Initially we performed a literature review to identify different communication barriers and to check whether any standardized communication management guidelines for medium-size software houses had been provided in the past. The second stage of the research develops guidelines from the vendor's perspective, based on interviews and focus group discussions with different stakeholders and employees of software houses with clients from Afghanistan. Based on those interviews and discussions we established communication management guidelines to overcome the communication problems and barriers encountered when working with clients from Afghanistan. As a result of the literature review, we identified that cultural and language barriers were among the main reasons behind project failure, and we suggest that software organizations working in Pakistan should follow certain defined communication guidelines in order to overcome communication barriers that affect the project directly.
Launch vehicle design and GNC sizing with ASTOS
NASA Astrophysics Data System (ADS)
Cremaschi, Francesco; Winter, Sebastian; Rossi, Valerio; Wiegand, Andreas
2018-03-01
The European Space Agency (ESA) is currently involved in several activities related to launch vehicle designs (Future Launcher Preparatory Program, Ariane 6, VEGA evolutions, etc.). Within these activities, ESA has identified the importance of developing a simulation infrastructure capable of supporting the multi-disciplinary design and preliminary guidance navigation and control (GNC) design of different launch vehicle configurations. Astos Solutions has developed the multi-disciplinary optimization and launcher GNC simulation and sizing tool (LGSST) under ESA contract. The functionality is integrated in the Analysis, Simulation and Trajectory Optimization Software for space applications (ASTOS) and is intended to be used from the early design phases up to phase B1 activities. ASTOS shall enable the user to perform detailed vehicle design tasks and assessment of GNC systems, covering all aspects of rapid configuration and scenario management, sizing of stages, trajectory-dependent estimation of structural masses, rigid and flexible body dynamics, navigation, guidance and control, worst case analysis, launch safety analysis, performance analysis, and reporting.
Maximum likelihood techniques applied to quasi-elastic light scattering
NASA Technical Reports Server (NTRS)
Edwards, Robert V.
1992-01-01
An automatic procedure is needed for reliably estimating the quality of particle size measurements from QELS (quasi-elastic light scattering). Obtaining the measurement itself, before any error estimates can be made, is a problem because it comes from a very indirect measurement of a signal derived from the motion of particles in the system and requires the solution of an inverse problem. The eigenvalue structure of the transform that generates the signal is such that an arbitrarily small amount of noise can obliterate parts of any practical inversion spectrum. This project uses Maximum Likelihood Estimation (MLE) as a framework to generate a theory and a functioning set of software to oversee the measurement process and extract the particle size information, while at the same time providing error estimates for those measurements. The theory involved verifying a correct form of the covariance matrix for the noise on the measurement and then estimating particle size parameters using a modified histogram approach.
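For intuition, the monodisperse case reduces to fitting a single-exponential decay and applying Stokes-Einstein. The Python sketch below uses ordinary least squares as a stand-in for the paper's MLE, which additionally models the noise covariance; all parameter names are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

KB = 1.380649e-23   # Boltzmann constant, J/K

def radius_from_qels(tau, g1, q, temp_k=293.15, viscosity=1.0e-3):
    """Fit g1(tau) = exp(-Gamma*tau), then convert the decay rate to a
    hydrodynamic radius: Gamma = D*q^2 and D = kT/(6*pi*eta*r)."""
    gamma0 = 1.0 / tau[len(tau) // 2]                     # crude initial guess
    (gamma,), _ = curve_fit(lambda t, g: np.exp(-g * t), tau, g1, p0=[gamma0])
    diffusion = gamma / q**2
    return KB * temp_k / (6.0 * np.pi * viscosity * diffusion)
```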
An empirical study of software design practices
NASA Technical Reports Server (NTRS)
Card, David N.; Church, Victor E.; Agresti, William W.
1986-01-01
Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.
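The contingency table procedure referred to is, in essence, a test of independence between a design practice and fault rate. A sketch with invented counts:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: modules split by size (small/large) and fault rate
# (low/high), in the spirit of the SEL contingency-table procedure.
table = np.array([[310, 145],    # small modules: low, high fault rate
                  [220, 212]])   # large modules: low, high fault rate
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}")  # small p: practice and fault rate related
```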
Software Design Improvements. Part 1; Software Benefits and Limitations
NASA Technical Reports Server (NTRS)
Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom
1997-01-01
Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology should be applied to large and small software products to improve their design, and how can software be verified?
Future Software Sizing Metrics and Estimation Challenges
2011-07-01
systems; 4. Ultrahigh software system assurance; 5. Legacy maintenance and Brownfield development; 6. Agile and Lean/Kanban development. This paper...refined as the design of the maintenance modifications or Brownfield re-engineering is determined. VII. 6. AGILE AND LEAN/KANBAN DEVELOPMENT. The...difficulties of software maintenance estimation can often be mitigated by using lean workflow management techniques such as Kanban [25]. In Kanban
Mason F. Patterson; P. Eric Wiseman; Matthew F. Winn; Sang-mook Lee; Philip A. Araman
2011-01-01
UrbanCrowns is a software program developed by the USDA Forest Service that computes crown attributes using a side-view digital photograph and a few basic field measurements. From an operational standpoint, it is not known how well the software performs under varying photographic conditions for trees of diverse size, which could impact measurement reproducibility and...
Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S
2013-01-01
Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
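"Database table format" here means long-format tables that can be pivoted and filtered freely. A toy Python/pandas illustration of that organization; the real ALEX export has one row per species per sample with many more columns, and the values below are invented.

```python
import pandas as pd

df = pd.DataFrame({
    "sample":  ["wt_cereb", "wt_cereb", "ko_cereb", "ko_cereb"],
    "species": ["PC 34:1",  "PE 38:4",  "PC 34:1",  "PE 38:4"],
    "pmol":    [152.0,      87.5,       149.3,      41.2],
})
# Flexible navigation: pivot to a species-by-sample abundance matrix.
matrix = df.pivot_table(index="species", columns="sample", values="pmol")
print(matrix)
```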
de Oliveira, Marcus Vinicius Linhares; Santos, António Carvalho; Paulo, Graciano; Campos, Paulo Sergio Flores; Santos, Joana
2017-06-01
The purpose of this study was to apply a newly developed free software program, at low cost and with minimal time, to evaluate the quality of dental and maxillofacial cone-beam computed tomography (CBCT) images. A polymethyl methacrylate (PMMA) phantom, CQP-IFBA, was scanned in 3 CBCT units with 7 protocols. A macro program was developed, using the free software ImageJ, to automatically evaluate the image quality parameters. The image quality evaluation was based on 8 parameters: uniformity, the signal-to-noise ratio (SNR), noise, the contrast-to-noise ratio (CNR), spatial resolution, the artifact index, geometric accuracy, and low-contrast resolution. The image uniformity and noise depended on the protocol that was applied. Regarding the CNR, high-density structures were more sensitive to the effect of scanning parameters. There were no significant differences between SNR and CNR in centered and peripheral objects. The geometric accuracy assessment showed that all the distance measurements were lower than the real values. Low-contrast resolution was influenced by the scanning parameters, and the 1-mm rod present in the phantom was not depicted in any of the 3 CBCT units. Smaller voxel sizes presented higher spatial resolution. There were no significant differences among the protocols regarding artifact presence. This software package provided a fast, low-cost, and feasible method for the evaluation of image quality parameters in CBCT.
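Two of the eight parameters, SNR and CNR, are simple region-of-interest statistics. A Python sketch of the kind of measurement the ImageJ macro automates; the ROI tuples are illustrative.

```python
import numpy as np

def snr_cnr(image, roi_signal, roi_background):
    """SNR and CNR from two rectangular ROIs given as (row0, row1, col0, col1)."""
    r0, r1, c0, c1 = roi_signal
    sig = image[r0:r1, c0:c1]
    r0, r1, c0, c1 = roi_background
    bkg = image[r0:r1, c0:c1]
    snr = sig.mean() / bkg.std()
    cnr = abs(sig.mean() - bkg.mean()) / bkg.std()
    return snr, cnr
```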
NASA Workshop on Computational Structural Mechanics 1987, part 3
NASA Technical Reports Server (NTRS)
Sykes, Nancy P. (Editor)
1989-01-01
Computational Structural Mechanics (CSM) topics are explored. Algorithms and software for nonlinear structural dynamics, concurrent algorithms for transient finite element analysis, computational methods and software systems for dynamics and control of large space structures, and the use of multi-grid for structural analysis are discussed.
Reuse at the Software Productivity Consortium
NASA Technical Reports Server (NTRS)
Weiss, David M.
1989-01-01
The Software Productivity Consortium is sponsored by 14 aerospace companies as a developer of software engineering methods and tools. Software reuse and prototyping are currently the major emphasis areas. The Methodology and Measurement Project in the Software Technology Exploration Division has developed some concepts for reuse which they intend to develop into a synthesis process. They have identified two approaches to software reuse: opportunistic and systematic. The assumptions underlying the systematic approach, phrased as hypotheses, are the following: the redevelopment hypothesis, i.e., software developers solve the same problems repeatedly; the oracle hypothesis, i.e., developers are able to predict variations from one redevelopment to others; and the organizational hypothesis, i.e., software must be organized according to behavior and structure to take advantage of the predictions that the developers make. The conceptual basis for reuse includes: program families, information hiding, abstract interfaces, uses and information hiding hierarchies, and process structure. The primary reusable software characteristics are black-box descriptions, structural descriptions, and composition and decomposition based on program families. Automated support can be provided for systematic reuse, and the Consortium is developing a prototype reuse library and guidebook. The software synthesis process that the Consortium is aiming toward includes modeling, refinement, prototyping, reuse, assessment, and new construction.
Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L
2013-01-01
Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development. © 2013 by The International Union of Biochemistry and Molecular Biology.
Power and sample size for multivariate logistic modeling of unmatched case-control studies.
Gail, Mitchell H; Haneuse, Sebastien
2017-01-01
Sample size calculations are needed to design and assess the feasibility of case-control studies. Although such calculations are readily available for simple case-control designs and univariate analyses, there is limited theory and software for multivariate unconditional logistic analysis of case-control data. Here we outline the theory needed to detect scalar exposure effects or scalar interactions while controlling for other covariates in logistic regression. Both analytical and simulation methods are presented, together with links to the corresponding software.
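The simulation side of such power calculations can be sketched directly: generate case-control data under an assumed odds ratio, fit the logistic model, and count significant fits. A bare-bones Python version for a single binary exposure; the paper's methods also handle covariates and interactions.

```python
import numpy as np
import statsmodels.api as sm

def simulated_power(n_cases, n_controls, odds_ratio, exposure_prev,
                    n_sims=500, alpha=0.05, seed=1):
    """Monte Carlo power for one binary exposure in unconditional logistic
    regression. exposure_prev is the exposure prevalence among controls."""
    rng = np.random.default_rng(seed)
    p0 = exposure_prev
    odds1 = odds_ratio * p0 / (1 - p0)       # exposure odds among cases
    p1 = odds1 / (1 + odds1)
    hits = 0
    for _ in range(n_sims):
        x = np.concatenate([rng.binomial(1, p1, n_cases),
                            rng.binomial(1, p0, n_controls)])
        y = np.concatenate([np.ones(n_cases), np.zeros(n_controls)])
        fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
        hits += fit.pvalues[1] < alpha
    return hits / n_sims
```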
Engine Structures Modeling Software System (ESMOSS)
NASA Technical Reports Server (NTRS)
1991-01-01
Engine Structures Modeling Software System (ESMOSS) is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components, and substructures which can be transferred to finite element analysis programs such as NASTRAN. The NASA Lewis Engine Structures Program is concerned with the development of technology for the rational structural design and analysis of advanced gas turbine engines, with emphasis on advanced structural analysis, structural dynamics, structural aspects of aeroelasticity, and life prediction. Fundamental and common to all of these developments is the need for geometric and analytical model descriptions at various engine assembly levels, which are generated using ESMOSS.
Models and metrics for software management and engineering
NASA Technical Reports Server (NTRS)
Basili, V. R.
1988-01-01
This paper attempts to characterize and present a state of the art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resources allocation and estimation, changes and errors, size, complexity and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.
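One family of such models ties effort to size through a power law, as in basic COCOMO. A short sketch with the standard organic-mode constants, shown only as an example of the model class surveyed:

```python
def cocomo_basic(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort in person-months, effort = a * KLOC**b
    (organic-mode constants shown)."""
    return a * kloc ** b

print(f"{cocomo_basic(32):.1f} person-months for a 32 KLOC project")
```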
Structural characterization of LiCrxMn2-xO4 via a simple reflux technique
NASA Astrophysics Data System (ADS)
Purwaningsih, Dyah; Roto, Roto; Sutrisno, Hari; Purwanto, Agus
2017-03-01
LiCrxMn2-xO4 (x=0; 0.02; 0.04; 0.06; 0.08; 0.10) has been successfully synthesized via a facile and simple reflux technique. The SEM-EDS data confirm the presence of Cr, Mn and O elements in the products, while the XRD patterns suggest that the materials have well-developed cubic crystals. The direct method was applied to extract the structural parameters of LiCrxMn2-xO4 using the Fullprof and Oscail software in the WinPlotr package. The materials were refined, and the crystal system and the Fd3m space group of the structures were identified. The lattice parameters decrease with the decrease in Cr content. The highest Li-O bond length was found for LiCr0.10Mn1.90O4. It was observed that there is no significant change in particle size as the Cr content increased.
NASA Astrophysics Data System (ADS)
Najafi-Ashtiani, Hamed; Bahari, Ali; Gholipour, Samira; Hoseinzadeh, Siamak
2018-01-01
The composites of tungsten trioxide and silver are synthesized from sodium tungstate and silver nitrate precursors. The structural properties of the composite coatings are studied by FTIR, XRD, and XPS. The FTIR analysis of the synthesized composite powder corroborated the bonds between tungsten and oxygen in the WO3 molecules. Furthermore, the XRD spectra show a crystalline nature, while particle size analysis performed with the X-powder software shows average particle sizes of 24 and 25 nm for the samples. The structural analyses show that the addition of the silver dopant does not change the stoichiometry of the tungsten trioxide and only increases the size of the aggregation in the films. Furthermore, these films have average approximate roughnesses of about 10.7, 13.1 and 14.2 nm for samples 1, 2 and 3, respectively. The real and imaginary parts of the permittivity are investigated using an LCR meter in the frequency range 1 Hz-10 GHz. The optical spectra of the composite coatings are characterized in the 300-900 nm wavelength range, and the calculated optical band gaps exhibited a directly allowed transition with values of 3.8 and 3.85 eV. From UV-visible spectroscopy studies, the absorption coefficient of the composite thin films is determined to be of the order of 10^5 cm^-1, and the obtained refraction and extinction indices indicate normally dispersive coatings. Due to their optical and electrical properties, the synthesized composite material is a promising candidate for use in electro-optical applications.
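Software estimates of crystallite size from XRD peak broadening typically rest on the Scherrer equation. A minimal sketch, assuming Cu K-alpha radiation and a shape factor of 0.9; this is the standard relation, not the X-powder implementation.

```python
import numpy as np

def scherrer_size_nm(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite size from the Scherrer equation,
    size = K*lambda / (beta*cos(theta)), with beta the FWHM in radians."""
    beta = np.deg2rad(fwhm_deg)
    theta = np.deg2rad(two_theta_deg / 2.0)
    return k * wavelength_nm / (beta * np.cos(theta))
```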
The software architecture to control the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Oya, I.; Füßling, M.; Antonino, P. O.; Conforti, V.; Hagge, L.; Melkumyan, D.; Morgenstern, A.; Tosti, G.; Schwanke, U.; Schwarz, J.; Wegner, P.; Colomé, J.; Lyard, E.
2016-07-01
The Cherenkov Telescope Array (CTA) project is an initiative to build two large arrays of Cherenkov gamma-ray telescopes. CTA will be deployed as two installations, one in the northern and the other in the southern hemisphere, containing dozens of telescopes of different sizes. CTA is a big step forward in the field of ground-based gamma-ray astronomy, not only because of the expected scientific return, but also due to the order-of-magnitude larger scale of the instrument to be controlled. The performance requirements associated with such a large and distributed astronomical installation require a thoughtful analysis to determine the best software solutions. The array control and data acquisition (ACTL) work-package within the CTA initiative will deliver the software to control and acquire the data from the CTA instrumentation. In this contribution we present the current status of the formal ACTL system decomposition into software building blocks and the relationships among them. The system is modelled via the Systems Modelling Language (SysML) formalism. To cope with the complexity of the system, this architecture model is sub-divided into different perspectives. The relationships with the stakeholders and external systems are used to create the first perspective, the context of the ACTL software system. Use cases are employed to describe the interaction of those external elements with the ACTL system and are traced to a hierarchy of functionalities (abstract system functions) describing the internal structure of the ACTL system. These functions are then traced to fully specified logical elements (software components), the deployment of which as technical elements, is also described. This modelling approach allows us to decompose the ACTL software in elements to be created and the flow of information within the system, providing us with a clear way to identify sub-system interdependencies. This architectural approach allows us to build the ACTL system model and trace requirements to deliverables (source code, documentation, etc.), and permits the implementation of a flexible use-case driven software development approach thanks to the traceability from use cases to the logical software elements. The Alma Common Software (ACS) container/component framework, used for the control of the Atacama Large Millimeter/submillimeter Array (ALMA) is the basis for the ACTL software and as such it is considered as an integral part of the software architecture.
Kuwajima, Masaaki; Mendenhall, John M.; Lindsey, Laurence F.; Harris, Kristen M.
2013-01-01
Transmission-mode scanning electron microscopy (tSEM) on a field emission SEM platform was developed for efficient and cost-effective imaging of circuit-scale volumes from brain at nanoscale resolution. Image area was maximized while optimizing the resolution and dynamic range necessary for discriminating key subcellular structures, such as small axonal, dendritic and glial processes, synapses, smooth endoplasmic reticulum, vesicles, microtubules, polyribosomes, and endosomes which are critical for neuronal function. Individual image fields from the tSEM system were up to 4,295 µm2 (65.54 µm per side) at 2 nm pixel size, contrasting with image fields from a modern transmission electron microscope (TEM) system, which were only 66.59 µm2 (8.160 µm per side) at the same pixel size. The tSEM produced outstanding images and had reduced distortion and drift relative to TEM. Automated stage and scan control in tSEM easily provided unattended serial section imaging and montaging. Lens and scan properties on both TEM and SEM platforms revealed no significant nonlinear distortions within a central field of ∼100 µm2 and produced near-perfect image registration across serial sections using the computational elastic alignment tool in Fiji/TrakEM2 software, and reliable geometric measurements from RECONSTRUCT™ or Fiji/TrakEM2 software. Axial resolution limits the analysis of small structures contained within a section (∼45 nm). Since this new tSEM is non-destructive, objects within a section can be explored at finer axial resolution in TEM tomography with current methods. Future development of tSEM tomography promises thinner axial resolution producing nearly isotropic voxels and should provide within-section analyses of structures without changing platforms. Brain was the test system given our interest in synaptic connectivity and plasticity; however, the new tSEM system is readily applicable to other biological systems. PMID:23555711
Revealing the ISO/IEC 9126-1 Clique Tree for COTS Software Evaluation
NASA Technical Reports Server (NTRS)
Morris, A. Terry
2007-01-01
Previous research has shown that acyclic dependency models, if they exist, can be extracted from software quality standards and that these models can be used to assess software safety and product quality. In the case of commercial off-the-shelf (COTS) software, the extracted dependency model can be used in a probabilistic Bayesian network context for COTS software evaluation. Furthermore, while experts typically employ Bayesian networks to encode domain knowledge, secondary structures (clique trees) from Bayesian network graphs can be used to determine the probabilistic distribution of any software variable (attribute) using any clique that contains that variable. Secondary structures, therefore, provide insight into the fundamental nature of graphical networks. This paper will apply secondary structure calculations to reveal the clique tree of the acyclic dependency model extracted from the ISO/IEC 9126-1 software quality standard. Suggestions will be provided to describe how the clique tree may be exploited to aid efficient transformation of an evaluation model.
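The moralize-triangulate-join construction behind clique trees is short enough to sketch with networkx (assuming networkx 2.6 or later). This is the generic algorithm, not the paper's specific ISO/IEC 9126-1 model.

```python
import networkx as nx

def clique_tree(dag_edges):
    """Build a junction (clique) tree from a Bayesian-network DAG:
    moralize, triangulate, then join cliques by maximum separator weight."""
    dag = nx.DiGraph(dag_edges)
    moral = nx.moral_graph(dag)                       # marry parents, drop arrows
    chordal, _ = nx.complete_to_chordal_graph(moral)  # triangulation
    cliques = list(nx.chordal_graph_cliques(chordal))
    # Clique graph weighted by separator size; a maximum spanning tree of it
    # satisfies the running-intersection property for a connected model.
    cg = nx.Graph()
    cg.add_nodes_from(cliques)
    for i, a in enumerate(cliques):
        for b in cliques[i + 1:]:
            if a & b:
                cg.add_edge(a, b, weight=len(a & b))
    return nx.maximum_spanning_tree(cg)
```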
Next-Generation Lightweight Mirror Modeling Software
NASA Technical Reports Server (NTRS)
Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, Phil
2013-01-01
The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics has created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000 + element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements, suspensions can be created either for each individual petal or the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements only takes 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, and petal size, suspension geometry with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor, all the key shell thickness parameters are accessible and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN as GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models possible.
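The kind of parametric model generation described reduces, at its simplest, to looping over a grid and emitting node and element cards. A toy free-field NASTRAN-style writer in Python, far simpler than the modeler described (property, material, and suspension cards omitted; the card layout is a hedged sketch):

```python
def write_quad_plate(path, nx_, ny_, dx, dy):
    """Write a rectangular grid of CQUAD4 shell elements as a free-field
    (comma-separated) NASTRAN-style deck. nx_, ny_: element counts;
    dx, dy: element sizes."""
    nid = lambda i, j: j * (nx_ + 1) + i + 1   # node numbering, row-major
    with open(path, "w") as f:
        for j in range(ny_ + 1):
            for i in range(nx_ + 1):
                f.write(f"GRID,{nid(i, j)},,{i * dx:.3f},{j * dy:.3f},0.0\n")
        eid = 1
        for j in range(ny_):
            for i in range(nx_):
                f.write(f"CQUAD4,{eid},1,{nid(i, j)},{nid(i + 1, j)},"
                        f"{nid(i + 1, j + 1)},{nid(i, j + 1)}\n")
                eid += 1
```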
Next Generation Lightweight Mirror Modeling Software
NASA Technical Reports Server (NTRS)
Arnold, William; Fitzgerald, Matthew; Stahl, Philip
2013-01-01
The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics has created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000 + element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique, single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements, suspensions can be created either for each individual petal or the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements only takes 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, and petal size, suspension geometry with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor, all the key shell thickness parameters are accessible and comments in deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN as GRIDPOINT SETS, this make integration of these models into large telescope or satellite models possible.
Next Generation Lightweight Mirror Modeling Software
NASA Technical Reports Server (NTRS)
Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip
2013-01-01
Advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep-core low-temperature fusion, Corning's continued improvements to the frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling models of 400,000+ elements can take weeks of effort, severely limiting the range of possible optimization variables. This paper introduces model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software handles any current mirror manufacturing technique, single substrates and multiple arrays of substrates, and can merge submodels into a single large model. The modeler generates both mirror and suspension-system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files that can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS, which makes integration of these models into large telescope or satellite models easier.
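To make the deck-generation idea concrete, here is a purely illustrative Python sketch of how a modeler might emit free-field grid cards for one hexagonal core cell and collect its attachment nodes into a named set. The file name, node IDs and comment conventions are invented for this sketch and do not reflect the actual NASA tool:

    import math

    def hex_cell_grids(first_id, cx, cy, radius):
        """Corner nodes (id, x, y, z) of one hexagonal core cell."""
        return [(first_id + k,
                 cx + radius * math.cos(math.radians(60 * k)),
                 cy + radius * math.sin(math.radians(60 * k)),
                 0.0) for k in range(6)]

    with open("mirror_cell.bdf", "w") as deck:
        # comment tags tie element groups to a sizing parameter, so an
        # optimizer (or a text editor) can find every wall of one thickness
        deck.write("$ core cell walls -- thickness parameter T_CORE\n")
        support_ids = []
        for gid, x, y, z in hex_cell_grids(1001, 0.0, 0.0, 50.0):
            deck.write(f"GRID,{gid},,{x:.3f},{y:.3f},{z:.3f}\n")
            support_ids.append(gid)
        # keep attachment nodes addressable after submodel merging
        deck.write("$ support attachment nodes\n")
        deck.write("SET 10 = " + ",".join(map(str, support_ids)) + "\n")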
University Software Ownership and Litigation: A First Examination
Rai, Arti K.; Allison, John R.; Sampat, Bhaven N.
2013-01-01
Software patents and university-owned patents represent two of the most controversial intellectual property developments of the last twenty-five years. Despite this reality, and concerns that universities act as “patent trolls” when they assert software patents in litigation against successful commercializers, no scholar has systematically examined the ownership and litigation of university software patents. In this Article, we present the first such examination. Our empirical research reveals that software patents represent a significant and growing proportion of university patent holdings. Additionally, the most important determinant of the number of software patents a university owns is not its research and development (“R&D”) expenditures (whether computer science-related or otherwise) but, rather, its tendency to seek patents in other areas. In other words, universities appear to take a “one size fits all” approach to patenting their inventions. This one-size-fits-all approach is problematic given the empirical evidence that software is likely to follow a different commercialization path than other types of invention. Thus, it is perhaps not surprising that we see a number of lawsuits in which university software patents have been used not for purposes of fostering commercialization but, instead, to extract rents in apparent holdup litigation. The Article concludes by examining whether this trend is likely to continue in the future, particularly given a 2006 Supreme Court decision that appears to diminish the holdup threat by recognizing the possibility of liability rules in patent suits, as well as recent case law that may call into question certain types of software patents. PMID:23750052
NASA Technical Reports Server (NTRS)
Gaffney, J. E., Jr.; Judge, R. W.
1981-01-01
A model of a software development process is described. The software development process is seen to consist of a sequence of activities, such as 'program design' and 'module development' (or coding). A manpower estimate is made by multiplying code size by the rates (man months per thousand lines of code) for each of the activities relevant to the particular case of interest and summing up the results. The effect of four objectively determinable factors (organization, software product type, computer type, and code type) on productivity values for each of nine principal software development activities was assessed. Four factors were identified which account for 39% of the observed productivity variation.
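A minimal sketch of this activity-rate estimating model. The activity names and rates below are hypothetical placeholders, not the values calibrated in the study:

    # man-months per thousand lines of code; illustrative values only
    RATES_MM_PER_KLOC = {
        "program_design": 1.2,
        "module_development": 2.0,
        "integration_test": 1.5,
    }

    def manpower_estimate(kloc, rates=RATES_MM_PER_KLOC):
        """Multiply code size by each activity rate and sum the results."""
        return sum(kloc * rate for rate in rates.values())

    print(manpower_estimate(25.0), "man-months for a 25-KLOC product")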
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. Somasundaran
The aim of the project is to develop a knowledge base to help the design of enhanced processes for mobilizing and extracting untrapped oil. We emphasize evaluation of novel surfactant mixtures and obtaining optimum combinations of the surfactants for efficient chemical flooding EOR processes. In this regard, an understanding of the aggregate shape, size and structure is crucial, since these properties govern the crude oil removal efficiency. During the three-year period, the adsorption and aggregation behavior of sugar-based surfactants and their mixtures with other types of surfactants was studied. Sugar-based surfactants are made from renewable resources, are nontoxic and biodegradable, and are miscible with water and oil. These environmentally benign surfactants feature high surface activity; good salinity, calcium and temperature tolerance; and unique adsorption behavior. They possess the characteristics required of oil flooding surfactants and have the potential to replace currently used surfactants in oil recovery. A novel analytical ultracentrifugation technique was employed, successfully and for the first time, to characterize the aggregate species present in mixed micellar solutions, owing to its powerful ability to separate particles based on their size and shape and to monitor them simultaneously. Analytical ultracentrifugation offers an unprecedented opportunity to obtain important information on mixed micelles, on structure-performance relationships for different surfactant aggregates in solution, and on their role in interfacial processes. Initial sedimentation velocity investigations were conducted using nonyl phenol ethoxylated decyl ether (NP-10) to choose the best analytical protocol, calculate the partial specific volume, and obtain information on the sedimentation coefficient and aggregation mass of the micelles. Four software packages (Optima XL-A/XL-I data analysis software, DCDT+, Svedberg and SEDFIT) were compared for the analysis of sedimentation velocity experimental data, and the results were compared with those from light scattering. Based on these tests, Svedberg and SEDFIT were chosen for further studies.
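For illustration, the sedimentation coefficient such sedimentation-velocity runs report follows from the standard relation s = (d ln r/dt)/omega^2, where r is the boundary radius and omega the rotor angular speed. The sketch below fits it from invented boundary positions; the rotor speed and radii are illustrative, not data from the project:

    import numpy as np

    omega = 2 * np.pi * 40000 / 60            # 40,000 rpm in rad/s (invented)
    t = np.array([0.0, 600.0, 1200.0, 1800.0])    # time, s
    r = np.array([6.00, 6.05, 6.10, 6.16])        # boundary radius, cm

    slope = np.polyfit(t, np.log(r), 1)[0]    # d(ln r)/dt
    s = slope / omega**2                      # sedimentation coefficient, s
    print(f"s = {s / 1e-13:.1f} S")           # 1 svedberg = 1e-13 s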
NASA Astrophysics Data System (ADS)
Hermens, Ulrike; Pothen, Mario; Winands, Kai; Arntz, Kristian; Klocke, Fritz
2018-02-01
Laser-induced periodic surface structures (LIPSS) have found particular applications in the field of surface functionalization and have been investigated for many years. The direction of these ripple structures, which have a periodicity at the nanoscale, can be manipulated by changing the laser polarization. For industrial use it is desirable to manipulate the direction of these structures automatically and to obtain smooth changes of their orientation without any visible inhomogeneity. However, no system solution currently exists that controls the polarization direction in a fully automated way within a single software environment. In this paper, a system solution is presented that includes a liquid crystal polarizer to control the polarization direction. It is synchronized with a scanner, a dynamic beam expander and a five-axis system, and it provides fast switching times and small step sizes. First results of fabricated structures are also presented. In a systematic study, the conjunction of LIPSS with different orientations in two parallel line scans has been investigated.
Selected issues of the universal communication environment implementation for CII standard
NASA Astrophysics Data System (ADS)
Zagoździńska, Agnieszka; Poźniak, Krzysztof T.; Drabik, Paweł K.
2011-10-01
In the contemporary FPGA market there is a wide assortment of structures, integrated development environments, and boards from different producers. This variety allows designers to fit resources to the requirements of an individual project, but it also creates a need to standardize projects so that they remain usable in research laboratories equipped with tools from different producers. The proposed solution is CII standardization of VHDL components. This paper specifies a universal communication environment for the CII standard. The link can be used in FPGA structures of different producers, and its implementation enables object-oriented VHDL programming with the use of CII standardization. The environment as a whole comprises an FPGA environment and PC software. The paper describes selected issues of the FPGA environment, including specific solutions that enable its use in structures from different producers, and presents the flexibility of transmitting data of different sizes with CII. The specified tool makes it possible to exploit the full variety of FPGA structures and to design faster and more effectively.
Design of a dual band metamaterial absorber for Wi-Fi bands
NASA Astrophysics Data System (ADS)
Alkurt, Fatih Özkan; Baǧmancı, Mehmet; Karaaslan, Muharrem; Bakır, Mehmet; Altıntaş, Olcay; Karadaǧ, Faruk; Akgöl, Oǧuzhan; Ünal, Emin
2018-02-01
The goal of this work is the design and fabrication of a dual band metamaterial-based absorber for Wireless Fidelity (Wi-Fi) bands. Wi-Fi has two operating frequencies, 2.45 GHz and 5 GHz. The proposed dual band absorber consists of two layered unit cells, with different-sized square split ring (SSR) resonators located on each layer. Copper is used for the metal layer and the resonator structure, and FR-4 is used as the substrate layer. The designed absorber targets the two wireless bands centered at 2.45 GHz and 5 GHz. Finite Integration Technique (FIT) based simulation software was used; according to the FIT simulation results, the absorption peak at 2.45 GHz is about 90%, and the peak at 5 GHz is near 99%. In addition, the proposed structure has potential for energy harvesting applications in future work.
Effect of Anions on Nanofiber Formation of β-sheet Propensity Amphiphile Peptide
NASA Astrophysics Data System (ADS)
Shamsudeen, H.; Tan, H. L.; Eshak, Z.
2018-05-01
Peptide self-assembly forms different nanostructures under simple alterations of the solution environment. Understanding the mechanism of the assembly will help us control and tailor functional nanomaterials. This study aims to investigate the influence of anions on self-assembly morphology and shape using a synthetic peptide, FFFFKK. Circular Dichroism (CD) and Environmental Scanning Electron Microscopy (ESEM) were used to determine the secondary structure and self-assembly morphology, while ImageJ imaging software was used to measure diameter size. In the absence of anions, FFFFKK formed an anti-parallel β-sheet that adopted a sizeable fibrillar structure with minimal growth over the first 7 hours of assembly. An irregular structure was observed in the presence of iodide ion (I-), with less stable secondary structures such as β-turn and β-loop. In the presence of perchlorate ion (ClO4-), a needle-like structure was observed with a predominantly β-sheet structure. Our study showed that peptide morphology can be controlled by using different anions together with careful selection of amino acid residues in the peptide sequence.
NASA Astrophysics Data System (ADS)
Foucher, Johann; Labrosse, Aurelien; Dervillé, Alexandre; Zimmermann, Yann; Bernard, Guilhem; Martinez, Sergio; Grönqvist, Hanna; Baderot, Julien; Pinzan, Florian
2017-03-01
The development and integration of new materials and structures at the nanoscale require multiple parallel characterizations in order to control physico-chemical properties as a function of the application. These include physical properties such as size, shape, specific surface area, aspect ratio, agglomeration/aggregation state, size distribution, surface morphology/topography, structure (including crystallinity and defect structure) and solubility, and chemical properties such as structural formula/molecular structure, composition (including degree of purity and known impurities or additives), phase identity, surface chemistry (composition, charge, tension, reactive sites, physical structure, photocatalytic properties, zeta potential) and hydrophilicity/lipophilicity. Depending on the final material formulation (aerosol, powder, nanostructuration, etc.) and the industrial application (semiconductor, cosmetics, chemistry, automotive, etc.), a fleet of complementary characterization instruments must be used in synergy for accurate process tuning and high production yield. This synergy between instruments, so-called hybrid metrology, consists in using the strength of each technique to reduce the global uncertainty, for better and faster process control. The only way to succeed at this exercise is to use a data fusion methodology. In this paper, we introduce the work that has been done to create the first generic hybrid metrology software platform dedicated to nanotechnology process control. The first part is dedicated to modeling a process flow tied to a fleet of metrology tools. The second part introduces the concept of an entity model, which describes the various parameters that have to be extracted; the entity model is fed by data analysis appropriate to the application (automatic or semi-automated). The final part introduces two ways of performing data fusion on real data coming from imaging (SEM, TEM, AFM) and non-imaging (SAXS) techniques. The first approach is high-level fusion: combining various populations of results from homogeneous or heterogeneous tools, taking into account the precision and repeatability of each, to obtain a new, more accurate result. The second approach is deep-level fusion: combining raw data from various tools in order to create new raw data; on this basis we introduce the concept of a virtual tool creator. As a conclusion, we discuss the implementation of hybrid metrology in a semiconductor environment for advanced process control.
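As a concrete, hedged illustration of the high-level fusion idea, the sketch below combines hypothetical line-width measurements from several tools by inverse-variance weighting, one standard way to merge results while accounting for each tool's precision; it is not the platform's actual algorithm:

    import numpy as np

    def fuse(values, sigmas):
        """Inverse-variance weighted combination of tool results."""
        w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
        fused = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
        return fused, np.sqrt(1.0 / np.sum(w))   # fused value, fused 1-sigma

    # hypothetical line-width results (nm) from SEM, TEM, AFM and SAXS
    value, sigma = fuse([24.8, 25.3, 24.5, 25.0], [0.6, 0.3, 0.8, 0.4])
    print(f"{value:.2f} +/- {sigma:.2f} nm")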
New software for statistical analysis of Cambridge Structural Database data
Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.
2011-01-01
A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused towards structural analysis such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784
NASA Astrophysics Data System (ADS)
Oliveira, Micael
The CECAM Electronic Structure Library (ESL) is a community-driven effort to segregate shared pieces of software as libraries that can be contributed to and used by the community. Besides allowing the community to share the burden of developing and maintaining complex pieces of software, these libraries can also become targets for re-coding by software engineers as hardware evolves, ensuring that electronic structure codes remain at the forefront of HPC trends. In a series of workshops hosted at the CECAM HQ in Lausanne, the tools and infrastructure for the project were prepared, and the first contributions were included and made available online (http://esl.cecam.org). In this talk I will present the different aspects and aims of the ESL and how these can be useful for the electronic structure community.
Hadjisolomou, Stavros P; El-Haddad, George
2017-01-01
Coleoid cephalopods (squid, octopus, and sepia) are renowned for their elaborate body patterning capabilities, which are employed for camouflage or communication. The specific chromatic appearance of a cephalopod, at any given moment, is a direct result of the combined action of their intradermal pigmented chromatophore organs and reflecting cells. Therefore, a lot can be learned about the cephalopod coloration system by video recording and analyzing the activation of individual chromatophores in time. The fact that adult cephalopods have small chromatophores, up to several hundred thousand in number, makes measurement and analysis over several seconds a difficult task. However, current advancements in videography enable high-resolution, high-framerate recording, which can be used to record chromatophore activity in more detail and with greater accuracy in both the space and time domains. In turn, the additional pixel information and extra frames per video from such recordings result in large video files of several gigabytes, even when the recording spans only a few minutes. We created a software plugin, "SpotMetrics," that can automatically analyze high-resolution, high-framerate video of chromatophore organ activation in time. This image analysis software can track hundreds of individual chromatophores over several hundred frames to provide measurements of size and color. This software may also be used to measure differences in chromatophore activation during different behaviors, which will contribute to our understanding of the cephalopod sensorimotor integration system. In addition, this software can potentially be utilized to detect numbers of round objects and size changes in time, such as eye pupil size or the number of bacteria in a sample. Thus, we are making this software plugin freely available as open source because we believe it will be of benefit to other colleagues both in the cephalopod biology field and in other disciplines.
The table of isotopes-8th edition and beyond
DOE Office of Scientific and Technical Information (OSTI.GOV)
Firestone, R.B.
A new edition of the Table of Isotopes has been published this year by John Wiley and Sons, Inc. This edition is the eighth in a series started by Glenn T. Seaborg in 1940. The two-volume, 3168-page, cloth-bound edition is twice the size of the previous edition published in 1978. It contains nuclear structure and decay data, based mainly on the Evaluated Nuclear Structure Data File (ENSDF), for >3100 isotopes and isomers. Approximately 24,000 references are cited, and the appendices have been updated and extended. The book is packaged with an interactive CD-ROM that contains the Table of Isotopes in Adobe Acrobat Portable Document Format for convenient viewing on personal computer (PC) and UNIX workstations. The CD-ROM version contains a chart-of-the-nuclides graphical index and separate indices organized for radioisotope users and nuclear structure physicists. More than 100,000 hypertext links are provided to move the user quickly through related information, free from the limitations of page size. Complete references with keyword abstracts are provided. The CD-ROM also contains the Table of Superdeformed Nuclear Bands and Fission Isomers; Tables of Atoms, Atomic Nuclei, and Subatomic Particles by Ivan P. Selinov; the ENSDF and nuclear structure reference (NSR) databases; the ENSDF manual by Jagdish K. Tuli; and Adobe Acrobat Reader software.
Development of new vibration energy flow analysis software and its applications to vehicle systems
NASA Astrophysics Data System (ADS)
Kim, D.-J.; Hong, S.-Y.; Park, Y.-H.
2005-09-01
Energy flow analysis (EFA) offers very promising results in predicting the noise and vibration responses of system structures in medium-to-high frequency ranges. We have developed energy flow finite element method (EFFEM) based software, EFADSC++ R4, for vibration analysis. The software can analyze system structures composed of beam, plate, spring-damper, rigid-body and many other elements, and offers many useful analysis functions. For convenient use, the software is modularized into translator, model-converter, and solver modules. The translator module makes it possible to use finite element (FE) models for the vibration analysis. The model-converter module changes an FE model into an energy flow finite element (EFFE) model, generates joint elements to capture the vibrational attenuation in complex structures composed of various elements, and solves the joint element equations very quickly using the wave transmission approach. The solver module supports various direct and iterative solvers for multi-DOF structures. Predictions of vibration in real vehicles using the developed software were performed successfully.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aragon, Kathryn M.; Eaton, Shelley M.; McCornack, Marjorie Turner
When a requirements engineering effort fails to meet expectations, the requirements management tool is often blamed. Working with numerous project teams at Sandia National Laboratories over the last fifteen years has shown us that the tool is rarely the culprit; usually it is the lack of a viable information architecture with well-designed processes to support requirements engineering. This document illustrates design concepts with rationale, as well as a proven information architecture, to structure and manage information in support of requirements engineering activities for any size or type of project. This generalized information architecture is specific to IBM's Rational DOORS (Dynamic Object Oriented Requirements System) software application, the requirements management tool in Sandia's Common Engineering Environment (CEE). It can be used as presented or as a foundation for designing a tailored information architecture for project-specific needs, and it may also be tailored for another software tool.
A daily global mesoscale ocean eddy dataset from satellite altimetry
Faghmous, James H.; Frenger, Ivy; Yao, Yuanshun; Warmka, Robert; Lindell, Aron; Kumar, Vipin
2015-01-01
Mesoscale ocean eddies are ubiquitous coherent rotating structures of water with radial scales on the order of 100 kilometers. Eddies play a key role in the transport and mixing of momentum and tracers across the World Ocean. We present a global daily mesoscale ocean eddy dataset that contains ~45 million mesoscale features and 3.3 million eddy trajectories that persist at least two days, as identified in the AVISO dataset over the period 1993–2014. This dataset, along with the open-source eddy identification software, makes it possible to extract eddies with any parameters (minimum size, lifetime, etc.), to study global eddy properties and dynamics, and to empirically estimate the impact eddies have on mass or heat transport. Furthermore, our open-source software may be used to identify mesoscale features in model simulations and compare them to observed features. Finally, this dataset can be used to study the interaction between mesoscale ocean eddies and other components of the Earth System. PMID:26097744
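A small illustration of how such a trajectory dataset might be filtered by eddy parameters; the record fields below are hypothetical stand-ins for the published format:

    trajectories = [                       # hypothetical records
        {"id": 1, "lifetime_days": 3,  "radius_km": 60.0},
        {"id": 2, "lifetime_days": 45, "radius_km": 110.0},
        {"id": 3, "lifetime_days": 20, "radius_km": 85.0},
    ]

    def select_eddies(tracks, min_lifetime_days=28, min_radius_km=50.0):
        """Keep only eddies that persist and are large enough to study."""
        return [t for t in tracks
                if t["lifetime_days"] >= min_lifetime_days
                and t["radius_km"] >= min_radius_km]

    print(select_eddies(trajectories))     # -> only eddy 2 survives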
NASA Astrophysics Data System (ADS)
Chávez, G. Moreno; Sarocchi, D.; Santana, E. Arce; Borselli, L.
2015-12-01
The study of grain size distribution is fundamental for understanding sedimentological environments. Through these analyses, clast erosion, transport and deposition processes can be interpreted and modeled. However, grain size distribution analysis can be difficult in some outcrops due to the number and complexity of the arrangement of clasts and matrix and their physical size. Despite various technological advances, it is almost impossible to obtain the full grain size distribution (from blocks to sand grain size) with a single method or instrument of analysis, so development in this area continues to be fundamental. In recent years, various methods of particle size analysis by automatic image processing have been developed, owing to their potential advantages over classical methods: speed, and the detail of the information produced (virtually for each analyzed particle). In this framework, we have developed a novel algorithm and software for grain size distribution analysis, based on color image segmentation using an entropy-controlled quadratic Markov measure field algorithm and on the Rosiwal method of counting intersections between clasts and linear transects in the images. We tested the new algorithm on different sedimentary deposit types from 14 varieties of sedimentological environments. The results of the new algorithm were compared with grain counts performed manually by experts using the same Rosiwal method. The new algorithm has the same accuracy as a classical manual count, but this innovative methodology is much easier to apply and dramatically less time-consuming. The productivity of analyzing clast deposits from recorded field outcrop images can thus be increased significantly.
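The Rosiwal intersection idea can be sketched on an already-segmented binary clast mask: walk evenly spaced transects and collect the intercepted chord lengths, whose distribution approximates the grain-size distribution. This is a simplified illustration, not the authors' entropy-controlled segmentation code:

    import numpy as np

    def chord_lengths(mask, n_transects=10):
        """Chord lengths (pixels) of clasts along evenly spaced rows."""
        rows = np.linspace(0, mask.shape[0] - 1, n_transects).astype(int)
        chords = []
        for row in rows:
            line = mask[row].astype(int)
            edges = np.diff(np.concatenate(([0], line, [0])))
            starts = np.where(edges == 1)[0]   # run begins
            ends = np.where(edges == -1)[0]    # run ends
            chords.extend(ends - starts)
        return np.array(chords)

    demo = np.zeros((100, 100), dtype=bool)
    demo[40:60, 10:30] = True                  # one synthetic clast
    print(chord_lengths(demo, n_transects=20)) # chords of 20 pixels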
Demonstration of Multi- and Single-Reader Sample Size Program for Diagnostic Studies software.
Hillis, Stephen L; Schartz, Kevin M
2015-02-01
The recently released software Multi- and Single-Reader Sample Size Program for Diagnostic Studies, written by Kevin Schartz and Stephen Hillis, performs sample size computations for diagnostic reader-performance studies. The program computes the sample size needed to detect a specified difference in a reader performance measure between two modalities, when using the analysis methods initially proposed by Dorfman, Berbaum, and Metz (DBM) and Obuchowski and Rockette (OR), and later unified and improved by Hillis and colleagues. A commonly used reader performance measure is the area under the receiver-operating-characteristic curve. The program can be used with typical reader-performance measures, estimated either parametrically or nonparametrically. The program has an easy-to-use, step-by-step intuitive interface that walks the user through entry of the needed information. Features of the software include the following: (1) a choice of several study designs; (2) a choice of inputs obtained from either OR or DBM analyses; (3) a choice of three inference situations: both readers and cases random, readers fixed and cases random, and readers random and cases fixed; (4) a choice of two types of hypotheses: equivalence or noninferiority; (5) a choice of two output formats: power for specified case and reader sample sizes, or a listing of case-reader combinations that provide a specified power; (6) a choice of single- or multi-reader analyses; and (7) functionality in Windows, Mac OS, and Linux.
NASA Astrophysics Data System (ADS)
Markelin, L.; Honkavaara, E.; Näsi, R.; Nurminen, K.; Hakala, T.
2014-08-01
Remote sensing based on unmanned airborne vehicles (UAVs) is a rapidly developing field of technology. UAVs enable accurate, flexible, low-cost and multiangular measurements of 3D geometric, radiometric, and temporal properties of land and vegetation using various sensors. In this paper we present a geometric processing chain for a multiangular measurement system designed for measuring object directional reflectance characteristics in a wavelength range of 400-900 nm. The technique is based on a novel, lightweight spectral camera designed for UAV use. The multiangular measurement is conducted by collecting vertical and oblique area-format spectral images. End products of the geometric processing are image exterior orientations, 3D point clouds and digital surface models (DSM). These data are needed for the radiometric processing chain that produces reflectance image mosaics and multiangular bidirectional reflectance factor (BRF) observations. The geometric processing workflow consists of the following three steps: (1) determining approximate image orientations using Visual Structure from Motion (VisualSFM) software; (2) optionally, calculating improved orientations and sensor calibration using a method based on self-calibrating bundle block adjustment (standard photogrammetric software); and (3) creating dense 3D point clouds and DSMs using Photogrammetric Surface Reconstruction from Imagery (SURE) software, which is based on a semi-global matching algorithm and is capable of providing a point density corresponding to the pixel size of the image. We have tested the geometric processing workflow over various targets, including test fields, agricultural fields, lakes and complex 3D structures like forests.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKinney, Adriana L.; Varga, Tamas
Branching structures such as lungs, blood vessels and plant roots play a critical role in life; their growth, structure, and function have an immense effect on our lives. Therefore, quantitative size information on such structures in their native environment is invaluable for studying their growth and the effect of the environment on them. X-ray computed tomography (XCT) has been an effective tool for in situ imaging and analysis of branching structures. We developed a free tool that approximates the surface area and volume of branching structures. Our methodology of noninvasive imaging, segmentation and extraction of quantitative information is demonstrated through the analysis of a plant root in its soil medium from 3D tomography data. XCT data collected on a grass specimen were used to visualize its root structure. A suite of open-source software was employed to segment the root from the soil and determine its isosurface, which was used to calculate its volume and surface area. This methodology for processing 3D data is applicable to other branching structures, even when the structure of interest has x-ray attenuation similar to its environment and sample segmentation is difficult.
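A hedged sketch of the volume and surface step, assuming the root has already been segmented into a boolean mask. It uses scikit-image's marching cubes, one plausible choice among the open-source tools mentioned, with an invented voxel spacing:

    import numpy as np
    from skimage import measure

    mask = np.zeros((60, 60, 60), dtype=bool)
    mask[20:40, 20:40, 10:50] = True              # toy "root" segment

    spacing = (0.05, 0.05, 0.05)                  # mm per voxel, invented
    volume_mm3 = mask.sum() * np.prod(spacing)    # volume from voxel count

    # isosurface of the segmented region, then its area
    verts, faces, _, _ = measure.marching_cubes(mask.astype(np.uint8),
                                                0.5, spacing=spacing)
    surface_mm2 = measure.mesh_surface_area(verts, faces)
    print(volume_mm3, surface_mm2)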
Magnetic properties of M0.3Fe2.7O4 (M = Fe, Zn and Mn) ferrites nanoparticles
NASA Astrophysics Data System (ADS)
Modaresi, Nahid; Afzalzadeh, Reza; Aslibeiki, Bagher; Kameli, Parviz
2018-06-01
In the present article, a comparative study of the structural and magnetic properties of nano-sized M0.3Fe0.7Fe2O4 (M = Fe, Zn and Mn) ferrites is reported. The X-ray diffraction (XRD) patterns show that the crystallite size depends on the cation distribution. Rietveld refinement of the XRD patterns using the MAUD software determines the distribution of cations and the unit cell dimensions. The magnetic measurements show that the maximum and minimum values of saturation magnetization are obtained for the Zn- and Mn-doped samples, respectively. The peak temperature of the AC magnetic susceptibility of the Zn- and Fe-doped samples below 300 K indicates superparamagnetic behavior in these samples at room temperature. The AC susceptibility results confirm the presence of strong interactions between the nanoparticles, which leads to a superspin glass state in the samples at low temperatures.
Practical design and evaluation methods of omnidirectional vision sensors
NASA Astrophysics Data System (ADS)
Ohte, Akira; Tsuzuki, Osamu
2012-01-01
A practical omnidirectional vision sensor, consisting of a curved mirror, a mirror-supporting structure, and a megapixel digital imaging system, can view a field of 360 deg horizontally and 135 deg vertically. The authors theoretically analyzed and evaluated several curved mirrors, namely, a spherical mirror, an equidistant mirror, and a single viewpoint mirror (hyperboloidal mirror). The focus of their study was mainly on the image-forming characteristics, position of the virtual images, and size of blur spot images. The authors propose here a practical design method that satisfies the required characteristics. They developed image-processing software for converting circular images to images of the desired characteristics in real time. They also developed several prototype vision sensors using spherical mirrors. Reports dealing with virtual images and blur-spot size of curved mirrors are few; therefore, this paper will be very useful for the development of omnidirectional vision sensors.
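For illustration, converting the circular image of such a mirror into a panoramic strip is essentially a polar-to-Cartesian resampling. The sketch below assumes an equidistant mirror, where elevation maps linearly to image radius (a different mirror profile would need a different radius mapping), and uses nearest-neighbor sampling for brevity:

    import numpy as np

    def unwrap(circ, cx, cy, r_min, r_max, out_h=90, out_w=360):
        """Resample a circular mirror image into an azimuth/elevation strip."""
        pano = np.zeros((out_h, out_w), dtype=circ.dtype)
        for i in range(out_h):                     # elevation -> radius
            r = r_min + (r_max - r_min) * i / (out_h - 1)
            for j in range(out_w):                 # azimuth -> angle
                a = 2 * np.pi * j / out_w
                x = int(round(cx + r * np.cos(a)))
                y = int(round(cy + r * np.sin(a)))
                if 0 <= y < circ.shape[0] and 0 <= x < circ.shape[1]:
                    pano[i, j] = circ[y, x]
        return pano

    img = (np.arange(200 * 200).reshape(200, 200) % 255).astype(np.uint8)
    print(unwrap(img, cx=100, cy=100, r_min=20, r_max=95).shape)  # (90, 360)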
NASA Astrophysics Data System (ADS)
Koestner, Stefan
2009-09-01
With the increasing size and complexity of today's experiments in high energy physics, the work required to integrate a complete subdetector into an experiment control system is often underestimated. We report here on the layered software structure and protocols used by the LHCb experiment to control its detectors and readout boards. The experiment control system of LHCb is based on the commercial SCADA system PVSS II. Readout boards outside the radiation area are accessed via embedded credit-card-sized PCs connected to a large local area network. The SPECS protocol is used for control of the front-end electronics. Finite state machines are introduced to facilitate the control of a large number of electronic devices and to model the whole experiment at the level of an expert system.
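A minimal finite-state-machine sketch in the spirit described: model each device as a state plus allowed command transitions, so that large device trees can be commanded uniformly. The states, commands and board name are invented for illustration; the real system is built on the commercial SCADA system PVSS II, not Python:

    class DeviceFSM:
        """Toy device node: a state plus allowed command transitions."""
        TRANSITIONS = {
            ("OFF", "configure"): "READY",
            ("READY", "start"): "RUNNING",
            ("RUNNING", "stop"): "READY",
            ("READY", "reset"): "OFF",
        }

        def __init__(self, name):
            self.name, self.state = name, "OFF"

        def command(self, cmd):
            nxt = self.TRANSITIONS.get((self.state, cmd))
            if nxt is None:
                raise ValueError(f"{self.name}: {cmd!r} invalid in {self.state}")
            self.state = nxt
            return nxt

    board = DeviceFSM("readout_board_042")
    board.command("configure")
    print(board.command("start"))        # -> RUNNING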
Computer modeling design of a frame pier for a high-speed railway project
NASA Astrophysics Data System (ADS)
Shi, Jing-xian; Fan, Jiang
2018-03-01
In this paper, a double-track pier on a high-speed railway in China is taken as an example. The dimensions of each part are first drawn up; prestressing steel is designed for the crossbeam, and ordinary reinforcement is configured for the concrete piers. The frame pier is then modeled and calculated with the bridge structure analysis software Midas Civil and BSAS. The results show that the proposed beam and pier-column section sizes, with the crossbeam prestressed by 17-7V5 high-strength, low-relaxation steel strand, can meet the load-carrying requirements of a high-speed railway, and that a pier shaft reinforced with 28 mm diameter HRB400 main bars arranged in a ring around the pier can satisfy the eccentric-compression strength, stiffness and stability requirements as well as the requirements of seismic design.
Cellulose polymorphy, crystallite size, and the Segal crystallinity index
USDA-ARS?s Scientific Manuscript database
The X-ray diffraction-based Segal Crystallinity Index (CI) was calculated from simulations of different crystallite sizes for cellulose Iβ and II. The Mercury software was used, and the different crystallite sizes were based on different input peak widths at half of the maximum peak intensity (pwhm). The ...
Small but Pristine--Lessons for Small Library Automation.
ERIC Educational Resources Information Center
Clement, Russell; Robertson, Dane
1990-01-01
Compares the more positive library automation experiences of a small public library with those of a large research library. Topics addressed include collection size; computer size and the need for outside control of a data processing center; staff size; selection process for hardware and software; and accountability. (LRW)
Software Engineering Guidebook
NASA Technical Reports Server (NTRS)
Connell, John; Wenneson, Greg
1993-01-01
The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.
Chen, Weijie; Wunderlich, Adam; Petrick, Nicholas; Gallas, Brandon D.
2014-01-01
We treat multireader multicase (MRMC) reader studies for which a reader’s diagnostic assessment is converted to binary agreement (1: agree with the truth state, 0: disagree with the truth state). We present a mathematical model for simulating binary MRMC data with a desired correlation structure across readers, cases, and two modalities, assuming the expected probability of agreement is equal for the two modalities (P1=P2). This model can be used to validate the coverage probabilities of 95% confidence intervals (of P1, P2, or P1−P2 when P1−P2=0), validate the type I error of a superiority hypothesis test, and size a noninferiority hypothesis test (which assumes P1=P2). To illustrate the utility of our simulation model, we adapt the Obuchowski–Rockette–Hillis (ORH) method for the analysis of MRMC binary agreement data. Moreover, we use our simulation model to validate the ORH method for binary data and to illustrate sizing in a noninferiority setting. Our software package is publicly available on the Google code project hosting site for use in simulation, analysis, validation, and sizing of MRMC reader studies with binary agreement data. PMID:26158051
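One simple way to generate such correlated binary data, shown here as a hedged sketch rather than the paper's exact model, is a Gaussian copula: draw normals correlated across readers, then threshold them so each score equals 1 (agreement with truth) with probability p:

    import numpy as np
    from scipy import stats

    def simulate_binary_mrmc(n_readers, n_cases, p=0.8, rho=0.3, seed=0):
        """Correlated reader-by-case agreement scores (1 = agrees with truth)."""
        rng = np.random.default_rng(seed)
        cov = np.full((n_readers, n_readers), rho)   # compound symmetry
        np.fill_diagonal(cov, 1.0)
        z = rng.multivariate_normal(np.zeros(n_readers), cov, size=n_cases)
        return (z < stats.norm.ppf(p)).astype(int)   # P(score = 1) = p

    scores = simulate_binary_mrmc(n_readers=5, n_cases=200)
    print(scores.mean(axis=0))    # per-reader agreement rates near 0.8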
NASA Astrophysics Data System (ADS)
Mohan, N. S.; Kulkarni, S. M.
2018-01-01
Polymer-based composites have marked their valuable presence in the aerospace, defense and automotive industries. Composite components are assembled to the main structure by fasteners, which require accurate, precise, high-quality holes to be drilled. Drilling holes in composites with accuracy requires control over various process parameters, viz., speed, feed, drill bit size and specimen thickness. A TRIAC VMC machining center was used to drill the holes and to relate the cutting and machining parameters to the torque, and MINITAB 14 software was used to analyze the collected data. As a function of cutting and specimen parameters, this method can be useful for predicting torque. The purpose of this work is to investigate the effect of the drilling parameters in order to obtain low torque values. Results show that specimen thickness and drill bit size are the significant parameters influencing the torque, while spindle speed and feed rate have the least influence; an overlaid plot indicates a feasible, low-torque region for medium to large drill bits over the range of spindle speeds selected. Response surface contour plots indicate the sensitivity of the torque to drill size and specimen thickness.
Flow dynamics in bioreactors containing tissue engineering scaffolds.
Lawrence, Benjamin J; Devarapalli, Mamatha; Madihally, Sundararajan V
2009-02-15
Bioreactors are widely used in tissue engineering as a way to distribute nutrients within porous materials and provide the physical stimulus required by many tissues. However, the fluid dynamics within large porous structures are not well understood. In this study, we explored the effect of reactor geometry by using rectangular and circular reactors with three different inlet and outlet patterns. Geometries were simulated with and without the porous structure using the computational fluid dynamics software COMSOL Multiphysics 3.4 and ANSYS CFX 11, respectively. Residence time distribution analysis using a step change of a tracer within the reactor revealed non-ideal fluid distribution characteristics within the reactors. The Brinkman equation was used to model the permeability characteristics within the chitosan porous structure. Pore size was varied from 10 to 200 microm, and the number of pores per unit area was varied from 15 to 1,500 pores/mm(2). The effect of cellular growth and tissue remodeling on flow distribution was also assessed by changing the pore size (85 to 10 microm) while keeping the number of pores per unit area constant. These results showed a significant increase in pressure with reduction in pore size, which could limit fluid flow and nutrient transport; the measured pressure drop was marginally higher than the simulation results. Maximum shear stress was similar in both reactors and ranged approximately 0.2-0.3 dynes/cm(2). The simulations were validated experimentally using both a rectangular and a circular bioreactor, constructed in-house. Porous structures for the experiments were formed from 0.5% chitosan solution freeze-dried at -80 degrees C, and the pressure drop across the reactor was monitored.
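For reference, a common statement of the Brinkman model used for this kind of porous-scaffold flow (a standard textbook form, not necessarily the exact formulation in the paper) is

    \nabla p = -\frac{\mu}{\kappa}\,\mathbf{u} + \mu_{e}\,\nabla^{2}\mathbf{u},
    \qquad \nabla \cdot \mathbf{u} = 0,

where u is the superficial velocity, p the pressure, μ the fluid viscosity, μ_e an effective viscosity, and κ the permeability of the porous structure. Reducing pore size lowers κ and thus raises the pressure drop, consistent with the trend reported above.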
Computer-based mechanical design of overhead lines
NASA Astrophysics Data System (ADS)
Rusinaru, D.; Bratu, C.; Dinu, R. C.; Manescu, L. G.
2016-02-01
Besides performance, a safety level meeting current standards is a compulsory condition for the operation of distribution grids. Some of the measures that improve overhead line reliability call for modernization of the installations. The constraints imposed on new line components refer to technical aspects such as thermal stress or voltage drop, but also seek economic efficiency. The mechanical sizing of overhead lines is, after all, an optimization problem: the task in designing an overhead line profile is to size poles, cross-arms and stays and to locate poles along the line route so that the total cost of the line's structure is minimized while the technical and safety constraints are fulfilled. The authors present in this paper an application for computer-based mechanical design of overhead lines and the features of the corresponding Visual Basic program, adapted to distribution lines. The constraints of the optimization problem are adjusted to the existing weather and loading conditions of Romania. The outputs of the software application are: the list of components chosen for the line (poles, cross-arms, stays); the list of conductor tensions and forces for each pole, cross-arm and stay under different weather conditions; and the line profile drawings. The main features of the software are interactivity, a local optimization function and a high-level user interface.
NASA Technical Reports Server (NTRS)
Key, Samuel W.
1993-01-01
The explicit transient dynamics technology in use today for simulating the impact and subsequent transient dynamic response of a structure has its origins in the 'hydrocodes' dating back to the late 1940s. The growth in capability of explicit transient dynamics technology parallels the growth in speed and size of digital computers. Computer software for simulating the explicit transient dynamic response of a structure is characterized by algorithms that use a large number of small steps. In explicit transient dynamics software there is a significant emphasis on speed and simplicity. The finite element technology used to generate the spatial discretization of a structure is based on a compromise between completeness of the representation of the physical processes modeled and speed of execution. That is, since it is expected in every calculation that the deformation will be finite and the material will be strained beyond the elastic range, the geometry and the associated gradient operators must be reconstructed, and complex stress-strain models evaluated, at every time step. As a result, finite elements derived for explicit transient dynamics software use the simplest and barest constructions possible for computational efficiency, while retaining an essential representation of the physical behavior. The best example of this technology is the four-node bending quadrilateral derived by Belytschko, Lin and Tsay. Today, the speed, memory capacity and availability of computer hardware allow a number of the previously used algorithms to be 'improved.' That is, it is possible with today's computing hardware to modify many of the standard algorithms to improve their representation of the physical process at the expense of added complexity and computational effort. The purpose here is to review a number of these algorithms and identify the improvements possible. In many instances, both the older, faster version of the algorithm and the improved and somewhat slower version are found implemented together in software. Specifically, the following seven algorithmic items are examined: the invariant time derivatives of stress used in material models expressed in rate form; incremental objectivity and strain used in the numerical integration of the material models; the use of one-point element integration versus mean quadrature; shell elements used to represent the behavior of thin structural components; beam elements based on stress-resultant plasticity versus cross-section integration; the fidelity of elastic-plastic material models in their representation of ductile metals; and the use of Courant subcycling to reduce computational effort.
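The "large number of small steps" strategy is typically a central-difference (explicit) update, sketched below on a two-mass spring chain as a generic textbook illustration, not code from any program discussed. No global system of equations is ever solved, and the step size must stay below the stability limit set by the highest frequency:

    import numpy as np

    m = np.array([1.0, 1.0])                # lumped masses, kg
    k = 1000.0                              # spring stiffness, N/m
    # stable step must stay below 2/omega_max; omega_max = sqrt(2k/m) here
    dt = 0.9 * 2.0 / np.sqrt(2.0 * k / m.min())

    u = np.zeros(2)                         # displacements
    v = np.array([0.0, 1.0])                # initial velocity "impact"
    for _ in range(200):
        f_int = k * (u[1] - u[0]) * np.array([1.0, -1.0])  # spring force
        v += dt * f_int / m                 # explicit update: no global
        u += dt * v                         # matrix is ever assembled
    print(u)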
WalkThrough Example Procedures for MAMA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph
This documentation is a growing set of walk-through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities of the MAMA software, but addresses the use of many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.
System analysis tools for an ELT at ESO
NASA Astrophysics Data System (ADS)
Mueller, Michael; Koch, Franz
2006-06-01
Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a single component of the telescope, like the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper presents several software tools developed by the European Southern Observatory (ESO) which focus on this system approach in the analyses. Using modal results of a finite element analysis, the SMI toolbox allows easy generation of structural models of different sizes and levels of accuracy for control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition, a sparse state-space model object was developed for Matlab to gain computational efficiency and reduce memory requirements by exploiting the sparsity pattern of both the structural models and the control architecture. As a result, these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS; it performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.
BurnCase 3D software validation study: Burn size measurement accuracy and inter-rater reliability.
Parvizi, Daryousch; Giretzlehner, Michael; Wurzer, Paul; Klein, Limor Dinur; Shoham, Yaron; Bohanon, Fredrick J; Haller, Herbert L; Tuca, Alexandru; Branski, Ludwik K; Lumenta, David B; Herndon, David N; Kamolz, Lars-P
2016-03-01
The aim of this study was to compare the accuracy of burn size estimation using the computer-assisted software BurnCase 3D (RISC Software GmbH, Hagenberg, Austria) with that of a 2D scan, considered to be the actual burn size. Thirty artificial burn areas were preplanned and prepared on three mannequins (one child, one female, and one male). Five trained physicians (raters) were asked to assess the size of all wound areas using BurnCase 3D software. The results were then compared with the real wound areas, as determined by 2D planimetry imaging. To examine inter-rater reliability, we performed an intraclass correlation analysis with a 95% confidence interval. The mean wound area estimations of the five raters using BurnCase 3D were in total 20.7±0.9% for the child, 27.2±1.5% for the female and 16.5±0.1% for the male mannequin. Our analysis showed relative overestimations of 0.4%, 2.8% and 1.5% for the child, female and male mannequins respectively, compared to the 2D scan. The intraclass correlation between the single raters for the mean percentage of the artificial burn areas was 98.6%; a high intraclass correlation between the single raters and the 2D scan was also observed. BurnCase 3D is a valid and reliable tool for the determination of total body surface area burned in standard models. Further clinical studies including different pediatric and overweight adult mannequins are warranted.
Does filler database size influence identification accuracy?
Bergold, Amanda N; Heaton, Paul
2018-06-01
Police departments increasingly use large photo databases to select lineup fillers using facial recognition software, but this technological shift's implications have been largely unexplored in eyewitness research. Database use, particularly if coupled with facial matching software, could enable lineup constructors to increase filler-suspect similarity and thus enhance eyewitness accuracy (Fitzgerald, Oriet, Price, & Charman, 2013). However, with a large pool of potential fillers, such technologies might theoretically produce lineup fillers too similar to the suspect (Fitzgerald, Oriet, & Price, 2015; Luus & Wells, 1991; Wells, Rydell, & Seelau, 1993). This research proposes a new factor, filler database size, as a lineup feature affecting eyewitness accuracy. In a facial recognition experiment, we selected lineup fillers in a legally realistic manner using facial matching software applied to filler databases of 5,000, 25,000, and 125,000 photos, and found that larger databases are associated with a higher objective similarity rating between suspects and fillers and lower overall identification accuracy. In target-present lineups, witnesses viewing lineups created from the larger databases were less likely to make correct identifications and more likely to select known-innocent fillers. When the target was absent, database size was associated with a lower rate of correct rejections and a higher rate of filler identifications. Higher algorithmic similarity ratings were also associated with decreases in eyewitness identification accuracy. The results suggest that using facial matching software to select fillers from large photograph databases may reduce identification accuracy, and provide support for filler database size as a meaningful system variable.
Update on HCDstruct - A Tool for Hybrid Wing Body Conceptual Design and Structural Optimization
NASA Technical Reports Server (NTRS)
Gern, Frank H.
2015-01-01
HCDstruct is a Matlab®-based software tool to rapidly build a finite element model for structural optimization of hybrid wing body (HWB) aircraft at the conceptual design level. The tool uses outputs from a Flight Optimization System (FLOPS) performance analysis, together with a conceptual outer mold line of the vehicle, e.g. created by Vehicle Sketch Pad (VSP), to generate a set of MSC Nastran® bulk data files. These files can readily be used to perform a structural optimization and weight estimation using Nastran's® Solution 200 multidisciplinary optimization solver. Initially developed at NASA Langley Research Center to perform increased-fidelity conceptual-level HWB centerbody structural analyses, HCDstruct has grown into a complete HWB structural sizing and weight estimation tool, including a fully flexible aeroelastic loads analysis. Recent upgrades to the tool include expansion to a full wingtip-to-wingtip model for asymmetric analyses such as engine-out conditions and dynamic overswings, as well as a fully actuated trailing edge featuring up to 15 independently actuated control surfaces and twin tails. Several example applications of the HCDstruct tool are presented.
Visualization of instationary flows by particle traces
NASA Astrophysics Data System (ADS)
Raasch, S.
A study is presented in which the output of atmospheric flow models is visualized by computer movies. The structure and evolution of the flow are visualized by releasing weightless particles at the locations of the model grid points at distinct, equally spaced times; these particles are then simply advected by the flow. To avoid useless accumulation of particles, they can be given a limited lifetime. Scalar quantities can be shown in addition, using color-shaded contours as background information. A movie with several examples of atmospheric flows, for example convection in the atmospheric boundary layer, slope winds, land-sea breeze and Kelvin-Helmholtz waves, is presented. The simulations are performed by two-dimensional and three-dimensional nonhydrostatic finite difference models. Graphics are produced using the UNIRAS software, and the graphic output is in the form of CGM metafiles. The single frames are stored on an ABEKAS real-time video disc and then transferred to a BETACAM-SP tape recorder. The graphic software produces two-dimensional pictures; for example, only cross sections of three-dimensional simulations can be shown. To produce a movie of typically 90 seconds duration, the graphic software and the particle model need about 10 hours of CPU time on a CDC CYBER 990, and the CGM metafile has a size of about 1.4 GByte.
Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin
2013-01-01
One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turnaround times often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that needs to be processed. In this work, we have developed a software platform that is designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, cluster, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionalities. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a 10-times performance improvement on an 8-core workstation over the original sequential implementation of the system. PMID:23366803
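The core idea, size-adaptive distributable block volumes, amounts to cutting the volume into blocks that can be dispatched to workers independently. The Python/NumPy sketch below is only an illustration of that decomposition under our own assumptions (it is not the platform's code), with a median filter standing in for a real per-block operation:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from scipy.ndimage import median_filter  # stand-in for a real per-block step

def iter_blocks(shape, block):
    """Yield index slices tiling a 3D shape; edge blocks come out smaller,
    which is the 'size-adaptive' part of the idea."""
    for z in range(0, shape[0], block[0]):
        for y in range(0, shape[1], block[1]):
            for x in range(0, shape[2], block[2]):
                yield (slice(z, z + block[0]),
                       slice(y, y + block[1]),
                       slice(x, x + block[2]))

def _work(block):
    return median_filter(block, size=3)

def process_volume(volume, block=(64, 64, 64), workers=8):
    """Apply the per-block operation in parallel and reassemble the result.
    Real systems also exchange halo voxels between neighbouring blocks so
    that filters are seamless across block faces; that is omitted here."""
    out = np.empty_like(volume)
    slices = list(iter_blocks(volume.shape, block))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for sl, result in zip(slices,
                              pool.map(_work, (volume[s] for s in slices))):
            out[sl] = result
    return out
```

The block size is the natural tuning knob: smaller blocks balance load better across cores, larger blocks amortize scheduling and transfer overhead.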
Building Energy Management Open Source Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, Saifur
Funded by the U.S. Department of Energy in November 2013, the Building Energy Management Open Source Software (BEMOSS) platform was engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. According to the Energy Information Administration (EIA), small (5,000 square feet or smaller) and medium-sized (5,001 to 50,000 square feet) commercial buildings constitute about 95% of all commercial buildings in the U.S. These buildings typically do not have Building Automation Systems (BAS) to monitor and control building operation. While commercial BAS solutions exist, including those from Siemens, Honeywell, Johnson Controls, and many more, they are not cost-effective in the context of small- and medium-sized commercial buildings, and they typically work only with specific controller products from the same company. BEMOSS targets small- and medium-sized commercial buildings to address this gap.
Laval University and Lakehead University Experiments at TREC 2015 Contextual Suggestion Track
2015-11-20
Department of Computer Science and Software Engineering, Laval University; Department of Software Engineering, Lakehead University. Abstract—In this... Linear Regression and LambdaMART perform poorly in this case, because the size of the training data per user is small (less than 50 samples). On the
Desktop Publishing: A Brave New World and Publishing from the Desktop.
ERIC Educational Resources Information Center
Lormand, Robert; Rowe, Jane J.
1988-01-01
The first of two articles presents basic selection criteria for desktop publishing software packages, including discussion of expectations, required equipment, training costs, publication size, desired software features, additional equipment needed, and quality control. The second provides a brief description of desktop publishing using the Apple…
SmartGrain: high-throughput phenotyping software for measuring seed shape through image analysis.
Tanabata, Takanari; Shibaya, Taeko; Hori, Kiyosumi; Ebana, Kaworu; Yano, Masahiro
2012-12-01
Seed shape and size are among the most important agronomic traits because they affect yield and market price. To obtain accurate seed size data, a large number of measurements are needed because there is little difference in size among seeds from one plant. To promote genetic analysis and selection for seed shape in plant breeding, efficient, reliable, high-throughput seed phenotyping methods are required. We developed SmartGrain software for high-throughput measurement of seed shape. This software uses a new image analysis method to reduce the time taken in the preparation of seeds and in image capture. Outlines of seeds are automatically recognized from digital images, and several shape parameters, such as seed length, width, area, and perimeter length, are calculated. To validate the software, we performed a quantitative trait locus (QTL) analysis for rice (Oryza sativa) seed shape using backcrossed inbred lines derived from a cross between japonica cultivars Koshihikari and Nipponbare, which showed small differences in seed shape. SmartGrain removed areas of awns and pedicels automatically, and several QTLs were detected for six shape parameters. The allelic effect of a QTL for seed length detected on chromosome 11 was confirmed in advanced backcross progeny; the cv Nipponbare allele increased seed length and, thus, seed weight. High-throughput measurement with SmartGrain reduced sampling error and made it possible to distinguish between lines with small differences in seed shape. SmartGrain could accurately recognize seed not only of rice but also of several other species, including Arabidopsis (Arabidopsis thaliana). The software is free to researchers.
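As an illustration of the kind of measurement described above, the listed shape parameters (length, width, area, perimeter) can be pulled from a thresholded, labeled image with scikit-image. This is a hedged sketch of the general approach, not SmartGrain's actual algorithm (which, among other things, removes awns and pedicels automatically):

```python
from skimage import io, filters, measure

def measure_seeds(image_path, min_area_px=50):
    """Threshold a scanned seed image, label the seed outlines, and report
    simple per-seed shape parameters (in pixels)."""
    gray = io.imread(image_path, as_gray=True)
    mask = gray < filters.threshold_otsu(gray)   # dark seeds, light background
    labels = measure.label(mask)
    results = []
    for region in measure.regionprops(labels):
        if region.area < min_area_px:            # ignore specks and noise
            continue
        results.append({
            "length": region.major_axis_length,  # seed length
            "width": region.minor_axis_length,   # seed width
            "area": region.area,
            "perimeter": region.perimeter,
        })
    return results
```

Calibrating pixels to millimetres (e.g., by scanning a ruler) then converts these into the physical trait values used for QTL analysis.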
Reinhardt, Martin; Brandmaier, Philipp; Seider, Daniel; Kolesnik, Marina; Jenniskens, Sjoerd; Sequeiros, Roberto Blanco; Eibisberger, Martin; Voglreiter, Philip; Flanagan, Ronan; Mariappan, Panchatcharam; Busse, Harald; Moche, Michael
2017-12-01
Radio-frequency ablation (RFA) is a promising minimally invasive treatment option for early liver cancer; however, monitoring or predicting the size of the resulting tissue necrosis during the RFA procedure is a challenging task, potentially resulting in a significant rate of under- or over-treatment. Currently there is no reliable lesion-size prediction method commercially available. ClinicIMPPACT is designed as a multicenter, prospective, non-randomized clinical trial to evaluate the accuracy and efficiency of innovative planning and simulation software. 60 patients with early liver cancer will be included at four European clinical institutions and treated with the same RFA system. The preinterventional imaging datasets will be used for computational planning of the RFA treatment. All ablations will be simulated in parallel with the actual RFA procedure, using the software environment developed in this project. The primary outcome measure is the comparison of the simulated ablation zones with the true lesions shown in follow-up imaging after one month, to assess the accuracy of the lesion prediction. This unique multicenter clinical trial aims at the clinical integration of a dedicated software solution to accurately predict lesion size and shape after radiofrequency ablation of liver tumors. Accelerated and optimized workflow integration, real-time intraoperative image processing, and the inclusion of patient-specific information, e.g., organ perfusion and registration of the real RFA needle position, may make the introduced software a powerful tool for interventional radiologists to optimize patient outcomes.
Analysis and Synthesis of Robust Data Structures
1990-08-01
1.3.2 Multiversion Software ... 5. 1.3.3 Robust Data Structure ... 6. 1.4 ... context are: multiversion software, which is an adaptation of the N-modulo redundancy (NMR) technique; and recovery blocks, which is an adaptation of ... implementations using these features for such a hybrid approach. 1.3.2 Multiversion Software: Avizienis [AC77] was the first to adapt the NMR technique into
Software Requirements Engineering Methodology (Development)
1979-06-01
Higher Order Software [20]; and the Michael Jackson Design Methodology [21]. Although structured programming constructs have proven to be more useful ... reviewed here. Similarly, the manual techniques for software design (e.g., HIPO diagrams, Nassi-Shneiderman charts, Top-Down Design, the Michael Jackson Design Methodology, Yourdon's Structured Design) are not addressed. 6.1.3 Research Programs. There are a number of research programs underway
NASA Technical Reports Server (NTRS)
1983-01-01
The structure and functions of each reporting software program for the Software Engineering Laboratory data base are described. Baseline diagrams, module descriptions, and listings of program generation files are included.
Shyr, Casper; Kushniruk, Andre; van Karnebeek, Clara D M; Wasserman, Wyeth W
2016-03-01
The transition of whole-exome and whole-genome sequencing (WES/WGS) from the research setting to routine clinical practice remains challenging. With almost no previous research specifically assessing interface designs and functionalities of WES and WGS software tools, the authors set out to ascertain perspectives from healthcare professionals in distinct domains on optimal clinical genomics user interfaces. A series of semi-scripted focus groups, structured around professional challenges encountered in clinical WES and WGS, were conducted with bioinformaticians (n = 8), clinical geneticists (n = 9), genetic counselors (n = 5), and general physicians (n = 4). Contrary to popular existing system designs, bioinformaticians preferred command line over graphical user interfaces for better software compatibility and customization flexibility. Clinical geneticists and genetic counselors desired an overarching interactive graphical layout to prioritize candidate variants--a "tiered" system where only functionalities relevant to the user domain are made accessible. They favored a system capable of retrieving consistent representations of external genetic information from third-party sources. To streamline collaboration and patient exchanges, the authors identified user requirements toward an automated reporting system capable of summarizing key evidence-based clinical findings among the vast array of technical details. Successful adoption of a clinical WES/WGS system is heavily dependent on its ability to address the diverse necessities and predilections among specialists in distinct healthcare domains. Tailored software interfaces suitable for each group is likely more appropriate than the current popular "one size fits all" generic framework. This study provides interfaces for future intervention studies and software engineering opportunities. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
Application of Structure-from-Motion photogrammetry in laboratory flumes
NASA Astrophysics Data System (ADS)
Morgan, Jacob A.; Brogan, Daniel J.; Nelson, Peter A.
2017-01-01
Structure-from-Motion (SfM) photogrammetry has become widely used for topographic data collection in field and laboratory studies. However, the relative performance of SfM against other methods of topographic measurement in a laboratory flume environment has not been systematically evaluated, and there is a general lack of guidelines for SfM application in flume settings. As the use of SfM in laboratory flume settings becomes more widespread, it is increasingly critical to develop an understanding of how to acquire and process SfM data for a given flume size and sediment characteristics. In this study, we: (1) compare the resolution and accuracy of SfM topographic measurements to terrestrial laser scanning (TLS) measurements in laboratory flumes of varying physical dimensions containing sediments of varying grain sizes; (2) explore the effects of different image acquisition protocols and data processing methods on the resolution and accuracy of topographic data derived from SfM techniques; and (3) provide general guidance for image acquisition and processing for SfM applications in laboratory flumes. To investigate the effects of flume size, sediment size, and photo overlap on the density and accuracy of SfM data, we collected topographic data using both TLS and SfM in five flumes with widths ranging from 0.22 to 6.71 m, lengths ranging from 9.14 to 30.48 m, and median sediment sizes ranging from 0.2 to 31 mm. Acquisition time, image overlap, point density, elevation data, and computed roughness parameters were compared to evaluate the performance of SfM against TLS. We also collected images of a pan of gravel where we varied the distance and angle between the camera and sediment in order to explore how photo acquisition affects the ability to capture grain-scale microtopographic features in SfM-derived point clouds. A variety of image combinations and SfM software package settings were also investigated to determine optimal processing techniques. Results from this study suggest that SfM provides topographic data of similar accuracy to TLS, at higher resolution and lower cost. We found that about 100 pixels per grain are required to resolve grain-scale topography. We suggest protocols for image acquisition and SfM software settings to achieve best results when using SfM in laboratory settings. In general, convergent imagery, taken from a higher angle, with at least several overlapping images for each desired point in the flume will result in an acceptable point cloud.
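The "100 pixels per grain" guideline can be turned into a camera-placement rule with simple pinhole geometry. The sketch below treats pixels per grain as projected grain area; the formula and parameter names are our own illustration, not taken from the paper:

```python
import math

def max_camera_distance_mm(d50_mm, focal_mm, pixel_pitch_um,
                           pixels_per_grain=100.0):
    """Largest nadir camera-to-bed distance (mm) that still puts about
    `pixels_per_grain` pixels on a circular grain of diameter d50."""
    # A grain of diameter D imaged at ground sampling distance GSD covers
    # roughly (pi/4) * (D / GSD)**2 pixels; solve for the required GSD.
    gsd_mm = d50_mm * math.sqrt(math.pi / (4.0 * pixels_per_grain))
    # Pinhole camera: GSD = pixel_pitch * distance / focal_length.
    return gsd_mm * focal_mm / (pixel_pitch_um * 1e-3)

# Example: 31 mm gravel, 24 mm lens, 4.5 um pixels -> roughly 14.7 m.
print(round(max_camera_distance_mm(31.0, 24.0, 4.5) / 1000.0, 1), "m")
```

For the finest sediment studied (0.2 mm), the same arithmetic forces the camera within a few centimetres of the bed, which is consistent with the paper's emphasis on close-range, convergent imagery for grain-scale work.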
Mental Models of Software Forecasting
NASA Technical Reports Server (NTRS)
Hihn, J.; Griesel, A.; Bruno, K.; Fouser, T.; Tausworthe, R.
1993-01-01
The majority of software engineers resist the use of the currently available cost models. One problem is that the mathematical and statistical models that are currently available do not correspond with the mental models of the software engineers. In an earlier JPL-funded study (Hihn and Habib-agahi, 1991) it was found that software engineers prefer to use analogical or analogy-like techniques to derive size and cost estimates, whereas current CERs hide any analogy in the regression equations. In addition, the currently available models depend upon information which is not available during early planning, when the most important forecasts must be made.
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
A Structured Model for Software Documentation.
ERIC Educational Resources Information Center
Swigger, Keith
The concept of "structured programming" was developed to facilitate software production, but it has not carried over to documentation design. Two concepts of structure are relevant to user documentation for computer programs. The first is based on programming techniques that emphasize decomposition of tasks into discrete modules, while the second…
Software Architecture Evolution
ERIC Educational Resources Information Center
Barnes, Jeffrey M.
2013-01-01
Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…
NASA Astrophysics Data System (ADS)
Fazayeli, Saeed; Eydi, Alireza; Kamalabadi, Isa Nakhai
2017-07-01
Nowadays, organizations have to compete with different competitors at regional, national, and international levels, so they have to improve their competitive capabilities to survive. Undertaking activities on a global scale requires a proper distribution system that can take advantage of different transportation modes. Accordingly, the present paper addresses a location-routing problem on a multimodal transportation network. The introduced problem pursues four objectives simultaneously, which form the main contribution of the paper: determining multimodal routes between the supplier and distribution centers, locating mode-changing facilities, locating distribution centers, and determining product delivery tours from the distribution centers to retailers. An integer linear programming model is presented for the problem, and a genetic algorithm with a new chromosome structure is proposed to solve it. The proposed chromosome structure consists of two different parts, for the multimodal transportation and location-routing parts of the model. Based on published data in the literature, two numerical cases of different sizes were generated and solved, and different cost scenarios were designed to better analyze model and algorithm performance. Results show that the algorithm can effectively solve large-size problems within a reasonable time, whereas the GAMS software failed to reach an optimal solution even within much longer times.
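For illustration, a two-part chromosome of the kind the abstract describes might be encoded as below. Every encoding detail here (one mode per route leg, a giant retailer tour split among open distribution centers) is an assumption made for the sketch, since the abstract does not spell out the representation:

```python
import random

def random_chromosome(n_legs, n_modes, n_dcs, n_retailers):
    """Two-part chromosome: part 1 handles the multimodal transport side
    (one mode index per route leg), part 2 the location-routing side
    (which DCs open, plus a giant retailer tour assigned among them).
    All encoding details are illustrative assumptions."""
    part1_modes = [random.randrange(n_modes) for _ in range(n_legs)]
    open_dcs = [dc for dc in range(n_dcs) if random.random() < 0.5] or [0]
    giant_tour = random.sample(range(n_retailers), n_retailers)
    # assign each retailer in tour order to a randomly chosen open DC
    dc_of = [random.choice(open_dcs) for _ in giant_tour]
    return {"modes": part1_modes, "open_dcs": open_dcs,
            "tour": giant_tour, "dc_of": dc_of}
```

Keeping the two parts separate lets crossover and mutation operators act on the transportation and routing decisions independently, which is the usual motivation for such a structure.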
The readout and control system of the mid-size telescope prototype of the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Oya, I.; Anguner, O.; Behera, B.; Birsin, E.; Fuessling, M.; Melkumyan, D.; Schmidt, T.; Schwanke, U.; Sternberger, R.; Wegner, P.; Wiesand, S.; Cta Consortium,the
2014-06-01
The Cherenkov Telescope Array (CTA) is one of the major ground-based astronomy projects being pursued and will be the largest facility for ground-based γ-ray observations ever built. CTA will consist of two arrays: one in the Northern hemisphere composed of about 20 telescopes, and the other in the Southern hemisphere composed of about 100 telescopes, both arrays containing telescopes of different types and sizes. A prototype for the Mid-Size Telescope (MST) with a diameter of 12 m has been installed in Berlin and is currently being commissioned. This prototype is composed of a mechanical structure, a drive system, and mirror facets mounted with powered actuators to enable active control. Five Charge-Coupled Device (CCD) cameras and a wide set of sensors allow the evaluation of the performance of the instrument. The design of the control software follows concepts and tools under evaluation within the CTA consortium in order to provide a realistic test-bed for the middleware: 1) the readout and control system for the MST prototype is implemented with the Atacama Large Millimeter/submillimeter Array (ALMA) Common Software (ACS) distributed control middleware; 2) the Open Connectivity Unified Architecture (OPC UA) is used for hardware access; 3) the document-oriented MongoDB database is used for efficient storage of CCD images, logging, and alarm information; and 4) MySQL and MongoDB databases are used for archiving the slow control monitoring data and for storing the operation configuration parameters. In this contribution, the details of the implementation of the control system for the MST prototype telescope are described.
Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Stoeckel, Gerhard P.; Doyle, Keith B.
2013-09-01
Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.
Study of Star Formation Regions with Molecular Hydrogen Emission Lines
NASA Astrophysics Data System (ADS)
Pak, Soojong
The goal of my dissertation is to understand the large-scale, near-infrared (near-IR) H2 emission from the central kiloparsec (kpc) regions of galaxies, and to study the structure and physics of photon-dominated regions (or photodissociation regions, hereafter PDRs). In order to explore the near-IR H2 lines, our group built the University of Texas near-IR Fabry-Perot Spectrometer optimized for observations of extended, low surface brightness sources. In this instrument project, I designed and built a programmable high voltage DC amplifier for the Fabry-Perot piezoelectric transducers, a temperature-controlled cooling box for the Fabry-Perot etalon, instrument control software, and data reduction software. With this instrument, we observed H2 emission lines in the inner 400 pc of the Galaxy, the central ~1 kpc of NGC 253 and M82, and the star formation regions in the Magellanic Clouds. We also observed the Magellanic Clouds in the CO J=1→0 line. We found that the H2 emission is very extended in the central kpc of the galaxies and is mostly UV-excited. The ratios of the H2 (1,0) S(1) luminosities to the far-IR continuum luminosities in the central kpc regions do not change from the Galactic center to starburst galaxies and to ultraluminous IR bright galaxies. Using the data from the Magellanic Clouds, we studied the microscopic structure of star forming clouds. We compiled data sets including our H2 (1,0) S(1) and CO J=1→0 results and published [C II] and far-IR data from the Magellanic Clouds, and compared these observations with models we made using a PDR code and a radiative transfer code. Assuming the cloud is spherical, we derived the physical sizes of the H2, [C II], and CO emission regions. The average cloud size appears to increase as the metallicity decreases. Our results agree with the theory of photoionization-regulated star formation, in which the interplay between ambipolar diffusion and ionization by far-UV photons determines the size of stable clouds.
ERIC Educational Resources Information Center
Ferrer, Emilio; Hamagami, Fumiaki; McArdle, John J.
2004-01-01
This article offers different examples of how to fit latent growth curve (LGC) models to longitudinal data using a variety of different software programs (i.e., LISREL, Mx, Mplus, AMOS, SAS). The article shows how the same model can be fitted using both structural equation modeling and multilevel software, with nearly identical results, even in…
ERIC Educational Resources Information Center
Millan, Eva; Belmonte, Maria-Victoria; Ruiz-Montiel, Manuela; Gavilanes, Juan; Perez-de-la-Cruz, Jose-Luis
2016-01-01
In this paper, we present BH-ShaDe, a new software tool to assist architecture students learning the ill-structured domain/task of housing design. The software tool provides students with automatic or interactively generated floor plan schemas for basic houses. The students can then use the generated schemas as initial seeds to develop complete…
Artificial intelligence and the space station software support environment
NASA Technical Reports Server (NTRS)
Marlowe, Gilbert
1986-01-01
In a software system the size of the Space Station Software Support Environment (SSE), no one software development or implementation methodology is presently powerful enough to provide safe, reliable, maintainable, cost-effective real-time or near-real-time software. In an environment that must survive one of the harshest and longest lifetimes, software must be produced that will perform as predicted, from the first time it is executed to the last. Many of the software challenges that will be faced will require strategies borrowed from Artificial Intelligence (AI). AI is the only development area mentioned as an example of a legitimate reason for a waiver from the overall requirement to use the Ada programming language for software development. The limits of the applicability of the Ada language, the Ada Programming Support Environment (of which the SSE is a special case), and software engineering to AI solutions are defined by describing a scenario that involves many facets of AI methodologies.
Instrumentation: Software-Driven Instrumentation: The New Wave.
ERIC Educational Resources Information Center
Salit, M. L.; Parsons, M. L.
1985-01-01
Software-driven instrumentation makes measurements that demand a computer as an integral part of either control, data acquisition, or data reduction. The structure of such instrumentation, hardware requirements, and software requirements are discussed. Examples of software-driven instrumentation (such as wavelength-modulated continuum source…
StrAuto: automation and parallelization of STRUCTURE analysis.
Chhatre, Vikram E; Emerson, Kevin J
2017-03-24
Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational overload of this analysis, it does not fully automate the use of replicate STRUCTURE analysis runs required for downstream inference of optimal K. There is a pressing need for a tool that can deploy population structure analysis on high-performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach, in addition to implementing parallel computation, a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available to download from http://strauto.popgen.org .
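The heart of such a pipeline, distributing replicate STRUCTURE runs for each K over several processors, can be sketched with the Python standard library alone. The binary name and command-line flags below follow the STRUCTURE console program as we understand it and should be checked against a local installation; this is an illustration, not StrAuto's own code:

```python
import subprocess
from multiprocessing import Pool

def run_structure(job):
    """Run one STRUCTURE job for a given (K, replicate) pair. Assumes
    mainparams/extraparams files exist in the working directory;
    per-replicate random seeds are normally set via extraparams."""
    k, rep = job
    out = f"results_K{k}_rep{rep}"
    subprocess.run(["structure", "-m", "mainparams", "-e", "extraparams",
                    "-K", str(k), "-o", out], check=True)
    return out

if __name__ == "__main__":
    # e.g. K = 1..10 with 10 replicates each, spread over 8 cores
    jobs = [(k, rep) for k in range(1, 11) for rep in range(1, 11)]
    with Pool(processes=8) as pool:
        finished = pool.map(run_structure, jobs)
    print(f"{len(finished)} runs completed")
```

The resulting output files are exactly what the Evanno ΔK step and STRUCTURE HARVESTER then consume downstream.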
Antiplagiarism Software Takes on the Honor Code
ERIC Educational Resources Information Center
Wasley, Paula
2008-01-01
Among the 100-odd colleges with academic honor codes, plagiarism-detection services raise a knotty problem: Is software compatible with a system based on trust? The answer frequently devolves to the size and culture of the university. Colleges with traditional student-run honor codes tend to "forefront" trust, emphasizing it above all else. This…
CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 4
2005-04-01
older automated cost-estimating tools are no longer being actively marketed but are still in use, such as CheckPoint, COCOMO, ESTIMACS, REVIC, and SPQR ... estimation tools: SPQR/20, Checkpoint, and KnowledgePlan. These software estimation tools pioneered the use of function point metrics for sizing and
The Elusive Cost of Library Software
ERIC Educational Resources Information Center
Breeding, Marshall
2009-01-01
Software pricing is not a straightforward issue, since each procurement involves a special business arrangement between a library and its chosen vendor. The author thinks that it is reasonable to scale the cost of a product to such factors as the size of the library, the complexity of the installation, the number of simultaneous users, or the…
On the Use of Software Metrics as a Predictor of Software Security Problems
2013-01-01
models to determine if additional metrics are required to increase the accuracy of the model: non-security SCSA warnings, code churn, and size; the ... vulnerabilities reported by testing and those found in the field. Summary of Most Important Results: We evaluated our model on three commercial telecommunications
Hardwood log defect photographic database, software and user's guide
R. Edward Thomas
2009-01-01
Computer software and user's guide for Hardwood Log Defect Photographic Database. The database contains photographs and information on external hardwood log defects and the corresponding internal characteristics. This database allows users to search for specific defect types, sizes, and locations by tree species. For every defect, the database contains photos of...
NASA Technical Reports Server (NTRS)
Johnson, Charles S.
1986-01-01
It is nearly axiomatic that to take the greatest advantage of the useful features available in a development system, and to avoid the negative interactions of those features, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions, processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types, as buildups from the user-defined data types which are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.
NASA Astrophysics Data System (ADS)
Verma, Payal; Juneja, Sucheta; Savelyev, Dmitry A.; Khonina, Svetlana N.; Gopal, Ram
2016-04-01
This paper presents the design and fabrication of a micro-gyroscope with a 1-DOF (degree-of-freedom) drive mode and a 2-DOF sense mode. It is an inherently robust structure and offers a high sense-frequency bandwidth. The proposed design utilizes resonance of the 1-DOF drive-mode oscillator and employs the dynamic amplification concept in the sense modes to increase sensitivity while maintaining robustness. The 2-DOF in the sense direction renders the device immune to process imperfections and environmental effects. The design is simulated using FEA software (CoventorWare®). The device is designed for process compatibility with the SU-8 based UV-LIGA process, which is an economical fabrication technique. The complete fabrication process is presented along with SEM images of the fabricated device. The device has 9-µm-thick nickel as the key structural layer, with an overall reduced key structure size of 2.2 mm by 2.1 mm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belley, M; Schmidt, M; Knutson, N
Purpose: Physics second-checks for external beam radiation therapy are performed, in part, to verify that the machine parameters in the Record-and-Verify (R&V) system that will ultimately be sent to the LINAC exactly match the values initially calculated by the Treatment Planning System (TPS). While performing the second-check, a large portion of the physicist's time is spent navigating and arranging display windows to locate and compare the relevant numerical values (MLC position, collimator rotation, field size, MU, etc.). Here, we describe the development of a software tool that guides the physicist by aggregating and succinctly displaying machine parameter data relevant to the physics second-check process. Methods: A data retrieval software tool was developed using Python to aggregate data and generate a list of machine parameters that are commonly verified during the physics second-check process. This software tool imported values from (i) the TPS RT Plan DICOM file and (ii) the MOSAIQ (R&V) Structured Query Language (SQL) database. The machine parameters aggregated for this study included: MLC positions, X and Y jaw positions, collimator rotation, gantry rotation, MU, dose rate, wedges and accessories, cumulative dose, energy, machine name, couch angle, and more. Results: A GUI was developed to generate a side-by-side display of the aggregated machine parameter values for each field, presented to the physicist for direct visual comparison. This software tool was tested for 3D conformal, static IMRT, sliding window IMRT, and VMAT treatment plans. Conclusion: This software tool facilitated the data collection needed for the physicist to conduct a second-check, yielding an optimized second-check workflow that was both more user-friendly and more time-efficient. Using this software tool, the physicist was able to spend less time searching through the TPS PDF plan document and the R&V system and to focus the second-check efforts on assessing patient-specific plan quality.
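The TPS side of such a comparison is straightforward with pydicom, since the quantities listed (MU, gantry and collimator angles, energy) live in standard RT Plan attributes; the MOSAIQ SQL side is site-specific and omitted. This is a hedged sketch of the general approach, not the tool described in the abstract:

```python
import pydicom

def plan_parameters(rtplan_path):
    """Extract second-check values from a DICOM RT Plan file, using only
    standard RT Plan attributes."""
    ds = pydicom.dcmread(rtplan_path)
    # MU per beam lives in the fraction group, keyed by beam number
    mu = {rb.ReferencedBeamNumber: float(rb.BeamMeterset)
          for rb in ds.FractionGroupSequence[0].ReferencedBeamSequence}
    rows = []
    for beam in ds.BeamSequence:
        cp0 = beam.ControlPointSequence[0]   # machine state at delivery start
        rows.append({
            "beam": beam.BeamName,
            "MU": mu.get(beam.BeamNumber),
            "gantry_deg": float(cp0.GantryAngle),
            "collimator_deg": float(cp0.BeamLimitingDeviceAngle),
            "energy_MV": float(cp0.NominalBeamEnergy),
        })
    return rows
```

Each row can then be laid out side by side with the corresponding R&V record, which is essentially the visual comparison the abstract describes.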
Three-dimensional measurement system for crime scene documentation
NASA Astrophysics Data System (ADS)
Adamczyk, Marcin; Hołowko, Elwira; Lech, Krzysztof; Michoński, Jakub; Mączkowski, Grzegorz; Bolewicki, Paweł; Januszkiewicz, Kamil; Sitnik, Robert
2017-10-01
Three-dimensional measurement techniques (such as photogrammetry, Time of Flight, Structure from Motion, or Structured Light) are becoming a standard in the crime scene documentation process. The use of 3D measurement provides an opportunity to prepare a more insightful investigation and helps to show every trace in the context of the entire crime scene. In this paper we present a hierarchical, three-dimensional measurement system designed for the crime scene documentation process. Our system reflects current standards in crime scene documentation: it performs measurement in two stages. The first, most general stage is prepared with a scanner of relatively low spatial resolution but large measuring volume and is used to document the whole scene. The second stage is much more detailed, with high resolution but a smaller measuring volume for areas that require a closer approach. The documentation process is supervised by a specialised application, CrimeView3D, a software platform for measurement management (connecting with scanners and carrying out measurements, with automatic or semi-automatic data registration in real time) and data visualisation (3D visualisation of documented scenes). It also provides a series of useful tools for forensic technicians: a virtual measuring tape, searching for sources of blood spatter, a virtual walk through the crime scene, and many others. We also report on the metrological validation of the scanners, performed according to the VDI/VDE standard, and present results from measurement sessions conducted at real crime scenes in cooperation with technicians from the Central Forensic Laboratory of the Police.
Update of GRASP/Ada reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1992-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation of Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX-11/780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, the prototype was evaluated by software engineering students at Auburn University and then updated with significant enhancements to the user interface, including editing capabilities. Version 3.2 of the prototype was prepared for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode, with a level of flexibility suitable for practical application.
Malacarne, D; Pesenti, R; Paolucci, M; Parodi, S
1993-01-01
For a database of 826 chemicals tested for carcinogenicity, we fragmented the structural formula of the chemicals into all possible contiguous-atom fragments with size between two and eight (nonhydrogen) atoms. The fragmentation was obtained using a new software program based on graph theory. We used 80% of the chemicals as a training set and 20% as a test set. The two sets were obtained by random sorting. From the training sets, an average (8 computer runs with independently sorted chemicals) of 315 different fragments were significantly (p < 0.125) associated with carcinogenicity or lack thereof. Even using this relatively low level of statistical significance, 23% of the molecules of the test sets lacked significant fragments. For 77% of the molecules of the test sets, we used the presence of significant fragments to predict carcinogenicity. The average level of accuracy of the predictions in the test sets was 67.5%. Chemicals containing only positive fragments were predicted with an accuracy of 78.7%. The level of accuracy was around 60% for chemicals characterized by contradictory fragments or only negative fragments. In a parallel manner, we performed eight paired runs in which carcinogenicity was attributed randomly to the molecules of the training sets. The fragments generated by these pseudo-training sets were devoid of any predictivity in the corresponding test sets. Using an independent software program, we confirmed (for the complex biological endpoint of carcinogenicity) the validity of a structure-activity relationship approach of the type proposed by Klopman and Rosenkranz with their CASE program. PMID:8275991
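The fragmentation step, enumerating every connected set of two to eight non-hydrogen atoms, is a classic connected-subgraph enumeration. A small illustrative re-implementation with networkx (not the authors' original program) follows; molecules are assumed to be given as graphs whose nodes are heavy atoms:

```python
import networkx as nx

def connected_fragments(mol, min_size=2, max_size=8):
    """Enumerate all connected induced subgraphs ('fragments') of a
    molecular graph with min_size..max_size nodes, returned as frozensets
    of atom indices."""
    frontier = {frozenset([n]) for n in mol.nodes}
    fragments = set()
    for size in range(2, max_size + 1):
        grown = set()
        for frag in frontier:
            # grow each connected fragment by one adjacent atom;
            # the set-of-frozensets bookkeeping removes duplicates
            adjacent = {m for n in frag for m in mol.neighbors(n)} - frag
            grown.update(frag | {m} for m in adjacent)
        frontier = grown
        if size >= min_size:
            fragments |= grown
    return fragments

# To compare fragments across molecules, one would canonicalize each
# fragment by its element and bond labels rather than by atom indices.
```

Because every connected subgraph of size k extends some connected subgraph of size k-1 by one adjacent atom, this breadth-wise growth provably visits every fragment in the size range exactly once after deduplication.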
Methodology for Software Reliability Prediction. Volume 2.
1987-11-01
The overall acquisition program shall include the resources, schedule, management, structure, and controls necessary to ensure that specified... Independent Verification/Validation; Programming Team Structure; Educational Level of Team Members; Experience Level of Team Members; Methods Used... Prediction or Estimation Parameter Supported: Software Characteristics. 3. Objectives: Structured programming studies and Government ... procurement
A general observatory control software framework design for existing small and mid-size telescopes
NASA Astrophysics Data System (ADS)
Ge, Liang; Lu, Xiao-Meng; Jiang, Xiao-Jun
2015-07-01
A general framework for observatory control software would help to improve the efficiency of observation and operation of telescopes, and would also be advantageous for remote and joint observations. We describe a general framework for observatory control software which considers principles of flexibility and inheritance to meet the expectations of observers and technical personnel. This framework includes observation scheduling, device control, and data storage. The design is based on a finite state machine that controls the whole process.
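A finite state machine of the sort described can be captured in a few lines. The states and events below are hypothetical stand-ins for whatever the actual framework defines; this is only a sketch of the control pattern:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    SCHEDULED = auto()
    SLEWING = auto()
    EXPOSING = auto()
    SAVING = auto()
    ERROR = auto()

# allowed transitions of a (hypothetical) observation life cycle:
# scheduling -> device control (slew, expose) -> data storage
TRANSITIONS = {
    (State.IDLE, "schedule"): State.SCHEDULED,
    (State.SCHEDULED, "slew"): State.SLEWING,
    (State.SLEWING, "on_target"): State.EXPOSING,
    (State.EXPOSING, "readout"): State.SAVING,
    (State.SAVING, "done"): State.IDLE,
}

class ObservationFSM:
    """Minimal finite state machine driving one observation through
    scheduling, device control, and data storage."""
    def __init__(self):
        self.state = State.IDLE

    def fire(self, event):
        try:
            self.state = TRANSITIONS[(self.state, event)]
        except KeyError:
            self.state = State.ERROR   # any illegal event faults the cycle
        return self.state
```

Concrete telescopes then subclass or parameterize such a machine, which is where the framework's flexibility-and-inheritance principle comes in.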
Introduction to the Security Engineering Risk Analysis (SERA) Framework
2014-11-01
military aircraft has increased from 8% to 80%. At the same time, the size of software in military aircraft has grown from 1,000 lines of code in the F-4A to 1.7 million lines of code in the F-22. This growth trend is expected to continue over time [NASA 2009]. As software exerts more control of... their root causes can be traced to the software's requirements, architecture, design, or code. Studies have shown that the cost of addressing a software
Iwazawa, J; Ohue, S; Hashimoto, N; Mitani, T
2014-02-01
To compare the accuracy of computer software analysis using three different target-definition protocols to detect tumour feeder vessels for transarterial chemoembolization of hepatocellular carcinoma. C-arm computed tomography (CT) data were analysed for 81 tumours from 57 patients who had undergone chemoembolization using software-assisted detection of tumour feeders. Small, medium, and large-sized targets were manually defined for each tumour. The tumour feeder was verified when the target tumour was enhanced on selective C-arm CT of the investigated vessel during chemoembolization. The sensitivity, specificity, and accuracy of the three protocols were evaluated and compared. One hundred and eight feeder vessels supplying 81 lesions were detected. The sensitivity of the small, medium, and large target protocols was 79.8%, 91.7%, and 96.3%, respectively; specificity was 95%, 88%, and 50%, respectively; and accuracy was 87.5%, 89.9%, and 74%, respectively. The sensitivity was significantly higher for the medium (p = 0.003) and large (p < 0.001) target protocols than for the small target protocol. The specificity and accuracy were higher for the small (p < 0.001 and p < 0.001, respectively) and medium (p < 0.001 and p < 0.001, respectively) target protocols than for the large target protocol. The overall accuracy of software-assisted automated feeder analysis in transarterial chemoembolization for hepatocellular carcinoma is affected by the target definition size. A large target definition increases sensitivity and decreases specificity in detecting tumour feeders. A target size equivalent to the tumour size most accurately predicts tumour feeders. Copyright © 2013 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Hadjisolomou, Stavros P.; El-Haddad, George
2017-01-01
Coleoid cephalopods (squid, octopus, and sepia) are renowned for their elaborate body patterning capabilities, which are employed for camouflage or communication. The specific chromatic appearance of a cephalopod, at any given moment, is a direct result of the combined action of their intradermal pigmented chromatophore organs and reflecting cells. Therefore, a lot can be learned about the cephalopod coloration system by video recording and analyzing the activation of individual chromatophores in time. The fact that adult cephalopods have small chromatophores, up to several hundred thousand in number, makes measurement and analysis over several seconds a difficult task. However, current advancements in videography enable high-resolution and high framerate recording, which can be used to record chromatophore activity in more detail and accuracy in both space and time domains. In turn, the additional pixel information and extra frames per video from such recordings result in large video files of several gigabytes, even when the recording spans only few minutes. We created a software plugin, “SpotMetrics,” that can automatically analyze high resolution, high framerate video of chromatophore organ activation in time. This image analysis software can track hundreds of individual chromatophores over several hundred frames to provide measurements of size and color. This software may also be used to measure differences in chromatophore activation during different behaviors which will contribute to our understanding of the cephalopod sensorimotor integration system. In addition, this software can potentially be utilized to detect numbers of round objects and size changes in time, such as eye pupil size or number of bacteria in a sample. Thus, we are making this software plugin freely available as open-source because we believe it will be of benefit to other colleagues both in the cephalopod biology field and also within other disciplines. PMID:28298896
New generation of the health monitoring system SMS 2001
NASA Astrophysics Data System (ADS)
Berndt, Rolf-Dietrich; Schwesinger, Peter
2001-08-01
The Structure Monitoring System SMS 2001 (patent applied for) is a modular multi-component measurement device for use under outdoor conditions. Besides the usual continuous (static) measurements of, e.g., environmental parameters and structure-related responses, the SMS can also automatically register short-term dynamic events at measurement frequencies up to 1 kHz. A wide range of electrical sensors can be used, and a solar-based power supply can be provided on demand. The SMS 2001 is highly adaptable, space-saving in its geometric structure, and able to meet very varied user demands. The system is applicable preferably to small and medium-sized concrete and steel structures (besides buildings and bridges, also special cases). It is suitable to support the efficient concept of controlled lifetime extension, especially in the case of pre-damaged structures. The interactive communication between the SMS and the central office is completely remote-controlled; point-to-point or multi-point connections over the internet can be realized. The measurement data are stored in a central data bank. Secure access, supported by software modules, can be organized at different levels, e.g., for scientific evaluation, service reasons, or the needs of authorities.
TDat: An Efficient Platform for Processing Petabyte-Scale Whole-Brain Volumetric Images.
Li, Yuxin; Gong, Hui; Yang, Xiaoquan; Yuan, Jing; Jiang, Tao; Li, Xiangning; Sun, Qingtao; Zhu, Dan; Wang, Zhenyu; Luo, Qingming; Li, Anan
2017-01-01
Three-dimensional imaging of whole mammalian brains at single-neuron resolution has generated terabyte (TB)- and even petabyte (PB)-sized datasets. Due to their size, processing these massive image datasets can be hindered by the computer hardware and software typically found in biological laboratories. To fill this gap, we have developed an efficient platform named TDat, which adopts a novel data-reformatting strategy by reading cuboid data and employing parallel computing. In data reformatting, TDat is more efficient than any other software. In data accessing, we adopted parallelization to fully exploit the data-transmission capability of computers. We applied TDat to large-volume data rigid registration and neuron tracing in whole-brain data at single-neuron resolution, which has never been demonstrated in other studies. We also showed its compatibility with various computing platforms, image processing software, and imaging systems.
The Kinematic Analysis of Flat Leverage Mechanism of the Third Class
NASA Astrophysics Data System (ADS)
Zhauyt, A.; Mamatova, G.; Abdugaliyeva, G.; Alipov, K.; Sakenova, A.; Alimbetov, A.
2017-10-01
When designing flat linkage mechanisms of high class, it is necessary to perform strength calculations on the links once the block diagrams and the linear dimensions of the links have been defined, i.e., to choose their forms rationally and to determine the cross-section sizes. This work offers an algorithm for determining the link lengths of mechanisms of high classes (MHC) and their metric parameters by successive approximation. The paper also presents educational and research software named GIM. This software has been developed with the aim of addressing the difficulties students usually encounter when facing up to the kinematic analysis of mechanisms. A deep understanding of kinematic analysis is necessary to go a step further into the design and synthesis of mechanisms. In order to support and complement the theoretical lectures, the GIM software is used during the practical exercises, serving as a complementary educational tool reinforcing the knowledge acquired by the students.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Jin, Shuangshuang; Chen, Yousu
This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACKTM high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models; (2) a decoupled parallel dynamic simulation algorithm with an optimized computation architecture to better leverage HPC resources and technologies; (3) options for HPC-based linear and iterative solvers; (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers; and (5) easy integration of new dynamic models and related algorithms into the software package.
A computer controlled television detector for light, X-rays and particles
NASA Technical Reports Server (NTRS)
Kalata, K.
1981-01-01
A versatile, high-resolution, software-configurable, two-dimensional intensified vidicon quantum detector system has been developed for multiple research applications. A thin phosphor converter allows the detection of X-rays below 20 keV and non-relativistic particles in addition to visible light, and a thicker scintillator can be used to detect X-rays up to 100 keV and relativistic particles. Faceplates may be changed to allow any active area from 1 to 40 mm square, and active areas up to 200 mm square are possible. The image is integrated in a digital memory over any software-specified array size up to 4000 x 4000. The array size is selected to match the spatial resolution, which ranges from 10 to 100 microns depending on the operating mode, the active area, and the photon or particle energy. All scan and data acquisition parameters are under software control to allow optimal data collection for each application.
Property-Based Software Engineering Measurement
NASA Technical Reports Server (NTRS)
Briand, Lionel; Morasca, Sandro; Basili, Victor R.
1995-01-01
Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysis, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. This framework defines several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalism and properties we introduce are convenient and intuitive. In addition, we have reviewed the literature on this subject and compared it with our work. This framework contributes constructively to a firmer theoretical ground of software measurement.
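To give the flavor of such property-based definitions, the size concept is typically constrained by axioms of the following shape (our paraphrase, for a system S with element set E and modules S1, S2 having element sets E1, E2; see the paper for the exact formulation):

```latex
\begin{align*}
\text{Nonnegativity:}     \quad & \mathrm{Size}(S) \ge 0 \\
\text{Null value:}        \quad & E = \emptyset \implies \mathrm{Size}(S) = 0 \\
\text{Module additivity:} \quad & S = S_1 \cup S_2,\; E_1 \cap E_2 = \emptyset
    \implies \mathrm{Size}(S) = \mathrm{Size}(S_1) + \mathrm{Size}(S_2)
\end{align*}
```

Any candidate metric failing one of these properties is, on this view, measuring something other than size (for instance, a metric that grows when a system is merely split into modules).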
Property-Based Software Engineering Measurement
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.
1997-01-01
Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.
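For a concrete sense of what "defining concepts by mathematical properties" looks like, the size concept in this framework is pinned down by axioms of roughly the following form (a paraphrase for illustration, not the papers' exact notation), for a system S with element set E and modules m1, m2:

```latex
\mathrm{Size}(S) \ge 0, \qquad
E = \emptyset \;\Rightarrow\; \mathrm{Size}(S) = 0, \qquad
E_{m_1} \cap E_{m_2} = \emptyset \;\Rightarrow\;
\mathrm{Size}(m_1 \cup m_2) = \mathrm{Size}(m_1) + \mathrm{Size}(m_2).
```

Familiar measures such as statement or line counts satisfy these size axioms; complexity, cohesion, and coupling are characterized by different property sets in the papers.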
Proceedings of the Workshop on Computational Aspects in the Control of Flexible Systems, part 2
NASA Technical Reports Server (NTRS)
Taylor, Lawrence W., Jr. (Compiler)
1989-01-01
The Control/Structures Integration Program is discussed, along with a survey of available software for control of flexible structures, computational efficiency and capability, modeling and parameter estimation, and control synthesis and optimization software.
Constraints on the size of Asteroid (216) Kleopatra using stress analysis
NASA Astrophysics Data System (ADS)
Hirabayashi, M.; Scheeres, D. J.
2013-12-01
We investigate the stable size of Asteroid (216) Kleopatra by considering structural constraints on this body. Comprehensive radar observations (Ostro et al. 2000, Science) were used to estimate a shape model for this asteroid. Their estimation revealed that the shape looks like a dog-bone, the mean radius is 54.3 km (with uncertainty as large as 25%), and the surface seems similar to lunar surface regolith. However, 10 years later, Descamps et al. (2011, Icarus) performed near-infrared adaptive optics (AO) imaging with the W.M. Keck II telescope and found that although the shape may be consistent with their observations, the size appeared to be larger than the Ostro size (by a factor of about 1.24). Our motivation in this study is to investigate structural stability constraints on the size of this asteroid. Across the stated range of uncertainty we find significant differences in the necessary angle of friction and cohesion for the body to avoid plastic deformation. We use the following physical parameters as fixed: a mass of 4.64e18 kg (Descamps et al. 2011, Icarus), a rotation period of 5.385 hr (Magnusson 1990, Icarus), and the Ostro et al. shape. We use the Drucker-Prager criterion to describe the rheology of the asteroid's material. Furthermore, we determine the friction angle from the fact that the surface of this asteroid is similar to lunar surface regolith, whose porosity ranges from 33% to 55%. According to Scott (1963), a soil with a porosity of 44% (the mean value of the lunar surface porosity) has a friction angle of 32 degrees (which we use as our nominal value). Since the interior structure is unknown, we assume that the body is homogeneous. We first analyze the stable size by using the upper bound theorem from limit analysis on the assumption that this asteroid's materials are cohesionless. Based on this theorem, for any static surface traction and body force, the yield due to a smooth and convex yield envelope associated with the volume average is identical to the upper bound (Holsapple 2008, Int. J. Nonlinear Mech.). For the average stress, we use the total volume (Holsapple 2008, Icarus) and partial volumes (Hirabayashi et al. 2013, ApJ, submitted). This method gives a conservative condition for structural failure. The result shows that if the size is between 1.18 and 1.32 (a scaling factor defined such that the Ostro shape's size has a value of 1.0), (216) Kleopatra is structurally stable, which is consistent with Descamps et al. (2011, Icarus). Next, we calculate plastic stress solutions to determine possible actual structural failure regimes. For this computation, we use commercial finite element analysis software (ANSYS Academic Teaching Introductory 14.0). To determine structural failure, we search for the condition where a plastic region propagates over the majority of a cross section. Since the zero-cohesion condition leads to large plastic deformations, we evaluate the stable size as a function of cohesion under a constant friction angle of 32 degrees. The result shows that if the size is 1.24, the necessary cohesion is 90,000 Pa; otherwise, the value dramatically increases, up to 1e6 Pa. This technique is robust; therefore, once we obtain accurate physical parameters from more detailed observations, our methodology will be able to give stronger constraints on (216) Kleopatra, as well as on other rubble-pile asteroids.
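For context, the Drucker-Prager criterion referenced above is conventionally written as follows (a standard form with the usual Mohr-Coulomb-matched constants; the abstract itself does not spell it out):

```latex
f = \alpha I_1 + \sqrt{J_2} - s \le 0,
\qquad
\alpha = \frac{2\sin\phi}{\sqrt{3}\,(3-\sin\phi)},
\qquad
s = \frac{6\,c\,\cos\phi}{\sqrt{3}\,(3-\sin\phi)},
```

where I_1 is the first invariant of the stress tensor, J_2 the second invariant of the deviatoric stress, phi the friction angle (32 degrees here), and c the cohesion; setting c = 0 recovers the cohesionless case treated by the limit analysis.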
Communication and Organization in Software Development: An Empirical Study
NASA Technical Reports Server (NTRS)
Seaman, Carolyn B.; Basili, Victor R.
1996-01-01
The empirical study described in this paper addresses the issue of communication among members of a software development organization. The independent variables are various attributes of organizational structure. The dependent variable is the effort spent on sharing information which is required by the software development process in use. The research questions upon which the study is based ask whether or not these attributes of organizational structure have an effect on the amount of communication effort expended. In addition, there are a number of blocking variables which have been identified. These are used to account for factors other than organizational structure which may have an effect on communication effort. The study uses both quantitative and qualitative methods for data collection and analysis. These methods include participant observation, structured interviews, and graphical data presentation. The results of this study indicate that several attributes of organizational structure do affect communication effort, but not in a simple, straightforward way. In particular, the distances between communicators in the reporting structure of the organization, as well as in the physical layout of offices, affects how quickly they can share needed information, especially during meetings. These results provide a better understanding of how organizational structure helps or hinders communication in software development.
LIGSIFT: an open-source tool for ligand structural alignment and virtual screening.
Roy, Ambrish; Skolnick, Jeffrey
2015-02-15
Shape-based alignment of small molecules is a widely used approach in computer-aided drug discovery. Most shape-based ligand structure alignment applications, both commercial and freely available ones, use the Tanimoto coefficient or similar functions for evaluating molecular similarity. Major drawbacks of using such functions are the size dependence of the score and the fact that the statistical significance of the molecular match using such metrics is not reported. We describe a new open-source ligand structure alignment and virtual screening (VS) algorithm, LIGSIFT, that uses Gaussian molecular shape overlay for fast small molecule alignment and a size-independent scoring function for efficient VS based on the statistical significance of the score. LIGSIFT was tested against the compounds for 40 protein targets available in the Directory of Useful Decoys and the performance was evaluated using the area under the ROC curve (AUC), the Enrichment Factor (EF) and Hit Rate (HR). LIGSIFT-based VS shows an average AUC of 0.79, average EF values of 20.8 and a HR of 59% in the top 1% of the screened library. LIGSIFT software, including the source code, is freely available to academic users at http://cssb.biology.gatech.edu/LIGSIFT. Supplementary data are available at Bioinformatics online. Contact: skolnick@gatech.edu.
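To make the size-dependence criticism concrete, the shape-Tanimoto score commonly used for Gaussian shape overlays (shown for context; LIGSIFT's own size-independent scoring function is defined in the paper) is

```latex
T_{AB} = \frac{V_{AB}}{V_{AA} + V_{BB} - V_{AB}},
```

where V_AB is the Gaussian overlap volume between molecules A and B, and V_AA, V_BB are the self-overlap volumes. Because the self-overlap terms grow with molecular size, the score is biased against size-mismatched pairs, and it carries no estimate of statistical significance.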
Seismic Travel Time Tomography in Modeling Low Velocity Anomalies between the Boreholes
NASA Astrophysics Data System (ADS)
Octova, A.; Sule, R.
2018-04-01
Travel-time cross-hole seismic tomography is applied to describing the structure of the subsurface. The sources are placed in one borehole and receivers are placed in the others. The first-arrival travel time recorded by each receiver is used as the input data for the seismic tomography method. This research is divided into three steps. The first step is reconstructing a synthetic model based on field parameters, with configurations of 24 receivers and 45 receivers. The second step is applying the inversion process to the field data, which consist of five pairs of boreholes. The last step is testing the quality of the tomogram with a resolution test. Data processing using the FAST software produces an explicit shape that resembles the initial synthetic model reconstruction with 45 receivers. The tomography processing of the field data indicates cavities in several places between the boreholes. Cavities are identified on BH2A-BH1, BH4A-BH2A and BH4A-BH5, with elongated and rounded structures. In checkerboard resolution tests, anomalies as small as 2 m × 2 m can still be identified. Travel-time cross-hole seismic tomography analysis proves this method is well suited to describing subsurface structure and layer boundaries. The size and position of anomalies can be recognized and interpreted easily.
Effect of cobalt doping on structural and dielectric properties of nanocrystalline LaCrO3
NASA Astrophysics Data System (ADS)
Zarrin, Naima; Husain, Shahid
2018-05-01
Pure and Co-doped lanthanum chromite (LaCrO3) nanoparticles, LaCr1-xCoxO3 (0≤x≤0.3), have been synthesized through a sol-gel process and their structural, morphological and dielectric properties have been studied. X-ray diffraction patterns reveal that the samples are single phase with an orthorhombic structure and Pnma space group. Structural parameters are refined by Rietveld refinement using the Fullprof software. Lattice parameters and unit cell volume are found to decrease with increasing Co doping. The crystallite size, calculated using the Scherrer equation, is also found to decrease with increasing Co concentration. Surface morphology is examined using SEM-EDX analysis, which confirms the formation of regular and homogeneous samples without any impurities. The value of the dielectric constant (ɛ') decreases with increasing frequency, while it increases with Co concentration. The log (ɛ'×f) versus log (f) graphs have been plotted to verify the universal dielectric response (UDR) model. All the samples follow the UDR model in the low-frequency range.
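For reference, the Scherrer equation used for the crystallite-size estimate is

```latex
D = \frac{K\,\lambda}{\beta\,\cos\theta},
```

where D is the crystallite size, K a shape factor (typically about 0.9), lambda the X-ray wavelength, beta the peak broadening (FWHM, in radians), and theta the Bragg angle; a shrinking crystallite size therefore shows up as increasing peak broadening in the diffraction patterns.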
Foam structure: from soap froth to solid foams.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraynik, Andrew Michael
2003-01-01
The properties of solid foams depend on their structure, which usually evolves in the fluid state as gas bubbles expand to form polyhedral cells. The characteristic feature of foam structure, randomly packed cells of different sizes and shapes, is examined in this article by considering soap froth. This material can be modeled as a network of minimal surfaces that divide space into polyhedral cells. The cell-level geometry of random soap froth is calculated with Brakke's Surface Evolver software. The distribution of cell volumes ranges from monodisperse to highly polydisperse. Topological and geometric properties, such as surface area and edge length, of the entire foam and of individual cells are discussed. The shape of struts in solid foams is related to Plateau borders in liquid foams and calculated for different volume fractions of material. The models of soap froth are used as templates to produce finite element models of open-cell foams. Three-dimensional images of open-cell foams obtained with x-ray microtomography allow virtual reconstruction of skeletal structures that compare well with the Surface Evolver simulations of soap-froth geometry.
BioContainers: an open-source and community-driven framework for software standardization.
da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset
2017-08-15
BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk.
Open-source meteor detection software for low-cost single-board computers
NASA Astrophysics Data System (ADS)
Vida, D.; Zubović, D.; Šegon, D.; Gural, P.; Cupec, R.
2016-01-01
This work aims to overcome the current price threshold of meteor stations, which can deter meteor enthusiasts from owning one. In recent years, small card-sized computers have become widely available and are used for numerous applications. To utilize such computers for meteor work, software which can run on them is needed. In this paper we present a detailed description of newly developed open-source software for fireball and meteor detection, optimized for running on low-cost single-board computers. Furthermore, an update is given on the development of automated open-source software which will handle video capture, fireball and meteor detection, astrometry and photometry.
Object-oriented productivity metrics
NASA Technical Reports Server (NTRS)
Connell, John L.; Eller, Nancy
1992-01-01
Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.
Software-Defined Radio for Space-to-Space Communications
NASA Technical Reports Server (NTRS)
Fisher, Ken; Jih, Cindy; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben A.; Fritz, Justin A.
2011-01-01
A paper describes the Space-to-Space Communications System (SSCS) Software-Defined Radio (SDR) research project to determine the most appropriate method for creating flexible and reconfigurable radios to implement wireless communications channels for space vehicles, so that fewer radios are required and commonality in hardware and software architecture can be leveraged for future missions. The ability to reconfigure the SDR through software enables one radio platform to be reconfigured to interoperate with many different waveforms. This means a reduction in the number of physical radio platforms necessary to support a space mission's communication requirements, thus decreasing the total size, weight, and power needed for a mission.
NASA Astrophysics Data System (ADS)
Oliveira, N. P.; Maciel, L.; Catarino, A. P.; Rocha, A. M.
2017-10-01
This work proposes the creation of surface models using parametric computer modelling software to obtain three-dimensional structures in weft knitted fabrics produced on single needle system machines. Digital prototyping, another feature of digital modelling software, was also explored in three-dimensional drawings generated using the Rhinoceros software. With this approach, different 3D structures were developed and produced. Physical characterization tests were then performed on the resulting 3D weft knitted structures to assess their ability to promote comfort. From the obtained results, it is apparent that the developed structures have potential for application in different market segments, such as clothing and interior textiles.
Proposed software system for atomic-structure calculation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischer, C.F.
1981-07-01
Atomic structure calculations are understood well enough that, at a routine level, an atomic structure software package can be developed. At the Atomic Physics Conference in Riga in 1978, L.V. Chernysheva and M.Y. Amusia of Leningrad University presented a paper on software for atomic calculations. Their system, called ATOM, is based on the Hartree-Fock approximation, and correlation is included within the framework of RPAE. Energy level calculations, transition probabilities, photo-ionization cross-sections, and electron scattering cross-sections are some of the physical properties that can be evaluated by their system. The MCHF method, together with CI techniques and the Breit-Pauli approximation, also provides a sound theoretical basis for atomic structure calculations.
Enhancement/upgrade of Engine Structures Technology Best Estimator (EST/BEST) Software System
NASA Technical Reports Server (NTRS)
Shah, Ashwin
2003-01-01
This report describes the work performed during the contract period and the capabilities included in the EST/BEST software system. The developed EST/BEST software system includes the integrated NESSUS, IPACS, COBSTRAN, and ALCCA computer codes required to perform the engine cycle mission and component structural analysis. Also, the interactive input generator for the NESSUS, IPACS, and COBSTRAN computer codes has been developed and integrated with the EST/BEST software system. The input generator allows the user to create input from scratch as well as edit existing input files interactively. Since it has been integrated with the EST/BEST software system, it enables the user to modify EST/BEST-generated files and perform the analysis to evaluate the benefits. Appendix A gives details of how to use the newly added features in the EST/BEST software system.
Flexible Software Architecture for Visualization and Seismic Data Analysis
NASA Astrophysics Data System (ADS)
Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.
2007-12-01
Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands, windows, and hot-key combinations to specific waveforms so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities, such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can run the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
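As a rough illustration of the plug-in mechanism described above (a generic sketch; the names and the host's actual .NET plug-in API are illustrative assumptions, not taken from the report):

```python
# Minimal plug-in registry sketch: signal-processing routines register
# themselves by name, and the host display invokes them on a waveform.
from typing import Callable, Dict, List

PluginFn = Callable[[List[float]], List[float]]
_REGISTRY: Dict[str, PluginFn] = {}

def register(name: str):
    """Decorator adding a routine to the host's processing menu."""
    def wrap(fn: PluginFn) -> PluginFn:
        _REGISTRY[name] = fn
        return fn
    return wrap

@register("demean")
def demean(samples: List[float]) -> List[float]:
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

def run(name: str, samples: List[float]) -> List[float]:
    # A real host would also log the call to the analyst's metadata file
    # so the processing sequence can be reconstructed later.
    return _REGISTRY[name](samples)

print(run("demean", [1.0, 2.0, 3.0]))  # -> [-1.0, 0.0, 1.0]
```

New processing modules then ship as self-registering files, leaving the platform itself untouched.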
Moser, Arvin; Pautler, Brent G
2016-05-15
The successful elucidation of an unknown compound's molecular structure often requires an analyst with profound knowledge and experience of advanced spectroscopic techniques, such as Nuclear Magnetic Resonance (NMR) spectroscopy and mass spectrometry. The implementation of Computer-Assisted Structure Elucidation (CASE) software in solving for unknown structures, such as isolated natural products and/or reaction impurities, can serve as both an elucidation and a teaching tool. As such, the introduction of CASE software with 112 exercises to train students, in conjunction with the traditional pen-and-paper approach, will strengthen their overall understanding of solving unknowns and let them explore various structural end points to quickly determine the validity of the results. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Ahmad, Anees
1990-01-01
The development of an in-house integrated optical performance modelling capability at MSFC is described. This performance model will take into account the effects of structural and thermal distortions, as well as metrology errors in optical surfaces, to predict the performance of large and complex optical systems, such as the Advanced X-Ray Astrophysics Facility. The necessary hardware and software were identified to implement an integrated optical performance model. A number of design, development, and testing tasks were supported to identify the debonded mirror pad and to rebuild the Technology Mirror Assembly. Over 300 samples of Zerodur were prepared in different sizes and shapes for acid etching, coating, and polishing experiments to characterize the subsurface damage and stresses produced by the grinding and polishing operations.
Software Performs Complex Design Analysis
NASA Technical Reports Server (NTRS)
2008-01-01
Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.
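Optimal Solutions' ASD algorithm is proprietary, but the generic idea it builds on, smoothly displacing mesh nodes by moving a few control points so that connectivity never changes and no remeshing is needed, can be sketched in one dimension with a Bernstein blend (an illustrative toy, not the product's method):

```python
# Toy free-form deformation: mesh nodes in [0, 1] are displaced by a smooth
# Bernstein-polynomial blend of control-point moves; the mesh connectivity
# is untouched, so no remeshing is required after a shape change.
from math import comb

def bernstein(i: int, n: int, t: float) -> float:
    return comb(n, i) * t**i * (1.0 - t)**(n - i)

def deform(nodes, control_moves):
    n = len(control_moves) - 1
    return [x + sum(d * bernstein(i, n, x) for i, d in enumerate(control_moves))
            for x in nodes]

# Five mesh nodes; nudge only the second of four control points.
print(deform([0.0, 0.25, 0.5, 0.75, 1.0], [0.0, 0.1, 0.0, 0.0]))
```

Each design change is then just a new set of control-point moves evaluated over the existing grid.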
Dynamic Weather Routes Architecture Overview
NASA Technical Reports Server (NTRS)
Eslami, Hassan; Eshow, Michelle
2014-01-01
Dynamic Weather Routes Architecture Overview presents the high-level software architecture of DWR, based on the CTAS software framework and the Direct-To automation tool. The document also covers external and internal data flows, required datasets, changes to the Direct-To software for DWR, collection of software statistics, and the code structure.
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.
2012-01-15
The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.
Formal Verification of Large Software Systems
NASA Technical Reports Server (NTRS)
Yin, Xiang; Knight, John
2010-01-01
We introduce a scalable proof structure to facilitate formal verification of large software systems. In our approach, we mechanically synthesize an abstract specification from the software implementation, match its static operational structure to that of the original specification, and organize the proof as the conjunction of a series of lemmas about the specification structure. By setting up a different lemma for each distinct element and proving each lemma independently, we obtain the important benefit that the proof scales easily for large systems. We present details of the approach and an illustration of its application on a challenge problem from the security domain.
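The lemma-per-element structure can be sketched in a proof assistant (a toy Lean 4 illustration with stand-in lemmas; the authors' actual specifications and toolchain are not shown in the abstract):

```lean
-- Each distinct specification element gets its own lemma, proved
-- independently; the top-level proof is just their conjunction,
-- so adding elements adds lemmas without enlarging existing proofs.
theorem lemma_read (n : Nat) : n + 0 = n := Nat.add_zero n

theorem lemma_write (n : Nat) : 0 + n = n := Nat.zero_add n

theorem spec_holds (n : Nat) : (n + 0 = n) ∧ (0 + n = n) :=
  ⟨lemma_read n, lemma_write n⟩
```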
Hu, Yue-Qing; Fung, Wing K
2003-08-01
The effect of a structured population on the likelihood ratio of a DNA mixture has been studied by the current authors and others. In practice, contributors of a DNA mixture may belong to different ethnic/racial origins, a situation especially common in multi-racial countries such as the USA and Singapore. We have developed computer software, available on the web, for evaluating DNA mixtures in multi-structured populations. The software can deal with various DNA mixture problems that cannot be handled by the methods given in a recent article by Fung and Hu.
Study on optimum length of raw material in stainless steel high-lock nuts forging
NASA Astrophysics Data System (ADS)
Cheng, Meiwen; Liu, Fenglei; Zhao, Qingyun; Wang, Lidong
2018-04-01
Taking 302 stainless steel (1Cr18Ni9) high-lock nuts as the research object, the length of the raw material was adjusted, the DEFORM software was used to simulate the isothermal forging process at each station, and corresponding field tests were conducted to study the effects of raw material size on the forming performance of stainless steel high-lock nuts. The tests show that the samples for each raw material length are essentially consistent with the DEFORM simulation results. When the length of the raw material is 10 mm, the appearance size of the parts meets the design requirements.
Age estimation using exfoliative cytology and radiovisiography: A comparative study
Nallamala, Shilpa; Guttikonda, Venkateswara Rao; Manchikatla, Praveen Kumar; Taneeru, Sravya
2017-01-01
Introduction: Age estimation is one of the essential factors in establishing the identity of an individual. Among various methods, exfoliative cytology (EC) is a unique, noninvasive technique, involving simple and pain-free collection of intact cells from the oral cavity for microscopic examination. Objective: The study was undertaken with an aim to estimate the age of an individual from the average cell size of their buccal smears, calculated using image-analysis morphometric software, and from the pulp–tooth area ratio in the mandibular canine of the same individual using radiovisiography (RVG). Materials and Methods: Buccal smears were collected from 100 apparently healthy individuals. After fixation in 95% alcohol, the smears were stained using Papanicolaou stain. The average cell size was measured using image analysis software (Image-Pro Insight 8.0). The RVG images of mandibular canines were obtained, pulp and tooth areas were traced using AutoCAD 2010 software, and the area ratio was calculated. The estimated age was then calculated using regression analysis. Results: The paired t-test between chronological age and estimated age by cell size and pulp–tooth area ratio was statistically nonsignificant (P > 0.05). Conclusion: In the present study, age estimated by pulp–tooth area ratio and EC yielded good results. PMID:29657491
A Microcomputer-Based Software Package for Eye-Monitoring Research. Technical Report No. 434.
ERIC Educational Resources Information Center
McConkie, George W.; And Others
A software package is described that collects and reduces eye behavior data (eye position and pupil size) using an IBM-PC compatible computer. Written in C language for speed and portability, it includes several features: (1) data can be simultaneously collected from other sources (such as electroencephalography and electromyography); (2)…
A Systematic Software, Firmware, and Hardware Codesign Methodology for Digital Signal Processing
2014-03-01
The report maps possible optimal leaf-nodes to design patterns for embedded system design. Software and hardware partitioning is a very difficult challenge in the field of embedded systems.
Users guide for FRCS: fuel reduction cost simulator software.
Roger D. Fight; Bruce R. Hartsough; Peter Noordijk
2006-01-01
The Fuel Reduction Cost Simulator (FRCS) spreadsheet application is public domain software used to estimate costs for fuel reduction treatments involving removal of trees of mixed sizes in the form of whole trees, logs, or chips from a forest. Equipment production rates were developed from existing studies. Equipment operating cost rates are from December 2002 prices...
NASA Astrophysics Data System (ADS)
Herbrechtsmeier, Stefan; Witkowski, Ulf; Rückert, Ulrich
Mobile robots are becoming more and more important in current research and education. Especially small ’on the table’ experiments attract interest, because they need no additional or special laboratory equipment. In this context, platforms are desirable that are small, simple to access, and relatively easy to program. An additional powerful information processing unit is advantageous to simplify the implementation of algorithms and the porting of software from desktop computers to the robot platform. In this paper we present a new versatile miniature robot that is ideally suited for research and education. The small size of the robot, about 9 cm edge length, its robust drive and its modular structure make the robot a general device for single- and multi-robot experiments executed ’on the table’. For programming and evaluation, the robot can be wirelessly connected via Bluetooth or WiFi. The operating system of the robot is based on the standard Linux kernel and the GNU C standard library. A Player/Stage model eases software development and testing.
NASA Astrophysics Data System (ADS)
Yusoff, Mohd Zairol; Mahmuddin, Massudi; Ahmad, Mazida
2016-08-01
Knowledge and skill are necessary to develop the capability of knowledge workers. However, there is very little understanding of what the necessary knowledge work (KW) is and how it influences the quality of knowledge work, or knowledge work productivity (KWP), in the software development process, including in small and medium-sized enterprises (SMEs). SMEs constitute a major part of the economy but have been relatively unsuccessful in developing KWP. Accordingly, this paper seeks to explore the dimensions of KWP that affect the quality of KW in the SME environment. First, based on an analysis of the existing literature, the key characteristics of KW productivity are defined. Second, a conceptual model is proposed which explores the dimensions of KWP and its quality. This study analyses data collected from 150 respondents involved in SMEs in Malaysia (based on [1]) and validates the model using structural equation modeling (SEM). The results provide an analysis of the effect of KWP on the quality of KW and business success, and have significant relevance for both research and practice in SMEs.
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1985-01-01
The dynamic analysis of complex structural systems using the finite element method and multilevel substructured models is presented. The fixed-interface method is selected for substructure reduction because of its efficiency, accuracy, and adaptability to restart and reanalysis. This method is extended to the reduction of substructures which are themselves composed of reduced substructures. The implementation and performance of the method in a general-purpose software system is emphasized. Solution algorithms consistent with the chosen data structures are presented. It is demonstrated that successful finite element software requires the use of software executives to supplement the algorithmic language. The complexity of the implementation of restart and reanalysis procedures illustrates the need for executive systems to support the noncomputational aspects of the software. It is shown that significant computational efficiencies can be achieved through proper use of substructuring and reduction techniques without sacrificing solution accuracy. The restart and reanalysis capabilities and the flexible procedures for multilevel substructured modeling give economical yet accurate analyses of complex structural systems.
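For context, the fixed-interface method referred to above is the classical Craig-Bampton reduction, in which interior degrees of freedom are expressed through static constraint modes and a truncated set of fixed-interface normal modes (standard textbook form, not quoted from the report):

```latex
\begin{Bmatrix} u_b \\ u_i \end{Bmatrix}
=
\begin{bmatrix} I & 0 \\ \Psi & \Phi \end{bmatrix}
\begin{Bmatrix} u_b \\ q \end{Bmatrix},
\qquad
\Psi = -K_{ii}^{-1} K_{ib},
```

where u_b are boundary (interface) DOFs, u_i interior DOFs, Psi the constraint modes, Phi the retained fixed-interface normal modes, and q their modal coordinates. Because the interface DOFs stay physical, reduced substructures can themselves be assembled and reduced again, which is the multilevel extension the report describes.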
Providing structural modules with self-integrity monitoring software user's manual
NASA Technical Reports Server (NTRS)
1990-01-01
National Aeronautics and Space Administration (NASA) Contract NAS7-961 (a Small Business Innovation Research (SBIR) contract from NASA) involved research dealing with remote structural damage detection using the concept of substructures. Several approaches were developed. The main two were: (1) the module (substructure) transfer function matrix (MTFM) approach; and (2) the modal strain energy distribution method (MSEDM). Either method can be used with a global structure; however, the focus was on substructures. As part of the research contract, computer software was to be developed to implement the developed methods. This was done, and the software was used to process all the finite-element-generated numerical data for the research. The software was written for the IBM AT personal computer, and copies were placed on floppy disks. This report serves as a user's manual for the two sets of damage detection software. Sections 2.0 and 3.0 discuss the use of the MTFM and MSEDM software, respectively.
Testing Software Development Project Productivity Model
NASA Astrophysics Data System (ADS)
Lipkin, Ilya
Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in its software development estimates, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with the resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on the specific constructs of interest that provide the best value for the least amount of time. This study outlines the key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains, such as IT, command and control, and simulation. This research validates findings from previous work concerning software project productivity and leverages those results in this study. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.
The Monotonic Lagrangian Grid for Rapid Air-Traffic Evaluation
NASA Technical Reports Server (NTRS)
Kaplan, Carolyn; Dahm, Johann; Oran, Elaine; Alexandrov, Natalia; Boris, Jay
2010-01-01
The Air Traffic Monotonic Lagrangian Grid (ATMLG) is presented as a tool to evaluate new air traffic system concepts. The model, based on an algorithm called the Monotonic Lagrangian Grid (MLG), can quickly sort, track, and update positions of many aircraft, both on the ground (at airports) and in the air. The underlying data structure is based on the MLG, which is used for sorting and ordering positions and other data needed to describe N moving bodies and their interactions. Aircraft that are close to each other in physical space are always near neighbors in the MLG data arrays, resulting in a fast nearest-neighbor interaction algorithm that scales as N. Recent upgrades to ATMLG include adding blank place-holders within the MLG data structure, which makes it possible to dynamically change the MLG size and also improves the quality of the MLG grid. Additional upgrades include adding FAA flight plan data, such as way-points and arrival and departure times from the Enhanced Traffic Management System (ETMS), and combining the MLG with the state-of-the-art strategic and tactical conflict detection and resolution algorithms from the NASA-developed Stratway software. In this paper, we present results from our early efforts to couple ATMLG with the Stratway software, and we demonstrate that it can be used to quickly simulate air traffic flow for a very large ETMS dataset.
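The MLG ordering itself is only alluded to above; a minimal two-dimensional illustration of a monotonic sort (columns ordered by x, entries within each column ordered by y; sizes and names are illustrative) might look like:

```python
# Toy 2-D Monotonic Lagrangian Grid: nx*ny bodies are sorted so that array
# index i increases with x (column by column) and index j increases with y
# inside each column. Spatial neighbors land at nearby indices, so local
# interactions can be found by scanning a few adjacent array entries.
import random

def mlg_sort_2d(points, nx, ny):
    assert len(points) == nx * ny
    by_x = sorted(points, key=lambda p: p[0])            # order columns by x
    return [sorted(by_x[c * ny:(c + 1) * ny], key=lambda p: p[1])
            for c in range(nx)]                          # order within column by y

pts = [(random.random(), random.random()) for _ in range(16)]
grid = mlg_sort_2d(pts, 4, 4)
# Neighbor candidates of grid[i][j] are the entries at indices i±1, j±1.
```

Re-sorting after each time step keeps the monotonic property as aircraft move, which is what lets the N-body bookkeeping scale as N.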
TopoGromacs: Automated Topology Conversion from CHARMM to GROMACS within VMD.
Vermaas, Josh V; Hardy, David J; Stone, John E; Tajkhorshid, Emad; Kohlmeyer, Axel
2016-06-27
Molecular dynamics (MD) simulation engines use a variety of different approaches for modeling molecular systems with force fields that govern their dynamics and describe their topology. These different approaches introduce incompatibilities between engines, and previously published software bridges the gaps between many popular MD packages, such as between CHARMM and AMBER or GROMACS and LAMMPS. While there are many structure building tools available that generate topologies and structures in CHARMM format, only recently have mechanisms been developed to convert their results into GROMACS input. We present an approach to convert CHARMM-formatted topology and parameters into a format suitable for simulation with GROMACS by expanding the functionality of TopoTools, a plugin integrated within the widely used molecular visualization and analysis software VMD. The conversion process was diligently tested on a comprehensive set of biological molecules in vacuo. The resulting comparison between energy terms shows that the translation performed was lossless as the energies were unchanged for identical starting configurations. By applying the conversion process to conventional benchmark systems that mimic typical modestly sized MD systems, we explore the effect of the implementation choices made in CHARMM, NAMD, and GROMACS. The newly available automatic conversion capability breaks down barriers between simulation tools and user communities and allows users to easily compare simulation programs and leverage their unique features without the tedium of constructing a topology twice.
The Effects of Size and Type of Vocal Fold Polyp on Some Acoustic Voice Parameters.
Akbari, Elaheh; Seifpanahi, Sadegh; Ghorbani, Ali; Izadi, Farzad; Torabinezhad, Farhad
2018-03-01
Vocal abuse and misuse can result in vocal fold polyps. Certain features define the extent of the effects of vocal fold polyps on acoustic voice parameters. The present study aimed to define the effects of polyp size on acoustic voice parameters, and to compare these parameters in hemorrhagic and non-hemorrhagic polyps. In the present retrospective study, 28 individuals with hemorrhagic or non-hemorrhagic polyps of the true vocal folds were recruited to investigate acoustic voice parameters of the vowel /æ/ computed by the Praat software. The data were analyzed using the SPSS software, version 17.0. According to the type and size of polyps, mean acoustic differences and correlations were analyzed by the statistical t-test and Pearson correlation test, respectively, with a significance level below 0.05. The results indicated that jitter and the harmonics-to-noise ratio had a significant positive and negative correlation with polyp size (P=0.01), respectively. In addition, both mentioned parameters were significantly different between the two types of polyps investigated. Both the type and size of polyps have effects on acoustic voice characteristics. In the present study, a novel method to measure polyp size was introduced. Further confirmation of this method as a tool to compare polyp sizes requires additional investigations.
A dictionary based informational genome analysis
2012-01-01
Background: In the post-genomic era several methods of computational genomics are emerging to understand how the whole information is structured within genomes. The literature of the last five years accounts for several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes. Results: Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces local sequence analysis. A software prototype applying the methodology outlined here carried out computations on genomic data. We computed informational indexes and built genomic dictionaries of different sizes, along with frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. The validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of keen interest in biology (for example, to detect over-represented functional sequences such as promoters), was discussed, and suggested a method to define synthetic genetic networks. Conclusions: We introduced a methodology based on dictionaries, and an efficient motif-finding software application for comparative genomics. This approach could be extended along many investigation lines, namely exported to other contexts of computational genomics, as a basis for discrimination of genomic pathologies. PMID:22985068
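As a toy illustration of the dictionary idea (generic k-mer counting plus one entropy-style index; the paper's specific informational indexes are defined there, not here):

```python
# Toy genomic dictionary: the multiset of k-mers occurring in a sequence,
# their empirical frequencies, and a simple entropy-style index over them.
from collections import Counter
from math import log2

def kmer_dictionary(genome: str, k: int) -> Counter:
    return Counter(genome[i:i + k] for i in range(len(genome) - k + 1))

def empirical_entropy(dictionary: Counter) -> float:
    total = sum(dictionary.values())
    return -sum((n / total) * log2(n / total) for n in dictionary.values())

seq = "ACGTACGTGGT"
d = kmer_dictionary(seq, 3)
print(d.most_common(2))       # most frequent 3-mers, i.e., short repeats
print(empirical_entropy(d))   # one simple informational index
```

Scanning such indexes across dictionary sizes k is what lets a systemic, alignment-free view replace local sequence comparison.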
Initial Ada components evaluation
NASA Technical Reports Server (NTRS)
Moebes, Travis
1989-01-01
SAIC has the responsibility for independent test and validation of the SSE. They have been using a mathematical functions library package implemented in Ada to test the SSE IV and V process. The library package consists of elementary mathematical functions and is both machine and accuracy independent. The SSE Ada components evaluation includes code complexity metrics based on Halstead's software science metrics and McCabe's measure of cyclomatic complexity. Halstead's metrics are based on the operators and operands in a logical unit of code and are compiled from the number of distinct operators, the number of distinct operands, and the total numbers of occurrences of operators and operands. These metrics give an indication of the physical size of a program in terms of operators and operands and are used diagnostically to point to potential problems. McCabe's Cyclomatic Complexity Metric (CCM) is compiled from flow charts transformed to equivalent directed graphs; the CCM is a measure of the total number of linearly independent paths through the code's control structure. These metrics were computed for the Ada mathematical functions library using Software Automated Verification and Validation (SAVVAS), the SSE IV and V tool. A table with selected results was shown, indicating that most of these routines are of good quality. Thresholds for the Halstead measures indicate poor quality if the length metric exceeds 260 or the difficulty is greater than 190. The McCabe CCM indicated a high quality of software products.
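For reference, the textbook definitions behind these metrics (the thresholds quoted above are the report's own):

```latex
n = n_1 + n_2, \qquad N = N_1 + N_2, \qquad V = N \log_2 n, \qquad
D = \frac{n_1}{2}\cdot\frac{N_2}{n_2},
```

for n_1, n_2 distinct operators and operands and N_1, N_2 their total occurrences (vocabulary n, length N, volume V, difficulty D); and, for a control-flow graph with E edges, V_n nodes, and P connected components, the cyclomatic complexity

```latex
v(G) = E - V_n + 2P.
```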
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polsdofer, E; Crilly, R
Purpose: This study investigates the effect of eye size and eccentricity on doses to critical tissues by simulating doses in the Plaque Simulator (v. 6.3.1) software. Present OHSU plaque brachytherapy treatment focuses on delivering radiation to the tumor measured with ocular ultrasound plus a small margin and assumes the orbit has the dimensions of a "standard eye." Accurately modeling the dimensions of the orbit requires a high-resolution ocular CT. This study quantifies how standard differences in equatorial diameters and eccentricity affect calculated doses to critical structures, in order to ask whether adding the CT scan to the treatment planning process is justified. Methods: Tumors of 10 mm × 10 mm × 5 mm were modeled at the 12:00:00 hour with a latitude of 45 degrees. Right eyes were modeled at a number of equatorial diameters from 17.5 to 28 mm for each of the standard non-notched COMS plaques with silastic inserts. The COMS plaques were fully loaded with uniform activity, centered on the tumor, and prescribed to a common tumor dose (85 Gy/100 hours). Variations in the calculated doses to normal structures were examined to see if the changes were significant. Results: The calculated doses to normal structures show a marked dependence on eye geometry. This is exemplified by the fovea dose, which more than doubled in the smaller eyes and nearly halved in the larger model. An additional significant dependence on plaque size was found in the calculated dose, in spite of all plaques giving the same dose to the prescription point. Conclusion: The variation in dose with eye dimension fully justifies the addition of a high-resolution ocular CT to the planning technique. Additional attention must be paid to plaque size, beyond simply covering the tumor, when considering normal tissue dose.
The impact of software quality characteristics on healthcare outcome: a literature review.
Aghazadeh, Sakineh; Pirnejad, Habibollah; Moradkhani, Alireza; Aliev, Alvosat
2014-01-01
The aim of this study was to discover the effect of software quality characteristics on healthcare quality and efficiency indicators. Through a systematic literature review, we selected and analyzed 37 original research papers to investigate the impact of software indicators (drawn from the standard ISO 9126 quality characteristics and sub-characteristics) on several important healthcare outcome indicators, and finally ranked these software indicators. The results showed that the software characteristics usability, reliability and efficiency were mostly favored in the studies, indicating their importance. On the other hand, user satisfaction, quality of patient care, clinical workflow efficiency, providers' communication and information exchange, patient satisfaction and care costs were among the healthcare outcome indicators frequently evaluated in relation to the mentioned software characteristics. Logistic regression was the most common assessment methodology, and Confirmatory Factor Analysis and Structural Equation Modeling were performed to test the structural model's fit. The software characteristics were considered to impact the healthcare outcome indicators through other intermediate factors (variables).
Aggregation in organic light emitting diodes
NASA Astrophysics Data System (ADS)
Meyer, Abigail
Organic light emitting diode (OLED) technology has great potential for becoming a solid-state lighting source. However, there are inefficiencies in OLED devices that need to be understood. Since these inefficiencies occur on a nanometer scale, there is a need for structural data on this length scale in three dimensions, which has been unattainable until now. The Local Electrode Atom Probe (LEAP), a specific implementation of Atom Probe Tomography (APT), is used in this work to acquire morphology data in three dimensions on a nanometer scale with much better chemical resolution than previously seen. Before analyzing LEAP data, simulations were used to investigate how detector efficiency, sample size and cluster size affect data analysis, which is done using radial distribution functions (RDFs). Data are reconstructed using the LEAP software, which provides mass and position data. Two samples were then analyzed, 3% DCM2 in C60 and 2% DCM2 in Alq3. Analysis of both samples indicated that little to no clustering was present in this system.
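The RDF analysis mentioned above can be sketched generically (a toy sketch for point data, ignoring boundary corrections; not the dissertation's actual analysis code):

```python
# Toy radial distribution function g(r): histogram of pairwise distances
# normalized by the count expected for an ideal (uniform) gas. In atom-probe
# cluster analysis, g(r) > 1 at small r signals aggregation of the dopant.
import numpy as np

def rdf(points: np.ndarray, box: float, dr: float = 0.05, rmax: float = 1.0):
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]                     # unique pairs only
    edges = np.arange(0.0, rmax + dr, dr)
    counts, _ = np.histogram(d, bins=edges)
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    ideal = 0.5 * n * (n / box**3) * shell             # expected pair counts
    return edges[:-1] + dr / 2, counts / ideal         # r, g(r)

pts = np.random.rand(500, 3)   # uniform points in a unit box: g(r) near 1
r, g = rdf(pts, box=1.0)
```

A limited detector efficiency thins the point cloud at random, which leaves g(r) unchanged in expectation but raises its noise floor, one of the effects the simulations probe.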
An Analysis of Scalable GPU-Based Ray-Guided Volume Rendering
Fogal, Thomas; Schiewe, Alexander; Krüger, Jens
2014-01-01
Volume rendering continues to be a critical method for analyzing large-scale scalar fields, in disciplines as diverse as biomedical engineering and computational fluid dynamics. Commodity desktop hardware has struggled to keep pace with data size increases, challenging modern visualization software to deliver responsive interactions for O(N³) algorithms such as volume rendering. We target the data type common in these domains: regularly-structured data. In this work, we demonstrate that the major limitation of most volume rendering approaches is their inability to switch the data sampling rate (and thus data size) quickly. Using a volume renderer inspired by recent work, we demonstrate that the actual amount of visualizable data for a scene is typically bounded considerably below the memory available on a commodity GPU. Our instrumented renderer is used to investigate design decisions typically swept under the rug in volume rendering literature. The renderer is freely available, with binaries for all major platforms as well as full source code, to encourage reproduction and comparison with future research. PMID:25506079
General software design for multisensor data fusion
NASA Astrophysics Data System (ADS)
Zhang, Junliang; Zhao, Yuming
1999-03-01
In this paper a general method of software design for multisensor data fusion is discussed in detail, which adopts object-oriented technology under the UNIX operating system. The software for multisensor data fusion is divided into six functional modules: data collection, database management, GIS, target display and alarming, data simulation, etc. Furthermore, the primary function, the components, and some realization methods of each module are given. The interfaces among these functional modules are discussed. The data exchange among the functional modules is performed by interprocess communication (IPC), including message queues, semaphores, and shared memory. Thus, each functional module is executed independently, which reduces the dependence among functional modules and helps software programming and testing. The software for multisensor data fusion is designed as a hierarchical structure using the inheritance property of classes. Each functional module is abstracted and encapsulated through a class structure, which avoids software redundancy and enhances readability.
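The decoupling idea in this abstract can be illustrated compactly. Below is a minimal sketch in Python rather than the paper's C++/UNIX System V setting; the module names (`data_collection`, `target_display`) and message format are hypothetical stand-ins, and a message queue represents the IPC channel so that each module runs, and can be tested, as an independent process.

```python
# Minimal sketch (Python analogue, not the paper's code) of two functional
# modules exchanging data only through a message queue.
from multiprocessing import Process, Queue

def data_collection(out_q):
    # stand-in for a sensor-reading module
    for i in range(3):
        out_q.put({"sensor": "radar", "track_id": i, "pos": (float(i), 2.0)})
    out_q.put(None)  # sentinel: end of data

def target_display(in_q):
    # stand-in for the display-and-alarm module
    while (msg := in_q.get()) is not None:
        print("display:", msg)

if __name__ == "__main__":
    q = Queue()
    procs = [Process(target=data_collection, args=(q,)),
             Process(target=target_display, args=(q,))]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```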
The optimal community detection of software based on complex networks
NASA Astrophysics Data System (ADS)
Huang, Guoyan; Zhang, Peng; Zhang, Bing; Yin, Tengteng; Ren, Jiadong
2016-02-01
The community structure is important for software in terms of understanding design patterns and controlling the development and maintenance process. In order to detect the optimal community structure in a software network, a method called Optimal Partition Software Network (OPSN) is proposed based on the dependency relationships among the software functions. First, by analyzing the information of multiple execution traces of one software system, we construct a Software Execution Dependency Network (SEDN). Second, based on the relationships among the function nodes in the network, we define Fault Accumulation (FA) to measure the importance of each function node and sort the nodes by the measured results. Third, we select the top K (K=1,2,…) nodes as the cores of the primal communities (each initially containing only its core node). By comparing the dependency relationships between each node and the K communities, we put the node into the existing community with which it has the closest relationship. Finally, we calculate the modularity with different initial K to obtain the optimal division. With experiments, the method OPSN is verified to be efficient at detecting the optimal community structure in various software systems.
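As a rough illustration of the seed-and-assign loop described above (a simplified reading, not the authors' implementation), the following sketch attaches each node to the seed community it shares the most edges with and keeps the K that maximizes modularity. The FA scores are assumed to be precomputed and supplied as a dict, and networkx's modularity stands in for the paper's quality measure.

```python
# Sketch of an OPSN-style partition: seed with top-K FA-ranked nodes,
# greedily attach remaining nodes, keep the K with the best modularity.
import networkx as nx
from networkx.algorithms.community import modularity

def opsn_partition(G, fa_scores, k_max=10):
    ranked = sorted(G.nodes, key=lambda n: fa_scores[n], reverse=True)
    best_q, best_parts = float("-inf"), None
    for k in range(1, min(k_max, len(ranked)) + 1):
        seeds = set(ranked[:k])
        parts = [{s} for s in ranked[:k]]
        for node in G.nodes:
            if node in seeds:
                continue
            # attach the node to the seed community it shares most edges with
            ties = [sum(1 for nb in G.neighbors(node) if nb in p) for p in parts]
            parts[ties.index(max(ties))].add(node)
        q = modularity(G, parts)
        if q > best_q:
            best_q, best_parts = q, parts
    return best_parts, best_q
```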
Next generation lightweight mirror modeling software
NASA Astrophysics Data System (ADS)
Arnold, William R.; Fitzgerald, Matthew; Rosa, Rubin Jaca; Stahl, H. Philip
2013-09-01
The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the frit bonding process, and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique: single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 3-5 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS, and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any text editor; all the shell thickness parameters and suspension spring rates are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS. This makes integration of these models into larger telescope or satellite models easier.
NASA Astrophysics Data System (ADS)
Lane, R. J. L.
2015-12-01
At Geoscience Australia, we are upgrading our gravity and magnetic modeling tools to provide new insights into the composition, properties, and structure of the subsurface. The scale of the investigations varies from the size of tectonic plates to the size of a mineral prospect. To accurately model potential field data at all of these scales, we require modeling software that can operate in both spherical and Cartesian coordinate frameworks. The models are in the form of a mesh, with spherical prismatic (tesseroid) elements for spherical coordinate models of large volumes, and rectangular prisms for smaller volumes evaluated in a Cartesian coordinate framework. The software can compute the forward response of supplied rock property models and can perform inversions using constraints that vary from weak generic smoothness through to very specific reference models compiled from various types of "hard facts" (i.e., surface mapping, drilling information, crustal seismic interpretations). To operate efficiently, the software is being specifically developed to make use of the resources of the National Computational Infrastructure (NCI) at the Australian National University (ANU). The development of these tools is being carried out in collaboration with researchers from the Colorado School of Mines (CSM) and the China University of Geosciences (CUG) and is at the stage of advanced testing. The creation of individual 3D geological models will provide immediate insights. Users will also be able to combine models, either by stitching them together or by nesting smaller and more detailed models within a larger model. Comparison of the potential field response of a composite model with the observed fields will give users a sense of how comprehensively these models account for the observations. Users will also be able to model the residual fields (i.e., the observed minus calculated response) to discover features that are not represented in the input composite model.
RAPTR-SV: a hybrid method for the detection of structural variants
USDA-ARS?s Scientific Manuscript database
Motivation: Identification of Structural Variants (SV) in sequence data results in a large number of false positive calls using existing software, which overburdens subsequent validation. Results: Simulations using RAPTR-SV and another software package that uses a similar algorithm for SV detection...
NASA Astrophysics Data System (ADS)
Carrete, Jesús; Vermeersch, Bjorn; Katre, Ankita; van Roekeghem, Ambroise; Wang, Tao; Madsen, Georg K. H.; Mingo, Natalio
2017-11-01
almaBTE is a software package that solves the space- and time-dependent Boltzmann transport equation for phonons, using only ab-initio calculated quantities as inputs. The program can predictively tackle phonon transport in bulk crystals and alloys, thin films, superlattices, and multiscale structures with size features in the nm-μm range. Among many other quantities, the program can output thermal conductances and effective thermal conductivities, space-resolved average temperature profiles, and heat-current distributions resolved in frequency and space. Its first-principles character makes almaBTE especially well suited to investigate novel materials and structures. This article gives an overview of the program structure and presents illustrative examples for some of its uses. PROGRAM SUMMARY Program Title: almaBTE. Program Files doi: http://dx.doi.org/10.17632/8tfzwgtp73.1. Licensing provisions: Apache License, version 2.0. Programming language: C++. External routines/libraries: BOOST, MPI, Eigen, HDF5, spglib. Nature of problem: Calculation of temperature profiles, thermal flux distributions and effective thermal conductivities in structured systems where heat is carried by phonons. Solution method: Solution of the linearized phonon Boltzmann transport equation, variance-reduced Monte Carlo.
Benefits of Matching Domain Structure for Planning Software: The Right Stuff
NASA Technical Reports Server (NTRS)
Billman, Dorrit Owen; Arsintescu, Lucica; Feary, Michael S.; Lee, Jessica Chia-Rong; Smith, Asha Halima; Tiwary, Rachna
2011-01-01
We investigated the role of domain structure in software design. We compared two planning applications for a Mission Control group (International Space Station) and measured users' speed and accuracy. Based on our needs analysis, we identified domain structure and used this to develop new prototype software that matched domain structure better than the legacy system. We took a high-fidelity analog of the natural task into the laboratory and found large performance differences favoring the system that matched domain structure. Our task design enabled us to attribute the better performance to the better match of domain structure. We ran through the whole development cycle, in miniature, from needs analysis through design, development, and evaluation. Doing so enabled inferences not just about the particular systems compared, but also provided evidence for the viability of the design process (particularly needs analysis) that we are exploring.
Computational methods and software systems for dynamics and control of large space structures
NASA Technical Reports Server (NTRS)
Park, K. C.; Felippa, C. A.; Farhat, C.; Pramono, E.
1990-01-01
This final report on computational methods and software systems for dynamics and control of large space structures covers progress to date, projected developments in the final months of the grant, and conclusions. Pertinent reports and papers that have not appeared in scientific journals (or have not yet appeared in final form) are enclosed. The grant has supported research in two key areas of crucial importance to the computer-based simulation of large space structures. The first area involves multibody dynamics (MBD) of flexible space structures, with applications directed to deployment, construction, and maneuvering. The second area deals with advanced software systems, with emphasis on parallel processing. The latest research thrust in the second area, as reported here, involves massively parallel computers.
Information models of software productivity - Limits on productivity growth
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1992-01-01
Research into generalized information-metric models of software process productivity establishes quantifiable behavior and theoretical bounds. The models establish a fundamental mathematical relationship between software productivity and the human capacity for information traffic, the software product yield (system size), information efficiency, and tool and process efficiencies. An upper bound is derived that quantifies average software productivity and the maximum rate at which it may grow. This bound reveals that ultimately, when tools, methodologies, and automated assistants have reached their maximum effective state, further improvement in productivity can only be achieved through increasing software reuse. The reuse advantage is shown not to increase faster than logarithmically in the number of reusable features available. The reuse bound is further shown to be somewhat dependent on the reuse policy: a general 'reuse everything' policy can lead to a somewhat slower productivity growth than a specialized reuse policy.
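The logarithmic reuse bound described above can be stated compactly. The LaTeX fragment below is a schematic rendering of the claim; the symbols (P for productivity, P_0 for the saturated baseline, c > 0 a constant, n the number of reusable features) are illustrative notation, not the paper's.

```latex
% Schematic form of the reuse bound (notation illustrative, not the paper's):
% once tool and process efficiencies saturate, reuse-driven productivity
% gains grow at most logarithmically in the number n of reusable features.
\[
  P(n) \;\le\; P_0 \,\bigl(1 + c \ln n\bigr),
  \qquad
  \frac{dP}{dn} \;=\; O\!\left(\frac{1}{n}\right).
\]
```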
Ashford, Paul; Moss, David S; Alex, Alexander; Yeap, Siew K; Povia, Alice; Nobeli, Irene; Williams, Mark A
2012-03-14
Protein structures provide a valuable resource for rational drug design. For a protein with no known ligand, computational tools can predict surface pockets that are of suitable size and shape to accommodate a complementary small-molecule drug. However, pocket prediction against single static structures may miss features of pockets that arise from proteins' dynamic behaviour. In particular, ligand-binding conformations can be observed as transiently populated states of the apo protein, so it is possible to gain insight into ligand-bound forms by considering conformational variation in apo proteins. This variation can be explored by considering sets of related structures: computationally generated conformers, solution NMR ensembles, multiple crystal structures, homologues or homology models. It is non-trivial to compare pockets, either from different programs or across sets of structures. For a single structure, difficulties arise in defining a particular pocket's boundaries. For a set of conformationally distinct structures the challenge is how to make reasonable comparisons between them given that a perfect structural alignment is not possible. We have developed a computational method, Provar, that provides a consistent representation of predicted binding pockets across sets of related protein structures. The outputs are probabilities that each atom or residue of the protein borders a predicted pocket. These probabilities can be readily visualised on a protein using existing molecular graphics software. We show how Provar simplifies comparison of the outputs of different pocket prediction algorithms, of pockets across multiple simulated conformations, and between homologous structures. We demonstrate the benefits of using multiple structures for protein-ligand and protein-protein interface analysis on a set of complexes and consider three case studies in detail: i) analysis of a kinase superfamily highlights the conserved occurrence of surface pockets at the active and regulatory sites; ii) a simulated ensemble of unliganded Bcl2 structures reveals extensions of a known ligand-binding pocket not apparent in the apo crystal structure; iii) visualisations of interleukin-2 and its homologues highlight conserved pockets at the known receptor interfaces and regions whose conformation is known to change on inhibitor binding. Through post-processing of the output of a variety of pocket prediction software, Provar provides a flexible approach to the analysis and visualization of the persistence or variability of pockets in sets of related protein structures.
Treatment delivery software for a new clinical grade ultrasound system for thermoradiotherapy.
Novák, Petr; Moros, Eduardo G; Straube, William L; Myerson, Robert J
2005-11-01
A detailed description of a clinical grade Scanning Ultrasound Reflector Linear Array System (SURLAS) applicator was given in a previous paper [Med. Phys. 32, 230-240 (2005)]. In this paper we concentrate on the design, development, and testing of the personal computer (PC)-based treatment delivery software that runs the therapy system. The SURLAS requires coordinated interaction between the therapy applicator and several peripheral devices for its proper and safe operation. One of the most important tasks was the coordination of the input power sequences for the elements of two parallel opposed ultrasound arrays (eight 1.5 cm x 2 cm elements per array; arrays 1 and 2 operate at 1.9 and 4.9 MHz, respectively) with the position of a dual-face scanning acoustic reflector. To achieve this, the treatment delivery software can divide the applicator's treatment window into up to 64 sectors (minimum size of 2 cm x 2 cm), and control the power to each sector independently by adjusting the power output levels from the channels of a 16-channel radio-frequency generator. The software coordinates the generator outputs with the position of the reflector as it scans back and forth between the arrays. Individual sector control and dual frequency operation allow the SURLAS to adjust power deposition in three dimensions to superficial targets coupled to its treatment window. The treatment delivery software also monitors and logs several parameters, such as temperatures acquired using a 16-channel thermocouple thermometry unit. Safety (in particular to patients) was the paramount concern and design criterion. Failure mode and effects analysis (FMEA) was applied to the applicator as well as to the entire therapy system in order to identify safety issues and rank their relative importance. This analysis led to the implementation of several safety mechanisms and a software structure in which each device communicates with the controlling PC independently of the others. In case of a malfunction in any part of the system or a violation of a user-defined safety criterion based on temperature readings, the software terminates treatment immediately and the user is notified. The software development process, consisting of problem analysis, design, implementation, and testing, is presented in this paper. Once the software was finished and integrated with the hardware, the therapy system was extensively tested. Results demonstrated that the software operates the SURLAS as intended with minimum risk to future patients.
Contour Digitizing and Tagging Software (CONTAGRID).
1980-04-01
Instant Grainification: Real-Time Grain-Size Analysis from Digital Images in the Field
NASA Astrophysics Data System (ADS)
Rubin, D. M.; Chezar, H.
2007-12-01
Over the past few years, digital cameras and underwater microscopes have been developed to collect in-situ images of sand-sized bed sediment, and software has been developed to measure grain size from those digital images (Chezar and Rubin, 2004; Rubin, 2004; Rubin et al., 2006). Until now, all image processing and grain-size analysis was done back in the office, where images were uploaded from cameras and processed on desktop computers. Computer hardware has become small and rugged enough to process images in the field, which for the first time allows real-time grain-size analysis of sand-sized bed sediment. We present such a system consisting of a weatherproof tablet computer, open-source image-processing software (autocorrelation code of Rubin, 2004, running under Octave and Cygwin), and a digital camera with macro lens. Chezar, H., and Rubin, D., 2004, Underwater microscope system: U.S. Patent and Trademark Office, patent number 6,680,795, January 20, 2004. Rubin, D.M., 2004, A simple autocorrelation algorithm for determining grain size from digital images of sediment: Journal of Sedimentary Research, v. 74, p. 160-165. Rubin, D.M., Chezar, H., Harney, J.N., Topping, D.J., Melis, T.S., and Sherwood, C.R., 2006, Underwater microscope for measuring spatial and temporal changes in bed-sediment grain size: USGS Open-File Report 2006-1360.
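The autocorrelation approach cited above (Rubin, 2004) admits a compact illustration. The sketch below is a simplified reading of the method, not the published Octave code: it computes the correlation of an image with shifted copies of itself, which decays more slowly with offset for coarser sediment; converting the curve to grain size in millimeters requires calibration images of known size (not shown).

```python
# Simplified sketch of an image autocorrelation curve for grain sizing.
import numpy as np

def autocorrelation_curve(img, max_offset=30):
    """Correlation coefficient vs. horizontal pixel offset."""
    img = img.astype(float)
    img = img - img.mean()
    curve = []
    for k in range(1, max_offset + 1):
        a = img[:, :-k].ravel()  # image
        b = img[:, k:].ravel()   # image shifted by k pixels
        curve.append(np.corrcoef(a, b)[0, 1])
    return np.array(curve)
```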
Viability of Cross-Flow Fan with Helical Blades for Vertical Take-off and Landing Aircraft
2012-09-01
Using the computational fluid dynamics (CFD) software ANSYS CFX, a three-dimensional (3-D) straight-bladed model was validated against a previous study's experimental results.
Selective Guide to Literature on Software Review Sources. Engineering Literature Guides, Number 8.
ERIC Educational Resources Information Center
Bean, Margaret H., Ed.
This selective literature guide serves as a directory to software evaluation sources for all sizes of microcomputers. Information is provided on review sources and guides which deal with a variety of applications such as library, engineering, school, and business as well as a variety of systems, including DOS and CP/M. This document is intended to…
Users guide for STHARVEST: software to estimate the cost of harvesting small timber.
Roger D. Fight; Xiaoshan Zhang; Bruce R. Hartsough
2003-01-01
The STHARVEST computer application is Windows-based, public-domain software used to estimate costs for harvesting small-diameter stands or the small-diameter component of a mixed-sized stand. The equipment production rates were developed from existing studies. Equipment operating cost rates were based on November 1998 prices for new equipment and wage rates for the...
ERIC Educational Resources Information Center
Bruton, Samuel; Childers, Dan
2016-01-01
Recently, the usage of plagiarism detection software such as Turnitin® has increased dramatically among university instructors. At the same time, academic criticism of this software's employment has also increased. We interviewed 23 faculty members from various departments at a medium-sized, public university in the southeastern US to determine…
ERIC Educational Resources Information Center
Stivers, Jan; Garrity, N. B.
2004-01-01
When a mid-sized public college made a politically unpopular decision to purchase new student information system software, a team of fourteen people from across campus was assembled and charged with facilitating the transition from the home-grown system. This case report describes the challenges they faced as they worked to understand their…
Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko
2014-12-01
To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study proposes a method, developed from our prototypes, to detect collisions and examine the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system included the basic molecular modeling environment, collision detection in the molecular models, and physical simulations of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and interpreted scripting languages, the functions required for accurate and meaningful molecular animation were implemented efficiently.
Structural Analysis Using NX Nastran 9.0
NASA Technical Reports Server (NTRS)
Rolewicz, Benjamin M.
2014-01-01
NX Nastran is a powerful Finite Element Analysis (FEA) software package used to solve linear and non-linear models for structural and thermal systems. The software, which consists of both a solver and user interface, breaks down analysis into four files, each of which are important to the end results of the analysis. The software offers capabilities for a variety of types of analysis, and also contains a respectable modeling program. Over the course of ten weeks, I was trained to effectively implement NX Nastran into structural analysis and refinement for parts of two missions at NASA's Kennedy Space Center, the Restore mission and the Orion mission.
Application of 3-Dimensional Printing Technology to Construct an Eye Model for Fundus Viewing Study
Xie, Ping; Hu, Zizhong; Zhang, Xiaojun; Li, Xinhua; Gao, Zhishan; Yuan, Dongqing; Liu, Qinghuai
2014-01-01
Objective: To construct a life-sized eye model using three-dimensional (3D) printing technology for fundus viewing study of the viewing system. Methods: We devised our schematic model eye based on Navarro's eye and redesigned some parameters because of the change of the corneal material and the implantation of intraocular lenses (IOLs). Optical performance of our schematic model eye was compared with Navarro's schematic eye and two other reported physical model eyes using the ZEMAX optical design software. With computer aided design (CAD) software, we designed the 3D digital model of the main structure of the physical model eye, which was used for three-dimensional (3D) printing. Together with the main printed structure, a polymethyl methacrylate (PMMA) aspherical cornea, a variable iris, and IOLs were assembled into a physical eye model. Angle scale bars were glued from the posterior to the periphery of the retina. We then fabricated three other physical models with different states of ametropia. Optical parameters of these physical eye models were measured to verify the 3D printing accuracy. Results: In on-axis calculations, our schematic model eye possessed a similar size of spot diagram compared with Navarro's and Bakaraju's model eyes, much smaller than Arianpour's model eye. Moreover, the spherical aberration of our schematic eye was much less than that of the other three model eyes. In off-axis simulation, it possessed slightly higher coma and similar astigmatism, field curvature, and distortion. The MTF curves showed that all the model eyes diminished in resolution with increasing field of view, and the diminishing tendency of resolution of our physical eye model was similar to that of Navarro's eye. The measured parameters of our eye models with different states of ametropia were in line with the theoretical values. Conclusions: The schematic eye model we designed can well simulate the optical performance of the human eye, and the fabricated physical one can be used as a tool in fundus range viewing research. PMID:25393277
Software support for improving technology infusion
NASA Technical Reports Server (NTRS)
Feather, M. S.; Hicks, K. A.; Johnson, K. R.; Cornford, S. L.
2003-01-01
This paper focuses on describing the custom software tool, DDP, that was developed to support the TIMA process, and on showing how the needs of the TIMA process have influenced the development of the structure and capabilities of the DDP software.
Correlation analysis of fracture arrangement in space
NASA Astrophysics Data System (ADS)
Marrett, Randall; Gale, Julia F. W.; Gómez, Leonel A.; Laubach, Stephen E.
2018-03-01
We present new techniques that overcome limitations of standard approaches to documenting spatial arrangement. The new techniques directly quantify spatial arrangement by normalizing to expected values for randomly arranged fractures. The techniques differ in terms of computational intensity, robustness of results, ability to detect anti-correlation, and use of fracture size data. Variation of spatial arrangement across a broad range of length scales facilitates distinguishing clustered and periodic arrangements (opposite forms of organization) from random arrangements. Moreover, self-organized arrangements can be distinguished from arrangements due to extrinsic organization. Traditional techniques for analysis of fracture spacing are hamstrung because they account neither for the sequence of fracture spacings nor for possible coordination between fracture size and position, attributes accounted for by our methods. All of the new techniques reveal fractal clustering in a test case of veins, or cement-filled opening-mode fractures, in the Pennsylvanian Marble Falls Limestone. The observed arrangement is readily distinguishable from random and periodic arrangements. Comparison of results that account for fracture size with results that ignore fracture size demonstrates that spatial arrangement is dominated by the sequence of fracture spacings, rather than coordination of fracture size with position. Fracture size and position are not completely independent in this example, however, because large fractures are more clustered than small fractures. Both the spatial and size organization of the veins here probably emerged from fracture interaction during growth. The new approaches described here, along with freely available software to implement the techniques, can be applied with effect to a wide range of structures, or indeed many other phenomena, such as drilling response, where spatial heterogeneity is an issue.
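The normalization idea described above can be sketched in a few lines. The fragment below is illustrative only and simpler than the paper's techniques (it ignores fracture size data): it counts fracture pairs separated by less than each length scale and divides by the mean count for randomly placed fractures, so values above 1 indicate clustering at that scale and values below 1 indicate anti-correlation.

```python
# Illustrative normalized pair-count correlation for 1-D fracture positions.
import numpy as np

def normalized_correlation(positions, extent, scales, n_random=200, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    positions = np.asarray(positions, dtype=float)

    def pair_counts(x):
        d = np.abs(x[:, None] - x[None, :])[np.triu_indices(len(x), k=1)]
        return np.array([(d <= s).sum() for s in scales], dtype=float)

    observed = pair_counts(positions)
    random_mean = np.mean(
        [pair_counts(rng.uniform(0.0, extent, len(positions)))
         for _ in range(n_random)], axis=0)
    random_mean = np.maximum(random_mean, 1e-12)  # avoid divide-by-zero
    return observed / random_mean  # >1: clustered; <1: anti-correlated
```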
Software for keratometry measurements using portable devices
NASA Astrophysics Data System (ADS)
Iyomasa, C. M.; Ventura, L.; De Groote, J. J.
2010-02-01
In this work we present image processing software for automatic astigmatism measurements, developed for a hand-held keratometer. The system projects 36 light spots, from LEDs, displayed in a precise circle at the lachrymal film of the examined cornea. The displacement, size, and deformation of the reflected image of these light spots are analyzed, providing the keratometry. The purpose of this research is to develop software that performs fast and precise calculations on mainstream mobile devices; in other words, software that can be implemented in portable computer systems, which could be of low cost and easy to handle. This project allows portability for keratometers and is preliminary work toward a portable corneal topographer.
Survey on Intelligent Assistance for Workplace Learning in Software Engineering
NASA Astrophysics Data System (ADS)
Ras, Eric; Rech, Jörg
Technology-enhanced learning (TEL) systems and intelligent assistance systems aim at supporting software engineers during learning and work. A questionnaire-based survey with 89 responses from industry was conducted to find out what kinds of services should be provided and how, as well as to determine which software engineering phases they should focus on. In this paper, we present the survey results regarding intelligent assistance for workplace learning in software engineering. We analyzed whether specific types of assistance depend on the organization's size, the respondent's role, and the experience level. The results show a demand for TEL that supports short-term problem solving and long-term competence development at the workplace.
Improved gap size estimation for scaffolding algorithms.
Sahlin, Kristoffer; Street, Nathaniel; Lundeberg, Joakim; Arvestad, Lars
2012-09-01
One of the important steps of genome assembly is scaffolding, in which contigs are linked using information from read-pairs. Scaffolding provides estimates of the order, relative orientation, and distance between contigs. We have found that contig distance estimates are generally strongly biased and based on false assumptions. Since erroneous distance estimates can mislead subsequent analysis, it is important to provide unbiased estimation of contig distance. In this article, we show that state-of-the-art programs for scaffolding are using an incorrect model of gap size estimation. We discuss why current maximum likelihood estimators are biased and describe the different cases of bias we are facing. Furthermore, we provide a model for the distribution of reads that span a gap and derive the maximum likelihood equation for the gap length. We motivate why this estimate is sound and show empirically that it outperforms gap estimators in popular scaffolding programs. Our results have consequences for scaffolding software, structural variation detection, and library insert-size estimation as commonly performed by read aligners. A reference implementation is provided at https://github.com/SciLifeLab/gapest. Supplementary data are available at Bioinformatics online.
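A toy version of likelihood-based gap estimation may make the bias argument concrete. The sketch below reflects my simplified assumptions, not the authors' full model: a pair spanning the gap shows an on-contig distance d_i, so its full insert is d_i + g, and only inserts longer than the gap can span it, which motivates a truncated-normal likelihood. The paper's estimator additionally conditions on contig lengths and read placement.

```python
# Toy truncated-normal ML gap estimate (simplified; assumptions mine).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def estimate_gap(d, mu, sigma):
    """d: on-contig span distances; insert size ~ N(mu, sigma)."""
    d = np.asarray(d, dtype=float)

    def neg_log_lik(g):
        # insert observable for this gap only if insert >= g (truncation)
        tail = norm.sf(g, loc=mu, scale=sigma)
        return -(norm.logpdf(d + g, loc=mu, scale=sigma) - np.log(tail)).sum()

    res = minimize_scalar(neg_log_lik, bounds=(0.0, mu + 3 * sigma),
                          method="bounded")
    return res.x

# The naive estimate mu - mean(d) ignores the truncation and therefore
# underestimates large gaps, since only long inserts manage to span them.
```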
OPC for curved designs in application to photonics on silicon
NASA Astrophysics Data System (ADS)
Orlando, Bastien; Farys, Vincent; Schneider, Loïc.; Cremer, Sébastien; Postnikov, Sergei V.; Millequant, Matthieu; Dirrenberger, Mathieu; Tiphine, Charles; Bayle, Sébastian; Tranquillin, Céline; Schiavone, Patrick
2016-03-01
Today's designs for photonic devices on silicon rely on non-Manhattan features such as curves and a wide variety of angles, with minimum feature sizes below 100 nm. Industrial manufacturing of such devices requires an optimized process window with 193 nm lithography. Therefore, Resolution Enhancement Techniques (RET) that are commonly used for CMOS manufacturing are required. However, most RET algorithms are based on Manhattan fragmentation (0°, 45° and 90°), which can generate large CD dispersion on masks for photonic designs. Industrial implementation of RET solutions for photonic designs is challenging, as most currently available OPC tools are CMOS-oriented. Discrepancies between the design and final results induced by RET techniques can lead to lower photonic device performance. We propose a novel sizing algorithm allowing adjustment of design edge fragments while preserving the topology of the original structures. The results of the algorithm's implementation in rule-based sizing, SRAF placement, and model-based correction are discussed in this paper. Corrections based on this novel algorithm were applied and characterized on real photonic devices. The obtained results demonstrate the validity of the proposed correction method, integrated in the Inscale software from Aselta Nanographics.
Proceedings of the Workshop on Computational Aspects in the Control of Flexible Systems, part 1
NASA Technical Reports Server (NTRS)
Taylor, Lawrence W., Jr. (Compiler)
1989-01-01
Control/Structures Integration program software needs, computer aided control engineering for flexible spacecraft, computer aided design, computational efficiency and capability, modeling and parameter estimation, and control synthesis and optimization software for flexible structures and robots are among the topics discussed.
Using the CoRE Requirements Method with ADARTS. Version 01.00.05
1994-03-01
requirements; combining ADARTS processes and objects derived from CoRE requirements into an ADARTS software architecture design; and taking advantage of CoRE's precision in the ADARTS process structuring, class structuring, and software architecture design activities. Object-oriented requirements and…
Advances in the REDCAT software package
2013-01-01
Background: Residual Dipolar Couplings (RDCs) have emerged in the past two decades as an informative source of experimental restraints for the study of structure and dynamics of biological macromolecules and complexes. The REDCAT software package was previously introduced for the analysis of molecular structures using RDC data. Here we report additional features that have been included in this software package in order to expand the scope of its analyses. We first discuss the features that enhance REDCAT's user-friendly nature, such as the integration of a number of analyses into one single operation and enabling convenient examination of a structural ensemble in order to identify the most suitable structure. We then describe the new features which expand the scope of RDC analyses, performing exercises that utilize both synthetic and experimental data to illustrate and evaluate different features with regard to structure refinement and structure validation. Results: We establish the seamless interaction that takes place between REDCAT, VMD, and Xplor-NIH in demonstrations that utilize our newly developed REDCAT-VMD and XplorGUI interfaces. These modules enable visualization of RDC analysis results on the molecular structure displayed in VMD and refinement of structures with Xplor-NIH, respectively. We also highlight REDCAT's Error-Analysis feature in reporting the localized fitness of a structure to RDC data, which provides a more effective means of recognizing local structural anomalies. This allows for structurally sound regions of a molecule to be identified, and for any refinement efforts to be focused solely on locally distorted regions. Conclusions: The newly engineered REDCAT software package, which is available for download via the WWW from http://ifestos.cse.sc.edu, has been developed in the object-oriented C++ environment. Our most recent enhancements to REDCAT serve to provide a more complete RDC analysis suite, while also accommodating a more user-friendly experience, and will be of great interest to the community of researchers and developers since it hides the complications of software development. PMID:24098943
Post Processing and Biological Evaluation of the Titanium Scaffolds for Bone Tissue Engineering.
Wysocki, Bartłomiej; Idaszek, Joanna; Szlązak, Karol; Strzelczyk, Karolina; Brynk, Tomasz; Kurzydłowski, Krzysztof J; Święszkowski, Wojciech
2016-03-15
Nowadays, post-surgical or post-accidental bone loss can be substituted by custom-made scaffolds fabricated by additive manufacturing (AM) methods from metallic powders. However, the partially melted powder particles must be removed in a post-process chemical treatment. The aim of this study was to investigate the effect of chemical polishing with various acid baths on novel scaffolds' morphology, porosity, and mechanical properties. In the first stage, Magics software (Materialise NV, Leuven, Belgium) was used to design porous scaffolds with pore sizes of (A) 200 µm, (B) 500 µm and (C) 200 + 500 µm, and a diamond cell structure. The scaffolds were fabricated from commercially pure titanium powder (CP Ti) using a SLM50 3D printing machine (Realizer GmbH, Borchen, Germany). The selective laser melting (SLM) process was optimized, and a laser beam energy density in the range of 91-151 J/mm³ was applied to obtain 3D structures with fully dense struts. To remove not fully melted titanium particles, the scaffolds were chemically polished using various HF and HF-HNO₃ acid solutions. Based on scaffold mass loss and scanning electron microscopy (SEM) observations, the baths which provided the most uniform surface cleaning were proposed for each porosity. The pore and strut size after chemical treatments was calculated based on micro-computed tomography (µ-CT) and SEM images. The mechanical tests showed that the treated scaffolds had a Young's modulus close to that of compact bone. Additionally, the effect of pore size of chemically polished scaffolds on cell retention, proliferation, and differentiation was studied using human mesenchymal stem cells. Small pores yielded higher cell retention within the scaffolds, which then affected their growth. This shows that in vitro cell performance can be controlled to a certain extent by varying pore sizes.
An Object-Oriented Approach to Writing Computational Electromagnetics Codes
NASA Technical Reports Server (NTRS)
Zimmerman, Martin; Mallasch, Paul G.
1996-01-01
Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.
NASA Technical Reports Server (NTRS)
1979-01-01
Program elements of the power module (PM) system are identified, structured, and defined according to the planned work breakdown structure. Efforts required to design, develop, manufacture, test, check out, launch, and operate a protoflight assembled 25 kW, 50 kW, and 100 kW PM include the preparation and delivery of related software, government furnished equipment, space support equipment, ground support equipment, launch site verification software, orbital verification software, and all related data items.
Meeting the memory challenges of brain-scale network simulation.
Kunkel, Susanne; Potjans, Tobias C; Eppler, Jochen M; Plesser, Hans Ekkehard; Morrison, Abigail; Diesmann, Markus
2011-01-01
The development of high-performance simulation software is crucial for studying the brain connectome. Using connectome data to generate neurocomputational models requires software capable of coping with models on a variety of scales: from the microscale, investigating plasticity and dynamics of circuits in local networks, to the macroscale, investigating the interactions between distinct brain regions. Prior to any serious dynamical investigation, the first task of network simulations is to check the consistency of data integrated in the connectome and constrain ranges for yet unknown parameters. Thanks to distributed computing techniques, it is possible today to routinely simulate local cortical networks of around 10⁵ neurons with up to 10⁹ synapses on clusters and multi-processor shared-memory machines. However, brain-scale networks are orders of magnitude larger than such local networks, in terms of numbers of neurons and synapses as well as in terms of computational load. Such networks have been investigated in individual studies, but the underlying simulation technologies have neither been described in sufficient detail to be reproducible nor made publicly available. Here, we discover that as the network model sizes approach the regime of meso- and macroscale simulations, memory consumption on individual compute nodes becomes a critical bottleneck. This is especially relevant on modern supercomputers such as the Blue Gene/P architecture, where the available working memory per CPU core is rather limited. We develop a simple linear model to analyze the memory consumption of the constituent components of neuronal simulators as a function of network size and the number of cores used. This approach has multiple benefits. The model enables identification of key contributing components to memory saturation and prediction of the effects of potential improvements to code before any implementation takes place. As a consequence, development cycles can be shorter and less expensive. Applying the model to our freely available Neural Simulation Tool (NEST), we identify the software components dominant at different scales, and develop general strategies for reducing the memory consumption, in particular by using data structures that exploit the sparseness of the local representation of the network. We show that these adaptations enable our simulation software to scale up to the order of 10,000 processors and beyond. As memory consumption issues are likely to be relevant for any software dealing with complex connectome data on such architectures, our approach and our findings should be useful for researchers developing novel neuroinformatics solutions to the challenges posed by the connectome project.
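A schematic version of the kind of linear memory model described above is sketched below; the coefficients are illustrative placeholders, not NEST's measured values. The point it captures is that some data structures shrink as the network is distributed over more cores while others are replicated on every core, and the replicated term is what eventually saturates memory.

```python
# Schematic per-core memory model (coefficients illustrative, in bytes):
# fixed base + distributed share of neuron/synapse objects + structures
# replicated on every core regardless of core count.
def memory_per_core_bytes(n_neurons, n_synapses, n_cores,
                          base=2e8, per_neuron=1.5e3, per_synapse=48.0,
                          replicated_per_neuron=16.0):
    distributed = (per_neuron * n_neurons + per_synapse * n_synapses) / n_cores
    replicated = replicated_per_neuron * n_neurons  # paid on every core
    return base + distributed + replicated

# The replicated term stops shrinking as cores are added, which is the
# saturation such a model makes visible before any code is changed,
# e.g. memory_per_core_bytes(1e8, 1e12, 65536)
```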
SpotCaliper: fast wavelet-based spot detection with accurate size estimation.
Püspöki, Zsuzsanna; Sage, Daniel; Ward, John Paul; Unser, Michael
2016-04-15
SpotCaliper is a novel wavelet-based image-analysis software tool providing a fast automatic detection scheme for circular patterns (spots), combined with precise estimation of their size. It is implemented as an ImageJ plugin with a friendly user interface. The user is allowed to edit the results by modifying the measurements (in a semi-automated way) and to extract data for further analysis. The fine tuning of the detections includes the possibility of adjusting or removing the original detections, as well as adding further spots. The main advantage of the software is its ability to capture the size of spots in a fast and accurate way. http://bigwww.epfl.ch/algorithms/spotcaliper/ zsuzsanna.puspoki@epfl.ch Supplementary data are available at Bioinformatics online.
Computational methods and software systems for dynamics and control of large space structures
NASA Technical Reports Server (NTRS)
Park, K. C.; Felippa, C. A.; Farhat, C.; Pramono, E.
1990-01-01
Two key areas of crucial importance to the computer-based simulation of large space structures are discussed. The first area involves multibody dynamics (MBD) of flexible space structures, with applications directed to deployment, construction, and maneuvering. The second area deals with advanced software systems, with emphasis on parallel processing. The latest research thrust in the second area involves massively parallel computers.
Tautomerism in chemical information management systems
NASA Astrophysics Data System (ADS)
Warr, Wendy A.
2010-06-01
Tautomerism has an impact on many of the processes in chemical information management systems including novelty checking during registration into chemical structure databases; storage of structures; exact and substructure searching in chemical structure databases; and depiction of structures retrieved by a search. The approaches taken by 27 different software vendors and database producers are compared. It is hoped that this comparison will act as a discussion document that could ultimately improve databases and software for researchers in the future.
Applying Hypertext Structures to Software Documentation.
ERIC Educational Resources Information Center
French, James C.; And Others
1997-01-01
Describes a prototype system for software documentation management called SLEUTH (Software Literacy Enhancing Usefulness to Humans) being developed at the University of Virginia. Highlights include information retrieval techniques, hypertext links that are installed automatically, a WAIS (Wide Area Information Server) search engine, user…
Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software
NASA Astrophysics Data System (ADS)
Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.
2017-12-01
Nowadays, ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance their safety and capacity, as those ships are exposed to a high risk of structural damage during voyages. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold. One is a rule-based methodology and the other is a more conventional software-based analysis. The rule-based analysis is done with DNV-GL's software POSEIDON, and the conventional package-based analysis is done with the ANSYS structural module. Both methods have been applied to analyze some of the mechanical properties of the model, such as total deformation, stress-strain distribution, von Mises stress, and fatigue, following different design bases and approaches, to provide guidance for further improvements in ship structural design.
Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's
NASA Technical Reports Server (NTRS)
Jadaan, Osama
2003-01-01
This effort investigates probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS) and analyzes designs that determine the stochastic properties of MEMS. For CMCs this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
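The Weibull size effect targeted by the bulge test can be stated briefly. The sketch below uses the standard two-parameter Weibull weakest-link form; the parameters m, sigma_0, and v_0 are illustrative, not values from CARES/Life or the study.

```python
# Two-parameter Weibull weakest-link model (illustrative parameters):
# larger stressed volumes contain more flaws and are statistically weaker.
import numpy as np

def weibull_failure_probability(sigma, volume, m=10.0, sigma_0=1.0e9, v_0=1.0e-9):
    """P_f = 1 - exp(-(V/V0) * (sigma/sigma_0)^m)."""
    return 1.0 - np.exp(-(volume / v_0) * (sigma / sigma_0) ** m)

# Characteristic strengths of two specimen sizes then satisfy
# sigma_1 / sigma_2 = (V_2 / V_1) ** (1 / m), which is the signature a
# size-effect test on thin-film specimens would look for.
```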
Performance of Railway Sleepers with Holes under Impact Loading
NASA Astrophysics Data System (ADS)
Lim, Chie Hong; Kaewunruen, Sakdirat; Mlilo, Nhlanganiso
2017-12-01
Prestressed concrete sleepers are essential structural components of railway track structures, with the purpose of redistributing wheel loads from the rails to the ground. To facilitate cables and signalling equipment, holes are often generated in these prestressed concrete sleepers. However, the performance of these sleepers under impact loading may be a concern with the addition of these holes. Numerical modelling using finite element analysis (FEA) is an ideal tool that enables static and dynamic simulation and can perform analyses of basic and advanced linear and nonlinear problems without incurring the huge cost in resources that standard experimental test methods would. This paper utilizes the three-dimensional FE modelling software ABAQUS to investigate the behaviour of prestressed concrete sleepers with holes of varying sizes under impact loading. To obtain results that resemble the real-life behaviour of the sleepers under impact loading, the material properties, element types, mesh sizes, contact and interactions, and boundary conditions are defined as accurately as possible. Both Concrete Damaged Plasticity (CDP) and Brittle Cracking models are used in this study. With a better understanding of how the introduction of holes influences the performance of prestressed sleepers under impact loading, track and railway engineers will be able to generate them in prestressed concrete sleepers without compromising the sleepers' performance during operation.
An experiment in software reliability: Additional analyses using data from automated replications
NASA Technical Reports Server (NTRS)
Dunham, Janet R.; Lauterbach, Linda A.
1988-01-01
A study undertaken to collect software error data of laboratory quality, for use in the development of credible methods for predicting the reliability of software used in life-critical applications, is summarized. The software error data reported were acquired through automated repetitive-run testing of three independent implementations of a launch interceptor condition module of a radar tracking problem. The results are based on 100 test applications to accumulate a sufficient sample size for error rate estimation. The data collected are used to confirm the results of two Boeing studies, reported in NASA-CR-165836, Software Reliability: Repetitive Run Experimentation and Modeling, and NASA-CR-172378, Software Reliability: Additional Investigations into Modeling With Replicated Experiments, respectively. That is, the results confirm the log-linear pattern of software error rates and reject the hypothesis of equal error rates per individual fault. This rejection casts doubt on the assumption that the program's failure rate is a constant multiple of the number of residual bugs, an assumption which underlies some of the current models of software reliability. The data also raise new questions concerning the phenomenon of interacting faults.
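The log-linear check described above is simple to illustrate. The fragment below uses made-up per-fault rates, not the study's data: if detection rates decay geometrically with fault rank, log(rate) against rank is a straight line, and the clearly unequal per-fault rates are what contradict a constant failure rate per residual bug.

```python
# Sketch of a log-linear error-rate fit (rates are illustrative, not the
# study's measurements).
import numpy as np

fault_rates = np.array([0.30, 0.11, 0.040, 0.016, 0.006])  # per test run
rank = np.arange(1, len(fault_rates) + 1)
slope, intercept = np.polyfit(rank, np.log(fault_rates), 1)
print(f"log(rate) ~ {intercept:.2f} + {slope:.2f} * rank")
# A close straight-line fit supports the log-linear pattern of error rates.
```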
Browsing software of the Visible Korean data used for teaching sectional anatomy.
Shin, Dong Sun; Chung, Min Suk; Park, Hyo Seok; Park, Jin Seo; Hwang, Sung Bae
2011-01-01
The interpretation of computed tomographs (CTs) and magnetic resonance images (MRIs) to diagnose clinical conditions requires basic knowledge of sectional anatomy. Sectional anatomy has traditionally been taught using sectioned cadavers, atlases, and/or computer software. The computer software commonly used for this subject is practical and efficient for students but could be more advanced. The objective of this research was to present browsing software developed from the Visible Korean images that can be used for teaching sectional anatomy. One thousand seven hundred and two sets of MRIs, CTs, and sectioned images (intervals, one millimeter) of a whole male cadaver were prepared. Over 900 structures in the sectioned images were outlined and then filled with different colors to elaborate each structure. Software was developed where four corresponding images could be displayed simultaneously; in addition, the structures in the image data could be readily recognized with the aid of the color-filled outlines. The software, distributed free of charge, could be a valuable tool to teach medical students. For example, sectional anatomy could be taught by showing the sectioned images with real color and high resolution. Students could then review the lecture by using the sectioned and color-filled images on their own computers. Students could also be evaluated using the same software. Furthermore, other investigators would be able to replace the images for more comprehensive sectional anatomy. Copyright © 2011 Wiley-Liss, Inc.
Assessing Requirements Quality through Requirements Coverage
NASA Technical Reports Server (NTRS)
Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt
2008-01-01
In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
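For readers unfamiliar with MC/DC, the criterion requires each atomic condition to be shown to independently affect the decision outcome. A small self-contained sketch (an illustrative decision, not one drawn from the cited work) verifies an n+1-size MC/DC test set:

    def decision(a, b, c):
        return a and (b or c)

    # MC/DC requires, for each condition, a pair of tests where only that
    # condition changes and the decision outcome flips. For a and (b or c),
    # the classic n+1 = 4 test vectors below achieve this.
    tests = [(True, True, False),
             (False, True, False),   # flips a relative to test 1
             (True, False, False),   # flips b relative to test 1
             (True, False, True)]    # flips c relative to test 3

    for cond in range(3):
        pairs = [(t1, t2) for t1 in tests for t2 in tests
                 if sum(x != y for x, y in zip(t1, t2)) == 1
                 and t1[cond] != t2[cond]
                 and decision(*t1) != decision(*t2)]
        assert pairs, f"condition {cond} not independently exercised"
    print("MC/DC achieved with", len(tests), "tests")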
Enhanced CARES Software Enables Improved Ceramic Life Prediction
NASA Technical Reports Server (NTRS)
Janosik, Lesley A.
1997-01-01
The NASA Lewis Research Center has developed award-winning software that enables American industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, graphite) structures in a wide variety of 21st century applications. The CARES (Ceramics Analysis and Reliability Evaluation of Structures) series of software is successfully used by numerous engineers in industrial, academic, and government organizations as an essential element of the structural design and material selection processes. The latest version of this software, CARES/Life, provides a general-purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. CARES/Life was recently enhanced by adding new modules designed to improve functionality and user-friendliness. In addition, a beta version of the newly-developed CARES/Creep program (for determining the creep life of monolithic ceramic components) has just been released to selected organizations.
Coupling Sensing Hardware with Data Interrogation Software for Structural Health Monitoring
Farrar, Charles R.; Allen, David W.; Park, Gyuhae; ...
2006-01-01
The process of implementing a damage detection strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). The authors' approach is to address the SHM problem in the context of a statistical pattern recognition paradigm. In this paradigm, the process can be broken down into four parts: (1) Operational Evaluation, (2) Data Acquisition and Cleansing, (3) Feature Extraction and Data Compression, and (4) Statistical Model Development for Feature Discrimination. These processes must be implemented through hardware or software and, in general, some combination of these two approaches will be used. This paper will discuss each portion of the SHM process with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform. More specifically, this paper will address the need to take an integrated hardware/software approach to developing SHM solutions.
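A minimal sketch of steps (3) and (4) of the paradigm, using generic features (RMS and dominant frequency) and a Mahalanobis-distance outlier model; the feature choices are illustrative, not those of the authors' package:

    import numpy as np

    def extract_features(signal, fs):
        """Step 3 of the paradigm: compress a raw time series to a few
        damage-sensitive features (here RMS and dominant frequency)."""
        rms = np.sqrt(np.mean(signal ** 2))
        spectrum = np.abs(np.fft.rfft(signal))
        peak_hz = np.fft.rfftfreq(len(signal), 1 / fs)[np.argmax(spectrum)]
        return np.array([rms, peak_hz])

    def train_baseline(feature_matrix):
        """Step 4: statistical model of the undamaged condition,
        fit from many feature vectors (rows) of healthy data."""
        mu = feature_matrix.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(feature_matrix, rowvar=False))
        return mu, cov_inv

    def damage_index(features, mu, cov_inv):
        """Mahalanobis distance of new data from the healthy baseline."""
        d = features - mu
        return float(d @ cov_inv @ d)

    # Usage: fit the baseline on features from many healthy records, then
    # flag records whose damage_index exceeds a threshold set in training.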
Evaluation of cavity size, kind, and filling technique of composite shrinkage by finite element.
Jafari, Toloo; Alaghehmad, Homayoon; Moodi, Ehsan
2018-01-01
Cavity preparation reduces the rigidity of tooth and its resistance to deformation. The purpose of this study was to evaluate the dimensional changes of the repaired teeth using two types of light cure composite and two methods of incremental and bulk filling by the use of finite element method. In this computerized in vitro experimental study, an intact maxillary premolar was scanned using cone beam computed tomography instrument (SCANORA, Switzerland), then each section of tooth image was transmitted to Ansys software using AUTOCAD. Then, eight sizes of cavity preparations and two methods of restoration (bulk and incremental) using two different types of composite resin materials (Heliomolar, Brilliant) were proposed on software and analysis was completed with Ansys software. Dimensional change increased by widening and deepening of the cavities. It was also increased using Brilliant composite resin and incremental filling technique. Increase in depth and type of filling technique has the greatest role of dimensional change after curing, but the type of composite resin does not have a significant role.
Scenario analysis for techno-economic model development of U.S. offshore wind support structures
Damiani, Rick; Ning, Andrew; Maples, Ben; ...
2016-09-22
Challenging bathymetry and soil conditions of future US offshore wind power plants might promote the use of multimember, fixed-bottom structures (or 'jackets') in place of monopiles. Support structures affect costs associated with the balance of system and operation and maintenance. Understanding the link between these costs and the main environmental design drivers is crucial in the quest for a lower levelized cost of energy, and it is the main rationale for this work. Actual cost and engineering data are still scarce; hence, we evaluated a simplified engineering approach to tie key site and turbine parameters (e.g. water depth, wave height, tower-head mass, hub height and generator rating) to the overall support weight. A jacket-and-tower sizing tool, part of the National Renewable Energy Laboratory's system engineering software suite, was utilized to achieve mass-optimized support structures for 81 different configurations. This tool set provides preliminary sizing of all jacket components. Results showed reasonable agreement with the available industry data, and that the jacket mass is mainly driven by water depth, but hub height and tower-head mass become more influential at greater turbine ratings. A larger sensitivity of the structural mass to wave height and target eigenfrequency was observed for the deepest water conditions (>40 m). Thus, techno-economic analyses using this model should be based on accurate estimates of actual metocean conditions and turbine parameters, especially for deep waters. Finally, the relationships derived from this study will inform the National Renewable Energy Laboratory's offshore balance of system cost model, and they will be used to evaluate the impact of changes in technology on the levelized cost of energy of offshore wind.
NASA software specification and evaluation system: Software verification/validation techniques
NASA Technical Reports Server (NTRS)
1977-01-01
NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.
Development of Total Knee Replacement Digital Templating Software
NASA Astrophysics Data System (ADS)
Yusof, Siti Fairuz; Sulaiman, Riza; Thian Seng, Lee; Mohd. Kassim, Abdul Yazid; Abdullah, Suhail; Yusof, Shahril; Omar, Masbah; Abdul Hamid, Hamzaini
In this study, by taking full advantage of digital X-ray and computer technology, we have developed a semi-automated procedure to template knee implants using a digital templating method. Using this approach, a software system called OrthoKnee™ has been designed and developed. The system is to be used in a study in the Department of Orthopaedics and Traumatology of the medical faculty, UKM (FPUKM). The OrthoKnee™ templating process employs a technique similar to the one many surgeons use with acetate templates over X-ray films. The template technique makes it easy to template various implants from any manufacturer with a comprehensive database of templates. The templating functionality includes knee templates and manufacturer templates (Smith & Nephew; Zimmer). From a patient X-ray image, OrthoKnee™ templates help to quickly and easily read the approximate template size needed. The visual templating features then allow us to quickly review multiple template sizes against the X-ray and thus obtain a nearly precise view of the implant size required. The system can assist by templating on one patient image and will generate reports that can accompany patient notes. The software system was implemented in Visual Basic 6.0 Pro using object-oriented techniques to manage the graphics and objects. The approaches for image scaling will be discussed. Several measurements used in the orthopedic diagnosis process have been studied and added to this software as measurement tool features using mathematical theorems and equations. The study compared the results of the semi-automated (digital templating) method to the conventional method to demonstrate the accuracy of the system.
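The image-scaling step the abstract alludes to typically reduces to establishing a millimetres-per-pixel factor from an object of known physical size. A hedged sketch (the 25 mm marker value and the pixel counts are hypothetical, not taken from OrthoKnee):

    def mm_per_pixel(marker_px, marker_mm=25.0):
        """Scale factor from a calibration marker of known physical size;
        a 25 mm ball is a common radiographic convention, assumed here."""
        return marker_mm / marker_px

    def measured_mm(length_px, scale):
        """Convert an on-screen pixel measurement to millimetres."""
        return length_px * scale

    # Example: a 25 mm marker spans 80 px, so a 560 px bone width
    # corresponds to 175 mm; the template size is chosen to match.
    scale = mm_per_pixel(80)
    print(f"{measured_mm(560, scale):.0f} mm")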
Development of simulation computer complex specification
NASA Technical Reports Server (NTRS)
1973-01-01
The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included, (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.
Using TSP Data to Evaluate Your Project Performance
2010-09-01
…(EVA) [Pressman 2005]. However, unlike earned value, the value is calculated based on the planned size of software components instead of the planned…
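The surviving fragment describes earned value computed from planned component size rather than planned cost. A minimal bookkeeping sketch under that reading (component names and sizes are invented for illustration):

    # Size-based earned value: each component earns its planned size
    # (e.g. lines of code or function points) when completed, rather
    # than its planned cost. Components and sizes are illustrative.
    components = {"parser":    (1200, True),    # (planned size, completed?)
                  "scheduler": (800,  True),
                  "telemetry": (2000, False)}

    planned_total = sum(size for size, _ in components.values())
    earned = sum(size for size, done in components.values() if done)
    print(f"earned value: {earned}/{planned_total} = {earned/planned_total:.0%}")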
Applications of multigrid software in the atmospheric sciences
NASA Technical Reports Server (NTRS)
Adams, J.; Garcia, R.; Gross, B.; Hack, J.; Haidvogel, D.; Pizzo, V.
1992-01-01
Elliptic partial differential equations from different areas in the atmospheric sciences are efficiently and easily solved utilizing the multigrid software package named MUDPACK. It is demonstrated that the multigrid method is more efficient than other commonly employed techniques, such as Gaussian elimination and fixed-grid relaxation. The efficiency relative to other techniques, both in terms of storage requirement and computational time, increases quickly with grid size.
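The efficiency claim rests on multigrid's near-O(N) complexity: cheap smoothing on the fine grid plus a recursive coarse-grid correction. A compact 1D Poisson V-cycle sketch (weighted Jacobi, injection restriction, linear prolongation; a textbook scheme, not MUDPACK's implementation):

    import numpy as np

    def smooth(u, f, h, sweeps=3, w=2/3):
        """Weighted-Jacobi relaxation for -u'' = f (Dirichlet boundaries)."""
        for _ in range(sweeps):
            u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
        return u

    def v_cycle(u, f, h):
        u = smooth(u, f, h)                          # pre-smooth high frequencies
        if len(u) <= 3:
            return smooth(u, f, h, sweeps=50)        # coarsest grid: relax hard
        r = np.zeros_like(u)                         # residual of -u'' = f
        r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
        e_coarse = v_cycle(np.zeros((len(u) + 1) // 2), r[::2].copy(), 2 * h)
        fine = np.arange(len(u))
        u += np.interp(fine, fine[::2], e_coarse)    # prolong the correction
        return smooth(u, f, h)                       # post-smooth

    n, h = 129, 1.0 / 128                            # 2**7 + 1 grid points
    x = np.linspace(0.0, 1.0, n)
    f = np.pi ** 2 * np.sin(np.pi * x)               # exact solution sin(pi x)
    u = np.zeros(n)
    for _ in range(10):
        u = v_cycle(u, f, h)
    print("max error:", np.abs(u - np.sin(np.pi * x)).max())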
Lean and Efficient Software: Whole-Program Optimization of Executables
2015-09-30
Many levels of library interfaces (where some libraries are dynamically linked and some are provided in binary form only) significantly limit… software at build time. The opportunity: our objective in this project is to substantially improve the performance, size, and robustness of binary executables by using static and dynamic binary program analysis techniques to perform whole-program optimization directly on compiled programs.
Flight software issues in onboard automated planning: lessons learned on EO-1
NASA Technical Reports Server (NTRS)
Tran, Daniel; Chien, Steve; Rabideau, Gregg; Cichy, Benjamin
2004-01-01
This paper focuses on the onboard planner and scheduler CASPER, whose core planning engine is based on the ground system ASPEN. Given the challenges of developing flight software, we discuss several of the issues encountered in preparing the planner for flight, including reducing the code image size, determining what data to place within the engineering telemetry packet, and performing long term planning.
Bright, Philip; Hambly, Karen
2017-12-21
E-health software tools have been deployed in managing knee conditions. Reporting of patient and practitioner satisfaction in studies regarding e-health usage is not widely explored. The objective of this review was to identify studies describing patient and practitioner satisfaction with software use concerning knee pain. A computerized search was undertaken: four electronic databases were searched from January 2007 until January 2017. Key words were decision dashboard, clinical decision, Web-based resource, evidence support, and knee. Full texts were scanned for effect size reporting and satisfaction scales from participants and practitioners. Binary regression was run; impact factor and sample size were predictors, with indicators for satisfaction and effect size reporting as dependent variables. Seventy-seven articles were retrieved; 37 studies were included in the final analysis. Ten studies reported patient satisfaction ratings (27.8%); a single study reported both patient and practitioner satisfaction (2.8%). Randomized control trials were the most common design (35%) and knee osteoarthritis the most prevalent condition (38%). Electronic patient-reported outcome measures and Web-based training were the most common interventions. No significant dependency was found within the regression models (p > 0.05). The proportion of studies reporting patient satisfaction was low, and practitioner satisfaction was poorly represented. There may be implications for the suitability of administering e-health; a medium for capturing further meta-evidence needs to be established and used as best practice in future studies. This is the first review of its kind to address patient and practitioner satisfaction with knee e-health.
A Method for Assessing the Accuracy of a Photogrammetry System for Precision Deployable Structures
NASA Technical Reports Server (NTRS)
Moore, Ashley
2005-01-01
The measurement techniques used to validate analytical models of large deployable structures are an integral part of the technology development process and must be precise and accurate. Photogrammetry and videogrammetry are viable, accurate, and unobtrusive methods for measuring such large structures. Photogrammetry uses software to determine the three-dimensional position of a target using camera images. Videogrammetry is based on the same principle, except that a series of timed images is analyzed. This work addresses the accuracy of a digital photogrammetry system used for measurement of large, deployable space structures at JPL. First, photogrammetry tests are performed on a precision space truss test article, and the images are processed using PhotoModeler software. The accuracy of the PhotoModeler results is determined through comparison with measurements of the test article taken by an external testing group using the VSTARS photogrammetry system. These two measurements are then compared with results from the Australis photogrammetry software, which simulates a measurement test to predict its accuracy. The software is then used to study how particular factors, such as camera resolution and placement, affect system accuracy, to help design the setup for the videogrammetry system that will offer the highest level of accuracy for measurement of deploying structures.
Proposal for hierarchical description of software systems
NASA Technical Reports Server (NTRS)
Thauboth, H.
1973-01-01
The programming of digital computers has developed into a new dimension full of difficulties, because computer hardware has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward a further increase in the sophistication of computer applications and consequently of software. To obtain better visibility into software systems and to improve the structure of software systems for better tests, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend the present methods in order to obtain documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail without losing precision in expression. This is done by the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automation of updating significant portions of the documentation for better software change control.
WIH-based IEEE 802.11 ECG monitoring implementation.
Moein, A; Pouladian, M
2007-01-01
New wireless technologies make possible the implementation of highly integrated wireless devices which allow the replacement of traditional large wired monitoring devices. This offers new functionalities to physicians and will reduce costs. Among these functionalities, biomedical signals can be sent to other devices (PDA, PC, etc.) or processing centers without restricting the patients' mobility. This article discusses the WIH (Ward-In-Hand) structure and the software required for its implementation before an operational example is presented with its results. The aim of this project is the development and implementation of a reduced-size electrocardiograph based on IEEE 802.11 with high speed and greater accuracy, which allows wireless monitoring of patients and the insertion of the information into hospital Wi-Fi networks.
Park, Min Kyung; Park, Jin Young; Nicolas, Geneviève; Paik, Hee Young; Kim, Jeongseon; Slimani, Nadia
2015-06-14
During the past decades, a rapid nutritional transition has been observed along with economic growth in the Republic of Korea. Since this dramatic change in diet has been frequently associated with cancer and other non-communicable diseases, dietary monitoring is essential to understand the association. Benefiting from pre-existing standardised dietary methodologies, the present study aimed to evaluate the feasibility and describe the development of a Korean version of the international computerised 24 h dietary recall method (GloboDiet software) and its complementary tools, developed at the International Agency for Research on Cancer (IARC), WHO. Following established international Standard Operating Procedures and guidelines, about seventy common and country-specific databases on foods, recipes, dietary supplements, quantification methods and coefficients were customised and translated. The main results of the present study highlight the specific adaptations made to adapt the GloboDiet software for research and dietary surveillance in Korea. New (sub-) subgroups were added into the existing common food classification, and new descriptors were added to the facets to classify and describe specific Korean foods. Quantification methods were critically evaluated and adapted considering the foods and food packages available in the Korean market. Furthermore, a picture book of foods/dishes was prepared including new pictures and food portion sizes relevant to Korean diet. The development of the Korean version of GloboDiet demonstrated that it was possible to adapt the IARC-WHO international dietary tool to an Asian context without compromising its concept of standardisation and software structure. It, thus, confirms that this international dietary methodology, used so far only in Europe, is flexible and robust enough to be customised for other regions worldwide.
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should shift their view of the software design process from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory, the task-uncertainty of organizational-information-processing relationships from information processing theory, and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.
Evaluation of the BreastSimulator software platform for breast tomography
NASA Astrophysics Data System (ADS)
Mettivier, G.; Bliznakova, K.; Sechopoulos, I.; Boone, J. M.; Di Lillo, F.; Sarno, A.; Castriconi, R.; Russo, P.
2017-08-01
The aim of this work was the evaluation of the software BreastSimulator, a breast x-ray imaging simulation software, as a tool for the creation of 3D uncompressed breast digital models and for the simulation and the optimization of computed tomography (CT) scanners dedicated to the breast. Eight 3D digital breast phantoms were created with glandular fractions in the range 10%-35%. The models are characterised by different sizes and modelled realistic anatomical features. X-ray CT projections were simulated for a dedicated cone-beam CT scanner and reconstructed with the FDK algorithm. X-ray projection images were simulated for 5 mono-energetic (27, 32, 35, 43 and 51 keV) and 3 poly-energetic x-ray spectra typically employed in current CT scanners dedicated to the breast (49, 60, or 80 kVp). Clinical CT images acquired from two different clinical breast CT scanners were used for comparison purposes. The quantitative evaluation included calculation of the power-law exponent, β, from simulated and real breast tomograms, based on the power spectrum fitted with a function of the spatial frequency, f, of the form S(f) = α/f^β. The breast models were validated by comparison against clinical breast CT and published data. We found that the calculated β coefficients were close to that of clinical CT data from a dedicated breast CT scanner and reported data in the literature. In evaluating the software package BreastSimulator to generate breast models suitable for use with breast CT imaging, we found that the breast phantoms produced with the software tool can reproduce the anatomical structure of real breasts, as evaluated by calculating the β exponent from the power spectral analysis of simulated images. As such, this research tool might contribute considerably to the further development, testing and optimisation of breast CT imaging techniques.
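The β estimate reduces to a straight-line fit in log-log space of the radially averaged power spectrum. A sketch of that computation (the binning and band limits are illustrative choices, not necessarily those of the study):

    import numpy as np

    def beta_exponent(image):
        """Fit S(f) = alpha / f**beta to the radially averaged 2D power
        spectrum via a linear fit in log-log space; returns beta."""
        ps = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
        cy, cx = ps.shape[0] // 2, ps.shape[1] // 2
        y, x = np.indices(ps.shape)
        r = np.hypot(y - cy, x - cx).astype(int)      # integer radial bins
        nbins = min(cy, cx)                           # stay inside the band
        counts = np.bincount(r.ravel())[:nbins]
        radial = np.bincount(r.ravel(), ps.ravel())[:nbins] / counts
        f = np.arange(1, nbins)                       # skip the DC bin
        slope, _ = np.polyfit(np.log(f), np.log(radial[1:]), 1)
        return -slope

    # Usage: beta = beta_exponent(roi); values around 3 are commonly
    # reported for breast parenchyma in the literature such studies cite.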
NASA Astrophysics Data System (ADS)
Wu, F.; Yi, J.; Li, W. J.
2014-03-01
An active sensing diagnostic system for reinforced concrete SHM has been under investigation. Test results show that the system can detect damage to the structure. To fundamentally understand the damage algorithm and thereby establish a robust diagnostic method, accurate finite element analysis (FEA) of the system becomes essential. For the system, a rebar with surface-bonded PZT under a transient wave load was simulated and analyzed using commercial FEA software. A detailed 2D axisymmetric model of a rebar with attached PZT was first established. The model simulates the rebar with wedges, an epoxy adhesive layer, as well as a PZT layer. PZT material parameter transformation with high-order tensors was discussed due to the format differences between the IEEE Standard and ANSYS. The selection of material properties such as Rayleigh damping coefficients was discussed. The direct coupled-field analysis type was selected during simulation. The results from simulation matched well with the experimental data. Further simulation of debonding damage detection for a concrete beam with the PZT rebar has been performed, and the numerical results have been validated with test results as well. The good consistency between the two proves that the numerical models were reasonably accurate. Further system optimization has been performed based on these models. By changing the PZT layout and size, the output signals could be increased in magnitude, and the damage detection signals were found to increase exponentially with the debonding size of the rebar.
Advanced computer-aided design for bone tissue-engineering scaffolds.
Ramin, E; Harris, R A
2009-04-01
The design of scaffolds with an intricate and controlled internal structure represents a challenge for tissue engineering. Several scaffold-manufacturing techniques allow the creation of complex architectures but with little or no control over the main features of the channel network such as the size, shape, and interconnectivity of each individual channel, resulting in intricate but random structures. The combined use of computer-aided design (CAD) systems and layer-manufacturing techniques allows a high degree of control over these parameters with few limitations in terms of achievable complexity. However, the design of complex and intricate networks of channels required in CAD is extremely time-consuming since manually modelling hundreds of different geometrical elements, all with different parameters, may require several days to design individual scaffold structures. An automated design methodology is proposed by this research to overcome these limitations. This approach involves the investigation of novel software algorithms, which are able to interact with a conventional CAD program and permit the automated design of several geometrical elements, each with a different size and shape. In this work, the variability of the parameters required to define each geometry has been set as random, but any other distribution could have been adopted. This methodology has been used to design five cubic scaffolds with interconnected pore channels that range from 200 to 800 µm in diameter, each with an increased complexity of the internal geometrical arrangement. A clinical case study, consisting of an integration of one of these geometries with a craniofacial implant, is then presented.
NASA Astrophysics Data System (ADS)
Choudhary, Pankaj; Varshney, Dinesh
2018-05-01
Co2+ doped Mg-Zn spinel chromite compositions Mg0.5Zn0.5-xCoxCr2O4 (0.0 ≤ x ≤ 0.5) have been synthesized by the high-temperature solid state method. Synchrotron and X-ray diffraction (XRD) studies show a single-phase crystalline nature. The structural analysis, validated by Rietveld refinement, confirms the cubic structure with space group Fd3m. The crystallite size estimated from synchrotron XRD was found to be 30-34 nm. Energy dispersive analysis confirms the stoichiometric Mg0.5Zn0.5-xCoxCr2O4 composition. The average crystallite size distribution estimated from SEM images using ImageJ software is in the range of 100-250 nm. Raman spectroscopy reveals four active phonon modes, and a pronounced red shift is due to enhanced Co2+ concentration. Increased Co2+ concentration in Mg-Zn chromites shows a prominent narrowing of the band gap from 3.46 to 2.97 eV. The dielectric response is attributed to interfacial polarization, and the electrical modulus study supports a non-Debye type of dielectric relaxation. Ohmic junctions (minimum potential drop) at the electrode interface are active at lower levels of doping (x < 0.2) and give rise to a low-frequency semicircle, as evidenced from the complex impedance analysis. The low dielectric loss and high ac conductivity of Co2+ doped Mg-Zn spinel chromites are suitable for power transformer applications at high frequencies.
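The abstract does not state how crystallite size was extracted from the diffraction peaks; the Scherrer equation is the standard estimate and a plausible reading. A sketch with invented peak parameters chosen only to land in the reported 30-34 nm range:

    import numpy as np

    def scherrer_size_nm(two_theta_deg, fwhm_deg, wavelength_nm=0.154, K=0.9):
        """Crystallite size D = K*lambda / (beta*cos(theta)), with beta the
        peak FWHM in radians. The wavelength shown is Cu K-alpha; a
        synchrotron study would substitute its own wavelength."""
        theta = np.radians(two_theta_deg / 2)
        beta = np.radians(fwhm_deg)
        return K * wavelength_nm / (beta * np.cos(theta))

    # Hypothetical spinel (311) peak: 2theta = 35.6 deg, FWHM = 0.26 deg
    print(f"{scherrer_size_nm(35.6, 0.26):.0f} nm")   # ~32 nm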
Regular Topographic Patterning of Karst Depressions Suggests Landscape Self-Organization
NASA Astrophysics Data System (ADS)
Quintero, C.; Cohen, M. J.
2017-12-01
Thousands of wetland depressions that are commonly host to cypress domes dot the sub-tropical limestone landscape of South Florida. The origin of these depression features has been the topic of debate. Here we build upon the work of previous surveyors of this landscape to analyze the morphology and spatial distribution of depressions on the Big Cypress landscape. We took advantage of the emergence and availability of high resolution Light Detection and Ranging (LiDAR) technology and ArcMap GIS software to analyze the structure and regularity of landscape features with methods unavailable to past surveyors. Six 2.25 km² LiDAR plots within the preserve were selected for remote analysis and one depression feature within each plot was selected for more intensive sediment and water depth surveying. Depression features on the Big Cypress landscape were found to show strong evidence of regular spatial patterning. Periodicity, a feature of regularly patterned landscapes, is apparent in both variograms and radial spectrum analyses. Size class distributions of the identified features indicate constrained feature sizes while Average Nearest Neighbor analyses support the inference of dispersed features with non-random spacing. The presence of regular patterning on this landscape strongly implies biotic reinforcement of spatial structure by way of scale-dependent feedback. In characterizing the structure of this wetland landscape we add to the growing body of work dedicated to documenting how water, life and geology may interact to shape the natural landscapes we see today.
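The Average Nearest Neighbor analysis mentioned above is commonly computed as the Clark-Evans ratio. A sketch of that statistic (the area value in the usage comment matches the 2.25 km² plots; the centroid input is assumed to come from prior LiDAR processing):

    import numpy as np
    from scipy.spatial import cKDTree

    def nearest_neighbor_ratio(points_xy, area):
        """Clark-Evans ratio R: mean observed nearest-neighbor distance
        over the value expected under complete spatial randomness.
        R > 1 indicates dispersed (regular) spacing; R < 1, clustering."""
        tree = cKDTree(points_xy)
        d, _ = tree.query(points_xy, k=2)     # k=2: nearest non-self point
        observed = d[:, 1].mean()
        expected = 0.5 / np.sqrt(len(points_xy) / area)
        return observed / expected

    # Usage with depression centroids extracted from a LiDAR DEM:
    # R = nearest_neighbor_ratio(centroids, area=2.25e6)   # 2.25 km^2 in m^2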
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
Matriarch: A Python Library for Materials Architecture.
Giesa, Tristan; Jagadeesan, Ravi; Spivak, David I; Buehler, Markus J
2015-10-12
Biological materials, such as proteins, often have a hierarchical structure ranging from basic building blocks at the nanoscale (e.g., amino acids) to assembled structures at the macroscale (e.g., fibers). Current software for materials engineering allows the user to specify polypeptide chains and simple secondary structures prior to molecular dynamics simulation, but is not flexible in terms of the geometric arrangement of unequilibrated structures. Given some knowledge of a larger-scale structure, instructing the software to create it can be very difficult and time-intensive. To this end, the present paper reports a mathematical language, using category theory, to describe the architecture of a material, i.e., its set of building blocks and instructions for combining them. While this framework applies to any hierarchical material, here we concentrate on proteins. We implement this mathematical language as an open-source Python library called Matriarch. It is a domain-specific language that gives the user the ability to create almost arbitrary structures with arbitrary amino acid sequences and, from them, generate Protein Data Bank (PDB) files. In this way, Matriarch is more powerful than commercial software now available. Matriarch can be used in tandem with molecular dynamics simulations and helps engineers design and modify biologically inspired materials based on their desired functionality. As a case study, we use our software to alter both building blocks and building instructions for tropocollagen, and determine their effect on its structure and mechanical properties.
Space station: The role of software
NASA Technical Reports Server (NTRS)
Hall, D.
1985-01-01
Software will play a critical role throughout the Space Station Program. This presentation sets the stage and prompts participant interaction at the Software Issues Forum. The presentation is structured into three major topics: (1) an overview of the concept and status of the Space Station Program; (2) several charts designed to lay out the scope and role of software; and (3) information addressing the four specific areas selected for focus at the forum, specifically: software management, the software development environment, languages, and standards. NASA's current thinking is highlighted and some of the relevant critical issues are raised.
Software engineering project management - A state-of-the-art report
NASA Technical Reports Server (NTRS)
Thayer, R. H.; Lehman, J. H.
1977-01-01
The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.
ERIC Educational Resources Information Center
Du, Yunfei
This paper discusses the impact of sampling error on the construction of confidence intervals around effect sizes. Sampling error affects the location and precision of confidence intervals. Meta-analytic resampling demonstrates that confidence intervals can haphazardly bounce around the true population parameter. Special software with graphical…
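The "bouncing" of confidence intervals is easy to reproduce by resampling. A short sketch (illustrative sample sizes and a common approximation for the standard error of Cohen's d; this is not the special software the abstract mentions):

    import numpy as np

    rng = np.random.default_rng(0)
    true_d = 0.5                    # population standardized mean difference

    # Draw repeated samples and watch the 95% CI for Cohen's d bounce
    # around the true parameter; some intervals will miss it entirely.
    n = 30
    for trial in range(5):
        a = rng.normal(true_d, 1, size=n)
        b = rng.normal(0.0, 1, size=n)
        sp = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)   # pooled SD
        d = (a.mean() - b.mean()) / sp
        se = np.sqrt(2 / n + d**2 / (2 * 2 * n))            # approximate SE of d
        print(f"trial {trial}: d = {d:5.2f}, "
              f"95% CI [{d - 1.96*se:.2f}, {d + 1.96*se:.2f}]")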
BEST (bioreactor economics, size and time of operation) is an Excel™ spreadsheet-based model that is used in conjunction with the public domain geochemical modeling software, PHREEQCI. The BEST model is used in the design process of sulfate-reducing bacteria (SRB) field bioreacto...
sGD software for estimating spatially explicit indices of genetic diversity
A. J. Shirk; Samuel Cushman
2011-01-01
Anthropogenic landscape changes have greatly reduced the population size, range and migration rates of many terrestrial species. The small local effective population size of remnant populations favours loss of genetic diversity leading to reduced fitness and adaptive potential, and thus ultimately greater extinction risk. Accurately quantifying genetic diversity is...
A Unique Software System For Simulation-to-Flight Research
NASA Technical Reports Server (NTRS)
Chung, Victoria I.; Hutchinson, Brian K.
2001-01-01
"Simulation-to-Flight" is a research development concept to reduce costs and increase testing efficiency of future major aeronautical research efforts at NASA. The simulation-to-flight concept is achieved by using common software and hardware, procedures, and processes for both piloted-simulation and flight testing. This concept was applied to the design and development of two full-size transport simulators, a research system installed on a NASA B-757 airplane, and two supporting laboratories. This paper describes the software system that supports the simulation-to-flight facilities. Examples of various simulation-to-flight experimental applications were also provided.
Evaluation of Software for Introducing Protein Structure: Visualization and Simulation
ERIC Educational Resources Information Center
White, Brian; Kahriman, Azmin; Luberice, Lois; Idleh, Farhia
2010-01-01
Communicating an understanding of the forces and factors that determine a protein's structure is an important goal of many biology and biochemistry courses at a variety of levels. Many educators use computer software that allows visualization of these complex molecules for this purpose. Although visualization is in wide use and has been associated…
The large-scale structure of software-intensive systems
Booch, Grady
2012-01-01
The computer metaphor is dominant in most discussions of neuroscience, but the semantics attached to that metaphor are often quite naive. Herein, we examine the ontology of software-intensive systems, the nature of their structure and the application of the computer metaphor to the metaphysical questions of self and causation. PMID:23386964
NASA Astrophysics Data System (ADS)
Mazzaracchio, Antonio; Marchetti, Mario
2010-03-01
Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool, which uses the Monte Carlo technique, allows a simulation to run over stochastic series. This performs an uncertainty and sensitivity analysis, which estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems. They are proposed as an alternative to traditional approaches, such as the Root-Sum-Square method. The developed tool was verified by comparing the results with those from previous work on thermal protection system probabilistic sizing methodologies, which are based on an industry standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins on sizing heat shields that are currently proposed for vehicles using rigid aeroshells for future aerocapture missions at Neptune, and identifying the major sources of uncertainty in the material response.
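The contrast with the Root-Sum-Square method can be sketched numerically. For independent Gaussian contributions the two margins agree, as below; the Monte Carlo approach earns its keep when inputs are non-Gaussian or the response is nonlinear. All numbers are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    # Hypothetical uncertain contributions to ablator recession (cm, 1-sigma)
    nominal = 2.0
    contributions = {"heating": 0.30, "material": 0.20, "trajectory": 0.10}

    # Root-Sum-Square margin (traditional deterministic approach, 3-sigma)
    rss_3sigma = 3 * np.sqrt(sum(s**2 for s in contributions.values()))

    # Monte Carlo: sample each contribution and take a percentile instead
    samples = nominal + sum(rng.normal(0, s, N) for s in contributions.values())
    mc_margin = np.percentile(samples, 99.865) - nominal   # one-sided 3-sigma level

    print(f"RSS 3-sigma margin: {rss_3sigma:.2f} cm")
    print(f"Monte Carlo margin: {mc_margin:.2f} cm")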
EPICS Controlled Collimator for Controlling Beam Sizes in HIPPO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napolitano, Arthur Soriano; Vogel, Sven C.
2017-08-03
Controlling the beam spot size and shape in a diffraction experiment determines the probed sample volume. The HIPPO (High-Pressure-Preferred Orientation) neutron time-of-flight diffractometer is located at the Lujan Neutron Scattering Center at Los Alamos National Laboratory. HIPPO characterizes microstructural parameters, such as phase composition, strains, grain size, or texture, of bulk (cm-sized) samples. In the current setup, the beam spot has a 10 mm diameter. Using a collimator consisting of two pairs of neutron-absorbing boron-nitride slabs, the horizontal and vertical dimensions of a rectangular beam spot can be defined. Using the HIPPO robotic sample changer for sample motion, the collimator would enable scanning of, e.g., cylindrical samples along the cylinder axis by probing slices of such samples. The project presented here describes the implementation of such a collimator, in particular the motion control software. We utilized the EPICS (Experimental Physics and Industrial Control System) software interface to integrate the collimator control into the HIPPO instrument control system. Using EPICS, commands are sent to commercial stepper motors that move the beam windows.
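Integration with EPICS typically means the collimator axes are exposed as process variables that client code can write. A hypothetical sketch using the pyepics client (the PV names are invented; the abstract does not give the actual HIPPO records):

    from epics import caput, caget   # pyepics EPICS channel-access client

    # Hypothetical PV names for the two slab pairs; each pair defines one
    # dimension of the rectangular beam spot and is driven by a motor record.
    caput("HIPPO:COLL:H:GAP", 5.0, wait=True)    # horizontal opening, mm
    caput("HIPPO:COLL:V:GAP", 20.0, wait=True)   # vertical opening, mm
    print(caget("HIPPO:COLL:H:GAP.RBV"))         # read back achieved position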
Software for roof defects recognition on aerial photographs
NASA Astrophysics Data System (ADS)
Yudin, D.; Naumov, A.; Dolzhenko, A.; Patrakova, E.
2018-05-01
The article presents information on software for roof defect recognition on aerial photographs made with air drones. An aerial image segmentation mechanism is described. It allows detecting roof defects: unsmoothness that causes water stagnation after rain. It is shown that the HSV-transformation approach allows quick detection of stagnation areas, their sizes and perimeters, but is sensitive to shadows and changes of roofing type. A Deep Fully Convolutional Network (FCN) software solution eliminates this drawback. The tested data set consists of roofing photos with defects and binary masks for them. The FCN approach gave acceptable image segmentation results as measured by the average Dice metric. This software can be used in the inspection automation of roof conditions in the production sector and housing and utilities infrastructure.
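The HSV-thresholding step can be sketched with OpenCV: convert to HSV, mask a hue/saturation/value band that plausibly corresponds to standing water, and report region sizes and perimeters. The threshold bounds below are illustrative and, as the abstract notes, such fixed bounds are sensitive to shadows and roofing type:

    import cv2
    import numpy as np

    def stagnation_regions(bgr_image):
        """Threshold dark, saturated (wet-looking) pixels in HSV space and
        return (area, perimeter) for each detected region. The bounds are
        illustrative and would need tuning per roofing type and lighting."""
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array([90, 40, 30]), np.array([140, 255, 120]))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return [(cv2.contourArea(c), cv2.arcLength(c, True)) for c in contours]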
Lo, Ming; Hue, Chih-Wei
2008-11-01
The Character-Component Analysis Toolkit (C-CAT) software was designed to assist researchers in constructing experimental materials using traditional Chinese characters. The software package contains two sets of character stocks: one suitable for research using literate adults as subjects and one suitable for research using schoolchildren as subjects. The software can identify linguistic properties, such as the number of strokes contained, the character-component pronunciation regularity, and the arrangement of character components within a character. Moreover, it can compute a character's linguistic frequency, neighborhood size, and phonetic validity with respect to a user-selected character stock. It can also search the selected character stock for similar characters or for character components with user-specified linguistic properties.
Bian, Chao-Rong; Gao, Yu-Meng; Lamberton, Poppy H L; Lu, Da-Bing
2015-06-01
Schistosomiasis japonicum is one of the most important human parasitic diseases, and a number of studies have recently elucidated the difference in biological characteristics of S. japonicum among different parasite isolates, for example, between field and laboratory isolates. Therefore, understanding the underlying genetic mechanism is of both theoretical and practical importance. In this study, we used six microsatellite markers to assess the genetic diversity, population structure, and bottleneck effect (a sharp reduction in population size) of two parasite populations, one field and one laboratory. A total of 136 S. japonicum cercariae from the field and 86 from the laboratory, which were genetically unique within single snails, were analyzed. The results showed larger numbers of alleles and higher allelic richness in the field parasite population than in the laboratory population, indicating lower genetic diversity in the laboratory parasites. A bottleneck effect was detected in the laboratory population. When the field and laboratory isolates were combined, there was a clear distinction between the two parasite populations using the software Structure. These genetic differences may partially explain the previously observed contrasting biological traits.
Raster-scanning serial protein crystallography using micro- and nano-focused synchrotron beams
Coquelle, Nicolas; Brewster, Aaron S.; Kapp, Ulrike; Shilova, Anastasya; Weinhausen, Britta; Burghammer, Manfred; Colletier, Jacques-Philippe
2015-01-01
High-resolution structural information was obtained from lysozyme microcrystals (20 µm in the largest dimension) using raster-scanning serial protein crystallography on micro- and nano-focused beamlines at the ESRF. Data were collected at room temperature (RT) from crystals sandwiched between two silicon nitride wafers, thereby preventing their drying, while limiting background scattering and sample consumption. In order to identify crystal hits, new multi-processing and GUI-driven Python-based pre-analysis software was developed, named NanoPeakCell, that was able to read data from a variety of crystallographic image formats. Further data processing was carried out using CrystFEL, and the resultant structures were refined to 1.7 Å resolution. The data demonstrate the feasibility of RT raster-scanning serial micro- and nano-protein crystallography at synchrotrons and validate it as an alternative approach for the collection of high-resolution structural data from micro-sized crystals. Advantages of the proposed approach are its thriftiness, its handling-free nature, the reduced amount of sample required, the adjustable hit rate, the high indexing rate and the minimization of background scattering. PMID:25945583
Numerical analysis of the cylindrical rigidity of the vertical steel tank shell
NASA Astrophysics Data System (ADS)
Chirkov, Sergey; Tarasenko, Alexander; Chepur, Petr
2017-10-01
The paper deals with the study of the rigidity of a vertical steel cylindrical tank and its structural elements under the development of inhomogeneous subsidence, using the ANSYS software complex. The limiting case is considered in this paper: a complete absence of a base sector that varies along an arc of a circle. The subsidence zone is modeled by the parameter n. A finite-element model of a vertical 20,000 m³ steel tank has been created, taking into account all structural elements of the tank metal structures, including the support ring, beam frame and roof sheets. Various combinations of vertical steel tank loading are analyzed. For operational loads, the most unfavorable combination is considered. Calculations were performed for the filled and emptied tank. Values of the maximum possible deformations of the outer contour of the bottom are obtained for the development of inhomogeneous base subsidence for the given tank size. The obtained parameters of the intrinsic rigidity (deformability) of the vertical steel tank can be used in the development of new regulatory and technical documentation for tanks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chacko, M; Aldoohan, S
Purpose: The low contrast detectability (LCD) of a CT scanner is its ability to detect and display faint lesions. The current approach to quantifying LCD uses vendor-specific methods and phantoms, typically by subjectively observing the smallest size object at a contrast level above phantom background. However, this approach does not yield clinically applicable values for LCD. The current study proposes a statistical LCD metric using software tools not only to assess scanner performance but also to quantify the key factors affecting LCD. This approach was developed using uniform QC phantoms, and its applicability was then extended under simulated clinical conditions. Methods: MATLAB software was developed to compute LCD using a uniform image of a QC phantom. For a given virtual object size, the software randomly samples the image within a selected area and uses statistical analysis based on Student’s t-distribution to compute the LCD as the minimal Hounsfield units that can be distinguished from the background at the 95% confidence level. Its validity was assessed by comparison with the behavior of a known QC phantom under various scan protocols and a tissue-mimicking phantom. The contributions of beam quality and scattered radiation to the computed LCD were quantified by using various external beam-hardening filters and phantom lengths. Results: As expected, the LCD was inversely related to object size under all scan conditions. The type of image reconstruction kernel filter and tissue/organ type strongly influenced the background noise characteristics and therefore the computed LCD for the associated image. Conclusion: The proposed metric and its associated software tools are vendor-independent and can be used to analyze any scanner's LCD performance. Furthermore, the method employed can be used in conjunction with the relationships established in this study between LCD and tissue type to extend these concepts to patients' clinical CT images.
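One plausible reading of the proposed metric, sketched in Python: sample many ROIs of the virtual object's size from a uniform phantom image, then take the one-sided Student's-t critical value times the spread of the ROI means as the minimal distinguishable HU shift. The details (sample count, confidence handling) are assumptions, not the authors' exact algorithm:

    import numpy as np
    from scipy import stats

    def lcd_hu(image, roi_size_px, n_samples=200, confidence=0.95):
        """Estimate low-contrast detectability: the smallest mean HU shift
        an object of roi_size_px pixels must have to stand out from the
        background noise at the given confidence level."""
        rng = np.random.default_rng()
        h, w = image.shape
        means = []
        for _ in range(n_samples):                 # randomly sample ROIs
            y = rng.integers(0, h - roi_size_px)
            x = rng.integers(0, w - roi_size_px)
            means.append(image[y:y + roi_size_px, x:x + roi_size_px].mean())
        means = np.asarray(means)
        t_crit = stats.t.ppf(confidence, df=len(means) - 1)
        return t_crit * means.std(ddof=1)          # minimal detectable HU shift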
NASA Astrophysics Data System (ADS)
Herrick, Gregory Paul
The quest to accurately capture flow phenomena with length-scales both short and long and to accurately represent complex flow phenomena within disparately sized geometry inspires a need for an efficient, high-fidelity, multi-block structured computational fluid dynamics (CFD) parallel computational scheme. This research presents and demonstrates a more efficient computational method by which to perform multi-block structured CFD parallel computational simulations, thus facilitating higher-fidelity solutions of complicated geometries (due to the inclusion of grids for "small" flow areas which are often merely modeled) and their associated flows. This computational framework offers greater flexibility and user-control in allocating the resource balance between process count and wall-clock computation time. The principal modifications implemented in this revision consist of a "multiple grid block per processing core" software infrastructure and an analytic computation of viscous flux Jacobians. The development of this scheme is largely motivated by the desire to simulate axial compressor stall inception with more complete gridding of the flow passages (including rotor tip clearance regions) than has been previously done while maintaining high computational efficiency (i.e., minimal consumption of computational resources), and thus this paradigm shall be demonstrated with an examination of instability in a transonic axial compressor. However, the paradigm presented herein facilitates CFD simulation of myriad previously impractical geometries and flows and is not limited to detailed analyses of axial compressor flows. While the simulations presented herein were technically possible under the previous structure of the subject software, they were much less computationally efficient and thus not pragmatically feasible; the previous research using this software to perform three-dimensional, full-annulus, time-accurate, unsteady, full-stage (with sliding-interface) simulations of rotating stall inception in axial compressors utilized tip clearance periodic models, while the scheme here is demonstrated by a simulation of axial compressor stall inception utilizing gridded rotor tip clearance regions. As will be discussed, much previous research---experimental, theoretical, and computational---has suggested that understanding clearance flow behavior is critical to understanding stall inception, and previous computational research efforts which have used tip clearance models have begged the question, "What about the clearance flows?". This research begins to address that question.
SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE
NASA Technical Reports Server (NTRS)
Kleine, H.
1994-01-01
Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in a way which is best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL generated document contains software design summary information including module invocation hierarchy, module cross reference, and cross reference tables of user selected words or phrases appearing in the document. The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.
Practical computational toolkits for dendrimers and dendrons structure design.
Martinho, Nuno; Silva, Liana C; Florindo, Helena F; Brocchini, Steve; Barata, Teresa; Zloh, Mire
2017-09-01
Dendrimers and dendrons offer an excellent platform for developing novel drug delivery systems and medicines. The rational design and further development of these repetitively branched systems are restricted by difficulties in scalable synthesis and structural determination, which can be overcome by judicious use of molecular modelling and molecular simulations. A major difficulty in utilising in silico studies to design dendrimers lies in the laborious generation of their structures. Current modelling tools rely on automated assembly of simpler dendrimers or the inefficient manual assembly of monomer precursors to generate more complicated dendrimer structures. Herein we describe two novel graphical user interface toolkits written in Python that provide an improved degree of automation for rapid assembly of dendrimers and generation of their 2D and 3D structures. Our first toolkit uses the RDKit library, SMILES nomenclature of monomers and SMARTS reaction nomenclature to generate SMILES and mol files of dendrimers without 3D coordinates. These files are used for simple graphical representations and storing their structures in databases. The second toolkit assembles complex topology dendrimers from monomers to construct 3D dendrimer structures to be used as starting points for simulation using existing and widely available software and force fields. Both tools were validated for ease of use in prototyping dendrimer structures, and the second toolkit was especially relevant for dendrimers of high complexity and size.
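The reaction-driven assembly described here can be illustrated with the RDKit library on which the first toolkit builds. The sketch below is not the published toolkit: the core, monomer, and amide-coupling SMARTS are illustrative assumptions, and it couples only one site per pass where a real generator would exhaustively functionalize every peripheral site.

    # Minimal sketch of SMARTS-reaction-based dendrimer growth with RDKit.
    # Core, monomer, and reaction are illustrative, not the authors' choices.
    from rdkit import Chem
    from rdkit.Chem import AllChem

    # Hypothetical amide coupling: carboxylic acid + primary amine -> amide
    rxn = AllChem.ReactionFromSmarts(
        "[C:1](=[O:2])[OH].[NX3;H2:3]>>[C:1](=[O:2])[N:3]"
    )

    core = Chem.MolFromSmiles("NCCN")           # illustrative diamine core
    monomer = Chem.MolFromSmiles("OC(=O)CCN")   # illustrative acid/amine monomer

    def grow(mol, passes):
        """Apply the coupling reaction repeatedly (one site per pass)."""
        for _ in range(passes):
            products = rxn.RunReactants((monomer, mol))
            if not products:
                break
            mol = products[0][0]
            Chem.SanitizeMol(mol)
        return mol

    dendron = grow(core, passes=2)
    # SMILES output without 3D coordinates, analogous to the first toolkit's role.
    print(Chem.MolToSmiles(dendron))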
NASA Astrophysics Data System (ADS)
Gonzales, H. B.; Ravi, S.; Li, J. J.; Sankey, J. B.
2016-12-01
Hydrological and aeolian processes control the redistribution of soil and nutrients in arid and semi-arid environments, thereby contributing to the formation of heterogeneous patchy landscapes with nutrient-rich resource islands surrounded by nutrient-depleted bare soil patches. The differential trapping of soil particles by vegetation canopies may result in textural changes beneath the vegetation, which, in turn, can alter hydrological processes such as infiltration and runoff. We conducted infiltration experiments and soil grain size analysis of several shrub (Larrea tridentata) and grass (Bouteloua eriopoda) microsites in a heterogeneous landscape in the Chihuahuan desert (New Mexico, USA). Our results indicate heterogeneity in soil texture and infiltration patterns under grass and shrub microsites. We assessed the trapping effectiveness of vegetation canopies using a novel computational fluid dynamics (CFD) approach. The open-source software OpenFOAM was used to validate the data gathered from particle size distribution (PSD) analysis of soil within the shrub and grass microsites, with canopy porosities (91% for shrub and 68% for grass) determined using terrestrial LiDAR surveys. Three-dimensional architectures of the shrub and grass were created using the open-source computer-aided design (CAD) software Blender. The readily available solvers within the OpenFOAM architecture were modified to test the validity and optimize input parameters in assessing trapping efficiencies of sparse vegetation against aeolian sediment flux. The results from the numerical simulations explained the observed textural changes under grass and shrub canopies and highlighted the role of sediment trapping by canopies in structuring patch-scale hydrological processes.
Bruellmann, Dan; Sander, Steven; Schmidtmann, Irene
2016-05-01
The endodontic working length is commonly determined by electronic apex locators and intraoral periapical radiographs. No algorithms for the automatic detection of endodontic files in dental radiographs have been described in the recent literature. Teeth from the mandibles of pig cadavers were accessed, and digital radiographs of these specimens were obtained using an optical bench. The specimens were then recorded in identical positions and settings after the insertion of endodontic files of known sizes (ISO sizes 10-15). The frequency bands generated by the endodontic files were determined using fast Fourier transforms (FFTs) to convert the resulting images into frequency spectra. The detected frequencies were used to design a pre-segmentation filter, which was programmed using Delphi XE RAD Studio software (Embarcadero Technologies, San Francisco, USA) and tested on 20 radiographs. For performance evaluation purposes, the gauged lengths (measured with a caliper) of visible endodontic files were measured in the native and filtered images. The software was able to segment the endodontic files in both the samples and similar dental radiographs. We observed median length differences of 0.52 mm (SD: 2.76 mm) and 0.46 mm (SD: 2.33 mm) in the native and post-segmentation images, respectively. Pearson's correlation test revealed a significant correlation of 0.915 between the true length and the measured length in the native images; the corresponding correlation for the filtered images was 0.97 (p=0.0001). The algorithm can be used to automatically detect and measure the lengths of endodontic files in digital dental radiographs.
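A minimal sketch of the frequency-domain pre-segmentation idea follows (Python/NumPy rather than the Delphi implementation); the radial band limits are placeholders that would have to be calibrated from the FFT spectra of file-bearing radiographs.

    # Sketch of an FFT band-pass pre-segmentation filter (assumed band limits).
    import numpy as np

    def bandpass_filter(image, r_lo, r_hi):
        """Keep only spatial frequencies with radius in [r_lo, r_hi]."""
        F = np.fft.fftshift(np.fft.fft2(image))
        h, w = image.shape
        y, x = np.ogrid[:h, :w]
        r = np.hypot(y - h / 2, x - w / 2)      # radial frequency coordinate
        mask = (r >= r_lo) & (r <= r_hi)        # annular pass band
        return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

    # Usage on a radiograph loaded as a 2D float array `img`:
    # filtered = bandpass_filter(img, r_lo=40, r_hi=120)  # limits are assumptions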
Stella, Stefano; Italia, Leonardo; Geremia, Giulia; Rosa, Isabella; Ancona, Francesco; Marini, Claudia; Capogrosso, Cristina; Giglio, Manuela; Montorfano, Matteo; Latib, Azeem; Margonato, Alberto; Colombo, Antonio; Agricola, Eustachio
2018-02-06
A 3D transoesophageal echocardiography (3D-TOE) reconstruction tool has recently been introduced. The system automatically configures a geometric model of the aortic root and performs quantitative analysis of these structures. We compared measurements of the aortic annulus (AA) obtained by semi-automated 3D-TOE quantitative software and by manual analysis with multislice computed tomography (MSCT) measurements. One hundred and seventy-five patients (mean age 81.3 ± 6.3 years, 77 men) who underwent both MSCT and 3D-TOE for annulus assessment before transcatheter aortic valve implantation were analysed. Hypothetical prosthetic valve sizing was evaluated using the 3D manual and semi-automated measurements, with the manufacturer-recommended CT-based sizing algorithm as the gold standard. Good correlation between the 3D-TOE methods and MSCT measurements was found, but the semi-automated analysis demonstrated slightly better correlations for AA major diameter (r = 0.89), perimeter (r = 0.89), and area (r = 0.85) (all P < 0.0001) than the manual analysis. Both 3D methods underestimated the MSCT measurements, but semi-automated measurements showed narrower limits of agreement and less bias than manual measurements for most AA parameters. On average, 3D-TOE semi-automated major diameter, area, and perimeter underestimated the respective MSCT measurements by 7.4%, 3.5%, and 4.4%, respectively, whereas minor diameter was overestimated by 0.3%. Moderate agreement for valve sizing was found for both 3D-TOE techniques: kappa agreement 0.5 for both semi-automated and manual analysis. Interobserver and intraobserver agreements for the AA measurements were excellent for both techniques (intraclass correlation coefficients for all parameters >0.80). The 3D-TOE semi-automated analysis of AA is feasible and reliable and can be used in clinical practice as an alternative to MSCT for AA assessment.
[Near infrared spectroscopy system structure with MOEMS scanning mirror array].
Luo, Biao; Wen, Zhi-Yu; Wen, Zhong-Quan; Chen, Li; Qian, Rong-Rong
2011-11-01
A method that uses a MOEMS mirror-array optical structure to reduce the high cost of infrared spectrometers is presented. The method resolves the imaging-irregularity problem that has prevented MOEMS mirror arrays from being used in simple infrared spectrometers, and a new structure for spectral imaging is designed. According to the requirements on the imaging spot, the optical structure was designed and optimized with the optical design software ZEMAX using a standard aberration-specific optimization algorithm. The system works from 900 to 1400 nm. The design analysis shows that, with a light-source slit width of 50 µm, the spectrophotometric system achieves a theoretical resolution better than 6 nm, and the size of the available spot is 0.042 mm x 0.08 mm. Verification examples show that the design meets the imaging-regularity requirements, that it can be used for MOEMS mirror reflectance scanning, and that the new MOEMS mirror-array spectrometer model is feasible. Finally, the relationship between the detector position and the maximum deflection angle of the micro-mirror is analyzed.
Fast online and index-based algorithms for approximate search of RNA sequence-structure patterns
2013-01-01
Background It is well known that the search for homologous RNAs is more effective if both sequence and structure information is incorporated into the search. However, current tools for searching with RNA sequence-structure patterns cannot fully handle mutations occurring on both these levels or are simply not fast enough for searching large sequence databases because of the high computational costs of the underlying sequence-structure alignment problem. Results We present new fast index-based and online algorithms for approximate matching of RNA sequence-structure patterns supporting a full set of edit operations on single bases and base pairs. Our methods efficiently compute semi-global alignments of structural RNA patterns and substrings of the target sequence whose costs satisfy a user-defined sequence-structure edit distance threshold. For this purpose, we introduce a new computing scheme to optimally reuse the entries of the required dynamic programming matrices for all substrings and combine it with a technique for avoiding the alignment computation of non-matching substrings. Our new index-based methods exploit suffix arrays preprocessed from the target database and achieve running times that are sublinear in the size of the searched sequences. To support the description of RNA molecules that fold into complex secondary structures with multiple ordered sequence-structure patterns, we use fast algorithms for the local or global chaining of approximate sequence-structure pattern matches. The chaining step removes spurious matches from the set of intermediate results, in particular of patterns with little specificity. In benchmark experiments on the Rfam database, our improved online algorithm is faster than the best previous method by up to a factor of 45. Our best new index-based algorithm achieves a speedup by a factor of 560. Conclusions The presented methods achieve considerable speedups compared to the best previous method. This, together with the expected sublinear running time of the presented index-based algorithms, allows for the first time approximate matching of RNA sequence-structure patterns in large sequence databases. Beyond the algorithmic contributions, we provide, with RaligNAtor, a robust and well-documented open-source software package implementing the algorithms presented in this manuscript. The RaligNAtor software is available at http://www.zbh.uni-hamburg.de/ralignator. PMID:23865810
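The flavor of the underlying dynamic programming can be shown in a few lines. The sketch below is a deliberately simplified, sequence-only semi-global matcher (Sellers-style approximate string matching); the published algorithms additionally score base-pair edit operations, reuse matrix entries across substrings, and exploit suffix-array indexing.

    # Simplified illustration of semi-global approximate matching (sequence level
    # only; the paper's methods also handle base-pair edits and index structures).
    import numpy as np

    def semi_global_hits(pattern, target, max_cost):
        """End positions in `target` where `pattern` aligns with cost <= max_cost."""
        m = len(pattern)
        prev = np.arange(m + 1)        # column for the empty target prefix
        hits = []
        for j, c in enumerate(target, start=1):
            curr = np.empty(m + 1, dtype=int)
            curr[0] = 0                # free start: the match may begin anywhere
            for i in range(1, m + 1):
                sub = prev[i - 1] + (pattern[i - 1] != c)
                curr[i] = min(sub, prev[i] + 1, curr[i - 1] + 1)
            if curr[m] <= max_cost:
                hits.append(j)         # pattern matches a substring ending at j
            prev = curr
        return hits

    print(semi_global_hits("GGAUCC", "AAGGAACCUU", max_cost=1))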
Software Reporting Metrics. Revision 2.
1985-11-01
The report acknowledges the MITRE Corporation and ESD, and some of its data is drawn from Barry W. Boehm's Software Engineering Economics (Englewood Cliffs, NJ: Prentice-Hall, 1981).
TeraStitcher - A tool for fast automatic 3D-stitching of teravoxel-sized microscopy images
2012-01-01
Background Further advances in modern microscopy are leading to teravoxel-sized tiled 3D images at high resolution, thus increasing the dimension of the stitching problem by at least two orders of magnitude. The existing software solutions do not seem adequate to address the additional requirements arising from these datasets, such as the minimization of memory usage and the need to process just a small portion of data. Results We propose a free and fully automated 3D stitching tool designed to match the special requirements coming out of teravoxel-sized tiled microscopy images that is able to stitch them in a reasonable time even on workstations with limited resources. The tool was tested on teravoxel-sized whole mouse brain images with micrometer resolution and it was also compared with the state-of-the-art stitching tools on megavoxel-sized publicly available datasets. This comparison confirmed that the solutions we adopted are suited for stitching very large images and also perform well on datasets with different characteristics. Indeed, some of the algorithms embedded in other stitching tools could be easily integrated in our framework if they turned out to be more effective on other classes of images. To this purpose, we designed a software architecture which separates the strategies that use memory resources efficiently from the algorithms which may depend on the characteristics of the acquired images. Conclusions TeraStitcher is a free tool that enables the stitching of teravoxel-sized tiled microscopy images even on workstations with relatively limited resources of memory (<8 GB) and processing power. It exploits the knowledge of approximate tile positions and uses ad-hoc strategies and algorithms designed for such very large datasets. The produced images can be saved into a multiresolution representation to be efficiently retrieved and processed. We provide TeraStitcher both as a standalone application and as a plugin of the free software Vaa3D. PMID:23181553
NASA Technical Reports Server (NTRS)
Chen, Shu-Po
1999-01-01
This paper presents software for solving the non-conforming fluid-structure interface problem in aeroelastic simulation. It reviews the interpolation and integration algorithms and highlights the flexibility and user-friendly features that allow the user to select existing structural and fluid packages, such as NASTRAN and CFL3D, to perform the simulation. The presented software is validated by computing the High Speed Civil Transport model.
Corroded Anchor Structure Stability/Reliability (CAS_Stab-R) Software for Hydraulic Structures
2017-12-01
This report describes software that provides a probabilistic estimate of time-to-failure for a corroding anchor strand system, where the anchors provide stability to the structure. Corroded Anchor Structure Stability/Reliability (CAS_Stab-R) produces probabilistic remaining-anchor-life estimates for anchor cables based upon the direct corrosion rate, building on a series of unique pull-test experiments conducted by Ebeling et al. (2016) at the U.S. Army Engineer Research and Development Center.
MFV-class: a multi-faceted visualization tool of object classes.
Zhang, Zhi-meng; Pan, Yun-he; Zhuang, Yue-ting
2004-11-01
Classes are key software components in an object-oriented software system. In many industrial OO software systems, there are some classes that have complicated structure and relationships. So in the processes of software maintenance, testing, software reengineering, software reuse and software restructure, it is a challenge for software engineers to understand these classes thoroughly. This paper proposes a class comprehension model based on constructivist learning theory, and implements a software visualization tool (MFV-Class) to help in the comprehension of a class. The tool provides multiple views of class to uncover manifold facets of class contents. It enables visualizing three object-oriented metrics of classes to help users focus on the understanding process. A case study was conducted to evaluate our approach and the toolkit.
In Vitro Toxicity of Silver Nanoparticles in Human Lung Epithelial Cells
2009-03-01
Particle size distributions in solution were measured by software, with the polydispersity index (PdI) giving a measure of the size ranges present. Transmission electron microscopy was used to determine primary particle size and distribution from measurements of over 100 particles for each preparation: Ag 10 nm uncoated, Ag 80 nm uncoated, Ag 10 nm coated, and Ag 80 nm coated.
LUMA: A many-core, Fluid-Structure Interaction solver based on the Lattice-Boltzmann Method
NASA Astrophysics Data System (ADS)
Harwood, Adrian R. G.; O'Connor, Joseph; Sanchez Muñoz, Jonathan; Camps Santasmasas, Marta; Revell, Alistair J.
2018-01-01
The Lattice-Boltzmann Method at the University of Manchester (LUMA) project was commissioned to build a collaborative research environment in which researchers of all abilities can study fluid-structure interaction (FSI) problems in engineering applications from aerodynamics to medicine. It is built on the principles of accessibility, simplicity and flexibility. The LUMA software at the core of the project is a capable FSI solver with turbulence modelling and many-core scalability as well as a wealth of input/output and pre- and post-processing facilities. The software has been validated and several major releases benchmarked on supercomputing facilities internationally. The software architecture is modular and arranged logically, using a minimal amount of object-orientation to keep the software simple and accessible.
Using Decision Structures for Policy Analysis in Software Product-line Evolution - A Case Study
NASA Astrophysics Data System (ADS)
Sarang, Nita; Sanglikar, Mukund A.
Project management decisions are the primary basis for project success (or failure). Mostly, such decisions are based on an intuitive understanding of the underlying software engineering and management process and risk being misjudged. Our problem domain is product-line evolution. We model the dynamics of the process by incorporating feedback loops appropriate to two decision structures: staffing policy, and the forces of growth associated with long-term software evolution. The model is executable and enables project managers to assess the long-term effects of possible actions. Our work also corroborates results from earlier studies of E-type systems, in particular the FEAST project and the rules for software evolution, planning and management.
ERIC Educational Resources Information Center
Zhang, Xuesong; Dorn, Bradley
2012-01-01
Agile development has received increasing interest both in industry and academia due to its benefits in developing software quickly, meeting customer needs, and keeping pace with the rapidly changing requirements. However, agile practices and scrum in particular have been mainly tested in mid- to large-size projects. In this paper, we present…
Perchlorate Detection at Nanomolar Concentrations by Surface-Enhanced Raman Scattering
2009-01-01
Perchlorate (ClO4−) has emerged as a widespread environmental contaminant and has been detected in various foods. Raman spectra were acquired through a grating light path controlled by Renishaw WiRE software and analyzed with Galactic GRAMS software; nanoparticle sizes were characterized by dynamic light scattering using a ZetaPlus particle size analyzer (Brookhaven Instruments, Holtsville, NY).
System Engineering Concept Demonstration, System Engineering Needs. Volume 2
1992-12-01
Software is characterized by complexity, conformity, changeability, and invisibility; as Brooks observed, "Software entities are perhaps more complex for their size than any other human construct." In addition, software development depends on human actions and interactions that often fail or prove insufficient in large organizations. Specific needs in this area include support for incremental review and critique of information, and automated metrics support for measuring key quality aspects of software.
Models for Threat Assessment in Networks
2006-09-01
Cost-benefit analysis is often performed to determine the list of mitigation procedures; traditionally, risk assessment has been done in part with software. The report also examines false-positive and false-negative trends across all population sizes (for example, for r = 0.7 and m = 0.1).
Analysis of quality raw data of second generation sequencers with Quality Assessment Software.
Ramos, Rommel Tj; Carneiro, Adriana R; Baumbach, Jan; Azevedo, Vasco; Schneider, Maria Pc; Silva, Artur
2011-04-18
Second generation technologies have advantages over Sanger sequencing; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
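A minimal sketch of this kind of quality filter (assuming FASTQ input, Phred+33 encoding, and an illustrative mean-quality cutoff; it is not the published tool):

    # Drop reads whose mean Phred quality falls below a cutoff, then report
    # how many reads and bases survive for a coverage estimate.
    def mean_phred(qual_line, offset=33):
        return sum(ord(ch) - offset for ch in qual_line) / len(qual_line)

    def filter_fastq(path, min_mean_q=20):
        kept, total, kept_bases = 0, 0, 0
        with open(path) as fh:
            while True:
                header = fh.readline()
                if not header:
                    break
                seq = fh.readline().strip()
                fh.readline()                   # '+' separator line
                qual = fh.readline().strip()
                total += 1
                if mean_phred(qual) >= min_mean_q:
                    kept += 1
                    kept_bases += len(seq)
        return kept, total, kept_bases

    # Hypothetical usage (file name and GENOME_SIZE are placeholders):
    # kept, total, bases = filter_fastq("reads.fastq")
    # print(f"{kept}/{total} reads kept; coverage = {bases / GENOME_SIZE:.1f}x")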
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Scotti, S. J.
1989-01-01
The nonlinear mathematical programming method (formal optimization) has had many applications in engineering design. A figure illustrates the use of optimization techniques in the design process. The design process begins with the design problem, such as the classic example of the two-bar truss designed for minimum weight as seen in the leftmost part of the figure. If formal optimization is to be applied, the design problem must be recast in the form of an optimization problem consisting of an objective function, design variables, and constraint function relations. The middle part of the figure shows the two-bar truss design posed as an optimization problem. The total truss weight is the objective function, the tube diameter and truss height are design variables, with stress and Euler buckling considered as constraint function relations. Lastly, the designer develops or obtains analysis software containing a mathematical model of the object being optimized, and then interfaces the analysis routine with existing optimization software such as CONMIN, ADS, or NPSOL. This final stage of software development can be both tedious and error-prone. The Sizing and Optimization Language (SOL), a special-purpose computer language whose goal is to make the software implementation phase of optimum design easier and less error-prone, is presented.
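The two-bar truss example translates directly into code. The sketch below poses it for a general-purpose optimizer (SciPy's SLSQP standing in for CONMIN, ADS, or NPSOL); the load, geometry, material constants, and thin-wall stress formulas are illustrative assumptions, not the figures from the original example.

    # Two-bar truss weight minimization: design variables are tube diameter d
    # and truss height H; constraints are allowable stress and Euler buckling.
    import numpy as np
    from scipy.optimize import minimize

    P, B = 150e3, 0.75          # load [N], half-span [m]        (assumed)
    E, rho = 210e9, 7850.0      # modulus [Pa], density [kg/m^3] (steel, assumed)
    t, s_allow = 0.003, 250e6   # wall thickness [m], allowable stress [Pa]

    def member(x):
        d, H = x
        L = np.hypot(B, H)                  # member length
        A = np.pi * d * t                   # thin-walled tube area
        sigma = P * L / (2.0 * A * H)       # axial stress in each member
        sigma_cr = np.pi**2 * E * (d**2 + t**2) / (8.0 * L**2)  # Euler buckling
        return L, A, sigma, sigma_cr

    weight = lambda x: 2.0 * rho * member(x)[1] * member(x)[0]
    cons = [
        {"type": "ineq", "fun": lambda x: s_allow - member(x)[2]},       # strength
        {"type": "ineq", "fun": lambda x: member(x)[3] - member(x)[2]},  # stability
    ]
    res = minimize(weight, x0=[0.08, 0.8], bounds=[(0.01, 0.5), (0.1, 3.0)],
                   constraints=cons, method="SLSQP")
    print(res.x, res.fun)   # optimal (d, H) and minimum truss weight [kg]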
Natural 3D content on glasses-free light-field 3D cinema
NASA Astrophysics Data System (ADS)
Balogh, Tibor; Nagy, Zsolt; Kovács, Péter Tamás.; Adhikarla, Vamsi K.
2013-03-01
This paper presents a complete framework for capturing, processing and displaying free viewpoint video on a large-scale immersive light-field display. We present a combined hardware-software solution to visualize free viewpoint 3D video on a cinema-sized screen. The new glasses-free 3D projection technology can support a larger audience than existing autostereoscopic displays. We introduce and describe our new display system including optical and mechanical design considerations, the capturing system and render cluster for producing the 3D content, and the various software modules driving the system. The display is the first of its kind, equipped with front-projection light-field HoloVizio technology controlling up to 63 MP. It has all the advantages of previous light-field displays and, in addition, allows a more flexible arrangement with a larger screen size, matching cinema or meeting room geometries, yet is simpler to set up. The software system makes it possible to show 3D applications in real-time, besides natural content captured from dense camera arrangements as well as from sparse cameras covering a wider baseline. Our software system on the GPU-accelerated render cluster can also visualize pre-recorded Multi-view Video plus Depth (MVD4) videos on this light-field glasses-free cinema system, interpolating and extrapolating missing views.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zakariaee, R; Brown, C J; Hamarneh, G
2014-08-15
Dosimetric parameters based on dose-volume histograms (DVH) of contoured structures are routinely used to evaluate dose delivered to target structures and organs at risk. However, the DVH provides no information on the spatial distribution of the dose in situations of repeated fractions with changes in organ shape or size. The aim of this research was to develop methods to more accurately determine geometrically localized, cumulative dose to the bladder wall in intracavitary brachytherapy for cervical cancer. The CT scans and treatment plans of 20 cervical cancer patients were used. Each patient was treated with five high-dose-rate (HDR) brachytherapy fractions of 600 cGy prescribed dose. The bladder inner and outer surfaces were delineated using MIM Maestro software (MIM Software Inc.) and were imported into MATLAB (MathWorks) as 3-dimensional point clouds constituting the "bladder wall". A point-set registration toolbox for MATLAB, Coherent Point Drift (CPD), was used to non-rigidly transform the bladder-wall points from four of the fractions to the coordinate system of the remaining (reference) fraction, which was chosen to be the emptiest bladder for each patient. The doses were accumulated on the reference fraction and new cumulative dosimetric parameters were calculated. The LENT-SOMA toxicity scores of these patients were studied against the cumulative dose parameters. Based on this study, there was no significant correlation between the toxicity scores and the determined cumulative dose parameters.
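The registration-then-accumulation step can be sketched with the open-source pycpd implementation of Coherent Point Drift standing in for the MATLAB toolbox cited; the nearest-neighbour dose transfer below is an assumed simplification of the authors' accumulation scheme.

    # Sketch: warp one fraction's bladder-wall points onto the reference
    # fraction, then transfer per-point doses for accumulation (assumptions).
    import numpy as np
    from pycpd import DeformableRegistration

    def accumulate_dose(ref_pts, frac_pts, frac_dose):
        """Return frac_dose resampled onto the reference-fraction points."""
        reg = DeformableRegistration(X=ref_pts, Y=frac_pts)
        warped, _ = reg.register()      # frac_pts deformed toward ref_pts
        # Nearest-neighbour transfer of the per-point dose to the reference wall.
        d2 = ((ref_pts[:, None, :] - warped[None, :, :]) ** 2).sum(-1)
        return frac_dose[d2.argmin(axis=1)]

    # Hypothetical usage over the four non-reference fractions:
    # cumulative = sum(accumulate_dose(ref, pts_k, dose_k)
    #                  for pts_k, dose_k in fractions)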
The development of a program analysis environment for Ada: Reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1991-01-01
The Graphical Representations of Algorithms, Structures, and Processes for Ada (GRASP/Ada) has successfully created and prototyped a new algorithm level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and thus improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under the Virtual Memory System (VMS) on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. In Phase 3 of the project, the prototype was prepared for limited distribution (GRASP/Ada Version 3.0) to facilitate evaluation. The user interface was extensively reworked. The current prototype provides the capability for the user to generate CSD from Ada source code in a reverse engineering mode with a level of flexibility suitable for practical application.
Information Systems and Software Engineering Research and Education in Oulu until the 1990s
NASA Astrophysics Data System (ADS)
Oinas-Kukkonen, Henry; Kerola, Pentti; Oinas-Kukkonen, Harri; Similä, Jouni; Pulli, Petri
This paper discusses the internationalization of software business in the Oulu region. Despite its small size, the region grew rapidly and very successfully into a global information and communication technology business center. The University of Oulu, which was the northernmost university in the world at the time of its establishment (1958), has had a strong emphasis on engineering since its very beginning. Research on electronics has been carried out since the early 1960s. Later, when the Department of Information Processing Science was founded in 1969, research on information systems and later also on software engineering was carried out. This paper discusses the role of information systems and software engineering research in the business growth of the region. Special emphasis is put on understanding the role of system-theoretical and software development expertise in transferring research knowledge into practice.
Performance testing of 3D point cloud software
NASA Astrophysics Data System (ADS)
Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.
2013-10-01
LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data managing are available in the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.
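A measurement of the kind used in such tests is easy to reproduce. This sketch (Python with psutil; the file name and whitespace-XYZ loader are assumptions) records wall-clock loading time and the growth in resident memory, two of the quantities compared above.

    # Sketch of a loading-time and memory benchmark for a point-cloud file.
    import time
    import numpy as np
    import psutil

    def benchmark_load(path):
        proc = psutil.Process()
        rss_before = proc.memory_info().rss
        t0 = time.perf_counter()
        cloud = np.loadtxt(path)       # stand-in for a LiDAR point-cloud loader
        elapsed = time.perf_counter() - t0
        rss_delta = proc.memory_info().rss - rss_before
        return elapsed, rss_delta / 2**20, len(cloud)

    # secs, mib, npts = benchmark_load("scan.xyz")   # file name is a placeholder
    # print(f"{npts} points in {secs:.2f} s, +{mib:.0f} MiB resident")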
DenInv3D: a geophysical software for three-dimensional density inversion of gravity field data
NASA Astrophysics Data System (ADS)
Tian, Yu; Ke, Xiaoping; Wang, Yong
2018-04-01
This paper presents a three-dimensional density inversion software package called DenInv3D that operates on gravity and gravity-gradient data. The software performs inversion modelling, kernel function calculation, and inversion calculations using an improved preconditioned conjugate gradient (PCG) algorithm. In the PCG algorithm, because empirical parameters such as the Lagrange multiplier are uncertain, we use the inflection point of the L-curve as the regularisation parameter. The software can construct unequally spaced grids and perform inversions on such grids, which enables changing the resolution of the inversion results at different depths. Through inversion of airborne gradiometry data from the Australian Kauring test site, we discovered that anomalous blocks of different sizes are present within the study area in addition to the central anomalies. The DenInv3D software can be downloaded from http://159.226.162.30.
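The regularised inversion step can be sketched as follows. This is not DenInv3D: it solves the Tikhonov-regularised normal equations with plain (unpreconditioned) conjugate gradients on a toy kernel, and `lam` stands in for the L-curve-selected parameter.

    # Sketch: solve min ||G m - d||^2 + lam ||m||^2 by conjugate gradients.
    import numpy as np
    from scipy.sparse.linalg import cg

    def density_inversion(G, d, lam):
        """G: kernel matrix (stations x cells), d: gravity data, lam: regulariser."""
        A = G.T @ G + lam * np.eye(G.shape[1])
        m, info = cg(A, G.T @ d)
        if info != 0:
            raise RuntimeError(f"CG did not converge (info={info})")
        return m

    # Toy problem: 40 observations, 25 model cells (all values illustrative).
    rng = np.random.default_rng(0)
    G = rng.normal(size=(40, 25))
    m_true = rng.normal(size=25)
    d = G @ m_true + rng.normal(scale=0.01, size=40)
    print(np.allclose(density_inversion(G, d, lam=1e-3), m_true, atol=0.1))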
The systematic evolution of a NASA software technology, Appendix C
NASA Technical Reports Server (NTRS)
Deregt, M. P.; Dulfer, J. E.
1972-01-01
A long range program is described whose ultimate purpose is to make possible the production of software in NASA within predictable schedule and budget constraints and with major characteristics such as size, run-time, and correctness predictable within reasonable tolerances. As part of the program a pilot NASA computer center will be chosen to apply software development and management techniques systematically and determine a set which is effective. The techniques will be developed by a Technology Group, which will guide the pilot project and be responsible for its success. The application of the technology will involve a sequence of NASA programming tasks graduated from simpler ones at first to complex systems in late phases of the project. The evaluation of the technology will be made by monitoring the operation of the software at the users' installations. In this way a coherent discipline for software design, production, maintenance, and management will be evolved.
Features of commercial computer software systems for medical examiners and coroners.
Hanzlick, R L; Parrish, R G; Ing, R
1993-12-01
There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.
Secure software practices among Malaysian software practitioners: An exploratory study
NASA Astrophysics Data System (ADS)
Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina
2016-08-01
Secure software practices are increasingly gaining importance among software practitioners and researchers due to the rise of computer crimes in the software industry. They have become one of the determinant factors for producing high-quality software. Even though their importance has been recognized, such practices are still scarce in the software industry, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the software produced.
Desmarais, Samantha M.; Tropini, Carolina; Miguel, Amanda; Cava, Felipe; Monds, Russell D.; de Pedro, Miguel A.; Huang, Kerwyn Casey
2015-01-01
The bacterial cell wall is a network of glycan strands cross-linked by short peptides (peptidoglycan); it is responsible for the mechanical integrity of the cell and shape determination. Liquid chromatography can be used to measure the abundance of the muropeptide subunits composing the cell wall. Characteristics such as the degree of cross-linking and average glycan strand length are known to vary across species. However, a systematic comparison among strains of a given species has yet to be undertaken, making it difficult to assess the origins of variability in peptidoglycan composition. We present a protocol for muropeptide analysis using ultra performance liquid chromatography (UPLC) and demonstrate that UPLC achieves resolution comparable with that of HPLC while requiring orders of magnitude less injection volume and a fraction of the elution time. We also developed a software platform to automate the identification and quantification of chromatographic peaks, which we demonstrate has improved accuracy relative to other software. This combined experimental and computational methodology revealed that peptidoglycan composition was approximately maintained across strains from three Gram-negative species despite taxonomical and morphological differences. Peptidoglycan composition and density were maintained after we systematically altered cell size in Escherichia coli using the antibiotic A22, indicating that cell shape is largely decoupled from the biochemistry of peptidoglycan synthesis. High-throughput, sensitive UPLC combined with our automated software for chromatographic analysis will accelerate the discovery of peptidoglycan composition and the molecular mechanisms of cell wall structure determination. PMID:26468288
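Automated peak identification and quantification of the sort described can be sketched with standard signal-processing tools (this is not the authors' platform); the detection thresholds and the synthetic two-peak chromatogram are illustrative.

    # Sketch: detect chromatographic peaks and integrate their areas.
    import numpy as np
    from scipy.signal import find_peaks, peak_widths

    def quantify_peaks(t, signal, min_height=0.02):
        idx, props = find_peaks(signal, height=min_height, prominence=min_height / 2)
        widths, _, lo, hi = peak_widths(signal, idx, rel_height=0.95)
        areas = [np.trapz(signal[int(l):int(h) + 1], t[int(l):int(h) + 1])
                 for l, h in zip(lo, hi)]
        return list(zip(t[idx], props["peak_heights"], areas))

    # Synthetic two-peak chromatogram standing in for a UPLC trace:
    t = np.linspace(0, 10, 2000)
    s = np.exp(-((t - 3) / 0.1) ** 2) + 0.5 * np.exp(-((t - 6) / 0.15) ** 2)
    for rt, height, area in quantify_peaks(t, s):
        print(f"RT {rt:.2f} min  height {height:.2f}  area {area:.3f}")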
The advanced magnetovision system for Smart application
NASA Astrophysics Data System (ADS)
Kaleta, Jerzy; Wiewiórski, Przemyslaw; Lewandowski, Daniel
2010-04-01
An original method, measurement devices and a software tool for the examination of magneto-mechanical phenomena in a wide range of SMART applications are proposed. In many high-end constructions it is necessary to examine mechanical and magnetic properties simultaneously. Technological processes used to fabricate modern materials (for example cutting, premagnetisation and prestress) and advanced concepts for using SMART structures call for a next-generation system for optimizing electric and magnetic field distributions. A fast scanner with multisensor probes and better than million-point static resolution has been constructed to measure all components of the magnetic field intensity vector H and to visualize them in a form acceptable to the end user. The scanner can also acquire electric surface potentials to work with magneto-piezo devices. Advanced electronic subsystems have been applied for processing the results in the Magscaner Vison System, and the corresponding software, Maglab, has also been developed. The Dipole Contour Method (DCM) is provided for modeling different coupling states between magnetic and electric materials and for visually explaining the experimental data. Dedicated software collaborates with industrial parametric CAD systems. The measurement technique consists of acquiring a cloud of points, similarly to tomography, with 3D visualisation. The ongoing verification of the capabilities of the 3D digitizer will enable inspection of SMART actuators of cylindrical form and miniature pellets designed for oscillation dampers in various constructions, for example in the vehicle industry.
NASA Technical Reports Server (NTRS)
Eichmann, David A.
1992-01-01
We present a user interface for a software reuse repository that relies both on the informal semantics of faceted classification and on the formal semantics of type signatures for abstract data types. The result is an interface providing both structural and qualitative feedback to a software reuser.
Design of Dual Band Microstrip Patch Antenna using Metamaterial
NASA Astrophysics Data System (ADS)
Rafiqul Islam, Md; Alsaleh Adel, A. A.; Mimi, Aminah W. N.; Yasmin, M. Sarah; Norun, Farihah A. M.
2017-11-01
Metamaterials have received great attention due to their novel electromagnetic properties. They consist of artificial metallic structures with negative permittivity (ɛ) and permeability (µ). The average cell size of a metamaterial must be less than a quarter of a wavelength; hence, size reduction of a metamaterial antenna is possible. In addition, metamaterials can be used to enhance the low gain and efficiency of a conventional patch antenna, which is important in wireless communication. In this paper, a dual-band microstrip patch antenna design using metamaterial for mobile GSM and WiMAX applications is introduced. The antenna structure consists of a microstrip feed line connected to a rectangular patch. An array of five split-ring resonator (SRR) unit cells is inserted under the patch. The presented antenna resonates at 1.8 GHz for mobile GSM and 2.4 GHz for WiMAX applications. The return loss of the FR4 antenna at 1.8 GHz is -22.5 dB. Using metamaterial, the return loss improved to -25 dB at 2.4 GHz and -23.5 dB at 1.8 GHz. A conventional microstrip patch antenna using a pair of slots, which also resonates at 1.8 GHz and 2.4 GHz, was designed for comparison; its return losses at 1.8 GHz and 2.4 GHz were -12.1 dB and -21.8 dB, respectively. The metamaterial antenna achieved a major size reduction of 45%, better bandwidth and better return loss compared to the pair-of-slots antenna. The software used to design, simulate and optimize the antennas is CST Microwave Studio.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
NASA Astrophysics Data System (ADS)
Abdelhadi, Ousama Mohamed Omer
Continuous miniaturization of microelectronic interconnects demands smaller joints with comparable microstructural and structural sizes. As the size of joints becomes smaller, the volume of intermetallics (IMCs) becomes comparable with the joint size. As a result, the kinetics of bond formation changes and the types and thicknesses of IMC phases that form within the constrained region of the bond vary. This dissertation focuses on investigating the combined effects of process parameters and size on the kinetics of bond formation, the resulting microstructure and the mechanical properties of joints that are formed under structurally constrained conditions. An experiment is designed where several process parameters, such as time of bonding, temperature, and pressure, and bond thickness as a structural characteristic, are varied at multiple levels. The experiment is then implemented on the process. Scanning electron microscopy (SEM) is then utilized to determine the bond thickness, IMC phases and their thicknesses, and morphology of the bonds. Electron backscatter diffraction (EBSD) is used to determine the grain size in different regions, including the bulk solder and the different IMC phases. Physics-based analytical models have been developed for the growth kinetics of IMC compounds and are verified using the experimental results. Nanoindentation is used to determine the mechanical behavior of IMC phases in joints at different scales. A notched multilayer specimen and the four-point bending technique were used to determine the fracture toughness of the bonds containing IMCs. An analytical model of peeling and shear stresses and fracture toughness in a tri-layer four-point bend specimen containing an intermetallic layer was developed and was verified and validated using finite element simulation and experimental results. The experiment is used in conjunction with the model to calculate and verify the fracture toughness of Cu6Sn5 IMC materials. As expected, two different IMC phases, η-phase (Cu6Sn5) and ε-phase (Cu3Sn), were found in almost all cases regardless of the process parameter and size levels. The physics-based analytical model successfully captured the governing mechanisms of IMC growth: chemical-reaction-controlled and diffusion-controlled. Examination of the microstructures of solder joints of different sizes revealed that the size of the solder joint has no effect on the type of IMCs formed during the process. Joint size, however, affected the thickness of IMC layers significantly. IMC layers formed in solder joints of smaller sizes were found to be thicker than those in solder joints of larger sizes. The growth rate constants and activation energies of the Cu3Sn IMC layer were also reported and related to joint thickness. In an effort to optimize the EBSD imaging in the multi-layer configuration, an improved specimen preparation technique and optimum software parameters were determined. Nanoindentation results show that size effects play a major role in the mechanical properties of micro-scale solder joints. Smaller joints show higher Young's modulus, hardness, and yield strength and lower work hardening exponents compared to thicker joints. To obtain the stress concentration factors in a multilayer specimen with an IMC layer as the bonding material, a four-point bending notched configuration was used. The analytical solutions developed for peeling and shear stresses in the notched structure were used to evaluate the stresses at the IMC interface layers.
Results were in good agreement with the finite-element simulation. The values of interfacial stresses were utilized in obtaining fracture toughness of the IMC material. (Abstract shortened by UMI.)
Computer Program Re-layers Engineering Drawings
NASA Technical Reports Server (NTRS)
Crosby, Dewey C., III
1990-01-01
RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.
Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew
2014-01-01
Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), a graphical method useful for time series analyses in general and for interrupted time series designs in particular. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
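The wrap-around idea is simple to reproduce. The sketch below (matplotlib, synthetic monthly data) overlays each year of a seasonal series on a polar axis, which is the essence of a WATS Plot, though it is not the authors' software.

    # Wrap a monthly series around the circle so successive years overlay
    # month-by-month (synthetic data; all values illustrative).
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    months = np.arange(120)                          # ten years of monthly data
    rate = 100 + 5 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, 120)

    theta = 2 * np.pi * (months % 12) / 12           # month -> angle
    ax = plt.subplot(projection="polar")
    for year in range(10):
        sel = slice(12 * year, 12 * (year + 1))
        ax.plot(np.append(theta[sel], theta[sel][0]),  # close each yearly loop
                np.append(rate[sel], rate[sel][0]), alpha=0.6)
    ax.set_xticks(theta[:12])
    ax.set_xticklabels("JFMAMJJASOND")
    plt.show()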
DDGui, a new and fast way to analyse DRAGON and DONJON code results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chambon, R.; Marleau, G.
2012-07-01
With the greatly increased performance of computers, the results from DRAGON and DONJON have increased in size and complexity. The scroll, copy, and paste technique for extracting results is no longer adequate. Many in-house scripts, software tools, and macros have been developed to make data gathering easier. However, the limitation of these solutions is their specificity and the difficulty of porting them from one place to another. A general tool usable and accessible by everyone was needed. The first bricks of a very fast and intuitive way to analyse DRAGON and DONJON results have been put together in the graphical user interface DDGUI. Based on the extensive ROOT C++ package, the possible features are numerous. For this first version of the software, we have programmed the fundamental tools likely to be the most useful on an everyday basis: viewing the contents of data structures, drawing the geometry, and drawing the flux or power from a DONJON computation. The tests show how quickly the user can get the information needed for a general overview or for more precise analyses. Several other features will be implemented in the near future. (authors)
Modal analysis and acoustic transmission through offset-core honeycomb sandwich panels
NASA Astrophysics Data System (ADS)
Mathias, Adam Dustin
The work presented in this thesis is motivated by earlier research showing that double, offset-core honeycomb sandwich panels increased thermal resistance and, hence, decreased heat transfer through the panels. This result led to the hypothesis that these panels could be used for acoustic insulation. Using commercial finite element modeling software, COMSOL Multiphysics, the acoustical properties, specifically the transmission loss across a variety of offset-core honeycomb sandwich panels, are studied for the case of a plane acoustic wave impacting the panel at normal incidence. The transmission loss results are compared with those of single-core honeycomb panels with the same cell sizes. The fundamental frequencies of the panels are also computed in an attempt to better understand the vibrational modes of these particular sandwich-structured panels. To ensure that the finite element analysis software is adequate for the task at hand, two relevant benchmark problems are solved; the benchmark results compared well with theory. Transmission loss results from the offset-core honeycomb sandwich panels show increased transmission loss, especially for large-cell honeycombs, when compared to single-core honeycomb panels.
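For context on what a normal-incidence transmission-loss curve looks like in the simplest case, the sketch below evaluates the standard mass law for a limp panel. This is a swapped-in textbook baseline, not the thesis's COMSOL model; real honeycomb sandwich panels deviate strongly from it near coincidence and core resonances, and the surface density used is hypothetical.

```python
import numpy as np

def mass_law_TL(f, m_area, rho0=1.21, c0=343.0):
    """Normal-incidence mass-law transmission loss (dB) for a limp panel.

    f      : frequency, Hz
    m_area : panel surface density, kg/m^2 (hypothetical value below)
    Baseline only; sandwich panels depart from this near coincidence
    and core-resonance frequencies.
    """
    omega = 2 * np.pi * f
    return 10 * np.log10(1 + (omega * m_area / (2 * rho0 * c0)) ** 2)

f = np.logspace(2, 4, 7)            # 100 Hz to 10 kHz
print(mass_law_TL(f, m_area=4.0))   # TL rises roughly 6 dB per octave
```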
NASA Technical Reports Server (NTRS)
Grasso, Christopher; Page, Dennis; O'Reilly, Taifun; Fteichert, Ralph; Lock, Patricia; Lin, Imin; Naviaux, Keith; Sisino, John
2005-01-01
Virtual Machine Language (VML) is a mission-independent, reusable software system for programming spacecraft operations. Features of VML include a rich set of data types, named functions, parameters, IF and WHILE control structures, polymorphism, and on-the-fly creation of spacecraft commands from calculated values. Spacecraft functions can be abstracted into named blocks that reside in files aboard the spacecraft. These named blocks accept parameters and execute in a repeatable fashion. The sizes of uplink products are minimized by the ability to call blocks that implement most of the command steps. This block approach also enables some autonomous operations aboard the spacecraft, such as aerobraking, telemetry conditional monitoring, and anomaly response, without developing autonomous flight software. Operators on the ground write blocks and command sequences in a concise, high-level, human-readable programming language (also called VML). A compiler translates the human-readable blocks and command sequences into binary files (the operations products). The flight portion of VML interprets the uplinked binary files. The ground subsystem of VML also includes an interactive sequence-execution tool hosted on workstations, which runs sequences at several thousand times real-time speed, affords debugging, and generates reports. This tool enables iterative development of blocks and sequences on time scales of the order of seconds.
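To make the named-block idea concrete, here is a toy Python analogy; the block names, commands, and dispatch are entirely hypothetical, and actual VML syntax, its compiler, and its onboard interpreter are not shown in the abstract.

```python
# Toy analogy of VML's named, parameterized blocks (hypothetical, not VML).
blocks = {}

def block(fn):
    """Register a named block (a stand-in for a block file aboard the craft)."""
    blocks[fn.__name__] = fn
    return fn

def issue_command(cmd, *args):
    print("CMD:", cmd, args)        # stand-in for real command issuance

def read_temp(zone):
    read_temp.t = getattr(read_temp, "t", 10.0) + 5.0  # stubbed telemetry
    return read_temp.t

@block
def warm_up_heater(zone, setpoint_c):
    # WHILE-style control structure driving repeated commanding
    while read_temp(zone) < setpoint_c:
        issue_command("HEATER_ON", zone)

blocks["warm_up_heater"]("ZONE_A", 25.0)   # invoke a block by name
```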
ReadXplorer—visualization and analysis of mapped sequences
Hilker, Rolf; Stadermann, Kai Bernd; Doppmeier, Daniel; Kalinowski, Jörn; Stoye, Jens; Straube, Jasmin; Winnebald, Jörn; Goesmann, Alexander
2014-01-01
Motivation: Fast algorithms and well-arranged visualizations are required for the comprehensive analysis of the ever-growing size of genomic and transcriptomic next-generation sequencing data. Results: ReadXplorer is a software offering straightforward visualization and extensive analysis functions for genomic and transcriptomic DNA sequences mapped on a reference. A unique specialty of ReadXplorer is the quality classification of the read mappings. It is incorporated in all analysis functions and displayed in ReadXplorer's various synchronized data viewers for (i) the reference sequence, its base coverage as (ii) normalizable plot and (iii) histogram, (iv) read alignments and (v) read pairs. ReadXplorer's analysis capability covers RNA secondary structure prediction, single nucleotide polymorphism and deletion–insertion polymorphism detection, genomic feature and general coverage analysis. Especially for RNA-Seq data, it offers differential gene expression analysis, transcription start site and operon detection as well as RPKM value and read count calculations. Furthermore, ReadXplorer can combine or superimpose coverage of different datasets. Availability and implementation: ReadXplorer is available as open-source software at http://www.readxplorer.org along with a detailed manual. Contact: rhilker@mikrobio.med.uni-giessen.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24790157
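As a minimal illustration of one of the viewers described, the sketch below computes per-base coverage from read alignments using a difference array; the (start, end) interval representation is a simplified stand-in for ReadXplorer's mapping classes, not its actual API.

```python
import numpy as np

def base_coverage(ref_len, alignments):
    """Per-base read coverage via a difference array.

    alignments: iterable of (start, end) 0-based half-open intervals,
    a simplified stand-in for classified read mappings.
    """
    diff = np.zeros(ref_len + 1, dtype=np.int64)
    for start, end in alignments:
        diff[start] += 1     # a read begins covering here
        diff[end] -= 1       # ... and stops covering here
    return np.cumsum(diff)[:-1]

cov = base_coverage(20, [(0, 10), (5, 15), (5, 20)])
print(cov)   # positions 5..9 are covered by all three reads
```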
PubChem3D: Conformer generation
2011-01-01
Background: PubChem, an open archive for the biological activities of small molecules, provides search and analysis tools to assist users in locating desired information. Many of these tools focus on the notion of chemical structure similarity at some level. PubChem3D enables similarity of chemical structure 3-D conformers to augment the existing similarity of 2-D chemical structure graphs. It is also desirable to relate theoretical 3-D descriptions of chemical structures to experimental biological activity. As such, it is important to be assured that the theoretical conformer models can reproduce experimentally determined bioactive conformations. In the present study, we investigate the effects of three primary conformer generation parameters (the fragment sampling rate, the energy window size, and the force field variant) upon the accuracy of theoretical conformer models, and determine optimal settings for PubChem3D conformer model generation and conformer sampling. Results: Using the software package OMEGA from OpenEye Scientific Software, Inc., theoretical 3-D conformer models were generated for 25,972 small-molecule ligands whose 3-D structures were experimentally determined. Different values of the primary conformer generation parameters were systematically tested to find optimal settings. Employing a greater fragment sampling rate than the default did not improve the accuracy of the theoretical conformer model ensembles. Increasing the energy window did increase the overall average accuracy, with rapid convergence observed at 10 kcal/mol and 15 kcal/mol for model building and torsion search, respectively; however, subsequent study showed that an energy threshold of 25 kcal/mol for torsion search gave slightly improved results for larger and more flexible structures. Exclusion of Coulomb terms from the 94s variant of the Merck molecular force field (MMFF94s) in the torsion search stage gave more accurate conformer models at lower energy windows. Overall average accuracy of reproduction of bioactive conformations was remarkably linear with respect to both non-hydrogen atom count ("size") and effective rotor count ("flexibility"). Using these as independent variables, a regression equation was developed to predict the RMSD accuracy of a theoretical ensemble in reproducing bioactive conformations. The equation was modified to give a minimum RMSD conformer sampling value to help ensure that 90% of the sampled theoretical models contain at least one conformer within the RMSD sampling value of a "bioactive" conformation. Conclusion: Optimal parameters for conformer generation using OMEGA were explored and determined. An equation was developed that provides an RMSD sampling value based on the relative accuracy of reproducing bioactive conformations. The optimal conformer generation parameters and RMSD sampling values determined are used by the PubChem3D project to generate theoretical conformer models. PMID:21272340
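The regression idea can be sketched in a few lines; the data points and fitted coefficients below are hypothetical, standing in for the published equation, and only illustrate fitting a linear dependence of RMSD on size and flexibility.

```python
import numpy as np

# Hypothetical training data: non-hydrogen atom count ("size"),
# effective rotor count ("flexibility"), and observed ensemble RMSD.
size   = np.array([12, 20, 28, 35, 44, 52], dtype=float)
rotors = np.array([ 1,  3,  5,  7, 10, 13], dtype=float)
rmsd   = np.array([0.4, 0.6, 0.8, 1.0, 1.3, 1.6])

# Least-squares fit of RMSD = a*size + b*rotors + c, mirroring the
# linear dependence on size and flexibility reported above.
A = np.column_stack([size, rotors, np.ones_like(size)])
(a, b, c), *_ = np.linalg.lstsq(A, rmsd, rcond=None)
print(f"predicted RMSD = {a:.3f}*size + {b:.3f}*rotors + {c:.3f}")
```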
Sanyal, Parikshit; Ganguli, Prosenjit; Barui, Sanghita; Deb, Prabal
2018-01-01
The Pap-stained cervical smear is a screening tool for cervical cancer. Commercial systems are used for automated screening of liquid-based cervical smears. However, there is no image analysis software for conventional cervical smears. The aim of this study was to develop and test the diagnostic accuracy of software for the analysis of conventional smears. The software was developed using the Python programming language and open-source libraries. It was standardized with images from the Bethesda Interobserver Reproducibility Project. One hundred and thirty images from smears reported as Negative for Intraepithelial Lesion or Malignancy (NILM), and 45 images in which some abnormality had been reported, were collected from the archives of the hospital. The software was then tested on the images. The software was able to segregate images based on overall nuclear:cytoplasmic ratio, coefficient of variation (CV) in nuclear size, nuclear membrane irregularity, and clustering. It flagged 68.88% of abnormal images, as well as 19.23% of NILM images. The major difficulties faced were segmentation of overlapping cell clusters and separation of neutrophils. The software shows potential as a screening tool for conventional cervical smears; however, further refinement of the technique is required.
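A minimal sketch of the kind of measurement described, assuming an RGB smear image and using Otsu thresholding as a stand-in for the authors' segmentation pipeline (which is not detailed in the abstract); like theirs, it will struggle with overlapping clusters and neutrophils.

```python
import numpy as np
from skimage import io, color, filters, measure

def nuclear_stats(image_path):
    """Rough nuclear measurements on one Pap-smear field (RGB assumed)."""
    gray = color.rgb2gray(io.imread(image_path))
    nuclei = gray < filters.threshold_otsu(gray)    # nuclei stain darker
    areas = np.array([r.area for r in
                      measure.regionprops(measure.label(nuclei))])
    cv = areas.std() / areas.mean()                 # CV in nuclear size
    nc_ratio = nuclei.sum() / (~nuclei).sum()       # crude overall N:C ratio
    return nc_ratio, cv

print(nuclear_stats("field_01.png"))  # hypothetical image file
```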
NASA Software Cost Estimation Model: An Analogy Based Estimation Model
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James
2015-01-01
The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of its software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest-neighbor prediction model performance on the same data set.
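A generic analogy-based estimator of the kind evaluated here can be sketched as k-nearest-neighbor prediction; the project features and efforts below are hypothetical, and this is not the actual NASA model.

```python
import numpy as np

def knn_effort(features, efforts, query, k=3):
    """Analogy-based effort estimate: mean effort of the k most similar
    past projects (Euclidean distance on z-normalized features).
    """
    X = np.asarray(features, dtype=float)
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    X_norm = (X - mu) / sigma
    q = (np.asarray(query, dtype=float) - mu) / sigma
    d = np.linalg.norm(X_norm - q, axis=1)         # similarity by distance
    nearest = np.argsort(d)[:k]                    # k closest analogies
    return np.mean(np.asarray(efforts)[nearest])

# Hypothetical past projects: [KSLOC, team size] -> effort (person-months)
hist = [[25, 6], [110, 20], [60, 12], [200, 35], [40, 8]]
eff  = [80, 600, 250, 1400, 150]
print(knn_effort(hist, eff, query=[75, 15], k=3))
```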
Alexander, R.B.; Ludtke, A.S.; Fitzgerald, K.K.; Schertz, T.L.
1996-01-01
Data from two U.S. Geological Survey (USGS) national stream water-quality monitoring networks, the National Stream Quality Accounting Network (NASQAN) and the Hydrologic Benchmark Network (HBN), are now available in a two-CD-ROM set. These data on CD-ROM are collectively referred to as WQN, water-quality networks. Data from these networks have been used at the national, regional, and local levels to estimate the rates of chemical flux from watersheds, quantify changes in stream water quality for periods during the past 30 years, and investigate relations between water quality and streamflow as well as the relations of water quality to pollution sources and various physical characteristics of watersheds. The networks include 679 monitoring stations in watersheds that represent diverse climatic, physiographic, and cultural characteristics. The HBN includes 63 stations in relatively small, minimally disturbed basins ranging in size from 2 to 2,000 square miles with a median drainage basin size of 57 square miles. NASQAN includes 618 stations in larger, more culturally influenced drainage basins ranging in size from one square mile to 1.2 million square miles with a median drainage basin size of about 4,000 square miles. The CD-ROMs contain data for 63 physical, chemical, and biological properties of water (122 total constituents, including analyses of dissolved and suspended-sediment samples) collected during more than 60,000 site visits. These data approximately span the periods 1962-95 for HBN and 1973-95 for NASQAN. The data reflect sampling over a wide range of streamflow conditions and the use of relatively consistent sampling and analytical methods. The CD-ROMs provide ancillary information and data-retrieval tools to allow the national network data to be properly and efficiently used. Ancillary information includes the following: descriptions of the network objectives and history, characteristics of the network stations and water-quality data, historical records of important changes in network sample collection and laboratory analytical methods, water reference sample data for estimating laboratory measurement bias and variability for 34 dissolved constituents for the period 1985-95, discussions of statistical methods for using water reference sample data to evaluate the accuracy of network stream water-quality data, and a bibliography of scientific investigations using national network data and other publications relevant to the networks. The data structure of the CD-ROMs is designed to allow users to efficiently load the water-quality data into user-supplied software packages, including statistical analysis, modeling, or geographic information systems. On one disc, all data are stored in ASCII form accessible from any computer system with a CD-ROM drive. The data also can be accessed using DOS-based retrieval software supplied on a second disc. This software supports logical queries of the water-quality data based on constituent concentrations, sample-collection date, river name, station name, county, state, hydrologic unit number, and 1990 population and 1987 land-cover characteristics for station watersheds. User-selected data may be output in a variety of formats, including dBASE, flat ASCII, delimited ASCII, or fixed-field, for subsequent use in other software packages.
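The style of logical query supported by the retrieval software can be reproduced in a few lines on an exported table; the file name and column names below are hypothetical, standing in for a delimited-ASCII export from the discs.

```python
import pandas as pd

# Load a hypothetical delimited-ASCII export of the water-quality data
wqn = pd.read_csv("wqn_export.txt", sep="|", parse_dates=["sample_date"])

# Logical query in the same style as the DOS retrieval software:
# constituent, concentration threshold, state, and date range
hits = wqn[
    (wqn["constituent"] == "dissolved nitrate")
    & (wqn["concentration_mg_l"] > 1.0)
    & (wqn["state"] == "OK")
    & (wqn["sample_date"].dt.year.between(1980, 1995))
]
print(hits[["station_name", "sample_date", "concentration_mg_l"]])
```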
Detailed requirements document for the integrated structural analysis system, phase B
NASA Technical Reports Server (NTRS)
Rainey, J. A.
1976-01-01
The requirements are defined for a software system entitled Integrated Structural Analysis System (ISAS) Phase B, which is being developed to provide the user with a tool by which a complete and detailed analysis of a complex structural system can be performed. This software system will allow for automated interfaces with numerous structural analysis batch programs and for user interaction in the creation, selection, and validation of data. The system will include modifications to the 4 functions developed for ISAS and the development of 25 new functions. The new functions are described.
GPAW - massively parallel electronic structure calculations with Python-based software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enkovaara, J.; Romero, N.; Shende, S.
2011-01-01
Electronic structure calculations are a widely used tool in materials science and a large consumer of supercomputing resources. Traditionally, the software packages for these kinds of simulations have been implemented in compiled languages, where Fortran in its different versions has been the most popular choice. While dynamic, interpreted languages such as Python can increase the efficiency of the programmer, they cannot compete directly with the raw performance of compiled languages. However, by using an interpreted language together with a compiled language, it is possible to have most of the productivity-enhancing features together with good numerical performance. We have used this approach in implementing the electronic structure simulation software GPAW, using the combination of the Python and C programming languages. While the chosen approach works well on standard workstations and in Unix environments, massively parallel supercomputing systems can present some challenges in porting, debugging, and profiling the software. In this paper we describe some details of the implementation and discuss the advantages and challenges of the combined Python/C approach. We show that despite the challenges it is possible to obtain good numerical performance and good parallel scalability with Python-based software.
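A minimal sketch of the Python/C split, assuming a hypothetical compiled kernel exposing a C function with signature void axpy(int n, double a, double *x, double *y). GPAW itself uses C extension modules rather than ctypes, so this only illustrates the division of labor: numerics in compiled code, orchestration in Python.

```python
import ctypes
import numpy as np

# Load a hypothetical compiled kernel (requires a matching kernel.so)
lib = ctypes.CDLL("./kernel.so")
lib.axpy.argtypes = [ctypes.c_int, ctypes.c_double,
                     np.ctypeslib.ndpointer(np.float64),
                     np.ctypeslib.ndpointer(np.float64)]

x = np.ones(1_000_000)
y = np.zeros_like(x)
lib.axpy(x.size, 2.0, x, y)   # y += 2.0 * x, executed at compiled speed
```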
Element Load Data Processor (ELDAP) Users Manual
NASA Technical Reports Server (NTRS)
Ramsey, John K., Jr.; Ramsey, John K., Sr.
2015-01-01
Often, the shear and tensile forces and moments are extracted from finite element analyses to be used in off-line calculations for evaluating the integrity of structural connections involving bolts, rivets, and welds. Usually the maximum forces and moments are desired for use in the calculations. In situations where there are numerous structural connections of interest for numerous load cases, finding the true maximum force and/or moment combinations among all fasteners, welds, and load cases becomes difficult. The Element Load Data Processor (ELDAP) software described herein makes this effort manageable. This software eliminates the possibility of overlooking the worst-case forces and moments, which could result in erroneous positive margins of safety, and of selecting inconsistent combinations of forces and moments, which could result in false negative margins of safety. In addition to forces and moments, any scalar quantity output in a PATRAN report file may be evaluated with this software. This software was originally written to fill an urgent need during the structural analysis of the Ares I-X Interstage segment. As such, it was coded in a straightforward manner, with no effort made to optimize or minimize code or to develop a graphical user interface.
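The worst-case search that ELDAP automates can be sketched as a group-wise row selection; the file and column names are hypothetical stand-ins for a parsed PATRAN report. Selecting whole rows, rather than independent per-column maxima, is what keeps the force/moment combinations consistent.

```python
import pandas as pd

# Hypothetical parsed report: one row per (element, load_case) with
# columns: element, load_case, shear, tension, moment
df = pd.read_csv("element_loads.csv")

# Worst case per fastener by tension, keeping the whole row so the
# reported shear and moment come from the same load case
worst = df.loc[df.groupby("element")["tension"].idxmax(),
               ["element", "load_case", "shear", "tension", "moment"]]
print(worst)
```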
i-Tree: Tools to assess and manage structure, function, and value of community forests
NASA Astrophysics Data System (ADS)
Hirabayashi, S.; Nowak, D.; Endreny, T. A.; Kroll, C.; Maco, S.
2011-12-01
Trees in urban communities can mitigate many adverse effects associated with anthropogenic activities and climate change (e.g., urban heat island, greenhouse gases, air pollution, and floods). To protect environmental and human health, managers need to make informed decisions regarding urban forest management practices. Here we present the i-Tree suite of software tools (www.itreetools.org) developed by the USDA Forest Service and their cooperators. This software suite can help urban forest managers assess and manage the structure, function, and value of urban tree populations regardless of community size or technical capacity. i-Tree is a state-of-the-art, peer-reviewed, Windows GUI- or Web-based software suite that is freely available, supported, and continuously refined by the USDA Forest Service and their cooperators. Two major features of i-Tree are 1) to analyze current canopy structures and identify potential planting spots, and 2) to estimate the environmental benefits provided by the trees, such as carbon storage and sequestration, energy conservation, air pollution removal, and storm water reduction. To cover diverse forest topologies, various tools were developed within the i-Tree suite: i-Tree Design for points (individual trees), i-Tree Streets for lines (street trees), and i-Tree Eco, Vue, and Canopy (in order of complexity) for areas (community trees). Once the forest structure is identified with these tools, ecosystem services provided by trees can be estimated with common models and protocols, and reports in the form of text, charts, and figures are then created for users. Since i-Tree was developed with a client/server architecture, nationwide US data such as location-related parameters, weather, streamflow, and air pollution data are stored on the server and retrieved to a user's computer at run-time. Freely available remote-sensed images (e.g., NLCD and Google maps) are also employed to estimate tree canopy characteristics. As the demand for i-Tree grows internationally, environmental databases from more countries will be coupled with the software suite. Two more i-Tree applications, i-Tree Forecast and i-Tree Landscape, are now under development. i-Tree Forecast simulates canopy structures for up to 100 years based on planting and mortality rates and adds capabilities for other i-Tree applications to estimate the benefits of future canopy scenarios. While most i-Tree applications employ a spatially lumped approach, i-Tree Landscape employs a spatially distributed approach that allows users to map changes in canopy cover and ecosystem services through time and space. These new i-Tree tools provide an advanced platform for urban managers to assess the impact of current and future urban forests. i-Tree allows managers to promote effective urban forest management and sound arboricultural practices by providing information for advocacy and planning, baseline data for making informed decisions, and standardization for comparisons with other communities.
Software For Genetic Algorithms
NASA Technical Reports Server (NTRS)
Wang, Lui; Bayer, Steve E.
1992-01-01
SPLICER computer program is genetic-algorithm software tool used to solve search and optimization problems. Provides underlying framework and structure for building genetic-algorithm application program. Written in Think C.
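For readers unfamiliar with the technique, a minimal bit-string genetic algorithm looks like the following Python sketch (illustrative only; SPLICER itself is a Think C framework, and its API is not shown here).

```python
import numpy as np

rng = np.random.default_rng(0)

def ga_maximize(fitness, n_bits=20, pop=50, gens=100, p_mut=0.01):
    """Minimal bit-string GA: fitness-proportional selection,
    one-point crossover, and bit-flip mutation."""
    P = rng.integers(0, 2, (pop, n_bits))
    for _ in range(gens):
        f = np.array([fitness(ind) for ind in P], dtype=float)
        parents = P[rng.choice(pop, size=pop, p=f / f.sum())]  # selection
        cuts = rng.integers(1, n_bits, pop // 2)
        for i, c in enumerate(cuts):                # one-point crossover
            a, b = parents[2 * i], parents[2 * i + 1]
            a[c:], b[c:] = b[c:].copy(), a[c:].copy()
        # bit-flip mutation with probability p_mut per bit
        P = np.where(rng.random(parents.shape) < p_mut, 1 - parents, parents)
    return P[np.argmax([fitness(ind) for ind in P])]

best = ga_maximize(lambda ind: ind.sum() + 1)   # toy "one-max" problem
print(best)                                     # converges toward all ones
```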
IB2d: a Python and MATLAB implementation of the immersed boundary method.
Battista, Nicholas A; Strickland, W Christopher; Miller, Laura A
2017-03-29
The development of fluid-structure interaction (FSI) software involves trade-offs between ease of use, generality, performance, and cost. Typically there are large learning curves when using low-level software to model the interaction of an elastic structure immersed in a uniform-density fluid. Many existing codes are not publicly available, and the commercial software that exists usually requires expensive licenses and may not be as robust or allow the necessary flexibility that in-house codes can provide. We present an open-source immersed boundary software package, IB2d, with full implementations in both MATLAB and Python, that is capable of running a vast range of biomechanics models and is accessible to scientists who have experience in high-level programming environments. IB2d contains multiple options for constructing material properties of the fiber structure, as well as the advection-diffusion of a chemical gradient, muscle mechanics models, and artificial forcing to drive boundaries with a preferred motion.
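The core coupling step of the immersed boundary method, spreading Lagrangian forces onto the Eulerian grid through a regularized delta function, can be sketched in one dimension. This uses Peskin's 4-point kernel, one common choice, and is a sketch of the technique rather than IB2d's actual code.

```python
import numpy as np

def delta_ib(r):
    """Peskin's 4-point discrete delta function (support |r| < 2)."""
    r = np.abs(r)
    phi = np.zeros_like(r)
    m1 = r < 1
    phi[m1] = (3 - 2 * r[m1] + np.sqrt(1 + 4 * r[m1] - 4 * r[m1] ** 2)) / 8
    m2 = (r >= 1) & (r < 2)
    phi[m2] = (5 - 2 * r[m2] - np.sqrt(-7 + 12 * r[m2] - 4 * r[m2] ** 2)) / 8
    return phi

def spread_force_1d(xL, fL, nx, h):
    """Spread Lagrangian point forces fL at positions xL onto a 1-D
    Eulerian grid of nx points with spacing h (periodicity ignored)."""
    x = np.arange(nx) * h
    F = np.zeros(nx)
    for xk, fk in zip(xL, fL):
        F += fk * delta_ib((x - xk) / h) / h   # regularized point force
    return F

F = spread_force_1d([0.5], [1.0], nx=64, h=1.0 / 64)
print(F.sum() * (1.0 / 64))   # approximately 1.0: total force is conserved
```

The interpolation of fluid velocity back to the structure uses the same kernel, which is what makes the coupling conservative.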