Jeddah Historical Building Information Modelling "JHBIM" - Object Library
NASA Astrophysics Data System (ADS)
Baik, A.; Alitany, A.; Boehm, J.; Robson, S.
2014-05-01
Building Information Modelling (BIM) has been applied to heritage sites worldwide for conserving, documenting and managing historic buildings and for producing full engineering drawings and information. However, one of the most serious issues facing experts who wish to use Historic Building Information Modelling (HBIM) is modelling the complicated architectural elements of historical buildings; many of these outstanding elements were designed and crafted on site to fit their exact locations. Experts in Old Jeddah face the same issue in applying the BIM method to the city's historic buildings. The Saudi Arabian city has a long history and contains a large number of historic houses and buildings dating back to the 16th century. Moreover, BIM models of historic buildings in Old Jeddah have always been time-consuming to build, because the Hijazi architectural elements are unique and no library of such elements exists, so each element must be modelled from scratch. This paper focuses on building a Hijazi architectural elements library based on laser scanner and image survey data. This solution will reduce the time needed to complete an HBIM model and will offer an in-depth and rich digital architectural elements library for use in any heritage project in the Al-Balad district of Jeddah City.
Lee, Jaehoon; Hulse, Nathan C; Wood, Grant M; Oniki, Thomas A; Huff, Stanley M
2016-01-01
In this study we developed a Fast Healthcare Interoperability Resources (FHIR) profile to support exchanging full pedigree-based family health history (FHH) information across the multiple systems and applications used by clinicians, patients, and researchers. We used previously developed clinical element models (CEMs) that are capable of representing FHH information, and derived the essential data elements, including attributes, constraints, and value sets. We analyzed gaps between the FHH CEM elements and existing FHIR resources. Based on the analysis, we developed a profile that consists of 1) FHIR resources for essential FHH data elements, 2) extensions for additional elements that were not covered by the resources, and 3) a structured definition to integrate patient and family member information in a FHIR message. We implemented the profile using an open-source FHIR framework and validated it using patient-entered FHH data that were captured through a locally developed FHH tool.
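A minimal sketch of what one instance conforming to such a profile could look like, shown here as a Python dict. The resource type and core field names come from the standard FHIR FamilyMemberHistory resource; the extension URL, codes, and reference values are hypothetical placeholders, not the authors' actual profile.

```python
# Sketch of a FHIR FamilyMemberHistory instance such as the profile above
# might constrain. Core fields follow FHIR R4; the extension URL and the
# reference values are invented placeholders.
import json

family_member_history = {
    "resourceType": "FamilyMemberHistory",
    "status": "completed",
    "patient": {"reference": "Patient/example-proband"},
    "relationship": {
        "coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/v3-RoleCode",
            "code": "MTH",
            "display": "mother",
        }]
    },
    "condition": [{
        "code": {
            "coding": [{
                "system": "http://snomed.info/sct",
                "code": "254837009",
                "display": "Breast cancer",
            }]
        },
        "onsetAge": {"value": 52, "unit": "a"},
    }],
    # Hypothetical extension for data the core resource does not cover,
    # mirroring the paper's strategy of extending FHIR where CEM elements
    # have no direct counterpart.
    "extension": [{
        "url": "http://example.org/fhir/StructureDefinition/pedigree-link",
        "valueReference": {"reference": "FamilyMemberHistory/maternal-grandmother"},
    }],
}

print(json.dumps(family_member_history, indent=2))
```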
Moral judgment as information processing: an integrative review.
Guglielmo, Steve
2015-01-01
How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022
Energy and technology review: Engineering modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabayan, H.S.; Goudreau, G.L.; Ziolkowski, R.W.
1986-10-01
This report presents information concerning: Modeling Canonical Problems in Electromagnetic Coupling Through Apertures; Finite-Element Codes for Computing Electrostatic Fields; Finite-Element Modeling of Electromagnetic Phenomena; Modeling Microwave-Pulse Compression in a Resonant Cavity; Lagrangian Finite-Element Analysis of Penetration Mechanics; Crashworthiness Engineering; Computer Modeling of Metal-Forming Processes; Thermal-Mechanical Modeling of Tungsten Arc Welding; Modeling Air Breakdown Induced by Electromagnetic Fields; Iterative Techniques for Solving Boltzmann's Equations for p-Type Semiconductors; Semiconductor Modeling; and Improved Numerical-Solution Techniques in Large-Scale Stress Analysis.
Data Requirements and the Basis for Designing Health Information Kiosks.
Afzali, Mina; Ahmadi, Maryam; Mahmoudvand, Zahra
2017-09-01
Health kiosks are an innovative and cost-effective solution that organizations can easily implement to help educate people. The aim was to determine the data requirements and the basis for designing health information kiosks as a new technology for maintaining the health of society. By reviewing the literature, a list of information requirements was compiled in 4 sections (demographic information, general information, diagnostic information and medical history), together with questions related to the objectives, data elements, stakeholders, requirements, infrastructures and applications of health information kiosks. To determine the content validity of the designed set, the opinions of 2 physicians and 2 specialists in medical informatics were obtained, and the test-retest method was used to measure its reliability. Data were analyzed using SPSS software. In the proposed model for Iran, 170 data elements in 6 sections were presented for expert opinion; collective agreement was ultimately reached on 106 elements. In providing a model of a health information kiosk, creating a standard data set is a critical point. According to the survey of the literature on health information kiosks, the most important components fall into six categories: information needs, data elements, applications, stakeholders, requirements and infrastructure, all of which need to be considered when designing a health information kiosk.
ERIC Educational Resources Information Center
Kastner, Theodore A.; Walsh, Kevin K.; Criscione, Teri
1997-01-01
Presents a general model of the structure and functioning of managed care and describes elements (provider networks, fiscal elements, risk estimation, case-mix, management information systems, practice parameters, and quality improvement) critical to people with developmental disabilities. Managed care demonstration projects and a hypothetical…
Elements of Information Inquiry, Evolution of Models & Measured Reflection
ERIC Educational Resources Information Center
Callison, Daniel; Baker, Katie
2014-01-01
In 2003 Paula Montgomery, founding editor of School Library Media Activities Monthly and former branch chief of school media services for the Maryland State Department of Education, published a guide to teaching information inquiry. Her staff also illustrated the elements of information inquiry as a recursive cycle with interaction among the…
NASA Astrophysics Data System (ADS)
Ribeiro, André S.; Almeida, Miguel
2003-11-01
We propose a model of structural organization and intercommunication between all elements of every team involved in the development of a space probe, with the aim of improving efficiency. The structure is built to minimize the path length between any two elements, allowing fast information flow through the structure. Structures are usually very clustered inside each task team, but only the heads of departments, or occasional meetings, usually provide the links between team elements, which is responsible for a lack of information exchange between staff members of different teams. We propose the establishment of permanent small working groups of staff members from different teams, on a random but permanent basis. The members forming such connections can be chosen on a temporary basis, but the connections themselves must exist permanently, because only permanent connections let information flow when needed. A few such random connections between staff members will diminish the average path length for information exchange between any two elements of any team. A small-world structure will emerge with low internal energy costs, which is the structure used by biological neuronal systems.
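The claimed effect is easy to reproduce in a toy simulation: dense team clusters joined only through department heads have a long average path, and a handful of random, permanent staff-to-staff links shortens it sharply. The sketch below, assuming arbitrary team sizes and link counts, illustrates this with networkx.

```python
# Teams are fully connected clusters; department heads form a ring; a few
# random cross-team links are then added. Sizes are illustration values.
import random
import networkx as nx

def team_graph(n_teams=6, team_size=10, extra_links=0, seed=1):
    rng = random.Random(seed)
    g = nx.Graph()
    for t in range(n_teams):
        members = [(t, i) for i in range(team_size)]
        g.add_edges_from((a, b) for a in members for b in members if a < b)
        g.add_edge((t, 0), ((t + 1) % n_teams, 0))  # heads link the teams
    for _ in range(extra_links):  # random, but permanent, staff-to-staff links
        t1, t2 = rng.sample(range(n_teams), 2)
        g.add_edge((t1, rng.randrange(team_size)), (t2, rng.randrange(team_size)))
    return g

for k in (0, 5, 15):
    g = team_graph(extra_links=k)
    print(k, "cross-team links ->", round(nx.average_shortest_path_length(g), 2))
```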
Parametric Modelling of As-Built Beam Framed Structure in BIM Environment
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.
2017-02-01
A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, attribute and dynamic information management, and results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that integrates traditional geometry modelling, parametric element management and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, as typical BIM software, provides the platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin, introduced in this paper, derives the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data. The results show the potential of automating parametric modelling through interactive API development in a BIM environment, integrating the separate data processing steps and different platforms into the uniform Revit software.
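The paper's plugin drives the Autodesk Revit API directly, which is not reproduced here; the following hedged numpy sketch only illustrates one implied preprocessing step, fitting a straight beam axis to surveyed points by least squares so that a parametric beam element can be created between the two recovered endpoints.

```python
# Fit a beam axis to scanned points: principal direction via SVD, then
# project points onto the axis to recover the two endpoints. Illustrative
# data, not the paper's survey.
import numpy as np

def fit_beam_axis(points):
    """Return the two end points of the best-fit axis through an Nx3 array."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)  # vt[0] = principal direction
    direction = vt[0]
    t = (points - centroid) @ direction          # coordinates along the axis
    return centroid + t.min() * direction, centroid + t.max() * direction

# Noisy points along a leaning beam (invented data).
rng = np.random.default_rng(0)
t = rng.uniform(0, 4, size=200)
pts = np.outer(t, [1.0, 0.2, 0.9]) + rng.normal(scale=0.01, size=(200, 3))
start, end = fit_beam_axis(pts)
print(np.round(start, 2), np.round(end, 2))
```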
An object-oriented, technology-adaptive information model
NASA Technical Reports Server (NTRS)
Anyiwo, Joshua C.
1995-01-01
The primary objective was to develop a computer information system for effectively presenting NASA's technologies to American industries, for appropriate commercialization. To this end a comprehensive information management model, applicable to a wide variety of situations, and immune to computer software/hardware technological gyrations, was developed. The model consists of four main elements: a DATA_STORE, a data PRODUCER/UPDATER_CLIENT and a data PRESENTATION_CLIENT, anchored to a central object-oriented SERVER engine. This server engine facilitates exchanges among the other model elements and safeguards the integrity of the DATA_STORE element. It is designed to support new technologies, as they become available, such as Object Linking and Embedding (OLE), on-demand audio-video data streaming with compression (such as is required for video conferencing), World Wide Web (WWW) and other information services and browsing, fax-back data requests, presentation of information on CD-ROM, and regular in-house database management, regardless of the data model in place. The four components of this information model interact through a system of intelligent message agents which are customized to specific information exchange needs. This model is at the leading edge of modern information management models. It is independent of technological changes and can be implemented in a variety of ways to meet the specific needs of any communications situation. This summer a partial implementation of the model was achieved. The structure of the DATA_STORE has been fully specified and successfully tested using Microsoft's FoxPro 2.6 database management system. Data PRODUCER/UPDATER and PRESENTATION architectures have been developed and also successfully implemented in FoxPro; and work has started on a full implementation of the SERVER engine. The model has also been successfully applied to a CD-ROM presentation of NASA's technologies in support of Langley Research Center's TAG efforts.
Determining relative error bounds for the CVBEM
Hromadka, T.V.
1985-01-01
The Complex Variable Boundary Element Method (CVBEM) provides a measure of relative error which can be utilized to reduce the error or provide information for further modeling analysis. By maximizing the relative error norm on each boundary element, a bound on the total relative error can be evaluated for each boundary element. This bound can be utilized to test CVBEM convergence, to analyze the effect of additional boundary nodal points in reducing the modeling error, and to evaluate the sensitivity of the modeling error within one boundary element to the error produced in another boundary element as a function of geometric distance.
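In notation (the symbols are assumptions for illustration, not the paper's), the bounding idea can be restated roughly as follows: with exact solution \omega, CVBEM approximation \hat{\omega}, and boundary elements \Gamma_j,

```latex
e_j = \max_{z \in \Gamma_j}
      \frac{\lvert \omega(z) - \hat{\omega}(z) \rvert}{\lvert \omega(z) \rvert},
\qquad
E \le \sum_{j=1}^{m} e_j ,
```

so maximizing the relative error norm on each element gives a per-element bound e_j, and refining nodal points on the worst elements drives the total bound E down.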
An Ontology-Based Archive Information Model for the Planetary Science Community
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris
2008-01-01
The Planetary Data System (PDS) information model is a mature but complex model that has been used to capture over 30 years of planetary science data for the PDS archive. As the de facto information model for the planetary science data archive, it is being adopted by the International Planetary Data Alliance (IPDA) as their archive data standard. However, after seventeen years of evolutionary change the model needs refinement. First, a formal specification is needed to explicitly capture the model in a commonly accepted data engineering notation. Second, the core and essential elements of the model need to be identified to help simplify the overall archive process. A team of PDS technical staff members has captured the PDS information model in an ontology modeling tool. Using the resulting knowledge base, work continues to identify the core elements, identify problems and issues, and then test proposed modifications to the model. The final deliverables of this work will include specifications for the next-generation PDS information model and the initial set of IPDA archive data standards. Having the information model captured in an ontology modeling tool also makes the model suitable for use by Semantic Web applications.
BIM Automation: Advanced Modeling Generative Process for Complex Structures
NASA Astrophysics Data System (ADS)
Banfi, F.; Fai, S.; Brumana, R.
2017-08-01
The new paradigm of complexity in modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). The generation of complex parametric models requires new scientific knowledge of new digital technologies. These elements are helpful for storing a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational Basis Splines (NURBS) at multiple levels of detail (Mixed and ReverseLoD), based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted for parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of the modern structure of the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Castel Masegra in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.
Complementarity of Historic Building Information Modelling and Geographic Information Systems
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.; Macher, H.
2016-06-01
In this paper, we discuss the potential of integrating semantically rich models from both Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build detailed 3D historic models. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time and the non-architectural information necessary for the construction and management of buildings. GIS has potential in handling and managing spatial data, especially exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create the enriched model according to its complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling will be discussed in this paper: how to deal with the complex elements composing historic buildings in the BIM and GIS environments, how to build the enriched historic model, and why construct different levels of detail? By solving these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.
Clement, R; Schneider, J; Brambs, H-J; Wunderlich, A; Geiger, M; Sander, F G
2004-02-01
The paper demonstrates how to generate an individual 3D volume model of a human single-rooted tooth using an automatic workflow. It can be implemented in finite element simulations. In several computational steps, computed tomography data of patients are used to obtain the global coordinates of the tooth's surface. First, the large volume of geometric data is processed with several self-developed algorithms to achieve a significant reduction; the most important task is to keep the geometrical information of the real tooth. The second main part includes the creation of the volume model for the tooth and the periodontal ligament (PDL). This is realized with a continuous free-form surface of the tooth based on the remaining points. Generating such irregular objects for numerical use in biomechanical research normally requires enormous manual effort and time. The finite element mesh of the tooth, consisting of hexahedral elements, is composed of different materials: dentin, PDL and surrounding alveolar bone. It is capable of simulating tooth movement in a finite element analysis and may give valuable information for a clinical approach without the restrictions of tetrahedral elements. The mesh generator of the FE software ANSYS executed the meshing process for hexahedral elements successfully.
The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering
NASA Technical Reports Server (NTRS)
Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen
2006-01-01
This paper summarizes a subset of the Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered to "identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains five sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-Based Robotics Manufacture and Servicing Models.
NASA Technical Reports Server (NTRS)
Southall, J. W.
1979-01-01
The engineering-specified requirements for integrated information processing by means of the Integrated Programs for Aerospace-Vehicle Design (IPAD) system are presented. A data model is described and is based on the design process of a typical aerospace vehicle. General data management requirements are specified for data storage, retrieval, generation, communication, and maintenance. Information management requirements are specified for a two-component data model. In the general portion, data sets are managed as entities, and in the specific portion, data elements and the relationships between elements are managed by the system, allowing user access to individual elements for the purpose of query. Computer program management requirements are specified for support of a computer program library, control of computer programs, and installation of computer programs into IPAD.
Evaluating Data Clustering Approach for Life-Cycle Facility Control
2013-04-01
produce 90% matching accuracy with noise/variations up to 55%. KEYWORDS: Building Information Modelling (BIM), machine learning, pattern detection … reconciled to building information model elements and ultimately to an expected resource utilization schedule. The motivation for this integration is to … by interoperable data sources and building information models. Building performance modelling and simulation efforts such as those by Maile et al …
Detailed Primitive-Based 3D Modeling of Architectural Elements
NASA Astrophysics Data System (ADS)
Remondino, F.; Lo Buglio, D.; Nony, N.; De Luca, L.
2012-07-01
The article describes an image-based pipeline for the 3D reconstruction of building façades or architectural elements and the successive modeling using geometric primitives. The approach overcomes some existing problems in modeling architectural elements and delivers efficient-in-size, reality-based, textured 3D models useful for metric applications. For the 3D reconstruction, an open-source pipeline developed within the TAPENADE project is employed. In the successive modeling steps, the user manually selects an area containing an architectural element (capital, column, bas-relief, window tympanum, etc.) and the procedure then fits geometric primitives and computes disparity and displacement maps in order to tie visual and geometric information together in a light but detailed 3D model. Examples are reported and commented upon.
BIPAD: A web server for modeling bipartite sequence elements
Bi, Chengpeng; Rogan, Peter K
2006-01-01
Background: Many dimeric protein complexes bind cooperatively to families of bipartite nucleic acid sequence elements, which consist of pairs of conserved half-site sequences separated by intervening distances that vary among individual sites. Results: We introduce the Bipad Server [1], a web interface to predict sequence elements embedded within unaligned sequences. Either a bipartite model, consisting of a pair of one-block position weight matrices (PWMs) with a gap distribution, or a single PWM for contiguous single-block motifs may be produced. The Bipad program performs multiple local alignment by entropy minimization and cyclic refinement using a stochastic greedy search strategy. The best models are refined by maximizing incremental information contents among a set of potential models with varying half-site and gap lengths. Conclusion: The web service generates information positional weight matrices, identifies binding site motifs, graphically represents the set of discovered elements as a sequence logo, and depicts the gap distribution as a histogram. Server performance was evaluated by generating a collection of bipartite models for distinct DNA binding proteins. PMID:16503993
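A hedged sketch of how a bipartite model of this kind scores sequences: two half-site PWMs plus a log-probability for each allowed gap length. BIPAD's actual entropy-minimization search is more involved; the matrices, gap range, and sequence below are toy values.

```python
# Score every window of a sequence against a bipartite model: left PWM,
# variable gap with a log-probability, right PWM. Toy values throughout.
import math

BASES = "ACGT"

def pwm_score(pwm, site):
    """Sum of log-odds scores, one PWM column per base of the half-site."""
    return sum(col[BASES.index(b)] for col, b in zip(pwm, site))

def best_bipartite_hit(seq, left_pwm, right_pwm, gap_logp):
    best = (-math.inf, None)
    lw, rw = len(left_pwm), len(right_pwm)
    for i in range(len(seq) - lw + 1):
        for gap, glp in gap_logp.items():      # allowed spacer lengths
            j = i + lw + gap
            if j + rw > len(seq):
                continue
            s = (pwm_score(left_pwm, seq[i:i + lw]) + glp
                 + pwm_score(right_pwm, seq[j:j + rw]))
            if s > best[0]:
                best = (s, (i, gap))
    return best

# Toy 2-column half-sites favouring "AC" ... gap of 2-4 ... "GT".
L = [[1.0, -1, -1, -1], [-1, 1.0, -1, -1]]
R = [[-1, -1, 1.0, -1], [-1, -1, -1, 1.0]]
gaps = {2: math.log(0.5), 3: math.log(0.3), 4: math.log(0.2)}
print(best_bipartite_hit("TTACTTGTAA", L, R, gaps))
```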
Students' Relational Thinking with Impulsive and Reflective Cognitive Styles in Solving Mathematical Problems
NASA Astrophysics Data System (ADS)
Satriawan, M. A.; Budiarto, M. T.; Siswono, T. Y. E.
2018-01-01
This is a descriptive study that qualitatively investigates the relational thinking of students with impulsive and reflective cognitive styles in solving mathematical problems. The methods used in this research are tests and interviews. The data were analyzed by reducing, presenting and drawing conclusions from the data. The results show that the reflective cognitive style can help students find the important elements in understanding a problem. Reading more than once is useful for identifying what is being asked and writing down the known information, building relations between elements and connecting information with arithmetic operations, connecting what is being asked with known information, making an equation model to find the unknown value by substitution, and building connections through re-checking, re-reading and re-counting. The impulsive cognitive style supports finding important elements in understanding problems, building a connection between elements, connecting information with arithmetic operations, building a comprehensive relation about a problem by connecting what is being asked with known information, and finding the unknown value by arithmetic operations without making an equation model. In re-checking the problem solving, the impulsive student only read at a glance without re-counting the result.
Li, Yi; Chen, Yuren
2016-12-30
To make driving assistance systems more humanized, this study focused on the prediction and assistance of drivers' perception-response time on mountain highway curves. Field tests were conducted to collect real-time driving data and driver vision information. A driver-vision lane model quantified curve elements in drivers' vision. A multinomial log-linear model was established to predict perception-response time from traffic/road environment information, the driver-vision lane model, and mechanical status (last second). A corresponding assistance model showed a positive impact on drivers' perception-response times on mountain highway curves. Model results revealed that the driver-vision lane model and visual elements do have an important influence on drivers' perception-response time. Compared with roadside passive road safety infrastructure, proper visual geometry design, timely visual guidance, and the visual information integrality of a curve are significant factors for drivers' perception-response time.
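A minimal sketch of the modelling step, assuming invented stand-in features (curve radius in the driver's vision, approach speed, a visual-guidance flag) rather than the paper's field measurements: a multinomial logistic (log-linear) classifier predicting a binned perception-response time.

```python
# Multinomial classifier over synthetic features; the generating rule and
# bins are invented so the example runs end to end.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 300
X = np.column_stack([
    rng.uniform(30, 500, n),    # curve radius in the driver's vision (m)
    rng.uniform(40, 90, n),     # approach speed (km/h)
    rng.integers(0, 2, n),      # visual guidance present (0/1)
])
# Invented rule: tight curve + high speed + no guidance -> slower response.
risk = (500 - X[:, 0]) / 470 + (X[:, 1] - 40) / 50 - 0.5 * X[:, 2]
y = np.digitize(risk + rng.normal(0, 0.2, n), [0.6, 1.2])  # 0=fast,1=medium,2=slow

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict([[80, 75, 0]]), model.predict_proba([[80, 75, 0]]).round(2))
```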
Building clinical data groups for electronic medical record in China.
Tu, Haibo; Yu, Yingtao; Yang, Peng; Tang, Xuejun; Hu, Jianping; Rao, Keqin; Pan, Feng; Xu, Yongyong; Liu, Danhong
2012-04-01
This article aims at building clinical data groups for Electronic Medical Records (EMR) in China. These data groups can be reused as basic information units in building the medical sheets of Electronic Medical Record Systems (EMRS) and serve as part of its implementation guideline. The results were based on medical sheets, the forms used in hospitals, which were collected from hospitals. To categorize the information in these sheets into data groups, we adopted the Health Level 7 Clinical Document Architecture Release 2 Model (HL7 CDA R2 Model). The regulations and legal documents concerning health informatics and related standards in China were implemented. A set of 75 data groups with 452 data elements was created. These data elements were the atomic items that comprised the data groups. Medical sheet items contained clinical record information and could be described by standard data elements that exist in current health document protocols. These data groups match different units of the CDA model. Twelve data groups with 87 standardized data elements described EMR headers, and 63 data groups with 405 standardized data elements constituted the body; the latter 63 data groups in fact formed the sections of the model. The data groups had two levels: those at the first level contained both second-level data groups and standardized data elements. The data groups were basically reusable information units that served as guidelines for building EMRS and were used to rebuild a medical sheet and serve as templates for clinical records. As a pilot study of health information standards in China, the development of EMR data groups combined international standards with Chinese national regulations and standards, and this was the most critical part of the research. The original medical sheets from hospitals contain first-hand medical information, and some of their items reveal data types characteristic of the Chinese socialist national health system. It is both possible and critical to localize and stabilize the adopted international health standards by abstracting and categorizing those items for future sharing and for the implementation of EMRS in China.
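The two-level structure described above can be sketched directly; group and element names here are illustrative stand-ins, not the study's actual 75-group set.

```python
# A data group holds standardized data elements and, at the first level,
# second-level data groups; counting elements recurses through subgroups.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataElement:
    name: str
    data_type: str          # e.g. coded text, quantity, date
    value_set: str = ""     # bound terminology, if any

@dataclass
class DataGroup:
    name: str
    elements: List[DataElement] = field(default_factory=list)
    subgroups: List["DataGroup"] = field(default_factory=list)

    def element_count(self) -> int:
        return len(self.elements) + sum(g.element_count() for g in self.subgroups)

header = DataGroup("EMR header", [
    DataElement("patient name", "text"),
    DataElement("gender", "coded text", "administrative gender"),
])
vitals = DataGroup("vital signs", [DataElement("body temperature", "quantity")])
body = DataGroup("EMR body", subgroups=[vitals])
print(header.element_count(), body.element_count())
```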
Design and Establishment of Quality Model of Fundamental Geographic Information Database
NASA Astrophysics Data System (ADS)
Ma, W.; Zhang, J.; Zhao, Y.; Zhang, P.; Dang, Y.; Zhao, T.
2018-04-01
In order to make the quality evaluation of Fundamental Geographic Information Databases (FGIDB) more comprehensive, objective and accurate, this paper studies and establishes a quality model of FGIDB, formed by the standardization of database construction and quality control, the conformity of data set quality, and the functionality of the database management system. It also designs the overall principles, contents and methods of quality evaluation for FGIDB, providing a basis and reference for carrying out quality control and quality evaluation. Based on the quality model framework, this paper designs the quality elements, evaluation items and properties of the Fundamental Geographic Information Database step by step. Organically connected, these quality elements and evaluation items constitute the quality model of the Fundamental Geographic Information Database. This model is the foundation for stipulating quality requirements and for quality evaluation of the Fundamental Geographic Information Database, and is of great significance for quality assurance in the design and development stage, for formulating requirements in the testing and evaluation stage, and for constructing a standard system for the quality evaluation technology of the Fundamental Geographic Information Database.
Ebrahiminia, Vahid; Yasini, Mobin; Lamy, Jean Baptiste
2013-01-01
Lack of interoperability between health information systems is a major obstacle to implementing clinical decision support systems (CDSS) and to their widespread dissemination. The Virtual Medical Record (vMR) proposed by HL7 is a common data model for representing the clinical information inputs and outputs that can be used by CDSS and local clinical systems. A CDSS called ASTI uses a similar model to represent clinical data and the therapeutic history of the patient. In order to evaluate the compatibility of ASTI with the vMR, we began mapping the ASTI model for representing a patient's therapeutic data to the vMR. We compared the data elements and associated terminologies used in ASTI and the vMR, and we evaluated the semantic fidelity between the models. Only one data element, the qualitative description of drug dosage, did not match the vMR model; however, it can be calculated in the execution engine. The semantic fidelity was satisfactorily preserved in 12 of the 17 elements mapped between the models. This ASTI model seems compatible with the vMR. Further work is necessary to evaluate the compatibility of the clinical data model of ASTI with the vMR and the use of the vMR in implementing practice guidelines. PMID:24551344
Tools for Understanding Identity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Creese, Sadie; Gibson-Robinson, Thomas; Goldsmith, Michael
Identity attribution and enrichment is critical to many aspects of law enforcement and intelligence gathering; this identity typically spans a number of domains in the natural world, such as biographic information (factual information, e.g. names and addresses), biometric information (e.g. fingerprints) and psychological information. In addition to these natural-world projections of identity, identity elements are projected in the cyber world. Conversely, undesirable elements may use similar techniques to target individuals for spear-phishing attacks (or worse), and potential targets or their organizations may want to determine how to minimize the attack surface exposed. Our research has been exploring the construction of a mathematical model for identity that supports such holistic identities. The model captures the ways in which an identity is constructed through a combination of data elements (e.g. a username on a forum, an address, a telephone number). Some of these elements may allow new characteristics to be inferred, hence enriching the holistic view of the identity. An example use-case would be the inference of real names from usernames; the 'path' created by inferring new elements of identity is highlighted in the 'critical information' panel. Individual attribution exercises can be understood as paths through a number of elements. Intuitively, the entire realizable 'capability' can be modeled as a directed graph, where the elements are nodes and the inferences are represented by links connecting one or more antecedents with a conclusion. The model can be operationalized with two levels of tool support described in this paper; the first is a working prototype, and the second is expected to reach prototype by July 2013. (1) Understanding the model: the tool allows a user to easily determine, given a particular set of inferences and attributes, which elements or inferences are of most value to an investigator (or an attacker). The tool is also able to take into account the difficulty of the inferences, allowing the user to consider different scenarios depending on the perceived resources of the attacker, or to prioritize lines of investigation. It also has a number of interesting visualizations designed to aid the user in understanding the model. The tool works by treating the inferences as a graph and runs various graph-theoretic algorithms, with some novel adaptations, in order to deduce various properties. (2) Using the model: to help investigators exploit the model to perform identity attribution, we have developed the Identity Map visualization. For a user-provided set of known starting elements and a set of desired target elements for a given identity, the Identity Map generates investigative workflows as paths through the model. Each path consists of a series of elements and the inferences between them that connect the input and output elements; each path also has an associated confidence level that estimates the reliability of the resulting attribution. The Identity Map can help investigators understand the possible ways to make an identification decision and guide them toward the data-collection or analysis steps required to reach that decision.
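The directed-graph view can be sketched in a few lines: identity elements as nodes, inferences as edges weighted by difficulty, and an attribution exercise as a cheapest path from known to target elements. The elements and weights below are invented illustration values, not the tool's data.

```python
# Identity elements as nodes, inferences as weighted directed edges; the
# easiest attribution path is the minimum-weight path between elements.
import networkx as nx

g = nx.DiGraph()
g.add_weighted_edges_from([
    ("forum username", "email address", 2.0),
    ("email address", "real name", 3.0),
    ("forum username", "real name", 6.0),   # harder direct inference
    ("real name", "home address", 4.0),
    ("email address", "phone number", 5.0),
])

path = nx.shortest_path(g, "forum username", "home address", weight="weight")
cost = nx.path_weight(g, path, weight="weight")
print(path, cost)  # the 'critical information' path and its total difficulty
```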
Assessment of the quality of reporting observational studies in the pediatric dental literature.
Butani, Yogita; Hartz, Arthur; Levy, Steven; Watkins, Catherine; Kanellis, Michael; Nowak, Arthur
2006-01-01
The purpose of this assessment was to evaluate reporting of observational studies in the pediatric dental literature. This assessment included the following steps: (1) developing a model for reporting information in clinical dentistry studies; (2) identifying treatment comparisons in pediatric dentistry that were evaluated by at least 5 observational studies; (3) abstracting from these studies any data indicated by applying the reporting model; and (4) comparing available data elements to the desired data elements in the reporting model. The reporting model included data elements related to: (1) patients; (2) providers; (3) treatment details; and (4) study design. Two treatment comparisons in pediatric dentistry were identified with 5 or more observational studies: (1) stainless steel crowns vs amalgams (10 studies); and (2) composite restorations vs amalgam (5 studies). Results from studies comparing the same treatments varied substantially. Data elements from the reporting model that could have explained some of the variation were often reported inadequately or not at all. Reporting of observational studies in the pediatric dental literature may be inadequate for an informed interpretation of the results. Models similar to that used in this study could be used for developing standards for the conduct and reporting of observational studies in pediatric dentistry.
Peridynamic Multiscale Finite Element Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costa, Timothy; Bond, Stephen D.; Littlewood, David John
The problem of computing quantum-accurate design-scale solutions to mechanics problems is rich with applications and serves as the background to modern multiscale science research. The problem can be broken into component problems comprised of communicating across adjacent scales, which when strung together create a pipeline for information to travel from quantum scales to design scales. Traditionally, this involves connections between a) quantum electronic structure calculations and molecular dynamics and between b) molecular dynamics and local partial differential equation models at the design scale. The second step, b), is particularly challenging since the appropriate scales of molecular dynamics and local partial differential equation models do not overlap. The peridynamic model for continuum mechanics provides an advantage in this endeavor, as the basic equations of peridynamics are valid at a wide range of scales limiting from the classical partial differential equation models valid at the design scale to the scale of molecular dynamics. In this work we focus on the development of multiscale finite element methods for the peridynamic model, in an effort to create a mathematically consistent channel for microscale information to travel from the upper limits of the molecular dynamics scale to the design scale. In particular, we first develop a Nonlocal Multiscale Finite Element Method which solves the peridynamic model at multiple scales to include microscale information at the coarse scale. We then consider a method that solves a fine-scale peridynamic model to build element-support basis functions for a coarse-scale local partial differential equation model, called the Mixed Locality Multiscale Finite Element Method. Given decades of research and development into finite element codes for the local partial differential equation models of continuum mechanics, there is a strong desire to couple local and nonlocal models to leverage the speed and state of the art of local models with the flexibility and accuracy of the nonlocal peridynamic model. In the mixed locality method this coupling occurs across scales, so that the nonlocal model can be used to communicate material heterogeneity at scales inappropriate to local partial differential equation models. Additionally, the computational burden of the weak form of the peridynamic model is reduced dramatically by only requiring that the model be solved on local patches of the simulation domain which may be computed in parallel, taking advantage of the heterogeneous nature of next-generation computing platforms. Additionally, we present a novel Galerkin framework, the 'Ambulant Galerkin Method', which represents a first step towards a unified mathematical analysis of local and nonlocal multiscale finite element methods, and whose future extension will allow the analysis of multiscale finite element methods that mix models across scales under certain assumptions of the consistency of those models.
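A minimal 1D sketch of the nonlocal model being coarsened, under arbitrary illustration parameters and a simple bond-stiffness scaling (not the report's formulation): every node interacts with all neighbours inside a horizon, and the resulting nonlocal stiffness is solved for a stretched bar.

```python
# 1D bond-based peridynamic statics: assemble the nonlocal stiffness over
# a 3-node horizon, apply end displacements, solve for the interior.
import numpy as np

n, dx = 101, 0.01                 # nodes and spacing (1 m bar)
m = 3                             # horizon = m * dx
c = 1.0                           # bond micro-modulus (illustrative units)

K = np.zeros((n, n))
for i in range(n):
    for j in range(max(0, i - m), min(n, i + m + 1)):
        if i != j:
            k = c / abs(i - j) / dx   # stiffer bonds between closer nodes
            K[i, i] += k
            K[i, j] -= k

# Boundary conditions: u = 0 at the left end, u = 1e-3 at the right end.
u = np.zeros(n)
u[-1] = 1e-3
free = np.arange(1, n - 1)
rhs = -K[np.ix_(free, [0, n - 1])] @ u[[0, n - 1]]
u[free] = np.linalg.solve(K[np.ix_(free, free)], rhs)

print(np.round(u[::20], 5))       # near-linear field, as in the local limit
```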
Consumption of Mass Communication--Construction of a Model on Information Consumption Behaviour.
ERIC Educational Resources Information Center
Sepstrup, Preben
A general conceptual model on the consumption of information is introduced. Information as the output of the mass media is treated as a product, and a model on the consumption of this product is developed by merging elements from consumer behavior theory and mass communication theory. Chapter I gives basic assumptions about the individual and the…
NASA Technical Reports Server (NTRS)
Lee, H. P.
1977-01-01
The NASTRAN Thermal Analyzer Manual describes the fundamental and theoretical treatment of the finite element method, with emphasis on the derivations of the constituent matrices of different elements and solution algorithms. Necessary information and data relating to the practical applications of engineering modeling are included.
NASA Technical Reports Server (NTRS)
Chavez, Patrick F.
1987-01-01
The effort at Sandia National Labs. on the methodologies and techniques being used to generate strictly hexahedral finite element meshes from a solid model is described. The functionality of the modeler is used to decompose the solid into a set of non-intersecting meshable finite element primitives. The description of the decomposition is exported, via a Boundary Representation format, to the meshing program, which uses the information for complete finite element model specification. Particular features of the program are discussed in some detail, along with future plans for development, which include automation of the decomposition using artificial intelligence techniques.
Superelement Analysis of Tile-Reinforced Composite Armor
NASA Technical Reports Server (NTRS)
Davila, Carlos G.
1998-01-01
Superelements can greatly improve the computational efficiency of analyses of tile-reinforced structures such as the hull of the Composite Armored Vehicle. By taking advantage of the periodicity of this type of construction, superelements can be used to simplify the task of modeling, to virtually eliminate the time required to assemble the stiffness matrices, and to reduce significantly the analysis solution time. Furthermore, superelements are fully transferable between analyses and analysts, so they provide a consistent method to share information and reduce duplication. This paper describes a methodology that was developed to model and analyze large upper-hull components of the Composite Armored Vehicle. The analyses are based on two types of superelement models. The first type is based on element layering, which consists of modeling a laminate by using several layers of shell elements constrained together with compatibility equations. Element layering is used to ensure the proper transverse shear deformation in the laminate rubber layer. The second type of model uses three-dimensional elements. Since no graphical pre-processor currently supports superelements, a special technique based on master elements was developed. Master elements are representations of superelements that are used in conjunction with a custom translator to write the superelement connectivities as input decks for ABAQUS.
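The computational saving comes from standard static (Guyan) condensation: interior degrees of freedom are eliminated once, leaving a small boundary stiffness matrix that is reused wherever the repeated tile appears. A minimal numpy sketch, with an invented 4-DOF stand-in stiffness matrix:

```python
# Guyan reduction: K_red = K_bb - K_bi @ inv(K_ii) @ K_ib, keeping only
# boundary DOFs. The 4-DOF matrix is illustrative, not a real tile model.
import numpy as np

def condense(K, boundary):
    all_dofs = np.arange(K.shape[0])
    interior = np.setdiff1d(all_dofs, boundary)
    Kbb = K[np.ix_(boundary, boundary)]
    Kbi = K[np.ix_(boundary, interior)]
    Kii = K[np.ix_(interior, interior)]
    return Kbb - Kbi @ np.linalg.solve(Kii, Kbi.T)  # symmetric K assumed

K = np.array([[ 4., -1., -1.,  0.],
              [-1.,  4.,  0., -1.],
              [-1.,  0.,  4., -1.],
              [ 0., -1., -1.,  4.]])
K_super = condense(K, boundary=np.array([0, 1]))
print(K_super)   # 2x2 superelement stiffness, assembled like any element
```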
Eguzkiza, Aitor; Trigo, Jesús Daniel; Martínez-Espronceda, Miguel; Serrano, Luis; Andonegui, José
2015-08-01
Most healthcare services use information and communication technologies to reduce and redistribute the workload associated with follow-up of chronic conditions. However, the lack of normalization of the information handled in and exchanged between such services hinders their scalability and extendibility. The use of medical standards for modelling and exchanging information, especially dual-model based approaches, can enhance the features of screening services. Hence, the approach of this paper is twofold. First, this article presents a generic methodology to model patient-centered clinical processes. Second, a proof of concept of the proposed methodology was conducted within the diabetic retinopathy (DR) screening service of the Health Service of Navarre (Spain) in compliance with a specific dual-model norm (openEHR). As a result, a set of elements required for deploying a model-driven DR screening service has been established, namely: clinical concepts, archetypes, termsets, templates, guideline definition rules, and user interface definitions. This model fosters reusability, because those elements are available to be downloaded and integrated into any healthcare service, and interoperability, since from then on such services can share information seamlessly.
Mesh-To-BIM: From Segmented Mesh Elements to BIM Model with Limited Parameters
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.
2018-05-01
The Building Information Modelling (BIM) technique has been widely utilized in heritage documentation, under the general term Historic/Heritage BIM (HBIM). Current HBIM projects mostly employ the scan-to-BIM process to create the geometric model manually from the point cloud. This paper explains how the shape can be obtained from the mesh geometry with reduced human involvement during the modelling process. Aiming at unbuilt heritage, two case studies are handled in this study: a ruined Roman stone structure and a severely damaged abbey. The pipeline consists of solid element modelling based on documentation data using Autodesk Revit, a common BIM platform, and successive modelling from these geometric primitives using Autodesk Dynamo, a visual programming plugin built into Revit. The BIM-based reconstruction enriches the classic visual model from computer graphics approaches with measurement, semantic and additional information. Dynamo is used to develop a semi-automated function that reduces the manual process by building the final BIM model directly from segmented parametric elements. The level of detail (LoD) of the final models depends strongly on the manual involvement in element creation. The proposed outline also presents two potential issues for the ongoing work: combining ontology semantics with the parametric BIM model, and introducing the proposed pipeline into the as-built HBIM process.
Informational model verification of ZVS Buck quasi-resonant DC-DC converter
NASA Astrophysics Data System (ADS)
Vakovsky, Dimiter; Hinov, Nikolay
2016-12-01
The aim of the paper is to create a polymorphic informational model of a ZVS Buck quasi-resonant DC-DC converter for the purposes of modeling the object. Flexible open standards for setting, storing, publishing and exchanging data in a distributed information environment are applied in creating the model. The created model is useful for producing many variants of different types, with different configurations of the composing elements and different inner models of the examined object.
Nuclear microscopy in trace-element biology — from cellular studies to the clinic
NASA Astrophysics Data System (ADS)
Lindh, Ulf
1993-05-01
The concentration and distribution of trace and major elements in cells are of great interest in cell biology. PIXE can provide elemental concentrations in the bulk of cells or organelles as other bulk techniques such as atomic absorption spectrophotometry and nuclear activation analysis. Supplementary information, perhaps more exciting, on the intracellular distributions of trace elements can be provided using nuclear microscopy. Intracellular distributions of trace elements in normal and malignant cells are presented. The toxicity of mercury and cadmium can be prevented by supplementation of the essential trace element selenium. Some results from an experimental animal model are discussed. The intercellular distribution of major and trace elements in isolated blood cells, as revealed by nuclear microscopy, provides useful clinical information. Examples are given concerning inflammatory connective-tissue diseases and the chronic fatigue syndrome.
A Conceptual Model of the Role of Communication in Surrogate Decision Making for Hospitalized Adults
Torke, Alexia M.; Petronio, Sandra; Sachs, Greg A.; Helft, Paul R.; Purnell, Christianna
2011-01-01
Objective To build a conceptual model of the role of communication in decision making, based on literature from medicine, communication studies and medical ethics. Methods We propose a model and describe each construct in detail. We review what is known about interpersonal and patient-physician communication, describe literature about surrogate-clinician communication, and discuss implications for our developing model. Results The communication literature proposes two major elements of interpersonal communication: information processing and relationship building. These elements are composed of constructs such as information disclosure and emotional support that are likely to be relevant to decision making. We propose these elements of communication impact decision making, which in turn affects outcomes for both patients and surrogates. Decision making quality may also mediate the relationship between communication and outcomes. Conclusion Although many elements of the model have been studied in relation to patient-clinician communication, there is limited data about surrogate decision making. There is evidence of high surrogate distress associated with decision making that may be alleviated by communication–focused interventions. More research is needed to test the relationships proposed in the model. Practice Implications Good communication with surrogates may improve both the quality of medical decisions and outcomes for the patient and surrogate. PMID:21889865
NASA Technical Reports Server (NTRS)
Nguyen, Nhan; Ting, Eric; Nguyen, Daniel; Dao, Tung; Trinh, Khanh
2013-01-01
This paper presents a coupled vortex-lattice flight dynamic model with an aeroelastic finite-element model to predict dynamic characteristics of a flexible wing transport aircraft. The aircraft model is based on NASA Generic Transport Model (GTM) with representative mass and stiffness properties to achieve a wing tip deflection about twice that of a conventional transport aircraft (10% versus 5%). This flexible wing transport aircraft is referred to as an Elastically Shaped Aircraft Concept (ESAC) which is equipped with a Variable Camber Continuous Trailing Edge Flap (VCCTEF) system for active wing shaping control for drag reduction. A vortex-lattice aerodynamic model of the ESAC is developed and is coupled with an aeroelastic finite-element model via an automated geometry modeler. This coupled model is used to compute static and dynamic aeroelastic solutions. The deflection information from the finite-element model and the vortex-lattice model is used to compute unsteady contributions to the aerodynamic force and moment coefficients. A coupled aeroelastic-longitudinal flight dynamic model is developed by coupling the finite-element model with the rigid-body flight dynamic model of the GTM.
Jesus, Tiago Silva; Silva, Isabel Lopes
2016-04-01
There is a growing interest in linking aspects of patient-provider communication to rehabilitation outcomes. However, the field lacks a conceptual understanding of (a) 'how' rehabilitation outcomes can be improved by communication, and (b) through 'which' elements in particular. This article elaborates on the conceptual developments toward informing further practice and research. Existing models of communication in healthcare were adapted to rehabilitation and its outcomes through a comprehensive literature review. After depicting mediating mechanisms and variables (e.g. therapeutic engagement, adjustment toward disability), this article presents the '4 Rehab Communication Elements' deemed likely to underpin rehabilitation outcomes. The four elements are: (a) knowing the person and building a supportive relationship; (b) effective information exchange and education; (c) shared goal-setting and action planning; and (d) fostering a more positive, yet realistic, cognitive and self-reframing. This article describes an unprecedented, outcomes-oriented approach toward the design of rehabilitation communication, which has resulted in the development of a new intervention model: the '4 Rehab Communication Elements'. Further trials are needed to evaluate the impact of this whole intervention model on rehabilitation outcomes.
Biomechanical investigation of naso-orbitoethmoid trauma by finite element analysis.
Huempfner-Hierl, Heike; Schaller, Andreas; Hemprich, Alexander; Hierl, Thomas
2014-11-01
Naso-orbitoethmoid fractures account for 5% of all facial fractures. We used data derived from a white 34-year-old man to make a transient dynamic finite element model, which consisted of about 740 000 elements, to simulate fist-like impacts to this anatomically complex area. Finite element analysis showed a pattern of von Mises stresses beyond the yield criterion of bone that corresponded with fractures commonly seen clinically. Finite element models can be used to simulate injuries to the human skull, and provide information about the pathogenesis of different types of fracture. Copyright © 2014 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
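The fracture criterion used in such analyses, comparing the von Mises equivalent stress against a yield value, can be sketched in a few lines. This is a generic illustration with an assumed stress state and an assumed yield strength for cortical bone; the paper's actual values are not reproduced here.

import numpy as np

def von_mises(s):
    """Von Mises equivalent stress from a 3x3 Cauchy stress tensor (Pa)."""
    dev = s - np.trace(s) / 3.0 * np.eye(3)      # deviatoric part
    return np.sqrt(1.5 * np.sum(dev * dev))

# Hypothetical element stress state and an assumed yield strength (~110 MPa).
sigma = np.array([[80e6, 20e6, 0.0],
                  [20e6, 30e6, 0.0],
                  [0.0,  0.0, 10e6]])
yield_strength = 110e6
vm = von_mises(sigma)
print(f"von Mises = {vm/1e6:.1f} MPa, fracture predicted: {vm > yield_strength}")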
Application of Probability Methods to Assess Crash Modeling Uncertainty
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Stockwell, Alan E.; Hardy, Robin C.
2003-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models, human occupant models, and advanced material models that include nonlinear stress-strain behavior and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the effects of finite element modeling assumptions on the predicted responses. The vertical drop test of a Fokker F28 fuselage section is the focus of this paper. The results of a probabilistic analysis using finite element simulations are compared with experimental data.
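One common probabilistic approach is to sample the uncertain modeling inputs and propagate them through the analysis to obtain a distribution of predicted responses. The sketch below does this with plain Monte Carlo over a toy response function; the function, parameters, and ranges are illustrative assumptions, not the Fokker F28 model.

import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate: peak acceleration as a function of uncertain model inputs.
def peak_accel(stiffness, failure_strain, mass):
    return 9.81 * stiffness**0.5 * failure_strain**-0.2 / mass**0.5

n = 10_000
k  = rng.uniform(0.8e6, 1.2e6, n)      # joint stiffness (N/m), assumed range
ef = rng.uniform(0.10, 0.20, n)        # material failure strain, assumed range
m  = rng.uniform(900.0, 1100.0, n)     # section mass (kg), assumed range

a = peak_accel(k, ef, m)
lo, hi = np.percentile(a, [5, 95])
print(f"mean = {a.mean():.1f} m/s^2, 90% interval = [{lo:.1f}, {hi:.1f}]")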
An IEEE 1451.1 Architecture for ISHM Applications
NASA Technical Reports Server (NTRS)
Morris, Jon A.; Turowski, Mark; Schmalzel, John L.; Figueroa, Jorge F.
2007-01-01
The IEEE 1451.1 Standard for a Smart Transducer Interface defines a common network information model for connecting and managing smart elements in control and data acquisition networks using network-capable application processors (NCAPs). The Standard is a network-neutral design model that is easily ported across operating systems and physical networks, allowing complex acquisition and control applications to be implemented by simply plugging in the appropriate network-level drivers. To simplify configuration and tracking of transducer and actuator details, the family of 1451 standards defines a Transducer Electronic Data Sheet (TEDS) that is associated with each physical element. The TEDS contains all of the pertinent information about the physical operation of a transducer (such as operating regions, calibration tables, and manufacturer information), which the NCAP uses to configure the system to support a specific transducer. The Integrated Systems Health Management (ISHM) group at NASA's John C. Stennis Space Center (SSC) has been developing an ISHM architecture that utilizes IEEE 1451.1 as the primary configuration and data acquisition mechanism for managing and collecting information from a network of distributed intelligent sensing elements. This work has involved collaboration with other NASA centers, universities, and aerospace industries to develop IEEE 1451.1 compliant sensors and interfaces tailored to support health assessment of complex systems. This paper and presentation describe the development and implementation of an interface for the configuration, management, and communication of data, information, and knowledge generated by a distributed system of IEEE 1451.1 intelligent elements monitoring a rocket engine test system. In this context, an intelligent element is defined as one incorporating support for the IEEE 1451.x standards and additional ISHM functions. Our implementation supports real-time collection of both measurement data (raw ADC counts and converted engineering units) and health statistics produced by each intelligent element. The handling of configuration, calibration, and health information is automated by using the TEDS in combination with other electronic data sheet extensions to convey health parameters. By integrating the IEEE 1451.1 Standard for a Smart Transducer Interface with ISHM technologies, each element within a complex system becomes a highly flexible computation engine capable of self-validation and of performing other measures of the quality of the information it is producing.
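To make the TEDS-driven configuration concrete, the sketch below models a much-simplified TEDS-like record and the NCAP-side conversion from raw ADC counts to engineering units. The fields and the polynomial calibration are illustrative assumptions; the real IEEE 1451 TEDS is a binary structure with many more mandatory fields.

from dataclasses import dataclass, field

@dataclass
class TransducerTEDS:
    manufacturer: str
    model: str
    serial_number: str
    units: str
    min_value: float
    max_value: float
    calibration: list[float] = field(default_factory=list)  # polynomial coeffs

def to_engineering_units(raw_adc: int, teds: TransducerTEDS) -> float:
    """Apply the TEDS calibration polynomial to a raw ADC count."""
    return sum(c * raw_adc**i for i, c in enumerate(teds.calibration))

teds = TransducerTEDS("Acme", "PT-100", "SN0042", "kPa",
                      0.0, 7000.0, calibration=[0.0, 1.709])
print(to_engineering_units(2048, teds))   # NCAP-side conversion step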
NASA Technical Reports Server (NTRS)
Glover, R. C.; Rudy, S. W.; Tischer, A. E.
1987-01-01
The high-pressure oxidizer turbopump (HPOTP) failure information propagation model (FIPM) is presented. The text includes a brief discussion of the FIPM methodology and the various elements which comprise a model. Specific details of the HPOTP FIPM are described. Listings of all the HPOTP data records are included as appendices.
ERIC Educational Resources Information Center
van Nieuwenhuijzen, M.; de Castro, B. O.; van der Valk, I.; Wijnroks, L.; Vermeer, A.; Matthys, W.
2006-01-01
Background: This study aimed to examine whether the social information-processing model (SIP model) applies to aggressive behaviour by children with mild intellectual disabilities (MID). The response-decision element of SIP was expected to be unnecessary to explain aggressive behaviour in these children, and SIP was expected to mediate the…
Two-point method uncertainty during control and measurement of cylindrical element diameters
NASA Astrophysics Data System (ADS)
Glukhov, V. I.; Shalay, V. V.; Radev, H.
2018-04-01
The article addresses the pressing problem of the reliability of measurements of the geometric specifications of technical products. Its purpose is to improve the quality of control of linear part sizes using the two-point measurement method, and its task is to investigate methodical extended uncertainties in measuring the linear sizes of cylindrical elements. The investigation method is geometric modeling of the shape and location deviations of the element surfaces in a rectangular coordinate system. The studies were carried out for elements of various service uses, taking into account their informativeness, corresponding to the kinematic pair classes of theoretical mechanics and the number of constrained degrees of freedom in the datum element function. Cylindrical elements with informativeness of 4, 2, 1, and 0 (zero) were investigated. The uncertainty of two-point measurements was estimated by comparing the results of linear dimension measurements with the maximum and minimum functional diameters of the element material. Methodical uncertainty arises when cylindrical elements with maximum informativeness have shape deviations of the cut and curvature types, and when the average size of an element is measured for any type of shape deviation. The two-point measurement method cannot take into account the location deviations of a dimensional element, so its use for elements with less than maximum informativeness creates unacceptable methodical uncertainties in measurements of the maximum, minimum, and average linear dimensions. Similar methodical uncertainties also exist in the arbitration control of the linear dimensions of cylindrical elements by limiting two-point gauges.
Common elements of adolescent prevention programs: minimizing burden while maximizing reach.
Boustani, Maya M; Frazier, Stacy L; Becker, Kimberly D; Bechor, Michele; Dinizulu, Sonya M; Hedemann, Erin R; Ogle, Robert R; Pasalich, Dave S
2015-03-01
A growing number of evidence-based youth prevention programs are available, but challenges related to dissemination and implementation limit their reach and impact. The current review identifies common elements across evidence-based prevention programs focused on the promotion of health-related outcomes in adolescents. We reviewed and coded descriptions of the programs for common practice and instructional elements. Problem-solving emerged as the most common practice element, followed by communication skills and insight building. Psychoeducation, modeling, and role play emerged as the most common instructional elements. In light of significant comorbidity in poor outcomes for youth, and corresponding overlap in their underlying skills deficits, we propose that synthesizing the prevention literature using a common elements approach has the potential to yield novel information and inform prevention programming to minimize burden and maximize reach and impact for youth.
A review of some problems in global-local stress analysis
NASA Technical Reports Server (NTRS)
Nelson, Richard B.
1989-01-01
The various types of local-global finite-element problems point out the need to develop a new generation of software. First, this new software needs to have a complete analysis capability, encompassing linear and nonlinear analysis of 1-, 2-, and 3-dimensional finite-element models, as well as mixed dimensional models. The software must be capable of treating static and dynamic (vibration and transient response) problems, including the stability effects of initial stress, and the software should be able to treat both elastic and elasto-plastic materials. The software should carry a set of optional diagnostics to assist the program user during model generation in order to help avoid obvious structural modeling errors. In addition, the program software should be well documented so the user has a complete technical reference for each type of element contained in the program library, including information on such topics as the type of numerical integration, use of underintegration, and inclusion of incompatible modes, etc. Some packaged information should also be available to assist the user in building mixed-dimensional models. An important advancement in finite-element software should be in the development of program modularity, so that the user can select from a menu various basic operations in matrix structural analysis.
Tao, Cui; Jiang, Guoqian; Oniki, Thomas A; Freimuth, Robert R; Zhu, Qian; Sharma, Deepak; Pathak, Jyotishman; Huff, Stanley M; Chute, Christopher G
2013-05-01
The clinical element model (CEM) is an information model designed for representing clinical information in electronic health records (EHR) systems across organizations. The current representation of CEMs does not support formal semantic definitions and therefore it is not possible to perform reasoning and consistency checking on derived models. This paper introduces our efforts to represent the CEM specification using the Web Ontology Language (OWL). The CEM-OWL representation connects the CEM content with the Semantic Web environment, which provides authoring, reasoning, and querying tools. This work may also facilitate the harmonization of the CEMs with domain knowledge represented in terminology models as well as other clinical information models such as the openEHR archetype model. We have created the CEM-OWL meta ontology based on the CEM specification. A convertor has been implemented in Java to automatically translate detailed CEMs from XML to OWL. A panel evaluation has been conducted, and the results show that the OWL modeling can faithfully represent the CEM specification and represent patient data.
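The paper's converter is written in Java; as a rough illustration of the XML-to-OWL translation step, here is a hedged Python analogue using rdflib and a hypothetical miniature CEM fragment (the real CEM XML schema and the CEM-OWL meta ontology are far richer).

import xml.etree.ElementTree as ET
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import OWL, RDF, RDFS

# Hypothetical miniature CEM fragment, for illustration only.
cem_xml = """<cem name="SystolicBloodPressureMeas">
               <data type="PhysicalQuantity"/>
               <qualifier name="BodyLocation"/>
             </cem>"""

CEM = Namespace("http://example.org/cem#")   # placeholder namespace
g = Graph()
root = ET.fromstring(cem_xml)

model = CEM[root.get("name")]
g.add((model, RDF.type, OWL.Class))
g.add((model, RDFS.label, Literal(root.get("name"))))
for child in root:                            # map CEM parts to OWL properties
    prop = CEM[child.get("name") or child.get("type")]
    g.add((prop, RDF.type, OWL.ObjectProperty))
    g.add((prop, RDFS.domain, model))

print(g.serialize(format="turtle"))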
Toward Validation of the Genius Discipline-Specific Literacy Model
ERIC Educational Resources Information Center
Ellis, Edwin S.; Wills, Stephen; Deshler, Donald D.
2011-01-01
An analysis of the rationale and theoretical foundations of the Genius Discipline-specific Literacy Model and its use of SMARTvisuals to cue information-processing skills and strategies and focus attention on essential informational elements in high-frequency topics in history and the English language arts are presented. Quantitative data…
Management/Technical Interaction in Integrated Information System Development.
ERIC Educational Resources Information Center
Bagley, Clarence H.; Gardner, Don E.
The integrated information system element of the management information system concept has practical applications for management in the areas of both information analysis and decision-model building. Four basic options for achieving integration in operational data systems are: a default option, the coordinated file option, the distributed…
Information Book Read-Alouds as Models for Second-Grade Authors
ERIC Educational Resources Information Center
Bradley, Linda Golson; Donovan, Carol A.
2010-01-01
This article discusses the instructional practice of supporting second graders' information book writing with focused read-alouds that include discussions of information book genre elements, features, and organizational structure. The authors present specific examples of instruction and discuss the resulting information book compositions by…
Modeling Array Stations in SIG-VISA
NASA Astrophysics Data System (ADS)
Ding, N.; Moore, D.; Russell, S.
2013-12-01
We add support for array stations to SIG-VISA, a system for nuclear monitoring using probabilistic inference on seismic signals. Array stations comprise a large portion of the IMS network; they can provide increased sensitivity and more accurate directional information compared to single-component stations. Our existing model assumed that signals were independent at each station, which is false when many stations are close together, as in an array. The new model removes that assumption by jointly modeling signals across array elements. This is done by extending our existing Gaussian process (GP) regression models, also known as kriging, from a 3-dimensional single-component space of events to a 6-dimensional space of station-event pairs. For each array and each event attribute (including coda decay, coda height, amplitude transfer, and travel time), we model the joint distribution across array elements using a Gaussian process that learns the correlation lengthscale across the array, thereby incorporating information from array stations into the probabilistic inference framework. To evaluate the effectiveness of our model, we perform 'probabilistic beamforming' on new events using our GP model, i.e., we compute the event azimuth having the highest posterior probability under the model, conditioned on the signals at array elements. We compare the results from our probabilistic inference model to the beamforming currently performed by IMS station processing.
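The joint-modeling step can be illustrated with a minimal kriging example: a squared-exponential covariance ties together attribute residuals observed at nearby array elements, and the GP predictive mean interpolates to an unobserved element. All positions, values, and the lengthscale below are invented; in SIG-VISA the lengthscale is learned from data.

import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 3, size=(9, 2))          # element positions in an array (km)
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(9)   # observed residuals

def rbf(a, b, lengthscale=1.0, var=1.0):
    """Squared-exponential covariance between two sets of locations."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / lengthscale**2)

K = rbf(X, X) + 1e-4 * np.eye(len(X))       # joint covariance across elements
x_star = np.array([[1.5, 1.5]])             # unobserved element location
k_star = rbf(x_star, X)
mu = k_star @ np.linalg.solve(K, y)         # kriging predictive mean
print(f"predicted residual at new element: {mu[0]:.3f}")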
AutoCAD-To-GIFTS Translator Program
NASA Technical Reports Server (NTRS)
Jones, Andrew
1989-01-01
AutoCAD-to-GIFTS translator program, ACTOG, developed to facilitate quick generation of small finite-element models using the CASA/GIFTS finite-element modeling program. Reads geometric data of drawing from Data Exchange File (DXF) used in AutoCAD and other PC-based drafting programs. Geometric entities recognized by ACTOG include points, lines, arcs, solids, three-dimensional lines, and three-dimensional faces. From this information, ACTOG creates GIFTS SRC file, which is then read into GIFTS preprocessor BULKM, or modified and read into EDITM, to create finite-element model. SRC file used as is or edited for any number of uses. Written in Microsoft Quick-Basic (Version 2.0).
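ACTOG itself is QuickBasic, but the DXF-to-mesh-source translation it performs can be sketched in modern terms. The toy example below uses the third-party ezdxf package to read LINE entities from a hypothetical drawing and emit a simple SRC-like text file; the output format here is illustrative, not the actual GIFTS SRC syntax.

import ezdxf  # third-party DXF reader, a modern stand-in for ACTOG's parser

doc = ezdxf.readfile("wing_rib.dxf")        # hypothetical input drawing
msp = doc.modelspace()

nodes, lines = [], []
for e in msp.query("LINE"):                 # one of the entity types ACTOG reads
    s, t = e.dxf.start, e.dxf.end
    for p in (s, t):
        if (p.x, p.y) not in nodes:
            nodes.append((p.x, p.y))
    lines.append((nodes.index((s.x, s.y)), nodes.index((t.x, t.y))))

# Emit a simple SRC-like text file (format is illustrative only).
with open("model.src", "w") as f:
    for i, (x, y) in enumerate(nodes, 1):
        f.write(f"POINT {i} {x:.3f} {y:.3f}\n")
    for i, (a, b) in enumerate(lines, 1):
        f.write(f"LINE {i} {a + 1} {b + 1}\n")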
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Arrigo, J.; Hooper, R. P.; Valentine, D. W.; Maidment, D. R.
2013-12-01
HydroShare is an online, collaborative system being developed for sharing hydrologic data and models. The goal of HydroShare is to enable scientists to easily discover and access data and models, retrieve them to their desktop or perform analyses in a distributed computing environment that may include grid, cloud or high performance computing model instances as necessary. Scientists may also publish outcomes (data, results or models) into HydroShare, using the system as a collaboration platform for sharing data, models and analyses. HydroShare is expanding the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated, creating new capability to share models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. One of the fundamental concepts in HydroShare is that of a Resource. All content is represented using a Resource Data Model that separates system and science metadata and has elements common to all resources as well as elements specific to the types of resources HydroShare will support. These will include different data types used in the hydrology community and models and workflows that require metadata on execution functionality. HydroShare will use the integrated Rule-Oriented Data System (iRODS) to manage federated data content and perform rule-based background actions on data and model resources, including parsing to generate metadata catalog information and the execution of models and workflows. This presentation will introduce the HydroShare functionality developed to date, describe key elements of the Resource Data Model and outline the roadmap for future development.
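As a rough sketch of the Resource concept described above, the snippet below separates system metadata common to all resources from type-specific science metadata. Field names are illustrative assumptions, not HydroShare's actual schema.

from dataclasses import dataclass, field

@dataclass
class SystemMetadata:
    resource_id: str
    owner: str
    created: str
    resource_type: str            # e.g. "TimeSeries", "Model", "Workflow"

@dataclass
class Resource:
    system: SystemMetadata
    science: dict = field(default_factory=dict)   # type-specific elements
    files: list = field(default_factory=list)

model_run = Resource(
    SystemMetadata("hs-0001", "jdoe", "2013-10-01", "Model"),
    science={"model_program": "SWAT", "execution_env": "HPC"},
    files=["inputs.zip", "results.nc"],
)
print(model_run.system.resource_type, model_run.science["model_program"])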
Reeves, Mari Kathryn; Perdue, Margaret; Munk, Lee Ann; Hagedorn, Birgit
2018-07-15
Studies of environmental processes exhibit spatial variation within data sets. The ability to derive predictions of risk from field data is a critical path forward in understanding the data and applying the information to land and resource management. Thanks to recent advances in predictive modeling, open source software, and computing, the power to do this is within grasp. This article provides an example of how we predicted relative trace element pollution risk from roads across a region by combining site-specific trace element data in soils with regional land cover and planning information in a predictive model framework. In the Kenai Peninsula of Alaska, we sampled 36 sites (191 soil samples) adjacent to roads for trace elements. We then combined this site-specific data with freely available land cover and urban planning data to derive a predictive model of landscape-scale environmental risk. We used six different model algorithms to analyze the dataset, comparing these in terms of their predictive abilities and the variables identified as important. Based on comparable predictive abilities (mean R² from 30 to 35% and mean root mean square error from 65 to 68%), we averaged all six model outputs to predict relative levels of trace element deposition in soils, given the road surface, traffic volume, sample distance from the road, land cover category, and impervious surface percentage. Mapped predictions of environmental risk from toxic trace element pollution can show land managers and transportation planners where to prioritize road renewal or maintenance by each road segment's relative environmental and human health risk. Published by Elsevier B.V.
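The model-averaging step is straightforward to sketch: fit several regressors and average their predictions. The example below uses three scikit-learn models rather than the paper's six, and stand-in random data in place of the Kenai measurements.

import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Illustrative stand-in data: predictors are road/land-cover covariates,
# the response is a soil trace-element concentration. Not the Kenai data.
rng = np.random.default_rng(7)
X = rng.random((191, 5))           # traffic, distance, imperviousness, ...
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.3 * rng.standard_normal(191)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = [RandomForestRegressor(random_state=0),
          GradientBoostingRegressor(random_state=0),
          Ridge()]
preds = np.column_stack([m.fit(X_tr, y_tr).predict(X_te) for m in models])
ensemble = preds.mean(axis=1)      # average the model outputs, as in the paper
rmse = np.sqrt(np.mean((ensemble - y_te) ** 2))
print(f"ensemble RMSE: {rmse:.3f}")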
NASA Technical Reports Server (NTRS)
Newsom, H. E.; Hagerty, J. J.; Shearer, C. W.
2002-01-01
New SIMS data for mobile elements in Lonar Crater clay minerals are remarkably similar to data for alteration material in the Lafayette Mars meteorite. This work strongly supports the use of terrestrial analogues for Mars, including a new mass balance model for mobile elements through time. Additional information is contained in the original extended abstract.
Nigenda, Gustavo H; González, Luz María
2009-01-01
Introduction Contracting out health services is a strategy that many health systems in the developing world are following, despite the lack of decisive evidence that this is the best way to improve quality, increase efficiency and expand coverage. A large body of literature has appeared in recent years focusing on the results of several contracting strategies, but very few papers have addressed aspects of the managerial process and how this can affect results. Case description This paper describes and analyses the perceptions and opinions of managers and workers about the benefits and challenges of the contracting model that has been in place for almost 10 years in the State of Jalisco, Mexico. Both qualitative and quantitative information was collected. An open-ended questionnaire was used to obtain information from a group of managers, while information provided by a self-selected group of workers was collected via a closed-ended questionnaire. The analysis contrasted the information obtained from each source. Discussion and Evaluation Findings show that perceptions of managers and workers vary for most of the items studied. For managers the model has been a success, as it has allowed for expansion of coverage based on a cost-effective strategy, while for workers the model also possesses positive elements but fails to provide fair labour relationships, which negatively affects their performance. Conclusion Perspectives of the two main groups of actors in Jalisco's contracting model are important in the design and adjustment of an adequate contracting model that includes managerial elements to give incentives to worker performance, a key element necessary to achieve the model's ultimate objectives. Lessons learnt from this study could be relevant for the experience of contracting models in other developing countries. PMID:19849831
Detailed clinical models: a review.
Goossen, William; Goossen-Baremans, Anneke; van der Zel, Michael
2010-12-01
Due to the increasing use of electronic patient records and other health care information technology, we see an increase in requests to utilize these data. A high level of standardization is required when gathering these data in the clinical context in order to use them for analyses. Detailed Clinical Models (DCM) have been created toward this purpose, and several initiatives in various parts of the world have been implemented to create standardized models. This paper presents a review of DCM. Two types of analyses are presented: one comparing DCM against health care information architectures, and a second, bottom-up approach from concept analysis to representation. In addition, core parts of the draft ISO standard 13972 on DCM are used, such as clinician involvement, data element specification, modeling, meta information, and repository and governance. Six initiatives were selected: Intermountain Healthcare, 13606/OpenEHR Archetypes, Clinical Templates, Clinical Contents Models, Health Level 7 templates, and Dutch Detailed Clinical Models. Each selected model was reviewed for its overall development, involvement of clinicians, use of data types, code bindings, expressing semantics, modeling, meta information, use of repository, and governance. Using both a top-down and a bottom-up approach to comparison reveals many commonalities and differences between initiatives. Important differences include the use of, or lack of, a reference model and the expressiveness of models. Applying clinical data element standards facilitates the use of conceptual DCM models in different technical representations.
NASA Astrophysics Data System (ADS)
Zhang, Bin; Deng, Congying; Zhang, Yi
2018-03-01
Rolling element bearings are mechanical components used frequently in most rotating machinery, and they are also vulnerable links representing the main source of failures in such systems. Thus, health condition monitoring and fault diagnosis of rolling element bearings have long been studied to improve the operational reliability and maintenance efficiency of rotatory machines. Over the past decade, prognosis that enables forewarning of failure and estimation of residual life has attracted increasing attention. To accurately and efficiently predict failure of the rolling element bearing, the degradation needs to be well represented and modelled. For this purpose, degradation of the rolling element bearing is analysed with the delay-time-based model in this paper. Also, a hybrid feature selection and health indicator construction scheme is proposed for extraction of the bearing health relevant information from condition monitoring sensor data. Effectiveness of the presented approach is validated through case studies on rolling element bearing run-to-failure experiments.
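A minimal version of the health indicator construction step might look like the sketch below: extract simple time-domain features (RMS, kurtosis) from each vibration snapshot, normalise them to a healthy baseline, and track the distance from that baseline. The simulated run-to-failure signal and the feature set are illustrative; the paper's hybrid selection scheme is more elaborate.

import numpy as np

def features(signal):
    """Simple time-domain features of one vibration snapshot."""
    rms = np.sqrt(np.mean(signal**2))
    kurtosis = np.mean((signal - signal.mean())**4) / signal.std()**4
    return np.array([rms, kurtosis])

rng = np.random.default_rng(3)
# Simulated run-to-failure data: noise plus a slowly growing fault impulse.
snapshots = [rng.standard_normal(2048) * (1 + 0.02 * t)
             + (0.05 * t) * (rng.random(2048) > 0.999) * 20.0
             for t in range(100)]

F = np.array([features(s) for s in snapshots])
F = (F - F[:10].mean(0)) / F[:10].std(0)     # normalise to healthy baseline
health_indicator = np.linalg.norm(F, axis=1) # distance from healthy state
print("final HI vs. baseline:", health_indicator[-1], health_indicator[0])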
Teaching Reading Sourcebook, Second Edition
ERIC Educational Resources Information Center
Honig, Bill; Diamond, Linda; Gutlohn, Linda
2008-01-01
The "Teaching Reading Sourcebook, Second Edition" is a comprehensive reference about reading instruction. Organized according to the elements of explicit instruction (what? why? when? and how?), the "Sourcebook" includes both a research-informed knowledge base and practical sample lesson models. It teaches the key elements of an effective reading…
Neural network for processing both spatial and temporal data with time based back-propagation
NASA Technical Reports Server (NTRS)
Villarreal, James A. (Inventor); Shelton, Robert O. (Inventor)
1993-01-01
Neural networks are computing systems modeled after the paradigm of the biological brain. For years, researchers using various forms of neural networks have attempted to model the brain's information processing and decision-making capabilities. Neural network algorithms have impressively demonstrated the capability of modeling spatial information. On the other hand, the application of parallel distributed models to the processing of temporal data has been severely restricted. The invention introduces a novel technique which adds the dimension of time to the well known back-propagation neural network algorithm. In the space-time neural network disclosed herein, the synaptic weights between two artificial neurons (processing elements) are replaced with an adaptable-adjustable filter. Instead of a single synaptic weight, the invention provides a plurality of weights representing not only association, but also temporal dependencies. In this case, the synaptic weights are the coefficients of the adaptable digital filters. Novelty is believed to lie in the disclosure of a processing element, and a network of such processing elements, which are capable of processing temporal as well as spatial data.
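The replacement of scalar weights with filters can be sketched as a neuron whose synapses are small FIR filters over buffers of delayed inputs. The class below is an illustrative toy forward pass only (no time-based backpropagation), and all sizes are assumptions.

import numpy as np

class SpaceTimeNeuron:
    """Each synapse is an FIR filter over a delay line, not a single weight."""
    def __init__(self, n_inputs, n_taps, rng):
        self.w = rng.standard_normal((n_inputs, n_taps)) * 0.1  # filter coeffs
        self.buf = np.zeros((n_inputs, n_taps))                 # delay lines

    def step(self, x):
        self.buf = np.roll(self.buf, 1, axis=1)  # shift the delay lines
        self.buf[:, 0] = x                       # newest sample in front
        z = np.sum(self.w * self.buf)            # convolution across time taps
        return np.tanh(z)

rng = np.random.default_rng(0)
neuron = SpaceTimeNeuron(n_inputs=3, n_taps=5, rng=rng)
for t in range(10):                              # feed a temporal sequence
    out = neuron.step(np.sin(0.3 * t + np.arange(3)))
print("output after sequence:", out)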
NASA Technical Reports Server (NTRS)
Jenkins, J. M.
1979-01-01
Additional information was added to a growing data base from which estimates of finite element model complexity can be made with respect to thermal stress analysis. The manner in which temperatures were smeared to the finite element grid points was examined from the point of view of the impact on thermal stress calculations. The general comparison of calculated and measured thermal stresses is quite good, and there is little doubt that the finite element approach provided by NASTRAN results in correct thermal stress calculations. Discrepancies did exist between measured and calculated values in the skin and the skin/frame junctures. The problems with predicting skin thermal stress were attributed to inadequate temperature inputs to the structural model rather than to modeling insufficiencies. The discrepancies occurring at the skin/frame juncture were most likely due to insufficient modeling elements rather than temperature problems.
Smith, Michael J; Wagner, Christian; Wallace, Ken J; Pourabdollah, Amir; Lewis, Loretta
2016-06-15
An important and yet unresolved question in natural resource management is how best to manage natural elements and their associated values to ensure human wellbeing. Specifically, there is a lack of measurement tools to assess the contribution of nature to people. We present one approach to overcome this global issue and show that the preferred state of any system element, in terms of realising human values, is a function of element properties. Consequently, natural resource managers need to understand the nature of the relationships between element properties and values if they are to successfully manage for human wellbeing. In two case studies of applied planning, we demonstrate how to identify key element properties, quantify their relationships to priority human values, and combine this information to model the contribution of elements to human wellbeing. In one of the two case studies we also compared the modelling outputs with directly elicited stakeholder opinions regarding the importance of the elements for realising the given priority values. The two largely congruent outputs provide additional support for the approach. The study shows that rating sets of elements on their relative overall value for human wellbeing, or utility, provides critical information for subsequent management decisions and a basis for productive new research. We consider that the described approach is broadly applicable within the domain of natural resource management. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Fatigue assessment of an existing steel bridge by finite element modelling and field measurements
NASA Astrophysics Data System (ADS)
Kwad, J.; Alencar, G.; Correia, J.; Jesus, A.; Calçada, R.; Kripakaran, P.
2017-05-01
The evaluation of the fatigue life of structural details in metallic bridges is a major challenge for bridge engineers. A reliable and cost-effective approach is essential to ensure appropriate maintenance and management of these structures. Typically, local stresses predicted by a finite element model of the bridge are employed to assess the fatigue life of fatigue-prone details. This paper illustrates an approach for fatigue assessment based on measured data for a connection in an old bascule steel bridge located in Exeter (UK). A finite element model is first developed from the design information. The finite element model of the bridge is calibrated using measured responses from an ambient vibration test. The stress time histories are calculated through dynamic analysis of the updated finite element model. Stress cycles are computed with the rainflow counting algorithm, and the fatigue-prone details are evaluated using the standard S-N curve approach and Miner's rule. Results show that the proposed approach can estimate the fatigue damage of a fatigue-prone detail in a structure using measured strain data.
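The cycle-counting and damage-summation steps can be sketched briefly. The example below uses the third-party rainflow package and Miner's linear summation against an assumed S-N curve of the form N = C / S^m; the stress history and the detail-category constants are placeholders, not the Exeter bridge data.

import numpy as np
import rainflow  # third-party implementation of the rainflow algorithm

# Hypothetical stress history (MPa) at the fatigue-prone detail, e.g. from
# strain gauges scaled by the calibrated FE model.
rng = np.random.default_rng(5)
stress = 40 + 25 * np.sin(np.linspace(0, 40 * np.pi, 4000)) \
            + 5 * rng.standard_normal(4000)

# Assumed S-N curve N = C / S^m; detail-category constants are placeholders.
C, m = 2.0e12, 3.0

damage = 0.0
for stress_range, count in rainflow.count_cycles(stress):
    if stress_range > 0:
        damage += count / (C / stress_range**m)   # Miner's linear summation

print(f"accumulated damage per record: {damage:.2e}")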
NASA Astrophysics Data System (ADS)
Su, Y.; Ong, E. T.; Lee, K. H.
2002-05-01
The past decade has seen an accelerated growth of technology in the field of microelectromechanical systems (MEMS). The development of MEMS products has generated the need for efficient analytical and simulation methods for minimizing the requirement for actual prototyping. The boundary element method is widely used in the electrostatic analysis for MEMS devices. However, singular elements are needed to accurately capture the behavior at singular regions, such as sharp corners and edges, where standard elements fail to give an accurate result. The manual classification of boundary elements based on their singularity conditions is an immensely laborious task, especially when the boundary element model is large. This process can be automated by querying the geometric model of the MEMS device for convex edges based on geometric information of the model. The associated nodes of the boundary elements on these edges can then be retrieved. The whole process is implemented in the MSC/PATRAN platform using the Patran Command Language (the source code is available as supplementary data in the electronic version of this journal issue).
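The geometric query at the heart of the automation, finding convex (sharp) edges where singular elements are needed, can be illustrated by checking the dihedral angle between adjacent face normals. The paper implements this in the Patran Command Language; the sketch below is a Python stand-in on a two-triangle toy mesh, with an assumed sharpness threshold.

import numpy as np

def face_normal(verts, tri):
    """Unit normal of one triangular face."""
    a, b, c = verts[list(tri)]
    n = np.cross(b - a, c - a)
    return n / np.linalg.norm(n)

# Toy mesh: two triangles meeting at a 90-degree edge (verts 0-1).
verts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
faces = [(0, 1, 2), (0, 3, 1)]          # shared edge: (0, 1)

n0 = face_normal(verts, faces[0])
n1 = face_normal(verts, faces[1])
angle = np.degrees(np.arccos(np.clip(np.dot(n0, n1), -1.0, 1.0)))
if angle > 30.0:                        # "sharp" threshold is an assumption
    print(f"edge (0,1) is sharp ({angle:.0f} deg): assign singular elements")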
Point Clouds to Indoor/outdoor Accessibility Diagnosis
NASA Astrophysics Data System (ADS)
Balado, J.; Díaz-Vilariño, L.; Arias, P.; Garrido, I.
2017-09-01
This work presents an approach to automatically detect structural floor elements, such as steps or ramps, in the immediate environment of buildings; such elements may affect the accessibility of buildings. The methodology is based on Mobile Laser Scanner (MLS) point cloud and trajectory information. First, the street is segmented into stretches along the trajectory of the MLS in order to work in regular spaces. Next, the lower region of each stretch (the ground zone) is selected as the ROI, and normal, curvature, and tilt are calculated for each point. With this information, points in the ROI are classified as horizontal, inclined, or vertical. Points are refined and grouped into structural elements using raster processing and connected components, in different phases for each type of previously classified point. Finally, the trajectory data are used to distinguish between road and sidewalks. Adjacency information is used to classify structural elements as steps, ramps, curbs, and curb-ramps. The methodology is tested in a real case study consisting of 100 m of an urban street. Ground elements are correctly classified in an acceptable computation time. Steps and ramps are also exported to GIS software to enrich building models from Open Street Map with information about accessible/inaccessible entrances and their locations.
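The per-point classification step can be sketched directly from the normals: measure each normal's tilt from the vertical axis and bin it. The thresholds below are illustrative assumptions, not the values used in the paper.

import numpy as np

def classify(normals, incline_deg=10.0, vertical_deg=70.0):
    """Label points as horizontal, inclined, or vertical from normal tilt."""
    up = np.array([0.0, 0.0, 1.0])
    tilt = np.degrees(np.arccos(np.clip(np.abs(normals @ up), -1.0, 1.0)))
    labels = np.full(len(normals), "inclined", dtype=object)
    labels[tilt < incline_deg] = "horizontal"      # sidewalks, road surface
    labels[tilt > vertical_deg] = "vertical"       # curbs, step risers
    return labels

normals = np.array([[0, 0, 1], [0.2, 0, 0.98], [1, 0, 0]], dtype=float)
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
print(classify(normals))    # ['horizontal' 'inclined' 'vertical']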
NASA Technical Reports Server (NTRS)
1983-01-01
All information directly associated with problem solving using the NASTRAN program is presented. This structural analysis program uses the finite element approach to structural modeling, wherein the distributed physical properties of a structure are represented by a finite number of structural elements which are interconnected at a finite number of grid points, to which loads are applied and for which displacements are calculated. Procedures are described for defining and loading a structural model. Functional references are given for every card used for structural modeling, the NASTRAN data deck and control cards, problem solution sequences (rigid formats), using the plotting capability, writing a direct matrix abstraction program, and diagnostic messages. A dictionary of mnemonics, acronyms, phrases, and other commonly used NASTRAN terms is included.
A Novel Machine Learning Classifier Based on a Qualia Modeling Agent (QMA)
…Integrated Information Theory (IIT) of Consciousness, which proposes that the fundamental structural elements of consciousness are qualia. By modeling the… This research develops a computational agent which overcomes this problem. The Qualia Modeling Agent (QMA) is modeled after two cognitive theories…
NASA Astrophysics Data System (ADS)
Chow, L.; Fai, S.
2017-08-01
The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.
An Annotated Bibliography on Tactical Map Display Symbology
1989-08-01
…failure of attention to be focused on one element selectively in filtering tasks where only that one element was relevant to the discrimination. Failure of… The present study evaluates a class of models of human information processing made popular by Broadbent. A brief tachistoscopic display of one or two… 213-219. Two experiments were performed to test Neisser's two-stage model of recognition as applied to matching. Evidence of parallel processing was…
Nemesis I: Parallel Enhancements to ExodusII
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hennigan, Gary L.; John, Matthew S.; Shadid, John N.
2006-03-28
NEMESIS I is an enhancement to the EXODUS II finite element database model used to store and retrieve data for unstructured parallel finite element analyses. NEMESIS I adds data structures which facilitate the partitioning of a scalar (standard serial) EXODUS II file onto the parallel disk systems found on many parallel computers. Since the NEMESIS I application programming interface (API) can be used to append information to an existing EXODUS II file, standard EXODUS II applications can still be used on files which contain NEMESIS I information. The NEMESIS I information is written and read via C or C++ callable functions which comprise the NEMESIS I API.
An Ontology for Identifying Cyber Intrusion Induced Faults in Process Control Systems
NASA Astrophysics Data System (ADS)
Hieb, Jeffrey; Graham, James; Guan, Jian
This paper presents an ontological framework that permits formal representations of process control systems, including elements of the process being controlled and the control system itself. A fault diagnosis algorithm based on the ontological model is also presented. The algorithm can identify traditional process elements as well as control system elements (e.g., IP network and SCADA protocol) as fault sources. When these elements are identified as a likely fault source, the possibility exists that the process fault is induced by a cyber intrusion. A laboratory-scale distillation column is used to illustrate the model and the algorithm. Coupled with a well-defined statistical process model, this fault diagnosis approach provides cyber security enhanced fault diagnosis information to plant operators and can help identify that a cyber attack is underway before a major process failure is experienced.
The Relationship between Simultaneous-Successive Processing and Academic Achievement.
ERIC Educational Resources Information Center
Merritt, Frank M.; McCallum, Steve
The Luria-Das Information Processing Model of human learning holds that information is analysed and coded within the brain in either a simultaneous or a successive fashion. Simultaneous integration refers to the synthesis of separate elements into groups, often with spatial characteristics; successive integration means that information is…
Shibai, Atsushi; Arimoto, Tsunehiro; Yoshinaga, Tsukasa; Tsuchizawa, Yuta; Khureltulga, Dashdavaa; Brown, Zuben P; Kakizuka, Taishi; Hosoda, Kazufumi
2018-06-05
Visual recognition of conspecifics is necessary for a wide range of social behaviours in many animals. Medaka (Japanese rice fish), a commonly used model organism, are known to be attracted by the biological motion of conspecifics. However, biological motion is a composite of both body-shape motion and the entire-field motion trajectory (i.e., posture and motion-trajectory elements, respectively), and it has not been revealed which element mediates the attractiveness. Here, we show that either the posture or the motion-trajectory element alone can attract medaka. We decomposed the biological motion of the medaka into the two elements and synthesized visual stimuli that contain both, either, or none of the two elements. We found that medaka were attracted by visual stimuli that contain at least one of the two elements. Together with previously known static visual cues, this further expands the potential multiplicity of information involved in conspecific recognition. Our strategy of decomposing biological motion into these partial elements is applicable to other animals, and further studies using this technique will enhance the basic understanding of visual recognition of conspecifics.
NASA Astrophysics Data System (ADS)
Garagnani, S.; Manferdini, A. M.
2013-02-01
Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards", based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e. self-aware of what kind of element they are and with whom they can interact), representing the basics of Building Information Modeling (BIM): a coordinated, consistent, and always up-to-date workflow designed to achieve higher quality, reliability, and cost reductions across the design process. Even if BIM was originally intended for new architectures, its ability to store semantically interrelated information can be successfully applied to existing buildings as well, especially if they deserve particular care, such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships; however, when components are generated from existing morphologies, for example by acquiring point clouds with digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology for processing point cloud data in a BIM environment with high accuracy, this paper describes some experiences in the documentation of monumental sites, carried out through a plug-in written for Autodesk Revit and codenamed GreenSpider, after its capability to lay out points in space as if they were nodes of an ideal cobweb.
Ogrinc, Greg; Hoffman, Kimberly G.; Stevenson, Katherine M.; Shalaby, Marc; Beard, Albertine S.; Thörne, Karin E.; Coleman, Mary T.; Baum, Karyn D.
2016-01-01
Problem Current models of health care quality improvement do not explicitly describe the role of health professions education. The authors propose the Exemplary Care and Learning Site (ECLS) model as an approach to achieving continual improvement in care and learning in the clinical setting. Approach From 2008 to 2012, an iterative, interactive process was used to develop the ECLS model and its core elements—patients and families informing process changes; trainees engaging both in care and the improvement of care; leaders knowing, valuing, and practicing improvement; data transforming into useful information; and health professionals competently engaging both in care improvement and teaching about care improvement. In 2012-2013, a three-part feasibility test of the model, including a site self-assessment, an independent review of each site's ratings, and implementation case stories, was conducted at six clinical teaching sites (in the United States and Sweden). Outcomes Site leaders reported the ECLS model provided a systematic approach toward improving patient (and population) outcomes, system performance, and professional development. Most sites found it challenging to incorporate the patients and families element. The trainee element was strong at four sites. The leadership and data elements were self-assessed as the most fully developed. The health professionals element exhibited the greatest variability across sites. Next Steps The next test of the model should be prospective, linked to clinical and educational outcomes, to evaluate whether it helps care delivery teams, educators, and patients and families take action to achieve better patient (and population) outcomes, system performance, and professional development. PMID:26760058
Neural-Net Processing of Characteristic Patterns From Electronic Holograms of Vibrating Blades
NASA Technical Reports Server (NTRS)
Decker, Arthur J.
1999-01-01
Finite-element-model-trained artificial neural networks can be used to process efficiently the characteristic patterns or mode shapes from electronic holograms of vibrating blades. The models used for routine design may not yet be sufficiently accurate for this application. This document discusses the creation of characteristic patterns, compares model-generated and experimental characteristic patterns, and discusses the neural networks that transform the characteristic patterns into strain or damage information. The current potential to adapt electronic holography to spin rigs, wind tunnels, and engines provides an incentive to have accurate finite element models for training neural networks.
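The training idea, using a finite element model to generate many (pattern, strain) pairs and fitting a network to that mapping, can be sketched with a small regressor. All data below are synthetic stand-ins for FEM outputs and hologram-derived patterns.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n_samples, n_pixels = 500, 64          # flattened characteristic patterns

modes = rng.standard_normal((n_samples, n_pixels))        # FEM mode shapes
strain = modes @ rng.standard_normal(n_pixels) * 0.01     # FEM strain output

net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(modes[:400], strain[:400])     # train on model-generated pairs

measured_pattern = modes[450]          # stand-in for a hologram-derived pattern
print("predicted strain:", net.predict([measured_pattern])[0])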
McMahon, Christiana; Denaxas, Spiros
2017-11-06
Informed consent is an important feature of longitudinal research studies, as it enables the linking of baseline participant information with administrative data. The lack of standardized models to capture consent elements can lead to substantial challenges; a structured approach to capturing consent-related metadata can address these. Our objectives were to (a) explore the state-of-the-art for recording consent, (b) identify key elements of consent required for record linkage, and (c) create and evaluate a novel metadata management model to capture consent-related metadata. The main methodological components of our work were (a) a systematic literature review and qualitative analysis of consent forms, and (b) the development and evaluation of a novel metadata model. We qualitatively analyzed 61 manuscripts and 30 consent forms, and extracted data elements related to obtaining consent for linkage. We created a novel metadata management model for consent and evaluated it by comparison with existing standards and by iteratively applying it to case studies. The developed model can facilitate the standardized recording of consent for linkage in longitudinal research studies and enable the linkage of external participant data. Furthermore, it can provide a structured way of recording consent-related metadata and facilitate the harmonization and streamlining of processes.
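As a hedged illustration of what such a consent-metadata record might look like in practice, the sketch below defines a minimal structure and a linkage-permission check. Element names are invented for illustration and are not the paper's actual model.

from dataclasses import dataclass, field

@dataclass
class ConsentMetadata:
    participant_id: str
    consent_version: str
    date_given: str
    linkage_permitted: bool                   # consent to record linkage
    linkable_sources: list = field(default_factory=list)
    withdrawal_date: str | None = None        # supports consent withdrawal

def may_link(meta: ConsentMetadata, source: str) -> bool:
    """Check whether linkage to an external source is currently permitted."""
    return (meta.linkage_permitted
            and meta.withdrawal_date is None
            and source in meta.linkable_sources)

m = ConsentMetadata("P-001", "v2.1", "2016-03-14", True,
                    linkable_sources=["hospital_episodes", "mortality"])
print(may_link(m, "hospital_episodes"))   # True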
The NASTRAN user's manual (level 17.0)
NASA Technical Reports Server (NTRS)
1979-01-01
NASTRAN embodies a lumped element approach, wherein the distributed physical properties of a structure are represented by a model consisting of a finite number of idealized substructures or elements that are interconnected at a finite number of grid points, to which loads are applied. All input and output data pertain to the idealized structural model. The general procedures for defining structural models are described, and instructions are given for each of the bulk data cards and case control cards. Additional information on the case control cards and the use of parameters is included for each rigid format.
Theoretical aspects of diagnostics of car as mechatronic system
NASA Astrophysics Data System (ADS)
Goncharov, A. E.; Bondarenko, E. V.; Krasnoshtanov, S. Yu
2018-03-01
The article describes the transformation of the mechanical systems of automobiles into mechatronic ones through the application of electronic control systems. To assess the relationship between the mechanical and electronic components of mechatronic systems with regard to their technical states, the method of equivalent elements was employed. A mathematical model of changes in the technical state of equivalent elements was developed, which allowed changes in operating capacity to be presented in graphic form. The analytical model is used to ensure the stability of the operating capacity potential of the mechatronic system; for this purpose, new resources were identified with regard to the information 'field'. This requires a new approach to the systematization of knowledge about mechatronic transport systems (the D-C-R-E system), which is examined as a separate unit. The article describes the formation of the Information unit based on the physical component of the D-C-R-E system and on external information collected and processed in the Information Diagnostic Center (IDC). Using probability theory and Boolean algebra methods, the authors obtained a logistic model describing the information relations between elements of the upgraded D-C-R-E system and the contribution of each component to road safety. The logistic model helped formulate the main IDC tasks, whose implementation was transformed into a logical sequence of data collection and analysis in the IDC. This approach predetermined the development of a multi-level diagnosing system, which made it possible to organize existing and improved image identification methods and algorithms and to create a diagnosing method for the mechatronic systems of cars which reduces labor content and increases accuracy. The approach can help assess the technical state of vehicles with the characteristics of mechatronic systems, along with their transport and environmental safety.
NASA Astrophysics Data System (ADS)
Lee, Yonghoon; Nam, Sang-Ho; Ham, Kyung-Sik; Gonzalez, Jhanis; Oropeza, Dayana; Quarles, Derrick; Yoo, Jonghyun; Russo, Richard E.
2016-04-01
Laser-Induced Breakdown Spectroscopy (LIBS) and Laser-Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICP-MS), both based on laser ablation sampling, can be employed simultaneously to obtain different chemical fingerprints from a sample. We demonstrated that this analysis approach can provide complementary information for improved classification of edible salts. LIBS could detect several of the minor metallic elements along with Na and Cl, while LA-ICP-MS spectra were used to measure non-metallic and trace heavy metal elements. Principal component analysis using LIBS and LA-ICP-MS spectra showed that their major spectral variations classified the sample salts in different ways. Three classification models were developed by using partial least squares-discriminant analysis based on the LIBS, LA-ICP-MS, and their fused data. From the cross-validation performances and confusion matrices of these models, the minor metallic elements (Mg, Ca, and K) detected by LIBS and the non-metallic (I) and trace heavy metal (Ba, W, and Pb) elements detected by LA-ICP-MS provided complementary chemical information to distinguish particular salt samples.
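The fusion-plus-classification workflow can be sketched with scikit-learn: concatenate the two spectral blocks, regress onto one-hot class labels with PLS, and assign each sample to the arg-max class. The spectra below are synthetic stand-ins for the salt data.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
n_per_class, n_libs, n_icpms = 20, 200, 50
classes = 3

# Synthetic stand-ins for the LIBS and LA-ICP-MS spectral blocks.
X_libs  = np.vstack([rng.normal(c, 1.0, (n_per_class, n_libs))  for c in range(classes)])
X_icpms = np.vstack([rng.normal(c, 1.0, (n_per_class, n_icpms)) for c in range(classes)])
X_fused = np.hstack([X_libs, X_icpms])          # low-level data fusion
Y = np.repeat(np.eye(classes), n_per_class, 0)  # one-hot class membership

pls = PLSRegression(n_components=5).fit(X_fused, Y)
pred = pls.predict(X_fused).argmax(axis=1)      # PLS-DA class assignment
print("training accuracy:", (pred == Y.argmax(axis=1)).mean())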
Lindsköld, Lars; Wintell, Mikael; Edgren, Lars; Aspelin, Peter; Lundberg, Nina
2013-07-01
Challenges related to cross-organizational access to accurate and timely information about a patient's condition have become a critical issue in healthcare, and interoperability of different local sources is necessary. The aim was to identify and present missing and semantically incorrect data elements of metadata in the radiology enterprise service that supports cross-organizational sharing of dynamic information about patients' visits in the Region Västra Götaland, Sweden. Quantitative data elements of metadata were collected yearly on the first Wednesday of March, from 2006 to 2011, from the 24 in-house radiology departments in Region Västra Götaland. These radiology departments were organized into four hospital groups and three stand-alone hospitals. The included data elements of metadata were the patient name, patient ID, institutional department name, referring physician's name, and examination description. The majority of missing data elements of metadata were related to the institutional department name for Hospital 2, from 87% in 2007 to 25% in 2011. All data elements of metadata except the patient ID contained semantic errors; for example, for the data element "patient name", only three names out of 3537 were semantically correct. This study shows that the semantics of metadata elements are poorly structured and inconsistently used. Although a cross-organizational solution may technically be fully functional, semantic errors may prevent it from serving as an information infrastructure for collaboration between all departments and hospitals in the region. For interoperability, it is important that the agreed semantic models are implemented in vendor systems using the information infrastructure.
TUNS/TCIS information model/process model
NASA Technical Reports Server (NTRS)
Wilson, James
1992-01-01
An Information Model is comprised of graphical and textual notation suitable for describing and defining the problem domain - in our case, TUNS or TCIS. The model focuses on the real world under study. It identifies what is in the problem and organizes the data into a formal structure for documentation and communication purposes. The Information Model is composed of an Entity Relationship Diagram (ERD) and a Data Dictionary component. The combination of these components provides an easy-to-understand methodology for expressing the entities in the problem space, the relationships between entities, and the characteristics (attributes) of the entities. This approach is the first step in information system development. The Information Model identifies the complete set of data elements processed by TUNS. This representation provides a conceptual view of TUNS from the perspective of entities, data, and relationships. The Information Model reflects the business practices and real-world entities that users must deal with.
Reduced complexity structural modeling for automated airframe synthesis
NASA Technical Reports Server (NTRS)
Hajela, Prabhat
1987-01-01
A procedure is developed for the optimum sizing of wing structures based on representing the built-up finite element assembly of the structure by equivalent beam models. The reduced-order beam models are computationally less demanding in an optimum design environment which dictates repetitive analysis of several trial designs. The design procedure is implemented in a computer program requiring geometry and loading information to create the wing finite element model and its equivalent beam model, and providing a rapid estimate of the optimum weight obtained from a fully stressed design approach applied to the beam. The synthesis procedure is demonstrated for representative conventional-cantilever and joined wing configurations.
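The fully stressed design step applied to the equivalent beam can be sketched as a simple resizing loop: scale each section so its working stress approaches the allowable. The loads, allowable stress, and section data below are illustrative placeholders.

import numpy as np

sigma_allow = 250e6                       # allowable stress (Pa), assumed
moment = np.array([4e5, 3e5, 2e5, 1e5])   # bending moment per beam segment (N*m)
section_modulus = np.full(4, 1e-3)        # initial section modulus (m^3)

for _ in range(20):
    sigma = moment / section_modulus              # working stress per segment
    section_modulus *= sigma / sigma_allow        # FSD resizing rule
weight_proxy = section_modulus.sum()              # stand-in for structural weight
print("stress ratio after resizing:", moment / section_modulus / sigma_allow)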
A Flexible Method for Producing F.E.M. Analysis of Bone Using Open-Source Software
NASA Technical Reports Server (NTRS)
Boppana, Abhishektha; Sefcik, Ryan; Meyers, Jerry G.; Lewandowski, Beth E.
2016-01-01
This project, performed in support of the NASA GRC Space Academy summer program, sought to develop an open-source workflow methodology that segmented medical image data, created a 3D model from the segmented data, and prepared the model for finite-element analysis. In an initial step, a technological survey evaluated the performance of various existing open-source software packages that claim to perform these tasks. However, the survey concluded that no single package exhibited the wide array of functionality required for the potential NASA application in the area of bone, muscle, and biofluidic studies. As a result, a series of Python scripts was developed to bridge the shortcomings of the available open-source tools. The VTK library provided the quickest and most effective means of segmenting regions of interest from the medical images; it allowed for the export of a 3D model by using the marching cubes algorithm to build a surface mesh. Developing the model domain from this extracted information required the surface mesh to be processed in the open-source software packages Blender and Gmsh. The Preview program of the FEBio suite proved sufficient for volume-filling the model with an unstructured mesh and preparing boundary specifications for finite element analysis. To fully enable FEM modeling, an in-house-developed Python script allowed the assignment of material properties on an element-by-element basis by performing a weighted interpolation of the voxel intensity of the parent medical image, correlated with published relationships between image intensity and material properties, such as ash density. A graphical user interface combined the Python scripts and other software into a user-friendly interface. This work provides a potential alternative to expensive commercial software and inadequate, limited open-source freeware for the creation of 3D computational models. More work will be needed to validate this approach to creating finite-element models.
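The segmentation-to-surface step can be sketched with the VTK Python bindings: read a volume, run marching cubes at an assumed bone threshold, and write an STL surface for downstream processing in Blender, Gmsh, and FEBio Preview. File names and the iso-value are placeholders.

import vtk

reader = vtk.vtkNIFTIImageReader()
reader.SetFileName("ct_scan.nii")          # hypothetical input volume

mc = vtk.vtkMarchingCubes()
mc.SetInputConnection(reader.GetOutputPort())
mc.SetValue(0, 400.0)           # iso-surface at an assumed bone HU threshold
mc.Update()

writer = vtk.vtkSTLWriter()
writer.SetInputConnection(mc.GetOutputPort())
writer.SetFileName("bone_surface.stl")
writer.Write()                  # mesh then goes to Blender/Gmsh/FEBio Preview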
Application of NASTRAN to TFTR toroidal field coil structures
NASA Technical Reports Server (NTRS)
Chen, S. J.; Lee, E.
1978-01-01
The primary applied loads on the TF coils were electromagnetic and thermal. The complex structure and the tremendous applied loads necessitated computational solutions to the design problems. In the early stage of the TF coil design, many simplified finite element models were developed to investigate the effects of material properties, supporting schemes, and coil case material on the stress levels in the case and in the copper coil. In the more sophisticated models that followed the parametric and scoping studies, isoparametric elements such as QUAD4, HEX8, and HEXA were used. The analysis results from these finite element models and the NASTRAN system were considered accurate enough to provide timely design information.
Discrimination of numerical proportions: A comparison of binomial and Gaussian models.
Raidvee, Aire; Lember, Jüri; Allik, Jüri
2017-01-01
Observers discriminated the numerical proportion of two sets of elements (N = 9, 13, 33, and 65) that differed either by color or orientation. According to the standard Thurstonian approach, the accuracy of proportion discrimination is determined by irreducible noise in the nervous system that stochastically transforms the number of presented visual elements onto a continuum of psychological states representing numerosity. As an alternative to this customary approach, we propose a Thurstonian-binomial model, which assumes discrete perceptual states, each of which is associated with a certain visual element. It is shown that the probability β with which each visual element can be noticed and registered by the perceptual system can explain the data of numerical proportion discrimination at least as well as the continuous Thurstonian-Gaussian model, and better if the greater parsimony of the Thurstonian-binomial model is taken into account using AIC model selection. We conclude that the Gaussian and binomial models represent two different fundamental principles (internal noise versus using only a fraction of the available information), both of which are plausible descriptions of visual perception.
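A hedged sketch of the model-comparison logic: under a binomial account each element is registered independently with probability beta and the response follows from comparing registered counts, while the Gaussian account compares noisy numerosities; AIC then trades likelihood against parameter count. The set sizes, trial counts, and grid fits below are invented stand-ins for the paper's actual fitting procedure.

```python
import numpy as np
from scipy.stats import binom, norm

nA, nB = 13, 9                     # hypothetical set sizes
k_correct, n_trials = 160, 200     # hypothetical choices of the larger set

def p_binomial(beta):
    # P(registered count from A exceeds that from B); ties guessed at random.
    pa = binom.pmf(np.arange(nA + 1), nA, beta)
    pb = binom.pmf(np.arange(nB + 1), nB, beta)
    greater = sum(pa[i] * pb[:i].sum() for i in range(nA + 1))
    equal = sum(pa[i] * pb[i] for i in range(min(nA, nB) + 1))
    return greater + 0.5 * equal

def p_gaussian(sigma):
    # P(noisy numerosity of A exceeds that of B) with Gaussian internal noise.
    return norm.cdf((nA - nB) / (sigma * np.sqrt(2)))

def aic(p, k_params=1):
    loglik = k_correct * np.log(p) + (n_trials - k_correct) * np.log(1 - p)
    return 2 * k_params - 2 * loglik

# Crude grid fits for each single-parameter model (placeholders for real MLE).
best_binomial = min(aic(p_binomial(b)) for b in np.linspace(0.05, 0.95, 91))
best_gaussian = min(aic(p_gaussian(s)) for s in np.linspace(1.0, 20.0, 381))
print(f"AIC binomial: {best_binomial:.1f}  AIC Gaussian: {best_gaussian:.1f}")
```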
Petrova, Guenka; Clerfeuille, Fabrice; Vakrilova, Milena; Mitkov, Cvetomir; Poubanne, Yannick
2008-01-01
The objective of this work is to study the potential of the tetraclass model for evaluating changes over time in consumer satisfaction with the services provided by pharmacies. Methods: Within the same 4-month period in 2004 and 2006, approximately 10 pharmacy consumers were questioned per working day. Every consumer evaluated the 34 service elements on a 5-point semantic-differential scale. The technique of correspondence analysis was used for the categorisation of the services. Results: Most of the services were categorized as basic. For the age group up to 40 years, access to the pharmacy became a key element and external aspects became a secondary element in 2006. For the group of patients who had used the services of the pharmacy for more than 2 years, availability of a phone connection, quality of answers, and product prices moved from plus to secondary elements. The quality/price ratio moved from the basic to the key services, while visibility of prices and hygiene became basic elements rather than secondary ones. Over the two-year period, all the service elements connected with the staff (availability, identification, appearance, confidence, dress, advice, technical competence, explanation, and time spent with clients) remained basic services. The confidentiality of the staff always remained a key element. Conclusion: Our study shows that the tetraclass model supports more informed managerial decisions in pharmacies and provides information on specific service areas and possible measures. If a simple statistical program were developed for quick processing of the questionnaire data, the method would become applicable and affordable even for small pharmacies. PMID:25147588
NASA Astrophysics Data System (ADS)
Nikolaev, V. N.; Titov, D. V.; Syryamkin, V. I.
2018-05-01
A comparative assessment is made of the channel capacity of different variants of the structural organization of automated information processing systems. A model is developed for assessing information processing time as a function of the type of standard elements and their structural organization.
Design of Particulate-Reinforced Composite Materials
Muc, Aleksander; Barski, Marek
2018-01-01
A microstructure-based model is developed to study the effective anisotropic properties (magnetic, dielectric or thermal) of two-phase particle-filled composites. The Green’s function technique and the effective field method are used to theoretically derive the homogenized (averaged) properties for a representative volume element containing an isolated inclusion and infinite, chain-structured particles. These results are compared with finite element approximations conducted for the assumed representative volume element. In addition, the Maxwell–Garnett model is retrieved as a special case when particle interactions are not considered. We also give some information on the optimal design of the effective anisotropic properties, taking into account the shape of the magnetic particles. PMID:29401678
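The Maxwell–Garnett mixing rule that the abstract recovers as a limiting case has a closed form for spherical inclusions; the sketch below evaluates it for a hypothetical particle/matrix permittivity pair (the numbers are invented).

```python
def maxwell_garnett(eps_m, eps_p, f):
    """Effective permittivity of dilute spherical inclusions (volume fraction f)
    in a host matrix, ignoring particle-particle interactions."""
    num = eps_p + 2 * eps_m + 2 * f * (eps_p - eps_m)
    den = eps_p + 2 * eps_m - f * (eps_p - eps_m)
    return eps_m * num / den

# Hypothetical values: high-permittivity particles in a polymer matrix.
print(maxwell_garnett(eps_m=3.0, eps_p=40.0, f=0.15))
```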
Quantum memristor in a superconducting circuit
NASA Astrophysics Data System (ADS)
Salmilehto, Juha; Sanz, Mikel; di Ventra, Massimiliano; Solano, Enrique
Memristors, resistive elements that retain information of their past, have garnered interest due to their paradigm-changing potential in information processing and electronics. The emergent hysteretic behaviour allows for novel architectural applications and has recently been classically demonstrated in a simplified superconducting setup using the phase-dependent conductance in the tunnel-junction-microscopic model. In this contribution, we present a truly quantum model for a memristor constructed using established elements and techniques in superconducting nanoelectronics, explore the parameters for feasible operation, and refine the methods for quantifying the memory retention. In particular, the memristive behaviour is shown to arise from quasiparticle-induced tunneling in the full dissipative model and can be observed in the phase-driven tunneling current. The relevant hysteretic behaviour should be observable using current state-of-the-art measurements for detecting quasiparticle excitations. Our theoretical findings constitute the first quantum memristor in a superconducting circuit and act as the starting point for designing further circuit elements that have non-Markovian characteristics. The authors acknowledge support from the CCQED EU project and the Finnish Cultural Foundation.
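For intuition about the memory effect itself (not the paper's quantum model), here is a minimal classical memristor toy in the HP/Joglekar style: the resistance depends on accumulated charge, producing the pinched hysteresis loop mentioned above. All parameter values are invented.

```python
import numpy as np

# Classical toy: M(q) interpolates between R_on and R_off with stored charge q.
R_on, R_off, q_max = 100.0, 16e3, 2e-5        # hypothetical device constants
t = np.linspace(0.0, 2.0, 20000)
v = 1.5 * np.sin(2 * np.pi * 2.0 * t)          # sinusoidal drive voltage

q, i_hist = 0.0, []
dt = t[1] - t[0]
for vk in v:
    w = min(max(q / q_max, 0.0), 1.0)          # normalized internal state
    M = R_off - (R_off - R_on) * w             # state-dependent resistance
    i = vk / M
    q += i * dt                                # the state integrates the current
    i_hist.append(i)

# Plotting v against i_hist traces a pinched hysteresis loop through the origin:
# the element "remembers" the charge that has passed through it.
```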
Optimal design of composite hip implants using NASA technology
NASA Technical Reports Server (NTRS)
Blake, T. A.; Saravanos, D. A.; Davy, D. T.; Waters, S. A.; Hopkins, D. A.
1993-01-01
Using an adaptation of NASA software, we investigated the use of numerical optimization techniques for the shape and material optimization of fiber composite hip implants. The NASA in-house codes were originally developed for the optimization of aerospace structures. The adapted code, called OPORIM, couples numerical optimization algorithms with finite element analysis and composite laminate theory to perform design optimization using both shape and material design variables. The external and internal geometry of the implant and the surrounding bone is described with quintic spline curves. This geometric representation is then used to create an equivalent 2-D finite element model of the structure. Using laminate theory and the 3-D geometric information, equivalent stiffnesses are generated for each element of the 2-D finite element model, so that the 3-D stiffness of the structure can be approximated. The geometric information used to construct the model of the femur was obtained from a CT scan. A variety of test cases was examined, incorporating several implant constructions and design variable sets. Typically the code was able to produce optimized shape and/or material parameters that substantially reduced stress concentrations in the bone adjacent to the implant. The results indicate that this technology can provide meaningful insight into the design of fiber composite hip implants.
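The laminate-theory step (turning a stack of plies into an equivalent stiffness) can be sketched with the classical A-matrix computation below; the ply properties and layup are invented, and this generic membrane-stiffness illustration is not OPORIM's actual procedure.

```python
import numpy as np

def qbar(E1, E2, G12, v12, theta_deg):
    """In-plane transformed reduced stiffness of one ply (classical laminate theory)."""
    v21 = v12 * E2 / E1
    d = 1 - v12 * v21
    Q11, Q22, Q12, Q66 = E1 / d, E2 / d, v12 * E2 / d, G12
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    Qb = np.empty((3, 3))
    Qb[0, 0] = Q11*c**4 + 2*(Q12 + 2*Q66)*c**2*s**2 + Q22*s**4
    Qb[1, 1] = Q11*s**4 + 2*(Q12 + 2*Q66)*c**2*s**2 + Q22*c**4
    Qb[0, 1] = Qb[1, 0] = (Q11 + Q22 - 4*Q66)*c**2*s**2 + Q12*(c**4 + s**4)
    Qb[0, 2] = Qb[2, 0] = (Q11 - Q12 - 2*Q66)*c**3*s - (Q22 - Q12 - 2*Q66)*c*s**3
    Qb[1, 2] = Qb[2, 1] = (Q11 - Q12 - 2*Q66)*c*s**3 - (Q22 - Q12 - 2*Q66)*c**3*s
    Qb[2, 2] = (Q11 + Q22 - 2*Q12 - 2*Q66)*c**2*s**2 + Q66*(c**4 + s**4)
    return Qb

# Hypothetical carbon/epoxy ply and a [0/45/-45/90] layup of 0.125 mm plies.
plies = [0, 45, -45, 90]
t_ply = 0.125e-3
A = sum(qbar(140e9, 10e9, 5e9, 0.3, th) * t_ply for th in plies)
print(A)   # equivalent membrane stiffness (N/m) standing in for ply-level detail
```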
Space Market Model: space industry input-output model
NASA Technical Reports Server (NTRS)
Hodgin, Robert F.; Marchesini, Roberto
1987-01-01
The goal of the Space Market Model (SMM) is to develop an information resource for the space industry. The SMM is intended to contain information appropriate for decision making in the space industry. The objectives of the SMM are to: (1) assemble information related to the development of the space business; (2) construct an adequate description of the emerging space market; (3) disseminate the information on the space market to forecasters and planners in government agencies and private corporations; and (4) provide timely analyses and forecasts of critical elements of the space market. An input-output model of market activity is proposed which is capable of transforming raw data into useful information for decision makers and policy makers dealing with the space sector.
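A Leontief input-output model, the standard formalism behind such industry models, can be sketched in a few lines; the two-sector technical coefficients and demand vector below are invented for illustration.

```python
import numpy as np

# Hypothetical technical-coefficients matrix A: A[i, j] is the input from
# sector i needed to produce one unit of output in sector j.
A = np.array([[0.10, 0.25],
              [0.30, 0.15]])
d = np.array([50.0, 80.0])          # final demand per sector

# Leontief relation: x = A x + d  =>  x = (I - A)^{-1} d
x = np.linalg.solve(np.eye(2) - A, d)
print(x)                            # total output each sector must produce
```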
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2013-02-01
This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse-engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using Geometric Description Language (GDL), an embedded programming language within the ArchiCAD BIM software. Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed, conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.
Carbonatites of the World, Explored Deposits of Nb and REE - Database and Grade and Tonnage Models
Berger, Vladimir I.; Singer, Donald A.; Orris, Greta J.
2009-01-01
This report is based on published tonnage and grade data on 58 Nb- and rare-earth-element (REE)-bearing carbonatite deposits that are mostly well explored and are partially mined or contain resources of these elements. The deposits represent only a part of the 527 known carbonatites around the world, but they are characterized by reliable quantitative data on ore tonnages and grades of niobium and REE. Grade and tonnage models are an important component of mineral resource assessments. Carbonatites present one of the main natural sources of niobium and rare-earth elements, whose economic importance grows consistently. One purpose of this report is to update earlier publications. New information about known deposits, as well as data on new deposits published during the last decade, is incorporated in the present paper. The compiled database (appendix 1) contains 60 explored Nb- and REE-bearing carbonatite deposits; the resources of 55 of these deposits are taken from publications. In the present updated grade-tonnage model we have added 24 deposits compared with the previous model of Singer (1998). The resources of most deposits are residuum ores in the upper part of carbonatite bodies. Mineral-deposit models are important in exploration planning and quantitative resource assessments for two reasons: (1) grades and tonnages among deposit types vary significantly, and (2) deposits of different types are present in distinct geologic settings that can be identified from geologic maps. Mineral-deposit models combine the diverse geoscience information on geology, mineral occurrences, geophysics, and geochemistry used in resource assessments and mineral exploration. Globally based deposit models allow recognition of important features and demonstrate how common different features are. Well-designed deposit models allow geologists to deduce possible mineral-deposit types in a given geologic environment, and the grade and tonnage models allow economists to estimate the possible economic viability of these resources. Thus, mineral-deposit models play a central role in presenting geoscience information in a useful form to policy makers. The foundation of mineral-deposit models is information about known deposits. This publication presents the latest geologic information and newly developed grade and tonnage models for Nb- and REE-carbonatite deposits in digital form. The publication contains computer files with information on deposits from around the world. It also contains a text file allowing locations of all deposits to be plotted in geographic information system (GIS) programs. The data are presented in FileMaker Pro as well as in .xls and text files to make the information available to a broadly based audience. The value of this information and any derived analyses depends critically on the consistent manner of data gathering. For this reason, we first discuss the rules used in this compilation. Next, the fields of the database are explained. Finally, we provide new grade and tonnage models and analysis of the information in the file.
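Grade-tonnage models are conventionally summarized by lognormal percentiles of deposit size; the sketch below computes the usual 90th/50th/10th tonnage percentiles from a hypothetical set of deposit tonnages (the values are invented, not taken from the report's database).

```python
import numpy as np
from scipy.stats import norm

# Hypothetical tonnages (Mt) for a handful of carbonatite deposits.
tonnage = np.array([12.0, 35.0, 4.2, 150.0, 28.0, 9.5, 60.0, 2.1])

log_t = np.log10(tonnage)
mu, sigma = log_t.mean(), log_t.std(ddof=1)    # lognormal fit in log10 space

# Percentiles as reported in grade-tonnage models: 90% of deposits exceed P90.
for p in (0.90, 0.50, 0.10):
    print(f"P{int(p * 100)}: {10 ** norm.ppf(1 - p, mu, sigma):.1f} Mt")
```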
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Bak, M. J.; Nakazawa, S.; Banerjee, P. K.
1984-01-01
The 3-D Inelastic Analysis Methods program consists of a series of computer codes embodying a progression of mathematical models (mechanics of materials, special finite element, boundary element) for streamlined analysis of combustor liners, turbine blades, and turbine vanes. These models address the effects of high temperatures and thermal/mechanical loadings on the local (stress/strain) and global (dynamics, buckling) structural behavior of the three selected components. The models solve 3-D inelastic problems using linear approximations, in the sense that stresses/strains and temperatures in generic modeling regions are linear functions of the spatial coordinates, and solution increments for load, temperature and/or time are extrapolated linearly from previous information. Three linear formulation computer codes, referred to as MOMM (Mechanics of Materials Model), MHOST (MARC-Hot Section Technology), and BEST (Boundary Element Stress Technology), were developed and are described.
Relief diffracted elements recorded on absorbent photopolymers.
Gallego, S; Márquez, A; Ortuño, M; Francés, J; Pascual, I; Beléndez, A
2012-05-07
Relief surface changes provide interesting possibilities for storing diffractive optical elements on photopolymers and are an important source of information for characterizing and understanding the material behavior. In this paper we use a 3-dimensional model, based on direct parameter measurements, to predict the relief structures generated on photopolymers without a coverplate. We have analyzed different spatial frequencies and recording intensity distributions, such as binary and blazed periodic patterns. The model was successfully applied to different photopolymers with different values of monomer diffusion.
Analysis of temperature influence on the informative parameters of single-coil eddy current sensors
NASA Astrophysics Data System (ADS)
Borovik, S. Yu.; Kuteynikova, M. M.; Sekisov, Yu. N.; Skobelev, O. P.
2017-07-01
This paper studies the influence of temperature in the flow path of a turbine on the informative parameters (the equivalent inductances of the primary windings of matching transformers) of single-coil eddy-current sensors with a sensitive element in the form of a conductor section, which are used in automation systems for testing gas-turbine engines. Both the sensors and the controlled turbine blades are subject to these temperature influences. The existing model of the electromagnetic interaction of a sensitive element with the end part of a controlled blade is used to obtain quantitative estimates of the temperature-induced changes in the equivalent inductances of the sensitive elements and the primary windings of the matching transformers. The same model is used to determine the corresponding changes in the informative parameter of the sensor during experimental studies of temperature influences on it (in the absence of blades in the sensitive region). The paper also presents the relationships between the informative parameters and the radial and axial displacements at normal (20 °C) and nominal (1000 °C) temperatures; their difference is used to determine families of temperature-dependence functions that characterize the possible temperature errors for any radial and axial displacements within their ranges of variation.
General Nonlinear Ferroelectric Model v. Beta
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Wen; Robbins, Josh
2017-03-14
The purpose of this software is to function as a generalized ferroelectric material model. The material model is designed to work with existing finite element packages by providing updated information on material properties that are nonlinear and dependent on loading history. The two major nonlinear phenomena this model captures are domain-switching and phase transformation. The software itself does not contain potentially sensitive material information and instead provides a framework for different physical phenomena observed within ferroelectric materials. The model is calibrated to a specific ferroelectric material through input parameters provided by the user.
A Maturity Model for Assessing the Use of ICT in School Education
ERIC Educational Resources Information Center
Solar, Mauricio; Sabattin, Jorge; Parada, Victor
2013-01-01
This article describes an ICT-based and capability-driven model for assessing ICT in education capabilities and maturity of schools. The proposed model, called ICTE-MM (ICT in School Education Maturity Model), has three elements supporting educational processes: information criteria, ICT resources, and leverage domains. Changing the traditional…
A high-order multiscale finite-element method for time-domain acoustic-wave modeling
NASA Astrophysics Data System (ADS)
Gao, Kai; Fu, Shubin; Chung, Eric T.
2018-05-01
Accurate and efficient wave equation modeling is vital for many applications in fields such as acoustics, electromagnetics, and seismology. However, solving the wave equation in large-scale and highly heterogeneous models is usually computationally expensive because the computational cost is directly proportional to the number of grids in the model. We develop a novel high-order multiscale finite-element method to reduce the computational cost of time-domain acoustic-wave equation numerical modeling by solving the wave equation on a coarse mesh based on the multiscale finite-element theory. In contrast to existing multiscale finite-element methods that use only first-order multiscale basis functions, our new method constructs high-order multiscale basis functions from local elliptic problems which are closely related to the Gauss-Lobatto-Legendre quadrature points in a coarse element. Essentially, these basis functions are not only determined by the order of Legendre polynomials, but also by local medium properties, and therefore can effectively convey the fine-scale information to the coarse-scale solution with high-order accuracy. Numerical tests show that our method can significantly reduce the computation time while maintaining high accuracy for wave equation modeling in highly heterogeneous media by solving the corresponding discrete system only on the coarse mesh with the new high-order multiscale basis functions.
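To make the idea of a multiscale basis concrete, the sketch below builds the classical first-order multiscale basis on one 1D coarse element by solving the local elliptic problem (c u')' = 0, whose solution weights the basis by the local coefficient; the paper's high-order construction generalizes this. The coefficient c(x) is invented.

```python
import numpy as np

# Fine grid on one coarse element [0, H]; rough coefficient c(x) is hypothetical.
H, n = 1.0, 400
x = np.linspace(0.0, H, n + 1)
c = 1.0 + 0.9 * np.sin(40 * np.pi * x) ** 2        # fine-scale medium property

# The local problem (c u')' = 0 with u(0)=0, u(H)=1 has the closed form
#   u(x) = int_0^x dt/c(t) / int_0^H dt/c(t),
# so the basis simply accumulates the reciprocal coefficient (trapezoid rule).
w = np.concatenate(([0.0], np.cumsum(0.5 * (1 / c[1:] + 1 / c[:-1]) * np.diff(x))))
phi_right = w / w[-1]           # multiscale basis attached to the right node
phi_left = 1.0 - phi_right     # and its mirror attached to the left node

# Unlike linear hat functions, phi encodes the fine-scale medium, which is what
# lets a coarse-mesh solve retain fine-scale information.
print(phi_right[:5], phi_right[-1])
```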
Capelli, Claudio; Biglino, Giovanni; Petrini, Lorenza; Migliavacca, Francesco; Cosentino, Daria; Bonhoeffer, Philipp; Taylor, Andrew M; Schievano, Silvia
2012-12-01
Finite element (FE) modelling can be a powerful tool in the field of cardiovascular devices. To ensure result reliability, FE models must be validated experimentally against physical data. Their clinical application (e.g., assessing patient suitability, morphological evaluation) also requires a fast simulation process and quick access to results, while engineering applications need highly accurate results. This study shows how FE models with different mesh discretisations can suit both clinical and engineering requirements for studying a novel device designed for percutaneous valve implantation. Following sensitivity analysis and experimental characterisation of the materials, the stent-graft was first studied in a simplified geometry (i.e., a compliant cylinder) and validated against in vitro data, and then in a patient-specific implantation site (i.e., a distensible right ventricular outflow tract). Different meshing strategies using solid, beam and shell elements were tested. Results showed excellent agreement between computational and experimental data in the simplified implantation site. Beam elements were found to be convenient for clinical applications, providing reliable results in less than one hour in a patient-specific anatomical model. Solid elements remain the FE choice for engineering applications, albeit more computationally expensive (>100 times). This work also showed how information on device mechanical behaviour differs when acquired in a simplified model as opposed to a patient-specific model.
Equity Access Plans: A Regulatory and Educational State Response Model.
ERIC Educational Resources Information Center
DeLisle, James
1984-01-01
Introduces the basic notion of equity access plans as property-based solutions to the cash flow needs of elderly homeowners and then proposes a normative response model that states can adopt to help manage the risk exposures. The recommended model incorporates regulatory, information dissemination, and educational elements. (BH)
NASA Astrophysics Data System (ADS)
Gigante-Barrera, Ángel; Dindar, Serdar; Kaewunruen, Sakdirat; Ruikar, Darshan
2017-10-01
Railway turnouts are complex systems designed using complex geometries and grades, which makes them difficult to manage in terms of risk prevention. This feature poses a substantial peril to rail users, as it is considered a cause of derailment. In addition, derailment leads to financial losses due to operational downtime and to monetary compensation in cases of death or injury. These are fundamental drivers for mitigating the risks that arise from poor risk management during design. Prevention through Design (PtD) is a process that introduces tacit knowledge from industry professionals during the design process. There is evidence that Building Information Modelling (BIM) can help to mitigate risk from the inception of the project. BIM is considered an Information System (IS) where tacit knowledge can be stored in and retrieved from a digital database, making it easy to take prompt decisions because the information is ready to be analysed. BIM at the model element level entails working with 3D elements and embedded data, therefore adding a layer of complexity to the management of information along the different stages of the project and across different disciplines. To overcome this problem, the industry has created a framework for model progression specification named Level of Development (LOD). The paper presents an IDM-based framework for design risk mitigation through code validation using the LOD. This effort resulted in risk datasets that describe a rail turnout graphically and non-graphically as the model progresses, thus permitting their inclusion within risk information systems. The assignment of an LOD construct to a set of data requires specialised management and process-related expertise. Furthermore, the selection of a set of LOD constructs requires a purpose-based analysis. Therefore, a framework for implementing LOD constructs within the IDM for code checking is required for the industry to progress in this particular field.
Creating a Realistic IT Vision: The Roles and Responsibilities of a Chief Information Officer.
ERIC Educational Resources Information Center
Penrod, James I.
2003-01-01
Discusses the crucial position of the chief information officer (CIO) at higher education institutions and reviews the six major stages of information technology (IT) planning. Includes fundamental elements related to an IT vision; roles of the CIO; the six-stage planning model for a realistic IT vision; and factors for success. (AEF)
Intelligent Integrated Health Management for a System of Systems
NASA Technical Reports Server (NTRS)
Smith, Harvey; Schmalzel, John; Figueroa, Fernando
2008-01-01
An intelligent integrated health management system (IIHMS) incorporates major improvements over prior such systems. The particular IIHMS is implemented for any system defined as a hierarchical distributed network of intelligent elements (HDNIE), comprising primarily: (1) an architecture (Figure 1), (2) intelligent elements, (3) a conceptual framework and taxonomy (Figure 2), and (4) an ontology that defines standards and protocols. Some definitions of terms are prerequisite to a further brief description of this innovation. A system-of-systems (SoS) is an engineering system that comprises multiple subsystems (e.g., a system of multiple, possibly interacting flow subsystems that include pumps, valves, tanks, ducts, sensors, and the like). 'Intelligent' is used here in the sense of artificial intelligence; an intelligent element may be physical or virtual, it is network-enabled, and it is able to manage data, information, and knowledge (DIaK) focused on determining its condition in the context of the entire SoS. As used here, 'health' signifies the functionality and/or structural integrity of an engineering system, subsystem, or process (leading to determination of the health of components). 'Process' can signify either a physical process in the usual sense of the word or an element into which functionally related sensors are grouped. 'Element' can signify a component (e.g., an actuator, a valve), a process, a controller, a subsystem, or a system. The term Integrated System Health Management (ISHM) describes a capability that focuses on determining the condition (health) of every element in a complex system (detecting anomalies, diagnosing causes, and predicting future anomalies) and on providing data, information, and knowledge (DIaK), not just data, to control systems for safe and effective operation. A major novel aspect of the present development is the concept of intelligent integration. The purpose of intelligent integration, as defined and implemented in the present IIHMS, is to enable automated analysis of physical phenomena in imitation of human reasoning, including the use of qualitative methods. Intelligent integration is said to occur in a system in which all elements are intelligent and can acquire, maintain, and share knowledge and information. In the HDNIE of the present IIHMS, an SoS is represented as being operationally organized in a hierarchical-distributed format. The elements of the SoS are considered intelligent in that they determine their own conditions within an integrated scheme that involves consideration of data, information, knowledge bases, and methods that reside in all elements of the system. The conceptual framework of the HDNIE and the methodologies for implementing it enable the flow of information and knowledge among the elements so as to make possible the determination of the condition of each element. The necessary information and knowledge are made available to each affected element at the desired time, satisfying the need to prevent information overload while providing context-sensitive information at the proper level of detail. Provision of high-quality data is a central goal in designing this or any IIHMS. In pursuit of this goal, functionally related sensors are logically assigned to groups denoted processes. An aggregate of processes is considered to form a system.
Alternatively, or in addition to what has been said thus far, the HDNIE of this IIHMS can be regarded as consisting of a framework containing object models that encapsulate all elements of the system, their individual and relational knowledge bases, generic methods and procedures based on models of the applicable physics, and communication processes (Figure 2). The framework enables implementation of a paradigm inspired by how expert operators monitor the health of systems with the help of (1) DIaK from various sources, (2) software tools that assist in rapid visualization of the condition of the system, (3) analytical software tools that assist in reasoning about the condition, (4) sharing of information via network communication hardware and software, and (5) software tools that aid in making decisions to remedy unacceptable conditions or improve performance.
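As a loose illustration of a hierarchical network of intelligent elements, the sketch below lets each element report its own health and roll the results up the hierarchy; the class names and the simple worst-case aggregation rule are invented, not taken from the IIHMS.

```python
from dataclasses import dataclass, field

@dataclass
class IntelligentElement:
    name: str
    health: float = 1.0                    # 1.0 = nominal, 0.0 = failed
    children: list = field(default_factory=list)

    def assess(self) -> float:
        """Each element determines its condition from its own state and
        from the conditions shared by the elements below it."""
        if not self.children:
            return self.health
        # Invented aggregation rule: a parent is no healthier than its
        # own state or its worst child.
        return min([self.health] + [c.assess() for c in self.children])

# Hypothetical system-of-systems: sensors grouped into a process,
# processes grouped into a subsystem.
pump_temp = IntelligentElement("pump-temp-sensor", health=0.6)
pump_flow = IntelligentElement("pump-flow-sensor")
pump_proc = IntelligentElement("pump-process", children=[pump_temp, pump_flow])
flow_sys = IntelligentElement("flow-subsystem", children=[pump_proc])

print(flow_sys.assess())   # 0.6: the anomaly propagates up with its context
```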
Modeling and Control of the Redundant Parallel Adjustment Mechanism on a Deployable Antenna Panel
Tian, Lili; Bao, Hong; Wang, Meng; Duan, Xuechao
2016-01-01
With the aim of developing multiple-input and multiple-output (MIMO) coupling systems with a redundant parallel adjustment mechanism on the deployable antenna panel, a structural-control integrated design methodology is proposed in this paper. First, the modal information is extracted from the finite element model of the antenna panel structure, and the mathematical model is established with the Hamilton principle. Second, a discrete Linear Quadratic Regulator (LQR) controller is added to the model in order to control the actuators and adjust the shape of the panel. Finally, the engineering practicality of the modeling and control method based on finite element analysis simulation is verified. PMID:27706076
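A discrete LQR gain of the kind mentioned above is computed from the discrete algebraic Riccati equation; the sketch below uses SciPy on an invented two-state system standing in for the panel's modal dynamics.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Hypothetical discrete-time modal dynamics x[k+1] = A x[k] + B u[k].
A = np.array([[1.0, 0.01],
              [-0.5, 0.98]])
B = np.array([[0.0],
              [0.01]])
Q = np.diag([10.0, 1.0])     # penalize modal displacement more than velocity
R = np.array([[0.1]])        # actuator effort penalty

P = solve_discrete_are(A, B, Q, R)                       # Riccati solution
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)        # u[k] = -K x[k]
print(K)

# Closed-loop check: eigenvalues of (A - B K) should lie inside the unit circle.
print(np.abs(np.linalg.eigvals(A - B @ K)))
```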
Promoting Model-based Definition to Establish a Complete Product Definition
Ruemler, Shawn P.; Zimmerman, Kyle E.; Hartman, Nathan W.; Hedberg, Thomas; Feeny, Allison Barnard
2016-01-01
The manufacturing industry is evolving and starting to use 3D models as the central knowledge artifact for product data and product definition, or what is known as Model-based Definition (MBD). The Model-based Enterprise (MBE) uses MBD as a way to transition away from traditional paper-based drawings and documentation. As MBD grows in popularity, it is imperative to understand what information is needed in the transition from drawings to models so that models represent all the relevant information needed for processes to continue efficiently. Finding this information can help define what data is common amongst different models in different stages of the lifecycle, which could help establish a Common Information Model. The Common Information Model is a source that contains common information from domain-specific elements amongst different aspects of the lifecycle. To help establish this Common Information Model, information about how models are used in industry within different workflows needs to be understood. To retrieve this information, a survey was administered to industry professionals from various sectors. Based on the results of the survey, a Common Information Model could not yet be established; however, the results give insight that will help in further investigation of the Common Information Model. PMID:28070155
Southern Great Plains Rapid Ecoregional Assessment: pre-assessment report
Assal, Timothy J.; Melcher, Cynthia P.; Carr, Natasha B.
2015-01-01
An overview of the ecology and management issues for each Conservation Element is provided, including distribution and ecology, landscape structure and dynamics, and the associated species of management concern affiliated with each Conservation Element. For each Conservation Element, the effects of the Change Agents are described. Potential key ecological attributes and potential Change Agents are summarized in conceptual models and tables. The tables provide an organizational framework and background information for evaluating the key ecological attributes and Change Agents in Phase II.
Analysis of central enterprise architecture elements in models of six eHealth projects.
Virkanen, Hannu; Mykkänen, Juha
2014-01-01
Large-scale initiatives for eHealth services have been established in many countries on regional or national level. The use of Enterprise Architecture has been suggested as a methodology to govern and support the initiation, specification and implementation of large-scale initiatives including the governance of business changes as well as information technology. This study reports an analysis of six health IT projects in relation to Enterprise Architecture elements, focusing on central EA elements and viewpoints in different projects.
Kaehler, Laura A.; Jacobs, Mary; Jones, Deborah J.
2016-01-01
There is a shift in evidence-based practice toward an understanding of the treatment elements that characterize empirically supported interventions in general and the core components of specific approaches in particular. The evidence base for Behavioral Parent Training (BPT), the standard of care for early-onset disruptive behavior disorders (Oppositional Defiant Disorder and Conduct Disorder), which frequently co-occur with Attention Deficit Hyperactivity Disorder, is well established; yet an ahistorical, program-specific lens tells little about how leaders, including Constance Hanf at the University of Oregon, shaped the common practice elements of contemporary evidence-based BPT. Accordingly, this review summarizes the formative work of Hanf, as well as the core elements, evolution, and extensions of her work, represented in Community Parent Education (COPE; Cunningham, Bremner, & Boyle, 1995; Cunningham, Bremner, Secord, & Harrison, 2009), Defiant Children (DC; Barkley, 1987; Barkley, 2013), Helping the Noncompliant Child (HNC; Forehand & McMahon, 1981; McMahon & Forehand, 2003), Parent-Child Interaction Therapy (PCIT; Eyberg & Robinson, 1982; Eyberg, 1988; Eyberg & Funderburk, 2011), and the Incredible Years (IY; Webster-Stratton, 1981; 1982; 2008). Our goal is not to provide an exhaustive review of the evidence base for the Hanf-model programs; rather, our intention is to provide a template of sorts from which agencies and clinicians can make informed choices about how and why they are using one program versus another, as well as how to make informed, flexible use of one program, or of a combination of practice elements across programs, to best meet the needs of child clients and their families. Clinical implications and directions for future work are discussed. PMID:27389606
Reconstructing photorealistic 3D models from image sequence using domain decomposition method
NASA Astrophysics Data System (ADS)
Xiong, Hanwei; Pan, Ming; Zhang, Xiangwei
2009-11-01
In the fields of industrial design, artistic design, and heritage conservation, physical objects are usually digitalized by reverse engineering through 3D scanning. Structured light and photogrammetry are the two main methods of acquiring 3D information, and both are expensive. Even when these expensive instruments are used, photorealistic 3D models are seldom obtained. In this paper, a new method for reconstructing photorealistic 3D models using a single camera is proposed. A square plate glued with coded marks is used to hold the objects, and a sequence of about 20 images is taken. From the coded marks, the images are calibrated, and a snake algorithm is used to segment the object from the background. A rough 3D model is obtained using a shape-from-silhouettes algorithm. The silhouettes are decomposed into a combination of convex curves, which are used to partition the rough 3D model into convex mesh patches. For each patch, the multi-view photo-consistency constraints and smoothness regularizations are expressed as a finite element formulation, which can be resolved locally, with information exchanged along the patch boundaries. The rough model is deformed into a fine 3D model through this domain-decomposition finite element method. Textures are assigned to each element mesh, and a photorealistic 3D model is finally obtained. A toy pig is used to verify the algorithm, and the results are promising.
Scalable, MEMS-enabled, vibrational tactile actuators for high resolution tactile displays
NASA Astrophysics Data System (ADS)
Xie, Xin; Zaitsev, Yuri; Velásquez-García, Luis Fernando; Teller, Seth J.; Livermore, Carol
2014-12-01
The design, fabrication, and characterization of a new type of tactile display for people with blindness or low vision are reported. Each tactile element comprises a piezoelectric extensional actuator that vibrates in plane, with a microfabricated scissor mechanism to convert the in-plane actuations into robust, higher-amplitude, out-of-plane (vertical) vibrations that are sensed with the finger pads. When the tactile elements are formed into a 2D array, information can be conveyed to the user by varying the pattern of vibrations in space and time. Analytical models and finite element analysis were used to design the individual tactile elements, which were implemented with PZT actuators and both SU-8 and 3D-printed scissor amplifiers. The measured displacements of these 3 mm × 10 mm, MEMS-enabled tactile elements exceed 10 µm, in agreement with the models, with measured forces exceeding 45 mN. The performance of the MEMS-enabled tactile elements is compared with that of larger, fully macroscale tactile elements to demonstrate the scale dependence of the devices. The creation of a 28-element prototype is also reported, and the qualitative user experience with the individual tactile elements and displays is described.
A School-Based Mental Health Consultation Curriculum.
ERIC Educational Resources Information Center
Sandoval, Jonathan; Davis, John M.
1984-01-01
Presents one position on consultation that integrates a theoretical model, a process model, and a curriculum for training school-based mental health consultants. Elements of the proposed curriculum include: ethics, relationship building, maintaining rapport, defining problems, gathering data, sharing information, generating and supporting…
Why are coast redwood and giant sequoia not where they are not?
W.J. Libby
2017-01-01
Models predicting future climates and other kinds of information are being developed to anticipate where these two species may fail, where they may continue to thrive, and where they may colonize, given changes in climate and other elements of the environment. Important elements of such predictions, among others, are: photoperiod; site qualities; changes in levels and...
Ryan, Patrick H; Brokamp, Cole; Fan, Zhi-Hua; Rao, M B
2015-12-01
The complex mixture of chemicals and elements that constitutes particulate matter (PM) varies by season and geographic location because source contributions differ over time and place. The composition of PM having an aerodynamic diameter < 2.5 μm (PM2.5) is hypothesized to be responsible, in part, for its toxicity. Epidemiologic studies have identified specific components and sources of PM2.5 that are associated with adverse health outcomes. The majority of these studies use measures of outdoor concentrations obtained from one or a few central monitoring sites as a surrogate for measures of personal exposure. Personal PM2.5 (and its elemental composition), however, may differ from the PM2.5 measured at stationary outdoor sites. The objectives of this study were (1) to describe the relationships between the concentrations of various elements in indoor, outdoor, and personal PM2.5 samples, (2) to identify groups of individuals with similar exposures to mixtures of elements in personal PM2.5 and to examine the personal and home characteristics of these groups, and (3) to evaluate whether concentrations of elements from outdoor PM2.5 samples are appropriate surrogates for personal exposure to PM2.5 and its elements, and whether indoor PM2.5 concentrations and information about home characteristics improve the prediction of personal exposure. The objectives were addressed using data collected as part of the Relationships of Indoor, Outdoor, and Personal Air (RIOPA) study. The RIOPA study had previously measured the mass concentrations of PM2.5 and its elemental constituents during 48-hour concurrent indoor, outdoor (directly outside the home), and personal sampling periods in three urban areas (Los Angeles, California; Houston, Texas; and Elizabeth, New Jersey). The resulting data and information about personal and home characteristics (including air-conditioning use, nearby emission sources, time spent indoors, census-tract geography, air-exchange rates, and other information) for each RIOPA participant were downloaded from the RIOPA study database. We performed three sets of analyses to address the study aims. First, we conducted descriptive analyses of the relationships between elemental concentrations in the concurrently gathered indoor, outdoor, and personal air samples. We assessed the correlations between personal exposure and indoor concentrations, and between personal exposure and outdoor concentrations, of each element, and calculated the ratios between them. In addition, we performed principal component analysis (PCA) and calculated principal component scores (PCSs) to examine the heterogeneity of the elemental composition, and then tested whether the mixture of elements in indoor, outdoor, and personal PM2.5 was significantly different within each study site and across study sites. Second, we performed model-based clustering analysis to group RIOPA participants with similar exposures to mixtures of elements in personal PM2.5 and examined the association between cluster membership, the concentrations of elements in indoor and outdoor PM2.5 samples, and personal and home characteristics. Finally, we developed a series of linear regression models and random forest models to examine the association between personal exposure to elements in PM2.5 and (1) outdoor measurements, (2) outdoor and indoor measurements, and (3) outdoor and indoor measurements plus home characteristics.
As we developed each model, the improvement in prediction of personal exposure when including additional information was assessed. Personal exposures to PM2.5 and to most elements were significantly correlated with both indoor and outdoor concentrations, although concentrations in personal samples frequently exceeded those of indoor and outdoor samples. In general, for most PM2.5 elements indoor concentrations were more highly correlated with personal exposure than were outdoor concentrations. PCA showed that the mixture of elements in indoor, outdoor, and personal PM2.5 varied significantly across sample types within each study site and also across study sites within each sample type. Using model-based clustering, we identified seven clusters of RIOPA participants whose personal PM2.5 samples had similar patterns of elemental composition. Using this approach, subsets of RIOPA participants were identified whose personal exposures to PM2.5 (and its elements) were significantly higher than their indoor and outdoor concentrations (and vice versa). The results of linear and random forest regression models were consistent with our correlation analyses and demonstrated that (1) indoor concentrations were more significantly associated with personal exposure than were outdoor concentrations and (2) participant reports of time spent at their home significantly modified many of the associations between indoor and personal concentrations. In linear regression models, the inclusion of indoor concentrations significantly improved the prediction of personal exposures to Ba, Ca, Cl, Cu, K, Sn, Sr, V, and Zn compared with the use of outdoor elemental concentrations alone. Including additional information on personal and home characteristics improved the prediction for only one element, Pb. Our results support the use of outdoor monitoring sites as surrogates of personal exposure for a limited number of individual elements associated with long-range transport and with a few local or indoor sources. Based on our PCA and clustering analyses, we concluded that the overall elemental composition of PM2.5 obtained at outdoor monitoring sites may not accurately represent the elemental composition of personal PM2.5. Although the data used in these analyses compared outdoor PM2.5 composition collected at the home with indoor and personal samples, our results imply that studies examining the complete elemental composition of PM2.5 should be cautious about using data from central outdoor monitoring sites because of the potential for exposure misclassification. The inclusion of personal and home characteristics only marginally improved the prediction of personal exposure for a small number of elements in PM2.5. We concluded that the additional cost and burden of indoor and personal sampling may be justified for studies examining elements because neither outdoor monitoring nor questionnaire data on home and personal characteristics were able to represent adequately the overall elemental composition of personal PM2.5.
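The model comparison described here (outdoor-only versus outdoor-plus-indoor predictors, linear versus random forest) can be sketched as follows; the synthetic data stand in for the RIOPA measurements, so the numbers mean nothing beyond illustrating the workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
outdoor = rng.lognormal(1.0, 0.5, n)                 # synthetic outdoor element conc.
indoor = 0.6 * outdoor + rng.lognormal(0.5, 0.4, n)  # indoor partly tracks outdoor
time_home = rng.uniform(0.3, 1.0, n)                 # fraction of time spent at home
personal = time_home * indoor + (1 - time_home) * outdoor + rng.normal(0, 0.3, n)

X_out = outdoor.reshape(-1, 1)
X_full = np.column_stack([outdoor, indoor, time_home])

for name, X in [("outdoor only", X_out), ("outdoor+indoor+home", X_full)]:
    for model in (LinearRegression(),
                  RandomForestRegressor(n_estimators=200, random_state=0)):
        r2 = cross_val_score(model, X, personal, cv=5, scoring="r2").mean()
        print(f"{name:22s} {type(model).__name__:22s} R^2 = {r2:.2f}")
```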
A Modeling Approach for Burn Scar Assessment Using Natural Features and Elastic Property
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsap, L V; Zhang, Y; Goldgof, D B
2004-04-02
A modeling approach is presented for quantitative burn scar assessment. Emphases are given to: (1) constructing a finite element model from natural image features with an adaptive mesh, and (2) quantifying the Young's modulus of scars using the finite element model and the regularization method. A set of natural point features is extracted from the images of burn patients. A Delaunay triangle mesh is then generated that adapts to the point features. A 3D finite element model is built on top of the mesh with the aid of range images providing the depth information. The Young's modulus of scars is quantified with a simplified regularization functional, assuming that knowledge of the scar's geometry is available. The consistency between the Relative Elasticity Index and the physician's rating based on the Vancouver Scale (a relative scale used to rate burn scars) indicates that the proposed modeling approach has high potential for image-based quantitative burn scar assessment.
Chen, Yung-Chuan; Tu, Yuan-Kun; Zhuang, Jun-Yan; Tsai, Yi-Jung; Yen, Cheng-Yo; Hsiao, Chih-Kun
2017-11-01
A three-dimensional dynamic elastoplastic finite element model was constructed, experimentally validated, and used to investigate the parameters that influence bone temperature during drilling, including drill speed, feeding force, drill bit diameter, and bone density. Results showed that the proposed three-dimensional dynamic elastoplastic finite element model can effectively simulate the temperature elevation during bone drilling. The bone temperature rise decreased with an increase in feeding force and drill speed, but increased with the drill bit diameter and bone density. The temperature distribution is significantly affected by the drilling duration; a lower drilling speed reduced the exposure duration, decreasing the extent of the thermally affected zone. The constructed model could be applied to analyze the influential parameters during bone drilling to reduce the risk of thermal necrosis. It may provide important information for the design of drill bits and surgical drills.
Feasibility of an anticipatory noncontact precrash restraint actuation system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kercel, S.W.; Dress, W.B.
1995-12-31
The problem of providing an electronic warning of an impending crash to a precrash restraint system a fraction of a second before physical contact differs from more widely explored problems, such as providing several seconds of crash warning to a driver. One approach to precrash restraint sensing is to apply anticipatory system theory. This consists of nested simplified models of the system to be controlled and of the system's environment. It requires sensory information to describe the "current state" of the system and the environment. The models use the sensory data to make a faster-than-real-time prediction about the near future. Anticipation theory is well founded but rarely used. A major problem is to extract real-time current-state information from inexpensive sensors. Providing current-state information to the nested models is the weakest element of the system. Therefore, sensors and real-time processing of sensor signals command the most attention in an assessment of system feasibility. This paper describes the problem definition, potential "showstoppers," and ways to overcome them. It includes experiments showing that inexpensive radar is a practical sensing element. It considers fast and inexpensive algorithms to extract information from sensor data.
Yang, Suixing; Feng, Jing; Zhang, Zuo; Qu, Aili; Gong, Miao; Tang, Jie; Fan, Junheng; Li, Songqing; Zhao, Yanling
2013-04-01
To construct a three-dimensional finite element model of the upper airway and adjacent structures of an obstructive sleep apnea-hypopnea syndrome (OSAHS) patient for biomechanical analysis, and to study the influence of titrated mandible advancement on the glossopharynx of an OSAHS patient with the three-dimensional finite element model. DICOM-format image information of an OSAHS patient's upper airway was obtained by thin-section CT scanning, and digital image processing was utilized to construct a three-dimensional finite element model with the Mimics 10.0, Imageware 10.0 and Ansys software. The biomechanical and morphological changes of the glossopharynx were observed after loading with titrated mandible advancement. A three-dimensional finite element model of the upper airway and adjacent structures of an OSAHS patient was established successfully. After loading, the transverse diameter of the epiglottis tip of the glossopharynx increased significantly, although the sagittal diameter decreased correspondingly. The principal stress was mainly distributed in the anterior wall of the upper airway, and the location of the principal stress concentration did not change significantly with increasing advancement distance. The stress on the glossopharynx increased during titrated mandible advancement. A more precise three-dimensional finite element model of the upper airway and adjacent structures of an OSAHS patient was established efficiently with the Mimics, Imageware and Ansys software, and analysis of the model under titrated mandible advancement can effectively show the relationship between mandible advancement and the glossopharynx.
Finite element analysis of mechanical behavior of human dysplastic hip joints: a systematic review.
Vafaeian, B; Zonoobi, D; Mabee, M; Hareendranathan, A R; El-Rich, M; Adeeb, S; Jaremko, J L
2017-04-01
Developmental dysplasia of the hip (DDH) is a common condition predisposing to osteoarthritis (OA). Especially since DDH is best identified and treated in infancy before the bones ossify, there is a surprising near-complete absence of literature examining the mechanical behavior of infant dysplastic hips. We sought to identify current practice in finite element modeling (FEM) of DDH, to inform future modeling of infant dysplastic hips. We performed a multi-database systematic review using PRISMA criteria. Abstracts (n = 126) fulfilling the inclusion criteria were screened for methodological quality, and results were analyzed and summarized for the eligible articles (n = 12). The majority of the studies modeled human adult dysplastic hips. Two studies focused on the etiology of DDH by simulating the mechanobiological growth of prenatal hips; we found no FEM-based studies in infants or children. Finite element models used either patient-specific geometry or idealized average geometry. Diversity in the choice of material properties, boundary conditions, and loading scenarios was found among the finite element models. FEM of adult dysplastic hips demonstrated generally smaller cartilage contact areas in dysplastic hips than in normal joints. Contact pressure may be higher or lower in dysplastic hips depending on joint geometry and the mechanical contribution of the labrum. FEM of the mechanobiological growth of prenatal hip joints revealed evidence for effects of the joint mechanical environment on the formation of coxa valga, an asymmetrically shallow acetabulum, and the malformed femoral head associated with DDH. Future modeling informed by the results of this review may yield valuable insights into the optimal treatment of DDH, and into how and why OA develops early in DDH.
Ogunyemi, Omolola I; Meeker, Daniella; Kim, Hyeon-Eui; Ashish, Naveen; Farzaneh, Seena; Boxwala, Aziz
2013-08-01
The need for a common format for electronic exchange of clinical data prompted federal endorsement of applicable standards. However, despite obvious similarities, a consensus standard has not yet been selected in the comparative effectiveness research (CER) community. Using qualitative metrics for data retrieval and information loss across a variety of CER topic areas, we compare several existing models from a representative sample of organizations associated with clinical research: the Observational Medical Outcomes Partnership (OMOP), Biomedical Research Integrated Domain Group, the Clinical Data Interchange Standards Consortium, and the US Food and Drug Administration. While the models examined captured a majority of the data elements that are useful for CER studies, data elements related to insurance benefit design and plans were most detailed in OMOP's CDM version 4.0. Standardized vocabularies that facilitate semantic interoperability were included in the OMOP and US Food and Drug Administration Mini-Sentinel data models, but are left to the discretion of the end-user in Biomedical Research Integrated Domain Group and Analysis Data Model, limiting reuse opportunities. Among the challenges we encountered was the need to model data specific to a local setting. This was handled by extending the standard data models. We found that the Common Data Model from the OMOP met the broadest complement of CER objectives. Minimal information loss occurred in mapping data from institution-specific data warehouses onto the data models from the standards we assessed. However, to support certain scenarios, we found a need to enhance existing data dictionaries with local, institution-specific information.
[Three dimensional mathematical model of tooth for finite element analysis].
Puskar, Tatjana; Vasiljević, Darko; Marković, Dubravka; Jevremović, Danimir; Pantelić, Dejan; Savić-Sević, Svetlana; Murić, Branka
2010-01-01
The mathematical model of the abutment tooth is the starting point of the finite element analysis of stress and deformation of dental structures. The simplest and easiest way is to form a model according to literature data on the dimensions and morphological characteristics of teeth. Our method is based on forming 3D models using standard geometrical forms (objects) in programmes for solid modeling. The aim was to form the mathematical model of an abutment of the second upper premolar for finite element analysis of stress and deformation of dental structures. The abutment tooth has the form of a complex geometric object and is suitable for modeling in the solid modeling programme SolidWorks. After analysing the literature data on the morphological characteristics of teeth, we started the modeling by dividing the tooth (a complex geometric body) into simple geometric bodies (cylinder, cone, pyramid, ...). By connecting simple geometric bodies together or subtracting bodies from the basic body, we formed the complex geometric body, the tooth. The model was then transferred into Abaqus, a computational programme for finite element analysis, using ACIS SAT, a standard file format for transferring 3D models. Using the solid modeling programme SolidWorks, we developed three models of an abutment of the second maxillary premolar: the model of the intact abutment, the model of the endodontically treated tooth with two remaining cavity walls, and the model of the endodontically treated tooth with two remaining walls and an inserted post. Mathematical models of the abutment made according to literature data are very similar to the real abutment, and the simplifications are minimal. These models enable calculations of stress and deformation of the dental structures. The finite element analysis provides useful information for understanding biomechanical problems and gives guidance for clinical research.
Sharma, Deepak K; Solbrig, Harold R; Tao, Cui; Weng, Chunhua; Chute, Christopher G; Jiang, Guoqian
2017-06-05
Detailed Clinical Models (DCMs) have been regarded as the basis for retaining computable meaning when data are exchanged between heterogeneous computer systems. To better support clinical cancer data capturing and reporting, there is an emerging need to develop informatics solutions for standards-based clinical models in cancer study domains. The objective of the study is to develop and evaluate a cancer genome study metadata management system that serves as a key infrastructure in supporting clinical information modeling in cancer genome study domains. We leveraged a Semantic Web-based metadata repository enhanced with both the ISO 11179 metadata standard and the Clinical Information Modeling Initiative (CIMI) Reference Model. We used the common data elements (CDEs) defined in The Cancer Genome Atlas (TCGA) data dictionary, and extracted the metadata of the CDEs using the NCI Cancer Data Standards Repository (caDSR) CDE dataset rendered in the Resource Description Framework (RDF). The ITEM/ITEM_GROUP pattern defined in the latest CIMI Reference Model is used to represent reusable model elements (mini-Archetypes). We produced a metadata repository with 38 clinical cancer genome study domains, comprising a rich collection of mini-Archetype pattern instances. We performed a case study of the domain "clinical pharmaceutical" in the TCGA data dictionary and demonstrated that the enriched data elements in the metadata repository are very useful in support of building detailed clinical models. Our informatics approach leveraging Semantic Web technologies provides an effective way to build a CIMI-compliant metadata repository that would facilitate the detailed clinical modeling to support use cases beyond TCGA in clinical cancer study domains.
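As an illustration of the ITEM/ITEM_GROUP pattern mentioned above, the following is a minimal Python sketch using rdflib; the namespace, the example group and the CDE content are invented for illustration and do not reproduce the caDSR or CIMI vocabularies.

    # Hypothetical RDF rendering of one ITEM_GROUP (mini-Archetype) with one ITEM.
    from rdflib import Graph, Namespace, Literal, RDF

    EX = Namespace("http://example.org/cimi/")   # placeholder namespace
    g = Graph()

    group = EX["clinical_pharmaceutical"]        # a mini-Archetype instance
    item = EX["drug_name"]                       # one data element inside it
    g.add((group, RDF.type, EX.ITEM_GROUP))
    g.add((item, RDF.type, EX.ITEM))
    g.add((group, EX.hasItem, item))
    g.add((item, EX.preferredLabel, Literal("Drug Name")))

    for s, p, o in g.triples((group, EX.hasItem, None)):
        print(s, p, o)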
Fast Flood damage estimation coupling hydraulic modeling and Multisensor Satellite data
NASA Astrophysics Data System (ADS)
Fiorini, M.; Rudari, R.; Delogu, F.; Candela, L.; Corina, A.; Boni, G.
2011-12-01
Damage estimation requires a good representation of the elements at risk and their vulnerability, knowledge of the flooded area extension and a description of the hydraulic forcing. In this work the real-time use of a simplified two-dimensional hydraulic model constrained by satellite-retrieved flooded areas is analyzed. The main features of such a model are computational speed and simple start-up, with no need to insert complex information beyond a subset of simplified boundary and initial conditions. These characteristics allow the model to be fast enough to be used in real time for the simulation of flooding events. The model fills the gap of information left by single satellite scenes of the flooded area, allowing for the estimation of the maximum flooding extension and magnitude. The static information provided by earth observation (like the SAR extension of flooded areas at a certain time) is interpreted in a dynamically consistent way, and very useful hydraulic information (e.g., water depth, water speed and the evolution of flooded areas) is provided. This information is merged with satellite identification of elements exposed to risk, which are characterized in terms of their vulnerability to floods, in order to obtain fast estimates of flood damages. The model has been applied to several flooding events that occurred worldwide. Amongst the other activations, events in Mediterranean areas like Veneto (IT) (October 2010), Basilicata (IT) (March 2011) and Shkoder (January 2010 and December 2010) are considered and compared with larger types of floods like the one in Queensland in December 2010.
Markup of temporal information in electronic health records.
Hyun, Sookyung; Bakken, Suzanne; Johnson, Stephen B
2006-01-01
Temporal information plays a critical role in the understanding of clinical narrative (i.e., free text). We developed a representation for marking up temporal information in a narrative, consisting of five elements: 1) reference point, 2) direction, 3) number, 4) time unit, and 5) pattern. We identified 254 temporal expressions from 50 discharge summaries and represented them using our scheme. The overall inter-rater reliability among raters applying the representation model was 75 percent agreement. The model can contribute to temporal reasoning in computer systems for decision support, data mining, and process and outcomes analyses by providing structured temporal information.
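The five-element scheme lends itself to a simple structured encoding. A minimal sketch follows; the field names and the example expression are illustrative, not the authors' actual markup syntax.

    # One temporal expression, e.g. "three days prior to admission".
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TemporalExpression:
        reference_point: str      # e.g. "admission", "discharge"
        direction: str            # "before", "after", or "at"
        number: float             # magnitude of the offset
        time_unit: str            # "day", "week", "month", ...
        pattern: Optional[str]    # recurrence such as "b.i.d.", or None

    expr = TemporalExpression("admission", "before", 3, "day", None)
    print(expr)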
NASA Astrophysics Data System (ADS)
Bel Hadj Kacem, Mohamed Salah
All hydrological processes are affected by the spatial variability of the physical parameters of the watershed, and also by human intervention on the landscape. The water outflow from a watershed strictly depends on the spatial and temporal variability of its physical parameters. It is now apparent that the integration of mathematical models into GISs can benefit both GIS and three-dimensional environmental models: a true modeling capability can help the modeling community bridge the gap between planners, scientists, decision-makers and end-users. The main goal of this research is to design a practical tool to simulate run-off water surfaces using Geographic Information Systems and the simulation of hydrological behavior by the Finite Element Method.
ACTON - AUTOCAD TO NASTRAN TRANSLATOR
NASA Technical Reports Server (NTRS)
Jones, A.
1994-01-01
The AutoCAD to NASTRAN translator, ACTON, was developed to facilitate quick generation of small finite element models for use with the NASTRAN finite element modeling program. (NASTRAN is available from COSMIC.) ACTON reads the geometric data of a drawing from the Data Exchange File (DXF) used in AutoCAD and other PC based drafting programs. The geometric entities recognized by ACTON include POINTs, LINEs, SOLIDs, 3DLINEs and 3DFACEs. From this information ACTON creates a NASTRAN bulk data deck which can be used to create a finite element model. The NASTRAN elements created include CBARs, CTRIAs, CQUAD4s, CPENTAs, and CHEXAs. The bulk data deck can be used to create a full NASTRAN deck. It is assumed that the user has at least a working knowledge of AutoCAD and NASTRAN. ACTON was written in Microsoft QuickBasic (Version 2.0). The program was developed for the IBM PC and has been implemented on an IBM PC compatible under DOS 3.21. ACTON was developed in 1988.
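The translation step ACTON performs can be pictured with a short sketch; the version below is Python rather than ACTON's QuickBasic, the DXF entities are assumed to be already parsed, and only LINE-to-CBAR conversion is shown.

    # Toy DXF-to-NASTRAN translation: LINE entities become GRID/CBAR cards.
    lines = [((0.0, 0.0, 0.0), (10.0, 0.0, 0.0)),
             ((10.0, 0.0, 0.0), (10.0, 5.0, 0.0))]

    grids, cards = {}, []

    def grid_id(pt):
        # Assign one GRID id per unique point and emit its card once.
        if pt not in grids:
            grids[pt] = len(grids) + 1
            cards.append("GRID,%d,,%g,%g,%g" % ((grids[pt],) + pt))
        return grids[pt]

    for eid, (a, b) in enumerate(lines, start=1):
        ga, gb = grid_id(a), grid_id(b)
        # Free-field CBAR entry: element id, property id, end grids, orientation.
        cards.append("CBAR,%d,1,%d,%d,0.,0.,1." % (eid, ga, gb))

    print("\n".join(cards))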
Kruger, Estie; Tennant, Marc
2012-03-01
Over the last decade, there has been a significant increase in attention to the overall accountability of higher education in Australia, and this is expected to continue. Increased accountability has led to the need for more explicitly documented curricula. The curricula from ten health-related disciplines developed over the last five years in Australia were the basis of this study. Curriculum information modeling is an approach that allows for the dynamic nature of curricula since elements and their linkages can be moved about and reconnected into meaningful patterns. In addition, the models give disciplines and institutions the ability to effectively monitor curricula and draw comparisons in a more unified manner. Curriculum information models are an efficient innovation in the design and management of curricula in higher education and particularly in the health care disciplines. They rest on the principles of reusable elements and linkages independent of content that were first used in the design, construction, and maintenance of buildings. The translation of this approach to the higher education sector provides a higher level of interoperability of resources and a clearer pathway for content design within a curriculum.
Z39.50 and GILS model. [Government Information Locator Service
NASA Technical Reports Server (NTRS)
Christian, Eliot
1994-01-01
The Government Information Locator System (GILS) is a component of the National Information Infrastructure (NII) which provides electronic access to sources of publicly accessible information maintained throughout the Federal Government. GILS is an internetworking information resource that identifies other information resources, describes the information available in the referenced resources, and provides assistance in how to obtain the information either directly or through intermediaries. The GILS core content which references each Federal information system holding publicly accessible data or information is described in terms of mandatory and optional core elements.
Dowrick, Christopher; Bower, Peter; Chew-Graham, Carolyn; Lovell, Karina; Edwards, Suzanne; Lamb, Jonathan; Bristow, Katie; Gabbay, Mark; Burroughs, Heather; Beatty, Susan; Waheed, Waquas; Hann, Mark; Gask, Linda
2016-02-17
Many people with mental distress are disadvantaged because care is not available or does not address their needs. In order to increase access to high quality primary mental health care for under-served groups, we created a model of care with three discrete elements: community engagement, primary care training and tailored wellbeing interventions. We have previously demonstrated the individual impact of each element of the model. Here we assess the effectiveness of the combined model in increasing access to and improving the quality of primary mental health care. We test the assumptions that access to the wellbeing interventions is increased by the presence of community engagement and primary care training; and that quality of primary mental health care is increased by the presence of community engagement and the wellbeing interventions. We implemented the model in four under-served localities in North-West England, focusing on older people and minority ethnic populations. Using a quasi-experimental design with no-intervention comparators, we gathered a combination of quantitative and qualitative information. Quantitative information, including referral and recruitment rates for the wellbeing interventions, and practice referrals to mental health services, was analysed descriptively. Qualitative information was derived from interview and focus group responses to topic guides from more than 110 participants. Framework analysis was used to generate findings from the qualitative data. Access to the wellbeing interventions was associated with the presence of the community engagement and the primary care training elements. Referrals to the wellbeing interventions were associated with community engagement, while recruitment was associated with primary care training. Qualitative data suggested that the mechanisms underlying these associations were increased awareness and sense of agency. The quality of primary mental health care was enhanced by information gained from our community mapping activities, and by the offer of access to the wellbeing interventions. There were variable benefits from health practitioner participation in community consultative groups. We also found that participation in the wellbeing interventions led to increased community engagement. We explored the interactions between elements of a multilevel intervention and identified important associations and underlying mechanisms. Further research is needed to test the generalisability of the model. Current Controlled Trials, reference ISRCTN68572159. Registered 25 February 2013.
A new methodology for free wake analysis using curved vortex elements
NASA Technical Reports Server (NTRS)
Bliss, Donald B.; Teske, Milton E.; Quackenbush, Todd R.
1987-01-01
A method using curved vortex elements was developed for helicopter rotor free wake calculations. The Basic Curve Vortex Element (BCVE) is derived from the approximate Biot-Savart integration for a parabolic arc filament. When used in conjunction with a scheme to fit the elements along a vortex filament contour, this method has a significant advantage in overall accuracy and efficiency when compared to the traditional straight-line element approach. A theoretical and numerical analysis shows that free wake flows involving close interactions between filaments should utilize curved vortex elements in order to guarantee a consistent level of accuracy. The curved element method was implemented into a forward flight free wake analysis, featuring an adaptive far wake model that utilizes free wake information to extend the vortex filaments beyond the free wake regions. The curved vortex element free wake, coupled with this far wake model, exhibited rapid convergence, even in regions where the free wake and far wake turns are interlaced. Sample calculations are presented for tip vortex motion at various advance ratios for single and multiple blade rotors. Cross-flow plots reveal that the overall downstream wake flow resembles a trailing vortex pair. A preliminary assessment shows that the rotor downwash field is insensitive to element size, even for relatively large curved elements.
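The core of a curved-element approach is evaluating the Biot-Savart law over an arc rather than a straight segment. The following rough numerical sketch integrates the law over a parabolic arc by quadrature; the parameterization, the bulge direction and all values are illustrative, and the paper's closed-form BCVE is not reproduced.

    # Velocity induced at point p by a parabolic arc filament from r0 to r1.
    import numpy as np

    def induced_velocity(p, r0, r1, curvature, gamma=1.0, n=200):
        t = np.linspace(0.0, 1.0, n)
        chord = r1 - r0
        normal = np.array([-chord[1], chord[0], 0.0])  # assumes chord not along z
        normal /= np.linalg.norm(normal)
        r = r0 + np.outer(t, chord) + np.outer(4 * curvature * t * (1 - t), normal)
        dl = np.gradient(r, axis=0)                    # filament tangent segments
        rel = p - r
        dist = np.linalg.norm(rel, axis=1)[:, None]
        return gamma / (4 * np.pi) * (np.cross(dl, rel) / dist**3).sum(axis=0)

    v = induced_velocity(np.array([0.0, 1.0, 0.0]),
                         np.array([-1.0, 0.0, 0.0]),
                         np.array([1.0, 0.0, 0.0]), curvature=0.2)
    print(v)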
Platinum Partitioning at Low Oxygen Fugacity: Implications for Core Formation Processes
NASA Technical Reports Server (NTRS)
Medard, E.; Martin, A. M.; Righter, K.; Lanziroti, A.; Newville, M.
2016-01-01
Highly siderophile elements (HSE = Au, Re, and the Pt-group elements) are tracers of silicate/metal interactions during planetary processes. Since most core-formation models involve some state of equilibrium between liquid silicate and liquid metal, understanding the partitioning of highly siderophile elements (HSE) between silicate and metallic melts is a key issue for models of core/mantle equilibria and for core formation scenarios. However, partitioning models for HSE are still inaccurate due to the lack of sufficient experimental constraints to describe the variations of partitioning with key variables like temperature, pressure, and oxygen fugacity. In this abstract, we describe a self-consistent set of experiments aimed at determining the valence of platinum, one of the HSE, in silicate melts. This is key information required to parameterize the evolution of platinum partitioning with oxygen fugacity.
Schlette, Sophia; Lisac, Melanie; Wagner, Ed; Gensichen, Jochen
2009-01-01
The Bellagio Model for Population-oriented Primary Care is an evidence-informed framework to assess accessible care for sick, vulnerable, and healthy people. The model was developed in spring 2008 by a multidisciplinary group of 24 experts from nine countries. The purpose of their gathering was to determine success factors for effective 21st-century primary care based on state-of-the-art research findings, models, and empirical experience, and to assist with its implementation in practice, management, and health policy. Against the backdrop of "partialization", fragmentation in open health care systems, and the growing numbers of chronically ill or fragile people, or those in need of any other kind of care, today's health care systems do not provide the much-needed anchor point for continuing coordination and assistance prior to, during and following an episode of illness. The Bellagio Model consists of ten key elements, which can make a substantial contribution to identifying and overcoming current gaps in primary care through a synergetic approach. These elements are Shared Leadership, Public Trust, Horizontal and Vertical Integration, Networking of Professionals, Standardized Measurement, Research and Development, Payment Mix, Infrastructure, Programmes for Practice Improvement, and Population-oriented Management. All of these elements, which have been identified as being equally necessary, are also alike in that they involve all those responsible for health care: providers, managers, and policymakers.
NASA Astrophysics Data System (ADS)
Ramirez Cuesta, Timmy
Incoherent inelastic neutron scattering spectroscopy is a very powerful technique that requires the use of ab-initio models to interpret the experimental data. Albeit not exact, the information obtained from the models gives very valuable insight into the dynamics of atoms in solids and molecules and, in turn, provides unique access to the vibrational density of states. The technique is extremely sensitive to hydrogen, since the neutron cross section of hydrogen is the largest of all chemical elements; hydrogen, being the lightest element, also shows more pronounced quantum effects than the other elements. In the case of non-crystalline or disordered materials, the models provide only partial information, and only a reduced sampling of possible configurations can be done at present. With very large computing power, as exascale computing will provide, a new opportunity arises to study these systems and to introduce a description of statistical configurations, including energetics and dynamics characterization of configurational entropy. As part of the ICE-MAN project, we are developing the tools to manage the workflows and to visualize and analyze the results, in order to apply state-of-the-art computational methods to the many neutron scattering techniques that use atomistic models for the interpretation of experimental data. This work is supported by the Laboratory Directed Research and Development (LDRD 8237) program of UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy.
NASA Astrophysics Data System (ADS)
Bohrson, W. A.; Spera, F. J.; Fowler, S.; Belkin, H.; de Vivo, B.
2005-12-01
The Campanian Ignimbrite, a large-volume (~200 km3 DRE) trachytic to phonolitic ignimbrite, was deposited at ~39.3 ka and represents the largest of a number of highly explosive volcanic events in the region near Naples, Italy. Thermodynamic modeling of the major element evolution using the MELTS algorithm (see companion contribution by Fowler et al.) provides detailed information about the identity of and changes in proportions of solids along the liquid line of descent during isobaric fractional crystallization. We have derived trace element mass balance equations that explicitly accommodate changing mineral-melt bulk distribution coefficients during crystallization and also simultaneously satisfy energy and major element mass conservation. Although major element patterns are reasonably modeled assuming closed-system fractional crystallization, modeling of trace elements that represent a range of behaviors (e.g. Zr, Nb, Th, U, Rb, Sm, Sr) yields trends for closed-system fractionation that are distinct from those observed. These results suggest open-system processes were also important in the evolution of the Campanian magmatic system. Th isotope data yield an apparent isochron that is ~20 kyr younger than the age of the deposit, and age-corrected Th isotope data indicate that the magma body was an open system at the time of eruption. Because open-system processes can profoundly change isotopic characteristics of a magma body, these results illustrate that it is critical to understand the contribution that open-system processes make to silicic magma bodies prior to assigning relevance to age or timescale information derived from isotope systematics. Fluid-magma interaction has been proposed as a mechanism to change isotopic and elemental characteristics of magma bodies, but an evaluation of the mass and thermal constraints on such a process suggests that large-scale fluid-melt interaction at liquidus temperatures is unlikely. In the case of the magma body associated with the Campanian Ignimbrite, the most likely source of open-system signatures is assimilation of partial melts of compositionally heterogeneous basement composed of older cumulates and intrusive equivalents of volcanic activity within the Campanian region. Additional trace element modeling, explicitly evaluating the mass and energy balance effects that fluid, solids, and melt have on trace element evolution, will further elucidate the contributions of open vs. closed system processes within the Campanian magma body.
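The flavor of such trace element mass balance can be conveyed with a one-variable sketch: Rayleigh fractional crystallization in which the bulk distribution coefficient D changes as the crystallizing assemblage changes. The D(F) function and all numbers below are invented for illustration and are not fitted to the Campanian Ignimbrite.

    # d ln C = (D - 1) d ln F, integrated stepwise with a varying bulk D.
    import numpy as np

    def rayleigh_variable_D(c0, D_of_F, steps=1000):
        F = np.linspace(1.0, 0.3, steps)     # melt fraction remaining
        c = np.empty_like(F)
        c[0] = c0
        for i in range(1, steps):
            D = D_of_F(F[i])
            c[i] = c[i - 1] * np.exp((D - 1.0) * np.log(F[i] / F[i - 1]))
        return F, c

    # e.g. an element whose D jumps when a new phase saturates at F ~ 0.7.
    F, c = rayleigh_variable_D(500.0, lambda F: 0.1 if F > 0.7 else 2.5)
    print("concentration at F = 0.3: %.0f ppm" % c[-1])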
NASA Technical Reports Server (NTRS)
DeMott, Diana; Fuqua, Bryan; Wilson, Paul
2013-01-01
Once a project obtains approval, decision makers have to consider a variety of alternative paths for completing the project and meeting the project objectives. How decisions are made involves a variety of elements including: cost, experience, current technology, ideologies, politics, future needs and desires, capabilities, manpower, timing, available information, and for many ventures management needs to assess the elements of risk versus reward. The use of high level Probabilistic Risk Assessment (PRA) Models during conceptual design phases provides management with additional information during the decision making process regarding the risk potential for proposed operations and design prototypes. The methodology can be used as a tool to: 1) allow trade studies to compare alternatives based on risk, 2) determine which elements (equipment, process or operational parameters) drives the risk, and 3) provide information to mitigate or eliminate risks early in the conceptual design to lower costs. Creating system models using conceptual design proposals and generic key systems based on what is known today can provide an understanding of the magnitudes of proposed systems and operational risks and facilitates trade study comparisons early in the decision making process. Identifying the "best" way to achieve the desired results is difficult, and generally occurs based on limited information. PRA provides a tool for decision makers to explore how some decisions will affect risk before the project is committed to that path, which can ultimately save time and money.
21 CFR 50.25 - Elements of informed consent.
Code of Federal Regulations, 2014 CFR
2014-04-01
Title 21 (Food and Drugs), Part 50 (Protection of Human Subjects), Informed Consent of Human Subjects, § 50.25 Elements of informed consent. (a) Basic elements of informed consent. In seeking informed consent, the following information shall be provided to...
21 CFR 50.25 - Elements of informed consent.
Code of Federal Regulations, 2012 CFR
2012-04-01
Title 21 (Food and Drugs), Part 50 (Protection of Human Subjects), Informed Consent of Human Subjects, § 50.25 Elements of informed consent. (a) Basic elements of informed consent. In seeking informed consent, the following information shall be provided to...
21 CFR 50.25 - Elements of informed consent.
Code of Federal Regulations, 2013 CFR
2013-04-01
Title 21 (Food and Drugs), Part 50 (Protection of Human Subjects), Informed Consent of Human Subjects, § 50.25 Elements of informed consent. (a) Basic elements of informed consent. In seeking informed consent, the following information shall be provided to...
NASA Technical Reports Server (NTRS)
Leser, William P.; Yuan, Fuh-Gwo; Leser, William P.
2013-01-01
A method of numerically estimating dynamic Green's functions using the finite element method is proposed. These Green's functions are accurate in a limited frequency range dependent on the mesh size used to generate them. This range can often match or exceed the frequency sensitivity of the traditional acoustic emission sensors. An algorithm is also developed to characterize an acoustic emission source by obtaining information about its strength and temporal dependence. This information can then be used to reproduce the source in a finite element model for further analysis. Numerical examples are presented that demonstrate the ability of the band-limited Green's functions approach to determine the moment tensor coefficients of several reference signals to within seven percent, as well as accurately reproduce the source-time function.
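Once band-limited Green's functions are in hand, recovering moment tensor coefficients is a linear inverse problem. The sketch below shows that step with synthetic placeholders standing in for the FEM-generated Green's functions and the recorded signals; it is not the paper's algorithm in detail.

    # Least-squares recovery of moment tensor coefficients m from u = G m + noise.
    import numpy as np

    rng = np.random.default_rng(0)
    nt, ncomp, nm = 400, 3, 6                  # samples, components, tensor terms
    G = rng.standard_normal((ncomp * nt, nm))  # stand-in for convolved Green's functions
    m_true = np.array([1.0, -0.5, 0.2, 0.0, 0.3, -0.1])
    u = G @ m_true + 0.01 * rng.standard_normal(ncomp * nt)

    m_est, *_ = np.linalg.lstsq(G, u, rcond=None)
    print(np.round(m_est, 3))                  # close to m_true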
Unification of color postprocessing techniques for 3-dimensional computational mechanics
NASA Technical Reports Server (NTRS)
Bailey, Bruce Charles
1985-01-01
To facilitate the understanding of complex three-dimensional numerical models, advanced interactive color postprocessing techniques are introduced. These techniques are sufficiently flexible so that postprocessing difficulties arising from model size, geometric complexity, response variation, and analysis type can be adequately overcome. Finite element, finite difference, and boundary element models may be evaluated with the prototype postprocessor. Elements may be removed from parent models to be studied as independent subobjects. Discontinuous responses may be contoured including responses which become singular, and nonlinear color scales may be input by the user for the enhancement of the contouring operation. Hit testing can be performed to extract precise geometric, response, mesh, or material information from the database. In addition, stress intensity factors may be contoured along the crack front of a fracture model. Stepwise analyses can be studied, and the user can recontour responses repeatedly, as if he were paging through the response sets. As a system, these tools allow effective interpretation of complex analysis results.
NASA Technical Reports Server (NTRS)
Kim, E.; Tedesco, M.; Reichle, R.; Choudhury, B.; Peters-Lidard, C.; Foster, J.; Hall, D.; Riggs, G.
2006-01-01
Microwave-based retrievals of snow parameters from satellite observations have a long heritage and have so far been generated primarily by regression-based empirical "inversion" methods based on snapshots in time. Direct assimilation of microwave radiance into physical land surface models can be used to avoid errors associated with such retrieval/inversion methods, instead utilizing more straightforward forward models and temporal information. This approach has been used for years for atmospheric parameters by the operational weather forecasting community with great success. Recent developments in forward radiative transfer modeling, physical land surface modeling, and land data assimilation are converging to allow the assembly of an integrated framework for snow/cold lands modeling and radiance assimilation. The objective of the Goddard snow radiance assimilation project is to develop such a framework and explore its capabilities. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. In fact, multiple models are available for each element enabling optimization to match the needs of a particular study. Together these form a modular and flexible framework for self-consistent, physically-based remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster. Capabilities for assimilation of snow retrieval products are already under development for LIS. We will describe plans to add radiance-based assimilation capabilities. Plans for validation activities using field measurements will also be discussed.
Deterioration and cost information for bridge management.
DOT National Transportation Integrated Search
2012-05-01
This study applies contract bid tabulations and element-level condition records to develop element-level actions, costs for actions, transition probabilities for models of deterioration of bridge elements, and transition probabilities for imp...
Noise shaping in populations of coupled model neurons.
Mar, D J; Chow, C C; Gerstner, W; Adams, R W; Collins, J J
1999-08-31
Biological information-processing systems, such as populations of sensory and motor neurons, may use correlations between the firings of individual elements to obtain lower noise levels and a systemwide performance improvement in the dynamic range or the signal-to-noise ratio. Here, we implement such correlations in networks of coupled integrate-and-fire neurons using inhibitory coupling and demonstrate that this can improve the system dynamic range and the signal-to-noise ratio in a population rate code. The improvement can surpass that expected for simple averaging of uncorrelated elements. A theory that predicts the resulting power spectrum is developed in terms of a stochastic point-process model in which the instantaneous population firing rate is modulated by the coupling between elements.
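A minimal simulation conveys the mechanism: leaky integrate-and-fire neurons with global inhibitory coupling develop firing correlations that reshape the noise in the summed population rate. All parameters below are illustrative, not those of the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    N, steps, dt = 100, 5000, 1e-3
    v = rng.random(N)                    # membrane potentials, threshold = 1
    g_inh, drive, sigma = 0.05, 1.5, 0.3
    rate = np.zeros(steps)

    for t in range(steps):
        v += dt * (drive - v) + sigma * np.sqrt(dt) * rng.standard_normal(N)
        fired = v >= 1.0
        rate[t] = fired.sum()
        v[fired] = 0.0                   # reset after a spike
        v -= g_inh * fired.sum() / N     # every spike inhibits the population

    # Inhibitory coupling shifts rate-noise power away from low frequencies.
    spectrum = np.abs(np.fft.rfft(rate - rate.mean()))**2
    print(spectrum[:5])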
Validating archetypes for the Multiple Sclerosis Functional Composite.
Braun, Michael; Brandt, Alexander Ulrich; Schulz, Stefan; Boeker, Martin
2014-08-03
Numerous information models for electronic health records, such as openEHR archetypes are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects are not regarded sufficiently yet. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. A standard archetype development approach was applied on a case set of three clinical tests for multiple sclerosis assessment: After an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both, development and review process turned out to be time-consuming tasks, mostly due to difficult selection processes between alternative modelling approaches. The archetype review was a straightforward team process with the goal to validate archetypes pragmatically. The quality of medical information models is crucial to guarantee standardised semantic representation in order to improve interoperability. The validation process is a practical way to better harmonise models that diverge due to necessary flexibility left open by the underlying formal reference model definitions.This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model.
Development of health information search engine based on metadata and ontology.
Song, Tae-Min; Park, Hyeoun-Ae; Jin, Dal-Lae
2014-04-01
The aim of the study was to develop a metadata and ontology-based health information search engine ensuring semantic interoperability to collect and provide health information using different application programs. Health information metadata ontology was developed using a distributed semantic Web content publishing model based on vocabularies used to index the contents generated by the information producers as well as those used to search the contents by the users. Vocabulary for health information ontology was mapped to the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), and a list of about 1,500 terms was proposed. The metadata schema used in this study was developed by adding an element describing the target audience to the Dublin Core Metadata Element Set. A metadata schema and an ontology ensuring interoperability of health information available on the internet were developed. The metadata and ontology-based health information search engine developed in this study produced a better search result compared to existing search engines. Health information search engine based on metadata and ontology will provide reliable health information to both information producer and information consumers.
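The schema extension described above, Dublin Core plus a target-audience element, can be pictured with a small sketch; the record content and the SNOMED CT code shown are illustrative assumptions, not the study's actual data.

    # One metadata record: Dublin Core elements plus the added "audience" element.
    record = {
        "title": "Managing type 2 diabetes",
        "creator": "Example Health Agency",   # hypothetical producer
        "subject": "44054006",                # assumed SNOMED CT concept id
        "description": "Plain-language overview of diet and medication.",
        "date": "2014-04-01",
        "language": "en",
        "audience": "patient",                # element beyond the Dublin Core set
    }

    def matches(rec, concept, audience):
        # Search constrained by both the ontology term and the target audience.
        return rec["subject"] == concept and rec["audience"] == audience

    print(matches(record, "44054006", "patient"))   # True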
Ekerete, P P
2000-01-01
The National Programme on Immunization (NPI), formerly known as the Expanded Programme on Immunization (EPI), and Oral Rehydration Therapy (ORT) were relaunched in 1984 after the problems of vaccine supply had been corrected. The aim of the NPI was to protect children against six childhood killer diseases, and that of ORT to rehydrate children dehydrated by diarrhoea. In order to achieve these objectives, a Partner-in-Health strategy was set up to educate, convince and motivate mothers, pregnant women and the community to accept the programme. To assess the effect of the promotional strategy, the government decided to conduct a National Immunization Coverage survey. The results showed that some states were able to reach the target while some were not. The survey also reported that 32% of immunization failures were due to lack of information and 9% to lack of motivation. It therefore became necessary to design a promotional model for effective and rapid implementation of the programme. After an evaluation of the promotional strategy set up by the government, a pilot survey was conducted from which nine promotional elements were selected. These promotional elements were regarded as sources of information and motivation. Based on these, a promotional model was set up which stated that promotion depends on consumer information, which in turn depends on the extent of interaction between the consumer and the promotional elements. The implication of the model is the need for the formation of a Public Health Organisation with a Public Health Committee at all levels of government.
Extraction of information from major element chemical analyses of lunar basalts
NASA Technical Reports Server (NTRS)
Butler, J. C.
1985-01-01
Major element chemical analyses often form the framework within which similarities and differences of analyzed specimens are noted and used to propose or devise models. When percentages are formed the ratios of pairs of components are preserved whereas many familiar statistical and geometrical descriptors are likely to exhibit major changes. This ratio preserving aspect forms the basis for a proposed framework. An analysis of compositional variability within the data set of 42 major element analyses of lunar reference samples was selected to investigate this proposal.
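The ratio-preserving property is easy to verify numerically: forming percentages rescales every component by the same factor, so pairwise ratios survive while the individual values (and their statistics) change. The oxide values below are invented.

    import numpy as np

    raw = np.array([48.2, 11.3, 9.7, 17.9])   # hypothetical major element data
    pct = 100.0 * raw / raw.sum()             # forming percentages (closure)

    print(raw, pct)                           # individual values change...
    print(raw[0] / raw[2], pct[0] / pct[2])   # ...but pairwise ratios do not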
From Point Cloud to Bim: a Modelling Challenge in the Cultural Heritage Field
NASA Astrophysics Data System (ADS)
Tommasi, C.; Achille, C.; Fassi, F.
2016-06-01
Speaking about modelling the Cultural Heritage, nowadays it is no longer enough to build the mute model of a monument: the model has to contain plenty of information, especially when we refer to existing constructions. For this reason, the aim of the research is to insert an historical building inside a BIM process, proposing a working method that can build a reality-based model and preserve the unicity of the elements. The question is: "What is the most useful means in terms of survey data management, level of detail, information and time savings?" To test the potentialities and the limits of this process we employed the most used software on the international market, taking as examples some composed elements, made of regular and complex, but also modular, parts. Once a final model is obtained, it is necessary to provide a test phase on the interoperability between the software modules used, in order to give a general picture of the state of the art and to contribute to further studies on this subject.
A reduced Iwan model that includes pinning for bolted joint mechanics
Brake, M. R. W.
2016-10-28
Bolted joints are prevalent in most assembled structures; however, predictive models for their behavior do not exist. Calibrated models, such as the Iwan model, are able to predict the response of a jointed structure over a range of excitations once calibrated at a nominal load. The Iwan model, though, is not widely adopted due to the high computational expense of implementation. To address this, an analytical solution of the Iwan model is derived under the hypothesis that for an arbitrary load reversal, there is a new distribution of dry friction elements, which are now stuck, that approximately resembles a scaled version of the original distribution of dry friction elements. The dry friction elements internal to the Iwan model do not have a uniform set of parameters and are described by a distribution of parameters, i.e., which internal dry friction elements are stuck or slipping at a given load, that ultimately governs the behavior of the joint as it transitions from microslip to macroslip. This hypothesis allows the model to require no information from previous loading cycles. Additionally, the model is extended to include the pinning behavior inherent in a bolted joint. Modifications of the resulting framework are discussed to highlight how the constitutive model for friction can be changed (in the case of an Iwan–Stribeck formulation) or how the distribution of dry friction elements can be changed (as is the case for the Iwan plasticity model). Finally, the reduced Iwan plus pinning model is then applied to the Brake–Reuß beam in order to discuss methods to deduce model parameters from experimental data.
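The discrete picture behind the Iwan model is a parallel bank of dry-friction (Jenkins) elements with a distribution of slip thresholds. The sketch below implements that discrete analogue only; the paper's analytical reduction and pinning extension are not reproduced, and all parameters are illustrative.

    import numpy as np

    n = 200
    phi = np.linspace(0.01, 1.0, n)   # slip thresholds (the "distribution")
    k = 1.0 / n                       # stiffness per element
    slider = np.zeros(n)              # current position of each friction slider

    def joint_force(u):
        # Elements stick until their threshold force, then slip:
        # this is the microslip-to-macroslip transition.
        f_el = k * (u - slider)
        slipped = np.abs(f_el) > k * phi
        slider[slipped] = u - np.sign(f_el[slipped]) * phi[slipped]
        return (k * (u - slider)).sum()

    # A quasi-static load reversal traces a hysteresis loop.
    path = np.concatenate([np.linspace(0, 1.5, 50), np.linspace(1.5, -1.5, 100)])
    forces = [joint_force(u) for u in path]
    print("force after reversal: %.4f" % forces[-1])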
Plantet, C; Meimon, S; Conan, J-M; Fusco, T
2015-11-02
Exoplanet direct imaging with large ground based telescopes requires eXtreme Adaptive Optics that couples high-order adaptive optics and coronagraphy. A key element of such systems is the high-order wavefront sensor. We study here several high-order wavefront sensing approaches, and more precisely compare their sensitivity to noise. Three techniques are considered: the classical Shack-Hartmann sensor, the pyramid sensor and the recently proposed LIFTed Shack-Hartmann sensor. They are compared in a unified framework based on precise diffractive models and on the Fisher information matrix, which conveys the information present in the data whatever the estimation method. The diagonal elements of the inverse of the Fisher information matrix, which we use as a figure of merit, are similar to noise propagation coefficients. With these diagonal elements, so called "Fisher coefficients", we show that the LIFTed Shack-Hartmann and pyramid sensors outperform the classical Shack-Hartmann sensor. In photon noise regime, the LIFTed Shack-Hartmann and modulated pyramid sensors obtain a similar overall noise propagation. The LIFTed Shack-Hartmann sensor however provides attractive noise properties on high orders.
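The figure of merit is straightforward to compute once a diffractive model provides the Jacobian of the measurements with respect to the wavefront modes. The sketch below uses a random placeholder Jacobian and Gaussian read noise; it shows only the Fisher-matrix bookkeeping, not the sensors' physics.

    import numpy as np

    rng = np.random.default_rng(2)
    npix, nmodes, sigma = 256, 10, 1.0
    J = rng.standard_normal((npix, nmodes))   # d(measurement)/d(mode), placeholder

    fisher = J.T @ J / sigma**2               # Fisher matrix for Gaussian noise
    fisher_coeffs = np.diag(np.linalg.inv(fisher))
    print(np.round(fisher_coeffs, 4))         # per-mode noise propagation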
NASA Astrophysics Data System (ADS)
Yen, Y. N.; Weng, K. H.; Huang, H. Y.
2013-07-01
After over 30 years of practice and development, Taiwan's architectural conservation field is moving rapidly into digitalization and its applications. Compared to modern buildings, traditional Chinese architecture has considerably more complex elements and forms. To document and digitize these unique heritages across their conservation lifecycle is a new and important issue. This article takes the caisson ceiling of the Taipei Confucius Temple, octagonal with 333 elements in 8 types, as a case study in digitization practice. The application of metadata representation and 3D modelling are the two key issues discussed. Both Revit and SketchUp were applied in this research to compare their effectiveness for metadata representation. Due to limitations of the Revit database, the final 3D models were built with SketchUp. The research found that, firstly, cultural heritage databases must convey that while many elements are similar in appearance, they are unique in value; although 3D simulations help the general understanding of architectural heritage, software such as Revit and SketchUp, at this stage, could only be used to model basic visual representations, and is ineffective in documenting additional critical data on individually unique elements. Secondly, when establishing conservation lifecycle information for application in management systems, a full and detailed presentation of the metadata must also be implemented; the existing applications of BIM in managing conservation lifecycles are still insufficient. The research recommends SketchUp as a tool for present modelling needs, and BIM for sharing data between users, but the implementation of metadata representation is of the utmost importance.
NASA Astrophysics Data System (ADS)
Juszczyk, Michał
2018-04-01
This paper reports some results of studies on the use of artificial intelligence tools for cost estimation based on building information models. The problem of macro-level cost estimates based on building information models, supported by ensembles of artificial neural networks, is concisely discussed. In the course of the research, a regression model was built for cost estimation of buildings' floor structural frames, as higher-level elements. Building information models are supposed to serve as a repository of data used for the purposes of cost estimation. The core of the model is an ensemble of neural networks. The developed model allows the prediction of cost estimates with satisfactory accuracy.
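The regression core can be sketched in a few lines: several small networks trained on BIM-derived quantities, with predictions averaged across the ensemble. The features, the toy cost function and the network sizes below are synthetic stand-ins, not the paper's data or architecture.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)
    X = rng.uniform(0, 1, (200, 4))   # e.g. area, spans, storeys, concrete class
    y = 50 + 120 * X[:, 0] + 30 * X[:, 1]**2 + rng.normal(0, 2, 200)  # toy cost

    ensemble = [MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                             random_state=i).fit(X, y) for i in range(5)]

    def predict(Xq):
        # Ensemble output = mean of the member networks' predictions.
        return np.mean([m.predict(Xq) for m in ensemble], axis=0)

    print(predict(X[:3]))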
Role of Discrepant Questioning Leading to Model Element Modification
ERIC Educational Resources Information Center
Rea-Ramirez, Mary Anne; Nunez-Oviedo, Maria Cecilia; Clement, John
2009-01-01
Discrepant questioning is a teaching technique that can help students "unlearn" misconceptions and process science ideas for deep understanding. Discrepant questioning is a technique in which teachers question students in a way that requires them to examine their ideas or models, without giving information prematurely to the student or passing…
An approach for utilizing clinical statements in HL7 RIM to evaluate eligibility criteria.
Bache, Richard; Daniel, Christel; James, Julie; Hussain, Sajjad; McGilchrist, Mark; Delaney, Brendan; Taweel, Adel
2014-01-01
The HL7 RIM (Reference Information Model) is a commonly used standard for the exchange of clinical data and can be employed for integrating the patient care and clinical research domains. Yet it is not sufficiently well specified to ensure a canonical representation of structured clinical data when used for the automated evaluation of eligibility criteria from a clinical trial protocol. We present an approach to further constrain the RIM to create a common information model to hold clinical data. In order to demonstrate our approach, we identified 132 distinct data elements from 10 rich clinical trials. We then defined a taxonomy to (i) identify the types of data elements that would need to be stored and (ii) define the types of predicate that would be used to evaluate them. This informed the definition of a pattern used to represent the data, which was shown to be sufficient for storing and evaluating the clinical statements required by the trials.
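The pattern can be pictured with a toy evaluation: clinical statements held in a constrained common structure, and a typed predicate applied to them. The statement layout and the SNOMED CT code below are simplifying assumptions, not the RIM itself.

    import operator
    from datetime import date

    statements = [
        {"code": "271649006",            # assumed code for systolic blood pressure
         "kind": "observation", "value": 150.0, "unit": "mm[Hg]",
         "effective": date(2014, 1, 10)},
    ]

    def numeric_predicate(stmts, code, op, threshold):
        # e.g. eligibility criterion "systolic blood pressure >= 140 mmHg"
        return any(op(s["value"], threshold)
                   for s in stmts if s["code"] == code)

    print(numeric_predicate(statements, "271649006", operator.ge, 140.0))  # True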
Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying
2018-01-01
The metabolism of individual organisms and biological communities can be viewed as a network of metabolites connected to each other through chemical reactions. In metabolic networks, chemical reactions transform reactants into products, thereby transferring elements between these metabolites. Knowledge of how elements are transferred through reactant/product pairs allows for the identification of primary compound connections through a metabolic network. However, such information is not readily available and is often challenging to obtain for large reaction databases or genome-scale metabolic models. In this study, a new algorithm was developed for automatically predicting the element-transferring reactant/product pairs using the limited information available in the standard representation of metabolic networks. The algorithm demonstrated high efficiency in analyzing large datasets and provided accurate predictions when benchmarked with manually curated data. Applying the algorithm to the visualization of metabolic networks highlighted pathways of primary reactant/product connections and provided an organized view of element-transferring biochemical transformations. The algorithm was implemented as a new function in the open source software package PSAMM in the release v0.30 (https://zhanglab.github.io/psamm/).
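A toy version of the pairing problem scores candidate reactant/product pairs by the element counts their formulas share. PSAMM's actual algorithm is more sophisticated; this only illustrates why glucose, not ATP, is the primary partner of glucose 6-phosphate in a kinase reaction.

    from collections import Counter
    import re

    def formula_counts(f):
        # "C6H12O6" -> {"C": 6, "H": 12, "O": 6}
        return Counter({el: int(n or 1)
                        for el, n in re.findall(r"([A-Z][a-z]?)(\d*)", f)})

    def overlap_score(a, b):
        ca, cb = formula_counts(a), formula_counts(b)
        shared = sum(min(ca[e], cb[e]) for e in ca.keys() & cb.keys())
        union = sum(max(ca[e], cb[e]) for e in ca.keys() | cb.keys())
        return shared / union

    print(overlap_score("C6H12O6", "C6H13O9P"))        # glucose vs G6P: ~0.83
    print(overlap_score("C10H16N5O13P3", "C6H13O9P"))  # ATP vs G6P: ~0.62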
Designing and researching of the virtual display system based on the prism elements
NASA Astrophysics Data System (ADS)
Vasilev, V. N.; Grimm, V. A.; Romanova, G. E.; Smirnov, S. A.; Bakholdin, A. V.; Grishina, N. Y.
2014-05-01
Problems in the design of virtual display systems for augmented reality placed near the observer's eye (so-called head-worn displays) with light-guide prismatic elements are considered. An augmented reality system is a complex consisting of an image generator (most often a microdisplay with an illumination system, if the display is not self-luminous), an objective which forms the display image practically at infinity, and a combiner which splits the light so that an observer can see the information on the microdisplay and, at the same time, the surrounding environment as the background. This work deals with a system whose combiner is based on a composite structure of prism elements. Three cases of the prism combiner design are considered, and the results of modeling with optical design software are presented. In the model, the question of a large pupil zone was analyzed, and the discontinuous (mosaic) structure of the angular field in transmitting information from the microdisplay to the observer's eye with the prismatic structure is discussed.
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; MacMurdy, Dale E.; Kapania, Rakesh K.
1994-01-01
Strong interactions between flow about an aircraft wing and the wing structure can result in aeroelastic phenomena which significantly impact aircraft performance. Time-accurate methods for solving the unsteady Navier-Stokes equations have matured to the point where reliable results can be obtained with reasonable computational costs for complex non-linear flows with shock waves, vortices and separations. The ability to combine such a flow solver with a general finite element structural model is key to an aeroelastic analysis in these flows. Earlier work involved time-accurate integration of modal structural models based on plate elements. A finite element model was developed to handle three-dimensional wing boxes, and incorporated into the flow solver without the need for modal analysis. Static condensation is performed on the structural model to reduce the structural degrees of freedom for the aeroelastic analysis. Direct incorporation of the finite element wing-box structural model with the flow solver requires finding adequate methods for transferring aerodynamic pressures to the structural grid and returning deflections to the aerodynamic grid. Several schemes were explored for handling the grid-to-grid transfer of information. The complex, built-up nature of the wing-box complicated this transfer. Aeroelastic calculations for a sample wing in transonic flow comparing various simple transfer schemes are presented and discussed.
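The simplest of the transfer schemes explored can be sketched directly: nearest-neighbour mapping of aerodynamic pressures onto structural nodes (deflections go back the same way). Real wing-box coupling calls for conservative interpolation; the grids and loads below are random placeholders.

    import numpy as np

    rng = np.random.default_rng(4)
    aero_xyz = rng.random((500, 3))     # aerodynamic surface grid
    struct_xyz = rng.random((80, 3))    # structural grid
    pressure = np.sin(aero_xyz[:, 0])   # placeholder aerodynamic loads

    # For each structural node, take the pressure at the closest aero point.
    d = np.linalg.norm(struct_xyz[:, None, :] - aero_xyz[None, :, :], axis=2)
    struct_load = pressure[d.argmin(axis=1)]
    print(struct_load[:5])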
Verhey, Janko F; Nathan, Nadia S
2004-01-01
Background Finite element method (FEM) analysis for intraoperative modeling of the left ventricle (LV) is presently not possible. Since 3D structural data of the LV is now obtainable using standard transesophageal echocardiography (TEE) devices intraoperatively, the present study describes a method to transfer this data into a commercially available FEM analysis system: ABAQUS©. Methods In this prospective study TomTec LV Analysis TEE© Software was used for semi-automatic endocardial border detection, reconstruction, and volume-rendering of the clinical 3D echocardiographic data. A newly developed software program MVCP FemCoGen©, written in Delphi, reformats the TomTec file structures in five patients for use in ABAQUS and allows visualization of regional deformation of the LV. Results This study demonstrates that a fully automated importation of 3D TEE data into FEM modeling is feasible and can be efficiently accomplished in the operating room. Conclusion For complete intraoperative 3D LV finite element analysis, three input elements are necessary: 1. time-gaited, reality-based structural information, 2. continuous LV pressure and 3. instantaneous tissue elastance. The first of these elements is now available using the methods presented herein. PMID:15473901
Component Design Report: International Transportation Energy Demand Determinants Model
2017-01-01
This Component Design Report discusses working design elements for a new model to replace the International Transportation Model (ITran) in the World Energy Projection System Plus (WEPS+) that is maintained by the U.S. Energy Information Administration. The key objective of the new International Transportation Energy Demand Determinants (ITEDD) model is to enable more rigorous, quantitative research related to energy consumption in the international transportation sectors.
Fusion of intraoperative force sensoring, surface reconstruction and biomechanical modeling
NASA Astrophysics Data System (ADS)
Röhl, S.; Bodenstedt, S.; Küderle, C.; Suwelack, S.; Kenngott, H.; Müller-Stich, B. P.; Dillmann, R.; Speidel, S.
2012-02-01
Minimally invasive surgery is medically complex and can benefit heavily from computer assistance. One way to help the surgeon is to integrate preoperative planning data into the surgical workflow. This information can be represented as a customized preoperative model of the surgical site. To use it intraoperatively, the model has to be updated during the intervention to reflect the constantly changing environment. Hence, intraoperative sensor data has to be acquired and registered with the preoperative model. Haptic information, which could complement the visual sensor data, is not yet established for this purpose. In addition, biomechanical modeling of the surgical site can help capture changes that intraoperative sensors cannot. We present a setting in which a force sensor is integrated into a laparoscopic instrument. In a test scenario using a silicone liver phantom, we register the measured forces with a surface model reconstructed from stereo endoscopic images and with a finite element model. The endoscope, the instrument and the liver phantom are tracked with a Polaris optical tracking system. By fusing this information, we can transfer the deformation onto the finite element model. The purpose of this setting is to demonstrate the principles needed and the methods developed for intraoperative sensor data fusion. One emphasis lies on the calibration of the force sensor with the instrument, together with first experiments on soft tissue. We also present our solution and first results concerning the integration of the force sensor, as well as the accuracy of the fusion of force measurements, surface reconstruction and biomechanical modeling.
Finite element modelling of the foot for clinical application: A systematic review.
Behforootan, Sara; Chatzistergos, Panagiotis; Naemi, Roozbeh; Chockalingam, Nachiappan
2017-01-01
Over the last two decades finite element modelling has been widely used to give new insight into foot and footwear biomechanics. However, its actual contribution to improving the therapeutic outcome of different pathological conditions of the foot, such as the diabetic foot, remains relatively limited. This is mainly because finite element modelling has only been used within the research domain. Clinically applicable finite element modelling could open the way for novel diagnostic techniques and novel methods for treatment planning/optimisation that would significantly enhance clinical practice. In this context, this review aims to provide an overview of modelling techniques in the field of foot and footwear biomechanics and to investigate their applicability in a clinical setting. Even though no integrated modelling system exists that could be directly used in the clinic, and considerable progress is still required, the current literature includes a comprehensive toolbox for future work towards clinically applicable finite element modelling. The key challenges include collecting the information needed for geometry design, and assigning material properties and loading on a patient-specific basis in a cost-effective and non-invasive way. The ultimate challenge for the implementation of any computational system into clinical practice is to ensure that it can produce reliable results for any person in the population for which it was developed. Consequently, this highlights the need for thorough and extensive validation of each individual step of the modelling process, as well as overall validation of the final integrated system. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Boldrini, Enrico; Schaap, Dick M. A.; Nativi, Stefano
2013-04-01
SeaDataNet implements a distributed pan-European infrastructure for ocean and marine data management whose nodes are maintained by 40 national oceanographic and marine data centers from 35 countries riparian to all European seas. A single portal enables distributed discovery, visualization and access of the available sea data across all the member nodes. Geographic metadata play an important role in such an infrastructure, enabling efficient documentation and discovery of the resources of interest. In particular: - Common Data Index (CDI) metadata describe the sea datasets, including identification information (e.g. product title, area of interest), evaluation information (e.g. data resolution, constraints) and distribution information (e.g. download endpoint, download protocol); - Cruise Summary Report (CSR) metadata describe cruises and field experiments at sea, including identification information (e.g. cruise title, name of the ship) and acquisition information (e.g. instruments used, number of samples taken). In the context of the second phase of SeaDataNet (SeaDataNet 2 EU FP7 project, grant agreement 283607, started on October 1st, 2011 for a duration of 4 years) a major target is the setting, adoption and promotion of common international standards, to the benefit of outreach and interoperability with international initiatives and communities (e.g. OGC, INSPIRE, GEOSS, …). A standardization effort conducted by CNR with the support of MARIS, IFREMER, STFC, BODC and ENEA has led to the creation of an ISO 19115 metadata profile of CDI and its XML encoding based on ISO 19139. The CDI profile is now in its stable version and is being implemented and adopted by the SeaDataNet community tools and software. The effort then continued to produce an ISO-based metadata model and its XML encoding for CSR as well. The metadata elements included in the CSR profile belong to different models: - ISO 19115: e.g. cruise identification information, including title and area of interest; metadata responsible party information - ISO 19115-2: e.g. acquisition information, including date of sampling and instruments used - SeaDataNet: e.g. SeaDataNet community-specific information, including the EDMO and EDMERP code lists. Two main guidelines were followed in drafting the metadata model: - All the obligations and constraints required by both the ISO standards and the INSPIRE directive had to be satisfied. These include the presence of specific elements with given cardinality (e.g. mandatory metadata date stamp, mandatory lineage information). - All the content information of the legacy CSR format had to be supported by the new metadata model. An XML encoding of the CSR profile has been defined as well. Based on the ISO 19139 XML schema and constraints, it adds the new elements specific to the SeaDataNet community. The associated Schematron rules are used to enforce constraints not enforceable with the schema alone and to validate element content against the SeaDataNet code-list vocabularies.
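As an illustration of the validation step described, Schematron rules layered over an XML Schema can be checked programmatically. A minimal sketch using lxml (the file names and rule content are assumptions, not actual SeaDataNet artifacts):

```python
from lxml import etree, isoschematron

# Validate a CSR metadata record: first against the ISO 19139-based
# XML Schema, then against the community Schematron rules that enforce
# constraints the schema alone cannot express (e.g. that a code-list
# value belongs to a SeaDataNet vocabulary).
schema = etree.XMLSchema(etree.parse("csr_profile.xsd"))
schematron = isoschematron.Schematron(etree.parse("csr_rules.sch"),
                                      store_report=True)

record = etree.parse("csr_record.xml")
if not schema.validate(record):
    print(schema.error_log)
elif not schematron.validate(record):
    print(schematron.validation_report)
else:
    print("record is valid against schema and Schematron rules")
```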
Toward designing for trust in database automation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duez, P. P.; Jamieson, G. A.
Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process- and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process. The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We show an example of the application of the AH to automation in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)
Simulation model of an eyeball based on finite element analysis on a supercomputer.
Uchio, E; Ohno, S; Kudoh, J; Aoki, K; Kisielewicz, L T
1999-10-01
A simulation model of the human eye was developed. It was applied to determine the physical and mechanical conditions under which impacting foreign bodies cause intraocular foreign body (IOFB) injuries. Modules of Hypermesh (Altair Engineering, Tokyo, Japan) were used for solid modelling, geometric construction, and finite element mesh creation based on information obtained from cadaver eyes. The simulations were solved on a supercomputer using the finite element analysis (FEA) program PAM-CRASH (Nihon ESI, Tokyo, Japan). It was assumed that rupture occurs at a strain of 18.0% in the cornea and 6.8% in the sclera, and at a stress of 9.4 MPa for both cornea and sclera. Blunt-shaped missiles were set to impact the surface of the cornea or sclera at velocities of 30 and 60 m/s. According to the simulation, the missile sizes above which corneal rupture occurred at velocities of 30 and 60 m/s were 1.95 and 0.82 mm, respectively. The missile sizes causing scleral rupture were 0.95 and 0.75 mm at velocities of 30 and 60 m/s, respectively. These results suggest that this FEA model has potential as a simulation tool for ocular injury and may provide useful information for developing protective measures against industrial and traffic ocular injuries.
NASA Astrophysics Data System (ADS)
Huang, Ying; Bevans, W. J.; Xiao, Hai; Zhou, Zhi; Chen, Genda
2012-04-01
During or after an earthquake, building systems often experience large strains due to shaking, as observed during recent earthquakes, causing permanent inelastic deformation. In addition to the inelastic deformation induced by the earthquake itself, post-earthquake fires associated with shorted electrical systems and leaking gas devices can further strain the already damaged structures, potentially leading to progressive collapse of buildings. Under these harsh conditions, measurements on the affected building by various sensors can provide only limited structural health information. Finite element model analysis, on the other hand, if validated by predesigned experiments, can provide detailed information on the behavior of the entire structure. In this paper, a temperature-dependent nonlinear 3-D finite element model (FEM) of a one-story steel frame is set up in ABAQUS based on the steel material properties cited from EN 1993-1.2 and the AISC manuals. The FEM is validated by testing the modeled steel frame in simulated post-earthquake environments. Comparisons between the FEM analysis and the experimental results show that the FEM predicts the structural behavior of the steel frame in post-earthquake fire conditions reasonably well. With experimental validation, FEM analysis could be used to continuously predict the behavior of critical structures in these harsh environments, to better assist fire fighters in their rescue efforts and save fire victims.
Frandsen, Michael W.; Wessol, Daniel E.; Wheeler, Floyd J.
2001-01-16
Methods and computer-executable instructions are disclosed for developing a dosimetry plan for a treatment volume targeted for irradiation during cancer therapy. The dosimetry plan is available in "real time", which especially enhances clinical use for in vivo applications. Real-time performance is achieved because of the novel geometric model constructed for the planned treatment volume, which, in turn, allows rapid calculations to be performed for simulated movements of particles along particle tracks through it. The particles are exemplary representations of neutrons emanating from a neutron source during BNCT. In a preferred embodiment, a medical image having a plurality of pixels of information representative of a treatment volume is obtained. The pixels are: (i) converted into a plurality of substantially uniform volume elements having substantially the same shape and volume as the pixels; and (ii) arranged into a geometric model of the treatment volume. An anatomical material associated with each uniform volume element is defined and stored. Thereafter, a movement of a particle along a particle track is defined through the geometric model along a primary direction of movement that begins in a starting element of the uniform volume elements and traverses to a next element of the uniform volume elements. The particle movement along the particle track is effectuated in integer-based increments along the primary direction of movement until a position of intersection occurs that represents a condition where the anatomical material of the next element is substantially different from the anatomical material of the starting element. This position of intersection is then useful for indicating whether a neutron has been captured, scattered or has exited from the geometric model. From this intersection, a distribution of radiation doses can be computed for use in the cancer therapy. The foregoing represents an advance in computational time by multiple orders of magnitude.
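The integer-based stepping can be illustrated with a simple fixed-step ray march through a labelled voxel grid that stops at the first element whose material differs from the starting element. A minimal sketch (the grid, step size and materials are illustrative assumptions, not the patented algorithm):

```python
import numpy as np

def march_to_boundary(materials, start, direction, max_steps=10_000):
    """Step a particle along `direction` through a 3-D voxel grid of
    integer material labels until it enters a voxel whose material
    differs from the starting voxel, or exits the grid.
    Returns (position, reason)."""
    pos = np.asarray(start, float)
    step = np.asarray(direction, float)
    step = step / np.abs(step).max()          # integer-like unit step
    home = materials[tuple(pos.astype(int))]  # material of starting element
    for _ in range(max_steps):
        pos = pos + step
        idx = pos.astype(int)
        if np.any(idx < 0) or np.any(idx >= materials.shape):
            return pos, "exited"              # particle left the model
        if materials[tuple(idx)] != home:
            return pos, "boundary"            # capture/scatter decided here
    return pos, "max_steps"

# Example: a 20^3 grid of tissue (0) with a bone slab (1) at z >= 12
grid = np.zeros((20, 20, 20), dtype=int)
grid[:, :, 12:] = 1
print(march_to_boundary(grid, start=(10, 10, 2), direction=(0.1, 0.0, 1.0)))
```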
The Evolution of Tyrosine-Recombinase Elements in Nematoda
Szitenberg, Amir; Koutsovoulos, Georgios; Blaxter, Mark L.; Lunt, David H.
2014-01-01
Transposable elements can be categorised into DNA and RNA elements based on their mechanism of transposition. Tyrosine recombinase elements (YREs) are relatively rare and poorly understood, despite sharing characteristics with both DNA and RNA elements. Previously, the Nematoda have been reported to have a substantially different diversity of YREs compared to other animal phyla: the Dirs1-like YRE retrotransposon was encountered in most animal phyla but not in Nematoda, and a unique Pat1-like YRE retrotransposon has only been recorded from Nematoda. We explored the diversity of YREs in Nematoda by sampling broadly across the phylum and including 34 genomes representing the three classes within Nematoda. We developed a method to isolate and classify YREs based on both feature organization and phylogenetic relationships in an open and reproducible workflow. We also ensured that our phylogenetic approach to YRE classification identified truncated and degenerate elements, informatively increasing the number of elements sampled. We identified Dirs1-like elements (thought to be absent from Nematoda) in the nematode classes Enoplia and Dorylaimia indicating that nematode model species do not adequately represent the diversity of transposable elements in the phylum. Nematode Pat1-like elements were found to be a derived form of another Pat1-like element that is present more widely in animals. Several sequence features used widely for the classification of YREs were found to be homoplasious, highlighting the need for a phylogenetically-based classification scheme. Nematode model species do not represent the diversity of transposable elements in the phylum. PMID:25197791
ERIC Educational Resources Information Center
Scigliano, John A.
1983-01-01
Presents a research-based marketing model consisting of an environmental scanning process, a series of marketing audits, and an information-processing scheme. Views the essential elements of college marketing as information flow; high-level, long-term commitment; diverse strategies; innovation; and a broad view of marketing. Includes a marketing…
Evaluation of the Oregon Business Council-David Douglas Model School District Partnership Program.
ERIC Educational Resources Information Center
Conley, David T.; Stone, Patricia
The Oregon Business Council (OBC)-David Douglas Model District Project was undertaken for two reasons: (1) to create a model for a district's accelerated implementation of all the elements of school reform as mandated in Oregon House Bill 3565; and (2) to learn lessons about school reform that would inform OBC member companies and school districts…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dmitriev, Alexander S.; Yemelyanov, Ruslan Yu.; Moscow Institute of Physics and Technology
The paper deals with a new multi-element processor platform designed for modelling the behaviour of interacting dynamical systems, i.e., an active wireless network. Experimentally, this ensemble is implemented as an active network whose nodes include direct chaotic transceivers and special actuator boards containing microcontrollers for modelling the dynamical systems and an information display unit (colored LEDs). The modelling technique and experimental results are described and analyzed.
NASA Astrophysics Data System (ADS)
Cardenas, Jesus Alvaro
An energy and environmental crisis will emerge throughout the world if we continue our current practices for generating and distributing electricity. A possible solution to this problem is based on the Smart Grid concept, which is heavily influenced by Information and Communication Technology (ICT). Although the electricity industry is mostly regulated, there are global models used as roadmaps for Smart Grid implementation focusing on technologies and the basic generation-transmission-distribution model. This project aims to further enhance a business model for future global deployment. It takes into consideration the many factors interacting in the energy provision process, based on the diffusion of technologies, literature surveys of documents available on the Internet, and peer-reviewed publications. Tariffs and regulations, distributed energy generation, integration of service providers, consumers becoming producers, self-healing devices, and many other elements are shifting this industry towards liberalization and deregulation, in a sector that has been heavily protected by government because of the importance of electricity to consumers. We propose an Energy Management Business Model composed of four basic elements: Supply Chain, Information and Communication Technology (ICT), Stakeholder Response, and the resulting Green Efficient Energy (GEE). We support the model with the literature survey and a diffusion analysis of these elements, and we validate it with two surveys, one of peers and professionals and the other of experts in the field, based on the Carnegie Mellon Smart Grid Maturity Model (CMU SEI SGMM). The contribution of this model is a simple path to follow for entities that want to achieve environmentally friendly energy with the involvement of technology and all stakeholders.
Nguyen, Quoc Dinh; Fernandez, Nicolas; Karsenti, Thierry; Charlin, Bernard
2014-12-01
Although reflection is considered a significant component of medical education and practice, the literature does not provide a consensual definition or model for it. Because reflection has taken on multiple meanings, it remains difficult to operationalise. A standard definition and model are needed to improve the development of practical applications of reflection. This study was conducted in order to identify, explore and analyse the most influential conceptualisations of reflection, and to develop a new theory-informed and unified definition and model of reflection. A systematic review was conducted to identify the 15 most cited authors in papers on reflection published during the period from 2008 to 2012. The authors' definitions and models were extracted. An exploratory thematic analysis was carried out and identified seven initial categories. Categories were clustered and reworded to develop an integrative definition and model of reflection, which feature core components that define reflection and extrinsic elements that influence instances of reflection. Following our review and analysis, five core components of reflection and two extrinsic elements were identified as characteristics of the reflective thinking process. Reflection is defined as the process of engaging the self (S) in attentive, critical, exploratory and iterative (ACEI) interactions with one's thoughts and actions (TA), and their underlying conceptual frame (CF), with a view to changing them and a view on the change itself (VC). Our conceptual model consists of the defining core components, supplemented with the extrinsic elements that influence reflection. This article presents a new theory-informed, five-component definition and model of reflection. We believe these have advantages over previous models in terms of helping to guide the further study, learning, assessment and teaching of reflection. © 2014 John Wiley & Sons Ltd.
A Finite-Element Method Model of Soft Tissue Response to Impulsive Acoustic Radiation Force
Palmeri, Mark L.; Sharma, Amy C.; Bouchard, Richard R.; Nightingale, Roger W.; Nightingale, Kathryn R
2010-01-01
Several groups are studying acoustic radiation force and its ability to image the mechanical properties of tissue. Acoustic radiation force impulse (ARFI) imaging is one modality using standard diagnostic ultrasound scanners to generate localized, impulsive, acoustic radiation forces in tissue. The dynamic response of tissue is measured via conventional ultrasonic speckle-tracking methods and provides information about the mechanical properties of tissue. A finite-element method (FEM) model has been developed that simulates the dynamic response of tissues, with and without spherical inclusions, to an impulsive acoustic radiation force excitation from a linear array transducer. These FEM models were validated with calibrated phantoms. Shear wave speed, and therefore elasticity, dictates tissue relaxation following ARFI excitation, but Poisson’s ratio and density do not significantly alter tissue relaxation rates. Increased acoustic attenuation in tissue increases the relative amount of tissue displacement in the near field compared with the focal depth, but relaxation rates are not altered. Applications of this model include improving image quality, and distilling material and structural information from tissue’s dynamic response to ARFI excitation. Future work on these models includes incorporation of viscous material properties and modeling the ultrasonic tracking of displaced scatterers. PMID:16382621
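The claim that shear wave speed dictates elasticity rests on standard linear elastic theory; for a nearly incompressible soft tissue, the textbook relations are as follows (general background, not formulas taken from the paper itself):

```latex
% Shear wave speed c_s in a linear elastic solid of density \rho
% and shear modulus \mu; Young's modulus E for a nearly
% incompressible material (\nu \approx 0.5):
c_s = \sqrt{\mu / \rho}
  \quad\Longrightarrow\quad
\mu = \rho\, c_s^{2},
\qquad
E = 2(1+\nu)\,\mu \approx 3\,\mu = 3\,\rho\, c_s^{2}
```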
Predicting mortality over different time horizons: which data elements are needed?
Goldstein, Benjamin A; Pencina, Michael J; Montez-Rath, Maria E; Winkelmayer, Wolfgang C
2017-01-01
Electronic health records (EHRs) are a resource for "big data" analytics, containing a variety of data elements. We investigate how different categories of information contribute to prediction of mortality over different time horizons among patients undergoing hemodialysis treatment. We derived prediction models for mortality over 7 time horizons from EHR data on older patients from a national chain of dialysis clinics linked with administrative data, using LASSO (least absolute shrinkage and selection operator) regression. We assessed how different categories of information relate to risk assessment and compared discrete models to time-to-event models. The best-performing models used all the available data (c-statistics ranged from 0.72 to 0.76), with stronger models in the near term. While different variable groups showed different utility, exclusion of any particular group did not lead to a meaningfully different risk assessment. Discrete-time models performed better than time-to-event models. Different variable groups were predictive over different time horizons, with vital signs most predictive for near-term mortality and demographics and comorbidities more important for long-term mortality. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
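A discrete-time model of the kind described can be approximated with L1-penalised logistic regression, one model per horizon. A minimal sketch with scikit-learn on synthetic data (the feature blocks and horizons are illustrative, not the study's variables):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, p = 5000, 40
X = rng.normal(size=(n, p))          # stand-ins for vitals, labs, comorbidities
horizons = [30, 90, 180, 365]        # days; the study used 7 horizons

for h in horizons:
    # Synthetic outcome: earlier horizons depend more on the "vital sign"
    # block (first 10 columns), later ones on the "comorbidity" block.
    w = np.zeros(p)
    w[:10] = 1.0 / np.sqrt(h)
    w[10:20] = np.sqrt(h) / 40.0
    y = (X @ w + rng.logistic(size=n) > 2.0).astype(int)

    # LASSO-style fit: the L1 penalty zeroes out uninformative features.
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    model.fit(X, y)
    auc = roc_auc_score(y, model.decision_function(X))
    print(f"{h:>3}-day model: c-statistic {auc:.2f}, "
          f"{np.count_nonzero(model.coef_)} features retained")
```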
The PDS4 Information Model and its Role in Agile Science Data Curation
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D.
2017-12-01
PDS4 is an information model-driven service architecture supporting the capture, management, distribution and integration of massive planetary science data held in distributed data archives world-wide. The PDS4 Information Model (IM), the core element of the architecture, was developed using lessons learned from 20 years of archiving planetary science data and best practices for information model development. The foundational principles were adopted from the Open Archival Information System (OAIS) Reference Model (ISO 14721), the Metadata Registry Specification (ISO/IEC 11179), and W3C XML (Extensible Markup Language) specifications. These provided, respectively, an object-oriented model for archive information systems, a comprehensive schema for data dictionaries and hierarchical governance, and rules for encoding documents electronically. The PDS4 Information Model is unique in that it drives the PDS4 infrastructure by providing the representation of concepts and their relationships, constraints, rules, and operations; a sharable, stable, and organized set of information requirements; and machine-parsable definitions that are suitable for configuring and generating code. This presentation will provide an overview of the PDS4 Information Model and how it is being leveraged to develop and evolve the PDS4 infrastructure and enable agile curation of over 30 years of science data collected by the international planetary science community.
Artificial Neural Networks for Processing Graphs with Application to Image Understanding: A Survey
NASA Astrophysics Data System (ADS)
Bianchini, Monica; Scarselli, Franco
In graphical pattern recognition, each datum is represented as an arrangement of elements that encodes both the properties of each element and the relations among them. Hence, patterns are modelled as labelled graphs where, in general, labels can be attached to both nodes and edges. Artificial neural networks able to process graphs are a powerful tool for addressing a great variety of real-world problems, where the information is naturally organized in entities and relationships among entities; in fact, they have been widely used in computer vision, for instance in logo recognition, in similarity retrieval, and for object detection. In this chapter, we propose a survey of neural network models able to process structured information, with a particular focus on those architectures tailored to image understanding applications. Starting from the original recursive model (RNNs), we subsequently present different ways to represent images - by trees, forests of trees, multiresolution trees, directed acyclic graphs with labelled edges, and general graphs - and, correspondingly, the neural network architectures appropriate to process such structures.
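The recursive model processes a labelled directed acyclic graph bottom-up: each node's state is a function of its own label and its children's states. A minimal numpy sketch (the dimensions, weights and toy tree are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
LABEL, STATE = 4, 8
W_label = rng.normal(scale=0.1, size=(STATE, LABEL))
W_child = rng.normal(scale=0.1, size=(STATE, STATE))

def node_state(label, child_states):
    """State of a node = tanh(W_label @ label + sum of transformed
    child states); leaves simply have no child contribution."""
    h = W_label @ label
    for c in child_states:
        h = h + W_child @ c
    return np.tanh(h)

# Toy tree: a root with two leaves, e.g. an image region adjacency tree.
leaf1 = node_state(rng.normal(size=LABEL), [])
leaf2 = node_state(rng.normal(size=LABEL), [])
root = node_state(rng.normal(size=LABEL), [leaf1, leaf2])
print(root.shape)  # (8,) -- the root state feeds an output layer
```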
Static and dynamic characteristics of a piezoceramic strut
NASA Technical Reports Server (NTRS)
Pokines, Brett J.; Belvin, W. Keith; Inman, Daniel J.
1993-01-01
The experimental study of a piezoceramic active truss is presented. This active strut is unique in that its piezoceramic configuration makes the stroke length depend not on the piezoceramic material's expansion range but on the deflection range of the piezoceramic bender segment. A finite element model of a piezoceramic strut segment was constructed. Piezoceramic actuation was simulated using thermally induced strains. This model yielded information on the stiffness and force range of a bender element. The static and dynamic properties of the strut were identified experimentally. Feedback control was used to vary the stiffness of the strut. The experimentally verified model was used to explore implementation possibilities for the strut.
Elements of effective palliative care models: a rapid review
2014-01-01
Background Population ageing, changes to the profiles of life-limiting illnesses and evolving societal attitudes prompt a critical evaluation of models of palliative care. We set out to identify evidence-based models of palliative care to inform policy reform in Australia. Method A rapid review of electronic databases and the grey literature was undertaken over an eight week period in April-June 2012. We included policy documents and comparative studies from countries within the Organisation for Economic Co-operation and Development (OECD) published in English since 2001. Meta-analysis was planned where >1 study met criteria; otherwise, synthesis was narrative using methods described by Popay et al. (2006). Results Of 1,959 peer-reviewed articles, 23 reported systematic reviews, 9 additional RCTs and 34 non-randomised comparative studies. Variation in the content of models, contexts in which these were implemented and lack of detailed reporting meant that elements of models constituted a more meaningful unit of analysis than models themselves. Case management was the element most consistently reported in models for which comparative studies provided evidence for effectiveness. Essential attributes of population-based palliative care models identified by policy and addressed by more than one element were communication and coordination between providers (including primary care), skill enhancement, and capacity to respond rapidly to individuals’ changing needs and preferences over time. Conclusion Models of palliative care should integrate specialist expertise with primary and community care services and enable transitions across settings, including residential aged care. The increasing complexity of care needs, services, interventions and contextual drivers warrants future research aimed at elucidating the interactions between different components and the roles played by patient, provider and health system factors. The findings of this review are limited by its rapid methodology and focus on model elements relevant to Australia’s health system. PMID:24670065
Elements of a Green Infrastructure Maintenance Business Plan for Milwaukee WI
This report reflects the feedback provided by MMSD and local stakeholders about different business models for conducting maintenance. The findings from this process will inform a maintenance plan for the region.
Modeling of digital information optical encryption system with spatially incoherent illumination
NASA Astrophysics Data System (ADS)
Bondareva, Alyona P.; Cheremkhin, Pavel A.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.; Starikov, Sergey N.
2015-10-01
State-of-the-art micromirror DMD spatial light modulators (SLMs) offer unprecedented frame rates of up to 30,000 frames per second. This, in conjunction with a high-speed digital camera, should allow a high-speed optical encryption system to be built. Results of modeling a digital information optical encryption system with spatially incoherent illumination are presented. Input information is displayed on the first SLM and the encryption element on the second SLM. Factors taken into account are: resolution of the SLMs and camera, hologram reconstruction noise, camera noise and signal sampling. Results of numerical simulation demonstrate high speed (several gigabytes per second), low bit error rate and high cryptographic strength.
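With spatially incoherent illumination the optical system is linear in intensity, so encryption can be modelled as convolution of the input image with the key's point spread function. A minimal FFT-based sketch (the random-phase key, sizes and regulariser are illustrative assumptions, not the authors' system):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 256

img = np.zeros((N, N))
img[96:160, 96:160] = 1.0                     # binary "data page" on SLM 1

# Encryption key on SLM 2: a random diffuser; its intensity PSF is
# what an incoherent system actually convolves with.
key_phase = np.exp(1j * 2 * np.pi * rng.random((N, N)))
psf = np.abs(np.fft.ifft2(key_phase)) ** 2
psf /= psf.sum()

# Incoherent encryption: intensity convolution (cyclic, via FFT).
OTF = np.fft.fft2(psf)
encrypted = np.real(np.fft.ifft2(np.fft.fft2(img) * OTF))

# Decryption by inverse filtering, with a small regulariser to keep
# the division stable where the OTF is near zero.
decrypted = np.real(np.fft.ifft2(np.fft.fft2(encrypted) *
                                 np.conj(OTF) / (np.abs(OTF) ** 2 + 1e-6)))
print(np.abs(decrypted - img).max())          # small reconstruction error
```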
Kärtner, Joscha
2018-04-01
Basic elements of prosociality, namely (pro)social cognition, motivation, and prosocial behavior, emerge during the first and second year of life. These elements are rooted in biological predispositions and the developmental system is complemented by caregivers' structuring. By structuring, (m)others integrate toddlers' unrefined (pro)social sentiments and behavioral inclinations into coherent patterns and align toddlers' experience and behavior with the population's cultural model. These cultural models specify target states for appropriate affective, motivational and behavioral responses regarding toddlers' prosociality and these target states, in turn, inform (m)others' appraisal and guide their structuring. The experiences that toddlers make in these social interactions have important implications for how the basic elements of prosociality are refined and further develop. Copyright © 2017 Elsevier Ltd. All rights reserved.
Innovation in stem cell advocacy: you only get what you can measure.
Jakimo, Alan L; Fernandez, Alan C
2011-11-01
We propose that stem cell advocacy must engage in self-analysis to determine how to be maximally effective. For this analysis, eight advocacy elements can be measured: agitation, legislation, regulation, litigation, policy development, collaboration, education and innovation. For several of these elements, we show that stem cell advocates, particularly advocates for human embryonic stem cell research, have been matched by their opponents. This demonstrates the need for combining innovation and collaboration with advocacy-oriented education. To pursue innovative and collaborative education, we propose a 'bench-to-public knowledge' model and present some preliminary observations made with this model for different stem cell types. We also propose development of a semantic web information system to be operated within Internet Cloud/Apps/Social Media. We call this system the 'Stem Cell Information Technology Accelerator Platform'. Toward its construction, we propose formation of a working group to conceive semantic web ontology for stem cell science and its clinical translation into medicine. This ontology would function as a map of the relationships between and among the various informational components comprising discourse on stem cell research and its clinical translation, and would allow various stakeholders to contribute to evolving models of that science and translation. These models could, in turn, support an innovative and collaborative approach to education in furtherance of stem cell advocacy.
NASA Astrophysics Data System (ADS)
Abramov, G. V.; Emeljanov, A. E.; Ivashin, A. L.
Theoretical bases for modeling a digital control system with information transfer via a multiple-access channel and a regular quantization cycle are presented. The theory of dynamic systems with random structure changes, including elements of Markov random process theory, is used for the mathematical description of a networked control system. The characteristics of such control systems are derived. Experimental research on these control systems has been carried out.
Design Through Manufacturing: The Solid Model-Finite Element Analysis Interface
NASA Technical Reports Server (NTRS)
Rubin, Carol
2002-01-01
State-of-the-art computer-aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts that reflect every detail of the finished product. Ideally, in the aerospace industry, these models should fulfill two very important functions: (1) provide numerical control information for automated manufacturing of precision parts, and (2) enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in aircraft and space vehicles. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. Presently, the process of preparing CAD models for FEA consumes a great deal of the analyst's time.
Business model framework applications in health care: A systematic review.
Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl
2017-11-01
It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.
Stable Isotope Mixing Models as a Tool for Tracking Sources of Water and Water Pollutants
One goal of monitoring pollutants is to be able to trace the pollutant to its source. Here we review how mixing models using stable isotope information on water and water pollutants can help accomplish this goal. A number of elements exist in multiple stable (non-radioactive) i...
The Origin of Noble Gas Isotopic Heterogeneity in Icelandic Basalts
NASA Technical Reports Server (NTRS)
Dixon, E. T.; Honda, M.; McDougall, I.
2001-01-01
Two models for the generation of heterogeneous He, Ne and Ar isotopic ratios in Icelandic basalts are evaluated using a mixing model and the observed noble gas elemental ratios in Icelandic basalts, ocean island basalts (OIBs) and mid-ocean ridge basalts (MORBs). Additional information is contained in the original extended abstract.
Development of a conceptual model of cancer caregiver health literacy.
Yuen, E Y N; Dodson, S; Batterham, R W; Knight, T; Chirgwin, J; Livingston, P M
2016-03-01
Caregivers play a vital role in caring for people diagnosed with cancer. However, little is understood about caregivers' capacity to find, understand, appraise and use information to improve health outcomes. The study aimed to develop a conceptual model that describes the elements of cancer caregiver health literacy. Six concept mapping workshops were conducted with 13 caregivers, 13 people with cancer and 11 healthcare providers/policymakers. An iterative, mixed methods approach was used to analyse and synthesise workshop data and to generate the conceptual model. Six major themes and 17 subthemes were identified from 279 statements generated by participants during concept mapping workshops. Major themes included: access to information, understanding of information, relationship with healthcare providers, relationship with the care recipient, managing challenges of caregiving and support systems. The study extends conceptualisations of health literacy by identifying factors specific to caregiving within the cancer context. The findings demonstrate that caregiver health literacy is multidimensional, includes a broad range of individual and interpersonal elements, and is influenced by broader healthcare system and community factors. These results provide guidance for the development of: caregiver health literacy measurement tools; strategies for improving health service delivery, and; interventions to improve caregiver health literacy. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Santagati, Cettina; Lo Turco, Massimiliano
2017-01-01
In recent years, we have witnessed a huge diffusion of building information modeling (BIM) approaches in the field of architectural design, although very little research has been undertaken to explore the value, criticalities, and advantages of applying these methodologies in the cultural heritage domain. Furthermore, the latest developments in digital photogrammetry allow the easy generation of reliable low-cost three-dimensional textured models that could be used in BIM platforms to create semantic-aware objects composing a specific library of historical architectural elements. In this case, the transfer between the point cloud and its corresponding parametric model is not trivial, and the level of geometric abstraction may not suit the scope of the BIM. The aim of this paper is to explore and retrace the milestone works on this crucial topic in order to identify the unsolved issues, and to propose and test a simple, practitioner-centered workflow based on the latest available solutions for managing point clouds in commercial BIM platforms.
Standardized reporting for rapid relative effectiveness assessments of pharmaceuticals.
Kleijnen, Sarah; Pasternack, Iris; Van de Casteele, Marc; Rossi, Bernardette; Cangini, Agnese; Di Bidino, Rossella; Jelenc, Marjetka; Abrishami, Payam; Autti-Rämö, Ilona; Seyfried, Hans; Wildbacher, Ingrid; Goettsch, Wim G
2014-11-01
Many European countries perform rapid assessments of the relative effectiveness (RE) of pharmaceuticals as part of the reimbursement decision making process. Increased sharing of information on RE across countries may save costs and reduce duplication of work. The objective of this article is to describe the development of a tool for rapid assessment of RE of new pharmaceuticals that enter the market, the HTA Core Model® for Rapid Relative Effectiveness Assessment (REA) of Pharmaceuticals. Eighteen member organisations of the European Network of Health Technology Assessment (EUnetHTA) participated in the development of the model. Different versions of the model were developed and piloted in this collaboration and adjusted accordingly based on feedback on the content and feasibility of the model. The final model deviates from the traditional HTA Core Model® used for assessing other types of technologies. This is due to the limited scope (strong focus on RE), the timing of the assessment (just after market authorisation), and strict timelines (e.g. 90 days) required for performing the assessment. The number of domains and assessment elements was limited and it was decided that the primary information sources should preferably be a submission file provided by the marketing authorisation holder and the European Public Assessment Report. The HTA Core Model® for Rapid REA (version 3.0) was developed to produce standardised transparent RE information of pharmaceuticals. Further piloting can provide input for possible improvements, such as further refining the assessment elements and new methodological guidance on relevant areas.
On domain modelling of the service system with its application to enterprise information systems
NASA Astrophysics Data System (ADS)
Wang, J. W.; Wang, H. F.; Ding, J. L.; Furuta, K.; Kanno, T.; Ip, W. H.; Zhang, W. J.
2016-01-01
Information systems are a kind of service system, and they run through every element of a modern industrial and business system, much like blood in the body. Types of information systems are heterogeneous because of the extreme uncertainty of change in modern industrial and business systems. To effectively manage information systems, modelling of the work domain (or domain) of information systems is necessary. In this paper, a domain modelling framework for the service system is proposed and its application to the enterprise information system is outlined. The framework is defined based on the application of a general domain modelling tool called function-context-behaviour-principle-state-structure (FCBPSS). The FCBPSS is based on a set of core concepts, namely function, context, behaviour, principle, state and structure, together with system decomposition. Different from many other applications of FCBPSS in systems engineering, the FCBPSS is here applied to both infrastructure and substance systems, which is novel and effective for the modelling of service systems, including enterprise information systems. It is to be noted that domain modelling of systems (e.g. enterprise information systems) is key to the integration of heterogeneous systems and to coping with unanticipated situations facing such systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janney, Dawn E.; Papesch, Cynthia A.; Burkes, Douglas E.
This is not a typical external report; it is a handbook (Parts 1 and 2) and has no abstract. The Metallic Fuels Handbook summarizes currently available information about phases and phase diagrams, heat capacity, thermal expansion, and thermal conductivity of elements and alloys in the U-Pu-Zr-Np-Am-La-Ce-Pr-Nd system. Although many sections are reviews and updates of material in previous versions of the Handbook [1, 2], this revision is the first to include alloys with four or more elements. In addition to presenting information about materials properties, the handbook attempts to indicate how well each property is known and how much variation exists between measurements. Although it includes some results from models, its primary focus is experimental data.
Analyzing C2 Structures and Self-Synchronization with Simple Computational Models
2011-06-01
Paper presented at the 16th ICCRTS, "Collective C2 in Multinational Civil-Military Operations". The Kuramoto model, though with some serious limitations, provides a representation of information flow and self-synchronization.
Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)
NASA Technical Reports Server (NTRS)
Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei
2014-01-01
NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink® (The MathWorks, Inc.) environment. These elements, along with a Newton Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera were developed using MATLAB native M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.
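Cantera's Python interface illustrates the kind of mixture-property calculation that the integration substitutes for tabular data. A short usage example (the GRI-Mech 3.0 input file ships with Cantera; the mixture is illustrative, and the T-MATS-side wiring is omitted):

```python
import cantera as ct

# Define a fuel/air mixture by temperature, pressure and mole fractions.
gas = ct.Solution("gri30.yaml")             # GRI-Mech 3.0 mechanism
gas.TPX = 300.0, ct.one_atm, "CH4:1, O2:2, N2:7.52"

# Thermodynamic properties for any user-defined mixture, no tables needed.
print("enthalpy [J/kg]:", gas.enthalpy_mass)
print("cp     [J/kg/K]:", gas.cp_mass)

# Equilibrate at constant enthalpy and pressure (adiabatic flame state).
gas.equilibrate("HP")
print("adiabatic flame temperature [K]:", gas.T)
```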
NASA Astrophysics Data System (ADS)
García, E.; Oliver, A.; Diaz, O.; Diez, Y.; Gubern-Mérida, A.; Martí, R.; Martí, J.
2017-03-01
Patient-specific finite element (FE) models of the breast have received increasing attention due to their potential for fusing images from different modalities. During the Magnetic Resonance Imaging (MRI) to X-ray mammography registration procedure, the FE model is compressed, mimicking the mammographic acquisition. Subsequently, suspicious lesions in the MRI volume can be projected into the 2D mammographic space. However, most registration algorithms do not provide the reverse mapping, preventing recovery of the 3D geometrical information of lesions localized in the mammograms. In this work we introduce a fast method to localize the 3D position of a lesion within the MRI, using both cranio-caudal (CC) and medio-lateral oblique (MLO) mammographic projections and indexing the tetrahedral elements of the biomechanical model by means of a uniform grid. For each marked lesion in the Full-Field Digital Mammogram (FFDM), the X-ray path from source to marker is calculated. Barycentric coordinates are computed in the tetrahedra traversed by the ray. The list of elements and coordinates allows two curves to be localized within the MRI, and the closest point between the two curves is taken as the 3D position of the lesion. The registration errors obtained in the mammographic space are 9.89 +/- 3.72 mm in CC- and 8.04 +/- 4.68 mm in MLO-projection, and the error in the 3D MRI space is 10.29 +/- 3.99 mm. The uniform grid is computed in 0.1 to 0.7 seconds. The average time spent to compute the 3D location of a lesion is about 8 ms.
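The element lookup hinges on barycentric coordinates: a point lies inside a tetrahedron exactly when all four coordinates are non-negative. A minimal sketch of the test and a sampled ray walk (a hedged illustration, not the authors' implementation):

```python
import numpy as np

def barycentric(p, a, b, c, d):
    """Barycentric coordinates of point p in tetrahedron (a, b, c, d),
    solved from the linear system [b-a, c-a, d-a] @ (w1, w2, w3) = p - a;
    w0 = 1 - w1 - w2 - w3."""
    T = np.column_stack((b - a, c - a, d - a))
    w123 = np.linalg.solve(T, p - a)
    return np.concatenate(([1.0 - w123.sum()], w123))

def inside(p, tet, tol=1e-12):
    """True if p lies inside (or on the boundary of) the tetrahedron."""
    return np.all(barycentric(p, *tet) >= -tol)

# Example: sample an X-ray path at small steps and record the samples
# that fall inside a given element (a grid would prune candidates).
tet = [np.array(v, float) for v in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]]
src, direction = np.array([0.1, 0.1, -1.0]), np.array([0.0, 0.0, 1.0])
hits = [t for t in np.linspace(0, 2, 201)
        if inside(src + t * direction, tet)]
print(f"ray samples inside element: {len(hits)}")
```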
Use of EBSD Data in Numerical Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, R; Wiland, H
2000-01-14
Experimentation, theory and modeling have all played vital roles in defining what is known about microstructural evolution and the effects of microstructure on material properties. Recently, technology has become an enabling factor, allowing significant advances to be made on several fronts. Experimental evidence of crystallographic slip and the basic theory of crystal plasticity were established in the early 20th century, and the theory and models evolved incrementally over the next 60 years. (Asaro provides a comprehensive review of the mechanisms and basic plasticity models.) During this time modeling was primarily concerned with the average response of polycrystalline aggregates. While some detailed finite element modeling (FEM) with crystal plasticity constitutive relations was done in the early 1980s, such simulations overtaxed the capabilities of the available computer hardware. Advances in computer capability led to a flurry of finite element modeling activity in the next 10 years, increasing understanding of microstructure evolution and pushing the limits of theories and material characterization. Automated electron backscatter diffraction (EBSD) has produced a similar revolution in material characterization. The data collected are extensive, and many questions about the evolution of microstructure and its role in determining mechanical properties can now be addressed. It is also now possible to obtain sufficient information about lattice orientations on a fine enough scale to allow detailed quantitative comparisons of experiments and newly emerging large-scale numerical simulations. The insight gained from the coupling of EBSD and FEM studies will provide impetus for further development of microstructure models and theories of microstructure evolution. Early studies connecting EBSD data to finite element models used manual measurements to define initial orientations for the simulation. In one study, manual measurements of the deformed structure were also obtained for comparison with the model predictions. More recent work has taken advantage of automated data collection on deformed specimens as a means of collecting detailed and spatially correlated data for model validation. Although it will not be discussed in detail here, another area in which EBSD data is having a great impact is recrystallization modeling. EBSD techniques can be used to collect data for quantitative microstructural analysis. This data can be used to infer growth kinetics of specific orientations, and this information can be synthesized into more accurate grain growth or recrystallization models. Another role which EBSD techniques may play is in determining initial structures for recrystallization models. A realistic starting structure is vital for evaluating the models, and attempts at predicting realistic structures with finite element simulations are not yet successful. As methodologies and equipment resolution continue to improve, it is possible that measured structures will serve as input for recrystallization models. Simulations have already been run using information obtained manually from TEM.
Diagnostic Evaluation of Carbon Sources in CMAQ
Traditional monitoring networks measure only total elemental carbon (EC) and organic carbon (OC) routinely. Diagnosing model biases with such limited information is difficult. Measurements of organic tracer compounds have recently become available and allow for more detailed di...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rim, Jung H.; Kuhn, Kevin J.; Tandon, Lav
Nuclear forensics techniques, including micro-XRF, gamma spectrometry, trace elemental analysis and isotopic/chronometric characterization, were used to interrogate two potentially related plutonium metal foils. These samples were submitted for analysis with only limited production information, and a comprehensive suite of forensic analyses was performed. The resulting analytical data were paired with available reactor model and historical information to provide insight into the materials' properties, origins, and likely intended uses. Both were super-grade plutonium, containing less than 3% 240Pu, and age-dating suggested that the most recent chemical purification occurred in 1948 and 1955 for the respective metals. Additional consideration of reactor modelling feedback and trace elemental observables indicates plausible U.S. reactor origin associated with the Hanford site production efforts. In conclusion, based on this investigation, the most likely intended use for these plutonium foils was as 239Pu fission foil targets for physics experiments, such as cross-section measurements.
NASA Technical Reports Server (NTRS)
Meyer, Marit Elisabeth
2015-01-01
A thermal precipitator (TP) was designed to collect smoke aerosol particles for microscopic analysis in fire characterization research. Information on particle morphology, size and agglomerate structure obtained from these tests supplements additional aerosol data collected. Modeling of the thermal precipitator throughout the design process was performed with the COMSOL Multiphysics finite element software package, including the Eulerian flow field and thermal gradients in the fluid. The COMSOL Particle Tracing Module was subsequently used to determine particle deposition. Modeling provided optimized design parameters such as geometry, flow rate and temperatures. The thermal precipitator was built and testing verified the performance of the first iteration of the device. The thermal precipitator was successfully operated and provided quality particle samples for microscopic analysis, which furthered the body of knowledge on smoke particulates. This information is a key element of smoke characterization and will be useful for future spacecraft fire detection research.
The group engagement model: procedural justice, social identity, and cooperative behavior.
Tyler, Tom R; Blader, Steven L
2003-01-01
The group engagement model expands the insights of the group-value model of procedural justice and the relational model of authority into an explanation for why procedural justice shapes cooperation in groups, organizations, and societies. It hypothesizes that procedures are important because they shape people's social identity within groups, and social identity in turn influences attitudes, values, and behaviors. The model further hypothesizes that resource judgments exercise their influence indirectly by shaping social identity. This social identity mediation hypothesis explains why people focus on procedural justice, and in particular on procedural elements related to the quality of their interpersonal treatment, because those elements carry the most social identity-relevant information. In this article, we review several key insights of the group engagement model, relate these insights to important trends in psychological research on justice, and discuss implications of the model for the future of procedural justice research.
Using graph approach for managing connectivity in integrative landscape modelling
NASA Astrophysics Data System (ADS)
Rabotin, Michael; Fabre, Jean-Christophe; Libres, Aline; Lagacherie, Philippe; Crevoisier, David; Moussa, Roger
2013-04-01
In cultivated landscapes, many landscape elements such as field boundaries, ditches or banks strongly impact water flows and mass and energy fluxes. At the watershed scale, these impacts are strongly conditioned by the connectivity of these landscape elements. An accurate representation of these elements and of their complex spatial arrangements is therefore of great importance for modelling and predicting these impacts. We developed, in the framework of the OpenFLUID platform (Software Environment for Modelling Fluxes in Landscapes), a digital landscape representation that takes into account the spatial variability and connectivity of diverse landscape elements through the application of graph theory concepts. The proposed landscape representation considers spatial units connected together to represent flux exchanges or any other information exchanges. Each spatial unit of the landscape is represented as a node of a graph and relations between units as graph connections. The connections are of two types - parent-child connections and up/downstream connections - which allows OpenFLUID to handle hierarchical graphs. Connections can also carry information, and the graph can evolve during simulation (modification of connections or elements). This graph approach allows greater genericity in landscape representation, management of complex connections, and easier development of new landscape representation algorithms. Graph management is fully operational in OpenFLUID for developers and modelers, and several graph tools are available, such as graph traversal algorithms and graph displays. The graph representation can be managed i) manually by the user (for example in simple catchments) through XML-based files in an easily editable and readable format, or ii) by using methods of the OpenFLUID-landr library, an OpenFLUID library relying on common open-source spatial libraries (the ogr vector, geos topologic vector and gdal raster libraries). The OpenFLUID-landr library has been developed i) to be usable without GIS expert skills (common GIS formats can be read and simplified spatial management is provided), ii) to make it easy to develop rules of landscape discretization and graph creation adapted to spatialized model requirements, and iii) to allow model developers to manage dynamic and complex spatial topology. Graph management in OpenFLUID is illustrated with i) examples of hydrological modelling of complex farmed landscapes and ii) the new implementation of the Geo-MHYDAS tool based on the OpenFLUID-landr library, which discretizes a landscape and creates the graph structure required by the MHYDAS model.
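A minimal sketch of the two-relation graph structure described above, written with the networkx package rather than OpenFLUID's own API (unit names and the subgraph step are illustrative):

```python
# Illustrative sketch (not OpenFLUID code): landscape units as graph nodes
# with the two connection types, parent-child and up/downstream.
import networkx as nx

G = nx.DiGraph()
# Spatial units as nodes, typed by landscape element.
for unit, kind in [("field1", "field"), ("field2", "field"),
                   ("ditch1", "ditch"), ("subcatchment1", "subcatchment")]:
    G.add_node(unit, kind=kind)

# Up/downstream connections carry the flux exchanges...
G.add_edge("field1", "ditch1", relation="downstream")
G.add_edge("field2", "ditch1", relation="downstream")
# ...while parent-child connections express the hierarchy.
G.add_edge("subcatchment1", "field1", relation="child")
G.add_edge("subcatchment1", "field2", relation="child")
G.add_edge("subcatchment1", "ditch1", relation="child")

# A graph traversal restricted to hydrological (downstream) edges:
hydro = G.edge_subgraph(
    [(u, v) for u, v, r in G.edges(data="relation") if r == "downstream"])
print(list(nx.topological_sort(hydro)))  # e.g. ['field1', 'field2', 'ditch1']
```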
Toward a digital library strategy for a National Information Infrastructure
NASA Technical Reports Server (NTRS)
Coyne, Robert A.; Hulen, Harry
1993-01-01
Bills currently before the House and Senate would give support to the development of a National Information Infrastructure, in which digital libraries and storage systems would be an important part. A simple model is offered to show the relationship of storage systems, software, and standards to the overall information infrastructure. Some elements of a national strategy for digital libraries are proposed, based on the mission of the nonprofit National Storage System Foundation.
Symbolic Knowledge Processing for the Acquisition of Expert Behavior: A Study in Medicine.
1984-05-01
information . It provides a model for this type of study, suggesting a different approach to the problem of learning and efficiency of knowledge -based...flow of information 2.2. Scope and description of the subsystems Three subsystems perform distinct operations using the preceding knowledge sources...which actually yields a new knowledge rCpresentation Ahere new external information is encoded in the combination and ordering of elements of the
Nonlocal and Mixed-Locality Multiscale Finite Element Methods
Costa, Timothy B.; Bond, Stephen D.; Littlewood, David J.
2018-03-27
In many applications the resolution of small-scale heterogeneities remains a significant hurdle to robust and reliable predictive simulations. In particular, while material variability at the mesoscale plays a fundamental role in processes such as material failure, the resolution required to capture mechanisms at this scale is often computationally intractable. Multiscale methods aim to overcome this difficulty through judicious choice of a subscale problem and a robust manner of passing information between scales. One promising approach is the multiscale finite element method, which increases the fidelity of macroscale simulations by solving lower-scale problems that produce enriched multiscale basis functions. Here, in this study, we present the first work toward application of the multiscale finite element method to the nonlocal peridynamic theory of solid mechanics. This is achieved within the context of a discontinuous Galerkin framework that facilitates the description of material discontinuities and does not assume the existence of spatial derivatives. Analysis of the resulting nonlocal multiscale finite element method is achieved using the ambulant Galerkin method, developed here with sufficient generality to allow for application to multiscale finite element methods for both local and nonlocal models that satisfy minimal assumptions. Finally, we conclude with preliminary results on a mixed-locality multiscale finite element method in which a nonlocal model is applied at the fine scale and a local model at the coarse scale.
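For readers unfamiliar with the local-problem idea, the sketch below shows its classical local 1D version, not the paper's nonlocal peridynamic formulation: a multiscale basis function on a coarse element is obtained by solving the fine-scale equation -(a(x)u')' = 0 with nodal boundary values, so the basis encodes the subscale coefficient variation.

```python
# Sketch of the core MsFEM idea in 1D (illustrative only): the multiscale
# basis on a coarse element [0, H] solves the local fine-scale problem
# -(a(x) u')' = 0 with nodal boundary values u(0) = 1 and u(H) = 0.
import numpy as np

def msfem_basis_1d(a_vals, H=1.0):
    """a_vals: fine-grid coefficient samples on the cell midpoints of [0, H]."""
    n = len(a_vals)            # number of fine cells
    # Flux continuity at interior node i:
    #   a_i (u_{i+1} - u_i) - a_{i-1} (u_i - u_{i-1}) = 0
    A = np.zeros((n - 1, n - 1))
    b = np.zeros(n - 1)
    for i in range(n - 1):
        A[i, i] = -(a_vals[i] + a_vals[i + 1])
        if i > 0:
            A[i, i - 1] = a_vals[i]
        if i < n - 2:
            A[i, i + 1] = a_vals[i + 1]
    b[0] = -a_vals[0] * 1.0    # boundary value u(0) = 1; u(H) = 0 contributes 0
    u = np.linalg.solve(A, b)
    return np.concatenate([[1.0], u, [0.0]])

# With a rough coefficient, the basis bends to encode subscale structure:
a = 1.0 + 0.9 * np.sin(40 * np.linspace(0, 1, 100)) ** 2
print(msfem_basis_1d(a)[:5])
```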
DOT National Transportation Integrated Search
1998-01-01
A key element of AZTech's mission is to make up-to-the-minute traffic information available to virtually any traveler. In pursuit of this goal, AZTech set its sights on obtaining an FM subcarrier that could transmit a wide variety of traffic-relate...
Fast associative memory + slow neural circuitry = the computational model of the brain.
NASA Astrophysics Data System (ADS)
Berkovich, Simon; Berkovich, Efraim; Lapir, Gennady
1997-08-01
We propose a computational model of the brain based on a fast associative memory and relatively slow neural processors. In this model, processing time is expensive but memory access is not, and therefore most algorithmic tasks would be accomplished by using large look-up tables as opposed to calculating. The essential feature of an associative memory in this context (characteristic of a holographic-type memory) is that it works without an explicit mechanism for resolution of multiple responses. As a result, the slow neuronal processing elements, overwhelmed by the flow of information, operate as a set of templates for ranking the retrieved information. This structure addresses the primary controversy in brain architecture: distributed organization of memory vs. localization of processing centers. The computational model offers an intriguing explanation of many paradoxical features of brain architecture, such as integration of sensors (through a DMA mechanism), subliminal perception, universality of software, interrupts, fault tolerance, and certain bizarre possibilities for rapid arithmetic. In conventional computer science this type of computational model has attracted little attention, as it goes against the technological grain by using a working memory faster than the processing elements.
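A toy illustration of the look-up-table principle (entirely illustrative, not from the paper): the expensive evaluation happens once, and subsequent "processing" is reduced to associative retrieval.

```python
# Toy illustration of the "memory instead of computation" principle: a
# function is evaluated once into a table, after which slow "processors"
# only perform cheap associative retrieval.
import math

# Precompute: the expensive calculation is done once and stored associatively.
SINE_TABLE = {round(x * 0.01, 2): math.sin(x * 0.01) for x in range(0, 628)}

def sin_lookup(x):
    """Retrieve instead of calculate; the nearest stored key stands in for x."""
    return SINE_TABLE[round(x, 2)]

print(sin_lookup(1.57), math.sin(1.57))  # nearly identical values
```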
The History and Use of Our Earth's Chemical Elements: A Reference Guide (by Robert E. Krebs)
NASA Astrophysics Data System (ADS)
Bracken, Jeffrey D. (reviewer)
1999-04-01
Greenwood Press: Westport, CT, 1998. 282 pp + 25 pp glossary + 37 pp index. 15.9 x 24.1 cm. ISBN 0-313-30123-9. $39.95. This book is an excellent resource for chemical educators at the high school and college levels. The format of the text is consistent and the writing style is clear and concise, also making it well suited for student use. The first three chapters introduce the reader to a brief history of chemistry, early models of the atom, and the development of the periodic table. Names of the contributing scientists are mentioned whenever necessary, but the overall purpose of these introductory chapters is simply to lay a foundation for the subsequent seven chapters. A complete glossary of important scientific terms mentioned in the text should allow beginning students to use this book without feeling overwhelmed. Each entry for the 112 elements contains the following information: elemental symbol, atomic number, period, common valence, atomic weight, natural state, common isotopes, properties, characteristics, abundance, natural sources, history, common uses and compounds, and safety hazards. This information is well organized, with clear headings and separate sections making the book extremely user-friendly. Readers can easily obtain the information they desire without having to skim the full entry for a chosen element. One very nice feature of this book is that the element entries are arranged by their locations in the periodic table. For example, chapter 4 contains the alkali metals and alkaline earth metals. This organizational scheme allows one to quickly see the patterns and trends within groups of elements. This format is significantly better than arranging the elements in alphabetical order, which would place the entry for sodium far removed from the entries for lithium and potassium. I would highly recommend this book to high school teachers and college chemistry professors. It is well written and is an excellent source of information for both students and educators.
AlRawi, Sara N; Khidir, Amal; Elnashar, Maha S; Abdelrahim, Huda A; Killawi, Amal K; Hammoud, Maya M; Fetters, Michael D
2017-03-14
Evidence indicates traditional medicine is no longer used only for the healthcare of the poor; its prevalence is also increasing in countries where allopathic medicine is predominant in the healthcare system. While these healing practices have been utilized for thousands of years in the Arabian Gulf, only recently has a theoretical model been developed illustrating the linkages and components of such practices articulated as Traditional Arabic & Islamic Medicine (TAIM). Despite previous theoretical work presenting the development of the TAIM model, empirical support has been lacking. The objective of this research is to provide empirical support for the TAIM model and illustrate real-world applicability. Using an ethnographic approach, we recruited 84 individuals (43 women and 41 men) who were speakers of one of four common languages in Qatar: Arabic, English, Hindi, and Urdu. Through in-depth interviews, we sought confirming and disconfirming evidence of the model components, namely, health practices, beliefs and philosophy to treat, diagnose, and prevent illnesses and/or maintain well-being, as well as patterns of communication about their TAIM practices with their allopathic providers. Based on our analysis, we find empirical support for all elements of the TAIM model. Participants in this research, visitors to major healthcare centers, mentioned using all elements of the TAIM model: herbal medicines, spiritual therapies, dietary practices, mind-body methods, and manual techniques, applied singularly or in combination. Participants had varying levels of comfort sharing information about TAIM practices with allopathic practitioners. These findings confirm an empirical basis for the elements of the TAIM model. Three elements, namely, spiritual healing, herbal medicine, and dietary practices, were most commonly found. Future research should examine the prevalence of TAIM element use, how it differs among various populations, and its impact on health.
Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios
Banta, Edward R.
2014-01-01
Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable Document Format file.
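For orientation, here is a hedged sketch of the kind of post-processing Scenario Analyzer automates, written against the flopy package rather than the tools described above; the file name, time-step choice, and plotting details are assumptions.

```python
# Hedged sketch: reading simulated heads from MODFLOW binary output with
# flopy and drawing lines of equal value (contours), the kind of display
# element Scenario Analyzer produces. "scenario1.hds" is illustrative.
import flopy.utils
import matplotlib.pyplot as plt

hds = flopy.utils.HeadFile("scenario1.hds")           # binary head output
head = hds.get_data(kstpkper=hds.get_kstpkper()[-1])  # last step/stress period
cs = plt.contour(head[0])                             # layer 1 contours
plt.clabel(cs)
plt.title("Simulated heads, scenario 1")
plt.savefig("scenario1_heads.png")
```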
NASA Astrophysics Data System (ADS)
Martins, J. M. P.; Thuillier, S.; Andrade-Campos, A.
2018-05-01
The identification of material parameters for a given constitutive model can be seen as the first step before any practical application. In recent years, the field of material parameter identification received an important boost with the development of full-field measurement techniques, such as Digital Image Correlation. These techniques enable the use of heterogeneous displacement/strain fields, which contain more information than classical homogeneous tests. Consequently, different techniques have been developed to extract material parameters from full-field measurements. In this study, two of these techniques are addressed: the Finite Element Model Updating (FEMU) and the Virtual Fields Method (VFM). The main idea behind FEMU is to update the parameters of a constitutive model implemented in a finite element model until the numerical and experimental results match, whereas the VFM makes use of the Principle of Virtual Work and does not require any finite element simulation. Though both techniques have proved their feasibility for linear and non-linear constitutive models, it is rather difficult to rank their robustness in plasticity. The purpose of this work is to perform a comparative study for the case of elasto-plastic models. Details concerning the implementation of each strategy are presented. Moreover, a dedicated code for the VFM within a large-strain framework is developed. The reconstruction of the stress field is performed through a user subroutine. A heterogeneous tensile test is considered to compare the FEMU and VFM strategies.
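A minimal FEMU-style loop, as a sketch only: the "finite element solver" below is a hypothetical one-line surrogate and the parameter values are invented, but the structure - iteratively updating constitutive parameters until simulated and measured responses match in a least-squares sense - is the one described above. VFM avoids exactly this repeated solve.

```python
# Sketch of a FEMU loop: SciPy's least-squares updates constitutive
# parameters until simulated and "measured" responses agree.
import numpy as np
from scipy.optimize import least_squares

def fe_simulation(params, load_steps):
    """Stand-in for an FE run returning a strain per load step (toy model)."""
    E, sigma_y = params
    strain = load_steps / E                                    # elastic part
    return strain + np.maximum(0.0, load_steps - sigma_y) / (0.1 * E)

load_steps = np.linspace(10.0, 400.0, 20)
true_params = np.array([70e3, 250.0])       # e.g. Young's modulus, yield stress
measured = fe_simulation(true_params, load_steps)

def residuals(params):
    # Mismatch between simulated and measured full-field response.
    return fe_simulation(params, load_steps) - measured

fit = least_squares(residuals, x0=[50e3, 150.0], method="lm")
print(fit.x)   # recovers approximately [70000, 250]
```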
Design of a consensus-derived synoptic operative report for lung cancer surgery.
Schneider, Laura; Shargall, Yaron; Schieman, Colin; Seely, Andrew J; Srinathan, Sadeesh; Malthaner, Richard A; Pierre, Andrew F; Safieddine, Najib; Vaillancourt, Rosaire; Plourde, Madelaine; Bond, James; Johnson, Scott; Smith, Shona E; Finley, Christian J
2014-04-01
For lung cancer surgery, a narrative operative report is the standard reporting procedure, whereas a synoptic-style report is increasingly utilized with great success by healthcare professionals in various specialties. A synoptic operative report more succinctly and accurately captures vital information and is rapidly generated with good intraobserver reliability. The objective of this study was to systematically develop a synoptic operative report for lung cancer surgery following a modified Delphi consensus model with the support of the Canadian thoracic surgery community. Using online survey software, thoracic surgeons and related physicians were asked to suggest and rate data elements for a synoptic report following the modified Delphi consensus model. The template derived from the consensus exercise was forwarded to a small working group, which further refined the definition and priority designation of elements until a satisfactory consensus was reached. In all, 139 physicians were invited to participate in the consensus exercise, with response rates of 36.7%, 44.6%, and 19.5% in the three rounds, respectively. Eighty-nine elements were agreed upon at the conclusion of the exercise, and 141 elements were forwarded to the working group. The working group agreed upon a final data set of 180 independently defined data elements, with 72 mandatory and 108 optional elements for implementation in the final report. This study demonstrates the process involved in developing a multidisciplinary, consensus-based synoptic lung cancer operative report. This novel report style is a quality improvement initiative to improve the capture, dissemination, readability, and potential utility of critical surgical information.
Nicholson, Caroline; Jackson, Claire; Marley, John
2013-12-20
Internationally, key health care reform elements rely on improved integration of care between the primary and secondary sectors. The objective of this systematic review is to synthesise the existing published literature on elements of current integrated primary/secondary health care. These elements and how they have supported integrated healthcare governance are presented. A systematic review of peer-reviewed literature from PubMed, MEDLINE, CINAHL, the Cochrane Library, Informit Health Collection, the Primary Health Care Research and Information Service, the Canadian Health Services Research Foundation, European Foundation for Primary Care, European Forum for Primary Care, and Europa Sinapse was undertaken for the years 2006-2012. Relevant websites were also searched for grey literature. Papers were assessed by two assessors according to agreed inclusion criteria which were published in English, between 2006-2012, studies describing an integrated primary/secondary care model, and had reported outcomes in care quality, efficiency and/or satisfaction. Twenty-one studies met the inclusion criteria. All studies evaluated the process of integrated governance and service delivery structures, rather than the effectiveness of services. They included case reports and qualitative data analyses addressing policy change, business issues and issues of clinical integration. A thematic synthesis approach organising data according to themes identified ten elements needed for integrated primary/secondary health care governance across a regional setting including: joint planning; integrated information communication technology; change management; shared clinical priorities; incentives; population focus; measurement - using data as a quality improvement tool; continuing professional development supporting joint working; patient/community engagement; and, innovation. All examples of successful primary/secondary care integration reported in the literature have focused on a combination of some, if not all, of the ten elements described in this paper, and there appears to be agreement that multiple elements are required to ensure successful and sustained integration efforts. Whilst no one model fits all systems these elements provide a focus for setting up integration initiatives which need to be flexible for adapting to local conditions and settings.
Mobile elements reveal small population size in the ancient ancestors of Homo sapiens.
Huff, Chad D; Xing, Jinchuan; Rogers, Alan R; Witherspoon, David; Jorde, Lynn B
2010-02-02
The genealogies of different genetic loci vary in depth. The deeper the genealogy, the greater the chance that it includes a rare event, such as the insertion of a mobile element. Therefore, the genealogy of a region that contains a mobile element is on average older than that of the rest of the genome. In a simple demographic model, the expected time to most recent common ancestor (TMRCA) is doubled if a rare insertion is present. We test this expectation by examining single nucleotide polymorphisms around polymorphic Alu insertions from two completely sequenced human genomes. The estimated TMRCA for regions containing a polymorphic insertion is two times larger than the genomic average (P < 10^-30), as predicted. Because genealogies that contain polymorphic mobile elements are old, they are shaped largely by the forces of ancient population history and are insensitive to recent demographic events, such as bottlenecks and expansions. Remarkably, just two human DNA sequences provide substantial information about ancient human population size. By comparing the likelihood of various demographic models, we estimate that the effective population size of human ancestors living before 1.2 million years ago was 18,500, and we can reject all models in which the ancient effective population size was larger than 26,000. This result implies an unusually small population for a species spread across the entire Old World, particularly in light of the effective population sizes of chimpanzees (21,000) and gorillas (25,000), which each inhabit only one part of a single continent.
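The doubling claim can be checked for a sample of two lineages with a few lines of simulation (a sketch under standard coalescent assumptions, not the authors' code): pairwise TMRCA is exponential with mean 2N generations, and a rare insertion falls on a genealogy with probability proportional to its branch length, so conditioning on the insertion weights genealogies by T and exactly doubles the mean.

```python
# Monte Carlo check: conditioning on a rare insertion (probability
# proportional to genealogy length) doubles the expected pairwise TMRCA.
import numpy as np

rng = np.random.default_rng(0)
N = 18_500                                   # effective size, per the abstract
T = rng.exponential(scale=2 * N, size=1_000_000)   # pairwise TMRCA draws

mean_all = T.mean()
# Rare-event limit: weight each genealogy by its length T.
mean_with_insertion = np.average(T, weights=T)
print(mean_all / (2 * N), mean_with_insertion / (2 * N))  # ~1.0 and ~2.0
```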
Kraft, Reuben H.; Mckee, Phillip Justin; Dagro, Amy M.; Grafton, Scott T.
2012-01-01
This article presents the integration of brain injury biomechanics and graph theoretical analysis of neuronal connections, or connectomics, to form a neurocomputational model that captures spatiotemporal characteristics of trauma. We relate localized mechanical brain damage predicted from biofidelic finite element simulations of the human head subjected to impact with degradation in the structural connectome for a single individual. The finite element model incorporates various length scales into the full head simulations by including anisotropic constitutive laws informed by diffusion tensor imaging. Coupling between the finite element analysis and network-based tools is established through experimentally based cellular injury thresholds for white matter regions. Once edges are degraded, graph theoretical measures are computed on the “damaged” network. For a frontal impact, the simulations predict that the temporal and occipital regions undergo the most axonal strain and strain rate at short times (less than 24 hrs), initiating cellular death; the resulting damage depends on the angle of impact and the underlying microstructure of the brain tissue. The monotonic cellular death relationships predict a spatiotemporal change of structural damage. Interestingly, at 96 hrs post-impact, the computations predict that no network nodes were completely disconnected from the network, despite significant damage to network edges. At early times, network measures of global and local efficiency were degraded little; however, as time increased to 96 hrs the network properties were significantly reduced. In the future, this computational framework could help inform functional networks from physics-based structural brain biomechanics to obtain not only a biomechanics-based understanding of injury, but also neurophysiological insight. PMID:22915997
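The network step can be sketched in a few lines; the toy graph, damage fraction, and random edge selection below are illustrative stand-ins for the subject's connectome and the FE-derived injury thresholds.

```python
# Sketch of the graph-theory step: degrade connectome edges flagged by the
# injury model, then recompute efficiency measures on the damaged network.
import networkx as nx
import random

random.seed(1)
G = nx.watts_strogatz_graph(90, 6, 0.1)     # stand-in structural connectome
print("intact efficiency:", nx.global_efficiency(G))

# Suppose the FE simulation flags 20% of tracts as exceeding the cellular
# injury threshold at 96 hrs; remove those edges.
damaged = G.copy()
flagged = random.sample(list(damaged.edges()),
                        int(0.2 * damaged.number_of_edges()))
damaged.remove_edges_from(flagged)
print("damaged efficiency:", nx.global_efficiency(damaged))
# As in the paper, edge damage need not disconnect any node outright:
print("isolated nodes:", [n for n in damaged if damaged.degree(n) == 0])
```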
Effects of Pore Distributions on Ductility of Thin-Walled High Pressure Die-Cast Magnesium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Kyoo Sil; Li, Dongsheng; Sun, Xin
2013-06-01
In this paper, a microstructure-based three-dimensional (3D) finite element modeling method is adopted to investigate the effects of porosity on the ductility of thin-walled high pressure die-cast (HPDC) magnesium alloys. For this purpose, the cross-sections of AM60 casting samples are first examined using an optical microscope and X-ray tomography to obtain general information on the pore distribution features. The experimentally observed pore distribution features are then used to generate a series of synthetic microstructure-based 3D finite element models with different pore volume fractions and pore distribution features. Shear and ductile damage models are adopted in the finite element analyses to induce fracture by element removal, leading to the prediction of ductility. The results of this study show that ductility decreases monotonically as the pore volume fraction increases, and that the effect of the 'skin region' on ductility is noticeable under the condition of the same local pore volume fraction in the center region of the sample; its existence can be beneficial for the improvement of ductility. Further synthetic microstructure-based 3D finite element analyses are planned to investigate the effects of pore size and pore size distribution.
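A sketch of how such synthetic pore structures can be generated on a voxel grid at a target volume fraction; the pore-size distribution, spherical pore shape, and skin-region handling here are assumptions, not the authors' procedure.

```python
# Illustrative generation of a synthetic pore structure at a target volume
# fraction on a voxel grid, the kind of input used to build 3D FE models.
import numpy as np

rng = np.random.default_rng(42)

def synthetic_pores(shape=(60, 60, 20), target_vf=0.03, mean_radius=2.0, skin=3):
    grid = np.zeros(shape, dtype=bool)
    zz, yy, xx = np.indices(shape)
    while grid.mean() < target_vf:
        r = max(1.0, rng.normal(mean_radius, 0.5))
        # Keep pore centers out of the skin region near the casting surface.
        c = [rng.uniform(skin + r, s - skin - r) for s in shape]
        grid |= (zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2 <= r**2
    return grid

pores = synthetic_pores()
print("pore volume fraction:", pores.mean())
```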
A generalized ingrowth model for the northeastern United States
Linda S. Gribko; Donald E. Hilt; Mary Ann Fajvan
1995-01-01
Ingrowth, the number of trees that periodically grow into the smallest inventoried diameter class, has long been recognized as a basic element of multicohort, or uneven-aged, stand development. However, very little information is available to aid forest managers in the estimation of ingrowth. The purpose of this study was to develop a generalized ingrowth model for the...
Building a Conceptual Model of Family Response to a Child's Chronic Illness or Disability.
ERIC Educational Resources Information Center
McDonald, Thomas P.; And Others
This literature review provides information to help in building a model of family caregiving for children with emotional disorders, focusing on the elements of stress, coping, and appraisal. Because literature on families' perceptions, use of resources, and coping with a child with an emotional disorder is nonexistent, the review uses the…
Noutoshi, Y; Arai, R; Fujie, M; Yamada, T
1997-01-01
As a model for plant-type chromosomes, we have been characterizing the molecular organization of the Chlorella vulgaris C-169 chromosome I. To identify chromosome structural elements, including the centromeric region and replication origins, we constructed a chromosome I-specific cosmid library and aligned the cosmid clones to generate contigs. So far, more than 80% of the entire chromosome I has been covered. A complete clonal physical reconstitution of chromosome I provides information on the structure and genomic organization of a plant genome. We propose our strategy to construct an artificial chromosome by assembling the functional chromosome structural elements identified on Chlorella chromosome I.
Standardized Representation of Clinical Study Data Dictionaries with CIMI Archetypes.
Sharma, Deepak K; Solbrig, Harold R; Prud'hommeaux, Eric; Pathak, Jyotishman; Jiang, Guoqian
2016-01-01
Researchers commonly use a tabular format to describe and represent clinical study data. The lack of standardization of data dictionary's metadata elements presents challenges for their harmonization for similar studies and impedes interoperability outside the local context. We propose that representing data dictionaries in the form of standardized archetypes can help to overcome this problem. The Archetype Modeling Language (AML) as developed by the Clinical Information Modeling Initiative (CIMI) can serve as a common format for the representation of data dictionary models. We mapped three different data dictionaries (identified from dbGAP, PheKB and TCGA) onto AML archetypes by aligning dictionary variable definitions with the AML archetype elements. The near complete alignment of data dictionaries helped map them into valid AML models that captured all data dictionary model metadata. The outcome of the work would help subject matter experts harmonize data models for quality, semantic interoperability and better downstream data integration.
Yang, Hao; Xu, Xiangyang; Neumann, Ingo
2014-11-19
Terrestrial laser scanning (TLS) is a technique for rapidly acquiring three-dimensional information. In this paper we investigate the health assessment of concrete structures with a Finite Element Method (FEM) model based on TLS. The work focuses on the benefits of 3D TLS in the generation and calibration of FEM models, in order to build a convenient, efficient and intelligent model which can be widely used for the detection and assessment of bridges, buildings, subways and other objects. After comparing the finite element simulation with surface-based measurement data from TLS, the FEM model is found to be acceptable, with an error of less than 5%. The benefit of TLS lies mainly in the possibility of a surface-based validation of the results predicted by the FEM model.
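The surface-based comparison can be sketched as a nearest-neighbour deviation between the TLS point cloud and the FEM-predicted surface (illustrative data; SciPy's k-d tree does the matching, and the 5% figure above is the stated acceptance bound, not computed here from real data).

```python
# Sketch: deviation between a TLS scan and FEM-predicted surface nodes,
# via nearest-neighbour matching. All data below are synthetic stand-ins.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(7)
fem_nodes = rng.uniform(0, 10, size=(5000, 3))                  # predicted surface
tls_points = fem_nodes[:2000] + rng.normal(0, 0.02, (2000, 3))  # noisy scan

tree = cKDTree(fem_nodes)
dist, _ = tree.query(tls_points)           # distance to nearest FEM node
rel_error = dist.mean() / 10.0             # relative to the model extent
print(f"mean deviation {dist.mean():.4f}, relative {rel_error:.2%}")
```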
Luo, Xiaohui; Wang, Hang; Fan, Yubo
2007-04-01
This study aimed to develop a three-dimensional finite element (3-D FE) model of a mandible with a mental fracture and to design the boundary constraints. CT images from a healthy volunteer were used as the original information and imported into the ANSYS program to build the 3-D FE model. The model of the miniplate and screws used for internal fixation was established in Pro/E. Boundary constraints representing different muscle loadings were used to simulate three functional conditions of the mandible. A 3-D FE model of the mentally fractured mandible under the miniplate-screw internal fixation system was constructed. Through the boundary constraints, the three biting conditions were simulated, and the model can serve as a foundation on which to analyze the biomechanical behavior of the fractured mandible.
NASA Technical Reports Server (NTRS)
Johnson, S. C.
1982-01-01
An interface system for passing data between a relational information management (RIM) data base complex and the Engineering Analysis Language (EAL), a finite element structural analysis program, is documented. The interface system, implemented on a CDC Cyber computer, is composed of two FORTRAN programs called RIM2EAL and EAL2RIM. RIM2EAL reads model definition data from RIM and creates a file of EAL commands to define the model. EAL2RIM reads model definition and EAL-generated analysis data from EAL's data library and stores these data directly in a RIM data base. These two interface programs and the format for the RIM data complex are described.
An Empirical Human Controller Model for Preview Tracking Tasks.
van der El, Kasper; Pool, Daan M; Damveld, Herman J; van Paassen, Marinus Rene M; Mulder, Max
2016-11-01
Real-life tracking tasks often show preview information to the human controller about the future track to follow. The effect of preview on manual control behavior is still relatively unknown. This paper proposes a generic operator model for preview tracking, empirically derived from experimental measurements. Conditions included pursuit tracking, i.e., without preview information, and tracking with 1 s of preview. Controlled element dynamics varied between gain, single integrator, and double integrator. The model is derived in the frequency domain, after application of a black-box system identification method based on Fourier coefficients. Parameter estimates are obtained to assess the validity of the model in both the time domain and frequency domain. Measured behavior in all evaluated conditions can be captured with the commonly used quasi-linear operator model for compensatory tracking, extended with two viewpoints of the previewed target. The derived model provides new insights into how human operators use preview information in tracking tasks.
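A minimal version of the identification idea (illustrative, not the paper's method in full): with a multisine target whose components sit exactly on FFT bins, the operator's frequency response follows from the ratio of output to input Fourier coefficients at the excited frequencies; the "operator" below is a toy gain-plus-delay rather than a human controller.

```python
# Sketch of frequency-response identification from Fourier coefficients.
import numpy as np

fs, T = 100.0, 60.0
t = np.arange(0, T, 1 / fs)
freqs = np.array([0.1, 0.35, 0.8, 1.9, 4.2])   # excited frequencies (Hz),
                                               # integer cycles over T
target = sum(np.sin(2 * np.pi * f * t + k) for k, f in enumerate(freqs))

# Stand-in "operator": a gain of 0.8 with a 0.3 s response delay.
delay = int(0.3 * fs)
response = 0.8 * np.roll(target, delay)

X, Y = np.fft.rfft(target), np.fft.rfft(response)
f_axis = np.fft.rfftfreq(len(t), 1 / fs)
idx = [np.argmin(np.abs(f_axis - f)) for f in freqs]
H = Y[idx] / X[idx]                            # describing-function estimate
print(np.abs(H))                               # ~0.8 at all excited lines
print(np.angle(H))                             # phase lag grows with frequency
```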
NASA Astrophysics Data System (ADS)
Ferrer, Laetitia; Curt, Corinne; Tacnet, Jean-Marc
2018-04-01
Major hazard prevention is a central challenge, given that it rests specifically on information communicated to the public. In France, preventive information is notably provided by way of local regulatory documents. Unfortunately, the law imposes only a few specifications concerning their content; the impact on the general population therefore depends strongly on how each document is actually created. The purpose of our work is thus to propose an analytical methodology to evaluate the effectiveness of preventive risk communication documents. The methodology is based on dependability approaches and is applied in this paper to the Document d'Information Communal sur les Risques Majeurs (DICRIM; in English, Municipal Information Document on Major Risks). The DICRIM has to be prepared by mayors and addressed to the public to provide information on major hazards affecting their municipalities. An analysis of the legal compliance of the document is carried out through the identification of regulatory detection elements, applied to a database of 30 DICRIMs. This analysis leads to a discussion of points such as the usefulness of the missing elements. External and internal function analysis permits the identification of the form and content requirements and the service and technical functions of the document and its components (here its sections). These results are used to carry out an FMEA (failure modes and effects analysis), which allows us to define failures and to identify detection elements. This permits the evaluation of the effectiveness of the form and content of each component of the document. The outputs are validated by experts from the different fields investigated. These results will be used to build, in future work, a decision support model for the municipalities (or specialised consulting firms) in charge of drawing up such documents.
Privacy-Preserving Accountable Accuracy Management Systems (PAAMS)
NASA Astrophysics Data System (ADS)
Thomas, Roshan K.; Sandhu, Ravi; Bertino, Elisa; Arpinar, Budak; Xu, Shouhuai
We argue for the design of “Privacy-preserving Accountable Accuracy Management Systems (PAAMS)”. The designs of such systems recognize from the onset that accuracy, accountability, and privacy management are intertwined. As such, these systems have to dynamically manage the tradeoffs between these (often conflicting) objectives. For example, accuracy in such systems can be improved by providing better accountability links between structured and unstructured information. Further, accuracy may be enhanced if access to private information is allowed in controllable and accountable ways. Our proposed approach involves three key elements. First, a model to link unstructured information such as that found in email, image and document repositories with structured information such as that in traditional databases. Second, a model for accuracy management and entity disambiguation by proactively preventing, detecting and tracing errors in information bases. Third, a model to provide privacy-governed operation as accountability and accuracy are managed.
Distinct Perceptual Grouping Pathways Revealed By Temporal Carriers and Envelopes
Rainville, Stéphane; Clarke, Aaron
2014-01-01
Guttman et al. [2005, Vis. Res., 45(8), 1021-1030] investigated whether observers could perform temporal grouping in multi-element displays where each local element was stochastically modulated over time along one of several potential dimensions - or "messenger types" - such as contrast, position, orientation, or spatial scale. Guttman et al.'s data revealed that grouping discards messenger type and therefore support a single-pathway model that groups elements with similar temporal waveforms. In the current study, we carried out three experiments in which temporal-grouping information resided either in the carrier, the envelope, or the combined carrier and envelope of each messenger's timecourse. Results revealed that grouping is highly specific for messenger type if carrier envelopes lack grouping information, but largely messenger-nonspecific if carrier envelopes contain grouping information. The results imply that temporal grouping is mediated by several messenger-specific carrier pathways as well as by a messenger-nonspecific envelope pathway. The findings also challenge simple temporal-filtering accounts of perceptual grouping [Adelson & Farid, 1999, Science, 286, 2231a]. PMID:19146293
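The carrier/envelope decomposition at the heart of the three experiments can be illustrated with a Hilbert-transform envelope estimate (a sketch with deterministic stand-in signals; the actual stimuli were stochastic modulations).

```python
# Sketch: grouping information can live in the fast carrier or in its slow
# envelope; SciPy's Hilbert transform recovers the envelope of a timecourse.
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
envelope = 0.5 * (1 + np.sin(2 * np.pi * 1.5 * t))   # slow grouping signal
carrier = np.sin(2 * np.pi * 40 * t)                 # fast carrier (proxy)
waveform = envelope * carrier                        # element's timecourse

recovered = np.abs(hilbert(waveform))                # envelope estimate
print(np.corrcoef(recovered, envelope)[0, 1])        # close to 1
```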
Ferguson, Adam R.; Popovich, Phillip G.; Xu, Xiao-Ming; Snow, Diane M.; Igarashi, Michihiro; Beattie, Christine E.; Bixby, John L.
2014-01-01
The lack of reproducibility in many areas of experimental science has a number of causes, including a lack of transparency and precision in the description of experimental approaches. This has far-reaching consequences, including wasted resources and slowing of progress. Additionally, the large number of laboratories around the world publishing articles on a given topic make it difficult, if not impossible, for individual researchers to read all of the relevant literature. Consequently, centralized databases are needed to facilitate the generation of new hypotheses for testing. One strategy to improve transparency in experimental description, and to allow the development of frameworks for computer-readable knowledge repositories, is the adoption of uniform reporting standards, such as common data elements (data elements used in multiple clinical studies) and minimum information standards. This article describes a minimum information standard for spinal cord injury (SCI) experiments, its major elements, and the approaches used to develop it. Transparent reporting standards for experiments using animal models of human SCI aim to reduce inherent bias and increase experimental value. PMID:24870067
Shea, Christopher M; Turner, Kea; White, B Alex; Zhu, Ye; Rozier, R Gary
2018-01-11
The majority of primary care physicians support integration of children's oral health promotion and disease prevention into their practices but can experience challenges integrating oral health services into their workflow. Most electronic health records (EHRs) in primary care settings do not include oral health information for pediatric patients. Therefore, it is important to understand providers' preferences for oral health information within the EHR. The objectives of this study are to assess (1) the relative importance of various elements of pediatric oral health information for primary care providers to have in the EHR and (2) the extent to which practice and provider characteristics are associated with these information preferences. We surveyed a sample of primary care physicians who conducted Medicaid well-child visits in North Carolina from August to December 2013. Using descriptive statistics, we analyzed primary care physicians' oral health information preferences relative to their information preferences for traditional preventive aspects of well-child visits. Furthermore, we analyzed associations between oral health information preferences and provider- and practice-level characteristics using an ordinary least squares regression model. Fewer primary care providers reported that pediatric oral health information is "very important," as compared to more traditional elements of primary care information, such as tracking immunizations. However, the majority of respondents reported some elements of oral health information as being very important. Also, we found positive associations between the percentage of well-child visits in which oral health screenings and oral health referrals are performed and the reported importance of having pediatric oral health information in the EHR. Incorporating oral health information into the EHR may be desirable for providers, particularly those who perform oral health screenings and dental referrals.
Genomic patterns associated with paternal/maternal distribution of transposable elements
NASA Astrophysics Data System (ADS)
Jurka, Jerzy
2003-03-01
Transposable elements (TEs) are specialized DNA or RNA fragments capable of surviving in intragenomic niches. They are commonly, perhaps unjustifiably, referred to as "selfish" or "parasitic" elements. TEs can be divided into two major classes: retroelements and DNA transposons. The former include non-LTR retrotransposons and retrovirus-like elements, which use reverse transcriptase for their reproduction prior to integration into host DNA. The latter depend mostly on host DNA replication, with the possible exception of rolling-circle transposons recently discovered by our team. I will review basic information on TEs, with emphasis on human Alu and L1 retroelements discussed in the context of genomic organization. TEs are non-randomly distributed in chromosomal DNA. In particular, human Alu elements tend to prefer GC-rich regions, whereas L1 elements accumulate in AT-rich regions. Current explanations of this phenomenon focus on so-called "target effects" and post-insertional selection. However, the proposed models appear to be unsatisfactory, and alternative explanations invoking "channeling" to different chromosomal regions will be a major focus of my presentation. Transposable elements can be expressed and integrated into host DNA in the male or female germline, or both. Different models of expression and integration imply different proportions of TEs on sex chromosomes and autosomes. The density of recently retroposed human Alu elements is around three times higher on chromosome Y than on chromosome X, and over two times higher than the average density for all human autosomes. This implies Alu activity in the paternal germline. Analogous inter-chromosomal proportions for other repeat families should determine their compatibility with one of the three basic models describing the inheritance of TEs. Published evidence indicates that maternally and paternally imprinted genes roughly correspond to GC-rich and AT-rich DNA. This may explain the observed chromosomal distribution of Alu and L1 elements. Finally, paternal models of inheritance predict rapid accumulation of active TEs on chromosome Y. I will discuss potential implications of this phenomenon for the evolution of chromosome Y and transposable elements.
Simulation model of an eyeball based on finite element analysis on a supercomputer
Uchio, E.; Ohno, S.; Kudoh, J.; Aoki, K.; Kisielewicz, L. T.
1999-01-01
BACKGROUND/AIMS—A simulation model of the human eye was developed. It was applied to the determination of the physical and mechanical conditions under which impacting foreign bodies cause intraocular foreign body (IOFB) injuries. METHODS—Modules of Hypermesh (Altair Engineering, Tokyo, Japan) were used for solid modelling, geometric construction, and finite element mesh creation based on information obtained from cadaver eyes. The simulations were solved on a supercomputer using the finite element analysis (FEA) program PAM-CRASH (Nihon ESI, Tokyo, Japan). It was assumed that rupture occurs at a strain of 18.0% in the cornea and 6.8% in the sclera, and at a stress of 9.4 MPa for both cornea and sclera. Blunt-shaped missiles were set to impact the surface of the cornea or sclera at velocities of 30 and 60 m/s. RESULTS—According to the simulation, the missile sizes above which corneal rupture occurred at velocities of 30 and 60 m/s were 1.95 and 0.82 mm, respectively. The missile sizes causing scleral rupture were 0.95 and 0.75 mm at velocities of 30 and 60 m/s. CONCLUSIONS—These results suggest that this FEA model has potential usefulness as a simulation tool for ocular injury, and it may provide useful information for developing protective measures against industrial and traffic ocular injuries. PMID:10502567
The HTA core model: a novel method for producing and reporting health technology assessments.
Lampe, Kristian; Mäkelä, Marjukka; Garrido, Marcial Velasco; Anttila, Heidi; Autti-Rämö, Ilona; Hicks, Nicholas J; Hofmann, Björn; Koivisto, Juha; Kunz, Regina; Kärki, Pia; Malmivaara, Antti; Meiesaar, Kersti; Reiman-Möttönen, Päivi; Norderhaug, Inger; Pasternack, Iris; Ruano-Ravina, Alberto; Räsänen, Pirjo; Saalasti-Koskinen, Ulla; Saarni, Samuli I; Walin, Laura; Kristensen, Finn Børlum
2009-12-01
The aim of this study was to develop and test a generic framework to enable international collaboration for producing and sharing results of health technology assessments (HTAs). Ten international teams constructed the HTA Core Model, dividing information contained in a comprehensive HTA into standardized pieces, the assessment elements. Each element contains a generic issue that is translated into practical research questions while performing an assessment. Elements were described in detail in element cards. Two pilot assessments, designated as Core HTAs were also produced. The Model and Core HTAs were both validated. Guidance on the use of the HTA Core Model was compiled into a Handbook. The HTA Core Model considers health technologies through nine domains. Two applications of the Model were developed, one for medical and surgical interventions and another for diagnostic technologies. Two Core HTAs were produced in parallel with developing the model, providing the first real-life testing of the Model and input for further development. The results of formal validation and public feedback were primarily positive. Development needs were also identified and considered. An online Handbook is available. The HTA Core Model is a novel approach to HTA. It enables effective international production and sharing of HTA results in a structured format. The face validity of the Model was confirmed during the project, but further testing and refining are needed to ensure optimal usefulness and user-friendliness. Core HTAs are intended to serve as a basis for local HTA reports. Core HTAs do not contain recommendations on technology use.
Representing annotation compositionality and provenance for the Semantic Web
2013-01-01
Background Though the annotation of digital artifacts with metadata has a long history, the bulk of that work focuses on the association of single terms or concepts to single targets. As annotation efforts expand to capture more complex information, annotations will need to be able to refer to knowledge structures formally defined in terms of more atomic knowledge structures. Existing provenance efforts in the Semantic Web domain primarily focus on tracking provenance at the level of whole triples and do not provide enough detail to track how individual triple elements of annotations were derived from triple elements of other annotations. Results We present a task- and domain-independent ontological model for capturing annotations and their linkage to their denoted knowledge representations, which can be singular concepts or more complex sets of assertions. We have implemented this model as an extension of the Information Artifact Ontology in OWL and made it freely available, and we show how it can be integrated with several prominent annotation and provenance models. We present several application areas for the model, ranging from linguistic annotation of text to the annotation of disease-associations in genome sequences. Conclusions With this model, progressively more complex annotations can be composed from other annotations, and the provenance of compositional annotations can be represented at the annotation level or at the level of individual elements of the RDF triples composing the annotations. This in turn allows for progressively richer annotations to be constructed from previous annotation efforts, the precise provenance recording of which facilitates evidence-based inference and error tracking. PMID:24268021
NASA Astrophysics Data System (ADS)
Li, Ying; Luo, Zhiling; Yin, Jianwei; Xu, Lida; Yin, Yuyu; Wu, Zhaohui
2017-01-01
The modern service company (MSC), an enterprise operating in specialized domains such as the financial industry, the information service industry, and the technology development industry, depends heavily on information technology. Modelling of such enterprises has attracted much research attention because it promises to help enterprise managers analyse basic business strategies (e.g. the pricing strategy) and even optimise the business process (BP) to gain benefits. While the existing models proposed by economists cover the economic elements, they fail to address the basic BP and its relationship with the economic characteristics. Those proposed in computer science, despite achieving great success in BP modelling, perform poorly in supporting economic analysis. Therefore, the existing approaches fail to satisfy the requirements of enterprise modelling for MSCs, which demand simultaneous consideration of both economic analysis and business processing. In this article, we provide a unified enterprise modelling approach named Enterprise Pattern (EP) which bridges the gap between the BP model and the enterprise economic model of the MSC. Proposing a language named Enterprise Pattern Description Language (EPDL) covering all the basic language elements of EP, we formulate the language syntax and two basic extraction rules assisting economic analysis. Furthermore, we extend Business Process Model and Notation (BPMN) to support EPDL, naming the extension BPMN for Enterprise Pattern (BPMN4EP). The example of a mobile application platform is studied in detail for a better understanding of EPDL.
Reorientation-effect measurement of the <21+∥E2̂∥21+> matrix element in 10Be
NASA Astrophysics Data System (ADS)
Orce, J. N.; Drake, T. E.; Djongolov, M. K.; Navrátil, P.; Triambak, S.; Ball, G. C.; Al Falou, H.; Churchman, R.; Cross, D. S.; Finlay, P.; Forssén, C.; Garnsworthy, A. B.; Garrett, P. E.; Hackman, G.; Hayes, A. B.; Kshetri, R.; Lassen, J.; Leach, K. G.; Li, R.; Meissner, J.; Pearson, C. J.; Rand, E. T.; Sarazin, F.; Sjue, S. K. L.; Stoyer, M. A.; Sumithrarachchi, C. S.; Svensson, C. E.; Tardiff, E. R.; Teigelhoefer, A.; Williams, S. J.; Wong, J.; Wu, C. Y.
2012-10-01
The highly efficient and segmented TIGRESS γ-ray spectrometer at TRIUMF has been used to perform a reorientation-effect Coulomb-excitation study of the 21+ state at 3.368 MeV in 10Be. This is the first Coulomb-excitation measurement that enables one to obtain information on diagonal matrix elements for such a high-lying first excited state from γ-ray data. With the availability of accurate lifetime data, a value of -0.110±0.087 eb is determined for the <21+∥E2̂∥21+> diagonal matrix element, which, assuming the rotor model, leads to a negative spectroscopic quadrupole moment of QS(21+)=-0.083±0.066 eb. This result is in agreement both with no-core shell-model calculations performed in this work with the CD-Bonn 2000 two-nucleon potential and large shell-model spaces, and with Green's function Monte Carlo predictions with two- plus three-nucleon potentials.
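For readers outside the field, the rotor-model step quoted above is the standard conversion (in the usual convention) from the diagonal reduced matrix element to the spectroscopic quadrupole moment. With J = 2 and the Clebsch-Gordan coefficient ⟨2 2; 2 0 | 2 2⟩ = √(2/7), the quoted numbers check out:

\[
Q_S(2_1^+) \;=\; \sqrt{\frac{16\pi}{5}}\;
\frac{\langle 2\,2;\,2\,0 \mid 2\,2\rangle}{\sqrt{2J+1}}\;
\langle 2_1^+ \Vert \hat{E2} \Vert 2_1^+ \rangle
\;\approx\; 0.758 \times (-0.110~\mathrm{eb}) \;\approx\; -0.083~\mathrm{eb}.
\]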
DTFM Modeling and Analysis Method for Gossamer Structures
NASA Technical Reports Server (NTRS)
Fang, Hou-Fei; Lou, Michael; Broduer, Steve (Technical Monitor)
2001-01-01
Gossamer systems are mostly composed of support structures formed by highly flexible, long tubular elements and pre-tensioned thin-film membranes. These systems offer order-of-magnitude reductions in mass and launch volume and will revolutionize the architecture and design of space flight systems that require large in-orbit configurations and apertures. Great interest has been generated in recent years in flying gossamer systems on near-term and future space missions. Modeling and analysis requirements for gossamer structures are unique. Simulation of in-space performance issues of gossamer structures, such as inflation deployment of flexible booms, formation and effects of wrinkles in tensioned membranes, and synthesis of tubular and membrane elements into a complete structural system, usually cannot be accomplished by using general-purpose finite-element structural analysis codes. This has led to the need for structural modeling and analysis capabilities specifically suitable for gossamer structures. The Distributed Transfer Function Method (DTFM) can potentially meet this urgent need. Additional information is contained in the original extended abstract.
[Landscape quality evaluation and vertical structure optimization of natural broadleaf forest].
Ouyang, Xun-zhi; Liao, Wei-ming; Peng, Shi-kui
2007-06-01
Taking the natural broadleaf forest in Wuyuan County of Jiangxi Province as the study object, a total of 30 representative photos of near-view landscapes and related information were collected. The scenic beauty values were acquired by a public judgment method, and relationship models between scenic beauty values and landscape elements were established using multiple regression. The results showed that the main elements affecting the near-view landscape quality of natural broadleaf forest were trunk form, stand density, undergrowth coverage and height, natural pruning, and color richness, with partial correlation coefficients of 0.4482-0.7724, which were significant or very significant by t-test. The multiple correlation coefficient of the model reached 0.9508, which was very significant by F-test (F = 36.11). Straight trunks, good natural pruning, and rich color enhanced scenic beauty, while excessively high or low stand density and undergrowth coverage and height harmed it. Several management measures for the vertical structure optimization of these landscape elements were put forward.
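As an illustration of the kind of model reported, here is a minimal sketch with synthetic data (the predictor names are taken from the abstract; the data and coefficients are invented, not the authors' measurements):

```python
# A minimal sketch: multiple regression of scenic beauty on landscape
# elements, reporting the multiple correlation coefficient R and F statistic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 30  # one row per landscape photo
# columns: trunk form, stand density, undergrowth cover/height, pruning, color
X = rng.random((n, 5))
y = X @ np.array([0.8, -0.5, -0.3, 0.6, 0.7]) + rng.normal(0, 0.1, n)

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.rsquared ** 0.5)   # multiple correlation coefficient R
print(model.fvalue)            # F statistic for overall significance
```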
From LIDAR Scanning to 3d FEM Analysis for Complex Surface and Underground Excavations
NASA Astrophysics Data System (ADS)
Chun, K.; Kemeny, J.
2017-12-01
Light detection and ranging (LIDAR) has become a prevalent remote-sensing technology in the geological fields due to its high precision and ease of use. One major application is to use the detailed geometrical information of underground structures as the basis for generating three-dimensional numerical models for FEM analysis. To date, however, straightforward techniques for reconstructing numerical models from scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach integrating LIDAR scanning with finite element numerical analysis, specifically converting LIDAR 3D point clouds of objects containing complex surface geometry into finite element models. The methodology has been applied to Kartchner Caverns in Arizona for stability analysis. Numerical simulations were performed using the finite element code ABAQUS. The results indicate that the proposed LIDAR-based workflow is effective and can serve as a reference for similar engineering projects in practice.
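The abstract does not spell out the point-cloud-to-mesh step, but a common route is surface reconstruction followed by export to an FEM preprocessor. A minimal sketch using Open3D follows; the file names and parameters are illustrative assumptions, not the authors' pipeline:

```python
# A minimal sketch of the scan-to-surface step: down-sample the LIDAR cloud,
# estimate normals, run Poisson reconstruction, and export a triangle mesh
# that an FEM preprocessor can use as the starting surface.
import open3d as o3d

pcd = o3d.io.read_point_cloud("cavern_scan.ply")   # hypothetical scan file
pcd = pcd.voxel_down_sample(voxel_size=0.05)       # thin dense LIDAR data
pcd.estimate_normals()

mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9)
o3d.io.write_triangle_mesh("cavern_surface.stl", mesh)
```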
"Party Line" Information Use Studies and Implications for ATC Datalink Communications
NASA Technical Reports Server (NTRS)
Hansman, R. John; Pritchett, Amy; Midkiff, Alan
1995-01-01
The perceived importance and utilization of 'party line' information by air carrier flight crews was investigated through pilot surveys and a flight simulation study. The importance, availability, and accuracy of party line information elements were explored through surveys of pilots of several operational types. The survey identified numerous traffic and weather party line information elements which were considered important. These elements were scripted into a full-mission flight simulation which examined the utilization of party line information by studying subject responses to specific information element stimuli. Awareness of the different party line elements varied and was also affected by pilot workload. In addition, pilots were aware of some traffic information elements but were reluctant to act on party line information alone. Finally, the results of both the survey and the simulation indicated that the importance of party line information appeared to be greatest for operations near or on the airport. This indicates that caution should be exercised when implementing datalink communications in tower and close-in terminal control sectors.
Lee, J.K.; Bennett, C. S.
1981-01-01
A two-dimensional finite element surface water model was used to study the hydraulic impact of the proposed Interstate Route 326 crossing of the Congaree River near Columbia, SC. The finite element model was assessed as a potential operational tool for analyzing complex highway crossings and other modifications of river flood plains. Infrared aerial photography was used to define regions of homogeneous roughness in the flood plain. Finite element networks approximating flood plain topography were designed using elements of three roughness types. High water marks established during an 8-yr flood that occurred in October 1976 were used to calibrate the model. The maximum flood of record, an approximately 100-yr flood that occurred in August 1908, was modeled in three cases: dikes on the right bank, dikes on the left bank, and dikes on both banks. In each of the three cases, simulations were performed both without and with the proposed highway embankments in place. Detailed information was obtained about backwater effects upstream from the proposed highway embankments, changes in flow distribution resulting from the embankments, and local velocities in the bridge openings. On the basis of results from the model study, the South Carolina Department of Highways and Public Transportation changed the design of several bridge openings. A simulation incorporating the new design for the case with dikes on the left bank indicated that both velocities in the bridge openings and backwater were reduced. A major problem in applying the model was the difficulty in predicting the network detail necessary to avoid local errors caused by roughness discontinuities and large depth gradients.
Lee, Joo Yun; Park, Hyeoun-Ae; Min, Yul Ha
2015-06-01
The transtheoretical model (TTM) has been used to provide tailored nursing for lifestyle management such as diet, physical activity, and smoking cessation. The present study aims to assess intervention delivery methods, intervention elements, and stage-matched interventions, in order to identify how information technology is used in TTM-based research. The relevant literature was selected by two researchers using inclusion criteria after searching for "TTM (transtheoretical or stage of change)" and "nursing" in the databases PubMed and CINAHL. The selected studies were categorized in terms of study characteristics, intervention delivery method, intervention element, and use and level of stage-matched intervention. A total of 35 studies were selected, including eight studies that used information and communication technology (ICT). Nine different intervention delivery methods were used, of which face-to-face was the most common (24 instances). Of the 35 studies, 26 provided stage-matched interventions. Seven different intervention elements were used, of which counseling was the most common (27 instances). Of all the intervention elements, tailored feedback used ICT the most (seven instances out of nine), and there was a significant difference in the rate of ICT usage among intervention elements. ICT is not yet actively used in TTM-based nursing interventions, and stage-matched interventions and TTM concepts were found to be only partially used. Therefore, it is necessary to develop a variety of ways to use ICT in tailored nursing interventions and to use TTM frameworks and concepts.
A review of clinical decision making: models and current research.
Banning, Maggi
2008-01-01
The aim of this paper was to review the current literature on clinical decision-making models and the educational application of models to clinical practice. This was achieved by exploring the function and related research of the three available models of clinical decision making: the information-processing model, the intuitive-humanist model, and the clinical decision-making model. Clinical decision making is a unique process that involves the interplay between knowledge of pre-existing pathological conditions, explicit patient information, nursing care, and experiential learning. Historically, two models of clinical decision making are recognized from the literature: the information-processing model and the intuitive-humanist model. The usefulness and application of both models have been examined in relation to the provision of nursing care and care-related outcomes. More recently a third model of clinical decision making has been proposed. This new multidimensional model contains elements of the information-processing model but also examines patient-specific elements that are necessary for cue and pattern recognition. The method was a literature review, evaluating literature retrieved from the MEDLINE, CINAHL, OVID, PubMed, and EBSCO systems and the Internet from 1980 to November 2005. The characteristics of the three models of decision making were identified and the related research discussed. Three approaches to clinical decision making were identified, each having its own attributes and uses. The most recent addition to clinical decision making is a theoretical, multidimensional model which was developed through an evaluation of current literature and the assessment of a limited number of research studies that focused on the clinical decision-making skills of inexperienced nurses in pseudo-clinical settings. The components of this model and its relative merits for clinical practice are discussed. It is proposed that clinical decision making improves as the nurse gains experience of nursing patients within a specific specialty, and with experience, nurses gain a sense of saliency in relation to decision making. Experienced nurses may use all three forms of clinical decision making both independently and concurrently to solve nursing-related problems. It is suggested that O'Neill's clinical decision-making model could be tested by educators and experienced nurses to assess the efficacy of this hybrid approach to decision making.
NASA Astrophysics Data System (ADS)
ChePa, Noraziah; Jasin, Noorhayati Md; Bakar, Nur Azzah Abu
2017-10-01
Failure to prevent or control the challenges of information system (IS) implementation has led to implementation failures. Successful implementation of IS is a challenging task for any organization, including government hospitals. Government has invested large amounts of money in IS projects to improve service delivery in healthcare; however, several of these projects failed to be implemented successfully due to several factors. This article proposes a prevention model which incorporates Change Management (CM) concepts to avoid the failure of IS implementation, hence ensuring its success. Challenges of IS implementation in government hospitals were discovered through extensive literature review and in-depth interviews. A prevention model has been designed to address these challenges. The model covers three main phases of implementation (pre-implementation, during implementation, and post-implementation) by adopting the CM practices of Lewin's, Kotter's, and Prosci's CM models. Six CM elements comprising thirteen sub-elements, adopted from the three CM models, are used to handle critical failure factors (CFFs) related to human and support issues: guiding team, resistance avoidance, IS adoption, enforcement, monitoring, and IS sustainability. Successful practice of the proposed mapping is expected to prevent CFFs from occurring, hence ensuring a successful implementation of IS in the hospitals. The proposed model has been presented to and successfully evaluated by domain experts from the selected hospitals. It is believed to be beneficial for top management, IT practitioners, and medical practitioners in preventing IS implementation failure in government hospitals and ensuring successful implementation.
Bradbury, Angela R; Patrick-Miller, Linda; Long, Jessica; Powers, Jacquelyn; Stopfer, Jill; Forman, Andrea; Rybak, Christina; Mattie, Kristin; Brandt, Amanda; Chambers, Rachelle; Chung, Wendy K; Churpek, Jane; Daly, Mary B; Digiovanni, Laura; Farengo-Clark, Dana; Fetzer, Dominique; Ganschow, Pamela; Grana, Generosa; Gulden, Cassandra; Hall, Michael; Kohler, Lynne; Maxwell, Kara; Merrill, Shana; Montgomery, Susan; Mueller, Rebecca; Nielsen, Sarah; Olopade, Olufunmilayo; Rainey, Kimberly; Seelaus, Christina; Nathanson, Katherine L; Domchek, Susan M
2015-06-01
Multiplex genetic testing, including both moderate- and high-penetrance genes for cancer susceptibility, is associated with greater uncertainty than traditional testing, presenting challenges to informed consent and genetic counseling. We sought to develop a new model for informed consent and genetic counseling for four ongoing studies. Drawing from professional guidelines, literature, conceptual frameworks, and clinical experience, a multidisciplinary group developed a tiered-binned genetic counseling approach proposed to facilitate informed consent and improve outcomes of cancer susceptibility multiplex testing. In this model, tier 1 "indispensable" information is presented to all patients. More specific tier 2 information is provided to support variable informational needs among diverse patient populations. Clinically relevant information is "binned" into groups to minimize information overload, support informed decision making, and facilitate adaptive responses to testing. Seven essential elements of informed consent are provided to address the unique limitations, risks, and uncertainties of multiplex testing. A tiered-binned model for informed consent and genetic counseling has the potential to address the challenges of multiplex testing for cancer susceptibility and to support informed decision making and adaptive responses to testing. Future prospective studies including patient-reported outcomes are needed to inform how to best incorporate multiplex testing for cancer susceptibility into clinical practice. Genet Med 17(6), 485-492.
How causal analysis can reveal autonomy in models of biological systems
NASA Astrophysics Data System (ADS)
Marshall, William; Kim, Hyunju; Walker, Sara I.; Tononi, Giulio; Albantakis, Larissa
2017-11-01
Standard techniques for studying biological systems largely focus on their dynamical or, more recently, their informational properties, usually taking either a reductionist or holistic perspective. Yet, studying only individual system elements or the dynamics of the system as a whole disregards the organizational structure of the system: whether there are subsets of elements with joint causes or effects, and whether the system is strongly integrated or composed of several loosely interacting components. Integrated information theory offers a theoretical framework to (1) investigate the compositional cause-effect structure of a system and to (2) identify causal borders of highly integrated elements comprising local maxima of intrinsic cause-effect power. Here we apply this comprehensive causal analysis to a Boolean network model of the fission yeast (Schizosaccharomyces pombe) cell cycle. We demonstrate that this biological model features a non-trivial causal architecture, whose discovery may provide insights about the real cell cycle that could not be gained from holistic or reductionist approaches. We also show how some specific properties of this underlying causal architecture relate to the biological notion of autonomy. Ultimately, we suggest that analysing the causal organization of a system, including key features like intrinsic control and stable causal borders, should prove relevant for distinguishing life from non-life, and thus could also illuminate the origin of life problem. This article is part of the themed issue 'Reconceptualizing the origins of life'.
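The object of analysis here is a synchronous Boolean network. A minimal sketch of such a model follows (the wiring is a toy example, not the published fission yeast network; the IIT quantities themselves are computed with dedicated tools such as PyPhi):

```python
# A minimal sketch: synchronous Boolean network update. Causal analyses like
# IIT are built on exactly this state-transition (cause-effect) structure.
import itertools

# Each node's next state is a Boolean function of the current global state.
rules = {
    "A": lambda s: s["C"] and not s["B"],
    "B": lambda s: s["A"],
    "C": lambda s: not s["A"] or s["B"],
}

def step(state):
    return {node: int(f(state)) for node, f in rules.items()}

# Exhaustively enumerate the transition table (feasible for small networks).
for bits in itertools.product([0, 1], repeat=3):
    state = dict(zip(rules, bits))
    print(state, "->", step(state))
```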
15 CFR 30.6 - Electronic Export Information data elements.
Code of Federal Regulations, 2014 CFR
2014-01-01
The information specified in this section is required for shipments transmitted to the AES. The data elements identified as "mandatory" shall be reported for each...
Data Model for Multi Hazard Risk Assessment Spatial Support Decision System
NASA Astrophysics Data System (ADS)
Andrejchenko, Vera; Bakker, Wim; van Westen, Cees
2014-05-01
The goal of the CHANGES Spatial Decision Support System is to support end-users in making decisions related to risk reduction measures for areas at risk from multiple hydro-meteorological hazards. The crucial parts in the design of the system are the user requirements, the data model, the data storage and management, and the relationships between the objects in the system. The implementation of the data model is carried out entirely with an open source database management system with a spatial extension. The web application is implemented using open source geospatial technologies, with PostGIS as the database, Python for scripting, and GeoServer and JavaScript libraries for visualization and the client-side user interface. The model can handle information from different study areas (currently, study areas from France, Romania, Italy, and Poland are considered). Furthermore, the data model handles information about administrative units, projects accessible by different types of users, user-defined hazard types (floods, snow avalanches, debris flows, etc.), hazard intensity maps of different return periods, spatial probability maps, elements-at-risk maps (buildings, land parcels, linear features, etc.), and economic and population vulnerability information dependent on the hazard type and the type of the element at risk, in the form of vulnerability curves. The system has an inbuilt database of vulnerability curves, but users can also add their own. Included in the model is the management of a combination of different scenarios (e.g. related to climate change, land use change, or population change) and alternatives (possible risk-reduction measures), as well as data structures for saving the calculated economic or population loss or exposure per element at risk, aggregating the loss and exposure using the administrative unit maps, and finally producing the risk maps. The risk data can be used for cost-benefit analysis (CBA) and spatial multi-criteria evaluation (SMCE), and the data model includes data structures for both. The model is at the stage where risk and cost-benefit calculations can be stored; the remaining parts are currently under development. Multi-criteria information, user management, and their relation with the rest of the model are our next steps. A carefully designed data model plays a crucial role in the development of the whole system: it enables rapid development, keeps the data consistent, and, in the end, supports the end-user in making good decisions on risk-reduction measures related to multiple natural hazards. This work is part of the EU FP7 Marie Curie ITN "CHANGES" project (www.changes-itn.edu).
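The loss computation at the heart of such a model can be sketched compactly. Below is a minimal illustration in Python (class and field names are assumptions, not the CHANGES schema): the loss per element at risk is its value times the damage fraction read off the vulnerability curve at the local hazard intensity.

```python
# A minimal sketch of the core risk objects: a vulnerability curve maps
# hazard intensity to a damage fraction, applied to an element at risk.
from dataclasses import dataclass

@dataclass
class VulnerabilityCurve:
    points: list  # (intensity, damage fraction) pairs, sorted by intensity

    def damage(self, intensity: float) -> float:
        # piecewise-linear interpolation over the curve
        pts = self.points
        if intensity <= pts[0][0]:
            return pts[0][1]
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if intensity <= x1:
                return y0 + (y1 - y0) * (intensity - x0) / (x1 - x0)
        return pts[-1][1]

@dataclass
class ElementAtRisk:
    name: str
    value: float      # economic value of the element
    intensity: float  # hazard intensity at this element, from the hazard map

curve = VulnerabilityCurve([(0.0, 0.0), (1.0, 0.4), (2.0, 1.0)])
building = ElementAtRisk("building_17", value=250_000.0, intensity=1.5)
loss = building.value * curve.damage(building.intensity)
print(f"expected loss: {loss:.0f}")   # 250000 * 0.7 = 175000
```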
Booth, Richard G
2012-06-01
In this review, studies examining information and communication technology used by nurses in clinical practice were examined. Overall, a total of 39 studies were assessed, spanning a time period from 1995 to 2008. The impacts of the various health information and communication technologies evaluated by individual studies were synthesized using DeLone and McLean's six-dimensional framework for evaluating information systems success (i.e., System Quality, Information Quality, Service Quality, Use, User Satisfaction, and Net Benefits). Overall, the majority of researchers reported results related to the overall Net Benefits (positive, negative, and indifferent) of the health information and communication technology used by nurses. Attitudes and user satisfaction with technology were also commonly measured attributes. The current iteration of the DeLone and McLean model is effective at synthesizing basic elements of health information and communication technology use by nurses. Nevertheless, the model lacks the sociotechnical sensitivity to capture deeper nurse-technology relationships. Limitations and recommendations are provided for researchers considering using the DeLone and McLean model for evaluating health information and communication technology used by nurses.
Finite Element Analysis of M15 and M19 Mines Under Wheeled Vehicle Load
2008-03-01
...the plate statically. An implicit finite element option in the LS-DYNA code was used to model the pressure generated in the explosive by the... figure 4 for the M19 mines. Maximum pressure in the explosive for each mine, calculated by the LS-DYNA code, is shown for a variety of plate sizes and weights...
NASA Astrophysics Data System (ADS)
Navadeh, N.; Goroshko, I. O.; Zhuk, Y. A.; Fallah, A. S.
2017-11-01
An approach to the construction of a beam-type simplified model of a horizontal-axis wind turbine composite blade, based on the finite element method, is proposed. The model allows effective and accurate description of the low bending vibration modes, taking into account the coupling between flapwise and lead-lag vibration modes that arises from the non-uniform distribution of the twist angle along the blade length. The identification of model parameters is carried out on the basis of modal data obtained by more detailed finite element simulations and subsequent application of the 'DIRECT' optimisation algorithm. Stable identification results were obtained using absolute deviations in frequencies and in modal displacements in the objective function, together with additional a priori information (boundedness and monotony) on the solution properties.
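A sketch of the identification step using SciPy's implementation of the DIRECT algorithm follows; the beam parameterization, the stand-in frequency model, and the target values are invented for illustration, not taken from the paper:

```python
# A minimal sketch: identify beam parameters by minimizing absolute
# deviations between model frequencies and reference (detailed FE) data.
import numpy as np
from scipy.optimize import direct, Bounds

f_ref = np.array([0.7, 2.1, 4.4])          # target modal frequencies, Hz

def model_frequencies(p):
    ei_flap, ei_edge, twist = p            # illustrative beam parameters
    # stand-in for the simplified beam model's frequency output
    return np.array([0.7 * ei_flap, 2.0 * ei_edge, 4.0 * (1 + 0.1 * twist)])

def objective(p):
    # absolute frequency deviations (modal displacements omitted here)
    return np.abs(model_frequencies(p) - f_ref).sum()

res = direct(objective, Bounds([0.5, 0.5, 0.0], [2.0, 2.0, 1.0]))
print(res.x, res.fun)
```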
Finite Element Vibration Modeling and Experimental Validation for an Aircraft Engine Casing
NASA Astrophysics Data System (ADS)
Rabbitt, Christopher
This thesis presents a procedure for the development and validation of a theoretical vibration model, applies this procedure to a pair of aircraft engine casings, and compares select parameters from experimental testing of those casings to those from a theoretical model using the Modal Assurance Criterion (MAC) and linear regression coefficients. A novel method of determining the optimal MAC between axisymmetric results is developed and employed. It is concluded that the dynamic finite element models developed as part of this research are fully capable of modelling the modal parameters within the frequency range of interest. Confidence intervals calculated in this research for correlation coefficients provide important information regarding the reliability of predictions, and it is recommended that these intervals be calculated for all comparable coefficients. The procedure outlined for aligning mode shapes around an axis of symmetry proved useful, and the results are promising for the development of further optimization techniques.
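The MAC referred to above compares a test mode shape with a model mode shape; values near 1 indicate the same shape. A minimal sketch (the vectors are illustrative):

```python
# A minimal sketch of the Modal Assurance Criterion:
# MAC = |phi_a^H phi_b|^2 / ((phi_a^H phi_a)(phi_b^H phi_b)), 1 = identical.
import numpy as np

def mac(phi_a: np.ndarray, phi_b: np.ndarray) -> float:
    num = np.abs(np.vdot(phi_a, phi_b)) ** 2
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
    return float(num / den)

phi_test = np.array([0.12, 0.48, 0.95, 0.47, 0.11])  # experimental shape
phi_fem  = np.array([0.10, 0.50, 1.00, 0.50, 0.10])  # FE model shape
print(mac(phi_test, phi_fem))   # close to 1 for well-correlated modes
```

For axisymmetric structures such as engine casings, one shape of a mode pair can additionally be rotated about the symmetry axis before computing MAC, which is the alignment problem the thesis addresses.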
Evolution of an Implementation-Ready Interprofessional Pain Assessment Reference Model
Collins, Sarah A; Bavuso, Karen; Swenson, Mary; Suchecki, Christine; Mar, Perry; Rocha, Roberto A.
2017-01-01
Standards to increase the consistency of comprehensive pain assessments are important for safety, quality, and analytics activities, including meeting Joint Commission requirements and learning the best management strategies and interventions for the current prescription opioid epidemic. In this study we describe the development and validation of a Pain Assessment Reference Model ready for implementation in EHR forms and flowsheets. Our process resulted in 5 successive revisions of the reference model, which more than doubled the number of data elements to 47. The organization of the model evolved during validation sessions with panels totaling 48 subject matter experts (SMEs) to include 9 sets of data elements, with one set recommended as a minimal data set. The reference model also evolved when implemented in EHR forms and flowsheets, indicating specifications, such as cascading logic, that are important to inform secondary use of data.
NASA Technical Reports Server (NTRS)
Cruse, T. A.
1987-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.
1988-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
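The CDF-of-exceedance output described above can be illustrated with the simplest possible probabilistic structural analysis, plain Monte Carlo sampling. The distributions and the stress model below are invented for illustration; PSAM's PFEM/PAAM/PBEM methods are far more efficient than brute-force sampling, but the output format is the same:

```python
# A minimal sketch: Monte Carlo estimate of a cumulative probability of
# exceedance curve for a response variable with stochastic inputs.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Illustrative stochastic inputs: load and cross-section area (units arbitrary).
load = rng.normal(100.0, 15.0, n)
area = rng.normal(2.0, 0.05, n)
stress = load / area                       # response variable

levels = np.linspace(30, 90, 13)
for s in levels:
    p = (stress > s).mean()                # exceedance probability at level s
    print(f"P(stress > {s:5.1f}) = {p:.4f}")
```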
From in silico astrocyte cell models to neuron-astrocyte network models: A review.
Oschmann, Franziska; Berry, Hugues; Obermayer, Klaus; Lenk, Kerstin
2018-01-01
The idea that astrocytes may be active partners in synaptic information processing has recently emerged from abundant experimental reports. Because of their spatial proximity to neurons and their bidirectional communication with them, astrocytes are now considered an important third element of the synapse. Astrocytes integrate and process synaptic information and, by doing so, generate cytosolic calcium signals that are believed to reflect neuronal transmitter release. Moreover, they regulate neuronal information transmission by releasing gliotransmitters into the synaptic cleft, affecting both pre- and postsynaptic receptors. Concurrent with the first experimental reports of the astrocytic impact on neural network dynamics, computational models describing astrocytic functions have been developed. In this review, we give an overview of the published computational models of astrocytic functions, from single-cell dynamics to the tripartite synapse level and network models of astrocytes and neurons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-02-17
The Natural Gas Transmission and Distribution Model (NGTDM) is the component of the National Energy Modeling System (NEMS) that is used to represent the domestic natural gas transmission and distribution system. NEMS was developed in the Office of Integrated Analysis and Forecasting of the Energy Information Administration (EIA). NEMS is the third in a series of computer-based, midterm energy modeling systems used since 1974 by the EIA and its predecessor, the Federal Energy Administration, to analyze domestic energy-economy markets and develop projections. The NGTDM is the model within NEMS that represents the transmission, distribution, and pricing of natural gas. The model also includes representations of the end-use demand for natural gas, the production of domestic natural gas, and the availability of natural gas traded on the international market, based on information received from other NEMS models. The NGTDM determines the flow of natural gas in an aggregate, domestic pipeline network, connecting domestic and foreign supply regions with 12 demand regions. The methodology employed allows the analysis of impacts of regional capacity constraints in the interstate natural gas pipeline network and the identification of pipeline capacity expansion requirements. There is an explicit representation of core and noncore markets for natural gas transmission and distribution services, and the key components of pipeline tariffs are represented in a pricing algorithm. Natural gas pricing and flow patterns are derived by obtaining a market equilibrium across the three main elements of the natural gas market: the supply element, the demand element, and the transmission and distribution network that links them. The NGTDM consists of four modules: the Annual Flow Module, the Capacity Expansion Module, the Pipeline Tariff Module, and the Distributor Tariff Module. A model abstract is provided in Appendix A.
Mining key elements for severe convection prediction based on CNN
NASA Astrophysics Data System (ADS)
Liu, Ming; Pan, Ning; Zhang, Changan; Sha, Hongzhou; Zhang, Bolei; Liu, Liang; Zhang, Meng
2017-04-01
Severe convective weather is a class of weather disaster accompanied by heavy rainfall, gusting wind, hail, etc. Along with recent developments in remote sensing and numerical modeling, high-volume, long-term observational and modeling data have accumulated that capture massive numbers of severe convective events over particular areas and time periods. With these high-volume, high-variety weather data, most existing studies and methods address the dynamical laws, cause analysis, potential-rule study, and prediction enhancement by utilizing the governing equations of fluid dynamics and thermodynamics. In this study, a key-element mining method is proposed for severe convection prediction based on convolutional neural networks (CNNs). It aims to identify the key areas and key elements from huge amounts of historical weather data, including conventional measurements, weather radar, satellite observations, and numerical modeling and/or reanalysis data. In this manner, the machine-learning-based method could help human forecasters in their decision-making on operational forecasts of severe convective weather by extracting key information from real-time and historical weather big data. The method first utilizes computer vision techniques to complete the data preprocessing of the meteorological variables. Then it utilizes information such as radar maps and expert knowledge to annotate all images automatically. Finally, using the CNN model, it can analyze and evaluate each weather element (e.g., particular variables, patterns, features, etc.), identify the key areas of those critical weather elements, and help forecasters quickly screen out the key elements from huge amounts of observation data under current weather conditions. Based on rich weather measurement and model data (up to 10 years) over Fujian province in China, where severe convective weather is very active during the summer months, experimental tests were conducted with the new machine-learning method via CNN models. Based on the analysis of the experimental results and case studies, the proposed method has the following benefits for severe convection prediction: (1) it helps forecasters narrow down the scope of analysis and saves lead time for high-impact severe convection; (2) it processes huge amounts of weather big data with machine-learning methods rather than relying on traditional theory and knowledge alone, providing a new way to explore and quantify severe convective weather; and (3) it provides machine-learning-based end-to-end analysis and processing with considerable scalability in data volume, accomplishing the analysis without human intervention.
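The abstract does not give the network architecture, so the following is only a minimal sketch of the general idea: a CNN that maps a stack of gridded weather fields to a per-pixel severe-convection probability map. The layer sizes and field count are assumptions.

```python
# A minimal sketch (architecture assumed): gridded weather fields in,
# per-pixel probability of severe convection out.
import torch
import torch.nn as nn

class KeyElementCNN(nn.Module):
    def __init__(self, n_fields: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_fields, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=1),   # per-pixel logit
        )

    def forward(self, x):                      # x: (batch, fields, H, W)
        return torch.sigmoid(self.net(x))      # probability map

model = KeyElementCNN()
fields = torch.randn(4, 8, 64, 64)             # radar, satellite, reanalysis...
print(model(fields).shape)                     # torch.Size([4, 1, 64, 64])
```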
Adeeb A. Rahman; Thomas J. Urbanik; Mustafa Mahamid
2002-01-01
This research develops a finite element model to study the response of a panel made of a typical commercial corrugated fiberboard to an induced moisture function at one side of the fiberboard. The model predicts how moisture will diffuse through the fiberboard's layers (medium and liners), providing information on moisture content at any...
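The abstract does not state the governing equations, but a layered moisture-diffusion model of this kind is usually a Fickian diffusion problem with a layer-dependent diffusivity; a generic one-dimensional form (a sketch, not the authors' formulation) is

\[
\frac{\partial C}{\partial t} \;=\; \frac{\partial}{\partial x}\!\left( D(x)\,\frac{\partial C}{\partial x} \right),
\]

where C is the moisture content and D(x) takes different values in the liners and the medium.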
ERIC Educational Resources Information Center
Romero, Sonia J.; Ordoñez, Xavier G.; Ponsoda, Vincente; Revuelta, Javier
2014-01-01
Cognitive Diagnostic Models (CDMs) aim to provide information about the degree to which individuals have mastered specific attributes that underlie the success of these individuals on test items. The Q-matrix is a key element in the application of CDMs, because it contains the item-attribute links representing the cognitive structure proposed for solving…
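Concretely, a Q-matrix is just a binary items-by-attributes matrix. A minimal sketch follows (entries invented for illustration; the conjunctive rule shown is that of DINA-type models):

```python
# A minimal sketch of a Q-matrix: rows are test items, columns are
# attributes; q[i, k] = 1 means item i requires attribute k.
import numpy as np

attributes = ["subtraction", "fractions", "common denominator"]
q = np.array([
    [1, 0, 0],   # item 1 needs subtraction only
    [1, 1, 0],   # item 2 needs subtraction and fractions
    [0, 1, 1],   # item 3 needs fractions and a common denominator
])

# In a conjunctive CDM, a respondent with attribute profile alpha is
# expected to solve item i only if alpha covers every attribute in row i.
alpha = np.array([1, 1, 0])
expected = (q <= alpha).all(axis=1).astype(int)
print(expected)   # [1 1 0]
```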
NASA Technical Reports Server (NTRS)
Schmalzel, John L.; Morris, Jon; Turowski, Mark; Figueroa, Fernando; Oostdyk, Rebecca
2008-01-01
There are a number of architecture models for implementing Integrated Systems Health Management (ISHM) capabilities. For example, approaches based on the OSA-CBM and OSA-EAI models, or specific architectures developed in response to local needs. NASA's John C. Stennis Space Center (SSC) has developed one such version of an extensible architecture in support of rocket engine testing that integrates a palette of functions in order to achieve an ISHM capability. Among the functional capabilities that are supported by the framework are: prognostic models, anomaly detection, a database of supporting health information, root cause analysis, intelligent elements, and integrated awareness. This paper focuses on the role that intelligent elements can play in ISHM architectures. We define an intelligent element as a smart element with sufficient computing capacity to support anomaly detection or other algorithms in support of ISHM functions. A smart element has the capabilities of supporting networked implementations of IEEE 1451.x smart sensor and actuator protocols. The ISHM group at SSC has been actively developing intelligent elements in conjunction with several partners at other Centers, universities, and companies as part of our ISHM approach for better supporting rocket engine testing. We have developed several implementations. Among the key features for these intelligent sensors is support for IEEE 1451.1 and incorporation of a suite of algorithms for determination of sensor health. Regardless of the potential advantages that can be achieved using intelligent sensors, existing large-scale systems are still based on conventional sensors and data acquisition systems. In order to bring the benefits of intelligent sensors to these environments, we have also developed virtual implementations of intelligent sensors.
PDS4 - Some Principles for Agile Data Curation
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.
2015-12-01
PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizing their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independent of the infrastructure's process, application, and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries, and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source for information to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also allowed for the effective management of the informational elements at the common, discipline, and project level. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martz, Roger L.
The Revised Eolus Grid Library (REGL) is a mesh-tracking library that was developed for use with the MCNP6™ computer code so that (radiation) particles can track on an unstructured mesh. The unstructured mesh is a finite element representation of any geometric solid model created with a state-of-the-art CAE/CAD tool. The mesh-tracking library is written using modern Fortran and programming standards; the library is Fortran 2003 compliant. The library was created with a defined application programmer interface (API) so that it could easily integrate with other particle tracking/transport codes. The library does not handle parallel processing via the message passing interface (MPI), but has been used successfully where the host code handles the MPI calls. The library is thread-safe and supports the OpenMP paradigm. As a library, all features are available through the API, and overall a tight coupling between it and the host code is required. Features of the library are summarized in the following list: can accommodate first- and second-order 4-, 5-, and 6-sided polyhedra; any combination of element types may appear in a single geometry model; parts may not contain tetrahedra mixed with other element types; pentahedra and hexahedra can be together in the same part; robust handling of overlaps and gaps; tracks element-to-element to produce path-length results at the element level; finds element numbers for a given mesh location; finds intersection points on element faces for the particle tracks; produces a data file for post-processing results analysis; reads Abaqus .inp (ASCII) input files to obtain information for the global mesh model; supports parallel input processing via MPI; and supports parallel particle transport by both MPI and OpenMP.
Hindcast Wave Information for the Great Lakes: Lake Huron. Wave Information Studies of US Coastlines
1991-12-01
The model used in this study, DWAVE, was developed by Dr. Donald T. Resio of Offshore and Coastal Technologies, Inc. It is described in Resio and Perrie (1989) and in an unpublished contractor's report available from the Wave Information Study (WIS) Project Office. DWAVE is a FORTRAN computer code... discrete elements. Figure 4 shows how energy is partitioned in a directional spectrum within DWAVE. As seen there, each frequency-direction increment...
Business Process Aware IS Change Management in SMEs
NASA Astrophysics Data System (ADS)
Makna, Janis
Changes in the business process usually require changes in the computer-supported information system and, vice versa, changes in the information system almost always cause at least some changes in the business process. In many situations it is not even possible to detect which of those changes are causes and which of them are effects. Nevertheless, it is possible to identify a set of changes that usually happen when one of the elements of the set changes its state. These sets of changes may be used as patterns for situation analysis, to anticipate the full range of activities to be performed to get the business process and/or information system back to a stable state after it is lost because of changes in one of the elements. Knowledge about the change pattern gives an opportunity to manage changes of information systems even if business process models and information systems architecture are not neatly documented, as is the case in many SMEs. Using change patterns it is possible to know whether changes in information systems are to be expected and how changes in information system activities, data, and users will impact different aspects of the business process supported by the information system.
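The change-pattern idea can be pictured as a simple lookup: a detected change triggers a checklist of expected co-changes, usable even without full IS/BP documentation. A minimal sketch (the pattern contents are invented for illustration, not taken from the chapter):

```python
# A minimal sketch: change patterns as a mapping from a triggering change
# to the set of co-changes usually observed with it.
CHANGE_PATTERNS = {
    "business_process.new_activity": {
        "is.add_data_fields", "is.update_forms", "users.retrain",
    },
    "is.replace_component": {
        "business_process.adjust_procedures", "data.migrate", "users.retrain",
    },
}

def anticipated_changes(trigger: str) -> set:
    """Return the co-changes to plan for when `trigger` is detected."""
    return CHANGE_PATTERNS.get(trigger, set())

print(anticipated_changes("business_process.new_activity"))
```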
Componentware Approaches in Management Information Systems
2000-11-01
...functionality. It offers plug & play readiness for service and is cooperative in combination with other programs (Griffel 1998). The component view has... Componentware approaches provide means that support...
Computational Molecular Modeling Methods Applied to Screening for Toxicity
The risk to human health and the environment of chemicals that result from human activity often must be evaluated when relevant elements of the preferred data set are unavailable. Therefore, strategies are needed that estimate this information and prioritize the outstanding data...
NASA Astrophysics Data System (ADS)
Braun-Dullaeus, Karl-Ulrich; Traxel, Kurt
1995-02-01
One method for estimating cooling rates of meteorite parent bodies is to model measured nickel distributions in taenite lamellae of iron meteorites. Goldstein and Ogilvie (Geochim. Cosmochim. Acta 29, 893, 1965) and Rasmussen (Icarus 45, 564, 1981) developed techniques based on this idea to examine the cooling history in the temperature range between ~700 and ~400°C. As a result of Instrumental Neutron Activation Analysis (INAA), Rasmussen et al. (Meteoritics 23, 105, 1988) postulated that some trace elements would also be good cooling rate indicators. They argued that elements with distinct diffusion behavior are sensitive to different temperature ranges. The new Heidelberg proton microprobe uses the method of Proton Induced X-ray Emission (PIXE) for elemental analysis. This microprobe is an appropriate instrument to measure distributions of trace elements with a spatial resolution of 2 μm. We demonstrated on the iron meteorites Cape York (Agpalilik), Toluca, and Odessa that the elements copper, zinc, gallium, and germanium imitate the profiles of nickel in taenite lamellae. The interpretation of the Zn, Ga, and Ge profiles leads to the conclusion that these elements undergo diffusion mechanisms comparable to those of Ni. The numerical simulation of Cu distributions with a simplified model shows that little new information can be obtained about the cooling history of the meteorites by modelling Cu profiles. To simulate Zn, Ga, or Ge distributions, the use of ternary phase diagrams is necessary.
Pike, William A; Riensche, Roderick M; Best, Daniel M; Roberts, Ian E; Whyatt, Marie V; Hart, Michelle L; Carr, Norman J; Thomas, James J
2012-09-18
Systems and computer-implemented processes for storage and management of information artifacts collected by information analysts using a computing device. The processes and systems can capture a sequence of interactive operation elements that are performed by the information analyst, who is collecting an information artifact from at least one of the plurality of software applications. The information artifact can then be stored together with the interactive operation elements as a snippet on a memory device, which is operably connected to the processor. The snippet comprises a view from an analysis application, data contained in the view, and the sequence of interactive operation elements stored as a provenance representation comprising operation element class, timestamp, and data object attributes for each interactive operation element in the sequence.
NASA Astrophysics Data System (ADS)
Szarf, Krzysztof; Combe, Gael; Villard, Pascal
2015-02-01
The mechanical performance of underground flexible structures such as buried pipes or culverts made of plastics depends not only on the properties of the structure, but also on the material surrounding it. Flexible drains can deflect by 30% with the joints staying tight, or even invert. Large deformations of the structure are difficult to model in the framework of the Finite Element Method, but straightforward in Discrete Element Methods. Moreover, the Discrete Element approach is able to provide information about the grain-grain and grain-structure interactions at the microscale. This paper presents numerical and experimental investigations of flexible buried pipe behaviour, with the focus placed on load transfer above the buried structure. The numerical model was able to reproduce the experimental results. A redistribution of load was observed, affected by a number of factors such as particle shape, pipe friction, and pipe stiffness.
Applying the chronic care model to an employee benefits program: a qualitative inquiry.
Schauer, Gillian L; Wilson, Mark; Barrett, Barbara; Honeycutt, Sally; Hermstad, April K; Kegler, Michelle C
2013-12-01
To assess how employee benefits programs may strengthen and/or complement elements of the chronic care model (CCM), a framework used by health systems to improve chronic illness care. A qualitative inquiry consisting of semi-structured interviews with employee benefit administrators and partners from a self-insured, self-administered employee health benefits program was conducted at a large family-owned business in southwest Georgia. Results indicate that the employer adapted and used many health system-related elements of the CCM in the design of their benefit program. Data also suggest that the employee benefits program contributed to self-management skills and to informing and activating patients to interact with the health system. Findings suggest that employee benefits programs can use aspects of the CCM in their own benefit design, and can structure their benefits to contribute to patient-related elements from the CCM.
Sample Size in Qualitative Interview Studies: Guided by Information Power.
Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit
2015-11-27
Sample sizes must be ascertained in qualitative studies, as in quantitative studies, but not by the same means. The prevailing concept for sample size in qualitative studies is "saturation." Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose the concept "information power" to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds, relevant for the actual study, the fewer participants are needed. We suggest that the size of a sample with sufficient information power depends on (a) the aim of the study, (b) sample specificity, (c) use of established theory, (d) quality of dialogue, and (e) analysis strategy. We present a model where these elements of information and their relevant dimensions are related to information power. Application of this model in the planning and during data collection of a qualitative study is discussed.
OOM - OBJECT ORIENTATION MANIPULATOR, VERSION 6.1
NASA Technical Reports Server (NTRS)
Goza, S. P.
1994-01-01
The Object Orientation Manipulator (OOM) is an application program for creating, rendering, and recording three-dimensional computer-generated still and animated images. This is done using geometrically defined 3D models, cameras, and light sources, referred to collectively as animation elements. OOM does not provide the tools necessary to construct 3D models; instead, it imports binary format model files generated by the Solid Surface Modeler (SSM). Model files stored in other formats must be converted to the SSM binary format before they can be used in OOM. SSM is available as MSC-21914 or as part of the SSM/OOM bundle, COS-10047. Among OOM's features are collision detection (with visual and audio feedback), the capability to define and manipulate hierarchical relationships between animation elements, stereographic display, and ray-traced rendering. OOM uses Euler angle transformations for calculating the results of translation and rotation operations. OOM provides an interactive environment for the manipulation and animation of models, cameras, and light sources. Models are the basic entity upon which OOM operates and are therefore considered the primary animation elements. Cameras and light sources are considered secondary animation elements. A camera, in OOM, is simply a location within the three-space environment from which the contents of the environment are observed. OOM supports the creation and full animation of cameras. Light sources can be defined, positioned and linked to models, but they cannot be animated independently. OOM can simultaneously accommodate as many animation elements as the host computer's memory permits. Once the required animation elements are present, the user may position them, orient them, and define any initial relationships between them. Once the initial relationships are defined, the user can display individual still views for rendering and output, or define motion for the animation elements by using the Interp Animation Editor. The program provides the capability to save still images, animated sequences of frames, and the information that describes the initialization process for an OOM session. OOM provides the same rendering and output options for both still and animated images. OOM is equipped with a robust model manipulation environment featuring a full screen viewing window, a menu-oriented user interface, and an interpolative Animation Editor. It provides three display modes: solid, wire frame, and simple, that allow the user to trade off visual authenticity for update speed. In the solid mode, each model is drawn based on the shading characteristics assigned to it when it was built. All of the shading characteristics supported by SSM are recognized and properly rendered in this mode. If increasing model complexity impedes the operation of OOM in this mode, then wireframe and simple modes are available. These provide substantially faster screen updates than solid mode. The creation and placement of cameras and light sources is under complete control of the user. One light source is provided in the default element set. It is modeled as a direct light source providing a type of lighting analogous to that provided by the Sun. OOM can accommodate as many light sources as the memory of the host computer permits. Animation is created in OOM using a technique called key frame interpolation. First, various program functions are used to load models, load or create light sources and cameras, and specify initial positions for each element. 
When these steps are completed, the Interp function is used to create an animation sequence for each element to be animated. An animation sequence consists of a user-defined number of frames (screen images) with some subset of those being defined as key frames. The motion of the element between key frames is interpolated automatically by the software. Key frames thus act as transition points in the motion of an element. This saves the user from having to individually define element data at each frame of a sequence. Animation frames and still images can be output to videotape recorders, film recorders, color printers, and disk files. OOM is written in the C language for implementation on SGI IRIS 4D series workstations running the IRIX operating system. A minimum of 8 MB of RAM is recommended for this program. The standard distribution medium for OOM is a .25 inch streaming magnetic tape cartridge in UNIX tar format. OOM is also offered as a bundle with a related program, SSM (Solid Surface Modeler). Please see the abstract for SSM/OOM (COS-10047) for information about the bundled package. OOM was released in 1993.
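The key-frame technique described above is easy to illustrate. Below is a minimal sketch in Python (OOM itself is a C program, and the abstract does not specify its interpolation scheme beyond it being automatic): element positions are given at key frames, and the in-between frames are interpolated linearly.

```python
import numpy as np

def interpolate_frames(keyframes, n_frames):
    """keyframes: dict mapping frame index -> (x, y, z) position at that key frame."""
    idx = sorted(keyframes)
    xs = np.array(idx, dtype=float)
    ys = np.array([keyframes[i] for i in idx], dtype=float)
    t = np.arange(n_frames)
    # Interpolate each coordinate independently between successive key frames.
    return np.column_stack([np.interp(t, xs, ys[:, k]) for k in range(ys.shape[1])])

# A model moves from the origin to (10, 0, 5) via a mid-sequence key frame.
path = interpolate_frames({0: (0, 0, 0), 15: (4, 2, 1), 30: (10, 0, 5)}, n_frames=31)
print(path[7])   # automatically interpolated position between key frames 0 and 15
```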
Manning’s equation and two-dimensional flow analogs
NASA Astrophysics Data System (ADS)
Hromadka, T. V., II; Whitley, R. J.; Jordan, N.; Meyer, T.
2010-07-01
Two-dimensional (2D) flow models based on the well-known governing 2D flow equations are applied for floodplain analysis purposes. These 2D models numerically solve the governing flow equations simultaneously or explicitly on a discretization of the floodplain using grid tiles or similar tile cell geometry, called "elements". By use of automated information systems such as digital terrain modeling, digital elevation models, and GIS, large-scale topographic floodplain maps can be readily discretized into thousands of elements that densely cover the floodplain in an edge-to-edge form. However, the assumed principal flow directions of the flow model analog, as applied across an array of elements, typically do not align with the floodplain flow streamlines. This paper examines the mathematical underpinnings of a four-direction flow analog using an array of square elements with respect to floodplain flow streamlines that are not in alignment with the analog's principal flow directions. It is determined that application of Manning's equation to estimate the friction slope terms of the governing flow equations, in directions that are not coincident with the flow streamlines, may introduce a bias in modeling results, in the form of slight underestimation of flow depths. It is also determined that the maximum theoretical bias occurs when a single square element is rotated by about 13°, not 45° as might be intuitively expected. The bias as a function of rotation angle for an array of square elements follows approximately the bias for a single square element. For both the theoretical single square element and an array of square elements, the bias as a function of alignment angle follows a relatively constant value from about 5° to about 85°, centered at about 45°. This bias was first noted about a decade prior to the present paper, and the magnitude of this bias was estimated then to be about 20% at about 10° misalignment. An adjustment of Manning's n is investigated based on a steady-state uniform flow problem, but the magnitude of the adjustment (about 20%) is on the order of the magnitude of the accepted ranges of friction factors. For usual cases where random streamline trajectory variability within the floodplain flow is greater than a few degrees from perfect alignment, the apparent bias appears to be implicitly included in the Manning's n values. It can be concluded that the array of square elements may be applied over the digital terrain model without respect to topographic flow directions.
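Manning's equation itself is compact, and a naive per-axis decomposition already shows how a direction-dependent discrepancy can arise. The sketch below (Python, SI units, illustrative numbers) applies Manning's equation to the friction slope components along misaligned grid axes and recombines them; note that this naive recombination peaks at 45°, whereas the paper's fuller analysis of element geometry places the maximum bias near 13°.

```python
import numpy as np

def manning_velocity(n, R, S):
    """Manning's equation (SI): V = (1/n) * R**(2/3) * sqrt(S)."""
    return (1.0 / n) * R ** (2.0 / 3.0) * np.sqrt(S)

n, R, S = 0.035, 1.0, 0.001          # roughness, hydraulic radius (m), friction slope
v_aligned = manning_velocity(n, R, S)
for deg in (0.0, 13.0, 45.0):
    th = np.radians(deg)
    # Apply Manning's equation per grid axis to the slope components, then recombine.
    vx = manning_velocity(n, R, S * np.cos(th))
    vy = manning_velocity(n, R, S * np.sin(th))
    bias = np.hypot(vx, vy) / v_aligned - 1.0
    print(f"misalignment {deg:4.1f} deg: velocity discrepancy {bias:+.1%}")
```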
NASA Astrophysics Data System (ADS)
Eltom, Hassan A.; Abdullatif, Osman M.; Makkawi, Mohammed H.; Eltoum, Isam-Eldin A.
2017-03-01
The interpretation of depositional environments provides important information to understand facies distribution and geometry. The classical approach to interpret depositional environments principally relies on the analysis of lithofacies, biofacies and stratigraphic data, among others. An alternative method, based on geochemical data (chemical element data), is advantageous because it can simply, reproducibly and efficiently interpret and refine the interpretation of the depositional environment of carbonate strata. Here we geochemically analyze and statistically model carbonate samples (n = 156) from seven sections of the Arab-D reservoir outcrop analog of central Saudi Arabia, to determine whether the elemental signatures (major, trace and rare earth elements [REEs]) can be effectively used to predict depositional environments. We find that lithofacies associations of the studied outcrop (peritidal to open marine depositional environments) possess altered REE signatures, and that this trend increases stratigraphically from bottom-to-top, which corresponds to an upward shallowing of depositional environments. The relationship between REEs and major, minor and trace elements indicates that contamination by detrital materials is the principal source of REEs, whereas redox condition, marine and diagenetic processes have minimal impact on the relative distribution of REEs in the lithofacies. In a statistical model (factor analysis and logistic regression), REEs, major and trace elements cluster together and serve as markers to differentiate between peritidal and open marine facies and to differentiate between intertidal and subtidal lithofacies within the peritidal facies. The results indicate that statistical modelling of the elemental composition of carbonate strata can be used as a quantitative method to predict depositional environments and regional paleogeography. The significance of this study lies in offering new assessments of the relationships between lithofacies and geochemical elements by using advanced statistical analysis, a method that could be used elsewhere to interpret depositional environment and refine facies models.
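A hedged sketch of the statistical step named in the abstract (factor analysis followed by logistic regression), here with scikit-learn on placeholder data; the study's actual element list, sample values, and model settings are not given in the abstract.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.lognormal(size=(156, 10))    # 156 samples x 10 elemental concentrations (synthetic)
# Synthetic lithofacies labels: 0 = peritidal, 1 = open marine.
y = (X[:, 0] + rng.normal(0.0, 1.0, 156) > 1.5).astype(int)

factors = FactorAnalysis(n_components=3, random_state=0).fit_transform(X)
clf = LogisticRegression().fit(factors, y)
print("in-sample accuracy:", clf.score(factors, y))
```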
Possibilities of Land Administration Domain Model (LADM) Implementation in Nigeria
NASA Astrophysics Data System (ADS)
Babalola, S. O.; Rahman, A. Abdul; Choon, L. T.; Van Oosterom, P. J. M.
2015-10-01
LADM covers the essential information components of land administration and management, including those over water and elements above and below the surface of the earth. The LADM standard provides an abstract conceptual model with three packages and one sub-package, and defines terminology for a land administration system that allows a shared description of different formal, customary or informal tenures. The standard provides the basis for national and regional profiles and enables the combination of land management information from different sources in a coherent manner. Given this, this paper starts with a description of land and land administration in Nigeria. The pre-colonial, colonial and post-colonial eras, together with the organizational structure, are discussed; this discussion is important for understanding the background of any improvement needed for LADM implementation in Nigeria. LADM (ISO 19152) and its packages are then discussed, and the different aspects of each package and its classes are compared with the Nigerian land administration and cadastral system. The comparison shows that Nigerian land administration embodies concepts similar to the LADM packages, although the terminology is not the same in all cases. Having studied the conceptualization and application of LADM as a model covering the essential information components of land administration, including those on land, over water and above and below the surface of the earth, the study finds the standard suitable for the country. The model can therefore be adopted into the Nigerian land administration system by mapping in some of the concepts of LADM.
NASA Astrophysics Data System (ADS)
Hartmann, Timo; Tanner, Gregor; Xie, Gang; Chappell, David; Bajars, Janis
2016-09-01
Dynamical Energy Analysis (DEA) combined with the Discrete Flow Mapping technique (DFM) has recently been introduced as a mesh-based high-frequency method for modelling structure-borne sound in complex built-up structures. This has proven to enhance vibro-acoustic simulations considerably by making it possible to work directly on existing finite element meshes, circumventing time-consuming and costly re-modelling strategies. In addition, DFM provides detailed spatial information about the vibrational energy distribution within a complex structure in the mid-to-high frequency range. We will present here progress in the development of the DEA method towards handling complex FEM-meshes including Rigid Body Elements. In addition, structure-borne transmission paths due to spot welds are considered. We will present applications for a car floor structure.
MIPSPlantsDB—plant database resource for integrative and comparative plant genome research
Spannagl, Manuel; Noubibou, Octave; Haase, Dirk; Yang, Li; Gundlach, Heidrun; Hindemitt, Tobias; Klee, Kathrin; Haberer, Georg; Schoof, Heiko; Mayer, Klaus F. X.
2007-01-01
Genome-oriented plant research delivers a rapidly increasing amount of plant genome data. Comprehensive and structured information resources are required to structure and communicate genome and associated analytical data for model organisms as well as for crops. The increase in available plant genomic data enables powerful comparative analysis and integrative approaches. PlantsDB aims to provide data and information resources for individual plant species and, in addition, to build a platform for integrative and comparative plant genome research. PlantsDB is constituted from genome databases for Arabidopsis, Medicago, Lotus, rice, maize and tomato. Complementary data resources for cis elements, repetitive elements and extensive cross-species comparisons are implemented. The PlantsDB portal can be reached at . PMID:17202173
Core formation in the Moon: The mystery of the excess depletion of Mo, W and P
NASA Technical Reports Server (NTRS)
Newsom, H. E.; Maehr, S. A.
1993-01-01
We have evaluated siderophile element depletion models for the Moon in light of our improved statistical treatment of siderophile element abundance data and new information on the physics of core formation. If core formation occurred in the Moon at the large degrees of partial melting necessary for metal segregation, according to recent estimates, then a significant inconsistency (not seen in the eucrite parent body) exists in the depletion of the incompatible siderophile elements Mo, W, and P, compared to other siderophile elements in the Moon. The siderophile data, with the exception of Mo, are most consistent with terrestrial initial siderophile abundances and segregation of a very small core in the Moon. Our improved abundance estimates and possible explanations for these discrepancies are discussed.
The Dynamic Brain: From Spiking Neurons to Neural Masses and Cortical Fields
Deco, Gustavo; Jirsa, Viktor K.; Robinson, Peter A.; Breakspear, Michael; Friston, Karl
2008-01-01
The cortex is a complex system, characterized by its dynamics and architecture, which underlie many functions such as action, perception, learning, language, and cognition. Its structural architecture has been studied for more than a hundred years; however, its dynamics have been addressed much less thoroughly. In this paper, we review and integrate, in a unifying framework, a variety of computational approaches that have been used to characterize the dynamics of the cortex, as evidenced at different levels of measurement. Computational models at different space–time scales help us understand the fundamental mechanisms that underpin neural processes and relate these processes to neuroscience data. Modeling at the single neuron level is necessary because this is the level at which information is exchanged between the computing elements of the brain: the neurons. Mesoscopic models tell us how neural elements interact to yield emergent behavior at the level of microcolumns and cortical columns. Macroscopic models can inform us about whole brain dynamics and interactions between large-scale neural systems such as cortical regions, the thalamus, and brain stem. Each level of description relates uniquely to neuroscience data, from single-unit recordings, through local field potentials to functional magnetic resonance imaging (fMRI), electroencephalogram (EEG), and magnetoencephalogram (MEG). Models of the cortex can establish which types of large-scale neuronal networks can perform computations and characterize their emergent properties. Mean-field and related formulations of dynamics also play an essential and complementary role as forward models that can be inverted given empirical data. This makes dynamic models critical in integrating theory and experiments. We argue that elaborating principled and informed models is a prerequisite for grounding empirical neuroscience in a cogent theoretical framework, commensurate with the achievements in the physical sciences. PMID:18769680
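As one concrete instance of the mesoscopic level discussed in the review, here is a minimal Wilson-Cowan-type neural mass model in Python; the coupling weights and input are arbitrary illustrative values, and the review itself covers many other model families.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(steps=1000, dt=0.1, w_ee=12.0, w_ei=10.0, w_ie=9.0, w_ii=3.0, P=1.2):
    """Coupled excitatory (E) and inhibitory (I) population activities."""
    E = I = 0.1
    trace = []
    for _ in range(steps):
        dE = (-E + sigmoid(w_ee * E - w_ei * I + P)) * dt   # external drive P to E only
        dI = (-I + sigmoid(w_ie * E - w_ii * I)) * dt
        E, I = E + dE, I + dI
        trace.append(E)
    return np.array(trace)

print(simulate()[-5:])   # late-time excitatory population activity
```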
15 CFR 30.6 - Electronic Export Information data elements.
Code of Federal Regulations, 2012 CFR
2012-01-01
The information specified in this section is required for...
15 CFR 30.6 - Electronic Export Information data elements.
Code of Federal Regulations, 2013 CFR
2013-01-01
The information specified in this section is required for...
Surface relief model for photopolymers without cover plating.
Gallego, S; Márquez, A; Ortuño, M; Francés, J; Marini, S; Beléndez, A; Pascual, I
2011-05-23
Surface relief changes provide interesting possibilities for storing diffractive optical elements on photopolymers and are an important source of information for characterizing and understanding the material behaviour. In this paper we present a 3-dimensional model, based on direct measurements of parameters, to predict the relief structures generated on the material. This model is successfully applied to different photopolymers with different values of monomer diffusion. The importance of monomer diffusion in depth is also discussed.
Spent nuclear fuel assembly inspection using neutron computed tomography
NASA Astrophysics Data System (ADS)
Pope, Chad Lee
The research presented here focuses on spent nuclear fuel assembly inspection using neutron computed tomography. Experimental measurements involving neutron beam transmission through a spent nuclear fuel assembly serve as benchmark measurements for an MCNP simulation model. Comparison of measured results to simulation results shows good agreement. Generation of tomography images from MCNP tally results was accomplished using adapted versions of built-in MATLAB algorithms. Multiple fuel assembly models were examined to provide a broad set of conclusions. Tomography images revealing assembly geometric information including the fuel element lattice structure and missing elements can be obtained using high-energy neutrons. A projection difference technique was developed which reveals the substitution of unirradiated fuel elements for irradiated fuel elements, using high-energy neutrons. More subtle material differences such as altering the burnup of individual elements can be identified with lower-energy neutrons provided the scattered neutron contribution to the image is limited. The research results show that neutron computed tomography can be used to inspect spent nuclear fuel assemblies for the purpose of identifying anomalies such as missing elements or substituted elements. The ability to identify anomalies in spent fuel assemblies can be used to deter diversion of material by increasing the risk of early detection as well as improve reprocessing facility operations by confirming the spent fuel configuration is as expected or allowing segregation if anomalies are detected.
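The reconstruction step can be sketched with open tools. The dissertation adapted built-in MATLAB routines; the Python analog below uses scikit-image's filtered back-projection on a synthetic phantom that stands in for MCNP transmission tallies.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

phantom = shepp_logan_phantom()                        # stand-in for an assembly cross-section
angles = np.linspace(0.0, 180.0, 180, endpoint=False)  # projection angles in degrees
sinogram = radon(phantom, theta=angles)                # simulated transmission projections
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
print(reconstruction.shape)                            # reconstructed tomographic slice
```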
Metal Transport across Biomembranes: Emerging Models for a Distinct Chemistry*
Argüello, José M.; Raimunda, Daniel; González-Guerrero, Manuel
2012-01-01
Transition metals are essential components of important biomolecules, and their homeostasis is central to many life processes. Transmembrane transporters are key elements controlling the distribution of metals in various compartments. However, due to their chemical properties, transition elements require transporters with different structural-functional characteristics from those of alkali and alkali earth ions. Emerging structural information and functional studies have revealed distinctive features of metal transport. Among these are the relevance of multifaceted events involving metal transfer among participating proteins, the importance of coordination geometry at transmembrane transport sites, and the presence of the largely irreversible steps associated with vectorial transport. Here, we discuss how these characteristics shape novel transition metal ion transport models. PMID:22389499
Metal transport across biomembranes: emerging models for a distinct chemistry.
Argüello, José M; Raimunda, Daniel; González-Guerrero, Manuel
2012-04-20
Transition metals are essential components of important biomolecules, and their homeostasis is central to many life processes. Transmembrane transporters are key elements controlling the distribution of metals in various compartments. However, due to their chemical properties, transition elements require transporters with different structural-functional characteristics from those of alkali and alkali earth ions. Emerging structural information and functional studies have revealed distinctive features of metal transport. Among these are the relevance of multifaceted events involving metal transfer among participating proteins, the importance of coordination geometry at transmembrane transport sites, and the presence of the largely irreversible steps associated with vectorial transport. Here, we discuss how these characteristics shape novel transition metal ion transport models.
Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A
2018-03-01
Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.
Deflection Analysis of the Space Shuttle External Tank Door Drive Mechanism
NASA Technical Reports Server (NTRS)
Tosto, Michael A.; Trieu, Bo C.; Evernden, Brent A.; Hope, Drew J.; Wong, Kenneth A.; Lindberg, Robert E.
2008-01-01
Upon observing an abnormal closure of the Space Shuttle's External Tank Doors (ETD), a dynamic model was created in MSC/ADAMS to conduct deflection analyses of the Door Drive Mechanism (DDM). For a similar analysis, the traditional approach would be to construct a full finite element model of the mechanism. The purpose of this paper is to describe an alternative approach that models the flexibility of the DDM using a lumped parameter approximation to capture the compliance of individual parts within the drive linkage. This approach allows for rapid construction of a dynamic model in a time-critical setting, while still retaining the appropriate equivalent stiffness of each linkage component. As a validation of these equivalent stiffnesses, finite element analysis (FEA) was used to iteratively update the model towards convergence. Following this analysis, deflections recovered from the dynamic model can be used to calculate stress and classify each component's deformation as either elastic or plastic. Based on the modeling assumptions used in this analysis and the maximum input forcing condition, two components in the DDM show a factor of safety less than or equal to 0.5. However, to accurately evaluate the induced stresses, additional mechanism rigging information would be necessary to characterize the input forcing conditions. This information would also allow for the classification of stresses as either elastic or plastic.
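The lumped-parameter idea reduces each linkage component to an equivalent spring, and compliances in a serial load path add. A minimal sketch with hypothetical stiffness values (the paper tuned its equivalent stiffnesses iteratively against FEA):

```python
def equivalent_stiffness(stiffnesses):
    """Equivalent stiffness of components loaded in series: 1/k_eq = sum(1/k_i)."""
    return 1.0 / sum(1.0 / k for k in stiffnesses)

links = [2.5e7, 8.0e6, 1.2e7]   # N/m, hypothetical component stiffnesses in one linkage
print(f"k_eq = {equivalent_stiffness(links):.3e} N/m")
```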
Mathematical model for the dc-ac inverter for the Space Shuttle
NASA Technical Reports Server (NTRS)
Berry, Frederick C.
1987-01-01
This report describes the mathematical modeling of the dc-ac inverter for the Space Shuttle. The mathematical model of the dc-ac inverter is an essential element in modeling the Space Shuttle's electrical power distribution system. That system is made up of three strings, each having a fuel cell that provides dc to the systems requiring dc, and inverters that convert the dc to ac for the elements requiring ac. The inverters are units with 2-wire structures for the main dc inputs and 2-wire structures for the ac output; when three are connected together, a 4-wire wye connection results on the ac side. The modeling is performed using a least-squares curve-fitting method. A computer program implementing the model is presented, along with graphs and tables demonstrating the model's accuracy.
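A hedged sketch of the fitting step: the report's least-squares curve fit can be mimicked with numpy's polynomial fitting on synthetic inverter voltage/current samples (the report's actual measurements and model form are not reproduced here).

```python
import numpy as np

i_load = np.linspace(1.0, 30.0, 20)   # synthetic load current samples (A)
v_out = 117.0 - 0.15 * i_load + np.random.default_rng(1).normal(0.0, 0.1, 20)

coeffs = np.polyfit(i_load, v_out, deg=2)   # least-squares fit of V(I)
model = np.poly1d(coeffs)
print("predicted V at 15 A:", round(model(15.0), 2))
```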
Parsing GML data based on integrative GML syntactic and semantic schemas database
NASA Astrophysics Data System (ADS)
Miao, Lizhi; Zhang, Shuliang; Lu, Guonian; Gao, Xiaoli; Jiao, Donglai; Gan, Jiayan
2007-06-01
This paper proposes a new method for parsing the various application schemas of Geography Markup Language (GML), in order to understand the syntax and semantics of their elements and types and thereby implement a uniform interpretation of the same GML instance data among diverse users. The proposed method generates an Integrative GML Syntactic and Semantic Schemas Database (IGSSSDB) from the GML 3.1 core schemas and the corresponding application schema. GML data are then parsed on the basis of the IGSSSDB, which holds the syntactic and semantic information, nesting information and mapping rules of the GML core and application schemas. Three kinds of relational tables are designed to store schema information when constructing the IGSSSDB: info tables for the schemas included and namespaces imported in application schemas, tables for information related to the schemas themselves, and catalog tables for the core schemas. Within these tables, we propose using regular expressions to describe the models of elements and complex types in the schemas, which keeps the models complete and readable. Based on the IGSSSDB, we design and develop a set of APIs that implement GML data parsing and can process the syntactic and semantic information of GML data from diverse fields and users. In the latter part of this paper, a test study shows that the proposed method is feasible and appropriate for parsing GML data, and it lays a good foundation for future GML data studies such as storage, indexing and querying.
Building pathway graphs from BioPAX data in R.
Benis, Nirupama; Schokker, Dirkjan; Kramer, Frank; Smits, Mari A; Suarez-Diez, Maria
2016-01-01
Biological pathways are increasingly available in the BioPAX format which uses an RDF model for data storage. One can retrieve the information in this data model in the scripting language R using the package rBiopaxParser, which converts the BioPAX format to one readable in R. It also has a function to build a regulatory network from the pathway information. Here we describe an extension of this function. The new function allows the user to build graphs of entire pathways, including regulated as well as non-regulated elements, and therefore provides a maximum of information. This function is available as part of the rBiopaxParser distribution from Bioconductor.
50 CFR 600.315 - National Standard 2-Scientific Information.
Code of Federal Regulations, 2013 CFR
2013-10-01
... from surveys or sampling programs, and models that are mathematical representations of reality... static and ideally entails developing and following a research plan with the following elements: Clear... predictions, or testing hypotheses; study design with an explicit and standardized method of collecting data...
Color pattern analysis of nymphalid butterfly wings: revision of the nymphalid groundplan.
Otaki, Joji M
2012-09-01
To better understand the developmental mechanisms of color pattern variation in butterfly wings, it is important to construct an accurate representation of pattern elements, known as the "nymphalid groundplan". However, some aspects of the current groundplan remain elusive. Here, I examined wing-wide elemental patterns of various nymphalid butterflies and confirmed that wing-wide color patterns are composed of the border, central, and basal symmetry systems. The central and basal symmetry systems can express circular patterns resembling eyespots, indicating that these systems have developmental mechanisms similar to those of the border symmetry system. The wing root band commonly occurs as a distinct symmetry system independent from the basal symmetry system. In addition, the marginal and submarginal bands are likely generated as a single system, referred to as the "marginal band system". Background spaces between two symmetry systems are sometimes light in coloration and can produce white bands, contributing significantly to color pattern diversity. When an element is enlarged with a pale central area, a visually similar (yet developmentally distinct) white band is produced. Based on the symmetric relationships of elements, I propose that both the central and border symmetry systems are comprised of "core elements" (the discal spot and the border ocelli, respectively) and a pair of "paracore elements" (the distal and proximal bands and the parafocal elements, respectively). Both core and paracore elements can be doubled, or outlined. Developmentally, this system configuration is consistent with the induction model, but not with the concentration gradient model for positional information.
Determination of origin and intended use of plutonium metal using nuclear forensic techniques.
Rim, Jung H; Kuhn, Kevin J; Tandon, Lav; Xu, Ning; Porterfield, Donivan R; Worley, Christopher G; Thomas, Mariam R; Spencer, Khalil J; Stanley, Floyd E; Lujan, Elmer J; Garduno, Katherine; Trellue, Holly R
2017-04-01
Nuclear forensics techniques, including micro-XRF, gamma spectrometry, trace elemental analysis and isotopic/chronometric characterization were used to interrogate two, potentially related plutonium metal foils. These samples were submitted for analysis with only limited production information, and a comprehensive suite of forensic analyses were performed. Resulting analytical data was paired with available reactor model and historical information to provide insight into the materials' properties, origins, and likely intended uses. Both were super-grade plutonium, containing less than 3% 240Pu, and age-dating suggested that most recent chemical purification occurred in 1948 and 1955 for the respective metals. Additional consideration of reactor modeling feedback and trace elemental observables indicate plausible U.S. reactor origin associated with the Hanford site production efforts. Based on this investigation, the most likely intended use for these plutonium foils was 239Pu fission foil targets for physics experiments, such as cross-section measurements, etc. Copyright © 2017 Elsevier B.V. All rights reserved.
Determination of origin and intended use of plutonium metal using nuclear forensic techniques
Rim, Jung H.; Kuhn, Kevin J.; Tandon, Lav; ...
2017-04-01
Nuclear forensics techniques, including micro-XRF, gamma spectrometry, trace elemental analysis and isotopic/chronometric characterization were used to interrogate two, potentially related plutonium metal foils. These samples were submitted for analysis with only limited production information, and a comprehensive suite of forensic analyses were performed. Resulting analytical data was paired with available reactor model and historical information to provide insight into the materials' properties, origins, and likely intended uses. Both were super-grade plutonium, containing less than 3% 240Pu, and age-dating suggested that most recent chemical purification occurred in 1948 and 1955 for the respective metals. Additional consideration of reactor modelling feedback and trace elemental observables indicate plausible U.S. reactor origin associated with the Hanford site production efforts. In conclusion, based on this investigation, the most likely intended use for these plutonium foils was 239Pu fission foil targets for physics experiments, such as cross-section measurements, etc.
System to provide 3D information on geological anomaly zone in deep subsea
NASA Astrophysics Data System (ADS)
Kim, W.; Kwon, O.; Kim, D.
2017-12-01
A study on building an ultra-long, deep subsea tunnel, at least 50 km long and 200 m deep, is underway in Korea. To analyze the geotechnical information required for designing and building a subsea tunnel, 2D seabed geophysical prospecting data and topographic, geologic, exploration and boring data were analyzed comprehensively; as a result, an automated method based on geostatistical analysis was developed to identify the geological structure zones under the seabed that must be considered when designing a deep, long subsea tunnel. Existing software that provides 3D visualized ground information includes Gocad, MVS, Vulcan and DIMINE. This study aims to analyze geological anomaly zones in the ultra-deep seabed and to visualize the geological investigation results, in order to develop a dedicated, user-friendly system for processing ground investigation information. In particular, the system is compatible with geophysical prospecting result files and can present them both in layer form and as 3D views. The data processed by the 3D seabed information system include (1) deep seabed topographic information, (2) geological anomaly zones, (3) geophysical prospecting results, (4) boring investigation results and (5) 3D visualization of sections along the seabed tunnel route. Each dataset has its own characteristics, and an interface is provided to allow interlocking with the other data. In each detail function, input data are displayed in a single space and each element is selectable to access further information within a project. The program creates the project when first run, and all output from the detail functions is stored by project unit. Each element representing detailed information is stored as an image file, with storage as a text file also supported. The system can also translate, zoom and rotate the model. To represent all elements in the 3D visualization platform, coordinate and time information are added to each dataset or data group to establish the conceptual model as a whole. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government (Project Number: 13 Construction Research T01).
Identifying PM2.5 and PM0.1 sources for epidemiological studies in California.
Hu, Jianlin; Zhang, Hongliang; Chen, Shuhua; Ying, Qi; Wiedinmyer, Christine; Vandenberghe, Francois; Kleeman, Michael J
2014-05-06
The University of California-Davis_Primary (UCD_P) model was applied to simultaneously track ∼ 900 source contributions to primary particulate matter (PM) in California for seven continuous years (January 1st, 2000 to December 31st, 2006). Predicted source contributions to primary PM2.5 mass, PM1.8 elemental carbon (EC), PM1.8 organic carbon (OC), PM0.1 EC, and PM0.1 OC were in general agreement with the results from previous source apportionment studies using receptor-based techniques. All sources were further subjected to a constraint check based on model performance for PM trace elemental composition. A total of 151 PM2.5 sources and 71 PM0.1 sources contained PM elements that were predicted at concentrations in general agreement with measured values at nearby monitoring sites. Significant spatial heterogeneity was predicted among the 151 PM2.5 and 71 PM0.1 source concentrations, and significantly different seasonal profiles were predicted for PM2.5 and PM0.1 in central California vs. southern California. Population-weighted concentrations of PM emitted from various sources calculated using the UCD_P model spatial information differed from the central monitor estimates by up to 77% for primary PM2.5 mass and 148% for PM2.5 EC, because the central monitor concentration is not representative of exposure for the nearby population. The results from the UCD_P model provide enhanced source apportionment information for epidemiological studies to examine the relationship between health effects and concentrations of primary PM from individual sources.
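The population-weighted quantity the abstract contrasts with central-monitor estimates is a simple weighted mean over grid cells. A sketch with hypothetical values:

```python
import numpy as np

conc = np.array([8.2, 12.5, 6.1, 15.3])       # PM2.5 from one source, per grid cell (ug/m3)
pop = np.array([12000, 85000, 3000, 40000])   # residents per grid cell

# Population-weighted concentration: sum(pop_i * C_i) / sum(pop_i)
c_weighted = np.sum(conc * pop) / np.sum(pop)
print(f"population-weighted concentration: {c_weighted:.2f} ug/m3")
```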
A model for assessing the systemic vulnerability in landslide prone areas
NASA Astrophysics Data System (ADS)
Pascale, S.; Sdao, F.; Sole, A.
2010-07-01
The objectives of spatial planning should include the definition and assessment of possible mitigation strategies regarding the effects of natural hazards on the surrounding territory. Unfortunately, however, there is often a lack of adequate tools to provide the necessary support to the local bodies responsible for land management. This paper deals with the conception, development and validation of an integrated numerical model for assessing systemic vulnerability in complex and urbanized landslide-prone areas. The proposed model considers this vulnerability not as a characteristic of a particular element at risk, but as a peculiarity of a complex territorial system in which the elements are reciprocally linked in a functional way. It is an index of the tendency of a given territorial element to suffer damage (usually of a functional kind) due to its interconnections with other elements of the same territorial system. The innovative nature of this work also lies in the formalization of a procedure based on a network of influences for an adequate assessment of such "systemic" vulnerability. In any given situation of a territory hit by a landslide event, this approach can be used to identify the element that has suffered the most functional damage, i.e., the most "critical" element, and the element that has the greatest repercussions on other elements of the system and thus a "decisive" role in managing the emergency. This model was developed within a GIS system through the following phases: 1. the topological characterization of the territorial system studied and the assessment of scenarios in terms of spatial landslide hazard, for which a statistical method based on neural networks is proposed; 2. the analysis of the direct consequences of a scenario event on the system; 3. the definition of the assessment model of systemic vulnerability in landslide-prone areas. To highlight the potential of the proposed approach, we describe a specific case study of landslide hazard in the local council area of Potenza.
NASA Astrophysics Data System (ADS)
Bognot, J. R.; Candido, C. G.; Blanco, A. C.; Montelibano, J. R. Y.
2018-05-01
Monitoring the progress of a building's construction is critical in construction management. However, measuring building construction progress is still manual, time-consuming and error-prone, and imposes a tedious process of analysis that leads to delays, additional costs and effort. The main goal of this research is to develop a methodology for building construction progress monitoring based on a 3D as-built model of the building from unmanned aerial system (UAS) images, a 4D as-planned model (with the construction schedule integrated) and GIS analysis. Monitoring was done by capturing videos of the building with a camera-equipped UAS. Still images were extracted, filtered and bundle-adjusted, and a 3D as-built model was generated using open-source photogrammetric software. The as-planned model was generated from digitized CAD drawings using GIS. The 3D as-built model was aligned with the 4D as-planned model of the building, formed by extruding the building elements and integrating the construction's planned schedule. Construction progress is visualized by color-coding the building elements in the 3D model. The developed methodology was applied to data obtained from an actual construction site. Accuracy in detecting 'built' or 'not built' building elements ranges from 82-84% with a precision of 50-72%. Quantified progress in terms of the number of building elements is 21.31% (November 2016), 26.84% (January 2017) and 44.19% (March 2017). The results can serve as input for monitoring the progress performance of construction projects and for improving the related decision-making process.
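The reported accuracy and precision follow directly from per-element 'built'/'not built' decisions. A sketch with hypothetical labels:

```python
built_truth = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # ground truth per building element (hypothetical)
built_pred  = [1, 0, 0, 1, 1, 1, 1, 0, 1, 0]   # model decision per element (hypothetical)

tp = sum(t == p == 1 for t, p in zip(built_truth, built_pred))   # true positives
fp = sum(p == 1 and t == 0 for t, p in zip(built_truth, built_pred))
correct = sum(t == p for t, p in zip(built_truth, built_pred))

print("accuracy :", correct / len(built_truth))
print("precision:", tp / (tp + fp))
```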
Discoveries far from the lamppost with matrix elements and ranking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Debnath, Dipsikha; Gainer, James S.; Matchev, Konstantin T.
2015-04-01
The prevalence of null results in searches for new physics at the LHC motivates the effort to make these searches as model-independent as possible. We describe procedures for adapting the Matrix Element Method for situations where the signal hypothesis is not known a priori. We also present general and intuitive approaches for performing analyses and presenting results, which involve the flattening of background distributions using likelihood information. The first flattening method involves ranking events by background matrix element, the second involves quantile binning with respect to likelihood (and other) variables, and the third method involves reweighting histograms by the inverse of the background distribution.
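A sketch of the second flattening approach (quantile binning): choosing bin edges at equal quantiles of a background likelihood-like variable makes the background distribution flat by construction, so a signal would appear as an excess in some bins. Numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
bkg_likelihood = rng.exponential(1.0, 100_000)   # stand-in for background matrix elements

# Bin edges at equal quantiles -> 20 equal-population ("flat") bins for background.
edges = np.quantile(bkg_likelihood, np.linspace(0.0, 1.0, 21))
counts, _ = np.histogram(bkg_likelihood, bins=edges)
print(counts)   # approximately flat by construction
```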
A Compatible Hardware/Software Reliability Prediction Model.
1981-07-22
machines. In particular, he was interested in the following problem: assume that one has a collection of connected elements computing and transmitting...software reliability prediction model is desirable, the findings about the Weibull distribution are intriguing. After collecting failure data from several...capacitor, some of the added charge carriers are collected by the capacitor. If the added charge is sufficiently large, the information stored is changed
ERIC Educational Resources Information Center
Haunberger, Sigrid
2010-01-01
This article focuses on the question of whether educational expansion leads to a new type of society, the education society. Taking into consideration the combined elements of three models of society (the post-industrial society, the knowledge society and the information society), the chances and risks of an educational society will be elicited…
The data model for social welfare in Finland.
Kärki, Jarmo; Ailio, Erja
2014-01-01
A client data model for social welfare was gradually developed in the National Project of IT in Social Services in Finland. The client data model describes the nationally uniform data structures and the relationships between the data elements needed in the production of social services. It contains the structures of social care client records, unique core components and distinct classifications. The modeling method guaranteed the coverage, integrity, flexibility and device independence of the model. The model is maintained and developed by the National Institute for Health and Welfare (THL) together with social workers and other social welfare experts. It forms the basis of electronic information management in the social services. Implementing the data model in information systems makes client data available wherever and whenever a client has to be helped.
The Green House Model of Nursing Home Care in Design and Implementation.
Cohen, Lauren W; Zimmerman, Sheryl; Reed, David; Brown, Patrick; Bowers, Barbara J; Nolet, Kimberly; Hudak, Sandra; Horn, Susan
2016-02-01
To describe the Green House (GH) model of nursing home (NH) care, and examine how GH homes vary from the model, one another, and their founding (or legacy) NH. Data include primary quantitative and qualitative data and secondary quantitative data, derived from 12 GH/legacy NH organizations between February 2012 and September 2014. This mixed methods, cross-sectional study used structured interviews to obtain information about the presence of, and variation in, GH-relevant structures and processes of care. Qualitative questions explored reasons for variation in model implementation. Interview data were analyzed using related-sample tests, and qualitative data were iteratively analyzed using a directed content approach. GH homes showed substantial variation in practices to support resident choice and decision making; neither GH nor legacy homes provided complete choice, and all GH homes excluded residents from some key decisions. GH homes were most consistent with the model and one another in elements to create a real home, such as private rooms and baths and open kitchens, and in staff-related elements, such as self-managed work teams and consistent, universal workers. Although variation in model implementation complicates evaluation, if expansion is to continue, it is essential to examine GH elements and their outcomes. © Health Research and Educational Trust.
Chowdhury, Amor; Sarjaš, Andrej
2016-01-01
The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation. PMID:27649197
Chowdhury, Amor; Sarjaš, Andrej
2016-09-15
The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation.
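A hedged sketch of the fusion step using the filterpy library's Unscented Kalman Filter. The one-dimensional gap position/velocity dynamics below are a placeholder; the paper derives its nonlinear state model from a finite element model of the actuator.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 1e-3

def fx(x, dt):
    """Placeholder constant-velocity dynamics: [position, velocity]."""
    return np.array([x[0] + dt * x[1], x[1]])

def hx(x):
    """The Hall effect sensor observes position only."""
    return np.array([x[0]])

points = MerweScaledSigmaPoints(n=2, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=dt, fx=fx, hx=hx, points=points)
ukf.x = np.array([0.0, 0.0])
ukf.R *= 1e-6    # measurement noise (disturbed Hall readings)
ukf.Q *= 1e-8    # process noise

for z in (0.0012, 0.0013, 0.0011):   # synthetic proximity measurements (m)
    ukf.predict()
    ukf.update(z)
print(ukf.x)     # fused position/velocity estimate
```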
Zhang, Bing; Jin, Rui; Huang, Jianmei; Liu, Xiaoqing; Xue, Chunmiao; Lin, Zhijian
2012-08-01
Traditional Chinese medicine (TCM) property theory is considered a key and difficult area of basic TCM theory. The complex concepts, components and characteristics of TCM properties have long puzzled researchers and urged them to develop new perspectives and approaches. From the viewpoint of cognitive science, TCM property theory is a cognitive process of storing, extracting, rebuilding and summarizing sensory information about TCMs and their effects during medical practice against diseases, under the guidance of traditional Chinese philosophical thinking. This cognitive process has particular cognitive elements and strategies. Taking into account the clinical application characteristics of TCMs, this study defines these cognitive elements. Combining research methods from modern chemistry, biology and mathematics, and building on five years of earlier work, we have built a TCM property cognition model based on three elements and applied it to drugs with pungent and hot properties as an example, in the hope of interpreting TCM properties with modern science and providing insights into the nature of medicinal properties and guidance for rational clinical prescription.
NASA Technical Reports Server (NTRS)
Odubiyi, Jide; Kocur, David; Pino, Nino; Chu, Don
1996-01-01
This report presents the results of our research on Earth-Mars Telecommunications and Information Management System (TIMS) network modeling and unattended network operations. The primary focus of our research is to investigate the feasibility of the TIMS architecture, which links the Earth-based Mars Operations Control Center, Science Data Processing Facility, Mars Network Management Center, and the Deep Space Network of antennae to the relay satellites and other communication network elements based in the Mars region. The investigation was enhanced by developing Build 3 of the TIMS network modeling and simulation model. The results of several 'what-if' scenarios are reported along with reports on upgraded antenna visibility determination software and unattended network management prototype.
NASA Astrophysics Data System (ADS)
Gilmanshin, I. R.; Kirpichnikov, A. P.
2017-09-01
As a result of studying the functioning algorithm of the early detection module for excessive losses, it is proven that the module can be modelled using absorbing Markov chains. Of particular interest is the study of the probabilistic characteristics of this algorithm, in order to identify the relationship between the reliability indicators of individual elements, or the probabilities of occurrence of certain events, and the likelihood of transmitting reliable information. The relations identified during the analysis allow thresholds to be set for the reliability characteristics of the system components.
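The quantities such an analysis needs come from standard absorbing-chain algebra: with transient-to-transient block Q and transient-to-absorbing block R, the fundamental matrix N = (I - Q)^-1 gives expected visit counts and B = NR the absorption probabilities (e.g., reliable detection versus a missed loss). A sketch with hypothetical transition probabilities:

```python
import numpy as np

Q = np.array([[0.0, 0.8],      # transient states: 'sampling' -> 'checking'
              [0.1, 0.0]])
R = np.array([[0.2, 0.0],      # absorbing states: 'loss reported', 'loss missed'
              [0.7, 0.2]])

N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix: expected visits to transient states
B = N @ R                          # absorption probabilities from each transient state
print(B[0])                        # outcome probabilities starting from 'sampling'
```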
Interoperability Matter: Levels of Data Sharing, Starting from a 3d Information Modelling
NASA Astrophysics Data System (ADS)
Tommasi, C.; Achille, C.
2017-02-01
Nowadays, adopting BIM processes in the AEC (Architecture, Engineering and Construction) industry means moving towards synergistic workflows based on informative instruments capable of realizing a virtual model of the building. The aim of this article is to address the interoperability question, approaching the subject through a theoretical part and a practical example in order to show how these notions apply in real situations. In particular, the case study analysed belongs to the Cultural Heritage field, where difficulties arise - in both the modelling and sharing phases - due to the complexity of shapes and elements. Focusing on interoperability between different software packages, the questions are: what kinds of information can be shared, and how much? And given that this process also leads to a standardization of the modelled parts, is there a risk of accuracy loss?
Modeling and measurement of angle-beam wave propagation in a scatterer-free plate
NASA Astrophysics Data System (ADS)
Dawson, Alexander J.; Michaels, Jennifer E.; Michaels, Thomas E.
2017-02-01
Wavefield imaging has been shown to be a powerful tool for improving the understanding and characterization of wave propagation and scattering in plates. The complete measurement of surface displacement over a 2-D grid provided by wavefield imaging has the potential to serve as a useful means of validating ultrasonic models. Here, a preliminary study of ultrasonic angle-beam wave propagation in a scatterer-free plate using a combination of wavefield measurements and 2-D finite element models is described. Both wavefield imaging and finite element analysis are used to study the propagation of waves at a refracted angle of 56.8° in a 6.35 mm thick aluminum plate. Wavefield imaging is performed using a laser vibrometer mounted on an XYZ scanning stage, which is programmed to move point-to-point on a rectilinear grid to acquire waveform data. The commercial finite element software package, PZFlex, which is specifically designed to handle large, complex ultrasonic problems, is used to create a 2-D cross-sectional model of the transducer and plate. For model validation, vertical surface displacements from both the wavefield measurements and the PZFlex finite element model are compared and found to be in excellent agreement. The validated PZFlex model is then used to explain the mechanism of Rayleigh wave generation by the angle-beam wedge. Since the wavefield measurements are restricted to the specimen surface, the cross-sectional PZFlex model is able to provide insights that the wavefield data cannot. This study illustrates how information obtained from ultrasonic experiments and modeling results can be combined to improve understanding of angle-beam wave generation and propagation.
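The refraction geometry in such experiments follows Snell's law, sin(theta_1)/v_1 = sin(theta_2)/v_2. The sketch below back-computes the wedge angle needed for a 56.8° refracted shear wave using typical acrylic and aluminum wave speeds; the paper's actual wedge parameters are not given in the abstract.

```python
import math

v_wedge = 2720.0      # longitudinal wave speed in an acrylic wedge (m/s), assumed
v_shear = 3100.0      # shear wave speed in aluminum (m/s), assumed
theta_refracted = math.radians(56.8)

# Snell's law: sin(incident)/v_wedge = sin(refracted)/v_shear
theta_incident = math.asin(math.sin(theta_refracted) * v_wedge / v_shear)
print(f"required wedge angle: {math.degrees(theta_incident):.1f} deg")
```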
Groves, Rachel B; Coulman, Sion A; Birchall, James C; Evans, Sam L
2013-02-01
The mechanical characteristics of skin are extremely complex and have not been satisfactorily simulated by conventional engineering models. The ability to predict human skin behaviour and to evaluate changes in the mechanical properties of the tissue would inform engineering design and would prove valuable in a diversity of disciplines, for example the pharmaceutical and cosmetic industries, which currently rely upon experiments performed in animal models. The aim of this study was to develop a predictive anisotropic, hyperelastic constitutive model of human skin and to validate this model using laboratory data. As a corollary, the mechanical characteristics of human and murine skin have been compared. A novel experimental design, using tensile tests on circular skin specimens, and an optimisation procedure were adopted for laboratory experiments to identify the material parameters of the tissue. Uniaxial tensile tests were performed along three load axes on excised murine and human skin samples, using a single set of material parameters for each skin sample. A finite element model was developed using the transversely isotropic, hyperelastic constitutive model of Weiss et al. (1996) and was embedded within a Veronda-Westmann isotropic material matrix, using three fibre families to create anisotropic behaviour. The model was able to represent the nonlinear, anisotropic behaviour of the skin well. Additionally, examination of the optimal material coefficients and the experimental data permitted quantification of the mechanical differences between human and murine skin. Differences between the skin types, most notably the extension of the skin at low load, have highlighted some of the limitations of murine skin as a biomechanical model of the human tissue. The development of accurate, predictive computational models of human tissue, such as skin, to reduce, refine or replace animal models and to inform developments in the medical, engineering and cosmetic fields, is a significant challenge but is highly desirable. Concurrent advances in computer technology and our understanding of human physiology must be utilised to produce more accurate and accessible predictive models, such as the finite element model described in this study. Copyright © 2012 Elsevier Ltd. All rights reserved.
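For reference, the isotropic matrix term named in the abstract has a compact form. A sketch evaluating the Veronda-Westmann strain energy, W = c1*(exp(c2*(I1-3)) - 1) - (c1*c2/2)*(I2-3), for an incompressible uniaxial stretch, with illustrative coefficients rather than the paper's fitted values (the anisotropic fibre-family terms are omitted):

```python
import numpy as np

def veronda_westmann(stretch, c1=0.1, c2=10.0):
    """Strain energy density for incompressible uniaxial stretch (illustrative c1, c2)."""
    lam = np.asarray(stretch, dtype=float)
    I1 = lam**2 + 2.0 / lam          # first strain invariant
    I2 = 2.0 * lam + 1.0 / lam**2    # second strain invariant
    return c1 * (np.exp(c2 * (I1 - 3.0)) - 1.0) - 0.5 * c1 * c2 * (I2 - 3.0)

print(veronda_westmann([1.0, 1.1, 1.2]))   # energy rises sharply with stretch, as in skin
```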
Use of shape-preserving interpolation methods in surface modeling
NASA Technical Reports Server (NTRS)
Fritsch, F. N.
1984-01-01
In many large-scale scientific computations, it is necessary to use surface models based on information provided at only a finite number of points (rather than determined everywhere via an analytic formula). As an example, an equation of state (EOS) table may provide values of pressure as a function of temperature and density for a particular material. These values, while known quite accurately, are typically known only on a rectangular (but generally quite nonuniform) mesh in (T,d)-space. Thus interpolation methods are necessary to completely determine the EOS surface. The most primitive EOS interpolation scheme is bilinear interpolation. This has the advantages of depending only on local information, so that changes in data remote from a mesh element have no effect on the surface over the element, and of preserving shape information, such as monotonicity. Most scientific calculations, however, require greater smoothness. Standard higher-order interpolation schemes, such as Coons patches or bicubic splines, while providing the requisite smoothness, tend to produce surfaces that are not physically reasonable. This means that the interpolant may have bumps or wiggles that are not supported by the data. The mathematical quantification of ideas such as physically reasonable and visually pleasing is examined.
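The shape-preserving idea is directly available in modern libraries: SciPy's PCHIP interpolant, which descends from the Fritsch-Carlson monotone cubic construction, honors monotonicity where an ordinary cubic spline can overshoot. The EOS-like data below are made up for illustration.

```python
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator

T = np.array([1.0, 2.0, 5.0, 6.0, 7.0])     # temperature mesh (nonuniform)
P = np.array([0.1, 0.2, 0.25, 4.0, 9.0])    # monotone pressure samples

t = np.linspace(1.0, 7.0, 200)
spline = CubicSpline(T, P)(t)               # may undershoot the data (unphysical wiggles)
pchip = PchipInterpolator(T, P)(t)          # stays monotone between data points

print("spline min:", spline.min(), "| pchip min:", pchip.min())
```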
QSAR modeling based on structure-information for properties of interest in human health.
Hall, L H; Hall, L M
2005-01-01
The development of QSAR models based on topological structure description is presented for problems in human health. These models are based on the structure-information approach to quantitative biological modeling and prediction, in contrast to the mechanism-based approach. The structure-information approach is outlined, starting with basic structure information developed from the chemical graph (connection table). Information explicit in the connection table (element identity and skeletal connections) leads to significant (implicit) structure information that is useful for establishing sound models of a wide range of properties of interest in drug design. Valence state definition leads to relationships for valence state electronegativity and atom/group molar volume. Based on these important aspects of molecules, together with skeletal branching patterns, both the electrotopological state (E-state) and molecular connectivity (chi indices) structure descriptors are developed and described. A summary of four QSAR models indicates the wide range of applicability of these structure descriptors and the predictive quality of QSAR models based on them: aqueous solubility (5535 chemically diverse compounds, 938 in external validation), percent oral absorption (%OA, 417 therapeutic drugs, 195 drugs in external validation testing), AMES mutagenicity (2963 compounds including 290 therapeutic drugs, 400 in external validation), fish toxicity (92 substituted phenols, anilines and substituted aromatics). These models are established independently of explicit three-dimensional (3-D) structure information and are directly interpretable in terms of the implicit structure information useful to the drug design process.
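Descriptors of the kind the abstract names (molecular connectivity chi indices) can be computed with RDKit, used here as a stand-in toolkit; the authors' own software is not identified in the abstract.

```python
from rdkit import Chem
from rdkit.Chem import GraphDescriptors

mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")   # aspirin, as an example molecule
print("chi0v:", GraphDescriptors.Chi0v(mol))        # valence connectivity index, order 0
print("chi1v:", GraphDescriptors.Chi1v(mol))        # valence connectivity index, order 1
```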
NASA Astrophysics Data System (ADS)
Jin, Biao; Rolle, Massimo
2016-04-01
Organic compounds are produced in vast quantities for industrial and agricultural use, as well as for human and animal healthcare [1]. These chemicals and their metabolites are frequently detected at trace levels in fresh water environments, where they undergo degradation via different reaction pathways. Compound specific stable isotope analysis (CSIA) is a valuable tool to identify such degradation pathways in different environmental systems. Recent advances in analytical techniques have promoted the fast development and implementation of multi-element CSIA. However, quantitative frameworks that evaluate multi-element stable isotope data and incorporate mechanistic information on the degradation processes [2,3] are still lacking. In this study we propose a mechanism-based modeling approach to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and thereby provides a mechanistic description of isotope fractionation occurring at different molecular positions. We validate the proposed approach with the concentration and multi-element isotope data of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model precisely captures the dual-element isotope trends characteristic of different reaction pathways and their range of variation, consistent with observed multi-element (C, N) bulk isotope fractionation. The proposed approach can also be used as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available. [1] Schwarzenbach, R.P., Egli, T., Hofstetter, T.B., von Gunten, U., Wehrli, B., 2010. Global Water Pollution and Human Health. Annu. Rev. Environ. Resour. doi:10.1146/annurev-environ-100809-125342. [2] Jin, B., Haderlein, S.B., Rolle, M., 2013. Integrated carbon and chlorine isotope modeling: Applications to chlorinated aliphatic hydrocarbons dechlorination. Environ. Sci. Technol. 47, 1443-1451. doi:10.1021/es304053h. [3] Jin, B., Rolle, M., 2014. Mechanistic approach to multi-element isotope modeling of organic contaminant degradation. Chemosphere 95, 131-139. doi:10.1016/j.chemosphere.2013.08.050.
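For orientation, the dual-element trends mentioned above are conventionally evaluated against the simpler bulk Rayleigh model, which the mechanistic, position-specific approach of the paper goes beyond. The sketch below shows only that classical reference case; the enrichment factors and initial delta values are hypothetical.

```python
# A minimal sketch of bulk dual-element (C, N) isotope evolution under the
# classical Rayleigh model, the simpler reference point for the mechanistic,
# position-specific model described above. All parameter values hypothetical.
import numpy as np

eps_C, eps_N = -7.8, -10.0       # hypothetical enrichment factors (permil)
f = np.linspace(1.0, 0.05, 50)   # remaining fraction of the compound

d13C0, d15N0 = -28.0, 0.0        # hypothetical initial delta values (permil)
d13C = d13C0 + eps_C * np.log(f)  # delta ~ delta0 + eps*ln(f) (approximation)
d15N = d15N0 + eps_N * np.log(f)

# The dual-isotope slope Lambda is a diagnostic of the reaction pathway:
lam = (d15N - d15N0) / (d13C - d13C0)
print("Lambda ~ eps_N/eps_C:", lam[1].round(3), "vs", round(eps_N / eps_C, 3))
```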
Method and apparatus for displaying information
NASA Technical Reports Server (NTRS)
Huang, Sui (Inventor); Eichler, Gabriel (Inventor); Ingber, Donald E. (Inventor)
2010-01-01
A method for displaying large amounts of information. The method includes the steps of forming a spatial layout of tiles each corresponding to a representative reference element; mapping observed elements onto the spatial layout of tiles of representative reference elements; assigning a respective value to each respective tile of the spatial layout of the representative elements; and displaying an image of the spatial layout of tiles of representative elements. Each tile includes atomic attributes of representative elements. The invention also relates to an apparatus for displaying large amounts of information. The apparatus includes a tiler forming a spatial layout of tiles, each corresponding to a representative reference element; a comparator mapping observed elements onto said spatial layout of tiles of representative reference elements; an assigner assigning a respective value to each respective tile of said spatial layout of representative reference elements; and a display displaying an image of the spatial layout of tiles of representative reference elements.
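A toy sketch of the core idea follows, with hypothetical data: observed elements are mapped onto a fixed spatial layout of representative reference tiles by nearest-reference assignment, and each tile is then assigned a value to be rendered as an image.

```python
# A minimal sketch (hypothetical data) of the patent's core idea: map observed
# elements onto a spatial layout of representative reference tiles, then
# assign each tile a value for display.
import numpy as np

rng = np.random.default_rng(0)
n_tiles, dim = 16, 5
refs = rng.normal(size=(n_tiles, dim))     # representative reference elements
layout = np.arange(n_tiles).reshape(4, 4)  # 4x4 spatial layout of tiles

observed = rng.normal(size=(100, dim))     # observed elements to display
nearest = np.argmin(
    np.linalg.norm(observed[:, None, :] - refs[None, :, :], axis=2), axis=1)

# Tile value: here, how many observed elements mapped onto each tile.
values = np.bincount(nearest, minlength=n_tiles)
print(values[layout])                      # the image to be displayed
```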
Large-scale 3D geoelectromagnetic modeling using parallel adaptive high-order finite element method
Grayver, Alexander V.; Kolev, Tzanio V.
2015-11-01
Here, we have investigated the use of the adaptive high-order finite-element method (FEM) for geoelectromagnetic modeling. Because high-order FEM is challenging from the numerical and computational points of view, most published finite-element studies in geoelectromagnetics use the lowest order formulation. Solution of the resulting large system of linear equations poses the main practical challenge. We have developed a fully parallel and distributed, robust and scalable linear solver based on the optimal block-diagonal and auxiliary space preconditioners. The solver was found to be efficient for high finite element orders, unstructured and nonconforming locally refined meshes, a wide range of frequencies, large conductivity contrasts, and numbers of degrees of freedom (DoFs). Furthermore, the presented linear solver is in essence algebraic; i.e., it acts on the matrix-vector level and thus requires no information about the discretization, boundary conditions, or physical source used, making it readily efficient for a wide range of electromagnetic modeling problems. To get accurate solutions at reduced computational cost, we have also implemented goal-oriented adaptive mesh refinement. The numerical tests indicated that if highly accurate modeling results were required, the high-order FEM in combination with the goal-oriented local mesh refinement required less computational time and DoFs than the lowest order adaptive FEM.
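The sketch below is a toy, single-process analogue of the general pattern behind such solvers (a Krylov method with a block-diagonal preconditioner), not the paper's distributed auxiliary-space solver. The matrix is a random SPD stand-in for an FEM system.

```python
# A toy analogue (not the paper's solver): conjugate gradients with a
# block-diagonal (block-Jacobi) preconditioner, the general pattern behind
# block-preconditioned FEM solves. The matrix is a random SPD stand-in.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(1)
n, nb = 120, 30                                 # problem size, block size
A = sp.random(n, n, density=0.05, random_state=1)
A = (A @ A.T + 10 * sp.eye(n)).tocsr()          # make it SPD
b = rng.normal(size=n)

# Precompute the inverse of each diagonal block.
starts = range(0, n, nb)
blocks = [np.linalg.inv(A[i:i + nb, i:i + nb].toarray()) for i in starts]

def apply_prec(r):
    # Apply the block-diagonal inverse to the residual, block by block.
    return np.concatenate([Bi @ r[i:i + nb] for Bi, i in zip(blocks, starts)])

M = LinearOperator((n, n), matvec=apply_prec)
x, info = cg(A, b, M=M)
print("converged:", info == 0, "residual:", np.linalg.norm(A @ x - b))
```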
van Meeuwen, Dorine Pd; van Walt Meijer, Quirine J; Simonse, Lianne Wl
2015-03-24
With a growing population of health care clients in the future, the organization of high-quality and cost-effective service provision becomes an increasing challenge. New online eHealth services are proposed as innovative options for the future. Yet, a major barrier to these services appears to be the lack of new business model designs. Although design efforts generally result in visual models, no such artifacts have been found in the literature on business model design. This paper investigates business model design in eHealth service practices from a design perspective. It adopts a research-by-design approach and seeks to unravel what characteristics of business models determine an online service and what the important value exchanges between health professionals and clients are. The objective of the study was to analyze the construction of care models in depth, framing the essential elements of a business model, and to design a new care model that structures these elements for the particular context of an online precare service in practice. This research employs a qualitative method of an in-depth case study in which different perspectives on constructing a care model are investigated. Data are collected by using the visual business modeling toolkit, designed to cocreate and visualize the business model. The cocreated models are transcribed and analyzed per actor perspective, transactions, and value attributes. We revealed eight new actors in the business model for providing the service. Essential actors are: the intermediary network coordinator connecting companies, the service-dedicated information technology specialists, and the service-dedicated health specialist. In the transactions for every service provision we found a certain type of contract, such as a license contract and service contracts for precare services and software products. In addition to efficiency, quality, and convenience, important value attributes appeared to be: timeliness, privacy and credibility, availability, pleasantness, and social interaction. Based on the in-depth insights from the actor perspectives, the business model for online precare services is modeled with a visual design. A new care model of the online precare service is designed and compiled of building blocks for the business model. For the construction of a care model, actors, transactions, and value attributes are essential elements. The design of a care model structures these elements in a visual way. Guided by the business modeling toolkit, the care model design artifact is visualized in the context of an online precare service. Important building blocks include the provision of an online flow of information with regular interactions to the client, which stimulates self-management of personal health, and a service-dedicated health expert, who ensures an increase in the perceived quality of the eHealth service. PMID:25831094
Actuators of 3-element unimorph deformable mirror
NASA Astrophysics Data System (ADS)
Fu, Tianyang; Ning, Yu; Du, Shaojun
2016-10-01
Various kinds of wavefront aberrations arise in optical systems because of atmospheric disturbance, device displacement and a variety of thermal effects; they distort the information carried by the transmitted beam and reduce its energy. Deformable mirrors (DMs) are designed to correct these wavefront aberrations. Bimorph DMs have become increasingly popular and applicable in adaptive optical (AO) systems, offering a simple structure, low cost and flexible design compared with traditional discrete-actuator DMs. The defocus aberration accounts for a large proportion of all wavefront aberrations, with a simpler surface and larger amplitude than others, so correcting the defocus aberration effectively is very useful for beam control and aberration adjustment in AO systems. In this study, aiming to correct the 3rd and 10th Zernike modes, we analyze the surface distributions characteristic of these defocus aberrations, design a unimorph DM model with 3-element actuators and study its structure and deformation principle theoretically, build finite element models of different electrode configurations with different ring diameters, analyze and compare the effects of different electrode configurations and fixing modes on the DM deformation capacity using the COMSOL finite element software, and compare the fitting efficiency of the DM models for the 3rd and 10th Zernike modes. We choose the inhomogeneous electrode distribution model, which gives better results, and obtain the influence function of every electrode and the voltage-PV relationship of the model. This unimorph DM is suitable for AO systems in which defocus is the dominant aberration.
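A generic sketch of the fitting step described above follows: given per-electrode influence functions (Gaussian stand-ins here, not the COMSOL-computed surfaces), find the actuator weights that best reproduce a defocus-like Zernike target in the least-squares sense.

```python
# A generic sketch of influence-function fitting: Gaussian stand-ins for the
# per-electrode influence functions, fitted to a Zernike defocus target.
import numpy as np

N = 64
y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
r2 = x**2 + y**2
pupil = r2 <= 1.0
target = (2 * r2 - 1) * pupil                   # Zernike defocus (unnormalised)

centers = [(0.0, 0.0), (0.0, 0.6), (0.0, -0.6)]  # hypothetical 3-element layout
infl = np.stack([np.exp(-((x - cx)**2 + (y - cy)**2) / 0.18) * pupil
                 for cx, cy in centers])

A = infl.reshape(3, -1)[:, pupil.ravel()].T      # columns: influence functions
w, *_ = np.linalg.lstsq(A, target[pupil], rcond=None)
fit = np.tensordot(w, infl, axes=1)
rms = np.sqrt(np.mean((fit[pupil] - target[pupil])**2))
print("weights:", w.round(3), "residual RMS:", rms.round(4))
```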
Distributed and Dynamic Storage of Working Memory Stimulus Information in Extrastriate Cortex
Sreenivasan, Kartik K.; Vytlacil, Jason; D'Esposito, Mark
2015-01-01
The predominant neurobiological model of working memory (WM) posits that stimulus information is stored via stable elevated activity within highly selective neurons. Based on this model, which we refer to as the canonical model, the storage of stimulus information is largely associated with lateral prefrontal cortex (lPFC). A growing number of studies describe results that cannot be fully explained by the canonical model, suggesting that it is in need of revision. In the present study, we directly test key elements of the canonical model. We analyzed functional MRI data collected as participants performed a task requiring WM for faces and scenes. Multivariate decoding procedures identified patterns of activity containing information about the items maintained in WM (faces, scenes, or both). While information about WM items was identified in extrastriate visual cortex (EC) and lPFC, only EC exhibited a pattern of results consistent with a sensory representation. Information in both regions persisted even in the absence of elevated activity, suggesting that elevated population activity may not represent the storage of information in WM. Additionally, we observed that WM information was distributed across EC neural populations that exhibited a broad range of selectivity for the WM items rather than restricted to highly selective EC populations. Finally, we determined that activity patterns coding for WM information were not stable, but instead varied over the course of a trial, indicating that the neural code for WM information is dynamic rather than static. Together, these findings challenge the canonical model of WM. PMID:24392897
NASA Astrophysics Data System (ADS)
Islam, Md. Mashfiqul; Chowdhury, Md. Arman; Sayeed, Md. Abu; Hossain, Elsha Al; Ahmed, Sheikh Saleh; Siddique, Ashfia
2014-09-01
Finite element analyses are conducted to model the tensile capacity of steel fiber-reinforced concrete (SFRC). For this purpose, dog-bone specimens were cast and tested under direct uniaxial tension. Two types of aggregates (brick and stone) were used to cast the SFRC and plain concrete, with a fiber volume ratio of 1.5 %. A total of eight dog-bone specimens were made and tested in a 1000-kN capacity digital universal testing machine (UTM). Strain data were gathered using the digital image correlation technique applied to high-definition images and high-speed video clips, and were then synthesized with the load data obtained from the load cell of the UTM. The tensile capacity enhancement relative to the control specimens was found to be 182-253 % for brick SFRC and 157-268 % for stone SFRC. Fibers were found to enhance the tensile capacity as well as the ductility of concrete, which helps prevent sudden brittle failure. The dog-bone specimens were modeled on the ANSYS 10.0 finite element platform and analyzed to model the tensile capacity of brick and stone SFRC. The SOLID65 element was used to model the SFRC and plain concretes by calibrating the Poisson's ratio, modulus of elasticity, tensile strength and stress-strain relationships, as well as the failure patterns and failure locations. This research provides information on the tensile capacity enhancement of SFRC made with both brick and stone, which will help the construction industry of Bangladesh introduce this engineering material into earthquake design. Finally, the finite element outputs show good agreement with the experimental tensile capacities, validating the FE modeling.
A hybrid silicon membrane spatial light modulator for optical information processing
NASA Technical Reports Server (NTRS)
Pape, D. R.; Hornbeck, L. J.
1984-01-01
A new two-dimensional, fast, analog, electrically addressable, silicon-based membrane spatial light modulator (SLM) was developed for optical information processing applications. Coherent light reflected from the mirror elements is phase modulated, producing an optical Fourier transform of an analog signal input to the device. The deformable mirror device (DMD) architecture and operating parameters related to this application are presented. A model is developed that describes the optical Fourier transform properties of the DMD.
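A toy numerical sketch of the device principle follows: a 2-D array of phase-modulating mirror elements produces, in the far field, the optical Fourier transform of the input signal. The input pattern below is hypothetical.

```python
# A toy sketch of the SLM principle: phase-modulated reflection of coherent
# light, whose far-field intensity is the optical Fourier transform of the
# (hypothetical) analog input signal.
import numpy as np

N = 128
y, x = np.mgrid[0:N, 0:N]
signal = 0.4 * np.sin(2 * np.pi * 8 * x / N)     # analog input -> phase map
field = np.exp(1j * np.pi * signal)              # phase-modulated reflection

far_field = np.fft.fftshift(np.fft.fft2(field))  # Fourier (focal) plane
intensity = np.abs(far_field) ** 2

# A sinusoidal phase input concentrates energy into diffraction orders
# placed symmetrically about the DC term at the center of the plane.
peaks = np.argsort(intensity.ravel())[-3:]
print("brightest Fourier-plane pixels:", 
      [tuple(p) for p in np.array(np.unravel_index(peaks, intensity.shape)).T])
```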
Goossen, William T F
2014-07-01
This paper presents an overview of the developmental effort in harmonizing clinical knowledge modeling using Detailed Clinical Models (DCMs), and explains how it can contribute to the preservation of Electronic Health Record (EHR) data. Clinical knowledge modeling is vital for the management and preservation of EHRs and data. Such modeling provides common data elements and terminology binding with the intention of capturing and managing clinical information over time and location, independent of technology. Any EHR data exchange without agreed clinical knowledge modeling will potentially result in loss of information. Many past attempts exist to model clinical knowledge for the benefit of semantic interoperability using standardized data representation and common terminologies. The objective of each project is similar with respect to consistent representation of clinical data, use of standardized terminologies, and an overall logical approach. However, the conceptual, logical, and technical expressions are quite different from one clinical knowledge modeling approach to another. Synergies are currently being pursued under the Clinical Information Modeling Initiative (CIMI) to create a harmonized reference model for clinical knowledge models. The goal of the CIMI is to create a reference model and formalisms based on, for instance, the DCM standard (ISO/TS 13972), among other work. A global repository of DCMs may potentially be established in the future.
ACTOG - AUTOCAD TO GIFTS TRANSLATOR
NASA Technical Reports Server (NTRS)
Jones, A.
1994-01-01
The AutoCad TO Gifts Translator program, ACTOG, was developed to facilitate quick generation of small finite element models using the CASA/Gifts finite element modeling program. ACTOG reads the geometric data of a drawing from the Data Exchange File (DXF) used in AutoCAD and other PC based drafting programs. The geometric entities recognized by ACTOG include POINTs, LINEs, ARCs, SOLIDs, 3DLINEs and 3DFACEs. From this information ACTOG creates a GIFTS SRC file which can then be read into the GIFTS preprocessor BULKM or can be modified and read into EDITM to create a finite element model. The GIFTS commands created include KPOINTs, SLINEs, CARCs, GRID3s and GRID4s. The SRC file can be used as is (using the default parameters) or edited for any number of uses. It is assumed that the user has at least a working knowledge of AutoCAD and GIFTS. ACTOG was written in Microsoft QuickBasic (Version 2.0). The program was developed for the IBM PC and has been implemented on an IBM PC compatible under DOS 3.21. ACTOG was developed in 1988.
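A schematic sketch of the translation idea follows. The real ACTOG was written in Microsoft QuickBasic, and the GIFTS SRC command text emitted below is illustrative only, not the actual GIFTS syntax; what is faithful is the DXF structure (alternating group-code/value line pairs, with LINE endpoints under codes 10/20 and 11/21) and the entity-to-command mapping (LINEs to SLINEs).

```python
# A schematic sketch of the DXF-to-GIFTS translation idea. The GIFTS SRC
# command format below is illustrative, not the real syntax.
def dxf_lines_to_gifts(dxf_text):
    rows = [r.strip() for r in dxf_text.splitlines()]
    pairs = list(zip(rows[0::2], rows[1::2]))      # (group code, value)
    commands, entity, xy = [], None, {}

    def flush():
        # Emit a command for the entity just finished, if it was a LINE.
        if entity == "LINE" and len(xy) == 4:
            commands.append("SLINE %g %g %g %g" %
                            (xy["10"], xy["20"], xy["11"], xy["21"]))

    for code, value in pairs:
        if code == "0":                            # a new entity begins
            flush()
            entity, xy = value, {}
        elif code in ("10", "20", "11", "21"):     # endpoint coordinates
            xy[code] = float(value)
    flush()
    return commands

sample = "0\nLINE\n10\n0.0\n20\n0.0\n11\n25.4\n21\n12.7\n0\nENDSEC"
print(dxf_lines_to_gifts(sample))                  # ['SLINE 0 0 25.4 12.7']
```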
Prazak, Lisa; Fujioka, Miki; Gergen, J. Peter
2010-01-01
The relatively simple combinatorial rules responsible for establishing the initial metameric expression of sloppy-paired-1 (slp1) in the Drosophila blastoderm embryo make this system an attractive model for investigating the mechanism of regulation by pair rule transcription factors. This investigation of slp1 cis-regulatory architecture identifies two distinct elements, a proximal early stripe element (PESE) and a distal early stripe element (DESE) located from −3.1 kb to −2.5 kb and from −8.1 kb to −7.1 kb upstream of the slp1 promoter, respectively, that mediate this early regulation. The proximal element expresses only even-numbered stripes and mediates repression by Even-skipped (Eve) as well as by the combination of Runt and Fushi-tarazu (Ftz). A 272 basepair sub-element of PESE retains Eve-dependent repression, but is expressed throughout the even-numbered parasegments due to the loss of repression by Runt and Ftz. In contrast, the distal element expresses both odd and even-numbered stripes and also drives inappropriate expression in the anterior half of the odd-numbered parasegments due to an inability to respond to repression by Eve. Importantly, a composite reporter gene containing both early stripe elements recapitulates pair-rule gene-dependent regulation in a manner beyond what is expected from combining their individual patterns. These results indicate interactions involving distinct cis-elements contribute to the proper integration of pair-rule regulatory information. A model fully accounting for these results proposes that metameric slp1 expression is achieved through the Runt-dependent regulation of interactions between these two pair-rule response elements and the slp1 promoter. PMID:20435028
NASA Astrophysics Data System (ADS)
Casadei, F.; Ruzzene, M.
2011-04-01
This work illustrates the possibility of extending the field of application of the Multi-Scale Finite Element Method (MsFEM) to structural mechanics problems that involve localized geometrical discontinuities like cracks or notches. The main idea is to construct finite elements with an arbitrary number of edge nodes that describe the actual geometry of the damage, with shape functions that are defined as local solutions of the differential operator of the specific problem according to the MsFEM approach. The small-scale information is then brought to the large-scale model through the coupling of the global system matrices, which are assembled using classical finite element procedures. The efficiency of the method is demonstrated through selected numerical examples that constitute classical problems of great interest to the structural health monitoring community.
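A classic 1-D illustration of the MsFEM idea follows (the paper treats 2-D damaged elements, which this sketch does not attempt): coarse shape functions solve the local operator, which for -(a(x) u')' = f makes each coarse element stiffness the harmonic average of the fast-varying coefficient.

```python
# A 1-D toy of the MsFEM idea: local solutions of -(a u')' = 0 on each coarse
# element give the element stiffness k = 1 / integral(dx / a(x)).
import numpy as np

a = lambda x: 1.0 / (1.1 + np.sin(40 * np.pi * x))  # rapidly varying coefficient

def k_elem(xl, xr, nfine=400):
    # Local fine-scale problem, evaluated by simple numerical quadrature.
    x = np.linspace(xl, xr, nfine)
    return 1.0 / ((xr - xl) * np.mean(1.0 / a(x)))

nodes = np.linspace(0.0, 1.0, 9)                    # coarse mesh
n = nodes.size
K, F = np.zeros((n, n)), np.zeros(n)
for e in range(n - 1):
    k, h = k_elem(nodes[e], nodes[e + 1]), nodes[e + 1] - nodes[e]
    K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    F[e:e + 2] += 0.5 * h                           # unit load f(x) = 1
K[0, :] = K[-1, :] = 0.0
K[0, 0] = K[-1, -1] = 1.0
F[0] = F[-1] = 0.0                                  # u(0) = u(1) = 0
u = np.linalg.solve(K, F)
print("coarse-node solution:", u.round(4))
```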
A framework for institutionalizing quality assurance.
Silimperi, Diana R; Franco, Lynne Miller; Veldhuyzen van Zanten, Tisna; MacAulay, Catherine
2002-12-01
To develop a framework to support the institutionalization of quality assurance (QA). The framework for institutionalizing QA consists of a model of eight essential elements and a 'roadmap' for the process of institutionalization. The essential elements are the building blocks required for implementing and sustaining QA activities. Core QA activities include defining, measuring and improving quality. The essential elements are grouped under three categories: the internal enabling environment (internal to the organization or system), organizing for quality, and support functions. The enabling environment contains the essential elements of leadership, policy, core values, and resources. Organizing for quality includes the structure for implementing QA. Three essential elements are primarily support functions: capacity building, communication and information, and rewarding quality. The model can be applied at the level of an organization or a system. The paper also describes the process of institutionalizing QA, starting from a state of preawareness, passing through four phases (awareness, experiential, expansion, and consolidation), and culminating in a state of maturity. The process is not linear; an organization may regress, vacillate between phases, or even remain stagnant. Some phases (e.g. awareness and experiential) may occur simultaneously. The framework has been introduced in nearly a dozen countries in Latin America and Africa. The conceptual model has been used to support strategic planning and directing Ministry of Health work plans, and also as a resource for determining the elements necessary to strengthen and sustain QA. The next step will be the development and evaluation of an assessment tool to monitor developmental progress in the institutionalization of QA.
NASA Astrophysics Data System (ADS)
George, Freya; Gaidies, Fred
2017-04-01
In comparison to our understanding of major element zoning, relatively little is known about the incorporation of trace elements into metamorphic garnet. Given their extremely slow diffusivities and sensitivity to changing mineral assemblages, the analysis of the distribution of trace elements in garnet has the potential to yield a wealth of information pertaining to interfacial attachment mechanisms during garnet crystallisation, the mobility of trace elements in both garnet and the matrix, and trace element geochronology. Due to advances in the spatial resolution and analytical precision of modern microbeam techniques, small-scale trace element variations can increasingly be documented and used to inform models of metamorphic crystallisation. Laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS), in particular, can be used to rapidly quantify a wide range of elemental masses as a series of laser rasters, producing large volumes of spatially constrained trace element data. In this study, we present LA-ICP-MS maps of trace element concentrations from numerous centrally-sectioned garnets representative of the crystal size-distribution of a single sample's population. The study sample originates from the garnet-grade Barrovian zone of the Lesser Himalayan Sequence in Sikkim, northeast India, and has been shown to have crystallised garnet within a single assemblage between 515 °C and 565 °C, with no evidence for accessory phase reaction over the duration of garnet growth. Previous models have indicated that the duration of garnet crystallisation was extremely rapid (<1 Myr), with negligible diffusional homogenisation of major divalent cations. Consequently, the trace element record likely documents the primary zonation generated during garnet growth. In spite of straightforward (i.e. concentrically-zoned) major element garnet zonation, trace element maps are characterised by significant complexity and variability. Y and the heavy rare earth elements are strongly enriched in crystal cores, where there is overprinting of the observed internal fabric, and exhibit numerous concentric annuli towards crystal rims. Conversely, the medium rare earth elements (e.g. Gd, Eu and Sm) exhibit bowl-shaped zoning from core to rim, with no annuli, and core and rim compositions of the medium rare earth elements are the same throughout the population within crystals of differing size. Cr exhibits pronounced spiral zoning, and the average Cr content increases towards garnet rims. In all cases, spirals are centered on the geometric core of the crystals. These LA-ICP-MS maps highlight the complexity of garnet growth over a single prograde event, and indicate that there is still much to be learnt from the analysis of garnet using ever-improving analytical methods. We explore the potential causes of the variations in the distribution of trace elements in garnet, and assess how these zoning patterns may be used to refine our understanding of the intricacies of garnet crystallisation and the spatial and temporal degree of trace element equilibration during metamorphism.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-28
... Information Collection: Comment Request; Emergency Homeowners' Loan Program Data Elements AGENCY: Office of... following information: Title of Proposal: Emergency Homeowners' Loan Program Data Elements. Description of... of Affected Public: Emergency Homeowners' Loan Program Data Elements. Estimation of the total numbers...
Applications of CCSDS recommendations to Integrated Ground Data Systems (IGDS)
NASA Technical Reports Server (NTRS)
Mizuta, Hiroshi; Martin, Daniel; Kato, Hatsuhiko; Ihara, Hirokazu
1993-01-01
This paper describes an application of the CCSDS Principal Network (CPN) service model to communications network elements of a postulated Integrated Ground Data System (IGDS). Functions are drawn principally from COSMICS (Cosmic Information and Control System), an integrated space control infrastructure, and the Earth Observing System Data and Information System (EOSDIS) Core System (ECS). From functional requirements, this paper derives a set of five communications network partitions which, taken together, support proposed space control infrastructures and data distribution systems. Our functional analysis indicates that the five network partitions derived in this paper should effectively interconnect the users, centers, processors, and other architectural elements of an IGDS. This paper illustrates a useful application of the CCSDS (Consultative Committee for Space Data Systems) Recommendations to ground data system development.
Fog-computing concept usage as means to enhance information and control system reliability
NASA Astrophysics Data System (ADS)
Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya
2018-05-01
This paper focuses on the reliability of information and control systems (ICS). The authors propose using elements of the fog-computing concept to enhance the reliability function. The key idea of fog computing is to shift computations to the fog layer of the network, and thus to decrease the workload of the communication environment and data processing components. In an ICS, workload can likewise be distributed among sensors, actuators and network infrastructure facilities near the sources of data. The authors simulated typical workload distribution situations for the “traditional” ICS architecture and for one using elements of the fog-computing concept. The paper contains the models, selected simulation results and conclusions about the prospects of fog computing as a means to enhance ICS reliability.
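The sketch below is a toy reliability model, not the authors' simulation: it assumes a device's failure rate grows superlinearly with its workload (a labeled assumption), so spreading the same total workload across fog-layer devices raises system reliability even when every device must survive.

```python
# A toy reliability model (not the authors' simulation). Assumption: device
# failure rate grows superlinearly with workload, lambda(w) = l0 + c*w**2,
# and the system fails if any participating device fails.
import numpy as np

l0, c, t = 1e-5, 4e-4, 1000.0         # hypothetical parameters, mission time
W = 1.0                               # total workload to be processed

def system_reliability(loads):
    lam = l0 + c * np.asarray(loads) ** 2
    return np.exp(-lam.sum() * t)     # all devices must survive for time t

print("centralized (1 node):    ", round(system_reliability([W]), 4))
print("fog, spread over 4 nodes:", round(system_reliability([W / 4] * 4), 4))
```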
NASA Astrophysics Data System (ADS)
Donato, V.; Biagini, C.; Bertini, G.; Marsugli, F.
2017-05-01
Historical Building Information Modeling (H-BIM) has been widely documented in the literature and is becoming more popular with government bodies, who are increasingly choosing to make its use mandatory in public procurements and contracts. Although the approach seems to be one of the best for managing data and driving the decision-making process, several difficulties arise due to the amount of effort required in the initial phases, when the data derived from a geometrical survey must be converted into parametric elements. Moreover, users must decide on a "level of geometrical simplification" a long time in advance, and this inevitably leads to a loss of geometrical data. From this perspective, our research describes a procedure to optimize the workflow of information for existing artefacts, in order to achieve a "lean" H-BIM. In this article, we analyse two aspects: the first relates to the level of accuracy of a digital model created from two different point clouds, obtained from a laser scanner and from images, while the second concerns the conversion of this information into parametric elements (Building Object Models, BOMs) that need to have specific characteristics. The case study we present is the "Ponte Giorgini" ("Giorgini Bridge") in Castiglione della Pescaia (Grosseto, Italy).
A new statistical model to find bedrock, a prequel to geochemical mass balance
NASA Astrophysics Data System (ADS)
Fisher, B.; Rendahl, A. K.; Aufdenkampe, A. K.; Yoo, K.
2016-12-01
We present a new statistical model to assess weathering trends in deep weathering profiles. The Weathering Trends (WT) model is presented as an extension of the geochemical mass balance model (Brimhall & Dietrich, 1987), and is available as an open-source R library on GitHub (https://github.com/AaronRendahl/WeatheringTrends). WT uses element concentration data to determine the depth to fresh bedrock by assessing the maximum extent of weathering for all elements, and it places confidence intervals on the depth to bedrock. WT models near-surface features and the shape of the weathering profile using a log transformation of the data to capture the magnitude of changes that are relevant to geochemical kinetics and thermodynamics. The WT model offers a new, enhanced opportunity to characterize and understand biogeochemical weathering in heterogeneous rock types. We apply the model to two 21-meter drill cores in the Laurels Schist bedrock in the Christina River Basin Critical Zone Observatory in the Pennsylvania Piedmont. The Laurels Schist had inconclusive weathering indicators prior to the development and application of the WT model. The model differentiated between rock variability and weathering to delineate the maximum extent of weathering at 12.3 (95% CI [9.2, 21.3]) meters in Ridge Well 1 and 7.2 (95% CI [4.3, 13.0]) meters in Interfluve Well 2. The modeled extent of weathering is decoupled from the water table at the ridge, but coincides with the water table at the interfluve. These depths were applied as the parent material for the geochemical mass balance for the Laurels Schist. We test statistical approaches to assess the variability and correlation of immobile elements to facilitate the selection of the best immobile element for use in both models. We also apply the model to other published data where the geochemical mass balance was applied, to demonstrate how the WT model provides additional information about weathering depth and weathering trends.
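The actual model is the R package linked above; the sketch below is a hypothetical Python analogue of the core idea only: estimating the depth to fresh bedrock as a breakpoint in log-transformed concentration ratios by grid-search least squares on synthetic data.

```python
# A hypothetical Python analogue of the Weathering Trends core idea (the real
# model is the R package linked above): a breakpoint fit on synthetic data.
import numpy as np

rng = np.random.default_rng(7)
depth = np.linspace(0.0, 21.0, 43)                 # m, synthetic drill core
true_D = 12.0                                      # true depth to fresh bedrock
# log mobile/immobile ratio: depleted above the breakpoint, flat below it
y = (-0.8 - 0.06 * np.maximum(true_D - depth, 0.0)
     + rng.normal(0, 0.05, depth.size))

def sse(D):
    # Piecewise-linear design: intercept plus depletion ramp above depth D.
    X = np.column_stack([np.ones_like(depth), np.maximum(D - depth, 0.0)])
    _, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    return res[0]

cands = np.linspace(2.0, 20.0, 181)
best = cands[int(np.argmin([sse(D) for D in cands]))]
print("estimated depth to fresh bedrock: %.1f m (true: %.1f m)" % (best, true_D))
```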
Aligning physical elements with persons' attitude: an approach using Rasch measurement theory
NASA Astrophysics Data System (ADS)
Camargo, F. R.; Henson, B.
2013-09-01
Affective engineering uses mathematical models to convert information about persons' attitudes towards physical elements into an ergonomic design. However, applications in the domain have in many cases not met measurement assumptions. This paper proposes a novel approach based on Rasch measurement theory to overcome the problem. The research demonstrates that if data fit the model, further variables can be added to a scale. An empirical study was designed to determine the range of compliance within which consumers could obtain an impression of a moisturizer cream when touching product containers. Persons, variables and stimulus objects were parameterised independently on a linear continuum. The results showed that a calibrated scale preserves comparability while incorporating further variables.
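A minimal sketch of the dichotomous Rasch model underlying this approach follows, with simulated persons and items: person locations (theta) and item/stimulus locations (b) sit on one linear logit continuum and are parameterised independently.

```python
# A minimal sketch of the dichotomous Rasch model: P(affirm) is a logistic
# function of the difference between person and item locations (logits).
import numpy as np

rng = np.random.default_rng(3)
theta = rng.normal(0, 1, size=200)         # person locations (logits)
b = np.array([-1.5, -0.5, 0.0, 0.8, 1.7])  # item/stimulus locations (logits)

p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
X = (rng.random(p.shape) < p).astype(int)  # simulated response matrix

# If data fit the model, item ordering is sample-free: raw item means
# (proportion affirming) decrease as the item location b increases.
print("item means:", X.mean(axis=0).round(2), " (should decrease)")
```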
NASA Technical Reports Server (NTRS)
Wong, J. T.; Andre, W. L.
1981-01-01
A recent result shows that, for a certain class of systems, the interdependency among the elements of such a system, together with the elements themselves, constitutes a mathematical structure: a partially ordered set. It is called a loop-free logic model of the system. On the basis of an intrinsic property of this mathematical structure, a characterization of system component failure in terms of maximal subsets of bad test signals of the system was obtained. As a consequence, information concerning the total number of failed components in the system was also deduced. Detailed examples are given to show how to restructure real systems containing loops into loop-free models to which the result is applicable.
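The toy below is illustrative only and is not the paper's formalism: in a loop-free model the dependency relation is a partial order, so components upstream of every bad test signal but of no good signal are natural failure candidates. All names and dependency sets are hypothetical.

```python
# An illustrative toy (not the paper's formalism) of fault isolation from
# bad test signals in a loop-free dependency structure.
deps = {                 # signal -> set of components it depends on
    "t1": {"A", "B"},
    "t2": {"B", "C"},
    "t3": {"C", "D"},
}
bad, good = {"t1", "t2"}, {"t3"}

# Upstream of all bad signals, upstream of no good signal.
candidates = set.intersection(*(deps[t] for t in bad))
if good:
    candidates -= set.union(*(deps[t] for t in good))
print("candidate failed components:", candidates)   # {'B'}
```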
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spellings, Matthew; Marson, Ryan L.
Faceted shapes, such as polyhedra, are commonly found in systems of nanoscale, colloidal, and granular particles. Many interesting physical phenomena, like crystal nucleation and growth, vacancy motion, and glassy dynamics, are challenging to model in these systems because they require detailed dynamical information at the individual particle level. Within the granular materials community the Discrete Element Method has been used extensively to model systems of anisotropic particles under gravity, with friction. We provide an implementation of this method intended for simulation of hard, faceted nanoparticles, with a conservative Weeks–Chandler–Andersen (WCA) interparticle potential, coupled to a thermodynamic ensemble. This method is a natural extension of classical molecular dynamics and enables rigorous thermodynamic calculations for faceted particles.
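For reference, the standard (spherical) WCA pair potential named above is sketched below: a Lennard-Jones core cut at its minimum and shifted so the interaction is purely repulsive and conservative. The faceted-particle coupling of the paper is not reproduced here.

```python
# The standard Weeks-Chandler-Andersen (WCA) pair potential: LJ truncated at
# r = 2^(1/6)*sigma and shifted up by epsilon, so it is purely repulsive.
import numpy as np

def wca_energy(r, epsilon=1.0, sigma=1.0):
    r = np.asarray(r, dtype=float)
    rcut = 2.0 ** (1.0 / 6.0) * sigma
    sr6 = (sigma / r) ** 6
    u = 4.0 * epsilon * (sr6**2 - sr6) + epsilon   # shifted Lennard-Jones
    return np.where(r < rcut, u, 0.0)

r = np.array([0.95, 1.0, 2.0 ** (1 / 6), 1.5])
print(wca_energy(r))   # repulsive below the cutoff, exactly zero beyond it
```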
A Review and Reappraisal of Adaptive Human-Computer Interfaces in Complex Control Systems
2006-08-01
...maneuverability measures. The cost elements were expressed as fuzzy membership functions. Figure 9 shows the flowchart of the route planner. A fuzzy navigator... and updating of the user model, which contains information about three generic stereotypes (beginner, intermediate and expert users) plus an...
Development Decision-Makers' Perspectives of Communication.
ERIC Educational Resources Information Center
Woods, John L.
Presented to a United Nations seminar on the role of information in the development of emerging nations, this document presents effective communication as the achievement of results--a change in attitude, practice, and knowledge. Elements of the SMCRE (sender-message-channel-receiver-effect) communications model are used in analyzing communication…
ERIC Educational Resources Information Center
Rockinson-Szapkiw, Amanda J.; Wendt, Jillian; Wighting, Mervyn; Nisbet, Deanna
2016-01-01
The Community of Inquiry framework has been widely supported by research to provide a model of online learning that informs the design and implementation of distance learning courses. However, the relationship between elements of the CoI framework and perceived learning warrants further examination as a predictive model for online graduate student…
Adeeb A. Rahman; Thomas J. Urbanik; Mustafa Mahamid
2006-01-01
This paper presents a model using the finite element method to study the response of a typical commercial corrugated fiberboard to an induced moisture function at one side of the fiberboard. The model predicts how moisture diffusion will permeate through the fiberboard's layers (medium and liners), providing information on moisture content at any given point...
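The paper's model is finite-element; the sketch below is a toy 1-D finite-difference analogue of the same physics, with hypothetical layer diffusivities: moisture diffuses through a liner/medium/liner stack after a moisture source is applied to one face.

```python
# A toy 1-D finite-difference analogue (the paper uses FEM): moisture
# diffusion through a liner/medium/liner stack with layer-dependent
# diffusivity. All values are hypothetical and normalized.
import numpy as np

nx = 61
x = np.linspace(0.0, 1.0, nx)                   # normalized board thickness
D = np.where((x > 0.2) & (x < 0.8), 0.5, 0.1)   # medium diffuses faster (assumed)

dx = x[1] - x[0]
dt = 0.4 * dx**2 / D.max()           # explicit-scheme stability limit
m = np.zeros(nx)
m[0] = 1.0                           # moisture function applied at one side

for _ in range(4000):                # march in normalized time
    flux = D[:-1] * np.diff(m) / dx  # interface fluxes (left-cell diffusivity)
    m[1:-1] += dt / dx * np.diff(flux)
    m[0], m[-1] = 1.0, m[-2]         # fixed wet face; zero-gradient far face
print("moisture profile:", m[::10].round(3))
```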
Cai, Lei; Chen, Tianlu; Yang, Jinglei; Zhou, Kejun; Yan, Xiaomei; Chen, Wenzhong; Sun, Liya; Li, Linlin; Qin, Shengying; Wang, Peng; Yang, Ping; Cui, Donghong; Burmeister, Margit; He, Lin; Jia, Wei; Wan, Chunling
2015-10-12
Little is known about the trace element profile differences between Schizophrenia patients and healthy controls; previous studies of the association of certain elements with Schizophrenia have obtained conflicting results. To identify these differences in the Han Chinese population, inductively coupled plasma-mass spectrometry was used to quantify the levels of 35 elements in the sera of 111 Schizophrenia patients and 110 healthy participants, split into a training group (61 cases/61 controls) and a test group comprising the remaining participants. An orthogonal projections to latent structures model constructed from the training group (R²Y = 0.465, Q²cum = 0.343) had a sensitivity of 76.0% and a specificity of 71.4% in the test group. Single-element analysis indicated that the concentrations of cesium, zinc, and selenium were significantly reduced in patients with Schizophrenia in both the training and test groups. A meta-analysis including 522 cases and 360 controls supported that zinc was significantly associated with Schizophrenia (standardized mean difference [SMD], -0.81; 95% confidence interval [CI], -1.46 to -0.16, P = 0.01) in the random-effect model. Information theory analysis indicated that zinc could play an independent role in Schizophrenia. These results suggest clear element profile differences between patients with Schizophrenia and healthy controls, and the reduced zinc level is confirmed in Schizophrenia patients.
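The quoted pooled SMD comes from a random-effects model. The sketch below shows the kind of computation involved, DerSimonian-Laird pooling, with hypothetical per-study SMDs and variances, not the study's actual inputs.

```python
# A sketch of random-effects pooling of standardized mean differences via
# DerSimonian-Laird. The per-study values below are hypothetical placeholders.
import numpy as np

y = np.array([-0.55, -1.20, -0.35, -1.05])   # per-study SMDs (hypothetical)
v = np.array([0.04, 0.09, 0.05, 0.08])       # their variances (hypothetical)

w = 1 / v                                    # fixed-effect weights
q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)   # heterogeneity statistic
tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

w_star = 1 / (v + tau2)                      # random-effects weights
smd = np.sum(w_star * y) / w_star.sum()
se = np.sqrt(1 / w_star.sum())
print(f"pooled SMD = {smd:.2f}, "
      f"95% CI [{smd - 1.96 * se:.2f}, {smd + 1.96 * se:.2f}]")
```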
Convergence in France facing Big Data era and Exascale challenges for Climate Sciences
NASA Astrophysics Data System (ADS)
Denvil, Sébastien; Dufresne, Jean-Louis; Salas, David; Meurdesoif, Yann; Valcke, Sophie; Caubel, Arnaud; Foujols, Marie-Alice; Servonnat, Jérôme; Sénési, Stéphane; Derouillat, Julien; Voury, Pascal
2014-05-01
The presentation will introduce a French national project, CONVERGENCE, which has been funded for four years. This project tackles the big data and computational challenges faced by the climate modelling community in the HPC context. Model simulations are central to the study of complex mechanisms and feedbacks in the climate system and provide estimates of future and past climate changes. Recent trends in climate modelling are to add more physical components to the modelled system, to increase the resolution of each individual component, and to make more systematic use of large suites of simulations to address many scientific questions. Climate simulations may therefore differ in their initial state, parameter values, representation of physical processes, spatial resolution, model complexity, degree of realism, or degree of idealisation. In addition, there is a strong need for evaluating, improving and monitoring the performance of climate models using a large ensemble of diagnostics, and for better integration of model outputs and observational data. High performance computing is currently reaching the exascale and has the potential to produce this exponential increase in the size and number of simulations. However, post-processing, analysis, and exploration of the generated data have stalled, and there is a strong need for new tools to cope with the growing size and complexity of the underlying simulations and datasets. Exascale simulations require new scalable software tools to generate, manage and mine those simulations and data, to extract the relevant information, and to take the correct decisions. The primary purpose of this project is to develop a platform capable of running large ensembles of simulations with a suite of models, handling the complex and voluminous datasets generated, and facilitating the evaluation and validation of the models and the use of higher-resolution models. We propose to gather interdisciplinary skills to design, using a component-based approach, a specific programming environment for scalable scientific simulations and analytics, integrating new and efficient ways of deploying and analysing the applications on High Performance Computing (HPC) systems. CONVERGENCE, gathering HPC and informatics expertise that cuts across the individual partners and the broader HPC community, will allow the national climate community to leverage information technology (IT) innovations to address its specific needs. Our methodology consists in developing an ensemble of generic elements needed to run the French climate models with different grids and different resolutions, ensuring efficient and reliable execution of these models, managing large volumes and numbers of data, and allowing analysis of the results and precise evaluation of the models. These elements include data structure definition and input-output (IO), code coupling and interpolation, as well as runtime and pre/post-processing environments. A common data and metadata structure will allow consistent information to be transferred between the various elements. All these generic elements will be open source and publicly available. The IPSL-CM and CNRM-CM climate models will make use of these elements, which will constitute a national platform for climate modelling. This platform will be used, in its entirety, to optimise and tune the next version of the IPSL-CM model and to develop a global coupled climate model with regional grid refinement.
It will also be used, at least partially, to run ensembles of the CNRM-CM model at relatively high resolution and to run a very-high-resolution prototype of this model. The climate models we have developed are already involved in many international projects. For instance, we participate in the CMIP (Coupled Model Intercomparison Project), which is very demanding but has high visibility: its results are widely used and are in particular synthesised in the IPCC (Intergovernmental Panel on Climate Change) assessment reports. The CONVERGENCE project will constitute an invaluable step for the French climate community in preparing for and contributing to the next phase of the CMIP project.
Multicriteria decision model for retrofitting existing buildings
NASA Astrophysics Data System (ADS)
Bostenaru Dan, B.
2003-04-01
In this paper a model to decide which buildings in an urban area should be retrofitted is presented. The model has been positioned among existing ones by choosing the decision rule, criterion weighting and decision support system types most suitable for the spatial problem of reducing earthquake risk in urban areas, considering existing spatial multiattributive and multiobjective decision methods and especially collaborative issues. Due to the participative character of the group decision problem "retrofitting existing buildings", the decision-making model is based on interactivity. Buildings have been modeled following the criteria of spatial decision support systems. This includes identifying the corresponding spatial elements of buildings according to the information needs of actors from different spheres, such as architects, construction engineers and economists. The decision model aims to facilitate collaboration between these actors. The way of setting priorities interactively will be shown by detailing the two phases, judgemental and computational: in this case site analysis, collection and evaluation of the unmodified data, and conversion of survey data to information with computational methods using additional expert support. Buildings have been divided into spatial elements which are characteristic for the survey, present typical damage in case of an earthquake, and are decisive for better seismic behaviour in case of retrofitting. The paper describes the architectural and engineering characteristics as well as the structural damage for constructions of different building ages, on the example of building types in Bucharest, Romania, in comprehensible and interdependent charts, based on field observation, reports from the 1977 earthquake, and detailed studies made by the author together with a local engineer for the EERI Web Housing Encyclopedia. On this basis, criteria for setting priorities flow into the expert information contained in the system.
A physically based catchment partitioning method for hydrological analysis
NASA Astrophysics Data System (ADS)
Menduni, Giovanni; Riboni, Vittoria
2000-07-01
We propose a partitioning method for the topographic surface, which is particularly suitable for hydrological distributed modelling and shallow-landslide distributed modelling. The model provides variable mesh size and appears to be a natural evolution of contour-based digital terrain models. The proposed method allows the drainage network to be derived from the contour lines. The single channels are calculated via a search for the steepest downslope lines. Then, for each network node, the contributing area is determined by means of a search for both steepest upslope and downslope lines. This leads to the basin being partitioned into physically based finite elements delimited by irregular polygons. In particular, the distributed computation of local geomorphological parameters (i.e. aspect, average slope and elevation, main stream length, concentration time, etc.) can be performed easily for each single element. The contributing area system, together with the information on the distribution of geomorphological parameters provide a useful tool for distributed hydrological modelling and simulation of environmental processes such as erosion, sediment transport and shallow landslides.
On standardization of basic datasets of electronic medical records in traditional Chinese medicine.
Zhang, Hong; Ni, Wandong; Li, Jing; Jiang, Youlin; Liu, Kunjing; Ma, Zhaohui
2017-12-24
Standardization of electronic medical records, so as to enable resource sharing and information exchange among medical institutions, has become inevitable in view of the ever-increasing volume of medical information. The current research is an effort towards the standardization of the basic dataset of electronic medical records in traditional Chinese medicine. In this work, an outpatient clinical information model and an inpatient clinical information model are created to adequately depict the diagnosis processes and treatment procedures of traditional Chinese medicine. To be backward compatible with the existing dataset standard created for western medicine, the new standard must be a superset of the existing standard. Thus, the two models are checked against the existing standard in conjunction with 170,000 medical record cases. If a case cannot be covered by the existing standard due to the particularity of Chinese medicine, then either an existing data element is expanded with Chinese medicine content or a new data element is created. Dataset subsets are also created to group and record special Chinese medicine diagnoses and treatments such as acupuncture. The outcome of this research is a proposal for standardized traditional Chinese medicine medical record datasets. The proposal has been verified successfully in three medical institutions with hundreds of thousands of medical records. A new dataset standard for traditional Chinese medicine is proposed in this paper. The proposed standard, covering traditional Chinese medicine as well as western medicine, is expected to be approved soon by the authority. Widespread adoption of this proposal will enable traditional Chinese medicine hospitals and institutions to easily exchange information and share resources. Copyright © 2017. Published by Elsevier B.V.
Kinetic Modeling using BioPAX ontology
Ruebenacker, Oliver; Moraru, Ion. I.; Schaff, James C.; Blinov, Michael L.
2010-01-01
Thousands of biochemical interactions are available for download from curated databases such as Reactome, the Pathway Interaction Database and other sources in the Biological Pathway Exchange (BioPAX) format. However, the BioPAX ontology does not encode the information necessary for kinetic modeling and simulation. The current standard for kinetic modeling is the Systems Biology Markup Language (SBML), but only a small number of models are available in SBML format in public repositories. Additionally, reusing and merging SBML models presents a significant challenge, because often each element has a value only in the context of the given model, and information encoding biological meaning is absent. We describe a software system that enables a variety of operations facilitating the use of BioPAX data to create kinetic models that can be visualized, edited, and simulated using the Virtual Cell (VCell), including improved conversion to SBML (for use with other simulation tools that support this format). PMID:20862270
Solernou, Albert
2018-01-01
Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package. PMID:29570700
Seismic behavior of an Italian Renaissance Sanctuary: Damage assessment by numerical modelling
NASA Astrophysics Data System (ADS)
Clementi, Francesco; Nespeca, Andrea; Lenci, Stefano
2016-12-01
The paper deals with modelling and analysis of architectural heritage through the discussion of an illustrative case study: the Medieval Sanctuary of Sant'Agostino (Offida, Italy). Using the finite element technique, a 3D numerical model of the sanctuary is built and then used to identify the main sources of the damage. The work shows that advanced numerical analyses can offer significant information for understanding the causes of existing damage and, more generally, the seismic vulnerability.
Integrated System Health Management (ISHM) for Test Stand and J-2X Engine: Core Implementation
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Schmalzel, John L.; Aguilar, Robert; Shwabacher, Mark; Morris, Jon
2008-01-01
ISHM capability enables a system to detect anomalies, determine causes and effects, predict future anomalies, and provides an integrated awareness of the health of the system to users (operators, customers, management, etc.). NASA Stennis Space Center, NASA Ames Research Center, and Pratt & Whitney Rocketdyne have implemented a core ISHM capability that encompasses the A1 Test Stand and the J-2X Engine. The implementation incorporates all aspects of ISHM; from anomaly detection (e.g. leaks) to root-cause-analysis based on failure mode and effects analysis (FMEA), to a user interface for an integrated visualization of the health of the system (Test Stand and Engine). The implementation provides a low functional capability level (FCL) in that it is populated with few algorithms and approaches for anomaly detection, and root-cause trees from a limited FMEA effort. However, it is a demonstration of a credible ISHM capability, and it is inherently designed for continuous and systematic augmentation of the capability. The ISHM capability is grounded on an integrating software environment used to create an ISHM model of the system. The ISHM model follows an object-oriented approach: includes all elements of the system (from schematics) and provides for compartmentalized storage of information associated with each element. For instance, a sensor object contains a transducer electronic data sheet (TEDS) with information that might be used by algorithms and approaches for anomaly detection, diagnostics, etc. Similarly, a component, such as a tank, contains a Component Electronic Data Sheet (CEDS). Each element also includes a Health Electronic Data Sheet (HEDS) that contains health-related information such as anomalies and health state. Some practical aspects of the implementation include: (1) near real-time data flow from the test stand data acquisition system through the ISHM model, for near real-time detection of anomalies and diagnostics, (2) insertion of the J-2X predictive model providing predicted sensor values for comparison with measured values and use in anomaly detection and diagnostics, and (3) insertion of third-party anomaly detection algorithms into the integrated ISHM model.
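A schematic sketch of the object-oriented model described above follows. The class and field names are illustrative, not the actual implementation: each system element carries its electronic data sheets, including a Health Electronic Data Sheet (HEDS) recording anomalies and health state.

```python
# A schematic sketch (names illustrative, not the actual implementation) of
# the object-oriented ISHM model: elements carry data sheets, including a
# Health Electronic Data Sheet (HEDS) with anomalies and health state.
from dataclasses import dataclass, field

@dataclass
class HEDS:
    health_state: str = "nominal"
    anomalies: list = field(default_factory=list)

@dataclass
class Sensor:
    name: str
    teds: dict                      # Transducer Electronic Data Sheet
    heds: HEDS = field(default_factory=HEDS)

    def check(self, value):
        # A minimal anomaly-detection rule: flag out-of-range measurements.
        lo, hi = self.teds["range"]
        if not lo <= value <= hi:
            self.heds.health_state = "anomalous"
            self.heds.anomalies.append(
                f"{self.name}: {value} outside [{lo}, {hi}]")

p = Sensor("LOX-tank-pressure", teds={"units": "psi", "range": (0, 800)})
p.check(923.0)
print(p.heds)
```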
Ihekwaba, Adaoha E C; Mura, Ivan; Walshaw, John; Peck, Michael W; Barker, Gary C
2016-11-01
Clostridium botulinum produces botulinum neurotoxins (BoNTs), highly potent substances responsible for botulism. Currently, mathematical models of C. botulinum growth and toxigenesis are largely aimed at risk assessment and do not include explicit genetic information beyond group level, but integrate many component processes, such as signalling, membrane permeability and metabolic activity. In this paper we present a scheme for modelling neurotoxin production in C. botulinum Group I type A1, based on the integration of diverse information coming from experimental results available in the literature. Experiments show that production of BoNTs depends on the growth phase and is under the control of positive and negative regulatory elements at the intracellular level. Toxins are released as large protein complexes and are associated with non-toxic components. Here, we systematically review and integrate those regulatory elements previously described in the literature for C. botulinum Group I type A1 into a population dynamics model, to build the very first computational model of toxin production at the molecular level. We conduct a validation of our model against several items of published experimental data for different wild type and mutant strains of C. botulinum Group I type A1. The result of this process underscores the potential of mathematical modelling at the cellular level as a means of creating opportunities for developing new strategies to prevent botulism, and could potentially contribute to improved methods for the production of the toxin used for therapeutics.
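As a rough illustration of this style of modelling (growth-phase-dependent toxin synthesis under a regulatory element), the toy ODE system below couples logistic population growth to a regulator that accumulates as growth slows and gates toxin production. The equations and all rate constants are invented for illustration and are not the authors' model.

```python
import numpy as np

def simulate(T=48.0, dt=0.01, r=0.5, Kcap=1e9, k_reg=0.2, k_tox=1e-7, k_deg=0.05):
    n_steps = int(T / dt)
    N, R, B = 1e3, 0.0, 0.0          # cells/mL, regulator level, toxin (a.u.)
    out = np.zeros((n_steps, 3))
    for i in range(n_steps):
        dN = r * N * (1 - N / Kcap)               # logistic growth
        dR = k_reg * (N / Kcap) - k_deg * R       # regulator rises as growth saturates
        dB = k_tox * R * N                        # toxin synthesis gated by regulator
        N += dN * dt; R += dR * dt; B += dB * dt  # explicit Euler step
        out[i] = (N, R, B)
    return out

traj = simulate()    # toxin accumulates mainly after the exponential phase
```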
NASA Technical Reports Server (NTRS)
Rosenchein, Stanley J.; Burns, J. Brian; Chapman, David; Kaelbling, Leslie P.; Kahn, Philip; Nishihara, H. Keith; Turk, Matthew
1993-01-01
This report is concerned with agents that act to gain information. In previous work, we developed agent models combining qualitative modeling with real-time control. That work, however, focused primarily on actions that affect physical states of the environment. The current study extends that work by explicitly considering problems of active information-gathering and by exploring specialized aspects of information-gathering in computational perception, learning, and language. In our theoretical investigations, we analyzed agents into their perceptual and action components and identified these with elements of a state-machine model of control. The mathematical properties of each were developed in isolation, and interactions were then studied. We considered the complexity dimension and the uncertainty dimension and related these to intelligent-agent design issues. We also explored active information gathering in visual processing. Working within the active vision paradigm, we developed a concept of 'minimal meaningful measurements' suitable for demand-driven vision. We then developed and tested an architecture for ongoing recognition and interpretation of visual information. In the area of information gathering through learning, we explored techniques for coping with combinatorial complexity. We also explored information gathering through explicit linguistic action by considering the nature of conversational rules, coordination, and situated communication behavior.
Dissecting children's observational learning of complex actions through selective video displays.
Flynn, Emma; Whiten, Andrew
2013-10-01
Children can learn how to use complex objects by watching others, yet the relative importance of different elements they may observe, such as the interactions of the individual parts of the apparatus, a model's movements, and desirable outcomes, remains unclear. In total, 140 3-year-olds and 140 5-year-olds participated in a study where they observed a video showing tools being used to extract a reward item from a complex puzzle box. Conditions varied according to the elements that could be seen in the video: (a) the whole display, including the model's hands, the tools, and the box; (b) the tools and the box but not the model's hands; (c) the model's hands and the tools but not the box; (d) only the end state with the box opened; and (e) no demonstration. Children's later attempts at the task were coded to establish whether they imitated the hierarchically organized sequence of the model's actions, the action details, and/or the outcome. Children's successful retrieval of the reward from the box and the replication of hierarchical sequence information were reduced in all but the whole display condition. Only once children had attempted the task and witnessed a second demonstration did the display focused on the tools and box prove to be better for hierarchical sequence information than the display focused on the tools and hands only. Copyright © 2013 Elsevier Inc. All rights reserved.
Analysis and test of a 16-foot radial rib reflector developmental model
NASA Technical Reports Server (NTRS)
Birchenough, Shawn A.
1989-01-01
Analytical and experimental modal tests were performed to determine the vibrational characteristics of a 16-foot diameter radial rib reflector model. Single rib analyses and experimental tests provided preliminary information relating to the reflector. A finite element model predicted mode shapes and frequencies of the reflector. The analyses correlated well with the experimental tests, verifying the modeling method used. The results indicate that five related, characteristic mode shapes form a group. The frequencies of the modes are determined by the relative phase of the radial ribs.
ERIC Educational Resources Information Center
Connecticut Business and Industry Association, Hartford.
Conducting a survey of manpower training needs of business and industry in Connecticut and identifying elements of a vocational-career information delivery system were the two major focuses of the study described in this report. Content is presented in three chapters. Chapter 1 reviews and analyzes the manpower training needs survey and results.…
Bridging data models and terminologies to support adverse drug event reporting using EHR data.
Declerck, G; Hussain, S; Daniel, C; Yuksel, M; Laleci, G B; Twagirumukiza, M; Jaulent, M-C
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". The SALUS project aims at building an interoperability platform and a dedicated toolkit to enable secondary use of electronic health record (EHR) data for post-marketing drug surveillance. An important component of this toolkit is a drug-related adverse event (AE) reporting system designed to facilitate and accelerate the reporting process using automatic prepopulation mechanisms. Our objective is to demonstrate the SALUS approach for establishing syntactic and semantic interoperability for AE reporting. Standard (e.g. HL7 CDA-CCD) and proprietary EHR data models are mapped to the E2B(R2) data model via the SALUS Common Information Model. Terminology mapping and terminology reasoning services are designed to ensure the automatic conversion of source EHR terminologies (e.g. ICD-9-CM, ICD-10, LOINC or SNOMED-CT) to the target terminology MedDRA, which is expected in AE reporting forms. A validated set of terminology mappings is used to ensure the reliability of the reasoning mechanisms. The percentage of data elements of a standard E2B report that can be completed automatically has been estimated for two pilot sites. In the best scenario (i.e. when the available fields in the EHR have actually been filled), only 36% (pilot site 1) and 38% (pilot site 2) of E2B data elements remain to be filled manually. In addition, most of these data elements need not be filled in every report. The SALUS platform's interoperability solutions enable partial automation of the AE reporting process, which could contribute to improving current spontaneous reporting practices and to reducing under-reporting, currently one of the major obstacles in the acquisition of pharmacovigilance data.
Using Cluster Analysis and ICP-MS to Identify Groups of Ecstasy Tablets in Sao Paulo State, Brazil.
Maione, Camila; de Oliveira Souza, Vanessa Cristina; Togni, Loraine Rezende; da Costa, José Luiz; Campiglia, Andres Dobal; Barbosa, Fernando; Barbosa, Rommel Melgaço
2017-11-01
The variations found in the elemental composition of ecstasy samples result in spectral profiles with useful information for data analysis, and cluster analysis of these profiles can help uncover different categories of the drug. We provide a cluster analysis of ecstasy tablets based on their elemental composition. Twenty-five elements were determined by ICP-MS in tablets apprehended by Sao Paulo's State Police, Brazil. We employ the K-means clustering algorithm along with a C4.5 decision tree to help us interpret the clustering results. We found that the data are best described by two clusters, which may correspond to the approximate number of sources of the drug supplying the cities where the seizures occurred. The C4.5 model was capable of differentiating the ecstasy samples from the two clusters with high prediction accuracy under leave-one-out cross-validation. The model used only the Nd, Ni, and Pb concentration values in the classification of the samples. © 2017 American Academy of Forensic Sciences.
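The analysis pipeline described (unsupervised K-means clustering, then a decision tree to explain cluster membership) can be sketched with scikit-learn as below. The element concentrations are synthetic placeholders, and DecisionTreeClassifier is used as an approximate stand-in for C4.5, which it resembles but does not reproduce exactly.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(42)
# synthetic stand-in for ICP-MS concentrations of 25 elements in 100 tablets
X = np.vstack([rng.normal(0.0, 1.0, (50, 25)), rng.normal(2.0, 1.0, (50, 25))])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)  # interpretable surrogate
acc = cross_val_score(tree, X, labels, cv=LeaveOneOut()).mean()
tree.fit(X, labels)   # inspect tree.feature_importances_ to see which elements matter
print(f"leave-one-out accuracy: {acc:.2f}")
```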
Christensen, Noel C.; Emery, James D.; Smith, Maurice L.
1988-04-05
A system converts from the boundary representation of an object to the constructive solid geometry representation thereof. The system converts the boundary representation of the object into elemental atomic geometrical units or I-bodies which are in the shape of stock primitives or regularized intersections of stock primitives. These elemental atomic geometrical units are then represented in symbolic form. The symbolic representations of the elemental atomic geometrical units are then assembled heuristically to form a constructive solid geometry representation of the object usable for manufacturing thereof. Artificial intelligence is used to determine the best constructive solid geometry representation from the boundary representation of the object. Heuristic criteria are adapted to the manufacturing environment for which the device is to be utilized. The surface finish, tolerance, and other information associated with each surface of the boundary representation of the object are mapped onto the constructive solid geometry representation of the object to produce an enhanced solid geometry representation, particularly useful for computer-aided manufacture of the object.
NASA Technical Reports Server (NTRS)
Bertelrud, Arild; Anders, J. B. (Technical Monitor)
2002-01-01
A 2-D high-lift system experiment was conducted in August of 1996 in the Low Turbulence Pressure Tunnel at NASA Langley Research Center, Hampton, VA. The purpose of the experiment was to obtain transition measurements on a three element high-lift system for CFD code validation studies. A transition database has been created using the data from this experiment. The present report contains the analysis of the surface hot film data in terms of the transition locations on the three elements. It also includes relevant information regarding the pressure loads and distributions and the wakes behind the model to aid in the interpretation of the transition data. For some of the configurations the current pressure data has been compared with previous wind tunnel entries of the same model. The methodology used to determine the regions of transitional flow is outlined and each configuration tested has been analyzed. A discussion of interference effects, repeatability, and three-dimensional effects on the data is included.
NASA Astrophysics Data System (ADS)
Makarov, M.; Shchanikov, S.; Trantina, N.
2017-01-01
We have investigated the properties of nanoscale objects that are most significant for their future application, modelling these objects both as free-standing physical elements outside the structure of an engineering system designed for their integration and as parts of a system operating under the influence of the external environment. For the empirical research suggested within the scope of this work, we have chosen a nanoscale electronic element intended for use in designing information processing systems with a parallel architecture: the memristor. The target function of the research was to maximize the fault-tolerance index of a memristor-based system when subjected to all possible impacts of internal destabilizing factors and the external environment. The results have enabled us to identify and classify the factors that determine the fault-tolerance index of a hardware implementation of a computing system based on the nanoscale electronic element base.
Geometric k-nearest neighbor estimation of entropy and mutual information
NASA Astrophysics Data System (ADS)
Lord, Warren M.; Sun, Jie; Bollt, Erik M.
2018-03-01
Nonparametric estimation of mutual information is used in a wide range of scientific problems to quantify dependence between variables. The k-nearest neighbor (knn) methods are consistent, and therefore expected to work well for a large sample size. These methods use geometrically regular local volume elements. This practice allows maximum localization of the volume elements, but can also induce a bias due to a poor description of the local geometry of the underlying probability measure. We introduce a new class of knn estimators that we call geometric knn estimators (g-knn), which use more complex local volume elements to better model the local geometry of the probability measures. As an example of this class of estimators, we develop a g-knn estimator of entropy and mutual information based on elliptical volume elements, capturing the local stretching and compression common to a wide range of dynamical system attractors. A series of numerical examples in which the thickness of the underlying distribution and the sample sizes are varied suggest that local geometry is a source of problems for knn methods such as the Kraskov-Stögbauer-Grassberger estimator when local geometric effects cannot be removed by global preprocessing of the data. The g-knn method performs well despite the manipulation of the local geometry. In addition, the examples suggest that the g-knn estimators can be of particular relevance to applications in which the system is large, but the data size is limited.
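For orientation, the classical (non-geometric) knn entropy estimator that the g-knn approach generalizes, the Kozachenko-Leonenko form with geometrically regular max-norm balls, fits in a few lines; the g-knn estimator replaces these balls with locally fitted ellipsoids, which is not shown here.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko entropy estimate (nats), Chebyshev (max-norm) balls."""
    n, d = x.shape
    # distance from each point to its k-th nearest neighbour (excluding itself)
    r = cKDTree(x).query(x, k=k + 1, p=np.inf)[0][:, -1]
    return digamma(n) - digamma(k) + d * np.mean(np.log(2.0 * r))

rng = np.random.default_rng(1)
sample = rng.normal(size=(2000, 2))
print(kl_entropy(sample))   # true value for a 2-D standard Gaussian: ln(2*pi*e) ~ 2.84
```

On locally stretched (anisotropic) distributions, the regular balls above are exactly where the bias discussed in the abstract enters.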
The development and evaluation of a nursing information system for caring clinical in-patient.
Fang, Yu-Wen; Li, Chih-Ping; Wang, Mei-Hua
2015-01-01
The research aimed to develop a nursing information system to simplify the admission procedure for the care of clinical in-patients and to enhance the efficiency of medical information documentation. By correctly delivering patients' health records and providing continuous care, patient safety and care quality can be effectively improved. The study applied the Spiral Model to develop the system with a nursing information team, using strategies of data collection, working-environment observation, use-case modeling, and Joint Application Design (JAD) conferences to complete the system requirement analysis and design. The Admission Care Management Information System (ACMIS) mainly included: (1) an admission nursing management information system; (2) an inter-shift meeting information management system; (3) linkage to the drug management system and the physical examination record system. The evaluation framework contained qualitative and quantitative components that provided both formative and summative elements. System evaluation applied an information success model, with a questionnaire developed to assess nurses' acceptance and satisfaction. The questionnaire results showed that user satisfaction, perceived self-involvement, age, and information quality were positively related to personal and organizational effectiveness. According to these results, the Admission Care Management Information System was practical for simplifying clinical working procedures and effective in communicating and documenting admission medical information.
Determination of ankle external fixation stiffness by expedited interactive finite element analysis.
Nielsen, Jonathan K; Saltzman, Charles L; Brown, Thomas D
2005-11-01
Interactive finite element analysis holds the potential to quickly and accurately determine the mechanical stiffness of alternative external fixator frame configurations. Using as an example Ilizarov distraction of the ankle, a finite element model and graphical user interface were developed that provided rapid, construct-specific information on fixation rigidity. After input of specific construct variables, the finite element software determined the resulting tibial displacement for a given configuration, typically in 15 s. The formulation was employed to investigate constructs used to treat end-stage arthritis, both in a parametric series and for five specific clinical distraction cases. Parametric testing of 15 individual variables revealed that tibial half-pins were much more effective than transfixion wires in limiting axial tibial displacement. Factors most strongly contributing to stiffening the construct included placing the tibia closer to the fixator rings and mounting the pins to the rings at the circumferential location nearest the bone. Benchtop mechanical validation results differed inappreciably from the finite element computations.
A shared-world conceptual model for integrating space station life sciences telescience operations
NASA Technical Reports Server (NTRS)
Johnson, Vicki; Bosley, John
1988-01-01
Mental models of the Space Station and its ancillary facilities will be employed by users of the Space Station as they draw upon past experiences, perform tasks, and collectively plan for future activities. The operational environment of the Space Station will incorporate telescience, a new set of operational modes. To investigate properties of the operational environment, distributed users, and the mental models they employ to manipulate resources while conducting telescience, an integrating shared-world conceptual model of Space Station telescience is proposed. The model comprises distributed users and resources (active elements); agents who mediate interactions among these elements on the basis of intelligent processing of shared information; and telescience protocols which structure the interactions of agents as they engage in cooperative, responsive interactions on behalf of users and resources distributed in space and time. Examples from the life sciences are used to instantiate and refine the model's principles. Implications for transaction management and autonomy are discussed. Experiments employing the model are described which the authors intend to conduct using the Space Station Life Sciences Telescience Testbed currently under development at Ames Research Center.
Memetic Approaches for Optimizing Hidden Markov Models: A Case Study in Time Series Prediction
NASA Astrophysics Data System (ADS)
Bui, Lam Thu; Barlow, Michael
We propose a methodology for employing memetics (local search) within the framework of evolutionary algorithms to optimize the parameters of hidden Markov models. With this proposal, the rate and frequency of using local search are automatically changed over time, either at the population or the individual level. At the population level, we allow the rate of using local search to decay over time to zero (at the final generation). At the individual level, each individual is equipped with information on when it will do local search and for how long. This information evolves over time alongside the main elements of the chromosome representing the individual.
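A minimal sketch of the population-level variant, in which the probability of applying local search decays linearly to zero at the final generation, is given below. The fitness, mutation, and local-search operators are toy placeholders rather than the authors' HMM-specific operators.

```python
import random

def evolve(pop, fitness, mutate, local_search, n_gen=100):
    for gen in range(n_gen):
        p_ls = 1.0 - gen / (n_gen - 1)          # local-search rate decays to 0
        offspring = [mutate(ind) for ind in pop]
        offspring = [local_search(ind) if random.random() < p_ls else ind
                     for ind in offspring]
        # elitist replacement: keep the best of parents + offspring
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:len(pop)]
    return max(pop, key=fitness)

# toy usage: maximize -(x - 3)^2 over the reals
best = evolve(pop=[random.uniform(-10, 10) for _ in range(20)],
              fitness=lambda x: -(x - 3) ** 2,
              mutate=lambda x: x + random.gauss(0, 0.5),
              local_search=lambda x: x + 0.1 * 2 * (3 - x))  # gradient-ascent step
print(best)
```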
Role of negative emotion in communication about CO2 risks.
Meijnders, A L; Midden, C J; Wilke, H A
2001-10-01
This article describes how the effectiveness of risk communication is determined by the interaction between emotional and informative elements. An experiment is described that examined the role of negative emotion in communication about CO2 risks. This experiment was based on the elaboration likelihood model and the related heuristic systematic model of attitude formation. The results indicated that inducing fear of CO2 risks leads to systematic processing of information about energy conservation as a risk-reducing strategy. In turn, this results in more favorable attitudes toward energy conservation if strong arguments are provided. Individual differences in concern seem to have similar effects.
Veinot, Tiffany C; Senteio, Charles R; Hanauer, David; Lowery, Julie C
2018-06-01
To describe a new, comprehensive process model of clinical information interaction in primary care (Clinical Information Interaction Model, or CIIM) based on a systematic synthesis of published research. We used the "best fit" framework synthesis approach. Searches were performed in PubMed, Embase, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, Library and Information Science Abstracts, Library, Information Science and Technology Abstracts, and Engineering Village. Two authors reviewed articles according to inclusion and exclusion criteria. Data abstraction and content analysis of 443 published papers were used to create a model in which every element was supported by empirical research. The CIIM documents how primary care clinicians interact with information as they make point-of-care clinical decisions. The model highlights 3 major process components: (1) context, (2) activity (usual and contingent), and (3) influence. Usual activities include information processing, source-user interaction, information evaluation, selection of information, information use, clinical reasoning, and clinical decisions. Clinician characteristics, patient behaviors, and other professionals influence the process. The CIIM depicts the complete process of information interaction, enabling a grasp of relationships previously difficult to discern. The CIIM suggests potentially helpful functionality for clinical decision support systems (CDSSs) to support primary care, including a greater focus on information processing and use. The CIIM also documents the role of influence in clinical information interaction; influencers may affect the success of CDSS implementations. The CIIM offers a new framework for achieving CDSS workflow integration and new directions for CDSS design that can support the work of diverse primary care clinicians.
Knowledge network model of the energy consumption in discrete manufacturing system
NASA Astrophysics Data System (ADS)
Xu, Binzi; Wang, Yan; Ji, Zhicheng
2017-07-01
Discrete manufacturing systems generate a large amount of data and information because of the development of information technology; hence, a management mechanism is urgently required. In order to incorporate knowledge generated from manufacturing data and production experience, a knowledge network model of energy consumption in the discrete manufacturing system was put forward based on knowledge network theory and multi-granularity modular ontology technology. This model provides a standard representation for concepts, terms and their relationships, which can be understood by both humans and computers. Besides, a formal description of the energy consumption knowledge elements (ECKEs) in the knowledge network is also given. Finally, an application example was used to verify the feasibility of the proposed method.
Zhou, Li; Collins, Sarah; Morgan, Stephen J.; Zafar, Neelam; Gesner, Emily J.; Fehrenbach, Martin; Rocha, Roberto A.
2016-01-01
Structured clinical documentation is an important component of electronic health records (EHRs) and plays an important role in clinical care, administrative functions, and research activities. Clinical data elements serve as basic building blocks for composing the templates used for generating clinical documents (such as notes and forms). We present our experience in creating and maintaining data elements for three different EHRs (one home-grown and two commercial systems) across different clinical settings, using flowsheet data elements as examples in our case studies. We identified basic but important challenges (including naming convention, links to standard terminologies, and versioning and change management) and possible solutions to address them. We also discussed more complicated challenges regarding governance, documentation vs. structured data capture, pre-coordination vs. post-coordination, reference information models, as well as monitoring, communication and training. PMID:28269927
Elements of a collaborative systems model within the aerospace industry
NASA Astrophysics Data System (ADS)
Westphalen, Bailee R.
2000-10-01
Scope and method of study. The purpose of this study was to determine the components of current aerospace collaborative efforts. There were 44 participants from two selected groups surveyed for this study. Nineteen were from the Oklahoma Air National Guard based in Oklahoma City, representing the aviation group. Twenty-five participants were from the NASA Johnson Space Center in Houston, representing the aerospace group. The surveys for the aviation group were completed in reference to planning missions necessary to their operations. The surveys for the aerospace group were completed in reference to a well-defined and focused goal from a current mission. A questionnaire was developed to survey active participants of collaborative systems in order to consider various components found within the literature. Results were analyzed and aggregated through a database, along with content analysis of respondents' comments on open-ended questions. Findings and conclusions. This study identified the elements of a collaborative systems model in the aerospace industry. The elements were (1) a purpose or mission for the group or team; (2) commitment or dedication to the challenge; (3) group or team meetings and discussions; (4) constraints of deadlines and budgets; (5) tools and resources for projects and simulations; (6) significant contributors to the collaboration; (7) decision-making formats; (8) reviews of the project; (9) participants' education and employment longevity; (10) cross-functionality of team or group members; (11) on-the-job training plus teambuilding; (12) other key elements identified as relevant by the respondents but not included in the model, such as communication and teamwork; (13) individual and group accountability; (14) conflict, learning, and performance; along with (15) intraorganizational coordination. These elements supported and allowed multiple individuals working together to solve a common problem or to develop innovation that could not have been accomplished individually. Comparing the elements of the aerospace collaborative model with those of the various authors in the literature, the aerospace model is a more comprehensive configuration of elements and adds to the definition of collaboration found in the literature. Further, the relevance of particular elements was indicated by the high response incidence among individuals within the collaborative system. Elements nearly unanimously identified as relevant by those within aerospace collaborative systems included having a stated purpose or mission for the project, working within completion deadlines, and coordinating with other departments within the organization. Furthermore, relevant terms correlating to the model, as indicated by their high incidence of appearance in the respondents' comments, were training, communication, commitment, and teamwork. There were also found to be variable distinctions specific to each collaborative context. The survey results suggested the multilevel overlapping of the multiple elements within the aerospace collaborative system model, along with distinct context variables. Further research is necessary to refine the model and to provide additional information upon this established foundation.
FCRD Advanced Reactor (Transmutation) Fuels Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janney, Dawn Elizabeth; Papesch, Cynthia Ann
2016-09-01
Transmutation of minor actinides such as Np, Am, and Cm in spent nuclear fuel is of international interest because of its potential for reducing the long-term health and safety hazards caused by the radioactivity of the spent fuel. One important approach to transmutation (currently being pursued by the DOE Fuel Cycle Research & Development Advanced Fuels Campaign) involves incorporating the minor actinides into U-Pu-Zr alloys, which can be used as fuel in fast reactors. U-Pu-Zr alloys are well suited for electrolytic refining, which leads to incorporation of rare-earth fission products such as La, Ce, Pr, and Nd. It is, therefore, important to understand not only the properties of U-Pu-Zr alloys but also those of U-Pu-Zr alloys with concentrations of minor actinides (Np, Am) and rare-earth elements (La, Ce, Pr, and Nd) similar to those in reprocessed fuel. In addition to requiring extensive safety precautions, alloys containing U, Pu, and minor actinides (Np and Am) are difficult to study for numerous reasons, including their complex phase transformations, characteristically sluggish phase-transformation kinetics, tendency to produce experimental results that vary depending on the histories of individual samples, rapid oxidation, and sensitivity to contaminants such as oxygen in concentrations below a hundred parts per million. Although less toxic, rare-earth elements such as La, Ce, Pr, and Nd are also difficult to study for similar reasons. Many of the experimental measurements were made before 1980, and the level of documentation for experimental methods and results varies widely. It is, therefore, not surprising that little is known with certainty about U-Pu-Zr alloys, particularly those that also contain minor actinides and rare-earth elements. General acceptance of results commonly indicates that there is only a single measurement for a particular property. This handbook summarizes currently available information about U, Pu, Zr, Np, Am, La, Ce, Pr, and Nd and alloys of two or three of these elements. It contains information about phase diagrams and related information (including phases and phase transformations); heat capacity, entropy, and enthalpy; thermal expansion; and thermal conductivity and diffusivity. In addition to presenting information about materials properties, the handbook attempts to provide information about how well each property is known and how much variation exists between measurements. Although it includes some results from models, its primary focus is experimental data. The Handbook is organized in two sections: one with information about the U-Pu-Zr ternary and one with information about other elements and binary and ternary alloys in the U-Np-Pu-Am-La-Ce-Pr-Nd-Zr system. Within each section, information about elements is presented first, followed by information about binary alloys, then information about ternary alloys. The order in which the elements in each alloy are mentioned follows the order in the first sentence of this paragraph. Much of the information on the U-Pu-Zr system repeats information from the FCRD Transmutation Fuels Handbook 2015. Most of the other data has been published elsewhere (although scattered throughout numerous references, some quite obscure); however, some data from Idaho National Laboratory is presented here for the first time. As the FCRD programmatic mission evolves, future editions of this handbook will begin to include other advanced reactor fuel designs and compositions. Hence, the title of the handbook will transition to the Advanced Reactor Fuels Handbook.
GEM at 10: a decade's experience with the Guideline Elements Model.
Hajizadeh, Negin; Kashyap, Nitu; Michel, George; Shiffman, Richard N
2011-01-01
The Guideline Elements Model (GEM) was developed in 2000 to organize the information contained in clinical practice guidelines using XML and to represent guideline content in a form that can be understood by human readers and processed by computers. In this work, we systematically reviewed the literature to better understand how GEM was being used, potential barriers to its use, and suggestions for improvement. Fifty external and twelve internally produced publications were identified and analyzed. GEM was used most commonly for modeling and ontology creation. Other investigators applied GEM for knowledge extraction and data mining, for clinical decision support, and for guideline generation. The GEM Cutter software, used to mark up guidelines for translation into XML, has been downloaded 563 times since 2000. Although many investigators found GEM to be valuable, others critiqued its failure to clarify guideline semantics, difficulties in markup, and the fact that GEM files are not usually executable.
Cyber Selection Test Research Effort for U.S. Army New Accessions
2017-10-12
assessment game. 3. Develop an operational version of the STA game, which incorporates assessments from phase 1 and (through game-play) examines 3 more STA abilities and 5 STA behaviors. 4. Validate the system thinking assessment game in an operational setting. (Recoverable labels from the accompanying status graphic: Completed; Planned; …Information; Identifies Elements of Systems; Models Relationships; Understands System Dynamics; Evaluates & Revises Model; Applies Understanding to Problem; STA Game.)
Data Warehouse Design from HL7 Clinical Document Architecture Schema.
Pecoraro, Fabrizio; Luzi, Daniela; Ricci, Fabrizio L
2015-01-01
This paper proposes a semi-automatic approach to extract clinical information structured in an HL7 Clinical Document Architecture (CDA) document and transform it into a data warehouse dimensional model schema. It is based on a conceptual framework, published in a previous work, that maps the dimensional model primitives to CDA elements. Its feasibility is demonstrated through a case study based on the analysis of vital signs gathered during laboratory tests.
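As a rough illustration of the kind of mapping involved (not the authors' actual framework), the sketch below pulls one observation out of a CDA-like XML fragment and shapes it as a row for a hypothetical fact table keyed to code and time dimensions; the XML structure and the column names are simplified stand-ins.

```python
import xml.etree.ElementTree as ET

CDA_SNIPPET = """
<observation>
  <code code="8480-6" displayName="Systolic blood pressure"/>
  <effectiveTime value="20150107"/>
  <value unit="mmHg">128</value>
</observation>
"""

obs = ET.fromstring(CDA_SNIPPET.strip())
fact_row = {                                     # row for a hypothetical VITALS_FACT table
    "code_key": obs.find("code").get("code"),            # FK to a code dimension
    "time_key": obs.find("effectiveTime").get("value"),  # FK to a date dimension
    "value": float(obs.find("value").text),
    "unit": obs.find("value").get("unit"),
}
print(fact_row)
```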
Time-related patient data retrieval for the case studies from the pharmacogenomics research network
Zhu, Qian; Tao, Cui; Ding, Ying; Chute, Christopher G.
2012-01-01
There are many question-based data elements in the pharmacogenomics research network (PGRN) studies. Many data elements contain temporal information. Semantically representing these elements so that they are machine-processable is a challenging problem for the following reasons: (1) the designers of these studies usually do not have knowledge of computer modeling and query languages, so the original data elements are usually represented in spreadsheets in human languages; and (2) the time aspects in these data elements can be too complex to be represented faithfully in a machine-understandable way. In this paper, we introduce our efforts on representing these data elements using semantic web technologies. We have developed an ontology, CNTRO, for representing clinical events and their temporal relations in the Web Ontology Language (OWL). Here we use CNTRO to represent the time aspects in the data elements. We have evaluated 720 time-related data elements from PGRN studies. We adapted and extended the knowledge representation requirements for EliXR-TIME to categorize our data elements. A CNTRO-based SPARQL query builder has been developed to customize users' own SPARQL queries for each knowledge representation requirement. The SPARQL query builder has been evaluated with a simulated EHR triple store to ensure its functionalities. PMID:23076712
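To make the approach concrete, the sketch below runs a toy SPARQL query over an in-memory rdflib graph to find events asserted to occur after an anchor event. The namespace and property names are hypothetical simplifications for illustration, not the actual CNTRO ontology terms.

```python
from rdflib import Graph, Namespace, Literal

EX = Namespace("http://example.org/cntro-demo#")   # hypothetical namespace
g = Graph()
g.add((EX.dose1, EX.hasTime, Literal("2011-03-01")))
g.add((EX.dose2, EX.hasTime, Literal("2011-04-15")))
g.add((EX.rash,  EX.after,   EX.dose1))            # toy temporal relation

# find every event asserted to occur after dose1
q = """
SELECT ?event WHERE {
    ?event <http://example.org/cntro-demo#after> <http://example.org/cntro-demo#dose1> .
}
"""
for row in g.query(q):
    print(row.event)
```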
NASA Technical Reports Server (NTRS)
Gillette, W. B. (Editor); Southall, J. W. (Editor)
1973-01-01
A catalog is presented of the technical program elements required to support the design activities for a subsonic and a supersonic commercial transport. The information for each element consists of usage and storage information, ownership, status, and an abstract describing the purpose of the element.
Bruland, Philipp; Doods, Justin; Storck, Michael; Dugas, Martin
2017-01-01
Data dictionaries provide structural meta-information about data definitions in health information technology (HIT) systems. In this regard, reusing healthcare data for secondary purposes offers several advantages (e.g. reduced documentation times or increased data quality). Prerequisites for data reuse are data quality, availability, and identical meaning. In diverse projects, research data warehouses serve as core components between heterogeneous clinical databases and various research applications. Given the complexity (high number of data elements) and dynamics (regular updates) of electronic health record (EHR) data structures, we propose a clinical metadata warehouse (CMDW) based on a metadata registry standard. Metadata of two large hospitals were automatically inserted into two CMDWs containing 16,230 forms and 310,519 data elements. Automatic updates of metadata are possible, as are semantic annotations. A CMDW allows metadata discovery, data quality assessment and similarity analyses. Common data models for distributed research networks can be established based on similarity analyses.
Liu, Jinbao; Zhang, Yang; Wang, Huanyuan; Du, Yichun
2018-06-15
Estimating the heavy metal content of soils can characterize the surrounding surface environment and lays a theoretical foundation for using vegetation cover to monitor the environment and survey resources. In this study, the contents of Cr, Mn, Ni, Cu, Zn, As, Cd, Hg and Pb in 44 soil samples collected from Fufeng County, Yangling County and Wugong County, Shaanxi Province were used as data sources. Spectral reflectance was measured with an ASD FieldSpec HR (350-2500 nm) spectrometer; the reflectance was pretreated with NOR, MSC and SNV, and first-derivative, second-derivative and reciprocal-logarithm transformations were applied. Optimal spectroscopic estimation models for the nine heavy metal elements Cr, Mn, Ni, Cu, Zn, As, Cd, Hg and Pb were established by regression, comparing the diffuse reflectance characteristics of different heavy metal contents and the effect of the different pretreatment methods on the soil heavy metal spectral inversion models. Chemical analysis showed serious Hg pollution in the study area, with the Cd content close to the critical value. The results show that: (1) combining the NOR, MSC and SNV pretreatments of the visible near-infrared reflectance with differential transformations can enhance the heavy metal information in the soil spectra, and using the correlated bands significantly improves the stability and predictability of the models. (2) The modeling accuracies of the optimal PLSR models for the nine heavy metal elements Cr, Mn, Ni, Cu, Zn, As, Cd, Hg and Pb were 0.70, 0.79, 0.69, 0.81, 0.86, 0.58, 0.55, 0.99 and 0.62, respectively. (3) The optimal estimation models, each using a different pretreatment, show good stability and high precision, and enable rapid prediction of the nine heavy metal elements in this region. Copyright © 2018 Elsevier B.V. All rights reserved.
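The central regression step, partial least squares on pretreated reflectance spectra, can be sketched with scikit-learn as below. The spectra and the target concentrations are synthetic placeholders, and the SNV pretreatment shown is just one of the several transformations the study compares.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(7)
spectra = rng.normal(size=(44, 2151))            # 44 samples x bands (350-2500 nm)
hg = spectra[:, 100:110].mean(axis=1) + rng.normal(0, 0.1, 44)  # synthetic target

def snv(x):
    """Standard normal variate: per-spectrum centering and scaling."""
    return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, snv(spectra), hg, cv=5).ravel()
r2 = 1 - np.sum((hg - pred) ** 2) / np.sum((hg - hg.mean()) ** 2)
print(f"cross-validated R^2: {r2:.2f}")
```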
Three-dimensional reconstruction of indoor whole elements based on mobile LiDAR point cloud data
NASA Astrophysics Data System (ADS)
Gong, Yuejian; Mao, Wenbo; Bi, Jiantao; Ji, Wei; He, Zhanjun
2014-11-01
Ground-based LiDAR is one of the most effective city modeling tools at present and has been widely used for three-dimensional reconstruction of outdoor objects. For indoor objects, however, there are some technical bottlenecks due to the lack of a GPS signal. In this paper, based on high-precision indoor point cloud data obtained by LiDAR with an advanced indoor mobile measuring system, high-precision models were built for all indoor ancillary facilities. The point cloud data we employed also contain a color feature, extracted by fusion with CCD images. The data thus have both spatial geometric features and spectral information, which can be used for constructing objects' surfaces and restoring the color and texture of the geometric model. Based on the Autodesk CAD platform and with the help of the PointSence plug-in, three-dimensional reconstruction of whole indoor elements was realized. Specifically, Pointools Edit Pro was adopted to edit the point cloud, and different types of indoor point cloud data were then processed, including data format conversion, outline extraction and texture mapping of the point cloud model. Finally, three-dimensional visualization of the real-world indoor scene was completed. Experimental results showed that high-precision 3D point cloud data obtained by indoor mobile measuring equipment can be used for 3D reconstruction of whole indoor elements, and that the methods proposed in this paper can efficiently realize this reconstruction. Moreover, the modeling precision could be controlled within 5 cm, which was proved to be a satisfactory result.
Stadelmann, Marc A; Maquer, Ghislain; Voumard, Benjamin; Grant, Aaron; Hackney, David B; Vermathen, Peter; Alkalay, Ron N; Zysset, Philippe K
2018-05-17
Intervertebral disc degeneration is a common disease that is often related to impaired mechanical function, herniations and chronic back pain. The degenerative process induces alterations of the disc's shape, composition and structure that can be visualized in vivo using magnetic resonance imaging (MRI). Numerical tools such as finite element analysis (FEA) have the potential to relate MRI-based information to the altered mechanical behavior of the disc. However, in terms of geometry, composition and fiber architecture, current FE models rely on observations made on healthy discs and might therefore not be well suited to study the degeneration process. To address the issue, we propose a new, more realistic FE methodology based on diffusion tensor imaging (DTI). For this study, a human disc joint was imaged in a high-field MR scanner with proton-density weighted (PD) and DTI sequences. The PD image was segmented and an anatomy-specific mesh was generated. Assuming accordance between local principal diffusion direction and local mean collagen fiber alignment, corresponding fiber angles were assigned to each element. Those element-wise fiber directions and PD intensities allowed the homogenized model to smoothly account for composition and fibrous structure of the disc. The disc's in vitro mechanical behavior was quantified under tension, compression, flexion, extension, lateral bending and rotation. The six resulting load-displacement curves could be replicated by the FE model, which supports our approach as a first proof of concept towards patient-specific disc modeling. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hoang, Phong V.; Konyakhin, Igor A.
2017-06-01
Autocollimators are widely used for angular measurements in instrument-making and in the manufacture of elements of optical systems (wedges, prisms, plane-parallel plates) to check their shape parameters (rectilinearity, parallelism and planarity) and retrieve their optical parameters (curvature radii, flange focusing measurement and testing). Autocollimator efficiency is due to the high sensitivity of the autocollimation method to minor rotations of the reflecting control element or the controlled surface itself. We consider using quaternions, as compared to the matrix technique, to optimize reflector parameters during autocollimation measurements. Mathematical model studies have demonstrated that the orthogonal positioning of the two basic unchanged directions of the tetrahedral reflector of the autocollimator is optimal, by the criterion of reducing measurement errors, where the axis of actual rotation is in a bisecting position towards them. Computational results of running the quaternion models are presented, which yielded conditions for diminishing measurement errors provided a priori information is available on the position of the rotation axis. A practical technique is considered for synthesizing the parameters of the tetrahedral reflector that employs the newly retrieved relationships. Following the relationships found between the angles of the tetrahedral reflector and the angles of the parameters of its initial orientation, an applied technique was developed to synthesize the control element for autocollimation measurements in cases where a priori information is available on the axis of actual rotation during monitoring measurements of shaft or pipeline deformation.
The human genome: a multifractal analysis
2011-01-01
Background. Several studies have shown that genomes can be studied via a multifractal formalism. Recently, we used a multifractal approach to study the genetic information content of the Caenorhabditis elegans genome. Here we investigate the possibility that the human genome shows a similar behavior to that observed in the nematode. Results. We report here multifractality in the human genome sequence. This behavior correlates strongly with the presence of Alu elements and, to a lesser extent, with CpG islands and (G+C) content. In contrast, little or no relationship was found for LINE, MIR, MER and LTR elements or for DNA regions poor in genetic information. Gene function, clusters of orthologous genes, metabolic pathways, and exons tended to increase their frequencies with ranges of multifractality, and large gene families were located in genomic regions with varied multifractality. Additionally, a multifractal map and classification for human chromosomes are proposed. Conclusions. Based on these findings, we propose a descriptive non-linear model for the structure of the human genome, with some biological implications. This model reveals 1) a multifractal regionalization where many regions coexist that are far from equilibrium, and 2) that this non-linear organization has significant molecular and medical genetic implications for understanding the role of Alu elements in the stability and structure of the human genome. Given the role of Alu sequences in gene regulation, genetic diseases, human genetic diversity, adaptation and phylogenetic analyses, these quantifications are especially useful. PMID:21999602
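For readers unfamiliar with the formalism, a bare-bones box-counting estimate of the generalized dimensions D_q on a 2-D point set (for example, a chaos-game representation of a sequence) looks like the sketch below. This is a generic illustration of the technique, not the pipeline used in the paper.

```python
import numpy as np

def generalized_dimensions(points, qs=(0, 1, 2), eps_list=(1/8, 1/16, 1/32, 1/64)):
    """Box-counting estimate of D_q for points scaled into the unit square."""
    dims = {}
    for q in qs:
        ys = []
        for eps in eps_list:
            _, counts = np.unique((points // eps).astype(int), axis=0,
                                  return_counts=True)
            p = counts / counts.sum()
            if q == 1:
                ys.append(np.sum(p * np.log(p)))         # information dimension
            else:
                ys.append(np.log(np.sum(p ** q)) / (q - 1))
        dims[q] = np.polyfit(np.log(eps_list), ys, 1)[0]  # slope vs log(eps)
    return dims

pts = np.random.default_rng(3).random((20000, 2))         # uniform points: D_q ~ 2
print(generalized_dimensions(pts))
```

A monofractal set has the same D_q for every q; multifractality shows up as a q-dependent spectrum.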
Building an Ontology for Identity Resolution in Healthcare and Public Health.
Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P; Clyde, Stephen; Thornton, Sidney; Staes, Catherine
2015-01-01
Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Our objectives were to: 1) identify components of identity, and events subsequent to birth that result in creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology's ability to model identity-changing events over time. We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage.
Geochemistry and origin of regional dolomites. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanson, G.N.; Meyers, W.J.
1995-05-01
The main goal of our research on dolomites has been to better understand the composition of the fluids and the processes of fluid-rock interaction responsible for the formation of massive dolostones occurring over regional scales within sedimentary sequences. Understanding the timing of dolomitization, the fluids responsible for the dolomitization and the timing of the development of porosity has major economic ramifications, in that dolomites are major oil reservoirs, often having better reservoir properties than associated limestones. Our approach has been to apply trace element, major element, petrographic, crystallographic, stable isotope and radiogenic isotope systems to test models for the origins of dolomites and to give information that may allow us to develop new models. Fluid compositions and processes are evaluated through the use of numerical models which we have developed showing the simultaneous evolution of the trace element and isotope systems during dolomitization. Our research has included the application of B, O, C, Sr, Nd and Pb isotope systematics and the trace elements Mn, Fe, Sr, rare earth elements, Rb, Ba, U, Th, Pb, Zn, Na, Cl, F and SO₄²⁻. Analyses are possible on individual cements or dolomite types using micro-sampling or microprobe techniques. The microprobe techniques used include synchrotron X-ray microprobe analysis at Brookhaven National Laboratory and electron microprobe at Stony Brook. The lack of a modern analogue for ancient massive dolostones has limited the application of the uniformitarian concept to developing models for the ancient regional dolostones. In addition, it has not been possible to synthesize dolomite in the laboratory under conditions similar to the possible sedimentary or diagenetic environments in which the dolomites must have formed.
Inform, Perform, Transform: Modeling In-School Youth Participatory Action Research through Gameplay
ERIC Educational Resources Information Center
Garcia, Antero
2012-01-01
In this article, the author explores youth participatory action research (YPAR) through gameplay. He describes Ask Anansi, an alternate reality game (ARG) played in the "real world" by weaving elements of storytelling and fiction into the environment played as part of class experience. This game which the author created drove the…
Parents' Views on Preschool Care and Education in Local Community
ERIC Educational Resources Information Center
Devjak, Tatjana; Bercnik, Sanja
2009-01-01
In this text, the authors are analyzing preschool care and education in local community. They are focusing on the problem of information transfer between the kindergarten, parents and local community, as well as the model of relationship participation. Cooperation between parents, kindergarten and local community is an important element in the…
Simulating spatial and temporal context of forest management using hypothetical landscapes
Eric J. Gustafson; Thomas R. Crow
1998-01-01
Spatially explicit models that combine remote sensing with geographic information systems (GIS) offer great promise to land managers because they consider the arrangement of landscape elements in time and space. Their visual and geographic nature facilitate the comparison of alternative landscape designs. Among various activities associated with forest management,...
Information Needs within a Multi-District Environment.
ERIC Educational Resources Information Center
Thomas, Gregory P.
This paper argues that no single measurement strategy serves all purposes and that applying methods and techniques which allow a variety of data elements to be retrieved and juxtaposed may be an investment in the future. Item response theory, Rasch model, and latent trait theory are all approaches to a single conceptual topic. An abbreviated look…
Effects of Teaching Strategies in Annotated Bibliography Writing
ERIC Educational Resources Information Center
Tan-de Ramos, Jennifer
2015-01-01
The study examines the effect of teaching strategies to improved writing of students in the tertiary level. Specifically, three teaching approaches--the use of modelling, grammar-based, and information element-focused--were tested on their effect on the writing of annotated bibliography in three research classes at a university in Manila.…
Child-Centred Practice in Irish Infant Classrooms: A Case of Imaginary Play?
ERIC Educational Resources Information Center
Murphy, Brian
2006-01-01
This paper begins by outlining and discussing the fundamental elements of the process, child-centred model of curriculum, which has informed the two most recent Irish primary school curricula "Curaclam na Bunscoile" (1971 to 1999) and the "Primary School Curriculum" (1999 to date). The specific ways in which both Irish…
Dumenil, Aurélien; Kaladji, Adrien; Castro, Miguel; Esneault, Simon; Lucas, Antoine; Rochette, Michel; Goksu, Cemil; Haigron, Pascal
2013-01-01
Endovascular repair of abdominal aortic aneurysms is a well-established technique throughout the medical and surgical communities. Although increasingly indicated, this technique does have some limitations. Because intervention is commonly performed under fluoroscopic control, two-dimensional (2D) visualization of the aneurysm requires the injection of a contrast agent. The projective nature of this imaging modality inevitably leads to topographic errors, and does not give information on arterial wall quality at the time of deployment. A specially-adapted intraoperative navigation interface could increase deployment accuracy and reveal such information, which preoperative three-dimensional (3D) imaging might otherwise provide. One difficulty is the precise matching of preoperative data (images and models) and intraoperative observations affected by anatomical deformations due to tool-tissue interactions. Our proposed solution involves a finite element-based preoperative simulation of tool/tissue interactions, its adaptive tuning regarding patient specific data, and the matching with intra-operative data. The biomechanical model was first tuned on a group of 10 patients and assessed on a second group of 8 patients. PMID:23269745
Expected gamma-ray emission spectra from the lunar surface as a function of chemical composition
NASA Technical Reports Server (NTRS)
Reedy, R. C.; Arnold, J. R.; Trombka, J. I.
1973-01-01
The gamma rays emitted from the moon or any similar body carry information on the chemical composition of the surface layer. The elements most easily measured are K, U, Th and major elements such as O, Si, Mg, and Fe. The expected fluxes of gamma ray lines were calculated for four lunar compositions and one chondritic chemistry from a consideration of the important emission mechanisms: natural radioactivity, inelastic scatter, neutron capture, and induced radioactivity. The models used for cosmic ray interactions were those of Reedy and Arnold and Lingenfelter. The areal resolution of the experiment was calculated to be around 70 to 140 km under the conditions of the Apollo 15 and 16 experiments. Finally, a method was described for recovering the chemical information from the observed scintillation spectra obtained in these experiments.
NASA Technical Reports Server (NTRS)
Panontin, Tina; Carvalho, Robert; Keller, Richard
2004-01-01
Contents include the following: Overview of the Application; Input Data; Analytical Process; Tool's Output; and Application of the Results of the Analysis. The tool enables the first element through a Web-based application that can be accessed by distributed teams to store and retrieve any type of digital investigation material in a secure environment. The second is accomplished by making the relationships between information explicit through the use of a semantic network, a structure that literally allows an investigator or team to "connect-the-dots." The third element, the significance of the correlated information, is established through causality and consistency tests using a number of different methods embedded within the tool, including fault trees, event sequences, and other accident models. And finally, the evidence gathered and structured within the tool can be directly, electronically archived to preserve the evidence and investigative reasoning.
Use of Fuzzy Logic Systems for Assessment of Primary Faults
NASA Astrophysics Data System (ADS)
Petrović, Ivica; Jozsa, Lajos; Baus, Zoran
2015-09-01
In electric power systems, grid elements are often subjected to very complex and demanding disturbances or dangerous operating conditions. Determining the initial fault or the cause of those states is a difficult task. When a fault occurs, it is often imperative to disconnect the affected grid element from the grid. This paper contains an overview of possibilities for using fuzzy logic in the assessment of primary faults in the transmission grid. The tool for this task is the SCADA system, which is based on information about currents, voltages, protection device events and the status of circuit breakers in the grid. The functional model, described with membership functions and fuzzy logic systems, is presented in the paper. As input data, the diagnostic system uses information about protection device tripping, circuit breaker states, and measurements of currents and voltages before and after faults.
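A minimal sketch of the kind of inference such a system performs, assuming (hypothetically) triangular membership functions over per-unit current and voltage deviations and a Mamdani-style min rule; the thresholds and variable names are illustrative, not taken from the paper:

```python
# Hypothetical sketch: grade a primary-fault hypothesis from SCADA-style
# measurements with triangular fuzzy membership functions and a min rule.
def tri(x, a, b, c):
    """Triangular membership: 0 at a, peak 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fault_degree(current_pu, voltage_pu, breaker_open):
    overcurrent = tri(current_pu, 1.2, 3.0, 10.0)   # per-unit current
    undervoltage = tri(voltage_pu, -0.1, 0.3, 0.9)  # per-unit voltage
    # Mamdani-style AND (min); breaker status acts as a crisp gate.
    return min(overcurrent, undervoltage) if breaker_open else 0.0

print(fault_degree(2.5, 0.4, True))  # ~0.72: strong evidence of a fault
```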
Attitude Determination Using a MEMS-Based Flight Information Measurement Unit
Ma, Der-Ming; Shiau, Jaw-Kuen; Wang, I.-Chiang; Lin, Yu-Heng
2012-01-01
Obtaining precise attitude information is essential for aircraft navigation and control. This paper presents the results of attitude determination using an in-house designed, low-cost MEMS-based flight information measurement unit. The study proposes a quaternion-based extended Kalman filter that integrates the traditional quaternion and gravitational force decomposition methods into an attitude determination algorithm. The proposed extended Kalman filter uses the evolution of the four elements of the quaternion method as the dynamic model, with the four elements as the states of the filter. The attitude angles obtained from the gravity computations and from the electronic magnetic sensors are regarded as the measurements of the filter. The immeasurable gravity accelerations are deduced from the outputs of the three-axis accelerometers, the relative accelerations, and the accelerations due to body rotation. The unit-norm constraint on the four elements of the quaternion is treated as a perfect measurement and is integrated into the filter computation. Approximations of the time-varying noise variances of the measured signals are derived in detail through Taylor series expansions. The algorithm is intuitive, easy to implement, and reliable for long-term, highly dynamic maneuvers. Moreover, a set of flight test data is used to demonstrate the success and practicality of the proposed algorithm and the filter design. PMID:22368455
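As a concrete illustration of the dynamic model described here, the sketch below propagates the four quaternion elements under measured body rates and re-normalises the result, standing in for the unit-norm constraint treated as a perfect measurement; this is our own minimal reconstruction, not the authors' filter:

```python
# Sketch: quaternion kinematics q_dot = 0.5 * Omega(w) * q (scalar-first
# convention), one Euler step per sample, then re-normalisation.
import numpy as np

def omega_matrix(w):
    wx, wy, wz = w
    return np.array([[0., -wx, -wy, -wz],
                     [wx,  0.,  wz, -wy],
                     [wy, -wz,  0.,  wx],
                     [wz,  wy, -wx,  0.]])

def propagate(q, w, dt):
    q = q + 0.5 * (omega_matrix(w) @ q) * dt
    return q / np.linalg.norm(q)   # enforce the unit-norm constraint

q = np.array([1.0, 0.0, 0.0, 0.0])                  # identity attitude
q = propagate(q, np.array([0.0, 0.0, 0.1]), 0.01)   # gyro yaw rate, 100 Hz
```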
Debris flow risk mapping on medium scale and estimation of prospective economic losses
NASA Astrophysics Data System (ADS)
Blahut, Jan; Sterlacchini, Simone
2010-05-01
Delimitation of potential zones affected by debris flow hazard, mapping of areas at risk, and estimation of future economic damage provide important information for spatial planners and local administrators in all countries endangered by this type of phenomenon. This study presents a medium-scale (1:25 000 - 1:50 000) analysis applied in the Consortium of Mountain Municipalities of Valtellina di Tirano (Italian Alps, Lombardy Region). In this area a debris flow hazard map was coupled with information about the elements at risk to obtain monetary values of prospective damage. Two available hazard maps were obtained from GIS medium-scale modelling. Probability estimations of debris flow occurrence were calculated using existing susceptibility maps and two sets of aerial images. Value was assigned to the elements at risk according to the official information on housing costs and land value from the Territorial Agency of Lombardy Region. In the first risk map, vulnerability values were assumed to be 1. The second risk map uses three classes of vulnerability values, qualitatively estimated according to the possible debris flow propagation. Risk curves summarizing the possible economic losses were calculated. Finally, these maps of economic risk were compared to maps derived from qualitative evaluation of the values of the elements at risk.
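The underlying bookkeeping is the standard risk equation R = H x V x E (hazard probability, vulnerability in [0, 1], monetary value of the elements at risk). A toy raster version, with made-up cell values rather than the Valtellina data:

```python
# Illustrative only: expected annual loss per raster cell as R = H * V * E.
import numpy as np

hazard = np.array([[0.02, 0.10],
                   [0.00, 0.25]])        # debris flow occurrence probability
vulnerability = np.array([[1.0, 1.0],
                          [1.0, 0.5]])   # first map assumed V = 1 everywhere
value_eur = np.array([[2.0e5, 1.2e6],
                      [0.0,   8.0e5]])   # housing/land value per cell

risk = hazard * vulnerability * value_eur
print(risk.sum())  # total prospective economic loss for the area
```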
Woodward-Kron, Robyn; Connor, Melanie; Schulz, Peter J; Elliott, Kristine
2014-02-01
Communication skills teaching in medical education has yet to acknowledge the impact of the Internet on physician-patient communication. The authors present a conceptual model showing the variables influencing how and to what extent physicians and patients discuss Internet-sourced health information as part of the consultation, with the purpose of educating the patient. A study exploring the role physicians play in patient education mediated through health information available on the Internet provided the foundation for the conceptual model. Twenty-one physicians participated in semistructured interviews between 2011 and 2013. Participants were from Australia and Switzerland, whose citizens demonstrate different degrees of Internet usage and who differ culturally and ethnically. The authors analyzed the interviews thematically and iteratively. The themes, as well as their interrelationships, informed the components of the conceptual model. The intrinsic elements of the conceptual model are the physician, the patient, and Internet-based health information. The extrinsic variables of setting, time, and communication activities, as well as the quality, availability, and usability of the Internet-based health information, influenced the degree to which physicians engaged with, and were engaged by, their patients about Internet-based health information. The empirically informed model provides a means of understanding the environment, enablers, and constraints of discussing Internet-based health information, as well as the benefits for patients' understanding of their health. It also provides medical educators with a conceptual tool to engage and support physicians in their activities of communicating health information to patients.
DEVELOPMENT OF COMPUTER SUPPORTED INFORMATION SYSTEM SHELL FOR MEASURING POLLUTION PROGRESS
Basic elements and concepts of information systems are presented: definition of the term "information", and the main elements of data and database structure. The report also deals with the information system and its underlying theory and design. Examples of the application of information syst...
Neural dynamics of grouping and segmentation explain properties of visual crowding.
Francis, Gregory; Manassi, Mauro; Herzog, Michael H
2017-07-01
Investigations of visual crowding, where a target is difficult to identify because of flanking elements, have largely used a theoretical perspective based on local interactions where flanking elements pool with or substitute for properties of the target. This successful theoretical approach has motivated a wide variety of empirical investigations to identify mechanisms that cause crowding, and it has suggested practical applications to mitigate crowding effects. However, this theoretical approach has been unable to account for a parallel set of findings that crowding is influenced by long-range perceptual grouping effects. When the target and flankers are perceived as part of separate visual groups, crowding tends to be quite weak. Here, we describe how theoretical mechanisms for grouping and segmentation in cortical neural circuits can account for a wide variety of these long-range grouping effects. Building on previous work, we explain how crowding occurs in the model and explain how grouping in the model involves connected boundary signals that represent a key aspect of visual information. We then introduce new circuits that allow nonspecific top-down selection signals to flow along connected boundaries or within a surface contained by boundaries and thereby induce a segmentation that can separate the visual information corresponding to the flankers from the visual information corresponding to the target. When such segmentation occurs, crowding is shown to be weak. We compare the model's behavior to 5 sets of experimental findings on visual crowding and show that the model does a good job explaining the key empirical findings. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
ACTOMP - AUTOCAD TO MASS PROPERTIES
NASA Technical Reports Server (NTRS)
Jones, A.
1994-01-01
AutoCAD to Mass Properties was developed to facilitate quick mass properties calculations of structures having many simple elements in a complex configuration such as trusses or metal sheet containers. Calculating the mass properties of structures of this type can be a tedious and repetitive process, but ACTOMP helps automate the calculations. The structure can be modelled in AutoCAD or a compatible CAD system in a matter of minutes using the 3-Dimensional elements. This model provides all the geometric data necessary to make a mass properties calculation of the structure. ACTOMP reads the geometric data of a drawing from the Drawing Interchange File (DXF) used in AutoCAD. The geometric entities recognized by ACTOMP include POINTs, 3DLINEs, and 3DFACEs. ACTOMP requests mass, linear density, or area density of the elements for each layer, sums all the elements and calculates the total mass, center of mass (CM) and the mass moments of inertia (MOI). AutoCAD utilizes layers to define separate drawing planes. ACTOMP uses layers to differentiate between multiple types of similar elements. For example if a structure is made of various types of beams, modeled as 3DLINEs, each with a different linear density, the beams can be grouped by linear density and each group placed on a separate layer. The program will request the linear density of 3DLINEs for each new layer it finds as it processes the drawing information. The same is true with POINTs and 3DFACEs. By using layers this way a very complex model can be created. POINTs are used for point masses such as bolts, small machine parts, or small electronic boxes. 3DLINEs are used for beams, bars, rods, cables, and other similarly slender elements. 3DFACEs are used for planar elements. 3DFACEs may be created as 3 or 4 Point faces. Some examples of elements that might be modelled using 3DFACEs are plates, sheet metal, fabric, boxes, large diameter hollow cylinders and evenly distributed masses. ACTOMP was written in Microsoft QuickBasic (Version 2.0). It was developed for the IBM PC microcomputer and has been implemented on an IBM PC compatible under DOS 3.21. ACTOMP was developed in 1988 and requires approximately 5K bytes to operate.
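The accumulation ACTOMP performs can be illustrated for POINT-type elements; this is our own reconstruction in Python rather than the original QuickBasic, and 3DLINE and 3DFACE elements would contribute line and area integrals to the same sums:

```python
# Total mass, centre of mass and inertia tensor about the CM for point masses.
import numpy as np

def mass_properties(points, masses):
    m = masses.sum()
    cm = (masses[:, None] * points).sum(axis=0) / m
    inertia = np.zeros((3, 3))
    for p, mi in zip(points - cm, masses):
        # I += m * (|r|^2 * E - r r^T), accumulated about the CM.
        inertia += mi * (np.dot(p, p) * np.eye(3) - np.outer(p, p))
    return m, cm, inertia

pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.5, 1.0, 0.0]])
m, cm, inertia = mass_properties(pts, np.array([2.0, 2.0, 1.0]))
print(m, cm)   # 5.0, CM at [0.5, 0.2, 0.0]
```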
NASA Astrophysics Data System (ADS)
Castagnetti, C.; Dubbini, M.; Ricci, P. C.; Rivola, R.; Giannini, M.; Capra, A.
2017-05-01
The new era of design in architecture and civil engineering applications lies in the Building Information Modeling (BIM) approach, based on a 3D geometric model that includes a 3D database. This is easier for new constructions, whereas, when dealing with existing buildings, the creation of the BIM is based on accurate knowledge of the as-built construction. Such knowledge is provided by a 3D survey, often carried out with laser scanning technology or modern photogrammetry, which can guarantee an adequate point cloud in terms of resolution and completeness while balancing time consumption and costs against the required final accuracy. The BIM approach for existing buildings, and even more so for historical buildings, is not yet a well-known and deeply discussed process. There are still several choices to be addressed in the process from the survey to the model, and critical issues to be discussed in the modeling step, particularly when dealing with unconventional elements such as deformed geometries or historical elements. The paper describes a comprehensive workflow that goes through the survey and the modeling, focusing on critical issues and key points for obtaining a reliable BIM of an existing monument. The case study employed to illustrate the workflow is the Basilica of St. Stefano in Bologna (Italy), a large monumental complex with great religious, historical and architectural assets.
NASA Astrophysics Data System (ADS)
Rybizki, Jan; Just, Andreas; Rix, Hans-Walter
2017-09-01
Elemental abundances of stars are the result of the complex enrichment history of their galaxy. Interpretation of observed abundances requires flexible modeling tools to explore and quantify the information about Galactic chemical evolution (GCE) stored in such data. Here we present Chempy, a newly developed code for GCE modeling, representing a parametrized open one-zone model within a Bayesian framework. A Chempy model is specified by a set of five to ten parameters that describe the effective galaxy evolution along with the stellar and star-formation physics: for example, the star-formation history (SFH), the feedback efficiency, the stellar initial mass function (IMF), and the incidence of supernova of type Ia (SN Ia). Unlike established approaches, Chempy can sample the posterior probability distribution in the full model parameter space and test data-model matches for different nucleosynthetic yield sets. It is essentially a chemical evolution fitting tool. We straightforwardly extend Chempy to a multi-zone scheme. As an illustrative application, we show that interesting parameter constraints result from only the ages and elemental abundances of the Sun, Arcturus, and the present-day interstellar medium (ISM). For the first time, we use such information to infer the IMF parameter via GCE modeling, where we properly marginalize over nuisance parameters and account for different yield sets. We find that 11.6 (+2.1/-1.6)% of the IMF explodes as core-collapse supernovae (CC-SNe), compatible with Salpeter (1955, ApJ, 121, 161). We also constrain the incidence of SNe Ia per 10³ M⊙ to 0.5-1.4. At the same time, this Chempy application shows persistent discrepancies between predicted and observed abundances for some elements, irrespective of the chosen yield set. These cannot be remedied by any variations of Chempy's parameters and could be an indication of missing nucleosynthetic channels. Chempy could be a powerful tool to confront predictions from stellar nucleosynthesis with far more complex abundance data sets and to refine the physical processes governing the chemical evolution of stellar systems.
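A toy illustration of the Bayesian machinery involved (our sketch, not Chempy: one stand-in parameter, an invented forward model, and mock data; Chempy samples five to ten parameters against real yield sets):

```python
# Metropolis sampling of a one-parameter posterior, standing in for GCE
# parameter inference. The forward model and datum are invented.
import numpy as np

rng = np.random.default_rng(0)
obs, sigma = 0.15, 0.05               # mock abundance datum and 1-sigma error

def forward(alpha):                   # hypothetical IMF-slope -> abundance map
    return 0.5 * (alpha - 2.0)

def log_post(alpha):                  # flat prior on (1.5, 3.5)
    if not 1.5 < alpha < 3.5:
        return -np.inf
    return -0.5 * ((forward(alpha) - obs) / sigma) ** 2

chain, alpha = [], 2.3
for _ in range(5000):                 # random-walk Metropolis
    prop = alpha + 0.05 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(alpha):
        alpha = prop
    chain.append(alpha)
print(np.mean(chain[1000:]), np.std(chain[1000:]))  # posterior mean and spread
```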
NASA Astrophysics Data System (ADS)
Prychynenko, Diana; Sitte, Matthias; Litzius, Kai; Krüger, Benjamin; Bourianoff, George; Kläui, Mathias; Sinova, Jairo; Everschor-Sitte, Karin
2018-01-01
Inspired by the human brain, there is a strong effort to find alternative models of information processing capable of imitating the high energy efficiency of neuromorphic information processing. One possible realization of cognitive computing involves reservoir computing networks. These networks are built out of nonlinear resistive elements which are recursively connected. We propose that a Skyrmion network embedded in magnetic films may provide a suitable physical implementation for reservoir computing applications. The key ingredient of such a network is a two-terminal device with nonlinear voltage characteristics originating from magnetoresistive effects, such as the anisotropic magnetoresistance or the recently discovered noncollinear magnetoresistance. The most basic element for a reservoir computing network built from "Skyrmion fabrics" is a single Skyrmion embedded in a ferromagnetic ribbon. In order to pave the way towards reservoir computing systems based on Skyrmion fabrics, we simulate and analyze (i) the current flow through a single magnetic Skyrmion due to the anisotropic magnetoresistive effect and (ii) the combined physics of local pinning and the anisotropic magnetoresistive effect.
Haematic pH sensor for extracorporeal circulation
NASA Astrophysics Data System (ADS)
Ferrari, Luca; Fabbri, Paola; Rovati, Luigi; Pilati, Francesco
2012-03-01
The design and realization of an optical sensor for measuring haematic pH during extracorporeal circulation is presented. It consists of a chemical sensing element in contact with the blood, an interrogation optical head to externally probe the sensing element and the front-end electronics to acquire and process the information of interest. The fluorescein O-methacrylate 97% is used as the indicator. The developed system has been tested in-vitro and on an in-vivo animal model. It showed a linear behavior in the haematic range of interest with a mean error lower than 0.01 units of pH.
NASA Astrophysics Data System (ADS)
Srivastava, D. P.; Sahni, V.; Satsangi, P. S.
2014-08-01
Graph-theoretic quantum system modelling (GTQSM) is facilitated by considering the fundamental unit of quantum computation and information, viz. a quantum bit or qubit as a basic building block. Unit directional vectors "ket 0" and "ket 1" constitute two distinct fundamental quantum across variable orthonormal basis vectors, for the Hilbert space, specifying the direction of propagation of information, or computation data, while complementary fundamental quantum through, or flow rate, variables specify probability parameters, or amplitudes, as surrogates for scalar quantum information measure (von Neumann entropy). This paper applies GTQSM in continuum of protein heterodimer tubulin molecules of self-assembling polymers, viz. microtubules in the brain as a holistic system of interacting components representing hierarchical clustered quantum Hopfield network, hQHN, of networks. The quantum input/output ports of the constituent elemental interaction components, or processes, of tunnelling interactions and Coulombic bidirectional interactions are in cascade and parallel interconnections with each other, while the classical output ports of all elemental components are interconnected in parallel to accumulate micro-energy functions generated in the system as Hamiltonian, or Lyapunov, energy function. The paper presents an insight, otherwise difficult to gain, for the complex system of systems represented by clustered quantum Hopfield network, hQHN, through the application of GTQSM construct.
Chromosphere Active Region Plasma Diagnostics Based On Observations Of Millimeter Radiation
NASA Astrophysics Data System (ADS)
Loukitcheva, M.; Nagnibeda, V.
1999-10-01
In this paper we present the results of millimeter radiation calculations for different elements of chromospheric and transition region structures of the quiet Sun and the S-component: elements of the chromospheric network, sunspot groups and plages. The calculations were done on the basis of standard optical and UV models (the models by Vernazza et al. (1981, VAL) and their modifications by Fontenla et al. (1993, FAL)). We also considered the sunspot model by Lites and Skumanich (1982, LS), the S-component model by Staude et al. (1984) and the modifications of the VAL and FAL models by Bocchialini and Vial, the NET and CELL models. We compare these model calculations with the observed characteristics of the components of millimeter solar radiation for the quiet Sun and the S-component obtained with the radiotelescope RT-7.5 MGTU (wavelength 3.4 mm) and the Nobeyama radioheliograph (wavelength 17.6 mm). From the observations we derived spectral characteristics of the millimeter sources and the active region source structure. The comparison has shown that the observed radio data are clearly in disagreement with all the considered models. Finally, we propose further improvement of chromospheric and transition region models based on optical and UV observations, in order to make use of the information obtained from radio data in the modelling.
Fast Erase Method and Apparatus For Digital Media
NASA Technical Reports Server (NTRS)
Oakely, Ernest C. (Inventor)
2006-01-01
A non-contact fast erase method for erasing information stored on a magnetic or optical media. The magnetic media element includes a magnetic surface affixed to a toroidal conductor and stores information in a magnetic polarization pattern. The fast erase method includes applying an alternating current to a planar inductive element positioned near the toroidal conductor, inducing an alternating current in the toroidal conductor, and heating the magnetic surface to a temperature that exceeds the Curie-point so that information stored on the magnetic media element is permanently erased. The optical disc element stores information in a plurality of locations being defined by pits and lands in a toroidal conductive layer. The fast erase method includes similarly inducing a plurality of currents in the optical media element conductive layer and melting a predetermined portion of the conductive layer so that the information stored on the optical medium is destroyed.
Nonlinear dynamics of laser systems with elements of a chaos: Advanced computational code
NASA Astrophysics Data System (ADS)
Buyadzhi, V. V.; Glushkov, A. V.; Khetselius, O. Yu; Kuznetsova, A. A.; Buyadzhi, A. A.; Prepelitsa, G. P.; Ternovsky, V. B.
2017-10-01
A general, uniform chaos-geometric computational approach to the analysis, modelling and prediction of the non-linear dynamics of quantum and laser systems (laser and quantum generator systems, etc.) with elements of deterministic chaos is briefly presented. The approach is based on advanced generalized techniques such as wavelet analysis, multi-fractal formalism, the mutual information approach, correlation integral analysis, the false nearest neighbour algorithm, Lyapunov exponent analysis, the surrogate data method, and prediction models. We first present numerical data on the topological and dynamical invariants (in particular, the correlation, embedding and Kaplan-Yorke dimensions, the Lyapunov exponents, the Kolmogorov entropy and other parameters) for the dynamics of a laser system (a semiconductor GaAs/GaAlAs laser with delayed feedback) in chaotic and hyperchaotic regimes.
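Of the listed invariants, the correlation integral is the easiest to make concrete. Under the standard Grassberger-Procaccia definition (our sketch, with a synthetic signal in place of the laser time series), C(r) is the fraction of pairs of delay-embedded points closer than r, and the slope of log C(r) versus log r estimates the correlation dimension:

```python
# Correlation integral C(r) of a delay-embedded scalar time series.
import numpy as np

def correlation_integral(series, dim, tau, r):
    n = len(series) - (dim - 1) * tau
    emb = np.column_stack([series[i * tau: i * tau + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)          # distinct pairs only
    return np.mean(dists[iu] < r)

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 60, 800)) + 0.1 * rng.standard_normal(800)
print([correlation_integral(x, dim=3, tau=5, r=r) for r in (0.1, 0.3, 1.0)])
```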
NASA Technical Reports Server (NTRS)
Weiss, Jerold L.; Hsu, John Y.
1986-01-01
The use of a decentralized approach to failure detection and isolation for use in restructurable control systems is examined. This work has produced: (1) A method for evaluating fundamental limits to FDI performance; (2) Application using flight recorded data; (3) A working control element FDI system with maximal sensitivity to critical control element failures; (4) Extensive testing on realistic simulations; and (5) A detailed design methodology involving parameter optimization (with respect to model uncertainties) and sensitivity analyses. This project has concentrated on detection and isolation of generic control element failures since these failures frequently lead to emergency conditions and since knowledge of remaining control authority is essential for control system redesign. The failures are generic in the sense that no temporal failure signature information was assumed. Thus, various forms of functional failures are treated in a unified fashion. Such a treatment results in a robust FDI system (i.e., one that covers all failure modes) but sacrifices some performance when detailed failure signature information is known, useful, and employed properly. It was assumed throughout that all sensors are validated (i.e., contain only in-spec errors) and that only the first failure of a single control element needs to be detected and isolated. The FDI system which has been developed will handle a class of multiple failures.
HIS/BUI: a conceptual model for bottom-up integration of hospital information systems.
Zviran, M; Armoni, A; Glezer, C
1998-06-01
Many successful applications of information systems have been introduced and implemented in hospitals. However, the integration of these applications into a cohesive hospital-wide information system has proved to be more complicated to develop and difficult to accomplish than expected. This paper introduces HIS/BUI, a framework for bottom-up integration of hospital information systems, and demonstrates its application through a real-life case scenario. The scope of the proposed framework is the integration of heterogeneous clinical, administrative, and financial information elements of a hospital into a unified system environment. Under the integrated architecture, all existing local applications are preserved and interconnected to an information hub that serves as a central medical and administrative data warehouse.
Application of 6D Building Information Model (6D BIM) for Business-storage Building in Slovenia
NASA Astrophysics Data System (ADS)
Pučko, Zoran; Vincek, Dražen; Štrukelj, Andrej; Šuman, Nataša
2017-10-01
The aim of this paper is to present an application of 6D building information modelling (6D BIM) to a real business-storage building in Slovenia. First, the features of building maintenance in general are described according to current Slovenian legislation, and a general principle of BIM is given. After that, the step-by-step activities for building the 6D BIM are presented: compiling the element list for maintenance, determining element lifetimes and service measures, cost analysis and time analysis, and finally 6D BIM modelling. The presented 6D BIM model is designed in a particular way: the cost analysis is performed as a 5D BIM model with data linked to BIM construction project management software (Vico Office) and integrated with the 3D BIM model, whereas the time analysis, as a 4D BIM model, is carried out with non-linked data in Excel (without a connection to the 3D BIM model). The paper is intended to serve as a guide for building owners preparing a 6D BIM and to provide an insight into the relevant dynamic information about the intervals and costs of maintenance works over the whole building lifecycle.
Auditory Power-Law Activation Avalanches Exhibit a Fundamental Computational Ground State
NASA Astrophysics Data System (ADS)
Stoop, Ruedi; Gomez, Florian
2016-07-01
The cochlea provides a biological information-processing paradigm that we are only beginning to understand in its full complexity. Our work reveals an interacting network of strongly nonlinear dynamical nodes, on which even a simple sound input triggers subnetworks of activated elements that follow power-law size statistics ("avalanches"). From dynamical systems theory, power-law size distributions relate to a fundamental ground state of biological information processing. Learning destroys these power laws. These results strongly modify the models of mammalian sound processing and provide a novel methodological perspective for understanding how the brain processes information.
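Power-law size statistics of the kind reported here are conventionally fitted with the maximum-likelihood estimator of Clauset, Shalizi and Newman (2009); a minimal sketch on synthetic avalanche sizes (not the authors' cochlear data):

```python
# MLE for the exponent of a continuous power law, alpha_hat =
# 1 + n / sum(ln(x_i / x_min)), checked on synthetic Pareto draws.
import numpy as np

def powerlaw_alpha(sizes, x_min):
    s = np.asarray([x for x in sizes if x >= x_min], dtype=float)
    return 1.0 + len(s) / np.sum(np.log(s / x_min))

rng = np.random.default_rng(2)
mock = (1.0 - rng.random(10000)) ** (-1.0 / 1.5)  # true alpha = 2.5, x_min = 1
print(powerlaw_alpha(mock, x_min=1.0))            # should come out near 2.5
```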
A computational framework for modeling targets as complex adaptive systems
NASA Astrophysics Data System (ADS)
Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh
2017-05-01
Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.
Thermal, Structural, and Optical Analysis of a Balloon-Based Imaging System
NASA Astrophysics Data System (ADS)
Borden, Michael; Lewis, Derek; Ochoa, Hared; Jones-Wilson, Laura; Susca, Sara; Porter, Michael; Massey, Richard; Clark, Paul; Netterfield, Barth
2017-03-01
The Subarcsecond Telescope And BaLloon Experiment, STABLE, is the fine stage of a guidance system for a high-altitude ballooning platform designed to demonstrate subarcsecond pointing stability over one minute using relatively dim guide stars in the visible spectrum. The STABLE system uses an attitude rate sensor and the motion of the guide star on a detector to control a Fast Steering Mirror to stabilize the image. The characteristics of the thermal-optical-mechanical elements in the system directly affect the quality of the point-spread function of the guide star on the detector, so a series of thermal, structural, and optical models were built to simulate system performance and ultimately inform the final pointing stability predictions. This paper describes the modeling techniques employed in each of these subsystems. The results from those models are discussed in detail, highlighting the development of the worst-case cold and hot cases, the optical metrics generated from the finite element model, and the expected STABLE residual wavefront error and decenter. Finally, the paper concludes with the predicted sensitivities in the STABLE system, which show that thermal deadbanding, structural pre-loading, self-deflection under different loading conditions, and the speed of individual optical elements were particularly important to the resulting STABLE optical performance.
Requirements for developing a regional monitoring capacity for aerosols in Europe within EMEP.
Kahnert, Michael; Lazaridis, Mihalis; Tsyro, Svetlana; Torseth, Kjetil
2004-07-01
The European Monitoring and Evaluation Programme (EMEP) has been established to provide information to Parties to the Convention on Long Range Transboundary Air Pollution on deposition and concentration of air pollutants, as well as on the quantity and significance of long-range transmission of pollutants and transboundary fluxes. To achieve its objectives with the required scientific credibility and technical underpinning, a close integration of the programme's main elements is performed. These elements are emission inventories, chemical transport modelling, and the monitoring of atmospheric chemistry and deposition fluxes, which further are integrated towards abatement policy development. A critical element is the air pollution monitoring that is performed across Europe with a focus not only on health effect aspects and compliance monitoring, but also on process studies and source receptor relationships. Without a strong observational basis a predictive modelling capacity cannot be developed and validated. Thus the modelling success strongly depends on the quality and quantity of available observations. Particulate matter (PM) is a relatively recent addition to the EMEP monitoring programme, and the network for PM mass observations is still evolving. This article presents the current status of EMEP aerosol observations, followed by a critical evaluation in view of EMEP's main objectives and its model development requirements. Specific recommendations are given for improving the PM monitoring programme within EMEP.
Identifying elements of the health care environment that contribute to wayfinding.
Pati, Debajyoti; Harvey, Thomas E; Willis, Douglas A; Pati, Sipra
2015-01-01
Identify aspects of the physical environment that inform wayfinding for visitors. Compare and contrast the identified elements in frequency of use. Gain an understanding of the role the different elements and attributes play in the wayfinding process. Wayfinding by patients and visitors is a documented problem in healthcare facilities. The few studies that have been conducted have identified some of the environmental elements that influence wayfinding. Moreover, literature comparing different design strategies is absent. Currently there is limited knowledge to inform prioritization of strategies to optimize wayfinding within a capital budget. A multi-method, non-experimental, qualitative, exploratory study design was adopted. The study was conducted in a large, acute care facility in Texas. Ten healthy adults in five age groups, representing both sexes, participated in the study as simulated visitors. Data collection included (a) verbal protocols during navigation; (b) questionnaire; and (c) verbal directions from hospital employees. Data were collected during Fall 2013. Physical design elements contributing to wayfinding include signs, architectural features, maps, interior elements (artwork, display boards, information counters, etc.), functional clusters, interior elements pairing, structural elements, and furniture. The information is used in different ways - some for primary navigational information, some for supporting navigational information, and some as familiarity markers. The physical environment has a critical role in aiding navigation in healthcare facilities. Architectural features are the top contributors in the domain of architecture. Artwork (painting, sculpture, etc.) is the top contributor in the domain of interior design. © The Author(s) 2015.
Effective information management and assurance for a modern organisation during a crisis.
MacLeod, Andrew
2015-01-01
During a crisis, organisations face a major unpredictable event with potentially negative consequences. Effective information management and assurance can assist the organisation in making sure that they have the correct information in a secure format to make decisions to recover their operations. The main elements of effective information management and assurance are confidentiality, integrity and availability, combined with non-repudiation. Should an element of effective information management or assurance be removed it can have a detrimental effect on the other elements and render the information management and assurance practices of the organisation ineffectual.
[Application of ICP-MS to Identify the Botanic Source of Characteristic Honey in South Yunnan].
Wei, Yue; Chen, Fang; Wang, Yong; Chen, Lan-zhen; Zhang, Xue-wen; Wang, Yan-hui; Wu, Li-ming; Zhou, Qun
2016-01-01
By adopting inductively coupled plasma mass spectrometry (ICP-MS) combined with chemometric analysis technology, 23 kinds of minerals in four kinds of characteristic honey derived from Yunnan province were analyzed. The results showed that 21 mineral elements, namely Na, Mg, K, Ca, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Sr, Mo, Cd, Sb, Ba, Tl and Pb, have significant differences among the different varieties of honey. The results of principal component analysis (PCA) showed that the cumulative variance contribution rate of the first four principal components reached 77.74%; seven elements (Mg, Ca, Mn, Co, Sr, Cd, Ba) from the first principal component contained most of the honey information. Through stepwise discriminant analysis, seven elements (Mg, K, Ca, Cr, Mn, Sr, Pb) were filtered out and used to establish the discriminant function model, and the correct classification rates of the proposed model reached 90% and 86.7%, respectively, which showed that element contents could effectively serve as indicators to discriminate the four kinds of characteristic honey in southern Yunnan Province. Given that all the honey samples were harvested from apiaries located in southern Yunnan Province, which have similar climate, soil and other environmental conditions, the differences in the mineral element contents of the honey samples are mainly due to their corresponding nectariferous plants. Therefore, it is feasible to identify the botanical source of honey through differences in mineral elements.
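The pipeline described (PCA for variance structure, then a discriminant model scored by classification rate) can be sketched with scikit-learn; the matrix below is a random placeholder for the ICP-MS element concentrations, and LDA stands in for the paper's stepwise discriminant analysis:

```python
# PCA + linear discriminant analysis on a mock element-concentration matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 7))        # 60 honey samples x 7 elements (Mg, K, ...)
y = rng.integers(0, 4, size=60)     # four honey varieties

pca = PCA(n_components=4).fit(X)
print(pca.explained_variance_ratio_.cumsum())   # cf. the paper's 77.74%

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.score(X, y))              # resubstitution classification rate
```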
NASA Astrophysics Data System (ADS)
Lawrence, D. J.; Maurice, S.; Patterson, G. W.; Hibbitts, C. A.
2010-05-01
Understanding the global composition of Ganymede's surface is a key goal of the Europa Jupiter System Mission (EJSM) that is being jointly planned by NASA and ESA. Current plans for obtaining surface information with the Jupiter Ganymede Orbiter (JGO) use spectral imaging measurements. While spectral imaging can provide good mineralogy-related information, quantitative data about elemental abundances can often be hindered by non-composition variations due to surface effects (e.g., space weathering, grain effects, temperature, etc.). Orbital neutron and gamma-ray spectroscopy can provide quantitative composition information that is complementary to spectral imaging measurements, as has been demonstrated with similar instrumental combinations at the Moon, Mars, and Mercury. Neutron and gamma-ray measurements have successfully returned abundance information in a hydrogen-rich environment on Mars. In regards to neutrons and gamma-rays, there are many similarities between the Mars and Ganymede hydrogen-rich environments. In this study, we present results of neutron transport models, which show that quantitative composition information from Ganymede's surface can be obtained in a realistic mission scenario. Thermal and epithermal neutrons are jointly sensitive to the abundances of hydrogen and neutron absorbing elements, such as iron and titanium. These neutron measurements can discriminate between regions that are rich or depleted in neutron absorbing elements, even in the presence of large amounts of hydrogen. Details will be presented about how the neutron composition parameters can be used to meet high-level JGO science objectives, as well as an overview of a neutron spectrometer that can meet various mission and stringent environmental requirements.
Information giving and receiving in hematological malignancy consultations.
Alexander, Stewart C; Sullivan, Amy M; Back, Anthony L; Tulsky, James A; Goldman, Roberta E; Block, Susan D; Stewart, Susan K; Wilson-Genderson, Maureen; Lee, Stephanie J
2012-03-01
Little is known about communication with patients suffering from hematologic malignancies, many of whom are seen by subspecialists in consultation at tertiary-care centers. These subspecialized consultations might provide the best examples of optimal physician-patient communication behaviors, given that these consultations tend to be lengthy, to occur between individuals who have not met before and may have no intention of an ongoing relationship, and which have a goal of providing treatment recommendations. The aim of this paper is to describe and quantify the content of the subspecialty consultation in regards to exchanging information and identify patient and provider characteristics associated with discussion elements. Audio-recorded consultations between 236 patients and 40 hematologists were coded for recommended communication practices. Multilevel models for dichotomous outcomes were created to test associations between patient, physician and consultation characteristics and key discussion elements. Discussions about the purpose of the visit and patient's knowledge about their disease were common. Other elements such as patient's preference for his/her role in decision-making, preferences for information, or understanding of presented information were less common. Treatment recommendations were provided in 97% of the consultations and unambiguous presentations of prognosis occurred in 81% of the consultations. Unambiguous presentations of prognosis were associated with non-White patient race, lower educational status, greater number of questions asked, and specific physician provider. Although some communication behaviors occur in most consultations, others are much less common and could help tailor the amount and type of information discussed. Approximately half of the patients are told unambiguous prognostic estimates for mortality or cure. Copyright © 2011 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Larour, Eric; Schiermeier, John E.; Seroussi, Helene; Morlinghem, Mathieu
2013-01-01
In order to have the capability to use satellite data from its own missions to inform future sea-level rise projections, JPL needed a full-fledged ice-sheet/ice-shelf flow model, capable of modeling the mass balance of Antarctica and Greenland into the near future. ISSM was developed with such a goal in mind, as a massively parallelized, multi-purpose finite-element framework dedicated to ice-sheet modeling. ISSM features unstructured meshes (Tria in 2D, and Penta in 3D) along with corresponding finite elements for both types of meshes. Each finite element can carry out diagnostic, prognostic, transient, thermal 3D, surface, and bed slope simulations. Anisotropic meshing enables adaptation of meshes to a certain metric, and the 2D Shelfy-Stream, 3D Blatter/Pattyn, and 3D Full-Stokes formulations capture the bulk of the ice-flow physics. These elements can be coupled together, based on the Arlequin method, so that on a large scale model such as Antarctica, each type of finite element is used in the most efficient manner. For each finite element referenced above, ISSM implements an adjoint. This adjoint can be used to carry out model inversions of unknown model parameters, typically ice rheology and basal drag at the ice/bedrock interface, using a metric such as the observed InSAR surface velocity. This data assimilation capability is crucial to allow spinning up of ice flow models using available satellite data. ISSM relies on the PETSc library for its vectors, matrices, and solvers. This allows ISSM to run efficiently on any parallel platform, whether shared or distributed. It can run on the largest clusters, and is fully scalable. This allows ISSM to tackle models the size of continents. ISSM is embedded into MATLAB and Python, both open scientific platforms. This improves its outreach within the science community. It is entirely written in C/C++, which gives it flexibility in its design, and the power/speed that C/C++ allows. ISSM is svn (subversion) hosted, on a JPL repository, to facilitate its development and maintenance. ISSM can also model propagation of rifts using contact mechanics and mesh splitting, and can interface to the Dakota software. To carry out sensitivity analysis, mesh partitioning algorithms are available, based on the Scotch, Chaco, and Metis partitioners, which ensure equal-area mesh partitions that are then usable for sampling and local reliability methods.
NASA Technical Reports Server (NTRS)
Midkiff, Alan H.; Hansman, R. John, Jr.
1992-01-01
Air/ground digital datalink communications are an integral component of the FAA's Air Traffic Control (ATC) modernization strategy. With the introduction of datalink into the ATC system, there is concern over the potential loss of situational awareness by flight crews due to the reduction in the "party line" information available to the pilot. "Party line" information is gleaned by flight crews overhearing communications between ATC and other aircraft. In the datalink environment, party line information may not be available due to the use of discrete addressing. Information concerning the importance, availability, and accuracy of party line elements was explored through an opinion survey of active air carrier flight crews. The survey identified numerous important party line elements. These elements were scripted into a full-mission flight simulation. The flight simulation experiment examined the utilization of party line information by studying subject responses to the specific information elements. Some party line elements perceived as important were effectively utilized by flight crews in the simulated operational environment. However, other party line elements stimulated little or no increase in situational awareness. The ability to assimilate and use party line information appeared to be dependent on workload, time availability, and the tactical/strategic nature of the situations. In addition, the results of both the survey and the simulation indicated that the importance of party line information appeared to be greatest for operations near or on the airport. This indicates that caution must be exercised when implementing datalink communications in these high workload, tactical sectors. This document is based on the thesis of Alan H. Midkiff submitted in partial fulfillment of the degree of Master of Science in Aeronautics and Astronautics at the Massachusetts Institute of Technology.
Health information management in the home: a human factors assessment.
Zayas-Cabán, Teresa
2012-01-01
Achieving optimal health outcomes requires that consumers maintain myriad health data and understand how to utilize appropriate health information management applications. This case study investigated four families' health information management tasks in their homes. Four different families participated in the study: a single parent household; two nuclear family households; and an extended family household. A work system model known as the balance model was used as a guiding framework for data collection. Data collection consisted of three stages: (1) primary health information manager interviews; (2) family interviews; and (3) task observations. Overall, families reported 69 unique health information management tasks that took place in nine different locations, using 22 different information storage artifacts. Frequently occurring tasks related to health management or health coordination were conducted in public spaces. Less frequent or more time-consuming tasks, such as researching a health concern or storing medical history, were performed in private spaces such as bedrooms or studies. Similarities across households suggest potential foundational design elements that consumer health information technology application designers need to balance with tailored interventions to successfully support variations in individuals' health information management needs.
Rajagopal, Vijay; Bass, Gregory; Ghosh, Shouryadipta; Hunt, Hilary; Walker, Cameron; Hanssen, Eric; Crampin, Edmund; Soeller, Christian
2018-04-18
With the advent of three-dimensional (3D) imaging technologies such as electron tomography, serial-block-face scanning electron microscopy and confocal microscopy, the scientific community has unprecedented access to large datasets at sub-micrometer resolution that characterize the architectural remodeling that accompanies changes in cardiomyocyte function in health and disease. However, these datasets have been under-utilized for investigating the role of cellular architecture remodeling in cardiomyocyte function. The purpose of this protocol is to outline how to create an accurate finite element model of a cardiomyocyte using high resolution electron microscopy and confocal microscopy images. A detailed and accurate model of cellular architecture has significant potential to provide new insights into cardiomyocyte biology, more than experiments alone can garner. The power of this method lies in its ability to computationally fuse information from two disparate imaging modalities of cardiomyocyte ultrastructure to develop one unified and detailed model of the cardiomyocyte. This protocol outlines steps to integrate electron tomography and confocal microscopy images of adult male Wistar (name for a specific breed of albino rat) rat cardiomyocytes to develop a half-sarcomere finite element model of the cardiomyocyte. The procedure generates a 3D finite element model that contains an accurate, high-resolution depiction (on the order of ~35 nm) of the distribution of mitochondria, myofibrils and ryanodine receptor clusters that release the necessary calcium for cardiomyocyte contraction from the sarcoplasmic reticular network (SR) into the myofibril and cytosolic compartment. The model generated here as an illustration does not incorporate details of the transverse-tubule architecture or the sarcoplasmic reticular network and is therefore a minimal model of the cardiomyocyte. Nevertheless, the model can already be applied in simulation-based investigations into the role of cell structure in calcium signaling and mitochondrial bioenergetics, which is illustrated and discussed using two case studies that are presented following the detailed protocol.
Iwata, Masaki; Taira, Wataru; Hiyama, Atsuki; Otaki, Joji M
2015-06-01
The nymphalid groundplan has been proposed to explain diverse butterfly wing color patterns. In this model, each symmetry system is composed of a core element and a pair of paracore elements. The development of this elemental configuration has been explained by the induction model for positional information. However, the diversity of color patterns in other butterfly families in relation to the nymphalid groundplan has not been thoroughly examined. Here, we examined aberrant color pattern phenotypes of a lycaenid butterfly, Zizeeria maha, from mutagenesis and plasticity studies as well as from field surveys. In several mutants, the third and fourth spot arrays were coordinately positioned much closer to the discal spot in comparison to the normal phenotype. In temperature-shock types, the third and fourth array spots were elongated inwardly or outwardly from their normal positions. In field-caught spontaneous mutants, small black spots were located adjacent to normal black spots. Analysis of these aberrant phenotypes indicated that the spots belonging to the third and fourth arrays are synchronously changeable in position and shape around the discal spot. Thus, these arrays constitute paracore elements of the central symmetry system of the lycaenid butterflies, and the discal spot comprises the core element. These aberrant phenotypes can be explained by the black-inducing signals that propagate from the prospective discal spot, as predicted by the induction model. These results suggest the existence of long-range developmental signals that cover a large area of a wing not only in nymphalid butterflies, but also in lycaenid butterflies.
Mitchell, Geoffrey K; Burridge, Letitia; Zhang, Jianzhen; Donald, Maria; Scott, Ian A; Dart, Jared; Jackson, Claire L
2015-01-01
Integrated multidisciplinary care is difficult to achieve between specialist clinical services and primary care practitioners, but should improve outcomes for patients with chronic and/or complex chronic physical diseases. This systematic review identifies outcomes of different models that integrate specialist and primary care practitioners, and characteristics of models that delivered favourable clinical outcomes. For quality appraisal, the Cochrane Risk of Bias tool was used. Data are presented as a narrative synthesis due to marked heterogeneity in study outcomes. Ten studies were included. Publication bias cannot be ruled out. Despite few improvements in clinical outcomes, significant improvements were reported in process outcomes regarding disease control and service delivery. No study reported negative effects compared with usual care. Economic outcomes showed modest increases in costs of integrated primary-secondary care. Six elements were identified that were common to these models of integrated primary-secondary care: (1) interdisciplinary teamwork; (2) communication/information exchange; (3) shared care guidelines or pathways; (4) training and education; (5) access and acceptability for patients; and (6) a viable funding model. Compared with usual care, integrated primary-secondary care can improve elements of disease control and service delivery at a modestly increased cost, although the impact on clinical outcomes is limited. Future trials of integrated care should incorporate design elements likely to maximise effectiveness.
A general method for radio spectrum efficiency defining
NASA Astrophysics Data System (ADS)
Ramadanovic, Ljubomir M.
1986-08-01
A general method for defining radio spectrum efficiency is proposed. Although simple, it can be applied to various radio services. The concept of spectral elements, as information carriers, is introduced to enable the organization of larger spectral spaces (radio network models) characteristic of a particular radio network. The method is applied to some radio network models, concerning cellular radio telephone systems and digital radio relay systems, to verify its capability as a unified approach. All discussed radio services operate continuously.
Maintenance Enterprise Resource Planning: Information Value Among Supply Chain Elements
2014-04-30
is the Economic Order Quantity (EOQ) model, Production Order Quantity Cost, and Quantity Discount Model (Heizer & Render, 2007, pp. 489–490) ... demand for another item. Following an aircraft, the items to assemble the aircraft are dependent demand (Heizer & Render, 2007, pp. 562–563). MERP ... 6), 947–950. doi:10.1287/opre.38.6.947. Heizer, J., & Render, B. (2007). Principles of Operations Management (7th ed., p. 684). Upper Saddle River ...
Information security of power enterprises of North-Arctic region
NASA Astrophysics Data System (ADS)
Sushko, O. P.
2018-05-01
The role of information technologies in providing technological security for energy enterprises is a component of economic security for the northern Arctic region in general. Applying instruments and methods of information protection modelling to the business processes of energy enterprises in the northern Arctic region (such as Arkhenergo and Komienergo), the authors analysed and identified the most frequent information security risks. Using the analytic hierarchy process with weighting factor estimations, the information risks of the energy enterprises' technological processes were ranked. The economic estimation of information security within an energy enterprise considers weighting factor-adjusted variables (risks). Investments in the information security systems of energy enterprises in the northern Arctic region correspond to the installation of necessary security elements, while current operating expenses on business process protection systems are treated as materialized economic damage.
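The ranking step can be sketched as a standard analytic hierarchy process computation: weighting factors taken as the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency check. The comparison matrix below is illustrative, not the authors' data.

```python
import numpy as np

# Hedged AHP sketch: weights are the principal eigenvector of a pairwise
# comparison matrix; the matrix here compares three illustrative information
# risks and is not taken from the study.
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                        # normalized weighting factors

# Saaty consistency ratio: CI / RI, with random index RI = 0.58 for n = 3.
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", w.round(3), "consistency ratio:", round(ci / 0.58, 3))
```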
Toward Model Building for Visual Aesthetic Perception
Lughofer, Edwin; Zeng, Xianyi
2017-01-01
Several models of visual aesthetic perception have been proposed in recent years. Such models have drawn on investigations into the neural underpinnings of visual aesthetics, utilizing neurophysiological and brain imaging techniques including functional magnetic resonance imaging, magnetoencephalography, and electroencephalography. The neural mechanisms underlying the aesthetic perception of the visual arts have been explained from the perspectives of neuropsychology, brain and cognitive science, informatics, and statistics. Although corresponding models have been constructed, the majority of these models contain elements that are difficult to simulate or quantify using simple mathematical functions. In this review, we discuss the hypotheses, conceptions, and structures of six typical models for human aesthetic appreciation in the visual domain: the neuropsychological, information processing, mirror, and quartet models, and two hierarchical feed-forward layered models. Additionally, the neural foundation of aesthetic perception, appreciation, or judgement for each model is summarized. The development of a unified framework for the neurobiological mechanisms underlying the aesthetic perception of visual art and the validation of this framework via mathematical simulation is an interesting challenge in neuroaesthetics research. This review aims to provide information regarding the most promising proposals for bridging the gap between visual information processing and the brain activity involved in aesthetic appreciation. PMID:29270194
Bitwise efficiency in chaotic models
NASA Astrophysics Data System (ADS)
Jeffress, Stephen; Düben, Peter; Palmer, Tim
2017-09-01
Motivated by the increasing energy consumption of supercomputing for weather and climate simulations, we introduce a framework for investigating the bit-level information efficiency of chaotic models. In comparison with previous explorations of inexactness in climate modelling, the proposed and tested information metric has three specific advantages: (i) it requires only a single high-precision time series; (ii) information does not grow indefinitely for decreasing time step; and (iii) information is more sensitive to the dynamics and uncertainties of the model rather than to the implementation details. We demonstrate the notion of bit-level information efficiency in two of Edward Lorenz's prototypical chaotic models: Lorenz 1963 (L63) and Lorenz 1996 (L96). Although L63 is typically integrated in 64-bit `double' floating point precision, we show that only 16 bits have significant information content, given an initial condition uncertainty of approximately 1% of the size of the attractor. This result is sensitive to the size of the uncertainty but not to the time step of the model. We then apply the metric to the L96 model and find that a 16-bit scaled integer model would suffice given the uncertainty of the unresolved sub-grid-scale dynamics. We then show that, by dedicating computational resources to spatial resolution rather than numeric precision in a field programmable gate array (FPGA), we see up to 28.6% improvement in forecast accuracy, an approximately fivefold reduction in the number of logical computing elements required and an approximately 10-fold reduction in energy consumed by the FPGA, for the L96 model.
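A crude, hedged illustration of the underlying idea (not the paper's actual information metric): integrate Lorenz 1963, perturb the state by ~1% of the attractor extent, and ask how many bits are needed to resolve the state down to the resulting forecast uncertainty; bits below that level carry no forecast-relevant information.

```python
import numpy as np

# Crude proxy for bit-level information content (NOT the paper's metric).
def l63_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    def f(s):
        x, y, z = s
        return np.array([sigma*(y - x), x*(rho - z) - y, x*y - beta*z])
    k1 = f(s); k2 = f(s + 0.5*dt*k1)            # classical RK4 step
    k3 = f(s + 0.5*dt*k2); k4 = f(s + dt*k3)
    return s + dt*(k1 + 2*k2 + 2*k3 + k4)/6.0

s = np.array([1.0, 1.0, 1.0])
for _ in range(1000):                            # spin up onto the attractor
    s = l63_step(s)

extent = 40.0                                    # approx. range of x on the attractor
s_a, s_b = s.copy(), s + np.array([0.01*extent, 0.0, 0.0])
for _ in range(10):                              # short 0.1-time-unit forecast
    s_a, s_b = l63_step(s_a), l63_step(s_b)

err = abs(s_a[0] - s_b[0])
print(f"forecast spread {err:.3f}; ~{np.log2(extent/err):.1f} bits resolve x "
      f"to within the uncertainty")
```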
Walshaw, John; Peck, Michael W.; Barker, Gary C.
2016-01-01
Clostridium botulinum produces botulinum neurotoxins (BoNTs), highly potent substances responsible for botulism. Currently, mathematical models of C. botulinum growth and toxigenesis are largely aimed at risk assessment and do not include explicit genetic information beyond the group level, but they integrate many component processes, such as signalling, membrane permeability and metabolic activity. In this paper we present a scheme for modelling neurotoxin production in C. botulinum Group I type A1, based on the integration of diverse information from experimental results available in the literature. Experiments show that production of BoNTs depends on the growth phase and is under the control of positive and negative regulatory elements at the intracellular level. Toxins are released as large protein complexes and are associated with non-toxic components. Here, we systematically review and integrate the regulatory elements previously described in the literature for C. botulinum Group I type A1 into a population dynamics model, to build the first computational model of toxin production at the molecular level. We validate our model against several items of published experimental data for different wild type and mutant strains of C. botulinum Group I type A1. The result of this process underscores the potential of mathematical modelling at the cellular level as a means of creating opportunities for developing new strategies that could be used to prevent botulism, and it could potentially contribute to improved methods for the production of the toxin used for therapeutics. PMID:27855161
Code of Federal Regulations, 2014 CFR
2014-01-01
... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Elements. 314.4 Section 314.4 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS STANDARDS FOR SAFEGUARDING CUSTOMER INFORMATION § 314.4 Elements. In order to develop, implement, and maintain your information...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Elements. 314.4 Section 314.4 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS STANDARDS FOR SAFEGUARDING CUSTOMER INFORMATION § 314.4 Elements. In order to develop, implement, and maintain your information...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Elements. 314.4 Section 314.4 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS STANDARDS FOR SAFEGUARDING CUSTOMER INFORMATION § 314.4 Elements. In order to develop, implement, and maintain your information...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Elements. 314.4 Section 314.4 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS STANDARDS FOR SAFEGUARDING CUSTOMER INFORMATION § 314.4 Elements. In order to develop, implement, and maintain your information...
Harvey, H Benjamin; Liu, Catherine; Ai, Jing; Jaworsky, Cristina; Guerrier, Claude Emmanuel; Flores, Efren; Pianykh, Oleg
2017-10-01
To test whether data elements available in the electronic medical record (EMR) can be effectively leveraged to predict failure to attend a scheduled radiology examination. Using data from a large academic medical center, we identified all patients with a diagnostic imaging examination scheduled from January 1, 2016, to April 1, 2016, and determined whether the patient successfully attended the examination. Demographic, clinical, and health services utilization variables available in the EMR and potentially relevant to examination attendance were recorded for each patient. We used descriptive statistics and logistic regression models to test whether these data elements could predict failure to attend a scheduled radiology examination. The predictive accuracy of the regression models was determined by calculating the area under the receiver operator curve. Among the 54,652 patient appointments with radiology examinations scheduled during the study period, 6.5% were no-shows. No-show rates were highest for the modalities of mammography and CT and lowest for PET and MRI. Logistic regression indicated that 16 of the 27 demographic, clinical, and health services utilization factors were significantly associated with failure to attend a scheduled radiology examination (P ≤ .05). Stepwise logistic regression analysis demonstrated that previous no-shows, days between scheduling and appointment, modality type, and insurance type were most strongly predictive of no-shows. A model considering all 16 data elements had good ability to predict radiology no-shows (area under the receiver operator curve = 0.753). The predictive ability was similar or improved when these models were analyzed by modality. Patient and examination information readily available in the EMR can be successfully used to predict radiology no-shows. Moving forward, this information can be proactively leveraged to identify patients who might benefit from additional engagement through appointment reminders or other targeted interventions to avoid no-shows. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
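The modeling step can be sketched with standard tools; the following uses scikit-learn on synthetic data whose features mirror the strongest reported predictors (prior no-shows, scheduling lead time, modality, insurance). All coefficients and data are simulated, not the study's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hedged sketch on synthetic data; feature meanings mirror the reported
# predictors, but all values and coefficients below are invented.
rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.poisson(0.5, n),            # prior no-show count
    rng.integers(0, 120, n),        # days between scheduling and appointment
    rng.integers(0, 5, n),          # modality code (e.g. 0=MRI ... 4=mammo)
    rng.integers(0, 3, n),          # insurance type code
])
logit = -4.0 + 0.8*X[:, 0] + 0.015*X[:, 1] + 0.1*X[:, 2]
y = rng.random(n) < 1/(1 + np.exp(-logit))      # ~7% simulated no-show rate

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.3f}")           # the study reports 0.753 on real EMR data
```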
Modeling error analysis of stationary linear discrete-time filters
NASA Technical Reports Server (NTRS)
Patel, R.; Toda, M.
1977-01-01
The performance of Kalman-type, linear, discrete-time filters in the presence of modeling errors is considered. The discussion is limited to stationary performance, and bounds are obtained for the performance index, the mean-squared error of estimates, for suboptimal and optimal (Kalman) filters. The computation of these bounds requires information on only the model matrices and the range of errors for these matrices. Consequently, a designer can easily compare the performance of a suboptimal filter with that of the optimal filter when only the range of errors in the elements of the model matrices is available.
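A minimal scalar illustration of this setting, assuming a fixed-gain steady-state filter designed on a nominal model and evaluated on a perturbed "true" model; all numbers are illustrative, not from the report.

```python
import numpy as np

# Hedged scalar sketch: a steady-state Kalman gain designed for a nominal
# state matrix a_nom is applied to a true system with a_true, and the
# resulting stationary mean-squared error is compared with the optimum.
q, r = 1.0, 1.0
a_nom, a_true = 0.90, 0.95           # modeling error in the state matrix

def steady_gain(a):
    p = 1.0
    for _ in range(500):             # Riccati recursion to convergence
        p_pred = a*p*a + q
        k = p_pred / (p_pred + r)    # measurement matrix c = 1
        p = (1 - k) * p_pred
    return k

def mse_with_gain(k, a):
    p = 1.0
    for _ in range(500):             # fixed-gain error covariance recursion
        p_pred = a*p*a + q
        p = (1 - k)**2 * p_pred + k**2 * r
    return p

k_nom = steady_gain(a_nom)
print(f"suboptimal MSE: {mse_with_gain(k_nom, a_true):.4f}")
print(f"optimal MSE:    {mse_with_gain(steady_gain(a_true), a_true):.4f}")
```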
An Information-Based Machine Learning Approach to Elasticity Imaging
Hoerig, Cameron; Ghaboussi, Jamshid; Insana, Michael. F.
2016-01-01
An information-based technique is described for applications in mechanical-property imaging of soft biological media under quasi-static loads. We adapted the Autoprogressive method, originally developed for civil engineering applications, for this purpose. The Autoprogressive method is a computational technique that combines knowledge of object shape and a sparse distribution of force and displacement measurements with finite-element analyses and artificial neural networks to estimate a complete set of stress and strain vectors. Elasticity imaging parameters are then computed from the estimated stresses and strains. We introduce the technique using ultrasonic pulse-echo measurements in simple gelatin imaging phantoms having linear-elastic properties so that conventional finite-element modeling can be used to validate results. The Autoprogressive algorithm does not require any assumptions about the material properties and can, in principle, be used to image media with arbitrary properties. We show that, by selecting a few well-chosen force-displacement measurements that are appropriately applied during training to establish convergence, we can estimate all nontrivial stress and strain vectors throughout an object and accurately estimate an elastic modulus at high spatial resolution. This new method of modeling the mechanical properties of tissue-like materials introduces a unique way of solving the inverse problem and is the first technique for imaging stress without assuming the underlying constitutive model. PMID:27858175
Climate Change and a Global City: An Assessment of the Metropolitan East Coast Region
NASA Technical Reports Server (NTRS)
Rosenzweig, Cynthia; Solecki, William
1999-01-01
The objective of the research is to derive an assessment of the potential climate change impacts on a global city - in this case the 31-county region that comprises the New York City metropolitan area. This study comprises one of the regional components that contribute to the ongoing U.S. National Assessment: The Potential Consequences of Climate Variability and Change, and is an application of state-of-the-art climate change science to a set of linked sectoral assessment analyses for the Metro East Coast (MEC) region. We illustrate how three interacting elements of global cities react and respond to climate variability and change with a broad conceptual model. These elements include: people (e.g., socio-demographic conditions), place (e.g., physical systems), and pulse (e.g., decision-making and economic activities). The model assumes that a comprehensive assessment of potential climate change can be derived from examining the impacts within each of these elements and at their intersections. Thus, the assessment attempts to determine both the within-element and the inter-element effects. Five interacting sector studies representing the three intersecting elements are evaluated: the Coastal Zone, Infrastructure, Water Supply, Public Health, and Institutional Decision-making. Each study assesses potential climate change impacts on the sector and on the intersecting elements, through the analysis of the following parts: 1. Current conditions of the sector in the region; 2. Lessons and evidence derived from past climate variability; 3. Scenario predictions affecting the sector and the potential impacts of those predictions; 4. Knowledge/information gaps and critical issues, including identification of additional research questions, effectiveness of modeling efforts, equity of impacts, potential non-local interactions, and policy recommendations; and 5. Identification of coping strategies - i.e., resilience building, mitigation strategies, new technologies, education that affects decision-making, and better preparedness for contingencies.
Integrating Health Behavior Theory and Design Elements in Serious Games.
Cheek, Colleen; Fleming, Theresa; Lucassen, Mathijs Fg; Bridgman, Heather; Stasiak, Karolina; Shepherd, Matthew; Orpin, Peter
2015-01-01
Internet interventions for improving health and well-being have the potential to reach many people and fill gaps in service provision. Serious gaming interfaces provide opportunities to optimize user adherence and impact. Health interventions based on theory and evidence and tailored to psychological constructs have been found to be more effective in promoting behavior change. Defining the design elements which engage users and help them to meet their goals can contribute to better informed serious games. The aim of this study was to elucidate design elements important in SPARX, a serious game for adolescents with depression, from a user-centered perspective. We proposed a model based on an established theory of health behavior change and practical features of serious game design to organize ideas and rationale. We analyzed data from 5 studies comprising a total of 22 focus groups and 66 semistructured interviews conducted with youth and families in New Zealand and Australia who had viewed or used SPARX. User perceptions of the game were applied to this framework. A coherent framework was established using the three constructs of self-determination theory (SDT) - autonomy, competence, and relatedness - to organize user perceptions and design elements within four areas important in design: computer game, accessibility, working alliance, and learning in immersion. User perceptions mapped well to the framework, which may assist developers in understanding the context of user needs. By mapping these elements against the constructs of SDT, we were able to propose a sound theoretical base for the model. This study's method allowed for the articulation of design elements in a serious game from a user-centered perspective within a coherent overarching framework. The framework can be used to deliberately incorporate serious game design elements that support a user's sense of autonomy, competence, and relatedness, key constructs which have been found to mediate motivation at all stages of the change process. The resulting model introduces promising avenues for future exploration. Involving users in program design remains an imperative if serious games are to be fit for purpose.
Demand Activated Manufacturing Architecture (DAMA) model for supply chain collaboration
DOE Office of Scientific and Technical Information (OSTI.GOV)
CHAPMAN,LEON D.; PETERSEN,MARJORIE B.
The Demand Activated Manufacturing Architecture (DAMA) project, during the last five years of work with the U.S. Integrated Textile Complex (retail, apparel, textile, and fiber sectors), has developed an inter-enterprise architecture and collaborative model for supply chains. This model will enable improved collaborative business across any supply chain. The DAMA Model for Supply Chain Collaboration is a high-level model for collaboration to achieve Demand Activated Manufacturing. The five major elements of the architecture to support collaboration are (1) activity or process, (2) information, (3) application, (4) data, and (5) infrastructure. These five elements are tied to the application of the DAMA architecture to three phases of collaboration - prepare, pilot, and scale. There are six collaborative activities that may be employed in this model: (1) Develop Business Planning Agreements, (2) Define Products, (3) Forecast and Plan Capacity Commitments, (4) Schedule Product and Product Delivery, (5) Expedite Production and Delivery Exceptions, and (6) Populate Supply Chain Utility. The Supply Chain Utility is a set of applications implemented to support collaborative product definition, forecast visibility, planning, scheduling, and execution. The DAMA architecture and model will be presented along with the process for implementing this DAMA model.
High Level Information Fusion (HLIF) with nested fusion loops
NASA Astrophysics Data System (ADS)
Woodley, Robert; Gosnell, Michael; Fischer, Amber
2013-05-01
Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion, perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analysis within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address the challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for situation modeling, threat modeling, and threat prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without double counting of information or other biasing issues. The initial FURNACE project focused on the underlying algorithms needed to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analysts' efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.
Basic elements and concepts of information systems are presented: definition of the term "information", and the main elements of data and database structure. The report also deals with the information system and its underlying theory and design. Examples of the application of information ...
Integrated System Health Management Development Toolkit
NASA Technical Reports Server (NTRS)
Figueroa, Jorge; Smith, Harvey; Morris, Jon
2009-01-01
This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.
Neural Mechanisms of Attention
1993-05-21
Surviving contents and text fragments from this report: "The Element Superiority Effect: Attention?"; "Animal Models of Attention Deficit"; "Conditioned Attention Theory"; "Attention and ..." ... fails to obtain the necessary quantitative information about the effects of parametric manipulations on the dissociation, or the parametric results ... neuroscience endeavor as described here. If simultaneously psychologists ignore the brain and neuroscientists ignore the mind, no effective translation ...
Formation of the giant planets
NASA Technical Reports Server (NTRS)
Lissauer, Jack J.
2006-01-01
The observed properties of giant planets, models of their evolution and observations of protoplanetary disks provide constraints on the formation of gas giant planets. The four largest planets in our Solar System contain considerable quantities of hydrogen and helium, which could not have condensed into solid planetesimals within the protoplanetary disk. All three (transiting) extrasolar giant planets with well-determined masses and radii also must contain substantial amounts of these light gases. Jupiter and Saturn are mostly hydrogen and helium, but have larger abundances of heavier elements than does the Sun. Neptune and Uranus are primarily composed of heavier elements. HD 149026 b, which is slightly more massive than Saturn, appears to have comparable quantities of light gases and heavy elements. HD 209458 b and TrES-1 are primarily hydrogen and helium, but may contain supersolar abundances of heavy elements. Spacecraft flybys and observations of satellite orbits provide estimates of the gravitational moments of the giant planets in our Solar System, which in turn provide information on the internal distribution of matter within Jupiter, Saturn, Uranus and Neptune. Atmospheric thermal structure and heat flow measurements constrain the interior temperatures of these planets. Internal processes may cause giant planets to become more compositionally differentiated or alternatively more homogeneous; high-pressure laboratory experiments provide data useful for modeling these processes. The preponderance of evidence supports the core nucleated gas accretion model. According to this model, giant planets begin their growth by the accumulation of small solid bodies, as do terrestrial planets. However, unlike terrestrial planets, the growing giant planet cores become massive enough that they are able to accumulate substantial amounts of gas before the protoplanetary disk dissipates. The primary question regarding the core nucleated growth model is under what conditions planets with small cores/total heavy-element abundances can accrete gaseous envelopes within the lifetimes of gaseous protoplanetary disks.
Yousefsani, Seyed Abdolmajid; Shamloo, Amir; Farahmand, Farzam
2018-04-01
A transverse-plane hyperelastic micromechanical model of brain white matter tissue was developed using the embedded element technique (EET). The model consisted of a histology-informed probabilistic distribution of axonal fibers embedded within an extracellular matrix, both described using the generalized Ogden hyperelastic material model. A correcting method, based on the strain energy density function, was formulated to resolve the stiffness redundancy problem of the EET in the large-deformation regime. The model was then used to predict the homogenized tissue behavior and the associated localized responses of the axonal fibers under quasi-static, transverse, large deformations. Results indicated that, with a sufficiently large representative volume element (RVE) and a fine mesh, the statistically randomized microstructure implemented in the RVE exhibits directional independence in the transverse plane, and the model predictions for the overall and local tissue responses, characterized by the normalized strain energy density and the Cauchy and von Mises stresses, are independent of the modeling parameters. Comparison of the responses of the probabilistic model with those of a simple uniform RVE revealed that only the former is capable of representing the localized behavior of the tissue constituents. Validation of the model predictions for the corona radiata against experimental data from the literature indicated very close agreement. In comparison with the conventional direct meshing method, the model provided almost the same results after correcting for the stiffness redundancy, but with much lower computational cost and with simpler geometric modeling, meshing, and imposition of boundary conditions. It was concluded that the EET can be used effectively for detailed probabilistic micromechanical modeling of the white matter in order to provide more accurate predictions of the axonal responses, which are of great importance when simulating brain trauma or tumor growth. Copyright © 2018 Elsevier Ltd. All rights reserved.
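One common way to state the stiffness-redundancy correction (an illustrative form, not necessarily the paper's exact large-deformation derivation) is at the strain-energy level: the matrix contribution in the volume occupied by the embedded fiber elements is subtracted from the fiber strain energy density.

```latex
% Hedged sketch: embedding superimposes fiber elements on matrix elements
% occupying the same volume, so the redundant matrix contribution is
% subtracted at the energy level (illustrative form only).
\begin{equation*}
  \Psi_{\mathrm{fib}}^{\mathrm{corr}}(\mathbf{F})
    = \Psi_{\mathrm{fib}}(\mathbf{F}) - \Psi_{\mathrm{mat}}(\mathbf{F}),
  \qquad
  \Psi_{\mathrm{total}} = \int_{\Omega}\Psi_{\mathrm{mat}}\,dV
    + \int_{\Omega_{\mathrm{fib}}}\Psi_{\mathrm{fib}}^{\mathrm{corr}}\,dV .
\end{equation*}
```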
Application of thermodynamics to silicate crystalline solutions
NASA Technical Reports Server (NTRS)
Saxena, S. K.
1972-01-01
A review of thermodynamic relations is presented, describing Guggenheim's regular solution models, the simple mixture, the zeroth approximation, and the quasi-chemical model. The possibilities of retrieving useful thermodynamic quantities from phase equilibrium studies are discussed. Such quantities include the activity-composition relations and the free energy of mixing in crystalline solutions. Theory and results of the study of partitioning of elements in coexisting minerals are briefly reviewed. A thermodynamic study of the intercrystalline and intracrystalline ion exchange relations gives useful information on the thermodynamic behavior of the crystalline solutions involved. Such information is necessary for the solution of most petrogenic problems and for geothermometry. Thermodynamic quantities for tungstates (CaWO4-SrWO4) are calculated.
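For reference, a hedged sketch of the simple-mixture (regular solution) relations for a binary crystalline solution with interaction parameter W, of the kind reviewed above:

```latex
% Simple-mixture (regular solution) relations for a binary solution;
% W is the interaction parameter, x_i mole fractions, a_i activities.
\begin{align*}
  G^{\mathrm{ex}} &= W\,x_1 x_2, \\
  RT \ln \gamma_1 &= W\,x_2^{2}, \qquad RT \ln \gamma_2 = W\,x_1^{2}, \\
  a_i &= \gamma_i\,x_i .
\end{align*}
```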
NASA Astrophysics Data System (ADS)
Poplin, A.; Shenk, L.; Krejci, C.; Passe, U.
2017-09-01
The main goal of this paper is to present a conceptual framework for engaging youth in urban planning activities that simultaneously create locally meaningful positive change. The framework interlinks the use of IT tools such as geographic information systems (GIS), agent-based modelling (ABM), online serious games, and mobile participatory geographic information systems with map-based storytelling and action projects. We summarize the elements of our framework and the first results gained in the Community Growers program, established in a neighbourhood community of Des Moines, the capital of Iowa, USA. We conclude the paper with a discussion and future research directions.
NASA Astrophysics Data System (ADS)
Jones, Brian Kirby
The purpose of this grounded theory study was to develop a model explaining the role of differentiated instruction (DI) in effective middle school science teaching. The study examined the best teaching practices and differentiated elements from eight general education middle school science teachers, all scoring at the highest level of a teaching effectiveness measure on their evaluations, through a collection of observational, interview, survey, and teaching artifact data. The data were analyzed through the methodology of a systematic grounded theory qualitative approach using open, axial, and selective coding to develop a model describing how and to what degree effective middle school science teachers differentiated their best teaching practices. The model that emerged from the data shows instruction as a four-phase process and highlights the major elements of best practices and DI represented at each phase. The model also depicts how teachers narrowed the scope of their differentiating strategies as instruction progressed. The participants incorporated DI into their pedagogies, though in different degrees at each phase, and primarily by using variety to present concepts with multiple types of instruction followed by a series of sense-making activities related to several learning modalities. Teachers scaffolded students carefully, using informal and formal assessment data to inform future instructional decisions and especially their plans to reteach or extend on a concept. The model is intended to provide insight into the value of DI for middle school science teaching.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cances, Benjamin; Benedetti, Marc; Farges, Francois
2007-02-02
Gold is a highly valuable metal that can concentrate in iron-rich exogenetic horizons such as laterites. An improved knowledge of the retention mechanisms of gold onto highly reactive soil components such as iron oxy-hydroxides is therefore needed to better understand and predict the geochemical behavior of this element. In this study, we use EXAFS information and titration experiments to provide a realistic thermochemical description of the sorption of trivalent gold onto iron oxy-hydroxides. Analysis of Au LIII-edge XAFS spectra shows that aqueous Au(III) adsorbs from chloride solutions onto goethite surfaces as inner-sphere square-planar complexes (Au(III)(OH,Cl)4), with dominantly OH ligands at pH > 6 and mixed OH/Cl ligands at lower pH values. In combination with these spectroscopic results, Reverse Monte Carlo simulations were used to constrain the possible sorption sites on the surface of goethite. Based on this structural information, we calculated sorption isotherms of Au(III) on Fe oxy-hydroxide surfaces using the CD-MUSIC (Charge Distribution - MUlti SIte Complexation) model. The various Au(III)-sorbed species were identified as a function of pH, and the results of these EXAFS+CD-MUSIC models are compared with titration experiments. The overall good agreement between the predicted and measured structural models shows the potential of this combined approach to better model the sorption processes of transition elements onto highly reactive solid surfaces such as goethite and ferrihydrite.
Rajamani, Sripriya; Chen, Elizabeth S; Lindemann, Elizabeth; Aldekhyyel, Ranyah; Wang, Yan; Melton, Genevieve B
2018-02-01
Reports by the National Academy of Medicine and leading public health organizations advocate including occupational information as part of an individual's social context. Given recent National Academy of Medicine recommendations on occupation-related data in the electronic health record, there is a critical need for improved representation. The National Institute for Occupational Safety and Health has developed an Occupational Data for Health (ODH) model, currently in draft format. This study aimed to validate the ODH model by mapping occupation-related elements from resources representing recommendations, standards, public health reports and surveys, and research measures, along with preliminary evaluation of associated value sets. All 247 occupation-related items across 20 resources mapped to the ODH model. Recommended value sets had high variability across the evaluated resources. This study demonstrates the ODH model's value, the multifaceted nature of occupation information, and the critical need for occupation value sets to support clinical care, population health, and research. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Molecular architecture of the yeast Mediator complex
Robinson, Philip J; Trnka, Michael J; Pellarin, Riccardo; Greenberg, Charles H; Bushnell, David A; Davis, Ralph; Burlingame, Alma L; Sali, Andrej; Kornberg, Roger D
2015-01-01
The 21-subunit Mediator complex transduces regulatory information from enhancers to promoters, and performs an essential role in the initiation of transcription in all eukaryotes. Structural information on two-thirds of the complex has been limited to coarse subunit mapping onto 2-D images from electron micrographs. We have performed chemical cross-linking and mass spectrometry, and combined the results with information from X-ray crystallography, homology modeling, and cryo-electron microscopy by an integrative modeling approach to determine a 3-D model of the entire Mediator complex. The approach is validated by the use of X-ray crystal structures as internal controls and by consistency with previous results from electron microscopy and yeast two-hybrid screens. The model shows the locations and orientations of all Mediator subunits, as well as subunit interfaces and some secondary structural elements. Segments of 20–40 amino acid residues are placed with an average precision of 20 Å. The model reveals roles of individual subunits in the organization of the complex. DOI: http://dx.doi.org/10.7554/eLife.08719.001 PMID:26402457
Development of a cultural heritage object BIM model
NASA Astrophysics Data System (ADS)
Braila, Natalya; Vakhrusheva, Svetlana; Martynenko, Elena; Kisel, Tatyana
2017-10-01
BIM technology was originally aimed at the design and construction industry, but its application to the study and operation of architectural heritage can fundamentally change this kind of activity and raise it to a new qualitative level. This article considers the effective introduction of BIM technologies for solving administrative questions in the operation and development of architectural monuments. The creation of an information model of a cultural heritage building that includes a full set of information about the object is proposed: historical and archival, legal, technical, administrative, etc. One component of the model will be a 3D model of the heritage object with colour coding of elements by degree of wear and by repair priority. This model will allow the technical condition of the building to be assessed visually, give a general idea of the scale of necessary repair and construction work, improve the quality of the object's operation, and simplify and accelerate the processing of information when a heritage building must be assessed as a subject for investment.
Buchanan, Drew; Ural, Ani
2010-08-01
Distal forearm fracture is one of the most frequently observed osteoporotic fractures; it may occur as a result of low-energy falls, such as falls from standing height, and may be linked to the osteoporotic nature of the bone, especially in the elderly. In order to prevent the occurrence of radius fractures and their adverse outcomes, understanding the effect of both extrinsic and intrinsic contributors to fracture risk is essential. In this study, a nonlinear fracture mechanics-based finite element model is applied to the human radius to assess the influence of extrinsic factors (load orientation and load distribution between scaphoid and lunate) and intrinsic bone properties (age-related changes in fracture properties and bone geometry) on the Colles' fracture load. Seven three-dimensional finite element models of the radius were created, and the fracture loads were determined using cohesive finite element modeling, which explicitly represented the crack and the behavior of the fracture process zone. The simulation results showed that the load direction with respect to the longitudinal and dorsal axes of the radius influenced the fracture load. The fracture load increased with larger angles between the resultant load and the dorsal axis, and with smaller angles between the resultant load and the longitudinal axis. The fracture load also varied as a function of the load ratio between the lunate and scaphoid, though not as drastically as with load orientation; it decreased as the load ratio (lunate/scaphoid) increased. Multiple regression analysis showed that bone geometry and load orientation are the most important variables contributing to the prediction of the fracture load. The findings of this study establish a robust computational fracture risk assessment method that combines the effects of intrinsic bone properties with extrinsic factors associated with a fall, and may be instrumental in identifying individuals at high fracture risk as well as in developing fracture prevention methods, including protective falling techniques. The additional information that this study brings to fracture identification and prevention highlights the promise of fracture mechanics-based finite element modeling in fracture risk assessment.
Sato, Y; Wadamoto, M; Tsuga, K; Teixeira, E R
1999-04-01
Greater validity of finite element analysis in implant biomechanics requires element downsizing; however, excessive downsizing demands more computer memory and calculation time. To investigate the effectiveness of element downsizing in the construction of a three-dimensional finite element bone trabeculae model, models with different element sizes (600, 300, 150 and 75 microm) were constructed, and the stress induced by vertical 10 N loading was analysed. The difference in von Mises stress values between the models with 600 and 300 microm element sizes was larger than that between the 300 and 150 microm models. On the other hand, no clear difference in stress values was detected among the models with 300, 150 and 75 microm element sizes. Downsizing of elements from 600 to 300 microm is therefore suggested to be effective in the construction of a three-dimensional finite element bone trabeculae model, with possible savings of computer memory and calculation time in the laboratory.
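The convergence behavior reported above can be reproduced in a minimal one-dimensional analogue (not the trabecular model itself): a uniformly loaded bar whose element stress near the support approaches the exact value as the elements shrink, with diminishing returns below a certain size.

```python
import numpy as np

# Minimal 1D analogue of element downsizing: a unit bar fixed at x=0 under
# unit distributed load has exact internal stress sigma(x) = 1 - x, so the
# stress at the support is exactly 1. Element stress there converges as h -> 0.
def stress_at_support(n_elems, EA=1.0, L=1.0):
    h = L / n_elems
    K = np.zeros((n_elems + 1, n_elems + 1))
    f = np.zeros(n_elems + 1)
    for e in range(n_elems):                      # assemble linear elements
        K[e:e+2, e:e+2] += EA / h * np.array([[1, -1], [-1, 1]])
        f[e:e+2] += h / 2                         # consistent load, f(x) = 1
    u = np.zeros(n_elems + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])     # impose u(0) = 0
    return EA * (u[1] - u[0]) / h                 # stress in the first element

for n in (2, 4, 8, 16, 64):
    s = stress_at_support(n)
    print(f"h = {1.0/n:.4f}: sigma ~ {s:.4f} (exact 1.0, error {1 - s:.4f})")
```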
Prediction of Fracture Behavior in Rock and Rock-like Materials Using Discrete Element Models
NASA Astrophysics Data System (ADS)
Katsaga, T.; Young, P.
2009-05-01
The study of fracture initiation and propagation in heterogeneous materials such as rock and rock-like materials is of principal interest in the field of rock mechanics and rock engineering. It is crucial for investigating failure prediction and safety measures in civil and mining structures. Our work offers a practical approach to predicting fracture behaviour using discrete element models. In this approach, the microstructures of materials are represented through combinations of clusters of bonded particles with different inter-cluster particle and bond properties, and intra-cluster bond properties. The geometry of clusters is transferred from information available from thin sections, computed tomography (CT) images and other visual presentations of the modelled material using a customized AutoCAD built-in dialog-based Visual Basic application. Exact microstructures of the tested sample, including fractures, faults, inclusions and void spaces, can be duplicated in the discrete element models. Although the microstructural fabrics of rocks and rock-like structures may differ in scale, fracture formation and propagation through these materials are alike and follow similar mechanics. Synthetic material provides an excellent condition for validating the modelling approach, as fracture behaviours are known for the well-defined properties of the composite. Calibration of the macro-properties of the matrix material and inclusions (aggregates) was followed by calibration of the overall mechanical material response by adjusting the interfacial properties. The discrete element model predicted fracture propagation features and paths similar to those of the real sample material. The fracture paths and matrix-inclusion interaction were compared using computed tomography images. Fracture initiation and formation in the model and the real material were compared using acoustic emission data. Analysing the temporal and spatial evolution of AE events collected during sample testing, in relation to the CT images, allows precise reconstruction of the failure sequence. Our proposed modelling approach yields realistic predictions of fracture formation and growth under different loading conditions.
NASA Astrophysics Data System (ADS)
Kettle, L. M.; Mora, P.; Weatherley, D.; Gross, L.; Xing, H.
2006-12-01
Simulations using the Finite Element method are widely used in many engineering applications and for the solution of partial differential equations (PDEs). Computational models based on the solution of PDEs play a key role in earth systems simulations. We present numerical modelling of crustal fault systems where the dynamic elastic wave equation is solved using the Finite Element method. This is achieved using a high-level computational modelling language, escript, available as open source software from ACcESS (Australian Computational Earth Systems Simulator), the University of Queensland. Escript is an advanced geophysical simulation software package developed at ACcESS which includes parallel equation solvers, data visualisation and data analysis software. The escript library was used to develop a flexible Finite Element model which reliably simulates the mechanism of faulting and the physics of earthquakes. Both 2D and 3D elastodynamic models are being developed to study the dynamics of crustal fault systems. Our final goal is to build a flexible model which can be applied to any fault system with user-defined geometry and input parameters. To study the physics of earthquake processes, two different time scales must be modelled: firstly, the quasi-static loading phase, which gradually increases stress in the system (~100 years), and secondly, the dynamic rupture process, which rapidly redistributes stress in the system (~100 seconds). We will discuss the solution of the time-dependent elastic wave equation for an arbitrary fault system using escript. This involves prescribing the correct initial stress distribution in the system to simulate the quasi-static loading of faults to failure; determining a suitable frictional constitutive law which accurately reproduces the dynamics of the stick/slip instability at the faults; and using a robust time integration scheme. These dynamic models generate data and information that can be used for earthquake forecasting.
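As a hedged sketch of the core computation, the following plain-NumPy code (deliberately not the escript API) explicitly time-steps the 1-D elastic wave equation on a lumped-mass linear finite element mesh, the same PDE the model above solves in higher dimensions; all material values are illustrative.

```python
import numpy as np

# Generic 1-D sketch: rho*u_tt = (E*u_x)_x on a lumped-mass linear FE mesh,
# stepped with central differences in time. Not the escript interface.
L, n = 1000.0, 400                    # domain length (m), number of elements
h = L / n
rho, E = 2700.0, 70e9                 # density, Young's modulus (illustrative)
c = np.sqrt(E / rho)
dt = 0.5 * h / c                      # CFL-stable time step

x = np.linspace(0, L, n + 1)
u = np.exp(-((x - L/2) / (10*h))**2)  # initial Gaussian displacement pulse
u_prev = u.copy()                     # zero initial velocity

m = rho * h                           # lumped nodal mass (unit cross-section)
for _ in range(300):
    # internal force from linear elements: f_i = E*(u_{i+1}-2u_i+u_{i-1})/h
    f = np.zeros_like(u)
    f[1:-1] = E * (u[2:] - 2*u[1:-1] + u[:-2]) / h
    u_next = 2*u - u_prev + dt**2 * f / m
    u_next[0] = u_next[-1] = 0.0      # fixed (Dirichlet) ends
    u_prev, u = u, u_next

print(f"wave speed {c:.0f} m/s; pulse peak now near x = "
      f"{x[np.argmax(np.abs(u))]:.0f} m")
```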
Lindemann, Elizabeth A.; Chen, Elizabeth S.; Rajamani, Sripriya; Manohar, Nivedha; Wang, Yan; Melton, Genevieve B.
2017-01-01
There has been increasing recognition of the key role of social determinants like occupation on health. Given the relatively poor understanding of occupation information in electronic health records (EHRs), we sought to characterize occupation information within free-text clinical document sources. From six distinct clinical sources, 868 total occupation-related sentences were identified for the study corpus. Building off approaches from previous studies, refined annotation guidelines were created using the National Institute for Occupational Safety and Health Occupational Data for Health data model with elements added to increase granularity. Our corpus generated 2,005 total annotations representing 39 of 41 entity types from the enhanced data model. Highest frequency entities were: Occupation Description (17.7%); Employment Status – Not Specified (12.5%); Employer Name (11.0%); Subject (9.8%); Industry Description (6.2%). Our findings support the value for standardizing entry of EHR occupation information to improve data quality for improved patient care and secondary uses of this information. PMID:29295142
Vocational rehabilitation after traumatic brain injury: models and services.
Tyerman, Andy
2012-01-01
A recent systematic review suggests that around 40% of people with traumatic brain injury (TBI) return to work (RTW). Yet in the U.K. currently only a small minority of people with TBI receive vocational rehabilitation (VR) to enable a RTW. Agencies with an interest in developing such services are likely to favour different models of VR. The primary objective of this paper was to review models of specialist VR after TBI and their outcomes to inform service development across relevant agencies. A literature review on VR after TBI was undertaken in MEDLINE, EMBASE and PsychINFO (from 1967 to date). Papers reporting models of VR were selected for more detailed consideration. Illustrative examples of VR models are outlined: brain injury rehabilitation programmes with added VR elements, VR models adapted for TBI, case coordination/resource facilitation models, and consumer-directed models. Models differ, both within and across these four broad categories, in provision of core TBI rehabilitation, work preparation, work trials and supported placements. Methodological variation limits direct comparison of outcomes across models with few comparative or controlled studies. There is evidence to support the benefits of a wide range of models of specialist VR after TBI. However, there remains a need for controlled studies to inform service development and more evidence on cost-effectiveness to inform funding decisions.
Grogan-Kaylor, Andrew; Perron, Brian E.; Kilbourne, Amy M.; Woltmann, Emily; Bauer, Mark S.
2013-01-01
Objective: Prior meta-analysis indicates that collaborative chronic care models (CCMs) improve mental and physical health outcomes for individuals with mental disorders. This study aimed to investigate the stability of evidence over time and identify patient and intervention factors associated with CCM effects in order to facilitate implementation and sustainability of CCMs in clinical practice. Method: We reviewed 53 CCM trials that analyzed depression, mental quality of life (QOL), or physical QOL outcomes. Cumulative meta-analysis and meta-regression were supplemented by descriptive investigations across and within trials. Results: Most trials targeted depression in the primary care setting, and cumulative meta-analysis indicated that effect sizes favoring CCM quickly achieved significance for depression outcomes, and more recently achieved significance for mental and physical QOL. Four of six CCM elements (patient self-management support, clinical information systems, system redesign, and provider decision support) were common among reviewed trials, while two elements (healthcare organization support and linkages to community resources) were rare. No single CCM element was statistically associated with the success of the model. Similarly, meta-regression did not identify specific factors associated with CCM effectiveness. Nonetheless, results within individual trials suggest that increased illness severity predicts CCM outcomes. Conclusions: Significant CCM trials have been derived primarily from four original CCM elements. Nonetheless, implementing and sustaining this established model will require healthcare organization support. While CCMs have typically been tested as population-based interventions, evidence supports stepped care application to more severely ill individuals. Future priorities include developing implementation strategies to support adoption and sustainability of the model in clinical settings while maximizing fit of this multi-component framework to local contextual factors. PMID:23938600
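The cumulative meta-analysis step can be sketched as fixed-effect, inverse-variance pooling recomputed as trials are added in chronological order; the effect sizes below are illustrative, not the review's data.

```python
import numpy as np

# Hedged sketch of cumulative fixed-effect meta-analysis: the pooled effect
# and its confidence interval are recomputed after each trial is added.
effects = np.array([0.45, 0.30, 0.25, 0.35, 0.20, 0.28])   # per-trial d (illustrative)
ses     = np.array([0.20, 0.15, 0.12, 0.18, 0.10, 0.11])   # standard errors

w = 1.0 / ses**2                          # inverse-variance weights
for k in range(1, len(effects) + 1):
    pooled = np.sum(w[:k] * effects[:k]) / np.sum(w[:k])
    se = np.sqrt(1.0 / np.sum(w[:k]))
    z = pooled / se
    print(f"after {k} trials: d = {pooled:.2f} +/- {1.96*se:.2f}"
          f"{'  (p<0.05)' if abs(z) > 1.96 else ''}")
```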
Patient-specific finite element modeling of bones.
Poelert, Sander; Valstar, Edward; Weinans, Harrie; Zadpoor, Amir A
2013-04-01
Finite element modeling is an engineering tool for structural analysis that has been used for many years to assess the relationship between load transfer and bone morphology and to optimize the design and fixation of orthopedic implants. Due to recent developments in finite element model generation, for example, improved computed tomography imaging quality, improved segmentation algorithms, and faster computers, the accuracy of finite element modeling has increased vastly, and finite element models simulating the anatomy and properties of an individual patient can be constructed. Such so-called patient-specific finite element models are potentially valuable tools for orthopedic surgeons in fracture risk assessment or pre- and intraoperative planning of implant placement. The aim of this article is to provide a critical overview of current themes in patient-specific finite element modeling of bones. In addition, the state of the art in patient-specific modeling of bones is compared with the requirements for a clinically applicable patient-specific finite element method, and judgment is passed on the feasibility of applying patient-specific finite element modeling as part of clinical orthopedic routine. It is concluded that further development in certain aspects of patient-specific finite element modeling is needed before finite element modeling can be used as a routine clinical tool.
Kamoi, Shun; Pretty, Christopher; Docherty, Paul; Squire, Dougie; Revie, James; Chiew, Yeong Shiong; Desaive, Thomas; Shaw, Geoffrey M; Chase, J Geoffrey
2014-01-01
Accurate, continuous, left ventricular stroke volume (SV) measurements can convey large amounts of information about patient hemodynamic status and response to therapy. However, direct measurements are highly invasive in clinical practice, and current procedures for estimating SV require specialized devices and significant approximation. This study investigates the accuracy of a three-element Windkessel model combined with an aortic pressure waveform to estimate SV. Aortic pressure is separated into two components capturing: 1) resistance and compliance, and 2) characteristic impedance. This separation provides model-element relationships enabling SV to be estimated while requiring only one of the three element values to be known or estimated. Beat-to-beat SV estimation was performed using population-representative optimal values for each model element. This method was validated using measured SV data from porcine experiments (N = 3 female Pietrain pigs, 29-37 kg) in which both ventricular volume and aortic pressure waveforms were measured simultaneously. The median difference between measured SV from left ventricular (LV) output and estimated SV was 0.6 ml, with a 90% range (5th-95th percentile) of -12.4 ml to 14.3 ml. During periods when changes in SV were induced, cross-correlations between estimated and measured SV were above R = 0.65 for all cases. The method presented demonstrates that the magnitude and trends of SV can be accurately estimated from pressure waveforms alone, without the need to identify complex physiological metrics whose strength of correlation may vary significantly from patient to patient.
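The pressure separation described above can be illustrated with a forward simulation of the three-element Windkessel, splitting aortic pressure into a characteristic-impedance drop Zc*Q and a resistance-compliance component; all parameter values below are illustrative, not the study's identified values.

```python
import numpy as np

# Hedged forward sketch of the three-element Windkessel separation:
# P = Zc*Q (characteristic impedance drop) + P_rc (resistance-compliance part).
Zc, R, C = 0.05, 1.0, 1.5            # mmHg.s/ml, mmHg.s/ml, ml/mmHg (illustrative)
dt, T, t_ej = 1e-3, 0.8, 0.3         # time step, beat period, ejection time (s)

t = np.arange(0.0, T, dt)
Q = np.where(t < t_ej, 400*np.sin(np.pi*t/t_ej), 0.0)   # half-sine ejection (ml/s)

P_rc = np.empty_like(t)
P_rc[0] = 80.0                       # windkessel component, initial value (mmHg)
for i in range(len(t) - 1):          # forward Euler: dP_rc/dt = Q/C - P_rc/(R*C)
    P_rc[i+1] = P_rc[i] + dt * (Q[i]/C - P_rc[i]/(R*C))

P = Zc*Q + P_rc                      # modeled aortic pressure
SV = Q.sum() * dt                    # stroke volume = integral of ejected flow
print(f"SV = {SV:.1f} ml, systolic/diastolic P = {P.max():.0f}/{P.min():.0f} mmHg")
```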
Modelling the influence of carbon content on material behavior during forging
NASA Astrophysics Data System (ADS)
Korpała, G.; Ullmann, M.; Graf, M.; Wester, H.; Bouguecha, A.; Awiszus, B.; Behrens, B.-A.; Kawalla, R.
2017-10-01
Nowadays the design of single process steps and even of whole process chains is realized by the use of numerical simulation, in particular finite element (FE) based methods. A detailed numerical simulation of hot forging processes requires realistic models, which consider the relevant material-specific parameters to characterize the material behavior, the surface phenomena and the dies, as well as models for the machine kinematics. Such data exist in part for several materials, but general information for steel groups as a function of alloying elements is not available. In order to generate the scientific input data for material modelling, it is necessary to take into account the mathematical functions for deformation behavior as well as recrystallization kinetics, which depend on the alloying elements, the initial microstructure and the reheating mode. Besides the characterization of material flow, a detailed description of surface changes caused by oxide scale is gaining in importance, as these phenomena affect the material flow and the component quality. Experiments investigating the influence of a single chemical element on oxide scale kinetics and inner structure at high temperatures are still not available. Most data concerning these characteristics is provided for the steel grade C45, so this steel will be used as the basis for the tests. In order to identify the effect of the carbon content on the material and oxidation behavior, the steel grades C15 and C60 will be investigated. This paper gives first approaches with regard to the influence of the carbon content on the oxide scale kinetics and the flow stresses, combined with the initial microstructure.
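For context, one commonly fitted flow-stress form in hot forging simulation is the Hensel-Spittel law, shown here as a hedged sketch; the paper's own regression functions may differ, with the carbon-content dependence entering through coefficients fitted per steel grade (e.g. C15, C45, C60).

```latex
% Hensel-Spittel type flow-stress law (illustrative form): A and m_i are
% regressed per steel grade; T is temperature, \varphi the equivalent strain
% and \dot{\varphi} the equivalent strain rate.
\begin{equation*}
  \sigma_f = A \, e^{m_1 T} \,\varphi^{m_2}\, \dot{\varphi}^{\,m_3}\, e^{m_4/\varphi} .
\end{equation*}
```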
Archetype-based semantic integration and standardization of clinical data.
Moner, David; Maldonado, Jose A; Bosca, Diego; Fernandez, Jesualdo T; Angulo, Carlos; Crespo, Pere; Vivancos, Pedro J; Robles, Montserrat
2006-01-01
One of the basic needs of any healthcare professional is to be able to access the clinical information of patients in an understandable and normalized way. The lifelong clinical information of a person, supported by electronic means, constitutes his or her Electronic Health Record (EHR). This information is usually distributed among several independent and heterogeneous systems that may be syntactically or semantically incompatible. The Dual Model architecture has emerged as a proposal for maintaining a homogeneous representation of the EHR with a clear separation between information and knowledge. Information is represented by a Reference Model, which describes common data structures with minimal semantics. Knowledge is specified by archetypes, which are formal representations of clinical concepts built upon a particular Reference Model. This kind of architecture was originally conceived for the implementation of new clinical information systems, but archetypes can also be used to integrate data from existing, non-normalized systems, at the same time adding semantic meaning to the integrated data. In this paper we explain the possible use of a Dual Model approach for the semantic integration and standardization of heterogeneous clinical data sources and present LinkEHR-Ed, a tool for developing archetypes as elements for integration purposes. LinkEHR-Ed has been designed to be easily used by the two main participants in the creation of archetypes for clinical data integration: the health domain expert and the information technologies domain expert.
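The dual-model idea can be illustrated with a toy sketch: a generic reference-model node plus an archetype that constrains its values. All names below are invented for illustration and do not reflect LinkEHR-Ed internals or any standard archetype language.

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """Minimal reference-model node: generic structure, minimal semantics."""
    name: str
    value: object
    children: list = field(default_factory=list)

@dataclass
class ArchetypeConstraint:
    """Archetype fragment: a named, clinically meaningful constraint."""
    name: str
    allowed_values: set | None = None   # None means any value is accepted

    def validate(self, node: Element) -> bool:
        if node.name != self.name:
            return False
        return self.allowed_values is None or node.value in self.allowed_values

# Example: constrain a patient-position element to a clinical value set
constraint = ArchetypeConstraint("position", {"sitting", "standing", "lying"})
print(constraint.validate(Element("position", "sitting")))   # True
print(constraint.validate(Element("position", "inverted")))  # False
```

The separation mirrors the paper's point: the generic `Element` structure stays stable while clinical knowledge lives entirely in the (editable, exchangeable) constraint layer.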
Mixed formulation for seismic analysis of composite steel-concrete frame structures
NASA Astrophysics Data System (ADS)
Ayoub, Ashraf Salah Eldin
This study presents a new finite element model for the nonlinear analysis of structures made up of steel and concrete under monotonic and cyclic loads. The new formulation is based on a two-field mixed formulation in which both forces and deformations are simultaneously approximated within the element through independent interpolation functions. The main advantages of the model are its accuracy in global and local response with very few elements, while maintaining rapid numerical convergence and robustness even under severe cyclic loading. Overall, four elements were developed based on the new formulation: an element that describes the behavior of anchored reinforcing bars, an element for composite steel-concrete beams with deformable shear connectors, an element for reinforced concrete beam-columns with bond-slip, and an element for pretensioned or posttensioned, bonded or unbonded prestressed concrete structures. The models use fiber discretization of beam sections to describe nonlinear material response. The transfer of forces between steel and concrete is described with bond elements, which are modeled as distributed spring elements. The nonlinear behavior of the composite element derives entirely from the constitutive laws of the steel, concrete and bond elements. Two additional elements are used for the prestressed concrete models: a friction element that models the effect of friction between the tendon and the duct during the posttensioning operation, and an anchorage element that describes the behavior of the prestressing tendon anchorage in posttensioned structures. Two algorithms for the numerical implementation of the proposed model are presented: an algorithm that enforces stress continuity at element boundaries, and an algorithm in which stress continuity is relaxed locally inside the element. The stability of both algorithms is discussed. Comparisons with standard displacement-based models and earlier flexibility-based models are presented through numerical studies, which demonstrate the superiority of the mixed model over both. Correlation studies of the proposed model with experimental results of structural specimens show the accuracy of the model and its numerical robustness even under severe cyclic loading conditions.
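To make the fiber-discretization step concrete, here is a minimal sketch of fiber-section state determination under the plane-sections assumption, using a single bilinear material law. Dimensions, units and material parameters are illustrative assumptions, not the dissertation's models.

```python
import numpy as np

def section_forces(curvature, eps0, y, area, stress_fn):
    """Integrate fiber stresses into section axial force N and moment M.
    Plane sections: fiber strain  eps_i = eps0 - y_i * kappa."""
    eps = eps0 - y * curvature
    sig = stress_fn(eps)
    N = np.sum(sig * area)
    M = -np.sum(sig * area * y)
    return N, M

def bilinear_steel(eps, E=200e3, fy=400.0, b=0.01):
    """Bilinear stress-strain law (MPa) with hardening ratio b (placeholder)."""
    eps_y = fy / E
    return np.where(np.abs(eps) <= eps_y, E * eps,
                    np.sign(eps) * (fy + b * E * (np.abs(eps) - eps_y)))

# 20 equal-area fibers over a 0.5 m deep, 0.3 m wide section (illustrative)
y = np.linspace(-0.25, 0.25, 20)
area = np.full_like(y, 0.30 * 0.5 / 20)
print(section_forces(curvature=0.01, eps0=0.0, y=y, area=area,
                     stress_fn=bilinear_steel))
```

In the mixed formulation this state determination sits inside the element-level iteration; a real composite element would combine separate steel, concrete and bond constitutive laws rather than the single law used here.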
Building an Ontology for Identity Resolution in Healthcare and Public Health
Duncan, Jeffrey; Eilbeck, Karen; Narus, Scott P.; Clyde, Stephen; Thornton, Sidney; Staes, Catherine
2015-01-01
Integration of disparate information from electronic health records, clinical data warehouses, birth certificate registries and other public health information systems offers great potential for clinical care, public health practice, and research. Such integration, however, depends on correctly matching patient-specific records using demographic identifiers. Without standards for these identifiers, record linkage is complicated by issues of structural and semantic heterogeneity. Objectives: Our objectives were to: 1) identify the components of identity and the events subsequent to birth that result in the creation, change, or sharing of identity information; 2) develop an ontology to facilitate data integration from multiple healthcare and public health sources; and 3) validate the ontology's ability to model identity-changing events over time. Methods: We interviewed domain experts in area hospitals and public health programs and developed process models describing the creation and transmission of identity information among various organizations for activities subsequent to a birth event. We searched for existing relevant ontologies. We validated the content of our ontology with simulated identity information conforming to scenarios identified in our process models. Results: We chose the Simple Event Model (SEM) to describe events in early childhood and integrated the Clinical Element Model (CEM) for demographic information. We demonstrated the ability of the combined SEM-CEM ontology to model identity events over time. Conclusion: The use of an ontology can overcome issues of semantic and syntactic heterogeneity to facilitate record linkage. PMID:26392849
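As a toy illustration of the record-linkage problem the ontology addresses, the sketch below scores two demographic records by weighted string similarity. Field names, weights and the decision threshold are hypothetical; the paper's ontology-based approach is considerably richer, modeling the events that change identifiers over time.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec_a: dict, rec_b: dict, weights=None) -> float:
    """Weighted similarity over demographic fields (illustrative weights)."""
    weights = weights or {"first_name": 0.3, "last_name": 0.4, "dob": 0.3}
    return sum(w * similarity(str(rec_a.get(f, "")), str(rec_b.get(f, "")))
               for f, w in weights.items())

a = {"first_name": "Jon", "last_name": "Smith", "dob": "2014-02-01"}
b = {"first_name": "John", "last_name": "Smith", "dob": "2014-02-01"}
print(match_score(a, b) > 0.85)  # likely the same infant despite a name edit
```

Purely score-based matching breaks down exactly where the paper focuses: legitimate identity-changing events (naming, adoption, corrections) that an event ontology can represent explicitly.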
Finite Element Modeling of the Buckling Response of Sandwich Panels
NASA Technical Reports Server (NTRS)
Rose, Cheryl A.; Moore, David F.; Knight, Norman F., Jr.; Rankin, Charles C.
2002-01-01
A comparative study of different modeling approaches for predicting sandwich panel buckling response is described. The study considers sandwich panels with anisotropic face sheets and a very thick core. Results from conventional analytical solutions for sandwich panel overall buckling and face-sheet-wrinkling type modes are compared with solutions obtained using different finite element modeling approaches. Finite element solutions are obtained using layered shell element models, with and without transverse shear flexibility; layered shell/solid element models, with shell elements for the face sheets and solid elements for the core; and sandwich models using a recently developed specialty sandwich element. Convergence characteristics of the shell/solid and sandwich element modeling approaches with respect to in-plane and through-the-thickness discretization are demonstrated. Results of the study indicate that the specialty sandwich element provides an accurate and effective modeling approach for predicting both overall and localized sandwich panel buckling response. Furthermore, results indicate that anisotropy of the face sheets, along with the ratio of principal elastic moduli, affects the buckling response, and these effects may not be represented accurately by analytical solutions. Modeling recommendations are also provided.
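For comparison with the analytical solutions mentioned above, face-sheet wrinkling is classically estimated with a one-line formula of the form sigma = k*(Ef*Ec*Gc)^(1/3). In the sketch below, the knockdown factor k and the moduli are illustrative assumptions, not values from the study.

```python
def wrinkling_stress(Ef: float, Ec: float, Gc: float, k: float = 0.5) -> float:
    """Classical face-sheet wrinkling estimate (same units as the moduli).
    Knockdown factors between roughly 0.5 and 0.9 appear in the literature;
    k = 0.5 here is a conservative placeholder."""
    return k * (Ef * Ec * Gc) ** (1.0 / 3.0)

# Example: stiff composite face sheet on a soft foam core (MPa)
print(round(wrinkling_stress(Ef=60e3, Ec=120.0, Gc=45.0), 1))
```

Formulas of this kind assume isotropic or effective face-sheet moduli, which is precisely where the study finds analytical predictions can diverge from finite element results for anisotropic face sheets.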
Cosmological element production.
Wagoner, R V
1967-03-17
Two recent observations appear to have provided critical information about the past history of the universe. The thermal character of the microwave background radiation suggests that the universe has expanded from a state of high temperature and density, and places constraints on such a big-bang cosmology. The observations of very weak helium lines in the spectra of certain stars in the halo of our galaxy are possibly due to a low primeval abundance of this element. However, the simplest model of a big-bang cosmology leads to much higher helium abundances, such as are observed in the solar system and in many stars. The production of helium can be reduced either by altering the early expansion rate or by introducing degenerate electron neutrinos. Observations of interstellar and intergalactic deuterium and helium-4, and possibly even helium-3 and lithium-7, are needed to test the various models.
A High Order, Locally-Adaptive Method for the Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Chan, Daniel
1998-11-01
I have extended the FOSLS method of Cai, Manteuffel and McCormick (1997) and implemented it within the framework of a spectral element formulation using the Legendre polynomial basis functions. The FOSLS method solves the Navier-Stokes equations as a system of coupled first-order equations and provides the ellipticity needed for fast iterative matrix solvers like multigrid to operate efficiently. Each element is treated as an object whose properties are self-contained. Only C^0 continuity is imposed across element interfaces; this design allows local grid refinement and coarsening without the burden of an elaborate data structure, since only information along element boundaries is needed. With the Fortran 90 programming environment, I maintain high computational efficiency by employing a hybrid parallel processing model: OpenMP directives provide loop-level parallelism executed on a shared-memory SMP, and the MPI protocol distributes elements across a cluster of SMPs connected via a commodity network. This talk will provide timing results and a comparison with a second-order finite difference method.
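The Legendre spectral element machinery rests on Gauss-Lobatto-Legendre (GLL) interpolation nodes: the element endpoints plus the roots of the derivative of the degree-p Legendre polynomial. A minimal sketch of computing them with NumPy (not the author's Fortran 90 code) follows.

```python
import numpy as np
from numpy.polynomial import legendre

def gll_nodes(p: int) -> np.ndarray:
    """Gauss-Lobatto-Legendre nodes on [-1, 1] for a degree-p element:
    {-1, +1} together with the p-1 roots of P_p'(x)."""
    coeffs = np.zeros(p + 1)
    coeffs[p] = 1.0                                  # Legendre series for P_p
    interior = legendre.legroots(legendre.legder(coeffs))  # roots of P_p'
    return np.concatenate(([-1.0], np.sort(interior), [1.0]))

print(gll_nodes(4))  # 5 interpolation nodes for a quartic element
```

Because neighboring elements share only these boundary nodes under C^0 continuity, refining or coarsening one element leaves the rest of the mesh untouched, which is the data-structure simplification the abstract highlights.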
An ICAI architecture for troubleshooting in complex, dynamic systems
NASA Technical Reports Server (NTRS)
Fath, Janet L.; Mitchell, Christine M.; Govindaraj, T.
1990-01-01
Ahab, an intelligent computer-aided instruction (ICAI) program, illustrates an architecture for simulator-based ICAI programs that teach troubleshooting in complex, dynamic environments. The architecture posits three elements of a computerized instructor: the task model, the student model, and the instructional module. The task model is a prescriptive model of expert performance that uses symptomatic and topographic search strategies to provide students with directed problem-solving aids. The student model is a descriptive model of student performance in the context of the task model; it compares student performance against the task model, critiques it, and provides interactive feedback. The instructional module coordinates information presented by the instructional media, the task model, and the student model so that each student receives individualized instruction. Concept and metaconcept knowledge supporting these elements is contained in frames and production rules, respectively. The results of an experimental evaluation are discussed. They support the hypothesis that training with an adaptive online system built using the Ahab architecture produces better performance than training using simulator practice alone, at least on unfamiliar problems. It is not sufficient to develop an expert strategy and present it to students using offline materials; training is most effective when it adapts to individual student needs.
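A skeletal sketch of the three-module decomposition might look as follows. Class names, method signatures, and the toy feedback logic are hypothetical illustrations, not Ahab's frame-and-production-rule implementation.

```python
class TaskModel:
    """Prescriptive expert model: proposes the next troubleshooting step."""
    def next_step(self, symptoms: list[str]) -> str:
        # Symptomatic search first: act on the most salient symptom;
        # a topographic (schematic-tracing) search would be the fallback.
        return f"check component linked to '{symptoms[0]}'" if symptoms else "done"

class StudentModel:
    """Descriptive model: evaluates the student against the expert step."""
    def critique(self, student_action: str, expert_action: str) -> str:
        if student_action == expert_action:
            return "correct"
        return f"expert would instead: {expert_action}"

class InstructionalModule:
    """Coordinates task and student models for individualized feedback."""
    def __init__(self):
        self.task, self.student = TaskModel(), StudentModel()
    def tutor(self, symptoms: list[str], student_action: str) -> str:
        return self.student.critique(student_action, self.task.next_step(symptoms))

tutor = InstructionalModule()
print(tutor.tutor(["low oil pressure"], "replace the gauge fuse"))
```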
Design Requirements for Communication-Intensive Interactive Applications
NASA Astrophysics Data System (ADS)
Bolchini, Davide; Garzotto, Franca; Paolini, Paolo
Online interactive applications call for new requirements paradigms to capture the growing complexity of computer-mediated communication. Crafting successful interactive applications (such as websites and multimedia) involves modeling the requirements for the user experience, including those leading to content design, usable information architecture and interaction, in close coordination with the communication goals of all stakeholders involved, ranging from persuasion to social engagement to calls to action. To face this grand challenge, we propose a methodology for modeling communication requirements and provide a set of operational conceptual tools to be used in complex projects with multiple stakeholders. Through examples from real-life projects and lessons learned from direct experience, we draw on the concepts of brand, value, communication goals, and information and persuasion requirements to systematically guide analysts in mastering the multifaceted connections among these elements as drivers of successful communication designs.
NASA Integrated Model Centric Architecture (NIMA) Model Use and Re-Use
NASA Technical Reports Server (NTRS)
Conroy, Mike; Mazzone, Rebecca; Lin, Wei
2012-01-01
This whitepaper accepts the goals, needs and objectives of NASA's Integrated Model-centric Architecture (NIMA); adds experience and expertise from the Constellation program as well as NASA's architecture development efforts; and provides suggested concepts, practices and norms that nurture and enable model use and re-use across programs, projects and other complex endeavors. Key components include the ability to move relevant information effectively through a large community, process patterns that support model reuse, and the identification of the meta-information (e.g., history, credibility, and provenance) necessary to safely use and re-use that information. In order to successfully use and re-use models and simulations, we must define and meet key organizational and structural needs: 1. We must understand and acknowledge all the roles and players involved, from initial need identification through to the final product, and how they change across the lifecycle. 2. We must create the structural elements needed to store and share NIMA-enabled information throughout the program or project lifecycle. 3. We must create the organizational processes needed to stand up and execute a NIMA-enabled program or project throughout its lifecycle. NASA must meet all three of these needs to successfully use and re-use models. The ability to reuse models is a key component of NIMA, and the capabilities inherent in NIMA are key to accomplishing NASA's space exploration goals.
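As one possible shape for the meta-information mentioned in need 2, the sketch below defines a minimal model-metadata record. The field names are illustrative assumptions, not a NIMA schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ModelMetadata:
    """Minimal meta-information needed to safely reuse a model
    (illustrative fields: history, credibility, provenance)."""
    name: str
    owner: str
    provenance: str                 # where the model came from
    credibility: str                # summary of validation evidence
    history: list[str] = field(default_factory=list)  # change log
    last_validated: date | None = None

m = ModelMetadata("crew_module_thermal", "thermal analysis team",
                  provenance="derived from program thermal model v3",
                  credibility="validated against vacuum-chamber test data")
m.history.append("2012-01: re-used for parametric trade study")
```

The design choice is the whitepaper's point: provenance and credibility travel with the model itself, so a downstream project can judge fitness for re-use without tracking down the original authors.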
Comprehensive model for predicting elemental composition of coal pyrolysis products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richards, Andrew P.; Shutt, Tim; Fletcher, Thomas H.
Large-scale coal combustion simulations depend heavily on the accuracy and utility of the physical submodels used to describe the various physical behaviors of the system, relying on the particle physics to predict product compositions, temperatures, energy outputs, and other useful information. The focus of this paper is to improve the accuracy of devolatilization submodels, to be used in conjunction with other particle physics models. Many large simulations today rely on inaccurate assumptions about particle composition, including that the volatiles released during pyrolysis have the same elemental composition as the char particle; another common assumption is that the char particle can be approximated as pure carbon. These assumptions lead to inaccuracies in the overall simulation. Many factors influence pyrolysis product composition, including parent coal composition, pyrolysis conditions (particle temperature history and heating rate), and others. All of these factors are incorporated into correlations to predict the elemental composition of the major pyrolysis products, including coal tar, char, and light gases.
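The simplest correction to the "volatiles equal char composition" assumption is an elemental mass balance that closes the parent coal over its char and volatile fractions, sketched below with invented compositions (not the paper's correlations).

```python
def char_composition(parent: dict, volatiles: dict, volatile_yield: float) -> dict:
    """Elemental mass balance: parent = y*volatiles + (1-y)*char, solved for
    the char's mass fractions. Compositions here are illustrative only."""
    char_yield = 1.0 - volatile_yield
    return {el: (parent[el] - volatile_yield * volatiles[el]) / char_yield
            for el in parent}

# Hypothetical dry-ash-free mass fractions for a parent coal and its volatiles
parent    = {"C": 0.81, "H": 0.055, "O": 0.10, "N": 0.016, "S": 0.019}
volatiles = {"C": 0.70, "H": 0.090, "O": 0.17, "N": 0.020, "S": 0.020}
print(char_composition(parent, volatiles, volatile_yield=0.45))
```

Even this crude closure shows the qualitative effect the paper targets: char enriches in carbon relative to the parent coal while hydrogen and oxygen leave preferentially with the volatiles.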
Christensen, N.C.; Emery, J.D.; Smith, M.L.
1985-04-29
A system converts from the boundary representation of an object to the constructive solid geometry representation thereof. The system converts the boundary representation of the object into elemental atomic geometrical units or I-bodies which are in the shape of stock primitives or regularized intersections of stock primitives. These elemental atomic geometrical units are then represented in symbolic form. The symbolic representations of the elemental atomic geometrical units are then assembled heuristically to form a constructive solid geometry representation of the object usable for manufacturing thereof. Artificial intelligence is used to determine the best constructive solid geometry representation from the boundary representation of the object. Heuristic criteria are adapted to the manufacturing environment for which the device is to be utilized. The surface finish, tolerance, and other information associated with each surface of the boundary representation of the object are mapped onto the constructive solid geometry representation of the object to produce an enhanced solid geometry representation, particularly useful for computer-aided manufacture of the object. 19 figs.
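A minimal sketch of the symbolic CSG representation such a system assembles might look like the following. The class names and the bracket example are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Primitive:
    """Stock primitive (block, cylinder, ...) with its defining parameters."""
    kind: str
    params: dict

@dataclass
class CSGNode:
    """Regularized boolean operation over two sub-trees."""
    op: str          # "union", "difference", or "intersection"
    left: object
    right: object

# A bracket as a stock block minus a drilled hole -- the kind of symbolic
# assembly built heuristically from elemental atomic units (illustrative)
bracket = CSGNode("difference",
                  Primitive("block", {"x": 100, "y": 50, "z": 10}),
                  Primitive("cylinder", {"r": 5, "h": 10}))
print(bracket.op, bracket.right.kind)
```

Surface attributes such as finish and tolerance would then be mapped onto the corresponding primitive faces, which is the enhancement the patent describes for computer-aided manufacture.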
Rethinking programme evaluation in health professions education: beyond 'did it work?'.
Haji, Faizal; Morin, Marie-Paule; Parker, Kathryn
2013-04-01
For nearly 40 years, outcome-based models have dominated programme evaluation in health professions education. However, there is increasing recognition that these models cannot address the complexities of the health professions context, and studies employing alternative evaluation approaches are appearing in the literature. A similar paradigm shift occurred over 50 years ago in the broader discipline of programme evaluation. Understanding the development of contemporary paradigms within this field provides important insights to support the evolution of programme evaluation in the health professions. In this discussion paper, we review the historical roots of programme evaluation as a discipline, demonstrating parallels with the dominant approach to evaluation in the health professions. In tracing the evolution of contemporary paradigms within this field, we demonstrate how their aim is not only to judge a programme's merit or worth, but also to generate information for curriculum designers seeking to adapt programmes to evolving contexts, and for researchers seeking to generate knowledge to inform the work of others. From this evolution, we distil seven essential elements of educational programmes that should be evaluated to achieve the stated goals. Our formulation is not a prescriptive method for conducting programme evaluation; rather, we use these elements as a guide for developing a holistic 'programme of evaluation' that involves multiple stakeholders, uses a combination of available models and methods, and occurs throughout the life of a programme. These elements thus provide a roadmap for the programme evaluation process, allowing evaluators to move beyond asking whether a programme worked to establishing how it worked, why it worked, and what else happened. By engaging in this process, evaluators will generate a sound understanding of the relationships among programmes, the contexts in which they operate, and the outcomes that result from them. © Blackwell Publishing Ltd 2013.
Variability-induced transition in a net of neural elements: From oscillatory to excitable behavior.
Glatt, Erik; Gassel, Martin; Kaiser, Friedemann
2006-06-01
Starting from an oscillatory net of neural elements, increasing variability induces a phase transition to excitability. This transition is explained by a systematic effect of the variability, which stabilizes the formerly unstable, spatially uniform, temporally constant solution of the net. Multiplicative noise may also influence the net in a systematic way and may thus induce a similar transition. Adding noise to the model, we investigate the interplay of noise and variability with respect to the reported transition. Finally, pattern formation in a diffusively coupled net is studied, since excitability implies the capability for pattern formation and information transmission.
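A minimal numerical sketch of such a net, using diffusively coupled FitzHugh-Nagumo units whose bifurcation parameter varies from element to element, is given below. All parameter values are illustrative, not those of the paper.

```python
import numpy as np

# Diffusively coupled FitzHugh-Nagumo ring. Each unit's parameter a_i is
# drawn around a mean in the oscillatory regime (|a| < 1); increasing the
# variability sigma pushes more units toward the excitable regime (|a| > 1).
rng = np.random.default_rng(0)
N, D, dt, eps = 100, 0.5, 0.01, 0.08
sigma = 0.3                                # variability strength
a = 0.9 + sigma * rng.standard_normal(N)   # heterogeneous element parameter
u = rng.uniform(-1.0, 1.0, N)              # activator variables
v = np.zeros(N)                            # inhibitor variables

for _ in range(20000):
    lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)   # 1-D diffusive coupling
    u += dt * (u - u**3 / 3 - v + D * lap)
    v += dt * eps * (u + a)                        # a_i sets each unit's regime

print("net activity spread:", round(float(u.std()), 3))
```

Sweeping sigma in such a sketch is one way to probe the reported transition: for small sigma the net oscillates collectively, while sufficient heterogeneity stabilizes the uniform rest state.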