Understanding Preservice Teachers' Technology Use through TPACK Framework
ERIC Educational Resources Information Center
Pamuk, S.
2012-01-01
This study discusses preservice teachers' achievement of, and barriers to, technology integration, using principles of technological pedagogical content knowledge (TPACK) as an evaluative framework. Technology-capable participants each freely chose a content area for their project. Data analysis was based on interactions among the core components of TPACK…
Sockolow, Paulina S; Bowles, Kathryn H; Rogers, Michelle
2015-01-01
We assessed the comprehensiveness of the Health Information Technology (HIT) Reference-based Evaluation Framework (HITREF) in two HIT evaluations conducted in settings different from that in which the HITREF was developed. Clinician satisfaction themes that emerged from clinician interviews in the home care and hospital studies were compared to the framework components. Across both studies, respondents commented on 12 of the 20 HITREF components within 5 of the 6 HITREF concepts. No new components emerged that were missing from the HITREF, providing evidence that the HITREF is a comprehensive framework. HITREF use in a range of HIT evaluations by researchers new to the framework demonstrates that it can be used as intended. Therefore, we continue to recommend the HITREF as a comprehensive, research-based HIT evaluation framework to increase informatics evaluators' capacity to use best practice and evidence-based practice, supporting the credibility of their findings in fulfilling the purpose of program evaluation.
Models and frameworks: a synergistic association for developing component-based applications.
Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara
2014-01-01
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.; Law, E.; Yang, C. P.; Djorgovski, S. G.
2016-12-01
ECITE (EarthCube Integration and Testing Environment) provides both cloud-based computational testing resources and an Assessment Framework for Technology Interoperability and Integration. NSF's EarthCube program is funding the development of cyberinfrastructure building-block components as technologies to address Earth science research problems. These EarthCube building blocks need to support integration and interoperability objectives to work towards a coherent cyberinfrastructure architecture for the program. ECITE is being developed to provide capabilities to test and assess interoperability and integration across funded EarthCube technology projects. EarthCube-defined criteria for interoperability and integration are applied to use cases coordinating science problems with technology solutions. The Assessment Framework facilitates planning, execution, and documentation of the technology assessments for review by the EarthCube community. This presentation will describe the components of ECITE and examine the methodology of crosswalking between science and technology use cases.
Evolving Frameworks for Different Communities of Scientists and End Users
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.
2016-12-01
Two evolving frameworks for interdisciplinary science will be described in the context of the Common Data Framework for Earth-Observation Data and the importance of standards and protocols. The Event Data Driven Delivery (ED3) Framework, funded by NASA Applied Sciences, provides the delivery of data based on predetermined subscriptions and associated workflows to various communities of end users. ED3's capabilities are used by scientists, as well as policy and resource managers, when event alerts are triggered to respond to their needs. The EarthCube Integration and Testing Environment (ECITE) Assessment Framework for Technology Interoperability and Integration is being developed to facilitate the EarthCube community's assessment of NSF-funded technologies addressing Earth science problems. ECITE is addressing the translation of geoscience researchers' use cases into technology use cases that apply EarthCube-funded building-block technologies (and other existing technologies) to solving science problems. EarthCube criteria for technology assessment include the use of data, metadata, and service standards to improve interoperability and integration across program components. The long-range benefit will be the growth of a cyberinfrastructure with technology components that have been shown to work together to solve known science objectives.
Klein, Karsten; Wolff, Astrid C; Ziebold, Oliver; Liebscher, Thomas
2008-01-01
The ICW eHealth Framework (eHF) is a powerful infrastructure and platform for the development of service-oriented solutions in the health care business. It is the culmination of many years of experience of ICW in the development and use of in-house health care solutions and represents the foundation of ICW product developments based on the Java Enterprise Edition (Java EE). The ICW eHealth Framework has been leveraged to allow development by external partners - enabling adopters a straightforward integration into ICW solutions. The ICW eHealth Framework consists of reusable software components, development tools, architectural guidelines and conventions defining a full software-development and product lifecycle. From the perspective of a partner, the framework provides services and infrastructure capabilities for integrating applications within an eHF-based solution. This article introduces the ICW eHealth Framework's basic architectural concepts and technologies. It provides an overview of its module and component model, describes the development platform that supports the complete software development lifecycle of health care applications and outlines technological aspects, mainly focusing on application development frameworks and open standards.
Teachers' Learning While Constructing Technology-Based Instructional Resources
ERIC Educational Resources Information Center
Polly, Drew
2011-01-01
Grounded in a constructionist paradigm, this study examined elementary school teachers' learning while creating technology-rich instructional materials. Sixteen teachers at an elementary school were interviewed about their experience. Using the components of Technological Pedagogical and Content Knowledge as an analytical framework, inductive…
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
Distributed Computing Framework for Synthetic Radar Application
NASA Technical Reports Server (NTRS)
Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael
2006-01-01
We are developing an extensible software framework in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the framework elements of the middleware that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, so that a data-flow graph can be devised for the specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech) and now being used as the basis for next-generation radar processing at JPL, is such a Python-based software framework. We have extended the Pyre framework with new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
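The data-flow-graph idea described in this abstract can be sketched in a few lines. The sketch below is a loose analogy, not Pyre's actual API: the `Component` class and the placeholder processing functions are hypothetical, standing in for real radar processing steps.

```python
class Component:
    """A processing step that can be chained to the next step,
    mimicking interchangeable objects in a data-flow graph."""

    def __init__(self, fn):
        self.fn = fn
        self.next = None

    def connect(self, other):
        # Link this component's output to the next component's input.
        self.next = other
        return other

    def run(self, data):
        out = self.fn(data)
        return self.next.run(out) if self.next else out


# Placeholder stages (hypothetical, for illustration only):
range_compress = Component(lambda d: [x * 2 for x in d])
azimuth_focus = Component(sum)
range_compress.connect(azimuth_focus)

print(range_compress.run([1, 2, 3]))  # 12
```

Because the stages are interchangeable objects, reordering or swapping a stage only changes the `connect` calls, not the components themselves.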
SQL Collaborative Learning Framework Based on SOA
NASA Astrophysics Data System (ADS)
Armiati, S.; Awangga, RM
2018-04-01
The research focuses on designing a collaborative learning-oriented framework for fulfilling academic services in teaching SQL on Oracle 10g. The framework builds a foundation for the academic fulfilment services performed by working-unit layers in collaboration with Program Studi Manajemen Informatika. The design phase defines which collaboration models and information technology are proposed for Program Studi Manajemen Informatika, using a collaboration framework inspired by the modelling stages of a Service-Oriented Architecture (SOA). The stages begin with analyzing subsystems, an activity used to determine the subsystems involved, their dependencies, and the workflow between them. After the services are identified, the second phase is designing the component specifications, detailing the components implemented in each service, including the data, rules, services, configurable profiles, and variations. The third stage is allocating services, assigning each service and its components to the subsystems that have been identified. The implementation framework contributes teaching guides and an application architecture that can be used as a basis for realizing improved service by applying information technology.
The research of .NET framework based on delegate of the LCE
NASA Astrophysics Data System (ADS)
Chen, Yi-peng
2011-10-01
Programmers realize the Loosely Coupled Events (LCE) enterprise services provided by the .NET Framework when developing applications in the C# programming language with object-oriented component technology. Much boilerplate code had to be written under traditional program design; nowadays the same result can be achieved simply by adding the corresponding attributes to classes, interfaces, methods, and assemblies through simple declarative programming. This paper mainly expatiates on the mechanism for realizing LCE event services with the delegate model in C#. It also introduces the procedure for applying event classes, event publishers, subscribers, and clients in LCE technology, and analyses the technical points of delegate-based LCE in plain language with practical cases.
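The delegate-based publish/subscribe mechanism this abstract describes can be approximated outside C#. The sketch below is not the paper's code: the `Event` class is a hypothetical stand-in that mimics a C# multicast delegate with a list of callables.

```python
class Event:
    """Holds a list of callables, like a C# multicast delegate."""

    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        # A subscriber registers a callable, as with C#'s += on an event.
        self._handlers.append(handler)

    def publish(self, payload):
        # Firing the event invokes every subscriber in order.
        for handler in self._handlers:
            handler(payload)


received = []
order_shipped = Event()                    # publisher side
order_shipped.subscribe(received.append)   # subscriber/client side
order_shipped.publish("order #42 shipped")
print(received)  # ['order #42 shipped']
```

The loose coupling comes from the publisher knowing nothing about its subscribers beyond the callable signature.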
Diagnostic frameworks and nursing diagnoses: a normative stance.
Zanotti, Renzo; Chiffi, Daniele
2015-01-01
Diagnostic frameworks are essential to many scientific and technological activities and to clinical practice. This study examines the fundamental aspects of such frameworks. The three components required for all diagnoses are identified and examined: their normative dimension, their temporal nature and structure, and their teleological perspective. The normative dimension of a diagnosis is based on (1) epistemic values, when associated with Hempel's inductive risk concerning the balance between false-positive and false-negative outcomes, leading to probabilistic judgements; and (2) non-epistemic values, when related to ideas such as well-being, normality, and illness as idealized norms or ideal points of reference. It should be noted that medical diagnoses match the three necessary components, while some essential diagnostic frameworks in nursing, the taxonomies of Gordon and NANDA, lack some of them. The main gap is normative: the most popular frameworks in nursing diagnosis appear to be descriptions of observed reality rather than normative, value-based judgements in which both epistemic and non-epistemic values may coexist. © 2014 John Wiley & Sons Ltd.
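The balance between false-positive and false-negative outcomes mentioned in this abstract has a standard decision-theoretic reading: the probability threshold for making a diagnosis depends on the relative costs of the two error types. A minimal sketch, with illustrative costs not taken from the study:

```python
def decision_threshold(cost_fp, cost_fn):
    """Probability above which diagnosing minimizes expected cost:
    threshold = C_FP / (C_FP + C_FN)."""
    return cost_fp / (cost_fp + cost_fn)


# A false negative (missed condition) judged four times as costly as a
# false alarm lowers the threshold for making the diagnosis:
t = decision_threshold(cost_fp=1.0, cost_fn=4.0)
print(t)  # 0.2
```

This is where non-epistemic values enter: the cost ratio encodes a judgement about harms, not a fact about the evidence.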
NASA Astrophysics Data System (ADS)
Liang, Likai; Bi, Yushen
Considering the distributed network management system's demands for high distribution, extensibility, and reusability, a framework model for a three-tier distributed network management system based on COM/COM+ and DNA is proposed, adopting software component technology and the N-tier application software framework design idea. We also give the concrete design plan for each layer of this model. Finally, we discuss the internal running process of each layer in the distributed network management system's framework model.
NASA Astrophysics Data System (ADS)
Alseddiqi, M.; Mishra, R.; Pislaru, C.
2012-05-01
The paper presents the results from a quality framework used to measure the effectiveness of a new engineering course entitled 'school-based learning (SBL) to work-based learning (WBL) transition module' in the Technical and Vocational Education (TVE) system in Bahrain. The framework is an extended version of existing information quality frameworks with respect to pedagogical and technological contexts, incorporating specific pedagogical and technological dimensions reflecting the requirements of modern industry in Bahrain. A questionnaire on the effectiveness of the new transition module was distributed to various stakeholders, including TVE teachers and students, to gather critical information for diagnosing, monitoring, and evaluating different views and perceptions of the new module's effectiveness. The analysis categorised the quality dimensions by their relative importance, using the principal component analysis available in SPSS. The analysis clearly identified the most important quality dimensions integrated in the new module for the SBL-to-WBL transition. It was also apparent that the new module addresses workplace proficiencies, prepares TVE students for work placement, provides effective teaching and learning methodologies, integrates innovative technology in the learning process, meets modern industrial needs, and presents a cooperative learning environment for TVE students. From the principal component analysis findings, the percentage of relative importance of each factor and its quality dimensions was calculated; this comparison identifies the most important factor as well as the most important quality dimensions. The re-arranged quality dimensions, with an extended number of factors, refine the extended information quality framework into a revised quality framework.
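The relative-importance percentages described in this abstract are typically obtained by normalizing each principal component's eigenvalue by the total variance. A minimal sketch, with hypothetical eigenvalues and factor names that are not the study's data:

```python
def relative_importance(eigenvalues):
    """Percentage of total variance explained by each component."""
    total = sum(eigenvalues)
    return [100.0 * ev / total for ev in eigenvalues]


# Hypothetical factors and eigenvalues (illustrative only):
factors = {"pedagogical": 3.2, "technological": 1.6,
           "workplace": 0.8, "social": 0.4}
percents = relative_importance(list(factors.values()))
for name, pct in zip(factors, percents):
    print(f"{name}: {pct:.1f}%")  # e.g. pedagogical: 53.3%
```

Comparing these percentages is what lets an analysis rank one factor, and the quality dimensions loading on it, above the others.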
Open architectures for formal reasoning and deductive technologies for software development
NASA Technical Reports Server (NTRS)
Mccarthy, John; Manna, Zohar; Mason, Ian; Pnueli, Amir; Talcott, Carolyn; Waldinger, Richard
1994-01-01
The objective of this project is to develop an open architecture for formal reasoning systems. One goal is to provide a framework with a clear semantic basis for specification and instantiation of generic components; for construction of complex systems by interconnecting components; and for making incremental improvements and tailoring to specific applications. Another goal is to develop methods for specifying component interfaces and interactions to facilitate use of existing and newly built systems as 'off the shelf' components, thus helping bridge the gap between producers and consumers of reasoning systems. In this report we summarize results in several areas: our database of reasoning systems; a theory of binding structures; a theory of components of open systems; a framework for specifying components of open reasoning systems; and an analysis of the integration of rewriting and linear arithmetic modules in Boyer-Moore using the above framework.
ERIC Educational Resources Information Center
Kim, Minkyun; Sharman, Raj; Cook-Cottone, Catherine P.; Rao, H. Raghav; Upadhyaya, Shambhu J.
2012-01-01
Emergency management systems are a critical factor in the successful mitigation of natural and man-made disasters, facilitating responder decision making in complex situations. Based on socio-technical systems, which have four components (people, technology, structure, and task), this study develops a research framework of factors affecting effective…
Teaching of Computer Science Topics Using Meta-Programming-Based GLOs and LEGO Robots
ERIC Educational Resources Information Center
Štuikys, Vytautas; Burbaite, Renata; Damaševicius, Robertas
2013-01-01
The paper's contribution is a methodology that integrates two educational technologies (GLO and LEGO robot) to teach Computer Science (CS) topics at the school level. We present the methodology as a framework of 5 components (pedagogical activities, technology driven processes, tools, knowledge transfer actors, and pedagogical outcomes) and…
ERIC Educational Resources Information Center
Asanok, M.; Kitrakan, P.; Brahmawong, C.
2008-01-01
Newly developed multimedia and web-based technologies have provided opportunities for building multimedia-based collaborative eLearning systems. The development of eLearning systems has started a revolution in instructional content delivery, learning activities, and social communication. Based on various positions on this issue…
Semantics-enabled service discovery framework in the SIMDAT pharma grid.
Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert
2008-03-01
We present the design and implementation of a semantics-enabled service discovery framework in the data Grids for process and product development using numerical simulation and knowledge discovery (SIMDAT) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: the Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and semantic matchmaker based on the ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in the SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.
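The ontology-based matchmaking this abstract describes can be caricatured as a subsumption check: a service matches a request when its advertised output class equals the requested class or is a subclass of it. The toy class hierarchy below is hypothetical, not the SIMDAT ontology, and a real OWL-DL reasoner handles far more than this tree walk.

```python
# Hypothetical single-parent class hierarchy (child -> parent):
SUBCLASS_OF = {
    "ProteinSequence": "Sequence",
    "DnaSequence": "Sequence",
    "Sequence": "BioData",
}


def is_subclass(cls, ancestor):
    """Walk up the hierarchy looking for the ancestor class."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = SUBCLASS_OF.get(cls)
    return False


def match(service_output, requested):
    # A service satisfies a request if its output is subsumed by
    # the requested class.
    return is_subclass(service_output, requested)


print(match("ProteinSequence", "BioData"))  # True
print(match("Sequence", "DnaSequence"))     # False
```

Annotating services against a shared ontology is what makes this check possible across thousands of independently built data and analysis services.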
Shared Repository Framework: Component Specification and Ontology
2008-04-23
Palantir Technologies has created one such software application to support the DoD intelligence community by providing robust capabilities for...managing data from various sources. The Palantir tool is based on user-defined ontologies and supports multiple representation and analysis tools
ERIC Educational Resources Information Center
Yurdakul, Isil Kabakci; Odabasi, Hatice Ferhan; Kilicer, Kerem; Coklar, Ahmet Naci; Birinci, Gurkay; Kurt, Adile Askim
2012-01-01
The purpose of this study is to develop a TPACK (technological pedagogical content knowledge) scale based on the centered component of TPACK framework in order to measure preservice teachers' TPACK. A systematic and step-by-step approach was followed for the development of the scale. The validity and reliability studies of the scale were carried…
Wherton, Joseph; Papoutsi, Chrysanthi; Lynch, Jennifer; Hughes, Gemma; A'Court, Christine; Hinder, Susan; Fahy, Nick; Procter, Rob; Shaw, Sara
2017-01-01
Background Many promising technological innovations in health and social care are characterized by nonadoption or abandonment by individuals or by failed attempts to scale up locally, spread distantly, or sustain the innovation long term at the organization or system level. Objective Our objective was to produce an evidence-based, theory-informed, and pragmatic framework to help predict and evaluate the success of a technology-supported health or social care program. Methods The study had 2 parallel components: (1) secondary research (hermeneutic systematic review) to identify key domains, and (2) empirical case studies of technology implementation to explore, test, and refine these domains. We studied 6 technology-supported programs—video outpatient consultations, global positioning system tracking for cognitive impairment, pendant alarm services, remote biomarker monitoring for heart failure, care organizing software, and integrated case management via data sharing—using longitudinal ethnography and action research for up to 3 years across more than 20 organizations. Data were collected at micro level (individual technology users), meso level (organizational processes and systems), and macro level (national policy and wider context). Analysis and synthesis were aided by sociotechnically informed theories of individual, organizational, and system change. The draft framework was shared with colleagues who were introducing or evaluating other technology-supported health or care programs and refined in response to feedback. Results The literature review identified 28 previous technology implementation frameworks, of which 14 had taken a dynamic systems approach (including 2 integrative reviews of previous work). Our empirical dataset consisted of over 400 hours of ethnographic observation, 165 semistructured interviews, and 200 documents.
The final nonadoption, abandonment, scale-up, spread, and sustainability (NASSS) framework included questions in 7 domains: the condition or illness, the technology, the value proposition, the adopter system (comprising professional staff, patient, and lay caregivers), the organization(s), the wider (institutional and societal) context, and the interaction and mutual adaptation between all these domains over time. Our empirical case studies raised a variety of challenges across all 7 domains, each classified as simple (straightforward, predictable, few components), complicated (multiple interacting components or issues), or complex (dynamic, unpredictable, not easily disaggregated into constituent components). Programs characterized by complicatedness proved difficult but not impossible to implement. Those characterized by complexity in multiple NASSS domains rarely, if ever, became mainstreamed. The framework showed promise when applied (both prospectively and retrospectively) to other programs. Conclusions Subject to further empirical testing, NASSS could be applied across a range of technological innovations in health and social care. It has several potential uses: (1) to inform the design of a new technology; (2) to identify technological solutions that (perhaps despite policy or industry enthusiasm) have a limited chance of achieving large-scale, sustained adoption; (3) to plan the implementation, scale-up, or rollout of a technology program; and (4) to explain and learn from program failures. PMID:29092808
Characterizing the reliability of a bioMEMS-based cantilever sensor
NASA Astrophysics Data System (ADS)
Bhalerao, Kaustubh D.
2004-12-01
The cantilever-based BioMEMS sensor represents one instance from many competing ideas of biosensor technology based on Micro Electro Mechanical Systems. The advancement of BioMEMS from laboratory-scale experiments to applications in the field will require standardization of their components and manufacturing procedures as well as frameworks to evaluate their performance. Reliability, the likelihood with which a system performs its intended task, is a compact mathematical description of its performance. The mathematical and statistical foundation of systems-reliability has been applied to the cantilever-based BioMEMS sensor. The sensor is designed to detect one aspect of human ovarian cancer, namely the over-expression of the folate receptor surface protein (FR-alpha). Even as the application chosen is clinically motivated, the objective of this study was to demonstrate the underlying systems-based methodology used to design, develop and evaluate the sensor. The framework development can be readily extended to other BioMEMS-based devices for disease detection and will have an impact in the rapidly growing $30 bn industry. The Unified Modeling Language (UML) is a systems-based framework for design and development of object-oriented information systems which has potential application for use in systems designed to interact with biological environments. The UML has been used to abstract and describe the application of the biosensor, to identify key components of the biosensor, and the technology needed to link them together in a coherent manner. The use of the framework is also demonstrated in computation of system reliability from first principles as a function of the structure and materials of the biosensor. The outcomes of applying the systems-based framework to the study are the following: (1) Characterizing the cantilever-based MEMS device for disease (cell) detection. 
(2) Developing a novel chemical interface between the analyte and the sensor that provides a degree of selectivity towards the disease. (3) Demonstrating the performance and measuring the reliability of the biosensor prototype. (4) Identifying opportunities for technological development to further refine the proposed biosensor. Applying the methodology to design, develop, and evaluate the reliability of BioMEMS devices will be beneficial in streamlining the growth of the BioMEMS industry, while providing a decision-support tool for comparing and adopting suitable technologies from the available competing options.
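Reliability, as this abstract notes, has a compact mathematical form. For illustration: a series chain of components (all must work) multiplies reliabilities, while redundant parallel components combine as one minus the product of their failure probabilities. The component names and numbers below are hypothetical, not measurements from the study.

```python
from math import prod


def series_reliability(rs):
    # All components must work: R = product of R_i
    return prod(rs)


def parallel_reliability(rs):
    # System works if at least one component works: R = 1 - product of (1 - R_i)
    return 1 - prod(1 - r for r in rs)


# Hypothetical reliabilities for a sensor chain:
# chemical interface -> cantilever -> readout electronics
chain = [0.95, 0.99, 0.98]
print(round(series_reliability(chain), 4))            # 0.9217

# Duplicating the readout stage as a redundant pair:
print(round(parallel_reliability([0.98, 0.98]), 4))   # 0.9996
```

Series composition shows why the weakest stage dominates system reliability, and why redundancy is applied to the least reliable component first.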
Bridging Technometric Method and Innovation Process: An Initial Study
NASA Astrophysics Data System (ADS)
Rumanti, A. A.; Reynaldo, R.; Samadhi, T. M. A. A.; Wiratmadja, I. I.; Dwita, A. C.
2018-03-01
The innovation process is one of the ways used to increase the capability of technology components reflecting the needs of SMEs. The technometric method can be used to identify the level of technology advancement in an SME, and which technology component needs to be maximized in order to deliver significant innovation. This paper serves as an early study, laying out a conceptual framework that identifies and elaborates the principles of the innovation process from a well-established innovation model by Martin, together with the technometric method, based on initial background research conducted at the SME Ira Silver in Jogjakarta, Indonesia.
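One common technometric formulation (assumed here; the paper may use a variant) aggregates four THIO components, technoware, humanware, inforware, and orgaware, into a Technology Contribution Coefficient: TCC = T^bt * H^bh * I^bi * O^bo, a weighted geometric mean with intensity weights summing to one. The scores and weights below are hypothetical:

```python
def tcc(T, H, I, O, bt, bh, bi, bo):
    """Technology Contribution Coefficient: weighted geometric mean of
    the THIO component scores (each in (0, 1]), weights summing to 1."""
    return (T ** bt) * (H ** bh) * (I ** bi) * (O ** bo)


# Hypothetical SME component scores and intensity weights:
score = tcc(T=0.6, H=0.5, I=0.4, O=0.3,
            bt=0.35, bh=0.30, bi=0.20, bo=0.15)
print(f"TCC = {score:.3f}")
```

Because the geometric mean is dragged down by the smallest score, the weakest component (here orgaware, the hypothetical 0.3) is the natural target for innovation effort.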
Central Component Descriptors for Levels of Technological Pedagogical Content Knowledge
ERIC Educational Resources Information Center
Niess, Margaret L.
2013-01-01
Technological pedagogical content knowledge (TPACK) proposes a theoretical framework that incorporates four central components: an overarching conception of what it means to teach with technology, knowledge of students' thinking and understandings of specific topics with technologies, knowledge of curricular materials that incorporate…
Rajarathinam, Vetrickarthick; Chellappa, Swarnalatha; Nagarajan, Asha
2015-01-01
This study of a component framework reveals the importance of management process and technology mapping in a business environment. We define ERP as a software tool that has to provide a business solution but is not necessarily an integration of all departments. Any business process can be classified as a management process, an operational process, or a supportive process. We have gone through the entire management process and were able to identify the influencing components to be mapped to a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need to map these components to the ERP is clearly explained. We also suggest that implementing this framework might reduce ERP failures and, in particular, rectify the problem of ERP misfit.
Sockolow, P S; Crawford, P R; Lehmann, H P
2012-01-01
Our forthcoming national experiment in increased health information technology (HIT) adoption, funded by the American Recovery and Reinvestment Act of 2009, will require a comprehensive approach to evaluating HIT. The quality of evaluation studies of HIT to date reveals the need for broader evaluation frameworks, as current shortcomings limit the generalizability of findings and the depth of lessons learned. Our objective was to develop an informatics evaluation framework for HIT that integrates components of health services research (HSR) evaluation and informatics evaluation to address identified shortcomings in available HIT evaluation frameworks. A systematic literature review updated and expanded the exhaustive review by Ammenwerth and deKeizer (AdK). From the retained studies, criteria were elicited and organized into classes within a framework. The resulting Health Information Technology Research-based Evaluation Framework (HITREF) was used to guide clinician satisfaction survey construction, multi-dimensional analysis of data, and interpretation of findings in an evaluation of a vanguard community health care EHR. The updated review identified 128 electronic health record (EHR) evaluation studies and seven evaluation criteria not in AdK: EHR Selection/Development/Training; Patient Privacy Concerns; Unintended Consequences/Benefits; Functionality; Patient Satisfaction with EHR; Barriers/Facilitators to Adoption; and Patient Satisfaction with Care. HITREF was used productively and was a complete evaluation framework, including all themes that emerged. We recommend that future EHR evaluators consider adding a complete, research-based HIT evaluation framework, such as HITREF, to their suite of evaluation tools to monitor HIT challenges as the federal government strives to increase HIT adoption.
Study on Web-Based Tool for Regional Agriculture Industry Structure Optimization Using Ajax
NASA Astrophysics Data System (ADS)
Huang, Xiaodong; Zhu, Yeping
Given the state of research on regional agriculture industry structure adjustment information systems and current developments in information technology, this paper takes a web-based tool for regional agriculture industry structure optimization as its research target. It introduces Ajax technology and related application frameworks to build an auxiliary toolkit of a decision support system for agricultural policy makers and economic researchers. The toolkit includes a “one page” style component for regional agriculture industry structure optimization, which provides an agile argument-setting method supporting sensitivity analysis and the use of data and comparative advantage analysis results, and a component that solves the linear programming model and its dual problem by the simplex method.
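The linear programming component the abstract mentions can be illustrated with a toy crop-allocation problem. This sketch is purely illustrative (the objective, constraints, and numbers are invented, and it is not the toolkit's simplex code): with only two decision variables, the optimum of an LP lies at a vertex of the feasible polygon, so we enumerate constraint intersections rather than implement a full simplex method.

```python
from itertools import combinations

# Toy LP: maximize profit 3x + 5y (x, y = hectares of two crops)
# subject to land and labor limits. Each row (a, b, c) means a*x + b*y <= c.
constraints = [
    (1, 1, 10),    # land:  x + y <= 10
    (1, 3, 18),    # labor: x + 3y <= 18
    (-1, 0, 0),    # x >= 0
    (0, -1, 0),    # y >= 0
]

def intersect(r1, r2):
    """Solve the 2x2 system where both constraints hold with equality."""
    (a1, b1, c1), (a2, b2, c2) = r1, r2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel constraint boundaries
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in constraints)

# Candidate vertices = feasible intersections of constraint boundaries.
vertices = [p for r1, r2 in combinations(constraints, 2)
            if (p := intersect(r1, r2)) and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 5 * p[1])
print(best, 3 * best[0] + 5 * best[1])  # -> (6.0, 4.0) 38.0
```

A production tool would use a proper simplex implementation (as the paper describes), which also yields the dual solution as a by-product of the final tableau.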
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gribok, Andrei; Patnaik, Sobhan; Williams, Christian
This report describes the current state of research related to critical aspects of erosion and selected aspects of degradation of secondary components in nuclear power plants. The report also proposes a framework for online health monitoring of aging and degradation of secondary components. The framework consists of an integrated multi-sensor modality system which can be used to monitor different piping configurations under different degradation conditions. The report analyses the currently known degradation mechanisms and available predictive models. Based on this analysis, the structural health monitoring framework is proposed. The Light Water Reactor Sustainability Program began to evaluate technologies that could be used to perform online monitoring of piping and other secondary system structural components in commercial NPPs. These online monitoring systems have the potential to identify when a more detailed inspection is needed using real-time measurements, rather than at a pre-determined inspection interval. This transition to condition-based, risk-informed automated maintenance will contribute to a significant reduction of operations and maintenance costs that account for the majority of nuclear power generation costs. There is unanimous agreement between industry experts and academic researchers that identifying and prioritizing inspection locations in secondary piping systems (for example, in raw water piping or diesel piping) would eliminate many excessive in-service inspections. The proposed structural health monitoring framework takes aim at answering this challenge by combining long-range guided wave technologies with other monitoring techniques, which can significantly increase the inspection length and pinpoint the locations that have degraded the most. More widely, the report suggests research efforts aimed at developing, validating, and deploying online corrosion monitoring techniques for complex geometries, which are pervasive in NPPs.
Lobelo, Felipe; Kelli, Heval M.; Tejedor, Sheri Chernetsky; Pratt, Michael; McConnell, Michael V.; Martin, Seth S.; Welk, Gregory J.
2017-01-01
Physical activity (PA) interventions constitute a critical component of cardiovascular disease (CVD) risk reduction programs. Objective mobile health (mHealth) software applications (apps) and wearable activity monitors (WAMs) can advance both assessment and integration of PA counseling in clinical settings and support community-based PA interventions. The use of mHealth technology for CVD risk reduction is promising, but integration into routine clinical care and population health management has proven challenging. The increasing diversity of available technologies and the lack of a comprehensive guiding framework are key barriers for standardizing data collection and integration. This paper reviews the validity, utility and feasibility of implementing mHealth technology in clinical settings and proposes an organizational framework to support PA assessment, counseling and referrals to community resources for CVD risk reduction interventions. This integration framework can be adapted to different clinical population needs. It should also be refined as technologies and regulations advance under an evolving health care system landscape in the United States and globally. PMID:26923067
Framework for a clinical information system.
Van De Velde, R; Lansiers, R; Antonissen, G
2002-01-01
The design and implementation of a Clinical Information System architecture is presented. The architecture has been developed and implemented from components, following a strong underlying conceptual and technological model. Common Object Request Broker and n-tier technology are used, with centralised and departmental clinical information systems serving as the back-end store for all clinical data. Servers located in the "middle" tier apply the clinical (business) model and application rules. The main characteristics are a focus on modelling and on reuse of both data and business logic. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for this approach.
Greenhalgh, Trisha; Wherton, Joseph; Papoutsi, Chrysanthi; Lynch, Jennifer; Hughes, Gemma; A'Court, Christine; Hinder, Susan; Fahy, Nick; Procter, Rob; Shaw, Sara
2017-11-01
Many promising technological innovations in health and social care are characterized by nonadoption or abandonment by individuals or by failed attempts to scale up locally, spread distantly, or sustain the innovation long term at the organization or system level. Our objective was to produce an evidence-based, theory-informed, and pragmatic framework to help predict and evaluate the success of a technology-supported health or social care program. The study had 2 parallel components: (1) secondary research (hermeneutic systematic review) to identify key domains, and (2) empirical case studies of technology implementation to explore, test, and refine these domains. We studied 6 technology-supported programs (video outpatient consultations, global positioning system tracking for cognitive impairment, pendant alarm services, remote biomarker monitoring for heart failure, care organizing software, and integrated case management via data sharing) using longitudinal ethnography and action research for up to 3 years across more than 20 organizations. Data were collected at micro level (individual technology users), meso level (organizational processes and systems), and macro level (national policy and wider context). Analysis and synthesis were aided by sociotechnically informed theories of individual, organizational, and system change. The draft framework was shared with colleagues who were introducing or evaluating other technology-supported health or care programs and refined in response to feedback. The literature review identified 28 previous technology implementation frameworks, of which 14 had taken a dynamic systems approach (including 2 integrative reviews of previous work). Our empirical dataset consisted of over 400 hours of ethnographic observation, 165 semistructured interviews, and 200 documents.
The final nonadoption, abandonment, scale-up, spread, and sustainability (NASSS) framework included questions in 7 domains: the condition or illness, the technology, the value proposition, the adopter system (comprising professional staff, patient, and lay caregivers), the organization(s), the wider (institutional and societal) context, and the interaction and mutual adaptation between all these domains over time. Our empirical case studies raised a variety of challenges across all 7 domains, each classified as simple (straightforward, predictable, few components), complicated (multiple interacting components or issues), or complex (dynamic, unpredictable, not easily disaggregated into constituent components). Programs characterized by complicatedness proved difficult but not impossible to implement. Those characterized by complexity in multiple NASSS domains rarely, if ever, became mainstreamed. The framework showed promise when applied (both prospectively and retrospectively) to other programs. Subject to further empirical testing, NASSS could be applied across a range of technological innovations in health and social care. It has several potential uses: (1) to inform the design of a new technology; (2) to identify technological solutions that (perhaps despite policy or industry enthusiasm) have a limited chance of achieving large-scale, sustained adoption; (3) to plan the implementation, scale-up, or rollout of a technology program; and (4) to explain and learn from program failures. ©Trisha Greenhalgh, Joseph Wherton, Chrysanthi Papoutsi, Jennifer Lynch, Gemma Hughes, Christine A'Court, Susan Hinder, Nick Fahy, Rob Procter, Sara Shaw. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 01.11.2017.
NASA Astrophysics Data System (ADS)
Yang, C.; Zheng, W.; Zhang, M.; Yuan, T.; Zhuang, G.; Pan, Y.
2016-06-01
Real-time measurement and control of the plasma are critical for advanced Tokamak operation and require high-speed real-time data acquisition and processing. ITER has designed the Fast Plant System Controller (FPSC) for these purposes. At the J-TEXT Tokamak, a real-time data acquisition and processing framework has been designed and implemented using standard ITER FPSC technologies. The main hardware components of this framework are an Industrial Personal Computer (IPC) running a real-time system and FPGA-based FlexRIO devices. With FlexRIO devices, data can be processed by the FPGA in real time before being passed to the CPU. The software elements are based on a real-time framework that runs under Red Hat Enterprise Linux MRG-R and uses the Experimental Physics and Industrial Control System (EPICS) for monitoring and configuration, which keeps the framework in accord with standard ITER FPSC technology. With this framework, any kind of data acquisition and processing FlexRIO FPGA program can be configured with an FPSC. An application using the framework has been implemented for the polarimeter-interferometer diagnostic system on J-TEXT. The application extracts phase-shift information from the intermediate-frequency signal produced by the diagnostic system and calculates the plasma density profile in real time. Different algorithm implementations on the FlexRIO FPGA are compared in the paper.
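The phase extraction the abstract describes can be sketched as standard IQ demodulation: mix the intermediate-frequency signal with reference cosine and sine, accumulate, and recover the phase with atan2. This is a minimal host-side sketch with invented parameters (frequency, sampling rate, and phase are assumptions, not J-TEXT values); the real system performs this on the FlexRIO FPGA in real time.

```python
import math

f_if = 1.0e6          # intermediate frequency in Hz (assumed)
fs = 50.0e6           # sampling rate in Hz (assumed)
true_phase = 0.7      # radians; the phase shift we want to recover
n = 5000              # an integer number of IF periods, so ripple cancels

# Synthetic IF signal carrying the phase shift.
signal = [math.cos(2 * math.pi * f_if * k / fs + true_phase) for k in range(n)]

# Mix with reference cos/sin and accumulate (the low-pass step).
i_acc = sum(s * math.cos(2 * math.pi * f_if * k / fs)
            for k, s in enumerate(signal))
q_acc = sum(s * -math.sin(2 * math.pi * f_if * k / fs)
            for k, s in enumerate(signal))

phase = math.atan2(q_acc, i_acc)   # recovered phase shift in radians
print(round(phase, 3))
```

On an FPGA the same structure becomes two multiply-accumulate pipelines plus a CORDIC arctangent, which is one reason the paper compares different FPGA implementations of the algorithm.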
ENVIRONMENTAL IMPACT ASSESSMENT OF A HEALTH TECHNOLOGY: A SCOPING REVIEW.
Polisena, Julie; De Angelis, Gino; Kaunelis, David; Gutierrez-Ibarluzea, Iñaki
2018-06-13
The Health Technology Expert Review Panel is an advisory body to Canadian Agency for Drugs and Technologies in Health (CADTH) that develops recommendations on health technology assessments (HTAs) for nondrug health technologies using a deliberative framework. The framework spans several domains, including the environmental impact of the health technology(ies). Our research objective was to identify articles on frameworks, methods or case studies on the environmental impact assessment of health technologies. A literature search in major databases and a focused gray literature search were conducted. The main search concepts were HTA and environmental impact/sustainability. Eligible articles were those that described a conceptual framework or methods used to conduct an environmental assessment of health technologies, and case studies on the application of an environmental assessment. From the 1,710 citations identified, thirteen publications were included. Two articles presented a framework to incorporate environmental assessment in HTAs. Other approaches described weight of evidence practices and comprehensive and integrated environmental impact assessments. Central themes derived include transparency and repeatability, integration of components in a framework or of evidence into a single outcome, data availability to ensure the accuracy of findings, and familiarity with the approach used. Each framework and methods presented have different foci related to the ecosystem, health economics, or engineering practices. Their descriptions suggested transparency, repeatability, and the integration of components or of evidence into a single outcome as their main strengths. Our review is an initial step of a larger initiative by CADTH to develop the methods and processes to address the environmental impact question in an HTA.
Mohr, David C; Schueller, Stephen M; Montague, Enid; Burns, Michelle Nicole; Rashidi, Parisa
2014-01-01
A growing number of investigators have commented on the lack of models to inform the design of behavioral intervention technologies (BITs). BITs, which include a subset of mHealth and eHealth interventions, employ a broad range of technologies, such as mobile phones, the Web, and sensors, to support users in changing behaviors and cognitions related to health, mental health, and wellness. We propose a model that conceptually defines BITs, from the clinical aim to the technological delivery framework. The BIT model defines both the conceptual and technological architecture of a BIT. Conceptually, a BIT model should answer the questions why, what, how (conceptual and technical), and when. While BITs generally have a larger treatment goal, such goals generally consist of smaller intervention aims (the "why") such as promotion or reduction of specific behaviors, and behavior change strategies (the conceptual "how"), such as education, goal setting, and monitoring. Behavior change strategies are instantiated with specific intervention components or “elements” (the "what"). The characteristics of intervention elements may be further defined or modified (the technical "how") to meet the needs, capabilities, and preferences of a user. Finally, many BITs require specification of a workflow that defines when an intervention component will be delivered. The BIT model includes a technological framework (BIT-Tech) that can integrate and implement the intervention elements, characteristics, and workflow to deliver the entire BIT to users over time. This implementation may be either predefined or include adaptive systems that can tailor the intervention based on data from the user and the user’s environment. The BIT model provides a step towards formalizing the translation of developer aims into intervention components, larger treatments, and methods of delivery in a manner that supports research and communication between investigators on how to design, develop, and deploy BITs. 
PMID:24905070
Software Framework for Peer Data-Management Services
NASA Technical Reports Server (NTRS)
Hughes, John; Hardman, Sean; Crichton, Daniel; Hyon, Jason; Kelly, Sean; Tran, Thuy
2007-01-01
Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.
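The peer-to-peer search pattern the abstract describes (one peer transparently querying data managed by another) can be sketched in a few lines. This is a hypothetical in-process illustration, not the actual OODT API; the class names, metadata keys, and product IDs are invented.

```python
class Peer:
    """A site that manages its own product catalog and knows other peers."""

    def __init__(self, name):
        self.name = name
        self.catalog = {}      # product id -> metadata dict
        self.neighbors = []    # other Peer instances

    def add_product(self, pid, **metadata):
        self.catalog[pid] = metadata

    def query(self, key, value, _seen=None):
        """Return (peer, product) matches from this peer and its neighbors."""
        seen = _seen if _seen is not None else set()
        if self.name in seen:          # avoid re-visiting peers in a cycle
            return []
        seen.add(self.name)
        hits = [(self.name, pid) for pid, md in self.catalog.items()
                if md.get(key) == value]
        for peer in self.neighbors:    # forward the query to known peers
            hits.extend(peer.query(key, value, seen))
        return hits

jpl = Peer("jpl")
gsfc = Peer("gsfc")
jpl.neighbors.append(gsfc)
gsfc.neighbors.append(jpl)
jpl.add_product("img-001", instrument="camera", target="mars")
gsfc.add_product("spec-042", instrument="spectrometer", target="mars")

# A single query at one site reaches both peers' holdings.
print(jpl.query("target", "mars"))
```

In the real framework the forwarding happens over Web protocols between sites, but the shape of the interaction (local match plus delegated remote matches merged into one result) is the same.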
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Roy, Surajit; Hirt, Evelyn H.
2014-09-12
This report describes research results to date in support of the integration and demonstration of diagnostics technologies for prototypical AdvSMR passive components (to establish condition indices for monitoring) with model-based prognostics methods. The focus of the PHM methodology and algorithm development in this study is at the localized scale. Multiple localized measurements of material condition (using advanced nondestructive measurement methods), along with available measurements of the stressor environment, enhance the performance of localized diagnostics and prognostics of passive AdvSMR components and systems.
Recon2Neo4j: applying graph database technologies for managing comprehensive genome-scale networks.
Balaur, Irina; Mazein, Alexander; Saqi, Mansoor; Lysenko, Artem; Rawlings, Christopher J; Auffray, Charles
2017-04-01
The goal of this work is to offer a computational framework for exploring data from the Recon2 human metabolic reconstruction model. Advanced user access features have been developed using the Neo4j graph database technology and this paper describes key features such as efficient management of the network data, examples of the network querying for addressing particular tasks, and how query results are converted back to the Systems Biology Markup Language (SBML) standard format. The Neo4j-based metabolic framework facilitates exploration of highly connected and comprehensive human metabolic data and identification of metabolic subnetworks of interest. A Java-based parser component has been developed to convert query results (available in the JSON format) into SBML and SIF formats in order to facilitate further results exploration, enhancement or network sharing. The Neo4j-based metabolic framework is freely available from https://diseaseknowledgebase.etriks.org/metabolic/browser/. The Java code files developed for this work are available from https://github.com/ibalaur/MetabolicFramework. Contact: ibalaur@eisbm.org. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
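The JSON-to-SIF conversion step the abstract describes (performed there by a Java parser) reduces to mapping each returned edge to a tab-separated `source  interaction  target` line. A minimal Python sketch, with invented metabolite names and a simplified result shape that stands in for the actual query output:

```python
import json

# Hypothetical JSON query result: one object per relationship in the graph.
results_json = """
[
  {"source": "glucose",    "interaction": "substrate_of", "target": "hexokinase"},
  {"source": "hexokinase", "interaction": "produces",     "target": "g6p"}
]
"""

def json_to_sif(text):
    """Convert a JSON list of edges to SIF: one tab-separated edge per line."""
    rows = json.loads(text)
    return "\n".join(f"{r['source']}\t{r['interaction']}\t{r['target']}"
                     for r in rows)

sif = json_to_sif(results_json)
print(sif)
```

SBML output is the harder half of the conversion, since reactions, species, and compartments must be reassembled into a valid XML model rather than flat edges; that is where a dedicated parser component earns its keep.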
ERIC Educational Resources Information Center
Bayram, Servet
2005-01-01
The concept of Electronic Performance Support Systems (EPSS) encompasses multimedia and computer-based instruction components that improve human performance by providing process simplification, performance information, and decision support. EPSS has become a hot topic for organizational development, human resources, performance technology,…
DEVELOPMENTS IN VALUE FRAMEWORKS TO INFORM THE ALLOCATION OF HEALTHCARE RESOURCES.
Oortwijn, Wija; Sampietro-Colom, Laura; Habens, Fay
2017-01-01
In recent years, there has been a surge in the development of frameworks to assess the value of different types of health technologies to inform healthcare resource allocation. The reasons for, and the potential of, these value frameworks were discussed during the 2017 Health Technology Assessment International (HTAi) Policy Forum Meeting. This study reflects the discussion, drawing on presentations from invited experts and Policy Forum members, as well as a background paper. The reasons given for a proliferation of value frameworks included: rising healthcare costs; more complex health technology; perceived disconnect between price and value in some cases; changes in societal values; the need for inclusion of additional considerations, such as ethical issues; and greater empowerment of clinicians and patients in defining and using value frameworks. Many Policy Forum participants recommended learning from existing frameworks. Furthermore, there was a desire to agree on the core components of value frameworks, defining the additional value elements as necessary and considering how they might be measured and used in practice. Furthermore, adherence to the principles of transparency, predictability, broad stakeholder involvement, and accountability were widely supported, along with being forward looking, explicit, and consistent across decisions. Value frameworks continue to evolve with significant implications for global incentives for innovation and access to health technologies. There is a role for the HTA community to address some of the key areas discussed during the meeting, such as defining the core components for assessing the value of a health technology.
Towards a Decision Support System for Space Flight Operations
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Hogle, Charles; Ruszkowski, James
2013-01-01
The Mission Operations Directorate (MOD) at the Johnson Space Center (JSC) has put in place a Model Based Systems Engineering (MBSE) technological framework for the development and execution of the Flight Production Process (FPP). This framework has provided much added value and return on investment to date. This paper describes a vision for a model based Decision Support System (DSS) for the development and execution of the FPP and its design and development process. The envisioned system extends the existing MBSE methodology and technological framework which is currently in use. The MBSE technological framework currently in place enables the systematic collection and integration of data required for building an FPP model for a diverse set of missions. This framework includes the technology, people and processes required for rapid development of architectural artifacts. It is used to build a feasible FPP model for the first flight of spacecraft and for recurrent flights throughout the life of the program. This model greatly enhances our ability to effectively engage with a new customer. It provides a preliminary work breakdown structure, data flow information and a master schedule based on its existing knowledge base. These artifacts are then refined and iterated upon with the customer for the development of a robust end-to-end, high-level integrated master schedule and its associated dependencies. The vision is to enhance this framework to enable its application for uncertainty management, decision support and optimization of the design and execution of the FPP by the program. Furthermore, this enhanced framework will enable the agile response and redesign of the FPP based on observed system behavior. 
The discrepancy between the anticipated system behavior and the observed behavior may be due to the processing of tasks internally, or to external factors such as changes in program requirements or conditions associated with organizations outside of MOD. The paper provides a roadmap for the three increments of this vision: (1) hardware and software system components and interfaces with the NASA ground system, (2) uncertainty management, and (3) re-planning and automated execution. Each of these increments provides value independently, but some may also enable the building of a subsequent increment.
Design of material management system of mining group based on Hadoop
NASA Astrophysics Data System (ADS)
Xia, Zhiyuan; Tan, Zhuoying; Qi, Kuan; Li, Wen
2018-01-01
Against the background of the current persistent slowdown in the mining market, improving the management level of a mining group has become the key to improving the economic benefit of a mine. Following the practical material-management needs of a mining group, three core components of Hadoop are applied: the distributed file system HDFS, the distributed computing framework Map/Reduce, and the distributed database HBase. A material management system for the mining group is constructed with these three core components and SSH framework technology. The system strengthens collaboration between the mining group and its affiliated companies and addresses problems of traditional mining material-management systems such as inefficient management, server pressure, and hardware performance deficiencies; it thereby optimizes the group's materials management, reduces management costs, and increases enterprise profit.
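The Map/Reduce programming model the system relies on can be illustrated in-process. This is a sketch of the model only, with invented material records; on a real Hadoop cluster the map and reduce phases would be sharded across nodes and the shuffle/sort handled by the framework.

```python
from collections import defaultdict

# Invented material-consumption records from affiliated mines:
# (mine, material, quantity)
records = [
    ("mine_a", "explosive", 120),
    ("mine_b", "explosive", 80),
    ("mine_a", "drill_bit", 35),
    ("mine_b", "drill_bit", 15),
]

def map_phase(recs):
    """Map: emit a (material, quantity) pair for every record."""
    for _mine, material, qty in recs:
        yield material, qty

def reduce_phase(pairs):
    """Reduce: sum quantities per material (shuffle/sort is implicit here)."""
    totals = defaultdict(int)
    for material, qty in pairs:
        totals[material] += qty
    return dict(totals)

# Group-wide totals per material, computed from per-mine records.
print(reduce_phase(map_phase(records)))
```

The same per-key aggregation, expressed as Hadoop mapper and reducer classes, is what lets the group consolidate material data from many affiliated companies without funneling everything through one server.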
A Component-based Programming Model for Composite, Distributed Applications
NASA Technical Reports Server (NTRS)
Eidson, Thomas M.; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
The nature of scientific programming is evolving to larger, composite applications that are composed of smaller element applications. These composite applications are more frequently being targeted for distributed, heterogeneous networks of computers. They are most likely programmed by a group of developers. Software component technology and computational frameworks are being proposed and developed to meet the programming requirements of these new applications. Historically, programming systems have had a hard time being accepted by the scientific programming community. In this paper, a programming model is outlined that attempts to organize the software component concepts and fundamental programming entities into programming abstractions that will be better understood by the application developers. The programming model is designed to support computational frameworks that manage many of the tedious programming details, but also that allow sufficient programmer control to design an accurate, high-performance application.
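The component-and-framework relationship the paper organizes can be sketched abstractly: element applications are wrapped behind a uniform interface so a framework can compose them without knowing their internals. This is a hypothetical illustration; the class names and the toy mesher/solver stages are invented, not the paper's programming model.

```python
class Component:
    """Wraps an element application behind a uniform run() interface."""

    def __init__(self, name, compute):
        self.name = name
        self.compute = compute   # the wrapped element application

    def run(self, data):
        return self.compute(data)

class Framework:
    """Handles the tedious detail (here, just wiring outputs to inputs),
    while the developer keeps control of what each component does."""

    def __init__(self):
        self.pipeline = []

    def add(self, component):
        self.pipeline.append(component)

    def execute(self, data):
        for comp in self.pipeline:
            data = comp.run(data)
        return data

fw = Framework()
fw.add(Component("mesher", lambda d: {**d, "cells": d["resolution"] ** 2}))
fw.add(Component("solver", lambda d: {**d, "residual": 1.0 / d["cells"]}))
result = fw.execute({"resolution": 10})
print(result["cells"], result["residual"])
```

Real computational frameworks add distribution, heterogeneity, and concurrency on top of this skeleton, which is exactly the detail the paper argues a good programming model should hide from application developers.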
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing, and integrating aerospace design and analysis models; and development of a distributed infrastructure that enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.
IMAGINE: Interstellar MAGnetic field INference Engine
NASA Astrophysics Data System (ADS)
Steininger, Theo
2018-03-01
IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.
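The Bayesian comparison the entry describes (judging the mismatch between measured data and a parametric model prediction) can be sketched with a one-parameter toy model evaluated on a grid. Everything here is invented for illustration (the linear "model", the synthetic data, and the noise level); IMAGINE itself uses nested sampling over far richer Galactic field models.

```python
import math

# Synthetic (x, measured y) data with Gaussian noise sigma; the "model"
# predicts y = b * x for a single free parameter b.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
sigma = 0.3

def log_likelihood(b):
    """Gaussian log-likelihood of the mismatch between data and model."""
    return sum(-0.5 * ((y - b * x) / sigma) ** 2 for x, y in data)

# Flat prior on [1.5, 2.5): evaluate the posterior on a grid and take
# its mean (a sampler like MultiNest does this far more efficiently).
grid = [1.5 + 0.001 * i for i in range(1000)]
weights = [math.exp(log_likelihood(b)) for b in grid]
posterior_mean = sum(b * w for b, w in zip(grid, weights)) / sum(weights)
print(round(posterior_mean, 2))
```

The grid becomes hopeless as the parameter count grows, which is why frameworks of this kind lean on samplers such as MultiNest rather than exhaustive evaluation.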
Planning and Management of Real-Time Geospatialuas Missions Within a Virtual Globe Environment
NASA Astrophysics Data System (ADS)
Nebiker, S.; Eugster, H.; Flückiger, K.; Christen, M.
2011-09-01
This paper presents the design and development of a hardware and software framework supporting all phases of typical monitoring and mapping missions with mini and micro UAVs (unmanned aerial vehicles). The developed solution combines state-of-the-art collaborative virtual globe technologies with advanced geospatial imaging techniques and wireless data link technologies supporting the combined and highly reliable transmission of digital video, high-resolution still imagery and mission control data over extended operational ranges. The framework enables the planning, simulation, control and real-time monitoring of UAS missions in application areas such as monitoring of forest fires, agronomical research, border patrol or pipeline inspection. The geospatial components of the project are based on the virtual globe technology i3D OpenWebGlobe of the Institute of Geomatics Engineering at the University of Applied Sciences Northwestern Switzerland (FHNW). i3D OpenWebGlobe is a high-performance 3D geovisualisation engine supporting the web-based streaming of very large amounts of terrain and POI data.
Onyx-Advanced Aeropropulsion Simulation Framework Created
NASA Technical Reports Server (NTRS)
Reed, John A.
2001-01-01
The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.
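The dynamic component replacement that the Onyx architecture enables can be illustrated with a minimal sketch. The component names and numbers below are hypothetical, and the sketch is in Python rather than Onyx's Java; it only mirrors the idea of swapping components behind a common interface.

```python
class EngineComponent:
    """Common interface every simulation component implements, so one
    component can be swapped for another with different behavior."""
    def performance(self, inlet_temp_k):
        raise NotImplementedError

class SimpleCompressor(EngineComponent):
    def performance(self, inlet_temp_k):
        return inlet_temp_k * 1.8  # crude fixed-ratio temperature rise

class DetailedCompressor(EngineComponent):
    def performance(self, inlet_temp_k):
        # hypothetical higher-fidelity correlation with a small correction term
        return inlet_temp_k * 1.8 + 0.01 * inlet_temp_k

class EngineSimulation:
    def __init__(self):
        self.components = {}
    def replace(self, name, component):
        """Dynamically swap in a component, as Onyx's architecture allows."""
        self.components[name] = component
    def run(self, inlet_temp_k):
        temp = inlet_temp_k
        for component in self.components.values():
            temp = component.performance(temp)
        return temp

sim = EngineSimulation()
sim.replace("compressor", SimpleCompressor())
baseline = sim.run(288.0)
sim.replace("compressor", DetailedCompressor())  # same interface, new model
refined = sim.run(288.0)
print(baseline, refined)
```

Because both compressors satisfy the same interface, the simulation driver never changes when fidelity is upgraded, which is the payoff of the design-pattern approach the abstract describes.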
Shingleton, Rebecca M.; Palfai, Tibor P.
2015-01-01
Objectives The aims of this paper were to describe and evaluate the methods and efficacy of technology-delivered motivational interviewing interventions (TAMIs), discuss the challenges and opportunities of TAMIs, and provide a framework for future research. Methods We reviewed studies that reported using motivational interviewing (MI) based components delivered via technology and conducted ratings on technology description, comprehensiveness of MI, and study methods. Results The majority of studies were fully-automated and included at least one form of media rich technology to deliver the TAMI. Few studies provided complete descriptions of how MI components were delivered via technology. Of the studies that isolated the TAMI effects, positive changes were reported. Conclusion Researchers have used a range of technologies to deliver TAMIs suggesting feasibility of these methods. However, there are limited data regarding their efficacy, and strategies to deliver relational components remain a challenge. Future research should better characterize the components of TAMIs, empirically test the efficacy of TAMIs with randomized controlled trials, and incorporate fidelity measures. Practice Implications TAMIs are feasible to implement and well accepted. These approaches offer considerable potential to reduce costs, minimize therapist and training burden, and expand the range of clients that may benefit from adaptations of MI. PMID:26298219
Development of software for computing forming information using a component based approach
NASA Astrophysics Data System (ADS)
Ko, Kwang Hee; Park, Jiing Seo; Kim, Jung; Kim, Young Bum; Shin, Jong Gye
2009-12-01
In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and accordingly productivity has increased drastically. Despite such improvement in manufacturing technology, however, development of an automatic system for fabricating a curved hull plate remains at the beginning stage, since hardware and software for the automation of the curved hull fabrication process must be developed differently depending on the dimensions of plates, forming methods and manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a "plug-in" framework which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular, the software module for computing fabrication information is developed by using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.
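A "plug-in" framework of the kind proposed can be sketched as a registry that binds yard-specific forming modules behind one stable interface. The module names and formulas below are invented for illustration; the abstract does not describe the actual ooCBD components at this level of detail.

```python
class FormingFrameworkRegistry:
    """Plug-in registry: shipyard-specific forming modules register under a
    name and are looked up through one stable interface, so hardware and
    software can vary per yard without changing the framework core."""
    def __init__(self):
        self._plugins = {}
    def register(self, name, compute_fn):
        self._plugins[name] = compute_fn
    def forming_info(self, name, plate):
        return self._plugins[name](plate)

# Two hypothetical forming methods with yard-specific computations
def line_heating(plate):
    # invented rule: one heating pass per 10 degrees of curvature
    return {"method": "line heating", "passes": max(1, plate["curvature_deg"] // 10)}

def cold_press(plate):
    # invented rule: press force scales with plate thickness
    return {"method": "cold press", "force_tons": plate["thickness_mm"] * 5}

registry = FormingFrameworkRegistry()
registry.register("line_heating", line_heating)
registry.register("cold_press", cold_press)

plate = {"curvature_deg": 35, "thickness_mm": 20}
print(registry.forming_info("line_heating", plate))
```

Adding support for a new yard's forming method is then a single `register` call, with no change to the code that consumes forming information.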
An intervention fidelity framework for technology-based behavioral interventions.
Devito Dabbs, Annette; Song, Mi-Kyung; Hawkins, Robert; Aubrecht, Jill; Kovach, Karen; Terhorst, Lauren; Connolly, Mary; McNulty, Mary; Callan, Judith
2011-01-01
Despite the proliferation of health technologies, descriptions of the unique considerations and practical guidance for evaluating the intervention fidelity of technology-based behavioral interventions are lacking. The aims of this study were to (a) discuss how technology-based behavioral interventions challenge conventions about how intervention fidelity is conceptualized and evaluated, (b) propose an intervention fidelity framework that may be more appropriate for technology-based behavioral interventions, and (c) present a plan for operationalizing each concept in the framework using the intervention fidelity monitoring plan for Pocket PATH (Personal Assistant for Tracking Health), a mobile health technology designed to promote self-care behaviors after lung transplantation, as an exemplar. The literature related to intervention fidelity and technology acceptance was used to identify the issues that are unique to the fidelity of technology-based behavioral interventions and thus important to include in a proposed intervention fidelity framework. An intervention fidelity monitoring plan for technology-based behavioral interventions was developed as an example. The intervention fidelity monitoring plan was deemed feasible and practical to implement and showed utility in operationalizing the concepts such as assessing interventionists' delivery and participants' acceptance of the technology-based behavioral intervention. The framework has the potential to guide the development of implementation fidelity monitoring tools for other technology-based behavioral interventions. Further application and testing of this framework will allow for a better understanding of the role that technology acceptance plays in the adoption and enactment of the behaviors that technology-based behavioral interventions are intended to promote.
The Dairy Technology System in Venezuela. Summary of Research 79.
ERIC Educational Resources Information Center
Nieto, Ruben D.; Henderson, Janet L.
A study examined the agricultural technology system in Venezuela with emphasis on the dairy industry. An analytical framework was used to identify the strengths and weaknesses of the following components of Venezuela's agricultural technology system: policy, technology development, technology transfer, and technology use. Selected government…
Technology Assessment for Powertrain Components Final Report CRADA No. TC-1124-95
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tokarz, F.; Gough, C.
LLNL utilized its defense technology assessment methodologies in combination with its capabilities in energy, manufacturing, and transportation technologies to demonstrate a methodology that synthesized available but incomplete information on advanced automotive technologies into a comprehensive framework.
Koutkias, Vassilis; Stalidis, George; Chouvarda, Ioanna; Lazou, Katerina; Kilintzis, Vassilis; Maglaveras, Nicos
2009-01-01
Adverse Drug Events (ADEs) are currently considered as a major public health issue, endangering patients' safety and causing significant healthcare costs. Several research efforts are currently concentrating on the reduction of preventable ADEs by employing Information Technology (IT) solutions, which aim to provide healthcare professionals and patients with relevant knowledge and decision support tools. In this context, we present a knowledge engineering approach towards the construction of a Knowledge-based System (KBS) regarded as the core part of a CDSS (Clinical Decision Support System) for ADE prevention, all developed in the context of the EU-funded research project PSIP (Patient Safety through Intelligent Procedures in Medication). In the current paper, we present the knowledge sources considered in PSIP and the implications they pose to knowledge engineering, the methodological approach followed, as well as the components defining the knowledge engineering framework based on relevant state-of-the-art technologies and representation formalisms.
An open-source framework for large-scale, flexible evaluation of biomedical text mining systems.
Baumgartner, William A; Cohen, K Bretonnel; Hunter, Lawrence
2008-01-29
Improved evaluation methodologies have been identified as a necessary prerequisite to the improvement of text mining theory and practice. This paper presents a publicly available framework that facilitates thorough, structured, and large-scale evaluations of text mining technologies. The extensibility of this framework and its ability to uncover system-wide characteristics by analyzing component parts as well as its usefulness for facilitating third-party application integration are demonstrated through examples in the biomedical domain. Our evaluation framework was assembled using the Unstructured Information Management Architecture. It was used to analyze a set of gene mention identification systems involving 225 combinations of system, evaluation corpus, and correctness measure. Interactions between all three were found to affect the relative rankings of the systems. A second experiment evaluated gene normalization system performance using as input 4,097 combinations of gene mention systems and gene mention system-combining strategies. Gene mention system recall is shown to affect gene normalization system performance much more than does gene mention system precision, and high gene normalization performance is shown to be achievable with remarkably low levels of gene mention system precision. The software presented in this paper demonstrates the potential for novel discovery resulting from the structured evaluation of biomedical language processing systems, as well as the usefulness of such an evaluation framework for promoting collaboration between developers of biomedical language processing technologies. The code base is available as part of the BioNLP UIMA Component Repository on SourceForge.net.
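The finding that system, evaluation corpus, and correctness measure all interact to change relative rankings can be reproduced in miniature with a toy evaluation harness. The confusion counts below are invented; the real framework is assembled on UIMA and runs over actual gene mention systems.

```python
def precision(tp, fp, fn):
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp, fp, fn):
    return tp / (tp + fn) if tp + fn else 0.0

# Hypothetical confusion counts: system -> corpus -> (tp, fp, fn)
results = {
    "tagger_A": {"corpus1": (80, 5, 20), "corpus2": (60, 2, 40)},
    "tagger_B": {"corpus1": (90, 30, 10), "corpus2": (85, 40, 15)},
}

def rank(corpus, measure):
    """Rank systems on one (corpus, correctness-measure) combination."""
    scores = {s: measure(*counts[corpus]) for s, counts in results.items()}
    return sorted(scores, key=scores.get, reverse=True)

# The relative ranking flips with the correctness measure, as in the paper
print(rank("corpus1", precision))  # tagger_A first: fewer false positives
print(rank("corpus1", recall))     # tagger_B first: fewer false negatives
```

Running every system against every corpus and measure, rather than reporting a single headline number, is what exposes these interactions; that systematic sweep is what the framework's 225 and 4,097 combination experiments automate.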
An open-source framework for large-scale, flexible evaluation of biomedical text mining systems
Baumgartner, William A; Cohen, K Bretonnel; Hunter, Lawrence
2008-01-01
Background Improved evaluation methodologies have been identified as a necessary prerequisite to the improvement of text mining theory and practice. This paper presents a publicly available framework that facilitates thorough, structured, and large-scale evaluations of text mining technologies. The extensibility of this framework and its ability to uncover system-wide characteristics by analyzing component parts as well as its usefulness for facilitating third-party application integration are demonstrated through examples in the biomedical domain. Results Our evaluation framework was assembled using the Unstructured Information Management Architecture. It was used to analyze a set of gene mention identification systems involving 225 combinations of system, evaluation corpus, and correctness measure. Interactions between all three were found to affect the relative rankings of the systems. A second experiment evaluated gene normalization system performance using as input 4,097 combinations of gene mention systems and gene mention system-combining strategies. Gene mention system recall is shown to affect gene normalization system performance much more than does gene mention system precision, and high gene normalization performance is shown to be achievable with remarkably low levels of gene mention system precision. Conclusion The software presented in this paper demonstrates the potential for novel discovery resulting from the structured evaluation of biomedical language processing systems, as well as the usefulness of such an evaluation framework for promoting collaboration between developers of biomedical language processing technologies. The code base is available as part of the BioNLP UIMA Component Repository on SourceForge.net. PMID:18230184
A beamline systems model for Accelerator-Driven Transmutation Technology (ADTT) facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Todd, A.M.M.; Paulson, C.C.; Peacock, M.A.
1995-10-01
A beamline systems code, that is being developed for Accelerator-Driven Transmutation Technology (ADTT) facility trade studies, is described. The overall program is a joint Grumman, G.H. Gillespie Associates (GHGA) and Los Alamos National Laboratory effort. The GHGA Accelerator Systems Model (ASM) has been adopted as the framework on which this effort is based. Relevant accelerator and beam transport models from earlier Grumman systems codes are being adapted to this framework. Preliminary physics and engineering models for each ADTT beamline component have been constructed. Examples noted include a Bridge Coupled Drift Tube Linac (BCDTL) and the accelerator thermal system. A decision has been made to confine the ASM framework principally to beamline modeling, while detailed target/blanket, balance-of-plant and facility costing analysis will be performed externally. An interfacing external balance-of-plant and facility costing model, which will permit the performance of iterative facility trade studies, is under separate development. An ABC (Accelerator Based Conversion) example is used to highlight the present models and capabilities.
A beamline systems model for Accelerator-Driven Transmutation Technology (ADTT) facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Todd, Alan M. M.; Paulson, C. C.; Peacock, M. A.
1995-09-15
A beamline systems code, that is being developed for Accelerator-Driven Transmutation Technology (ADTT) facility trade studies, is described. The overall program is a joint Grumman, G. H. Gillespie Associates (GHGA) and Los Alamos National Laboratory effort. The GHGA Accelerator Systems Model (ASM) has been adopted as the framework on which this effort is based. Relevant accelerator and beam transport models from earlier Grumman systems codes are being adapted to this framework. Preliminary physics and engineering models for each ADTT beamline component have been constructed. Examples noted include a Bridge Coupled Drift Tube Linac (BCDTL) and the accelerator thermal system. A decision has been made to confine the ASM framework principally to beamline modeling, while detailed target/blanket, balance-of-plant and facility costing analysis will be performed externally. An interfacing external balance-of-plant and facility costing model, which will permit the performance of iterative facility trade studies, is under separate development. An ABC (Accelerator Based Conversion) example is used to highlight the present models and capabilities.
The Instrumental Value of Conceptual Frameworks in Educational Technology Research
ERIC Educational Resources Information Center
Antonenko, Pavlo D.
2015-01-01
Scholars from diverse fields and research traditions agree that the conceptual framework is a critically important component of disciplined inquiry. Yet, there is a pronounced lack of shared understanding regarding the definition and functions of conceptual frameworks, which impedes our ability to design effective research and mentor novice…
Communications and radar-supported transportation operations and planning : final report.
DOT National Transportation Integrated Search
2017-03-01
This project designs a conceptual framework to harness and mature wireless technology to improve : transportation safety, with a focus on frontal collision warning/collision avoidance (CW/CA) systems. The : framework identifies components of the tech...
Open-Source, Web-Based Dashboard Components for DICOM Connectivity.
Bustamante, Catalina; Pineda, Julian; Rascovsky, Simon; Arango, Andres
2016-08-01
The administration of a DICOM network within an imaging healthcare institution requires tools that allow for monitoring of connectivity and availability for adequate uptime measurements and help guide technology management strategies. We present the implementation of an open-source widget for the Dashing framework that provides basic dashboard functionality allowing for monitoring of a DICOM network using network "ping" and DICOM "C-ECHO" operations.
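The uptime measurement such a dashboard supports reduces to aggregating periodic ping and C-ECHO check results. A minimal sketch follows; the check-log format is hypothetical, and the real widget runs inside the Dashing framework and issues live network operations rather than reading a static list.

```python
def uptime_percentage(checks):
    """Compute uptime from a log of periodic connectivity checks. Each check
    records whether the network ping and the DICOM C-ECHO (a DICOM-level
    'are you alive?' verification) both succeeded; a node counts as up only
    when both do."""
    if not checks:
        return 0.0
    up = sum(1 for c in checks if c["ping_ok"] and c["c_echo_ok"])
    return 100.0 * up / len(checks)

# Hypothetical check log for one PACS node, one entry per polling interval
log = [
    {"ping_ok": True,  "c_echo_ok": True},
    {"ping_ok": True,  "c_echo_ok": False},  # host reachable, DICOM service down
    {"ping_ok": True,  "c_echo_ok": True},
    {"ping_ok": False, "c_echo_ok": False},
]
print(uptime_percentage(log))  # 50.0
```

Separating ping from C-ECHO matters for the management strategies the abstract mentions: a failed ping points at the network, while a successful ping with a failed C-ECHO points at the DICOM application itself.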
Collaborative environments for capability-based planning
NASA Astrophysics Data System (ADS)
McQuay, William K.
2005-05-01
Distributed collaboration is an emerging technology for the 21st century that will significantly change how business is conducted in the defense and commercial sectors. Collaboration involves two or more geographically dispersed entities working together to create a "product" by sharing and exchanging data, information, and knowledge. A product is defined broadly to include, for example, writing a report, creating software, designing hardware, or implementing robust systems engineering and capability planning processes in an organization. Collaborative environments provide the framework and integrate models, simulations, domain specific tools, and virtual test beds to facilitate collaboration between the multiple disciplines needed in the enterprise. The Air Force Research Laboratory (AFRL) is conducting a leading-edge program in developing distributed collaborative technologies targeted to the Air Force's implementation of systems engineering for simulation-aided acquisition and capability-based planning. The research is focusing on the open systems agent-based framework, product and process modeling, structural architecture, and the integration technologies - the glue to integrate the software components. In the past four years, two live assessment events have been conducted to demonstrate the technology in support of research for the Air Force Agile Acquisition initiatives. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities conduct business.
A distributed component framework for science data product interoperability
NASA Technical Reports Server (NTRS)
Crichton, D.; Hughes, S.; Kelly, S.; Hardman, S.
2000-01-01
Correlation of science results from multi-disciplinary communities is a difficult task. Traditionally data from science missions is archived in proprietary data systems that are not interoperable. The Object Oriented Data Technology (OODT) task at the Jet Propulsion Laboratory is working on building a distributed product server as part of a distributed component framework to allow heterogeneous data systems to communicate and share scientific results.
The Mining Minds digital health and wellness framework.
Banos, Oresti; Bilal Amin, Muhammad; Ali Khan, Wajahat; Afzal, Muhammad; Hussain, Maqbool; Kang, Byeong Ho; Lee, Sungyong
2016-07-15
The provision of health and wellness care is undergoing an enormous transformation. A key element of this revolution consists in prioritizing prevention and proactivity based on the analysis of people's conducts and the empowerment of individuals in their self-management. Digital technologies are unquestionably destined to be the main engine of this change, with an increasing number of domain-specific applications and devices commercialized every year; however, there is an apparent lack of frameworks capable of orchestrating and intelligently leveraging all the data, information and knowledge generated through these systems. This work presents Mining Minds, a novel framework that builds on the core ideas of the digital health and wellness paradigms to enable the provision of personalized support. Mining Minds embraces some of the most prominent digital technologies, ranging from Big Data and Cloud Computing to Wearables and Internet of Things, as well as modern concepts and methods, such as context-awareness, knowledge bases or analytics, to holistically and continuously investigate people's lifestyles and provide a variety of smart coaching and support services. This paper comprehensively describes the efficient and rational combination and interoperation of these technologies and methods through Mining Minds, while meeting the essential requirements posed by a framework for personalized health and wellness support. Moreover, this work presents a realization of the key architectural components of Mining Minds, as well as various exemplary user applications and expert tools to illustrate some of the potential services supported by the proposed framework. Mining Minds constitutes an innovative holistic means to inspect human behavior and provide personalized health and wellness support. The principles behind this framework uncover new research ideas and may serve as a reference for similar initiatives.
Steps toward improving ethical evaluation in health technology assessment: a proposed framework.
Assasi, Nazila; Tarride, Jean-Eric; O'Reilly, Daria; Schwartz, Lisa
2016-06-06
While evaluation of ethical aspects in health technology assessment (HTA) has gained much attention during the past years, the integration of ethics in HTA practice still presents many challenges. In response to the increasing demand for expansion of health technology assessment (HTA) methodology to include ethical issues more systematically, this article reports on a multi-stage study that aimed at construction of a framework for improving the integration of ethics in HTA. The framework was developed through the following phases: 1) a systematic review and content analysis of guidance documents for ethics in HTA; 2) identification of factors influencing the integration of ethical considerations in HTA; 3) preparation of an action-oriented framework based on the key elements of the existing guidance documents and identified barriers to and facilitators of their implementation; and 4) expert consultation and revision of the framework. The proposed framework consists of three main components: an algorithmic flowchart, which exhibits the different steps of an ethical inquiry throughout the HTA process, including: defining the objectives and scope of the evaluation, stakeholder analysis, assessing organizational capacity, framing ethical evaluation questions, ethical analysis, deliberation, and knowledge translation; a stepwise guide, which focuses on the task objectives and potential questions that are required to be addressed at each step; and a list of some commonly recommended or used tools to help facilitate the evaluation process. The proposed framework can be used to support and promote good practice in integration of ethics into HTA. However, further validation of the framework through case studies and expert consultation is required to establish its utility for HTA practice.
Langer, Dominik; van 't Hoff, Marcel; Keller, Andreas J; Nagaraja, Chetan; Pfäffli, Oliver A; Göldi, Maurice; Kasper, Hansjörg; Helmchen, Fritjof
2013-04-30
Intravital microscopy such as in vivo imaging of brain dynamics is often performed with custom-built microscope setups controlled by custom-written software to meet specific requirements. Continuous technological advancement in the field has created a need for new control software that is flexible enough to support the biological researcher with innovative imaging techniques and provide the developer with a solid platform for quickly and easily implementing new extensions. Here, we introduce HelioScan, a software package written in LabVIEW, as a platform serving this dual role. HelioScan is designed as a collection of components that can be flexibly assembled into microscope control software tailored to the particular hardware and functionality requirements. Moreover, HelioScan provides a software framework, within which new functionality can be implemented in a quick and structured manner. A specific HelioScan application assembles at run-time from individual software components, based on user-definable configuration files. Due to its component-based architecture, HelioScan can exploit synergies of multiple developers working in parallel on different components in a community effort. We exemplify the capabilities and versatility of HelioScan by demonstrating several in vivo brain imaging modes, including camera-based intrinsic optical signal imaging for functional mapping of cortical areas, standard two-photon laser-scanning microscopy using galvanometric mirrors, and high-speed in vivo two-photon calcium imaging using either acousto-optic deflectors or a resonant scanner. We recommend HelioScan as a convenient software framework for the in vivo imaging community. Copyright © 2013 Elsevier B.V. All rights reserved.
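HelioScan's run-time assembly of an application from user-definable configuration files can be mirrored with a small sketch. The component names here are invented, and HelioScan itself is written in LabVIEW, not Python; this only illustrates the configuration-driven composition pattern.

```python
# Registry of available microscope-control components (hypothetical names)
COMPONENTS = {
    "galvo_scanner":    lambda: "scanning with galvanometric mirrors",
    "resonant_scanner": lambda: "scanning with a resonant scanner",
    "camera":           lambda: "camera-based intrinsic optical imaging",
}

def assemble(config):
    """Build an application at run-time from a configuration, the way
    HelioScan assembles itself from its configuration files."""
    return [COMPONENTS[name] for name in config["components"]]

# A user-definable configuration selecting the hardware actually present
config = {"components": ["resonant_scanner", "camera"]}
app = assemble(config)
for component in app:
    print(component())
```

Because the set of active components lives in configuration rather than code, adapting the software to a differently equipped microscope means editing a file, not rebuilding the application, which is what lets multiple developers contribute components independently.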
Teaching technological innovation and entrepreneurship in polymeric biomaterials.
Washburn, Newell R
2011-01-01
A model for incorporating an entrepreneurship module has been developed in an upper-division and graduate-level engineering elective on Polymeric Biomaterials (27-311/42-311/27-711/42-711) at Carnegie Mellon University. A combination of lectures, assignments, and a team-based project were used to provide students with a framework for applying their technical skills in the development of new technologies and a basic understanding of the issues related to translational research and technology commercialization. The specific approach to the project established in the course, which represented 20% of the students' grades, and the grading rubric for each of the milestones are described along with suggestions for generalizing this approach to different applications of biomaterials or other engineering electives. Incorporating this model of entrepreneurship into electives teaches students course content within the framework of technological innovation and many of the concepts and tools needed to practice it. For students with situational or individual interest in the project, it would also serve to deepen their understanding of the traditional course components as well as provide a foundation for integrating technological innovation and lifelong learning. Copyright © 2010 Wiley Periodicals, Inc.
Teaching Technological Innovation and Entrepreneurship in Polymeric Biomaterials
Washburn, Newell R.
2010-01-01
A model for incorporating an entrepreneurship module has been developed in an upper-division and graduate-level engineering elective on Polymeric Biomaterials (27-311/42-311/27-711/42-711) at Carnegie Mellon University. A combination of lectures, assignments, and a team-based project were used to provide students with a framework for applying their technical skills in the development of new technologies and a basic understanding of the issues related to translational research and technology commercialization. The specific approach to the project established in the course, which represented 20% of the students' grades, and the grading rubric for each of the milestones are described along with suggestions for generalizing this approach to different applications of biomaterials or other engineering electives. Incorporating this model of entrepreneurship into electives teaches students course content within the framework of technological innovation and many of the concepts and tools needed to practice it. For students with situational or individual interest in the project, it would also serve to deepen their understanding of the traditional course components as well as provide a foundation for integrating technological innovation and lifelong learning. PMID:20949575
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gribok, Andrei V.; Agarwal, Vivek
This paper describes the current state of research related to critical aspects of erosion and selected aspects of degradation of secondary components in nuclear power plants (NPPs). The paper also proposes a framework for online health monitoring of aging and degradation of secondary components. The framework consists of an integrated multi-sensor modality system, which can be used to monitor different piping configurations under different degradation conditions. The report analyses the currently known degradation mechanisms and available predictive models. Based on this analysis, the structural health monitoring framework is proposed. The Light Water Reactor Sustainability Program began to evaluate technologies that could be used to perform online monitoring of piping and other secondary system structural components in commercial NPPs. These online monitoring systems have the potential to identify when a more detailed inspection is needed using real time measurements, rather than at a pre-determined inspection interval. This transition to condition-based, risk-informed automated maintenance will contribute to a significant reduction of operations and maintenance costs that account for the majority of nuclear power generation costs. Furthermore, of the operations and maintenance costs in U.S. plants, approximately 80% are labor costs. To address the issue of rising operating costs and economic viability, in 2017, companies that operate the national nuclear energy fleet started the Delivering the Nuclear Promise Initiative, which is a 3-year program aimed at maintaining operational focus, increasing value, and improving efficiency. There is unanimous agreement between industry experts and academic researchers that identifying and prioritizing inspection locations in secondary piping systems (for example, in raw water piping or diesel piping) would eliminate many excessive in-service inspections.
The proposed structural health monitoring framework takes aim at answering this challenge by combining long range guided wave technologies with other monitoring techniques, which can significantly increase the inspection length and pinpoint the locations that degraded the most. More widely, the report suggests research efforts aimed at developing, validating, and deploying online corrosion monitoring techniques for complex geometries, which are pervasive in NPPs.
Methodology Evaluation Framework for Component-Based System Development.
ERIC Educational Resources Information Center
Dahanayake, Ajantha; Sol, Henk; Stojanovic, Zoran
2003-01-01
Explains component-based development (CBD) for distributed information systems and presents an evaluation framework, which highlights the extent to which a methodology is component oriented. Compares prominent CBD methods, discusses ways of modeling, and suggests that this is a first step towards a components-oriented systems development…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobos, Peter Holmes; Walker, La Tonya Nicole; Malczynski, Leonard A.
People save for retirement throughout their careers because it is virtually impossible to save all you'll need in retirement the year before you retire. Similarly, without installing incremental amounts of clean fossil, renewable, or transformative energy technologies throughout the coming decades, a radical and immediate change will be near impossible the year before a policy goal is set to be in place. Therefore, our research question is: "To meet our desired technical and policy goals, what are the factors that affect the rate we must install technology to achieve these goals in the coming decades?" Existing models do not include full regulatory constraints because their often complex and inflexible approaches solve for optimal engineering rather than robust, multidisciplinary solutions. This project outlines the theory and then develops an applied software tool to model the laboratory-to-market transition using the traditional technology readiness level (TRL) framework, extending it with a novel regulatory readiness level (RRL) and market readiness level (MRL). This tool uses the well-suited system dynamics framework to incorporate feedbacks and time delays. Future energy-economic-environment models, regardless of their programming platform, may adapt this software model component framework or module to further vet the likelihood of a new or innovative technology moving through the laboratory, regulatory, and market space. The prototype analytical framework and tool, called the Technology, Regulatory and Market Readiness Level simulation model (TRMsim), illustrates the interaction between technology research, application, policy, and market dynamics as they relate to a new or innovative technology moving from the theoretical stage to full market deployment.
The initial results that illustrate the model's capabilities indicate, for a hypothetical technology, that increasing the key driver behind each of the TRL, RRL, and MRL components individually decreases the time required for the technology to progress through each component by 63%, 68%, and 64%, respectively. Therefore, under the current working assumptions, to decrease the time it may take for a technology to move from the conceptual stage to full-scale market adoption, one might consider expending additional effort to secure regulatory approval and reducing the uncertainty of the technology's demand in the marketplace.
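The staged readiness-level dynamics described above can be sketched in a few lines of system dynamics style code. The sketch below is a minimal illustration only; the function names, rates, gating assumptions, and 0.9 readiness threshold are all hypothetical, not TRMsim's actual equations.

```python
# Hypothetical readiness-level progression sketch (illustrative assumptions
# only, not the TRMsim model): three stocks in [0, 1] advance by Euler
# integration, with regulatory (RRL) progress gated by technology maturity
# (TRL) and market progress (MRL) gated by regulatory approval.

def simulate(trl_rate=0.08, rrl_rate=0.06, mrl_rate=0.05, dt=0.25, horizon=400.0):
    """Return the time at which MRL first exceeds 0.9 (or the horizon)."""
    trl = rrl = mrl = 0.0
    t = 0.0
    while mrl < 0.9 and t < horizon:
        trl += trl_rate * (1.0 - trl) * dt          # technology maturation
        rrl += rrl_rate * trl * (1.0 - rrl) * dt    # approval gated by TRL
        mrl += mrl_rate * rrl * (1.0 - mrl) * dt    # adoption gated by RRL
        t += dt
    return t

baseline = simulate()
accelerated = simulate(trl_rate=0.16)  # doubling the hypothetical TRL driver
```

Doubling the hypothetical TRL driver shortens the time to reach high market readiness, mirroring the kind of sensitivity experiment the abstract reports (though not its specific percentages).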
Framework for a space shuttle main engine health monitoring system
NASA Technical Reports Server (NTRS)
Hawman, Michael W.; Galinaitis, William S.; Tulpule, Sharayu; Mattedi, Anita K.; Kamenetz, Jeffrey
1990-01-01
A framework developed for a health management system (HMS) directed at improving the safety of operation of the Space Shuttle Main Engine (SSME) is summarized. An emphasis was placed on near-term technology through requirements to use existing SSME instrumentation and to demonstrate the HMS during SSME ground tests within five years. The HMS framework was developed through an analysis of SSME failure modes, fault detection algorithms, sensor technologies, and hardware architectures. A key feature of the HMS framework design is that a clear path from the ground test system to a flight HMS was maintained. Fault detection techniques based on time series, nonlinear regression, and clustering algorithms were developed and demonstrated on data from SSME ground test failures. The fault detection algorithms exhibited 100 percent detection of faults, had an extremely low false alarm rate, and were robust to sensor loss. These algorithms were incorporated into a hierarchical decision-making strategy for overall assessment of SSME health. A preliminary design was developed for a hardware architecture capable of supporting real-time operation of the HMS functions. Utilizing modular, commercial off-the-shelf components produced a reliable, low-cost design with the flexibility to incorporate advances in algorithm and sensor technology as they become available.
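A residual-threshold detector of the general kind described (time-series prediction plus an alarm on large deviations) can be sketched as follows. This is an illustrative stand-in, not the SSME algorithms; the window data and the threshold multiplier `k` are assumptions.

```python
# Illustrative fault detector (not the actual SSME algorithms): fit a linear
# trend to a short window of sensor samples, extrapolate one step ahead, and
# flag a fault when the new sample deviates by more than k residual sigmas.
import statistics

def detect_fault(window, new_value, k=4.0):
    """Return True if new_value departs from the window's linear trend."""
    n = len(window)
    xs = range(n)
    mean_x = (n - 1) / 2
    mean_y = sum(window) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, window)) / sum(
        (x - mean_x) ** 2 for x in xs)
    predicted = mean_y + slope * (n - mean_x)       # one-step extrapolation
    resid_sd = statistics.pstdev(
        y - (mean_y + slope * (x - mean_x)) for x, y in zip(xs, window)) or 1e-9
    return abs(new_value - predicted) > k * resid_sd

healthy = detect_fault([1.0, 1.1, 0.9, 1.05, 0.95, 1.0], 1.02)  # within noise
```

A value consistent with the recent trend is not flagged, while a sudden large excursion (e.g. a reading of 2.0 against the same window) trips the threshold.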
NASA Technical Reports Server (NTRS)
Deardorff, Glenn; Djomehri, M. Jahed; Freeman, Ken; Gambrel, Dave; Green, Bryan; Henze, Chris; Hinke, Thomas; Hood, Robert; Kiris, Cetin; Moran, Patrick;
2001-01-01
A series of NASA presentations for the Supercomputing 2001 conference are summarized. The topics include: (1) Mars Surveyor Landing Sites "Collaboratory"; (2) Parallel and Distributed CFD for Unsteady Flows with Moving Overset Grids; (3) IP Multicast for Seamless Support of Remote Science; (4) Consolidated Supercomputing Management Office; (5) Growler: A Component-Based Framework for Distributed/Collaborative Scientific Visualization and Computational Steering; (6) Data Mining on the Information Power Grid (IPG); (7) Debugging on the IPG; (8) Debakey Heart Assist Device; (9) Unsteady Turbopump for Reusable Launch Vehicle; (10) Exploratory Computing Environments Component Framework; (11) OVERSET Computational Fluid Dynamics Tools; (12) Control and Observation in Distributed Environments; (13) Multi-Level Parallelism Scaling on NASA's Origin 1024 CPU System; (14) Computing, Information, & Communications Technology; (15) NAS Grid Benchmarks; (16) IPG: A Large-Scale Distributed Computing and Data Management System; and (17) ILab: Parameter Study Creation and Submission on the IPG.
Ubiquitous Robotic Technology for Smart Manufacturing System.
Wang, Wenshan; Zhu, Xiaoxiao; Wang, Liyu; Qiu, Qiang; Cao, Qixin
2016-01-01
As manufacturing tasks become more individualized and more flexible, the machines in a smart factory are required to perform variable tasks collaboratively without reprogramming. This paper, for the first time, discusses the similarity between smart manufacturing systems and ubiquitous robotic systems and makes an effort to deploy ubiquitous robotic technology in the smart factory. Specifically, a component-based framework is proposed to enable the communication and cooperation of heterogeneous robotic devices. Further, compared to the service robotic domain, smart manufacturing systems are often larger in scale, so a hierarchical planning method was implemented to improve planning efficiency. A smart factory test bed is developed. It demonstrates that the proposed framework is suitable for the industrial domain, and the hierarchical planning method is able to solve large problems intractable with flat methods.
Ubiquitous Robotic Technology for Smart Manufacturing System
Zhu, Xiaoxiao; Wang, Liyu; Qiu, Qiang; Cao, Qixin
2016-01-01
As manufacturing tasks become more individualized and more flexible, the machines in a smart factory are required to perform variable tasks collaboratively without reprogramming. This paper, for the first time, discusses the similarity between smart manufacturing systems and ubiquitous robotic systems and makes an effort to deploy ubiquitous robotic technology in the smart factory. Specifically, a component-based framework is proposed to enable the communication and cooperation of heterogeneous robotic devices. Further, compared to the service robotic domain, smart manufacturing systems are often larger in scale, so a hierarchical planning method was implemented to improve planning efficiency. A smart factory test bed is developed. It demonstrates that the proposed framework is suitable for the industrial domain, and the hierarchical planning method is able to solve large problems intractable with flat methods. PMID:27446206
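The hierarchical planning idea, solving a large problem by expanding abstract tasks into primitives rather than searching one flat action space, can be illustrated with a toy decomposition. The task names and method table below are hypothetical, not the paper's planner.

```python
# Minimal hierarchical task decomposition sketch (illustrative only): abstract
# tasks recursively expand into primitive actions via a method table.
METHODS = {
    "assemble_product": ["fetch_parts", "join_parts", "inspect"],
    "fetch_parts": ["move_to_storage", "pick", "move_to_station"],
}

def plan(task):
    """Expand an abstract task into a flat sequence of primitive actions."""
    if task not in METHODS:          # primitive action: keep as-is
        return [task]
    steps = []
    for sub in METHODS[task]:
        steps.extend(plan(sub))
    return steps

actions = plan("assemble_product")
```

Because each level reasons only over its own small set of subtasks, the search effort grows with the depth of the hierarchy rather than with the full flat action space, which is the efficiency argument the abstract makes.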
Mehrolhassani, Mohammad Hossein; Emami, Mozhgan
2013-01-01
Background: Change theories provide an opportunity for organizational managers to plan, monitor, and evaluate changes using a framework which enables them, among other things, to respond quickly to environmental fluctuations and to predict the changing patterns of individuals and technology. The current study aimed to explore whether the change in the public accounting system of the Iranian health sector has followed Kurt Lewin’s change theory or not. Methods: This study, which adopted a mixed-methodology approach combining qualitative and quantitative methods, was conducted in 2012. In the first phase of the study, 41 participants were selected using purposive sampling, and in the second phase, 32 affiliated units of Kerman University of Medical Sciences (KUMS) were selected as the study sample. In phase one, we used face-to-face in-depth interviews (6 participants) and the quote method (35 participants) for data collection, and a thematic framework analysis for analyzing the data. In phase two, a questionnaire with a ten-point Likert scale was designed, and the data were analyzed using descriptive indicators, principal component analysis, and factorial analysis. Results: Phase one yielded a model consisting of four categories: superstructure, apparent infrastructure, hidden infrastructure, and common factors. Linking all factors yielded 12 components in total; the quantitative results showed that the state of these components was not satisfactory at KUMS (5.06±2.16). The leadership and management component and the technology component played the smallest and the greatest roles, respectively, in implementing the accrual accounting system. Conclusion: The results showed that the unfreezing stage did not occur well and the components were immature, mainly because the emphasis was placed on superstructure components rather than the components of hidden infrastructure.
The study suggests that a road map should be developed in the financial system based on Kurt Lewin’s change theory, and the model presented in this paper underpins change management in any organization. PMID:24596885
Mehrolhassani, Mohammad Hossein; Emami, Mozhgan
2013-11-01
Change theories provide an opportunity for organizational managers to plan, monitor, and evaluate changes using a framework which enables them, among other things, to respond quickly to environmental fluctuations and to predict the changing patterns of individuals and technology. The current study aimed to explore whether the change in the public accounting system of the Iranian health sector has followed Kurt Lewin's change theory or not. This study, which adopted a mixed-methodology approach combining qualitative and quantitative methods, was conducted in 2012. In the first phase of the study, 41 participants were selected using purposive sampling, and in the second phase, 32 affiliated units of Kerman University of Medical Sciences (KUMS) were selected as the study sample. In phase one, we used face-to-face in-depth interviews (6 participants) and the quote method (35 participants) for data collection, and a thematic framework analysis for analyzing the data. In phase two, a questionnaire with a ten-point Likert scale was designed, and the data were analyzed using descriptive indicators, principal component analysis, and factorial analysis. Phase one yielded a model consisting of four categories: superstructure, apparent infrastructure, hidden infrastructure, and common factors. Linking all factors yielded 12 components in total; the quantitative results showed that the state of these components was not satisfactory at KUMS (5.06±2.16). The leadership and management component and the technology component played the smallest and the greatest roles, respectively, in implementing the accrual accounting system. The results showed that the unfreezing stage did not occur well and the components were immature, mainly because the emphasis was placed on superstructure components rather than the components of hidden infrastructure.
The study suggests that a road map should be developed in the financial system based on Kurt Lewin's change theory, and the model presented in this paper underpins change management in any organization.
On-line early fault detection and diagnosis of municipal solid waste incinerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao Jinsong; Huang Jianchao; Sun Wei
A fault detection and diagnosis framework is proposed in this paper for early fault detection and diagnosis (FDD) of municipal solid waste incinerators (MSWIs) in order to improve the safety and continuity of production. In this framework, principal component analysis (PCA), one of the multivariate statistical technologies, is used for detecting abnormal events, while rule-based reasoning performs the fault diagnosis and consequence prediction, and also generates recommendations for fault mitigation once an abnormal event is detected. A software package, SWIFT, is developed based on the proposed framework and has been applied in an actual industrial MSWI. The application shows that automated real-time abnormal situation management (ASM) of the MSWI can be achieved by using SWIFT, with an industrially acceptable low rate of wrong diagnosis, resulting in improved process continuity and environmental performance of the MSWI.
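PCA-based abnormal-event detection of the kind the abstract describes can be sketched with a squared-prediction-error (SPE) statistic: project each sample onto the retained principal components and alarm when the residual is large. The synthetic data, single retained component, and 99% empirical control limit below are illustrative assumptions, not SWIFT's actual configuration.

```python
# Illustrative PCA fault detection sketch (not SWIFT's implementation).
import numpy as np

rng = np.random.default_rng(0)
# Training data: two correlated process variables under normal operation.
x1 = rng.normal(0.0, 1.0, 500)
X = np.column_stack([x1, 2.0 * x1 + rng.normal(0.0, 0.1, 500)])
mean, std = X.mean(axis=0), X.std(axis=0)
Z = (X - mean) / std

# Principal directions of the scaled training data; keep one component.
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
P = Vt[:1].T

def spe(sample):
    """Squared prediction error: residual after projecting onto the PC subspace."""
    z = (np.asarray(sample) - mean) / std
    residual = z - P @ (P.T @ z)
    return float(residual @ residual)

limit = np.quantile([spe(row) for row in X], 0.99)  # empirical 99% control limit
is_fault = spe([3.0, -6.0]) > limit                 # breaks the x2 ~ 2*x1 correlation
```

A sample that breaks the learned correlation between the variables produces a large SPE and is flagged, while points consistent with normal operation stay below the control limit.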
Connor, Carol McDonald; Phillips, Beth M.; Kaschak, Michael; Apel, Kenn; Kim, Young-Suk; Al Otaiba, Stephanie; Crowe, Elizabeth C.; Thomas-Tate, Shurita; Johnson, Lakeisha Cooper; Lonigan, Christopher J.
2015-01-01
This paper describes the theoretical framework, as well as the development and testing of the intervention, Comprehension Tools for Teachers (CTT), which is composed of eight component interventions targeting malleable language and reading comprehension skills that emerging research indicates contribute to proficient reading for understanding for prekindergarteners through fourth graders. Component interventions target processes considered largely automatic as well as more reflective processes, with interacting and reciprocal effects. Specifically, we present component interventions targeting cognitive, linguistic, and text-specific processes, including morphological awareness, syntax, mental-state verbs, comprehension monitoring, narrative and expository text structure, enacted comprehension, academic knowledge, and reading to learn from informational text. Our aim was to develop a tool set composed of intensive meaningful individualized small group interventions. We improved feasibility in regular classrooms through the use of design-based iterative research methods including careful lesson planning, targeted scripting, pre- and postintervention proximal assessments, and technology. In addition to the overall framework, we discuss seven of the component interventions and general results of design and efficacy studies. PMID:26500420
Lifestyle modification for metabolic syndrome: a systematic review.
Bassi, Nikhil; Karagodin, Ilya; Wang, Serena; Vassallo, Patricia; Priyanath, Aparna; Massaro, Elaine; Stone, Neil J
2014-12-01
All 5 components of metabolic syndrome have been shown to improve with lifestyle and diet modification. New strategies for achieving adherence to meaningful lifestyle change are needed to optimize atherosclerotic cardiovascular risk reduction. We performed a systematic literature review, based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework, investigating optimal methods for achieving lifestyle change in metabolic syndrome. We submitted standardized search terms to the PubMed Central, CINAHL, Web of Science, and Ovid databases. Within those results, we selected randomized controlled trials (RCTs) presenting unique methods of achieving lifestyle change in patients with one or more components of the metabolic syndrome. Data extraction using the population, intervention, comparator, outcome, and risk of bias (PICO) framework was used to compare the following endpoints: prevalence of metabolic syndrome, prevalence of individual metabolic syndrome components, mean number of metabolic syndrome components, and amount of weight loss achieved. Twenty-eight RCTs (6372 patients) were included. Eight RCTs demonstrated improvement in metabolic syndrome risk factors after 1 year. Team-based, interactive approaches with high-frequency contact with motivated patients made the largest and most lasting impact. Technology was found to be a useful tool in achieving lifestyle change, but ineffective when compared with personal contact. Patient motivation leading to improved lifestyle adherence is a key factor in achieving reduction in metabolic syndrome components. These elements can be enhanced via frequent encounters with the health care system. Use of technologies such as mobile and Internet-based communication can increase the effectiveness of lifestyle change in metabolic syndrome, but should not replace personal contact as the cornerstone of therapy.
Our ability to derive quantitative conclusions is limited by inconsistent outcome measures across studies, low power and homogeneity of individual studies, largely motivated study populations, short follow-up periods, loss to follow-up, and lack of or incomplete blinding.
Direito, Artur; Walsh, Deirdre; Hinbarji, Moohamad; Albatal, Rami; Tooley, Mark; Whittaker, Robyn; Maddison, Ralph
2018-06-01
Few interventions to promote physical activity (PA) adapt dynamically to changes in individuals' behavior. Interventions targeting determinants of behavior are linked with increased effectiveness and should reflect changes in behavior over time. This article describes the application of two frameworks to assist the development of an adaptive evidence-based smartphone-delivered intervention aimed at influencing PA and sedentary behaviors (SB). Intervention mapping was used to identify the determinants influencing uptake of PA and optimal behavior change techniques (BCTs). Behavioral intervention technology was used to translate and operationalize the BCTs and their modes of delivery. The intervention was based on the integrated behavior change model, focused on nine determinants, consisted of 33 BCTs, and included three main components: (1) automated capture of daily PA and SB via an existing smartphone application, (2) classification of the individual into an activity profile according to their PA and SB, and (3) behavior change content delivery in a dynamic fashion via a proof-of-concept application. This article illustrates how two complementary frameworks can be used to guide the development of a mobile health behavior change program. This approach can guide the development of future mHealth programs.
Smith, Chris; Vannak, Uk; Sokhey, Ly; Ngo, Thoai D; Gold, Judy; Free, Caroline
2016-01-05
The objective of this paper is to outline the formative research process used to develop the MOTIF mobile phone-based (mHealth) intervention to support post-abortion family planning in Cambodia. The formative research process involved literature reviews, interviews and focus group discussions with clients, and consultation with clinicians and organisations implementing mHealth activities in Cambodia. This process led to the development of a conceptual framework and the intervention. Key findings from the formative research included identification of the main reasons for non-use of contraception and patterns of mobile phone use in Cambodia. We drew on components of existing interventions and behaviour change theory to develop a conceptual framework. A multi-faceted voice-based intervention was designed to address health concerns and other key determinants of contraception use. Formative research was essential in order to develop an appropriate mHealth intervention to support post-abortion contraception in Cambodia. Each component of the formative research contributed to the final intervention design.
A Method for Evaluating Information Security Governance (ISG) Components in Banking Environment
NASA Astrophysics Data System (ADS)
Ula, M.; Ula, M.; Fuadi, W.
2017-02-01
As modern banking increasingly relies on the internet and computer technologies to operate businesses and market interactions, threats and security breaches have increased sharply in recent years. Insider and outsider attacks have caused global businesses to lose trillions of dollars a year. Therefore, there is a need for a proper framework to govern information security in the banking system. The aim of this research is to propose and design an enhanced method to evaluate information security governance (ISG) implementation in the banking environment. This research examines and compares elements from commonly used information security governance frameworks, standards, and best practices, and considers the strengths and weaknesses of their approaches. The initial framework for governing information security in the banking system was constructed from a document review. The framework was categorized into three levels: the governance level, the managerial level, and the technical level. The study further conducted an online survey of banking security professionals to obtain their professional judgment about the most critical ISG components and the importance of each ISG component that should be implemented in the banking environment. Data from the survey were used to construct a mathematical model for ISG evaluation, with component importance data used as weighting coefficients for the related components in the model. The research further develops a method for evaluating ISG implementation in banking based on this mathematical model. The proposed method was tested through a real case study at an Indonesian local bank. The study shows that the proposed method has sufficient coverage of ISG in the banking environment and effectively evaluates ISG implementation.
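A weighted evaluation model of the general shape described, with survey-derived importance weights applied to per-component assessment scores, might look like the following sketch. The component names, weights, and scores are hypothetical placeholders, not the study's actual components or survey results.

```python
# Illustrative weighted ISG evaluation sketch (hypothetical components).

WEIGHTS = {  # normalized importance weights, e.g. from an expert survey
    "access_control": 0.30,
    "risk_management": 0.25,
    "security_policy": 0.25,
    "incident_response": 0.20,
}

def isg_score(assessments):
    """Weighted average of per-component assessment scores (each 0..100)."""
    return sum(WEIGHTS[c] * assessments[c] for c in WEIGHTS)

score = isg_score({
    "access_control": 80,
    "risk_management": 70,
    "security_policy": 90,
    "incident_response": 60,
})
```

The weighted sum collapses the per-component assessments into a single ISG maturity score; in the actual method the components and weighting coefficients come from the expert survey, not these placeholders.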
Technology Benefit Estimator (T/BEST): User's Manual
NASA Technical Reports Server (NTRS)
Generazio, Edward R.; Chamis, Christos C.; Abumeri, Galib
1994-01-01
The Technology Benefit Estimator (T/BEST) system is a formal method to assess advanced technologies and quantify their benefit contributions for prioritization. T/BEST may be used to provide guidelines to identify and prioritize high-payoff research areas, help manage research and limited resources, show the link between advanced concepts and the bottom line, i.e., accrued benefit and value, and to communicate credibly the benefits of research. The T/BEST software is specifically designed for estimating the benefits, and benefit sensitivities, of introducing new technologies into existing propulsion systems. Key engine cycle, structural, fluid, mission, and cost analysis modules are used to provide a framework for interfacing with advanced technologies. An open-ended, modular approach is used to allow for modification and addition of both key and advanced technology modules. T/BEST has a hierarchical framework that yields varying levels of benefit estimation accuracy depending on the degree of input detail available. This hierarchical feature permits rapid estimation of technology benefits even when the technology is at the conceptual stage; as knowledge of the technology details increases, the accuracy of the benefit analysis increases. Included in T/BEST's framework are correlations developed from a statistical database that is relied upon if there is insufficient information given in a particular area, e.g., fuel capacity or aircraft landing weight. Statistical predictions are not required if these data are specified in the mission requirements.
The engine cycle, structural, fluid, cost, noise, and emissions analyses interact with the default or user material and component libraries to yield estimates of specific global benefits: range, speed, thrust, capacity, component life, noise, emissions, specific fuel consumption, component and engine weights, pre-certification test, mission performance, engine cost, direct operating cost, life cycle cost, manufacturing cost, development cost, risk, and development time. Currently, T/BEST operates on stand-alone or networked workstations, and uses a UNIX shell or script to control the operation of interfaced FORTRAN-based analyses. T/BEST's interface structure works equally well with non-FORTRAN or mixed software analyses. This interface structure is designed to maintain the integrity of the experts' analyses by interfacing with their existing input and output files. Parameter input and output data (e.g., number of blades, hub diameters, etc.) are passed via T/BEST's neutral file, while copious data (e.g., finite element models, profiles, etc.) are passed via file pointers that point to the output files of the experts' analyses. To keep communication between T/BEST's neutral file and attached analysis codes simple, only two software commands, PUT and GET, are required. This simplicity permits easy access to all input and output variables contained within the neutral file. Both public domain and proprietary analysis codes may be attached with a minimal amount of effort, while maintaining full data and analysis integrity, and security. T/BEST's software framework, status, beginner-to-expert operation, interface architecture, analysis module addition, and key analysis modules are discussed. Representative examples of T/BEST benefit analyses are shown.
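The two-command neutral-file interface can be pictured as a simple shared key-value store that analysis modules read and write. The sketch below is a conceptual illustration only; the JSON file format and function signatures are assumptions, since T/BEST's actual neutral-file format is not described here.

```python
# Conceptual PUT/GET neutral-file sketch (hypothetical format, not T/BEST's):
# named parameters are exchanged through a single shared file.
import json
import pathlib

NEUTRAL_FILE = pathlib.Path("neutral_file.json")

def put(name, value):
    """Write one named parameter into the shared neutral file."""
    data = json.loads(NEUTRAL_FILE.read_text()) if NEUTRAL_FILE.exists() else {}
    data[name] = value
    NEUTRAL_FILE.write_text(json.dumps(data))

def get(name):
    """Read one named parameter back out of the neutral file."""
    return json.loads(NEUTRAL_FILE.read_text())[name]

put("number_of_blades", 24)
put("hub_diameter_m", 0.6)
blades = get("number_of_blades")
```

Because every module speaks only PUT and GET against the shared file, modules written in different languages can exchange parameters without knowing each other's internals, which is the design intent the abstract describes.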
Technology Benefit Estimator (T/BEST): User's manual
NASA Astrophysics Data System (ADS)
Generazio, Edward R.; Chamis, Christos C.; Abumeri, Galib
1994-12-01
The Technology Benefit Estimator (T/BEST) system is a formal method to assess advanced technologies and quantify their benefit contributions for prioritization. T/BEST may be used to provide guidelines to identify and prioritize high-payoff research areas, help manage research and limited resources, show the link between advanced concepts and the bottom line, i.e., accrued benefit and value, and to communicate credibly the benefits of research. The T/BEST software is specifically designed for estimating the benefits, and benefit sensitivities, of introducing new technologies into existing propulsion systems. Key engine cycle, structural, fluid, mission, and cost analysis modules are used to provide a framework for interfacing with advanced technologies. An open-ended, modular approach is used to allow for modification and addition of both key and advanced technology modules. T/BEST has a hierarchical framework that yields varying levels of benefit estimation accuracy depending on the degree of input detail available. This hierarchical feature permits rapid estimation of technology benefits even when the technology is at the conceptual stage; as knowledge of the technology details increases, the accuracy of the benefit analysis increases. Included in T/BEST's framework are correlations developed from a statistical database that is relied upon if there is insufficient information given in a particular area, e.g., fuel capacity or aircraft landing weight. Statistical predictions are not required if these data are specified in the mission requirements.
The engine cycle, structural, fluid, cost, noise, and emissions analyses interact with the default or user material and component libraries to yield estimates of specific global benefits: range, speed, thrust, capacity, component life, noise, emissions, specific fuel consumption, component and engine weights, pre-certification test, mission performance, engine cost, direct operating cost, life cycle cost, manufacturing cost, development cost, risk, and development time. Currently, T/BEST operates on stand-alone or networked workstations, and uses a UNIX shell or script to control the operation of interfaced FORTRAN-based analyses. T/BEST's interface structure works equally well with non-FORTRAN or mixed software analyses. This interface structure is designed to maintain the integrity of the experts' analyses by interfacing with their existing input and output files. Parameter input and output data (e.g., number of blades, hub diameters, etc.) are passed via T/BEST's neutral file, while copious data (e.g., finite element models, profiles, etc.) are passed via file pointers that point to the output files of the experts' analyses. To keep communication between T/BEST's neutral file and attached analysis codes simple, only two software commands, PUT and GET, are required. This simplicity permits easy access to all input and output variables contained within the neutral file. Both public domain and proprietary analysis codes may be attached with a minimal amount of effort, while maintaining full data and analysis integrity, and security.
Old Assumptions, New Paradigms: Technology, Group Process, and Continuing Professional Education.
ERIC Educational Resources Information Center
Healey, Kathryn N.; Lawler, Patricia A.
2002-01-01
Continuing educators must consider the impact of technology on group processes, including ways in which it affects group pressures, communication patterns, and social and emotional components of learning. Administrators and faculty should integrate group process frameworks with educational technologies in order to provide effective learning…
A Framework for Examining How Mathematics Teachers Evaluate Technology
ERIC Educational Resources Information Center
Smith, Ryan C.; Shin, Dongjo; Kim, Somin
2016-01-01
Our mathematics cognitive technology noticing framework is based on professional noticing and curricular noticing frameworks and data collected in a study that explored how secondary mathematics teachers evaluate technology. Our participants displayed three categories of noticing: attention to features of technology, interpretation of the…
A computational framework for modeling targets as complex adaptive systems
NASA Astrophysics Data System (ADS)
Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh
2017-05-01
Modeling large military targets is a challenge, as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extend across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or to changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and, more importantly, the various emergent behaviors displayed by such targets. However, a key stumbling block is incorporating information from various intelligence, surveillance, and reconnaissance (ISR) sources while dealing with the inherent uncertainty, incompleteness, and time criticality of real-world information. To overcome these challenges, we present a probabilistic reasoning network based framework called the complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching, and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger, and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well-defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, they provide unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we describe how our framework can be used for modeling targets, with a focus on methodologies for quantifying NCO performance metrics.
Architecture for the Interdisciplinary Earth Data Alliance
NASA Astrophysics Data System (ADS)
Richard, S. M.
2016-12-01
The Interdisciplinary Earth Data Alliance (IEDA) is leading an EarthCube (EC) Integrative Activity to develop a governance structure and technology framework that enables partner data systems to share technology, infrastructure, and practice for documenting, curating, and accessing heterogeneous geoscience data. The IEDA data facility provides capabilities in an extensible framework that enables domain-specific requirements for each partner system in the Alliance to be integrated into standardized cross-domain workflows. The shared technology infrastructure includes a data submission hub, a domain-agnostic file-based repository, an integrated Alliance catalog and a Data Browser for data discovery across all partner holdings, as well as services for registering identifiers for datasets (DOI) and samples (IGSN). The submission hub will be a platform that facilitates acquisition of cross-domain resource documentation and channels users into domain- and resource-specific workflows tailored for each partner community. We are exploring an event-based message bus architecture with a standardized plug-in interface for adding capabilities. This architecture builds on the EC CINERGI metadata pipeline as well as the message-based architecture of the SEAD project. Plug-in components will perform file introspection to match entities to a data type registry (extending EC Digital Crust and Research Data Alliance work), extract standardized keywords (using CINERGI components), and extract location, cruise, personnel, and other metadata linkage information (building on GeoLink and existing IEDA partner components). The submission hub will feed submissions to appropriate partner repositories and service endpoints, targeted by domain and resource type, for distribution. The Alliance governance will adopt patterns (vocabularies, operations, resource types) for self-describing data services using standard HTTP protocol for simplified data access (building on EC GeoWS and other "RESTful" approaches).
Exposure of resource descriptions (datasets and service distributions) for harvesting by commercial search engines as well as geoscience-data focused crawlers (like EC B-Cube crawler) will increase discoverability of IEDA resources with minimal effort by curators.
Space Vehicle Reliability Modeling in DIORAMA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tornga, Shawn Robert
When modeling the system performance of space-based detection systems, it is important to consider spacecraft reliability. As space vehicles age, their components become prone to failure for a variety of reasons, such as radiation damage. Additionally, some vehicles may lose the ability to maneuver once they exhaust their fuel supplies. Failure is typically divided into two categories: engineering mistakes and technology surprise. This document reports on a method of simulating space vehicle reliability in the DIORAMA framework.
NASA Astrophysics Data System (ADS)
Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo
2013-02-01
With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value for constructing an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate to this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research to facilitate the implementation of the business, system and technology models of the Zachman framework. Following the explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.
NGSANE: a lightweight production informatics framework for high-throughput data analysis.
Buske, Fabian A; French, Hugh J; Smith, Martin A; Clark, Susan J; Bauer, Denis C
2014-05-15
The initial steps in the analysis of next-generation sequencing data can be automated by way of software 'pipelines'. However, individual components are quickly superseded as sequencing technology and analysis methods evolve, often rendering entire versions of production informatics pipelines obsolete. Constructing pipelines from Linux bash commands enables the use of hot-swappable modular components, as opposed to the more rigid program-call wrapping by higher-level languages implemented in comparable published pipelining systems. Here we present Next Generation Sequencing ANalysis for Enterprises (NGSANE), a Linux-based, high-performance-computing-enabled framework that minimizes the overhead of setting up and processing new projects yet maintains full flexibility of custom scripting when processing raw sequence data. NGSANE is implemented in bash and publicly available under the BSD (3-Clause) licence via GitHub at https://github.com/BauerLab/ngsane. Denis.Bauer@csiro.au Supplementary data are available at Bioinformatics online.
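The hot-swappable module idea described above can be sketched as follows (NGSANE itself is implemented in bash; this Python illustration, with its stage names and placeholder logic, is hypothetical):

```python
# Registry of pipeline stages; swapping a stage means re-registering
# a new function under the same name, leaving the driver untouched.
STAGES = {}

def register(name):
    """Register a pipeline stage under a swappable name."""
    def deco(fn):
        STAGES[name] = fn
        return fn
    return deco

@register("trim")
def trim_v1(reads):
    # placeholder "trimming": drop the last (lowest-quality) base
    return [r[:-1] for r in reads]

@register("align")
def align_v1(reads):
    # placeholder "alignment": tag each read
    return [f"aligned:{r}" for r in reads]

def run_pipeline(reads, order=("trim", "align")):
    # the driver only knows stage names, not implementations
    for stage in order:
        reads = STAGES[stage](reads)
    return reads

print(run_pipeline(["ACGTT", "GGCAA"]))  # ['aligned:ACGT', 'aligned:GGCA']
```

A newer trimmer can replace `trim_v1` without touching the driver, which is the property that protects a pipeline from component obsolescence.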
Adaptive Numerical Algorithms in Space Weather Modeling
NASA Technical Reports Server (NTRS)
Toth, Gabor; vanderHolst, Bart; Sokolov, Igor V.; DeZeeuw, Darren; Gombosi, Tamas I.; Fang, Fang; Manchester, Ward B.; Meng, Xing; Nakib, Dalal; Powell, Kenneth G.;
2010-01-01
Space weather describes the various processes in the Sun-Earth system that present danger to human health and technology. The goal of space weather forecasting is to provide an opportunity to mitigate these negative effects. Physics-based space weather modeling is characterized by disparate temporal and spatial scales as well as by different physics in different domains. A multi-physics system can be modeled by a software framework comprising several components. Each component corresponds to a physics domain, and each component is represented by one or more numerical models. The publicly available Space Weather Modeling Framework (SWMF) can execute and couple together several components distributed over a parallel machine in a flexible and efficient manner. The framework also allows resolving disparate spatial and temporal scales with independent spatial and temporal discretizations in the various models. Several of the computationally most expensive domains of the framework are modeled by the Block-Adaptive Tree Solar wind Roe Upwind Scheme (BATS-R-US) code that can solve various forms of the magnetohydrodynamics (MHD) equations, including Hall, semi-relativistic, multi-species and multi-fluid MHD, anisotropic pressure, radiative transport and heat conduction. Modeling disparate scales within BATS-R-US is achieved by a block-adaptive mesh in both Cartesian and generalized coordinates. Most recently we have created a new core for BATS-R-US: the Block-Adaptive Tree Library (BATL) that provides a general toolkit for creating, load balancing and message passing in a 1, 2 or 3 dimensional block-adaptive grid. We describe the algorithms of BATL and demonstrate its efficiency and scaling properties for various problems.
BATS-R-US uses several time-integration schemes to address multiple time-scales: explicit time stepping with fixed or local time steps, partially steady-state evolution, point-implicit, semi-implicit, explicit/implicit, and fully implicit numerical schemes. Depending on the application, we find that different time stepping methods are optimal. Several of the time integration schemes exploit the block-based granularity of the grid structure. The framework and the adaptive algorithms enable physics-based space weather modeling and even forecasting.
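As a rough illustration of block-adaptive refinement of the kind BATL provides, the following one-dimensional sketch splits any block whose solution jump across its endpoints exceeds a threshold; the profile and refinement criterion are invented for illustration, and BATL itself handles 1, 2, and 3-D grids with load balancing and message passing:

```python
def refine_blocks(blocks, f, threshold):
    """Split any block whose endpoint jump in f exceeds threshold."""
    out = []
    for (lo, hi) in blocks:
        if abs(f(hi) - f(lo)) > threshold:
            mid = 0.5 * (lo + hi)
            out.extend([(lo, mid), (mid, hi)])  # refine: split in two
        else:
            out.append((lo, hi))                # keep coarse block
    return out

def steep(x):
    # step profile: refinement should cluster around the jump at x = 0.5
    return 0.0 if x < 0.5 else 1.0

blocks = [(0.0, 0.25), (0.25, 0.5), (0.5, 0.75), (0.75, 1.0)]
for _ in range(3):  # three refinement passes
    blocks = refine_blocks(blocks, steep, 0.5)
print(len(blocks))  # 7: all new blocks cluster at the discontinuity
```

Each pass refines only where the solution varies sharply, which is how block-adaptive grids resolve disparate spatial scales without refining everywhere.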
Symphony: A Framework for Accurate and Holistic WSN Simulation
Riliskis, Laurynas; Osipov, Evgeny
2015-01-01
Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144
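A minimal sketch of a clock-skew model like those in Symphony's clock emulator, assuming a simple linear skew-plus-offset model (the class name and parameters here are hypothetical, not Symphony's API):

```python
class SkewedClock:
    """Emulated node clock: t_node = t_real * (1 + skew) + offset."""
    def __init__(self, skew_ppm=0.0, offset=0.0):
        self.skew = skew_ppm * 1e-6   # parts-per-million -> fraction
        self.offset = offset          # initial offset in seconds

    def read(self, t_real):
        # what the sensor node believes the time is
        return t_real * (1.0 + self.skew) + self.offset

clock = SkewedClock(skew_ppm=40.0, offset=0.001)  # typical crystal error
drift = clock.read(3600.0) - 3600.0               # drift after one real hour
print(round(drift, 6))
```

Even a constant-skew model like this lets a simulation expose time-synchronization bugs that an idealized shared clock would hide; real emulators add temperature-dependent and random-walk skew on top.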
2006 Precision Strike Technology Symposium
2006-10-19
Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course
ERIC Educational Resources Information Center
McGowan, Ian S.
2016-01-01
Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…
Fault Tolerance in ZigBee Wireless Sensor Networks
NASA Technical Reports Server (NTRS)
Alena, Richard; Gilstrap, Ray; Baldwin, Jarren; Stone, Thom; Wilson, Pete
2011-01-01
Wireless sensor networks (WSN) based on the IEEE 802.15.4 Personal Area Network standard are finding increasing use in the home automation and emerging smart energy markets. The network and application layers, based on the ZigBee 2007 PRO Standard, provide a convenient framework for component-based software that supports customer solutions from multiple vendors. This technology is supported by System-on-a-Chip solutions, resulting in extremely small and low-power nodes. The Wireless Connections in Space Project addresses the aerospace flight domain for both flight-critical and non-critical avionics. WSNs provide the inherent fault tolerance required for aerospace applications utilizing such technology. The team from Ames Research Center has developed techniques for assessing the fault tolerance of ZigBee WSNs challenged by radio frequency (RF) interference or WSN node failure.
Bouaud, Jacques; Guézennec, Gilles; Séroussi, Brigitte
2018-01-01
The integration of clinical information models and termino-ontological models into a unique ontological framework is highly desirable for it facilitates data integration and management using the same formal mechanisms for both data concepts and information model components. This is particularly true for knowledge-based decision support tools that aim to take advantage of all facets of semantic web technologies in merging ontological reasoning, concept classification, and rule-based inferences. We present an ontology template that combines generic data model components with (parts of) existing termino-ontological resources. The approach is developed for the guideline-based decision support module on breast cancer management within the DESIREE European project. The approach is based on the entity attribute value model and could be extended to other domains.
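The entity-attribute-value model mentioned above can be illustrated with a minimal triple store (the clinical names below are invented for illustration; DESIREE itself expresses these structures as OWL ontology components):

```python
# An EAV fact base: every datum is an (entity, attribute, value) triple,
# so the same generic mechanism stores any concept without schema changes.
triples = set()

def assert_fact(entity, attribute, value):
    triples.add((entity, attribute, value))

def values_of(entity, attribute):
    # query: all values of one attribute for one entity
    return sorted(v for (e, a, v) in triples if e == entity and a == attribute)

assert_fact("patient:42", "diagnosis", "breast_carcinoma")
assert_fact("patient:42", "receptor_status", "ER_positive")
assert_fact("patient:42", "diagnosis", "hypertension")

print(values_of("patient:42", "diagnosis"))  # ['breast_carcinoma', 'hypertension']
```

Because attributes are data rather than schema, the same triples can be classified against a termino-ontological resource, which is the unification the DESIREE template aims for.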
Chen, Yen-Lin; Chiang, Hsin-Han; Chiang, Chuan-Yen; Liu, Chuan-Ming; Yuan, Shyan-Ming; Wang, Jenq-Haur
2012-01-01
This study proposes a vision-based intelligent nighttime driver assistance and surveillance system (VIDASS system) implemented by a set of embedded software components and modules, and integrates these modules to accomplish a component-based system framework on an embedded heterogeneous dual-core platform. Accordingly, this study develops and implements computer vision and sensing techniques for nighttime vehicle detection, collision warning determination, and traffic event recording. The proposed system processes the road-scene frames in front of the host car captured from CCD sensors mounted on the host vehicle. These vision-based sensing and processing technologies are integrated and implemented on an ARM-DSP heterogeneous dual-core embedded platform. Peripheral devices, including image grabbing devices, communication modules, and other in-vehicle control devices, are also integrated to form an in-vehicle-embedded vision-based nighttime driver assistance and surveillance system. PMID:22736956
An ontology for component-based models of water resource systems
NASA Astrophysics Data System (ADS)
Elag, Mostafa; Goodall, Jonathan L.
2013-08-01
Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.
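One of the listed use cases, encouraging proper model coupling, can be sketched as a check that an upstream component's declared outputs cover a downstream component's required inputs; the variable names and units below are hypothetical, and the WRC ontology encodes such metadata in OWL rather than Python dictionaries:

```python
def can_couple(upstream, downstream):
    """True iff upstream's outputs cover every (name, unit) downstream needs."""
    missing = set(downstream["inputs"]) - set(upstream["outputs"])
    return not missing, missing

# Hypothetical component metadata: exchange items as (variable, unit) pairs.
runoff_model = {"outputs": {("streamflow", "m3/s"), ("soil_moisture", "-")}}
routing_model = {"inputs": {("streamflow", "m3/s")}}

ok, missing = can_couple(runoff_model, routing_model)
print(ok, missing)  # True set()
```

Making exchange items explicit, machine-readable metadata is exactly what removes the ambiguity the abstract identifies: a framework can refuse or flag a coupling whose `missing` set is non-empty instead of failing silently at run time.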
Integrated Systems Health Management (ISHM) Toolkit
NASA Technical Reports Server (NTRS)
Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim
2013-01-01
A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.
Framework for a clinical information system.
Van de Velde, R
2000-01-01
The current status of our work towards the design and implementation of a reference architecture for a Clinical Information System is presented. This architecture has been developed and implemented based on components, following a strong underlying conceptual and technological model. Common Object Request Broker Architecture (CORBA) and n-tier technology are used, featuring centralised and departmental clinical information systems as the back-end store for all clinical data. Servers located in the 'middle' tier apply the clinical (business) model and application rules to communicate with so-called 'thin client' workstations. The main characteristics are the focus on modelling and reuse of both data and business logic, as there is a shift away from data and functional modelling towards object modelling. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for this approach.
A Profile-Based Framework for Factorial Similarity and the Congruence Coefficient.
Hartley, Anselma G; Furr, R Michael
2017-01-01
We present a novel profile-based framework for understanding factorial similarity in the context of exploratory factor analysis in general, and for understanding the congruence coefficient (a commonly used index of factor similarity) specifically. First, we introduce the profile-based framework articulating factorial similarity in terms of 3 intuitive components: general saturation similarity, differential saturation similarity, and configural similarity. We then articulate the congruence coefficient in terms of these components, along with 2 additional profile-based components, and we explain how these components resolve ambiguities that can be, and are, found when using the congruence coefficient. Finally, we present secondary analyses revealing that profile-based components of factorial similarity are indeed linked to experts' actual evaluations of factorial similarity. Overall, the profile-based approach we present offers new insights into the ways in which researchers can examine factor similarity and holds the potential to enhance researchers' ability to understand the congruence coefficient.
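The congruence coefficient discussed here is commonly computed as Tucker's phi between two columns of factor loadings, phi = sum(x_i*y_i) / sqrt(sum(x_i^2) * sum(y_i^2)); a minimal sketch with made-up loadings:

```python
import math

def congruence(x, y):
    """Tucker's congruence coefficient between two loading vectors."""
    num = sum(a * b for a, b in zip(x, y))
    den = math.sqrt(sum(a * a for a in x) * sum(b * b for b in y))
    return num / den

# Hypothetical loadings of the same four items on two factor solutions.
f1 = [0.70, 0.60, 0.50, 0.40]
f2 = [0.65, 0.62, 0.48, 0.45]
print(round(congruence(f1, f2), 3))  # 0.998
```

Note that phi is insensitive to uniform rescaling (`congruence([1, 2], [2, 4])` is exactly 1.0), which is one source of the interpretive ambiguity the profile-based components are meant to resolve.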
Business Technology Education. Vocational Education Program Courses Standards.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee. Div. of Applied Tech., Adult, and Community Education.
This document contains vocational education program course standards (curriculum frameworks and student performance standards) for exploratory courses, practical arts courses, and job preparatory programs offered at the secondary and postsecondary level as part of the business technology education component of Florida's comprehensive vocational…
Technology Education. Vocational Education Program Courses Standards.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee. Div. of Applied Tech., Adult, and Community Education.
This document contains vocational education program course standards (curriculum frameworks and student performance standards) for exploratory courses, practical arts courses, and job preparatory programs offered at the secondary and postsecondary level as part of the technology education component of Florida's comprehensive vocational education…
Satellites, tweets, forecasts: the future of flood disaster management?
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Lorini, Valerio; Wania, Annett; Pappenberger, Florian; Salamon, Peter; Ramos, Maria Helena; Cloke, Hannah; Castillo, Carlos
2017-04-01
Floods have devastating effects on lives and livelihoods around the world. Structural flood defence measures such as dikes and dams can help protect people. However, it is the emerging science and technologies for flood disaster management and preparedness, such as increasingly accurate flood forecasting systems, high-resolution satellite monitoring, rapid risk mapping, and the unique strength of social media information and crowdsourcing, that are most promising for reducing the impacts of flooding. Here, we describe an innovative framework which integrates in real-time two components of the Copernicus Emergency mapping services, namely the European Flood Awareness System and the satellite-based Rapid Mapping, with new procedures for rapid risk assessment and social media and news monitoring. The integrated framework enables improved flood impact forecast, thanks to the real-time integration of forecasting and monitoring components, and increases the timeliness and efficiency of satellite mapping, with the aim of capturing flood peaks and following the evolution of flooding processes. Thanks to the proposed framework, emergency responders will have access to a broad range of timely and accurate information for more effective and robust planning, decision-making, and resource allocation.
Unsteady Analyses of Valve Systems in Rocket Engine Testing Environments
NASA Technical Reports Server (NTRS)
Shipman, Jeremy; Hosangadi, Ashvin; Ahuja, Vineet
2004-01-01
This paper discusses simulation technology used to support the testing of rocket propulsion systems by performing high fidelity analyses of feed system components. A generalized multi-element framework has been used to perform simulations of control valve systems. This framework provides the flexibility to resolve the structural and functional complexities typically associated with valve-based high pressure feed systems that are difficult to deal with using traditional Computational Fluid Dynamics (CFD) methods. In order to validate this framework for control valve systems, results are presented for simulations of a cryogenic control valve at various plug settings and compared to both experimental data and simulation results obtained at NASA Stennis Space Center. A detailed unsteady analysis has also been performed for a pressure regulator type control valve used to support rocket engine and component testing at Stennis Space Center. The transient simulation captures the onset of a modal instability that has been observed in the operation of the valve. A discussion of the flow physics responsible for the instability and a prediction of the dominant modes associated with the fluctuations is presented.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Barclay, Rebecca O.; Bishop, Ann P.; Kennedy, John M.
1992-01-01
Federal attempts to stimulate technological innovation have been unsuccessful because of the application of an inappropriate policy framework that lacks conceptual and empirical knowledge of the process of technological innovation and fails to acknowledge the relationship between knowledge production, transfer, and use as equally important components of the process of knowledge diffusion. It is argued that the potential contributions of high-speed computing and networking systems will be diminished unless empirically derived knowledge about the information-seeking behavior of the members of the social system is incorporated into a new policy framework. Findings from the NASA/DoD Aerospace Knowledge Diffusion Research Project are presented in support of this assertion.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Barclay, Rebecca O.; Bishop, Ann P.; Kennedy, John M.
1992-01-01
Federal attempts to stimulate technological innovation have been unsuccessful because of the application of an inappropriate policy framework that lacks conceptual and empirical knowledge of the process of technological innovation and fails to acknowledge the relationship between knowledge production, transfer, and use as equally important components of the process of knowledge diffusion. This article argues that the potential contributions of high-speed computing and networking systems will be diminished unless empirically derived knowledge about the information-seeking behavior of members of the social system is incorporated into a new policy framework. Findings from the NASA/DoD Aerospace Knowledge Diffusion Research Project are presented in support of this assertion.
NASA Technical Reports Server (NTRS)
Mayo, L. H.
1975-01-01
An analysis is presented for the Congress of the relationships between an institutionalized assessment function and legislative information gathering and decision-making needs. The study was directed to the following topics: (1) the positing of a hypothetical technology assessment component for legislative support; (2) the posing of a number of questions relating to the operational context of this assessment component, including the organization/operational framework, general operational problems, access to relevant information, and the utilization of assessment data and analyses; and (3) some selected comments relevant to the questions posed.
Learning Resources and Technology. A Guide to Program Development.
ERIC Educational Resources Information Center
Connecticut State Dept. of Education, Hartford.
This guide provides a framework to assist all Connecticut school districts in planning effective learning resources centers and educational technology programs capable of providing: a well developed library media component; shared instructional design responsibilities; reading for enrichment; integration of computers into instruction; distance…
Progress of the European Assistive Technology Information Network.
Gower, Valerio; Andrich, Renzo
2015-01-01
The European Assistive Technology Information Network (EASTIN), launched in 2005 as the result of a collaborative EU project, provides information on Assistive Technology products and related material through the website www.eastin.eu. In the past few years several advancements have been implemented on the EASTIN website thanks to the contribution of EU funded projects, including a multilingual query processing component for supporting non expert users, a user rating and comment facility, and a detailed taxonomy for the description of ICT based assistive products. Recently, within the framework of the EU funded project Cloud4All, the EASTIN information system has also been federated with the Unified Listing of assistive products, one of the building blocks of the Global Public Inclusive Infrastructure initiative.
Frameworks of Educational Technology
ERIC Educational Resources Information Center
Ely, Donald
2008-01-01
This paper, written from a 20th-century perspective, traces the development of, and influences on, the field of instructional technology and attempts to describe a framework within which we can better understand the field. [This article is based on "Instructional Technology: Contemporary Frameworks" originally written by the author for the…
Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie
2009-01-01
Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
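Two metrics of the kind such a benchmarking framework evaluates, false-positive count and detection latency, can be sketched from a ground-truth fault-injection time and a DA's alarm times (the scenario values are invented; the actual ADAPT metric suite is broader and also covers isolation accuracy):

```python
def detection_metrics(fault_time, detections):
    """detections: sorted times at which the DA raised an alarm."""
    false_positives = [t for t in detections if t < fault_time]
    true_alarms = [t for t in detections if t >= fault_time]
    latency = true_alarms[0] - fault_time if true_alarms else None
    return {"false_positives": len(false_positives), "latency": latency}

# Fault injected at t = 10 s; the DA alarmed once before and twice after.
print(detection_metrics(fault_time=10.0, detections=[4.0, 11.5, 12.0]))
```

Running every DA against the same recorded scenarios and scoring them with a shared metric set is what makes the comparison systematic rather than anecdotal.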
A Framework for Examining Teachers' Noticing of Mathematical Cognitive Technologies
ERIC Educational Resources Information Center
Smith, Ryan; Shin, Dongjo; Kim, Somin
2017-01-01
In this paper, we propose the mathematical cognitive technology noticing framework for examining how mathematics teachers evaluate, select, and modify mathematical cognitive technology to use in their classrooms. Our framework is based on studies of professional and curricular noticing and data collected in a study that explored how secondary…
Comparison of Physics Frameworks for WebGL-Based Game Engine
NASA Astrophysics Data System (ADS)
Yogya, Resa; Kosala, Raymond
2014-03-01
Recently, a new technology called WebGL has shown considerable potential for developing games. However, since this technology is still new, many of its possibilities for game development remain unexplored. This paper investigates the potential of integrating physics frameworks with WebGL technology in a game engine for developing 2D or 3D games. Specifically, we integrated three open-source physics frameworks, Bullet, Cannon, and JigLib, into a WebGL-based game engine. Using experiments, we assessed these frameworks in terms of their correctness or accuracy, performance, completeness, and compatibility. The results show that it is possible to integrate open-source physics frameworks into a WebGL-based game engine, and that Bullet is the best physics framework to integrate into the WebGL-based game engine.
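The accuracy assessment described above can be mimicked in miniature by comparing a numerical integrator against an analytic solution, here free fall under gravity (this stand-in does not call Bullet, Cannon, or JigLib; it only illustrates the kind of correctness check such a comparison uses):

```python
G = 9.81  # m/s^2

def simulate_fall(t_end, dt):
    """Semi-implicit Euler free fall from rest, as a physics engine would step it."""
    y, v = 0.0, 0.0
    for _ in range(round(t_end / dt)):
        v += G * dt
        y += v * dt
    return y

analytic = 0.5 * G * 1.0 ** 2                      # y(1 s) = 4.905 m
err_coarse = abs(simulate_fall(1.0, 0.1) - analytic)
err_fine = abs(simulate_fall(1.0, 0.001) - analytic)
print(err_fine < err_coarse)  # True: smaller steps track the analytic answer
```

A benchmark like the paper's would run each engine's stepper on the same scenario and rank them by this error, alongside throughput and feature coverage.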
Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Resseguie, David R
There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect the key components required to build a large-scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large-scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots integrate seamlessly into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large-scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks; Sensorpedia (an ad hoc Internet-scale sensor network); our framework for integrating robots with Sensorpedia; two applications that illustrate our framework's ability to support general web-based robotic control; and experimental results that illustrate our framework's scalability, feasibility, and resource requirements.
An eHealth Capabilities Framework for Graduates and Health Professionals: Mixed-Methods Study
McGregor, Deborah; Keep, Melanie; Janssen, Anna; Spallek, Heiko; Quinn, Deleana; Jones, Aaron; Tseris, Emma; Yeung, Wilson; Togher, Leanne; Solman, Annette; Shaw, Tim
2018-01-01
Background The demand for an eHealth-ready and adaptable workforce is placing increasing pressure on universities to deliver eHealth education. At present, eHealth education is largely focused on components of eHealth rather than considering a curriculum-wide approach. Objective This study aimed to develop a framework that could be used to guide health curriculum design based on current evidence, and stakeholder perceptions of eHealth capabilities expected of tertiary health graduates. Methods A 3-phase, mixed-methods approach incorporated the results of a literature review, focus groups, and a Delphi process to develop a framework of eHealth capability statements. Results Participants (N=39) with expertise or experience in eHealth education, practice, or policy provided feedback on the proposed framework, and following the fourth iteration of this process, consensus was achieved. The final framework consisted of 4 higher-level capability statements that describe the learning outcomes expected of university graduates across the domains of (1) digital health technologies, systems, and policies; (2) clinical practice; (3) data analysis and knowledge creation; and (4) technology implementation and codesign. Across the capability statements are 40 performance cues that provide examples of how these capabilities might be demonstrated. Conclusions The results of this study inform a cross-faculty eHealth curriculum that aligns with workforce expectations. There is a need for educational curriculum to reinforce existing eHealth capabilities, adapt existing capabilities to make them transferable to novel eHealth contexts, and introduce new learning opportunities for interactions with technologies within education and practice encounters. As such, the capability framework developed may assist in the application of eHealth by emerging and existing health care professionals. 
Future research needs to explore the potential for integration of findings into workforce development programs. PMID:29764794
An efficient approach to the deployment of complex open source information systems
Cong, Truong Van Chi; Groeneveld, Eildert
2011-01-01
Complex open source information systems are usually implemented as component-based software to inherit the available functionality of existing software packages developed by third parties. Consequently, the deployment of these systems not only requires the installation of the operating system and application framework and the configuration of services, but also needs to resolve the dependencies among components. The problem becomes more challenging when the application must be installed and used on different platforms such as Linux and Windows. To address this, an efficient approach using virtualization technology is suggested and discussed in this paper. The approach has been applied in our project to deploy a web-based integrated information system in molecular genetics labs. It is a low-cost solution that benefits both software developers and end-users. PMID:22102770
Using Ada to implement the operations management system in a community of experts
NASA Technical Reports Server (NTRS)
Frank, M. S.
1986-01-01
An architecture is described for the Space Station Operations Management System (OMS), consisting of a distributed expert system framework implemented in Ada. The motivation for such a scheme is based on the desire to integrate the very diverse elements of the OMS while taking maximum advantage of knowledge-based systems technology. Part of the foundation of an Ada-based distributed expert system was accomplished in the form of a proof-of-concept prototype for the KNOMES project (Knowledge-based Maintenance Expert System). This prototype successfully used concurrently active experts to accomplish monitoring and diagnosis for the Remote Manipulator System. The basic concept of this software architecture is named ACTORS, for Ada Cognitive Task ORganization Scheme. It is when one considers the overall problem of integrating all of the OMS elements into a cooperative system that the AI solution stands out. By utilizing a distributed knowledge-based system as the framework for OMS, it is possible to integrate those components which need to share information in an intelligent manner.
NASA Technical Reports Server (NTRS)
Flora-Adams, Dana; Makihara, Jeanne; Benenyan, Zabel; Berner, Jeff; Kwok, Andrew
2007-01-01
Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.
ERIC Educational Resources Information Center
Monaghan, John
2013-01-01
This paper offers a framework, an extension of Valsiner's "zone theory", for the analysis of joint student-teacher development over a series of technology-based mathematics lessons. The framework is suitable for developing research studies over a moderately long period of time and considers interrelated student-teacher development as…
Design and Application of an Ontology for Component-Based Modeling of Water Systems
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2012-12-01
Many Earth system modeling frameworks have adopted an approach of componentizing models so that a large model can be assembled by linking a set of smaller model components. These model components can then be more easily reused, extended, and maintained by a large group of model developers and end users. While there has been a notable increase in component-based model frameworks in the Earth sciences in recent years, there has been less work on creating framework-agnostic metadata and ontologies for model components. Well-defined model component metadata is needed, however, to facilitate sharing, reuse, and interoperability both within and across Earth system modeling frameworks. To address this need, we have designed an ontology for the water resources community named the Water Resources Component (WRC) ontology in order to advance the application of component-based modeling frameworks across water related disciplines. Here we present the design of the WRC ontology and demonstrate its application for integration of model components used in watershed management. First we show how the watershed modeling system Soil and Water Assessment Tool (SWAT) can be decomposed into a set of hydrological and ecological components that adopt the Open Modeling Interface (OpenMI) standard. Then we show how the components can be used to estimate nitrogen losses from land to surface water for the Baltimore Ecosystem study area. Results of this work are (i) a demonstration of how the WRC ontology advances the conceptual integration between components of water related disciplines by handling the semantic and syntactic heterogeneity present when describing components from different disciplines and (ii) an investigation of a methodology by which large models can be decomposed into a set of model components that can be well described by populating metadata according to the WRC ontology.
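The component-linking pattern described above can be sketched in a few lines. The classes, exchange-item names, and coefficients below are invented for illustration and are not the actual OpenMI or WRC interfaces:

```python
# Hypothetical sketch of OpenMI-style component linking; names and
# coefficients are illustrative, not the real OpenMI/WRC API.

class Component:
    """A minimal linkable model component with named input/output exchange items."""
    def __init__(self, name):
        self.name = name
        self.inputs = {}     # item name -> (providing component, its item name)
        self.outputs = {}    # item name -> latest computed value

    def link(self, item, provider, provider_item):
        self.inputs[item] = (provider, provider_item)

    def get_value(self, item):
        return self.outputs[item]

class RunoffComponent(Component):
    def update(self, rainfall_mm):
        # Toy runoff model: fixed runoff coefficient.
        self.outputs["runoff_mm"] = 0.5 * rainfall_mm

class NitrogenComponent(Component):
    def update(self):
        # Pull runoff from the linked upstream component.
        provider, item = self.inputs["runoff_mm"]
        runoff = provider.get_value(item)
        # Toy export model: nitrogen load proportional to runoff.
        self.outputs["n_load_kg"] = 0.1 * runoff

runoff = RunoffComponent("hydrology")
nitrogen = NitrogenComponent("nutrients")
nitrogen.link("runoff_mm", runoff, "runoff_mm")

runoff.update(rainfall_mm=10.0)
nitrogen.update()
print(nitrogen.get_value("n_load_kg"))  # → 0.5
```

An ontology such as the WRC then describes each component's exchange items (units, spatial support, semantics) so that such links can be validated across disciplines.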
Dshell++: A Component Based, Reusable Space System Simulation Framework
NASA Technical Reports Server (NTRS)
Lim, Christopher S.; Jain, Abhinandan
2009-01-01
This paper describes the multi-mission Dshell++ simulation framework for high fidelity, physics-based simulation of spacecraft, robotic manipulation and mobility systems. Dshell++ is a C++/Python library which uses modern script driven object-oriented techniques to allow component reuse and a dynamic run-time interface for complex, high-fidelity simulation of spacecraft and robotic systems. The goal of the Dshell++ architecture is to manage the inherent complexity of physics-based simulations while supporting component model reuse across missions. The framework provides several features that support a large degree of simulation configurability and usability.
Liang, Albert K.; Koniczek, Martin; Antonuk, Larry E.; El-Mohri, Youcef; Zhao, Qihua; Street, Robert A.; Lu, Jeng Ping
2017-01-01
Photon counting arrays (PCAs), defined as pixelated imagers which measure the absorbed energy of x-ray photons individually and record this information digitally, are of increasing clinical interest. A number of PCA prototypes with a 1 mm pixel-to-pixel pitch have recently been fabricated with polycrystalline silicon (poly-Si) — a thin-film technology capable of creating monolithic imagers of a size commensurate with human anatomy. In this study, analog and digital simulation frameworks were developed to provide insight into the influence of individual poly-Si transistors on pixel circuit performance — information that is not readily available through empirical means. The simulation frameworks were used to characterize the circuit designs employed in the prototypes. The analog framework, which determines the noise produced by individual transistors, was used to estimate energy resolution, as well as to identify which transistors contribute the most noise. The digital framework, which analyzes how well circuits function in the presence of significant variations in transistor properties, was used to estimate how fast a circuit can produce an output (referred to as output count rate). In addition, an algorithm was developed and used to estimate the minimum pixel pitch that could be achieved for the pixel circuits of the current prototypes. The simulation frameworks predict that the analog component of the PCA prototypes could have energy resolution as low as 8.9% FWHM at 70 keV; and the digital components should work well even in the presence of significant TFT variations, with the fastest component having output count rates as high as 3 MHz. Finally, based on conceivable improvements in the underlying fabrication process, the algorithm predicts that the 1 mm pitch of the current PCA prototypes could be reduced significantly, potentially to between ~240 and 290 μm. PMID:26878107
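The link between circuit noise and energy resolution that the analog framework estimates can be illustrated with the standard Gaussian FWHM relation. This is only a back-of-envelope sketch; the paper's framework derives the noise from the transistor-level circuit:

```python
# For Gaussian noise, FWHM = 2*sqrt(2*ln 2) * sigma (~2.355 * sigma).
# Illustrative only: the noise level below is back-computed from the
# reported 8.9% FWHM at 70 keV, not taken from the circuit simulation.
import math

GAUSS_FWHM = 2 * math.sqrt(2 * math.log(2))   # ~2.355

def fwhm_percent(sigma_keV, energy_keV):
    """Energy resolution (% FWHM) for Gaussian noise of width sigma_keV."""
    return 100 * GAUSS_FWHM * sigma_keV / energy_keV

# Noise width consistent with 8.9% FWHM at 70 keV:
sigma = 0.089 * 70 / GAUSS_FWHM
print(round(sigma, 2), round(fwhm_percent(sigma, 70), 1))  # ~2.65 keV, 8.9
```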
Preparing Teachers for Technology Based Teaching-Learning Using TPACK
ERIC Educational Resources Information Center
Padmavathi, M.
2017-01-01
Technological Pedagogical Content Knowledge (TPACK) is a conceptual framework for teachers to teach effectively using technology. This framework originates from the opinion that use of technology in educational context would be effective only if content, pedagogy and technology are aligned carefully. It implies that for teachers to use technology…
The MMI Semantic Framework: Rosetta Stones for Earth Sciences
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L. E.; Graybeal, J.; Alexander, P.
2009-12-01
Semantic interoperability—the exchange of meaning among computer systems—is needed to successfully share data in Ocean Science and across all Earth sciences. The best approach toward semantic interoperability requires a designed framework, and operationally tested tools and infrastructure within that framework. Currently available technologies make a scientific semantic framework feasible, but its development requires sustainable architectural vision and development processes. This presentation outlines the MMI Semantic Framework, including recent progress on it and its client applications. The MMI Semantic Framework consists of tools, infrastructure, and operational and community procedures and best practices, to meet short-term and long-term semantic interoperability goals. The design and prioritization of the semantic framework capabilities are based on real-world scenarios in Earth observation systems. We describe some key use cases, as well as the associated requirements for building the overall infrastructure, which is realized through the MMI Ontology Registry and Repository. This system includes support for community creation and sharing of semantic content, ontology registration, version management, and seamless integration of user-friendly tools and application programming interfaces. The presentation describes the architectural components for semantic mediation, registry and repository for vocabularies, ontology, and term mappings. We show how the technologies and approaches in the framework can address community needs for managing and exchanging semantic information. We will demonstrate how different types of users and client applications exploit the tools and services for data aggregation, visualization, archiving, and integration. Specific examples from OOSTethys (http://www.oostethys.org) and the Ocean Observatories Initiative Cyberinfrastructure (http://www.oceanobservatories.org) will be cited.
Finally, we show how semantic augmentation of web services standards could be performed using framework tools.
NASA Astrophysics Data System (ADS)
Stenzel, J.; Hudiburg, T. W.; Berardi, D.; McNellis, B.; Walsh, E.
2017-12-01
In forests vulnerable to drought and fire, there is a critical need for in situ carbon and water balance measurements that can be integrated with earth system modeling to predict climate feedbacks. Model development can be improved by measurements that inform a mechanistic understanding of the component fluxes of net carbon uptake (i.e., NPP, autotrophic and heterotrophic respiration) and water use, with specific focus on responses to climate and disturbance. By integrating novel field-based instrumental technology, existing datasets, and state-of-the-art earth system modeling, we are attempting to 1) quantify the spatial and temporal impacts of forest thinning on regional biogeochemical cycling and climate and 2) evaluate the impact of forest thinning on forest resilience to drought and disturbance in the Northern Rockies ecoregion. The combined model-experimental framework enables hypothesis testing that would otherwise be impossible because the use of new in situ high temporal resolution field technology allows for research in remote and mountainous terrains that have been excluded from eddy-covariance techniques. Our preliminary work has revealed some underlying difficulties with the new instrumentation that have led to new ideas and modified methods for correctly measuring the component fluxes. Our observations of C balance following the thinning operations indicate that the recovery period (source to sink) is longer than hypothesized. Finally, we have incorporated a new plant functional type parameterization for Northern Rocky mixed-conifer into our simulation modeling using regional and site observations.
Loubet, Philippe; Roux, Philippe; Bellon-Maurel, Véronique
2016-01-01
Emphasis on sustainable urban water management has increased over the last decades. In this context, decision makers need tools to measure and improve the environmental performance of urban water systems (UWS) and their related scenarios. In this paper, we propose a versatile model, named WaLA (Water system Life cycle Assessment), which reduces the complexity of the UWS while ensuring a good representation of water issues and fulfilling life cycle assessment (LCA) requirements. Indeed, LCAs require building UWS models, which can be tedious if several scenarios are to be compared. The WaLA model is based on a framework that uses a "generic component" representing alternately water technology units and water users, with their associated water flows, and the associated impacts due to water deprivation, emissions, operation and infrastructure. UWS scenarios can be built by inter-operating and connecting the technologies and users components in a modular and integrated way. The model calculates life cycle impacts at a monthly temporal resolution for a set of services provided to users, as defined by the scenario. It also provides the ratio of impacts to amount of services provided and useful information for UWS diagnosis or comparison of different scenarios. The model is implemented in a Matlab/Simulink interface thanks to object-oriented programming. The applicability of the model is demonstrated using a virtual case study based on available life cycle inventory data. Copyright © 2015 Elsevier Ltd. All rights reserved.
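The "generic component" idea can be sketched as follows. The class, loss fractions, and impact factors below are hypothetical stand-ins for illustration, not the WaLA implementation (which runs in Matlab/Simulink):

```python
# Illustrative sketch of a WaLA-style "generic component"; all names and
# impact factors are invented for the example, not from the paper.

MONTHS = 12

class GenericComponent:
    """Represents either a water technology unit or a water user.

    Given a monthly delivered volume (m3), it reports the volume it must
    withdraw upstream and its own life cycle impact (e.g. kg CO2-eq).
    """
    def __init__(self, name, loss_fraction=0.0, impact_per_m3=0.0):
        self.name = name
        self.loss_fraction = loss_fraction   # share of water lost in the unit
        self.impact_per_m3 = impact_per_m3   # operation + infrastructure impact

    def upstream_demand(self, delivered_m3):
        # To deliver X downstream, the unit must withdraw more upstream.
        return delivered_m3 / (1.0 - self.loss_fraction)

    def impact(self, delivered_m3):
        return self.impact_per_m3 * self.upstream_demand(delivered_m3)

# A simple scenario chain: user <- distribution network <- treatment plant
user_demand = [100.0] * MONTHS                 # m3 per month
network = GenericComponent("distribution", loss_fraction=0.2, impact_per_m3=0.05)
plant = GenericComponent("treatment", loss_fraction=0.0, impact_per_m3=0.30)

monthly_impacts = []
for demand in user_demand:
    withdrawn = network.upstream_demand(demand)   # network losses inflate withdrawal
    total = network.impact(demand) + plant.impact(withdrawn)
    monthly_impacts.append(total)

print(round(sum(monthly_impacts), 1))  # annual impact in kg CO2-eq
```

Connecting components this way keeps each scenario modular: swapping a treatment technology only replaces one component while the monthly accounting stays unchanged.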
Information Technology Integration in Teacher Education: Supporting the Paradigm Shift in Hong Kong.
ERIC Educational Resources Information Center
Lee, Kar Tin
2001-01-01
Examines the integration of information technology (IT) at the Hong Kong Institute of Education, presenting the rationale for this move, characteristics of IT integration, and program development issues for making IT application a critical component of contemporary teacher education. The paper presents a framework for program development and…
Higher-Order Theory for Functionally Graded Materials
NASA Technical Reports Server (NTRS)
Aboudi, Jacob; Pindera, Marek-Jerzy; Arnold, Steven M.
1999-01-01
This paper presents the full generalization of the Cartesian coordinate-based higher-order theory for functionally graded materials developed by the authors during the past several years. This theory circumvents the problematic use of the standard micromechanical approach, based on the concept of a representative volume element, commonly employed in the analysis of functionally graded composites by explicitly coupling the local (microstructural) and global (macrostructural) responses. The theoretical framework is based on volumetric averaging of the various field quantities, together with imposition of boundary and interfacial conditions in an average sense between the subvolumes used to characterize the composite's functionally graded microstructure. The generalization outlined herein involves extension of the theoretical framework to enable the analysis of materials characterized by spatially variable microstructures in three directions. Specialization of the generalized theoretical framework to previously published versions of the higher-order theory for materials functionally graded in one and two directions is demonstrated. In the applications part of the paper we summarize the major findings obtained with the one-directional and two-directional versions of the higher-order theory. The results illustrate both the fundamental issues related to the influence of microstructure on microscopic and macroscopic quantities governing the response of composites and the technologically important applications. A major issue addressed herein is the applicability of the classical homogenization schemes in the analysis of functionally graded materials. The technologically important applications illustrate the utility of functionally graded microstructures in tailoring the response of structural components in a variety of applications involving uniform and gradient thermomechanical loading.
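The volumetric averaging at the core of the theory can be illustrated with the standard micromechanics definitions; this is a generic statement in conventional notation, not the paper's specific equations:

```latex
\bar{\sigma}_{ij} \;=\; \frac{1}{V}\int_{V} \sigma_{ij}(\mathbf{x})\, dV,
\qquad
\bar{\varepsilon}_{ij} \;=\; \frac{1}{V}\int_{V} \varepsilon_{ij}(\mathbf{x})\, dV
```

Here $V$ is a subvolume of the graded microstructure; the theory couples the subvolumes by imposing traction and displacement continuity in an average sense at their interfaces, rather than assuming a periodic representative volume element.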
Kristensen, Finn Børlum; Lampe, Kristian; Wild, Claudia; Cerbo, Marina; Goettsch, Wim; Becla, Lidia
2017-02-01
The HTA Core Model® as a science-based framework for assessing dimensions of value was developed as a part of the European network for Health Technology Assessment project in the period 2006 to 2008 to facilitate production and sharing of health technology assessment (HTA) information, such as evidence on efficacy and effectiveness and patient aspects, to inform decisions. It covers clinical value as well as organizational, economic, and patient aspects of technologies and has been field-tested in two consecutive joint actions in the period 2010 to 2016. A large number of HTA institutions were involved in the work. The model has undergone revisions and improvement after iterations of piloting and can be used in a local, national, or international context to produce structured HTA information that can be taken forward by users into their own frameworks to fit their specific needs when informing decisions on technology. The model has a broad scope and offers a common ground to various stakeholders through offering a standard structure and a transparent set of proposed HTA questions. It consists of three main components: 1) the HTA ontology, 2) methodological guidance, and 3) a common reporting structure. It covers domains such as effectiveness, safety, and economics, and also includes domains covering organizational, patient, social, and legal aspects. There is a full model and a focused rapid relative effectiveness assessment model, and a third joint action is to continue till 2020. The HTA Core Model is now available for everyone around the world as a framework for assessing value. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations
NASA Astrophysics Data System (ADS)
Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.
2010-11-01
We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible lightweight component model that streamlines the integration of standalone codes into coupled simulations. Standalone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation, configuration, task, and data management, asynchronous event management, simulation monitoring, and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.
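The thin-wrapper pattern described above can be sketched roughly as follows. All class, method, and event names here are illustrative assumptions, not the actual IPS API:

```python
# Hedged sketch of an IPS-style thin wrapper: a standalone physics code is
# adapted to a framework lifecycle interface and exchanges data through a
# shared plasma state. Names are illustrative, not the real IPS API.

class FrameworkServices:
    """Stand-in for the framework's task, data, and event services."""
    def __init__(self):
        self.events = []
        self.plasma_state = {"time": 0.0, "T_e_keV": 1.0}

    def launch_task(self, binary, *args):
        # A real framework would submit this to the MPP batch system;
        # here we only record the request.
        self.events.append(("task", binary, args))

    def publish_event(self, topic, payload):
        self.events.append(("event", topic, payload))

class Component:
    """Lifecycle interface the framework driver invokes on every component."""
    def __init__(self, services):
        self.services = services
    def init(self, t): pass
    def step(self, t): pass
    def finalize(self, t): pass

class HeatingComponent(Component):
    """Thin wrapper adapting a standalone heating code to the interface."""
    def step(self, t):
        state = self.services.plasma_state              # shared plasma state
        self.services.launch_task("heating_code", f"--time={t}")
        state["T_e_keV"] *= 1.1                         # stand-in for the code's effect
        state["time"] = t
        self.services.publish_event("heating_done", {"time": t})

services = FrameworkServices()
comp = HeatingComponent(services)
comp.step(1.0)
print(services.plasma_state["time"], len(services.events))  # 1.0 2
```

The design choice is that the wrapper, not the physics code, speaks the framework's protocol, so legacy executables need no modification to participate in a coupled simulation.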
NASA Astrophysics Data System (ADS)
Wallace, Jon Michael
2003-10-01
Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. 
The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
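The large-sample Monte Carlo validation step mentioned above can be illustrated with a toy two-response component. The limits, response statistics, and correlation below are invented for the example and are not the turbine-airfoil values:

```python
# Minimal Monte Carlo sketch of a multi-response reliability assessment:
# two correlated normal responses, failure if either exceeds its limit.
# All numbers are illustrative, not from the study.
import random, math

random.seed(0)
N = 200_000

rho = 0.6          # correlation between the two component responses
failures = 0
for _ in range(N):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    # Correlated standard normals via a 2x2 Cholesky factor.
    u1 = z1
    u2 = rho * z1 + math.sqrt(1 - rho**2) * z2
    stress = 400 + 30 * u1        # MPa
    temp = 900 + 40 * u2          # K
    # Component fails if either response exceeds its limit.
    if stress > 480 or temp > 1000:
        failures += 1

pf = failures / N
print(f"estimated failure probability ~ {pf:.4f}")
```

The point of the framework's joint probability models is to capture exactly this kind of correlated, non-independent failure behavior that an "isolated" per-response assessment would miss.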
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trentadue, R.; Clemencic, M.; Dykstra, D.
The LCG Persistency Framework consists of three software packages (CORAL, COOL and POOL) that address the data access requirements of the LHC experiments in several different areas. The project is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that are using some or all of the Persistency Framework components to access their data. POOL is a hybrid technology store for C++ objects, using a mixture of streaming and relational technologies to implement both object persistency and object metadata catalogs and collections. CORAL is an abstraction layer with an SQL-free API for accessing data stored using relational database technologies. COOL provides specific software components and tools for the handling of the time variation and versioning of the experiment conditions data. This presentation reports on the status and outlook in each of the three sub-projects at the time of the CHEP2012 conference, reviewing the usage of each package in the three LHC experiments.
ERIC Educational Resources Information Center
Skoretz, Yvonne M.; Cottle, Amy E.
2011-01-01
Meeting International Society for Technology in Education competencies creates a challenge for teachers. The authors provide a problem-based video framework that guides teachers in enhancing 21st century skills to meet those competencies. To keep the focus on the content, the authors suggest teaching the technology skills only at the point the…
Technological Pedagogical Content Knowledge -- A Review of the Literature
ERIC Educational Resources Information Center
Voogt, J.; Fisser, P.; Roblin, N. Pareja; Tondeur, J.; van Braak, J.
2013-01-01
Technological Pedagogical Content Knowledge (TPACK) has been introduced as a conceptual framework for the knowledge base teachers need to effectively teach with technology. The framework stems from the notion that technology integration in a specific educational context benefits from a careful alignment of content, pedagogy and the potential of…
Osabohien, Romanus; Osabuohien, Evans; Urhie, Ese
2018-01-01
Background: Growth in agricultural science and technology is deemed essential for increasing agricultural output, reducing the vulnerability of the rural poor, and, in turn, improving food security. Food security and growth in agricultural output depend on technology usage, which enhances the productive capacity of the agricultural sector. The indicators of food security utilised in this study include dietary energy supply, average value of food production, and prevalence of food inadequacy, among others. Objective: In this paper, we examined the level of technology and how investment in agriculture and technology can improve technical know-how in Nigeria with a view to achieving food security. Method: We analysed how investment in technology and the institutional framework can improve food availability (a key component of food security) in Nigeria using an econometric technique based on the Autoregressive Distributed Lag (ARDL) framework. Results: The results showed, inter alia, that there is a high level of food insecurity in Nigeria, resulting from low attention to food production occasioned by the pervasive influence of oil, which has become the major export product. Conclusion: The availability of arable land was noted as one of the major factors for increasing food production to address food insecurity. Efforts to reduce food insecurity are therefore essential. This can be achieved, among other means, by active interaction between government and farmers, so that farmers contribute to important planning issues relating to food production in the country; above all, social protection policies should be channelled to the agricultural sector to protect farmers who are vulnerable to shocks and the risks associated with agriculture. PMID:29853816
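The ARDL approach named in the method can be illustrated with a toy ARDL(1,0) regression on synthetic data; the study's actual variables, lag orders, and estimation procedure differ:

```python
# Toy ARDL(1,0) sketch: outcome y_t (e.g. food availability) regressed on
# its own lag and a current regressor x_t (e.g. technology investment).
# Synthetic data; illustrative only.
import random

random.seed(1)
T = 500
a_true, b_true = 0.7, 0.5
x = [random.gauss(0, 1) for _ in range(T)]
y = [0.0]
for t in range(1, T):
    y.append(a_true * y[t - 1] + b_true * x[t] + random.gauss(0, 0.1))

# OLS for y_t = a*y_{t-1} + b*x_t via the 2x2 normal equations.
s11 = sum(y[t - 1] ** 2 for t in range(1, T))
s12 = sum(y[t - 1] * x[t] for t in range(1, T))
s22 = sum(x[t] ** 2 for t in range(1, T))
r1 = sum(y[t - 1] * y[t] for t in range(1, T))
r2 = sum(x[t] * y[t] for t in range(1, T))
det = s11 * s22 - s12 ** 2
a_hat = (s22 * r1 - s12 * r2) / det
b_hat = (s11 * r2 - s12 * r1) / det
print(round(a_hat, 2), round(b_hat, 2))  # close to the true 0.7 and 0.5
```

In an ARDL setting the lag coefficient `a_hat` governs persistence, so the implied long-run effect of `x` is `b_hat / (1 - a_hat)`, which is how such models separate short-run from long-run relationships.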
NASA Astrophysics Data System (ADS)
Boerwinkel, Dirk Jan; Yarden, Anat; Waarlo, Arend Jan
2017-12-01
To determine what knowledge of genetics is needed for decision-making on genetic-related issues, a consensus-reaching approach was used. An international group of 57 experts, involved in teaching, studying, or developing genetic education and communication or working with genetic applications in medicine, agriculture, or forensics, answered the questions: "What knowledge of genetics is relevant to those individuals not professionally involved in science?" and "Why is this knowledge relevant?" The answers were classified in different knowledge components following the PISA 2015 science framework. During a workshop with the participants, the results were discussed and applied to seven cases in which genetic knowledge is relevant for decision-making. The analysis of these discussions resulted in a revised framework consisting of nine conceptual knowledge components, three sociocultural components, and four epistemic components. The framework can be used in curricular decisions; its open character allows for including new technologies and applications and facilitates comparisons of different cases.
Engineering Social Justice into Traffic Control for Self-Driving Vehicles?
Mladenovic, Milos N; McPherson, Tristram
2016-08-01
The convergence of computing, sensing, and communication technology will soon permit large-scale deployment of self-driving vehicles. This will in turn permit a radical transformation of traffic control technology. This paper makes a case for the importance of addressing questions of social justice in this transformation, and sketches a preliminary framework for doing so. We explain how new forms of traffic control technology have potential implications for several dimensions of social justice, including safety, sustainability, privacy, efficiency, and equal access. Our central focus is on efficiency and equal access as desiderata for traffic control design. We explain the limitations of conventional traffic control in meeting these desiderata, and sketch a preliminary vision for a next-generation traffic control tailored to address better the demands of social justice. One component of this vision is cooperative, hierarchically distributed self-organization among vehicles. Another component of this vision is a priority system enabling selection of priority levels by the user for each vehicle trip in the network, based on the supporting structure of non-monetary credits.
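The paper only sketches the non-monetary priority-credit idea, so the toy below is a hypothetical reading of it, not the authors' protocol: each vehicle spends credits to claim a priority level for a trip, and trips are served highest priority first. The credit-cost rule and all names are assumptions.

```python
# Toy non-monetary priority-credit scheme (hypothetical rules for illustration).

class Vehicle:
    def __init__(self, vid, credits):
        self.vid = vid
        self.credits = credits

    def request_priority(self, level):
        """Spend credits to claim a priority level for the current trip."""
        cost = level  # assumption: higher priority costs proportionally more
        if cost > self.credits:
            level, cost = 0, 0  # insufficient credits -> baseline priority
        self.credits -= cost
        return level

def service_order(requests):
    """Serve higher-priority trips first; ties broken by arrival order."""
    order = sorted(range(len(requests)), key=lambda i: (-requests[i][1], i))
    return [requests[i][0] for i in order]

if __name__ == "__main__":
    a, b = Vehicle("a", credits=5), Vehicle("b", credits=1)
    reqs = [("a", a.request_priority(3)), ("b", b.request_priority(3))]
    print(service_order(reqs))  # "b" lacks credits, so "a" is served first
```

The point of a credit (rather than monetary) budget is that every user can afford occasional high priority, which bears on the equal-access desideratum the paper emphasizes.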
Big data and high-performance analytics in structural health monitoring for bridge management
NASA Astrophysics Data System (ADS)
Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed
2016-04-01
Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated by functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage the risks associated with aging bridge inventories. Accelerated processing of the data sets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. It enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics yield insights that assist bridge owners in addressing problems faster.
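The abstract does not define its reliability indices, but a common first-order formulation (not necessarily the one the authors use) treats resistance R and load effect S as independent normal variables, giving β = (μ_R − μ_S) / √(σ_R² + σ_S²) and a failure probability Φ(−β). A stdlib-only sketch:

```python
from math import sqrt
from statistics import NormalDist
# First-order reliability index for a component, assuming independent normal
# resistance R and load effect S (a common textbook form, used here only to
# illustrate what a "reliability index" is; the paper's formulation may differ).

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    beta = (mu_r - mu_s) / sqrt(sigma_r ** 2 + sigma_s ** 2)
    p_fail = NormalDist().cdf(-beta)  # probability that load S exceeds resistance R
    return beta, p_fail

if __name__ == "__main__":
    beta, pf = reliability_index(mu_r=10.0, sigma_r=1.0, mu_s=6.0, sigma_s=sqrt(3))
    print(beta, pf)  # beta = 2.0, pf ≈ 0.0228
```

Monitoring data feeds such a calculation by sharpening the estimates of the means and standard deviations, which is precisely where an SHM data pipeline adds value.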
Teaching Statistics with Technology
ERIC Educational Resources Information Center
Prodromou, Theodosia
2015-01-01
The Technological Pedagogical Content Knowledge (TPACK) conceptual framework for teaching mathematics, developed by Mishra and Koehler (2006), emphasises the importance of developing integrated and interdependent understanding of three primary forms of knowledge: technology, pedagogy, and content. The TPACK conceptual framework is based upon the…
NASA Astrophysics Data System (ADS)
Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.
2014-12-01
In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. 
(2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a modeling framework's native component interface. (3) Create semantic mappings between modeling frameworks that support semantic mediation. This third goal involves creating a crosswalk between the CF Standard Names and the CSDMS Standard Names (a set of naming conventions). This talk will summarize progress towards these goals.
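The adapter idea in goal (2) can be sketched as follows: a toy model exposes a minimal BMI-style interface (initialize/update/get_value/finalize, simplified from the actual BMI specification), and a thin adapter maps it onto a hypothetical framework's native interface. All class and method names here are illustrative, not CSDMS or ESMF APIs.

```python
# Sketch of a BMI-to-native-interface adapter. The BMI methods shown are a
# simplified subset; the framework-native interface is entirely hypothetical.

class DecayModel:
    """Toy model: dT/dt = -k*T, advanced with explicit Euler steps."""

    def initialize(self, config):
        self.t = 0.0
        self.dt = config["dt"]
        self.k = config["k"]
        self.temp = config["initial_temp"]

    def update(self):
        self.temp *= (1.0 - self.k * self.dt)
        self.t += self.dt

    def get_value(self, name):
        return {"temperature": self.temp, "time": self.t}[name]

    def finalize(self):
        pass

class NativeComponent:
    """Hypothetical framework-native interface: run(until) instead of update()."""

    def __init__(self, bmi_model, config):
        self._m = bmi_model
        self._m.initialize(config)

    def run(self, until):
        while self._m.get_value("time") < until:
            self._m.update()

    def output(self, name):
        return self._m.get_value(name)

if __name__ == "__main__":
    comp = NativeComponent(DecayModel(), {"dt": 1.0, "k": 0.1, "initial_temp": 100.0})
    comp.run(until=2.0)
    print(comp.output("temperature"))  # 100 * 0.9 * 0.9 ≈ 81.0
```

Because the adapter contains no model-specific logic, one such adapter per framework suffices for every BMI-wrapped model, which is the economy the Earth System Bridge project is after.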
Design of a component-based integrated environmental modeling framework
Integrated environmental modeling (IEM) includes interdependent science-based components (e.g., models, databases, viewers, assessment protocols) that comprise an appropriate software modeling system. The science-based components are responsible for consuming and producing inform...
The Common Data Acquisition Platform in the Helmholtz Association
NASA Astrophysics Data System (ADS)
Kaever, P.; Balzer, M.; Kopmann, A.; Zimmer, M.; Rongen, H.
2017-04-01
Various centres of the German Helmholtz Association (HGF) started in 2012 to develop a modular data acquisition (DAQ) platform covering the entire range from detector readout to data transfer into parallel computing environments. This platform integrates generic hardware components such as the multi-purpose HGF Advanced Mezzanine Card or a smart scientific camera framework, adding user value with Linux drivers and board support packages. Technically, the scope comprises the DAQ chain from FPGA modules to computing servers, notably front-end electronics interfaces, microcontrollers, and GPUs with their software, plus high-performance data transmission links. The core idea is a generic, component-based approach that enables specific experiment requirements to be implemented with low effort. This so-called DTS platform will support standards such as MTCA.4 in hardware and software to ensure compatibility with commercial components. Its capability to deploy on other crate standards or FPGA boards with PCI Express or Ethernet interfaces remains an essential feature. The competences of the participating centres are coordinated to provide a solid technological basis for both research topics in the Helmholtz Programme ``Matter and Technology'': ``Detector Technology and Systems'' and ``Accelerator Research and Development''. The DTS platform aims to reduce costs and development time and will ensure the collaboration's access to the latest technologies. Owing to its flexible approach, it has the potential to be applied in other scientific programs.
Thomas, Bex George; Elasser, Ahmed; Bollapragada, Srinivas; Galbraith, Anthony William; Agamy, Mohammed; Garifullin, Maxim Valeryevich
2016-03-29
A system and method of using one or more DC-DC/DC-AC converters and/or alternative devices allows strings of multiple module technologies to coexist within the same PV power plant. A computing (optimization) framework estimates the percentage allocation of PV power plant capacity to selected PV module technologies. The framework and its supporting components consider irradiation, temperature, spectral profiles, cost, and other practical constraints to achieve the lowest levelized cost of electricity, maximum output, and minimum system cost. The system and method can function with any device enabling distributed maximum power point tracking at the module, string, or combiner level.
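A deliberately simplified version of the allocation problem can be swept by brute force. The numbers below are hypothetical; note that under this purely linear cost/yield model the optimum is a corner solution (all capacity to one technology), whereas the patent's framework adds spectral, temperature, and practical constraints that can favour mixed allocations.

```python
# Toy capacity-allocation sweep between two PV module technologies.
# All figures are invented for illustration.

def cost_per_kwh(alloc, techs, capacity_kw):
    """Simple levelized-cost proxy: total cost / total annual energy."""
    cost = energy = 0.0
    for frac, (usd_per_kw, kwh_per_kw_yr) in zip(alloc, techs):
        cost += frac * capacity_kw * usd_per_kw
        energy += frac * capacity_kw * kwh_per_kw_yr
    return cost / energy

def best_split(techs, capacity_kw, step=0.05):
    """Sweep the fraction assigned to technology 0; return the cheapest split."""
    best = None
    f = 0.0
    while f <= 1.0 + 1e-9:
        c = cost_per_kwh((f, 1.0 - f), techs, capacity_kw)
        if best is None or c < best[1]:
            best = (round(f, 2), c)
        f += step
    return best

if __name__ == "__main__":
    techs = [(900.0, 1500.0),   # tech 0: $/kW, kWh/kW/yr
             (700.0, 1400.0)]   # tech 1: cheaper per delivered kWh
    print(best_split(techs, capacity_kw=1000.0))  # all capacity to tech 1
```

Replacing the cost model with one that responds to spectrum and temperature is what turns this toy sweep into the kind of optimization the patent describes.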
Grist : grid-based data mining for astronomy
NASA Technical Reports Server (NTRS)
Jacob, Joseph C.; Katz, Daniel S.; Miller, Craig D.; Walia, Harshpreet; Williams, Roy; Djorgovski, S. George; Graham, Matthew J.; Mahabal, Ashish; Babu, Jogesh; Berk, Daniel E. Vanden;
2004-01-01
The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the 'hyperatlas' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.
Grist: Grid-based Data Mining for Astronomy
NASA Astrophysics Data System (ADS)
Jacob, J. C.; Katz, D. S.; Miller, C. D.; Walia, H.; Williams, R. D.; Djorgovski, S. G.; Graham, M. J.; Mahabal, A. A.; Babu, G. J.; vanden Berk, D. E.; Nichol, R.
2005-12-01
The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the ``hyperatlas'' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.
Martin, J B; Wilkins, A S; Stawski, S K
1998-08-01
The evolving health care environment demands that health care organizations fully utilize information technologies (ITs). The effective deployment of IT requires the development and implementation of a comprehensive IT strategic plan. A number of approaches to health care IT strategic planning exist, but they are outdated or incomplete. The component alignment model (CAM) introduced here recognizes the complexity of today's health care environment, emphasizing continuous assessment and realignment of seven basic components: external environment, emerging ITs, organizational infrastructure, mission, IT infrastructure, business strategy, and IT strategy. The article provides a framework by which health care organizations can develop an effective IT strategic planning process.
Kushniruk, A W; Patel, C; Patel, V L; Cimino, J J
2001-04-01
The World Wide Web provides an unprecedented opportunity for widespread access to health-care applications by both patients and providers. The development of new methods for assessing the effectiveness and usability of these systems is becoming a critical issue. This paper describes the distance evaluation (i.e. 'televaluation') of emerging Web-based information technologies. In health informatics evaluation, there is a need for application of new ideas and methods from the fields of cognitive science and usability engineering. A framework is presented for conducting evaluations of health-care information technologies that integrates a number of methods, ranging from deployment of on-line questionnaires (and Web-based forms) to remote video-based usability testing of user interactions with clinical information systems. Examples illustrating application of these techniques are presented for the assessment of a patient clinical information system (PatCIS), as well as an evaluation of use of Web-based clinical guidelines. Issues in designing, prototyping and iteratively refining evaluation components are discussed, along with description of a 'virtual' usability laboratory.
ERIC Educational Resources Information Center
Pifarré, Manoli; Martí, Laura; Cujba, Andreea
2015-01-01
This paper explores the effects of a technology-enhanced pedagogical framework on collaborative creativity processes. The pedagogical framework is built on socio-cultural theory which conceptualizes creativity as a social activity based on intersubjectivity and dialogical interactions. Dialogue becomes an instrument for collaborative creativity…
Towards Developing a Regional Drought Information System for Lower Mekong
NASA Astrophysics Data System (ADS)
Dutta, R.; Jayasinghe, S.; Basnayake, S. B.; Apirumanekul, C.; Pudashine, J.; Granger, S. L.; Andreadis, K.; Das, N. N.
2016-12-01
With climate and weather patterns changing over the years, the Lower Mekong Basin has been experiencing frequent and prolonged droughts, resulting in severe damage to the agricultural sector and affecting food security and the livelihoods of the farming community. A Regional Drought Information System (RDIS) for the Lower Mekong countries would help prepare vulnerable communities for frequent and severe droughts through monitoring, assessment, and forecasting of drought conditions, allowing decision makers to take effective decisions on early warning, incentives to farmers, adjustments to cropping calendars, and so on. The RDIS is an integrated system being designed for drought monitoring, analysis, and forecasting to meet the growing demand by the Lower Mekong countries for an effective drought monitoring system. It is built on four major components: an earth observation component, a meteorological data component, database storage, and the Regional Hydrologic Extreme Assessment System (RHEAS) framework; outputs from the system will be made openly accessible to the public through a web-based user interface. The system runs on the RHEAS framework, which allows both nowcasting and forecasting using hydrological and crop simulation models, namely the Variable Infiltration Capacity (VIC) model and the Decision Support System for Agro-Technology Transfer (DSSAT) model, respectively. RHEAS supports a tightly constrained, observation-based drought and crop yield information system that can provide customized drought outputs, including root-zone soil moisture, the Standardized Precipitation Index (SPI), the Standardized Runoff Index (SRI), the Palmer Drought Severity Index (PDSI), and crop yield, and can integrate remote sensing products along with evapotranspiration and soil moisture data.
The anticipated outcome of the RDIS is to improve the operational, technological, and institutional capabilities of the Lower Mekong countries to prepare for and respond to drought situations, and to provide policy makers with current and forecast drought indices for decisions on adjusting cropping calendars and planning short- and long-term mitigation measures.
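Of the indices listed, the SPI is the simplest to sketch. The operational SPI fits a gamma distribution to accumulated precipitation and maps the fitted probability through the inverse normal CDF; the stdlib-only sketch below uses a cruder empirical (plotting-position) probability instead of a gamma fit, and the monthly values are invented.

```python
from statistics import NormalDist
# Empirical SPI-like index: rank-based probability (Weibull plotting position)
# mapped through the inverse normal CDF. The operational SPI fits a gamma
# distribution instead; this is a stdlib-only illustrative approximation.

def spi_empirical(precip):
    n = len(precip)
    order = sorted(range(n), key=lambda i: precip[i])
    nd = NormalDist()
    z = [0.0] * n
    for rank, i in enumerate(order, start=1):
        z[i] = nd.inv_cdf(rank / (n + 1))  # wetter months -> larger z
    return z

if __name__ == "__main__":
    monthly = [82, 95, 60, 30, 12, 5, 3, 8, 25, 55, 70, 90]  # mm, hypothetical
    z = spi_empirical(monthly)
    print(min(z) == z[6])  # the driest month (3 mm) gets the lowest index
```

Negative values flag drier-than-usual periods and positive values wetter ones, which is what makes the index directly usable for the early-warning thresholds the RDIS targets.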
The Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
Use of Annotations for Component and Framework Interoperability
NASA Astrophysics Data System (ADS)
David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.
2009-12-01
The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0 framework/model integration previously accomplished using API calls is now achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively such as implicit multithreading, and auto-documenting capabilities while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to framework by the use of specific APIs and/or data types they can more easily be reused both within the framework as well as outside of it. To study the effectiveness of an annotation based framework approach with other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. 
In a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
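OMS 3.0 uses Java annotations, but the non-invasive idea translates to other languages. As a rough Python analogue (the decorator names below are invented, not the OMS API), declarative metadata can be attached to a component without the component calling any framework API, and a "framework" can then introspect it for wiring and auto-documentation:

```python
# Rough Python analogue of annotation-based component metadata. OMS 3.0 itself
# uses Java annotations; all names here are invented for illustration.

def In(name, unit):
    """Declare a component input non-invasively."""
    def wrap(cls):
        cls._inputs = getattr(cls, "_inputs", []) + [(name, unit)]
        return cls
    return wrap

def Out(name, unit):
    """Declare a component output non-invasively."""
    def wrap(cls):
        cls._outputs = getattr(cls, "_outputs", []) + [(name, unit)]
        return cls
    return wrap

@In("precip", "mm")
@In("temp", "degC")
@Out("runoff", "mm")
class WaterBalance:
    """Component body contains only science code, no framework calls."""
    def execute(self, precip, temp):
        return max(0.0, precip - 0.2 * max(temp, 0.0))  # toy relation

def document(cls):
    """The 'framework' generates documentation purely from the metadata."""
    return {"inputs": cls._inputs, "outputs": cls._outputs}

if __name__ == "__main__":
    print(document(WaterBalance))
    print(WaterBalance().execute(10.0, 5.0))  # 9.0
```

Because `WaterBalance` imports nothing from the framework, it can be reused outside it unchanged, which is exactly the low-invasiveness property the study measured.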
An ICT Adoption Framework for Education: A Case Study in Public Secondary School of Indonesia
NASA Astrophysics Data System (ADS)
Nurjanah, S.; Santoso, H. B.; Hasibuan, Z. A.
2017-01-01
This paper presents preliminary research findings on an ICT adoption framework for education. Although many studies have examined ICT adoption frameworks in education across various countries, they lack analysis of how much each component contributes to the framework's success. In this paper, a set of components linked to ICT adoption in education is identified based on the literature and exploratory analysis: Infrastructure, Application, User Skills, Utilization, Finance, and Policy. These components form the basis of a questionnaire that captures the current state of ICT adoption in schools. The questionnaire data are processed using Structural Equation Modeling (SEM). The results show that each component contributes differently to the ICT adoption framework: Finance has the strongest effect on Infrastructure readiness, whilst User Skills has the strongest effect on Utilization. The study concludes that the development of an ICT adoption framework should weight the components by their contributions, which can then guide the implementation of ICT adoption in education.
Monk, Andrew; Hone, Kate; Lines, Lorna; Dowdall, Alan; Baxter, Gordon; Blythe, Mark; Wright, Peter
2006-09-01
Information and communication technology applications can help increase the independence and quality of life of older people, or people with disabilities who live in their own homes. A risk management framework is proposed to assist in selecting applications that match the needs and wishes of particular individuals. Risk comprises two components: the likelihood of the occurrence of harm and the consequences of that harm. In the home, the social and psychological harms are as important as the physical ones. The importance of the harm (e.g., injury) is conditioned by its consequences (e.g., distress, costly medical treatment). We identify six generic types of harm (including dependency, loneliness, fear and debt) and four generic consequences (including distress and loss of confidence in ability to live independently). The resultant client-centred framework offers a systematic basis for selecting and evaluating technology for independent living.
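The two risk components named in the abstract (likelihood of a harm, and the weight of its consequence) suggest a simple expected-harm score per candidate technology. The sketch below is a hypothetical reading, not the authors' instrument: the harm and consequence labels are taken from the abstract, but the numeric scales and weights are assumptions.

```python
# Toy client-centred risk score: sum over harms of likelihood x consequence
# weight. Labels follow the abstract; scales and weights are assumptions.

CONSEQUENCE_WEIGHT = {
    "distress": 3,
    "loss of confidence": 4,  # loss of confidence in independent living
}

def risk_score(assessment):
    """assessment: harm name -> (likelihood on a 1-5 scale, consequence label)."""
    return sum(likelihood * CONSEQUENCE_WEIGHT[consequence]
               for likelihood, consequence in assessment.values())

if __name__ == "__main__":
    candidate = {
        "dependency": (2, "loss of confidence"),
        "loneliness": (1, "distress"),
    }
    print(risk_score(candidate))  # 2*4 + 1*3 = 11
```

Comparing such scores across candidate applications, per individual client, operationalizes the framework's "systematic basis for selecting and evaluating technology."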
NASA Astrophysics Data System (ADS)
Anku, Sitsofe E.
1997-09-01
Using the reform documents of the National Council of Teachers of Mathematics (NCTM) (NCTM, 1989, 1991, 1995), a theory-based multi-dimensional assessment framework (the "SEA" framework) which should help expand the scope of assessment in mathematics is proposed. This framework uses a context based on mathematical reasoning and has components that comprise mathematical concepts, mathematical procedures, mathematical communication, mathematical problem solving, and mathematical disposition.
Metrology for industrial quantum communications: the MIQC project
NASA Astrophysics Data System (ADS)
Rastello, M. L.; Degiovanni, I. P.; Sinclair, A. G.; Kück, S.; Chunnilall, C. J.; Porrovecchio, G.; Smid, M.; Manoocheri, F.; Ikonen, E.; Kubarsepp, T.; Stucki, D.; Hong, K. S.; Kim, S. K.; Tosi, A.; Brida, G.; Meda, A.; Piacentini, F.; Traina, P.; Natsheh, A. Al; Cheung, J. Y.; Müller, I.; Klein, R.; Vaigu, A.
2014-12-01
The ‘Metrology for Industrial Quantum Communication Technologies’ project (MIQC) is a metrology framework that fosters the development and market take-up of quantum communication technologies, aiming at maximum impact for European industry in this area. MIQC focuses on quantum key distribution (QKD) technologies, the quantum-based technology most advanced towards practical application. QKD is a way of sending cryptographic keys with absolute security. It does this by exploiting the ability to encode specific quantum states in a photon's degrees of freedom, states that are noticeably disturbed if an eavesdropper attempting to decode them is present in the communication channel. The MIQC project has started the development of independent measurement standards and definitions for the optical components of a QKD system, since one of the perceived barriers to QKD market success is the lack of standardization and quality assurance.
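The basis-disturbance principle the abstract describes is the core of protocols such as BB84. A minimal classical simulation of the sifting step (no channel noise, no eavesdropper, and none of the error-correction or privacy-amplification stages of a real QKD system) illustrates it:

```python
import random
# Minimal BB84-style sifting simulation. When sender and receiver happen to
# choose the same measurement basis, the received bit is kept for the key;
# a mismatched basis yields a random result and the bit is discarded.

def bb84_sift(n, rng):
    alice_bits = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
    bob_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bits = [b if ab == bb else rng.randint(0, 1)     # wrong basis -> random outcome
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

if __name__ == "__main__":
    ka, kb = bb84_sift(1000, random.Random(7))
    print(ka == kb, len(ka))  # keys agree; roughly half the photons survive sifting
```

In a real system an eavesdropper's measurements raise the error rate in the sifted key, and quantifying the optical components well enough to bound that error rate is exactly what the MIQC measurement standards target.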
Identifying Core Mobile Learning Faculty Competencies Based Integrated Approach: A Delphi Study
ERIC Educational Resources Information Center
Elbarbary, Rafik Said
2015-01-01
This study is based on the integrated approach as a concept framework to identify, categorize, and rank a key component of mobile learning core competencies for Egyptian faculty members in higher education. The field investigation framework used four rounds Delphi technique to determine the importance rate of each component of core competencies…
Smart City Energy Interconnection Technology Framework Preliminary Research
NASA Astrophysics Data System (ADS)
Zheng, Guotai; Zhao, Baoguo; Zhao, Xin; Li, Hao; Huo, Xianxu; Li, Wen; Xia, Yu
2018-01-01
To improve urban energy efficiency, raise the share of new and renewable energy that can be absorbed, and reduce environmental pollution, an energy supply and consumption technology framework matched to future energy constraints and the prevailing level of applied technology needs to be studied. Relative to a traditional energy supply system, an advanced, information-technology-based "Energy Internet" technical framework can bring into play the advantages of integrated energy application and load-side interactive technology, optimizing energy supply and consumption as a whole and improving the overall utilization efficiency of energy.
A holistic framework to improve the uptake and impact of eHealth technologies.
van Gemert-Pijnen, Julia E W C; Nijland, Nicol; van Limburg, Maarten; Ossebaard, Hans C; Kelders, Saskia M; Eysenbach, Gunther; Seydel, Erwin R
2011-12-05
Many eHealth technologies are not successful in realizing sustainable innovations in health care practices. One of the reasons for this is that the current development of eHealth technology often disregards the interdependencies between technology, human characteristics, and the socioeconomic environment, resulting in technology that has a low impact in health care practices. To overcome the hurdles with eHealth design and implementation, a new, holistic approach to the development of eHealth technologies is needed, one that takes into account the complexity of health care and the rituals and habits of patients and other stakeholders. The aim of this viewpoint paper is to improve the uptake and impact of eHealth technologies by advocating a holistic approach toward their development and eventual integration in the health sector. To identify the potential and limitations of current eHealth frameworks (1999-2009), we carried out a literature search in the following electronic databases: PubMed, ScienceDirect, Web of Knowledge, PiCarta, and Google Scholar. Of the 60 papers that were identified, 44 were selected for full review. We excluded those papers that did not describe hands-on guidelines or quality criteria for the design, implementation, and evaluation of eHealth technologies (28 papers). From the results retrieved, we identified 16 eHealth frameworks that matched the inclusion criteria. The outcomes were used to posit strategies and principles for a holistic approach toward the development of eHealth technologies; these principles underpin our holistic eHealth framework. A total of 16 frameworks qualified for a final analysis, based on their theoretical backgrounds and visions on eHealth, and the strategies and conditions for the research and development of eHealth technologies. 
Despite their potential, the relationship between the visions on eHealth, proposed strategies, and research methods is obscure, perhaps due to a rather conceptual approach that focuses on the rationale behind the frameworks rather than on practical guidelines. In addition, the Web 2.0 technologies that call for a more stakeholder-driven approach are beyond the scope of current frameworks. To overcome these limitations, we composed a holistic framework based on a participatory development approach, persuasive design techniques, and business modeling. To demonstrate the impact of eHealth technologies more effectively, a fresh way of thinking is required about how technology can be used to innovate health care. It also requires new concepts and instruments to develop and implement technologies in practice. The proposed framework serves as an evidence-based roadmap.
A Holistic Framework to Improve the Uptake and Impact of eHealth Technologies
van Limburg, Maarten; Ossebaard, Hans C; Kelders, Saskia M; Eysenbach, Gunther; Seydel, Erwin R
2011-01-01
Background Many eHealth technologies are not successful in realizing sustainable innovations in health care practices. One of the reasons for this is that the current development of eHealth technology often disregards the interdependencies between technology, human characteristics, and the socioeconomic environment, resulting in technology that has a low impact in health care practices. To overcome the hurdles with eHealth design and implementation, a new, holistic approach to the development of eHealth technologies is needed, one that takes into account the complexity of health care and the rituals and habits of patients and other stakeholders. Objective The aim of this viewpoint paper is to improve the uptake and impact of eHealth technologies by advocating a holistic approach toward their development and eventual integration in the health sector. Methods To identify the potential and limitations of current eHealth frameworks (1999–2009), we carried out a literature search in the following electronic databases: PubMed, ScienceDirect, Web of Knowledge, PiCarta, and Google Scholar. Of the 60 papers that were identified, 44 were selected for full review. We excluded those papers that did not describe hands-on guidelines or quality criteria for the design, implementation, and evaluation of eHealth technologies (28 papers). From the results retrieved, we identified 16 eHealth frameworks that matched the inclusion criteria. The outcomes were used to posit strategies and principles for a holistic approach toward the development of eHealth technologies; these principles underpin our holistic eHealth framework. Results A total of 16 frameworks qualified for a final analysis, based on their theoretical backgrounds and visions on eHealth, and the strategies and conditions for the research and development of eHealth technologies. 
Despite their potential, the relationship between the visions on eHealth, proposed strategies, and research methods is obscure, perhaps due to a rather conceptual approach that focuses on the rationale behind the frameworks rather than on practical guidelines. In addition, the Web 2.0 technologies that call for a more stakeholder-driven approach are beyond the scope of current frameworks. To overcome these limitations, we composed a holistic framework based on a participatory development approach, persuasive design techniques, and business modeling. Conclusions To demonstrate the impact of eHealth technologies more effectively, a fresh way of thinking is required about how technology can be used to innovate health care. It also requires new concepts and instruments to develop and implement technologies in practice. The proposed framework serves as an evidence-based roadmap. PMID:22155738
Yenkie, Kirti M.; Wu, Wenzhao; Maravelias, Christos T.
2017-05-08
Background. Bioseparations can contribute more than 70% of the total production cost of a bio-based chemical, and if the desired chemical is localized intracellularly, there can be additional challenges associated with its recovery. Based on the properties of the desired chemical and other components in the stream, there can be multiple feasible options for product recovery. These options are composed of several alternative technologies performing similar tasks. The suitability of a technology for a particular chemical depends on (1) its performance parameters, such as separation efficiency; (2) the cost or amount of added separating agent; (3) the properties of the bioreactor effluent (e.g., biomass titer, product content); and (4) final product specifications. Our goal is to first synthesize alternative separation options and then analyze how technology selection affects the overall process economics. To achieve this, we propose an optimization-based framework that helps in identifying the critical technologies and parameters. Results. We study the separation networks for two representative classes of chemicals based on their properties. The separation network is divided into three stages: cell and product isolation (stage I), product concentration (stage II), and product purification and refining (stage III). Each stage exploits differences in specific product properties to achieve the desired product quality. The cost contribution analysis for the two cases (intracellular insoluble and intracellular soluble) reveals that stage I is the key cost contributor (>70% of the overall cost). Further analysis suggests that changes in input conditions and technology performance parameters lead to new designs, primarily in stage I. Conclusions. 
The proposed framework provides significant insights for technology selection and assists in making informed decisions regarding technologies that should be used in combination for a given set of stream/product properties and final output specifications. Additionally, the parametric sensitivity provides an opportunity to make crucial design and selection decisions in a comprehensive and rational manner. This will prove valuable in the selection of chemicals to be produced using bioconversions (bioproducts) as well as in creating better bioseparation flow sheets for detailed economic assessment and process implementation on the commercial scale.
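The stage-wise selection idea can be sketched with a toy cost-minimization: pick one technology per stage so that the train meets an overall recovery target at minimum cost. The technologies, costs, and efficiencies below are invented for illustration only; the paper's actual framework is a full superstructure optimization, not this brute-force enumeration.

```python
from itertools import product

# (name, cost contribution in $/kg product, recovery efficiency) -- invented.
stages = {
    "I: cell/product isolation":  [("centrifugation",   2.00, 0.90),
                                   ("filtration",       1.60, 0.85)],
    "II: concentration":          [("extraction",       0.15, 0.95),
                                   ("membrane",         0.20, 0.97)],
    "III: purification/refining": [("chromatography",   0.25, 0.99),
                                   ("crystallization",  0.10, 0.92)],
}

def best_flowsheet(min_overall_recovery=0.75):
    """Enumerate one technology per stage; return the cheapest feasible train."""
    best = None
    for train in product(*stages.values()):
        cost = sum(c for _, c, _ in train)
        recovery = 1.0
        for _, _, r in train:
            recovery *= r
        if recovery >= min_overall_recovery and (best is None or cost < best[0]):
            best = (cost, recovery, [name for name, _, _ in train])
    return best

cost, recovery, train = best_flowsheet()
print(train, round(cost, 2), round(recovery, 3))
```

With these made-up numbers, stage I contributes over 70% of the selected train's cost, mirroring the abstract's cost-contribution finding, and tightening the recovery target changes the selected technologies.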
Multiple-component covalent organic frameworks
Huang, Ning; Zhai, Lipeng; Coupry, Damien E.; Addicoat, Matthew A.; Okushita, Keiko; Nishimura, Katsuyuki; Heine, Thomas; Jiang, Donglin
2016-01-01
Covalent organic frameworks are a class of crystalline porous polymers that integrate molecular building blocks into periodic structures and are usually synthesized using two-component [1+1] condensation systems comprised of one knot and one linker. Here we report a general strategy based on multiple-component [1+2] and [1+3] condensation systems that enable the use of one knot and two or three linker units for the synthesis of hexagonal and tetragonal multiple-component covalent organic frameworks. Unlike two-component systems, multiple-component covalent organic frameworks feature asymmetric tiling of organic units into anisotropic skeletons and unusually shaped pores. This strategy not only expands the structural complexity of skeletons and pores but also greatly enhances their structural diversity. This synthetic platform is also widely applicable to multiple-component electron donor–acceptor systems, which lead to electronic properties that are not simply linear summations of those of the conventional [1+1] counterparts. PMID:27460607
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coble, Jamie B.; Coles, Garill A.; Ramuhalli, Pradeep
Advanced small modular reactors (aSMRs) can provide the United States with a safe, sustainable, and carbon-neutral energy source. The controllable day-to-day costs of aSMRs are expected to be dominated by operation and maintenance costs. Health and condition assessment coupled with online risk monitors can potentially enhance the affordability of aSMRs through optimized operational planning and maintenance scheduling. Currently deployed risk monitors are an extension of probabilistic risk assessment (PRA). For complex engineered systems like nuclear power plants, PRA systematically combines event likelihoods and the probability of failure (POF) of key components with the magnitude of possible adverse consequences to determine risk. Traditional PRA uses population-based POF information to estimate the average plant risk over time. Currently, most nuclear power plants have a PRA that reflects the as-operated, as-modified plant; this model is updated periodically, typically once a year. Risk monitors expand on living PRA by incorporating changes in day-by-day plant operation and configuration (e.g., changes in equipment availability, operating regime, environmental conditions). However, population-based POF (or population- and time-based POF) is still used to populate fault trees. Health monitoring techniques can be used to establish condition indicators and monitoring capabilities that indicate the component-specific POF at a desired point in time (or over a desired period), which can then be incorporated in the risk monitor to provide a more accurate estimate of the plant risk in different configurations. This is particularly important for active systems, structures, and components (SSCs) proposed for use in aSMR designs. These SSCs may differ significantly from those used in the operating fleet of light-water reactors (or even in LWR-based SMR designs). 
Additionally, the operating characteristics of aSMRs can present significantly different requirements, including the need to operate in different coolant environments, higher operating temperatures, and longer operating cycles between planned refueling and maintenance outages. These features, along with the relative lack of operating experience for some of the proposed advanced designs, may limit the ability to estimate event probability and component POF with a high degree of certainty. Incorporating real-time estimates of component POF may compensate for a relative lack of established knowledge about long-term component behavior and improve operational and maintenance planning and optimization. The particular eccentricities of advanced reactors and small modular reactors present unique challenges and needs for advanced instrumentation, control, and human-machine interface (ICHMI) techniques such as enhanced risk monitors (ERM) in aSMRs. Several features of aSMR designs increase the need for accurate characterization of the real-time risk during operation and maintenance activities. A number of technical gaps in realizing ERM exist, and these gaps are largely independent of the specific reactor technology. As a result, the development of a framework for ERM would enable greater situational awareness regardless of the specific class of reactor technology. A set of research tasks is identified in a preliminary research plan to enable the development, testing, and demonstration of such a framework. Although some aspects of aSMRs, such as specific operational characteristics, will vary and are not yet completely defined, the proposed framework is expected to be relevant regardless of such uncertainty. The development of an ERM framework will provide one of the key technical developments necessary to ensure the economic viability of aSMRs.
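The core idea of replacing population-based POF with condition-informed POF in a fault tree can be illustrated numerically. This is our own minimal sketch, not the report's model; the fault-tree structure and all probabilities are invented.

```python
# A risk monitor recomputes the top-event probability when population-average
# component probabilities of failure (POF) are replaced by condition-informed
# estimates from health monitoring.

def top_event_pof(p_a, p_b, p_c):
    """Fault tree: top event = (A AND B) OR C, with independent basic events."""
    p_and = p_a * p_b
    return p_and + p_c - p_and * p_c  # OR of two independent events

# Population-based POFs from generic reliability data (invented values).
baseline = top_event_pof(0.02, 0.03, 0.001)

# Condition monitoring indicates component A is degraded.
monitored = top_event_pof(0.10, 0.02, 0.001)

print(baseline, monitored)  # the monitored configuration carries higher risk
```

Multiplying either top-event probability by a consequence magnitude yields the risk estimate; the monitored value reflects the plant's actual configuration rather than a fleet average.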
NASA Astrophysics Data System (ADS)
Koo, J.; Wood, S.; Cenacchi, N.; Fisher, M.; Cox, C.
2012-12-01
HarvestChoice (harvestchoice.org) generates knowledge products to guide strategic investments to improve the productivity and profitability of smallholder farming systems in sub-Saharan Africa (SSA). A key component of the HarvestChoice analytical framework is a grid-based overlay of SSA: a cropping simulation platform powered by process-based crop models. Calibrated around the best available representation of cropping production systems in SSA, the simulation platform couples the DSSAT Crop Systems Model with the CENTURY Soil Organic Matter model (DSSAT-CENTURY) and provides a virtual experimentation module with which to explore the impact of a range of technological, managerial, and environmental metrics on future crop productivity and profitability, as well as input use. Each 5 (or 30) arc-minute grid cell in SSA is underlain by a stack of model inputs: datasets covering soil properties and fertility, historic and future climate scenarios, and farmers' management practices, all compiled from analyses of existing global and regional databases and consultations with other CGIAR centers. Running a simulation model is not always straightforward, especially when certain cropping systems or management practices are not yet practiced by resource-poor farmers (e.g., precision agriculture) or were never included in the existing simulation framework (e.g., water harvesting). In such cases, we used DSSAT-CENTURY as a function to iteratively estimate the relative responses of cropping systems to technology-driven changes in water and nutrient balances compared to zero-adoption by farmers, while adjusting model input parameters to best mimic farmers' implementation of technologies in the field. We then fed the results of the simulation into the economic and food trade model framework, IMPACT, to assess the potential implications for future food security. 
The outputs of the overall simulation analyses are packaged as a web-accessible database, published with an interface that allows users to explore the simulation results for each country under user-defined baseline and what-if scenarios. The results are dynamically presented on maps, charts, and tables. This paper discusses the development of the simulation platform and its underlying data layers, a case study that assessed the role of potential crop management technology development, and the development of a web-based application that visualizes the simulation results.
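The "relative response versus zero-adoption" computation described above can be sketched as follows. The toy yield surface stands in for the real DSSAT-CENTURY call, and the grid cells, technology parameters, and function names are all our own invented placeholders.

```python
# Per grid cell, run the crop model with and without a technology and report
# the yield response relative to the zero-adoption baseline.

def crop_model(water_mm, n_kg_ha):
    """Toy saturating yield surface (t/ha); placeholder for DSSAT-CENTURY."""
    return 6.0 * (water_mm / (water_mm + 300)) * (n_kg_ha / (n_kg_ha + 60))

def relative_response(cell, technology):
    base = crop_model(cell["water"], cell["n"])
    adopted = crop_model(cell["water"] + technology.get("extra_water", 0),
                         cell["n"] + technology.get("extra_n", 0))
    return adopted / base

grid = [{"water": 250, "n": 40},   # a dry cell
        {"water": 500, "n": 80}]   # a wetter, better-fertilized cell
water_harvesting = {"extra_water": 100}  # e.g., runoff captured per season
ratios = [relative_response(c, water_harvesting) for c in grid]
print([round(r, 2) for r in ratios])  # -> [1.18, 1.07]
```

As one would expect, the drier cell shows the larger relative response, which is the kind of spatially varying signal fed into the IMPACT economic model.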
Transformative Approaches and Technologies for Water Systems
This project will advance the transformation of water systems towards a more sustainable future. It will provide EPA with a sustainability assessment framework integrating drinking water, wastewater, and water reuse/resource recovery components, advances in real-time monitoring, ...
Consumer acceptance of technology-based food innovations: lessons for the future of nutrigenomics.
Ronteltap, A; van Trijp, J C M; Renes, R J; Frewer, L J
2007-07-01
Determinants of consumer adoption of innovations have been studied from different angles and from the perspectives of various disciplines. In the food area, the literature is dominated by a focus on consumer concern. This paper reviews previous research into acceptance of technology-based innovation from both inside and outside the food domain, extracts key learnings from this literature and integrates them into a new conceptual framework for consumer acceptance of technology-based food innovations. The framework distinguishes 'distal' and 'proximal' determinants of acceptance. Distal factors (characteristics of the innovation, the consumer and the social system) influence consumers' intention to accept an innovation through proximal factors (perceived cost/benefit considerations, perceptions of risk and uncertainty, social norm and perceived behavioural control). The framework's application as a tool to anticipate consumer reaction to future innovations is illustrated for an actual technology-based innovation in food science, nutrigenomics (the interaction between nutrition and human genetics).
A Framework for Prognostics Reasoning
2002-12-01
Center and School, Aberdeen Proving Ground, Maryland. Presented at the Advanced Information Systems and Technology Conference, 28-30 March 1994. 44...stresses cannot be duplicated on the ground. The communication buses and permanent wiring on an aircraft are not tested at present. These components...functional aircraft components. Lastly, since CND results indicate an inability to duplicate on the ground a fault detected during flight, many
Intelligent Control in Automation Based on Wireless Traffic Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurt Derr; Milos Manic
2007-09-01
Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies pose more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both in maintaining the integrity of computer systems and in increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still-new territory of Bluetooth traffic. Further enhancements of this algorithm are presented, along with a comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for the future development and use of this prevailing technology in various control-type applications, as well as making its use more secure.
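The fuzzy-inference side of such an analysis can be sketched with hand-written membership functions over a single traffic feature. This is a simplification of our own: the paper's neuro-fuzzy model learns its memberships and rules from real Bluetooth traces, whereas everything below is invented.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def anomaly_score(packets_per_sec):
    """Score traffic intensity against three fuzzy sets (invented rule base)."""
    low    = tri(packets_per_sec, -1, 0, 50)
    normal = tri(packets_per_sec, 20, 80, 140)
    high   = tri(packets_per_sec, 100, 200, 10_000)
    # Rules: high traffic is suspicious; idle links mildly so; normal is benign.
    return max(high, 0.3 * low) * (1 - normal)

print(round(anomaly_score(80), 2), round(anomaly_score(250), 2))  # -> 0.0 0.99
```

A typical traffic rate scores zero while a burst scores near one, giving the kind of graded, computationally inexpensive classification the abstract describes.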
Distributed Scaffolding: Synergy in Technology-Enhanced Learning Environments
ERIC Educational Resources Information Center
Ustunel, Hale H.; Tokel, Saniye Tugba
2018-01-01
When technology is employed challenges increase in learning environments. Kim et al. ("Sci Educ" 91(6):1010-1030, 2007) presented a pedagogical framework that provides a valid technology-enhanced learning environment. The purpose of the present design-based study was to investigate the micro context dimension of this framework and to…
Developing Hydrogeological Site Characterization Strategies based on Human Health Risk
NASA Astrophysics Data System (ADS)
de Barros, F.; Rubin, Y.; Maxwell, R. M.
2013-12-01
In order to provide better sustainable groundwater quality management and minimize the impact of contamination on humans, improved understanding and quantification of the interaction between hydrogeological models, geological site information, and human health are needed. Considering the joint influence of these components on the overall human health risk assessment, and the corresponding sources of uncertainty, aids decision makers in better allocating resources in data acquisition campaigns. This is important to (1) achieve remediation goals in a cost-effective manner, (2) protect human health, and (3) keep water supplies clean in order to comply with quality standards. Such a task is challenging since a full characterization of the subsurface is unfeasible due to financial and technological constraints. In addition, human exposure and physiological response to contamination are subject to uncertainty and variability. Normally, sampling strategies are developed with the goal of reducing uncertainty, but less often are they developed in the context of their impacts on the overall system uncertainty. Therefore, quantifying the impact of each of these components (hydrogeological, behavioral, and physiological) on the final human health risk prediction can provide guidance for decision makers to best allocate resources towards minimal prediction uncertainty. In this presentation, a multi-component human health risk-based framework is presented which allows decision makers to set priorities through an information entropy-based visualization tool. Results highlight the role of the characteristic length scales of flow and transport in determining data needs within an integrated hydrogeological-health framework. Conditions where uncertainty reduction in human health risk predictions may benefit from better understanding of the health component, as opposed to a more detailed hydrogeological characterization, are also discussed. 
Finally, results illustrate how different dose-response models can impact the probability of human health risk exceeding a regulatory threshold.
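The propagation of uncertainty from the three components into a risk-exceedance probability can be sketched with a small Monte Carlo simulation. All distributions, parameter values, and the linear dose-response form below are our own illustrative assumptions, not those of the presentation.

```python
import random

# Uncertainty in the hydrogeological, behavioral, and physiological components
# propagates into the probability that risk exceeds a regulatory threshold.
random.seed(1)

def sample_risk():
    conc   = random.lognormvariate(0, 0.8)    # mg/L at the well (hydrogeology)
    intake = random.uniform(1.0, 3.0)         # L/day ingested (behavior)
    slope  = random.lognormvariate(-7, 0.5)   # dose-response potency (physiology)
    return conc * intake * slope / 70.0       # normalize by 70 kg body weight

threshold = 1e-5
n = 100_000
exceed = sum(sample_risk() > threshold for _ in range(n)) / n
print(exceed)
```

Re-running with, say, a tighter concentration distribution versus a tighter dose-response distribution shows which component's characterization most reduces the exceedance uncertainty, which is the resource-allocation question the framework addresses.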
A generic bio-economic farm model for environmental and economic assessment of agricultural systems.
Janssen, Sander; Louhichi, Kamel; Kanellopoulos, Argyris; Zander, Peter; Flichman, Guillermo; Hengsdijk, Huib; Meuter, Eelco; Andersen, Erling; Belhouchette, Hatem; Blanco, Maria; Borkowski, Nina; Heckelei, Thomas; Hecker, Martin; Li, Hongtao; Oude Lansink, Alfons; Stokstad, Grete; Thorne, Peter; van Keulen, Herman; van Ittersum, Martin K
2010-12-01
Bio-economic farm models (BEFMs) are tools to evaluate ex-post or to assess ex-ante the impact of policy and technology change on agriculture, economics, and the environment. Recently, various BEFMs have been developed, often for one purpose or location, but hardly any of these models are re-used later for other purposes or locations. The Farm System Simulator (FSSIM) provides a generic framework enabling the application of BEFMs under various situations and for different purposes (generating supply response functions and detailed regional or farm type assessments). FSSIM is set up as a component-based framework with components representing farmer objectives, risk, calibration, policies, current activities, alternative activities and different types of activities (e.g., annual and perennial cropping and livestock). The generic nature of FSSIM is evaluated using five criteria by examining its applications. FSSIM has been applied for different climate zones and soil types (criterion 1) and to a range of different farm types (criterion 2) with different specializations, intensities and sizes. In most applications FSSIM has been used to assess the effects of policy changes and in two applications to assess the impact of technological innovations (criterion 3). In the various applications, different data sources, level of detail (e.g., criterion 4) and model configurations have been used. FSSIM has been linked to an economic and several biophysical models (criterion 5). The model is available for applications to other conditions and research issues, and it is open to be further tested and to be extended with new components, indicators or linkages to other models.
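The component-based design described for FSSIM can be sketched structurally: interchangeable components share a small interface and are composed into one evaluation, so a policy or objective component can be swapped without touching the rest. The classes, interface, and numbers below are our own illustrative construction, not FSSIM's actual architecture.

```python
class Component:
    """Common interface shared by all farm-model components."""
    def contribute(self, farm_plan):
        raise NotImplementedError

class GrossMargin(Component):          # farmer objective
    def contribute(self, farm_plan):
        return sum(a["margin"] * a["ha"] for a in farm_plan)

class QuotaPolicy(Component):          # policy instrument: penalize overshoot
    def __init__(self, max_ha):
        self.max_ha = max_ha
    def contribute(self, farm_plan):
        total = sum(a["ha"] for a in farm_plan)
        return -1000.0 * max(0.0, total - self.max_ha)

def evaluate(farm_plan, components):
    """Compose any set of components into one farm-level evaluation."""
    return sum(c.contribute(farm_plan) for c in components)

plan = [{"crop": "wheat", "ha": 30, "margin": 400.0},
        {"crop": "maize", "ha": 25, "margin": 550.0}]
print(evaluate(plan, [GrossMargin(), QuotaPolicy(max_ha=50)]))  # -> 20750.0
```

Swapping `QuotaPolicy` for a different policy component, or adding a risk component, changes the assessment without modifying `evaluate` or the other components, which is the re-use property the abstract emphasizes.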
Feng, Qianmei
2007-10-01
Federal law mandates that every checked bag at all commercial airports be screened by explosive detection systems (EDS), explosive trace detection systems (ETD), or alternative technologies. These technologies serve as critical components of airport security systems that strive to reduce security risks at both national and global levels. To improve the operational efficiency and airport security, emerging image-based technologies have been developed, such as dual-energy X-ray (DX), backscatter X-ray (BX), and multiview tomography (MVT). These technologies differ widely in purchasing cost, maintenance cost, operating cost, processing rate, and accuracy. Based on a mathematical framework that takes into account all these factors, this article investigates two critical issues for operating screening devices: setting specifications for continuous security responses by different technologies; and selecting technology or combination of technologies for efficient 100% baggage screening. For continuous security responses, specifications or thresholds are used for classifying threat items from nonthreat items. By investigating the setting of specifications on system security responses, this article assesses the risk and cost effectiveness of various technologies for both single-device and two-device systems. The findings provide the best selection of image-based technologies for both single-device and two-device systems. Our study suggests that two-device systems outperform single-device systems in terms of both cost effectiveness and accuracy. The model can be readily extended to evaluate risk and cost effectiveness of multiple-device systems for airport checked-baggage security screening.
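The single-device versus two-device comparison can be illustrated with simple screening arithmetic. All rates and costs below are invented, and the serial protocol and independence assumption are our simplifications of the article's mathematical framework.

```python
# In a serial two-device protocol, bags alarmed by the primary device are
# re-examined by a second device, so the system only rejects a bag when both
# devices alarm. Devices are assumed to respond independently.

def system_rates(d1, f1, d2, f2):
    """Detection and false-alarm rates of the serial two-device protocol."""
    return d1 * d2, f1 * f2

def expected_cost(det, fa, threat_rate=1e-6,
                  miss_cost=1e8, false_alarm_cost=50.0):
    """Expected cost per screened bag: missed threats plus alarm resolution."""
    return (threat_rate * (1 - det) * miss_cost
            + (1 - threat_rate) * fa * false_alarm_cost)

single = expected_cost(0.90, 0.20)                 # one device alone
det, fa = system_rates(0.90, 0.20, 0.98, 0.05)     # add a second device
serial = expected_cost(det, fa)
print(round(single, 2), round(serial, 2))          # serial system is cheaper
```

With these numbers the second device cuts the false-alarm rate far more than it erodes detection, so the two-device system has the lower expected cost, consistent with the abstract's finding that two-device systems outperform single-device systems.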
ERIC Educational Resources Information Center
Celik, Ismail; Sahin, Ismail; Akturk, Ahmet Oguz
2014-01-01
In the current study, the model of technological pedagogical and content knowledge (TPACK) is used as the theoretical framework in the process of data collection and interpretation of the results. This study analyzes the perceptions of 744 undergraduate students regarding their TPACK levels measured by responses to a survey developed by Sahin…
An eHealth Capabilities Framework for Graduates and Health Professionals: Mixed-Methods Study.
Brunner, Melissa; McGregor, Deborah; Keep, Melanie; Janssen, Anna; Spallek, Heiko; Quinn, Deleana; Jones, Aaron; Tseris, Emma; Yeung, Wilson; Togher, Leanne; Solman, Annette; Shaw, Tim
2018-05-15
The demand for an eHealth-ready and adaptable workforce is placing increasing pressure on universities to deliver eHealth education. At present, eHealth education is largely focused on components of eHealth rather than considering a curriculum-wide approach. This study aimed to develop a framework that could be used to guide health curriculum design based on current evidence and stakeholder perceptions of the eHealth capabilities expected of tertiary health graduates. A 3-phase, mixed-methods approach incorporated the results of a literature review, focus groups, and a Delphi process to develop a framework of eHealth capability statements. Participants (N=39) with expertise or experience in eHealth education, practice, or policy provided feedback on the proposed framework, and following the fourth iteration of this process, consensus was achieved. The final framework consisted of 4 higher-level capability statements that describe the learning outcomes expected of university graduates across the domains of (1) digital health technologies, systems, and policies; (2) clinical practice; (3) data analysis and knowledge creation; and (4) technology implementation and codesign. Across the capability statements are 40 performance cues that provide examples of how these capabilities might be demonstrated. The results of this study inform a cross-faculty eHealth curriculum that aligns with workforce expectations. There is a need for educational curricula to reinforce existing eHealth capabilities, adapt existing capabilities to make them transferable to novel eHealth contexts, and introduce new learning opportunities for interactions with technologies within education and practice encounters. As such, the capability framework developed may assist in the application of eHealth by emerging and existing health care professionals. Future research needs to explore the potential for integration of findings into workforce development programs. 
©Melissa Brunner, Deborah McGregor, Melanie Keep, Anna Janssen, Heiko Spallek, Deleana Quinn, Aaron Jones, Emma Tseris, Wilson Yeung, Leanne Togher, Annette Solman, Tim Shaw. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.05.2018.
Informatics for patient safety: a nursing research perspective.
Bakken, Suzanne
2006-01-01
In Crossing the Quality Chasm, the Institute of Medicine (IOM) Committee on Quality of Health Care in America identified the critical role of information technology in designing a health system that produces care that is "safe, effective, patient-centered, timely, efficient, and equitable" (Committee on Quality of Health Care in America, 2001, p. 164). A subsequent IOM report contends that improved information systems are essential to a new health care delivery system that "both prevents errors and learns from them when they occur" (Committee on Data Standards for Patient Safety, 2004, p. 1). This review specifically highlights the role of informatics processes and information technology in promoting patient safety and summarizes relevant nursing research. First, the components of an informatics infrastructure for patient safety are described within the context of the national framework for delivering consumer-centric and information-rich health care and using the National Health Information Infrastructure (NHII) (Thompson & Brailer, 2004). Second, relevant nursing research is summarized; this includes research studies that contributed to the development of selected infrastructure components as well as studies specifically focused on patient safety. Third, knowledge gaps and opportunities for nursing research are identified for each main topic. The health information technologies deployed as part of the national framework must support nursing practice in a manner that enables prevention of medical errors and promotion of patient safety and contributes to the development of practice-based nursing knowledge as well as best practices for patient safety. The seminal work that has been completed to date is necessary, but not sufficient, to achieve this objective.
Classification Framework for ICT-Based Learning Technologies for Disabled People
ERIC Educational Resources Information Center
Hersh, Marion
2017-01-01
The paper presents the first systematic approach to the classification of inclusive information and communication technologies (ICT)-based learning technologies and ICT-based learning technologies for disabled people which covers both assistive and general learning technologies, is valid for all disabled people and considers the full range of…
Semantic Integration for Marine Science Interoperability Using Web Technologies
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.
2008-12-01
The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented by the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools for performing important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Sophisticated search functions, for example according to metadata items and vocabulary terms, are also provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies.
Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes about each mapping, like comments and a confidence level, can also be included. VINE also supports registering and storing resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
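The VINE-style term mappings described above can be pictured as RDF triples queried with a SPARQL-like pattern. The following is a minimal, self-contained sketch using an in-memory triple list; the vocabulary prefixes and term names are invented for illustration and are not entries in the MMI registry:

```python
# Minimal sketch of VINE-style vocabulary mapping as RDF-like triples.
# Prefixes and term names are illustrative, not from the MMI registry.

SKOS = "skos:"  # stand-in prefix for the SKOS mapping vocabulary

# (subject, predicate, object) triples mapping terms across two vocabularies
triples = [
    ("vocabA:sea_surface_temp", SKOS + "exactMatch", "vocabB:SST"),
    ("vocabA:salinity", SKOS + "narrower", "vocabB:practical_salinity"),
    ("vocabA:sea_surface_temp", "rdf:type", "owl:Class"),
]

def query(triples, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard,
    similar in spirit to a single SPARQL basic graph pattern."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Find every exact-match mapping registered across the two vocabularies
matches = query(triples, p=SKOS + "exactMatch")
print(matches)  # → [('vocabA:sea_surface_temp', 'skos:exactMatch', 'vocabB:SST')]
```

A production registry would store such triples in a real RDF store and accept genuine SPARQL, but the pattern-with-wildcards idea is the same.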
Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram
2016-01-01
The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices, or tablets that is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam email, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware, and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adaptation. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing artificial neural networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised, which will become a benchmark for future studies. PMID:26978523
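The classifier the abstract singles out, simple logistic regression over behavioral features, can be sketched in a few lines. This is an illustrative toy, not SMARTbot's code: the two features (outbound-connection rate, command-polling frequency) and the sample values are invented:

```python
# Toy logistic-regression botnet classifier (illustrative, not SMARTbot).
# Each app is a vector of two hypothetical behavioral features.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.3, epochs=500):
    """Plain stochastic gradient descent on the logistic loss."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            g = p - yi  # gradient of the log-loss w.r.t. the logit
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, xi):
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) >= 0.5 else 0

# Made-up feature vectors: [connection rate, polling frequency]
benign = [[0.1, 0.05], [0.2, 0.1], [0.15, 0.0]]
botnet = [[0.9, 0.8], [0.85, 0.95], [0.7, 0.9]]
X, y = benign + botnet, [0, 0, 0, 1, 1, 1]
w, b = train(X, y)
print([predict(w, b, xi) for xi in X])  # separable data → [0, 0, 0, 1, 1, 1]
```

In practice one would use a library implementation with regularization and proper train/test splits; the point here is only the shape of the model the paper found most accurate.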
KODAMA and VPC based Framework for Ubiquitous Systems and its Experiment
NASA Astrophysics Data System (ADS)
Takahashi, Kenichi; Amamiya, Satoshi; Iwao, Tadashige; Zhong, Guoqiang; Kainuma, Tatsuya; Amamiya, Makoto
Recently, agent technologies have attracted a lot of interest as an emerging programming paradigm. With such agent technologies, services are provided through collaboration among agents. At the same time, the spread of mobile technologies and communication infrastructures has made it possible to access the network anytime and from anywhere. Using agents and mobile technologies to realize ubiquitous computing systems, we propose a new framework based on KODAMA and VPC. KODAMA provides distributed management mechanisms by using the concept of community and communication infrastructure to deliver messages among agents without agents being aware of the physical network. VPC provides a method of defining peer-to-peer services based on agent communication with policy packages. By merging the characteristics of both KODAMA and VPC functions, we propose a new framework for ubiquitous computing environments. It provides distributed management functions according to the concept of agent communities, agent communications which are abstracted from the physical environment, and agent collaboration with policy packages. Using our new framework, we conducted a large-scale experiment in shopping malls in Nagoya, which sent advertisement e-mails to users' cellular phones according to user location and attributes. The empirical results showed that our new framework worked effectively for sales in shopping malls.
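The community abstraction described above, agents addressing each other by name while the community resolves delivery, can be sketched as follows. The class and agent names are invented for illustration and do not come from the KODAMA/VPC implementation:

```python
# Minimal sketch (names illustrative, not KODAMA source) of community-based
# message delivery: agents register with a community and address each other
# by name, never by physical network location.

class Community:
    def __init__(self, name):
        self.name = name
        self._agents = {}

    def join(self, agent):
        self._agents[agent.name] = agent
        agent.community = self

    def deliver(self, sender, recipient, payload):
        # Routing is resolved inside the community, so agents stay
        # unaware of hosts, ports, or transport details.
        self._agents[recipient].receive(sender, payload)

class Agent:
    def __init__(self, name):
        self.name, self.community, self.inbox = name, None, []

    def send(self, recipient, payload):
        self.community.deliver(self.name, recipient, payload)

    def receive(self, sender, payload):
        self.inbox.append((sender, payload))

mall = Community("shopping-mall")
shop, phone = Agent("shop-A"), Agent("user-42")
mall.join(shop)
mall.join(phone)
shop.send("user-42", "advert: 20% off near your location")
print(phone.inbox)  # → [('shop-A', 'advert: 20% off near your location')]
```

A real deployment would back `deliver` with the KODAMA communication infrastructure and filter messages through VPC policy packages; the sketch only shows the location-transparent addressing idea.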
Developing a Framework for Social Technologies in Learning via Design-Based Research
ERIC Educational Resources Information Center
Parmaxi, Antigoni; Zaphiris, Panayiotis
2015-01-01
This paper reports on the use of design-based research (DBR) for the development of a framework that grounds the use of social technologies in learning. The paper focuses on three studies which step on the learning theory of constructionism. Constructionism assumes that knowledge is better gained when students find this knowledge for themselves…
ERIC Educational Resources Information Center
Fisher, Tony; Denning, Tim; Higgins, Chris; Loveless, Avril
2012-01-01
This article describes a project to apply and validate a conceptual framework of clusters of purposeful learning activity involving ICT tools. The framework, which is based in a socio-cultural perspective, is described as "DECK", and comprises the following major categories of the use of digital technologies to support learning:…
Children Learning to Use Technologies through Play: A Digital Play Framework
ERIC Educational Resources Information Center
Bird, Jo; Edwards, Susan
2015-01-01
Digital technologies are increasingly acknowledged as an important aspect of early childhood education. A significant problem for early childhood education has been how to understand the pedagogical use of technologies in a sector that values play-based learning. This paper presents a new framework to understand how children learn to use…
Putting Public Health Ethics into Practice: A Systematic Framework
Marckmann, Georg; Schmidt, Harald; Sofaer, Neema; Strech, Daniel
2015-01-01
It is widely acknowledged that public health practice raises ethical issues that require a different approach than traditional biomedical ethics. Several frameworks for public health ethics (PHE) have been proposed; however, none of them provides a practice-oriented combination of the two necessary components: (1) a set of normative criteria based on an explicit ethical justification and (2) a structured methodological approach for applying the resulting normative criteria to concrete public health (PH) issues. Building on prior work in the field and integrating valuable elements of other approaches to PHE, we present a systematic ethical framework that shall guide professionals in planning, conducting, and evaluating PH interventions. Based on a coherentist model of ethical justification, the proposed framework contains (1) an explicit normative foundation with five substantive criteria and seven procedural conditions to guarantee a fair decision process, and (2) a six-step methodological approach for applying the criteria and conditions to the practice of PH and health policy. The framework explicitly ties together ethical analysis and empirical evidence, thus striving for evidence-based PHE. It can provide normative guidance to those who analyze the ethical implications of PH practice including academic ethicists, health policy makers, health technology assessment bodies, and PH professionals. It will enable those who implement a PH intervention and those affected by it (i.e., the target population) to critically assess whether and how the required ethical considerations have been taken into account. Thereby, the framework can contribute to assuring the quality of ethical analysis in PH. Whether the presented framework will be able to achieve its goals has to be determined by evaluating its practical application. PMID:25705615
Supporting Collective Inquiry: A Technology Framework for Distributed Learning
NASA Astrophysics Data System (ADS)
Tissenbaum, Michael
This design-based study describes the implementation and evaluation of a technology framework to support smart classrooms and Distributed Technology Enhanced Learning (DTEL) called SAIL Smart Space (S3). S3 is an open-source technology framework designed to support students engaged in inquiry investigations as a knowledge community. To evaluate the effectiveness of S3 as a generalizable technology framework, a curriculum named PLACE (Physics Learning Across Contexts and Environments) was developed to support two grade-11 physics classes (n = 22; n = 23) engaged in a multi-context inquiry curriculum based on the Knowledge Community and Inquiry (KCI) pedagogical model. This dissertation outlines three initial design studies that established a set of design principles for DTEL curricula and related technology infrastructures. These principles guided the development of PLACE, a twelve-week inquiry curriculum in which students drew upon their community-generated knowledge base as a source of evidence for solving ill-structured physics problems based on the physics of Hollywood movies. During the culminating smart classroom activity, the S3 framework played a central role in orchestrating student activities, including managing the flow of materials and students using real-time data mining and intelligent agents that responded to emergent class patterns. S3 supported students' construction of knowledge through the use of individual, collective, and collaborative scripts and technologies, including tablets and interactive large-format displays. Aggregate and real-time ambient visualizations helped the teacher act as a wondering facilitator, supporting students in their inquiry where needed. A teacher orchestration tablet gave the teacher some control over the flow of the scripted activities, and alerted him to critical moments for intervention.
Analysis focuses on S3's effectiveness in supporting students' inquiry across multiple learning contexts and scales of time, and in making timely and effective use of the community's knowledge base, towards producing solutions to sophisticated, ill-defined problems in the domain of physics. Video analysis examined whether S3 supported teacher orchestration, freeing him to focus less on classroom management and more on students' inquiry. Three important outcomes of this research are a set of design principles for DTEL environments, a specific technology infrastructure (S3), and a DTEL research framework.
Acoustic Sensing and Ultrasonic Drug Delivery in Multimodal Theranostic Capsule Endoscopy
Stewart, Fraser R.; Qiu, Yongqiang; Newton, Ian P.; Cox, Benjamin F.; Al-Rawhani, Mohammed A.; Beeley, James; Liu, Yangminghao; Huang, Zhihong; Cumming, David R. S.; Näthke, Inke
2017-01-01
Video capsule endoscopy (VCE) is now a clinically accepted diagnostic modality in which miniaturized technology, an on-board power supply and wireless telemetry stand as technological foundations for other capsule endoscopy (CE) devices. However, VCE does not provide therapeutic functionality, and research towards therapeutic CE (TCE) has been limited. In this paper, a route towards viable TCE is proposed, based on multiple CE devices including important acoustic sensing and drug delivery components. In this approach, an initial multimodal diagnostic device with high-frequency quantitative microultrasound that complements video imaging allows surface and subsurface visualization and computer-assisted diagnosis. Using focused ultrasound (US) to mark sites of pathology with exogenous fluorescent agents permits follow-up with another device to provide therapy. This is based on an US-mediated targeted drug delivery system with fluorescence imaging guidance. An additional device may then be utilized for treatment verification and monitoring, exploiting the minimally invasive nature of CE. While such a theranostic patient pathway for gastrointestinal treatment is presently incomplete, the description in this paper of previous research and work under way to realize further components for the proposed pathway suggests it is feasible and provides a framework around which to structure further work. PMID:28671642
LDRD project final report : hybrid AI/cognitive tactical behavior framework for LVC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Djordjevich, Donna D.; Xavier, Patrick Gordon; Brannon, Nathan Gregory
This Lab-Directed Research and Development (LDRD) sought to develop technology that enhances scenario construction speed, entity behavior robustness, and scalability in Live-Virtual-Constructive (LVC) simulation. We investigated issues in both simulation architecture and behavior modeling. We developed path-planning technology that improves the ability to express intent in the planning task while still permitting an efficient search algorithm. An LVC simulation demonstrated how this enables 'one-click' layout of squad tactical paths, as well as dynamic re-planning for simulated squads and for real and simulated mobile robots. We identified human response latencies that can be exploited in parallel/distributed architectures. We did an experimental study to determine where parallelization would be productive in Umbra-based force-on-force (FOF) simulations. We developed and implemented a data-driven simulation composition approach that solves entity class hierarchy issues and supports assurance of simulation fairness. Finally, we proposed a flexible framework to enable integration of multiple behavior modeling components that model working memory phenomena with different degrees of sophistication.
NASA Technical Reports Server (NTRS)
Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.
1992-01-01
The United States' major space projects of the coming decades, such as Space Station and the Human Exploration Initiative, will require the development of many millions of lines of mission-critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and productivity of large software development projects. New trends in CASE technology are outlined, and the efforts of the Software Technology Branch (STB) at JSC to provide some of these CASE solutions for NASA are described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
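The abstract mentions software cost models such as COSTMODL without detailing them. As a hedged illustration of what this family of models computes (not COSTMODL itself), the classic basic COCOMO formula estimates effort in person-months as a power law of code size:

```python
# Illustrative sketch: the classic basic COCOMO effort model,
# effort = a * KLOC**b, a common ancestor of NASA-style cost models.
# This is NOT the COSTMODL implementation, only the general form.

COCOMO_BASIC = {            # (a, b) pairs from Boehm's basic COCOMO
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def effort_person_months(kloc, mode="embedded"):
    """Estimated effort for a project of `kloc` thousand lines of code."""
    a, b = COCOMO_BASIC[mode]
    return a * kloc ** b

# Mission-critical flight software is closest to the "embedded" class;
# a hypothetical 100 KLOC subsystem gives:
print(round(effort_person_months(100), 1))  # → 904.3
```

Real cost models layer many effort multipliers (team experience, reuse, tool support) on top of this nominal estimate, which is why calibration projects like COSTMODL existed.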
A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.
Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu
2016-04-19
Sign language recognition (SLR) can provide a helpful tool for communication between the deaf and the external world. This paper proposes a component-based, vocabulary-extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word is considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification is implemented based on the recognition of the five components. The proposed SLR framework consists of two major parts. The first part obtains the component-based form of sign gestures and establishes the code table of the target sign gesture set using data from a reference subject. The second part, designed for new users, trains component classifiers using a training set suggested by the reference subject and classifies unknown gestures with a code matching method. Five subjects participated in this study, and recognition experiments under different sizes of training sets were implemented on a target gesture set consisting of 110 frequently used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third of the gestures of the target gesture set) suggested by two reference subjects, (82.6 ± 13.2)% and (79.7 ± 13.4)% average recognition accuracy were obtained for the 110 words respectively, and the average recognition accuracy climbed to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50-60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system.
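The code-matching step can be sketched simply: each sign word is encoded as a 5-tuple of component labels, and an unknown gesture is assigned the code-table entry that disagrees in the fewest positions. The component labels and code-table entries below are hypothetical, not the paper's CSL codes:

```python
# Hypothetical sketch of component-code matching (not the paper's code).
# Each sign word = (hand shape, axis, orientation, rotation, trajectory).

CODE_TABLE = {           # illustrative entries, not the real CSL code table
    "thanks":  ("flat", "x", "up",   "none", "arc"),
    "friend":  ("hook", "y", "side", "cw",   "line"),
    "teacher": ("flat", "z", "up",   "ccw",  "line"),
}

def match(predicted_code):
    """Return the sign word whose code is closest (Hamming-style)
    to the component labels the classifiers predicted."""
    def distance(code):
        return sum(a != b for a, b in zip(code, predicted_code))
    return min(CODE_TABLE, key=lambda word: distance(CODE_TABLE[word]))

# One component classifier misfired (trajectory "arc" read as "line"),
# but the nearest code still identifies the intended word.
print(match(("flat", "x", "up", "none", "line")))  # → thanks
```

This tolerance to single-component errors is what lets a small training set cover a large vocabulary: only component classifiers, not whole-word classifiers, need training data.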
Selected Topics in Overset Technology Development and Applications At NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Chan, William M.; Kwak, Dochan (Technical Monitor)
2002-01-01
This paper presents a general overview of overset technology development and applications at NASA Ames Research Center. The topics include: 1) Overview of overset activities at NASA Ames; 2) Recent developments in Chimera Grid Tools; 3) A general framework for multiple component dynamics; 4) A general script module for automating liquid rocket sub-systems simulations; and 5) Critical future work.
NASA Astrophysics Data System (ADS)
Li, Zhixiong; Yan, Xinping; Wang, Xuping; Peng, Zhongxiao
2016-06-01
In the complex gear transmission systems of wind turbines, a crack is one of the most common failure modes and can be fatal to the wind turbine power system. A single sensor may suffer from issues relating to its installation position and direction, resulting in the collection of weak dynamic responses from the cracked gear. A multi-channel sensor system is hence applied in the signal acquisition, and blind source separation (BSS) technologies are employed to optimally process the information collected from multiple sensors. However, a literature review finds that most BSS-based fault detectors do not address the dependence/correlation between different moving components in the gear systems; in particular, the popular independent component analysis (ICA) assumes mutual independence of the different vibration sources. The fault detection performance may be significantly influenced by the dependence/correlation between vibration sources. In order to address this issue, this paper presents a new method based on supervised order tracking bounded component analysis (SOTBCA) for gear crack detection in wind turbines. Bounded component analysis (BCA) is a state-of-the-art technology for dependent source separation that has so far been applied mainly to communication signals. To make it applicable to vibration analysis, in this work order tracking has been appropriately incorporated into the BCA framework to eliminate noise and disturbance signal components. An autoregressive (AR) model built with prior knowledge about the crack fault is then employed to supervise the reconstruction of the crack vibration source signature. SOTBCA outputs only one source signal, the one that has the closest distance to the AR model.
Owing to the dependence tolerance ability of the BCA framework, interfering vibration sources that are dependent/correlated with the crack vibration source could be recognized by the SOTBCA, and hence, only useful fault information could be preserved in the reconstructed signal. The crack failure thus could be precisely identified by the cyclic spectral correlation analysis. A series of numerical simulations and experimental tests have been conducted to illustrate the advantages of the proposed SOTBCA method for fatigue crack detection. Comparisons to three representative techniques, i.e. Erdogan's BCA (E-BCA), joint approximate diagonalization of eigen-matrices (JADE), and FastICA, have demonstrated the effectiveness of the SOTBCA. Hence the proposed approach is suitable for accurate gear crack detection in practical applications.
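The supervision idea, keeping only the separated source that best fits a crack-informed AR model, can be illustrated with a toy selection step. This is a rough sketch, not the authors' implementation: the AR(2) coefficients and candidate signals below are invented:

```python
# Rough illustration (not the paper's code) of AR-model supervision:
# among candidate separated sources, keep the one whose one-step
# prediction residual under a crack-informed AR(2) model is smallest.
import math

AR_COEFFS = (1.6, -0.8)   # hypothetical prior model of the crack signature

def ar_residual_energy(x, a=AR_COEFFS):
    """Mean squared one-step AR(2) prediction error for signal x."""
    errs = [x[n] - a[0] * x[n - 1] - a[1] * x[n - 2] for n in range(2, len(x))]
    return sum(e * e for e in errs) / len(errs)

n = 200
# Candidate 1 follows the AR(2) dynamics exactly; candidate 2 is an
# unrelated interfering tone.
crack_like = [1.0, 1.6]
for _ in range(n):
    crack_like.append(AR_COEFFS[0] * crack_like[-1] + AR_COEFFS[1] * crack_like[-2])
other = [math.sin(0.9 * k) for k in range(n)]

best = min([crack_like, other], key=ar_residual_energy)
print(best is crack_like)  # → True
```

The real method applies this selection inside the BCA separation loop after order tracking; the sketch only shows why an AR prior can single out the crack-related source among dependent mixtures.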
The AI Bus architecture for distributed knowledge-based systems
NASA Technical Reports Server (NTRS)
Schultz, Roger D.; Stobie, Iain
1991-01-01
The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components running in heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semiautonomous agent process consists of a number of knowledge sources together with interagent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes, or demons, provide an event-driven means of giving active objects shared access to resources, and each other, without violating their security.
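The shared-blackboard-with-demons pattern described above can be sketched in miniature. This is an illustrative toy in Python rather than the AI Bus C++ library; the knowledge sources and keys are invented:

```python
# Minimal sketch (illustrative, not AI Bus source) of the blackboard
# pattern: knowledge sources watch a shared blackboard and react,
# event-driven, when entries they care about are posted.

class Blackboard:
    def __init__(self):
        self.entries, self._watchers = {}, []

    def watch(self, key, callback):
        # A "demon": fires whenever a matching entry is posted.
        self._watchers.append((key, callback))

    def post(self, key, value):
        self.entries[key] = value
        for k, cb in self._watchers:
            if k == key:
                cb(value)

bb = Blackboard()
# Knowledge source 1 turns raw readings into hypotheses.
bb.watch("sensor",
         lambda v: bb.post("hypothesis", f"anomaly:{v}" if v > 10 else "normal"))
# Knowledge source 2 acts on hypotheses.
alerts = []
bb.watch("hypothesis", alerts.append)

bb.post("sensor", 42)
print(alerts)  # → ['anomaly:42']
```

In the real architecture the knowledge sources run as distributed agent processes and the demons mediate secure shared access to resources; the toy keeps only the event-driven coupling through the blackboard.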
Scobbie, Lesley; Dixon, Diane; Wyke, Sally
2011-05-01
Setting and achieving goals is fundamental to rehabilitation practice, but goal setting has been criticized for being atheoretical, and the key components of replicable goal-setting interventions are not well established. To describe the development of a theory-based goal-setting practice framework for use in rehabilitation settings and to detail its component parts. Causal modelling was used to map theories of behaviour change onto the process of setting and achieving rehabilitation goals, and to suggest the mechanisms through which patient outcomes are likely to be affected. A multidisciplinary task group developed the causal model into a practice framework for use in rehabilitation settings through iterative discussion and implementation with six patients. Four components of a goal-setting and action-planning practice framework were identified: (i) goal negotiation, (ii) goal identification, (iii) planning, and (iv) appraisal and feedback. The variables hypothesized to effect change in patient outcomes were self-efficacy and action plan attainment. A theory-based goal-setting practice framework for use in rehabilitation settings is described. The framework requires further development and systematic evaluation in a range of rehabilitation settings.
PRMS Data Warehousing Prototype
NASA Technical Reports Server (NTRS)
Guruvadoo, Eranna K.
2001-01-01
Project and Resource Management System (PRMS) is a web-based, mid-level management tool developed at KSC to provide a unified enterprise framework for Project and Mission management. The addition of a data warehouse as a strategic component to the PRMS is investigated through the analysis, design, and implementation processes of a data warehouse prototype. As a proof of concept, a demonstration of the prototype with its OLAP technology for multidimensional data analysis is made. The results of the data analysis and the design constraints are discussed. The prototype can be used to motivate interest and support for an operational data warehouse.
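The multidimensional analysis an OLAP prototype enables amounts to rolling a fact table up along chosen dimensions. The following is an illustrative sketch with a made-up fact table, not the PRMS schema:

```python
# Illustrative OLAP-style roll-up (made-up data, not the PRMS schema):
# aggregate a fact table of project hours along selected dimensions.
from collections import defaultdict

# fact rows: (project, year, quarter, hours)
facts = [
    ("Shuttle", 2001, "Q1", 120), ("Shuttle", 2001, "Q2", 80),
    ("Station", 2001, "Q1", 200), ("Station", 2001, "Q2", 150),
]

def rollup(facts, dims):
    """Sum the hours measure, grouped by the selected dimension indexes."""
    cube = defaultdict(int)
    for row in facts:
        key = tuple(row[i] for i in dims)
        cube[key] += row[3]
    return dict(cube)

# Roll up by project only (dimension 0), collapsing year and quarter.
print(rollup(facts, dims=(0,)))  # → {('Shuttle',): 200, ('Station',): 350}
```

A warehouse engine does the same grouping over star-schema tables with precomputed aggregates; the sketch shows only the dimensional slicing idea.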
Technology Adoption: an Interaction Perspective
NASA Astrophysics Data System (ADS)
Sitorus, Hotna M.; Govindaraju, Rajesri; Wiratmadja, I. I.; Sudirman, Iman
2016-02-01
The success of a new technology depends on how well it is accepted by its intended users. Many technologies face the problem of a low adoption rate, despite their benefits. An understanding of what makes people accept or reject a new technology can help speed up the adoption rate. This paper presents a framework for technology adoption based on an interaction perspective, resulting from a literature study on technology adoption. In studying technology adoption, it is necessary to consider the interactions among elements involved in the system, for these interactions may generate new characteristics or new relationships. The interactions among the elements involved in technology adoption have not received sufficient consideration in previous studies. Based on the proposed interaction perspective, technology adoption is elaborated by examining interactions among the individual (i.e. the user or prospective user), the technology, the task, and the environment. The framework is formulated by adopting several theories, including Perceived Characteristics of Innovating, Diffusion of Innovation Theory, the Technology Acceptance Model, Task-Technology Fit, and usability theory. The proposed framework is illustrated in the context of mobile banking adoption. It aims to offer a better understanding of the determinants of technology adoption in various contexts, including technology in manufacturing systems.
The approach for shortest paths in fire succor based on component GIS technology
NASA Astrophysics Data System (ADS)
Han, Jie; Zhao, Yong; Dai, K. W.
2007-06-01
Fire safety is an important issue for the national economy and for people's lives. The efficiency and accuracy of fire-department rescue directly affect the safety of people's lives and property. Traditional fire-response systems have shown many shortcomings in practice. Pumper dispatch is guided by wireless or wired voice communication, so it is neither timely nor accurate. Information about a reported fire, such as its position, the extent of the disaster, and the relevant map, is processed manually for alarm and command, which slows the reaction and squanders the window for an effective response. Constructing a modern fire command center based on advanced technology addresses these shortcomings: it modernizes and automates fire command and management, plays a great role in protecting lives and property, enhances response capability, and minimizes the direct and indirect losses caused by fire. With the development of science and technology, Geographic Information Systems (GIS) have become a new information industry spanning hardware production, software development, data collection, spatial analysis, and consulting. With the popularization of computers, GIS has gained increasingly broad application thanks to its strong functionality. Network analysis is one of the most important functions of GIS, and its most elementary and pivotal operation is the computation of shortest paths. Shortest paths are mostly applied in emergency systems such as 119 fire-alarm dispatch. These systems typically require that an optimal path be computed within 1-3 seconds and that a vehicle's next running path be recalculated in time while it is traveling, so the shortest-path implementation must be highly efficient.
In this paper, component GIS technology is first applied to collect and record information about a reported fire (such as the disaster situation, map, and road status). Ant colony optimization is then used to calculate the shortest path for fire succor. The optimized routes are sent to the pumpers, which can therefore choose the shortest paths intelligently and reach the fire position in the least time. The programming method for shortest paths is proposed in Section 3, in three parts: the elementary framework of the proposed method, the systematic framework of the GIS component, and the ant colony optimization employed. Section 4 presents a simple application instance to demonstrate the method, again in three parts: the distributed Web application based on component GIS, optimization results without traffic constraints, and optimization results with traffic constraints. The contributions of this paper can be summarized as follows. (1) It proposes an effective approach to shortest paths in fire succor based on component GIS technology, supporting real-time shortest-path decisions for fire rescue. (2) It applies ant colony optimization to the shortest-path decision and incorporates traffic information into that decision. The application instance suggests that the proposed approach is feasible, correct, and valid.
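The core computation can be sketched with a minimal ant colony optimization for a single-pair shortest path. The toy road graph, parameters, and pheromone rules below are illustrative assumptions, not the authors' implementation:

```python
import random

# Undirected toy road network: (node, node) -> travel cost
EDGES = {(0, 1): 2, (0, 2): 4, (1, 2): 1, (1, 3): 7, (2, 3): 2, (2, 4): 3, (3, 4): 2}
GRAPH = {}
for (a, b), w in EDGES.items():
    GRAPH.setdefault(a, {})[b] = w
    GRAPH.setdefault(b, {})[a] = w

def aco_shortest_path(src, dst, n_ants=50, n_iters=20, alpha=1.0, beta=2.0,
                      rho=0.5, seed=0):
    rng = random.Random(seed)
    tau = {e: 1.0 for e in EDGES}                          # pheromone per edge
    pher = lambda a, b: tau[(a, b)] if (a, b) in tau else tau[(b, a)]
    best_path, best_cost = None, float("inf")
    for _ in range(n_iters):
        walks = []
        for _ in range(n_ants):
            path, node = [src], src
            while node != dst:
                choices = [n for n in GRAPH[node] if n not in path]
                if not choices:                            # dead end: abandon ant
                    path = None
                    break
                weights = [pher(node, n) ** alpha * (1.0 / GRAPH[node][n]) ** beta
                           for n in choices]
                node = rng.choices(choices, weights)[0]
                path.append(node)
            if path:
                cost = sum(GRAPH[a][b] for a, b in zip(path, path[1:]))
                walks.append((path, cost))
                if cost < best_cost:
                    best_path, best_cost = path, cost
        for e in tau:                                      # evaporate, then deposit
            tau[e] *= 1.0 - rho
        for path, cost in walks:
            for a, b in zip(path, path[1:]):
                tau[(a, b) if (a, b) in tau else (b, a)] += 1.0 / cost
    return best_path, best_cost
```

On this toy network the search should recover the route 0 → 1 → 2 → 4 with cost 6. The paper's traffic constraints would enter by inflating edge costs for congested roads before (or during) the walk.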
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobos, Peter Holmes; Malczynski, Leonard A.; Walker, La Tonya Nicole
People save for retirement throughout their career because it is virtually impossible to save all you’ll need in retirement the year before you retire. Similarly, without installing incremental amounts of clean fossil, renewable or transformative energy technologies throughout the coming decades, a radical and immediate change will be near impossible the year before a policy goal is set to be in place. This notion of steady installation growth over acute installations of technology to meet policy goals is the core topic of discussion for this research. This research operationalizes this notion by developing the theoretical underpinnings of regulatory and market acceptance delays by building upon the common Technology Readiness Level (TRL) framework and offers two new additions to the research community. The new and novel Regulatory Readiness Level (RRL) and Market Readiness Level (MRL) frameworks were developed. These components, collectively called the Technology, Regulatory and Market (TRM) readiness level framework, allow one to build new constraints into existing Integrated Assessment Models (IAMs) to address research questions such as, ‘To meet our desired technical and policy goals, what are the factors that affect the rate we must install technology to achieve these goals in the coming decades?’
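The steady-growth-versus-acute-installation argument can be made concrete with a toy deployment constraint: a technology deploys only once the minimum of its TRL-, RRL-, and MRL-style readiness scores clears a threshold, and capacity then compounds at a capped annual rate. The numbers, threshold, and growth cap are illustrative assumptions, not the report's data:

```python
def years_to_goal(goal_gw, initial_gw, annual_growth, readiness, threshold=8):
    """Years to reach a capacity goal when deployment is gated by the
    minimum of technology/regulatory/market readiness (TRL/RRL/MRL)
    and capacity thereafter grows at a capped fractional rate."""
    if min(readiness.values()) < threshold:
        raise ValueError("not ready for deployment: %s" % readiness)
    capacity, years = initial_gw, 0
    while capacity < goal_gw:
        capacity *= 1.0 + annual_growth   # capped compound installation growth
        years += 1
    return years

trm = {"TRL": 9, "RRL": 8, "MRL": 8}
steady = years_to_goal(100.0, 10.0, 0.10, trm)   # 10x capacity at 10%/yr
```

The point the framework makes falls out immediately: at a realistic capped growth rate, a 10x capacity goal takes decades, so waiting until "the year before" a policy deadline is infeasible.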
ERIC Educational Resources Information Center
Warfield, Douglas L.
2011-01-01
The evolution of information technology has included new methodologies that use information technology to control and manage various industries and government activities. Information Technology has also evolved as its own industry with global networks of interconnectivity, such as the Internet, and frameworks, models, and methodologies to control…
Supporting Component-Based Courseware Development Using Virtual Apparatus Framework Script.
ERIC Educational Resources Information Center
Ip, Albert; Fritze, Paul
This paper reports on the latest development of the Virtual Apparatus (VA) framework, a contribution to efforts at the University of Melbourne (Australia) to mainstream content and pedagogical functions of curricula. The integration of the educational content and pedagogical functions of learning components using an XML compatible script,…
On Noise Assessment for Blended Wing Body Aircraft
NASA Technical Reports Server (NTRS)
Guo, Yueping; Burley, Casey L; Thomas, Russell H.
2014-01-01
A system noise study is presented for the blended-wing-body (BWB) aircraft configured with advanced technologies that are projected to be available in the 2025 timeframe of the NASA N+2 definition. This system noise assessment shows that the noise levels of the baseline configuration, measured by the cumulative Effective Perceived Noise Level (EPNL), have a large margin of 34 dB to the aircraft noise regulation of Stage 4. This confirms the acoustic benefits of the BWB shielding of engine noise, as well as other projected noise reduction technologies, but the noise margins are less than previously published assessments and are short of meeting the NASA N+2 noise goal. In establishing the relevance of the acoustic assessment framework, the design of the BWB configuration, the technical approach of the noise analysis, the databases and prediction tools used in the assessment are first described and discussed. The predicted noise levels and the component decomposition are then analyzed to identify the ranking order of importance of various noise components, revealing the prominence of airframe noise, which holds up the levels at all three noise certification locations and renders engine noise reduction technologies less effective. When projected airframe component noise reduction is added to the HWB configuration, it is shown that the cumulative noise margin to Stage 4 can reach 41.6 dB, nearly at the NASA goal. These results are compared with a previous NASA assessment with a different study framework. The approaches that yield projections of such low noise levels are discussed including aggressive assumptions on future technologies, assumptions on flight profile management, engine installation, and component noise reduction technologies. It is shown that reliable predictions of component noise also play an important role in the system noise assessment. 
The comparisons and discussions illustrate the importance of practical feasibilities and constraints in aircraft system noise studies, which include aerodynamic performance, propulsion efficiency, flight profile limitation and many other factors. For a future aircraft concept to achieve the NASA N+2 noise goal it will require a range of fully successful noise reduction technology developments.
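The cumulative metric used in this assessment is simply the sum of per-certification-point margins (limit minus predicted EPNL) over the three points. A minimal sketch, with hypothetical EPNL values and limits (actual Stage 4 limits depend on aircraft weight and engine count):

```python
def cumulative_margin(predicted_epnl, stage_limits):
    """Cumulative noise margin in EPNdB: sum of per-point margins
    (limit - predicted) over the certification points."""
    return sum(stage_limits[p] - predicted_epnl[p] for p in predicted_epnl)

# Hypothetical configuration sitting 10/12/12 EPNdB below its limits:
predicted = {"approach": 94.0, "lateral": 89.0, "flyover": 83.0}
limits    = {"approach": 104.0, "lateral": 101.0, "flyover": 95.0}
margin = cumulative_margin(predicted, limits)   # 34.0 EPNdB cumulative
```

The decomposition in the abstract works the same way per component: an airframe source that "holds up the levels" at all three points caps the per-point margins no matter how much engine noise is shielded.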
Huser, Vojtech; Rasmussen, Luke V; Oberg, Ryan; Starren, Justin B
2011-04-10
Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present an application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment, and the challenge of user-friendly representation of clinical logic. We present our implementation of a workflow engine technology that addresses these two challenges in delivering clinical decision support. Our system is based on a cross-industry standard, the XML (extensible markup language) Process Definition Language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for executing those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode, where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms, due to the lack of standardization of EHR systems in this area. We present the results of our evaluation of the flowchart-based graphical notation, as well as an architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture.
We describe an implementation of a free workflow technology software suite (available at http://code.google.com/p/healthflow) and its application in the domain of clinical decision support. Our implementation seamlessly supports clinical logic testing on retrospective data and offers a user-friendly knowledge representation paradigm. With the presented software implementation, we demonstrate that workflow engine technology can provide a decision support platform which evaluates well against an established clinical decision support architecture evaluation framework. Due to cross-industry usage of workflow engine technology, we can expect significant future functionality enhancements that will further improve the technology's capacity to serve as a clinical decision support platform.
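The retrospective-testing idea can be illustrated with a toy flowchart interpreter. The structure below only mimics the spirit of an XPDL-style process definition (steps plus conditional transitions); the rule and the patient fields are hypothetical, not from the cited system:

```python
# Nodes map to lists of (next_node, condition); condition None means
# an unconditional fallthrough transition. The LDL/statin rule is a
# made-up example of decision-support logic.
FLOW = {
    "start":     [("check_ldl", None)],
    "check_ldl": [("alert", lambda pt: pt["ldl"] > 130 and not pt["on_statin"]),
                  ("done",  None)],
    "alert":     [("done", None)],
}

def run(flow, patient):
    """Walk the flowchart for one retrospective record; return fired steps."""
    node, fired = "start", []
    while node != "done":
        fired.append(node)
        for nxt, cond in flow[node]:
            if cond is None or cond(patient):
                node = nxt
                break
    return fired

alerted = run(FLOW, {"ldl": 150, "on_statin": False})   # start, check_ldl, alert
quiet   = run(FLOW, {"ldl": 110, "on_statin": False})   # start, check_ldl
```

Running the same `run` function over a batch of historical records gives the retrospective validation; pointing it at live EHR events gives the prospective mode, which is exactly the dual use the authors describe.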
NASA Astrophysics Data System (ADS)
NASA Technical Reports Server (NTRS)
1993-01-01
This Strategic Plan was developed by the Federal Coordinating Council for Science, Engineering, and Technology (FCCSET) through its Committee on Education and Human Resources (CEHR), with representatives from 16 Federal agencies. Based on two years of coordinated interagency effort, the Plan confirms the Federal Government's commitment to ensuring the health and well-being of science, mathematics, engineering, and technology education at all levels and in all sectors (i.e., elementary and secondary, undergraduate, graduate, public understanding of science, and technology education). The Plan represents the Federal Government's efforts to develop a five-year planning framework and associated milestones that focus Federal planning and the resources of the participating agencies toward achieving the requisite or expected level of mathematics and science competence by all students. The priority framework outlines the strategic objectives, implementation priorities, and components for the Strategic Plan and serves as a road map for the Plan. The Plan endorses a broad range of ongoing activities, including continued Federal support for graduate education as the backbone of our country's research and development enterprise. The Plan also identifies three tiers of program activities with goals that address issues in science, mathematics, engineering, and technology education meriting special attention. Within each tier, individual agency programs play important and often unique roles that strengthen the aggregate portfolio. The three tiers are presented in descending order of priority: (1) reforming the formal education system; (2) expanding participation and access; and (3) enabling activities.
NASA Astrophysics Data System (ADS)
Annetta, Leonard A.; Frazier, Wendy M.; Folta, Elizabeth; Holmes, Shawn; Lamb, Richard; Cheng, Meng-Tzu
2013-02-01
Design-based research principles guided this study of 51 secondary science teachers in the second year of a 3-year professional development project. The project entailed the creation of student-centered, inquiry-based science video games. A professional development model appropriate for infusing innovative technologies into standards-based curricula was employed to determine how science teachers' attitudes and efficacy were impacted while designing science-based video games. The study's mixed-method design ascertained teacher efficacy on five factors related to technology and gaming (general computer use, science learning, inquiry teaching and learning, synchronous chat/text, and playing video games) using a web-based survey. Qualitative data in the form of online blog posts were gathered during the project to assist in the triangulation and assessment of teacher efficacy. Data analyses consisted of an analysis of variance and serial coding of teachers' reflective responses. Results indicated that participants who used computers daily had higher efficacy in using inquiry-based teaching methods and in science teaching and learning. Additional emergent findings revealed possible motivating factors for efficacy. This professional development project focused on inquiry as a pedagogical strategy, standards-based science learning as a means to develop content knowledge, and creating video games as technological knowledge. The project was consistent with the Technological Pedagogical Content Knowledge (TPCK) framework, in which the overlap of the three components indicates the development of an integrated understanding of the suggested relationships. Findings provide suggestions for the development of standards-based science education software, its integration into the curriculum, and strategies for implementing technology into teaching practices.
DOT National Transportation Integrated Search
2016-04-01
In this study, we developed an adaptive signal control (ASC) framework for connected vehicles (CVs) using agent-based modeling technique. : The proposed framework consists of two types of agents: 1) vehicle agents (VAs); and 2) signal controller agen...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Hirt, Evelyn H.; Dib, Gerges
This project involved the development of enhanced risk monitors (ERMs) for active components in Advanced Reactor (AdvRx) designs by integrating real-time information about equipment condition with risk monitors. Health monitoring techniques, in combination with predictive estimates of component failure based on condition and risk monitors, can serve to indicate the risk posed by continued operation in the presence of detected degradation. This combination of predictive health monitoring based on equipment condition assessment and risk monitors can also enable optimization of maintenance scheduling with respect to the economics of plant operation. This report summarizes PNNL's multi-year project on the development and evaluation of an ERM concept for active components while highlighting FY2016 accomplishments. Specifically, this report provides a status summary of the integration and demonstration of the prototypic ERM framework with the plant supervisory control algorithms being developed at Oak Ridge National Laboratory (ORNL), and describes additional case studies conducted to assess the sensitivity of the technology to different quantities. Supporting documentation on the software package to be provided to ORNL is incorporated in this report.
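One way to picture the ERM idea is a simple mapping from a monitored degradation index to a condition-based failure probability that scales a component's baseline contribution in the risk monitor. The hazard-like curve and all parameters below are illustrative assumptions, not PNNL's models:

```python
import math

def failure_probability(degradation, scale=0.15):
    """Map a monitored degradation index (0 = as-new, 1 = fully degraded)
    to a failure probability via an exponential hazard-like curve.
    The curve shape and the 'scale' parameter are hypothetical."""
    return 1.0 - math.exp(-degradation / scale)

def risk_multiplier(p_condition_based, p_baseline):
    """Factor by which the condition-informed estimate scales the
    component's baseline risk contribution in the risk monitor."""
    return p_condition_based / p_baseline
```

In the ERM concept, a multiplier drifting above 1 as degradation is detected is what flags elevated risk from continued operation and motivates rescheduling maintenance.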
Integrated Technology Assessment Center (ITAC) Update
NASA Technical Reports Server (NTRS)
Taylor, J. L.; Neely, M. A.; Curran, F. M.; Christensen, E. R.; Escher, D.; Lovell, N.; Morris, Charles (Technical Monitor)
2002-01-01
The Integrated Technology Assessment Center (ITAC) has developed a flexible systems analysis framework to identify long-term technology needs, quantify payoffs for technology investments, and assess the progress of ASTP-sponsored technology programs in the hypersonics area. For this, ITAC has assembled an experienced team representing a broad sector of the aerospace community and developed a systematic assessment process complete with supporting tools. Concepts for transportation systems are selected based on relevance to the ASTP, and integrated concept models (ICMs) of these concepts are developed. Key technologies of interest are identified, and projections are made of their characteristics with respect to their impacts on key aspects of the specific concepts of interest. Both the models and the technology projections are then fed into ITAC's probabilistic systems analysis framework in ModelCenter. This framework permits rapid sensitivity analysis, single-point design assessment, and a full probabilistic assessment of each concept with respect to both embedded and enhancing technologies. Probabilistic outputs are weighed against metrics of interest to the ASTP using a multivariate decision-making process to provide inputs for technology prioritization within the ASTP. The ITAC program is currently finishing the assessment of a two-stage-to-orbit (TSTO), rocket-based combined cycle (RBCC) concept and a TSTO turbine-based combined cycle (TBCC) concept developed by the team with inputs from NASA. A baseline all-rocket TSTO concept is also being developed for comparison. Boeing has recently submitted a performance model for its Flexible Aerospace System Solution for Tomorrow (FASST) concept, and the ISAT program will provide inputs for a single-stage-to-orbit (SSTO) TBCC-based concept in the near term. Both of these latter concepts will be analyzed within the ITAC framework over the summer. This paper provides a status update of the ITAC program.
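The probabilistic step can be illustrated in miniature: sample an uncertain technology benefit, propagate it through a (here trivial) concept model, and report the probability of meeting a metric. The benefit range, concept mass, and goal below are hypothetical stand-ins for the ICM inputs, not ITAC's data:

```python
import random

def p_meets_goal(mass_kg, goal_kg, benefit_lo, benefit_hi, n=10000, seed=1):
    """Monte Carlo sweep over an uncertain technology benefit (fractional
    dry-mass reduction, sampled uniformly) and the probability that the
    resulting concept mass meets the goal. All numbers are illustrative."""
    rng = random.Random(seed)
    hits = sum(mass_kg * (1.0 - rng.uniform(benefit_lo, benefit_hi)) <= goal_kg
               for _ in range(n))
    return hits / n

# A 100 t concept with a 5-20% projected mass-reduction technology,
# assessed against a hypothetical 90 t goal:
p = p_meets_goal(100_000.0, 90_000.0, 0.05, 0.20)
```

A framework like ModelCenter does the same thing at scale: many uncertain technology parameters, a full vehicle model in the middle, and output distributions weighed against ASTP metrics.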
Ezzati, Majid; Utzinger, Jürg; Cairncross, Sandy; Cohen, Aaron J; Singer, Burton H
2005-01-01
Monitoring and empirical evaluation are essential components of evidence based public health policies and programmes. Consequently, there is a growing interest in monitoring of, and indicators for, major environmental health risks, particularly in the developing world. Current large scale data collection efforts are generally disconnected from micro-scale studies in health sciences, which in turn have insufficiently investigated the behavioural and socioeconomic factors that influence exposure. A basic framework is proposed for development of indicators of exposure to environmental health risks that would facilitate the (a) assessment of the health effects of risk factors, (b) design and evaluation of interventions and programmes to deliver the interventions, and (c) appraisal and quantification of inequalities in health effects of risk factors, and benefits of intervention programmes and policies. Specific emphasis is put on the features of environmental risks that should guide the choice of indicators, in particular the interactions of technology, the environment, and human behaviour in determining exposure. The indicators are divided into four categories: (a) access and infrastructure, (b) technology, (c) agents and vectors, and (d) behaviour. The study used water and sanitation, indoor air pollution from solid fuels, urban ambient air pollution, and malaria as illustrative examples for this framework. Organised and systematic indicator selection and monitoring can provide an evidence base for design and implementation of more effective and equitable technological interventions, delivery programmes, and policies for environmental health risks in resource poor settings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco
Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent safety research on socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work through the application of a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, and moreover one considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
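A semi-quantitative FRAM run can be sketched as a Monte Carlo propagation of function variability: each upstream function's output variability is sampled from a discrete distribution, and a downstream function "resonates" when the summed incoming variability exceeds its damping capacity. The functions, distributions, and threshold below are illustrative assumptions, not the sinter-plant model:

```python
import random

def fram_monte_carlo(n=20000, seed=42):
    """Estimate the functional-resonance rate of a downstream function
    fed by three upstream functions whose output variability is sampled
    as 0 (none), 1 (some) or 2 (high). All parameters are hypothetical."""
    rng = random.Random(seed)
    upstream = {
        "monitor_emissions": [0.70, 0.25, 0.05],   # P(variability = 0, 1, 2)
        "maintain_filters":  [0.60, 0.30, 0.10],
        "report_deviations": [0.80, 0.15, 0.05],
    }
    threshold = 3            # damping capacity of the downstream function
    resonances = 0
    for _ in range(n):
        total = sum(rng.choices([0, 1, 2], probs)[0]
                    for probs in upstream.values())
        if total > threshold:
            resonances += 1
    return resonances / n
```

Ranking functions by how often they participate in resonant runs is one way to surface the "potential critical activities" the audit is after.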
Data Management System for the National Energy-Water System (NEWS) Assessment Framework
NASA Astrophysics Data System (ADS)
Corsi, F.; Prousevitch, A.; Glidden, S.; Piasecki, M.; Celicourt, P.; Miara, A.; Fekete, B. M.; Vorosmarty, C. J.; Macknick, J.; Cohen, S. M.
2015-12-01
Aiming to provide a comprehensive assessment of the water-energy nexus, the National Energy-Water System (NEWS) project requires the integration of data to support a modeling framework that links climate, hydrological, power production, transmission, and economic models. Large amounts of georeferenced data have to be streamed to the components of the interdisciplinary model to explore future challenges and tradeoffs in US power production, based on climate scenarios, power plant locations and technologies, available water resources, ecosystem sustainability, and economic demand. We used open-source and in-house-built software components to build a system that addresses two major data challenges: (1) on-the-fly re-projection, re-gridding, interpolation, extrapolation, nodata patching, merging, and temporal and spatial aggregation of static and time-series datasets, in virtually any file format and file structure and for any geographic extent, for the models' I/O directly at run time; (2) comprehensive data management based on metadata cataloguing and discovery in repositories utilizing the MAGIC Table (Manipulation and Geographic Inquiry Control database). This innovative concept allows models to access data on the fly by data ID, irrespective of file path, file structure, and file format, and regardless of its GIS specifications. In addition, a web-based information and computational system is being developed to control the I/O of spatially distributed Earth system, climate, hydrological, power grid, and economic data flow within the NEWS framework. The system allows scenario building, data exploration, visualization, querying, and manipulation of any loaded gridded, point, or vector polygon dataset. The system has demonstrated its potential for applications in other fields of Earth science modeling, education, and outreach. Over time, this implementation of the system will provide near real-time assessment of various current and future scenarios of the water-energy nexus.
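Among the run-time operations listed, spatial aggregation is the easiest to sketch: averaging non-overlapping blocks of a fine grid to produce a coarser one. This toy block-mean version (pure Python, made-up values) stands in for the re-gridding the framework performs when a model requests data at its own resolution:

```python
def aggregate_grid(grid, factor):
    """Spatial aggregation on the fly: average non-overlapping
    factor x factor blocks of a regular grid (row-major list of lists).
    Grid dimensions are assumed divisible by `factor`."""
    rows, cols = len(grid), len(grid[0])
    out = []
    for r in range(0, rows, factor):
        out_row = []
        for c in range(0, cols, factor):
            block = [grid[r + i][c + j]
                     for i in range(factor) for j in range(factor)]
            out_row.append(sum(block) / len(block))
        out.append(out_row)
    return out

fine = [[1.0, 3.0, 5.0, 7.0],
        [1.0, 3.0, 5.0, 7.0]]
coarse = aggregate_grid(fine, 2)   # [[2.0, 6.0]]
```

The production system layers re-projection, interpolation, and nodata patching on top of this kind of kernel, and dispatches it by data ID via the MAGIC Table rather than by file path.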
How evidence from observing attending physicians links to a competency-based framework.
Bacchus, Maria; Ward, David R; de Grood, Jill; Lemaire, Jane B
2017-06-01
Competency-based medical education frameworks are often founded on a combination of existing research, educational principles and expert consensus. Our objective was to examine how components of the attending physician role, as determined by observing preceptors during their real-world work, link to the CanMEDS Physician Competency Framework. This is a sub-study of a broader study exploring the role of the attending physician by observing these doctors during their working day. The parent study revealed three overarching elements of the role that emerged from 14 themes and 123 sub-themes: (i) Competence, defined as the execution of traditional physician competencies; (ii) Context, defined as the environment in which the role is carried out, and (iii) Conduct, defined as the manner of acting, or behaviours and attitudes in the role that helped to negotiate the complex environment. In this sub-study, each sub-theme, or 'role-related component', was mapped to the competencies described in the CanMEDS 2005 and 2015 frameworks. Many role-related components from the Competence element were represented in the 2015 CanMEDS framework. No role-related components from the Context element were represented. Some role-related components from the Conduct element were represented. These Conduct role-related components were better represented in the 2015 CanMEDS framework than in the 2005 framework. This study shows how the real-world work of attending physicians links to the CanMEDS framework and provides empirical data identifying disconnects between espoused and observed behaviours. There is a conceptual gap where the contextual influences of physicians' work and the competencies required to adjust to these influences are missing from the framework. These concepts should be incorporated into learning both broadly, such as through an emphasis on context within curriculum development for the workplace (e.g. entrustable professional activities), and explicitly, through the introduction of novel competencies (e.g. the Conduct role-related components described in this study). © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Positive Attitudes towards Technologies and facets of Well-being in Older Adults.
Zambianchi, Manuela; Carelli, Maria Grazia
2018-03-01
The current study investigates the relevance of positive attitudes toward Internet technologies for psychological well-being and social well-being in old age. A sample of 245 elderly people (mean age = 70 years; SD = 9.1) filled in the Psychological Well-Being Questionnaire, the Social Well-Being Questionnaire, and the Attitudes Toward Technologies Questionnaire (ATTQ). Favorable attitudes toward Internet technologies showed positive correlations with overall social well-being and all of its components except social acceptance. Positive correlations with overall psychological well-being and two of its components, personal growth and purpose in life, were also found. Two hierarchical multiple regression models underscored that positive attitudes toward Internet technologies constitute the most important predictor of social well-being, and a significant predictor of psychological well-being as well. Results are discussed and integrated into the Positive Technology theoretical framework, which holds that technological resources can improve the quality of personal experience and well-being.
High Technology Service Value Maximization through an MCDM-Based Innovative e-Business Model
NASA Astrophysics Data System (ADS)
Huang, Chi-Yo; Tzeng, Gwo-Hshiung; Ho, Wen-Rong; Chuang, Hsiu-Tyan; Lue, Yeou-Feng
The emergence of the Internet has changed high-technology marketing channels thoroughly in the past decade, and e-commerce has already become one of the most efficient channels through which high-technology firms may skip intermediaries and reach end customers directly. However, defining appropriate e-business models for commercializing new high-technology products or services through the Internet is not easy. To overcome these problems, a novel analytic framework is proposed for defining an appropriate e-business model, based on the concept of expanding high-technology customers' competence sets by leveraging high-technology service firms' capabilities and resources, together with novel multiple criteria decision making (MCDM) techniques. An empirical study of a silicon intellectual property (SIP) commercialization e-business model based on MCDM techniques is provided to verify the effectiveness of this analytic framework. The analysis successfully assisted a Taiwanese IC design service firm in defining an e-business model for maximizing its customers' SIP transactions. In the future, the novel MCDM framework can be applied successfully to the definition of new business models in the high-technology industry.
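MCDM techniques of the kind the framework draws on rank alternatives against weighted criteria. As a generic illustration (not the study's actual method or criteria), here is a minimal TOPSIS: alternatives are rows, criteria are columns, and each criterion is flagged as benefit (larger is better) or cost. The three e-business options and their scores are hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    """Minimal TOPSIS: rank alternatives by closeness to the ideal
    solution. `benefit[j]` is True when criterion j is larger-is-better."""
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - ideal[j]) ** 2 for j, x in enumerate(row)))
        d_neg = math.sqrt(sum((x - worst[j]) ** 2 for j, x in enumerate(row)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical options scored on revenue potential (benefit) and
# implementation cost (cost criterion):
alts = [[8.0, 5.0], [6.0, 2.0], [9.0, 8.0]]
scores = topsis(alts, weights=[0.6, 0.4], benefit=[True, False])
best = max(range(len(scores)), key=scores.__getitem__)   # option 1 wins here
```

The cited study combines several MCDM techniques beyond this (e.g. for weighting and dependence among criteria); TOPSIS is shown only as a representative scoring step.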
An Integrated Web-Based 3d Modeling and Visualization Platform to Support Sustainable Cities
NASA Astrophysics Data System (ADS)
Amirebrahimi, S.; Rajabifard, A.
2012-07-01
Sustainable development is widely seen as the key to preserving the sustainability of cities in the face of ongoing population growth and its negative impacts. This is complex and requires holistic, multidisciplinary decision making. A variety of stakeholders with different backgrounds also need to be considered and involved. Numerous web-based modeling and visualization tools have been designed and developed to support this process. There have been some success stories; however, the majority have failed to provide a comprehensive platform supporting the different aspects of sustainable development. In this work, in the context of SDI and land administration, the CSDILA Platform, a 3D visualization and modeling platform, is proposed which can be used to model and visualize different dimensions of sustainability, in particular in an urban context. The methodology involved the design of a generic framework for the development of an analytical and visualization tool over the web. The CSDILA Platform was then implemented using a number of technologies following the guidelines provided by the framework. The platform has a modular structure, uses a Service-Oriented Architecture (SOA), manages spatial objects in a 4D data store, and can flexibly incorporate a variety of models through the platform's API. Development scenarios can be modeled and tested using the analysis and modeling component, and the results are visualized in a seamless 3D environment. The platform was further tested using a number of scenarios and showed promising results and the potential to serve wider needs. In this paper, the design process of the generic framework, the implementation of the CSDILA Platform and the technologies used, as well as findings and future research directions, are presented and discussed.
Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G
2011-01-01
A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We propose a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with semantic web technology. The framework comprises four primary modules: 1) an XML2RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF-store-based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype was successfully implemented to demonstrate the capability of the system to integrate multiple drug data and ontology resources and open web services for ADE data standardization. A preliminary evaluation was performed to demonstrate the usefulness of the system, including the performance of the NCBO annotator. In conclusion, semantic web technology provides a highly scalable framework for ADE data source integration and standard query services.
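As a rough illustration of the kind of XML-to-RDF transformation the first module performs, the sketch below flattens a made-up ADE record into subject-predicate-object triples. The record schema, the namespace URI, and the field names are all invented for the example; a real deployment would use an RDF library and the actual source schemas.

```python
import xml.etree.ElementTree as ET

# Hypothetical ADE record; the real ADEpedia source schemas are not shown here.
ADE_XML = """
<ade>
  <drug>warfarin</drug>
  <event>gastrointestinal hemorrhage</event>
  <source>FDA_AERS</source>
</ade>
"""

BASE = "http://example.org/adepedia/"  # placeholder namespace, not a real URI

def xml_to_triples(xml_text, record_id):
    """Flatten one XML ADE record into (subject, predicate, object) triples."""
    subject = BASE + "record/" + record_id
    root = ET.fromstring(xml_text)
    return [(subject, BASE + child.tag, child.text.strip()) for child in root]

triples = xml_to_triples(ADE_XML, "r1")
for s, p, o in triples:
    print(s, p, o)
```

In practice these tuples would be serialized into an RDF store (the persistence module), where they become queryable via SPARQL.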
Modular extracellular sensor architecture for engineering mammalian cell-based devices.
Daringer, Nichole M; Dudek, Rachel M; Schwarz, Kelly A; Leonard, Joshua N
2014-12-19
Engineering mammalian cell-based devices that monitor and therapeutically modulate human physiology is a promising and emerging frontier in clinical synthetic biology. However, realizing this vision will require new technologies enabling engineered circuitry to sense and respond to physiologically relevant cues. No existing technology enables an engineered cell to sense exclusively extracellular ligands, including proteins and pathogens, without relying upon native cellular receptors or signal transduction pathways that may be subject to crosstalk with native cellular components. To address this need, we here report a technology we term a Modular Extracellular Sensor Architecture (MESA). This self-contained receptor and signal transduction platform is maximally orthogonal to native cellular processes and comprises independent, tunable protein modules that enable performance optimization and straightforward engineering of novel MESA that recognize novel ligands. We demonstrate ligand-inducible activation of MESA signaling, optimization of receptor performance using design-based approaches, and generation of MESA biosensors that produce outputs in the form of either transcriptional regulation or transcription-independent reconstitution of enzymatic activity. This systematic, quantitative platform characterization provides a framework for engineering MESA to recognize novel ligands and for integrating these sensors into diverse mammalian synthetic biology applications.
A VGI data integration framework based on linked data model
NASA Astrophysics Data System (ADS)
Wan, Lin; Ren, Rongrong
2015-12-01
This paper addresses geographic data integration and sharing for multiple online VGI data sets. We propose a semantic-enabled framework for a cooperative application environment spanning online VGI sources, aimed at a target class of geospatial problems. Based on linked data technologies, one of the core components of the semantic web, we construct relationship links among geographic features distributed across diverse VGI platforms using linked data modeling methods, deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network that supports cooperative geospatial applications across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a uniform data representation model across different online social geographic data sources. We propose a mixed strategy that combines spatial distance similarity and feature name attribute similarity as the measure for comparing and matching geographic features across VGI data sets. Our work also applies Markov logic networks to interlink records describing the same entity in different VGI-based linked data sets; the automatic generation of a co-reference object identification model for geographic linked data is discussed in detail. The result is a large geographic linked data network spanning loosely coupled VGI web sites. Experiments built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
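The mixed matching strategy, combining spatial distance with name similarity, can be sketched as follows. The 50/50 weighting, the 1 km distance cutoff, and the use of difflib for string similarity are illustrative assumptions, not the paper's actual measures.

```python
import math
from difflib import SequenceMatcher

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_score(f1, f2, w_spatial=0.5, max_km=1.0):
    """Combine spatial proximity and name similarity into one score in [0, 1].
    Weights and cutoff are example values only."""
    d = haversine_km(f1["lat"], f1["lon"], f2["lat"], f2["lon"])
    spatial = max(0.0, 1.0 - d / max_km)
    name = SequenceMatcher(None, f1["name"].lower(), f2["name"].lower()).ratio()
    return w_spatial * spatial + (1 - w_spatial) * name

osm  = {"name": "Eiffel Tower", "lat": 48.8584, "lon": 2.2945}
wiki = {"name": "Tour Eiffel",  "lat": 48.8583, "lon": 2.2944}
far  = {"name": "Eiffel Tower", "lat": 48.9000, "lon": 2.4000}

print(round(match_score(osm, wiki), 2))  # high: near-identical location, similar names
print(round(match_score(osm, far), 2))   # lower: same name but several km apart
```

A co-reference model like the one the paper learns with Markov logic networks would, in effect, learn where to place the decision threshold over scores like these rather than hand-tuning the weights.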
Real-time long term measurement using integrated framework for ubiquitous smart monitoring
NASA Astrophysics Data System (ADS)
Heo, Gwanghee; Lee, Giu; Lee, Woosang; Jeon, Joonryong; Kim, Pil-Joong
2007-04-01
Ubiquitous monitoring combining Internet technologies and wireless communication is one of the most promising approaches to infrastructure health monitoring against natural or man-made hazards. In this paper, an integrated framework for ubiquitous monitoring is developed for real-time, long-term measurement in an Internet environment. The framework includes a wireless sensor system based on Bluetooth technology that sends measured acceleration data to a host computer over the TCP/IP protocol, and it is designed to respond to web users' requests in real time. To verify the system, real-time monitoring tests were carried out on a prototype self-anchored suspension bridge. The wireless measurement system was also analyzed to estimate its sensing capacity and evaluate its performance for monitoring purposes. Based on this evaluation, the paper proposes effective strategies for an integrated framework to detect structural deficiencies and to support an early warning system.
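One plausible way such a sensor link might frame acceleration samples for TCP transport is sketched below. The wire format (node id, sample count, then float32 accelerations, all big-endian) is an assumption made for illustration, not the system's actual protocol.

```python
import struct

# Hypothetical frame layout: node id (uint16), sample count (uint16),
# followed by float32 acceleration samples.
HEADER = struct.Struct("!HH")

def pack_frame(node_id, samples):
    """Serialize one batch of acceleration samples into a byte frame."""
    return HEADER.pack(node_id, len(samples)) + struct.pack(
        "!%df" % len(samples), *samples)

def unpack_frame(frame):
    """Recover (node_id, samples) from a received frame."""
    node_id, n = HEADER.unpack_from(frame)
    samples = struct.unpack_from("!%df" % n, frame, HEADER.size)
    return node_id, list(samples)

frame = pack_frame(7, [0.01, -0.02, 0.15])
node, accel = unpack_frame(frame)
print(node, [round(a, 2) for a in accel])
```

The frame bytes would then be written to a TCP socket on the sensor side and parsed by the host computer before being served to web clients.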
A Framework and Toolkit for the Construction of Multimodal Learning Interfaces
1998-04-29
human communication modalities in the context of a broad class of applications, specifically those that support state manipulation via parameterized actions. The multimodal semantic model is also the basis for a flexible, domain independent, incrementally trainable multimodal interpretation algorithm based on a connectionist network. The second major contribution is an application framework consisting of reusable components and a modular, distributed system architecture. Multimodal application developers can assemble the components in the framework into a new application,
Reducing Development and Operations Costs using NASA's "GMSEC" Systems Architecture
NASA Technical Reports Server (NTRS)
Smith, Dan; Bristow, John; Crouse, Patrick
2007-01-01
This viewgraph presentation reviews the role of the Goddard Mission Services Evolution Center (GMSEC) in reducing development and operations costs for handling the massive data volumes of NASA missions. The goals of GMSEC systems architecture development are to (1) simplify integration and development, (2) facilitate technology infusion over time, (3) support evolving operational concepts, and (4) allow for a mix of heritage, COTS, and new components. The first three missions (the Tropical Rainfall Measuring Mission (TRMM); the Small Explorer (SMEX) missions SWAS, TRACE, and SAMPEX; and the ST5 three-satellite constellation) each selected a different telemetry and command system. These results show that GMSEC's message-bus, component-based framework architecture is well proven and provides significant benefits over traditional flight and ground data system designs. Missions benefit through an increased set of product options, enhanced automation, lower cost, and new mission-enabling operations concept options.
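The message-bus idea behind this architecture can be illustrated with a minimal publish/subscribe sketch: components exchange messages by topic without knowing about each other, which is what lets heritage, COTS, and new components coexist. The topic names and handler API here are invented and do not reflect GMSEC's real interfaces.

```python
from collections import defaultdict

class MessageBus:
    """Minimal topic-based publish/subscribe bus, illustrating the decoupling
    idea only; GMSEC's actual API is richer."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        # Every subscriber on the topic sees the message; the publisher
        # never references subscribers directly.
        for handler in self._subs[topic]:
            handler(message)

bus = MessageBus()
received = []
# A telemetry archiver and an alarm monitor both listen without knowing the sender.
bus.subscribe("gmsec.tlm", received.append)
bus.subscribe("gmsec.tlm",
              lambda m: received.append(("ALARM", m)) if m["temp"] > 80 else None)
bus.publish("gmsec.tlm", {"temp": 85})
print(received)
```

Swapping a telemetry system then amounts to changing which component publishes on a topic, with no changes to the subscribers.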
New framework of NGN web-based management system
NASA Astrophysics Data System (ADS)
Nian, Zhou; Jie, Yin; Qian, Mao
2007-11-01
This paper introduces the basic concepts and key technologies of Ajax and some popular frameworks in the J2EE architecture, and attempts to integrate these frameworks into a new one. Developers can build web applications much more conveniently using this framework, and the resulting applications provide a friendlier, more interactive platform to end users. Finally, an example is given to explain how to use the new framework to build a web-based management system for a softswitch network.
Integrating Technology in Education: Moving the TPCK Framework towards Practical Applications
ERIC Educational Resources Information Center
Hechter, Richard P.; Phyfe, Lynette D.; Vermette, Laurie A.
2012-01-01
This theoretical paper offers a conceptual interpretation of the Technological, Pedagogical, and Content Knowledge (TPCK) framework to include the role of context within practical classroom applications. Our interpretation suggests that the importance of these three knowledge bases fluctuate within each stage of teachers' planning and instruction,…
Can composite digital monitoring biomarkers come of age? A framework for utilization.
Kovalchick, Christopher; Sirkar, Rhea; Regele, Oliver B; Kourtis, Lampros C; Schiller, Marie; Wolpert, Howard; Alden, Rhett G; Jones, Graham B; Wright, Justin M
2017-12-01
The application of digital monitoring biomarkers in health, wellness, and disease management is reviewed. Harnessing the near-limitless capacity of these approaches in the managed healthcare continuum will benefit from a systems-based architecture that presents data quality, quantity, and ease of capture within a decision-making dashboard. A framework was developed which stratifies key components and advances the concept of contextualized biomarkers. The framework codifies how direct, indirect, composite, and contextualized composite data can drive innovation for the application of digital biomarkers in healthcare. The de novo framework implies consideration of physiological, behavioral, and environmental factors in the context of biomarker capture and analysis. Application in disease and wellness is highlighted, and incorporation in clinical feedback loops and closed-loop systems is illustrated. The study of contextualized biomarkers has the potential to offer rich and insightful data for clinical decision making. Moreover, advancement of the field will benefit from innovation at the intersection of medicine, engineering, and science. Technological developments in this dynamic field will thus fuel its logical evolution guided by inputs from patients, physicians, healthcare providers, end-payors, actuaries, medical device manufacturers, and drug companies.
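A toy version of the direct/indirect/composite/contextualized stratification might look like the following. The signals, the weights, and the context adjustment are entirely hypothetical; they only show how context can modify a composite score before it reaches a dashboard.

```python
def composite_biomarker(direct, indirect, context, weights=(0.6, 0.4)):
    """Combine a direct and an indirect signal into a composite score, then
    adjust it by contextual factors. All numbers are illustrative only."""
    w_d, w_i = weights
    composite = w_d * direct + w_i * indirect
    # Contextualization: e.g. discount readings captured during exercise,
    # when motion artifacts make the signals less reliable.
    if context.get("activity") == "exercise":
        composite *= 0.8
    return round(composite, 3)

resting = composite_biomarker(0.9, 0.5, {"activity": "rest"})
active  = composite_biomarker(0.9, 0.5, {"activity": "exercise"})
print(resting, active)
```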
A user interface framework for the Square Kilometre Array: concepts and responsibilities
NASA Astrophysics Data System (ADS)
Marassi, Alessandro; Brajnik, Giorgio; Nicol, Mark; Alberti, Valentina; Le Roux, Gerhard
2016-07-01
The Square Kilometre Array (SKA) project is responsible for developing the SKA Observatory, the world's largest radio telescope, with eventually over a square kilometre of collecting area and including a general headquarters as well as two radio telescopes: SKA1-Mid in South Africa and SKA1-Low in Australia. The SKA project consists of a number of subsystems (elements), among which the Telescope Manager (TM) is the one involved in controlling and monitoring the SKA telescopes. The TM element has three primary responsibilities: management of astronomical observations, management of telescope hardware and software subsystems, and management of data to support system operations and all stakeholders (operators, maintainers, engineers, and science users) in achieving operational, maintenance, and engineering goals. Operators, maintainers, engineers, and science users will interact with TM via appropriate user interfaces (UI). The TM UI framework envisaged is a complete set of general technical solutions (components, technologies, and design information) for implementing a generic computing system (UI platform). Such a system will enable UI components to be instantiated to allow for human interaction via screens, keyboards, and mice, and to implement the necessary logic for acquiring or deriving the information needed for interaction. It will provide libraries and specific Application Programming Interfaces (APIs) to implement operator and engineer interactive interfaces. This paper provides a status update of the TM UI framework, UI platform, and UI component design effort, including the technology choices, and discusses key challenges in the TM UI architecture, as well as our approaches to addressing them.
Final Report for Bio-Inspired Approaches to Moving-Target Defense Strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fink, Glenn A.; Oehmen, Christopher S.
This report records the work and contributions of the NITRD-funded Bio-Inspired Approaches to Moving-Target Defense Strategies project performed by Pacific Northwest National Laboratory under the technical guidance of the National Security Agency's R6 division. The project has incorporated a number of bio-inspired cyber defensive technologies within an elastic framework provided by the Digital Ants. This project has created the first scalable, real-world prototype of the Digital Ants Framework (DAF)[11] and integrated five technologies into this flexible, decentralized framework: (1) Ant-Based Cyber Defense (ABCD), (2) Behavioral Indicators, (3) Bioinformatic Classification, (4) Moving-Target Reconfiguration, and (5) Ambient Collaboration. The DAF can be used operationally to decentralize many such data-intensive applications that normally rely on collection of large amounts of data in a central repository. In this work, we have shown how these component applications may be decentralized and may perform analysis at the edge. Operationally, this will enable analytics to scale far beyond current limitations while not suffering from the bandwidth or computational limitations of centralized analysis. This effort has advanced the R6 Cyber Security research program to secure digital infrastructures by developing a dynamic means to adaptively defend complex cyber systems. We hope that this work will benefit both our client's efforts in system behavior modeling and cyber security to the overall benefit of the nation.
Process-aware EHR BPM systems: two prototypes and a conceptual framework.
Webster, Charles; Copenhaver, Mark
2010-01-01
Systematic methods to improve the effectiveness and efficiency of electronic health record-mediated processes will be key to EHRs playing an important role in the positive transformation of healthcare. Business process management (BPM) systematically optimizes process effectiveness, efficiency, and flexibility. Therefore BPM offers relevant ideas and technologies. We provide a conceptual model based on EHR productivity and negative feedback control that links EHR and BPM domains, describe two EHR BPM prototype modules, and close with the argument that typical EHRs must become more process-aware if they are to take full advantage of BPM ideas and technology. A prediction: Future extensible clinical groupware will coordinate delivery of EHR functionality to teams of users by combining modular components with executable process models whose usability (effectiveness, efficiency, and user satisfaction) will be systematically improved using business process management techniques.
2006-01-01
segments video game interaction into domain-independent components which together form a framework that can be used to characterize real-time interactive...multimedia applications in general and HRI in particular. We provide examples of using the components in both the video game and the Unmanned Aerial
2010-03-01
and characterize the actions taken by the soldier (e.g., running, walking, climbing stairs). Real-time image capture and exchange: the ability of...multimedia information sharing among soldiers in the field, two-way speech translation systems, and autonomous robotic platforms. It has been the foundation for 10 technology evaluations
NASA Astrophysics Data System (ADS)
Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.
2014-05-01
In this contribution, a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules: the "Component Description", an "Expert System" for the synthesis of several process chains, and "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model with a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling, and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered; the Expert System uses an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information in the knowledge base via relations. The third module evaluates the generated process chains: the parameters of each process are optimised with respect to the component specification, and the result of the best parameterisation is used as a representative value. Finally, the process chain that can manufacture a functionally graded component optimally with regard to the property distributions of the component description is presented by means of a dedicated specification technique.
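The Expert System's synthesis step, generating only those process chains whose steps respect the dependency relation and then picking the best-scoring one, can be sketched with a toy dependency table. The steps, the relation, and the costs below are invented for illustration; the paper's knowledge base and ontology capture far richer dependencies.

```python
from itertools import permutations

# Toy step set, "may follow" relation, and per-step costs (hypothetical values).
ALLOWED_AFTER = {None: {"heat"}, "heat": {"form"}, "form": {"cool"}}
COST = {"heat": 3.0, "form": 5.0, "cool": 1.0}

def is_valid(chain):
    """A chain is valid if every step may follow its predecessor."""
    prev = None
    for step in chain:
        if step not in ALLOWED_AFTER.get(prev, set()):
            return False
        prev = step
    return True

# Synthesis: enumerate candidate orderings, keep the valid ones.
chains = [c for c in permutations(COST) if is_valid(c)]
# Evaluation: here a simple cost sum stands in for per-process optimisation.
best = min(chains, key=lambda c: sum(COST[s] for s in c))
print(chains, best)
```

In the real framework each surviving chain would be parameterised and optimised against the component specification rather than scored by a fixed cost table.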
A Framework for a WAP-Based Course Registration System
ERIC Educational Resources Information Center
AL-Bastaki, Yousif; Al-Ajeeli, Abid
2005-01-01
This paper describes a WAP-based course registration system designed and implemented to facilitate the process of student registration at Bahrain University. The framework will support many opportunities for applying WAP-based technology to services such as wireless commerce, cashless payment... and location-based services. The paper…
Architectural approaches for HL7-based health information systems implementation.
López, D M; Blobel, B
2010-01-01
Information systems integration is hard, especially when semantic and business process interoperability requirements must be met. To succeed, a unified methodology addressing the different aspects of systems architecture, such as the business, information, computational, engineering, and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standards-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server, and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise JavaBeans technology. Selecting the appropriate integration architecture is a fundamental issue for any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model, offered by HIS-DF and supported by HL7 v3 artifacts, is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented, and standards-based health information systems.
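To make the messaging substrate concrete, here is a deliberately naive sketch of splitting an HL7 v2 message into segments and fields. The sample message is fabricated, and real HL7 parsing must also handle escape sequences, field repetitions, and components, so a proper library should be used in practice.

```python
def parse_hl7_v2(message):
    """Split an HL7 v2 message into segments keyed by segment type.
    Segments are separated by carriage returns, fields by '|'."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

# Fabricated ORU^R01 fragment for illustration only.
MSG = ("MSH|^~\\&|LAB|HOSP|EHR|HOSP|202401011200||ORU^R01|123|P|2.5\r"
       "PID|1||PATID42||DOE^JANE")

parsed = parse_hl7_v2(MSG)
print(parsed["PID"][0][3])  # PID-3, the patient identifier field
```

A point-to-point interface hard-codes such parsing per connection; a message server or mediator centralizes it, which is precisely the architectural difference the paper analyzes.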
Cybersemiotics: a transdisciplinary framework for information studies.
Brier, S
1998-04-01
This paper summarizes recent attempts by the author to create a transdisciplinary, non-Cartesian, and non-reductionistic framework for information studies in natural, social, and technological systems. To confront, in a scientific way, the problems of modern information technology, where phenomenological man deals with socially constructed texts in algorithmically based digital bit-machines, we need a theoretical framework spanning from physics through biology and technological design to the phenomenological and social production of signification and meaning. I work with such pragmatic theories as second-order cybernetics (coupled with autopoiesis theory), Lakoff's biologically oriented cognitive semantics, Peirce's triadic semiotics, and Wittgenstein's pragmatic language game theory. A coherent synthesis of these theories is what the cybersemiotic framework attempts to accomplish.
SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling.
Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi
2010-01-01
Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster.
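A minimal continuous-space agent-based model in the spirit of (but far simpler than) SPARK can be sketched as follows. The random-walk behaviour, the 10x10 space, and the agent count are arbitrary choices for illustration; SPARK itself is a Java application with its own modeling language.

```python
import random

class Agent:
    """A point agent on a continuous 2-D space; emergent behaviour arises
    from many such agents acting by simple local rules."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, rng, width, height):
        # Random walk, clamped to the simulation space.
        self.x = min(max(self.x + rng.uniform(-1, 1), 0), width)
        self.y = min(max(self.y + rng.uniform(-1, 1), 0), height)

rng = random.Random(42)  # fixed seed so the run is reproducible
agents = [Agent(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(20)]
for _ in range(100):
    for a in agents:
        a.step(rng, 10, 10)
print(min(a.x for a in agents), max(a.x for a in agents))
```

Features SPARK adds on top of this core loop include variable agent size and shape, multiple concurrent spaces for multi-scale models, and built-in visualization.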
Construction of integrated case environments.
Losavio, Francisca; Matteo, Alfredo; Pérez, María
2003-01-01
The main goal of Computer-Aided Software Engineering (CASE) technology is to improve the entire software development process. The CASE approach is not merely a technology; it involves a fundamental change in the process of software development. Technically speaking, the tendency of the CASE approach is toward the integration of tools that assist in the application of specific methods. In this sense, the environment architecture, which includes the platform and the system's hardware and software, constitutes the base of the CASE environment. The problem of tool integration has been studied for two decades; current integration efforts emphasize the interoperability of tools, especially in distributed environments. In this work we use Brown's approach. The environment resulting from the application of this model is called a federative environment, reflecting the special attention this architecture pays to the connections among the components of the environment. This approach is now being used in component-based design. This paper describes a concrete experience in the civil engineering and architecture fields in constructing an integrated CASE environment. A generic architectural framework based on an intermediary architectural pattern is applied to integrate the different tools. The intermediary represents the control perspective of the PAC (Presentation-Abstraction-Control) style; it has been implemented as a Mediator pattern and has been used in the interactive systems domain. In addition, a process for constructing the integrated CASE environment is given.
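The intermediary idea, tools communicating only through a central control component, is the essence of the Mediator pattern named above and can be sketched as follows. The class names and the event are invented for the example.

```python
class Mediator:
    """Central control component routing events among tools, playing the
    role of the PAC control perspective described in the paper."""
    def __init__(self):
        self._tools = {}

    def register(self, name, tool):
        self._tools[name] = tool
        tool.mediator = self

    def notify(self, sender, event, payload):
        # Tools never talk to each other directly; all routing lives here.
        for name, tool in self._tools.items():
            if name != sender:
                tool.receive(event, payload)

class Tool:
    """A generic integrated tool (editor, checker, ...)."""
    def __init__(self):
        self.mediator = None
        self.log = []

    def receive(self, event, payload):
        self.log.append((event, payload))

m = Mediator()
editor, checker = Tool(), Tool()
m.register("editor", editor)
m.register("checker", checker)
m.notify("editor", "model_saved", "bridge.dxf")
print(checker.log, editor.log)
```

Adding a new tool then requires only registering it with the mediator, rather than wiring it to every existing tool, which is what makes the federative architecture extensible.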
Golan, Ofra; Hansen, Paul
2012-11-26
Deciding which health technologies to fund involves confronting some of the most difficult choices in medicine. As in other countries, the Israeli health system faces these difficult decisions each year. The Public National Advisory Committee, known as 'the Basket Committee', selects new technologies for the basic list of health care that all Israelis are entitled to access, known as the 'health basket'. We introduce a framework for health technology prioritization based explicitly on value for money, one that enables the main variables considered by decision-makers to be included. Although the framework is presented in terms of the Basket Committee selecting new technologies for Israel's health basket, we believe it would also work well for other countries. Our proposed prioritization framework involves comparing four main variables for each technology: 1. incremental benefits, including 'equity benefits', to Israel's population; 2. incremental total cost to Israel's health system; 3. quality of evidence; and 4. any additional 'X-factors' not elsewhere included, such as strategic or legal factors. Applying methodology from multi-criteria decision analysis, the multiple dimensions comprising the first variable are aggregated via a points system. The four variables are combined for each technology and compared across technologies in the 'Value for Money (VfM) Chart'. The VfM Chart can be used to identify technologies that are good value for money and, given a budget constraint, to select technologies that should be funded. This is demonstrated using 18 illustrative technologies. The VfM Chart is an intuitively appealing decision-support tool that helps decision-makers focus on the inherent tradeoffs involved in health technology prioritization. Such deliberations can be performed in a systematic and transparent fashion that can also be easily communicated to stakeholders, including the general public.
Possible future research includes pilot-testing the VfM Chart using real-world data. Ideally, this would involve working with the Basket Committee. Likewise, the framework could be tested and applied by health technology prioritization agencies in other countries.
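The points-system aggregation and budget-constrained selection behind a VfM-style comparison can be sketched as follows. The benefit dimensions, weights, costs, and the greedy ratio-based selection are illustrative assumptions rather than the Basket Committee's actual criteria or method.

```python
# Hypothetical weights for aggregating benefit dimensions into points.
WEIGHTS = {"health_gain": 0.5, "equity": 0.3, "evidence": 0.2}

def benefit_points(tech):
    """Aggregate a technology's benefit dimensions via a points system."""
    return sum(WEIGHTS[k] * tech[k] for k in WEIGHTS)

def select_within_budget(technologies, budget):
    """Greedy selection by value for money (benefit points per cost unit)."""
    ranked = sorted(technologies,
                    key=lambda t: benefit_points(t) / t["cost"], reverse=True)
    chosen, spent = [], 0.0
    for t in ranked:
        if spent + t["cost"] <= budget:
            chosen.append(t["name"])
            spent += t["cost"]
    return chosen

techs = [
    {"name": "A", "health_gain": 8, "equity": 6, "evidence": 7, "cost": 4.0},
    {"name": "B", "health_gain": 9, "equity": 2, "evidence": 5, "cost": 6.0},
    {"name": "C", "health_gain": 3, "equity": 9, "evidence": 8, "cost": 2.0},
]
print(select_within_budget(techs, budget=7.0))
```

In a VfM Chart the same information would be plotted (benefits against cost) so that decision-makers can see the tradeoffs rather than delegating them to an algorithm.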
NASA Astrophysics Data System (ADS)
Avolio, G.; Corso Radu, A.; Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.
2012-12-01
The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment is a very complex distributed computing system, composed of more than 20,000 applications running on more than 2,000 computers. The TDAQ Controls system has to guarantee the smooth, synchronous operation of all TDAQ components and provide the means to minimize system downtime caused by runtime failures. During data-taking runs, the streams of information messages sent or published by running applications are the main source of knowledge about the correctness of operations. The huge flow of operational monitoring data produced is constantly watched by experts in order to detect problems or misbehaviour. Given the scale of the system and the data rates to be analyzed, automating the system's functionality in the areas of operational monitoring, system verification, error detection, and recovery is a strong requirement. To accomplish this, the Controls system includes high-level components based on advanced software technologies, namely a rule-based expert system and complex event processing engines. The chosen techniques allow the knowledge of experts to be formalized, stored, and reused, and thus assist the shifters in the ATLAS control room during data-taking activities.
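A toy rule engine over an operational message stream conveys the flavour of the approach: expert knowledge is expressed as condition/action rules applied to incoming messages. The rules and message fields below are invented; the real TDAQ Controls system uses dedicated expert-system and complex-event-processing engines.

```python
# Condition/action rules, checked in priority order (hypothetical examples).
RULES = [
    (lambda m: m["severity"] == "ERROR" and "disk" in m["text"],
     "page the sysadmin"),
    (lambda m: m["severity"] == "WARNING",
     "log for the shifter"),
]

def evaluate(message):
    """Return the action of the first rule whose condition matches."""
    for condition, action in RULES:
        if condition(message):
            return action
    return "ignore"

stream = [
    {"severity": "INFO", "text": "run started"},
    {"severity": "ERROR", "text": "disk full on node tdaq-07"},
    {"severity": "WARNING", "text": "high event rate"},
]
actions = [evaluate(m) for m in stream]
print(actions)
```

A complex event processing engine extends this idea from single messages to patterns across many messages over time (e.g. "N warnings from the same node within a minute").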
Zhang, Bo; Yang, Xiang; Yang, Fei; Yang, Xin; Qin, Chenghu; Han, Dong; Ma, Xibo; Liu, Kai; Tian, Jie
2010-09-13
In molecular imaging (MI), and especially optical molecular imaging, bioluminescence tomography (BLT) has emerged as an effective modality for small animal imaging. Finite element methods (FEMs), and especially the adaptive finite element (AFE) framework, play an important role in BLT. The processing speed of the FEMs and the AFE framework still needs to be improved, even though multi-threaded and multi-CPU technology has already been applied. In this paper, we introduce, for the first time, a new kind of acceleration technology for the AFE framework for BLT, using the graphics processing unit (GPU). Beyond raw processing speed, GPU technology strikes a balance between cost and performance. CUBLAS and CULA are two important and powerful libraries for programming NVIDIA GPUs. With their help, it is easy to write code for NVIDIA GPUs without worrying about the details of the hardware environment of a specific GPU. Numerical experiments were designed to show the necessity, effect, and application of the proposed CUBLAS- and CULA-based GPU acceleration. The results show that the proposed method greatly improves the processing speed of the AFE framework while maintaining a balance between cost and performance.
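The repeated linear solves at the heart of an FEM/AFE pipeline are exactly the kernels that libraries like CUBLAS and CULA offload to the GPU. As a CPU-side stand-in, the following pure-Python conjugate gradient solves a tiny symmetric positive-definite system; a GPU version would replace the matrix-vector and dot products with library calls, and the example system is made up for illustration.

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive-definite A (lists of lists).
    A stand-in for the linear-algebra kernel a GPU library would accelerate."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                 # residual; x starts at zero so r = b
    p = r[:]
    rs = sum(v * v for v in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(v * v for v in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # small SPD system (toy example)
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
print([round(v, 4) for v in x])
```

In an adaptive FEM loop this solve is repeated on progressively refined meshes, which is why accelerating it dominates the overall speedup.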
A Component-Based FPGA Design Framework for Neuronal Ion Channel Dynamics Simulations
Mak, Terrence S. T.; Rachmuth, Guy; Lam, Kai-Pui; Poon, Chi-Sang
2008-01-01
Neuron-machine interfaces such as dynamic clamp and brain-implantable neuroprosthetic devices require real-time simulations of neuronal ion channel dynamics. The Field Programmable Gate Array (FPGA) has emerged as a high-speed digital platform ideal for such application-specific computations. We propose an efficient and flexible component-based FPGA design framework for neuronal ion channel dynamics simulations, which overcomes certain limitations of the recently proposed memory-based approach. A parallel processing strategy is used to minimize computational delay, and a hardware-efficient factoring approach for calculating the exponential and division functions in neuronal ion channel models is used to conserve resource consumption. The performance of the various FPGA design approaches is compared theoretically and experimentally using corresponding implementations of the AMPA and NMDA synaptic ion channel models. Our results suggest that the component-based design framework provides a more memory-economic solution as well as more efficient logic utilization for large word lengths, whereas the memory-based approach may be suitable for time-critical applications where a higher throughput rate is desired. PMID:17190033
A Metadata Management Framework for Collaborative Review of Science Data Products
NASA Astrophysics Data System (ADS)
Hart, A. F.; Cinquini, L.; Mattmann, C. A.; Thompson, D. R.; Wagstaff, K.; Zimdars, P. A.; Jones, D. L.; Lazio, J.; Preston, R. A.
2012-12-01
Data volumes generated by modern scientific instruments often preclude archiving the complete observational record. To compensate, science teams have developed a variety of "triage" techniques for identifying data of potential scientific interest and marking it for prioritized processing or permanent storage. This may involve multiple stages of filtering with both automated and manual components operating at different timescales. A promising approach exploits a fast, fully automated first stage followed by a more reliable offline manual review of candidate events. This hybrid approach permits a 24-hour rapid real-time response while also preserving the high accuracy of manual review. To support this type of second-level validation effort, we have developed a metadata-driven framework for the collaborative review of candidate data products. The framework consists of a metadata processing pipeline and a browser-based user interface that together provide a configurable mechanism for reviewing data products via the web, and capturing the full stack of associated metadata in a robust, searchable archive. Our system heavily leverages software from the Apache Object Oriented Data Technology (OODT) project, an open source data integration framework that facilitates the construction of scalable data systems and places a heavy emphasis on the utilization of metadata to coordinate processing activities. OODT provides a suite of core data management components for file management and metadata cataloging that form the foundation for this effort. The system has been deployed at JPL in support of the V-FASTR experiment [1], a software-based radio transient detection experiment that operates commensally at the Very Long Baseline Array (VLBA), and has a science team that is geographically distributed across several countries. Daily review of automatically flagged data is a shared responsibility for the team, and is essential to keep the project within its resource constraints. 
We describe the development of the platform using open source software, and discuss our experience deploying the system operationally. [1] R. B. Wayth, W. F. Brisken, A. T. Deller, W. A. Majid, D. R. Thompson, S. J. Tingay, and K. L. Wagstaff, "V-FASTR: The VLBA Fast Radio Transients Experiment," The Astrophysical Journal, vol. 735, no. 2, p. 97, 2011. Acknowledgement: This effort was supported by the Jet Propulsion Laboratory, managed by the California Institute of Technology under a contract with the National Aeronautics and Space Administration.
NASA Astrophysics Data System (ADS)
Myers, B.; Beard, T. D.; Weiskopf, S. R.; Jackson, S. T.; Tittensor, D.; Harfoot, M.; Senay, G. B.; Casey, K.; Lenton, T. M.; Leidner, A. K.; Ruane, A. C.; Ferrier, S.; Serbin, S.; Matsuda, H.; Shiklomanov, A. N.; Rosa, I.
2017-12-01
Biodiversity and ecosystems services underpin political targets for the conservation of biodiversity; however, previous incarnations of these biodiversity-related targets have not relied on integrated model based projections of possible outcomes based on climate and land use change. Although a few global biodiversity models are available, most biodiversity models lie along a continuum of geography and components of biodiversity. Model-based projections of the future of global biodiversity are critical to support policymakers in the development of informed global conservation targets, but the scientific community lacks a clear strategy for integrating diverse data streams in developing, and evaluating the performance of, such biodiversity models. Therefore, in this paper, we propose a framework for ongoing testing and refinement of model-based projections of biodiversity trends and change, by linking a broad variety of biodiversity models with data streams generated by advances in remote sensing, coupled with new and emerging in-situ observation technologies to inform development of essential biodiversity variables, future global biodiversity targets, and indicators. Our two main objectives are to (1) develop a framework for model testing and refining projections of a broad range of biodiversity models, focusing on global models, through the integration of diverse data streams and (2) identify the realistic outputs that can be developed and determine coupled approaches using remote sensing and new and emerging in-situ observations (e.g., metagenomics) to better inform the next generation of global biodiversity targets.
Analysis of key technologies for virtual instruments metrology
NASA Astrophysics Data System (ADS)
Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang
2008-12-01
Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs covers two aspects: measurement functions and software characteristics. The complexity of the software makes metrological testing of VIs difficult. This paper investigates and analyzes key approaches and technologies for the metrological evaluation of virtual instruments. The principal issue is the evaluation of measurement uncertainty: the nature and regularity of the uncertainty introduced by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics, supported by the powerful computing capability of the PC. Another concern is the evaluation of software qualities such as correctness, reliability, stability, security and real-time behavior. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software, while the security of a VI can be assessed by methods such as vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on numerical simulation, software testing and system benchmarking technologies, a framework for such an automatic tool is proposed in this paper. An investigation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed framework.
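The modeling-and-simulation route to software-induced uncertainty that the abstract describes is essentially Monte Carlo propagation: run the VI's processing algorithm many times on simulated noisy inputs and take the spread of the outputs as the standard uncertainty. A minimal sketch, with an assumed 100-sample mean filter standing in for the VI's algorithm and illustrative noise parameters:

```python
import random
import statistics

def vi_measurement(true_value, noise_sd, rng):
    """One simulated acquisition through the VI's processing chain;
    the 'algorithm under test' here is a simple 100-sample mean filter."""
    samples = [true_value + rng.gauss(0.0, noise_sd) for _ in range(100)]
    return sum(samples) / len(samples)

def evaluate_uncertainty(true_value=5.0, noise_sd=0.2, trials=2000, seed=1):
    """Monte Carlo evaluation of the measurement uncertainty contributed
    by the software/algorithm, in the spirit of GUM Supplement 1."""
    rng = random.Random(seed)
    results = [vi_measurement(true_value, noise_sd, rng)
               for _ in range(trials)]
    return statistics.fmean(results), statistics.stdev(results)

mean, u = evaluate_uncertainty()
# For a mean filter the expected standard uncertainty is
# noise_sd / sqrt(100) = 0.02, which the simulation should recover.
```

The same harness can wrap any VI algorithm (filters, FFT-based estimators, curve fits), which is what makes an automated tool built on this principle feasible.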
Improving online risk assessment with equipment prognostics and health monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coble, Jamie B.; Liu, Xiaotong; Briere, Chris
The current approach to evaluating the risk of nuclear power plant (NPP) operation relies on static probabilities of component failure, which are based on industry experience with the existing fleet of nominally similar light water reactors (LWRs). As the nuclear industry looks to advanced reactor designs that feature non-light water coolants (e.g., liquid metal, high temperature gas, molten salt), this operating history is not available. Many advanced reactor designs use advanced components, such as electromagnetic pumps, that have not been used in the US commercial nuclear fleet. Given the lack of rich operating experience, we cannot accurately estimate the evolving probability of failure for basic components to populate the fault trees and event trees that typically comprise probabilistic risk assessment (PRA) models. Online equipment prognostics and health management (PHM) technologies can bridge this gap to estimate the failure probabilities for components under operation. The enhanced risk monitor (ERM) incorporates equipment condition assessment into the existing PRA and risk monitor framework to provide accurate and timely estimates of operational risk.
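One standard way operational evidence can refine a sparse prior failure probability is a conjugate Beta-Binomial update. The sketch below illustrates that general mechanism only; it is not the ERM's actual algorithm, and the prior parameters are assumed illustrative values.

```python
def update_failure_probability(prior_alpha, prior_beta, demands, failures):
    """Conjugate Beta-Binomial update of a failure-on-demand probability
    as operating evidence accumulates: a sketch of how PHM data could
    feed a living PRA basic-event value."""
    alpha = prior_alpha + failures
    beta = prior_beta + (demands - failures)
    return alpha / (alpha + beta)          # posterior mean

# Weak generic prior (mean 5e-3, illustrative) for a novel component
# such as an electromagnetic pump with no fleet history, updated with
# 200 successful demands observed by online condition monitoring.
posterior = update_failure_probability(0.5, 99.5, demands=200, failures=0)
```

The posterior mean drops as failure-free demands accumulate, giving the fault trees a failure probability that tracks actual equipment condition instead of a static fleet average.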
Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data
NASA Astrophysics Data System (ADS)
Okladnikov, I.; Gordov, E. P.; Titov, A. G.
2011-12-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, present-day studies of climate and environmental change require special software support. A dedicated software framework has been created for the rapid development of information-computational systems, based on Web-GIS technologies, that provide such support. The software framework consists of three basic parts: a computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library, based on AJAX technology, for developing the typical components of a web-mapping application's graphical user interface (GUI). The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as the base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library for GUI development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on this software framework, an information-computational system for complex analysis of large georeferenced data archives was developed.
Structured environmental datasets available for processing now include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 and ERA-Interim Reanalyses, the MRI/JMA APHRODITE's Water Resources Project Reanalysis, meteorological observational data for the territory of the former USSR for the 20th century, and others. The current version of the system is already in use in scientific research; recently, it was successfully applied to the analysis of climate change in Siberia and its regional impacts. The software framework presented allows rapid development of Web-GIS systems for geophysical data analysis, thus providing specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. This work is partially supported by RFBR grants #10-07-00547, #11-05-01190, and SB RAS projects 4.31.1.5, 4.31.2.7, 4, 8, 9, 50 and 66.
Beyond computer literacy: supporting youth's positive development through technology.
Bers, Marina Umaschi
2010-01-01
In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for children to make a better world through the use of their computational skills, is just as important. The Positive Technological Development framework (PTD), a natural extension of the computer literacy and the technological fluency movements that have influenced the world of educational technology, adds psychosocial, civic, and ethical components to the cognitive ones. PTD examines the developmental tasks of a child growing up in our digital era and provides a model for developing and evaluating technology-rich youth programs. The explicit goal of PTD programs is to support children in the positive uses of technology to lead more fulfilling lives and make the world a better place. This article introduces the concept of PTD and presents examples of the Zora virtual world program for young people that the author developed following this framework.
Women and Computer Based Technologies: A Feminist Perspective.
ERIC Educational Resources Information Center
Morritt, Hope
The use of computer based technologies by professional women in education is examined through a feminist standpoint theory in this paper. The theory is grounded in eight claims which form the basis of the conceptual framework for the study. The experiences of nine women participants with computer based technologies were categorized using three…
A development framework for artificial intelligence based distributed operations support systems
NASA Technical Reports Server (NTRS)
Adler, Richard M.; Cottman, Bruce H.
1990-01-01
Advanced automation is required to reduce costly human operations support for complex space-based and ground control systems. Existing knowledge-based technologies have been used successfully to automate individual operations tasks. Considerably less progress has been made in integrating and coordinating multiple operations applications into unified intelligent support systems. To fill this gap, SOCIAL, a tool set for developing Distributed Artificial Intelligence (DAI) systems, is being constructed. SOCIAL consists of three primary language-based components defining: models of interprocess communication across heterogeneous platforms; models for interprocess coordination, concurrency control, and fault management; and models for accessing heterogeneous information resources. DAI application subsystems, either new or existing, will access these distributed services non-intrusively via high-level, message-based protocols. SOCIAL will reduce the complexity of distributed communications, control, and integration, enabling developers to concentrate on the design and functionality of the target DAI system itself.
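The non-intrusive, message-based integration the abstract describes can be pictured as agents exchanging high-level messages through a shared bus rather than calling each other directly. The sketch below is a generic illustration of that pattern, not SOCIAL's actual protocol; the agent name and message fields are assumptions.

```python
import queue

class MessageBus:
    """Minimal message-based integration sketch: agents register named
    mailboxes and exchange high-level messages without knowing each
    other's platform or implementation language."""
    def __init__(self):
        self.mailboxes = {}

    def register(self, agent_name):
        self.mailboxes[agent_name] = queue.Queue()

    def send(self, to_agent, performative, content):
        # Performative + content mirrors the high-level protocol idea:
        # the bus, not the sender, handles routing and buffering.
        self.mailboxes[to_agent].put({"performative": performative,
                                      "content": content})

    def receive(self, agent_name):
        return self.mailboxes[agent_name].get_nowait()

bus = MessageBus()
bus.register("fault_manager")                      # hypothetical agent name
bus.send("fault_manager", "inform", "telemetry anomaly on channel 7")
msg = bus.receive("fault_manager")
```

Because subsystems touch only `send`/`receive`, an existing application can be wired in without modifying its internals, which is the "non-intrusive" property the tool set targets.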
Ramsey, Alex; Lord, Sarah; Torrey, John; Marsch, Lisa; Lardiere, Michael
2016-01-01
This study aimed to identify barriers to use of technology for behavioral health care from the perspective of care decision makers at community behavioral health organizations. As part of a larger survey of technology readiness, 260 care decision makers completed an open-ended question about perceived barriers to use of technology. Using the Consolidated Framework for Implementation Research (CFIR), qualitative analyses yielded barrier themes related to characteristics of technology (e.g., cost and privacy), potential end users (e.g., technology literacy and attitudes about technology), organization structure and climate (e.g., budget and infrastructure), and factors external to organizations (e.g., broadband accessibility and reimbursement policies). Number of reported barriers was higher among respondents representing agencies with lower annual budgets and smaller client bases relative to higher budget, larger clientele organizations. Individual barriers were differentially associated with budget, size of client base, and geographic location. Results are discussed in light of implementation science frameworks and proactive strategies to address perceived obstacles to adoption and use of technology-based behavioral health tools.
A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON
King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix
2008-01-01
As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup, simulation, and interaction with virtual environments to analysis and visualization. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language, but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597
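The replacement main loop with replaceable spike-exchange and pluggable monitor components can be sketched abstractly as below. This is a structural illustration only, not the framework's API: the class names, the trivial integrator, and the all-to-all exchange are assumptions.

```python
class SpikeExchange:
    """Replaceable spike-exchange component (here trivially local;
    a parallel version would swap in MPI-based exchange)."""
    def exchange(self, local_spikes):
        return list(local_spikes)

class RateMonitor:
    """Pluggable monitor component running alongside the simulation."""
    def __init__(self):
        self.count = 0
    def on_step(self, spikes):
        self.count += len(spikes)

def run(steps, exchanger, monitors):
    """Sketch of the framework's own main loop: integrate one step,
    exchange spikes, then notify every plug-in component."""
    for t in range(steps):
        # Stand-in for the simulator's integration step: emit a spike
        # on even timesteps so the loop has something to exchange.
        local_spikes = [t] if t % 2 == 0 else []
        all_spikes = exchanger.exchange(local_spikes)
        for m in monitors:
            m.on_step(all_spikes)

monitor = RateMonitor()
run(10, SpikeExchange(), [monitor])
```

The design point is that neither the exchange strategy nor the monitors are hard-wired into the loop, so either can be replaced without touching the model description.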
Mattson, Marifran; Basu, Ambar
2010-07-01
That messages are essential, if not the most critical component of any communicative process, seems like an obvious claim. More so when the communication is about health--one of the most vital and elemental of human experiences (Babrow & Mattson, 2003). Any communication campaign that aims to change a target audience's health behaviors needs to centralize messages. Even though messaging strategies are an essential component of social marketing and are a widely used campaign model, health campaigns based on this framework have not always been able to effectively operationalize this key component, leading to cases where initiating and sustaining prescribed health behavior has been difficult (MacStravic, 2000). Based on an examination of the VERB campaign and an Australian breastfeeding promotion campaign, we propose a message development tool within the ambit of the social marketing framework that aims to extend the framework and ensure that the messaging component of the model is contextualized at the core of planning, implementation, and evaluation efforts.
Augmenting breath regulation using a mobile driven virtual reality therapy framework.
Abushakra, Ahmad; Faezipour, Miad
2014-05-01
This paper presents a conceptual framework for a virtual reality therapy to assist individuals, especially lung cancer patients and those with breathing disorders, in regulating their breath through real-time analysis of respiration movements using a smartphone. Virtual reality technology is an attractive means for medical simulation and treatment, particularly for patients with cancer. The theories, methodologies and approaches, and the real-world dynamic content for all components of this virtual reality therapy (VRT), delivered via a smartphone-based conceptual framework, are discussed. The architecture and technical aspects of the platform hosting the virtual environment are also presented.
Deterministic Design Optimization of Structures in OpenMDAO Framework
NASA Technical Reports Server (NTRS)
Coroneos, Rula M.; Pai, Shantaram S.
2012-01-01
Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open-source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process of analyzing and optimizing structural components using the framework's structural solvers and several gradient-based optimizers along with a multi-objective genetic algorithm. For comparison, the same structural components were analyzed and optimized using CometBoards, a code developed at NASA GRC. The reliability and efficiency of the OpenMDAO framework were evaluated against these results and are reported here.
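The gradient-based sizing idea behind such structural optimization can be shown on the smallest possible example: minimize the cross-sectional area of a bar (proportional to its mass) subject to a stress constraint. The sketch below uses a hand-rolled projected-gradient step for self-containment; OpenMDAO's drivers are far more general, and all numbers here are illustrative.

```python
def minimize_mass(force, stress_limit, area0, lr=0.01, steps=1000):
    """Projected-gradient sketch: minimize area A (mass ∝ A) subject to
    the stress constraint F / A <= stress_limit, i.e. A >= F / limit.
    d(mass)/dA is a positive constant, so the step simply shrinks A and
    the projection clamps it onto the feasible set."""
    area_min = force / stress_limit        # constraint boundary
    area = area0
    for _ in range(steps):
        area = max(area - lr * 1.0, area_min)
    return area

# Axial force 1000 N, allowable stress 200 N/mm^2 -> optimal A = 5 mm^2.
best = minimize_mass(force=1000.0, stress_limit=200.0, area0=10.0)
```

In a driver-based framework the same ingredients appear as a component computing mass and stress, a design variable A, and a gradient-based optimizer iterating exactly this shrink-and-project logic at scale.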
GCS component development cycle
NASA Astrophysics Data System (ADS)
Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti
2012-09-01
The GTC is an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007 and it has been in the operation phase since then. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA, responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any GCS subsystem, an initial description of its component interface is obtained, and from that information a component specification is written. To improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, the component is generated, compiled and deployed in a single step, ready to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve of new developers and the development error rate, allows systematic use of design patterns and software reuse, speeds up delivery of the software product, improves design consistency and design quality, and eliminates the future refactoring that hand-written skeletons would require.
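Specification-to-skeleton code generation of the kind described can be sketched in a few lines. This is a toy illustration, not the GCS toolchain: the dict-based specification format and the generated Python class stand in for the real interface descriptions and the C++/Java skeletons built over the Device Component Framework.

```python
COMPONENT_TEMPLATE = '''\
class {name}Component:
    """Auto-generated skeleton for the {name} device component."""
{methods}
'''

METHOD_TEMPLATE = '''\
    def {method}(self):
        raise NotImplementedError("fill in {method}")
'''

def generate_component(spec):
    """Turn a minimal component specification (a dict here, standing in
    for a GCS interface description) into a class skeleton string."""
    methods = "".join(METHOD_TEMPLATE.format(method=m)
                      for m in spec["operations"])
    return COMPONENT_TEMPLATE.format(name=spec["name"], methods=methods)

# "Generate, compile and deploy" in one step, mirroring the workflow:
source = generate_component({"name": "Mirror", "operations": ["init", "park"]})
namespace = {}
exec(source, namespace)
skeleton = namespace["MirrorComponent"]()
```

Because every skeleton comes from one template, design patterns are applied uniformly and the generated layer never needs hand-editing, which is where the consistency and no-refactoring benefits come from.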
Zotti, M J; Smagghe, G
2015-06-01
The time has passed for us to wonder whether RNA interference (RNAi) effectively controls pest insects or protects beneficial insects from diseases. The RNAi era in insect science began with studies of gene function and genetics that paved the way for the development of novel and highly specific approaches for the management of pest insects and, more recently, for the treatment and prevention of diseases in beneficial insects. The slight differences in components of RNAi pathways are sufficient to provide a high degree of variation in responsiveness among insects. The current framework to assess the negative effects of genetically modified (GM) plants on human health is adequate for RNAi-based GM plants. Because of the mode of action of RNAi and the lack of genomic data for most exposed non-target organisms, it becomes difficult to determine the environmental risks posed by RNAi-based technologies and the benefits provided for the protection of crops. A better understanding of the mechanisms that determine the variability in the sensitivity of insects would accelerate the worldwide release of commercial RNAi-based approaches.
TECHNOLOGY ASSESSMENT IN HOSPITALS: LESSONS LEARNED FROM AN EMPIRICAL EXPERIMENT.
Foglia, Emanuela; Lettieri, Emanuele; Ferrario, Lucrezia; Porazzi, Emanuele; Garagiola, Elisabetta; Pagani, Roberta; Bonfanti, Marzia; Lazzarotti, Valentina; Manzini, Raffaella; Masella, Cristina; Croce, Davide
2017-01-01
Hospital-Based Health Technology Assessment (HBHTA) practices, to inform decision making at the hospital level, have emerged as an urgent priority for policy makers, hospital managers, and professionals. The present study crystallizes the results achieved by testing an original framework for HBHTA developed within the Lombardy Region: the IMPlementation of A Quick hospital-based HTA (IMPAQHTA). The study tested (i) the efficiency of the HBHTA framework, (ii) its feasibility, and (iii) the utility and completeness of the tool, considering its dimensions and sub-dimensions. The IMPAQHTA framework deployed the regional HTA program, activated in 2008 in Lombardy, at the hospital level. The relevance and feasibility of the framework were tested over a 3-year period through a large-scale empirical experiment involving seventy-four healthcare professionals, organized in different HBHTA teams, who assessed thirty-two different technologies within twenty-two different hospitals. Semi-structured interviews and self-reported questionnaires were used to collect data on the relevance and feasibility of the IMPAQHTA framework. The proposed HBHTA framework proved suitable for application at the hospital level in the Italian context, permitting a quick assessment (11 working days) and providing hospital decision makers with relevant quantitative information. Performance in terms of feasibility, utility, completeness, and ease of use proved satisfactory. The IMPAQHTA was considered a complete and feasible HBHTA framework, replicable across different technologies and hospital settings, demonstrating that a hospital can develop a complete HTA if supported by adequate, well-defined tools and quantitative metrics.
Quiroga-Campano, Ana L; Panoskaltsis, Nicki; Mantalaris, Athanasios
2018-03-02
Demand for high-value biologics, a rapidly growing pipeline, and pressure from competition, time-to-market and regulators, necessitate novel biomanufacturing approaches, including Quality by Design (QbD) principles and Process Analytical Technologies (PAT), to facilitate accelerated, efficient and effective process development platforms that ensure consistent product quality and reduced lot-to-lot variability. Herein, QbD and PAT principles were incorporated within an innovative in vitro-in silico integrated framework for upstream process development (UPD). The central component of the UPD framework is a mathematical model that predicts dynamic nutrient uptake and average intracellular ATP content, based on biochemical reaction networks, to quantify and characterize energy metabolism and its adaptive response, metabolic shifts, to maintain ATP homeostasis. The accuracy and flexibility of the model depends on critical cell type/product/clone-specific parameters, which are experimentally estimated. The integrated in vitro-in silico platform and the model's predictive capacity reduced burden, time and expense of experimentation resulting in optimal medium design compared to commercially available culture media (80% amino acid reduction) and a fed-batch feeding strategy that increased productivity by 129%. The framework represents a flexible and efficient tool that transforms, improves and accelerates conventional process development in biomanufacturing with wide applications, including stem cell-based therapies. Copyright © 2018. Published by Elsevier Inc.
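The in-silico half of such a framework centers on a dynamic model of nutrient uptake and growth that can compare feeding strategies before any wet-lab run. The sketch below is a generic forward-Euler fed-batch model with Monod-type uptake; it is far simpler than the paper's biochemical-reaction-network/ATP model, and every parameter value is an assumed illustrative number, not an estimated cell-line parameter.

```python
def simulate_culture(feed_rate, hours=100.0, dt=0.1):
    """Forward-Euler sketch of a fed-batch culture: Monod-type substrate
    uptake driving biomass growth, with a constant feed term."""
    mu_max, ks, yield_xs = 0.04, 0.5, 0.6   # 1/h, g/L, gX/gS (illustrative)
    x, s = 0.1, 5.0                          # biomass, substrate (g/L)
    for _ in range(int(hours / dt)):
        mu = mu_max * s / (ks + s)           # Monod specific growth rate
        x += dt * mu * x                     # biomass growth
        s += dt * (feed_rate - mu * x / yield_xs)   # feed minus uptake
        s = max(s, 0.0)                      # substrate cannot go negative
    return x

# Compare a plain batch against a fed-batch strategy in silico.
batch = simulate_culture(feed_rate=0.0)
fed_batch = simulate_culture(feed_rate=0.05)
```

Even this toy model reproduces the qualitative conclusion that motivates model-based feeding design: supplying substrate during the run extends growth and raises final biomass relative to the batch case.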
A human-oriented framework for developing assistive service robots.
McGinn, Conor; Cullinan, Michael F; Culleton, Mark; Kelly, Kevin
2018-04-01
Multipurpose robots that can perform a range of useful tasks have the potential to increase the quality of life for many people living with disabilities. Owing to factors such as high system complexity, as-yet unresolved research questions and current technology limitations, there is a need for effective strategies to coordinate the development process. Integrating established methodologies based on human-centred design and universal design, a framework was formulated to coordinate the robot design process over successive iterations of prototype development. An account is given of how the framework was practically applied to the problem of developing a personal service robot. Application of the framework led to the formation of several design goals which addressed a wide range of identified user needs. The resultant prototype solution, which consisted of several component elements, succeeded in demonstrating the performance stipulated by all of the proposed metrics. Application of the framework resulted in the development of a complex prototype that addressed many aspects of the functional and usability requirements of a personal service robot. Following the process led to several important insights which directly benefit the development of subsequent prototypes. Implications for Rehabilitation This research shows how universal design might be used to formulate usability requirements for assistive service robots. A framework is presented that guides the process of designing service robots in a human-centred way. Through practical application of the framework, a prototype robot system that addressed a range of identified user needs was developed.
NASA Astrophysics Data System (ADS)
Mobasheri, A.; Vahidi, H.; Guan, Q.
2014-04-01
In developing countries, the number of experts and students in the geo-informatics domain is very limited compared to the number in sciences that could benefit from geo-informatics. In this research, we study the possibility of providing an online education system for teaching geo-informatics at the undergraduate level. The hypothesis is that in developing countries such as Iran, a web-based geo-education system can greatly improve the quantity and quality of students' knowledge at the undergraduate level, an important step toward the famous "Geo for all" motto. As a technology for conducting natural and social studies, geo-informatics offers new ways of viewing, representing and analysing information for transformative learning and teaching. We therefore design and present a conceptual framework for an education system and elaborate its components, as well as the free and open-source services and software packages that could be used in this framework for a specific case study: the Web GIS course. The goal of the proposed framework is to develop experimental GI-services in a service-oriented platform for education purposes. Finally, the paper ends with concluding remarks and some tips for future research directions.
Systems and technologies for high-speed inter-office/datacenter interface
NASA Astrophysics Data System (ADS)
Sone, Y.; Nishizawa, H.; Yamamoto, S.; Fukutoku, M.; Yoshimatsu, T.
2017-01-01
Emerging requirements for short-reach inter-office and inter-datacenter links for data center interconnects (DCI) and metro transport networks have led to a variety of optical interface technologies, which are bringing significant changes to systems and network architectures. In this paper, we present a system and ZR optical interface technologies for DCI and metro transport networks, then introduce the latest challenges facing the system framework. There are two trends in reach extension: one uses Ethernet and the other uses digital coherent technologies. The first approach achieves reach extension while using as many existing Ethernet components as possible; it offers low cost because it reuses the cost-effective components created for the large Ethernet market. The second approach adopts low-cost, low-power coherent DSPs that implement a minimal set of long-haul transmission functions. This paper introduces an architecture that integrates both trends and satisfies both datacom and telecom needs with a common control and management interface and automated configuration.
An active monitoring method for flood events
NASA Astrophysics Data System (ADS)
Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya
2018-07-01
Timely, active detection and monitoring of flood events are critical for quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the framework's concrete implementation. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 was conducted to test the proposed framework and model. The results show that 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done in advance to take full advantage of the active service framework and the active model.
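The publish-subscribe warning mechanism the abstract describes can be sketched as follows. This is a minimal illustration, not the Sensor Event Service API: the class name, station identifier, and threshold values are all hypothetical.

```python
# Minimal publish-subscribe sketch of an active warning component.
# All names (WaterLevelBus, thresholds, station IDs) are illustrative.

class WaterLevelBus:
    def __init__(self):
        self.subscribers = []  # list of (threshold_m, callback) pairs

    def subscribe(self, threshold_m, callback):
        self.subscribers.append((threshold_m, callback))

    def publish(self, station, level_m):
        # Notify every subscriber whose threshold is reached or exceeded.
        for threshold_m, callback in self.subscribers:
            if level_m >= threshold_m:
                callback(station, level_m)

alerts = []
bus = WaterLevelBus()
bus.subscribe(21.5, lambda s, lvl: alerts.append((s, lvl)))  # warning level
bus.publish("Liangzi-01", 20.9)   # below threshold: no alert
bus.publish("Liangzi-01", 21.8)   # exceeds threshold: alert fires
```

A real Sensor Event Service would carry richer observation metadata and standardized filter expressions, but the event-driven structure is the same.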
A science framework (SF) for agricultural sustainability.
Ahmed, Ferdous; Al-Amin, Abul Q; Masud, Muhammad M; Kari, Fatimah; Mohamad, Zeeda
2015-09-01
The Science Framework (SF) is receiving increasing acceptance worldwide as a means of addressing agricultural sustainability. Professional opinion, however, holds that the SF known as the Mega Science Framework (MSF) in transitional economies is failing to converge effectively toward agricultural sustainability in several ways. In particular, the MSF in transitional economies is largely incapable of identifying barriers in agricultural research, inadequate for framing policy gaps with the goal of strategizing the desired sustainability in agricultural technology and innovation, inconsistent in identifying inequities, and incomplete for rebuilding decisions. This study therefore critically evaluates the components of the MSF in transitional economies and appraises the significance, disputes and legitimacy issues involved in achieving successful sustainable development. A sound and effective MSF can be developed when there are inter-linkages among its principal components: (a) national priorities, (b) specific research on agricultural sustainability, (c) adequate agricultural research and innovation, and (d) alternative policy adjustments. This research, the first of its kind, outlines the policy direction needed for an effective science framework for agricultural sustainability.
Framework for teleoperated microassembly systems
NASA Astrophysics Data System (ADS)
Reinhart, Gunther; Anton, Oliver; Ehrenstrasser, Michael; Patron, Christian; Petzold, Bernd
2002-02-01
Manual assembly of minute parts is currently done using simple devices such as tweezers or magnifying glasses. The operator therefore requires a great deal of concentration for successful assembly. Teleoperated micro-assembly systems are a promising method for overcoming the scaling barrier. However, most of today's telepresence systems are based on proprietary and one-of-a-kind solutions. Frameworks which supply the basic functions of a telepresence system, e.g. to establish flexible communication links that depend on bandwidth requirements or to synchronize distributed components, are not currently available. Large amounts of time and money have to be invested in order to create task-specific teleoperated micro-assembly systems from scratch. For this reason, an object-oriented framework for telepresence systems that is based on CORBA as a common middleware was developed at the Institute for Machine Tools and Industrial Management (iwb). The framework is based on a distributed architectural concept and is realized in C++. External hardware components such as haptic, video or sensor devices are coupled to the system by means of defined software interfaces. In this case, the special requirements of teleoperation systems have to be considered, e.g. dynamic parameter settings for sensors during operation. Consequently, an architectural concept based on logical sensors has been developed to achieve maximum flexibility and to enable a task-oriented integration of hardware components.
Progress toward a Semantic eScience Framework; building on advanced cyberinfrastructure
NASA Astrophysics Data System (ADS)
McGuinness, D. L.; Fox, P. A.; West, P.; Rozell, E.; Zednik, S.; Chang, C.
2010-12-01
The configurable and extensible semantic eScience framework (SESF) has begun development and implementation of several semantic application components. Extensions and improvements to several ontologies have been made based on distinct interdisciplinary use cases ranging from solar physics to biological and chemical oceanography. Importantly, these semantic representations mediate access to a diverse set of existing and emerging cyberinfrastructure. Among the advances is the population of triple stores with web-accessible query services. A triple store is akin to a relational data store whose basic stored unit is a subject-predicate-object tuple; access via queries is provided by SPARQL, a W3C Recommendation. Upon this middle tier of semantic cyberinfrastructure, we have developed several forms of semantic faceted search, including provenance awareness. We report on the rapid advances in semantic technologies and tools and on how we are sustaining the software path for the required technical advances, as well as the ontology improvements and increased functionality of the semantic applications, including how they are integrated into web-based portals (e.g. Drupal) and web services. Lastly, we indicate future work directions and opportunities for collaboration.
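The subject-predicate-object model and SPARQL-style pattern matching the abstract mentions can be illustrated with a toy in-memory triple store. The data and predicate names below are invented; a real deployment would use a SPARQL endpoint, not this code.

```python
# Toy in-memory triple store: subject-predicate-object tuples plus pattern
# matching, where None plays the role of a SPARQL variable.

triples = {
    ("dataset:A", "dc:subject", "oceanography"),
    ("dataset:A", "dc:creator", "person:1"),
    ("dataset:B", "dc:subject", "solar-physics"),
}

def match(s=None, p=None, o=None):
    """Return all triples matching the pattern; None matches anything."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Rough analogue of: SELECT ?s WHERE { ?s dc:subject "oceanography" }
subjects = [s for s, _, _ in match(p="dc:subject", o="oceanography")]
```

Faceted search, as described in the abstract, amounts to issuing many such pattern queries and grouping the results by predicate value.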
NASA Astrophysics Data System (ADS)
Shyu, Mei-Ling; Huang, Zifang; Luo, Hongli
In recent years, pervasive computing infrastructures have greatly improved the interaction between humans and systems. As we place more reliance on these computing infrastructures, we also face threats of network intrusion and other new forms of undesirable IT-based activity. Network security has therefore become an extremely important issue, closely connected with homeland security, business transactions, and people's daily lives. Accurate and efficient intrusion detection technologies are required to safeguard network systems and the critical information transmitted through them. In this chapter, a novel network intrusion detection framework for mining and detecting sequential intrusion patterns is proposed. The proposed framework consists of a Collateral Representative Subspace Projection Modeling (C-RSPM) component for supervised classification and an inter-transactional association rule mining method based on Layer Divided Modeling (LDM) for temporal pattern analysis. Experiments on the KDD99 data set and on a traffic data set generated by a private LAN testbed show promising results, with high detection rates, low processing times, and low false alarm rates in mining and detecting sequential intrusion patterns.
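As a hedged stand-in for the supervised classification step, the sketch below uses a nearest-centroid classifier over toy connection features. C-RSPM itself projects data onto representative subspaces per class, which this does not implement; only the class-conditional decision structure is mirrored, and all features and labels are invented.

```python
# Nearest-centroid classification over toy network-connection features.
# Illustrative only: C-RSPM uses per-class subspace projections instead.
import math

train = {
    "normal": [(0.10, 0.20), (0.20, 0.10), (0.15, 0.15)],
    "probe":  [(0.90, 0.80), (0.80, 0.90), (0.85, 0.85)],
}

# Per-class centroid of the training samples.
centroids = {
    label: tuple(sum(col) / len(col) for col in zip(*rows))
    for label, rows in train.items()
}

def classify(x):
    """Assign x to the class with the nearest centroid."""
    return min(centroids, key=lambda lab: math.dist(x, centroids[lab]))

label = classify((0.88, 0.82))
```

The temporal side of the framework (LDM-based association rule mining) would then look for sequential patterns among the per-connection labels this step produces.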
Koldijk, Saskia; Kraaij, Wessel; Neerincx, Mark A
2016-07-05
Stress in office environments is a big concern, often leading to burn-out. New technologies are emerging, such as easily available sensors, contextual reasoning, and electronic coaching (e-coaching) apps. In the Smart Reasoning for Well-being at Home and at Work (SWELL) project, we explore the potential of using such new pervasive technologies to provide support for the self-management of well-being, with a focus on individuals' stress-coping. Ideally, these new pervasive systems should be grounded in existing work stress and intervention theory. However, there is a large diversity of theories and they hardly provide explicit directions for technology design. The aim of this paper is to present a comprehensive and concise framework that can be used to design pervasive technologies that support knowledge workers to decrease stress. Based on a literature study we identify concepts relevant to well-being at work and select different work stress models to find causes of work stress that can be addressed. From a technical perspective, we then describe how sensors can be used to infer stress and the context in which it appears, and use intervention theory to further specify interventions that can be provided by means of pervasive technology. The resulting general framework relates several relevant theories: we relate "engagement and burn-out" to "stress", and describe how relevant aspects can be quantified by means of sensors. We also outline underlying causes of work stress and how these can be addressed with interventions, in particular utilizing new technologies integrating behavioral change theory. Based upon this framework we were able to derive requirements for our case study, the pervasive SWELL system, and we implemented two prototypes. Small-scale user studies proved the value of the derived technology-supported interventions. The presented framework can be used to systematically develop theory-based technology-supported interventions to address work stress. 
In the area of pervasive systems for well-being, we identified the following six key research challenges and opportunities: (1) performing multi-disciplinary research, (2) interpreting personal sensor data, (3) relating measurable aspects to burn-out, (4) combining strengths of human and technology, (5) privacy, and (6) ethics.
Critical social theory as a model for the informatics curriculum for nursing.
Wainwright, P; Jones, P G
2000-01-01
It is widely acknowledged that the education and training of nurses in information management and technology is problematic. Drawing from recent research this paper presents a theoretical framework within which the nature of the problems faced by nurses in the use of information may be analyzed. This framework, based on the critical social theory of Habermas, also provides a model for the informatics curriculum. The advantages of problem based learning and multi-media web-based technologies for the delivery of learning materials within this area are also discussed.
2011-01-01
Background Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment and the challenge of user-friendly representation of clinical logic. Results We present our implementation of a workflow engine technology that addresses the two above-described challenges in delivering clinical decision support. Our system is based on a cross-industry standard of XML (extensible markup language) process definition language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms due to lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture.
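The idea of replaying flowchart-style decision logic on retrospective data can be sketched as below. The node names, the A1c rule, and the record format are all hypothetical illustrations, not content of the XPDL suite the paper describes.

```python
# Minimal flowchart interpreter: nodes are either conditions or actions.
# Replaying stored records through the same graph is the "retrospective
# testing" mode; wiring it to live events would be the prospective mode.

flowchart = {
    "start":     {"next": "check_a1c"},
    "check_a1c": {"cond": lambda r: r["a1c"] >= 6.5,
                  "yes": "alert", "no": "done"},
    "alert":     {"action": "recommend follow-up", "next": "done"},
}

def run(record):
    node, actions = "start", []
    while node != "done":
        spec = flowchart[node]
        if "cond" in spec:
            node = spec["yes"] if spec["cond"](record) else spec["no"]
        else:
            if "action" in spec:
                actions.append(spec["action"])
            node = spec["next"]
    return actions

# Retrospective run over two stored encounters:
out = [run(r) for r in [{"a1c": 7.1}, {"a1c": 5.6}]]
```

An XPDL engine serializes the same graph as XML and adds transitions, participants, and timers, but the execution model is this simple node-walk at its core.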
Conclusions We describe an implementation of a free workflow technology software suite (available at http://code.google.com/p/healthflow) and its application in the domain of clinical decision support. Our implementation seamlessly supports clinical logic testing on retrospective data and offers a user-friendly knowledge representation paradigm. With the presented software implementation, we demonstrate that workflow engine technology can provide a decision support platform which evaluates well against an established clinical decision support architecture evaluation framework. Due to cross-industry usage of workflow engine technology, we can expect significant future functionality enhancements that will further improve the technology's capacity to serve as a clinical decision support platform. PMID:21477364
Harvest: a web-based biomedical data discovery and reporting application development platform.
Italia, Michael J; Pennington, Jeffrey W; Ruth, Byron; Wrazien, Stacey; Loutrel, Jennifer G; Crenshaw, E Bryan; Miller, Jeffrey; White, Peter S
2013-01-01
Biomedical researchers share a common challenge of making complex data understandable and accessible. This need is increasingly acute as investigators seek opportunities for discovery amidst an exponential growth in the volume and complexity of laboratory and clinical data. To address this need, we developed Harvest, an open source framework that provides a set of modular components to aid the rapid development and deployment of custom data discovery software applications. Harvest incorporates visual representations of multidimensional data types in an intuitive, web-based interface that promotes a real-time, iterative approach to exploring complex clinical and experimental data. The Harvest architecture capitalizes on standards-based, open source technologies to address multiple functional needs critical to a research and development environment, including domain-specific data modeling, abstraction of complex data models, and a customizable web client.
Creating a virtual community of learning predicated on medical student learning styles.
McGowan, Julie; Abrams, Matthew; Frank, Mark; Bangert, Michael
2003-01-01
To create a virtual community of learning within the Indiana University School of Medicine, learning tools were developed within ANGEL to meet the learning needs and habits of the medical students. Student feedback determined that integrating digital audio recordings of class lectures into the course management content, with several possible output formats, was paramount. The other components included electronic enhancement of old exams and case-based tutorials provided within the ANGEL framework. Students are using the curriculum management system more. Faculty feel more secure about their intellectual property because of the authentication and security offered through the ANGEL system. The technology applications were comparatively easy to create and manage. The return on investment, particularly for the digital audio recording component, has been substantial. By considering student learning styles, extant curriculum management systems can be enhanced to facilitate student learning within an electronic environment.
A versatile MOF-based trap for heavy metal ion capture and dispersion.
Peng, Yaguang; Huang, Hongliang; Zhang, Yuxi; Kang, Chufan; Chen, Shuangming; Song, Li; Liu, Dahuan; Zhong, Chongli
2018-01-15
Current technologies for removing heavy metal ions are typically metal ion specific. Herein we report the development of a broad-spectrum heavy metal ion trap by incorporation of ethylenediaminetetraacetic acid into a robust metal-organic framework. The capture experiments for a total of 22 heavy metal ions, covering hard, soft, and borderline Lewis metal ions, show that the trap is very effective, with removal efficiencies of >99% for single-component adsorption, multi-component adsorption, or in breakthrough processes. The material can also serve as a host for metal ion loading with arbitrary selections of metal ion amounts/types with a controllable uptake ratio to prepare well-dispersed single or multiple metal catalysts. This is supported by the excellent performance of the prepared Pd2+-loaded composite toward the Suzuki coupling reaction. This work proposes a versatile heavy metal ion trap that may find applications in the fields of separation and catalysis.
Cutin from agro-waste as a raw material for the production of bioplastics.
Heredia-Guerrero, José A; Heredia, Antonio; Domínguez, Eva; Cingolani, Roberto; Bayer, Ilker S; Athanassiou, Athanassia; Benítez, José J
2017-11-09
Cutin is the main component of plant cuticles, constituting the framework that supports the rest of the cuticle components. This biopolymer is composed of esterified bi- and trifunctional fatty acids. Despite its ubiquity in terrestrial plants, it has been underutilized as a raw material due to its insolubility and lack of a melting point. However, in recent years, a few technologies have been developed to obtain cutin monomers from several agro-wastes at an industrial scale. This review is focused on the description of cutin properties, biodegradability, chemical composition, processability, abundance, and the state of the art of the fabrication of cutin-based materials in order to evaluate whether this biopolymer can be considered a source for the production of renewable materials. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved.
Youth, Social Networking, and Resistance: A Case Study on a Multidimensional Approach to Resistance
ERIC Educational Resources Information Center
Scozzaro, David
2011-01-01
This exploratory case study focused on youth and resistance that was aided by the use of technology. The combination of resistance and technology expanded a multidimensional framework and leads to new insight into transformative resistance. This study examined the framework of transformative resistance based on Solorzano and Delgado Bernal's…
Keeping Teachers in the Center: A Framework of Data-Driven Decision-Making
ERIC Educational Resources Information Center
Light, Daniel; Wexler, Dara H.; Heinze, Juliette
2004-01-01
The Education Development Center's Center for Children and Technology (CCT) conducted a three year study of a large-scale data reporting system, developed by the Grow Network for New York City's Department of Education. This paper presents a framework based on two years of research exploring the intersection of decision-support technologies,…
ERIC Educational Resources Information Center
Karadsheh, Louay A.
2010-01-01
This research focused on the challenges experienced when executing risk management activities for information technology projects. The lack of adequate knowledge management support of risk management activities has caused many project failures in the past. The research objective was to propose a conceptual framework of the Knowledge-Based Risk…
Integration of hybrid wireless networks in cloud services oriented enterprise information systems
NASA Astrophysics Data System (ADS)
Li, Shancang; Xu, Lida; Wang, Xinheng; Wang, Jue
2012-05-01
This article presents a hybrid wireless network integration scheme for cloud services-based enterprise information systems (EISs). With emerging hybrid wireless network and cloud computing technologies, it is necessary to develop a scheme that can seamlessly integrate these new technologies into existing EISs. By combining hybrid wireless networks and cloud computing in EISs, a new framework is proposed that includes a frontend layer, a middle layer and backend layers connected to IP EISs. Based on a collaborative architecture, a cloud services management framework and process diagram are presented. As a key feature, the proposed approach integrates access control functionalities within the hybrid framework, providing users with filtered views on available cloud services based on cloud service access requirements and user security credentials. In future work, we will implement the proposed framework over the SwanMesh platform by integrating the UPnP standard into an enterprise information system.
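The credential-based filtering of cloud service views described above can be sketched as follows. The service names and the numeric clearance levels are hypothetical; a production scheme would evaluate richer access requirements than a single integer.

```python
# Sketch of filtered views on available cloud services: each user sees
# only the services their security credential level permits.

services = [
    {"name": "inventory-query",  "min_level": 1},
    {"name": "order-placement",  "min_level": 2},
    {"name": "financial-report", "min_level": 3},
]

def visible_services(user_level):
    """Return the filtered view of services a user may access."""
    return [s["name"] for s in services if user_level >= s["min_level"]]

view = visible_services(2)
```

The point of putting this filter inside the integration framework, rather than in each client, is that every frontend (wireless or wired) then receives a consistently filtered catalogue.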
Energy Technology Allocation for Distributed Energy Resources: A Technology-Policy Framework
NASA Astrophysics Data System (ADS)
Mallikarjun, Sreekanth
Distributed energy resources (DER) are emerging rapidly. New engineering technologies, materials, and designs improve the performance and extend the range of locations for DER. In contrast, constructing new or modernizing existing high voltage transmission lines for centralized generation is expensive and challenging. In addition, customer demand for reliability has increased, and concerns about climate change have created a pull for swift renewable energy penetration. In this context, DER policy makers, developers, and users are interested in determining which energy technologies to use to accommodate different end-use energy demands. We present a two-stage multi-objective strategic technology-policy framework for determining the optimal energy technology allocation for DER. The framework simultaneously considers economic, technical, and environmental objectives. The first stage utilizes a Data Envelopment Analysis model for each end-use to evaluate the performance of each energy technology based on the three objectives. The second stage incorporates factor efficiencies determined in the first stage, capacity limitations, dispatchability, and renewable penetration for each technology, and demand for each end-use into a bottleneck multi-criteria decision model which provides the Pareto-optimal energy resource allocation. We conduct several case studies to understand the roles of various distributed energy technologies in different scenarios, and we draw policy implications from the model results of the case studies.
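Pareto optimality, which the second-stage model targets, can be illustrated with a simple non-dominated filter. The technologies and their two objective scores (cost, emissions, lower is better) are made up; the paper's actual model uses DEA-derived efficiencies and a bottleneck formulation, not this filter.

```python
# Illustrative Pareto-front filter over candidate technologies.
# Scores are (cost, emissions); lower is better on both objectives.

techs = {
    "solar_pv":     (0.7, 0.1),
    "microturbine": (0.5, 0.6),
    "diesel_gen":   (0.4, 0.9),
    "fuel_cell":    (0.8, 0.2),
}

def dominated(a, b):
    """True if b is at least as good as a everywhere and strictly better somewhere."""
    return (all(y <= x for x, y in zip(a, b))
            and any(y < x for x, y in zip(a, b)))

pareto = sorted(t for t, score in techs.items()
                if not any(dominated(score, other)
                           for o, other in techs.items() if o != t))
```

Here the fuel cell is dominated by solar PV (worse on both objectives), so only the remaining three technologies survive as Pareto-optimal candidates for allocation.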
Davila, Juan Carlos; Cretu, Ana-Maria; Zaremba, Marek
2017-06-07
The design of multiple human activity recognition applications in areas such as healthcare, sports and safety relies on wearable sensor technologies. However, when making decisions based on the data acquired by such sensors in practical situations, several factors related to sensor data alignment, data losses, and noise, among other experimental constraints, deteriorate data quality and model accuracy. To tackle these issues, this paper presents a data-driven iterative learning framework to classify human locomotion activities such as walk, stand, lie, and sit, extracted from the Opportunity dataset. Data acquired by twelve 3-axial acceleration sensors and seven inertial measurement units are initially de-noised using a two-stage consecutive filtering approach combining a band-pass Finite Impulse Response (FIR) and a wavelet filter. A series of statistical parameters are extracted from the kinematical features, including the principal components and singular value decomposition of roll, pitch, yaw and the norm of the axial components. The novel iterative learning procedure is then applied in order to minimize the number of samples required to classify human locomotion activities. Only those samples that are most distant from the centroids of data clusters, according to a measure presented in the paper, are selected as candidates for the training dataset. The newly built dataset is then used to train an SVM multi-class classifier, which yields the lowest prediction error. The proposed learning framework ensures a high level of robustness to variations in the quality of input data, while using far fewer training samples and therefore a much shorter training time, which is an important consideration given the large size of the dataset.
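The centroid-distance sample selection idea can be sketched as below: from each activity cluster, keep only the samples farthest from the centroid as training candidates. The 2-D points, cluster labels, and Euclidean distance are illustrative stand-ins for the paper's actual features and measure.

```python
# Select, per cluster, the k samples most distant from the cluster centroid
# as candidates for the training set. Data and distance are illustrative.
import math

clusters = {
    "walk": [(1.0, 1.1), (0.9, 1.0), (2.0, 2.2), (1.05, 0.95)],
    "sit":  [(5.0, 5.1), (4.9, 5.0), (3.8, 4.0), (5.05, 5.05)],
}

def select_far_samples(points, k=1):
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    # Sort by distance to centroid, farthest first, and keep the top k.
    return sorted(points, key=lambda p: math.dist(p, (cx, cy)), reverse=True)[:k]

candidates = {label: select_far_samples(pts) for label, pts in clusters.items()}
```

The selected outlying samples then feed the SVM training step; the intuition is that points near the centroid add little new information to the decision boundary.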
ERIC Educational Resources Information Center
Craig, Patricia J.; Sable, Janet R.
2011-01-01
The recreation internship is one of the most critical components of professional preparation education, yet educators have done little to explore the experience from a constructivist-developmental growth perspective. This article presents a practice-based learning framework that shows promise for fostering moral development among recreation…
A Semantic Web-based System for Mining Genetic Mutations in Cancer Clinical Trials.
Priya, Sambhawa; Jiang, Guoqian; Dasari, Surendra; Zimmermann, Michael T; Wang, Chen; Heflin, Jeff; Chute, Christopher G
2015-01-01
Textual eligibility criteria in clinical trial protocols contain important information about potential clinically relevant pharmacogenomic events. Manual curation for harvesting this evidence is intractable as it is error prone and time consuming. In this paper, we develop and evaluate a Semantic Web-based system that captures and manages mutation evidences and related contextual information from cancer clinical trials. The system has 2 main components: an NLP-based annotator and a Semantic Web ontology-based annotation manager. We evaluated the performance of the annotator in terms of precision and recall. We demonstrated the usefulness of the system by conducting case studies in retrieving relevant clinical trials using a collection of mutations identified from TCGA Leukemia patients and Atlas of Genetics and Cytogenetics in Oncology and Haematology. In conclusion, our system using Semantic Web technologies provides an effective framework for extraction, annotation, standardization and management of genetic mutations in cancer clinical trials.
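Precision and recall, the metrics used to evaluate the annotator, are computed from true-positive, false-positive, and false-negative counts. The counts in the example are invented for illustration, not results from the paper.

```python
# Precision = TP / (TP + FP); Recall = TP / (TP + FN).

def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# e.g. 45 correct mutation annotations, 5 spurious, 10 missed:
p, r = precision_recall(tp=45, fp=5, fn=10)
```

High precision with lower recall would mean the annotator rarely fabricates mutation evidence but misses some mentions; the trade-off matters when the output feeds downstream trial retrieval.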
The Indispensable Teachers' Guide to Computer Skills. Second Edition.
ERIC Educational Resources Information Center
Johnson, Doug
This book provides a framework of technology skills that can be used for staff development. Part One presents critical components of effective staff development. Part Two describes the basic CODE 77 skills, including basic computer operation, file management, time management, word processing, network and Internet use, graphics and digital images,…
Makification: Towards a Framework for Leveraging the Maker Movement in Formal Education
ERIC Educational Resources Information Center
Cohen, Jonathan; Jones, W. Monty; Smith, Shaunna; Calandra, Brendan
2017-01-01
Maker culture is part of a burgeoning movement in which individuals leverage modern digital technologies to produce and share physical artifacts with a broader community. Certain components of the maker movement, if properly leveraged, hold promise for transforming formal education in a variety of contexts. The authors here work towards a…
A Framework for Collaborative and Convenient Learning on Cloud Computing Platforms
ERIC Educational Resources Information Center
Sharma, Deepika; Kumar, Vikas
2017-01-01
The depth of learning resides in collaborative work with more engagement and fun. Technology can enhance collaboration with a higher level of convenience and cloud computing can facilitate this in a cost effective and scalable manner. However, to deploy a successful online learning environment, elementary components of learning pedagogy must be…
Five-Axis Ultrasonic Additive Manufacturing for Nuclear Component Manufacture
NASA Astrophysics Data System (ADS)
Hehr, Adam; Wenning, Justin; Terrani, Kurt; Babu, Sudarsanam Suresh; Norfolk, Mark
2017-03-01
Ultrasonic additive manufacturing (UAM) is a three-dimensional metal printing technology which uses high-frequency vibrations to scrub and weld together both similar and dissimilar metal foils. There is no melting in the process and no special atmosphere requirements are needed. Consequently, dissimilar metals can be joined with little to no intermetallic compound formation, and large components can be manufactured. These attributes have the potential to transform manufacturing of nuclear reactor core components such as control elements for the High Flux Isotope Reactor at Oak Ridge National Laboratory. These components are hybrid structures consisting of an outer cladding layer in contact with the coolant with neutron-absorbing materials inside, such as neutron poisons for reactor control purposes. UAM systems are built into a computer numerical control (CNC) framework to utilize intermittent subtractive processes. These subtractive processes are used to introduce internal features as the component is being built and for net shaping. The CNC framework is also used for controlling the motion of the welding operation. It is demonstrated here that curved components with embedded features can be produced using a five-axis code for the welder for the first time.
Five-axis ultrasonic additive manufacturing for nuclear component manufacture
Hehr, Adam; Wenning, Justin; Terrani, Kurt A.; ...
2016-01-01
Ultrasonic additive manufacturing (UAM) is a three-dimensional metal printing technology which uses high-frequency vibrations to scrub and weld together both similar and dissimilar metal foils. There is no melting in the process and no special atmosphere requirements are needed. Consequently, dissimilar metals can be joined with little to no intermetallic compound formation, and large components can be manufactured. These attributes have the potential to transform manufacturing of nuclear reactor core components such as control elements for the High Flux Isotope Reactor at Oak Ridge National Laboratory. These components are hybrid structures consisting of an outer cladding layer in contact with the coolant with neutron-absorbing materials inside, such as neutron poisons for reactor control purposes. UAM systems are built into a computer numerical control (CNC) framework to utilize intermittent subtractive processes. These subtractive processes are used to introduce internal features as the component is being built and for net shaping. The CNC framework is also used for controlling the motion of the welding operation. Lastly, it is demonstrated here that curved components with embedded features can be produced using a five-axis code for the welder for the first time.
Tisettanta case study: the interoperation of furniture production companies
NASA Astrophysics Data System (ADS)
Amarilli, Fabrizio; Spreafico, Alberto
This chapter presents the Tisettanta case study, focusing on the definition of the possible innovations that ICT technologies can bring to the Italian wood-furniture industry. This sector is characterized by industrial clusters composed mainly of a few large companies with international brand reputations and a large base of SMEs that manufacture finished products or are specialized in the production of single components/processes (such as the Brianza cluster, where Tisettanta operates). In this particular business ecosystem, ICT technologies can bring relevant support and improvements to the supply chain process, where collaborations between enterprises are put into action through the exchange of business documents such as orders, order confirmation, bills of lading, invoices, etc. The analysis methodology adopted in the Tisettanta case study refers to the TEKNE Methodology of Change (see Chapter 2), which defines a framework for supporting firms in the adoption of the Internetworked Enterprise organizational paradigm.
On design of sensor nodes in the rice planthopper monitoring system based on the internet of things
NASA Astrophysics Data System (ADS)
Wang, Ke Qiang; Cai, Ken
2011-02-01
Accurate records and prediction of rice planthopper outbreaks and of farmland environmental conditions are effective measures for controlling pest damage. At the same time, a new round of technological revolution, from the Internet to the Internet of Things, is taking place in the information field. Applying the Internet of Things to online monitoring of rice planthoppers and their environment is an effective way to solve the problems of present wired sensor monitoring technology. This paper describes the general framework of wireless sensor nodes in the Internet of Things and then proposes software and hardware design schemes for the nodes, combining the needs of rice planthopper and environmental monitoring. In these schemes, the design of each module and the selection of key components are both aimed at the characteristics of the Internet of Things, giving the work strong practical value.
Cloud-based distributed control of unmanned systems
NASA Astrophysics Data System (ADS)
Nguyen, Kim B.; Powell, Darren N.; Yetman, Charles; August, Michael; Alderson, Susan L.; Raney, Christopher J.
2015-05-01
Enabling warfighters to efficiently and safely execute dangerous missions, unmanned systems have become an increasingly valuable component of modern warfare. The evolving use of unmanned systems leads to vast amounts of data collected from sensors placed on the remote vehicles. As a result, many command and control (C2) systems have been developed to provide the necessary tools to perform one of the following functions: controlling the unmanned vehicle or analyzing and processing the sensory data from unmanned vehicles. These C2 systems are often disparate from one another, limiting the ability to optimally distribute data among different users. The Space and Naval Warfare Systems Center Pacific (SSC Pacific) seeks to address this technology gap through the UxV to the Cloud via Widgets project. The overarching intent of this three-year effort is to provide three major capabilities: 1) unmanned vehicle control using an open service oriented architecture; 2) data distribution utilizing cloud technologies; 3) a collection of web-based tools enabling analysts to better view and process data. This paper focuses on how the UxV to the Cloud via Widgets system is designed and implemented by leveraging the following technologies: Data Distribution Service (DDS), Accumulo, Hadoop, and Ozone Widget Framework (OWF).
Technology dependence and health-related quality of life: a model.
Marden, Susan F
2005-04-01
This paper presents a new theoretical model to explain people's diverse responses to therapeutic health technology by characterizing the relationship between technology dependence and health-related quality of life (HRQL). Technology dependence has been defined as reliance on a variety of devices, drugs and procedures to alleviate or remedy acute or chronic health problems. Health professionals must ensure that these technologies result in positive outcomes for those who must rely on them, while minimizing the potential for unintended consequences. Little research exists to inform health professionals about how dependency on therapeutic technology may affect patient-reported outcomes such as HRQL. Organizing frameworks to focus such research are also limited. Generated from the synthesis of three theoretical frameworks and empirical research, the model proposes that attitudes towards technology dependence affect HRQL through a person's illness representations or commonsense beliefs about their illness. Symptom distress, illness history, age and gender also influence the technology dependence and HRQL relationship. Five concepts form the major components of the model: a) attitudes towards technology dependence, b) illness representation, c) symptom distress, d) HRQL and e) illness history. The model is proposed as a guide for clinical nursing research into the impact of a wide variety of therapeutic health care interventions on HRQL. Empirical validation of the model is needed to test its generality.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method considers only the uncertainty contributions of individual model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric, where each layer can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, in which the different uncertainty components are represented as uncertain nodes. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in how uncertainty components are grouped. The variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that represent key model system processes (e.g., the groundwater recharge process or the reactive transport process). For test and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport model with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measures for any uncertainty source formed by a combination of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
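The variance-based measure underlying this framework can be illustrated with a toy example. The sketch below uses plain Monte Carlo with a pick-freeze estimator, not the authors' Bayesian-network implementation, and the two-parameter "groundwater" model is purely hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

def model(recharge, conductivity):
    # Hypothetical toy response, standing in for a groundwater model output.
    return recharge / conductivity + 0.5 * recharge

# Two independent Monte Carlo samples (A and B) of each uncertain input.
A_r, B_r = rng.uniform(0.5, 1.5, N), rng.uniform(0.5, 1.5, N)
A_k, B_k = rng.uniform(1.0, 3.0, N), rng.uniform(1.0, 3.0, N)

yA = model(A_r, A_k)
var_y = yA.var()

def first_order(y_mixed):
    # Pick-freeze estimator: S_i = Cov(yA, y_mixed) / Var(Y), where y_mixed
    # keeps input i from sample A and redraws the other input from sample B.
    return (np.mean(yA * y_mixed) - yA.mean() * y_mixed.mean()) / var_y

S_recharge = first_order(model(A_r, B_k))      # recharge held from sample A
S_conductivity = first_order(model(B_r, A_k))  # conductivity held from sample A
```

For this toy model the recharge index dominates; a first-order sum short of 1 signals interaction effects, which is the kind of grouped contribution the framework above generalizes to scenario and model layers.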
C3: A Collaborative Web Framework for NASA Earth Exchange
NASA Astrophysics Data System (ADS)
Foughty, E.; Fattarsi, C.; Hardoyo, C.; Kluck, D.; Wang, L.; Matthews, B.; Das, K.; Srivastava, A.; Votava, P.; Nemani, R. R.
2010-12-01
The NASA Earth Exchange (NEX) is a new collaboration platform for the Earth science community that provides a mechanism for scientific collaboration and knowledge sharing. NEX combines NASA advanced supercomputing resources, Earth system modeling, workflow management, NASA remote sensing data archives, and a collaborative communication platform to deliver a complete work environment in which users can explore and analyze large datasets, run modeling codes, collaborate on new or existing projects, and quickly share results among the Earth science communities. NEX is designed primarily for use by the NASA Earth science community to address scientific grand challenges. The NEX web portal component provides an online collaborative environment in which researchers share Earth science models, data, analysis tools, and scientific results. In addition, the NEX portal serves as a knowledge network that allows researchers to connect and collaborate based on the research they are involved in, specific geographic areas of interest, fields of study, etc. Features of the NEX web portal include member profiles, resource sharing (data sets, algorithms, models, publications), communication tools (commenting, messaging, social tagging), project tools (wikis, blogs), and more. The NEX web portal is built on the proven technologies and policies of DASHlink.arc.nasa.gov, one of NASA's first science social media websites. The core component of the web portal is the C3 framework, which was built using Django and is being deployed as a common framework for a number of collaborative sites throughout NASA.
A comprehensive conceptual framework for road safety strategies.
Hughes, B P; Anund, A; Falkmer, T
2016-05-01
Road safety strategies (generally called Strategic Highway Safety Plans in the USA) provide essential guidance for actions to improve road safety, but often lack a conceptual framework that is comprehensive, based on systems theory, and underpinned by evidence from research and practice. This paper aims to provide such a framework, incorporating all components of road safety, the policy tools by which they are changed, and the general interactions between them. A framework of nine mutually interacting components that contribute to crashes, and ten generic policy tools that can be applied to reduce the outcomes of these crashes, was developed and used to assess 58 road safety strategies from 22 countries across 15 years. The work identifies the policy tools that are most and least widely applied to each component, highlighting the potential for improvements to any individual road safety strategy, as well as the potential strengths and weaknesses of road safety strategies in general. The framework also provides guidance for the development of new road safety strategies, identifying potential consequences of policy-tool-based measures with regard to exposure and risk, useful for both mobility and safety objectives. Copyright © 2016 Elsevier Ltd. All rights reserved.
Free-electron laser emission architecture impact on extreme ultraviolet lithography
NASA Astrophysics Data System (ADS)
Hosler, Erik R.; Wood, Obert R.; Barletta, William A.
2017-10-01
Laser-produced plasma (LPP) EUV sources have demonstrated ˜125 W at customer sites, establishing confidence in EUV lithography (EUVL) as a viable manufacturing technology. However, for extension to the 3-nm technology node and beyond, existing scanner/source technology must enable higher-NA imaging systems (requiring increased resist dose and providing half-field exposures) and/or EUV multipatterning (requiring increased wafer throughput proportional to the number of exposure passes). Both development paths will require a substantial increase in EUV source power to maintain the economic viability of the technology, creating an opportunity for free-electron laser (FEL) EUV sources. FEL-based EUV sources offer an economic, high-power/single-source alternative to LPP EUV sources. Should FELs become the preferred next-generation EUV source, the choice of FEL emission architecture will greatly affect its operational stability and overall capability. A near-term industrialized FEL is expected to utilize one of the following three existing emission architectures: (1) self-amplified spontaneous emission, (2) regenerative amplifier, or (3) self-seeding. Model accelerator parameters are put forward to evaluate the impact of emission architecture on FEL output. Then, variations in the parameter space are applied to assess the potential impact to lithography operations, thereby establishing component sensitivity. The operating range of various accelerator components is discussed based on current accelerator performance demonstrated at various scientific user facilities. Finally, comparison of the performance between the model accelerator parameters and the variation in parameter space provides a means to evaluate the potential emission architectures. A scorecard is presented to facilitate this evaluation and provides a framework for future FEL design and enablement for EUVL applications.
Aligning business and information technology domains: strategic planning in hospitals.
Henderson, J C; Thomas, J B
1992-01-01
This article develops a framework for strategic information technology (IT) management in hospitals, termed the Strategic Alignment Model. This model is defined in terms of four domains--business strategy, IT strategy, organizational infrastructure, and IT infrastructure--each with its constituent components. The concept of strategic alignment is developed using two fundamental dimensions--strategic fit and integration. Different perspectives that hospitals use for aligning the various domains are discussed, and a prescriptive model of strategic IT planning is proposed.
Hamm, Julian; Money, Arthur G; Atwal, Anita; Paraskevopoulos, Ioannis
2016-02-01
In recent years, an ever-increasing range of technology-based applications have been developed with the goal of assisting in the delivery of more effective and efficient fall prevention interventions. While a number of studies have surveyed technologies for particular sub-domains of fall prevention, no existing research surveys the full spectrum of fall prevention interventions and characterises the range of technologies that have augmented this landscape. This study presents a conceptual framework and survey of the state of the art of technology-based fall prevention systems, derived from a systematic template analysis of studies presented in contemporary research literature. The framework proposes four broad categories of fall prevention intervention system: Pre-fall prevention; Post-fall prevention; Fall injury prevention; Cross-fall prevention. Other categories include Application type, Technology deployment platform, Information sources, Deployment environment, User interface type, and Collaborative function. After presenting the conceptual framework, a detailed survey of the state of the art is presented as a function of the proposed framework. A number of research challenges emerge from surveying the research literature, including a need for: new systems that focus on overcoming extrinsic falls risk factors; systems that support the environmental risk assessment process; and systems that enable patients and practitioners to develop more collaborative relationships and engage in shared decision making during falls risk assessment and prevention activities. In response to these challenges, recommendations and future research directions are proposed for each respective challenge. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Diabetes Information Technology: Designing Informatics Systems to Catalyze Change in Clinical Care
Lester, William T.; Zai, Adrian H.; Chueh, Henry C.; Grant, Richard W.
2008-01-01
Current computerized reminder and decision support systems intended to improve diabetes care have had a limited effect on clinical outcomes. Increasing pressures on health care networks to meet standards of diabetes care have created an environment where information technology systems for diabetes management are often created under duress, appended to existing clinical systems, and poorly integrated into the existing workflow. After defining the components of diabetes disease management, the authors present an eight-step conceptual framework to guide the development of more effective diabetes information technology systems for translating clinical information into clinical action. PMID:19885355
Multiplicative Multitask Feature Learning
Wang, Xin; Bi, Jinbo; Yu, Shipeng; Sun, Jiangwen; Song, Minghu
2016-01-01
We investigate a general framework of multiplicative multitask feature learning which decomposes individual task’s model parameters into a multiplication of two components. One of the components is used across all tasks and the other component is task-specific. Several previous methods can be proved to be special cases of our framework. We study the theoretical properties of this framework when different regularization conditions are applied to the two decomposed components. We prove that this framework is mathematically equivalent to the widely used multitask feature learning methods that are based on a joint regularization of all model parameters, but with a more general form of regularizers. Further, an analytical formula is derived for the across-task component as related to the task-specific component for all these regularizers, leading to a better understanding of the shrinkage effects of different regularizers. Study of this framework motivates new multitask learning algorithms. We propose two new learning formulations by varying the parameters in the proposed framework. An efficient blockwise coordinate descent algorithm is developed suitable for solving the entire family of formulations with rigorous convergence analysis. Simulation studies have identified the statistical properties of data that would be in favor of the new formulations. Extensive empirical studies on various classification and regression benchmark data sets have revealed the relative advantages of the two new formulations by comparing with the state of the art, which provides instructive insights into the feature learning problem with multiple tasks. PMID:28428735
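The multiplicative decomposition at the heart of this framework can be sketched in a few lines of NumPy. This is an illustrative joint-gradient-descent toy, not the paper's blockwise coordinate descent algorithm, and all data and hyperparameters are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
T, d, n = 3, 10, 200  # tasks, features, samples per task

# Synthetic tasks that share the same 4 relevant features.
c_true = np.zeros(d)
c_true[:4] = 1.0
X = [rng.normal(size=(n, d)) for _ in range(T)]
w_true = [c_true * rng.normal(size=d) for _ in range(T)]
y = [X[t] @ w_true[t] + 0.01 * rng.normal(size=n) for t in range(T)]

# Multiplicative decomposition: w_t = c * v_t, with c shared across tasks
# and v_t task-specific; both components lightly L2-regularized here.
c = np.ones(d)
V = [np.zeros(d) for _ in range(T)]
lr, lam = 0.01, 0.01

for _ in range(500):
    grad_c = lam * c
    for t in range(T):
        resid = X[t] @ (c * V[t]) - y[t]
        g = X[t].T @ resid / n              # gradient w.r.t. w_t = c * v_t
        grad_c += g * V[t] / T              # chain rule: dL/dc  = g * v_t
        V[t] -= lr * (g * c + lam * V[t])   # chain rule: dL/dv_t = g * c
    c -= lr * grad_c

mse = np.mean([np.mean((X[t] @ (c * V[t]) - y[t]) ** 2) for t in range(T)])
```

In the paper, different regularizers on the two components induce different joint-shrinkage patterns across tasks; the plain L2 penalties above are just one instance of that family.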
DKIST visible broadband imager data processing pipeline
NASA Astrophysics Data System (ADS)
Beard, Andrew; Cowan, Bruce; Ferayorni, Andrew
2014-07-01
The Daniel K. Inouye Solar Telescope (DKIST) Data Handling System (DHS) provides the technical framework and building blocks for developing on-summit instrument quality assurance and data reduction pipelines. The DKIST Visible Broadband Imager (VBI) is a first light instrument that alone will create two data streams with a bandwidth of 960 MB/s each. The high data rate and data volume of the VBI require near-real time processing capability for quality assurance and data reduction, and will be performed on-summit using Graphics Processing Unit (GPU) technology. The VBI data processing pipeline (DPP) is the first designed and developed using the DKIST DHS components, and therefore provides insight into the strengths and weaknesses of the framework. In this paper we lay out the design of the VBI DPP, examine how the underlying DKIST DHS components are utilized, and discuss how integration of the DHS framework with GPUs was accomplished. We present our results of the VBI DPP alpha release implementation of the calibration, frame selection reduction, and quality assurance display processing nodes.
GIS Application System Design Applied to Information Monitoring
NASA Astrophysics Data System (ADS)
Qun, Zhou; Yujin, Yuan; Yuena, Kang
Natural environment information management involves online instrument monitoring, data communications, database establishment, information management software development, and so on. Its core lies in collecting effective and reliable environmental information, increasing the utilization and sharing of environmental information through advanced information technology, and providing as timely and scientific a foundation as possible for environmental monitoring and management. This thesis adopts C# plug-in application development and uses a complete set of embedded GIS component and tool libraries provided by GIS Engine to build the core of a plug-in GIS application framework, namely the design and implementation of the framework host program and each functional plug-in, as well as of the plug-in GIS application framework platform. The thesis exploits the advantages of dynamic plug-in loading configuration to quickly establish GIS applications through visualized component collaborative modeling, and thereby realizes GIS application integration. The developed platform is applicable to any integration involving GIS applications (on the ESRI platform) and can serve as a base platform for GIS application development.
Textile-Based Electronic Components for Energy Applications: Principles, Problems, and Perspective
Kaushik, Vishakha; Lee, Jaehong; Hong, Juree; Lee, Seulah; Lee, Sanggeun; Seo, Jungmok; Mahata, Chandreswar; Lee, Taeyoon
2015-01-01
Textile-based electronic components have gained interest in the fields of science and technology. Recent developments in nanotechnology have enabled the integration of electronic components into textiles while retaining desirable characteristics such as flexibility, strength, and conductivity. Various materials were investigated in detail to obtain current conductive textile technology, and the integration of electronic components into these textiles shows great promise for common everyday applications. The harvest and storage of energy in textile electronics is a challenge that requires further attention in order to enable complete adoption of this technology in practical implementations. This review focuses on the various conductive textiles, their methods of preparation, and textile-based electronic components. We also focus on fabrication and the function of textile-based energy harvesting and storage devices, discuss their fundamental limitations, and suggest new areas of study. PMID:28347078
Structures Technology for Future Aerospace Systems
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Venneri, Samuel L.; Paul, Donald B.; Hopkins, Mark A.
2000-01-01
An overview of structures technology for future aerospace systems is given. Discussion focuses on developments in component technologies that will improve the vehicle performance, advance the technology exploitation process, and reduce system life-cycle costs. The component technologies described are smart materials and structures, multifunctional materials and structures, affordable composite structures, extreme environment structures, flexible load bearing structures, and computational methods and simulation-based design. The trends in each of the component technologies are discussed and the applicability of these technologies to future aerospace vehicles is described.
Alphey, Nina; Alphey, Luke; Bonsall, Michael B.
2011-01-01
Vector-borne diseases impose enormous health and economic burdens and additional methods to control vector populations are clearly needed. The Sterile Insect Technique (SIT) has been successful against agricultural pests, but is not in large-scale use for suppressing or eliminating mosquito populations. Genetic RIDL technology (Release of Insects carrying a Dominant Lethal) is a proposed modification that involves releasing insects that are homozygous for a repressible dominant lethal genetic construct rather than being sterilized by irradiation, and could potentially overcome some technical difficulties with the conventional SIT technology. Using the arboviral disease dengue as an example, we combine vector population dynamics and epidemiological models to explore the effect of a program of RIDL releases on disease transmission. We use these to derive a preliminary estimate of the potential cost-effectiveness of vector control by applying estimates of the costs of SIT. We predict that this genetic control strategy could eliminate dengue rapidly from a human community, and at lower expense (approximately US$2–30 per case averted) than the direct and indirect costs of disease (mean US$86–190 per case of dengue). The theoretical framework has wider potential use; by appropriately adapting or replacing each component of the framework (entomological, epidemiological, vector control bio-economics and health economics), it could be applied to other vector-borne diseases or vector control strategies and extended to include other health interventions. PMID:21998654
ERIC Educational Resources Information Center
Hirumi, Atsusi
2013-01-01
Advances in technology offer a vast array of opportunities for facilitating elearning. However, difficulties may arise if elearning research and design, including the use of emerging technologies, are based primarily on past practices, fads, or political agendas. This article describes refinements made to a framework for designing and sequencing…
NASA Astrophysics Data System (ADS)
Hayami, Masao; Seino, Junji; Nakai, Hiromi
2018-03-01
This article proposes a gauge-origin independent formalism of the nuclear magnetic shielding constant in the two-component relativistic framework based on the unitary transformation. The proposed scheme introduces the gauge factor and the unitary transformation into the atomic orbitals. The two-component relativistic equation is formulated by block-diagonalizing the Dirac Hamiltonian together with gauge factors. This formulation is available for arbitrary relativistic unitary transformations. Then, the infinite-order Douglas-Kroll-Hess (IODKH) transformation is applied to the present formulation. Next, the analytical derivatives of the IODKH Hamiltonian for the evaluation of the nuclear magnetic shielding constant are derived. Results obtained from the numerical assessments demonstrate that the present formulation removes the gauge-origin dependence completely. Furthermore, the formulation with the IODKH transformation gives results that are close to those in four-component and other two-component relativistic schemes.
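For context, the gauge factor attached to each atomic orbital is, in the standard non-relativistic convention, the gauge-including atomic orbital (GIAO, or London orbital) phase; it is shown here only as a reference point, since the paper generalizes this construction to the two-component unitary-transformed framework:

```latex
\chi_\mu(\mathbf{r};\mathbf{B})
  = \exp\!\left[-\tfrac{i}{2c}\,(\mathbf{B}\times\mathbf{R}_\mu)\cdot\mathbf{r}\right]
    \chi_\mu(\mathbf{r};\mathbf{0})
```

where R_μ is the center of basis function χ_μ. Matrix elements built from such orbitals depend only on differences of basis-function centers, which is what removes the dependence on a global gauge origin.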
A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification
Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Varnum, Susan M.; Brown, Joseph N.; Riensche, Roderick M.; Adkins, Joshua N.; Jacobs, Jon M.; Hoidal, John R.; Scholand, Mary Beth; Pounds, Joel G.; Blackburn, Michael R.; Rodland, Karin D.; McDermott, Jason E.
2013-01-01
Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.
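The two-step combination described in Methods, distance-based clustering followed by expert-knowledge-driven selection, can be caricatured as follows. Everything here, from the data to the "expert list", is invented for illustration; ISIC itself is a five-component pipeline, not this toy:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy abundance matrix: 30 samples x 8 proteins, with two correlated groups.
base = rng.normal(size=(30, 2))
data = np.hstack([base[:, [0]] + 0.1 * rng.normal(size=(30, 3)),
                  base[:, [1]] + 0.1 * rng.normal(size=(30, 5))])
proteins = [f"P{i}" for i in range(8)]

# Step 1 (distance-based clustering): greedily group proteins whose
# correlation distance (1 - |r|) falls below a threshold.
corr = np.corrcoef(data, rowvar=False)
clusters, assigned = [], set()
for i in range(len(proteins)):
    if i in assigned:
        continue
    members = [j for j in range(len(proteins))
               if j not in assigned and 1 - abs(corr[i, j]) < 0.5]
    assigned.update(members)
    clusters.append(members)

# Step 2 (expert-knowledge-driven selection): within each cluster, prefer
# proteins on a hypothetical curated disease-pathway list.
expert_list = {"P1", "P5"}  # assumed prior knowledge, for illustration only
signature = []
for members in clusters:
    names = [proteins[j] for j in members]
    preferred = [nm for nm in names if nm in expert_list]
    signature.append(preferred[0] if preferred else names[0])
```

Selecting one representative per correlation cluster is one way to obtain the orthogonal, robust features the Conclusions call for, while the expert list biases the choice toward biologically interpretable candidates.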
Schlägel, Ulrike E; Lewis, Mark A
2016-12-01
Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, the resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when the resolution of the data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how these different resolutions can be accounted for in statistical inference. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
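A minimal illustration of robustness to temporal coarsening, using a plain 1-D Gaussian random walk rather than the resource selection models analyzed in the paper: thinning the track to every k-th fix yields another Gaussian walk whose step variance is scaled by k, so inference at one resolution can be rescaled to another.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a 1-D Gaussian random walk observed at fine resolution.
n, sigma = 100_000, 0.5
path = np.cumsum(rng.normal(0.0, sigma, size=n))

def step_var(track, k):
    # Variance of increments when the track is thinned to every k-th fix.
    inc = np.diff(track[::k])
    return inc.var()

v1 = step_var(path, 1)   # fine-resolution step variance, ~ sigma**2
v4 = step_var(path, 4)   # coarse resolution: each increment sums 4 steps
ratio = v4 / v1          # ~ 4 for a Gaussian walk
```

This scaling property is exactly the kind of formal robustness the framework defines; for models with, e.g., directional persistence or resource selection, the subsampled process is generally no longer in the same model class, which is why only approximate robustness can be expected.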
Eckart, J Dana; Sobral, Bruno W S
2003-01-01
The emergent needs of the bioinformatics community challenge current information systems. The pace of biological data generation far outstrips Moore's Law. Therefore, a gap continues to widen between the capability to produce biological (molecular and cell) data sets and the capability to manage and analyze these data sets. As a result, Federal investments in large data set generation produce diminishing returns in terms of the community's capability to understand biology and to leverage that understanding to make scientific and technological advances that improve society. We are building an open framework to address various data management issues including data and tool interoperability, nomenclature and data communication standardization, and database integration. PathPort, short for Pathogen Portal, employs a generic, web-services based framework to deal with some of the problems identified by the bioinformatics community. The motivating research goal of a scalable system to provide data management and analysis for key pathosystems, especially relating to molecular data, has resulted in a generic framework using two major components. On the server-side, we employ web-services. On the client-side, a Java application called ToolBus acts as a client-side "bus" for contacting data and tools and viewing results through a single, consistent user interface.
Towards a Holistic Framework for the Evaluation of Emergency Plans in Indoor Environments
Serrano, Emilio; Poveda, Geovanny; Garijo, Mercedes
2014-01-01
One of the most promising fields for ambient intelligence is the implementation of intelligent emergency plans. Because the use of drills and living labs cannot reproduce social behaviors, such as panic attacks, that strongly affect these plans, the use of agent-based social simulation provides an approach to evaluate these plans more thoroughly. (1) The hypothesis presented in this paper is that there has been little interest in describing the key modules that these simulators must include, such as formally represented knowledge and a realistic simulated sensor model, and especially in providing researchers with tools to reuse, extend and interconnect modules from different works. This lack of interest hinders researchers from achieving a holistic framework for evaluating emergency plans and forces them to reconsider and to implement the same components from scratch over and over. In addition to supporting this hypothesis by considering over 150 simulators, this paper: (2) defines the main modules identified and proposes the use of semantic web technologies as a cornerstone for the aforementioned holistic framework; (3) provides a basic methodology to achieve the framework; (4) identifies the main challenges; and (5) presents an open and free software tool to hint at the potential of such a holistic view of emergency plan evaluation in indoor environments. PMID:24662453
Shirley, S; Stampfl, R
1997-12-01
The purpose of this explanatory and prescriptive article is to identify interdisciplinary theories used by hospital development to direct its practice. The article explores, explains, and applies theories and principles from behavioral, social, and managerial disciplines. Learning, motivational, organizational, marketing, and attitudinal theories are incorporated and transformed into the fundamental components of a conceptual framework that provides an overview of the practice of hospital development. How this discipline incorporates these theories to design, explain, and prescribe the focus of its own practice is demonstrated. This interdisciplinary approach results in a framework for practice that is adaptable to changing social, cultural, economic, political, and technological environments.
A Systems Framework for Assessing Plumbing Products-Related Water Conservation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Alison; Dunham Whitehead, Camilla; Lutz, James
2011-12-02
Reducing the water use of plumbing products—toilets, urinals, faucets, and showerheads—has been a popular conservation measure. Improved technologies have created opportunities for additional conservation in this area. However, plumbing products do not operate in a vacuum. This paper reviews the literature related to plumbing products to determine a systems framework for evaluating future conservation measures using these products. The main framework comprises the following categories: water use efficiency, product components, product performance, source water, energy, and plumbing/sewer infrastructure. This framework for analysis provides a starting point for professionals considering future water conservation measures to evaluate the need for additional research, collaboration with other standards or codes committees, and attachment of additional metrics to water use efficiency (such as performance).
Consistent multiphysics simulation of a central tower CSP plant as applied to ISTORE
NASA Astrophysics Data System (ADS)
Votyakov, Evgeny V.; Papanicolas, Costas N.
2017-06-01
We present a unified consistent multiphysics approach to model a central tower CSP plant. The framework for the model includes Monte Carlo ray tracing (RT) and computational fluid dynamics (CFD) components utilizing the OpenFOAM C++ software library. The RT part works effectively with complex surfaces of engineering design given in CAD formats. The CFD simulation, which is based on 3D Navier-Stokes equations, takes into account all possible heat transfer mechanisms: radiation, conduction, and convection. Utilizing this package, the solar field of the experimental Platform for Research, Observation, and TEchnological Applications in Solar Energy (PROTEAS) and the Integrated STOrage and Receiver (ISTORE), developed at the Cyprus Institute, are being examined.
2012-01-01
Background Deciding which health technologies to fund involves confronting some of the most difficult choices in medicine. As for other countries, the Israeli health system is faced each year with having to make these difficult decisions. The Public National Advisory Committee, known as ‘the Basket Committee’, selects new technologies for the basic list of health care that all Israelis are entitled to access, known as the ‘health basket’. We introduce a framework for health technology prioritization based explicitly on value for money that enables the main variables considered by decision-makers to be explicitly included. Although the framework’s exposition is in terms of the Basket Committee selecting new technologies for Israel’s health basket, we believe that the framework would also work well for other countries. Methods Our proposed prioritization framework involves comparing four main variables for each technology: 1. Incremental benefits, including ‘equity benefits’, to Israel’s population; 2. Incremental total cost to Israel’s health system; 3. Quality of evidence; and 4. Any additional ‘X-factors’ not elsewhere included, such as strategic or legal factors, etc. Applying methodology from multi-criteria decision analysis, the multiple dimensions comprising the first variable are aggregated via a points system. Results The four variables are combined for each technology and compared across the technologies in the ‘Value for Money (VfM) Chart’. The VfM Chart can be used to identify technologies that are good value for money, and, given a budget constraint, to select technologies that should be funded. This is demonstrated using 18 illustrative technologies. Conclusions The VfM Chart is an intuitively appealing decision-support tool for helping decision-makers to focus on the inherent tradeoffs involved in health technology prioritization. 
Such deliberations can be performed in a systematic and transparent fashion that can also be easily communicated to stakeholders, including the general public. Possible future research includes pilot-testing the VfM Chart using real-world data. Ideally, this would involve working with the Basket Committee. Likewise, the framework could be tested and applied by health technology prioritization agencies in other countries. PMID:23181391
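The points-system aggregation and budget-constrained selection the VfM Chart abstract describes can be sketched as below. The criteria, weights, and greedy value-for-money selection rule are illustrative assumptions; the paper's actual multi-criteria decision analysis points system is richer.

```python
def vfm_rank(techs, weights):
    """techs: name -> {'scores': {criterion: 0..1}, 'cost': float}.
    Aggregates weighted criterion scores into points, then ranks
    technologies by points per unit of incremental cost."""
    ranked = []
    for name, t in techs.items():
        points = sum(weights[c] * s for c, s in t["scores"].items())
        ranked.append((points / t["cost"], points, name))
    ranked.sort(reverse=True)
    return ranked

def select_under_budget(ranked, techs, budget):
    # Greedily fund the best value-for-money technologies that fit.
    chosen, spent = [], 0.0
    for _per_cost, _points, name in ranked:
        cost = techs[name]["cost"]
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

weights = {"health_benefit": 70, "equity_benefit": 30}  # hypothetical
techs = {
    "A": {"scores": {"health_benefit": 0.9, "equity_benefit": 0.5}, "cost": 10.0},
    "B": {"scores": {"health_benefit": 0.4, "equity_benefit": 0.9}, "cost": 5.0},
    "C": {"scores": {"health_benefit": 0.8, "equity_benefit": 0.2}, "cost": 20.0},
}
ranked = vfm_rank(techs, weights)
funded = select_under_budget(ranked, techs, budget=16.0)
```

The greedy rule makes the cost-benefit tradeoff explicit, which is the point of plotting technologies on a value-for-money chart; the paper's framework additionally layers quality of evidence and X-factors on top of this comparison.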
Video copy protection and detection framework (VPD) for e-learning systems
NASA Astrophysics Data System (ADS)
ZandI, Babak; Doustarmoghaddam, Danial; Pour, Mahsa R.
2013-03-01
This article reviews and compares copyright issues related to digital video files, whose copy detection can be categorized as content-based or digital-watermarking-based. We then describe how to protect a digital video using a specialized video data hiding method and algorithm, and how to detect the copyright status of a file. Based on the current direction of video copy detection technology, combined with our own research results, we put forward a new video protection and copy detection approach for plagiarism in e-learning systems using video data hiding technology. Finally, we introduce a framework for video protection and detection in e-learning systems (the VPD framework).
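As a minimal illustration of video data hiding, the sketch below embeds a bit string into the least significant bits of frame bytes and recovers it later. This is the generic LSB technique, not the paper's specific algorithm, and the frame representation is a simplification.

```python
def embed_watermark(frame, bits):
    """frame: list of byte values (0-255); bits: string of '0'/'1'.
    Writes each bit into the least significant bit of one byte."""
    out = list(frame)  # leave the original frame untouched
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | int(b)
    return out

def extract_watermark(frame, n_bits):
    # Read the low bit of the first n_bits bytes back out.
    return "".join(str(frame[i] & 1) for i in range(n_bits))

frame = [200, 13, 77, 54, 91, 128, 33, 6]   # toy "frame" of pixel bytes
marked = embed_watermark(frame, "10110010")
```

A copy detector can then check candidate files for the presence of the embedded mark; real systems use perceptually robust embedding rather than raw LSBs, which do not survive re-encoding.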
KATTS: a framework for maximizing NCLEX-RN performance.
McDowell, Betsy M
2008-04-01
A key indicator of the quality of a nursing education program is the performance of its graduates as first-time takers of the NCLEX-RN. As a result, nursing schools are open to strategies that strengthen the performance of their graduates on the examination. The Knowledge base, Anxiety control, Test-Taking Skills (KATTS) framework focuses on the three components of achieving a maximum score on an examination. In KATTS, all three components must be present and in proper balance to maximize a test taker's score. By strengthening not just one but all of these components, graduates can improve their overall test scores significantly. Suggested strategies for strengthening each component of KATTS are provided. This framework has been used successfully in designing remedial tutoring programs and in assisting first-time NCLEX test takers in preparing for the licensing examination.
ERIC Educational Resources Information Center
Cardenas-Claros, Monica Stella; Gruba, Paul A.
2013-01-01
This paper proposes a theoretical framework for the conceptualization and design of help options in computer-based second language (L2) listening. Based on four empirical studies, it aims at clarifying both conceptualization and design (CoDe) components. The elements of conceptualization consist of a novel four-part classification of help options:…
Why Do We Need Future Ready Librarians? That Kid.
ERIC Educational Resources Information Center
Ray, Mark
2018-01-01
In this article, the author examines the need of the Future Ready Librarians (FRL) initiative. The FRL Framework helps define how librarians might lead, teach, and support schools based on the core research-based components defined by Future Ready. The framework and initiative are intended to be ways to change the conversation about school…
A technical framework to describe occupant behavior for building energy simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, William; Hong, Tianzhen
2013-12-20
Green buildings that fail to meet expected design performance criteria indicate that technology alone does not guarantee high performance. Human influences are quite often simplified and ignored in the design, construction, and operation of buildings. Energy-conscious human behavior has been demonstrated to be a significant positive factor for improving the indoor environment while reducing the energy use of buildings. In our study we developed a new technical framework to describe energy-related human behavior in buildings. The energy-related behavior includes accounting for individuals and groups of occupants and their interactions with building energy services systems, appliances and facilities. The technical framework consists of four key components:
i. the drivers behind energy-related occupant behavior, which are biological, societal, environmental, physical, and economical in nature;
ii. the needs of the occupants, based on satisfying criteria that are either physical (e.g. thermal, visual and acoustic comfort) or non-physical (e.g. entertainment, privacy, and social reward);
iii. the actions that building occupants perform when their needs are not fulfilled;
iv. the systems with which an occupant can interact to satisfy their needs.
The technical framework aims to provide a standardized description of a complete set of human energy-related behaviors in the form of an XML schema. For each type of behavior (e.g., occupants opening/closing windows, switching on/off lights etc.) we identify a set of common behaviors based on a literature review, survey data, and our own field study and analysis. Stochastic models are adopted or developed for each type of behavior to enable the evaluation of the impact of human behavior on energy use in buildings, during either the design or operation phase. We will also demonstrate the use of the technical framework in assessing the impact of occupancy behavior on energy saving technologies.
The technical framework presented is part of our human behavior research, a 5-year program under the U.S.-China Clean Energy Research Center for Building Energy Efficiency.
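A common form for the stochastic behavior models the abstract mentions is a logistic action probability driven by an environmental variable. The sketch below models window opening as a function of indoor temperature; the coefficients and the drivers-needs-actions mapping are illustrative assumptions, not values from the framework.

```python
import math
import random

def p_open_window(t_indoor, a=-13.0, b=0.5):
    """Probability an occupant opens the window in one time step.
    a, b are illustrative logistic coefficients (driver: temperature)."""
    return 1.0 / (1.0 + math.exp(-(a + b * t_indoor)))

def simulate(temps, seed=1):
    """Hourly simulation: once the comfort need triggers the action,
    the window (the 'system' the occupant interacts with) stays open."""
    rng = random.Random(seed)
    window_open = False
    states = []
    for t in temps:
        if not window_open and rng.random() < p_open_window(t):
            window_open = True
        states.append(window_open)
    return states
```

With these coefficients the opening probability is small at 20 °C and high at 30 °C, so simulated occupants respond to warming as the framework's drivers-needs-actions chain describes; a building simulation would sample such models each time step for every behavior type in the schema.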
The use of technology to promote vaccination: A social ecological model based framework.
Kolff, Chelsea A; Scott, Vanessa P; Stockwell, Melissa S
2018-05-21
Vaccinations are an important and effective cornerstone of preventive medical care. Growing technologic capabilities and use by both patients and providers present critical opportunities to leverage these tools to improve vaccination rates and public health. We propose the Social Ecological Model as a useful theoretical framework to identify areas in which technology has been or may be leveraged to target undervaccination across the individual, interpersonal, organizational, community, and society levels and the ways in which these levels interact.
Move-tecture: A Conceptual Framework for Designing Movement in Architecture
NASA Astrophysics Data System (ADS)
Yilmaz, Irem
2017-10-01
Along with the technological improvements in our age, it is now possible for movement to become one of the basic components of architectural space. Accordingly, the architectural construction of movement changes both our architectural production practices and our understanding of architectural space. However, existing design concepts and approaches are insufficient to discuss and understand this change. In this respect, this study aims to form a conceptual framework on the relationship between architecture and movement. In this sense, the conceptualization of move-tecture is developed to investigate the architectural construction of movement and the potentials of spatial creation through architecturally constructed movement. Move-tecture is a conceptualization that treats movement as a basic component of spatial creation. It presents the framework of a qualitative categorization for the design of moving architectural structures. This categorization is a flexible one, however, that can evolve along with the expanding possibilities of architectural design and changing living conditions. With this understanding, six categories have been defined within the context of the article: Topological Organization, Choreographic Formation, Kinetic Structuring, Corporeal Constitution, Technological Configuration and Interactional Patterning. In line with these categories, a multifaceted perspective on moving architectural structures is promoted. Such an understanding is intended to constitute a new initiative in the design practices carried out in this area and to provide a conceptual basis for the discussions to be developed.
A Design Framework for Online Teacher Professional Development Communities
ERIC Educational Resources Information Center
Liu, Katrina Yan
2012-01-01
This paper provides a design framework for building online teacher professional development communities for preservice and inservice teachers. The framework is based on a comprehensive literature review on the latest technology and epistemology of online community and teacher professional development, comprising four major design factors and three…
An e-Learning Theoretical Framework
ERIC Educational Resources Information Center
Aparicio, Manuela; Bacao, Fernando; Oliveira, Tiago
2016-01-01
E-learning systems have witnessed a usage and research increase in the past decade. This article presents the e-learning concepts ecosystem. It summarizes the various scopes on e-learning studies. Here we propose an e-learning theoretical framework. This theory framework is based upon three principal dimensions: users, technology, and services…
Biomedical research in a Digital Health Framework
2014-01-01
This article describes a Digital Health Framework (DHF), benefitting from the lessons learnt during the three-year life span of the FP7 Synergy-COPD project. The DHF aims to embrace the emerging requirements - data and tools - of applying systems medicine into healthcare with a three-tier strategy articulating formal healthcare, informal care and biomedical research. Accordingly, it has been constructed based on three key building blocks, namely, novel integrated care services with the support of information and communication technologies, a personal health folder (PHF) and a biomedical research environment (DHF-research). Details on the functional requirements and necessary components of the DHF-research are extensively presented. Finally, the specifics of the building blocks strategy for deployment of the DHF, as well as the steps toward adoption, are analyzed. The proposed architectural solutions and implementation steps constitute a pivotal strategy to foster and enable 4P medicine (Predictive, Preventive, Personalized and Participatory) in practice and should provide a head start to any community and institution currently considering implementing a biomedical research platform. PMID:25472554
CARE activities on superconducting RF cavities at INFN Milano
NASA Astrophysics Data System (ADS)
Bosotti, A.; Pierini, P.; Michelato, P.; Pagani, C.; Paparella, R.; Panzeri, N.; Monaco, L.; Paulon, R.; Novati, M.
2005-09-01
The SC RF group at INFN Milano-LASA is involved both in the TESLA/TTF collaboration and in research and design activity on superconducting cavities for proton accelerators. Among these activities, some are supported by the European community within the CARE project. In the framework of the JRASRF collaboration we are developing a coaxial blade tuner for ILC (International Linear Collider) cavities, integrated with piezoelectric actuators for the compensation of Lorentz force detuning and microphonic perturbations. Another activity, regarding improved component design in SC technology, based on retrieving information about the state of the art of ancillaries and the experience of various laboratories involved in SCRF, has started in our laboratory. Finally, in the framework of the HIPPI collaboration, we are testing two low-beta superconducting cavities, built for the Italian TRASCO project, to verify the possibility of using them for pulsed operation. All these activities are described here, together with the main results and future perspectives.
Light-melt adhesive based on dynamic carbon frameworks in a columnar liquid-crystal phase
NASA Astrophysics Data System (ADS)
Saito, Shohei; Nobusue, Shunpei; Tsuzaka, Eri; Yuan, Chunxue; Mori, Chigusa; Hara, Mitsuo; Seki, Takahiro; Camacho, Cristopher; Irle, Stephan; Yamaguchi, Shigehiro
2016-07-01
Liquid crystals (LCs) provide a suitable platform for exploiting structural motions of molecules in a condensed phase. Amplification of the structural changes enables a variety of technologies not only in LC displays but also in other applications. Until very recently, however, a practical use of LCs for removable adhesives had not been explored, although a spontaneous disorganization of LC materials can be easily triggered by light-induced isomerization of photoactive components. The difficulty of such an application derives from the requirement to simultaneously achieve sufficient bonding strength and its rapid disappearance upon photoirradiation. Here we report a dynamic molecular LC material that meets these requirements. Columnar-stacked V-shaped carbon frameworks display sufficient bonding strength even under heating conditions, while their bonding ability is immediately lost by a light-induced self-melting function. The light-melt adhesive is reusable and its fluorescence colour reversibly changes during the cycle, visualizing the bonding/nonbonding phases of the adhesive.
Building a Framework for Engineering Design Experiences in STEM: A Synthesis
ERIC Educational Resources Information Center
Denson, Cameron D.
2011-01-01
Since the inception of the National Center for Engineering and Technology Education in 2004, educators and researchers have struggled to identify the necessary components of a "good" engineering design challenge for high school students. In reading and analyzing the position papers on engineering design many themes emerged that may begin to form a…
Measuring adverse events in helicopter emergency medical services: establishing content validity.
Patterson, P Daniel; Lave, Judith R; Martin-Gill, Christian; Weaver, Matthew D; Wadas, Richard J; Arnold, Robert M; Roth, Ronald N; Mosesso, Vincent N; Guyette, Francis X; Rittenberger, Jon C; Yealy, Donald M
2014-01-01
We sought to create a valid framework for detecting adverse events (AEs) in the high-risk setting of helicopter emergency medical services (HEMS). We assembled a panel of 10 expert clinicians (n = 6 emergency medicine physicians and n = 4 prehospital nurses and flight paramedics) affiliated with a large multistate HEMS organization in the Northeast US. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the content validity index (CVI), to quantify the validity of the framework's content. The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: (1) a trigger tool, (2) a method for rating proximal cause, and (3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. We demonstrate a standardized process for the development of a content-valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS.
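The content validity index used to quantify the framework's validity has a standard computation: the item-level CVI is the proportion of experts rating an item as relevant (3 or 4 on a 4-point scale), and a scale-level average can be taken across items. The sketch below shows this widely applied calculation; the ratings are made-up examples, not the study's data.

```python
def item_cvi(ratings):
    """ratings: one relevance rating (1-4) per expert for a single item.
    I-CVI = proportion of experts rating the item 3 or 4."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi_ave(item_ratings):
    # S-CVI/Ave: mean of the item-level CVIs across all items.
    cvis = [item_cvi(r) for r in item_ratings]
    return sum(cvis) / len(cvis)

# Hypothetical panel of 10 experts rating two framework components.
panel = [
    [4, 4, 3, 2, 4, 3, 4, 3, 4, 4],  # component 1: 9/10 rate relevant
    [3, 3, 3, 3, 2, 2, 4, 4, 4, 4],  # component 2: 8/10 rate relevant
]
```

A common rule of thumb with panels of this size is to retain items with I-CVI around 0.78 or higher, which is how CVI findings "isolate components of the framework considered content valid."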
Design document for the Surface Currents Data Base (SCDB) Management System (SCDBMS), version 1.0
NASA Technical Reports Server (NTRS)
Krisnnamagaru, Ramesh; Cesario, Cheryl; Foster, M. S.; Das, Vishnumohan
1994-01-01
The Surface Currents Database Management System (SCDBMS) provides access to the Surface Currents Data Base (SCDB) which is maintained by the Naval Oceanographic Office (NAVOCEANO). The SCDBMS incorporates database technology in providing seamless access to surface current data. The SCDBMS is an interactive software application with a graphical user interface (GUI) that supports user control of SCDBMS functional capabilities. The purpose of this document is to define and describe the structural framework and logistical design of the software components/units which are integrated into the major computer software configuration item (CSCI) identified as the SCDBMS, Version 1.0. The preliminary design is based on functional specifications and requirements identified in the governing Statement of Work prepared by the Naval Oceanographic Office (NAVOCEANO) and distributed as a request for proposal by the National Aeronautics and Space Administration (NASA).
From pull-down data to protein interaction networks and complexes with biological relevance.
Zhang, Bing; Park, Byung-Hoon; Karpinets, Tatiana; Samatova, Nagiza F
2008-04-01
Recent improvements in high-throughput Mass Spectrometry (MS) technology have expedited genome-wide discovery of protein-protein interactions by providing a capability of detecting protein complexes in a physiological setting. Computational inference of protein interaction networks and protein complexes from MS data is challenging. Advances are required in developing robust and seamlessly integrated procedures for assessment of protein-protein interaction affinities, mathematical representation of protein interaction networks, discovery of protein complexes and evaluation of their biological relevance. A multi-step but easy-to-follow framework for identifying protein complexes from MS pull-down data is introduced. It assesses interaction affinity between two proteins based on similarity of their co-purification patterns derived from MS data. It constructs a protein interaction network by adopting a knowledge-guided threshold selection method. Based on the network, it identifies protein complexes and infers their core components using a graph-theoretical approach. It deploys a statistical evaluation procedure to assess biological relevance of each found complex. On Saccharomyces cerevisiae pull-down data, the framework outperformed other more complicated schemes by at least 10% in F1-measure and identified 610 protein complexes with high functional homogeneity based on the enrichment in Gene Ontology (GO) annotation. Manual examination of the complexes brought forward hypotheses on the causes of false identifications. Namely, co-purification of different protein complexes as mediated by a common non-protein molecule, such as DNA, might be a source of false positives. Protein identification bias in pull-down technology, such as the hydrophilic bias, could result in false negatives.
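The core steps the abstract names, scoring affinity from co-purification similarity, thresholding into a network, and extracting complexes graph-theoretically, can be sketched with cosine similarity and connected components. The similarity measure, threshold, and component extraction here are simplified stand-ins for the paper's knowledge-guided choices.

```python
from math import sqrt
from collections import defaultdict, deque

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu, nv = sqrt(sum(a * a for a in u)), sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def complexes(profiles, thresh=0.7):
    """profiles: protein -> co-purification vector (one entry per
    pull-down experiment). Builds a thresholded similarity network and
    returns its connected components as candidate complexes."""
    names = list(profiles)
    adj = defaultdict(set)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if cosine(profiles[a], profiles[b]) >= thresh:
                adj[a].add(b)
                adj[b].add(a)
    seen, comps = set(), []
    for n in names:
        if n in seen or not adj[n]:
            continue  # skip proteins with no interaction partners
        queue, comp = deque([n]), set()
        while queue:
            cur = queue.popleft()
            if cur in seen:
                continue
            seen.add(cur)
            comp.add(cur)
            queue.extend(adj[cur] - seen)
        comps.append(comp)
    return comps
```

Each returned component would then be scored for biological relevance, e.g. by GO-annotation enrichment as the study does, before being reported as a complex.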
NASA Astrophysics Data System (ADS)
Parvinnia, Elham; Khayami, Raouf; Ziarati, Koorush
Virtual collaborative networks are composed of small companies that can take the most advantage of market opportunities and compete with large companies. Several frameworks have therefore been introduced for implementing this type of collaboration, although none of them has been fully standardized. In this paper we specify some aspects that need to be standardized for implementing virtual enterprises. Then, a framework is suggested for implementing virtual collaborative networks. Finally, based on that suggestion, as a case study, we design a virtual collaborative network in the automobile components production industry.
ERIC Educational Resources Information Center
Maloy, Robert W.; Poirier, Michelle; Smith, Hilary K.; Edwards, Sharon A.
2010-01-01
This article explores using a wiki, one of the newest forms of interactive computer-based technology, as a resource for teaching the Massachusetts K-12 History and Social Science Curriculum Framework, a set of state-mandated learning standards. Wikis are web pages that can be easily edited by multiple authors. They invite active involvement by…
ERIC Educational Resources Information Center
Kilbourn, Brent; Alvarez, Isabel
2008-01-01
This paper argues for understanding ICT from the standpoint of philosophical world views. We present a framework, based on Pepper's root-metaphors (Formism, Contextualism, Mechanism, Organicism, and Animism/Mysticism) and we illustrate the use of the framework by looking at a common example of ICT: e-mail. It is argued that such a framework is…
Herrgård, Markus; Sukumara, Sumesh; Campodonico, Miguel; Zhuang, Kai
2015-12-01
In recent years, bio-based chemicals have gained interest as a renewable alternative to petrochemicals. However, there is a significant need to assess the technological, biological, economic and environmental feasibility of bio-based chemicals, particularly during the early research phase. Recently, the Multi-scale framework for Sustainable Industrial Chemicals (MuSIC) was introduced to address this issue by integrating modelling approaches at different scales ranging from cellular to ecological scales. This framework can be further extended by incorporating modelling of the petrochemical value chain and the de novo prediction of metabolic pathways connecting existing host metabolism to desirable chemical products. This multi-scale, multi-disciplinary framework for quantitative assessment of bio-based chemicals will play a vital role in supporting engineering, strategy and policy decisions as we progress towards a sustainable chemical industry. © 2015 Authors; published by Portland Press Limited.
DOT National Transportation Integrated Search
2017-01-01
The findings from the proof of concept with mechanics-based models for flexible base suggest additional validation work should be performed, draft construction specification frameworks should be developed, and work extending the technology to stabili...
Koldijk, Saskia; Kraaij, Wessel
2016-01-01
Background Stress in office environments is a big concern, often leading to burn-out. New technologies are emerging, such as easily available sensors, contextual reasoning, and electronic coaching (e-coaching) apps. In the Smart Reasoning for Well-being at Home and at Work (SWELL) project, we explore the potential of using such new pervasive technologies to provide support for the self-management of well-being, with a focus on individuals' stress-coping. Ideally, these new pervasive systems should be grounded in existing work stress and intervention theory. However, there is a large diversity of theories and they hardly provide explicit directions for technology design. Objective The aim of this paper is to present a comprehensive and concise framework that can be used to design pervasive technologies that support knowledge workers to decrease stress. Methods Based on a literature study we identify concepts relevant to well-being at work and select different work stress models to find causes of work stress that can be addressed. From a technical perspective, we then describe how sensors can be used to infer stress and the context in which it appears, and use intervention theory to further specify interventions that can be provided by means of pervasive technology. Results The resulting general framework relates several relevant theories: we relate “engagement and burn-out” to “stress”, and describe how relevant aspects can be quantified by means of sensors. We also outline underlying causes of work stress and how these can be addressed with interventions, in particular utilizing new technologies integrating behavioral change theory. Based upon this framework we were able to derive requirements for our case study, the pervasive SWELL system, and we implemented two prototypes. Small-scale user studies proved the value of the derived technology-supported interventions. 
Conclusions The presented framework can be used to systematically develop theory-based technology-supported interventions to address work stress. In the area of pervasive systems for well-being, we identified the following six key research challenges and opportunities: (1) performing multi-disciplinary research, (2) interpreting personal sensor data, (3) relating measurable aspects to burn-out, (4) combining strengths of human and technology, (5) privacy, and (6) ethics. PMID:27380749
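The SWELL idea of inferring stress from sensor data and triggering interventions can be caricatured as a weighted score over sensor-derived indicators with threshold-based coaching actions. Every feature name, weight, and threshold below is a hypothetical illustration; the actual SWELL system uses richer contextual reasoning.

```python
def stress_score(features, weights=None):
    """features: normalized 0-1 sensor-derived indicators.
    Returns a weighted stress estimate in [0, 1]."""
    weights = weights or {
        "hr_elevated": 0.4,      # e.g. from a wearable heart-rate sensor
        "typing_errors": 0.2,    # e.g. from keyboard logging
        "window_switches": 0.2,  # e.g. from desktop activity monitoring
        "late_hours": 0.2,       # e.g. from calendar/clock context
    }
    return sum(w * features.get(k, 0.0) for k, w in weights.items())

def suggest_intervention(score):
    # Threshold-based e-coaching actions (illustrative cut-offs).
    if score >= 0.7:
        return "take a break"
    if score >= 0.4:
        return "breathing exercise reminder"
    return "no action"
```

Grounding such scores and thresholds in work-stress and intervention theory, rather than picking them ad hoc, is exactly the gap the paper's framework is meant to close.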
Zhao, Lei; Lim Choi Keung, Sarah N; Taweel, Adel; Tyler, Edward; Ogunsina, Ire; Rossiter, James; Delaney, Brendan C; Peterson, Kevin A; Hobbs, F D Richard; Arvanitis, Theodoros N
2012-01-01
Heterogeneous data models and coding schemes for electronic health records present challenges for automated search across distributed data sources. This paper describes a loosely coupled software framework based on a terminology-controlled approach to enable interoperation between the search interface and heterogeneous data sources. Software components interoperate via a common terminology service and an abstract criteria model, so as to promote component reuse and incremental system evolution.
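The core of the terminology-controlled approach is a shared mapping from source-specific codes to common concepts. A minimal sketch of that idea follows, assuming a dict-backed lookup; the code systems and codes shown are illustrative, not the framework's actual vocabulary:

```python
# Hypothetical mapping from (code system, local code) pairs to concepts in
# a common terminology. A real terminology service would back this with a
# maintained vocabulary rather than a hard-coded dict.
COMMON_CONCEPTS = {
    ("READ2", "G30.."): "myocardial_infarction",
    ("ICD10", "I21"): "myocardial_infarction",
}

def to_common_concept(system, code):
    """Translate a source-specific (system, code) pair into the common
    terminology used by the search interface."""
    try:
        return COMMON_CONCEPTS[(system, code)]
    except KeyError:
        raise LookupError(f"no mapping for {system}:{code}")

# Two records coded in heterogeneous schemes resolve to one searchable concept:
concept_a = to_common_concept("READ2", "G30..")
concept_b = to_common_concept("ICD10", "I21")
```

Because the search interface only ever sees common concepts, a new data source can be added by extending the mapping, without changing the search components.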
Development and implications of technology in reform-based physics laboratories
NASA Astrophysics Data System (ADS)
Chen, Sufen; Lo, Hao-Chang; Lin, Jing-Wen; Liang, Jyh-Chong; Chang, Hsin-Yi; Hwang, Fu-Kwun; Chiou, Guo-Li; Wu, Ying-Tien; Lee, Silvia Wen-Yu; Wu, Hsin-Kai; Wang, Chia-Yu; Tsai, Chin-Chung
2012-12-01
Technology has become widely used in science research. Researchers are now applying it to science education in an attempt to bring students' science activities closer to authentic science activities. The present study synthesizes the research to discuss the development of technology-enhanced laboratories and how technology may contribute to fulfilling the instructional objectives of laboratories in physics. More specifically, this paper discusses the use of technology to innovate physics laboratories and the potential of technology to promote inquiry, instructor and peer interaction, and learning outcomes. We then construct a framework for teachers, scientists, and programmers to guide and evaluate technology-integrated laboratories. The framework includes inquiry learning and openness supported by technology, ways of conducting laboratories, and the diverse learning objectives on which a technology-integrated laboratory may focus.
ERIC Educational Resources Information Center
Lee, Chia-Jung; Kim, ChanMin
2017-01-01
This paper presents the third version of a technological pedagogical content knowledge (TPACK) based instructional design model that incorporates the distinctive, transformative, and integrative views of TPACK into a comprehensive actionable framework. Strategies of relating TPACK domains to real-life learning experiences, role-playing, and…
Analyzing the development of Indonesia shrimp industry
NASA Astrophysics Data System (ADS)
Wati, L. A.
2018-04-01
This research aimed to analyze the development of the shrimp industry in Indonesia, using Porter's Diamond Theory as the analytical framework. Porter's Diamond is a framework for industry analysis and business strategy development. (Porter's separate Five Forces model identifies five forces that determine the competitive intensity of an industry: (1) the threat of substitute products, (2) the threat of established rivals, (3) the threat of new entrants, (4) the bargaining power of suppliers, and (5) the bargaining power of consumers.) The Diamond analysis examines four main components, namely factor conditions; demand conditions; related and supporting industries; and firm strategy, structure, and rivalry, coupled with two supporting components (government regulation and the factor of chance). The analysis indicates that the development of the Indonesian shrimp industry is fairly good: the two supporting components (government regulation and chance) have a positive effect; related and supporting industries have a negative effect; firm strategy and structure have a negative effect; rivalry has a positive effect; and factor conditions have a positive effect (except for science and technology resources).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szoka de Valladares, M.R.; Mack, S.
The DOE Hydrogen Program needs to develop criteria as part of a systematic evaluation process for proposal identification, evaluation, and selection. The H Scan component of this process provides a framework in which a project proposer can fully describe a candidate technology system and its components. The H Scan complements traditional methods of capturing cost and technical information. It consists of a special set of survey forms designed to elicit information so expert reviewers can assess the proposal relative to DOE-specified selection criteria. The Analytic Hierarchy Process (AHP) component of the decision process assembles the management-defined evaluation and selection criteria into a coherent multi-level decision construct by which projects can be evaluated in pair-wise comparisons. The AHP model reflects management's objectives and assists in ranking individual projects based on the extent to which each contributes to those objectives. This paper contains a detailed description of the products and activities associated with the planning and evaluation process: the objectives or criteria, the H Scan, and the Analytic Hierarchy Process (AHP).
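The AHP step described above turns pair-wise comparisons of criteria into priority weights. A minimal sketch of one standard way to do this, the geometric-mean method, is shown below; the three criteria and the comparison values are invented for illustration, not taken from the DOE process:

```python
import math

def ahp_weights(matrix):
    """Derive priority weights from a square pairwise comparison matrix
    using the geometric-mean (logarithmic least squares) method: take the
    geometric mean of each row, then normalize the means to sum to 1."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical criteria: cost, technical merit, schedule risk.
# matrix[i][j] expresses how strongly criterion i is preferred over
# criterion j on Saaty's 1-9 scale; matrix[j][i] is the reciprocal.
pairwise = [
    [1,     3,     5],
    [1 / 3, 1,     3],
    [1 / 5, 1 / 3, 1],
]
weights = ahp_weights(pairwise)
```

Each project is then scored against the criteria and ranked by its weighted total; a full AHP implementation would also compute a consistency ratio to flag contradictory comparisons.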
Generational Differences in Technology Adoption in Community Colleges
ERIC Educational Resources Information Center
Rosario, Victoria C.
2012-01-01
This research study investigated the technological perceptions and expectations of community college students, faculty, administrators, and Information Technology (IT) staff. The theoretical framework is based upon two assumptions on the process of technological innovation: it can be explained by diffusion of adoption theory, and by studying the…
ERIC Educational Resources Information Center
Darrah, Johanna; O'Donnell, Maureen; Lam, Joyce; Story, Maureen; Wickenheiser, Diane; Xu, Kaishou; Jin, Xiaokun
2013-01-01
Clinical practice frameworks are a valuable component of clinical education, promoting informed clinical decision making based on the best available evidence and/or clinical experience. They encourage standardized intervention approaches and evaluation of practice. Based on an international project to support the development of an enhanced service…
Towards a Conceptual Framework of GBL Design for Engagement and Learning of Curriculum-Based Content
ERIC Educational Resources Information Center
Jabbar, Azita Iliya Abdul; Felicia, Patrick
2016-01-01
This paper aims to show best practices of GBL design for engagement. It intends to show how teachers can implement GBL in a collaborative, comprehensive and systematic way, in the classrooms, and probably outside the classrooms, based on empirical evidence and theoretical framework designed accordingly. This paper presents the components needed to…
A Program Structure for Event-Based Speech Synthesis by Rules within a Flexible Segmental Framework.
ERIC Educational Resources Information Center
Hill, David R.
1978-01-01
A program structure based on recently developed techniques for operating system simulation has the required flexibility for use as a speech synthesis algorithm research framework. This program makes synthesis possible with less rigid time and frequency-component structure than simpler schemes. It also meets real-time operation and memory-size…
A First Step Forward: Context Assessment
ERIC Educational Resources Information Center
Conner, Ross F.; Fitzpatrick, Jody L.; Rog, Debra J.
2012-01-01
In this chapter, we revisit and expand the context framework of Debra Rog, informed by three cases and by new aspects that we have identified. We then propose a way to move the framework into action, making context explicit. Based on the framework's components, we describe and illustrate a process we label context assessment (CA), which provides a…
USDA-ARS?s Scientific Manuscript database
AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic and water quality (H/WQ) simulation components under the Java Connection Framework (JCF) and the Object Modeling System (OMS) environmental modeling framework. AgES-W is implicitly scala...
Technological Alternatives to Paper-Based Components of Team-Based Learning
ERIC Educational Resources Information Center
Robinson, Daniel H.; Walker, Joshua D.
2008-01-01
The authors have been using components of team-based learning (TBL) in two undergraduate courses at the University of Texas for several years: an educational psychology survey course--Cognition, Human Learning and Motivation--and Introduction to Statistics. In this chapter, they describe how they used technology in classes of fifty to seventy…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.
Myneni, Sahiti; Amith, Muhammad; Geng, Yimin; Tao, Cui
2015-01-01
Adolescent and Young Adult (AYA) cancer survivors manage an array of health-related issues. Survivorship Care Plans (SCPs) have the potential to empower these young survivors by providing information regarding treatment summary, late-effects of cancer therapies, healthy lifestyle guidance, coping with work-life-health balance, and follow-up care. However, current mHealth infrastructure used to deliver SCPs has been limited in terms of flexibility, engagement, and reusability. The objective of this study is to develop an ontology-driven survivor engagement framework to facilitate rapid development of mobile apps that are targeted, extensible, and engaging. The major components include ontology models, patient engagement features, and behavioral intervention technologies. We apply the proposed framework to characterize individual building blocks ("survivor digilegos"), which form the basis for mHealth tools that address user needs across the cancer care continuum. Results indicate that the framework (a) allows identification of AYA survivorship components, (b) facilitates infusion of engagement elements, and (c) integrates behavior change constructs into the design architecture of survivorship applications. Implications for design of patient-engaging chronic disease management solutions are discussed.
Technological progress as a driver of innovation in infant foods.
Ferruzzi, Mario G; Neilson, Andrew P
2010-01-01
Advances in nutrition and food sciences are interrelated components of the innovative framework for infant formula and foods. While nutrition science continues to define the composition and functionality of human milk as a reference, food ingredient, formulation, and processing technologies facilitate the design and delivery of nutritional and functional concepts in infant products. Expanding knowledge of both nutritive and non-nutritive components of human milk and their functionality guides the selection and development of novel ingredient, formulation, and processing methods to generate enhanced infant products targeting benefits including healthy growth and development as well as protection of health through the life cycle. In this chapter, the identification and application of select novel ingredients and technologies are discussed in the context of how these technological advancements have stimulated innovation in infant foods. Special focus is given to advancements in protein technologies, as well as bioactive long-chain polyunsaturated fatty acids, prebiotics, and probiotics, which have allowed infant formula composition, and more critically functionality, to align more closely with that of human milk. Copyright © 2010 S. Karger AG, Basel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorokin, A.P.; Rimkevich, V.S.; Dem'yanova, L.P.
2009-05-15
Based on the physico-technical operations involved in mineral processing technologies, optimal production conditions are found for refractory fiber materials, aluminium, silicon, their compounds, and other valuable components. Ecologically safe and efficient aggregate technologies are developed for the recovery of valuable components from nonmetallic minerals and brown coals.
Kobak, Roger; Zajac, Kristyn; Herres, Joanna; Krauthamer Ewing, E Stephanie
2015-01-01
The emergence of attachment-based treatments (ABTs) for adolescents highlights the need to more clearly define and evaluate these treatments in the context of other attachment-based treatments for young children and adults. We propose a general framework for defining and evaluating ABTs that describes the cyclical processes required to maintain a secure attachment bond. This secure cycle incorporates three components: (1) the child's or adult's internal working model (IWM) of the caregiver; (2) emotionally attuned communication; and (3) the caregiver's IWM of the child or adult. We briefly review Bowlby's, Ainsworth's, and Main's contributions to defining the components of the secure cycle and discuss how this framework can be adapted for understanding the process of change in ABTs. For clinicians working with adolescents, our model can be used to identify how deviations from the secure cycle (attachment injuries, empathic failures, and mistuned communication) contribute to family distress and psychopathology. The secure cycle also provides a way of describing the ABT elements that have been used to revise IWMs or improve emotionally attuned communication. For researchers, our model provides a guide for conceptualizing and measuring change in attachment constructs and how change in one component of the interpersonal cycle should generalize to other components.
JAMS - a software platform for modular hydrological modelling
NASA Astrophysics Data System (ADS)
Kralisch, Sven; Fischer, Christian
2015-04-01
Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand an ever-increasing integration of data and process knowledge in the corresponding simulation models. Software frameworks that allow for the seamless creation of integrated models based on less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an open-source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach for representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models have been implemented and successfully applied in recent years. We will present the JAMS core concepts and give an overview of the models, simulation components, and support tools available for the framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.
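The separation of process components from a declarative XML model description can be illustrated with a toy sketch. The element names, component names, and process formulas below are invented for illustration and are not JAMS's actual schema or components:

```python
import xml.etree.ElementTree as ET

# Process simulation components as plain callables over a model state.
# The 0.9 and 0.3 factors are arbitrary placeholders, not real hydrology.
COMPONENTS = {
    "interception": lambda state: {**state, "rain": state["rain"] * 0.9},
    "soil_water": lambda state: {**state, "runoff": state["rain"] * 0.3},
}

# A declarative model description: which components run, and in what order.
MODEL_XML = """
<model name="demo-catchment">
  <component ref="interception"/>
  <component ref="soil_water"/>
</model>
"""

def run_model(xml_text, state):
    """Interpret the XML description, applying each referenced component
    to the state in document order."""
    root = ET.fromstring(xml_text)
    for comp in root.findall("component"):
        state = COMPONENTS[comp.get("ref")](state)
    return state

out = run_model(MODEL_XML, {"rain": 10.0})
```

The payoff of this design is that a new model variant is a new XML document, not new code: components are reused unchanged and the model structure stays inspectable.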
Evaluation of Life Cycle Assessment (LCA) for Roadway Drainage Systems.
Byrne, Diana M; Grabowski, Marta K; Benitez, Amy C B; Schmidt, Arthur R; Guest, Jeremy S
2017-08-15
Roadway drainage design has traditionally focused on cost-effectively managing water quantity; however, runoff carries pollutants, posing risks to the local environment and public health. Additionally, construction and maintenance incur costs and contribute to global environmental impacts. While life cycle assessment (LCA) can potentially capture local and global environmental impacts of roadway drainage and other stormwater systems, LCA methodology must be evaluated because stormwater systems differ from wastewater and drinking water systems to which LCA is more frequently applied. To this end, this research developed a comprehensive model linking roadway drainage design parameters to LCA and life cycle costing (LCC) under uncertainty. This framework was applied to 10 highway drainage projects to evaluate LCA methodological choices by characterizing environmental and economic impacts of drainage projects and individual components (basin, bioswale, culvert, grass swale, storm sewer, and pipe underdrain). The relative impacts of drainage components varied based on functional unit choice. LCA inventory cutoff criteria evaluation showed the potential for cost-based criteria, which performed better than mass-based criteria. Finally, the local aquatic benefits of grass swales and bioswales offset global environmental impacts for four impact categories, highlighting the need to explicitly consider local impacts (i.e., direct emissions) when evaluating drainage technologies.
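The cutoff-criteria comparison above can be made concrete with a small sketch: a cutoff keeps only inventory items whose share of the total (by mass or by cost) meets a threshold. The component names and numbers below are invented for illustration, not data from the study:

```python
def apply_cutoff(items, key, threshold):
    """Keep inventory items whose share of the total, measured by `key`
    (e.g. 'mass' or 'cost'), is at least `threshold`."""
    total = sum(item[key] for item in items)
    return [item for item in items if item[key] / total >= threshold]

# Hypothetical drainage inventory (units arbitrary for the sketch).
inventory = [
    {"name": "concrete", "mass": 900.0, "cost": 50.0},
    {"name": "pipe", "mass": 80.0, "cost": 30.0},
    {"name": "geomembrane", "mass": 20.0, "cost": 20.0},
]

by_mass = apply_cutoff(inventory, "mass", 0.05)  # low-mass liner is cut
by_cost = apply_cutoff(inventory, "cost", 0.05)  # all three survive
```

The sketch shows why a cost-based criterion can perform better: a low-mass but costly component (here the geomembrane) survives a cost cutoff but is silently dropped by a mass cutoff, even though it may carry significant environmental impact.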
ERIC Educational Resources Information Center
Hechter, Richard; Vermette, Laurie Anne
2014-01-01
This paper examines the technology integration practices of Manitoban K-12 inservice science educators based on the Technological, Pedagogical, and Content knowledge (TPACK) framework. Science teachers (n = 433) completed a 10-item online survey regarding pedagogical beliefs about technology integration, types of technology used, and how often…
Ammenwerth, Elske; Iller, Carola; Mahler, Cornelia
2006-01-01
Background Factors of IT adoption have been discussed widely in the literature. However, existing frameworks (such as TAM or TTF) fail to include one important aspect: the interaction between user and task. Method Based on a literature study and a case study, we developed the FITT framework to help analyse the socio-organisational-technical factors that influence IT adoption in a health care setting. Results Our FITT framework ("Fit between Individuals, Task and Technology") is based on the idea that IT adoption in a clinical environment depends on the fit between the attributes of the individual users (e.g. computer anxiety, motivation), the attributes of the technology (e.g. usability, functionality, performance), and the attributes of the clinical tasks and processes (e.g. organisation, task complexity). We used this framework in the retrospective analysis of a three-year case study describing the adoption of a nursing documentation system in various departments of a German University Hospital. We show how the FITT framework helped us analyse the process of IT adoption during an IT implementation: we were able to describe every IT adoption problem we found with regard to the three fit dimensions, and any intervention on the fit can be described with regard to the three objects of the FITT framework (individual, task, technology). We also derive facilitators of and barriers to IT adoption of clinical information systems. Conclusion This work should support a better understanding of the reasons for IT adoption failures and therefore enable better-prepared and more successful IT introduction projects. We discuss, however, that from a more epistemological point of view, it may be difficult or even impossible to analyse the complex and interacting factors that predict the success or failure of IT projects in a socio-technical environment. PMID:16401336
NASA Astrophysics Data System (ADS)
Lemmen, Carsten; Hofmeister, Richard; Klingbeil, Knut; Hassan Nasermoaddeli, M.; Kerimoglu, Onur; Burchard, Hans; Kösters, Frank; Wirtz, Kai W.
2018-03-01
Shelf and coastal sea processes extend from the atmosphere through the water column and into the seabed. These processes reflect intimate interactions between physical, chemical, and biological states on multiple scales. As a consequence, coastal system modelling requires a high and flexible degree of process and domain integration; this has so far hardly been achieved by current model systems. The lack of modularity and flexibility in integrated models hinders the exchange of data and model components and has historically imposed the supremacy of specific physical driver models. We present the Modular System for Shelves and Coasts (MOSSCO; http://www.mossco.de), a novel domain and process coupling system tailored but not limited to the coupling challenges of and applications in the coastal ocean. MOSSCO builds on the Earth System Modeling Framework (ESMF) and on the Framework for Aquatic Biogeochemical Models (FABM). It goes beyond existing technologies by creating a unique level of modularity in both domain and process coupling, including a clear separation of component and basic model interfaces, flexible scheduling of several tens of models, and facilitation of iterative development at the lab and the station and on the coastal ocean scale. MOSSCO is rich in metadata and its concepts are also applicable outside the coastal domain. For coastal modelling, it contains dozens of example coupling configurations and tested set-ups for coupled applications. Thus, MOSSCO addresses the technology needs of a growing marine coastal Earth system community that encompasses very different disciplines, numerical tools, and research questions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burtch, Nicholas C.; Heinen, Jurn; Bennett, Thomas D.
We report that some of the most remarkable recent developments in metal–organic framework (MOF) performance properties can only be rationalized by the mechanical properties endowed by their hybrid inorganic–organic nanoporous structures. While these characteristics create intriguing application prospects, the same attributes also present challenges that will need to be overcome to enable the integration of MOFs with technologies where these promising traits can be exploited. In this review, emerging opportunities and challenges are identified for MOF-enabled device functionality and technological applications that arise from their fascinating mechanical properties. This is discussed not only in the context of their more well-studied gas storage and separation applications, but also for instances where MOFs serve as components of functional nanodevices. Recent advances in understanding MOF mechanical structure–property relationships due to attributes such as defects and interpenetration are highlighted, and open questions related to state-of-the-art computational approaches for quantifying their mechanical properties are critically discussed.
Burtch, Nicholas C; Heinen, Jurn; Bennett, Thomas D; Dubbeldam, David; Allendorf, Mark D
2017-11-17
Some of the most remarkable recent developments in metal-organic framework (MOF) performance properties can only be rationalized by the mechanical properties endowed by their hybrid inorganic-organic nanoporous structures. While these characteristics create intriguing application prospects, the same attributes also present challenges that will need to be overcome to enable the integration of MOFs with technologies where these promising traits can be exploited. In this review, emerging opportunities and challenges are identified for MOF-enabled device functionality and technological applications that arise from their fascinating mechanical properties. This is discussed not only in the context of their more well-studied gas storage and separation applications, but also for instances where MOFs serve as components of functional nanodevices. Recent advances in understanding MOF mechanical structure-property relationships due to attributes such as defects and interpenetration are highlighted, and open questions related to state-of-the-art computational approaches for quantifying their mechanical properties are critically discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Arkhipkin, D.; Lauret, J.
2017-10-01
One of the STAR experiment's modular Messaging Interface and Reliable Architecture framework (MIRA) integration goals is to provide seamless and automatic connections with the existing control systems. After an initial proof of concept and operation of the MIRA system as a parallel data collection system for online use and real-time monitoring, the STAR Software and Computing group is now working on the integration of the Experimental Physics and Industrial Control System (EPICS) with MIRA's interfaces. The goals of this integration are to allow functional interoperability and, later on, to replace the existing legacy Detector Control System components at the service level. In this report, we describe the evolutionary integration process and, as an example, discuss the EPICS Alarm Handler conversion. We review the complete upgrade procedure, starting with the integration of EPICS-originated alarm signal propagation into MIRA, followed by the replacement of the existing operator interface based on the Motif Editor and Display Manager (MEDM) with a modern, portable, web-based Alarm Handler interface. To achieve this aim, we built an EPICS-to-MQTT [8] bridging service and recreated the functionality of the original Alarm Handler using low-latency web messaging technologies. The integration of EPICS alarm handling into our messaging framework allowed STAR to improve the DCS alarm awareness of existing STAR DAQ and RTS services, which use MIRA as a primary source of experiment control information.
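The heart of such a bridge is a translation from EPICS-style alarm updates to MQTT topics and payloads. The sketch below shows that translation as a pure function; the field names, topic layout, and process-variable name are illustrative assumptions, not the STAR implementation:

```python
import json

# EPICS alarm severity codes and their conventional names.
SEVERITIES = {0: "NO_ALARM", 1: "MINOR", 2: "MAJOR", 3: "INVALID"}

def alarm_to_mqtt(pv_name, severity, value):
    """Map one alarm update to a (topic, payload) pair ready to hand to an
    MQTT client for publishing. Colons in the PV name become topic levels
    so subscribers can use MQTT wildcards (e.g. 'mira/alarms/tpc/#')."""
    topic = "mira/alarms/" + pv_name.replace(":", "/")
    payload = json.dumps({
        "pv": pv_name,
        "severity": SEVERITIES.get(severity, "UNKNOWN"),
        "value": value,
    })
    return topic, payload

# A hypothetical pressure alarm on a detector subsystem:
topic, payload = alarm_to_mqtt("tpc:gas:pressure", 2, 1013.2)
```

Keeping the translation pure (no broker I/O inside it) makes the bridge easy to test and lets the same mapping feed both the web Alarm Handler and other MQTT subscribers.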
Proton conduction in metal-organic frameworks and related modularly built porous solids.
Yoon, Minyoung; Suh, Kyungwon; Natarajan, Srinivasan; Kim, Kimoon
2013-03-04
Proton-conducting materials are an important component of fuel cells. Development of new types of proton-conducting materials is one of the most important issues in fuel-cell technology. Herein, we present newly developed proton-conducting materials, modularly built porous solids, including coordination polymers (CPs) or metal-organic frameworks (MOFs). The designable and tunable nature of the porous materials allows for fast development in this research field. Design and synthesis of the new types of proton-conducting materials and their unique proton-conduction properties are discussed. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Practical use of a framework for network science experimentation
NASA Astrophysics Data System (ADS)
Toth, Andrew; Bergamaschi, Flavio
2014-06-01
In 2006, the US Army Research Laboratory (ARL) and the UK Ministry of Defence (MoD) established a collaborative research alliance with academia and industry, called the International Technology Alliance (ITA) in Network and Information Sciences, to address fundamental issues concerning network and information sciences that will enhance decision making for coalition operations, enable rapid, secure formation of ad hoc teams in coalition environments, and enhance US and UK capabilities to conduct coalition warfare. Research conducted under the ITA was extended through collaboration between ARL and IBM UK to characterize and define a software stack and tooling that has become the reference framework for network science experimentation in support of validation of theoretical research. This paper discusses the composition of the reference framework for experimentation resulting from the ARL/IBM UK collaboration and its use, by the Network Science Collaborative Technology Alliance (NS CTA), in a recent network science experiment conducted at ARL. It also discusses how the experiment was modeled using the reference framework, the integration of two new components (the Apollo Fact-Finder tool and the Medusa Crowd Sensing application), the limitations identified, and how they will be addressed in future work.
Pre-Service Teachers' TPACK Development and Conceptions through a TPACK-Based Course
ERIC Educational Resources Information Center
Durdu, Levent; Dag, Funda
2017-01-01
This study examines pre-service teachers' Technological Pedagogical Content Knowledge (TPACK) development and analyses their conceptions of learning and teaching with technology. With this aim in mind, researchers designed and implemented a computer-based mathematics course based on a TPACK framework. As a research methodology, a parallel mixed…
Troubling STEM: Making a Case for an Ethics/STEM Partnership
NASA Astrophysics Data System (ADS)
Steele, Astrid
2016-06-01
Set against the backdrop of a STEM-based (science, technology, engineering and mathematics) activity in a teacher education science methods class, the author examines the need for ethics education to be partnered with STEM education. To make the case, the origin of the STEM initiative, undertaken and strongly supported by both US government and corporate sources, is briefly recounted. The STSE initiative (science, technology, society and environment) is posited as a counterpoint to STEM. Also considered are: (a) an historical perspective of science and technology as these impact difficult individual and social decision making; (b) STEM knowledge generation considered through the lens of Habermas' threefold knowledge typology; and (c) the experiences of the teacher candidates working through the STEM activity when an ethical challenge is posed. The author demonstrates the need for a moral component for science education and makes the case for a partnership between STEM and ethics education. Further, such a partnership has been shown to increase student enjoyment and motivation for their science studies. Three possible ethical frameworks are examined for their theoretical and practical utility in a science classroom.
IMAGE: A Design Integration Framework Applied to the High Speed Civil Transport
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.
1993-01-01
Effective design of the High Speed Civil Transport requires the systematic application of design resources throughout a product's life-cycle. Information obtained from the use of these resources is used for the decision-making processes of Concurrent Engineering. Integrated computing environments facilitate the acquisition, organization, and use of required information. State-of-the-art computing technologies provide the basis for the Intelligent Multi-disciplinary Aircraft Generation Environment (IMAGE) described in this paper. IMAGE builds upon existing agent technologies by adding a new component called a model. With the addition of a model, the agent can provide accountable resource utilization in the presence of increasing design fidelity. The development of a zeroth-order agent is used to illustrate agent fundamentals. Using a CATIA(TM)-based agent from previous work, a High Speed Civil Transport visualization system linking CATIA, FLOPS, and ASTROS will be shown. These examples illustrate the important role of the agent technologies used to implement IMAGE, and together they demonstrate that IMAGE can provide an integrated computing environment for the design of the High Speed Civil Transport.
GERICOS: A Generic Framework for the Development of On-Board Software
NASA Astrophysics Data System (ADS)
Plasson, P.; Cuomo, C.; Gabriel, G.; Gauthier, N.; Gueguen, L.; Malac-Allain, L.
2016-08-01
This paper presents an overview of the GERICOS framework (GEneRIC Onboard Software), its architecture, its various layers and its future evolutions. The GERICOS framework, developed and qualified by LESIA, offers a set of generic, reusable and customizable software components for the rapid development of payload flight software. The GERICOS framework has a layered structure. The first layer (GERICOS::CORE) implements the concept of active objects and forms an abstraction layer over the top of real-time kernels. The second layer (GERICOS::BLOCKS) offers a set of reusable software components for building flight software based on generic solutions to recurrent functionalities. The third layer (GERICOS::DRIVERS) implements software drivers for several COTS IP cores of the LEON processor ecosystem.
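The GERICOS::CORE notion of an active object (a component owning a private thread and message queue, layered over a real-time kernel) can be illustrated with a minimal sketch. This is the generic design pattern in Python, not the actual flight-qualified implementation for the LEON ecosystem; all names are hypothetical.

```python
import queue
import threading

class ActiveObject:
    """Minimal active object: callers enqueue requests; a private
    thread dequeues and executes them, decoupling caller and callee."""

    def __init__(self):
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, func, *args):
        # Asynchronous invocation: returns immediately.
        self._mailbox.put((func, args))

    def stop(self):
        # Sentinel tells the worker to exit after draining prior messages.
        self._mailbox.put(None)
        self._thread.join()

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:
                break
            func, args = msg
            func(*args)

# Usage: a (hypothetical) telemetry task processed off the caller's thread.
results = []
obj = ActiveObject()
obj.send(results.append, "housekeeping-packet")
obj.stop()
print(results)  # ['housekeeping-packet']
```

Because messages are processed strictly in FIFO order and `stop()` joins the worker, the caller can rely on all previously sent requests having completed once `stop()` returns.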
Rangachari, Pavani
2014-12-01
Despite the federal policy momentum towards "meaningful use" of Electronic Health Records, the healthcare organizational literature remains replete with reports of unintended adverse consequences of implementing Electronic Health Records, including increased work for clinicians, unfavorable workflow changes, and unexpected changes in communication patterns and practices. In addition to being costly and unsafe, these unintended adverse consequences may pose a formidable barrier to "meaningful use" of Electronic Health Records. Correspondingly, it is essential for hospital administrators to understand and detect the causes of unintended adverse consequences, to ensure successful implementation of Electronic Health Records. The longstanding Technology-in-Practice framework emphasizes the role of human agency in enacting structures of technology use, or "technologies-in-practice." Given a set of unintended adverse consequences from health information technology implementation, this framework could help trace them back to specific actions (types of technology-in-practice) and institutional conditions (social structures). On the other hand, the more recent Knowledge-in-Practice framework helps explain how information and communication technologies (e.g., social knowledge networking systems) could be implemented alongside existing technology systems, to create new social structures, generate new knowledge-in-practice, and transform technology-in-practice. Therefore, integrating the two literature streams could serve the dual purpose of understanding and overcoming unintended adverse consequences of Electronic Health Record implementation.
This paper seeks to: (1) review the theoretical literatures on technology use and implementation, and identify a framework for understanding and overcoming unintended adverse consequences of implementing Electronic Health Records; (2) outline a broad project proposal to test the applicability of the framework in enabling "meaningful use" of Electronic Health Records in a healthcare context; and (3) identify strategies for successful implementation of Electronic Health Records in hospitals and health systems, based on the literature review and application.
Borycki, E M; Kushniruk, A W; Bellwood, P; Brender, J
2012-01-01
The objective of this paper is to examine the extent, range and scope to which frameworks, models and theories dealing with technology-induced error have arisen in the biomedical and life sciences literature as indexed by Medline®. To better understand the state of work in the area of technology-induced error involving frameworks, models and theories, the authors conducted a search of Medline® using selected key words identified from seminal articles in this research area. Articles were reviewed, and those pertaining to frameworks, models or theories dealing with technology-induced error were further reviewed by two researchers. All articles from Medline®, from its inception to April of 2011, were searched using the above-outlined strategy. 239 citations were returned. Each of the abstracts for the 239 citations was reviewed by two researchers. Eleven articles met the criteria based on abstract review. These 11 articles were downloaded for further in-depth review. The majority of the articles obtained describe frameworks and models with reference to theories developed in literatures outside of healthcare. The papers were grouped into several areas. It was found that articles drew mainly from three literatures: 1) the human factors literature (including human-computer interaction and cognition), 2) the organizational behavior/sociotechnical literature, and 3) the software engineering literature. A variety of frameworks and models were found in the biomedical and life sciences literatures. These frameworks and models drew upon and extended frameworks, models and theoretical perspectives that have emerged in other literatures. These frameworks and models are informing an emerging line of research in health and biomedical informatics involving technology-induced errors in healthcare.
Optimal Wastewater Loading under Conflicting Goals and Technology Limitations in a Riverine System.
Rafiee, Mojtaba; Lyon, Steve W; Zahraie, Banafsheh; Destouni, Georgia; Jaafarzadeh, Nemat
2017-03-01
This paper investigates a novel simulation-optimization (S-O) framework for identifying optimal treatment levels and treatment processes for multiple wastewater dischargers to rivers. A commonly used water quality simulation model, Qual2K, was linked to a Genetic Algorithm optimization model for exploration of relevant fuzzy objective-function formulations for addressing imprecision and conflicting goals of pollution control agencies and various dischargers. Results showed a dynamic flow dependence of optimal wastewater loading with good convergence to near global optimum. Explicit considerations of real-world technological limitations, which were developed here in a new S-O framework, led to better compromise solutions between conflicting goals than those identified within traditional S-O frameworks. The newly developed framework, in addition to being more technologically realistic, is also less complicated and converges on solutions more rapidly than traditional frameworks. This technique marks a significant step forward for development of holistic, riverscape-based approaches that balance the conflicting needs of the stakeholders.
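The coupling described above, a water-quality simulator driving a Genetic Algorithm with a fuzzy objective, can be sketched with a toy stand-in for Qual2K. The river response, membership functions, and GA parameters below are invented for illustration and are not the paper's calibrated model; the max-min compromise between the agency's water-quality goal and the dischargers' cost goals is the point being shown.

```python
import random

random.seed(1)

def water_quality(treatment_levels):
    # Toy stand-in for a Qual2K run: river dissolved oxygen (mg/L)
    # improves with the average treatment efficiency of the dischargers.
    return 4.0 + 5.0 * sum(treatment_levels) / len(treatment_levels)

def fuzzy_satisfaction(levels):
    # Agency goal: DO approaching 9 mg/L (membership rises from 4 to 9).
    do = water_quality(levels)
    agency = max(0.0, min(1.0, (do - 4.0) / 5.0))
    # Each discharger's goal: low treatment cost (membership falls with level).
    dischargers = [1.0 - x for x in levels]
    # Max-min compromise: the worst-satisfied goal drives the fitness.
    return min([agency] + dischargers)

def genetic_algorithm(n_dischargers=3, pop=30, gens=60):
    popn = [[random.random() for _ in range(n_dischargers)]
            for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fuzzy_satisfaction, reverse=True)
        survivors = popn[: pop // 2]                    # truncation selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = random.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            i = random.randrange(n_dischargers)          # gaussian mutation
            child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.1)))
            children.append(child)
        popn = survivors + children
    return max(popn, key=fuzzy_satisfaction)

best = genetic_algorithm()
print([round(x, 2) for x in best], round(fuzzy_satisfaction(best), 2))
```

With these symmetric toy memberships the compromise settles near a treatment level of 0.5 for every discharger, where the agency's and dischargers' satisfactions balance.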
Higher-Order Theory: Structural/MicroAnalysis Code (HOTSMAC) Developed
NASA Technical Reports Server (NTRS)
Arnold, Steven M.
2002-01-01
The full utilization of advanced materials (be they composite or functionally graded materials) in lightweight aerospace components requires the availability of accurate analysis, design, and life-prediction tools that enable the assessment of component and material performance and reliability. Recently, a new commercially available software product called HOTSMAC (Higher-Order Theory--Structural/MicroAnalysis Code) was jointly developed by Collier Research Corporation, Engineered Materials Concepts LLC, and the NASA Glenn Research Center under funding provided by Glenn's Commercial Technology Office. The analytical framework for HOTSMAC is based on almost a decade of research into the coupled micromacrostructural analysis of heterogeneous materials. Consequently, HOTSMAC offers a comprehensive approach for analyzing/designing the response of components with various microstructural details, including certain advantages not always available in standard displacement-based finite element analysis techniques. The capabilities of HOTSMAC include combined thermal and mechanical analysis, time-independent and time-dependent material behavior, and internal boundary cells (e.g., those that can be used to represent internal cooling passages, see the preceding figure) to name a few. In HOTSMAC problems, materials can be randomly distributed and/or functionally graded (as shown in the figure, wherein the inclusions are distributed linearly), or broken down by strata, such as in the case of thermal barrier coatings or composite laminates.
Neuroanatomical distribution of five semantic components of verbs: evidence from fMRI.
Kemmerer, David; Castillo, Javier Gonzalez; Talavage, Thomas; Patterson, Stephanie; Wiley, Cynthia
2008-10-01
The Simulation Framework, also known as the Embodied Cognition Framework, maintains that conceptual knowledge is grounded in sensorimotor systems. To test several predictions that this theory makes about the neural substrates of verb meanings, we used functional magnetic resonance imaging (fMRI) to scan subjects' brains while they made semantic judgments involving five classes of verbs: Running verbs (e.g., run, jog, walk), Speaking verbs (e.g., shout, mumble, whisper), Hitting verbs (e.g., hit, poke, jab), Cutting verbs (e.g., cut, slice, hack), and Change of State verbs (e.g., shatter, smash, crack). These classes were selected because they vary with respect to the presence or absence of five distinct semantic components: ACTION, MOTION, CONTACT, CHANGE OF STATE, and TOOL USE. Based on the Simulation Framework, we hypothesized that the ACTION component depends on the primary motor and premotor cortices, that the MOTION component depends on the posterolateral temporal cortex, that the CONTACT component depends on the intraparietal sulcus and inferior parietal lobule, that the CHANGE OF STATE component depends on the ventral temporal cortex, and that the TOOL USE component depends on a distributed network of temporal, parietal, and frontal regions. Virtually all of the predictions were confirmed. Taken together, these findings support the Simulation Framework and extend our understanding of the neuroanatomical distribution of different aspects of verb meaning.
A human factors systems approach to understanding team-based primary care: a qualitative analysis
Mundt, Marlon P.; Swedlund, Matthew P.
2016-01-01
Background. Research shows that high-functioning teams improve patient outcomes in primary care. However, there is no consensus on a conceptual model of team-based primary care that can be used to guide measurement and performance evaluation of teams. Objective. To qualitatively understand whether the Systems Engineering Initiative for Patient Safety (SEIPS) model could serve as a framework for creating and evaluating team-based primary care. Methods. We evaluated qualitative interview data from 19 clinicians and staff members from 6 primary care clinics associated with a large Midwestern university. All health care clinicians and staff in the study clinics completed a survey of their communication connections to team members. Social network analysis identified key informants for interviews by selecting the respondents with the highest frequency of communication ties as reported by their teammates. Semi-structured interviews focused on communication patterns, team climate and teamwork. Results. Themes derived from the interviews lent support to the SEIPS model components, such as the work system (Team, Tools and Technology, Physical Environment, Tasks and Organization), team processes and team outcomes. Conclusions. Our qualitative data support the SEIPS model as a promising conceptual framework for creating and evaluating primary care teams. Future studies of team-based care may benefit from using the SEIPS model to shift clinical practice to high functioning team-based primary care. PMID:27578837
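The key-informant selection step described above, ranking members by how often teammates report communicating with them, amounts to an in-degree count over the survey responses. A minimal sketch with hypothetical survey data (the roles and team are invented, not the study's clinics):

```python
from collections import Counter

# Survey responses: each clinician lists teammates they communicate with.
survey = {
    "nurse_a": ["md_1", "ma_1", "nurse_b"],
    "nurse_b": ["md_1", "nurse_a"],
    "ma_1": ["md_1", "nurse_a"],
    "md_1": ["nurse_a", "ma_1"],
}

def key_informants(responses, k=2):
    """Rank members by in-degree: how often teammates name them
    as a communication contact (ties reported by others)."""
    ties = Counter()
    for respondent, contacts in responses.items():
        for contact in contacts:
            ties[contact] += 1
    return [name for name, _ in ties.most_common(k)]

print(key_informants(survey))
```

Here `md_1` and `nurse_a` are each named by three teammates, so they surface as the interview candidates with the highest frequency of reported communication ties.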
We have developed a modeling framework to support grid-based simulation of ecosystems at multiple spatial scales, the Ecological Component Library for Parallel Spatial Simulation (ECLPSS). ECLPSS helps ecologists to build robust spatially explicit simulations of ...
Feature-based component model for design of embedded systems
NASA Astrophysics Data System (ADS)
Zha, Xuan Fang; Sriram, Ram D.
2004-11-01
An embedded system is a hybrid of hardware and software, combining the flexibility of software with the real-time performance of hardware. Embedded systems can be considered as assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and system-level design, simulation, and testing information. This paper proposes an approach to representing an embedded system feature-based model in OESM, i.e., the Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, embedded system components, embedded system features, and embedded system configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation for the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype by assembling virtual components. The framework not only provides a formal, precise model of the embedded system prototype but also offers the possibility of designing variations of the prototype whose members are derived by changing certain virtual components with different features. A case study example is discussed to illustrate the embedded system model.
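The idea of deriving prototype variants by swapping virtual components with different features can be sketched in a few classes. This is an illustrative data model only, not the OESFM/UML schema itself; the component and feature names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    name: str
    value: object

@dataclass
class Component:
    name: str
    kind: str                      # "hardware" or "software"
    features: list = field(default_factory=list)

@dataclass
class EmbeddedSystem:
    components: list = field(default_factory=list)

    def variant(self, old_name, replacement):
        """Derive a prototype variant by swapping one virtual component."""
        swapped = [replacement if c.name == old_name else c
                   for c in self.components]
        return EmbeddedSystem(swapped)

# Assemble a virtual prototype, then derive a variant with a faster MCU.
cpu = Component("mcu", "hardware", [Feature("clock_mhz", 48)])
ctrl = Component("controller", "software", [Feature("period_ms", 10)])
proto = EmbeddedSystem([cpu, ctrl])

fast_cpu = Component("mcu", "hardware", [Feature("clock_mhz", 120)])
variant = proto.variant("mcu", fast_cpu)
print([c.features[0].value for c in variant.components])  # [120, 10]
```

The original prototype is left untouched by `variant`, so a family of candidate designs can coexist, each differing only in the features of the swapped component.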
Dermatology and pathology arrangements: navigating the compliance risks.
Wood, Jane Pine; Cougevan, Bridget; McGovern, Jenny
2013-12-01
Purchased service arrangements, establishing in-house professional pathology services, conducting technical component histology within a dermatology practice, and electronic medical records technology donations are ways that dermatology practices are responding to the current health care delivery and payment changes. This article will provide a general framework for navigating the compliance risks and structure considerations associated with these relationships between dermatologists and pathologists.
2006-01-01
enabling technologies such as built-in-test, advanced health monitoring algorithms, reliability and component aging models, prognostics methods, and... deployment and acceptance. This framework and vision is consistent with the onboard PHM (Prognostic and Health Management) as well as advanced... monitored. In addition to the prognostic forecasting capabilities provided by monitoring system power, multiple confounding errors by electronic
Tremblay, Marie-Claude; Martin, Debbie H; Macaulay, Ann C; Pluye, Pierre
2017-06-01
A long-standing challenge in community-based participatory research (CBPR) has been to anchor practice and evaluation in a relevant and comprehensive theoretical framework of community change. This study describes the development of a multidimensional conceptual framework that builds on social movement theories to identify key components of CBPR processes. Framework synthesis was used as a general literature search and analysis strategy. An initial conceptual framework was developed from the theoretical literature on social movement. A literature search performed to identify illustrative CBPR projects yielded 635 potentially relevant documents, from which eight projects (corresponding to 58 publications) were retained after record and full-text screening. Framework synthesis was used to code and organize data from these projects, ultimately providing a refined framework. The final conceptual framework maps key concepts of CBPR mobilization processes, such as the pivotal role of the partnership; resources and opportunities as necessary components feeding the partnership's development; the importance of framing processes; and a tight alignment between the cause (partnership's goal), the collective action strategy, and the system changes targeted. The revised framework provides a context-specific model to generate a new, innovative understanding of CBPR mobilization processes, drawing on existing theoretical foundations. © 2017 The Authors American Journal of Community Psychology published by Wiley Periodicals, Inc. on behalf of Society for Community Research and Action.
Connecting Effective Instruction and Technology. Intel-elebration: Safari.
ERIC Educational Resources Information Center
Burton, Larry D.; Prest, Sharon
Intel-ebration is an attempt to integrate the following research-based instructional frameworks and strategies: (1) dimensions of learning; (2) multiple intelligences; (3) thematic instruction; (4) cooperative learning; (5) project-based learning; and (6) instructional technology. This paper presents a thematic unit on safari, using the…
Experiences integrating autonomous components and legacy systems into tsunami early warning systems
NASA Astrophysics Data System (ADS)
Reißland, S.; Herrnkind, S.; Guenther, M.; Babeyko, A.; Comoglu, M.; Hammitzsch, M.
2012-04-01
Fostered by and embedded in the general development of Information and Communication Technology (ICT), the evolution of Tsunami Early Warning Systems (TEWS) shows a significant development from seismic-centred to multi-sensor system architectures using additional sensors, e.g. sea level stations for the detection of tsunami waves and GPS stations for the detection of ground displacements. Furthermore, the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources serving near real-time data includes not only sensors but also other components and systems offering services such as the delivery of feasible simulations used for forecasting in an imminent tsunami threat. In the context of the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the project Distant Early Warning System (DEWS), a service platform for both sensor integration and warning dissemination has been newly developed and demonstrated. In particular, standards of the Open Geospatial Consortium (OGC) and the Organization for the Advancement of Structured Information Standards (OASIS) have been successfully incorporated. In the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), new developments are used to extend the existing platform to realise a component-based technology framework for building distributed TEWS. This talk will describe experiences gained in GITEWS, DEWS and TRIDEC while integrating legacy stand-alone systems and newly developed special-purpose software components into TEWS, using different software adapters and communication strategies to make the systems work together in a corporate infrastructure. The talk will also cover task management and data conversion between the different systems. Practical approaches and software solutions for the integration of sensors, e.g.
providing seismic and sea level data, and utilisation of special-purpose components, such as simulation systems, in TEWS will be presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radhakrishnan, Balasubramaniam; Fattebert, Jean-Luc; Gorti, Sarma B.
Additive Manufacturing (AM) refers to a process by which digital three-dimensional (3-D) design data is converted to build up a component by depositing material layer-by-layer. United Technologies Corporation (UTC) is currently involved in fabrication and certification of several AM aerospace structural components made from aerospace materials. This is accomplished by using optimized process parameters determined through numerous design-of-experiments (DOE)-based studies. Certification of these components is broadly recognized as a significant challenge, with long lead times, very expensive new product development cycles and very high energy consumption. Because of these challenges, United Technologies Research Center (UTRC), together with UTC business units, have been developing and validating an advanced physics-based process model. The specific goal is to develop a physics-based framework of an AM process and reliably predict fatigue properties of built-up structures as based on detailed solidification microstructures. Microstructures are predicted using process control parameters including energy source power, scan velocity, deposition pattern, and powder properties. The multi-scale multi-physics model requires solution and coupling of governing physics that will allow prediction of the thermal field and enable solution at the microstructural scale. The state-of-the-art approach to solve these problems requires a huge computational framework and this kind of resource is only available within academia and national laboratories. The project utilized the parallel phase-field codes at Oak Ridge National Laboratory (ORNL) and Lawrence Livermore National Laboratory (LLNL), along with the high-performance computing (HPC) capabilities existing at the two labs, to demonstrate the simulation of multiple dendrite growth in three dimensions (3-D).
The LLNL code AMPE was used to implement the UTRC phase field model that was previously developed for a model binary alloy, and the simulation results were compared against the UTRC simulation results, followed by extension of the UTRC model to simulate multiple dendrite growth in 3-D. The ORNL MEUMAPPS code was used to simulate dendritic growth in a model ternary alloy with the same equilibrium solidification range as the Ni-base alloy 718 using realistic model parameters, including thermodynamic integration with a Calphad-based model for the ternary alloy. Implementation of the UTRC model in AMPE met with several numerical and parametric issues that were resolved, and good comparison between the simulation results obtained by the two codes was demonstrated for two-dimensional (2-D) dendrites. 3-D dendrite growth was then demonstrated with the AMPE code using nondimensional parameters obtained in 2-D simulations. Multiple dendrite growth in 2-D and 3-D were demonstrated using ORNL's MEUMAPPS code using simple thermal boundary conditions. MEUMAPPS was then modified to incorporate the complex, time-dependent thermal boundary conditions obtained by UTRC's thermal modeling of single track AM experiments to drive the phase field simulations. The results were in good agreement with UTRC's experimental measurements.
NIST biometric evaluations and developments
NASA Astrophysics Data System (ADS)
Garris, Michael D.; Wilson, Charles L.
2005-05-01
This paper presents an R&D framework used by the National Institute of Standards and Technology (NIST) for biometric technology testing and evaluation. The focus of this paper is on fingerprint-based verification and identification. Since 9-11, the NIST Image Group has been mandated by Congress to run a program for biometric technology assessment and biometric systems certification. Four essential areas of activity are discussed: 1) developing test datasets; 2) conducting performance assessment; 3) technology development; and 4) standards participation. A description of activities and accomplishments is provided for each of these areas. In the process, methods of performance testing are described and results from specific biometric technology evaluations are presented. This framework is anticipated to have broad applicability to other technology and application domains.
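A core calculation in verification performance assessment of this kind is estimating error rates from genuine and impostor comparison scores at a decision threshold. A minimal sketch with toy scores (not NIST data or the evaluation's actual metrics pipeline):

```python
def error_rates(genuine, impostor, threshold):
    """False non-match rate: fraction of genuine comparisons scoring
    below threshold. False match rate: fraction of impostor
    comparisons scoring at or above threshold."""
    fnmr = sum(s < threshold for s in genuine) / len(genuine)
    fmr = sum(s >= threshold for s in impostor) / len(impostor)
    return fnmr, fmr

# Toy similarity scores from a fingerprint matcher.
genuine = [0.92, 0.88, 0.75, 0.60, 0.95]    # same-finger comparisons
impostor = [0.10, 0.35, 0.52, 0.20, 0.05]   # different-finger comparisons
fnmr, fmr = error_rates(genuine, impostor, threshold=0.5)
print(fnmr, fmr)  # 0.0 0.2
```

Sweeping the threshold over the score range traces the trade-off curve between the two error rates, which is how operating points are chosen and systems compared.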
Extending the psycho-historical framework to understand artistic production.
Kozbelt, Aaron; Ostrofsky, Justin
2013-04-01
We discuss how the psycho-historical framework can be profitably applied to artistic production, facilitating a synthesis of perception-based and knowledge-based perspectives on realistic observational drawing. We note that artists' technical knowledge itself constitutes a major component of an artwork's historical context, and that links between artistic practice and psychological theory may yet yield conclusions in line with universalist perspectives.
Schelbe, Lisa; Randolph, Karen A; Yelick, Anna; Cheatham, Leah P; Groton, Danielle B
2018-01-01
Increased attention to former foster youth pursuing post-secondary education has resulted in the creation of college campus-based support programs to address their needs. However, limited empirical evidence and theoretical knowledge exist about these programs. This study describes the application of systems theory as a framework for examining a college campus-based support program for former foster youth. In-depth semi-structured interviews were conducted with 32 program stakeholders including students, mentors, collaborative members, and independent living program staff. Using qualitative data analysis software, holistic coding techniques were employed to analyze interview transcripts. Data were then analyzed by applying principles of the extended case method through the lens of systems theory. Findings suggest systems theory serves as a framework for understanding the functioning of a college campus-based support program. The theory's concepts help delineate program components and the roles of stakeholders; outline boundaries between and interactions among stakeholders; and identify program strengths and weaknesses. Systems theory plays an important role in identifying intervention components and providing a structure through which to identify and understand program elements as part of the planning process. This study highlights the utility of systems theory as a framework for program planning and evaluation.
Advanced Information Technology in Simulation Based Life Cycle Design
NASA Technical Reports Server (NTRS)
Renaud, John E.
2003-01-01
In this research, a Collaborative Optimization (CO) approach for multidisciplinary systems design is used to develop a decision-based design framework for non-deterministic optimization. To date, CO strategies have been developed for application to deterministic systems design problems. In this research, the decision-based design (DBD) framework proposed by Hazelrigg is modified for use in a collaborative optimization framework. The Hazelrigg framework, as originally proposed, provides a single-level optimization strategy that combines engineering decisions with business decisions. By transforming this framework for use in collaborative optimization, one can decompose the business and engineering decision-making processes. In the new multilevel framework of Decision Based Collaborative Optimization (DBCO), the business decisions are made at the system level. These business decisions result in a set of engineering performance targets that disciplinary engineering design teams seek to satisfy as part of subspace optimizations. The Decision Based Collaborative Optimization framework more accurately models the existing relationship between business and engineering in multidisciplinary systems design.
Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William
2014-01-01
Background Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. Objective The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. Methods The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. Results The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. 
Conclusions This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases. PMID:24641991
Smits, Rochelle; Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William
2014-03-14
Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases.
Multi-agent systems and their applications
Xie, Jing; Liu, Chen-Ching
2017-07-14
The number of distributed energy components and devices continues to increase globally. As a result, distributed control schemes are desirable for managing and utilizing these devices, together with the large amount of data. In recent years, agent-based technology has become a powerful tool for engineering applications. As a computational paradigm, multi-agent systems (MASs) provide a good solution for distributed control. In this paper, MASs and their applications are discussed. A state-of-the-art literature survey is conducted on the system architecture, consensus algorithm, and multi-agent platform, framework, and simulator. In addition, a distributed under-frequency load shedding (UFLS) scheme is proposed using the MAS. Simulation results for a case study are presented. The future of MASs is discussed in the conclusion.
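A distributed UFLS scheme of this kind typically rests on an average-consensus iteration: agents exchange local frequency measurements with neighbors until they agree on a network-wide estimate, then each sheds load locally. The topology, readings, and thresholds below are invented for illustration and are not the paper's algorithm or parameters.

```python
def average_consensus(values, neighbors, rounds=50, eps=0.2):
    """Each agent repeatedly nudges its estimate toward its neighbors',
    converging to the network-wide average (connected graph, eps small
    enough for stability)."""
    x = list(values)
    for _ in range(rounds):
        x = [xi + eps * sum(x[j] - xi for j in neighbors[i])
             for i, xi in enumerate(x)]
    return x

# Four feeder agents in a line topology, each with a local frequency reading.
freq = [59.2, 59.4, 59.3, 59.5]          # Hz, after a generation loss
nbrs = [[1], [0, 2], [1, 3], [2]]        # communication links
estimate = average_consensus(freq, nbrs)

# Each agent decides locally from the agreed estimate: shed load in
# proportion to the frequency deficit below the 59.5 Hz trigger.
deficit = max(0.0, 59.5 - estimate[0])
shed_fraction = min(1.0, deficit / 0.5)
print(round(estimate[0], 2), round(shed_fraction, 2))
```

The appeal of the consensus step is that no agent needs a global view: every estimate converges to the average of all readings (59.35 Hz here) using only neighbor-to-neighbor messages, so the shedding decision is consistent across the feeder.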
Multi-agent systems and their applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Jing; Liu, Chen-Ching
The number of distributed energy components and devices continues to increase globally. As a result, distributed control schemes are desirable for managing and utilizing these devices, together with the large amount of data. In recent years, agent-based technology becomes a powerful tool for engineering applications. As a computational paradigm, multi agent systems (MASs) provide a good solution for distributed control. Here in this paper, MASs and applications are discussed. A state-of-the-art literature survey is conducted on the system architecture, consensus algorithm, and multi-agent platform, framework, and simulator. In addition, a distributed under-frequency load shedding (UFLS) scheme is proposed using themore » MAS. Simulation results for a case study are presented. The future of MASs is discussed in the conclusion.« less
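The consensus algorithms surveyed in the abstract above typically have each agent repeatedly average its state with those of its neighbors. A minimal sketch of discrete-time average consensus in Python (the ring topology, step size, and frequency-like initial values are illustrative assumptions, not taken from the paper):

```python
# Discrete-time average consensus: each agent nudges its state toward its
# neighbors' states. With a connected undirected graph and a small enough
# step size, all states converge to the mean of the initial values.

def consensus(neighbors, x0, eps=0.2, steps=200):
    x = dict(x0)
    for _ in range(steps):
        # Synchronous update: the comprehension reads the old states only.
        x = {i: xi + eps * sum(x[j] - xi for j in neighbors[i])
             for i, xi in x.items()}
    return x

# Hypothetical 4-agent ring, e.g. agents sharing local frequency measurements.
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
x0 = {0: 49.8, 1: 50.1, 2: 50.0, 3: 49.9}
final = consensus(neighbors, x0)
```

With a step size below the inverse of the maximum node degree, every agent converges to the initial average (here 49.95), which is the basic building block behind distributed estimation schemes such as the UFLS coordination discussed above.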
Højberg, Helene; Rasmussen, Charlotte Diana Nørregaard; Osborne, Richard H; Jørgensen, Marie Birk
2018-02-01
Our aim was to identify implementation components for sustainable working environment interventions in the nursing assistant sector and to generate a framework to optimize the implementation of workplace improvement initiatives. The implementation framework was informed by: 1) an industry advisory group, 2) interviews with key stakeholders, 3) concept mapping workshops, and 4) an e-mail survey. Thirty-five stakeholders were interviewed and contributed to the concept mapping workshops. Eleven implementation components were derived across four domains: 1) a supportive organizational platform, 2) an engaged workplace with mutual goals, 3) the intervention is sustainably fitted to the workplace, and 4) the intervention is an attractive choice. The highest-rated component was "Engaged and Active Management" (mean 4.1) and the lowest-rated was "Delivered in an Attractive Form" (mean 2.8). The framework provides new insights into implementation in an evolving working environment and aims to address gaps in the effectiveness of workplace interventions and in implementation success. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
CHAMPION: Intelligent Hierarchical Reasoning Agents for Enhanced Decision Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohimer, Ryan E.; Greitzer, Frank L.; Noonan, Christine F.
2011-11-15
We describe the design and development of an advanced reasoning framework employing semantic technologies, organized within a hierarchy of computational reasoning agents that interpret domain-specific information. Inspired by the pattern-recognition functions performed by the human neocortex, the CHAMPION reasoning framework represents a new computational modeling approach that derives invariant knowledge representations through memory-prediction belief propagation processes driven by formal ontological language specification and semantic technologies. The CHAMPION framework shows promise for enhancing complex decision making in diverse problem domains including cyber security, nonproliferation, and energy consumption analysis.
ERIC Educational Resources Information Center
Sabdan, Muhammad Sayuti Bin; Alias, Norlidah; Jomhari, Nazean; Jamaludin, Khairul Azhar; DeWitt, Dorothy
2014-01-01
The study is aimed at evaluating the technology-based FAKIH method for teaching al-Quran from the users' retrospective view. The participants of this study were five students with hearing difficulties. The study employed the user evaluation framework. Teachers' journals were used to determine the frequency and percentage of…
A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.
2004-12-01
The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. 
A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or expected to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture content. Simulation results from the different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedback will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve and the overall conceptual framework will be refined. The development of the conceptual framework thus becomes an ongoing process. We will describe the current state of this framework and the open questions that have to be addressed in the future.
Intelligent content fitting for digital publishing
NASA Astrophysics Data System (ADS)
Lin, Xiaofan
2006-02-01
One recurring problem in Variable Data Printing (VDP) is that existing content cannot satisfy the VDP task as-is, so there is a strong need for content fitting technologies to support high-value digital publishing applications, in which text and images are the two major content types. This paper presents the meta-Autocrop framework for image fitting and the TextFlex technology for text fitting. The meta-Autocrop framework supports multiple modes: a fixed aspect-ratio mode, an advice mode, and a verification mode. The TextFlex technology supports non-rectangular text wrapping and paragraph-based line breaking. We also demonstrate how these content fitting technologies are utilized in the overall automated composition and layout system.
An automated and integrated framework for dust storm detection based on OGC Web Processing Services
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.
2014-11-01
Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climate modelling, as it has a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the growth of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines geo-processing frameworks, scientific models, and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of an EO data retrieval component, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three Earth science models: the SBDART model (for computing the aerosol optical thickness (AOT) of dust particles), the WRF model (for simulating meteorological parameters), and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented with open-source software based on the Business Process Execution Language for Web Services (BPEL4WS). The output results, including the horizontal and vertical AOT distributions of dust particles as well as their transport paths, were represented in KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 April 2012, is used to test the applicability of the proposed WPS framework.
Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by combining a framework with a scientific workflow approach. The experimental results show that this automated and integrated framework can be used to give advance, near real-time warning of dust storms for both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
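The WPS components described above are invoked over HTTP; a minimal sketch of building a KVP-encoded WPS 1.0.0 Execute request (the endpoint, process identifier, and input names are placeholders, not the paper's actual service):

```python
# Build a KVP (key-value pair) Execute request for an OGC WPS 1.0.0 service.
# Per the WPS 1.0.0 KVP encoding, DataInputs is a semicolon-separated list of
# name=value pairs. All identifiers below are hypothetical placeholders.

def wps_execute_url(endpoint, identifier, inputs):
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    return (f"{endpoint}?service=WPS&version=1.0.0&request=Execute"
            f"&identifier={identifier}&DataInputs={data_inputs}")

url = wps_execute_url("https://example.org/wps", "DustStormDetection",
                      {"region": "EastAsia", "date": "2012-04-26"})
```

A real client would discover the advertised process identifiers and input names from the service's GetCapabilities and DescribeProcess responses rather than hard-coding them.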
Wenzl, Peter; Li, Haobing; Carling, Jason; Zhou, Meixue; Raman, Harsh; Paul, Edie; Hearnden, Phillippa; Maier, Christina; Xia, Ling; Caig, Vanessa; Ovesná, Jaroslava; Cakir, Mehmet; Poulsen, David; Wang, Junping; Raman, Rosy; Smith, Kevin P; Muehlbauer, Gary J; Chalmers, Ken J; Kleinhofs, Andris; Huttner, Eric; Kilian, Andrzej
2006-01-01
Background Molecular marker technologies are undergoing a transition from largely serial assays measuring DNA fragment sizes to hybridization-based technologies with high multiplexing levels. Diversity Arrays Technology (DArT) is a hybridization-based technology that is increasingly being adopted by barley researchers. There is a need to integrate the information generated by DArT with previous data produced with gel-based marker technologies. The goal of this study was to build a high-density consensus linkage map from the combined datasets of ten populations, most of which were simultaneously typed with DArT and Simple Sequence Repeat (SSR), Restriction Fragment Length Polymorphism (RFLP) and/or Sequence Tagged Site (STS) markers. Results The consensus map, built using a combination of JoinMap 3.0 software and several purpose-built Perl scripts, comprised 2,935 loci (2,085 DArT, 850 other loci) and spanned 1,161 cM. It contained a total of 1,629 'bins' (unique loci), with an average inter-bin distance of 0.7 ± 1.0 cM (median = 0.3 cM). More than 98% of the map could be covered with a single DArT assay. The arrangement of loci was very similar to, and almost as optimal as, the arrangement of loci in component maps built for individual populations. The locus order of a synthetic map derived from merging the component maps without considering the segregation data was only slightly inferior. The distribution of loci along chromosomes indicated centromeric suppression of recombination in all chromosomes except 5H. DArT markers appeared to have a moderate tendency toward hypomethylated, gene-rich regions in distal chromosome areas. On average, 14 ± 9 DArT loci were identified within 5 cM on either side of SSR, RFLP or STS loci previously identified as linked to agricultural traits.
Conclusion Our barley consensus map provides a framework for transferring genetic information between different marker systems and for deploying DArT markers in molecular breeding schemes. The study also highlights the need for improved software for building consensus maps from high-density segregation data of multiple populations. PMID:16904008
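The bin statistics reported above (1,629 unique loci, mean inter-bin distance 0.7 cM) reduce to simple arithmetic on marker positions; a sketch with made-up positions on a single linkage group:

```python
# 'Bins' are groups of co-segregating loci mapped to identical positions.
# Given per-locus map positions in centimorgans, count the distinct bins and
# average the gaps between adjacent bins. The positions are illustrative
# made-up data, not values from the barley consensus map.

def bin_stats(positions_cm):
    bins = sorted(set(positions_cm))            # one entry per unique position
    gaps = [b - a for a, b in zip(bins, bins[1:])]
    return len(bins), sum(gaps) / len(gaps)

n_bins, mean_gap = bin_stats([0.0, 0.0, 0.3, 1.0, 1.0, 2.5])
```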
NASA Astrophysics Data System (ADS)
Yuan, Yanbin; Zhou, You; Zhu, Yaqiong; Yuan, Xiaohui; Sælthun, N. R.
2007-11-01
Flood routing simulation based on digital technology is an important component of the "digital catchment" concept. Taking the Qingjiang catchment as a pilot case, and building on an in-depth analysis of the informatization of Qingjiang catchment management, the study applies the design concept of a "subject-point-source database" (SPSD) to the system structure in order to handle the multi-source, multi-dimension, multi-element, multi-subject, multi-layer, and multi-class nature of catchment data and to realize unified management of large volumes of catchment data. Drawing on integrated spatial information technology, a hierarchical development model of the digital catchment is established; this model provides the general framework for the analysis, design, and realization of the flood routing simulation system. To meet the demands of three-dimensional flood routing simulation, an object-oriented spatial data model is designed. The space-time adaptive relation between flood routing and catchment topography is analyzed, terrain grid data are expressed as a non-directed graph, and a breadth-first search algorithm is applied to dynamically trace stream channels on the simulated three-dimensional terrain. A system prototype is realized on this basis. Simulation results demonstrate that the proposed approach is feasible and effective.
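One plausible reading of the breadth-first channel search described above is a flood fill over the terrain grid; the sketch below collects all 4-connected cells at or below a given water level, starting from an outlet cell (the grid, elevations, and threshold criterion are illustrative assumptions, not the paper's actual algorithm):

```python
from collections import deque

# Breadth-first search over a terrain grid: starting from an outlet cell,
# visit every 4-connected cell whose elevation lies at or below the water
# level, approximating the inundated channel extent on the terrain.

def flood_cells(elev, start, level):
    rows, cols = len(elev), len(elev[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in seen and elev[nr][nc] <= level):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

# Toy 3x3 grid: the middle column is a low-lying channel.
elev = [[5, 4, 5],
        [5, 3, 5],
        [5, 2, 5]]
wet = flood_cells(elev, (2, 1), level=4)
```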
Innovative Assessments That Support Students' STEM Learning
ERIC Educational Resources Information Center
Thummaphan, Phonraphee
2017-01-01
The present study aimed to represent the innovative assessments that support students' learning in STEM education through using the integrative framework for Cognitive Diagnostic Modeling (CDM). This framework is based on three components, cognition, observation, and interpretation (National Research Council, 2001). Specifically, this dissertation…
Variation and Defect Tolerance for Nano Crossbars
NASA Astrophysics Data System (ADS)
Tunc, Cihan
With the extreme shrinking of CMOS technology, quantum effects and manufacturing issues are becoming more crucial. Hence, further shrinking of the CMOS feature size is becoming more challenging, difficult, and costly. On the other hand, emerging nanotechnology has attracted many researchers, since additional scaling down has been demonstrated by manufacturing nanowires, carbon nanotubes, and molecular switches using bottom-up manufacturing techniques. In addition to the progress in manufacturing, developments in architecture show that emerging nanoelectronic devices will be promising for future system designs. Using nano crossbars, which are composed of two sets of perpendicular nanowires with programmable intersections, it is possible to implement logic functions. In addition, nano crossbars offer some important features: regularity, reprogrammability, and interchangeability. Combining these features, researchers have presented different effective architectures. Although bottom-up nanofabrication can greatly reduce manufacturing costs, its low process controllability raises some critical issues. The bottom-up nanofabrication process results in high variation compared with the conventional top-down lithography used in CMOS technology, and an increased failure rate is expected. Variation and defect tolerance methods used for conventional CMOS technology are inadequate for emerging nanotechnology because its variation and defect rates are much higher than those of current CMOS technology. Therefore, variation and defect tolerance methods for emerging nanotechnology are necessary for a successful transition. In this work, in order to tolerate variations in crossbars, we introduce a framework based on the reprogrammability and interchangeability features of nano crossbars. This framework is shown to be applicable to both FET-based and diode-based nano crossbars.
We present a characterization testing method that requires a minimal number of test vectors. We formulate the variation optimization problem using Simulated Annealing with different optimization goals. Furthermore, we extend the framework for defect tolerance. Experimental results and a comparison of the proposed framework with exhaustive methods confirm its effectiveness for both variation and defect tolerance.
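The Simulated Annealing formulation can be sketched as a search over wire assignments that exploits the crossbar's interchangeability; the cost model and data below are hypothetical stand-ins for the paper's variation metrics:

```python
import math, random

# Simulated annealing over a permutation: decide which logic function maps to
# which nanowire so that high-activity functions land on low-variation wires.
# The swap-move neighborhood and linear cost model are illustrative choices.

def anneal(cost, n, t0=1.0, cooling=0.995, steps=5000, seed=0):
    rng = random.Random(seed)
    perm = list(range(n))
    cur_cost = cost(perm)
    best, best_cost, t = perm[:], cur_cost, t0
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)          # propose swapping two wires
        perm[i], perm[j] = perm[j], perm[i]
        c = cost(perm)
        # Accept improvements always; accept worse moves with Boltzmann prob.
        if c < cur_cost or rng.random() < math.exp((cur_cost - c) / t):
            cur_cost = c
            if c < best_cost:
                best, best_cost = perm[:], c
        else:
            perm[i], perm[j] = perm[j], perm[i]  # undo rejected swap
        t *= cooling
    return best, best_cost

# Hypothetical per-wire delay variation and per-function switching activity.
delay = [3.0, 1.0, 2.0, 4.0]
activity = [4.0, 1.0, 2.0, 3.0]
cost = lambda p: sum(delay[p[k]] * activity[k] for k in range(4))
best, best_cost = anneal(cost, 4)
```

For this tiny instance the optimum pairs the most active function with the least variable wire (cost 20.0), which the annealer finds easily; real crossbar sizes are what motivate the heuristic over exhaustive search.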
ERIC Educational Resources Information Center
Al-Harthi, Aisha Salim Ali; Campbell, Chris; Karimi, Arafeh
2018-01-01
This study aimed to develop, validate, and trial a rubric for evaluating the cloud-based learning designs (CBLD) that were developed by teachers using virtual learning environments. The rubric was developed using the technological pedagogical content knowledge (TPACK) framework, with rubric development including content and expert validation of…
Liu, Jing; Zhao, Songzheng; Wang, Gang
2018-01-01
With the development of Web 2.0 technology, social media websites have become lucrative but under-explored data sources for extracting adverse drug events (ADEs), a serious health problem. Besides ADE, other semantic relation types (e.g., drug indication and beneficial effect) can hold between the drug and adverse event mentions, making ADE relation extraction - distinguishing ADE relationships from other relation types - necessary. However, conducting ADE relation extraction in a social media environment is not a trivial task because of the expertise-dependent, time-consuming, and costly annotation process, and because of the high dimensionality of the feature space attributed to the intrinsic characteristics of social media data. This study aims to develop a framework for ADE relation extraction using patient-generated content in social media with better performance than that delivered by previous efforts. To achieve this objective, a general semi-supervised ensemble learning framework, SSEL-ADE, was developed. The framework exploits various lexical, semantic, and syntactic features, and integrates ensemble learning and semi-supervised learning. A series of experiments were conducted to verify the effectiveness of the proposed framework. Empirical results demonstrate the effectiveness of each component of SSEL-ADE and reveal that our proposed framework outperforms most existing ADE relation extraction methods. SSEL-ADE can facilitate enhanced ADE relation extraction performance, thereby providing more reliable support for pharmacovigilance. Moreover, the proposed semi-supervised ensemble methods have the potential to be applied to other social media-based problems. Copyright © 2017 Elsevier B.V. All rights reserved.
Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration
NASA Technical Reports Server (NTRS)
Lin, Risheng; Afjeh, Abdollah A.
2003-01-01
This paper discusses the detailed design of an XML databinding framework for aircraft engine simulation. The framework provides an object interface to access and use engine data, while at the same time preserving the meaning of the original data. The language-independent representation of engine component data enables users to move XML data across disparate networks using HTTP. The application of this framework is demonstrated via a web-based turbofan propulsion system simulation using the World Wide Web (WWW). A Java Servlet-based web component architecture is used for rendering XML engine data into HTML format and handling input events from the user, which allows users to interact with simulation data from a web browser. The simulation data can also be saved to a local disk for archiving or for restarting the simulation at a later time.
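The databinding idea, mapping XML engine data onto typed objects, can be sketched with the Python standard library (the element and attribute names are invented for illustration; the paper's actual schema is not given in the abstract):

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

# Databinding sketch: bind an XML description of an engine component to a
# typed object, so simulation code works with attributes instead of raw XML.

@dataclass
class Component:
    name: str
    pressure_ratio: float
    efficiency: float

    @classmethod
    def from_xml(cls, elem):
        return cls(name=elem.get("name"),
                   pressure_ratio=float(elem.findtext("pressureRatio")),
                   efficiency=float(elem.findtext("efficiency")))

xml_doc = """
<engine>
  <component name="fan"><pressureRatio>1.6</pressureRatio><efficiency>0.89</efficiency></component>
  <component name="compressor"><pressureRatio>12.0</pressureRatio><efficiency>0.85</efficiency></component>
</engine>"""

components = [Component.from_xml(e)
              for e in ET.fromstring(xml_doc).iter("component")]
```

The round trip in the other direction (object back to XML) is what lets such a framework ship engine state over HTTP while keeping its meaning intact.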
Understanding How Adolescents with Reading Difficulties Utilize Technology-Based Tools
ERIC Educational Resources Information Center
Marino, Matthew T.
2009-01-01
This article reports the findings from a study that examined how adolescent students with reading difficulties utilized cognitive tools that were embedded in a technology-based middle school science curriculum. The curriculum contained salient features of the Universal Design for Learning (UDL) theoretical framework. Sixteen general education…
SCALING AN URBAN EMERGENCY EVACUATION FRAMEWORK: CHALLENGES AND PRACTICES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karthik, Rajasekar; Lu, Wei
2014-01-01
Critical infrastructure disruption, caused by severe weather events, natural disasters, terrorist attacks, etc., has significant impacts on urban transportation systems. We built a computational framework to simulate urban transportation systems under critical infrastructure disruption in order to aid real-time emergency evacuation. This framework uses large-scale datasets to provide a scalable tool for emergency planning and management. Our framework, World-Wide Emergency Evacuation (WWEE), integrates population distribution and urban infrastructure networks to model travel demand in emergency situations at a global level. A computational model of agent-based traffic simulation is also used to provide an optimal evacuation plan for traffic operation purposes [1]. In addition, our framework provides a web-based high-resolution visualization tool for emergency evacuation modelers and practitioners. We have successfully tested our framework with scenarios in both the United States (Alexandria, VA) and Europe (Berlin, Germany) [2]. However, there are still some major drawbacks to scaling this framework to handle big data workloads in real time. On the back end, lack of proper infrastructure limits our ability to process large amounts of data, run the simulation efficiently and quickly, and provide fast retrieval and serving of data. On the front end, the visualization performance for microscopic evacuation results is still not efficient enough, due to the high volume of data communicated between server and client. We are addressing these drawbacks with cloud computing and next-generation web technologies, namely Node.js, NoSQL, WebGL, OpenLayers 3, and HTML5. We will briefly describe each of these and how we are leveraging them to provide an efficient tool for emergency management organizations.
Our early experimentation demonstrates that these technologies are a promising approach to building a scalable and high-performance urban emergency evacuation framework that can improve traffic mobility and safety under critical infrastructure disruption in today's socially connected world.
Technology-based Interventions for Preventing and Treating Substance Use Among Youth
Marsch, Lisa A.; Borodovsky, Jacob T.
2017-01-01
Summary Preventing or mitigating substance use among youth generally involves three different intervention frameworks: universal prevention, selective prevention, and treatment. Each of these levels of intervention poses unique therapeutic and implementation challenges. Technology-based interventions provide solutions to many of these problems by delivering evidence-based interventions in a consistent and cost-effective manner. This article summarizes the current state of the science of technology-based interventions for preventing substance use initiation and mitigating substance use and associated consequences among youth. PMID:27613350
Applying Sensor Web Technology to Marine Sensor Data
NASA Astrophysics Data System (ADS)
Jirka, Simon; del Rio, Joaquin; Mihai Toma, Daniel; Nüst, Daniel; Stasch, Christoph; Delory, Eric
2015-04-01
In this contribution we present two activities illustrating how Sensor Web technology helps to enable flexible and interoperable sharing of marine observation data based on standards. An important foundation is the Sensor Web architecture developed by the European FP7 project NeXOS (Next generation Low-Cost Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management). This architecture relies on the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) framework. It is an exemplary solution for facilitating the interoperable exchange of marine observation data within and between (research) organisations. The architecture addresses a series of functional and non-functional requirements which are fulfilled through different types of OGC SWE components. The diverse functionalities offered by the NeXOS Sensor Web architecture are summarised in the following overview: - Pull-based observation data download: achieved through the OGC Sensor Observation Service (SOS) 2.0 interface standard. - Push-based delivery of observation data, allowing users to subscribe to new measurements that are relevant to them: for this purpose several specification activities are currently under evaluation (e.g. OGC Sensor Event Service, OGC Publish/Subscribe Standards Working Group). - (Web-based) visualisation of marine observation data: implemented through SOS client applications. - Configuration and control of sensor devices: ensured through the OGC Sensor Planning Service 2.0 interface. - Bridging between sensors/data loggers and Sensor Web components: for this purpose several components such as the "Smart Electronic Interface for Sensor Interoperability" (SEISI) concept are developed; this is complemented by a more lightweight SOS extension (e.g. based on the W3C Efficient XML Interchange (EXI) format).
To further advance this architecture, there is ongoing work to develop dedicated profiles of selected OGC SWE specifications that provide stricter guidance on how these standards shall be applied to marine data (e.g. SensorML 2.0 profiles stating which metadata elements are mandatory, building upon the ESONET Sensor Registry developments, etc.). Within the NeXOS project the presented architecture is implemented as a set of open source components. These implementations can be re-used by all interested scientists and data providers who need tools for publishing or consuming oceanographic sensor data. In further projects such as the European project FixO3 (Fixed-point Open Ocean Observatories), these software development activities are complemented by additional efforts to provide guidance on how Sensor Web technology can be applied in an efficient manner. This way, not only software components are made available, but also documentation and information resources that help users understand which types of Sensor Web deployments are best suited to fulfil different types of user requirements.
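The pull-based download mentioned above is typically an SOS 2.0 GetObservation request in KVP encoding over HTTP; a sketch of building such a request URL (the endpoint and the offering/property identifiers are placeholders, which a real client would read from the service's GetCapabilities response):

```python
from urllib.parse import urlencode

# Build a KVP-encoded GetObservation request for an OGC SOS 2.0 endpoint.
# The endpoint and identifiers are hypothetical placeholders.

def get_observation_url(endpoint, offering, observed_property):
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return endpoint + "?" + urlencode(params)

url = get_observation_url("https://example.org/sos",
                          "sea_surface_offering",
                          "sea_water_temperature")
```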
Miniaturization of components and systems for space using MEMS-technology
NASA Astrophysics Data System (ADS)
Grönland, Tor-Arne; Rangsten, Pelle; Nese, Martin; Lang, Martin
2007-06-01
Development of MEMS-based (micro electro mechanical system) components and subsystems for space applications has been pursued by various research groups and organizations around the world for at least two decades. The main driver for developing MEMS-based components for space is the miniaturization that can be achieved. Miniaturization can not only save orders of magnitude in the mass and volume of individual components; it can also allow increased redundancy and enable novel spacecraft designs and mission scenarios. However, the commercial breakthrough of MEMS has not occurred within the space business as it has within other branches such as the IT/telecom or automotive industries, or as it has in biotech or life science applications. A main explanation is the highly conservative attitude toward new technology within the space community. This conservatism is in many senses motivated by the very low risk acceptance in the few and costly space projects that actually end with a space flight. To overcome this threshold there is a strong need for flight opportunities where reasonable risks can be accepted. Currently there are few flight opportunities allowing extensive use of new technology in space; one of the exceptions is the PRISMA program. PRISMA is an international (Sweden, Germany, France, Denmark, Norway, Greece) technology demonstration program with a focus on rendezvous and formation flying. It is a two-satellite LEO mission with a launch scheduled for the first half of 2009. On PRISMA, a number of novel technologies, e.g. an RF metrology sensor for Darwin, autonomous formation flying based on GPS and vision-based sensors, and ADN-based "green propulsion", will be demonstrated in space for the first time. One of the satellites will also carry a miniaturized propulsion system based on MEMS technology. This novel propulsion system includes two microthruster modules, each containing four thrusters with micro- to milli-Newton thrust capability.
The novelty of this micropropulsion system is that all critical components, such as the thrust chamber/nozzle assembly including internal heaters, valves, and filters, are manufactured using MEMS technology. Moreover, miniaturized pressure sensors relying on MEMS technology are also part of the system as self-standing components. The flight opportunity on PRISMA represents one of the few, and thus important, opportunities to demonstrate MEMS technology in space. The present paper describes this development effort and highlights the benefits of miniaturized components and systems for space using MEMS technology.
Understanding the medical and nonmedical value of diagnostic testing.
Lee, David W; Neumann, Peter J; Rizzo, John A
2010-01-01
To develop a framework for defining the potential value of diagnostic testing, and to discuss its implications for the health-care delivery system. We reviewed the conceptual and empirical literature related to the valuation of diagnostic tests, and used this information to create a framework for characterizing their value. We then made inferences about the impact of this framework on health insurance coverage, health technology assessment, physician-patient relationships, and public health policy. Three dimensions can effectively classify the potential value created by diagnostic tests: 1) medical value (impact on treatment decisions); 2) planning value (effect on patients' ability to make better life decisions); and 3) psychic value (how test information affects patients' sense of self). This comprehensive framework for valuing diagnostics suggests that existing health technology assessments may systematically under- or overvalue diagnostics, leading to potentially incorrect conclusions about cost-effectiveness. Further, failure to account for all value dimensions may lead to distorted payments under a value-based health-care system. The potential value created by medical diagnostics incorporates medical value as well as value associated with well-being and planning. Consideration of all three dimensions has important implications for technology assessment and value-based payment.
NASA Astrophysics Data System (ADS)
Berry Bertram, Kathryn
2011-12-01
The Geophysical Institute (GI) Framework for Professional Development was designed to prepare culturally responsive teachers of science, technology, engineering, and math (STEM). Professional development programs based on the framework are created for rural Alaskan teachers who instruct diverse classrooms that include indigenous students. This dissertation was written in response to the question, "Under what circumstances is the GI Framework for Professional Development effective in preparing culturally responsive teachers of science, technology, engineering, and math?" Research was conducted on two professional development programs based on the GI Framework: the Arctic Climate Modeling Program (ACMP) and the Science Teacher Education Program (STEP). Both programs were created by backward design to student learning goals aligned with Alaska standards and rooted in principles of indigenous ideology. Both were created with input from Alaska Native cultural knowledge bearers, Arctic scientists, education researchers, school administrators, and master teachers with extensive instructional experience. Both provide integrated instruction reflective of authentic Arctic research practices, and training in diverse methods shown to increase indigenous student STEM engagement. While based on the same framework, these programs were chosen for research because they offer distinctly different training venues for K-12 teachers. STEP offered two-week summer institutes on the UAF campus for more than 175 teachers from 33 Alaska school districts. By contrast, ACMP served 165 teachers from one rural Alaska school district along the Bering Strait. Due to challenges in making professional development opportunities accessible to all teachers in this geographically isolated district, ACMP offered a year-round mix of in-person, long-distance, online, and local training. 
Discussion centers on a comparison of the strategies used by each program to address GI Framework cornerstones, on methodologies used to conduct program research, and on findings obtained. Research indicates that in both situations the GI Framework for Professional Development was effective in preparing culturally responsive STEM teachers. Implications of these findings and recommendations for future research are discussed in the conclusion.
Balancing generality and specificity in component-based reuse
NASA Technical Reports Server (NTRS)
Eichmann, David A.; Beck, Jon
1992-01-01
For a component industry to be successful, we must move beyond the current techniques of black box reuse and genericity to a more flexible framework supporting customization of components as well as instantiation and composition of components. Customization of components strikes a balance between creating dozens of variations of a base component and requiring the overhead of unnecessary features of an 'everything but the kitchen sink' component. We argue that design and instantiation of reusable components have competing criteria - design-for-use strives for generality, design-with-reuse strives for specificity - and that providing mechanisms for each can be complementary rather than antagonistic. In particular, we demonstrate how program slicing techniques can be applied to customization of reusable components.
ERIC Educational Resources Information Center
Pringle, Rose M.; Dawson, Kara; Ritzhaupt, Albert D.
2015-01-01
In this study, we examined how teachers involved in a yearlong technology integration initiative planned to enact technological, pedagogical, and content practices in science lessons. These science teachers, engaged in an initiative to integrate educational technology in inquiry-based science lessons, provided a total of 525 lesson plans for this…
DECISION-COMPONENTS OF NICE'S TECHNOLOGY APPRAISALS ASSESSMENT FRAMEWORK.
de Folter, Joost; Trusheim, Mark; Jonsson, Pall; Garner, Sarah
2018-01-01
Value assessment frameworks have gained prominence recently in the context of U.S. healthcare. Such frameworks set out a series of factors that are considered in funding decisions. The UK's National Institute of Health and Care Excellence (NICE) is an established health technology assessment (HTA) agency. We present a novel application of text analysis that characterizes NICE's Technology Appraisals in the context of the newer assessment frameworks and present the results in a visual way. A total of 243 documents of NICE's medicines guidance from 2007 to 2016 were analyzed. Text analysis was used to identify a hierarchical set of decision factors considered in the assessments. The frequency of decision factors stated in the documents was determined, along with their association with terms related to uncertainty. The results were incorporated into visual representations of hierarchical factors. We identified 125 decision factors, and hierarchically grouped these into eight domains: Clinical Effectiveness, Cost Effectiveness, Condition, Current Practice, Clinical Need, New Treatment, Studies, and Other Factors. Textual analysis showed all domains appeared consistently in the guidance documents. Many factors were commonly associated with terms relating to uncertainty. A series of visual representations was created. This study reveals the complexity and consistency of NICE's decision-making processes and demonstrates that cost effectiveness is not the only decision criterion. The study highlights the importance of processes and methodology that can take both quantitative and qualitative information into account. Visualizations can help effectively communicate this complex information during the decision-making process and subsequently to stakeholders.
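The factor-frequency and uncertainty-association analysis can be sketched as a simple term-counting pass over the guidance documents. The tiny factor lexicon and uncertainty vocabulary below are hypothetical stand-ins for the 125-factor hierarchy identified in the study:

```python
from collections import Counter

# Hypothetical mini-lexicon; the study's actual 125-factor hierarchy
# and uncertainty terminology are not reproduced here.
FACTORS = {"cost effectiveness", "clinical effectiveness", "clinical need"}
UNCERTAINTY = {"uncertain", "uncertainty", "unclear", "not robust"}

def factor_stats(documents):
    """Count factor mentions, and how often each factor co-occurs in a
    sentence with an uncertainty-related term."""
    mentions, with_uncertainty = Counter(), Counter()
    for doc in documents:
        for sentence in doc.lower().split("."):
            uncertain = any(u in sentence for u in UNCERTAINTY)
            for f in FACTORS:
                if f in sentence:
                    mentions[f] += 1
                    if uncertain:
                        with_uncertainty[f] += 1
    return mentions, with_uncertainty

docs = ["The cost effectiveness estimate was uncertain. Clinical need is high."]
m, u = factor_stats(docs)
```

A production analysis would add tokenization, stemming, and the hierarchical grouping step, but the frequency/association counts reduce to this shape.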
ERIC Educational Resources Information Center
Schnittka, Christine G.
2012-01-01
Currently, unless a K-12 student elects to enroll in technology-focused schools or classes, exposure to engineering design and habits of mind is minimal. However, the "Framework for K-12 Science Education," published by the National Research Council in 2011, includes engineering design as a new and major component of the science content…
Understanding Country Planning: A Guide for Air Force Component Planners
2012-01-01
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end, we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, unlike standard principal component analysis (PCA), produces components with sparse loadings; used in conjunction with the Hotelling T2 statistical analysis method, it allows us to compare, qualify, and detect faults in the tested systems.
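The fault-detection step can be sketched as follows. Plain PCA stands in for the paper's modified sparse PCA, and the synthetic "image quality metrics" data are invented for illustration; a row with a systematic shift on every metric scores far higher on Hotelling's T2 than in-spec rows:

```python
import numpy as np

def hotelling_t2(train, test, k=2):
    """Score each row of `test` against a k-component PCA model fit on
    `train`, using Hotelling's T^2 on the component scores. (The paper
    uses a modified sparse PCA; plain PCA is a simplified stand-in.)"""
    mu = train.mean(axis=0)
    _, s, vt = np.linalg.svd(train - mu, full_matrices=False)
    var = s**2 / (len(train) - 1)            # variance captured per component
    scores = (np.atleast_2d(test) - mu) @ vt[:k].T
    return np.sum(scores**2 / var[:k], axis=1)

# Synthetic metrics: 100 in-spec scans varying mostly along one direction,
# plus a small batch with a consistent fault on every metric.
rng = np.random.default_rng(0)
axis = np.ones(10) / np.sqrt(10)
baseline = rng.normal(size=(100, 1)) * axis + 0.1 * rng.normal(size=(100, 10))
nominal = rng.normal(size=(5, 1)) * axis + 0.1 * rng.normal(size=(5, 10))
faulty = nominal + 6 * axis                  # systematic shift
t2_ok = hotelling_t2(baseline, nominal)
t2_bad = hotelling_t2(baseline, faulty)
```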
An ontology-based collaborative service framework for agricultural information
USDA-ARS?s Scientific Manuscript database
In recent years, China has developed modern agriculture energetically. An effective information framework is an important way to provide farms with agricultural information services and improve farmer's production technology and their income. The mountain areas in central China are dominated by agri...
The Virtual Health University: An eLearning Model within the Cuban Health System.
Jardines, José B
2008-01-01
This paper describes Cuba's experience with the Virtual Health University (VHU) as a strategic project of INFOMED, promoting creation of an open teaching-learning environment for health sciences education, through intensive and creative use of Information and Communication Technologies (ICTs) and a network approach to learning. An analysis of the VHU's main antecedents in its different stages of development provides insight into the strategic reasons that led to the establishment of a virtual university in the national health system during Cuba's so-called Special Period of economic crisis. Using the general objectives of creating, sharing, and collaborating which define the VHU's conceptual-operative framework, the three essential components (subsystems) are described: pedagogical, technological, and managerial, as well as the operative stages of educational design, technological implementation, and teaching-administrative management system. Each component of the model is analyzed in the context of global, modern university trends, towards integration of the face-to-face and distance education approaches and the creation of virtual institutions that assume the technological and pedagogical changes demanded by eLearning.
Semantic Service Design for Collaborative Business Processes in Internetworked Enterprises
NASA Astrophysics Data System (ADS)
Bianchini, Devis; Cappiello, Cinzia; de Antonellis, Valeria; Pernici, Barbara
Modern collaborating enterprises can be seen as borderless organizations whose processes are dynamically transformed and integrated with the ones of their partners (Internetworked Enterprises, IE), thus enabling the design of collaborative business processes. The adoption of Semantic Web and service-oriented technologies for implementing collaboration in such distributed and heterogeneous environments promises significant benefits. IE can model their own processes independently by using the Software as a Service paradigm (SaaS). Each enterprise maintains a catalog of available services and these can be shared across IE and reused to build up complex collaborative processes. Moreover, each enterprise can adopt its own terminology and concepts to describe business processes and component services. This brings requirements to manage semantic heterogeneity in process descriptions which are distributed across different enterprise systems. To enable effective service-based collaboration, IEs have to standardize their process descriptions and model them through component services using the same approach and principles. For enabling collaborative business processes across IE, services should be designed following an homogeneous approach, possibly maintaining a uniform level of granularity. In the paper we propose an ontology-based semantic modeling approach apt to enrich and reconcile semantics of process descriptions to facilitate process knowledge management and to enable semantic service design (by discovery, reuse and integration of process elements/constructs). The approach brings together Semantic Web technologies, techniques in process modeling, ontology building and semantic matching in order to provide a comprehensive semantic modeling framework.
Houngbo, P. Thierry; De Cock Buning, Tjard; Bunders, Joske; Coleman, Harry L. S.; Medenou, Daton; Dakpanon, Laurent; Zweekhorst, Marjolein
2017-01-01
Background: Low-income countries face many contextual challenges to manage healthcare technologies effectively, as the majority are imported and resources are constrained to a greater extent. Previous healthcare technology management (HTM) policies in Benin have failed to produce better quality of care for the population and cost-effectiveness for the government. This study aims to identify and assess the main problems facing HTM in Benin’s public health sector, as well as the ability of key actors within the sector to address these problems. Methods: We conducted 2 surveys in 117 selected health facilities. The first survey was based on 377 questionnaires and 259 interviews, and the second involved observation and group interviews at health facilities. The Temple-Bird Healthcare Technology Package System (TBHTPS), tailored to the context of Benin’s health system, was used as a conceptual framework. Results: The findings of the first survey show that 85% of key actors in Benin’s HTM sector characterized the system as failing in components of the TBHTPS framework. Biomedical, clinical, healthcare technology engineers and technicians perceived problems most severely, followed by users of equipment, managers and hospital directors, international organization officers, local and foreign suppliers, and finally policy-makers, planners and administrators at the Ministry of Health (MoH). The 5 most important challenges to be addressed are policy, strategic management and planning, and technology needs assessment and selection – categorized as major enabling inputs (MEI) in HTM by the TBHTPS framework – and installation and commissioning, training and skill development and procurement, which are import and use activities (IUA). The ability of each key actor to address these problems (the degree of political or administrative power they possess) was inversely proportional to their perception of the severity of the problems.
Observational data gathered during site visits described a different set of challenges including maintenance and repair, distribution, installation and commissioning, use and training and personnel skill development. Conclusion: The lack of experiential and technical knowledge in policy development processes could underpin many of the continuing problems in Benin’s HTM system. Before solutions can be devised to these problems, it is necessary to investigate their root causes, and which problems are most amenable to policy development. PMID:28949474
An Ethical Framework for Evaluating Experimental Technology.
van de Poel, Ibo
2016-06-01
How are we to appraise new technological developments that may bring revolutionary social changes? Currently this is often done by trying to predict or anticipate social consequences and to use these as a basis for moral and regulatory appraisal. Such an approach can, however, not deal with the uncertainties and unknowns that are inherent in social changes induced by technological development. An alternative approach is proposed that conceives of the introduction of new technologies into society as a social experiment. An ethical framework for the acceptability of such experiments is developed based on the bioethical principles for experiments with human subjects: non-maleficence, beneficence, respect for autonomy, and justice. This provides a handle for the moral and regulatory assessment of new technologies and their impact on society.
Benchmarking high performance computing architectures with CMS’ skeleton framework
NASA Astrophysics Data System (ADS)
Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.
2017-10-01
In 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel’s Thread Building Block library, based on the measured overheads in both memory and CPU on the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many-core architectures: machines such as Cori Phase 1&2, Theta, and Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.
Percolation on shopping and cashback electronic commerce networks
NASA Astrophysics Data System (ADS)
Fu, Tao; Chen, Yini; Qin, Zhen; Guo, Liping
2013-06-01
Many realistic networks exist in the form of multiple networks, including interacting networks and interdependent networks. Here we study percolation properties of a special kind of interacting networks, namely Shopping and Cashback Electronic Commerce Networks (SCECNs). We investigate two actual SCECNs to extract their structural properties, and develop a mathematical framework based on generating functions for analyzing directed interacting networks. We then derive the necessary and sufficient condition for the absence of system-wide giant in- and out-components, and propose algorithms to calculate the corresponding structural measures in the subcritical and supercritical regimes. We apply our mathematical framework and algorithms to those two actual SCECNs to assess their accuracy, and give some explanations for the discrepancies. We show that those structural measures based on our mathematical framework and algorithms are useful for appraising the status of SCECNs. We also find that the supercritical regime of the whole network is maintained mainly by hyperlinks between different kinds of websites, while hyperlinks between the same kinds of websites can only enlarge the sizes of in-components and out-components.
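For orientation, the classic single-network version of the generating-function criterion (due to Newman, Strogatz, and Watts) can be written in a few lines: a directed random graph with joint in/out-degree distribution p(j, k) has a giant component exactly when the sum of (2jk - j - k)p(j, k) is positive. The paper's framework extends this machinery to interacting networks, which is not reproduced here:

```python
def giant_component_criterion(p):
    """Newman-Strogatz-Watts criterion for a directed random graph.
    `p` maps (in_degree, out_degree) pairs to probabilities; a positive
    return value means system-wide giant in-/out-components exist."""
    return sum((2 * j * k - j - k) * w for (j, k), w in p.items())

# Every node has in-degree 2 and out-degree 2: criterion is positive,
# so a giant component exists.
dense = {(2, 2): 1.0}
# Degrees of at most 1 each way: only chains, no giant component.
sparse = {(1, 1): 0.5, (0, 1): 0.25, (1, 0): 0.25}
```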
Group Work in a Technology-Rich Environment
ERIC Educational Resources Information Center
Penner, Nikolai; Schulze, Mathias
2010-01-01
This paper addresses several components of successful language-learning methodologies--group work, task-based instruction, and wireless computer technologies--and examines how the interplay of these three was perceived by students in a second-year university foreign-language course. The technology component of our learning design plays a central…
Smart textile-based wearable biomedical systems: a transition plan for research to reality.
Park, Sungmee; Jayaraman, Sundaresan
2010-01-01
The field of smart textile-based wearable biomedical systems (ST-WBSs) has of late been generating a lot of interest in the research and business communities since its early beginnings in the mid-nineties. However, the technology is yet to enter the marketplace and realize its original goal of enhancing the quality of life for individuals through enhanced real-time biomedical monitoring. In this paper, we propose a framework for analyzing the transition of ST-WBS from research to reality. We begin with a look at the evolution of the field and describe the major components of an ST-WBS. We then analyze the key issues encompassing the technical, medical, economic, public policy, and business facets from the viewpoints of various stakeholders in the continuum. We conclude with a plan of action for transitioning ST-WBS from "research to reality."
Recommendation System for Adaptive Learning.
Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang
2018-01-01
An adaptive learning system aims at providing instruction tailored to the current status of a learner, differing from the traditional classroom experience. The latest advances in technology make adaptive learning possible, which has the potential to provide students with high-quality learning benefits at low cost. A key component of an adaptive learning system is a recommendation system, which recommends the next material (video lectures, practice items, and so on, covering different skills) to the learner, based on the psychometric assessment results and possibly other individual characteristics. An important question then follows: how should recommendations be made? To answer this question, a mathematical framework is proposed that characterizes the recommendation process as a Markov decision problem, in which decisions are made based on the current knowledge of the learner and that of the learning materials. In particular, two plain vanilla systems are introduced, for which the optimal recommendation at each stage can be obtained analytically.
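To see how a Markov decision formulation yields a recommendation policy, the sketch below runs standard value iteration over a hypothetical three-state learner model; the state names, transition probabilities, and reward are invented for illustration and are not the paper's (analytically solvable) systems:

```python
def value_iteration(states, actions, transition, reward, gamma=0.9, tol=1e-8):
    """Generic value iteration; returns optimal values and greedy policy.
    transition(s, a) -> list of (next_state, probability) pairs."""
    V = {s: 0.0 for s in states}
    while True:
        newV = {
            s: max(reward(s, a) + gamma * sum(p * V[t] for t, p in transition(s, a))
                   for a in actions)
            for s in states
        }
        if max(abs(newV[s] - V[s]) for s in states) < tol:
            V = newV
            break
        V = newV
    policy = {
        s: max(actions, key=lambda a: reward(s, a)
               + gamma * sum(p * V[t] for t, p in transition(s, a)))
        for s in states
    }
    return V, policy

# Toy learner model (hypothetical numbers): skill levels 0-2, where
# "easy" material helps novices most and "hard" helps intermediates.
states, actions = [0, 1, 2], ["easy", "hard"]
PROGRESS = {(0, "easy"): 0.8, (0, "hard"): 0.2,
            (1, "easy"): 0.3, (1, "hard"): 0.7}

def transition(s, a):
    if s == 2:                       # mastery is absorbing
        return [(2, 1.0)]
    p = PROGRESS[(s, a)]
    return [(s + 1, p), (s, 1 - p)]

def reward(s, a):
    return 1.0 if s == 2 else 0.0    # reward only once mastery is reached

V, policy = value_iteration(states, actions, transition, reward)
```

The resulting policy recommends "easy" to novices and "hard" to intermediates, i.e. whichever material maximizes expected discounted progress toward mastery.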
A Security Audit Framework to Manage Information System Security
NASA Astrophysics Data System (ADS)
Pereira, Teresa; Santos, Henrique
The widespread adoption of information and communication technology has promoted an increased dependency of organizations on the performance of their Information Systems. As a result, organizations must establish adequate security procedures to properly manage information security, in order to protect their valued or critical resources from accidental or intentional attacks and to ensure their normal activity. A conceptual security framework to manage and audit Information System Security is proposed and discussed. The proposed framework intends to assist organizations, firstly, in understanding precisely which assets they need to protect and what their weaknesses (vulnerabilities) are, enabling adequate security management. Secondly, it provides a security audit framework to support the organization in assessing the efficiency of the controls and policies adopted to prevent or mitigate the attacks, threats, and vulnerabilities, promoted by the advances of new technologies and new Internet-enabled services, to which organizations are subject. The presented framework is based on a conceptual model approach, which contains the semantic description of the concepts defined in the information security domain, based on the ISO/IEC JTC1 standards.
NEIMiner: nanomaterial environmental impact data miner.
Tang, Kaizhi; Liu, Xiong; Harper, Stacey L; Steevens, Jeffery A; Xu, Roger
2013-01-01
As more engineered nanomaterials (eNM) are developed for a wide range of applications, it is crucial to minimize any unintended environmental impacts resulting from the application of eNM. To realize this vision, industry and policymakers must base risk management decisions on sound scientific information about the environmental fate of eNM, their availability to receptor organisms (eg, uptake), and any resultant biological effects (eg, toxicity). To address this critical need, we developed a model-driven, data mining system called NEIMiner, to study nanomaterial environmental impact (NEI). NEIMiner consists of four components: NEI modeling framework, data integration, data management and access, and model building. The NEI modeling framework defines the scope of NEI modeling and the strategy of integrating NEI models to form a layered, comprehensive predictability. The data integration layer brings together heterogeneous data sources related to NEI via automatic web services and web scraping technologies. The data management and access layer reuses and extends a popular content management system (CMS), Drupal, and consists of modules that model the complex data structure for NEI-related bibliography and characterization data. The model building layer provides an advanced analysis capability for NEI data. Together, these components provide significant value to the process of aggregating and analyzing large-scale distributed NEI data. A prototype of the NEIMiner system is available at http://neiminer.i-a-i.com/.
Local adaptive tone mapping for video enhancement
NASA Astrophysics Data System (ADS)
Lachine, Vladimir; Dai, Min
2015-03-01
As new technologies like High Dynamic Range cameras, AMOLED and high resolution displays emerge on the consumer electronics market, it becomes very important to deliver the best picture quality for mobile devices. Tone Mapping (TM) is a popular technique to enhance visual quality. However, the traditional implementation of the Tone Mapping procedure is limited to pixel-wise value-to-value mapping, and its performance is restricted in terms of local sharpness and colorfulness. To overcome the drawbacks of traditional TM, we propose a spatial-frequency-based framework in this paper. In the proposed solution, the intensity component of an input video/image signal is split into low-pass filtered (LPF) and high-pass filtered (HPF) bands. A Tone Mapping (TM) function is applied to the LPF band to improve the global contrast/brightness, and the HPF band is added back afterwards to keep the local contrast. The HPF band may be adjusted by a coring function to avoid noise boosting and signal overshooting. The colorfulness of an original image may be preserved or enhanced by correcting the chroma components by means of a saturation function. Localized content adaptation is further improved by dividing an image into a set of non-overlapped regions and modifying each region individually. The suggested framework allows users to implement a wide range of tone mapping applications with perceptual local sharpness and colorfulness preserved or enhanced. The corresponding hardware circuit may be integrated into a camera, video or display pipeline with a minimal hardware budget.
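A minimal sketch of the band-split pipeline described above, using a box blur as the low-pass filter and a gamma curve as a stand-in for the tone-mapping function (both choices, and all constants, are illustrative assumptions; chroma correction and per-region adaptation are omitted):

```python
import numpy as np

def tone_map(intensity, kernel=5, gamma=0.6, coring=0.02):
    """Split `intensity` (values in [0, 1]) into LPF and HPF bands,
    apply a global tone curve to the LPF band, core the HPF band to
    avoid boosting noise, and recombine."""
    pad = kernel // 2
    padded = np.pad(intensity, pad, mode="edge")
    # Separable box blur as the low-pass filter
    k = np.ones(kernel) / kernel
    lpf = np.apply_along_axis(lambda r: np.convolve(r, k, "valid"), 1, padded)
    lpf = np.apply_along_axis(lambda c: np.convolve(c, k, "valid"), 0, lpf)
    hpf = intensity - lpf
    # Coring: zero out tiny high-frequency amplitudes (likely noise)
    hpf = np.where(np.abs(hpf) < coring, 0.0, hpf)
    # Gamma curve brightens the global (LPF) band; HPF keeps local detail
    return np.clip(lpf ** gamma + hpf, 0.0, 1.0)

dark = np.full((16, 16), 0.04)
dark[8, 8] = 0.5                      # one bright detail pixel
out = tone_map(dark)
```

Because the detail lives in the HPF band, the bright pixel survives the global brightening of the dark background instead of being flattened by a pure value-to-value curve.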
Trends in Process Analytical Technology: Present State in Bioprocessing.
Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian
2017-08-04
Process analytical technology (PAT), the regulatory initiative for incorporating quality in pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, this can increase process understanding and control, and mitigate the risk from substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract: Hierarchy of QbD components.
The future scalability of pH-based genome sequencers: A theoretical perspective
NASA Astrophysics Data System (ADS)
Go, Jonghyun; Alam, Muhammad A.
2013-10-01
Sequencing of the human genome is an essential prerequisite for personalized medicine and early prognosis of various genetic diseases. The state-of-the-art, high-throughput genome sequencing technologies provide improved sequencing; however, their reliance on relatively expensive optical detection schemes has prevented widespread adoption of the technology in routine care. In contrast, the recently announced pH-based electronic genome sequencers achieve fast sequencing at low cost because of their compatibility with current microelectronics technology. While the progress in technology development has been rapid, the physics of the sequencing chips and the potential for future scaling (and therefore, cost reduction) remain unexplored. In this article, we develop a theoretical framework and a scaling theory to explain the principle of operation of pH-based sequencing chips and use the framework to explore various perceived scaling limits of the technology related to signal-to-noise ratio, well-to-well crosstalk, and sequencing accuracy. We also address several limitations inherent to the key steps of pH-based genome sequencers, which are widely shared by many other sequencing platforms on the market but have so far not been properly explained.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-08-01
An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems.
However, the lack of a standard real-time distributed object operating system, the lack of a standard Computer-Aided Software Environment (CASE) tool notation, and the lack of a standard CASE tool repository have limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as assemble new tools on demand from existing tools and architecture design repositories.
Applying the Design Framework to Technology Professional Development
ERIC Educational Resources Information Center
Curwood, Jen Scott
2013-01-01
Building on contemporary research on teacher professional development, this study examined the practices of a technology-focused learning community at a high school in the United States. Over the course of a school year, classroom teachers and a university-based researcher participated in the learning community to investigate how technology can…
Technology Acceptance among Pre-Service Teachers: Does Gender Matter?
ERIC Educational Resources Information Center
Teo, Timothy; Fan, Xitao; Du, Jianxia
2015-01-01
This study examined possible gender differences in pre-service teachers' perceived acceptance of technology in their professional work under the framework of the technology acceptance model (TAM). Based on a sample of pre-service teachers, a series of progressively more stringent measurement invariance tests (configural, metric, and scalar…
Affordance Analysis--Matching Learning Tasks with Learning Technologies
ERIC Educational Resources Information Center
Bower, Matt
2008-01-01
This article presents a design methodology for matching learning tasks with learning technologies. First a working definition of "affordances" is provided based on the need to describe the action potentials of the technologies (utility). Categories of affordances are then proposed to provide a framework for analysis. Following this, a…
Secondary Teacher Self-Efficacy and Technology Integration
ERIC Educational Resources Information Center
Hale, James Lee
2013-01-01
This dissertation is based on a conceptual framework founded in the plight of the United States in the critical areas of science, technology, engineering, and mathematics, such as student performance, global economy, job opportunities, and technological innovation. Subpar performance can be traced to, among other things, education and specifically…
Technology for Distance Education: A 10 Year Prospective.
ERIC Educational Resources Information Center
Bates, A. W.
This paper provides an overview of new technologies likely to be widely available within the next 10 years for teaching in Europe. It begins by presenting a framework which draws distinctions between different technologies based on their educational applications, i.e., for teaching or operational purposes, for communicating within or between…
NASA Technical Reports Server (NTRS)
Losquadro, G.; Luglio, M.; Vatalaro, F.
1997-01-01
A geostationary satellite system for mobile multimedia services via portable, aeronautical and mobile terminals was developed within the framework of the Advanced Communications Technology Service (ACTS) programs. The architecture of the system developed under the 'satellite extremely high frequency communications for multimedia mobile services (SECOMS)/ACTS broadband aeronautical terminal experiment' (ABATE) project is presented. The system will be composed of a Ka band system component, and an extremely high frequency band component. The major characteristics of the space segment, the ground control station and the portable, aeronautical and mobile user terminals are outlined.
Best Practices Inquiry: A Multidimensional, Value-Critical Framework
ERIC Educational Resources Information Center
Petr, Christopher G.; Walter, Uta M.
2005-01-01
This article offers a multidimensional framework that broadens current approaches to "best practices" inquiry to include (1) the perspectives of both the consumers of services and professional practitioners and (2) a value-based critique. The predominant empirical approach to best practices inquiry is a necessary, but not sufficient, component of…
NASA Technical Reports Server (NTRS)
Wang, Nanbor; Parameswaran, Kirthika; Kircher, Michael; Schmidt, Douglas
2003-01-01
Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.
NASA Technical Reports Server (NTRS)
Wang, Nanbor; Kircher, Michael; Schmidt, Douglas C.
2000-01-01
Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively: (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.
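The reflective idea sketched in the abstract can be illustrated with a small Python sketch: a container inspects a component's declared QoS contract at deployment time and selects a communication mechanism, instead of the application hard-coding one. All names, thresholds, and transport properties below are invented for illustration; none are part of the CCM specification or the authors' middleware.

```python
# Hypothetical transport catalog; properties are illustrative only.
TRANSPORTS = {
    "shared_memory": {"max_latency_us": 50,   "collocated_only": True},
    "udp_multicast": {"max_latency_us": 500,  "collocated_only": False},
    "tcp":           {"max_latency_us": 5000, "collocated_only": False},
}

def select_transport(required_latency_us, collocated):
    """Pick the lowest-latency transport that satisfies the QoS contract."""
    candidates = [
        name for name, props in TRANSPORTS.items()
        if props["max_latency_us"] <= required_latency_us
        and (collocated or not props["collocated_only"])
    ]
    if not candidates:
        raise ValueError("no transport satisfies the QoS contract")
    return min(candidates, key=lambda n: TRANSPORTS[n]["max_latency_us"])
```

A container could run this kind of selection transparently at (re)configuration time, keeping the decision out of application code.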
ERIC Educational Resources Information Center
Tokmak, Hatice Sancar; Yelken, Tugba Yanpar; Konokman, Gamze Yavuz
2013-01-01
The current study investigated perceived development of pre-service teachers in their Instructional Material Design (IMD) competencies through the course "Instructional Technology and Material Design," which is based on a technological, pedagogical, and content knowledge (TPACK) framework. A total of 22 Elementary Education pre-service…
Evaluating Technology-Based Self-Monitoring as a Tier 2 Intervention across Middle School Settings
ERIC Educational Resources Information Center
Bruhn, Allison Leigh; Woods-Groves, Suzanne; Fernando, Josephine; Choi, Taehoon; Troughton, Leonard
2017-01-01
Multitiered frameworks like Positive Behavior Interventions and Supports (PBIS) have been recommended for preventing and remediating behavior problems. In this study, technology-based self-monitoring was used as a Tier 2 intervention to improve the academic engagement and disruptive behavior of three middle school students who were identified as…
NASA Astrophysics Data System (ADS)
Alfadhlani; Samadhi, T. M. A. Ari; Ma’ruf, Anas; Setiasyah Toha, Isa
2018-03-01
Assembly is a part of manufacturing processes that must be considered at the product design stage. Design for Assembly (DFA) is a method to evaluate product design in order to make it simpler, easier and quicker to assemble, so that assembly cost is reduced. This article discusses a framework for developing a computer-based DFA method. The method is expected to help product designers extract data, evaluate the assembly process, and provide recommendations for product design improvement. Ideally, these three tasks are performed without interactive input or user intervention, so that product design evaluation can be carried out automatically. Input for the proposed framework is a 3D solid engineering drawing. Product design evaluation is performed by: minimizing the number of components; generating assembly sequence alternatives; selecting the best assembly sequence based on the minimum number of assembly reorientations; and providing suggestions for design improvement.
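The sequence-selection step described above can be sketched briefly: given candidate assembly sequences, each annotated with the insertion direction of every step, count direction changes (reorientations) and keep the sequence needing the fewest. This is an illustrative sketch, not the authors' algorithm; the sequence names and direction labels are invented.

```python
def count_reorientations(directions):
    """A reorientation occurs whenever the insertion direction changes."""
    return sum(1 for a, b in zip(directions, directions[1:]) if a != b)

def best_sequence(sequences):
    """sequences: {name: list of insertion directions such as '+z', '-x'}."""
    return min(sequences, key=lambda name: count_reorientations(sequences[name]))

candidates = {
    "seq_a": ["+z", "+z", "-x", "+z"],   # 2 direction changes
    "seq_b": ["+z", "+z", "+z", "-x"],   # 1 direction change
}
```

Here `best_sequence(candidates)` would prefer `seq_b`, the sequence that keeps one insertion direction as long as possible.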
Common modeling system for digital simulation
NASA Technical Reports Server (NTRS)
Painter, Rick
1994-01-01
The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems architecture with an object-based/oriented methodology and a standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis, Cost and Operational Effectiveness Analysis (COEA) tradeoffs, etc. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts towards commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result being UNIQUE maintenance required. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g. user interface, database/database management system, data journalizing/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include: duplication of effort; varying assumptions; lack of credibility/validation; and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.
GEMSS: grid-infrastructure for medical service provision.
Benkner, S; Berti, G; Engelbrecht, G; Fingberg, J; Kohring, G; Middleton, S E; Schmidt, R
2005-01-01
The European GEMSS Project is concerned with the creation of medical Grid service prototypes and their evaluation in a secure service-oriented infrastructure for distributed on demand/supercomputing. Key aspects of the GEMSS Grid middleware include negotiable QoS support for time-critical service provision, flexible support for business models, and security at all levels in order to ensure privacy of patient data as well as compliance to EU law. The GEMSS Grid infrastructure is based on a service-oriented architecture and is being built on top of existing standard Grid and Web technologies. The GEMSS infrastructure offers a generic Grid service provision framework that hides the complexity of transforming existing applications into Grid services. For the development of client-side applications or portals, a pluggable component framework has been developed, providing developers with full control over business processes, service discovery, QoS negotiation, and workflow, while keeping their underlying implementation hidden from view. A first version of the GEMSS Grid infrastructure is operational and has been used for the set-up of a Grid test-bed deploying six medical Grid service prototypes including maxillo-facial surgery simulation, neuro-surgery support, radio-surgery planning, inhaled drug-delivery simulation, cardiovascular simulation and advanced image reconstruction. The GEMSS Grid infrastructure is based on standard Web Services technology with an anticipated future transition path towards the OGSA standard proposed by the Global Grid Forum. GEMSS demonstrates that the Grid can be used to provide medical practitioners and researchers with access to advanced simulation and image processing services for improved preoperative planning and near real-time surgical support.
Qiao, Hong; Li, Yinlin; Li, Fengfu; Xi, Xuanyang; Wu, Wei
2016-10-01
Recently, many biologically inspired visual computational models have been proposed. The design of these models follows the related biological mechanisms and structures, and these models provide new solutions for visual recognition tasks. In this paper, based on recent biological evidence, we propose a framework to mimic the active and dynamic learning and recognition process of the primate visual cortex. In terms of principle, the main contributions are that the framework can achieve unsupervised learning of episodic features (including key components and their spatial relations) and semantic features (semantic descriptions of the key components), which support higher-level cognition of an object. In terms of performance, the advantages of the framework are as follows: 1) learning episodic features without supervision: for a class of objects without prior knowledge, the key components, their spatial relations and cover regions can be learned automatically through a deep neural network (DNN); 2) learning semantic features based on episodic features: within the cover regions of the key components, the semantic geometrical values of these components can be computed based on contour detection; 3) forming the general knowledge of a class of objects: the general knowledge of a class of objects can be formed, mainly including the key components, their spatial relations and average semantic values, which is a concise description of the class; and 4) achieving higher-level cognition and dynamic updating: for a test image, the model can achieve classification and subclass semantic descriptions, and the test samples with high confidence are selected to dynamically update the whole model. Experiments are conducted on face images, and good performance is achieved in each layer of the DNN and the semantic description learning process. Furthermore, the model can be generalized to recognition tasks of other objects with learning ability.
Optical datacenter network employing slotted (TDMA) operation for dynamic resource allocation
NASA Astrophysics Data System (ADS)
Bakopoulos, P.; Tokas, K.; Spatharakis, C.; Patronas, I.; Landi, G.; Christodoulopoulos, K.; Capitani, M.; Kyriakos, A.; Aziz, M.; Reisis, D.; Varvarigos, E.; Zahavi, E.; Avramopoulos, H.
2018-02-01
The soaring traffic demands in datacenter networks (DCNs) are outpacing progress in CMOS technology, challenging the bandwidth and energy scalability of currently established technologies. Optical switching is gaining traction as a promising path for sustaining the explosive growth of DCNs; however, its practical deployment necessitates extensive modifications to the network architecture and operation, tailored to the technological particularities of optical switches (i.e. no buffering, limitations in radix size and speed). European project NEPHELE is developing an optical network infrastructure that leverages optical switching within a software-defined networking (SDN) framework to overcome the bandwidth and energy scaling challenges of datacenter networks. An experimental validation of the NEPHELE data plane is reported based on commercial off-the-shelf optical components controlled by FPGA boards. To facilitate dynamic allocation of the network resources and perform collision-free routing in a lossless network environment, slotted operation is employed (i.e. using time-division multiple access - TDMA). Error-free operation of the NEPHELE data plane is verified for 200 μs slots in various scenarios that involve communication between Ethernet hosts connected to custom-designed top-of-rack (ToR) switches, located in the same or in different datacenter pods. Control of the slotted data plane is obtained through an SDN framework comprising an OpenDaylight controller with appropriate add-ons. Communication between servers in the optical ToR is demonstrated with various routing scenarios, concerning communication between hosts located in the same rack or in different racks, within the same or different datacenter pods. Error-free operation is confirmed for all evaluated scenarios, underpinning the feasibility of the NEPHELE architecture.
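The collision-free slotted operation described above can be sketched with a greedy first-fit allocator: each traffic demand receives a time slot such that no two demands sharing a source or destination endpoint use the same slot. This is a minimal sketch of the TDMA idea, not NEPHELE's actual scheduler, and all names are invented.

```python
def allocate_slots(demands, num_slots):
    """demands: list of (src, dst) rack pairs; returns {demand_index: slot}.

    Greedy first-fit: a slot is usable for a demand only if neither of
    its endpoints is already transmitting or receiving in that slot.
    """
    busy = [set() for _ in range(num_slots)]  # endpoints in use per slot
    assignment = {}
    for i, (src, dst) in enumerate(demands):
        for slot in range(num_slots):
            if src not in busy[slot] and dst not in busy[slot]:
                busy[slot].update((src, dst))
                assignment[i] = slot
                break
        else:
            raise RuntimeError(f"demand {i} unschedulable in {num_slots} slots")
    return assignment

demands = [("A", "B"), ("A", "C"), ("D", "B")]
assignment = allocate_slots(demands, 2)
```

Demands 1 and 2 both conflict with demand 0 (shared source A, shared destination B respectively), so they land in the second slot, where they do not conflict with each other.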
Yusof, Maryati Mohd; Kuljis, Jasna; Papazafeiropoulou, Anastasia; Stergioulas, Lampros K
2008-06-01
The realization of Health Information Systems (HIS) requires rigorous evaluation that addresses technology, human and organization issues. Our review indicates that current evaluation methods evaluate different aspects of HIS and can be improved upon. A new evaluation framework, human, organization and technology-fit (HOT-fit), was developed after a critical appraisal of the findings of existing HIS evaluation studies. HOT-fit builds on previous models of IS evaluation, in particular the IS Success Model and the IT-Organization Fit Model. This paper introduces the new framework for HIS evaluation that incorporates comprehensive dimensions and measures of HIS and provides a technological, human and organizational fit. The study comprised a literature review of HIS and IS evaluation studies and pilot testing of the developed framework. The framework was used to evaluate a Fundus Imaging System (FIS) of a primary care organization in the UK. The case study was conducted through observation, interview and document analysis. The main findings show that having the right user attitude and skills base, together with good leadership, an IT-friendly environment and good communication, can have a positive influence on system adoption. Comprehensive, specific evaluation factors, dimensions and measures in the new framework (HOT-fit) are applicable to HIS evaluation. The use of such a framework is argued to be useful not only for comprehensive evaluation of the particular FIS under investigation, but potentially also for any Health Information System in general.
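The dimensional structure of a framework like HOT-fit can be represented compactly in code. The sketch below is purely illustrative: the measure names and scores are invented and do not come from the paper; it only shows the pattern of grouping evaluation measures under the human, organization and technology dimensions and aggregating per dimension.

```python
# Invented example scores on a 1-5 scale, grouped by dimension.
scores = {
    "human":        {"user_satisfaction": 4, "system_use": 3},
    "organization": {"leadership": 5, "communication": 4},
    "technology":   {"system_quality": 3, "information_quality": 4},
}

def dimension_averages(scores):
    """Mean score per dimension, to compare fit across dimensions."""
    return {
        dim: sum(measures.values()) / len(measures)
        for dim, measures in scores.items()
    }
```

Such an aggregate makes it easy to see which dimension (human, organization, or technology) is the weakest fit in a given evaluation.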
The Advanced Technology Operations System: ATOS
NASA Technical Reports Server (NTRS)
Kaufeler, J.-F.; Laue, H. A.; Poulter, K.; Smith, H.
1993-01-01
Mission control systems supporting new space missions face ever-increasing requirements in terms of functionality, performance, reliability and efficiency. Modern data processing technology is providing the means to meet these requirements in new systems under development. During the past few years the European Space Operations Centre (ESOC) of the European Space Agency (ESA) has carried out a number of projects to demonstrate the feasibility of using advanced software technology, in particular, knowledge based systems, to support mission operations. A number of advances must be achieved before these techniques can be moved towards operational use in future missions, namely, integration of the applications into a single system framework and generalization of the applications so that they are mission independent. In order to achieve this goal, ESA initiated the Advanced Technology Operations System (ATOS) program, which will develop the infrastructure to support advanced software technology in mission operations, and provide applications modules to initially support: Mission Preparation, Mission Planning, Computer Assisted Operations, and Advanced Training. The first phase of the ATOS program is tasked with the goal of designing and prototyping the necessary system infrastructure to support the rest of the program. The major components of the ATOS architecture are presented. This architecture relies on the concept of a Mission Information Base (MIB) as the repository for all information and knowledge which will be used by the advanced application modules in future mission control systems. The MIB is being designed to exploit the latest in database and knowledge representation technology in an open and distributed system. In conclusion, the technological and implementation challenges expected to be encountered, as well as the future plans and time scale of the project, are presented.
Predicting the behavior of techno-social systems.
Vespignani, Alessandro
2009-07-24
We live in an increasingly interconnected world of techno-social systems, in which infrastructures composed of different technological layers are interoperating within the social component that drives their use and development. Examples are provided by the Internet, the World Wide Web, WiFi communication technologies, and transportation and mobility infrastructures. The multiscale nature and complexity of these networks are crucial features in understanding and managing the networks. The accessibility of new data and the advances in the theory and modeling of complex networks are providing an integrated framework that brings us closer to achieving true predictive power of the behavior of techno-social systems.
School-Based Technology Use Planning.
ERIC Educational Resources Information Center
Cradler, John
1994-01-01
Describes how to conduct systematic planning for technology use. The components of an effective technology use plan, derived from a comprehensive study of school-based technology, are given. Planning development, implementation, and evaluation steps are provided. Ten planning resource books are listed. (Contains five references.) (KRN)
MEMS Deformable Mirror Technology Development for Space-Based Exoplanet Detection
NASA Astrophysics Data System (ADS)
Bierden, Paul; Cornelissen, S.; Ryan, P.
2014-01-01
In the search for earth-like extrasolar planets that has become an important objective for NASA, a critical technology development requirement is to advance deformable mirror (DM) technology. High-actuator-count DMs are critical components for nearly all proposed coronagraph instrument concepts. The science case for exoplanet imaging is strong, and rapid recent advances in test beds with DMs made using microelectromechanical system (MEMS) technology have motivated a number of compelling mission concepts that set technical specifications for their use as wavefront controllers. This research will advance the technology readiness of the MEMS DM components that are currently at the forefront of the field, and the project will be led by the manufacturer of those components, Boston Micromachines Corporation (BMC). The project aims to demonstrate basic functionality and performance of this key component in critical test environments and in simulated operational environments, while establishing model-based predictions of its performance relative to launch and space environments. The current status of the project is presented, with modeling and initial test results.
Measuring Adverse Events in Helicopter Emergency Medical Services: Establishing Content Validity
Patterson, P. Daniel; Lave, Judith R.; Martin-Gill, Christian; Weaver, Matthew D.; Wadas, Richard J.; Arnold, Robert M.; Roth, Ronald N.; Mosesso, Vincent N.; Guyette, Francis X.; Rittenberger, Jon C.; Yealy, Donald M.
2015-01-01
Introduction We sought to create a valid framework for detecting Adverse Events (AEs) in the high-risk setting of Helicopter Emergency Medical Services (HEMS). Methods We assembled a panel of 10 expert clinicians (n=6 emergency medicine physicians and n=4 prehospital nurses and flight paramedics) affiliated with a large multi-state HEMS organization in the Northeast U.S. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the Content Validity Index (CVI), to quantify the validity of the framework’s content. Results The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: 1) a trigger tool, 2) a method for rating proximal cause, and 3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. Conclusions We demonstrate a standardized process for the development of a content valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS. PMID:24003951
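The first component of the consensus framework, the trigger tool, lends itself to a short sketch: each trigger is a predicate over a (highly simplified) patient-transport record, and any hit is flagged for the subsequent proximal-cause and severity review steps. The trigger names and thresholds below are invented for illustration and are not the panel's actual instrument.

```python
# Hypothetical trigger definitions over a simplified transport record.
TRIGGERS = {
    "hypoxia":     lambda rec: rec.get("spo2_min", 100) < 90,
    "hypotension": lambda rec: rec.get("sbp_min", 120) < 90,
}

def screen(record):
    """Return the list of triggers that fire for this record."""
    return [name for name, predicate in TRIGGERS.items() if predicate(record)]
```

A record that fires one or more triggers would then move to the framework's cause-rating and severity-rating steps for clinician review.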
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-27
... technologies, namely safety-critical processor-based signal or train control systems, including subsystems and... or train control system (including a subsystem or component thereof) that was in service as of June 6... processor-based signal or train control system, subsystem, or component.'' See 49 CFR 236.903. Under Subpart...
Research on Key Technology and Applications for Internet of Things
NASA Astrophysics Data System (ADS)
Chen, Xian-Yi; Jin, Zhi-Gang
The Internet of Things (IOT) has received growing attention from academia, industry, and government worldwide. The concept of IOT and the architecture of IOT are discussed. The key technologies of IOT, including Radio Frequency Identification technology, Electronic Product Code technology, and ZigBee technology, are analyzed. The framework of a digital agriculture application based on IOT is proposed.
NASA Astrophysics Data System (ADS)
Thylén, Lars
2006-07-01
The design and manufacture of components and systems underpin the European and indeed worldwide photonics industry. Optical materials and photonic components serve as the basis for systems building at different levels of complexity. In most cases, they perform a key function and dictate the performance of these systems. New products and processes will generate economic activity for the European photonics industry into the 21st century. However, progress will rely on Europe's ability to develop new and better materials, components and systems. To achieve success, photonic components and systems must: be reliable and inexpensive; be generic and adaptable; offer superior functionality; be innovative and protected by Intellectual Property; and be aligned to market opportunities. The challenge in the short, medium, and long term is to put a coordinating framework in place which will make the European activity in this technology area competitive with those in the US and Asia. In the short term the aim should be to facilitate the vibrant and profitable European photonics industry to further develop its ability to commercialize advances in photonic-related technologies. In the medium and longer terms the objective must be to place renewed emphasis on materials research and the design and manufacturing of key components and systems to form the critical link between scientific endeavour and commercial success. All these general issues are highly relevant for the component-intensive broadband communications industry. Also relevant for this development is the convergence of data and telecom, where the low cost of datacom meets the high reliability requirements of telecom. The text below is to a degree taken from the Strategic Research Agenda of the Technology Platform Photonics 21 [1], as this contains a concerted effort to iron out a strategy for the EU in the area of photonics components and systems.
Hierarchical control and performance evaluation of multi-vehicle autonomous systems
NASA Astrophysics Data System (ADS)
Balakirsky, Stephen; Scrapper, Chris; Messina, Elena
2005-05-01
This paper will describe how the Mobility Open Architecture Tools and Simulation (MOAST) framework can facilitate performance evaluations of RCS compliant multi-vehicle autonomous systems. This framework provides an environment that allows for simulated and real architectural components to function seamlessly together. By providing repeatable environmental conditions, this framework allows for the development of individual components as well as component performance metrics. MOAST is composed of high-fidelity and low-fidelity simulation systems, a detailed model of real-world terrain, actual hardware components, a central knowledge repository, and architectural glue to tie all of the components together. This paper will describe the framework's components in detail and provide an example that illustrates how the framework can be utilized to develop and evaluate a single architectural component through the use of repeatable trials and experimentation that includes both virtual and real components functioning together.
On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Patel, Umeshkumar; Vootukuru, Meg
2007-01-01
Sensor Web-based adaptation and sharing of space flight mission resources, including those of the Space-Ground and Control-User communication segment, could greatly benefit from utilization of heritage Internet Protocols and devices applied for Spaceflight (SpaceIP). This had been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine the SpaceIP utilization of those investments and acceptance in years to come. As with SpaceIP, commercial real-time and instrument-colocated computational resources, data compression and storage can be enabled on board a spacecraft and, in turn, support a powerful application to Sensor Web-based design of a spaceflight instrument. Sensor Web-enabled reconfiguration and adaptation of structures for hardware resources and information systems will commence application of Field Programmable Gate Arrays (FPGA) and other aerospace programmable logic devices for what this technology was intended. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications. However, they are still waiting to be explored. This is because there is a need for a new approach to spaceflight instrumentation in order to make these mature sensor web technologies applicable for spaceflight. In this paper we present an approach to developing related and enabling spaceflight instrument-level technologies based on the new concept of a representative spaceflight Instrument Sensor Web (ISW).
LISA Framework for Enhancing Gravitational Wave Signal Extraction Techniques
NASA Technical Reports Server (NTRS)
Thompson, David E.; Thirumalainambi, Rajkumar
2006-01-01
This paper describes the development of a Framework for benchmarking and comparing signal-extraction and noise-interference-removal methods that are applicable to interferometric Gravitational Wave detector systems. The primary use is towards comparing signal and noise extraction techniques at LISA frequencies from multiple (possibly confused) gravitational wave sources. The Framework includes extensive hybrid learning/classification algorithms, as well as post-processing regularization methods, and is based on a unique plug-and-play (component) architecture. Published methods for signal extraction and interference removal at LISA frequencies are being encoded, as well as multiple source noise models, so that the stiffness of GW Sensitivity Space can be explored under each combination of methods. Furthermore, synthetic datasets and source models can be created and imported into the Framework, and specific degraded numerical experiments can be run to test the flexibility of the analysis methods. The Framework also supports use of full current LISA Testbeds, Synthetic data systems, and Simulators already in existence through plug-ins and wrappers, thus preserving those legacy codes and systems intact. Because of the component-based architecture, all selected procedures can be registered or de-registered at run-time, and are completely reusable, reconfigurable, and modular.
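The run-time register/de-register behavior of such a plug-and-play component architecture can be sketched with a minimal registry. Class and method names below are illustrative, not the Framework's real API, and the stand-in "extraction method" is deliberately trivial.

```python
class ComponentRegistry:
    """Holds named, swappable processing components."""

    def __init__(self):
        self._components = {}

    def register(self, name, component):
        self._components[name] = component

    def deregister(self, name):
        self._components.pop(name, None)

    def has(self, name):
        return name in self._components

    def run(self, name, data):
        """Invoke a registered component on a data segment."""
        return self._components[name](data)

registry = ComponentRegistry()
# Stand-in "extraction method": report the peak sample of a segment.
registry.register("peak_finder", max)
```

Because components are looked up by name at call time, any procedure can be swapped or removed between runs without changing the framework code, which is the reuse property the abstract emphasizes.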
A national framework for monitoring and reporting on environmental sustainability in Canada.
Marshall, I B; Scott Smith, C A; Selby, C J
1996-01-01
In 1991, a collaborative project to revise the terrestrial component of a national ecological framework was undertaken with a wide range of stakeholders. This spatial framework consists of multiple, nested levels of ecological generalization with linkages to existing federal and provincial scientific databases. The broadest level of generalization is the ecozone. Macroclimate, major vegetation types and subcontinental scale physiographic formations constitute the definitive components of these major ecosystems. Ecozones are subdivided into approximately 200 ecoregions which are based on properties like regional physiography, surficial geology, climate, vegetation, soil, water and fauna. The ecozone and ecoregion levels of the framework have been depicted on a national map coverage at 1:7 500 000 scale. Ecoregions have been subdivided into ecodistricts based primarily on landform, parent material, topography, soils, waterbodies and vegetation at a scale (1:2 000 000) useful for environmental resource management, monitoring and modelling activities. Nested within the ecodistricts are the polygons that make up the Soil Landscapes of Canada series of 1:1 000 000 scale soil maps. The framework is supported by an ARC-INFO GIS at Agriculture Canada. The data model allows linkage to associated databases on climate, land use and socio-economic attributes.
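The nested levels of generalization described above (ecozone, then ecoregion, then ecodistrict) amount to an upward-resolving hierarchy, which a short sketch can make concrete. The ecodistrict identifiers below are invented for illustration; the region and zone names are drawn from Canada's actual framework but are used here only as examples.

```python
# Illustrative nested lookup: ecodistrict -> (ecoregion, ecozone).
HIERARCHY = {
    "ED-1021": ("Lake Erie Lowland", "Mixedwood Plains"),
    "ED-0440": ("Abitibi Plains", "Boreal Shield"),
}

def generalize(ecodistrict):
    """Resolve an ecodistrict upward through the nested hierarchy."""
    ecoregion, ecozone = HIERARCHY[ecodistrict]
    return {"ecodistrict": ecodistrict,
            "ecoregion": ecoregion,
            "ecozone": ecozone}
```

In the real framework each level also links to its own map scale and attribute databases (soils, climate, land use), so a lookup like this would join against those tables rather than a literal dictionary.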
Wind Power Opportunities in St. Thomas, USVI: A Site-Specific Evaluation and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lantz, E.; Warren, A.; Roberts, J. O.
This NREL technical report utilizes a development framework originated by NREL and known by the acronym SROPTTC to assist the U.S. Virgin Islands in identifying and understanding concrete opportunities for wind power development in the territory. The report covers each of the seven components of the SROPTTC framework: Site, Resource, Off-take, Permitting, Technology, Team, and Capital as they apply to wind power in the USVI and specifically to a site in Bovoni, St. Thomas. The report concludes that Bovoni peninsula is a strong candidate for utility-scale wind generation in the territory. It represents a reasonable compromise in terms of wind resource, distance from residences, and developable terrain. Hurricane risk and variable terrain on the peninsula and on potential equipment transport routes add technical and logistical challenges but do not appear to represent insurmountable barriers. In addition, integration of wind power into the St. Thomas power system will present operational challenges, but based on experience in other islanded power systems, there are reasonable solutions for addressing these challenges.
An extensible and lightweight architecture for adaptive server applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorton, Ian; Liu, Yan; Trivedi, Nihar
2008-07-10
Server applications augmented with behavioral adaptation logic can react to environmental changes, creating self-managing server applications with improved quality of service at runtime. However, developing adaptive server applications is challenging due to the complexity of the underlying server technologies and highly dynamic application environments. This paper presents an architecture framework, the Adaptive Server Framework (ASF), to facilitate the development of adaptive behavior for legacy server applications. ASF provides a clear separation between the implementation of adaptive behavior and the business logic of the server application. This means a server application can be extended with programmable adaptive features through the definition and implementation of control components defined in ASF. Furthermore, ASF is a lightweight architecture in that it incurs low CPU overhead and memory usage. We demonstrate the effectiveness of ASF through a case study, in which a server application dynamically determines the resolution and quality to scale an image based on the load of the server and network connection speed. The experimental evaluation demonstrates the performance gains possible by adaptive behavior and the low overhead introduced by ASF.
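The separation the abstract describes, adaptation logic kept apart from business logic, can be sketched roughly as follows; the policy thresholds and function names are invented for illustration and do not reflect ASF's actual interfaces:

```python
# Illustrative sketch: the control component (choose_quality) decides, the
# business logic (serve_image) only consults it. Thresholds are hypothetical.
def choose_quality(cpu_load, connection_kbps):
    """Adaptation policy: degrade image quality under load or on slow links."""
    if cpu_load > 0.8 or connection_kbps < 256:
        return "low"
    if cpu_load > 0.5 or connection_kbps < 1024:
        return "medium"
    return "high"


def serve_image(image, cpu_load, connection_kbps):
    # Business logic never inspects the environment directly; it asks the
    # control component for a decision, so either side can change independently.
    quality = choose_quality(cpu_load, connection_kbps)
    scale = {"low": 0.25, "medium": 0.5, "high": 1.0}[quality]
    return f"{image}@{int(scale * 100)}%"


print(serve_image("photo.jpg", cpu_load=0.9, connection_kbps=2000))  # -> photo.jpg@25%
```

Swapping in a different adaptation policy requires changing only `choose_quality`, which mirrors the paper's claim that adaptive features can be added to a legacy server without modifying its business logic.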
A review of predictive nonlinear theories for multiscale modeling of heterogeneous materials
NASA Astrophysics Data System (ADS)
Matouš, Karel; Geers, Marc G. D.; Kouznetsova, Varvara G.; Gillman, Andrew
2017-02-01
Since the beginning of the industrial age, material performance and design have been in the midst of innovation of many disruptive technologies. Today's electronics, space, medical, transportation, and other industries are enriched by development, design and deployment of composite, heterogeneous and multifunctional materials. As a result, materials innovation is now considerably outpaced by other aspects from component design to product cycle. In this article, we review predictive nonlinear theories for multiscale modeling of heterogeneous materials. Deeper attention is given to multiscale modeling in space and to computational homogenization in addressing challenging materials science questions. Moreover, we discuss a state-of-the-art platform in predictive image-based, multiscale modeling with co-designed simulations and experiments that executes on the world's largest supercomputers. Such a modeling framework consists of experimental tools, computational methods, and digital data strategies. Once fully completed, this collaborative and interdisciplinary framework can be the basis of Virtual Materials Testing standards and aids in the development of new material formulations. Moreover, it will decrease the time to market of innovative products.
The quality of hospital-based antenatal care in Istanbul.
Turan, Janet Molzan; Bulut, Ayşpen; Nalbant, Hacer; Ortayli, Nuriye; Akalin, A Arzu Koloğlu
2006-03-01
The aim of this study was to gather comprehensive data from three hospitals in Istanbul, Turkey, in order to gain in-depth understanding of the quality of antenatal care in this setting. The Bruce-Jain framework for quality of care was adapted for use in evaluating antenatal care. Methods included examination of hospital records, in-depth interviews, exit questionnaires, and structured observations. The study revealed deficiencies in the quality of antenatal care being delivered at the study hospitals in all six elements of the quality-of-care framework. The technical content of visits varied greatly among the hospitals, and an overuse of technology was accompanied by neglect of some essential components of antenatal care. Although at the private hospital some problems with the technical content of care were identified, client satisfaction was higher there, where the care included good interpersonal relations, information provision, and continuity. Providers at all three hospitals felt constrained by heavy patient loads and a lack of resources. Multifaceted approaches are needed to improve the quality of antenatal care in this setting.
Kirkilionis, Markus; Janus, Ulrich; Sbano, Luca
2011-09-01
We model in detail a simple synthetic genetic clock that was engineered in Atkinson et al. (Cell 113(5):597-607, 2003) using Escherichia coli as a host organism. The theoretical description of this engineered clock uses the modelling framework presented in Kirkilionis et al. (Theory Biosci. doi: 10.1007/s12064-011-0125-0 , 2011, this volume). The main goal of this accompanying article is to illustrate that parts of the modelling process can be algorithmically automatised once the modelling framework we called 'average dynamics' is accepted (Sbano and Kirkilionis, WMI Preprint 7/2007, 2008c; Kirkilionis and Sbano, Adv Complex Syst 13(3):293-326, 2010). The advantage of the 'average dynamics' framework is that system components (especially in genetics) can be represented more easily in the model. In particular, specific molecular players, once discovered and characterised, can be incorporated together with their function. This means that, for example, the 'gene' concept becomes clearer in terms of the way the genetic component would react under different regulatory conditions. Using the framework, it has become a realistic aim to link mathematical modelling to novel tools of bioinformatics in the future, at least if the number of regulatory units can be estimated. This should hold in any case in synthetic environments, because the different synthetic genetic components are simply known (Elowitz and Leibler, Nature 403(6767):335-338, 2000; Gardner et al., Nature 403(6767):339-342, 2000; Hasty et al., Nature 420(6912):224-230, 2002). The paper therefore illustrates, as a necessary first step, how a detailed modelling of molecular interactions with known molecular components leads to a dynamic mathematical model that can be compared to experimental results on various levels or scales. The different genetic modules or components are represented in different detail by model variants.
We explain how the framework can be used for investigating other more complex genetic systems in terms of regulation and feedback.
Kobak, Roger; Zajac, Kristyn; Herres, Joanna; KrauthamerEwing, E. Stephanie
2016-01-01
The emergence of attachment-based treatments (ABTs) for adolescents highlights the need to more clearly define and evaluate these treatments in the context of other attachment-based treatments for young children and adults. We propose a general framework for defining and evaluating ABTs that describes the cyclical processes that are required to maintain a secure attachment bond. This secure cycle incorporates three components: 1) the child or adult’s internal working model (IWM) of the caregiver; 2) emotionally attuned communication; and 3) the caregiver’s IWM of the child or adult. We briefly review Bowlby, Ainsworth, and Main’s contributions to defining the components of the secure cycle and discuss how this framework can be adapted for understanding the process of change in ABTs. For clinicians working with adolescents, our model can be used to identify how deviations from the secure cycle (attachment injuries, empathic failures and mistuned communication) contribute to family distress and psychopathology. The secure cycle also provides a way of describing the ABT elements that have been used to revise IWMs or improve emotionally attuned communication. For researchers, our model provides a guide for conceptualizing and measuring change in attachment constructs and how change in one component of the interpersonal cycle should generalize to other components. PMID:25744572
Beacon communities' public health initiatives: a case study analysis.
Massoudi, Barbara L; Marcial, Laura H; Haque, Saira; Bailey, Robert; Chester, Kelley; Cunningham, Shellery; Riley, Amanda; Soper, Paula
2014-01-01
The Beacon Communities for Public Health (BCPH) project was launched in 2011 to gain a better understanding of the range of activities currently being conducted in population and public health by the Beacon Communities. The project highlighted the successes and challenges of these efforts with the aim of sharing this information broadly among the public health community. The Beacon Community Program, designed to showcase technology-enabled, community-based initiatives to improve outcomes, focused on: building and strengthening health information technology (IT) infrastructure and exchange capabilities; translating investments in health IT to measurable improvements in cost, quality, and population health; and developing innovative approaches to performance measurement, technology, and care delivery. Four multimethod case studies were conducted based on a modified sociotechnical framework to learn more about public health initiative implementation and use in the Beacon Communities. Our methodological approach included document review and semistructured key informant interviews. NACCHO Model Practice Program criteria were used to select the public health initiatives included in the case studies. Despite differences among the case studies, common barriers and facilitators were found to be present in all areas of the sociotechnical framework application including structure, people, technology, tasks, overarching considerations, and sustainability. Overall, there were many more facilitators (range = 7-14) present for each Beacon compared to barriers (range = 4-6). Four influential promising practices were identified through the work: forging strong and sustainable partnerships; ensuring a good task-technology fit and a flexible and iterative design; fostering technology acceptance; and providing education and demonstrating value. A common weakness was the lack of a framework or model for the Beacon Communities evaluation work.
Sharing a framework or approach to evaluation at the beginning of implementation made the work more effective. Supporting evaluation to inform future implementations is important.
Integrating uncertainty into public energy research and development decisions
NASA Astrophysics Data System (ADS)
Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina
2017-05-01
Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.
NASA Astrophysics Data System (ADS)
Yang, Yang; Peng, Zhike; Dong, Xingjian; Zhang, Wenming; Clifton, David A.
2018-03-01
A challenge in analysing non-stationary multi-component signals is to isolate nonlinearly time-varying signals, especially when they overlap in the time-frequency plane. In this paper, a framework integrating time-frequency analysis-based demodulation and a non-parametric Gaussian latent feature model is proposed to isolate and recover components of such signals. The former aims to remove high-order frequency modulation (FM) such that the latter is able to infer demodulated components while simultaneously discovering the number of the target components. The proposed method is effective in isolating multiple components that have the same FM behavior. In addition, the results show that the proposed method is superior to the generalised demodulation with singular-value decomposition-based method, the parametric time-frequency analysis with filter-based method, and the empirical mode decomposition-based method in recovering the amplitude and phase of superimposed components.
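The core demodulation idea, removing an estimated FM law so a frequency-modulated component collapses to a near-constant frequency, can be illustrated with a minimal NumPy sketch. This is not the authors' algorithm: here the FM law is assumed known, whereas the paper estimates it from a time-frequency representation.

```python
import numpy as np

# A chirp whose instantaneous frequency sweeps from 50 Hz upward.
fs = 1000
t = np.arange(0, 1, 1 / fs)
phase = 2 * np.pi * (50 * t + 20 * t ** 2)
signal = np.cos(phase)

# Remove the (assumed-known) FM part; what remains concentrates at the carrier.
est_phase = 2 * np.pi * 20 * t ** 2
demodulated = signal * np.exp(-1j * est_phase)

spectrum = np.abs(np.fft.fft(demodulated))
freqs = np.fft.fftfreq(len(t), 1 / fs)
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # -> 50.0
```

Before demodulation the chirp's energy is smeared across an 80 Hz band; after multiplying by the conjugate FM phase, the positive-frequency part becomes a pure 50 Hz tone, which is what allows a simple narrowband model to infer the component.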
Status of the calibration and alignment framework at the Belle II experiment
NASA Astrophysics Data System (ADS)
Dossett, D.; Sevior, M.; Ritter, M.; Kuhr, T.; Bilka, T.; Yaschenko, S.
2017-10-01
The Belle II detector at the SuperKEKB e+e- collider plans to take first collision data in 2018. The monetary and CPU time costs associated with storing and processing the data mean that it is crucial for the detector components at Belle II to be calibrated quickly and accurately. A fast and accurate calibration system would allow the high-level trigger to increase the efficiency of event selection, and can give users analysis-quality reconstruction promptly. A flexible framework to automate the fast production of calibration constants is being developed in the Belle II Analysis Software Framework (basf2). Detector experts only need to create two components from C++ base classes in order to use the automation system. The first collects data from Belle II event data files and outputs much smaller files to pass to the second component. This runs the main calibration algorithm to produce calibration constants ready for upload into the conditions database. A Python framework coordinates the input files, order of processing, and submission of jobs. Splitting the operation into collection and algorithm processing stages allows the framework to optionally parallelize the collection stage on a batch system.
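The two-stage design described above (a collector that reduces event files to small summaries, and an algorithm that turns summaries into constants) can be sketched schematically as follows. These Python classes are illustrative assumptions and do not reproduce the actual basf2 C++ base-class interfaces:

```python
# Schematic collector/algorithm split, mirroring the two components detector
# experts implement. Names and data shapes are hypothetical.
class Collector:
    """Stage 1: reduce an event data file to a small intermediate summary."""

    def collect(self, events):
        return {"mean": sum(events) / len(events), "n": len(events)}


class CalibrationAlgorithm:
    """Stage 2: merge collected summaries into a calibration constant."""

    def calibrate(self, summaries):
        total = sum(s["mean"] * s["n"] for s in summaries)
        n = sum(s["n"] for s in summaries)
        return {"offset": total / n}  # constant ready for the conditions DB


# The collection stage can run in parallel over input files (e.g. on a batch
# system); the algorithm stage then merges the per-file summaries.
summaries = [Collector().collect(chunk) for chunk in ([1.0, 2.0], [3.0, 5.0])]
constants = CalibrationAlgorithm().calibrate(summaries)
print(constants)  # -> {'offset': 2.75}
```

Because the intermediate summaries are much smaller than the event files, only the cheap second stage needs to run serially, which is the motivation the abstract gives for splitting the operation.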
A modeling framework for exposing risks in complex systems.
Sharit, J
2000-08-01
This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.
Karanja, Sarah; Mbuagbaw, Lawrence; Ritvo, Paul; Law, Judith; Kyobutungi, Catherine; Reid, Graham; Ram, Ravi; Estambale, Benson; Lester, Richard
2011-01-01
mHealth is a term used to refer to mobile technologies such as personal digital assistants and mobile phones for healthcare. mHealth initiatives to support care and treatment of patients are emerging globally and this workshop brought together researchers, policy makers, information, communication and technology programmers, academics and civil society representatives for one and a half days synergy meeting in Kenya to review regional evidence based mHealth research for HIV care and treatment, review mHealth technologies for adherence and retention interventions in anti-retroviral therapy (ART) programs and develop a framework for scale up of evidence based mHealth interventions. The workshop was held in May 2011 in Nairobi, Kenya and was funded by the Canadian Global Health Research Initiatives (GHRI) and the US Centre for Disease Control and Prevention (CDC). At the end of the workshop participants came up with a framework to guide mHealth initiatives in the region and a plan to work together in scaling up evidence based mHealth interventions. The participants acknowledged the importance of the meeting in setting the pace for strengthening and coordinating mHealth initiatives and unanimously agreed to hold a follow up meeting after three months. PMID:22187619
Surgical simulation: Current practices and future perspectives for technical skills training.
Bjerrum, Flemming; Thomsen, Ann Sofia Skou; Nayahangan, Leizl Joy; Konge, Lars
2018-06-17
Simulation-based training (SBT) has become a standard component of modern surgical education, yet successful implementation of evidence-based training programs remains challenging. In this narrative review, we use Kern's framework for curriculum development to describe where we are now and what lies ahead for SBT within surgery, with a focus on technical skills in operative procedures. Despite principles for optimal SBT (proficiency-based, distributed, and deliberate practice) having been identified, massed training with fixed time intervals or a fixed number of repetitions is still being extensively used, and simulators are generally underutilized. SBT should be part of surgical training curricula, including theoretical, technical, and non-technical skills, and be based on relevant needs assessments. Furthermore, training should follow evidence-based theoretical principles for optimal training, and the effect of training needs to be evaluated using relevant outcomes. There is a larger, still unrealized potential of surgical SBT, which may be realized in the near future as simulator technologies evolve, more evidence-based training programs are implemented, and cost-effectiveness and impact on patient safety are clearly demonstrated.
ERIC Educational Resources Information Center
Zia, Lee L.; Van de Sompel, Herbert; Beit-Arie, Oren; Gambles, Anne
2001-01-01
Includes three articles that discuss the National Science Foundation's National Science, Mathematics, Engineering, and Technology Education Digital Library (NSDL) program; the OpenURL framework for open reference linking in the Web-based scholarly information environment; and HeadLine (Hybrid Electronic Access and Delivery in the Library Networked…
Robust and Effective Component-based Banknote Recognition for the Blind
Hasanuzzaman, Faiz M.; Yang, Xiaodong; Tian, YingLi
2012-01-01
We develop a novel camera-based computer vision technology to automatically recognize banknotes for assisting visually impaired people. Our banknote recognition system is robust and effective with the following features: 1) high accuracy: high true recognition rate and low false recognition rate, 2) robustness: handles a variety of currency designs and bills in various conditions, 3) high efficiency: recognizes banknotes quickly, and 4) ease of use: helps blind users to aim the target for image capture. To make the system robust to a variety of conditions including occlusion, rotation, scaling, cluttered background, illumination change, viewpoint variation, and worn or wrinkled bills, we propose a component-based framework by using Speeded Up Robust Features (SURF). Furthermore, we employ the spatial relationship of matched SURF features to detect if there is a bill in the camera view. This process largely alleviates false recognition and can guide the user to correctly aim at the bill to be recognized. The robustness and generalizability of the proposed system are evaluated on a dataset including both positive images (with U.S. banknotes) and negative images (no U.S. banknotes) collected under a variety of conditions. The proposed algorithm achieves 100% true recognition rate and 0% false recognition rate. Our banknote recognition system is also tested by blind users. PMID:22661884
Beyond the plot: technology extrapolation domains for scaling out agronomic science
NASA Astrophysics Data System (ADS)
Rattalino Edreira, Juan I.; Cassman, Kenneth G.; Hochman, Zvi; van Ittersum, Martin K.; van Bussel, Lenny; Claessens, Lieven; Grassini, Patricio
2018-05-01
Ensuring an adequate food supply in systems that protect environmental quality and conserve natural resources requires productive and resource-efficient cropping systems on existing farmland. Meeting this challenge will be difficult without a robust spatial framework that facilitates rapid evaluation and scaling-out of currently available and emerging technologies. Here we develop a global spatial framework to delineate ‘technology extrapolation domains’ based on key climate and soil factors that govern crop yields and yield stability in rainfed crop production. The proposed framework adequately represents the spatial pattern of crop yields and stability when evaluated over the data-rich US Corn Belt. It also facilitates evaluation of cropping system performance across continents, which can improve efficiency of agricultural research that seeks to intensify production on existing farmland. Populating this biophysical spatial framework with appropriate socio-economic attributes provides the potential to amplify the return on investments in agricultural research and development by improving the effectiveness of research prioritization and impact assessment.
A Software Framework for Remote Patient Monitoring by Using Multi-Agent Systems Support
2017-01-01
Background Although there have been significant advances in network, hardware, and software technologies, the health care environment has not taken advantage of these developments to solve many of its inherent problems. Research activities in these 3 areas make it possible to apply advanced technologies to address many of these issues such as real-time monitoring of a large number of patients, particularly where a timely response is critical. Objective The objective of this research was to design and develop innovative technological solutions to offer a more proactive and reliable medical care environment. The short-term and primary goal was to construct IoT4Health, a flexible software framework to generate a range of Internet of things (IoT) applications, containing components such as multi-agent systems that are designed to perform Remote Patient Monitoring (RPM) activities autonomously. An investigation into its full potential to conduct such patient monitoring activities in a more proactive way is an expected future step. Methods A framework methodology was selected to evaluate whether the RPM domain had the potential to generate customized applications that could achieve the stated goal of being responsive and flexible within the RPM domain. As a proof of concept of the software framework’s flexibility, 3 applications were developed with different implementations for each framework hot spot to demonstrate potential. Agents4Health was selected to illustrate the instantiation process and IoT4Health’s operation. To develop more concrete indicators of the responsiveness of the simulated care environment, an experiment was conducted while Agents4Health was operating, to measure the number of delays incurred in monitoring the tasks performed by agents. Results IoT4Health’s construction can be highlighted as our contribution to the development of eHealth solutions. As a software framework, IoT4Health offers extensibility points for the generation of applications. 
Applications can extend the framework in the following ways: identification, collection, storage, recovery, visualization, monitoring, anomalies detection, resource notification, and dynamic reconfiguration. Based on other outcomes involving observation of the resulting applications, it was noted that its design contributed toward more proactive patient monitoring. Through these experimental systems, anomalies were detected in real time, with agents sending notifications instantly to the health providers. Conclusions We conclude that the cost-benefit of the construction of a more generic and complex system instead of a custom-made software system demonstrated the worth of the approach, making it possible to generate applications in this domain in a more timely fashion. PMID:28347973
EMMA: a new paradigm in configurable software
Nogiec, J. M.; Trombly-Freytag, K.
2017-11-23
EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. As a result, it provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.
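The loosely coupled, event-driven composition EMMA is described as using can be sketched with a minimal event bus; the component and event names below are invented for illustration and are not part of EMMA:

```python
from collections import defaultdict


# Minimal event-bus sketch: publishers never reference subscribers directly,
# so components stay independent and can be composed in different ways.
class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self._subscribers[event_type]:
            handler(payload)


bus = EventBus()
readings = []
# Two independent components react to the same (hypothetical) event type.
bus.subscribe("measurement", readings.append)
bus.subscribe("measurement", lambda r: print("logged", r))
bus.publish("measurement", {"probe": "B1", "value": 0.42})
```

Adding, removing, or reusing a component is just a `subscribe` call, which is the composability and loose coupling the abstract attributes to the framework, without the deployment overhead of full microservices.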