Evolution of the VLT instrument control system toward industry standards
NASA Astrophysics Data System (ADS)
Kiekebusch, Mario J.; Chiozzi, Gianluca; Knudstrup, Jens; Popovic, Dan; Zins, Gerard
2010-07-01
The VLT control system is a large distributed system consisting of Linux workstations providing the high-level coordination and interfaces to the users, and VME-based Local Control Units (LCUs) running the VxWorks real-time operating system with commercial and proprietary boards acting as the interface to the instrument functions. After more than 10 years of VLT operations, some of the technologies used by the astronomical instruments are being discontinued, making it difficult to find adequate hardware for future projects. In order to deal with this obsolescence, the VLT Instrumentation Framework is being extended to adopt well-established Commercial Off The Shelf (COTS) components connected through industry-standard fieldbuses. This ensures a flexible, state-of-the-art hardware configuration for the next generation of VLT instruments, allowing access to instrument devices via more compact and simpler control units like PC-based Programmable Logical Controllers (PLCs). It also makes it possible to control devices directly from the Instrument Workstation through a normal Ethernet connection. This paper outlines the requirements that motivated this work, as well as the architecture and the design of the framework extension. In addition, it describes the preliminary results on a use case, a VLTI visitor instrument used as a pilot project to validate the concepts and the suitability of COTS products like PC-based PLCs, EtherCAT and OPC UA as solutions for instrument control.
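As an illustration of the kind of direct Ethernet access to PC-based PLCs described above, the following minimal Python sketch reads and writes a PLC variable over OPC UA using the open-source FreeOpcUa client library; the endpoint address and node identifier are hypothetical placeholders and are not taken from the VLT framework itself.

    from opcua import Client  # FreeOpcUa client library (pip install opcua)

    # Hypothetical PLC endpoint and node id; a real instrument exposes its own address space.
    client = Client("opc.tcp://plc-host:4840")
    client.connect()
    try:
        node = client.get_node("ns=4;s=MAIN.fFocusPosition")  # hypothetical symbolic node
        print("current value:", node.get_value())             # read the PLC variable
        node.set_value(12.5)                                   # command a new setpoint
    finally:
        client.disconnect()

In the architecture outlined in the paper, such reads and writes would be wrapped by the instrumentation framework rather than issued directly from application code.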
Flight Software Development for the CHEOPS Instrument with the CORDET Framework
NASA Astrophysics Data System (ADS)
Cechticky, V.; Ottensamer, R.; Pasetti, A.
2015-09-01
CHEOPS is an ESA S-class mission dedicated to the precise measurement of radii of already known exoplanets using ultra-high precision photometry. The instrument flight software controlling the instrument and handling the science data is developed by the University of Vienna using the CORDET Framework offered by P&P Software GmbH. The CORDET Framework provides a generic software infrastructure for PUS-based applications. This paper describes how the framework is used for the CHEOPS application software to provide a consistent solution for the communication and control services, event handling and FDIR procedures. This approach is innovative in four respects: (a) it is a true third-party reuse; (b) re-use is done at specification, validation and code level; (c) the re-usable assets and their qualification data package are entirely open-source; (d) re-use is based on call-back with the application developer providing functions which are called by the reusable architecture.
Instrument Remote Control via the Astronomical Instrument Markup Language
NASA Technical Reports Server (NTRS)
Sall, Ken; Ames, Troy; Warsaw, Craig; Koons, Lisa; Shafer, Richard
1998-01-01
The Instrument Remote Control (IRC) project ongoing at NASA's Goddard Space Flight Center's (GSFC) Information Systems Center (ISC) supports NASA's mission by defining an adaptive intranet-based framework that provides robust interactive and distributed control and monitoring of remote instruments. An astronomical IRC architecture that combines the platform-independent processing capabilities of Java with the power of Extensible Markup Language (XML) to express hierarchical data in an equally platform-independent, as well as human readable manner, has been developed. This architecture is implemented using a variety of XML support tools and Application Programming Interfaces (API) written in Java. IRC will enable trusted astronomers from around the world to easily access infrared instruments (e.g., telescopes, cameras, and spectrometers) located in remote, inhospitable environments, such as the South Pole, a high Chilean mountaintop, or an airborne observatory aboard a Boeing 747. Using IRC's frameworks, an astronomer or other scientist can easily define the type of onboard instrument, control the instrument remotely, and return monitoring data all through the intranet. The Astronomical Instrument Markup Language (AIML) is the first implementation of the more general Instrument Markup Language (IML). The key aspects of our approach to instrument description and control apply to many domains, from medical instruments to machine assembly lines. The concepts behind AIML apply equally well to the description and control of instruments in general. IRC enables us to apply our techniques to several instruments, preferably from different observatories.
Lesher, Danielle Ann-Marie; Mulcahey, M J; Hershey, Peter; Stanton, Donna Breger; Tiedgen, Andrea C
We sought to identify outcome instruments used in rehabilitation of the hand and upper extremity; to determine their alignment with the constructs of the International Classification of Functioning, Disability and Health (ICF) and the Occupational Therapy Practice Framework: Domain and Process; and to report gaps in the constructs measured by outcome instruments as a basis for future research. We searched CINAHL, MEDLINE, OTseeker, and the Cochrane Central Register of Controlled Trials using scoping review methodology and evaluated outcome instruments for concordance with the ICF and the Framework. We identified 18 outcome instruments for analysis. The findings pertain to occupational therapists' focus on body functions, body structures, client factors, and activities of daily living; a gap in practice patterns in use of instruments; and overestimation of the degree to which instruments used are occupationally based. Occupational therapy practitioners should use outcome instruments that embody conceptual frameworks for classifying function and activity. Copyright © 2017 by the American Occupational Therapy Association, Inc.
Control software and electronics architecture design in the framework of the E-ELT instrumentation
NASA Astrophysics Data System (ADS)
Di Marcantonio, P.; Coretti, I.; Cirami, R.; Comari, M.; Santin, P.; Pucillo, M.
2010-07-01
In recent years the European Southern Observatory (ESO), in collaboration with other European astronomical institutes, has started several feasibility studies for the E-ELT (European Extremely Large Telescope) instrumentation and post-focal adaptive optics. The goal is to create a flexible suite of instruments to deal with the wide variety of scientific questions astronomers would like to see solved in the coming decades. In this framework the INAF-Astronomical Observatory of Trieste (INAF-AOTs) is currently responsible for carrying out the analysis and the preliminary study of the architecture of the electronics and control software of three instruments: CODEX (control software and electronics) and OPTIMOS-EVE/OPTIMOS-DIORAMAS (control software). To cope with the increased complexity and new requirements for stability, precision, real-time latency and communications among sub-systems imposed by these instruments, new solutions have been investigated by our group. In this paper we present the proposed software and electronics architecture based on a distributed common framework centered on the Component/Container model that uses OPC Unified Architecture as a standard layer to communicate with COTS components of three different vendors. We describe three working prototypes that have been set up in our laboratory and discuss their performance, integration complexity and ease of deployment.
Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.
Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D
2016-12-01
Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.
Note: Tormenta: An open source Python-powered control software for camera based optical microscopy
NASA Astrophysics Data System (ADS)
Barabas, Federico M.; Masullo, Luciano A.; Stefani, Fernando D.
2016-12-01
Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.
XML in an Adaptive Framework for Instrument Control
NASA Technical Reports Server (NTRS)
Ames, Troy J.
2004-01-01
NASA Goddard Space Flight Center is developing an extensible framework for instrument command and control, known as Instrument Remote Control (IRC), that combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms.
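To make the description-driven idea concrete, here is a small Python sketch that parses a simplified, hypothetical instrument description in the spirit of IML; the element and attribute names are illustrative assumptions and do not reproduce the actual IML schema.

    import xml.etree.ElementTree as ET

    # Hypothetical, simplified description; the real IML schema is richer.
    IML_EXAMPLE = """
    <Instrument name="ExampleCamera">
      <CommandSet>
        <Command name="SET_EXPOSURE">
          <Argument name="seconds" type="float"/>
        </Command>
        <Command name="READ_FRAME"/>
      </CommandSet>
      <DataStream name="science" format="binary"/>
    </Instrument>
    """

    root = ET.fromstring(IML_EXAMPLE)
    # Build a command dictionary the framework could use to validate and format commands.
    commands = {cmd.get("name"): [arg.get("name") for arg in cmd.findall("Argument")]
                for cmd in root.find("CommandSet")}
    print(root.get("name"), commands)  # -> ExampleCamera {'SET_EXPOSURE': ['seconds'], 'READ_FRAME': []}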
Instrument Remote Control Application Framework
NASA Technical Reports Server (NTRS)
Ames, Troy; Hostetter, Carl F.
2006-01-01
The Instrument Remote Control (IRC) architecture is a flexible, platform-independent application framework that is well suited for the control and monitoring of remote devices and sensors. IRC enables significant savings in development costs by utilizing Extensible Markup Language (XML) descriptions to configure the framework for a specific application. The Instrument Markup Language (IML) is used to describe the commands used by an instrument, the data streams produced, the rules for formatting commands and parsing the data, and the method of communication. Often no custom code is needed to communicate with a new instrument or device. An IRC instance can advertise and publish a description about a device or subscribe to another device's description on a network. This simple capability of dynamically publishing and subscribing to interfaces enables a very flexible, self-adapting architecture for monitoring and control of complex instruments in diverse environments.
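The publish/subscribe behaviour described above can be sketched, in a much simplified form, with a plain UDP broadcast; this is only a conceptual analogue of IRC's description advertisement (IRC itself uses JXTA, as described in the related abstracts below), and the port number is an arbitrary assumption.

    import json
    import socket

    DISCOVERY_PORT = 9876  # arbitrary port chosen for this sketch

    def advertise(description):
        """Broadcast a device description (a dict) on the local network."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(json.dumps(description).encode(), ("<broadcast>", DISCOVERY_PORT))
        sock.close()

    def listen_once(timeout_s=5.0):
        """Wait for one advertised description, or return None on timeout."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", DISCOVERY_PORT))
        sock.settimeout(timeout_s)
        try:
            data, _addr = sock.recvfrom(65535)
            return json.loads(data)
        except socket.timeout:
            return None
        finally:
            sock.close()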
Agricultural pollution control under Spanish and European environmental policies
NASA Astrophysics Data System (ADS)
MartíNez, Yolanda; Albiac, José
2004-10-01
Nonpoint pollution from agriculture is an important environmental policy issue in Spain and the European Union. Agricultural pollution in Spain is being addressed by the National Irrigation Plan and by the European Water Framework Directive. This article contributes to the ongoing policy decision process by analyzing nonpoint pollution control and presenting results on the efficiency of abatement measures. The results question the reliance of the Water Framework Directive on water pricing as a pollution instrument for reaching good status for all waters, because the higher water prices close to full recovery cost advocated by the directive appear to be inefficient as an emission control instrument. Another important result is that abatement measures based on input taxes and standards on nitrogen appear to be more suitable than the National Irrigation Plan subsidies designed to promote irrigation investments. The results also contribute further evidence to the discussion on the appropriate instrument base for pollution control, proving that nonpoint pollution control instruments cannot be assessed accurately without a correct understanding of the key underlying biophysical processes. Nonpoint pollution is characterized by nonlinearities, dynamics, and spatial dependency, and neglect of the dynamic aspects may lead to serious consequences for the design of measures. Finally, a quantitative assessment has been performed to explore discriminating measures based on crop pollution potential on vulnerable soils. No significant welfare gains are found from discriminating control, although results are contingent upon the level of damage, and discrimination could be justified in areas with valuable ecosystems and severe pollution damages.
NASA Astrophysics Data System (ADS)
Pérez-López, F.; Vallejo, J. C.; Martínez, S.; Ortiz, I.; Macfarlane, A.; Osuna, P.; Gill, R.; Casale, M.
2015-09-01
BepiColombo is an interdisciplinary ESA mission to explore the planet Mercury in cooperation with JAXA. The mission consists of two separate orbiters: ESA's Mercury Planetary Orbiter (MPO) and JAXA's Mercury Magnetospheric Orbiter (MMO), which are dedicated to the detailed study of the planet and its magnetosphere. The MPO scientific payload comprises eleven instrument packages covering different disciplines, developed by several European teams. This paper describes the design and development approach of the framework required to support the operation of the distributed BepiColombo MPO instrument pipelines, which are developed and operated from different locations but designed as a single entity. An architecture based on a primary-redundant configuration, fully integrated into the BepiColombo Science Operations Control System (BSCS), has been selected, in which some instrument pipelines will be operated from the instrument teams' data processing centres, with a pipeline replica that can be run from the Science Ground Segment (SGS), while others will be executed as primary pipelines from the SGS, with the SGS adopting the pipeline orchestration role.
A review of instruments to measure interprofessional team-based primary care.
Shoemaker, Sarah J; Parchman, Michael L; Fuda, Kathleen Kerwin; Schaefer, Judith; Levin, Jessica; Hunt, Meaghan; Ricciardi, Richard
2016-07-01
Interprofessional team-based care is increasingly regarded as an important feature of delivery systems redesigned to provide more efficient and higher quality care, including primary care. Measurement of the functioning of such teams might enable improvement of team effectiveness and could facilitate research on team-based primary care. Our aims were to develop a conceptual framework of high-functioning primary care teams to identify and review instruments that measure the constructs identified in the framework, and to create a searchable, web-based atlas of such instruments (available at: http://primarycaremeasures.ahrq.gov/team-based-care/ ). Our conceptual framework was developed from existing frameworks, the teamwork literature, and expert input. The framework is based on an Input-Mediator-Output model and includes 12 constructs to which we mapped both instruments as a whole, and individual instrument items. Instruments were also reviewed for relevance to measuring team-based care, and characterized. Instruments were identified from peer-reviewed and grey literature, measure databases, and expert input. From nearly 200 instruments initially identified, we found 48 to be relevant to measuring team-based primary care. The majority of instruments were surveys (n = 44), and the remainder (n = 4) were observational checklists. Most instruments had been developed/tested in healthcare settings (n = 30) and addressed multiple constructs, most commonly communication (n = 42), heedful interrelating (n = 42), respectful interactions (n = 40), and shared explicit goals (n = 37). The majority of instruments had some reliability testing (n = 39) and over half included validity testing (n = 29). Currently available instruments offer promise to researchers and practitioners to assess teams' performance, but additional work is needed to adapt these instruments for primary care settings.
Distributed framework for dynamic telescope and instrument control
NASA Astrophysics Data System (ADS)
Ames, Troy J.; Case, Lynne
2003-02-01
Traditionally, instrument command and control systems have been developed specifically for a single instrument. Such solutions are frequently expensive and are inflexible to support the next instrument development effort. NASA Goddard Space Flight Center is developing an extensible framework, known as Instrument Remote Control (IRC) that applies to any kind of instrument that can be controlled by a computer. IRC combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms. The IRC framework provides the ability to communicate to components anywhere on a network using the JXTA protocol for dynamic discovery of distributed components. JXTA (see http://www.jxta.org) is a generalized protocol that allows any devices connected by a network to communicate in a peer-to-peer manner. IRC uses JXTA to advertise a device's IML and discover devices of interest on the network. Devices can join or leave the network and thus join or leave the instrument control environment of IRC. Currently, several astronomical instruments are working with the IRC development team to develop custom components for IRC to control their instruments. These instruments include: High resolution Airborne Wideband Camera (HAWC), a first light instrument for the Stratospheric Observatory for Infrared Astronomy (SOFIA); Submillimeter And Far Infrared Experiment (SAFIRE), a Principal Investigator instrument for SOFIA; and Fabry-Perot Interferometer Bolometer Research Experiment (FIBRE), a prototype of the SAFIRE instrument, used at the Caltech Submillimeter Observatory (CSO). Most recently, we have been working with the Submillimetre High Angular Resolution Camera IInd Generation (SHARCII) at the CSO to investigate using IRC capabilities with the SHARC instrument.
Using XML and Java Technologies for Astronomical Instrument Control
NASA Technical Reports Server (NTRS)
Ames, Troy; Case, Lynne; Powers, Edward I. (Technical Monitor)
2001-01-01
Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests, increasing software maintenance costs. Instrument description is too tightly coupled with details of implementation. NASA Goddard Space Flight Center, under the Instrument Remote Control (IRC) project, is developing a general and highly extensible framework that applies to any kind of instrument that can be controlled by a computer. The software architecture combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML), a human readable and machine understandable way to describe structured data. A key aspect of the object-oriented architecture is that the software is driven by an instrument description, written using the Instrument Markup Language (IML), a dialect of XML. IML is used to describe the command sets and command formats of the instrument, communication mechanisms, format of the data coming from the instrument, and characteristics of the graphical user interface to control and monitor the instrument. The IRC framework allows the users to define a data analysis pipeline which converts data coming out of the instrument. The data can be used in visualizations in order for the user to assess the data in real-time, if necessary. The data analysis pipeline algorithms can be supplied by the user in a variety of forms or programming languages. Although the current integration effort is targeted for the High-resolution Airborne Wideband Camera (HAWC) and the Submillimeter and Far Infrared Experiment (SAFIRE), first-light instruments of the Stratospheric Observatory for Infrared Astronomy (SOFIA), the framework is designed to be generic and extensible so that it can be applied to any instrument. Plans are underway to test the framework with other types of instruments, such as remote sensing earth science instruments.
Distributed Framework for Dynamic Telescope and Instrument Control
NASA Astrophysics Data System (ADS)
Ames, Troy J.; Case, Lynne
2002-12-01
Traditionally, instrument command and control systems have been developed specifically for a single instrument. Such solutions are frequently expensive and are inflexible to support the next instrument development effort. NASA Goddard Space Flight Center is developing an extensible framework, known as Instrument Remote Control (IRC) that applies to any kind of instrument that can be controlled by a computer. IRC combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms. The IRC framework provides the ability to communicate to components anywhere on a network using the JXTA protocol for dynamic discovery of distributed components. JXTA (see http://www.jxta.org) is a generalized protocol that allows any devices connected by a network to communicate in a peer-to-peer manner. IRC uses JXTA to advertise a device's IML and discover devices of interest on the network. Devices can join or leave the network and thus join or leave the instrument control environment of IRC. Currently, several astronomical instruments are working with the IRC development team to develop custom components for IRC to control their instruments. These instruments include: High resolution Airborne Wideband Camera (HAWC), a first light instrument for the Stratospheric Observatory for Infrared Astronomy (SOFIA); Submillimeter And Far Infrared Experiment (SAFIRE), a principal investigator instrument for SOFIA; and Fabry-Perot Interferometer Bolometer Research Experiment (FIBRE), a prototype of the SAFIRE instrument, used at the Caltech Submillimeter Observatory (CSO). Most recently, we have been working with the Submillimetre High Angular Resolution Camera IInd Generation (SHARCII) at the CSO to investigate using IRC capabilities with the SHARC instrument.
Distributed Framework for Dynamic Telescope and Instrument Control
NASA Technical Reports Server (NTRS)
Ames, Troy J.; Case, Lynne
2002-01-01
Traditionally, instrument command and control systems have been developed specifically for a single instrument. Such solutions are frequently expensive and are inflexible to support the next instrument development effort. NASA Goddard Space Flight Center is developing an extensible framework, known as Instrument Remote Control (IRC) that applies to any kind of instrument that can be controlled by a computer. IRC combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms. The IRC framework provides the ability to communicate to components anywhere on a network using the JXTA protocol for dynamic discovery of distributed components. JXTA (see http://www.jxta.org) is a generalized protocol that allows any devices connected by a network to communicate in a peer-to-peer manner. IRC uses JXTA to advertise a device's IML and discover devices of interest on the network. Devices can join or leave the network and thus join or leave the instrument control environment of IRC. Currently, several astronomical instruments are working with the IRC development team to develop custom components for IRC to control their instruments. These instruments include: High resolution Airborne Wideband Camera (HAWC), a first light instrument for the Stratospheric Observatory for Infrared Astronomy (SOFIA); Submillimeter And Far Infrared Experiment (SAFIRE), a Principal Investigator instrument for SOFIA; and Fabry-Perot Interferometer Bolometer Research Experiment (FIBRE), a prototype of the SAFIRE instrument, used at the Caltech Submillimeter Observatory (CSO). Most recently, we have been working with the Submillimetre High Angular Resolution Camera IInd Generation (SHARCII) at the CSO to investigate using IRC capabilities with the SHARC instrument.
Modular and Adaptive Control of Sound Processing
NASA Astrophysics Data System (ADS)
van Nort, Douglas
This dissertation presents research into the creation of systems for the control of sound synthesis and processing. The focus differs from much of the work related to digital musical instrument design, which has rightly concentrated on the physicality of the instrument and interface: sensor design, choice of controller, feedback to performer and so on. Often times a particular choice of sound processing is made, and the resultant parameters from the physical interface are conditioned and mapped to the available sound parameters in an exploratory fashion. The main goal of the work presented here is to demonstrate the importance of the space that lies between physical interface design and the choice of sound manipulation algorithm, and to present a new framework for instrument design that strongly considers this essential part of the design process. In particular, this research takes the viewpoint that instrument designs should be considered in a musical control context, and that both control and sound dynamics must be considered in tandem. In order to achieve this holistic approach, the work presented in this dissertation assumes complementary points of view. Instrument design is first seen as a function of musical context, focusing on electroacoustic music and leading to a view on gesture that relates perceived musical intent to the dynamics of an instrumental system. The important design concept of mapping is then discussed from a theoretical and conceptual point of view, relating perceptual, systems and mathematically-oriented ways of examining the subject. This theoretical framework gives rise to a mapping design space, functional analysis of pertinent existing literature, implementations of mapping tools, instrumental control designs and several perceptual studies that explore the influence of mapping structure. Each of these reflect a high-level approach in which control structures are imposed on top of a high-dimensional space of control and sound synthesis parameters. In this view, desired gestural dynamics and sonic response are achieved through modular construction of mapping layers that are themselves subject to parametric control. Complementing this view of the design process, the work concludes with an approach in which the creation of gestural control/sound dynamics are considered in the low-level of the underlying sound model. The result is an adaptive system that is specialized to noise-based transformations that are particularly relevant in an electroacoustic music context. Taken together, these different approaches to design and evaluation result in a unified framework for creation of an instrumental system. The key point is that this framework addresses the influence that mapping structure and control dynamics have on the perceived feel of the instrument. Each of the results illustrate this using either top-down or bottom-up approaches that consider musical control context, thereby pointing to the greater potential for refined sonic articulation that can be had by combining them in the design process.
Simultaneous control of multiple instruments at the Advanced Technology Solar Telescope
NASA Astrophysics Data System (ADS)
Johansson, Erik M.; Goodrich, Bret
2012-09-01
The Advanced Technology Solar Telescope (ATST) is a 4-meter solar observatory under construction at Haleakala, Hawaii. The simultaneous use of multiple instruments is one of the unique capabilities that makes the ATST a premier ground based solar observatory. Control of the instrument suite is accomplished by the Instrument Control System (ICS), a layer of software between the Observatory Control System (OCS) and the instruments. The ICS presents a single narrow interface to the OCS and provides a standard interface for the instruments to be controlled. It is built upon the ATST Common Services Framework (CSF), an infrastructure for the implementation of a distributed control system. The ICS responds to OCS commands and events, coordinating and distributing them to the various instruments while monitoring their progress and reporting the status back to the OCS. The ICS requires no specific knowledge about the instruments. All information about the instruments used in an experiment is passed by the OCS to the ICS, which extracts and forwards the parameters to the appropriate instrument controllers. The instruments participating in an experiment define the active instrument set. A subset of those instruments must complete their observing activities in order for the experiment to be considered complete and are referred to as the must-complete instrument set. In addition, instruments may participate in eavesdrop mode, outside of the control of the ICS. All instrument controllers use the same standard narrow interface, which allows new instruments to be added without having to modify the interface or any existing instrument controllers.
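The active and must-complete instrument sets described above can be captured by a small bookkeeping class; the following Python sketch is a hypothetical illustration of the completion rule only, not code from the ATST Common Services Framework.

    class ExperimentTracker:
        """Tracks which instruments of an experiment have finished observing."""

        def __init__(self, active, must_complete):
            assert set(must_complete) <= set(active), "must-complete set must be a subset of the active set"
            self.active = set(active)
            self.must_complete = set(must_complete)
            self.done = set()

        def report_done(self, instrument):
            if instrument in self.active:
                self.done.add(instrument)

        def experiment_complete(self):
            # The experiment is complete once every must-complete instrument has finished,
            # regardless of the remaining active or eavesdropping instruments.
            return self.must_complete <= self.done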
Development of telescope control system for the 50cm telescope of UC Observatory Santa Martina
NASA Astrophysics Data System (ADS)
Shen, Tzu-Chiang; Soto, Ruben; Reveco, Johnny; Vanzi, Leonardo; Fernández, Jose M.; Escarate, Pedro; Suc, Vincent
2012-09-01
The main telescope of the UC Observatory Santa Martina is a 50cm optical telescope donated by ESO to Pontificia Universidad Catolica de Chile. During the past years the telescope has been refurbished and used as the main facility for testing and validating new instruments under construction by the center of Astro-Engineering UC. As part of this work, the need to develop a more efficient and flexible control system arose. The new distributed control system has been developed on top of the Internet Communication Engine (ICE), a framework developed by ZeroC Inc. This framework features a lightweight but powerful and flexible inter-process communication infrastructure and provides bindings to classic and modern programming languages such as C/C++, Java, C#, Ruby, Objective-C, etc. The result of this work shows ICE as a real alternative to CORBA and other de-facto distributed programming frameworks. A classical control software architecture has been chosen, comprising an observation control system (OCS), the orchestrator of the observation, which controls the telescope control system (TCS) and the detector control system (DCS). The real-time control and monitoring system is deployed and running on ARM-based single board computers. Other features such as logging and configuration services have been developed as well. Inter-operation with other main astronomical control frameworks is foreseen in order to achieve a smooth integration of instruments when they are integrated in the main observatories in the north of Chile.
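The classical OCS/TCS/DCS layering mentioned above can be sketched as follows; this is a plain-Python analogue of the orchestration pattern only, and deliberately does not use the ICE middleware that the actual system is built on.

    class TelescopeControlSystem:
        def point(self, ra_deg, dec_deg):
            print(f"TCS: slewing to RA={ra_deg}, Dec={dec_deg}")

    class DetectorControlSystem:
        def expose(self, exposure_s):
            print(f"DCS: exposing for {exposure_s} s")
            return {"exposure_s": exposure_s}  # placeholder frame metadata

    class ObservationControlSystem:
        """Orchestrates an observation by driving the TCS and DCS in sequence."""

        def __init__(self, tcs, dcs):
            self.tcs = tcs
            self.dcs = dcs

        def observe(self, ra_deg, dec_deg, exposure_s):
            self.tcs.point(ra_deg, dec_deg)
            return self.dcs.expose(exposure_s)

    ocs = ObservationControlSystem(TelescopeControlSystem(), DetectorControlSystem())
    frame = ocs.observe(ra_deg=83.82, dec_deg=-5.39, exposure_s=30)

In the real system each of these layers would be a separate ICE servant reachable over the network, with logging and configuration services alongside.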
NASA Astrophysics Data System (ADS)
Kiekebusch, Mario J.; Lucuix, Christian; Erm, Toomas M.; Chiozzi, Gianluca; Zamparelli, Michele; Kern, Lothar; Brast, Roland; Pirani, Werther; Reiss, Roland; Popovic, Dan; Knudstrup, Jens; Duchateau, Michel; Sandrock, Stefan; Di Lieto, Nicola
2014-07-01
ESO is currently in the final phase of the standardization process for PC-based Programmable Logical Controllers (PLCs) as the new platform for the development of control systems for future VLT/VLTI instruments. The standard solution used until now consists of a Local Control Unit (LCU), a VME-based system having a CPU and commercial and proprietary boards. This system includes several layers of software and many thousands of lines of code developed and maintained in house. LCUs have been used for several years as the interface to control instrument functions but now are being replaced by commercial off-the-shelf (COTS) systems based on BECKHOFF Embedded PCs and the EtherCAT fieldbus. ESO is working on the completion of the software framework that enables a seamless integration into the VLT control system in order to be ready to support upcoming instruments like ESPRESSO and ERIS, that will be the first fully VLT compliant instruments using the new standard. The technology evaluation and standardization process has been a long and combined effort of various engineering disciplines like electronics, control and software, working together to define a solution that meets the requirements and minimizes the impact on the observatory operations and maintenance. This paper presents the challenges of the standardization process and the steps involved in such a change. It provides a technical overview of how industrial standards like EtherCAT, OPC-UA, PLCOpen MC and TwinCAT can be used to replace LCU features in various areas like software engineering and programming languages, motion control, time synchronization and astronomical tracking.
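As an illustration of how a workstation-side application might talk to a BECKHOFF Embedded PC of the kind adopted here, the following sketch uses the open-source pyads library over the ADS protocol; the AMS net id and variable names are hypothetical, and this is not the ESO framework's actual interface, which goes through OPC UA and the VLT software layers.

    import pyads  # open-source ADS client for BECKHOFF TwinCAT systems

    # Hypothetical AMS net id and PLC variable names.
    plc = pyads.Connection("192.168.0.10.1.1", pyads.PORT_TC3PLC1)
    plc.open()
    try:
        position = plc.read_by_name("MAIN.fActualPosition", pyads.PLCTYPE_LREAL)  # read a motor position
        plc.write_by_name("MAIN.bMoveEnable", True, pyads.PLCTYPE_BOOL)           # enable a motion command
        print("actual position:", position)
    finally:
        plc.close()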
A framework for assessing Health Economic Evaluation (HEE) quality appraisal instruments.
Langer, Astrid
2012-08-16
Health economic evaluations support the health care decision-making process by providing information on costs and consequences of health interventions. The quality of such studies is assessed by health economic evaluation (HEE) quality appraisal instruments. At present, there is no instrument for measuring and improving the quality of such HEE quality appraisal instruments. Therefore, the objectives of this study are to establish a framework for assessing the quality of HEE quality appraisal instruments to support and improve their quality, and to apply this framework to those HEE quality appraisal instruments which have been subject to more scrutiny than others, in order to test the framework and to demonstrate the shortcomings of existing HEE quality appraisal instruments. To develop the quality assessment framework for HEE quality appraisal instruments, the experiences of using appraisal tools for clinical guidelines are used. Based on a deductive iterative process, clinical guideline appraisal instruments identified through literature search are reviewed, consolidated, and adapted to produce the final quality assessment framework for HEE quality appraisal instruments. The final quality assessment framework for HEE quality appraisal instruments consists of 36 items organized within 7 dimensions, each of which captures a specific domain of quality. Applying the quality assessment framework to four existing HEE quality appraisal instruments, it is found that these four quality appraisal instruments are of variable quality. The framework described in this study should be regarded as a starting point for appraising the quality of HEE quality appraisal instruments. This framework can be used by HEE quality appraisal instrument producers to support and improve the quality and acceptance of existing and future HEE quality appraisal instruments. By applying this framework, users of HEE quality appraisal instruments can become aware of methodological deficiencies inherent in existing HEE quality appraisal instruments. These shortcomings of existing HEE quality appraisal instruments are illustrated by the pilot test.
A framework for assessing Health Economic Evaluation (HEE) quality appraisal instruments
2012-01-01
Background Health economic evaluations support the health care decision-making process by providing information on costs and consequences of health interventions. The quality of such studies is assessed by health economic evaluation (HEE) quality appraisal instruments. At present, there is no instrument for measuring and improving the quality of such HEE quality appraisal instruments. Therefore, the objectives of this study are to establish a framework for assessing the quality of HEE quality appraisal instruments to support and improve their quality, and to apply this framework to those HEE quality appraisal instruments which have been subject to more scrutiny than others, in order to test the framework and to demonstrate the shortcomings of existing HEE quality appraisal instruments. Methods To develop the quality assessment framework for HEE quality appraisal instruments, the experiences of using appraisal tools for clinical guidelines are used. Based on a deductive iterative process, clinical guideline appraisal instruments identified through literature search are reviewed, consolidated, and adapted to produce the final quality assessment framework for HEE quality appraisal instruments. Results The final quality assessment framework for HEE quality appraisal instruments consists of 36 items organized within 7 dimensions, each of which captures a specific domain of quality. Applying the quality assessment framework to four existing HEE quality appraisal instruments, it is found that these four quality appraisal instruments are of variable quality. Conclusions The framework described in this study should be regarded as a starting point for appraising the quality of HEE quality appraisal instruments. This framework can be used by HEE quality appraisal instrument producers to support and improve the quality and acceptance of existing and future HEE quality appraisal instruments. By applying this framework, users of HEE quality appraisal instruments can become aware of methodological deficiencies inherent in existing HEE quality appraisal instruments. These shortcomings of existing HEE quality appraisal instruments are illustrated by the pilot test. PMID:22894708
PScan 1.0: flexible software framework for polygon based multiphoton microscopy
NASA Astrophysics Data System (ADS)
Li, Yongxiao; Lee, Woei Ming
2016-12-01
Multiphoton laser scanning microscopes exhibit highly localized nonlinear optical excitation and are powerful instruments for in-vivo deep tissue imaging. Customized multiphoton microscopy has significantly superior performance for in-vivo imaging because of precise control over the scanning and detection system. To date, there have been several flexible software platforms catering to custom-built microscopy systems, e.g. ScanImage, HelioScan and MicroManager, that perform at imaging speeds of 30-100 fps. In this paper, we describe a flexible software framework for high-speed imaging systems capable of operating from 5 fps to 1600 fps. The software is based on the MATLAB image processing toolbox. It has the capability to communicate directly with a high-performance imaging card (Matrox Solios eA/XA), thus retaining high-speed acquisition. The program is also designed to communicate with LabVIEW and Fiji for instrument control and image processing. PScan 1.0 can handle high imaging rates and contains sufficient flexibility for users to adapt it to their high-speed imaging systems.
NASA Astrophysics Data System (ADS)
Pozna, E.; Ramirez, A.; Mérand, A.; Mueller, A.; Abuter, R.; Frahm, R.; Morel, S.; Schmid, C.; Duc, T. Phan; Delplancke-Ströbele, F.
2014-07-01
The quality of data obtained by VLTI instruments may be refined by analyzing the continuous data supplied by the Reflective Memory Network (RMN). Based on five years' experience providing VLTI instruments (PACMAN, AMBER, MIDI) with RMN data, the procedure has been generalized to make the synchronization with observations trouble-free. The present software interface not only saves months of effort for each instrument but also provides the benefits of software frameworks. Recent applications (GRAVITY, MATISSE) supply feedback for the software to evolve. The paper highlights the way common features have been identified so that reusable code can be offered in due course.
Using XML and Java for Astronomical Instrumentation Control
NASA Technical Reports Server (NTRS)
Ames, Troy; Koons, Lisa; Sall, Ken; Warsaw, Craig
2000-01-01
Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests, increasing software maintenance costs. Instrument description is too tightly coupled with details of implementation. NASA Goddard Space Flight Center is developing a general and highly extensible framework that applies to any kind of instrument that can be controlled by a computer. The software architecture combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML), a human readable and machine understandable way to describe structured data. A key aspect of the object-oriented architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, and communication mechanisms. Although the current effort is targeted for the High-resolution Airborne Wideband Camera, a first-light instrument of the Stratospheric Observatory for Infrared Astronomy, the framework is designed to be generic and extensible so that it can be applied to any instrument.
NASA Astrophysics Data System (ADS)
Tobar, R. J.; von Brand, H.; Araya, M. A.; Juerges, T.
2010-12-01
The ALMA Common Software (ACS) framework lacks the real-time capabilities needed to control the antennas' instrumentation, as previous work has shown, which has led to non-portable workarounds to the problem. Indeed, the time service used in ACS, based on the Container/Component model, yields results that confirm this limitation. This work addresses the problem of designing and integrating a real-time service for ACS, providing the framework with an implementation such that control operations over the different instruments can be performed within real-time constraints. This implementation is compared with the current time service, showing the difference between the two systems when they are subjected to common scenarios. The new implementation also follows the POSIX specification, ensuring interoperability and portability across different operating systems.
Remote control of astronomical instruments via the Internet
NASA Astrophysics Data System (ADS)
Ashley, M. C. B.; Brooks, P. W.; Lloyd, J. P.
1996-01-01
A software package called ERIC is described that provides a framework for allowing scientific instruments to be remotely controlled via the Internet. The package has been used to control four diverse astronomical instruments, and is now being made freely available to the community. For a description of ERIC's capabilities, and how to obtain a copy, see the conclusion to this paper.
Control volume based hydrocephalus research; analysis of human data
NASA Astrophysics Data System (ADS)
Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer
2010-11-01
Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms: qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with data processing techniques used to extract terms in the conservation equation. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
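For reference, the integral conservation statements underlying the control volume approach described above are the standard Reynolds-transport forms (written here in generic notation, not necessarily that of the paper); in LaTeX,

    \frac{d}{dt}\int_{CV} \rho \, dV \;+\; \oint_{CS} \rho\,(\mathbf{u}\cdot\mathbf{n})\, dA = 0,
    \qquad
    \frac{d}{dt}\int_{CV} \rho\,\mathbf{u}\, dV \;+\; \oint_{CS} \rho\,\mathbf{u}\,(\mathbf{u}\cdot\mathbf{n})\, dA = \sum \mathbf{F},

where CV is the control volume, CS its bounding surface, u the fluid velocity, n the outward surface normal, and the right-hand side of the momentum equation the net force on the fluid; magnetic resonance velocity data supply the surface terms from which pressure information is then inferred.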
ERIC Educational Resources Information Center
Wang, Xueli
2016-01-01
This chapter describes a new conceptual framework that informs research on factors influencing transfer in STEM fields of study from 2-year to 4-year institutions, presents a new survey instrument based on the framework, and offers directions for future research in this area.
Software Framework for Controlling Unsupervised Scientific Instruments.
Schmid, Benjamin; Jahr, Wiebke; Weber, Michael; Huisken, Jan
2016-01-01
Science outreach and communication are gaining more and more importance for conveying the meaning of today's research to the general public. Public exhibitions of scientific instruments can provide hands-on experience with technical advances and their applications in the life sciences. The software of such devices, however, is oftentimes not appropriate for this purpose. In this study, we describe a software framework and the necessary computer configuration that is well suited for exposing a complex self-built and software-controlled instrument such as a microscope to laymen under limited supervision, e.g. in museums or schools. We identify several aspects that must be met by such software, and we describe a design that can simultaneously be used to control either (i) a fully functional instrument in a robust and fail-safe manner, (ii) an instrument that has low-cost or only partially working hardware attached for illustration purposes or (iii) a completely virtual instrument without hardware attached. We describe how to assess the educational success of such a device, how to monitor its operation and how to facilitate its maintenance. The introduced concepts are illustrated using our software to control eduSPIM, a fluorescent light sheet microscope that we are currently exhibiting in a technical museum.
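The three operating modes listed above (fully functional hardware, partial or low-cost hardware, and fully virtual) map naturally onto a hardware-abstraction pattern; the following Python sketch is a generic illustration of that pattern, not the actual eduSPIM code, and the device names are hypothetical.

    from abc import ABC, abstractmethod

    class Stage(ABC):
        """Abstract motorized stage; the exhibit software talks only to this interface."""

        @abstractmethod
        def move_to(self, position_um):
            ...

    class RealStage(Stage):
        def __init__(self, serial_port):
            self.serial_port = serial_port  # hypothetical hardware handle

        def move_to(self, position_um):
            # A real implementation would send a motion command over the serial port.
            print(f"hardware stage on {self.serial_port} -> {position_um} um")

    class VirtualStage(Stage):
        def __init__(self):
            self.position_um = 0.0

        def move_to(self, position_um):
            # Purely simulated, so the same user interface runs without any hardware attached.
            self.position_um = position_um

    def build_stage(have_hardware):
        return RealStage("/dev/ttyUSB0") if have_hardware else VirtualStage()

Because the rest of the application depends only on the abstract interface, the same control and visualization code can be exhibited in fail-safe, illustration, or fully virtual configurations.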
Comprehensive planning of data archive in Japanese planetary missions
NASA Astrophysics Data System (ADS)
Yamamoto, Yukio; Shinohara, Iku; Hoshino, Hirokazu; Tateno, Naoki; Hareyama, Makoto; Okada, Naoki; Ebisawa, Ken
Japan Aerospace Exploration Agency (JAXA) provides HAYABUSA and KAGUYA data as planetary data archives. These data archives, however, were prepared independently. As a result, data formats are inconsistent and the knowledge gained from data archiving activities is not inherited. Recently, discussion of comprehensive data archive planning has started in preparation for upcoming planetary missions; such a comprehensive plan is required in several steps. The framework of the comprehensive plan is divided into four items: Preparation, Evaluation, Preservation, and Service. 1. PREPARATION FRAMEWORK Data is classified into several types: raw data; level-0, 1, and 2 processed data; ancillary data; etc. Mission data preparation is the responsibility of the instrument teams, but preparation of data beyond the mission data and support for data management are essential to establish unified conventions and formats across instruments in a mission, and across missions. 2. EVALUATION FRAMEWORK There are two meanings of evaluation: format and quality. Format evaluation is often discussed in the preparation framework. Data quality evaluation, often called quality assurance (QA) or quality control (QC), must be performed by a third party apart from the preparation teams. An instrument team has the initiative for the preparation itself, and the third-party group is organized to evaluate the instrument team's activity. 3. PRESERVATION FRAMEWORK The main topics of this framework are document management, archiving structure, and a simple access method. A mission produces many documents in the process of its development, and instrument development is no exception. During long-term development of a mission, many documents are obsoleted and updated repeatedly. A smart system will help instrument teams reduce the burden of document management and archiving. JAXA attempts to follow PDS conventions for this management, since PDS has a highly sophisticated archiving structure. In addition, the access method to archived data must remain simple and standard for well over a decade. 4. SERVICE FRAMEWORK The service framework, including the Planetary Data Access Protocol (PDAP), has been developed to share stored data effectively. The sophisticated service framework will work not only for publication data, but also for low-level data. JAXA's data query services are under development based on PDAP, which means that low-level data can be published in the same manner as level 2 data. In this presentation, we report the detailed structure of these four frameworks as applied to the upcoming Planet-C (Venus Climate Orbiter) mission.
Erickson, Pennifer; Willke, Richard; Burke, Laurie
2009-01-01
To facilitate development and evaluation of a PRO instrument conceptual framework, we propose two tools: a PRO concept taxonomy and a PRO instrument hierarchy. FDA's draft guidance on patient-reported outcome (PRO) measures states that a clear description of the conceptual framework of an instrument is useful for evaluating its adequacy to support a treatment benefit claim for use in product labeling. The draft guidance, however, does not propose tools for establishing or evaluating a PRO instrument's conceptual framework. We draw from our review of PRO concepts and instruments that appear in prescription drug labeling approved in the United States from 1997 to 2007. We propose taxonomy terms that define relationships between PRO concepts, including "family," "compound concept," and "singular concept." Based on the range of complexity represented by the concepts, as defined by the taxonomy, we propose nine instrument orders for PRO measurement. The nine orders range from individual event counts to multi-item, multiscale instruments. This analysis of PRO concepts and instruments illustrates that the taxonomy and hierarchy are applicable to PRO concepts across a wide range of therapeutic areas and provide a basis for defining the instrument conceptual framework complexity. Although the utility of these tools in the drug development, review, and approval processes has not yet been demonstrated, these tools could be useful to improve communication and enhance efficiency in the instrument development and review process.
Realistic Simulations of Coronagraphic Observations with Future Space Telescopes
NASA Astrophysics Data System (ADS)
Rizzo, M. J.; Roberge, A.; Lincowski, A. P.; Zimmerman, N. T.; Juanola-Parramon, R.; Pueyo, L.; Hu, M.; Harness, A.
2017-11-01
We present a framework to simulate realistic observations of future space-based coronagraphic instruments. The framework gathers state-of-the-art scientific and instrumental expertise, allowing robust characterization of future instrument concepts.
Implementation of a digital evaluation platform to analyze bifurcation based nonlinear amplifiers
NASA Astrophysics Data System (ADS)
Feldkord, Sven; Reit, Marco; Mathis, Wolfgang
2016-09-01
Recently, nonlinear amplifiers based on the supercritical Andronov-Hopf bifurcation have become a focus of attention, especially in the modeling of the mammalian hearing organ. In general, to gain deeper insights into the input-output behavior, the analysis of bifurcation-based amplifiers requires a flexible framework to exchange equations and adjust certain parameters. A DSP implementation is presented that is capable of analyzing various amplifier systems. Amplifiers based on the Andronov-Hopf and Neimark-Sacker bifurcations are implemented and compared exemplarily. It is shown that the Neimark-Sacker system remarkably outperforms the Andronov-Hopf amplifier regarding CPU usage. Nevertheless, both show similar input-output behavior over a wide parameter range. Combined with a USB-based control interface connected to a PC, the digital framework provides a powerful instrument to analyze bifurcation-based amplifiers.
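For context, the amplifier dynamics referred to above are commonly written as the normal form of the supercritical Andronov-Hopf bifurcation with external forcing (symbols here follow the standard convention, not necessarily the paper's notation); in LaTeX,

    \dot{z} = (\mu + i\,\omega_0)\, z - |z|^{2} z + F(t), \qquad z \in \mathbb{C},

where mu is the bifurcation parameter, omega_0 the characteristic frequency, and F(t) the input signal; the Neimark-Sacker bifurcation is the corresponding discrete-time (map) counterpart, which is what makes it attractive for a DSP implementation.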
Problem-Based Learning in Instrumentation: Synergism of Real and Virtual Modular Acquisition Chains
ERIC Educational Resources Information Center
Nonclercq, A.; Biest, A. V.; De Cuyper, K.; Leroy, E.; Martinez, D. L.; Robert, F.
2010-01-01
As part of an instrumentation course, a problem-based learning framework was selected for laboratory instruction. Two acquisition chains were designed to help students carry out realistic instrumentation problems. The first tool is a virtual (simulated) modular acquisition chain that allows rapid overall understanding of the main problems in…
Using farmers' attitude and social pressures to design voluntary Bluetongue vaccination strategies.
Sok, J; Hogeveen, H; Elbers, A R W; Oude Lansink, A G J M
2016-10-01
Understanding the context and drivers of farmers' decision-making is critical to designing successful voluntary disease control interventions. This study uses a questionnaire based on the Reasoned Action Approach framework to assess the determinants of farmers' intention to participate in a hypothetical reactive vaccination scheme against Bluetongue. Results suggest that farmers' attitude and social pressures best explained intention. A mix of policy instruments can be used in a complementary way to motivate voluntary vaccination based on the finding that participation is influenced by both internal and external motivation. Next to informational and incentive-based instruments, social pressures, which stem from different type of perceived norms, can spur farmers' vaccination behaviour and serve as catalysts in voluntary vaccination schemes. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Al-Shammari, Zaid; Yawkey, Thomas D.
2008-01-01
This investigation using Grounded Theory focuses on developing, designing and testing out an evaluation method used as a framework for this study. This framework evolved into the instrument entitled, "Classroom Teacher's Performance Based Evaluation Form (CTPBEF)". This study shows the processes and procedures used in CTPBEF's…
NASA Astrophysics Data System (ADS)
Greiner, Romy
2014-02-01
Water pollution of coastal waterways is a complex problem due to the cocktail of pollutants, the multiplicity of polluters involved, and the characteristics of the pollution. Pollution control therefore requires a combination of policy instruments. This paper examines the applicability of market-based instruments to achieve effective and efficient water quality management in Darwin Harbour, Northern Territory, Australia. Potential applicability of instruments is examined in the context of biophysical and economic pollution characteristics, and experience with instruments elsewhere. The paper concludes that there is potential for inclusion of market-based instruments as part of an instrument mix to safeguard water quality in Darwin Harbour. It recommends, in particular, expanding the existing licensing system to include quantitative pollution limits for all significant point polluters; comprehensive and independent pollution monitoring across Darwin Harbour; public disclosure of water quality and emissions data; positive incentives for landholders in the Darwin Harbour catchment to improve land management practices; a stormwater offset program for greenfield urban developments; adoption of performance bonds for developments and operations which pose a substantial risk to water quality, including port expansion and dredging; and detailed consideration of a bubble licensing scheme for nutrient pollution. The paper offers an analytical framework for policy makers and resource managers tasked with water quality management in coastal waterways elsewhere in Australia and globally, and helps to scan for market-based instruments (MBIs) suitable in any given environmental management situation.
NASA Astrophysics Data System (ADS)
Muslim; Suhandi, A.; Nugraha, M. G.
2017-02-01
The purposes of this study are to determine the quality of reasoning test instruments developed following the framework of the Trends in International Mathematics and Science Study (TIMSS), and to analyse the profile of reasoning skills of senior high school students on physics material. This research used the research and development (R&D) method; the subjects were 104 students at three senior high schools in Bandung, selected by a random sampling technique. The reasoning test instruments were constructed following the TIMSS framework as 30 multiple-choice questions covering five subject matters, i.e. parabolic motion and circular motion, Newton's law of gravity, work and energy, harmonic oscillation, and momentum and impulse. The quality of the reasoning tests was analysed using the Content Validity Ratio (CVR) and classic test analysis, including item validity, level of difficulty, discriminating power, reliability and Ferguson's delta. The students' reasoning skill profiles were analysed from the average achievement scores on the eight reasoning aspects of the TIMSS framework. The results showed that the developed reasoning tests have good quality as instruments for measuring the reasoning skills of senior high school students on the five physics topics, and that they are able to explore students' reasoning on all reasoning aspects of the TIMSS framework.
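The item statistics named above (item difficulty, discriminating power, reliability, Ferguson's delta) are standard classical-test-theory quantities. The following is a minimal sketch, not the authors' code, of how they are commonly computed for dichotomously (0/1) scored multiple-choice data; the simulated response matrix simply mirrors the study's 104 students and 30 items.

```python
import numpy as np

def item_analysis(scores):
    """Classical test statistics for a 0/1-scored response matrix
    (rows = students, columns = items)."""
    scores = np.asarray(scores, dtype=float)
    n_students, n_items = scores.shape
    totals = scores.sum(axis=1)

    # Item difficulty: proportion of correct answers per item.
    difficulty = scores.mean(axis=0)

    # Discriminating power: upper 27% minus lower 27% group proportion correct.
    k = max(1, int(round(0.27 * n_students)))
    order = np.argsort(totals)
    discrimination = scores[order[-k:]].mean(axis=0) - scores[order[:k]].mean(axis=0)

    # KR-20 reliability (Cronbach's alpha for dichotomous items).
    p = difficulty
    kr20 = (n_items / (n_items - 1)) * (1 - (p * (1 - p)).sum() / totals.var(ddof=1))

    # Ferguson's delta: discrimination of the test over the total-score distribution.
    freq = np.bincount(totals.astype(int), minlength=n_items + 1)
    delta = (n_students**2 - (freq**2).sum()) * (n_items + 1) / (n_students**2 * n_items)

    return difficulty, discrimination, kr20, delta

# Illustrative call with simulated (not real) data: 104 students, 30 items.
rng = np.random.default_rng(0)
answers = (rng.random((104, 30)) < 0.6).astype(int)
print(item_analysis(answers)[2:])   # KR-20 and Ferguson's delta
```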
New resilience instrument for patients with cancer.
Ye, Zeng Jie; Liang, Mu Zi; Li, Peng Fei; Sun, Zhe; Chen, Peng; Hu, Guang Yun; Yu, Yuan Liang; Wang, Shu Ni; Qiu, Hong Zhong
2018-02-01
Resilience is an important concept in the cancer literature and a salient indicator of cancer survivorship. The aim of this study is to develop and validate a new resilience instrument specific to patients with a cancer diagnosis (RS-SC) in Mainland China. First, a resilience framework was constructed for patients with a cancer diagnosis. Second, items were formulated based on the framework to reflect different aspects of resilience. Third, two rounds of expert panel discussion were performed to select important and relevant items. Finally, two cross-sectional studies were conducted to evaluate the psychometric properties of the instrument. Fifty-one items were generated based on the resilience framework, and the final 25-item RS-SC resulted in a five-factor solution comprising Generic Elements, Benefit Finding, Support and Coping, Hope for the Future and Meaning for Existence, accounting for 64.72% of the variance. The Cronbach's α of the RS-SC was 0.825 and the test-retest reliability was 0.874. The RS-SC is a brief, cancer-specific self-report resilience instrument for Chinese patients and showed sound psychometric properties in this study. The RS-SC has potential applications in both clinical practice and research on strength-based resiliency interventions.
Zhang, Helen L; Omondi, Michael W; Musyoka, Augustine M; Afwamba, Isaac A; Swai, Remigi P; Karia, Francis P; Muiruri, Charles; Reddy, Elizabeth A; Crump, John A; Rubach, Matthew P
2016-08-01
Using a clinical research laboratory as a case study, we sought to characterize barriers to maintaining Good Clinical Laboratory Practice (GCLP) services in a developing world setting. Using a US Centers for Disease Control and Prevention framework for program evaluation in public health, we performed an evaluation of the Kilimanjaro Christian Medical Centre-Duke University Health Collaboration clinical research laboratory sections of the Kilimanjaro Clinical Research Institute in Moshi, Tanzania. Laboratory records from November 2012 through October 2014 were reviewed for this analysis. During the 2-year period of study, seven instrument malfunctions suspended testing required for open clinical trials. A median (range) of 9 (1-55) days elapsed between instrument malfunction and biomedical engineer service. Sixteen (76.1%) of 21 suppliers of reagents, controls, and consumables were based outside Tanzania. Test throughput among laboratory sections used a median (range) of 0.6% (0.2%-2.7%) of instrument capacity. Five (55.6%) of nine laboratory technologists left their posts over 2 years. These findings demonstrate that GCLP laboratory service provision in this setting is hampered by delays in biomedical engineer support, delays and extra costs in commodity procurement, low testing throughput, and high personnel turnover. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Prototyping a Hybrid Cooperative and Tele-robotic Surgical System for Retinal Microsurgery.
Balicki, Marcin; Xia, Tian; Jung, Min Yang; Deguet, Anton; Vagvolgyi, Balazs; Kazanzides, Peter; Taylor, Russell
2011-06-01
This paper presents the design of a tele-robotic microsurgical platform designed for development of cooperative and tele-operative control schemes, sensor based smart instruments, user interfaces and new surgical techniques with eye surgery as the driving application. The system is built using the distributed component-based cisst libraries and the Surgical Assistant Workstation framework. It includes a cooperatively controlled EyeRobot2, a da Vinci Master manipulator, and a remote stereo visualization system. We use constrained optimization based virtual fixture control to provide Virtual Remote-Center-of-Motion (vRCM) and haptic feedback. Such system can be used in a hybrid setup, combining local cooperative control with remote tele-operation, where an experienced surgeon can provide hand-over-hand tutoring to a novice user. In another scheme, the system can provide haptic feedback based on virtual fixtures constructed from real-time force and proximity sensor information.
EGSE customization for the Euclid NISP Instrument AIV/AIT activities
NASA Astrophysics Data System (ADS)
Franceschi, E.; Trifoglio, M.; Gianotti, F.; Conforti, V.; Andersen, J. J.; Stephen, J. B.; Valenziano, L.; Auricchio, N.; Bulgarelli, A.; De Rosa, A.; Fioretti, V.; Maiorano, E.; Morgante, G.; Nicastro, L.; Sortino, F.; Zoli, A.; Balestra, A.; Bonino, D.; Bonoli, C.; Bortoletto, F.; Capobianco, V.; Corcione, L.; Dal Corso, F.; Debei, S.; Di Ferdinando, D.; Dusini, S.; Farinelli, R.; Fornari, F.; Giacomini, F.; Guizzo, G. P.; Laudisio, F.; Ligori, S.; Mauri, N.; Medinaceli, E.; Patrizii, L.; Sirignano, C.; Sirri, G.; Stanco, L.; Tenti, M.; Valieri, C.; Ventura, S.
2016-07-01
The Near Infrared Spectro-Photometer (NISP) on board the Euclid ESA mission will be developed and tested at various levels of integration using various test equipment. The Electrical Ground Support Equipment (EGSE) is required to support the assembly, integration, verification and testing (AIV/AIT) and calibration activities at instrument level before delivery to ESA, and at satellite level, when the NISP instrument is mounted on the spacecraft. In the case of the Euclid mission this EGSE will be provided by ESA to the NISP team, in the HW/SW framework called "CCS Lite", with a possible first usage already during the Warm Electronics (WE) AIV/AIT activities. In this paper we discuss how we will customize the "CCS Lite" as required to support both the WE and instrument test activities. This customization will primarily involve building the NISP Mission Information Base (the CCS MIB tables) by gathering the relevant data from the instrument sub-units and validating these inputs through specific tools. Second, it will involve developing a suitable set of test sequences, using uTOPE (an extension to the TCL scripting language included in the CCS framework), in order to implement the foreseen test procedures. In addition, and in parallel, custom interfaces will be set up between the CCS and the NI-IWS (the NISP Instrument Workstation, which will be in use at all levels starting from the WE activities), and between the CCS and the TCC (the Telescope Control and Command Computer, used only during the instrument-level tests).
Using the 4MAT Framework to Design a Problem-Based Learning Biostatistics Course
ERIC Educational Resources Information Center
Nowacki, Amy S.
2011-01-01
The study presents and applies the 4MAT theoretical framework to educational planning in order to transform a biostatistics course into a problem-based learning experience. Using a four-question approach, the specific activities and materials utilized at both the class and course levels are described. Two web-based instruments collected data regarding student…
ERIC Educational Resources Information Center
Pifarré, Manoli; Martí, Laura; Cujba, Andreea
2015-01-01
This paper explores the effects of a technology-enhanced pedagogical framework on collaborative creativity processes. The pedagogical framework is built on socio-cultural theory which conceptualizes creativity as a social activity based on intersubjectivity and dialogical interactions. Dialogue becomes an instrument for collaborative creativity…
ERIC Educational Resources Information Center
Kyndt, Eva; Janssens, Ine; Coertjens, Liesje; Gijbels, David; Donche, Vincent; Van Petegem, Peter
2014-01-01
The current study reports on the process of developing a self-assessment instrument for vocational education students' generic working life competencies. The instrument was developed based on a competence framework and in close collaboration with several vocational education teachers and intermediary organisations offering various human…
Zhao, Zijian; Voros, Sandrine; Weng, Ying; Chang, Faliang; Li, Ruijian
2017-12-01
Worldwide propagation of minimally invasive surgeries (MIS) is hindered by the drawback of indirect observation and manipulation, and the monitoring of surgical instruments moving inside the operated body, as required by surgeons, remains a challenging problem. Tracking surgical instruments with vision-based methods is attractive because it can be implemented flexibly in software, with no need to modify the instruments or the surgical workflow. A MIS instrument is conventionally split into a shaft and an end-effector portion, and a 2D/3D tracking-by-detection framework is proposed that performs shaft tracking followed by end-effector tracking. The former portion is described by line features via the RANSAC scheme, while the latter is depicted by special image features learned by a well-trained convolutional neural network. The method is verified in its 2D and 3D formulations through experiments on ex-vivo video sequences, and qualitative validation on in-vivo video sequences is obtained. The proposed method provides robust and accurate tracking, as confirmed by the experimental results: its 3D performance on ex-vivo video sequences exceeds that of the available state-of-the-art methods. Moreover, the experiments on in-vivo sequences demonstrate that the proposed method can tackle the difficult condition of tracking with unknown camera parameters. Further refinements of the method will address occlusion and multi-instrument MIS applications.
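The shaft-detection step relies on RANSAC line fitting over edge points; the sketch below is not the authors' implementation, only an illustration of that generic step in Python, assuming the 2D edge points have already been extracted from a video frame.

```python
import numpy as np

def ransac_line(points, n_iter=500, inlier_tol=2.0, seed=None):
    """Fit a 2D line a*x + b*y + c = 0 (with a^2 + b^2 = 1) to edge points by
    RANSAC, e.g. for the boundary of a surgical instrument shaft."""
    rng = np.random.default_rng(seed)
    points = np.asarray(points, dtype=float)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iter):
        p1, p2 = points[rng.choice(len(points), size=2, replace=False)]
        d = p2 - p1
        norm = np.hypot(d[0], d[1])
        if norm < 1e-9:
            continue
        a, b = -d[1] / norm, d[0] / norm          # unit normal of the candidate line
        c = -(a * p1[0] + b * p1[1])
        dist = np.abs(points @ np.array([a, b]) + c)
        inliers = dist < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (a, b, c)
    return best_model, best_inliers

# Toy usage: noisy points along a line plus random clutter.
rng = np.random.default_rng(1)
t = rng.uniform(0, 100, 200)
shaft_edge = np.c_[t, 0.5 * t + 10 + rng.normal(0, 1, 200)]
clutter = rng.uniform(0, 100, (50, 2))
model, inliers = ransac_line(np.vstack([shaft_edge, clutter]), seed=1)
print(model, inliers.sum())
```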
Santello, Marco; Bianchi, Matteo; Gabiccini, Marco; Ricciardi, Emiliano; Salvietti, Gionata; Prattichizzo, Domenico; Ernst, Marc; Moscatelli, Alessandro; Jörntell, Henrik; Kappers, Astrid M.L.; Kyriakopoulos, Kostas; Albu-Schäffer, Alin; Castellini, Claudio; Bicchi, Antonio
2017-01-01
The term ‘synergy’ – from the Greek synergia – means ‘working together’. The concept of multiple elements working together towards a common goal has been extensively used in neuroscience to develop theoretical frameworks, experimental approaches, and analytical techniques to understand neural control of movement, and for applications for neuro-rehabilitation. In the past decade, roboticists have successfully applied the framework of synergies to create novel design and control concepts for artificial hands, i.e., robotic hands and prostheses. At the same time, robotic research on the sensorimotor integration underlying the control and sensing of artificial hands has inspired new research approaches in neuroscience, and has provided useful instruments for novel experiments. The ambitious goal of integrating expertise and research approaches in robotics and neuroscience to study the properties and applications of the concept of synergies is generating a number of multidisciplinary cooperative projects, among which the recently finished 4-year European project “The Hand Embodied” (THE). This paper reviews the main insights provided by this framework. Specifically, we provide an overview of neuroscientific bases of hand synergies and introduce how robotics has leveraged the insights from neuroscience for innovative design in hardware and controllers for biomedical engineering applications, including myoelectric hand prostheses, devices for haptics research, and wearable sensing of human hand kinematics. The review also emphasizes how this multidisciplinary collaboration has generated new ways to conceptualize a synergy-based approach for robotics, and provides guidelines and principles for analyzing human behavior and synthesizing artificial robotic systems based on a theory of synergies. PMID:26923030
2012-01-01
Background: Continuous quality improvement (CQI) methods are widely used in healthcare; however, the effectiveness of the methods is variable, and evidence about the extent to which contextual and other factors modify effects is limited. Investigating the relationship between these factors and CQI outcomes poses challenges for those evaluating CQI, among the most complex of which relate to the measurement of modifying factors. We aimed to provide guidance to support the selection of measurement instruments by systematically collating, categorising, and reviewing quantitative self-report instruments. Methods: Data sources: We searched MEDLINE, PsycINFO, and Health and Psychosocial Instruments, reference lists of systematic reviews, and citations and references of the main report of instruments. Study selection: The scope of the review was determined by a conceptual framework developed to capture factors relevant to evaluating CQI in primary care (the InQuIRe framework). Papers reporting development or use of an instrument measuring a construct encompassed by the framework were included. Data extracted included instrument purpose; theoretical basis, constructs measured and definitions; development methods and assessment of measurement properties. Analysis and synthesis: We used qualitative analysis of instrument content and our initial framework to develop a taxonomy for summarising and comparing instruments. Instrument content was categorised using the taxonomy, illustrating coverage of the InQuIRe framework. Methods of development and evidence of measurement properties were reviewed for instruments with potential for use in primary care. Results: We identified 186 potentially relevant instruments, 152 of which were analysed to develop the taxonomy. Eighty-four instruments measured constructs relevant to primary care, with content measuring CQI implementation and use (19 instruments), organizational context (51 instruments), and individual factors (21 instruments). Forty-one instruments were included for full review. Development methods were often pragmatic, rather than systematic and theory-based, and evidence supporting measurement properties was limited. Conclusions: Many instruments are available for evaluating CQI, but most require further use and testing to establish their measurement properties. Further development and use of these measures in evaluations should increase the contribution made by individual studies to our understanding of CQI and enhance our ability to synthesise evidence for informing policy and practice. PMID:23241168
The Development of the Graphics-Decoding Proficiency Instrument
ERIC Educational Resources Information Center
Lowrie, Tom; Diezmann, Carmel M.; Kay, Russell
2011-01-01
The graphics-decoding proficiency (G-DP) instrument was developed as a screening test for the purpose of measuring students' (aged 8-11 years) capacity to solve graphics-based mathematics tasks. These tasks include number lines, column graphs, maps and pie charts. The instrument was developed within a theoretical framework which highlights the…
SCTE: An open-source Perl framework for testing equipment control and data acquisition
NASA Astrophysics Data System (ADS)
Mostaço-Guidolin, Luiz C.; Frigori, Rafael B.; Ruchko, Leonid; Galvão, Ricardo M. O.
2012-07-01
SCTE intends to provide a simple, yet powerful, framework for building data acquisition and equipment control systems for experimental physics and related areas. Via its SCTE::Instrument module, RS-232, USB, and LAN buses are supported, and the intricacies of hardware communication are encapsulated underneath an object-oriented abstraction layer. Written in Perl and using the SCPI protocol, enabled instruments can be easily programmed to perform a wide variety of tasks. While this work presents general aspects of the development of data acquisition systems using the SCTE framework, it is illustrated by particular applications designed for the calibration of several in-house developed devices for power measurement in the tokamak TCABR Alfvén Waves Excitement System. Catalogue identifier: AELZ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELZ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License Version 3. No. of lines in distributed program, including test data, etc.: 13 811. No. of bytes in distributed program, including test data, etc.: 743 709. Distribution format: tar.gz. Programming language: Perl version 5.10.0 or higher. Computer: PC; SCPI-capable digital oscilloscope with RS-232, USB, or LAN communication ports; null modem, USB, or Ethernet cables. Operating system: GNU/Linux (2.6.28-11); should also work on any Unix-based operating system. Classification: 4.14. External routines: Perl modules Device::SerialPort, Term::ANSIColor, Math::GSL, Net::HTTP; Gnuplot 4.0 or higher. Nature of problem: Automation of experiments and data acquisition often requires expensive equipment and in-house development of software applications. Nowadays personal computers and test equipment come with fast and easy-to-use communication ports. Instrument vendors often supply application programs capable of controlling such devices, but these are very restricted in terms of functionality; for instance, they cannot control more than one test instrument at the same time or automate repetitive tasks. SCTE provides a way of using auxiliary equipment to automate experimental procedures at low cost, using only a free, open-source operating system and libraries. Solution method: SCTE provides a Perl module that implements RS-232, USB, and LAN communication, allowing the use of SCPI-capable instruments [1] and thereby providing a straightforward way of creating automation and data acquisition applications using personal computers and test instruments [2]. [1] SCPI Consortium, Standard Commands for Programmable Instruments, 1999, http://www.scpiconsortium.org. [2] L.C.B. Mostaço-Guidolin, Determinação da configuração de ondas de Alfvén excitadas no tokamak TCABR, Master's thesis, Universidade de São Paulo (2007), http://www.teses.usp.br/teses/disponiveis/43/43134/tde-23042009-230419/.
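SCTE itself is a Perl module, but the underlying idea (sending SCPI commands to an instrument over one of the supported buses) can be illustrated with a short sketch. The example below uses Python and a raw LAN socket on the conventional SCPI port 5025; the IP address and the measurement command are placeholders, since command syntax beyond the standard *IDN? query varies by vendor.

```python
import socket

class ScpiInstrument:
    """Minimal SCPI-over-LAN helper: newline-terminated commands on a raw socket."""

    def __init__(self, host, port=5025, timeout=5.0):
        self.sock = socket.create_connection((host, port), timeout=timeout)

    def write(self, cmd):
        self.sock.sendall((cmd + "\n").encode("ascii"))

    def query(self, cmd):
        self.write(cmd)
        # A single recv is enough for short textual replies in this sketch.
        return self.sock.recv(65536).decode("ascii").strip()

# Hypothetical usage: identify the oscilloscope and read one measurement.
scope = ScpiInstrument("192.168.0.10")        # placeholder address
print(scope.query("*IDN?"))                   # standard IEEE 488.2 identification query
print(scope.query("MEASURE:VPP? CH1"))        # vendor-specific command, shown as an example only
```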
Internal Audit: Does it Enhance Governance in the Australian Public University Sector?
ERIC Educational Resources Information Center
Christopher, Joe
2015-01-01
This study seeks to confirm if internal audit, a corporate control process, is functioning effectively in Australian public universities. The study draws on agency theory, published literature and best-practice guidelines to develop an internal audit evaluation framework. A survey instrument is thereafter developed from the framework and used as a…
Humphries, Debbie L; Carroll-Scott, Amy; Mitchell, Leif; Tian, Terry; Choudhury, Shonali; Fiellin, David A
2014-01-01
Although awareness of the importance of the research capacity of community-based organizations (CBOs) is growing, a uniform framework of the research capacity domains within CBOs has not yet been developed. The aim was to develop a framework and instrument (the Community REsearch Activity assessment Tool [CREAT]) for assessing the research activity and capacity of CBOs that incorporates awareness of the different data collection and analysis priorities of CBOs. We conducted a review of existing tools for assessing research capacity to identify key capacity domains. Instrument items were developed through an iterative process with CBO representatives and community researchers. The CREAT was then pilot tested with 30 CBOs. The four primary domains of the CREAT framework are 1) organizational support for research, 2) generalizable experiences, 3) research-specific experiences, and 4) funding. Organizations reported a high prevalence of activities in the research-specific experiences domain, including conducting literature reviews (70%), use of research terminology (83%), and primary data collection (100%). Respondents see research findings as important for improving program and service delivery and for seeking funds for new programs and services. Funders, board members, and policymakers are the most important dissemination audiences. The work reported herein advances the field of CBO research capacity by developing a systematic framework for assessing research activity and capacity relevant to the work of CBOs, and by developing and piloting an instrument to assess activity in these domains.
On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Patel, Umeshkumar; Vootukuru, Meg
2007-01-01
Sensor Web-based adaptation and sharing of space flight mission resources, including those of the Space-Ground and Control-User communication segment, could greatly benefit from utilization of heritage Internet Protocols and devices applied for Spaceflight (SpaceIP). This has been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine how well those investments are utilized and accepted for spaceflight in years to come. As with SpaceIP, commercial real-time computational resources co-located with the instrument, together with data compression and storage, can be enabled on board a spacecraft and, in turn, support a powerful Sensor Web-based approach to spaceflight instrument design. Sensor Web-enabled reconfiguration and adaptation of structures for hardware resources and information systems will put Field Programmable Gate Arrays (FPGAs) and other aerospace programmable logic devices to the use for which this technology was intended. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications, yet they are still waiting to be explored, because a new approach to spaceflight instrumentation is needed to make these mature Sensor Web technologies applicable to spaceflight. In this paper we present an approach to developing the related and enabling spaceflight instrument-level technologies, based on the new concept of a representative spaceflight Instrument Sensor Web (ISW).
Real time wind farm emulation using SimWindFarm toolbox
NASA Astrophysics Data System (ADS)
Topor, Marcel
2016-06-01
This paper presents a wind farm emulation solution using an open-source Matlab/Simulink toolbox and the National Instruments cRIO platform. The work is based on the Aeolus SimWindFarm (SWF) toolbox models developed at Aalborg University, Denmark. Using the Matlab Simulink models developed in SWF, the modeling code can be exported to a real-time model using the NI VeriStand model framework, and the resulting code is integrated as hardware-in-the-loop control on the NI 9068 platform.
Instrument Control (iC) – An Open-Source Software to Automate Test Equipment
Pernstich, K. P.
2012-01-01
It has become common practice to automate data acquisition from programmable instrumentation, and a range of different software solutions fulfill this task. Many routine measurements require sequential processing of certain tasks, for instance to adjust the temperature of a sample stage, take a measurement, and repeat that cycle for other temperatures. This paper introduces an open-source Java program that processes a series of text-based commands that define the measurement sequence. These commands are in an intuitive format which provides great flexibility and allows quick and easy adaptation to various measurement needs. For each of these commands, the iC-framework calls a corresponding Java method that addresses the specified instrument to perform the desired task. The functionality of iC can be extended with minimal programming effort in Java or Python, and new measurement equipment can be addressed by defining new commands in a text file without any programming. PMID:26900522
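The central pattern described here is a small interpreter that reads a text script line by line and dispatches each command to a handler. The sketch below illustrates that pattern in Python; the command names and script syntax are invented for illustration and are not the actual iC command set.

```python
class SequenceRunner:
    """Toy interpreter for a text-based measurement sequence (illustrative only)."""

    def __init__(self):
        self.handlers = {"set_temperature": self.set_temperature,
                         "wait": self.wait,
                         "measure": self.measure}

    def set_temperature(self, kelvin):
        print(f"setting sample stage to {float(kelvin)} K")

    def wait(self, seconds):
        print(f"waiting {float(seconds)} s")

    def measure(self):
        print("taking a measurement")

    def run(self, script):
        for line in script.splitlines():
            line = line.split("#", 1)[0].strip()   # strip comments and blank lines
            if not line:
                continue
            name, *args = line.split()
            self.handlers[name](*args)             # dispatch to the matching handler

SequenceRunner().run("""
set_temperature 77     # cool the stage
wait 10
measure
set_temperature 300
measure
""")
```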
ERIC Educational Resources Information Center
Ioannidou, Alexandra
2007-01-01
In recent years, the ongoing development towards a knowledge-based society--associated with globalization, an aging population, new technologies and organizational changes--has led to a more intensive analysis of education and learning throughout life with regard to quantitative, qualitative and financial aspects. In this framework, education…
Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye
2016-01-13
A framework for establishing a standard reference scale for texture is proposed, based on multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the method is verified by establishing a standard reference scale for the texture attribute hardness with well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (the TPA test), and the results were analysed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The results indicate that the regression between the estimated sensory value and the instrumentally measured value is significant (R² = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
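Stevens's theory referred to here is the psychophysical power law S = k * I^n relating sensory magnitude S to instrumental stimulus intensity I; the fit is linear in log-log coordinates. A minimal sketch with made-up (not the study's) hardness data:

```python
import numpy as np

# Hypothetical paired data: instrumental hardness (TPA, N) and mean sensory scores.
instrumental = np.array([2.1, 5.3, 9.8, 18.0, 33.0, 60.0, 110.0, 200.0, 360.0])
sensory = np.array([1.0, 1.8, 2.9, 4.1, 5.5, 7.0, 8.6, 10.5, 12.8])

# Stevens's power law S = k * I**n becomes a straight line in log-log space.
n, log_k = np.polyfit(np.log(instrumental), np.log(sensory), 1)
pred = np.exp(log_k) * instrumental**n
r2 = 1 - np.sum((sensory - pred) ** 2) / np.sum((sensory - sensory.mean()) ** 2)
print(f"exponent n = {n:.2f}, R^2 = {r2:.3f}")
```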
Measuring patient-perceived hospital service quality: a conceptual framework.
Pai, Yogesh P; Chary, Satyanarayana T
2016-04-18
Purpose - Although measuring healthcare service quality is not a new phenomenon, the instruments used to measure are timeworn. With the shift in focus to patient centric processes in hospitals and recognizing healthcare to be different compared to other services, service quality measurement needs to be tuned specifically to healthcare. The purpose of this paper is to design a conceptual framework for measuring patient perceived hospital service quality (HSQ), based on existing service quality literature. Design/methodology/approach - Using HSQ theories, expanding existing healthcare service models and literature, a conceptual framework is proposed to measure HSQ. The paper outlines patient perceived service quality dimensions. Findings - An instrument for measuring HSQ dimensions is developed and compared with other service quality measuring instruments. The latest dimensions are in line with previous studies, but a relationship dimension is added. Practical implications - The framework empowers managers to assess healthcare quality in corporate, public and teaching hospitals. Originality/value - The paper helps academics and practitioners to assess HSQ from a patient perspective.
Schönrock-Adema, Johanna; Visscher, Maartje; Raat, A. N. Janet; Brand, Paul L. P.
2015-01-01
Introduction: Current instruments to evaluate the postgraduate medical educational environment lack theoretical frameworks and are relatively long, which may reduce response rates. We aimed to develop and validate a brief instrument that, based on a solid theoretical framework for educational environments, solicits resident feedback to screen the quality of the postgraduate medical educational environment. Methods: Stepwise, we developed a screening instrument, using existing instruments to assess educational environment quality and adopting a theoretical framework that defines three educational environment domains: content, atmosphere and organization. First, items from relevant existing instruments were collected and, after deleting duplicates and items not specifically addressing the educational environment, grouped into the three domains. In a Delphi procedure, the item list was reduced to a set of items considered most important and comprehensively covering the three domains. These items were triangulated against the results of semi-structured interviews with 26 residents from three teaching hospitals to achieve face validity. This draft version of the Scan of Postgraduate Educational Environment Domains (SPEED) was administered to residents in a general and a university hospital and further reduced and validated based on the data collected. Results: Two hundred twenty-three residents completed the 43-item draft SPEED. We used half of the dataset for item reduction and the other half for validating the resulting SPEED (15 items, 5 per domain). Internal consistencies were high. Correlations between domain scores in the draft and brief versions of the SPEED were high (>0.85) and highly significant (p<0.001). Domain score variance of the draft instrument was explained for ≥80% by the items representing the domains in the final SPEED. Conclusions: The SPEED comprehensively covers the three educational environment domains defined in the theoretical framework. Because of its validity and brevity, the SPEED is promising as a useful and easily applicable tool to regularly screen educational environment quality in postgraduate medical education. PMID:26413836
Puska, Pekka
2017-05-23
The World Health Organization (WHO) Framework Convention on Tobacco Control (FCTC) is a unique global health instrument, since it is the only instrument in the health field that constitutes international law. After 10 years of its existence, an Independent Expert Group assessed the impact of the FCTC using all available data and by visiting a number of countries to interview different stakeholders. It is quite clear that the Treaty has acted as a strong catalyst and framework for national actions and that remarkable progress in global tobacco control can be seen. At the same time, the FCTC has moved tobacco control in countries from a purely health issue to a legal responsibility of the whole government, and on the international level it has created stronger interagency collaboration. The assessment also showed the many challenges. The spread of tobacco use, as well as of other risky lifestyles, is related to globalization. The FCTC is a pioneering example of global action to counteract the negative social consequences of globalization. A convention is not an easy instrument, but the FCTC has undoubtedly sparked thinking about, and development of, other stronger public health instruments and the needed governance structures. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
The evolution of violence risk assessment.
Monahan, John; Skeem, Jennifer L
2014-10-01
Many instruments have been published in recent years to improve the ability of mental health clinicians to estimate the likelihood that an individual will behave violently toward others. Increasingly, these instruments are being applied in response to laws that require specialized risk assessments. In this review, we present a framework that goes beyond the "clinical" and "actuarial" dichotomy to describe a continuum of structured approaches to risk assessment. Despite differences among them, there is little evidence that one instrument predicts violence better than another. We believe that these group-based instruments are useful for assessing an individual's risk, and that the instrument should be chosen based on the purpose of the assessment.
2011-01-01
Background: Guidance documents for the development and validation of patient-reported outcomes (PROs) advise the use of conceptual frameworks, which outline the structure of the concept that a PRO aims to measure. It is unknown whether currently available PROs are based on conceptual frameworks. This study, which was limited to a specific case, had the following aims: (i) to identify conceptual frameworks of physical activity in chronic respiratory patients or similar populations (chronic heart disease patients or the elderly) and (ii) to assess whether the development and validation of PROs to measure physical activity in these populations were based on a conceptual framework of physical activity. Methods: Two systematic reviews were conducted through searches of the Medline, Embase, PsycINFO, and Cinahl databases prior to January 2010. Results: In the first review, only 2 out of 581 references pertaining to physical activity in the defined populations provided a conceptual framework of physical activity in COPD patients. In the second review, out of 103 studies developing PROs to measure physical activity or related constructs, none were based on a conceptual framework of physical activity. Conclusions: These findings raise concerns about how the large body of evidence from studies that use physical activity PRO instruments should be evaluated by health care providers, guideline developers, and regulatory agencies. PMID:21967887
Bala, Sidona-Valentina; Forslind, Kristina; Fridlund, Bengt; Samuelson, Karin; Svensson, Björn; Hagell, Peter
2018-06-01
Person-centred care (PCC) is considered a key component of effective illness management and high-quality care. However, the PCC concept is underdeveloped in outpatient care. In rheumatology, PCC is considered an unmet need and its further development and evaluation is of high priority. The aim of the present study was to conceptualize and operationalize PCC, in order to develop an instrument for measuring patient-perceived PCC in nurse-led outpatient rheumatology clinics. A conceptual outpatient PCC framework was developed, based on the experiences of people with rheumatoid arthritis (RA), person-centredness principles and existing PCC frameworks. The resulting framework was operationalized into the PCC instrument for outpatient care in rheumatology (PCCoc/rheum), which was tested for acceptability and content validity among 50 individuals with RA attending a nurse-led outpatient clinic. The conceptual framework focuses on the meeting between the person with RA and the nurse, and comprises five interrelated domains: social environment, personalization, shared decision-making, empowerment and communication. Operationalization of the domains into a pool of items generated a preliminary PCCoc/rheum version, which was completed in a mean (standard deviation) of 5.3 (2.5) min. Respondents found items easy to understand (77%) and relevant (93%). The Content Validity Index of the PCCoc/rheum was 0.94 (item level range, 0.87-1.0). About 80% of respondents considered some items redundant. Based on these results, the PCCoc/rheum was revised into a 24-item questionnaire. A conceptual outpatient PCC framework and a 24-item questionnaire intended to measure PCC in nurse-led outpatient rheumatology clinics were developed. The extent to which the questionnaire represents a measurement instrument remains to be tested. Copyright © 2018 John Wiley & Sons, Ltd.
Jack, Leonard; Liburd, Leandris; Spencer, Tirzah; Airhihenbuwa, Collins O
2004-06-01
Eight studies included in a recent systematic review of the efficacy of diabetes self-management education were qualitatively reexamined to determine the presence of theoretical frameworks, the methods used to ensure cultural appropriateness, and the quality of the instruments used. Theoretical frameworks that help to explain the complex pathways that produce health outcomes were lacking; cultural indices were not incorporated into diabetes self-management education; and the instruments used to measure outcomes were inadequate. We provide recommendations to improve research on diabetes self-management education in community settings through use of a contextual framework that encourages targeting multiple levels of influence--individual, family, organizational, community, and policy.
Explicating Individual Training Decisions
ERIC Educational Resources Information Center
Walter, Marcel; Mueller, Normann
2015-01-01
In this paper, we explicate individual training decisions. For this purpose, we propose a framework based on instrumentality theory, a psychological theory of motivation that has frequently been applied to individual occupational behavior. To test this framework, we employ novel German individual data and estimate the effect of subjective expected…
ERIC Educational Resources Information Center
Tondeur, Jo; Aesaert, Koen; Pynoo, Bram; van Braak, Johan; Fraeyman, Norbert; Erstad, Ola
2017-01-01
The main objective of this study is to develop a self-report instrument to measure preservice teachers' ICT competencies in education. The questionnaire items of this instrument are based on an existing comprehensive framework and were created with input from experts in the field. The data were collected from a sample of 931 final-year preservice…
Integrated Instrument Simulator Suites for Earth Science
NASA Technical Reports Server (NTRS)
Tanelli, Simone; Tao, Wei-Kuo; Matsui, Toshihisa; Hostetler, Chris; Hair, Johnathan; Butler, Carolyn; Kuo, Kwo-Sen; Niamsuwan, Noppasin; Johnson, Michael P.; Jacob, Joseph C.;
2012-01-01
The NASA Earth Observing System Simulators Suite (NEOS3) is a modular framework of forward simulation tools for remote sensing of Earth's atmosphere from space. It was initiated as the Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS) under the NASA Advanced Information Systems Technology (AIST) program of the Earth Science Technology Office (ESTO) to enable science users to perform simulations based on advanced atmospheric and simple land surface models, and to rapidly integrate into a broad framework any experimental or innovative tools that they may have developed in this context. The name was changed to NEOS3 when the project was expanded to include more advanced modeling tools for the surface contributions, accounting for the scattering and emission properties of layered surfaces (e.g., soil moisture, vegetation, snow and ice, subsurface layers). NEOS3 relies on a web-based graphical user interface and a three-stage processing strategy to generate simulated measurements. The user has full control over a wide range of customizations, both in terms of a priori assumptions and in terms of the specific solvers or models used to calculate the measured signals. This presentation will demonstrate the general architecture and the configuration procedures, and illustrate some sample products and the fundamental interface requirements for modules that are candidates for integration.
A Performance Management Framework for Civil Engineering
1990-09-01
cultural change. A non-equivalent control group design was chosen to augment the case analysis; Figure 3.18 shows the form of the quasi-experiment. The non-equivalent control group design controls the following obstacles to internal validity: history, maturation, testing, and instrumentation (Campbell and Stanley, 1963: 48, 50; Table 7, Validity of Quasi-Experiment).
School Violence Assessment: A Conceptual Framework, Instruments, and Methods
ERIC Educational Resources Information Center
Benbenishty, Rami; Astor, Ron Avi; Estrada, Joey Nunez
2008-01-01
This article outlines a philosophical and theoretical framework for conducting school violence assessments at the local level. The authors advocate that assessments employ a strong conceptual foundation based on social work values. These values include the active measurement of ecological factors inside and outside the school that reflect the…
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control, and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
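The check-standard analysis described here relies on Shewhart control charts. As a minimal sketch (not the Langley procedure itself), the limits of an individuals/moving-range chart for a sequence of check-standard results can be computed as follows; 2.66 and 3.267 are the standard SQC constants for a moving range of two, and the example values are invented.

```python
import numpy as np

def individuals_chart_limits(x):
    """Shewhart individuals / moving-range chart limits for successive
    check-standard results (one value per run)."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))                  # moving ranges between successive runs
    center, mr_bar = x.mean(), mr.mean()
    return {"center": center,
            "UCL": center + 2.66 * mr_bar,   # 2.66 = 3 / d2 for n = 2
            "LCL": center - 2.66 * mr_bar,
            "MR_UCL": 3.267 * mr_bar}        # D4 for n = 2

# Illustrative call with made-up check-standard drag-coefficient repeats.
print(individuals_chart_limits([0.0251, 0.0249, 0.0252, 0.0250, 0.0248, 0.0253]))
```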
Understanding the interplay of cancer patients' instrumental concerns and emotions.
Brandes, Kim; van der Goot, Margot J; Smit, Edith G; van Weert, Julia C M; Linn, Annemiek J
2017-05-01
The aims of this study were 1) to assess patients' descriptions of concerns, and 2) to inform a conceptual framework in which the impact of the nature of concerns on doctor-patient communication is specified. Six focus groups were conducted with 39 cancer patients and survivors. In these focus groups, participants were asked to describe their concerns during and after their illness. Concerns were described as instrumental concerns (e.g., receiving insufficient information) and emotions (e.g., sadness). Patients frequently explained their concerns as an interplay of instrumental concerns and emotions. Examples of this interplay were "receiving incorrect information" coupled with "frustration", and "difficulties with searching for, finding and judging information" coupled with "fear". Instrumental concerns need to be taken into account in the operationalization of concerns in research. Based on the interplay, the conceptual framework suggests that patients can express instrumental concerns as emotions and emotions as instrumental concerns. Consequently, providers can respond with instrumental and emotional communication when patients express an interplay of concerns. The results of this study can be used to support providers in recognizing concerns expressed by patients in consultations. Copyright © 2017 Elsevier B.V. All rights reserved.
5. INTERIOR, INSTRUMENTATION AND CONTROL BUILDING ADDITION. Looking north. ...
5. INTERIOR, INSTRUMENTATION AND CONTROL BUILDING ADDITION. Looking north. - Edwards Air Force Base, South Base Sled Track, Instrumentation & Control Building, South of Sled Track, Station "50" area, Lancaster, Los Angeles County, CA
Core data elements tracking elder sexual abuse.
Hanrahan, Nancy P; Burgess, Ann W; Gerolamo, Angela M
2005-05-01
Sexual abuse in the older adult population is an understudied vector of violent crimes with significant physical and psychological consequences for victims and families. Research requires a theoretical framework that delineates core elements using a standardized instrument. To develop a conceptual framework and identify core data elements specific to the older adult population, clinical, administrative, and criminal experts were consulted using a nominal group method to revise an existing sexual assault instrument. The revised instrument could be used to establish a national database of elder sexual abuse. The database could become a standard reference to guide the detection, assessment, and prosecution of elder sexual abuse crimes as well as build a base from which policy makers could plan and evaluate interventions that targeted risk factors.
Role of IAC in large space systems thermal analysis
NASA Technical Reports Server (NTRS)
Jones, G. K.; Skladany, J. T.; Young, J. P.
1982-01-01
Computer analysis programs to evaluate critical coupling effects that can significantly influence spacecraft system performance are described. These coupling effects arise from the varied parameters of the spacecraft systems, environments, and forcing functions associated with disciplines such as thermal, structures, and controls. Adverse effects can be expected to significantly impact system design aspects such as structural integrity, controllability, and mission performance. One needed design analysis capability is a software system that can integrate individual discipline computer codes into a highly user-oriented, interactive-graphics-based analysis capability. The integrated analysis capability (IAC) system can be viewed both as a core framework system that serves as an integrating base to which users can readily add desired analysis modules, and as a self-contained interdisciplinary system analysis capability with a specific set of fully integrated multidisciplinary analysis programs that address the coupling of the thermal, structures, controls, antenna radiation performance, and instrument optical performance disciplines.
Maternal parental self-efficacy in the postpartum period.
Leahy-Warren, Patricia; McCarthy, Geraldine
2011-12-01
To present an integrated literature review on maternal parental self-efficacy (MPSE) in the postpartum period. A literature search of CINAHL with full text, MEDLINE and PsycINFO was conducted from their start dates to February 2010. Inclusion criteria were English-language research articles that reported the measurement of MPSE in the postpartum period. Articles were reviewed based on purpose, theoretical framework, data collection method, sample, main findings and nursing implications for maternal parenting. In addition, data related to the instruments used to measure MPSE were included. The data revealed a statistically significant increase in MPSE over time from baseline; a positive relationship between MPSE and number of children, social support, maternal parenting satisfaction and marital satisfaction; and a negative relationship between MPSE and maternal stress, anxiety and postpartum depression. A variety of instruments were used to measure MPSE, but the majority were based on Bandura's framework. Findings from this review may assist women's health researchers and clinical nurses/midwives in assessing and developing appropriate interventions for increasing risk awareness, enhancing MPSE and, subsequently, satisfaction with parenting and emotional well-being. Further research, underpinned by theoretical frameworks and using domain-specific instruments, is necessary to identify predictors of MPSE. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhang, Qingmin; Deng, Bangjie; Chen, Yuanmiaoliang; Liu, Bochao; Chen, Shaofei; Fan, Jinquan; Feng, Lie; Deng, Haixiao; Liu, Bo; Wang, Dong
2017-10-01
The free electron laser (FEL), as a next-generation light source, is an attractive tool in frontier scientific research because of its advantages of full coherence, ultra-short pulse duration, and controllable polarization. Owing to the demand for real-time bunch diagnostics during FEL experiments, precise nondestructive measurements of the polarization and X-ray energy spectrum with a single instrument are preferred. In this paper, such an instrument based on the electron time-of-flight technique is proposed. To deal with the complexity and nonlinearity involved, a numerical model in the framework of Geant4 has been developed for optimization. Taking the Shanghai Soft X-ray FEL user facility as an example, the dependence of its measurement performance on the critical parameters was studied systematically, and an optimal design was finally obtained, achieving resolutions of 0.5% for the polarization degree and 0.3 eV for the X-ray energy spectrum.
A Common Calibration Source Framework for Fully-Polarimetric and Interferometric Radiometers
NASA Technical Reports Server (NTRS)
Kim, Edward J.; Davis, Brynmor; Piepmeier, Jeff; Zukor, Dorothy J. (Technical Monitor)
2000-01-01
Two types of microwave radiometry--synthetic thinned array radiometry (STAR) and fully-polarimetric (FP) radiometry--have received increasing attention during the last several years. STAR radiometers offer a technological solution to achieving high spatial resolution imaging from orbit without requiring a filled aperture or a moving antenna, and FP radiometers measure extra polarization state information upon which entirely new or more robust geophysical retrieval algorithms can be based. Radiometer configurations used for both STAR and FP instruments share one fundamental feature that distinguishes them from more 'standard' radiometers, namely, they measure correlations between pairs of microwave signals. The calibration requirements for correlation radiometers are broader than those for standard radiometers. Quantities of interest include total powers, complex correlation coefficients, various offsets, and possible nonlinearities. A candidate for an ideal calibration source would be one that injects test signals with precisely controllable correlation coefficients and absolute powers simultaneously into a pair of receivers, permitting all of these calibration quantities to be measured. The complex nature of correlation radiometer calibration, coupled with certain inherent similarities between STAR and FP instruments, suggests significant leverage in addressing both problems together. Recognizing this, a project was recently begun at NASA Goddard Space Flight Center to develop a compact low-power subsystem for spaceflight STAR or FP receiver calibration. We present a common theoretical framework for the design of signals for a controlled correlation calibration source. A statistical model is described, along with temporal and spectral constraints on such signals. Finally, a method for realizing these signals is demonstrated using a Matlab-based implementation.
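A core requirement stated here is a source that injects signal pairs with precisely controllable correlation coefficients. One standard way to synthesize such a pair, sketched below in Python rather than the paper's Matlab and not tied to the actual hardware design, is to mix independent Gaussian noise through the Cholesky factor of the desired 2x2 covariance matrix.

```python
import numpy as np

def correlated_noise_pair(n_samples, rho, power=1.0, seed=None):
    """Two zero-mean Gaussian test signals with correlation coefficient rho
    and equal power, built by Cholesky mixing of independent noise."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((2, n_samples))           # independent unit-variance noise
    cov = power * np.array([[1.0, rho], [rho, 1.0]])  # desired covariance matrix
    x = np.linalg.cholesky(cov) @ w
    return x[0], x[1]

a, b = correlated_noise_pair(100_000, rho=0.3, seed=2)
print(np.corrcoef(a, b)[0, 1])   # ~0.3 to within statistical error
```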
Designing Successful Next-Generation Instruments to Detect the Epoch of Reionization
NASA Astrophysics Data System (ADS)
Thyagarajan, Nithyanandan; Hydrogen Epoch of Reionization Array (HERA) team, Murchison Widefield Array (MWA) team
2018-01-01
The Epoch of Reionization (EoR) signifies a period of intense evolution of the Inter-Galactic Medium (IGM) in the early Universe, during which the first generations of stars and galaxies turned the neutral IGM completely ionized at redshifts ≥ 6. This important epoch is poorly explored to date. Measurement of the redshifted 21 cm line from neutral Hydrogen during the EoR promises to provide the most direct constraints on this epoch. Ongoing experiments to detect the redshifted 21 cm power spectrum during reionization, including the Murchison Widefield Array (MWA), the Precision Array for Probing the Epoch of Reionization (PAPER), and the Low Frequency Array (LOFAR), appear to be severely affected by bright foregrounds and unaccounted-for instrumental systematics. For example, the spectral structure introduced by wide-field effects, aperture shapes and angular power patterns of the antennas, electrical and geometrical reflections in the antennas and electrical paths, and antenna position errors can be major limiting factors. These mimic the 21 cm signal and severely degrade the instrument performance. It is imperative for the next generation of experiments to eliminate these systematics at their source via robust instrument design. I will discuss a generic framework to set cosmologically motivated antenna performance specifications and design strategies using the Precision Radio Interferometry Simulator (PRISim) -- a high-precision tool that I have developed for simulations of foregrounds and the instrument transfer function, intended primarily for 21 cm EoR studies but also broadly applicable to interferometer-based intensity mapping experiments. The Hydrogen Epoch of Reionization Array (HERA), designed in part based on this framework, is expected to detect the 21 cm signal with high significance. I will present this framework and the simulations, and their potential for designing upcoming radio instruments such as HERA and the Square Kilometre Array (SKA).
FPGA based control system for space instrumentation
NASA Astrophysics Data System (ADS)
Di Giorgio, Anna M.; Cerulli Irelli, Pasquale; Nuzzolo, Francesco; Orfei, Renato; Spinoglio, Luigi; Liu, Giovanni S.; Saraceno, Paolo
2008-07-01
The prototype for a general-purpose FPGA-based control system for space instrumentation is presented, with particular attention to the instrument control application software. The system HW is based on the LEON3FT processor, which gives the flexibility to configure the chip with only the necessary HW functionalities, from simple logic up to small dedicated processors. The instrument control SW is developed in ANSI C and, for time-critical (<10 μs) commanding sequences, implements an internal instruction sequencer triggered via an interrupt service routine based on a HW high-priority interrupt.
Predictive assimilation framework to support contaminated site understanding and remediation
NASA Astrophysics Data System (ADS)
Versteeg, R. J.; Bianchi, M.; Hubbard, S. S.
2014-12-01
Subsurface system behavior at contaminated sites is driven and controlled by the interplay of physical, chemical, and biological processes occurring at multiple temporal and spatial scales. Effective remediation and monitoring planning requires an understanding of this complexity that is current, predictive (with some level of confidence) and actionable. We present and demonstrate a predictive assimilation framework (PAF). This framework automatically ingests, quality-controls and stores near-real-time environmental data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of the subsurface system. PAF is implemented as a cloud-based software application with five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery, and (5) orchestration. Access to and interaction with PAF is through a standard browser. PAF is designed to be modular so that it can ingest and process different data streams depending on the site. We will present an implementation of PAF which uses data from a highly instrumented site (the DOE Rifle Subsurface Biogeochemistry Field Observatory in Rifle, Colorado), for which PAF automatically ingests hydrological data and forward models groundwater flow in the saturated zone.
NASA Technical Reports Server (NTRS)
Wheeler, Kevin; Timucin, Dogan; Rabbette, Maura; Curry, Charles; Allan, Mark; Lvov, Nikolay; Clanton, Sam; Pilewskie, Peter
2002-01-01
The goal of visual inference programming is to develop a software framework for data analysis and to provide machine learning algorithms for interactive data exploration and visualization. The topics include: 1) Intelligent Data Understanding (IDU) framework; 2) Challenge problems; 3) What's new here; 4) Framework features; 5) Wiring diagram; 6) Generated script; 7) Results of script; 8) Initial algorithms; 9) Independent Component Analysis for instrument diagnosis; 10) Output sensory mapping virtual joystick; 11) Output sensory mapping typing; 12) Closed-loop feedback mu-rhythm control; 13) Closed-loop training; 14) Data sources; and 15) Algorithms. This paper is in viewgraph form.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-24
... Digital Computer-Based Instrumentation and Control Systems.'' This BTP is to be cited as the acceptance criteria for Diversity and Defense-in-Depth in Digital Computer-Based Instrumentation and Control Systems... Evaluation of Diversity and Defense-in-Depth in Digital Computer-Based Instrumentation and Control Systems...
Tosteson, Tor D.; Morden, Nancy E.; Stukel, Therese A.; O'Malley, A. James
2014-01-01
The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival. PMID:25506259
MacKenzie, Todd A; Tosteson, Tor D; Morden, Nancy E; Stukel, Therese A; O'Malley, A James
2014-06-01
The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival.
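For orientation, the standard Cox partial-likelihood score equation for a treatment $X_i$, event indicator $\delta_i$ and risk set $R(t_i)$ is
$$ U(\beta)=\sum_i \delta_i\left\{X_i-\frac{\sum_{j\in R(t_i)} X_j e^{\beta X_j}}{\sum_{j\in R(t_i)} e^{\beta X_j}}\right\}=0 .$$
By analogy with the way the classical IV estimator replaces $X$ by the instrument $Z$ in the normal equations $\sum_i Z_i(Y_i-X_i\beta)=0$, one hedged reading of the abstract above is an estimating equation of the form
$$ \tilde U(\beta)=\sum_i \delta_i\left\{Z_i-\frac{\sum_{j\in R(t_i)} Z_j e^{\beta X_j}}{\sum_{j\in R(t_i)} e^{\beta X_j}}\right\}=0 ,$$
exploiting the approximate orthogonality of the instrument with unobserved confounders among those at risk; the authors' exact estimator may differ from this illustrative form.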
Varying ultrasound power level to distinguish surgical instruments and tissue.
Ren, Hongliang; Anuraj, Banani; Dupont, Pierre E
2018-03-01
We investigate a new framework of surgical instrument detection based on power-varying ultrasound images with simple and efficient pixel-wise intensity processing. Without using complicated feature extraction methods, we identified the instrument with an estimated optimal power level and by comparing pixel values of varying transducer power level images. The proposed framework exploits the physics of the ultrasound imaging system by varying the transducer power level to effectively distinguish metallic surgical instruments from tissue. This power-varying image guidance is motivated by our observations that ultrasound imaging at different power levels exhibits different contrast enhancement capabilities between tissue and instruments in ultrasound-guided robotic beating-heart surgery. Using lower transducer power levels (ranging from 40 to 75% of the rated lowest ultrasound power levels of the two tested ultrasound scanners) can effectively suppress the strong imaging artifacts from metallic instruments and thus can be utilized together with the images from normal transducer power levels to enhance the separability between instrument and tissue, improving intraoperative instrument tracking accuracy from the acquired noisy ultrasound volumetric images. We performed experiments in phantoms and ex vivo hearts in water tank environments. The proposed multi-level power-varying ultrasound imaging approach can identify robotic instruments of high acoustic impedance from low-signal-to-noise-ratio ultrasound images by power adjustments.
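The following is a deliberately simplified, hedged reading of the pixel-wise idea above (not the authors' code): a highly reflective metallic instrument stays bright when the transducer power is reduced, while weakly scattering tissue drops toward the noise floor, so simple intensity thresholds on co-registered low- and normal-power acquisitions give a rough instrument mask. The thresholds are assumed values.

```python
# Simplified sketch of the pixel-wise idea (not the authors' code; thresholds assumed):
# strong reflectors that remain bright at the reduced power level, and are also bright
# at normal power, are flagged as candidate instrument pixels.
import numpy as np

def instrument_mask(img_normal, img_low, t_low=0.6, t_normal=0.5):
    normal = img_normal.astype(float) / img_normal.max()
    low = img_low.astype(float) / img_low.max()
    return (low > t_low) & (normal > t_normal)   # bright at both power levels

# Usage with two hypothetical co-registered ultrasound volumes of identical shape:
# mask = instrument_mask(vol_normal_power, vol_low_power)
```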
Transaction-Based Building Controls Framework, Volume 1: Reference Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somasundaram, Sriram; Pratt, Robert G.; Akyol, Bora A.
This document proposes a framework concept to achieve the objectives of raising buildings' efficiency and energy savings potential, benefiting building owners and operators. We call it a transaction-based framework, wherein mutually beneficial and cost-effective market-based transactions can be enabled between multiple players across different domains. Transaction-based building controls are one part of the transactional energy framework. While these controls realize benefits by enabling automatic, market-based intra-building efficiency optimizations, the transactional energy framework provides similar benefits using the same market-based structure, yet on a larger scale and beyond just buildings, to society at large.
An Assessment of Alcohol and Drug Education/Prevention Programs in the United States Army
1973-12-01
summarized in an interim report, helped us to construct a conceptual framework for our data-gathering instruments and procedures and analyses. The basic...and leaders about their attitudes and behavior before and after exposure to ADEP), but we also made separate administrations of the same instruments ...study. Based on this, we developed our instruments and pretested them. We made preliminary one-day visits to almost all posts in our CONUS sample
UAF: a generic OPC unified architecture framework
NASA Astrophysics Data System (ADS)
Pessemier, Wim; Deconinck, Geert; Raskin, Gert; Saey, Philippe; Van Winckel, Hans
2012-09-01
As an emerging Service Oriented Architecture (SOA) specifically designed for industrial automation and process control, the OPC Unified Architecture specification should be regarded as an attractive candidate for controlling scientific instrumentation. Even though an industry-backed standard such as OPC UA can offer substantial added value to these projects, its inherent complexity poses an important obstacle for adopting the technology. Building OPC UA applications requires considerable effort, even when taking advantage of a COTS Software Development Kit (SDK). The OPC Unified Architecture Framework (UAF) attempts to reduce this burden by introducing an abstraction layer between the SDK and the application code in order to achieve a better separation of the technical and the functional concerns. True to its industrial origin, the primary requirement of the framework is to maintain interoperability by staying close to the standard specifications, and by expecting the minimum compliance from other OPC UA servers and clients. UAF can therefore be regarded as a software framework to quickly and comfortably develop and deploy OPC UA-based applications, while remaining compatible with third party OPC UA-compliant toolkits, servers (such as PLCs) and clients (such as SCADA software). In the first phase, as covered by this paper, only the client-side of UAF has been tackled in order to transparently handle discovery, session management, subscriptions, monitored items etc. We describe the design principles and internal architecture of our open-source software project, the first results of the framework running at the Mercator Telescope, and we give a preview of the planned server-side implementation.
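To illustrate the kind of session and read boilerplate such a client-side framework wraps, here is a minimal sketch using the python-opcua package (an assumption; UAF itself is a separate framework, and the endpoint URL and node identifier below are hypothetical).

```python
# Minimal OPC UA client sketch using the python-opcua package (an assumption; UAF
# itself is a separate framework). Endpoint URL and node id are hypothetical.
from opcua import Client

client = Client("opc.tcp://plc.example.org:4840")    # hypothetical server endpoint
client.connect()
try:
    position = client.get_node("ns=2;s=FilterWheel.Position")  # hypothetical node id
    print("filter wheel position:", position.get_value())
finally:
    client.disconnect()
```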
Integration of Sensors, Controllers and Instruments Using a Novel OPC Architecture
2017-01-01
The interconnection between sensors, controllers and instruments through a communication network plays a vital role in the performance and effectiveness of a control system. Since its inception in the 90s, the Object Linking and Embedding for Process Control (OPC) protocol has provided open connectivity for monitoring and automation systems. It has been widely used in several environments such as industrial facilities, building and energy automation, engineering education and many others. This paper presents a novel OPC-based architecture to implement automation systems devoted to R&D and educational activities. The proposal is a novel conceptual framework, structured into four functional layers where the diverse components are categorized aiming to foster the systematic design and implementation of automation systems involving OPC communication. Due to the benefits of OPC, the proposed architecture provides features like open connectivity, reliability, scalability, and flexibility. Furthermore, four successful experimental applications of such an architecture, developed at the University of Extremadura (UEX), are reported. These cases are a proof of concept of the ability of this architecture to support interoperability for different domains. Namely, the automation of energy systems like a smart microgrid and photobioreactor facilities, the implementation of a network-accessible industrial laboratory and the development of an educational hardware-in-the-loop platform are described. All cases include a Programmable Logic Controller (PLC) to automate and control the plant behavior, which exchanges operative data (measurements and signals) with a multiplicity of sensors, instruments and supervisory systems under the structure of the novel OPC architecture. Finally, the main conclusions and open research directions are highlighted. PMID:28654002
Integration of Sensors, Controllers and Instruments Using a Novel OPC Architecture.
González, Isaías; Calderón, Antonio José; Barragán, Antonio Javier; Andújar, José Manuel
2017-06-27
The interconnection between sensors, controllers and instruments through a communication network plays a vital role in the performance and effectiveness of a control system. Since its inception in the 90s, the Object Linking and Embedding for Process Control (OPC) protocol has provided open connectivity for monitoring and automation systems. It has been widely used in several environments such as industrial facilities, building and energy automation, engineering education and many others. This paper presents a novel OPC-based architecture to implement automation systems devoted to R&D and educational activities. The proposal is a novel conceptual framework, structured into four functional layers where the diverse components are categorized aiming to foster the systematic design and implementation of automation systems involving OPC communication. Due to the benefits of OPC, the proposed architecture provides features like open connectivity, reliability, scalability, and flexibility. Furthermore, four successful experimental applications of such an architecture, developed at the University of Extremadura (UEX), are reported. These cases are a proof of concept of the ability of this architecture to support interoperability for different domains. Namely, the automation of energy systems like a smart microgrid and photobioreactor facilities, the implementation of a network-accessible industrial laboratory and the development of an educational hardware-in-the-loop platform are described. All cases include a Programmable Logic Controller (PLC) to automate and control the plant behavior, which exchanges operative data (measurements and signals) with a multiplicity of sensors, instruments and supervisory systems under the structure of the novel OPC architecture. Finally, the main conclusions and open research directions are highlighted.
Astronomical Instrumentation System Markup Language
NASA Astrophysics Data System (ADS)
Goldbaum, Jesse M.
2016-05-01
The Astronomical Instrumentation System Markup Language (AISML) is an Extensible Markup Language (XML) based file format for maintaining and exchanging information about astronomical instrumentation. The factors behind the need for an AISML are first discussed, followed by the reasons why XML was chosen as the format. Next, it is shown how XML also provides the framework for a more precise definition of an astronomical instrument and how these instruments can be combined to form an Astronomical Instrumentation System (AIS). AISML files for several instruments as well as one for a sample AIS are provided. The files demonstrate how AISML can be utilized for various tasks, from web page generation and programming interfaces to instrument maintenance and quality management. The advantages of widespread adoption of AISML are discussed.
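Because AISML is XML, files of this kind can be read with standard tooling. The sketch below uses Python's standard-library ElementTree; the element and attribute names are invented for illustration and are not the actual AISML schema.

```python
# Minimal sketch of reading an AISML-style XML file with the standard library.
# The element/attribute names below are invented for illustration; the real AISML
# schema may differ.
import xml.etree.ElementTree as ET

sample = """
<instrument name="ExampleSpectrograph">
  <component type="detector" model="CCD-42-40"/>
  <component type="grating" lines_per_mm="1200"/>
</instrument>
"""

root = ET.fromstring(sample)
print("instrument:", root.get("name"))
for comp in root.findall("component"):
    print("  component:", comp.get("type"), dict(comp.attrib))
```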
Ilott, Irene; Gerrish, Kate; Eltringham, Sabrina A; Taylor, Carolyn; Pownall, Sue
2016-08-18
Swallowing difficulties challenge patient safety due to the increased risk of malnutrition, dehydration and aspiration pneumonia. A theoretically driven study was undertaken to examine the spread and sustainability of a locally developed innovation that involved using the Inter-Professional Dysphagia Framework to structure education for the workforce. A conceptual framework with 3 spread strategies (hierarchical control, participatory adaptation and facilitated evolution) was blended with a processual approach to sustaining organisational change. The aim was to understand the processes, mechanism and outcomes associated with the spread and sustainability of this safety initiative. An instrumental case study, prospectively tracked a dysphagia innovation for 34 months (April 2011 to January 2014) in a large health care organisation in England. A train-the-trainer intervention (as participatory adaptation) was deployed on care pathways for stroke and fractured neck of femur. Data were collected at the organisational and clinical level through interviews (n = 30) and document review. The coding frame combined the processual approach with the spread mechanisms. Pre-determined outcomes included the number of staff trained about dysphagia and impact related to changes in practice. The features and processes associated with hierarchical control and participatory adaptation were identified. Leadership, critical junctures, temporality and making the innovation routine were aspects of hierarchical control. Participatory adaptation was evident on the care pathways through stakeholder responses, workload and resource pressures. Six of the 25 ward based trainers cascaded the dysphagia training. The expected outcomes were achieved when the top-down mandate (hierarchical control) was supplemented by local engagement and support (participatory adaptation). Frameworks for spread and sustainability were combined to create a 'small theory' that described the interventions, the processes and desired outcomes a priori. This novel methodological approach confirmed what is known about spread and sustainability, highlighted the particularity of change and offered new insights into the factors associated with hierarchical control and participatory adaptation. The findings illustrate the dualities of organisational change as universal and context specific; as particular and amendable to theoretical generalisation. Appreciating these dualities may contribute to understanding why many innovations fail to become routine.
Building a framework for ergonomic research on laparoscopic instrument handles.
Li, Zheng; Wang, Guohui; Tan, Juan; Sun, Xulong; Lin, Hao; Zhu, Shaihong
2016-06-01
Laparoscopic surgery carries the advantage of minimal invasiveness, but ergonomic design of the instruments used has progressed slowly. Previous studies have demonstrated that the handle of laparoscopic instruments is vital for both surgical performance and surgeon's health. This review provides an overview of the sub-discipline of handle ergonomics, including an evaluation framework, objective and subjective assessment systems, data collection and statistical analyses. Furthermore, a framework for ergonomic research on laparoscopic instrument handles is proposed to standardize work on instrument design. Copyright © 2016 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
The Type of Culture at a High Performance Schools and Low Performance School in the State of Kedah
ERIC Educational Resources Information Center
Daud, Yaakob; Raman, Arumugam; Don, Yahya; O. F., Mohd Sofian; Hussin, Fauzi
2015-01-01
This research aims to identify the type of culture at a High Performance School (HPS) and Low Performance School (LPS) in the state of Kedah. The research instrument used to measure the type of organizational culture was adapted from Organizational Culture Assessment Instrument (Cameron & Quinn, 2006) based on Competing Values Framework Quinn…
NASA Astrophysics Data System (ADS)
Valentic, T. A.
2012-12-01
The Data Transport Network is designed for the delivery of data from scientific instruments located at remote field sites with limited or unreliable communications. Originally deployed at the Sondrestrom Research Facility in Greenland over a decade ago, the system supports the real-time collection and processing of data from large instruments such as incoherent scatter radars and lidars. In recent years, the Data Transport Network has been adapted to small, low-power embedded systems controlling remote instrumentation platforms deployed throughout the Arctic. These projects include multiple buoys from the O-Buoy, IceLander and IceGoat programs, renewable energy monitoring at the Imnavait Creek and Ivotuk field sites in Alaska and remote weather observation stations in Alaska and Greenland. This presentation will discuss the common communications controller developed for these projects. Although varied in their application, each of these systems shares a number of common features. Multiple instruments are attached, each of which needs to be power controlled, data sampled and files transmitted offsite. In addition, the power usage of the overall system must be minimized to handle the limited energy available from sources such as solar, wind and fuel cells. The communications links are satellite based. The buoys and weather stations utilize Iridium, and must therefore handle the common drop-outs and the high-latency, low-bandwidth nature of the link. The communications controller is an off-the-shelf, low-power, single board computer running a customized version of the Linux operating system. The Data Transport Network provides a Python-based software framework for writing individual data collection programs and supplies a number of common services for configuration, scheduling, logging, data transmission and resource management. Adding a new instrument involves writing only the necessary code for interfacing to the hardware. Individual programs communicate with the system services using XML-RPC. The scheduling algorithms have access to the current position and power levels, allowing instruments such as cameras to be run only during daylight hours or when sufficient power is available. The resource manager monitors the use of common devices such as the USB bus or Ethernet ports, and can power them down when they are not being used. This management lets us drop the power consumption from an average of 1 W to 250 mW.
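The abstract notes that individual data-collection programs talk to system services over XML-RPC. The following standard-library sketch shows that pattern in miniature; the "resource manager" service and its method name are invented for illustration, not the Data Transport Network's actual API.

```python
# Minimal sketch of the XML-RPC pattern described above, using only the standard
# library. The resource-manager service and its method name are invented for
# illustration, not the Data Transport Network's actual API.
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

def request_power(device):
    print(f"powering on {device}")
    return True

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True, logRequests=False)
server.register_function(request_power, "request_power")
threading.Thread(target=server.serve_forever, daemon=True).start()

# An instrument data-collection program would then ask the service for resources:
manager = xmlrpc.client.ServerProxy("http://localhost:8000")
print("granted:", manager.request_power("camera"))
```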
Work environments for employee creativity.
Dul, Jan; Ceylan, Canan
2011-01-01
Innovative organisations need creative employees who generate new ideas for product or process innovation. This paper presents a conceptual framework for the effect of personal, social-organisational and physical factors on employee creativity. Based on this framework, an instrument to analyse the extent to which the work environment enhances creativity is developed. This instrument was applied to a sample of 409 employees and support was found for the hypothesis that a creative work environment enhances creative performance. This paper illustrates how the instrument can be used in companies to select and implement improvements. STATEMENT OF RELEVANCE: The ergonomics discipline addresses the work environment mainly for improving health and safety and sometimes productivity and quality. This paper opens a new area for ergonomics: designing work environments for enhancing employee creativity in order to strengthen an organisation's capability for product and process innovation and, consequently, its competitiveness.
Overview of RICOR tactical cryogenic refrigerators for space missions
NASA Astrophysics Data System (ADS)
Riabzev, Sergey; Filis, Avishai; Livni, Dorit; Regev, Itai; Segal, Victor; Gover, Dan
2016-05-01
Cryogenic refrigerators represent a significant enabling technology for Earth and space science enterprises. Many space instruments require cryogenic refrigeration to enable the use of advanced detectors to explore a wide range of phenomena from space. RICOR refrigerators involved in various space missions are overviewed in this paper, starting in 1994 with the "Clementine" Moon mission and extending to the latest ExoMars mission launched in 2016. RICOR tactical rotary refrigerators have been incorporated in many space instruments, after passing qualification, lifetime, thermal management testing and flight acceptance. The tactical-to-space customization framework includes an extensive characterization and qualification test program to validate reliability, the design of thermal interfacing with a detector, vibration export control, efficient heat dissipation in a vacuum environment, robustness, mounting design, compliance with outgassing requirements and strict performance screening. Current RICOR development is focused on a dedicated ultra-long-life, highly reliable space cryogenic refrigerator based on a pulse tube design.
U.S.A.B.I.L.I.T.Y. Framework for Older Adults.
Caboral-Stevens, Meriam; Whetsell, Martha V; Evangelista, Lorraine S; Cypress, Brigitte; Nickitas, Donna
2015-01-01
The purpose of the current study was to present a framework to determine the potential usability of health websites by older adults. Review of the literature showed a paucity of nursing theory related to the use of technology and usability, particularly in older adults. The Roy Adaptation Model, a widely used nursing theory, was chosen to provide the framework for the new model. Technology constructs from the Technology Acceptance Model and the Unified Theory of Acceptance and Use of Technology, and the behavioral control construct from the Theory of Planned Behavior, were integrated into the construction of the derived model. The Use of Technology for Adaptation by Older Adults and/or Those With Limited Literacy (U.S.A.B.I.L.I.T.Y.) Model was constructed from the integration of diverse theoretical/conceptual perspectives. The four determinants of usability in the conceptual model include (a) efficiency, (b) learnability, (c) perceived user experience, and (d) perceived control. Because of the lack of well-validated survey questionnaires to measure these determinants, a U.S.A.B.I.L.I.T.Y. Survey was developed. A panel of experts evaluated the face and content validity of the new instrument. Internal consistency of the new instrument was 0.96. Usability is key to accepting technology. The derived U.S.A.B.I.L.I.T.Y. framework could serve as a guide for nurses in the formative evaluation of technology. Copyright 2015, SLACK Incorporated.
Reconstructing pre-instrumental streamflow in Eastern Australia using a water balance approach
NASA Astrophysics Data System (ADS)
Tozer, C. R.; Kiem, A. S.; Vance, T. R.; Roberts, J. L.; Curran, M. A. J.; Moy, A. D.
2018-03-01
Streamflow reconstructions based on paleoclimate proxies provide much longer records than the short instrumental-period records on which water resource management plans are currently based. In Australia there is a lack of in-situ high-resolution paleoclimate proxy records, but remote proxies with teleconnections to Australian climate have utility in producing streamflow reconstructions. Here we investigate, via a case study for a catchment in eastern Australia, the novel use of an Antarctic ice-core based rainfall reconstruction within a Budyko framework to reconstruct ∼1000 years of annual streamflow. The resulting streamflow reconstruction captures interannual to decadal variability in the instrumental streamflow, validating both the use of the ice-core rainfall proxy record and the Budyko-framework method. In the pre-instrumental era the streamflow reconstruction shows longer wet and dry epochs and periods of streamflow variability higher than observed in the instrumental era. Importantly, for both the instrumental record and pre-instrumental reconstructions, the wet (dry) epochs in the rainfall record are shorter (longer) in the streamflow record, and this non-linearity must be considered when inferring hydroclimatic risk or historical water availability directly from rainfall proxy records alone. These insights provide a better understanding of present infrastructure vulnerability in the context of past climate variability for eastern Australia. The streamflow reconstruction presented here also provides a better understanding of the range of hydroclimatic variability possible, and therefore represents a more realistic baseline on which to quantify the potential impacts of anthropogenic climate change on water security.
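The Budyko-type water-balance step behind such a reconstruction converts annual rainfall and potential evapotranspiration into runoff. The sketch below uses Fu's form of the Budyko curve as one example of that step; the shape parameter w and the input values are assumptions for illustration, not the paper's calibration.

```python
# Sketch of a Budyko-type water-balance step (Fu's form of the Budyko curve is used
# here as an example; the parameter w and the input numbers are assumptions, not the
# paper's calibrated values). Annual streamflow = rainfall - actual evapotranspiration.
import numpy as np

def annual_streamflow(precip_mm, pet_mm, w=2.6):
    aridity = pet_mm / precip_mm
    evap_ratio = 1.0 + aridity - (1.0 + aridity ** w) ** (1.0 / w)   # Fu's equation
    return precip_mm * (1.0 - evap_ratio)

rainfall_reconstruction = np.array([820.0, 640.0, 910.0, 700.0])  # mm/yr, illustrative
pet = 1200.0                                                      # mm/yr, assumed
print(annual_streamflow(rainfall_reconstruction, pet))
```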
Designing communication and remote controlling of virtual instrument network system
NASA Astrophysics Data System (ADS)
Lei, Lin; Wang, Houjun; Zhou, Xue; Zhou, Wenjian
2005-01-01
In this paper, a virtual instrument network over a LAN, and ultimately remote control of virtual instruments, is realized based on virtual instrument technology and the LabWindows/CVI software platform. The virtual instrument network system is made up of three subsystems: a server subsystem, a telnet client subsystem and a local instrument control subsystem. The paper describes the structure of the LabWindows-based virtual instrument network in detail and introduces the essential techniques involved: the application design of the network communication, the client/server programming model, the realization of communication between the remote PC and the server, the handover of workstation control authority, and the server program. The virtual instrument network can also be connected to the wider Internet. The practical value of these techniques has been verified through their application in an electronic-measurement virtual instrument network that has already been built; experiments and applications validate the effectiveness of this design.
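As a bare-bones illustration of the client/server command pattern described above (written with the Python standard library rather than LabWindows/CVI), the sketch below has a client send a measurement command to a server over TCP and read back the reply; the command string and port are invented for illustration.

```python
# Bare-bones TCP sketch of the client/server command pattern described above,
# using the Python standard library (not the LabWindows/CVI implementation).
# The command string and port number are invented for illustration.
import socket
import threading
import time

def instrument_server(port=5025):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("localhost", port))
    srv.listen(1)
    conn, _ = srv.accept()
    with conn:
        cmd = conn.recv(1024).decode().strip()
        reply = "1.2345 V" if cmd == "MEAS:VOLT?" else "ERR"
        conn.sendall(reply.encode())
    srv.close()

threading.Thread(target=instrument_server, daemon=True).start()
time.sleep(0.2)   # give the server thread time to start listening

client = socket.create_connection(("localhost", 5025))
client.sendall(b"MEAS:VOLT?\n")
print(client.recv(1024).decode())
client.close()
```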
NASA Astrophysics Data System (ADS)
Busonero, D.; Gai, M.
The goals of 21st century high angular precision experiments rely on the limiting performance associated with the selected instrumental configuration and observational strategy. Both global and narrow-angle micro-arcsec space astrometry require that the instrument contributions to the overall error budget be less than the desired micro-arcsec level precision. Appropriate modelling of the astrometric response is required for optimal definition of the data reduction and calibration algorithms, in order to ensure high sensitivity to the astrophysical source parameters and, in general, high accuracy. We will refer to the framework of SIM-Lite and the Gaia mission, the most challenging space missions of the next decade in the narrow-angle and global astrometry fields, respectively. We will focus our discussion on the Gaia data reduction issues and instrument calibration implications. We describe selected topics in the framework of the Astrometric Instrument Modelling for the Gaia mission, evidencing their role in the data reduction chain, and we give a brief overview of the Astrometric Instrument Model Data Analysis Software System, a Java-based pipeline under development by our team.
Stalmeijer, Renée E; Dolmans, Diana H J M; Wolfhagen, Ineke H A P; Muijtjens, Arno M M; Scherpbier, Albert J J A
2008-01-01
Research indicates that the quality of supervision strongly influences the learning of medical students in clinical practice. Clinical teachers need feedback to improve their supervisory skills. The available instruments either lack a clear theoretical framework or are not suitable for providing feedback to individual teachers. We developed an evaluation instrument based on the 'cognitive apprenticeship model'. The aim was to estimate the content validity of the developed instrument. Item relevance was rated on a five-point scale (1 = highly irrelevant, 5 = highly relevant) by three groups of stakeholders in undergraduate clinical teaching: educationalists (N = 12), doctors (N = 16) and students (N = 12). Additionally, stakeholders commented on content, wording and omission of items. The items were generally rated as very relevant (Mean = 4.3, SD = 0.38, response = 95%) and any differences between the stakeholder groups were small. The results led to elimination of 4 items, rewording of 13 items and addition of 1 item. The cognitive apprenticeship model appears to offer a useful framework for the development of an evaluation instrument aimed at providing feedback to individual clinical teachers on the quality of student supervision. Further studies in larger populations will have to establish the instrument's statistical validity and generalizability.
Helfrich, Christian D; Li, Yu-Fang; Sharp, Nancy D; Sales, Anne E
2009-01-01
Background The Promoting Action on Research Implementation in Health Services, or PARIHS, framework is a theoretical framework widely promoted as a guide to implement evidence-based clinical practices. However, it has as yet no pool of validated measurement instruments that operationalize the constructs defined in the framework. The present article introduces an Organizational Readiness to Change Assessment instrument (ORCA), organized according to the core elements and sub-elements of the PARIHS framework, and reports on initial validation. Methods We conducted scale reliability and factor analyses on cross-sectional, secondary data from three quality improvement projects (n = 80) conducted in the Veterans Health Administration. In each project, identical 77-item ORCA instruments were administered to one or more staff from each facility involved in quality improvement projects. Items were organized into 19 subscales and three primary scales corresponding to the core elements of the PARIHS framework: (1) Strength and extent of evidence for the clinical practice changes represented by the QI program, assessed with four subscales, (2) Quality of the organizational context for the QI program, assessed with six subscales, and (3) Capacity for internal facilitation of the QI program, assessed with nine subscales. Results Cronbach's alpha for scale reliability were 0.74, 0.85 and 0.95 for the evidence, context and facilitation scales, respectively. The evidence scale and its three constituent subscales failed to meet the conventional threshold of 0.80 for reliability, and three individual items were eliminated from evidence subscales following reliability testing. In exploratory factor analysis, three factors were retained. Seven of the nine facilitation subscales loaded onto the first factor; five of the six context subscales loaded onto the second factor; and the three evidence subscales loaded on the third factor. Two subscales failed to load significantly on any factor. One measured resources in general (from the context scale), and one clinical champion role (from the facilitation scale). Conclusion We find general support for the reliability and factor structure of the ORCA. However, there was poor reliability among measures of evidence, and factor analysis results for measures of general resources and clinical champion role did not conform to the PARIHS framework. Additional validation is needed, including criterion validation. PMID:19594942
Helfrich, Christian D; Li, Yu-Fang; Sharp, Nancy D; Sales, Anne E
2009-07-14
The Promoting Action on Research Implementation in Health Services, or PARIHS, framework is a theoretical framework widely promoted as a guide to implement evidence-based clinical practices. However, it has as yet no pool of validated measurement instruments that operationalize the constructs defined in the framework. The present article introduces an Organizational Readiness to Change Assessment instrument (ORCA), organized according to the core elements and sub-elements of the PARIHS framework, and reports on initial validation. We conducted scale reliability and factor analyses on cross-sectional, secondary data from three quality improvement projects (n = 80) conducted in the Veterans Health Administration. In each project, identical 77-item ORCA instruments were administered to one or more staff from each facility involved in quality improvement projects. Items were organized into 19 subscales and three primary scales corresponding to the core elements of the PARIHS framework: (1) Strength and extent of evidence for the clinical practice changes represented by the QI program, assessed with four subscales, (2) Quality of the organizational context for the QI program, assessed with six subscales, and (3) Capacity for internal facilitation of the QI program, assessed with nine subscales. Cronbach's alpha for scale reliability were 0.74, 0.85 and 0.95 for the evidence, context and facilitation scales, respectively. The evidence scale and its three constituent subscales failed to meet the conventional threshold of 0.80 for reliability, and three individual items were eliminated from evidence subscales following reliability testing. In exploratory factor analysis, three factors were retained. Seven of the nine facilitation subscales loaded onto the first factor; five of the six context subscales loaded onto the second factor; and the three evidence subscales loaded on the third factor. Two subscales failed to load significantly on any factor. One measured resources in general (from the context scale), and one clinical champion role (from the facilitation scale). We find general support for the reliability and factor structure of the ORCA. However, there was poor reliability among measures of evidence, and factor analysis results for measures of general resources and clinical champion role did not conform to the PARIHS framework. Additional validation is needed, including criterion validation.
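The scale reliabilities quoted above are Cronbach's alphas. The following is a small self-contained sketch of that computation on a respondents-by-items matrix; the data are synthetic, not the ORCA responses.

```python
# Small sketch of Cronbach's alpha on a respondents-by-items matrix (synthetic data,
# not the ORCA responses): alpha = k/(k-1) * (1 - sum(item variances)/variance(total)).
import numpy as np

def cronbach_alpha(items):
    # items: 2-D array, rows = respondents, columns = items of one subscale
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(80, 1))
responses = latent + 0.6 * rng.normal(size=(80, 5))   # 80 respondents, 5 items
print(round(cronbach_alpha(responses), 2))
```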
Security Verification Techniques Applied to PatchLink COTS Software
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer
2006-01-01
Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
Goal-Directed Behavior and Instrumental Devaluation: A Neural System-Level Computational Model
Mannella, Francesco; Mirolli, Marco; Baldassarre, Gianluca
2016-01-01
Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviors guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on and shows the computational soundness of the hypothesis for which the internal representation of instrumental manipulanda (e.g., levers) activate the representation of rewards (or “action-outcomes”, e.g., foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) three basal ganglia-cortical loops selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and explains the results of several devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system-level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behavior. PMID:27803652
Measures of Cultural Competence in Nurses: An Integrative Review
2013-01-01
Background. There is limited literature available identifying and describing the instruments that measure cultural competence in nursing students and nursing professionals. Design. An integrative review was undertaken to identify the characteristics common to these instruments, examine their psychometric properties, and identify the concepts these instruments are designed to measure. Method. There were eleven instruments identified that measure cultural competence in nursing. Of these eleven instruments, four had been thoroughly tested in either initial development or in subsequent testing, with developers providing extensive details of the testing. Results. The current literature identifies that the instruments to assess cultural competence in nurses and nursing students are self-administered and based on individuals' perceptions. The instruments are commonly utilized to test the effectiveness of educational programs designed to increase cultural competence. Conclusions. The reviewed instruments measure nurses' self-perceptions or self-reported level of cultural competence but offer no objective measure of culturally competent care from a patient's perspective which can be problematic. Comparison of instruments reveals that they are based on a variety of conceptual frameworks and that multiple factors should be considered when deciding which instrument to use. PMID:23818818
NASA Astrophysics Data System (ADS)
Valenziano, L.; Gregorio, A.; Butler, R. C.; Amiaux, J.; Bonoli, C.; Bortoletto, F.; Burigana, C.; Corcione, L.; Ealet, A.; Frailis, M.; Jahnke, K.; Ligori, S.; Maiorano, E.; Morgante, G.; Nicastro, L.; Pasian, F.; Riva, M.; Scaramella, R.; Schiavone, F.; Tavagnacco, D.; Toledo-Moreo, R.; Trifoglio, M.; Zacchei, A.; Zerbi, F. M.; Maciaszek, T.
2012-09-01
Euclid is the future ESA mission, mainly devoted to Cosmology. Like WMAP and Planck, it is a survey mission, to be launched in 2019 and injected in orbit far away from the Earth, for a nominal lifetime of 7 years. Euclid has two instruments on-board, the Visible Imager (VIS) and the Near- Infrared Spectro-Photometer (NISP). The NISP instrument includes cryogenic mechanisms, active thermal control, high-performance Data Processing Unit and requires periodic in-flight calibrations and instrument parameters monitoring. To fully exploit the capability of the NISP, a careful control of systematic effects is required. From previous experiments, we have built the concept of an integrated instrument development and verification approach, where the scientific, instrument and ground-segment expertise have strong interactions from the early phases of the project. In particular, we discuss the strong integration of test and calibration activities with the Ground Segment, starting from early pre-launch verification activities. We want to report here the expertise acquired by the Euclid team in previous missions, only citing the literature for detailed reference, and indicate how it is applied in the Euclid mission framework.
ERIC Educational Resources Information Center
Hilz, Christoph; Ehrenfeld, John R.
1991-01-01
Several policy frameworks for managing hazardous waste import/export are examined with respect to economic issues, environmental sustainability, and administrative feasibility and effectiveness. Several recommendations for improving the present instrument and implementing process are offered. (Author/CW)
Van Dijk-de Vries, Anneke N; Duimel-Peeters, Inge G P; Muris, Jean W; Wesseling, Geertjan J; Beusmans, George H M I; Vrijhoef, Hubertus J M
2016-04-08
Teamwork between healthcare providers is conditional for the delivery of integrated care. This study aimed to assess the usefulness of the conceptual framework Integrated Team Effectiveness Model for developing and testing of the Integrated Team Effectiveness Instrument. Focus groups with healthcare providers in an integrated care setting for people with chronic obstructive pulmonary disease (COPD) were conducted to examine the recognisability of the conceptual framework and to explore critical success factors for collaborative COPD practice out of this framework. The resulting items were transposed into a pilot instrument. This was reviewed by expert opinion and completed 153 times by healthcare providers. The underlying structure and internal consistency of the instrument were verified by factor analysis and Cronbach's alpha. The conceptual framework turned out to be comprehensible for discussing teamwork effectiveness. The pilot instrument measures 25 relevant aspects of teamwork in integrated COPD care. Factor analysis suggested three reliable components: teamwork effectiveness, team processes and team psychosocial traits (Cronbach's alpha between 0.76 and 0.81). The conceptual framework Integrated Team Effectiveness Model is relevant in developing a practical full-spectrum instrument to facilitate discussing teamwork effectiveness. The Integrated Team Effectiveness Instrument provides a well-founded basis to self-evaluate teamwork effectiveness in integrated COPD care by healthcare providers. Recommendations are provided for the improvement of the instrument.
Instrumentation and control systems, equipment location; instrumentation and control building, ...
Instrumentation and control systems, equipment location; instrumentation and control building, instrumentation room, bays and console plan. Specifications No. Eng-04-353-55-72; drawing no. 60-09-12; sheet 110 of 148; file no. 1321/61. Stamped: Record drawing - as constructed. Below stamp: Contract no. 4338, no change. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Control Center, Test Area 1-115, near Altair & Saturn Boulevards, Boron, Kern County, CA
Neutron imaging data processing using the Mantid framework
NASA Astrophysics Data System (ADS)
Pouzols, Federico M.; Draper, Nicholas; Nagella, Sri; Yang, Erica; Sajid, Ahmed; Ross, Derek; Ritchie, Brian; Hill, John; Burca, Genoveva; Minniti, Triestino; Moreton-Smith, Christopher; Kockelmann, Winfried
2016-09-01
Several imaging instruments are currently being constructed at neutron sources around the world. The Mantid software project provides an extensible framework that supports high-performance computing for data manipulation, analysis and visualisation of scientific data. At ISIS, IMAT (Imaging and Materials Science & Engineering) will offer unique time-of-flight neutron imaging techniques which impose several software requirements to control the data reduction and analysis. Here we outline the extensions currently being added to Mantid to provide specific support for neutron imaging requirements.
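One of the basic reduction steps such an imaging framework wraps is open-beam/dark-field normalization of radiographs. The sketch below shows only that arithmetic in plain numpy; it is not the Mantid API, and the frame names are hypothetical.

```python
# The arithmetic behind a basic radiograph normalization step that an imaging
# reduction framework typically wraps (plain numpy here, not the Mantid API):
# corrected transmission = (sample - dark) / (open_beam - dark).
import numpy as np

def normalize_radiograph(sample, open_beam, dark):
    sample, open_beam, dark = (np.asarray(a, dtype=float) for a in (sample, open_beam, dark))
    denom = np.clip(open_beam - dark, 1e-6, None)     # avoid division by zero
    return np.clip((sample - dark) / denom, 0.0, None)

# Usage with hypothetical detector frames of identical shape:
# transmission = normalize_radiograph(sample_frame, open_beam_frame, dark_frame)
```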
Software framework for automatic learning of telescope operation
NASA Astrophysics Data System (ADS)
Rodríguez, Jose A.; Molgó, Jordi; Guerra, Dailos
2016-07-01
The "Gran Telescopio de Canarias" (GTC) is an optical-infrared 10-meter segmented mirror telescope at the ORM observatory in Canary Islands (Spain). The GTC Control System (GCS) is a distributed object and component oriented system based on RT-CORBA and it is responsible for the operation of the telescope, including its instrumentation. The current development state of GCS is mature and fully operational. On the one hand telescope users as PI's implement the sequences of observing modes of future scientific instruments that will be installed in the telescope and operators, in turn, design their own sequences for maintenance. On the other hand engineers develop new components that provide new functionality required by the system. This great work effort is possible to minimize so that costs are reduced, especially if one considers that software maintenance is the most expensive phase of the software life cycle. Could we design a system that allows the progressive assimilation of sequences of operation and maintenance of the telescope, through an automatic self-programming system, so that it can evolve from one Component oriented organization to a Service oriented organization? One possible way to achieve this is to use mechanisms of learning and knowledge consolidation to reduce to the minimum expression the effort to transform the specifications of the different telescope users to the operational deployments. This article proposes a framework for solving this problem based on the combination of the following tools: data mining, self-Adaptive software, code generation, refactoring based on metrics, Hierarchical Agglomerative Clustering and Service Oriented Architectures.
Marchand, Alain; Haines, Victor Y; Dextras-Gauthier, Julie
2013-05-04
This study advances a measurement approach for the study of organizational culture in population-based occupational health research, and tests how different organizational culture types are associated with psychological distress, depression, emotional exhaustion, and well-being. Data were collected over a sample of 1,164 employees nested in 30 workplaces. Employees completed the 26-item OCP instrument. Psychological distress was measured with the General Health Questionnaire (12-item); depression with the Beck Depression Inventory (21-item); and emotional exhaustion with five items from the Maslach Burnout Inventory general survey. Exploratory factor analysis evaluated the dimensionality of the OCP scale. Multilevel regression models estimated workplace-level variations, and the contribution of organizational culture factors to mental health and well-being after controlling for gender, age, and living with a partner. Exploratory factor analysis of OCP items revealed four factors explaining about 75% of the variance, and supported the structure of the Competing Values Framework. Factors were labeled Group, Hierarchical, Rational and Developmental. Cronbach's alphas were high (0.82-0.89). Multilevel regression analysis suggested that the four culture types varied significantly between workplaces, and correlated with mental health and well-being outcomes. The Group culture type best distinguished between workplaces and had the strongest associations with the outcomes. This study provides strong support for the use of the OCP scale for measuring organizational culture in population-based occupational health research in a way that is consistent with the Competing Values Framework. The Group organizational culture needs to be considered as a relevant factor in occupational health studies.
Instrumentation issues in implementation science.
Martinez, Ruben G; Lewis, Cara C; Weiner, Bryan J
2014-09-04
Like many new fields, implementation science has become vulnerable to instrumentation issues that potentially threaten the strength of the developing knowledge base. For instance, many implementation studies report findings based on instruments that do not have established psychometric properties. This article aims to review six pressing instrumentation issues, discuss the impact of these issues on the field, and provide practical recommendations. This debate centers on the impact of the following instrumentation issues: use of frameworks, theories, and models; role of psychometric properties; use of 'home-grown' and adapted instruments; choosing the most appropriate evaluation method and approach; practicality; and need for decision-making tools. Practical recommendations include: use of consensus definitions for key implementation constructs; reporting standards (e.g., regarding psychometrics, instrument adaptation); when to use multiple forms of observation and mixed methods; and accessing instrument repositories and decision aid tools. This debate provides an overview of six key instrumentation issues and offers several courses of action to limit the impact of these issues on the field. With careful attention to these issues, the field of implementation science can potentially move forward at the rapid pace that is respectfully demanded by community stakeholders.
Hesselman, Marlies; Toebes, Brigit
2017-07-15
This Commentary forms a response to Nikogosian's and Kickbusch's forward-looking perspective about the legal strength of international health instruments. Building on their arguments, in this commentary we consider what we can learn from the Framework Convention on Tobacco Control (FCTC) for the adoption of new legal international health instruments. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Jefford, Elaine; Jomeen, Julie; Martin, Colin R
2016-04-28
The ability to act on and justify clinical decisions as autonomous accountable midwifery practitioners, is encompassed within many international regulatory frameworks, yet decision-making within midwifery is poorly defined. Decision-making theories from medicine and nursing may have something to offer, but fail to take into consideration midwifery context and philosophy and the decisional autonomy of women. Using an underpinning qualitative methodology, a decision-making framework was developed, which identified Good Clinical Reasoning and Good Midwifery Practice as two conditions necessary to facilitate optimal midwifery decision-making during 2nd stage labour. This study aims to confirm the robustness of the framework and describe the development of Enhancing Decision-making Assessment in Midwifery (EDAM) as a measurement tool through testing of its factor structure, validity and reliability. A cross-sectional design for instrument development and a 2 (country; Australia/UK) x 2 (Decision-making; optimal/sub-optimal) between-subjects design for instrument evaluation using exploratory and confirmatory factor analysis, internal consistency and known-groups validity. Two 'expert' maternity panels, based in Australia and the UK, comprising of 42 participants assessed 16 midwifery real care episode vignettes using the empirically derived 26 item framework. Each item was answered on a 5 point likert scale based on the level of agreement to which the participant felt each item was present in each of the vignettes. Participants were then asked to rate the overall decision-making (optimal/sub-optimal). Post factor analysis the framework was reduced to a 19 item EDAM measure, and confirmed as two distinct scales of 'Clinical Reasoning' (CR) and 'Midwifery Practice' (MP). The CR scale comprised of two subscales; 'the clinical reasoning process' and 'integration and intervention'. The MP scale also comprised two subscales; women's relationship with the midwife' and 'general midwifery practice'. EDAM would generally appear to be a robust, valid and reliable psychometric instrument for measuring midwifery decision-making, which performs consistently across differing international contexts. The 'women's relationship with midwife' subscale marginally failed to meet the threshold for determining good instrument reliability, which may be due to its brevity. Further research using larger samples and in a wider international context to confirm the veracity of the instrument's measurement properties and its wider global utility, would be advantageous.
DOT National Transportation Integrated Search
2016-04-01
In this study, we developed an adaptive signal control (ASC) framework for connected vehicles (CVs) using an agent-based modeling technique. The proposed framework consists of two types of agents: (1) vehicle agents (VAs); and (2) signal controller agen...
Christe, Blaise; Burkhard, Pierre R; Pegna, Alan J; Mayer, Eugene; Hauert, Claude-Alain
2007-01-01
In this study, we developed a digitizing tablet-based instrument for the clinical assessment of human voluntary movements targeting motor processes of planning, programming and execution. The tool was used to investigate an adaptation of Fitts' reciprocal tapping task [10], comprising four conditions, each of them modulated by three indices of difficulty related to the amplitude of movement required. Temporal, spatial and sequential constraints underlying the various conditions allowed the intricate motor processes to be dissociated. Data obtained from a group of elderly healthy subjects (N=50) were in agreement with the literature on motor control, in the temporal and spatial domains. Speed constraints generated gains in the temporal domain and costs in the spatial one, while spatial constraints generated gain in the spatial domain and costs in the temporal one; finally, sequential constraints revealed the integrative nature of the cognitive operations involved in motor production. This versatile instrument proved capable of providing quantitative, accurate and sensitive measures of the various processes sustaining voluntary movement in healthy subjects. Altogether, analyses performed in this study generated a theoretical framework and reference data which could be used in the future for the clinical assessment of patients with various movement disorders, in particular Parkinson's disease.
Applying workability in the Australian residential aged care context.
Brooke, Elizabeth; Goodall, Joanne; Handrus, Maxwell; Mawren, Daveena
2013-06-01
The study is based on an innovative demonstration project which trialled the implementation of the Finnish 'workability' framework and research measures. It aimed, firstly, to test the applicability of the Workability Index (WAI) to the Australian residential aged care workforce, focusing on personal care assistants (PCAs), and secondly, to assess the effectiveness of actions aimed at improving workability. The facility manager implemented multidimensional 'actions' according to the workability framework. The Workability Survey (WAS) and WAI and intervention instruments were administered (n = 64). Completed responses to 'pre' and 'post' instruments formed matched pairs (n = 15). WAI scores increased significantly, by 3 points on average, after all 'actions' were implemented. The only significant 'action' was increasing the number of PCAs in high care. Workability provides a useful research workforce development instrument measuring interactions between aged care workers and organisational demands and the outcomes of 'actions'. © 2013 The Authors. Australasian Journal on Ageing © 2013 ACOTA.
An autonomous observation and control system based on EPICS and RTS2 for Antarctic telescopes
NASA Astrophysics Data System (ADS)
Zhang, Guang-yu; Wang, Jian; Tang, Peng-yi; Jia, Ming-hao; Chen, Jie; Dong, Shu-cheng; Jiang, Fengxin; Wu, Wen-qing; Liu, Jia-jing; Zhang, Hong-fei
2016-01-01
For unattended telescopes in Antarctica, remote operation and autonomous observation and control are essential. An autonomous observation and control system with remote operation, based on EPICS (Experimental Physics and Industrial Control System) and RTS2 (Remote Telescope System, 2nd Version), is introduced in this paper. EPICS is a set of open-source software tools, libraries and applications developed collaboratively and used worldwide to create distributed soft real-time control systems for scientific instruments, while RTS2 is an open-source environment for the control of a fully autonomous observatory. Taking advantage of EPICS and RTS2 respectively, a combined integrated software framework for autonomous observation and control is established that uses RTS2 for the astronomical observation functions and EPICS for the device control of the telescope. A command and status interface between EPICS and RTS2 is designed so that EPICS IOC (Input/Output Controller) components integrate into RTS2 directly. To meet the specifications and requirements of the control system for a telescope in Antarctica, core components named Executor and Auto-focus for autonomous observation are designed and implemented, with a remote-operation user interface based on a browser-server model. The whole system, including the telescope, was tested at Lijiang Observatory in Yunnan Province in practical observations covering the full autonomous observation and control chain, including telescope control, camera control, dome control and weather information acquisition, with both local and remote operation.
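The command/status bridge between an RTS2-style executor and EPICS described above can be pictured with a short sketch. The following Python fragment is purely illustrative and is not the authors' implementation: it assumes the pyepics client library and uses invented PV names to show how an executor-style component might write commands and poll status on an EPICS IOC.

```python
# Illustrative sketch (not the authors' code): bridging an RTS2-style executor
# to EPICS process variables with the pyepics client library.
# The PV names below are hypothetical placeholders.
from epics import PV

class TelescopeBridge:
    """Minimal command/status interface between an observation executor and EPICS IOCs."""

    def __init__(self):
        self.ra_cmd = PV("TEL:RA:SETPOINT")     # command PV (hypothetical name)
        self.dec_cmd = PV("TEL:DEC:SETPOINT")   # command PV (hypothetical name)
        self.state = PV("TEL:STATE")            # status PV (hypothetical name)

    def slew(self, ra_deg: float, dec_deg: float) -> None:
        """Send a pointing command to the telescope IOC."""
        self.ra_cmd.put(ra_deg)
        self.dec_cmd.put(dec_deg)

    def is_tracking(self) -> bool:
        """Poll the IOC status record."""
        return self.state.get(as_string=True) == "TRACKING"

if __name__ == "__main__":
    bridge = TelescopeBridge()
    bridge.slew(83.63, 22.01)  # e.g. point near the Crab Nebula
    print("tracking:", bridge.is_tracking())
```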
NASA Astrophysics Data System (ADS)
Jirka, Simon; del Rio, Joaquin; Toma, Daniel; Martinez, Enoc; Delory, Eric; Pearlman, Jay; Rieke, Matthes; Stasch, Christoph
2017-04-01
With the rapidly evolving technology for building Web-based (spatial) information infrastructures and Sensor Webs, there are new opportunities to improve how ocean data are collected and managed. A central element in this development is the suite of Sensor Web Enablement (SWE) standards specified by the Open Geospatial Consortium (OGC). This framework of standards comprises, on the one hand, data models and formats for measurement data (ISO/OGC Observations and Measurements, O&M) and metadata describing measurement processes and sensors (OGC Sensor Model Language, SensorML). On the other hand, the SWE standards comprise (Web service) interface specifications for pull-based access to observation data (OGC Sensor Observation Service, SOS) and for controlling or configuring sensors (OGC Sensor Planning Service, SPS). Within the European INSPIRE framework the SWE standards also play an important role, as the SOS is the recommended download service interface for O&M-encoded observation data sets. In the context of the EU-funded Oceans of Tomorrow initiative, the NeXOS (Next generation, Cost-effective, Compact, Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management) project is developing a new generation of in-situ sensors that make use of the SWE standards to facilitate the data publication process and the integration into Web-based information infrastructures. This includes the development of a dedicated firmware for instruments and sensor platforms (SEISI, Smart Electronic Interface for Sensors and Instruments) maintained by the Universitat Politècnica de Catalunya (UPC). Among other features, SEISI makes use of OGC SWE standards such as OGC PUCK to enable a plug-and-play mechanism for sensors based on SensorML-encoded metadata. Thus, if a new instrument is attached to a SEISI-based platform, the platform automatically configures the connection to the instrument, automatically generates data files compliant with the ISO/OGC Observations and Measurements standard, and initiates the data transmission into the NeXOS Sensor Web infrastructure. Besides these platform-related developments, NeXOS has realised the full path of data transmission from the sensor to the end-user application. The conceptual architecture design is implemented by a series of open-source SWE software packages provided by 52°North. This comprises in particular different SWE server components (i.e. OGC Sensor Observation Service), tools for data visualisation (e.g. the 52°North Helgoland SOS viewer), and an editor for providing SensorML-based metadata (52°North smle). As a result, NeXOS has demonstrated how the SWE standards help to improve marine observation data collection. Within this presentation, we will present the experiences and findings of the NeXOS project and will provide recommendations for future work.
NASA Astrophysics Data System (ADS)
Newton, Alice; Borja, Angel; Solidoro, Cosimo; Grégoire, Marilaure
2015-10-01
The Marine Strategy Framework Directive (MSFD; EC, 2008) is an ambitious European policy instrument that aims to achieve Good Environmental Status (GES) in the 5,720,000 km2 of European seas by 2020, using an Ecosystem Approach. GES is to be assessed using 11 descriptors and up to 56 indicators (European Commission, 2010), and the goal is for clean, healthy and productive seas that are the basis for marine-based development, known as Blue-Growth. The MSFD is one of many policy instruments, such as the Water Framework Directive, the Common Fisheries Policy and the Habitats Directive that, together, should result in "Healthy Oceans and Productive Ecosystems - HOPE". Researchers working together with stakeholders such as the Member States environmental agencies, the European Environmental Agency, and the Regional Sea Conventions, are to provide the scientific knowledge basis for the implementation of the MSFD. This represents both a fascinating challenge and a stimulating opportunity.
Preservice Teachers' Beliefs, Attitudes, and Motivation about Technology Integration
ERIC Educational Resources Information Center
Cullen, Theresa A.; Greene, Barbara A.
2011-01-01
The Theory of Planned Behavior was used as a framework, along with Self-Determination Theory, to examine preservice teachers' motivation to include technology in their future teaching. We modified instruments to measure theoretical constructs to be applied to plans for the use of technology. Measured were: perceived behavioral control, attitudes…
Interpreting in Mental Health Settings: Issues and Concerns.
ERIC Educational Resources Information Center
Vernon, McCay; Miller, Katrina
2001-01-01
This paper examines expectations and stresses placed on sign language interpreters in mental health settings within a framework of demand and control theory. Translations of some specific psychological screening instruments and issues related to the Code of Ethics of the Registry of Interpreters for the Deaf are considered relative to…
Evolutionary game based control for biological systems with applications in drug delivery.
Li, Xiaobo; Lenaghan, Scott C; Zhang, Mingjun
2013-06-07
Control engineering and analysis of biological systems have become increasingly important for systems and synthetic biology. Unfortunately, no widely accepted control framework is currently available for these systems, especially at the cell and molecular levels. This is partially due to the lack of appropriate mathematical models to describe the unique dynamics of biological systems, and the lack of implementation techniques, such as ultra-fast and ultra-small devices and corresponding control algorithms. This paper proposes a control framework for biological systems subject to dynamics that exhibit adaptive behavior under evolutionary pressures. The control framework was formulated based on evolutionary game based modeling, which integrates both the internal dynamics and the population dynamics. In the proposed control framework, the adaptive behavior was characterized as an internal dynamic, and the external environment was regarded as an external control input. The proposed open-interface control framework can be integrated with additional control algorithms for control of biological systems. To demonstrate the effectiveness of the proposed framework, an optimal control strategy was developed and validated for drug delivery using the pathogen Giardia lamblia as a test case. In principle, the proposed control framework can be applied to any biological system exhibiting adaptive behavior under evolutionary pressures. Copyright © 2013 Elsevier Ltd. All rights reserved.
Pluimers, Dorine J; van Vliet, Ellen J; Niezink, Anne Gh; van Mourik, Martijn S; Eddes, Eric H; Wouters, Michel W; Tollenaar, Rob A E M; van Harten, Wim H
2015-04-09
To analyze the organization of multidisciplinary care pathways such as colorectal cancer care, an instrument was developed based on a recently published framework that had earlier been used to analyze (monodisciplinary) specialist cataract care from a lean perspective. The instrument was constructed using semi-structured interviews and direct observation of the colorectal care process based on a Rapid Plant Assessment. Six lean aspects established earlier as having a high impact on process design were investigated: operational focus, autonomous work cell, physical lay-out of resources, multi-skilled team, pull planning and non-value-adding activities. To test the reliability, clarity and face validity of the instrument, a pilot study was performed in eight Dutch hospitals. In the pilot it proved feasible to apply the instrument and generate the intended information. The instrument consisted of 83 quantitative and 24 qualitative items. Examples of results show differences in operational focus, the number of patient visits needed for diagnosis, the number of staff involved in treatment, the implementation of protocols and the utilization of one-stop shops. Identification of waste and non-value-adding activities may need further attention. Based on feedback from the clinicians involved, the face validity was acceptable and the results provided useful feedback and benchmark data. The instrument proved to be reliable and valid for broader implementation in Dutch health care. The limited number of cases made statistical analysis impossible, and further validation studies may shed better light on variation. This paper demonstrates the use of an instrument to analyze organizational characteristics in colorectal cancer care from a lean perspective. Wider use might help to identify best organizational practices for colorectal surgery. In larger series the instrument might be used for in-depth research into the relation between organization and patient outcomes. Although we found no reason to adapt the underlying framework, recommendations were made for further development to enable use in different tumor and treatment modalities and in larger (international) samples that allow for more advanced statistical analysis. Waste from defective care or from wasted human potential will need further elaboration of the instrument.
The Conceptual Framework for the Development of a Mathematics Performance Assessment Instrument.
ERIC Educational Resources Information Center
Lane, Suzanne
1993-01-01
A conceptual framework is presented for the development of the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) Cognitive Assessment Instrument (QCAI) that focuses on the ability of middle-school students to problem solve, reason, and communicate mathematically. The instrument will provide programatic rather than…
van Dam, Joris; Musuku, John; Zühlke, Liesl J; Engel, Mark E; Nestle, Nick; Tadmor, Brigitta; Spector, Jonathan; Mayosi, Bongani M
2015-01-01
Rheumatic heart disease (RHD) remains a major disease burden in low-resource settings globally. Patient registers have long been recognised to be an essential instrument in RHD control and elimination programmes, yet to date rely heavily on paper-based data collection and non-networked data-management systems, which limit their functionality. To assess the feasibility and potential benefits of producing an electronic RHD patient register. We developed an eRegister based on the World Heart Federation's framework for RHD patient registers using CommCare, an open-source, cloud-based software for health programmes that supports the development of customised data capture using mobile devices. The resulting eRegistry application allows for simultaneous data collection and entry by field workers using mobile devices, and by providers using computer terminals in clinics and hospitals. Data are extracted from CommCare and are securely uploaded into a cloud-based database that matches the criteria established by the WHF framework. The application can easily be tailored to local needs by modifying existing variables or adding new ones. Compared with traditional paper-based data-collection systems, the eRegister reduces the risk of data error, synchronises in real-time, improves clinical operations and supports management of field team operations. The user-friendly eRegister is a low-cost, mobile, compatible platform for RHD treatment and prevention programmes based on materials sanctioned by the World Heart Federation. Readily adaptable to local needs, this paperless RHD patient register program presents many practical benefits.
Towards adaptive, streaming analysis of x-ray tomography data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Mathew; Kleese van Dam, Kerstin; Marshall, Matthew J.
2015-03-04
Temporal and spatial resolution of chemical imaging methodologies such as x-ray tomography are rapidly increasing, leading to more complex experimental procedures and fast-growing data volumes. Automated analysis pipelines and big data analytics are becoming essential to effectively evaluate the results of such experiments. Offering those data techniques in an adaptive, streaming environment can further substantially improve the scientific discovery process, by enabling experimental control and steering based on the evaluation of emerging phenomena as they are observed by the experiment. Pacific Northwest National Laboratory (PNNL)'s Chemical Imaging Initiative (CII - http://imaging.pnnl.gov/ ) has worked since 2011 towards developing a framework that allows users to rapidly compose and customize high-throughput experimental analysis pipelines for multiple instrument types. The framework, named the 'Rapid Experimental Analysis' (REXAN) Framework [1], is based on the idea of reusable component libraries and utilizes the PNNL-developed collaborative data management and analysis environment 'Velo' to provide a user-friendly analysis and data management environment for experimental facilities. This article will discuss the capabilities established for x-ray tomography, discuss lessons learned, and provide an overview of our more recent work in the Analysis in Motion Initiative (AIM - http://aim.pnnl.gov/ ) at PNNL to provide REXAN capabilities in a streaming environment.
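As a rough illustration of the pipeline-composition idea behind a framework like the one described above (this is not its actual API), the following Python sketch chains reusable generator stages so that simulated frames are processed as they stream in; all stage names and data are invented.

```python
# Illustrative sketch of a composable streaming analysis pipeline built from
# reusable components. The stage names are hypothetical; this is not the REXAN API.
import random
from typing import Iterable, Iterator

def frames(n: int) -> Iterator[list[float]]:
    """Simulate a stream of tomography frames (flattened pixel lists)."""
    for _ in range(n):
        yield [random.random() for _ in range(16)]

def dark_subtract(stream: Iterable[list[float]], dark: float = 0.1) -> Iterator[list[float]]:
    for frame in stream:
        yield [max(p - dark, 0.0) for p in frame]

def mean_intensity(stream: Iterable[list[float]]) -> Iterator[float]:
    for frame in stream:
        yield sum(frame) / len(frame)

if __name__ == "__main__":
    # Stages are composed by chaining generators, so each frame is processed
    # as it arrives rather than after the experiment finishes.
    pipeline = mean_intensity(dark_subtract(frames(5)))
    for i, value in enumerate(pipeline):
        print(f"frame {i}: mean intensity {value:.3f}")
```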
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts,
1980-06-01
A theoretical framework for an experimental program is described. The theory of one-dimensional wave propagation is used to show how data from instrumented long rods and targets may be fitted together to give a... the theoretical framework. In the final section, the results to date are discussed.
Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A
2015-01-08
Identification of psychometrically strong instruments for the field of implementation science is a high priority underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project's objectives address these limitations by identifying and applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopt a systematic search process (using standard search strings), and engage an international team of experts to assess the full range of psychometric criteria (reliability, construct and criterion validity). Although this work focuses on implementation of psychosocial interventions in mental health and health-care settings, the methodology and results will likely be useful across a broad spectrum of settings. This effort has culminated in a centralized online open-access repository of instruments depicting graphical head-to-head comparisons of their psychometric properties. This article describes the methodology and preliminary outcomes. The seven stages of the review, synthesis, and evaluation methodology include (1) setting the scope for the review, (2) identifying frameworks to organize and complete the review, (3) generating a search protocol for the literature review of constructs, (4) literature review of specific instruments, (5) development of an evidence-based assessment rating criteria, (6) data extraction and rating instrument quality by a task force of implementation experts to inform knowledge synthesis, and (7) the creation of a website repository. To date, this multi-faceted and collaborative search and synthesis methodology has identified over 420 instruments related to 34 constructs (total 48 including subconstructs) that are relevant to implementation science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.
Van Dijk-de Vries, Anneke N.; Duimel-Peeters, Inge G. P.; Muris, Jean W.; Wesseling, Geertjan J.; Beusmans, George H. M. I.
2016-01-01
Introduction: Teamwork between healthcare providers is conditional for the delivery of integrated care. This study aimed to assess the usefulness of the conceptual framework Integrated Team Effectiveness Model for developing and testing of the Integrated Team Effectiveness Instrument. Theory and methods: Focus groups with healthcare providers in an integrated care setting for people with chronic obstructive pulmonary disease (COPD) were conducted to examine the recognisability of the conceptual framework and to explore critical success factors for collaborative COPD practice out of this framework. The resulting items were transposed into a pilot instrument. This was reviewed by expert opinion and completed 153 times by healthcare providers. The underlying structure and internal consistency of the instrument were verified by factor analysis and Cronbach’s alpha. Results: The conceptual framework turned out to be comprehensible for discussing teamwork effectiveness. The pilot instrument measures 25 relevant aspects of teamwork in integrated COPD care. Factor analysis suggested three reliable components: teamwork effectiveness, team processes and team psychosocial traits (Cronbach’s alpha between 0.76 and 0.81). Conclusions and discussion: The conceptual framework Integrated Team Effectiveness Model is relevant in developing a practical full-spectrum instrument to facilitate discussing teamwork effectiveness. The Integrated Team Effectiveness Instrument provides a well-founded basis to self-evaluate teamwork effectiveness in integrated COPD care by healthcare providers. Recommendations are provided for the improvement of the instrument. PMID:27616953
NASA Astrophysics Data System (ADS)
Israel, Maya; Wherfel, Quentin M.; Shehab, Saadeddine; Ramos, Evan A.; Metzger, Adam; Reese, George C.
2016-07-01
This paper describes the development, validation, and uses of the Collaborative Computing Observation Instrument (C-COI), a web-based analysis instrument that classifies individual and/or collaborative behaviors of students during computing problem-solving (e.g. coding, programming). The C-COI analyzes data gathered through video and audio screen recording software that captures students' computer screens as they program, and their conversations with their peers or adults. The instrument allows researchers to organize and quantify these data to track behavioral patterns that could be further analyzed for deeper understanding of persistence and/or collaborative interactions. The article provides a rationale for the C-COI including the development of a theoretical framework for measuring collaborative interactions in computer-mediated environments. This theoretical framework relied on the computer-supported collaborative learning literature related to adaptive help seeking, the joint problem-solving space in which collaborative computing occurs, and conversations related to outcomes and products of computational activities. Instrument development and validation also included ongoing advisory board feedback from experts in computer science, collaborative learning, and K-12 computing as well as classroom observations to test out the constructs in the C-COI. These processes resulted in an instrument with rigorous validation procedures and a high inter-rater reliability.
On DESTINY Science Instrument Electrical and Electronics Subsystem Framework
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Benford, Dominic J.; Lauer, Tod R.
2009-01-01
Future space missions are going to require large focal planes with many sensing arrays and hundreds of millions of pixels, all read out at high data rates. This will place unique demands on the electrical and electronics (EE) subsystem design, and it will be critically important to have high technology readiness level (TRL) EE concepts ready to support such missions. One such mission is the Joint Dark Energy Mission (JDEM), charged with making precise measurements of the expansion rate of the universe to reveal vital clues about the nature of dark energy - a hypothetical form of energy that permeates all of space and tends to increase the rate of the expansion. One of three JDEM concept studies - the Dark Energy Space Telescope (DESTINY) - was conducted in 2008 at NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland. This paper presents the EE subsystem framework, which evolved from the DESTINY science instrument study. It describes the main challenges and implementation concepts related to the design of an EE subsystem featuring multiple focal planes populated with dozens of large arrays and millions of pixels. The focal planes are passively cooled to cryogenic temperatures (below 140 K). The sensor mosaic is controlled by a large number of Readout Integrated Circuits and Application Specific Integrated Circuits - the ROICs/ASICs - in near proximity to their sensor focal planes. The ASICs, in turn, are serviced by a set of "warm" EE subsystem boxes performing Field Programmable Gate Array (FPGA) based digital signal processing (DSP) computations of complex algorithms, such as the sampling-up-the-ramp (SUTR) algorithm, over large volumes of fast data streams. The SUTR boxes are supported by the Instrument Control/Command and Data Handling box (ICDH Primary and Backup boxes) for lossless data compression, command and low-volume telemetry handling, power conversion, and communications with the spacecraft. The paper outlines how the JDEM DESTINY concept instrument EE subsystem can be built now, a design which is generally applicable to a wide variety of missions using large focal planes with large mosaics of sensors.
How discriminating are discriminative instruments?
Hankins, Matthew
2008-05-27
The McMaster framework introduced by Kirshner & Guyatt is the dominant paradigm for the development of measures of health status and health-related quality of life (HRQL). The framework defines the functions of such instruments as evaluative, predictive or discriminative. Evaluative instruments are required to be sensitive to change (responsiveness), but there is no corresponding index of the degree to which discriminative instruments are sensitive to cross-sectional differences. This paper argues that indices of validity and reliability are not sufficient to demonstrate that a discriminative instrument performs its function of discriminating between individuals, and that the McMaster framework would be augmented by the addition of a separate index of discrimination. The coefficient proposed by Ferguson (Delta) is easily adapted to HRQL instruments and is a direct, non-parametric index of the degree to which an instrument distinguishes between individuals. While Delta should prove useful in the development and evaluation of discriminative instruments, further research is required to elucidate the relationship between the measurement properties of discrimination, reliability and responsiveness.
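For readers unfamiliar with Ferguson's coefficient, the following sketch computes the usual form of Delta, δ = k(N² − Σfᵢ²)/(N²(k − 1)), for integer total scores; the example data are invented and the snippet is only an illustration of the index discussed above, not code from the paper.

```python
# A minimal sketch of Ferguson's delta, the discrimination coefficient discussed
# above: delta = k * (N**2 - sum(f_i**2)) / (N**2 * (k - 1)), where N is the
# number of respondents, k the number of possible total scores, and f_i the
# frequency of each score. Assumes integer total scores from 0 to max_score.
from collections import Counter

def fergusons_delta(scores: list[int], max_score: int) -> float:
    n = len(scores)
    k = max_score + 1                      # number of possible score values
    freq = Counter(scores)
    sum_sq = sum(f * f for f in freq.values())
    return k * (n * n - sum_sq) / (n * n * (k - 1))

if __name__ == "__main__":
    # A perfectly uniform distribution of scores gives delta = 1 (maximal discrimination).
    print(fergusons_delta([0, 1, 2, 3, 4], max_score=4))   # 1.0
    # Everyone scoring the same gives delta = 0 (no discrimination).
    print(fergusons_delta([2, 2, 2, 2, 2], max_score=4))   # 0.0
```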
ERIC Educational Resources Information Center
Todd, Amber; Romine, William L.; Cook Whitt, Katahdin
2017-01-01
We describe the development, validation, and use of the "Learning Progression-Based Assessment of Modern Genetics" (LPA-MG) in a high school biology context. Items were constructed based on a current learning progression framework for genetics (Shea & Duncan, 2013; Todd & Kenyon, 2015). The 34-item instrument, which was tied to…
Crossley, James G M
2015-01-01
Nurse appraisal is well established in the Western world because of its obvious educational advantages. Appraisal works best with many sources of information on performance. Multisource feedback (MSF) is widely used in business and in other clinical disciplines to provide such information. It has also been incorporated into nursing appraisals, but, so far, none of the instruments in use for nurses has been validated. We set out to develop an instrument aligned with the UK Knowledge and Skills Framework (KSF) and to evaluate its reliability and feasibility across a wide hospital-based nursing population. The KSF framework provided a content template. Focus groups developed an instrument based on consensus. The instrument was administered to all the nursing staff in 2 large NHS hospitals forming a single trust in London, England. We used generalizability analysis to estimate reliability, response rates and unstructured interviews to evaluate feasibility, and factor structure and correlation studies to evaluate validity. On a voluntary basis the response rate was moderate (60%). A failure to engage with information technology and employment-related concerns were commonly cited as reasons for not responding. In this population, 11 responses provided a profile with sufficient reliability to inform appraisal (G = 0.7). Performance on the instrument was closely and significantly correlated with performance on a KSF questionnaire. This is the first contemporary psychometric evaluation of an MSF instrument for nurses. MSF appears to be as valid and reliable as an assessment method to inform appraisal in nurses as it is in other health professional groups. © 2015 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education.
Economic rationality and health and lifestyle choices for people with diabetes.
Baker, Rachel Mairi
2006-11-01
Economic rationality is traditionally represented by goal-oriented, maximising behaviour, or 'instrumental rationality'. Such a consequentialist, instrumental model of choice is often implicit in a biomedical approach to health promotion and education. The research reported here assesses the relevance of a broader conceptual framework of rationality, which includes 'procedural' and 'expressive' rationality as complements to an instrumental model of rationality, in a health context. Q methodology was used to derive 'factors' underlying health and lifestyle choices, based on a factor analysis of the results of a card sorting procedure undertaken by 27 adult respondents with type 2 diabetes in Newcastle upon Tyne, UK. These factors were then compared with the rationality framework and the appropriateness of an extended model of economic rationality as a means of better understanding health and lifestyle choices was assessed. Taking a wider rational choice perspective, choices which are rendered irrational within a narrow-biomedical or strictly instrumental model, can be understood in terms of a coherent rationale, grounded in the accounts of respondents. The implications of these findings are discussed in terms of rational choice theory and diabetes management and research.
Price responsiveness of demand for cigarettes: does rationality matter?
Laporte, Audrey
2006-01-01
Meta-analysis is applied to aggregate-level studies that model the demand for cigarettes using static, myopic, or rational addiction frameworks in an attempt to synthesize key findings in the literature and to identify determinants of the variation in reported price elasticity estimates across studies. The results suggest that the rational addiction framework produces statistically similar estimates to the static framework but that studies that use the myopic framework tend to report more elastic price effects. Studies that applied panel data techniques or controlled for cross-border smuggling reported more elastic price elasticity estimates, whereas the use of instrumental variable techniques and time trends or time dummy variables produced less elastic estimates. The finding that myopic models produce different estimates than either of the other two model frameworks underscores that careful attention must be given to time series properties of the data.
Ali, Mehri; Saeed, Mazloomy Mahmoodabad Seyed; Ali, Morowatisharifabad Mohammad; Haidar, Nadrian
2011-09-01
This paper reports on predictors of helmet use behaviour, using variables based on the theory of planned behaviour model, among employed motorcycle riders in Yazd, Iran, in an attempt to identify influential factors that may be addressed through intervention efforts. In 2007, a cluster random sample of 130 employed motorcycle riders in the city of Yazd in central Iran participated in the study. Appropriate instruments were designed to measure the variables of interest (attitude, subjective norms, perceived behavioural control and intention, along with helmet use behaviour). The reliability and validity of the instruments were examined and approved. The statistical analysis of the data included descriptive statistics, bivariate correlations, and multiple regression. Based on the results, 56 of the respondents (43.1%) had a history of a motorcycle accident. Of these motorcycle riders, only 10.7% were wearing their helmet at the time of their accident. Intention and perceived behavioural control showed a significant relationship with helmet use behaviour, and perceived behavioural control was the strongest predictor of helmet use intention, followed by subjective norms and attitude. It was found that the helmet use rate among motorcycle riders was very low. The findings of the present study provide preliminary support for the TPB model as an effective framework for examining helmet use in motorcycle riders. Understanding motorcycle riders' thoughts, feelings and beliefs about helmet use behaviour can assist intervention specialists to develop and implement effective programs to promote helmet use among motorcycle riders. Copyright © 2010 Elsevier Ltd. All rights reserved.
Software structure for Vega/Chara instrument
NASA Astrophysics Data System (ADS)
Clausse, J.-M.
2008-07-01
VEGA (Visible spEctroGraph and polArimeter) is one of the focal instruments of the CHARA array at Mount Wilson near Los Angeles. Its control system is based on techniques developed for the GI2T interferometer (Grand Interferometre a 2 Telescopes) and for the SIRIUS fibered hyper-telescope testbed at OCA (Observatoire de la Cote d'Azur). This article describes the software and electronics architecture of the instrument. It is based on a local network architecture and also uses Virtual Private Network connections. The server part runs on Windows XP (VC++), while the control software runs on Linux (C, GTK). For the control of the science detector and the fringe-tracking systems, distributed APIs use real-time techniques. The control software gathers all the necessary information about the instrument and allows automatic management of the instrument through an original task scheduler. This architecture is intended to allow the instrument to be driven from remote sites, such as our institute in the south of France.
Using XML and Java for Astronomical Instrument Control
NASA Astrophysics Data System (ADS)
Koons, L.; Ames, T.; Evans, R.; Warsaw, C.; Sall, K.
1999-12-01
Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests. Instrument description is too tightly coupled with details of implementation. NASA/Goddard Space Flight Center and AppNet, Inc. are developing a very general and highly extensible framework that applies to virtually any kind of instrument that can be controlled by a computer (e.g., telescopes, microscopes and printers). A key aspect of the object-oriented architecture, implemented in Java, involves software that is driven by an instrument description. The Astronomical Instrument Markup Language (AIML) is a domain-specific implementation of the more generalized Instrument Markup Language (IML). The software architecture combines the platform-independent processing capabilities of Java with the vendor-independent data description syntax of Extensible Markup Language (XML), a human-readable and machine-understandable way to describe structured data. IML is used to describe command sets (including parameters, datatypes, and constraints) and their associated formats, telemetry, and communication mechanisms. The software uses this description to present graphical user interfaces to control and monitor the instrument. Recent efforts have extended to command procedures (scripting) and representation of data pipeline inputs, outputs, and connections. Near future efforts are likely to include an XML description of data visualizations, as well as the potential use of XSL (Extensible Stylesheet Language) to permit astronomers to customize the user interface on several levels: per user, instrument, subsystem, or observatory-wide. Our initial prototyping effort was targeted for HAWC (High-resolution Airborne Wideband Camera), a first-light instrument of SOFIA (the Stratospheric Observatory for Infrared Astronomy). A production-level application of this technology is for one of the three candidate detectors of SPIRE (Spectral and Photometric Imaging REceiver), a focal plane instrument proposed for the European Space Agency's Far Infrared Space Telescope. The detectors are being developed by the Infrared Astrophysics Branch of NASA/GSFC.
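A toy sketch of the description-driven idea (not the actual AIML schema or the IRC code base): an XML instrument description is parsed with Python's standard library, parameters are validated against it, and a command string is generated. The element names and command format below are invented for the example.

```python
# Illustrative sketch of driving command generation from an XML instrument
# description, in the spirit of IML/AIML. The element names and the command
# syntax are invented for this example and are not the actual AIML schema.
import xml.etree.ElementTree as ET

DESCRIPTION = """
<instrument name="DemoCamera">
  <command name="SET_EXPOSURE">
    <parameter name="seconds" type="float" min="0.1" max="300"/>
  </command>
</instrument>
"""

def build_command(xml_text: str, command: str, **params) -> str:
    """Validate parameters against the description and format a command string."""
    root = ET.fromstring(xml_text)
    cmd = root.find(f"./command[@name='{command}']")
    if cmd is None:
        raise ValueError(f"unknown command {command}")
    parts = [command]
    for p in cmd.findall("parameter"):
        name = p.get("name")
        value = float(params[name])
        if not (float(p.get("min")) <= value <= float(p.get("max"))):
            raise ValueError(f"{name}={value} outside allowed range")
        parts.append(f"{name}={value}")
    return " ".join(parts)

if __name__ == "__main__":
    print(build_command(DESCRIPTION, "SET_EXPOSURE", seconds=12.5))
    # -> "SET_EXPOSURE seconds=12.5"
```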
NASA Astrophysics Data System (ADS)
Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur
In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.
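As a simple illustration of the tracking idea (not the authors' RFID middleware), the sketch below records each hypothetical RFID read event as a timestamped entry in an item's production pedigree; the data structures and station names are invented.

```python
# Illustrative sketch: each RFID read event appends a timestamped station record
# to the item's production pedigree, giving traceability across the process flow.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class PedigreeRecord:
    station: str
    timestamp: str
    status: str

@dataclass
class TraceStore:
    pedigrees: dict[str, list[PedigreeRecord]] = field(default_factory=dict)

    def on_rfid_read(self, tag_id: str, station: str, status: str = "OK") -> None:
        record = PedigreeRecord(station, datetime.now().isoformat(timespec="seconds"), status)
        self.pedigrees.setdefault(tag_id, []).append(record)

    def pedigree(self, tag_id: str) -> list[PedigreeRecord]:
        return self.pedigrees.get(tag_id, [])

if __name__ == "__main__":
    store = TraceStore()
    store.on_rfid_read("FRAME-0001", "welding")
    store.on_rfid_read("FRAME-0001", "painting")
    for rec in store.pedigree("FRAME-0001"):
        print(rec)
```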
Estimation and Identification of the Complier Average Causal Effect Parameter in Education RCTs
ERIC Educational Resources Information Center
Schochet, Peter Z.; Chiang, Hanley S.
2011-01-01
In randomized control trials (RCTs) in the education field, the complier average causal effect (CACE) parameter is often of policy interest, because it pertains to intervention effects for students who receive a meaningful dose of treatment services. This article uses a causal inference and instrumental variables framework to examine the…
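For context, the CACE parameter is commonly estimated with the instrumental-variables (Wald) ratio of the intent-to-treat effect on the outcome to the intent-to-treat effect on treatment receipt. The following worked example uses invented data and is not drawn from the article.

```python
# A worked example of the standard instrumental-variables (Wald) estimator for
# the complier average causal effect:
# CACE = (mean outcome, assigned treatment - mean outcome, assigned control)
#        / (take-up rate, treatment - take-up rate, control).
def cace(y_treat, y_ctrl, took_treat, took_ctrl):
    itt_outcome = sum(y_treat) / len(y_treat) - sum(y_ctrl) / len(y_ctrl)
    itt_takeup = sum(took_treat) / len(took_treat) - sum(took_ctrl) / len(took_ctrl)
    return itt_outcome / itt_takeup

if __name__ == "__main__":
    # Hypothetical test scores and service take-up indicators.
    y_treatment_group = [72, 75, 80, 78, 74, 69]
    y_control_group = [70, 71, 73, 72, 69, 68]
    d_treatment_group = [1, 1, 0, 1, 1, 0]   # 2/3 actually received services
    d_control_group = [0, 0, 0, 0, 0, 0]     # no crossover in the control group
    print(round(cace(y_treatment_group, y_control_group,
                     d_treatment_group, d_control_group), 2))   # ITT of ~4.17 / 0.67 = 6.25
```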
New type of measuring and intelligent instrument for curing tobacco
NASA Astrophysics Data System (ADS)
Yi, Chui-Jie; Huang, Xieqing; Chen, Tianning; Xia, Hong
1993-09-01
A new type of intelligent measuring instrument for tobacco curing is presented in this paper. Based on fuzzy linguistic control principles, the instrument is used to control the temperature and humidity during tobacco curing, with an 8031 single-chip computer as the central controller. By using fuzzy weighted factors, the cross-coupling in the curing procedure is decoupled. Results produced by the instrument indicate that its fuzzy controller performs well for the tobacco curing process.
ERIC Educational Resources Information Center
Tulbure, Bogdan T.; Szentagotai, Aurora; Dobrean, Anca; David, Daniel
2012-01-01
Investigating the empirical support of various assessment instruments, the evidence based assessment approach expands the scientific basis of psychotherapy. Starting from Hunsley and Mash's evaluative framework, we critically reviewed the rating scales designed to measure social anxiety or phobia in youth. Thirteen of the most researched social…
State-Based Curriculum-Making, Part 2, the Tool-Kit for the State's Curriculum-Making
ERIC Educational Resources Information Center
Westbury, Ian; Sivesind, Kirsten
2016-01-01
The paper identifies three tools that support the administrative instrument of a state-based curriculum commission: compartmentalization, licensing, and segmentation. These tools channel the state's curriculum-making towards forms of symbolic rather than regulatory action. The state curriculum becomes a framework for the ideological governance of…
NASA Astrophysics Data System (ADS)
Feng, Shou; Fu, Ping; Zheng, Wenbin
2018-03-01
Predicting gene function based on biological instrumental data is a complicated and challenging hierarchical multi-label classification (HMC) problem. When using local approach methods to solve this problem, a preliminary results processing method is usually needed. This paper proposed a novel preliminary results processing method called the nodes interaction method. The nodes interaction method revises the preliminary results and guarantees that the predictions are consistent with the hierarchy constraint. This method exploits the label dependency and considers the hierarchical interaction between nodes when making decisions based on the Bayesian network in its first phase. In the second phase, this method further adjusts the results according to the hierarchy constraint. Implementing the nodes interaction method in the HMC framework also enhances the HMC performance for solving the gene function prediction problem based on the Gene Ontology (GO), the hierarchy of which is a directed acyclic graph that is more difficult to tackle. The experimental results validate the promising performance of the proposed method compared to state-of-the-art methods on eight benchmark yeast data sets annotated by the GO.
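A simplified illustration of the hierarchy-constraint idea (the second phase described above), though not the Bayesian nodes-interaction method itself: preliminary scores over a GO-like DAG are post-processed so that no child term's score exceeds that of its ancestors. The term names and scores below are invented.

```python
# Illustrative sketch of enforcing the hierarchy (true-path) constraint on
# preliminary scores over a DAG of GO-like terms: a child's predicted probability
# may not exceed any ancestor's. A generic top-down pass, not the authors' method.
def topological_order(parents: dict[str, list[str]]) -> list[str]:
    seen, order = set(), []
    def visit(term: str) -> None:
        if term in seen:
            return
        seen.add(term)
        for p in parents.get(term, []):
            visit(p)
        order.append(term)
    for t in parents:
        visit(t)
    return order

def enforce_hierarchy(scores: dict[str, float], parents: dict[str, list[str]]) -> dict[str, float]:
    adjusted = dict(scores)
    # Process terms so that parents are adjusted before their children.
    for term in topological_order(parents):
        for parent in parents.get(term, []):
            adjusted[term] = min(adjusted[term], adjusted[parent])
    return adjusted

if __name__ == "__main__":
    scores = {"GO:root": 0.9, "GO:a": 0.95, "GO:b": 0.4, "GO:c": 0.7}
    parents = {"GO:a": ["GO:root"], "GO:b": ["GO:root"], "GO:c": ["GO:a", "GO:b"]}
    print(enforce_hierarchy(scores, parents))
    # GO:a is capped at 0.9 (its parent); GO:c is capped at 0.4 (its lowest ancestor).
```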
ERIC Educational Resources Information Center
Elken, Mari
2015-01-01
The European Qualifications Framework (EQF) for lifelong learning has been characterized as a policy instrument with a number of contested ideas, raising questions about the process through which such instruments are developed at European level. The introduction of the EQF is in this article examined through variations of neo-institutional theory:…
NASA Astrophysics Data System (ADS)
Schalk, Kelly A.
The purpose of this investigation was to measure specific ways a student-interest, SSI-based curricular and pedagogical framework affects undergraduates' ability to reason informally. The delimited components of informal reasoning measured were undergraduates' Nature of Science conceptualizations and their ability to evaluate scientific information. The socio-scientific issues (SSI) theoretical framework used in this case study has been advocated as a means for improving students' functional scientific literacy. This investigation focused on the laboratory component of an undergraduate microbiology course in spring 2008. There were 26 participants. The instruments used in this study included: (1) Individual and Group research projects, (2) journals, (3) laboratory write-ups, (4) a laboratory quiz, (5) anonymous evaluations, and (6) a pre/post article exercise. All instruments yielded qualitative data, which were coded using the qualitative software NVivo7. Data analyses were subjected to instrumental triangulation, inter-rater reliability checks, and member-checking. It was determined that undergraduates' epistemological knowledge of scientific discovery, processes, and justification matured in response to the intervention. Specifically, students realized that: (1) there are differences between facts, theories, and opinions; (2) testable questions are not definitively proven; (3) there is no single stepwise scientific process; and (4) lack of data weakens a claim. It was determined that this knowledge influenced participants' beliefs and ability to reason informally. For instance, students exhibited more critical evaluations of scientific information. It was also found that undergraduates' prior opinions had changed over the semester. Further, the student-interest aspect of this framework engaged learners by offering participants several opportunities to examine microbiology issues that affected their lives. The investigation provided empirically based insights into the ways undergraduates' interest and functional scientific literacy can be promoted, and advanced what was known about applying SSI-based frameworks in the post-secondary learning context. Outstanding questions remain for investigation. For example, is this type of student-interest, SSI-based intervention broadly applicable (i.e., in other science disciplines and grade levels)? And what challenges would teachers in diverse contexts encounter when implementing an SSI-based theoretical framework?
Conceptualizing and assessing improvement capability: a review
Boaden, Ruth; Walshe, Kieran
2017-01-01
Abstract Purpose The literature is reviewed to examine how ‘improvement capability’ is conceptualized and assessed and to identify future areas for research. Data sources An iterative and systematic search of the literature was carried out across all sectors including healthcare. The search was limited to literature written in English. Data extraction The study identifies and analyses 70 instruments and frameworks for assessing or measuring improvement capability. Information about the source of the instruments, the sectors in which they were developed or used, the measurement constructs or domains they employ, and how they were tested was extracted. Results of data synthesis The instruments and framework constructs are very heterogeneous, demonstrating the ambiguity of improvement capability as a concept, and the difficulties involved in its operationalisation. Two-thirds of the instruments and frameworks have been subject to tests of reliability and half to tests of validity. Many instruments have little apparent theoretical basis and do not seem to have been used widely. Conclusion The assessment and development of improvement capability needs clearer and more consistent conceptual and terminological definition, used consistently across disciplines and sectors. There is scope to learn from existing instruments and frameworks, and this study proposes a synthetic framework of eight dimensions of improvement capability. Future instruments need robust testing for reliability and validity. This study contributes to practice and research by presenting the first review of the literature on improvement capability across all sectors including healthcare. PMID:28992146
Calibration of space instruments at the Metrology Light Source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, R., E-mail: roman.klein@ptb.de; Fliegauf, R.; Gottwald, A.
2016-07-27
PTB has more than 20 years of experience in the calibration of space-based instruments using synchrotron radiation to cover the UV, VUV and X-ray spectral range. New instrumentation at the electron storage ring Metrology Light Source (MLS) opens up extended calibration possibilities within this framework. In particular, the set-up of a large vacuum vessel that can accommodate entire space instruments opens up new prospects. Moreover, a new facility for the calibration of radiation transfer source standards with a considerably extended spectral range has been put into operation. In addition, the characterization and calibration of single components such as mirrors, filters, gratings, and detectors is continued.
2013-01-01
Background This study advances a measurement approach for the study of organizational culture in population-based occupational health research, and tests how different organizational culture types are associated with psychological distress, depression, emotional exhaustion, and well-being. Methods Data were collected over a sample of 1,164 employees nested in 30 workplaces. Employees completed the 26-item OCP instrument. Psychological distress was measured with the General Health Questionnaire (12-item); depression with the Beck Depression Inventory (21-item); and emotional exhaustion with five items from the Maslach Burnout Inventory general survey. Exploratory factor analysis evaluated the dimensionality of the OCP scale. Multilevel regression models estimated workplace-level variations, and the contribution of organizational culture factors to mental health and well-being after controlling for gender, age, and living with a partner. Results Exploratory factor analysis of OCP items revealed four factors explaining about 75% of the variance, and supported the structure of the Competing Values Framework. Factors were labeled Group, Hierarchical, Rational and Developmental. Cronbach’s alphas were high (0.82-0.89). Multilevel regression analysis suggested that the four culture types varied significantly between workplaces, and correlated with mental health and well-being outcomes. The Group culture type best distinguished between workplaces and had the strongest associations with the outcomes. Conclusions This study provides strong support for the use of the OCP scale for measuring organizational culture in population-based occupational health research in a way that is consistent with the Competing Values Framework. The Group organizational culture needs to be considered as a relevant factor in occupational health studies. PMID:23642223
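For readers unfamiliar with the multilevel setup, the following sketch (not the study's code) fits a random-intercept model with employees nested in workplaces using statsmodels; the variable names and the synthetic data are hypothetical.

```python
# Illustrative sketch of the multilevel modelling step: a random-intercept model
# with employees nested in workplaces. Variable names and data are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_workplaces, n_per = 30, 40
workplace = np.repeat(np.arange(n_workplaces), n_per)
workplace_effect = rng.normal(0, 1, n_workplaces)[workplace]   # level-2 variation
group_culture = rng.normal(0, 1, n_workplaces * n_per)          # e.g. Group culture score
distress = 10 - 0.8 * group_culture + workplace_effect + rng.normal(0, 2, n_workplaces * n_per)

data = pd.DataFrame({"distress": distress, "group_culture": group_culture,
                     "workplace": workplace})

# Random intercept for workplace; fixed effect of the culture score.
model = smf.mixedlm("distress ~ group_culture", data, groups=data["workplace"])
result = model.fit()
print(result.summary())
```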
The control system of the multi-strip ionization chamber for the HIMM
NASA Astrophysics Data System (ADS)
Li, Min; Yuan, Y. J.; Mao, R. S.; Xu, Z. G.; Li, Peng; Zhao, T. C.; Zhao, Z. L.; Zhang, Nong
2015-03-01
Heavy Ion Medical Machine (HIMM) is a carbon ion cancer treatment facility which is being built by the Institute of Modern Physics (IMP) in China. In this facility, transverse profile and intensity of the beam at the treatment terminals will be measured by the multi-strip ionization chamber. In order to fulfill the requirement of the beam position feedback to accomplish the beam automatic commissioning, less than 1 ms reaction time of the Data Acquisition (DAQ) of this detector must be achieved. Therefore, the control system and software framework for DAQ have been redesigned and developed with National Instruments Compact Reconfigurable Input/Output (CompactRIO) instead of PXI 6133. The software is Labview-based and developed following the producer-consumer pattern with message mechanism and queue technology. The newly designed control system has been tested with carbon beam at the Heavy Ion Research Facility at Lanzhou-Cooler Storage Ring (HIRFL-CSR) and it has provided one single beam profile measurement in less than 1 ms with 1 mm beam position resolution. The fast reaction time and high precision data processing during the beam test have verified the usability and maintainability of the software framework. Furthermore, such software architecture is easy-fitting to applications with different detectors such as wire scanner detector.
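The producer-consumer pattern with a message queue mentioned above can be sketched generically as follows; this Python fragment only illustrates the pattern, not the HIMM LabVIEW software, and the strip data and centroid calculation are placeholders.

```python
# Illustrative sketch of the producer-consumer pattern: a queue decouples
# acquisition from profile processing. Strip counts and the intensity-weighted
# centroid below are placeholders, not the HIMM implementation.
import queue
import threading
import random

def producer(q: queue.Queue, n_events: int, n_strips: int = 32) -> None:
    """Simulate the DAQ pushing one multi-strip readout per event."""
    for _ in range(n_events):
        q.put([random.random() for _ in range(n_strips)])
    q.put(None)  # sentinel: acquisition finished

def consumer(q: queue.Queue) -> None:
    """Compute the beam position (intensity-weighted centroid) for each readout."""
    while True:
        strips = q.get()
        if strips is None:
            break
        total = sum(strips)
        centroid = sum(i * s for i, s in enumerate(strips)) / total
        print(f"profile centroid at strip {centroid:.2f}, intensity {total:.2f}")

if __name__ == "__main__":
    q: queue.Queue = queue.Queue()
    threading.Thread(target=producer, args=(q, 3)).start()
    consumer(q)
```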
Sebold, Miriam; Schad, Daniel J; Nebe, Stephan; Garbusow, Maria; Jünger, Elisabeth; Kroemer, Nils B; Kathmann, Norbert; Zimmermann, Ulrich S; Smolka, Michael N; Rapp, Michael A; Heinz, Andreas; Huys, Quentin J M
2016-07-01
Behavioral choice can be characterized along two axes. One axis distinguishes reflexive, model-free systems that slowly accumulate values through experience and a model-based system that uses knowledge to reason prospectively. The second axis distinguishes Pavlovian valuation of stimuli from instrumental valuation of actions or stimulus-action pairs. This results in four values and many possible interactions between them, with important consequences for accounts of individual variation. We here explored whether individual variation along one axis was related to individual variation along the other. Specifically, we asked whether individuals' balance between model-based and model-free learning was related to their tendency to show Pavlovian interferences with instrumental decisions. In two independent samples with a total of 243 participants, Pavlovian-instrumental transfer effects were negatively correlated with the strength of model-based reasoning in a two-step task. This suggests a potential common underlying substrate predisposing individuals to both have strong Pavlovian interference and be less model-based and provides a framework within which to interpret the observation of both effects in addiction.
Cadorin, Lucia; Bagnasco, Annamaria; Tolotti, Angela; Pagnucci, Nicola; Sasso, Loredana
2016-09-01
To identify, evaluate and describe the psychometric properties of instruments that measure learning outcomes in healthcare students. Meaningful learning is an active process that enables a wider and deeper understanding of concepts. It is the result of an interaction between new and prior knowledge and produces a long-standing change in knowledge and skills. In the field of education, validated and reliable instruments for assessing meaningful learning are needed. A psychometric systematic review. MEDLINE CINAHL, SCOPUS, ERIC, Cochrane Library, Psychology & Behavioural Sciences Collection Database from 1990-December 2013. Using pre-determined inclusion criteria, three reviewers independently identified studies for full-text review. Then they extracted data for quality appraisal and graded instrument validity using the Consensus-based Standards for the selection of the health status Measurement INstruments checklist and the Psychometric Grading Framework. Of the 57 studies identified for full-text review, 16 met the inclusion criteria and 13 different instruments were assessed. Following quality assessment, only one instrument was considered of good quality but it measured meaningful learning only in part; the others were either fair or poor. The Psychometric Grading Framework indicated that one instrument was weak, while the others were very weak. No instrument displayed adequate validity. The systematic review produced a synthesis of the psychometric properties of tools that measure learning outcomes in students of healthcare disciplines. Measuring learning outcomes is very important when educating health professionals. The identified tools may constitute a starting point for the development of other assessment tools. © 2016 John Wiley & Sons Ltd.
Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data
NASA Astrophysics Data System (ADS)
Okladnikov, I.; Gordov, E. P.; Titov, A. G.
2011-12-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, present studies in the area of climate and environmental change require special software support. A dedicated software framework has been created for the rapid development of information-computational systems, based on Web-GIS technologies, that provide such support. The software framework consists of three basic parts: a computational kernel developed using the ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for the development of typical components of a web-mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-standards-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library aimed at graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on the software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, the MRI/JMA APHRODITE's Water Resources Project Reanalysis, meteorological observational data for the territory of the former USSR for the 20th century, and others. The current version of the system is already involved in the scientific research process. In particular, the system was recently used successfully for the analysis of Siberian climate changes and their regional impact. The software framework presented allows rapid development of Web-GIS systems for geophysical data analysis, thus providing specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. This work is partially supported by RFBR grants #10-07-00547, #11-05-01190, and SB RAS projects 4.31.1.5, 4.31.2.7, 4, 8, 9, 50 and 66.
8. INTERIOR, CONTROL AND INSTRUMENTATION ROOM. Looking southwest toward entrance ...
8. INTERIOR, CONTROL AND INSTRUMENTATION ROOM. Looking southwest toward entrance and inner blast door. - Edwards Air Force Base, South Base Sled Track, Firing & Control Blockhouse for 10,000-foot Track, South of Sled Track at midpoint of 20,000-foot track, Lancaster, Los Angeles County, CA
van Dam, Joris; Tadmor, Brigitta; Spector, Jonathan; Musuku, John; Zühlke, Liesl J; Engel, Mark E; Mayosi, Bongani M; Nestle, Nick
2015-01-01
Summary Background Rheumatic heart disease (RHD) remains a major disease burden in low-resource settings globally. Patient registers have long been recognised to be an essential instrument in RHD control and elimination programmes, yet to date rely heavily on paper-based data collection and non-networked data-management systems, which limit their functionality. Objectives To assess the feasibility and potential benefits of producing an electronic RHD patient register. Methods We developed an eRegister based on the World Heart Federation’s framework for RHD patient registers using CommCare, an open-source, cloud-based software for health programmes that supports the development of customised data capture using mobile devices. Results The resulting eRegistry application allows for simultaneous data collection and entry by field workers using mobile devices, and by providers using computer terminals in clinics and hospitals. Data are extracted from CommCare and are securely uploaded into a cloud-based database that matches the criteria established by the WHF framework. The application can easily be tailored to local needs by modifying existing variables or adding new ones. Compared with traditional paper-based data-collection systems, the eRegister reduces the risk of data error, synchronises in real-time, improves clinical operations and supports management of field team operations. Conclusions The user-friendly eRegister is a low-cost, mobile-compatible platform for RHD treatment and prevention programmes based on materials sanctioned by the World Heart Federation. Readily adaptable to local needs, this paperless RHD patient register program presents many practical benefits. PMID:26444995
Air-condition Control System of Weaving Workshop Based on LabVIEW
NASA Astrophysics Data System (ADS)
Song, Jian
A LabVIEW-based air-conditioning measurement and control system is put forward to effectively control the environmental targets in a weaving workshop. The system is built on virtual instrument technology and adopts the NI LabVIEW development platform. It is composed of an upper PC, central control nodes based on the CC2530, sensor nodes, sensor modules and executive devices. A fuzzy control algorithm is employed to achieve accurate control of temperature and humidity. A user-friendly man-machine interaction interface is designed, with virtual instrument technology at the core of the software. Experiments show that the measurement and control system runs stably and reliably and meets the functional requirements for controlling the weaving workshop.
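For illustration only, the following minimal Python sketch shows the general shape of a fuzzy control step for a temperature loop: triangular membership functions, one rule per linguistic term, and weighted-average defuzzification. The membership ranges, rule base and actuator scaling are invented; the paper's actual LabVIEW rule base and hardware interface are not reproduced.

```python
# Minimal fuzzy-control sketch (assumed triangular memberships, invented rules).
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_step(error):
    """Map temperature error (setpoint - measured, deg C) to an actuator
    adjustment in percent; positive means 'heat/humidify more'."""
    negative = tri(error, -6.0, -3.0, 0.0)
    zero     = tri(error, -1.5, 0.0, 1.5)
    positive = tri(error,  0.0, 3.0, 6.0)
    # One rule per term; each rule proposes a crisp output value.
    rule_outputs = {-20.0: negative, 0.0: zero, 20.0: positive}
    total_weight = sum(rule_outputs.values()) or 1.0
    return sum(out * w for out, w in rule_outputs.items()) / total_weight

if __name__ == "__main__":
    for err in (-4.0, -0.5, 0.0, 2.5):
        print(f"error {err:+.1f} degC -> adjustment {fuzzy_step(err):+.1f} %")
```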
Drugs as instruments: a new framework for non-addictive psychoactive drug use.
Müller, Christian P; Schumann, Gunter
2011-12-01
Most people who are regular consumers of psychoactive drugs are not drug addicts, nor will they ever become addicts. In neurobiological theories, non-addictive drug consumption is acknowledged only as a "necessary" prerequisite for addiction, but not as a stable and widespread behavior in its own right. This target article proposes a new neurobiological framework theory for non-addictive psychoactive drug consumption, introducing the concept of "drug instrumentalization." Psychoactive drugs are consumed for their effects on mental states. Humans are able to learn that mental states can be changed on purpose by drugs, in order to facilitate other, non-drug-related behaviors. We discuss specific "instrumentalization goals" and outline neurobiological mechanisms of how major classes of psychoactive drugs change mental states and serve non-drug-related behaviors. We argue that drug instrumentalization behavior may provide a functional adaptation to modern environments based on a historical selection for learning mechanisms that allow the dynamic modification of consummatory behavior. It is assumed that in order to effectively instrumentalize psychoactive drugs, the establishment of and retrieval from a drug memory is required. Here, we propose a new classification of different drug memory subtypes and discuss how they interact during drug instrumentalization learning and retrieval. Understanding the everyday utility and the learning mechanisms of non-addictive psychotropic drug use may help to prevent abuse and the transition to drug addiction in the future.
Lessons Learned from the Hubble Space Telescope (HST) Contamination Control Program
NASA Technical Reports Server (NTRS)
Hansen, Patricia A.; Townsend, Jacqueline A.; Hedgeland, Randy J.
2004-01-01
Over the past two decades, the Hubble Space Telescope (HST) Contamination Control Program has evolved from a ground-based integration program to a space-based science-sustaining program. The contamination controls from the new-generation Scientific Instruments and Orbital Replacement Units were incorporated into the HST Contamination Control Program to maintain scientific capability over the life of the telescope. Long-term on-orbit scientific data has shown that these contamination controls implemented for the instruments, Servicing Mission activities (Orbiter, Astronauts, and mission), and on-orbit operations successfully protected the HST from contamination and the instruments from self-contamination.
ERIC Educational Resources Information Center
Dill, David D.; Beerkens, Maarja
2013-01-01
The new demands of mass systems of higher education and the emerging environment of global academic competition are altering the traditional institutions for assuring academic standards in universities. As a consequence many nations are experimenting with new instruments for academic quality assurance. Contemporary government control of academic…
GCS component development cycle
NASA Astrophysics Data System (ADS)
Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti
2012-09-01
The GTC is an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007, and the telescope has been in the operations phase since then. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA, and it is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysis (use cases) and design (UML) of any GCS subsystem, an initial component description of its interface is obtained, and from that information a component specification is written. In order to improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, the component is generated, compiled and deployed in a single step to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve for new developers and the development error rate, allows systematic use of design patterns and software reuse, speeds up delivery of the software product while improving design consistency and design quality, and eliminates the future refactoring process otherwise required for the code.
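A toy Python sketch of specification-driven skeleton generation, in the spirit of the approach described above. The specification format, template and class names are invented for the example and are not the Device Component Framework itself.

```python
# Illustrative code generation: turn a component specification into a class skeleton.
COMPONENT_SPEC = {
    "name": "SecondaryMirror",
    "commands": ["park", "moveAbsolute", "stop"],
    "monitors": ["position", "temperature"],
}

TEMPLATE = '''class {name}Base:
    """Auto-generated skeleton; fill in the command bodies."""
{command_stubs}
{monitor_stubs}'''

def generate_skeleton(spec):
    commands = "\n".join(
        f"    def {cmd}(self, *args):\n"
        f"        raise NotImplementedError('{cmd} not implemented')\n"
        for cmd in spec["commands"])
    monitors = "\n".join(
        f"    def get_{mon}(self):\n"
        f"        raise NotImplementedError('{mon} monitor not implemented')\n"
        for mon in spec["monitors"])
    return TEMPLATE.format(name=spec["name"], command_stubs=commands, monitor_stubs=monitors)

if __name__ == "__main__":
    print(generate_skeleton(COMPONENT_SPEC))
```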
A KPI-based process monitoring and fault detection framework for large-scale processes.
Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang
2017-05-01
Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
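The static KPI idea can be illustrated numerically with a short Python sketch: fit a least-squares map from process variables to the KPI on fault-free data and flag samples whose prediction residual exceeds a threshold. The data, threshold rule and test statistic below are simplified assumptions, not the paper's actual method.

```python
# Hedged numerical sketch of residual-based KPI fault detection (static case).
import numpy as np

rng = np.random.default_rng(0)

# Fault-free training data: 4 process variables, KPI = linear map + small noise.
X_train = rng.normal(size=(500, 4))
true_w = np.array([0.8, -0.5, 0.3, 1.2])
y_train = X_train @ true_w + 0.05 * rng.normal(size=500)

# Least-squares estimate of the process-to-KPI map.
w_hat, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# Simple 3-sigma threshold from the training residuals.
residuals = y_train - X_train @ w_hat
threshold = 3.0 * residuals.std()

def kpi_fault_detected(x, y):
    """Return True if the observed KPI deviates too far from its prediction."""
    return abs(y - x @ w_hat) > threshold

# A faulty sample: an additive KPI offset the process variables cannot explain.
x_new = rng.normal(size=4)
print(kpi_fault_detected(x_new, x_new @ true_w))        # expected False
print(kpi_fault_detected(x_new, x_new @ true_w + 1.0))  # expected True
```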
ERIC Educational Resources Information Center
Simmie, Geraldine Mooney; de Paor, Cathal; Liston, Jennifer; O'Shea, John
2017-01-01
This study reports on findings from a critical literature review, from 2004 to 2014, in relation to the positioning of beginning teachers' professional learning during induction. The study uses theoretical frameworks drawn from competing discourses: an instrumental standpoint based on performativity and a dialectical standpoint based on a…
Problem-Based Learning in Tertiary Education: Teaching Old "Dogs" New Tricks?
ERIC Educational Resources Information Center
Yeo, Roland K.
2005-01-01
Purpose--The paper sets out to explore the challenges of problem-based learning (PBL) in tertiary education and to propose a framework with implications for practice and learning. Design/Methodology/Approach--A total of 18 tertiary students divided into three groups participated in the focus group discussions. A quantitative instrument was used as…
East Meet West? U.S. and China: Strategies for Global Leadership
2013-03-01
…construct to serve as a broader framework for this research project to describe instruments of national power in a constantly changing, resource-constrained, geopolitical environment.
TELICS—A Telescope Instrument Control System for Small/Medium Sized Astronomical Observatories
NASA Astrophysics Data System (ADS)
Srivastava, Mudit K.; Ramaprakash, A. N.; Burse, Mahesh P.; Chordia, Pravin A.; Chillal, Kalpesh S.; Mestry, Vilas B.; Das, Hillol K.; Kohok, Abhay A.
2009-10-01
For any modern astronomical observatory, it is essential to have an efficient interface between the telescope and its back-end instruments. However, for small and medium-sized observatories, this requirement is often limited by tight financial constraints. Therefore a simple yet versatile and low-cost control system is required for such observatories to minimize cost and effort. Here we report the development of a modern, multipurpose instrument control system TELICS (Telescope Instrument Control System) to integrate the controls of various instruments and devices mounted on the telescope. TELICS consists of an embedded hardware unit known as a common control unit (CCU) in combination with Linux-based data acquisition and user interface. The hardware of the CCU is built around the ATmega 128 microcontroller (Atmel Corp.) and is designed with a backplane, master-slave architecture. A Qt-based graphical user interface (GUI) has been developed and the back-end application software is based on C/C++. TELICS provides feedback mechanisms that give the operator good visibility and a quick-look display of the status and modes of instruments as well as data. TELICS has been used for regular science observations since 2008 March on the 2 m, f/10 IUCAA Telescope located at Girawali in Pune, India.
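As a purely illustrative aside, a master-slave exchange over a backplane of the kind a common control unit might use can be sketched as a small framed command with a checksum. The frame layout below (address, command, argument, checksum) is invented for the example and is not the actual TELICS protocol.

```python
# Hypothetical master-slave command framing sketch.
def build_frame(slave_address, command, argument):
    """Pack a 4-byte frame: [address][command][argument][checksum]."""
    payload = bytes([slave_address & 0xFF, command & 0xFF, argument & 0xFF])
    checksum = sum(payload) & 0xFF
    return payload + bytes([checksum])

def parse_frame(frame):
    """Validate the checksum and unpack a frame received from the bus."""
    if len(frame) != 4 or (sum(frame[:3]) & 0xFF) != frame[3]:
        raise ValueError("corrupt frame")
    return {"address": frame[0], "command": frame[1], "argument": frame[2]}

if __name__ == "__main__":
    frame = build_frame(slave_address=0x02, command=0x10, argument=0x7F)
    print(frame.hex(), parse_frame(frame))
```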
Devaluation and sequential decisions: linking goal-directed and model-based behavior
Friedel, Eva; Koch, Stefan P.; Wendt, Jean; Heinz, Andreas; Deserno, Lorenz; Schlagenhauf, Florian
2014-01-01
In experimental psychology, different experiments have been developed to assess goal-directed as compared with habitual control over instrumental decisions. Similar to animal studies, selective devaluation procedures have been used. More recently, sequential decision-making tasks have been designed to assess the degree of goal-directed vs. habitual choice behavior in terms of an influential computational theory of model-based compared to model-free behavioral control. As recently suggested, the different measurements are thought to reflect the same construct. Yet there has been no attempt to directly assess the construct validity of these different measurements. In the present study, we used a devaluation paradigm and a sequential decision-making task to address this question of construct validity in a sample of 18 healthy male human participants. Correlational analysis revealed a positive association between model-based choices during sequential decisions and goal-directed behavior after devaluation, suggesting a single framework underlying both operationalizations and speaking in favor of the construct validity of both measurement approaches. Up to now, this has been merely assumed but never directly tested in humans. PMID:25136310
2011-01-01
Objective To evaluate the validity of cancer-specific and generic preference-based instruments to discriminate across different measures of cancer severity. Methods Patients with breast (n = 66), colorectal (n = 57), and lung (n = 61) cancer completed the EORTC QLQ-C30 and the FACT-G, as well as three generic instruments: the EQ-5D, the SF-6D, and the HUI2/3. Disease severity was quantified using cancer stage, Eastern Cooperative Oncology Group Performance Status (ECOG-PS) score, and self-reported health status. Comparative analyses confirmed the multi-dimensional conceptualization of the instruments in terms of construct and convergent validity. Results In general, the instruments were able to discriminate across severity measures. The instruments demonstrated moderate to strong correlation with each other (r = 0.37-0.73). Not all of the measures could discriminate between different groups of disease severity: the EQ-5D and SF-6D were less discriminative than the HUI2/3 and the cancer-specific instruments. Conclusion The cancer-specific and generic preference-based instruments proved to be valid in discriminating across levels of ECOG-PS scores and self-reported health states. However, the usefulness of the generic instruments may be limited if they are not able to detect small changes in health status within cancer patients. This raises concerns regarding the appropriateness of these instruments when comparing different cancer treatments within an economic evaluation framework. PMID:22123196
Gao, Yuan; Peters, Ove A; Wu, Hongkun; Zhou, Xuedong
2009-02-01
The purpose of this study was to customize an application framework by using the MeVisLab image processing and visualization platform for three-dimensional reconstruction and assessment of tooth and root canal morphology. One maxillary first molar was scanned before and after preparation with ProTaper by using micro-computed tomography. With a customized application framework based on MeVisLab, internal and external anatomy was reconstructed. Furthermore, the dimensions of root canal and radicular dentin were quantified, and effects of canal preparation were assessed. Finally, a virtual preparation with risk analysis was performed to simulate the removal of a broken instrument. This application framework provided an economical platform and met current requirements of endodontic research. The broad-based use of high-quality free software and the resulting exchange of experience might help to improve the quality of endodontic research with micro-computed tomography.
Using Model-Based Reasoning for Autonomous Instrument Operation - Lessons Learned From IMAGE/LENA
NASA Technical Reports Server (NTRS)
Johnson, Michael A.; Rilee, Michael L.; Truszkowski, Walt; Bailin, Sidney C.
2001-01-01
Model-based reasoning has been applied as an autonomous control strategy on the Low Energy Neutral Atom (LENA) instrument currently flying on board the Imager for Magnetosphere-to-Aurora Global Exploration (IMAGE) spacecraft. Explicit models of instrument subsystem responses have been constructed and are used to dynamically adapt the instrument to the spacecraft's environment. These functions are cast as part of a Virtual Principal Investigator (VPI) that autonomously monitors and controls the instrument. In the VPI's current implementation, LENA's command uplink volume has been decreased significantly from its previous volume; typically, no uplinks are required for operations. This work demonstrates that a model-based approach can be used to enhance science instrument effectiveness. The components of LENA are common in space science instrumentation, and lessons learned by modeling this system may be applied to other instruments. Future work involves the extension of these methods to cover more aspects of LENA operation and the generalization to other space science instrumentation.
Nature-based flood risk management -challenges in implementing catchment-wide management concepts
NASA Astrophysics Data System (ADS)
Thaler, Thomas; Fuchs, Sven
2017-04-01
Traditionally, flood risk management focused on coping with the flow at a given point by, for example, building dikes or straightening the watercourse. Increasingly, the emphasis has shifted to measures within the flood plain that delay the flow through storage. As such, the fluent boundaries imposed by the behaviour of the catchment at a certain point are relocated upstream by human intervention. Therefore, the implementation of flood storages and the use of natural retention areas are promoted as mitigation measures to support sustainable flood risk management. These measures aim at reducing the effluent boundaries on the floodplain by increasing the effluent boundaries upstream. However, beyond the simple change of practices, it is often a question of land use change which is at stake in water management. As such, it poses the question of how to govern both water and land to satisfy the different stakeholders. Nature-based strategies are often accompanied by voluntary agreements, which are promoted as an alternative instrument to traditional top-down command and control regulation. Voluntary agreements aim at bringing more efficiency, participation and transparency to solving problems between different social groups. In natural hazard risk management, voluntary agreements are now receiving great interest as a complement to existing policy instruments in order to achieve the objectives of the EU WFD and of the Floods Directive. This paper investigates the use of voluntary agreements as an alternative instrument to traditional top-down command and control regulation in the implementation of flood storages in Austria. The paper provides a framework of analysis to reveal barriers and opportunities associated with such an approach. The paper concludes that institutions and power are the central elements to address in order to allow voluntary agreements to succeed.
Beyond Instrumental Literacy: Discourse Ethics and Literacy Education.
ERIC Educational Resources Information Center
Endres, Benjamin J.
Literacy education concerns itself with assessing student needs and determining the appropriate methods for meeting them, without considering the ethical framework in which those "needs" find meaning. This paper argues that a notion of reflective communication, based on Jurgen Habermas's theory of "Discourse," provides an…
Generalized Symbolic Execution for Model Checking and Testing
NASA Technical Reports Server (NTRS)
Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)
2003-01-01
Modern software systems, which often are concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework based on symbolic execution, for automated checking of such systems. We provide a two-fold generalization of traditional symbolic execution based approaches: one, we define a program instrumentation, which enables standard model checkers to perform symbolic execution; two, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings) and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking correctness of multi-threaded programs that take inputs from unbounded domains with complex structure and generation of non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.
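For intuition only, the core idea of symbolic execution, running a program over symbolic inputs and enumerating path conditions, can be sketched in a few lines of Python. A real engine (such as the Java PathFinder-based one described above) tracks heap shapes and calls a decision procedure; none of that is modelled here, and the example program and expression strings are invented.

```python
# Toy sketch: enumerate path conditions of a hand-instrumented two-branch program.
def symbolic_paths():
    x, y = "x", "y"
    paths = []
    # Instrumented version of:
    #   if x > y: z = x - y  else: z = y - x
    #   if z > 10: return "big" else: return "small"
    for first_branch in (True, False):
        z = f"({x} - {y})" if first_branch else f"({y} - {x})"
        cond1 = f"{x} > {y}" if first_branch else f"not ({x} > {y})"
        for second_branch in (True, False):
            cond2 = f"{z} > 10" if second_branch else f"not ({z} > 10)"
            outcome = "big" if second_branch else "small"
            paths.append((f"{cond1} and {cond2}", outcome))
    return paths

if __name__ == "__main__":
    for path_condition, outcome in symbolic_paths():
        print(f"{path_condition!r:45} -> {outcome}")
```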
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waissbein, Oliver; Glemarec, Yannick; Bayraktar, Hande
2013-03-15
This report introduces an innovative framework to assist policymakers to quantitatively compare the impact of different public instruments to promote renewable energy. The report identifies the need to reduce the high financing costs for renewable energy in developing countries as an important task for policymakers acting today. The framework is structured in four stages: (i) risk environment, (ii) public instruments, (iii) levelised cost and (iv) evaluation. To illustrate how the framework can support decision-making in practice, the report presents findings from illustrative case studies in four developing countries. It then draws on these results to discuss possible directions for enhancing public interventions to scale-up renewable energy investment. UNDP is also releasing a financial tool for policymakers to accompany the framework. The financial tool is available for download on the UNDP website.
NASA Astrophysics Data System (ADS)
Zachariadou, K.; Yiasemides, K.; Trougkakos, N.
2012-11-01
We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff.
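As a hedged illustration of the data path in such a setup, the sketch below parses the kind of comma-separated voltage/current samples an Arduino might stream over the serial port. The sample text stands in for the serial stream; reading from real hardware (e.g. with pyserial) and the ROOT-based analysis are not shown, and the field layout is an assumption.

```python
# Host-side parsing sketch for an assumed "voltage_V,current_mA" serial stream.
import io

def parse_iv_stream(stream):
    """Parse 'voltage_V,current_mA' lines into a list of (V, mA) tuples."""
    samples = []
    for line in stream:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        voltage, current = (float(field) for field in line.split(","))
        samples.append((voltage, current))
    return samples

if __name__ == "__main__":
    fake_serial = io.StringIO("# V,mA\n0.00,35.2\n0.10,34.8\n0.20,34.1\n")
    for voltage, current in parse_iv_stream(fake_serial):
        print(f"{voltage:.2f} V  {current:.1f} mA")
```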
A TCP/IP framework for ethernet-based measurement, control and experiment data distribution
NASA Astrophysics Data System (ADS)
Ocaya, R. O.; Minny, J.
2010-11-01
A complete modular but scalable TCP/IP based scientific instrument control and data distribution system has been designed and realized. The system features an IEEE 802.3 compliant 10 Mbps Medium Access Controller (MAC) and Physical Layer Device that is suitable for the full-duplex monitoring and control of various physically widespread measurement transducers in the presence of a local network infrastructure. The cumbersomeness of exchanging and synchronizing data between the various transducer units using physical storage media led to the choice of TCP/IP as a logical alternative. The system and methods developed are scalable for broader usage over the Internet. The system comprises PIC18F2620- and ENC28J60-based hardware and a software component written in the C, Java/JavaScript and Visual Basic.NET programming languages for event-level monitoring and browser user interfaces, respectively. The system exchanges data with the host network through IPv4 packets requested and received on a HTTP page. It also responds to ICMP echo, UDP and ARP requests through a user selectable integrated DHCP and static IPv4 address allocation scheme. The round-trip time, throughput and polling frequency are estimated and reported. A typical application to temperature monitoring and logging is also presented.
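A minimal host-side sketch of polling such an embedded HTTP endpoint and estimating the round-trip time is shown below, assuming the device serves a status page over plain HTTP. The device address and path are placeholders, and the firmware-side protocol is not reproduced.

```python
# Hypothetical host-side polling sketch (placeholder URL, ~1 Hz poll rate).
import time
import urllib.request

DEVICE_URL = "http://192.168.1.50/index.htm"   # placeholder device address

def poll_device(url, timeout=2.0):
    """Fetch the device's status page once and return (rtt_seconds, payload)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        payload = response.read()
    return time.perf_counter() - start, payload

if __name__ == "__main__":
    for _ in range(5):
        try:
            rtt, data = poll_device(DEVICE_URL)
            print(f"rtt = {rtt * 1000:.1f} ms, {len(data)} bytes")
        except OSError as exc:
            print(f"poll failed: {exc}")
        time.sleep(1.0)
```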
Hermans, Kirsten; Spruytte, Nele; Cohen, Joachim; Van Audenhove, Chantal; Declercq, Anja
2014-12-05
Nursing homes are important locations for palliative care. Through comprehensive geriatric assessments (CGAs), evaluations can be made of palliative care needs of nursing home residents. The interRAI Palliative Care instrument (interRAI PC) is a CGA that evaluates diverse palliative care needs of adults in all healthcare settings. The evaluation results in Client Assessment Protocols (CAPs: indications of problems that need addressing) and Scales (e.g. Palliative Index for Mortality (PIM)) which can be used to design, evaluate and adjust care plans. This study aims to examine the effect of using the interRAI PC on the quality of palliative care in nursing homes. Additionally, it aims to evaluate the feasibility and validity of the interRAI PC. This study covers phases 0, I and II of the Medical Research Council (MRC) framework for designing and evaluating complex interventions, with a longitudinal, quasi-experimental pretest-posttest design and with mixed methods of evaluation. In phase 0, a systematic literature search is conducted. In phase I, the interRAI PC is adapted for use in Belgium and implemented on the BelRAI-website and a practical training is developed. In phase II, the intervention is tested in fifteen nursing homes. Participating nursing homes fill out the interRAI PC during one year for all residents receiving palliative care. Using a pretest-posttest design with quasi-random assignment to the intervention or control group, the effect of the interRAI PC on the quality of palliative care is evaluated with the Palliative care Outcome Scale (POS). Psychometric analysis is conducted to evaluate the predictive validity of the PIM and the convergent validity of the CAP 'Mood' of the interRAI PC. Qualitative data regarding the usability and face validity of the instrument are collected through focus groups, interviews and field notes. This is the first study to evaluate the validity and effect of the interRAI PC in nursing homes, following a methodology based on the MRC framework. This approach improves the study design and implementation and will contribute to a higher generalizability of results. The final result will be a psychometrically evaluated CGA for nursing home residents receiving palliative care. ClinicalTrials.gov NCT02281032. Registered October 30th, 2014.
Construction and Validation of SRA-FV Need Assessment.
Thornton, David; Knight, Raymond A
2015-08-01
This article describes the construction and testing of a newly designed instrument to assess psychological factors associated with increased rates of sexual recidivism. The new instrument (Structured Risk Assessment-Forensic Version or SRA-FV) was based on previous research using the SRA framework. This article describes the results of testing SRA-FV with a large sample (N = 566) of sexual offenders being evaluated for an early civil commitment program. SRA-FV was found to significantly predict sexual recidivism for both child molesters and rapists and to have incremental predictive value relative to two widely used static actuarial instruments (Static-99R; Risk Matrix 2000/S). © The Author(s) 2013.
Rodríguez, Daniela C; Hoe, Connie; Dale, Elina M; Rahman, M Hafizur; Akhter, Sadika; Hafeez, Assad; Irava, Wayne; Rajbangshi, Preety; Roman, Tamlyn; Ţîrdea, Marcela; Yamout, Rouham; Peters, David H
2017-08-01
The capacity to demand and use research is critical for governments if they are to develop policies that are informed by evidence. Existing tools designed to assess how government officials use evidence in decision-making have significant limitations for low- and middle-income countries (LMICs); they are rarely tested in LMICs and focus only on individual capacity. This paper introduces an instrument that was developed to assess Ministry of Health (MoH) capacity to demand and use research evidence for decision-making, which was tested for reliability and validity in eight LMICs (Bangladesh, Fiji, India, Lebanon, Moldova, Pakistan, South Africa, Zambia). Instrument development was based on a new conceptual framework that addresses individual, organisational and systems capacities, and items were drawn from existing instruments and a literature review. After initial item development and pre-testing to address face validity and item phrasing, the instrument was reduced to 54 items for further validation and item reduction. In-country study teams interviewed a systematic sample of 203 MoH officials. Exploratory factor analysis was used in addition to standard reliability and validity measures to further assess the items. Thirty items divided between two factors representing organisational and individual capacity constructs were identified. South Africa and Zambia demonstrated the highest level of organisational capacity to use research, whereas Pakistan and Bangladesh were the lowest two. In contrast, individual capacity was highest in Pakistan, followed by South Africa, whereas Bangladesh and Lebanon were the lowest. The framework and related instrument represent a new opportunity for MoHs to identify ways to understand and improve capacities to incorporate research evidence in decision-making, as well as to provide a basis for tracking change.
2013-01-01
Background Measuring team factors in evaluations of Continuous Quality Improvement (CQI) may provide important information for enhancing CQI processes and outcomes; however, the large number of potentially relevant factors and associated measurement instruments makes inclusion of such measures challenging. This review aims to provide guidance on the selection of instruments for measuring team-level factors by systematically collating, categorizing, and reviewing quantitative self-report instruments. Methods Data sources: We searched MEDLINE, PsycINFO, and Health and Psychosocial Instruments; reference lists of systematic reviews; and citations and references of the main report of instruments. Study selection: To determine the scope of the review, we developed and used a conceptual framework designed to capture factors relevant to evaluating CQI in primary care (the InQuIRe framework). We included papers reporting development or use of an instrument measuring factors relevant to teamwork. Data extracted included instrument purpose; theoretical basis, constructs measured and definitions; development methods and assessment of measurement properties. Analysis and synthesis: We used qualitative analysis of instrument content and our initial framework to develop a taxonomy for summarizing and comparing instruments. Instrument content was categorized using the taxonomy, illustrating coverage of the InQuIRe framework. Methods of development and evidence of measurement properties were reviewed for instruments with potential for use in primary care. Results We identified 192 potentially relevant instruments, 170 of which were analyzed to develop the taxonomy. Eighty-one instruments measured constructs relevant to CQI teams in primary care, with content covering teamwork context (45 instruments measured enabling conditions or attitudes to teamwork), team process (57 instruments measured teamwork behaviors), and team outcomes (59 instruments measured perceptions of the team or its effectiveness). Forty instruments were included for full review, many with a strong theoretical basis. Evidence supporting measurement properties was limited. Conclusions Existing instruments cover many of the factors hypothesized to contribute to QI success. With further testing, use of these instruments measuring team factors in evaluations could aid our understanding of the influence of teamwork on CQI outcomes. Greater consistency in the factors measured and choice of measurement instruments is required to enable synthesis of findings for informing policy and practice. PMID:23410500
NASA Astrophysics Data System (ADS)
Bisogni, Maria Giuseppina
2006-04-01
In this paper we report on the performance and the first imaging test results of a digital mammographic demonstrator based on GaAs pixel detectors. The heart of this prototype is the X-ray detection unit, a GaAs pixel sensor read out by the PCC/MEDIPIX1 circuit. Since the active area of each sensor is 1 cm2, 18 detectors have been organized in two staggered rows of nine chips each. To cover the typical mammographic format (18 × 24 cm2), linear scanning is performed by means of a stepper motor. The system is integrated in mammographic equipment comprising the X-ray tube, the bias and data acquisition systems and the PC-based control system. The prototype has been developed in the framework of the Integrated Mammographic Imaging (IMI) project, an industrial research activity aiming to develop innovative instrumentation for morphologic and functional imaging. The project has been supported by the Italian Ministry of Education, University and Research (MIUR) and by five Italian high-tech companies in collaboration with the universities of Ferrara, Roma “La Sapienza” and Pisa, and the INFN.
Development of a UAS-based survey module for ecological research
NASA Astrophysics Data System (ADS)
Meng, R.; McMahon, A. M.; Serbin, S.
2016-12-01
The development of small unmanned aircraft system (UAS, < 25 kg) techniques is enabling measurements of terrestrial ecosystems at unprecedented temporal and spatial scales. Given the potential for improved mission safety, high revisit frequency, and reduced operation cost, UAS platforms are of particular interest for scientific research. Our group is developing a UAS-based survey module for ecological research (e.g. scaling and mapping plant functional traits). However, in addition to technical challenges, the complicated regulations required to operate a UAS for research (e.g. Certificates of Waiver or Authorization, COA, for each location) and compliance with Federal Aviation Administration (FAA) restrictions, which are still actively evolving, can have significant impacts on research plans and schedules. Here we briefly discuss our lessons learned related to FAA registration and COA procedures, requirements, and regulations in the US, accompanied by our hands-on experience (our group currently has two COAs granted and three more under review by the FAA). We then introduce our design for a modular data collection software framework. This framework is open source (available on GitHub) and cross-platform compatible (written in Python), providing flexibility in development and deployment hardware configurations. In addition, our framework uses a central module to coordinate data acquisition, synchronization with the UAS control system and data storage through a common interface and interchangeable, hardware-specific software modules. Utilizing this structure and a common data transfer format, the system can be easily reconfigured to meet the needs of a specific platform or operation, eliminating the need to redevelop acquisition systems for specific instrument/platform configurations. On-site data measurement tests of the UAS-based survey module were conducted, and data quality from multiple sensors (e.g. a high-resolution digital camera, a spectroradiometer, and a thermal infrared camera) is reported. Finally, the results of this prototype study show that UAS techniques can be used to develop a low-cost alternative for ecological research, but much effort is still needed by the practitioner to carefully deal with flight regulations and integrate off-the-shelf instrumentation.
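The plug-in acquisition design described above can be sketched as a central coordinator driving interchangeable sensor modules through one common interface. The class and method names below are invented for illustration; the group's actual framework on GitHub may be structured differently.

```python
# Sketch of a coordinator with swappable, hardware-specific sensor modules.
import time
from abc import ABC, abstractmethod

class SensorModule(ABC):
    """Common interface every hardware-specific module must implement."""
    @abstractmethod
    def name(self) -> str: ...
    @abstractmethod
    def read(self) -> dict: ...

class DummyCamera(SensorModule):
    def name(self): return "rgb_camera"
    def read(self): return {"frame_id": 42}          # stand-in for a capture call

class DummySpectroradiometer(SensorModule):
    def name(self): return "spectroradiometer"
    def read(self): return {"radiance_w_m2": 1.23}   # stand-in for a spectrum

class Coordinator:
    """Central module: polls every registered sensor and stores records in a
    common format so modules can be swapped without touching the core."""
    def __init__(self, modules):
        self.modules = list(modules)
        self.records = []

    def acquire_once(self):
        timestamp = time.time()
        for module in self.modules:
            self.records.append({"t": timestamp, "sensor": module.name(),
                                 "data": module.read()})

if __name__ == "__main__":
    coord = Coordinator([DummyCamera(), DummySpectroradiometer()])
    coord.acquire_once()
    for record in coord.records:
        print(record)
```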
Rode, Julian; Wittmer, Heidi; Emerton, Lucy; Schröter-Schlaack, Christoph
2016-09-01
Economic instruments that promise "win-win" solutions for both biodiversity conservation and human livelihoods have become increasingly popular over recent years. There however remains a gap in terms of practical and policy-relevant guidance about appropriate approaches that take into account the local needs and the specific cultural, legal, and ecological context in which such instruments are being developed and applied. This paper presents a step-by-step framework that helps conservation and development planners and practitioners to identify economic instruments that can promote pro-conservation behaviour in a specific setting. The concept of 'ecosystem service opportunities' builds on, and brings together, general economic principles and an ecosystem services perspective. The framework was designed to also address a number of concerns regarding economic approaches in order to help practitioners recognise the potentials and limits of economic approaches to nature conservation. The framework is illustrated by its application within the realm of a biodiversity conservation project in Thailand.
ERIC Educational Resources Information Center
Durham, Mary F.; Knight, Jennifer K.; Couch, Brian A.
2017-01-01
The Scientific Teaching (ST) pedagogical framework provides various approaches for science instructors to teach in a way that more closely emulates how science is practiced by actively and inclusively engaging students in their own learning and by making instructional decisions based on student performance data. Fully understanding the impact of…
Catalog and Assessment of the Manpower and Personnel Research Division Data Bases
1992-11-01
The objectives are: to assess the effectiveness of Army advertising, to assess the advertising strategy in an integrated framework, and to support management and planning of future advertising strategy.
NASA Astrophysics Data System (ADS)
Li, Sissi L.
At the university level, introductory science courses usually have high student to teacher ratios which increases the challenge to meaningfully connect with students. Various curricula have been developed in physics education to actively engage students in learning through social interactions with peers and instructors in class. This learning environment demands not only conceptual understanding but also learning to be a scientist. However, the success of student learning is typically measured in test performance and course grades while assessment of student development as science learners is largely ignored. This dissertation addresses this issue with the development of an instrument towards a measure of physics learning identity (PLI) which is used to guide and complement case studies through student interviews and in class observations. Using the conceptual framework based on Etienne Wenger's communities of practice (1998), I examine the relationship between science learning and learning identity from a situated perspective in the context of a large enrollment science class as a community of practice. This conceptual framework emphasizes the central role of identity in the practices negotiated in the classroom community and in the way students figure out their trajectory as members. Using this framework, I seek to understand how the changes in student learning identity are supported by active engagement based instruction. In turn, this understanding can better facilitate the building of a productive learning community and provide a measure for achievement of the curricular learning goals in active engagement strategies. Based on the conceptual framework, I developed and validated an instrument for measuring physics learning identity in terms of student learning preferences, self-efficacy for learning physics, and self-image as a physics learner. The instrument was pilot tested with a population of Oregon State University students taking calculus based introductory physics. The responses were analyzed using principal component exploratory factor analysis. The emergent factors were analyzed to create reliable subscales to measure PLI in terms of physics learning self-efficacy and social expectations about learning. Using these subscales, I present a case study of a student who performed well in the course but resisted the identity learning goals of the curriculum. These findings are used to support the factors that emerged from the statistical analysis and suggest a potential model of the relationships between the factors describing science learning and learning identity in large enrollment college science classes. This study offers an instrument with which to measure aspects of physics learning identity and insights on how PLI might develop in a classroom community of practice.
Leeseberg Stamler, L; Cole, M M; Patrick, L J
2001-08-01
Strategies to delay or prevent complications from diabetes include diabetes patient education. Diabetes educators seek to provide education that meets the needs of clients and influences positive health outcomes. The aims were (1) to expand prior research exploring an enablement framework for patient education by examining perceptions of patient education by persons with diabetes and (2) to test the Mastery of Stress Instrument (MSI) as a potential evaluative instrument for patient education. Triangulated data collection was used with a convenience sample of adults taking diabetes education classes. Half the sample completed audio-taped semi-structured interviews before, during and after education, and all completed the MSI after education. Qualitative data were analysed using latent content analysis, and descriptive statistics were computed. Qualitative analysis revealed content categories similar to previous work with prenatal participants, supporting the enablement framework. Statistical analyses noted congruence with psychometric findings from the development of the MSI; secondary qualitative analyses revealed congruence between MSI scores and patient perceptions. Mastery is an outcome congruent with the enablement framework for patient education across content areas. The Mastery of Stress Instrument may be an instrument for identifying patients who are coping well with diabetes self-management, as well as those who are not and who require further nursing interventions.
A general observatory control software framework design for existing small and mid-size telescopes
NASA Astrophysics Data System (ADS)
Ge, Liang; Lu, Xiao-Meng; Jiang, Xiao-Jun
2015-07-01
A general framework for observatory control software would help to improve the efficiency of observation and operation of telescopes, and would also be advantageous for remote and joint observations. We describe a general framework for observatory control software, which considers principles of flexibility and inheritance to meet the expectations from observers and technical personnel. This framework includes observation scheduling, device control and data storage. The design is based on a finite state machine that controls the whole process.
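A minimal finite-state-machine sketch for a device within such a control framework is given below. The states, events and transitions are generic placeholders, not the specific design adopted in the paper.

```python
# Illustrative device state machine (invented states and events).
class DeviceStateMachine:
    TRANSITIONS = {
        ("OFF", "power_on"): "IDLE",
        ("IDLE", "start_exposure"): "EXPOSING",
        ("EXPOSING", "readout_done"): "IDLE",
        ("EXPOSING", "abort"): "IDLE",
        ("IDLE", "power_off"): "OFF",
    }

    def __init__(self):
        self.state = "OFF"

    def handle(self, event):
        """Apply an event; raise if it is not allowed in the current state."""
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            raise ValueError(f"event '{event}' not allowed in state '{self.state}'")
        self.state = self.TRANSITIONS[key]
        return self.state

if __name__ == "__main__":
    fsm = DeviceStateMachine()
    for event in ("power_on", "start_exposure", "readout_done", "power_off"):
        print(event, "->", fsm.handle(event))
```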
Durham, Mary F.; Knight, Jennifer K.; Couch, Brian A.
2017-01-01
The Scientific Teaching (ST) pedagogical framework provides various approaches for science instructors to teach in a way that more closely emulates how science is practiced by actively and inclusively engaging students in their own learning and by making instructional decisions based on student performance data. Fully understanding the impact of ST requires having mechanisms to quantify its implementation. While many useful instruments exist to document teaching practices, these instruments only partially align with the range of practices specified by ST, as described in a recently published taxonomy. Here, we describe the development, validation, and implementation of the Measurement Instrument for Scientific Teaching (MIST), a survey derived from the ST taxonomy and designed to gauge the frequencies of ST practices in undergraduate science courses. MIST showed acceptable validity and reliability based on results from 7767 students in 87 courses at nine institutions. We used factor analyses to identify eight subcategories of ST practices and used these categories to develop a short version of the instrument amenable to joint administration with other research instruments. We further discuss how MIST can be used by instructors, departments, researchers, and professional development programs to quantify and track changes in ST practices. PMID:29196428
Federated software defined network operations for LHC experiments
NASA Astrophysics Data System (ADS)
Kim, Dongkyun; Byeon, Okhwan; Cho, Kihyeon
2013-09-01
The most well-known high-energy physics collaboration, the Large Hadron Collider (LHC), which is based on e-Science, has been facing several challenges presented by its extraordinary instruments in terms of the generation, distribution, and analysis of large amounts of scientific data. Currently, data distribution issues are being resolved by adopting an advanced Internet technology called software defined networking (SDN). Stability of the SDN operations and management is demanded to keep the federated LHC data distribution networks reliable. Therefore, in this paper, an SDN operation architecture based on the distributed virtual network operations center (DvNOC) is proposed to enable LHC researchers to assume full control of their own global end-to-end data dissemination. This may achieve an enhanced data delivery performance based on data traffic offloading with delay variation. The evaluation results indicate that the overall end-to-end data delivery performance can be improved over multi-domain SDN environments based on the proposed federated SDN/DvNOC operation framework.
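To give a flavour of the traffic-offloading idea, the sketch below picks the end-to-end path with the smallest total delay over a delay-weighted site graph. The topology, site pairings and delay values are invented; the actual DvNOC logic also weighs delay variation and operational policy, none of which is modelled here.

```python
# Hedged path-selection sketch over an invented delay-weighted site graph.
import heapq

LINKS = {  # site -> {neighbour: one-way delay in ms} (made-up numbers)
    "CERN":     {"DE-KIT": 18.0, "US-FNAL": 95.0},
    "DE-KIT":   {"CERN": 18.0, "KR-KISTI": 140.0},
    "US-FNAL":  {"CERN": 95.0, "KR-KISTI": 110.0},
    "KR-KISTI": {"DE-KIT": 140.0, "US-FNAL": 110.0},
}

def best_path(src, dst):
    """Dijkstra's algorithm over the delay-weighted site graph."""
    queue = [(0.0, src, [src])]
    seen = set()
    while queue:
        delay, node, path = heapq.heappop(queue)
        if node in seen:
            continue
        seen.add(node)
        if node == dst:
            return delay, path
        for neighbour, link_delay in LINKS[node].items():
            if neighbour not in seen:
                heapq.heappush(queue, (delay + link_delay, neighbour, path + [neighbour]))
    return float("inf"), []

if __name__ == "__main__":
    # Expected: (158.0, ['CERN', 'DE-KIT', 'KR-KISTI']) with the numbers above.
    print(best_path("CERN", "KR-KISTI"))
```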
NASA Astrophysics Data System (ADS)
Xu, Jun
Topic 1. An Optimization-Based Approach for Facility Energy Management with Uncertainties. Effective energy management for facilities is becoming increasingly important in view of rising energy costs, government mandates on the reduction of energy consumption, and human comfort requirements. This part of the dissertation presents a daily energy management formulation and the corresponding solution methodology for HVAC systems. The problem is to minimize the energy and demand costs through the control of HVAC units while satisfying human comfort, system dynamics, load limit constraints, and other requirements. The problem is difficult in view of the fact that the system is nonlinear, time-varying, building-dependent, and uncertain, and that the direct control of a large number of HVAC components is difficult. In this work, HVAC setpoints are the control variables, developed on top of a Direct Digital Control (DDC) system. A method that combines Lagrangian relaxation, neural networks, stochastic dynamic programming, and heuristics is developed to predict the system dynamics and uncontrollable load, and to optimize the setpoints. Numerical testing and prototype implementation results show that our method can effectively reduce total costs, manage uncertainties, and shed load, and is computationally efficient. Furthermore, it is significantly better than existing methods.

Topic 2. Power Portfolio Optimization in Deregulated Electricity Markets with Risk Management. In a deregulated electric power system, multiple markets of different time scales exist with various power supply instruments. A load serving entity (LSE) has multiple choices from these instruments to meet its load obligations. In view of the large amount of power involved, the complex market structure, the risks in such volatile markets, the stringent constraints to be satisfied, and the long time horizon, the power portfolio optimization problem is critically important, but difficult, for an LSE seeking to serve the load, maximize its profit, and manage risks. In this topic, a mid-term power portfolio optimization problem with risk management is presented. Key instruments are considered, risk terms based on semi-variances of spot market transactions are introduced, and penalties on load obligation violations are added to the objective function to improve algorithm convergence and constraint satisfaction. To overcome the inseparability of the resulting problem, a surrogate optimization framework is developed, enabling a decomposition and coordination approach. Numerical testing results show that our method effectively provides decisions for the various instruments to maximize profit and manage risks, and is computationally efficient.
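A small numerical sketch of a semi-variance risk term of the kind mentioned for spot-market transactions: only outcomes worse than the mean (here, costs above the mean) contribute to the risk measure. The prices are made up, and the dissertation's full formulation is not reproduced.

```python
# Downside semi-variance of spot-market cost scenarios (invented numbers).
def semi_variance(costs):
    """Average squared deviation above the mean cost (downside risk for a buyer)."""
    mean_cost = sum(costs) / len(costs)
    downside = [(c - mean_cost) ** 2 for c in costs if c > mean_cost]
    return sum(downside) / len(costs)

if __name__ == "__main__":
    spot_costs = [42.0, 45.5, 39.0, 61.0, 44.0, 58.5]   # $/MWh scenarios
    print(f"mean = {sum(spot_costs) / len(spot_costs):.2f}, "
          f"semi-variance = {semi_variance(spot_costs):.2f}")
```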
Exploring Clinical Supervision as Instrument for Effective Teacher Supervision
ERIC Educational Resources Information Center
Ibara, E. C.
2013-01-01
This paper examines clinical supervision approaches that have the potential to promote and implement effective teacher supervision in Nigeria. The various approaches have been analysed based on the conceptual framework of instructional supervisory behavior. The findings suggest that a clear distinction can be made between the prescriptive and…
Development and Validation of the Educational Technologist Multimedia Competency Survey
ERIC Educational Resources Information Center
Ritzhaupt, Albert D.; Martin, Florence
2014-01-01
The purpose of this research study was to identify the multimedia competencies of an educational technologist by creating a valid and reliable survey instrument to administer to educational technology professionals. The educational technology multimedia competency survey developed through this research is based on a conceptual framework that…
Measuring Organisational Capabilities in the Higher Education Sector
ERIC Educational Resources Information Center
Bobe, Belete J.; Kober, Ralph
2015-01-01
Purpose: Drawing on the resource-based view (RBV), the purpose of this paper is to develop a framework and instrument to measure the organisational capabilities of university schools/departments. In doing so, this study provides evidence of the way competitive resources are bundled to generate organisational capabilities that give university…
The Role of Instruments in Three Chemical Revolutions
ERIC Educational Resources Information Center
Chamizo, José Antonio
2014-01-01
This paper attempts to show one of the ways the history of chemistry can be made teachable for chemistry teachers, meaning something more than an undifferentiated mass of names and dates, by establishing a temporal framework based on chemical entities that all students use. It represents a difficult equilibrium between over-simplification versus…
A Fair Trade Approach to Community Forest Certification? A Framework for Discussion
ERIC Educational Resources Information Center
Taylor, Peter Leigh
2005-01-01
Forest certification has gained growing attention as a market-based instrument to make globalizing markets a force for mitigating rather than fostering environmental degradation. Yet in practice, market mechanisms currently appear to encourage concentration of forest certification in Northern temperate and boreal forests, rather than in the…
Development and Validation of the Homeostasis Concept Inventory
ERIC Educational Resources Information Center
McFarland, Jenny L.; Price, Rebecca M.; Wenderoth, Mary Pat; Martinková, Patrícia; Cliff, William; Michael, Joel; Modell, Harold; Wright, Ann
2017-01-01
We present the Homeostasis Concept Inventory (HCI), a 20-item multiple-choice instrument that assesses how well undergraduates understand this critical physiological concept. We used an iterative process to develop a set of questions based on elements in the Homeostasis Concept Framework. This process involved faculty experts and undergraduate…
An instrument thermal data base system. [for future shuttle missions
NASA Technical Reports Server (NTRS)
Bartoszek, J. T.; Csigi, K. I.; Ollendorf, S.; Oberright, J. E.
1981-01-01
The rationale for the implementation of an Instrument Thermal Data Base System (ITDBS) is discussed, and the potential application of a data base management system in support of future space missions, the design of scientific instruments needed, and the potential payload groupings are described. Two basic data files are suggested: the first contains a detailed narrative information list pertaining to the design configuration and optimum performance of each instrument, and the second consists of a description of the parameters pertinent to each instrument's thermal control and design, in the form of a summary record of coded information serving as a recall record. The applicability of a data request sheet for preliminary planning is described, and it is concluded that the proposed system may additionally prove to be a method of inventory control.
Development of an Undergraduate Course--Internet-Based Instrumentation and Control
ERIC Educational Resources Information Center
Zhuang, Hanqi; Morgera, Salvatore D.
2007-01-01
The objective, strategy, and implementation details of a new undergraduate course, Internet-based Instrumentation and Control, are presented. The course has a companion laboratory that is supported by the National Science Foundation and industry. The combination is offered to senior-level undergraduate engineering students interested in sensing,…
Hartmann, Christine W; Palmer, Jennifer A; Mills, Whitney L; Pimentel, Camilla B; Allen, Rebecca S; Wewiorski, Nancy J; Dillon, Kristen R; Snow, A Lynn
2017-08-01
Enhanced interpersonal relationships and meaningful resident engagement in daily life are central to nursing home cultural transformation, yet these critical components of person-centered care may be difficult for frontline staff to measure using traditional research instruments. To address the need for easy-to-use instruments to help nursing home staff members evaluate and improve person-centered care, the psychometric method of cognitive-based interviewing was used to adapt a structured observation instrument originally developed for researchers and nursing home surveyors. Twenty-eight staff members from 2 Veterans Health Administration (VHA) nursing homes participated in 1 of 3 rounds of cognitive-based interviews, using the instrument in real-life situations. Modifications to the original instrument were guided by a cognitive processing model of instrument refinement. Following 2 rounds of cognitive interviews, pretesting of the revised instrument, and another round of cognitive interviews, the resulting set of 3 short instruments mirrored the concepts of the original longer instrument but were significantly easier for frontline staff to understand and use. Final results indicated frontline staff found the revised instruments feasible to use and clinically relevant in measuring and improving the lived experience of a changing culture. This article provides a framework for developing or adapting other measurement tools for frontline culture change efforts in nursing homes, in addition to reporting on a practical set of instruments to measure aspects of person-centered care. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Instrumentation and control system for an F-15 stall/spin
NASA Technical Reports Server (NTRS)
Pitts, F. L.; Holmes, D. C. E.; Zaepfel, K. P.
1974-01-01
An instrumentation and control system is described that was used for radio-controlled F-15 airplane model stall/spin research at the NASA-Langley Research Center. This stall/spin research technique, using scale model aircraft, provides information on the post-stall and spin-entry characteristics of full-scale aircraft. The instrumentation described provides measurements of flight parameters such as angle of attack and sideslip, airspeed, control-surface position, and three-axis rotation rates; these data are recorded on an onboard magnetic tape recorder. The proportional radio control system, which utilizes analog potentiometric signals generated from ground-based pilot inputs, and the ground-based system used in the flight operation are also described.
Le Grande, M; Ski, C F; Thompson, D R; Scuffham, P; Kularatna, S; Jackson, A C; Brown, A
2017-08-01
There is growing recognition that in addition to universally recognised domains and indicators of wellbeing (such as population health and life expectancy), additional frameworks are required to fully explain and measure Indigenous wellbeing. In particular, Indigenous Australian wellbeing is largely determined by colonisation, historical trauma, grief, loss, and ongoing social marginalisation. Dominant mainstream indicators of wellbeing based on the biomedical model may therefore be inadequate and not entirely relevant in the Indigenous context. It is possible that "standard" wellbeing instruments fail to adequately assess indicators of health and wellbeing within societies that have a more holistic view of health. The aim of this critical review was to identify, document, and evaluate the use of social and emotional wellbeing measures within the Australian Indigenous community. The instruments were systematically described regarding their intrinsic properties (e.g., generic v. disease-specific, domains assessed, extent of cross-cultural adaptation and psychometric characteristics) and their purpose of utilisation in studies (e.g., study setting, intervention, clinical purpose or survey). We included 33 studies, in which 22 distinct instruments were used. Three major categories of social and emotional wellbeing instruments were identified: unmodified standard instruments (10), cross-culturally adapted standard instruments (6), and Indigenous developed measures (6). Recommendations are made for researchers and practitioners who assess social and emotional wellbeing in Indigenous Australians, which may also be applicable to other minority groups where a more holistic framework of wellbeing is applied. It is advised that standard instruments only be used if they have been subject to a formal cross-cultural adaptation process, and Indigenous developed measures continue to be developed, refined, and validated within a diverse range of research and clinical settings. Copyright © 2017 Elsevier Ltd. All rights reserved.
Reliability and validity of a Tutorial Group Effectiveness Instrument.
Singaram, Veena S; Van Der Vleuten, Cees P M; Van Berkel, Henk; Dolmans, Diana H J M
2010-01-01
Tutorial group effectiveness is essential for the success of learning in problem-based learning (PBL). Less effective and dysfunctional groups compromise the quality of students' learning in PBL. This article reports on the reliability and validity of an instrument designed to measure tutorial group effectiveness in PBL. The items within the instrument are clustered around motivational and cognitive factors based on Slavin's theoretical framework. A confirmatory factor analysis (CFA) was carried out to estimate the validity of the instrument. Furthermore, generalizability studies were conducted and alpha coefficients were computed to determine the reliability and homogeneity of each factor. The CFA indicated that a three-factor model comprising 19 items showed a good fit with the data. Alpha coefficients per factor were high. The findings of the generalizability studies indicated that at least 9-10 student responses are needed in order to obtain reliable data at the tutorial group level. The instrument validated in this study has the potential to provide faculty and students with diagnostic information and feedback about student behaviors that enhance and hinder tutorial group effectiveness.
Chalmers, Joanne; Deckert, Stefanie; Schmitt, Jochen
2015-06-01
This article describes the core outcome set (COS) for atopic eczema trials. A COS describes a minimum set of outcomes to be assessed in a defined situation. COS are required to overcome the current situation in which different trials use different endpoints with unclear or insufficient measurement properties, resulting in incomparable trials. The global multi-stakeholder Harmonising Outcome Measures for Eczema (HOME) initiative developed the HOME roadmap as a generic framework for COS development. Following the establishment of a panel representing all stakeholders, a core set of outcome domains needs to be selected based on systematic reviews and consensus methods. Outcome measurement instruments to assess these core domains need to be valid, reliable, and feasible. There is broad global consensus that clinical signs, quality of life, symptoms, and long-term control of flares form the COS for atopic eczema trials. The Eczema Area and Severity Index is recommended to assess clinical signs in atopic eczema trials. Systematic reviews to identify adequate outcome measurement instruments for the other core outcome domains are underway. Clinical signs should be assessed in all atopic eczema trials by at least the Eczema Area and Severity Index. Quality of life, symptoms, and flares should also be assessed in all atopic eczema trials by a valid, reliable, and feasible instrument.
NASA Astrophysics Data System (ADS)
Sundberg, R.; Moberg, A.; Hind, A.
2012-08-01
A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.
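As an illustrative aside to the abstract above, the sketch below mimics the two tests it describes on synthetic data: a temporal correlation check between a proxy series and a forced simulation, and a quadratic distance measure comparing forced and unforced runs. The random "forcing", noise levels, and the simple mean-squared distance are toy assumptions, not the paper's joint statistical model.

```python
import numpy as np

rng = np.random.default_rng(0)
years = 1000
forcing = np.cumsum(rng.normal(0, 0.02, years))       # shared externally forced component (toy)
proxy = forcing + rng.normal(0, 0.5, years)           # noisy proxy/instrumental series
forced_sim = forcing + rng.normal(0, 0.3, years)      # simulation driven by the same forcing
unforced_sims = [np.cumsum(rng.normal(0, 0.02, years)) + rng.normal(0, 0.3, years)
                 for _ in range(5)]                    # control runs: internal variability only

# Test 1: is there a significant temporal correlation between proxy and forced simulation?
r = np.corrcoef(proxy, forced_sim)[0, 1]

# Test 2: is the forced run closer to the proxy than the unforced runs,
# using a simple quadratic distance measure?
def distance(sim, obs):
    return np.mean((sim - obs) ** 2)

d_forced = distance(forced_sim, proxy)
d_unforced = np.array([distance(s, proxy) for s in unforced_sims])
print(f"r = {r:.2f}; forced distance = {d_forced:.2f}; "
      f"unforced distances = {d_unforced.round(2)}")
```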
WFIRST: Coronagraph Systems Engineering and Performance Budgets
NASA Astrophysics Data System (ADS)
Poberezhskiy, Ilya; cady, eric; Frerking, Margaret A.; Kern, Brian; Nemati, Bijan; Noecker, Martin; Seo, Byoung-Joon; Zhao, Feng; Zhou, Hanying
2018-01-01
The WFIRST coronagraph instrument (CGI) will be the first in-space coronagraph using active wavefront control to directly image and characterize mature exoplanets and zodiacal disks in reflected starlight. For CGI systems engineering, including requirements development, CGI performance is predicted using a hierarchy of performance budgets to estimate the various noise components (spatial and temporal flux variations) that obscure exoplanet signals in direct imaging and spectroscopy configurations. These performance budgets are validated through robust integrated modeling and testbed model validation efforts. We present the performance budgeting framework used by WFIRST for the flow-down of coronagraph science requirements, mission constraints, and observatory interfaces to measurable instrument engineering parameters.
ORAC: a modern observing system for UKIRT
NASA Astrophysics Data System (ADS)
Bridger, Alan; Wright, Gillian S.; Economou, Frossie; Tan, Min; Currie, Malcolm J.; Pickup, David A.; Adamson, Andrew J.; Rees, Nicholas P.; Purves, Maren; Kackley, Russell
2000-06-01
The steady improvement in telescope performance at UKIRT and the increase in data acquisition rates led to a strong desire for an integrated observing framework that would meet the needs of future instrumentation, as well as provide some support for existing instrumentation. Thus the Observatory Reduction and Acquisition Control (ORAC) project was created in 1997 with the goals of improving the scientific productivity of the telescope, reducing the overall ongoing support requirements, and eventually supporting the use of more flexibly scheduled observing. The project was also expected to achieve this within a tight resource allocation. In October 1999 the ORAC system was commissioned at the United Kingdom Infrared Telescope.
Dellefield, Mary Ellen; Corazzini, Kirsten
2015-01-01
Development of the comprehensive care plan (CCP) is a requirement for nursing homes participating in the federal Medicare and Medicaid programs, referred to as skilled nursing facilities. The plan must be developed within the context of the comprehensive interdisciplinary assessment framework—the Resident Assessment Instrument (RAI). Consistent compliance with this requirement has been difficult to achieve. To improve the quality of CCP development within this framework, an increased understanding of complex factors contributing to inconsistent compliance is required. In this commentary, we examine the history of the comprehensive care plan; its development within the RAI framework; linkages between the RAI and registered nurse staffing; empirical evidence of the CCP’s efficacy; and the limitations of extant standards of practices in CCP development. Because of the registered nurse’s educational preparation, professional practice standards, and licensure obligations, the essential contributions of professional nurses in CCP development are emphasized. Recommendations for evidence-based micro and macro level practice changes with the potential to improve the quality of CCP development and regulatory compliance are presented. Suggestions for future research are given. PMID:27417811
ERIC Educational Resources Information Center
Lao, Huei-Chen
2016-01-01
In this quantitative study, a survey was developed and administered to middle and high school teachers to examine what factors motivated them to implement problem-based learning (PBL). Using Expectancy-Value Theory by Eccles et al. (1983) and Self-Determination Theory by Ryan and Deci (2000b) as the theoretical framework, this instrument measured…
Advanced process control framework initiative
NASA Astrophysics Data System (ADS)
Hill, Tom; Nettles, Steve
1997-01-01
The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide the necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking knowledge of key process settings to desired product characteristics, captured in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. Each has complementary goals and expertise to contribute: Honeywell represents the supplier viewpoint, AMD represents the user with 'real customer requirements', and SEMATECH provides a consensus-building organization that widely disseminates technology to suppliers and users in the semiconductor industry that face similar equipment and factory control systems challenges.
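The abstract above does not give the control law itself; as a hedged illustration of model-based process control in this setting, the sketch below implements a generic EWMA run-to-run controller, a technique commonly used in semiconductor APC. The target, gain, filter weight, and drift values are invented for the example.

```python
# Minimal EWMA run-to-run controller sketch (illustrative values only).
import numpy as np

target = 100.0      # desired product characteristic (e.g., film thickness, nm) - assumption
gain = 2.0          # assumed linear process model: output = gain * setting + offset
lam = 0.4           # EWMA filter weight
offset_est = 0.0    # running estimate of the model offset (tool drift)

rng = np.random.default_rng(1)
true_offset = 5.0
for run in range(10):
    setting = (target - offset_est) / gain           # invert the model to pick the recipe setting
    output = gain * setting + true_offset + rng.normal(0, 0.5)
    offset_est = lam * (output - gain * setting) + (1 - lam) * offset_est
    true_offset += 0.3                               # slow tool drift between runs
    print(f"run {run}: setting={setting:.2f} output={output:.2f}")
```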
Müller-Staub, Maria; Lunney, Margaret; Odenbreit, Matthias; Needham, Ian; Lavin, Mary Ann; van Achterberg, Theo
2009-04-01
This paper reports the development stages of an audit instrument to assess standardised nursing language. Because research-based instruments were not available, the instrument Quality of documentation of nursing Diagnoses, Interventions and Outcomes (Q-DIO) was developed. Standardised nursing language such as nursing diagnoses, interventions and outcomes is being implemented worldwide and will be crucial for the electronic health record. The literature showed a lack of audit instruments to assess the quality of standardised nursing language in nursing documentation. A qualitative design was used for instrument development. Criteria were first derived from a theoretical framework and literature reviews. Second, the criteria were operationalised into items, and eight experts assessed the face and content validity of the Q-DIO. Criteria were developed and operationalised into 29 items. For each item, a three- or five-point scale was applied. The experts supported content validity and showed 88.25% agreement for the scores assigned to the 29 items of the Q-DIO. The Q-DIO provides a literature-based audit instrument for nursing documentation. The strength of the Q-DIO is its ability to measure the quality of nursing diagnoses and related interventions and nursing-sensitive patient outcomes. Further testing of the Q-DIO is recommended. Based on the results of this study, the Q-DIO provides an audit instrument to be used in clinical practice. Its criteria can set the stage for electronic nursing documentation in electronic health records.
NASA Astrophysics Data System (ADS)
Hadzaman, N. A. H.; Takim, R.; Nawawi, A. H.; Mohamad Yusuwan, N.
2018-04-01
A BIM governance assessment instrument supports the analysis needed to develop a BIM governance solution that tackles existing problems in team collaboration on BIM-based projects. Despite the deployment of integrative technologies in the construction industry, particularly BIM, uptake is still insufficient compared with other sectors. Several studies have established the requirements for BIM implementation, covering both technical and non-technical adoption issues. However, these data are regarded as inadequate for developing a BIM governance framework. Hence, the objective of this paper is to evaluate the content validity of the BIM governance instrument prior to the main data collection. Two methods were employed: a literature review and a questionnaire survey. Based on the literature review, 273 items across six main constructs are suggested for inclusion in the BIM governance instrument. The Content Validity Ratio (CVR) scores revealed that 202 of the 273 items are considered most critical by the content experts. The Item-Level Content Validity Index (I-CVI) and Modified Kappa Coefficient, however, indicated that 257 items in the BIM governance instrument are appropriate and excellent. The instrument is highly reliable for future strategies and the development of BIM projects in Malaysia.
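For readers unfamiliar with the content-validity statistics named above, the minimal sketch below computes CVR, I-CVI, and a chance-adjusted (modified) kappa from hypothetical expert ratings; the rating counts are made up and are not the study's data.

```python
# Content-validity statistics sketch: CVR, I-CVI, and modified kappa (hypothetical ratings).
from math import comb

def cvr(n_essential, n_experts):
    """Lawshe's Content Validity Ratio."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

def i_cvi(n_relevant, n_experts):
    """Item-level Content Validity Index: proportion of experts rating the item relevant."""
    return n_relevant / n_experts

def modified_kappa(n_relevant, n_experts):
    """I-CVI adjusted for the probability of chance agreement."""
    p_chance = comb(n_experts, n_relevant) * 0.5 ** n_experts
    return (i_cvi(n_relevant, n_experts) - p_chance) / (1 - p_chance)

n_experts = 10          # hypothetical panel size
n_relevant = 9          # hypothetical number rating the item essential/relevant
print(f"CVR = {cvr(n_relevant, n_experts):.2f}, "
      f"I-CVI = {i_cvi(n_relevant, n_experts):.2f}, "
      f"kappa* = {modified_kappa(n_relevant, n_experts):.3f}")
```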
Organizational Capabilities for Integrating Care: A Review of Measurement Tools.
Evans, Jenna M; Grudniewicz, Agnes; Baker, G Ross; Wodchis, Walter P
2016-12-01
The success of integrated care interventions is highly dependent on the internal and collective capabilities of the organizations in which they are implemented. Yet, organizational capabilities are rarely described, understood, or measured with sufficient depth and breadth in empirical studies or in practice. Assessing these capabilities can contribute to understanding why some integrated care interventions are more effective than others. We identified, organized, and assessed survey instruments that measure the internal and collective organizational capabilities required for integrated care delivery. We conducted an expert consultation and searched Medline and Google Scholar databases for survey instruments measuring factors outlined in the Context and Capabilities for Integrating Care Framework. A total of 58 instruments were included in the review and assessed based on their psychometric properties, practical considerations, and applicability to integrated care efforts. This study provides a bank of psychometrically sound instruments for describing and comparing organizational capabilities. Greater use of these instruments across integrated care interventions and studies can enhance standardized comparative analyses and inform change management. Further research is needed to build an evidence base for these instruments and to explore the associations between organizational capabilities and integrated care processes and outcomes. © The Author(s) 2016.
Microprocessor-based interface for oceanography
NASA Technical Reports Server (NTRS)
Hansen, G. R.
1979-01-01
An ocean floor imaging system incorporates five identical microprocessor-based interface units, each assigned to a specific sonar instrument, to simplify the system. A central control module based on the same microprocessor eliminates the need for custom-tailored hardware interfaces for each instrument.
Autonomy for SOHO Ground Operations
NASA Technical Reports Server (NTRS)
Truszkowski, Walt; Netreba, Nick; Ginn, Don; Mandutianu, Sanda; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
The SOLAR and HELIOSPHERIC OBSERVATORY (SOHO) project [SOHO Web Page] is being carried out by the European Space Agency (ESA) and the US National Aeronautics and Space Administration (NASA) as a cooperative effort between the two agencies in the framework of the Solar Terrestrial Science Program (STSP) comprising SOHO and other missions. SOHO was launched on December 2, 1995. The SOHO spacecraft was built in Europe by an industry team led by Matra, and instruments were provided by European and American scientists. There are nine European Principal Investigators (PI's) and three American ones. Large engineering teams and more than 200 co-investigators from many institutions support the PI's in the development of the instruments and in the preparation of their operations and data analysis. NASA is responsible for the launch and mission operations. Large radio dishes around the world, which form NASA's Deep Space Network (DSN), are used to track the spacecraft beyond the Earth's orbit. Mission control is based at Goddard Space Flight Center in Maryland. The agent group at the NASA Goddard Space Flight Center, in collaboration with JPL, is currently involved with the design and development of an agent-based system to provide intelligent interactions with the control center personnel for SOHO. The basic approach that is being taken is to develop a sub-community of agents for each major subsystem of SOHO and to integrate these sub-communities into an overall SOHO community. Agents in all sub-communities will be capable of advanced understanding (deep reasoning) of the associated spacecraft subsystem.
Structure-based control of complex networks with nonlinear dynamics.
Zañudo, Jorge Gomez Tejeda; Yang, Gang; Albert, Réka
2017-07-11
What can we learn about controlling a system solely from its underlying network structure? Here we adapt a recently developed framework for control of networks governed by a broad class of nonlinear dynamics that includes the major dynamic models of biological, technological, and social processes. This feedback-based framework provides realizable node overrides that steer a system toward any of its natural long-term dynamic behaviors, regardless of the specific functional forms and system parameters. We use this framework on several real networks, identify the topological characteristics that underlie the predicted node overrides, and compare its predictions to those of structural controllability in control theory. Finally, we demonstrate this framework's applicability in dynamic models of gene regulatory networks and identify nodes whose override is necessary for control in the general case but not in specific model instances.
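As a toy illustration of steering a nonlinear (here Boolean) network with a node override, the sketch below pins one node of a three-node network and shows the system settling into a different fixed point than it reaches unperturbed. The network and the chosen override are invented for the example and are unrelated to the paper's case studies.

```python
# Toy Boolean network with a node override that steers it to a chosen fixed point.
def step(state, override=None):
    a, b, c = state
    new = (b and not c,     # A <- B AND NOT C
           a,               # B <- A
           not a)           # C <- NOT A
    if override is not None:                   # pin one node to a fixed value
        node, value = override
        new = tuple(value if i == node else v for i, v in enumerate(new))
    return new

state = (False, False, True)                   # unperturbed, this is already a fixed point
for _ in range(6):
    state = step(state, override=(0, True))    # hold node A at ON
print("steady state with A pinned ON:", state) # converges to (True, True, False)
```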
The PROactive innovative conceptual framework on physical activity.
Dobbels, Fabienne; de Jong, Corina; Drost, Ellen; Elberse, Janneke; Feridou, Chryssoula; Jacobs, Laura; Rabinovich, Roberto; Frei, Anja; Puhan, Milo A; de Boer, Willem I; van der Molen, Thys; Williams, Kate; Pinnock, Hillary; Troosters, Thierry; Karlsson, Niklas; Kulich, Karoly; Rüdell, Katja
2014-11-01
Although physical activity is considered an important therapeutic target in chronic obstructive pulmonary disease (COPD), what "physical activity" means to COPD patients and how their perspective is best measured is poorly understood. We designed a conceptual framework, guiding the development and content validation of two patient reported outcome (PRO) instruments on physical activity (PROactive PRO instruments). 116 patients from four European countries with diverse demographics and COPD phenotypes participated in three consecutive qualitative studies (63% male, age mean±sd 66±9 years, 35% Global Initiative for Chronic Obstructive Lung Disease stage III-IV). 23 interviews and eight focus groups (n = 54) identified the main themes and candidate items of the framework. 39 cognitive debriefings allowed the clarity of the items and instructions to be optimised. Three themes emerged, i.e. impact of COPD on amount of physical activity, symptoms experienced during physical activity, and adaptations made to facilitate physical activity. The themes were similar irrespective of country, demographic or disease characteristics. Iterative rounds of appraisal and refinement of candidate items resulted in 30 items with a daily recall period and 34 items with a 7-day recall period. For the first time, our approach provides comprehensive insight on physical activity from the COPD patients' perspective. The PROactive PRO instruments' content validity represents the pivotal basis for empirically based item reduction and validation. ©ERS 2014.
Subjective health literacy: Development of a brief instrument for school-aged children.
Paakkari, Olli; Torppa, Minna; Kannas, Lasse; Paakkari, Leena
2016-12-01
The present paper focuses on the measurement of health literacy (HL), which is an important determinant of health and health behaviours. HL starts to develop in childhood and adolescence; hence, there is a need for instruments to monitor HL among younger age groups. These instruments are still rare. The aim of the project reported here was, therefore, to develop a brief, multidimensional, theory-based instrument to measure subjective HL among school-aged children. The development of the instrument covered four phases: item generation based on a conceptual framework; a pilot study (n = 405); test-retest (n = 117); and construction of the instrument (n = 3853). All the samples were taken from Finnish 7th and 9th graders. Initially, 65 items were generated, of which 32 items were selected for the pilot study. After item reduction, the instrument contained 16 items. The test-retest phase produced estimates of stability. In the final phase a 10-item instrument was constructed, referred to as Health Literacy for School-Aged Children (HLSAC). The instrument exhibited a high Cronbach alpha (0.93), and included two items from each of the five predetermined theoretical components (theoretical knowledge, practical knowledge, critical thinking, self-awareness, citizenship). The iterative and validity-driven development process made it possible to construct a brief multidimensional HLSAC instrument. Such instruments are suitable for large-scale studies, and for use with children and adolescents. Validation will require further testing for use in other countries.
An Attribute Based Access Control Framework for Healthcare System
NASA Astrophysics Data System (ADS)
Afshar, Majid; Samet, Saeed; Hu, Ting
2018-01-01
Nowadays, access control is an indispensable part of the Personal Health Record and ensures its confidentiality by enforcing policies and rules so that only authorized users gain access to requested resources in the system. In other words, access control means protecting patient privacy in healthcare systems. Attribute-Based Access Control (ABAC) is a newer access control model that can be used instead of traditional types of access control such as Discretionary Access Control, Mandatory Access Control, and Role-Based Access Control. During the last five years, ABAC has found applications in both academic research and industry. ABAC makes a decision on an access request according to the attributes of the user and the requested resource. In this paper, we propose an ABAC framework for healthcare systems. We use the ABAC engine for rendering and enforcing healthcare policies. Moreover, we handle emergency situations in this framework.
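A minimal sketch of the attribute-based decision idea described above: a policy is a predicate over user, resource, and environment attributes, and the engine permits a request if any policy rule matches. The attribute names and the single physician-record rule are illustrative assumptions, not the proposed framework's policy set.

```python
# Minimal ABAC decision sketch with one illustrative rule (hypothetical attributes).
def abac_decide(user, resource, environment, policies):
    return "Permit" if any(rule(user, resource, environment) for rule in policies) else "Deny"

policies = [
    # Treating physicians may read their own patients' records during work hours.
    lambda u, r, e: (u["role"] == "physician"
                     and u["id"] in r["care_team"]
                     and r["type"] == "health_record"
                     and 8 <= e["hour"] < 18),
]

user = {"id": "dr-17", "role": "physician"}
resource = {"type": "health_record", "care_team": {"dr-17", "nurse-4"}}
print(abac_decide(user, resource, {"hour": 10}, policies))   # Permit
print(abac_decide(user, resource, {"hour": 22}, policies))   # Deny
```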
An evaluation framework for participatory modelling
NASA Astrophysics Data System (ADS)
Krueger, T.; Inman, A.; Chilvers, J.
2012-04-01
Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models. And we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics). We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in the UK.
A Framework for Context Sensitive Risk-Based Access Control in Medical Information Systems
Choi, Donghee; Kim, Dohoon; Park, Seog
2015-01-01
Since the access control environment has changed and the threat of insider information leakage has come to the fore, studies on risk-based access control models that decide access permissions dynamically have been conducted vigorously. Medical information systems should protect sensitive data such as medical information from insider threat and enable dynamic access control depending on the context such as life-threatening emergencies. In this paper, we suggest an approach and framework for context sensitive risk-based access control suitable for medical information systems. This approach categorizes context information, estimating and applying risk through context- and treatment-based permission profiling and specifications by expanding the eXtensible Access Control Markup Language (XACML) to apply risk. The proposed framework supports quick responses to medical situations and prevents unnecessary insider data access through dynamic access authorization decisions in accordance with the severity of the context and treatment. PMID:26075013
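As a hedged companion to the abstract above, the sketch below shows one way a context-sensitive risk score could gate access, relaxing the effective risk in life-threatening contexts. The weights, categories, and threshold are invented; the actual framework expresses such logic through XACML extensions.

```python
# Illustrative context-sensitive, risk-based access decision (made-up weights).
RISK_BY_CONTEXT = {"routine": 0.6, "urgent": 0.3, "life_threatening": 0.1}  # lower = more permissive
SENSITIVITY = {"demographics": 0.2, "diagnosis": 0.5, "psychiatric_notes": 0.9}

def decide(context, data_type, requester_on_care_team, max_risk=0.5):
    risk = RISK_BY_CONTEXT[context] * SENSITIVITY[data_type]
    if not requester_on_care_team:
        risk += 0.3                      # access from outside the care team is riskier
    return ("Permit" if risk <= max_risk else "Deny"), round(risk, 2)

print(decide("life_threatening", "psychiatric_notes", requester_on_care_team=False))  # Permit in emergency
print(decide("routine", "psychiatric_notes", requester_on_care_team=False))           # Deny
```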
Information Power Grid (IPG) Tutorial 2003
NASA Technical Reports Server (NTRS)
Meyers, George
2003-01-01
For NASA and the general community today, Grid middleware: a) provides tools to access and use data sources (databases, instruments, ...); b) provides tools to access computing (unique and generic); c) is an enabler of large-scale collaboration. Dynamically responding to needs is a key selling point of a grid: independent resources can be joined as appropriate to solve a problem. The middleware provides tools to enable the building of frameworks for applications, and provides value-added services to the NASA user base for utilizing resources on the grid in new and more efficient ways.
A framework to explore the knowledge structure of multidisciplinary research fields.
Uddin, Shahadat; Khan, Arif; Baur, Louise A
2015-01-01
Understanding the emerging areas of a multidisciplinary research field is crucial for researchers, policymakers and other stakeholders. For them, a knowledge structure based on longitudinal bibliographic data can be an effective instrument. But with the vast amount of information available online, it is often hard to derive such a knowledge structure from the data. In this paper, we present a novel approach for retrieving online bibliographic data and propose a framework for exploring knowledge structure. We also present several longitudinal analyses to interpret and visualize the last 20 years of published obesity research data.
How To Control Color Appearance With Instrumentation
NASA Astrophysics Data System (ADS)
Burns, Margaret E.
1980-05-01
Colorimetry, as defined by the International Commission on Illumination, is the measurement of colors, made possible by the properties of the eye and based on a set of conventions. Instrumentation for measuring object color, therefore, must be based on a human observer. The intent is to design an instrument that in effect responds as a person would, so that research and development, production control, and quality control areas have some means of assessing the acceptability of the appearance of a product. Investigations of a human observer's psychological response to color, and the manner in which visual observations are made, give the instrument designer and manufacturer the data necessary to answer two questions: a. How can we put numbers (instrument read-out) on a perception that occurs in the brain of the observer? b. What can we learn from examination of a visual observing situation that will guide us in our design of an instrumental simulation of this situation? Involving as it does our own daily, almost unconscious, practice of making judgments concerning the things we see, the design and manufacture of color measurement instruments is an exceedingly interesting field. The advances being made concurrently today in research concerning human color vision and in optical and electronic technology will make possible increasingly useful instrumentation for quality control of product color.
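The instrumental simulation of the observer that the abstract describes ultimately reduces to the CIE tristimulus integration: weighting the measured spectral reflectance by an illuminant and the standard-observer colour-matching functions. The sketch below shows that calculation with coarse five-band placeholder tables rather than the real CIE 1931 data.

```python
# CIE tristimulus calculation sketch (five-band placeholder tables, not real CIE data).
import numpy as np

wavelengths = np.array([450, 500, 550, 600, 650])            # nm, toy sampling
illuminant  = np.array([90., 100., 105., 100., 90.])         # relative spectral power S(lambda)
xbar = np.array([0.34, 0.005, 0.43, 1.06, 0.28])             # colour-matching functions (approximate)
ybar = np.array([0.04, 0.32, 0.99, 0.63, 0.11])
zbar = np.array([1.77, 0.27, 0.01, 0.00, 0.00])
reflectance = np.array([0.2, 0.3, 0.6, 0.7, 0.65])           # measured object reflectance R(lambda)

k = 100.0 / np.sum(illuminant * ybar)                        # normalise so a perfect white has Y = 100
X = k * np.sum(illuminant * reflectance * xbar)
Y = k * np.sum(illuminant * reflectance * ybar)
Z = k * np.sum(illuminant * reflectance * zbar)
print(f"X={X:.1f} Y={Y:.1f} Z={Z:.1f}")
```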
A system framework of inter-enterprise machining quality control based on fractal theory
NASA Astrophysics Data System (ADS)
Zhao, Liping; Qin, Yongtao; Yao, Yiyong; Yan, Peng
2014-03-01
In order to meet the quality control requirements of dynamic and complicated product machining processes among enterprises, a system framework for inter-enterprise machining quality control based on fractal theory was proposed. In this framework, the fractal-specific characteristics of the inter-enterprise machining quality control function were analysed, and a model of inter-enterprise machining quality control was constructed from the nature of fractal structures. Furthermore, a goal-driven strategy for inter-enterprise quality control and a dynamic organisation strategy for inter-enterprise quality improvement were constructed through a characteristic analysis of this model. In addition, an architecture for inter-enterprise machining quality control based on fractal theory was established by means of Web services. Finally, a case study of the application is presented. The results showed that the proposed method is feasible and can provide guidance for quality control and support for product reliability in inter-enterprise machining processes.
Simultaneous operation and control of about 100 telescopes for the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Wegner, P.; Colomé, J.; Hoffmann, D.; Houles, J.; Köppel, H.; Lamanna, G.; Le Flour, T.; Lopatin, A.; Lyard, E.; Melkumyan, D.; Oya, I.; Panazol, L.-I.; Punch, M.; Schlenstedt, S.; Schmidt, T.; Stegmann, C.; Schwanke, U.; Walter, R.; Consortium, CTA
2012-12-01
The Cherenkov Telescope Array (CTA) project is an initiative to build the next generation ground-based very high energy (VHE) gamma-ray instrument. Compared to current imaging atmospheric Cherenkov telescope experiments, CTA will extend the energy range and improve the angular resolution while increasing the sensitivity by up to a factor of 10. With about 100 separate telescopes it will be operated as an observatory open to a wide astrophysics and particle physics community, providing a deep insight into the non-thermal high-energy universe. The CTA Array Control system (ACTL) is responsible for several essential control tasks supporting the evaluation and selection of proposals, as well as the preparation, scheduling, and finally the execution of observations with the array. A possible basic distributed software framework for ACTL being considered is the ALMA Common Software (ACS). The ACS framework follows a container-component model and contains a high-level abstraction layer to integrate different types of devices. To achieve a low-level consolidation of connecting control hardware, OPC UA (OPC Unified Architecture) client functionality is integrated directly into ACS, thus allowing interaction with other OPC UA capable hardware. The CTA Data Acquisition System comprises the data readout of all cameras and the transfer of the data to a camera server farm, using standard hardware and software technologies. CTA array control also covers concepts for a possible array trigger system and the corresponding clock distribution. The design of the CTA observation scheduler introduces new algorithmic techniques to achieve the required flexibility.
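As a hedged illustration of the kind of low-level OPC UA access mentioned above, the snippet below reads a single value from an OPC UA server using the third-party python-opcua package; the endpoint URL and node identifier are hypothetical, and the actual ACTL design embeds OPC UA clients inside ACS components rather than standalone scripts.

```python
# Illustrative OPC UA read using the python-opcua package (hypothetical endpoint and node id).
from opcua import Client

client = Client("opc.tcp://drive-controller.example:4840")   # hypothetical telescope drive PLC
try:
    client.connect()
    azimuth_node = client.get_node("ns=2;s=Drive.Azimuth")    # hypothetical node identifier
    print("azimuth =", azimuth_node.get_value())
finally:
    client.disconnect()
```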
Crisis in the Restructuring of China's Vocational Education System, 1980-2010
ERIC Educational Resources Information Center
Luo, Yan
2013-01-01
This article examines the origins of China's vocational education system and the restructuring of the system since 1980, finding that this thirty-year systemic restructuring was based on a framework of instrumental rationalism, but did not connect effectively with the building of a modern enterprise system. During this critical period in upgrading…
ERIC Educational Resources Information Center
Liu, Yujuan; Ferrell, Brent; Barbera, Jack; Lewis, Jennifer E.
2017-01-01
Fundamentally concerned with motivation, self-determination theory (SDT) represents a framework of several mini-theories to explore how social context interacts with people's motivational types categorized by degree of regulation internalization. This paper aims to modify an existing theory-based instrument (Academic Motivation Scale, or AMS) and…
ERIC Educational Resources Information Center
Wang, Xinghua; Zhou, Ji; Shen, Jiliang
2016-01-01
This article reports a study that is based on the framework of personal epistemology proposed by Kuhn, Cheney, and Weinstock (2000). The instrument developed by Kuhn et al. (2000) for assessing the three positions (absolutist, multiplist and evaluativist) of epistemological understanding across five judgements' domains was translated and…
Evaluating health-promoting schools in Hong Kong: development of a framework.
Lee, Albert; Cheng, Frances F K; St Leger, Lawry
2005-06-01
Health-promoting schools (HPS)/healthy schools have existed internationally for about 15 years. Yet there are few comprehensive evaluation frameworks available which enable the outcomes of HPS initiatives to be assessed. This paper identifies an evaluation framework developed in Hong Kong. The framework uses a range of approaches to explore what schools actually do in their health promotion and health education initiatives. The framework, which is based on the WHO (Western Pacific Regional Office) Guidelines for HPS, is described in detail. The appropriate instruments for data collection are described and their origins identified. The evaluation plan and protocol, which underpinned the very comprehensive evaluation in Hong Kong, are explained. Finally, a case is argued for evaluation of HPS to be more in line with the educational dynamics of schools and the research literature on effective schooling, rather than focusing primarily on health-related measures.
LabVIEW-based control software for para-hydrogen induced polarization instrumentation.
Agraz, Jose; Grunfeld, Alexander; Li, Debiao; Cunningham, Karl; Willey, Cindy; Pozos, Robert; Wagner, Shawn
2014-04-01
The elucidation of cell metabolic mechanisms is the modern underpinning of the diagnosis, treatment, and in some cases the prevention of disease. Para-hydrogen induced polarization (PHIP) enhances magnetic resonance imaging (MRI) signals over 10,000 fold, allowing for the MRI of cell metabolic mechanisms. This signal enhancement is the result of hyperpolarizing endogenous substances used as contrast agents during imaging. PHIP instrumentation hyperpolarizes Carbon-13 ((13)C) based substances using a process requiring control of a number of factors: chemical reaction timing, gas flow, monitoring of a static magnetic field (B0), radio frequency (RF) irradiation timing, reaction temperature, and gas pressures. Current PHIP instruments control the hyperpolarization process manually, which prevents precise control of the factors listed above and leads to non-reproducible results. We discuss the design and implementation of a LabVIEW-based computer program that automatically and precisely controls the delivery and manipulation of gases and samples while monitoring gas pressures, environmental temperature, and RF sample irradiation. We show that automated control over the hyperpolarization process results in the hyperpolarization of hydroxyethylpropionate. The implementation of this software enables fast prototyping of PHIP instrumentation for the evaluation of a myriad of (13)C-based endogenous contrast agents used in molecular imaging.
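The original software is written in LabVIEW; as a language-neutral sketch of the automated, precisely timed sequence it coordinates, the Python outline below strings together a pressure check, valve timing, and an RF pulse step. All hardware-access functions are hypothetical placeholders standing in for real instrument I/O.

```python
# Sketch of an automated hyperpolarization sequence; hardware calls are placeholders.
import time

def open_valve(name): print(f"open {name}")            # placeholder for real DAQ/valve calls
def close_valve(name): print(f"close {name}")
def read_pressure(line): return 7.2                     # bar, placeholder reading
def fire_rf_pulse_sequence(): print("RF pulse sequence")

def run_hyperpolarization(reaction_time_s=4.0, min_ph2_pressure=7.0):
    if read_pressure("parahydrogen") < min_ph2_pressure:
        raise RuntimeError("para-hydrogen pressure too low")
    open_valve("parahydrogen_inlet")
    time.sleep(reaction_time_s)                          # reproducible reaction timing
    close_valve("parahydrogen_inlet")
    fire_rf_pulse_sequence()                             # transfer spin order to 13C
    open_valve("sample_eject")

run_hyperpolarization()
```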
Method and apparatus for automatic control of a humanoid robot
NASA Technical Reports Server (NTRS)
Abdallah, Muhammad E (Inventor); Platt, Robert (Inventor); Wampler, II, Charles W. (Inventor); Sanders, Adam M (Inventor); Reiland, Matthew J (Inventor)
2013-01-01
A robotic system includes a humanoid robot having a plurality of joints adapted for force control with respect to an object acted upon by the robot, a graphical user interface (GUI) for receiving an input signal from a user, and a controller. The GUI provides the user with intuitive programming access to the controller. The controller controls the joints using an impedance-based control framework, which provides object-level, end-effector-level, and/or joint-space-level control of the robot in response to the input signal. A method for controlling the robotic system includes receiving the input signal via the GUI, e.g., a desired force, and then processing the input signal using a host machine to control the joints via an impedance-based control framework. The framework provides object-level, end-effector-level, and/or joint-space-level control of the robot, and allows for a function-based GUI that simplifies implementation of a myriad of operating modes.
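One common building block of such an impedance-based framework is mapping a desired Cartesian stiffness and damping at the end effector to joint torques through the Jacobian transpose; the sketch below does this for a planar two-link arm. The arm geometry and gains are illustrative assumptions, not the patented controller.

```python
# Cartesian impedance mapped to joint torques via the Jacobian transpose (toy 2-link arm).
import numpy as np

def jacobian_2link(q, l1=0.4, l2=0.3):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def forward_kin(q, l1=0.4, l2=0.3):
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

K = np.diag([300.0, 300.0])      # Cartesian stiffness (N/m), assumed gains
D = np.diag([20.0, 20.0])        # Cartesian damping (N s/m), assumed gains

def impedance_torques(q, dq, x_des):
    J = jacobian_2link(q)
    x, dx = forward_kin(q), J @ dq
    f = K @ (x_des - x) - D @ dx          # desired end-effector force
    return J.T @ f                         # joint torques realising that force

print(impedance_torques(np.array([0.5, 0.8]), np.zeros(2), np.array([0.5, 0.3])))
```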
Empowering people to change occupational behaviours to address critical global issues.
Ikiugu, Moses N; Westerfield, Madeline A; Lien, Jamie M; Theisen, Emily R; Cerny, Shana L; Nissen, Ranelle M
2015-06-01
The greatest threat to human well-being in this century is climate change and related global issues. We examined the effectiveness of the Modified Instrumentalism in Occupational Therapy model as a framework for facilitating occupational behaviour change to address climate change and related issues. Eleven individuals participated in this mixed-methods single-subject-design study. Data were gathered using the Modified Assessment and Intervention Instrument for Instrumentalism in Occupational Therapy and Daily Occupational Inventories. Quantitative data were analyzed using two- and three-standard deviation band methods. Qualitative data were analyzed using heuristic phenomenological procedures. Occupational performance changed for five participants. Participants' feelings shifted from frustration and helplessness to empowerment and a desire for action. They felt empowered to find occupation-based solutions to the global issues. Occupation-based interventions that increase personal awareness of the connection between occupational performance and global issues could empower people to be agents for action to ameliorate the issues.
Development status of the life marker chip instrument for ExoMars
NASA Astrophysics Data System (ADS)
Sims, Mark R.; Cullen, David C.; Rix, Catherine S.; Buckley, Alan; Derveni, Mariliza; Evans, Daniel; Miguel García-Con, Luis; Rhodes, Andrew; Rato, Carla C.; Stefinovic, Marijan; Sephton, Mark A.; Court, Richard W.; Bulloch, Christopher; Kitchingman, Ian; Ali, Zeshan; Pullan, Derek; Holt, John; Blake, Oliver; Sykes, Jonathan; Samara-Ratna, Piyal; Canali, Massimiliano; Borst, Guus; Leeuwis, Henk; Prak, Albert; Norfini, Aleandro; Geraci, Ennio; Tavanti, Marco; Brucato, John; Holm, Nils
2012-11-01
The Life Marker Chip (LMC) is one of the instruments being developed for possible flight on the 2018 ExoMars mission. The instrument uses solvents to extract organic compounds from samples of martian regolith and to transfer the extracts to dedicated detectors based around the use of antibodies. The scientific aims of the instrument are to detect organics in the form of biomarkers that might be associated with extinct life, extant life or abiotic sources of organics. The instrument relies on a novel surfactant-based solvent system and bespoke, commercial and research-developed antibodies against a number of distinct biomarkers or molecular types. The LMC comprises a number of subsystems designed to accept up to four discrete samples of martian regolith or crushed rock, implement the solvent extraction, perform microfluidic-based multiplexed antibody assays for biomarkers and other targets, optically detect the fluorescent output of the assays, and control the internal instrument pressure and temperature, in addition to the associated instrument control electronics and software. The principle of operation, the design and the instrument development status as of December 2011 are reported here. The instrument principle can be extended to other configurations and missions as needed.
NASA Astrophysics Data System (ADS)
Hicks, S. P.; Hill, P.; Goessen, S.; Rietbrock, A.; Garth, T.
2016-12-01
The self-noise level of a broadband seismometer is a commonly used parameter for evaluating instrument performance. There are several independent studies of various instruments' self-noise (e.g. Ringler & Hutt, 2010; Tasič & Runovc, 2012). However, due to ongoing developments in instrument design (i.e. mechanics and electronics), it is essential to regularly assess any changes in self-noise, which could indicate improvements or deterioration in instrument design and performance over time. We present new self-noise estimates for a range of Güralp broadband seismometers (3T, 3ESPC, 40T, 6T). We use the three-channel coherence analysis of Sleeman et al. (2006) to measure the self-noise of these instruments. Based on coherency analysis, we also perform a mathematical rotation of the measured waveforms to account for any relative sensor misalignment errors, which can cause artefacts of amplified self-noise around the microseismic peak (Tasič & Runovc, 2012). The instruments were tested for a period of several months at a seismic vault located at the Eskdalemuir array in southern Scotland. We discuss the implications of these self-noise estimates within the framework of the ambient noise level across the mainland United Kingdom. Using attenuation relationships derived for the United Kingdom, we investigate the detection capability thresholds of the UK National Seismic Network within the framework of a Traffic Light System (TLS) that has been proposed for monitoring induced seismic events due to shale gas extraction.
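The three-channel method cited above derives each sensor's self-noise from auto- and cross-spectra of three co-located recordings, N_ii = P_ii - P_ji * P_ik / P_jk. The sketch below applies that relation to synthetic data; the common signal and noise levels are invented for illustration.

```python
# Sleeman et al. (2006) three-channel self-noise estimate on synthetic data.
import numpy as np
from scipy.signal import csd

fs, n = 100.0, 2**16
rng = np.random.default_rng(2)
ground = np.cumsum(rng.normal(size=n))                            # common signal seen by all sensors
x = [ground + rng.normal(scale=5.0, size=n) for _ in range(3)]    # three sensors plus their own noise

def P(i, j):
    _, pij = csd(x[i], x[j], fs=fs, nperseg=4096)                 # auto/cross spectral density
    return pij

noise0 = P(0, 0) - P(1, 0) * P(0, 2) / P(1, 2)                    # self-noise PSD of sensor 0
print("median self-noise PSD, sensor 0:", np.median(np.abs(noise0)))
```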
Optimization study for the experimental configuration of CMB-S4
NASA Astrophysics Data System (ADS)
Barron, Darcy; Chinone, Yuji; Kusaka, Akito; Borril, Julian; Errard, Josquin; Feeney, Stephen; Ferraro, Simone; Keskitalo, Reijo; Lee, Adrian T.; Roe, Natalie A.; Sherwin, Blake D.; Suzuki, Aritoki
2018-02-01
The CMB Stage 4 (CMB-S4) experiment is a next-generation, ground-based experiment that will measure the cosmic microwave background (CMB) polarization to unprecedented accuracy, probing the signature of inflation, the nature of cosmic neutrinos, relativistic thermal relics in the early universe, and the evolution of the universe. CMB-S4 will consist of O(500,000) photon-noise-limited detectors that cover a wide range of angular scales in order to probe the cosmological signatures from both the early and late universe. It will measure a wide range of microwave frequencies to cleanly separate the CMB signals from galactic and extra-galactic foregrounds. To advance the progress towards designing the instrument for CMB-S4, we have established a framework to optimize the instrumental configuration to maximize its scientific output. The framework combines cost and instrumental models with a cosmology forecasting tool, and evaluates the scientific sensitivity as a function of various instrumental parameters. The cost model also allows us to perform the analysis under a fixed-cost constraint, optimizing for the scientific output of the experiment given finite resources. In this paper, we report our first results from this framework, using simplified instrumental and cost models. We have primarily studied two classes of instrumental configurations: arrays of large-aperture telescopes with diameters ranging from 2–10 m, and hybrid arrays that combine small-aperture telescopes (0.5-m diameter) with large-aperture telescopes. We explore performance as a function of telescope aperture size, distribution of the detectors into different microwave frequencies, survey strategy and survey area, low-frequency noise performance, and balance between small and large aperture telescopes for hybrid configurations. Both types of configurations must cover both large (~ degree) and small (~ arcmin) angular scales, and the performance depends on assumptions for performance vs. angular scale. The configurations with large-aperture telescopes have a shallow optimum around 4–6 m in aperture diameter, assuming that large telescopes can achieve good performance for low-frequency noise. We explore some of the uncertainties of the instrumental model and cost parameters, and we find that the optimum has a weak dependence on these parameters. The hybrid configuration shows an even broader optimum, spanning a range of 4–10 m in aperture for the large telescopes. We also present two strawperson configurations as an outcome of this optimization study, and we discuss some ideas for improving our simple cost and instrumental models used here. There are several areas of this analysis that deserve further improvement. In our forecasting framework, we adopt a simple two-component foreground model with spatially varying power-law spectral indices. We estimate de-lensing performance statistically and ignore non-idealities such as anisotropic mode coverage, boundary effect, and possible foreground residual. Instrumental systematics, which is not accounted for in our analyses, may also influence the conceptual design. Further study of the instrumental and cost models will be one of the main areas of study by the entire CMB-S4 community. We hope that our framework will be useful for estimating the influence of these improvements in the future, and we will incorporate them in order to further improve the optimization.
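As a toy illustration of the fixed-cost optimization loop described above, the sketch below scans aperture size, spends the remaining budget on detectors, and evaluates a crude figure of merit. The cost scalings and sensitivity model are placeholder assumptions and bear no relation to the real CMB-S4 cost or forecasting models.

```python
# Toy fixed-cost configuration scan (all scalings are placeholder assumptions).
import numpy as np

BUDGET = 1.0                                    # arbitrary cost units

def telescope_cost(aperture_m):
    return 0.02 * aperture_m ** 2.0             # assumed aperture cost scaling

def detector_cost_per_unit():
    return 0.6 / 500_000                        # assumed per-detector cost

def figure_of_merit(aperture_m, n_detectors):
    resolution_gain = np.log(aperture_m)        # toy: larger aperture reaches finer angular scales
    depth_gain = np.sqrt(n_detectors)           # map depth improves roughly as sqrt(detector count)
    return resolution_gain * depth_gain

best = None
for aperture in np.arange(2.0, 10.5, 0.5):
    remaining = BUDGET - telescope_cost(aperture)
    if remaining <= 0:
        continue
    n_det = int(remaining / detector_cost_per_unit())
    fom = figure_of_merit(aperture, n_det)
    if best is None or fom > best[0]:
        best = (fom, aperture, n_det)
print("best toy configuration (FoM, aperture m, detectors):", best)
```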
Morowatisharifabad, Mohammad Ali; Mazloomi-Mahmoodabad, Seyed Saied; Afshani, Seyed Alireza; Ardian, Nahid; Vaezi, Ali; Refahi, Seyed Ali Asghar
2018-05-20
The present study sought to explore the experiences of participants in the divorce process according to the theory of planned behaviour. This qualitative study was conducted using a content analysis method. In this research, 27 participants involved in the divorce process were selected. The data were coded, and a qualitative content analysis was performed. Based on the four constructs of the theory of planned behaviour, the subcategories of instrumental attitude were "Divorce as the last solution" and "Divorce as damage to individuals and society". From the perceived behavioural control theme, two subcategories, behavioural control and self-efficacy, were drawn; the first included "Others' meddling in the married life", "Social problems reducing behavioural control power" and "Personality characteristics affecting the behavioural control power", and the second included "Education as a means for developing self-efficacy" and "Barriers to self-efficacy". The injunctive norms theme included three subcategories: "Others help to reconcile", "Others meddling and lack of reconciliation", and "Families support to reconcile". The descriptive norms theme was "High divorce rate and misuse of satellite channels and social networks as factors making reconciliation difficult". It seems that education and counselling, within a predefined framework such as applied theories, can be useful.
A unifying framework for systems modeling, control systems design, and system operation
NASA Technical Reports Server (NTRS)
Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.
2005-01-01
Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition- whether functional or physical or discipline-based-that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.
Sibley, Kathryn M; Beauchamp, Marla K; Van Ooteghem, Karen; Straus, Sharon E; Jaglal, Susan B
2015-01-01
The objective was to identify components of postural control included in standardized balance measures for adult populations. Data sources were electronic searches of the MEDLINE, EMBASE, and CINAHL databases using keyword combinations of postural balance/equilibrium, psychometrics/reproducibility of results/predictive value of tests/validation studies, instrument construction/instrument validation, and geriatric assessment/disability evaluation, supplemented by gray literature and hand searches. Inclusion criteria were measures with a stated objective to assess balance, adult populations (18 years and older), at least 1 psychometric evaluation, 1 standing task, a standardized protocol and evaluation criteria, and publication in English. Two reviewers independently identified studies for inclusion. Sixty-six measures were included. A research assistant extracted descriptive characteristics, and 2 reviewers independently coded the components of balance in each measure using the Systems Framework for Postural Control, a widely recognized model of balance. Components of balance evaluated in these measures were underlying motor systems (100% of measures), anticipatory postural control (71%), dynamic stability (67%), static stability (64%), sensory integration (48%), functional stability limits (27%), reactive postural control (23%), cognitive influences (17%), and verticality (8%). Thirty-four measures evaluated 3 or fewer components of balance, and 1 measure, the Balance Evaluation Systems Test, evaluated all components of balance. Several standardized balance measures provide only partial information on postural control and omit important components of balance related to avoiding falls. As such, the choice of measure(s) may limit the overall interpretation of an individual's balance ability. Continued work is necessary to increase the implementation of comprehensive balance assessment in research and practice. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Creating "Intelligent" Ensemble Averages Using a Process-Based Framework
NASA Astrophysics Data System (ADS)
Baker, Noel; Taylor, Patrick
2014-05-01
The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is used to add value to individual model projections and construct a consensus projection. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, individual models reproduce certain climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequal weighting multi-model ensembles. The intention is to produce improved ("intelligent") unequal-weight ensemble averages. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Several climate process metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument in combination with surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing the equal-weighted ensemble average and an ensemble weighted using the process-based metric. Additionally, this study investigates the dependence of the metric weighting scheme on the climate state using a combination of model simulations including a non-forced preindustrial control experiment, historical simulations, and several radiative forcing Representative Concentration Pathway (RCP) scenarios. Ultimately, the goal of the framework is to advise better methods for ensemble averaging models and create better climate predictions.
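The following minimal Python sketch illustrates the unequal-weighting idea described above, with toy projection fields and metric scores standing in for real CMIP5 output; it is an illustration of the concept, not the authors' code.

```python
import numpy as np

# Illustrative sketch (not the authors' code): weight each model by a skill score
# derived from a process-based metric, then form an unequal-weight ensemble mean.
# Here the "metric" is a per-model error of a modeled relationship against a
# reference; smaller error -> larger weight.

def metric_weights(errors):
    """Convert per-model metric errors into normalized weights (inverse-error)."""
    errors = np.asarray(errors, dtype=float)
    inv = 1.0 / np.maximum(errors, 1e-12)   # guard against division by zero
    return inv / inv.sum()

def weighted_ensemble_mean(projections, weights):
    """projections: (n_models, ...) array of fields; returns the weighted mean field."""
    projections = np.asarray(projections, dtype=float)
    w = np.asarray(weights).reshape(-1, *([1] * (projections.ndim - 1)))
    return (w * projections).sum(axis=0)

# Example: three toy model projections of a 2x2 regional field.
proj = np.array([[[1.0, 2.0], [3.0, 4.0]],
                 [[1.5, 2.5], [3.5, 4.5]],
                 [[0.5, 1.5], [2.5, 3.5]]])
w = metric_weights([0.2, 0.4, 0.8])        # model 1 performs best on the metric
print(weighted_ensemble_mean(proj, w))
print(proj.mean(axis=0))                   # equal-weight average for comparison
```

Weights derived from the process-based metric pull the consensus toward the better-performing models, which is why regional values can differ noticeably from the equal-weighted mean.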
Zhai, Di-Hua; Xia, Yuanqing
2018-02-01
This paper addresses the adaptive control for task-space teleoperation systems with constrained predefined synchronization error, where a novel switched control framework is investigated. Based on multiple Lyapunov-Krasovskii functionals method, the stability of the resulting closed-loop system is established in the sense of state-independent input-to-output stability. Compared with previous work, the developed method can simultaneously handle the unknown kinematics/dynamics, asymmetric varying time delays, and prescribed performance control in a unified framework. It is shown that the developed controller can guarantee the prescribed transient-state and steady-state synchronization performances between the master and slave robots, which is demonstrated by the simulation study.
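As a hedged sketch of the prescribed-performance notion invoked above (not the paper's controller or its stability proof), the following checks a synchronization error against an exponentially shrinking performance funnel.

```python
import math

# Minimal illustration of the prescribed-performance idea used in such controllers
# (not the controller from the paper): the synchronization error e(t) is required
# to stay inside a prescribed, exponentially shrinking funnel rho(t).

def funnel(t, rho0=1.0, rho_inf=0.05, decay=1.0):
    """Prescribed performance bound: rho(t) = (rho0 - rho_inf)*exp(-decay*t) + rho_inf."""
    return (rho0 - rho_inf) * math.exp(-decay * t) + rho_inf

def within_prescribed_bounds(error, t, lower_frac=1.0, upper_frac=1.0):
    """Check -lower_frac*rho(t) < error < upper_frac*rho(t) (asymmetric bounds allowed)."""
    rho = funnel(t)
    return -lower_frac * rho < error < upper_frac * rho

# Example: a decaying synchronization error sampled over time.
for t in [0.0, 1.0, 2.0, 4.0]:
    e = 0.6 * math.exp(-0.8 * t)
    print(t, e, within_prescribed_bounds(e, t))
```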
Toward city-scale water quality control: building a theory for smart stormwater systems
NASA Astrophysics Data System (ADS)
Kerkez, B.; Mullapudi, A. M.; Wong, B. P.
2016-12-01
Urban stormwater systems are rarely designed as actual systems. Rather, it is often assumed that individual Best Management Practices (BMPs) will add up to achieve desired watershed outcomes. Given the rise of BMPs and green infrastructure, we ask: does doing "best" at the local scale guarantee the "best" at the global scale? Existing studies suggest that the system-level performance of distributed stormwater practices may actually adversely impact watersheds by increasing downstream erosion and reducing water quality. Optimizing spatial placement may not be sufficient, however, since precipitation variability and other sources of uncertainty can drive the overall system into undesirable states. To that end, it is also important to control the temporal behavior of the system, which can be achieved by equipping stormwater elements (ponds, wetlands, basins, bioswales, etc.) with "smart" sensors and valves. Rather than building new infrastructure, this allows existing assets to be repurposed and controlled to adapt to individual storm events. While we have learned how to build and deploy the necessary sensing and control technologies, we do not have a framework or theory that combines our knowledge of hydrology, hydraulics, water quality and control. We discuss the development of such a framework and investigate how existing water domain knowledge can be transferred into a system-theoretic context to enable real-time, city-scale stormwater control. We apply this framework to water quality control in an urban watershed in southeast Michigan, which has been heavily instrumented and retrofitted for control over the past year.
Study on virtual instrument developing system based on intelligent virtual control
NASA Astrophysics Data System (ADS)
Tang, Baoping; Cheng, Fabin; Qin, Shuren
2005-01-01
The paper introduces a non-programming development system for virtual instruments (VI), i.e., a virtual measurement instrument developing system (VMIDS) based on intelligent virtual control (IVC). The background of the IVC-based VMIDS is described briefly, and the hierarchical message bus (HMB)-based software architecture of VMIDS is discussed in detail. The three parts of VMIDS and their functions are introduced, and the process of developing a VI without programming is further described.
NASA Technical Reports Server (NTRS)
Plitau, Denis; Prasad, Narasimha S.
2012-01-01
The Active Sensing of CO2 Emissions over Nights Days and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals requiring careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating 1.57 micron carbon dioxide as well as the 1.26-1.27 micron oxygen bands for our proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. As an option the line-by-line radiative transfer model (LBLRTM) program may also be used for line-by-line calculations. The modeling framework is being used to perform error analysis, establish optimum measurement wavelengths as well as to identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned to be included in the framework. The description of the modeling framework with selected results of the simulation studies for CO2 and O2 sensing is presented in this paper.
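A minimal sketch of what "interchangeable lineshape routines" can look like in practice is given below; the function names, line parameters, and wavenumber grid are illustrative assumptions, not the LaRC framework itself.

```python
import numpy as np

# Hedged sketch of the "interchangeable lineshape routines" idea (illustrative only,
# not the LaRC framework): line-by-line absorption is computed against whichever
# lineshape model is registered, so alternatives can be swapped within a simulation.

def gaussian(nu, nu0, hwhm):
    """Doppler (Gaussian) lineshape, area-normalized."""
    sigma = hwhm / np.sqrt(2.0 * np.log(2.0))
    return np.exp(-0.5 * ((nu - nu0) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def lorentzian(nu, nu0, hwhm):
    """Pressure-broadened (Lorentzian) lineshape, area-normalized."""
    return (hwhm / np.pi) / ((nu - nu0) ** 2 + hwhm ** 2)

LINESHAPES = {"doppler": gaussian, "lorentz": lorentzian}

def absorption(nu, lines, shape="lorentz"):
    """Sum line-by-line contributions: each line is (center, strength, hwhm)."""
    f = LINESHAPES[shape]
    return sum(strength * f(nu, nu0, hwhm) for nu0, strength, hwhm in lines)

nu = np.linspace(6357.0, 6358.0, 5)           # toy wavenumber grid near 1.57 um (cm^-1)
lines = [(6357.3, 1.0e-23, 0.07), (6357.6, 5.0e-24, 0.07)]
print(absorption(nu, lines, shape="lorentz"))
print(absorption(nu, lines, shape="doppler"))
```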
MAISIE: a multipurpose astronomical instrument simulator environment
NASA Astrophysics Data System (ADS)
O'Brien, Alan; Beard, Steven; Geers, Vincent; Klaassen, Pamela
2016-07-01
Astronomical instruments often need simulators to preview their data products and test their data reduction pipelines. Instrument simulators have tended to be purpose-built with a single instrument in mind, and attempting to reuse one of these simulators for a different purpose is often a slow and difficult task. MAISIE is a simulator framework designed for reuse on different instruments. An object-oriented design encourages reuse of functionality and structure, while offering the flexibility to create new classes with new functionality. MAISIE is a set of Python classes, interfaces and tools to help build instrument simulators. MAISIE can just as easily build simulators for single and multi-channel instruments, imagers and spectrometers, and ground- and space-based instruments. To remain easy to use and to facilitate the sharing of simulators across teams, MAISIE is written in Python, a freely available and open-source language. New functionality can be created for MAISIE by creating new classes that represent optical elements. This approach allows new and novel instruments to add functionality and take advantage of the existing MAISIE classes. MAISIE has recently been used successfully to develop the simulator for the JWST/MIRI Medium Resolution Spectrometer.
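The object-oriented pattern described above can be sketched as follows; the class names and noise model are hypothetical stand-ins, not MAISIE's actual API.

```python
import numpy as np

# Illustrative sketch of the object-oriented idea behind such simulators (class
# names are hypothetical, not MAISIE's API): each optical element transforms a
# flux array, and a simulator is just an ordered chain of elements.

class OpticalElement:
    def apply(self, flux):
        return flux            # identity by default

class Mirror(OpticalElement):
    def __init__(self, reflectivity):
        self.reflectivity = reflectivity
    def apply(self, flux):
        return self.reflectivity * flux

class Detector(OpticalElement):
    def __init__(self, quantum_efficiency, read_noise):
        self.qe = quantum_efficiency
        self.read_noise = read_noise
    def apply(self, flux):
        rng = np.random.default_rng(0)
        signal = rng.poisson(self.qe * flux).astype(float)
        return signal + rng.normal(0.0, self.read_noise, size=flux.shape)

def simulate(flux, elements):
    for element in elements:
        flux = element.apply(flux)
    return flux

sky = np.full((4, 4), 1000.0)                       # toy incoming flux (photons/pixel)
print(simulate(sky, [Mirror(0.95), Mirror(0.98), Detector(0.8, read_noise=5.0)]))
```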
Kiesewetter, Jan; Fischer, Martin R
2015-01-01
Simulation-based teamwork trainings are considered a powerful training method to advance teamwork, which becomes more relevant in medical education. The measurement of teamwork is of high importance and several instruments have been developed for various medical domains to meet this need. To our knowledge, no theoretically-based and easy-to-use measurement instrument has been published or developed specifically for simulation-based teamwork trainings of medical students. Internist ward-rounds function as an important example of teamwork in medicine. The purpose of this study was to provide a validated, theoretically-based instrument that is easy to use. Furthermore, this study aimed to identify if and when rater scores relate to performance. Based on a theoretical framework for teamwork behaviour, items regarding four teamwork components (Team Coordination, Team Cooperation, Information Exchange, Team Adjustment Behaviours) were developed. In study one, three ward-round scenarios, simulated by 69 students, were videotaped and rated independently by four trained raters. The instrument was tested for the embedded psychometric properties and factorial structure. In study two, the instrument was tested for construct validity with an external criterion with a second set of 100 students and four raters. In study one, the factorial structure matched the theoretical components but was unable to separate Information Exchange and Team Cooperation. The preliminary version showed adequate psychometric properties (Cronbach's α=.75). In study two, physician raters' scores proved more reliable than those of student raters. Furthermore, a close correlation between the scale and clinical performance as an external criterion was shown (r=.64) and the sufficient psychometric properties were replicated (Cronbach's α=.78). The validation allows for use of the simulated teamwork assessment scale in undergraduate medical ward-round trainings to reliably measure teamwork by physicians. Further studies are needed to verify the applicability of the instrument.
[A heart function measuring and analyzing instrument based on single-chip microcomputer].
Rong, Z; Liang, H; Wang, S
1999-05-01
This paper introduces a measuring and analyzing instrument based on a single-chip microcomputer, which provides sample acquisition, processing, control, adjustment, keyboard input and printing. All information is provided and displayed in Chinese.
Li, Honghe; Ding, Ning; Zhang, Yuanyuan; Liu, Yang; Wen, Deliang
2017-01-01
Over the last three decades, various instruments were developed and employed to assess medical professionalism, but their measurement properties have yet to be fully evaluated. This study aimed to systematically evaluate these instruments' measurement properties and the methodological quality of their related studies within a universally acceptable standardized framework and then provide corresponding recommendations. A systematic search of the electronic databases PubMed, Web of Science, and PsycINFO was conducted to collect studies published from 1990-2015. After screening titles, abstracts, and full texts for eligibility, the articles included in this study were classified according to their respective instrument's usage. A two-phase assessment was conducted: 1) methodological quality was assessed by following the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist; and 2) the quality of measurement properties was assessed according to Terwee's criteria. Results were integrated using best-evidence synthesis to look for recommendable instruments. After screening 2,959 records, 74 instruments from 80 existing studies were included. The overall methodological quality of these studies was unsatisfactory, for reasons including but not limited to unknown missing data, inadequate sample sizes, and vague hypotheses. Content validity, cross-cultural validity, and criterion validity were either unreported or received negative ratings in most studies. Based on best-evidence synthesis, three instruments were recommended: Hisar's instrument for nursing students, Nurse Practitioners' Roles and Competencies Scale, and Perceived Faculty Competency Inventory. Although instruments measuring medical professionalism are diverse, only a limited number of studies were methodologically sound. Future studies should give priority to systematically improving the performance of existing instruments and to longitudinal studies.
Using a social capital framework to enhance measurement of the nursing work environment.
Sheingold, Brenda Helen; Sheingold, Steven H
2013-07-01
To develop, field test and analyse a social capital survey instrument for measuring the nursing work environment. The concept of social capital, which focuses on improving productive capacity by examining relationships and networks, may provide a promising framework to measure and evaluate the nurse work environment in a variety of settings. A survey instrument for measuring social capital in the nurse work environment was developed by adapting the World Bank's Social Capital - Integrated Questionnaire (SC-IQ). Exploratory factor analysis and multiple regression analyses were applied to assess the properties of the instrument. The exploratory factor analysis yielded five factors that align well with the social capital framework, while reflecting unique aspects of the nurse work environment. The results suggest that the social capital framework provides a promising context to assess the nurse work environment. Further work is needed to refine the instrument for a diverse range of health-care providers and to correlate social capital measures with quality of patient care. Social capital measurement of the nurse work environment has the potential to provide managers with an enhanced set of tools for building productive capacity in health-care organisations and achieving desired outcomes. © 2013 John Wiley & Sons Ltd.
Effects of Cognitive Load on Driving Performance: The Cognitive Control Hypothesis.
Engström, Johan; Markkula, Gustav; Victor, Trent; Merat, Natasha
2017-08-01
The objective of this paper was to outline an explanatory framework for understanding effects of cognitive load on driving performance and to review the existing experimental literature in the light of this framework. Although there is general consensus that taking the eyes off the forward roadway significantly impairs most aspects of driving, the effects of primarily cognitively loading tasks on driving performance are not well understood. Based on existing models of driver attention, an explanatory framework was outlined. This framework can be summarized in terms of the cognitive control hypothesis: Cognitive load selectively impairs driving subtasks that rely on cognitive control but leaves automatic performance unaffected. An extensive literature review was conducted wherein existing results were reinterpreted based on the proposed framework. It was demonstrated that the general pattern of experimental results reported in the literature aligns well with the cognitive control hypothesis and that several apparent discrepancies between studies can be reconciled based on the proposed framework. More specifically, performance on nonpracticed or inherently variable tasks, relying on cognitive control, is consistently impaired by cognitive load, whereas the performance on automatized (well-practiced and consistently mapped) tasks is unaffected and sometimes even improved. Effects of cognitive load on driving are strongly selective and task dependent. The present results have important implications for the generalization of results obtained from experimental studies to real-world driving. The proposed framework can also serve to guide future research on the potential causal role of cognitive load in real-world crashes.
A singlechip-computer-controlled conductivity meter based on conductance-frequency transformation
NASA Astrophysics Data System (ADS)
Chen, Wenxiang; Hong, Baocai
2005-02-01
A portable conductivity meter controlled by a single-chip computer was designed. The instrument uses the conductance-frequency transformation method to measure the conductivity of a solution. The circuitry is simple and reliable. Another feature of the instrument is that temperature compensation is realised by changing the counting time of the timing counter. The theoretical basis and the usage of the temperature compensation are described.
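The measurement principle can be sketched as follows, with hypothetical constants (cell constant, oscillator gain, temperature coefficient) chosen only for illustration; note how adjusting the gate time, rather than the reading, cancels the temperature drift.

```python
# Illustrative sketch of the measurement principle (hypothetical constants, not
# the instrument's firmware): the oscillator frequency is proportional to the
# solution conductance, and temperature compensation is achieved by adjusting
# the counter's gate (counting) time rather than correcting the reading afterwards.

CELL_CONSTANT = 1.0          # cm^-1
HZ_PER_MICROSIEMENS = 10.0   # oscillator gain
ALPHA = 0.02                 # ~2 %/degC conductivity temperature coefficient

def gate_time(temperature_c, base_gate_s=1.0, reference_c=25.0):
    """Shrink or stretch the counting window to cancel the temperature drift."""
    return base_gate_s / (1.0 + ALPHA * (temperature_c - reference_c))

def reading(pulse_count, base_gate_s=1.0):
    """Counts taken over the adjusted gate already refer to 25 degC conditions."""
    return pulse_count / base_gate_s / HZ_PER_MICROSIEMENS * CELL_CONSTANT

def simulated_counts(conductivity_at_25_uS_cm, temperature_c):
    """What the counter would register for a given solution at a given temperature."""
    conductance_uS = conductivity_at_25_uS_cm / CELL_CONSTANT
    frequency = conductance_uS * (1.0 + ALPHA * (temperature_c - 25.0)) * HZ_PER_MICROSIEMENS
    return frequency * gate_time(temperature_c)

for temp in (25.0, 30.0, 35.0):
    print(temp, reading(simulated_counts(1285.0, temp)))   # same compensated value
```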
A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition
NASA Technical Reports Server (NTRS)
Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.
2012-01-01
A simulation framework based on the Memory-Mapped-Files technique was created to operate multiple numerical processes in locked time-steps and exchange I/O data synchronously among them to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow-dynamics, variable-geometry actuation mechanisms, and flow controls in the transition from supersonic to hypersonic conditions and vice versa. A study of Mode-Transition Control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate this scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls, as well as other types of complex systems.
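A bare-bones sketch of the memory-mapped-file exchange mechanism is shown below; both roles run in a single Python script for brevity, and the record layout and file name are assumptions rather than the framework's actual format.

```python
import mmap, os, struct

# Illustrative sketch of the Memory-Mapped-Files exchange mechanism (not the
# framework itself): two processes map the same file and pass a time step and a
# small I/O vector through it. Here both roles run in one script for brevity.

PATH = "mmf_exchange.bin"
RECORD = struct.Struct("<i3d")          # step number + three doubles

with open(PATH, "wb") as f:
    f.write(b"\x00" * RECORD.size)      # pre-size the file before mapping

with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), RECORD.size) as shared:
        # "Inlet dynamics" process writes its outputs for step 42 ...
        shared[:RECORD.size] = RECORD.pack(42, 0.85, 101.3, 2.4)
        shared.flush()
        # ... and the "controls" process reads them back within the same time step.
        step, mach, pressure, actuator = RECORD.unpack(shared[:RECORD.size])
        print(step, mach, pressure, actuator)

os.remove(PATH)
```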
A phase space approach to imaging from limited data
NASA Astrophysics Data System (ADS)
Testorf, Markus E.
2015-09-01
The optical instrument function is used as the basis for developing optical system theory for imaging applications. The detection of optical signals is conveniently described as the overlap integral of the Wigner distribution functions of the instrument and the optical signal. Within this framework, various optical imaging systems, including plenoptic cameras, phase-retrieval algorithms, and Shack-Hartmann sensors, are shown to acquire information about a domain in phase space with finite extension and finite resolution. It is demonstrated how phase space optics can be used both to analyze imaging systems and to design methods for image reconstruction.
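In generic notation (not taken from the paper), the detected signal described above takes the standard phase-space overlap form:

```latex
% Overlap of the instrument and signal Wigner distribution functions over
% position x and spatial frequency u (generic notation, not the paper's).
S \;\propto\; \iint W_{\mathrm{instr}}(x, u)\, W_{\mathrm{signal}}(x, u)\,\mathrm{d}x\,\mathrm{d}u
```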
Deploying Object Oriented Data Technology to the Planetary Data System
NASA Technical Reports Server (NTRS)
Kelly, S.; Crichton, D.; Hughes, J. S.
2003-01-01
How do you provide more than 350 scientists and researchers access to data from every instrument in Odyssey when the data is curated across half a dozen institutions and in different formats and is too big to mail on a CD-ROM anymore? The Planetary Data System (PDS) faced this exact question. The solution was to use a metadata-based middleware framework developed by the Object Oriented Data Technology task at NASA's Jet Propulsion Laboratory. Using OODT, PDS provided, for the first time ever, data from all mission instruments through a single system immediately upon data delivery.
Evaluating Payments for Environmental Services: Methodological Challenges
2016-01-01
Over the last fifteen years, Payments for Environmental Services (PES) schemes have become very popular environmental policy instruments, but the academic literature has begun to question their additionality. The literature attempts to estimate the causal effect of these programs by applying impact evaluation (IE) techniques. However, PES programs are complex instruments and IE methods cannot be directly applied without adjustments. Based on a systematic review of the literature, this article proposes a framework for the methodological process of designing an IE for PES schemes. It revises and discusses the methodological choices at each step of the process and proposes guidelines for practitioners. PMID:26910850
Cocks, E; Thoresen, S; Williamson, M; Boaden, R
2014-07-01
Following the closure of large residential facilities over the past several decades, emphasis on community living for adults with developmental disabilities has strengthened. However, the concept of community living is ambiguous. The term is often associated with congregation of people with disabilities in ordinary houses 'in' the community. Group homes, the most common contemporary formal expression of 'community living', may use ordinary houses and accommodate a small number of residents comparable to a large family. Individual supported living (ISL) arrangements around a single person with a disability using person-centred principles are occurring with increasing frequency. The ISL manual was developed over 4 years in two sequential research projects to produce a quality framework articulating ISL and operationalising the framework into a review and planning instrument for ISL arrangements. The ISL manual was developed in three stages and overseen by a reference group of key stakeholders purposively recruited as well-versed in ISL. The first stage operationalised the quality framework over two half-day workshops with a group of key informants. Participants identified indicators and sources of evidence for each attribute of the quality framework. The quality framework, indicators, and sources of evidence were compiled into an initial evaluation instrument of nine themes consisting of 27 attributes. This was piloted in two rounds to enhance the utility of the instrument and develop the final manual which contained eight themes and 21 attributes. A comprehensive literature search was carried out to identify relevant empirical ISL studies. The literature search identified four empirical studies that incorporated ISL over the preceding 3 years. A previous literature search from the first research project that produced the quality framework spanned 27 years and identified five empirical studies. We concluded that the empirical base for developing evidence for the nature and outcomes of ISL arrangements was sparse. The ISL manual and scoring booklet developed in the current research project includes six illustrative case studies of ISL, instructions for potential users to review living arrangements or set up a new arrangement, and the review framework consisting of descriptions of themes and attributes, indicators, and sources of evidence. The dearth of empirical studies of ISL arrangements for people with developmental disabilities, despite increased policy emphasis on individualised options, underscores the importance of planning and review tools to promote quality outcomes. The ISL manual can assist adults with developmental disabilities, families, carers, and service providers to plan and review ISL arrangements. Further research will enhance the properties of this instrument and establish the relationship between quality of ISL arrangements and outcomes such as quality of life, and participation and inclusion. © 2013 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.
Impact of the Europeanisation of Education: Qualifications Frameworks in Europe
ERIC Educational Resources Information Center
Mikulec, Borut
2017-01-01
This article examines the influence of the European qualifications framework--a key European lifelong learning policy instrument for improving employability, comparability and mobility in the European educational space--on the establishment of national qualifications frameworks in Europe. The European qualifications framework and national…
ERIC Educational Resources Information Center
Mun, Kongju; Shin, Namsoo; Lee, Hyunju; Kim, Sung-Won; Choi, Kyunghee; Choi, Sung-Youn; Krajcik, Joseph S.
2015-01-01
We re-conceptualized the meaning of scientific literacy and developed an instrument, which we call the Global Scientific Literacy Questionnaire (GSLQ) based on a new conceptual framework for scientific literacy in the twenty-first century. We identified five dimensions, each with key elements. The five dimensions are (1) content knowledge (core…
Systemic Influences on Career Development: Assisting Clients to Tell Their Career Stories
ERIC Educational Resources Information Center
McMahon, Mary L.; Watson, Mark B.
2008-01-01
A challenge for career theory informed by constructivism is how to apply it in practice. This article describes a career counseling intervention based on the constructivist Systems Theory Framework (STF) of career development and the qualitative career assessment instrument derived from it, the My System of Career Influences (MSCI; M. McMahon, W.…
ERIC Educational Resources Information Center
Groß Ophoff, Jana; Schladitz, Sandra; Leuders, Juliane; Leuders, Timo; Wirtz, Markus A.
2015-01-01
The ability to purposefully access, reflect, and use evidence from educational research (Educational Research Literacy) is expected of future professionals in educational practice. Based on the presented conceptual framework, a test instrument was developed to assess the different competency aspects: Information Literacy, Statistical Literacy, and…
The Development of a Comparative Appraisal of Perceived Resources and Demands for Principals
ERIC Educational Resources Information Center
Maerz, Drew Rory
2011-01-01
The purpose of this study was to develop the Comparative Appraisal of perceived Resources and Demands for Principals (CARD-P), which is used for appraising perceived stress in the elementary school principalship. An appraisal-based definition of stress was derived from literature and used as the theoretical framework for creating the instrument.…
ERIC Educational Resources Information Center
Neves, Victor Russell Tarbet
2007-01-01
Background: Educational guidelines and reforms focused on literacy, including No Child Left Behind (NCLB) and National Assessment of Educational Progress (NAEP), have contributed to a music education culture and climate focused on language literacy rather than on the core content literacies inherent in music itself. Purpose: The purpose of this…
Reconceptualisation of Approaches to Teaching Evaluation in Higher Education
ERIC Educational Resources Information Center
Tran, Nga D.
2015-01-01
The ubiquity of using Student Evaluation of Teaching (SET) in higher education is inherently controversial. Issues mostly resolve around whether the instrument is reliable and valid for the purpose for which it was intended. Controversies exist, in part, due to the lack of a theoretical framework upon which SETs can be based and tested for their…
"Many Moons": Understanding Teacher Learning from a Teacher Education Perspective. Issue Paper 88-5.
ERIC Educational Resources Information Center
McDiarmid, G. Williamson; Ball, Deborah Loewenberg
The Teacher Education and Learning to Teach Study of the National Center for Research on Teacher Education combines case studies of teacher education programs with longitudinal studies of teacher learning. In this paper, the development of the theoretical framework on which instrumentation for the longitudinal study is based is discussed.…
ERIC Educational Resources Information Center
Westling Allodi, Mara
2007-01-01
The Goals, Attitudes and Values in School (GAVIS) questionnaire was developed on the basis of theoretical frameworks concerning learning environments, universal human values and studies of students' experience of learning environments. The theory hypothesises that learning environments can be described and structured in a circumplex model using…
Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter
2005-02-15
A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram-per-litre level in a single few-minute analysis without any prior sample pre-concentration or pre-treatment steps. With the actual needs of water-sector managers in mind, related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and the Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories was utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first of two articles gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second article reports on the system performance, first real sample measurements, and an international collaborative trial (inter-laboratory tests) comparing the biosensor with conventional analytical methods.
Morales-Asencio, José Miguel; Porcel-Gálvez, Ana María; Oliveros-Valenzuela, Rosa; Rodríguez-Gómez, Susana; Sánchez-Extremera, Lucrecia; Serrano-López, Francisco Andrés; Aranda-Gallardo, Marta; Canca-Sánchez, José Carlos; Barrientos-Trigo, Sergio
2015-03-01
The aim of this study was to establish the validity and reliability of an instrument (Inventario del NIvel de Cuidados mediante IndicAdores de clasificación de Resultados de Enfermería) used to assess the dependency level in acutely hospitalised patients. This instrument is novel, and it is based on the Nursing Outcomes Classification. Multiple existing instruments for needs assessment have been poorly validated and based predominately on interventions. Standardised Nursing Languages offer an ideal framework to develop nursing sensitive instruments. A cross-sectional validation study in two acute care hospitals in Spain. This study was implemented in two phases. First, the research team developed the instrument to be validated. In the second phase, the validation process was performed by experts, and the data analysis was conducted to establish the psychometric properties of the instrument. Seven hundred and sixty-one patient ratings performed by nurses were collected during the course of the research study. Data analysis yielded a Cronbach's alpha of 0·91. An exploratory factorial analysis identified three factors (Physiological, Instrumental and Cognitive-behavioural), which explained 74% of the variance. Inventario del NIvel de Cuidados mediante IndicAdores de clasificación de Resultados de Enfermería was demonstrated to be a valid and reliable instrument based on its use in acutely hospitalised patients to assess the level of dependency. Inventario del NIvel de Cuidados mediante IndicAdores de clasificación de Resultados de Enfermería can be used as an assessment tool in hospitalised patients during the nursing process throughout the entire hospitalisation period. It contributes information to support decisions on nursing diagnoses, interventions and outcomes. It also enables data codification in large databases. © 2014 John Wiley & Sons Ltd.
Distributed Computing Framework for Synthetic Radar Application
NASA Technical Reports Server (NTRS)
Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael
2006-01-01
We are developing an extensible software framework in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the framework elements of the middleware that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, thereby allowing a data flow graph to be devised for a specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech) and now being used as the basis for next-generation radar processing at JPL, is a Python-based software framework. We have extended the Pyre framework to include new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
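The "interchangeable processing components in a data-flow graph" idea can be sketched as below; the component names and operations are hypothetical placeholders and do not represent the Pyre API.

```python
# Illustrative sketch of interchangeable processing components connected into a
# data-flow chain (hypothetical names; this is not the Pyre API).

class Component:
    def process(self, data):
        raise NotImplementedError

class RangeCompress(Component):
    def process(self, data):
        return [x * 0.5 for x in data]          # stand-in for a real radar step

class Multilook(Component):
    def __init__(self, looks):
        self.looks = looks
    def process(self, data):
        return [sum(data[i:i + self.looks]) / self.looks
                for i in range(0, len(data), self.looks)]

def run_flow(data, components):
    """Run the data through an ordered chain of interchangeable components."""
    for component in components:
        data = component.process(data)
    return data

print(run_flow([4.0, 6.0, 2.0, 8.0], [RangeCompress(), Multilook(looks=2)]))
```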
Francis, P; Eastwood, K W; Bodani, V; Looi, T; Drake, J M
2018-05-07
This work explores the feasibility of creating and accurately controlling an instrument for robotic surgery with a 2 mm diameter and a three degree-of-freedom (DoF) wrist which is compatible with the da Vinci platform. The instrument's wrist is composed of a two DoF bending notched-nitinol tube pattern, for which a kinematic model has been developed. A base mechanism for controlling the wrist is designed for integration with the da Vinci Research Kit. A basic teleoperation task is successfully performed using two of the miniature instruments. The performance and accuracy of the instrument suggest that creating and accurately controlling a 2 mm diameter instrument is feasible and the design and modelling proposed in this work provide a basis for future miniature instrument development.
NASA Astrophysics Data System (ADS)
Yang, C.; Zheng, W.; Zhang, M.; Yuan, T.; Zhuang, G.; Pan, Y.
2016-06-01
Measurement and control of the plasma in real time are critical for advanced Tokamak operation, requiring high-speed, real-time data acquisition and processing. ITER has designed the Fast Plant System Controllers (FPSC) for these purposes. At the J-TEXT Tokamak, a real-time data acquisition and processing framework has been designed and implemented using standard ITER FPSC technologies. The main hardware components of this framework are an Industrial Personal Computer (IPC) with a real-time system and FPGA-based FlexRIO devices. With FlexRIO devices, data can be processed by the FPGA in real time before being passed to the CPU. The software elements are based on a real-time framework which runs under Red Hat Enterprise Linux MRG-R and uses the Experimental Physics and Industrial Control System (EPICS) for monitoring and configuration, keeping the framework in line with ITER's standard FPSC technology. With this framework, any kind of data acquisition and processing FlexRIO FPGA program can be configured within an FPSC. An application using the framework has been implemented for the polarimeter-interferometer diagnostic system on J-TEXT. The application is able to extract phase-shift information from the intermediate-frequency signal produced by the polarimeter-interferometer diagnostic system and calculate the plasma density profile in real time. Different algorithm implementations on the FlexRIO FPGA are compared in the paper.
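For readers unfamiliar with the EPICS layer mentioned above, a hedged client-side sketch using the pyepics bindings is shown below; all process-variable names are invented placeholders, not J-TEXT's actual channels.

```python
import epics  # pyepics client for EPICS Channel Access

# Hedged sketch of an EPICS-based monitoring/configuration client of the kind
# described above. All process-variable (PV) names are hypothetical placeholders.

density_pv = epics.PV("JTEXT:POLARIMETER:DENSITY_PROFILE")

def on_update(pvname=None, value=None, **kwargs):
    """Called whenever the real-time system publishes a new density profile."""
    print(pvname, "->", value)

density_pv.add_callback(on_update)

# Reconfigure the processing through another (hypothetical) PV.
epics.caput("JTEXT:POLARIMETER:PHASE_ALGORITHM", "cordic")
print("current algorithm:", epics.caget("JTEXT:POLARIMETER:PHASE_ALGORITHM"))
```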
Instrumentino: An Open-Source Software for Scientific Instruments.
Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C
2015-01-01
Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project.
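To give a sense of the kind of PC-side hardware interaction such a framework wraps, here is a bare-bones pyserial sketch; the serial port and the text command protocol are assumptions for illustration and are not Instrumentino's API.

```python
import serial  # pyserial

# Illustrative only: a minimal example of the kind of PC-side microcontroller
# control that frameworks such as Instrumentino wrap in a GUI. The port name and
# the text commands ("SET_PIN"/"READ_ADC") are hypothetical, not Instrumentino's API.

with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as arduino:
    arduino.write(b"SET_PIN 13 HIGH\n")          # actuate a transducer
    arduino.write(b"READ_ADC 0\n")               # request a sensor reading
    reply = arduino.readline().decode().strip()  # e.g. "512"
    print("ADC channel 0:", reply)
```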
Measuring organizational readiness for knowledge translation in chronic care.
Gagnon, Marie-Pierre; Labarthe, Jenni; Légaré, France; Ouimet, Mathieu; Estabrooks, Carole A; Roch, Geneviève; Ghandour, El Kebir; Grimshaw, Jeremy
2011-07-13
Knowledge translation (KT) is an imperative in order to implement research-based and contextualized practices that can answer the numerous challenges of complex health problems. The Chronic Care Model (CCM) provides a conceptual framework to guide the implementation process in chronic care. Yet, organizations aiming to improve chronic care require an adequate level of organizational readiness (OR) for KT. Available instruments on organizational readiness for change (ORC) have shown limited validity, and are not tailored or adapted to specific phases of the knowledge-to-action (KTA) process. We aim to develop an evidence-based, comprehensive, and valid instrument to measure OR for KT in healthcare. The OR for KT instrument will be based on core concepts retrieved from existing literature and validated by a Delphi study. We will specifically test the instrument in chronic care that is of an increasing importance for the health system. Phase one: We will conduct a systematic review of the theories and instruments assessing ORC in healthcare. The retained theoretical information will be synthesized in a conceptual map. A bibliography and database of ORC instruments will be prepared after appraisal of their psychometric properties according to the standards for educational and psychological testing. An online Delphi study will be carried out among decision makers and knowledge users across Canada to assess the importance of these concepts and measures at different steps in the KTA process in chronic care.Phase two: A final OR for KT instrument will be developed and validated both in French and in English and tested in chronic disease management to measure OR for KT regarding the adoption of comprehensive, patient-centered, and system-based CCMs. This study provides a comprehensive synthesis of current knowledge on explanatory models and instruments assessing OR for KT. Moreover, this project aims to create more consensus on the theoretical underpinnings and the instrumentation of OR for KT in chronic care. The final product--a comprehensive and valid OR for KT instrument--will provide the chronic care settings with an instrument to assess their readiness to implement evidence-based chronic care.
NASA Technical Reports Server (NTRS)
Klein, R.
1972-01-01
A set of specially prepared digital tapes is reported which contain synchronized measurements of pilot scanning behavior, control response, and vehicle response obtained during instrument landing system approaches made in a fixed-base DC-8 transport simulator. The objective of the master tape is to provide a common data base which can be used by the research community to test theories, models, and methods for describing and analyzing control/display relations and interactions. The experimental conditions and tasks used to obtain the data and the detailed format of the tapes are described. Conventional instrument panel and controls were used, with simulated vertical gust and glide slope beam bend forcing functions. Continuous pilot eye fixations and scan traffic on the panel were measured. Both flight director and standard localizer/glide slope types of approaches were made, with both fixed and variable instrument range sensitivities.
The University of Florida's next-generation cryogenic infrared focal plane array controller system
NASA Astrophysics Data System (ADS)
Raines, Steven N.; Boreman, Glenn D.; Eikenberry, Stephen S.; Bandyopadhyay, Reba M.; Quijano, Ismael
2008-07-01
The Infrared Instrumentation Group at the University of Florida has substantial experience building IR focal plane array (FPA) controllers and seamlessly integrating them into the instruments that it builds for 8-meter class observatories, including writing device drivers for UNIX-based computer systems. We report on a design study to investigate implementing an ASIC from Teledyne Imaging Systems (TIS) into our IR FPA controller while simultaneously replacing TIS's interface card with one that eliminates the requirement for a Windows-OS computer within the instrument's control system.
A Framework to Explore the Knowledge Structure of Multidisciplinary Research Fields
Uddin, Shahadat; Khan, Arif; Baur, Louise A.
2015-01-01
Understanding emerging areas of a multidisciplinary research field is crucial for researchers, policymakers and other stakeholders. For them, a knowledge structure based on longitudinal bibliographic data can be an effective instrument. However, with the vast amount of information available online, it is often hard to derive such a knowledge structure from the data. In this paper, we present a novel approach for retrieving online bibliographic data and propose a framework for exploring knowledge structure. We also present several longitudinal analyses to interpret and visualize the last 20 years of published obesity research data. PMID:25915521
High Resolution Sensing and Control of Urban Water Networks
NASA Astrophysics Data System (ADS)
Bartos, M. D.; Wong, B. P.; Kerkez, B.
2016-12-01
We present a framework to enable high-resolution sensing, modeling, and control of urban watersheds using (i) a distributed sensor network based on low-cost cellular-enabled motes, (ii) hydraulic models powered by a cloud computing infrastructure, and (iii) automated actuation valves that allow infrastructure to be controlled in real time. This platform initiates two major advances. First, we achieve a high density of measurements in urban environments, with an anticipated 40+ sensors over each urban area of interest. In addition to new measurements, we also illustrate the design and evaluation of a "smart" control system for real-world hydraulic networks. This control system improves water quality and mitigates flooding by using real-time hydraulic models to adaptively control releases from retention basins. We evaluate the potential of this platform through two ongoing deployments: (i) a flood monitoring network in the Dallas-Fort Worth metropolitan area that detects and anticipates floods at the level of individual roadways, and (ii) a real-time hydraulic control system in the city of Ann Arbor, MI—soon to be one of the most densely instrumented urban watersheds in the United States. Through these applications, we demonstrate that distributed sensing and control of water infrastructure can improve flash flood predictions, emergency response, and stormwater contaminant mitigation.
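A toy rule-based policy of the kind such a control system might evaluate is sketched below; the thresholds and signal names are hypothetical and do not describe the deployed Ann Arbor system.

```python
# Illustrative sketch only (not the deployed system): a simple rule-based policy
# for a "smart" retention basin valve, driven by a sensed water level and a
# downstream capacity estimate from a real-time model. All names are hypothetical.

def valve_command(level_m, flood_level_m, downstream_capacity_frac):
    """
    Return a valve opening in [0, 1].
    - Open fully if the basin is near flooding, regardless of downstream state.
    - Otherwise release only when the downstream network has spare capacity,
      holding water back to allow pollutant settling (water-quality benefit).
    """
    if level_m >= 0.9 * flood_level_m:
        return 1.0
    if downstream_capacity_frac > 0.5:
        # Release proportionally to how full the basin is.
        return min(1.0, level_m / flood_level_m)
    return 0.0   # hold water: downstream is congested

print(valve_command(level_m=1.2, flood_level_m=2.0, downstream_capacity_frac=0.8))  # 0.6
print(valve_command(level_m=1.9, flood_level_m=2.0, downstream_capacity_frac=0.1))  # 1.0
```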
Kruitwagen, Sonja; Reudink, Melchert; Faber, Albert
2009-04-01
Despite a general decrease in Dutch environmental emission trends, it remains difficult to comply with European Union (EU) environmental policy targets. Furthermore, environmental issues have become increasingly complex and entangled with society. Therefore, Dutch environmental policy follows a pragmatic line by adopting a flexible approach for compliance, rather than aiming at further reduction at the source of emission. This may be politically useful in order to adequately reach EU targets, but restoration of environmental conditions may be delayed. However, due to the complexity of today's environmental issues, the restoration of environmental conditions might not be the only standard for a proper policy approach. Consequently this raises the question how the Dutch pragmatic approach to compliance qualifies in a broader policy assessment. In order to answer this question, we adapt a policy assessment framework, developed by Hemerijck and Hazeu (Bestuurskunde 13(2), 2004), based on the dimensions of legitimacy and policy logic. We apply this framework for three environmental policy assessments: flexible instruments in climate policy, fine-tuning of national and local measures to meet air quality standards, and derogation for the Nitrate Directive. We conclude with general assessment notes on the appliance of flexible instruments in environmental policy, showing that a broad and comprehensive perspective can help to understand the arguments to put such policy instruments into place and to identify trade-offs between assessment criteria.
Bruner, D W; Boyd, C P
1999-12-01
Cancer and cancer therapies impair sexual health in a multitude of ways. The promotion of sexual health is therefore vital for preserving quality of life and is an integral part of total or holistic cancer management. Nursing, to provide holistic care, requires research that is meaningful to patients as well as to the profession in order to develop educational and interventional studies to promote sexual health and coping. To obtain meaningful research data, instruments that are reliable, valid, and pertinent to patients' needs are required. Several sexual functioning instruments were reviewed for this study and found to be lacking in either a conceptual foundation or psychometric validation. Without a defined conceptual framework, the authors of the instruments must have made certain assumptions regarding what women undergoing cancer therapy experience and what they perceive as important. To check these assumptions before assessing women's sexuality after cancer therapies in a larger study, a pilot study was designed to compare what women experience and perceive as important regarding their sexuality with what is assessed in several currently available research instruments, using the focus group technique. Based on the focus group findings, current sexual functioning questionnaires may be lacking in pertinent areas of concern for women treated for breast or gynecologic malignancies. Better conceptual foundations may help future questionnaire design. Self-regulation theory may provide an acceptable conceptual framework from which to develop a sexual functioning questionnaire.
Sun, Xiyang; Miao, Jiacheng; Wang, You; Luo, Zhiyuan; Li, Guang
2017-01-01
An estimate of the reliability of predictions in electronic nose applications is essential but has not received enough attention. An algorithm framework called conformal prediction is introduced in this work for discriminating different kinds of ginsengs with a home-made electronic nose instrument. Nonconformity measures based on k-nearest neighbors (KNN) are implemented as the underlying algorithm of conformal prediction. In offline mode, the conformal predictor achieves a classification rate of 84.44% based on 1NN and 80.63% based on 3NN, which is better than that of simple KNN. In addition, it provides an estimate of reliability for each prediction. In online mode, the validity of predictions is guaranteed, which means that the error rate of region predictions never exceeds the significance level set by a user. The potential of this framework for detecting borderline examples and outliers in the application of E-nose is also investigated. The result shows that conformal prediction is a promising framework for the application of the electronic nose to make predictions with reliability and validity. PMID:28805721
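A hedged sketch of conformal prediction with a nearest-neighbour nonconformity measure is given below, using toy two-dimensional "sensor" features; it illustrates the p-value and region construction, not the authors' implementation.

```python
import numpy as np

# Hedged sketch of (inductive) conformal prediction with a nearest-neighbour
# nonconformity measure (toy data, not the authors' code): a label is kept in the
# prediction region when its p-value, computed against calibration scores,
# exceeds the chosen significance level.

def nonconformity(x, label, X_train, y_train):
    """Distance from x to its nearest training example of the hypothesised class."""
    same_class = X_train[y_train == label]
    return np.min(np.linalg.norm(same_class - x, axis=1))

def conformal_region(x, X_train, y_train, X_cal, y_cal, significance=0.2):
    cal_scores = np.array([nonconformity(xc, yc, X_train, y_train)
                           for xc, yc in zip(X_cal, y_cal)])
    region = []
    for label in np.unique(y_train):
        score = nonconformity(x, label, X_train, y_train)
        p_value = (np.sum(cal_scores >= score) + 1) / (len(cal_scores) + 1)
        if p_value > significance:      # validity: error rate <= significance
            region.append(int(label))
    return region

# Toy "electronic nose" feature vectors for two ginseng classes (0 and 1).
X_train = np.array([[0.10, 0.20], [0.20, 0.10], [0.15, 0.30],
                    [0.90, 0.80], [0.80, 0.90], [0.85, 0.70]])
y_train = np.array([0, 0, 0, 1, 1, 1])
X_cal = np.array([[0.25, 0.25], [0.05, 0.15], [0.75, 0.80], [0.95, 0.90]])
y_cal = np.array([0, 0, 1, 1])
print(conformal_region(np.array([0.20, 0.20]), X_train, y_train, X_cal, y_cal))  # [0]
```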
Hadadgar, Arash; Changiz, Tahereh; Masiello, Italo; Dehghani, Zahra; Mirshahzadeh, Nahidossadat; Zary, Nabil
2016-08-22
General practitioners (GPs) update their knowledge and skills by participating in continuing medical education (CME) programs in either a traditional or an e-Learning format. GPs' beliefs about the electronic format of CME have been studied, but without an explicit theoretical framework, which makes the findings difficult to interpret. In other health disciplines, researchers have used the theory of planned behavior (TPB) to predict users' behavior. In this study, an instrument was developed to investigate GPs' intention to use e-Learning in CME based on the TPB. The goodness of fit of the TPB was measured using confirmatory factor analysis, and the relationship between latent variables was assessed using structural equation modeling. A total of 148 GPs participated in the study. Most of the items in the questionnaire related well to the TPB theoretical constructs, and the model had good fit. The perceived behavioral control and attitudinal constructs were included, and the subjective norms construct was excluded from the structural model. The developed questionnaire could explain 66% of the variance in GPs' intention. The TPB could be used as a model to construct instruments that investigate GPs' intention to participate in e-Learning programs in CME. The findings from the study will encourage CME managers and researchers to explore the developed instrument as a means to explain and improve GPs' intentions to use e-Learning in CME.
[Design of warm-acupuncture technique training evaluation device].
Gao, Ming; Xu, Gang; Yang, Huayuan; Liu, Tangyi; Tang, Wenchao
2017-01-12
To design a warm-acupuncture teaching instrument for training and evaluating the manipulation technique. Drawing on the principle and operating characteristics of traditional warm acupuncture, together with mechanical design and single-chip microcomputer technology, the device consists of a main body, a universal acupoint simulator, a vibration reset system and a circuit control system, including the frame, platform framework, swing framework, universal acupoint simulator, vibration reset unit, operation timing circuit, needling-sensation display and vibration control circuit. It can be used to train needle insertion at different angles as well as the rubbing and loading of moxa, and it indicates whether the needle tip reaches the required location. A vibration test determines whether the moxa attached to the needle handle is liable to fall off, and the operation time is displayed. The device can objectively support warm-acupuncture training and evaluation and thus promote the standardization of its clinical manipulation.
Truong, Hoai-An; Taylor, Catherine R; DiPietro, Natalie A
2012-02-10
To develop and validate the Assessment, Development, Assurance Pharmacist's Tool (ADAPT), an instrument for pharmacists and student pharmacists to use in developing and implementing health promotion programs. The 36-item ADAPT instrument was developed using the framework of public health's 3 core functions (assessment, policy development, and assurance) and 10 essential services. The tool's content and usability were assessed through peer review and initial validity testing. Over 20 faculty members, preceptors, and student pharmacists at 5 institutions involved in planning and implementing health promotion initiatives reviewed the instrument and conducted validity testing. The instrument took approximately 15 minutes to complete, and the findings resulted in changes and improvements to elements of the programs evaluated. The ADAPT instrument fills a need to more effectively plan, develop, implement, and evaluate pharmacist-directed public health programs that are evidence-based, high-quality, and compliant with laws and regulations, and it facilitates documentation of pharmacists' contributions to public health.
NASA Astrophysics Data System (ADS)
van Aalderen-Smeets, Sandra; Walma van der Molen, Juliette
2013-03-01
In this article, we present a valid and reliable instrument which measures the attitude of in-service and pre-service primary teachers toward teaching science, called the Dimensions of Attitude Toward Science (DAS) Instrument. Attention to the attitudes of primary teachers toward teaching science is of fundamental importance to the professionalization of these teachers in the field of primary science education. With the development of this instrument, we sought to fulfill the need for a statistically and theoretically valid and reliable instrument to measure pre-service and in-service teachers' attitudes. The DAS Instrument is based on a comprehensive theoretical framework for attitude toward (teaching) science. After pilot testing, the DAS was revised and subsequently validated using a large group of respondents (pre-service and in-service primary teachers) (N = 556). The theoretical underpinning of the DAS combined with the statistical data indicate that the DAS possesses good construct validity and that it proves to be a promising instrument that can be utilized for research purposes, and also as a teacher training and coaching tool. This instrument can therefore make a valuable contribution to progress within the field of science education.
Framework Based Guidance Navigation and Control Flight Software Development
NASA Technical Reports Server (NTRS)
McComas, David
2007-01-01
This viewgraph presentation describes NASA's guidance navigation and control flight software development background. The contents include: 1) NASA/Goddard Guidance Navigation and Control (GN&C) Flight Software (FSW) Development Background; 2) GN&C FSW Development Improvement Concepts; and 3) GN&C FSW Application Framework.
NASA Astrophysics Data System (ADS)
Nakatani, T.; Inamura, Y.; Moriyama, K.; Ito, T.; Muto, S.; Otomo, T.
Neutron scattering can be a powerful probe in the investigation of many phenomena in the materials and life sciences. The Materials and Life Science Experimental Facility (MLF) at the Japan Proton Accelerator Research Complex (J-PARC) is a leading center of experimental neutron science and boasts one of the most intense pulsed neutron sources in the world. The MLF currently has 18 experimental instruments in operation that support a wide variety of users from across a range of research fields. The instruments include optical elements, sample environment apparatus and detector systems that are controlled and monitored electronically throughout an experiment. Signals from these components and those from the neutron source are converted into a digital format by the data acquisition (DAQ) electronics and recorded as time-tagged event data in the DAQ computers using "DAQ-Middleware". Operating in event mode, the DAQ system produces extremely large data files (~GB) under various measurement conditions. Simultaneously, the measurement meta-data indicating each measurement condition are recorded in XML format by the MLF control software framework "IROHA". These measurement event data and meta-data are collected in the MLF common storage and cataloged by the MLF Experimental Database (MLF EXP-DB) based on a commercial XML database. The system provides a web interface for users to manage and remotely analyze experimental data.
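As a minimal illustration of what "time-tagged event data" means in practice, the sketch below bins a list of (pixel, time-of-flight) neutron events into a per-pixel TOF histogram with NumPy. The array contents, detector size, and binning parameters are hypothetical and are not taken from the MLF software.

```python
import numpy as np

# Hypothetical event list: one row per detected neutron,
# columns = (detector pixel id, time-of-flight in microseconds).
events = np.array([[12, 4301.2],
                   [12, 8702.5],
                   [37, 1203.9],
                   [37, 4404.1]])

tof_edges = np.arange(0.0, 40000.0, 100.0)   # 100 us TOF bins (placeholder)
n_pixels = 64                                 # placeholder detector size

# Bin events into a (pixel, TOF) histogram, the usual reduction step
# applied after event-mode acquisition.
hist, _, _ = np.histogram2d(events[:, 0], events[:, 1],
                            bins=[np.arange(n_pixels + 1), tof_edges])
print(hist.shape)   # (64, 399)
```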
Trade law and alcohol regulation: what role for a global Alcohol Marketing Code?
Mitchell, Andrew D; Casben, Jessica
2017-01-01
Following calls for restrictions and bans on alcohol advertising, and in light of the tobacco industry's challenge to Australia's tobacco plain packaging measure, a tobacco control measure finding support in the World Health Organization (WHO) Framework Convention on Tobacco Control, this paper considers what role, if any, an international alcohol marketing code might have in preventing or reducing the risk of challenges to domestic alcohol marketing restrictions under trade rules. Narrative review of international trade and health instruments and international trade court judgements regarding alcohol products and marketing restrictions. The experience of European trade courts in the litigation of similar measures suggests that World Trade Organization rules have sufficient flexibility to support the implementation of alcohol marketing restrictions. However, the experience also highlights the possibility that public health measures may have disproportionate and unjustifiable trade effects, and that the ability of a public health measure to withstand a challenge under trade rules will turn on its particular design and implementation. Measures implemented pursuant to international public health instruments are not immune to trade law challenges. Close collaboration between health policymakers, trade officials and lawyers, from as early as the research stage in the development of a measure to ensure a robust evidence base, will ensure the best chance of regulatory survival for an international marketing code. © 2016 Society for the Study of Addiction.
A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT).
Eckhoff, Randall Peter; Kizakevich, Paul Nicholas; Bakalov, Vesselina; Zhang, Yuying; Bryant, Stephanie Patrice; Hobbs, Maria Ann
2015-06-01
Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app's deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. This data analysis results in an app tailored with interventions and other data-collection instruments. For example, if a user anxiety score exceeds a threshold, the iVA might add a meditation intervention to the task list in order to teach the user how to relax, and schedule a reassessment using the anxiety instrument 2 weeks later to re-evaluate. If the anxiety score exceeds a higher threshold, then an advisory to seek professional help would be displayed. Using the easy-to-use PHIT scripting language, the researcher can program new instruments, the iVA, and interventions to suit their domain-specific needs. The iVA, instruments, and interventions are defined via XML files, which facilitates rapid app development and deployment. The PHIT Web-based dashboard portal provides the researcher access to all the uploaded data. After a secure login, the data can be filtered by criteria such as study, protocol, domain, and user. Data can also be exported into a comma-delimited file for further processing. The PHIT framework has proven to be an extensible, reconfigurable technology that facilitates mobile data collection and health intervention research. Additional plans include instrument development in other domains, additional health sensors, and a text messaging notification system.
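The abstract notes that PHIT instruments, interventions, and the iVA are defined in XML. The actual PHIT schema is not reproduced here, so the sketch below only illustrates the general pattern of loading a simple, hypothetical XML instrument definition with Python's standard library; the tag and attribute names are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical instrument definition; the real PHIT schema differs.
XML_DOC = """
<instrument id="phq8" title="Patient Health Questionnaire-8">
  <item id="q1" type="likert" min="0" max="3">Little interest or pleasure in doing things</item>
  <item id="q2" type="likert" min="0" max="3">Feeling down, depressed, or hopeless</item>
</instrument>
"""

root = ET.fromstring(XML_DOC)
print(root.get("title"))
for item in root.findall("item"):
    # Each item would become one data-entry screen element in a PHIT-style app.
    print(item.get("id"), item.get("type"), item.text)
```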
A flexible CAMAC based data system for Space Shuttle scientific instruments
NASA Technical Reports Server (NTRS)
Ehrmann, C. H.; Baker, R. G.; Smith, R. L.; Kaminski, T. J.
1979-01-01
An effort has been made within NASA to produce a low-cost modular system for implementation of Shuttle payloads based on the CAMAC standards for packaging and data transfer. A key element of such a modular system is a means for controlling the data system, collecting and processing the data for transmission to the ground, and issuing commands to the instrument either from the ground or based on the data collected. A description is presented of such a means based on a network of digital processors and CAMAC crate controllers, which allows for the implementation of instruments ranging from those requiring only a single CAMAC crate of functional modules and no data processing to ones requiring multiple crates and multiple data processors.
ERIC Educational Resources Information Center
Hardre, Patricia L.; Crowson, H. Michael; Xie, Kui; Ly, Cong
2007-01-01
Translation of questionnaire instruments to digital administration systems, both self-contained and web-based, is widespread and increasing daily. However, the literature is lean on controlled empirical studies investigating the potential for differential effects of administrative methods. In this study, two university student samples were…
SKA CSP controls: technological challenges
NASA Astrophysics Data System (ADS)
Baffa, C.; Giani, E.; Vrcic, S.; Vela Nuñez, M.
2016-07-01
The Square Kilometer Array (SKA) project is an international effort to build the world's largest radio telescope, with eventually over a square kilometer of collecting area. For SKA Phase 1, Australia will host the low-frequency instrument with more than 500 stations, each containing around 250 individual antennas, whilst South Africa will host an array of close to 200 dishes. The scale of the SKA represents a huge leap forward in both engineering and research and development towards building and delivering a unique instrument, with the detailed design and preparation now well under way. As one of the largest scientific endeavors in history, the SKA will bring together close to 100 organizations from 20 countries. Every aspect of the design and development of such a large and complex instrument requires state-of-the-art technology and an innovative approach. This paper addresses some aspects of the SKA monitor and control system, and in particular describes the development and test results of the CSP Local Monitoring and Control prototype. At the SKA workshop held in April 2015, the SKA monitor and control community chose the TANGO Control System as the framework for the implementation of SKA monitoring and control. This decision will have a large impact on the monitor and control development of the SKA. While the work to incorporate the TANGO Control System into the SKA is in progress, we have started to develop a prototype for the SKA Central Signal Processor (CSP) to mitigate the associated risks. In particular, we have developed a uniform class schema proposal for the sub-element systems of the SKA CSP.
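TANGO device servers of the kind described here are commonly written with the PyTango high-level API. The following is a minimal sketch of a hypothetical CSP sub-element device exposing one attribute and one command; the class, attribute, and command names are placeholders and do not represent the actual SKA-CSP class schema.

```python
from tango.server import Device, attribute, command

class CspSubElement(Device):
    """Hypothetical TANGO device for one CSP sub-element (illustrative only)."""

    def init_device(self):
        super().init_device()
        self._obs_state = "IDLE"

    @attribute(dtype=str)
    def obsState(self):
        # Read-only attribute reporting the observing state.
        return self._obs_state

    @command(dtype_in=str)
    def Configure(self, scan_config):
        # Accept a scan configuration (e.g. a JSON string) and move to READY.
        self._obs_state = "READY"

if __name__ == "__main__":
    CspSubElement.run_server()
```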
NASA Astrophysics Data System (ADS)
Kiekebusch, Mario J.; Di Lieto, Nicola; Sandrock, Stefan; Popovic, Dan; Chiozzi, Gianluca
2014-07-01
ESO is in the process of implementing a new development platform, based on PLCs, for upcoming VLT control systems (new instruments and refurbishing of existing systems to manage obsolescence issues). In this context, we have evaluated the integration and reuse of existing C++ libraries and Simulink models in the real-time environment of BECKHOFF Embedded PCs using the capabilities of the latest version of the TwinCAT software and MathWorks Embedded Coder. In doing so, the aim was to minimize the impact of the new platform by adopting fully tested solutions implemented in C++. This allows us to reuse in-house expertise, as well as to extend the normal capabilities of traditional PLC programming environments. We present the progress of this work and its application in two concrete cases: 1) field rotation compensation for instrument tracking devices such as derotators; 2) the ESO standard axis controller (ESTAC), a generic model-based controller implemented in Simulink and used for the control of telescope main axes.
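The physics behind the first use case, field rotation compensation, is the changing parallactic angle seen by an alt-azimuth mount. As a rough, self-contained illustration (not the ESO implementation), the sketch below evaluates the standard small-field rotation rate, Earth rotation rate times cos(latitude) times cos(azimuth) divided by cos(elevation); the site latitude and pointing values are placeholders, and the sign depends on the azimuth convention.

```python
import math

OMEGA_EARTH = 7.2921e-5   # Earth rotation rate [rad/s]

def field_rotation_rate(lat_deg, az_deg, el_deg):
    """Approximate field rotation rate [rad/s] for an alt-az mount.

    Standard small-field formula; only the magnitude is meaningful here
    because the sign depends on the chosen azimuth convention.
    """
    lat, az, el = (math.radians(x) for x in (lat_deg, az_deg, el_deg))
    return OMEGA_EARTH * math.cos(lat) * math.cos(az) / math.cos(el)

# Placeholder values (roughly Paranal latitude, an arbitrary pointing).
rate = field_rotation_rate(lat_deg=-24.6, az_deg=120.0, el_deg=60.0)
print(f"{abs(math.degrees(rate)) * 3600:.1f} arcsec/s of field rotation")
```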
NASA Astrophysics Data System (ADS)
Dira-Smolleck, Lori
The purpose of this study was to develop, validate and establish the reliability of an instrument that measures preservice teachers' self-efficacy in regard to the teaching of science as inquiry. The instrument (TSI) is based upon the work of Bandura, Riggs, and Enochs & Riggs (1990). The study used Bandura's theoretical framework in that the instrument uses the self-efficacy construct to explore the beliefs of prospective elementary science teachers with regard to the teaching of science through inquiry, specifically the two dimensions of self-efficacy beliefs defined by Bandura: personal self-efficacy and outcome expectancy. Self-efficacy in regard to the teaching of science as inquiry was measured through the use of a 69-item Likert scale instrument designed by the author of the study. A 13-step plan was designed and followed in the process of developing the instrument. Based on Cronbach's alpha and analysis of variance results, the 69-item instrument was found to achieve the greatest balance across construct validity, reliability, and item balance with the Essential Elements of Classroom Inquiry content matrix. Based on the standardized development processes used and the associated evidence, the TSI appears to be a content- and construct-valid instrument, with high internal reliability, for use with prospective elementary teachers to assess self-efficacy beliefs in regard to the teaching of science as inquiry. Implications for research, policy and practice are also discussed.
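For reference, Cronbach's alpha, the internal-consistency statistic cited here, is computed from the item variances and the variance of the total score. A minimal NumPy sketch follows; the response matrix is made-up data, not from this study.

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)
    total_var = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Made-up Likert responses: 5 respondents x 4 items.
demo = [[4, 5, 4, 5],
        [3, 3, 4, 3],
        [5, 5, 5, 4],
        [2, 3, 2, 3],
        [4, 4, 5, 4]]
print(round(cronbach_alpha(demo), 3))
```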
A framework for air quality monitoring based on free public data and open source tools
NASA Astrophysics Data System (ADS)
Nikolov, Hristo; Borisova, Denitsa
2014-10-01
In recent years, the Space agencies (e.g. NASA, ESA) have increasingly adopted policies of providing Earth observation (EO) data and end products concerning air quality, especially in large urban areas, at no cost to researchers and SMEs. These EO data are complemented by an increasing amount of in-situ data, also provided at no cost either by national authorities or from crowdsourced origin. This accessibility, together with the increased processing capabilities of free and open source software, is a prerequisite for the creation of a solid framework for air modeling in support of decision making at medium and large scale. An essential part of this framework is a web-based GIS mapping tool responsible for dissemination of the generated output. In this research an attempt is made to establish a running framework based solely on openly accessible data on air quality and on a set of freely available software tools for processing and modeling, taking into account the present status quo in Bulgaria. Among the primary sources of data, especially for bigger urban areas, for different types of gases and dust particles, are the National Institute of Meteorology and Hydrology of Bulgaria (NIMH) and the National System for Environmental Monitoring managed by the Bulgarian Executive Environmental Agency (ExEA). Both authorities provide data on the concentration of several gases, including CO, CO2, NO2 and SO2, and fine suspended dust (PM10, PM2.5) on a monthly (for some data, daily) basis. In the proposed framework these data will complement data from satellite-based sensors such as the OMI instrument aboard the EOS-Aura satellite and the TROPOMI instrument payload of the future ESA Sentinel-5P mission. An integral part of the framework is the up-to-date land use/land cover map provided by the EEA through the GIO Land CORINE initiative. This map is also a product of EO data distributed at the European level. First and foremost, our effort is focused on providing the wider public living in urbanized areas with one reliable source of information on present air quality conditions. This information might also be used as an indicator of acid rain in agricultural areas close to industrial or electricity plants. Its availability on a regular basis makes such information a valuable source in the case of man-made industrial disasters or incidents such as forest fires. A key issue in developing this framework is to ensure the delivery of reliable data products related to air quality at a larger scale than those available at the moment.
Cost-effectiveness of reducing sulfur emissions from ships.
Wang, Chengfeng; Corbett, James J; Winebrake, James J
2007-12-15
We model the cost-effectiveness of control strategies for reducing SO2 emissions from U.S. foreign commerce ships traveling in existing European or hypothetical U.S. West Coast SOx Emission Control Areas (SECAs) under international maritime regulations. Variation among marginal costs of control for individual ships choosing between fuel-switching and aftertreatment reveals the cost-saving potential of economic incentive instruments. Compared to regulations prescribing low sulfur fuels, a performance-based policy can save up to $260 million for these ships with 80% more emission reductions than required, because least-cost options on some individual ships outperform standards. Optimal simulation of a market-based SO2 control policy for approximately 4,700 U.S. foreign commerce ships traveling in the SECAs in 2002 shows that SECA emissions control targets can be achieved by scrubbing exhaust gas on one out of ten ships, with annual savings up to $480 million over the performance-based policy. A market-based policy could save the fleet approximately $63 million annually under our best-estimate scenario. Spatial evaluation of ship emissions reductions shows that market-based instruments can reduce more SO2 closer to land while being more cost-effective for the fleet. Results suggest that combining performance requirements with market-based instruments can most effectively control SO2 emissions from ships.
Intelligent and robust optimization frameworks for smart grids
NASA Astrophysics Data System (ADS)
Dhansri, Naren Reddy
A smart grid implies a cyberspace real-time distributed power control system to optimally deliver electricity based on varying consumer characteristics. Although smart grids solve many contemporary problems, they give rise to new control and optimization problems with the growing role of renewable energy sources such as wind or solar energy. Given the highly dynamic nature of distributed power generation and the varying consumer demand and cost requirements, the total power output of the grid should be controlled such that the load demand is met while giving a higher priority to renewable energy sources. Hence, the power generated from renewable energy sources should be maximized while minimizing the generation from non-renewable energy sources. This research develops a demand-based automatic generation control and optimization framework for real-time smart grid operations by integrating conventional and renewable energy sources under varying consumer demand and cost requirements. Focusing on the renewable energy sources, the intelligent and robust control frameworks optimize the power generation by tracking the consumer demand in a closed-loop control framework, yielding superior economic and ecological benefits while circumventing nonlinear model complexities and handling uncertainties for superior real-time operation. The proposed intelligent system framework optimizes the smart grid power generation for maximum economic and ecological benefit under an uncertain renewable wind energy source. The numerical results demonstrate that the proposed framework is a viable approach to integrating various energy sources for real-time smart grid implementations. The robust optimization results demonstrate the effectiveness of the robust controllers under bounded power plant model uncertainties and exogenous wind input excitation while maximizing economic and ecological performance objectives. The proposed framework therefore offers a new worst-case deterministic optimization algorithm for smart grid automatic generation control.
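The dissertation abstract does not spell out the dispatch rule, but the stated goal, meeting demand while giving priority to renewable sources, can be illustrated with a trivial merit-order dispatch. All numbers and source names below are hypothetical and are not taken from the work described above.

```python
def dispatch(demand_mw, renewable_available_mw, conventional_capacity_mw):
    """Serve demand from renewables first, then from conventional units."""
    renewable_used = min(demand_mw, renewable_available_mw)
    conventional_used = min(demand_mw - renewable_used, conventional_capacity_mw)
    unserved = demand_mw - renewable_used - conventional_used
    return renewable_used, conventional_used, unserved

# Hypothetical operating point: 120 MW load, 45 MW of wind, 100 MW thermal.
print(dispatch(demand_mw=120.0, renewable_available_mw=45.0,
               conventional_capacity_mw=100.0))
# -> (45.0, 75.0, 0.0)
```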
Flower power: the armoured expert in the CanMEDS competency framework?
Whitehead, Cynthia R; Austin, Zubin; Hodges, Brian D
2011-12-01
Competency frameworks based on roles definitions are currently used extensively in health professions education internationally. One of the most successful and widely used models is the CanMEDS Roles Framework. The medical literature has raised questions about both the theoretical underpinnings and the practical application of outcomes-based frameworks; however, little empirical research has yet examined specific roles frameworks. This study examines the historical development of an important early roles framework, the Educating Future Physicians of Ontario (EFPO) roles, which were instrumental in the development of the CanMEDS roles. Prominent discourses related to roles development are examined using critical discourse analysis methodology. Exploration of discourses that emerged in the development of this particular set of roles definitions highlights the contextual and negotiated nature of roles construction. The discourses of threat and protection prevalent in the EFPO roles development offer insight into the visual construction of a centre of medical expertise surrounded by supporting roles (such as collaborator and manager). Non-medical expert roles may perhaps play the part of 'armour' for the authority of medical expertise under threat. This research suggests that it may not be accurate to consider roles as objective ideals. Effective training models may require explicit acknowledgement of the socially negotiated and contextual nature of roles definitions.
Data Quality Control of the French Permanent Broadband Network in the RESIF Framework.
NASA Astrophysics Data System (ADS)
Grunberg, M.; Lambotte, S.; Engels, F.
2014-12-01
In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin. This allows us to benefit from high-end quality control based on the national and worldwide seismicity. Here we present the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists of applying a variety of processes to check the consistency of the whole system from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, time quality is critical for most scientific data applications. To face this challenge and check the consistency of polarities and amplitudes, we deployed several high-end processes, including a noise correlation procedure to check timing accuracy (instrumental time errors result in a time-shift of the whole cross-correlation, clearly distinct from those due to changes in the physical properties of the medium), and a systematic comparison of synthetic and real data for teleseismic earthquakes of magnitude larger than 6.5 to detect timing errors as well as polarity and amplitude problems.
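The timing check described here rests on the fact that an instrumental clock error shifts the whole noise cross-correlation function by the same lag. The sketch below shows the basic operation with NumPy, estimating the lag of the correlation peak between two traces; the synthetic signals and the 0.1 s offset are invented for illustration and are not part of the RESIF processing chain.

```python
import numpy as np

fs = 100.0                              # sampling rate [Hz]
t = np.arange(0, 600, 1 / fs)           # 10 minutes of data
rng = np.random.default_rng(0)
noise = rng.standard_normal(t.size)

# Two "stations" recording the same noise, the second one late by 0.1 s
# (a hypothetical clock error of 10 samples).
trace_a = noise
trace_b = np.roll(noise, 10)

xcorr = np.correlate(trace_b, trace_a, mode="full")
lags = np.arange(-t.size + 1, t.size)
lag_samples = lags[np.argmax(xcorr)]
print(f"estimated clock offset: {lag_samples / fs:.2f} s")   # ~0.10 s
```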
Foundational Tools for Petascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-05-19
The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, "High-Performance Energy Applications and Systems", SC0004061/FG02-10ER25972, UW PRJ36WV.
The Application Design of Solar Radio Spectrometer Based on FPGA
NASA Astrophysics Data System (ADS)
Du, Q. F.; Chen, R. J.; Zhao, Y. C.; Feng, S. W.; Chen, Y.; Song, Y.
2017-10-01
The solar radio spectrometer is the key instrument for observing solar radio emission. Through the computer software, we control an FPGA-based A/D signal acquisition card to acquire large volumes of data. The data are transferred over a PCI-E port. The program implements timed data collection, retrieval of data for a specified time, and real-time control of the acquisition hardware. It can also plot the solar radio power intensity. Experiments verify the reliability of the solar radio spectrometer; at the same time, the instrument simplifies the operation of observing the Sun.
Measuring Iranian women's sexual behaviors: Expert opinion
Ghorashi, Zohreh; Merghati-Khoei, Effat; Yousefy, Alireza
2014-01-01
The cultural compatibility of sexually related instruments is problematic because the contexts from which the concepts and meanings were extracted may be significantly different from related contexts in a different society. This paper describes the instruments that have been used to assess sexual behaviors, primarily in Western contexts. Then, based on the instruments’ working definition of ‘sexual behavior’ and their theoretical frameworks, we will (1) discuss the applicability or cultural compatibility of existing instruments targeting women's sexual behaviors within an Iranian context, and (2) suggest criteria for sexually related tools applicable in Iranian settings. Iranian women's sexual scripts may compromise the existing instruments’ compatibility. Suggested criteria are as follows: understanding, language of sexuality, ethics and morality. Therefore, developing a culturally comprehensive measure that can adequately examine Iranian women's sexual behaviors is needed. PMID:25250346
A Hierarchical Learning Control Framework for an Aerial Manipulation System
NASA Astrophysics Data System (ADS)
Ma, Le; Chi, yanxun; Li, Jiapeng; Li, Zhongsheng; Ding, Yalei; Liu, Lixing
2017-07-01
A hierarchical learning control framework for an aerial manipulation system is proposed. First, the mechanical design of the aerial manipulation system is introduced and analyzed, and the kinematics and dynamics are modeled based on the Newton-Euler equations. Second, the hierarchical learning framework for this system is presented, in which the flight platform and the manipulator are controlled by separate controllers. RBF (Radial Basis Function) neural networks are employed for parameter estimation and control. Simulations and experiments demonstrate that the proposed methods are effective.
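The paper's controllers are not reproduced here, but the core building block it names, an RBF network used as a function approximator, is easy to sketch. The following NumPy example fits a Gaussian-kernel RBF network to a toy one-dimensional function by batch least squares; the centers, kernel width, and target function are arbitrary placeholders rather than anything from the aerial manipulation system.

```python
import numpy as np

def rbf_features(x, centers, width):
    """Gaussian RBF feature matrix for 1-D inputs."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

# Toy data: approximate sin(x) on [0, 2*pi] (stand-in for an unknown dynamics term).
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x)

centers = np.linspace(0, 2 * np.pi, 15)   # placeholder RBF centers
width = 0.5                               # placeholder kernel width

Phi = rbf_features(x, centers, width)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # batch least-squares fit
print(np.max(np.abs(Phi @ weights - y)))            # small approximation error
```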
Candeias, Vanessa; Armstrong, Timothy P; Xuereb, Godfrey C
2010-01-01
Non-communicable diseases (NCD), such as heart disease, stroke, cancer and diabetes, are by far the leading cause of mortality in the world, representing 60% of all deaths. Unhealthy diets and physical inactivity are well-established risk factors for overweight and the major NCD. In response to the rapid global growth of the NCD burden, the 2008 Action Plan on Prevention and Control of NCD and the 2004 Global Strategy on Diet, Physical Activity and Health (DPAS) have been developed and endorsed as key international policy instruments. As part of the work of the World Health Organization (WHO) to implement these resolutions, a framework describing the core elements for the development and implementation of a national school policy focused on diet and physical activity has been developed. This framework is included in the "DPAS implementation tool box", and it aims to guide policy-makers in the development and implementation of policies that promote healthy eating and physical activity in the school setting through changes in environment, behaviour and education. The article describes the key elements of the framework and details how this tool is integrated into other WHO activities to provide leadership, guidance, capacity building, evidence-based recommendations and advocacy for action to improve dietary practices and increase physical activity globally.
Method and system for providing autonomous control of a platform
NASA Technical Reports Server (NTRS)
Seelinger, Michael J. (Inventor); Yoder, John-David (Inventor)
2012-01-01
The present application provides a system for enabling instrument placement from distances on the order of five meters, for example, and increases accuracy of the instrument placement relative to visually-specified targets. The system provides precision control of a mobile base of a rover and onboard manipulators (e.g., robotic arms) relative to a visually-specified target using one or more sets of cameras. The system automatically compensates for wheel slippage and kinematic inaccuracy ensuring accurate placement (on the order of 2 mm, for example) of the instrument relative to the target. The system provides the ability for autonomous instrument placement by controlling both the base of the rover and the onboard manipulator using a single set of cameras. To extend the distance from which the placement can be completed to nearly five meters, target information may be transferred from navigation cameras (used for long-range) to front hazard cameras (used for positioning the manipulator).
Exploiting IoT Technologies and Open Source Components for Smart Seismic Network Instrumentation
NASA Astrophysics Data System (ADS)
Germenis, N. G.; Koulamas, C. A.; Foundas, P. N.
2017-12-01
The data collection infrastructure of any seismic network poses a number of requirements and trade-offs related to accuracy, reliability, power autonomy and installation and operational costs. Given the right hardware design at the edge of this infrastructure, the embedded software running inside the instruments is the heart of the pre-processing and communication services and of their integration with the central storage and processing facilities of the seismic network. This work demonstrates the feasibility and benefits of exploiting software components from heterogeneous sources to realize a smart seismic data logger, achieving higher reliability, faster integration and lower development and testing costs for critical functionality that is in turn responsible for the cost- and power-efficient operation of the device. The instrument's software builds on top of widely used open source components around the Linux kernel with real-time extensions, the core Debian Linux distribution, and the earthworm and seiscomp tooling frameworks, as well as components from the Internet of Things (IoT) world, such as the CoAP and MQTT protocols for the signaling plane, alongside the widely used de facto standards of the application domain on the data plane, such as the SeedLink protocol. Through an innovative integration of lower-level GPL components of the seiscomp suite with higher-level earthworm processing components, coupled with IoT protocol extensions to the latter, the instrument can implement smart functionality such as network-controlled, event-triggered data transmission in parallel with edge archiving, and on-demand retrieval of short-term historical data.
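To make the signaling-plane idea concrete, the sketch below publishes a hypothetical event-trigger message with the widely used paho-mqtt client library. The broker address, topic, and payload fields are placeholders of my own and are not taken from the instrument described above.

```python
import json
import paho.mqtt.publish as publish

# Placeholder broker and topic; a real deployment would use its own naming scheme.
BROKER = "mqtt.example.org"
TOPIC = "seismic/station01/triggers"

# Hypothetical trigger notification sent by the logger when an event is detected.
payload = {"station": "ST01", "utc": "2017-08-01T12:34:56Z",
           "kind": "sta_lta_trigger", "peak_counts": 18234}

publish.single(TOPIC, json.dumps(payload), qos=1, hostname=BROKER)
```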
Active spacecraft potential control: An ion emitter experiment. [Cluster mission
NASA Technical Reports Server (NTRS)
Riedler, W.; Goldstein, R.; Hamelin, M.; Maehlum, B. N.; Troim, J.; Olsen, R. C.; Pedersen, A.; Grard, R. J. L.; Schmidt, R.; Rudenauer, F.
1988-01-01
The Cluster spacecraft are instrumented with ion emitters for charge neutralization. The emitters produce indium ions at 6 keV. The ion current is adjusted in a feedback loop with instruments measuring the spacecraft potential. The system is based on the evaporation of indium in the apex field of a needle. The design of the active spacecraft potential control instrument and of the ion emitters is presented.
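The abstract describes a feedback loop in which the emitted ion current is adjusted according to the measured spacecraft potential. A toy proportional controller conveys the idea; the gains, limits, and numbers below are invented for illustration and are not based on the Cluster design.

```python
def emitter_current_step(potential_v, target_v=2.0, gain_ua_per_v=1.5,
                         i_min_ua=0.0, i_max_ua=50.0, i_prev_ua=10.0):
    """One step of a toy proportional feedback law for the ion emitter.

    Raises the emitted current when the measured spacecraft potential
    exceeds the target, and clamps the command to the emitter's range.
    All numbers are illustrative placeholders.
    """
    error = potential_v - target_v
    command = i_prev_ua + gain_ua_per_v * error
    return max(i_min_ua, min(i_max_ua, command))

print(emitter_current_step(potential_v=7.0))   # potential too high -> more current
print(emitter_current_step(potential_v=1.0))   # near target -> slightly less current
```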
ERIC Educational Resources Information Center
Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien
2016-01-01
The purpose of this study was to develop a computer-based measure of elementary students' science talk and to report students' benchmarks. The development procedure had three steps: defining the framework of the test, collecting and identifying key reference sets of science talk, and developing and verifying the science talk instrument. The…
ERIC Educational Resources Information Center
Novikasari, Ifada; Darhim, Didi Suryadi
2015-01-01
This study explored the characteristics of pre-service primary teachers (PSTs) as influenced by their mathematical beliefs and mathematical knowledge for teaching (MKT). A qualitative approach was used to investigate the PSTs' levels of mathematical belief and MKT. The two research instruments used in this study were an interview-based task and a…
Development and Validation of the Meaning of Work Inventory among French Workers
ERIC Educational Resources Information Center
Arnoux-Nicolas, Caroline; Sovet, Laurent; Lhotellier, Lin; Bernaud, Jean-Luc
2017-01-01
The purpose of this study was to validate a psychometric instrument among French workers for assessing the meaning of work. Following an empirical framework, a two-step procedure consisted of exploring and then validating the scale among distinctive samples. The consequent Meaning of Work Inventory is a 15-item scale based on a four-factor model,…
Polarimetry noise in fiber-based optical coherence tomography instrumentation
Zhang, Ellen Ziyi; Vakoc, Benjamin J.
2011-01-01
High noise levels in fiber-based polarization-sensitive optical coherence tomography (PS-OCT) have broadly limited its clinical utility. In this study we investigate the contribution of polarization mode dispersion (PMD) to the polarimetry noise. We develop numerical models of the PS-OCT system including PMD and validate these models with empirical data. Using these models, we provide a framework for predicting noise levels, for processing signals to reduce noise, and for designing an optimized system. PMID:21935044
Posters that foster cognition in the classroom: Multimedia theory applied to educational posters
NASA Astrophysics Data System (ADS)
Hubenthal, M.; O'Brien, T.; Taber, J.
2011-12-01
Despite a decline in popularity within U.S. society, posters continue to hold a prominent place within middle and high school science classrooms. Teachers' demand for posters is largely satisfied by governmental and non-profit science organizations' education and public outreach (EPO) efforts. Here, posters are produced and disseminated both as tangible products resulting from the organization's research and as instruments to communicate scientific content to teachers and students. This study investigates the taken-for-granted value of posters through a survey/interview of science teachers who received sample posters at professional development workshops. The design of sample EPO posters was also examined for its implied, underlying assumptions about learning and its alignment to the setting of the classroom, which is unique within the genre of posters. Based on this analysis we found that rates of poster use were as low as 43% and that many EPO posters fail to achieve their potential as instructional instruments. As a result, many EPO posters are relegated to merely a collection of pretty pictures on the wall. Leveraging existing research in cognition and the cognitive theory of multimedia learning, we propose a design framework for educational posters that is likely to activate students' attention, catalyze cognitive processing, provide a framework to guide students' construction of knowledge, and connect to extended learning through live or web-based exploration of phenomena. While work to examine the implications of this framework is still ongoing, we present a prototype poster and supporting website developed using the framework as a guide, as well as results from focus group discussions with classroom practitioners regarding the prototype poster and its potential in the classroom.
Micro-controller based air pressure monitoring instrumentation system using optical fibers as sensor
NASA Astrophysics Data System (ADS)
Hazarika, D.; Pegu, D. S.
2013-03-01
This paper describes a micro-controller based instrumentation system to monitor air pressure using optical fiber sensors. The principle of macrobending is used to develop the sensor system. The instrumentation system consists of a laser source, a beam splitter, two multimode optical fibers, two Light Dependent Resistor (LDR) based timer circuits and an AT89S8252 micro-controller. The beam splitter is used to divide the laser beam into two parts, and these two beams are then launched into the two multimode fibers. One of the multimode fibers is used as the sensor fiber and the other as the reference fiber. The reference fiber is used to eliminate environmental effects while measuring the air pressure magnitude. The laser beams from the sensor and reference fibers are applied to two identical LDR based timer circuits. The LDR based timer circuits are interfaced to the micro-controller through its counter pins. The micro-controller samples the frequencies of the timer circuits using its counter-0 and counter-1, and the counter values are then processed to provide a measure of the air pressure magnitude.
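The processing step, turning the two counter frequencies into a pressure reading, is not detailed in the abstract. The sketch below shows one plausible approach under assumptions of my own: normalize the sensor-fiber frequency by the reference-fiber frequency to cancel common drifts, then map the ratio to pressure through a linear calibration; all constants are hypothetical.

```python
def pressure_from_counts(sensor_freq_hz, reference_freq_hz,
                         cal_slope_kpa=250.0, cal_offset_kpa=-175.0):
    """Hypothetical conversion of the two timer frequencies to pressure.

    The ratio cancels drifts common to both fibers (source power, temperature);
    the linear calibration constants would come from a laboratory calibration run.
    """
    ratio = sensor_freq_hz / reference_freq_hz
    return cal_slope_kpa * ratio + cal_offset_kpa

# Example counter readings over a 1 s gate time (made-up numbers).
print(round(pressure_from_counts(sensor_freq_hz=820.0,
                                 reference_freq_hz=1000.0), 1))   # kPa
```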
Chapman, Audrey R
2017-01-01
The alcohol industry in the Latin American and Caribbean (LAC) region promotes demand for alcohol products actively through a number of channels, including advertising and sponsorship of sports and other events. This paper evaluates whether human rights instruments that Latin American countries have ratified can be used to limit children's exposure to alcohol advertising and promotion. A review was conducted of the text of, and interpretative documents related to, a series of international and regional human rights instruments ratified by most countries in the LAC region that enumerate the right to health. The Convention on the Rights of the Child has the most relevant provisions to protect children and youth from alcohol promotion and advertising. Related interpretive documents by the United Nations Committee on the Rights of the Child affirm that corporations hold duties to respect and protect children's right to health. Human rights norms and law can be used to regulate or eliminate alcohol beverage marketing and promotional activities in the Latin American region. The paper recommends developing a human rights based Framework Convention on Alcohol Control to provide guidance. © 2016 Society for the Study of Addiction.
Applying the ICF framework to study changes in quality-of-life for youth with chronic conditions
McDougall, Janette; Wright, Virginia; Schmidt, Jonathan; Miller, Linda; Lowry, Karen
2011-01-01
Objective The objective of this paper is to describe how the ICF framework was applied as the foundation for a longitudinal study of changes in quality-of-life (QoL) for youth with chronic conditions. Method This article will describe the study’s aims, methods, measures and data analysis techniques. It will point out how the ICF framework was used—and expanded upon—to provide a model for studying the impact of factors on changes in QoL for youth with chronic conditions. Further, it will describe the instruments that were chosen to measure the components of the ICF framework and the data analysis techniques that will be used to examine the impact of factors on changes in youths’ QoL. Conclusions Qualitative and longitudinal designs for studying QoL based on the ICF framework can be useful for unraveling the complex ongoing inter-relationships among functioning, contextual factors and individuals’ perceptions of their QoL. PMID:21034288
Developing an instrument for assessing students' concepts of the nature of technology
NASA Astrophysics Data System (ADS)
Liou, Pey-Yan
2015-05-01
Background: The nature of technology has rarely been discussed despite the fact that technology plays an essential role in modern society. It is important to discuss students' concepts of the nature of technology, and further to advance their technological literacy and adaptation to modern society. There is a need to assess high school students' concepts of the nature of technology. Purpose: This study aims to engage in discourse on students' concepts of the nature of technology based on a proposed theoretical framework. Moreover, another goal is to develop an instrument for measuring students' concepts of the nature of technology. Sample: Four hundred and fifty-five high school students' perceptions of technology were qualitatively analyzed. Furthermore, 530 students' responses to a newly developed questionnaire were quantitatively analyzed in the final test. Design and method: First, content analysis was utilized to discuss and categorize students' statements regarding technology and its related issues. The Student Concepts of the Nature of Technology Questionnaire was developed based on the proposed theoretical framework and was supported by the students' qualitative data. Finally, exploratory factor analysis and reliability analysis were applied to determine the structure of the items and the internal consistency of each scale. Results: Through a process of instrument development, the Student Concepts of the Nature of Technology Questionnaire was shown to be a valid and reliable tool for measuring students' concepts of the nature of technology. This newly developed questionnaire is composed of 29 items in six scales, namely 'technology as artifacts,' 'technology as an innovation change,' 'the current role of technology in society,' 'technology as a double-edged sword,' 'technology as a science-based form,' and 'history of technology.' Conclusions: The Student Concepts of the Nature of Technology Questionnaire has been confirmed as a reasonably valid and reliable instrument. This study provides a useful questionnaire for educational researchers and practitioners for measuring students' concepts of the nature of technology.
New Contemporary Criterion-Referenced Assessment Instruments for Astronomy & Geology: TOAST & EGGS
NASA Astrophysics Data System (ADS)
Guffey, Sarah Katie; Slater, Stephanie J.; Slater, Timothy F.
2015-08-01
Considerable effort in astronomy and Earth science education research over the past decade has focused on developing assessment tools in the form of multiple-choice conceptual diagnostics and content knowledge surveys. This has been critically important in advancing discipline-based education research, allowing scholars to establish the initial, incoming knowledge state of students as well as to attempt to measure some of the impacts of innovative instructional interventions. Until now, few of the existing instruments were constructed upon a solid list of clearly articulated and widely agreed upon learning objectives. Whereas first-generation assessment tools, such as the Astronomy Diagnostics Test (ADT), were based primarily upon further identifying documented astronomy misconceptions, scholars from the CAPER Center for Astronomy & Physics Education Research team are instead creating contemporary instruments by developing items using modern test construction techniques, tightly aligned to the consensus learning goals identified by the American Association for the Advancement of Science's Project 2061 Benchmarks, the National Research Council's National Science Education Standards, and the National Research Council's A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. These consensus learning goals are further enhanced by guiding documents from the American Astronomical Society Chair's Conference on ASTRO 101 and the NSF-funded Earth Science Literacy Initiative. Two of the resulting criterion-referenced assessment tools widely used by researchers are the Test Of Astronomy STandards (TOAST) and the Exam of GeoloGy StandardS (EGGS). These easy-to-use and easy-to-score multiple-choice instruments have a high degree of reliability and validity for instructors and researchers needing information on students' initial knowledge state at the beginning of a course, and can be used, in aggregate, to help measure the impact of teaching innovations whose learning goals are tightly aligned to the consensus goals of the broader education community.
Alternative Policy Instruments. CPRE Joint Note Series.
ERIC Educational Resources Information Center
McDonnell, Lorraine M.; Elmore, Richard F.
This publication builds a conceptual framework that categorizes alternative policy instruments for educational reform into actions. It defines four categories of policy instruments and hypothesizes how each will operate in addressing different policy problems in different political and organizational contexts. Subsequent research will assess…
5. INSTRUMENT ROOM INTERIOR, SHOWING BACKS OF CONSOLE LOCKERS. Looking ...
5. INSTRUMENT ROOM INTERIOR, SHOWING BACKS OF CONSOLE LOCKERS. Looking northeast to firing control room passageway. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Firing Control Building, Test Area 1-100, northeast end of Test Area 1-100 Road, Boron, Kern County, CA
NASA Astrophysics Data System (ADS)
Baqué, M.; Dobrijevic, M.; Le Postollec, A.; Moreau, T.; Faye, C.; Vigier, F.; Incerti, S.; Coussot, G.; Caron, J.; Vandenabeele-Trambouze, O.
2017-01-01
Several instruments based on immunoassay techniques have been proposed for life-detection experiments in the framework of planetary exploration, but few experiments have been conducted so far to test the resistance of antibodies to cosmic ray particles. We present several irradiation experiments carried out on both grafted and free antibodies for different types of incident particles (protons, neutrons, electrons and 12C) at different energies (between 9 MeV and 50 MeV) and different fluences. No loss of antibody activity was detected for the whole set of experiments except for protons with energies between 20 and 30 MeV (on free and grafted antibodies) at fluences much greater than expected for a typical planetary mission, to Mars for instance. Our results on grafted antibodies suggest that biochip-based instruments must be carefully designed according to the expected radiation environment for a given mission. In particular, a surface density of antibodies much larger than the expected proton fluence would prevent significant loss of antibody activity and thus assure successful detection.
Energy taxation as a policy instrument to reduce CO2 emissions: A net benefit analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyd, R.; Krutilla, K.; Viscusi, W.K.
1995-07-01
This study evaluates the costs and benefits of energy taxation as a policy instrument to conserve energy and reduce CO2 emissions. The study combines economic cost estimates generated with a CGE model and monetary estimates of environmental damages in a comprehensive cost/benefit framework. We find that optimal CO2 emissions reductions range from 5 to 38%, depending on different assumptions about energy substitution elasticities and environmental damages. CO2 emissions reductions of between 8 and 64% can be attained at no additional welfare cost relative to a policy of not undertaking any action to control CO2 emissions. 33 refs., 7 figs., 8 tabs.
Comprehensive Fault Tolerance and Science-Optimal Attitude Planning for Spacecraft Applications
NASA Astrophysics Data System (ADS)
Nasir, Ali
Spacecraft operate in a harsh environment, are costly to launch, and experience unavoidable communication delay and bandwidth constraints. These factors motivate the need for effective onboard mission and fault management. This dissertation presents an integrated framework to optimize science goal achievement while identifying and managing encountered faults. Goal-related tasks are defined by pointing the spacecraft instrumentation toward distant targets of scientific interest. The relative value of science data collection is traded against the risk of failures to determine an optimal policy for mission execution. Our major innovation in fault detection and reconfiguration is to incorporate fault information obtained from two types of spacecraft models: one based on the dynamics of the spacecraft and the second based on the internal composition of the spacecraft. For fault reconfiguration, we consider possible changes in both the dynamics-based control law configuration and the composition-based switching configuration. We formulate our problem as a stochastic sequential decision problem, or Markov Decision Process (MDP). To avoid the computational complexity involved in a fully integrated MDP, we decompose our problem into multiple MDPs. These MDPs include planning MDPs for different fault scenarios, a fault detection MDP based on a logic-based model of spacecraft component and system functionality, an MDP for resolving conflicts between fault information from the logic-based model and the dynamics-based spacecraft models, and the reconfiguration MDP that generates a policy optimized over the relative importance of the mission objectives versus spacecraft safety. Approximate Dynamic Programming (ADP) methods are applied for the decomposition of the planning and fault detection MDPs. To show the performance of the MDP-based frameworks and ADP methods, a suite of spacecraft attitude planning case studies is described. These case studies are used to analyze the content and behavior of the computed policies in response to changes in design parameters. A primary case study is built from the Far Ultraviolet Spectroscopic Explorer (FUSE) mission, for which component models and their probabilities of failure are based on realistic mission data. A comparison of our approach with an alternative framework for spacecraft task planning and fault management is presented in the context of the FUSE mission.
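The abstract frames planning and fault management as MDPs solved approximately. As background only, the sketch below runs plain value iteration on a tiny, made-up two-state MDP; the states, actions, rewards, and transition probabilities are illustrative placeholders and are unrelated to the FUSE case study or the decomposition described above.

```python
import numpy as np

# Tiny made-up MDP: states {0: "healthy", 1: "faulted"}, actions {0: "observe", 1: "safe-mode"}.
# P[a][s, s'] = transition probability, R[a][s] = expected reward.
P = [np.array([[0.95, 0.05], [0.00, 1.00]]),   # observe
     np.array([[1.00, 0.00], [0.60, 0.40]])]   # safe-mode (may recover the fault)
R = [np.array([1.0, -5.0]),                    # science reward vs. risk while faulted
     np.array([0.0, -1.0])]
gamma = 0.95

V = np.zeros(2)
for _ in range(500):                            # value iteration to (near) convergence
    Q = np.array([R[a] + gamma * P[a] @ V for a in range(2)])
    V = Q.max(axis=0)
policy = Q.argmax(axis=0)
print(V, policy)    # e.g. observe while healthy, safe-mode when faulted
```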
Okayama optical polarimetry and spectroscopy system (OOPS) II. Network-transparent control software.
NASA Astrophysics Data System (ADS)
Sasaki, T.; Kurakami, T.; Shimizu, Y.; Yutani, M.
The control system of the OOPS (Okayama Optical Polarimetry and Spectroscopy system) is designed to integrate several instruments whose controllers are distributed over a network: the OOPS instrument, a CCD camera and data acquisition unit, the 91 cm telescope, an autoguider, a weather monitor, and the image display tool SAOimage. With the help of message-based communication, the control processes cooperate with related processes to perform an astronomical observation under the supervisory control of a scheduler process. A logger process collects status data from all the instruments and distributes them to related processes upon request. The software structure of each process is described.
24-channel dual microcontroller-based voltage controller for ion optics remote control
NASA Astrophysics Data System (ADS)
Bengtsson, L.
2018-05-01
The design of a 24-channel voltage control instrument for Wenzel Elektronik N1130 NIM modules is described. This instrument is remotely controlled from a LabVIEW GUI on a host Windows computer and is intended for ion optics control in electron affinity measurements on negative ions at the CERN-ISOLDE facility. Each channel has a resolution of 12 bits and exhibits normally distributed noise with a standard deviation of <1 mV. The instrument is designed as a standard 2-unit NIM module in which the electronic hardware consists of a printed circuit board with two asynchronously operating microcontrollers.
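As a side note on the stated 12-bit resolution, mapping a requested output voltage to a DAC code is a simple quantization. The sketch below assumes a hypothetical 0-10 V unipolar output range, which is an illustrative choice and not taken from the instrument description.

```python
def dac_code(voltage, v_full_scale=10.0, bits=12):
    """Quantize a requested voltage to an n-bit DAC code (0-10 V range assumed)."""
    levels = (1 << bits) - 1                      # 4095 for 12 bits
    code = round(voltage / v_full_scale * levels)
    return max(0, min(levels, code))

print(dac_code(2.5))            # -> 1024
print(10.0 / 4095 * 1000)       # one LSB in mV, ~2.44 mV for a 0-10 V range
```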
NASA Astrophysics Data System (ADS)
Tatrai, David; Bors, Noemi; Gulyas, Gabor; Szabo, Gabor; Smit, Herman G. J.; Petzold, Andreas; Bozoki, Zoltan
2016-04-01
Airborne hygrometry is one of the key topics in atmospheric and climate research, which is why airborne hygrometers are almost always included in aircraft-based measurement campaigns (see e.g. MOZAIC and CARIBIC). However, for its successful application an airborne hygrometer has to be able to measure humidity over a wide range (1-60,000 ppmV) at various total pressures with high accuracy and short response time. In addition, an instrument capable of measuring water vapor and condensed water selectively has considerable added value, as the water content of clouds appears to be a very uncertain parameter in climate models. At the University of Szeged, a dual-channel, photoacoustic spectroscopy based hygrometer system has been developed that measures water vapor concentration and total water content simultaneously from ground level up to cruising altitude [1, 2]. An early version of this system is the core hygrometer of the CARIBIC project. In the past few years efforts were made to further improve the performance and long-term reliability of the system [3] while also reducing its size and weight. The most important of the recent achievements is a new data acquisition and control system with more precise control performance [4]. Many of these results have been proved by various laboratory (AquaVIT2a-b) and in-flight (DANCHAR-IFCC, AIRTOSS I-II) measurement campaigns. Based on these results the system received an invitation to join the IAGOS ESFRI project as one of its core instruments. The presented work was funded by EUFAR contract no. 227159, by the Hungarian Research and Technology Innovation Fund (OTKA), project no. NN109679, and by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 312311. [1] Szakáll, M. et al.: Infrared Physics & Technology, 2006, 48 (3), 192-201. [2] Szakáll, M. et al.: Infrared Physics & Technology, 2007, 51 (2), 113-121. [3] Tátrai, D. et al.: Atmos. Meas. Tech., 8, 33-42, 2015. [4] Tátrai, D. et al.: Measurement, 63, 259-268, 2015.
Feasibility of Optical Instruments Based on Multiaperture Optics.
1984-10-16
system may be configured. The optical elements may be nonimaging concentrators (light horns), the field of view (FOV) of which may be controlled by a... (Final report, Florida Univ. Gainesville, Dept. of Nuclear Engineering Sciences.)
Structure-based control of complex networks with nonlinear dynamics
NASA Astrophysics Data System (ADS)
Zanudo, Jorge G. T.; Yang, Gang; Albert, Reka
What can we learn about controlling a system solely from its underlying network structure? Here we use a framework for control of networks governed by a broad class of nonlinear dynamics that includes the major dynamic models of biological, technological, and social processes. This feedback-based framework provides realizable node overrides that steer a system towards any of its natural long term dynamic behaviors, regardless of the dynamic details and system parameters. We use this framework on several real networks, identify the topological characteristics that underlie the predicted node overrides, and compare its predictions to those of classical structural control theory. Finally, we demonstrate this framework's applicability in dynamic models of gene regulatory networks and identify nodes whose override is necessary for control in the general case, but not in specific model instances. This work was supported by NSF Grants PHY 1205840 and IIS 1160995. JGTZ is a recipient of a Stand Up To Cancer - The V Foundation Convergence Scholar Award.
A Framework for Instrument Development of a Choice Experiment: An Application to Type 2 Diabetes.
Janssen, Ellen M; Segal, Jodi B; Bridges, John F P
2016-10-01
Choice experiments are increasingly used to obtain patient preference information for regulatory benefit-risk assessments. Despite the importance of instrument design, there remains a paucity of literature applying good research principles. We applied a novel framework for instrument development of a choice experiment to measure type 2 diabetes mellitus treatment preferences. Applying the framework, we used evidence synthesis, expert consultation, stakeholder engagement, pretest interviews, and pilot testing to develop a best-worst scaling (BWS) and discrete choice experiment (DCE). We synthesized attributes from published DCEs for type 2 diabetes, consulted clinical experts, engaged a national advisory board, conducted local cognitive interviews, and pilot tested a national survey. From published DCEs (n = 17), ten attribute categories were extracted with cost (n = 11) having the highest relative attribute importance (RAI) (range 6-10). Clinical consultation and stakeholder engagement identified six attributes for inclusion. Cognitive pretesting with local diabetes patients (n = 25) ensured comprehension of the choice experiment. Pilot testing with patients from a national sample (n = 50) identified nausea as most important (RAI for DCE: 10 [95 % CI 8.5-11.5]; RAI for BWS: 10 [95 % CI 8.9-11.1]). The developed choice experiment contained six attributes (A1c decrease, blood glucose stability, low blood glucose, nausea, additional medicine, and cost). The framework for instrument development of a choice experiment included five stages of development and incorporated multiple stakeholder perspectives. Further comparisons of instrument development approaches are needed to identify best practices. To facilitate comparisons, researchers need to be encouraged to publish or discuss their instrument development strategies and findings.
A framework and a measurement instrument for sustainability of work practices in long-term care
2011-01-01
Background In health care, many organizations are working on quality improvement and/or innovation of their care practices. Although the effectiveness of improvement processes has been studied extensively, little attention has been given to sustainability of the changed work practices after implementation. The objective of this study is to develop a theoretical framework and measurement instrument for sustainability. To this end sustainability is conceptualized with two dimensions: routinization and institutionalization. Methods The exploratory methodological design consisted of three phases: a) framework development; b) instrument development; and c) field testing in former improvement teams in a quality improvement program for health care (N teams = 63, N individual = 112). Data were not collected until at least one year had passed after implementation. Underlying constructs and their interrelations were explored using Structural Equation Modeling and Principal Component Analyses. Internal consistency was computed with Cronbach's alpha coefficient. A long and a short version of the instrument are proposed. Results The χ²-difference test of the −2 log-likelihood estimates demonstrated that the hierarchical two-factor model with routinization and institutionalization as separate constructs showed a better fit than the one-factor model (p < .01). Secondly, construct validity of the instrument was strong, as indicated by the high factor loadings of the items. Finally, the internal consistency of the subscales was good. Conclusions The theoretical framework offers a valuable starting point for the analysis of sustainability on the level of actual changed work practices. Even though the two dimensions routinization and institutionalization are related, they are clearly distinguishable and each has distinct value in the discussion of sustainability. Finally, the subscales conformed to psychometric properties defined in literature. The instrument can be used in the evaluation of improvement projects. PMID:22087884
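Since the study reports internal consistency via Cronbach's alpha, a generic computation is sketched below; the item matrix is made-up illustrative data, not the study's.

import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = items of one subscale."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

example_items = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3]]   # fabricated responses
print(round(cronbach_alpha(example_items), 2))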
A software framework for real-time multi-modal detection of microsleeps.
Knopp, Simon J; Bones, Philip J; Weddell, Stephen J; Jones, Richard D
2017-09-01
A software framework is described which was designed to process EEG, video of one eye, and head movement in real time, towards achieving early detection of microsleeps for prevention of fatal accidents, particularly in transport sectors. The framework is based around a pipeline structure with user-replaceable signal processing modules. This structure can encapsulate a wide variety of feature extraction and classification techniques and can be applied to detecting a variety of aspects of cognitive state. Users of the framework can implement signal processing plugins in C++ or Python. The framework also provides a graphical user interface and the ability to save and load data to and from arbitrary file formats. Two small studies are reported which demonstrate the capabilities of the framework in typical applications: monitoring eye closure and detecting simulated microsleeps. While specifically designed for microsleep detection/prediction, the software framework can be just as appropriately applied to (i) other measures of cognitive state and (ii) development of biomedical instruments for multi-modal real-time physiological monitoring and event detection in intensive care, anaesthesiology, cardiology, neurosurgery, etc. The software framework has been made freely available for researchers to use and modify under an open source licence.
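The pipeline of user-replaceable signal-processing modules can be sketched roughly as below; the class names and toy processing steps are illustrative assumptions, not the published framework's actual C++/Python plugin API.

class Stage:
    """Base class for a user-replaceable processing module."""
    def process(self, sample):
        raise NotImplementedError

class EyeClosureDetector(Stage):
    def process(self, sample):
        return {**sample, "eye_closed": sample["eye_aperture"] < 0.2}

class MicrosleepClassifier(Stage):
    def process(self, sample):
        score = 0.7 if sample.get("eye_closed") else 0.1       # placeholder classifier
        return {**sample, "microsleep_probability": score}

class Pipeline:
    def __init__(self, stages):
        self.stages = stages             # stages can be swapped without touching the pipeline
    def run(self, sample):
        for stage in self.stages:
            sample = stage.process(sample)
        return sample

pipe = Pipeline([EyeClosureDetector(), MicrosleepClassifier()])
print(pipe.run({"eeg": [1.2, 0.8], "eye_aperture": 0.05}))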
Validation of educational assessments: a primer for simulation and beyond.
Cook, David A; Hatala, Rose
2016-01-01
Simulation plays a vital role in health professions assessment. This review provides a primer on assessment validation for educators and education researchers. We focus on simulation-based assessment of health professionals, but the principles apply broadly to other assessment approaches and topics. Validation refers to the process of collecting validity evidence to evaluate the appropriateness of the interpretations, uses, and decisions based on assessment results. Contemporary frameworks view validity as a hypothesis, and validity evidence is collected to support or refute the validity hypothesis (i.e., that the proposed interpretations and decisions are defensible). In validation, the educator or researcher defines the proposed interpretations and decisions, identifies and prioritizes the most questionable assumptions in making these interpretations and decisions (the "interpretation-use argument"), empirically tests those assumptions using existing or newly-collected evidence, and then summarizes the evidence as a coherent "validity argument." A framework proposed by Messick identifies potential evidence sources: content, response process, internal structure, relationships with other variables, and consequences. Another framework proposed by Kane identifies key inferences in generating useful interpretations: scoring, generalization, extrapolation, and implications/decision. We propose an eight-step approach to validation that applies to either framework: Define the construct and proposed interpretation, make explicit the intended decision(s), define the interpretation-use argument and prioritize needed validity evidence, identify candidate instruments and/or create/adapt a new instrument, appraise existing evidence and collect new evidence as needed, keep track of practical issues, formulate the validity argument, and make a judgment: does the evidence support the intended use? Rigorous validation first prioritizes and then empirically evaluates key assumptions in the interpretation and use of assessment scores. Validation science would be improved by more explicit articulation and prioritization of the interpretation-use argument, greater use of formal validation frameworks, and more evidence informing the consequences and implications of assessment.
Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2010-01-01
Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2011-07-01
Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
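The decoupled, multi-rate simulation idea described above (haptics updated far faster than rendering) can be sketched with two threads sharing one model state; the rates and state variables are illustrative, and this is not the published framework's code.

import threading, time

state = {"tool_depth_mm": 0.0}
lock = threading.Lock()
stop = threading.Event()

def haptics_loop(rate_hz=1000):
    """Fast loop: force/position update at roughly 1 kHz."""
    dt = 1.0 / rate_hz
    while not stop.is_set():
        with lock:
            state["tool_depth_mm"] += 0.01          # stand-in for the physics/haptics update
        time.sleep(dt)

def viewer_loop(rate_hz=30):
    """Slow loop: rendering reads the shared state at ~30 Hz."""
    dt = 1.0 / rate_hz
    while not stop.is_set():
        with lock:
            depth = state["tool_depth_mm"]
        print(f"render frame, tool depth = {depth:.2f} mm")
        time.sleep(dt)

threading.Thread(target=haptics_loop, daemon=True).start()
threading.Thread(target=viewer_loop, daemon=True).start()
time.sleep(0.2)
stop.set()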
A study on acceptance of mobileschool at secondary schools in Malaysia: Urban vs rural
NASA Astrophysics Data System (ADS)
Hashim, Ahmad Sobri; Ahmad, Wan Fatimah Wan; Sarlan, Aliza
2017-10-01
Developing countries face a dilemma in which sophisticated technologies advance faster than the readiness of their people to adopt them. In education, many novel approaches and technologies have been introduced; however, little effort has been made to apply them in practice. MobileSchool is a mobile learning (m-learning) management system developed for administrative, teaching and learning processes at secondary schools in Malaysia. This paper presents the acceptance of MobileSchool at urban and rural secondary schools in Malaysia. The research framework was designed based on the Technology Acceptance Model (TAM). The constructs of the framework include computer anxiety, self-efficacy, facilitating condition, technological complexity, perceived behavioral control, perceived ease of use, perceived usefulness, attitude and behavioral intention. A questionnaire was used as the research instrument, involving 373 students from four secondary schools (two in the urban category and two in the rural category) in Perak. Inferential analyses (hypothesis testing with t-tests) and descriptive analyses (means and percentages) were used to analyze the data. Results showed no large differences (<20%) between urban and rural secondary schools on any acceptance construct except computer anxiety.
The Open Perimetry Interface: an enabling tool for clinical visual psychophysics.
Turpin, Andrew; Artes, Paul H; McKendrick, Allison M
2012-01-01
Perimeters are commercially available instruments for measuring various attributes of the visual field in a clinical setting. They have several advantages over traditional lab-based systems for conducting vision experiments, including built-in gaze tracking and calibration, polished appearance, and attributes to increase participant comfort. Prior to this work, there was no standard to control such instruments, making it difficult and time consuming to use them for novel psychophysical experiments. This paper introduces the Open Perimetry Interface (OPI), a standard set of functions that can be used to control perimeters. Currently the standard is partially implemented in the open-source programming language R on two commercially available instruments: the Octopus 900 (a projection-based bowl perimeter produced by Haag-Streit, Switzerland) and the Heidelberg Edge Perimeter (a CRT-based system produced by Heidelberg Engineering, Germany), allowing these instruments to be used as a platform for psychophysical experimentation.
The instrumental rationality of addiction.
Pickard, Hanna
2011-12-01
The claim that non-addictive drug use is instrumental must be distinguished from the claim that its desired ends are evolutionarily adaptive or easy to comprehend. Use can be instrumental without being adaptive or comprehensible. This clarification, together with additional data, suggests that Müller & Schumann's (M&S's) instrumental framework may explain addictive, as well as non-addictive consumption.
ERIC Educational Resources Information Center
Artigue, Michele
2002-01-01
Presents an anthropological approach used in French research and the theory of instrumentation developed in cognitive ergonomics. Shows how these frameworks allow an approach to the educational use of CAS technology, focusing on the unexpected complexity of instrumental genesis, mathematical needs of instrumentation, status of instrumented…
Morowatisharifabad, Mohammad Ali; Mazloomi-Mahmoodabad, Seyed Saied; Afshani, Seyed Alireza; Ardian, Nahid; Vaezi, Ali; Refahi, Seyed Ali Asghar
2018-01-01
AIM: The present study sought to explore the experiences of participants in divorce process according to the theory of planned behaviour. MATERIAL AND METHODS: This qualitative study was conducted using content analysis method. In this research, 27 participants involved in the divorce process were selected. The data were coded, and the qualitative content analysis was performed. RESULTS: Based on four constructs of the theory of planned behaviour, the subcategories of instrumental attitude were “Divorce as the last solution” and “Divorce as damage for individuals and society”. From the perceived behavioural control theme, two subcategories of behavioural control and self-efficacy were drawn; the first subtheme included “Others’ meddling in the married life”, “Social problems reducing behavioural control power” and “Personality characteristics affecting the behavioural control power”; and the second one included: “Education as a means for developing self-efficacy” and “barriers to self-efficacy”. The injunctive norms theme included three subcategories of “Others help to reconcile”, “Others meddling and lack of reconciliation”, and “Families support to reconcile”. The descriptive norms theme was “High divorce rate and misuse of satellite channels and social networks as factors making reconciliation difficult”. CONCLUSION: It seems that education and counselling, within a predefined framework, such as applied theories, can be useful. PMID:29875872
Application of Lightweight Formal Methods to Software Security
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt
2005-01-01
Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused by two instruments and their methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), is described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.
VLT instruments: industrial solutions for non-scientific detector systems
NASA Astrophysics Data System (ADS)
Duhoux, P.; Knudstrup, J.; Lilley, P.; Di Marcantonio, P.; Cirami, R.; Mannetta, M.
2014-07-01
Recent improvements in industrial vision technology and products together with the increasing need for high performance, cost efficient technical detectors for astronomical instrumentation have led ESO with the contribution of INAF to evaluate this trend and elaborate ad-hoc solutions which are interoperable and compatible with the evolution of VLT standards. The ESPRESSO spectrograph shall be the first instrument deploying this technology. ESO's Technical CCD (hereafter TCCD) requirements are extensive and demanding. A lightweight, low maintenance, rugged and high performance TCCD camera product or family of products is required which can operate in the extreme environmental conditions present at ESO's observatories with minimum maintenance and minimal downtime. In addition the camera solution needs to be interchangeable between different technical roles e.g. slit viewing, pupil and field stabilization, with excellent performance characteristics under a wide range of observing conditions together with ease of use for the end user. Interoperability is enhanced by conformance to recognized electrical, mechanical and software standards. Technical requirements and evaluation criteria for the TCCD solution are discussed in more detail. A software architecture has been adopted which facilitates easy integration with TCCD's from different vendors. The communication with the devices is implemented by means of dedicated adapters allowing usage of the same core framework (business logic). The preference has been given to cameras with an Ethernet interface, using standard TCP/IP based communication. While the preferred protocol is the industrial standard GigE Vision, not all vendors supply cameras with this interface, hence proprietary socket-based protocols are also acceptable with the provision of a validated Linux compliant API. A fundamental requirement of the TCCD software is that it shall allow for a seamless integration with the existing VLT software framework. ESPRESSO is a fiber-fed, cross-dispersed echelle spectrograph that will be located in the Combined-Coudé Laboratory of the VLT in the Paranal Observatory in Chile. It will be able to operate either using the light of any of the UT's or using the incoherently combined light of up to four UT's. The stabilization of the incoming beam is achieved by dedicated piezo systems controlled via active loops closed on 4 + 4 dedicated TCCD's for the stabilization of the pupil image and of the field with a frequency goal of 3 Hz on a 2nd to 3rd magnitude star. An additional 9th TCCD system shall be used as an exposure-meter. In this paper we will present the technical CCD solution for future VLT instruments.
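The adapter-based integration described above (one core framework, vendor-specific device adapters, GigE Vision or proprietary socket protocols) can be sketched as follows; the class and method names are hypothetical and not ESO's actual TCCD interfaces.

class CameraAdapter:
    """Common interface the core framework talks to, regardless of vendor."""
    def expose(self, seconds):
        raise NotImplementedError

class GigEVisionAdapter(CameraAdapter):
    def expose(self, seconds):
        return f"GigE Vision camera: acquired a {seconds} s frame"

class VendorSocketAdapter(CameraAdapter):
    def expose(self, seconds):
        return f"proprietary socket-protocol camera: acquired a {seconds} s frame"

class TechnicalCcdCore:
    """Core 'business logic' that stays the same for every vendor adapter."""
    def __init__(self, adapter):
        self.adapter = adapter
    def guide_frame(self):
        return self.adapter.expose(0.3)

for adapter in (GigEVisionAdapter(), VendorSocketAdapter()):
    print(TechnicalCcdCore(adapter).guide_frame())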
Formation Flying and Deformable Instruments
NASA Astrophysics Data System (ADS)
Rio, Yvon
2009-05-01
Astronomers have always attempted to build very stable instruments. They fight everything that can cause mechanical deformation or image motion. This has led to well-established technologies (autoguiding, active optics, thermal control, tip/tilt correction), as well as observing methods based on the use of controlled motion (scanning, micro-scanning, shift-and-add, chopping and nodding). Formation flying disturbs this practice. It is neither possible to reduce the relative motion to very small amplitudes, nor to control it at will. Some impacts on Simbol-X instrument design and operation are presented.
Single pilot scanning behavior in simulated instrument flight
NASA Technical Reports Server (NTRS)
Pennington, J. E.
1979-01-01
A simulation of tasks associated with single pilot general aviation flight under instrument flight rules was conducted as a baseline for future research studies on advanced flight controls and avionics. The tasks, ranging from simple climbs and turns to an instrument landing systems approach, were flown on a fixed base simulator. During the simulation the control inputs, state variables, and the pilots visual scan pattern including point of regard were measured and recorded.
AMBER instrument control software
NASA Astrophysics Data System (ADS)
Le Coarer, Etienne P.; Zins, Gerard; Gluck, Laurence; Duvert, Gilles; Driebe, Thomas; Ohnaka, Keiichi; Heininger, Matthias; Connot, Claus; Behrend, Jan; Dugue, Michel; Clausse, Jean Michel; Millour, Florentin
2004-09-01
AMBER (Astronomical Multiple BEam Recombiner) is a 3-aperture interferometric recombiner operating between 1 and 2.5 um, for the Very Large Telescope Interferometer (VLTI). The control software of the instrument, based on the VLT Common Software, has been written to comply with specific features of the AMBER hardware, such as the infrared detector read-out modes or piezo stage drivers, as well as with the very specific operation modes of an interferometric instrument. In this respect, the AMBER control software was designed to ensure that all operations, from the preparation of the observations to the control/command of the instrument during the observations, would be kept as simple as possible for the users and operators, opening the use of an interferometric instrument to the largest community of astronomers. Particular attention was given to internal checks and calibration procedures, both to evaluate data quality in real time and to improve the success of long-term UV-plane coverage observations.
Liu, Zhaoyang; Mao, Xianqiang; Tu, Jianjun; Jaccard, Mark
2014-11-01
China's iron and steel sector is faced with increasing pressure to control both local air pollutants and CO2 simultaneously. Additional policy instruments are needed to co-control these emissions in this sector. This study quantitatively evaluates and compares two categories of emission reduction instruments, namely the economic-incentive (EI) instrument of a carbon tax, and the command-and-control (CAC) instrument of mandatory application of end-of-pipe emission control measures for CO2, SO2 and NOx. The comparative evaluation tool is an integrated assessment model, which combines a top-down computable general equilibrium sub-model and a bottom-up technology-based sub-model through a soft-linkage. The simulation results indicate that the carbon tax can co-control multiple pollutants, but the emission reduction rates are limited under the tax rates examined in this study. In comparison, the CAC instruments are found to have excellent effects on controlling different pollutants separately, but not jointly. Such results indicate that no single EI or CAC instrument is overwhelmingly superior. The environmental and economic effectiveness of an instrument highly depends on its specific attributes, and cannot be predicted by the general policy category. These findings highlight the necessity of clearer identification of policy target priorities, and detail-oriented and integrated policy-making among different governmental departments. Copyright © 2014 Elsevier Ltd. All rights reserved.
A Hierarchical Framework for Demand-Side Frequency Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moya, Christian; Zhang, Wei; Lian, Jianming
2014-06-02
With large-scale plans to integrate renewable generation, more resources will be needed to compensate for the uncertainty associated with intermittent generation resources. Under such conditions, performing frequency control using only supply-side resources becomes not only prohibitively expensive but also technically difficult. It is therefore important to explore how a sufficient proportion of the loads could assume a routine role in frequency control to maintain the stability of the system at an acceptable cost. In this paper, a novel hierarchical decentralized framework for frequency-based load control is proposed. The framework involves two decision layers. The top decision layer determines the optimal droop gain required from the aggregated load response on each bus using a robust decentralized control approach. The second layer consists of a large number of devices, which switch probabilistically during contingencies so that the aggregated power change matches the desired droop amount according to the updated gains. The proposed framework is based on the classical nonlinear multi-machine power system model, and can deal with time-varying system operating conditions while respecting the physical constraints of individual devices. Realistic simulation results based on a 68-bus system are provided to demonstrate the effectiveness of the proposed strategy.
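The second decision layer can be illustrated with a toy calculation: each device switches off with a probability chosen so that the expected aggregate load reduction matches the droop target for the measured frequency deviation. The device sizes, droop gain and frequency deviation below are made-up numbers, not values from the paper.

import random
random.seed(1)

devices_kw = [1.2] * 500                  # 500 identical 1.2 kW controllable loads on one bus
droop_gain_kw_per_hz = 400.0              # gain assigned to this bus by the top layer
freq_deviation_hz = -0.3                  # measured under-frequency deviation

target_shed_kw = droop_gain_kw_per_hz * abs(freq_deviation_hz)
p_switch = min(1.0, target_shed_kw / sum(devices_kw))

shed_kw = sum(p for p in devices_kw if random.random() < p_switch)
print(f"target {target_shed_kw:.0f} kW, shed {shed_kw:.0f} kW "
      f"(each device switched off with probability {p_switch:.2f})")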
In Brief: U.S. Volcano Early Warning System; Bill provides clear mandate for NOAA
NASA Astrophysics Data System (ADS)
Showstack, Randy
2005-05-01
The U.S. Geological Survey on 29 April released a comprehensive review of the 169 U.S. volcanoes, and established a framework for a National Volcano Early Warning System that is being formulated by the Consortium of U.S. Volcano Observatories. The framework proposes an around-the-clock Volcano Watch Office and improved instrumentation and monitoring at targeted volcanoes. The report, authored by USGS scientists John Ewert, Marianne Guffanti, and Thomas Murray, notes that although a few U.S. volcanoes are well-monitored, half of the most threatening volcanoes are monitored at a basic level and some hazardous volcanoes have no ground-based monitoring.
Michel, Yvonne Anne; Engel, Lidia; Rand-Hendriksen, Kim; Augestad, Liv Ariane; Whitehurst, David Gt
2016-11-28
In health economic analyses, health states are typically valued using instruments with few items per dimension. Due to the generic (and often reductionist) nature of such instruments, certain groups of respondents may experience challenges in describing their health state. This study is concerned with generic, preference-based health state instruments that provide information for decisions about the allocation of resources in health care. Unlike physical measurement instruments, preference-based health state instruments provide health state values that are dependent on how respondents interpret the items. This study investigates how individuals with spinal cord injury (SCI) interpret mobility-related items contained within six preference-based health state instruments. Secondary analysis of focus group transcripts originally collected in Vancouver, Canada, explored individuals' perceptions and interpretations of mobility-related items contained within the 15D, Assessment of Quality of Life 8-dimension (AQoL-8D), EQ-5D-5L, Health Utilities Index (HUI), Quality of Well-Being Scale Self-Administered (QWB-SA), and the 36-item Short Form health survey version 2 (SF-36v2). Ritchie and Spencer's 'Framework Approach' was used to perform thematic analysis that focused on participants' comments concerning the mobility-related items only. Fifteen individuals participated in three focus groups (five per focus group). Four themes emerged: wording of mobility (e.g., 'getting around' vs 'walking'), reference to aids and appliances, lack of suitable response options, and reframing of items (e.g., replacing 'walking' with 'wheeling'). These themes reflected item features that respondents perceived as relevant in enabling them to describe their mobility, and response strategies that respondents could use when faced with inaccessible items. Investigating perceptions to mobility-related items within the context of SCI highlights substantial variation in item interpretation across six preference-based health state instruments. Studying respondents' interpretations of items can help to understand discrepancies in the health state descriptions and values obtained from different instruments. This line of research warrants closer attention in the health economics and quality of life literature.
Fulmer, Erika; Rogers, Todd; Glasgow, LaShawn; Brown, Susan; Kuiper, Nicole
2018-03-01
The outcome indicator framework helps tobacco prevention and control programs (TCPs) plan and implement theory-driven evaluations of their efforts to reduce and prevent tobacco use. Tobacco use is the single-most preventable cause of morbidity and mortality in the United States. The implementation of public health best practices by comprehensive state TCPs has been shown to prevent the initiation of tobacco use, reduce tobacco use prevalence, and decrease tobacco-related health care expenditures. Achieving and sustaining program goals require TCPs to evaluate the effectiveness and impact of their programs. To guide evaluation efforts by TCPs, the Centers for Disease Control and Prevention's Office on Smoking and Health developed an outcome indicator framework that includes a high-level logic model and evidence-based outcome indicators for each tobacco prevention and control goal area. In this article, we describe how TCPs and other community organizations can use the outcome indicator framework in their evaluation efforts. We also discuss how the framework is used at the national level to unify tobacco prevention and control efforts across varying state contexts, identify promising practices, and expand the public health evidence base.
Bagraith, Karl; Chardon, Lydia; King, Robert John
2010-11-01
Although there are widely accepted and utilized models and frameworks for nondirective counseling (NDC), there is little in the way of tools or instruments designed to assist in determining whether or not a specific episode of counseling is consistent with the stated model or framework. The Counseling Progress and Depth Rating Instrument (CPDRI) was developed to evaluate counselor integrity in the use of Egan's skilled helper model in online counseling. The instrument was found to have sound internal consistency, good interrater reliability, and good face and convergent validity. The CPDRI is, therefore, proposed as a useful tool to facilitate investigation of the degree to which counselors adhere to and apply a widely used approach to NDC.
A model of the demand for Islamic banks debt-based financing instrument
NASA Astrophysics Data System (ADS)
Jusoh, Mansor; Khalid, Norlin
2013-04-01
This paper presents a theoretical analysis of the demand for debt-based financing instruments of the Islamic banks. Debt-based financing, such as through bai bithaman ajil and al-murabahah, is by far the most prominent form of Islamic bank financing, and yet it has been largely ignored in the Islamic economics literature. Most studies have instead been focusing on the equity-based financing of al-mudharabah and al-musyarakah. An Islamic bank offers debt-based financing through various instruments derived under the principle of exchange (ukud al-mu'awadhat) or, more specifically, the contract of deferred sale. Under such an arrangement, Islamic debt is created when goods are purchased and the payments are deferred. Thus, unlike debt of the conventional bank, which is a form of financial loan contract to facilitate demand for liquid assets, this Islamic debt is created in response to the demand to purchase goods by deferred payment. In this paper we set up an analytical framework based on an infinitely lived representative agent model (ILRA model) to analyze the demand for goods to be purchased by deferred payment. The resulting demand will then be used to derive the demand for Islamic debt. We also investigate, theoretically, factors that may have an impact on the demand for Islamic debt.
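As a rough illustration of the kind of setup an ILRA model implies (generic notation assumed here, not the paper's actual specification), the representative household maximizes discounted utility subject to a budget constraint in which period purchases can be financed by deferred-payment debt repaid the following period with a markup:

\max_{\{c_t,\, d_t\}} \; \sum_{t=0}^{\infty} \beta^{t} u(c_t)
\quad \text{subject to} \quad
p_t c_t + (1+\mu)\, d_{t-1} \le y_t + d_t , \qquad t = 0, 1, 2, \ldots

In this illustrative notation, c_t is consumption of goods bought on deferred payment, d_t the new Islamic debt created in period t, \mu the agreed markup of the deferred sale, p_t the goods price, y_t income, and \beta the discount factor; the first-order conditions of such a problem yield a demand for d_t, i.e., a demand for Islamic debt.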
NASA Astrophysics Data System (ADS)
Javier Romualdez, Luis
Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work is centered around the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT coupled with its observational efficiency, image quality, and accessibility rivals state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application. SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight, where the post-flight results close the overall design loop by suggesting practical improvements to pre-design methodologies. Overall, the analytical and practical results presented in this work, though centered around the SuperBIT project, provide generically useful and implementationally viable methodologies for high precision balloon-borne instrumentation, all of which are validated, justified, and improved both theoretically and practically. As such, the continuing development of SuperBIT, built from the work presented in this thesis, strives to further the potential for scientific balloon-borne astronomy in the near future.
Teaching Historical Contextualization: The Construction of a Reliable Observation Instrument
ERIC Educational Resources Information Center
Huijgen, Tim; van de Grift, Wim; van Boxtel, Carla; Holthuis, Paul
2017-01-01
Since the 1970s, many observation instruments have been constructed to map teachers' general pedagogic competencies. However, few of these instruments focus on teachers' subject-specific competencies. This study presents the development of the "Framework for Analyzing the Teaching of Historical Contextualization" (FAT-HC). This…
Analysis of key technologies for virtual instruments metrology
NASA Astrophysics Data System (ADS)
Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang
2008-12-01
Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on metrological testing of VIs. Key approaches and technologies for metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics with the support of the powerful computing capability of the PC. Another concern is evaluation of software features such as the correctness, reliability, stability, security and real-time behavior of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automated tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for such an automatic tool is proposed in this paper. An investigation of existing automatic tools that perform measurement-uncertainty calculation, software testing and security assessment demonstrates the feasibility of the proposed framework.
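One of the simulation-based approaches mentioned above is Monte Carlo propagation of input uncertainty through a VI's processing algorithm. The sketch below uses a deliberately simple measurement model (power from voltage and resistance) with made-up values; it illustrates the statistical idea only.

import random
random.seed(0)

def measurement_algorithm(v, r):
    """Toy VI processing step: compute power P = V^2 / R."""
    return v * v / r

N = 100_000
samples = [measurement_algorithm(random.gauss(230.0, 0.5), random.gauss(100.0, 0.2))
           for _ in range(N)]
mean = sum(samples) / N
std = (sum((x - mean) ** 2 for x in samples) / (N - 1)) ** 0.5
print(f"P = {mean:.1f} W, standard uncertainty ~ {std:.2f} W")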
NASA Astrophysics Data System (ADS)
Kumar, T. S.
2016-08-01
In this paper, we describe the details of the control unit and GUI software for positioning two filter wheels, a slit wheel and a grism wheel in the ADFOSC instrument. This is a first-generation instrument being built for the 3.6 m Devasthal optical telescope. The control hardware consists of five electronic boards based on low-cost 8-bit PIC microcontrollers, distributed over an I2C bus. The four wheels are controlled by four identical boards configured in I2C slave mode, while the fifth board acts as an I2C master for sending commands to and receiving status from the slave boards. The master also communicates with the interfacing PC over the TCP/IP protocol using simple ASCII commands. Stepper motors with suitable driver amplifiers move the wheels. Homing after power-on is achieved using Hall-effect sensors. Implementing distributed control units of identical design provides modularity, enabling easier maintenance and upgrades. GUI-based software for commanding the instrument was developed in Microsoft Visual C++. For operating the system during observations, the user selects the normal mode, while an engineering mode offers additional flexibility and low-level control during maintenance and testing. A detailed time-stamped log of commands, status and errors is continuously generated. Both the control unit and the software have been successfully tested and integrated with the ADFOSC instrument.
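Because the master board accepts simple ASCII commands over TCP/IP, a host-side client can be sketched as below. The command strings, IP address and port are placeholders invented for the illustration; the abstract does not specify the actual ADFOSC command set.

import socket

def send_command(cmd, host="192.168.1.50", port=5000, timeout=5.0):
    """Send one ASCII command line to the I2C-master board and return its reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((cmd + "\r\n").encode("ascii"))
        return sock.recv(256).decode("ascii").strip()

# Example usage (hypothetical commands): move the grism wheel to slot 3, then poll it.
# print(send_command("GRISM MOVE 3"))
# print(send_command("GRISM STATUS"))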
Hernández-Garbanzo, Yenory; Brosh, Joanne; Serrano, Elena L; Cason, Katherine L; Bhattarai, Ranju
2013-01-01
To identify the psychometric properties of evaluation instruments that measure mediators of dietary behaviors in school-aged children. Systematic search of scientific databases limited to 1999-2010. Psychometric properties related to development and testing of self-report instruments for children 8-12 years old. Systematic search of 189 articles and review of 15 instruments (20 associated articles) meeting the inclusion criteria. Search terms used included children, school, nutrition, diet, nutrition education, and evaluation. Fourteen studies used a theoretical framework to guide the instrument's development. Knowledge and self-efficacy were the most commonly used psychosocial measures. Twelve instruments focused on specific nutrition-related behaviors. Eight instruments included over 40 items and used age-appropriate response formats. Acceptable reliability properties were most commonly reported for attitude and self-efficacy measures. Although most of the instruments were reviewed by experts (n = 8) and/or pilot-tested (n = 9), only 7 were tested using both rigorous types of validity and with low-income youth. Results from this review suggest that additional research is needed to develop more robust psychosocial measures for dietary behaviors, for low-income youth audiences. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Hotchkiss, David R; Aqil, Anwer; Lippeveld, Theo; Mukooyo, Edward
2010-07-03
Sound policy, resource allocation and day-to-day management decisions in the health sector require timely information from routine health information systems (RHIS). In most low- and middle-income countries, the RHIS is viewed as being inadequate in providing quality data and continuous information that can be used to help improve health system performance. In addition, there is limited evidence on the effectiveness of RHIS strengthening interventions in improving data quality and use. The purpose of this study is to evaluate the usefulness of the newly developed Performance of Routine Information System Management (PRISM) framework, which consists of a conceptual framework and associated data collection and analysis tools to assess, design, strengthen and evaluate RHIS. The specific objectives of the study are: a) to assess the reliability and validity of the PRISM instruments and b) to assess the validity of the PRISM conceptual framework. Facility- and worker-level data were collected from 110 health care facilities in twelve districts in Uganda in 2004 and 2007 using records reviews, structured interviews and self-administered questionnaires. The analysis procedures include Cronbach's alpha to assess internal consistency of selected instruments, test-retest analysis to assess the reliability and sensitivity of the instruments, and bivariate and multivariate statistical techniques to assess validity of the PRISM instruments and conceptual framework. Cronbach's alpha analysis suggests high reliability (0.7 or greater) for the indices measuring a promotion of a culture of information, RHIS tasks self-efficacy and motivation. The study results also suggest that a promotion of a culture of information influences RHIS tasks self-efficacy, RHIS tasks competence and motivation, and that self-efficacy and the presence of RHIS staff have a direct influence on the use of RHIS information, a key aspect of RHIS performance. The study results provide some empirical support for the reliability and validity of the PRISM instruments and the validity of the PRISM conceptual framework, suggesting that the PRISM approach can be effectively used by RHIS policy makers and practitioners to assess the RHIS and evaluate RHIS strengthening interventions. However, additional studies with larger sample sizes are needed to further investigate the value of the PRISM instruments in exploring the linkages between RHIS data quality and use, and health systems performance.
Disbergen, Niels R.; Valente, Giancarlo; Formisano, Elia; Zatorre, Robert J.
2018-01-01
Polyphonic music listening well exemplifies processes typically involved in daily auditory scene analysis situations, relying on an interactive interplay between bottom-up and top-down processes. Most studies investigating scene analysis have used elementary auditory scenes; however, real-world scene analysis is far more complex. In particular, music, contrary to most other natural auditory scenes, can be perceived by either integrating or, under attentive control, segregating sound streams, often carried by different instruments. One of the prominent bottom-up cues contributing to multi-instrument music perception is their timbre difference. In this work, we introduce and validate a novel paradigm designed to investigate, within naturalistic musical auditory scenes, attentive modulation as well as its interaction with bottom-up processes. Two psychophysical experiments are described, employing custom-composed two-voice polyphonic music pieces within a framework implementing a behavioral performance metric to validate listener instructions requiring either integration or segregation of scene elements. In Experiment 1, the listeners' locus of attention was switched between individual instruments or the aggregate (i.e., both instruments together), via a task requiring the detection of temporal modulations (i.e., triplets) incorporated within or across instruments. Subjects responded post-stimulus whether triplets were present in the to-be-attended instrument(s). Experiment 2 introduced the bottom-up manipulation by adding a three-level morphing of instrument timbre distance to the attentional framework. The task was designed to be used within neuroimaging paradigms; Experiment 2 was additionally validated behaviorally in the functional Magnetic Resonance Imaging (fMRI) environment. Experiment 1 subjects (N = 29, non-musicians) completed the task at high levels of accuracy, showing no group differences between any experimental conditions. Nineteen listeners also participated in Experiment 2, showing a main effect of instrument timbre distance, even though within attention-condition timbre-distance contrasts did not demonstrate any timbre effect. Correlation of overall scores with morph-distance effects, computed by subtracting the largest from the smallest timbre distance scores, showed an influence of general task difficulty on the timbre distance effect. Comparison of laboratory and fMRI data showed scanner noise had no adverse effect on task performance. These experimental paradigms enable the study of both bottom-up and top-down contributions to auditory stream segregation and integration within psychophysical and neuroimaging experiments. PMID:29563861
Effects of Early Writing Intervention Delivered within a Data-Based Instruction Framework
ERIC Educational Resources Information Center
Jung, Pyung-Gang; McMaster, Kristen L.; delMas, Robert C.
2017-01-01
We examined effects of research-based early writing intervention delivered within a data-based instruction (DBI) framework for children with intensive needs. We randomly assigned 46 students with and without disabilities in Grades 1 to 3 within classrooms to either treatment or control. Treatment students received research-based early writing…
Reliability and validity in a nutshell.
Bannigan, Katrina; Watson, Roger
2009-12-01
To explore and explain the different concepts of reliability and validity as they are related to measurement instruments in social science and health care. There are different concepts contained in the terms reliability and validity and these are often explained poorly and there is often confusion between them. To develop some clarity about reliability and validity a conceptual framework was built based on the existing literature. The concepts of reliability, validity and utility are explored and explained. Reliability contains the concepts of internal consistency and stability and equivalence. Validity contains the concepts of content, face, criterion, concurrent, predictive, construct, convergent (and divergent), factorial and discriminant. In addition, for clinical practice and research, it is essential to establish the utility of a measurement instrument. To use measurement instruments appropriately in clinical practice, the extent to which they are reliable, valid and usable must be established.
NASA Astrophysics Data System (ADS)
Barbot, Loïc; Villard, Jean-François; Fourrez, Stéphane; Pichon, Laurent; Makil, Hamid
2018-01-01
In the framework of the French National Research Agency program on nuclear safety and radioprotection, the `DIstributed Sensing for COrium Monitoring and Safety' project aims at developing innovative instrumentation for corium monitoring in case of a severe accident in a Pressurized Water nuclear Reactor. Among others, a new under-vessel instrumentation based on Self-Powered Neutron Detectors is developed using a numerical simulation toolbox named `MATiSSe'. The CEA Instrumentation Sensors and Dosimetry Lab has developed MATiSSe since 2010 for Self-Powered Neutron Detector material selection and geometry design, as well as for their respective partial neutron and gamma sensitivity calculations. MATiSSe is based on a comprehensive model of the neutron and gamma interactions which take place in self-powered neutron detector components, using the MCNP6 Monte Carlo code. As a member of the project consortium, the THERMOCOAX SAS Company is currently manufacturing some instrumented pole prototypes to be tested in 2017. The full severe accident monitoring equipment, including the standalone low-current acquisition system, will be tested during a joint CEA-THERMOCOAX experimental campaign under realistic irradiation conditions in the Slovenian TRIGA Mark II research reactor.
Instrumentation and control building, architectural, sections and elevation. Specifications No. ...
Instrumentation and control building, architectural, sections and elevation. Specifications No. Eng-04-353-55-72; Drawing No. 60-09-12; sheet 65 of 148; file no. 1321/16. Stamped: record drawing - as constructed. Below stamp: Contract no. 4338, no change. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Control Center, Test Area 1-115, near Altair & Saturn Boulevards, Boron, Kern County, CA
Instrumentation and control building, architectural, floor plans. Specifications no. Eng-04-353-55-72; Drawing No. 60-09-12; sheet 64 of 148; file no. 1321/15. Stamped: record drawing - as constructed. Below stamp: Contract no. 4338, no change. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Control Center, Test Area 1-115, near Altair & Saturn Boulevards, Boron, Kern County, CA
Lassi, Zohra S; Salam, Rehana A; Das, Jai K; Bhutta, Zulfiqar A
2014-01-01
This paper describes the conceptual framework and the methodology used to guide the systematic reviews of community-based interventions (CBIs) for the prevention and control of infectious diseases of poverty (IDoP). We adapted the conceptual framework from the 3ie work on the 'Community-Based Intervention Packages for Preventing Maternal Morbidity and Mortality and Improving Neonatal Outcomes' to aid in analyzing the existing CBIs for IDoP. The conceptual framework revolves around objectives, inputs, processes, outputs, outcomes, and impacts, showing the theoretical linkages between the delivery of the interventions targeting these diseases through various community delivery platforms and the consequent health impacts. We also describe the methodology undertaken to conduct the systematic reviews and the meta-analyses.
Carfora, Valentina; Caso, Daniela; Conner, Mark
2016-11-01
The present research aimed to test the efficacy of affective and instrumental text messages compared with a no-message control as a strategy to increase fruit and vegetable intake (FVI) in adolescents. A randomized controlled trial was used to test the impact of different text messages compared with no message on FVI over a 2-week period. A total of 1,065 adolescents (14-19 years) from a high school in the south of Italy completed the baseline questionnaire and were randomly allocated to one of three conditions: instrumental messages (N = 238), affective messages (N = 300), and no messages (N = 521). Students in the message conditions received one message each day over a 2-week period. The messages targeted affective (affective benefits) or instrumental (instrumental benefits) information about FVI. Self-reported FVI at 2 weeks was the key dependent variable. Analyses were based on the N = 634 who completed all aspects of the study. Findings showed that messages significantly increased FVI, particularly in the affective condition, and this effect was partially mediated by changes in affective attitude and intentions towards FVI. Text messages can be used to increase FVI in adolescents. Text messages based on affective benefits are more effective than text messages based on instrumental benefits. Statement of contribution What is already known on this subject? Text messages have been shown to promote positive change in health behaviours. However, the most appropriate target for such text messages is less clear, although targeting attitudes may be effective. What does this study add? This randomized controlled study shows that text messages targeting instrumental or affective attitudes produce changes in fruit and vegetable intake (FVI) in adolescents. Text messages targeting affective attitudes are shown to be more effective than text messages targeting instrumental attitudes. The effect of affective text messages on FVI was partially mediated by changes in affective attitudes. © 2016 The British Psychological Society.
ERIC Educational Resources Information Center
Blazar, David; Braslow, David; Charalambous, Charalambos Y.
2015-01-01
Over the past several years, research teams have developed observational instruments to measure the quality of teachers' instructional practices. Instruments such as Framework for Teaching (FFT) and the Classroom Assessment Scoring System (CLASS) assess general teaching practices, including student-teacher interactions, behavior management, and…
5. WEST SIDE, ALSO SHOWING INSTRUMENTATION AND CONTROL BUILDING (BLDG. ...
5. WEST SIDE, ALSO SHOWING INSTRUMENTATION AND CONTROL BUILDING (BLDG. 8668) IN MIDDLE DISTANCE AT LEFT, AND TEST AREAS 1-120 AND 1-125 BEYOND. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Test Stand 1-4, Test Area 1-115, northwest end of Saturn Boulevard, Boron, Kern County, CA
Halek, Margareta; Holle, Daniela; Bartholomeyczik, Sabine
2017-08-14
One of the most difficult issues for care staff is the manifestation of challenging behaviour among residents with dementia. The first step in managing this type of behaviour is analysing its triggers. A structured assessment instrument can facilitate this process and may improve carers' management of the situation. This paper describes the development of an instrument designed for this purpose and an evaluation of its content validity and its feasibility and practicability in nursing homes. The development process and evaluation of the content validity were based on Lynn's methodology (1998). A literature review (steps 1 + 2) provided the theoretical framework for the instrument and for item formation. Ten experts (step 3) evaluated the first version of the instrument (the Innovative dementia-oriented Assessment (IdA®)) regarding its relevance, clarity, meaningfulness and completeness; content validity indices at the scale-level (S-CVI) and item-level (I-CVI) were calculated. Health care workers (step 4) evaluated the second version in a workshop. Finally, the instrument was introduced to 17 units in 11 nursing homes in a field study (step 5), and 60 care staff members assessed its practicability and feasibility. The IdA® used the need-driven dementia-compromised behaviour (NDB) model as a theoretical framework. The literature review and expert-based panel supported the content validity of the IdA®. At the item level, 77% of the ratings had a CVI greater than or equal to 0.78. The majority of the question-ratings (84%, n = 154) and answer-ratings (69%, n = 122) showed valid results, with none below 0.50. The health care workers confirmed the understandability, completeness and plausibility of the IdA®. Steps 3 and 4 led to further item clarification. The carers in the study considered the instrument helpful for reflecting challenging behaviour and beneficial for the care of residents with dementia. Negative ratings referred to the time required and the lack of effect on residents´ behaviour. There was strong evidence supporting the content validity of the IdA®. Despite the substantial length and time requirement, the instrument was considered helpful for analysing challenging behaviour. Thus, further research on the psychometric qualities, implementation aspects and effectiveness of the IdA® in understanding challenging behaviour is needed.
Sarafian, Isabelle
2012-08-01
This study evaluated the process of a peer education program for hotel-based sex workers in Dhaka, Bangladesh, with social support proposed as an organizing framework. Programme outcomes were examined through baseline and follow-up assessments. Sex workers naïve to peer education were assessed on socio-cognitive and behavioural variables; a subsample was reassessed at follow-up 23 weeks later on average. Process was assessed in terms of the content of peer education sessions. These sessions were recorded and coded into percentages of social support types provided by the peer educator to her audience: informational, instrumental, appraisal, emotional, companionship, non-support. Peer educators were classified into three "social support profiles" based on average proportions of emotional and informational support they provided. Seeing more peer educators with a high informational support profile was related to higher sex worker self-efficacy, self-reported STI symptoms, and self-reported condom use at follow-up; the same was true for the high emotional support profile and treatment seeking. Social support constituted a useful framework, but needs further exploration. This study provided a direct, in-depth examination of the process of peer education based on a comprehensive theoretical framework. Copyright © 2011 Elsevier Ltd. All rights reserved.
Proposal of a Framework for Internet Based Licensing of Learning Objects
ERIC Educational Resources Information Center
Santos, Osvaldo A.; Ramos, Fernando M. S.
2004-01-01
This paper presents a proposal of a framework whose main objective is to manage the delivery and rendering of learning objects in a digital rights controlled environment. The framework is based on a digital licensing scheme that requires each learning object to have the proper license in order to be rendered by a trusted player. A conceptual model…
NASA Astrophysics Data System (ADS)
Li, Xiaohui; Sun, Zhenping; Cao, Dongpu; Liu, Daxue; He, Hangen
2017-03-01
This study proposes a novel integrated local trajectory planning and tracking control (ILTPTC) framework for autonomous vehicles driving along a reference path with obstacles avoidance. For this ILTPTC framework, an efficient state-space sampling-based trajectory planning scheme is employed to smoothly follow the reference path. A model-based predictive path generation algorithm is applied to produce a set of smooth and kinematically-feasible paths connecting the initial state with the sampling terminal states. A velocity control law is then designed to assign a speed value at each of the points along the generated paths. An objective function considering both safety and comfort performance is carefully formulated for assessing the generated trajectories and selecting the optimal one. For accurately tracking the optimal trajectory while overcoming external disturbances and model uncertainties, a combined feedforward and feedback controller is developed. Both simulation analyses and vehicle testing are performed to verify the effectiveness of the proposed ILTPTC framework, and future research is also briefly discussed.
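To make the trajectory-selection step concrete, the sketch below scores candidate trajectories with a weighted safety-plus-comfort cost and keeps the cheapest one. It is a minimal illustration in Python: the cost terms, weights, and data layout are assumptions for exposition, not the authors' actual ILTPTC formulation.

```python
import numpy as np

def trajectory_cost(traj, obstacles, w_safety=1.0, w_comfort=0.1):
    """Score one candidate trajectory (N x 3 array of x, y, speed samples).

    Safety term: penalize small clearance to the nearest obstacle.
    Comfort term: penalize large speed changes along the path.
    Both terms and the weights are illustrative, not the paper's formulation.
    """
    xy = traj[:, :2]
    speed = traj[:, 2]
    # minimum clearance over the whole path to any obstacle point
    clearance = min(np.min(np.linalg.norm(xy - obs, axis=1)) for obs in obstacles)
    safety = 1.0 / (clearance + 1e-3)       # grows as the path nears an obstacle
    comfort = np.sum(np.diff(speed) ** 2)   # rough proxy for longitudinal acceleration
    return w_safety * safety + w_comfort * comfort

def select_optimal(candidates, obstacles):
    """Return the candidate trajectory with the lowest combined cost."""
    costs = [trajectory_cost(t, obstacles) for t in candidates]
    return candidates[int(np.argmin(costs))]

if __name__ == "__main__":
    # three toy candidates sampled around a straight reference path
    candidates = [np.column_stack([np.linspace(0, 50, 20),
                                   np.full(20, dy),
                                   np.full(20, 10.0)]) for dy in (-1.0, 0.0, 1.0)]
    obstacles = [np.array([25.0, 0.5])]
    best = select_optimal(candidates, obstacles)
    print("selected lateral offset:", best[0, 1])
```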
Subramanian, Sujha; Tangka, Florence; Edwards, Patrick; Hoover, Sonja; Cole-Beebe, Maggie
2016-12-01
This article reports on the methods and framework we have developed to guide economic evaluation of noncommunicable disease registries. We developed a cost data collection instrument, the Centers for Disease Control and Prevention's (CDC's) International Registry Costing Tool (IntRegCosting Tool), based on established economics methods. We performed in-depth case studies, site visit interviews, and pilot testing in 11 registries from multiple countries, including India, Kenya, Uganda, Colombia, and Barbados, to assess the overall quality of the data collected from cancer and cardiovascular registries. Overall, the registries were able to use the IntRegCosting Tool to assign operating expenditures to specific activities. We verified that registries were able to provide accurate estimation of labor costs, which is the largest expenditure incurred by registries. We also identified several factors that can influence the cost of registry operations, including size of the geographic area served, data collection approach, local cost of living, presence of rural areas, volume of cases, extent of consolidation of records to cases, and continuity of funding. Internal and external registry factors reveal that a single estimate for the cost of registry operations is not feasible; costs will vary on the basis of factors that may be beyond the control of the registries. Some factors, such as data collection approach, can be modified to improve the efficiency of registry operations. These findings will inform both future economic data collection using a web-based tool and cost and cost-effectiveness analyses of registry operations in low- and middle-income countries (LMICs) and other locations with similar characteristics. Copyright © 2016 Elsevier Ltd. All rights reserved.
Li, Honghe; Liu, Yang; Wen, Deliang
2017-01-01
Background Over the last three decades, various instruments were developed and employed to assess medical professionalism, but their measurement properties have yet to be fully evaluated. This study aimed to systematically evaluate these instruments’ measurement properties and the methodological quality of their related studies within a universally acceptable standardized framework and then provide corresponding recommendations. Methods A systematic search of the electronic databases PubMed, Web of Science, and PsycINFO was conducted to collect studies published from 1990–2015. After screening titles, abstracts, and full texts for eligibility, the articles included in this study were classified according to their respective instrument’s usage. A two-phase assessment was conducted: 1) methodological quality was assessed by following the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist; and 2) the quality of measurement properties was assessed according to Terwee’s criteria. Results were integrated using best-evidence synthesis to look for recommendable instruments. Results After screening 2,959 records, 74 instruments from 80 existing studies were included. The overall methodological quality of these studies was unsatisfactory, with reasons including but not limited to unknown missing data, inadequate sample sizes, and vague hypotheses. Content validity, cross-cultural validity, and criterion validity were either unreported or negative ratings in most studies. Based on best-evidence synthesis, three instruments were recommended: Hisar’s instrument for nursing students, Nurse Practitioners’ Roles and Competencies Scale, and Perceived Faculty Competency Inventory. Conclusion Although instruments measuring medical professionalism are diverse, only a limited number of studies were methodologically sound. Future studies should give priority to systematically improving the performance of existing instruments and to longitudinal studies. PMID:28498838
Klepsch, Melina; Schmitz, Florian; Seufert, Tina
2017-01-01
Cognitive Load Theory is one of the most powerful research frameworks in educational research. Besides theoretical discussions about the conceptual parts of cognitive load, the main challenge within this framework is that there is still no measurement instrument for the different aspects of cognitive load, namely intrinsic, extraneous, and germane cognitive load. Hence, the goal of this paper is to develop a differentiated measurement of cognitive load. In Study 1 (N = 97), we developed and analyzed two strategies to measure cognitive load in a differentiated way: (1) Informed rating: We trained learners in differentiating the concepts of cognitive load, so that they could rate them in an informed way. They were then asked to rate 24 different learning situations or learning materials related to either high or low intrinsic, extraneous, or germane load. (2) Naïve rating: For this type of rating of cognitive load we developed a questionnaire with two to three items for each type of load. With this questionnaire, the same learning situations had to be rated. In the second study (N between 65 and 95 for each task), we improved the instrument for the naïve rating. For each study, we analyzed whether the instruments are reliable and valid; for Study 1, we also checked for comparability of the two measurement strategies. In Study 2, we conducted a simultaneous scenario-based factor analysis. The informed rating seems to be a promising strategy to assess the different aspects of cognitive load, but it does not seem economical or feasible for larger studies, and standardized training would be necessary. The improved version of the naïve rating turned out to be a useful, feasible, and reliable instrument. Ongoing studies analyze the conceptual validity of this measurement, with promising results so far.
Schawo, S; Bouwmans, C; van der Schee, E; Hendriks, V; Brouwer, W; Hakkaart, L
2017-09-19
Systemic family interventions have been shown to be effective in adolescents with substance use disorder and delinquent behavior. The interventions target interactions between the adolescent and involved systems (i.e. youth, family, peers, neighbors, school, work, and society). In addition to effectiveness considerations, economic aspects have gained attention. However, conventional generic quality of life measures used in health economic evaluations may not be able to capture the broad effects of systemic interventions. This study aims to identify existing outcome measures that capture the broad effects of systemic family interventions and allow use in a health economic framework. We based our systematic review on clinical studies in the field. Our goal was to identify effectiveness studies of psychosocial interventions for adolescents with substance use disorder and delinquent behavior and to distill the instruments used in these studies to measure effects. Searched databases were PubMed, Education Resource Information Center (ERIC), Cochrane and Psychnet (PsycBOOKSc, PsycCRITIQUES, print). Identified instruments were ranked according to the number of systems covered (comprehensiveness). In addition, their use for health economic analyses was evaluated according to suitability characteristics such as brevity, accessibility, psychometric properties, etc. One thousand three hundred seventy-eight articles were found and screened for eligibility. Eighty articles were selected, and 8 instruments covering 5 or more systems were identified. The systematic review identified instruments from the clinical field suitable to evaluate systemic family interventions in a health economic framework. None of them had preference-weights available. Hence, a next step could be to attach preference-weights to one of the identified instruments to allow health economic evaluations of systemic family interventions.
A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.
Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel
2018-02-20
Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects, or considers less important, the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on a timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.
A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints
Navet, Nicolas; Havet, Lionel
2018-01-01
Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects, or considers less important, the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on a timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system. PMID:29461489
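The paper's schedulability analysis is novel and not reproduced in the abstract; as a stand-in, the sketch below applies the classical Liu & Layland rate-monotonic utilization bound and a simple jitter-margin check against a timing-tolerance contract. The task set and margin values are hypothetical.

```python
import math

def rm_utilization_bound_ok(tasks):
    """Classic Liu & Layland sufficient test for rate-monotonic scheduling.

    tasks: list of (wcet, period) tuples. A textbook stand-in, not the
    paper's own schedulability analysis, which is model-specific.
    """
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    return utilization <= n * (2 ** (1.0 / n) - 1)

def within_jitter_margin(observed_jitter, margin):
    """Check a measured activation jitter against the timing-tolerance contract.
    Both values are in the same time unit (e.g. milliseconds)."""
    return observed_jitter <= margin

if __name__ == "__main__":
    # hypothetical cruise-control task set: (worst-case execution time, period) in ms
    tasks = [(2.0, 10.0), (5.0, 40.0), (10.0, 100.0)]
    print("schedulable (sufficient test):", rm_utilization_bound_ok(tasks))
    print("jitter ok:", within_jitter_margin(observed_jitter=1.2, margin=2.0))
```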
Explicit reflection in an introductory physics course
NASA Astrophysics Data System (ADS)
Scott, Michael Lee
This dissertation details a classroom intervention that supplements assigned in-class problems in weekly problem sets with reflective activities that are aimed to assist in knowledge integration. Using the framework of cognitive load theory, this intervention should assist in schema acquisition leading to (1) students recognizing the use and appropriately applying physical concepts across different problem contexts, and (2) enhanced physics understanding of students resulting in improved class performance. The intervention was embedded in the discussion component of an introductory, university physics course, and spanned a 14-week period. Evaluation of the intervention was based on the relative performance between a control and treatment group. Instruments used in this study to assess performance included the Force Concept Inventory (FCI), a physics problem categorization test, and four class exams. A full discussion of this implementation and the accompanying measures will be given. Possible limitations to this study and lines of future research will be proposed.
Better by design: business preferences for environmental regulatory reform.
Taylor, Christopher M; Pollard, Simon J T; Rocks, Sophie A; Angus, Andrew J
2015-04-15
We present the preferences for environmental regulatory reform expressed by 30 UK businesses and industry bodies from 5 sectors. While five strongly preferred voluntary regulation, seven expressed doubts about its effectiveness, and 18 expressed no general preference between instrument types. Voluntary approaches were valued for flexibility and lower burdens, but direct regulation offered stability and a level playing field. Respondents sought regulatory frameworks that are coherent; balance clarity, prescription and flexibility; are enabled by positive regulatory relationships; are administratively efficient; are targeted according to risk magnitude and character; are evidence-based; and deliver long-term market stability for regulatees. Anticipated differences in performance between types of instrument can be undermined by poor implementation. Results underline the need for policy makers and regulators to tailor an effective mix of instruments for a given sector, and to overcome analytical, institutional and political barriers to greater coherence, to better coordinate existing instruments and tackle new environmental challenges as they emerge. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Verhagen, T. L. A.; Vandekasteele, R. M.
1992-08-01
Within the framework of the research into the vulnerability of ships, an experimental investigation took place in 1989 aboard the frigate 'Wolf.' The recordings of an instrumented experiment in the crew aft sleeping compartment are presented. During this experiment, a nonfragmenting charge of 5.5 kg TNT was initiated. Preceding the 5.5 kg TNT experiment, a 2 kg TNT experiment was performed on the same day. Later that day the 15 kg TNT experiment took place. Repair or modification of the instrumentation was not possible. The settings of the instrumentation equipment were based on the expected extreme responses of the 15 kg TNT experiment later that day, which, however, had an influence on the signal-to-noise ratio. The blast measurements seem to have been recorded correctly. The quasi-static pressure in the experiment compartment as well as in the adjacent compartments showed classical behavior. The strain measurements seemed to be good, although some of them malfunctioned after a period of time.
Macintosh/LabVIEW based control and data acquisition system for a single photon counting fluorometer
NASA Astrophysics Data System (ADS)
Stryjewski, Wieslaw J.
1991-08-01
A flexible software system has been developed for controlling fluorescence decay measurements using the virtual instrument approach offered by LabVIEW. The time-correlated single photon counting instrument operates under computer control in both manual and automatic mode. Implementation time was short and the equipment is now easier to use, reducing the training time required for new investigators. It is not difficult to customize the front panel or adapt the program to a different instrument. We found LabVIEW much more convenient to use for this application than traditional, textual computer languages.
NASA Technical Reports Server (NTRS)
Storey, James; Roy, David P.; Masek, Jeffrey; Gascon, Ferran; Dwyer, John; Choate, Michael
2016-01-01
The Landsat-8 and Sentinel-2 sensors provide multi-spectral image data with similar spectral and spatial characteristics that together provide improved temporal coverage globally. Both systems are designed to register Level 1 products to a reference image framework; however, the Landsat-8 framework, based upon the Global Land Survey images, contains residual geolocation errors leading to an expected sensor-to-sensor misregistration of 38 m (2σ). These misalignments vary geographically but should be stable for a given area. The Landsat framework will be readjusted for consistency with the Sentinel-2 Global Reference Image, with completion expected in 2018. In the interim, users can measure Landsat-to-Sentinel tie points to quantify the misalignment in their area of interest and, if appropriate, reproject the data for better alignment.
Storey, James C.; Roy, David P.; Masek, Jeffrey; Gascon, Ferran; Dwyer, John L.; Choate, Michael J.
2016-01-01
The Landsat-8 and Sentinel-2 sensors provide multi-spectral image data with similar spectral and spatial characteristics that together provide improved temporal coverage globally. Both systems are designed to register Level 1 products to a reference image framework; however, the Landsat-8 framework, based upon the Global Land Survey images, contains residual geolocation errors leading to an expected sensor-to-sensor misregistration of 38 m (2σ). These misalignments vary geographically but should be stable for a given area. The Landsat framework will be readjusted for consistency with the Sentinel-2 Global Reference Image, with completion expected in 2018. In the interim, users can measure Landsat-to-Sentinel tie points to quantify the misalignment in their area of interest and, if appropriate, reproject the data for better alignment.
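A minimal sketch of the interim workaround described above: estimate a constant offset from user-measured tie points and apply it as a shift. The pure-translation model and the tie-point values are assumptions; an operational workflow would fit a fuller transformation or reproject with standard tools.

```python
import numpy as np

def mean_misregistration(landsat_xy, sentinel_xy):
    """Estimate a constant (dx, dy) offset, in metres, from matched tie points.

    landsat_xy, sentinel_xy: (N, 2) arrays of the same ground features measured
    in each image's map coordinates. A pure translation is an assumption; real
    misalignments may need an affine fit or a full reprojection.
    """
    offsets = np.asarray(sentinel_xy) - np.asarray(landsat_xy)
    return offsets.mean(axis=0), offsets.std(axis=0)

if __name__ == "__main__":
    # hypothetical tie points (map coordinates in metres)
    landsat = np.array([[300000.0, 4500000.0], [305000.0, 4502000.0], [299500.0, 4498000.0]])
    sentinel = landsat + np.array([18.0, -22.0]) + np.random.default_rng(1).normal(0, 3, landsat.shape)
    shift, spread = mean_misregistration(landsat, sentinel)
    print(f"estimated offset dx={shift[0]:.1f} m, dy={shift[1]:.1f} m (1-sigma {spread})")
    # shifting the Landsat grid by 'shift' would co-register it with Sentinel-2
```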
Qualifications Frameworks in Africa: A Critical Reflection
ERIC Educational Resources Information Center
Higgs, P.; Keevy, J.
2009-01-01
Today there is an accelerating trend towards qualifications frameworks as an instrument to develop, classify and recognise formal learning across the African continent, as is also the case across most of Europe, Australasia and the Asia-Pacific region. As more and more countries and regions across the world develop qualifications frameworks to…
The Instrumental Value of Conceptual Frameworks in Educational Technology Research
ERIC Educational Resources Information Center
Antonenko, Pavlo D.
2015-01-01
Scholars from diverse fields and research traditions agree that the conceptual framework is a critically important component of disciplined inquiry. Yet, there is a pronounced lack of shared understanding regarding the definition and functions of conceptual frameworks, which impedes our ability to design effective research and mentor novice…
Graumann, Johannes; Scheltema, Richard A; Zhang, Yong; Cox, Jürgen; Mann, Matthias
2012-03-01
In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent is implemented in the MaxQuant computational proteomics environment, termed MaxQuant Real-Time. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information as well as controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields similar performance to the data dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides "on-the-fly" within 30 ms, well within the time constraints of a shotgun fragmentation "topN" method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available.
Graumann, Johannes; Scheltema, Richard A.; Zhang, Yong; Cox, Jürgen; Mann, Matthias
2012-01-01
In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent is implemented in the MaxQuant computational proteomics environment, termed MaxQuant Real-Time. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information as well as controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields similar performance to the data dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides “on-the-fly” within 30 ms, well within the time constraints of a shotgun fragmentation “topN” method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available. PMID:22171319
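The sketch below illustrates the generic "topN" selection logic that the intelligent agent re-implements: take the most intense survey-scan precursors that are not dynamically excluded, promoting peptides of special interest. Function names, the ranking rule, and the tolerances are illustrative assumptions, not MaxQuant Real-Time's implementation.

```python
def select_topn(survey_peaks, n=10, exclusion=frozenset(), preferred=frozenset(), tol=0.01):
    """Pick up to n precursor m/z values for fragmentation from a survey scan.

    survey_peaks: list of (mz, intensity) tuples.
    exclusion:    m/z values recently sequenced (dynamic exclusion).
    preferred:    m/z values of special interest (e.g. from a GO-term driven list),
                  promoted ahead of purely intensity-ranked peaks.
    The names and the selection rule are an illustrative reconstruction, not the
    MaxQuant Real-Time implementation.
    """
    def excluded(mz):
        return any(abs(mz - e) <= tol for e in exclusion)

    candidates = [(mz, inten) for mz, inten in survey_peaks if not excluded(mz)]
    # preferred peptides first, then everything else by decreasing intensity
    candidates.sort(key=lambda p: (not any(abs(p[0] - q) <= tol for q in preferred), -p[1]))
    return [mz for mz, _ in candidates[:n]]

if __name__ == "__main__":
    scan = [(445.12, 9e6), (512.30, 4e6), (623.33, 8e6), (702.41, 1e6), (815.45, 6e6)]
    print(select_topn(scan, n=3, exclusion={623.33}, preferred={702.41}))
    # -> [702.41, 445.12, 815.45]
```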
Error-in-variables models in calibration
NASA Astrophysics Data System (ADS)
Lira, I.; Grientschnig, D.
2017-12-01
In many calibration operations, the stimuli applied to the measuring system or instrument under test are derived from measurement standards whose values may be considered to be perfectly known. In that case, it is assumed that calibration uncertainty arises solely from inexact measurement of the responses, from imperfect control of the calibration process and from the possible inaccuracy of the calibration model. However, the premise that the stimuli are completely known is never strictly fulfilled and in some instances it may be grossly inadequate. Then, error-in-variables (EIV) regression models have to be employed. In metrology, these models have been approached mostly from the frequentist perspective. In contrast, not much guidance is available on their Bayesian analysis. In this paper, we first present a brief summary of the conventional statistical techniques that have been developed to deal with EIV models in calibration. We then proceed to discuss the alternative Bayesian framework under some simplifying assumptions. Through a detailed example about the calibration of an instrument for measuring flow rates, we provide advice on how the user of the calibration function should employ the latter framework for inferring the stimulus acting on the calibrated device when, in use, a certain response is measured.
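As a concrete frequentist counterpart to the discussion above, the sketch below fits a straight-line calibration function with uncertainty in both stimulus and response using orthogonal distance regression from scipy.odr; the data are synthetic, and the Bayesian treatment developed in the paper is not reproduced.

```python
import numpy as np
from scipy.odr import ODR, Model, RealData

def line(beta, x):
    """Straight-line calibration function y = beta[0] + beta[1] * x."""
    return beta[0] + beta[1] * x

# synthetic calibration data: stimuli (x) and responses (y), both measured with error
rng = np.random.default_rng(42)
true_x = np.linspace(1.0, 10.0, 8)
sx = np.full_like(true_x, 0.05)   # standard uncertainty of the stimuli
sy = np.full_like(true_x, 0.10)   # standard uncertainty of the responses
x_obs = true_x + rng.normal(0, sx)
y_obs = 0.3 + 2.0 * true_x + rng.normal(0, sy)

# errors-in-variables fit: both sx and sy enter the weighting, unlike ordinary least squares
data = RealData(x_obs, y_obs, sx=sx, sy=sy)
fit = ODR(data, Model(line), beta0=[0.0, 1.0]).run()
print("intercept, slope:", fit.beta)
print("standard errors :", fit.sd_beta)
```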
NASA Astrophysics Data System (ADS)
Ozer, Ekin; Feng, Maria Q.
2017-04-01
Mobile, heterogeneous, and smart sensor networks produce pervasive structural health monitoring (SHM) information. With various embedded sensors, smartphones have emerged to innovate SHM by empowering citizens to serve as sensors. By default, smartphones meet the fundamental smart sensor criteria, thanks to the built-in processor, memory, wireless communication units and mobile operating system. SHM using smartphones, however, faces technical challenges due to citizen-induced uncertainties, undesired sensor-structure integration, and lack of control over the sensing platform. Previously, the authors presented successful applications of smartphone accelerometers for structural vibration measurement and proposed a monitoring framework under citizen-induced spatiotemporal uncertainties. This study aims at extending the capabilities of smartphone-based SHM with a special focus on the lack of control over the sensor (i.e., the phone) positioning by citizens resulting in unknown sensor orientations. Using smartphone gyroscope, accelerometer, and magnetometer; instantaneous sensor orientation can be obtained with respect to gravitational and magnetic north directions. Using these sensor data, mobile operating system frameworks return processed features such as attitude and heading that can be used to correct misaligned sensor signals. For this purpose, a coordinate transformation procedure is proposed and illustrated on a two-story laboratory structural model and real-scale bridges with various sensor positioning examples. The proposed method corrects the sensor signals by tracking their orientations and improves measurement accuracy. Moreover, knowing structure’s coordinate system a priori, even the data from arbitrarily positioned sensors can automatically be transformed to the structural coordinates. In addition, this paper also touches some secondary mobile and heterogeneous data issues including imperfect sampling and geolocation services. The coordinate system transformation methods proposed in this study can be implemented in other non-smartphone-based SHM systems as long as similar instrumentation is available.
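A minimal sketch of the coordinate-transformation idea: build a rotation matrix from the phone-reported attitude and rotate body-frame accelerations into a reference frame. The Z-Y-X Euler convention and the example values are assumptions; in practice the convention must match the mobile operating system's attitude definition and the paper's exact procedure.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Body-to-reference rotation from Z-Y-X (yaw-pitch-roll) Euler angles in radians.
    The convention is assumed; match it to the mobile OS attitude definition in use."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx

def to_reference_frame(accel_body, roll, pitch, yaw):
    """Rotate an (N, 3) body-frame acceleration record into the reference frame."""
    r = rotation_matrix(roll, pitch, yaw)
    return np.asarray(accel_body) @ r.T

if __name__ == "__main__":
    # phone lying flat with a 30-degree yaw offset from the structure's axes
    accel = np.array([[0.1, 0.0, 9.81], [0.12, -0.01, 9.80]])
    corrected = to_reference_frame(accel, roll=0.0, pitch=0.0, yaw=np.radians(30))
    print(corrected)
```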
Control architecture for an adaptive electronically steerable flash lidar and associated instruments
NASA Astrophysics Data System (ADS)
Ruppert, Lyle; Craner, Jeremy; Harris, Timothy
2014-09-01
An Electronically Steerable Flash Lidar (ESFL), developed by Ball Aerospace & Technologies Corporation, allows real-time adaptive control of configuration and data-collection strategy based on recent or concurrent observations and changing situations. This paper reviews, at a high level, some of the algorithms and control architecture built into ESFL. Using ESFL as an example, it also discusses the merits and utility of such adaptable instruments in Earth-system studies.
Development of a Measurement Instrument to Assess Students' Electrolyte Conceptual Understanding
ERIC Educational Resources Information Center
Lu, Shanshan; Bi, Hualin
2016-01-01
To assess students' conceptual understanding levels and diagnose alternative frameworks of the electrolyte concept, a measurement instrument was developed using the Rasch model. This paper reports the use of the measurement instrument to assess 559 students from grade 10 to grade 12 in two cities. The results provided both diagnostic and summative…
Multilayer Optimization of Heterogeneous Networks Using Grammatical Genetic Programming.
Fenton, Michael; Lynch, David; Kucera, Stepan; Claussen, Holger; O'Neill, Michael
2017-09-01
Heterogeneous cellular networks are composed of macro cells (MCs) and small cells (SCs) in which all cells occupy the same bandwidth. Provision has been made under the third generation partnership project-long term evolution framework for enhanced intercell interference coordination (eICIC) between cell tiers. Expanding on previous works, this paper instruments grammatical genetic programming to evolve control heuristics for heterogeneous networks. Three aspects of the eICIC framework are addressed including setting SC powers and selection biases, MC duty cycles, and scheduling of user equipments (UEs) at SCs. The evolved heuristics yield minimum downlink rates three times higher than a baseline method, and twice that of a state-of-the-art benchmark. Furthermore, a greater number of UEs receive transmissions under the proposed scheme than in either the baseline or benchmark cases.
Agri-Environmental Policy Measures in Israel: The Potential of Using Market-Oriented Instruments
NASA Astrophysics Data System (ADS)
Amdur, Liron; Bertke, Elke; Freese, Jan; Marggraf, Rainer
2011-05-01
This paper examines the possibilities of developing agri-environmental policy measures in Israel, focusing on market-oriented instruments. A conceptual framework for developing agri-environmental policy measures is presented, first in very broad lines (mandatory regulations, economic instruments and advisory measures) and subsequently focusing on economic instruments, and specifically, on market-oriented ones. Two criteria of choice between the measures are suggested: their contribution to improving the effectiveness of the policy; and the feasibility of their implementation. This is the framework used for analyzing agri-environmental measures in Israel. Israel currently implements a mix of mandatory regulations, economic instruments and advisory measures to promote the agri-environment. The use of additional economic instruments may improve the effectiveness of the policy. When comparing the effectiveness of various economic measures, we found that the feasibility of implementation of market-oriented instruments is greater, due to the Israeli public's preference for strengthening market orientation in the agricultural sector. Four market-oriented instruments were practiced in a pilot project conducted in an Israeli rural area. We found that in this case study, the institutional feasibility and acceptance by stakeholders were the major parameters influencing the implementation of the market-oriented instruments, whereas the instruments' contribution to enhancing the ecological or economic effectiveness were hardly considered by the stakeholders as arguments in favor of their use.
Agri-environmental policy measures in Israel: the potential of using market-oriented instruments.
Amdur, Liron; Bertke, Elke; Freese, Jan; Marggraf, Rainer
2011-05-01
This paper examines the possibilities of developing agri-environmental policy measures in Israel, focusing on market-oriented instruments. A conceptual framework for developing agri-environmental policy measures is presented, first in very broad lines (mandatory regulations, economic instruments and advisory measures) and subsequently focusing on economic instruments, and specifically, on market-oriented ones. Two criteria of choice between the measures are suggested: their contribution to improving the effectiveness of the policy; and the feasibility of their implementation. This is the framework used for analyzing agri-environmental measures in Israel. Israel currently implements a mix of mandatory regulations, economic instruments and advisory measures to promote the agri-environment. The use of additional economic instruments may improve the effectiveness of the policy. When comparing the effectiveness of various economic measures, we found that the feasibility of implementation of market-oriented instruments is greater, due to the Israeli public's preference for strengthening market orientation in the agricultural sector. Four market-oriented instruments were practiced in a pilot project conducted in an Israeli rural area. We found that in this case study, the institutional feasibility and acceptance by stakeholders were the major parameters influencing the implementation of the market-oriented instruments, whereas the instruments' contribution to enhancing the ecological or economic effectiveness were hardly considered by the stakeholders as arguments in favor of their use.
NASA Astrophysics Data System (ADS)
Randol, B. M.; Ebert, R. W.; Allegrini, F.; McComas, D. J.; Schwadron, N. A.
2010-11-01
Electrostatic analyzers (ESAs), in various forms, are used to measure plasma in a range of applications. In this article, we describe how ions reflect from the interior surfaces of an ESA, the detection of which constitutes a fundamentally nonideal response of ESAs. We demonstrate this effect by comparing laboratory data from a real ESA-based space instrument, the Solar Wind Around Pluto (SWAP) instrument, aboard the NASA New Horizons spacecraft, to results from a model based on quantum mechanical simulations of particles reflected from the instrument's surfaces combined with simulations of particle trajectories through the instrument's applied electrostatic fields. Thus, we show, for the first time, how reflected ions in ESAs lead to nonideal effects that have important implications for understanding the data returned by these instruments, as well as for designing new low-background ESA-based instruments. Specifically, we show that the response of SWAP widens considerably below a level of 10^-3 of the peak response. Thus, a direct measurement of a plasma distribution with SWAP will have an energy-dependent background on the order of <=10^-3 of the peak of the signal due to that distribution. We predict that this order of magnitude estimate for the background applies to a large number of ESA-based instruments because ESAs operate using a common principle. However, the exact shape of the energy-dependent response will be different for different instruments. The principle of operation is that ions outside the ideal range of energy-per-charge are deflected into the walls of the ESA. Therefore, we propose that a new design paradigm is necessary to mitigate the effect of ion reflections and thus accurately and directly measure the energy spectrum of a plasma using ESAs. In this article, we build a framework for minimizing the effect of ion reflections in the design of new ESAs. Through the use of existing computer simulation software, a design team can use our method to quantify the amount of reflections in their instrument and iteratively change design parameters before fabrication, conserving resources. A possible direction for the new design paradigm is having nonsolid walls of the ESA, already used in some applications.
A Framework for the Design of Service Systems
NASA Astrophysics Data System (ADS)
Tan, Yao-Hua; Hofman, Wout; Gordijn, Jaap; Hulstijn, Joris
We propose a framework for the design and implementation of service systems, especially to design controls for long-term sustainable value co-creation. The framework is based on the software support tool e3-control. To illustrate the framework we use a large-scale case study, the Beer Living Lab, for simplification of customs procedures in international trade. The BeerLL shows how value co-creation can be achieved by reduction of administrative burden in international beer export due to electronic customs. Participants in the BeerLL are Heineken, IBM and Dutch Tax & Customs.
ERIC Educational Resources Information Center
Gosselin, Julie; Gahagan, Sheila; Amiel-Tison, Claudine
2005-01-01
The Amiel-Tison Neurological Assessment at Term (ATNAT) is part of a set of three different instruments based on a neuro-maturative framework. By sharing a same methodology and a similar scoring system, the use of these three assessments prevents any rupture in the course of high risk children follow-up from 32 weeks post-conception to 6 years of…
Foundations of Effective Influence Operations: A Framework for Enhancing Army Capabilities
2009-01-01
interesting approaches we came across in our survey of social science approaches that might be suitable for supporting influence operations. In many...planning, conducting, and assessing the impact of influence operations on attitudes and behaviors. The approach is based on survey instruments that...To illustrate, Figure 2.1 presents, in a three-dimensional form, the results from a Galileo survey
HESP: Instrument control, calibration and pipeline development
NASA Astrophysics Data System (ADS)
Anantha, Ch.; Roy, Jayashree; Mahesh, P. K.; Parihar, P. S.; Sangal, A. K.; Sriram, S.; Anand, M. N.; Anupama, G. C.; Giridhar, S.; Prabhu, T. P.; Sivarani, T.; Sundararajan, M. S.
Hanle Echelle SPectrograph (HESP) is a fibre-fed, high resolution (R = 30,000 and 60,000) spectrograph being developed for the 2m HCT telescope at IAO, Hanle. The major components of the instrument are a) Cassegrain unit b) Spectrometer instrument. An instrument control system interacting with a guiding unit at Cassegrain interface as well as handling spectrograph functions is being developed. An on-axis auto-guiding using the spill-over angular ring around the input pinhole is also being developed. The stellar light from the Cassegrain unit is taken to the spectrograph using an optical fiber which is being characterized for spectral transmission, focal ratio degradation and scrambling properties. The design of the thermal enclosure and thermal control for the spectrograph housing is presented. A data pipeline for the entire Echelle spectral reduction is being developed. We also plan to implement an instrument physical model based calibration into the main data pipeline and in the maintenance and quality control operations.
Koziol, Leonard F; Budding, Deborah Ely; Chidekel, Dana
2010-12-01
Current cortico-centric models of cognition lack a cohesive neuroanatomic framework that sufficiently considers overlapping levels of function, from "pathological" through "normal" to "gifted" or exceptional ability. While most cognitive theories presume an evolutionary context, few actively consider the process of adaptation, including concepts of neurodevelopment. Further, the frequent co-occurrence of "gifted" and "pathological" function is difficult to explain from a cortico-centric point of view. This comprehensive review paper proposes a framework that includes the brain's vertical organization and considers "giftedness" from an evolutionary and neurodevelopmental vantage point. We begin by discussing the current cortico-centric model of cognition and its relationship to intelligence. We then review an integrated, dual-tiered model of cognition that better explains the process of adaptation by simultaneously allowing for both stimulus-based processing and higher-order cognitive control. We consider the role of the basal ganglia within this model, particularly in relation to reward circuitry and instrumental learning. We review the important role of white matter tracts in relation to speed of adaptation and development of behavioral mastery. We examine the cerebellum's critical role in behavioral refinement and in cognitive and behavioral automation, particularly in relation to expertise and giftedness. We conclude this integrated model of brain function by considering the savant syndrome, which we believe is best understood within the context of a dual-tiered model of cognition that allows for automaticity in adaptation as well as higher-order executive control.
An Open Data Platform in the framework of the EGI-LifeWatch Competence Center
NASA Astrophysics Data System (ADS)
Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Yaiza Rodríguez Marrero, Ana
2016-04-01
The working pilot of an Open Data Platform supporting the full data cycle in research is presented. It aims to preserve knowledge explicitly, starting with the description of the Case Studies, and integrating data and software management and preservation on an equal basis. The uninterrupted support in the chain starts at the data acquisition level and covers up to the support for reuse and publication in an open framework, providing integrity and provenance controls. The Lifewatch Open Science Framework is a pilot web portal developed in collaboration with different commercial companies that tries to enrich and integrate different data lifecycle-related tools in order to address the management of the different steps: data planning, gathering, storing, curation, preservation, sharing, discovering, etc. To achieve this goal, the platform includes the following features: -Data Management Planning. A tool to set up a structure for the data, including what data will be generated, how it will be exploited, re-used, curated, preserved, etc. It has a semantic approach: it includes references to ontologies in order to express what data will be gathered. -Close to instrumentation. The portal includes a distributed storage system that can be used both for storing data from instruments and output data from analysis. All that data can be shared. -Analysis. Resources from EGI Federated Cloud are accessible within the portal, so that users can exploit computing resources to perform analysis and other processes, including workflows. -Preservation. Data can be preserved in different systems and DOIs can be minted not only for datasets but also for software, DMPs, etc. The presentation will show the different components of the framework as well as how it can be extrapolated to other communities.
Multifunctional Web Enabled Ocean Sensor Systems for the Monitoring of a Changing Ocean
NASA Astrophysics Data System (ADS)
Pearlman, Jay; Castro, Ayoze; Corrandino, Luigi; del Rio, Joaquin; Delory, Eric; Garello, Rene; Heuermann, Rudinger; Martinez, Enoc; Pearlman, Francoise; Rolin, Jean-Francois; Toma, Daniel; Waldmann, Christoph; Zielinski, Oliver
2016-04-01
As stated in the 2010 "Ostend Declaration", a major challenge in the coming years is the development of a truly integrated and sustainably funded European Ocean Observing System for supporting major policy initiatives such as the Integrated Maritime Policy and the Marine Strategy Framework Directive. This will be achieved with more long-term measurements of key parameters supported by a new generation of sensors whose costs and reliability will enable broad and consistent observations. Within the NeXOS project, a framework including new sensors capabilities and interface software has been put together that embraces the key technical aspects needed to improve the temporal and spatial coverage, resolution and quality of marine observations. The developments include new, low-cost, compact and integrated sensors with multiple functionalities that will allow for the measurements useful for a number of objectives, ranging from more precise monitoring and modeling of the marine environment to an improved assessment of fisheries. The project is entering its third year and will be demonstrating initial capabilities of optical and acoustic sensor prototypes that will become available for a number of platforms. For fisheries management, there is also a series of sensors that support an Ecosystem Approach to Fisheries (EAF). The greatest capabilities for comprehensive operations will occur when these sensors can be integrated into a multisensory capability on a single platform or multiply interconnected and coordinated platforms. Within NeXOS the full processing steps starting from the sensor signal all the way up to distributing collected environmental information will be encapsulated into standardized new state of the art Smart Sensor Interface and Web components to provide both improved integration and a flexible interface for scientists to control sensor operation. The use of the OGC SWE (Sensor Web Enablement) set of standards like OGC PUCK and SensorML at the instrument to platform integration phase will provide standard mechanisms for a truly plug'n'work connection. Through this, NeXOS Instruments will maintain within themselves specific information about how a platform (buoy controller, AUV controller, Observatory controller) has to configure and communicate with the instrument without the platform needing previous knowledge about the instrument. This mechanism is now being evaluated in real platforms like a Slocum Glider from Teledyne Web research, SeaExplorer Glider from Alseamar, Provor Float from NKE, and others including non commercial platforms like Obsea seafloor cabled observatory. The latest developments in the NeXOS sensors and the integration into an observation system will be discussed, addressing demonstration plans both for a variety of platforms and scientific objectives supporting marine management.
WFIRST: Managing Telescope Wavefront Stability to Meet Coronagraph Performance
NASA Astrophysics Data System (ADS)
Noecker, Martin; Poberezhskiy, Ilya; Kern, Brian; Krist, John; WFIRST System Engineering Team
2018-01-01
The WFIRST coronagraph instrument (CGI) needs a stable telescope and active wavefront control to perform coronagraph science with an expected sensitivity of 8×10^-9 in the exoplanet-star flux ratio (SNR=10) at 200 milliarcseconds angular separation. With its subnanometer requirements on the stability of its input wavefront error (WFE), the CGI employs a combination of pointing and wavefront control loops and thermo-mechanical stability to meet budget allocations for beam-walk and low-order WFE, which enable stable starlight speckles on the science detector that can be removed by image subtraction. We describe the control strategy and the budget framework for estimating and budgeting the elements of wavefront stability, and the modeling strategy to evaluate it.
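To illustrate how such a stability budget is typically rolled up, the sketch below combines independent contributors in root-sum-square against a top-level allocation. The contributor names and numbers are placeholders, not the WFIRST CGI budget.

```python
import math

# hypothetical wavefront-stability contributors (placeholder values, arbitrary consistent units)
contributors = {
    "beam_walk":       120.0,
    "low_order_drift": 180.0,
    "pointing_jitter":  90.0,
    "thermal_bending": 150.0,
}

allocation = 300.0  # top-level stability allocation in the same placeholder units

# root-sum-square combination assumes the contributors are statistically independent
rss = math.sqrt(sum(v ** 2 for v in contributors.values()))
margin = allocation - rss
print(f"RSS of contributors: {rss:.1f}  (allocation {allocation:.1f}, margin {margin:+.1f})")
for name, value in contributors.items():
    print(f"  {name:16s} {value:6.1f}  ({100 * (value / rss) ** 2:.0f}% of RSS^2)")
```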
WTEC panel report on European nuclear instrumentation and controls
NASA Technical Reports Server (NTRS)
White, James D.; Lanning, David D.; Beltracchi, Leo; Best, Fred R.; Easter, James R.; Oakes, Lester C.; Sudduth, A. L.
1991-01-01
Control and instrumentation systems might be called the 'brain' and 'senses' of a nuclear power plant. As such they become the key elements in the integrated operation of these plants. Recent developments in digital equipment have allowed a dramatic change in the design of these instrument and control (I&C) systems. New designs are evolving with cathode ray tube (CRT)-based control rooms, more automation, and better logical information for the human operators. As these new advanced systems are developed, various decisions must be made about the degree of automation and the human-to-machine interface. Different stages of the development of control automation and of advanced digital systems can be found in various countries. The purpose of this technology assessment is to make a comparative evaluation of the control and instrumentation systems that are being used for commercial nuclear power plants in Europe and the United States. This study is limited to pressurized water reactors (PWR's). Part of the evaluation includes comparisons with a previous similar study assessing Japanese technology.
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts
1981-05-01
program to begin probing the details of the interaction process. The theoretical framework underlying such a program is explained in detail. The theory of...of the time sequence of events during penetration. Data from one series of experiments, reported in detail elsewhere, is presented and discussed within the theoretical framework.
ERIC Educational Resources Information Center
Kumar, Swapna; Antonenko, Pavlo
2014-01-01
From an instrumental view, conceptual frameworks that are carefully assembled from existing literature in Educational Technology and related disciplines can help students structure all aspects of inquiry. In this article we detail how the development of a conceptual framework that connects theory, practice and method is scaffolded and facilitated…
The instrumental genesis process in future primary teachers using Dynamic Geometry Software
NASA Astrophysics Data System (ADS)
Ruiz-López, Natalia
2018-05-01
This paper, which describes a study undertaken with pairs of future primary teachers using GeoGebra software to solve geometry problems, includes a brief literature review, the theoretical framework and methodology used. An analysis of the instrumental genesis process for a pair participating in the case study is also provided. This analysis addresses the techniques and types of dragging used, the obstacles to learning encountered, a description of the interaction between the pair and their interaction with the teacher, and the type of language used. Based on this analysis, possibilities and limitations of the instrumental genesis process are identified for the development of geometric competencies such as conjecture creation, property checking and problem researching. It is also suggested that the methodology used in the analysis of the problem solving process may be useful for those teachers and researchers who want to integrate Dynamic Geometry Software (DGS) in their classrooms.
Financial Crisis: A New Measure for Risk of Pension Fund Portfolios
Cadoni, Marinella; Melis, Roberta; Trudda, Alessandro
2015-01-01
It has been argued that pension funds should have limitations on their asset allocation, based on the risk profile of the different financial instruments available on the financial markets. This issue proves to be highly relevant at times of market crisis, when a regulation establishing limits to risk taking for pension funds could prevent defaults. In this paper we present a framework for evaluating the risk level of a single financial instrument or a portfolio. By assuming that the log asset returns can be described by a multifractional Brownian motion, we evaluate the risk using the time dependent Hurst parameter H(t) which models volatility. To provide a measure of the risk, we model the Hurst parameter with a random variable with mixture of beta distribution. We prove the efficacy of the methodology by implementing it on different risk level financial instruments and portfolios. PMID:26086529
Financial Crisis: A New Measure for Risk of Pension Fund Portfolios.
Cadoni, Marinella; Melis, Roberta; Trudda, Alessandro
2015-01-01
It has been argued that pension funds should have limitations on their asset allocation, based on the risk profile of the different financial instruments available on the financial markets. This issue proves to be highly relevant at times of market crisis, when a regulation establishing limits to risk taking for pension funds could prevent defaults. In this paper we present a framework for evaluating the risk level of a single financial instrument or a portfolio. By assuming that the log asset returns can be described by a multifractional Brownian motion, we evaluate the risk using the time dependent Hurst parameter H(t) which models volatility. To provide a measure of the risk, we model the Hurst parameter with a random variable with mixture of beta distribution. We prove the efficacy of the methodology by implementing it on different risk level financial instruments and portfolios.
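A minimal sketch of one way to obtain a time-dependent Hurst estimate H(t) from a log-price series, using the variance scaling of increments in a sliding window. This simple estimator and the toy data are illustrative only; the paper's multifractional model and the beta-mixture modelling of H are not reproduced.

```python
import numpy as np

def local_hurst(log_price, window=250, lag1=1, lag2=4):
    """Sliding-window Hurst exponent from the variance scaling of increments.

    For (multi)fractional Brownian motion, Var[X(t+k) - X(t)] ~ k**(2H), so
    H ~ 0.5 * log(V2 / V1) / log(lag2 / lag1) inside each window. This is one
    simple estimator used for illustration, not the authors' methodology.
    """
    log_price = np.asarray(log_price, dtype=float)
    hurst = np.full(log_price.size, np.nan)
    for end in range(window, log_price.size + 1):
        seg = log_price[end - window:end]
        v1 = np.var(seg[lag1:] - seg[:-lag1])
        v2 = np.var(seg[lag2:] - seg[:-lag2])
        if v1 > 0 and v2 > 0:
            hurst[end - 1] = 0.5 * np.log(v2 / v1) / np.log(lag2 / lag1)
    return hurst

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    # toy log-price: a plain random walk, for which the true H is 0.5
    log_price = np.cumsum(rng.normal(0, 0.01, 2000))
    h = local_hurst(log_price)
    print("mean estimated H over valid windows:", round(float(np.nanmean(h)), 2))
```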
DRG-based hospital payment systems and technological innovation in 12 European countries.
Scheller-Kreinsen, David; Quentin, Wilm; Busse, Reinhard
2011-12-01
To assess how diagnosis-related group-based (DRG-based) hospital payment systems in 12 European countries participating in the EuroDRG project pay and incorporate technological innovation. A standardized questionnaire was used to guide comprehensive DRG system descriptions. Researchers from each country reviewed relevant materials to complete the questionnaire and drafted standardized country reports. Two characteristics of DRG-based hospital payment systems were identified as particularly important: the existence of short-term payment instruments encouraging technological innovation in different countries, and the characteristics of long-term updating mechanisms that assure technological innovation is ultimately incorporated into DRG-based hospital payment systems. Short-term payment instruments and long-term updating mechanisms differ greatly among the 12 European countries included in this study. Some countries operate generous short-term payment instruments that provide additional payments to hospitals for making use of technological innovation (e.g., France). Other countries update their DRG-based hospital payment systems very frequently and use more recent data for updates. Generous short-term payment instruments to promote technological innovation should be applied carefully as they may imply rapidly increasing health-care expenditures. In general, they should be granted only if rigorous analyses have demonstrated their benefits. If the evidence remains uncertain, coverage with evidence development frameworks or frequent updates of the DRG-based hospital systems may provide policy alternatives. Once the data and evidence base is substantially improved, future research should empirically investigate how different policy arrangements affect the adoption and use of technological innovation and health-care expenditures. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Inertial Pointing and Positioning System
NASA Technical Reports Server (NTRS)
Yee, Robert (Inventor); Robbins, Fred (Inventor)
1998-01-01
An inertial pointing and control system and method for pointing to a designated target with known coordinates from a platform to provide accurate position, steering, and command information. The system continuously receives GPS signals and corrects Inertial Navigation System (INS) dead reckoning or drift errors. An INS is mounted directly on a pointing instrument rather than in a remote location on the platform for monitoring the terrestrial position and instrument attitude, and for pointing the instrument at designated celestial targets or ground-based landmarks. As a result, the pointing instrument and the INS move independently in inertial space from the platform since the INS is decoupled from the platform. Another important characteristic of the present system is that selected INS measurements are combined with predefined coordinate transformation equations and control logic algorithms under computer control in order to generate inertial pointing commands to the pointing instrument. More specifically, the computer calculates the desired instrument angles (Phi, Theta, Psi), which are then compared to the Euler angles measured by the instrument-mounted INS, and forms the pointing command error angles as a result of the compared difference.
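The comparison step can be sketched as follows: compute the desired pointing angles toward a target with known coordinates, difference them against the INS-measured attitude, and wrap the result into command error angles. The flat-Earth East-North-Up geometry and the example numbers are simplifying assumptions, not the patented method's full geodetic treatment.

```python
import numpy as np

def desired_az_el(target_enu):
    """Azimuth/elevation (radians) toward a target given its East-North-Up offset
    from the instrument; a simplified flat-Earth stand-in for the full geodetic case."""
    e, n, u = target_enu
    az = np.arctan2(e, n)                  # azimuth measured from north toward east
    el = np.arctan2(u, np.hypot(e, n))
    return az, el

def pointing_errors(target_enu, measured_az, measured_el):
    """Error angles to command to the gimbal: desired minus INS-measured attitude,
    with azimuth wrapped to the range (-pi, pi]."""
    az_d, el_d = desired_az_el(target_enu)
    d_az = (az_d - measured_az + np.pi) % (2 * np.pi) - np.pi
    d_el = el_d - measured_el
    return d_az, d_el

if __name__ == "__main__":
    target = (1000.0, 2000.0, 500.0)       # hypothetical landmark offset in metres
    d_az, d_el = pointing_errors(target, measured_az=np.radians(25.0), measured_el=np.radians(12.0))
    print(f"command: d_az={np.degrees(d_az):+.2f} deg, d_el={np.degrees(d_el):+.2f} deg")
```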
A Descriptive and Interpretative Information System for the IODP
NASA Astrophysics Data System (ADS)
Blum, P.; Foster, P. A.; Mateo, Z.
2006-12-01
The ODP/IODP has a long and rich history of collecting descriptive and interpretative information (DESCINFO) from rock and sediment cores from the world's oceans. Unlike instrumental data, DESCINFO generated by subject experts is biased by the scientific and cultural background of the observers and their choices of classification schemes. As a result, global searches of DESCINFO and its integration with other data are problematic. To address this issue, the IODP-USIO is in the process of designing and implementing a DESCINFO system for IODP Phase 2 (2007-2013) that meets the user expectations expressed over the past decade. The requirements include support of (1) detailed, material property-based descriptions as well as classification-based descriptions; (2) global searches by physical sample and digital data sources as well as any of the descriptive parameters; (3) user-friendly data capture tools for a variety of workflows; (4) extensive visualization of DESCINFO data along with instrumental data and images; and (5) portability/interoperability such that the system can work with database schemas of other organizations - a specific challenge given the schema and semantic heterogeneity not only among the three IODP operators but within the geosciences in general. The DESCINFO approach is based on the definition of a set of generic observable parameters that are populated with numeric or text values. Text values are derived from controlled, extensible hierarchical value lists that allow descriptions at the appropriate level of detail and ensure successful data searches. Material descriptions can be completed independently of domain-specific classifications, genetic concepts, and interpretative frameworks.
Molecular substrates of action control in cortico-striatal circuits.
Shiflett, Michael W; Balleine, Bernard W
2011-09-15
The purpose of this review is to describe the molecular mechanisms in the striatum that mediate reward-based learning and action control during instrumental conditioning. Experiments assessing the neural bases of instrumental conditioning have uncovered functional circuits in the striatum, including dorsal and ventral striatal sub-regions, involved in action-outcome learning, stimulus-response learning, and the motivational control of action by reward-associated cues. Integration of dopamine (DA) and glutamate neurotransmission within these striatal sub-regions is hypothesized to enable learning and action control through its role in shaping synaptic plasticity and cellular excitability. The extracellular signal regulated kinase (ERK) appears to be particularly important for reward-based learning and action control due to its sensitivity to combined DA and glutamate receptor activation and its involvement in a range of cellular functions. ERK activation in striatal neurons is proposed to have a dual role in both the learning and performance factors that contribute to instrumental conditioning through its regulation of plasticity-related transcription factors and its modulation of intrinsic cellular excitability. Furthermore, perturbation of ERK activation by drugs of abuse may give rise to behavioral disorders such as addiction. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Carniel, Roberto; Di Cecca, Mauro; Jaquet, Olivier
2006-05-01
In the framework of the EU-funded project "Multi-disciplinary monitoring, modelling and forecasting of volcanic hazard" (MULTIMO), multiparametric data have been recorded at the MULTIMO station in Montserrat. Moreover, several other long time series, recorded at Montserrat and at other volcanoes, have been acquired in order to test stochastic and deterministic methodologies under development. Creating a general framework to handle data efficiently is a considerable task even for homogeneous data. In the case of heterogeneous data, this becomes a major issue. A need for a consistent way of browsing such a heterogeneous dataset in a user-friendly way therefore arose. Additionally, a framework for applying the calculation of the developed dynamical parameters on the data series was also needed in order to easily keep these parameters under control, e.g. for monitoring, research or forecasting purposes. The solution which we present is completely based on Open Source software, including Linux operating system, MySql database management system, Apache web server, Zope application server, Scilab math engine, Plone content management framework, Unified Modelling Language. From the user point of view the main advantage is the possibility of browsing through datasets recorded on different volcanoes, with different instruments, with different sampling frequencies, stored in different formats, all via a consistent, user-friendly interface that transparently runs queries to the database, gets the data from the main storage units, generates the graphs and produces dynamically generated web pages to interact with the user. The involvement of third parties for continuing the development in the Open Source philosophy and/or extending the application fields is now sought.
Instrumentation for detailed bridge-scour measurements
Landers, Mark N.; Mueller, David S.; Trent, Roy E.; ,
1993-01-01
A portable instrumentation system is being developed to obtain channel bathymetry during floods for detailed bridge-scour measurements. Portable scour measuring systems have four components: sounding instrument, horizontal positioning instrument, deployment mechanisms, and data storage device. The sounding instrument will be a digital fathometer. Horizontal position will be measured using a range-azimuth based hydrographic survey system. The deployment mechanism designed for this system is a remote-controlled boat using a small waterplane area, twin-hull design. An on-board computer and radio will monitor the vessel instrumentation, record measured data, and telemeter data to shore.
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
A novel framework for virtual prototyping of rehabilitation exoskeletons.
Agarwal, Priyanshu; Kuo, Pei-Hsin; Neptune, Richard R; Deshpande, Ashish D
2013-06-01
Human-worn rehabilitation exoskeletons have the potential to make therapeutic exercises increasingly accessible to disabled individuals while reducing the cost and labor involved in rehabilitation therapy. In this work, we propose a novel human-model-in-the-loop framework for virtual prototyping (design, control and experimentation) of rehabilitation exoskeletons by merging computational musculoskeletal analysis with simulation-based design techniques. The framework allows one to iteratively optimize the design and control algorithm of an exoskeleton using simulation. We introduce biomechanical, morphological, and controller measures to quantify the performance of the device for the optimization study. Furthermore, the framework allows one to carry out virtual experiments for testing specific "what-if" scenarios to quantify device performance and recovery progress. To illustrate the application of the framework, we present a case study wherein the design and analysis of an index-finger exoskeleton is carried out using the proposed framework.
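The iterate-design-and-control-against-simulation idea can be written as a generic optimization loop. In the hedged sketch below, simulate_exoskeleton, its parameters (stiffness, gain), and the cost weights are hypothetical stand-ins for the musculoskeletal simulation and the biomechanical/controller measures; they are not the authors' models.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_exoskeleton(params):
    """Hypothetical stand-in for a musculoskeletal/device simulation.
    Returns two performance measures for a candidate design:
    (joint-torque assistance error, actuator effort)."""
    stiffness, gain = params
    torque_error = (stiffness - 2.0) ** 2              # fictitious target stiffness
    actuator_effort = 0.1 * gain ** 2 + 1.0 / (gain + 1e-6)
    return torque_error, actuator_effort

def cost(params):
    # Weighted sum of the simulated performance measures
    torque_error, effort = simulate_exoskeleton(params)
    return torque_error + 0.5 * effort

result = minimize(cost, x0=np.array([1.0, 1.0]),
                  bounds=[(0.1, 10.0), (0.1, 10.0)])
print("optimized [stiffness, gain]:", result.x)
```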
Durham, Mary F; Knight, Jennifer K; Couch, Brian A
2017-01-01
The Scientific Teaching (ST) pedagogical framework provides various approaches for science instructors to teach in a way that more closely emulates how science is practiced by actively and inclusively engaging students in their own learning and by making instructional decisions based on student performance data. Fully understanding the impact of ST requires having mechanisms to quantify its implementation. While many useful instruments exist to document teaching practices, these instruments only partially align with the range of practices specified by ST, as described in a recently published taxonomy. Here, we describe the development, validation, and implementation of the Measurement Instrument for Scientific Teaching (MIST), a survey derived from the ST taxonomy and designed to gauge the frequencies of ST practices in undergraduate science courses. MIST showed acceptable validity and reliability based on results from 7767 students in 87 courses at nine institutions. We used factor analyses to identify eight subcategories of ST practices and used these categories to develop a short version of the instrument amenable to joint administration with other research instruments. We further discuss how MIST can be used by instructors, departments, researchers, and professional development programs to quantify and track changes in ST practices. © 2017 M. F. Durham et al. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Development of a monitoring instrument to assess the performance of the Swiss primary care system.
Ebert, Sonja T; Pittet, Valérie; Cornuz, Jacques; Senn, Nicolas
2017-11-29
The Swiss health system is customer-driven, with a fee-for-service payment scheme and universal coverage. It is highly performing but expensive, and health information systems are scarcely implemented. The Swiss Primary Care Active Monitoring (SPAM) program aims to develop an instrument able to describe the performance and effectiveness of the Swiss PC system. Based on a literature review, we developed a conceptual framework and selected indicators according to their ability to reflect the Swiss PC system. A two-round modified RAND method with 24 national and international experts was used to select primary/secondary indicators (validity, clarity, agreement). A limited set of priority indicators was selected (importance, priority) in a third round. A conceptual framework covering three domains (structure, process, outcome) subdivided into twelve sections (funding, access, organisation/workflow of resources, (para-)medical training, management of knowledge, clinical/interpersonal care, health status, satisfaction of PC providers/consumers, equity) was generated. 365 indicators were pre-selected and 335 were finally retained. 56 were kept as priority indicators. Among the remaining, 199 were identified as primary and 80 as secondary indicators. All domains and sections are represented. The development of the SPAM program allowed the construction of a consensual instrument in a traditionally unregulated health system through a modified RAND method. The selected 56 priority indicators render the SPAM instrument a comprehensive tool supporting a better understanding of the Swiss PC system's performance and effectiveness, as well as identifying potential ways to improve quality of care. Further challenges will be to update indicators regularly and to assess validity and sensitivity to change over time.
Structural Control of Metabolic Flux
Sajitz-Hermstein, Max; Nikoloski, Zoran
2013-01-01
Organisms have to continuously adapt to changing environmental conditions or undergo developmental transitions. To meet the accompanying change in metabolic demands, the molecular mechanisms of adaptation involve concerted interactions which ultimately induce a modification of the metabolic state, which is characterized by reaction fluxes and metabolite concentrations. These state transitions are the effect of simultaneously manipulating fluxes through several reactions. While metabolic control analysis has provided a powerful framework for elucidating the principles governing this orchestrated action to understand metabolic control, its applications are restricted by the limited availability of kinetic information. Here, we introduce structural metabolic control as a framework to examine individual reactions' potential to control metabolic functions, such as biomass production, based on structural modeling. The capability to carry out a metabolic function is determined using flux balance analysis (FBA). We examine structural metabolic control using the example of the central carbon metabolism of Escherichia coli with the recently introduced framework of functional centrality (FC). This framework is based on the Shapley value from cooperative game theory and FBA, and we demonstrate its superior ability to assign “share of control” to individual reactions with respect to metabolic functions and environmental conditions. A comparative analysis of various scenarios illustrates the usefulness of FC and its relations to other structural approaches pertaining to metabolic control. We propose a Monte Carlo algorithm to estimate FCs for large networks, based on the enumeration of elementary flux modes. We further give detailed biological interpretation of FCs for production of lactate and ATP under various respiratory conditions. PMID:24367246
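The Monte Carlo estimation of Shapley-value-based functional centralities can be sketched with the standard permutation-sampling estimator. In the sketch below, can_perform_function is a toy stand-in for the FBA test of a metabolic function; the actual method works on elementary flux modes of the E. coli network, which is not reproduced here.

```python
import random

def can_perform_function(active_reactions):
    """Toy stand-in for an FBA test of whether the network restricted to the
    given active reactions can still carry the metabolic function.
    Here: the function needs reaction 'A' and at least one of 'B', 'C'."""
    return 'A' in active_reactions and ({'B', 'C'} & active_reactions)

def monte_carlo_shapley(reactions, value_fn, n_samples=5000, seed=0):
    """Estimate each reaction's Shapley value ("share of control") by averaging
    its marginal contribution over random orderings of the reactions."""
    rng = random.Random(seed)
    shares = {r: 0.0 for r in reactions}
    for _ in range(n_samples):
        order = reactions[:]
        rng.shuffle(order)
        coalition = set()
        prev = 1 if value_fn(coalition) else 0
        for r in order:
            coalition.add(r)
            cur = 1 if value_fn(coalition) else 0
            shares[r] += cur - prev
            prev = cur
    return {r: s / n_samples for r, s in shares.items()}

print(monte_carlo_shapley(['A', 'B', 'C', 'D'], can_perform_function))
```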
Eight microprocessor-based instrument data systems in the Galileo Orbiter spacecraft
NASA Technical Reports Server (NTRS)
Barry, R. C.
1980-01-01
Each instrument data system consists of a microprocessor, 3K bytes of read-only memory, and 3K bytes of random-access memory. It interfaces with the spacecraft data bus through an isolated user interface with a direct memory access bus adaptor, and/or with parallel data from instrument devices such as registers, buffers, analog-to-digital converters, multiplexers, and solid-state sensors. These data systems support the spacecraft hardware and software communication protocol, decode and process instrument commands, generate continuous instrument operating modes, control the instrument mechanisms, and acquire, process, format, and output instrument science data.
Kern, Simon; Meyer, Klas; Guhl, Svetlana; Gräßer, Patrick; Paul, Andrea; King, Rudibert; Maiwald, Michael
2018-05-01
Monitoring specific chemical properties is the key to chemical process control. Today, mainly optical online methods are applied, which require time- and cost-intensive calibration effort. NMR spectroscopy, with the advantage of being a direct comparison method without the need for calibration, has a high potential for enabling closed-loop process control while exhibiting short set-up times. Compact NMR instruments make NMR spectroscopy accessible in industrial and rough environments for process monitoring and advanced process control strategies. We present a fully automated data analysis approach which is completely based on physically motivated spectral models as first principles information (indirect hard modeling, IHM) and applied it to a given pharmaceutical lithiation reaction in the framework of the European Union's Horizon 2020 project CONSENS. Online low-field NMR (LF NMR) data was analyzed by IHM with low calibration effort, compared to a multivariate PLS-R (partial least squares regression) approach, and both were validated using online high-field NMR (HF NMR) spectroscopy. Graphical abstract: NMR sensor module for monitoring of the aromatic coupling of 1-fluoro-2-nitrobenzene (FNB) with aniline to 2-nitrodiphenylamine (NDPA) using lithium-bis(trimethylsilyl) amide (Li-HMDS) in continuous operation. Online 43.5 MHz low-field NMR (LF) was compared to 500 MHz high-field NMR spectroscopy (HF) as the reference method.
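For the multivariate baseline mentioned above, a PLS-R calibration can be sketched with scikit-learn; the synthetic spectra and concentrations below are placeholders for real calibration data and do not represent the indirect hard modeling approach or the CONSENS data set.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for calibration data: rows are NMR spectra
# (intensity per chemical-shift channel), y is the analyte concentration.
n_spectra, n_channels = 120, 300
concentrations = rng.uniform(0.0, 1.0, n_spectra)
pure_component = np.exp(-0.5 * ((np.arange(n_channels) - 150) / 8.0) ** 2)
spectra = np.outer(concentrations, pure_component)
spectra += 0.01 * rng.standard_normal(spectra.shape)   # instrument noise

X_train, X_test, y_train, y_test = train_test_split(
    spectra, concentrations, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
print("R^2 on held-out spectra:", pls.score(X_test, y_test))
```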
Validity-Supporting Evidence of the Self-Efficacy for Teaching Mathematics Instrument
ERIC Educational Resources Information Center
McGee, Jennifer R.; Wang, Chuang
2014-01-01
The purpose of this study is to provide evidence of reliability and validity of the Self-Efficacy for Teaching Mathematics Instrument (SETMI). Self-efficacy, as defined by Bandura, was the theoretical framework for the development of the instrument. The complex belief systems of mathematics teachers, as touted by Ernest provided insights into the…
Approaches of High School Instrumental Music Educators in Response to Student Challenges
ERIC Educational Resources Information Center
Edgar, Scott N.
2016-01-01
The purpose of this multiple instrumental case study was to explore approaches of four high school instrumental music educators assuming the role of facilitative teacher in responding to challenges affecting the social and emotional well-being of their students. This study utilized the framework of social emotional learning as a lens to view the…
Cavity-Enhanced Quantum-Cascade Laser-Based Instrument for Trace gas Measurements
NASA Astrophysics Data System (ADS)
Provencal, R.; Gupta, M.; Owano, T.; Baer, D.; Ricci, K.; O'Keefe, A.
2005-12-01
An autonomous instrument based on Off-Axis Integrated Cavity Output Spectroscopy has been successfully deployed for measurements of CO in the troposphere and tropopause onboard a NASA DC-8 aircraft. The instrument consists of a measurement cell comprised of two high reflectivity mirrors, a continuous-wave quantum-cascade laser, gas sampling system, control and data acquisition electronics, and data analysis software. The instrument reports CO mixing ratio at a 1-Hz rate based on measured absorption, gas temperature and pressure using Beer's Law. During several flights in May-June 2004 and January 2005 that reached altitudes of 41000 ft, the instrument recorded CO values with a precision of 0.2 ppbv (1-s averaging time). Despite moderate turbulence and measurements of particulate-laden airflows, the instrument operated consistently and did not require any maintenance, mirror cleaning, or optical realignment during the flights. We will also present recent development efforts to extend the instrument's capabilities for the measurements of CH4, N2O and CO in real time.
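The Beer's-law retrieval step (mixing ratio from measured absorption, gas temperature, and pressure) reduces to converting an absorbance into an absorber number density and dividing by the total number density from the ideal gas law. The sketch below ignores the cavity-enhancement details of Off-Axis ICOS, and the cross-section, effective path length, and flight conditions are illustrative values only, not the instrument's parameters.

```python
K_B = 1.380649e-23          # Boltzmann constant, J/K

def co_mixing_ratio(absorbance, cross_section_cm2, path_cm, temp_K, press_Pa):
    """Beer's law retrieval: A = sigma * N_CO * L  =>  N_CO = A / (sigma * L).
    Mixing ratio is N_CO divided by the total number density n = p / (k_B T).
    Cross-section and effective path length are illustrative placeholders."""
    n_co = absorbance / (cross_section_cm2 * path_cm)    # molecules / cm^3
    n_total = press_Pa / (K_B * temp_K) * 1e-6           # molecules / cm^3
    return n_co / n_total

# Example with made-up numbers (not the instrument's actual parameters)
ratio = co_mixing_ratio(absorbance=0.33, cross_section_cm2=1.0e-18,
                        path_cm=5.0e5, temp_K=220.0, press_Pa=2.0e4)
print(f"CO mixing ratio: {ratio * 1e9:.1f} ppbv")   # ~100 ppbv for these values
```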
121. VIEW OF CABINETS ON WEST SIDE OF LANDLINE INSTRUMENTATION ...
121. VIEW OF CABINETS ON WEST SIDE OF LANDLINE INSTRUMENTATION ROOM (206), LSB (BLDG. 751). FEATURES LEFT TO RIGHT: FACILITY DISTRIBUTION CONSOLE FOR WATER CONTROL SYSTEMS, PROPULSION ELECTRICAL CHECKOUT SYSTEM (PECOS), LOGIC CONTROL AND MONITOR UNITS FOR BOOSTER AND FUEL SYSTEMS. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 East, Napa & Alden Roads, Lompoc, Santa Barbara County, CA
Controlling CAMAC instrumentation through the USB port
NASA Astrophysics Data System (ADS)
Ribas, R. V.
2012-02-01
A programmable device to interface CAMAC instrumentation to the USB port of computers, without the need of heavy, noisy and expensive CAMAC crates is described in this article. Up to four single-width modules can be used. Also, all software necessary for a multi-parametric data acquisition system was developed. A standard crate-controller based on the same project is being designed.
The knowledge-based framework for a nuclear power plant operator advisor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D.W.; Hajek, B.K.
1989-01-01
An important facet in the design, development, and evaluation of aids for complex systems is the identification of the tasks performed by the operator. Operator aids utilizing artificial intelligence, or more specifically knowledge-based systems, require identification of these tasks in the context of a knowledge-based framework. In this context, the operator responses to the plant behavior are to monitor and comprehend the state of the plant, identify normal and abnormal plant conditions, diagnose abnormal plant conditions, predict plant response to specific control actions, select the best available control action, implement a feasible control action, monitor system response to the control action, and correct for any inappropriate responses. These tasks have been identified to formulate a knowledge-based framework for an operator advisor under development at Ohio State University that utilizes the generic task methodology proposed by Chandrasekaran. The paper lays the foundation to identify the responses as a knowledge-based set of tasks in accordance with the expected human operator responses during an event. Initial evaluation of the expert system indicates the potential for an operator aid that will improve the operator's ability to respond to both anticipated and unanticipated events.
Application of programmable logic controllers to space simulation
NASA Technical Reports Server (NTRS)
Sushon, Janet
1992-01-01
Incorporating a state-of-the-art process control and instrumentation system into a complex system for thermal vacuum testing is discussed. The challenge was to connect several independent control systems provided by various vendors to a supervisory computer. This combination will sequentially control and monitor the process, collect the data, and transmit it to a color graphics system for subsequent manipulation. The vacuum system upgrade included: replacement of seventeen diffusion pumps with eight cryogenic pumps and one turbomolecular pump, replacement of a relay-based control system, replacement of vacuum instrumentation, and an upgrade of the data acquisition system.
XAL Application Framework and Bricks GUI Builder
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelaia II, Tom
2007-01-01
The XAL [1] Application Framework is a framework for rapidly developing document-based Java applications with a common look and feel along with many built-in user interface behaviors. The Bricks GUI builder consists of a modern application and framework for rapidly building user interfaces in support of true Model-View-Controller (MVC) compliant Java applications. Bricks and the XAL Application Framework allow developers to rapidly create quality applications.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-30
Risk-Based Capital Standards: Advanced Capital Adequacy Framework--Basel II; Establishment of a Risk-Based Capital Floor. 12 CFR Part 325, RIN 3064-AD58. AGENCY: Office of the Comptroller of the Currency...
ERIC Educational Resources Information Center
Schweimer, Judith; Hauber, Wolfgang
2005-01-01
The anterior cingulate cortex (ACC) plays a critical role in stimulus-reinforcement learning and reward-guided selection of actions. Here we conducted a series of experiments to further elucidate the role of the ACC in instrumental behavior involving effort-based decision-making and instrumental learning guided by reward-predictive stimuli. In…
2017-01-01
Background The home environment is where young children spend most of their time, and is critically important to supporting behaviors that promote health and prevent obesity. However, the home environment and lifestyle patterns remain understudied, and few interventions have investigated parent-led makeovers designed to create home environments that are supportive of optimal child health and healthy child weights. Objective The aim of the HomeStyles randomized controlled trial (RCT) is to determine whether the Web-based HomeStyles intervention enables and motivates parents to shape the weight-related aspects of their home environments and lifestyle behavioral practices (diet, exercise, and sleep) to be more supportive of their preschool children’s optimal health and weight. Methods A rigorous RCT utilizing an experimental group and an attention control group, receiving a bona fide contemporaneous treatment equal in nonspecific treatment effects and differing only in subject matter content, will test the effect of HomeStyles on a diverse sample of families with preschool children. This intervention is based on social cognitive theory and uses a social ecological framework, and will assess: intrapersonal characteristics (dietary intake, physical activity level, and sleep) of parents and children; family interpersonal or social characteristics related to diet, physical activity, media use, and parental values and self-efficacy for obesity-preventive practices; and home environment food availability, physical activity space and supports in and near the home, and media availability and controls in the home. Results Enrollment for this study has been completed and statistical data analyses are currently underway. Conclusions This paper describes the HomeStyles intervention with regards to: rationale, the intervention’s logic model, sample eligibility criteria and recruitment, experimental group and attention control intervention content, study design, instruments, data management, and planned analyses. PMID:28442452
Schadt, Eric E.; Banerjee, Onureena; Fang, Gang; Feng, Zhixing; Wong, Wing H.; Zhang, Xuegong; Kislyuk, Andrey; Clark, Tyson A.; Luong, Khai; Keren-Paz, Alona; Chess, Andrew; Kumar, Vipin; Chen-Plotkin, Alice; Sondheimer, Neal; Korlach, Jonas; Kasarskis, Andrew
2013-01-01
Current generation DNA sequencing instruments are moving closer to seamlessly sequencing genomes of entire populations as a routine part of scientific investigation. However, while significant inroads have been made identifying small nucleotide variation and structural variations in DNA that impact phenotypes of interest, progress has not been as dramatic regarding epigenetic changes and base-level damage to DNA, largely due to technological limitations in assaying all known and unknown types of modifications at genome scale. Recently, single-molecule real time (SMRT) sequencing has been reported to identify kinetic variation (KV) events that have been demonstrated to reflect epigenetic changes of every known type, providing a path forward for detecting base modifications as a routine part of sequencing. However, to date no statistical framework has been proposed to enhance the power to detect these events while also controlling for false-positive events. By modeling enzyme kinetics in the neighborhood of an arbitrary location in a genomic region of interest as a conditional random field, we provide a statistical framework for incorporating kinetic information at a test position of interest as well as at neighboring sites that help enhance the power to detect KV events. The performance of this and related models is explored, with the best-performing model applied to plasmid DNA isolated from Escherichia coli and mitochondrial DNA isolated from human brain tissue. We highlight widespread kinetic variation events, some of which strongly associate with known modification events, while others represent putative chemically modified sites of unknown types. PMID:23093720
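The conditional-random-field model itself is beyond the scope of an abstract, but the underlying per-position comparison of polymerase kinetics (interpulse durations in a native sample versus an amplified control) can be illustrated with a simple two-sample test; this is a simplified stand-in for, not an implementation of, the authors' framework, and the data below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic interpulse durations (IPDs) at one genomic position: the native
# sample is slowed relative to the amplified control, as would be expected
# at a modified base (illustrative distributions only).
ipd_control = rng.lognormal(mean=0.0, sigma=0.5, size=200)
ipd_native = 1.8 * rng.lognormal(mean=0.0, sigma=0.5, size=200)

# Rank-based two-sample test on the kinetic signal at this position
statistic, p_value = stats.mannwhitneyu(ipd_native, ipd_control,
                                        alternative="greater")
print(f"Mann-Whitney U = {statistic:.0f}, p = {p_value:.2e}")
# A genome-wide analysis would repeat this per position and control the
# false-positive rate; the CRF model additionally borrows strength from
# neighboring positions.
```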
Translating Radiometric Requirements for Satellite Sensors to Match International Standards.
Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong
2014-01-01
International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework leading to uniform interpretation throughout the development and operation of any satellite instrument.
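For independent error components, the propagation-of-uncertainties step that combines several separately written requirements into a single specification reduces to a root-sum-square; the component values below are placeholders, not the actual ABI requirement numbers.

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent uncertainty components,
    u_c = sqrt(sum(u_i^2)); sensitivity coefficients assumed equal to 1."""
    return math.sqrt(sum(u ** 2 for u in components))

# Placeholder component uncertainties (e.g., in percent radiance), not ABI values
components = [0.3, 0.4, 0.2]
u_c = combined_standard_uncertainty(components)
print(f"combined standard uncertainty: {u_c:.2f}")
print(f"expanded uncertainty (coverage factor k=2): {2 * u_c:.2f}")
```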
Berteletti, Florence
2017-09-13
In their paper, Nikogosian and Kickbusch show how the effects of the adoption by the World Health Organization (WHO) of the Framework Convention on Tobacco Control (WHO FCTC) and its first Protocol extend beyond tobacco control and contribute to public health governance more broadly, by revealing new processes, institutions and instruments. While there are certainly good reasons to be optimistic about the impact of these instruments in the public health sphere, the experience of the FCTC's implementation in the context of the European Union (EU) shows that further efforts are still necessary for its full potential to be realised. Indeed, one of the main hurdles to the FCTC's success so far has been the difficulty in developing and maintaining comprehensive multisectoral measures and involving sectors beyond the sphere of public health. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Multi-Mission Automated Task Invocation Subsystem
NASA Technical Reports Server (NTRS)
Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.
2009-01-01
Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
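The event-driven, rule-interpreting pattern described for MATIS can be sketched in a few lines of Python; the event names, guard conditions, and actions below are hypothetical and do not reflect MATIS's actual rule syntax or plug-in interface.

```python
# Minimal event-driven rule interpreter in the spirit of the description:
# each rule names an event, a guard condition, and the program to run.
rules = [
    {"event": "raw_file_arrived",
     "condition": lambda ctx: ctx["instrument"] == "SPECTROMETER_X",   # hypothetical
     "action": lambda ctx: print(f"launch calibration pipeline for {ctx['file']}")},
    {"event": "calibrated_product_ready",
     "condition": lambda ctx: True,
     "action": lambda ctx: print(f"launch mapping pipeline for {ctx['file']}")},
]

def dispatch(event, context):
    """Run every rule whose event matches and whose condition holds."""
    for rule in rules:
        if rule["event"] == event and rule["condition"](context):
            rule["action"](context)

dispatch("raw_file_arrived",
         {"instrument": "SPECTROMETER_X", "file": "orbit_0042.raw"})
```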
XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework
NASA Astrophysics Data System (ADS)
Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò
2017-08-01
We present a new simulation framework, XIMPOL, based on the python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.
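The core of such an observation-simulation is folding a source spectrum through the instrument response to obtain expected, and then Poisson-sampled, counts per energy bin. The sketch below uses plain numpy arrays rather than XIMPOL's API or OGIP response files, and the power-law spectrum and effective-area shape are assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Energy grid and a toy power-law source spectrum, photons / (cm^2 s keV)
energies = np.linspace(2.0, 8.0, 61)                 # keV bin edges
bin_width = np.diff(energies)
centers = 0.5 * (energies[:-1] + energies[1:])
source_flux = 0.1 * centers ** -2.0

# Toy effective area (cm^2) standing in for the detector response file
effective_area = 300.0 * np.exp(-0.5 * ((centers - 3.0) / 1.5) ** 2)

exposure = 100e3                                     # seconds
expected_counts = source_flux * effective_area * bin_width * exposure
simulated_counts = rng.poisson(expected_counts)      # Poisson-sampled spectrum

print("total simulated counts:", simulated_counts.sum())
```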
Smith, Morgan; Warland, Jane; Smith, Colleen
2012-03-01
Online role-play has the potential to actively engage students in authentic learning experiences and help develop their clinical reasoning skills. However, evaluation of student learning for this kind of simulation focuses mainly on the content and outcome of learning, rather than on the process of learning through student engagement. This article reports on the use of a student engagement framework to evaluate an online role-play offered as part of a course in Bachelor of Nursing and Bachelor of Midwifery programs. Instruments that measure student engagement to date have targeted large numbers of students at program and institutional levels, rather than at the level of a specific learning activity. Although the framework produced some useful findings for evaluation purposes, further refinement of the questions is required to be certain that deep learning results from the engagement that occurs with course-level learning initiatives. Copyright 2012, SLACK Incorporated.
Wienand, I; Nolting, U; Kistemann, T
2009-01-01
Following international developments and the new WHO Drinking Water Guidelines (WHO 2004), a process-orientated concept for risk, monitoring and incident management has been developed and implemented in this study. The concept will be reviewed with special consideration for resource protection (the first barrier of the multi-barrier system) and, in turn, for the Water Safety Plan (WSP), which adequately considers, beyond the current framework of legal requirements, possible new hygienic-microbiologically relevant risks (especially emerging pathogens) for the drinking water supply. The development of a WSP within the framework of risk, monitoring and incident management includes the application of Geographical Information Systems (GIS). In the present study, GIS was used for visualization and spatial analysis in decisive steps in the WSP. The detailed process of GIS-supported implementation included the identification of local participants and their tasks and interactions as an essential part of risk management. A detailed ecological investigation of drinking water conditions in the catchment area was conducted in addition to hazard identification, risk assessment and the monitoring of control measures. The main task of our study was to find out in which steps of the WSP the implementation of GIS could be integrated as a useful, and perhaps even an essential tool.
The PROactive innovative conceptual framework on physical activity
Dobbels, Fabienne; de Jong, Corina; Drost, Ellen; Elberse, Janneke; Feridou, Chryssoula; Jacobs, Laura; Rabinovich, Roberto; Frei, Anja; Puhan, Milo A.; de Boer, Willem I.; van der Molen, Thys; Williams, Kate; Pinnock, Hillary; Troosters, Thierry; Karlsson, Niklas; Kulich, Karoly; Rüdell, Katja; Brindicci, Caterina; Higenbottam, Tim; Troosters, Thierry; Dobbels, Fabienne; Decramer, Marc; Tabberer, Margaret; Rabinovich, Roberto A; MacNee, William; Vogiatzis, Ioannis; Polkey, Michael; Hopkinson, Nick; Garcia-Aymerich, Judith; Puhan, Milo; Frei, Anja; van der Molen, Thys; de Jong, Corina; de Boer, Pim; Jarrod, Ian; McBride, Paul; Kamel, Nadia; Rudell, Katja; Wilson, Frederick J.; Ivanoff, Nathalie; Kulich, Karoly; Glendenning, Alistair; Karlsson, Niklas X.; Corriol-Rohou, Solange; Nikai, Enkeleida; Erzen, Damijan
2014-01-01
Although physical activity is considered an important therapeutic target in chronic obstructive pulmonary disease (COPD), what “physical activity” means to COPD patients and how their perspective is best measured is poorly understood. We designed a conceptual framework, guiding the development and content validation of two patient reported outcome (PRO) instruments on physical activity (PROactive PRO instruments). 116 patients from four European countries with diverse demographics and COPD phenotypes participated in three consecutive qualitative studies (63% male, age mean±sd 66±9 years, 35% Global Initiative for Chronic Obstructive Lung Disease stage III–IV). 23 interviews and eight focus groups (n = 54) identified the main themes and candidate items of the framework. 39 cognitive debriefings allowed the clarity of the items and instructions to be optimised. Three themes emerged, i.e. impact of COPD on amount of physical activity, symptoms experienced during physical activity, and adaptations made to facilitate physical activity. The themes were similar irrespective of country, demographic or disease characteristics. Iterative rounds of appraisal and refinement of candidate items resulted in 30 items with a daily recall period and 34 items with a 7-day recall period. For the first time, our approach provides comprehensive insight on physical activity from the COPD patients’ perspective. The PROactive PRO instruments’ content validity represents the pivotal basis for empirically based item reduction and validation. PMID:25034563
ERIC Educational Resources Information Center
Miller, Gloria I.; Jaciw, Andrew; Hoshiko, Brandon; Wei, Xin
2007-01-01
Texas Instruments has undertaken a research program with the goal of producing scientifically-based evidence of the effectiveness of graphing calculators and the "TI-Navigator"[TM] classroom networking system in the context of a professional development and curriculum framework. The program includes a two-year longitudinal study. The…
The New Feedback Control System of RFX-mod Based on the MARTe Real-Time Framework
NASA Astrophysics Data System (ADS)
Manduchi, G.; Luchetta, A.; Soppelsa, A.; Taliercio, C.
2014-06-01
A real-time system has been successfully used since 2004 in the RFX-mod nuclear fusion experiment to control the position of the plasma and its Magneto Hydrodynamic (MHD) modes. However, its latency and the limited computation power of the used processors prevented the usage of more aggressive control algorithms. Therefore a new hardware and software architecture has been designed to overcome such limitations and to provide a shorter latency and a much increased computation power. The new system is based on a Linux multi-core server and uses MARTe, a framework for real-time control which is gaining interest in the fusion community.
4. INSTRUMENT ROOM,INTERIOR, MAIN SPACE. Looking northeast. Edwards Air ...
4. INSTRUMENT ROOM,INTERIOR, MAIN SPACE. Looking northeast. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Firing Control Building, Test Area 1-100, northeast end of Test Area 1-100 Road, Boron, Kern County, CA
138. LIQUID NITROGEN INSTRUMENT PANEL ON EAST WALL OF LIQUID ...
138. LIQUID NITROGEN INSTRUMENT PANEL ON EAST WALL OF LIQUID NITROGEN CONTROL ROOM (115), LSB (BLDG. 770) - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 West, Napa & Alden Roads, Lompoc, Santa Barbara County, CA
[Cardiovascular circulation feedback control treatment instrument].
Ge, Yu-zhi; Zhu, Xing-huan; Sheng, Guo-tai; Cao, Ping-liang; Liu, Dong-sheng; Wu, Zhi-ting
2005-07-01
The cardiovascular circulation feedback control treatment instrument (CFCTI) is an automatic feedback control treatment system with functions for monitoring, alarming, fault self-diagnosis, and on-line testing within the closed loop. The instrument design is based on successful clinical experience; data are input to the computer in real time through a pressure sensor and an A/D card. A user interface window allows the doctor to choose different medicines. Commands are output to control the medicine dose through the infusion system, and the response to the medicine is updated continually. CFCTI avoids human error and long sampling intervals. Its reliability and accuracy in rescuing critical patients are much higher than those of traditional methods.
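A closed-loop scheme of this kind can be illustrated with a simple proportional controller that adjusts an infusion rate toward a target pressure; the gains, limits, and the toy plant response below are hypothetical and are not the CFCTI algorithm.

```python
def update_infusion_rate(current_rate, measured_bp, target_bp,
                         gain=0.05, min_rate=0.0, max_rate=10.0):
    """Proportional adjustment of an infusion rate (mL/h) toward a target
    mean blood pressure (mmHg); all parameters are illustrative only."""
    error = target_bp - measured_bp
    new_rate = current_rate + gain * error
    return max(min_rate, min(max_rate, new_rate))

# Toy simulation: pressure responds sluggishly to the infusion rate
rate, pressure = 2.0, 60.0
for step in range(10):
    rate = update_infusion_rate(rate, pressure, target_bp=75.0)
    pressure += 0.8 * (65.0 + 2.0 * rate - pressure)   # fictitious plant response
    print(f"step {step}: rate={rate:.2f} mL/h, BP={pressure:.1f} mmHg")
```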
ERIC Educational Resources Information Center
de Muynck, Bram; Reijnoudt-Klein, Willemieke; Spruyt-de Kloe, Marike
2017-01-01
This article reports the development of a framework that structures differences in Christian educational practices worldwide. One of its purposes is to simplify the complexity of the contexts in which global partners cooperate. The framework also offers the theoretical basis for an instrument that nongovernmental organizations can use to determine…
Jones, J.W.; Desmond, G.B.; Henkle, C.; Glover, R.
2012-01-01
Accurate topographic data are critical to restoration science and planning for the Everglades region of South Florida, USA. They are needed to monitor and simulate water level, water depth and hydroperiod and are used in scientific research on hydrologic and biologic processes. Because large wetland environments and data acquisition challenge conventional ground-based and remotely sensed data collection methods, the United States Geological Survey (USGS) adapted a classical data collection instrument to global positioning system (GPS) and geographic information system (GIS) technologies. Data acquired with this instrument were processed using geostatistics to yield sub-water level elevation values with centimetre accuracy (±15 cm). The developed database framework, modelling philosophy and metadata protocol allow for continued, collaborative model revision and expansion, given additional elevation or other ancillary data. © 2012 Taylor & Francis.
Tsiknakis, Manolis; Kouroubali, Angelina
2009-01-01
The paper presents an application of the "Fit between Individuals, Task and Technology" (FITT) framework to analyze the socio-organizational-technical factors that influence IT adoption in the healthcare domain. The FITT framework was employed as the theoretical instrument for a retrospective analysis of a 15-year effort in implementing IT systems and eHealth services in the context of a Regional Health Information Network in Crete. Quantitative and qualitative research methods, interviews and participant observations were employed to gather data from a case study that involved the entire region of Crete. The detailed analysis of the case study based on the FITT framework, showed common features, but also differences of IT adoption within the various health organizations. The emerging picture is a complex nexus of factors contributing to IT adoption, and multi-level interventional strategies to promote IT use. The work presented in this paper shows the applicability of the FITT framework in explaining the complexity of aspects observed in the implementation of healthcare information systems. The reported experiences reveal that fit management can be viewed as a system with a feedback loop that is never really stable, but ever changing based on external factors or deliberate interventions. Management of fit, therefore, becomes a constant and complex task for the whole life cycle of IT systems.
Towards a Generic and Adaptive System-On-Chip Controller for Space Exploration Instrumentation
NASA Technical Reports Server (NTRS)
Iturbe, Xabier; Keymeulen, Didier; Yiu, Patrick; Berisford, Dan; Hand, Kevin; Carlson, Robert; Ozer, Emre
2015-01-01
This paper introduces one of the first efforts conducted at NASA’s Jet Propulsion Laboratory (JPL) to develop a generic System-on-Chip (SoC) platform to control science instruments that are proposed for future NASA missions. The SoC platform is named APEX-SoC, where APEX stands for Advanced Processor for space Exploration, and is based on a hybrid Xilinx Zynq that combines an FPGA and an ARM Cortex-A9 dual-core processor on a single chip. The Zynq implements a generic and customizable on-chip infrastructure that can be reused with a variety of instruments, and it has been coupled with a set of off-chip components that are necessary to deal with the different instruments. We have taken JPL’s Compositional InfraRed Imaging Spectrometer (CIRIS), which is proposed for NASA icy moons missions, as a use-case scenario to demonstrate that the entire data processing, control and interface of an instrument can be implemented on a single device using the on-chip infrastructure described in this paper. We show that the performance results achieved in this preliminary version of the instrumentation controller are sufficient to fulfill the science requirements demanded of the CIRIS instrument in future NASA missions, such as Europa.
Crocco, Laura; Madill, Catherine J; McCabe, Patricia
2017-01-01
The study systematically reviews evidence-based frameworks for teaching and learning of classical singing training. This is a systematic review. A systematic literature search of 15 electronic databases following the Preferred Reporting Items for Systematic Reviews (PRISMA) guidelines was conducted. Eligibility criteria included type of publication, participant characteristics, intervention, and report of outcomes. Quality rating scales were applied to support assessment of the included literature. Data analysis was conducted using meta-aggregation. Nine papers met the inclusion criteria. No complete evidence-based teaching and learning framework was found. Thematic content analysis showed that studies either (1) identified teaching practices in one-to-one lessons, (2) identified student learning strategies in one-to-one lessons or personal practice sessions, and (3) implemented a tool to enhance one specific area of teaching and learning in lessons. The included studies showed that research in music education is not always specific to musical genre or instrumental group, with four of the nine studies including participant teachers and students of classical voice training only. The overall methodological quality ratings were low. Research in classical singing training has not yet developed an evidence-based framework for classical singing training. This review has found that introductory information on teaching and learning practices has been provided, and tools have been suggested for use in the evaluation of the teaching-learning process. High-quality methodological research designs are needed. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
A Social-Ecological Framework of Theory, Assessment, and Prevention of Suicide
Cramer, Robert J.; Kapusta, Nestor D.
2017-01-01
The juxtaposition of increasing suicide rates with continued calls for suicide prevention efforts begs for new approaches. Grounded in the Centers for Disease Control and Prevention (CDC) framework for tackling health issues, this personal views work integrates relevant suicide risk/protective factor, assessment, and intervention/prevention literatures. Based on these components of suicide risk, we articulate a Social-Ecological Suicide Prevention Model (SESPM) which provides an integration of general and population-specific risk and protective factors. We also use this multi-level perspective to provide a structured approach to understanding current theories and intervention/prevention efforts concerning suicide. Following similar multi-level prevention efforts in interpersonal violence and Human Immunodeficiency Virus (HIV) domains, we offer recommendations for social-ecologically informed suicide prevention theory, training, research, assessment, and intervention programming. Although the SESPM calls for further empirical testing, it provides a suitable backdrop for tailoring of current prevention and intervention programs to population-specific needs. Moreover, the multi-level model shows promise to move suicide risk assessment forward (e.g., development of multi-level suicide risk algorithms or structured professional judgments instruments) to overcome current limitations in the field. Finally, we articulate a set of characteristics of social-ecologically based suicide prevention programs. These include the need to address risk and protective factors with the strongest degree of empirical support at each multi-level layer, incorporate a comprehensive program evaluation strategy, and use a variety of prevention techniques across levels of prevention. PMID:29062296
The Legal Strength of International Health Instruments - What It Brings to Global Health Governance?
Nikogosian, Haik; Kickbusch, Ilona
2016-09-04
Public health instruments have been under constant development and renewal for decades. International legal instruments, with their binding character and strength, have a special place in this development. The start of the 21st century saw, in particular, the birth of the first World Health Organization (WHO)-era health treaties - the WHO Framework Convention on Tobacco Control (WHO FCTC) and its first Protocol. The authors analyze the potential impact of these instruments on global health governance and public health, beyond the traditional view of their impact on tobacco control. Overall, the very fact that globally binding treaties in modern-era health were feasible has accelerated the debate and expectations for an expanded role of international legal regimes in public health. The impact of treaties has also been notable in global health architecture as the novel instruments required novel institutions to govern their implementation. The legal power of the WHO FCTC has enabled rapid adoption of further instruments to promote its implementation, thus, enhancing the international instrumentarium for health, and it has also prompted stronger role for national legislation on health. Notably, the Convention has elevated several traditionally challenging public health features to the level of international legal obligations. It has also revealed how the legal power of the international health instrument can be utilized in safeguarding the interests of health in the face of competing agendas and legal disputes at both the domestic and international levels. Lastly, the legal power of health instruments is associated with their potential impact not only on health but also beyond; the recently adopted Protocol to Eliminate Illicit Trade in Tobacco Products may best exemplify this matter. The first treaty experiences of the 21st century may provide important lessons for the role of legal instruments in addressing the unfolding challenges in global health. © 2016 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
NASA Astrophysics Data System (ADS)
Plankis, Brian J.
The purpose of the study was to examine the effects of technology-infused issue investigations on high school students' environmental and ocean literacies. This study explored the effects of a new educational enrichment program termed Connecting the Ocean, Reefs, Aquariums, Literacy, and Stewardship (CORALS) on high school science students. The study utilized a mixed methods approach combining a quantitative quasi-experimental pre-post test design with qualitative case studies. The CORALS program is a new educational program that combines materials based on the Investigating and Evaluating Environmental Issues and Actions (IEEIA) curriculum program with the digital storytelling process. Over an 18-week period four high school science teachers and their approximately 169 students investigated environmental issues impacting coral reefs through the IEEIA framework. An additional approximately 224 students, taught by the same teachers, were the control group exposed to standard curriculum. Students' environmental literacy was measured through the Secondary School Environmental Literacy Instrument (SSELI) and students' ocean literacy was measured through the Students' Ocean Literacy Viewpoints and Engagement (SOLVE) instrument. Two classrooms were selected as case studies and examined through classroom observations and student and teacher interviews. The results indicated the CORALS program increased the knowledge of ecological principles, knowledge of environmental problems/issues, and environmental attitudes components of environmental literacy for the experimental group students. For ocean literacy, the experimental group students' scores increased for knowledge of ocean literacy principles, ability to identify oceanic environmental problems, and attitudes concerning the ocean. The SSELI measure of Responsible Environmental Behaviors (REB) was found to be significant for the interaction of teacher and class type (experimental or control). The students for Teachers A and B reported a statistically significant increase in the self-reported REB subscales of ecomanagement and consumer/economic action. This indicates the students reported an increase in the REBs they could change within their lifestyles. This study provides baseline data in an area where few quality studies exist to date. Recommendations for practice and administration of the research study instruments are explored. Recommendations for further research include CORALS program modifications, revising the instruments utilized, and what areas of students' environmental and ocean literacies warrant further exploration.
LMI-Based Generation of Feedback Laws for a Robust Model Predictive Control Algorithm
NASA Technical Reports Server (NTRS)
Acikmese, Behcet; Carson, John M., III
2007-01-01
This technical note provides a mathematical proof of Corollary 1 from the paper 'A Nonlinear Model Predictive Control Algorithm with Proven Robustness and Resolvability' that appeared in the 2006 Proceedings of the American Control Conference. The proof was omitted for brevity in the publication. The paper was based on algorithms developed for the FY2005 R&TD (Research and Technology Development) project for Small-body Guidance, Navigation, and Control [2]. The framework established by the Corollary is for a robustly stabilizing MPC (model predictive control) algorithm for uncertain nonlinear systems that guarantees the resolvability of the associated finite-horizon optimal control problem in a receding-horizon implementation. Additional details of the framework are available in the publication.
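The receding-horizon idea behind the corollary can be illustrated with a minimal sketch: at every sample a finite-horizon optimal control problem is re-solved from the current state and only the first input of the optimized sequence is applied. The toy pendulum-like dynamics, cost weights, and horizon below are assumptions for illustration, not the algorithm or system treated in the paper.

```python
# Minimal receding-horizon (MPC) loop for a toy nonlinear system.
# Illustrative sketch only; not the algorithm analyzed in the paper.
import numpy as np
from scipy.optimize import minimize

def step(x, u, dt=0.1):
    """Toy nonlinear plant: damped pendulum-like dynamics (assumed)."""
    theta, omega = x
    return np.array([theta + dt * omega,
                     omega + dt * (-np.sin(theta) - 0.1 * omega + u)])

def horizon_cost(u_seq, x0, N):
    """Finite-horizon quadratic cost accumulated along the predicted trajectory."""
    x, cost = x0, 0.0
    for k in range(N):
        x = step(x, u_seq[k])
        cost += x @ x + 0.01 * u_seq[k] ** 2
    return cost

def mpc_control(x0, N=10):
    """Solve the finite-horizon problem and return only the first input."""
    res = minimize(horizon_cost, np.zeros(N), args=(x0, N), method="SLSQP",
                   bounds=[(-2.0, 2.0)] * N)
    return res.x[0]

x = np.array([1.0, 0.0])
for t in range(50):          # receding-horizon loop: re-solve at every step
    u = mpc_control(x)
    x = step(x, u)
print("final state:", x)
```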
The Diamond Beamline Controls and Data Acquisition Software Architecture
NASA Astrophysics Data System (ADS)
Rees, N.
2010-06-01
The software for the Diamond Light Source beamlines[1] is based on two complementary software frameworks: low-level control is provided by the Experimental Physics and Industrial Control System (EPICS) framework[2][3], and the high-level user interface is provided by the Java-based Generic Data Acquisition (GDA) system[4][5]. EPICS provides a widely used, robust, generic interface across a wide range of hardware, with user interfaces focused on giving engineers and beamline scientists detailed low-level views of all aspects of the beamline control systems. GDA provides a high-level layer that combines an understanding of scientific concepts, such as reciprocal lattice coordinates, a flexible Python-syntax scripting interface for the scientific user to control their data acquisition, and graphical user interfaces where necessary. This paper describes the beamline software architecture in more detail, highlighting how these complementary frameworks provide a flexible system that can accommodate a wide range of requirements.
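As an illustration of the low-level layer, the following sketch reads and writes EPICS process variables from Python using the pyepics Channel Access bindings; pyepics is an assumption here (the paper does not name a specific client library), and the PV names are hypothetical placeholders rather than real Diamond beamline records.

```python
# Sketch of low-level EPICS access from Python via the pyepics bindings.
# The PV names below are hypothetical placeholders, not real Diamond PVs.
import epics

# Read a process variable (e.g. a motor readback) over Channel Access.
position = epics.caget("BLXXI-MO-STAGE-01:X.RBV")

# Write a setpoint and wait for the put to complete.
epics.caput("BLXXI-MO-STAGE-01:X", 12.5, wait=True)

# Subscribe to value changes, as an engineering-level monitor would.
def on_change(pvname=None, value=None, **kw):
    print(f"{pvname} -> {value}")

pv = epics.PV("BLXXI-DI-DET-01:COUNTS")
pv.add_callback(on_change)
```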
NASA Technical Reports Server (NTRS)
Hoegger, Bruno; Viatte, Pierre; Levrat, Gilbert; Bader, Juerg; Ribordy, Pascale; Schill, Herbert; Staehelin, Johannes
1994-01-01
Total ozone observations from two Dobson instruments (D15 and D101, C- and AD-wavelength-pair observations) and two Brewer instruments (Br40 and Br72) are currently performed at the LKO at Arosa. A quality control concept is presented in order to make the best use of the large number of quasi-simultaneous measurements. The world's longest ozone series is based mainly on the measurements of the Dobson instrument D15 (wavelength pair C). In recent years, D15 has suffered from instrumental problems. The transformation of the long-term series to the measurements of D101 (AD) is described.
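A common way to transfer a long-term series onto another instrument's scale is to regress the quasi-simultaneous measurement pairs and apply the fitted relation to the historical record. The sketch below shows this generic least-squares approach on synthetic data; it is only illustrative and is not the Arosa quality-control procedure itself.

```python
# Illustrative sketch: deriving a transfer relation between two co-located
# instruments from quasi-simultaneous total-ozone measurements.
# Generic least squares on synthetic data, not the procedure used at Arosa.
import numpy as np

rng = np.random.default_rng(0)
d15_c = 300 + 30 * rng.standard_normal(200)                     # D15 (C pair), Dobson units
d101_ad = 1.02 * d15_c - 4.0 + 2.0 * rng.standard_normal(200)   # D101 (AD pair), synthetic

# Fit D101(AD) = a * D15(C) + b on the quasi-simultaneous pairs.
a, b = np.polyfit(d15_c, d101_ad, 1)
residual_rms = np.sqrt(np.mean((d101_ad - (a * d15_c + b)) ** 2))
print(f"scale={a:.3f} offset={b:.1f} DU, rms={residual_rms:.1f} DU")

# The historical D15(C) series could then be expressed on the D101(AD) scale:
def to_d101_scale(d15_value):
    return a * d15_value + b
```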
Binocular optical axis parallelism detection precision analysis based on Monte Carlo method
NASA Astrophysics Data System (ADS)
Ying, Jiaju; Liu, Bingqi
2018-02-01
Based on the working principle of the digital calibration instrument for binocular photoelectric instrument optical axis parallelism, the various factors affecting system precision are analyzed across all components of the instrument, and a precision analysis model is then established. Using the error distributions, the Monte Carlo method is applied to analyze the relationship between the comprehensive error and the change of the center coordinate of the circular target image. The method can further guide the error allocation, optimize control of the factors that have the greatest influence on the comprehensive error, and improve the measurement accuracy of the optical axis parallelism digital calibration instrument.
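The Monte Carlo procedure described above can be sketched as drawing each component error from its assumed distribution, summing the contributions to the target-image centre coordinate, and examining the resulting spread. The error sources and magnitudes below are invented for illustration and are not the paper's values.

```python
# Minimal Monte Carlo sketch: propagating component errors to the measured
# centre coordinate of the circular target image. Error sources and their
# 1-sigma magnitudes are assumed for illustration only.
import numpy as np

N = 100_000
rng = np.random.default_rng(42)

centroiding = rng.normal(0.0, 0.05, N)     # image centroid extraction error (px)
alignment   = rng.normal(0.0, 0.10, N)     # mechanical alignment error (px-equivalent)
distortion  = rng.normal(0.0, 0.03, N)     # residual optical distortion (px)
thermal     = rng.normal(0.0, 0.04, N)     # thermal drift of the mount (px)

# Comprehensive error in the target-centre coordinate for each trial.
total = centroiding + alignment + distortion + thermal

print(f"combined 1-sigma error: {total.std():.3f} px")
# Sensitivity check: fraction of variance contributed by each source.
for name, src in [("centroiding", centroiding), ("alignment", alignment),
                  ("distortion", distortion), ("thermal", thermal)]:
    print(f"{name:12s} {src.var() / total.var():.1%}")
```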
15. "GENERAL, INSTRUMENTATION AND CONTROL SYSTEMS, ISOMETRIC." Test Area 1120. ...
15. "GENERAL, INSTRUMENTATION AND CONTROL SYSTEMS, ISOMETRIC." Test Area 1-120. Specifications No. ENG04-353-55-72; Drawing No. 60-09-12; sheet 6 of 148; file no. 1320/57. Stamped: RECORD DRAWING - AS CONSTRUCTED. Below stamp: Contract no. 4338, no change. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Leuhman Ridge near Highways 58 & 395, Boron, Kern County, CA
122. VIEW OF CABINETS ON WEST SIDE OF LANDLINE INSTRUMENTATION ...
122. VIEW OF CABINETS ON WEST SIDE OF LANDLINE INSTRUMENTATION ROOM (206), LSB (BLDG. 751), FACING EAST. PECOS CABINET INCLUDES CONTROLS FOR PRESSURE SWITCHES, VALVES, AND PURGE; THE LOGIC AND MONITOR UNITS FOR BOOSTER AND FUEL SYSTEMS INCLUDES CONTROLS FOR MISSILE GROUND POWER AND HYDRAULICS. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 East, Napa & Alden Roads, Lompoc, Santa Barbara County, CA
Toward a More Flexible Web-Based Framework for Multidisciplinary Design
NASA Technical Reports Server (NTRS)
Rogers, J. L.; Salas, A. O.
1999-01-01
In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary design, is defined as a hardware-software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, monitoring, controlling, and displaying the design process. The objective of this research is to explore how Web technology can improve these areas of weakness and lead toward a more flexible framework. This article describes a Web-based system that optimizes and controls the execution sequence of design processes in addition to monitoring the project status and displaying the design results.
A motion sensing-based framework for robotic manipulation.
Deng, Hao; Xia, Zeyang; Weng, Shaokui; Gan, Yangzhou; Fang, Peng; Xiong, Jing
2016-01-01
To date, outside of controlled environments, robots normally perform manipulation tasks in cooperation with humans. This pattern requires robot operators to have extensive technical training on varied teach-pendant operating systems. Motion sensing technology, which enables human-machine interaction through a novel and natural gesture-based interface, inspired us to adopt this user-friendly and straightforward operation mode for robotic manipulation. Thus, in this paper, we present a motion sensing-based framework for robotic manipulation, which recognizes gesture commands captured from a motion sensing input device and drives the corresponding robot actions. For compatibility, a general hardware interface layer was also developed within the framework. Simulation and physical experiments were conducted for preliminary validation. The results show that the proposed framework is an effective approach to general robotic manipulation with motion sensing control.
A conceptual framework for achieving performance enhancing drug compliance in sport.
Donovan, Robert J; Egger, Garry; Kapernick, Vicki; Mendoza, John
2002-01-01
There has been, and continues to be, widespread international concern about athletes' use of banned performance enhancing drugs (PEDs). This concern culminated in the formation of the World Anti-Doping Agency (WADA) in November 1999. To date, the main focus on controlling the use of PEDs has been on testing athletes and the development of tests to detect usage. Although athletes' beliefs and values are known to influence whether or not an athlete will use drugs, little is known about athletes' beliefs and attitudes, and the limited empirical literature shows little use of behavioural science frameworks to guide research methodology, results interpretation, and intervention implications. Mindful of this in preparing its anti-doping strategy for the 2000 Olympics, the Australian Sports Drug Agency (ASDA) in 1997 commissioned a study to assess the extent to which models of attitude-behaviour change in the public health/injury prevention literature had useful implications for compliance campaigns in the sport drug area. A preliminary compliance model was developed from three behavioural science frameworks: social cognition models; threat (or fear) appeals; and instrumental and normative approaches. A subsequent review of the performance enhancing drug literature confirmed that the overall framework was consistent with known empirical data, and therefore had at least face validity if not construct validity. The overall model showed six major inputs to an athlete's attitudes and intentions with respect to performance enhancing drug usage: personality factors, threat appraisal, benefit appraisal, reference group influences, personal morality and legitimacy. The model demonstrated that a comprehensive, fully integrated programme is necessary for maximal effect, and provides anti-doping agencies with a structured framework for strategic planning and implementing interventions. Programmes can be developed in each of the six major areas, with allocation of resources to each area based on needs-assessment research with athletes and other relevant groups.
Advanced CO2 removal process control and monitor instrumentation development
NASA Technical Reports Server (NTRS)
Heppner, D. B.; Dalhausen, M. J.; Klimes, R.
1982-01-01
A program to evaluate, design, and demonstrate major advances in control and monitor instrumentation was undertaken. A carbon dioxide removal process, one whose maturity level makes it a prime candidate for early flight demonstration, was investigated. The instrumentation design incorporates features which are compatible with anticipated flight requirements, covering current electronics technology and projected advances. In addition, the program established commonality of components for all advanced life support subsystems. It was concluded from the studies and design activities conducted under this program that the next generation of instrumentation will be much smaller than the prior one: physical size as well as weight, power, and heat rejection requirements were reduced by 80 to 85% relative to earlier research and development instrumentation. Using a microprocessor-based computer, a standard computer bus structure, nonvolatile memory, improved fabrication techniques, and aerospace packaging, this instrumentation will greatly enhance overall reliability and total system availability.
Two Axis Pointing System (TAPS) attitude acquisition, determination, and control
NASA Technical Reports Server (NTRS)
Azzolini, John D.; Mcglew, David E.
1990-01-01
The Two Axis Pointing System (TAPS) is a two-axis gimbal system designed to provide fine pointing of Space Transportation System (STS)-borne instruments. It features center-of-mass instrument mounting and will accommodate instruments of up to 1134 kg (2500 pounds) which fit within a 1.0 by 1.0 by 4.2 meter (40 by 40 by 166 inch) envelope. The TAPS system is controlled by a microcomputer-based Control Electronics Assembly (CEA), a Power Distribution Unit (PDU), and a Servo Control Unit (SCU). A DRIRU-II inertial reference unit is used to provide incremental angles for attitude propagation. A Ball Brothers STRAP star tracker is used for attitude acquisition and update. The theory of the TAPS attitude determination and error computation for the Broad Band X-ray Telescope (BBXRT) is described. The attitude acquisition is based upon a two-star geometric solution. The acquisition theory and quaternion algebra are presented. The attitude control combines classical proportional, integral, and derivative (PID) control with techniques to compensate for coulomb friction (bias torque) and the cable harness crossing the gimbals (spring torque). Also presented is a technique for adaptive bias torque compensation which adjusts to an ever-changing frictional torque environment. The control stability margins are detailed, along with the predicted pointing performance based upon simulation studies. The TAPS user interface, which provides high-level operations commands to facilitate science observations, is outlined.
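The combination of classical PID control with bias-torque (coulomb friction) and spring-torque (cable harness) compensation can be sketched as a feed-forward term added to the PID command. The gains, torque magnitudes, and rigid-body plant below are assumptions for illustration, not the TAPS flight parameters.

```python
# Illustrative single-axis PID controller with coulomb-friction (bias torque)
# and cable-harness (spring torque) compensation, in the spirit of the TAPS
# description. Gains, torques, and the plant model are assumed values.
import numpy as np

class PidWithCompensation:
    def __init__(self, kp=8.0, ki=2.0, kd=4.0, bias_torque=0.3, spring_k=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.bias_torque = bias_torque   # coulomb friction magnitude (assumed)
        self.spring_k = spring_k         # cable-harness spring rate (assumed)
        self.integral = 0.0

    def command(self, error, error_rate, angle, rate, dt):
        self.integral += error * dt
        pid = self.kp * error + self.ki * self.integral + self.kd * error_rate
        friction_ff = self.bias_torque * np.sign(rate)   # oppose friction torque
        spring_ff = self.spring_k * angle                 # cancel harness torque
        return pid + friction_ff + spring_ff

# One simulated pointing step on a rigid-body axis (inertia J, assumed).
J, dt, angle, rate, target = 50.0, 0.01, 0.0, 0.0, 0.01   # rad
ctrl = PidWithCompensation()
for _ in range(2000):
    torque = ctrl.command(target - angle, -rate, angle, rate, dt)
    accel = (torque - 0.3 * np.sign(rate) - 0.05 * angle) / J   # plant with friction + spring
    rate += accel * dt
    angle += rate * dt
print(f"pointing error: {abs(target - angle):.2e} rad")
```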
Pilot dynamics for instrument approach tasks: Full panel multiloop and flight director operations
NASA Technical Reports Server (NTRS)
Weir, D. H.; Mcruer, D. T.
1972-01-01
Measurements and interpretations of single and multiloop pilot response properties during simulated instrument approach are presented. Pilot subjects flew Category 2-like ILS approaches in a fixed-base DC-8 simulator. A conventional instrument panel and controls were used, with simulated vertical gust and glide slope beam bend forcing functions. Reduced and interpreted pilot describing functions and remnant are given for pitch attitude, flight director, and multiloop (longitudinal) control tasks. The response data are correlated with simultaneously recorded eye scanning statistics, previously reported in NASA CR-1535. The resulting combined response and scanning data and their interpretations provide a basis for validating and extending the theory of manual control displays.
Shipboard Analytical Capabilities on the Renovated JOIDES Resolution, IODP Riserless Drilling Vessel
NASA Astrophysics Data System (ADS)
Blum, P.; Foster, P.; Houpt, D.; Bennight, C.; Brandt, L.; Cobine, T.; Crawford, W.; Fackler, D.; Fujine, K.; Hastedt, M.; Hornbacher, D.; Mateo, Z.; Moortgat, E.; Vasilyev, M.; Vasilyeva, Y.; Zeliadt, S.; Zhao, J.
2008-12-01
The JOIDES Resolution (JR) has conducted 121 scientific drilling expeditions during the Ocean Drilling Program (ODP) and the first phase of the Integrated Ocean Drilling Program (IODP) (1983-2006). The vessel and scientific systems have just completed an NSF-sponsored renovation (2005-2008). Shipboard analytical systems have been upgraded, within funding constraints imposed by market driven vessel conversion cost increases, to include: (1) enhanced shipboard analytical services including instruments and software for sampling and the capture of chemistry, physical properties, and geological data; (2) new data management capabilities built around a laboratory information management system (LIMS), digital asset management system, and web services; (3) operations data services with enhanced access to navigation and rig instrumentation data; and (4) a combination of commercial and home-made user applications for workflow- specific data extractions, generic and customized data reporting, and data visualization within a shipboard production environment. The instrumented data capture systems include a new set of core loggers for rapid and non-destructive acquisition of images and other physical properties data from drill cores. Line-scan imaging and natural gamma ray loggers capture data at unprecedented quality due to new and innovative designs. Many instruments used to characterize chemical compounds of rocks, sediments, and interstitial fluids were upgraded with the latest technology. The shipboard analytical environment features a new and innovative framework (DESCinfo) and application (DESClogik) for capturing descriptive and interpretive data from geological sub-domains such as sedimentology, petrology, paleontology, structural geology, stratigraphy, etc. This system fills a long-standing gap by providing a global database, controlled vocabularies and taxa name lists with version control, a highly configurable spreadsheet environment for data capture, and visualization of context data collected with the shipboard core loggers and other instruments.
Feedback control by online learning an inverse model.
Waegeman, Tim; Wyffels, Francis; Schrauwen, Francis
2012-10-01
A model, predictor, or error estimator is often used by a feedback controller to control a plant. Creating such a model is difficult when the plant exhibits nonlinear behavior. In this paper, a novel online learning control framework is proposed that does not require explicit knowledge about the plant. This framework uses two learning modules, one for creating an inverse model, and the other for actually controlling the plant. Except for their inputs, they are identical. The inverse model learns from the exploration performed by the not-yet-fully-trained controller, while the actual controller is based on the currently learned model. The proposed framework allows fast online learning of an accurate controller. The controller can be applied to a broad range of tasks with different dynamic characteristics. We validate this claim by applying our control framework to several control tasks: 1) the heating tank problem (slow nonlinear dynamics); 2) flight pitch control (slow linear dynamics); and 3) the balancing problem of a double inverted pendulum (fast linear and nonlinear dynamics). The results of these experiments show that fast learning and accurate control can be achieved. Furthermore, a comparison is made with some classical control approaches, and observations concerning convergence and stability are made.
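The two-module idea can be sketched with a scalar example: an inverse model is fitted online from observed transitions and the inputs that produced them, while the controller queries the current model for the input that should produce the desired next output. Recursive least squares and a linear plant stand in here for the learning modules used in the paper; all values are illustrative.

```python
# Minimal sketch of feedback control by online learning of an inverse model:
# the controller asks the current inverse model for the input that should give
# the desired next output, while the model keeps learning from the observed
# (transition -> applied input) pairs. A scalar linear plant and recursive
# least squares stand in for the learners used in the paper.
import numpy as np

a_true, b_true = 0.9, 0.5              # unknown plant: y_next = a*y + b*u
theta = np.zeros(2)                    # inverse-model parameters: u ~ [y, y_next] @ theta
P = np.eye(2) * 100.0                  # RLS covariance

y, y_des = 0.0, 1.0
rng = np.random.default_rng(1)
for t in range(200):
    phi_ctrl = np.array([y, y_des])
    u = phi_ctrl @ theta + 0.1 * rng.standard_normal() * (t < 50)   # explore early on
    y_next = a_true * y + b_true * u                                # plant response
    # RLS update of the inverse model from the observed transition (y, y_next) -> u
    phi_obs = np.array([y, y_next])
    k = P @ phi_obs / (1.0 + phi_obs @ P @ phi_obs)
    theta += k * (u - phi_obs @ theta)
    P -= np.outer(k, phi_obs @ P)
    y = y_next
print(f"tracking error after learning: {abs(y_des - y):.3f}")
```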
NASA Technical Reports Server (NTRS)
Killough, Brian; Stover, Shelley
2008-01-01
The Committee on Earth Observation Satellites (CEOS) provides a brief to the Goddard Institute for Space Studies (GISS) regarding the CEOS Systems Engineering Office (SEO) and current work on climate requirements and analysis. A "system framework" is provided for the Global Earth Observation System of Systems (GEOSS). SEO climate-related tasks are outlined including the assessment of essential climate variable (ECV) parameters, use of the "systems framework" to determine relevant informational products and science models and the performance of assessments and gap analyses of measurements and missions for each ECV. Climate requirements, including instruments and missions, measurements, knowledge and models, and decision makers, are also outlined. These requirements would establish traceability from instruments to products and services allowing for benefit evaluation of instruments and measurements. Additionally, traceable climate requirements would provide a better understanding of global climate models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kastenberg, W.E.; Apostolakis, G.; Dhir, V.K.
Severe accident management can be defined as the use of existing and/or alternative resources, systems and actors to prevent or mitigate a core-melt accident. For each accident sequence and each combination of severe accident management strategies, there may be several options available to the operator, and each involves phenomenological and operational considerations regarding uncertainty. Operational uncertainties include operator, system and instrumentation behavior during an accident. A framework based on decision trees and influence diagrams has been developed which incorporates such criteria as feasibility, effectiveness, and adverse effects, for evaluating potential severe accident management strategies. The framework is also capable of propagating both data and model uncertainty. It is applied to several potential strategies including PWR cavity flooding, BWR drywell flooding, PWR depressurization and PWR feed and bleed.
Automated Microfluidic Instrument for Label-Free and High-Throughput Cell Separation.
Zhang, Xinjie; Zhu, Zhixian; Xiang, Nan; Long, Feifei; Ni, Zhonghua
2018-03-20
Microfluidic technologies for cell separation have been reported frequently in recent years. However, a compact microfluidic instrument enabling fully automated cell separation is still rarely reported, owing to the difficulty of integrating the macro-sized fluidic control system with the micro-sized microfluidic device. In this work, we propose a novel and automated microfluidic instrument to realize size-based separation of cancer cells in a label-free and high-throughput manner. Briefly, the instrument is equipped with a fully integrated microfluidic device and a set of robust fluid-driving and control units, and the instrument functions of precise fluid infusion and high-throughput cell separation are guaranteed by a flow regulatory chip and two cell separation chips, which are the key components of the microfluidic device. With optimized control programs, the instrument is successfully applied to automatically sort the human breast adenocarcinoma cell line MCF-7 from 5 mL of diluted human blood with a high recovery ratio of ∼85% within a rapid processing time of ∼23 min. We envision that our microfluidic instrument will be potentially useful in many biomedical applications, especially cell separation, enrichment, and concentration for the purpose of cell culture and analysis.
Theoretical Framework for Integrating Distributed Energy Resources into Distribution Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lian, Jianming; Wu, Di; Kalsi, Karanjit
This paper focuses on developing a novel theoretical framework for effective coordination and control of a large number of distributed energy resources (DERs) in distribution systems in order to more reliably manage the future U.S. electric power grid under high penetration of renewable generation. The proposed framework provides a systematic view of the overall structure of future distribution systems along with the underlying information flow, functional organization, and operational procedures. It is characterized by being open, flexible, and interoperable, with the potential to support dynamic system configuration. Under the proposed framework, the energy consumption of various DERs is coordinated and controlled in a hierarchical way using market-based approaches. Real-time voltage control is simultaneously considered to complement the real power control in order to keep nodal voltages within acceptable ranges in real time. In addition, computational challenges associated with the proposed framework are discussed with recommended practices.
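A toy version of market-based coordination is sketched below: each DER maps a broadcast price to a power level through its own demand curve, and the coordinator bisects on the price until aggregate demand matches the available capacity. The device curves, capacity, and clearing rule are invented for illustration and are not the framework's actual mechanism.

```python
# Toy sketch of market-based DER coordination: find the clearing price at
# which aggregate demand matches the available feeder capacity.
# All device curves and capacities are invented for illustration.
def device_demand(price, p_max, willingness):
    """Simple linear demand curve: full power at price 0, zero at 'willingness'."""
    return max(0.0, p_max * (1.0 - price / willingness))

devices = [(5.0, 0.8), (3.0, 1.2), (4.0, 0.6), (6.0, 1.0)]   # (kW max, $/kWh cutoff)
feeder_capacity = 10.0                                        # kW available (assumed)

lo, hi = 0.0, 2.0
for _ in range(60):                       # bisection on the clearing price
    price = 0.5 * (lo + hi)
    total = sum(device_demand(price, p, w) for p, w in devices)
    lo, hi = (price, hi) if total > feeder_capacity else (lo, price)

allocation = [device_demand(price, p, w) for p, w in devices]
print(f"clearing price: {price:.3f} $/kWh, total: {sum(allocation):.2f} kW")
```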
Coaching Tutors: An Instrumental Case Study on Testing an Integrated Framework for Tutoring Sessions
ERIC Educational Resources Information Center
Holland, Alicia L.; Grant, Chris; Donthamsetty, Reshema
2017-01-01
The objective for the current qualitative case study was to examine participants' perceptions on the tutor coaching and session review frameworks. The location of the study was at the tutor coaches' place of business. At the beginning of the study, tutor coaches were trained on how to implement the tutoring coaching framework with their tutors,…
HARMONI instrument control electronics
NASA Astrophysics Data System (ADS)
Gigante, José V.; Rodríguez Ramos, Luis F.; Zins, Gerard; Schnetler, Hermine; Pecontal, Arlette; Herreros, José Miguel; Clarke, Fraser; Bryson, Ian; Thatte, Niranjan
2014-07-01
HARMONI is an integral field spectrograph working at visible and near-infrared wavelengths over a range of spatial scales from ground-layer corrected to fully diffraction-limited. The instrument has been chosen to be part of the first-light complement at the European Extremely Large Telescope (E-ELT). This paper describes the instrument control electronics to be developed at IAC. The large size of the HARMONI instrument, its cryogenic operation, and the fact that it must operate with enhanced reliability pose a challenge from the point of view of the control electronics design. The present paper describes a design proposal based on the current instrument requirements and intended to be fully compliant with the ESO E-ELT standards, as well as with the European EMC and safety standards. The modularity of the design and the use of COTS standard hardware will benefit the project in several respects, such as reduced costs, a shorter schedule through the use of commercially available components, and improved quality through the use of well-proven solutions.
7. SOUTH REAR. Looking northwest from corner of the Instrumentation ...
7. SOUTH REAR. Looking northwest from corner of the Instrumentation and Control Building (Building 8762). - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Test Stand 1-A, Test Area 1-120, north end of Jupiter Boulevard, Boron, Kern County, CA
Design criteria monograph for actuators and operators
NASA Technical Reports Server (NTRS)
1974-01-01
Instrumentation for actuators and operators includes electrical position-indicating switches, potentiometers, and transducers and pressure-indicating switches and transducers. Monograph is based on critical evaluation of experiences and practices in design, test, and use of these control devices and instruments in operational space vehicles.
Integrated circuit-based instrumentation for microchip capillary electrophoresis.
Behnam, M; Kaigala, G V; Khorasani, M; Martel, S; Elliott, D G; Backhouse, C J
2010-09-01
Although electrophoresis with laser-induced fluorescence (LIF) detection has tremendous potential in lab-on-chip-based point-of-care disease diagnostics, the wider use of microchip electrophoresis has been limited by the size and cost of the instrumentation. To address this challenge, the authors designed an integrated circuit (IC, i.e. a microelectronic chip, with a total silicon area of <0.25 cm2, less than 5 mm x 5 mm, and power consumption of 28 mW), which, with minimal additional infrastructure, can perform microchip electrophoresis with LIF detection. The present work enables extremely compact and inexpensive portable systems consisting of one or more complementary metal-oxide-semiconductor (CMOS) chips and several other low-cost components. There are, to the authors' knowledge, no other reports of a CMOS-based LIF capillary electrophoresis instrument (i.e. high-voltage generation, switching, control and interface circuitry combined with LIF detection). This instrument is powered and controlled using a universal serial bus (USB) interface to a laptop computer. The authors demonstrate this IC in various configurations and can readily analyse the DNA produced by a standard medical diagnostic protocol (end-labelled polymerase chain reaction (PCR) product) with a limit of detection of approximately 1 ng/µL (approximately 1 ng of total DNA). The authors believe that this approach may ultimately enable lab-on-a-chip-based electrophoretic instruments that cost on the order of several dollars.
Turksoy, Kamuran; Bayrak, Elif Seyma; Quinn, Lauretta; Littlejohn, Elizabeth; Cinar, Ali
2013-05-01
Accurate closed-loop control is essential for developing artificial pancreas (AP) systems that adjust insulin infusion rates from insulin pumps. Glucose concentration information from continuous glucose monitoring (CGM) systems is the most important information for the control system. Additional physiological measurements can provide valuable information that can enhance the accuracy of the control system. Proportional-integral-derivative control and model predictive control have been popular in AP development. Their implementations to date rely on meal announcements (e.g., bolus insulin dose based on insulin:carbohydrate ratios) by the user. Adaptive control techniques provide a powerful alternative that does not necessitate any meal or activity announcements. Adaptive control systems based on the generalized predictive control framework are developed by extending recursive modeling techniques. Physiological signals such as energy expenditure and galvanic skin response are used along with glucose measurements to generate a multiple-input-single-output model for predicting future glucose concentrations used by the controller. Insulin-on-board (IOB) is also estimated and used in control decisions. The controllers were tested in clinical studies that included seven cases with three different patients with type 1 diabetes for 32 or 60 h without any meal or activity announcements. The adaptive control system kept glucose concentration in the normal preprandial and postprandial range (70-180 mg/dL) without any meal or activity announcements during the test period. After IOB estimation was added to the control system, mild hypoglycemic episodes were observed in only one of the four experiments. This was reflected in a plasma glucose value of 56 mg/dL (YSI 2300 STAT; Yellow Springs Instrument, Yellow Springs, OH) and a CGM value of 63 mg/dL. Regulation of blood glucose concentration with an AP using adaptive control techniques was successful in clinical studies, even without any meal and physical activity announcements.
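The recursive-modeling step can be sketched as follows: a multiple-input-single-output ARX-type model of glucose is re-estimated at every sample with recursive least squares and a forgetting factor, and the controller would use its one-step prediction. The model order, signals, and synthetic data below are assumptions for illustration, not the clinical algorithm.

```python
# Illustrative sketch of recursive modeling for glucose prediction: a MISO
# ARX-type model is re-estimated at every sample with recursive least squares
# and a forgetting factor. Model order, signals, and data are assumed values.
import numpy as np

lam = 0.98                       # forgetting factor (assumed)
n_par = 4                        # regressors: [g(k), g(k-1), insulin(k), energy_exp(k)]
theta = np.zeros(n_par)
P = np.eye(n_par) * 1000.0

def rls_update(theta, P, phi, y):
    """One recursive least-squares step with exponential forgetting."""
    k = P @ phi / (lam + phi @ P @ phi)
    theta = theta + k * (y - phi @ theta)
    P = (P - np.outer(k, phi @ P)) / lam
    return theta, P

# Synthetic CGM/insulin/activity stream, purely for demonstration.
rng = np.random.default_rng(3)
g = [120.0, 118.0]
for k in range(2, 200):
    insulin, energy = rng.uniform(0, 1), rng.uniform(0, 2)
    g_next = 0.9 * g[-1] + 0.05 * g[-2] - 8.0 * insulin + 3.0 * energy \
             + rng.normal(0, 1)
    phi = np.array([g[-1], g[-2], insulin, energy])
    prediction = phi @ theta                 # one-step prediction the controller would use
    theta, P = rls_update(theta, P, phi, g_next)
    g.append(g_next)
print("last prediction error:", abs(prediction - g[-1]))
```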
A porphyrin-based metal–organic framework as a pH-responsive drug carrier
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Wenxin; Hu, Quan; Jiang, Ke
A low-cytotoxicity porphyrin-based metal–organic framework (MOF), PCN-221, which exhibited high PC12 cell viability in a 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl tetrazolium (MTT) assay, was selected as an oral drug carrier. Methotrexate (MTX) was chosen as the model drug molecule, which was absorbed into the inner pores and channels of the MOF by diffusion. PCN-221 showed high drug loading and sustained release behavior in a physiological environment without a "burst effect". The controlled pH-responsive release of drugs by PCN-221 revealed its promising application in oral drug delivery. - Graphical abstract: The porous porphyrin-based MOF PCN-221 was prepared and exhibited low cytotoxicity, high methotrexate loading, and controlled pH-responsive release of methotrexate. - Highlights: • A porphyrin-based metal–organic framework (MOF), PCN-221, was prepared showing low cytotoxicity. • PCN-221 showed high methotrexate loading. • PCN-221 showed controlled pH-responsive release of methotrexate.
Trajectory tracking control for a nonholonomic mobile robot under ROS
NASA Astrophysics Data System (ADS)
Lakhdar Besseghieur, Khadir; Trębiński, Radosław; Kaczmarek, Wojciech; Panasiuk, Jarosław
2018-05-01
In this paper, the implementation of a trajectory tracking control strategy on a ROS-based mobile robot is considered. Our test bench is the nonholonomic mobile robot ‘TURTLEBOT’. ROS considerably facilitates setting up a suitable environment to test the designed controller. Our aim is to develop a framework using ROS concepts so that a trajectory tracking controller can be implemented on any ROS-enabled mobile robot. Practical experiments with ‘TURTLEBOT’ are conducted to assess the framework's reliability.
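A minimal rospy node for this kind of setup subscribes to odometry, compares the pose against a time-parameterized reference, and publishes velocity commands. Topic names, gains, and the simple proportional tracking law below are assumptions for illustration, not the controller evaluated in the paper.

```python
#!/usr/bin/env python
# Sketch of a ROS (rospy) node closing a trajectory-tracking loop on a
# differential-drive robot: read odometry, compare with a time-parameterised
# reference, publish velocity commands. Topics, gains, and the tracking law
# are assumed for illustration.
import math
import rospy
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry
from tf.transformations import euler_from_quaternion

pose = [0.0, 0.0, 0.0]   # x, y, yaw from odometry

def odom_cb(msg):
    q = msg.pose.pose.orientation
    pose[0] = msg.pose.pose.position.x
    pose[1] = msg.pose.pose.position.y
    pose[2] = euler_from_quaternion([q.x, q.y, q.z, q.w])[2]

def reference(t):
    """Assumed circular reference trajectory."""
    return math.cos(0.2 * t), math.sin(0.2 * t)

rospy.init_node("trajectory_tracker")
rospy.Subscriber("/odom", Odometry, odom_cb)
cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)

rate, t0 = rospy.Rate(20), rospy.get_time()
while not rospy.is_shutdown():
    xr, yr = reference(rospy.get_time() - t0)
    ex, ey = xr - pose[0], yr - pose[1]
    heading_err = math.atan2(ey, ex) - pose[2]
    cmd = Twist()
    cmd.linear.x = min(0.3, 0.5 * math.hypot(ex, ey))                    # capped forward speed
    cmd.angular.z = 1.5 * math.atan2(math.sin(heading_err), math.cos(heading_err))
    cmd_pub.publish(cmd)
    rate.sleep()
```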
Planning Framework for Mesolevel Optimization of Urban Runoff Control Schemes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Qianqian; Blohm, Andrew; Liu, Bo
A planning framework is developed to optimize runoff control schemes at scales relevant for regional planning at an early stage. The framework employs less sophisticated modeling approaches to allow practical application in developing regions with limited data sources and computing capability. The methodology contains three interrelated modules: (1) the geographic information system (GIS)-based hydrological module, which aims at assessing local hydrological constraints and the potential for runoff control according to regional land-use descriptions; (2) the grading module, which is built upon the method of fuzzy comprehensive evaluation and is used to establish a priority ranking system to assist the allocation of runoff control targets at the subdivision level; and (3) the genetic algorithm-based optimization module, which is included to derive Pareto-based optimal solutions for mesolevel allocation with multiple competing objectives. The optimization approach describes the trade-off between different allocation plans and simultaneously ensures that all allocation schemes satisfy the minimum requirement on runoff control. Our results highlight the importance of considering the mesolevel allocation strategy in addition to measures at macrolevels and microlevels in urban runoff management.
Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach
2014-01-01
Background Cross-cultural adaptation is a necessary process to effectively use existing instruments in other cultural and language settings. The process of cross-culturally adapting existing instruments, including translation, is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice in achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, the Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn's recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin's back-translation approach was utilized, followed by a committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger. In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved, while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population settings. PMID:25285151
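Lynn's item-level content validity index mentioned in the Methods can be computed as the proportion of experts rating an item 3 or 4 on a 4-point relevance scale; the sketch below uses invented ratings and an example retention threshold rather than the study's panel data.

```python
# Sketch of the item-level content validity index (I-CVI): the proportion of
# experts rating an item's relevance 3 or 4 on a 4-point scale. Ratings and
# the 0.78 retention threshold below are examples, not the study's values.
ratings = {
    "item_01": [4, 4, 3, 4, 3],
    "item_02": [4, 2, 3, 4, 2],
    "item_03": [3, 4, 4, 4, 4],
}

def i_cvi(expert_scores):
    relevant = sum(1 for s in expert_scores if s >= 3)
    return relevant / len(expert_scores)

item_cvis = {item: i_cvi(scores) for item, scores in ratings.items()}
scale_cvi_ave = sum(item_cvis.values()) / len(item_cvis)    # scale-level average

for item, cvi in item_cvis.items():
    flag = "review" if cvi < 0.78 else "retain"              # example threshold
    print(f"{item}: I-CVI={cvi:.2f} -> {flag}")
print(f"S-CVI/Ave = {scale_cvi_ave:.2f}")
```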
Principal-agent theory: a framework for improving health care reform in Tennessee.
Sekwat, A
2000-01-01
Using a framework based on principal-agent theory, this study examines problems faced by managed care organizations (MCOs) and major health care providers under the state of Tennessee's current capitation-based managed care programs called TennCare. Based on agency theory, the study proposes a framework to show how an effective collaborative relationship can be forged between the state of Tennessee and participating MCOs which takes into account the major concerns of third-party health care providers. The proposed framework further enhances realization of the state's key health care reform goals which are to control the rising costs of health care delivery and to expand health care coverage to uninsured and underinsured Tennesseans.
Optimization-Based Robust Nonlinear Control
2006-08-01
ABSTRACT New control algorithms were developed for robust stabilization of nonlinear dynamical systems. Novel, linear matrix inequality-based synthesis...was to further advance optimization-based robust nonlinear control design, for general nonlinear systems (especially in discrete time), for linear...Teel, IEEE Transactions on Control Systems Technology, vol. 14, no. 3, p. 398-407, May 2006. 3. "A unified framework for input-to-state stability in
Fiber optic interferometry for industrial process monitoring and control applications
NASA Astrophysics Data System (ADS)
Marcus, Michael A.
2002-02-01
Over the past few years we have been developing applications for a high-resolution (sub-micron accuracy) fiber-optic-coupled dual Michelson interferometer-based instrument. It is being utilized in a variety of applications, including monitoring liquid layer thickness uniformity on coating hoppers, film base thickness uniformity measurement, digital camera focus assessment, optical cell path length assessment, and imager and wafer surface profile mapping. The instrument includes both coherent and non-coherent light sources, custom application-dependent optical probes and sample interfaces, a Michelson interferometer, custom electronics, and a Pentium-based PC with data acquisition cards and LabWindows/CVI- or LabVIEW-based application-specific software. This paper describes the development and evolution of this instrument platform and its applications, highlighting robust instrument design and the development of hardware, software, and user interfaces. The talk concludes with a discussion of a new high-speed instrument configuration, which can be utilized for high-speed surface profiling and as an on-line web thickness gauge.
Huang, Rong; He, Hongmei; Pi, Xitian; Diao, Ziji; Zhao, Suwen
2014-06-01
Non-drug treatment of hypertension has become a research hotspot, as it may avoid the heavy economic burden and side effects of drug treatment for patients. Because of its good treatment effect and convenient operation, a new treatment based on slow breathing training is increasingly used as a form of physical therapy for hypertension. This paper explains the principle of hypertension treatment based on the slow breathing training method and introduces the overall structure of the portable blood pressure control instrument, including the breathing detection circuit, core control module, audio module, memory module and man-machine interaction module. We give a brief introduction to the instrument and its software. The prototype testing results showed that the treatment had a significant effect on controlling blood pressure.