Toward an Improvement of the Analysis of Neural Coding.
Alegre-Cortés, Javier; Soto-Sánchez, Cristina; Albarracín, Ana L; Farfán, Fernando D; Val-Calvo, Mikel; Ferrandez, José M; Fernandez, Eduardo
2017-01-01
Machine learning and artificial intelligence have strong roots on principles of neural computation. Some examples are the structure of the first perceptron, inspired in the retina, neuroprosthetics based on ganglion cell recordings or Hopfield networks. In addition, machine learning provides a powerful set of tools to analyze neural data, which has already proved its efficacy in so distant fields of research as speech recognition, behavioral states classification, or LFP recordings. However, despite the huge technological advances in neural data reduction of dimensionality, pattern selection, and clustering during the last years, there has not been a proportional development of the analytical tools used for Time-Frequency (T-F) analysis in neuroscience. Bearing this in mind, we introduce the convenience of using non-linear, non-stationary tools, EMD algorithms in particular, for the transformation of the oscillatory neural data (EEG, EMG, spike oscillations…) into the T-F domain prior to its analysis with machine learning tools. We support that to achieve meaningful conclusions, the transformed data we analyze has to be as faithful as possible to the original recording, so that the transformations forced into the data due to restrictions in the T-F computation are not extended to the results of the machine learning analysis. Moreover, bioinspired computation such as brain-machine interface may be enriched from a more precise definition of neuronal coding where non-linearities of the neuronal dynamics are considered.
NASA Astrophysics Data System (ADS)
Gábor Hatvani, István; Kern, Zoltán; Leél-Őssy, Szabolcs; Demény, Attila
2018-01-01
Uneven spacing is a common feature of sedimentary paleoclimate records, in many cases causing difficulties in the application of classical statistical and time series methods. Although special statistical tools do exist to assess unevenly spaced data directly, the transformation of such data into a temporally equidistant time series which may then be examined using commonly employed statistical tools remains, however, an unachieved goal. The present paper, therefore, introduces an approach to obtain evenly spaced time series (using cubic spline fitting) from unevenly spaced speleothem records with the application of a spectral guidance to avoid the spectral bias caused by interpolation and retain the original spectral characteristics of the data. The methodology was applied to stable carbon and oxygen isotope records derived from two stalagmites from the Baradla Cave (NE Hungary) dating back to the late 18th century. To show the benefit of the equally spaced records to climate studies, their coherence with climate parameters is explored using wavelet transform coherence and discussed. The obtained equally spaced time series are available at https://doi.org/10.1594/PANGAEA.875917.
NASA Astrophysics Data System (ADS)
Prasetyo, T.; Amar, S.; Arendra, A.; Zam Zami, M. K.
2018-01-01
This study develops an on-line detection system to predict the wear of DCMT070204 tool tip during the cutting process of the workpiece. The machine used in this research is CNC ProTurn 9000 to cut ST42 steel cylinder. The audio signal has been captured using the microphone placed in the tool post and recorded in Matlab. The signal is recorded at the sampling rate of 44.1 kHz, and the sampling size of 1024. The recorded signal is 110 data derived from the audio signal while cutting using a normal chisel and a worn chisel. And then perform signal feature extraction in the frequency domain using Fast Fourier Transform. Feature selection is done based on correlation analysis. And tool wear classification was performed using artificial neural networks with 33 input features selected. This artificial neural network is trained with back propagation method. Classification performance testing yields an accuracy of 74%.
Nicolas, F; Coëtmellec, S; Brunel, M; Allano, D; Lebrun, D; Janssen, A J E M
2005-11-01
The authors have studied the diffraction pattern produced by a particle field illuminated by an elliptic and astigmatic Gaussian beam. They demonstrate that the bidimensional fractional Fourier transformation is a mathematically suitable tool to analyse the diffraction pattern generated not only by a collimated plane wave [J. Opt. Soc. Am A 19, 1537 (2002)], but also by an elliptic and astigmatic Gaussian beam when two different fractional orders are considered. Simulations and experimental results are presented.
Estimating monthly streamflow values by cokriging
Solow, A.R.; Gorelick, S.M.
1986-01-01
Cokriging is applied to estimation of missing monthly streamflow values in three records from gaging stations in west central Virginia. Missing values are estimated from optimal consideration of the pattern of auto- and cross-correlation among standardized residual log-flow records. Investigation of the sensitivity of estimation to data configuration showed that when observations are available within two months of a missing value, estimation is improved by accounting for correlation. Concurrent and lag-one observations tend to screen the influence of other available observations. Three models of covariance structure in residual log-flow records are compared using cross-validation. Models differ in how much monthly variation they allow in covariance. Precision of estimation, reflected in mean squared error (MSE), proved to be insensitive to this choice. Cross-validation is suggested as a tool for choosing an inverse transformation when an initial nonlinear transformation is applied to flow values. ?? 1986 Plenum Publishing Corporation.
ECG Signal Analysis and Arrhythmia Detection using Wavelet Transform
NASA Astrophysics Data System (ADS)
Kaur, Inderbir; Rajni, Rajni; Marwaha, Anupma
2016-12-01
Electrocardiogram (ECG) is used to record the electrical activity of the heart. The ECG signal being non-stationary in nature, makes the analysis and interpretation of the signal very difficult. Hence accurate analysis of ECG signal with a powerful tool like discrete wavelet transform (DWT) becomes imperative. In this paper, ECG signal is denoised to remove the artifacts and analyzed using Wavelet Transform to detect the QRS complex and arrhythmia. This work is implemented in MATLAB software for MIT/BIH Arrhythmia database and yields the sensitivity of 99.85 %, positive predictivity of 99.92 % and detection error rate of 0.221 % with wavelet transform. It is also inferred that DWT outperforms principle component analysis technique in detection of ECG signal.
NASA Astrophysics Data System (ADS)
Testorf, M. E.; Jobst, B. C.; Kleen, J. K.; Titiz, A.; Guillory, S.; Scott, R.; Bujarski, K. A.; Roberts, D. W.; Holmes, G. L.; Lenck-Santini, P.-P.
2012-10-01
Time-frequency transforms are used to identify events in clinical EEG data. Data are recorded as part of a study for correlating the performance of human subjects during a memory task with pathological events in the EEG, called spikes. The spectrogram and the scalogram are reviewed as tools for evaluating spike activity. A statistical evaluation of the continuous wavelet transform across trials is used to quantify phase-locking events. For simultaneously improving the time and frequency resolution, and for representing the EEG of several channels or trials in a single time-frequency plane, a multichannel matching pursuit algorithm is used. Fundamental properties of the algorithm are discussed as well as preliminary results, which were obtained with clinical EEG data.
A fast algorithm for vertex-frequency representations of signals on graphs
Jestrović, Iva; Coyle, James L.; Sejdić, Ervin
2016-01-01
The windowed Fourier transform (short time Fourier transform) and the S-transform are widely used signal processing tools for extracting frequency information from non-stationary signals. Previously, the windowed Fourier transform had been adopted for signals on graphs and has been shown to be very useful for extracting vertex-frequency information from graphs. However, high computational complexity makes these algorithms impractical. We sought to develop a fast windowed graph Fourier transform and a fast graph S-transform requiring significantly shorter computation time. The proposed schemes have been tested with synthetic test graph signals and real graph signals derived from electroencephalography recordings made during swallowing. The results showed that the proposed schemes provide significantly lower computation time in comparison with the standard windowed graph Fourier transform and the fast graph S-transform. Also, the results showed that noise has no effect on the results of the algorithm for the fast windowed graph Fourier transform or on the graph S-transform. Finally, we showed that graphs can be reconstructed from the vertex-frequency representations obtained with the proposed algorithms. PMID:28479645
Lalloo, Chitra; Stinson, Jennifer N; Brown, Stephen C; Campbell, Fiona; Isaac, Lisa; Henry, James L
2014-11-01
To evaluate clinical feasibility of the Pain-QuILT (previously known as the Iconic Pain Assessment Tool) from the perspective of adolescents with chronic pain and members of their interdisciplinary health team. The Pain-QuILT (PQ), a web-based tool that records the visual self-report of sensory pain in the form of time-stamped records, was directly compared with standard interview questions that were transformed to a paper-based tool. Qualitative, semi-structured interviews were used to refine the PQ. Adolescents with chronic pain aged 12 to 18 years used the PQ and comparator tool (randomized order) to self-report pain before a scheduled clinic appointment, and then took part in a semi-structured interview. The health team used these pain reports (PQ and comparator) during patient appointments, and later participated in focus group interviews. Interview audio recordings were transcribed verbatim and underwent a simple line-by-line content analysis to identify key concepts. A total of 17 adolescents and 9 health team members completed the study. All adolescents felt that the PQ was easy to use and understand. The median time required for completion of the PQ and comparator tool was 3.3 and 3.6 minutes, respectively. Overall, 15/17 (88%) of adolescents preferred the PQ to self-report their pain versus the comparator. The health team indicated that the PQ was a clinically useful tool and identified minor barriers to implementation. Consultations with adolescents and their health team indicate that the PQ is a clinically feasible tool for eliciting detailed self-report records of the sensory experience of chronic pain.
Transforming care delivery through health information technology.
Wheatley, Benjamin
2013-01-01
The slow but progressive adoption of health information technology (IT) nationwide promises to usher in a new era in health care. Electronic health record systems provide a complete patient record at the point of care and can help to alleviate some of the challenges of a fragmented delivery system, such as drug-drug interactions. Moreover, health IT promotes evidence-based practice by identifying gaps in recommended treatment and providing clinical decision-support tools. In addition, the data collected through digital records can be used to monitor patient outcomes and identify potential improvements in care protocols. Kaiser Permanente continues to advance its capability in each of these areas.
Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform
Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong
2016-01-01
We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features. PMID:27304979
Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform.
Wu, Hau-Tieng; Wu, Han-Kuei; Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong
2016-01-01
We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features.
Muench, John; Jarvis, Kelly; Boverman, Josh; Hardman, Joseph; Hayes, Meg; Winkle, Jim
2012-01-01
In order to successfully integrate screening, brief intervention, and referral to treatment (SBIRT) into primary care, education of clinicians must be paired with sustainable transformation of the clinical settings in which they practice. The SBIRT Oregon project adopted this strategy in an effort to fully integrate SBIRT into 7 primary care residency clinics. Residents were trained to assess and intervene in their patients' unhealthy substance use, whereas clinic staff personnel were trained to carry out a multistep screening process. Electronic medical record tools were created to further integrate and track SBIRT processes. This article describes how a resident training curriculum complemented and was informed by the transformation of workflow processes within the residents' home clinics.
Not Scotch, but Rum: The Scope and Diffusion of the Scottish Presence in the Published Record
ERIC Educational Resources Information Center
Lavoie, Brian
2013-01-01
Big data sets and powerful computing capacity have transformed scholarly inquiry across many disciplines. While the impact of data-intensive research methodologies is perhaps most distinct in the natural and social sciences, the humanities have also benefited from these new analytical tools. While full-text data is necessary to study topics such…
Fourier-Transform Infrared Microspectroscopy, a Novel and Rapid Tool for Identification of Yeasts
Wenning, Mareike; Seiler, Herbert; Scherer, Siegfried
2002-01-01
Fourier-transform infrared (FT-IR) microspectroscopy was used in this study to identify yeasts. Cells were grown to microcolonies of 70 to 250 μm in diameter and transferred from the agar plate by replica stamping to an IR-transparent ZnSe carrier. IR spectra of the replicas on the carrier were recorded using an IR microscope coupled to an IR spectrometer, and identification was performed by comparison to reference spectra. The method was tested by using small model libraries comprising reference spectra of 45 strains from 9 genera and 13 species, recorded with both FT-IR microspectroscopy and FT-IR macrospectroscopy. The results show that identification by FT-IR microspectroscopy is equivalent to that achieved by FT-IR macrospectroscopy but the time-consuming isolation of the organisms prior to identification is not necessary. Therefore, this method also provides a rapid tool to analyze mixed populations. Furthermore, identification of 21 Debaryomyces hansenii and 9 Saccharomyces cerevisiae strains resulted in 92% correct identification at the strain level for S. cerevisiae and 91% for D. hansenii, which demonstrates that the resolution power of FT-IR microspectroscopy may also be used for yeast typing at the strain level. PMID:12324312
Sholle, Evan T.; Kabariti, Joseph; Johnson, Stephen B.; Leonard, John P.; Pathak, Jyotishman; Varughese, Vinay I.; Cole, Curtis L.; Campion, Thomas R.
2017-01-01
Academic medical centers commonly approach secondary use of electronic health record (EHR) data by implementing centralized clinical data warehouses (CDWs). However, CDWs require extensive resources to model data dimensions and harmonize clinical terminology, which can hinder effective support of the specific and varied data needs of investigators. We hypothesized that an approach that aggregates raw data from source systems, ignores initial modeling typical of CDWs, and transforms raw data for specific research purposes would meet investigator needs. The approach has successfully enabled multiple tools that provide utility to the institutional research enterprise. To our knowledge, this is the first complete description of a methodology for electronic patient data acquisition and provisioning that ignores data harmonization at the time of initial storage in favor of downstream transformation to address specific research questions and applications. PMID:29854228
van Agthoven, Maria A; Barrow, Mark P; Chiron, Lionel; Coutouly, Marie-Aude; Kilgour, David; Wootton, Christopher A; Wei, Juan; Soulby, Andrew; Delsuc, Marc-André; Rolando, Christian; O'Connor, Peter B
2015-12-01
Two-dimensional Fourier transform ion cyclotron resonance mass spectrometry is a data-independent analytical method that records the fragmentation patterns of all the compounds in a sample. This study shows the implementation of atmospheric pressure photoionization with two-dimensional (2D) Fourier transform ion cyclotron resonance mass spectrometry. In the resulting 2D mass spectrum, the fragmentation patterns of the radical and protonated species from cholesterol are differentiated. This study shows the use of fragment ion lines, precursor ion lines, and neutral loss lines in the 2D mass spectrum to determine fragmentation mechanisms of known compounds and to gain information on unknown ion species in the spectrum. In concert with high resolution mass spectrometry, 2D Fourier transform ion cyclotron resonance mass spectrometry can be a useful tool for the structural analysis of small molecules. Graphical Abstract ᅟ.
Quality improvement and practice-based research in neurology using the electronic medical record
Frigerio, Roberta; Kazmi, Nazia; Meyers, Steven L.; Sefa, Meredith; Walters, Shaun A.; Silverstein, Jonathan C.
2015-01-01
Abstract We describe quality improvement and practice-based research using the electronic medical record (EMR) in a community health system–based department of neurology. Our care transformation initiative targets 10 neurologic disorders (brain tumors, epilepsy, migraine, memory disorders, mild traumatic brain injury, multiple sclerosis, neuropathy, Parkinson disease, restless legs syndrome, and stroke) and brain health (risk assessments and interventions to prevent Alzheimer disease and related disorders in targeted populations). Our informatics methods include building and implementing structured clinical documentation support tools in the EMR; electronic data capture; enrollment, data quality, and descriptive reports; quality improvement projects; clinical decision support tools; subgroup-based adaptive assignments and pragmatic trials; and DNA biobanking. We are sharing EMR tools and deidentified data with other departments toward the creation of a Neurology Practice-Based Research Network. We discuss practical points to assist other clinical practices to make quality improvements and practice-based research in neurology using the EMR a reality. PMID:26576324
King, Raymond J; Garrett, Nedra; Kriseman, Jeffrey; Crum, Melvin; Rafalski, Edward M; Sweat, David; Frazier, Renee; Schearer, Sue; Cutts, Teresa
2016-09-08
We present a framework for developing a community health record to bring stakeholders, information, and technology together to collectively improve the health of a community. It is both social and technical in nature and presents an iterative and participatory process for achieving multisector collaboration and information sharing. It proposes a methodology and infrastructure for bringing multisector stakeholders and their information together to inform, target, monitor, and evaluate community health initiatives. The community health record is defined as both the proposed framework and a tool or system for integrating and transforming multisector data into actionable information. It is informed by the electronic health record, personal health record, and County Health Ranking systems but differs in its social complexity, communal ownership, and provision of information to multisector partners at scales ranging from address to zip code.
Garrett, Nedra; Kriseman, Jeffrey; Crum, Melvin; Rafalski, Edward M.; Sweat, David; Frazier, Renee; Schearer, Sue; Cutts, Teresa
2016-01-01
We present a framework for developing a community health record to bring stakeholders, information, and technology together to collectively improve the health of a community. It is both social and technical in nature and presents an iterative and participatory process for achieving multisector collaboration and information sharing. It proposes a methodology and infrastructure for bringing multisector stakeholders and their information together to inform, target, monitor, and evaluate community health initiatives. The community health record is defined as both the proposed framework and a tool or system for integrating and transforming multisector data into actionable information. It is informed by the electronic health record, personal health record, and County Health Ranking systems but differs in its social complexity, communal ownership, and provision of information to multisector partners at scales ranging from address to zip code. PMID:27609300
Imaging ultrasonic dispersive guided wave energy in long bones using linear radon transform.
Tran, Tho N H T; Nguyen, Kim-Cuong T; Sacchi, Mauricio D; Le, Lawrence H
2014-11-01
Multichannel analysis of dispersive ultrasonic energy requires a reliable mapping of the data from the time-distance (t-x) domain to the frequency-wavenumber (f-k) or frequency-phase velocity (f-c) domain. The mapping is usually performed with the classic 2-D Fourier transform (FT) with a subsequent substitution and interpolation via c = 2πf/k. The extracted dispersion trajectories of the guided modes lack the resolution in the transformed plane to discriminate wave modes. The resolving power associated with the FT is closely linked to the aperture of the recorded data. Here, we present a linear Radon transform (RT) to image the dispersive energies of the recorded ultrasound wave fields. The RT is posed as an inverse problem, which allows implementation of the regularization strategy to enhance the focusing power. We choose a Cauchy regularization for the high-resolution RT. Three forms of Radon transform: adjoint, damped least-squares, and high-resolution are described, and are compared with respect to robustness using simulated and cervine bone data. The RT also depends on the data aperture, but not as severely as does the FT. With the RT, the resolution of the dispersion panel could be improved up to around 300% over that of the FT. Among the Radon solutions, the high-resolution RT delineated the guided wave energy with much better imaging resolution (at least 110%) than the other two forms. The Radon operator can also accommodate unevenly spaced records. The results of the study suggest that the high-resolution RT is a valuable imaging tool to extract dispersive guided wave energies under limited aperture. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
Heuer, Herbert; Hegele, Mathias
2010-12-01
Mechanical tools are transparent in the sense that their input-output relations can be derived from their perceptible characteristics. Modern technology creates more and more tools that lack mechanical transparency, such as in the control of the position of a cursor by means of a computer mouse or some other input device. We inquired whether an enhancement of transparency by means of presenting the shaft of a virtual sliding lever, which governed the transformation of hand position into cursor position, supports performance of aimed cursor movement and the acquisition of an internal model of the transformation in both younger and older adults. Enhanced transparency resulted in an improvement of visual closed-loop control in terms of movement time and curvature of cursor paths. The movement-time improvement was more pronounced at older working age than at younger working age, so that the enhancement of transparency can serve as a means to mitigate age-related declines in performance. Benefits for the acquisition of an internal model of the transformation and of explicit knowledge were absent. Thus, open-loop control in this task did not profit from enhanced mechanical transparency. These findings strongly suggest that environmental support of transparency of the effects of input devices on controlled systems might be a powerful tool to support older users. Enhanced transparency may also improve simulator-based training by increasing motivation, even if training benefits do not transfer to situations without enhanced transparency. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
Discrete wavelet transform: a tool in smoothing kinematic data.
Ismail, A R; Asfour, S S
1999-03-01
Motion analysis systems typically introduce noise to the displacement data recorded. Butterworth digital filters have been used to smooth the displacement data in order to obtain smoothed velocities and accelerations. However, this technique does not yield satisfactory results, especially when dealing with complex kinematic motions that occupy the low- and high-frequency bands. The use of the discrete wavelet transform, as an alternative to digital filters, is presented in this paper. The transform passes the original signal through two complementary low- and high-pass FIR filters and decomposes the signal into an approximation function and a detail function. Further decomposition of the signal results in transforming the signal into a hierarchy set of orthogonal approximation and detail functions. A reverse process is employed to perfectly reconstruct the signal (inverse transform) back from its approximation and detail functions. The discrete wavelet transform was applied to the displacement data recorded by Pezzack et al., 1977. The smoothed displacement data were twice differentiated and compared to Pezzack et al.'s acceleration data in order to choose the most appropriate filter coefficients and decomposition level on the basis of maximizing the percentage of retained energy (PRE) and minimizing the root mean square error (RMSE). Daubechies wavelet of the fourth order (Db4) at the second decomposition level showed better results than both the biorthogonal and Coiflet wavelets (PRE = 97.5%, RMSE = 4.7 rad s-2). The Db4 wavelet was then used to compress complex displacement data obtained from a noisy mathematically generated function. Results clearly indicate superiority of this new smoothing approach over traditional filters.
Seismic Linear Noise Attenuation with Use of Radial Transform
NASA Astrophysics Data System (ADS)
Szymańska-Małysa, Żaneta
2018-03-01
One of the goals of seismic data processing is to attenuate the recorded noise in order to enable correct interpretation of the image. Radial transform has been used as a very effective tool in the attenuation of various types of linear noise, both numerical and real (such as ground roll, direct waves, head waves, guided waves etc). The result of transformation from offset - time (X - T) domain into apparent velocity - time (R - T) domain is frequency separation between reflections and linear events. In this article synthetic and real seismic shot gathers were examined. One example was targeted at far offset area of dataset where reflections and noise had similar apparent velocities and frequency bands. Another example was a result of elastic modelling where linear artefacts were produced. Bandpass filtering and scaling operation executed in radial domain attenuated all discussed types of linear noise very effectively. After noise reduction all further processing steps reveal better results, especially velocity analysis, migration and stacking. In all presented cases signal-to-noise ratio was significantly increased and reflections covered previously by noise were revealed. Power spectra of filtered seismic records preserved real dynamics of reflections.
Motion detection using extended fractional Fourier transform and digital speckle photography.
Bhaduri, Basanta; Tay, C J; Quan, C; Sheppard, Colin J R
2010-05-24
Digital speckle photography is a useful tool for measuring the motion of optically rough surfaces from the speckle shift that takes place at the recording plane. A simple correlation based digital speckle photographic system has been proposed that implements two simultaneous optical extended fractional Fourier transforms (EFRTs) of different orders using only a single lens and detector to simultaneously detect both the magnitude and direction of translation and tilt by capturing only two frames: one before and another after the object motion. The dynamic range and sensitivity of the measurement can be varied readily by altering the position of the mirror/s used in the optical setup. Theoretical analysis and experiment results are presented.
Medicaid information technology architecture: an overview.
Friedman, Richard H
2006-01-01
The Medicaid Information Technology Architecture (MITA) is a roadmap and tool-kit for States to transform their Medicaid Management Information System (MMIS) into an enterprise-wide, beneficiary-centric system. MITA will enable State Medicaid agencies to align their information technology (IT) opportunities with their evolving business needs. It also addresses long-standing issues of interoperability, adaptability, and data sharing, including clinical data, across organizational boundaries by creating models based on nationally accepted technical standards. Perhaps most significantly, MITA allows State Medicaid Programs to actively participate in the DHHS Secretary's vision of a transparent health care market that utilizes electronic health records (EHRs), ePrescribing and personal health records (PHRs).
Mouse EEG spike detection based on the adapted continuous wavelet transform
NASA Astrophysics Data System (ADS)
Tieng, Quang M.; Kharatishvili, Irina; Chen, Min; Reutens, David C.
2016-04-01
Objective. Electroencephalography (EEG) is an important tool in the diagnosis of epilepsy. Interictal spikes on EEG are used to monitor the development of epilepsy and the effects of drug therapy. EEG recordings are generally long and the data voluminous. Thus developing a sensitive and reliable automated algorithm for analyzing EEG data is necessary. Approach. A new algorithm for detecting and classifying interictal spikes in mouse EEG recordings is proposed, based on the adapted continuous wavelet transform (CWT). The construction of the adapted mother wavelet is founded on a template obtained from a sample comprising the first few minutes of an EEG data set. Main Result. The algorithm was tested with EEG data from a mouse model of epilepsy and experimental results showed that the algorithm could distinguish EEG spikes from other transient waveforms with a high degree of sensitivity and specificity. Significance. Differing from existing approaches, the proposed approach combines wavelet denoising, to isolate transient signals, with adapted CWT-based template matching, to detect true interictal spikes. Using the adapted wavelet constructed from a predefined template, the adapted CWT is calculated on small EEG segments to fit dynamical changes in the EEG recording.
HydroClimATe: hydrologic and climatic analysis toolkit
Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.
2014-01-01
The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
Kopanitsa, Georgy
2017-05-18
The efficiency and acceptance of clinical decision support systems (CDSS) can increase if they reuse medical data captured during health care delivery. High heterogeneity of the existing legacy data formats has become the main barrier for the reuse of data. Thus, we need to apply data modeling mechanisms that provide standardization, transformation, accumulation and querying medical data to allow its reuse. In this paper, we focus on the interoperability issues of the hospital information systems (HIS) and CDSS data integration. Our study is based on the approach proposed by Marcos et al. where archetypes are used as a standardized mechanism for the interaction of a CDSS with an electronic health record (EHR). We build an integration tool to enable CDSSs collect data from various institutions without a need for modifications in the implementation. The approach implies development of a conceptual level as a set of archetypes representing concepts required by a CDSS. Treatment case data from Regional Clinical Hospital in Tomsk, Russia was extracted, transformed and loaded to the archetype database of a clinical decision support system. Test records' normalization has been performed by defining transformation and aggregation rules between the EHR data and the archetypes. These mapping rules were used to automatically generate openEHR compliant data. After the transformation, archetype data instances were loaded into the CDSS archetype based data storage. The performance times showed acceptable performance for the extraction stage with a mean of 17.428 s per year (3436 case records). The transformation times were also acceptable with 136.954 s per year (0.039 s per one instance). The accuracy evaluation showed the correctness and applicability of the method for the wide range of HISes. These operations were performed without interrupting the HIS workflow to prevent the HISes from disturbing the service provision to the users. The project results have proven that archetype based technologies are mature enough to be applied in routine operations that require extraction, transformation, loading and querying medical data from heterogeneous EHR systems. Inference models in clinical research and CDSS can benefit from this by defining queries to a valid data set with known structure and constraints. The standard based nature of the archetype approach allows an easy integration of CDSSs with existing EHR systems.
Depciuch, J; Kaznowska, E; Golowski, S; Koziorowska, A; Zawlik, I; Cholewa, M; Szmuc, K; Cebulski, J
2017-09-05
Breast cancer affects one in four women, therefore, the search for new diagnostic technologies and therapeutic approaches is of critical importance. This involves the development of diagnostic tools to facilitate the detection of cancer cells, which is useful for assessing the efficacy of cancer therapies. One of the major challenges for chemotherapy is the lack of tools to monitor efficacy during the course of treatment. Vibrational spectroscopy appears to be a promising tool for such a purpose, as it yields Fourier transformation infrared (FTIR) spectra which can be used to provide information on the chemical composition of the tissue. Previous research by our group has demonstrated significant differences between the infrared spectra of healthy, cancerous and post-chemotherapy breast tissue. Furthermore, the results obtained for three extreme patient cases revealed that the infrared spectra of post-chemotherapy breast tissue closely resembles that of healthy breast tissue when chemotherapy is effective (i.e., a good therapeutic response is achieved), or that of cancerous breast tissue when chemotherapy is ineffective. In the current study, we compared the infrared spectra of healthy, cancerous and post-chemotherapy breast tissue. Characteristic parameters were designated for the obtained spectra, spreading the function of absorbance using the Kramers-Kronig transformation and the best fit procedure to obtain Lorentz functions, which represent components of the bands. The Lorentz function parameters were used to develop a physics-based computational model to verify the efficacy of a given chemotherapy protocol in a given case. The results obtained using this model reflected the actual patient data retrieved from medical records (health improvement or no improvement). Therefore, we propose this model as a useful tool for monitoring the efficacy of chemotherapy in patients with breast cancer. Copyright © 2017 Elsevier B.V. All rights reserved.
The shift-invariant discrete wavelet transform and application to speech waveform analysis.
Enders, Jörg; Geng, Weihua; Li, Peijun; Frazier, Michael W; Scholl, David J
2005-04-01
The discrete wavelet transform may be used as a signal-processing tool for visualization and analysis of nonstationary, time-sampled waveforms. The highly desirable property of shift invariance can be obtained at the cost of a moderate increase in computational complexity, and accepting a least-squares inverse (pseudoinverse) in place of a true inverse. A new algorithm for the pseudoinverse of the shift-invariant transform that is easier to implement in array-oriented scripting languages than existing algorithms is presented together with self-contained proofs. Representing only one of the many and varied potential applications, a recorded speech waveform illustrates the benefits of shift invariance with pseudoinvertibility. Visualization shows the glottal modulation of vowel formants and frication noise, revealing secondary glottal pulses and other waveform irregularities. Additionally, performing sound waveform editing operations (i.e., cutting and pasting sections) on the shift-invariant wavelet representation automatically produces quiet, click-free section boundaries in the resulting sound. The capabilities of this wavelet-domain editing technique are demonstrated by changing the rate of a recorded spoken word. Individual pitch periods are repeated to obtain a half-speed result, and alternate individual pitch periods are removed to obtain a double-speed result. The original pitch and formant frequencies are preserved. In informal listening tests, the results are clear and understandable.
The shift-invariant discrete wavelet transform and application to speech waveform analysis
NASA Astrophysics Data System (ADS)
Enders, Jörg; Geng, Weihua; Li, Peijun; Frazier, Michael W.; Scholl, David J.
2005-04-01
The discrete wavelet transform may be used as a signal-processing tool for visualization and analysis of nonstationary, time-sampled waveforms. The highly desirable property of shift invariance can be obtained at the cost of a moderate increase in computational complexity, and accepting a least-squares inverse (pseudoinverse) in place of a true inverse. A new algorithm for the pseudoinverse of the shift-invariant transform that is easier to implement in array-oriented scripting languages than existing algorithms is presented together with self-contained proofs. Representing only one of the many and varied potential applications, a recorded speech waveform illustrates the benefits of shift invariance with pseudoinvertibility. Visualization shows the glottal modulation of vowel formants and frication noise, revealing secondary glottal pulses and other waveform irregularities. Additionally, performing sound waveform editing operations (i.e., cutting and pasting sections) on the shift-invariant wavelet representation automatically produces quiet, click-free section boundaries in the resulting sound. The capabilities of this wavelet-domain editing technique are demonstrated by changing the rate of a recorded spoken word. Individual pitch periods are repeated to obtain a half-speed result, and alternate individual pitch periods are removed to obtain a double-speed result. The original pitch and formant frequencies are preserved. In informal listening tests, the results are clear and understandable. .
A Chemical Transformation Simulator is a web-based system for predicting transformation pathways and physicochemical properties of organic chemicals. Role in Environmental Modeling • Screening tool for identifying likely transformation products in the environment • Parameteri...
The Locus of Tool-Transformation Costs
ERIC Educational Resources Information Center
Kunde, Wilfried; Pfister, Roland; Janczyk, Markus
2012-01-01
Transformations of hand movements by tools such as levers or electronic input devices can invoke performance costs compared to untransformed movements. This study investigated by means of the Psychological Refractory Period (PRP) paradigm at which stage of information processing such tool-transformation costs arise. We used an inversion…
The astronomy of Andean myth: The history of a cosmology
NASA Astrophysics Data System (ADS)
Sullivan, William F.
It is shown that Andean myth, on one level, represents a technical language recording astronomical observations of precession and, at the same time, an historical record of simultaneous social and celestial transformations. Topographic and architectural terms of Andean myth are interpreted as a metaphor for the organization of and locations on the celestial sphere. Via ethoastronomical date, mythical animals are identified as stars and placed on the celestial sphere according to their topographical location. Tested in the planetarium, these arrays generate cluster of dates - 200 B.C. and 650 A.D. Analysis of the names of Wiraqocha and Manco Capac indicates they represent Saturn and Jupiter and that their mythical meeting represents their conjunction in 650 A.D. The astronomy of Andean myth is then used as an historical tool to examine how the Andean priest-astronomers recorded the simultaneous creation of the avllu and of this distinctive astronomical system about 200 B.C. The idea that the agricultural avllu, with its double descent system stressing the importance of paternity, represents a transformation of society from an earlier matrilineal/horticultural era is examined in light of the sexual imagery employed in myth. Wiraqocha's androgyny and the division of the celestial sphere into male (ecliptic) and female (celestial equator = earth) are interpreted as cosmological validations of the new social structure.
A simple computer-based measurement and analysis system of pulmonary auscultation sounds.
Polat, Hüseyin; Güler, Inan
2004-12-01
Listening to various lung sounds has proven to be an important diagnostic tool for detecting and monitoring certain types of lung diseases. In this study a computer-based system has been designed for easy measurement and analysis of lung sound using the software package DasyLAB. The designed system presents the following features: it is able to digitally record the lung sounds which are captured with an electronic stethoscope plugged to a sound card on a portable computer, display the lung sound waveform for auscultation sites, record the lung sound into the ASCII format, acoustically reproduce the lung sound, edit and print the sound waveforms, display its time-expanded waveform, compute the Fast Fourier Transform (FFT), and display the power spectrum and spectrogram.
Schilling, Rebecca; Casper, Stephen T
2015-03-01
The Minnesota Multiphasic Personality Inventory (MMPI) was developed at the University of Minnesota, Minneapolis, in the 1930s and 1940s. It became a highly successful and highly controversial psychometric tool. In professional terms, psychometric tools such as the MMPI transformed psychology and psychiatry. Psychometric instruments thus readily fit into the developmental history of psychology, psychiatry, and neurology; they were a significant part of the narrative of those fields' advances in understanding, intervening, and treating people with mental illnesses. At the same time, the advent of such tools also fits into a history of those disciplines that records the rise of obsessional observational and evaluative techniques and technologies in order to facilitate patterns of social control that became typical during the Progressive Era in the United States and after. It was those patterns that also nurtured the resistance to psychometrics that emerged during the Vietnam War and after.
COMPOSE-HPC: A Transformational Approach to Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernholdt, David E; Allan, Benjamin A.; Armstrong, Robert C.
2012-04-01
The goal of the COMPOSE-HPC project is to 'democratize' tools for automatic transformation of program source code so that it becomes tractable for the developers of scientific applications to create and use their own transformations reliably and safely. This paper describes our approach to this challenge, the creation of the KNOT tool chain, which includes tools for the creation of annotation languages to control the transformations (PAUL), to perform the transformations (ROTE), and optimization and code generation (BRAID), which can be used individually and in combination. We also provide examples of current and future uses of the KNOT tools, whichmore » include transforming code to use different programming models and environments, providing tests that can be used to detect errors in software or its execution, as well as composition of software written in different programming languages, or with different threading patterns.« less
30 CFR 75.812-2 - High-voltage power centers and transformers; record of examination.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 1 2013-07-01 2013-07-01 false High-voltage power centers and transformers; record of examination. 75.812-2 Section 75.812-2 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION... High-Voltage Distribution § 75.812-2 High-voltage power centers and transformers; record of examination...
30 CFR 75.812-2 - High-voltage power centers and transformers; record of examination.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false High-voltage power centers and transformers; record of examination. 75.812-2 Section 75.812-2 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION... High-Voltage Distribution § 75.812-2 High-voltage power centers and transformers; record of examination...
30 CFR 75.812-2 - High-voltage power centers and transformers; record of examination.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 1 2014-07-01 2014-07-01 false High-voltage power centers and transformers; record of examination. 75.812-2 Section 75.812-2 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION... High-Voltage Distribution § 75.812-2 High-voltage power centers and transformers; record of examination...
30 CFR 75.812-2 - High-voltage power centers and transformers; record of examination.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false High-voltage power centers and transformers; record of examination. 75.812-2 Section 75.812-2 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION... High-Voltage Distribution § 75.812-2 High-voltage power centers and transformers; record of examination...
30 CFR 75.812-2 - High-voltage power centers and transformers; record of examination.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 1 2012-07-01 2012-07-01 false High-voltage power centers and transformers; record of examination. 75.812-2 Section 75.812-2 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION... High-Voltage Distribution § 75.812-2 High-voltage power centers and transformers; record of examination...
Alegre-Cortés, J; Soto-Sánchez, C; Pizá, Á G; Albarracín, A L; Farfán, F D; Felice, C J; Fernández, E
2016-07-15
Linear analysis has classically provided powerful tools for understanding the behavior of neural populations, but the neuron responses to real-world stimulation are nonlinear under some conditions, and many neuronal components demonstrate strong nonlinear behavior. In spite of this, temporal and frequency dynamics of neural populations to sensory stimulation have been usually analyzed with linear approaches. In this paper, we propose the use of Noise-Assisted Multivariate Empirical Mode Decomposition (NA-MEMD), a data-driven template-free algorithm, plus the Hilbert transform as a suitable tool for analyzing population oscillatory dynamics in a multi-dimensional space with instantaneous frequency (IF) resolution. The proposed approach was able to extract oscillatory information of neurophysiological data of deep vibrissal nerve and visual cortex multiunit recordings that were not evidenced using linear approaches with fixed bases such as the Fourier analysis. Texture discrimination analysis performance was increased when Noise-Assisted Multivariate Empirical Mode plus Hilbert transform was implemented, compared to linear techniques. Cortical oscillatory population activity was analyzed with precise time-frequency resolution. Similarly, NA-MEMD provided increased time-frequency resolution of cortical oscillatory population activity. Noise-Assisted Multivariate Empirical Mode Decomposition plus Hilbert transform is an improved method to analyze neuronal population oscillatory dynamics overcoming linear and stationary assumptions of classical methods. Copyright © 2016 Elsevier B.V. All rights reserved.
The origin of the Acheulean. Techno-functional study of the FLK W lithic record (Olduvai, Tanzania)
Diez-Martín, Fernando; Domínguez-Rodrigo, Manuel; Duque, Javier; Fraile, Cristina; Díaz, Isabel; de Francisco, Sara; Baquedano, Enrique; Mabulla, Audax
2017-01-01
The Acheulean materials documented in FLK West dated c. 1.7 Ma. are the focus of the present work. An original techno-functional approach is applied here to analyze the origin of Acheulean tools. According to the results, these tools were employed in different functional contexts in which tasks of different durations that transformed resources with different resistances were carried out. The exploitation of large and resistant resources suggests that the economic mechanism governing the manufacture of these tools was an increase in the demand of the work load. The decision processes underlying the production of these tools have thus an evident functional motivation. However, the presence of a refined handaxe in the studied sample indicates that the design form and production principles of handaxe manufacture were the result of an abrupt emergence rather than a long gradual development. The integration of mechanical and ergonomic investigation in our research has been crucial to explain how a core-and-flake industry gave way to a technology based on the production of large and heavy shaped tools. PMID:28767645
Inferring climate variability from skewed proxy records
NASA Astrophysics Data System (ADS)
Emile-Geay, J.; Tingley, M.
2013-12-01
Many paleoclimate analyses assume a linear relationship between the proxy and the target climate variable, and that both the climate quantity and the errors follow normal distributions. An ever-increasing number of proxy records, however, are better modeled using distributions that are heavy-tailed, skewed, or otherwise non-normal, on account of the proxies reflecting non-normally distributed climate variables, or having non-linear relationships with a normally distributed climate variable. The analysis of such proxies requires a different set of tools, and this work serves as a cautionary tale on the danger of making conclusions about the underlying climate from applications of classic statistical procedures to heavily skewed proxy records. Inspired by runoff proxies, we consider an idealized proxy characterized by a nonlinear, thresholded relationship with climate, and describe three approaches to using such a record to infer past climate: (i) applying standard methods commonly used in the paleoclimate literature, without considering the non-linearities inherent to the proxy record; (ii) applying a power transform prior to using these standard methods; (iii) constructing a Bayesian model to invert the mechanistic relationship between the climate and the proxy. We find that neglecting the skewness in the proxy leads to erroneous conclusions and often exaggerates changes in climate variability between different time intervals. In contrast, an explicit treatment of the skewness, using either power transforms or a Bayesian inversion of the mechanistic model for the proxy, yields significantly better estimates of past climate variations. We apply these insights in two paleoclimate settings: (1) a classical sedimentary record from Laguna Pallcacocha, Ecuador (Moy et al., 2002). Our results agree with the qualitative aspects of previous analyses of this record, but quantitative departures are evident and hold implications for how such records are interpreted, and compared to other proxy records. (2) a multiproxy reconstruction of temperature over the Common Era (Mann et al., 2009), where we find that about one third of the records display significant departures from normality. Accordingly, accounting for skewness in proxy predictors has a notable influence on both reconstructed global mean and spatial patterns of temperature change. Inferring climate variability from skewed proxy records thus requires cares, but can be done with relatively simple tools. References - Mann, M. E., Z. Zhang, S. Rutherford, R. S. Bradley, M. K. Hughes, D. Shindell, C. Ammann, G. Faluvegi, and F. Ni (2009), Global signatures and dynamical origins of the little ice age and medieval climate anomaly, Science, 326(5957), 1256-1260, doi:10.1126/science.1177303. - Moy, C., G. Seltzer, D. Rodbell, and D. Anderson (2002), Variability of El Niño/Southern Oscillation activ- ity at millennial timescales during the Holocene epoch, Nature, 420(6912), 162-165.
Optical asymmetric image encryption using gyrator wavelet transform
NASA Astrophysics Data System (ADS)
Mehra, Isha; Nishchal, Naveen K.
2015-11-01
In this paper, we propose a new optical information processing tool termed as gyrator wavelet transform to secure a fully phase image, based on amplitude- and phase-truncation approach. The gyrator wavelet transform constitutes four basic parameters; gyrator transform order, type and level of mother wavelet, and position of different frequency bands. These parameters are used as encryption keys in addition to the random phase codes to the optical cryptosystem. This tool has also been applied for simultaneous compression and encryption of an image. The system's performance and its sensitivity to the encryption parameters, such as, gyrator transform order, and robustness has also been analyzed. It is expected that this tool will not only update current optical security systems, but may also shed some light on future developments. The computer simulation results demonstrate the abilities of the gyrator wavelet transform as an effective tool, which can be used in various optical information processing applications, including image encryption, and image compression. Also this tool can be applied for securing the color image, multispectral, and three-dimensional images.
XMI2USE: A Tool for Transforming XMI to USE Specifications
NASA Astrophysics Data System (ADS)
Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.
The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.
Comparison of BrainTool to other UML modeling and model transformation tools
NASA Astrophysics Data System (ADS)
Nikiforova, Oksana; Gusarovs, Konstantins
2017-07-01
In the last 30 years there were numerous model generated software systems offered targeting problems with the development productivity and the resulting software quality. CASE tools developed due today's date are being advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) is calling similar arguments in regards to the Unified Modeling Language (UML) models at different levels of abstraction. It is being said that software development automation using CASE tools enables significant level of automation. Actual today's CASE tools are usually offering a combination of several features starting with a model editor and a model repository for a traditional ones and ending with code generator (that could be using a scripting or domain-specific (DSL) language), transformation tool to produce the new artifacts from the manually created and transformation definition editor to define new transformations for the most advanced ones. Present paper contains the results of CASE tool (mainly UML editors) comparison against the level of the automation they are offering.
NASA Astrophysics Data System (ADS)
Ravat, B.; Platteau, C.; Texier, G.; Oudot, B.; Delaunay, F.
2009-09-01
In order to investigate the martensitic transformation, an isothermal hold at -130 °C for 48 h was performed on a highly homogenized PuGa alloy. The modifications of the microstructure were characterized in situ thanks to a specific tool. This device was developed at the CEA-Valduc to analyze the crystalline structure of plutonium alloys as a function of temperature and more especially at low temperature using X-ray diffraction. The analysis of the recorded diffraction patterns highlighted that the martensitic transformation for this alloy is the result of a direct δ → α' + δ phase transformation. Moreover, a significant Bragg's peaks broadening corresponding to the δ-phase was observed. A microstructural analysis was made to characterize anisotropic microstrain resulting from the stress induced by the unit cell volume difference between the δ and α' phases. The amount of α'-phase evolved was analyzed within the framework of the Avrami theory in order to characterize the nucleation process. The results suggested that the growth mechanism corresponded to a general mechanism where the nucleation sites were in the δ-grain edges and the α'-phase had a plate-like morphology.
Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-01-01
Objectives To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Materials and methods Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Results Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations excluded were due to identified data quality issues in the source system, 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. Discussion The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of inclusion criteria, using the protocol and identified differences in patient characteristics and coding practices across databases. Conclusion Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. PMID:25670757
A Virtual Science Data Environment for Carbon Dioxide Observations
NASA Astrophysics Data System (ADS)
Verma, R.; Goodale, C. E.; Hart, A. F.; Law, E.; Crichton, D. J.; Mattmann, C. A.; Gunson, M. R.; Braverman, A. J.; Nguyen, H. M.; Eldering, A.; Castano, R.; Osterman, G. B.
2011-12-01
Climate science data are often distributed cross-institutionally and made available using heterogeneous interfaces. With respect to observational carbon-dioxide (CO2) records, these data span across national as well as international institutions and are typically distributed using a variety of data standards. Such an arrangement can yield challenges from a research perspective, as users often need to independently aggregate datasets as well as address the issue of data quality. To tackle this dispersion and heterogeneity of data, we have developed the CO2 Virtual Science Data Environment - a comprehensive approach to virtually integrating CO2 data and metadata from multiple missions and providing a suite of computational services that facilitate analysis, comparison, and transformation of that data. The Virtual Science Environment provides climate scientists with a unified web-based destination for discovering relevant observational data in context, and supports a growing range of online tools and services for analyzing and transforming the available data to suit individual research needs. It includes web-based tools to geographically and interactively search for CO2 observations collected from multiple airborne, space, as well as terrestrial platforms. Moreover, the data analysis services it provides over the Internet, including offering techniques such as bias estimation and spatial re-gridding, move computation closer to the data and reduce the complexity of performing these operations repeatedly and at scale. The key to enabling these services, as well as consolidating the disparate data into a unified resource, has been to focus on leveraging metadata descriptors as the foundation of our data environment. This metadata-centric architecture, which leverages the Dublin Core standard, forgoes the need to replicate remote datasets locally. Instead, the system relies upon an extensive, metadata-rich virtual data catalog allowing on-demand browsing and retrieval of CO2 records from multiple missions. In other words, key metadata information about remote CO2 records is stored locally while the data itself is preserved at its respective archive of origin. This strategy has been made possible by our method of encapsulating the heterogeneous sources of data using a common set of web-based services, including services provided by Jet Propulsion Laboratory's Climate Data Exchange (CDX). Furthermore, this strategy has enabled us to scale across missions, and to provide access to a broad array of CO2 observational data. Coupled with on-demand computational services and an intuitive web-portal interface, the CO2 Virtual Science Data Environment effectively transforms heterogeneous CO2 records from multiple sources into a unified resource for scientific discovery.
Using XML and XSLT for flexible elicitation of mental-health risk knowledge.
Buckingham, C D; Ahmed, A; Adams, A E
2007-03-01
Current tools for assessing risks associated with mental-health problems require assessors to make high-level judgements based on clinical experience. This paper describes how new technologies can enhance qualitative research methods to identify lower-level cues underlying these judgements, which can be collected by people without a specialist mental-health background. Content analysis of interviews with 46 multidisciplinary mental-health experts exposed the cues and their interrelationships, which were represented by a mind map using software that stores maps as XML. All 46 mind maps were integrated into a single XML knowledge structure and analysed by a Lisp program to generate quantitative information about the numbers of experts associated with each part of it. The knowledge was refined by the experts, using software developed in Flash to record their collective views within the XML itself. These views specified how the XML should be transformed by XSLT, a technology for rendering XML, which resulted in a validated hierarchical knowledge structure associating patient cues with risks. Changing knowledge elicitation requirements were accommodated by flexible transformations of XML data using XSLT, which also facilitated generation of multiple data-gathering tools suiting different assessment circumstances and levels of mental-health knowledge.
NASA Astrophysics Data System (ADS)
Abolfazl Hosseini, Seyed; Javaherian, Abdolrahim; Hassani, Hossien; Torabi, Siyavash; Sadri, Maryam
2015-06-01
Ground roll, which is a Rayleigh surface wave that exists in land seismic data, may mask reflections. Sometimes ground roll is spatially aliased. Attenuation of aliased ground roll is of importance in seismic data processing. Different methods have been developed to attenuate ground roll. The shearlet transform is a directional and multidimensional transform that generates subimages of an input image in different directions and scales. Events with different dips are separated in these subimages. In this study, the shearlet transform is used to attenuate the aliased ground roll. To do this, a shot record is divided into several segments, and the appropriate mute zone is defined for all segments. The shearlet transform is applied to each segment. The subimages related to the non-aliased and aliased ground roll are identified by plotting the energy distributions of subimages with visual checking. Then, muting filters are used on selected subimages. The inverse shearlet transform is applied to the filtered segment. This procedure is repeated for all segments. Finally, all filtered segments are merged using the Hanning window. This method of aliased ground roll attenuation was tested on a synthetic dataset and a field shot record from the west of Iran. The synthetic shot record included strong aliased ground roll, whereas the field shot record did not. To produce the strong aliased ground roll on the field shot record, the data were resampled in the offset direction from 30 to 60 m. To show the performance of the shearlet transform in attenuating the aliased ground roll, we compared the shearlet transform with the f-k filtering and curvelet transform. We showed that the performance of the shearlet transform in the aliased ground roll attenuation is better than that of the f-k filtering and curvelet transform in both the synthetic and field shot records. However, when the dip and frequency content of the aliased ground roll are the same as the reflections, ability of the shearlet transform is limited in attenuating the aliased ground roll.
Linear and Nonlinear Molecular Spectroscopy with Laser Frequency Combs
NASA Astrophysics Data System (ADS)
Picque, Nathalie
2013-06-01
The regular pulse train of a mode-locked femtosecond laser can give rise to a comb spectrum of millions of laser modes with a spacing precisely equal to the pulse repetition frequency. Laser frequency combs were conceived a decade ago as tools for the precision spectroscopy of atomic hydrogen. They are now becoming enabling tools for an increasing number of applications, including molecular spectroscopy. Recent experiments of multi-heterodyne frequency comb Fourier transform spectroscopy (also called dual-comb spectroscopy) have demonstrated that the precisely spaced spectral lines of a laser frequency comb can be harnessed for new techniques of linear absorption spectroscopy. The first proof-of-principle experiments have demonstrated a very exciting potential of dual-comb spectroscopy without moving parts for ultra-rapid and ultra-sensitive recording of complex broad spectral bandwidth molecular spectra. Compared to conventional Michelson-based Fourier transform spectroscopy, recording times could be shortened from seconds to microseconds, with intriguing prospects for spectroscopy of short lived transient species. The resolution improves proportionally to the measurement time. Therefore longer recordings allow high resolution spectroscopy of molecules with extreme precision, since the absolute frequency of each laser comb line can be known with the accuracy of an atomic clock. Moreover, since laser frequency combs involve intense ultrashort laser pulses, nonlinear interactions can be harnessed. Broad spectral bandwidth ultra-rapid nonlinear molecular spectroscopy and imaging with two laser frequency combs is demonstrated with coherent Raman effects and two-photon excitation. Real-time multiplex accessing of hyperspectral images may dramatically expand the range of applications of nonlinear microscopy. B. Bernhardt et al., Nature Photonics 4, 55-57 (2010); A. Schliesser et al. Nature Photonics 6, 440-449 (2012); T. Ideguchi et al. arXiv:1201.4177 (2012) T. Ideguchi et al., Optics letters 37, 4498-4500 (2012); T. Ideguchi et al. arXiv:1302.2414 (2013)
Singleton, Robyn; Picado Araúz, María de la Paz; Trocin, Kathleen; Winskell, Kate
2017-01-01
The use of narrative has become increasingly popular in the public health, community development, and education fields. Via emotionally engaging plotlines with authentic, captivating characters, stories provide an opportunity for participants to be carried away imaginatively into the characters' world while connecting the story with their own lived experiences. Stories have been highlighted as valuable tools in transformative learning. However, little published literature exists demonstrating applications of stories in group-based transformative learning curricula. This paper describes the creation of a narrative-based transformative learning tool based on an analysis of Nicaraguan adolescents' meaning-making around intimate partner violence (IPV) in their creative narratives. In collaboration with a Nicaraguan organization, US researchers analyzed a sample of narratives ( n = 55; 16 male-authored, 39 female-authored) on IPV submitted to a 2014 scriptwriting competition by adolescents aged 15-19. The data were particularly timely in that they responded to a new law protecting victims of gender-based violence, Law 779, and contradicted social-conservative claims that the Law 779 destroys family unity. We incorporated results from this analysis into the creation of the transformative learning tool, separated into thematic sections. The tool's sections (which comprise one story and three corresponding activities) aim to facilitate critical reflection, interpersonal dialogue, and self- and collective efficacy for social action around the following themes derived from the analysis: IPV and social support; IPV and romantic love; masculinity; warning signs of IPV; and sexual abuse. As a collaboration between a public health research team based at a US university and a Nicaraguan community-based organization, it demonstrates the potential in the age of increasingly smooth electronic communication for novel community-university partnerships to facilitate the development of narrative-based tools to support transformative learning.
Agile Model Driven Development of Electronic Health Record-Based Specialty Population Registries
Kannan, Vaishnavi; Fish, Jason C.; Willett, DuWayne L.
2018-01-01
The transformation of the American healthcare payment system from fee-for-service to value-based care increasingly makes it valuable to develop patient registries for specialized populations, to better assess healthcare quality and costs. Recent widespread adoption of Electronic Health Records (EHRs) in the U.S. now makes possible construction of EHR-based specialty registry data collection tools and reports, previously unfeasible using manual chart abstraction. But the complexities of specialty registry EHR tools and measures, along with the variety of stakeholders involved, can result in misunderstood requirements and frequent product change requests, as users first experience the tools in their actual clinical workflows. Such requirements churn could easily stall progress in specialty registry rollout. Modeling a system’s requirements and solution design can be a powerful way to remove ambiguities, facilitate shared understanding, and help evolve a design to meet newly-discovered needs. “Agile Modeling” retains these values while avoiding excessive unused up-front modeling in favor of iterative incremental modeling. Using Agile Modeling principles and practices, in calendar year 2015 one institution developed 58 EHR-based specialty registries, with 111 new data collection tools, supporting 134 clinical process and outcome measures, and enrolling over 16,000 patients. The subset of UML and non-UML models found most consistently useful in designing, building, and iteratively evolving EHR-based specialty registries included User Stories, Domain Models, Use Case Diagrams, Decision Trees, Graphical User Interface Storyboards, Use Case text descriptions, and Solution Class Diagrams. PMID:29750222
Agile Model Driven Development of Electronic Health Record-Based Specialty Population Registries.
Kannan, Vaishnavi; Fish, Jason C; Willett, DuWayne L
2016-02-01
The transformation of the American healthcare payment system from fee-for-service to value-based care increasingly makes it valuable to develop patient registries for specialized populations, to better assess healthcare quality and costs. Recent widespread adoption of Electronic Health Records (EHRs) in the U.S. now makes possible construction of EHR-based specialty registry data collection tools and reports, previously unfeasible using manual chart abstraction. But the complexities of specialty registry EHR tools and measures, along with the variety of stakeholders involved, can result in misunderstood requirements and frequent product change requests, as users first experience the tools in their actual clinical workflows. Such requirements churn could easily stall progress in specialty registry rollout. Modeling a system's requirements and solution design can be a powerful way to remove ambiguities, facilitate shared understanding, and help evolve a design to meet newly-discovered needs. "Agile Modeling" retains these values while avoiding excessive unused up-front modeling in favor of iterative incremental modeling. Using Agile Modeling principles and practices, in calendar year 2015 one institution developed 58 EHR-based specialty registries, with 111 new data collection tools, supporting 134 clinical process and outcome measures, and enrolling over 16,000 patients. The subset of UML and non-UML models found most consistently useful in designing, building, and iteratively evolving EHR-based specialty registries included User Stories, Domain Models, Use Case Diagrams, Decision Trees, Graphical User Interface Storyboards, Use Case text descriptions, and Solution Class Diagrams.
Apparatus for direct-to-digital spatially-heterodyned holography
Thomas, Clarence E.; Hanson, Gregory R.
2006-12-12
An apparatus operable to record a spatially low-frequency heterodyne hologram including spatially heterodyne fringes for Fourier analysis includes: a laser; a beamsplitter optically coupled to the laser; an object optically coupled to the beamsplitter; a focusing lens optically coupled to both the beamsplitter and the object; a digital recorder optically coupled to the focusing lens; and a computer that performs a Fourier transform, applies a digital filter, and performs an inverse Fourier transform. A reference beam and an object beam are focused by the focusing lens at a focal plane of the digital recorder to form a spatially low-frequency heterodyne hologram including spatially heterodyne fringes for Fourier analysis which is recorded by the digital recorder, and the computer transforms the recorded spatially low-frequency heterodyne hologram including spatially heterodyne fringes and shifts axes in Fourier space to sit on top of a heterodyne carrier frequency defined by an angle between the reference beam and the object beam and cuts off signals around an original origin before performing the inverse Fourier transform.
Almeida, Trevor P.; Kasama, Takeshi; Muxworthy, Adrian R.; Williams, Wyn; Nagy, Lesleis; Hansen, Thomas W.; Brown, Paul D.; Dunin-Borkowski, Rafal E.
2014-01-01
Magnetite (Fe3O4) is an important magnetic mineral to Earth scientists, as it carries the dominant magnetic signature in rocks, and the understanding of its magnetic recording fidelity provides a critical tool in the field of palaeomagnetism. However, reliable interpretation of the recording fidelity of Fe3O4 particles is greatly diminished over time by progressive oxidation to less magnetic iron oxides, such as maghemite (γ-Fe2O3), with consequent alteration of remanent magnetization potentially having important geological significance. Here we use the complementary techniques of environmental transmission electron microscopy and off-axis electron holography to induce and visualize the effects of oxidation on the magnetization of individual nanoscale Fe3O4 particles as they transform towards γ-Fe2O3. Magnetic induction maps demonstrate a change in both strength and direction of remanent magnetization within Fe3O4 particles in the size range dominant in rocks, confirming that oxidation can modify the original stored magnetic information. PMID:25300366
Almeida, Trevor P; Kasama, Takeshi; Muxworthy, Adrian R; Williams, Wyn; Nagy, Lesleis; Hansen, Thomas W; Brown, Paul D; Dunin-Borkowski, Rafal E
2014-10-10
Magnetite (Fe3O4) is an important magnetic mineral to Earth scientists, as it carries the dominant magnetic signature in rocks, and the understanding of its magnetic recording fidelity provides a critical tool in the field of palaeomagnetism. However, reliable interpretation of the recording fidelity of Fe3O4 particles is greatly diminished over time by progressive oxidation to less magnetic iron oxides, such as maghemite (γ-Fe2O3), with consequent alteration of remanent magnetization potentially having important geological significance. Here we use the complementary techniques of environmental transmission electron microscopy and off-axis electron holography to induce and visualize the effects of oxidation on the magnetization of individual nanoscale Fe3O4 particles as they transform towards γ-Fe2O3. Magnetic induction maps demonstrate a change in both strength and direction of remanent magnetization within Fe3O4 particles in the size range dominant in rocks, confirming that oxidation can modify the original stored magnetic information.
Montgomery, L D; Montgomery, R W; Guisado, R
1995-05-01
This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.
NASA Technical Reports Server (NTRS)
Montgomery, L. D.; Montgomery, R. W.; Guisado, R.
1995-01-01
This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.
Waste in health information systems: a systematic review.
Awang Kalong, Nadia; Yusof, Maryati
2017-05-08
Purpose The purpose of this paper is to discuss a systematic review on waste identification related to health information systems (HIS) in Lean transformation. Design/methodology/approach A systematic review was conducted on 19 studies to evaluate Lean transformation and tools used to remove waste related to HIS in clinical settings. Findings Ten waste categories were identified, along with their relationships and applications of Lean tool types related to HIS. Different Lean tools were used at the early and final stages of Lean transformation; the tool selection depended on the waste characteristic. Nine studies reported a positive impact from Lean transformation in improving daily work processes. The selection of Lean tools should be made based on the timing, purpose and characteristics of waste to be removed. Research limitations/implications Overview of waste and its category within HIS and its analysis from socio-technical perspectives enabled the identification of its root cause in a holistic and rigorous manner. Practical implications Understanding waste types, their root cause and review of Lean tools could subsequently lead to the identification of mitigation approach to prevent future error occurrence. Originality/value Specific waste models for HIS settings are yet to be developed. Hence, the identification of the waste categories could guide future implementation of Lean transformations in HIS settings.
Managing and Transforming Waste Streams – A Tool for Communities
The Managing and Transforming Waste Streams Tool features 100 policy and program options communities can pursue to increase rates of recycling, composting, waste reduction, and materials reuse across waste stream generators.
Employing health information technology in the real world to transform delivery.
Gold, Marsha
2013-11-01
Strong leadership and a supportive culture are critical to effective organizational transformation, but organizations pursuing change also need the infrastructure and tools to do so effectively. As policy makers seek to transform healthcare systems-specifically the delivery of care-we explore the real-world connection between health information technology (HIT) and the transformation of care delivery. This study is based on interviews with diverse federal and health system leaders and federal officials. The work was funded by the Office of the National Coordinator for Health Information Technology as part of a global assessment of the Health Information Technology for Economic and Clinical Health Act. The functionalities supported by HIT are integral to creating the information flow required for innovations such as medical homes, accountable care organizations, and bundled payment. However, such functionalities require much more than the presence of electronic health records; the data must also be liquid, integrated into the work flow, and used for analysis. Even in advanced systems, it takes years to create HIT infrastructure. Building this infrastructure and transforming delivery simultaneously is difficult, although probably unavoidable, for most providers. Progress will likely be slow and will require creative strategies that take into account the real-world environment of organizations and communities. While the rapid transformation of delivery and infrastructure is appealing, both types of change will take time and will progress unevenly across the nation. Policy makers serious about transforming the delivery of healthcare can benefit by recognizing these realities and developing practical strategies to deal with them over a relatively long period of time.
Adventures in transformations: TG, TA, oh my! (Poster abstract)
NASA Astrophysics Data System (ADS)
Ciocca, M.
2015-12-01
(Abstract only) AAVSO made available, through the great volunteer work of Gordon Myers and George Silvis, two very useful tools, Transform Generator and Transform Applier (TG and TA) for transforming instrumental magnitudes to the standard system. I will juxtapose the steps necessary to obtain transformation parameters "the old fashion way" and how can the same result be achieved with these two tools. I will present transformation parameters for the Eastern Kentucky University (EKU) telescope and obtained with the standard field M67. These parameters were applied to photometric results for AE Uma, a short-period, high-amplitude delta Scuti star (Period ~ 0.086 d).
About the Managing and Transforming Waste Streams Tool
The Managing and Transforming Waste Streams Tool was developed by a team of zero waste consultants and solid waste program managers making informed observations from hands-on work in communities, with contributions from EPA.
Voss, Erica A; Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-05-01
To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations excluded were due to identified data quality issues in the source system, 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of inclusion criteria, using the protocol and identified differences in patient characteristics and coding practices across databases. Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
[Development of a Text-Data Based Learning Tool That Integrates Image Processing and Displaying].
Shinohara, Hiroyuki; Hashimoto, Takeyuki
2015-01-01
We developed a text-data based learning tool that integrates image processing and displaying by Excel. Knowledge required for programing this tool is limited to using absolute, relative, and composite cell references and learning approximately 20 mathematical functions available in Excel. The new tool is capable of resolution translation, geometric transformation, spatial-filter processing, Radon transform, Fourier transform, convolutions, correlations, deconvolutions, wavelet transform, mutual information, and simulation of proton density-, T1-, and T2-weighted MR images. The processed images of 128 x 128 pixels or 256 x 256 pixels are observed directly within Excel worksheets without using any particular image display software. The results of image processing using this tool were compared with those using C language and the new tool was judged to have sufficient accuracy to be practically useful. The images displayed on Excel worksheets were compared with images using binary-data display software. This comparison indicated that the image quality of the Excel worksheets was nearly equal to the latter in visual impressions. Since image processing is performed by using text-data, the process is visible and facilitates making contrasts by using mathematical equations within the program. We concluded that the newly developed tool is adequate as a computer-assisted learning tool for use in medical image processing.
Integrated Personal Health Records: Transformative Tools for Consumer-Centric Care
Detmer, Don; Bloomrosen, Meryl; Raymond, Brian; Tang, Paul
2008-01-01
Background Integrated personal health records (PHRs) offer significant potential to stimulate transformational changes in health care delivery and self-care by patients. In 2006, an invitational roundtable sponsored by Kaiser Permanente Institute, the American Medical Informatics Association, and the Agency for Healthcare Research and Quality was held to identify the transformative potential of PHRs, as well as barriers to realizing this potential and a framework for action to move them closer to the health care mainstream. This paper highlights and builds on the insights shared during the roundtable. Discussion While there is a spectrum of dominant PHR models, (standalone, tethered, integrated), the authors state that only the integrated model has true transformative potential to strengthen consumers' ability to manage their own health care. Integrated PHRs improve the quality, completeness, depth, and accessibility of health information provided by patients; enable facile communication between patients and providers; provide access to health knowledge for patients; ensure portability of medical records and other personal health information; and incorporate auto-population of content. Numerous factors impede widespread adoption of integrated PHRs: obstacles in the health care system/culture; issues of consumer confidence and trust; lack of technical standards for interoperability; lack of HIT infrastructure; the digital divide; uncertain value realization/ROI; and uncertain market demand. Recent efforts have led to progress on standards for integrated PHRs, and government agencies and private companies are offering different models to consumers, but substantial obstacles remain to be addressed. Immediate steps to advance integrated PHRs should include sharing existing knowledge and expanding knowledge about them, building on existing efforts, and continuing dialogue among public and private sector stakeholders. Summary Integrated PHRs promote active, ongoing patient collaboration in care delivery and decision making. With some exceptions, however, the integrated PHR model is still a theoretical framework for consumer-centric health care. The authors pose questions that need to be answered so that the field can move forward to realize the potential of integrated PHRs. How can integrated PHRs be moved from concept to practical application? Would a coordinating body expedite this progress? How can existing initiatives and policy levers serve as catalysts to advance integrated PHRs? PMID:18837999
NASA Technical Reports Server (NTRS)
2010-01-01
Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.
Characterization and Simulation of Gunfire with Wavelets
Smallwood, David O.
1999-01-01
Gunfire is used as an example to show how the wavelet transform can be used to characterize and simulate nonstationary random events when an ensemble of events is available. The structural response to nearby firing of a high-firing rate gun has been characterized in several ways as a nonstationary random process. The current paper will explore a method to describe the nonstationary random process using a wavelet transform. The gunfire record is broken up into a sequence of transient waveforms each representing the response to the firing of a single round. A wavelet transform is performed on each of thesemore » records. The gunfire is simulated by generating realizations of records of a single-round firing by computing an inverse wavelet transform from Gaussian random coefficients with the same mean and standard deviation as those estimated from the previously analyzed gunfire record. The individual records are assembled into a realization of many rounds firing. A second-order correction of the probability density function is accomplished with a zero memory nonlinear function. The method is straightforward, easy to implement, and produces a simulated record much like the measured gunfire record.« less
Implementation of quantum and classical discrete fractional Fourier transforms.
Weimann, Steffen; Perez-Leija, Armando; Lebugle, Maxime; Keil, Robert; Tichy, Malte; Gräfe, Markus; Heilmann, René; Nolte, Stefan; Moya-Cessa, Hector; Weihs, Gregor; Christodoulides, Demetrios N; Szameit, Alexander
2016-03-23
Fourier transforms, integer and fractional, are ubiquitous mathematical tools in basic and applied science. Certainly, since the ordinary Fourier transform is merely a particular case of a continuous set of fractional Fourier domains, every property and application of the ordinary Fourier transform becomes a special case of the fractional Fourier transform. Despite the great practical importance of the discrete Fourier transform, implementation of fractional orders of the corresponding discrete operation has been elusive. Here we report classical and quantum optical realizations of the discrete fractional Fourier transform. In the context of classical optics, we implement discrete fractional Fourier transforms of exemplary wave functions and experimentally demonstrate the shift theorem. Moreover, we apply this approach in the quantum realm to Fourier transform separable and path-entangled biphoton wave functions. The proposed approach is versatile and could find applications in various fields where Fourier transforms are essential tools.
Implementation of quantum and classical discrete fractional Fourier transforms
Weimann, Steffen; Perez-Leija, Armando; Lebugle, Maxime; Keil, Robert; Tichy, Malte; Gräfe, Markus; Heilmann, René; Nolte, Stefan; Moya-Cessa, Hector; Weihs, Gregor; Christodoulides, Demetrios N.; Szameit, Alexander
2016-01-01
Fourier transforms, integer and fractional, are ubiquitous mathematical tools in basic and applied science. Certainly, since the ordinary Fourier transform is merely a particular case of a continuous set of fractional Fourier domains, every property and application of the ordinary Fourier transform becomes a special case of the fractional Fourier transform. Despite the great practical importance of the discrete Fourier transform, implementation of fractional orders of the corresponding discrete operation has been elusive. Here we report classical and quantum optical realizations of the discrete fractional Fourier transform. In the context of classical optics, we implement discrete fractional Fourier transforms of exemplary wave functions and experimentally demonstrate the shift theorem. Moreover, we apply this approach in the quantum realm to Fourier transform separable and path-entangled biphoton wave functions. The proposed approach is versatile and could find applications in various fields where Fourier transforms are essential tools. PMID:27006089
Jian, Bo; Hou, Wensheng; Wu, Cunxiang; Liu, Bin; Liu, Wei; Song, Shikui; Bi, Yurong; Han, Tianfu
2009-06-25
Transgenic approaches provide a powerful tool for gene function investigations in plants. However, some legumes are still recalcitrant to current transformation technologies, limiting the extent to which functional genomic studies can be performed on. Superroot of Lotus corniculatus is a continuous root cloning system allowing direct somatic embryogenesis and mass regeneration of plants. Recently, a technique to obtain transgenic L. corniculatus plants from Superroot-derived leaves through A. tumefaciens-mediated transformation was described. However, transformation efficiency was low and it took about six months from gene transfer to PCR identification. In the present study, we developed an A. rhizogenes-mediated transformation of Superroot-derived L. corniculatus for gene function investigation, combining the efficient A. rhizogenes-mediated transformation and the rapid regeneration system of Superroot. The transformation system using A. rhizogenes K599 harbouring pGFPGUSPlus was improved by validating some parameters which may influence the transformation frequency. Using stem sections with one node as explants, a 2-day pre-culture of explants, infection with K599 at OD(600) = 0.6, and co-cultivation on medium (pH 5.4) at 22 degrees C for 2 days enhanced the transformation frequency significantly. As proof of concept, Superroot-derived L. corniculatus was transformed with a gene from wheat encoding an Na+/H+ antiporter (TaNHX2) using the described system. Transgenic Superroot plants were obtained and had increased salt tolerance, as expected from the expression of TaNHX2. A rapid and efficient tool for gene function investigation in L. corniculatus was developed, combining the simplicity and high efficiency of the Superroot regeneration system and the availability of A. rhizogenes-mediated transformation. This system was improved by validating some parameters influencing the transformation frequency, which could reach 92% based on GUS detection. The combination of the highly efficient transformation and the regeneration system of Superroot provides a valuable tool for functional genomics studies in L. corniculatus.
Developing Visual Thinking in the Electronic Health Record.
Boyd, Andrew D; Young, Christine D; Amatayakul, Margret; Dieter, Michael G; Pawola, Lawrence M
2017-01-01
The purpose of this vision paper is to identify how data visualization could transform healthcare. Electronic Health Records (EHRs) are maturing with new technology and tools being applied. Researchers are reaping the benefits of data visualization to better access compilations of EHR data for enhanced clinical research. Data visualization, while still primarily the domain of clinical researchers, is beginning to show promise for other stakeholders. A non-exhaustive review of the literature indicates that respective to the growth and development of the EHR, the maturity of data visualization in healthcare is in its infancy. Visual analytics has been only cursorily applied to healthcare. A fundamental issue contributing to fragmentation and poor coordination of healthcare delivery is that each member of the healthcare team, including patients, has a different view. Summarizing all of this care comprehensively for any member of the healthcare team is a "wickedly hard" visual analytics and data visualization problem to solve.
Chan, Lenny L. S.; Fouts, Michelle M.; Murphy, Elizabeth J.
2017-01-01
Widespread electronic health record (EHR) implementation creates new challenges in the diabetes care of complex and diverse populations, including safe medication prescribing for patients with limited health literacy and limited English proficiency. This review highlights how the EHR electronic prescribing transformation has affected diabetes care for vulnerable patients and offers recommendations for improving patient safety through EHR electronic prescribing design, implementation, policy, and research. Specifically, we present evidence for (1) the adoption of RxNorm; (2) standardized naming and picklist options for high alert medications such as insulin; (3) the widespread implementation of universal medication schedule and language-concordant labels, with the expansion of electronic prescription 140-character limit; (4) enhanced bidirectional communication with pharmacy partners; and (5) informatics and implementation research in safety net healthcare systems to examine how EHR tools and practices affect diverse vulnerable populations. PMID:28197420
Ratanawongsa, Neda; Chan, Lenny L S; Fouts, Michelle M; Murphy, Elizabeth J
2017-01-01
Widespread electronic health record (EHR) implementation creates new challenges in the diabetes care of complex and diverse populations, including safe medication prescribing for patients with limited health literacy and limited English proficiency. This review highlights how the EHR electronic prescribing transformation has affected diabetes care for vulnerable patients and offers recommendations for improving patient safety through EHR electronic prescribing design, implementation, policy, and research. Specifically, we present evidence for (1) the adoption of RxNorm; (2) standardized naming and picklist options for high alert medications such as insulin; (3) the widespread implementation of universal medication schedule and language-concordant labels, with the expansion of electronic prescription 140-character limit; (4) enhanced bidirectional communication with pharmacy partners; and (5) informatics and implementation research in safety net healthcare systems to examine how EHR tools and practices affect diverse vulnerable populations.
NASA Astrophysics Data System (ADS)
Hoell, Simon; Omenzetter, Piotr
2017-07-01
Considering jointly damage sensitive features (DSFs) of signals recorded by multiple sensors, applying advanced transformations to these DSFs and assessing systematically their contribution to damage detectability and localisation can significantly enhance the performance of structural health monitoring systems. This philosophy is explored here for partial autocorrelation coefficients (PACCs) of acceleration responses. They are interrogated with the help of the linear discriminant analysis based on the Fukunaga-Koontz transformation using datasets of the healthy and selected reference damage states. Then, a simple but efficient fast forward selection procedure is applied to rank the DSF components with respect to statistical distance measures specialised for either damage detection or localisation. For the damage detection task, the optimal feature subsets are identified based on the statistical hypothesis testing. For damage localisation, a hierarchical neuro-fuzzy tool is developed that uses the DSF ranking to establish its own optimal architecture. The proposed approaches are evaluated experimentally on data from non-destructively simulated damage in a laboratory scale wind turbine blade. The results support our claim of being able to enhance damage detectability and localisation performance by transforming and optimally selecting DSFs. It is demonstrated that the optimally selected PACCs from multiple sensors or their Fukunaga-Koontz transformed versions can not only improve the detectability of damage via statistical hypothesis testing but also increase the accuracy of damage localisation when used as inputs into a hierarchical neuro-fuzzy network. Furthermore, the computational effort of employing these advanced soft computing models for damage localisation can be significantly reduced by using transformed DSFs.
TREPS, a tool for coordinate and time transformations in space physics
NASA Astrophysics Data System (ADS)
Génot, V.; Renard, B.; Dufourg, N.; Bouchemit, M.; Lormant, N.; Beigbeder, L.; Popescu, D.; Toniutti, J.-P.; André, N.; Pitout, F.; Jacquey, C.; Cecconi, B.; Gangloff, M.
2018-01-01
We present TREPS (Transformation de REpères en Physique Spatiale) an online tool to perform coordinate transformations commonly used in planetology and heliophysics. It is based on SPICE kernels developed by NASA/NAIF. Its usage is straightforward, with a 4-step process, including various import/export options. Interoperability with external services is available through Virtual Observatory technology which is illustrated in a use case.
Transformational change in action.
Lucy, Botting
2011-02-01
In recent years, there has been an imperative to improve productivity in the NHS. This requires the use of appropriate tools and appropriate leadership at all levels of organisations. This article outlines to concept of transformational change, discusses productivity as a national driver for improving healthcare services, looks at the most appropriate tools to use at various levels of organisations to achieve this, and considers why leadership is a vital component in transformational change.
NASA Technical Reports Server (NTRS)
Stoms, R. M.
1984-01-01
Numerically-controlled 5-axis machine tool uses transformer and meter to determine and indicate whether tool is in home position, but lacks built-in test mode to check them. Tester makes possible test, and repair of components at machine rather then replace them when operation seems suspect.
An analysis of human-induced land transformations in the San Francisco Bay/Sacramento area
Kirtland, David A.; Gaydos, L.J.; Clarke, Keith; DeCola, Lee; Acevedo, William; Bell, Cindy
1994-01-01
Part of the U.S. Geological Survey's Global Change Research Program involvesstudying the area from the Pacific Ocean to the Sierra foothills to enhance understanding ofthe role that human activities play in global change. The study investigates the ways thathumans transform the land and the effects that changing the landscape may have on regionaland global systems. To accomplish this research, scientists are compiling records ofhistorical transformations in the region's land cover over the last 140 years, developing asimulation model to predict land cover change, and assembling a digital data set to analyzeand describe land transformations. The historical data regarding urban growth focusattention on the significant change the region underwent from 1850 to 1990. Animation isused to visualize a time series of the change in land cover. The historical change is beingused to calibrate a prototype cellular automata model, developed to predict changes in urbanland cover 100 years into the future. Future urban growth scenarios will be developed foranalyzing possible human-induced impacts on land cover at a regional scale. These data aidin documenting and understanding human-induced land transformations from both historical andpredictive perspectives. A descriptive analysis of the region is used to investigate therelationships among data characteristic of the region. These data consist of multilayertopography, climate, vegetation, and population data for a 256-km2 region of centralCalifornia. A variety of multivariate analysis tools are used to integrate the data inraster format from map contours, interpolated climate observations, satellite observations,and population estimates.
Beaudette, Kahlia; Hughes, Tia M; Marcus, Jeffrey M
2014-01-01
Germline transformation with transposon vectors is an important tool for insect genetics, but progress in developing transformation protocols for butterflies has been limited by high post-injection ova mortality. Here we present an improved glass injection needle design for injecting butterfly ova that increases survival in three Nymphalid butterfly species. Using the needles to genetically transform the common buckeye butterfly Junonia coenia, the hatch rate for injected Junonia ova was 21.7%, the transformation rate was 3%, and the overall experimental efficiency was 0.327%, a substantial improvement over previous results in other butterfly species. Improved needle design and a higher efficiency of transformation should permit the deployment of transposon-based genetic tools in a broad range of less fecund lepidopteran species.
Shahmoradi, Leila; Safadari, Reza; Jimma, Worku
2017-09-01
Healthcare is a knowledge driven process and thus knowledge management and the tools to manage knowledge in healthcare sector are gaining attention. The aim of this systematic review is to investigate knowledge management implementation and knowledge management tools used in healthcare for informed decision making. Three databases, two journals websites and Google Scholar were used as sources for the review. The key terms used to search relevant articles include: "Healthcare and Knowledge Management"; "Knowledge Management Tools in Healthcare" and "Community of Practices in healthcare". It was found that utilization of knowledge management in healthcare is encouraging. There exist numbers of opportunities for knowledge management implementation, though there are some barriers as well. Some of the opportunities that can transform healthcare are advances in health information and communication technology, clinical decision support systems, electronic health record systems, communities of practice and advanced care planning. Providing the right knowledge at the right time, i.e., at the point of decision making by implementing knowledge management in healthcare is paramount. To do so, it is very important to use appropriate tools for knowledge management and user-friendly system because it can significantly improve the quality and safety of care provided for patients both at hospital and home settings.
NASA Astrophysics Data System (ADS)
Pandey, Chhavi P.
2017-10-01
Wavelet analysis is a powerful mathematical and computational tool to study periodic phenomena in time series particu-larly in the presence of potential frequency changes in time. Continuous wavelet transformation (CWT) provides localised spectral information of the analysed dataset and in particular useful to study multiscale, nonstationary processes occurring over finite spatial and temporal domains. In the present work, oxygen-isotope ratio from the plantonic foraminifera species (viz. Globigerina bul-loides and Globigerinoides ruber) acquired from the broad central plateau of the Maldives ridge situated in south-eastern Arabian sea have been used as climate proxy. CWT of the time series generated using both the biofacies indicate spectro-temporal varia-tion of the natural climatic cycles. The dominant period resembles to the period of Milankovitch glacial-interglacial cycle. Apart from that, various other cycles are present in the time series. The results are in good agreement with the astronomical theory of paleoclimates and can provide better visualisation of Indian summer monsoon in the context of climate change.
Visualization index for image-enabled medical records
NASA Astrophysics Data System (ADS)
Dong, Wenjie; Zheng, Weilin; Sun, Jianyong; Zhang, Jianguo
2011-03-01
With the widely use of healthcare information technology in hospitals, the patients' medical records are more and more complex. To transform the text- or image-based medical information into easily understandable and acceptable form for human, we designed and developed an innovation indexing method which can be used to assign an anatomical 3D structure object to every patient visually to store indexes of the patients' basic information, historical examined image information and RIS report information. When a doctor wants to review patient historical records, he or she can first load the anatomical structure object and the view the 3D index of this object using a digital human model tool kit. This prototype system helps doctors to easily and visually obtain the complete historical healthcare status of patients, including large amounts of medical data, and quickly locate detailed information, including both reports and images, from medical information systems. In this way, doctors can save time that may be better used to understand information, obtain a more comprehensive understanding of their patients' situations, and provide better healthcare services to patients.
Transforming Dance History: The Lost History of Rehearsals.
ERIC Educational Resources Information Center
Hodes, Stuart
1989-01-01
Explains that an important aspect of dance history is lost by not recording dance rehearsals. Argues that recording rehearsals can reveal the creative process and illuminate the environment that engendered this art form. Concludes that a transformed dance history will influence curriculum development. (GG)
Village Voices, Global Visions: Digital Video as a Transformative Foreign Language Learning Tool
ERIC Educational Resources Information Center
Goulah, Jason
2007-01-01
This instrumental case study examines how adolescent high-intermediate Japanese language learners enrolled in a one-month credited abroad program used video as a mediational tool for (1) learning foreign language, content, and technology skills, (2) cultivating critical multiliteracies and transformative learning regarding geopolitics and the…
Transforming the History Curriculum with Geospatial Tools
ERIC Educational Resources Information Center
Hammond, Thomas
2014-01-01
Martorella's "sleeping giant" is awakening via geospatial tools. As this technology is adopted, it will transform the history curriculum in three ways: deepening curricular content, making conceptual frameworks more prominent, and increasing connections to local history. These changes may not be profound and they may not be sudden,…
Tilson, Julie K; Loeb, Kathryn; Barbosa, Sabrina; Jiang, Fei; Lee, Karin T
2016-04-01
Physical therapists strive to integrate research into daily practice. The tablet computer is a potentially transformational tool for accessing information within the clinical practice environment. The purpose of this study was to measure and describe patterns of tablet computer use among physical therapy students during clinical rotation experiences. Doctor of physical therapy students (n = 13 users) tracked their use of tablet computers (iPad), loaded with commercially available apps, during 16 clinical experiences (6-16 weeks in duration). The tablets were used on 70% of 691 clinic days, averaging 1.3 uses per day. Information seeking represented 48% of uses; 33% of those were foreground searches for research articles and syntheses and 66% were for background medical information. Other common uses included patient education (19%), medical record documentation (13%), and professional communication (9%). The most frequently used app was Safari, the preloaded web browser (representing 281 [36.5%] incidents of use). Users accessed 56 total apps to support clinical practice. Physical therapy students successfully integrated use of a tablet computer into their clinical experiences including regular activities of information seeking. Our findings suggest that the tablet computer represents a potentially transformational tool for promoting knowledge translation in the clinical practice environment.Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, http://links.lww.com/JNPT/A127).
Virtual detector theory for strong-field atomic ionization
NASA Astrophysics Data System (ADS)
Wang, Xu; Tian, Justin; Eberly, J. H.
2018-04-01
A virtual detector (VD) is an imaginary device located at a fixed position in space that extracts information from the wave packet passing through it. By recording the particle momentum and the corresponding probability current at each time, the VDs can accumulate and build the differential momentum distribution of the particle, in a way that resembles real experiments. A mathematical proof is given for the equivalence of the differential momentum distribution obtained by the VD method and by Fourier transforming the wave function. In addition to being a tool for reducing the computational load, VDs have also been found useful in interpreting the ultrafast strong-field ionization process, especially the controversial quantum tunneling process.
Wavelet-Based Motion Artifact Removal for Electrodermal Activity
Chen, Weixuan; Jaques, Natasha; Taylor, Sara; Sano, Akane; Fedor, Szymon; Picard, Rosalind W.
2017-01-01
Electrodermal activity (EDA) recording is a powerful, widely used tool for monitoring psychological or physiological arousal. However, analysis of EDA is hampered by its sensitivity to motion artifacts. We propose a method for removing motion artifacts from EDA, measured as skin conductance (SC), using a stationary wavelet transform (SWT). We modeled the wavelet coefficients as a Gaussian mixture distribution corresponding to the underlying skin conductance level (SCL) and skin conductance responses (SCRs). The goodness-of-fit of the model was validated on ambulatory SC data. We evaluated the proposed method in comparison with three previous approaches. Our method achieved a greater reduction of artifacts while retaining motion-artifact-free data. PMID:26737714
The Increasing Use of Remote Sensing Data in Studying the Climatological Impacts on Public Health
NASA Technical Reports Server (NTRS)
Kempler, Steven; Benedict, Karl; Ceccato, Pietro; Golden, Meredith; Maxwell, Susan; Morian, Stan; Soebiyanto, Radina; Tong, Daniel
2011-01-01
One of the more fortunate outcomes of the capture and transformation of remote sensing data into applied information is their usefulness and impacts to better understanding climatological impacts on public health. Today, with petabytes of remote sensing data providing global coverage of climatological parameters, public health research and policy decision makers have an unprecedented (and growing) data record that relates the effects of climatic parameters, such as rainfall, heat, soil moisture, etc. to incidences and spread of disease, as well as predictive modeling. In addition, tools and services that specifically serve public health researchers and respondents have grown in response to needs of the these information users.
Anticipating land surface change
Streeter, Richard; Dugmore, Andrew J.
2013-01-01
The interplay of human actions and natural processes over varied spatial and temporal scales can result in abrupt transitions between contrasting land surface states. Understanding these transitions is a key goal of sustainability science because they can represent abrupt losses of natural capital. This paper recognizes flickering between alternate land surface states in advance of threshold change and critical slowing down in advance of both threshold changes and noncritical transformation. The early warning signals we observe are rises in autocorrelation, variance, and skewness within millimeter-resolution thickness measurements of tephra layers deposited in A.D. 2010 and A.D. 2011. These signals reflect changing patterns of surface vegetation, which are known to provide early warning signals of critical transformations. They were observed toward migrating soil erosion fronts, cryoturbation limits, and expanding deflation zones, thus providing potential early warning signals of land surface change. The record of the spatial patterning of vegetation contained in contemporary tephra layers shows how proximity to land surface change could be assessed in the widespread regions affected by shallow layers of volcanic fallout (those that can be subsumed within the existing vegetation cover). This insight shows how we could use tephra layers in the stratigraphic record to identify “near misses,” close encounters with thresholds that did not lead to tipping points, and thus provide additional tools for archaeology, sustainability science, and contemporary land management. PMID:23530230
A New Continuous Cooling Transformation Diagram for AISI M4 High-Speed Tool Steel
NASA Astrophysics Data System (ADS)
Briki, Jalel; Ben Slima, Souad
2008-12-01
The increasing evolution of dilatometric techniques now allows for the identification of structural transformations with very low signal. The use of dilatometric techniques coupled with more common techniques, such as metallographic, hardness testing, and x-ray diffraction allows to plot a new CCT diagram for AISI M4 high-speed tool steel. This diagram is useful for a better selection of alternate solutions, hardening, and tempering heat treatments. More accurate determination of the various fields of transformation of austenite during its cooling was made. The precipitation of carbides highlighted at high temperature is at the origin of the martrensitic transformation into two stages (splitting phenomena). For slow cooling rates, it was possible to highlight the ferritic, pearlitic, and bainitic transformation.
Insect transformation with piggyBac: getting the number of injections just right
Morrison, N. I.; Shimeld, S. M.
2016-01-01
Abstract The insertion of exogenous genetic cargo into insects using transposable elements is a powerful research tool with potential applications in meeting food security and public health challenges facing humanity. piggyBac is the transposable element most commonly utilized for insect germline transformation. The described efficiency of this process is variable in the published literature, and a comprehensive review of transformation efficiency in insects is lacking. This study compared and contrasted all available published data with a comprehensive data set provided by a biotechnology group specializing in insect transformation. Based on analysis of these data, with particular focus on the more complete observational data from the biotechnology group, we designed a decision tool to aid researchers' decision‐making when using piggyBac to transform insects by microinjection. A combination of statistical techniques was used to define appropriate summary statistics of piggyBac transformation efficiency by species and insect order. Publication bias was assessed by comparing the data sets. The bias was assessed using strategies co‐opted from the medical literature. The work culminated in building the Goldilocks decision tool, a Markov‐Chain Monte‐Carlo simulation operated via a graphical interface and providing guidance on best practice for those seeking to transform insects using piggyBac. PMID:27027400
Chemical Transformation Simulator
The Chemical Transformation Simulator (CTS) is a web-based, high-throughput screening tool that automates the calculation and collection of physicochemical properties for an organic chemical of interest and its predicted products resulting from transformations in environmental sy...
An Internal Coaxial Cable Electrical Connector For Use In Downhole Tools
Hall, David R.; Hall, Jr., H. Tracy; Pixton, David S.; Dahlgren, Scott; Fox, Joe; Sneddon, Cameron; Briscoe, Michael
2005-11-29
A coaxial cable electrical connector more specifically an internal coaxial cable connector placed within a coaxial cable and its constituent components. A coaxial cable connector is in electrical communcation with an inductive transformer and a coaxial cable. The connector is in electrical communication with the outer housing of the inductive transfonner. A generally coaxial center conductor, a portion of which could be the coil in the inductive transformer, passes through the connector, is electrically insulated from the connector, and is in electrical communication with the conductive care of the coaxial cable. A plurality of bulbous pliant tabs on the coaxial cable connector mechanically engage the inside diameter of the coaxial cable thus grounding the transformer to the coaxial cable. The coaxial cable and inductive transformer are disposed within downhole tools to transmit electrical signals between downhole tools within a drill string.
Relationship between the Full Range Leadership Model and Information Technology Tools Usage
ERIC Educational Resources Information Center
Landell, Antonio White
2013-01-01
Due to major technological and social changes, world dynamics have undergone tremendous leadership style and technology transitions. The transformation of information technology tools usage (ITTU) created a new paradigm confronting leaders that can provide the right change of vision to effectively motivate, inspire, and transform others to work at…
Dynamic-ETL: a hybrid approach for health data extraction, transformation and loading.
Ong, Toan C; Kahn, Michael G; Kwan, Bethany M; Yamashita, Traci; Brandt, Elias; Hosokawa, Patrick; Uhrich, Chris; Schilling, Lisa M
2017-09-13
Electronic health records (EHRs) contain detailed clinical data stored in proprietary formats with non-standard codes and structures. Participating in multi-site clinical research networks requires EHR data to be restructured and transformed into a common format and standard terminologies, and optimally linked to other data sources. The expertise and scalable solutions needed to transform data to conform to network requirements are beyond the scope of many health care organizations and there is a need for practical tools that lower the barriers of data contribution to clinical research networks. We designed and implemented a health data transformation and loading approach, which we refer to as Dynamic ETL (Extraction, Transformation and Loading) (D-ETL), that automates part of the process through use of scalable, reusable and customizable code, while retaining manual aspects of the process that requires knowledge of complex coding syntax. This approach provides the flexibility required for the ETL of heterogeneous data, variations in semantic expertise, and transparency of transformation logic that are essential to implement ETL conventions across clinical research sharing networks. Processing workflows are directed by the ETL specifications guideline, developed by ETL designers with extensive knowledge of the structure and semantics of health data (i.e., "health data domain experts") and target common data model. D-ETL was implemented to perform ETL operations that load data from various sources with different database schema structures into the Observational Medical Outcome Partnership (OMOP) common data model. The results showed that ETL rule composition methods and the D-ETL engine offer a scalable solution for health data transformation via automatic query generation to harmonize source datasets. D-ETL supports a flexible and transparent process to transform and load health data into a target data model. This approach offers a solution that lowers technical barriers that prevent data partners from participating in research data networks, and therefore, promotes the advancement of comparative effectiveness research using secondary electronic health data.
Enhancing Understanding of Transformation Matrices
ERIC Educational Resources Information Center
Dick, Jonathan; Childrey, Maria
2012-01-01
With the Common Core State Standards' emphasis on transformations, teachers need a variety of approaches to increase student understanding. Teaching matrix transformations by focusing on row vectors gives students tools to create matrices to perform transformations. This empowerment opens many doors: Students are able to create the matrices for…
The experiences of undergraduate nursing students with bots in Second LifeRTM
NASA Astrophysics Data System (ADS)
Rose, Lesele H.
As technology continues to transform education from the status quo of traditional lecture-style instruction to an interactive engaging learning experience, students' experiences within the learning environment continues to change as well. This dissertation addressed the need for continuing research in advancing implementation of technology in higher education. The purpose of this phenomenological study was to discover more about the experiences of undergraduate nursing students using standardized geriatric evaluation tools when interacting with scripted geriatric patient bots tools in a simulated instructional intake setting. Data was collected through a Demographics questionnaire, an Experiential questionnaire, and a Reflection questionnaire. Triangulation of data collection occurred through an automatically created log of the interactions with the two bots, and by an automatically recorded log of the participants' movements while in the simulated geriatric intake interview. The data analysis consisted of an iterative review of the questionnaires and the participants' logs in an effort to identify common themes, recurring comments, and issues which would benefit from further exploration. Findings revealed that the interactions with the bots were perceived as a valuable experience for the participants from the perspective of interacting with the Geriatric Evaluation Tools in the role of an intake nurse. Further research is indicated to explore instructional interactions with bots in effectively mastering the use of established Geriatric Evaluation Tools.
Peng, Hong; Hu, Bin; Shi, Qiuxia; Ratcliffe, Martyn; Zhao, Qinglin; Qi, Yanbing; Gao, Guoping
2013-05-01
A new model to remove ocular artifacts (OA) from electroencephalograms (EEGs) is presented. The model is based on discrete wavelet transformation (DWT) and adaptive noise cancellation (ANC). Using simulated and measured data, the accuracy of the model is compared with the accuracy of other existing methods based on stationary wavelet transforms and our previous work based on wavelet packet transform and independent component analysis. A particularly novel feature of the new model is the use of DWTs to construct an OA reference signal, using the three lowest frequency wavelet coefficients of the EEGs. The results show that the new model demonstrates an improved performance with respect to the recovery of true EEG signals and also has a better tracking performance. Because the new model requires only single channel sources, it is well suited for use in portable environments where constraints with respect to acceptable wearable sensor attachments usually dictate single channel devices. The model is also applied and evaluated against data recorded within the EUFP 7 Project--Online Predictive Tools for Intervention in Mental Illness (OPTIMI). The results show that the proposed model is effective in removing OAs and meets the requirements of portable systems used for patient monitoring as typified by the OPTIMI project.
Digital Doctorates? An Exploratory Study of PhD Candidates' Use of Online Tools
ERIC Educational Resources Information Center
Dowling, Robyn; Wilson, Michael
2017-01-01
Online environments are transforming learning, including doctoral education. Yet the ways in which the PhD experience is shaped and transformed through these digital modes of engagement is seldom addressed, and not systematically understood. In this article, we explore PhD students' perceptions and use of digital tools. Drawing on the results of…
A phylogenetic transform enhances analysis of compositional microbiota data.
Silverman, Justin D; Washburne, Alex D; Mukherjee, Sayan; David, Lawrence A
2017-02-15
Surveys of microbial communities (microbiota), typically measured as relative abundance of species, have illustrated the importance of these communities in human health and disease. Yet, statistical artifacts commonly plague the analysis of relative abundance data. Here, we introduce the PhILR transform, which incorporates microbial evolutionary models with the isometric log-ratio transform to allow off-the-shelf statistical tools to be safely applied to microbiota surveys. We demonstrate that analyses of community-level structure can be applied to PhILR transformed data with performance on benchmarks rivaling or surpassing standard tools. Additionally, by decomposing distance in the PhILR transformed space, we identified neighboring clades that may have adapted to distinct human body sites. Decomposing variance revealed that covariation of bacterial clades within human body sites increases with phylogenetic relatedness. Together, these findings illustrate how the PhILR transform combines statistical and phylogenetic models to overcome compositional data challenges and enable evolutionary insights relevant to microbial communities.
Cong, Fengyu; Leppänen, Paavo H T; Astikainen, Piia; Hämäläinen, Jarmo; Hietanen, Jari K; Ristaniemi, Tapani
2011-09-30
The present study addresses benefits of a linear optimal filter (OF) for independent component analysis (ICA) in extracting brain event-related potentials (ERPs). A filter such as the digital filter is usually considered as a denoising tool. Actually, in filtering ERP recordings by an OF, the ERP' topography should not be changed by the filter, and the output should also be able to be modeled by the linear transformation. Moreover, an OF designed for a specific ERP source or component may remove noise, as well as reduce the overlap of sources and even reject some non-targeted sources in the ERP recordings. The OF can thus accomplish both the denoising and dimension reduction (reducing the number of sources) simultaneously. We demonstrated these effects using two datasets, one containing visual and the other auditory ERPs. The results showed that the method including OF and ICA extracted much more reliable components than the sole ICA without OF did, and that OF removed some non-targeted sources and made the underdetermined model of EEG recordings approach to the determined one. Thus, we suggest designing an OF based on the properties of an ERP to filter recordings before using ICA decomposition to extract the targeted ERP component. Copyright © 2011 Elsevier B.V. All rights reserved.
Content-based video indexing and searching with wavelet transformation
NASA Astrophysics Data System (ADS)
Stumpf, Florian; Al-Jawad, Naseer; Du, Hongbo; Jassim, Sabah
2006-05-01
Biometric databases form an essential tool in the fight against international terrorism, organised crime and fraud. Various government and law enforcement agencies have their own biometric databases consisting of combination of fingerprints, Iris codes, face images/videos and speech records for an increasing number of persons. In many cases personal data linked to biometric records are incomplete and/or inaccurate. Besides, biometric data in different databases for the same individual may be recorded with different personal details. Following the recent terrorist atrocities, law enforcing agencies collaborate more than before and have greater reliance on database sharing. In such an environment, reliable biometric-based identification must not only determine who you are but also who else you are. In this paper we propose a compact content-based video signature and indexing scheme that can facilitate retrieval of multiple records in face biometric databases that belong to the same person even if their associated personal data are inconsistent. We shall assess the performance of our system using a benchmark audio visual face biometric database that has multiple videos for each subject but with different identity claims. We shall demonstrate that retrieval of relatively small number of videos that are nearest, in terms of the proposed index, to any video in the database results in significant proportion of that individual biometric data.
NASA Tech Briefs, November 2013
NASA Technical Reports Server (NTRS)
2013-01-01
Topics include: Cryogenic Liquid Sample Acquisition System for Remote Space Applications; 5 Spatial Statistical Data Fusion (SSDF); GPS Estimates of Integrated Precipitable Water Aid Weather Forecasters; Integrating a Microwave Radiometer into Radar Hardware for Simultaneous Data Collection Between the Instruments; Rapid Detection of Herpes Viruses for Clinical Applications; High-Speed Data Recorder for Space, Geodesy, and Other High-Speed Recording Applications; Datacasting V3.0; An All-Solid-State, Room-Temperature, Heterodyne Receiver for Atmospheric Spectroscopy at 1.2 THz; Stacked Transformer for Driver Gain and Receive Signal Splitting; Wireless Integrated Microelectronic Vacuum Sensor System; Fabrication Method for LOBSTER-Eye Optics in <110> Silicon; Compact Focal Plane Assembly for Planetary Science; Fabrication Methods for Adaptive Deformable Mirrors; Visiting Vehicle Ground Trajectory Tool; Workflow-Based Software Development Environment; Mobile Thread Task Manager; A Kinematic Calibration Process for Flight Robotic Arms; Magnetostrictive Alternator; Bulk Metallic Glasses and Composites for Optical and Compliant Mechanisms; Detection of Only Viable Bacterial Spores Using a Live/Dead Indicator in Mixed Populations; and Intravenous Fluid Generation System.
A Framework for Privacy-preserving Classification of Next-generation PHR data.
Koufi, Vassiliki; Malamateniou, Flora; Prentza, Andriana; Vassilacopoulos, George
2014-01-01
Personal Health Records (PHRs), integrated with data from various sources, such as social care data, Electronic Health Record data and genetic information, are envisaged as having a pivotal role in transforming healthcare. These data, lumped under the term 'big data', are usually complex, noisy, heterogeneous, longitudinal and voluminous thus prohibiting their meaningful use by clinicians. Deriving value from these data requires the utilization of innovative data analysis techniques, which, however, may be hindered due to potential security and privacy breaches that may arise from improper release of personal health information. This paper presents a HIPAA-compliant machine learning framework that enables privacy-preserving classification of next-generation PHR data. The predictive models acquired can act as supporting tools to clinical practice by enabling more effective prevention, diagnosis and treatment of new incidents. The proposed framework has a huge potential for complementing medical staff expertise as it outperforms the manual inspection of PHR data while protecting patient privacy.
Classification of pregnancy and labor contractions using a graph theory based analysis.
Nader, N; Hassan, M; Falou, W; Diab, A; Al-Omar, S; Khalil, M; Marque, C
2015-08-01
In this paper, we propose a new framework to characterize the electrohysterographic (EHG) signals recorded during pregnancy and labor. The approach is based on the analysis of the propagation of the uterine electrical activity. The processing pipeline includes i) the estimation of the statistical dependencies between the different recorded EHG signals, ii) the characterization of the obtained connectivity matrices using network measures and iii) the use of these measures in clinical application: the classification between pregnancy and labor. Due to its robustness to volume conductor, we used the imaginary part of coherence in order to produce the connectivity matrix which is then transformed into a graph. We evaluate the performance of several graph measures. We also compare the results with the parameter mostly used in the literature: the peak frequency combined with the propagation velocity (PV +PF). Our results show that the use of the network measures is a promising tool to classify labor and pregnancy contractions with a small superiority of the graph strength over PV+PF.
76 FR 54790 - Large Power Transformers From Korea
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-02
... Transformers From Korea Determination On the basis of the record \\1\\ developed in the subject investigation... transformers, provided for in subheadings 8504.23.00 and 8504.90.95 of the Harmonized Tariff Schedule of the...., Lynchburg, VA; and Pennsylvania Transformer Technology Inc., Canonsburg, PA, alleging that an industry in...
Tool Wear Feature Extraction Based on Hilbert Marginal Spectrum
NASA Astrophysics Data System (ADS)
Guan, Shan; Song, Weijie; Pang, Hongyang
2017-09-01
In the metal cutting process, the signal contains a wealth of tool wear state information. A tool wear signal’s analysis and feature extraction method based on Hilbert marginal spectrum is proposed. Firstly, the tool wear signal was decomposed by empirical mode decomposition algorithm and the intrinsic mode functions including the main information were screened out by the correlation coefficient and the variance contribution rate. Secondly, Hilbert transform was performed on the main intrinsic mode functions. Hilbert time-frequency spectrum and Hilbert marginal spectrum were obtained by Hilbert transform. Finally, Amplitude domain indexes were extracted on the basis of the Hilbert marginal spectrum and they structured recognition feature vector of tool wear state. The research results show that the extracted features can effectively characterize the different wear state of the tool, which provides a basis for monitoring tool wear condition.
Komorowski, Dariusz; Pietraszek, Stanislaw
2016-01-01
This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of the several signal parameters such as dominant frequency (DF), dominant power (DP) and index of normogastria (NI). The use of continuous wavelet transform (CWT) allows for better visible localization of the frequency components in the analyzed signals, than commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable width window, which corresponds to the scale time of observation (analysis). Wavelet analysis allows using long time windows when we need more precise low-frequency information, and shorter when we need high frequency information. Since the classic CWT transform requires considerable computing power and time, especially while applying it to the analysis of long signals, the authors used the CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using properties of the circular convolution to improve the speed of calculation. This method allows to obtain results for relatively long records of EGG in a fairly short time, much faster than using the classical methods based on running spectrum analysis (RSA). In this study authors indicate the possibility of a parametric analysis of EGG signals using continuous wavelet transform which is the completely new solution. The results obtained with the described method are shown in the example of an analysis of four-channel EGG recordings, performed for a non-caloric meal.
An Internal Coaxial Cable Electrical Connector For Use In Downhole Tools
Hall, David R.; Hall, Jr., H. Tracy; Pixton, David S.; Dahlgren, Scott; Fox, Joe; Sneddon, Cameron; Briscoe, Michael
2005-09-20
A seal for a coaxial cable electrical connector more specifically an internal seal for a coaxial cable connector placed within a coaxial cable and its constituent components. A coaxial cable connector is in electrical communcation with an inductive transformer and a coaxial cable. The connector is in electrical communication with the outer housing of the inductive transformer. A generally coaxial center conductor, a portion of which could be the coil in the inductive transformer, passes through the connector, is electrically insulated from the connector, and is in electrical communication with the conductive core of the coaxial cable. The electrically insulating material also doubles as a seal to safegaurd against penetration of fluid, thus protecting against shorting out of the electrical connection. The seal is a multi-component seal, which is pre-compressed to a desired pressure rating. The coaxial cable and inductive transformer are disposed within downhole tools to transmit electrical signals between downhole tools within a drill string. The internal coaxial cable connector and its attendant seal can be used in a plurality of downhole tools, such as sections of pipe in a drill string, drill collars, heavy weight drill pipe, and jars.
Wavelet-enhanced convolutional neural network: a new idea in a deep learning paradigm.
Savareh, Behrouz Alizadeh; Emami, Hassan; Hajiabadi, Mohamadreza; Azimi, Seyed Majid; Ghafoori, Mahyar
2018-05-29
Manual brain tumor segmentation is a challenging task that requires the use of machine learning techniques. One of the machine learning techniques that has been given much attention is the convolutional neural network (CNN). The performance of the CNN can be enhanced by combining other data analysis tools such as wavelet transform. In this study, one of the famous implementations of CNN, a fully convolutional network (FCN), was used in brain tumor segmentation and its architecture was enhanced by wavelet transform. In this combination, a wavelet transform was used as a complementary and enhancing tool for CNN in brain tumor segmentation. Comparing the performance of basic FCN architecture against the wavelet-enhanced form revealed a remarkable superiority of enhanced architecture in brain tumor segmentation tasks. Using mathematical functions and enhancing tools such as wavelet transform and other mathematical functions can improve the performance of CNN in any image processing task such as segmentation and classification.
Sonification of acoustic emission data
NASA Astrophysics Data System (ADS)
Raith, Manuel; Große, Christian
2014-05-01
While loading different specimens, acoustic emissions appear due to micro crack formation or friction of already existing crack edges. These acoustic emissions can be recorded using suitable ultrasonic transducers and transient recorders. The analysis of acoustic emissions can be used to investigate the mechanical behavior of different specimens under load. Our working group has undertaken several experiments, monitored with acoustic emission techniques. Different materials such as natural stone, concrete, wood, steel, carbon composites and bone were investigated. Also the experimental setup has been varied. Fire-spalling experiments on ultrahigh performance concrete and pullout experiments on bonded anchors have been carried out. Furthermore uniaxial compression tests on natural stone and animal bone had been conducted. The analysis tools include not only the counting of events but the analysis of full waveforms. Powerful localization algorithms and automatic onset picking techniques (based on Akaikes Information Criterion) were established to handle the huge amount of data. Up to several thousand events were recorded during experiments of a few minutes. More sophisticated techniques like moment tensor inversion have been established on this relatively small scale as well. Problems are related to the amount of data but also to signal-to-noise quality, boundary conditions (reflections) sensor characteristics and unknown and changing Greens functions of the media. Some of the acoustic emissions recorded during these experiments had been transferred into audio range. The transformation into the audio range was done using Matlab. It is the aim of the sonification to establish a tool that is on one hand able to help controlling the experiment in-situ and probably adjust the load parameters according to the number and intensity of the acoustic emissions. On the other hand sonification can help to improve the understanding of acoustic emission techniques for training purposes (students, co-workers). On goal is to establish a real-time frequency transformation into the audio range to avoid time consuming visual data processing during the experiments. It is also the intention to analyze the signals using psycho-acoustic methods with the help of specialists from electrical engineering. Reference: Raith, Manuel (2013). "Schallemissionsanalyse bei Pulloutexperimenten an Verbunddübeln" Masterarbeit. Technische Universität München, Lehrstuhl für Zerstörungsfreie Prüfung. Malm, Fabian (2012). "Schallemissionsanalyse am humanen Femur" Masterarbeit. Technische Universität München, Lehrstuhl für Zerstörungsfreie Prüfung. Richter R. (2009): Einsatz der Schallemissionsanalyse zur Detektion des Riss und Abplatzungsverhaltens von Beton unter Brandeinwirkung. Diplomarbeit. Materialprüfungsanstalt Universität Stuttgart Keywords: Acoustic emission, bonded anchors, femur, pullout test, fire-spalling
Luo, Y.; Xia, J.; Miller, R.D.; Liu, J.; Xu, Y.; Liu, Q.
2008-01-01
Multichannel Analysis of Surface Waves (MASW) analysis is an efficient tool to obtain the vertical shear-wave profile. One of the key steps in the MASW method is to generate an image of dispersive energy in the frequency-velocity domain, so dispersion curves can be determined by picking peaks of dispersion energy. In this paper, we image Rayleigh-wave dispersive energy and separate multimodes from a multichannel record by high-resolution linear Radon transform (LRT). We first introduce Rayleigh-wave dispersive energy imaging by high-resolution LRT. We then show the process of Rayleigh-wave mode separation. Results of synthetic and real-world examples demonstrate that (1) compared with slant stacking algorithm, high-resolution LRT can improve the resolution of images of dispersion energy by more than 50% (2) high-resolution LRT can successfully separate multimode dispersive energy of Rayleigh waves with high resolution; and (3) multimode separation and reconstruction expand frequency ranges of higher mode dispersive energy, which not only increases the investigation depth but also provides a means to accurately determine cut-off frequencies.
40 CFR 761.180 - Records and monitoring.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PCBs contained in PCB Container(s), or one or more PCB Transformers, or 50 or more PCB Large High or...., transformer or capacitor), the weight in kilograms of the PCB waste in each transformer or capacitor, the date... the calendar year. (iv) The total number of PCB Transformers and total weight in kilograms of PCBs...
40 CFR 761.180 - Records and monitoring.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PCBs contained in PCB Container(s), or one or more PCB Transformers, or 50 or more PCB Large High or...., transformer or capacitor), the weight in kilograms of the PCB waste in each transformer or capacitor, the date... the calendar year. (iv) The total number of PCB Transformers and total weight in kilograms of PCBs...
40 CFR 761.180 - Records and monitoring.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PCBs contained in PCB Container(s), or one or more PCB Transformers, or 50 or more PCB Large High or...., transformer or capacitor), the weight in kilograms of the PCB waste in each transformer or capacitor, the date... the calendar year. (iv) The total number of PCB Transformers and total weight in kilograms of PCBs...
40 CFR 761.180 - Records and monitoring.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PCBs contained in PCB Container(s), or one or more PCB Transformers, or 50 or more PCB Large High or...., transformer or capacitor), the weight in kilograms of the PCB waste in each transformer or capacitor, the date... the calendar year. (iv) The total number of PCB Transformers and total weight in kilograms of PCBs...
40 CFR 761.180 - Records and monitoring.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PCBs contained in PCB Container(s), or one or more PCB Transformers, or 50 or more PCB Large High or...., transformer or capacitor), the weight in kilograms of the PCB waste in each transformer or capacitor, the date... the calendar year. (iv) The total number of PCB Transformers and total weight in kilograms of PCBs...
Study of interhemispheric asymmetries in electroencephalographic signals by frequency analysis
NASA Astrophysics Data System (ADS)
Zapata, J. F.; Garzón, J.
2011-01-01
This study provides a new method for the detection of interhemispheric asymmetries in patients with continuous video-electroencephalography (EEG) monitoring at Intensive Care Unit (ICU), using wavelet energy. We obtained the registration of EEG signals in 42 patients with different pathologies, and then we proceeded to perform signal processing using the Matlab program, we compared the abnormalities recorded in the report by the neurophysiologist, the images of each patient and the result of signals analysis with the Discrete Wavelet Transform (DWT). Conclusions: there exists correspondence between the abnormalities found in the processing of the signal with the clinical reports of findings in patients; according to previous conclusion, the methodology used can be a useful tool for diagnosis and early quantitative detection of interhemispheric asymmetries.
Islam, Muhammad T; Scoutaris, Nikolaos; Maniruzzaman, Mohammed; Moradiya, Hiren G; Halsey, Sheelagh A; Bradley, Michael S A; Chowdhry, Babur Z; Snowden, Martin J; Douroumis, Dennis
2015-10-01
The aim of the work reported herein was to implement process analytical technology (PAT) tools during hot melt extrusion (HME) in order to obtain a better understanding of the relationship between HME processing parameters and the extruded formulations. For the first time two in-line NIR probes (transmission and reflectance) have been coupled with HME to monitor the extrusion of the water insoluble drug indomethacin (IND) in the presence of Soluplus (SOL) or Kollidon VA64 hydrophilic polymers. In-line extrusion monitoring of sheets, produced via a specially designed die, was conducted at various drug/polymer ratios and processing parameters. Characterisation of the extruded transparent sheets was also undertaken by using DSC, XRPD and Raman mapping. Analysis of the experimental findings revealed the production of molecular solutions where IND is homogeneously blended (ascertained by Raman mapping) in the polymer matrices, as it acts as a plasticizer for both hydrophilic polymers. PCA analysis of the recorded NIR signals showed that the screw speed used in HME affects the recorded spectra but not the homogeneity of the embedded drug in the polymer sheets. The IND/VA64 and IND/SOL extruded sheets displayed rapid dissolution rates with 80% and 30% of the IND being released, respectively within the first 20min. Copyright © 2015 Elsevier B.V. All rights reserved.
An investigation of phase transformation and crystallinity in laser surface modified H13 steel
NASA Astrophysics Data System (ADS)
Aqida, S. N.; Brabazon, D.; Naher, S.
2013-03-01
This paper presents a laser surface modification process of AISI H13 tool steel using 0.09, 0.2 and 0.4 mm size of laser spot with an aim to increase hardness properties. A Rofin DC-015 diffusion-cooled CO2 slab laser was used to process AISI H13 tool steel samples. Samples of 10 mm diameter were sectioned to 100 mm length in order to process a predefined circumferential area. The parameters selected for examination were laser peak power, overlap percentage and pulse repetition frequency (PRF). X-ray diffraction analysis (XRD) was conducted to measure crystallinity of the laser-modified surface. X-ray diffraction patterns of the samples were recorded using a Bruker D8 XRD system with Cu K α ( λ=1.5405 Å) radiation. The diffraction patterns were recorded in the 2 θ range of 20 to 80°. The hardness properties were tested at 981 mN force. The laser-modified surface exhibited reduced crystallinity compared to the un-processed samples. The presence of martensitic phase was detected in the samples processed using 0.4 mm spot size. Though there was reduced crystallinity, a high hardness was measured in the laser-modified surface. Hardness was increased more than 2.5 times compared to the as-received samples. These findings reveal the phase source of the hardening mechanism and grain composition in the laser-modified surface.
NASA Astrophysics Data System (ADS)
D'Astous, Y.; Blanchard, M.
1982-05-01
In the past years, the Journal has published a number of articles1-5 devoted to the introduction of Fourier transform spectroscopy in the undergraduate labs. In most papers, the proposed experimental setup consists of a Michelson interferometer, a light source, a light detector, and a chart recorder. The student uses this setup to record an interferogram which is then Fourier transformed to obtain the spectrogram of the light source. Although attempts have been made to ease the task of performing the required Fourier transform,6 the use of computers and Cooley-Tukey's fast Fourier transform (FFT) algorithm7 is by far the simplest method to use. However, to be able to use FFT, one has to get a number of samples of the interferogram, a tedious job which should be kept to a minimum. (AIP)
Loeb, Kathryn; Barbosa, Sabrina; Jiang, Fei; Lee, Karin T.
2016-01-01
Background and Purpose: Physical therapists strive to integrate research into daily practice. The tablet computer is a potentially transformational tool for accessing information within the clinical practice environment. The purpose of this study was to measure and describe patterns of tablet computer use among physical therapy students during clinical rotation experiences. Methods: Doctor of physical therapy students (n = 13 users) tracked their use of tablet computers (iPad), loaded with commercially available apps, during 16 clinical experiences (6-16 weeks in duration). Results: The tablets were used on 70% of 691 clinic days, averaging 1.3 uses per day. Information seeking represented 48% of uses; 33% of those were foreground searches for research articles and syntheses and 66% were for background medical information. Other common uses included patient education (19%), medical record documentation (13%), and professional communication (9%). The most frequently used app was Safari, the preloaded web browser (representing 281 [36.5%] incidents of use). Users accessed 56 total apps to support clinical practice. Discussion and Conclusions: Physical therapy students successfully integrated use of a tablet computer into their clinical experiences including regular activities of information seeking. Our findings suggest that the tablet computer represents a potentially transformational tool for promoting knowledge translation in the clinical practice environment. Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, http://links.lww.com/JNPT/A127). PMID:26945431
NASA Astrophysics Data System (ADS)
Hegazy, Maha A.; Lotfy, Hayam M.; Mowaka, Shereen; Mohamed, Ekram Hany
2016-07-01
Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study on the efficiency of continuous wavelet transform (CWT) as a signal processing tool in univariate regression and a pre-processing tool in multivariate analysis using partial least square (CWT-PLS) was conducted. These were applied to complex spectral signals of ternary and quaternary mixtures. CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). While, the univariate CWT failed to simultaneously determine the quaternary mixture components and was able to determine only PAR and PAP, the ternary mixtures of DRO, CAF, and PAR and CAF, PAR, and PAP. During the calculations of CWT, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. While for the development of the CWT-PLS model a calibration set was prepared by means of an orthogonal experimental design and their absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and concentration matrices and validation was performed by both cross validation and external validation sets. Both methods were successfully applied for determination of the studied drugs in pharmaceutical formulations.
Analysis of spike-wave discharges in rats using discrete wavelet transform.
Ubeyli, Elif Derya; Ilbay, Gül; Sahin, Deniz; Ateş, Nurbay
2009-03-01
A feature is a distinctive or characteristic measurement, transform, structural component extracted from a segment of a pattern. Features are used to represent patterns with the goal of minimizing the loss of important information. The discrete wavelet transform (DWT) as a feature extraction method was used in representing the spike-wave discharges (SWDs) records of Wistar Albino Glaxo/Rijswijk (WAG/Rij) rats. The SWD records of WAG/Rij rats were decomposed into time-frequency representations using the DWT and the statistical features were calculated to depict their distribution. The obtained wavelet coefficients were used to identify characteristics of the signal that were not apparent from the original time domain signal. The present study demonstrates that the wavelet coefficients are useful in determining the dynamics in the time-frequency domain of SWD records.
Analysis of two dimensional signals via curvelet transform
NASA Astrophysics Data System (ADS)
Lech, W.; Wójcik, W.; Kotyra, A.; Popiel, P.; Duk, M.
2007-04-01
This paper describes an application of curvelet transform analysis problem of interferometric images. Comparing to two-dimensional wavelet transform, curvelet transform has higher time-frequency resolution. This article includes numerical experiments, which were executed on random interferometric image. In the result of nonlinear approximations, curvelet transform obtains matrix with smaller number of coefficients than is guaranteed by wavelet transform. Additionally, denoising simulations show that curvelet could be a very good tool to remove noise from images.
Temporal Characterization of Aircraft Noise Sources
NASA Technical Reports Server (NTRS)
Grosveld, Ferdinand W.; Sullivan, Brenda M.; Rizzi, Stephen A.
2004-01-01
Current aircraft source noise prediction tools yield time-independent frequency spectra as functions of directivity angle. Realistic evaluation and human assessment of aircraft fly-over noise require the temporal characteristics of the noise signature. The purpose of the current study is to analyze empirical data from broadband jet and tonal fan noise sources and to provide the temporal information required for prediction-based synthesis. Noise sources included a one-tenth-scale engine exhaust nozzle and a one-fifth scale scale turbofan engine. A methodology was developed to characterize the low frequency fluctuations employing the Short Time Fourier Transform in a MATLAB computing environment. It was shown that a trade-off is necessary between frequency and time resolution in the acoustic spectrogram. The procedure requires careful evaluation and selection of the data analysis parameters, including the data sampling frequency, Fourier Transform window size, associated time period and frequency resolution, and time period window overlap. Low frequency fluctuations were applied to the synthesis of broadband noise with the resulting records sounding virtually indistinguishable from the measured data in initial subjective evaluations. Amplitude fluctuations of blade passage frequency (BPF) harmonics were successfully characterized for conditions equivalent to take-off and approach. Data demonstrated that the fifth harmonic of the BPF varied more in frequency than the BPF itself and exhibited larger amplitude fluctuations over the duration of the time record. Frequency fluctuations were found to be not perceptible in the current characterization of tonal components.
Post, Andrew R.; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H.
2013-01-01
Objective To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. Materials and Methods We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. Results We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in five years of data from our institution’s clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. Discussion and Conclusion A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. PMID:23402960
Swanson, R Chad; Cattaneo, Adriano; Bradley, Elizabeth; Chunharas, Somsak; Atun, Rifat; Abbas, Kaja M; Katsaliaki, Korina; Mustafee, Navonil; Mason Meier, Benjamin; Best, Allan
2012-10-01
While reaching consensus on future plans to address current global health challenges is far from easy, there is broad agreement that reductionist approaches that suggest a limited set of targeted interventions to improve health around the world are inadequate. We argue that a comprehensive systems perspective should guide health practice, education, research and policy. We propose key 'systems thinking' tools and strategies that have the potential for transformational change in health systems. Three overarching themes span these tools and strategies: collaboration across disciplines, sectors and organizations; ongoing, iterative learning; and transformational leadership. The proposed tools and strategies in this paper can be applied, in varying degrees, to every organization within health systems, from families and communities to national ministries of health. While our categorization is necessarily incomplete, this initial effort will provide a valuable contribution to the health systems strengthening debate, as the need for a more systemic, rigorous perspective in health has never been greater.
Swanson, R Chad; Cattaneo, Adriano; Bradley, Elizabeth; Chunharas, Somsak; Atun, Rifat; Abbas, Kaja M; Katsaliaki, Korina; Mustafee, Navonil; Mason Meier, Benjamin; Best, Allan
2012-01-01
While reaching consensus on future plans to address current global health challenges is far from easy, there is broad agreement that reductionist approaches that suggest a limited set of targeted interventions to improve health around the world are inadequate. We argue that a comprehensive systems perspective should guide health practice, education, research and policy. We propose key ‘systems thinking’ tools and strategies that have the potential for transformational change in health systems. Three overarching themes span these tools and strategies: collaboration across disciplines, sectors and organizations; ongoing, iterative learning; and transformational leadership. The proposed tools and strategies in this paper can be applied, in varying degrees, to every organization within health systems, from families and communities to national ministries of health. While our categorization is necessarily incomplete, this initial effort will provide a valuable contribution to the health systems strengthening debate, as the need for a more systemic, rigorous perspective in health has never been greater. PMID:23014154
32 CFR 903.10 - Information collections, records, and forms or information management tools (IMTS).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 6 2010-07-01 2010-07-01 false Information collections, records, and forms or information management tools (IMTS). 903.10 Section 903.10 National Defense Department of Defense (Continued... Information collections, records, and forms or information management tools (IMTS). (a) Information...
32 CFR 903.10 - Information collections, records, and forms or information management tools (IMTS).
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 6 2014-07-01 2014-07-01 false Information collections, records, and forms or information management tools (IMTS). 903.10 Section 903.10 National Defense Department of Defense (Continued... Information collections, records, and forms or information management tools (IMTS). (a) Information...
32 CFR 903.10 - Information collections, records, and forms or information management tools (IMTS).
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 6 2013-07-01 2013-07-01 false Information collections, records, and forms or information management tools (IMTS). 903.10 Section 903.10 National Defense Department of Defense (Continued... Information collections, records, and forms or information management tools (IMTS). (a) Information...
32 CFR 903.10 - Information collections, records, and forms or information management tools (IMTS).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 6 2011-07-01 2011-07-01 false Information collections, records, and forms or information management tools (IMTS). 903.10 Section 903.10 National Defense Department of Defense (Continued... Information collections, records, and forms or information management tools (IMTS). (a) Information...
32 CFR 903.10 - Information collections, records, and forms or information management tools (IMTS).
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 6 2012-07-01 2012-07-01 false Information collections, records, and forms or information management tools (IMTS). 903.10 Section 903.10 National Defense Department of Defense (Continued... Information collections, records, and forms or information management tools (IMTS). (a) Information...
A phylogenetic transform enhances analysis of compositional microbiota data
Silverman, Justin D; Washburne, Alex D; Mukherjee, Sayan; David, Lawrence A
2017-01-01
Surveys of microbial communities (microbiota), typically measured as relative abundance of species, have illustrated the importance of these communities in human health and disease. Yet, statistical artifacts commonly plague the analysis of relative abundance data. Here, we introduce the PhILR transform, which incorporates microbial evolutionary models with the isometric log-ratio transform to allow off-the-shelf statistical tools to be safely applied to microbiota surveys. We demonstrate that analyses of community-level structure can be applied to PhILR transformed data with performance on benchmarks rivaling or surpassing standard tools. Additionally, by decomposing distance in the PhILR transformed space, we identified neighboring clades that may have adapted to distinct human body sites. Decomposing variance revealed that covariation of bacterial clades within human body sites increases with phylogenetic relatedness. Together, these findings illustrate how the PhILR transform combines statistical and phylogenetic models to overcome compositional data challenges and enable evolutionary insights relevant to microbial communities. DOI: http://dx.doi.org/10.7554/eLife.21887.001 PMID:28198697
Kohlmayer, Florian; Prasser, Fabian; Kuhn, Klaus A
2015-12-01
With the ARX data anonymization tool structured biomedical data can be de-identified using syntactic privacy models, such as k-anonymity. Data is transformed with two methods: (a) generalization of attribute values, followed by (b) suppression of data records. The former method results in data that is well suited for analyses by epidemiologists, while the latter method significantly reduces loss of information. Our tool uses an optimal anonymization algorithm that maximizes output utility according to a given measure. To achieve scalability, existing optimal anonymization algorithms exclude parts of the search space by predicting the outcome of data transformations regarding privacy and utility without explicitly applying them to the input dataset. These optimizations cannot be used if data is transformed with generalization and suppression. As optimal data utility and scalability are important for anonymizing biomedical data, we had to develop a novel method. In this article, we first confirm experimentally that combining generalization with suppression significantly increases data utility. Next, we proof that, within this coding model, the outcome of data transformations regarding privacy and utility cannot be predicted. As a consequence, existing algorithms fail to deliver optimal data utility. We confirm this finding experimentally. The limitation of previous work can be overcome at the cost of increased computational complexity. However, scalability is important for anonymizing data with user feedback. Consequently, we identify properties of datasets that may be predicted in our context and propose a novel and efficient algorithm. Finally, we evaluate our solution with multiple datasets and privacy models. This work presents the first thorough investigation of which properties of datasets can be predicted when data is anonymized with generalization and suppression. Our novel approach adopts existing optimization strategies to our context and combines different search methods. The experiments show that our method is able to efficiently solve a broad spectrum of anonymization problems. Our work shows that implementing syntactic privacy models is challenging and that existing algorithms are not well suited for anonymizing data with transformation models which are more complex than generalization alone. As such models have been recommended for use in the biomedical domain, our results are of general relevance for de-identifying structured biomedical data. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Transformation of fruit trees. Useful breeding tool or continued future prospect?
Petri, César; Burgos, Lorenzo
2005-02-01
Regeneration and transformation systems using mature plant material of woody fruit species have to be achieved as a necessary requirement for the introduction of useful genes into specific cultivars and the rapid evaluation of resulting horticultural traits. Although the commercial production of transgenic annual crops is a reality, commercial genetically-engineered fruit trees are still far from common. In most woody fruit species, transformation and regeneration of commercial cultivars are not routine, generally being limited to a few genotypes or to seedlings. The future of genetic transformation as a tool for the breeding of fruit trees requires the development of genotype-independent procedures, based on the transformation of meristematic cells with high regeneration potential and/or the use of regeneration-promoting genes. The public concern with the introduction of antibiotic resistance into food and the restrictions due to new European laws that do not allow deliberate release of plants transformed with antibiotic-resistance genes highlight the development of methods that avoid the use of antibiotic-dependent selection or allow elimination of marker genesfrom the transformed plant as a research priority in coming years.
Frequency Dynamics of the First Heart Sound
NASA Astrophysics Data System (ADS)
Wood, John Charles
Cardiac auscultation is a fundamental clinical tool but first heart sound origins and significance remain controversial. Previous clinical studies have implicated resonant vibrations of both the myocardium and the valves. Accordingly, the goals of this thesis were threefold, (1) to characterize the frequency dynamics of the first heart sound, (2) to determine the relative contribution of the myocardium and the valves in determining first heart sound frequency, and (3) to develop new tools for non-stationary signal analysis. A resonant origin for first heart sound generation was tested through two studies in an open-chest canine preparation. Heart sounds were recorded using ultralight acceleration transducers cemented directly to the epicardium. The first heart sound was observed to be non-stationary and multicomponent. The most dominant feature was a powerful, rapidly-rising frequency component that preceded mitral valve closure. Two broadband components were observed; the first coincided with mitral valve closure while the second significantly preceded aortic valve opening. The spatial frequency of left ventricular vibrations was both high and non-stationary which indicated that the left ventricle was not vibrating passively in response to intracardiac pressure fluctuations but suggested instead that the first heart sound is a propagating transient. In the second study, regional myocardial ischemia was induced by left coronary circumflex arterial occlusion. Acceleration transducers were placed on the ischemic and non-ischemic myocardium to determine whether ischemia produced local or global changes in first heart sound amplitude and frequency. The two zones exhibited disparate amplitude and frequency behavior indicating that the first heart sound is not a resonant phenomenon. To objectively quantify the presence and orientation of signal components, Radon transformation of the time -frequency plane was performed and found to have considerable potential for pattern classification. Radon transformation of the Wigner spectrum (Radon-Wigner transform) was derived to be equivalent to dechirping in the time and frequency domains. Based upon this representation, an analogy between time-frequency estimation and computed tomography was drawn. Cohen's class of time-frequency representations was subsequently shown to result from simple changes in reconstruction filtering parameters. Time-varying filtering, adaptive time-frequency transformation and linear signal synthesis were also performed from the Radon-Wigner representation.
Cast Coil Transformer Fire Susceptibility and Reliability Study
1991-04-01
transformers reduce risk to the user compared to liquid-filled units, eliminate environmental impacts, are more efficient than most transformer designs, and...filled units, eliminate environmental impacts, arc more efficient than most transformer designs, and add minimal risk to the facility in a fire situation...add minimal risk to the facility in a fire situation. Cast coil transformers have a long record of operation and have proven to be reliable and
Khang, Chang Hyun; Park, Sook-Young; Lee, Yong-Hwan; Kang, Seogchan
2005-06-01
Rapid progress in fungal genome sequencing presents many new opportunities for functional genomic analysis of fungal biology through the systematic mutagenesis of the genes identified through sequencing. However, the lack of efficient tools for targeted gene replacement is a limiting factor for fungal functional genomics, as it often necessitates the screening of a large number of transformants to identify the desired mutant. We developed an efficient method of gene replacement and evaluated factors affecting the efficiency of this method using two plant pathogenic fungi, Magnaporthe grisea and Fusarium oxysporum. This method is based on Agrobacterium tumefaciens-mediated transformation with a mutant allele of the target gene flanked by the herpes simplex virus thymidine kinase (HSVtk) gene as a conditional negative selection marker against ectopic transformants. The HSVtk gene product converts 5-fluoro-2'-deoxyuridine to a compound toxic to diverse fungi. Because ectopic transformants express HSVtk, while gene replacement mutants lack HSVtk, growing transformants on a medium amended with 5-fluoro-2'-deoxyuridine facilitates the identification of targeted mutants by counter-selecting against ectopic transformants. In addition to M. grisea and F. oxysporum, the method and associated vectors are likely to be applicable to manipulating genes in a broad spectrum of fungi, thus potentially serving as an efficient, universal functional genomic tool for harnessing the growing body of fungal genome sequence data to study fungal biology.
Tackling the 2nd V: Big Data, Variety and the Need for Representation Consistency
NASA Astrophysics Data System (ADS)
Clune, T.; Kuo, K. S.
2016-12-01
While Big Data technologies are transforming our ability to analyze ever larger volumes of Earth science data, practical constraints continue to limit our ability to compare data across datasets from different sources in an efficient and robust manner. Within a single data collection, invariants such as file format, grid type, and spatial resolution greatly simplify many types of analysis (often implicitly). However, when analysis combines data across multiple data collections, researchers are generally required to implement data transformations (i.e., "data preparation") to provide appropriate invariants. These transformation include changing of file formats, ingesting into a database, and/or regridding to a common spatial representation, and they can either be performed once, statically, or each time the data is accessed. At the very least, this process is inefficient from the perspective of the community as each team selects its own representation and privately implements the appropriate transformations. No doubt there are disadvantages to any "universal" representation, but we posit that major benefits would be obtained if a suitably flexible spatial representation could be standardized along with tools for transforming to/from that representation. We regard this as part of the historic trend in data publishing. Early datasets used ad hoc formats and lacked metadata. As better tools evolved, published data began to use standardized formats (e.g., HDF and netCDF) with attached metadata. We propose that the modern need to perform analysis across data sets should drive a new generation of tools that support a standardized spatial representation. More specifically, we propose the hierarchical triangular mesh (HTM) as a suitable "generic" resolution that permits standard transformations to/from native representations in use today, as well as tools to convert/regrid existing datasets onto that representation.
NASA Astrophysics Data System (ADS)
Vaudor, Lise; Piegay, Herve; Wawrzyniak, Vincent; Spitoni, Marie
2016-04-01
The form and functioning of a geomorphic system result from processes operating at various spatial and temporal scales. Longitudinal channel characteristics thus exhibit complex patterns which vary according to the scale of study, might be periodic or segmented, and are generally blurred by noise. Describing the intricate, multiscale structure of such signals, and identifying at which scales the patterns are dominant and over which sub-reach, could help determine at which scales they should be investigated, and provide insights into the main controlling factors. Wavelet transforms aim at describing data at multiple scales (either in time or space), and are now exploited in geophysics for the analysis of nonstationary series of data. They provide a consistent, non-arbitrary, and multiscale description of a signal's variations and help explore potential causalities. Nevertheless, their use in fluvial geomorphology, notably to study longitudinal patterns, is hindered by a lack of user-friendly tools to help understand, implement, and interpret them. We have developed a free application, The Wavelet ToolKat, designed to facilitate the use of wavelet transforms on temporal or spatial series. We illustrate its usefulness describing longitudinal channel curvature and slope of three freely meandering rivers in the Amazon basin (the Purus, Juruá and Madre de Dios rivers), using topographic data generated from NASA's Shuttle Radar Topography Mission (SRTM) in 2000. Three types of wavelet transforms are used, with different purposes. Continuous Wavelet Transforms are used to identify in a non-arbitrary way the dominant scales and locations at which channel curvature and slope vary. Cross-wavelet transforms, and wavelet coherence and phase are used to identify scales and locations exhibiting significant channel curvature and slope co-variations. Maximal Overlap Discrete Wavelet Transforms decompose data into their variations at a series of scales and are used to provide smoothed descriptions of the series at the scales deemed relevant.
Marine bioacoustics and technology: The new world of marine acoustic ecology
NASA Astrophysics Data System (ADS)
Hastings, Mardi C.; Au, Whitlow W. L.
2012-11-01
Marine animals use sound for communication, navigation, predator avoidance, and prey detection. Thus the rise in acoustic energy associated with increasing human activity in the ocean has potential to impact the lives of marine animals. Thirty years ago marine bioacoustics primarily focused on evaluating effects of human-generated sound on hearing and behavior by testing captive animals and visually observing wild animals. Since that time rapidly changing electronic and computing technologies have yielded three tools that revolutionized how bioacousticians study marine animals. These tools are (1) portable systems for measuring electrophysiological auditory evoked potentials, (2) miniaturized tags equipped with positioning sensors and acoustic recording devices for continuous short-term acoustical observation rather than intermittent visual observation, and (3) passive acoustic monitoring (PAM) systems for remote long-term acoustic observations at specific locations. The beauty of these breakthroughs is their direct applicability to wild animals in natural habitats rather than only to animals held in captivity. Hearing capabilities of many wild species including polar bears, beaked whales, and reef fishes have now been assessed by measuring their auditory evoked potentials. Miniaturized acoustic tags temporarily attached to an animal to record its movements and acoustic environment have revealed the acoustic foraging behavior of sperm and beaked whales. Now tags are being adapted to fishes in effort to understand their behavior in the presence of noise. Moving and static PAM systems automatically detect and characterize biological and physical features of an ocean area without adding any acoustic energy to the environment. PAM is becoming a powerful technique for understanding and managing marine habitats. This paper will review the influence of these transformative tools on the knowledge base of marine bioacoustics and elucidation of relationships between marine animals and their acoustic environment, leading to a new, rapidly growing field of marine acoustic ecology.
v3NLP Framework: Tools to Build Applications for Extracting Concepts from Clinical Text
Divita, Guy; Carter, Marjorie E.; Tran, Le-Thuy; Redd, Doug; Zeng, Qing T; Duvall, Scott; Samore, Matthew H.; Gundlapalli, Adi V.
2016-01-01
Introduction: Substantial amounts of clinically significant information are contained only within the narrative of the clinical notes in electronic medical records. The v3NLP Framework is a set of “best-of-breed” functionalities developed to transform this information into structured data for use in quality improvement, research, population health surveillance, and decision support. Background: MetaMap, cTAKES and similar well-known natural language processing (NLP) tools do not have sufficient scalability out of the box. The v3NLP Framework evolved out of the necessity to scale-up these tools up and provide a framework to customize and tune techniques that fit a variety of tasks, including document classification, tuned concept extraction for specific conditions, patient classification, and information retrieval. Innovation: Beyond scalability, several v3NLP Framework-developed projects have been efficacy tested and benchmarked. While v3NLP Framework includes annotators, pipelines and applications, its functionalities enable developers to create novel annotators and to place annotators into pipelines and scaled applications. Discussion: The v3NLP Framework has been successfully utilized in many projects including general concept extraction, risk factors for homelessness among veterans, and identification of mentions of the presence of an indwelling urinary catheter. Projects as diverse as predicting colonization with methicillin-resistant Staphylococcus aureus and extracting references to military sexual trauma are being built using v3NLP Framework components. Conclusion: The v3NLP Framework is a set of functionalities and components that provide Java developers with the ability to create novel annotators and to place those annotators into pipelines and applications to extract concepts from clinical text. There are scale-up and scale-out functionalities to process large numbers of records. PMID:27683667
Transformative, Mixed Methods Checklist for Psychological Research with Mexican Americans
ERIC Educational Resources Information Center
Canales, Genevieve
2013-01-01
This is a description of the creation of a research methods tool, the "Transformative, Mixed Methods Checklist for Psychological Research With Mexican Americans." For conducting literature reviews of and planning mixed methods studies with Mexican Americans, it contains evaluative criteria calling for transformative mixed methods, perspectives…
Learning Unlimited: Transforming Learning in the Workplace. 2nd Edition.
ERIC Educational Resources Information Center
Rylatt, Alastair
This book is intended to provide managers, trainers, and others responsible for improving learning within their workplace with the tools, perspectives, and strategies to transform their workplace into a learning organization. Throughout the book, key principles underpinning recent thinking on positive workplace transformation are combined with…
Micah E. Stevens; Paula M. Pijut
2014-01-01
Using mature hypocotyls as the initial explants, an Agrobacterium tumefaciens-mediated genetic transformation system was successfully developed for pumpkin ash (Fraxinus profunda). This transformation protocol is an invaluable tool to combat the highly aggressive, non-native emerald ash borer (EAB), which has the potential to...
ICT in English Schools: Transforming Education?
ERIC Educational Resources Information Center
Yang, Hao
2012-01-01
The use of information and communications technology (ICT) as a learning tool has long been acclaimed as a catalyst for educational transformation. Over the past decade, evidence of good uses of ICT has emerged in numerous studies. While such use promises transformation in supporting teaching and learning, evidence suggests that progress is…
Transformational Leadership in Special Education: Leading the IEP Team
ERIC Educational Resources Information Center
Lentz, Kirby
2012-01-01
Using the principles of transformational leadership, IEP teams become effective tools to ensure student success and achievements. There is a difference of teams that are simply chaired and those that are lead. Teams with transformational leaders promote the best efforts of all participants including parents and students to effectively deliver…
Classroom-sized geophysical experiments: magnetic surveying using modern smartphone devices
NASA Astrophysics Data System (ADS)
Tronicke, Jens; Trauth, Martin H.
2018-05-01
Modern mobile devices (i.e. smartphones and tablet computers) are widespread, everyday tools, which are equipped with a variety of sensors including three-axis magnetometers. Here, we investigate the feasibility and the potential of using such mobile devices to mimic geophysical experiments in the classroom in a table-top setup. We focus on magnetic surveying and present a basic setup of a table-top experiment for collecting three-component magnetic data across well-defined source bodies and structures. Our results demonstrate that the quality of the recorded data is sufficient to address a number of important basic concepts in the magnetic method. The shown examples cover the analysis of magnetic data recorded across different kinds of dipole sources, thus illustrating the complexity of magnetic anomalies. In addition, we analyze the horizontal resolution capabilities using a pair of dipole sources placed at different horizontal distances to each other. Furthermore, we demonstrate that magnetic data recorded with a mobile device can even be used to introduce filtering, transformation, and inversion approaches as they are typically used when processing magnetic data sets recorded for real-world field applications. Thus, we conclude that such table-top experiments represent an easy-to-implement experimental procedure (as student exercise or classroom demonstration) and can provide first hands-on experience in the basic principles of magnetic surveying including the fundamentals of data acquisition, analysis and processing, as well as data evaluation and interpretation.
Open Innovation and Technology Maturity Analysis
2007-09-11
Management Process Develop a framework which incorporates DoD Acquisition Management framework (e.g: TRLs), DoD Business Transformation strategies...Public Organizations (DoD): DoD Force Transformation : • Support the Joint Warfighting Capability of the DoD • Enable Rapid Access to Information for...Survey - 2007 Defense Transformation : Clear Leadership, Accountability, and Management Tools Are Needed to Enhance DOD’s Efforts to Transform Military
Record, Replay, Reflect: Videotaped Lessons Accelerate Learning for Teachers and Coaches
ERIC Educational Resources Information Center
Knight, Jim; Bradley, Barbara A.; Hock, Michael; Skrtic, Thomas M.; Knight, David; Brasseur-Hock, Irma; Clark, Jean; Ruggles, Marilyn; Hatton, Carol
2012-01-01
New technologies can dramatically change the way people live and work. Jet engines transformed travel. Television revolutionized news and entertainment. Computers and the Internet have transformed just about everything else. And now small video cameras have the potential to transform professional learning. Recognizing the potential of this new…
Target Identification Using Harmonic Wavelet Based ISAR Imaging
NASA Astrophysics Data System (ADS)
Shreyamsha Kumar, B. K.; Prabhakar, B.; Suryanarayana, K.; Thilagavathi, V.; Rajagopal, R.
2006-12-01
A new approach has been proposed to reduce the computations involved in the ISAR imaging, which uses harmonic wavelet-(HW) based time-frequency representation (TFR). Since the HW-based TFR falls into a category of nonparametric time-frequency (T-F) analysis tool, it is computationally efficient compared to parametric T-F analysis tools such as adaptive joint time-frequency transform (AJTFT), adaptive wavelet transform (AWT), and evolutionary AWT (EAWT). Further, the performance of the proposed method of ISAR imaging is compared with the ISAR imaging by other nonparametric T-F analysis tools such as short-time Fourier transform (STFT) and Choi-Williams distribution (CWD). In the ISAR imaging, the use of HW-based TFR provides similar/better results with significant (92%) computational advantage compared to that obtained by CWD. The ISAR images thus obtained are identified using a neural network-based classification scheme with feature set invariant to translation, rotation, and scaling.
Magnified reconstruction of digitally recorded holograms by Fresnel-Bluestein transform.
Restrepo, John F; Garcia-Sucerquia, Jorge
2010-11-20
A method for numerical reconstruction of digitally recorded holograms with variable magnification is presented. The proposed strategy allows for smaller, equal, or larger magnification than that achieved with Fresnel transform by introducing the Bluestein substitution into the Fresnel kernel. The magnification is obtained independent of distance, wavelength, and number of pixels, which enables the method to be applied in color digital holography and metrological applications. The approach is supported by experimental and simulation results in digital holography of objects of comparable dimensions with the recording device and in the reconstruction of holograms from digital in-line holographic microscopy.
Lean thinking in emergency departments: concepts and tools for quality improvement.
Bruno, Frances
2017-10-12
The lean approach is a viable framework for reducing costs and enhancing the quality of patient care in emergency departments (EDs). Reports on lean-inspired quality improvement initiatives are rapidly growing but there is little emphasis on the philosophy behind the processes, which is the essential ingredient in sustaining transformation. This article describes lean philosophy, also referred to as lean, lean thinking and lean healthcare, and its main concepts, to enrich the knowledge and vocabulary of nurses involved or interested in quality improvement in EDs. The article includes examples of lean strategies to illustrate their practical application in EDs. ©2012 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.
NASA Astrophysics Data System (ADS)
Manimunda, Praveena; Hintsala, Eric; Asif, Syed; Mishra, Manish Kumar
2017-01-01
The ability to correlate mechanical and chemical characterization techniques in real time is both lacking and powerful tool for gaining insights into material behavior. This is demonstrated through use of a novel nanoindentation device equipped with Raman spectroscopy to explore the deformation-induced structural changes in piroxicam crystals. Mechanical anisotropy was observed in two major faces ( 0bar{1}1 ) and (011), which are correlated to changes in the interlayer interaction from in situ Raman spectra recorded during indentation. The results of this study demonstrate the considerable potential of an in situ Raman nanoindentation instrument for studying a variety of topics, including stress-induced phase transformation mechanisms, mechanochemistry, and solid state reactivity under mechanical forces that occur in molecular and pharmaceutical solids.
New Neuroscience Tools That Are Identifying the Sleep-Wake Circuit.
Shiromani, Priyattam J; Peever, John H
2017-04-01
The complexity of the brain is yielding to technology. In the area of sleep neurobiology, conventional neuroscience tools such as lesions, cell recordings, c-Fos, and axon-tracing methodologies have been instrumental in identifying the complex and intermingled populations of sleep- and arousal-promoting neurons that orchestrate and generate wakefulness, NREM, and REM sleep. In the last decade, new technologies such as optogenetics, chemogenetics, and the CRISPR-Cas system have begun to transform how biologists understand the finer details associated with sleep-wake regulation. These additions to the neuroscience toolkit are helping to identify how discrete populations of brain cells function to trigger and shape the timing and transition into and out of different sleep-wake states, and how glia partner with neurons to regulate sleep. Here, we detail how some of the newest technologies are being applied to understand the neural circuits underlying sleep and wake. Published by Oxford University Press on behalf of Sleep Research Society (SRS) 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.
NASA Technical Reports Server (NTRS)
1999-01-01
Aeronautical research usually begins with computers, wind tunnels, and flight simulators, but eventually the theories must fly. This is when flight research begins, and aircraft are the primary tools of the trade. Flight research involves doing precision maneuvers in either a specially built experimental aircraft or an existing production airplane that has been modified. For example, the AD-1 was a unique airplane made only for flight research, while the NASA F-18 High Alpha Research Vehicle (HARV) was a standard fighter aircraft that was transformed into a one-of-a-kind aircraft as it was fitted with new propulsion systems, flight controls, and scientific equipment. All research aircraft are able to perform scientific experiments because of the onboard instruments that record data about its systems, aerodynamics, and the outside environment. Since the 1970's, NASA flight research has become more comprehensive, with flights involving everything form Space Shuttles to ultralights. NASA now flies not only the fastest airplanes, but some of the slowest. Flying machines continue to evolve with new wing designs, propulsion systems, and flight controls. As always, a look at today's experimental research aircraft is a preview of the future.
Transformative Learning, Adult Learners and the September 11 Terrorism Incidents in North America.
ERIC Educational Resources Information Center
Williams, Theresia
Despite their tremendous tragedy, the terrorist attacks that occurred on September 11, 2001, have also produced a juncture for transformative learning. Several transformative learning opportunities that may enable some adult learners find and use knowledge as the tool to transcend a tragedy into a learning experience have been identified. Although…
ERIC Educational Resources Information Center
Hughes, Joan E.; Guion, James M.; Bruce, Kama A.; Horton, Lucas R.; Prescott, Amy
2011-01-01
Web 2.0 tools have emerged as conducive for innovative pedagogy and transformative learning opportunities for youth. Currently,Web 2.0 is often adopted into teachers' practice to simply replace or amplify traditional instructional approaches rather than to promote or facilitate transformative educational change. Current models of innovation…
NASA Astrophysics Data System (ADS)
Schaming, Marc; Rise, Leif; Chand, Shyam; Reidulv, Bøe; Terje Osmundsen, Per; Redfield, Tim
2017-04-01
A large number of sparker lines were acquired on the Norwegian continental shelf during the years 1970-1982, by IKU (Sintef Petroleum Research). The responsibility of the analogue seismic database was transferred to NGU in 1998; this included storage of the physical data (original paper rolls and half-scale film copies) and the digital navigation database. The data (from 60°N to 71°30N) were in the early eighties subdivided in 6 data packages, and offered for sale to oil companies as half scale folded paper copies (25 cm width). Navigation applied was mainly Decca Main Chain. The 2014-2016 SPARDIG project (Chand et al., 2016) was supported by NGU, AkerBP (Det Norske), Lundin Norway and the Seabed Project. In the project, IPGS has transformed 374 rolls of analogue sparker lines in 17 different surveys into SEG-Y format. The total length of converted survey lines is 31 261 kilometers. Rolls were scanned at 600 dpi and converted into SEG-Y using the SeisTrans (Caldera software) application (Miles et al., 2007). SeisTrans uses interactive, iterative and repeatable steps in a dedicated graphics window. A first step allows definition of axes and scales, then record time lines (horizontal TWT times and navigation time lines down the record) are picked and removed, and traces are defined. At this step, control tools are available to ensure the quality of the traces. After that, navigation information extracted and interpolated from excel files are added to trace headers. A continuous QC process allows production of SEG-Y files directly readable by interpretation software. The SEG-Y data will be delivered to the Norwegian Discos National Repository (https://portal.diskos.cgg.com/whereoil-data/) but access will be restricted to participants until 1st April 2019. IKU sparker lines have higher resolution than conventional 2D lines, but the penetration is limited. The data sets are complementary to each other. In 2D seismic lines, it is often difficult to delineate units in the upper part of the records. Some lines show no details of the Quaternary stratigraphy, especially when the Quaternary overburden is thin, and in that case the sparker lines are of inestimable value. Off mid Norway, the SEG-Y transformed sparker lines were interpreted together with 2D seismic lines, and an updated geological map was made for the coastal area. We were able to classify the basement-sediment contact as fault related or stratigraphic. Several new faults were mapped based on detailed bathymetry and seismic data. Exposures of weathered basement at the seafloor and juxtaposition of basement and sediments across inherited faults were observed for several kilometers along strike. These relationships provide important links to the deeper structure and stratigraphy of the Mid-Norwegian margin. The SPARDIG project secured a national treasure for future investigations. This type of high-resolution regional grid will probably never be collected again in Norway. References Chand et al. 2016 - Transforming analogue sparker records from the Norwegian continental shelf into SEG-Y format. Technical report, Spardig project, 2016.038, http://www.ngu.no/upload/Publikasjoner/Rapporter/2016/2016_038.pdf. Miles et al., 2007 - Resurrecting vintage paper seismic records. Mar Geophys Res 28, 319-329, DOI:10.1007/s11001-007-9034-5.
Automation of the guiding center expansion
NASA Astrophysics Data System (ADS)
Burby, J. W.; Squire, J.; Qin, H.
2013-07-01
We report on the use of the recently developed Mathematica package VEST (Vector Einstein Summation Tools) to automatically derive the guiding center transformation. Our Mathematica code employs a recursive procedure to derive the transformation order-by-order. This procedure has several novel features. (1) It is designed to allow the user to easily explore the guiding center transformation's numerous non-unique forms or representations. (2) The procedure proceeds entirely in cartesian position and velocity coordinates, thereby producing manifestly gyrogauge invariant results; the commonly used perpendicular unit vector fields e1,e2 are never even introduced. (3) It is easy to apply in the derivation of higher-order contributions to the guiding center transformation without fear of human error. Our code therefore stands as a useful tool for exploring subtle issues related to the physics of toroidal momentum conservation in tokamaks.
Automation of The Guiding Center Expansion
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. W. Burby, J. Squire and H. Qin
2013-03-19
We report on the use of the recently-developed Mathematica package VEST (Vector Einstein Summation Tools) to automatically derive the guiding center transformation. Our Mathematica code employs a recursive procedure to derive the transformation order-by-order. This procedure has several novel features. (1) It is designed to allow the user to easily explore the guiding center transformation's numerous nonunique forms or representations. (2) The procedure proceeds entirely in cartesian position and velocity coordinates, thereby producing manifestly gyrogauge invariant results; the commonly-used perpendicular unit vector fields e1, e2 are never even introduced. (3) It is easy to apply in the derivation of higher-ordermore » contributions to the guiding center transformation without fear of human error. Our code therefore stands as a useful tool for exploring subtle issues related to the physics of toroidal momentum conservation in tokamaks« less
Chao, Tian-Jy; Kim, Younghun
2015-02-03
Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
Spatially-Heterodyned Holography
Thomas, Clarence E [Knoxville, TN; Hanson, Gregory R [Clinton, TN
2006-02-21
A method of recording a spatially low-frequency heterodyne hologram, including spatially heterodyne fringes for Fourier analysis, includes: splitting a laser beam into a reference beam and an object beam; interacting the object beam with an object; focusing the reference beam and the object beam at a focal plane of a digital recorder to form a spatially low-frequency heterodyne hologram including spatially heterodyne fringes for Fourier analysis; digital recording the spatially low-frequency heterodyne hologram; Fourier transforming axes of the recorded spatially low-frequency heterodyne hologram including spatially heterodyne fringes in Fourier space to sit on top of a heterodyne carrier frequency defined by an angle between the reference beam and the object beam; cutting off signals around an origin; and performing an inverse Fourier transform.
NASA Technical Reports Server (NTRS)
Schlegel, Todd T.; Cortez, Daniel
2010-01-01
Our primary objective was to ascertain which commonly used 12-to-Frank-lead transformation yields spatial QRS-T angle values closest to those obtained from simultaneously collected true Frank-lead recordings. Simultaneous 12-lead and Frank XYZ-lead recordings were analyzed for 100 post-myocardial infarction patients and 50 controls. Relative agreement, with true Frank-lead results, of 12-to-Frank-lead transformed results for the spatial QRS-T angle using Kors regression versus inverse Dower was assessed via ANOVA, Lin s concordance and Bland-Altman plots. Spatial QRS-T angles from the true Frank leads were not significantly different than those derived from the Kors regression-related transformation but were significantly smaller than those derived from the inverse Dower-related transformation (P less than 0.001). Independent of method, spatial mean QRS-T angles were also always significantly larger than spatial maximum (peaks) QRS-T angles. Spatial QRS-T angles are best approximated by regression-related transforms. Spatial mean and spatial peaks QRS-T angles should also not be used interchangeably.
Electron-beam irradiation induced transformation of Cu2(OH)3NO3 nanoflakes into nanocrystalline CuO
NASA Astrophysics Data System (ADS)
Padhi, S. K.; Gottapu, S. N.; Krishna, M. Ghanashyam
2016-05-01
The transmission electron microscope electron-beam (TEM e-beam) as a material modification tool has been demonstrated. The material modification is realised in the high-resolution TEM mode (largest condenser aperture, 150 μm, and 200 nm spot size) at a 200 keV beam energy. The Cu2(OH)3NO3 (CHN) nanoflakes used in this study were microwave solution processed that were layered single crystals and radiation sensitive. The single domain CHN flakes disintegrate into a large number of individual CuO crystallites within a 90 s span of time. The sequential bright-field, dark-field, and selected area electron diffraction modes were employed to record the evolved morphology, microstructural changes, and structural transformation that validate CHN modification. High-resolution transmission electron microscopy imaging of e-beam irradiated regions unambiguously supports the growth of CuO nanoparticles (11.8(3.2) nm in diameter). This study demonstrates e-beam irradiation induced CHN depletion, subsequent nucleation and growth of nanocrystalline CuO regions well embedded in the parent burnt porous matrix which can be useful for miniaturized sensing applications. NaBH4 induced room temperature reduction of CHN to elemental Cu and its printability on paper was also demonstrated.The transmission electron microscope electron-beam (TEM e-beam) as a material modification tool has been demonstrated. The material modification is realised in the high-resolution TEM mode (largest condenser aperture, 150 μm, and 200 nm spot size) at a 200 keV beam energy. The Cu2(OH)3NO3 (CHN) nanoflakes used in this study were microwave solution processed that were layered single crystals and radiation sensitive. The single domain CHN flakes disintegrate into a large number of individual CuO crystallites within a 90 s span of time. The sequential bright-field, dark-field, and selected area electron diffraction modes were employed to record the evolved morphology, microstructural changes, and structural transformation that validate CHN modification. High-resolution transmission electron microscopy imaging of e-beam irradiated regions unambiguously supports the growth of CuO nanoparticles (11.8(3.2) nm in diameter). This study demonstrates e-beam irradiation induced CHN depletion, subsequent nucleation and growth of nanocrystalline CuO regions well embedded in the parent burnt porous matrix which can be useful for miniaturized sensing applications. NaBH4 induced room temperature reduction of CHN to elemental Cu and its printability on paper was also demonstrated. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr02572b
Uranium isotopes fingerprint biotic reduction.
Stylo, Malgorzata; Neubert, Nadja; Wang, Yuheng; Monga, Nikhil; Romaniello, Stephen J; Weyer, Stefan; Bernier-Latmani, Rizlan
2015-05-05
Knowledge of paleo-redox conditions in the Earth's history provides a window into events that shaped the evolution of life on our planet. The role of microbial activity in paleo-redox processes remains unexplored due to the inability to discriminate biotic from abiotic redox transformations in the rock record. The ability to deconvolute these two processes would provide a means to identify environmental niches in which microbial activity was prevalent at a specific time in paleo-history and to correlate specific biogeochemical events with the corresponding microbial metabolism. Here, we demonstrate that the isotopic signature associated with microbial reduction of hexavalent uranium (U), i.e., the accumulation of the heavy isotope in the U(IV) phase, is readily distinguishable from that generated by abiotic uranium reduction in laboratory experiments. Thus, isotope signatures preserved in the geologic record through the reductive precipitation of uranium may provide the sought-after tool to probe for biotic processes. Because uranium is a common element in the Earth's crust and a wide variety of metabolic groups of microorganisms catalyze the biological reduction of U(VI), this tool is applicable to a multiplicity of geological epochs and terrestrial environments. The findings of this study indicate that biological activity contributed to the formation of many authigenic U deposits, including sandstone U deposits of various ages, as well as modern, Cretaceous, and Archean black shales. Additionally, engineered bioremediation activities also exhibit a biotic signature, suggesting that, although multiple pathways may be involved in the reduction, direct enzymatic reduction contributes substantially to the immobilization of uranium.
Martyna, Agnieszka; Michalska, Aleksandra; Zadora, Grzegorz
2015-05-01
The problem of interpretation of common provenance of the samples within the infrared spectra database of polypropylene samples from car body parts and plastic containers as well as Raman spectra databases of blue solid and metallic automotive paints was under investigation. The research involved statistical tools such as likelihood ratio (LR) approach for expressing the evidential value of observed similarities and differences in the recorded spectra. Since the LR models can be easily proposed for databases described by a few variables, research focused on the problem of spectra dimensionality reduction characterised by more than a thousand variables. The objective of the studies was to combine the chemometric tools easily dealing with multidimensionality with an LR approach. The final variables used for LR models' construction were derived from the discrete wavelet transform (DWT) as a data dimensionality reduction technique supported by methods for variance analysis and corresponded with chemical information, i.e. typical absorption bands for polypropylene and peaks associated with pigments present in the car paints. Univariate and multivariate LR models were proposed, aiming at obtaining more information about the chemical structure of the samples. Their performance was controlled by estimating the levels of false positive and false negative answers and using the empirical cross entropy approach. The results for most of the LR models were satisfactory and enabled solving the stated comparison problems. The results prove that the variables generated from DWT preserve signal characteristic, being a sparse representation of the original signal by keeping its shape and relevant chemical information.
Hegazy, Maha A; Lotfy, Hayam M; Mowaka, Shereen; Mohamed, Ekram Hany
2016-07-05
Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study on the efficiency of continuous wavelet transform (CWT) as a signal processing tool in univariate regression and a pre-processing tool in multivariate analysis using partial least square (CWT-PLS) was conducted. These were applied to complex spectral signals of ternary and quaternary mixtures. CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). While, the univariate CWT failed to simultaneously determine the quaternary mixture components and was able to determine only PAR and PAP, the ternary mixtures of DRO, CAF, and PAR and CAF, PAR, and PAP. During the calculations of CWT, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. While for the development of the CWT-PLS model a calibration set was prepared by means of an orthogonal experimental design and their absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and concentration matrices and validation was performed by both cross validation and external validation sets. Both methods were successfully applied for determination of the studied drugs in pharmaceutical formulations. Copyright © 2016 Elsevier B.V. All rights reserved.
32 CFR 806b.54 - Information collections, records, and forms or information management tools (IMT).
Code of Federal Regulations, 2010 CFR
2010-07-01
... information management tools (IMT). 806b.54 Section 806b.54 National Defense Department of Defense (Continued..., records, and forms or information management tools (IMT). (a) Information Collections. No information.../pubfiles/af/37/afman37-139/afman37-139.pdf. (c) Forms or Information Management Tools (Adopted and...
32 CFR 806b.54 - Information collections, records, and forms or information management tools (IMT).
Code of Federal Regulations, 2011 CFR
2011-07-01
... information management tools (IMT). 806b.54 Section 806b.54 National Defense Department of Defense (Continued..., records, and forms or information management tools (IMT). (a) Information Collections. No information.../pubfiles/af/37/afman37-139/afman37-139.pdf. (c) Forms or Information Management Tools (Adopted and...
32 CFR 806b.54 - Information collections, records, and forms or information management tools (IMT).
Code of Federal Regulations, 2012 CFR
2012-07-01
... information management tools (IMT). 806b.54 Section 806b.54 National Defense Department of Defense (Continued..., records, and forms or information management tools (IMT). (a) Information Collections. No information.../pubfiles/af/37/afman37-139/afman37-139.pdf. (c) Forms or Information Management Tools (Adopted and...
32 CFR 806b.54 - Information collections, records, and forms or information management tools (IMT).
Code of Federal Regulations, 2013 CFR
2013-07-01
... information management tools (IMT). 806b.54 Section 806b.54 National Defense Department of Defense (Continued..., records, and forms or information management tools (IMT). (a) Information Collections. No information.../pubfiles/af/37/afman37-139/afman37-139.pdf. (c) Forms or Information Management Tools (Adopted and...
32 CFR 806b.54 - Information collections, records, and forms or information management tools (IMT).
Code of Federal Regulations, 2014 CFR
2014-07-01
... information management tools (IMT). 806b.54 Section 806b.54 National Defense Department of Defense (Continued..., records, and forms or information management tools (IMT). (a) Information Collections. No information.../pubfiles/af/37/afman37-139/afman37-139.pdf. (c) Forms or Information Management Tools (Adopted and...
A Classical Science Transformed.
ERIC Educational Resources Information Center
Kovalevsky, Jean
1979-01-01
Describes how satellites and other tools of space technology have transformed classical geodesy into the science of space geodynamics. The establishment and the activities of the French Center for Geodynamic and Astronomical Research Studies (CERGA) are also included. (HM)
Joint transform correlators with spatially incoherent illumination
NASA Astrophysics Data System (ADS)
Bykovsky, Yuri A.; Karpiouk, Andrey B.; Markilov, Anatoly A.; Rodin, Vladislav G.; Starikov, Sergey N.
1997-03-01
Two variants of joint transform correlators with monochromatic spatially incoherent illumination are considered. The Fourier-holograms of the reference and recognized images are recorded simultaneously or apart in a time on the same spatial light modulator directly by monochromatic spatially incoherent light. To create the signal of mutual correlation of the images it is necessary to execute nonlinear transformation when the hologram is illuminated by coherent light. In the first scheme of the correlator this aim was achieved by using double pas of a restoring coherent wave through the hologram. In the second variant of the correlator the non-linearity of the characteristic of the spatial light modulator for hologram recording was used. Experimental schemes and results on processing teste images by both variants of joint transform correlators with monochromatic spatially incoherent illumination. The use of spatially incoherent light on the input of joint transform correlators permits to reduce the requirements to optical quality of elements, to reduce accuracy requirements on elements positioning and to expand a number of devices suitable to input images in correlators.
de Wet, C; Bowie, P
2009-04-01
A multi-method strategy has been proposed to understand and improve the safety of primary care. The trigger tool is a relatively new method that has shown promise in American and secondary healthcare settings. It involves the focused review of a random sample of patient records using a series of "triggers" that alert reviewers to potential errors and previously undetected adverse events. To develop and test a global trigger tool to detect errors and adverse events in primary-care records. Trigger tool development was informed by previous research and content validated by expert opinion. The tool was applied by trained reviewers who worked in pairs to conduct focused audits of 100 randomly selected electronic patient records in each of five urban general practices in central Scotland. Review of 500 records revealed 2251 consultations and 730 triggers. An adverse event was found in 47 records (9.4%), indicating that harm occurred at a rate of one event per 48 consultations. Of these, 27 were judged to be preventable (42%). A further 17 records (3.4%) contained evidence of a potential adverse event. Harm severity was low to moderate for most patients (82.9%). Error and harm rates were higher in those aged > or =60 years, and most were medication-related (59%). The trigger tool was successful in identifying undetected patient harm in primary-care records and may be the most reliable method for achieving this. However, the feasibility of its routine application is open to question. The tool may have greater utility as a research rather than an audit technique. Further testing in larger, representative study samples is required.
Agrobacterium tumefaciens-mediated transformation of oleaginous yeast Lipomyces species.
Dai, Ziyu; Deng, Shuang; Culley, David E; Bruno, Kenneth S; Magnuson, Jon K
2017-08-01
Interest in using renewable sources of carbon, especially lignocellulosic biomass, for the production of hydrocarbon fuels and chemicals has fueled interest in exploring various organisms capable of producing hydrocarbon biofuels and chemicals or their precursors. The oleaginous (oil-producing) yeast Lipomyces starkeyi is the subject of active research regarding the production of triacylglycerides as hydrocarbon fuel precursors using a variety of carbohydrate and nutrient sources. The genome of L. starkeyi has been published, which opens the door to production strain improvements through the development and use of the tools of synthetic biology for this oleaginous species. The first step in establishment of synthetic biology tools for an organism is the development of effective and reliable transformation methods with suitable selectable marker genes and demonstration of the utility of the genetic elements needed for expression of introduced genes or deletion of endogenous genes. Chemical-based methods of transformation have been published but suffer from low efficiency. To address these problems, Agrobacterium-mediated transformation was investigated as an alternative method for L. starkeyi and other Lipomyces species. In this study, Agrobacterium-mediated transformation was demonstrated to be effective in the transformation of both L. starkeyi and other Lipomyces species. The deletion of the peroxisomal biogenesis factor 10 gene was also demonstrated in L. starkeyi. In addition to the bacterial antibiotic selection marker gene hygromycin B phosphotransferase, the bacterial β-glucuronidase reporter gene under the control of L. starkeyi translation elongation factor 1α promoter was also stably expressed in six different Lipomyces species. The results from this study demonstrate that Agrobacterium-mediated transformation is a reliable and effective genetic tool for homologous recombination and expression of heterologous genes in L. starkeyi and other Lipomyces species.
Task scheduling in dataflow computer architectures
NASA Technical Reports Server (NTRS)
Katsinis, Constantine
1994-01-01
Dataflow computers provide a platform for the solution of a large class of computational problems, which includes digital signal processing and image processing. Many typical applications are represented by a set of tasks which can be repetitively executed in parallel as specified by an associated dataflow graph. Research in this area aims to model these architectures, develop scheduling procedures, and predict the transient and steady state performance. Researchers at NASA have created a model and developed associated software tools which are capable of analyzing a dataflow graph and predicting its runtime performance under various resource and timing constraints. These models and tools were extended and used in this work. Experiments using these tools revealed certain properties of such graphs that require further study. Specifically, the transient behavior at the beginning of the execution of a graph can have a significant effect on the steady state performance. Transformation and retiming of the application algorithm and its initial conditions can produce a different transient behavior and consequently different steady state performance. The effect of such transformations on the resource requirements or under resource constraints requires extensive study. Task scheduling to obtain maximum performance (based on user-defined criteria), or to satisfy a set of resource constraints, can also be significantly affected by a transformation of the application algorithm. Since task scheduling is performed by heuristic algorithms, further research is needed to determine if new scheduling heuristics can be developed that can exploit such transformations. This work has provided the initial development for further long-term research efforts. A simulation tool was completed to provide insight into the transient and steady state execution of a dataflow graph. A set of scheduling algorithms was completed which can operate in conjunction with the modeling and performance tools previously developed. Initial studies on the performance of these algorithms were done to examine the effects of application algorithm transformations as measured by such quantities as number of processors, time between outputs, time between input and output, communication time, and memory size.
Transforming data into usable knowledge: the CIRC experience
NASA Astrophysics Data System (ADS)
Mote, P.; Lach, D.; Hartmann, H.; Abatzoglou, J. T.; Stevenson, J.
2017-12-01
NOAA's northwest RISA, the Climate Impacts Research Consortium, emphasizes the transformation of data into usable knowledge. This effort involves physical scientists (e.g., Abatzoglou) building web-based tools with climate and hydrologic data and model output, a team performing data mining to link crop loss claims to droughts, social scientists (eg., Lach, Hartmann) evaluating the effectiveness of such tools at communicating with end users, and two-way engagement with a wide variety of audiences who are interested in using and improving the tools. Unusual in this effort is the seamless integration across timescales past, present, and future; data mining; and the level of effort in evaluating the tools. We provide examples of agriculturally relevant climate variables (e.g. growing degree days, day of first fall freeze) and describe the iterative process of incorporating user feedback.
Tuberculosis vaccines: barriers and prospects on the quest for a transformative tool.
Karp, Christopher L; Wilson, Christopher B; Stuart, Lynda M
2015-03-01
The road to a more efficacious vaccine that could be a truly transformative tool for decreasing tuberculosis morbidity and mortality, along with Mycobacterium tuberculosis transmission, is quite daunting. Despite this, there are reasons for optimism. Abetted by better conceptual clarity, clear acknowledgment of the degree of our current immunobiological ignorance, the availability of powerful new tools for dissecting the immunopathogenesis of human tuberculosis, the generation of more creative diversity in tuberculosis vaccine concepts, the development of better fit-for-purpose animal models, and the potential of more pragmatic approaches to the clinical testing of vaccine candidates, the field has promise for delivering novel tools for dealing with this worldwide scourge of poverty. © 2015 The Authors. Immunological Reviews Published by John Wiley & Sons Ltd.
Electronic health record tools' support of nurses' clinical judgment and team communication.
Kossman, Susan P; Bonney, Leigh Ann; Kim, Myoung Jin
2013-11-01
Nurses need to quickly process information to form clinical judgments, communicate with the healthcare team, and guide optimal patient care. Electronic health records not only offer potential for enhanced care but also introduce unintended consequences through changes in workflow, clinical judgment, and communication. We investigated nurses' use of improvised (self-made) and electronic health record-generated cognitive artifacts on clinical judgment and team communication. Tanner's Clinical Judgment Model provided a framework and basis for questions in an online survey and focus group interviews. Findings indicated that (1) nurses rated self-made work lists and medication administration records highest for both clinical judgment and communication, (2) tools aided different dimensions of clinical judgment, and (3) interdisciplinary tools enhance team communication. Implications are that electronic health record tool redesign could better support nursing work.
Leadership DNA: The Ford Motor Story.
ERIC Educational Resources Information Center
Friedman, Stewart D.
2001-01-01
The Ford Motor Company invested in transformational leadership to change itself. Programs center around core principles: adopt a transformational mindset, use action learning, leverage the power of electronic tools, integrate work and life, and generate business impact. (JOW)
Walsh transforms and signal detection
NASA Technical Reports Server (NTRS)
Welch, L. R.
1977-01-01
The detection of signals using Walsh power spectral estimates is analyzed. In addition, a generalization of this method of estimation is evaluated. The conclusion is that Walsh transforms are not suitable tools for the detection of weak signals in noise.
Efficient processing of MPEG-21 metadata in the binary domain
NASA Astrophysics Data System (ADS)
Timmerer, Christian; Frank, Thomas; Hellwagner, Hermann; Heuer, Jörg; Hutter, Andreas
2005-10-01
XML-based metadata is widely adopted across the different communities and plenty of commercial and open source tools for processing and transforming are available on the market. However, all of these tools have one thing in common: they operate on plain text encoded metadata which may become a burden in constrained and streaming environments, i.e., when metadata needs to be processed together with multimedia content on the fly. In this paper we present an efficient approach for transforming such kind of metadata which are encoded using MPEG's Binary Format for Metadata (BiM) without additional en-/decoding overheads, i.e., within the binary domain. Therefore, we have developed an event-based push parser for BiM encoded metadata which transforms the metadata by a limited set of processing instructions - based on traditional XML transformation techniques - operating on bit patterns instead of cost-intensive string comparisons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Tian-Jy; Kim, Younghun
Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function tomore » convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.« less
Automation of the guiding center expansion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burby, J. W.; Squire, J.; Qin, H.
2013-07-15
We report on the use of the recently developed Mathematica package VEST (Vector Einstein Summation Tools) to automatically derive the guiding center transformation. Our Mathematica code employs a recursive procedure to derive the transformation order-by-order. This procedure has several novel features. (1) It is designed to allow the user to easily explore the guiding center transformation's numerous non-unique forms or representations. (2) The procedure proceeds entirely in cartesian position and velocity coordinates, thereby producing manifestly gyrogauge invariant results; the commonly used perpendicular unit vector fields e{sub 1},e{sub 2} are never even introduced. (3) It is easy to apply in themore » derivation of higher-order contributions to the guiding center transformation without fear of human error. Our code therefore stands as a useful tool for exploring subtle issues related to the physics of toroidal momentum conservation in tokamaks.« less
NASA Astrophysics Data System (ADS)
Weber, M. E.; Reichelt, L.; Kuhn, G.; Pfeiffer, M.; Korff, B.; Thurow, J.; Ricken, W.
2010-03-01
We present tools for rapid and quantitative detection of sediment lamination. The BMPix tool extracts color and gray scale curves from images at pixel resolution. The PEAK tool uses the gray scale curve and performs, for the first time, fully automated counting of laminae based on three methods. The maximum count algorithm counts every bright peak of a couplet of two laminae (annual resolution) in a smoothed curve. The zero-crossing algorithm counts every positive and negative halfway passage of the curve through a wide moving average, separating the record into bright and dark intervals (seasonal resolution). The same is true for the frequency truncation method, which uses Fourier transformation to decompose the curve into its frequency components before counting positive and negative passages. The algorithms are available at doi:10.1594/PANGAEA.729700. We applied the new methods successfully to tree rings, to well-dated and already manually counted marine varves from Saanich Inlet, and to marine laminae from the Antarctic continental margin. In combination with AMS14C dating, we found convincing evidence that laminations in Weddell Sea sites represent varves, deposited continuously over several millennia during the last glacial maximum. The new tools offer several advantages over previous methods. The counting procedures are based on a moving average generated from gray scale curves instead of manual counting. Hence, results are highly objective and rely on reproducible mathematical criteria. Also, the PEAK tool measures the thickness of each year or season. Since all information required is displayed graphically, interactive optimization of the counting algorithms can be achieved quickly and conveniently.
Holographic memory system based on projection recording of computer-generated 1D Fourier holograms.
Betin, A Yu; Bobrinev, V I; Donchenko, S S; Odinokov, S B; Evtikhiev, N N; Starikov, R S; Starikov, S N; Zlokazov, E Yu
2014-10-01
Utilization of computer generation of holographic structures significantly simplifies the optical scheme that is used to record the microholograms in a holographic memory record system. Also digital holographic synthesis allows to account the nonlinear errors of the record system to improve the microholograms quality. The multiplexed record of holograms is a widespread technique to increase the data record density. In this article we represent the holographic memory system based on digital synthesis of amplitude one-dimensional (1D) Fourier transform holograms and the multiplexed record of these holograms onto the holographic carrier using optical projection scheme. 1D Fourier transform holograms are very sensitive to orientation of the anamorphic optical element (cylindrical lens) that is required for encoded data object reconstruction. The multiplex record of several holograms with different orientation in an optical projection scheme allowed reconstruction of the data object from each hologram by rotating the cylindrical lens on the corresponding angle. Also, we discuss two optical schemes for the recorded holograms readout: a full-page readout system and line-by-line readout system. We consider the benefits of both systems and present the results of experimental modeling of 1D Fourier holograms nonmultiplex and multiplex record and reconstruction.
Johnson, Karin E; Kamineni, Aruna; Fuller, Sharon; Olmstead, Danielle; Wernli, Karen J
2014-01-01
The use of electronic health records (EHRs) for research is proceeding rapidly, driven by computational power, analytical techniques, and policy. However, EHR-based research is limited by the complexity of EHR data and a lack of understanding about data provenance, meaning the context under which the data were collected. This paper presents system flow mapping as a method to help researchers more fully understand the provenance of their EHR data as it relates to local workflow. We provide two specific examples of how this method can improve data identification, documentation, and processing. EHRs store clinical and administrative data, often in unstructured fields. Each clinical system has a unique and dynamic workflow, as well as an EHR customized for local use. The EHR customization may be influenced by a broader context such as documentation required for billing. We present a case study with two examples of using system flow mapping to characterize EHR data for a local colorectal cancer screening process. System flow mapping demonstrated that information entered into the EHR during clinical practice required interpretation and transformation before it could be accurately applied to research. We illustrate how system flow mapping shaped our knowledge of the quality and completeness of data in two examples: (1) determining colonoscopy indication as recorded in the EHR, and (2) discovering a specific EHR form that captured family history. Researchers who do not consider data provenance risk compiling data that are systematically incomplete or incorrect. For example, researchers who are not familiar with the clinical workflow under which data were entered might miss or misunderstand patient information or procedure and diagnostic codes. Data provenance is a fundamental characteristic of research data from EHRs. Given the diversity of EHR platforms and system workflows, researchers need tools for evaluating and reporting data availability, quality, and transformations. Our case study illustrates how system mapping can inform researchers about the provenance of their data as it pertains to local workflows.
Do's and don'ts in Fourier analysis of steady-state potentials.
Bach, M; Meigen, T
1999-01-01
Fourier analysis is a powerful tool in signal analysis that can be very fruitfully applied to steady-state evoked potentials (flicker ERG, pattern ERG, VEP, etc.). However, there are some inherent assumptions in the underlying discrete Fourier transform (DFT) that are not necessarily fulfilled in typical electrophysiological recording and analysis conditions. Furthermore, engineering software-packages may be ill-suited and/or may not fully exploit the information of steady-state recordings. Specifically: * In the case of steady-state stimulation we know more about the stimulus than in standard textbook situations (exact frequency, phase stability), so 'windowing' and calculation of the 'periodogram' are not necessary. * It is mandatory to choose an integer relationship between sampling rate and frame rate when employing a raster-based CRT stimulator. * The analysis interval must comprise an exact integer number (e.g., 10) of stimulus periods. * The choice of the number of stimulus periods per analysis interval needs a wise compromise: A high number increases the frequency resolution, but makes artifact removal difficult; a low number 'spills' noise into the response frequency. * There is no need to feel tied to a power-of-two number of data points as required by standard FFT, 'resampling' is an easy and efficient alternative. * Proper estimates of noise-corrected Fourier magnitude and statistical significance can be calculated that take into account the non-linear superposition of signal and noise. These aspects are developed in an intuitive approach with examples using both simulations and recordings. Proper use of Fourier analysis of our electrophysiological records will reduce recording time and/or increase the reliability of physiologic or pathologic interpretations.
Waveform shape analysis: extraction of physiologically relevant information from Doppler recordings.
Ramsay, M M; Broughton Pipkin, F; Rubin, P C; Skidmore, R
1994-05-01
1. Doppler recordings were made from the brachial artery of healthy female subjects during a series of manoeuvres which altered the pressure-flow characteristics of the vessel. 2. Changes were induced in the peripheral circulation of the forearm by the application of heat or ice-packs. A sphygmomanometer cuff was used to create graded occlusion of the vessel above and below the point of measurement. Recordings were also made whilst the subjects performed a standardized Valsalva manoeuvre. 3. The Doppler recordings were analysed both with the standard waveform indices (systolic/diastolic ratio, pulsatility index and resistance index) and by the method of Laplace transform analysis. 4. The waveform parameters obtained by Laplace transform analysis distinguished the different changes in flow conditions; they thus had direct physiological relevance, unlike the standard waveform indices.
Validation of a general practice audit and data extraction tool.
Peiris, David; Agaliotis, Maria; Patel, Bindu; Patel, Anushka
2013-11-01
We assessed how accurately a common general practitioner (GP) audit tool extracts data from two software systems. First, pathology test codes were audited at 33 practices covering nine companies. Second, a manual audit of chronic disease data from 200 random patient records at two practices was compared with audit tool data. Pathology review: all companies assigned correct codes for cholesterol, creatinine and glycated haemoglobin; four companies assigned incorrect codes for albuminuria tests, precluding accurate detection with the audit tool. Case record review: there was strong agreement between the manual audit and the tool for all variables except chronic kidney disease diagnoses, which was due to a tool-related programming error. The audit tool accurately detected most chronic disease data in two GP record systems. The one exception, however, highlights the importance of surveillance systems to promptly identify errors. This will maximise potential for audit tools to improve healthcare quality.
Modeling biochemical transformation processes and information processing with Narrator.
Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner
2007-03-27
Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as Java software program and available as open-source from http://www.narrator-tool.org.
The efficiency of geophysical adjoint codes generated by automatic differentiation tools
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Köhl, A.; Stammer, D.
2016-02-01
The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
Estimation of hydrolysis rate constants for carbamates
Cheminformatics based tools, such as the Chemical Transformation Simulator under development in EPA’s Office of Research and Development, are being increasingly used to evaluate chemicals for their potential to degrade in the environment or be transformed through metabolism...
Note: Modification of an FTIR spectrometer for optoelectronic characterizations.
Puspitosari, N; Longeaud, C
2017-08-01
We propose a very simple system to be adapted to a Fourier Transform Infra-Red (FTIR) spectrometer with which three different types of characterizations can be done: the Fourier transform photocurrent spectroscopy, the recording of reflection-transmission spectra of thin film semiconductors, and the acquisition of spectral responses of solar cells. In addition to gather three techniques into a single apparatus, this FTIR-based system also significantly reduces the recording time and largely improves the resolution of the measured spectra compared to standard equipments.
Note: Modification of an FTIR spectrometer for optoelectronic characterizations
NASA Astrophysics Data System (ADS)
Puspitosari, N.; Longeaud, C.
2017-08-01
We propose a very simple system to be adapted to a Fourier Transform Infra-Red (FTIR) spectrometer with which three different types of characterizations can be done: the Fourier transform photocurrent spectroscopy, the recording of reflection-transmission spectra of thin film semiconductors, and the acquisition of spectral responses of solar cells. In addition to gather three techniques into a single apparatus, this FTIR-based system also significantly reduces the recording time and largely improves the resolution of the measured spectra compared to standard equipments.
Use of Natural Transformation To Establish an Easy Knockout Method in Riemerella anatipestifer.
Liu, MaFeng; Zhang, Li; Huang, Li; Biville, Francis; Zhu, DeKang; Wang, MingShu; Jia, RenYong; Chen, Shun; Sun, KunFeng; Yang, Qiao; Wu, Ying; Chen, XiaoYue; Cheng, AnChun
2017-05-01
Riemerella anatipestifer is a member of the family Flavobacteriaceae and a major causative agent of duck serositis. Little is known about its genetics and pathogenesis. Several bacteria are competent for natural transformation; however, whether R. anatipestifer is also competent for natural transformation has not been investigated. Here, we showed that R. anatipestifer strain ATCC 11845 can uptake the chromosomal DNA of R. anatipestifer strain RA-CH-1 in all growth phases. Subsequently, a natural transformation-based knockout method was established for R. anatipestifer ATCC 11845. Targeted mutagenesis gave transformation frequencies of ∼10 -5 transformants. Competition assay experiments showed that R. anatipestifer ATCC 11845 preferentially took up its own DNA rather than heterogeneous DNA, such as Escherichia coli DNA. Transformation was less efficient with the shuttle plasmid pLMF03 (transformation frequencies of ∼10 -9 transformants). However, the efficiency of transformation was increased approximately 100-fold using pLMF03 derivatives containing R. anatipestifer DNA fragments (transformation frequencies of ∼10 -7 transformants). Finally, we found that the R. anatipestifer RA-CH-1 strain was also naturally transformable, suggesting that natural competence is widely applicable for this species. The findings described here provide important tools for the genetic manipulation of R. anatipestifer IMPORTANCE Riemerella anatipestifer is an important duck pathogen that belongs to the family Flavobacteriaceae At least 21 different serotypes have been identified. Genetic diversity has been demonstrated among these serotypes. The genetic and pathogenic mechanisms of R. anatipestifer remain largely unknown because no genetic tools are available for this bacterium. At present, natural transformation has been found in some bacteria but not in R. anatipestifer For the first time, we showed that natural transformation occurred in R. anatipestifer ATCC 11845 and R. anatipestifer RA-CH-1. Then, we established an easy gene knockout method in R. anatipestifer based on natural transformation. This information is important for further studies of the genetic diversity and pathogenesis in R. anatipestifer . Copyright © 2017 American Society for Microbiology.
Kang, Sokbom; Lee, Jong-Min; Lee, Jae-Kwan; Kim, Jae-Weon; Cho, Chi-Heum; Kim, Seok-Mo; Park, Sang-Yoon; Park, Chan-Yong; Kim, Ki-Tae
2014-03-01
The purpose of this study is to develop a Web-based nomogram for predicting the individualized risk of para-aortic nodal metastasis in incompletely staged patients with endometrial cancer. From 8 institutions, the medical records of 397 patients who underwent pelvic and para-aortic lymphadenectomy as a surgical staging procedure were retrospectively reviewed. A multivariate logistic regression model was created and internally validated by rigorous bootstrap resampling methods. Finally, the model was transformed into a user-friendly Web-based nomogram (http://http://www.kgog.org/nomogram/empa001.html). The rate of para-aortic nodal metastasis was 14.4% (57/397 patients). Using a stepwise variable selection, 4 variables including deep myometrial invasion, non-endometrioid subtype, lymphovascular space invasion, and log-transformed CA-125 levels were finally adopted. After 1000 repetitions of bootstrapping, all of these 4 variables retained a significant association with para-aortic nodal metastasis in the multivariate analysis-deep myometrial invasion (P = 0.001), non-endometrioid histologic subtype (P = 0.034), lymphovascular space invasion (P = 0.003), and log-transformed serum CA-125 levels (P = 0.004). The model showed good discrimination (C statistics = 0.87; 95% confidence interval, 0.82-0.92) and accurate calibration (Hosmer-Lemeshow P = 0.74). This nomogram showed good performance in predicting para-aortic metastasis in patients with endometrial cancer. The tool may be useful in determining the extent of lymphadenectomy after incomplete surgery.
Harnessing the Power of Big Data to Improve Graduate Medical Education: Big Idea or Bust?
Arora, Vineet M
2018-06-01
With the advent of electronic medical records (EMRs) fueling the rise of big data, the use of predictive analytics, machine learning, and artificial intelligence are touted as transformational tools to improve clinical care. While major investments are being made in using big data to transform health care delivery, little effort has been directed toward exploiting big data to improve graduate medical education (GME). Because our current system relies on faculty observations of competence, it is not unreasonable to ask whether big data in the form of clinical EMRs and other novel data sources can answer questions of importance in GME such as when is a resident ready for independent practice.The timing is ripe for such a transformation. A recent National Academy of Medicine report called for reforms to how GME is delivered and financed. While many agree on the need to ensure that GME meets our nation's health needs, there is little consensus on how to measure the performance of GME in meeting this goal. During a recent workshop at the National Academy of Medicine on GME outcomes and metrics in October 2017, a key theme emerged: Big data holds great promise to inform GME performance at individual, institutional, and national levels. In this Invited Commentary, several examples are presented, such as using big data to inform clinical experience and provide clinically meaningful data to trainees, and using novel data sources, including ambient data, to better measure the quality of GME training.
Primers-4-Yeast: a comprehensive web tool for planning primers for Saccharomyces cerevisiae.
Yofe, Ido; Schuldiner, Maya
2014-02-01
The budding yeast Saccharomyces cerevisiae is a key model organism of functional genomics, due to its ease and speed of genetic manipulations. In fact, in this yeast, the requirement for homologous sequences for recombination purposes is so small that 40 base pairs (bp) are sufficient. Hence, an enormous variety of genetic manipulations can be performed by simply planning primers with the correct homology, using a defined set of transformation plasmids. Although designing primers for yeast transformations and for the verification of their correct insertion is a common task in all yeast laboratories, primer planning is usually done manually and a tool that would enable easy, automated primer planning for the yeast research community is still lacking. Here we introduce Primers-4-Yeast, a web tool that allows primers to be designed in batches for S. cerevisiae gene-targeting transformations, and for the validation of correct insertions. This novel tool enables fast, automated, accurate primer planning for large sets of genes, introduces consistency in primer planning and is therefore suggested to serve as a standard in yeast research. Primers-4-Yeast is available at: http://www.weizmann.ac.il/Primers-4-Yeast Copyright © 2013 John Wiley & Sons, Ltd.
Unsupervised pattern recognition methods in ciders profiling based on GCE voltammetric signals.
Jakubowska, Małgorzata; Sordoń, Wanda; Ciepiela, Filip
2016-07-15
This work presents a complete methodology of distinguishing between different brands of cider and ageing degrees, based on voltammetric signals, utilizing dedicated data preprocessing procedures and unsupervised multivariate analysis. It was demonstrated that voltammograms recorded on glassy carbon electrode in Britton-Robinson buffer at pH 2 are reproducible for each brand. By application of clustering algorithms and principal component analysis visible homogenous clusters were obtained. Advanced signal processing strategy which included automatic baseline correction, interval scaling and continuous wavelet transform with dedicated mother wavelet, was a key step in the correct recognition of the objects. The results show that voltammetry combined with optimized univariate and multivariate data processing is a sufficient tool to distinguish between ciders from various brands and to evaluate their freshness. Copyright © 2016 Elsevier Ltd. All rights reserved.
The development and evaluation of a new coding system for medical records.
Papazissis, Elias
2014-01-01
The present study aims to develop a simple, reliable and easy tool enabling clinicians to codify the major part of individualized medical details (patient history and findings of physical examination) quickly and easily in routine medical practice, by entering data to a purpose-built software application, using structure data elements and detailed medical illustrations. We studied medical records of 9,320 patients and we extracted individualized medical details. We recorded the majority of symptoms and the majority of findings of physical examination into the system, which was named IMPACT® (Intelligent Medical Patient Record and Coding Tool). Subsequently the system was evaluated by clinicians, based on the examination of 1206 patients. The evaluation results showed that IMPACT® is an efficient tool, easy to use even under time-pressing conditions. IMPACT® seems to be a promising tool for illustration-guided, structured data entry of medical narrative, in electronic patient records.
Market Transformation | Hydrogen and Fuel Cells | NREL
deployment sites Develop techno-economic assessment tools, deployment tools, and business cases for various fuel cell applications Collect and evaluate data from deployment projects to verify the business cases
Power Plant Model Validation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
The PPMV is used to validate generator model using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables the automation of the process of power plant model validation using disturbance recordings. The tool usesmore » PMU and SCADA measurements as input information. The tool automatically adjusts all required EPCL scripts and interacts with GE PSLF in the batch mode. The main tool features includes: The tool interacts with GE PSLF; The tool uses GE PSLF Play-In Function for generator model validation; Database of projects (model validation studies); Database of the historic events; Database of the power plant; The tool has advanced visualization capabilities; and The tool automatically generates reports« less
Kannan, Vaishnavi; Fish, Jason S; Mutz, Jacqueline M; Carrington, Angela R; Lai, Ki; Davis, Lisa S; Youngblood, Josh E; Rauschuber, Mark R; Flores, Kathryn A; Sara, Evan J; Bhat, Deepa G; Willett, DuWayne L
2017-06-14
Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. We adopted as guiding principles to (a) capture data as a byproduct of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed - either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM) - were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined "grains" from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week "sprints" for rapid-cycle feedback and refinement. Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. This study suggests concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often.
Kannan, Vaishnavi; Fish, Jason S; Mutz, Jacqueline M; Carrington, Angela R; Lai, Ki; Davis, Lisa S; Youngblood, Josh E; Rauschuber, Mark R; Flores, Kathryn A; Sara, Evan J; Bhat, Deepa G; Willett, DuWayne L
2017-01-01
Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. We adopted as guiding principles to (a) capture data as a byproduct of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed - either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM) - were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined "grains" from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-gener-ated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week "sprints" for rapid-cycle feedback and refinement. Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. This study suggests concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often. Schattauer GmbH.
Hypercomplex Fourier transforms of color images.
Ell, Todd A; Sangwine, Stephen J
2007-01-01
Fourier transforms are a fundamental tool in signal and image processing, yet, until recently, there was no definition of a Fourier transform applicable to color images in a holistic manner. In this paper, hypercomplex numbers, specifically quaternions, are used to define a Fourier transform applicable to color images. The properties of the transform are developed, and it is shown that the transform may be computed using two standard complex fast Fourier transforms. The resulting spectrum is explained in terms of familiar phase and modulus concepts, and a new concept of hypercomplex axis. A method for visualizing the spectrum using color graphics is also presented. Finally, a convolution operational formula in the spectral domain is discussed.
NASA Astrophysics Data System (ADS)
Ciocca, M.
2016-12-01
(Abstract only) Since Fall of 2014, AAVSO made available two very useful software tools: transform generator (tg) and transform applier (ta). tg, authored by Gordon Myers (gordonmyers@hotmail.com), is a program, running under python that allows the user to obtain the transformation coefficients of their imaging train. ta, authored by George Silvis, allows users to apply the transformation coefficients obtained previously to their photometric observation. The data so processed become then directly comparable to those of other observers. I will show how to obtain transform coefficient using two Standard Field (M 67 and NGC7790), how consistent the results are and as an application, I will present transformed data for two AAVSO Target stars, AE UMA and RR CET.
Prediction of Hydrolysis Products of Organic Chemicals under Environmental pH Conditions
Cheminformatics-based software tools can predict the molecular structure of transformation products using a library of transformation reaction schemes. This paper presents the development of such a library for abiotic hydrolysis of organic chemicals under environmentally relevant...
Modelling skin penetration using the Laplace transform technique.
Anissimov, Y G; Watkinson, A
2013-01-01
The Laplace transform is a convenient mathematical tool for solving ordinary and partial differential equations. The application of this technique to problems arising in drug penetration through the skin is reviewed in this paper. © 2013 S. Karger AG, Basel.
NASA Astrophysics Data System (ADS)
Capò Sànchez, J.; Huallpa, E.; Farina, P.; Padovese, L. R.; Goldenstein, H.
2011-10-01
Magnetic Barkhausen noise (MBN) was used to characterize the progress of austenite to martensite phase transformation while cooling steel specimens, using a conventional Barkhausen noise emission setup stimulated by an alternating magnetic field. The phase transformation was also followed by electrical resistivity measurements and by optical and scanning electron microscopy. MBN measurements on a AISI D2 tool steel austenitized at 1473 K and cooled to liquid nitrogen temperature presented a clear change near 225 K during cooling, corresponding to the MS (martensite start) temperature, as confirmed by resistivity measurements. Analysis of the resulting signals suggested a novel experimental technique that measures spontaneous magnetic emission during transformation, in the absence of any external field. Spontaneous magnetic noise emission measurements were registered in situ while cooling an initially austenitic sample in liquid nitrogen, showing that local microstructural changes, corresponding to an avalanche or "burst" phenomena, could be detected. This spontaneous magnetic emission (SME) can thus be considered a new experimental tool for the study of martensite transformations in ferrous alloys, at the same level as acoustic emission.
Installation/Removal Tool for Screw-Mounted Components
NASA Technical Reports Server (NTRS)
Ash, J. P.
1984-01-01
Tweezerlike tool simplifies installation of screws in places reached only through narrow openings. With changes in size and shape, basic tool concept applicable to mounting and dismounting of transformers, sockets, terminal strips and mechanical parts. Inexpensive tool fabricated as needed by bending two pieces of steel wire. Exact size and shape selected to suit part manipulated and nature of inaccessible mounting space.
2006-02-13
business transformation is cautiously mechanistic or not much different than earlier versions of process improvement systems. This strategic research...tool for business transformation meet future needs of the Army and what changes to current systems are required. The Army should not present L6s as a...continually adapt how we approach and confront challenges, conduct business , and work with others.”3 The Secretary’s purpose for continuous transformation is
Report Central: quality reporting tool in an electronic health record.
Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H; Middleton, Blackford; Einbinder, Jonathan S
2006-01-01
Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XItrade mark and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow.
Absence of External Electric-Field Effects on Transformations in Steels
1991-10-01
12 2. Approximate CCT diagram for the high nickel composition used in the present measurements ...................................... 13 3...Main features of CCT diagram for 02 tool steel ........................ 14 4. DTA and THA data for the 3569C isothermal bainite transformation with...on the continuous-cooling-transformation ( CCT ) diagram obtained by examining transfor- mations in a 3.0 weight percent (wt.%) nickel specimen at
Efficient Privacy-Aware Record Integration.
Kuzu, Mehmet; Kantarcioglu, Murat; Inan, Ali; Bertino, Elisa; Durham, Elizabeth; Malin, Bradley
2013-01-01
The integration of information dispersed among multiple repositories is a crucial step for accurate data analysis in various domains. In support of this goal, it is critical to devise procedures for identifying similar records across distinct data sources. At the same time, to adhere to privacy regulations and policies, such procedures should protect the confidentiality of the individuals to whom the information corresponds. Various private record linkage (PRL) protocols have been proposed to achieve this goal, involving secure multi-party computation (SMC) and similarity preserving data transformation techniques. SMC methods provide secure and accurate solutions to the PRL problem, but are prohibitively expensive in practice, mainly due to excessive computational requirements. Data transformation techniques offer more practical solutions, but incur the cost of information leakage and false matches. In this paper, we introduce a novel model for practical PRL, which 1) affords controlled and limited information leakage, 2) avoids false matches resulting from data transformation. Initially, we partition the data sources into blocks to eliminate comparisons for records that are unlikely to match. Then, to identify matches, we apply an efficient SMC technique between the candidate record pairs. To enable efficiency and privacy, our model leaks a controlled amount of obfuscated data prior to the secure computations. Applied obfuscation relies on differential privacy which provides strong privacy guarantees against adversaries with arbitrary background knowledge. In addition, we illustrate the practical nature of our approach through an empirical analysis with data derived from public voter records.
Algorithm for the classification of multi-modulating signals on the electrocardiogram.
Mita, Mitsuo
2007-03-01
This article discusses the algorithm to measure electrocardiogram (ECG) and respiration simultaneously and to have the diagnostic potentiality for sleep apnoea from ECG recordings. The algorithm is composed by the combination with the three particular scale transform of a(j)(t), u(j)(t), o(j)(a(j)) and the statistical Fourier transform (SFT). Time and magnitude scale transforms of a(j)(t), u(j)(t) change the source into the periodic signal and tau(j) = o(j)(a(j)) confines its harmonics into a few instantaneous components at tau(j) being a common instant on two scales between t and tau(j). As a result, the multi-modulating source is decomposed by the SFT and is reconstructed into ECG, respiration and the other signals by inverse transform. The algorithm is expected to get the partial ventilation and the heart rate variability from scale transforms among a(j)(t), a(j+1)(t) and u(j+1)(t) joining with each modulation. The algorithm has a high potentiality of the clinical checkup for the diagnosis of sleep apnoea from ECG recordings.
NASA Astrophysics Data System (ADS)
Kuhnert, Kristin; Quedenau, Jörn
2016-04-01
Integration and harmonization of large spatial data sets is not only since the introduction of the spatial data infrastructure INSPIRE a big issue. The process of extracting and combining spatial data from heterogeneous source formats, transforming that data to obtain the required quality for particular purposes and loading it into a data store, are common tasks. The procedure of Extraction, Transformation and Loading of data is called ETL process. Geographic Information Systems (GIS) can take over many of these tasks but often they are not suitable for processing large datasets. ETL tools can make the implementation and execution of ETL processes convenient and efficient. One reason for choosing ETL tools for data integration is that they ease maintenance because of a clear (graphical) presentation of the transformation steps. Developers and administrators are provided with tools for identification of errors, analyzing processing performance and managing the execution of ETL processes. Another benefit of ETL tools is that for most tasks no or only little scripting skills are required so that also researchers without programming background can easily work with it. Investigations on ETL tools for business approaches are available for a long time. However, little work has been published on the capabilities of those tools to handle spatial data. In this work, we review and compare the open source ETL tools GeoKettle and Talend Open Studio in terms of processing spatial data sets of different formats. For evaluation, ETL processes are performed with both software packages based on air quality data measured during the BÄRLIN2014 Campaign initiated by the Institute for Advanced Sustainability Studies (IASS). The aim of the BÄRLIN2014 Campaign is to better understand the sources and distribution of particulate matter in Berlin. The air quality data are available in heterogeneous formats because they were measured with different instruments. For further data analysis, the instrument data has been complemented by other georeferenced data provided by the local environmental authorities. This includes both vector and raster data on e.g. land use categories or building heights, extracted from flat files and OGC-compliant web services. The requirements on the ETL tools are now for instance the extraction of different input datasets like Web Feature Services or vector datasets and the loading of those into databases. The tools also have to manage transformations on spatial datasets like to work with spatial functions (e.g. intersection, union) or change spatial reference systems. Preliminary results suggest that many complex transformation tasks could be accomplished with the existing set of components from both software tools, while there are still many gaps in the range of available features. Both ETL tools differ in functionality and in the way of implementation of various steps. For some tasks no predefined components are available at all, which could partly be compensated by the use of the respective API (freely configurable components in Java or JavaScript).
Agrobacterium tumefaciens-mediated transformation of oleaginous yeast Lipomyces species
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Ziyu; Deng, Shuang; Culley, David E.
Background: Because of interest in the production of renewable bio-hydrocarbon fuels, various living organisms have been explored for their potential use in producing fuels and chemicals. The oil-producing (oleaginous) yeast Lipomyces starkeyi is the subject of active research regarding the production of lipids using a wide variety of carbon and nutrient sources. The genome of L. starkeyi has been published, which opens the door to production strain improvements using the tools of synthetic biology and metabolic engineering. However, using these tools for strain improvement requires the establishment of effective and reliable transformation methods with suitable selectable markers (antibiotic resistance ormore » auxotrophic marker genes) and the necessary genetic elements (promoters and terminators) for expression of introduced genes. Chemical-based methods have been published, but suffer from low efficiency or the requirement for targeting to rRNA loci. To address these problems, Agrobacterium-mediated transformation was investigated as an alternative method for L. starkeyi and other Lipomyces species. Results: In this study, Agrobacterium-mediated transformation was demonstrated to be effective in the transformation of both L. starkeyi and other Lipomyces species and that the introduced DNA can be reliably integrated into the chromosomes of these species. The gene deletion of Ku70 and Pex10 was also demonstrated in L. starkeyi. In addition to the bacterial antibiotic selection marker gene hygromycin B phosphotransferase, the bacterial -glucuronidase reporter gene under the control of L. starkeyi translation elongation factor 1 promoter was also stably expressed in seven different Lipomyces species. Conclusion: The results from this study clearly demonstrate that Agrobacterium-mediated transformation is a reliable genetic tool for gene deletion and integration and expression of heterologous genes in L. starkeyi and other Lipomyces species.« less
Modeling biochemical transformation processes and information processing with Narrator
Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner
2007-01-01
Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as Java software program and available as open-source from . PMID:17389034
TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenwood, Michael S; Cetiner, Mustafa S; Fugate, David L
Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary function to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and supportmore » tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.« less
NASA Astrophysics Data System (ADS)
Koneshlou, Mahdi; Meshinchi Asl, Kaveh; Khomamizadeh, Farzad
2011-01-01
This paper focuses on the effects of low temperature (subzero) treatments on microstructure and mechanical properties of H13 hot work tool steel. Cryogenic treatment at -72 °C and deep cryogenic treatment at -196 °C were applied and it was found that by applying the subzero treatments, the retained austenite was transformed to martensite. As the temperature was decreased more retained austenite was transformed to martensite and it also led to smaller and more uniform martensite laths distributed in the microstructure. The deep cryogenic treatment also resulted in precipitation of more uniform and very fine carbide particles. The microstructural modification resulted in a significant improvement on the mechanical properties of the H13 tool steel.
Development of Anthropometric Analogous Headforms. Phase 1.
1994-10-31
shown in figure 5. This surface mesh can then be transformed into polygon faces that are able to be rendered by the AutoCAD rendering tools . Rendering of...computer-generated surfaces. The material removal techniques require the programming of the tool path of the cutter and in some cases requires specialized... tooling . Tool path programs are available to transfer the computer-generated surface into actual paths of the cutting tool . In cases where the
A 2D Fourier tool for the analysis of photo-elastic effect in large granular assemblies
NASA Astrophysics Data System (ADS)
Leśniewska, Danuta
2017-06-01
Fourier transforms are the basic tool in constructing different types of image filters, mainly those reducing optical noise. Some DIC or PIV software also uses frequency space to obtain displacement fields from a series of digital images of a deforming body. The paper presents series of 2D Fourier transforms of photo-elastic transmission images, representing large pseudo 2D granular assembly, deforming under varying boundary conditions. The images related to different scales were acquired using the same image resolution, but taken at different distance from the sample. Fourier transforms of images, representing different stages of deformation, reveal characteristic features at the three (`macro-`, `meso-` and `micro-`) scales, which can serve as a data to study internal order-disorder transition within granular materials.
Laser beam shaping for studying thermally induced damage
NASA Astrophysics Data System (ADS)
Masina, Bathusile N.; Bodkin, Richard; Mwakikunga, Bonex; Forbes, Andrew
2011-10-01
This paper presents an implementation of a laser beam shaping system for both heating a diamond tool and measuring the resulting temperature optically. The influence the initial laser parameters have on the resultant temperature profiles is shown experimentally and theoretically. A CO2 laser beam was used as the source to raise the temperature of the diamond tool and the resultant temperature was measured by using the blackbody principle. We have successfully transformed a Gaussian beam profile into a flat-top beam profile by using a diffractive optical element as a phase element in conjunction with a Fourier transforming lens. In this paper, we have successfully demonstrated temperature profiles across the diamond tool surface using two laser beam profiles and two optical setups, thus allowing a study of temperature influences with and without thermal stress. The generation of such temperature profiles on the diamond tool in the laboratory is important in the study of changes that occur in diamond tools, particularly the reduced efficiency of such tools in applications where extreme heating due to friction is expected.
Report Central: Quality Reporting Tool in an Electronic Health Record
Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S.; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H.; Middleton, Blackford; Einbinder, Jonathan S.
2006-01-01
Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow. PMID:17238590
FAST: FAST Analysis of Sequences Toolbox
Lawrence, Travis J.; Kauffman, Kyle T.; Amrine, Katherine C. H.; Carper, Dana L.; Lee, Raymond S.; Becich, Peter J.; Canales, Claudia J.; Ardell, David H.
2015-01-01
FAST (FAST Analysis of Sequences Toolbox) provides simple, powerful open source command-line tools to filter, transform, annotate and analyze biological sequence data. Modeled after the GNU (GNU's Not Unix) Textutils such as grep, cut, and tr, FAST tools such as fasgrep, fascut, and fastr make it easy to rapidly prototype expressive bioinformatic workflows in a compact and generic command vocabulary. Compact combinatorial encoding of data workflows with FAST commands can simplify the documentation and reproducibility of bioinformatic protocols, supporting better transparency in biological data science. Interface self-consistency and conformity with conventions of GNU, Matlab, Perl, BioPerl, R, and GenBank help make FAST easy and rewarding to learn. FAST automates numerical, taxonomic, and text-based sorting, selection and transformation of sequence records and alignment sites based on content, index ranges, descriptive tags, annotated features, and in-line calculated analytics, including composition and codon usage. Automated content- and feature-based extraction of sites and support for molecular population genetic statistics make FAST useful for molecular evolutionary analysis. FAST is portable, easy to install and secure thanks to the relative maturity of its Perl and BioPerl foundations, with stable releases posted to CPAN. Development as well as a publicly accessible Cookbook and Wiki are available on the FAST GitHub repository at https://github.com/tlawrence3/FAST. The default data exchange format in FAST is Multi-FastA (specifically, a restriction of BioPerl FastA format). Sanger and Illumina 1.8+ FastQ formatted files are also supported. FAST makes it easier for non-programmer biologists to interactively investigate and control biological data at the speed of thought. PMID:26042145
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
Mamoshina, Polina; Ojomoko, Lucy; Yanovich, Yury; Ostrovski, Alex; Botezatu, Alex; Prikhodko, Pavel; Izumchenko, Eugene; Aliper, Alexander; Romantsov, Konstantin; Zhebrak, Alexander; Ogu, Iraneus Obioma; Zhavoronkov, Alex
2018-01-01
The increased availability of data and recent advancements in artificial intelligence present the unprecedented opportunities in healthcare and major challenges for the patients, developers, providers and regulators. The novel deep learning and transfer learning techniques are turning any data about the person into medical data transforming simple facial pictures and videos into powerful sources of data for predictive analytics. Presently, the patients do not have control over the access privileges to their medical records and remain unaware of the true value of the data they have. In this paper, we provide an overview of the next-generation artificial intelligence and blockchain technologies and present innovative solutions that may be used to accelerate the biomedical research and enable patients with new tools to control and profit from their personal data as well with the incentives to undergo constant health monitoring. We introduce new concepts to appraise and evaluate personal records, including the combination-, time- and relationship-value of the data. We also present a roadmap for a blockchain-enabled decentralized personal health data ecosystem to enable novel approaches for drug discovery, biomarker development, and preventative healthcare. A secure and transparent distributed personal data marketplace utilizing blockchain and deep learning technologies may be able to resolve the challenges faced by the regulators and return the control over personal data including medical records back to the individuals. PMID:29464026
Mamoshina, Polina; Ojomoko, Lucy; Yanovich, Yury; Ostrovski, Alex; Botezatu, Alex; Prikhodko, Pavel; Izumchenko, Eugene; Aliper, Alexander; Romantsov, Konstantin; Zhebrak, Alexander; Ogu, Iraneus Obioma; Zhavoronkov, Alex
2018-01-19
The increased availability of data and recent advancements in artificial intelligence present the unprecedented opportunities in healthcare and major challenges for the patients, developers, providers and regulators. The novel deep learning and transfer learning techniques are turning any data about the person into medical data transforming simple facial pictures and videos into powerful sources of data for predictive analytics. Presently, the patients do not have control over the access privileges to their medical records and remain unaware of the true value of the data they have. In this paper, we provide an overview of the next-generation artificial intelligence and blockchain technologies and present innovative solutions that may be used to accelerate the biomedical research and enable patients with new tools to control and profit from their personal data as well with the incentives to undergo constant health monitoring. We introduce new concepts to appraise and evaluate personal records, including the combination-, time- and relationship-value of the data. We also present a roadmap for a blockchain-enabled decentralized personal health data ecosystem to enable novel approaches for drug discovery, biomarker development, and preventative healthcare. A secure and transparent distributed personal data marketplace utilizing blockchain and deep learning technologies may be able to resolve the challenges faced by the regulators and return the control over personal data including medical records back to the individuals.
A Model of Transformative Collaboration
ERIC Educational Resources Information Center
Swartz, Ann L.; Triscari, Jacqlyn S.
2011-01-01
Two collaborative writing partners sought to deepen their understanding of transformative learning by conducting several spirals of grounded theory research on their own collaborative relationship. Drawing from adult education, business, and social science literature and including descriptive analysis of their records of activity and interaction…
ERIC Educational Resources Information Center
Thomas, Lisa Carlucci
2012-01-01
Bookstores, record stores, libraries, Facebook: these places--both physical and virtual--demonstrate an established and essential purpose as centers of community, expertise, convenience, immediacy, and respect. Yet as digital, mobile, and social shifts continue to transform culture and interactions, these spaces and places transform, too.…
Wavelet transform analysis of transient signals: the seismogram and the electrocardiogram
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anant, K.S.
1997-06-01
In this dissertation I quantitatively demonstrate how the wavelet transform can be an effective mathematical tool for the analysis of transient signals. The two key signal processing applications of the wavelet transform, namely feature identification and representation (i.e., compression), are shown by solving important problems involving the seismogram and the electrocardiogram. The seismic feature identification problem involved locating in time the P and S phase arrivals. Locating these arrivals accurately (particularly the S phase) has been a constant issue in seismic signal processing. In Chapter 3, I show that the wavelet transform can be used to locate both the Pmore » as well as the S phase using only information from single station three-component seismograms. This is accomplished by using the basis function (wave-let) of the wavelet transform as a matching filter and by processing information across scales of the wavelet domain decomposition. The `pick` time results are quite promising as compared to analyst picks. The representation application involved the compression of the electrocardiogram which is a recording of the electrical activity of the heart. Compression of the electrocardiogram is an important problem in biomedical signal processing due to transmission and storage limitations. In Chapter 4, I develop an electrocardiogram compression method that applies vector quantization to the wavelet transform coefficients. The best compression results were obtained by using orthogonal wavelets, due to their ability to represent a signal efficiently. Throughout this thesis the importance of choosing wavelets based on the problem at hand is stressed. In Chapter 5, I introduce a wavelet design method that uses linear prediction in order to design wavelets that are geared to the signal or feature being analyzed. The use of these designed wavelets in a test feature identification application led to positive results. The methods developed in this thesis; the feature identification methods of Chapter 3, the compression methods of Chapter 4, as well as the wavelet design methods of Chapter 5, are general enough to be easily applied to other transient signals.« less
Separation of overlapping dental arch objects using digital records of illuminated plaster casts.
Yadollahi, Mohammadreza; Procházka, Aleš; Kašparová, Magdaléna; Vyšata, Oldřich; Mařík, Vladimír
2015-07-11
Plaster casts of individual patients are important for orthodontic specialists during the treatment process and their analysis is still a standard diagnostical tool. But the growing capabilities of information technology enable their replacement by digital models obtained by complex scanning systems. This paper presents the possibility of using a digital camera as a simple instrument to obtain the set of digital images for analysis and evaluation of the treatment using appropriate mathematical tools of image processing. The methods studied in this paper include the segmentation of overlapping dental bodies and the use of different illumination sources to increase the reliability of the separation process. The circular Hough transform, region growing with multiple seed points, and the convex hull detection method are applied to the segmentation of orthodontic plaster cast images to identify dental arch objects and their sizes. The proposed algorithm presents the methodology of improving the accuracy of segmentation of dental arch components using combined illumination sources. Dental arch parameters and distances between the canines and premolars for different segmentation methods were used as a measure to compare the results obtained. A new method of segmentation of overlapping dental arch components using digital records of illuminated plaster casts provides information with the precision required for orthodontic treatment. The distance between corresponding teeth was evaluated with a mean error of 1.38% and the Dice similarity coefficient of the evaluated dental bodies boundaries reached 0.9436 with a false positive rate [Formula: see text] and false negative rate [Formula: see text].
Uranium isotopes fingerprint biotic reduction
Stylo, Malgorzata; Neubert, Nadja; Wang, Yuheng; ...
2015-04-20
Knowledge of paleo-redox conditions in the Earth’s history provides a window into events that shaped the evolution of life on our planet. The role of microbial activity in paleo-redox processes remains unexplored due to the inability to discriminate biotic from abiotic redox transformations in the rock record. The ability to deconvolute these two processes would provide a means to identify environmental niches in which microbial activity was prevalent at a specific time in paleo-history and to correlate specific biogeochemical events with the corresponding microbial metabolism. Here, we demonstrate that the isotopic signature associated with microbial reduction of hexavalent uranium (U),more » i.e., the accumulation of the heavy isotope in the U(IV) phase, is readily distinguishable from that generated by abiotic uranium reduction in laboratory experiments. Thus, isotope signatures preserved in the geologic record through the reductive precipitation of uranium may provide the sought-after tool to probe for biotic processes. Because uranium is a common element in the Earth’s crust and a wide variety of metabolic groups of microorganisms catalyze the biological reduction of U(VI), this tool is applicable to a multiplicity of geological epochs and terrestrial environments. The findings of this study indicate that biological activity contributed to the formation of many authigenic U deposits, including sandstone U deposits of various ages, as well as modern, Cretaceous, and Archean black shales. In addition, engineered bioremediation activities also exhibit a biotic signature, suggesting that, although multiple pathways may be involved in the reduction, direct enzymatic reduction contributes substantially to the immobilization of uranium.« less
The transforming effect of handheld computers on nursing practice.
Thompson, Brent W
2005-01-01
Handheld computers have the power to transform nursing care. The roots of this power are the shift to decentralization of communication, electronic health records, and nurses' greater need for information at the point of care. This article discusses the effects of handheld resources, calculators, databases, electronic health records, and communication devices on nursing practice. The US government has articulated the necessity of implementing the use of handheld computers in healthcare. Nurse administrators need to encourage and promote the diffusion of this technology, which can reduce costs and improve care.
Simulation of Robot Kinematics Using Interactive Computer Graphics.
ERIC Educational Resources Information Center
Leu, M. C.; Mahajan, R.
1984-01-01
Development of a robot simulation program based on geometric transformation softwares available in most computer graphics systems and program features are described. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…
Electrocardiogram signal denoising based on a new improved wavelet thresholding
NASA Astrophysics Data System (ADS)
Han, Guoqiang; Xu, Zhijun
2016-08-01
Good quality electrocardiogram (ECG) is utilized by physicians for the interpretation and identification of physiological and pathological phenomena. In general, ECG signals may mix various noises such as baseline wander, power line interference, and electromagnetic interference in gathering and recording process. As ECG signals are non-stationary physiological signals, wavelet transform is investigated to be an effective tool to discard noises from corrupted signals. A new compromising threshold function called sigmoid function-based thresholding scheme is adopted in processing ECG signals. Compared with other methods such as hard/soft thresholding or other existing thresholding functions, the new algorithm has many advantages in the noise reduction of ECG signals. It perfectly overcomes the discontinuity at ±T of hard thresholding and reduces the fixed deviation of soft thresholding. The improved wavelet thresholding denoising can be proved to be more efficient than existing algorithms in ECG signal denoising. The signal to noise ratio, mean square error, and percent root mean square difference are calculated to verify the denoising performance as quantitative tools. The experimental results reveal that the waves including P, Q, R, and S waves of ECG signals after denoising coincide with the original ECG signals by employing the new proposed method.
Determining the transgene containment level provided by chloroplast transformation
Ruf, Stephanie; Karcher, Daniel; Bock, Ralph
2007-01-01
Plastids (chloroplasts) are maternally inherited in most crops. Maternal inheritance excludes plastid genes and transgenes from pollen transmission. Therefore, plastid transformation is considered a superb tool for ensuring transgene containment and improving the biosafety of transgenic plants. Here, we have assessed the strictness of maternal inheritance and the extent to which plastid transformation technology confers an increase in transgene confinement. We describe an experimental system facilitating stringent selection for occasional paternal plastid transmission. In a large screen, we detected low-level paternal inheritance of transgenic plastids in tobacco. Whereas the frequency of transmission into the cotyledons of F1 seedlings was ≈1.58 × 10−5 (on 100% cross-fertilization), transmission into the shoot apical meristem was significantly lower (2.86 × 10−6). Our data demonstrate that plastid transformation provides an effective tool to increase the biosafety of transgenic plants. However, in cases where pollen transmission must be prevented altogether, stacking with other containment methods will be necessary to eliminate the residual outcrossing risk. PMID:17420459
Determining the transgene containment level provided by chloroplast transformation.
Ruf, Stephanie; Karcher, Daniel; Bock, Ralph
2007-04-24
Plastids (chloroplasts) are maternally inherited in most crops. Maternal inheritance excludes plastid genes and transgenes from pollen transmission. Therefore, plastid transformation is considered a superb tool for ensuring transgene containment and improving the biosafety of transgenic plants. Here, we have assessed the strictness of maternal inheritance and the extent to which plastid transformation technology confers an increase in transgene confinement. We describe an experimental system facilitating stringent selection for occasional paternal plastid transmission. In a large screen, we detected low-level paternal inheritance of transgenic plastids in tobacco. Whereas the frequency of transmission into the cotyledons of F(1) seedlings was approximately 1.58 x 10(-5) (on 100% cross-fertilization), transmission into the shoot apical meristem was significantly lower (2.86 x 10(-6)). Our data demonstrate that plastid transformation provides an effective tool to increase the biosafety of transgenic plants. However, in cases where pollen transmission must be prevented altogether, stacking with other containment methods will be necessary to eliminate the residual outcrossing risk.
Liu, Jui-Nung; Schulmerich, Matthew V.; Bhargava, Rohit; Cunningham, Brian T.
2014-01-01
Fourier transform infrared (FT-IR) imaging spectrometers are almost universally used to record microspectroscopic imaging data in the mid-infrared (mid-IR) spectral region. While the commercial standard, interferometry necessitates collection of large spectral regions, requires a large data handling overhead for microscopic imaging and is slow. Here we demonstrate an approach for mid-IR spectroscopic imaging at selected discrete wavelengths using narrowband resonant filtering of a broadband thermal source, enabled by high-performance guided-mode Fano resonances in one-layer, large-area mid-IR photonic crystals on a glass substrate. The microresonant devices enable discrete frequency IR (DF-IR), in which a limited number of wavelengths that are of interest are recorded using a mechanically robust instrument. This considerably simplifies instrumentation as well as overhead of data acquisition, storage and analysis for large format imaging with array detectors. To demonstrate the approach, we perform DF-IR spectral imaging of a polymer USAF resolution target and human tissue in the C−H stretching region (2600−3300 cm−1). DF-IR spectroscopy and imaging can be generalized to other IR spectral regions and can serve as an analytical tool for environmental and biomedical applications. PMID:25089433
Adaptive attenuation of aliased ground roll using the shearlet transform
NASA Astrophysics Data System (ADS)
Hosseini, Seyed Abolfazl; Javaherian, Abdolrahim; Hassani, Hossien; Torabi, Siyavash; Sadri, Maryam
2015-01-01
Attenuation of ground roll is an essential step in seismic data processing. Spatial aliasing of the ground roll may cause the overlap of the ground roll with reflections in the f-k domain. The shearlet transform is a directional and multidimensional transform that separates the events with different dips and generates subimages in different scales and directions. In this study, the shearlet transform was used adaptively to attenuate aliased and non-aliased ground roll. After defining a filtering zone, an input shot record is divided into segments. Each segment overlaps adjacent segments. To apply the shearlet transform on each segment, the subimages containing aliased and non-aliased ground roll, the locations of these events on each subimage are selected adaptively. Based on these locations, mute is applied on the selected subimages. The filtered segments are merged together, using the Hanning function, after applying the inverse shearlet transform. This adaptive process of ground roll attenuation was tested on synthetic data, and field shot records from west of Iran. Analysis of the results using the f-k spectra revealed that the non-aliased and most of the aliased ground roll were attenuated using the proposed adaptive attenuation procedure. Also, we applied this method on shot records of a 2D land survey, and the data sets before and after ground roll attenuation were stacked and compared. The stacked section after ground roll attenuation contained less linear ground roll noise and more continuous reflections in comparison with the stacked section before the ground roll attenuation. The proposed method has some drawbacks such as more run time in comparison with traditional methods such as f-k filtering and reduced performance when the dip and frequency content of aliased ground roll are the same as those of the reflections.
Tool Condition Monitoring in Micro-End Milling using wavelets
NASA Astrophysics Data System (ADS)
Dubey, N. K.; Roushan, A.; Rao, U. S.; Sandeep, K.; Patra, K.
2018-04-01
In this work, Tool Condition Monitoring (TCM) strategy is developed for micro-end milling of titanium alloy and mild steel work-pieces. Full immersion slot milling experiments are conducted using a solid tungsten carbide end mill for more than 1900 s to have reasonable amount of tool wear. During the micro-end milling process, cutting force and vibration signals are acquired using Kistler piezo-electric 3-component force dynamometer (9256C2) and accelerometer (NI cDAQ-9188) respectively. The force components and the vibration signals are processed using Discrete Wavelet Transformation (DWT) in both time and frequency window. 5-level wavelet packet decomposition using Db-8 wavelet is carried out and the detailed coefficients D1 to D5 for each of the signals are obtained. The results of the wavelet transformation are correlated with the tool wear. In case of vibration signals, de-noising is done for higher frequency components (D1) and force signals were de-noised for lower frequency components (D5). Increasing value of MAD (Mean Absolute Deviation) of the detail coefficients for successive channels depicted tool wear. The predictions of the tool wear are confirmed from the actual wear observed in the SEM of the worn tool.
Nyaboga, Evans; Tripathi, Jaindra N.; Manoharan, Rajesh; Tripathi, Leena
2014-01-01
Although genetic transformation of clonally propagated crops has been widely studied as a tool for crop improvement and as a vital part of the development of functional genomics resources, there has been no report of any existing Agrobacterium-mediated transformation of yam (Dioscorea spp.) with evidence of stable integration of T-DNA. Yam is an important crop in the tropics and subtropics providing food security and income to over 300 million people. However, yam production remains constrained by increasing levels of field and storage pests and diseases. A major constraint to the development of biotechnological approaches for yam improvement has been the lack of an efficient and robust transformation and regeneration system. In this study, we developed an Agrobacterium-mediated transformation of Dioscorea rotundata using axillary buds as explants. Two cultivars of D. rotundata were transformed using Agrobacterium tumefaciens harboring the binary vectors containing selectable marker and reporter genes. After selection with appropriate concentrations of antibiotic, shoots were developed on shoot induction and elongation medium. The elongated antibiotic-resistant shoots were subsequently rooted on medium supplemented with selection agent. Successful transformation was confirmed by polymerase chain reaction, Southern blot analysis, and reporter genes assay. Expression of gusA gene in transgenic plants was also verified by reverse transcription polymerase chain reaction analysis. Transformation efficiency varied from 9.4 to 18.2% depending on the cultivars, selectable marker genes, and the Agrobacterium strain used for transformation. It took 3–4 months from Agro-infection to regeneration of complete transgenic plant. Here we report an efficient, fast and reproducible protocol for Agrobacterium-mediated transformation of D. rotundata using axillary buds as explants, which provides a useful platform for future genetic engineering studies in this economically important crop. PMID:25309562
Multichannel Dynamic Fourier-Transform IR Spectrometer
NASA Astrophysics Data System (ADS)
Balashov, A. A.; Vaguine, V. A.; Golyak, Il. S.; Morozov, A. N.; Khorokhorin, A. I.
2017-09-01
A design of a multichannel continuous scan Fourier-transform IR spectrometer for simultaneous recording and analysis of the spectral characteristics of several objects is proposed. For implementing the design, a multi-probe fiber is used, constructed from several optical fibers connected into a single optical connector and attached at the output of the interferometer. The Fourier-transform spectrometer is used as a signal modulator. Each fiber is individually mated with an investigated sample and a dedicated radiation detector. For the developed system, the radiation intensity of the spectrometer is calculated from the condition of the minimum spectral resolution and parameters of the optical fibers. Using the proposed design, emission spectra of a gas-discharge neon lamp have been recorded using a single fiber 1 mm in diameter with a numerical aperture NA = 0.22.
On transform coding tools under development for VP10
NASA Astrophysics Data System (ADS)
Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao
2016-09-01
Google started the WebM Project in 2010 to develop open source, royaltyfree video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools have already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range or residue structures. Results are presented on a standard test set.
Analytical Tools for Cloudscope Ice Measurement
NASA Technical Reports Server (NTRS)
Arnott, W. Patrick
1998-01-01
The cloudscope is a ground or aircraft instrument for viewing ice crystals impacted on a sapphire window. It is essentially a simple optical microscope with an attached compact CCD video camera whose output is recorded on a Hi-8 mm video cassette recorder equipped with digital time and date recording capability. In aircraft operation the window is at a stagnation point of the flow so adiabatic compression heats the window to sublimate the ice crystals so that later impacting crystals can be imaged as well. A film heater is used for ground based operation to provide sublimation, and it can also be used to provide extra heat for aircraft operation. The compact video camera can be focused manually by the operator, and a beam splitter - miniature bulb combination provide illumination for night operation. Several shutter speeds are available to accommodate daytime illumination conditions by direct sunlight. The video images can be directly used to qualitatively assess the crystal content of cirrus clouds and contrails. Quantitative size spectra are obtained with the tools described in this report. Selected portions of the video images are digitized using a PCI bus frame grabber to form a short movie segment or stack using NIH (National Institute of Health) Image software with custom macros developed at DRI. The stack can be Fourier transform filtered with custom, easy to design filters to reduce most objectionable video artifacts. Particle quantification of each slice of the stack is performed using digital image analysis. Data recorded for each particle include particle number and centroid, frame number in the stack, particle area, perimeter, equivalent ellipse maximum and minimum radii, ellipse angle, and pixel number. Each valid particle in the stack is stamped with a unique number. This output can be used to obtain a semiquantitative appreciation of the crystal content. The particle information becomes the raw input for a subsequent program (FORTRAN) that synthesizes each slice and separates the new from the sublimating particles. The new particle information is used to generate quantitative particle concentration, area, and mass size spectra along with total concentration, solar extinction coefficient, and ice water content. This program directly creates output in html format for viewing with a web browser.
Open Access Metadata, Catalogers, and Vendors: The Future of Cataloging Records
ERIC Educational Resources Information Center
Flynn, Emily Alinder
2013-01-01
The open access (OA) movement is working to transform scholarly communication around the world, but this philosophy can also apply to metadata and cataloging records. While some notable, large academic libraries, such as Harvard University, the University of Michigan, and the University of Cambridge, released their cataloging records under OA…
NASA Astrophysics Data System (ADS)
Weber, M. E.; Reichelt, L.; Kuhn, G.; Thurow, J. W.; Ricken, W.
2009-12-01
We present software-based tools for rapid and quantitative detection of sediment lamination. The BMPix tool extracts color and gray-scale curves from images at ultrahigh (pixel) resolution. The PEAK tool uses the gray-scale curve and performs, for the first time, fully automated counting of laminae based on three methods. The maximum count algorithm counts every bright peak of a couplet of two laminae (annual resolution) in a Gaussian smoothed gray-scale curve. The zero-crossing algorithm counts every positive and negative halfway-passage of the gray-scale curve through a wide moving average. Hence, the record is separated into bright and dark intervals (seasonal resolution). The same is true for the frequency truncation method, which uses Fourier transformation to decompose the gray-scale curve into its frequency components, before positive and negative passages are count. We applied the new methods successfully to tree rings and to well-dated and already manually counted marine varves from Saanich Inlet before we adopted the tools to rather complex marine laminae from the Antarctic continental margin. In combination with AMS14C dating, we found convincing evidence that the laminations from three Weddell Sea sites represent true varves that were deposited on sediment ridges over several millennia during the last glacial maximum (LGM). There are apparently two seasonal layers of terrigenous composition, a coarser-grained bright layer, and a finer-grained dark layer. The new tools offer several advantages over previous tools. The counting procedures are based on a moving average generated from gray-scale curves instead of manual counting. Hence, results are highly objective and rely on reproducible mathematical criteria. Since PEAK associates counts with a specific depth, the thickness of each year or each season is also measured which is an important prerequisite for later spectral analysis. Since all information required to conduct the analysis is displayed graphically, interactive optimization of the counting algorithms can be achieved quickly and conveniently.
Experience Report: Visual Programming in the Real World
NASA Technical Reports Server (NTRS)
Baroth, E.; Hartsough, C
1994-01-01
This paper reports direct experience with two commercial, widely used visual programming environments. While neither of these systems is object oriented, the tools have transformed the development process and indicate a direction for visual object oriented tools to proceed.
Spatial-Heterodyne Interferometry For Reflection And Transm Ission (Shirt) Measurements
Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN; Tobin, Ken W [Harriman, TN
2006-02-14
Systems and methods are described for spatial-heterodyne interferometry for reflection and transmission (SHIRT) measurements. A method includes digitally recording a first spatially-heterodyned hologram using a first reference beam and a first object beam; digitally recording a second spatially-heterodyned hologram using a second reference beam and a second object beam; Fourier analyzing the digitally recorded first spatially-heterodyned hologram to define a first analyzed image; Fourier analyzing the digitally recorded second spatially-heterodyned hologram to define a second analyzed image; digitally filtering the first analyzed image to define a first result; and digitally filtering the second analyzed image to define a second result; performing a first inverse Fourier transform on the first result, and performing a second inverse Fourier transform on the second result. The first object beam is transmitted through an object that is at least partially translucent, and the second object beam is reflected from the object.
Streamflow record extension using power transformations and application to sediment transport
NASA Astrophysics Data System (ADS)
Moog, Douglas B.; Whiting, Peter J.; Thomas, Robert B.
1999-01-01
To obtain a representative set of flow rates for a stream, it is often desirable to fill in missing data or extend measurements to a longer time period by correlation to a nearby gage with a longer record. Linear least squares regression of the logarithms of the flows is a traditional and still common technique. However, its purpose is to generate optimal estimates of each day's discharge, rather than the population of discharges, for which it tends to underestimate variance. Maintenance-of-variance-extension (MOVE) equations [Hirsch, 1982] were developed to correct this bias. This study replaces the logarithmic transformation by the more general Box-Cox scaled power transformation, generating a more linear, constant-variance relationship for the MOVE extension. Combining the Box-Cox transformation with the MOVE extension is shown to improve accuracy in estimating order statistics of flow rate, particularly for the nonextreme discharges which generally govern cumulative transport over time. This advantage is illustrated by prediction of cumulative fractions of total bed load transport.
Sperka, Daniel J; Ditterich, Jochen
2011-01-01
While computer-aided planning of human neurosurgeries is becoming more and more common, animal researchers still largely rely on paper atlases for planning their approach before implanting recording chambers to perform invasive recordings of neural activity, which makes this planning process tedious and error-prone. Here we present SPLASh (Stereotactic PLAnning Software), an interactive software tool for the stereotactic planning of recording chamber placement and electrode trajectories. SPLASh has been developed for monkey cortical recordings and relies on a combination of structural MRIs and electronic brain atlases. Since SPLASh is based on the neuroanatomy software Caret, it should also be possible to use it for other parts of the brain or other species for which Caret atlases are available. The tool allows the user to interactively evaluate different possible placements of recording chambers and to simulate electrode trajectories.
Sperka, Daniel J.; Ditterich, Jochen
2011-01-01
While computer-aided planning of human neurosurgeries is becoming more and more common, animal researchers still largely rely on paper atlases for planning their approach before implanting recording chambers to perform invasive recordings of neural activity, which makes this planning process tedious and error-prone. Here we present SPLASh (Stereotactic PLAnning Software), an interactive software tool for the stereotactic planning of recording chamber placement and electrode trajectories. SPLASh has been developed for monkey cortical recordings and relies on a combination of structural MRIs and electronic brain atlases. Since SPLASh is based on the neuroanatomy software Caret, it should also be possible to use it for other parts of the brain or other species for which Caret atlases are available. The tool allows the user to interactively evaluate different possible placements of recording chambers and to simulate electrode trajectories. PMID:21472085
77 FR 46551 - Office of Privacy, Records, and Disclosure; Privacy Act of 1974, as Amended
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-03
... its intent to create the SIGAR system of records titled, SIGAR-11: Social Media Records, and SIGAR-12: Internal Electronic Collaboration Tools. The Social Media system will assist SIGAR by providing new ways to... SIGAR Intranet and social media tools within the Internet. DATES: Comments must be received no later...
USDA-ARS?s Scientific Manuscript database
Background: Dietary intake assessment with diet records (DR) is a standard research and practice tool in nutrition. Manual entry and analysis of DR is time-consuming and expensive. New electronic tools for diet entry by clients and research participants may reduce the cost and effort of nutrient int...
Double Density Dual Tree Discrete Wavelet Transform implementation for Degraded Image Enhancement
NASA Astrophysics Data System (ADS)
Vimala, C.; Aruna Priya, P.
2018-04-01
Wavelet transform is a main tool for image processing applications in modern existence. A Double Density Dual Tree Discrete Wavelet Transform is used and investigated for image denoising. Images are considered for the analysis and the performance is compared with discrete wavelet transform and the Double Density DWT. Peak Signal to Noise Ratio values and Root Means Square error are calculated in all the three wavelet techniques for denoised images and the performance has evaluated. The proposed techniques give the better performance when comparing other two wavelet techniques.
Discrimination of organic coffee via Fourier transform infrared-photoacoustic spectroscopy.
Gordillo-Delgado, Fernando; Marín, Ernesto; Cortés-Hernández, Diego Mauricio; Mejía-Morales, Claudia; García-Salcedo, Angela Janet
2012-08-30
Procedures for the evaluation of the origin and quality of ground and roasted coffee are constantly needed for the associated industry due to complexity of the related market. Conventional Fourier transform infrared (FTIR) spectroscopy can be used for detecting changes in functional groups of compounds, such as coffee. However, dispersion, reflection and non-homogeneity of the sample matrix can cause problems resulting in low spectral quality. On the other hand, sample preparation frequently takes place in a destructive way. To overcome these difficulties, in this work a photoacoustic cell has been adapted as a detector in a FTIR spectrophotometer to perform a study of roasted and ground coffee from three varieties of Coffea arabica grown by organic and conventional methods. Comparison between spectra of coffee recorded by FTIR-photoacoustic spectrometry (PAS) and by FTIR spectrophotometry showed a better resolution of the former method, which, aided by principal components analysis, allowed the identification of some absorption bands that allow the discrimination between organic and conventional coffee. The results obtained provide information about the spectral behavior of coffee powder which can be useful for establishing discrimination criteria. It has been demonstrated that FTIR-PAS can be a useful experimental tool for the characterization of coffee. Copyright © 2012 Society of Chemical Industry.
Using Acceleration Data to Automatically Detect the Onset of Farrowing in Sows.
Traulsen, Imke; Scheel, Christoph; Auer, Wolfgang; Burfeind, Onno; Krieter, Joachim
2018-01-10
The aim of the present study was to automatically predict the onset of farrowing in crate-confined sows. (1) Background: Automatic tools are appropriate to support animal surveillance under practical farming conditions. (2) Methods: In three batches, sows in one farrowing compartment of the Futterkamp research farm were equipped with an ear sensor to sample acceleration. As a reference video, recordings of the sows were used. A classical CUSUM chart using different acceleration indices of various distribution characteristics with several scenarios were compared. (3) Results: The increase of activity mainly due to nest building behavior before the onset of farrowing could be detected with the sow individual CUSUM chart. The best performance required a statistical distribution characteristic that represented fluctuations in the signal (for example, 1st variation) combined with a transformation of this parameter by cumulating differences in the signal within certain time periods from one day to another. With this transformed signal, farrowing sows could reliably be detected. For 100% or 85% of the sows, an alarm was given within 48 or 12 h before the onset of farrowing. (4) Conclusions: Acceleration measurements in the ear of a sow are suitable for detecting the onset of farrowing in individually housed sows in commercial farrowing crates.
Over the Fence: Learning about Education for Sustainability with New Tools and Conversation
ERIC Educational Resources Information Center
McClam, Sherie; Diefenbacher, Lori
2015-01-01
The metaphor of talking "over the fence" underscores the neutrality of tools. Shovels and hoes do the job, but the gardener creates the transformation of earth to food. Each garden requires a unique approach. Such are the tools of education for sustainable development (ESD). Pre-packaged textbooks and toolkits provide definitions and…
Helping School Leaders Help New Teachers: A Tool for Transforming School-Based Induction
ERIC Educational Resources Information Center
Birkeland, Sarah; Feiman-Nemser, Sharon
2012-01-01
Ample research demonstrates the power of comprehensive induction to develop and retain new teachers. Education scholars generally agree on what powerful systems of induction include, yet few tools exist for guiding schools in creating such systems. Drawing on theory and practice, we have created such a tool. This article introduces the "Continuum…
Value Innovation in Learner-Centered Design. How to Develop Valuable Learning Tools
ERIC Educational Resources Information Center
Breuer, Henning; Schwarz, Heinrich; Feller, Kristina; Matsumoto, Mitsuji
2014-01-01
This paper shows how to address technological, cultural and social transformations with empirically grounded innovation. Areas in transition such as higher education and learning techniques today bring about new needs and opportunities for innovative tools and services. But how do we find these tools? The paper argues for using a strategy of…
owl-qa | Informatics Technology for Cancer Research (ITCR)
owl-qa is an OWL-based QA tool for cancer study CDEs. The tool uses the combination of the NCI Thesaurus and additional disjointness axioms to detect potential errors and duplications in the data element definitions. The tool comprises three modules: Data Integration and Services Module; Compositional Expression Transformation Module; and OWL-based Quality Assurance Module.
S'Cool Tools: 5 Great Tools to Perk Up Your Classroom and Engage Your Students
ERIC Educational Resources Information Center
Yoder, Maureen Brown
2009-01-01
For a kindergarten teacher trying to find a new way to help his/her students learn about shapes and patterns or a high school science teacher hoping to bring ecology alive, there is a tool that could be just right for them. This article presents five learning tools that have the potential to transform lessons: (1) Lego Education's WeDo Robotics…
NASA Astrophysics Data System (ADS)
Wang, Zhiguo; Liang, Yingchun; Chen, Mingjun; Tong, Zhen; Chen, Jiaxuan
2010-10-01
Tool wear not only changes its geometry accuracy and integrity, but also decrease machining precision and surface integrity of workpiece that affect using performance and service life of workpiece in ultra-precision machining. Scholars made a lot of experimental researches and stimulant analyses, but there is a great difference on the wear mechanism, especially on the nano-scale wear mechanism. In this paper, the three-dimensional simulation model is built to simulate nano-metric cutting of a single crystal silicon with a non-rigid right-angle diamond tool with 0 rake angle and 0 clearance angle by the molecular dynamics (MD) simulation approach, which is used to investigate the diamond tool wear during the nano-metric cutting process. A Tersoff potential is employed for the interaction between carbon-carbon atoms, silicon-silicon atoms and carbon-silicon atoms. The tool gets the high alternating shear stress, the tool wear firstly presents at the cutting edge where intension is low. At the corner the tool is splitted along the {1 1 1} crystal plane, which forms the tipping. The wear at the flank face is the structure transformation of diamond that the diamond structure transforms into the sheet graphite structure. Owing to the tool wear the cutting force increases.
Musician Map: visualizing music collaborations over time
NASA Astrophysics Data System (ADS)
Yim, Ji-Dong; Shaw, Chris D.; Bartram, Lyn
2009-01-01
In this paper we introduce Musician Map, a web-based interactive tool for visualizing relationships among popular musicians who have released recordings since 1950. Musician Map accepts search terms from the user, and in turn uses these terms to retrieve data from MusicBrainz.org and AudioScrobbler.net, and visualizes the results. Musician Map visualizes relationships of various kinds between music groups and individual musicians, such as band membership, musical collaborations, and linkage to other artists that are generally regarded as being similar in musical style. These relationships are plotted between artists using a new timeline-based visualization where a node in a traditional node-link diagram has been transformed into a Timeline-Node, which allows the visualization of an evolving entity over time, such as the membership in a band. This allows the user to pursue social trend queries such as "Do Hip-Hop artists collaborate differently than Rock artists".
Near infrared Raman spectra of human brain lipids
NASA Astrophysics Data System (ADS)
Krafft, Christoph; Neudert, Lars; Simat, Thomas; Salzer, Reiner
2005-05-01
Human brain tissue, in particular white matter, contains high lipid content. These brain lipids can be divided into three principal classes: neutral lipids including the steroid cholesterol, phospholipids and sphingolipids. Major lipids in normal human brain tissue are phosphatidylcholine, phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, phosphatidic acid, sphingomyelin, galactocerebrosides, gangliosides, sulfatides and cholesterol. Minor lipids are cholesterolester and triacylglycerides. During transformation from normal brain tissue to tumors, composition and concentration of lipids change in a specific way. Therefore, analysis of lipids might be used as a diagnostic parameter to distinguish normal tissue from tumors and to determine the tumor type and tumor grade. Raman spectroscopy has been suggested as an analytical tool to detect these changes even under intra-operative conditions. We recorded Raman spectra of the 12 major and minor brain lipids with 785 nm excitation in order to identify their spectral fingerprints for qualitative and quantitative analyses.
Quantification of brain lipids by FTIR spectroscopy and partial least squares regression
NASA Astrophysics Data System (ADS)
Dreissig, Isabell; Machill, Susanne; Salzer, Reiner; Krafft, Christoph
2009-01-01
Brain tissue is characterized by high lipid content. Its content decreases and the lipid composition changes during transformation from normal brain tissue to tumors. Therefore, the analysis of brain lipids might complement the existing diagnostic tools to determine the tumor type and tumor grade. Objective of this work is to extract lipids from gray matter and white matter of porcine brain tissue, record infrared (IR) spectra of these extracts and develop a quantification model for the main lipids based on partial least squares (PLS) regression. IR spectra of the pure lipids cholesterol, cholesterol ester, phosphatidic acid, phosphatidylcholine, phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, sphingomyelin, galactocerebroside and sulfatide were used as references. Two lipid mixtures were prepared for training and validation of the quantification model. The composition of lipid extracts that were predicted by the PLS regression of IR spectra was compared with lipid quantification by thin layer chromatography.
Fils, D.; Cervato, C.; Reed, J.; Diver, P.; Tang, X.; Bohling, G.; Greer, D.
2009-01-01
CHRONOS's purpose is to transform Earth history research by seamlessly integrating stratigraphic databases and tools into a virtual on-line stratigraphic record. In this paper, we describe the various components of CHRONOS's distributed data system, including the encoding of semantic and descriptive data into a service-based architecture. We give examples of how we have integrated well-tested resources available from the open-source and geoinformatic communities, like the GeoSciML schema and the simple knowledge organization system (SKOS), into the services-oriented architecture to encode timescale and phylogenetic synonymy data. We also describe on-going efforts to use geospatially enhanced data syndication and informally including semantic information by embedding it directly into the XHTML Document Object Model (DOM). XHTML DOM allows machine-discoverable descriptive data such as licensing and citation information to be incorporated directly into data sets retrieved by users. ?? 2008 Elsevier Ltd. All rights reserved.
Challenges and methodology for indexing the computerized patient record.
Ehrler, Frédéric; Ruch, Patrick; Geissbuhler, Antoine; Lovis, Christian
2007-01-01
Patient records contain most crucial documents for managing the treatments and healthcare of patients in the hospital. Retrieving information from these records in an easy, quick and safe way helps care providers to save time and find important facts about their patient's health. This paper presents the scalability issues induced by the indexing and the retrieval of the information contained in the patient records. For this study, EasyIR, an information retrieval tool performing full text queries and retrieving the related documents has been used. An evaluation of the performance reveals that the indexing process suffers from overhead consequence of the particular structure of the patient records. Most IR tools are designed to manage very large numbers of documents in a single index whereas in our hypothesis, one index per record, which usually implies few documents, has been imposed. As the number of modifications and creations of patient records are significant in a day, using a specialized and efficient indexation tool is required.
Analysis of acoustic emission signals at austempering of steels using neural networks
NASA Astrophysics Data System (ADS)
Łazarska, Malgorzata; Wozniak, Tadeusz Z.; Ranachowski, Zbigniew; Trafarski, Andrzej; Domek, Grzegorz
2017-05-01
Bearing steel 100CrMnSi6-4 and tool steel C105U were used to carry out this research with the steels being austempered to obtain a martensitic-bainitic structure. During the process quite a large number of acoustic emissions (AE) were observed. These signals were then analysed using neural networks resulting in the identification of three groups of events of: high, medium and low energy and in addition their spectral characteristics were plotted. The results were presented in the form of diagrams of AE incidence as a function of time. It was demonstrated that complex transformations of austenite into martensite and bainite occurred when austempering bearing steel at 160 °C and tool steel at 130 °C respectively. The selected temperatures of isothermal quenching of the tested steels were within the area near to MS temperature, which affected the complex course of phase transition. The high activity of AE is a typical occurrence for martensitic transformation and this is the transformation mechanism that induces the generation of AE signals of higher energy in the first stage of transition. In the second stage of transformation, the initially nucleated martensite accelerates the occurrence of the next bainitic transformation.
Offset shock mounted recorder carrier including overpressure gauge protector and balance joint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patel, D.K.
1990-12-25
This patent describes a recorder carrier adapted to be included within a well tool. The carrier adapted to include at least one recorder, the recorder being movable within the recorder carrier when the carrier includes the recorder, the well tool adapted to be disposed in a borehole containing well annulus fluid, the recorder carrier adapted to receive well fluid from a formation in the borehole. It comprises overpressure protection means for preventing the well fluid from entering the recorder carrier when a pressure of the well fluid is greater than a predetermined amount above a pressure of the well annulusmore » fluid thereby protecting the recorder from the pressure of the well fluid.« less
Janisse, Tom; Tallman, Karen
2017-01-01
The top predictors of patient satisfaction with clinical visits are the quality of the physician-patient relationship and the communications contributing to their relationship. How do physicians improve their communication, and what effect does it have on them? This article presents the verbatim stories of seven high-performing physicians describing their transformative change in the areas of communication, connection, and well-being. Data for this study are based on interviews from a previous study in which a 6-question set was posed, in semistructured 60-minute interviews, to 77 of the highest-performing Permanente Medical Group physicians in 4 Regions on the "Art of Medicine" patient survey. Transformation stories emerged spontaneously during the interviews, and so it was an incidental finding when some physicians identified that they were not always high performing in their communication with patients. Seven different modes of transformation in communication were described by these physicians: a listening tool, an awareness course, finding new meaning in clinical practice, a technologic tool, a sudden insight, a mentor observation, and a physician-as-patient experience. These stories illustrate how communication skills can be learned through various activities and experiences that transform physicians into those who are highly successful communicators. All modes result in a change of state-a new way of seeing, of being-and are not just a new tool or a new practice, but a change in state of mind. This state resulted in a marked change of behavior, and a substantial improvement of communication and relationship.
Molecular tools for carotenogenesis analysis in the zygomycete Mucor circinelloides.
Torres-Martínez, Santiago; Ruiz-Vázquez, Rosa M; Garre, Victoriano; López-García, Sergio; Navarro, Eusebio; Vila, Ana
2012-01-01
The carotene producer fungus Mucor circinelloides is the zygomycete more amenable to genetic manipulations by using molecular tools. Since the initial development of an effective procedure of genetic transformation, more than two decades ago, the availability of new molecular approaches such as gene replacement techniques and gene expression inactivation by RNA silencing, in addition to the sequencing of its genome, has made Mucor a valuable organism for the study of a number of processes. Here we describe in detail the main techniques and methods currently used to manipulate M. circinelloides, including transformation, gene replacement, gene silencing, RNAi, and immunoprecipitation.
Heralded processes on continuous-variable spaces as quantum maps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreyrol, Franck; Spagnolo, Nicolò; Blandino, Rémi
2014-12-04
Heralding processes, which only work when a measurement on a part of the system give the good result, are particularly interesting for continuous-variables. They permit non-Gaussian transformations that are necessary for several continuous-variable quantum information tasks. However if maps and quantum process tomography are commonly used to describe quantum transformations in discrete-variable space, they are much rarer in the continuous-variable domain. Also, no convenient tool for representing maps in a way more adapted to the particularities of continuous variables have yet been explored. In this paper we try to fill this gap by presenting such a tool.
Abstracting ICU Nursing Care Quality Data From the Electronic Health Record.
Seaman, Jennifer B; Evans, Anna C; Sciulli, Andrea M; Barnato, Amber E; Sereika, Susan M; Happ, Mary Beth
2017-09-01
The electronic health record is a potentially rich source of data for clinical research in the intensive care unit setting. We describe the iterative, multi-step process used to develop and test a data abstraction tool, used for collection of nursing care quality indicators from the electronic health record, for a pragmatic trial. We computed Cohen's kappa coefficient (κ) to assess interrater agreement or reliability of data abstracted using preliminary and finalized tools. In assessing the reliability of study data ( n = 1,440 cases) using the finalized tool, 108 randomly selected cases (10% of first half sample; 5% of last half sample) were independently abstracted by a second rater. We demonstrated mean κ values ranging from 0.61 to 0.99 for all indicators. Nursing care quality data can be accurately and reliably abstracted from the electronic health records of intensive care unit patients using a well-developed data collection tool and detailed training.
Spectral analysis for GNSS coordinate time series using chirp Fourier transform
NASA Astrophysics Data System (ADS)
Feng, Shengtao; Bo, Wanju; Ma, Qingzun; Wang, Zifan
2017-12-01
Spectral analysis for global navigation satellite system (GNSS) coordinate time series provides a principal tool to understand the intrinsic mechanism that affects tectonic movements. Spectral analysis methods such as the fast Fourier transform, Lomb-Scargle spectrum, evolutionary power spectrum, wavelet power spectrum, etc. are used to find periodic characteristics in time series. Among spectral analysis methods, the chirp Fourier transform (CFT) with less stringent requirements is tested with synthetic and actual GNSS coordinate time series, which proves the accuracy and efficiency of the method. With the length of series only limited to even numbers, CFT provides a convenient tool for windowed spectral analysis. The results of ideal synthetic data prove CFT accurate and efficient, while the results of actual data show that CFT is usable to derive periodic information from GNSS coordinate time series.
Somasundaram, Karuppanagounder; Ezhilarasan, Kamalanathan
2015-01-01
To develop an automatic skull stripping method for magnetic resonance imaging (MRI) of human head scans. The proposed method is based on gray scale transformation and morphological operations. The proposed method has been tested with 20 volumes of normal T1-weighted images taken from Internet Brain Segmentation Repository. Experimental results show that the proposed method gives better results than the popular skull stripping methods Brain Extraction Tool and Brain Surface Extractor. The average value of Jaccard and Dice coefficients are 0.93 and 0.962 respectively. In this article, we have proposed a novel skull stripping method using intensity transformation and morphological operations. This is a low computational complexity method but gives competitive or better results than that of the popular skull stripping methods Brain Surface Extractor and Brain Extraction Tool.
Walking a Path of Transformation: Using the Labyrinth as a Spiritual Tool.
ERIC Educational Resources Information Center
Plugge, Carol; McCormick, Debby
Personal and spiritual transformation is emerging as a great need. Numerous societal and health concerns are beginning to surface as spiritual issues in the eyes of many professionals. A better understanding of the relationship between psychosocial-biological issues and spirituality would enhance professional capabilities. Experiential skills and…
Vector coding of wavelet-transformed images
NASA Astrophysics Data System (ADS)
Zhou, Jun; Zhi, Cheng; Zhou, Yuanhua
1998-09-01
Wavelet, as a brand new tool in signal processing, has got broad recognition. Using wavelet transform, we can get octave divided frequency band with specific orientation which combines well with the properties of Human Visual System. In this paper, we discuss the classified vector quantization method for multiresolution represented image.
ERIC Educational Resources Information Center
Ekanem, Ekpenyong E.; Ekpiken, William E.
2013-01-01
Continuous assessment is an important management tool for transforming university education. Although this policy employed measurable criteria to retain students' interest and objectivity, most academic staff of Nigerian universities lack basic knowledge and skills in test construction and interpretation and are thus, ineffective in continuous…
Application of Transformations in Parametric Inference
ERIC Educational Resources Information Center
Brownstein, Naomi; Pensky, Marianna
2008-01-01
The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…
Detection of physiological noise in resting state fMRI using machine learning.
Ash, Tom; Suckling, John; Walter, Martin; Ooi, Cinly; Tempelmann, Claus; Carpenter, Adrian; Williams, Guy
2013-04-01
We present a technique for predicting cardiac and respiratory phase on a time point by time point basis, from fMRI image data. These predictions have utility in attempts to detrend effects of the physiological cycles from fMRI image data. We demonstrate the technique both in the case where it can be trained on a subject's own data, and when it cannot. The prediction scheme uses a multiclass support vector machine algorithm. Predictions are demonstrated to have a close fit to recorded physiological phase, with median Pearson correlation scores between recorded and predicted values of 0.99 for the best case scenario (cardiac cycle trained on a subject's own data) down to 0.83 for the worst case scenario (respiratory predictions trained on group data), as compared to random chance correlation score of 0.70. When predictions were used with RETROICOR--a popular physiological noise removal tool--the effects are compared to using recorded phase values. Using Fourier transforms and seed based correlation analysis, RETROICOR is shown to produce similar effects whether recorded physiological phase values are used, or they are predicted using this technique. This was seen by similar levels of noise reduction noise in the same regions of the Fourier spectra, and changes in seed based correlation scores in similar regions of the brain. This technique has a use in situations where data from direct monitoring of the cardiac and respiratory cycles are incomplete or absent, but researchers still wish to reduce this source of noise in the image data. Copyright © 2011 Wiley Periodicals, Inc.
Genetic transformation protocols using zygotic embryos as explants: an overview.
Tahir, Muhammad; Waraich, Ejaz A; Stasolla, Claudio
2011-01-01
Genetic transformation of plants is an innovative research tool which has practical significance for the development of new and improved genotypes or cultivars. However, stable introduction of genes of interest into nuclear genomes depends on several factors such as the choice of target tissue, the method of DNA delivery in the target tissue, and the appropriate method to select the transformed plants. Mature or immature zygotic embryos have been a popular choice as explant or target tissue for genetic transformation in both angiosperms and gymnosperms. As a result, considerable protocols have emerged in the literature which have been optimized for various plant species in terms of transformation methods and selection procedures for transformed plants. This article summarizes the recent advances in plant transformation using zygotic embryos as explants.
NASA Astrophysics Data System (ADS)
Borowiec, N.
2013-12-01
Gathering information about the roof shapes of the buildings is still current issue. One of the many sources from which we can obtain information about the buildings is the airborne laser scanning. However, detect information from cloud o points about roofs of building automatically is still a complex task. You can perform this task by helping the additional information from other sources, or based only on Lidar data. This article describes how to detect the building roof only from a point cloud. To define the shape of the roof is carried out in three tasks. The first step is to find the location of the building, the second is the precise definition of the edge, while the third is an indication of the roof planes. First step based on the grid analyses. And the next two task based on Hough Transformation. Hough transformation is a method of detecting collinear points, so a perfect match to determine the line describing a roof. To properly determine the shape of the roof is not enough only the edges, but it is necessary to indicate roofs. Thus, in studies Hough Transform, also served as a tool for detection of roof planes. The only difference is that the tool used in this case is a three-dimensional.
2011-01-01
Background Many nursing and health related research studies have continuous outcome measures that are inherently non-normal in distribution. The Box-Cox transformation provides a powerful tool for developing a parsimonious model for data representation and interpretation when the distribution of the dependent variable, or outcome measure, of interest deviates from the normal distribution. The objectives of this study was to contrast the effect of obtaining the Box-Cox power transformation parameter and subsequent analysis of variance with or without a priori knowledge of predictor variables under the classic linear or linear mixed model settings. Methods Simulation data from a 3 × 4 factorial treatments design, along with the Patient Falls and Patient Injury Falls from the National Database of Nursing Quality Indicators (NDNQI®) for the 3rd quarter of 2007 from a convenience sample of over one thousand US hospitals were analyzed. The effect of the nonlinear monotonic transformation was contrasted in two ways: a) estimating the transformation parameter along with factors with potential structural effects, and b) estimating the transformation parameter first and then conducting analysis of variance for the structural effect. Results Linear model ANOVA with Monte Carlo simulation and mixed models with correlated error terms with NDNQI examples showed no substantial differences on statistical tests for structural effects if the factors with structural effects were omitted during the estimation of the transformation parameter. Conclusions The Box-Cox power transformation can still be an effective tool for validating statistical inferences with large observational, cross-sectional, and hierarchical or repeated measure studies under the linear or the mixed model settings without prior knowledge of all the factors with potential structural effects. PMID:21854614
Hou, Qingjiang; Mahnken, Jonathan D; Gajewski, Byron J; Dunton, Nancy
2011-08-19
Many nursing and health related research studies have continuous outcome measures that are inherently non-normal in distribution. The Box-Cox transformation provides a powerful tool for developing a parsimonious model for data representation and interpretation when the distribution of the dependent variable, or outcome measure, of interest deviates from the normal distribution. The objectives of this study was to contrast the effect of obtaining the Box-Cox power transformation parameter and subsequent analysis of variance with or without a priori knowledge of predictor variables under the classic linear or linear mixed model settings. Simulation data from a 3 × 4 factorial treatments design, along with the Patient Falls and Patient Injury Falls from the National Database of Nursing Quality Indicators (NDNQI® for the 3rd quarter of 2007 from a convenience sample of over one thousand US hospitals were analyzed. The effect of the nonlinear monotonic transformation was contrasted in two ways: a) estimating the transformation parameter along with factors with potential structural effects, and b) estimating the transformation parameter first and then conducting analysis of variance for the structural effect. Linear model ANOVA with Monte Carlo simulation and mixed models with correlated error terms with NDNQI examples showed no substantial differences on statistical tests for structural effects if the factors with structural effects were omitted during the estimation of the transformation parameter. The Box-Cox power transformation can still be an effective tool for validating statistical inferences with large observational, cross-sectional, and hierarchical or repeated measure studies under the linear or the mixed model settings without prior knowledge of all the factors with potential structural effects.
SEER Abstracting Tool (SEER*Abs)
With this customizable tool, registrars can collect and store data abstracted from medical records. Download the software and find technical support and reference manuals. SEER*Abs has features for creating records, managing abstracting work and data, accessing reference data, and integrating edits.
Computer applications in diagnostic imaging.
Horii, S C
1991-03-01
This article has introduced the nature, generation, use, and future of digital imaging. As digital technology has transformed other aspects of our lives--has the reader tried to buy a conventional record album recently? almost all music store stock is now compact disks--it is sure to continue to transform medicine as well. Whether that transformation will be to our liking as physicians or a source of frustration and disappointment is dependent on understanding the issues involved.
Emoto, Akira; Fukuda, Takashi
2013-02-20
For Fourier transform holography, an effective random phase distribution with randomly displaced phase segments is proposed for obtaining a smooth finite optical intensity distribution in the Fourier transform plane. Since unitary phase segments are randomly distributed in-plane, the blanks give various spatial frequency components to an image, and thus smooth the spectrum. Moreover, by randomly changing the phase segment size, spike generation from the unitary phase segment size in the spectrum can be reduced significantly. As a result, a smooth spectrum including sidebands can be formed at a relatively narrow extent. The proposed phase distribution sustains the primary functions of a random phase mask for holographic-data recording and reconstruction. Therefore, this distribution is expected to find applications in high-density holographic memory systems, replacing conventional random phase mask patterns.
Simultaneous measurement of translation and tilt using digital speckle photography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhaduri, Basanta; Quan, Chenggen; Tay, Cho Jui
2010-06-20
A Michelson-type digital speckle photographic system has been proposed in which one light beam produces a Fourier transform and another beam produces an image at a recording plane, without interfering between themselves. Because the optical Fourier transform is insensitive to translation and the imaging technique is insensitive to tilt, the proposed system is able to simultaneously and independently determine both surface tilt and translation by two separate recordings, one before and another after the surface motion, without the need to obtain solutions for simultaneous equations. Experimental results are presented to verify the theoretical analysis.
ERIC Educational Resources Information Center
Guerrero, Shannon; Baumgartel, Drew; Zobott, Maren
2013-01-01
Screencasting, or digital recordings of computer screen outputs, can be used to promote pedagogical transformation in the mathematics classroom by moving explicit, procedural-based instruction to the online environment, thus freeing classroom time for more student-centered investigations, problem solving, communication, and collaboration. This…
GIS Story Maps : A Tool to Empower and Engage Stakeholders in Planning Sustainable Places
DOT National Transportation Integrated Search
2016-10-01
Public engagement continues to be transformed by the explosion of new digital technologies/tools, software platforms, social media networks, mobile devices, and mobile apps. Recent changes in geospatial technology offer new opportunities for use in p...
Consequential Validity and the Transformation of Tests from Measurement Tools to Policy Tools
ERIC Educational Resources Information Center
Welner, Kevin G.
2013-01-01
Background/Context: Recent U.S. policy has brought a shift in assessment use, from measurement tools to policy levers. In particular, testing has become a core part of teacher evaluation policies in many states, with test results becoming akin to a job evaluation. Purpose: To explore the notion of consequential validity in assessment use and…
The Use of Videos as a Cognitive Stimulator and Instructional Tool in Tertiary ESL Classroom
ERIC Educational Resources Information Center
Kaur, Dalwinder; Yong, Esther; Zin, Norhayati Mohd; DeWitt, Dorothy
2014-01-01
Even though technology is known to have a transformative effect on teaching and learning, videos are not widely used as an instructional tool in the classrooms in Malaysia. This paper focuses on using videos a cognitive stimulator and an instructional tool especially in tertiary ESL classrooms. This paper the potential of using videos for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
WESTRICH, HENRY; WILSON, ANDREW; STANTON, ERIC
LDRDView is a software tool for visualizing a collection of textual records and exploring relationships between them for the purpose of gaining new insights about the submitted information. By evaluating the content of the records and assigning coordinates to each based on its similarity to others, LDRDView graphically displays a corpus of records either as a landscape of hills and valleys or as a graph of nodes and links. A suite of data analysis tools facilitates in-depth exploration of the corpus as a whole and the content of each individual record.
Kannan, V; Fish, JS; Mutz, JM; Carrington, AR; Lai, K; Davis, LS; Youngblood, JE; Rauschuber, MR; Flores, KA; Sara, EJ; Bhat, DG; Willett, DL
2017-01-01
Summary Background Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. Objective To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. Methods We adopted as guiding principles to (a) capture data as a by product of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed—either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM)—were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined “grains” from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week “sprints” for rapid-cycle feedback and refinement. Results Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. Conclusions This study suggests concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often. PMID:28930362
Lindberg, S; Cervin, A; Runer, T; Thomasson, L
1996-09-01
Investigations of mucociliary activity in vivo are based on photoelectric recordings of light reflections from the mucosa. The alterations in light intensity produced by the beating cilia are picked up by a photodetector and converted to photoelectric signals. The optimal processing of these signals is not known, but in vitro recordings have been reported to benefit from fast Fourier transformation (FFT) of the signal. The aim of the investigation was to study the effect of FFT for frequency analysis of photoelectric signals originating from an artificial light source simulating mucociliary activity or from sinus or nasal mucosa in vivo, as compared to a conventional method of calculating mucociliary wave frequency, in which each peak in the signal is interpreted as a beat (old method). In the experiments with the artificial light source, the FFT system was superior to the conventional method by a factor of 50 in detecting weak signals. By using FFT signal processing, frequency could be correctly calculated in experiments with a compound signal. In experiments in the rabbit maxillary sinus, the spontaneous variations were greater when signals were processed by FFT. The correlation between the two methods was excellent: r = .92. The increase in mucociliary activity in response to the ciliary stimulant methacholine at a dosage of 0.5 microgram/kg was greater measured with the FFT than with the old method (55.3% +/- 8.3% versus 43.0% +/- 8.2%, p < .05, N = 8), and only with the FFT system could a significant effect of a threshold dose (0.05 microgram/kg) of methacholine be detected. In the human nose, recordings from aluminum foil placed on the nasal dorsum and from the nasal septa mucosa displayed some similarities in the lower frequency spectrum (< 5 Hz) attributable to artifacts. The predominant cause of these artifacts was the pulse beat, whereas in the frequency spectrum above 5 Hz, results differed for the two sources of reflected light, the mean frequency in seven healthy volunteers being 7.8 +/- 1.6 Hz for the human nasal mucosa. It is concluded that the FFT system has greater sensitivity in detecting photoelectric signals derived from the mucociliary system, and that it is also a useful tool for analyzing the contributions of artifacts to the signal.
Buschmann, H; Green, P; Sambade, A; Doonan, J H; Lloyd, C W
2011-04-01
Transient transformation with Agrobacterium is a widespread tool allowing rapid expression analyses in plants. However, the available methods generate expression in interphase and do not allow the routine analysis of dividing cells. Here, we present a transient transformation method (termed 'TAMBY2') to enable cell biological studies in interphase and cell division. Agrobacterium-mediated transient gene expression in tobacco BY-2 was analysed by Western blotting and quantitative fluorescence microscopy. Time-lapse microscopy of cytoskeletal markers was employed to monitor cell division. Double-labelling in interphase and mitosis enabled localization studies. We found that the transient transformation efficiency was highest when BY-2/Agrobacterium co-cultivation was performed on solid medium. Transformants produced in this way divided at high frequency. We demonstrated the utility of the method by defining the behaviour of a previously uncharacterized microtubule motor, KinG, throughout the cell cycle. Our analyses demonstrated that TAMBY2 provides a flexible tool for the transient transformation of BY-2 with Agrobacterium. Fluorescence double-labelling showed that KinG localizes to microtubules and to F-actin. In interphase, KinG accumulates on microtubule lagging ends, suggesting a minus-end-directed function in vivo. Time-lapse studies of cell division showed that GFP-KinG strongly labels preprophase band and phragmoplast, but not the metaphase spindle. © 2010 The Authors. New Phytologist © 2010 New Phytologist Trust.
High Accuracy Evaluation of the Finite Fourier Transform Using Sampled Data
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1997-01-01
Many system identification and signal processing procedures can be done advantageously in the frequency domain. A required preliminary step for this approach is the transformation of sampled time domain data into the frequency domain. The analytical tool used for this transformation is the finite Fourier transform. Inaccuracy in the transformation can degrade system identification and signal processing results. This work presents a method for evaluating the finite Fourier transform using cubic interpolation of sampled time domain data for high accuracy, and the chirp Zeta-transform for arbitrary frequency resolution. The accuracy of the technique is demonstrated in example cases where the transformation can be evaluated analytically. Arbitrary frequency resolution is shown to be important for capturing details of the data in the frequency domain. The technique is demonstrated using flight test data from a longitudinal maneuver of the F-18 High Alpha Research Vehicle.
3-D surface profilometry based on modulation measurement by applying wavelet transform method
NASA Astrophysics Data System (ADS)
Zhong, Min; Chen, Feng; Xiao, Chao; Wei, Yongchao
2017-01-01
A new analysis of 3-D surface profilometry based on modulation measurement technique by the application of Wavelet Transform method is proposed. As a tool excelling for its multi-resolution and localization in the time and frequency domains, Wavelet Transform method with good localized time-frequency analysis ability and effective de-noizing capacity can extract the modulation distribution more accurately than Fourier Transform method. Especially for the analysis of complex object, more details of the measured object can be well remained. In this paper, the theoretical derivation of Wavelet Transform method that obtains the modulation values from a captured fringe pattern is given. Both computer simulation and elementary experiment are used to show the validity of the proposed method by making a comparison with the results of Fourier Transform method. The results show that the Wavelet Transform method has a better performance than the Fourier Transform method in modulation values retrieval.
A permanent seismic station beneath the Ocean Bottom
NASA Astrophysics Data System (ADS)
Harris, David; Cessaro, Robert K.; Duennebier, Fred K.; Byrne, David A.
1987-03-01
The Hawaii Institute of Geophysics began development of the Ocean Subbottom Seisometer (OSS) system in 1978, and OSS systems were installed in four locations between 1979 and 1982. The OSS system is a permanent, deep ocean borehole seismic recording system composed of a borehole sensor package (tool), an electromechanical cable, recorder package, and recovery system. Installed near the bottom of a borehole (drilled by the D/V Glomar Challenger), the tool contains three orthogonal, 4.5-Hz geophones, two orthogonal tilt meters; and a temperature sensor. Signals from these sensors are multiplexed, digitized (with a floating point technique), and telemetered through approximately 10 km of electromechanical cable to a recorder package located near the ocean bottom. Electrical power for the tool is supplied from the recorder package. The digital seismic signals are demultiplexed, converted back to analog form, processed through an automatic gain control (AGC) circuit, and recorded along with a time code on magnetic tape cassettes in the recorder package. Data may be recorded continuously for up to two months in the self-contained recorder package. Data may also be recorded in real time (digital formal) during the installation and subsequent recorder package servicing. The recorder package is connected to a submerged recovery buoy by a length of bouyant polypropylene rope. The anchor on the recovery buoy is released by activating either of the acoustical command releases. The polypropylene rope may also be seized with a grappling hook to effect recovery. The recorder package may be repeatedly serviced as long as the tool remains functional A wide range of data has been recovered from the OSS system. Recovered analog records include signals from natural seismic sources such as earthquakes (teleseismic and local), man-made seismic sources such as refraction seismic shooting (explosives and air cannons), and nuclear tests. Lengthy continuous recording has permitted analysis of wideband noise levels, and the slowly varying parameters, temperature and tilt.
Healthcare @ The Speed of Thought: A digital world needs successful transformative leaders.
Tremblay, Ken
2017-09-01
In the wake of transformational change powered by the digital era, resultant leadership challenges and strategies essential for successful change, both tactical and cultural, are linked to defined capabilities within the Systems Transformation domain of the LEADS in a Caring Environment framework. Honed from experience, specific softer leadership behaviours supporting system transformation are both described and reinforced. Further, a matrix combining the LEADS framework capabilities with these more specific behaviours is offered as a planning tool that leaders may reflect upon and map out key activities associated with their sponsorship of significant change.
Teachers as Allies: Transformative Practices for Teaching DREAMers and Undocumented Students
ERIC Educational Resources Information Center
Wong, Shelley, Ed.; Gosnell, Elaisa Sánchez, Ed.; Luu, Anne Marie Foerster, Ed.; Dodson, Lori, Ed.
2017-01-01
Learn how to engage and advocate for undocumented children and youth with this new resource written by and for teachers. "Teachers as Allies" provides educators with the information and tools they need to involve immigrant students and their American-born siblings and peers in inclusive and transformative classroom experiences. The…
ERIC Educational Resources Information Center
McCormack, Stephanie; Ross, Donna L.
2010-01-01
Technology can be a powerful tool to increase motivation, engagement, and achievement (Park, Khan, and Petrina 2009). In this article, the authors describe their collaborative approach to integrating technology with a lab on bacterial transformation. Students view websites and create videos to increase their conceptual understanding. Although the…
ERIC Educational Resources Information Center
Locklin, Reid B.
2010-01-01
Educational theorist Richard Kiely highlights the central importance of "high intensity dissonance" in successful international service-learning. This essay applies Kiely's model of dissonance and transformative learning to Intercordia, an international service-learning program offered at the University of St. Michael's College and the…
Transforming the Economics Curriculum by Integrating Threshold Concepts
ERIC Educational Resources Information Center
Karunaratne, Prashan Shayanka Mendis; Breyer, Yvonne A.; Wood, Leigh N.
2016-01-01
Purpose: Economics is catering to a diverse student cohort. This cohort needs to be equipped with transformative concepts that students can integrate beyond university. When a curriculum is content-driven, threshold concepts are a useful tool in guiding curriculum re-design. The paper aims to discuss these issues. Design/Methodology/Approach: The…
Support for Debugging Automatically Parallelized Programs
NASA Technical Reports Server (NTRS)
Hood, Robert; Jost, Gabriele
2001-01-01
This viewgraph presentation provides information on support sources available for the automatic parallelization of computer program. CAPTools, a support tool developed at the University of Greenwich, transforms, with user guidance, existing sequential Fortran code into parallel message passing code. Comparison routines are then run for debugging purposes, in essence, ensuring that the code transformation was accurate.
ERIC Educational Resources Information Center
Ravitch, Sharon M.
2014-01-01
Within the ever-developing, intersecting, and overlapping contexts of globalization, top-down policy, mandates, and standardization of public and higher education, many conceptualize and position practitioner research as a powerful stance and a tool of social, communal, and educational transformation, a set of methodological processes that…
Pedagogic Transformation, Student-Directed Design and Computational Thinking
ERIC Educational Resources Information Center
Vallance, Michael; Towndrow, Phillip A.
2016-01-01
In a world where technology has become pervasive in our lives, the notion of IT integration in education practice is losing its significance. It is now more appropriate to discuss transforming pedagogy where technology is not considered a tool anymore but part of what we are. To advance this hypothesis, an enterprising, student-directed approach…
Structural Order-Disorder Transformations Monitored by X-Ray Diffraction and Photoluminescence
ERIC Educational Resources Information Center
Lima, R. C.; Paris, E. C.; Leite, E. R.; Espinosa, J. W. M.; Souza, A. G.; Longo, E.
2007-01-01
A study was conducted to examine the structural order-disorder transformation promoted by controlled heat treatment using X-ray diffraction technique (XRD) and photoluminescence (PL) techniques as tools to monitor the degree of structural order. The experiment was observed to be versatile and easily achieved with low cost which allowed producing…
Unleashing Waves of Innovation: Transformative Broadband for America's Future. Version 18
ERIC Educational Resources Information Center
Western Interstate Commission for Higher Education, 2009
2009-01-01
A forward-thinking National Broadband Strategy should focus on the transformative power of advanced networks to unleash new waves of innovation, jobs, economic growth, and national competitiveness. Such a strategy should create new tools to deliver health care, education, and a low carbon economy. The American Recovery and Reinvestment Act…
ERIC Educational Resources Information Center
Heddy, Benjamin C.; Sinatra, Gale M.; Seli, Helena; Taasoobshirazi, Gita; Mukhopadhyay, Ananya
2017-01-01
The Teaching for Transformative Experience in Science (TTES) model has shown to be a useful tool to generate learning and engagement in science. We investigated the effectiveness of TTES for facilitating transformative experience (TE), learning, the development of topic interest and transfer of course concepts to other courses employing a…
Multimedia Transformation: A Special Report on Multimedia in Schools
ERIC Educational Resources Information Center
Education Week, 2011
2011-01-01
In science and math classes across the country, digital tools are being used to conduct experiments, analyze data, and run 3-D simulations to explain complex concepts. Language arts teachers are now pushing the definition of literacy to include the ability to express ideas through media. This report, "Multimedia Transformation," examines the many…
Exploiting the Brachypodium Tool Box in cereal and grass research
USDA-ARS?s Scientific Manuscript database
It is now a decade since Brachypodium distachyon was suggested as a model species for temperate grasses and cereals. Since then transformation protocols, large expressed sequence tag (EST) populations, tools for forward and reverse genetic screens, highly refined cytogenetic probes, germplasm coll...
Grand challenge commentary: Transforming biosynthesis into an information science.
Bayer, Travis S
2010-12-01
Engineering biosynthetic pathways to natural products is a challenging endeavor that promises to provide new therapeutics and tools to manipulate biology. Information-guided design strategies and tools could unlock the creativity of a wide spectrum of scientists and engineers by decoupling expertise from implementation.
Hanson, Marta
2017-09-01
Argument This article analyzes for the first time the earliest western maps of diseases in China spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine from medical geography to laboratory medicine wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy) as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, physicians marshaled disease maps for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda legitimating new medical concepts, public health interventions, and political structures governing over human and non-human populations.
NASA Astrophysics Data System (ADS)
Prawata, Albertus Galih
2017-11-01
The architectural design stages in architectural practices or in architectural design studio consist of many aspects. One of them is during the early phases of the design process, where the architects or designers try to interpret the project brief into the design concept. This paper is a report of the procedure of digital tools in the early design process in an architectural practice in Jakarta. It targets principally the use of BIM and digital modeling to generate information and transform them into conceptual forms, which is not very common in Indonesian architectural practices. Traditionally, the project brief is transformed into conceptual forms by using sketches, drawings, and physical model. The new method using digital tools shows that it is possible to do the same thing during the initial stage of the design process to create early architectural design forms. Architect's traditional tools and methods begin to be replaced effectively by digital tools, which would drive bigger opportunities for innovation.
Optimized Computer Systems for Avionics Applications.
1980-02-01
medium. The recording may be photographic ( film ) or electronic (tape, disk, or digital memory). After the recording has been completed at N distinct...data into a domain where the signal components become decorrelated. Another popular intepretation is that the transformation is a mechanism for
M, Schoo A; S, Lawn; E, Rudnik; C, Litt J
2015-12-21
Many undergraduate and graduate-entry health science curricula have incorporated training in motivational interviewing (MI). However, to effectively teach skills that will remain with students after they graduate is challenging. The aims of this study were to find out self-assessed MI skills of health students and whether reflecting on the results can promote transformative learning. Thirty-six Australian occupational therapy and physiotherapy students were taught the principles of MI, asked to conduct a motivational interview, transcribe it, self-rate it using the Motivational Interviewing Treatment Integrity (MITI) tool and reflect on the experience. Student MI skills were measured using the reported MITI subscores. Student assignments and a focus group discussion were analysed to explore the student experience using the MITI tool and self-reflection to improve their understanding of MI principles. Students found MI challenging, although identified the MITI tool as useful for promoting self-reflection and to isolate MI skills. Students self-assessed their MI skills as competent and higher than scores expected from beginners. The results inform educational programs on how MI skills can be developed for health professional students and can result in transformative learning. Students may over-state their MI skills and strategies to reduce this, including peer review, are discussed. Structured self-reflection, using tools such as the MITI can promote awareness of MI skills and compliment didactic teaching methods.
The ADE scorecards: a tool for adverse drug event detection in electronic health records.
Chazard, Emmanuel; Băceanu, Adrian; Ferret, Laurie; Ficheur, Grégoire
2011-01-01
Although several methods exist for Adverse Drug events (ADE) detection due to past hospitalizations, a tool that could display those ADEs to the physicians does not exist yet. This article presents the ADE Scorecards, a Web tool that enables to screen past hospitalizations extracted from Electronic Health Records (EHR), using a set of ADE detection rules, presently rules discovered by data mining. The tool enables the physicians to (1) get contextualized statistics about the ADEs that happen in their medical department, (2) see the rules that are useful in their department, i.e. the rules that could have enabled to prevent those ADEs and (3) review in detail the ADE cases, through a comprehensive interface displaying the diagnoses, procedures, lab results, administered drugs and anonymized records. The article shows a demonstration of the tool through a use case.
Ravanfar, Seyed Ali; Orbovic, Vladimir; Moradpour, Mahdi; Abdul Aziz, Maheran; Karan, Ratna; Wallace, Simon; Parajuli, Saroj
2017-04-01
Development of in vitro plant regeneration method from Brassica explants via organogenesis and somatic embryogenesis is influenced by many factors such as culture environment, culture medium composition, explant sources, and genotypes which are reviewed in this study. An efficient in vitro regeneration system to allow genetic transformation of Brassica is a crucial tool for improving its economical value. Methods to optimize transformation protocols for the efficient introduction of desirable traits, and a comparative analysis of these methods are also reviewed. Hence, binary vectors, selectable marker genes, minimum inhibitory concentration of selection agents, reporter marker genes, preculture media, Agrobacterium concentration and regeneration ability of putative transformants for improvement of Agrobacterium-mediated transformation of Brassica are discussed.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2004-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
A Formal Approach to Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a probably equivalent model has yet to appeal: Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a probably equivalent implementation are valuable but not su8cient. The "gap" unfilled by such tools and methods is that their. formal models cannot be proven to be equivalent to the system requirements as originated by the customel: For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a probably equivalent formal model that can be used as the basis for code generation and other transformations.
Some applications of mathematics in theoretical physics - A review
NASA Astrophysics Data System (ADS)
Bora, Kalpana
2016-06-01
Mathematics is a very beautiful subject-very much an indispensible tool for Physics, more so for Theoretical Physics (by which we mean here mainly Field Theory and High Energy Physics). These branches of Physics are based on Quantum Mechanics and Special Theory of Relativity, and many mathematical concepts are used in them. In this work, we shall elucidate upon only some of them, like-differential geometry, infinite series, Mellin transforms, Fourier and integral transforms, special functions, calculus, complex algebra, topology, group theory, Riemannian geometry, functional analysis, linear algebra, operator algebra, etc. We shall also present, some physics issues, where these mathematical tools are used. It is not wrong to say that Mathematics is such a powerful tool, without which, there can not be any Physics theory!! A brief review on our research work is also presented.
Time-frequency analysis of pediatric murmurs
NASA Astrophysics Data System (ADS)
Lombardo, Joseph S.; Blodgett, Lisa A.; Rosen, Ron S.; Najmi, Amir-Homayoon; Thompson, W. Reid
1998-05-01
Technology has provided many new tools to assist in the diagnosis of pathologic conditions of the heart. Echocardiography, Ultrafast CT, and MRI are just a few. While these tools are a valuable resource, they are typically too expensive, large and complex in operation for use in rural, homecare, and physician's office settings. Recent advances in computer performance, miniaturization, and acoustic signal processing, have yielded new technologies that when applied to heart sounds can provide low cost screening for pathologic conditions. The short duration and transient nature of these signals requires processing techniques that provide high resolution in both time and frequency. Short-time Fourier transforms, Wigner distributions, and wavelet transforms have been applied to signals form hearts with various pathologic conditions. While no single technique provides the ideal solution, the combination of tools provides a good representation of the acoustic features of the pathologies selected.
RxnSim: a tool to compare biochemical reactions.
Giri, Varun; Sivakumar, Tadi Venkata; Cho, Kwang Myung; Kim, Tae Yong; Bhaduri, Anirban
2015-11-15
: Quantitative assessment of chemical reaction similarity aids database searches, classification of reactions and identification of candidate enzymes. Most methods evaluate reaction similarity based on chemical transformation patterns. We describe a tool, RxnSim, which computes reaction similarity based on the molecular signatures of participating molecules. The tool is able to compare reactions based on similarities of substrates and products in addition to their transformation. It allows masking of user-defined chemical moieties for weighted similarity computations. RxnSim is implemented in R and is freely available from the Comprehensive R Archive Network, CRAN (http://cran.r-project.org/web/packages/RxnSim/). anirban.b@samsung.com or ty76.kim@samsung.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Off-axis illumination direct-to-digital holography
Thomas, Clarence E.; Price, Jeffery R.; Voelkl, Edgar; Hanson, Gregory R.
2004-06-08
Systems and methods are described for off-axis illumination direct-to-digital holography. A method of recording an off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis, includes: reflecting a reference beam from a reference mirror at a non-normal angle; reflecting an object beam from an object at an angle with respect to an optical axis defined by a focusing lens; focusing the reference beam and the object beam at a focal plane of a digital recorder to form the off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; digitally recording the off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; Fourier analyzing the recorded off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes by transforming axes of the recorded off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes in Fourier space to sit on top of a heterodyne carrier frequency defined as an angle between the reference beam and the object beam; applying a digital filter to cut off signals around an original origin; and then performing an inverse Fourier transform.
ERIC Educational Resources Information Center
Falk, Beverly; Darling-Hammond, Linda
This report examines outcomes of the Primary Language Record (PLR), a program for systematically observing students in various aspects of their literacy development. The PLR uses classroom events and samples of student work to record students' progress and interests, recommend strategies for addressing needs and building on talents, and discuss…
Preventive overhaul time for power transformers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarmadi, M.; Rouhi, J.; Fayyaz, A.
Power transformers are the major piece of equipment in high-voltage substations. A considerable number of these transformers exist in Iran`s integrated network. Due to the climate diversity and improper usage, many of these transformers age rapidly, suffer failure and are taken out of service before half their useful life. At the present time the utility companies have no specific time-frame and plan for preventive overhaul. Detection of preventive overhaul time will increase the remaining life of transformers and improve the reliability of substations. An exact check of the remaining lifetime of transformers is not yet possible by available diagnostic techniques.more » In this paper, the authors present a method of identifying the right time for preventive overhaul in 63 kV power transformers. This method is developed based on 25 year transformer performance records in Northern Iran (subtropical climate) and with the utilization of studies done by electrical engineering communities world-wide.« less
Why Lean doesn't work for everyone.
Kaplan, Gary S; Patterson, Sarah H; Ching, Joan M; Blackmore, C Craig
2014-12-01
Popularisation of Lean in healthcare has led to emphasis on Lean quality improvement tools in isolation, with inconsistent results. We argue that delivery of safer, more efficient, and higher quality-patient focused care requires organisational transformation of which the Lean toolkit is only one component. To successfully facilitate system transformation toward higher quality care at lower cost, Lean tools must be part of a comprehensive management system, within a supportive institutional culture, and with committed leadership. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
A Final Approach Trajectory Model for Current Operations
NASA Technical Reports Server (NTRS)
Gong, Chester; Sadovsky, Alexander
2010-01-01
Predicting accurate trajectories with limited intent information is a challenge faced by air traffic management decision support tools in operation today. One such tool is the FAA's Terminal Proximity Alert system which is intended to assist controllers in maintaining safe separation of arrival aircraft during final approach. In an effort to improve the performance of such tools, two final approach trajectory models are proposed; one based on polynomial interpolation, the other on the Fourier transform. These models were tested against actual traffic data and used to study effects of the key final approach trajectory modeling parameters of wind, aircraft type, and weight class, on trajectory prediction accuracy. Using only the limited intent data available to today's ATM system, both the polynomial interpolation and Fourier transform models showed improved trajectory prediction accuracy over a baseline dead reckoning model. Analysis of actual arrival traffic showed that this improved trajectory prediction accuracy leads to improved inter-arrival separation prediction accuracy for longer look ahead times. The difference in mean inter-arrival separation prediction error between the Fourier transform and dead reckoning models was 0.2 nmi for a look ahead time of 120 sec, a 33 percent improvement, with a corresponding 32 percent improvement in standard deviation.
PredicT-ML: a tool for automating machine learning model building with big clinical data.
Luo, Gang
2016-01-01
Predictive modeling is fundamental to transforming large clinical data sets, or "big clinical data," into actionable knowledge for various healthcare applications. Machine learning is a major predictive modeling approach, but two barriers make its use in healthcare challenging. First, a machine learning tool user must choose an algorithm and assign one or more model parameters called hyper-parameters before model training. The algorithm and hyper-parameter values used typically impact model accuracy by over 40 %, but their selection requires many labor-intensive manual iterations that can be difficult even for computer scientists. Second, many clinical attributes are repeatedly recorded over time, requiring temporal aggregation before predictive modeling can be performed. Many labor-intensive manual iterations are required to identify a good pair of aggregation period and operator for each clinical attribute. Both barriers result in time and human resource bottlenecks, and preclude healthcare administrators and researchers from asking a series of what-if questions when probing opportunities to use predictive models to improve outcomes and reduce costs. This paper describes our design of and vision for PredicT-ML (prediction tool using machine learning), a software system that aims to overcome these barriers and automate machine learning model building with big clinical data. The paper presents the detailed design of PredicT-ML. PredicT-ML will open the use of big clinical data to thousands of healthcare administrators and researchers and increase the ability to advance clinical research and improve healthcare.
78 FR 76789 - Additional Connect America Fund Phase II Issues
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-19
... inspection and copying during normal business hours in the FCC Reference Information Center, Portals II, 445... Phase I to Phase II. 2. Timing of Phase II Support Disbursements. In the USF/ICC Transformation Order... language in paragraph 180 of the USF/ICC Transformation Order. We now seek to more fully develop the record...
American Higher Education Transformed, 1940-2005: Documenting the National Discourse
ERIC Educational Resources Information Center
Smith, Wilson, Ed.; Bender, Thomas, Ed.
2008-01-01
This long-awaited sequel to Richard Hofstadter and Wilson Smith's classic anthology "American Higher Education: A Documentary History" presents one hundred and seventy-two key edited documents that record the transformation of higher education over the past sixty years. The volume includes such seminal documents as Vannevar Bush's 1945…
Transformation through Language Use: Children's Spontaneous Metaphors in Elementary School Science
ERIC Educational Resources Information Center
Jakobson, Britt; Wickman, Per-Olof
2007-01-01
This article examines the role elementary school children's spontaneous metaphors play in learning science. The data consists of tape recordings of about 25 h from five different schools. The material is analysed using a practical epistemology analysis and by using Dewey's ideas on the continuity and transformation of experience. The results show…
Reflection Promotes Transformation in a Service Learning Course
ERIC Educational Resources Information Center
Stover, Caitlin M.
2016-01-01
The purpose of this paper is to outline the delivery of a Master's level, Community/Public Health Nursing service learning course that spanned a two semester academic year. The instructor of record created a reflection binder with selected assignments to facilitate the transformative learning process that occurred during the course. Analyses of…
Open source cardiology electronic health record development for DIGICARDIAC implementation
NASA Astrophysics Data System (ADS)
Dugarte, Nelson; Medina, Rubén.; Huiracocha, Lourdes; Rojas, Rubén.
2015-12-01
This article presents the development of a Cardiology Electronic Health Record (CEHR) system. Software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. Novelty of the system is the integration of high resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. Acquisition tools are for management and control of the DIGICARDIAC electrocardiograph functions. Processing tools allow management of HRECG signal analysis searching for indicative patterns of cardiovascular pathologies. Telecardiology tools incorporation allows system communication with other health care centers decreasing access time to the patient information. CEHR system was completely developed using open source software. Preliminary results of process validation showed the system efficiency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poling, Whitney A.; Savic, Vesna; Hector, Louis G.
2016-04-05
The strain-induced, diffusionless shear transformation of retained austenite to martensite during straining of transformation induced plasticity (TRIP) assisted steels increases strain hardening and delays necking and fracture leading to exceptional ductility and strength, which are attractive for automotive applications. A novel technique that provides the retained austenite volume fraction variation with strain in TRIP-assisted steels with improved precision is presented. Digital images of the gauge section of tensile specimens were first recorded up to selected plastic strains with a stereo digital image correlation (DIC) system. The austenite volume fraction was measured by synchrotron X-ray diffraction from small squares cut frommore » the gage section. Strain fields in the squares were then computed by localizing the strain measurement to the corresponding region of a given square during DIC post-processing of the images recorded during tensile testing. Results obtained for a QP980 steel are used to study the influence of initial volume fraction of austenite and the austenite transformation with strain on tensile mechanical behavior.« less
Foster, Wendy; Gilder, Jason; Love, Thomas E; Jain, Anil K
2012-01-01
Objective To demonstrate the potential of de-identified clinical data from multiple healthcare systems using different electronic health records (EHR) to be efficiently used for very large retrospective cohort studies. Materials and methods Data of 959 030 patients, pooled from multiple different healthcare systems with distinct EHR, were obtained. Data were standardized and normalized using common ontologies, searchable through a HIPAA-compliant, patient de-identified web application (Explore; Explorys Inc). Patients were 26 years or older seen in multiple healthcare systems from 1999 to 2011 with data from EHR. Results Comparing obese, tall subjects with normal body mass index, short subjects, the venous thromboembolic events (VTE) OR was 1.83 (95% CI 1.76 to 1.91) for women and 1.21 (1.10 to 1.32) for men. Weight had more effect then height on VTE. Compared with Caucasian, Hispanic/Latino subjects had a much lower risk of VTE (female OR 0.47, 0.41 to 0.55; male OR 0.24, 0.20 to 0.28) and African-Americans a substantially higher risk (female OR 1.83, 1.76 to 1.91; male OR 1.58, 1.50 to 1.66). This 13-year retrospective study of almost one million patients was performed over approximately 125 h in 11 weeks, part time by the five authors. Discussion As research informatics tools develop and more clinical data become available in EHR, it is important to study and understand unique opportunities for clinical research informatics to transform the scale and resources needed to perform certain types of clinical research. Conclusions With the right clinical research informatics tools and EHR data, some types of very large cohort studies can be completed with minimal resources. PMID:22759621
Routes to new networks : a guide to social media for the public transportation industry
DOT National Transportation Integrated Search
2009-11-01
Today, utilization of many social media tools is seen as an added-value opportunity. AS society transforms, these tools will quickly become a necessity for those looking to communicate. It will be imperative that you are equipped with the knowledge a...
29 CFR 1926.951 - Tools and protective equipment.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., DEPARTMENT OF LABOR (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR CONSTRUCTION Power Transmission and Distribution § 1926.951 Tools and protective equipment. (a) Protective equipment. (1)(i) Rubber protective... “Double Insulated”; or (iii) Be connected to the power supply by means of an isolating transformer, or...
29 CFR 1926.951 - Tools and protective equipment.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., DEPARTMENT OF LABOR (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR CONSTRUCTION Power Transmission and Distribution § 1926.951 Tools and protective equipment. (a) Protective equipment. (1)(i) Rubber protective... “Double Insulated”; or (iii) Be connected to the power supply by means of an isolating transformer, or...
29 CFR 1926.951 - Tools and protective equipment.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., DEPARTMENT OF LABOR (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR CONSTRUCTION Power Transmission and Distribution § 1926.951 Tools and protective equipment. (a) Protective equipment. (1)(i) Rubber protective... “Double Insulated”; or (iii) Be connected to the power supply by means of an isolating transformer, or...
29 CFR 1926.951 - Tools and protective equipment.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., DEPARTMENT OF LABOR (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR CONSTRUCTION Power Transmission and Distribution § 1926.951 Tools and protective equipment. (a) Protective equipment. (1)(i) Rubber protective... “Double Insulated”; or (iii) Be connected to the power supply by means of an isolating transformer, or...
Connected Learning Communities: A Toolkit for Reinventing High School.
ERIC Educational Resources Information Center
Almeida, Cheryl, Ed.; Steinberg, Adria, Ed.
This document presents tools and guidelines to help practitioners transform their high schools into institutions facilitating community-connected learning. The approach underpinning the tools and guidelines is based on the following principles: academic rigor and relevance; personalized learning; self-passage to adulthood; and productive learning…
Doing Academic Planning: Effective Tools for Decision Making.
ERIC Educational Resources Information Center
Nedwek, Brian P., Ed.
This sourcebook was designed to provide academic planners with the tools to perform core functions and activities that facilitate the transformation of higher education institutions from provider-centered cultures and organizations to learner-centered franchises. The readings examine partnerships and alliances needed for higher education to…
Patient Perceptions of Electronic Health Records
ERIC Educational Resources Information Center
Lulejian, Armine
2011-01-01
Research objective. Electronic Health Records (EHR) are expected to transform the way medicine is delivered with patients/consumers being the intended beneficiaries. However, little is known regarding patient knowledge and attitudes about EHRs. This study examined patient perceptions about EHR. Study design. Surveys were administered following…
Winman, Thomas; Rystedt, Hans
2011-03-01
The implementation of generic models for organizing information in complex institutions like those in healthcare creates a gap between standardization and the need for locally relevant knowledge. The present study addresses how this gap can be bridged by focusing on the practical work of healthcare staff in transforming information in EPRs into knowledge that is useful for everyday work. Video recording of shift handovers on a rehabilitation ward serves as the empirical case. The results show how extensive selections and reorganizations of information in EPRs are carried out in order to transform information into professionally relevant accounts. We argue that knowledge about the institutional obligations and professional ways of construing information are fundamental for these transitions. The findings point to the need to consider the role of professional knowledge inherent in unpacking information in efforts to develop information systems intended to bridge between institutional and professional boundaries in healthcare. © The Author(s) 2011.
Noise Reduction in Breath Sound Files Using Wavelet Transform Based Filter
NASA Astrophysics Data System (ADS)
Syahputra, M. F.; Situmeang, S. I. G.; Rahmat, R. F.; Budiarto, R.
2017-04-01
The development of science and technology in the field of healthcare increasingly provides convenience in diagnosing respiratory system problem. Recording the breath sounds is one example of these developments. Breath sounds are recorded using a digital stethoscope, and then stored in a file with sound format. This breath sounds will be analyzed by health practitioners to diagnose the symptoms of disease or illness. However, the breath sounds is not free from interference signals. Therefore, noise filter or signal interference reduction system is required so that breath sounds component which contains information signal can be clarified. In this study, we designed a filter called a wavelet transform based filter. The filter that is designed in this study is using Daubechies wavelet with four wavelet transform coefficients. Based on the testing of the ten types of breath sounds data, the data is obtained in the largest SNRdB bronchial for 74.3685 decibels.
ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.
Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J; Intarapanich, Apichart; Tongsima, Sissades; Piriyapongsa, Jittima
2017-01-01
Biochemical methods are available for enriching 5' ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends from these data by statistical analysis of the enrichment. Although statistical-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5'ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER).
Watershed Management Optimization Support Tool (WMOST) ...
EPA's Watershed Management Optimization Support Tool (WMOST) version 2 is a decision support tool designed to facilitate integrated water management by communities at the small watershed scale. WMOST allows users to look across management options in stormwater (including green infrastructure), wastewater, drinking water, and land conservation programs to find the least cost solutions. The pdf version of these presentations accompany the recorded webinar with closed captions to be posted on the WMOST web page. The webinar was recorded at the time a training workshop took place for EPA's Watershed Management Optimization Support Tool (WMOST, v2).
Transformative occupational therapy: We are wired to be transformers.
Dubouloz, Claire-Jehanne
2014-10-01
Transformative learning involves critical self-reflection as the motor for transforming values, beliefs, knowledge, and feelings and discovering the new meaning of daily life following a catastrophic injury or illness. Transformation has been conceptualized in various disciplines as a transcendent experience, rebirth process, and meaning-making process and within occupational therapy as a meaning perspective process. This Muriel Driver lecture explores the concept of transformation and presents the newly developed Meaning Perspectives Transformation model, constructed from research conducted with several different rehabilitation client groups. The model is characterized by three phases: trigger, changing, and outcomes. A client's critical self-reflection acts as a catalyst for moving between the phases and is represented in the model as a moment of readiness for change leading to the development of alternative ways of performing. The Meaning Perspectives Transformation model provides a tool for being an effective occupational therapist, encouraging therapists to listen closely to their clients to identify their weakening and emerging meaning perspectives and enable their occupational evolution and transformation.
Incipient fault diagnosis of power transformers using optical spectro-photometric technique
NASA Astrophysics Data System (ADS)
Hussain, K.; Karmakar, Subrata
2015-06-01
Power transformers are the vital equipment in the network of power generation, transmission and distribution. Mineral oil in oil-filled transformers plays very important role as far as electrical insulation for the winding and cooling of the transformer is concerned. As transformers are always under the influence of electrical and thermal stresses, incipient faults like partial discharge, sparking and arcing take place. As a result, mineral oil deteriorates there by premature failure of the transformer occurs causing huge losses in terms of revenue and assets. Therefore, the transformer health condition has to be monitored continuously. The Dissolved Gas Analysis (DGA) is being extensively used for this purpose, but it has some drawbacks like it needs carrier gas, regular instrument calibration, etc. To overcome these drawbacks, Ultraviolet (UV) -Visible and Fourier Transform Infrared (FTIR) Spectro-photometric techniques are used as diagnostic tools for investigating the degraded transformer oil affected by electrical, mechanical and thermal stresses. The technique has several advantages over the conventional DGA technique.
NASA Astrophysics Data System (ADS)
Reolon, David; Jacquot, Maxime; Verrier, Isabelle; Brun, Gérald; Veillas, Colette
2006-12-01
In this paper we propose group refractive index measurement with a spectral interferometric set-up using a broadband supercontinuum generated in an air-silica Microstructured Optical Fibre (MOF) pumped with a picosecond pulsed microchip laser. This source authorizes high fringes visibility for dispersion measurements by Spectroscopic Analysis of White Light Interferograms (SAWLI). Phase calculation is assumed by a wavelet transform procedure combined with a curve fit of the recorded channelled spectrum intensity. This approach provides high resolution and absolute group refractive index measurements along one line of the sample by recording a single 2D spectral interferogram without mechanical scanning.
ERIC Educational Resources Information Center
Pejakovic, Sara
2016-01-01
The transformation of the developmental process from animal rationale, through homo communicans into the (un)aware homo symbolicum and the man receiving and distributing media information today, available through multimedia tools in his everyday life, encourages thought on the contemporary man, as well as the purpose, point and sense in…
ERIC Educational Resources Information Center
Hartley, Laurel M.; Wilke, Brook J.; Schramm, Jonathon W.; D'Avanzo, Charlene; Anderson, Charles W.
2011-01-01
Processes that transform carbon (e.g., photosynthesis) play a prominent role in college biology courses. Our goals were to learn about student reasoning related to these processes and provide faculty with tools for instruction and assessment. We created a framework illustrating how carbon-transforming processes can be related to one another during…
MATILDA: A Military Laser Range Safety Tool Based on Probabilistic Risk Assessment (PRA) Techniques
2014-08-01
Figure 6: MATILDA Coordinate Transformations ....................................................... 22 Figure 7: Geocentric and MICS Coordinates...Target – Range Boundary Undershoot Geometry .............. 34 Figure 19: Geocentric Overshoot Geometry and Parameters...transformed into Geocentric coordinates, a Cartesian (x,y,z) coordinate system with origin at the center of the Earth and z-axis oriented towards the
ERIC Educational Resources Information Center
Kawinkamolroj, Milintra; Triwaranyu, Charinee; Thongthew, Sumlee
2015-01-01
This research aimed to develop coaching process based on transformative learning theory for changing the mindset about instruction of elementary school teachers. Tools used in this process include mindset tests and questionnaires designed to assess the instructional mindset of teachers and to allow the teachers to reflect on how they perceive…
University Teachers' Perspectives on the Role of the Laplace Transform in Engineering Education
ERIC Educational Resources Information Center
Holmberg, Margarita; Bernhard, Jonte
2017-01-01
The Laplace transform is an important tool in many branches of engineering, for example, electric and control engineering, but is also regarded as a difficult topic for students to master. We have interviewed 22 university teachers from five universities in three countries (Mexico, Spain and Sweden) about their views on relationships among…
The iNACOL State Policy Frameworks: 5 Critical Issues to Transform K-12 Education
ERIC Educational Resources Information Center
Worthen, Maria; Patrick, Susan
2014-01-01
Over the last decade, the American education system has seen unprecedented transformation of teaching and learning as educators have grasped the power of new learning models to close achievement gaps and extend access to high-quality learning opportunities. The availability of adaptive digital tools that use data to improve student learning has…
Nanjareddy, Kalpana; Arthikala, Manoj-Kumar; Blanco, Lourdes; Arellano, Elizabeth S; Lara, Miguel
2016-06-24
Phaseolus vulgaris is one of the most extensively studied model legumes in the world. The P. vulgaris genome sequence is available; therefore, the need for an efficient and rapid transformation system is more imperative than ever. The functional characterization of P. vulgaris genes is impeded chiefly due to the non-amenable nature of Phaseolus sp. to stable genetic transformation. Transient transformation systems are convenient and versatile alternatives for rapid gene functional characterization studies. Hence, the present work focuses on standardizing methodologies for protoplast isolation from multiple tissues and transient transformation protocols for rapid gene expression analysis in the recalcitrant grain legume P. vulgaris. Herein, we provide methodologies for the high-throughput isolation of leaf mesophyll-, flower petal-, hypocotyl-, root- and nodule-derived protoplasts from P. vulgaris. The highly efficient polyethylene glycol-mannitol magnesium (PEG-MMG)-mediated transformation of leaf mesophyll protoplasts was optimized using a GUS reporter gene. We used the P. vulgaris SNF1-related protein kinase 1 (PvSnRK1) gene as proof of concept to demonstrate rapid gene functional analysis. An RT-qPCR analysis of protoplasts that had been transformed with PvSnRK1-RNAi and PvSnRK1-OE vectors showed the significant downregulation and ectopic constitutive expression (overexpression), respectively, of the PvSnRK1 transcript. We also demonstrated an improved transient transformation approach, sonication-assisted Agrobacterium-mediated transformation (SAAT), for the leaf disc infiltration of P. vulgaris. Interestingly, this method resulted in a 90 % transformation efficiency and transformed 60-85 % of the cells in a given area of the leaf surface. The constitutive expression of YFP further confirmed the amenability of the system to gene functional characterization studies. We present simple and efficient methodologies for protoplast isolation from multiple P. vulgaris tissues. We also provide a high-efficiency and amenable method for leaf mesophyll transformation for rapid gene functional characterization studies. Furthermore, a modified SAAT leaf disc infiltration approach aids in validating genes and their functions. Together, these methods help to rapidly unravel novel gene functions and are promising tools for P. vulgaris research.
Tool Helps Utilities Assess Readiness for Electric Vehicle Charging (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
NREL research helps answer a fundamental question regarding electric vehicles: Is the grid ready to handle them? Environmental, economic and security concerns regarding oil consumption make electrifying the transportation sector a high national priority. NREL's Center for Transportation Technologies & Systems (CTTS) has developed a framework for utilities to evaluate the plug-in vehicle (PEV) readiness of distribution transformers. Combining a wealth of vehicle performance statistics with load data from partner utilities including the Hawaiian Electric Company and Xcel Energy, NREL analyzed the thermal loading characteristics of distribution transformers due to vehicle charging. After running millions of simulations replicating varying climatesmore » and conditions, NREL is now able to predict aging rates for transformers when PEVs are added to existing building loads. With the NREL tool, users define simulation parameters by inputting vehicle trip and weather data; transformer load profiles and ratings; PEV penetration, charging rates and battery sizes; utility rates; the number of houses on each transformer; and public charging availability. Transformer load profiles, drive cycles, and ambient temperature data are then run through the thermal model to produce a one-year timeseries of the hotspot temperature. Annual temperature durations are calculated to help determine the annual aging rate. Annual aging rate results are grouped by independent variables. The most useful measure is transformer mileage, a measure of how many electrically-driven miles must be supplied by the transformer. Once the spectrum analysis has been conducted for an area or utility, the outputs can be used to help determine if more detailed evaluation is necessary, or if transformer replacement is required. In the majority of scenarios, transformers have enough excess capacity to charge PEVs. Only in extreme cases does vehicle charging have negative long-term impact on transformers. In those cases, upgrades to larger transformers would be recommended. NREL analysis also showed opportunity for newly-installed smart grids to offset distribution demands by time-shifting the charging loads. Most importantly, the model demonstrated synergies between PEVs and distributed renewables, not only providing clean renewable energy for vehicles, but also reducing demand on the entire distribution infrastructure by supplying loads at the point of consumption.« less
29 CFR 1926.951 - Tools and protective equipment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... falling objects, electric shock, or burns. (b) Personal climbing equipment. (1) Body belts with straps or...) Be equipped with three-wire cord having the ground wire permanently connected to the tool frame and... “Double Insulated”; or (iii) Be connected to the power supply by means of an isolating transformer, or...
Use of Synchronous Online Tools in Private English Language Teaching in Russia
ERIC Educational Resources Information Center
Kozar, Olga
2012-01-01
Like many other industries, private tutoring is now being transformed by the growth of information and communication technologies (ICT). An increasing number of educational entrepreneurs in different countries are incorporating Internet tools in their professional practice. While the popularity of online tutoring in countries with widespread…
Cell Phones Transform a Science Methods Course
ERIC Educational Resources Information Center
Madden, Lauren
2012-01-01
A science methods instructor intentionally encouraged cell phone use for class work to discover how cell phones can be used as research tools to enhance the content and engage the students. The anecdotal evidence suggested that students who used their smartphones as research tools experienced the science content and pedagogical information…
USDA-ARS?s Scientific Manuscript database
Increasing availability of genomic data and sophistication of analytical methodology in fungi has elevated the need for functional genomics tools in these organisms. Gene deletion is a critical tool for functional analysis. The targeted deletion of genes requires both a suitable method for the trans...
A Learning Management System Enhanced with Internet of Things Applications
ERIC Educational Resources Information Center
Mershad, Khaleel; Wakim, Pilar
2018-01-01
A breakthrough in the development of online learning occurred with the utilization of Learning Management Systems (LMS) as a tool for creating, distributing, tracking, and managing various types of educational and training material. Since the appearance of the first LMS, major technological enhancements transformed this tool into a powerful…
Research Interests Optimization and modeling techniques Economic impacts of energy sector transformation . Transportation Research Record. Caron, J, S Cohen, J Reilly, M Brown. 2018. Exploring the Impacts of a National : Economic and GHG Impacts of a National Low Carbon Fuel Standard. Transportation Research Record: Journal of
Rapid Benefit Indicator (RBI) Checklist Tool - Quick Start ...
The Rapid Benefits Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration – A Rapid Benefits Indicators Approach for Decision Makers. This checklist tool is intended to be used to record information as you answer the questions in that guide. When performing a Rapid Benefits Indicator (RBI) assessment on wetlands restoration site(s) results can be recorded and reviewed using this VBA enabled MS Excel Checklist Tool.
NASA Astrophysics Data System (ADS)
Pioldi, Fabio; Rizzi, Egidio
2017-07-01
Output-only structural identification is developed by a refined Frequency Domain Decomposition ( rFDD) approach, towards assessing current modal properties of heavy-damped buildings (in terms of identification challenge), under strong ground motions. Structural responses from earthquake excitations are taken as input signals for the identification algorithm. A new dedicated computational procedure, based on coupled Chebyshev Type II bandpass filters, is outlined for the effective estimation of natural frequencies, mode shapes and modal damping ratios. The identification technique is also coupled with a Gabor Wavelet Transform, resulting in an effective and self-contained time-frequency analysis framework. Simulated response signals generated by shear-type frames (with variable structural features) are used as a necessary validation condition. In this context use is made of a complete set of seismic records taken from the FEMA P695 database, i.e. all 44 "Far-Field" (22 NS, 22 WE) earthquake signals. The modal estimates are statistically compared to their target values, proving the accuracy of the developed algorithm in providing prompt and accurate estimates of all current strong ground motion modal parameters. At this stage, such analysis tool may be employed for convenient application in the realm of Earthquake Engineering, towards potential Structural Health Monitoring and damage detection purposes.
Fracture behavior of reinforced aluminum alloy matrix composites using thermal imaging tools
NASA Astrophysics Data System (ADS)
Avdelidis, N. P.; Exarchos, D.; Vazquez, P.; Ibarra-Castanedo, C.; Sfarra, S.; Maldague, X. P. V.; Matikas, T. E.
2016-05-01
In this work the influence of the microstructure at the vicinity of the interface on the fracture behavior of particulate-reinforced aluminum alloy matrix composites (Al/SiCp composites) is studied by using thermographic tools. In particular, infrared thermography was used to monitor the plane crack propagation behavior of the materials. The deformation of solid materials is almost always accompanied by heat release. When the material becomes deformed or is damaged and fractured, a part of the energy necessary to initiate and propagate the damage is transformed in an irreversible way into heat. The thermal camera detects the heat wave, generated by the thermo-mechanical coupling and the intrinsic dissipated energy during mechanical loading of the sample. By using an adapted detector, thermography records the two dimensional "temperature" field as it results from the infrared radiation emitted by the object. The principal advantage of infrared thermography is its noncontact, non-destructive character. This methodology is being applied to characterise the fracture behavior of the particulate composites. Infrared thermography is being used to monitor the plane crack propagation behavior of such materials. Furthermore, an innovative approach to use microscopic measurements using IR microscopic lenses was attempted, in order to enable smaller features (in the micro scale) to be imaged with accuracy and assurance.
EHR Documentation: The Hype and the Hope for Improving Nursing Satisfaction and Quality Outcomes.
OʼBrien, Ann; Weaver, Charlotte; Settergren, Theresa Tess; Hook, Mary L; Ivory, Catherine H
2015-01-01
The phenomenon of "data rich, information poor" in today's electronic health records (EHRs) is too often the reality for nursing. This article proposes the redesign of nursing documentation to leverage EHR data and clinical intelligence tools to support evidence-based, personalized nursing care across the continuum. The principles consider the need to optimize nurses' documentation efficiency while contributing to knowledge generation. The nursing process must be supported by EHRs through integration of best care practices: seamless workflows that display the right tools, evidence-based content, and information at the right time for optimal clinical decision making. Design of EHR documentation must attain a balance that ensures the capture of nursing's impact on safety, quality, highly reliable care, patient engagement, and satisfaction, yet minimizes "death by data entry." In 2014, a group of diverse informatics leaders from practice, academia, and the vendor community formed to address how best to transform electronic documentation to provide knowledge at the point of care and to deliver value to front line nurses and nurse leaders. As our health care system moves toward reimbursement on the basis of quality outcomes and prevention, the value of nursing data in this business proposition will become a key differentiator for health care organizations' economic success.
Applying the Karma Provenance tool to NASA's AMSR-E Data Production Stream
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Conover, H.; Regner, K.; Movva, S.; Goodman, H. M.; Pale, B.; Purohit, P.; Sun, Y.
2010-12-01
Current procedures for capturing and disseminating provenance, or data product lineage, are limited in both what is captured and how it is disseminated to the science community. For example, the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) Science Investigator-led Processing System (SIPS) generates Level 2 and Level 3 data products for a variety of geophysical parameters. Data provenance and quality information for these data sets is either very general (e.g., user guides, a list of anomalous data receipt and processing conditions over the life of the missions) or difficult to access or interpret (e.g., quality flags embedded in the data, production history files not easily available to users). Karma is a provenance collection and representation tool designed and developed for data driven workflows such as the productions streams used to produce EOS standard products. Karma records uniform and usable provenance metadata independent of the processing system while minimizing both the modification burden on the processing system and the overall performance overhead. Karma collects both the process and data provenance. The process provenance contains information about the workflow execution and the associated algorithm invocations. The data provenance captures metadata about the derivation history of the data product, including algorithms used and input data sources transformed to generate it. As part of an ongoing NASA funded project, Karma is being integrated into the AMSR-E SIPS data production streams. Metadata gathered by the tool will be presented to the data consumers as provenance graphs, which are useful in validating the workflows and determining the quality of the data product. This presentation will discuss design and implementation issues faced while incorporating a provenance tool into a structured data production flow. Prototype results will also be presented in this talk.
Efficient Execution Methods of Pivoting for Bulk Extraction of Entity-Attribute-Value-Modeled Data
Luo, Gang; Frey, Lewis J.
2017-01-01
Entity-attribute-value (EAV) tables are widely used to store data in electronic medical records and clinical study data management systems. Before they can be used by various analytical (e.g., data mining and machine learning) programs, EAV-modeled data usually must be transformed into conventional relational table format through pivot operations. This time-consuming and resource-intensive process is often performed repeatedly on a regular basis, e.g., to provide a daily refresh of the content in a clinical data warehouse. Thus, it would be beneficial to make pivot operations as efficient as possible. In this paper, we present three techniques for improving the efficiency of pivot operations: 1) filtering out EAV tuples related to unneeded clinical parameters early on; 2) supporting pivoting across multiple EAV tables; and 3) conducting multi-query optimization. We demonstrate the effectiveness of our techniques through implementation. We show that our optimized execution method of pivoting using these techniques significantly outperforms the current basic execution method of pivoting. Our techniques can be used to build a data extraction tool to simplify the specification of and improve the efficiency of extracting data from the EAV tables in electronic medical records and clinical study data management systems. PMID:25608318
Kobayashi, Katsuhiro; Jacobs, Julia; Gotman, Jean
2013-01-01
Objective A novel type of statistical time-frequency analysis was developed to elucidate changes of high-frequency EEG activity associated with epileptic spikes. Methods The method uses the Gabor Transform and detects changes of power in comparison to background activity using t-statistics that are controlled by the false discovery rate (FDR) to correct type I error of multiple testing. The analysis was applied to EEGs recorded at 2000 Hz from three patients with mesial temporal lobe epilepsy. Results Spike-related increase of high-frequency oscillations (HFOs) was clearly shown in the FDR-controlled t-spectra: it was most dramatic in spikes recorded from the hippocampus when the hippocampus was the seizure onset zone (SOZ). Depression of fast activity was observed immediately after the spikes, especially consistently in the discharges from the hippocampal SOZ. It corresponded to the slow wave part in case of spike-and-slow-wave complexes, but it was noted even in spikes without apparent slow waves. In one patient, a gradual increase of power above 200 Hz preceded spikes. Conclusions FDR-controlled t-spectra clearly detected the spike-related changes of HFOs that were unclear in standard power spectra. Significance We developed a promising tool to study the HFOs that may be closely linked to the pathophysiology of epileptogenesis. PMID:19394892
User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge
Koltun, G.F.; Gray, John R.; McElhone, T.J.
1994-01-01
Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from 1. digitized suspended-sediment-concentration traces, 2. linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and 3. nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.
Development of the TeamOBS-PPH - targeting clinical performance in postpartum hemorrhage.
Brogaard, Lise; Hvidman, Lone; Hinshaw, Kim; Kierkegaard, Ole; Manser, Tanja; Musaeus, Peter; Arafeh, Julie; Daniels, Kay I; Judy, Amy E; Uldbjerg, Niels
2018-06-01
This study aimed to develop a valid and reliable TeamOBS-PPH tool for assessing clinical performance in the management of postpartum hemorrhage (PPH). The tool was evaluated using video-recordings of teams managing PPH in both real-life and simulated settings. A Delphi panel consisting of 12 obstetricians from the UK, Norway, Sweden, Iceland, and Denmark achieved consensus on (i) the elements to include in the assessment tool, (ii) the weighting of each element, and (iii) the final tool. The validity and reliability were evaluated according to Cook and Beckman. (Level 1) Four raters scored four video-recordings of in situ simulations of PPH. (Level 2) Two raters scored 85 video-recordings of real-life teams managing patients with PPH ≥1000 mL in two Danish hospitals. (Level 3) Two raters scored 15 video-recordings of in situ simulations of PPH from a US hospital. The tool was designed with scores from 0 to 100. (Level 1) Teams of novices had a median score of 54 (95% CI 48-60), whereas experienced teams had a median score of 75 (95% CI 71-79; p < 0.001). (Level 2) The intra-rater [intra-class correlation (ICC) = 0.96] and inter-rater (ICC = 0.83) agreements for real-life PPH were strong. The tool was applicable in all cases: atony, retained placenta, and lacerations. (Level 3) The tool was easily adapted to in situ simulation settings in the USA (ICC = 0.86). The TeamOBS-PPH tool appears to be valid and reliable for assessing clinical performance in real-life and simulated settings. The tool will be shared as the free TeamOBS App. © 2018 Nordic Federation of Societies of Obstetrics and Gynecology.
VStar: Variable star data visualization and analysis tool
NASA Astrophysics Data System (ADS)
VStar Team
2014-07-01
VStar is a multi-platform, easy-to-use variable star data visualization and analysis tool. Data for a star can be read from the AAVSO (American Association of Variable Star Observers) database or from CSV and TSV files. VStar displays light curves and phase plots, can produce a mean curve, and analyzes time-frequency with Weighted Wavelet Z-Transform. It offers tools for period analysis, filtering, and other functions.
Digital tool for detecting diabetic retinopathy in retinography image using gabor transform
NASA Astrophysics Data System (ADS)
Morales, Y.; Nuñez, R.; Suarez, J.; Torres, C.
2017-01-01
Diabetic retinopathy is a chronic disease and is the leading cause of blindness in the population. The fundamental problem is that diabetic retinopathy is usually asymptomatic in its early stage and, in advanced stages, it becomes incurable, hence the importance of early detection. To detect diabetic retinopathy, the ophthalmologist examines the fundus by ophthalmoscopy, after sends the patient to get a Retinography. Sometimes, these retinography are not of good quality. This paper show the implementation of a digital tool that facilitates to ophthalmologist provide better patient diagnosis suffering from diabetic retinopathy, informing them that type of retinopathy has and to what degree of severity is find . This tool develops an algorithm in Matlab based on Gabor transform and in the application of digital filters to provide better and higher quality of retinography. The performance of algorithm has been compared with conventional methods obtaining resulting filtered images with better contrast and higher.
Use of instruments to evaluate leadership in nursing and health services.
Carrara, Gisleangela Lima Rodrigues; Bernardes, Andrea; Balsanelli, Alexandre Pazetto; Camelo, Silvia Helena Henriques; Gabriel, Carmen Silvia; Zanetti, Ariane Cristina Barboza
2018-03-12
To identify the available scientific evidence about the use of instruments for the evaluation of leadership in health and nursing services and verify the use of leadership styles/models/theories in the construction of these tools. Integrative literature review of indexed studies in the LILACS, PUBMED, CINAHL and EMBASE databases from 2006 to 2016. Thirty-eight articles were analyzed, exhibiting 19 leadership evaluation tools; the most used were the Multifactor Leadership Questionnaire, the Global Transformational Leadership Scale, the Leadership Practices Inventory, the Servant Leadership Questionnaire, the Servant Leadership Survey and the Authentic Leadership Questionnaire. The literature search allowed to identify the main theories/styles/models of contemporary leadership and analyze their use in the design of leadership evaluation tools, with the transformational, situational, servant and authentic leadership categories standing out as the most prominent. To a lesser extent, the quantum, charismatic and clinical leadership types were evidenced.
Tool use and the effect of action on the imagination.
Schwartz, D L; Holton, D L
2000-11-01
Three studies examined the claim that hand movements can facilitate imagery for object rotations but that this facilitation depends on people's model of the situation. In Experiment 1, physically turning a block without vision reduced mental rotation times compared with imagining the same rotation without bodily movement. In Experiment 2, pulling a string from a spool facilitated participants' mental rotation of an object sitting on the spool. In Experiment 3, depending on participants' model of the spool, the exact same pulling movement facilitated or interfered with the exact same imagery transformation. Results of Experiments 2 and 3 indicate that the geometric characteristics of an action do not specify the trajectory of an imagery transformation. Instead, they point to people's ability to model the tools that mediate between motor activity and its environmental consequences and to transfer tool knowledge to a new situation.
Biron, P; Metzger, M H; Pezet, C; Sebban, C; Barthuet, E; Durand, T
2014-01-01
A full-text search tool was introduced into the daily practice of Léon Bérard Center (France), a health care facility devoted to treatment of cancer. This tool was integrated into the hospital information system by the IT department having been granted full autonomy to improve the system. To describe the development and various uses of a tool for full-text search of computerized patient records. The technology is based on Solr, an open-source search engine. It is a web-based application that processes HTTP requests and returns HTTP responses. A data processing pipeline that retrieves data from different repositories, normalizes, cleans and publishes it to Solr, was integrated in the information system of the Leon Bérard center. The IT department developed also user interfaces to allow users to access the search engine within the computerized medical record of the patient. From January to May 2013, 500 queries were launched per month by an average of 140 different users. Several usages of the tool were described, as follows: medical management of patients, medical research, and improving the traceability of medical care in medical records. The sensitivity of the tool for detecting the medical records of patients diagnosed with both breast cancer and diabetes was 83.0%, and its positive predictive value was 48.7% (gold standard: manual screening by a clinical research assistant). The project demonstrates that the introduction of full-text-search tools allowed practitioners to use unstructured medical information for various purposes.
Presse, Nancy; Shatenstein, Bryna; Kergoat, Marie-Jeanne; Ferland, Guylaine
2009-07-01
The study objective was to validate a semi-quantitative food frequency questionnaire (FFQ) specifically designed to measure dietary vitamin K intake. A 50-item FFQ was interviewer-administered and compared with data previously obtained from 5-day food records. Thirty-nine community-dwelling healthy men and women aged 65 to 85 years were recruited from the Montréal metropolitan area. Absolute and relative agreements between methods were assessed. Vitamin K intake measured by the vitamin K FFQ (mean+/-standard deviation; 222+/-186 microg/day) was significantly higher than that obtained by food records (135+/-153 microg/day; P<0.001). Bland-Altman analysis on log(10)-transformed data indicated that vitamin K intake from vitamin K FFQ was 2.26 times (95% confidence interval: 1.90 to 2.67) higher than food records, limits of agreement ranging from 0.80 to 6.35. However, correlation between methods was strong and highly significant (r=0.83; P<0.001). Cross-classification also showed that 72% of participants were correctly classified into thirds and only 8% were grossly miscategorized. Weighted kappa value (kappa=0.60) also indicated a good relative agreement. In light of these results, the vitamin K FFQ is a valid tool for ranking individuals according to their vitamin K intake. The poor absolute agreement likely results from the inability for food records to adequately measure the usual intake of episodically consumed foods, particularly those high in vitamin K. The vitamin K FFQ will be useful in large-scale, population-based research on vitamin K and disease as well as in clinical practice, especially that focusing on anticoagulant therapy.
Huntley, Melanie A; Larson, Jessica L; Chaivorapol, Christina; Becker, Gabriel; Lawrence, Michael; Hackney, Jason A; Kaminker, Joshua S
2013-12-15
It is common for computational analyses to generate large amounts of complex data that are difficult to process and share with collaborators. Standard methods are needed to transform such data into a more useful and intuitive format. We present ReportingTools, a Bioconductor package, that automatically recognizes and transforms the output of many common Bioconductor packages into rich, interactive, HTML-based reports. Reports are not generic, but have been individually designed to reflect content specific to the result type detected. Tabular output included in reports is sortable, filterable and searchable and contains context-relevant hyperlinks to external databases. Additionally, in-line graphics have been developed for specific analysis types and are embedded by default within table rows, providing a useful visual summary of underlying raw data. ReportingTools is highly flexible and reports can be easily customized for specific applications using the well-defined API. The ReportingTools package is implemented in R and available from Bioconductor (version ≥ 2.11) at the URL: http://bioconductor.org/packages/release/bioc/html/ReportingTools.html. Installation instructions and usage documentation can also be found at the above URL.
Information Technology: A Tool to Cut Health Care Costs
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Maly, K. J.; Overstreet, C. M.; Foudriat, E. C.
1996-01-01
Old Dominion University embarked on a project to see how current computer technology could be applied to reduce the cost and or to improve the efficiency of health care services. We designed and built a prototype for an integrated medical record system (MRS). The MRS is written in Tool control language/Tool kit (Tcl/Tk). While the initial version of the prototype had patient information hard coded into the system, later versions used an INGRES database for storing patient information. Currently, we have proposed an object-oriented model for implementing MRS. These projects involve developing information systems for physicians and medical researchers to enhance their ability for improved treatment at reduced costs. The move to computerized patient records is well underway, several standards exist for laboratory records, and several groups are working on standards for other portions of the patient record.
NASA Astrophysics Data System (ADS)
Jurči, Peter; Dománková, Mária; Ptačinová, Jana; Pašák, Matej; Kusý, Martin; Priknerová, Petra
2018-03-01
The microstructure and tempering response of Cr-V ledeburitic steel Vanadis 6 subjected to sub-zero treatment at - 196 °C for 4 h have been examined with reference to the same steel after conventional heat treatment. The obtained experimental results infer that sub-zero treatment significantly reduces the retained austenite amount, makes an overall refinement of microstructure, and induces a significant increase in the number and population density of small globular carbides with a size 100-500 nm. At low tempering temperatures, the transient M3C-carbides precipitated, whereas their number was enhanced by sub-zero treatment. The presence of chromium-based M7C3 precipitates was evidenced after tempering at the temperature of normal secondary hardening; this phase was detected along with the M3C. Tempering above 470 °C converts almost all the retained austenite in conventionally quenched specimens while the transformation of retained austenite is rather accelerated in sub-zero treated material. As a result of tempering, a decrease in the population density of small globular carbides was recorded; however, the number of these particles retained much higher in sub-zero treated steel. Elevated hardness of sub-zero treated steel can be referred to more completed martensitic transformation and enhanced number of small globular carbides; this state is retained up to a tempering temperature of around 500 °C in certain extent. Correspondingly, lower as-tempered hardness of sub-zero treated steel tempered above 500 °C is referred to much lower contribution of the transformation of retained austenite, and to an expectedly lower amount of precipitated alloy carbides.
Rapid landscape transformation in South Island, New Zealand, following initial Polynesian settlement
McWethy, David B.; Whitlock, Cathy; Wilmshurst, Janet M.; McGlone, Matt S.; Fromont, Mairie; Li, Xun; Dieffenbacher-Krall, Ann; Hobbs, William O.; Fritz, Sherilyn C.; Cook, Edward R.
2010-01-01
Humans have altered natural patterns of fire for millennia, but the impact of human-set fires is thought to have been slight in wet closed-canopy forests. In the South Island of New Zealand, Polynesians (Māori), who arrived 700–800 calibrated years (cal y) ago, and then Europeans, who settled ∼150 cal y ago, used fire as a tool for forest clearance, but the structure and environmental consequences of these fires are poorly understood. High-resolution charcoal and pollen records from 16 lakes were analyzed to reconstruct the fire and vegetation history of the last 1,000 y. Diatom, chironomid, and element concentration data were examined to identify disturbance-related limnobiotic and biogeochemical changes within burned watersheds. At most sites, several high-severity fire events occurred within the first two centuries of Māori arrival and were often accompanied by a transformation in vegetation, slope stability, and lake chemistry. Proxies of past climate suggest that human activity alone, rather than unusually dry or warm conditions, was responsible for this increased fire activity. The transformation of scrub to grassland by Europeans in the mid-19th century triggered further, sometimes severe, watershed change, through additional fires, erosion, and the introduction of nonnative plant species. Alteration of natural disturbance regimes had lasting impacts, primarily because native forests had little or no previous history of fire and little resilience to the severity of burning. Anthropogenic burning in New Zealand highlights the vulnerability of closed-canopy forests to novel disturbance regimes and suggests that similar settings may be less resilient to climate-induced changes in the future. PMID:21149690
NASA Astrophysics Data System (ADS)
Wang, F.; Liang, Q.
2016-12-01
Marine sediment contains large amount of methane, estimated approximately 500-2500 gigatonnes of dissolved and hydrated methane carbon stored therein, mainly in continental margins. In localized specific areas named cold seeps, hydrocarbon (mainly methane) containing fluids rise to the seafloor, and support oases of ecosystem composed of various microorganisms and faunal assemblages. South China Sea (SCS) is surrounded by passive continental margins in the west and north and convergent margins in the south and east. Thick organic-rich sediments have accumulated in the SCS since the late Mesozoic, which are continuing sources to form gas hydrates in the sediments of SCS. Here, Microbial ecosystems, particularly those involved in methane transformations were investigated in the cold seep areas (Qiongdongnan, Shenhu, and Dongsha) in the northern continental shelf of SCS. Multiple interdisciplinary analytic tools such as stable isotope probing, geochemical analysis, and molecular ecology, were applied for a comprehensive understanding of the microbe mediated methane transformation in this project. A variety of sediments cores have been collected, the geochemical profiles and the associated microbial distribution along the sediment cores were recorded. The major microbial groups involved in the methane transformation in these sediment cores were revealed, known methane producing and oxidizing archaea including Methanosarcinales, anaerobic methane oxidizing groups ANME-1, ANME-2 and their niche preference in the SCS sediments were found. In-depth comparative analysis revealed the presence of SCS-specific archaeal subtypes which probably reflected the evolution and adaptation of these methane metabolizing microbes to the SCS environmental conditions. Our work represents the first comprehensive analysis of the methane metabolizing microbial communities in the cold seep areas along the northern continental shelf of South China Sea, would provide new insight into the mechanisms of methane biotransformation.
Binda, Davide; Cavaletti, Guido; Cornblath, David R; Merkies, Ingemar S J
2015-09-01
Composite scales such as the Total Neuropathy Score clinical version (TNSc(©) ) have been widely used to measure neurological impairment in a standardized manner but they have been criticized due to their ordinal setting having no fixed unit. This study aims to improve impairment assessment in patients with chemotherapy-induced peripheral neuropathy (CIPN) by subjecting TNSc(©) records to Rasch analyses. In particular, we wanted to investigate the influence of factors affecting the use of the TNSc(©) in clinical practice. TNSc(©) has 7 domains (sensory, motor, autonomic, pin-prick, vibration, strength, and deep tendon reflexes [DTR]) each being scored 0-4. Data obtained in 281 patients with stable CIPN were subjected to Rasch analyses to determine the fit to the model. The TNSc(©) did not meet Rasch model's expectations primarily because of misfit statistics in autonomic and DTR domains. Removing these two, acceptable model fit and uni-dimensionality were obtained. However, disordered thresholds (vibration and strength) and item bias (mainly cultural) were still seen, but these findings were kept to balance the assessment range of the Rasch-Transformed TNSc(©) (RT-TNSc(©) ). Acceptable reliability findings were also obtained. A 5-domains RT-TNSc(©) may be a more proper assessment tool in patients with CIPN. Future studies are needed to examine its responsive properties. © 2015 Peripheral Nerve Society.
Basalekou, M.; Pappas, C.; Kotseridis, Y.; Tarantilis, P. A.; Kontaxakis, E.
2017-01-01
Color, phenolic content, and chemical age values of red wines made from Cretan grape varieties (Kotsifali, Mandilari) were evaluated over nine months of maturation in different containers for two vintages. The wines differed greatly on their anthocyanin profiles. Mid-IR spectra were also recorded with the use of a Fourier Transform Infrared Spectrophotometer in ZnSe disk mode. Analysis of Variance was used to explore the parameter's dependency on time. Determination models were developed for the chemical age indexes using Partial Least Squares (PLS) (TQ Analyst software) considering the spectral region 1830–1500 cm−1. The correlation coefficients (r) for chemical age index i were 0.86 for Kotsifali (Root Mean Square Error of Calibration (RMSEC) = 0.067, Root Mean Square Error of Prediction (RMSEP) = 0,115, and Root Mean Square Error of Validation (RMSECV) = 0.164) and 0.90 for Mandilari (RMSEC = 0.050, RMSEP = 0.040, and RMSECV = 0.089). For chemical age index ii the correlation coefficients (r) were 0.86 and 0.97 for Kotsifali (RMSEC 0.044, RMSEP = 0.087, and RMSECV = 0.214) and Mandilari (RMSEC = 0.024, RMSEP = 0.033, and RMSECV = 0.078), respectively. The proposed method is simpler, less time consuming, and more economical and does not require chemical reagents. PMID:29225994
Nesakumar, Noel; Baskar, Chanthini; Kesavan, Srinivasan; Rayappan, John Bosco Balaguru; Alwarappan, Subbiah
2018-05-22
The moisture content of beetroot varies during long-term cold storage. In this work, we propose a strategy to identify the moisture content and age of beetroot using principal component analysis coupled Fourier transform infrared spectroscopy (FTIR). Frequent FTIR measurements were recorded directly from the beetroot sample surface over a period of 34 days for analysing its moisture content employing attenuated total reflectance in the spectral ranges of 2614-4000 and 1465-1853 cm -1 with a spectral resolution of 8 cm -1 . In order to estimate the transmittance peak height (T p ) and area under the transmittance curve [Formula: see text] over the spectral ranges of 2614-4000 and 1465-1853 cm -1 , Gaussian curve fitting algorithm was performed on FTIR data. Principal component and nonlinear regression analyses were utilized for FTIR data analysis. Score plot over the ranges of 2614-4000 and 1465-1853 cm -1 allowed beetroot quality discrimination. Beetroot quality predictive models were developed by employing biphasic dose response function. Validation experiment results confirmed that the accuracy of the beetroot quality predictive model reached 97.5%. This research work proves that FTIR spectroscopy in combination with principal component analysis and beetroot quality predictive models could serve as an effective tool for discriminating moisture content in fresh, half and completely spoiled stages of beetroot samples and for providing status alerts.
Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools
NASA Technical Reports Server (NTRS)
Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory
2013-01-01
Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.
Computation of a spectrum from a single-beam fourier-transform infrared interferogram.
Ben-David, Avishai; Ifarraguerri, Agustin
2002-02-20
A new high-accuracy method has been developed to transform asymmetric single-sided interferograms into spectra. We used a fraction (short, double-sided) of the recorded interferogram and applied an iterative correction to the complete recorded interferogram for the linear part of the phase induced by the various optical elements. Iterative phase correction enhanced the symmetry in the recorded interferogram. We constructed a symmetric double-sided interferogram and followed the Mertz procedure [Infrared Phys. 7,17 (1967)] but with symmetric apodization windows and with a nonlinear phase correction deduced from this double-sided interferogram. In comparing the solution spectrum with the source spectrum we applied the Rayleigh resolution criterion with a Gaussian instrument line shape. The accuracy of the solution is excellent, ranging from better than 0.1% for a blackbody spectrum to a few percent for a complicated atmospheric radiance spectrum.
ERIC Educational Resources Information Center
Gengler, Amanda Marie
2010-01-01
Travel is a powerful pedagogical tool for critical and feminist teachers, as it leads to learning that is uniquely interactive, collective, and transformative. It places students in immersive contact with real-world realities, which the teachers strive to help them see, come to terms with, and connect to the positionality of their own lived…
ERIC Educational Resources Information Center
Karakas, Ali
2017-01-01
English has experienced grave transformations recently in terms of socio-demographic and geographical characteristics. While such transformations have resulted in diverse types of English uses and various English users, the existing ELT materials still fail to represent the global varieties and dynamic uses and users of English. Moving from a…
Using the Enneagram for Client Insight and Transformation: A Type Eight Illustration
ERIC Educational Resources Information Center
Tapp, Karen; Engebretson, Ken
2010-01-01
The purpose of this article is to demonstrate how mental health practitioners can use the Enneagram's system of human personality development as a source of insight and a tool for personality transformation. The article provides an introduction to the Enneagram and its orientation to personality types and defines a linear process of how to use the…
ERIC Educational Resources Information Center
Leugers, Rebecca; Whalen, Tina; Couch, Sarah; King, Elizabeth; Prendeville, JoAnne
2009-01-01
In the College of Allied Health Sciences at the University of Cincinnati, John Kotter's eight stage model of organizational change was utilized as a template by faculty while focusing efforts on facilitating community-engaged scholarship. The Kotter model proved to be a beneficial tool when developing a framework for transforming service-learning…
ERIC Educational Resources Information Center
Diversi, Marcelo
2007-01-01
This is an essay about the transformative power of interpretive epistemologies for those who come into the social sciences seeking meaning in the messiness of human experience. It is about how an epistemological exercise gave the author symbolic tools to become sentient of his father's subjective oppressive force in his life--with the man living…
ERIC Educational Resources Information Center
Noel-Levitz, Inc, 2012
2012-01-01
The last decade marked a dramatic change in the college search experience as students flocked to the Internet as their primary tool for researching colleges. Institutions had to transform their recruitment efforts to keep up with the online demands and expectations of prospective students. The proliferation of smartphones is transforming the…
3D kinematics using dual quaternions: theory and applications in neuroscience
Leclercq, Guillaume; Lefèvre, Philippe; Blohm, Gunnar
2013-01-01
In behavioral neuroscience, many experiments are developed in 1 or 2 spatial dimensions, but when scientists tackle problems in 3-dimensions (3D), they often face problems or new challenges. Results obtained for lower dimensions are not always extendable in 3D. In motor planning of eye, gaze or arm movements, or sensorimotor transformation problems, the 3D kinematics of external (stimuli) or internal (body parts) must often be considered: how to describe the 3D position and orientation of these objects and link them together? We describe how dual quaternions provide a convenient way to describe the 3D kinematics for position only (point transformation) or for combined position and orientation (through line transformation), easily modeling rotations, translations or screw motions or combinations of these. We also derive expressions for the velocities of points and lines as well as the transformation velocities. Then, we apply these tools to a motor planning task for manual tracking and to the modeling of forward and inverse kinematics of a seven-dof three-link arm to show the interest of dual quaternions as a tool to build models for these kinds of applications. PMID:23443667
Identifying individual sperm whales acoustically using self-organizing maps
NASA Astrophysics Data System (ADS)
Ioup, Juliette W.; Ioup, George E.
2005-09-01
The Littoral Acoustic Demonstration Center (LADC) is a consortium at Stennis Space Center comprising the University of New Orleans, the University of Southern Mississippi, the Naval Research Laboratory, and the University of Louisiana at Lafayette. LADC deployed three Environmental Acoustic Recording System (EARS) buoys in the northern Gulf of Mexico during the summer of 2001 to study ambient noise and marine mammals. Each LADC EARS was an autonomous, self-recording buoy capable of 36 days of continuous recording of a single channel at an 11.7-kHz sampling rate (bandwidth to 5859 Hz). The hydrophone selected for this analysis was approximately 50 m from the bottom in a water depth of 800 m on the continental slope off the Mississippi River delta. This paper contains recent analysis results for sperm whale codas recorded during a 3-min period. Results are presented for the identification of individual sperm whales from their codas, using the acoustic properties of the clicks within each coda. The recorded time series, the Fourier transform magnitude, and the wavelet transform coefficients are each used separately with a self-organizing map procedure for 43 codas. All show the codas as coming from four or five individual whales. [Research supported by ONR.
Hilbert-Huang transform analysis of dynamic and earthquake motion recordings
Zhang, R.R.; Ma, S.; Safak, E.; Hartzell, S.
2003-01-01
This study examines the rationale of Hilbert-Huang transform (HHT) for analyzing dynamic and earthquake motion recordings in studies of seismology and engineering. In particular, this paper first provides the fundamentals of the HHT method, which consist of the empirical mode decomposition (EMD) and the Hilbert spectral analysis. It then uses the HHT to analyze recordings of hypothetical and real wave motion, the results of which are compared with the results obtained by the Fourier data processing technique. The analysis of the two recordings indicates that the HHT method is able to extract some motion characteristics useful in studies of seismology and engineering, which might not be exposed effectively and efficiently by Fourier data processing technique. Specifically, the study indicates that the decomposed components in EMD of HHT, namely, the intrinsic mode function (IMF) components, contain observable, physical information inherent to the original data. It also shows that the grouped IMF components, namely, the EMD-based low- and high-frequency components, can faithfully capture low-frequency pulse-like as well as high-frequency wave signals. Finally, the study illustrates that the HHT-based Hilbert spectra are able to reveal the temporal-frequency energy distribution for motion recordings precisely and clearly.
Transforming Mobile Platform with KI-SIM Card into an Open Mobile Identity Tool
NASA Astrophysics Data System (ADS)
Hyppönen, Konstantin; Hassinen, Marko; Trichina, Elena
Recent introduction of Near Field Communication (NFC) in mobile phones has stimulated the development of new proximity payment and identification services. We present an architecture that facilitates the use of the mobile phone as a personalised electronic identity tool. The tool can work as a replacement for numerous ID cards and licenses. Design for privacy principles have been applied, such as minimisation of data collection and informed consent of the user. We describe an implementation of a lightweight version of the of the mobile identity tool using currently available handset technology and off-the-shelf development tools.
Social Software: Participants' Experience Using Social Networking for Learning
ERIC Educational Resources Information Center
Batchelder, Cecil W.
2010-01-01
Social networking tools used in learning provides instructional design with tools for transformative change in education. This study focused on defining the meanings and essences of social networking through the lived common experiences of 7 college students. The problem of the study was a lack of learner voice in understanding the value of social…
Transforming School Communities: Creating Dialogue Using Web 2.0 Tools
ERIC Educational Resources Information Center
Soule, Helen
2008-01-01
Web 2.0 tools should be an important part of every district's communication strategy, creating environments for collaboration in ways never possible before. Most of them are free, inexpensive, easy to use, and require little set up. When combined with basic communication principles and careful planning, they can expand a district's reach, increase…
Redesigning Instruction through Web-based Course Authoring Tools.
ERIC Educational Resources Information Center
Dabbagh, Nada H.; Schmitt, Jeff
1998-01-01
Examines the pedagogical implications of redesigning instruction for Web-based delivery through a case study of an undergraduate computer science course. Initially designed for a traditional learning environment, this course transformed to a Web-based course using WebCT, a Web-based course authoring tool. Discusses the specific features of WebCT.…
Appropriating Geometric Series as a Cultural Tool: A Study of Student Collaborative Learning
ERIC Educational Resources Information Center
Carlsen, Martin
2010-01-01
The aim of this article is to illustrate how students, through collaborative small-group problem solving, appropriate the concept of geometric series. Student appropriation of cultural tools is dependent on five sociocultural aspects: involvement in joint activity, shared focus of attention, shared meanings for utterances, transforming actions and…
A Face-to-Face Professional Development Model to Enhance Teaching of Online Research Strategies
ERIC Educational Resources Information Center
Terrazas-Arellanes, Fatima E.; Knox, Carolyn; Strycker, Lisa A.; Walden, Emily
2016-01-01
To help students navigate the digital environment, teachers not only need access to the right technology tools but they must also engage in pedagogically sound, high-quality professional development. For teachers, quality professional development can mean the difference between merely using technology tools and creating transformative change in…
Using Active Learning as Assessment in the Postsecondary Classroom.
ERIC Educational Resources Information Center
Bonwell, Charles C.
1997-01-01
Provides a conceptual framework for introducing active learning into the classroom as a tool for assessment. Notes that formative assessment can be a powerful tool for transforming a passive classroom into one filled with active participants. Discusses what is assessed, what are the criteria, who will assess, and how the object of assessment will…
Topology-Preserving Rigid Transformation of 2D Digital Images.
Ngo, Phuc; Passat, Nicolas; Kenmochi, Yukiko; Talbot, Hugues
2014-02-01
We provide conditions under which 2D digital images preserve their topological properties under rigid transformations. We consider the two most common digital topology models, namely dual adjacency and well-composedness. This paper leads to the proposal of optimal preprocessing strategies that ensure the topological invariance of images under arbitrary rigid transformations. These results and methods are proved to be valid for various kinds of images (binary, gray-level, label), thus providing generic and efficient tools, which can be used in particular in the context of image registration and warping.
Serror, Pascale; Sasaki, Takashi; Ehrlich, S. Dusko; Maguin, Emmanuelle
2002-01-01
We describe, for the first time, a detailed electroporation procedure for Lactobacillus delbrueckii. Three L. delbrueckii strains were successfully transformed. Under optimal conditions, the transformation efficiency was 104 transformants per μg of DNA. Using this procedure, we identified several plasmids able to replicate in L. delbrueckii and integrated an integrative vector based on phage integrative elements into the L. delbrueckii subsp. bulgaricus chromosome. These vectors provide a good basis for developing molecular tools for L. delbrueckii and open the field of genetic studies in L. delbrueckii. PMID:11772607
Transforming home health nursing with telehealth technology.
Farrar, Francisca Cisneros
2015-06-01
Telehealth technology is an evidence-based delivery model tool that can be integrated into the plan of care for mental health patients. Telehealth technology empowers access to health care, can help decrease or prevent hospital readmissions, assist home health nurses provide shared decision making, and focuses on collaborative care. Telehealth and the recovery model have transformed the role of the home health nurse. Nurses need to be proactive and respond to rapidly emerging technologies that are transforming their role in home care. Copyright © 2015 Elsevier Inc. All rights reserved.
On Weak and Strong 2k- bent Boolean Functions
2016-01-01
U.S.A. Email: pstanica@nps.edu Abstract—In this paper we introduce a sequence of discrete Fourier transforms and define new versions of bent...denotes the complex conjugate of z. An important tool in our analysis is the discrete Fourier transform , known in Boolean functions literature, as Walsh...Hadamard, or Walsh–Hadamard transform , which is the func- tion Wf : Fn2 → C, defined by Wf (u) = 2− n 2 ∑ x∈Vn (−1)f(x)⊕u·x. Any f ∈ Bn can be
Groups: knowledge spreadsheets for symbolic biocomputing.
Travers, Michael; Paley, Suzanne M; Shrager, Jeff; Holland, Timothy A; Karp, Peter D
2013-01-01
Knowledge spreadsheets (KSs) are a visual tool for interactive data analysis and exploration. They differ from traditional spreadsheets in that rather than being oriented toward numeric data, they work with symbolic knowledge representation structures and provide operations that take into account the semantics of the application domain. 'Groups' is an implementation of KSs within the Pathway Tools system. Groups allows Pathway Tools users to define a group of objects (e.g. groups of genes or metabolites) from a Pathway/Genome Database. Groups can be transformed (e.g. by transforming a metabolite group to the group of pathways in which those metabolites are substrates); combined through set operations; analysed (e.g. through enrichment analysis); and visualized (e.g. by painting onto a metabolic map diagram). Users of the Pathway Tools-based BioCyc.org website have made extensive use of Groups, and an informal survey of Groups users suggests that Groups has achieved the goal of allowing biologists themselves to perform some data manipulations that previously would have required the assistance of a programmer. Database URL: BioCyc.org.
Contrasting the Use of Tools for Presentation and Critique: Some Cases from Architectural Education
ERIC Educational Resources Information Center
Lymer, Gustav; Ivarsson, Jonas; Lindwall, Oskar
2009-01-01
This study investigates video recordings of design reviews in architectural education, focusing on how presentations and discussions of designs are contingent on the specific tools employed. In the analyzed recordings, three different setups are utilized: traditional posters, digital slide-show technologies, and combinations of the two. This range…
White, David B.
1991-01-01
An electrical safety device for use in power tools that is designed to automatically discontinue operation of the power tool upon physical contact of the tool with a concealed conductive material. A step down transformer is used to supply the operating power for a disconnect relay and a reset relay. When physical contact is made between the power tool and the conductive material, an electrical circuit through the disconnect relay is completed and the operation of the power tool is automatically interrupted. Once the contact between the tool and conductive material is broken, the power tool can be quickly and easily reactivated by a reset push button activating the reset relay. A remote reset is provided for convenience and efficiency of operation.
Limitations of Dower's inverse transform for the study of atrial loops during atrial fibrillation.
Guillem, María S; Climent, Andreu M; Bollmann, Andreas; Husser, Daniela; Millet, José; Castells, Francisco
2009-08-01
Spatial characteristics of atrial fibrillatory waves have been extracted by using a vectorcardiogram (VCG) during atrial fibrillation (AF). However, the VCG is usually not recorded in clinical practice and atrial loops are derived from the 12-lead electrocardiogram (ECG). We evaluated the suitability of the reconstruction of orthogonal leads from the 12-lead ECG for fibrillatory waves in AF. We used the Physikalisch-Technische Bundesanstalt diagnostic ECG database, which contains 15 simultaneously recorded signals (12-lead ECG and three Frank orthogonal leads) of 13 patients during AF. Frank leads were derived from the 12-lead ECG by using Dower's inverse transform. Derived leads were then compared to true Frank leads in terms of the relative error achieved. We calculated the orientation of AF loops of both recorded orthogonal leads and derived leads and measured the difference in estimated orientation. Also, we investigated the relationship of errors in derivation with fibrillatory wave amplitude, frequency, wave residuum, and fit to a plane of the AF loops. Errors in derivation of AF loops were 68 +/- 31% and errors in the estimation of orientation were 35.85 +/- 20.43 degrees . We did not find any correlation among these errors and amplitude, frequency, or other parameters. In conclusion, Dower's inverse transform should not be used for the derivation of orthogonal leads from the 12-lead ECG for the analysis of fibrillatory wave loops in AF. Spatial parameters obtained after this derivation may differ from those obtained from recorded orthogonal leads.
Weyda, István; Yang, Lei; Vang, Jesper; Ahring, Birgitte K; Lübeck, Mette; Lübeck, Peter S
2017-04-01
In recent years, versatile genetic tools have been developed and applied to a number of filamentous fungi of industrial importance. However, the existing techniques have limitations when it comes to achieve the desired genetic modifications, especially for efficient gene targeting. In this study, we used Aspergillus carbonarius as a host strain due to its potential as a cell factory, and compared three gene targeting techniques by disrupting the ayg1 gene involved in the biosynthesis of conidial pigment in A. carbonarius. The absence of the ayg1 gene leads to phenotypic change in conidia color, which facilitated the analysis on the gene targeting frequency. The examined transformation techniques included Agrobacterium-mediated transformation (AMT) and protoplast-mediated transformation (PMT). Furthermore, the PMT for the disruption of the ayg1 gene was carried out with bipartite gene targeting fragments and the recently adapted CRISPR-Cas9 system. All three techniques were successful in generating Δayg1 mutants, but showed different efficiencies. The most efficient method for gene targeting was AMT, but further it was shown to be dependent on the choice of Agrobacterium strain. However, there are different advantages and disadvantages of all three gene targeting methods which are discussed, in order to facilitate future approaches for fungal strain improvements. Copyright © 2017 Elsevier B.V. All rights reserved.
Unbeck, Maria; Schildmeijer, Kristina; Henriksson, Peter; Jürgensen, Urban; Muren, Olav; Nilsson, Lena; Pukk Härenstam, Karin
2013-04-15
There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost efficient and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the "Harvard Medical Practice Study" method and the "Global Trigger Tool" in detecting adverse events in adult orthopaedic inpatients. We performed a three-stage structured retrospective record review process in a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams comprised each of a registered nurse and two physicians were assigned, one to each method. All records were primarily reviewed by registered nurses. Records containing a potential adverse event were forwarded to physicians for review in stage 2. Physicians made an independent review regarding, for example, healthcare causation, preventability and severity. In the third review stage all adverse events that were found with the two methods together were compared and all discrepancies after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The "Harvard Medical Practice Study" method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the "Global Trigger Tool". Adverse events "causing harm without permanent disability" accounted for most of the observed difference. The overall positive predictive value for criteria and triggers using the "Harvard Medical Practice Study" method and the "Global Trigger Tool" was 40.3% and 30.4%, respectively. More adverse events were identified using the "Harvard Medical Practice Study" method than using the "Global Trigger Tool". Differences in review methodology, perception of less severe adverse events and context knowledge may explain the observed difference between two expert review teams in the detection of adverse events.
Mano, Hiroaki; Fujii, Tomomi; Sumikawa, Naomi; Hiwatashi, Yuji; Hasebe, Mitsuyasu
2014-01-01
The sensitive plant Mimosa pudica has long attracted the interest of researchers due to its spectacular leaf movements in response to touch or other external stimuli. Although various aspects of this seismonastic movement have been elucidated by histological, physiological, biochemical, and behavioral approaches, the lack of reverse genetic tools has hampered the investigation of molecular mechanisms involved in these processes. To overcome this obstacle, we developed an efficient genetic transformation method for M. pudica mediated by Agrobacterium tumefaciens (Agrobacterium). We found that the cotyledonary node explant is suitable for Agrobacterium-mediated transformation because of its high frequency of shoot formation, which was most efficiently induced on medium containing 0.5 µg/ml of a synthetic cytokinin, 6-benzylaminopurine (BAP). Transformation efficiency of cotyledonary node cells was improved from almost 0 to 30.8 positive signals arising from the intron-sGFP reporter gene by using Agrobacterium carrying a super-binary vector pSB111 and stabilizing the pH of the co-cultivation medium with 2-(N-morpholino)ethanesulfonic acid (MES) buffer. Furthermore, treatment of the explants with the detergent Silwet L-77 prior to co-cultivation led to a two-fold increase in the number of transformed shoot buds. Rooting of the regenerated shoots was efficiently induced by cultivation on irrigated vermiculite. The entire procedure for generating transgenic plants achieved a transformation frequency of 18.8%, which is comparable to frequencies obtained for other recalcitrant legumes, such as soybean (Glycine max) and pea (Pisum sativum). The transgene was stably integrated into the host genome and was inherited across generations, without affecting the seismonastic or nyctinastic movements of the plants. This transformation method thus provides an effective genetic tool for studying genes involved in M. pudica movements. PMID:24533121
Mano, Hiroaki; Fujii, Tomomi; Sumikawa, Naomi; Hiwatashi, Yuji; Hasebe, Mitsuyasu
2014-01-01
The sensitive plant Mimosa pudica has long attracted the interest of researchers due to its spectacular leaf movements in response to touch or other external stimuli. Although various aspects of this seismonastic movement have been elucidated by histological, physiological, biochemical, and behavioral approaches, the lack of reverse genetic tools has hampered the investigation of molecular mechanisms involved in these processes. To overcome this obstacle, we developed an efficient genetic transformation method for M. pudica mediated by Agrobacterium tumefaciens (Agrobacterium). We found that the cotyledonary node explant is suitable for Agrobacterium-mediated transformation because of its high frequency of shoot formation, which was most efficiently induced on medium containing 0.5 µg/ml of a synthetic cytokinin, 6-benzylaminopurine (BAP). Transformation efficiency of cotyledonary node cells was improved from almost 0 to 30.8 positive signals arising from the intron-sGFP reporter gene by using Agrobacterium carrying a super-binary vector pSB111 and stabilizing the pH of the co-cultivation medium with 2-(N-morpholino)ethanesulfonic acid (MES) buffer. Furthermore, treatment of the explants with the detergent Silwet L-77 prior to co-cultivation led to a two-fold increase in the number of transformed shoot buds. Rooting of the regenerated shoots was efficiently induced by cultivation on irrigated vermiculite. The entire procedure for generating transgenic plants achieved a transformation frequency of 18.8%, which is comparable to frequencies obtained for other recalcitrant legumes, such as soybean (Glycine max) and pea (Pisum sativum). The transgene was stably integrated into the host genome and was inherited across generations, without affecting the seismonastic or nyctinastic movements of the plants. This transformation method thus provides an effective genetic tool for studying genes involved in M. pudica movements.
IBM's Health Analytics and Clinical Decision Support.
Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W
2014-08-15
This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation.
Transgenic barley: a prospective tool for biotechnology and agriculture.
Mrízová, Katarína; Holasková, Edita; Öz, M Tufan; Jiskrová, Eva; Frébort, Ivo; Galuszka, Petr
2014-01-01
Barley (Hordeum vulgare L.) is one of the founder crops of agriculture, and today it is the fourth most important cereal grain worldwide. Barley is used as malt in brewing and distilling industry, as an additive for animal feed, and as a component of various food and bread for human consumption. Progress in stable genetic transformation of barley ensures a potential for improvement of its agronomic performance or use of barley in various biotechnological and industrial applications. Recently, barley grain has been successfully used in molecular farming as a promising bioreactor adapted for production of human therapeutic proteins or animal vaccines. In addition to development of reliable transformation technologies, an extensive amount of various barley genetic resources and tools such as sequence data, microarrays, genetic maps, and databases has been generated. Current status on barley transformation technologies including gene transfer techniques, targets, and progeny stabilization, recent trials for improvement of agricultural traits and performance of barley, especially in relation to increased biotic and abiotic stress tolerance, and potential use of barley grain as a protein production platform have been reviewed in this study. Overall, barley represents a promising tool for both agricultural and biotechnological transgenic approaches, and is considered an ancient but rediscovered crop as a model industrial platform for molecular farming. © 2013 Elsevier Inc. All rights reserved.
Method for extracting long-equivalent wavelength interferometric information
NASA Technical Reports Server (NTRS)
Hochberg, Eric B. (Inventor)
1991-01-01
A process for extracting long-equivalent wavelength interferometric information from a two-wavelength polychromatic or achromatic interferometer. The process comprises the steps of simultaneously recording a non-linear sum of two different frequency visible light interferograms on a high resolution film and then placing the developed film in an optical train for Fourier transformation, low pass spatial filtering and inverse transformation of the film image to produce low spatial frequency fringes corresponding to a long-equivalent wavelength interferogram. The recorded non-linear sum irradiance derived from the two-wavelength interferometer is obtained by controlling the exposure so that the average interferogram irradiance is set at either the noise level threshold or the saturation level threshold of the film.
Transformation of Didactic Intensions by Teachers: The Case of Geometrical Optics in Grade 8.
ERIC Educational Resources Information Center
Hirn, Colette; Viennot, Laurence
2000-01-01
Investigates the idea that teachers are not passive transmitters, and that some general trends can be found in the way they transform proposed strategies. Presents the case of elementary optics in grade 8 in France in which four sets of data--interviews before teaching, logbooks, assessment tasks, and video-recorded class observations--lead to…
Marchev, Andrey; Yordanova, Zhenya; Alipieva, Kalina; Zahmanov, Georgi; Rusinova-Videva, Snezhana; Kapchina-Toteva, Veneta; Simova, Svetlana; Popova, Milena; Georgiev, Milen I
2016-09-01
To develop a protocol to transform Verbascum eriophorum and to study the metabolic differences between mother plants and hairy root culture by applying NMR and processing the datasets with chemometric tools. Verbascum eriophorum is a rare species with restricted distribution, which is poorly studied. Agrobacterium rhizogenes-mediated genetic transformation of V. eriophorum and hairy root culture induction are reported for the first time. To determine metabolic alterations, V. eriophorum mother plants and relevant hairy root culture were subjected to comprehensive metabolomic analyses, using NMR (1D and 2D). Metabolomics data, processed using chemometric tools (and principal component analysis in particular) allowed exploration of V. eriophorum metabolome and have enabled identification of verbascoside (by means of 2D-TOCSY NMR) as the most abundant compound in hairy root culture. Metabolomics data contribute to the elucidation of metabolic alterations after T-DNA transfer to the host V. eriophorum genome and the development of hairy root culture for sustainable bioproduction of high value verbascoside.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a: system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the ciasses of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Phillips, Diane; Fawns, Rod; Hayes, Barbara
2002-12-01
A transformational model of professional identity formation, anchored and globalized in workplace conversations, is advanced. Whilst the need to theorize the aims and methods of clinical education has been served by the techno-rational platform of 'reflective practice', this platform does not provide an adequate psychological tool to explore the dynamics of social episodes in professional learning and this led us to positioning theory. Positioning theory is one such appropriate tool in which individuals metaphorically locate themselves within discursive action in everyday conversations to do with personal positioning, institutional practices and societal rhetoric. This paper develops the case for researching social episodes in clinical education through professional conversations where midwifery students, in practice settings, are encouraged to account for their moment-by-moment interactions with their preceptors/midwives and university mentors. It is our belief that the reflection elaborated by positioning theory should be considered as the new epistemology for professional education where professional conversations are key to transformative learning processes for persons and institutions.
Transformation of Epichloë typhina by electroporation of conidia
2011-01-01
Background Choke, caused by the endophytic fungus Epichloë typhina, is an important disease affecting orchardgrass (Dactylis glomerata L.) seed production in the Willamette Valley. Little is known concerning the conditions necessary for successful infection of orchardgrass by E. typhina. Detection of E. typhina in plants early in the disease cycle can be difficult due to the sparse distribution of hyphae in the plant. Therefore, a sensitive method to detect fungal infection in plants would provide an invaluable tool for elucidating the conditions for establishment of infection in orchardgrass. Utilization of a marker gene, such as the green fluorescent protein (GFP), transformed into Epichloë will facilitate characterization of the initial stages of infection and establishment of the fungus in plants. Findings We have developed a rapid, efficient, and reproducible transformation method using electroporation of germinating Epichloë conidia isolated from infected plants. Conclusions The GFP labelled E. typhina provides a valuable molecular tool to researchers studying conditions and mechanisms involved in the establishment of choke disease in orchardgrass. PMID:21375770
An Approach for Calculating Land Valuation by Using Inspire Data Models
NASA Astrophysics Data System (ADS)
Aydinoglu, A. C.; Bovkir, R.
2017-11-01
Land valuation is a highly important concept for societies and governments have always emphasis on the process especially for taxation, expropriation, market capitalization and economic activity purposes. To success an interoperable and standardised land valuation, INSPIRE data models can be very practical and effective. If data used in land valuation process produced in compliance with INSPIRE specifications, a reliable and effective land valuation process can be performed. In this study, possibility of the performing land valuation process with using the INSPIRE data models was analysed and with the help of Geographic Information Systems (GIS) a case study in Pendik was implemented. For this purpose, firstly data analysis and gathering was performed. After, different data structures were transformed according to the INSPIRE data model requirements. For each data set necessary ETL (Extract-Transform-Load) tools were produced and all data transformed according to the target data requirements. With the availability and practicability of spatial analysis tools of GIS software, land valuation calculations were performed for study area.
Better informed in clinical practice - a brief overview of dental informatics.
Reynolds, P A; Harper, J; Dunne, S
2008-03-22
Uptake of dental informatics has been hampered by technical and user issues. Innovative systems have been developed, but usability issues have affected many. Advances in technology and artificial intelligence are now producing clinically useful systems, although issues still remain with adapting computer interfaces to the dental practice working environment. A dental electronic health record has become a priority in many countries, including the UK. However, experience shows that any dental electronic health record (EHR) system cannot be subordinate to, or a subset of, a medical record. Such a future dental EHR is likely to incorporate integrated care pathways. Future best dental practice will increasingly depend on computer-based support tools, although disagreement remains about the effectiveness of current support tools. Over the longer term, future dental informatics tools will incorporate dynamic, online evidence-based medicine (EBM) tools, and promise more adaptive, patient-focused and efficient dental care with educational advantages in training.
A Semantic Basis for Proof Queries and Transformations
NASA Technical Reports Server (NTRS)
Aspinall, David; Denney, Ewen W.; Luth, Christoph
2013-01-01
We extend the query language PrQL, designed for inspecting machine representations of proofs, to also allow transformation of proofs. PrQL natively supports hiproofs which express proof structure using hierarchically nested labelled trees, which we claim is a natural way of taming the complexity of huge proofs. Query-driven transformations enable manipulation of this structure, in particular, to transform proofs produced by interactive theorem provers into forms that assist their understanding, or that could be consumed by other tools. In this paper we motivate and define basic transformation operations, using an abstract denotational semantics of hiproofs and queries. This extends our previous semantics for queries based on syntactic tree representations.We define update operations that add and remove sub-proofs, and manipulate the hierarchy to group and ungroup nodes. We show that
Bonnefoy, Nathalie; Fox, Thomas D
2007-01-01
Saccharomyces cerevisiae is currently the only species in which genetic transformation of mitochondria can be used to generate a wide variety of defined alterations in mitochondrial deoxyribonucleic acid (mtDNA). DNA sequences can be delivered into yeast mitochondria by microprojectile bombardment (biolistic transformation) and subsequently incorporated into mtDNA by the highly active homologous recombination machinery present in the organelle. Although transformation frequencies are relatively low, the availability of strong mitochondrial selectable markers for the yeast system, both natural and synthetic, makes the isolation of transformants routine. The strategies and procedures reviewed here allow the researcher to insert defined mutations into endogenous mitochondrial genes and to insert new genes into mtDNA. These methods provide powerful in vivo tools for the study of mitochondrial biology.
Validating data analysis of broadband laser ranging
NASA Astrophysics Data System (ADS)
Rhodes, M.; Catenacci, J.; Howard, M.; La Lone, B.; Kostinski, N.; Perry, D.; Bennett, C.; Patterson, J.
2018-03-01
Broadband laser ranging combines spectral interferometry and a dispersive Fourier transform to achieve high-repetition-rate measurements of the position of a moving surface. Telecommunications fiber is a convenient tool for generating the large linear dispersions required for a dispersive Fourier transform, but standard fiber also has higher-order dispersion that distorts the Fourier transform. Imperfections in the dispersive Fourier transform significantly complicate the ranging signal and must be dealt with to make high-precision measurements. We describe in detail an analysis process for interpreting ranging data when standard telecommunications fiber is used to perform an imperfect dispersive Fourier transform. This analysis process is experimentally validated over a 27-cm scan of static positions, showing an accuracy of 50 μm and a root-mean-square precision of 4.7 μm.
Faster processing of multiple spatially-heterodyned direct to digital holograms
Hanson, Gregory R.; Bingham, Philip R.
2006-10-03
Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method includes of obtaining multiple spatially-heterodyned holograms, includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first, object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
Faster processing of multiple spatially-heterodyned direct to digital holograms
Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN
2008-09-09
Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method includes of obtaining multiple spatially-heterodyned holograms, includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
Recording multiple spatially-heterodyned direct to digital holograms in one digital image
Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN
2008-03-25
Systems and methods are described for recording multiple spatially-heterodyned direct to digital holograms in one digital image. A method includes digitally recording, at a first reference beam-object beam angle, a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram to sit on top of a first spatial-heterodyne carrier frequency defined by the first reference beam-object beam angle; digitally recording, at a second reference beam-object beam angle, a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram to sit on top of a second spatial-heterodyne carrier frequency defined by the second reference beam-object beam angle; applying a first digital filter to cut off signals around the first original origin and define a first result; performing a first inverse Fourier transform on the first result; applying a second digital filter to cut off signals around the second original origin and define a second result; and performing a second inverse Fourier transform on the second result, wherein the first reference beam-object beam angle is not equal to the second reference beam-object beam angle and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
Medina-Valverde, M José; Rodríguez-Borrego, M Aurora; Luque-Alcaraz, Olga; de la Torre-Barbero, M José; Parra-Perea, Julia; Moros-Molina, M del Pilar
2012-01-01
To identify problems and critical points in the software application. Assessment of the implementation of the software tool "Azahar" used to manage nursing care processes. The monitored population consisted of nurses who were users of the tool, at the Hospital and those who benefited from it in Primary Care. Each group was selected randomly and the number was determined by data saturation. A qualitative approach was employed using in-depth interviews and group discussion as data collection techniques. The nurses considered that the most beneficial and useful application of the tool was the initial assessment and the continuity of care release forms, as well as the recording of all data on the nursing process to ensure quality. The disadvantages and weaknesses identified were associated with the continuous variability in their daily care. The nurses related an increase in workload with the impossibility of entering the records into the computer, making paper records, thus duplicating the recording process. Likewise, they consider that the operating system of the software should be improved in terms of simplicity and functionality. The simplicity of the tool and the adjustment of workloads would favour its use and as a result, continuity of care. Copyright © 2010 Elsevier España, S.L. All rights reserved.
MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.
Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M
2002-05-30
Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross compatibility between them is limited. In this report we introduce a free open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered to some event.
NASA Astrophysics Data System (ADS)
West, P.; Michaelis, J.; Lebot, T.; McGuinness, D. L.; Fox, P. A.
2014-12-01
Providing proper citation and attribution for published data, derived data products, and the software tools used to generate them, has always been an important aspect of scientific research. However, It is often the case that this type of detailed citation and attribution is lacking. This is in part because it often requires manual markup since dynamic generation of this type of provenance information is not typically done by the tools used to access, manipulate, transform, and visualize data. In addition, the tools themselves lack the information needed to be properly cited themselves. The OPeNDAP Hyrax Software Framework is a tool that provides access to and the ability to constrain, manipulate, and transform, different types of data from different data formats, into a common format, the DAP (Data Access Protocol), in order to derive new data products. A user, or another software client, specifies an HTTP URL in order to access a particular piece of data, and appropriately transform it to suit a specific purpose of use. The resulting data products, however, do not contain any information about what data was used to create it, or the software process used to generate it, let alone information that would allow the proper citing and attribution to down stream researchers and tool developers. We will present our approach to provenance capture in Hyrax including a mechanism that can be used to report back to the hosting site any derived products, such as publications and reports, using the W3C PROV recommendation pingback service. We will demonstrate our utilization of Semantic Web and Web standards, the development of an information model that extends the PROV model for provenance capture, and the development of the pingback service. We will present our findings, as well as our practices for providing provenance information, visualization of the provenance information, and the development of pingback services, to better enable scientists and tool developers to be recognized and properly cited for their contributions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... of regulatory tools such as warnings, disclosure requirements, public education, and economic... review. In this time of fundamental transformation, that process—and the principles governing regulation...
Biomimetic Signal Processing Using the Biosonar Measurement Tool (BMT)
NASA Astrophysics Data System (ADS)
Abawi, Ahmad T.; Hursky, Paul; Porter, Michael B.; Tiemann, Chris; Martin, Stephen
2004-11-01
In this paper data recorded on the Biosonar Measurement Tool (BMT) during a target echolocation experiment are used to 1) find ways to separate target echoes from clutter echoes, 2) analyze target returns and 3) find features in target returns that distinguish them from clutter returns. The BMT is an instrumentation package used in dolphin echolocation experiments developed at SPAWARSYSCEN. It can be held by the dolphin using a bite-plate during echolocation experiments and records the movement and echolocation strategy of a target-hunting dolphin without interfering with its motion through the search field. The BMT was developed to record a variety of data from a free-swimming dolphin engaged in a bottom target detection task. These data include the three dimensional location of the dolphin, including its heading, pitch roll and velocity as well as passive acoustic data recorded on three channels. The outgoing dolphin click is recorded on one channel and the resulting echoes are recorded on the two remaining channels. For each outgoing click the BMT records a large number of echoes that come from the entire ensonified field. Given the large number of transmitted clicks and the returned echoes, it is almost impossible to find a target return from the recorded data on the BMT. As a means of separating target echoes from those of clutter, an echo-mapping tool was developed. This tool produces an echomap on which echoes from targets (and other regular objects such as surface buoys, the side of a boat and so on) stack together as tracks, while echoes from clutter are scattered. Once these tracks are identified, the retuned echoes can easily be extracted for further analysis.
Riaño, David; Real, Francis; López-Vallverdú, Joan Albert; Campana, Fabio; Ercolani, Sara; Mecocci, Patrizia; Annicchiarico, Roberta; Caltagirone, Carlo
2012-06-01
Chronically ill patients are complex health care cases that require the coordinated interaction of multiple professionals. A correct intervention of these sort of patients entails the accurate analysis of the conditions of each concrete patient and the adaptation of evidence-based standard intervention plans to these conditions. There are some other clinical circumstances such as wrong diagnoses, unobserved comorbidities, missing information, unobserved related diseases or prevention, whose detection depends on the capacities of deduction of the professionals involved. In this paper, we introduce an ontology for the care of chronically ill patients and implement two personalization processes and a decision support tool. The first personalization process adapts the contents of the ontology to the particularities observed in the health-care record of a given concrete patient, automatically providing a personalized ontology containing only the clinical information that is relevant for health-care professionals to manage that patient. The second personalization process uses the personalized ontology of a patient to automatically transform intervention plans describing health-care general treatments into individual intervention plans. For comorbid patients, this process concludes with the semi-automatic integration of several individual plans into a single personalized plan. Finally, the ontology is also used as the knowledge base of a decision support tool that helps health-care professionals to detect anomalous circumstances such as wrong diagnoses, unobserved comorbidities, missing information, unobserved related diseases, or preventive actions. Seven health-care centers participating in the K4CARE project, together with the group SAGESA and the Local Health System in the town of Pollenza have served as the validation platform for these two processes and tool. Health-care professionals participating in the evaluation agree about the average quality 84% (5.9/7.0) and utility 90% (6.3/7.0) of the tools and also about the correct reasoning of the decision support tool, according to clinical standards. Copyright © 2012 Elsevier Inc. All rights reserved.
Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T
2015-04-30
New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility to record spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, in turn requiring benchmarking data sets with known ground-truth spike times. We here present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimics salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, finite-sized electrode contacts, and allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data or arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
MatchingTools: A Python library for symbolic effective field theory calculations
NASA Astrophysics Data System (ADS)
Criado, Juan C.
2018-06-01
MatchingTools is a Python library for doing symbolic calculations in effective field theory. It provides the tools to construct general models by defining their field content and their interaction Lagrangian. Once a model is given, the heavy particles can be integrated out at the tree level to obtain an effective Lagrangian in which only the light particles appear. After integration, some of the terms of the resulting Lagrangian might not be independent. MatchingTools contains functions for transforming these terms to rewrite them in terms of any chosen set of operators.
Highly Efficient Electroporation-mediated Transformation into Edible Mushroom Flammulina velutipes
Kim, Jong Kun; Park, Young Jin; Kong, Won Sik
2010-01-01
In this study, we developed an efficient electroporation-mediated transformation system featuring Flammulina velutipes. The flammutoxin (ftx) gene of F. velutipes was isolated by reverse transcription-PCR. pFTXHg plasmid was constructed using the partial ftx gene (410 bp) along with the hygromycin B phosphotransferase gene (hygB) downstream of the glyceraldehydes-3-phosphate dehydrogenase (gpd) promoter. The plasmid was transformed into protoplasts of monokaryotic strain 4019-20 of F. velutipes by electroporation. High transformation efficiency was obtained with an electric-pulse of 1.25 kV/cm by using 177 transformants/µg of DNA in 1 × 107 protoplasts. PCR and Southern blot hybridization indicated that a single copy of the plasmid DNA was inserted at different locations in the F. velutipes genome by non-homologous recombination. Therefore, this transformation system could be used as a useful tool for gene function analysis of F. velutipes. PMID:23956676
Highly Efficient Electroporation-mediated Transformation into Edible Mushroom Flammulina velutipes.
Kim, Jong Kun; Park, Young Jin; Kong, Won Sik; Kang, Hee Wan
2010-12-01
In this study, we developed an efficient electroporation-mediated transformation system featuring Flammulina velutipes. The flammutoxin (ftx) gene of F. velutipes was isolated by reverse transcription-PCR. pFTXHg plasmid was constructed using the partial ftx gene (410 bp) along with the hygromycin B phosphotransferase gene (hygB) downstream of the glyceraldehydes-3-phosphate dehydrogenase (gpd) promoter. The plasmid was transformed into protoplasts of monokaryotic strain 4019-20 of F. velutipes by electroporation. High transformation efficiency was obtained with an electric-pulse of 1.25 kV/cm by using 177 transformants/µg of DNA in 1 × 10(7) protoplasts. PCR and Southern blot hybridization indicated that a single copy of the plasmid DNA was inserted at different locations in the F. velutipes genome by non-homologous recombination. Therefore, this transformation system could be used as a useful tool for gene function analysis of F. velutipes.
Predicting Shear Transformation Events in Metallic Glasses
NASA Astrophysics Data System (ADS)
Xu, Bin; Falk, Michael L.; Li, J. F.; Kong, L. T.
2018-03-01
Shear transformation is the elementary process for plastic deformation of metallic glasses, the prediction of the occurrence of the shear transformation events is therefore of vital importance to understand the mechanical behavior of metallic glasses. In this Letter, from the view of the potential energy landscape, we find that the protocol-dependent behavior of shear transformation is governed by the stress gradient along its minimum energy path and we propose a framework as well as an atomistic approach to predict the triggering strains, locations, and structural transformations of the shear transformation events under different shear protocols in metallic glasses. Verification with a model Cu64 Zr36 metallic glass reveals that the prediction agrees well with athermal quasistatic shear simulations. The proposed framework is believed to provide an important tool for developing a quantitative understanding of the deformation processes that control mechanical behavior of metallic glasses.
Predicting Shear Transformation Events in Metallic Glasses.
Xu, Bin; Falk, Michael L; Li, J F; Kong, L T
2018-03-23
Shear transformation is the elementary process for plastic deformation of metallic glasses, the prediction of the occurrence of the shear transformation events is therefore of vital importance to understand the mechanical behavior of metallic glasses. In this Letter, from the view of the potential energy landscape, we find that the protocol-dependent behavior of shear transformation is governed by the stress gradient along its minimum energy path and we propose a framework as well as an atomistic approach to predict the triggering strains, locations, and structural transformations of the shear transformation events under different shear protocols in metallic glasses. Verification with a model Cu_{64}Zr_{36} metallic glass reveals that the prediction agrees well with athermal quasistatic shear simulations. The proposed framework is believed to provide an important tool for developing a quantitative understanding of the deformation processes that control mechanical behavior of metallic glasses.
Warped linear mixed models for the genetic analysis of transformed phenotypes
Fusi, Nicolo; Lippert, Christoph; Lawrence, Neil D.; Stegle, Oliver
2014-01-01
Linear mixed models (LMMs) are a powerful and established tool for studying genotype–phenotype relationships. A limitation of the LMM is that the model assumes Gaussian distributed residuals, a requirement that rarely holds in practice. Violations of this assumption can lead to false conclusions and loss in power. To mitigate this problem, it is common practice to pre-process the phenotypic values to make them as Gaussian as possible, for instance by applying logarithmic or other nonlinear transformations. Unfortunately, different phenotypes require different transformations, and choosing an appropriate transformation is challenging and subjective. Here we present an extension of the LMM that estimates an optimal transformation from the observed data. In simulations and applications to real data from human, mouse and yeast, we show that using transformations inferred by our model increases power in genome-wide association studies and increases the accuracy of heritability estimation and phenotype prediction. PMID:25234577
Warped linear mixed models for the genetic analysis of transformed phenotypes.
Fusi, Nicolo; Lippert, Christoph; Lawrence, Neil D; Stegle, Oliver
2014-09-19
Linear mixed models (LMMs) are a powerful and established tool for studying genotype-phenotype relationships. A limitation of the LMM is that the model assumes Gaussian distributed residuals, a requirement that rarely holds in practice. Violations of this assumption can lead to false conclusions and loss in power. To mitigate this problem, it is common practice to pre-process the phenotypic values to make them as Gaussian as possible, for instance by applying logarithmic or other nonlinear transformations. Unfortunately, different phenotypes require different transformations, and choosing an appropriate transformation is challenging and subjective. Here we present an extension of the LMM that estimates an optimal transformation from the observed data. In simulations and applications to real data from human, mouse and yeast, we show that using transformations inferred by our model increases power in genome-wide association studies and increases the accuracy of heritability estimation and phenotype prediction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bammann, D.; Prantil, V.; Kumar, A.
1996-06-24
An internal state variable formulation for phase transforming alloy steels is presented. We have illustrated how local transformation plasticity can be accommodated by an appropriate choice for the corresponding internal stress field acting between the phases. The state variable framework compares well with a numerical micromechanical calculation providing a discrete dependence of microscopic plasticity on volume fraction and the stress dependence attributable to a softer parent phase. The multiphase model is used to simulate the stress state of a quenched bar and show qualitative trends in the response when the transformation phenomenon is incorporated on the length scale of amore » global boundary value problem.« less
HERCULES: A Pattern Driven Code Transformation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing
2012-01-01
New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist to separate the two concerns, which improves code maintenance, and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts and compiler plugins, to provide the scientist with an environment to quickly implement code transformations that suit his needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss themore » design, implementation and an initial evaluation of HERCULES.« less
Electronic Health Record Adoption as a Function of Success: Implications for Meaningful Use
ERIC Educational Resources Information Center
Naser, Riyad J.
2012-01-01
Successful electronic health records (EHR) implementation has the potential to transform the entire care delivery process across the enterprise. However, the rate of EHR implementation and use among physicians has been slow. Different factors have been reported in the literature that may hinder adoption of EHR. Identifying and managing these…
ERIC Educational Resources Information Center
Aldukheil, Maher A.
2013-01-01
The Healthcare industry is characterized by its complexity in delivering care to the patients. Accordingly, healthcare organizations adopt and implement Information Technology (IT) solutions to manage complexity, improve quality of care, and transform to a fully integrated and digitized environment. Electronic Medical Records (EMR), which is…
NASA Astrophysics Data System (ADS)
Barkstrom, B. R.; Loeb, N. G.; Wielicki, B. A.
2017-12-01
Verification, Validation, and Uncertainty Quantification (VVUQ) are key actions that support conclusions based on Earth science data. Communities of data producers and users must undertake VVUQ when they create and use their data. The strategies [S] and tools [T] suggested below come from successful use on two large NASA projects. The first was the Earth Radiation Budget Experiment (ERBE). The second is the investigation of Clouds and the Earth's Radiant Energy System (CERES). [S] 1. Partition the production system into subsystems that deal with data transformations confined to limited space and time scales. Simplify the subsystems to minimize the number of data transformations in each subsystem. [S] 2. Derive algorithms from the fundamental physics and chemistry governing the parameters in each subsystem including those for instrument calibration. [S] 3. Use preliminary uncertainty estimates to detect unexpected discrepancies. Removing these requires diagnostic work as well as development and testing of fixes. [S] 4. Make sure there are adequate resources to support multiple end-to-end reprocessing of all data products. [T] 1. Create file identifiers that accommodate temporal and spatial sequences of data files and subsystem version changes. [T] 2. Create libraries of parameters used in common by different subsystems to reduce errors due to inconsistent values. [T] 3. Maintain a list of action items to record progress on resolving discrepancies. [T] 4. Plan on VVUQ activities that use independent data sources and peer review before distributing and archiving data. The goal of VVUQ is to provide a transparent link between the data and the physics and chemistry governing the measured quantities. The VVUQ effort also involves specialized domain experience and nomenclature. It often requires as much effort as the original system development. ERBE and CERES demonstrated that these strategies and tools can reduce the cost of VVUQ for Earth science data products.
Kriegel, Fabian L; Köhler, Ralf; Bayat-Sarmadi, Jannike; Bayerl, Simon; Hauser, Anja E; Niesner, Raluca; Luch, Andreas; Cseresnyes, Zoltan
2018-03-01
Cells in their natural environment often exhibit complex kinetic behavior and radical adjustments of their shapes. This enables them to accommodate to short- and long-term changes in their surroundings under physiological and pathological conditions. Intravital multi-photon microscopy is a powerful tool to record this complex behavior. Traditionally, cell behavior is characterized by tracking the cells' movements, which yields numerous parameters describing the spatiotemporal characteristics of cells. Cells can be classified according to their tracking behavior using all or a subset of these kinetic parameters. This categorization can be supported by the a priori knowledge of experts. While such an approach provides an excellent starting point for analyzing complex intravital imaging data, faster methods are required for automated and unbiased characterization. In addition to their kinetic behavior, the 3D shape of these cells also provide essential clues about the cells' status and functionality. New approaches that include the study of cell shapes as well may also allow the discovery of correlations amongst the track- and shape-describing parameters. In the current study, we examine the applicability of a set of Fourier components produced by Discrete Fourier Transform (DFT) as a tool for more efficient and less biased classification of complex cell shapes. By carrying out a number of 3D-to-2D projections of surface-rendered cells, the applied method reduces the more complex 3D shape characterization to a series of 2D DFTs. The resulting shape factors are used to train a Self-Organizing Map (SOM), which provides an unbiased estimate for the best clustering of the data, thereby characterizing groups of cells according to their shape. We propose and demonstrate that such shape characterization is a powerful addition to, or a replacement for kinetic analysis. This would make it especially useful in situations where live kinetic imaging is less practical or not possible at all. © 2017 International Society for Advancement of Cytometry. © 2017 International Society for Advancement of Cytometry.
Improving the quality of EHR recording in primary care: a data quality feedback tool.
van der Bij, Sjoukje; Khan, Nasra; Ten Veen, Petra; de Bakker, Dinny H; Verheij, Robert A
2017-01-01
Electronic health record (EHR) data are used to exchange information among health care providers. For this purpose, the quality of the data is essential. We developed a data quality feedback tool that evaluates differences in EHR data quality among practices and software packages as part of a larger intervention. The tool was applied in 92 practices in the Netherlands using different software packages. Practices received data quality feedback in 2010 and 2012. We observed large differences in the quality of recording. For example, the percentage of episodes of care that had a meaningful diagnostic code ranged from 30% to 100%. Differences were highly related to the software package. A year after the first measurement, the quality of recording had improved significantly and differences decreased, with 67% of the physicians indicating that they had actively changed their recording habits based on the results of the first measurement. About 80% found the feedback helpful in pinpointing recording problems. One of the software vendors made changes in functionality as a result of the feedback. Our EHR data quality feedback tool is capable of highlighting differences among practices and software packages. As such, it also stimulates improvements. As substantial variability in recording is related to the software package, our study strengthens the evidence that data quality can be improved substantially by standardizing the functionalities of EHR software packages. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Zheng, Lei; Nikolaev, Anton; Wardill, Trevor J; O'Kane, Cahir J; de Polavieja, Gonzalo G; Juusola, Mikko
2009-01-01
Because of the limited processing capacity of eyes, retinal networks must adapt constantly to best present the ever changing visual world to the brain. However, we still know little about how adaptation in retinal networks shapes neural encoding of changing information. To study this question, we recorded voltage responses from photoreceptors (R1-R6) and their output neurons (LMCs) in the Drosophila eye to repeated patterns of contrast values, collected from natural scenes. By analyzing the continuous photoreceptor-to-LMC transformations of these graded-potential neurons, we show that the efficiency of coding is dynamically improved by adaptation. In particular, adaptation enhances both the frequency and amplitude distribution of LMC output by improving sensitivity to under-represented signals within seconds. Moreover, the signal-to-noise ratio of LMC output increases in the same time scale. We suggest that these coding properties can be used to study network adaptation using the genetic tools in Drosophila, as shown in a companion paper (Part II).
An Open Source Tool for Game Theoretic Health Data De-Identification.
Prasser, Fabian; Gaupp, James; Wan, Zhiyu; Xia, Weiyi; Vorobeychik, Yevgeniy; Kantarcioglu, Murat; Kuhn, Klaus; Malin, Brad
2017-01-01
Biomedical data continues to grow in quantity and quality, creating new opportunities for research and data-driven applications. To realize these activities at scale, data must be shared beyond its initial point of collection. To maintain privacy, healthcare organizations often de-identify data, but they assume worst-case adversaries, inducing high levels of data corruption. Recently, game theory has been proposed to account for the incentives of data publishers and recipients (who attempt to re-identify patients), but this perspective has been more hypothetical than practical. In this paper, we report on a new game theoretic data publication strategy and its integration into the open source software ARX. We evaluate our implementation with an analysis on the relationship between data transformation, utility, and efficiency for over 30,000 demographic records drawn from the U.S. Census Bureau. The results indicate that our implementation is scalable and can be combined with various data privacy risk and quality measures.
From photography to cinematography: recording movement and gait in a neurological context.
Aubert, Geneviève
2002-09-01
The major challenge of photography has been freezing movement, to transform it into a fixed image or series of images. Very soon, photographers became interested in movement itself and tried to use photography as a tool to analyze movement. At the early stages, physicians interested in movement, perhaps surprisingly, made important technical contributions. Mécanisme de la physionomie humaine, by Duchenne, the first book with physiological experiments illustrated by photographs, is a landmark in this historical development. At the Salpêtrière, thanks to Charcot, photography officially entered clinical neurology. Medical journals with photographs were actively developed by Bourneville. Londe established a clinical photographic laboratory and published the first book on medical photography. The study of animal and human movement by Muybridge and Marey in the 1880s led to chronophotography and later cinematography. Clinicians such as Dercum and Richer took advantage of these new techniques to study pathological movement and gait in neurological diseases.
Statistical Analysis of an Infrared Thermography Inspection of Reinforced Carbon-Carbon
NASA Technical Reports Server (NTRS)
Comeaux, Kayla
2011-01-01
Each piece of flight hardware being used on the shuttle must be analyzed and pass NASA requirements before the shuttle is ready for launch. One tool used to detect cracks that lie within flight hardware is Infrared Flash Thermography. This is a non-destructive testing technique which uses an intense flash of light to heat up the surface of a material after which an Infrared camera is used to record the cooling of the material. Since cracks within the material obstruct the natural heat flow through the material, they are visible when viewing the data from the Infrared camera. We used Ecotherm, a software program, to collect data pertaining to the delaminations and analyzed the data using Ecotherm and University of Dayton Log Logistic Probability of Detection (POD) Software. The goal was to reproduce the statistical analysis produced by the University of Dayton software, by using scatter plots, log transforms, and residuals to test the assumption of normality for the residuals.
Wardill, Trevor J.; O'Kane, Cahir J.; de Polavieja, Gonzalo G.; Juusola, Mikko
2009-01-01
Because of the limited processing capacity of eyes, retinal networks must adapt constantly to best present the ever changing visual world to the brain. However, we still know little about how adaptation in retinal networks shapes neural encoding of changing information. To study this question, we recorded voltage responses from photoreceptors (R1–R6) and their output neurons (LMCs) in the Drosophila eye to repeated patterns of contrast values, collected from natural scenes. By analyzing the continuous photoreceptor-to-LMC transformations of these graded-potential neurons, we show that the efficiency of coding is dynamically improved by adaptation. In particular, adaptation enhances both the frequency and amplitude distribution of LMC output by improving sensitivity to under-represented signals within seconds. Moreover, the signal-to-noise ratio of LMC output increases in the same time scale. We suggest that these coding properties can be used to study network adaptation using the genetic tools in Drosophila, as shown in a companion paper (Part II). PMID:19180196
Finding Your Scientific Voice - Theatre Techniques for Physicists
NASA Astrophysics Data System (ADS)
Dreyer-Lude, Melanie
Research talks can be dull. Scientists may be making important, ground-breaking discoveries, but their audience is often missing the message. Whether presenting a conference talk, pitching a congressman for funding, or participating in a job interview, scientists must learn how to tell their stories. Conducting research and talking about that research are separate skill sets. The curse of knowledge, too much information, or the inability to speak and move properly may all be standing in the way of turning a talk into a memorable event. Building on initiatives like those of the Alan Alda Center and Bruce Greenes theatrical productions, our workshop helps researchers connect performing skills to the reality of presenting complex research subjects. This talk reviews key aspects of the Finding Your Scientific Voice workshop. Using digital recordings of pre and post workshop presentations, we will demonstrate what is exceptional about our workshop process and how it uses theatrical tools like Great Beginnings, the Dramatic Arc, the Core Message and Strong Endings to transform a humdrum presentation into a dynamic speaking event.
Structure and dynamics of spin-labeled insulin entrapped in a silica matrix by the sol-gel method.
Vanea, E; Gruian, C; Rickert, C; Steinhoff, H-J; Simon, V
2013-08-12
The structure and conformational dynamics of insulin entrapped into a silica matrix was monitored during the sol to maturated-gel transition by electron paramagnetic resonance (EPR) spectroscopy. Insulin was successfully spin-labeled with iodoacetamide and the bifunctional nitroxide reagent HO-1944. Room temperature continuous wave (cw) EPR spectra of insulin were recorded to assess the mobility of the attached spin labels. Insulin conformation and its distribution within the silica matrix were studied using double electron-electron resonance (DEER) and low-temperature cw-EPR. A porous oxide matrix seems to form around insulin molecules with pore diameters in the order of a few nanometers. Secondary structure of the encapsulated insulin investigated by Fourier transform infrared spectroscopy proved a high structural integrity of insulin even in the dried silica matrix. The results show that silica encapsulation can be used as a powerful tool to effectively isolate and functionally preserve biomolecules during preparation, storage, and release.
49 CFR 563.12 - Data retrieval tools.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 6 2011-10-01 2011-10-01 false Data retrieval tools. 563.12 Section 563.12... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.12 Data retrieval tools. Each... tool(s) is commercially available that is capable of accessing and retrieving the data stored in the...
Use of the Box-Cox Transformation in Detecting Changepoints in Daily Precipitation Data Series
NASA Astrophysics Data System (ADS)
Wang, X. L.; Chen, H.; Wu, Y.; Pu, Q.
2009-04-01
This study integrates a Box-Cox power transformation procedure into two statistical tests for detecting changepoints in Gaussian data series, to make the changepoint detection methods applicable to non-Gaussian data series, such as daily precipitation amounts. The detection power aspects of transformed methods in a common trend two-phase regression setting are assessed by Monte Carlo simulations for data of a log-normal or Gamma distribution. The results show that the transformed methods have increased the power of detection, in comparison with the corresponding original (untransformed) methods. The transformed data much better approximate to a Gaussian distribution. As an example of application, the new methods are applied to a series of daily precipitation amounts recorded at a station in Canada, showing satisfactory detection power.
Short time Fourier analysis of the electromyogram - Fast movements and constant contraction
NASA Technical Reports Server (NTRS)
Hannaford, Blake; Lehman, Steven
1986-01-01
Short-time Fourier analysis was applied to surface electromyograms (EMG) recorded during rapid movements, and during isometric contractions at constant forces. A portion of the data to be transformed by multiplying the signal by a Hamming window was selected, and then the discrete Fourier transform was computed. Shifting the window along the data record, a new spectrum was computed each 10 ms. The transformed data were displayed in spectograms or 'voiceprints'. This short-time technique made it possible to see time-dependencies in the EMG that are normally averaged in the Fourier analysis of these signals. Spectra of EMGs during isometric contractions at constant force vary in the short (10-20 ms) term. Short-time spectra from EMGs recorded during rapid movements were much less variable. The windowing technique picked out the typical 'three-burst pattern' in EMG's from both wrist and head movements. Spectra during the bursts were more consistent than those during isometric contractions. Furthermore, there was a consistent shift in spectral statistics in the course of the three bursts. Both the center frequency and the variance of the spectral energy distribution grew from the first burst to the second burst in the same muscle. The analogy between EMGs and speech signals is extended to argue for future applicability of short-time spectral analysis of EMG.
Color imaging technologies in the prepress industry
NASA Astrophysics Data System (ADS)
Silverman, Lee
1992-05-01
Over much of the last half century, electronic technologies have played an increasing role in the prepress production of film and plates prepared for printing presses. The last decade has seen an explosion of technologies capable of supplementing this production. The most outstanding technology infusing this growth has been the microcomputer, but other component technologies have also diversified the capacity for high-quality scanning of photographs. In addition, some fundamental software and affordable laser recorder technologies have provided new approaches to the merging of typographic and halftoned photographic data onto film. The next decade will evolve the methods and the technologies to achieve superior text and image communication on mass distribution media used in the printed page or instead of the printed page. This paper focuses on three domains of electronic prepress classified as the input, transformation, and output phases of the production process. The evolution of the component technologies in each of these three phases is described. The unique attributes in each are defined and then follows a discussion of the pertinent technologies which overlap all three domains. Unique to input is sensor technology and analogue to digital conversion. Unique to the transformation phase is the display on monitor for soft proofing and interactive processing. The display requires special technologies for digital frame storage and high-speed, gamma- compensated, digital to analogue conversion. Unique to output is the need for halftoning and binary recording device linearization or calibration. Specialized direct digital color technologies now allow color quality proofing without the need for writing intermediate separation films, but ultimately these technologies will be supplanted by direct printing technologies. First, dry film processing, then direct plate writing, and finally direct application of ink or toner onto paper at the 20 - 30 thousand impressions per hour now achieved by offset printing. In summary, a review of technological evolution guides industry methodologies that will define a transformation of workflow in graphic arts during the next decade. Prepress production will integrate component technologies with microcomputers in order to optimize the production cycle from graphic design to printed piece. These changes will drastically alter the business structures and tools used to put type and photographs on paper in the volumes expected from printing presses.
NASA Astrophysics Data System (ADS)
Tsang-Hin-Sun, Eve; Royer, Jean-Yves; Sukhovich, Alexey; Perrot, Julie
2014-05-01
Arrays of autonomous hydrophones (AUHs) proved to be a very valuable tool for monitoring the seismic activity of mid-ocean ridges. AUHs take advantage of the ocean acoustic properties to detect many low-magnitude underwater earthquakes undetected by land-based stations. This allows for a significant improvement in the magnitude completeness level of seismic catalogs in remote oceanic areas. This study presents some results from the deployment of the OHASISBIO array comprising 7 AUHs deployed in the southern Indian Ocean. The source of acoustic events, i.e. site where - conversion from seismic to acoustic waves occur and proxy to epicenters for shallow earthquakes - can be precisely located within few km, inside the AUH array. The distribution of the uncertainties in the locations and time-origins shows that the OHASISBIO array reliably covers a wide region encompassing the Indian Ocean triple junction and large extent of the three mid-oceanic Indian spreading ridges, from 52°E to 80°E and from 25°S to 40°S. During its one year long deployment in 2012 and in this area the AUH array recorded 1670 events, while, for the same period, land-based networks only detected 470 events. A comparison of the background seismicity along the South-east (SEIR) and South-west (SWIR) Indian ridges suggests that the microseismicity, even over a year period, could be representative of the steady-state of stress along the SEIR and SWIR; this conclusion is based on very high Spearman's correlations between our one-year long AUH catalog and teleseismic catalogs over nearly 40 years. Seismicity along the ultra-slow spreading SWIR is regularly distributed in space and time, along spreading segments and transform faults, whereas the intermediate spreading SEIR diplays clusters of events in the vicinity of some transform faults or near specific geological structures such as the St-Paul and Amsterdam hotspot. A majority of these clusters seem to be related to magmatic processes, such as dyke intrusion or propagation. The analysis of mainshock-aftershock sequences reveals that flew clusters fit a modified Omori law, non-withstanding of their location (on transform faults or not), reflecting complex rupture mechanisms along both spreading ridges.
NASA Astrophysics Data System (ADS)
Safieddine, Doha; Kachenoura, Amar; Albera, Laurent; Birot, Gwénaël; Karfoul, Ahmad; Pasnicu, Anca; Biraben, Arnaud; Wendling, Fabrice; Senhadji, Lotfi; Merlet, Isabelle
2012-12-01
Electroencephalographic (EEG) recordings are often contaminated with muscle artifacts. This disturbing myogenic activity not only strongly affects the visual analysis of EEG, but also most surely impairs the results of EEG signal processing tools such as source localization. This article focuses on the particular context of the contamination epileptic signals (interictal spikes) by muscle artifact, as EEG is a key diagnosis tool for this pathology. In this context, our aim was to compare the ability of two stochastic approaches of blind source separation, namely independent component analysis (ICA) and canonical correlation analysis (CCA), and of two deterministic approaches namely empirical mode decomposition (EMD) and wavelet transform (WT) to remove muscle artifacts from EEG signals. To quantitatively compare the performance of these four algorithms, epileptic spike-like EEG signals were simulated from two different source configurations and artificially contaminated with different levels of real EEG-recorded myogenic activity. The efficiency of CCA, ICA, EMD, and WT to correct the muscular artifact was evaluated both by calculating the normalized mean-squared error between denoised and original signals and by comparing the results of source localization obtained from artifact-free as well as noisy signals, before and after artifact correction. Tests on real data recorded in an epileptic patient are also presented. The results obtained in the context of simulations and real data show that EMD outperformed the three other algorithms for the denoising of data highly contaminated by muscular activity. For less noisy data, and when spikes arose from a single cortical source, the myogenic artifact was best corrected with CCA and ICA. Otherwise when spikes originated from two distinct sources, either EMD or ICA offered the most reliable denoising result for highly noisy data, while WT offered the better denoising result for less noisy data. These results suggest that the performance of muscle artifact correction methods strongly depend on the level of data contamination, and of the source configuration underlying EEG signals. Eventually, some insights into the numerical complexity of these four algorithms are given.
Reef-coral proteins as visual, non-destructive reporters for plant transformation.
Wenck, A; Pugieux, C; Turner, M; Dunn, M; Stacy, C; Tiozzo, A; Dunder, E; van Grinsven, E; Khan, R; Sigareva, M; Wang, W C; Reed, J; Drayton, P; Oliver, D; Trafford, H; Legris, G; Rushton, H; Tayab, S; Launis, K; Chang, Y-F; Chen, D-F; Melchers, L
2003-11-01
Recently, five novel fluorescent proteins have been isolated from non-bioluminescent species of reef-coral organisms and have been made available through ClonTech. They are AmCyan, AsRed, DsRed, ZsGreen and ZsYellow. These proteins are valuable as reporters for transformation because they do not require a substrate or external co-factor to emit fluorescence and can be tested in vivo without destruction of the tissue under study. We have evaluated them in a large range of plants, both monocots and dicots, and our results indicate that they are valuable reporting tools for transformation in a wide variety of crops. We report here their successful expression in wheat, maize, barley, rice, banana, onion, soybean, cotton, tobacco, potato and tomato. Transient expression could be observed as early as 24 h after DNA delivery in some cases, allowing for very clear visualization of individually transformed cells. Stable transgenic events were generated, using mannose, kanamycin or hygromycin selection. Transgenic plants were phenotypically normal, showing a wide range of fluorescence levels, and were fertile. Expression of AmCyan, ZsGreen and AsRed was visible in maize T1 seeds, allowing visual segregation to more than 99% accuracy. The excitation and emission wavelengths of some of these proteins are significantly different; the difference is enough for the simultaneous visualization of cells transformed with more than one of the fluorescent proteins. These proteins will become useful tools for transformation optimization and other studies. The wide variety of plants successfully tested demonstrates that these proteins will potentially find broad use in plant biology.
USDA-ARS?s Scientific Manuscript database
Background: Mobile technologies are emerging as a valuable tool to collect and assess dietary intake. Adolescents readily accept and adopt new technologies; hence, a food record application (FRapp) may be used as a tool to promote a better understanding of adolescent’s dietary intake and eating patt...
ERIC Educational Resources Information Center
Hardman, Joanne
2005-01-01
Because computers potentially transform pedagogy, much has been made of their ability to impact positively on student performance, particularly in subjects such as mathematics and science. However, there is currently a dearth of research regarding exactly how the computer acts as a transformative tool in disadvantaged schools. Drawing on a…
Assessing adherence to the evidence base in the management of poststroke dysphagia.
Burton, Christopher; Pennington, Lindsay; Roddam, Hazel; Russell, Ian; Russell, Daphne; Krawczyk, Karen; Smith, Hilary A
2006-01-01
To evaluate the reliability and responsiveness to change of an audit tool to assess adherence to evidence of effectiveness in the speech and language therapy (SLT) management of poststroke dysphagia. The tool was used to review SLT practice as part of a randomized study of different education strategies. Medical records were audited before and after delivery of the trial intervention. Seventeen SLT departments in the north-west of England participated in the study. The assessment tool was used to assess the medical records of 753 patients before and 717 patients after delivery of the trial intervention across the 17 departments. A target of 10 records per department per month was sought, using systematic sampling with a random start. Inter- and intra-rater reliability were explored, together with the tool's internal consistency and responsiveness to change. The assessment tool had high face validity, although internal consistency was low (ra = 0.37). Composite scores on the tool were however responsive to differences between SLT departments. Both inter- and intra-rater reliability ranged from 'substantial' to 'near perfect' across all items. The audit tool has high face validity and measurement reliability. The use of a composite adherence score should, however, proceed with caution as internal consistency is low.
Maldonado, José Alberto; Marcos, Mar; Fernández-Breis, Jesualdo Tomás; Parcero, Estíbaliz; Boscá, Diego; Legaz-García, María Del Carmen; Martínez-Salvador, Begoña; Robles, Montserrat
2016-01-01
The heterogeneity of clinical data is a key problem in the sharing and reuse of Electronic Health Record (EHR) data. We approach this problem through the combined use of EHR standards and semantic web technologies, concretely by means of clinical data transformation applications that convert EHR data in proprietary format, first into clinical information models based on archetypes, and then into RDF/OWL extracts which can be used for automated reasoning. In this paper we describe a proof-of-concept platform to facilitate the (re)configuration of such clinical data transformation applications. The platform is built upon a number of web services dealing with transformations at different levels (such as normalization or abstraction), and relies on a collection of reusable mappings designed to solve specific transformation steps in a particular clinical domain. The platform has been used in the development of two different data transformation applications in the area of colorectal cancer.
NASA Astrophysics Data System (ADS)
Alehosseini, Ali; A. Hejazi, Maryam; Mokhtari, Ghassem; B. Gharehpetian, Gevork; Mohammadi, Mohammad
2015-06-01
In this paper, the Bayesian classifier is used to detect and classify the radial deformation and axial displacement of transformer windings. The proposed method is tested on a model of transformer for different volumes of radial deformation and axial displacement. In this method, ultra-wideband (UWB) signal is sent to the simplified model of the transformer winding. The received signal from the winding model is recorded and used for training and testing of Bayesian classifier in different axial displacement and radial deformation states of the winding. It is shown that the proposed method has a good accuracy to detect and classify the axial displacement and radial deformation of the winding.
Archetype-based data warehouse environment to enable the reuse of electronic health record data.
Marco-Ruiz, Luis; Moner, David; Maldonado, José A; Kolstrup, Nils; Bellika, Johan G
2015-09-01
The reuse of data captured during health care delivery is essential to satisfy the demands of clinical research and clinical decision support systems. A main barrier for the reuse is the existence of legacy formats of data and the high granularity of it when stored in an electronic health record (EHR) system. Thus, we need mechanisms to standardize, aggregate, and query data concealed in the EHRs, to allow their reuse whenever they are needed. To create a data warehouse infrastructure using archetype-based technologies, standards and query languages to enable the interoperability needed for data reuse. The work presented makes use of best of breed archetype-based data transformation and storage technologies to create a workflow for the modeling, extraction, transformation and load of EHR proprietary data into standardized data repositories. We converted legacy data and performed patient-centered aggregations via archetype-based transformations. Later, specific purpose aggregations were performed at a query level for particular use cases. Laboratory test results of a population of 230,000 patients belonging to Troms and Finnmark counties in Norway requested between January 2013 and November 2014 have been standardized. Test records normalization has been performed by defining transformation and aggregation functions between the laboratory records and an archetype. These mappings were used to automatically generate open EHR compliant data. These data were loaded into an archetype-based data warehouse. Once loaded, we defined indicators linked to the data in the warehouse to monitor test activity of Salmonella and Pertussis using the archetype query language. Archetype-based standards and technologies can be used to create a data warehouse environment that enables data from EHR systems to be reused in clinical research and decision support systems. With this approach, existing EHR data becomes available in a standardized and interoperable format, thus opening a world of possibilities toward semantic or concept-based reuse, query and communication of clinical data. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Characterization of human breast cancer tissues by infrared imaging.
Verdonck, M; Denayer, A; Delvaux, B; Garaud, S; De Wind, R; Desmedt, C; Sotiriou, C; Willard-Gallo, K; Goormaghtigh, E
2016-01-21
Fourier Transform InfraRed (FTIR) spectroscopy coupled to microscopy (IR imaging) has shown unique advantages in detecting morphological and molecular pathologic alterations in biological tissues. The aim of this study was to evaluate the potential of IR imaging as a diagnostic tool to identify characteristics of breast epithelial cells and the stroma. In this study a total of 19 breast tissue samples were obtained from 13 patients. For 6 of the patients, we also obtained Non-Adjacent Non-Tumor tissue samples. Infrared images were recorded on the main cell/tissue types identified in all breast tissue samples. Unsupervised Principal Component Analyses and supervised Partial Least Square Discriminant Analyses (PLS-DA) were used to discriminate spectra. Leave-one-out cross-validation was used to evaluate the performance of PLS-DA models. Our results show that IR imaging coupled with PLS-DA can efficiently identify the main cell types present in FFPE breast tissue sections, i.e. epithelial cells, lymphocytes, connective tissue, vascular tissue and erythrocytes. A second PLS-DA model could distinguish normal and tumor breast epithelial cells in the breast tissue sections. A patient-specific model reached particularly high sensitivity, specificity and MCC rates. Finally, we showed that the stroma located close or at distance from the tumor exhibits distinct spectral characteristics. In conclusion FTIR imaging combined with computational algorithms could be an accurate, rapid and objective tool to identify/quantify breast epithelial cells and differentiate tumor from normal breast tissue as well as normal from tumor-associated stroma, paving the way to the establishment of a potential complementary tool to ensure safe tumor margins.
2009-01-01
The means we use to record the process of carrying out research remains tied to the concept of a paginated paper notebook despite the advances over the past decade in web based communication and publication tools. The development of these tools offers an opportunity to re-imagine what the laboratory record would look like if it were re-built in a web-native form. In this paper I describe a distributed approach to the laboratory record based which uses the most appropriate tool available to house and publish each specific object created during the research process, whether they be a physical sample, a digital data object, or the record of how one was created from another. I propose that the web-native laboratory record would act as a feed of relationships between these items. This approach can be seen as complementary to, rather than competitive with, integrative approaches that aim to aggregate relevant objects together to describe knowledge. The potential for the recent announcement of the Google Wave protocol to have a significant impact on realizing this vision is discussed along with the issues of security and provenance that are raised by such an approach. PMID:20098590
ERIC Educational Resources Information Center
Timmis, Sue; Joubert, Marie; Manuel, Anne; Barnes, Sally
2010-01-01
This article explores the use of multiple digital tools for mediating communications, drawing on two recent empirical studies in which students and researchers in UK higher education worked on collaborative activities: how different tools were used and the quality of the communications and their contributions to collaborative working and knowledge…
Cell Phones in Task Based Learning--Are Cell Phones Useful Language Learning Tools?
ERIC Educational Resources Information Center
Kiernan, Patrick J.; Aizawa, Kazumi
2004-01-01
Cell phones are now widespread in many countries including Japan where we teach, and are particularly popular among university students. Although they can be a distraction in the classroom, functions such as Internet access and e-mail capability have transformed them into sophisticated communication tools. But are they also potentially useful in…
Social Software and Academic Practice: Postgraduate Students as Co-Designers of Web 2.0 Tools
ERIC Educational Resources Information Center
Carmichael, Patrick; Burchmore, Helen
2010-01-01
In order to develop potentially transformative Web 2.0 tools in higher education, the complexity of existing academic practices, including current patterns of technology use, must be recognised. This paper describes how a series of participatory design activities allowed postgraduate students in education, social sciences and computer sciences to…
Involving the Child in the Management of Science Museums: A Tool of Social Transformation
ERIC Educational Resources Information Center
Gordillo Martorell, José Antonio
2017-01-01
The participation of children in the management of science museums, following the theoretical approach of psychologist and educator Francesco Tonucci, is an effective tool both for the improvement at an internal level of the organization itself and for the implementation of a series of significant changes in the child's most immediate…
Does Skepticism Predict News Media Literacy: A Study on Turkish Young Adults
ERIC Educational Resources Information Center
Kartal, Osman Yilmaz; Yazgan, Akan Deniz; Kincal, Remzi Y.
2017-01-01
The 2010's are when information and informatics age coexist, information overload has been transformed into a mass engineering tool, "imposing bombardment" has become the norm. The most influential tool of this cultural-industrial act is news media. Efforts to educate young adults, who are most active in touch with information, in view…
ERIC Educational Resources Information Center
Jahreie, Cecilie Flo
2010-01-01
This article examines the way student teachers make sense of conceptual tools when writing cases. In order to understand the problem-solving process, an analysis of the interactions is conducted. The findings show that transforming practical experiences into theoretical reflection is not a straightforward matter. To be able to elaborate on the…
NASA Technical Reports Server (NTRS)
Decker, A. J.; Pao, Y.-H.; Claspy, P. C.
1978-01-01
The use of a phase-modulated reference wave for the electronic heterodyne recording and processing of a hologram is described. Heterodyne recording is used to eliminate the self-interference terms of a hologram and to create a Leith-Upatnieks hologram with coaxial object and reference waves. Phase modulation is also shown to be the foundation of a multiple-view hologram system. When combined with hologram scale transformations, heterodyne recording is the key to general optical processing. Spatial filtering is treated as an example.