Sample records for portable analyzer based

  1. DAVE user's manual [for analyzing FORTRAN programs, in FORTRAN for IBM 360 and 370]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGaffey, R.W.

    1980-05-01

    DAVE is a system for analyzing FORTRAN programs. It is designed to report the presence, or possible presence, of a wide variety of programming errors. In addition, it provides information on the usage of all local and global variables, and on the logical flow through a program. DAVE is written in FORTRAN and is designed for ease of portability.

  2. Effects of Processing and Powder Size on Microstructure and Reactivity in Arrested Reactive Milled Al + Ni

    DTIC Science & Technology

    2012-05-01

    reactive milled (RM) experiments forming nickel aluminides [3,4,6,8–10,12,15,16,18,19], titanium-based alloys [5] and combustion reactions in metal... highly heterogeneous and is refined during processing until reaction occurs. The refinement process consists of the cold welding of powder grains within... welding at the surface of deforming particles, which pro... [Table 2: Sample preparation measurements corresponding to the designed experiments presented]

  3. Highly Portable Airborne Multispectral Imaging System

    NASA Technical Reports Server (NTRS)

    Lehnemann, Robert; Mcnamee, Todd

    2001-01-01

    A portable instrumentation system is described that includes an airborne and a ground-based subsystem. It can acquire multispectral image data over swaths of terrain ranging in width from about 1.5 to 1 km. The system was developed especially for use in coastal environments and is well suited for performing remote sensing and general environmental monitoring. It includes a small, unpiloted, remotely controlled airplane that carries a forward-looking camera for navigation, three downward-looking monochrome video cameras for imaging terrain in three spectral bands, a video transmitter, and a Global Positioning System (GPS) receiver.

  4. Field-testing a portable wind tunnel for fine dust emissions

    USDA-ARS?s Scientific Manuscript database

    A portable wind tunnel has been developed to allow erodibility and dust emissions testing of soil surfaces, with the premise that dust concentration and properties are highly correlated with surface soil properties as modified by the crop management system. In this study we report on the field-testing ...

  5. [Rapid determination of volatile organic compounds in workplace air by portable gas chromatography-mass spectrometer].

    PubMed

    Zhu, H B; Su, C J; Tang, H F; Ruan, Z; Liu, D H; Wang, H; Qian, Y L

    2017-10-20

    Objective: To establish a method for rapid determination of 47 volatile organic compounds in workplace air using a portable gas chromatography-mass spectrometer (GC-MS). Methods: Mixed standard gases at different concentration levels were prepared by the static gas distribution method, with high-purity nitrogen as the dilution gas. The samples were injected into the GC-MS by a hand-held probe. Retention time and characteristic ions were used for qualitative analysis, and the internal standard method was used for quantitation. Results: The 47 poisonous substances were well separated and determined. The linear range of this method was 0.2-16.0 mg/m(3), and the relative standard deviation of 45 volatile organic compounds was 3.8%-15.8%. The average recovery was 79.3%-119.0%. Conclusion: The method is simple, accurate, and sensitive, with good separation and a short analysis time; it can be used for qualitative and quantitative analysis of volatile organic compounds in the workplace and supports the rapid identification and detection of occupational hazards.
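
    The internal-standard quantitation described above can be illustrated with a generic numeric sketch: the analyte/internal-standard peak-area ratio is calibrated against known concentrations, and unknowns are read back off that calibration line. All compound responses and concentrations below are hypothetical placeholders, not values from the study.

    ```python
    import numpy as np

    # Hypothetical calibration for one analyte against an internal standard (ISTD):
    # peak-area ratios measured at known concentration levels within the stated
    # 0.2-16.0 mg/m3 linear range.
    conc_analyte = np.array([0.2, 2.0, 4.0, 8.0, 16.0])   # mg/m3
    conc_istd = 4.0                                        # mg/m3, held constant
    area_ratio = np.array([0.05, 0.52, 1.01, 2.05, 4.10])  # analyte area / ISTD area

    # Straight-line fit: area_ratio = slope * (conc_analyte / conc_istd) + intercept
    slope, intercept = np.polyfit(conc_analyte / conc_istd, area_ratio, 1)

    def quantify(sample_area_ratio: float) -> float:
        """Return the analyte concentration (mg/m3) for a measured area ratio."""
        return (sample_area_ratio - intercept) / slope * conc_istd

    print(round(quantify(1.5), 2))  # unknown sample, roughly 5.8 mg/m3 with these placeholder data
    ```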

  6. Effects of varying oxygen partial pressure on molten silicon-ceramic substrate interactions

    NASA Technical Reports Server (NTRS)

    Ownby, D. P.; Barsoum, M. W.

    1980-01-01

    The silicon sessile drop contact angle was measured on hot pressed silicon nitride, silicon nitride coated on hot pressed silicon nitride, silicon carbide coated on graphite, and on Sialon to determine the degree to which silicon wets these substances. The post-sessile drop experiment samples were sectioned and photomicrographs were taken of the silicon-substrate interface to observe the degree of surface dissolution and degradation. Of these materials, silicon did not form a true sessile drop on the SiC on graphite due to infiltration of the silicon through the SiC coating, nor on the Sialon due to the formation of a more-or-less rigid coating on the liquid silicon. The greatest wetting was obtained on the coated Si3N4, with a value of 42 deg. The oxygen concentrations in a silicon ribbon furnace and in a sessile drop furnace were measured using a portable thoria-yttria solid solution electrolyte oxygen sensor. Oxygen partial pressures of 10 to the minus 7 power atm and 10 to the minus 8 power atm were obtained at the two facilities. These measurements are believed to represent nonequilibrium conditions.

  7. Stability, reliability and cross-mode correlations of tests in a recommended 8-minute performance assessment battery

    NASA Technical Reports Server (NTRS)

    Wilkes, R. L.; Kennedy, R. S.; Dunlap, W. P.; Lane, N. E.

    1986-01-01

    A need exists for an automated performance test system to study drugs, agents, treatments, and stresses of interest to the aviation, space, and environmental medical community. The purpose of the present study is to evaluate tests for inclusion in the NASA-sponsored Automated Performance Test System (APTS). Twenty-one subjects were tested over 10 replications with tests previously identified as good candidates for repeated-measures research. The tests were concurrently administered in paper-and-pencil and microcomputer modes. Performance scores for the two modes were compared. Data from trials 1 to 10 were examined for indications of test stability and reliability. Nine of the ten APT system tests achieved stability. Reliabilities were generally high. Cross-correlation of micro-based tests with traditional paper-and-pencil versions revealed similarity of content within tests in the different modes, and implied at least three cognition and two motor factors. This portable, inexpensive, rugged, computerized battery of tests is recommended for use in repeated-measures studies of environmental and drug effects on performance. Identification of other tests compatible with microcomputer testing and potentially capable of tapping previously unidentified factors is recommended. Documentation of APTS sensitivity to environmental agents is available for more than a dozen facilities and is reported briefly. Continuation of such validation remains critical in establishing the efficacy of APTS tests.

  8. SU-E-T-666: Radionuclides and Activity of the Patient Apertures Used in a Proton Beam of Wobbling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, B.Y.; Chen, H.H.; Tsai, H.Y.

    2015-06-15

    Purpose: To identify the radionuclides and quantify the activity of the patient apertures used in a 190-MeV proton beam of a wobbling system. Methods: A proton beam of the wobbling system in the first proton center in Taiwan, Chang Gung Memorial Hospital at Linkou, was used to bombard the patient apertures. The patient aperture was composed of 60.5% copper, 39.4% zinc, 0.05% iron, and 0.05% lead. A portable high-purity germanium (HPGe) coaxial detector was used to measure the spectra of the induced nuclides of the patient apertures. The analysis of the spectra and the identification of the radionuclides were preliminarily performed with the Nuclide Navigator III Master Library. On the basis of those results, we manually selected the reliable nuclides by gamma-ray energies, branching ratios, and half-lives. From the spectra, the activity of the radionuclides was quantified by the Monte Carlo efficiency transfer method. Results: The radioisotopes activated in patient apertures by the 190-MeV proton beam were divided into two categories. The first category is long half-life radionuclides, such as Co-56 (half-life, 77.3 days). The other radionuclides, Cu-60, Cu-61, Cu-62, Cu-66, and Zn-62, have shorter half-lives. The radionuclide Cu-60 had the highest activity. From calculation with the efficiency transfer method, the deviations between the computed results and the measured efficiencies were mostly within 10%. Conclusion: Identifying the radionuclides and quantifying the activity helps us estimate proper time intervals for cooling the patient apertures. This study was supported by grants from the Chang Gung Memorial Hospital (CMRPD1C0682).

  9. Demonstration of a real-time implementation of the ICVision holographic stereogram display

    NASA Astrophysics Data System (ADS)

    Kulick, Jeffrey H.; Jones, Michael W.; Nordin, Gregory P.; Lindquist, Robert G.; Kowel, Stephen T.; Thomsen, Axel

    1995-07-01

    There is increasing interest in real-time autostereoscopic 3D displays. Such systems allow 3D objects or scenes to be viewed by one or more observers with correct motion parallax without the need for glasses or other viewing aids. Potential applications of such systems include mechanical design, training and simulation, medical imaging, virtual reality, and architectural design. One approach to the development of real-time autostereoscopic display systems has been to develop real-time holographic display systems. The approach taken by most of these systems is to compute and display a number of holographic lines at one time, and then use a scanning system to replicate the images throughout the display region. The approach taken in the ICVision system being developed at the University of Alabama in Huntsville is very different. In the ICVision display, a set of discrete viewing regions called virtual viewing slits are created by the display. Each pixel is required to fill every viewing slit with different image data. When the images presented in two virtual viewing slits separated by an interocular distance are filled with stereoscopic pair images, the observer sees a 3D image. The images are computed so that a different stereo pair is presented each time the viewer moves one eye pupil diameter (approximately mm), thus providing a series of stereo views. Each pixel is subdivided into smaller regions, called partial pixels. Each partial pixel is filled with a diffraction grating designed to fill an individual virtual viewing slit. The sum of all the partial pixels in a pixel then fills all the virtual viewing slits. The final version of the ICVision system will form diffraction gratings in a liquid crystal layer on the surface of VLSI chips in real time. Processors embedded in the VLSI chips will compute the display in real time. In the current version of the system, a commercial AMLCD is sandwiched with a diffraction grating array. This paper discusses the design details of a portable 3D display based on the integration of a diffractive optical element with a commercial off-the-shelf AMLCD. The diffractive optic contains several hundred thousand partial-pixel gratings and the AMLCD modulates the light diffracted by the gratings.
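
    The partial-pixel idea above can be made concrete with the first-order grating equation, sin θ = λ/d: each partial pixel needs a grating period d chosen so that the diffracted light lands in its assigned viewing slit. The sketch below estimates d for an assumed wavelength, viewing distance, and slit offset; none of these numbers are parameters of the ICVision display.

    ```python
    import math

    wavelength = 550e-9      # m, green light (assumed)
    viewing_distance = 0.5   # m, display to viewing-slit plane (assumed)
    slit_offset = 0.03       # m, lateral offset of the target viewing slit (assumed)

    # Deflection angle needed to reach the slit, then the period from sin(theta) = lambda / d
    theta = math.atan2(slit_offset, viewing_distance)
    grating_period = wavelength / math.sin(theta)

    print(f"deflection angle: {math.degrees(theta):.2f} deg")
    print(f"required grating period: {grating_period * 1e6:.2f} um")   # on the order of micrometers
    ```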

  10. Feasibility evaluation of a neutron grating interferometer with an analyzer grating based on a structured scintillator.

    PubMed

    Kim, Youngju; Kim, Jongyul; Kim, Daeseung; Hussey, Daniel S; Lee, Seung Wook

    2018-03-01

    We introduce an analyzer grating based on a structured scintillator fabricated by a gadolinium oxysulfide powder filling method for a symmetric Talbot-Lau neutron grating interferometer. This is an alternative way to analyze the Talbot self-image of a grating interferometer without using an absorption grating to block neutrons. Since the structured scintillator analyzer grating itself generates the signal for neutron detection, we do not need an additional scintillator screen as an absorption analyzer grating. We have developed and tested an analyzer grating based on a structured scintillator in our symmetric Talbot-Lau neutron grating interferometer to produce high fidelity absorption, differential phase, and dark-field contrast images. The acquired images have been compared to results of a grating interferometer utilizing a typical absorption analyzer grating with two commercial scintillation screens. The analyzer grating based on the structured scintillator enhances interference fringe visibility and shows a great potential for economical fabrication, compact system design, and so on. We report the performance of the analyzer grating based on a structured scintillator and evaluate its feasibility for the neutron grating interferometer.
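
    The fringe visibility that the structured-scintillator analyzer grating is reported to enhance is conventionally defined as V = (Imax − Imin)/(Imax + Imin), and in Talbot-Lau interferometry it is usually extracted per detector pixel from a phase-stepping curve. The snippet below is a generic sketch of that extraction via the first Fourier component; it is not code from this study.

    ```python
    import numpy as np

    def visibility(stepping_curve: np.ndarray) -> float:
        """Fringe visibility from an equally spaced phase-stepping curve.

        Uses the mean (a0) and first-harmonic modulation (a1); for a sinusoidal
        curve this equals (Imax - Imin) / (Imax + Imin).
        """
        n = len(stepping_curve)
        a0 = stepping_curve.mean()
        a1 = np.abs(np.fft.rfft(stepping_curve)[1]) / n
        return 2 * a1 / a0

    # Synthetic 8-step curve: mean 1000 counts with 25% modulation
    steps = np.arange(8)
    curve = 1000 * (1 + 0.25 * np.cos(2 * np.pi * steps / 8))
    print(round(visibility(curve), 3))  # -> 0.25
    ```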

  11. Feasibility evaluation of a neutron grating interferometer with an analyzer grating based on a structured scintillator

    NASA Astrophysics Data System (ADS)

    Kim, Youngju; Kim, Jongyul; Kim, Daeseung; Hussey, Daniel S.; Lee, Seung Wook

    2018-03-01

    We introduce an analyzer grating based on a structured scintillator fabricated by a gadolinium oxysulfide powder filling method for a symmetric Talbot-Lau neutron grating interferometer. This is an alternative way to analyze the Talbot self-image of a grating interferometer without using an absorption grating to block neutrons. Since the structured scintillator analyzer grating itself generates the signal for neutron detection, we do not need an additional scintillator screen as an absorption analyzer grating. We have developed and tested an analyzer grating based on a structured scintillator in our symmetric Talbot-Lau neutron grating interferometer to produce high fidelity absorption, differential phase, and dark-field contrast images. The acquired images have been compared to results of a grating interferometer utilizing a typical absorption analyzer grating with two commercial scintillation screens. The analyzer grating based on the structured scintillator enhances interference fringe visibility and shows a great potential for economical fabrication, compact system design, and so on. We report the performance of the analyzer grating based on a structured scintillator and evaluate its feasibility for the neutron grating interferometer.

  12. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…

  13. Implementation of Complexity Analyzing Based on Additional Effect

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang

    According to Complexity Theory, complexity exists in a system when a functional requirement is not satisfied. Several studies have addressed Complexity Theory based on Axiomatic Design; however, they focus on reducing complexity, and none addresses a method for analyzing the complexity in a system. Therefore, this paper puts forth a method of analyzing complexity that is intended to make up for this deficiency. In order to discuss the method of analyzing complexity based on additional effect, the paper introduces two concepts: ideal effect and additional effect. The method of analyzing complexity based on additional effect combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ). It helps designers analyze complexity by using the additional effect. A case study shows the application of the process.

  14. ELISA-BASE: An Integrated Bioinformatics Tool for Analyzing and Tracking ELISA Microarray Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Collett, James L.; Seurynck-Servoss, Shannon L.

    ELISA-BASE is an open-source database for capturing, organizing and analyzing protein enzyme-linked immunosorbent assay (ELISA) microarray data. ELISA-BASE is an extension of the BioArray Software Environment (BASE) database system, which was developed for DNA microarrays. In order to make BASE suitable for protein microarray experiments, we developed several plugins for importing and analyzing quantitative ELISA microarray data. Most notably, our Protein Microarray Analysis Tool (ProMAT) for processing quantitative ELISA data is now available as a plugin to the database.

  15. Orthogonality Measurement for Homogenous Projects-Bases

    ERIC Educational Resources Information Center

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  16. Quantitative Analyses about Market- and Prevalence-Based Needs for Adapted Physical Education Teachers in the Public Schools in the United States

    ERIC Educational Resources Information Center

    Zhang, Jiabei

    2011-01-01

    The purpose of this study was to analyze quantitative needs for more adapted physical education (APE) teachers based on both market- and prevalence-based models. The market-based need for more APE teachers was examined based on APE teacher positions funded, while the prevalence-based need for additional APE teachers was analyzed based on students…

  17. Soil Studies: Applying Acid-Base Chemistry to Environmental Analysis.

    ERIC Educational Resources Information Center

    West, Donna M.; Sterling, Donna R.

    2001-01-01

    Laboratory activities for chemistry students focus attention on the use of acid-base chemistry to examine environmental conditions. After using standard laboratory procedures to analyze soil and rainwater samples, students use web-based resources to interpret their findings. Uses CBL probes and graphing calculators to gather and analyze data and…

  18. Assessing errors in the determination of base excess.

    PubMed

    Mentel, Alexander; Bach, Friedhelm; Schüler, Joerg; Herrmann, Walter; Koster, Andreas; Crystal, George J; Gatzounis, Georgios; Mertzlufft, Fritz

    2002-05-01

    We compared estimates for base excess of extracellular fluid (BE(ecf); mmol/L) obtained in five clinically used blood gas analyzers: AVL Compact 2 (Roche Diagnostics, Mannheim, Germany), Ciba-Corning 860 (Bayer Diagnostics, Fernwald, Germany), IL 1620 (Instrumentation Laboratories, Lexington, MA), Stat Profile Ultra (Nova Biomedical, Waltham, MA), and ABL 510 (Radiometer, Copenhagen, Denmark). A total of 134 measurements per analyzer were obtained in arterial and venous blood samples from 10 patients undergoing cardiac surgery and 65 measurements per analyzer in venous blood samples from 2 healthy volunteers. The blood samples were equilibrated in a tonometer with gases of known composition (37 degrees C). Additional theoretical studies were performed to evaluate the relationship between pH and calculated BE(ecf) value (with varied PCO(2)) using the formulas of the various analyzers. The standard deviations of repeated measurements were 0.24 mmol/L for ABL 510 and approximately 0.45 mmol/L for the other 4 analyzers. The maximal systematic difference between the average of all measurements of each analyzer was 3.7 mmol/L; this was primarily attributable to differences in measuring pH, and, to a lesser extent, to differences in calculation and determination of PCO(2). Comparison of the results from samples with different oxygen saturation showed that the relative alkalinity of deoxygenated hemoglobin (Haldane effect) can also influence the determinations of BE(ecf). A clinically useful way to quantify nonrespiratory disturbances of the acid-base balance is calculation of the base excess of extracellular fluid by using blood gas analyzers. In this study, we found significant variability in estimates of base excess of extracellular fluid obtained with five analyzers from different manufacturers. This variability is attributable to multiple factors, including lack of correction for deoxygenated hemoglobin (Haldane effect).
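
    For context, BE(ecf) is derived from the measured pH and PCO2: bicarbonate follows from the Henderson-Hasselbalch equation, and a Van Slyke-type approximation then gives the base excess. Small differences in the constants each manufacturer uses are one source of the inter-analyzer variability reported above. The sketch below uses one commonly cited parameterization purely for illustration; it is not the formula of any specific analyzer in this study.

    ```python
    def be_ecf(ph: float, pco2_mmhg: float) -> float:
        """Illustrative base excess of extracellular fluid (mmol/L).

        HCO3- from Henderson-Hasselbalch (pK' = 6.1, CO2 solubility 0.0307 mmol/L/mmHg),
        then a Van Slyke-type approximation; the constants are one common choice and
        differ slightly between analyzers.
        """
        hco3 = 0.0307 * pco2_mmhg * 10 ** (ph - 6.1)
        return hco3 - 24.8 + 16.2 * (ph - 7.40)

    print(round(be_ecf(7.40, 40.0), 1))   # close to zero for normal arterial values
    print(round(be_ecf(7.25, 40.0), 1))   # clearly negative: metabolic acidosis
    ```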

  19. [A heart function measuring and analyzing instrument based on single-chip microcomputer].

    PubMed

    Rong, Z; Liang, H; Wang, S

    1999-05-01

    This paper introduces a measuring and analyzing instrument based on a single-chip microcomputer, which provides sample acquisition, processing, control, adjustment, keyboard input, and printing. All information is provided and displayed in Chinese.

  20. Development of online NIR urine analyzing system based on AOTF

    NASA Astrophysics Data System (ADS)

    Wan, Feng; Sun, Zhendong; Li, Xiaoxia

    2006-09-01

    In this paper, some key techniques in the development of an on-line NIR urine analyzing system based on an AOTF (acousto-optic tunable filter) are introduced. Problems in designing the optical system, including collimation of the incident light and the working distance (the shortest distance for separating the incident and diffracted light), are analyzed and studied. A DDS (direct digital synthesizer) controlled by a microprocessor is used to realize the wavelength scan. The experimental results show that this NIR urine analyzing system based on an AOTF has a 10000-4000 cm(-1) wavelength range and a 0.3 ms wavelength switching rate. Compared with a conventional Fourier transform NIR spectrophotometer for analyzing multiple components in urine, this system features low cost, small volume, and an on-line measurement function. Unscrambler software (multivariate statistical software by CAMO Inc., Norway) is used for processing the data. This system can realize on-line quantitative analysis of protein, urea and creatinine in urine.

  1. Elemental analysis using temporal gating of a pulsed neutron generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitra, Sudeep

    Technologies related to determining the elemental composition of a sample that comprises fissile material are described herein. In a general embodiment, a pulsed neutron generator periodically emits bursts of neutrons and is synchronized with an analyzer circuit. The bursts of neutrons are used to interrogate the sample, and the sample outputs gamma rays based upon the neutrons impacting the sample. A detector outputs pulses based upon the gamma rays impinging upon the material of the detector, and the analyzer circuit assigns the pulses to temporally-based bins based upon the analyzer circuit being synchronized with the pulsed neutron generator. A computing device outputs data that is indicative of the elemental composition of the sample based upon the binned pulses.
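
    The temporal gating described in this record, assigning detector pulses to bins defined by the time elapsed since the most recent neutron burst, can be sketched generically as below. The burst period, bin width, and event times are made-up values for illustration and do not reproduce the actual analyzer-circuit logic.

    ```python
    import numpy as np

    burst_period_us = 100.0   # time between neutron bursts (assumed)
    bin_width_us = 5.0        # width of each temporal bin (assumed)
    n_bins = int(burst_period_us / bin_width_us)

    # Hypothetical gamma-ray pulse arrival times (microseconds) from the detector
    pulse_times = np.array([3.2, 12.7, 104.9, 151.3, 208.8, 399.5])

    # Fold each pulse onto the time since the last burst, then assign it to a bin
    time_in_cycle = np.mod(pulse_times, burst_period_us)
    bin_index = (time_in_cycle // bin_width_us).astype(int)

    counts_per_bin = np.bincount(bin_index, minlength=n_bins)
    print(bin_index)        # temporal bin of each pulse
    print(counts_per_bin)   # counts vs. time-after-burst, the basis for elemental analysis
    ```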

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanco, Arthur S.; Gerlagh, Reyer; Suh, Sangwon

    Chapter 5 analyzes the anthropogenic greenhouse gas (GHG) emission trends until the present and the main drivers that explain those trends. The chapter uses different perspectives to analyze past GHG-emissions trends, including aggregate emissions flows and per capita emissions, cumulative emissions, sectoral emissions, and territory-based vs. consumption-based emissions. In all cases, global and regional trends are analyzed. Where appropriate, the emission trends are contextualized with long-term historic developments in GHG emissions extending back to 1750.

  3. Imaging quality analysis of computer-generated holograms using the point-based method and slice-based method

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Chen, Siqing; Zheng, Huadong; Sun, Tao; Yu, Yingjie; Gao, Hongyue; Asundi, Anand K.

    2017-06-01

    Computer holography has made notable progress in recent years. The point-based method and the slice-based method are the chief algorithms for generating holograms in holographic display. Although both methods have been validated numerically and optically, the differences in imaging quality between them have not been specifically analyzed. In this paper, we analyze the imaging quality of computer-generated phase holograms generated by point-based Fresnel zone plates (PB-FZP), the point-based Fresnel diffraction algorithm (PB-FDA) and the slice-based Fresnel diffraction algorithm (SB-FDA). The calculation formulas and hologram generation with the three methods are demonstrated. In order to suppress speckle noise, sequential phase-only holograms are generated in our work. Numerically and experimentally reconstructed images are also presented. By comparing the imaging quality, the merits and drawbacks of the three methods are analyzed, and conclusions are given.
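
    In the point-based Fresnel zone plate (PB-FZP) method named above, each object point at depth z contributes a quadratic phase pattern on the hologram plane, φ(x, y) = π[(x − x0)² + (y − y0)²]/(λz). A minimal sketch of accumulating such contributions into a phase-only hologram follows; the pitch, resolution, and object points are assumed values rather than those used in the paper.

    ```python
    import numpy as np

    wavelength = 532e-9   # m (assumed)
    pitch = 8e-6          # hologram pixel pitch, m (assumed)
    nx = ny = 512         # hologram resolution (assumed)

    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)

    # Hypothetical object points: (x0, y0, z, amplitude)
    points = [(0.0, 0.0, 0.10, 1.0), (0.3e-3, -0.2e-3, 0.12, 0.8)]

    field = np.zeros((ny, nx), dtype=complex)
    for x0, y0, z, amp in points:
        phase = np.pi * ((X - x0) ** 2 + (Y - y0) ** 2) / (wavelength * z)
        field += amp * np.exp(1j * phase)

    # Phase-only (kinoform) hologram: keep the phase, discard the amplitude
    hologram_phase = np.angle(field)
    print(hologram_phase.shape, float(hologram_phase.min()), float(hologram_phase.max()))
    ```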

  4. Implementation of a digital evaluation platform to analyze bifurcation based nonlinear amplifiers

    NASA Astrophysics Data System (ADS)

    Feldkord, Sven; Reit, Marco; Mathis, Wolfgang

    2016-09-01

    Recently, nonlinear amplifiers based on the supercritical Andronov-Hopf bifurcation have become a focus of attention, especially in the modeling of the mammalian hearing organ. In general, to gain deeper insights into the input-output behavior, the analysis of bifurcation-based amplifiers requires a flexible framework to exchange equations and adjust certain parameters. A DSP implementation is presented which is capable of analyzing various amplifier systems. Amplifiers based on the Andronov-Hopf and Neimark-Sacker bifurcations are implemented and compared as examples. It is shown that the Neimark-Sacker system remarkably outperforms the Andronov-Hopf amplifier regarding CPU usage. Nevertheless, both show a similar input-output behavior over a wide parameter range. Combined with a USB-based control interface connected to a PC, the digital framework provides a powerful instrument for analyzing bifurcation-based amplifiers.
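
    The Andronov-Hopf amplifier mentioned here is usually written in normal form, ż = (μ + iω0)z − |z|²z + F·e^{iωt}; operated at the bifurcation point (μ ≈ 0) it responds compressively, with the output growing roughly as the cube root of a weak resonant input. The sketch below integrates that normal form with a simple Euler scheme in normalized units; it is not the DSP implementation described in the paper.

    ```python
    import numpy as np

    def hopf_response(forcing_amp, mu=0.0, omega0=1.0, omega=1.0,
                      dt=0.01, n_steps=50_000):
        """Late-time response amplitude of a driven Andronov-Hopf oscillator (normalized units)."""
        z = 0.0 + 0.0j
        for k in range(n_steps):
            t = k * dt
            dz = (mu + 1j * omega0) * z - abs(z) ** 2 * z \
                 + forcing_amp * np.exp(1j * omega * t)
            z += dt * dz
        return abs(z)

    # Compressive input-output curve at the bifurcation point (mu = 0)
    for f in (0.001, 0.01, 0.1, 1.0):
        print(f, round(hopf_response(f), 3))
    ```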

  5. A laser-based FAIMS detector for detection of ultra-low concentrations of explosives

    NASA Astrophysics Data System (ADS)

    Akmalov, Artem E.; Chistyakov, Alexander A.; Kotkovskii, Gennadii E.; Sychev, Alexey V.; Tugaenko, Anton V.; Bogdanov, Artem S.; Perederiy, Anatoly N.; Spitsyn, Eugene M.

    2014-06-01

    A non-contact method for analyzing traces of explosives on surfaces was developed. The method is based on laser desorption of the analyzed molecules from the surveyed surfaces followed by laser ionization of the air sample, combined with field asymmetric ion mobility spectrometry (FAIMS). The pulsed radiation of the fourth harmonic of a portable GSGG:Cr3+:Nd3+ laser (λ = 266 nm) is used. A laser desorption FAIMS analyzer has been developed; its detection limit is 40 pg for TNT. The results of detection of trinitrotoluene (TNT), cyclotrimethylenetrinitramine (RDX) and cyclotetramethylenetetranitramine (HMX) are presented. It is shown that laser desorption of nitro compounds from metals is accompanied by their surface decomposition. A method for detecting and analyzing small concentrations of explosives in air based on laser ionization and FAIMS was also developed. The method includes a highly efficient multipass optical scheme for intracavity fourth-harmonic generation of pulsed laser radiation (λ = 266 nm) and a field asymmetric ion mobility (FAIM) spectrometer disposed within the resonator. Ion formation and detection proceed inside the resonant cavity. A laser ion source based on multiple passes of the radiation at λ = 266 nm through the ionization region was developed. On the basis of this method the laser FAIMS analyzer has been created. The analyzer provides efficient detection of low concentrations of nitro compounds in air and shows a detection limit of 10(-14)-10(-15) g/cm3 for both RDX and TNT.

  6. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…

  7. Identifying Meta-Clusters of Students' Interest in Science and Their Change with Age

    ERIC Educational Resources Information Center

    Baram-Tsabari, Ayelet; Yarden, Anat

    2009-01-01

    Nearly 6,000 science questions collected from five different web-based, TV-based and school-based sources were rigorously analyzed in order to identify profiles of K-12 students' interest in science, and how these profiles change with age. The questions were analyzed according to their topic, thinking level, motivation for and level of autonomy in…

  8. Use of CellNetAnalyzer in biotechnology and metabolic engineering.

    PubMed

    von Kamp, Axel; Thiele, Sven; Hädicke, Oliver; Klamt, Steffen

    2017-11-10

    Mathematical models of the cellular metabolism have become an essential tool for the optimization of biotechnological processes. They help to obtain a systemic understanding of the metabolic processes in the used microorganisms and to find suitable genetic modifications maximizing the production performance. In particular, methods of stoichiometric and constraint-based modeling are frequently used in the context of metabolic and bioprocess engineering. Since metabolic networks can be complex and comprise hundreds or even thousands of metabolites and reactions, dedicated software tools are required for an efficient analysis. One such software suite is CellNetAnalyzer, a MATLAB package providing, among others, various methods for analyzing stoichiometric and constraint-based metabolic models. CellNetAnalyzer can be used via command-line based operations or via a graphical user interface with embedded network visualizations. Herein we will present key functionalities of CellNetAnalyzer for applications in biotechnology and metabolic engineering and thereby review constraint-based modeling techniques such as metabolic flux analysis, flux balance analysis, flux variability analysis, metabolic pathway analysis (elementary flux modes) and methods for computational strain design. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  9. Performance evaluation of tile-based Fisher Ratio analysis using a benchmark yeast metabolome dataset.

    PubMed

    Watson, Nathanial E; Parsons, Brendon A; Synovec, Robert E

    2016-08-12

    Performance of tile-based Fisher Ratio (F-ratio) data analysis, recently developed for discovery-based studies using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC×GC-TOFMS), is evaluated with a metabolomics dataset that had previously been analyzed in great detail, albeit with a brute-force approach. The previously analyzed data (referred to herein as the benchmark dataset) were intracellular extracts from Saccharomyces cerevisiae (yeast), either metabolizing glucose (repressed) or ethanol (derepressed), which define the two classes in the discovery-based analysis to find metabolites that are statistically different in concentration between the two classes. Beneficially, this previously analyzed dataset provides a concrete means to validate the tile-based F-ratio software. Herein, we demonstrate and validate the significant benefits of applying tile-based F-ratio analysis. The yeast metabolomics data are analyzed more rapidly, in about one week versus one year for the prior studies with this dataset. Furthermore, a null distribution analysis is implemented to statistically determine an adequate F-ratio threshold, whereby the variables with F-ratio values below the threshold can be ignored as not class distinguishing, which provides the analyst with confidence when analyzing the hit table. Forty-six of the fifty-four benchmarked changing metabolites were discovered by the new methodology while consistently excluding all but one of the nineteen benchmarked false positive metabolites previously identified. Copyright © 2016 Elsevier B.V. All rights reserved.
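
    Generically, the F-ratio for each variable is the between-class variance divided by the within-class variance, and the null-distribution threshold mentioned above is obtained by recomputing F after shuffling the class labels many times. The code below is a schematic two-class example on synthetic data; it is not the tile-based GC×GC-TOFMS software itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def f_ratio(a: np.ndarray, b: np.ndarray) -> float:
        """Between-class over within-class variance for one variable, two classes."""
        grand = np.concatenate([a, b]).mean()
        between = len(a) * (a.mean() - grand) ** 2 + len(b) * (b.mean() - grand) ** 2
        within = (np.sum((a - a.mean()) ** 2) + np.sum((b - b.mean()) ** 2)) / (len(a) + len(b) - 2)
        return between / within

    # Synthetic signal for a "repressed" vs. "derepressed" class (assumed data)
    class_a = rng.normal(10.0, 1.0, size=6)
    class_b = rng.normal(12.0, 1.0, size=6)
    observed = f_ratio(class_a, class_b)

    # Null distribution: shuffle the class labels and recompute the F-ratio
    pooled = np.concatenate([class_a, class_b])
    null = []
    for _ in range(2000):
        rng.shuffle(pooled)
        null.append(f_ratio(pooled[:6], pooled[6:]))
    threshold = np.quantile(null, 0.95)   # variables below this are treated as not class-distinguishing

    print(round(observed, 1), round(threshold, 1), observed > threshold)
    ```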

  10. Application of wireless sensor network technology in logistics information system

    NASA Astrophysics Data System (ADS)

    Xu, Tao; Gong, Lina; Zhang, Wei; Li, Xuhong; Wang, Xia; Pan, Wenwen

    2017-04-01

    This paper introduces the basic concepts of active RFID based on wireless sensor networks (WSN-ARFID) and analyzes the shortcomings of existing RFID-based logistics monitoring systems. Wireless sensor network technology is integrated with RFID technology. A new real-time logistics detection system based on WSN and RFID is proposed as a model of a WSN-ARFID logistics system, and the feasibility of applying this technology to the logistics field is analyzed.

  11. Center for development technology and program in technology and human affairs. [emphasizing technology-based networks

    NASA Technical Reports Server (NTRS)

    Wong, M. D.

    1974-01-01

    The role of technology in nontraditional higher education, with particular emphasis on technology-based networks, is analyzed. Nontraditional programs, institutions, and consortia are briefly reviewed. Nontraditional programs which utilize technology are studied. Technology-based networks are surveyed and analyzed with regard to kinds of students, learning locations, technology utilization, interinstitutional relationships, cost aspects, problems, and future outlook.

  12. Game-Based Approaches, Pedagogical Principles and Tactical Constraints: Examining Games Modification

    ERIC Educational Resources Information Center

    Serra-Olivares, Jaime; García-López, Luis M.; Calderón, Antonio

    2016-01-01

    The purpose of this study was to analyze the effect of modification strategies based on the pedagogical principles of the Teaching Games for Understanding approach on tactical constraints of four 3v3 soccer small-sided games. The game performance of 21 U-10 players was analyzed in a game similar to the adult game; one based on keeping-the-ball;…

  13. Analyzing Hedges in Verbal Communication: An Adaptation-Based Approach

    ERIC Educational Resources Information Center

    Wang, Yuling

    2010-01-01

    Based on Adaptation Theory, the article analyzes the production process of hedges. The procedure consists of the continuous making of choices in linguistic forms and communicative strategies. These choices are made just for adaptation to the contextual correlates. Besides, the adaptation process is dynamic, intentional and bidirectional.

  14. Cryptanalysis of "an improvement over an image encryption method based on total shuffling"

    NASA Astrophysics Data System (ADS)

    Akhavan, A.; Samsudin, A.; Akhshani, A.

    2015-09-01

    In the past two decades, several image encryption algorithms based on chaotic systems have been proposed. Many of the proposed algorithms are meant to improve other chaos-based and conventional cryptographic algorithms, yet many of the proposed improvement methods suffer from serious security problems. In this paper, the security of a recently proposed improvement method for a chaos-based image encryption algorithm is analyzed. The results indicate the weakness of the analyzed algorithm against chosen plain-text attacks.

  15. Economic Feasibility of Wireless Sensor Network-Based Service Provision in a Duopoly Setting with a Monopolist Operator.

    PubMed

    Sanchis-Cano, Angel; Romero, Julián; Sacoto-Cabrera, Erwin J; Guijarro, Luis

    2017-11-25

    We analyze the feasibility of providing Wireless Sensor Network-data-based services in an Internet of Things scenario from an economic point of view. The scenario has two competing service providers with their own private sensor networks, a network operator and final users. The scenario is analyzed as two games using game theory. In the first game, sensors decide to subscribe or not to the network operator to upload the collected sensing data, based on a utility function related to the mean service time and the price charged by the operator. In the second game, users decide to subscribe or not to the sensor-data-based service of the service providers, based on a Logit discrete choice model related to the quality of the data collected and the subscription price. The sink and user subscription stages are analyzed using population games and discrete choice models, while the network operator and service provider pricing stages are analyzed using optimization and Nash equilibrium concepts, respectively. The model is shown to be feasible from an economic point of view for all the actors if there are enough interested final users, and it opens the possibility of developing more efficient models with different types of services.
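
    The second-stage user decision described here takes the standard multinomial Logit form, P_i = e^{V_i} / Σ_j e^{V_j}, with utilities increasing in data quality and decreasing in subscription price. The snippet below is a generic illustration with assumed utility coefficients; it does not reproduce the paper's calibrated model.

    ```python
    import numpy as np

    def logit_shares(qualities, prices, beta_q=1.0, beta_p=0.5, v_opt_out=0.0):
        """Multinomial Logit subscription shares for competing providers plus an opt-out option."""
        v = beta_q * np.asarray(qualities, dtype=float) - beta_p * np.asarray(prices, dtype=float)
        expv = np.exp(np.append(v, v_opt_out))   # last entry: no subscription
        return expv / expv.sum()

    # Two competing service providers with assumed quality and price levels
    shares = logit_shares(qualities=[3.0, 2.5], prices=[4.0, 2.0])
    print(shares.round(3))   # [provider 1, provider 2, no subscription]
    ```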

  16. Analysis of Combined Data from Heterogeneous Study Designs: A Methodological Proposal from the Patient Navigation Research program

    PubMed Central

    Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; PNRP Design and Analysis Committee

    2013-01-01

    Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: (1) analyzing and reporting each project separately, (2) combining data from all projects and performing an individual-level analysis, (3) pooling data from projects having similar study designs, (4) analyzing pooled data using a prospective meta-analytic technique, and (5) analyzing pooled data utilizing a novel simulated group-randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in their impact from differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs. PMID:22273587

  17. Analysis of combined data from heterogeneous study designs: an applied example from the patient navigation research program.

    PubMed

    Roetzheim, Richard G; Freund, Karen M; Corle, Don K; Murray, David M; Snyder, Frederick R; Kronman, Andrea C; Jean-Pierre, Pascal; Raich, Peter C; Holden, Alan Ec; Darnell, Julie S; Warren-Mears, Victoria; Patierno, Steven

    2012-04-01

    The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, with similar clinical criteria but with different study designs. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from the members of the PNRP Design and Analysis Committee. To review possible methodologies for analyzing combined data arising from heterogeneous study designs. The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. The conclusions were based on simple consensus. The five approaches reviewed included the following: (1) analyzing and reporting each project separately, (2) combining data from all projects and performing an individual-level analysis, (3) pooling data from projects having similar study designs, (4) analyzing pooled data using a prospective meta-analytic technique, and (5) analyzing pooled data utilizing a novel simulated group-randomized design. Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and to accommodate differing project sample sizes. The conclusions reached were based on expert opinion and not derived from actual analyses performed. The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multisite community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs.

  18. Quality requirements for veterinary hematology analyzers in small animals-a survey about veterinary experts' requirements and objective evaluation of analyzer performance based on a meta-analysis of method validation studies: bench top hematology analyzer.

    PubMed

    Cook, Andrea M; Moritz, Andreas; Freeman, Kathleen P; Bauer, Natali

    2016-09-01

    Scarce information exists about quality requirements and objective evaluation of performance of large veterinary bench top hematology analyzers. The study was aimed at comparing the observed total error (TEobs) derived from meta-analysis of published method validation data to the total allowable error (TEa) for veterinary hematology variables in small animals based on experts' opinions. Ideally, TEobs should be < TEa. An online survey was sent to veterinary experts in clinical pathology and small animal internal medicine for providing the maximal allowable deviation from a given result for each variable. Percent of TEa = (allowable median deviation/clinical threshold) * 100%. Second, TEobs for 3 laser-based bench top hematology analyzers (ADVIA 2120, Sysmex XT2000iV, and CellDyn 3500) was calculated based on method validation studies published between 2005 and 2013 (n = 4). The percent TEobs = 2 * CV (%) + bias (%). The CV was derived from published studies except for the ADVIA 2120 (internal data), and bias was estimated from the regression equation. A total of 41 veterinary experts (19 diplomates, 8 residents, 10 postgraduate students, 4 anonymous specialists) responded. The proposed range of TEa was wide, but generally ≤ 20%. The TEobs was < TEa for all variables and analyzers except for canine and feline HGB (high bias, low CV) and platelet counts (high bias, high CV). Overall, veterinary bench top analyzers fulfilled experts' requirements except for HGB, due to method-related bias, and platelet counts, due to known preanalytic/analytic issues. © 2016 American Society for Veterinary Clinical Pathology.
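
    The two quantities compared in this survey follow directly from the formulas quoted above: percent TEa = (allowable median deviation / clinical threshold) × 100% and percent TEobs = 2 × CV(%) + bias(%). A small helper capturing that comparison is sketched below; the example numbers are placeholders, not values from the meta-analysis.

    ```python
    def percent_tea(allowable_median_deviation: float, clinical_threshold: float) -> float:
        """Total allowable error (%) from the experts' allowable deviation at a clinical threshold."""
        return allowable_median_deviation / clinical_threshold * 100.0

    def percent_teobs(cv_percent: float, bias_percent: float) -> float:
        """Observed total error (%) from a method validation study's imprecision and bias."""
        return 2.0 * cv_percent + abs(bias_percent)

    # Placeholder example: does the observed performance meet the experts' requirement?
    tea = percent_tea(allowable_median_deviation=1.0, clinical_threshold=5.0)   # -> 20.0
    teobs = percent_teobs(cv_percent=3.0, bias_percent=5.0)                     # -> 11.0
    print(teobs < tea)   # True: TEobs within TEa
    ```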

  19. Microprocessor-Based Neural-Pulse-Wave Analyzer

    NASA Technical Reports Server (NTRS)

    Kojima, G. K.; Bracchi, F.

    1983-01-01

    A microprocessor-based system analyzes the amplitudes and rise times of neural waveforms. Displaying histograms of the measured parameters helps researchers determine how many nerves contribute to the signal and specify the waveform characteristics of each. The results are improved noise rejection, full or partial separation of overlapping peaks, and isolation and identification of related peaks in different histograms.

  20. Analysis of bHLH coding genes using gene co-expression network approach.

    PubMed

    Srivastava, Swati; Sanchita; Singh, Garima; Singh, Noopur; Srivastava, Gaurava; Sharma, Ashok

    2016-07-01

    Network analysis provides a powerful framework for the interpretation of data. It uses novel reference network-based metrics for module evolution, which can be used to identify modules of highly connected genes showing variation in a co-expression network. In this study, a co-expression network-based approach was used for analyzing genes from microarray data. Our approach consists of a simple but robust rank-based network construction. The publicly available gene expression data of Solanum tuberosum under cold and heat stresses were considered to create and analyze a gene co-expression network. The analysis provides highly co-expressed modules of bHLH-coding genes based on correlation values. Our approach analyzes the variation of gene expression according to the time period of stress through a co-expression network approach. As a result, seed genes were identified that show multiple connections with other genes in the same cluster. Seed genes were found to vary across different time periods of stress. These seed genes may be utilized further as marker genes for developing stress-tolerant plant species.

  1. Handheld Fluorescence Microscopy based Flow Analyzer.

    PubMed

    Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva

    2016-03-01

    Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence-based clinical microscopy and diagnostics is a manual, labour-intensive and time-consuming procedure. The article outlines a cost-effective, high-throughput alternative to conventional fluorescence imaging techniques. With system-level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence microscopy based imaging flow analyzer. Using this system we have imaged more than 2900 FITC-labeled fluorescent beads per minute, demonstrating the high-throughput characteristics of our flow analyzer in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image-based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and the flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.

  2. ProstateAnalyzer: Web-based medical application for the management of prostate cancer using multiparametric MR imaging.

    PubMed

    Mata, Christian; Walker, Paul M; Oliver, Arnau; Brunotte, François; Martí, Joan; Lalande, Alain

    2016-01-01

    In this paper, we present ProstateAnalyzer, a new web-based medical tool for prostate cancer diagnosis. ProstateAnalyzer allows the visualization and analysis of magnetic resonance images (MRI) in a single framework. ProstateAnalyzer recovers the data from a PACS server and displays all the associated MRI images in the same framework, usually consisting of 3D T2-weighted imaging for anatomy, dynamic contrast-enhanced MRI for perfusion, diffusion-weighted imaging in the form of an apparent diffusion coefficient (ADC) map and MR spectroscopy. ProstateAnalyzer allows annotating regions of interest in one sequence and propagates them to the others. For a representative case, the results using the four visualization platforms are fully detailed, showing the interaction among them. The tool has been implemented as a Java-based applet application to facilitate its portability to different computer architectures and software and to allow working remotely via the web. ProstateAnalyzer enables experts to manage prostate cancer patient data sets more efficiently. The tool allows experts to delineate annotations and displays all the required information for use in diagnosis. According to the current European Society of Urogenital Radiology guidelines, it also includes the PI-RADS structured reporting scheme.

  3. The FNS-based analyzing the EEG to diagnose the bipolar affective disorder

    NASA Astrophysics Data System (ADS)

    Panischev, Yu; Panischeva, S. N.; Demin, S. A.

    2015-11-01

    Here we demonstrate the capability of a method based on Flicker-Noise Spectroscopy (FNS) for analyzing the manifestation of bipolar affective disorder (BAD) in the EEG. Generally, the EEG of a BAD patient does not show visible differences from a healthy EEG. Analyzing the behavior of the FNS parameters and the structure of 3D cross-correlators allows the differential characteristics of BAD to be discovered. The cerebral cortex electrical activity of BAD patients shows a specific collective dynamics and configuration of the FNS characteristics in comparison with healthy subjects.

  4. Study on combat effectiveness of air defense missile weapon system based on queuing theory

    NASA Astrophysics Data System (ADS)

    Zhao, Z. Q.; Hao, J. X.; Li, L. J.

    2017-01-01

    Queuing theory is a method for analyzing the combat effectiveness of air defense missile weapon systems. A service-probability model based on queuing theory was constructed and applied to analyzing the combat effectiveness of the "Sidewinder" and "Tor-M1" air defense missile weapon systems. Finally, for different target densities, the combat effectiveness of different combat units of the two types of air defense missile weapon systems is calculated. This method can be used to analyze the usefulness of air defense missile weapon systems.
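
    As an illustration of the kind of service-probability calculation queuing theory supplies, the sketch below evaluates the Erlang-B blocking probability for a battery with a given number of fire channels and Poisson target arrivals. The model structure and every number here are assumptions made for illustration, not values from the paper.

    ```python
    def erlang_b(offered_load: float, channels: int) -> float:
        """Erlang-B blocking probability, computed with the standard stable recursion."""
        b = 1.0
        for k in range(1, channels + 1):
            b = offered_load * b / (k + offered_load * b)
        return b

    # Assumed scenario: targets arrive at 0.5 per minute, mean engagement time 2 minutes
    arrival_rate = 0.5        # targets per minute
    engagement_time = 2.0     # minutes per engaged target
    offered_load = arrival_rate * engagement_time   # in Erlangs

    for channels in (1, 2, 4):
        p_blocked = erlang_b(offered_load, channels)
        print(channels, round(1.0 - p_blocked, 3))   # probability an arriving target is engaged
    ```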

  5. On-Line Analyzer For Monitoring Trace Amounts Of Oil In Turbid Waters

    NASA Astrophysics Data System (ADS)

    Niemela, P.; Jaatinen, J.

    1986-05-01

    This report presents an automated analyzer which continuously monitors the oil content of a sample water stream that flows through the analyzer. The measuring principle is based on the absorption of infrared radiation by oil molecules contained in the sample water. The wavelength band used in the measurement is at 3.4 μm, where different types of oils show nearly equal absorption. Another wavelength band at 3.6 μm, where oil has no absorption, is used to compensate for the effect of turbidity, which is due to solid particles and oil droplets contained in the sample water. Before entering the analyzer, the sample water flow is properly homogenized. To compensate for the strong absorption by water molecules in these wavelength bands, the sample water is compared with reference water. This is done by directing them alternately through the same measuring cell. The reference water is obtained from the sample water by ultrafiltration and it determines the baseline for the contaminated sample water. To ensure the stability of the baseline, temperature and pressure differences between the two waters are kept within adequate ranges. Areas of application of the analyzer are wide-ranging, from ships' discharge waters to the waste waters of industrial processes. The first application of the analyzer is on board oil tankers, to control the discharge of bilge and ballast waters. The analyzer is the first that fully corresponds to the stringent regulations for oil content monitors set by the International Maritime Organization (IMO). Pilot installations of the analyzer have been made at industrial plants.

  6. A Re-Examination of the Argument against Problem-Based Learning in the Classroom

    ERIC Educational Resources Information Center

    Bryant, Lauren H.

    2011-01-01

    The primary purpose of this study is to examine Kirschner, Sweller, and Clark's (2006) argument against problem-based learning (PBL) by analyzing research used to support their stance. The secondary purpose is to develop a definition of PBL that helps practitioners use this technique. Seven studies were analyzed to determine whether the PBL…

  7. Parent Implementation of Function-Based Intervention to Reduce Children's Challenging Behavior: A Literature Review

    ERIC Educational Resources Information Center

    Fettig, Angel; Barton, Erin E.

    2014-01-01

    The purpose of this literature review was to analyze the research on parent-implemented functional assessment (FA)-based interventions for reducing children's challenging behaviors. Thirteen studies met the review inclusion criteria. These studies were analyzed across independent variables, types of parent coaching and support provided,…

  8. Using Response Ratios for Meta-Analyzing Single-Case Designs with Behavioral Outcomes

    ERIC Educational Resources Information Center

    Pustejovsky, James E.

    2018-01-01

    Methods for meta-analyzing single-case designs (SCDs) are needed to inform evidence-based practice in clinical and school settings and to draw broader and more defensible generalizations in areas where SCDs comprise a large part of the research base. The most widely used outcomes in single-case research are measures of behavior collected using…

  9. An Analysis of the Science Curricula in Turkey with Respect to Spiral Curriculum Approach

    ERIC Educational Resources Information Center

    Yumusak, Güngör Keskinkiliç

    2016-01-01

    This paper aims to analyze the science curricula being implemented in Turkey with respect to the spiral curriculum approach. To carry out this analysis, the 3rd, 4th, 5th, 6th, 7th and 8th grade education programs are analyzed correlatively based on a qualitative research method. The research findings were analyzed in terms of iterative revisiting of…

  10. Economic Feasibility of Wireless Sensor Network-Based Service Provision in a Duopoly Setting with a Monopolist Operator

    PubMed Central

    Romero, Julián; Sacoto-Cabrera, Erwin J.

    2017-01-01

    We analyze the feasibility of providing Wireless Sensor Network-data-based services in an Internet of Things scenario from an economic point of view. The scenario has two competing service providers with their own private sensor networks, a network operator and final users. The scenario is analyzed as two games using game theory. In the first game, sensors decide to subscribe or not to the network operator to upload the collected sensing data, based on a utility function related to the mean service time and the price charged by the operator. In the second game, users decide to subscribe or not to the sensor-data-based service of the service providers, based on a Logit discrete choice model related to the quality of the data collected and the subscription price. The sink and user subscription stages are analyzed using population games and discrete choice models, while the network operator and service provider pricing stages are analyzed using optimization and Nash equilibrium concepts, respectively. The model is shown to be feasible from an economic point of view for all the actors if there are enough interested final users, and it opens the possibility of developing more efficient models with different types of services. PMID:29186847

  11. Novel indexes based on network structure to indicate financial market

    NASA Astrophysics Data System (ADS)

    Zhong, Tao; Peng, Qinke; Wang, Xiao; Zhang, Jing

    2016-02-01

    There have been various achievements in understanding and analyzing the financial market with complex network models. However, current studies analyze the financial network model but seldom present quantified indexes to indicate or forecast the price action of the market. In this paper, the stock market is modeled as a dynamic network, in which the vertices refer to listed companies and edges refer to their rank-based correlation derived from price series. Characteristics of the network are analyzed, and novel indexes are then introduced into market analysis, calculated from the maximum and fully-connected subnets. The indexes are compared with existing ones, and the results confirm that our indexes perform better at indicating the daily trend of the market composite index in advance. Via investment simulation, the performance of our indexes is analyzed in detail. The results indicate that the dynamic complex network model can not only serve as a structural description of the financial market, but also work to predict the market and guide investment through these indexes.
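
    The rank-based correlation network described above can be sketched as follows: compute Spearman correlations between the return series of listed companies, keep edges whose absolute correlation exceeds a threshold, and derive a structural index (here, simply the size of the largest connected subnet). The data are synthetic and the 0.3 threshold is an assumption; the paper's actual index definitions are not reproduced.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    n_stocks, n_days = 20, 250
    # Synthetic daily returns with a common market factor (assumed data)
    market = rng.normal(0, 1, n_days)
    returns = 0.5 * market + rng.normal(0, 1, (n_stocks, n_days))

    corr, _ = spearmanr(returns, axis=1)   # rank-based correlation matrix (rows = stocks)
    adjacency = (np.abs(corr) > 0.3) & ~np.eye(n_stocks, dtype=bool)

    def largest_component(adj: np.ndarray) -> int:
        """Size of the largest connected component of an undirected adjacency matrix."""
        unvisited, best = set(range(len(adj))), 0
        while unvisited:
            stack, seen = [unvisited.pop()], 1
            while stack:
                node = stack.pop()
                for nbr in np.flatnonzero(adj[node]):
                    if nbr in unvisited:
                        unvisited.remove(nbr)
                        stack.append(nbr)
                        seen += 1
            best = max(best, seen)
        return best

    print(largest_component(adjacency))   # one candidate structural index
    ```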

  12. Optical vector network analyzer with improved accuracy based on polarization modulation and polarization pulling.

    PubMed

    Li, Wei; Liu, Jian Guo; Zhu, Ning Hua

    2015-04-15

    We report a novel optical vector network analyzer (OVNA) with improved accuracy based on polarization modulation and stimulated Brillouin scattering (SBS) assisted polarization pulling. The beating between adjacent higher-order optical sidebands which are generated because of the nonlinearity of an electro-optic modulator (EOM) introduces considerable error to the OVNA. In our scheme, the measurement error is significantly reduced by removing the even-order optical sidebands using polarization discrimination. The proposed approach is theoretically analyzed and experimentally verified. The experimental results show that the accuracy of the OVNA is greatly improved compared to a conventional OVNA.

  13. Model Configuration and Innovative Design of College Students’ Ecological Civilization Construction

    NASA Astrophysics Data System (ADS)

    Chengwen, Yang

    2018-02-01

    This study is based on Marxist eco-civilization thought and, combined with eco-management theory, puts forward solutions for college students' ecological civilization construction. From the perspective of ecological management theory, the paper analyzes the main elements of the eco-civilization construction of college students, which fall into five main categories. Based on this analysis, a model of college students' eco-civilization grounded in eco-management theory is constructed, and concrete methods for improving it are put forward.

  14. Turkish Student Teachers' Attitudes toward Teaching in University-Based and Alternative Certification Programs in Turkey

    ERIC Educational Resources Information Center

    Aksoy, Erdem

    2017-01-01

    The objective of this study is to comparatively analyze the university-based and alternative teacher certification systems in Turkey in terms of the attitudes of trainee teachers toward the teaching profession, explore the reasons for choosing teaching as a career, and analyze attitudes by gender, department, and graduating faculty type in…

  15. ForestCrowns: a software tool for analyzing ground-based digital photographs of forest canopies

    Treesearch

    Matthew F. Winn; Sang-Mook Lee; Phillip A. Araman

    2013-01-01

    Canopy coverage is a key variable used to characterize forest structure. In addition, the light transmitted through the canopy is an important ecological indicator of plant and animal habitat and understory climate conditions. A common ground-based method used to document canopy coverage is to take digital photographs from below the canopy. To assist with analyzing...

  16. Analyzing Interactions by an IIS-Map-Based Method in Face-to-Face Collaborative Learning: An Empirical Study

    ERIC Educational Resources Information Center

    Zheng, Lanqin; Yang, Kaicheng; Huang, Ronghuai

    2012-01-01

    This study proposes a new method named the IIS-map-based method for analyzing interactions in face-to-face collaborative learning settings. This analysis method is conducted in three steps: firstly, drawing an initial IIS-map according to collaborative tasks; secondly, coding and segmenting information flows into information items of IIS; thirdly,…

  17. Construction of a quartz spherical analyzer: application to high-resolution analysis of the Ni Kα emission spectrum

    DOE PAGES

    Honnicke, Marcelo Goncalves; Bianco, Leonardo M.; Ceppi, Sergio A.; ...

    2016-08-10

    The construction and characterization of a focusing X-ray spherical analyzer based on α-quartz $4\bar{4}04$ are presented. For this study, the performance of the analyzer was demonstrated by applying it to a high-resolution X-ray spectroscopy study of the Kα1,2 emission spectrum of Ni. An analytical representation based on physical grounds was assumed to model the shape of the X-ray emission lines. Satellite structures assigned to 3d spectator hole transitions were resolved and determined, as well as their relative contribution to the emission spectrum. The present results on 1s^-1 3d^-1 shake probabilities support a recently proposed calculation framework based on a multi-configuration atomic model.

  20. A Morphological Analyzer for Vocalized or Not Vocalized Arabic Language

    NASA Astrophysics Data System (ADS)

    El Amine Abderrahim, Med; Breksi Reguig, Fethi

    This research shows the realization of a morphological analyzer for the Arabic language (vocalized or not). The analyzer is based upon our object model for Arabic Natural Language Processing (NLP) and can be exploited by NLP applications such as machine translation, orthographic correction and information retrieval.

  1. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable Document Format file.

  2. RF Systems in Space. Volume I. Space Antennas Frequency (SARF) Simulation.

    DTIC Science & Technology

    1983-04-01

    Lens SBR designs were investigated. The survivability of an SBR system was analyzed. The design of ground-based SBR validation experiments for large-aperture SBR concepts was investigated. SBR designs were investigated for ground target detection.

  3. Modeling of Electrocardiograph Telediagnosing System Based on Petri Net

    NASA Astrophysics Data System (ADS)

    Hu, Wensong; Li, Ming; Li, Lan

    This paper analyzes the characteristics of the electrocardiograph telediagnosing system. Firstly, we introduce the system and Petri nets. Secondly, we build a topological diagram of the system. Then we use Petri nets to represent the physical process of the system. Finally, we verify the model of the electrocardiograph telediagnosing system. With the help of the Petri net-based model, we analyze the system's performance and feasibility.

  4. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    PubMed

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches can be distinguished for FD: signal processing based FD and model-based FD. The former concerns developing algorithms to directly infer faults from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal processing based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS which utilizes physically based water quality and hydraulic simulation models can outperform the signal processing based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
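
    A minimal sketch of the model-based, multi-site idea: compare simulated and observed readings at all monitoring stations jointly and raise an alarm on the combined standardized residual. The simulated values, noise levels and threshold below are placeholders, not the authors' EDS:

        import numpy as np

        def multi_site_alarm(observed, simulated, sigma, z_threshold=3.0):
            """observed, simulated: (n_stations,) readings at one time step;
            sigma: expected noise std per station. Alarms when the root-mean-square
            standardized residual over all stations is too large."""
            z = (observed - simulated) / sigma
            joint_score = np.sqrt(np.mean(z ** 2))
            return joint_score > z_threshold, joint_score

        # Hypothetical example: 4 stations; an event shifts stations 2 and 3 away
        # from the hydraulic/water-quality model prediction.
        simulated = np.array([0.50, 0.48, 0.52, 0.49])   # predicted chlorine, mg/L
        observed = np.array([0.51, 0.30, 0.28, 0.50])
        print(multi_site_alarm(observed, simulated, sigma=np.full(4, 0.02)))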

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taha, Mohd F., E-mail: faisalt@petronas.com.my; Shaharun, Maizatul S.; Shuib, Anis Suhaila, E-mail: anisuha@petronas.com.my

    An attempt was made to investigate the potential of rice husk-based activated carbon as an alternative low-cost adsorbent for the removal of Ni(II), Zn(II) and Pb(II) ions from single aqueous solution. Rice husk-based activated carbon was prepared via treatment of rice husk with NaOH followed by carbonization at 400°C for 2 hours. Three samples, i.e. raw rice husk, rice husk treated with NaOH and rice husk-based activated carbon, were analyzed for their morphological characteristics using a field-emission scanning electron microscope/energy dispersive X-ray (FESEM/EDX) system. These samples were also analyzed for their carbon, hydrogen, nitrogen, oxygen and silica contents using a CHN elemental analyzer and FESEM/EDX. The porous properties of the rice husk-based activated carbon were determined by a Brunauer-Emmett-Teller (BET) surface area analyzer; its surface area and pore volume were 255 m²/g and 0.17 cm³/g, respectively. The adsorption studies for the removal of Ni(II), Zn(II) and Pb(II) ions from single metal aqueous solution were carried out at a fixed initial metal ion concentration (150 ppm) with varying amounts of adsorbent (rice husk-based activated carbon) as a function of contact time at room temperature. The concentration of each metal ion was analyzed using an atomic absorption spectrophotometer (AAS). The results obtained from the adsorption studies indicate the potential of rice husk as an economically promising precursor for the preparation of activated carbon for the removal of Ni(II), Zn(II) and Pb(II) ions from single aqueous solution. Isotherm and kinetic model analyses suggested that the experimental data fitted well with the Langmuir, Freundlich and second-order kinetic models.
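
    A minimal sketch (with made-up data) of fitting the Langmuir and Freundlich isotherms mentioned above to equilibrium adsorption measurements using SciPy; the starting values and data are assumptions:

        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(Ce, qmax, KL):
            """Langmuir isotherm: qe = qmax*KL*Ce / (1 + KL*Ce)."""
            return qmax * KL * Ce / (1.0 + KL * Ce)

        def freundlich(Ce, KF, n):
            """Freundlich isotherm: qe = KF * Ce**(1/n)."""
            return KF * Ce ** (1.0 / n)

        # Hypothetical equilibrium data: Ce in mg/L, qe in mg/g.
        Ce = np.array([5.0, 15.0, 40.0, 80.0, 120.0])
        qe = np.array([8.0, 18.0, 30.0, 38.0, 41.0])

        (qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[50.0, 0.05])
        (KF, n), _ = curve_fit(freundlich, Ce, qe, p0=[5.0, 2.0])
        print(f"Langmuir: qmax={qmax:.1f} mg/g, KL={KL:.3f} L/mg; Freundlich: KF={KF:.2f}, n={n:.2f}")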

  6. The characteristics of grating structure in magnetic field measurements based on polarization properties of fiber gratings

    NASA Astrophysics Data System (ADS)

    Su, Yang; Peng, Hui; Feng, Kui; Li, Yu-quan

    2009-11-01

    In this paper the characteristics of the grating structure in magnetic field measurements based on the differential group delay of fiber gratings are analyzed. Theoretical simulations are carried out using coupled-mode theory and the transfer matrix method. The effects of the grating parameters of a uniform Bragg grating on measurement range and sensitivity are analyzed. The impacts of chirped, phase-shifted and apodized gratings on DGD peak values are also examined. FBG transmission spectra and DGD spectra are recorded by means of an optical vector analyzer (OVA). Both the simulations and the experiments demonstrate that phase-shifted gratings can markedly improve the sensitivity.

  7. Network public opinion space sentiment tendency analyze based on recurrent convolution neural network

    NASA Astrophysics Data System (ADS)

    Zhang, Gaowei; Xu, Lingyu; Wang, Lei

    2018-04-01

    The purpose of this chapter is to analyze investors' psychological characteristics and investment decision-making behavior and to study investor sentiment under network public opinion. The analysis covers three aspects: first, investor sentiment analysis and how sentiment spreads in online media; second, the mechanism by which investor sentiment influences the stock market and its effect; and third, measuring investor sentiment based on the degree of attention. The aim is to sort out the internal relations among investor sentiment, network public opinion and the stock market, in order to lay the theoretical foundation of this article.

  8. Revisiting tests for neglected nonlinearity using artificial neural networks.

    PubMed

    Cho, Jin Seo; Ishida, Isao; White, Halbert

    2011-05-01

    Tests for regression neglected nonlinearity based on artificial neural networks (ANNs) have so far been studied by separately analyzing the two ways in which the null of regression linearity can hold. This implies that the asymptotic behavior of general ANN-based tests for neglected nonlinearity is still an open question. Here we analyze a convenient ANN-based quasi-likelihood ratio statistic for testing neglected nonlinearity, paying careful attention to both components of the null. We derive the asymptotic null distribution under each component separately and analyze their interaction. Somewhat remarkably, it turns out that the previously known asymptotic null distribution for the type 1 case still applies, but under somewhat stronger conditions than previously recognized. We present Monte Carlo experiments corroborating our theoretical results and showing that standard methods can yield misleading inference when our new, stronger regularity conditions are violated.
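
    A simplified illustration of an ANN-based test for neglected nonlinearity in Lagrange-multiplier form (n*R^2 from regressing OLS residuals on randomly generated hidden-unit activations, compared with a chi-square reference). This is not the authors' quasi-likelihood ratio statistic, and the number of hidden units and the activation function are assumptions:

        import numpy as np
        from scipy.stats import chi2

        def ann_nonlinearity_test(y, X, q=3, seed=0):
            """Under linearity, n*R^2 from regressing OLS residuals on q random
            hidden-unit activations is approximately chi2(q)."""
            rng = np.random.default_rng(seed)
            n = len(y)
            X1 = np.column_stack([np.ones(n), X])          # linear model with intercept
            beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
            resid = y - X1 @ beta
            H = np.tanh(X1 @ rng.normal(size=(X1.shape[1], q)))  # random hidden units
            Z = np.column_stack([X1, H])
            coef, *_ = np.linalg.lstsq(Z, resid, rcond=None)
            r2 = 1.0 - np.sum((resid - Z @ coef) ** 2) / np.sum((resid - resid.mean()) ** 2)
            return n * r2, chi2.sf(n * r2, df=q)

        # Hypothetical data with a quadratic term that the linear model misses.
        rng = np.random.default_rng(1)
        x = rng.normal(size=(500, 1))
        y = 1.0 + 2.0 * x[:, 0] + 1.5 * x[:, 0] ** 2 + rng.normal(size=500)
        print(ann_nonlinearity_test(y, x))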

  9. RDBMS Applications as Online Based Data Archive: A Case of Harbour Medical Center in Pekanbaru

    NASA Astrophysics Data System (ADS)

    Febriadi, Bayu; Zamsuri, Ahmad

    2017-12-01

    Kantor Kesehatan Pelabuhan Kelas II Pekanbaru is a government office concerned with health, especially environmental health. The office has problems storing electronic data and analyzing daily data, both internal and external. Although the office has computers and other tools useful for storing electronic data, in practice the data are still kept in cupboards, which is inefficient for important data that must be analyzed more than once, that is, data that need to be analyzed continuously. The Relational Database Management System (RDBMS) application is an online-based data archive and was developed using the System Development Life Cycle (SDLC) method. Hopefully, the application will be very useful for the employees of Kantor Kesehatan Pelabuhan Pekanbaru in managing their work.

  10. Inductance analyzer based on auto-balanced circuit for precision measurement of fluxgate impedance

    NASA Astrophysics Data System (ADS)

    Setiadi, Rahmondia N.; Schilling, Meinhard

    2018-05-01

    An instrument for fluxgate sensor impedance measurement based on an auto-balanced circuit has been designed and characterized. The circuit design is adjusted to comply with the fluxgate sensor characteristics, which are low impedance and a highly saturable core with very high permeability. The system utilizes a NI-DAQ card and LabVIEW to process the signal acquisition and evaluation. Fixed reference resistances are employed for system calibration using linear regression. A multimeter HP 34401A and an impedance analyzer Agilent 4294A are used as calibrator and validator for the resistance and inductance measurements. We thus realized a fluxgate analyzer instrument based on an auto-balanced circuit, which measures the resistance and inductance of the device under test with a small error and with a much lower excitation current, avoiding core saturation, than the calibrator used.
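
    A minimal sketch of the calibration step described (fixed reference resistances fitted by linear regression); the readings and resistor values below are made up:

        import numpy as np

        # Hypothetical calibration: raw analyzer readings taken with known reference
        # resistors, then a least-squares line maps readings to ohms.
        ref_ohms = np.array([1.0, 2.2, 4.7, 10.0, 22.0])
        raw_reading = np.array([0.103, 0.221, 0.472, 1.006, 2.204])
        slope, intercept = np.polyfit(raw_reading, ref_ohms, deg=1)

        def to_ohms(reading):
            """Convert a raw analyzer reading to resistance via the calibration line."""
            return slope * reading + intercept

        print(round(to_ohms(0.55), 3))   # resistance estimate for a new reading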

  11. Current algorithmic solutions for peptide-based proteomics data generation and identification.

    PubMed

    Hoopmann, Michael R; Moritz, Robert L

    2013-02-01

    Peptide-based proteomic data sets are ever increasing in size and complexity. These data sets provide computational challenges when attempting to quickly analyze spectra and obtain correct protein identifications. Database search and de novo algorithms must consider high-resolution MS/MS spectra and alternative fragmentation methods. Protein inference is a tricky problem when analyzing large data sets of degenerate peptide identifications. Combining multiple algorithms for improved peptide identification puts significant strain on computational systems when investigating large data sets. This review highlights some of the recent developments in peptide and protein identification algorithms for analyzing shotgun mass spectrometry data when encountering the aforementioned hurdles. Also explored are the roles that analytical pipelines, public spectral libraries, and cloud computing play in the evolution of peptide-based proteomics. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Determination of sulfur compounds in hydrotreated transformer base oil by potentiometric titration.

    PubMed

    Chao, Qiu; Sheng, Han; Cheng, Xingguo; Ren, Tianhui

    2005-06-01

    A method was developed to analyze the distribution of sulfur compounds in model sulfur compounds by potentiometric titration, and applied to analyze hydrotreated transformer base oil. Model thioethers were oxidized to corresponding sulfoxides by tetrabutylammonium periodate and sodium metaperiodate, respectively, and the sulfoxides were titrated by perchloric acid titrant in acetic anhydride. The contents of aliphatic thioethers and total thioethers were then determined from that of sulfoxides in solution. The method was applied to determine the organic sulfur compounds in hydrotreated transformer base oil.

  13. Secure quantum key distribution using continuous variables of single photons.

    PubMed

    Zhang, Lijian; Silberhorn, Christine; Walmsley, Ian A

    2008-03-21

    We analyze the distribution of secure keys using quantum cryptography based on the continuous variable degree of freedom of entangled photon pairs. We derive the information capacity of a scheme based on the spatial entanglement of photons from a realistic source, and show that the standard measures of security known for quadrature-based continuous variable quantum cryptography (CV-QKD) are inadequate. A specific simple eavesdropping attack is analyzed to illuminate how secret information may be distilled well beyond the bounds of the usual CV-QKD measures.

  14. SEMICONDUCTOR TECHNOLOGY: Material removal rate in chemical-mechanical polishing of wafers based on particle trajectories

    NASA Astrophysics Data System (ADS)

    Jianxiu, Su; Xiqu, Chen; Jiaxi, Du; Renke, Kang

    2010-05-01

    Distribution forms of abrasives in the chemical mechanical polishing (CMP) process are analyzed based on experimental results. Then the relationships between the wafer, the abrasive and the polishing pad are analyzed based on kinematics and contact mechanics. According to the track length of abrasives on the wafer surface, the relationships between the material removal rate and the polishing velocity are obtained. The analysis results are in accord with the experimental results. The conclusion provides a theoretical guide for further understanding the material removal mechanism of wafers in CMP.

  15. Computational models for the analysis/design of hypersonic scramjet components. I - Combustor and nozzle models

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Sinha, N.; Wolf, D. E.; York, B. J.

    1986-01-01

    An overview of computational models developed for the complete, design-oriented analysis of a scramjet propulsion system is provided. The modular approach taken involves the use of different PNS models to analyze the individual propulsion system components. The external compression and internal inlet flowfields are analyzed by the SCRAMP and SCRINT components discussed in Part II of this paper. The combustor is analyzed by the SCORCH code, which is based upon the SPLITP PNS pressure-split methodology formulated by Dash and Sinha. The nozzle is analyzed by the SCHNOZ code, which is based upon the SCIPVIS PNS shock-capturing methodology formulated by Dash and Wolf. The current status of these models, previous developments leading to this status, and progress towards future hybrid and 3D versions are discussed in this paper.

  16. Real-Time, Fast Neutron Coincidence Assay of Plutonium With a 4-Channel Multiplexed Analyzer and Organic Scintillators

    NASA Astrophysics Data System (ADS)

    Joyce, Malcolm J.; Gamage, Kelum A. A.; Aspinall, M. D.; Cave, F. D.; Lavietes, A.

    2014-06-01

    The design, principle of operation and the results of measurements made with a four-channel organic scintillator system are described. The system comprises four detectors and a multiplexed analyzer for the real-time parallel processing of fast neutron events. The function of the real-time, digital multiple-channel pulse-shape discrimination analyzer is described together with the results of laboratory-based measurements with 252Cf, 241Am-Li and plutonium. The analyzer is based on a single-board solution with integrated high-voltage supplies and graphical user interface. It has been developed to meet the requirements of nuclear materials assay of relevance to safeguards and security. Data are presented for the real-time coincidence assay of plutonium in terms of doubles count rate versus mass. This includes an assessment of the limiting mass uncertainty for coincidence assay based on a 100 s measurement period and samples in the range 0-50 g. Measurements of count rate versus order of multiplicity for 252Cf and 241Am-Li and combinations of both are also presented.

  17. Design of housing file box of fire academy based on RFID

    NASA Astrophysics Data System (ADS)

    Li, Huaiyi

    2018-04-01

    This paper presents a design scheme for an intelligent file box based on RFID. The advantages of the RFID file box over the traditional file box are compared and analyzed, and the feasibility of the RFID file box design is analyzed based on the actual situation of our university. After introducing the shape and structure design of the intelligent file box, the paper discusses the working process of the file box and explains in detail the internal communication principle of the RFID file box and the realization of the control system. The application of the RFID-based file box will greatly improve the efficiency of our school's archives management.

  18. Fast polarimetry of the colliding proton beams based on the elastic pp analyzer using the NICA detectors

    NASA Astrophysics Data System (ADS)

    Sharov, V. I.

    2017-12-01

    It is shown that the existing data on the analyzing power An of elastic pp scattering could be successfully applied for polarimetry of the colliding proton beams using the NICA detectors. Calculations of the count rates of elastic events reveal that a polarimeter based on the elastic pp analyzing power An will provide a high polarization measurement rate.

  19. A large-scale intercomparison of stratospheric vertical distributions of NO2 and BrO retrieved from the SCIAMACHY limb measurements and ground-based twilight observations

    NASA Astrophysics Data System (ADS)

    Rozanov, Alexei; Hendrick, Francois; Lotz, Wolfhardt; van Roozendael, Michel; Bovensmann, Heinrich; Burrows, John P.

    This study is devoted to the intercomparison of NO2 and BrO vertical profiles obtained from satellite and ground-based measurements. Although the ground-based observations are performed only at selected locations, they have great potential for the validation of satellite measurements, since continuous long-term measurement series performed with the same instruments are available. Thus, long-term trends in the observed species can be analyzed and intercompared. Previous intercomparisons of the vertical distributions of NO2 and BrO retrieved from SCIAMACHY limb measurements at the University of Bremen and obtained at IASB-BIRA by applying a profiling technique to ground-based zenith-sky DOAS observations have shown good agreement between the results of these completely different measurement techniques. However, only a relatively short time period of one year has been analyzed so far, which does not allow investigation of seasonal variations and trends. Furthermore, some minor discrepancies are still to be analyzed. In the current study, datasets covering several years obtained at Observatoire de Haute-Provence (OHP) in France and at Harestua in Norway will be compared to retrievals from SCIAMACHY limb measurements. Seasonal and annual variations will be analyzed and possible reasons for the remaining discrepancies will be discussed.

  20. The Rhetoric of Satire: Analyzing in Freshman English.

    ERIC Educational Resources Information Center

    Proctor, Betty Jane

    1982-01-01

    Presents a series of exercises designed to provide freshman composition students with a base for analyzing works rhetorically, to point out how language can be used persuasively, and to illustrate how satire functions. (FL)

  1. Barriers to Physical Activity on University Student

    NASA Astrophysics Data System (ADS)

    Jajat; Sultoni, K.; Suherman, A.

    2017-03-01

    The purpose of the research is to analyze the factors that act as barriers to physical activity in university students, based on physical activity level. An internet-based survey was conducted. The participants were 158 university students from Universitas Pendidikan Indonesia. The Barriers to Physical Activity Quiz (BPAQ) was used to assess the factors that act as barriers to physical activity in university students. The IPAQ (short form) was used to assess physical activity level. The results show there were no differences in BPAQ scores based on IPAQ level. However, when analyzed further across the seven barrier factors, there were differences in the factors "social influence" and "lack of willpower" based on IPAQ level. Based on this, it was concluded that influence from others and lack of willpower are inhibiting factors for students performing physical activity.

  2. [Modeling and implementation method for the automatic biochemistry analyzer control system].

    PubMed

    Wang, Dong; Ge, Wan-cheng; Song, Chun-lin; Wang, Yun-guang

    2009-03-01

    The automatic biochemistry analyzer is a necessary instrument for clinical diagnostics. In this paper, the system structure is analyzed first. The description of the system problems and the fundamental principles for dispatch are brought forward. The paper then puts emphasis on the modeling of the automatic biochemistry analyzer control system: the object model and the communications model are put forward. Finally, the implementation method is designed. The results indicate that the system based on the model has good performance.

  3. Test data analysis for concentrating photovoltaic arrays

    NASA Astrophysics Data System (ADS)

    Maish, A. B.; Cannon, J. E.

    A test data analysis approach for use with steady-state efficiency measurements taken on concentrating photovoltaic arrays is presented. The analysis procedures can be used to identify biased and erroneous data. The steps involved in analyzing the test data are screening the data, developing coefficients for the performance equation, analyzing statistics to ensure adequacy of the regression fit to the data, and plotting the data. In addition, the sources and magnitudes of precision and bias errors that affect measurement accuracy are analyzed.

  4. A mixed-methods investigation of successful aging among older women engaged in sports-based versus exercise-based leisure time physical activities.

    PubMed

    Berlin, Kathryn; Kruger, Tina; Klenosky, David B

    2018-01-01

    This mixed-methods study compares active older women in different physically based leisure activities and explores the difference in subjective ratings of successful aging and quantifiable predictors of success. A survey was administered to 256 women, 60-92 years of age, engaged in a sports- or exercise-based activity. Quantitative data were analyzed through ANOVA and multiple regression. Qualitative data (n = 79) was analyzed using the approach associated with means-end theory. While participants quantitatively appeared similar in terms of successful aging, qualitative interviews revealed differences in activity motivation. Women involved in sports highlighted social/psychological benefits, while those involved in exercise-based activities stressed fitness outcomes.

  5. NASTRAN thermal analyzer status, experience, and new developments

    NASA Technical Reports Server (NTRS)

    Lee, H. P.

    1975-01-01

    The unique finite element based NASTRAN Thermal Analyzer, originally developed as a general-purpose heat transfer analysis capability and incorporated into the NASTRAN system, is described. The current status, experiences from field applications, and new developments are included.

  6. The ACS statistical analyzer

    DOT National Transportation Integrated Search

    2010-03-01

    This document provides guidance for using the ACS Statistical Analyzer. It is an Excel-based template for users of estimates from the American Community Survey (ACS) to assess the precision of individual estimates and to compare pairs of estimates fo...

  7. Study and Application on Cloud Covered Rate for Agroclimatical Distribution Using In Guangxi Based on Modis Data

    NASA Astrophysics Data System (ADS)

    Yang, Xin; Zhong, Shiquan; Sun, Han; Tan, Zongkun; Li, Zheng; Ding, Meihua

    Based on an analysis of the physical characteristics of cloud and of the importance of cloud in agricultural production and the national economy, cloud is a very important climatic resource, like temperature, precipitation and solar radiation, and plays a very important role in agricultural climate division. This paper reviews methods of cloud detection based on MODIS data in China and abroad. The results suggest that the Quanjun He method is suitable for detecting cloud over Guangxi. A chart of cloud cover state over Guangxi is imaged using the Quanjun He method. We derive an approach for calculating the cloud covered rate using frequency spectrum analysis, and the cloud covered rate for Guangxi is obtained. Taking Rongxian County, Guangxi as an example, this article analyzes a preliminary application of the cloud covered rate to the distribution of Rong Shaddock pomelo. The analysis results indicate that the cloud covered rate is closely related to the quality of Rong Shaddock pomelo.

  8. Simulation tools for analyzer-based x-ray phase contrast imaging system with a conventional x-ray source

    NASA Astrophysics Data System (ADS)

    Caudevilla, Oriol; Zhou, Wei; Stoupin, Stanislav; Verman, Boris; Brankov, J. G.

    2016-09-01

    Analyzer-based X-ray phase contrast imaging (ABI) belongs to a broader family of phase-contrast (PC) X-ray imaging modalities. Unlike conventional X-ray radiography, which measures only X-ray absorption, in PC imaging one can also measure the X-ray deflection induced by the object's refractive properties. It has been shown that refraction imaging provides better contrast when imaging soft tissue, which is of great interest in medical imaging applications. In this paper, we introduce a simulation tool specifically designed to simulate an analyzer-based X-ray phase contrast imaging system with a conventional polychromatic X-ray source. By utilizing ray tracing and basic physical principles of diffraction theory, our simulation tool can predict the X-ray beam profile shape, the energy content, and the total throughput (photon count) at the detector. In addition, we can evaluate the imaging system point-spread function for various system configurations.

  9. Exploring stability of entropy analysis for signal with different trends

    NASA Astrophysics Data System (ADS)

    Zhang, Yin; Li, Jin; Wang, Jun

    2017-03-01

    Considering the effects of environmental disturbances and instrument systems, actual detected signals always carry different trends, which makes it difficult to accurately capture signal complexity. Choosing stable and effective analysis methods is therefore very important. In this paper, we applied two entropy measures, base-scale entropy and approximate entropy, to analyze signal complexity, and studied the effect of trends on an ideal signal and on heart rate variability (HRV) signals, namely linear, periodic, and power-law trends, which are likely to occur in actual signals. The results show that approximate entropy is unstable when different trends are embedded into the signals, so it is not suitable for analyzing signals with trends. However, base-scale entropy has better stability and accuracy for signals with different trends. Thus, base-scale entropy is an effective method for analyzing actual signals.
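
    For reference, a compact implementation of approximate entropy, one of the two measures compared; the embedding dimension and tolerance are typical choices rather than the authors' settings, and the base-scale entropy step is not shown:

        import numpy as np

        def approximate_entropy(x, m=2, r_factor=0.2):
            """ApEn(m, r) of a 1-D signal x, with tolerance r = r_factor * std(x)."""
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()

            def phi(m):
                n = len(x) - m + 1
                emb = np.array([x[i:i + m] for i in range(n)])             # template vectors
                dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
                return np.log((dist <= r).mean(axis=1)).mean()             # mean log match fraction

            return phi(m) - phi(m + 1)

        # Hypothetical check: white noise should give a higher ApEn than a sine wave.
        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 500)
        print(round(approximate_entropy(np.sin(2 * np.pi * t)), 3),
              round(approximate_entropy(rng.normal(size=500)), 3))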

  10. Biomedical sensing analyzer (BSA) for mobile-health (mHealth)-LTE.

    PubMed

    Adibi, Sasan

    2014-01-01

    The rapid expansion of mobile-based systems, the capabilities of smartphone devices, as well as the radio access and cellular network technologies are the wind beneath the wing of mobile health (mHealth). In this paper, the concept of biomedical sensing analyzer (BSA) is presented, which is a novel framework, devised for sensor-based mHealth applications. The BSA is capable of formulating the Quality of Service (QoS) measurements in an end-to-end sense, covering the entire communication path (wearable sensors, link-technology, smartphone, cell-towers, mobile-cloud, and the end-users). The characterization and formulation of BSA depend on a number of factors, including the deployment of application-specific biomedical sensors, generic link-technologies, collection, aggregation, and prioritization of mHealth data, cellular network based on the Long-Term Evolution (LTE) access technology, and extensive multidimensional delay analyses. The results are studied and analyzed in a LabView 8.5 programming environment.

  11. An interactive web-based system using cloud for large-scale visual analytics

    NASA Astrophysics Data System (ADS)

    Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.

    2015-03-01

    Network cameras have been growing rapidly in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and its Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g. different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.

  12. Mean-Variance portfolio optimization by using non constant mean and volatility based on the negative exponential utility function

    NASA Astrophysics Data System (ADS)

    Soeryana, Endang; Halim, Nurfadhlina Bt Abdul; Sukono, Rusyaman, Endang; Supian, Sudradjat

    2017-03-01

    In stock investments, investors are faced with the issue of risk, because daily stock prices fluctuate. To minimize the level of risk, investors usually form an investment portfolio. Establishing a portfolio consisting of several stocks is intended to obtain the optimal composition of the investment portfolio. This paper discusses Mean-Variance optimization of a stock portfolio using non-constant mean and volatility, based on the negative exponential utility function. The non-constant mean is modeled using the Autoregressive Moving Average (ARMA) model, while the non-constant volatility is modeled using the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model. The optimization process is performed using the Lagrangian multiplier technique. As a numerical illustration, the method is used to analyze some stocks in Indonesia. The expected result is to obtain the proportion of investment in each stock analyzed.
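
    Under negative exponential (CARA) utility with risk aversion lam and normally distributed returns, the problem reduces to maximizing mu'w - (lam/2) w'Sigma w subject to the budget constraint, which the Lagrangian technique solves in closed form. A sketch with placeholder inputs, where plain numbers stand in for the ARMA mean and GARCH covariance forecasts:

        import numpy as np

        def cara_mv_weights(mu, Sigma, lam):
            """Closed-form Lagrangian solution of max_w mu'w - (lam/2) w'Sigma w s.t. 1'w = 1."""
            ones = np.ones(len(mu))
            Sinv = np.linalg.inv(Sigma)
            eta = (ones @ Sinv @ mu - lam) / (ones @ Sinv @ ones)   # Lagrange multiplier
            return Sinv @ (mu - eta * ones) / lam

        # Hypothetical forecasts for three stocks (in the paper these would come
        # from the ARMA and GARCH models).
        mu = np.array([0.010, 0.007, 0.012])
        Sigma = np.array([[0.040, 0.006, 0.004],
                          [0.006, 0.025, 0.005],
                          [0.004, 0.005, 0.050]])
        w = cara_mv_weights(mu, Sigma, lam=3.0)
        print(w.round(3), round(w.sum(), 6))   # proportions sum to 1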

  13. Effects of modal truncation and condensation methods on the Frequency Response Function of a stage reducer connected by rigid coupling to a planetary gear system

    NASA Astrophysics Data System (ADS)

    Bouslema, Marwa; Frikha, Ahmed; Abdennadhar, Moez; Fakhfakh, Tahar; Nasri, Rachid; Haddar, Mohamed

    2017-12-01

    The present paper is aimed at the application of a substructuring methodology, based on the Frequency Response Function (FRF) simulation technique, to analyze the vibration of a stage reducer connected by a rigid coupling to a planetary gear system. The computation of the vibration response was achieved using the FRF-based substructuring method. First, the two subsystems were analyzed separately and their FRFs were obtained. Then the coupled model was analyzed indirectly using the substructuring technique. A comparison between the full system response and the coupled model response using FRF substructuring was performed to validate the coupling method. Furthermore, a parametric study of the effect of the shaft coupling stiffness on the FRF was carried out, and the effects of modal truncation and condensation methods on the FRFs of the subsystems were analyzed.
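
    FRF-based substructuring is commonly implemented with the Lagrange-multiplier FBS coupling formula; a minimal sketch of that standard formula for two subsystems sharing one interface degree of freedom (the paper's exact formulation may differ, and the FRF values are made up):

        import numpy as np

        def lm_fbs_couple(H, B):
            """LM-FBS coupling: H_coupled = H - H B^T (B H B^T)^-1 B H, where H is the
            block-diagonal FRF matrix of the uncoupled subsystems and B is the signed
            Boolean compatibility matrix over the interface DOFs."""
            return H - (H @ B.T) @ np.linalg.solve(B @ H @ B.T, B @ H)

        # Hypothetical 1-DOF interface at a single frequency: H_A, H_B are the interface
        # FRFs of the stage reducer and the planetary gear subsystem.
        H_A, H_B = 2.0e-4 + 1.0e-5j, 5.0e-5 - 2.0e-5j
        H = np.diag([H_A, H_B])             # uncoupled block-diagonal FRF matrix
        B = np.array([[1.0, -1.0]])         # compatibility: u_A - u_B = 0 at the interface
        print(lm_fbs_couple(H, B))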

  14. An MS-DOS-based program for analyzing plutonium gamma-ray spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruhter, W.D.; Buckley, W.M.

    1989-09-07

    A plutonium gamma-ray analysis system that operates on MS-DOS-based computers has been developed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra for plutonium isotopics. The program titled IAEAPU consists of three separate applications: a data-transfer application for transferring spectral data from a CICERO multichannel analyzer to a binary data file, a data-analysis application to analyze plutonium gamma-ray spectra, for plutonium isotopic ratios and weight percents of total plutonium, and a data-quality assurance application to check spectral data for proper data-acquisition setup and performance. Volume 3 contains the software listings for these applications.

  15. Net-centric ACT-R-Based Cognitive Architecture with DEVS Unified Process

    DTIC Science & Technology

    2011-04-01

    effort has been spent in analyzing various forms of requirement specifications, viz. state-based, Natural Language based, UML-based, Rule-based, BPMN...requirement specifications in one of the chosen formats such as BPMN, DoDAF, Natural Language Processing (NLP) based, UML-based, DSL or simply

  16. Flow-based ammonia gas analyzer with an open channel scrubber for indoor environments.

    PubMed

    Ohira, Shin-Ichi; Heima, Minako; Yamasaki, Takayuki; Tanaka, Toshinori; Koga, Tomoko; Toda, Kei

    2013-11-15

    A robust and fully automated indoor ammonia gas monitoring system with an open channel scrubber (OCS) was developed. The sample gas channel dimensions, hydrophilic surface treatment to produce a thin absorbing solution layer, and solution flow rate of the OCS were optimized to connect the OCS as in-line gas collector and avoid sample humidity effects. The OCS effluent containing absorbed ammonia in sample gas was injected into a derivatization solution flow. Derivatization was achieved with o-phthalaldehyde and sulfite in pH 11 buffer solution. The product, 1-sulfonateisoindole, is detected with a home-made fluorescence detector. The limit of detection of the analyzer based on three times the standard deviation of baseline noise was 0.9 ppbv. Sample gas could be analyzed 40 times per hour. Furthermore, relative humidity of up to 90% did not interfere considerably with the analyzer. Interference from amines was not observed. The developed gas analysis system was calibrated using a solution-based method. The system was used to analyze ammonia in an indoor environment along with an off-site method, traditional impinger gas collection followed by ion chromatographic analysis, for comparison. The results obtained using both methods agreed well. Therefore, the developed system can perform on-site monitoring of ammonia in indoor environments with improved time resolution compared with that of other methods. Crown Copyright © 2013 Published by Elsevier B.V. All rights reserved.
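
    A small illustration of the 3-sigma limit-of-detection calculation stated above, with made-up baseline readings and calibration slope:

        import numpy as np

        # Hypothetical blank-gas fluorescence readings and calibration sensitivity.
        baseline = np.array([10.2, 10.5, 9.8, 10.1, 10.4, 9.9])   # detector counts
        slope = 1.7                                               # counts per ppbv

        lod_ppbv = 3.0 * baseline.std(ddof=1) / slope             # LOD = 3*sigma_blank / sensitivity
        print(round(lod_ppbv, 2), "ppbv")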

  17. User requirements for NASA data base management systems. Part 1: Oceanographic discipline

    NASA Technical Reports Server (NTRS)

    Fujimoto, B.

    1981-01-01

    Generic oceanographic user requirements were collected and analyzed for use in developing a general multipurpose data base management system for future missions of the Office of Space and Terrestrial Applications (OSTA) of NASA. The collection of user requirements involved: studying the state-of-the-art technology in data base management systems; analyzing the results of related studies; formulating a viable and diverse list of scientists to be interviewed; developing a presentation format and materials; and interviewing oceanographic data users. More effective data management systems are needed to handle the increasing influx of data.

  18. Systems and methods for analyzing building operations sensor data

    DOEpatents

    Mezic, Igor; Eisenhower, Bryan A.

    2015-05-26

    Systems and methods are disclosed for analyzing building sensor information and decomposing the information therein to a more manageable and more useful form. Certain embodiments integrate energy-based and spectral-based analysis methods with parameter sampling and uncertainty/sensitivity analysis to achieve a more comprehensive perspective of building behavior. The results of this analysis may be presented to a user via a plurality of visualizations and/or used to automatically adjust certain building operations. In certain embodiments, advanced spectral techniques, including Koopman-based operations, are employed to discern features from the collected building sensor data.
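
    Koopman-mode analysis of sensor data is often approximated in practice by dynamic mode decomposition (DMD); a minimal numpy sketch of DMD on a matrix of building sensor time series (the patented decomposition may differ, and the data are synthetic):

        import numpy as np

        def dmd_eigs(X, rank):
            """Exact DMD of snapshot matrix X (sensors x time steps); returns the
            discrete-time eigenvalues of the best-fit linear operator."""
            X1, X2 = X[:, :-1], X[:, 1:]
            U, s, Vh = np.linalg.svd(X1, full_matrices=False)
            U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
            Atilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
            return np.linalg.eigvals(Atilde)

        # Hypothetical data: 3 temperature sensors sampled hourly for a week,
        # dominated by a 24-hour oscillation that DMD should recover.
        t = np.arange(24 * 7)
        X = np.vstack([20 + 2.0 * np.sin(2 * np.pi * t / 24),
                       21 + 1.5 * np.sin(2 * np.pi * t / 24 + 0.5),
                       19 + 1.0 * np.sin(2 * np.pi * t / 24 + 1.0)])
        eigs = dmd_eigs(X, rank=3)
        print(np.abs(np.angle(eigs)) / (2 * np.pi))   # DC mode plus a pair near 1/24 cycles/hour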

  19. Research on collaborative innovation mechanism of green construction supply chain based on united agency

    NASA Astrophysics Data System (ADS)

    Zhang, Min; He, Weiyi

    2018-06-01

    Under the guidance of principal-agent theory and modular theory, the collaborative innovation of green technology-based companies, design contractors and project builders based on united agency will provide direction for the development of the green construction supply chain in the future. After analyzing the existing independent agencies, this paper proposes an industry-university-research bilateral collaborative innovation network architecture, together with modularization of the innovative function of engineering design in the context of non-standard transformation interfaces, analyzes the innovation responsibility center, and gives some countermeasures and suggestions to promote the performance of the bilateral cooperative innovation network.

  20. Magnetic Tunnel Junction-Based On-Chip Microwave Phase and Spectrum Analyzer

    NASA Technical Reports Server (NTRS)

    Fan, Xin; Chen, Yunpeng; Xie, Yunsong; Kolodzey, James; Wilson, Jeffrey D.; Simons, Rainee N.; Xiao, John Q.

    2014-01-01

    A magnetic tunnel junction (MTJ)-based microwave detector is proposed and investigated. When the MTJ is excited by microwave magnetic fields, the relative angle between the free layer and pinned layer alternates, giving rise to an average resistance change. By measuring the average resistance change, the MTJ can be utilized as a microwave power sensor. Due to the nature of ferromagnetic resonance, the frequency of an incident microwave is directly determined. In addition, by integrating a mixer circuit, the MTJ-based microwave detector can also determine the relative phase between two microwave signals. Thus, the MTJ-based microwave detector can be used as an on-chip microwave phase and spectrum analyzer.

  1. Web-based multi-channel analyzer

    DOEpatents

    Gritzo, Russ E.

    2003-12-23

    The present invention provides an improved multi-channel analyzer designed to conveniently gather, process, and distribute spectrographic pulse data. The multi-channel analyzer may operate on a computer system having memory, a processor, and the capability to connect to a network and to receive digitized spectrographic pulses. The multi-channel analyzer may have a software module integrated with a general-purpose operating system that may receive digitized spectrographic pulses at rates of at least 10,000 pulses per second. The multi-channel analyzer may further have a user-level software module that may receive user-specified controls dictating the operation of the multi-channel analyzer, making the multi-channel analyzer customizable by the end user. The user-level software may further categorize and conveniently distribute spectrographic pulse data employing non-proprietary, standard communication protocols and formats.
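
    The core of any multi-channel analyzer is the binning of digitized pulse amplitudes into channels; a toy sketch of that step (not the patented software), with hypothetical pulse data:

        import numpy as np

        def accumulate_spectrum(pulse_heights, n_channels=1024, full_scale=10.0):
            """Histogram digitized pulse amplitudes (volts) into MCA channels."""
            ch = np.clip((pulse_heights / full_scale * n_channels).astype(int), 0, n_channels - 1)
            return np.bincount(ch, minlength=n_channels)

        # Hypothetical pulses: a photopeak at 6.62 V on top of a flat background.
        rng = np.random.default_rng(0)
        pulses = np.concatenate([rng.normal(6.62, 0.05, 5000), rng.uniform(0, 10, 2000)])
        spectrum = accumulate_spectrum(pulses)
        print(spectrum.argmax(), spectrum.max())   # peak channel and its counts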

  2. Locomotive crashworthiness research : volume 5 : cab car crashworthiness report

    DOT National Transportation Integrated Search

    1996-03-01

    Models used to analyze locomotive crashworthiness are modified for application to control cab cars of the types used for intercity and commuter rail passenger service. An existing control cab car is analyzed for crashworthiness based on scenarios dev...

  3. A Microfluidics-HPLC/Differential Mobility Spectrometer Macromolecular Detection System for Human and Robotic Missions

    NASA Technical Reports Server (NTRS)

    Coy, S. L.; Killeen, K.; Han, J.; Eiceman, G. A.; Kanik, I.; Kidd, R. D.

    2011-01-01

    Our goal is to develop a unique, miniaturized solute analyzer based on microfluidics technology. The analyzer consists of an integrated microfluidics High Performance Liquid Chromatographic chip / Differential Mobility Spectrometer (µHPLC-chip/DMS) detection system.

  4. RipleyGUI: software for analyzing spatial patterns in 3D cell distributions

    PubMed Central

    Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik

    2013-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
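
    A minimal sketch of Ripley's K-function for a 3-D point pattern, ignoring the edge correction that a full tool would apply; under complete spatial randomness K(r) is approximately (4/3)*pi*r^3, which the example checks against:

        import numpy as np
        from scipy.spatial.distance import pdist

        def ripley_k_3d(points, radii, volume):
            """Naive (no edge correction) Ripley's K for an (n, 3) point pattern
            observed in a region of the given volume."""
            n = len(points)
            d = pdist(points)                  # each pairwise distance counted once
            lam = n / volume                   # intensity: points per unit volume
            return np.array([2.0 * np.sum(d <= r) / (n * lam) for r in radii])

        # Hypothetical check: uniform points in the unit cube vs. the CSR expectation.
        rng = np.random.default_rng(0)
        pts = rng.uniform(size=(500, 3))
        radii = np.array([0.05, 0.10, 0.15])
        print(ripley_k_3d(pts, radii, volume=1.0).round(4))
        print((4.0 / 3.0 * np.pi * radii ** 3).round(4))   # CSR reference values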

  5. Identification and VIGS-based characterization of Bx1 ortholog in rye (Secale cereale L.)

    PubMed Central

    Groszyk, Jolanta; Kowalczyk, Mariusz; Yanushevska, Yuliya; Stochmal, Anna; Rakoczy-Trojanowska, Monika

    2017-01-01

    The first step of the benzoxazinoid (BX) synthesis pathway is catalyzed by an enzyme with indole-3-glycerol phosphate lyase activity encoded by 3 genes, Bx1, TSA and Igl. A gene highly homologous to maize and wheat Bx1 has been identified in rye. The goal of the study was to analyze the gene and to experimentally verify its role in the rye BX biosynthesis pathway as a rye ortholog of the Bx1 gene. Expression of the gene showed peak values 3 days after imbibition (dai) and at 21 dai it was undetectable. Changes of the BX content in leaves were highly correlated with the expression pattern until 21 dai. In plants older than 21 dai despite the undetectable expression of the analyzed gene there was still low accumulation of BXs. Function of the gene was verified by correlating its native expression and virus-induced silencing with BX accumulation. Barley stripe mosaic virus (BSMV)-based vectors were used to induce transcriptional (TGS) and posttranscriptional (PTGS) silencing of the analyzed gene. Both strategies (PTGS and TGS) significantly reduced the transcript level of the analyzed gene, and this was highly correlated with lowered BX content. Inoculation with virus-based vectors specifically induced expression of the analyzed gene, indicating up-regulation by biotic stressors. This is the first report of using the BSMV-based system for functional analysis of rye gene. The findings prove that the analyzed gene is a rye ortholog of the Bx1 gene. Its expression is developmentally regulated and is strongly induced by biotic stress. Stable accumulation of BXs in plants older than 21 dai associated with undetectable expression of ScBx1 indicates that the function of the ScBx1 in the BX biosynthesis is redundant with another gene. We anticipate that the unknown gene is a putative ortholog of the Igl, which still remains to be identified in rye. PMID:28234909

  7. Network-based spatial clustering technique for exploring features in regional industry

    NASA Astrophysics Data System (ADS)

    Chou, Tien-Yin; Huang, Pi-Hui; Yang, Lung-Shih; Lin, Wen-Tzu

    2008-10-01

    In past research, industrial cluster studies mainly focused on a single or particular industry and less on spatial industrial structure and mutual relations. Industrial clusters can generate three kinds of spillover effects: knowledge spillovers, labor market pooling, and input sharing. Industrial clustering indeed benefits industry development. Fully grasping the status and characteristics of district industrial clusters can help improve the competitive advantage of district industry, so research on industrial spatial clusters is of great significance for setting industrial policies and promoting district economic development. In this study, an improved model, GeoSOM, that combines DBSCAN (Density-Based Spatial Clustering of Applications with Noise) and SOM (Self-Organizing Map) was developed for analyzing industrial clusters. Different from former distance-based algorithms for industrial clustering, the proposed GeoSOM model calculates spatial characteristics between firms with the DBSCAN algorithm and evaluates the similarity between firms with SOM clustering analysis. The demonstrative data sets, manufacturers around Taichung County in Taiwan, were analyzed to verify the practicability of the proposed model. The results indicate that GeoSOM is suitable for evaluating spatial industrial clusters.
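
    A rough sketch of the two ingredients the GeoSOM model combines: a density-based spatial clustering of firm locations (DBSCAN) and a similarity grouping of firm attributes. Here scikit-learn's KMeans stands in for the SOM stage, so this only approximates the described model; all data are synthetic:

        import numpy as np
        from sklearn.cluster import DBSCAN, KMeans

        rng = np.random.default_rng(0)
        # Hypothetical firm locations (km): two dense industrial districts plus scattered firms.
        locs = np.vstack([rng.normal([0, 0], 0.5, (40, 2)),
                          rng.normal([10, 8], 0.5, (40, 2)),
                          rng.uniform(-5, 15, (20, 2))])
        attrs = rng.normal(size=(100, 2))       # firm attributes (e.g. size, sector mix)

        # Step 1 (spatial): density-based clusters of firm locations.
        spatial_labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(locs)
        # Step 2 (similarity): attribute grouping; KMeans is a stand-in for the SOM.
        attr_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(attrs)

        # Each firm gets a combined (spatial cluster, attribute group) label.
        combined = list(zip(spatial_labels, attr_labels))
        print(set(spatial_labels), combined[:5])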

  8. Methodology for cost analysis of film-based and filmless portable chest systems

    NASA Astrophysics Data System (ADS)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost-analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.

  9. Comparison of three sampling instruments, Cytobrush, Curette and OralCDx, for liquid-based cytology of the oral mucosa.

    PubMed

    Reboiras-López, M D; Pérez-Sayáns, M; Somoza-Martín, J M; Antúnez-López, J R; Gándara-Vila, P; Gayoso-Diz, P; Gándara-Rey, J M; García-García, A

    2012-01-01

    Exfoliative cytology of the oral cavity is a simple and noninvasive technique that permits the study of epithelial cells. Liquid-based cytology is an auxiliary diagnostic tool for improving the specificity and sensitivity of conventional cytology. The objective of our study was to compare the quality of normal oral mucosa cytology samples obtained using three different instruments, Cytobrush®, dermatological curette and OralCDx®, for liquid-based cytology. One hundred four cytological samples of the oral cavity were analyzed. Samples were obtained from healthy volunteer subjects using all three instruments. The clinical and demographic variables were age, sex and smoking habits. We analyzed cellularity, quality of the preparation and types of cells in the samples. All preparations showed appropriate preparation quality. In all smears analyzed, cells were distributed uniformly and showed no mucus, bleeding, inflammatory exudate or artifacts. We found no correlation between the average number of cells and the type of instrument. The samples generally consisted of two types of cells: superficial and intermediate. No differences were found among the cytological preparations of these three instruments. We did not observe basal cells in any of the samples analyzed.

  10. A survey of artificial immune system based intrusion detection.

    PubMed

    Yang, Hua; Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted.
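
    The survey itself contains no code; as a hedged illustration of one classic AIS mechanism covered by the framework's "generation algorithm" aspect, the sketch below shows negative selection: random detectors are kept only if they do not match any "self" sample, and anything a detector later matches is flagged as anomalous. The feature space, radius and data are invented for illustration.

```python
# Minimal negative-selection sketch (illustrative; thresholds and data are made up).
import numpy as np

rng = np.random.default_rng(42)

def generate_detectors(self_set, n_detectors=50, radius=0.15, dim=2):
    """Keep random detectors that lie farther than `radius` from every self sample."""
    detectors = []
    while len(detectors) < n_detectors:
        cand = rng.random(dim)
        if np.min(np.linalg.norm(self_set - cand, axis=1)) > radius:
            detectors.append(cand)
    return np.array(detectors)

def is_anomalous(sample, detectors, radius=0.15):
    """A sample is flagged if any detector matches (lies within `radius` of) it."""
    return bool(np.min(np.linalg.norm(detectors - sample, axis=1)) <= radius)

self_set = rng.random((200, 2)) * 0.5            # "normal" traffic features in [0, 0.5]^2
detectors = generate_detectors(self_set)
print(is_anomalous(np.array([0.25, 0.25]), detectors),   # inside the self region -> likely False
      is_anomalous(np.array([0.90, 0.90]), detectors))   # far outside -> likely True
```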

  11. Relative role of different radii in the dynamics of 8B+58Ni reaction

    NASA Astrophysics Data System (ADS)

    Kaur, Amandeep; Sandhu, Kirandeep; Sharma, Manoj K.

    2018-05-01

    In the present work, we intend to analyze the significance of three different radius terms in the framework of dynamical cluster-decay model (DCM) based calculations. In the majority of DCM based calculations the impact of the mass-dependent radius R(A) is extensively analyzed. The other two factors on which the radius term may depend are the neutron-proton asymmetry and the charge of the decaying fragments. Hence, the asymmetry-dependent radius term R(I) and the charge-dependent radius term R(Z) are incorporated in DCM based calculations to investigate their effect on the reaction dynamics involved. Here, we present an extension of an earlier work based on the decay of the 66As* compound nucleus by including R(I) and R(Z) radii in addition to the R(A) term. The effect of replacing R(A) with R(I) and R(Z) is analyzed via the fragmentation structure, tunneling probabilities (P) and other barrier characteristics like barrier height (VB), barrier position (RB), barrier turning point (Ra), etc. The role of temperature, deformations and angular momentum is duly incorporated in the present calculations.

  12. Proton Electrostatic Analyzer.

    DTIC Science & Technology

    1983-02-01

    Space objects undergo differential charging due to variations in physical properties among their surface regions. The rate and

  13. Past and Future Trends in Automobile Sales

    DOT National Transportation Integrated Search

    1981-07-01

    The report uses the Wharton EFA Motor Vehicle Demand Model (Mark I) and its associated data bases to discuss and analyze past and future trends in the automobile market. Part A analyzes the historical trends, generally covering the 1958-1976 period, ...

  14. Analyzing the Teaching of Professional Practice

    ERIC Educational Resources Information Center

    Moss, Pamela A.

    2011-01-01

    Background/Context: Based on their case studies of preparation for professional practice in the clergy, teaching, and clinical psychology, Grossman and colleagues (2009) identified three key concepts for analyzing and comparing practice in professional education--representations, decomposition, and approximations--to support professional educators…

  15. Problems in Employment Trend of Higher Vocational Graduates and Countermeasures

    ERIC Educational Resources Information Center

    Tan, Jianhua; Sheng, Zhichong

    2011-01-01

    Based on an analysis of the regional character of higher vocational graduates' employment, this paper analyzes the reasons for this employment trend, advances relevant countermeasures for the employment of higher vocational graduates, and explores the direction for higher vocational graduates' employment.

  16. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    NASA Astrophysics Data System (ADS)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
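
    gadfly's own API is not documented in this record, so the snippet below is only a generic illustration of the underlying idea: loading particle data from a GADGET/GIZMO-style HDF5 snapshot into a pandas DataFrame. The group and dataset names ("PartType0", "Coordinates", "Density") follow common GADGET HDF5 conventions, and the file name is hypothetical.

```python
# Generic illustration (not gadfly itself): GADGET-style HDF5 particle data -> pandas DataFrame.
import h5py
import pandas as pd

def load_gas_particles(snapshot_path):
    """Read gas-particle coordinates and density into a DataFrame."""
    with h5py.File(snapshot_path, "r") as f:
        gas = f["PartType0"]                      # gas particles in the GADGET HDF5 layout
        coords = gas["Coordinates"][:]            # shape (N, 3)
        density = gas["Density"][:]               # shape (N,)
    df = pd.DataFrame(coords, columns=["x", "y", "z"])
    df["density"] = density
    return df

# Example: df = load_gas_particles("snapshot_000.hdf5"); df.nlargest(10, "density")
```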

  17. ADAM: analysis of discrete models of biological systems using computer algebra.

    PubMed

    Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard

    2011-07-20

    Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
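
    ADAM's algebraic algorithms are not reproduced here; the sketch below only illustrates what an attractor of a small Boolean network (a special case of a polynomial dynamical system over GF(2)) is, found by exhaustive state enumeration, which is exactly the exponential approach the algebraic methods are designed to avoid. The update rules are made up for illustration.

```python
# Brute-force attractor search in a toy 3-node Boolean network (illustrative only).
from itertools import product

def step(state):
    """Synchronous update; rules are hypothetical: x1' = x2 AND x3, x2' = x1 OR x3, x3' = NOT x1."""
    x1, x2, x3 = state
    return (x2 & x3, x1 | x3, 1 ^ x1)

def attractors(n=3):
    found = set()
    for start in product((0, 1), repeat=n):
        seen, state = [], start
        while state not in seen:
            seen.append(state)
            state = step(state)
        cycle = tuple(seen[seen.index(state):])                 # limit cycle reached from this start
        rotations = [cycle[i:] + cycle[:i] for i in range(len(cycle))]
        found.add(min(rotations))                               # canonical rotation avoids duplicates
    return found

print(attractors())   # fixed points appear as length-1 cycles
```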

  18. TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, P; Patankar, A; Etmektzoglou, A

    Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate translation of radiotherapy research ideas to new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory log file based QA/beam delivery analyzer. Methods: The research beam builder can generate a TrueBeam-readable XML file either from scratch or from pre-existing DICOM-RT plans. A DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify or add control points to it. The delivered beam can be verified by reading the generated images and analyzing trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages versus standalone software are: i) no software installation or maintenance needed, ii) easy accessibility across all devices, iii) seamless upgrades, and iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am a full-time employee at Varian Medical Systems, Palo Alto.

  19. Comparison of CTT and Rasch-based approaches for the analysis of longitudinal Patient Reported Outcomes.

    PubMed

    Blanchin, Myriam; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Blanchard, Claire; Mirallié, Eric; Sébille, Véronique

    2011-04-15

    Health sciences frequently deal with Patient Reported Outcomes (PRO) data for the evaluation of concepts, in particular health-related quality of life, which cannot be directly measured and are often called latent variables. Two approaches are commonly used for the analysis of such data: Classical Test Theory (CTT) and Item Response Theory (IRT). Longitudinal data are often collected to analyze the evolution of an outcome over time. The most adequate strategy to analyze longitudinal latent variables, which can be either based on CTT or IRT models, remains to be identified. This strategy must take into account the latent characteristic of what PROs are intended to measure as well as the specificity of longitudinal designs. A simple and widely used IRT model is the Rasch model. The purpose of our study was to compare CTT and Rasch-based approaches to analyze longitudinal PRO data regarding type I error, power, and time effect estimation bias. Four methods were compared: the Score and Mixed models (SM) method based on the CTT approach, the Rasch and Mixed models (RM), the Plausible Values (PV), and the Longitudinal Rasch model (LRM) methods all based on the Rasch model. All methods have shown comparable results in terms of type I error, all close to 5 per cent. LRM and SM methods presented comparable power and unbiased time effect estimations, whereas RM and PV methods showed low power and biased time effect estimations. This suggests that RM and PV methods should be avoided to analyze longitudinal latent variables. Copyright © 2010 John Wiley & Sons, Ltd.

  20. Study on an Air Quality Evaluation Model for Beijing City Under Haze-Fog Pollution Based on New Ambient Air Quality Standards

    PubMed Central

    Li, Li; Liu, Dong-Jun

    2014-01-01

    Since 2012, China has been facing haze-fog weather conditions, and haze-fog pollution and PM2.5 have become hot topics. It is necessary to evaluate and analyze the ecological status of China's air environment, which is of great significance for environmental protection measures. In this study the current situation of haze-fog pollution in China was analyzed first, and the new Ambient Air Quality Standards were introduced. For the issue of air quality evaluation, a comprehensive evaluation model based on an entropy weighting method and a nearest neighbor method was developed. The entropy weighting method was used to determine the weights of indicators, and the nearest neighbor method was utilized to evaluate the air quality levels. The comprehensive evaluation model was then applied to practical air quality evaluation problems in Beijing to analyze the haze-fog pollution. Two simulation experiments were implemented in this study. One experiment included the indicator PM2.5 and was carried out based on the new Ambient Air Quality Standards (GB 3095-2012); the other experiment excluded PM2.5 and was carried out based on the old Ambient Air Quality Standards (GB 3095-1996). Their results were compared, and the simulation results showed that PM2.5 is an important indicator of air quality and that the evaluation results under the new Air Quality Standards were more scientific than those under the old ones. The haze-fog pollution situation in Beijing was also analyzed based on these results, and corresponding management measures were suggested. PMID:25170682
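
    The abstract does not give the exact formulas used, so the following is a standard entropy-weight calculation as commonly described in the literature, paired with a simple nearest-neighbor assignment of quality levels; the indicator data and level centroids are made up.

```python
# Standard entropy-weight method plus a nearest-neighbor level assignment (illustrative).
import numpy as np

def entropy_weights(X):
    """X: (samples, indicators); returns one weight per indicator."""
    P = X / X.sum(axis=0, keepdims=True)              # normalize each indicator column
    P = np.where(P == 0, 1e-12, P)                    # avoid log(0)
    k = 1.0 / np.log(X.shape[0])
    entropy = -k * (P * np.log(P)).sum(axis=0)        # entropy per indicator
    d = 1.0 - entropy                                 # degree of divergence
    return d / d.sum()

def nearest_level(sample, level_centroids, weights):
    """Assign the air-quality level whose centroid is closest in weighted distance."""
    dists = [np.sqrt((weights * (sample - c) ** 2).sum()) for c in level_centroids]
    return int(np.argmin(dists))

# Hypothetical daily concentrations of (PM2.5, PM10, SO2, NO2) for five days:
X = np.array([[35, 60, 10, 40], [150, 180, 25, 80], [75, 110, 15, 55],
              [20, 40, 8, 30], [250, 300, 35, 120]], dtype=float)
w = entropy_weights(X)
levels = [np.array([25, 50, 10, 30]), np.array([75, 120, 20, 60]), np.array([200, 250, 30, 100])]
print(w, nearest_level(X[1], levels, w))
```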

  1. Aerodynamic Analysis of the M33 Projectile Using the CFX Code

    DTIC Science & Technology

    2011-12-01

    The M33 projectile has been analyzed using the ANSYS CFX code, which is based on the numerical solution of the full Navier-Stokes equations. Simulation data were obtained using the CFX code. The ANSYS CFX code is a commercial CFD program used to simulate fluid flow in a variety of applications such as gas turbine

  2. Simulation of a compact analyzer-based imaging system with a regular x-ray source

    NASA Astrophysics Data System (ADS)

    Caudevilla, Oriol; Zhou, Wei; Stoupin, Stanislav; Verman, Boris; Brankov, J. G.

    2017-03-01

    Analyzer-based Imaging (ABI) belongs to a broader family of phase-contrast (PC) X-ray techniques. PC techniques measure the X-ray deflection that arises when the beam interacts with a sample and are known to provide higher-contrast images of soft tissue than other X-ray methods. This is of high interest in the medical field, in particular for mammography applications. This paper presents a simulation tool for table-top ABI systems using a conventional polychromatic X-ray source.

  3. Instrumentation for air quality measurements.

    NASA Technical Reports Server (NTRS)

    Loewenstein, M.

    1973-01-01

    Comparison of the new generation of air quality monitoring instruments with some more traditional methods. The first generation of air quality measurement instruments, based on the use of oxidant coulometric cells, nitrogen oxide colorimetry, carbon monoxide infrared analyzers, and other types of detectors, is compared with new techniques now coming into wide use in the air monitoring field and involving the use of chemiluminescent reactions, optical absorption detectors, a refinement of the carbon monoxide infrared analyzer, electrochemical cells based on solid electrolytes, and laser detectors.

  4. Analyzing the Validity of Relationship Banking through Agent-based Modeling

    NASA Astrophysics Data System (ADS)

    Nishikido, Yukihito; Takahashi, Hiroshi

    This article analyzes the validity of relationship banking through agent-based modeling. In the analysis, we especially focus on the relationship between economic conditions and both lenders' and borrowers' behaviors. As a result of intensive experiments, we made the following interesting findings: (1) Relationship banking contributes to reducing bad loan; (2) relationship banking is more effective in enhancing the market growth compared to transaction banking, when borrowers' sales scale is large; (3) keener competition among lenders may bring inefficiency to the market.

  5. Control system design method

    DOEpatents

    Wilson, David G [Tijeras, NM; Robinett, III, Rush D.

    2012-02-21

    A control system design method and concomitant control system comprising representing a physical apparatus to be controlled as a Hamiltonian system, determining elements of the Hamiltonian system representation which are power generators, power dissipators, and power storage devices, analyzing stability and performance of the Hamiltonian system based on the results of the determining step and determining necessary and sufficient conditions for stability of the Hamiltonian system, creating a stable control system based on the results of the analyzing step, and employing the resulting control system to control the physical apparatus.
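
    The patent abstract names the ingredients but no equations; as a hedged illustration of the kind of power-flow bookkeeping such a Hamiltonian formulation implies (generic notation, not the patent's actual conditions), the stored energy H acts as a Lyapunov candidate whose rate of change balances generation against dissipation:

```latex
% Illustrative power balance for a Hamiltonian system (generic symbols, not from the patent)
\dot{H}(q,p) \;=\; \underbrace{\sum_i u_i\, y_i}_{\text{power generators}}
\;-\; \underbrace{\sum_j d_j(q,p)}_{\text{power dissipators},\; d_j \ge 0}
```

    If the control is chosen so that generated power never exceeds dissipated power, then \dot{H} \le 0 and H (bounded below) serves as a Lyapunov function, which is the usual route to stability conditions of the kind the abstract mentions.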

  6. Simulation and modeling of the temporal performance of path-based restoration schemes in planar mesh networks

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Manish; McCaughan, Leon; Olkhovets, Anatoli; Korotky, Steven K.

    2006-12-01

    We formulate an analytic framework for the restoration performance of path-based restoration schemes in planar mesh networks. We analyze various switch architectures and signaling schemes and model their total restoration interval. We also evaluate the network global expectation value of the time to restore a demand as a function of network parameters. We analyze a wide range of nominally capacity-optimal planar mesh networks and find our analytic model to be in good agreement with numerical simulation data.

  7. Optical vector network analyzer based on double-sideband modulation.

    PubMed

    Jun, Wen; Wang, Ling; Yang, Chengwu; Li, Ming; Zhu, Ning Hua; Guo, Jinjin; Xiong, Liangming; Li, Wei

    2017-11-01

    We report an optical vector network analyzer (OVNA) based on double-sideband (DSB) modulation using a dual-parallel Mach-Zehnder modulator. The device under test (DUT) is measured twice with different modulation schemes. By post-processing the measurement results, the response of the DUT can be obtained accurately. Since DSB modulation is used in our approach, the measurement range is doubled compared with conventional single-sideband (SSB) modulation-based OVNA. Moreover, the measurement accuracy is improved by eliminating the even-order sidebands. The key advantage of the proposed scheme is that the measurement of a DUT with bandpass response can also be simply realized, which is a big challenge for the SSB-based OVNA. The proposed method is theoretically and experimentally demonstrated.

  8. Pattern Search in Multi-structure Data: A Framework for the Next-Generation Evidence-based Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela C

    With the advent of personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledge-bases) to predict diagnostic risks is fast emerging. Addressing this need, we pose and address the following questions: (i) How can we jointly analyze both qualitative and quantitative data? (ii) Is the fusion of multi-structure data expected to provide better insights than either of them individually? We present experiments on two bio-medical data sets - mammography and traumatic brain studies - to demonstrate architectures and tools for evidence-pattern search.

  9. Robust Inference of Risks of Large Portfolios

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han; Vickers, Byron

    2016-01-01

    We propose a bootstrap-based robust high-confidence level upper bound (Robust H-CLUB) for assessing the risks of large portfolios. The proposed approach exploits rank-based and quantile-based estimators, and can be viewed as a robust extension of the H-CLUB procedure (Fan et al., 2015). Such an extension allows us to handle possibly misspecified models and heavy-tailed data, which are stylized features in financial returns. Under mixing conditions, we analyze the proposed approach and demonstrate its advantage over H-CLUB. We further provide thorough numerical results to back up the developed theory, and also apply the proposed method to analyze a stock market dataset. PMID:27818569

  10. The SPAR thermal analyzer: Present and future

    NASA Astrophysics Data System (ADS)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.
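
    SPAR's processors and data base layout are not described in enough detail here to reproduce; the snippet below is only a minimal illustration of the Newton-Raphson idea mentioned for nonlinear steady-state problems, applied to a toy single-node conduction-plus-radiation balance. All numbers are hypothetical.

```python
# Toy Newton-Raphson solve of a nonlinear steady-state heat balance (illustrative only).
# Residual: conduction to a sink plus radiation must balance an imposed heat load Q:
#   R(T) = k*(T - T_sink) + sigma*eps*A*(T**4 - T_sink**4) - Q = 0
k, sigma, eps, A = 2.0, 5.670e-8, 0.8, 0.5       # W/K, W/m^2K^4, -, m^2 (made-up values)
T_sink, Q = 300.0, 400.0                          # K, W

def residual(T):
    return k * (T - T_sink) + sigma * eps * A * (T**4 - T_sink**4) - Q

def jacobian(T):
    return k + 4.0 * sigma * eps * A * T**3

T = 350.0                                         # initial guess
for it in range(50):
    dT = -residual(T) / jacobian(T)               # Newton-Raphson update
    T += dT
    if abs(dT) < 1e-8:
        break
print(f"steady-state temperature ~ {T:.2f} K after {it + 1} iterations")
```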

  11. The SPAR thermal analyzer: Present and future

    NASA Technical Reports Server (NTRS)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    1982-01-01

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.

  12. Counter sniper: a localization system based on dual thermal imager

    NASA Astrophysics Data System (ADS)

    He, Yuqing; Liu, Feihu; Wu, Zheng; Jin, Weiqi; Du, Benfang

    2010-11-01

    Sniper tactics are widely used in modern warfare, which creates an urgent requirement for counter-sniper detection devices. This paper proposes an anti-sniper detection system based on a dual thermal imaging system. By combining the infrared characteristics of the muzzle flash and the bullet trajectory in the binocular infrared images obtained by the dual infrared imaging system, the exact location of the sniper is analyzed and calculated. The paper mainly focuses on the system design method, which includes the structure and parameter selection. It also analyzes the exact location calculation method based on binocular stereo vision and image analysis, and gives the fusion result as the sniper's position.
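
    The abstract gives no equations; the snippet below is only the textbook binocular triangulation relation such a dual-imager system would rely on, with hypothetical camera parameters: for a rectified stereo pair, depth Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity of the muzzle-flash detection between the two thermal images.

```python
# Textbook rectified-stereo triangulation of a detected muzzle flash (illustrative values).
def triangulate(u_left, u_right, v, f_px=1200.0, baseline_m=0.6, cx=320.0, cy=256.0):
    """Return (X, Y, Z) in metres in the left-camera frame for matched pixel detections."""
    disparity = u_left - u_right                   # pixels; must be > 0 for a valid match
    if disparity <= 0:
        raise ValueError("non-positive disparity: detections not matched correctly")
    Z = f_px * baseline_m / disparity              # depth
    X = (u_left - cx) * Z / f_px                   # lateral offset
    Y = (v - cy) * Z / f_px                        # vertical offset
    return X, Y, Z

# Example: flash seen at column 402 (left) and 398 (right), row 240:
print(triangulate(402.0, 398.0, 240.0))            # ~180 m range for these made-up numbers
```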

  13. Nanoparticles based fiber optic SPR sensor

    NASA Astrophysics Data System (ADS)

    Shah, Kruti; Sharma, Navneet K.

    2018-05-01

    A localized surface plasmon resonance based fiber optic sensor using platinum nanoparticles is proposed and theoretically analyzed. Increasing the thickness of the nanoparticle layer increases the sensitivity of the sensor; a sensor based on a 50 nm thick platinum nanoparticle layer shows the highest sensitivity.

  14. Analyzing the travel behavior of home-based workers in the 1991 Caltrans Statewide Travel Survey

    DOT National Transportation Integrated Search

    1998-10-01

    This study compares the travel patterns of three different groups of workers identified in the 1991 Caltrans Statewide Travel Survey: home based business (HBB) workers, home based telecommuters (HBT), and non-home based (NHB) workers. HBB workers hav...

  15. SPHINX--an algorithm for taxonomic binning of metagenomic sequences.

    PubMed

    Mohammed, Monzoorul Haque; Ghosh, Tarini Shankar; Singh, Nitin Kumar; Mande, Sharmila S

    2011-01-01

    Compared with composition-based binning algorithms, the binning accuracy and specificity of alignment-based binning algorithms is significantly higher. However, being alignment-based, the latter class of algorithms require enormous amount of time and computing resources for binning huge metagenomic datasets. The motivation was to develop a binning approach that can analyze metagenomic datasets as rapidly as composition-based approaches, but nevertheless has the accuracy and specificity of alignment-based algorithms. This article describes a hybrid binning approach (SPHINX) that achieves high binning efficiency by utilizing the principles of both 'composition'- and 'alignment'-based binning algorithms. Validation results with simulated sequence datasets indicate that SPHINX is able to analyze metagenomic sequences as rapidly as composition-based algorithms. Furthermore, the binning efficiency (in terms of accuracy and specificity of assignments) of SPHINX is observed to be comparable with results obtained using alignment-based algorithms. A web server for the SPHINX algorithm is available at http://metagenomics.atc.tcs.com/SPHINX/.

  16. Electronic sleep analyzer

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.

    1970-01-01

    Electronic instrument automatically monitors the stages of sleep of a human subject. The analyzer provides a series of discrete voltage steps with each step corresponding to a clinical assessment of level of consciousness. It is based on the operation of an EEG and requires very little telemetry bandwidth or time.

  17. Quantitative Method for Analyzing the Allocation of Risks in Transportation Construction

    DOT National Transportation Integrated Search

    1979-04-01

    The report presents a conceptual model of risk that was developed to analyze the impact on owner's cost of alternate allocations of risk among owner and contractor in mass transit construction. A model and analysis procedure are developed, based on d...

  18. Language Comprehension and the Acquisition of Knowledge.

    ERIC Educational Resources Information Center

    Freedle, Roy O., Ed.; Carroll, John B., Ed.

    Thirteen papers given by language specialists are presented. These analyze special linguistic (semantic) problems that occur when interconnected strings of sentences constitute data base; they also analyze special psychological problems (of memory, inference, and motivation) that occur when human subjects are exposed to discourse materials in…

  19. Analysis of Vertebral Bone Strength, Fracture Pattern, and Fracture Location: A Validation Study Using a Computed Tomography-Based Nonlinear Finite Element Analysis

    PubMed Central

    Imai, Kazuhiro

    2015-01-01

    Finite element analysis (FEA) is an advanced computer technique of structural stress analysis developed in engineering mechanics. Because the compressive behavior of vertebral bone shows nonlinear behavior, a nonlinear FEA should be utilized to analyze the clinical vertebral fracture. In this article, a computed tomography-based nonlinear FEA (CT/FEA) to analyze the vertebral bone strength, fracture pattern, and fracture location is introduced. The accuracy of the CT/FEA was validated by performing experimental mechanical testing with human cadaveric specimens. Vertebral bone strength and the minimum principal strain at the vertebral surface were accurately analyzed using the CT/FEA. The experimental fracture pattern and fracture location were also accurately simulated. Optimization of the element size was performed by assessing the accuracy of the CT/FEA, and the optimum element size was assumed to be 2 mm. It is expected that the CT/FEA will be valuable in analyzing vertebral fracture risk and assessing therapeutic effects on osteoporosis. PMID:26029476

  20. Real-time color image processing for forensic fiber investigations

    NASA Astrophysics Data System (ADS)

    Paulsson, Nils

    1995-09-01

    This paper describes a system for automatic fiber debris detection based on color identification. The properties of the system are fast analysis and high selectivity, a necessity when analyzing forensic fiber samples: an ordinary investigation separates the material into well above 100,000 video images to be analyzed. The system is based on standard techniques, with a CCD camera, a motorized sample table, and an IBM-compatible PC/AT with add-on boards for video frame digitization and stepping motor control as the main parts. It is possible to operate the instrument at full video rate (25 images/s) with the aid of the HSI (hue-saturation-intensity) color system and software optimization. High selectivity is achieved by separating the analysis into several steps. The first step is fast, direct color identification of objects in the analyzed video images; the second step analyzes the detected objects in a more complex and time-consuming stage of the investigation to identify single fiber fragments for subsequent analysis with more selective techniques.

  1. A Survey of Artificial Immune System Based Intrusion Detection

    PubMed Central

    Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted. PMID:24790549

  2. Automatic Analyzers and Signal Indicators of Toxic and Dangerously Explosive Substances in Air,

    DTIC Science & Technology

    1980-01-09

    of air, thermo-conductometric and electroconductometric methods are also used. The thermo-conductometric method of analysis is based on a change of the... conductometric gas analyzers is very limited and is reduced in essence to the analysis of two-component mixtures or multicomponent ones, all of whose...differs. The main disadvantage of the thermo-conductometric gas analyzers is increased sensitivity to a change in the ambient conditions, in consequence of

  3. Over the Pole: A Fuel Efficiency Analysis of Employing Joint Base Elmendorf-Richardson for Polar Route Utilization

    DTIC Science & Technology

    2014-06-13

    ...helping me with research, as well as Lt Col Adam Reiman for the use of his modeling software. Both were critical components to help complete this... Figure 7: Route Analyzer Secondary Airfields (Reiman, 2013)... Transit Center (Nichol, 2013). AFIT Route Analyzer Model: The AFIT Route Analyzer was created by AFIT PhD student Lt Col Adam Reiman. The model was

  4. Ganalyzer: A tool for automatic galaxy image analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-05-01

    Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
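
    Ganalyzer's own implementation is not shown in this record; the snippet below is a simplified sketch of one step it describes, building a radial intensity plot from a centered galaxy image by sampling intensities on circles of increasing radius. The array shapes and sampling scheme are assumptions.

```python
# Simplified radial intensity plot for a centered galaxy image (illustrative sketch).
import numpy as np

def radial_intensity_plot(image, center, r_max, n_angles=360):
    """Return an (r_max, n_angles) array of intensities sampled on circles around center."""
    cy, cx = center
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    plot = np.zeros((r_max, n_angles))
    for r in range(1, r_max + 1):
        ys = np.clip(np.round(cy + r * np.sin(thetas)).astype(int), 0, image.shape[0] - 1)
        xs = np.clip(np.round(cx + r * np.cos(thetas)).astype(int), 0, image.shape[1] - 1)
        plot[r - 1] = image[ys, xs]
    return plot

# Peaks along each row of this plot trace the spiral arms; the slope of peak angle versus
# radius is one way to quantify spirality, as the abstract describes.
```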

  5. Monitoring biomedical literature for post-market safety purposes by analyzing networks of text-based coded information.

    PubMed

    Botsis, Taxiarchis; Foster, Matthew; Kreimeyer, Kory; Pandey, Abhishek; Forshee, Richard

    2017-01-01

    Literature review is critical but time-consuming in the post-market surveillance of medical products. We focused on the safety signal of intussusception after the vaccination of infants with the Rotashield Vaccine in 1999 and retrieved all PubMed abstracts for rotavirus vaccines published after January 1, 1998. We used the Event-based Text-mining of Health Electronic Records system, the MetaMap tool, and the National Center for Biomedical Ontologies Annotator to process the abstracts and generate coded terms stamped with the date of publication. Data were analyzed in the Pattern-based and Advanced Network Analyzer for Clinical Evaluation and Assessment to evaluate the intussusception-related findings before and after the release of the new rotavirus vaccines in 2006. The tight connection of intussusception with the historical signal in the first period and the absence of any safety concern for the new vaccines in the second period were verified. We demonstrated the feasibility for semi-automated solutions that may assist medical reviewers in monitoring biomedical literature.

  6. MPA Portable: A Stand-Alone Software Package for Analyzing Metaproteome Samples on the Go.

    PubMed

    Muth, Thilo; Kohrs, Fabian; Heyer, Robert; Benndorf, Dirk; Rapp, Erdmann; Reichl, Udo; Martens, Lennart; Renard, Bernhard Y

    2018-01-02

    Metaproteomics, the mass spectrometry-based analysis of proteins from multispecies samples, faces severe challenges concerning data analysis and results interpretation. To overcome these shortcomings, we here introduce the MetaProteomeAnalyzer (MPA) Portable software. In contrast to the original server-based MPA application, this newly developed tool no longer requires computational expertise for installation and is now independent of any relational database system. In addition, MPA Portable now supports state-of-the-art database search engines and a convenient command line interface for high-performance data processing tasks. While search engine results can easily be combined to increase the protein identification yield, an additional two-step workflow is implemented to provide sufficient analysis resolution for further postprocessing steps, such as protein grouping as well as taxonomic and functional annotation. Our new application has been developed with a focus on intuitive usability, adherence to data standards, and adaptation to Web-based workflow platforms. The open source software package can be found at https://github.com/compomics/meta-proteome-analyzer .

  7. A Field-Portable Cell Analyzer without a Microscope and Reagents.

    PubMed

    Seo, Dongmin; Oh, Sangwoo; Lee, Moonjin; Hwang, Yongha; Seo, Sungkyu

    2017-12-29

    This paper demonstrates a commercial-level field-portable lens-free cell analyzer called the NaviCell (No-stain and Automated Versatile Innovative cell analyzer) capable of automatically analyzing cell count and viability without employing an optical microscope and reagents. Based on the lens-free shadow imaging technique, the NaviCell (162 × 135 × 138 mm³ and 1.02 kg) has the advantage of providing analysis results with improved standard deviation between measurement results, owing to its large field of view. Importantly, the cell counting and viability testing can be analyzed without the use of any reagent, thereby simplifying the measurement procedure and reducing potential errors during sample preparation. In this study, the performance of the NaviCell for cell counting and viability testing was demonstrated using 13 and six cell lines, respectively. Based on the results of the hemocytometer ( de facto standard), the error rate (ER) and coefficient of variation (CV) of the NaviCell are approximately 3.27 and 2.16 times better than the commercial cell counter, respectively. The cell viability testing of the NaviCell also showed an ER and CV performance improvement of 5.09 and 1.8 times, respectively, demonstrating sufficient potential in the field of cell analysis.
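
    As a small worked example of the two figures of merit quoted, assuming the usual definitions (error rate as relative deviation from the hemocytometer reference, coefficient of variation as standard deviation over mean across repeats) and made-up counts:

```python
# Error rate vs. a hemocytometer reference and coefficient of variation (made-up counts).
import numpy as np

reference = 1.00e6                                    # cells/mL from the hemocytometer
repeats = np.array([0.97e6, 1.02e6, 0.99e6, 1.01e6])  # repeated device measurements

error_rate = abs(repeats.mean() - reference) / reference * 100   # percent
cv = repeats.std(ddof=1) / repeats.mean() * 100                  # percent
print(f"ER = {error_rate:.2f} %, CV = {cv:.2f} %")
```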

  8. Development of the tongue coating analyzer based on concave grating monochrometer and virtual instrument

    NASA Astrophysics Data System (ADS)

    Ren, Zhong; Liu, Guodong; Zeng, Lvming; Huang, Zhen; Zeng, Wenping

    2010-10-01

    Tongue coating diagnosis is an important part of tongue diagnosis in traditional Chinese medicine (TCM). Changes in the thickness and color of the tongue coating can reflect the pathological state of the patient. By observing the tongue coating, a Chinese doctor can determine the nature or severity of a disease. Because of limitations in the tongue diagnosis method of TCM and in methods based on digital image processing, a novel tongue coating analyzer (TCA) based on a concave grating monochromator and virtual instrument is developed in this paper. The analyzer consists of the light source system, check cavity, optical fiber probe, concave grating monochromator, spectrum detector system based on a CCD and data acquisition (DAQ) card, signal processing circuit system, computer and data analysis software based on LabVIEW, etc. Experimental results show that the TCA's spectral range can reach 300-1000 nm, its wavelength resolution can reach 1 nm, and the TCA uses back-split-light technology and multi-channel parallel analysis. Compared with TCAs based on image processing technology, this TCA has many advantages, such as compact volume, simpler algorithms, faster processing speed, higher accuracy, lower cost, and real-time data handling and result display. Therefore, it has great potential value in the field of tongue coating diagnosis for TCM.

  9. Image Edge Extraction via Fuzzy Reasoning

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus A. (Inventor); Klinko, Steve (Inventor)

    2008-01-01

    A computer-based technique for detecting edges in gray-level digital images employs fuzzy reasoning to analyze whether each pixel in an image is likely to lie on an edge. The image is analyzed on a pixel-by-pixel basis by analyzing gradient levels of pixels in a square window surrounding the pixel being analyzed. An edge path passing through the pixel having the greatest intensity gradient is used as input to a fuzzy membership function, which employs fuzzy singletons and inference rules to assign a new gray-level value, related to the pixel's edginess degree, to the pixel.
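
    The patent's exact membership functions and rules are not given in this record; the sketch below only shows the general flavor, assuming the pixel's "edginess degree" is obtained by passing the strongest gradient in a 3x3 window through a simple piecewise-linear fuzzy membership and scaling it to a gray level. All thresholds are hypothetical.

```python
# Hedged sketch of gradient-window + fuzzy-membership edge scoring (not the patented method).
import numpy as np

def edginess_map(image, low=10.0, high=60.0):
    """Return a gray-level map where bright pixels are judged 'edgy' by a fuzzy membership."""
    img = image.astype(float)
    gy, gx = np.gradient(img)
    grad = np.hypot(gx, gy)
    out = np.zeros_like(img)
    H, W = img.shape
    for y in range(1, H - 1):
        for x in range(1, W - 1):
            g = grad[y - 1:y + 2, x - 1:x + 2].max()          # strongest gradient in the window
            mu = np.clip((g - low) / (high - low), 0.0, 1.0)  # piecewise-linear membership
            out[y, x] = 255.0 * mu                            # map edginess degree to a gray level
    return out.astype(np.uint8)
```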

  10. Controlling the joint local false discovery rate is more powerful than meta-analysis methods in joint analysis of summary statistics from multiple genome-wide association studies.

    PubMed

    Jiang, Wei; Yu, Weichuan

    2017-02-15

    In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R-package is available at: http://bioinformatics.ust.hk/Jlfdr.html . eeyu@ust.hk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
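
    The paper's exact estimator is not reproduced here; as a hedged illustration of the quantity being controlled, the joint local false discovery rate for z-scores (z1, z2) from two studies can be written under a two-group model as the posterior null probability, Jlfdr(z1, z2) = pi0 f0(z1) f0(z2) / f(z1, z2). The snippet evaluates this for a toy Gaussian mixture with made-up parameters.

```python
# Toy two-group model for a joint local false discovery rate (illustrative parameters only).
import numpy as np
from scipy.stats import norm

pi0, mu1, sd1 = 0.9, 2.0, 1.0      # null proportion and a single alternative component

def f_joint(z1, z2):
    """Mixture density assuming the two studies share the same association status."""
    null = norm.pdf(z1) * norm.pdf(z2)
    alt = norm.pdf(z1, mu1, sd1) * norm.pdf(z2, mu1, sd1)
    return pi0 * null + (1.0 - pi0) * alt

def jlfdr(z1, z2):
    """Posterior probability of the null given both z-scores."""
    return pi0 * norm.pdf(z1) * norm.pdf(z2) / f_joint(z1, z2)

print(jlfdr(0.5, 0.3), jlfdr(3.1, 2.8))   # large vs. small posterior null probability
```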

  11. Mode evolution in polarization maintain few mode fibers and applications in mode-division-multiplexing systems

    NASA Astrophysics Data System (ADS)

    Li, Yan; Zeng, Xinglin; Mo, Qi; Li, Wei; Liu, Zhijian; Wu, Jian

    2016-10-01

    In few-mode polarization-maintaining fiber (FM-PMF), effective-index splitting exists not only between orthogonal polarization states but also between the degenerate modes within a high-order mode group. Hence, besides the polarization state evolution, the mode patterns in each LP set need to be analyzed. In this letter, the complete first-order mode (LP11) evolution in PM-FMF is analyzed and represented by an analogous Jones vector and Poincaré sphere, respectively. Furthermore, with Jones matrix analysis, the modal dynamics in FM-PMFs is conveniently analyzed. The conclusions are used to propose a PM-FMF-based LP11 mode rotator and a PM-FMF-based OAM generator. Both simulation and experiments are conducted to investigate the performance of the two devices.

  12. Analysis of Electric Vehicle DC High Current Conversion Technology

    NASA Astrophysics Data System (ADS)

    Yang, Jing; Bai, Jing-fen; Lin, Fan-tao; Lu, Da

    2017-05-01

    Against the background of electric vehicles, the necessity of accurate electric energy metering for electric vehicle power batteries is elaborated, and the charging and discharging characteristics of power batteries are analyzed. A DC large-current converter is needed to realize accurate calibration of power battery electric energy metering. Several kinds of measuring methods based on shunts and on the magnetic induction principle are analyzed in detail. The principle of a power battery charge and discharge calibration system is put forward, and the ripple content and harmonic content on the AC side and DC side of the power batteries are simulated and analyzed. Suitable DC large-current measurement methods for power batteries are put forward by comparing different measurement principles, and prospects for DC large-current measurement techniques are discussed.

  13. Analyzing Activity Behavior and Movement in a Naturalistic Environment using Smart Home Techniques

    PubMed Central

    Cook, Diane J.; Schmitter-Edgecombe, Maureen; Dawadi, Prafulla

    2015-01-01

    One of the many services that intelligent systems can provide is the ability to analyze the impact of different medical conditions on daily behavior. In this study we use smart home and wearable sensors to collect data while (n=84) older adults perform complex activities of daily living. We analyze the data using machine learning techniques and reveal that differences between healthy older adults and adults with Parkinson disease not only exist in their activity patterns, but that these differences can be automatically recognized. Our machine learning classifiers reach an accuracy of 0.97 with an AUC value of 0.97 in distinguishing these groups. Our permutation-based testing confirms that the sensor-based differences between these groups are statistically significant. PMID:26259225

  14. Opinion dynamics in a group-based society

    NASA Astrophysics Data System (ADS)

    Gargiulo, F.; Huet, S.

    2010-09-01

    Many models have been proposed to analyze the evolution of opinion structure due to the interaction of individuals in their social environment. Such models analyze the spreading of ideas both in completely interacting backgrounds and on social networks, where each person has a finite set of interlocutors. In this paper we analyze the reciprocal feedback between the opinions of the individuals and the structure of the interpersonal relationships at the level of community structures. For this purpose we define a group-based random network and we study how this structure co-evolves with opinion dynamics processes. We observe that the adaptive network structure affects the opinion dynamics process helping the consensus formation. The results also show interesting behaviors in regards to the size distribution of the groups and their correlation with opinion structure.

  15. Multi-party Measurement-Device-Independent Quantum Key Distribution Based on Cluster States

    NASA Astrophysics Data System (ADS)

    Liu, Chuanqi; Zhu, Changhua; Ma, Shuquan; Pei, Changxing

    2018-03-01

    We propose a novel multi-party measurement-device-independent quantum key distribution (MDI-QKD) protocol based on cluster states. A four-photon analyzer which can distinguish all 16 cluster states serves as the measurement device for four-party MDI-QKD. Any two out of four participants can build secure keys after the analyzer obtains successful outputs and the two participants perform post-processing. We derive a security analysis for the protocol, and analyze the key rates under different values of polarization misalignment. The results show that four-party MDI-QKD is feasible over 280 km in the optical fiber channel when the key rate is about 10^-6 with the polarization misalignment parameter 0.015. Moreover, our work takes an important step toward a quantum communication network.

  16. Analysis of Power System Low Frequency Oscillation Based on Energy Shift Theory

    NASA Astrophysics Data System (ADS)

    Zhang, Junfeng; Zhang, Chunwang; Ma, Daqing

    2018-01-01

    In this paper, a new method for analyzing low-frequency oscillation between analytic areas based on an energy coefficient is proposed. The concept of the energy coefficient is introduced by constructing an energy function, and the low-frequency oscillation is analyzed according to the energy coefficient under the current operating conditions; meanwhile, the concept of model energy is proposed to analyze the energy exchange behavior between two generators. Not only does this method provide an explanation of low-frequency oscillation from the energy point of view, but it also helps further reveal the dynamic behavior of complex power systems. The case analyses of a four-machine two-area system and the Jilin Power Grid prove the correctness and effectiveness of the proposed method in low-frequency oscillation analysis of power systems.

  17. Analyzing Activity Behavior and Movement in a Naturalistic Environment Using Smart Home Techniques.

    PubMed

    Cook, Diane J; Schmitter-Edgecombe, Maureen; Dawadi, Prafulla

    2015-11-01

    One of the many services that intelligent systems can provide is the ability to analyze the impact of different medical conditions on daily behavior. In this study, we use smart home and wearable sensors to collect data, while ( n = 84) older adults perform complex activities of daily living. We analyze the data using machine learning techniques and reveal that differences between healthy older adults and adults with Parkinson disease not only exist in their activity patterns, but that these differences can be automatically recognized. Our machine learning classifiers reach an accuracy of 0.97 with an area under the ROC curve value of 0.97 in distinguishing these groups. Our permutation-based testing confirms that the sensor-based differences between these groups are statistically significant.

  18. Covering the Bases: Exploring Alternative Systems

    ERIC Educational Resources Information Center

    Kurz, Terri L.; Garcia, Jorge

    2015-01-01

    Since the 1950s, the understanding of how the base 10 system works has been encouraged through alternative base systems (Price 1995; Woodward 2004). If high school students are given opportunities to learn other base systems and analyze what they denote, we believe that they will better understand the structure of base 10 and its operations…

  19. Disposable pen-shaped capillary gel electrophoresis cartridge for fluorescence detection of bio-molecules

    NASA Astrophysics Data System (ADS)

    Amirkhanian, Varoujan; Tsai, Shou-Kuan

    2014-03-01

    We introduce a novel and cost-effective capillary gel electrophoresis (CGE) system utilizing disposable pen-shaped gel cartridges for highly efficient, high-speed, high-throughput fluorescence detection of bio-molecules. The CGE system has been integrated with dual excitation and emission optical fibers with a micro-ball end design for fluorescence detection of bio-molecules separated and detected in a disposable pen-shaped capillary gel electrophoresis cartridge. The high-performance CGE analyzer has been optimized for glycoprotein analysis applications. Using a commercially available labeling agent such as ANTS (8-aminonaphthalene-1,3,6-trisulfonate) as an indicator, the capillary gel electrophoresis-based glycan analyzer provides high detection sensitivity and high resolving power in 2-5 minutes of separation. The system can hold a total of 96 samples, which can be automatically analyzed within 4-5 hours. This affordable fiber-optic-based fluorescence detection system provides fast run times (4 minutes vs. 20 minutes with other CE systems), improved peak resolution, a good linear dynamic range and reproducible migration times, and can be used in laboratories for high-speed glycan (N-glycan) profiling applications. The CGE-based glycan analyzer will significantly increase the pace at which glycoprotein research is performed in labs, saving hours of preparation time and assuring accurate, consistent and economical results.

  20. Displacement-based back-analysis of the model parameters of the Nuozhadu high earth-rockfill dam.

    PubMed

    Wu, Yongkang; Yuan, Huina; Zhang, Bingyin; Zhang, Zongliang; Yu, Yuzhen

    2014-01-01

    The parameters of the constitutive model, the creep model, and the wetting model of materials of the Nuozhadu high earth-rockfill dam were back-analyzed together based on field monitoring displacement data by employing an intelligent back-analysis method. In this method, an artificial neural network is used as a substitute for time-consuming finite element analysis, and an evolutionary algorithm is applied for both network training and parameter optimization. To avoid simultaneous back-analysis of many parameters, the model parameters of the three main dam materials are decoupled and back-analyzed separately in a particular order. Displacement back-analyses were performed at different stages of the construction period, with and without considering the creep and wetting deformations. Good agreement between the numerical results and the monitoring data was obtained for most observation points, which implies that the back-analysis method and decoupling method are effective for solving complex problems with multiple models and parameters. The comparison of calculation results based on different sets of back-analyzed model parameters indicates the necessity of taking the effects of creep and wetting into consideration in the numerical analyses of high earth-rockfill dams. With the resulting model parameters, the stress and deformation distributions at completion are predicted and analyzed.
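
    The dam model and network architecture are not detailed in this record; the sketch below only illustrates the generic pattern the abstract describes, training a cheap surrogate of the finite element response and letting an evolutionary-style search pick the parameters whose predicted displacements best match the monitoring data. The surrogate, bounds and data are all hypothetical.

```python
# Generic surrogate-assisted back-analysis loop (illustrative; not the authors' setup).
import numpy as np

rng = np.random.default_rng(1)

def fem_response(params):
    """Stand-in for a costly FE run: maps 3 material parameters to 5 displacements (fake)."""
    A = np.array([[0.8, 0.1, 0.3], [0.2, 0.9, 0.1], [0.5, 0.4, 0.2],
                  [0.1, 0.3, 0.7], [0.6, 0.2, 0.4]])
    return A @ params

true_params = np.array([1.2, 0.7, 2.0])
observed = fem_response(true_params) + rng.normal(0, 0.01, 5)   # "monitoring data"

# A handful of (expensive) forward runs builds a training set; fit a linear surrogate to it.
samples = rng.uniform(0.1, 3.0, size=(40, 3))
responses = np.array([fem_response(p) for p in samples])
coef, *_ = np.linalg.lstsq(samples, responses, rcond=None)      # surrogate: params -> displacements

def misfit(params):
    return np.sum((params @ coef - observed) ** 2)

# Simple (mu + lambda) evolutionary search on the surrogate instead of the FE model.
pop = rng.uniform(0.1, 3.0, size=(30, 3))
for gen in range(100):
    pop = pop[np.argsort([misfit(p) for p in pop])][:10]        # keep the 10 best
    children = pop[rng.integers(0, 10, 20)] + rng.normal(0, 0.05, (20, 3))
    pop = np.vstack([pop, np.clip(children, 0.1, 3.0)])

best = pop[np.argmin([misfit(p) for p in pop])]
print("back-analyzed parameters:", best, "true:", true_params)
```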

  1. Counter-Learning under Oppression

    ERIC Educational Resources Information Center

    Kucukaydin, Ilhan

    2010-01-01

    This qualitative study utilized the method of narrative analysis to explore the counter-learning process of an oppressed Kurdish woman from Turkey. Critical constructivism was utilized to analyze counter-learning; Frankfurt School-based Marcusian critical theory was used to analyze the sociopolitical context and its impact on the oppressed. Key…

  2. A likelihood-based biostatistical model for analyzing consumer movement in simultaneous choice experiments

    USDA-ARS?s Scientific Manuscript database

    Measures of animal movement versus consumption rates can provide valuable, ecologically relevant information on feeding preference, specifically estimates of attraction rate, leaving rate, tenure time, or measures of flight/walking path. Here, we develop a simple biostatistical model to analyze repe...

  3. Cultivating a Sense of Place in Religious Studies

    ERIC Educational Resources Information Center

    Jensen, Molly Hadley

    2015-01-01

    This essay analyzes student learning through place-based pedagogies in an American Religions course. In the course, students analyzed cultural meanings and practices of regional religious communities and participated in sensory awareness and ecological learning in a campus garden. Embodied learning increased student understanding and appreciation…

  4. Analyzing Information Systems Development: A Comparison and Analysis of Eight IS Development Approaches.

    ERIC Educational Resources Information Center

    Iivari, Juhani; Hirschheim, Rudy

    1996-01-01

    Analyzes and compares eight information systems (IS) development approaches: Information Modelling, Decision Support Systems, the Socio-Technical approach, the Infological approach, the Interactionist approach, the Speech Act-based approach, Soft Systems Methodology, and the Scandinavian Trade Unionist approach. Discusses the organizational roles…

  5. [Research-based thinking of nurses at the end of their basic nursing education].

    PubMed

    Säämänen, J

    1993-01-01

    The purpose of this study was to determine the level of research-based thinking that students have internalized during their professional education. The research-based thinking of students was investigated by analyzing the study reports they had produced at the end of their professional education. "Research-based thinking" was evaluated from the perspective of a research consumer; the purpose was therefore not to evaluate the level of the studies themselves, but their basis. The data consisted of 51 study reports and were analyzed using content analysis and descriptive statistics. The content analysis was based on a set of general rules for research, applied with a deductive approach. The students' problems in research-based thinking fell mainly into two areas: (1) the students did not use previous studies to build up the frame and the problems of their study, they made limited use of scientific publications, and knowledge appeared to carry the same value for them regardless of its source; and (2) the students had difficulties in connecting the empirical part of their study with its theoretical or conceptual frame. The results indicated that Finnish nursing students' research-based thinking, as reflected in their study reports at the end of education, is not yet mature. Further research seems important in order to analyze the reasons for these findings and how education could be improved in the future.

  6. Block-Based Connected-Component Labeling Algorithm Using Binary Decision Trees

    PubMed Central

    Chang, Wan-Yu; Chiu, Chung-Cheng; Yang, Jia-Horng

    2015-01-01

    In this paper, we propose a fast labeling algorithm based on block-based concepts. Because the number of memory access points directly affects the time consumption of the labeling algorithms, the aim of the proposed algorithm is to minimize neighborhood operations. Our algorithm utilizes a block-based view and correlates a raster scan to select the necessary pixels generated by a block-based scan mask. We analyze the advantages of a sequential raster scan for the block-based scan mask, and integrate the block-connected relationships using two different procedures with binary decision trees to reduce unnecessary memory access. This greatly simplifies the pixel locations of the block-based scan mask. Furthermore, our algorithm significantly reduces the number of leaf nodes and depth levels required in the binary decision tree. We analyze the labeling performance of the proposed algorithm alongside that of other labeling algorithms using high-resolution images and foreground images. The experimental results from synthetic and real image datasets demonstrate that the proposed algorithm is faster than other methods. PMID:26393597
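    For readers unfamiliar with the baseline that block-based algorithms improve upon, the sketch below shows a conventional two-pass connected-component labeling with union-find on a binary image; it is a simplified reference implementation, not the block-based decision-tree algorithm of the paper.

      # Minimal two-pass connected-component labeling (8-connectivity) with union-find.
      # This is the classical baseline; the paper's block-based decision-tree method
      # reduces the neighborhood tests and memory accesses performed in this loop.
      import numpy as np

      def label_components(binary):
          labels = np.zeros(binary.shape, dtype=int)
          parent = [0]                       # union-find forest, index 0 = background

          def find(x):
              while parent[x] != x:
                  parent[x] = parent[parent[x]]
                  x = parent[x]
              return x

          def union(a, b):
              ra, rb = find(a), find(b)
              if ra != rb:
                  parent[max(ra, rb)] = min(ra, rb)

          h, w = binary.shape
          next_label = 1
          for y in range(h):                 # first pass: provisional labels + equivalences
              for x in range(w):
                  if not binary[y, x]:
                      continue
                  neighbors = [labels[y + dy, x + dx]
                               for dy, dx in ((-1, -1), (-1, 0), (-1, 1), (0, -1))
                               if 0 <= y + dy and 0 <= x + dx < w and labels[y + dy, x + dx]]
                  if not neighbors:
                      parent.append(next_label)
                      labels[y, x] = next_label
                      next_label += 1
                  else:
                      labels[y, x] = min(neighbors)
                      for n in neighbors:
                          union(labels[y, x], n)

          for y in range(h):                 # second pass: resolve equivalences
              for x in range(w):
                  if labels[y, x]:
                      labels[y, x] = find(labels[y, x])
          return labels

      img = np.array([[1, 1, 0, 0], [0, 1, 0, 1], [0, 0, 0, 1]])
      print(label_components(img))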

  7. Molecular switching behavior in isosteric DNA base pairs.

    PubMed

    Jissy, A K; Konar, Sukanya; Datta, Ayan

    2013-04-15

    The structures and proton-coupled behavior of adenine-thymine (A-T) and a modified base pair containing a thymine isostere, adenine-difluorotoluene (A-F), are studied in different solvents by dispersion-corrected density functional theory. The stability of the canonical Watson-Crick base pair and the mismatched pair in various solvents with low and high dielectric constants is analyzed. It is demonstrated that A-F base pairing is favored in solvents with low dielectric constant. The stabilization and conformational changes induced by protonation are also analyzed for the natural as well as the mismatched base pair. DNA sequences capable of changing their sequence conformation on protonation are used in the construction of pH-based molecular switches. An acidic medium has a profound influence in stabilizing the isostere base pair. Such a large gain in stability on protonation leads to an interesting pH-controlled molecular switch, which can be incorporated in a natural DNA tract. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Digital video timing analyzer for the evaluation of PC-based real-time simulation systems

    NASA Astrophysics Data System (ADS)

    Jones, Shawn R.; Crosby, Jay L.; Terry, John E., Jr.

    2009-05-01

    Due to the rapid acceleration in technology and the drop in costs, the use of commercial off-the-shelf (COTS) PC-based hardware and software components for digital and hardware-in-the-loop (HWIL) simulations has increased. However, the increase in PC-based components creates new challenges for HWIL test facilities such as cost-effective hardware and software selection, system configuration and integration, performance testing, and simulation verification/validation. This paper will discuss how the Digital Video Timing Analyzer (DiViTA) installed in the Aviation and Missile Research, Development and Engineering Center (AMRDEC) provides quantitative characterization data for PC-based real-time scene generation systems. An overview of the DiViTA is provided followed by details on measurement techniques, applications, and real-world examples of system benefits.

  9. A novel all-optical label processing for OPS networks based on multiple OOC sequences from multiple-groups OOC

    NASA Astrophysics Data System (ADS)

    Qiu, Kun; Zhang, Chongfu; Ling, Yun; Wang, Yibo

    2007-11-01

    This paper proposes, for the first time to the best of our knowledge, an all-optical label processing scheme using multiple optical orthogonal code sequences (MOOCS) for optical packet switching (OPS) (MOOCS-OPS) networks. In this scheme, the multiple optical orthogonal codes (MOOC) from multiple-group optical orthogonal codes (MGOOC) are permuted and combined to obtain the MOOCS used as optical labels, which effectively enlarges the set of optical codes available for labeling. Existing optical label processing (OLP) schemes are reviewed and analyzed, the principles of MOOCS-based optical labels for OPS networks are given and analyzed, and the MOOCS-OPS topology and the key realization units of the MOOCS-based optical label packets are then studied in detail. The performance of this all-optical label processing technique is analyzed and the corresponding simulations are performed. These analyses and results show that the proposed scheme can overcome the shortage of available optical orthogonal code (OOC)-based optical labels caused by the limited number of single OOCs with short code length, and indicate that the MOOCS-OPS scheme is feasible.

  10. Power-law statistics of neurophysiological processes analyzed using short signals

    NASA Astrophysics Data System (ADS)

    Pavlova, Olga N.; Runnova, Anastasiya E.; Pavlov, Alexey N.

    2018-04-01

    We discuss the problem of quantifying power-law statistics of complex processes from short signals. Based on the analysis of electroencephalograms (EEG), we compare three interrelated approaches which enable characterization of the power spectral density (PSD), and show that applying detrended fluctuation analysis (DFA) or the wavelet-transform modulus maxima (WTMM) method is a useful way of indirectly characterizing PSD features from short data sets. We conclude that although DFA- and WTMM-based measures can be obtained from the estimated PSD, these tools outperform standard spectral analysis when the analyzed regime must be characterized from a very limited amount of data.
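    As a concrete illustration of one of the approaches compared above, the sketch below implements basic detrended fluctuation analysis and estimates the scaling exponent from the slope of the fluctuation function on a log-log scale; the window sizes and the test signal are arbitrary choices, not those used in the study.

      # Minimal detrended fluctuation analysis (DFA) sketch.
      import numpy as np

      def dfa(signal, scales):
          profile = np.cumsum(signal - np.mean(signal))      # integrated (profile) series
          fluctuations = []
          for s in scales:
              n_windows = len(profile) // s
              rms = []
              for i in range(n_windows):
                  segment = profile[i * s:(i + 1) * s]
                  t = np.arange(s)
                  trend = np.polyval(np.polyfit(t, segment, 1), t)   # local linear detrending
                  rms.append(np.sqrt(np.mean((segment - trend) ** 2)))
              fluctuations.append(np.mean(rms))
          # Scaling exponent alpha: slope of log F(s) versus log s
          alpha = np.polyfit(np.log(scales), np.log(fluctuations), 1)[0]
          return alpha

      rng = np.random.default_rng(1)
      white_noise = rng.normal(size=2000)                      # short test signal
      scales = np.array([8, 16, 32, 64, 128, 256])
      print("estimated DFA exponent:", dfa(white_noise, scales))   # ~0.5 for white noise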

  11. Magnetic Tunnel Junction-Based On-Chip Microwave Phase and Spectrum Analyzer

    NASA Technical Reports Server (NTRS)

    Fan, Xin; Chen, Yunpeng; Xie, Yunsong; Kolodzey, James; Wilson, Jeffrey D.; Simons, Rainee N.; Xiao, John Q.

    2014-01-01

    A magnetic tunnel junction (MTJ)-based microwave detector is proposed and investigated. When the MTJ is excited by microwave magnetic fields, the relative angle between the free layer and pinned layer alternates, giving rise to an average resistance change. By measuring the average resistance change, the MTJ can be utilized as a microwave power sensor. Due to the nature of ferromagnetic resonance, the frequency of an incident microwave is directly determined. In addition, by integrating a mixer circuit, the MTJ-based microwave detector can also determine the relative phase between two microwave signals. Thus, the MTJ-based microwave detector can be used as an on-chip microwave phase and spectrum analyzer.

  12. Alienation and the Ontology of Social Structure.

    ERIC Educational Resources Information Center

    Segalman, Ralph

    Theoretical models of social structure are analyzed in light of modern work patterns, social affiliations, and social attitudes. It is hypothesized that previous paradigms for society were based on classic theory which analyzed the then emergent forms of social structure and relationships. Because social structures and relationships have changed,…

  13. School Technology Leadership in a Spanish Secondary School: The TEI Model

    ERIC Educational Resources Information Center

    Gallego-Arrufat, María-Jesús; Gutiérrez-Santiuste, Elba; Campaña-Jiménez, Rafael Luis

    2017-01-01

    This study analyzes the perception that teachers and management team members in secondary school education have of "technology-based educational innovation" (TEI). Two questionnaires and in-depth interviews permit us to analyze leaders' perspective of planning, development, and evaluation. The school leaders' view diverges from that of…

  14. Social Construction of Authorized Users in the Digital Age

    ERIC Educational Resources Information Center

    Zhu, Xiaohua; Eschenfelder, Kristin R.

    2010-01-01

    This paper analyzes changes to the definitions of "authorized users" contained in electronic resources licenses and embedded in access control technologies from the mid-1990s to the present. In analyzing changes to the license and technology-based definitions, it tracks shifts in major stakeholders' perceptions of authorized users and…

  15. Changes in Performance in a Management by Objectives Program

    ERIC Educational Resources Information Center

    Ivancevich, John M.

    1974-01-01

    Reports on empirically-based longitudinal study of performance in a manufacturing company that uses management by objectives. The performance of the subordinates of 181 MBO-involved supervisors in the production and marketing departments is analyzed. Time lag, reinforcement, and sustaining improvements in performance are considered and analyzed.…

  16. Recent progress and market analysis of anticoagulant drugs

    PubMed Central

    Fan, Ping; Gao, Yangyang; Zheng, Minglin; Xu, Ting; Schoenhagen, Paul

    2018-01-01

    This review describes epidemiology of thromboembolic disease in China and abroad, evaluates trends in the development of anticoagulant drugs, and analyzes the market situation based on large amounts of accumulated data. Specifically, we describe advances in clinical application of anticoagulants and analyze the most commonly used anticoagulants in the market systematically.

  17. Language Loss and the Crisis of Cognition: Between Socio- and Psycholinguistics.

    ERIC Educational Resources Information Center

    Kenny, K. Dallas

    A non-structural model is proposed for quantifying and analyzing the dynamics of language attrition, particularly among immigrants in a second language environment, based on examination of disfluencies (hesitations, errors, and repairs). The first chapter discusses limitations of the conventional synchronic textual approach to analyzing language…

  18. [Research and Implementation of Vital Signs Monitoring System Based on Cloud Platform].

    PubMed

    Yu, Man; Tan, Anzu; Huang, Jianqi

    2018-05-30

    Through an analysis of the problems in the current working mode, a vital signs monitoring information system based on a cloud platform was designed and developed. The system aims to help nurses carry out vital-signs nursing work effectively and accurately. It collects, uploads and analyzes patients' vital-signs data via PDAs connected to medical inspection equipment. Clinical application showed that the system can effectively improve the quality and efficiency of medical care and may reduce medical expenses. It is also an important practical step toward building a medical cloud platform.

  19. BIOS Security Analysis and a Kind of Trusted BIOS

    NASA Astrophysics Data System (ADS)

    Zhou, Zhenliu; Xu, Rongsheng

    The security threats that the BIOS poses to a computer system are analyzed and security requirements for firmware BIOS are summarized in this paper. Through a discussion of TCG's trust transitivity, a new approach to CRTM implementation based on the BIOS is developed. We also put forward a new trusted BIOS architecture, UTBIOS, which is built on the Intel Framework for EFI/UEFI. The trustworthiness of UTBIOS is rooted in the trusted hardware TPM. In UTBIOS, trust encapsulation and trust measurement are used to construct the pre-OS trust chain. Finally, the performance of trust measurement is analyzed.

  20. [Construction and application of an onboard absorption analyzer device for CDOM].

    PubMed

    Lin, Jun-Fang; Sun, Zhao-Hua; Cao, Wen-Xi; Hu, Shui-Bo; Xu, Zhan-Tang

    2013-04-01

    Colored dissolved organic matter (CDOM) plays an important role in marine ecosystems. To address current problems in the measurement of CDOM absorption, an automated onboard analyzer based on liquid core waveguides (Teflon AF LWCC/LCW) was constructed. The analyzer has several notable characteristics, including an adjustable optical pathlength, a wide measurement range, and high sensitivity. The filtration and injection module implements automated filtration, sample injection, and LWCC cleaning. The LabVIEW software platform controls the running state of the analyzer and acquires real-time data, including light absorption spectra, GPS data, and CTW data. Comparison experiments and shipboard measurements showed that the analyzer is reliable and robust.

  1. The article critique as a problem-based teaching method for medical students early in their training: a French example using anatomy.

    PubMed

    Havet, Eric; Duparc, Fabrice; Peltier, Johan; Tobenas-Dujardin, Anne-Claire; Fréger, Pierre

    2012-01-01

    In France, "article critique" became a particular teaching method in the second part of the medical curriculum. It approaches a reading exercise of scientific medical papers similar to that of journal club. It could be compared to reviewing a paper as performed by reviewers of a scientific journal. We studied the relevancy of that teaching method for the youngest medical students. Our questions were about the understanding and the analyzing ability of a scientific paper while students have just learned basic medical sciences as anatomy. We have included 54 "article critique" written by voluntary students in second and third years of medical cursus. All of the IMRaD structure items (introduction, materials and methods, results and discussion) were analyzed using a qualitative scale for understanding as for analyzing ability. For understanding, 89-96% was good or fair and for the analyzing ability, 93-100% was good or fair. The anatomical papers were better understood than therapeutic or paraclinical studies, but without statistical difference, except for the introduction chapter. Results for analyzing ability were various according to the subject of the papers. This teaching method could be compared to a self-learning method, but also to a problem-based learning method. For the youngest students, the lack of medical knowledge aroused the curiosity. Their enthusiasm to learn new medical subjects remained full. The authors would insist on the requirement of rigorous lessons about evidence-based medicine and IMRaD structure and on a necessary companionship of the students by the teachers.

  2. Rethinking the NTCIP Design and Protocols - Analyzing the Issues

    DOT National Transportation Integrated Search

    1998-03-03

    This working paper discusses the issues involved in changing the current draft NTCIP standard from an X.25-based protocol stack to an Internet-based protocol stack. It contains a methodology which could be used to change NTCIP's base protocols. This ...

  3. Constraint-Based Scheduling System

    NASA Technical Reports Server (NTRS)

    Zweben, Monte; Eskey, Megan; Stock, Todd; Taylor, Will; Kanefsky, Bob; Drascher, Ellen; Deale, Michael; Daun, Brian; Davis, Gene

    1995-01-01

    Report describes continuing development of software for constraint-based scheduling system implemented eventually on massively parallel computer. Based on machine learning as means of improving scheduling. Designed to learn when to change search strategy by analyzing search progress and learning general conditions under which resource bottleneck occurs.

  4. Chemical Mixture Risk Assessment Additivity-Based Approaches

    EPA Science Inventory

    Powerpoint presentation includes additivity-based chemical mixture risk assessment methods. Basic concepts, theory and example calculations are included. Several slides discuss the use of "common adverse outcomes" in analyzing phthalate mixtures.

  5. ST-analyzer: a web-based user interface for simulation trajectory analysis.

    PubMed

    Jeong, Jong Cheol; Jo, Sunhwan; Wu, Emilia L; Qi, Yifei; Monje-Galvan, Viviana; Yeom, Min Sun; Gorenstein, Lev; Chen, Feng; Klauda, Jeffery B; Im, Wonpil

    2014-05-05

    Molecular dynamics (MD) simulation has become one of the key tools to obtain deeper insights into biological systems using various levels of descriptions such as all-atom, united-atom, and coarse-grained models. Recent advances in computing resources and MD programs have significantly accelerated the simulation time and thus increased the amount of trajectory data. Although many laboratories routinely perform MD simulations, analyzing MD trajectories is still time consuming and often a difficult task. ST-analyzer, http://im.bioinformatics.ku.edu/st-analyzer, is a standalone graphical user interface (GUI) toolset to perform various trajectory analyses. ST-analyzer has several outstanding features compared to other existing analysis tools: (i) handling various formats of trajectory files from MD programs, such as CHARMM, NAMD, GROMACS, and Amber, (ii) intuitive web-based GUI environment--minimizing administrative load and reducing burdens on the user from adapting new software environments, (iii) platform independent design--working with any existing operating system, (iv) easy integration into job queuing systems--providing options of batch processing either on the cluster or in an interactive mode, and (v) providing independence between foreground GUI and background modules--making it easier to add personal modules or to recycle/integrate pre-existing scripts utilizing other analysis tools. The current ST-analyzer contains nine main analysis modules that together contain 18 options, including density profile, lipid deuterium order parameters, surface area per lipid, and membrane hydrophobic thickness. This article introduces ST-analyzer with its design, implementation, and features, and also illustrates practical analysis of lipid bilayer simulations. Copyright © 2014 Wiley Periodicals, Inc.

  6. Interactive display/graphics systems for remote sensor data analysis.

    NASA Technical Reports Server (NTRS)

    Eppler, W. G.; Loe, D. L.; Wilson, E. L.; Whitley, S. L.; Sachen, R. J.

    1971-01-01

    Using a color-television display system and interactive graphics equipment on-line to an IBM 360/44 computer, investigators at the Manned Spacecraft Center have developed a variety of interactive displays which aid in analyzing remote sensor data. This paper describes how such interactive displays are used to: (1) analyze data from a multispectral scanner, (2) develop automatic pattern recognition systems based on multispectral scanner measurements, and (3) analyze data from nonimaging sensors such as the infrared radiometer and microwave scatterometer.

  7. Analyses of Multishaft Rotor-Bearing Response

    NASA Technical Reports Server (NTRS)

    Nelson, H. D.; Meacham, W. L.

    1985-01-01

    Method works for linear and nonlinear systems. Finite-element-based computer program developed to analyze free and forced response of multishaft rotor-bearing systems. Acronym, ARDS, denotes Analysis of Rotor Dynamic Systems. Systems with nonlinear interconnection or support bearings, or both, analyzed by numerically integrating reduced set of coupled-system equations. Linear systems analyzed in closed form for steady excitations and treated as equivalent to nonlinear systems for transient excitation. ARDS is FORTRAN program developed on an Amdahl 470 (similar to IBM 370).

  8. Economic challenges of hybrid microgrid: An analysis and approaches for rural electrification

    NASA Astrophysics Data System (ADS)

    Habibullah, Mohammad; Mahmud, Khizir; Koçar, Günnur; Islam, A. K. M. Sadrul; Salehin, Sayedus

    2017-06-01

    This paper focuses on the integration of three renewable resources, biogas, wind energy, and solar energy, utilizing a biogas generator, a wind turbine, and solar PV panels, respectively, to analyze the technical and economic challenges of a hybrid micro-grid. The integration of these sources has been analyzed and optimized based on realistic data for a real location. Different combinations of the sources have been analyzed to find the optimized combination based on efficiency and the minimum cost of electricity (COE). Wind and solar energy are considered the primary sources of power generation during off-peak hours, and any excess power is used to charge a battery bank. During peak hours, biogas generators produce power to support the additional demand. A business strategy for implementing the integrated optimized system in rural areas is also discussed.

  9. Preliminary Results on the Surface of a New Fe-Based Metallic Material after “In Vivo” Maintaining

    NASA Astrophysics Data System (ADS)

    Săndulache, F.; Stanciu, S.; Cimpoeşu, N.; Stanciu, T.; Cimpoeșu, R.; Enache, A.; Baciu, R.

    2017-06-01

    A new Fe-based alloy was obtained using UltraCast melting equipment. After mechanical processing, the alloy was implanted in five rabbit specimens following the "in-bone" procedure. After 30 days of implantation the samples were recovered and analyzed in terms of weight and surface state. Scanning electron microscopy was used to determine the morphology of the new compounds on the metallic surface, and energy-dispersive X-ray spectroscopy was used for chemical analysis. A bond between the metallic material and the biological material of the bone was observed through the increase in sample weight and in the SEM images. After the first set of tests, once the samples had been extracted and biologically cleaned, they were ultrasonically cleaned and re-analyzed in order to establish the stability of the chemical compounds.

  10. Ultrashort hybrid metal-insulator plasmonic directional coupler.

    PubMed

    Noghani, Mahmoud Talafi; Samiei, Mohammad Hashem Vadjed

    2013-11-01

    An ultrashort plasmonic directional coupler based on the hybrid metal-insulator slab waveguide is proposed and analyzed at the telecommunication wavelength of 1550 nm. It is first analyzed using the supermode theory based on mode analysis via the transfer matrix method in the interaction region. Then the 2D model of the coupler, including transition arms, is analyzed using a commercial finite-element method simulator. The hybrid slab waveguide is composed of a metallic layer of silver and two dielectric layers of silica (SiO2) and silicon (Si). The coupler is optimized to have a minimum coupling length and to transfer maximum power considering the layer thicknesses as optimization variables. The resulting coupling length in the submicrometer region along with a noticeable power transfer efficiency are advantages of the proposed coupler compared to previously reported plasmonic couplers.

  11. Analysis of physical layer performance of data center with optical wavelength switches based on advanced modulation formats

    NASA Astrophysics Data System (ADS)

    Ahmad, Iftikhar; Chughtai, Mohsan Niaz

    2018-05-01

    In this paper the IRIS (Integrated Router Interconnected spectrally), an optical-domain architecture for datacenter networks, is analyzed. The IRIS architecture integrated with advanced modulation formats (M-QAM) and a coherent optical receiver is studied, with channel impairments compensated using DSP algorithms following the coherent receiver. The proposed scheme allows N² multiplexed wavelengths for an N×N switch size. The performance of the N×N IRIS switch with and without wavelength conversion is analyzed for different baud rates over the M-QAM modulation formats. The performance of the system is evaluated in terms of bit error rate (BER) versus OSNR curves.

  12. International Space Station Major Constituent Analyzer On-orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Phillip M.; Wiedemann, Rachel; Matty, Chris

    2016-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Additionally, testing is underway to evaluate the capacity of the MCA to analyze ammonia. Finally, plans are being made to bring the second MCA on ISS to an operational configuration.

  13. ADAM: Analysis of Discrete Models of Biological Systems Using Computer Algebra

    PubMed Central

    2011-01-01

    Background Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. Results We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Conclusions Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics. PMID:21774817
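    To make the notion of attractor identification concrete, the sketch below enumerates the state space of a tiny synchronous Boolean network and extracts its attractors (fixed points and limit cycles) by following each state to the cycle it eventually reaches; this brute-force enumeration is exactly the exponential cost that ADAM's algebraic approach is designed to avoid. The example network is invented for illustration.

      # Brute-force attractor search for a small synchronous Boolean network.
      # For n variables this enumerates all 2**n states, which is exactly the
      # exponential cost that algebraic tools such as ADAM are designed to avoid.
      from itertools import product

      # Hypothetical 3-node network: update rules for (x0, x1, x2)
      def update(state):
          x0, x1, x2 = state
          return (x1 and not x2,      # x0' = x1 AND NOT x2
                  x0 or x2,           # x1' = x0 OR x2
                  x0)                 # x2' = x0

      def attractors(update, n):
          found = set()
          for start in product([False, True], repeat=n):
              seen = {}
              state, step = start, 0
              while state not in seen:            # iterate until a state repeats
                  seen[state] = step
                  state, step = update(state), step + 1
              cycle_len = step - seen[state]
              cycle, s = [], state
              for _ in range(cycle_len):          # collect the limit cycle / fixed point
                  cycle.append(s)
                  s = update(s)
              found.add(tuple(sorted(cycle)))
          return found

      for att in attractors(update, 3):
          kind = "fixed point" if len(att) == 1 else f"cycle of length {len(att)}"
          print(kind, [tuple(int(v) for v in s) for s in att])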

  14. Analyzing the Operation of Performance-Based Accountability Systems for Public Services. Technical Report

    ERIC Educational Resources Information Center

    Camm, Frank; Stecher, Brian M.

    2010-01-01

    Empirical evidence of the effects of performance-based public management is scarce. This report describes a framework used to organize available empirical information on one form of performance-based management, a performance-based accountability system (PBAS). Such a system identifies individuals or organizations that must change their behavior…

  15. Team-Based Classroom Pedagogy Reframed: The Student Perspective

    ERIC Educational Resources Information Center

    Schultz, Jennifer L.; Wilson, Joel R.; Hess, Kenneth C.

    2010-01-01

    Postsecondary learning environments often utilize team-based pedagogical practices to challenge and support student learning outcomes. This manuscript presents the findings of a qualitative research study that analyzed the viewpoints and perceptions of group or team-based projects among undergraduate business students. Results identified five…

  16. The Base 32 Method: An Improved Method for Coding Sibling Constellations.

    ERIC Educational Resources Information Center

    Perfetti, Lawrence J. Carpenter

    1990-01-01

    Offers new sibling constellation coding method (Base 32) for genograms using binary and base 32 numbers that saves considerable microcomputer memory. Points out that new method will result in greater ability to store and analyze larger amounts of family data. (Author/CM)

  17. Investigating Electromagnetic Induction through a Microcomputer-Based Laboratory.

    ERIC Educational Resources Information Center

    Trumper, Ricardo; Gelbman, Moshe

    2000-01-01

    Describes a microcomputer-based laboratory experiment designed for high school students that very accurately analyzes Faraday's law of electromagnetic induction, addressing each variable separately while the others are kept constant. (Author/CCM)

  18. Adjacent Channel Interference Reduction for M-WiMAX TDD and WCDMA FDD Coexistence by Utilizing Beamforming in M-WiMAX TDD System

    NASA Astrophysics Data System (ADS)

    Wang, Yupeng; Chang, Kyunghi

    In this paper, we analyze the coexistence issues of M-WiMAX TDD and WCDMA FDD systems. Smart antenna techniques are applied to mitigate the performance loss induced by adjacent channel interference (ACI) in the scenarios where performance is heavily degraded. In addition, an ACI model is proposed to capture the effect of transmit beamforming at the M-WiMAX base station. Furthermore, an MCS-based throughput analysis is proposed to jointly consider the effects of ACI, the system packet error rate requirement, and the available modulation and coding schemes, which is not possible with a conventional Shannon-equation-based analysis. From the results, we find that the proposed MCS-based analysis method is well suited to analyzing the theoretical system throughput in a practical manner.
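    The contrast between Shannon-based and MCS-based throughput estimation can be sketched as follows: the Shannon bound grows continuously with SINR, whereas an MCS-based estimate picks, for each SINR, the highest-rate modulation and coding scheme that still meets the packet error rate requirement. The MCS table and SINR thresholds below are illustrative placeholders, not values from the M-WiMAX specification.

      # Sketch: Shannon-bound throughput versus MCS-based throughput for a given SINR.
      import math

      # Hypothetical MCS table: (name, spectral efficiency in bit/s/Hz,
      # minimum SINR in dB at which the target packet error rate is met).
      MCS_TABLE = [
          ("QPSK 1/2",   1.0,  3.0),
          ("QPSK 3/4",   1.5,  6.0),
          ("16QAM 1/2",  2.0,  9.0),
          ("16QAM 3/4",  3.0, 12.0),
          ("64QAM 2/3",  4.0, 16.0),
          ("64QAM 3/4",  4.5, 18.0),
      ]

      def shannon_throughput(sinr_db, bandwidth_hz):
          sinr = 10 ** (sinr_db / 10.0)
          return bandwidth_hz * math.log2(1.0 + sinr)          # continuous upper bound

      def mcs_throughput(sinr_db, bandwidth_hz):
          # Highest-rate MCS whose SINR threshold (i.e. PER requirement) is satisfied.
          feasible = [eff for _, eff, thr in MCS_TABLE if sinr_db >= thr]
          return bandwidth_hz * max(feasible) if feasible else 0.0

      bw = 10e6   # 10 MHz channel
      for sinr_db in (2, 8, 14, 20):
          print(f"SINR {sinr_db:2d} dB: Shannon {shannon_throughput(sinr_db, bw)/1e6:6.1f} Mbit/s, "
                f"MCS-based {mcs_throughput(sinr_db, bw)/1e6:5.1f} Mbit/s")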

  19. Predictive Factors for Developing Venous Thrombosis during Cisplatin-Based Chemotherapy in Testicular Cancer.

    PubMed

    Heidegger, Isabel; Porres, Daniel; Veek, Nica; Heidenreich, Axel; Pfister, David

    2017-01-01

    Malignancies and cisplatin-based chemotherapy are both known to correlate with a high risk of venous thrombotic events (VTT). In testicular cancer, the information regarding the incidence and reason of VTT in patients undergoing cisplatin-based chemotherapy is still discussed controversially. Moreover, no risk factors for developing a VTT during cisplatin-based chemotherapy have been elucidated so far. We retrospectively analyzed 153 patients with testicular cancer undergoing cisplatin-based chemotherapy at our institution for the development of a VTT during or after chemotherapy. Clinical and pathological parameters for identifying possible risk factors for VTT were analyzed. The Khorana risk score was used to calculate the risk of VTT. Student t test was applied for calculating the statistical significance of differences between the treatment groups. Twenty-six out of 153 patients (17%) developed a VTT during chemotherapy. When we analyzed the risk factors for developing a VTT, we found that Lugano stage ≥IIc was significantly (p = 0.0006) correlated with the risk of developing a VTT during chemotherapy. On calculating the VTT risk using the Khorana risk score model, we found that only 2 out of 26 patients (7.7%) were in the high-risk Khorana group (≥3). Patients with testicular cancer with a high tumor volume have a significant risk of developing a VTT with cisplatin-based chemotherapy. The Khorana risk score is not an accurate tool for predicting VTT in testicular cancer. © 2017 S. Karger AG, Basel.
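    For reference, the Khorana risk score mentioned above is computed from a handful of pre-chemotherapy clinical variables; the sketch below encodes the commonly published scoring rules (high risk at a score of 3 or more) and should be checked against the original publication before any clinical use.

      # Sketch of the Khorana VTE risk score as commonly published
      # (Khorana et al., 2008); verify the criteria before any clinical use.
      VERY_HIGH_RISK_SITES = {"stomach", "pancreas"}
      HIGH_RISK_SITES = {"lung", "lymphoma", "gynecologic", "bladder", "testicular"}

      def khorana_score(site, platelets_per_uL, hemoglobin_g_dL, uses_esa,
                        leukocytes_per_uL, bmi):
          score = 0
          if site.lower() in VERY_HIGH_RISK_SITES:
              score += 2
          elif site.lower() in HIGH_RISK_SITES:
              score += 1
          if platelets_per_uL >= 350_000:          # pre-chemotherapy platelet count
              score += 1
          if hemoglobin_g_dL < 10 or uses_esa:     # anemia or erythropoiesis-stimulating agents
              score += 1
          if leukocytes_per_uL > 11_000:           # pre-chemotherapy leukocyte count
              score += 1
          if bmi >= 35:
              score += 1
          return score

      s = khorana_score("testicular", 280_000, 13.5, False, 8_000, 24)
      print(s, "-> high risk" if s >= 3 else "-> low/intermediate risk")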

  20. Linking Physical Geography Education and Research through the Development of an Environmental Sensing Network and Project-Based Learning

    ERIC Educational Resources Information Center

    Roberts, Dar; Bradley, Eliza; Roth, Keely; Eckmann, Ted; Still, Christopher

    2010-01-01

    Geographic education is more effective when students actively participate by developing hypotheses, designing experiments, collecting and analyzing data, and discussing results. We describe an innovative pedagogical approach, in which students learn physical geography concepts by analyzing environmental data collected in contrasting environments…

  1. Using Web Server Logs to Track Users through the Electronic Forest

    ERIC Educational Resources Information Center

    Coombs, Karen A.

    2005-01-01

    This article analyzes server logs, providing helpful information for making decisions about Web-based services. The author indicates that, as a result of analyzing server logs, several interesting things about users' behavior were learned. The resulting findings are discussed in this article. Certain pages of the author's Web site, for instance, are…

  2. "Choose, Explore, Analyze": A Multi-Tiered Approach to Social Media in the Classroom

    ERIC Educational Resources Information Center

    Rosatelli, Meghan

    2015-01-01

    In this essay, social media are presented as complex tools that require student involvement from potential classroom implementation to the post-mortem. The "choose, explore, analyze" approach narrows social media options for the classroom based on student feedback and allows students and teachers to work together to understand why and…

  3. Analyzing the Cost-Effectiveness of Instruction Expenditures towards High School Completion among Oahu's Public School Districts

    ERIC Educational Resources Information Center

    Ng, Larson S. W. M.

    2011-01-01

    The following study attempted to ascertain the instructional cost-effectiveness of public high school teachers towards high school completion through a financially based econometric analysis. Essentially, public high school instruction expenditures and completer data were collected from 2000 to 2007 and bivariate interaction analyzed through a…

  4. Analyzing the Learning Process of an Online Role-Playing Discussion Activity

    ERIC Educational Resources Information Center

    Hou, Huei-Tse

    2012-01-01

    Instructional activities based on online discussion strategies have gained prevalence in recent years. Within this context, a crucial research topic is to design innovative and appropriate online discussion strategies that assist learners in attaining a deeper level of interaction and higher cognitive skills. By analyzing the process of online…

  5. Analyzing Workforce Education. Monograph.

    ERIC Educational Resources Information Center

    Texas Community & Technical Coll. Workforce Education Consortium.

    This monograph examines the issue of task analysis as used in workplace literacy programs, debating the need for it and how to perform it in a rapidly changing environment. Based on experiences of community colleges in Texas, the report analyzes ways that task analysis can be done and how to implement work force education programs more quickly.…

  6. Analyzing the Responses of 7-8 Year Olds When Solving Partitioning Problems

    ERIC Educational Resources Information Center

    Badillo, Edelmira; Font, Vicenç; Edo, Mequè

    2015-01-01

    We analyze the mathematical solutions of 7- to 8-year-old pupils while individually solving an arithmetic problem. The analysis was based on the "configuration of objects," an instrument derived from the onto-semiotic approach to mathematical knowledge. Results are illustrated through a number of cases. From the analysis of mathematical…

  7. Observing and Analyzing Children's Mathematical Development, Based on Action Theory

    ERIC Educational Resources Information Center

    Bunck, M. J. A.; Terlien, E.; van Groenestijn, M.; Toll, S. W. M.; Van Luit, J. E. H.

    2017-01-01

    Children who experience difficulties with learning mathematics should be taught by teachers who focus on the child's best way of learning. Analyses of the mathematical difficulties are necessary for fine-tuning mathematics education to the needs of these children. For this reason, an instrument for Observing and Analyzing children's Mathematical…

  8. Method for genetic identification of unknown organisms

    DOEpatents

    Colston, Jr., Billy W.; Fitch, Joseph P.; Hindson, Benjamin J.; Carter, Chance J.; Beer, Neil Reginald

    2016-08-23

    A method of rapid, genome and proteome based identification of unknown pathogenic or non-pathogenic organisms in a complex sample. The entire sample is analyzed by creating millions of emulsion encapsulated microdroplets, each containing a single pathogenic or non-pathogenic organism sized particle and appropriate reagents for amplification. Following amplification, the amplified product is analyzed.

  9. Learning from Analyzing Linguistically Diverse Students' Work: A Contribution of Preservice Teacher Inquiry

    ERIC Educational Resources Information Center

    Athanases, Steven Z.; Wong, Joanna W.

    2018-01-01

    One task of Feiman-Nemser's teacher learning model--develop tools and dispositions to study teaching--frames how we organized learning opportunities during teacher preparation. We explored how and to what degree preservice teachers used teacher inquiry to analyze linguistically diverse students' work through an asset-based lens, beyond deficit…

  10. Multidimensional Item Response Theory Models in Vocational Interest Measurement: An Illustration Using the AIST-R

    ERIC Educational Resources Information Center

    Wetzel, Eunike; Hell, Benedikt

    2014-01-01

    Vocational interest inventories are commonly analyzed using a unidimensional approach, that is, each subscale is analyzed separately. However, the theories on which these inventories are based often postulate specific relationships between the interest traits. This article presents a multidimensional approach to the analysis of vocational interest…

  11. INNOVATIVE TECHNOLOGY VERIFICATION REPORT "FIELD MEASUREMENT TECHNOLOGIES FOR TOTAL PETROLEUM HYDROCARBONS IN SOIL" WILKS ENTERPRISE, INC. INFRACAL TOG/TPH ANALYZER

    EPA Science Inventory


    The InfraCal TOG/TPH Analyzer developed by Wilks Enterprise, Inc. (Wilks), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The pu...

  12. INNOVATIVE TECHNOLOGY VERIFICATION REPORT "FIELD MEASUREMENT TECHNOLOGIES FOR TOTAL PETROLEUM HYDROCARBONS IN SOIL" HORIBA INSTRUMENTS INCORPORATED OCMA-350 CONTENT ANALYZER

    EPA Science Inventory


    The OCMA-350 Oil Content Analyzer (OCMA-350) developed by Horiba Instruments Incorporated (Horiba), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Huen...

  13. Are Consumer-Directed Home Care Beneficiaries Satisfied? Evidence from Washington State

    ERIC Educational Resources Information Center

    Wiener, Joshua M.; Anderson, Wayne L.; Khatutsky, Galina

    2007-01-01

    Purpose: This study analyzed the effect of consumer-directed versus agency-directed home care on satisfaction with paid personal assistance services among Medicaid beneficiaries in Washington State. Design and Methods: The study analyzed a survey of 513 Medicaid beneficiaries receiving home- and community-based services. As part of a larger study,…

  14. Analyzing the Structure of the International Business Curriculum in India

    ERIC Educational Resources Information Center

    Srivastava, Deepak K.

    2012-01-01

    This article analyzes the structure of the international business curriculum through a questionnaire-based survey among current students and young managers who are studying or have studied international business courses in one of the top B-Schools of India. Respondents have the opinion that international business is more than internationalization…

  15. Students' Experiences with an Automated Essay Scorer

    ERIC Educational Resources Information Center

    Scharber, Cassandra; Dexter, Sara; Riedel, Eric

    2008-01-01

    The purpose of this research is to analyze preservice teachers' use of and reactions to an automated essay scorer used within an online, case-based learning environment called ETIPS. Data analyzed include post-assignment surveys, a user log of students' actions within the cases, instructor-assigned scores on final essays, and interviews with four…

  16. Chemiluminescence analyzer of NOx as a high-throughput screening tool in selective catalytic reduction of NO

    PubMed Central

    Oh, Kwang Seok; Woo, Seong Ihl

    2011-01-01

    A chemiluminescence-based analyzer of NOx gas species has been applied for high-throughput screening of a library of catalytic materials. The applicability of the commercial NOx analyzer as a rapid screening tool was evaluated using selective catalytic reduction of NO gas. A library of 60 binary alloys composed of Pt and Co, Zr, La, Ce, Fe or W on Al2O3 substrate was tested for the efficiency of NOx removal using a home-built 64-channel parallel and sequential tubular reactor. The NOx concentrations measured by the NOx analyzer agreed well with the results obtained using micro gas chromatography for a reference catalyst consisting of 1 wt% Pt on γ-Al2O3. Most alloys showed high efficiency at 275 °C, which is typical of Pt-based catalysts for selective catalytic reduction of NO. Screening with the NOx analyzer allowed the selection of Pt-Ce(X) (X=1–3) and Pt–Fe(2) as the optimal catalysts for NOx removal: 73% NOx conversion was achieved with the Pt–Fe(2) alloy, which was much better than the results for the reference catalyst and the other library alloys. This study demonstrates a sequential high-throughput method for the practical evaluation of catalysts for the selective reduction of NO. PMID:27877438

  17. A Field-Portable Cell Analyzer without a Microscope and Reagents

    PubMed Central

    Oh, Sangwoo; Lee, Moonjin; Hwang, Yongha

    2017-01-01

    This paper demonstrates a commercial-level field-portable lens-free cell analyzer called the NaviCell (No-stain and Automated Versatile Innovative cell analyzer) capable of automatically analyzing cell count and viability without employing an optical microscope and reagents. Based on the lens-free shadow imaging technique, the NaviCell (162 × 135 × 138 mm3 and 1.02 kg) has the advantage of providing analysis results with improved standard deviation between measurement results, owing to its large field of view. Importantly, the cell counting and viability testing can be analyzed without the use of any reagent, thereby simplifying the measurement procedure and reducing potential errors during sample preparation. In this study, the performance of the NaviCell for cell counting and viability testing was demonstrated using 13 and six cell lines, respectively. Based on the results of the hemocytometer (de facto standard), the error rate (ER) and coefficient of variation (CV) of the NaviCell are approximately 3.27 and 2.16 times better than the commercial cell counter, respectively. The cell viability testing of the NaviCell also showed an ER and CV performance improvement of 5.09 and 1.8 times, respectively, demonstrating sufficient potential in the field of cell analysis. PMID:29286336

  18. Deciphering the complex: methodological overview of statistical models to derive OMICS-based biomarkers.

    PubMed

    Chadeau-Hyam, Marc; Campanella, Gianluca; Jombart, Thibaut; Bottolo, Leonardo; Portengen, Lutzen; Vineis, Paolo; Liquet, Benoit; Vermeulen, Roel C H

    2013-08-01

    Recent technological advances in molecular biology have given rise to numerous large-scale datasets whose analysis imposes serious methodological challenges mainly relating to the size and complex structure of the data. Considerable experience in analyzing such data has been gained over the past decade, mainly in genetics, from the Genome-Wide Association Study era, and more recently in transcriptomics and metabolomics. Building upon the corresponding literature, we provide here a nontechnical overview of well-established methods used to analyze OMICS data within three main types of regression-based approaches: univariate models including multiple testing correction strategies, dimension reduction techniques, and variable selection models. Our methodological description focuses on methods for which ready-to-use implementations are available. We describe the main underlying assumptions, the main features, and advantages and limitations of each of the models. This descriptive summary constitutes a useful tool for driving methodological choices while analyzing OMICS data, especially in environmental epidemiology, where the emergence of the exposome concept clearly calls for unified methods to analyze marginally and jointly complex exposure and OMICS datasets. Copyright © 2013 Wiley Periodicals, Inc.
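    As a minimal illustration of the first family of approaches (univariate models with multiple testing correction), the sketch below runs a univariate test per OMICS feature and applies the Benjamini-Hochberg false discovery rate correction; the simulated data and the 5% FDR threshold are arbitrary choices for the example.

      # Sketch: univariate screening of OMICS features with Benjamini-Hochberg FDR control.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n_samples, n_features = 100, 500
      X = rng.normal(size=(n_samples, n_features))        # simulated OMICS matrix
      y = rng.integers(0, 2, size=n_samples)              # binary exposure/outcome
      X[y == 1, :10] += 0.8                                # 10 truly associated features

      # Univariate test per feature (two-sample t-test here; a regression model
      # adjusting for covariates would be used in practice).
      pvals = np.array([stats.ttest_ind(X[y == 1, j], X[y == 0, j]).pvalue
                        for j in range(n_features)])

      def benjamini_hochberg(pvals, alpha=0.05):
          m = len(pvals)
          order = np.argsort(pvals)
          thresholds = alpha * (np.arange(1, m + 1) / m)
          below = pvals[order] <= thresholds
          selected = np.zeros(m, dtype=bool)
          if below.any():
              cutoff = np.max(np.nonzero(below))           # largest k with p_(k) <= k*alpha/m
              selected[order[:cutoff + 1]] = True
          return selected

      hits = benjamini_hochberg(pvals)
      print("features passing 5% FDR:", np.flatnonzero(hits))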

  19. Differences between Parkinson's and Huntington's diseases and their role for prioritization of stem cell-based treatments.

    PubMed

    Hug, K; Hermerén, G

    2013-06-01

    The problems of allocating scarce resources and setting priorities in health care have so far not been studied much in the context of stem cell-based therapeutic applications. If and when competitive, cost-effective stem cell-based therapies become available, the problem of priority setting - to whom should stem cell-based therapies be offered and on what grounds - arises; it is discussed in this article using the examples of Parkinson's Disease (PD) and Huntington's Disease (HD). The aim of this paper is to examine the presently known differences between PD and HD and to analyze the role of these differences in setting priorities for stem cell-based therapeutic applications to treat these diseases. To achieve this aim, we (1) present the theoretical framework used in the analysis; (2) compare PD and HD in terms of the health-related and non-health-related consequences of these diseases for patients, their relatives, and third parties; (3) analyze the ethical relevance of the observed differences for priority setting given different values and variables; (4) compare PD and HD in terms of the social justice-related consequences of stem cell-based therapies; and (5) analyze the ethical relevance of these differences for priority setting given different values and variables. We argue that the steps of analysis applied in this paper could be helpful when setting priorities among treatments of other diseases with differences similar to those between PD and HD.

  20. A method for analyzing clustered interval-censored data based on Cox's model.

    PubMed

    Kor, Chew-Teng; Cheng, Kuang-Fu; Chen, Yi-Hau

    2013-02-28

    Methods for analyzing interval-censored data are well established. Unfortunately, these methods are inappropriate for studies with correlated data. In this paper, we focus on developing a method for analyzing clustered interval-censored data. Our method is based on Cox's proportional hazards model with a piecewise-constant baseline hazard function. The correlation structure of the data can be modeled using Clayton's copula or an independence model with proper adjustment in the covariance estimation. We establish estimating equations for the regression parameters and baseline hazards (and a parameter in the copula) simultaneously. Simulation results confirm that the point estimators follow a multivariate normal distribution and that our proposed variance estimators are reliable. In particular, we found that the approach with the independence model worked well even when the true correlation model was derived from Clayton's copula. We applied our method to a family-based cohort study of pandemic H1N1 influenza in Taiwan during 2009-2010. Using the proposed method, we investigate the impact of vaccination and family contacts on the incidence of pH1N1 influenza. Copyright © 2012 John Wiley & Sons, Ltd.
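    One building block of the model above, the piecewise-constant baseline hazard, can be illustrated with a simplified sketch that fits constant hazards on pre-specified intervals to right-censored data by maximum likelihood, ignoring covariates, clustering, and interval censoring, which the paper's estimating equations handle.

      # Sketch: maximum-likelihood estimation of a piecewise-constant baseline hazard
      # from right-censored data (no covariates, no clustering, no interval censoring).
      import numpy as np

      def piecewise_hazard(times, events, cuts):
          """cuts defines intervals [0,c1), [c1,c2), ..., [c_last, inf)."""
          edges = np.concatenate([[0.0], cuts, [np.inf]])
          hazards = []
          for lo, hi in zip(edges[:-1], edges[1:]):
              # time each subject spends at risk inside [lo, hi)
              exposure = np.clip(np.clip(times, lo, hi) - lo, 0, None)
              # events that occur inside [lo, hi)
              d = np.sum(events & (times >= lo) & (times < hi))
              hazards.append(d / exposure.sum() if exposure.sum() > 0 else 0.0)
          return np.array(hazards)   # MLE: events / person-time per interval

      rng = np.random.default_rng(0)
      true_times = rng.exponential(scale=5.0, size=300)      # simulated event times
      censor = rng.uniform(0, 10, size=300)                   # independent censoring
      times = np.minimum(true_times, censor)
      events = true_times <= censor

      print(piecewise_hazard(times, events, cuts=np.array([2.0, 5.0])))
      # each estimate should be close to the true constant hazard 1/5 = 0.2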

  1. A 40 GHz fully integrated circuit with a vector network analyzer and a coplanar-line-based detection area for circulating tumor cell analysis using 65 nm CMOS technology

    NASA Astrophysics Data System (ADS)

    Nakanishi, Taiki; Matsunaga, Maya; Kobayashi, Atsuki; Nakazato, Kazuo; Niitsu, Kiichi

    2018-03-01

    A 40-GHz fully integrated CMOS-based circuit for circulating tumor cell (CTC) analysis, consisting of an on-chip vector network analyzer (VNA) and a highly sensitive coplanar-line-based detection area, is presented in this paper. In this work, we introduce a fully integrated architecture that eliminates unwanted parasitic effects. The proposed analyzer was designed using 65 nm CMOS technology, and SPICE and MWS simulations were used to validate its operation. The simulations confirmed that the proposed circuit can measure S-parameter shifts resulting from the addition of various types of tumor cells to the detection area, using data provided in a previous study: the |S21| values for HepG2, A549, and HEC-1-A cells are -0.683, -0.580, and -0.623 dB, respectively. Additionally, the measurement demonstrated a -25.7% change in the S-parameters when a silicone resin was placed on the circuit. Hence, the proposed system is expected to contribute to cancer diagnosis.

  2. FIM, a Novel FTIR-Based Imaging Method for High Throughput Locomotion Analysis

    PubMed Central

    Otto, Nils; Löpmeier, Tim; Valkov, Dimitar; Jiang, Xiaoyi; Klämbt, Christian

    2013-01-01

    We designed a novel imaging technique based on frustrated total internal reflection (FTIR) to obtain high resolution and high contrast movies. This FTIR-based Imaging Method (FIM) is suitable for a wide range of biological applications and a wide range of organisms. It operates at all wavelengths permitting the in vivo detection of fluorescent proteins. To demonstrate the benefits of FIM, we analyzed large groups of crawling Drosophila larvae. The number of analyzable locomotion tracks was increased by implementing a new software module capable of preserving larval identity during most collision events. This module is integrated in our new tracking program named FIMTrack which subsequently extracts a number of features required for the analysis of complex locomotion phenotypes. FIM enables high throughput screening for even subtle behavioral phenotypes. We tested this newly developed setup by analyzing locomotion deficits caused by the glial knockdown of several genes. Suppression of kinesin heavy chain (khc) or rab30 function led to contraction pattern or head sweeping defects, which escaped in previous analysis. Thus, FIM permits forward genetic screens aimed to unravel the neural basis of behavior. PMID:23349775

  3. Block matrix based LU decomposition to analyze kinetic damping in active plasma resonance spectroscopy

    NASA Astrophysics Data System (ADS)

    Roehl, Jan Hendrik; Oberrath, Jens

    2016-09-01

    "Active plasma resonance spectroscopy" (APRS) is a widely used diagnostic method to measure plasma parameters such as the electron density. Measurements with APRS probes in plasmas of a few Pa typically show a broadening of the spectrum due to kinetic effects. To analyze this broadening, a general kinetic model in the electrostatic approximation based on functional analytic methods has been presented [1]. One of the main results is that the system response function Y(ω) is given in terms of the matrix elements of the resolvent of the dynamic operator evaluated for values on the imaginary axis. To determine the response function of a specific probe, the resolvent has to be approximated by a large matrix with a banded block structure. Due to this structure a block-based LU decomposition can be implemented. It leads to a solution for Y(ω) given only by products of matrices of the inner block size. This LU decomposition makes it possible to analyze the influence of kinetic effects on the broadening while saving memory and calculation time. Gratitude is expressed for the internal funding of Leuphana University.
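    To illustrate what a block-structured LU factorization buys in practice, the sketch below solves a block-tridiagonal system with a forward block elimination and back substitution, so that only matrices of the inner block size are ever factorized; the block sizes and test matrices are arbitrary and unrelated to the probe model.

      # Sketch: solving a block-tridiagonal system A x = b by block LU (Thomas) elimination.
      # Only matrices of the inner block size are factorized, which is the memory and
      # time saving exploited by a banded block-structured resolvent.
      import numpy as np

      def block_tridiag_solve(lower, diag, upper, b):
          """lower[i], diag[i], upper[i] are (m x m) blocks; b is a list of length-m vectors."""
          n = len(diag)
          diag = [d.copy() for d in diag]
          b = [v.copy() for v in b]
          for i in range(1, n):                               # forward elimination
              factor = lower[i - 1] @ np.linalg.inv(diag[i - 1])
              diag[i] = diag[i] - factor @ upper[i - 1]
              b[i] = b[i] - factor @ b[i - 1]
          x = [None] * n                                      # back substitution
          x[n - 1] = np.linalg.solve(diag[n - 1], b[n - 1])
          for i in range(n - 2, -1, -1):
              x[i] = np.linalg.solve(diag[i], b[i] - upper[i] @ x[i + 1])
          return np.concatenate(x)

      rng = np.random.default_rng(0)
      m, n = 4, 6                                             # inner block size, number of blocks
      diag = [rng.normal(size=(m, m)) + 5 * np.eye(m) for _ in range(n)]
      upper = [rng.normal(size=(m, m)) for _ in range(n - 1)]
      lower = [rng.normal(size=(m, m)) for _ in range(n - 1)]
      b = [rng.normal(size=m) for _ in range(n)]

      x = block_tridiag_solve(lower, diag, upper, b)

      # Check against a dense solve of the assembled block-tridiagonal matrix.
      A = np.zeros((m * n, m * n))
      for i in range(n):
          A[i*m:(i+1)*m, i*m:(i+1)*m] = diag[i]
          if i < n - 1:
              A[(i+1)*m:(i+2)*m, i*m:(i+1)*m] = lower[i]
              A[i*m:(i+1)*m, (i+1)*m:(i+2)*m] = upper[i]
      print(np.allclose(x, np.linalg.solve(A, np.concatenate(b))))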

  4. [Analysis on traditional Chinese medicine prescriptions treating cancer based on traditional Chinese medicine inheritance assistance system and discovery of new prescriptions].

    PubMed

    Yu, Ming; Cao, Qi-chen; Su, Yu-xi; Sui, Xin; Yang, Hong-jun; Huang, Lu-qi; Wang, Wen-ping

    2015-08-01

    Malignant tumors are among the main causes of death in the world at present, seriously harming human health and life and restricting social and economic development. There are many reports on traditional Chinese medicine patent prescriptions, empirical prescriptions, and self-made prescriptions for treating cancer, and prescription rules have usually been analyzed on the basis of medication frequency. Such methods are suitable for uncovering dominant experience but rarely yield innovative discoveries and knowledge. In this paper, based on the traditional Chinese medicine inheritance assistance system, software integrating the mutual information improvement method, complex system entropy clustering, and unsupervised entropy-level clustering data mining methods was adopted to analyze the rules of traditional Chinese medicine prescriptions for cancer. A total of 114 prescriptions were selected, the frequency of herbs in the prescriptions was determined, and 85 core combinations and 13 new prescriptions were identified. The traditional Chinese medicine inheritance assistance system, as a valuable research-supporting tool for traditional Chinese medicine, can be used to record, manage, query, and analyze prescription data.

  5. Measurement and decomposition of energy efficiency of Northeast China-based on super efficiency DEA model and Malmquist index.

    PubMed

    Ma, Xiaojun; Liu, Yan; Wei, Xiaoxue; Li, Yifan; Zheng, Mengchen; Li, Yudong; Cheng, Chaochao; Wu, Yumei; Liu, Zhaonan; Yu, Yuanbo

    2017-08-01

    Environmental problems have become a pressing international issue, and experts and scholars are paying increasing attention to energy efficiency. Unlike most studies, which analyze changes in total-factor energy efficiency (TFEE) across provinces or regional cities, here TFEE is calculated as the ratio of the target energy value to the actual energy input based on data for prefecture-level cities, which is more accurate. Many studies treat total factor productivity (TFP) as TFEE in provincial-level analyses. This paper calculates TFEE more reliably using super-efficiency DEA, observes the changes in TFEE, analyzes its relation to TFP, and shows that TFP is not equal to TFEE. Additionally, the internal influences on TFEE are obtained via the Malmquist index decomposition, and the external influences on TFEE are then analyzed using Tobit models. The results demonstrate that Heilongjiang has the highest TFEE, followed by Jilin, while Liaoning has the lowest. Finally, some policy suggestions are proposed regarding the influences on energy efficiency and the study results.
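    The super-efficiency DEA score used above can be written as a small linear program per decision-making unit (DMU), in which the evaluated unit is excluded from its own reference set; the sketch below solves the input-oriented, constant-returns-to-scale version with scipy, on invented input/output data rather than the paper's city-level data.

      # Sketch: input-oriented CRS super-efficiency DEA via linear programming.
      # For each DMU k: minimize theta s.t. sum_j lam_j * x_j <= theta * x_k,
      #                 sum_j lam_j * y_j >= y_k, lam >= 0, with j != k (super-efficiency).
      import numpy as np
      from scipy.optimize import linprog

      X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 4.0]])   # inputs (DMU x input)
      Y = np.array([[1.0], [1.0], [1.5], [1.2]])                        # outputs (DMU x output)

      def super_efficiency(k, X, Y):
          n, m = X.shape
          s = Y.shape[1]
          others = [j for j in range(n) if j != k]        # exclude DMU k from reference set
          # decision variables: [theta, lam_j for j in others]
          c = np.concatenate([[1.0], np.zeros(len(others))])
          A_ub, b_ub = [], []
          for i in range(m):                              # input constraints
              A_ub.append(np.concatenate([[-X[k, i]], X[others, i]]))
              b_ub.append(0.0)
          for r in range(s):                              # output constraints
              A_ub.append(np.concatenate([[0.0], -Y[others, r]]))
              b_ub.append(-Y[k, r])
          res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                        bounds=[(0, None)] * (1 + len(others)), method="highs")
          return res.fun if res.success else np.nan

      for k in range(len(X)):
          print(f"DMU {k}: super-efficiency = {super_efficiency(k, X, Y):.3f}")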

  6. The use of 2D and 3D WA-BPM models to analyze total-internal-reflection-based integrated optical switches

    NASA Astrophysics Data System (ADS)

    Wang, Pengfei; Brambilla, Gilberto; Semenova, Yuliya; Wu, Qiang; Zheng, Jie; Farrell, Gerald

    2011-08-01

    The well-known beam propagation method (BPM) has become one of the most useful, robust and effective numerical simulation tools for the investigation of guided-wave optics, for example integrated optical waveguides and fiber-optic devices. In this paper we examine the use of the 2D and 3D wide-angle beam propagation method (WA-BPM), combined with the well-known perfectly matched layer (PML) boundary conditions, as a tool to analyze TIR-based optical switches, in particular the relationship between light propagation and the geometrical parameters of a TIR-based optical switch. To analyze the influence of the length and the width of the region in which the refractive index can be externally controlled, the 3D structure of a 2x2 TIR optical switch is first considered in 2D using the effective index method (EIM). Then the influence of the etching depth and the tilt angle of the reflection facet on the switch performance is investigated with a 3D model.

  7. Superconducting fluctuations in molybdenum nitride thin films

    NASA Astrophysics Data System (ADS)

    Baskaran, R.; Thanikai Arasu, A. V.; Amaladass, E. P.; Vaidhyanathan, L. S.; Baisnab, D. K.

    2018-02-01

    MoN thin films have been deposited using reactive sputtering. The change in resistance near the superconducting transition temperature at various magnetic fields has been analyzed based on superconducting fluctuations in the system. The Aslamazov-Larkin scaling theory has been utilized to analyze the change in conductance. The results indicate that most of the measurements show two-dimensional (2D) behavior and exhibit scaling at lower magnetic fields (<7 T), while a crossover to three-dimensional (3D) behavior is clearly observed in measurements at higher fields (>7 T). We have also analyzed our data based on the model in which there is no explicit dependence on Tc. These analyses also substantiate a crossover from 2D to 3D behavior at larger fields. Analysis using the lowest-Landau-level scaling theory for a 2D system exhibits scaling behavior and substantiates our observations. The broadening of the low-resistance part has been explained based on a thermally activated flux flow model and shows universal behavior. The dependence of U0 on magnetic field indicates both single and collective vortex behavior.

  8. Motion-free hybrid design laser beam propagation analyzer using a digital micromirror device and a variable focus liquid lens.

    PubMed

    Sheikh, Mumtaz; Riza, Nabeel A

    2010-06-01

    To the best of our knowledge, we propose the first motion-free laser beam propagation analyzer with a hybrid design using a digital micromirror device (DMD) and a liquid electronically controlled variable focus lens (ECVFL). Unlike prior analyzers that require profiling the beam at multiple locations along the light propagation axis, the proposed analyzer profiles the beam at the same plane for multiple values of the ECVFL focal length, thus eliminating beam profiler assembly motion. In addition to measuring standard Gaussian beam parameters, the analyzer can also be used to measure the M(2) beam propagation parameter of a multimode beam. Proof-of-concept beam parameter measurements with the proposed analyzer are successfully conducted for a 633 nm laser beam. Given the all-digital nature of the DMD-based profiling and all-analog motion-free nature of the ECVFL beam focus control, the proposed analyzer versus prior art promises better repeatability, speed, and reliability.
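
    One common way to extract the M² parameter from a series of beam-radius measurements is to fit the standard beam-caustic relation w(z)² = w0² + (M²·λ·(z−z0)/(π·w0))². The sketch below fits that relation with SciPy on synthetic data; it is not the authors' algorithm, and the sample points and noise level are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

lam = 633e-9  # wavelength in metres (HeNe, as in the experiment)

def caustic(z, w0, z0, m2):
    """Beam radius along the propagation axis for an M^2-times diffraction-limited beam."""
    return np.sqrt(w0**2 + (m2 * lam * (z - z0) / (np.pi * w0)) ** 2)

# Synthetic measurements of the beam radius at several planes along the axis
z = np.linspace(-0.2, 0.2, 15)                          # metres
w_true = caustic(z, 150e-6, 0.02, 1.3)
w_meas = w_true * (1 + 0.02 * np.random.default_rng(1).standard_normal(z.size))

(p_w0, p_z0, p_m2), _ = curve_fit(caustic, z, w_meas, p0=(100e-6, 0.0, 1.0))
print(f"w0 = {p_w0*1e6:.1f} um, z0 = {p_z0*1e3:.1f} mm, M^2 = {p_m2:.2f}")
```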

  9. Analysis of the Image of Scientists Portrayed in the Lebanese National Science Textbooks

    NASA Astrophysics Data System (ADS)

    Yacoubian, Hagop A.; Al-Khatib, Layan; Mardirossian, Taline

    2017-07-01

    This article presents an analysis of how scientists are portrayed in the Lebanese national science textbooks. The purpose of this study was twofold. First, to develop a comprehensive analytical framework that can serve as a tool to analyze the image of scientists portrayed in educational resources. Second, to analyze the image of scientists portrayed in the Lebanese national science textbooks that are used in Basic Education. An analytical framework, based on an extensive review of the relevant literature, was constructed that served as a tool for analyzing the textbooks. Based on evidence-based stereotypes, the framework focused on the individual and work-related characteristics of scientists. Fifteen science textbooks were analyzed using both quantitative and qualitative measures. Our analysis of the textbooks showed the presence of a number of stereotypical images. The scientists are predominantly white males of European descent. Non-Western scientists, including Lebanese and/or Arab scientists are mostly absent in the textbooks. In addition, the scientists are portrayed as rational individuals who work alone, who conduct experiments in their labs by following the scientific method, and by operating within Eurocentric paradigms. External factors do not influence their work. They are engaged in an enterprise which is objective, which aims for discovering the truth out there, and which involves dealing with direct evidence. Implications for science education are discussed.

  10. Technical and economic assessment of processes for the production of butanol and acetone. Phase two: analysis of research advances. Energy Conversion and Utilization Technologies Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1984-08-01

    The initial objective of this work was to develop a methodology for analyzing the impact of technological advances as a tool to help establish priorities for R and D options in the field of biocatalysis. As an example of a biocatalyzed process, butanol/acetone fermentation (ABE process) was selected as the specific topic of study. A base case model characterizing the technology and economics associated with the ABE process was developed in the previous first phase of study. The project objectives were broadened in this second phase of work to provide parametric estimates of the economic and energy impacts of a variety of research advances in the hydrolysis, fermentation and purification sections of the process. The research advances analyzed in this study were based on a comprehensive literature review. The six process options analyzed were: continuous ABE fermentation; vacuum ABE fermentation; Baelene solvent extraction; HRI's Lignol process; improved prehydrolysis/dual enzyme hydrolysis; and improved microorganism tolerance to butanol toxicity. Of the six options analyzed, only improved microorganism tolerance to butanol toxicity had a significant positive effect on energy efficiency and economics. This particular process option reduced the base case production cost (including 10% DCF return) by 20% and energy consumption by 16%. Figures and tables.

  11. Atmospheric pressure matrix-assisted laser desorption/ionization mass spectrometry of friction modifier additives analyzed directly from base oil solutions.

    PubMed

    Widder, Lukas; Brenner, Josef; Hutter, Herbert

    2014-01-01

    To develop new products and to apply measures of quality control, quick and simple access to the additive composition of automotive lubricants is important. The aim of this study was to investigate the possibility of analyzing organic friction modifier additives by means of atmospheric pressure matrix-assisted laser desorption/ionization mass spectrometry [AP-MALDI-MS] from lubricant solutions without the use of additional separation techniques. Analyses of the selected friction modifiers, ethoxylated tallow amines and oleic acid amide, were compared using two ionization methods, positive-ion electrospray ionization (ESI) and AP-MALDI, using an LTQ Orbitrap mass spectrometer. Pure additives were characterized from solvent solutions, as well as from synthetic and mineral base oil mixtures. Detected ions of pure additive samples consisted mainly of [M + H]+, but alkali metal adducts [M + Na]+ and [M + K]+ could also be seen. Characterizations of blends of both friction modifiers from the base oil mixtures were carried out as well and showed significant intensities for several additive peaks. Thus, this work shows a method to directly analyze friction modifier additives used in the automotive industry from an oil blend via the use of AP-MALDI without any further separation steps. The method presented will further simplify the acquisition of data on lubricant composition and additives. Furthermore, it allows the perspective of analyzing additive reaction products directly from formulated oil blends.

  12. Unbiased nonorthogonal bases for tomographic reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sainz, Isabel; Klimov, Andrei B.; Roa, Luis

    2010-05-15

    We have developed a general method for constructing a set of nonorthogonal bases with equal separations between all different basis states in prime dimensions. The results are that the corresponding biorthogonal counterparts are pairwise unbiased with the components of the original bases. Using these bases, we derive an explicit expression for the optimal tomography in nonorthogonal bases. A special two-dimensional case is analyzed separately.

  13. Student Characteristics, Sense of Community, and Cognitive Achievement in Web-Based and Lab-Based Learning Environments

    ERIC Educational Resources Information Center

    Overbaugh, Richard C.; Lin, ShinYi

    2006-01-01

    This study investigated differential effects of learning styles and learning orientation on sense of community and cognitive achievement in Web-based and lab-based university course formats. Students in the Web-based sections achieved higher scores at the "remember" and "understand" levels, but not at the "apply" or "analyze" levels. In terms of…

  14. Geology-based method of assessing sensitivity of streams to acidic deposition in Charles and Anne Arundel Counties, Maryland

    USGS Publications Warehouse

    Rice, Karen C.; Bricker, Owen P.

    1991-01-01

    The report describes the results of a study to assess the sensitivity of streams to acidic deposition in Charles and Anne Arundel Counties, Maryland using a geology-based method. Water samples were collected from streams in July and August 1988 when streams were at base-flow conditions. Eighteen water samples collected from streams in Charles County, and 17 water samples from streams in Anne Arundel County were analyzed in the field for pH, specific conductance, and acid-neutralizing capacity (ANC); 8 water samples from streams in Charles County were analyzed in the laboratory for chloride and sulfate concentrations. The assessment revealed that streams in these counties are sensitive to acidification by acidic deposition.

  15. Probabilistic characterization of sleep architecture: home based study on healthy volunteers.

    PubMed

    Garcia-Molina, Gary; Vissapragada, Sreeram; Mahadevan, Anandi; Goodpaster, Robert; Riedner, Brady; Bellesi, Michele; Tononi, Giulio

    2016-08-01

    The quantification of sleep architecture has high clinical value for diagnostic purposes. While the clinical standard for assessing sleep architecture is in-lab polysomnography, higher ecological validity can be obtained with multiple sleep recordings at home. In this paper, we use a dataset of fifty sleep EEG recordings acquired at home (10 per study participant for five participants) to analyze sleep stage transition dynamics using Markov chain based modeling. The duration of continuous sleep stage bouts is also analyzed statistically to identify the speed of transitions between sleep stages. This analysis identified two types of NREM states characterized by fast and slow exit rates, which from the EEG analysis appear to correspond to shallow and deep sleep, respectively.
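
    A first-order Markov model of sleep-stage dynamics amounts to estimating a transition-probability matrix from the scored stage sequence; the self-transition probability of a stage then sets its expected bout duration (exit rate). A minimal sketch, with an invented 30-second-epoch hypnogram:

```python
import numpy as np

stages = ["W", "N1", "N2", "N3", "REM"]
idx = {s: i for i, s in enumerate(stages)}

# Hypothetical hypnogram: one label per 30-second epoch
hypnogram = ["W", "N1", "N2", "N2", "N3", "N3", "N2", "REM", "REM", "N2", "N2", "W"]

counts = np.zeros((len(stages), len(stages)))
for a, b in zip(hypnogram[:-1], hypnogram[1:]):
    counts[idx[a], idx[b]] += 1

# Row-normalise to get transition probabilities (rows with no data stay zero)
row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(np.round(P, 2))

# The mean bout length of stage s follows from its self-transition probability:
# expected epochs in s = 1 / (1 - P[s, s]), so a high P[s, s] means a slow exit rate.
```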

  16. Precision mechanical structure of an ultra-high-resolution spectrometer for inelastic X-ray scattering instrument

    DOEpatents

    Shu, Deming; Shvydko, Yuri; Stoupin, Stanislav A.; Khachatryan, Ruben; Goetze, Kurt A.; Roberts, Timothy

    2015-04-14

    A method and an ultrahigh-resolution spectrometer including a precision mechanical structure for positioning inelastic X-ray scattering optics are provided. The spectrometer includes an X-ray monochromator and an X-ray analyzer, each including X-ray optics of a collimating (C) crystal, a pair of dispersing (D) element crystals, anomalous transmission filter (F) and a wavelength (W) selector crystal. A respective precision mechanical structure is provided with the X-ray monochromator and the X-ray analyzer. The precision mechanical structure includes a base plate, such as an aluminum base plate; positioning stages for D-crystal alignment; positioning stages with an incline sensor for C/F/W-crystal alignment, and the positioning stages including flexure-based high-stiffness structure.

  17. Temporal steering and security of quantum key distribution with mutually unbiased bases against individual attacks

    NASA Astrophysics Data System (ADS)

    Bartkiewicz, Karol; Černoch, Antonín; Lemr, Karel; Miranowicz, Adam; Nori, Franco

    2016-06-01

    Temporal steering, which is a temporal analog of Einstein-Podolsky-Rosen steering, refers to temporal quantum correlations between the initial and final state of a quantum system. Our analysis of temporal steering inequalities in relation to the average quantum bit error rates reveals the interplay between temporal steering and quantum cloning, which guarantees the security of quantum key distribution based on mutually unbiased bases against individual attacks. The key distributions analyzed here include the Bennett-Brassard 1984 protocol and the six-state 1998 protocol by Bruss. Moreover, we define a temporal steerable weight, which enables us to identify a kind of monogamy of temporal correlation that is essential to quantum cryptography and useful for analyzing various scenarios of quantum causality.

  18. Citation Patterns in the Computer-Based Instruction Literature.

    ERIC Educational Resources Information Center

    Wedman, John F.

    1987-01-01

    This study examined indirect communication patterns among professionals in the computer-based instruction field by analyzing citations from the Journal of Computer-Based Instruction. The patterns found provide the basis for identifying invisible colleges, defined here as communication networks that facilitate the diffusion of knowledge and direct…

  19. A Web-based Examination System Based on PHP+MySQL.

    PubMed

    Wen, Ji; Zhang, Yang; Yan, Yong; Xia, Shunren

    2005-01-01

    The design and implementation of a web-based examination system built with PHP and MySQL is presented in this paper. Its three primary parts, for students, teachers and administrators, are introduced and analyzed in detail. Initial application has demonstrated the system's feasibility and practicality.

  20. Design and realization of confidential data management system RFID-based

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Wang, Zhong; Wang, Xin

    2017-03-01

    This paper introduces the composition of the RFID system, then analyzes the hardware and software design of the system, and finally summarizes the realization and application of the RFID-based confidential data management system.

  1. Congestion-based emergency vehicle preemption.

    DOT National Transportation Integrated Search

    2010-08-01

    This research analyzed and evaluated a new strategy for preemption of emergency vehicles along a corridor, which is : route-based and adaptive to real-time traffic conditions. The method uses dynamic offsets which are adjusted using : congestion-leve...

  2. Assessing the SunGuide and STEWARD databases.

    DOT National Transportation Integrated Search

    2017-02-01

    This project evaluated the feasibility of using the existing software and data bases as platforms : for analyzing the attributes of electric vehicles within present and future transportation : infrastructure projects and models. The Florida based Sun...

  3. Friction stir welding of Zr-based bulk metallic glass

    NASA Astrophysics Data System (ADS)

    Ji, Y. S.; Fujii, H.; Maeda, M.; Nakata, K.; Kimura, H.; Inoue, A.; Nogi, K.

    2009-05-01

    A Zr55Cu30Al10Ni5 bulk metallic glass plate was successfully welded below its crystallization temperature by friction stir welding. Flash formation and heat concentration at the shoulder edge were minimized by using a wider tool with a recessed shoulder surface angled at 3°. To assess crystallization of the base material and the stir zone, the microstructure and mechanical properties were analyzed using DSC, XRD, TEM, and micro-hardness measurements. As a result, it was found that the amorphous structure and the original mechanical properties were maintained throughout the joints.

  4. Morphology and transport in biodegradable polymer compositions based on poly(3-hydroxybutyrate) and polyamide 54C

    NASA Astrophysics Data System (ADS)

    Zhul'Kina, A. L.; Ivantsova, E. L.; Filatova, A. G.; Kosenko, R. Yu.; Gumargalieva, K. Z.; Iordanskii, A. L.

    2009-05-01

    A complex investigation of the equilibrium sorption of water, the diffusive transport of an antiseptic, and the morphology of mixed compositions based on polyoxybutyrate and polyamide resin 54C has been performed to develop and analyze new biodegradable polymer compositions for the controlled release of medicinal substances. Samples of the mixtures were prepared by two methods: pressing under pressure and solvent evaporation from a polymer solution. The samples were compared and their morphology was analyzed by scanning electron microscopy. It is shown that the component ratio in the obtained mixtures affects their morphological, transport, and sorption characteristics.

  5. Model-based query language for analyzing clinical processes.

    PubMed

    Barzdins, Janis; Barzdins, Juris; Rencis, Edgars; Sostaks, Agris

    2013-01-01

    Nowadays large databases of clinical process data exist in hospitals. However, these data are rarely used in full scope. In order to perform queries on hospital processes, one must either choose from the predefined queries or develop queries using MS Excel-type software system, which is not always a trivial task. In this paper we propose a new query language for analyzing clinical processes that is easily perceptible also by non-IT professionals. We develop this language based on a process modeling language which is also described in this paper. Prototypes of both languages have already been verified using real examples from hospitals.

  6. [New methods for the evaluation of bone quality. Assessment of bone structural property using imaging].

    PubMed

    Ito, Masako

    The structural properties of bone include the micro- and nano-structural properties of the trabecular and cortical bone, and the macroscopic geometry. Radiological techniques are useful for analyzing bone structural properties; multi-detector row CT (MDCT) or high-resolution peripheral QCT (HR-pQCT) is available to analyze human bone in vivo. For the analysis of hip geometry, CT-based hip structure analysis (HSA) is available as well as DXA-based HSA. These structural parameters are related to biomechanical properties, and these assessment tools provide information on pathological changes or the effects of anti-osteoporotic agents on bone.

  7. Analyzing of economic growth based on electricity consumption from different sources

    NASA Astrophysics Data System (ADS)

    Maksimović, Goran; Milosavljević, Valentina; Ćirković, Bratislav; Milošević, Božidar; Jović, Srđan; Alizamir, Meysam

    2017-10-01

    Economic growth can be influenced by different factors. In this study, economic growth was analyzed based on electricity consumption from different sources, with gross domestic product (GDP) used as the economic growth indicator. The ANFIS (adaptive neuro-fuzzy inference system) methodology was applied to determine which factors from the given set are most important for predicting GDP growth. Six inputs were used: electricity production from coal, hydroelectric, natural gas, nuclear, oil and renewable sources. The results show that electricity consumption from renewable sources has the highest impact on the prediction of economic or GDP growth.
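
    ANFIS itself is not part of the standard Python scientific stack, but the input-selection idea, ranking which electricity source best predicts GDP growth, can be sketched with a simpler surrogate model. The synthetic data and the use of random-forest feature importances below are illustrative assumptions only, not the study's method:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
sources = ["coal", "hydro", "gas", "nuclear", "oil", "renewables"]

# Invented panel: electricity production per source (columns) and GDP growth (target)
X = rng.uniform(0, 100, size=(60, len(sources)))
gdp_growth = 0.05 * X[:, 5] + 0.02 * X[:, 0] + rng.normal(0, 0.5, 60)  # synthetic link

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, gdp_growth)
for name, imp in sorted(zip(sources, model.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:10s} importance = {imp:.3f}")
```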

  8. Stokes vector based interpolation method to improve the efficiency of bio-inspired polarization-difference imaging in turbid media

    NASA Astrophysics Data System (ADS)

    Guan, Jinge; Ren, Wei; Cheng, Yaoyu

    2018-04-01

    We demonstrate an efficient polarization-difference imaging system in turbid conditions by using the Stokes vector of light. The interaction of scattered light with the polarizer is analyzed by the Stokes-Mueller formalism. An interpolation method is proposed to replace the mechanical rotation of the polarization axis of the analyzer theoretically, and its performance is verified by the experiment at different turbidity levels. We show that compared with direct imaging, the Stokes vector based imaging method can effectively reduce the effect of light scattering and enhance the image contrast.
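
    The interpolation idea can be made concrete with the Mueller-calculus relation I(θ) = ½(S0 + S1·cos 2θ + S2·sin 2θ) for the intensity behind a linear analyzer at angle θ: once S0, S1 and S2 are estimated per pixel from a few measured angles, any analyzer angle can be evaluated numerically instead of mechanically. A minimal sketch with invented pixel values, not the authors' implementation:

```python
import numpy as np

def stokes_from_intensities(angles_deg, intensities):
    """Recover S0, S1, S2 per pixel from intensities measured behind a linear
    analyzer at a few known angles, using I(theta) = 0.5*(S0 + S1*cos2t + S2*sin2t)."""
    th = np.deg2rad(np.asarray(angles_deg))
    A = 0.5 * np.stack([np.ones_like(th), np.cos(2 * th), np.sin(2 * th)], axis=1)
    flat = np.asarray(intensities).reshape(len(angles_deg), -1)
    S, *_ = np.linalg.lstsq(A, flat, rcond=None)     # least-squares fit per pixel
    return S.reshape((3,) + np.asarray(intensities).shape[1:])

def intensity_at(S, theta_deg):
    """'Rotate the analyzer' numerically instead of mechanically."""
    th = np.deg2rad(theta_deg)
    return 0.5 * (S[0] + S[1] * np.cos(2 * th) + S[2] * np.sin(2 * th))

# Hypothetical 2x2-pixel image measured at three analyzer angles
I_meas = np.array([[[1.0, 0.8], [0.6, 0.9]],
                   [[0.7, 0.5], [0.5, 0.6]],
                   [[0.4, 0.3], [0.45, 0.35]]])
S = stokes_from_intensities([0, 45, 90], I_meas)
pdi = intensity_at(S, 0) - intensity_at(S, 90)   # polarization-difference image
print(pdi)
```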

  9. Thermodynamic analysis of chemical compatibility of several compounds with Fe-Cr-Al alloys

    NASA Technical Reports Server (NTRS)

    Misra, Ajay K.

    1993-01-01

    Chemical compatibility between Fe-19.8Cr-4.8Al (weight percent), which is the base composition for the commercial superalloy MA956, and several carbides, borides, nitrides, oxides, and silicides was analyzed from thermodynamic considerations. The effect of addition of minor alloying elements, such as Ti, Y, and Y2O3, to the Fe-Cr-Al alloy on chemical compatibility between the alloy and various compounds was also analyzed. Several chemically compatible compounds that can be potential reinforcement materials and/or interface coating materials for Fe-Cr-Al based composites were identified.

  10. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform

    PubMed Central

    Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features. PMID:27304979

  11. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform.

    PubMed

    Wu, Hau-Tieng; Wu, Han-Kuei; Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features.

  12. Morphology and transport in biodegradable polymer compositions based on poly(3-hydroxybutyrate) and polyamide 54C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhul'kina, A. L.; Ivantsova, E. L.; Filatova, A. G.

    2009-05-15

    A complex investigation of the equilibrium sorption of water, the diffusive transport of an antiseptic, and the morphology of mixed compositions based on polyoxybutyrate and polyamide resin 54C has been performed to develop and analyze new biodegradable polymer compositions for the controlled release of medicinal substances. Samples of the mixtures were prepared by two methods: pressing under pressure and solvent evaporation from a polymer solution. The samples were compared and their morphology was analyzed by scanning electron microscopy. It is shown that the component ratio in the obtained mixtures affects their morphological, transport, and sorption characteristics.

  13. Study on ultra-fast single photon counting spectrometer based on PCI

    NASA Astrophysics Data System (ADS)

    Zhang, Xi-feng

    2010-10-01

    The time-correlated single photon counting spectrometer developed here uses PCI bus technology. We developed an ultrafast data acquisition card based on PCI to replace the original multi-channel analyzer. The theory and design of the spectrometer are presented in detail, and the operating procedure is introduced together with the integration of the system. Many standard samples have been measured, and the data have been analyzed and compared. Experimental results show that the spectrometer reaches single-photon-counting sensitivity with picosecond-level fluorescence lifetime and time resolution. The instrument can also measure time-resolved spectra.

  14. Analysis of dangerous area of single berth oil tanker operations based on CFD

    NASA Astrophysics Data System (ADS)

    Shi, Lina; Zhu, Faxin; Lu, Jinshu; Wu, Wenfeng; Zhang, Min; Zheng, Hailin

    2018-04-01

    Taking a single berthed oil tanker handling liquid cargo as the research object, we analyzed the theory of VOC diffusion during single-berth oil tanker operations, built a mesh model of VOC diffusion with the Gambit preprocessor, set up the simulation boundary conditions, and used Fluent software to simulate how the VOC concentration at five detection points changes with time under the influence of specific factors. We then analyzed the dangerous area of single-berth oil tanker operations based on the simulated VOC diffusion, so as to ensure the safe operation of oil tankers.

  15. Improved Extreme Learning Machine based on the Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Cui, Licheng; Zhai, Huawei; Wang, Benchao; Qu, Zengtang

    2018-03-01

    The extreme learning machine (ELM) and its existing variants have weaknesses such as computational complexity and learning error. After a detailed analysis, and drawing on the importance of hidden nodes in SVM, a novel method for analyzing the sensitivity of hidden nodes is proposed that matches intuitive expectations. Based on this, an improved ELM is proposed that can remove hidden nodes before the learning-error criterion is met and can efficiently manage the number of hidden nodes, thereby improving performance. Comparative tests show that it performs better in learning time, accuracy and other respects.
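
    At its core, an ELM is a single hidden layer with random, fixed input weights whose output weights come from one least-squares solve; pruning hidden nodes then trades accuracy against model size. A minimal regression sketch (toy data, node count and activation chosen arbitrarily), not the improved algorithm proposed in the paper:

```python
import numpy as np

class ELM:
    """Minimal extreme learning machine: random hidden layer + least-squares readout."""
    def __init__(self, n_hidden, seed=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                    # hidden-layer activations
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Toy regression problem
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)

model = ELM(n_hidden=40, seed=1).fit(X, y)
print("train RMSE:", np.sqrt(np.mean((model.predict(X) - y) ** 2)))
```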

  16. Multiple targets detection method in detection of UWB through-wall radar

    NASA Astrophysics Data System (ADS)

    Yang, Xiuwei; Yang, Chuanfa; Zhao, Xingwen; Tian, Xianzhong

    2017-11-01

    In this paper, the problems and difficulties encountered in the detection of multiple moving targets by UWB radar are analyzed. The experimental environment and the through-wall radar system are established. An adaptive threshold method based on local statistics is proposed to effectively filter out clutter interference. The moving targets are then analyzed, and false targets are further filtered out by extracting target features. Based on the correlation between targets, a target matching algorithm is proposed to improve detection accuracy. Finally, the effectiveness of the above methods is verified by practical experiments.
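
    The local-area adaptive threshold resembles a cell-averaging CFAR detector: each sample is compared with a multiple of the mean of its neighbourhood (excluding a small guard band) rather than with one global threshold. A minimal one-dimensional sketch on synthetic clutter, offered only as an illustration of the general idea:

```python
import numpy as np

def local_adaptive_threshold(signal, window=25, guard=3, factor=3.0):
    """Flag samples exceeding `factor` times the mean of their local neighbourhood,
    excluding a small guard band around the sample itself (CFAR-style)."""
    detections = np.zeros_like(signal, dtype=bool)
    n = len(signal)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        neighbourhood = np.concatenate(
            [signal[lo:max(lo, i - guard)], signal[min(hi, i + guard + 1):hi]]
        )
        if neighbourhood.size and signal[i] > factor * neighbourhood.mean():
            detections[i] = True
    return detections

rng = np.random.default_rng(0)
clutter = rng.exponential(1.0, 500)          # synthetic clutter background
clutter[[120, 310]] += 15.0                  # two synthetic target returns
hits = np.flatnonzero(local_adaptive_threshold(clutter))
print("detections at samples:", hits)
```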

  17. Fuzzy Logic-based expert system for evaluating cake quality of freeze-dried formulations.

    PubMed

    Trnka, Hjalte; Wu, Jian X; Van De Weert, Marco; Grohganz, Holger; Rantanen, Jukka

    2013-12-01

    Freeze-drying of peptide and protein-based pharmaceuticals is an increasingly important field of research. The diverse nature of these compounds, limited understanding of excipient functionality, and difficult-to-analyze quality attributes together with the increasing importance of the biosimilarity concept complicate the development phase of safe and cost-effective drug products. To streamline the development phase and to make high-throughput formulation screening possible, efficient solutions for analyzing critical quality attributes such as cake quality with minimal material consumption are needed. The aim of this study was to develop a fuzzy logic system based on image analysis (IA) for analyzing cake quality. Freeze-dried samples with different visual quality attributes were prepared in well plates. Imaging solutions together with image analytical routines were developed for extracting critical visual features such as the degree of cake collapse, glassiness, and color uniformity. On the basis of the IA outputs, a fuzzy logic system for analysis of these freeze-dried cakes was constructed. After this development phase, the system was tested with a new screening well plate. The developed fuzzy logic-based system was found to give comparable quality scores with visual evaluation, making high-throughput classification of cake quality possible. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
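
    A fuzzy-logic scorer of this kind maps image-analysis outputs through membership functions and a small rule base to a crisp quality score. The sketch below is a toy Mamdani-style example; the two inputs, the membership functions and the rules are invented and are not the published system:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def cake_quality_score(collapse, uniformity):
    """Toy fuzzy rating of a freeze-dried cake from two image-analysis outputs,
    both scaled to 0..1 (collapse: 0 = none, 1 = severe; uniformity: 1 = best)."""
    collapse_low  = tri(collapse,  -0.01, 0.0, 0.4)
    collapse_high = tri(collapse,   0.3,  1.0, 1.01)
    uniform_good  = tri(uniformity, 0.5,  1.0, 1.01)
    uniform_poor  = tri(uniformity, -0.01, 0.0, 0.5)

    # Rule strengths (min as AND), each mapped to a crisp quality value on a 1..9 scale
    rules = [
        (min(collapse_low,  uniform_good), 9.0),   # "good cake"
        (min(collapse_low,  uniform_poor), 5.0),   # "acceptable"
        (min(collapse_high, uniform_good), 4.0),
        (min(collapse_high, uniform_poor), 1.0),   # "reject"
    ]
    total = sum(r for r, _ in rules)
    return sum(r * v for r, v in rules) / total if total else 0.0

print(cake_quality_score(collapse=0.1, uniformity=0.9))   # high score expected
print(cake_quality_score(collapse=0.8, uniformity=0.2))   # low score expected
```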

  18. Breast tumor segmentation in high resolution x-ray phase contrast analyzer based computed tomography.

    PubMed

    Brun, E; Grandl, S; Sztrókay-Gaul, A; Barbone, G; Mittone, A; Gasilov, S; Bravin, A; Coan, P

    2014-11-01

    Phase contrast computed tomography has emerged as an imaging method, which is able to outperform present day clinical mammography in breast tumor visualization while maintaining an equivalent average dose. To this day, no segmentation technique takes into account the specificity of the phase contrast signal. In this study, the authors propose a new mathematical framework for human-guided breast tumor segmentation. This method has been applied to high-resolution images of excised human organs, each of several gigabytes. The authors present a segmentation procedure based on the viscous watershed transform and demonstrate the efficacy of this method on analyzer based phase contrast images. The segmentation of tumors inside two full human breasts is then shown as an example of this procedure's possible applications. A correct and precise identification of the tumor boundaries was obtained and confirmed by manual contouring performed independently by four experienced radiologists. The authors demonstrate that applying the watershed viscous transform allows them to perform the segmentation of tumors in high-resolution x-ray analyzer based phase contrast breast computed tomography images. Combining the additional information provided by the segmentation procedure with the already high definition of morphological details and tissue boundaries offered by phase contrast imaging techniques, will represent a valuable multistep procedure to be used in future medical diagnostic applications.
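
    Marker-guided watershed segmentation of a tomographic slice can be sketched with scikit-image; note that the standard watershed transform is used here as a stand-in for the viscous watershed described in the paper, and the image and seed points are synthetic:

```python
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

# Synthetic "tomographic slice": a bright blob (lesion-like) on a noisy background
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:128, 0:128]
image = np.exp(-((yy - 64) ** 2 + (xx - 64) ** 2) / (2 * 15 ** 2))
image += 0.1 * rng.standard_normal(image.shape)

# Human guidance is mimicked by seed markers: one inside the lesion, one in the background
markers = np.zeros(image.shape, dtype=int)
markers[64, 64] = 1          # foreground seed
markers[5, 5] = 2            # background seed

# Flood the gradient image from the markers; label 1 is the segmented lesion
labels = watershed(sobel(image), markers)
print("segmented lesion area (pixels):", int(np.sum(labels == 1)))
```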

  19. Place-Based Education: What Is Its Place in the Social Studies Classroom?

    ERIC Educational Resources Information Center

    Resor, Cynthia Williams

    2010-01-01

    Place-based education is a growing trend in education. This article defines place-based education and briefly examines its use across the disciplines. So as to better understand the wider concept, meanings of the geographical term "place" are analyzed. Place-based education in a social studies classroom is examined using two hypothetical…

  20. An Electronic Library-Based Learning Environment for Supporting Web-Based Problem-Solving Activities

    ERIC Educational Resources Information Center

    Tsai, Pei-Shan; Hwang, Gwo-Jen; Tsai, Chin-Chung; Hung, Chun-Ming; Huang, Iwen

    2012-01-01

    This study aims to develop an electronic library-based learning environment to support teachers in developing web-based problem-solving activities and analyzing the online problem-solving behaviors of students. Two experiments were performed in this study. In study 1, an experiment on 103 elementary and high school teachers (the learning activity…

  1. BETA: Behavioral testability analyzer and its application to high-level test generation and synthesis for testability. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chen, Chung-Hsing

    1992-01-01

    In this thesis, a behavioral-level testability analysis approach is presented. This approach is based on analyzing the circuit behavioral description (similar to a C program) to estimate its testability by identifying controllable and observable circuit nodes. This information can be used by a test generator to gain better access to internal circuit nodes and to reduce its search space. The results of the testability analyzer can also be used to select test points or partial scan flip-flops in the early design phase. Based on the selection criteria, a novel Synthesis for Testability approach called Test Statement Insertion (TSI) is proposed, which modifies the circuit behavioral description directly. Test Statement Insertion can also be used to modify the circuit structural description to improve its testability. As a result, the Synthesis for Testability methodology can be combined with an existing behavioral synthesis tool to produce more testable circuits.

  2. Exploring the limits of cryospectroscopy: Least-squares based approaches for analyzing the self-association of HCl

    NASA Astrophysics Data System (ADS)

    De Beuckeleer, Liene I.; Herrebout, Wouter A.

    2016-02-01

    To rationalize the concentration dependent behavior observed for a large spectral data set of HCl recorded in liquid argon, least-squares based numerical methods are developed and validated. In these methods, for each wavenumber a polynomial is used to mimic the relation between monomer concentrations and measured absorbances. Least-squares fitting of higher degree polynomials tends to overfit and thus leads to compensation effects where a contribution due to one species is compensated for by a negative contribution of another. The compensation effects are corrected for by carefully analyzing, using AIC and BIC information criteria, the differences observed between consecutive fittings when the degree of the polynomial model is systematically increased, and by introducing constraints prohibiting negative absorbances to occur for the monomer or for one of the oligomers. The method developed should allow other, more complicated self-associating systems to be analyzed with a much higher accuracy than before.
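
    The model-selection step described above, fitting polynomials of increasing degree and comparing AIC/BIC rather than simply taking the lowest residual, can be sketched as follows; the synthetic absorbance-versus-concentration data and the Gaussian-error form of the information criteria are illustrative assumptions:

```python
import numpy as np

def fit_polynomials(x, y, max_degree=6):
    """Fit polynomials of increasing degree and report AIC/BIC for each, so that
    the model with the lowest information criterion can be selected instead of
    the model with the lowest (over)fit residual."""
    n = len(x)
    results = []
    for d in range(1, max_degree + 1):
        coeffs = np.polyfit(x, y, d)
        rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
        k = d + 1                                   # number of fitted parameters
        aic = n * np.log(rss / n) + 2 * k
        bic = n * np.log(rss / n) + k * np.log(n)
        results.append((d, aic, bic))
    return results

# Synthetic "absorbance vs. monomer concentration" data following a quadratic law
rng = np.random.default_rng(3)
conc = np.linspace(0.1, 1.0, 25)
absorb = 0.8 * conc + 0.5 * conc**2 + 0.01 * rng.standard_normal(conc.size)

for d, aic, bic in fit_polynomials(conc, absorb):
    print(f"degree {d}: AIC = {aic:7.1f}  BIC = {bic:7.1f}")
```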

  3. Robust failure detection filters. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Sanmartin, A. M.

    1985-01-01

    The robustness of detection filters applied to the detection of actuator failures on a free-free beam is analyzed. This analysis is based on computer simulation tests of the detection filters in the presence of different types of model mismatch, and on frequency response functions of the transfers corresponding to the model mismatch. The robustness of detection filters based on a model of the beam containing a large number of structural modes varied dramatically with the placement of some of the filter poles. The dynamics of these filters were very hard to analyze. The design of detection filters with a number of modes equal to the number of sensors was trivial. They can be configured to detect any number of actuator failure events. The dynamics of these filters were very easy to analyze and their robustness properties were much improved. A change of the output transformation allowed the filter to perform satisfactorily with realistic levels of model mismatch.

  4. Rhombic micro-displacement amplifier for piezoelectric actuator and its linear and hybrid model

    NASA Astrophysics Data System (ADS)

    Chen, Jinglong; Zhang, Chunlin; Xu, Minglong; Zi, Yanyang; Zhang, Xinong

    2015-01-01

    This paper proposes a rhombic micro-displacement amplifier (RMDA) for piezoelectric actuators (PA). First, the geometric amplification relations are analyzed and a linear model is built to analyze the mechanical and electrical properties of this amplifier. Next, an accurate modeling method for the amplifier is studied for the important application of precise servo control. The classical Preisach model (CPM) is generally implemented using a numerical technique based on first-order reversal curves (FORCs), and its accuracy mainly depends on the number of FORCs. However, it is generally difficult to obtain a sufficient number of FORCs in practice, so a Support Vector Machine (SVM) is employed in this work to circumvent this deficiency of the CPM. A hybrid model based on the discrete CPM and SVM is then developed to account for hysteresis and dynamic effects. Finally, experimental validation is carried out. The results show that this amplifier with the hybrid model is suitable for control applications.

  5. Analyzing Carbohydrate-Protein Interaction Based on Single Plasmonic Nanoparticle by Conventional Dark Field Microscopy.

    PubMed

    Jin, Hong-Ying; Li, Da-Wei; Zhang, Na; Gu, Zhen; Long, Yi-Tao

    2015-06-10

    We demonstrated a practical method to analyze carbohydrate-protein interaction based on single plasmonic nanoparticles by conventional dark field microscopy (DFM). Protein concanavalin A (ConA) was modified on large sized gold nanoparticles (AuNPs), and dextran was conjugated on small sized AuNPs. As the interaction between ConA and dextran resulted in two kinds of gold nanoparticles coupled together, which caused coupling of plasmonic oscillations, apparent color changes (from green to yellow) of the single AuNPs were observed through DFM. Then, the color information was instantly transformed into a statistic peak wavelength distribution in less than 1 min by a self-developed statistical program (nanoparticleAnalysis). In addition, the interaction between ConA and dextran was proved with biospecific recognition. This approach is high-throughput and real-time, and is a convenient method to analyze carbohydrate-protein interaction at the single nanoparticle level efficiently.

  6. Panel acoustic contribution analysis.

    PubMed

    Wu, Sean F; Natarajan, Logesh Kumar

    2013-02-01

    Formulations are derived to analyze the relative panel acoustic contributions of a vibrating structure. The essence of this analysis is to correlate the acoustic power flow from each panel to the radiated acoustic pressure at any field point. The acoustic power is obtained by integrating the normal component of the surface acoustic intensity, which is the product of the surface acoustic pressure and normal surface velocity reconstructed by using the Helmholtz equation least squares based nearfield acoustical holography, over each panel. The significance of this methodology is that it enables one to analyze and rank relative acoustic contributions of individual panels of a complex vibrating structure to acoustic radiation anywhere in the field based on a single set of the acoustic pressures measured in the near field. Moreover, this approach is valid for both interior and exterior regions. Examples of using this method to analyze and rank the relative acoustic contributions of a scaled vehicle cabin are demonstrated.
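
    The panel-contribution idea, integrating the normal component of the time-averaged surface acoustic intensity Re{p·v*}/2 over each panel and ranking the resulting powers, can be sketched as follows with invented surface data (the reconstruction of p and v by nearfield acoustical holography is assumed to have been done already):

```python
import numpy as np

def panel_powers(pressure, velocity, areas, panel_ids):
    """Time-averaged acoustic power radiated by each panel.

    pressure, velocity : complex surface pressure and normal velocity per surface
    patch (e.g. reconstructed by nearfield acoustical holography);
    areas : patch areas in m^2; panel_ids : which panel each patch belongs to.
    """
    intensity = 0.5 * np.real(pressure * np.conj(velocity))   # normal intensity per patch
    power = {}
    for pid in np.unique(panel_ids):
        mask = panel_ids == pid
        power[int(pid)] = float(np.sum(intensity[mask] * areas[mask]))
    return power

# Invented example: 6 surface patches grouped into 3 panels
p   = np.array([1+0.2j, 0.9+0.1j, 0.3+0.5j, 0.2+0.4j, 1.5-0.3j, 1.4-0.2j])
v   = np.array([0.01+0.002j, 0.009+0.012j, 0.02+0.01j, 0.018+0.012j, 0.03-0.005j, 0.028-0.004j])
A   = np.full(6, 0.05)
ids = np.array([0, 0, 1, 1, 2, 2])

for pid, P in sorted(panel_powers(p, v, A, ids).items(), key=lambda kv: -kv[1]):
    print(f"panel {pid}: radiated power = {P:.4e} W")
```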

  7. Analysis of preparation of Chinese traditional medicine based on the fiber fingerprint drop trace

    NASA Astrophysics Data System (ADS)

    Zhang, Zhilin; Wang, Jialu; Sun, Weimin; Yan, Qi

    2010-11-01

    The purpose of the fiber micro-drop analyzing technique is to measure the characteristics of liquids using optical methods. The fiber fingerprint drop trace (FFDT) is a curve of light intensity vs. time. This curve indicates the forming, growing and dripping processes of the liquid drops. A pair of fibers was used to monitor the dripping process. The FFDTs are acquired and analyzed by a computer. Different liquid samples of many kinds of preparation of Chinese traditional medicines were tested by using the fiber micro-drop sensor in the experiments. The FFDTs of preparation of Chinese traditional medicines with different concentrations were analyzed in different ways. Considering the characters of the FFDTs, a novel method is proposed to measure the different preparation of Chinese traditional medicines and its concentration based on the corresponding relationship of FFDTs and the physical and chemical parameters of the liquids.

  8. Bleach Gel: A Simple Agarose Gel for Analyzing RNA Quality

    PubMed Central

    Aranda, Patrick S.; LaJoie, Dollie M.; Jorcyk, Cheryl L.

    2013-01-01

    RNA-based applications requiring high quality, non-degraded RNA are a foundational element of many research studies. As such, it is paramount that the integrity of experimental RNA is validated prior to cDNA synthesis or other downstream applications. In the absence of expensive equipment such as microfluidic electrophoretic devices, and as an alternative to the costly and time-consuming standard formaldehyde gel, RNA quality can be quickly analyzed by adding small amounts of commercial bleach to TAE buffer-based agarose gels prior to electrophoresis. In the presence of low concentrations of bleach, the secondary structure of RNA is denatured and potential contaminating RNases are destroyed. Because of this, the ‘bleach gel’ is a functional approach that addresses the need for an inexpensive and safe way to evaluate RNA integrity and will improve the ability of researchers to rapidly analyze RNA quality. PMID:22222980

  9. Tour-based model development for TxDOT : implementation steps for the tour-based model design option and the data needs.

    DOT National Transportation Integrated Search

    2009-10-01

    Travel demand modeling, in recent years, has seen a paradigm shift with an emphasis on analyzing travel at the : individual level rather than using direct statistical projections of aggregate travel demand as in the trip-based : approach. Specificall...

  10. Preservice Teachers' Conceptions and Enactments of Project-Based Instruction

    ERIC Educational Resources Information Center

    Marshall, Jill A.; Petrosino, Anthony J.; Martin, Taylor

    2010-01-01

    We present results of an investigation of preservice secondary mathematics and science teachers' conceptions of project-based instruction (PBI) and their enactments of PBI in apprentice (student) teaching. We evaluated their thinking and implementations within a composite framework based on the work of education researchers. We analyzed survey…

  11. Gaze-Based Assistive Technology - Usefulness in Clinical Assessments.

    PubMed

    Wandin, Helena

    2017-01-01

    Gaze-based assistive technology was used in informal clinical assessments. Excerpts of medical journals were analyzed by directed content analysis using a model of communicative competence. The results of this pilot study indicate that gaze-based assistive technology is a useful tool in communication assessments that can generate clinically relevant information.

  12. Project-Based Learning and Student Knowledge Construction during Asynchronous Online Discussion

    ERIC Educational Resources Information Center

    Koh, Joyce Hwee Ling; Herring, Susan C.; Hew, Khe Foon

    2010-01-01

    Project-based learning engages students in problem solving through artefact design. However, previous studies of online project-based learning have focused primarily on the dynamics of online collaboration; students' knowledge construction throughout this process has not been examined thoroughly. This case study analyzed the relationship between…

  13. Academic Departments and Student Attitudes toward Different Dimensions of Web-based Education.

    ERIC Educational Resources Information Center

    Federico, Pat-Anthony

    2001-01-01

    Describes research at the Naval Postgraduate School that investigated student attitudes toward various aspects of Web-based instruction. Results of a survey, which were analyzed using a variety of multivariate and univariate statistical techniques, showed significantly different attitudes toward different dimensions of Web-based education…

  14. Paraconsistent Annotated Logic in Viability Analysis: an Approach to Product Launching

    NASA Astrophysics Data System (ADS)

    Romeu de Carvalho, Fábio; Brunstein, Israel; Abe, Jair Minoro

    2004-08-01

    In this paper we present an application of the Para-analyzer, a logical analyzer based on the Paraconsistent Annotated Logic Pτ introduced by Da Silva Filho and Abe, to decision-making systems. An example is analyzed in detail, showing how uncertainty, inconsistency and paracompleteness can be elegantly handled with this logical system. As an application of the Para-analyzer in decision-making, we developed the BAM (Baricenter Analysis Method). In order to make the presentation easier, we present the BAM applied to the viability analysis of product launching. Techniques of Paraconsistent Annotated Logic have been applied in Artificial Intelligence, Robotics, Information Technology (Computer Science), etc.

  15. Note: A portable Raman analyzer for microfluidic chips based on a dichroic beam splitter for integration of imaging and signal collection light paths

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geng, Yijia; Xu, Shuping; Xu, Weiqing, E-mail: xuwq@jlu.edu.cn

    An integrated and portable Raman analyzer featuring an inverted probe fixed on a motor-driving adjustable optical module was designed for the combination of a microfluidic system. It possesses a micro-imaging function. The inverted configuration is advantageous to locate and focus microfluidic channels. Different from commercial micro-imaging Raman spectrometers using manual switchable light path, this analyzer adopts a dichroic beam splitter for both imaging and signal collection light paths, which avoids movable parts and improves the integration and stability of optics. Combined with surface-enhanced Raman scattering technique, this portable Raman micro-analyzer is promising as a powerful tool for microfluidic analytics.

  16. International Space Station Major Constituent Analyzer On-Orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Phillip M.; Thoresen, Souzan; Wiedemann, Rachel; Matty, Chris

    2015-01-01

    The Major Constituent Analyzer is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. Improvements to ion pump operation and ion source tuning have improved lifetime performance of the current ORU 02 design. The most recent ORU 02 analyzer assemblies, as well as ORU 08, have operated nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance and logistical support. Monitoring several key parameters provides the capacity to monitor ORU health and properly anticipate end of life.

  17. Analyzing Who, When, and Where: Data for Better Targeting of Resources for School-Based Asthma Interventions

    ERIC Educational Resources Information Center

    Raun, Loren H.; Campos, Laura A.; Stevenson, Elizabeth; Ensor, Katherine B.; Johnson, Gwen; Persse, David

    2017-01-01

    Background: Rates of uncontrolled asthma vary by demographics, space, and time. This article uses data on ambulance-treated asthma attacks in children to analyze these variations so that school districts can improve their asthma management interventions. Methods: Incidence rates of 1826 ambulance-treated asthma attacks for children aged 5-18 years…

  18. Using Networks to Visualize and Analyze Process Data for Educational Assessment

    ERIC Educational Resources Information Center

    Zhu, Mengxiao; Shu, Zhan; von Davier, Alina A.

    2016-01-01

    New technology enables interactive and adaptive scenario-based tasks (SBTs) to be adopted in educational measurement. At the same time, it is a challenging problem to build appropriate psychometric models to analyze data collected from these tasks, due to the complexity of the data. This study focuses on process data collected from SBTs. We…

  19. The Effect of Video-Based Approach on Prospective Teachers' Ability to Analyze Mathematics Teaching

    ERIC Educational Resources Information Center

    Alsawaie, Othman N.; Alghazo, Iman M.

    2010-01-01

    This is an intervention study that explored the effect of using video lesson analysis methodology (VLAM) on the ability of prospective middle/high school mathematics teachers to analyze mathematics teaching. The sample of the study consisted of 26 female prospective mathematics teachers enrolled in a methods course at the United Arab Emirates…

  20. Analysis of Calibration Errors for Both Short and Long Stroke White Light Experiments

    NASA Technical Reports Server (NTRS)

    Pan, Xaiopei

    2006-01-01

    This work will analyze focusing and tilt variations introduced by thermal changes in calibration processes. In particular the accuracy limits are presented for common short- and long-stroke experiments. A new, simple, practical calibration scheme is proposed and analyzed based on the SIM PlanetQuest's Micro-Arcsecond Metrology (MAM) testbed experiments.

  1. The Instructed Learning of Form-Function Mappings in the English Article System

    ERIC Educational Resources Information Center

    Zhao, Helen; MacWhinney, Brian

    2018-01-01

    This article analyzes the instructed learning of the English article system by second language (L2) learners. The Competition Model (MacWhinney, 1987, 2012) was adopted as the theoretical framework for analyzing the cues to article usage and for designing effective computer-based article instruction. Study 1 found that article cues followed a…

  2. Chemical Speciation Analysis of Sports Drinks by Acid-Base Titrimetry and Ion Chromatography: A Challenging Beverage Formulation Project

    ERIC Educational Resources Information Center

    Drossman, Howard

    2007-01-01

    Students have standardized a sodium hydroxide solution and analyzed commercially available sports drinks by titrimetric analysis of the triprotic citric acid, dihydrogen phosphate, and dihydrogen citrate and by ion chromatography for chloride, total phosphate and citrate. These experiments are interesting examples of analyzing real-world food and…

  3. Socioeconomic Indicators for Analyzing Convergence: The Case of Greece--1960-2004

    ERIC Educational Resources Information Center

    Liargovas, Panagiotis G.; Fotopoulos, Georgios

    2009-01-01

    The purpose of this paper is to use socioeconomic indicators for analyzing convergence within Greece at regional (NUTS II) and prefecture levels (NUTS III) since 1960. We use two alternative approaches. The first one is based on the coefficient of variation and the second one on quality of life rankings. We confirm the decline of regional…

  4. Interpreting Variance Components as Evidence for Reliability and Validity.

    ERIC Educational Resources Information Center

    Kane, Michael T.

    The reliability and validity of measurement is analyzed by a sampling model based on generalizability theory. A model for the relationship between a measurement procedure and an attribute is developed from an analysis of how measurements are used and interpreted in science. The model provides a basis for analyzing the concept of an error of…

  5. Bullying Victimization among Music Ensemble and Theatre Students in the United States

    ERIC Educational Resources Information Center

    Elpus, Kenneth; Carter, Bruce Allen

    2016-01-01

    The purpose of this study was to analyze the prevalence of reported school victimization through physical, verbal, social/relational, and cyberbullying aggression among music ensemble and theatre students in the middle and high schools of the United States as compared to their peers involved in other school-based activities. We analyzed nationally…

  6. To See or Not to See: Analyzing Difficulties in Geometry from the Perspective of Visual Perception

    ERIC Educational Resources Information Center

    Gal, Hagar; Linchevski, Liora

    2010-01-01

    In this paper, we consider theories about processes of visual perception and perception-based knowledge representation (VPR) in order to explain difficulties encountered in figural processing in junior high school geometry tasks. In order to analyze such difficulties, we take advantage of the following perspectives of VPR: (1) Perceptual…

  7. Rasch Based Analysis of Oral Proficiency Test Data.

    ERIC Educational Resources Information Center

    Nakamura, Yuji

    2001-01-01

    This paper examines the rating scale data of oral proficiency tests analyzed by a Rasch Analysis focusing on an item map and factor analysis. In discussing the item map, the difficulty order of six items and students' answering patterns are analyzed using descriptive statistics and measures of central tendency of test scores. The data ranks the…

  8. TRAC Innovative Visualization Techniques

    DTIC Science & Technology

    2016-11-14

    Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and

  9. 40 CFR 1065.260 - Flame-ionization detector.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... concentrations on a carbon number basis of one, C1. For measuring THC or THCE you must use a FID analyzer. For... § 1065.205. Note that your FID-based system for measuring THC, THCE, or CH4 must meet all the... bias. (c) Heated FID analyzers. For measuring THC or THCE from compression-ignition engines, two-stroke...

  10. 40 CFR 1065.260 - Flame-ionization detector.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... concentrations on a carbon number basis of one, C1. For measuring THC or THCE you must use a FID analyzer. For... § 1065.205. Note that your FID-based system for measuring THC, THCE, or CH4 must meet all the... verification in § 1065.307. (c) Heated FID analyzers. For measuring THC or THCE from compression-ignition...

  11. 40 CFR 1065.260 - Flame-ionization detector.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... concentrations on a carbon number basis of one, C1. For measuring THC or THCE you must use a FID analyzer. For... § 1065.205. Note that your FID-based system for measuring THC, THCE, or CH4 must meet all the... bias. (c) Heated FID analyzers. For measuring THC or THCE from compression-ignition engines, two-stroke...

  12. DIRECT MERCURY ANALYSIS IN ENVIRONMENTAL SOLIDS BY ICPMS WITH ON-LINE SAMPLE ASHING AND MERCURY PRE-CONCENTRATION USING THE DIRECT MERCURY ANALYZER

    EPA Science Inventory



    A Direct Mercury Analyzer based on sample combustion and mercury concentration by gold amalgamation, followed by atomic absorption determination, was interfaced with a quadrupole and a magnet sector ICPMS. In this paper, we discuss design and operating parameters and eval...

  13. Fostering Skills to Enhance Critical Educators: A Pedagogical Proposal for Pre-Service Teachers

    ERIC Educational Resources Information Center

    Aguirre Morales, Jahir; Ramos Holguín, Bertha

    2011-01-01

    This article aims to share with teacher-educators a pedagogical proposal which we have applied in the past year. This investigation analyzes issues linked to critical pedagogy using movies connected to educational themes and readings based on critical pedagogy. We have used this study to generate class discussions in order to analyze educational…

  14. An Exploratory Study of the Influence That Analyzing Teaching Has on Preservice Teachers' Classroom Practice

    ERIC Educational Resources Information Center

    Sun, Jennifer; van Es, Elizabeth A.

    2015-01-01

    We designed a video-based course to develop preservice teachers' vision of ambitious instruction by decomposing instruction to learn to attend to student thinking and to examine how particular teaching moves influence student learning. In this study, we examine the influence that learning to systematically analyze ambitious pedagogy in the course…

  15. Case-Based Instruction in Post-Secondary Education: Developing Students' Problem-Solving Expertise.

    ERIC Educational Resources Information Center

    Ertmer, Peggy A.; Stepich, Donald A.

    This study was designed to explore changes in students' problem-solving skills as they analyzed instructional design case studies during a semester-long course. Nineteen students at two Midwestern universities analyzed six to ten case studies as part of their course assignments. Both quantitative and qualitative data were collected, with students'…

  16. Do Employees' Perceptions on Authentic Leadership Affect the Organizational Citizenship Behavior?: Turkish Context

    ERIC Educational Resources Information Center

    Yesilkaya, Mukaddes; Aydin, Peruzet

    2016-01-01

    The aim of this study is to analyze the relationship between employees' perceptions of authentic leadership and organizational citizenship behavior. In this context, research was carried out on four hundred public employees. The data from this study were analyzed via an appropriate statistical program and evaluated. Based on the findings from…

  17. Overview of Computer-Based Models Applicable to Freight Car Utilization

    DOT National Transportation Integrated Search

    1977-10-01

    This report documents a study performed to identify and analyze twenty-two of the important computer-based models of railroad operations. The models are divided into three categories: network simulations, yard simulations, and network optimizations. ...

  18. Computer-Based Training: Will it Replace You?

    ERIC Educational Resources Information Center

    Hudson, William J.

    1982-01-01

    Examines myths and fears about computer-based training (displaces trainers, dehumanizes learners), lists what computers cannot do (analyze needs, formulate objectives, act as subject experts), and what they can do effectively (handle knowledge transfer, provide simulation). (SK)

  19. Nanotechnology applied to treatment of mucopolysaccharidoses.

    PubMed

    Schuh, Roselena S; Baldo, Guilherme; Teixeira, Helder F

    2016-12-01

    Mucopolysaccharidoses (MPS) are genetic disorders caused by the accumulation of glycosaminoglycans due to deficiencies in the lysosomal enzymes responsible for their catabolism. Current treatments are not fully effective and are not available for all MPS types. Accordingly, researchers have tested novel therapies for MPS, including nanotechnology-based enzyme delivery systems and gene therapy. In this review, we aim to analyze some of the approaches involving nanotechnology as alternative treatments for MPS. Areas covered: We analyze nanotechnology-based systems, focusing on the biomaterials, such as polymers and lipids, that comprise these nanostructures, and we have highlighted studies that describe their use as enzyme and gene delivery systems for the treatment of MPS diseases. Expert opinion: Some protocols, such as the use of polymer-based systems or nanostructured carriers associated with enzymes and nanotechnology-based carriers for gene therapy, along with combined approaches, seem to be the future of MPS therapy.

  20. Experimental and theoretical study of the in- fiber twist sensor based on quasi-fan Solc structure filter.

    PubMed

    Sun, Chunran; Wang, Muguang; Jian, Shuisheng

    2017-08-21

    In this paper, a novel quasi-fan Solc structure filter based on elliptical-core spun fiber for twist sensing has been experimentally investigated and theoretically analyzed. A discrete model of the spun fiber has been built to analyze the transmission characteristics of the proposed sensor. Both experimental and simulated results indicate that the extinction ratio of the comb spectrum of the quasi-fan Solc birefringent fiber filter varies with the twist angle, and the two sets of results agree well with each other. Based on intensity modulation, the proposed twist sensor exhibits a high sensitivity of 0.02219 dB/(°/m). Moreover, thanks to the invariability of the fiber birefringence and of the state of polarization of the input light, the proposed twist sensor has very low temperature and strain sensitivity, which avoids the cross-sensitivity problem existing in most twist sensors.

  1. Analyzing ion distributions around DNA: sequence-dependence of potassium ion distributions from microsecond molecular dynamics

    PubMed Central

    Pasi, Marco; Maddocks, John H.; Lavery, Richard

    2015-01-01

    Microsecond molecular dynamics simulations of B-DNA oligomers carried out in an aqueous environment with a physiological salt concentration enable us to perform a detailed analysis of how potassium ions interact with the double helix. The oligomers studied contain all 136 distinct tetranucleotides and we are thus able to make a comprehensive analysis of base sequence effects. Using a recently developed curvilinear helicoidal coordinate method we are able to analyze the details of ion populations and densities within the major and minor grooves and in the space surrounding DNA. The results show higher ion populations than have typically been observed in earlier studies and sequence effects that go beyond the nature of individual base pairs or base pair steps. We also show that, in some special cases, ion distributions converge very slowly and, on a microsecond timescale, do not reflect the symmetry of the corresponding base sequence. PMID:25662221

  2. Traffic Flow Density Distribution Based on FEM

    NASA Astrophysics Data System (ADS)

    Ma, Jing; Cui, Jianming

    Conventional analysis of traffic flow typically uses static or dynamic fluid-mechanics models for numerical analysis. However, this approach involves extensive modeling and data handling, and its accuracy is limited. The Finite Element Method (FEM), developed from the combination of modern mathematics and computer technology, has been widely applied in engineering and other domains. Based on existing traffic flow theory, ITS, and the development of FEM, a FEM-based simulation approach is put forward to address the problems of traffic flow analysis. Based on this theory and using existing Finite Element Analysis (FEA) software, traffic flow is simulated and analyzed with fluid mechanics and dynamics. The burden of manual modeling and numerical analysis of massive data is reduced, and the fidelity of the simulation is enhanced.

  3. Microstructure formation and fracturing characteristics of grey cast iron repaired using laser.

    PubMed

    Yi, Peng; Xu, Pengyun; Fan, Changfeng; Yang, Guanghui; Liu, Dan; Shi, Yongjun

    2014-01-01

    The repairing technology based on laser rapid fusion is becoming an important tool for fixing grey cast iron equipment efficiently. A laser repairing protocol was developed using Fe-based alloy powders as the material. The microstructure and fracturing features of the repaired zone (RZ) were analyzed. The results showed that a regionally organized RZ with good density and a reliable metallurgical bond can be achieved by laser repairing. At the bottom of the RZ, dendrites grew in a similar direction and extended into the secondary RZ, so that the grains grew with inheritance, with isometric grains closer to the substrate surface. The strength of the grey cast iron base material was maintained by laser repairing, and the base material and RZ were joined with robust strength and fracture resistance. The prevention and deflection of the cracking process were analyzed using a cracking-process model, which showed that the overall crack toughness of the materials increased.

  4. Microstructure Formation and Fracturing Characteristics of Grey Cast Iron Repaired Using Laser

    PubMed Central

    Liu, Dan; Shi, Yongjun

    2014-01-01

    The repairing technology based on laser rapid fusion is becoming an important tool for fixing grey cast iron equipment efficiently. A laser repairing protocol was developed using Fe-based alloy powders as the material. The microstructure and fracturing features of the repaired zone (RZ) were analyzed. The results showed that a regionally organized RZ with good density and a reliable metallurgical bond can be achieved by laser repairing. At the bottom of the RZ, dendrites grew in a similar direction and extended into the secondary RZ, so that the grains grew with inheritance, with isometric grains closer to the substrate surface. The strength of the grey cast iron base material was maintained by laser repairing, and the base material and RZ were joined with robust strength and fracture resistance. The prevention and deflection of the cracking process were analyzed using a cracking-process model, which showed that the overall crack toughness of the materials increased. PMID:25032230

  5. Access point selection game with mobile users using correlated equilibrium.

    PubMed

    Sohn, Insoo

    2015-01-01

    One of the most important issues in wireless local area network (WLAN) systems with multiple access points (APs) is the AP selection problem. Game theory is a mathematical tool used to analyze the interactions in multiplayer systems and has been applied to various problems in wireless networks. Correlated equilibrium (CE) is one of the powerful game theory solution concepts, which is more general than the Nash equilibrium for analyzing the interactions in multiplayer mixed strategy games. A game-theoretic formulation of the AP selection problem with mobile users is presented using a novel scheme based on a regret-based learning procedure. Through convergence analysis, we show that the joint actions based on the proposed algorithm achieve CE. Simulation results illustrate that the proposed algorithm is effective in a realistic WLAN environment with user mobility and achieves maximum system throughput based on the game-theoretic formulation.
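
    As an illustration of the regret-based learning idea, the following is a minimal sketch of the Hart–Mas-Colell regret-matching rule applied to access-point selection; the payoff model (nominal rate divided by a random load) and the AP names are made up, and the sketch omits the paper's exact procedure and the usual inertia parameter.

      """Simplified regret-matching sketch for AP selection (hypothetical payoffs)."""
      import random

      APS = ["AP1", "AP2", "AP3"]                            # hypothetical access points
      BASE_RATE = {"AP1": 54.0, "AP2": 36.0, "AP3": 48.0}    # assumed nominal rates (Mbps)

      def payoff(ap):
          """Hypothetical throughput: nominal rate divided by a random load level."""
          load = random.randint(1, 5)
          return BASE_RATE[ap] / load

      def choose(regret, last_action):
          """Switch to an alternative with probability proportional to its positive regret."""
          pos = {a: max(regret[(last_action, a)], 0.0) for a in APS if a != last_action}
          total = sum(pos.values())
          if total <= 0:
              return last_action
          r = random.uniform(0, total)
          acc = 0.0
          for a, w in pos.items():
              acc += w
              if r <= acc:
                  return a
          return last_action

      def run(rounds=5000):
          regret = {(a, b): 0.0 for a in APS for b in APS if a != b}
          action = random.choice(APS)
          counts = {a: 0 for a in APS}
          for _ in range(rounds):
              u = {a: payoff(a) for a in APS}      # full-information variant: all payoffs observed
              for b in APS:
                  if b != action:
                      regret[(action, b)] += u[b] - u[action]
              counts[action] += 1
              action = choose(regret, action)
          return counts

      if __name__ == "__main__":
          print(run())

    The relevant property of regret matching is that the empirical distribution of play converges to the set of correlated equilibria, which is the kind of guarantee the convergence analysis in the abstract relies on.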

  6. Access Point Selection Game with Mobile Users Using Correlated Equilibrium

    PubMed Central

    Sohn, Insoo

    2015-01-01

    One of the most important issues in wireless local area network (WLAN) systems with multiple access points (APs) is the AP selection problem. Game theory is a mathematical tool used to analyze the interactions in multiplayer systems and has been applied to various problems in wireless networks. Correlated equilibrium (CE) is one of the powerful game theory solution concepts, which is more general than the Nash equilibrium for analyzing the interactions in multiplayer mixed strategy games. A game-theoretic formulation of the AP selection problem with mobile users is presented using a novel scheme based on a regret-based learning procedure. Through convergence analysis, we show that the joint actions based on the proposed algorithm achieve CE. Simulation results illustrate that the proposed algorithm is effective in a realistic WLAN environment with user mobility and achieves maximum system throughput based on the game-theoretic formulation. PMID:25785726

  7. Electro-optic analyzer of angular momentum hyperentanglement

    PubMed Central

    Wu, Ziwen; Chen, Lixiang

    2016-01-01

    Characterizing a high-dimensional entanglement is fundamental in quantum information applications. Here, we propose a theoretical scheme to analyze and characterize the angular momentum hyperentanglement that two photons are entangled simultaneously in spin and orbital angular momentum. Based on the electro-optic sampling with a proposed hyper-entanglement analyzer and the simple matrix operation using Cramer rule, our simulations show that it is possible to retrieve effectively both the information about the degree of polarization entanglement and the spiral spectrum of high-dimensional orbital angular momentum entanglement. PMID:26911530

  8. Personal Computer (PC) Thermal Analyzer

    DTIC Science & Technology

    1990-03-01

    demonstrate the power of the PC Thermal Analyzer, it was compared with an existing thermal analysis method. Specifically, the PC Thermal Analyzer was... [figure residue: expert-system diagram showing a Knowledge Base, User Interface, and Inference Mechanisms] ...Expert... Temperature in degrees centigrade? (2) What is the total Heat Output (power dissipation) in watts?). [figure residue: board assembly layout]

  9. International Space Station Major Constituent Analyzer On-Orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Phillip M.; Cougar, Tamara; Ulrich, BettyLynn

    2017-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies in the LAB MCA are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Finally, the Node 3 MCA is being brought to an operational configuration.

  10. Towards improving phenotype representation in OWL

    PubMed Central

    2012-01-01

    Background Phenotype ontologies are used in species-specific databases for the annotation of mutagenesis experiments and to characterize human diseases. The Entity-Quality (EQ) formalism is a means to describe complex phenotypes based on one or more affected entities and a quality. EQ-based definitions have been developed for many phenotype ontologies, including the Human and Mammalian Phenotype ontologies. Methods We analyze formalizations of complex phenotype descriptions in the Web Ontology Language (OWL) that are based on the EQ model, identify several representational challenges and analyze potential solutions to address these challenges. Results In particular, we suggest a novel, role-based approach to represent relational qualities such as concentration of iron in spleen, discuss its ontological foundation in the General Formal Ontology (GFO) and evaluate its representation in OWL and the benefits it can bring to the representation of phenotype annotations. Conclusion Our analysis of OWL-based representations of phenotypes can contribute to improving consistency and expressiveness of formal phenotype descriptions. PMID:23046625

  11. Preliminary Design of ICI-based Multimedia for Reconceptualizing Electric Conceptions at Universitas Pendidikan Indonesia

    NASA Astrophysics Data System (ADS)

    Samsudin, A.; Suhandi, A.; Rusdiana, D.; Kaniawati, I.

    2016-08-01

    Interactive Conceptual Instruction (ICI) based multimedia has been developed to make electric concepts more concrete and to support meaningful learning. The initial design is a computer multimedia application that allows users to explore electric concepts in both conceptual and practical terms. Pre-service physics teachers should be provided with learning that optimizes the conceptions they hold by re-conceptualizing concepts in Basic Physics II, especially the concepts of electricity. To collect and analyze the data comprehensively, the researchers applied the ADDIE development method, which comprises analysis, design, development, implementation, and evaluation; these steps were used to describe the program from the analysis phase through the evaluation phase. Based on the data analysis, it can be concluded that ICI-based multimedia can effectively increase pre-service physics teachers' understanding of electric concepts and support the re-conceptualization of electric conceptions at Universitas Pendidikan Indonesia.

  12. Passivity/Lyapunov based controller design for trajectory tracking of flexible joint manipulators

    NASA Technical Reports Server (NTRS)

    Sicard, Pierre; Wen, John T.; Lanari, Leonardo

    1992-01-01

    A passivity and Lyapunov based approach for the control design for the trajectory tracking problem of flexible joint robots is presented. The basic structure of the proposed controller is the sum of a model-based feedforward and a model-independent feedback. Feedforward selection and solution is analyzed for a general model for flexible joints, and for more specific and practical model structures. Passivity theory is used to design a motor state-based controller in order to input-output stabilize the error system formed by the feedforward. Observability conditions for asymptotic stability are stated and verified. In order to accommodate for modeling uncertainties and to allow for the implementation of a simplified feedforward compensation, the stability of the system is analyzed in presence of approximations in the feedforward by using a Lyapunov based robustness analysis. It is shown that under certain conditions, e.g., the desired trajectory is varying slowly enough, stability is maintained for various approximations of a canonical feedforward.

  13. Efficient Acceleration of the Pair-HMMs Forward Algorithm for GATK HaplotypeCaller on Graphics Processing Units.

    PubMed

    Ren, Shanshan; Bertels, Koen; Al-Ars, Zaid

    2018-01-01

    GATK HaplotypeCaller (HC) is a popular variant caller that is widely used to identify variants in complex genomes. However, its high variant-detection accuracy comes at the cost of long execution times. In GATK HC, the pair-HMMs forward algorithm accounts for a large percentage of the total execution time. This article proposes to accelerate the pair-HMMs forward algorithm on graphics processing units (GPUs) to improve the performance of GATK HC. This article presents several GPU-based implementations of the pair-HMMs forward algorithm. It also analyzes the performance bottlenecks of the implementations on an NVIDIA Tesla K40 card with various data sets. Based on these results and the characteristics of GATK HC, we are able to identify the GPU-based implementations with the highest performance for the various analyzed data sets. Experimental results show that the GPU-based implementations of the pair-HMMs forward algorithm achieve a speedup of up to 5.47× over existing GPU-based implementations.
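
    For reference, a minimal CPU-only sketch of a pair-HMM forward recursion of the kind accelerated here is shown below. The state model is simplified and the transition/emission constants are illustrative placeholders, not GATK's quality-score-derived values.

      """Simplified pair-HMM forward recursion (illustrative constants, not GATK's)."""
      import math

      def pair_hmm_forward(read, hap, p_match=0.9, p_gap_open=0.05,
                           p_gap_ext=0.1, p_mismatch=0.01):
          n, m = len(read), len(hap)
          # M/I/D[i][j]: probability of aligning read[:i] to hap[:j], ending in a
          # match, insertion, or deletion state.
          M = [[0.0] * (m + 1) for _ in range(n + 1)]
          I = [[0.0] * (m + 1) for _ in range(n + 1)]
          D = [[0.0] * (m + 1) for _ in range(n + 1)]
          for j in range(m + 1):
              D[0][j] = 1.0 / m          # read may start anywhere along the haplotype
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  emit = (1 - p_mismatch) if read[i - 1] == hap[j - 1] else p_mismatch
                  M[i][j] = emit * (p_match * M[i - 1][j - 1]
                                    + (1 - p_gap_ext) * (I[i - 1][j - 1] + D[i - 1][j - 1]))
                  I[i][j] = p_gap_open * M[i - 1][j] + p_gap_ext * I[i - 1][j]
                  D[i][j] = p_gap_open * M[i][j - 1] + p_gap_ext * D[i][j - 1]
          # Sum over all positions where the read can end on the haplotype.
          return sum(M[n][j] + I[n][j] for j in range(1, m + 1))

      print(math.log10(pair_hmm_forward("ACGT", "ACGTACGT")))

    Such a recursion is evaluated once per read–haplotype pair, which is why GPU implementations typically parallelize either across many pairs or along the anti-diagonals of the dynamic-programming matrices.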

  14. Processing and characterization of bio-based composites

    NASA Astrophysics Data System (ADS)

    Lu, Hong

    Much research has focused on bio-based composites as a potential material to replace petroleum-based plastics. Considering the high price of Polyhydroxyalkanoates (PHAs), PHA/ Distiller's Dried Grains with Solubles (DDGS) composite is a promising economical and high-performance biodegradable material. In this paper, we discuss the effect of DDGS on PHA composites in balancing cost with material performance. Poly (lactic acid) PLA/DDGS composite is another excellent biodegradable composite, although as a bio-based polymer its degradation time is relatively long. The goal of this research is therefore to accelerate the degradation process for this material. Both bio-based composites were extruded through a twin-screw microcompounder, and the two materials were uniformly mixed. The morphology of the samples was examined using a Scanning Electron Microscope (SEM); thermal stability was determined with a Thermal Gravimetric Analyzer (TGA); other thermal properties were studied using Differential Scanning Calorimetry (DSC) and a Dynamic Mechanical Analyzer (DMA). Viscoelastic properties were also evaluated using a Rheometer.

  15. Using Statistics and Data Mining Approaches to Analyze Male Sexual Behaviors and Use of Erectile Dysfunction Drugs Based on Large Questionnaire Data.

    PubMed

    Qiao, Zhi; Li, Xiang; Liu, Haifeng; Zhang, Lei; Cao, Junyang; Xie, Guotong; Qin, Nan; Jiang, Hui; Lin, Haocheng

    2017-01-01

    The prevalence of erectile dysfunction (ED) has been extensively studied worldwide. Erectile dysfunction drugs have shown great efficacy in treating male erectile dysfunction. To help doctors understand patients' drug-taking preferences and prescribe more appropriately, it is crucial to analyze who actually takes erectile dysfunction drugs and the relation between sexual behaviors and drug use. Existing clinical studies have usually relied on descriptive statistics and regression analysis based on small volumes of data. In this paper, based on a large volume of data (48,630 questionnaires), we use data mining approaches in addition to statistics and regression analysis to comprehensively analyze the relation between male sexual behaviors and the use of erectile dysfunction drugs, in order to characterize the patients who take them. We first analyze the impact of multiple sexual behavior factors on whether the drugs are used. We then mine decision rules for stratification to discover patients who are more likely to take the drugs. Based on the decision rules, patients can be partitioned into four potential groups for use of erectile dysfunction drugs: a high potential group, intermediate potential-1 group, intermediate potential-2 group, and low potential group. Experimental results show that 1) the sexual behavior factors of erectile hardness and preparation time (how long the patient prepares for sexual activity ahead of time) have the largest impacts, both in the correlation analysis and in discovering potential drug-taking patients; and 2) the odds ratio between patients identified as low potential and high potential was 6.098 (95% confidence interval, 5.159-7.209), with statistically significant differences in drug-taking potential detected between all potential groups.
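
    For reference, an odds ratio and its Wald-type 95% confidence interval of the kind reported above are conventionally computed from a 2x2 table as in the following sketch; the counts are invented for illustration and do not reproduce the study's data.

      """Odds ratio with Wald 95% CI for a 2x2 table (hypothetical counts)."""
      import math

      def odds_ratio_ci(a, b, c, d, z=1.96):
          """a, b = users / non-users in group 1; c, d = users / non-users in group 2."""
          or_ = (a * d) / (b * c)
          se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
          lo = math.exp(math.log(or_) - z * se_log_or)
          hi = math.exp(math.log(or_) + z * se_log_or)
          return or_, (lo, hi)

      print(odds_ratio_ci(300, 700, 80, 920))   # hypothetical counts only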

  16. Feasibility of real-time geochemical analysis using LIBS (Laser-Induced Breakdown Spectroscopy) in oil wells

    NASA Astrophysics Data System (ADS)

    Shahin, Mohamed

    2014-05-01

    The oil and gas industry has attempted for many years to find new ways to analyze and determine the type of rocks drilled on a real-time basis. Mud analysis logging is a direct method of detecting oil and gas in the formations drilled; it depends on the "feel" of the bit to decide formation type. Geochemical analysis was introduced 30 years ago, starting with a pulsed-neutron generator (PNG) based wireline tool upon which LWD technology was based. In this paper, we study the feasibility of introducing a new technology for real-time geochemical analysis. Laser-induced breakdown spectroscopy (LIBS) is a type of atomic emission spectroscopy; it is a cutting-edge technology used for many applications such as determination of alloy composition, origin of manufacture (by monitoring trace components), and molecular analysis (unknown identification). LIBS can analyze any material regardless of its state (solid, liquid, or gas); on that basis, we can analyze rocks, formation fluid types, and the contacts between them. In cooperation with the National Institute of Laser Enhanced Science, Cairo University, Egypt, we tested sandstone, limestone, and coal samples acquired from different places using an Nd:YAG laser, together with the other components explained in detail in this paper, to assess the ability of the laser to analyze rock samples and provide their elemental composition using the LIBS technique. We obtained promising results from the sample analysis via LIBS and discuss the possibility of deploying this technology in oilfields, suggesting several applications and providing a basis for achieving a quantitative elemental analysis method in view of its shortcomings and solutions.

  17. Demographic-Based Content Analysis of Web-Based Health-Related Social Media

    PubMed Central

    Shahbazi, Moloud; Wiley, Matthew T; Hristidis, Vagelis

    2016-01-01

    Background An increasing number of patients from diverse demographic groups share and search for health-related information on Web-based social media. However, little is known about the content of the posted information with respect to the users’ demographics. Objective The aims of this study were to analyze the content of Web-based health-related social media based on users’ demographics to identify which health topics are discussed in which social media by which demographic groups and to help guide educational and research activities. Methods We analyze 3 different types of health-related social media: (1) general Web-based social networks Twitter and Google+; (2) drug review websites; and (3) health Web forums, with a total of about 6 million users and 20 million posts. We analyzed the content of these posts based on the demographic group of their authors, in terms of sentiment and emotion, top distinctive terms, and top medical concepts. Results The results of this study are: (1) Pregnancy is the dominant topic for female users in drug review websites and health Web forums, whereas for male users, it is cardiac problems, HIV, and back pain, but this is not the case for Twitter; (2) younger users (0-17 years) mainly talk about attention-deficit hyperactivity disorder (ADHD) and depression-related drugs, users aged 35-44 years discuss about multiple sclerosis (MS) drugs, and middle-aged users (45-64 years) talk about alcohol and smoking; (3) users from the Northeast United States talk about physical disorders, whereas users from the West United States talk about mental disorders and addictive behaviors; (4) Users with higher writing level express less anger in their posts. Conclusion We studied the popular topics and the sentiment based on users' demographics in Web-based health-related social media. Our results provide valuable information, which can help create targeted and effective educational campaigns and guide experts to reach the right users on Web-based social chatter. PMID:27296242

  18. Demographic-Based Content Analysis of Web-Based Health-Related Social Media.

    PubMed

    Sadah, Shouq A; Shahbazi, Moloud; Wiley, Matthew T; Hristidis, Vagelis

    2016-06-13

    An increasing number of patients from diverse demographic groups share and search for health-related information on Web-based social media. However, little is known about the content of the posted information with respect to the users' demographics. The aims of this study were to analyze the content of Web-based health-related social media based on users' demographics to identify which health topics are discussed in which social media by which demographic groups and to help guide educational and research activities. We analyze 3 different types of health-related social media: (1) general Web-based social networks Twitter and Google+; (2) drug review websites; and (3) health Web forums, with a total of about 6 million users and 20 million posts. We analyzed the content of these posts based on the demographic group of their authors, in terms of sentiment and emotion, top distinctive terms, and top medical concepts. The results of this study are: (1) Pregnancy is the dominant topic for female users in drug review websites and health Web forums, whereas for male users, it is cardiac problems, HIV, and back pain, but this is not the case for Twitter; (2) younger users (0-17 years) mainly talk about attention-deficit hyperactivity disorder (ADHD) and depression-related drugs, users aged 35-44 years discuss about multiple sclerosis (MS) drugs, and middle-aged users (45-64 years) talk about alcohol and smoking; (3) users from the Northeast United States talk about physical disorders, whereas users from the West United States talk about mental disorders and addictive behaviors; (4) Users with higher writing level express less anger in their posts. We studied the popular topics and the sentiment based on users' demographics in Web-based health-related social media. Our results provide valuable information, which can help create targeted and effective educational campaigns and guide experts to reach the right users on Web-based social chatter.

  19. The Orphan Disease Networks

    PubMed Central

    Zhang, Minlu; Zhu, Cheng; Jacomy, Alexis; Lu, Long J.; Jegga, Anil G.

    2011-01-01

    The low prevalence rate of orphan diseases (OD) requires special combined efforts to improve diagnosis, prevention, and discovery of novel therapeutic strategies. To identify and investigate relationships based on shared genes or shared functional features, we have conducted a bioinformatic-based global analysis of all orphan diseases with known disease-causing mutant genes. Starting with a bipartite network of known OD and OD-causing mutant genes and using the human protein interactome, we first construct and topologically analyze three networks: the orphan disease network, the orphan disease-causing mutant gene network, and the orphan disease-causing mutant gene interactome. Our results demonstrate that in contrast to the common disease-causing mutant genes that are predominantly nonessential, a majority of orphan disease-causing mutant genes are essential. In confirmation of this finding, we found that OD-causing mutant genes are topologically important in the protein interactome and are ubiquitously expressed. Additionally, functional enrichment analysis of those genes in which mutations cause ODs shows that a majority result in premature death or are lethal in the orthologous mouse gene knockout models. To address the limitations of traditional gene-based disease networks, we also construct and analyze OD networks on the basis of shared enriched features (biological processes, cellular components, pathways, phenotypes, and literature citations). Analyzing these functionally-linked OD networks, we identified several additional OD-OD relations that are both phenotypically similar and phenotypically diverse. Surprisingly, we observed that the wiring of the gene-based and other feature-based OD networks are largely different; this suggests that the relationship between ODs cannot be fully captured by the gene-based network alone. PMID:21664998

  20. A cell-based study on pedestrian acceleration and overtaking in a transfer station corridor

    NASA Astrophysics Data System (ADS)

    Ji, Xiangfeng; Zhou, Xuemei; Ran, Bin

    2013-04-01

    Pedestrian speed in a transfer station corridor is faster than usual, and some pedestrians can even be found running. In this paper, pedestrians are divided into two categories: aggressive and conservative. Aggressive pedestrians weaving their way through the crowd in the corridor are the study object of this paper. During recent decades, much attention has been paid to pedestrian behaviors such as overtaking (also deceleration) and collision avoidance, and that work continues in this paper. After analyzing the characteristics of pedestrian flow in a transfer station corridor, a cell-based model is presented, including acceleration (also deceleration) and overtaking analysis. Acceleration (also deceleration) in the corridor is fixed according to Newton's law, and the speed calculated with a kinematic formula is then discretized into cells based on fuzzy logic. After the speed is updated, overtaking is analyzed explicitly based on the updated speed and force, in contrast to rule-based models, which herein we call implicit ones. During the analysis of overtaking, a threshold value for determining the overtaking direction is introduced. The model in this paper is thus a two-step one: the first step updates the speed, that is, the number of cells the pedestrian can move in one time interval, and the second analyzes overtaking. Finally, a comparison among the rule-based cellular automata, the model in this paper, and data in HCM 2000 demonstrates that our model can achieve a reasonable simulation of acceleration (also deceleration) and overtaking among pedestrians.

  1. Music Retrieval Based on the Relation between Color Association and Lyrics

    NASA Astrophysics Data System (ADS)

    Nakamura, Tetsuaki; Utsumi, Akira; Sakamoto, Maki

    Various methods for music retrieval have been proposed. Recently, many researchers have been developing methods based on the relationship between music and feelings. In our previous psychological study, we found a significant correlation between colors evoked by songs and colors evoked by lyrics alone, and showed that a music retrieval system using lyrics could be developed. In this paper, we focus on the relationship among music, lyrics, and colors, and propose a music retrieval method that uses colors as queries and analyzes lyrics. The method estimates the colors evoked by songs by analyzing the lyrics of the songs. In the first step of our method, words associated with colors are extracted from the lyrics. We consider two ways to extract such words: in one, the words are extracted based on the result of a psychological experiment; in the other, words from corpora for Latent Semantic Analysis are extracted in addition to those based on the psychological experiment. In the second step, the colors evoked by the extracted words are compounded, and the compounded colors are regarded as those evoked by the song. In the last step, the query colors are compared with the colors estimated from the lyrics, and a list of songs is presented based on the similarities. We evaluated the two methods described above and found that the method based on both the psychological experiment and the corpora performed better than the method based only on the psychological experiment. As a result, we showed that the method using colors as queries and analyzing lyrics is effective for music retrieval.

  2. Functional specializations in human cerebral cortex analyzed using the Visible Man surface-based atlas

    NASA Technical Reports Server (NTRS)

    Drury, H. A.; Van Essen, D. C.

    1997-01-01

    We used surface-based representations to analyze functional specializations in the human cerebral cortex. A computerized reconstruction of the cortical surface of the Visible Man digital atlas was generated and transformed to the Talairach coordinate system. This surface was also flattened and used to establish a surface-based coordinate system that respects the topology of the cortical sheet. The linkage between two-dimensional and three-dimensional representations allows the locations of published neuroimaging activation foci to be stereotaxically projected onto the Visible Man cortical flat map. An analysis of two activation studies related to the hearing and reading of music and of words illustrates how this approach permits the systematic estimation of the degree of functional segregation and of potential functional overlap for different aspects of sensory processing.

  3. Review on recent research progress on laser power measurement based on light pressure

    NASA Astrophysics Data System (ADS)

    Lai, WenChang; Zhou, Pu

    2018-03-01

    Accurately measuring laser power is one of the most important issues in evaluating the performance of high-power lasers. For the time being, most demonstrated techniques can be classified as direct measurement. Indirect measurement of laser power based on light pressure, which has been under intensive investigation, has advantages such as fast response, real-time measurement, and high accuracy compared with direct measurement. In this paper, we review several non-traditional methods based on light pressure for precisely measuring laser power that have been proposed recently. The system setup, measuring principle, and scaling methods are introduced and analyzed in detail. We also compare the benefits and drawbacks of these methods and analyze the uncertainties of the measurements.
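
    For context (this is standard physics, not specific to any of the reviewed setups), light-pressure power measurement rests on the radiation-pressure relation for a beam of power P at normal incidence on a mirror of reflectivity R:

      F = \frac{(1+R)\,P}{c}, \qquad P = \frac{cF}{1+R} \approx \frac{cF}{2} \ \text{for } R \approx 1,

    where c is the speed of light and F is the force on the mirror; measuring F therefore yields P once R is known.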

  4. Ground-based measurement of column-averaged mixing ratios of methane and carbon dioxide in the Sichuan Basin of China by a desktop optical spectrum analyzer

    NASA Astrophysics Data System (ADS)

    Qin, Xiu-Chun; Nakayama, Tomoki; Matsumi, Yutaka; Kawasaki, Masahiro; Ono, Akiko; Hayashida, Sachiko; Imasu, Ryoichi; Lei, Li-Ping; Murata, Isao; Kuroki, Takahiro; Ohashi, Masafumi

    2018-01-01

    Remote sensing of the atmospheric greenhouse gases, methane (CH4) and carbon dioxide (CO2), contributes to the understanding of global warming and climate change. A portable ground-based instrument consisting of a commercially available desktop optical spectrum analyzer and a small sun tracker has been applied to measure the column densities of atmospheric CH4 and CO2 at Yanting observation station in a mountainous paddy field of the Sichuan Basin from September to November 2013. The column-averaged dry-air molar mixing ratios, XCH4/XCO2, are compared with those retrieved by satellite observations in the Sichuan Basin and by ground-based network observations in the same latitude zone as the Yanting observation station.

  5. Stable multi-domain spectral penalty methods for fractional partial differential equations

    NASA Astrophysics Data System (ADS)

    Xu, Qinwu; Hesthaven, Jan S.

    2014-01-01

    We propose stable multi-domain spectral penalty methods suitable for solving fractional partial differential equations with fractional derivatives of any order. First, a high order discretization is proposed to approximate fractional derivatives of any order on any given grids based on orthogonal polynomials. The approximation order is analyzed and verified through numerical examples. Based on the discrete fractional derivative, we introduce stable multi-domain spectral penalty methods for solving fractional advection and diffusion equations. The equations are discretized in each sub-domain separately and the global schemes are obtained by weakly imposed boundary and interface conditions through a penalty term. Stability of the schemes is analyzed and numerical examples based on both uniform and nonuniform grids are considered to highlight the flexibility and high accuracy of the proposed schemes.
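
    For background (the abstract does not state which definition of the fractional derivative the authors adopt), one commonly used form is the Caputo derivative of order \alpha with n-1 < \alpha \le n:

      {}^{C}D_t^{\alpha} f(t) = \frac{1}{\Gamma(n-\alpha)} \int_0^t \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha+1-n}} \, d\tau,

    which reduces to the ordinary n-th derivative as \alpha \to n.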

  6. Analyzing the causation of a railway accident based on a complex network

    NASA Astrophysics Data System (ADS)

    Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin

    2014-02-01

    In this paper, a new model is constructed for the causation analysis of railway accidents based on complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing complex network theory, especially its statistical indicators, a railway accident as well as its key causes can be analyzed from an overall perspective. As a case study, the "7.23" China—Yongwen railway accident is analyzed with this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model gives a theoretical clue for railway accident prediction and, hence, can help reduce the occurrence of railway accidents.
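
    As a sketch of how such statistical indicators are computed on a causal network, the following example uses the networkx library; the causal factors and edges are hypothetical stand-ins, not the factors identified for the 7.23 accident.

      """Centrality indicators on a toy accident-causation network (hypothetical factors)."""
      import networkx as nx

      # Directed graph: an edge u -> v means factor u contributed to factor v.
      G = nx.DiGraph()
      G.add_edges_from([
          ("lightning strike", "signal failure"),
          ("insufficient inspection", "signal failure"),
          ("signal failure", "dispatch error"),
          ("line condition not checked", "dispatch error"),
          ("dispatch error", "collision"),
      ])

      degree = nx.degree_centrality(G)            # how connected a factor is
      betweenness = nx.betweenness_centrality(G)  # how often a factor lies on causal chains

      for node in G.nodes:
          print(f"{node:28s} degree={degree[node]:.2f} betweenness={betweenness[node]:.2f}")

    Factors with high betweenness sit on many causal chains, which is one way a model of this kind flags key causes such as inspection and line-condition checks.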

  7. Research on TCP/IP network communication based on Node.js

    NASA Astrophysics Data System (ADS)

    Huang, Jing; Cai, Lixiong

    2018-04-01

    In the face of big data, long-lived connections, and high concurrency, TCP/IP network communication can become a performance bottleneck because of the blocking, multi-threaded service model. This paper presents an approach to TCP/IP network communication based on Node.js. After analyzing the characteristics of the Node.js architecture and its asynchronous, non-blocking I/O model, the source of its efficiency is discussed; the TCP/IP network communication model is then compared and analyzed to explain why the TCP/IP protocol stack is widely used in network communication. Finally, to handle the large data volumes and high concurrency of a large-scale grape-growing environment monitoring process, a TCP server design based on Node.js is completed. The results show that the example runs stably and efficiently.

  8. Big data mining analysis method based on cloud computing

    NASA Astrophysics Data System (ADS)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, the very large scale, discrete, and non-/semi-structured nature of big data has gone far beyond what traditional data management approaches can handle. With the arrival of the cloud computing era, cloud computing provides a new technical approach to massive data mining, which can effectively solve the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
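
    As a sketch of the map/reduce decomposition such an association-rule algorithm relies on, the following single-machine Python example counts candidate 2-itemsets with explicit map and reduce phases; the transactions and support threshold are invented for illustration and the sketch does not reproduce the paper's algorithm.

      """Single-machine illustration of a MapReduce-style candidate-itemset count."""
      from collections import Counter
      from itertools import combinations

      transactions = [
          {"milk", "bread", "butter"},
          {"milk", "bread"},
          {"bread", "butter"},
          {"milk", "butter"},
      ]

      def map_phase(transaction):
          """Emit (itemset, 1) pairs for every candidate 2-itemset in one transaction."""
          return [(frozenset(pair), 1) for pair in combinations(sorted(transaction), 2)]

      def reduce_phase(mapped):
          """Sum the counts per candidate itemset, as a MapReduce reducer would."""
          counts = Counter()
          for itemset, one in mapped:
              counts[itemset] += one
          return counts

      mapped = [pair for t in transactions for pair in map_phase(t)]
      counts = reduce_phase(mapped)

      min_support = 2
      frequent = {tuple(sorted(k)): v for k, v in counts.items() if v >= min_support}
      print(frequent)

    In an actual MapReduce deployment, the map phase runs on distributed splits of the transaction data and the shuffle stage groups identical itemset keys before the reducers sum them.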

  9. Combating QR-Code-Based Compromised Accounts in Mobile Social Networks.

    PubMed

    Guo, Dong; Cao, Jian; Wang, Xiaoqi; Fu, Qiang; Li, Qiang

    2016-09-20

    Cyber Physical Social Sensing makes mobile social networks (MSNs) popular with users. However, attacks in which malicious URLs are spread covertly through quick response (QR) codes to control compromised accounts in MSNs and propagate malicious messages are rampant. Currently, there are generally two types of methods to identify compromised accounts in MSNs: one is to analyze the potential threats on wireless access points and on handheld devices' operating systems so as to stop compromised accounts from spreading malicious messages; the other is to apply methods for detecting compromised accounts in online social networks to MSNs. These types of methods focus neither on the problems of MSNs themselves nor on the interaction of sensors' messages, which leads to the restrictiveness of platforms and the simplification of methods. In order to stop the spreading of compromised accounts in MSNs effectively, the attacks have to be traced to their sources first. Through sensors, users exchange information in MSNs and acquire information by scanning QR codes. Therefore, analyzing the traces of sensor-related information helps to identify the compromised accounts in MSNs. This paper analyzes the diversity of the information-sending modes of compromised and normal accounts, analyzes the regularity of GPS (Global Positioning System)-based location information, and introduces the concepts of entropy and conditional entropy so as to construct an entropy-based model based on machine learning strategies. To achieve this goal, about 500,000 Sina Weibo accounts and about 100 million corresponding messages were collected. Through validation, the accuracy of the model is shown to be as high as 87.6%, with a false positive rate of only 3.7%. Meanwhile, comparative experiments on the feature sets prove that sensor-based location information can be applied to detect compromised accounts in MSNs.
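
    To make the entropy-based features concrete, here is a minimal sketch of entropy and conditional entropy computed over per-post observations; the "sending mode" and "location cell" samples are hypothetical, not drawn from the Sina Weibo data used in the paper.

      """Entropy and conditional entropy over toy per-post observations."""
      import math
      from collections import Counter

      def entropy(samples):
          n = len(samples)
          return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

      def conditional_entropy(pairs):
          """H(Y|X) for a list of (x, y) observations."""
          n = len(pairs)
          by_x = Counter(x for x, _ in pairs)
          h = 0.0
          for x, nx_count in by_x.items():
              ys = [y for xi, y in pairs if xi == x]
              h += (nx_count / n) * entropy(ys)
          return h

      # Hypothetical per-post observations for one account:
      sending_modes = ["web", "api", "api", "api", "web", "api"]
      locations     = ["cell_3", "cell_3", "cell_7", "cell_3", "cell_3", "cell_7"]

      print("H(mode)          =", round(entropy(sending_modes), 3))
      print("H(location|mode) =", round(conditional_entropy(list(zip(sending_modes, locations))), 3))

    Intuitively, accounts driven by automated scripts tend to show unusually low or unusually regular entropy in sending mode and location, which is the kind of signal such features feed into a classifier.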

  10. Combating QR-Code-Based Compromised Accounts in Mobile Social Networks

    PubMed Central

    Guo, Dong; Cao, Jian; Wang, Xiaoqi; Fu, Qiang; Li, Qiang

    2016-01-01

    Cyber Physical Social Sensing makes mobile social networks (MSNs) popular with users. However, attacks in which malicious URLs are spread covertly through quick response (QR) codes to control compromised accounts in MSNs and propagate malicious messages are rampant. Currently, there are generally two types of methods to identify compromised accounts in MSNs: one is to analyze the potential threats on wireless access points and on handheld devices' operating systems so as to stop compromised accounts from spreading malicious messages; the other is to apply methods for detecting compromised accounts in online social networks to MSNs. These types of methods focus neither on the problems of MSNs themselves nor on the interaction of sensors' messages, which leads to the restrictiveness of platforms and the simplification of methods. In order to stop the spreading of compromised accounts in MSNs effectively, the attacks have to be traced to their sources first. Through sensors, users exchange information in MSNs and acquire information by scanning QR codes. Therefore, analyzing the traces of sensor-related information helps to identify the compromised accounts in MSNs. This paper analyzes the diversity of the information-sending modes of compromised and normal accounts, analyzes the regularity of GPS (Global Positioning System)-based location information, and introduces the concepts of entropy and conditional entropy so as to construct an entropy-based model based on machine learning strategies. To achieve this goal, about 500,000 Sina Weibo accounts and about 100 million corresponding messages were collected. Through validation, the accuracy of the model is shown to be as high as 87.6%, with a false positive rate of only 3.7%. Meanwhile, comparative experiments on the feature sets prove that sensor-based location information can be applied to detect compromised accounts in MSNs. PMID:27657071

  11. Teacher Implementation of Reform-Based Mathematics and Implications for Algebra Readiness: A Qualitative Study of 4th Grade Classrooms

    ERIC Educational Resources Information Center

    Sher, Stephen Korb

    2011-01-01

    This study looked at 4th grade classrooms to see "how" teachers implement NCTM standards-based or reform-based mathematics instruction and then analyzed it for the capacity to improve students' "algebra readiness." The qualitative study was based on classroom observations, teacher and administrator interviews, and teacher surveys. The study took…

  12. Assessment of Web-Based Authentication Methods in the U.S.: Comparing E-Learning Systems to Internet Healthcare Information Systems

    ERIC Educational Resources Information Center

    Mattord, Herbert J.

    2012-01-01

    Organizations continue to rely on password-based authentication methods to control access to many Web-based systems. This research study developed a benchmarking instrument intended to assess authentication methods used in Web-based information systems (IS). It developed an Authentication Method System Index (AMSI) to analyze collected data from…

  13. Detecting and Analyzing Cybercrime in Text-Based Communication of Cybercriminal Networks through Computational Linguistic and Psycholinguistic Feature Modeling

    ERIC Educational Resources Information Center

    Mbaziira, Alex Vincent

    2017-01-01

    Cybercriminals are increasingly using Internet-based text messaging applications to exploit their victims. Incidents of deceptive cybercrime in text-based communication are increasing and include fraud, scams, as well as favorable and unfavorable fake reviews. In this work, we use a text-based deception detection approach to train models for…

  14. Analyzing Reliability and Performance Trade-Offs of HLS-Based Designs in SRAM-Based FPGAs Under Soft Errors

    NASA Astrophysics Data System (ADS)

    Tambara, Lucas Antunes; Tonfat, Jorge; Santos, André; Kastensmidt, Fernanda Lima; Medina, Nilberto H.; Added, Nemitala; Aguiar, Vitor A. P.; Aguirre, Fernando; Silveira, Marcilei A. G.

    2017-02-01

    The increasing system complexity of FPGA-based hardware designs and shortening of time-to-market have motivated the adoption of new designing methodologies focused on addressing the current need for high-performance circuits. High-Level Synthesis (HLS) tools can generate Register Transfer Level (RTL) designs from high-level software programming languages. These tools have evolved significantly in recent years, providing optimized RTL designs, which can serve the needs of safety-critical applications that require both high performance and high reliability levels. However, a reliability evaluation of HLS-based designs under soft errors has not yet been presented. In this work, the trade-offs of different HLS-based designs in terms of reliability, resource utilization, and performance are investigated by analyzing their behavior under soft errors and comparing them to a standard processor-based implementation in an SRAM-based FPGA. Results obtained from fault injection campaigns and radiation experiments show that it is possible to increase the performance of a processor-based system up to 5,000 times by changing its architecture with a small impact in the cross section (increasing up to 8 times), and still increasing the Mean Workload Between Failures (MWBF) of the system.

  15. Acoustic-wave sensor apparatus for analyzing a petroleum-based composition and sensing solidification of constituents therein

    DOEpatents

    Spates, J.J.; Martin, S.J.; Mansure, A.J.

    1997-08-26

    An acoustic-wave sensor apparatus and method are disclosed. The apparatus for analyzing a normally liquid petroleum-based composition includes at least one acoustic-wave device in contact with the petroleum-based composition for sensing or detecting the presence of constituents (e.g. paraffins or petroleum waxes) therein which solidify upon cooling of the petroleum-based composition below a cloud-point temperature. The acoustic-wave device can be a thickness-shear-mode device (also termed a quartz crystal microbalance), a surface-acoustic-wave device, an acoustic-plate-mode device or a flexural plate-wave device. Embodiments of the present invention can be used for measuring a cloud point, a pour point and/or a freeze point of the petroleum-based composition, and for determining a temperature characteristic of each point. Furthermore, measurements with the acoustic-wave sensor apparatus can be made off-line by using a sample having a particular petroleum-based composition; or in-situ with the petroleum-based composition contained within a pipeline or storage tank. The acoustic-wave sensor apparatus has uses in many different petroleum technology areas, including the recovery, transport, storage, refining and use of petroleum and petroleum-based products. 7 figs.

  16. Acoustic-wave sensor apparatus for analyzing a petroleum-based composition and sensing solidification of constituents therein

    DOEpatents

    Spates, James J.; Martin, Stephen J.; Mansure, Arthur J.

    1997-01-01

    An acoustic-wave sensor apparatus and method. The apparatus for analyzing a normally liquid petroleum-based composition includes at least one acoustic-wave device in contact with the petroleum-based composition for sensing or detecting the presence of constituents (e.g. paraffins or petroleum waxes) therein which solidify upon cooling of the petroleum-based composition below a cloud-point temperature. The acoustic-wave device can be a thickness-shear-mode device (also termed a quartz crystal microbalance), a surface-acoustic-wave device, an acoustic-plate-mode device or a flexural plate-wave device. Embodiments of the present invention can be used for measuring a cloud point, a pour point and/or a freeze point of the petroleum-based composition, and for determining a temperature characteristic of each point. Furthermore, measurements with the acoustic-wave sensor apparatus can be made off-line by using a sample having a particular petroleum-based composition; or in-situ with the petroleum-based composition contained within a pipeline or storage tank. The acoustic-wave sensor apparatus has uses in many different petroleum technology areas, including the recovery, transport, storage, refining and use of petroleum and petroleum-based products.

  17. Moon-Based INSAR Geolocation and Baseline Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Guang; Ren, Yuanzhen; Ye, Hanlin; Guo, Huadong; Ding, Yixing; Ruan, Zhixing; Lv, Mingyang; Dou, Changyong; Chen, Zhaoning

    2016-07-01

    The Earth observation platform hosts the sensors, and the characteristics of the platform to a large extent determine the ability to observe the Earth. Currently, most platforms under development are satellites; by contrast, carrying out systematic observations from a Moon-based Earth observation platform is still a new concept. The Moon is Earth's only natural satellite and the only one humans have reached, and it offers a different perspective for observing the Earth with sensors placed on it. Moon-based InSAR (SAR interferometry), an important Earth observation technology, has all-day, all-weather observation capability, but its unique characteristics still need to be analyzed. This article discusses the key issues of geometric positioning and the baseline parameters of Moon-based InSAR. Based on ephemeris data, the position, libration, and attitude of the Earth and the Moon are obtained, and the position of the Moon-based SAR sensor is obtained by coordinate transformation from the fixed selenocentric coordinate system to the terrestrial coordinate system; together with the distance-Doppler (range-Doppler) equation, the positioning model is analyzed. After the Moon-based InSAR baseline equation is established, the different baseline errors are analyzed, and the influence of the Moon-based InSAR baseline on Earth observation applications is obtained.
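
    For context, the distance-Doppler (range-Doppler) positioning model mentioned above is conventionally written, in its generic SAR geolocation form (not specific to the Moon-based geometry, and with sign conventions varying between authors), as:

      R = \left| \mathbf{R}_s - \mathbf{R}_t \right|, \qquad
      f_D = -\frac{2}{\lambda} \, \frac{(\mathbf{R}_s - \mathbf{R}_t)\cdot(\mathbf{V}_s - \mathbf{V}_t)}{\left| \mathbf{R}_s - \mathbf{R}_t \right|},

    where \mathbf{R}_s, \mathbf{V}_s are the sensor position and velocity, \mathbf{R}_t, \mathbf{V}_t the target position and velocity, R the slant range, f_D the Doppler frequency, and \lambda the radar wavelength; combined with an Earth-surface model, these equations locate the imaged target.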

  18. FISH Finder: a high-throughput tool for analyzing FISH images

    PubMed Central

    Shirley, James W.; Ty, Sereyvathana; Takebayashi, Shin-ichiro; Liu, Xiuwen; Gilbert, David M.

    2011-01-01

    Motivation: Fluorescence in situ hybridization (FISH) is used to study the organization and the positioning of specific DNA sequences within the cell nucleus. Analyzing the data from FISH images is a tedious process that invokes an element of subjectivity. Automated FISH image analysis offers savings in time as well as gaining the benefit of objective data analysis. While several FISH image analysis software tools have been developed, they often use a threshold-based segmentation algorithm for nucleus segmentation. As fluorescence signal intensities can vary significantly from experiment to experiment, from cell to cell, and within a cell, threshold-based segmentation is inflexible and often insufficient for automatic image analysis, leading to additional manual segmentation and potential subjective bias. To overcome these problems, we developed a graphical software tool called FISH Finder to automatically analyze FISH images that vary significantly. By posing the nucleus segmentation as a classification problem, compound Bayesian classifier is employed so that contextual information is utilized, resulting in reliable classification and boundary extraction. This makes it possible to analyze FISH images efficiently and objectively without adjustment of input parameters. Additionally, FISH Finder was designed to analyze the distances between differentially stained FISH probes. Availability: FISH Finder is a standalone MATLAB application and platform independent software. The program is freely available from: http://code.google.com/p/fishfinder/downloads/list Contact: gilbert@bio.fsu.edu PMID:21310746
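
    As a much-simplified illustration of Bayesian pixel classification, the sketch below assigns pixels to nucleus or background by a maximum-a-posteriori rule; FISH Finder's compound Bayesian classifier additionally exploits contextual (neighborhood) information, which this sketch omits, and the intensity statistics here are made-up training values.

      """MAP classification of pixel intensities under Gaussian class models (toy values)."""
      import math

      # Assumed class-conditional Gaussian intensity models and priors.
      CLASSES = {
          "background": {"prior": 0.7, "mean": 20.0, "std": 10.0},
          "nucleus":    {"prior": 0.3, "mean": 120.0, "std": 30.0},
      }

      def gaussian_log_pdf(x, mean, std):
          return -0.5 * math.log(2 * math.pi * std ** 2) - (x - mean) ** 2 / (2 * std ** 2)

      def classify_pixel(intensity):
          """Pick the class with the highest posterior (MAP rule)."""
          scores = {c: math.log(p["prior"]) + gaussian_log_pdf(intensity, p["mean"], p["std"])
                    for c, p in CLASSES.items()}
          return max(scores, key=scores.get)

      for value in (15, 60, 140):
          print(value, "->", classify_pixel(value))

    Because the class models are learned rather than thresholded, this style of segmentation adapts to intensity variation between experiments, which is the motivation the abstract gives for moving beyond threshold-based segmentation.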

  19. Development of a Miniaturized and Portable Methane Analyzer for Natural Gas Leak Walking Surveys

    NASA Astrophysics Data System (ADS)

    Huang, Y. W.; Leen, J. B.; Gupta, M.; Baer, D. S.

    2016-12-01

    Traditional natural gas leak walking surveys have been conducted with devices based on technologies such as the flame ionization detector (FID), IR-based spectrometers, and IR cameras. Their sensitivity is typically at the ppm level. This limited sensitivity means the device cannot pick up leaks far from it, and more time is spent surveying the area before pinpointing the leak location. A miniaturized methane analyzer has been developed to significantly improve the sensitivity of the device used in walking surveys and to detect natural gas leaks at greater distance. ABB/LGR's patented Off-Axis Integrated Cavity Output Spectroscopy (OA-ICOS) is utilized to offer rugged and highly sensitive methane detection in a portable package. The miniaturized package weighs 13.5 lb, with a 4-hour rechargeable battery inside. The precision of the analyzer for methane is 2 ppb at 1 second. The analyzer operates at 10 Hz, and its flow response time is 3 seconds from intake through a 1-meter-long sampling wand to registering on the data stream. The data can be viewed in real time on a tablet or a smartphone. The compact and simplified package of the methane analyzer allows for more efficient walking surveys. It also allows for other applications that require low power, low weight, and a portable package. We present data from walking surveys to demonstrate its ability to detect methane leaks.

  20. [Development and perspective of bio-based chemical fiber industry].

    PubMed

    Li, Zengjun

    2016-06-25

    Bio-based fibers are environmentally friendly, renewable, and readily biodegradable. Rapid development of the bio-based fiber industry is therefore an important step toward replacing petrochemical resources, developing a sustainable economy, and building a resource-saving and environmentally friendly society. This article describes the current state of the bio-based fiber industry, analyzes existing problems, indicates the trends and objectives of technological innovation in bio-based fiber materials, and offers recommendations for developing China's bio-based fiber industry.

  1. Screening tool for oropharyngeal dysphagia in stroke - Part I: evidence of validity based on the content and response processes.

    PubMed

    Almeida, Tatiana Magalhães de; Cola, Paula Cristina; Pernambuco, Leandro de Araújo; Magalhães, Hipólito Virgílio; Magnoni, Carlos Daniel; Silva, Roberta Gonçalves da

    2017-08-17

    The aim of the present study was to identify the evidence of validity based on the content and response processes of the Rastreamento de Disfagia Orofaríngea no Acidente Vascular Encefálico (RADAVE; "Screening Tool for Oropharyngeal Dysphagia in Stroke"). The criteria used to elaborate the questions were based on a literature review. A group of judges consisting of 19 different health professionals evaluated the relevance and representativeness of the questions, and the results were analyzed using the Content Validity Index. To gather evidence of validity based on the response processes, 23 health professionals administered the screening tool and analyzed the questions using a structured scale and a cognitive interview. The RADAVE was structured to be applied in two stages. The first version consisted of 18 questions in stage I and 11 questions in stage II. Eight questions in stage I and four in stage II did not reach the minimum Content Validity Index and required reformulation by the authors. The cognitive interview revealed some misconceptions. New adjustments were made, and the final version comprised 12 questions in stage I and six questions in stage II. It was thus possible to develop a screening tool for dysphagia in stroke with adequate evidence of validity based on content and response processes. The two sources of validity evidence obtained so far allowed the screening tool to be adjusted with respect to its construct. Future studies will analyze the remaining sources of validity evidence and measures of accuracy.

  2. Novel microemulsion-based gels for topical delivery of indomethacin: Formulation, physicochemical properties and in vitro drug release studies.

    PubMed

    Froelich, Anna; Osmałek, Tomasz; Snela, Agnieszka; Kunstman, Paweł; Jadach, Barbara; Olejniczak, Marta; Roszak, Grzegorz; Białas, Wojciech

    2017-12-01

    Microemulsion-based semisolid systems may be considered an interesting alternative to the traditional dosage forms applied in topical drug delivery. The mechanical properties of topical products are important both in terms of application and dosage form effectiveness. In this study we designed and evaluated novel microemulsion-based gels with indomethacin and analyzed the factors affecting their mechanical characteristics and drug release. The impact of the microemulsion composition on the extent of the isotropic region was investigated with the use of pseudoternary phase diagrams. Selected microemulsions were analyzed in terms of electrical conductivity and surface tension in order to determine the microemulsion type. The microemulsions were transformed into polymer-based gels and subjected to rheological and textural studies. Finally, indomethacin release from the analyzed gels was studied and compared to a commercially available product. The extent of the isotropic domain in the pseudoternary phase diagrams appears to depend on the polarity of the oil phase. The surface tension and conductivity, monitored as functions of water content in the microemulsion systems, revealed possible structural transformations from w/o through bicontinuous systems into o/w. The mechanical properties of the semisolid microemulsion-based systems depended on the composition of the surface-active agents and the presence of the drug. The drug release profiles observed for the investigated gels differed from those recorded for the commercially available product, most probably because of the different structures of the two systems.

  3. Characterizing shipboard bilgewater effluent before and after treatment.

    PubMed

    McLaughlin, Christine; Falatko, Debra; Danesi, Robin; Albert, Ryan

    2014-04-01

    Operational discharges from oceangoing vessels, including discharges of bilgewater, release oil into marine ecosystems that can potentially damage marine life, terrestrial life, human health, and the environment. Bilgewater is a mix of oily fluids and other pollutants from a variety of sources onboard a vessel. For larger oceangoing vessels, bilgewater that cannot be retained onboard must be treated by an oily water separator before discharge. We evaluated the effectiveness of bilgewater treatment systems by analyzing land-based type approval data, collecting and analyzing shipboard bilgewater effluent data, comparing bilgewater effluent concentrations to regulatory standards, evaluating the accuracy of shipboard oil content monitors relative to analytical results, and assessing additional pollution reduction benefits of treatment systems. Land-based type approval data were gathered for 20 treatment systems. Additionally, multiple samples of influent and effluent from operational bilgewater treatment systems onboard three vessels were collected, analyzed, and compared to the land-based type approval data. Based on the type approval data, 15 treatment systems performed below 5 ppm oil. Shipboard performance measurements verified the land-based type approval data for the three systems that were sampled. However, oil content monitor readings were more variable than the actual oil concentration measurements from effluent samples, resulting in false negatives and positives. The treatment systems sampled onboard for this study generally reduced the majority of other potentially harmful pollutants, which are not currently regulated, with the exception of some heavy metal analytes.

  4. Home care decision support using an Arden engine--merging smart home and vital signs data.

    PubMed

    Marschollek, Michael; Bott, Oliver J; Wolf, Klaus-H; Gietzelt, Matthias; Plischke, Maik; Madiesh, Moaaz; Song, Bianying; Haux, Reinhold

    2009-01-01

    The demographic change, with a rising proportion of very old people and diminishing resources, leads to an intensification of the use of telemedicine and home care concepts. To provide individualized decision support, data from different sources, e.g. vital signs sensors and home environmental sensors, need to be combined and analyzed together. Furthermore, a standardized decision support approach is necessary. The aim of our research work is to present a laboratory prototype home care architecture that integrates data from different sources and uses a decision support system based on the HL7 standard Arden Syntax for Medical Logic Modules. Data from environmental sensors connected to a home bus system are stored in a database along with data from wireless medical sensors. All data are analyzed using an Arden engine, with the medical knowledge represented in Medical Logic Modules. Multi-modal data from four different sensors in the home environment are stored in a single database and are analyzed using an HL7 standard conformant decision support system. Individualized home care decision support must be based on all data available, including context data from smart home systems and medical data from electronic health records. Our prototype implementation shows the feasibility of using an Arden engine for decision support in a home setting. Our future work will include the utilization of medical background knowledge for individualized decision support, as there is no one-size-fits-all knowledge base in medicine.
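    The following sketch is a hypothetical illustration, in plain Python rather than Arden Syntax, of the kind of logic a Medical Logic Module encodes: combining a vital-signs reading with smart-home context to decide whether to raise an alert. All field names and thresholds are made up for illustration.

```python
from dataclasses import dataclass

# Rough illustration of MLM-style logic: evoke on a new observation, combine
# medical and environmental context, and decide on an action. Not Arden Syntax.

@dataclass
class Observation:
    heart_rate: int          # beats per minute, from a wireless medical sensor
    room_temperature: float  # degrees Celsius, from the home bus system
    motion_last_hour: bool   # any motion detected by environmental sensors

def evaluate(obs: Observation) -> str:
    # "Logic" part: combine vital signs with smart-home context (thresholds invented).
    if obs.heart_rate > 120 and not obs.motion_last_hour:
        return "ALERT: tachycardia with no recent activity - notify caregiver"
    if obs.room_temperature > 30 and obs.heart_rate > 100:
        return "WARNING: elevated heart rate in a hot room - suggest hydration"
    return "OK"

print(evaluate(Observation(heart_rate=130, room_temperature=22.0, motion_last_hour=False)))
```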

  5. Visa: AN Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data

    NASA Astrophysics Data System (ADS)

    Hong, J. H.; Su, Y. T.

    2016-06-01

    With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can respectively be used for presenting analyzed results, namely, virtual layer, informative window, symbol transformation, and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion, in which the standardized metadata serve as the known facts about the selected geospatial resources, algorithms for analyzing the differences in temporality and quality of the geospatial resources were designed, and the transformation of analyzed results into visual aids is executed automatically. It successfully presents a new way of bridging the communication gap between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it failed to provide a friendly and correct working platform.

  6. The Relative Success of Recognition-Based Inference in Multichoice Decisions

    ERIC Educational Resources Information Center

    McCloy, Rachel; Beaman, C. Philip; Smith, Philip T.

    2008-01-01

    The utility of an "ecologically rational" recognition-based decision rule in multichoice decision problems is analyzed, varying the type of judgment required (greater or lesser). The maximum size and range of a counterintuitive advantage associated with recognition-based judgment (the "less-is-more effect") are identified for a range of cue…

  7. Mindfulness-Based Approaches in the Treatment of Disordered Gambling: A Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Maynard, Brandy R.; Wilson, Alyssa N.; Labuzienski, Elizabeth; Whiting, Seth W.

    2018-01-01

    Background and Aims: To examine the effects of mindfulness-based interventions on gambling behavior and symptoms, urges, and financial outcomes. Method: Systematic review and meta-analytic procedures were employed to search, select, code, and analyze studies conducted between 1980 and 2014, assessing the effects of mindfulness-based interventions…

  8. Integrated Arts-Based Teaching (IAT) Model for Brain-Based Learning

    ERIC Educational Resources Information Center

    Inocian, Reynaldo B.

    2015-01-01

    This study analyzes teaching strategies among the eight books in Principles and Methods of Teaching recommended for use in the College of Teacher Education in the Philippines. It seeks to answer the following objectives: (1) identify the most commonly used teaching strategies congruent with the integrated arts-based teaching (IAT) and (2) design…

  9. Effects of Coaching on Teachers' Use of Function-Based Interventions for Students with Severe Disabilities

    ERIC Educational Resources Information Center

    Bethune, Keri S.; Wood, Charles L.

    2013-01-01

    This study used a delayed multiple-baseline across-participants design to analyze the effects of coaching on special education teachers' implementation of function-based interventions with students with severe disabilities. This study also examined the extent to which teachers could generalize function-based interventions to different situations.…

  10. Peculiarities of Professional Training Standards Development and Implementation within Competency-Based Approach: Foreign Experience

    ERIC Educational Resources Information Center

    Desyatov, Tymofiy

    2015-01-01

    The article analyzes the development of competency-based professional training standards and their implementation into educational process in foreign countries. It determines that the main idea of competency-based approach is competency-and-active learning, which aims at complex acquirement of diverse skills and ways of practice activities via…

  11. A Study of Multimedia Annotation of Web-Based Materials

    ERIC Educational Resources Information Center

    Hwang, Wu-Yuin; Wang, Chin-Yu; Sharples, Mike

    2007-01-01

    Web-based learning has become an important way to enhance learning and teaching, offering many learning opportunities. A limitation of current Web-based learning is the restricted ability of students to personalize and annotate the learning materials. Providing personalized tools and analyzing some types of learning behavior, such as students'…

  12. Picker versus stripper harvesters on the High Plains of Texas

    USDA-ARS?s Scientific Manuscript database

    A break even analysis based on NPV was conducted to compare picker-based and stripper-based harvest systems with and without field cleaners. Under no conditions analyzed was the NPV of a stripper system without a field cleaner greater than a stripper system with a field cleaner. Break even curves re...

  13. Evaluation of modern cotton harvest systems on irrigated cotton: Economic returns

    USDA-ARS?s Scientific Manuscript database

    A breakeven analysis based on NPV was conducted to compare picker-based and stripper-based harvest systems with and without field cleaners. Under no conditions analyzed was the NPV of a stripper system without a field cleaner greater than a stripper system with a field cleaner. Breakeven curves rela...

  14. Project Based Learning in Multi-Grade Class

    ERIC Educational Resources Information Center

    Ciftci, Sabahattin; Baykan, Ayse Aysun

    2013-01-01

    The purpose of this study is to evaluate project based learning in multi-grade classes. This study, based on a student-centered learning approach, aims to analyze students' and parents' interpretations. The study was done in a primary village school belonging to the Centre of Batman, which had already adopted multi-grade classes in its education system,…

  15. Let's Teach Unskilled Readers like Skilled Readers: A Closer Look at Meaning-Based Instruction.

    ERIC Educational Resources Information Center

    Dowhower, Sarah L.; Speidel, Gisela E.

    1989-01-01

    Analyzes transcripts of four reading lessons based on the Kamehameha Reading Program (emphasizing discussion and oral language within group reading lessons) given to three low-ability second grade readers. Identifies six components important to reading success, including contextual-based lessons, minimal skills instruction, and active quests for…

  16. The evolution processes of DNA sequences, languages and carols

    NASA Astrophysics Data System (ADS)

    Hauck, Jürgen; Henkel, Dorothea; Mika, Klaus

    2001-04-01

    The sequences of bases A, T, C and G of about 100 enolase, secA and cytochrome DNA sequences were analyzed for attractive or repulsive interactions using the numbers T1, T2, T3 of nearest, next-nearest and third-neighbor bases of the same kind and the concentration r = other bases/analyzed base. The area of possible (T1, T2) values is limited by the linear borders T2 = 2T1 - 2, T2 = 0 or T1 = 0 for clustering, attractive or repulsive interactions, and by the border T2 = -2T1 + 2(2 - r) for a variation from repulsive to attractive interactions at r ⩽ 2. Clustering is preferred by most bases in sequences of enolases and secA's. Major deviations with repulsive interactions of some bases are observed for archaebacteria in secA and for highly developed animals and the human species in enolase sequences. The borders of the structure map for enthalpy-stabilized structures with maximum interactions are approached in a few cases. Most letters of the natural languages and some music notes lie at the borders of the structure map.
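    One plausible reading of the quantities above is sketched below: for a chosen base, T1, T2 and T3 count same-base pairs at separations of one, two and three positions, and r is the ratio of all other bases to the analyzed base. The original paper may define or normalize these counts differently, so treat this as illustrative.

```python
from collections import Counter

def neighbor_statistics(seq: str, base: str):
    """Count same-base pairs at separations 1, 2, 3 (T1, T2, T3) and the
    concentration r = (other bases) / (analyzed base).

    A sketch of one plausible reading of the quantities described above.
    """
    counts = Counter(seq)
    n_base = counts[base]
    r = (len(seq) - n_base) / n_base if n_base else float("inf")
    t = []
    for d in (1, 2, 3):
        t.append(sum(1 for i in range(len(seq) - d)
                     if seq[i] == base and seq[i + d] == base))
    return t[0], t[1], t[2], r

print(neighbor_statistics("ATCGAATTAAGCAATA", "A"))
```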

  17. Catalytic conversion of syngas to mixed alcohols over Zn-Mn promoted Cu-Fe based catalyst

    DOE PAGES

    Lu, Yongwu; Yu, Fei; Hu, Jin; ...

    2012-04-12

    A Zn-Mn promoted Cu-Fe based catalyst was synthesized by the co-precipitation method. Mixed alcohol synthesis from syngas was studied in a half-inch tubular reactor system after the catalyst was reduced. The Zn-Mn promoted Cu-Fe based catalyst was characterized by SEM-EDS, TEM, XRD, and XPS. The liquid phase products (alcohol phase and hydrocarbon phase) were analyzed by GC-MS and the gas phase products were analyzed by GC. The results showed that the Zn-Mn promoted Cu-Fe based catalyst had high catalytic activity and high alcohol selectivity. The maximal CO conversion rate was 72%, and the yields of alcohols and hydrocarbons were also very high. Cu (111) was the active site for mixed alcohol synthesis, and Fe2C (101) was the active site for olefin and paraffin synthesis. A reaction mechanism for mixed alcohol synthesis from syngas over the Zn-Mn promoted Cu-Fe based catalyst was proposed. The Zn-Mn promoted Cu-Fe based catalyst can thus be regarded as a potential candidate for catalytic conversion of biomass-derived syngas to mixed alcohols.

  18. A Cost-Effective Geodetic Strainmeter Based on Dual Coaxial Cable Bragg Gratings

    PubMed Central

    Fu, Jihua; Wang, Xu; Wei, Tao; Wei, Meng; Shen, Yang

    2017-01-01

    Observations of surface deformation are essential for understanding a wide range of geophysical problems, including earthquakes, volcanoes, landslides, and glaciers. Current geodetic technologies, such as the global positioning system (GPS), interferometric synthetic aperture radar (InSAR), and borehole and laser strainmeters, are costly and limited in their temporal or spatial resolution. Here we present a new type of strainmeter based on the coaxial cable Bragg grating (CCBG) sensing technology that provides cost-effective strain measurements. Two CCBGs are introduced into the geodetic strainmeter: one serves as a sensor to measure the strain applied to it, and the other acts as a reference to detect environmental noise. By combining the sensor and reference signals in a mixer, the environmental noise is minimized and a lower mixed frequency is obtained. The lower mixed frequency allows measurements to be taken with a portable spectrum analyzer, rather than an expensive spectrum analyzer or a vector network analyzer (VNA). Analysis of laboratory experiments shows that the strain can be measured by the CCBG sensor and that the portable spectrum analyzer achieves accuracy similar to that of an expensive spectrum analyzer; its relative error with respect to the R3272 spectrum analyzer is less than ±0.4%. The outputs of the geodetic strainmeter show a linear relationship with the strains that the CCBG sensor experienced. The measured sensitivity of the geodetic strainmeter is about −0.082 kHz/με; it can cover a large dynamic measuring range of up to 2%, and its nonlinear error is less than 5.3%. PMID:28417925
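    Using the reported sensitivity of about −0.082 kHz/με, strain can be recovered from the shift of the mixed frequency with a simple linear conversion. The reference frequency in the sketch below is an invented value used only for illustration.

```python
# Back-of-the-envelope conversion from a shift in the mixed frequency to
# strain, using the sensitivity reported above (about -0.082 kHz per
# microstrain). The reference frequency value is purely illustrative.

SENSITIVITY_KHZ_PER_MICROSTRAIN = -0.082

def strain_from_frequency_shift(f_mixed_khz: float, f_reference_khz: float) -> float:
    """Return strain in microstrain from the measured mixed-frequency shift."""
    delta_f = f_mixed_khz - f_reference_khz
    return delta_f / SENSITIVITY_KHZ_PER_MICROSTRAIN

# Example: a -8.2 kHz shift corresponds to about +100 microstrain.
print(strain_from_frequency_shift(f_mixed_khz=4991.8, f_reference_khz=5000.0))
```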

  19. A Cost-Effective Geodetic Strainmeter Based on Dual Coaxial Cable Bragg Gratings.

    PubMed

    Fu, Jihua; Wang, Xu; Wei, Tao; Wei, Meng; Shen, Yang

    2017-04-12

    Observations of surface deformation are essential for understanding a wide range of geophysical problems, including earthquakes, volcanoes, landslides, and glaciers. Current geodetic technologies, such as the global positioning system (GPS), interferometric synthetic aperture radar (InSAR), and borehole and laser strainmeters, are costly and limited in their temporal or spatial resolution. Here we present a new type of strainmeter based on the coaxial cable Bragg grating (CCBG) sensing technology that provides cost-effective strain measurements. Two CCBGs are introduced into the geodetic strainmeter: one serves as a sensor to measure the strain applied to it, and the other acts as a reference to detect environmental noise. By combining the sensor and reference signals in a mixer, the environmental noise is minimized and a lower mixed frequency is obtained. The lower mixed frequency allows measurements to be taken with a portable spectrum analyzer, rather than an expensive spectrum analyzer or a vector network analyzer (VNA). Analysis of laboratory experiments shows that the strain can be measured by the CCBG sensor and that the portable spectrum analyzer achieves accuracy similar to that of an expensive spectrum analyzer; its relative error with respect to the R3272 spectrum analyzer is less than ±0.4%. The outputs of the geodetic strainmeter show a linear relationship with the strains that the CCBG sensor experienced. The measured sensitivity of the geodetic strainmeter is about -0.082 kHz/με; it can cover a large dynamic measuring range of up to 2%, and its nonlinear error is less than 5.3%.

  20. Development of an x-ray prism for analyzer based imaging systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bewer, Brian; Chapman, Dean

    Analyzer crystal based imaging techniques such as diffraction enhanced imaging (DEI) and multiple imaging radiography (MIR) utilize the Bragg peak of perfect crystal diffraction to convert angular changes into intensity changes. These x-ray techniques extend the capability of conventional radiography, which derives image contrast from absorption, by providing large intensity changes for small angle changes introduced from the x-ray beam traversing the sample. Objects that have very little absorption contrast may have considerable refraction and ultrasmall angle x-ray scattering contrast improving visualization and extending the utility of x-ray imaging. To improve on the current DEI technique an x-ray prism (XRP) was designed and included in the imaging system. The XRP allows the analyzer crystal to be aligned anywhere on the rocking curve without physically moving the analyzer from the Bragg angle. By using the XRP to set the rocking curve alignment rather than moving the analyzer crystal physically the needed angle sensitivity is changed from submicroradians for direct mechanical movement of the analyzer crystal to tens of milliradians for movement of the XRP angle. However, this improvement in angle positioning comes at the cost of absorption loss in the XRP and depends on the x-ray energy. In addition to using an XRP for crystal alignment it has the potential for scanning quickly through the entire rocking curve. This has the benefit of collecting all the required data for image reconstruction in a single measurement thereby removing some problems with motion artifacts which remain a concern in current DEI/MIR systems especially for living animals.

  1. Development of an x-ray prism for analyzer based imaging systems

    NASA Astrophysics Data System (ADS)

    Bewer, Brian; Chapman, Dean

    2010-08-01

    Analyzer crystal based imaging techniques such as diffraction enhanced imaging (DEI) and multiple imaging radiography (MIR) utilize the Bragg peak of perfect crystal diffraction to convert angular changes into intensity changes. These x-ray techniques extend the capability of conventional radiography, which derives image contrast from absorption, by providing large intensity changes for small angle changes introduced from the x-ray beam traversing the sample. Objects that have very little absorption contrast may have considerable refraction and ultrasmall angle x-ray scattering contrast improving visualization and extending the utility of x-ray imaging. To improve on the current DEI technique an x-ray prism (XRP) was designed and included in the imaging system. The XRP allows the analyzer crystal to be aligned anywhere on the rocking curve without physically moving the analyzer from the Bragg angle. By using the XRP to set the rocking curve alignment rather than moving the analyzer crystal physically the needed angle sensitivity is changed from submicroradians for direct mechanical movement of the analyzer crystal to tens of milliradians for movement of the XRP angle. However, this improvement in angle positioning comes at the cost of absorption loss in the XRP and depends on the x-ray energy. In addition to using an XRP for crystal alignment it has the potential for scanning quickly through the entire rocking curve. This has the benefit of collecting all the required data for image reconstruction in a single measurement thereby removing some problems with motion artifacts which remain a concern in current DEI/MIR systems especially for living animals.

  2. Development of an x-ray prism for analyzer based imaging systems.

    PubMed

    Bewer, Brian; Chapman, Dean

    2010-08-01

    Analyzer crystal based imaging techniques such as diffraction enhanced imaging (DEI) and multiple imaging radiography (MIR) utilize the Bragg peak of perfect crystal diffraction to convert angular changes into intensity changes. These x-ray techniques extend the capability of conventional radiography, which derives image contrast from absorption, by providing large intensity changes for small angle changes introduced from the x-ray beam traversing the sample. Objects that have very little absorption contrast may have considerable refraction and ultrasmall angle x-ray scattering contrast improving visualization and extending the utility of x-ray imaging. To improve on the current DEI technique an x-ray prism (XRP) was designed and included in the imaging system. The XRP allows the analyzer crystal to be aligned anywhere on the rocking curve without physically moving the analyzer from the Bragg angle. By using the XRP to set the rocking curve alignment rather than moving the analyzer crystal physically the needed angle sensitivity is changed from submicroradians for direct mechanical movement of the analyzer crystal to tens of milliradians for movement of the XRP angle. However, this improvement in angle positioning comes at the cost of absorption loss in the XRP and depends on the x-ray energy. In addition to using an XRP for crystal alignment it has the potential for scanning quickly through the entire rocking curve. This has the benefit of collecting all the required data for image reconstruction in a single measurement thereby removing some problems with motion artifacts which remain a concern in current DEI/MIR systems especially for living animals.

  3. Universal MOSFET parameter analyzer

    NASA Astrophysics Data System (ADS)

    Klekachev, A. V.; Kuznetsov, S. N.; Pikulev, V. B.; Gurtov, V. A.

    2006-05-01

    The MOSFET analyzer is developed to extract the most important parameters of transistors. In addition to routine DC transfer and output characteristics, the analyzer provides an evaluation of interface state density by applying the charge pumping technique. Two features distinguish the analyzer from similar products of other vendors: it is a compact (100 × 80 × 50 mm³) and lightweight (< 200 g) instrument with ultra-low power consumption (< 2.5 W). The analyzer operates under the control of an IBM PC by means of a USB interface that simultaneously provides the power supply. Owing to the USB-compatible microcontroller used as its basic element, the analyzer offers a cost-effective solution for diverse applications. The enclosed software runs under the Windows 98/2000/XP operating systems and has a convenient graphical interface that simplifies measurements for untrained users. The operational characteristics of the analyzer are as follows: gate and drain output voltages within ±10 V; measured current range of 1 pA to 10 mA; lowest detectable interface state density of ~10⁹ cm⁻²·eV⁻¹. The instrument was designed on the basis of component parts from CYPRESS and ANALOG DEVICES (USA).
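    The charge-pumping evaluation mentioned above is commonly based on the textbook relation I_cp = q·f·A_G·N_it, so the mean interface trap density follows from the measured charge-pumping current, the pulse frequency and the gate area. Whether this exact formulation is what the instrument implements is an assumption; the numbers below are illustrative.

```python
# Sketch of the standard charge-pumping relation used to extract interface
# state density from the measured charge-pumping current:
#     I_cp = q * f * A_G * N_it   =>   N_it = I_cp / (q * f * A_G)
# Values below are illustrative, not taken from the instrument described above.

Q_E = 1.602e-19  # elementary charge, C

def interface_trap_density(i_cp_amps: float, freq_hz: float, gate_area_cm2: float) -> float:
    """Mean interface trap density N_it in cm^-2 from charge-pumping current."""
    return i_cp_amps / (Q_E * freq_hz * gate_area_cm2)

# Example: 10 pA of charge-pumping current at 100 kHz on a 1e-6 cm^2 gate.
print(f"N_it = {interface_trap_density(10e-12, 1e5, 1e-6):.2e} cm^-2")
```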

  4. Dust analysis on board the Destiny+ mission to 3200 Phaethon

    NASA Astrophysics Data System (ADS)

    Krüger, H.; Kobayashi, M.; Arai, T.; Srama, R.; Sarli, B. V.; Kimura, H.; Moragas-Klostermeyer, G.; Soja, R.; Altobelli, N.; Grün, E.

    2017-09-01

    The Japanese Destiny+ spacecraft will be launched to the active asteroid 3200 Phaethon in 2022. Among the proposed core payload is an in-situ dust instrument based on the Cassini Cosmic Dust Analyzer. We use the ESA Interplanetary Meteoroid Engineering Model (IMEM), to study detection conditions and fluences of interplanetary and interstellar dust with a dust analyzer on board Destiny+.

  5. Asking Scientists: A Decade of Questions Analyzed by Age, Gender, and Country

    ERIC Educational Resources Information Center

    Baram-Tsabari, Ayelet; Sethi, Ricky J.; Bry, Lynn; Yarden, Anat

    2009-01-01

    Nearly 79,000 questions sent to an Internet-based Ask-A-Scientist site during the last decade were analyzed according to the surfer's age, gender, country of origin, and the year the question was sent. The sample demonstrated a surprising dominance of female contributions among K-12 students (although this dominance did not carry over to the full…

  6. Detection of Fe[superscript 3+] and Al[superscript 3+] by Test Paper

    ERIC Educational Resources Information Center

    Li, Lili; Xiang, Haifeng; Zhou, Xiangge; Li, Menglong; Wu, Di

    2012-01-01

    A porphyrin-based test paper has been designed and prepared. It can be used to analyze for Al[superscript 3+] and Fe[superscript 3+] in aqueous solution. An experiment employing the test paper can help students understand basic principles of spectrophotometry and how spectrophotometry is used in analyzing for metal ions. (Contains 1 scheme and 1…

  7. Analyzing the Curricula of Doctor of Philosophy in Educational Technology-Related Programs in the United States

    ERIC Educational Resources Information Center

    Almaden, Abdullah; Ku, Heng-Yu

    2017-01-01

    The purpose of this study was to analyze on-campus and online PhD programs in educational technology-related fields in the United States. In particular, it sought to evaluate the most common program titles; core, elective, and research courses based on program curricula. The research design was quantitative content analysis and data were collected…

  8. Robotic Form-Finding and Construction Based on the Architectural Projection Logic

    NASA Astrophysics Data System (ADS)

    Zexin, Sun; Mei, Hongyuan

    2017-06-01

    In this article we analyze the relationship between architectural drawings and form-finding, and argue that architects should reuse and redefine traditional architectural drawings as a form-finding tool. We explain the projection systems and analyze how these systems have affected architectural design. We then use a robotic arm to carry out the experiment and establish a cylindrical projection form-finding system.

  9. Agricultural SWOT analysis and wisdom agriculture design of chengdu

    NASA Astrophysics Data System (ADS)

    Zhang, Qian; Chen, Xiangyu; Du, Shaoming; Yin, Guowei; Yu, Feng; Liu, Guicai; Gong, Jin; Han, Fujun

    2017-08-01

    Based on the current state of agricultural informatization, this paper analyzes the advantages, opportunities, and challenges of developing wisdom (smart) agriculture in Chengdu. By analyzing the local characteristics of Chengdu agriculture, a construction program for Chengdu wisdom agriculture was designed on the basis of the existing agricultural informatization. The positioning and development themes of Chengdu agriculture are leisure agriculture, urban agriculture, and quality agriculture.

  10. Analyzing the Anglo-American Hegemony in the "Times Higher Education" Rankings

    ERIC Educational Resources Information Center

    Kaba, Amadu Jacky

    2012-01-01

    This study analyzes the 2009 "Times Higher Education"-QS top 200 universities in the world. Based on this analysis the study claims that the THS reflects the phenomenon of Anglo American hegemony. The United States with 54 universities and the United Kingdom with 29 dominated the THS. In addition, six out of every ten universities on the…

  11. Distributed Computing Environment for Mine Warfare Command

    DTIC Science & Technology

    1993-06-01

    based system to a decentralized network of personal computers over the past several years. This thesis analyzes the progress of the evolution as of May of 1992.

  12. Using Affinity Chromatography to Investigate Novel Protein-Protein Interactions in an Undergraduate Cell and Molecular Biology Lab Course

    ERIC Educational Resources Information Center

    Belanger, Kenneth D.

    2009-01-01

    Inquiry-driven lab exercises require students to think carefully about a question, carry out an investigation of that question, and critically analyze the results of their investigation. Here, we describe the implementation and assessment of an inquiry-based laboratory exercise in which students obtain and analyze novel data that contribute to our…

  13. Current State of Test Development, Administration, and Analysis: A Study of Faculty Practices.

    PubMed

    Bristol, Timothy J; Nelson, John W; Sherrill, Karin J; Wangerin, Virginia S

    Developing valid and reliable test items is a critical skill for nursing faculty. This research analyzed the test item writing practice of 674 nursing faculty. Relationships between faculty characteristics and their test item writing practices were analyzed. Findings reveal variability in practice and a gap in implementation of evidence-based standards when developing and evaluating teacher-made examinations.

  14. A broader definition of occupancy: Comment on Hayes and Monfils

    Treesearch

    Quresh S. Latif; Martha M. Ellis; Courtney L. Amundson

    2016-01-01

    Occupancy models are widely used to analyze presence-absence data for a variety of taxa while accounting for observation error (MacKenzie et al. 2002, 2006; Tyre et al. 2003; Royle and Dorazio 2008). Hayes and Monfils (2015) question their use for analyzing avian point count data based on purported violations of model assumptions incurred by avian mobility....

  15. Secondary Students' Dynamic Modeling Processes: Analyzing, Reasoning About, Synthesizing, and Testing Models of Stream Ecosystems.

    ERIC Educational Resources Information Center

    Stratford, Steven J.; Krajeik, Joseph; Soloway, Elliot

    This paper presents the results of a study of the cognitive strategies in which ninth-grade science students engaged as they used a learner-centered dynamic modeling tool (called Model-It) to make original models based upon stream ecosystem scenarios. The research questions were: (1) In what Cognitive Strategies for Modeling (analyzing, reasoning,…

  16. Developing Codebooks as a New Tool to Analyze Students' ePortfolios

    ERIC Educational Resources Information Center

    Impedovo, Maria Antonietta; Ritella, Giuseppe; Ligorio, Maria Beatrice

    2013-01-01

    This paper describes a three-step method for the construction of codebooks meant for analyzing ePortfolio content. The first step produces a prototype based on qualitative analysis of very different ePortfolios from the same course. During the second step, the initial version of the codebook is tested on a larger sample and subsequently revised.…

  17. Analyzing How Formalist, Cognitive-Processing, and Literacy Practices Learning Paradigms are Shaping the Implementation of the Common Core State Standards

    ERIC Educational Resources Information Center

    Beach, Richard

    2011-01-01

    This paper analyzes the influence of three different learning paradigms for learning literacy--formalist, cognitive-processing, and literacy practices--on the implementation of the Common Core State Standards. It argues that the Common Core State Standards are based largely on a formalist paradigm as evident in the emphasis on teaching text…

  18. Multiple-Use Site Demand Analysis: An Application to the Boundary Waters Canoe Area Wilderness.

    ERIC Educational Resources Information Center

    Peterson, George L.; And Others

    1982-01-01

    A single-site, multiple-use model for analyzing trip demand is derived from a multiple site regional model based on utility maximizing choice theory. The model is used to analyze and compare trips to the Boundary Waters Canoe Area Wilderness for several types of use. Travel cost elasticities of demand are compared and discussed. (Authors/JN)

  19. Cluster analysis of historical and modern hard red spring wheat cultivars based on parentage and HPLC analysis of gluten forming proteins

    USDA-ARS?s Scientific Manuscript database

    In this study, 30 hard red spring (HRS) wheat cultivars released between 1910 and 2013 were analyzed to determine how they cluster in terms of parentage and protein data, analyzed by reverse-phase HPLC (RP-HPLC) of gliadins, and size-exclusion HPLC (SE-HPLC) of unreduced proteins. Dwarfing genes in...

  20. Portable Microplate Analyzer with a Thermostatic Chamber Based on a Smartphone for On-site Rapid Detection.

    PubMed

    Wan, Zijian; Zhong, Longjie; Pan, Yuxiang; Li, Hongbo; Zou, Quchao; Su, Kaiqi; Wang, Ping

    2017-01-01

    The microplate method provides an efficient way to use modern detection technology. However, there are some difficulties for on-site detection, such as the instruments being non-portable and the assays time-consuming. In this work, a novel portable microplate analyzer with a thermostatic chamber, based on a smartphone, was designed for rapid on-site detection. The analyzer, with a wide-angle lens and an optical filter, provides a proper environment for the microplate. A smartphone app, iPlate Monitor, was used for RGB analysis of the images. After a consistency experiment against a microtiter plate reader (MTPR), the normalized calibration curves were y = 0.7276x + 0.0243 (R² = 0.9906) and y = 0.3207x + 0.0094 (R² = 0.9917) with a BCA protein kit, as well as y = 0.182x + 0.0134 (R² = 0.994) and y = 0.0674x + 0.0003 (R² = 0.9988) with a glucose kit. The times required to meet the detection requirement were 15 and 10 min for the BCA protein kit and the glucose kit at 37°C; in contrast, more than 30 and 20 min were required at ambient temperature. The analyzer also showed good repeatability.
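    As an illustration of using one of the reported calibration curves, the sketch below inverts the BCA curve y = 0.7276x + 0.0243 to convert a normalized colour signal into a concentration estimate. The assignment of x to concentration and y to the RGB-derived signal, and the units, are assumptions, since the abstract does not state them.

```python
# Illustrative use of one of the normalized calibration curves reported above
# (BCA protein kit: y = 0.7276 x + 0.0243, R^2 = 0.9906). It is assumed that
# x is the analyte concentration and y the normalized colour signal derived
# from the smartphone RGB analysis; units are unspecified, so values are
# purely illustrative.

SLOPE, INTERCEPT = 0.7276, 0.0243

def concentration_from_signal(y_signal: float) -> float:
    """Invert the linear calibration curve to estimate concentration."""
    return (y_signal - INTERCEPT) / SLOPE

for y in (0.10, 0.35, 0.60):
    print(f"signal {y:.2f} -> concentration {concentration_from_signal(y):.3f} (arbitrary units)")
```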

  1. Complete set of deuteron analyzing powers from d ⃗p elastic scattering at 190 MeV/nucleon

    NASA Astrophysics Data System (ADS)

    Sekiguchi, K.; Witała, H.; Akieda, T.; Eto, D.; Kon, H.; Wada, Y.; Watanabe, A.; Chebotaryov, S.; Dozono, M.; Golak, J.; Kamada, H.; Kawakami, S.; Kubota, Y.; Maeda, Y.; Miki, K.; Milman, E.; Ohkura, A.; Sakai, H.; Sakaguchi, S.; Sakamoto, N.; Sasano, M.; Shindo, Y.; Skibiński, R.; Suzuki, H.; Tabata, M.; Uesaka, T.; Wakasa, T.; Yako, K.; Yamamoto, T.; Yanagisawa, Y.; Yasuda, J.

    2017-12-01

    All deuteron analyzing powers for elastic deuteron-proton (d p ) scattering have been measured with a polarized deuteron beam at 186.6 MeV/nucleon. They are compared with results of three-nucleon Faddeev calculations based on the standard, high-precision nucleon-nucleon (N N ) potentials alone or combined with commonly used three-nucleon force (3 N F ) models such as the Tucson-Melbourne '99 or the Urbana IX. Predicted 3 N F effects localized at backward angles are supported only partially by the data. The data are also compared to predictions based on locally regularized chiral N N potentials. An estimation of theoretical truncation uncertainties in the consecutive orders of chiral expansion suggests that the observed discrepancies between this modern theory and the data could probably be explained by including chiral 3 N F 's in future calculations. A systematic comparison to the deuteron analyzing power data previously taken at incident energies from 70 to 294 MeV/nucleon clearly shows that not only the cross section but also the analyzing powers reveal growing 3 N F effects when the three-nucleon system energy is increased.

  2. [An EMD based time-frequency distribution and its application in EEG analysis].

    PubMed

    Li, Xiaobing; Chu, Meng; Qiu, Tianshuang; Bao, Haiping

    2007-10-01

    The Hilbert-Huang transform (HHT) is a new time-frequency analysis method for nonlinear and non-stationary signals. The key step of this method is empirical mode decomposition (EMD), with which any complicated signal can be decomposed into a finite and small number of intrinsic mode functions (IMFs). In this paper, a new EMD-based method for suppressing the cross-term of the Wigner-Ville distribution (WVD) is developed and applied to analyze epileptic EEG signals. The simulation data and analysis results show that the new method suppresses the cross-term of the WVD effectively with excellent resolution.

  3. Binocular optical axis parallelism detection precision analysis based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ying, Jiaju; Liu, Bingqi

    2018-02-01

    Based on the working principle of the digital calibration instrument for the optical axis parallelism of binocular photoelectric instruments, and considering all components of the instrument, the various factors affecting system precision are analyzed and a precision analysis model is established. Based on the error distributions, the Monte Carlo method is used to analyze the relationship between the comprehensive error and the change in the center coordinates of the circular target image. The method can further guide the error allocation, optimize and control the factors that have the greatest influence on the comprehensive error, and improve the measurement accuracy of the optical axis parallelism digital calibration instrument.
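    A generic Monte Carlo error-propagation sketch in the spirit of the analysis above is given below: each error source is sampled from an assumed distribution and the resulting spread of the target-image centre coordinate is observed. The error model (two angular errors combining linearly through a focal length and pixel pitch) is hypothetical and stands in for the instrument's real model.

```python
import numpy as np

# Generic Monte Carlo error propagation: sample each error source and look at
# the spread of the resulting centre-coordinate shift. All values are invented.

rng = np.random.default_rng(42)
N = 100_000

tilt_err = rng.normal(0.0, 5e-6, N)        # optical tilt error, rad (assumed)
mount_err = rng.uniform(-1e-5, 1e-5, N)    # mounting error, rad (assumed)
focal_length_mm = 250.0
pixel_pitch_mm = 0.005

# Hypothetical mapping from angular errors to centre-coordinate shift (pixels)
centre_shift_px = (tilt_err + mount_err) * focal_length_mm / pixel_pitch_mm

print(f"mean shift   = {centre_shift_px.mean():+.3f} px")
print(f"1-sigma      = {centre_shift_px.std():.3f} px")
print(f"95% interval = ±{np.percentile(np.abs(centre_shift_px), 95):.3f} px")
```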

  4. Radical-initiated controlled synthesis of homo- and copolymers based on acrylonitrile

    NASA Astrophysics Data System (ADS)

    Grishin, D. F.; Grishin, I. D.

    2015-07-01

    Data on the controlled synthesis of polyacrylonitrile and acrylonitrile copolymers with other (meth)acrylic and vinyl monomers upon radical initiation and metal complex catalysis are analyzed. Primary attention is given to the use of metal complexes for the synthesis of acrylonitrile-based (co)polymers with defined molecular weight and polydispersity in living mode by atom transfer radical polymerization. The prospects for using known methods of controlled synthesis of macromolecules for the preparation of acrylonitrile homo- and copolymers as carbon fibre precursors are estimated. The major array of published data analyzed in the review refers to the last decade. The bibliography includes 175 references.

  5. Virtual Instrument for Determining Rate Constant of Second-Order Reaction by pX Based on LabVIEW 8.0.

    PubMed

    Meng, Hu; Li, Jiang-Yuan; Tang, Yong-Huai

    2009-01-01

    A virtual instrument system for an ion analyzer, based on LabVIEW 8.0, which can measure and analyze ion concentrations in solution, has been developed; it comprises a homemade conditioning circuit, a data acquisition board, and a computer. It can calibrate slope, temperature, and positioning automatically. When applied to determining a reaction rate constant by pX, it achieves live acquisition, real-time display, automatic processing of test data, generation of a results report, and other functions. This method greatly simplifies the experimental operation, avoids the complicated procedures of manual data processing and the associated personal error, and improves the accuracy and repeatability of the experimental results.
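    For a second-order reaction with equal initial concentrations, the integrated rate law is 1/c(t) − 1/c0 = kt, and the electrode reading pX = −log10[X] gives the concentration directly. The sketch below fits k from synthetic pX-versus-time data; whether this is the exact chemistry and rate law used with the instrument above is an assumption.

```python
import numpy as np

# Sketch of extracting a second-order rate constant from pX measurements,
# assuming the equal-initial-concentration integrated rate law
#     1/c(t) - 1/c0 = k t,    with c(t) = 10**(-pX).
# The data below are synthetic.

t = np.array([0., 60., 120., 180., 240., 300.])             # s
pX = np.array([2.000, 2.071, 2.130, 2.182, 2.228, 2.270])   # -log10 of concentration

c = 10.0 ** (-pX)                 # mol/L
y = 1.0 / c - 1.0 / c[0]          # should grow linearly with t
k, _ = np.polyfit(t, y, 1)        # slope = rate constant, L mol^-1 s^-1

print(f"k ≈ {k:.3f} L mol^-1 s^-1")
```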

  6. Capillary electrophoresis for the analysis of contaminants in emerging food safety issues and food traceability.

    PubMed

    Vallejo-Cordoba, Belinda; González-Córdova, Aarón F

    2010-07-01

    This review presents an overview of the applicability of CE in the analysis of chemical and biological contaminants involved in emerging food safety issues. Additionally, CE-based genetic analyzers' usefulness as a unique tool in food traceability verification systems was presented. First, analytical approaches for the determination of melamine and specific food allergens in different foods were discussed. Second, natural toxin analysis by CE was updated from the last review reported in 2008. Finally, the analysis of prion proteins associated with the "mad cow" crises and the application of CE-based genetic analyzers for meat traceability were summarized.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rice, K.; Bricker, O.

    The report describes the results of a study to assess the sensitivity of streams to acidic deposition in Charles and Anne Arundel Counties, Maryland using a geology-based method. Water samples were collected from streams in July and August 1988 when streams were at base-flow conditions. Eighteen water samples collected from streams in Charles County, and 17 water samples from streams in Anne Arundel County were analyzed in the field for pH, specific conductance, and acid-neutralizing capacity (ANC); 8 water samples from streams in Charles County were analyzed in the laboratory for chloride and sulfate concentrations. The assessment revealed that streams in these counties are sensitive to acidification by acidic deposition.

  8. A Simulation Based Approach for Contingency Planning for Aircraft Turnaround Operation System Activities in Airline Hubs

    NASA Technical Reports Server (NTRS)

    Adeleye, Sanya; Chung, Christopher

    2006-01-01

    Commercial aircraft undergo a significant number of maintenance and logistical activities during the turnaround operation at the departure gate. By analyzing the sequencing of these activities, more effective turnaround contingency plans may be developed for logistical and maintenance disruptions. Turnaround contingency plans are particularly important as any kind of delay in a hub based system may cascade into further delays with subsequent connections. The contingency sequencing of the maintenance and logistical turnaround activities were analyzed using a combined network and computer simulation modeling approach. Experimental analysis of both current and alternative policies provides a framework to aid in more effective tactical decision making.

  9. Building an information model (with the help of PSL/PSA). [Problem Statement Language/Problem Statement Analyzer

    NASA Technical Reports Server (NTRS)

    Callender, E. D.; Farny, A. M.

    1983-01-01

    Problem Statement Language/Problem Statement Analyzer (PSL/PSA) applications, which were once a one-step process in which product system information was immediately translated into PSL statements, have in light of experience been shown to result in inconsistent representations. These shortcomings have prompted the development of an intermediate step, designated the Product System Information Model (PSIM), which provides a basis for the mutual understanding of customer terminology and the formal, conceptual representation of that product system in a PSA data base. The PSIM is initially captured as a paper diagram, followed by formal capture in the PSL/PSA data base.

  10. Scanning microwave microscopy applied to semiconducting GaAs structures

    NASA Astrophysics Data System (ADS)

    Buchter, Arne; Hoffmann, Johannes; Delvallée, Alexandra; Brinciotti, Enrico; Hapiuk, Dimitri; Licitra, Christophe; Louarn, Kevin; Arnoult, Alexandre; Almuneau, Guilhem; Piquemal, François; Zeier, Markus; Kienberger, Ferry

    2018-02-01

    A calibration algorithm based on one-port vector network analyzer (VNA) calibration for scanning microwave microscopes (SMMs) is presented and used to extract quantitative carrier densities from a semiconducting n-doped GaAs multilayer sample. This robust and versatile algorithm is instrument and frequency independent, as we demonstrate by analyzing experimental data from two different, cantilever- and tuning fork-based, microscope setups operating in a wide frequency range up to 27.5 GHz. To benchmark the SMM results, comparison with secondary ion mass spectrometry is undertaken. Furthermore, we show SMM data on a GaAs p-n junction distinguishing p- and n-doped layers.

  11. Estimation of Transport and Kinetic Parameters of Vanadium Redox Batteries Using Static Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seong Beom; Pratt, III, Harry D.; Anderson, Travis M.

    Mathematical models of redox flow batteries (RFBs) can be used to analyze cell performance, optimize battery operation, and control the energy storage system efficiently. Among many other models, physics-based electrochemical models are capable of predicting internal states of the battery, such as temperature, state-of-charge, and state-of-health. In these models, parameter estimation is an important step for studying, analyzing, and validating the models against experimental data. A common practice is to determine these parameters either through experiments or from information available in the literature. However, it is not easy to obtain all the required parameters in this way, and there are cases where important information, such as diffusion coefficients and rate constants of ions, has not been studied. Also, the parameters needed for modeling charge-discharge are not always available. In this paper, an efficient way to estimate the parameters of physics-based redox battery models is proposed. The paper also demonstrates that the proposed approach can be used to study and analyze aspects of capacity loss/fade, kinetics, and transport phenomena of the RFB system.
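    The parameter-estimation idea can be illustrated by fitting the unknown constants of a battery voltage model to measured discharge data by nonlinear least squares. The toy model below (open-circuit voltage minus an ohmic drop and a diffusion-like √t term) is only a stand-in for the physics-based RFB model discussed above, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch of parameter estimation: fit unknown constants of a toy
# voltage model to (synthetic) discharge data by nonlinear least squares.

def toy_voltage(t, ocv, r_ohm, k_diff, current=1.0):
    return ocv - current * r_ohm - k_diff * np.sqrt(t)

t = np.linspace(0.0, 3600.0, 50)                       # s
true = (1.40, 0.05, 1.5e-3)                            # OCV [V], R [ohm], k
v_meas = toy_voltage(t, *true) + np.random.default_rng(1).normal(0, 2e-3, t.size)

popt, pcov = curve_fit(toy_voltage, t, v_meas, p0=(1.3, 0.1, 1e-3))
print("estimated OCV, R, k:", popt)
```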

  12. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
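    A simplified piece of the pixel bookkeeping involved is sketched below: a radial intensity profile (mean intensity versus distance from an assumed centre) computed on a synthetic image. Ganalyzer's actual radial intensity plot and peak-slope analysis are more elaborate; this only illustrates the general approach.

```python
import numpy as np

# Simplified sketch: mean intensity as a function of distance from the galaxy
# centre, computed on a synthetic image. Not the Ganalyzer implementation.

def radial_profile(image, cx, cy, n_bins=30):
    """Mean pixel intensity as a function of distance from (cx, cy)."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - cx, yy - cy)
    bins = np.linspace(0, r.max(), n_bins + 1)
    which = np.digitize(r.ravel(), bins) - 1
    flat = image.ravel()
    profile = np.array([flat[which == i].mean() if np.any(which == i) else 0.0
                        for i in range(n_bins)])
    return 0.5 * (bins[:-1] + bins[1:]), profile

# Synthetic "galaxy": a 2-D Gaussian blob centred in a 128x128 frame.
yy, xx = np.mgrid[0:128, 0:128]
img = np.exp(-((xx - 64) ** 2 + (yy - 64) ** 2) / (2 * 15 ** 2))
radii, prof = radial_profile(img, cx=64, cy=64)
print(prof[:5])  # intensity falls off with radius
```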

  13. Estimation of Transport and Kinetic Parameters of Vanadium Redox Batteries Using Static Cells

    DOE PAGES

    Lee, Seong Beom; Pratt, III, Harry D.; Anderson, Travis M.; ...

    2018-03-27

    Mathematical models of Redox Flow Batteries (RFBs) can be used to analyze cell performance, optimize battery operation, and control the energy storage system efficiently. Among many other models, physics-based electrochemical models are capable of predicting internal states of the battery, such as temperature, state-of-charge, and state-of-health. In the models, estimating parameters is an important step that can study, analyze, and validate the models using experimental data. A common practice is to determine these parameters either through conducting experiments or based on the information available in the literature. However, it is not easy to investigate all proper parameters for the modelsmore » through this way, and there are occasions when important information, such as diffusion coefficients and rate constants of ions, has not been studied. Also, the parameters needed for modeling charge-discharge are not always available. In this paper, an efficient way to estimate parameters of physics-based redox battery models will be proposed. Furthermore, this paper also demonstrates that the proposed approach can study and analyze aspects of capacity loss/fade, kinetics, and transport phenomena of the RFB system.« less

  14. Pathway-Based Genome-Wide Association Studies for Two Meat Production Traits in Simmental Cattle.

    PubMed

    Fan, Huizhong; Wu, Yang; Zhou, Xiaojing; Xia, Jiangwei; Zhang, Wengang; Song, Yuxin; Liu, Fei; Chen, Yan; Zhang, Lupei; Gao, Xue; Gao, Huijiang; Li, Junya

    2015-12-17

    Most single nucleotide polymorphisms (SNPs) detected by genome-wide association studies (GWAS), explain only a small fraction of phenotypic variation. Pathway-based GWAS were proposed to improve the proportion of genes for some human complex traits that could be explained by enriching a mass of SNPs within genetic groups. However, few attempts have been made to describe the quantitative traits in domestic animals. In this study, we used a dataset with approximately 7,700,000 SNPs from 807 Simmental cattle and analyzed live weight and longissimus muscle area using a modified pathway-based GWAS method to orthogonalise the highly linked SNPs within each gene using principal component analysis (PCA). As a result, of the 262 biological pathways of cattle collected from the KEGG database, the gamma aminobutyric acid (GABA)ergic synapse pathway and the non-alcoholic fatty liver disease (NAFLD) pathway were significantly associated with the two traits analyzed. The GABAergic synapse pathway was biologically applicable to the traits analyzed because of its roles in feed intake and weight gain. The proposed method had high statistical power and a low false discovery rate, compared to those of the smallest P-value and SNP set enrichment analysis methods.
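    The PCA step can be illustrated as follows: the 0/1/2 genotype matrix of the SNPs within one gene is centred and reduced to its leading principal components, which then serve as orthogonal predictors in the association test. The 90% variance threshold and the random genotypes below are illustrative assumptions; the association test itself is omitted.

```python
import numpy as np

# Sketch of per-gene PCA orthogonalization of linked SNPs before association
# testing. Genotypes are coded 0/1/2; data and threshold are illustrative.

def gene_principal_components(genotypes, var_explained=0.90):
    """genotypes: (n_animals, n_snps) 0/1/2 matrix for one gene."""
    X = genotypes - genotypes.mean(axis=0)            # centre each SNP
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    var = s ** 2 / np.sum(s ** 2)
    k = int(np.searchsorted(np.cumsum(var), var_explained)) + 1
    return U[:, :k] * s[:k]                           # component scores

rng = np.random.default_rng(0)
geno = rng.integers(0, 3, size=(807, 25)).astype(float)  # 807 animals, 25 SNPs in a gene
scores = gene_principal_components(geno)
print(scores.shape)  # (807, k) orthogonal predictors for this gene
```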

  15. A WebGIS-based system for analyzing and visualizing air quality data for Shanghai Municipality

    NASA Astrophysics Data System (ADS)

    Wang, Manyi; Liu, Chaoshun; Gao, Wei

    2014-10-01

    An online visual analysis system for air quality data of Shanghai Municipality, based on Java Web and WebGIS, was designed and implemented to quantitatively analyze and qualitatively visualize air quality data. After analyzing the architectures of WebGIS and Java Web, we first designed the overall system architecture, then specified the software and hardware environment, and determined the main functional modules of the system. The visual system was ultimately built with the DIV + CSS layout method combined with JSP, JavaScript, and other programming languages in the Java programming environment. Moreover, the Struts, Spring, and Hibernate frameworks (SSH) were integrated into the system for ease of maintenance and expansion. To provide mapping services and spatial analysis functions, we selected ArcGIS for Server as the GIS server. We also used an Oracle database and an ESRI file geodatabase to store spatial and non-spatial data in order to ensure data security. In addition, the response data from the Web server are resampled to enable rapid visualization in the browser. The experimental results indicate that this system can respond quickly to user requests and efficiently return accurate processing results.

  16. Analyzing endocrine system conservation and evolution.

    PubMed

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution.

  17. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    NASA Astrophysics Data System (ADS)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potential of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor groundwater potential zone, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. The validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, which indicates that the SI method performs slightly better than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, these techniques are very useful for groundwater potential analysis and can be practical for water-resource management experts.
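
    The statistical index (SI) is conventionally defined as the natural logarithm of the spring density within a factor class divided by the overall spring density of the study area. The sketch below assumes that conventional definition, since the paper's exact implementation is not given in the abstract; the pixel and spring counts are illustrative.

    ```python
    import math

    def statistical_index(springs_in_class, pixels_in_class,
                          total_springs, total_pixels):
        """Statistical index (SI) weight for one class of a conditioning factor.

        SI = ln( class spring density / overall spring density ).
        A small epsilon guards classes without springs (an assumption).
        """
        eps = 1e-9
        class_density = (springs_in_class + eps) / pixels_in_class
        overall_density = total_springs / total_pixels
        return math.log(class_density / overall_density)

    # Illustrative numbers only (not from the Broujerd dataset):
    # a slope class holding 120 of 496 springs on 15% of the study-area pixels
    print(statistical_index(120, 150_000, 496, 1_000_000))  # positive -> favourable class
    ```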

  18. The connection characteristics of flux pinned docking interface

    NASA Astrophysics Data System (ADS)

    Zhang, Mingliang; Han, Yanjun; Guo, Xing; Zhao, Cunbao; Deng, Feiyue

    2017-03-01

    This paper presents the mechanism and potential advantages of a flux pinned docking interface mainly composed of a high temperature superconductor and an electromagnet. In order to readily assess the connection characteristics of the flux pinned docking interface, the force between a high temperature superconductor and an electromagnet needs to be investigated. Based on the magnetic dipole method and the Ampere law method, the force between two current coils can be calculated and compared, which shows that the Ampere law method has the higher calculation accuracy. Based on the improved frozen image model and the Ampere law method, the force between a high temperature superconductor bulk and a permanent magnet can be calculated, which is validated experimentally. Moreover, the force between the high temperature superconductor and the electromagnet applied to the flux pinned docking interface can be predicted and analyzed. The connection stiffness between the high temperature superconductor and the permanent magnet can be calculated based on the improved frozen image model and Hooke's law. The relationship between the connection stiffness and the field cooling height is analyzed. Furthermore, the connection stiffness of the flux pinned docking interface is predicted and optimized, and its effective working range is defined and analyzed for different parameter sets.
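
    Of the two methods compared for the coil-coil force, the magnetic dipole method is the simpler one; a minimal sketch of it for two coaxial current loops is given below. The coil parameters are illustrative, and the closed-form expression F = 3*mu0*m1*m2 / (2*pi*z^4) is only valid when the separation is large relative to the coil radii, which is one reason an Ampere-law (Biot-Savart) integration can be the more accurate choice at close range.

    ```python
    import math

    MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

    def dipole_force_coaxial(i1, r1, n1, i2, r2, n2, z):
        """Axial force between two coaxial current coils, treated as point
        magnetic dipoles m = N * I * pi * r**2 (valid only for z >> r1, r2).
        Returns the force in newtons.
        """
        m1 = n1 * i1 * math.pi * r1**2
        m2 = n2 * i2 * math.pi * r2**2
        return 3.0 * MU0 * m1 * m2 / (2.0 * math.pi * z**4)

    # Illustrative coils: 100 turns, 2 A, 10 mm radius, 60 mm apart
    print(dipole_force_coaxial(2.0, 0.01, 100, 2.0, 0.01, 100, 0.06))
    ```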

  19. A versatile software package for inter-subject correlation based analyses of fMRI.

    PubMed

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive, and the ISC toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
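
    The basic ISC statistic described here is a voxel-wise average of pairwise Pearson correlations between subjects' time series. The NumPy sketch below (not the Matlab toolbox itself) illustrates that definition on simulated data; the array sizes and noise level are arbitrary.

    ```python
    import numpy as np

    def intersubject_correlation(data):
        """Voxel-wise inter-subject correlation.

        data: array of shape (n_subjects, n_timepoints, n_voxels) holding the
        fMRI time series of each subject on the same registered voxel grid.
        Returns, for every voxel, the mean Pearson correlation over all
        subject pairs, i.e. the basic ISC statistic described in the abstract.
        """
        n_subj, n_time, n_vox = data.shape
        # z-score each subject's time series so the correlation is a mean product
        z = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)
        isc = np.zeros(n_vox)
        pairs = 0
        for i in range(n_subj):
            for j in range(i + 1, n_subj):
                isc += (z[i] * z[j]).mean(axis=0)   # Pearson r per voxel
                pairs += 1
        return isc / pairs

    # Simulated example: 10 subjects, 200 time points, 500 voxels
    rng = np.random.default_rng(1)
    shared = rng.standard_normal((200, 500))
    subject_data = shared + 2.0 * rng.standard_normal((10, 200, 500))
    print(intersubject_correlation(subject_data).mean())  # clearly above zero
    ```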

  20. A versatile software package for inter-subject correlation based analyses of fMRI

    PubMed Central

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive, and the ISC toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/ PMID:24550818

  1. Protection of Renewable-dominated Microgrids: Challenges and Potential Solutions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elkhatib, Mohamed; Ellis, Abraham; Milan Biswal

    Keywords: Microgrid Protection, Impedance Relay, Signal Processing-based Fault Detection, Networked Microgrids, Communication-Assisted Protection. In this report we address the challenge of designing an efficient protection system for inverter-dominated microgrids. These microgrids are characterised by limited fault current capacity as a result of the current-limiting protection functions of inverters. Typically, inverters limit their fault contribution within a sub-cycle time frame to as low as 1.1 per unit. As a result, overcurrent protection could fail completely to detect faults in inverter-dominated microgrids. As part of this project, a detailed literature survey of existing and proposed microgrid protection schemes was conducted. The survey concluded that there is a gap in the available microgrid protection methods. The only credible protection solution available in the literature for low-fault inverter-dominated microgrids is the differential protection scheme, which represents a robust transmission-grade protection solution but at a very high cost. Two non-overcurrent protection schemes were investigated as part of this project: impedance-based protection and transient-based protection. Impedance-based protection depends on monitoring impedance trajectories at feeder relays to detect faults. Two communication-assisted impedance protection schemes were developed. The first scheme utilizes directional elements and pilot signals to locate the fault. The second scheme depends on a Central Protection Unit that communicates with all feeder relays to locate the fault based on directional flags received from the feeder relays. The latter approach could potentially be adapted to protect networked microgrids and dynamic topology microgrids. Transient-based protection relies on analyzing high frequency transients to detect and locate faults. This approach is very promising, but its implementation in the field faces several challenges. For example, high frequency transients due to faults can be confused with transients due to other events such as capacitor switching. Additionally, while detecting faults by analyzing transients could be doable, locating faults based on analyzing transients is still an open question.
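
    The report's impedance-based schemes are communication-assisted and are not specified in detail in this abstract, so the snippet below only illustrates the underlying idea of monitoring the impedance trajectory seen by a feeder relay: the apparent impedance V/I is tested against a generic mho (distance) characteristic. The phasor values, reach setting and characteristic angle are illustrative assumptions, not the report's settings.

    ```python
    import cmath

    def inside_mho_zone(v_phasor, i_phasor, z_set, angle_deg=75.0):
        """Check whether the apparent impedance seen by a feeder relay falls
        inside a mho characteristic with reach `z_set` ohms at `angle_deg`.

        This is a generic distance-element test, not the report's exact
        pilot/central-unit logic; it only illustrates fault detection by
        monitoring the impedance trajectory Z = V / I.
        """
        z_apparent = v_phasor / i_phasor
        z_reach = cmath.rect(z_set, cmath.pi * angle_deg / 180.0)
        centre = z_reach / 2.0          # mho circle through the origin
        return abs(z_apparent - centre) <= abs(centre)

    # Illustrative phasors: depressed voltage, modest inverter-limited current
    print(inside_mho_zone(complex(800, 0), complex(110, -60), z_set=10.0))
    ```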

  2. Failure Analysis of Network Based Accessible Pedestrian Signals in Closed-Loop Operation

    DOT National Transportation Integrated Search

    2011-03-01

    The potential failure modes of a network based accessible pedestrian system were analyzed to determine the limitations and benefits of closed-loop operation. The vulnerabilities of the system are assessed using the industry standard process known as ...

  3. COST-EFFECTIVE SAMPLING FOR SPATIALLY DISTRIBUTED PHENOMENA

    EPA Science Inventory

    Various measures of sampling plan cost and loss are developed and analyzed as they relate to a variety of multidisciplinary sampling techniques. The sampling choices examined include methods from design-based sampling, model-based sampling, and geostatistics. Graphs and tables ar...

  4. The Role of Problem-Based Learning in the Enhancement of Allied Health Education.

    ERIC Educational Resources Information Center

    Tavakol, Kamran; Reicherter, E. Anne

    2003-01-01

    Analyzes the literature on problem-based learning (PBL) and explains its rationale, process, and current outcomes research. Cites examples of PBL in medical education and its application to allied health education. (Contains 49 references.) (JOW)

  5. Force-Based Reasoning for Assembly Planning and Subassembly Stability Analysis

    NASA Technical Reports Server (NTRS)

    Lee, S.; Yi, C.; Wang, F-C.

    1993-01-01

    In this paper, we show that force-based reasoning, for identifying a cluster of parts that can be decomposed naturally by the applied force, plays an important role in selecting feasible subassemblies and analyzing subassembly stability in assembly planning.

  6. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors

    NASA Astrophysics Data System (ADS)

    Cenek, Martin; Dahl, Spencer K.

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
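
    Recording "the probability of agents changing their behavior from one pattern of behavior to another" amounts to estimating a transition matrix over discrete pattern labels. The sketch below assumes the behaviors have already been clustered into integer pattern labels per agent and time step (that clustering stage is not shown), and the random labels are purely illustrative.

    ```python
    import numpy as np

    def pattern_transition_matrix(labels, n_patterns):
        """Estimate the probability of agents switching between behavior patterns.

        labels: integer array of shape (n_agents, n_steps) giving the behavior
        pattern assigned to each agent at each time step (assumed to come from
        an earlier clustering stage). Returns an (n_patterns, n_patterns)
        row-stochastic transition matrix.
        """
        counts = np.zeros((n_patterns, n_patterns))
        for agent in labels:
            for a, b in zip(agent[:-1], agent[1:]):
                counts[a, b] += 1
        row_sums = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

    # Illustrative run: 50 agents, 100 steps, 3 behavior patterns
    rng = np.random.default_rng(2)
    labels = rng.integers(0, 3, size=(50, 100))
    print(pattern_transition_matrix(labels, 3).round(2))
    ```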

  7. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors.

    PubMed

    Cenek, Martin; Dahl, Spencer K

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.

  8. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
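
    The abstract links each tool component to a discrete-time Markov chain but does not give the exact reliability formula; one common way to obtain a number is the classic Cheung-style architecture-based model sketched below, used here as an assumed illustration rather than the paper's stated procedure. The three-component transfer probabilities and reliabilities are invented for the example.

    ```python
    import numpy as np

    def architecture_reliability(P, R, start=0, end=-1):
        """Architecture-based reliability from a discrete-time Markov chain.

        P: (n, n) transfer-of-control probabilities between components.
        R: length-n individual component reliabilities.
        Each transfer succeeds with the source component's reliability, and
        the system reliability is the probability of reaching the final
        component and executing it correctly (Cheung-style assumption).
        """
        P = np.asarray(P, dtype=float)
        R = np.asarray(R, dtype=float)
        n = P.shape[0]
        end = end % n
        Phat = R[:, None] * P                    # successful transfers only
        S = np.linalg.inv(np.eye(n) - Phat)      # expected successful visits
        return S[start, end] * R[end]

    # Illustrative 3-component tool: acquire -> parse -> report
    P = [[0.0, 1.0, 0.0],
         [0.1, 0.0, 0.9],     # parser loops back to acquisition 10% of the time
         [0.0, 0.0, 0.0]]
    R = [0.99, 0.97, 0.995]
    print(architecture_reliability(P, R))
    ```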

  9. [Study on ITS sequences of Aconitum vilmorinianum and its medicinal adulterant].

    PubMed

    Zhang, Xiao-nan; Du, Chun-hua; Fu, De-huan; Gao, Li; Zhou, Pei-jun; Wang, Li

    2012-09-01

    To analyze and compare the ITS sequences of Aconitum vilmorinianum and its medicinal adulterant Aconitum austroyunnanense. Total genomic DNA was extracted from sample materials by an improved CTAB method; ITS sequences were amplified by PCR, directly sequenced, and analyzed with the software DNAStar, ClustalX 1.81 and MEGA 4.0. The ITS1 sequences contained 299 consistent sites, 19 variable sites and 13 informative sites; the 5.8S sequences contained 162 consistent sites, 2 variable sites and 1 informative site; and the ITS2 sequences contained 217 consistent sites, 3 variable sites and 1 informative site. Comparing the ITS sequence data matrix, no base transitions or transversions were found in the 5.8S sequences, two transition sites and one transversion site were found in the ITS1 sequences, and only one transversion site was found in the ITS2 sequences. By analyzing the ITS sequence data matrix from two populations of Aconitum vilmorinianum and three populations of Aconitum austroyunnanense, we found a stable informative site at the 596th base of the ITS2 sequences: in all samples of Aconitum vilmorinianum the base was C, and in all samples of Aconitum austroyunnanense the base was A. Aconitum vilmorinianum and Aconitum austroyunnanense can therefore be identified by the characters of their ITS sequences, and the ITS1 sequences contain more variable sites than the ITS2 sequences.

  10. Bacterial burden in the operating room: impact of airflow systems.

    PubMed

    Hirsch, Tobias; Hubert, Helmine; Fischer, Sebastian; Lahmer, Armin; Lehnhardt, Marcus; Steinau, Hans-Ulrich; Steinstraesser, Lars; Seipp, Hans-Martin

    2012-09-01

    Wound infections present one of the most prevalent and frequent complications associated with surgical procedures. This study analyzes the impact of currently used ventilation systems in the operating room to reduce bacterial contamination during surgical procedures. Four ventilation systems (window-based ventilation, supported air nozzle canopy, low-turbulence displacement airflow, and low-turbulence displacement airflow with flow stabilizer) were analyzed. Two hundred seventy-seven surgical procedures in 6 operating rooms of 5 different hospitals were analyzed for this study. Window-based ventilation showed the highest intraoperative contamination (13.3 colony-forming units [CFU]/h) followed by supported air nozzle canopy (6.4 CFU/h; P = .001 vs window-based ventilation) and low-turbulence displacement airflow (3.4 and 0.8 CFU/h; P < .001 vs window-based ventilation and supported air nozzle canopy). The highest protection was provided by the low-turbulence displacement airflow with flow stabilizer (0.7 CFU/h), which showed a highly significant difference compared with the best supported air nozzle canopy theatre (3.9 CFU/h; P < .001). Furthermore, this system showed no increase of contamination in prolonged durations of surgical procedures. This study shows that intraoperative contamination can be significantly reduced by the use of adequate ventilation systems. Copyright © 2012 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.

  11. A communication-theory based view on telemedical communication.

    PubMed

    Schall, Thomas; Roeckelein, Wolfgang; Mohr, Markus; Kampshoff, Joerg; Lange, Tim; Nerlich, Michael

    2003-01-01

    Communication theory based analysis sheds new light on the use of health telematics. This analysis of structures in electronic medical communication shows communicative structures with special features. Current and evolving telemedical applications are analyzed. The methodology of communicational theory (focusing on linguistic pragmatics) is used to compare it with its conventional counterpart. The semiotic model, the roles of partners, the respective message and their relation are discussed. Channels, sender, addressee, and other structural roles are analyzed for different types of electronic medical communication. The communicative processes are shown as mutual, rational action towards a common goal. The types of communication/texts are analyzed in general. Furthermore the basic communicative structures of medical education via internet are presented with their special features. The analysis shows that electronic medical communication has special features compared to everyday communication: A third participant role often is involved: the patient. Messages often are addressed to an unspecified partner or to an unspecified partner within a group. Addressing in this case is (at least partially) role-based. Communication and message often directly (rather than indirectly) influence actions of the participants. Communication often is heavily regulated including legal implications like liability, and more. The conclusion from the analysis is that the development of telemedical applications so far did not sufficiently take communicative structures into consideration. Based on these results recommendations for future developments of telemedical applications/services are given.

  12. Influence of air temperature on the first flowering date of Prunus yedoensis Matsum

    PubMed Central

    Shi, Peijian; Chen, Zhenghong; Yang, Qingpei; Harris, Marvin K; Xiao, Mei

    2014-01-01

    Climate change is expected to have a significant effect on the first flowering date (FFD) of plants flowering in early spring. Prunus yedoensis Matsum is a good model plant for analyzing this effect. In this study, we used a degree day model to analyze the effect of air temperatures on the FFDs of P. yedoensis at Wuhan University using a long time series from 1951 to 2012. First, the starting date (=7 February) is determined according to the lowest correlation coefficient between the FFD and the daily average accumulated degree days (ADD). Second, the base temperature (=−1.2°C) is determined according to the lowest root mean square error (RMSE) between the observed and predicted FFDs based on the mean of the 62-year ADDs. Finally, based on this combination of starting date and base temperature, the daily average ADD of every year was calculated. Performing a linear fit of the daily average ADD to year, we find an increasing trend that indicates climate warming from a biological climatic indicator. In addition, using the generalized additive model we find that the minimum annual temperature also has a significant effect on the FFD of P. yedoensis. This study provides a method for analyzing the effect of climate change on the FFD of plants flowering in early spring. PMID:24558585
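
    The degree-day bookkeeping described here is simple enough to sketch: daily mean temperatures are accumulated above the reported base temperature of -1.2 °C starting from the reported 7 February starting date, and the FFD is predicted as the day the long-term mean ADD requirement is reached. The temperature series and the threshold value below are illustrative placeholders for the study's 62-year mean ADD.

    ```python
    def accumulated_degree_days(daily_mean_temps, base_temp=-1.2):
        """Running accumulated degree days (ADD) above `base_temp`.

        daily_mean_temps: daily mean air temperatures starting on 7 February
        (the starting date reported in the study). Returns the running ADD.
        """
        add, running = [], 0.0
        for t in daily_mean_temps:
            running += max(t - base_temp, 0.0)
            add.append(running)
        return add

    def predict_ffd(daily_mean_temps, required_add, base_temp=-1.2):
        """Day index (0 = 7 February) on which the ADD first reaches the requirement."""
        for day, total in enumerate(accumulated_degree_days(daily_mean_temps, base_temp)):
            if total >= required_add:
                return day
        return None

    # Illustrative series: a slow linear warm-up from 3 degrees C; the threshold
    # is assumed, not the 62-year mean reported in the paper.
    temps = [3.0 + 0.15 * d for d in range(80)]
    print(predict_ffd(temps, required_add=250.0))
    ```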

  13. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  14. 340nm UV LED excitation in time-resolved fluorescence system for europium-based immunoassays detection

    NASA Astrophysics Data System (ADS)

    Rodenko, Olga; Fodgaard, Henrik; Tidemand-Lichtenberg, Peter; Pedersen, Christian

    2017-02-01

    In immunoassay analyzers for in-vitro diagnostics, Xenon flash lamps have been widely used as excitation light sources. Recent advancements in UV LED technology and its advantages over the flash lamps such as smaller footprint, better wall-plug efficiency, narrow emission spectrum, and no significant afterglow, have made them attractive light sources for gated detection systems. In this paper, we report on the implementation of a 340 nm UV LED based time-resolved fluorescence system based on europium chelate as a fluorescent marker. The system performance was tested with the immunoassay based on the cardiac marker, TnI. The same signal-to-noise ratio as for the flash lamp based system was obtained, operating the LED below specified maximum current. The background counts of the system and its main contributors were measured and analyzed. The background of the system of the LED based unit was improved by 39% compared to that of the Xenon flash lamp based unit, due to the LEDs narrower emission spectrum and longer pulse width. Key parameters of the LED system are discussed to further optimize the signal-to-noise ratio and signal-to-background, and hence the sensitivity of the instrument.

  15. A Black Theological Response to Race-Based Medicine: Reconciliation in Minority Communities.

    PubMed

    Johnson, Kirk A

    2017-06-01

    The harm race-based medicine inflicts on minority bodies through race-based experimentation and the false solutions a race-based drug ensues within minority communities provokes concern. Such areas analyze the minority patient in a physical proxy. Though the mind and body are important entities, we cannot forget about the spirit. Healing is not just a physical practice; it includes spiritual practice. Efficient medicine includes the holistic elements of the mind, body, and spirit. Therefore, the spiritual discipline of black theology can be used as a tool to mend the harms of race-based medicine. It can be an avenue of research to further particular concerns for justice in medical care . Such theology contributes to the discussion of race-based medicine indicating the need for the voice, participation, and interdependence of minorities. Black theology can be used as a tool of healing and empowerment for health equity and awareness by exploring black theology's response to race-based medicine, analyzing race in biblical literature, using biblical literature as a tool for minority patient empowerment, building on past and current black church health advocacy with personal leadership in health advocacy.

  16. Quantitative comparison of cognitive behavioral therapy and music therapy research: a methodological best-practices analysis to guide future investigation for adult psychiatric patients.

    PubMed

    Silverman, Michael J

    2008-01-01

    While the music therapy profession is relatively young and small in size, it can treat a variety of clinical populations and has established a diverse research base. However, although the profession originated working with persons diagnosed with mental illnesses, there is a considerable lack of quantitative research concerning the effects of music therapy with this population. Music therapy clinicians and researchers have reported on this lack of evidence and the difficulty in conducting psychosocial research on their interventions (Choi, 1997; Silverman, 2003a). While published studies have provided suggestions for future research, no studies have provided detailed propositions for the methodology and design of meticulous, high-quality randomized controlled psychiatric music therapy research. How have other psychotherapies built their research bases, and could the music therapy field borrow from their rigorous "methodological best practices" to strengthen its own literature base? Therefore, as the National Institute of Mental Health states that the treatment of choice in evidence-based psychotherapy is cognitive behavioral therapy (CBT), aspects of this psychotherapy's literature base were analyzed. The purpose of this literature analysis was to (a) analyze and identify components of high-quality quantitative CBT research for adult psychiatric consumers, (b) analyze and identify the variables and other elements of existing quantitative psychiatric music therapy research for adult consumers, and (c) compare the two data sets to identify the best methodological designs and variables for future quantitative music therapy research with the mental health population. A table analyzing randomized and thoroughly controlled studies involving the use of CBT for persons with severe mental illnesses is included to determine chief components of high-quality experimental research designs and implementation of quantitative clinical research. The table also shows the same analyzed components for existing quantitative psychiatric music therapy research with adult consumers, thus highlighting potential areas and elements for future investigations. A second table depicts a number of potential dependent measures and their sources to be evaluated in future music therapy studies. A third table providing suggestions for future research is derived from a synthesis of the tables and is included to guide researchers and encourage the advancement and expansion of the current literature base. The body of the paper is a discussion of the results of the literature analysis derived from the tables, meta-analyses, and reviews of literature. It is hoped that this report will lead to the addition of future high-quality quantitative research to the psychiatric music therapy literature base and thus provide evidence-based services to as many persons with mental illnesses as possible.

  17. Investigation of International Space Station Major Constituent Analyzer Anomalous ORU 02 Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Burchfield, David E.; Pargellis, Andrew; Erwin, Phillip M.; Thoresen, Souzan; Gentry, Grey; Granahan, John; Matty, Chris

    2013-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. In 2011, two MCA ORU 02 analyzer assemblies experienced premature on-orbit failures. These failures were determined to be the result of off-nominal ion source filament performance. Recent product improvements to ORU 02 designed to improve the lifetime of the ion pump also constrained the allowable tuning criteria for the ion source filaments. This presentation describes the filament failures as well as the corrective actions implemented to preclude such failures in the future.

  18. A method of power analysis based on piecewise discrete Fourier transform

    NASA Astrophysics Data System (ADS)

    Xin, Miaomiao; Zhang, Yanchi; Xie, Da

    2018-04-01

    The paper analyzes existing feature extraction methods. The characteristics of the discrete Fourier transform and of piecewise aggregation approximation are examined and, by combining the advantages of the two methods, a new piecewise discrete Fourier transform is proposed. The method is then used to analyze the lighting load of a large customer. Time series feature maps for four different cases are compared across the original data, the discrete Fourier transform, piecewise aggregation approximation, and the piecewise discrete Fourier transform. The new method reflects both the overall trend of the electricity consumption and its internal variation in electrical analysis.
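
    Read this way, the proposed feature is a DFT applied piecewise: the load series is split into equal segments, as in piecewise aggregation approximation, and a truncated DFT of each segment is kept, so both the overall trend and the local variation survive. The sketch below assumes that reading; the segment count, number of retained coefficients and the simulated load curve are illustrative.

    ```python
    import numpy as np

    def piecewise_dft(series, n_segments=4, n_coeffs=3):
        """Piecewise discrete Fourier transform features for a load curve.

        The series is cut into `n_segments` equal pieces and the magnitudes of
        the first `n_coeffs` DFT coefficients of each piece are kept, so the
        feature vector reflects both local and overall variation.
        """
        x = np.asarray(series, dtype=float)
        segments = np.array_split(x, n_segments)
        features = []
        for seg in segments:
            spectrum = np.fft.rfft(seg)
            features.extend(np.abs(spectrum[:n_coeffs]))
        return np.array(features)

    # Illustrative daily lighting-load curve sampled every 15 minutes (96 points)
    t = np.arange(96)
    load = 50 + 30 * np.sin(2 * np.pi * t / 96) + 5 * np.random.default_rng(3).standard_normal(96)
    print(piecewise_dft(load).shape)   # (n_segments * n_coeffs,) = (12,)
    ```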

  19. Science opportunity analyzer - a multi-mission approach to science planning

    NASA Technical Reports Server (NTRS)

    Streiffert, B. A.; Polanskey, C. A.; O'Reilly, T.; Colwell, J.

    2003-01-01

    In the past, science planning for space missions has relied on ad-hoc software tools collected or reconstructed from previous missions, tools used by other groups who often speak a different 'technical' language, or even 'the backs of envelopes'. In addition to the tools being rough, the work done with these tools often has had to be redone, or at least re-entered, when it came time to determine actual observations. Science Opportunity Analyzer (SOA), a Java-based application, has been built to enable scientists to identify and analyze observation opportunities and then to create corresponding observation designs.

  20. Micro-foundations for macroeconomics: New set-up based on statistical physics

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Hiroshi

    2016-12-01

    Modern macroeconomics is built on "micro foundations": the optimization of micro agents such as consumers and firms is explicitly analyzed in the model. Toward this goal, the standard model presumes "the representative" consumer/firm and analyzes its behavior in detail. However, the macroeconomy consists of 10^7 consumers and 10^6 firms. For the purpose of analyzing such a macro system, it is meaningless to pursue the micro behavior in detail. In this respect, there is no essential difference between economics and physics. The method of statistical physics can be usefully applied to the macroeconomy, and provides Keynesian economics with correct micro-foundations.

  1. Combining model based and data based techniques in a robust bridge health monitoring algorithm.

    DOT National Transportation Integrated Search

    2014-09-01

    Structural Health Monitoring (SHM) aims to analyze civil, mechanical and aerospace systems in order to assess : incipient damage occurrence. In this project, we are concerned with the development of an algorithm within the : SHM paradigm for applicat...

  2. An individual-based simulation model for mottled sculpin (Cottus bairdi) in a southern Appalachian stream

    Treesearch

    Brenda Rashleigh; Gary D. Grossman

    2005-01-01

    We describe and analyze a spatially explicit, individual-based model for the local population dynamics of mottled sculpin (Cottus bairdi). The model simulated daily growth, mortality, movement and spawning of individuals within a reach of stream. Juvenile and adult growth was based on consumption bioenergetics of benthic macroinvertebrate prey;...

  3. Work-Based Learning: Good News, Bad News and Hope. Research Brief.

    ERIC Educational Resources Information Center

    Bottoms, Gene; Presson, Alice

    The effects of work-based learning on student achievement were examined by analyzing data from the 1996 High Schools That Work (HSTW) assessment. The comparison focused on the experiences of 12th-graders in structured work-based learning programs and 12th-graders with after-school jobs. A larger percentage of students earning school credit for…

  4. Preparing Teachers for Diversity: A Literature Review and Implications from Community-Based Teacher Education

    ERIC Educational Resources Information Center

    Yuan, Huanshu

    2018-01-01

    This study reviewed current issues in preparing qualified teachers for increasingly diverse student populations in the U.S. and in other multicultural and multiethnic countries. Based on the framework of community-based and multicultural teacher education, this literature review paper analyzed issues and problems that exist in the current curriculum,…

  5. Text-Based On-Line Conferencing: A Conceptual and Empirical Analysis Using a Minimal Prototype.

    ERIC Educational Resources Information Center

    McCarthy, John C.; And Others

    1993-01-01

    Analyzes requirements for text-based online conferencing through the use of a minimal prototype. Topics discussed include prototyping with a minimal system; text-based communication; the system as a message passer versus the system as a shared data structure; and three exercises that showed how users worked with the prototype. (Contains 61…

  6. Critical Success Factors in Teaching Strategic Sales Management: Evidence from Client-Based Classroom and Web-Based Formats

    ERIC Educational Resources Information Center

    Jaskari, Harri; Jaskari, Minna-Maarit

    2016-01-01

    The importance of sales management as an interface between a company and its customers is widely recognized. However, the teaching of strategic sales management has not received enough attention in marketing education literature. This study analyzes an experiential client-based method for teaching a strategic sales management course. The authors…

  7. Cream-Skimming, Parking and Other Intended and Unintended Effects of High-Powered, Performance-Based Contracts

    ERIC Educational Resources Information Center

    Koning, Pierre; Heinrich, Carolyn J.

    2013-01-01

    As performance-based contracting in social welfare services continues to expand, concerns about potential unintended effects are also growing. We analyze the incentive effects of high-powered, performance-based contracts and their implications for program outcomes using panel data on Dutch cohorts of unemployed and disabled workers that were…

  8. School Sector and Student Achievement in the Era of Standards Based Reforms

    ERIC Educational Resources Information Center

    Carbonaro, William; Covay, Elizabeth

    2010-01-01

    The authors examine whether standards based accountability reforms of the past two decades have closed the achievement gap among public and private high school students. They analyzed data from the Education Longitudinal Study (ELS) to examine sector differences in high school achievement in the era of standards based reforms. The authors found…

  9. Web-Based Learning Programs: Use by Learners with Various Cognitive Styles

    ERIC Educational Resources Information Center

    Chen, Ling-Hsiu

    2010-01-01

    To consider how Web-based learning program is utilized by learners with different cognitive styles, this study presents a Web-based learning system (WBLS) and analyzes learners' browsing data recorded in the log file to identify how learners' cognitive styles and learning behavior are related. In order to develop an adapted WBLS, this study also…

  10. Designing a Structured and Interactive Learning Environment Based on GIS for Secondary Geography Education

    ERIC Educational Resources Information Center

    Liu, Suxia; Zhu, Xuan

    2008-01-01

    Geographic information systems (GIS) are computer-based tools for geographic data analysis and spatial visualization. They have become one of the information and communications technologies for education at all levels. This article reviews the current status of GIS in schools, analyzes the requirements of a GIS-based learning environment from…

  11. Resonant transition-based quantum computation

    NASA Astrophysics Data System (ADS)

    Chiang, Chen-Fu; Hsieh, Chang-Yu

    2017-05-01

    In this article we assess a novel quantum computation paradigm based on the resonant transition (RT) phenomenon commonly associated with atomic and molecular systems. We thoroughly analyze the intimate connections between the RT-based quantum computation and the well-established adiabatic quantum computation (AQC). Both quantum computing frameworks encode solutions to computational problems in the spectral properties of a Hamiltonian and rely on the quantum dynamics to obtain the desired output state. We discuss how one can adapt any adiabatic quantum algorithm to a corresponding RT version and the two approaches are limited by different aspects of Hamiltonians' spectra. The RT approach provides a compelling alternative to the AQC under various circumstances. To better illustrate the usefulness of the novel framework, we analyze the time complexity of an algorithm for 3-SAT problems and discuss straightforward methods to fine tune its efficiency.

  12. [Modeling and analysis of volume conduction based on field-circuit coupling].

    PubMed

    Tang, Zhide; Liu, Hailong; Xie, Xiaohui; Chen, Xiufa; Hou, Deming

    2012-08-01

    Numerical simulations of volume conduction can be used to analyze the process of energy transfer and explore the effects of some physical factors on energy transfer efficiency. We analyzed the 3D quasi-static electric field by the finite element method, and developed a 3D coupled field-circuit model of volume conduction based on the coupling between the circuit and the electric field. The model includes a circuit simulation of the volume conduction to provide direct theoretical guidance for energy transfer optimization design. A field-circuit coupling model with circular cylinder electrodes was established on the platform of the software FEM3.5. Based on this, the effects of electrode cross section area, electrode distance and circuit parameters on the performance of the volume conduction system were obtained, which provided a basis for the optimized design of energy transfer efficiency.

  13. Fourier transform mass spectrometry.

    PubMed

    Scigelova, Michaela; Hornshaw, Martin; Giannakopulos, Anastassios; Makarov, Alexander

    2011-07-01

    This article provides an introduction to Fourier transform-based mass spectrometry. The key performance characteristics of Fourier transform-based mass spectrometry, mass accuracy and resolution, are presented in the view of how they impact the interpretation of measurements in proteomic applications. The theory and principles of operation of two types of mass analyzer, Fourier transform ion cyclotron resonance and Orbitrap, are described. Major benefits as well as limitations of Fourier transform-based mass spectrometry technology are discussed in the context of practical sample analysis, and illustrated with examples included as figures in this text and in the accompanying slide set. Comparisons highlighting the performance differences between the two mass analyzers are made where deemed useful in assisting the user with choosing the most appropriate technology for an application. Recent developments of these high-performing mass spectrometers are mentioned to provide a future outlook.

  14. Fourier Transform Mass Spectrometry

    PubMed Central

    Scigelova, Michaela; Hornshaw, Martin; Giannakopulos, Anastassios; Makarov, Alexander

    2011-01-01

    This article provides an introduction to Fourier transform-based mass spectrometry. The key performance characteristics of Fourier transform-based mass spectrometry, mass accuracy and resolution, are presented in the view of how they impact the interpretation of measurements in proteomic applications. The theory and principles of operation of two types of mass analyzer, Fourier transform ion cyclotron resonance and Orbitrap, are described. Major benefits as well as limitations of Fourier transform-based mass spectrometry technology are discussed in the context of practical sample analysis, and illustrated with examples included as figures in this text and in the accompanying slide set. Comparisons highlighting the performance differences between the two mass analyzers are made where deemed useful in assisting the user with choosing the most appropriate technology for an application. Recent developments of these high-performing mass spectrometers are mentioned to provide a future outlook. PMID:21742802

  15. Engagement Assessment Using EEG Signals

    NASA Technical Reports Server (NTRS)

    Li, Feng; Li, Jiang; McKenzie, Frederic; Zhang, Guangfan; Wang, Wei; Pepe, Aaron; Xu, Roger; Schnell, Thomas; Anderson, Nick; Heitkamp, Dean

    2012-01-01

    In this paper, we present methods to analyze and improve an EEG-based engagement assessment approach, consisting of data preprocessing, feature extraction and engagement state classification. During data preprocessing, spikes, baseline drift and saturation caused by recording devices in EEG signals are identified and eliminated, and a wavelet based method is utilized to remove ocular and muscular artifacts in the EEG recordings. In feature extraction, power spectrum densities with 1 Hz bins are calculated as features, and these features are analyzed using the Fisher score and the one-way ANOVA method. In the classification step, a committee classifier is trained based on the extracted features to assess engagement status. Finally, experimental results showed that there are significant differences in the extracted features among different subjects, and we implemented a feature normalization procedure to mitigate the differences, which significantly improved the engagement assessment performance.
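
    The 1 Hz-bin power spectral densities can be sketched with SciPy's Welch estimator, since using nperseg equal to the sampling rate gives a 1 Hz frequency resolution; the channel count, sampling rate, band limits and the plain between/within-class Fisher score below are illustrative assumptions rather than the paper's exact settings.

    ```python
    import numpy as np
    from scipy.signal import welch

    def psd_features(eeg, fs=256, fmax=40):
        """Power-spectral-density features in 1 Hz bins for one EEG epoch.

        eeg: array (n_channels, n_samples). nperseg = fs gives a 1 Hz
        frequency resolution; the 40 Hz cut-off is an assumption.
        """
        freqs, pxx = welch(eeg, fs=fs, nperseg=fs, axis=-1)
        keep = (freqs >= 1) & (freqs <= fmax)
        return pxx[:, keep].ravel()      # one feature per channel per 1 Hz bin

    def fisher_score(features, labels):
        """Fisher score of each feature for a two-class engagement problem."""
        f, y = np.asarray(features), np.asarray(labels)
        m0, m1 = f[y == 0].mean(axis=0), f[y == 1].mean(axis=0)
        v0, v1 = f[y == 0].var(axis=0), f[y == 1].var(axis=0)
        return (m0 - m1) ** 2 / (v0 + v1 + 1e-12)

    # Illustrative data: 20 epochs of 8-channel, 4-second recordings at 256 Hz
    rng = np.random.default_rng(4)
    feats = np.stack([psd_features(rng.standard_normal((8, 4 * 256))) for _ in range(20)])
    labels = np.array([0] * 10 + [1] * 10)
    print(feats.shape, fisher_score(feats, labels).max())
    ```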

  16. "¿Qué Estoy Haciendo Aquí? (What Am I Doing Here?)": Chicanos/Latinos(as) Navigating Challenges and Inequalities During Their First Year of Graduate School

    ERIC Educational Resources Information Center

    Ramirez, Elvia

    2014-01-01

    Based on in-depth qualitative interviews, this study analyzed the challenges and structural inequities that Chicanos/Latinos(as) encountered and resisted during their first year of graduate school. Grounded in intersectionality theory, this study analyzed how race, class, and gender inequalities that are embedded in the graduate schooling process…

  17. The Nature of the Current and Anticipated Shortage of Professional Skills and Qualities of Workers in the Russian Labor Market

    ERIC Educational Resources Information Center

    Bondarenko, Natal'ia

    2015-01-01

    This article analyzes the characteristics of the existing and expected deficit of professional skills and qualities of workers employed by Russian companies. Analyzed factors include generic skills, soft or behavioral skills, the influence of the qualification, and shortage on the effectiveness of the companies as a whole. The data is based on a…

  18. The Complexity Analysis Tool

    DTIC Science & Technology

    1988-10-01

    Overview of the Complexity Analysis Tool (CAT), an automated tool which will analyze mission critical computer resources (MCCR) software. CAT is based... CAT automates the metric for BASIC (HP-71), ATLAS (EQUATE), Ada (subset...UNIX 5.2). CAT analyzes source code and computes complexity on a module basis. CAT also generates graphic representations of the logic flow paths and

  19. How Teachers Use and Manage Their Blogs? A Cluster Analysis of Teachers' Blogs in Taiwan

    ERIC Educational Resources Information Center

    Liu, Eric Zhi-Feng; Hou, Huei-Tse

    2013-01-01

    The development of Web 2.0 has ushered in a new set of web-based tools, including blogs. This study focused on how teachers use and manage their blogs. A sample of 165 teachers' blogs in Taiwan was analyzed by factor analysis, cluster analysis and qualitative content analysis. First, the teachers' blogs were analyzed according to six criteria…

  20. [Current Situation and Prospects of Emergency Medical Equipment in Our Country].

    PubMed

    Qi, Lijing; Cheng, Feng

    2016-03-01

    This article analyzes the new demand of emergency medical equipment in the current development trend based on the analysis of the development and current situation of emergency medicine in our country. At the same time it introduces the current industrial characteristics of our country. Finally it analyzes the development trend of this kind of equipment in the new emergency medicine field.

  1. Remote sensing of chemical warfare agent by CO2 -lidar

    NASA Astrophysics Data System (ADS)

    Geiko, Pavel P.; Smirnov, Sergey S.

    2014-11-01

    The possibilities of remote sensing of chemical warfare agents by the differential absorption method were analyzed. The CO2-laser emission lines suitable for sounding chemical warfare agents, allowing for the interfering absorption by water vapor, were chosen. The detection range of chemical warfare agents was estimated for a lidar based on a CO2 laser. Other factors influencing the echolocation range were also analyzed.

  2. Using Item-Type Performance Covariance to Improve the Skill Model of an Existing Tutor

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Cen, Hao; Wu, Lili; Koedinger, Kenneth R.

    2008-01-01

    Using data from an existing pre-algebra computer-based tutor, we analyzed the covariance of item-types with the goal of describing a more effective way to assign skill labels to item-types. Analyzing covariance is important because it allows us to place the skills in a related network in which we can identify the role each skill plays in learning…

  3. Development of an Assay for the Detection of PrPres in Blood and Urine Based on PMCA Assay and ELISA Methods

    DTIC Science & Technology

    2008-09-01

    the Origen Analyzer (BioVeris), the DELFIA (Wallac/PE) and the MPD ELISA ( BioTraces ). BioTraces had the most sensitive assay in which 125I was used...investigations we decided to abandon the BioTraces assay and focused on a more practical and also sensitive assay provided by the Origen Analyzer

  4. Ground based and airborne atmospheric measurements near Bucharest

    NASA Astrophysics Data System (ADS)

    Nemuc, Anca; Boscornea, Andreea; Belegante, Livio; Vasilescu, Jeni; Vajaiac, Sorin; Ene, Dragos; Marmureanu, Luminita; Andrei, Simona

    2018-04-01

    This paper presents the results from a coordinated approach for atmospheric investigation, exploring synergies between different techniques. A wide range of instruments have been used during an intensive measurement period both from ground (lidar, sunphotometer, aethalometer and Aerosol Chemical Speciation Monitor) and airborne (aerodynamic particle sizer, the Picarro gas analyzer and the NO2 CAPS analyzer) in 2016 over Magurele, 6 km South West of Bucharest.

  5. Fuels planning: science synthesis and integration; environmental consequences fact sheet 13: Root Disease Analyzer-Armillaria Response Tool (ART)

    Treesearch

    Geral I. McDonald; Philip D. Tanimoto; Thomas M. Rice; David E. Hall; Jane E. Stewart; Paul J. Zambino; Jonalea R. Tonn; Ned B. Klopfenstein; Mee-Sook Kim

    2005-01-01

    The Root Disease Analyzer-Armillaria Response Tool (ART) is a Web-based tool that estimates Armillaria root disease risk in dry forests of the Western United States. This fact sheet identifies the intended users and uses, required inputs, what the model does and does not do, and tells the user how to obtain the model.

  6. Equal Pay for Working Families. National and State Data on the Pay Gap and Its Costs. A Joint Research Project.

    ERIC Educational Resources Information Center

    Hartmann, Heidi; Allen, Katherine; Owens, Christine

    A national study, including state-by-state breakouts, analyzed Census Bureau and Bureau of Labor Statistics data to explore the wage gap. Median weekly earnings of men and women and of minorities and nonminorities were analyzed. Gender-based earnings differences and gender wage gaps were large for all women--and especially large for minority…

  7. Statistical analysis of DOE EML QAP data from 1982 to 1998.

    PubMed

    Mizanur Rahman, G M; Isenhour, T L; Larget, B; Greenlaw, P D

    2001-01-01

    The historical database from the Environmental Measurements Laboratory's Quality Assessment Program from 1982 to 1998 has been analyzed to determine control limits for future performance evaluations of the different laboratories contracted to the U.S. Department of Energy. Seventy-three radionuclides in four different matrices (air filter, soil, vegetation, and water) were analyzed. The evaluation criteria were established based on a z-score calculation.
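
    The z-score criterion referred to is the standard one, in which a laboratory's reported value is compared with the reference value and scaled by the evaluation standard deviation; a minimal sketch under that assumption follows, with invented numbers and with commonly used acceptance bands rather than the control limits derived in the paper.

    ```python
    def z_score(reported, reference, sigma):
        """Standard performance-evaluation z-score: (reported - reference) / sigma."""
        return (reported - reference) / sigma

    def evaluate(z):
        """Common acceptance bands (|z| <= 2 acceptable, 2 < |z| <= 3 warning);
        the exact control limits derived in the paper may differ."""
        a = abs(z)
        return "acceptable" if a <= 2 else "warning" if a <= 3 else "not acceptable"

    # Illustrative: a lab reports 10.8 Bq/kg against a 10.0 Bq/kg reference, sigma 0.5
    z = z_score(10.8, 10.0, 0.5)
    print(z, evaluate(z))   # 1.6 acceptable
    ```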

  8. Realist Ontology and Natural Processes: A Semantic Tool to Analyze the Presentation of the Osmosis Concept in Science Texts

    ERIC Educational Resources Information Center

    Spinelli Barria, Michele; Morales, Cecilia; Merino, Cristian; Quiroz, Waldo

    2016-01-01

    In this work, we developed an ontological tool, based on the scientific realism of Mario Bunge, for the analysis of the presentation of natural processes in science textbooks. This tool was applied to analyze the presentation of the concept of osmosis in 16 chemistry and biology books at different educational levels. The results showed that more…

  9. Job Loss and Other Factors Behind the Recent Increase in Unemployment. Report No. 446.

    ERIC Educational Resources Information Center

    Flaim, Paul O.; Gilroy, Curtis L.

    Based on data assembled by the Bureau of Labor Statistics, the document analyzes the unemployment increase in terms of job leavers, re-entrants and new entrants into the job market, and job losers. The period analyzed runs from the fourth quarter of 1973, to the first three months of 1975. Data breakdown is by sex, race, and age, with the focus…

  10. Preliminary results on complex ceramic layers deposition by atmospheric plasma spraying

    NASA Astrophysics Data System (ADS)

    Florea, Costel; Bejinariu, Costicǎ; Munteanu, Corneliu; Cimpoeşu, Nicanor

    2017-04-01

    In this article we obtain thin layers from complex ceramic powders using industrial equipment based on atmospheric plasma spraying. We analyze the influence of the substrate material roughness on the quality of the thin layers using scanning electron microscopy (SEM) and energy dispersive X-ray analysis (EDAX). Preliminary results show an important dependence between the surface state and the structural and chemical homogeneity.

  11. Raman mediated all-optical cascadable inverter using silicon-on-insulator waveguides.

    PubMed

    Sen, Mrinal; Das, Mukul K

    2013-12-01

    In this Letter, we propose an all-optical circuit for a cascadable and integrable logic inverter based on stimulated Raman scattering. A maximum-product criterion for the noise margin is used to analyze the cascadability of the inverter. The variation of the noise margin for different model parameters is also studied. Finally, the time domain response of the inverter is analyzed for different widths of input pulses.

  12. Evaluation of Portable Multi-Gas Analyzers for use by Safety Personnel

    NASA Technical Reports Server (NTRS)

    Lueck, D. E.; Meneghelli, B. J.; Bardel, D. N.

    1998-01-01

    During confined space entry operations as well as Shuttle-safing operations, United Space Alliance (USA)/National Aeronautics and Space Administration (NASA) safety personnel use a variety of portable instrumentation to monitor for hazardous levels of compounds such as nitrogen dioxide (NO2), monomethylhydrazine (MMH), FREON 21, ammonia (NH3), oxygen (O2), and combustibles (as hydrogen (H2)). Except for O2 and H2, each compound is monitored using a single analyzer. In many cases these analyzers are 5 to 10 years old and require frequent maintenance. In addition, they are cumbersome to carry and tend to make the job of personnel monitoring physically taxing. As part of an effort to upgrade the sensor technology, background information was requested from a total of 27 manufacturers of portable multi-gas instruments. A set of criteria was established to determine which vendors would be selected for laboratory evaluation. These criteria were based on requests made by USA/NASA Safety personnel in order to meet requirements within their respective areas for confined-space and Shuttle-safing operations. Each of the 27 manufacturers of multi-gas analyzers was sent a copy of the criteria and asked to fill in the appropriate information pertaining to their instrumentation. Based on the results of the sensor criteria worksheets, a total of 9 vendors out of the 27 surveyed manufacturers were chosen for evaluation. Each vendor included in the final evaluation process was requested to configure each of two analyzers with NO2, NH3, O2, and combustible sensors. A set of lab tests was designed in order to determine which of the multi-gas instruments under evaluation was best suited for use in both Shuttle and confined space operations. These tests included linearity/repeatability, zero/span drift, response/recovery, humidity, interference, and maintenance. At the conclusion of lab testing, three vendors were selected for additional field testing. Based on the results of both the lab and field evaluations, a single vendor was recommended for use by NASA/USA Safety personnel. Vendor selection criteria, as well as the results from both laboratory and field testing of the multi-gas analyzers, are presented as part of this paper.

  13. Evapotranspiration variability and its association with vegetation dynamics in the Nile Basin, 2002–2011

    USGS Publications Warehouse

    Alemu, Henok; Senay, Gabriel B.; Kaptue, Armel T.; Kovalskyy, Valeriy

    2014-01-01

    Evapotranspiration (ET) is a vital component in land-atmosphere interactions. In drylands, over 90% of annual rainfall evaporates. The Nile Basin in Africa is about 42% dryland in a region experiencing rapid population growth and development. The relationship of ET with climate, vegetation and land cover in the basin during 2002–2011 is analyzed using thermal-based Simplified Surface Energy Balance Operational (SSEBop) ET, Normalized Difference Vegetation Index (NDVI)-based MODIS Terrestrial (MOD16) ET, MODIS-derived NDVI as a proxy for vegetation productivity and rainfall from the Tropical Rainfall Measuring Mission (TRMM). Interannual variability and trends are analyzed using established statistical methods. Analysis based on thermal-based ET revealed that >50% of the study area exhibited negative ET anomalies for 7 years (2009, driest), while >60% exhibited positive ET anomalies for 3 years (2007, wettest). NDVI-based monthly ET correlated more strongly with vegetation (r > 0.77) than thermal-based ET did (0.52 < r < 0.73) at p < 0.001. Climate-zone averaged thermal-based ET anomalies positively correlated (r = 0.6, p < 0.05) with rainfall in 4 of the 9 investigated climate zones. Thermal-based and NDVI-based ET estimates revealed minor discrepancies over rainfed croplands (60 mm/yr higher for thermal-based ET), but a significant divergence over wetlands (440 mm/yr higher for thermal-based ET). Only 5% of the study area exhibited statistically significant trends in ET.

  14. Changing R&D models in research-based pharmaceutical companies.

    PubMed

    Schuhmacher, Alexander; Gassmann, Oliver; Hinder, Markus

    2016-04-27

    New drugs serving unmet medical needs are one of the key value drivers of research-based pharmaceutical companies. The efficiency of research and development (R&D), defined as the successful approval and launch of new medicines (output) relative to the monetary investments required for R&D (input), has been declining for decades. We aimed to identify, analyze and describe the factors that impact R&D efficiency. Based on publicly available information, we reviewed the R&D models of major research-based pharmaceutical companies and analyzed the key challenges and success factors of a sustainable R&D output. We calculated that the R&D efficiencies of major research-based pharmaceutical companies were in the range of USD 3.2-32.3 billion (2006-2014). As these numbers challenge the model of an innovation-driven pharmaceutical industry, we analyzed the concepts that companies are following to increase their R&D efficiencies: (A) activities to reduce portfolio and project risk, (B) activities to reduce R&D costs, and (C) activities to increase the innovation potential. While category A comprises measures such as portfolio management and licensing, measures grouped in category B are outsourcing and risk-sharing in late-stage development. Companies have taken diverse steps to increase their innovation potential, and open innovation, exemplified by open source, innovation centers, or crowdsourcing, plays a key role in doing so. In conclusion, research-based pharmaceutical companies need to be aware of the key factors that impact the rate of innovation, R&D cost and probability of success. Depending on their company strategy and R&D set-up, they can opt for one of the following open-innovator roles: knowledge creator, knowledge integrator or knowledge leverager.

  15. Principle-based concept analysis: Caring in nursing education

    PubMed Central

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Moonaghi, Hossein Karimi; Mazloom, Seyed Reza

    2016-01-01

    Introduction The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. Methods A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, and they were retrieved through PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed by using MAXQDA 10 software. Results The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as “caring pedagogy,” “value-based education,” and “teaching excellence,” caring in nursing education does not have exact and clear conceptual boundaries (logic principle). Conclusion Caring in nursing education was identified as an approach to teaching and learning, and it is formed based on teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development. PMID:27123225

  16. Principle-based concept analysis: Caring in nursing education.

    PubMed

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Karimi Moonaghi, Hossein; Mazloom, Seyed Reza

    2016-03-01

    The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, and they were retrieved through PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed by using MAXQDA 10 software. The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as "caring pedagogy," "value-based education," and "teaching excellence," caring in nursing education does not have exact and clear conceptual boundaries (logic principle). Caring in nursing education was identified as an approach to teaching and learning, and it is formed based on teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development.

  17. Analyzing Cyber Security Threats on Cyber-Physical Systems Using Model-Based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Kerzhner, Aleksandr; Pomerantz, Marc; Tan, Kymie; Campuzano, Brian; Dinkel, Kevin; Pecharich, Jeremy; Nguyen, Viet; Steele, Robert; Johnson, Bryan

    2015-01-01

    The spectre of cyber attacks on aerospace systems can no longer be ignored given that many of the components and vulnerabilities that have been successfully exploited by the adversary on other infrastructures are the same as those deployed and used within the aerospace environment. An important consideration with respect to the mission/safety critical infrastructure supporting space operations is that an appropriate defensive response to an attack invariably involves the need for high precision and accuracy, because an incorrect response can trigger unacceptable losses involving lives and/or significant financial damage. A highly precise defensive response, considering the typical complexity of aerospace environments, requires a detailed and well-founded understanding of the underlying system where the goal of the defensive response is to preserve critical mission objectives in the presence of adversarial activity. In this paper, a structured approach for modeling aerospace systems is described. The approach includes physical elements, network topology, software applications, system functions, and usage scenarios. We leverage Model-Based Systems Engineering methodology by utilizing the Object Management Group's Systems Modeling Language to represent the system being analyzed and also utilize model transformations to change relevant aspects of the model into specialized analyses. A novel visualization approach is utilized to visualize the entire model as a three-dimensional graph, allowing easier interaction with subject matter experts. The model provides a unifying structure for analyzing the impact of a particular attack or a particular type of attack. Two different example analysis types are demonstrated in this paper: a graph-based propagation analysis based on edge labels, and a graph-based propagation analysis based on node labels.
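
    As a hedged illustration of the graph-based propagation analyses mentioned above, the sketch below propagates an effect through a small directed system graph while honoring edge labels. The graph, node names, and labels are invented for illustration and are not the authors' model or tooling.

      # Hypothetical sketch: propagate an attack effect through a system graph,
      # following only edges whose label permits propagation (e.g. "data_flow").
      import networkx as nx

      g = nx.DiGraph()
      g.add_edge("ground_station", "uplink", label="data_flow")
      g.add_edge("uplink", "flight_computer", label="data_flow")
      g.add_edge("flight_computer", "thruster_ctrl", label="command")
      g.add_edge("flight_computer", "telemetry", label="data_flow")

      def propagate(graph, start, allowed_labels):
          """Breadth-first propagation restricted to edges with allowed labels."""
          reached, frontier = {start}, [start]
          while frontier:
              node = frontier.pop()
              for _, nxt, attrs in graph.out_edges(node, data=True):
                  if attrs.get("label") in allowed_labels and nxt not in reached:
                      reached.add(nxt)
                      frontier.append(nxt)
          return reached

      # Reaches uplink, flight_computer and telemetry, but not thruster_ctrl,
      # because the "command" edge label is not in the allowed set.
      print(propagate(g, "ground_station", {"data_flow"}))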

  18. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Multi-sensor machining status monitoring acquires and analyzes machining process information to support abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. In this paper, the advantages and disadvantages of the two methods are compared, and the necessity and feasibility of integrating and fusing them are introduced. An approach that integrates multi-sensor status monitoring and statistical process control, based on artificial intelligence, internet, and database techniques, is then proposed. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and AE signal information from the wheel dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.

  19. Effects of problem-based learning by learning style in medical education.

    PubMed

    Chae, Su-Jin

    2012-12-01

    Although problem-based learning (PBL) has been popularized in many colleges, few studies have analyzed the relationship between individual differences and PBL. The purpose of this study was to analyze the relationship between learning style and the perception of the effects of PBL. The Grasha-Riechmann Student Learning Style Scales were used to assess the learning styles of 38 students at Ajou University School of Medicine who were enrolled in a respiratory system course in 2011. The data were analyzed by regression analysis and Spearman correlation analysis. By regression analysis, dependent (beta=0.478) and avoidant (beta=-0.815) styles influenced the learner's satisfaction with PBL. By Spearman correlation analysis, there were significant links between independent, dependent, and avoidant styles and the perception of the effect of PBL. There are few significant relationships between learning style and the perception of the effects of PBL. We must determine how to teach students with different learning styles and the factors that influence PBL.

  20. Turbidimetric and photometric determination of total tannins in tea using a micro-flow-batch analyzer.

    PubMed

    Lima, Marcelo B; Andrade, Stéfani I E; Harding, David P; Pistonesi, Marcelo F; Band, Beatriz S F; Araújo, Mário C U

    2012-01-15

    Both turbidimetric and photometric determinations of total tannins in samples of green and black tea, using a micro-flow-batch analyzer (μFBA), were studied. The miniaturized system was formed using photocurable urethane-acrylate resin and an ultraviolet lithography technique. The turbidimetric method was based on the precipitation reaction of Cu(II) with tannins in acetate medium at a pH of 4.5. The photometric method was based on the complexation reaction of tannins with ferrous tartrate. The turbidimetric μFBA was able to test 200 samples per hour. The photometric μFBA allowed 300 analyses per hour, generating 136 μL of residue per analysis. The paired t test, at a 95% confidence level, showed no statistically significant differences between results obtained by both methods and the reference method. The urethane-acrylate μFBA maintained satisfactory physical and chemical properties, and represents an improvement over conventional flow-batch analyzers. Copyright © 2011 Elsevier B.V. All rights reserved.
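
    As a hedged sketch of the paired t-test comparison mentioned above (the concentration values below are invented, not the study's data), the statistical step could look like this in Python:

      # Hypothetical paired comparison of tannin results (mg/L) from the
      # micro-flow-batch analyzer versus a reference method; values are invented.
      from scipy import stats

      mu_fba    = [512.0, 498.5, 530.2, 475.8, 505.1]
      reference = [508.3, 501.2, 527.9, 479.4, 503.6]

      t_stat, p_value = stats.ttest_rel(mu_fba, reference)
      # At the 95% confidence level, p_value > 0.05 indicates no statistically
      # significant difference between the two methods.
      print(f"t = {t_stat:.3f}, p = {p_value:.3f}")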

  1. Comparative evaluation of the influence of television advertisements on children and caries prevalence.

    PubMed

    Ghimire, Neeta; Rao, Arathi

    2013-02-12

    Children watch television during most of their free time. They are exposed to advertisers' messages and are vulnerable to sophisticated advertisements for foods often detrimental to oral and general health. The aim was to evaluate the influence of television advertisements on children and their relationship with oral health, and to analyze the content of those advertisements. A questionnaire-based study was performed among 600 schoolchildren of Mangalore, Karnataka, followed by oral examination. Based on the survey, favorite and non-favorite channels and viewing times were analyzed. Advertisements on children's favorite and non-favorite channels were then viewed, analyzed, and compared. Higher caries prevalence was found among children who watched television and asked for more food and soft drinks. Cariogenic food advertisements were popular on children's favorite channels. Television advertisements may strongly influence children's food preferences and eating habits, resulting in higher caries prevalence. Advertisements regarding healthy food, oral hygiene maintenance, and prevention of diseases such as caries should be given priority for the benefit of children's health.

  2. Chemotherapy and targeted therapy in advanced biliary tract carcinoma: a pooled analysis of clinical trials.

    PubMed

    Eckel, Florian; Schmid, Roland M

    2014-01-01

    In biliary tract cancer, gemcitabine platinum (GP) doublet palliative chemotherapy is the current standard treatment. The aim of this study was to analyze recent trials, even those small and nonrandomized, and identify superior new regimens. Trials published in English between January 2000 and January 2014 were analyzed, as well as ASCO abstracts from 2010 to 2013. In total, 161 trials comprising 6,337 patients were analyzed. The pooled results of standard therapy GP (no fluoropyrimidine, F, or other drug) were as follows: the median response rate (RR), tumor control rate (TCR), time to tumor progression (TTP) and overall survival (OS) were 25.9%, 63.5%, 5.3 months and 9.5 months, respectively. GFP triplets as well as G-based chemotherapy plus targeted therapy were significantly superior to GP concerning tumor control (TCR, TTP) and OS, with no difference in RR. Triplet combinations of GFP as well as G-based chemotherapy with (predominantly EGFR) targeted therapy are most effective concerning tumor control and survival.

  3. Oil and gas pipeline construction cost analysis and developing regression models for cost estimation

    NASA Astrophysics Data System (ADS)

    Thaduri, Ravi Kiran

    In this study, cost data for 180 pipelines and 136 compressor stations have been analyzed. On the basis of the distribution analysis, regression models have been developed. Material, labor, right-of-way (ROW) and miscellaneous costs make up the total cost of pipeline construction. The pipelines are analyzed based on different pipeline lengths, diameters, locations, pipeline volumes and years of completion. In pipeline construction, labor costs dominate the total costs with a share of about 40%. Multiple non-linear regression models are developed to estimate the component costs of pipelines for various cross-sectional areas, lengths and locations. The compressor stations are analyzed based on capacity, year of completion and location. Unlike the pipeline costs, material costs dominate the total costs in the construction of compressor stations, with an average share of about 50.6%. Land costs have very little influence on the total costs. Similar regression models are developed to estimate the component costs of compressor stations for various capacities and locations.
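
    A hedged sketch of the kind of multiple non-linear regression described above, fitting a power-law cost model with scipy; the model form, data points and starting values are placeholders for illustration, not the study's fitted equations:

      # Hypothetical power-law cost model: cost = a * diameter**b * length**c.
      # Coefficients, units and data are illustrative only.
      import numpy as np
      from scipy.optimize import curve_fit

      def cost_model(X, a, b, c):
          diameter, length = X
          return a * diameter**b * length**c

      diameter = np.array([12.0, 16.0, 20.0, 24.0, 30.0, 36.0])   # inches
      length   = np.array([10.0, 25.0,  5.0, 40.0, 15.0, 60.0])   # miles
      cost     = np.array([ 3.4, 11.5,  3.9, 32.0, 19.2, 85.1])   # million USD

      params, _ = curve_fit(cost_model, (diameter, length), cost, p0=(0.01, 1.5, 1.0))
      print("fitted a, b, c:", params)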

  4. Design for Run-Time Monitor on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of interconnected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring the system status change, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design the Run-Time Monitor (RTM), system software that monitors application behavior at run-time, analyzes the collected information, and optimizes resources on cloud computing. RTM monitors application software through library instrumentation, as well as the underlying hardware through performance counters, and optimizes the computing configuration based on the analyzed data.

  5. Critical thinking analysis based on Facione (2015) – Angelo (1995) logical mathematics material of vocational high school (VHS)

    NASA Astrophysics Data System (ADS)

    Seventika, S. Y.; Sukestiyarno, Y. L.; Mariani, Scolastika

    2018-03-01

    The research aims to analyze and categorize the critical thinking ability of VHS students based on modified critical thinking indicators according to Facione-Angelo, covering: interpreting the problem, analyzing alternative solutions, applying a solution to the problem, evaluating the solution, and drawing conclusions from the results, supported by evidence. The subjects of the research were 30 eleventh-grade TKJ students at Yabujah VHS, Indramayu, in the odd semester of 2016/2017. Data were collected through a critical thinking test and interviews. The results show that 15% of the students were in the good category, 30% in the fair category, and 55% in the low category. Students in the good category accomplished the critical thinking steps, although imperfectly, especially the indicators of evaluating and of concluding with supportive evidence. Students in the fair category showed only some of the indicator steps; analyzing, evaluating, and concluding were the least frequently demonstrated. Students in the low category performed all indicators poorly and even had difficulty identifying the problem.

  6. Simultaneous imaging of fat crystallinity and crystal polymorphic types by Raman microspectroscopy.

    PubMed

    Motoyama, Michiyo; Ando, Masahiro; Sasaki, Keisuke; Nakajima, Ikuyo; Chikuni, Koichi; Aikawa, Katsuhiro; Hamaguchi, Hiro-O

    2016-04-01

    The crystalline states of fats, i.e., the crystallinity and crystal polymorphic types, strongly influence their physical properties in fat-based foods. Imaging of fat crystalline states has thus been a subject of abiding interest, but conventional techniques cannot image crystallinity and polymorphic types all at once. This article demonstrates a new technique using Raman microspectroscopy for simultaneously imaging the crystallinity and polymorphic types of fats. The crystallinity and β' crystal polymorph, which contribute to the hardness of fat-based food products, were quantitatively visualized in a model fat (porcine adipose tissue) by analyzing several key Raman bands. The emergence of the β crystal polymorph, which generally results in food product deterioration, was successfully imaged by analyzing the whole fingerprint regions of Raman spectra using multivariate curve resolution alternating least squares analysis. The results demonstrate that the crystalline states of fats can be nondestructively visualized and analyzed at the molecular level, in situ, without laborious sample pretreatments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. A tone analyzer based on a piezoelectric polymer and organic thin film transistors.

    PubMed

    Hsu, Yu-Jen; Kymissis, Ioannis

    2012-12-01

    A tone analyzer is demonstrated using a distributed resonator architecture on a tensioned piezoelectric polyvinylidene difluoride (PVDF) sheet. This sheet is used as both the resonator and detection element. Two architectures are demonstrated: one uses distributed, directly addressed elements as a proof of concept, and the other integrates organic thin film transistor-based transimpedance amplifiers directly with the PVDF to convert the piezoelectric charge signal into a current signal. The PVDF sheet material is instrumented along its length, and the amplitude response at 15 sites is recorded and analyzed as a function of the frequency of excitation. The determination of the dominant component of an incoming tone is demonstrated using linear system decomposition of the time-averaged response of the sheet and is performed without any time domain analysis. This design allows for the determination of the spectral composition of a sound using the mechanical signal processing provided by the amplitude response and eliminates the need for downstream time-domain signal processing of the incoming signal.

  8. Exploring the limits of cryospectroscopy: Least-squares based approaches for analyzing the self-association of HCl.

    PubMed

    De Beuckeleer, Liene I; Herrebout, Wouter A

    2016-02-05

    To rationalize the concentration dependent behavior observed for a large spectral data set of HCl recorded in liquid argon, least-squares based numerical methods are developed and validated. In these methods, for each wavenumber a polynomial is used to mimic the relation between monomer concentrations and measured absorbances. Least-squares fitting of higher degree polynomials tends to overfit and thus leads to compensation effects where a contribution due to one species is compensated for by a negative contribution of another. The compensation effects are corrected for by carefully analyzing, using AIC and BIC information criteria, the differences observed between consecutive fittings when the degree of the polynomial model is systematically increased, and by introducing constraints prohibiting negative absorbances to occur for the monomer or for one of the oligomers. The method developed should allow other, more complicated self-associating systems to be analyzed with a much higher accuracy than before. Copyright © 2015 Elsevier B.V. All rights reserved.
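
    A minimal sketch of the degree-selection idea, fitting polynomials of increasing degree at a single wavenumber and comparing AIC/BIC values; the data are synthetic and the non-negativity constraints of the full procedure are omitted:

      # Sketch: relate monomer concentration to absorbance at one wavenumber with
      # polynomials of increasing degree, and compare AIC/BIC to avoid overfitting.
      import numpy as np

      rng = np.random.default_rng(0)
      conc = np.linspace(0.0, 1.0, 25)                     # monomer concentration (a.u.)
      absorbance = 0.8 * conc + 0.3 * conc**2 + rng.normal(0.0, 0.01, conc.size)

      n = conc.size
      for degree in range(1, 5):
          coeffs = np.polyfit(conc, absorbance, degree)
          rss = float(np.sum((absorbance - np.polyval(coeffs, conc)) ** 2))
          k = degree + 1                                   # number of fitted parameters
          aic = n * np.log(rss / n) + 2 * k
          bic = n * np.log(rss / n) + k * np.log(n)
          print(f"degree {degree}: AIC = {aic:.1f}, BIC = {bic:.1f}")
      # The degree is increased only while AIC/BIC keep improving noticeably.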

  9. Mathematical properties and bounds on haplotyping populations by pure parsimony.

    PubMed

    Wang, I-Lin; Chang, Chia-Yuan

    2011-06-01

    Although the haplotype data can be used to analyze the function of DNA, due to the significant efforts required in collecting the haplotype data, usually the genotype data is collected and then the population haplotype inference (PHI) problem is solved to infer haplotype data from genotype data for a population. This paper investigates the PHI problem based on the pure parsimony criterion (HIPP), which seeks the minimum number of distinct haplotypes to infer a given genotype data. We analyze the mathematical structure and properties for the HIPP problem, propose techniques to reduce the given genotype data into an equivalent one of much smaller size, and analyze the relations of genotype data using a compatible graph. Based on the mathematical properties in the compatible graph, we propose a maximal clique heuristic to obtain an upper bound, and a new polynomial-sized integer linear programming formulation to obtain a lower bound for the HIPP problem. Copyright © 2011 Elsevier Inc. All rights reserved.
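
    As a rough illustration of a maximal-clique heuristic on a compatibility graph (not the paper's exact heuristic or its integer-programming formulation), a greedy sketch in Python:

      # Greedy maximal-clique heuristic on a genotype compatibility graph given as
      # adjacency sets. The graph below is a made-up example over five genotypes.
      def greedy_maximal_clique(adjacency):
          # Visit vertices by decreasing degree and extend the clique greedily.
          order = sorted(adjacency, key=lambda v: len(adjacency[v]), reverse=True)
          clique = []
          for v in order:
              if all(v in adjacency[u] for u in clique):
                  clique.append(v)
          return clique

      graph = {
          "g1": {"g2", "g3"},
          "g2": {"g1", "g3"},
          "g3": {"g1", "g2", "g4"},
          "g4": {"g3"},
          "g5": set(),
      }
      print(greedy_maximal_clique(graph))   # ['g3', 'g1', 'g2']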

  10. Microneedle-based analysis of the micromechanics of the metaphase spindle assembled in Xenopus laevis egg extracts

    PubMed Central

    Shimamoto, Yuta; Kapoor, Tarun M.

    2014-01-01

    To explain how micron-sized cellular structures generate and respond to forces, we need to characterize their micromechanical properties. Here we provide a protocol to build and use a dual force-calibrated microneedle-based set-up to quantitatively analyze the micromechanics of a metaphase spindle assembled in Xenopus laevis egg extracts. This cell-free extract system allows for controlled biochemical perturbations of spindle components. We describe how the microneedles are prepared and how they can be used to apply and measure forces. A multi-mode imaging system allows tracking of microtubules, chromosomes and needle tips. This set-up can be used to analyze the viscoelastic properties of the spindle on time-scales ranging from minutes to sub-seconds. A typical experiment, along with data analysis, is also detailed. We anticipate that our protocol can be readily extended to analyze the micromechanics of other cellular structures assembled in cell-free extracts. The entire procedure can take 3-4 days. PMID:22538847

  11. Implementation of age and gender recognition system for intelligent digital signage

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Heon; Sohn, Myoung-Kyu; Kim, Hyunduk

    2015-12-01

    Intelligent digital signage systems transmit customized advertising and information by analyzing users and customers, unlike existing systems that present advertising in broadcast form without regard to the type of customer. Development of intelligent digital signage systems is currently being pursued vigorously. In this study, we designed a system capable of analyzing the gender and age of customers based on images obtained from a camera, although there are many different methods for analyzing customers. We conducted age and gender recognition experiments using a public database. The age/gender recognition was performed by a histogram matching method using local binary pattern (LBP) features extracted after the facial area in the input image was normalized. The experimental results showed that the gender recognition rate was approximately 97% on average. Age recognition was based on categorization into five age classes; the recognition rates for women and men were about 67% and 68%, respectively, when recognition was conducted separately by gender.
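
    A hedged sketch of the LBP-plus-histogram-matching step described above, assuming face regions are already detected and normalized to grayscale patches; the neighborhood parameters and the chi-squared distance are illustrative choices, not necessarily those used in the study:

      # LBP histogram extraction and nearest-neighbour histogram matching.
      import numpy as np
      from skimage.feature import local_binary_pattern

      P, R = 8, 1                          # LBP neighbours and radius (assumed)

      def lbp_histogram(gray_face):
          lbp = local_binary_pattern(gray_face, P, R, method="uniform")
          hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
          return hist

      def chi2_distance(h1, h2, eps=1e-10):
          return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

      def classify(face, gallery):
          """gallery: list of (label, histogram) pairs built from training faces."""
          query = lbp_histogram(face)
          return min(gallery, key=lambda item: chi2_distance(query, item[1]))[0]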

  12. What Is SEER?

    Cancer.gov

    An infographic describing the functions of NCI’s Surveillance, Epidemiology, and End Results (SEER) program: collecting, analyzing, interpreting, and disseminating reliable population-based statistics.

  13. Protection of Renewable-dominated Microgrids: Challenges and Potential Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elkhatib, Mohamed; Ellis, Abraham; Biswal, Milan

    In this report we address the challenge of designing an efficient protection system for inverter-dominated microgrids. These microgrids are characterised by limited fault current capacity as a result of the current-limiting protection functions of inverters. Typically, inverters limit their fault contribution in a sub-cycle time frame to as low as 1.1 per unit. As a result, overcurrent protection could fail completely to detect faults in inverter-dominated microgrids. As part of this project, a detailed literature survey of existing and proposed microgrid protection schemes was conducted. The survey concluded that there is a gap in the available microgrid protection methods. The only credible protection solution available in the literature for low-fault inverter-dominated microgrids is the differential protection scheme, which represents a robust transmission-grade protection solution but at a very high cost. Two non-overcurrent protection schemes were investigated as part of this project: impedance-based protection and transient-based protection. Impedance-based protection depends on monitoring impedance trajectories at feeder relays to detect faults. Two communication-based impedance-based protection schemes were developed. The first scheme utilizes directional elements and pilot signals to locate the fault. The second scheme depends on a Central Protection Unit that communicates with all feeder relays to locate the fault based on directional flags received from feeder relays. The latter approach could potentially be adapted to protect networked microgrids and dynamic-topology microgrids. Transient-based protection relies on analyzing high-frequency transients to detect and locate faults. This approach is very promising, but its implementation in the field faces several challenges. For example, high-frequency transients due to faults can be confused with transients due to other events such as capacitor switching. Additionally, while detecting faults by analyzing transients is feasible, locating faults based on analyzing transients is still an open question.

  14. A gas-phase chemiluminescence-based analyzer for waterborne arsenic

    USGS Publications Warehouse

    Idowu, A.D.; Dasgupta, P.K.; Genfa, Z.; Toda, K.; Garbarino, J.R.

    2006-01-01

    We show a practical sequential injection/zone fluidics-based analyzer that measures waterborne arsenic. The approach is capable of differentiating between inorganic As(III) and As(V). The principle is based on generating AsH3 from the sample in a confined chamber by borohydride reduction at controlled pH, sparging the chamber to drive the AsH3 to a small reflective cell located atop a photomultiplier tube, allowing it to react with ozone generated from ambient air, and measuring the intense chemiluminescence that results. Arsine generation and removal from solution results in isolation from the sample matrix, avoiding the pitfalls encountered in some solution-based analysis techniques. The differential determination of As(III) and As(V) is based on the different pH dependence of the reducibility of these species to AsH3. At pH ~1, both As(III) and As(V) are quantitatively converted to arsine in the presence of NaBH4. At a pH of 4-5, only As(III) is converted to arsine. In the present form, the limit of detection (S/N = 3) is 0.05 µg/L As at pH ~1 and 0.09 µg/L As(III) at pH ~4-5 for a 3-mL sample. The analyzer is intrinsically automated and requires 4 min per determination. It is also possible to determine As(III) first at pH 4.5 and then determine the remaining As in a sequential manner; this requires 6 min. There are no significant practical interferences. A new borohydride solution formulation permits month-long reagent stability. © 2006 American Chemical Society.

  15. Study on Big Database Construction and its Application of Sample Data Collected in CHINA'S First National Geographic Conditions Census Based on Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Cheng, T.; Zhou, X.; Jia, Y.; Yang, G.; Bai, J.

    2018-04-01

    In the project of China's First National Geographic Conditions Census, millions of sample data have been collected all over the country for interpreting land cover based on remote sensing images; the number of data files exceeds 12,000,000 and has grown in the follow-on project of National Geographic Conditions Monitoring. At present, storing such big data in a database such as Oracle is the most effective approach; however, a practical method for managing and applying the sample data is even more important. This paper studies a database construction method based on a relational database combined with a distributed file system, in which the vector data and file data are saved in different physical locations. The key issues and their solutions are discussed. On this basis, the application of the sample data is studied and several use cases are analyzed, laying the foundation for further application of the sample data. In particular, sample data located in Shaanxi province are selected to verify the method. Taking the 10 first-level classes defined in the land cover classification system as an example, the spatial distribution and density characteristics of the sample data are analyzed. The results verify that the database construction method based on a relational database with a distributed file system is effective for searching, analyzing, and applying the sample data. Furthermore, sample data collected in the project of China's First National Geographic Conditions Census could be useful in Earth observation and land cover quality assessment.

  16. Investigation on a fiber optic accelerometer based on FBG-FP interferometer

    NASA Astrophysics Data System (ADS)

    Lin, Chongyu; Luo, Hong; Xiong, Shuidong; Li, Haitao

    2014-12-01

    A fiber optic accelerometer based on a fiber Bragg grating Fabry-Perot (FBG-FP) interferometer is presented. The sensor is an FBG-FP cavity formed with two weak fiber Bragg gratings (FBGs) in a single-mode fiber. The reflectivities of the two FBGs are 9.42% and 7.74%, respectively, and the fiber between them is 10 meters long. An optical demodulation system was set up to analyze the reflected light of the FBG-FP cavity. Acceleration signals of different frequencies and intensities were demodulated correctly and stably by the system. By analyzing the optical spectrum of the weak-FBG-based FBG-FP cavity, we obtained the equivalent length of the cavity. We used a path-matched Michelson interferometer (MI) to demodulate the acceleration signal. The visibility of the interference fringes was 41%-42%, while the theoretical limit was 50%. This indicated that the path difference of the interferometer's two arms and the equivalent length of the FBG-FP cavity were well matched. Phase generated carrier (PGC) technology was used to eliminate phase fading caused by random phase shift, and Faraday rotation mirrors (FRMs) were used to eliminate polarization-induced phase fading. The accelerometer used a compliant cylinder design, and its sensitivity and frequency response were analyzed and simulated based on elastic mechanics. Experimental results showed that the accelerometer had a flat frequency response over the frequency range of 31-630 Hz. The sensitivity was about 31 dB (0 dB = 1 rad/g), with fluctuation less than 1.5 dB.

  17. Brain-computer interface based on detection of movement intention as a means of brain wave modulation enhancement

    NASA Astrophysics Data System (ADS)

    Pulido Castro, Sergio D.; López López, Juan M.

    2017-11-01

    Movement intention (MI) is the mental state of intending to perform an action that involves movement. Certain signals, obtained mainly from the primary motor cortex, are directly related to MI. These signals can be used in a brain-computer interface (BCI). BCIs have a wide variety of applications for the general population, classified in two groups: optimization of conventional neuromuscular performance and enhancement of conventional neuromuscular performance beyond normal capacities. The main goal of this project is to analyze whether neural rhythm modulation enhancement could be achieved by practicing through a BCI based on MI detection, which was designed in a previous study. A six-session experiment was conducted with eight healthy subjects. Each session was composed of two stages: a training stage and a testing stage, which allowed control of a videogame. The scores in the game were recorded and analyzed. Changes in the alpha and beta bands were also analyzed to observe whether attention could in fact be enhanced. The results were partially satisfactory, as most subjects showed a clear improvement in performance at some point in the trials. In addition, the alpha-to-beta wave ratio across all tasks was analyzed to observe whether it changed as the experiment progressed. The results are promising, and a different protocol must be implemented to assess the impact of the BCI on attention span, which can be analyzed with the alpha and beta waves.

  18. Electrochemical Sensors for the Detection of Lead and Other Toxic Heavy Metals: The Next Generation of Personal Exposure Biomonitors

    PubMed Central

    Yantasee, Wassana; Lin, Yuehe; Hongsirikarn, Kitiya; Fryxell, Glen E.; Addleman, Raymond; Timchalk, Charles

    2007-01-01

    To support the development and implementation of biological monitoring programs, we need quantitative technologies for measuring xenobiotic exposure. Microanalytical-based sensors that work with complex biomatrices such as blood, urine, or saliva are being developed and validated and will improve our ability to make definitive associations between chemical exposures and disease. Among toxic metals, lead continues to be one of the most problematic. Despite considerable efforts to identify and eliminate Pb exposure sources, this metal remains a significant health concern, particularly for young children. Ongoing research focuses on the development of portable metal analyzers that have many advantages over currently available technologies, thus potentially representing the next generation of toxic metal analyzers. In this article, we highlight the development and validation of two classes of metal analyzers for the voltammetric detection of Pb, including: a) an analyzer based on flow injection analysis and anodic stripping voltammetry at a mercury-film electrode, and b) Hg-free metal analyzers employing adsorptive stripping voltammetry and novel nanostructure materials that include self-assembled monolayers on mesoporous supports and carbon nanotubes. These sensors have been optimized to detect Pb in urine, blood, and saliva as accurately as state-of-the-art inductively coupled plasma-mass spectrometry, with high reproducibility and sensitivity. These improved and portable analytical sensor platforms will facilitate our ability to conduct biological monitoring programs to understand the relationship between chemical exposure assessment and disease outcomes. PMID:18087583

  19. REMOVAL AND CONTAINMENT OF LEAD-BASED PAINT VIA NEEDLE SCALERS

    EPA Science Inventory

    This report describes a comparative technical and economic evaluation of using a dustless needlegun system versus a conventional abrasive grit blasting system in the removal of lead-based paint from steel structures. The objective of the study was to comparatively analyze the ope...

  20. Informedia at TRECVID 2003: Analyzing and Searching Broadcast News Video

    DTIC Science & Technology

    2004-11-03

    browsing interface to browse the top-ranked shots according to the different classifiers. Color and texture based image search engines were also optimized for better performance. This “new” interface was evaluated as

  1. Copula-based nonlinear modeling of the law of one price for lumber products

    Treesearch

    Barry K. Goodwin; Matthew T. Holt; Gülcan Önel; Jeffrey P. Prestemon

    2018-01-01

    This paper proposes an alternative and potentially novel approach to analyzing the law of one price in a nonlinear fashion. Copula-based models that consider the joint distribution of prices separated by space are developed and applied to weekly...

  2. Assessment of the Bill Emerson Memorial Cable-stayed Bridge based on seismic instrumentation data

    DOT National Transportation Integrated Search

    2007-06-01

    In this study, both ambient and earthquake data measured from the Bill Emerson Memorial Cable-stayed Bridge are reported and analyzed. Based on the seismic instrumentation data, the vibration characteristics of the bridge are investigated and used to...

  3. Digital roadway interactive visualization and evaluation network applications to WSDOT operational data usage.

    DOT National Transportation Integrated Search

    2016-12-01

    DRIVE Net is a region-wide, Web-based transportation decision support system that adopts digital roadway maps as the base, and provides data layers for integrating and analyzing a variety of data sources (e.g., traffic sensors, incident records)....

  4. Long-term Operation of an External Cavity Quantum Cascade Laser-based Trace-gas Sensor for Building Air Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, Mark C.; Craig, Ian M.

    2013-11-03

    We analyze the long-term performance and stability of a trace-gas sensor based on an external cavity quantum cascade laser using data collected over a one-year period in a building air monitoring application.

  5. Simulation-Based Testbed Development for Analyzing Toll Impacts on Freeway Travel

    DOT National Transportation Integrated Search

    2012-06-01

    Traffic congestion has been a problem in metropolitan areas all over the world. Toll-based traffic management is one of the most applicable solutions against freeway congestion. This research chooses two toll roads, the SR-167 HOT Lane and...

  6. Web-based OPACs: Between Tradition and Innovation.

    ERIC Educational Resources Information Center

    Moscoso, Purificacion; Ortiz-Repiso, Virginia

    1999-01-01

    Analyzes the change that Internet-based OPACs (Online Public Access Catalogs) have represented to the structure, administration, and maintenance of the catalogs, retrieval systems, and user interfaces. Examines the structure of databases and traditional principles that have governed systems development. Discusses repercussions of the application…

  7. Automated image-based phenotypic analysis in zebrafish embryos

    PubMed Central

    Vogt, Andreas; Cholewinski, Andrzej; Shen, Xiaoqiang; Nelson, Scott; Lazo, John S.; Tsang, Michael; Hukriede, Neil A.

    2009-01-01

    Presently, the zebrafish is the only vertebrate model compatible with contemporary paradigms of drug discovery. Zebrafish embryos are amenable to automation necessary for high-throughput chemical screens, and optical transparency makes them potentially suited for image-based screening. However, the lack of tools for automated analysis of complex images presents an obstacle to utilizing the zebrafish as a high-throughput screening model. We have developed an automated system for imaging and analyzing zebrafish embryos in multi-well plates regardless of embryo orientation and without user intervention. Images of fluorescent embryos were acquired on a high-content reader and analyzed using an artificial intelligence-based image analysis method termed Cognition Network Technology (CNT). CNT reliably detected transgenic fluorescent embryos (Tg(fli1:EGFP)y1) arrayed in 96-well plates and quantified intersegmental blood vessel development in embryos treated with small molecule inhibitors of angiogenesis. The results demonstrate it is feasible to adapt image-based high-content screening methodology to measure complex whole organism phenotypes. PMID:19235725

  8. Testing Nelder-Mead based repulsion algorithms for multiple roots of nonlinear systems via a two-level factorial design of experiments.

    PubMed

    Ramadas, Gisela C V; Rocha, Ana Maria A C; Fernandes, Edite M G P

    2015-01-01

    This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
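
    A minimal sketch of the repulsion idea itself (not the factorial design), combining Nelder-Mead with an erf-based penalty around roots already found; the example system, penalty constants and starting points are assumptions for illustration:

      # Find several roots of F(x)=0 by minimizing ||F||^2 plus an erf-based
      # repulsion term around previously located roots.
      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import erf

      def F(x):
          # Example system with four roots at (+/-1, +/-2).
          return np.array([x[0] ** 2 - 1.0, x[1] ** 2 - 4.0])

      def merit(x, found_roots, rho=10.0):
          base = np.sum(F(x) ** 2)
          penalty = sum(1.0 - erf(rho * np.linalg.norm(x - r)) for r in found_roots)
          return base + penalty

      found = []
      for start in ([2.0, 3.0], [-2.0, 3.0], [2.0, -3.0], [-2.0, -3.0]):
          res = minimize(merit, start, args=(found,), method="Nelder-Mead",
                         options={"xatol": 1e-8, "fatol": 1e-8})
          if np.allclose(F(res.x), 0.0, atol=1e-4):
              found.append(res.x)

      print([np.round(r, 3) for r in found])   # four distinct roots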

  9. A new method to identify the foot of continental slope based on an integrated profile analysis

    NASA Astrophysics Data System (ADS)

    Wu, Ziyin; Li, Jiabiao; Li, Shoujun; Shang, Jihong; Jin, Xiaobin

    2017-06-01

    A new method is proposed to identify automatically the foot of the continental slope (FOS) based on the integrated analysis of topographic profiles. Based on the extremum points of the second derivative and the Douglas-Peucker algorithm, the method simplifies the topographic profiles and then calculates the second derivative of both the original profiles and the D-P profiles. Seven steps are proposed to simplify the original profiles. Meanwhile, multiple identification methods are proposed to determine the FOS points, including the gradient, water depth and second derivative values of data points, as well as the concavity and convexity, continuity and segmentation of the topographic profiles. This method can comprehensively and intelligently analyze the topographic profiles and their derived slopes, second derivatives and D-P profiles, based on which it can analyze the essential properties of every data point in the profile. Furthermore, concave points of the curve are removed and six FOS judgment criteria are implemented.
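
    A small sketch of the Douglas-Peucker simplification step applied to a depth profile; the profile values and tolerance below are invented for illustration:

      # Douglas-Peucker simplification of a (distance, depth) profile.
      import numpy as np

      def douglas_peucker(points, tolerance):
          """points: (N, 2) array of (distance, depth); returns the simplified array."""
          start, end = points[0], points[-1]
          seg = end - start
          seg_len = np.hypot(seg[0], seg[1]) + 1e-12
          # Perpendicular distance of every point to the chord start-end.
          dists = np.abs(seg[0] * (points[:, 1] - start[1])
                         - seg[1] * (points[:, 0] - start[0])) / seg_len
          idx = int(np.argmax(dists))
          if dists[idx] > tolerance:
              left = douglas_peucker(points[:idx + 1], tolerance)
              right = douglas_peucker(points[idx:], tolerance)
              return np.vstack([left[:-1], right])
          return np.vstack([start, end])

      profile = np.array([[0, 0], [1, -5], [2, -50], [3, -200], [4, -210], [5, -215]],
                         dtype=float)
      # Keeps (0, 0), (2, -50), (3, -200) and (5, -215) for this tolerance.
      print(douglas_peucker(profile, tolerance=1.0))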

  10. The Analysis of Image Segmentation Hierarchies with a Graph-based Knowledge Discovery System

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Cooke, diane J.; Ketkar, Nikhil; Aksoy, Selim

    2008-01-01

    Currently available pixel-based analysis techniques do not effectively extract the information content from the increasingly available high spatial resolution remotely sensed imagery data. A general consensus is that object-based image analysis (OBIA) is required to effectively analyze this type of data. OBIA is usually a two-stage process; image segmentation followed by an analysis of the segmented objects. We are exploring an approach to OBIA in which hierarchical image segmentations provided by the Recursive Hierarchical Segmentation (RHSEG) software developed at NASA GSFC are analyzed by the Subdue graph-based knowledge discovery system developed by a team at Washington State University. In this paper we discuss our initial approach to representing the RHSEG-produced hierarchical image segmentations in a graphical form understandable by Subdue, and provide results on real and simulated data. We also discuss planned improvements designed to more effectively and completely convey the hierarchical segmentation information to Subdue and to improve processing efficiency.

  11. Development of an expert system for analysis of Shuttle atmospheric revitalization and pressure control subsystem anomalies

    NASA Technical Reports Server (NTRS)

    Lafuse, Sharon A.

    1991-01-01

    The paper describes the Shuttle Leak Management Expert System (SLMES), a preprototype expert system developed to enable the ECLSS subsystem manager to analyze subsystem anomalies and to formulate flight procedures based on flight data. The SLMES combines rule-based expert system technology with traditional FORTRAN-based software into an integrated system. SLMES analyzes the data using rules, and, when it detects a problem that requires simulation, it sets up the input for the FORTRAN-based simulation program ARPCS2AT2, which predicts the cabin total pressure and composition as a function of time. The program simulates the pressure control system, the crew oxygen masks, the airlock repress/depress valves, and the leakage. When the simulation has completed, other SLMES rules are triggered to examine the simulation results against the flight data and to suggest methods for correcting the problem. Results are then presented in the form of graphs and tables.

  12. Performance analysis of visual tracking algorithms for motion-based user interfaces on mobile devices

    NASA Astrophysics Data System (ADS)

    Winkler, Stefan; Rangaswamy, Karthik; Tedjokusumo, Jefry; Zhou, ZhiYing

    2008-02-01

    Determining the self-motion of a camera is useful for many applications. A number of visual motion-tracking algorithms have been developed to date, each with their own advantages and restrictions. Some of them have also made their foray into the mobile world, powering augmented reality-based applications on phones with inbuilt cameras. In this paper, we compare the performances of three feature or landmark-guided motion tracking algorithms, namely marker-based tracking with MXRToolkit, face tracking based on CamShift, and MonoSLAM. We analyze and compare the complexity, accuracy, sensitivity, robustness and restrictions of each of the above methods. Our performance tests are conducted over two stages: The first stage of testing uses video sequences created with simulated camera movements along the six degrees of freedom in order to compare accuracy in tracking, while the second stage analyzes the robustness of the algorithms by testing for manipulative factors like image scaling and frame-skipping.

  13. Speckle noise reduction in ultrasound images using a discrete wavelet transform-based image fusion technique.

    PubMed

    Choi, Hyun Ho; Lee, Ju Hwan; Kim, Sung Min; Park, Sung Yun

    2015-01-01

    Here, the speckle noise in ultrasonic images is removed using an image fusion-based denoising method. To optimize the denoising performance, each discrete wavelet transform (DWT) and filtering technique was analyzed and compared, and the performances were compared in order to derive the optimal input conditions. To evaluate the speckle noise removal performance, the image fusion algorithm was applied to ultrasound images and compared with the original images processed without the algorithm. As a result, applying DWT and filtering techniques alone caused information loss and retained noise characteristics, and did not give the best noise reduction performance. Conversely, an image fusion method applying the SRAD-original condition preserved the key information in the original image while the speckle noise was removed. Based on these characteristics, the SRAD-original input condition showed the best denoising performance for the ultrasound images. From this study, the proposed denoising technique was confirmed to have high potential for clinical application.
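
    A hedged sketch of a single-level DWT image fusion rule (average the approximation band, keep the larger-magnitude detail coefficients), assuming two co-registered 2-D inputs such as an SRAD-filtered image and the original; the wavelet and fusion rule are illustrative choices:

      # Single-level 2-D DWT fusion of two images of identical shape.
      import numpy as np
      import pywt

      def dwt_fusion(img_a, img_b, wavelet="db2"):
          cA_a, (cH_a, cV_a, cD_a) = pywt.dwt2(img_a, wavelet)
          cA_b, (cH_b, cV_b, cD_b) = pywt.dwt2(img_b, wavelet)
          # Average approximations; take the larger-magnitude detail coefficient
          # to preserve edges while suppressing speckle.
          fuse = lambda a, b: np.where(np.abs(a) >= np.abs(b), a, b)
          cA = 0.5 * (cA_a + cA_b)
          fused = (cA, (fuse(cH_a, cH_b), fuse(cV_a, cV_b), fuse(cD_a, cD_b)))
          return pywt.idwt2(fused, wavelet)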

  14. Comprehending idioms cross-linguistically.

    PubMed

    Bortfeld, Heather

    2003-01-01

    Speakers of three different languages (English, Latvian, and Mandarin) rated sets of idioms from their language for the analyzability of the relationship between each phrase's literal and figurative meaning. For each language, subsets of idioms were selected based on these ratings. Latvian and Mandarin idioms were literally translated into English. Across three experiments, people classified idioms from the three languages according to their figurative meanings. Response times and error rates indicate that participants were able to interpret unfamiliar (e.g., other languages') idioms depending largely on the degree to which they were analyzable, and that different forms of processing were used both within and between languages depending on this analyzability. Results support arguments for a continuum of analyzability (Bortfeld & McGlone, 2001), along which figurative speech ranges from reflecting general conceptual structures to specific cultural and historical references.

  15. Thermal Stability and Flammability of Styrene-Butadiene Rubber-Based (SBR) Ceramifiable Composites

    PubMed Central

    Anyszka, Rafał; Bieliński, Dariusz M.; Pędzich, Zbigniew; Rybiński, Przemysław; Imiela, Mateusz; Siciński, Mariusz; Zarzecka-Napierała, Magdalena; Gozdek, Tomasz; Rutkowski, Paweł

    2016-01-01

    Ceramifiable styrene-butadiene (SBR)-based composites containing low-softening-point-temperature glassy frit promoting ceramification, precipitated silica, one of four thermally stable refractory fillers (halloysite, calcined kaolin, mica or wollastonite) and a sulfur-based curing system were prepared. Kinetics of vulcanization and basic mechanical properties were analyzed and added as Supplementary Materials. Combustibility of the composites was measured by means of cone calorimetry. Their thermal properties were analyzed by means of thermogravimetry and specific heat capacity determination. Activation energy of thermal decomposition was calculated using the Flynn-Wall-Ozawa method. Finally, compression strength of the composites after ceramification was measured and their micromorphology was studied by scanning electron microscopy. The addition of a ceramification-facilitating system resulted in the lowering of combustibility and significant improvement of the thermal stability of the composites. Moreover, the compression strength of the mineral structure formed after ceramification is considerably high. The most promising refractory fillers for SBR-based ceramifiable composites are mica and halloysite. PMID:28773726

  16. A Statistical Analysis of Activity-Based and Traditional Introductory Algebra Physics Using the Force and Motion Conceptual Evaluation

    NASA Astrophysics Data System (ADS)

    Trecia Markes, Cecelia

    2006-03-01

    With a three-year FIPSE grant, it has been possible at the University of Nebraska at Kearney (UNK) to develop and implement activity-based introductory physics at the algebra level. It has generally been recognized that students enter physics classes with misconceptions about motion and force. Many of these misconceptions persist after instruction. Pretest and posttest responses on the ``Force and Motion Conceptual Evaluation'' (FMCE) are analyzed to determine the effectiveness of the activity-based method of instruction relative to the traditional (lecture/lab) method of instruction. Data were analyzed to determine the following: student understanding at the beginning of the course, student understanding at the end of the course, how student understanding is related to the type of class taken, student understanding based on gender and type of class. Some of the tests used are the t-test, the chi-squared test, and analysis of variance. The results of these tests will be presented, and their implications will be discussed.

  17. Color-Space-Based Visual-MIMO for V2X Communication †

    PubMed Central

    Kim, Jai-Eun; Kim, Ji-Won; Park, Youngil; Kim, Ki-Doo

    2016-01-01

    In this paper, we analyze the applicability of color-space-based, color-independent visual-MIMO for V2X. We aim to achieve a visual-MIMO scheme that can maintain the original color and brightness while performing seamless communication. We consider two scenarios of GCM based visual-MIMO for V2X. One is a multipath transmission using visual-MIMO networking and the other is multi-node V2X communication. In the scenario of multipath transmission, we analyze the channel capacity numerically and we illustrate the significance of networking information such as distance, reference color (symbol), and multiplexing-diversity mode transitions. In addition, in the V2X scenario of multiple access, we may achieve the simultaneous multiple access communication without node interferences by dividing the communication area using image processing. Finally, through numerical simulation, we show the superior SER performance of the visual-MIMO scheme compared with LED-PD communication and show the numerical result of the GCM based visual-MIMO channel capacity versus distance. PMID:27120603

  18. Motif-based analysis of large nucleotide data sets using MEME-ChIP

    PubMed Central

    Ma, Wenxiu; Noble, William S; Bailey, Timothy L

    2014-01-01

    MEME-ChIP is a web-based tool for analyzing motifs in large DNA or RNA data sets. It can analyze peak regions identified by ChIP-seq, cross-linking sites identified by cLIP-seq and related assays, as well as sets of genomic regions selected using other criteria. MEME-ChIP performs de novo motif discovery, motif enrichment analysis, motif location analysis and motif clustering, providing a comprehensive picture of the DNA or RNA motifs that are enriched in the input sequences. MEME-ChIP performs two complementary types of de novo motif discovery: weight matrix–based discovery for high accuracy, and word-based discovery for high sensitivity. Motif enrichment analysis using DNA or RNA motifs from human, mouse, worm, fly and other model organisms provides even greater sensitivity. MEME-ChIP’s interactive HTML output groups and aligns significant motifs to ease interpretation. This protocol takes less than 3 h, and it provides motif discovery approaches that are distinct and complementary to other online methods. PMID:24853928

  19. International Space Station Major Constituent Analyzer On-Orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Phillip M.; Thoresen, Souzan; Granahan, John; Matty, Chris

    2012-01-01

    The Major Constituent Analyzer is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic changeout, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. Over the past two years, two ORU 02 analyzer assemblies have operated nominally while two others have experienced premature on-orbit failures. These failures as well as nominal performances demonstrate that ORU 02 performance remains a key determinant of MCA performance and logistical support. It can be shown that monitoring several key parameters can maximize the capacity to monitor ORU health and properly anticipate end of life. Improvements to ion pump operation and ion source tuning are expected to improve lifetime performance of the current ORU 02 design.

  20. Color-Space-Based Visual-MIMO for V2X Communication.

    PubMed

    Kim, Jai-Eun; Kim, Ji-Won; Park, Youngil; Kim, Ki-Doo

    2016-04-23

    In this paper, we analyze the applicability of color-space-based, color-independent visual-MIMO for V2X. We aim to achieve a visual-MIMO scheme that can maintain the original color and brightness while performing seamless communication. We consider two scenarios of GCM based visual-MIMO for V2X. One is a multipath transmission using visual-MIMO networking and the other is multi-node V2X communication. In the scenario of multipath transmission, we analyze the channel capacity numerically and we illustrate the significance of networking information such as distance, reference color (symbol), and multiplexing-diversity mode transitions. In addition, in the V2X scenario of multiple access, we may achieve the simultaneous multiple access communication without node interferences by dividing the communication area using image processing. Finally, through numerical simulation, we show the superior SER performance of the visual-MIMO scheme compared with LED-PD communication and show the numerical result of the GCM based visual-MIMO channel capacity versus distance.
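
    As a toy numeric illustration of capacity versus distance (not the paper's GCM-based channel model), a Shannon capacity under an assumed inverse-square received-power law can be computed as follows; every constant is a placeholder:

      # Toy capacity-versus-distance curve: C = B * log2(1 + SNR).
      import numpy as np

      bandwidth_hz = 30.0          # effective frame-rate-limited bandwidth (assumed)
      p_rx_1m = 1e-6               # received optical power at 1 m (assumed), W
      noise_power = 1e-9           # receiver noise power (assumed), W

      distances = np.arange(1.0, 21.0)                    # metres
      snr = (p_rx_1m / distances**2) / noise_power
      capacity = bandwidth_hz * np.log2(1.0 + snr)

      for d, c in zip(distances[::5], capacity[::5]):
          print(f"{d:4.0f} m : {c:6.1f} bit/s")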
