Sample records for computer based measure

  1. Acausal measurement-based quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-07-01

    In measurement-based quantum computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of by-product operators. If we respect the no-signaling principle, by-product operators cannot be avoided. Here we study the possibility of acausal measurement-based quantum computing by using the process matrix framework [Oreshkov, Costa, and Brukner, Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076]. We construct a resource process matrix for acausal measurement-based quantum computing, restricting local operations to projective measurements. The resource process matrix is an analog of the resource state of the standard causal measurement-based quantum computing. We find that if we restrict local operations to projective measurements, the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based quantum computing. We also show that it is possible to consider a causal game whose causal inequality is violated by acausal measurement-based quantum computing.

  2. Computer-Based and Paper-Based Measurement of Recognition Performance.

    ERIC Educational Resources Information Center

    Federico, Pat-Anthony

    To determine the relative reliabilities and validities of paper-based and computer-based measurement procedures, 83 male student pilots and radar intercept officers were administered computer and paper-based tests of aircraft recognition. The subject matter consisted of line drawings of front, side, and top silhouettes of aircraft. Reliabilities…

  3. Impedance computations and beam-based measurements: A problem of discrepancy

    NASA Astrophysics Data System (ADS)

    Smaluk, Victor

    2018-04-01

    High intensity of particle beams is crucial for high-performance operation of modern electron-positron storage rings, both colliders and light sources. The beam intensity is limited by the interaction of the beam with self-induced electromagnetic fields (wake fields) proportional to the vacuum chamber impedance. For a new accelerator project, the total broadband impedance is computed by element-wise wake-field simulations using computer codes. For a machine in operation, the impedance can be measured experimentally using beam-based techniques. In this article, a comparative analysis of impedance computations and beam-based measurements is presented for 15 electron-positron storage rings. The measured data and the predictions based on the computed impedance budgets show a significant discrepancy. Three possible reasons for the discrepancy are discussed: interference of the wake fields excited by a beam in adjacent components of the vacuum chamber, effect of computation mesh size, and effect of insufficient bandwidth of the computed impedance.

  4. Verifiable fault tolerance in measurement-based quantum computation

    NASA Astrophysics Data System (ADS)

    Fujii, Keisuke; Hayashi, Masahito

    2017-09-01

    Quantum systems, in general, cannot be simulated efficiently by a classical computer, and hence are useful for solving certain mathematical problems and simulating quantum many-body systems. This also implies, unfortunately, that verification of the output of the quantum systems is not so trivial, since predicting the output is exponentially hard. As another problem, a quantum system is very sensitive to noise and thus needs error correction. Here, we propose a framework for verification of the output of fault-tolerant quantum computation in a measurement-based model. In contrast to existing analyses on fault tolerance, we do not assume any noise model on the resource state, but an arbitrary resource state is tested by using only single-qubit measurements to verify whether or not the output of measurement-based quantum computation on it is correct. Verifiability is provided by a constant-time repetition of the original measurement-based quantum computation in appropriate measurement bases. Since full characterization of quantum noise is exponentially hard for large-scale quantum computing systems, our framework provides an efficient way to practically verify the experimental quantum error correction.

  5. Comparing Computer-Adaptive and Curriculum-Based Measurement Methods of Assessment

    ERIC Educational Resources Information Center

    Shapiro, Edward S.; Gebhardt, Sarah N.

    2012-01-01

    This article reported the concurrent, predictive, and diagnostic accuracy of a computer-adaptive test (CAT) and curriculum-based measurements (CBM; both computation and concepts/application measures) for universal screening in mathematics among students in first through fourth grade. Correlational analyses indicated moderate to strong…

  6. Comparison of Scientific Calipers and Computer-Enabled CT Review for the Measurement of Skull Base and Craniomaxillofacial Dimensions

    PubMed Central

    Citardi, Martin J.; Herrmann, Brian; Hollenbeak, Chris S.; Stack, Brendan C.; Cooper, Margaret; Bucholz, Richard D.

    2001-01-01

    Traditionally, cadaveric studies and plain-film cephalometrics provided information about craniomaxillofacial proportions and measurements; however, advances in computer technology now permit software-based review of computed tomography (CT)-based models. Distances between standardized anatomic points were measured on five dried human skulls with standard scientific calipers (Geneva Gauge, Albany, NY) and through computer workstation (StealthStation 2.6.4, Medtronic Surgical Navigation Technology, Louisville, CO) review of corresponding CT scans. Differences in measurements between the caliper and CT model were not statistically significant for each parameter. Measurements obtained by computer workstation CT review of the cranial skull base are an accurate representation of actual bony anatomy. Such information has important implications for surgical planning and clinical research. PMID:17167599

  7. Self-guaranteed measurement-based quantum computation

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Hajdušek, Michal

    2018-05-01

    In order to guarantee the output of a quantum computation, we usually assume that the component devices are trusted. However, when the total computation process is large, it is not easy to guarantee the whole system when we have scaling effects, unexpected noise, or unaccounted-for correlations between several subsystems. If we do not trust the measurement basis or the prepared entangled state, we need to worry about such uncertainties. To this end, we propose a self-guaranteed protocol for verification of quantum computation under the scheme of measurement-based quantum computation where no prior-trusted devices (measurement basis or entangled state) are needed. The approach we present enables the implementation of verifiable quantum computation using the measurement-based model in the context of a particular instance of delegated quantum computation where the server prepares the initial computational resource and sends it to the client, who drives the computation by single-qubit measurements. Applying self-testing procedures, we are able to verify the initial resource as well as the operation of the quantum devices and hence the computation itself. The overhead of our protocol scales with the size of the initial resource state to the power of 4 times the natural logarithm of the initial state's size.

  8. Impedance computations and beam-based measurements: A problem of discrepancy

    DOE PAGES

    Smaluk, Victor

    2018-04-21

    High intensity of particle beams is crucial for high-performance operation of modern electron-positron storage rings, both colliders and light sources. The beam intensity is limited by the interaction of the beam with self-induced electromagnetic fields (wake fields) proportional to the vacuum chamber impedance. For a new accelerator project, the total broadband impedance is computed by element-wise wake-field simulations using computer codes. For a machine in operation, the impedance can be measured experimentally using beam-based techniques. In this article, a comparative analysis of impedance computations and beam-based measurements is presented for 15 electron-positron storage rings. The measured data and the predictions based on the computed impedance budgets show a significant discrepancy. For this article, three possible reasons for the discrepancy are discussed: interference of the wake fields excited by a beam in adjacent components of the vacuum chamber, effect of computation mesh size, and effect of insufficient bandwidth of the computed impedance.

  9. Impedance computations and beam-based measurements: A problem of discrepancy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smaluk, Victor

    High intensity of particle beams is crucial for high-performance operation of modern electron-positron storage rings, both colliders and light sources. The beam intensity is limited by the interaction of the beam with self-induced electromagnetic fields (wake fields) proportional to the vacuum chamber impedance. For a new accelerator project, the total broadband impedance is computed by element-wise wake-field simulations using computer codes. For a machine in operation, the impedance can be measured experimentally using beam-based techniques. In this article, a comparative analysis of impedance computations and beam-based measurements is presented for 15 electron-positron storage rings. The measured data and the predictions based on the computed impedance budgets show a significant discrepancy. For this article, three possible reasons for the discrepancy are discussed: interference of the wake fields excited by a beam in adjacent components of the vacuum chamber, effect of computation mesh size, and effect of insufficient bandwidth of the computed impedance.

  10. Hybrid architecture for encoded measurement-based quantum computation

    PubMed Central

    Zwerger, M.; Briegel, H. J.; Dür, W.

    2014-01-01

    We present a hybrid scheme for quantum computation that combines the modular structure of elementary building blocks used in the circuit model with the advantages of a measurement-based approach to quantum computation. We show how to construct optimal resource states of minimal size to implement elementary building blocks for encoded quantum computation in a measurement-based way, including states for error correction and encoded gates. The performance of the scheme is determined by the quality of the resource states, where within the considered error model we find a threshold of the order of 10% local noise per particle for fault-tolerant quantum computation and quantum communication. PMID:24946906

  11. Reliability of lower limb alignment measures using an established landmark-based method with a customized computer software program

    PubMed Central

    Sled, Elizabeth A.; Sheehy, Lisa M.; Felson, David T.; Costigan, Patrick A.; Lam, Miu; Cooke, T. Derek V.

    2010-01-01

    The objective of the study was to evaluate the reliability of frontal plane lower limb alignment measures using a landmark-based method by (1) comparing inter- and intra-reader reliability between measurements of alignment obtained manually with those using a computer program, and (2) determining inter- and intra-reader reliability of computer-assisted alignment measures from full-limb radiographs. An established method for measuring alignment was used, involving selection of 10 femoral and tibial bone landmarks. 1) To compare manual and computer methods, we used digital images and matching paper copies of five alignment patterns simulating healthy and malaligned limbs drawn using AutoCAD. Seven readers were trained in each system. Paper copies were measured manually and repeat measurements were performed daily for 3 days, followed by a similar routine with the digital images using the computer. 2) To examine the reliability of computer-assisted measures from full-limb radiographs, 100 images (200 limbs) were selected as a random sample from 1,500 full-limb digital radiographs which were part of the Multicenter Osteoarthritis (MOST) Study. Three trained readers used the software program to measure alignment twice from the batch of 100 images, with two or more weeks between batch handling. Manual and computer measures of alignment showed excellent agreement (intraclass correlations [ICCs] 0.977 – 0.999 for computer analysis; 0.820 – 0.995 for manual measures). The computer program applied to full-limb radiographs produced alignment measurements with high inter- and intra-reader reliability (ICCs 0.839 – 0.998). In conclusion, alignment measures using a bone landmark-based approach and a computer program were highly reliable between multiple readers. PMID:19882339
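    For illustration, the agreement statistic used throughout this abstract is the intraclass correlation (ICC). A minimal sketch of computing one, assuming the pingouin package and made-up toy ratings rather than the study's data:

    ```python
    import pandas as pd
    import pingouin as pg

    # Toy data: 4 limbs, each rated by 3 readers (alignment angle in degrees).
    df = pd.DataFrame({
        "limb":   [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
        "reader": ["A", "B", "C"] * 4,
        "angle":  [178.2, 178.4, 178.1, 181.0, 180.7, 181.2,
                   175.9, 176.3, 176.0, 179.5, 179.4, 179.8],
    })

    # Returns ICC1..ICC3k; ICC2 (two-way random, absolute agreement) is the
    # usual choice for inter-reader reliability.
    icc = pg.intraclass_corr(data=df, targets="limb", raters="reader",
                             ratings="angle")
    print(icc[["Type", "ICC", "CI95%"]])
    ```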

  12. Using Computation Curriculum-Based Measurement Probes for Error Pattern Analysis

    ERIC Educational Resources Information Center

    Dennis, Minyi Shih; Calhoon, Mary Beth; Olson, Christopher L.; Williams, Cara

    2014-01-01

    This article describes how "curriculum-based measurement--computation" (CBM-C) mathematics probes can be used in combination with "error pattern analysis" (EPA) to pinpoint difficulties in basic computation skills for students who struggle with learning mathematics. Both assessment procedures provide ongoing assessment data…

  13. Comparing student performance on paper- and computer-based math curriculum-based measures.

    PubMed

    Hensley, Kiersten; Rankin, Angelica; Hosp, John

    2017-01-01

    As the number of computerized curriculum-based measurement (CBM) tools increases, it is necessary to examine whether or not student performance can generalize across a variety of test administration modes (i.e., paper or computer). The purpose of this study is to compare math fact fluency on paper versus computer for 197 upper elementary students. Students completed identical sets of probes on paper and on the computer, which were then scored for digits correct, problems correct, and accuracy. Results showed a significant difference in performance between the two sets of probes, with higher fluency rates on the paper probes. Because decisions about levels of student support and interventions often rely on measures such as these, more research in this area is needed to examine the potential differences in student performance between paper-based and computer-based CBMs.
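    The three scores compared in this study (digits correct, problems correct, accuracy) can be stated compactly in code. A simplified sketch using position-by-position digit matching; operational CBM scoring rules are more detailed:

    ```python
    def score_probe(responses, answer_key):
        """Return digits correct, problems correct, and accuracy for one probe."""
        digits_correct = 0
        problems_correct = 0
        for given, correct in zip(responses, answer_key):
            # Count digits that match position-by-position (simplified rule).
            digits_correct += sum(g == c for g, c in zip(given, correct))
            problems_correct += given == correct
        return digits_correct, problems_correct, problems_correct / len(answer_key)

    print(score_probe(["12", "35", "48"], ["12", "36", "48"]))  # (5, 2, 0.666...)
    ```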

  14. Fault-tolerant measurement-based quantum computing with continuous-variable cluster states.

    PubMed

    Menicucci, Nicolas C

    2014-03-28

    A long-standing open question about Gaussian continuous-variable cluster states is whether they enable fault-tolerant measurement-based quantum computation. The answer is yes. Initial squeezing in the cluster above a threshold value of 20.5 dB ensures that errors from finite squeezing acting on encoded qubits are below the fault-tolerance threshold of known qubit-based error-correcting codes. By concatenating with one of these codes and using ancilla-based error correction, fault-tolerant measurement-based quantum computation of theoretically indefinite length is possible with finitely squeezed cluster states.
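    The 20.5 dB threshold can be translated into a linear noise variance with the standard decibel conversion; a quick sketch of ours, not taken from the paper:

    ```python
    def squeezing_db_to_variance(s_db: float) -> float:
        """Quadrature variance in vacuum units for s_db decibels of squeezing."""
        return 10 ** (-s_db / 10)

    print(squeezing_db_to_variance(20.5))  # ~0.0089, i.e., <1% of vacuum noise
    ```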

  15. Developing and validating an instrument for measuring mobile computing self-efficacy.

    PubMed

    Wang, Yi-Shun; Wang, Hsiu-Yuan

    2008-08-01

    IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.

  16. QRS detection based ECG quality assessment.

    PubMed

    Hayn, Dieter; Jammerbund, Bernhard; Schreier, Günter

    2012-09-01

    Although immediate feedback concerning ECG signal quality during recording is useful, up to now not much literature describing quality measures is available. We have implemented and evaluated four ECG quality measures. Empty lead criterion (A), spike detection criterion (B) and lead crossing point criterion (C) were calculated from basic signal properties. Measure D quantified the robustness of QRS detection when applied to the signal. An advanced Matlab-based algorithm combining all four measures and a simplified algorithm for Android platforms, excluding measure D, were developed. Both algorithms were evaluated by taking part in the Computing in Cardiology Challenge 2011. Each measure's accuracy and computing time were evaluated separately. During the challenge, the advanced algorithm correctly classified 93.3% of the ECGs in the training set and 91.6% in the test set. Scores for the simplified algorithm were 0.834 in event 2 and 0.873 in event 3. Computing time for measure D was almost five times higher than for other measures. Required accuracy levels depend on the application and are related to computing time. While our simplified algorithm may be accurate for real-time feedback during ECG self-recordings, QRS detection based measures can further increase the performance if sufficient computing power is available.
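    A minimal sketch of the idea behind measure D: score quality by how robust QRS detection is to a change of detector settings. The generic peak detector and thresholds below are our assumptions, not the authors' algorithm:

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def qrs_quality(ecg: np.ndarray, fs: float) -> float:
        """Fraction of leniently detected beats confirmed by a strict detector."""
        min_dist = int(0.3 * fs)  # refractory period: at most ~200 bpm
        strict, _ = find_peaks(ecg, distance=min_dist, height=0.6 * np.max(ecg))
        lenient, _ = find_peaks(ecg, distance=min_dist, height=0.2 * np.max(ecg))
        if len(lenient) == 0:
            return 0.0
        tol = int(0.05 * fs)  # +/- 50 ms matching window
        matched = sum(np.min(np.abs(lenient - p)) <= tol for p in strict)
        return matched / len(lenient)  # ~1 on clean signals, drops with noise
    ```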

  17. Novel schemes for measurement-based quantum computation.

    PubMed

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics (based on finitely correlated or projected entangled pair states) to go beyond the cluster-state-based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  18. Performance measurement and modeling of component applications in a high performance computing environment : a case study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Robert C.; Ray, Jaideep; Malony, A.

    2003-11-01

    We present a case study of performance measurement and modeling of a CCA (Common Component Architecture) component-based application in a high performance computing environment. We explore issues peculiar to component-based HPC applications and propose a performance measurement infrastructure for HPC based loosely on recent work done for Grid environments. A prototypical implementation of the infrastructure is used to collect data for three components in a scientific application and construct performance models for two of them. Both computational and message-passing performance are addressed.

  19. Measuring Recognition Performance Using Computer-Based and Paper-Based Methods.

    ERIC Educational Resources Information Center

    Federico, Pat-Anthony

    1991-01-01

    Using a within-subjects design, computer-based and paper-based tests of aircraft silhouette recognition were administered to 83 male naval pilots and flight officers to determine the relative reliabilities and validities of 2 measurement modes. Relative reliabilities and validities of the two modes were contingent on the multivariate measurement…

  20. Flow Ambiguity: A Path Towards Classically Driven Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Mantri, Atul; Demarie, Tommaso F.; Menicucci, Nicolas C.; Fitzsimons, Joseph F.

    2017-07-01

    Blind quantum computation protocols allow a user to delegate a computation to a remote quantum computer in such a way that the privacy of their computation is preserved, even from the device implementing the computation. To date, such protocols are only known for settings involving at least two quantum devices: either a user with some quantum capabilities and a remote quantum server or two or more entangled but noncommunicating servers. In this work, we take the first step towards the construction of a blind quantum computing protocol with a completely classical client and single quantum server. Specifically, we show how a classical client can exploit the ambiguity in the flow of information in measurement-based quantum computing to construct a protocol for hiding critical aspects of a computation delegated to a remote quantum computer. This ambiguity arises due to the fact that, for a fixed graph, there exist multiple choices of the input and output vertex sets that result in deterministic measurement patterns consistent with the same fixed total ordering of vertices. This allows a classical user, computing only measurement angles, to drive a measurement-based computation performed on a remote device while hiding critical aspects of the computation.

  1. A randomized, controlled, single-blind trial of teaching provided by a computer-based multimedia package versus lecture.

    PubMed

    Williams, C; Aubin, S; Harkin, P; Cottrell, D

    2001-09-01

    Computer-based teaching may allow effective teaching of important psychiatric knowledge and skills. We investigated the effectiveness and acceptability of computer-based teaching in a single-blind, randomized, controlled study of 166 undergraduate medical students at the University of Leeds, involving an educational intervention of either a structured lecture or a computer-based teaching package (both of equal duration). There was no difference in knowledge between the groups at baseline or immediately after teaching. Both groups made significant gains in knowledge after teaching. Students who attended the lecture rated their subjective knowledge and skills at a statistically significantly higher level than students who had used the computers. Students who had used the computer package scored higher on an objective measure of assessment skills. Students did not perceive the computer package to be as useful as the traditional lecture format, despite finding it easy to use and recommending its use to other students. Medical students rate themselves subjectively as learning less from computer-based as compared with lecture-based teaching. Objective measures suggest equivalence in knowledge acquisition and significantly greater skills acquisition for computer-based teaching.

  2. Curriculum-Based Measurement: Developing a Computer-Based Assessment Instrument for Monitoring Student Reading Progress on Multiple Indicators

    ERIC Educational Resources Information Center

    Forster, Natalie; Souvignier, Elmar

    2011-01-01

    The purpose of this study was to examine the technical adequacy of a computer-based assessment instrument which is based on hierarchical models of text comprehension for monitoring student reading progress following the Curriculum-Based Measurement (CBM) approach. At intervals of two weeks, 120 third-grade students finished eight CBM tests. To…

  3. Testing the monogamy relations via rank-2 mixtures

    NASA Astrophysics Data System (ADS)

    Jung, Eylee; Park, DaeKil

    2016-10-01

    We introduce two tangle-based four-party entanglement measures t1 and t2, and two negativity-based measures n1 and n2, which are derived from the monogamy relations. These measures are computed for three four-qubit maximally entangled and W states explicitly. We also compute these measures for the rank-2 mixture ρ4 = p|GHZ4⟩⟨GHZ4| + (1-p)|W4⟩⟨W4| by finding the corresponding optimal decompositions. It turns out that t1(ρ4) is trivial and the corresponding optimal decomposition is equal to the spectral decomposition. Probably, this triviality is a sign of the fact that the corresponding monogamy inequality is not sufficiently tight. We fail to compute t2(ρ4) due to the difficulty in the calculation of the residual entanglement. The negativity-based measures n1(ρ4) and n2(ρ4) are explicitly computed and the corresponding optimal decompositions are also derived explicitly.
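    For background, the bipartite negativity underlying measures n1 and n2 is computed from the partial transpose. A self-contained two-qubit sketch (ours, not the paper's four-party construction):

    ```python
    import numpy as np

    def negativity(rho: np.ndarray) -> float:
        """N(rho) = (||rho^{T_B}||_1 - 1) / 2 for a two-qubit density matrix."""
        r = rho.reshape(2, 2, 2, 2)                     # (i_A, i_B, j_A, j_B)
        rho_tb = r.transpose(0, 3, 2, 1).reshape(4, 4)  # transpose subsystem B
        trace_norm = np.sum(np.abs(np.linalg.eigvalsh(rho_tb)))
        return (trace_norm - 1) / 2

    bell = np.zeros((4, 4))
    bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5  # |Phi+><Phi+|
    print(negativity(bell))  # 0.5 for a maximally entangled two-qubit state
    ```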

  4. An evaluation method of computer usability based on human-to-computer information transmission model.

    PubMed

    Ogawa, K

    1992-01-01

    This paper proposes a new evaluation and prediction method for computer usability. This method is based on our two previously proposed information transmission measures created from a human-to-computer information transmission model. The model has three information transmission levels: the device, software, and task content levels. Two measures, called the device independent information measure (DI) and the computer independent information measure (CI), defined on the software and task content levels respectively, are given as the amount of information transmitted. Two information transmission rates are defined as DI/T and CI/T, where T is the task completion time: the device independent information transmission rate (RDI), and the computer independent information transmission rate (RCI). The method utilizes the RDI and RCI rates to evaluate relatively the usability of software and device operations on different computer systems. Experiments using three different systems, in this case a graphical information input task, confirm that the method offers an efficient way of determining computer usability.
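    The two rates reduce to simple ratios; a toy numeric example with hypothetical values:

    ```python
    def transmission_rates(di_bits: float, ci_bits: float, t_seconds: float):
        """RDI = DI/T and RCI = CI/T, with T the task completion time."""
        return di_bits / t_seconds, ci_bits / t_seconds

    # E.g., DI = 120 bits and CI = 80 bits transmitted during a 40 s task:
    print(transmission_rates(120, 80, 40))  # (3.0, 2.0) bits per second
    ```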

  5. Trusted measurement model based on multitenant behaviors.

    PubMed

    Ning, Zhen-Hu; Shen, Chang-Xiang; Zhao, Yong; Liang, Peng

    2014-01-01

    With the rapid growth of pervasive computing, and cloud computing in particular, behaviour measurement is central and plays a vital role. A new behaviour measurement tailored to multitenants in cloud computing is urgently needed to fundamentally establish trust relationships. Based on our previous research, we propose an improved trust relationship scheme that captures the world of cloud computing, where multitenants share the same physical computing platform. Here, we first present related work on multitenant behaviour; secondly, we give the scheme of behaviour measurement, in which decoupling of multitenants is taken into account; thirdly, we explicitly explain our decoupling algorithm for multitenants; fourthly, we introduce a new way of computing similarity for deviation control, which fits the coupled multitenants under study well; lastly, we design experiments to test our scheme.

  6. Trusted Measurement Model Based on Multitenant Behaviors

    PubMed Central

    Ning, Zhen-Hu; Shen, Chang-Xiang; Zhao, Yong; Liang, Peng

    2014-01-01

    With the rapid growth of pervasive computing, and cloud computing in particular, behaviour measurement is central and plays a vital role. A new behaviour measurement tailored to multitenants in cloud computing is urgently needed to fundamentally establish trust relationships. Based on our previous research, we propose an improved trust relationship scheme that captures the world of cloud computing, where multitenants share the same physical computing platform. Here, we first present related work on multitenant behaviour; secondly, we give the scheme of behaviour measurement, in which decoupling of multitenants is taken into account; thirdly, we explicitly explain our decoupling algorithm for multitenants; fourthly, we introduce a new way of computing similarity for deviation control, which fits the coupled multitenants under study well; lastly, we design experiments to test our scheme. PMID:24987731

  7. A self-analysis of the NASA-TLX workload measure.

    PubMed

    Noyes, Jan M; Bruneau, Daniel P J

    2007-04-01

    Computer use and, more specifically, the administration of tests and materials online continue to proliferate. A number of subjective, self-report workload measures exist, but the National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is probably the most well known and used. The aim of this paper is to consider the workload costs associated with the computer-based and paper versions of the NASA-TLX measure. It was found that there is a significant difference between the workload scores for the two media, with the computer version of the NASA-TLX incurring more workload. This has implications for the practical use of the NASA-TLX as well as for other computer-based workload measures.

  8. PERFORMANCE OF A COMPUTER-BASED ASSESSMENT OF COGNITIVE FUNCTION MEASURES IN TWO COHORTS OF SENIORS

    PubMed Central

    Espeland, Mark A.; Katula, Jeffrey A.; Rushing, Julia; Kramer, Arthur F.; Jennings, Janine M.; Sink, Kaycee M.; Nadkarni, Neelesh K.; Reid, Kieran F.; Castro, Cynthia M.; Church, Timothy; Kerwin, Diana R.; Williamson, Jeff D.; Marottoli, Richard A.; Rushing, Scott; Marsiske, Michael; Rapp, Stephen R.

    2013-01-01

    Background: Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. Design: The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool for assessing memory performance and executive functioning. The Lifestyle Interventions and Independence for Seniors (LIFE) investigators incorporated this battery in a full-scale multicenter clinical trial (N=1635). We describe relationships that test scores have with those from interviewer-administered cognitive function tests and risk factors for cognitive deficits and describe performance measures (completeness, intra-class correlations). Results: Computer-based assessments of cognitive function had consistent relationships across the pilot and full-scale trial cohorts with interviewer-administered assessments of cognitive function, age, and a measure of physical function. In the LIFE cohort, their external validity was further demonstrated by associations with other risk factors for cognitive dysfunction: education, hypertension, diabetes, and physical function. Acceptable levels of data completeness (>83%) were achieved on all computer-based measures; however, rates of missing data were higher among older participants (odds ratio=1.06 for each additional year; p<0.001) and those who reported no current computer use (odds ratio=2.71; p<0.001). Intra-class correlations among clinics were at least as low (ICC≤0.013) as for interviewer measures (ICC≤0.023), reflecting good standardization. All cognitive measures loaded onto the first principal component (global cognitive function), which accounted for 40% of the overall variance. Conclusion: Our results support the use of computer-based tools for assessing cognitive function in multicenter clinical trials of older individuals. PMID:23589390

  9. Efficient quantum pseudorandomness with simple graph states

    NASA Astrophysics Data System (ADS)

    Mezher, Rawad; Ghalbouni, Joe; Dgheim, Joseph; Markham, Damian

    2018-02-01

    Measurement-based (MB) quantum computation allows for universal quantum computing by measuring individual qubits prepared in entangled multipartite states, known as graph states. Unless corrected for, the randomness of the measurements leads to the generation of ensembles of random unitaries, where each random unitary is identified with a string of possible measurement results. We show that repeating an MB scheme an efficient number of times, on a simple graph state, with measurements at fixed angles and no feedforward corrections, produces a random unitary ensemble that is an ε-approximate t-design on n qubits. Unlike previous constructions, the graph is regular and is also a universal resource for measurement-based quantum computing, closely related to the brickwork state.

  10. Adaptation from Paper-Pencil to Web-Based Administration of a Parent-Completed Developmental Questionnaire for Young Children

    ERIC Educational Resources Information Center

    Yovanoff, Paul; Squires, Jane; McManus, Suzanne

    2013-01-01

    Adapting traditional paper-pencil instruments to computer-based environments has received considerable attention from the research community due to the possible administration mode effects on obtained measures. When differences due to mode of completion (i.e., paper-pencil, computer-based) are present, threats to measurement validity are posed. In…

  11. Auditorium acoustics evaluation based on simulated impulse response

    NASA Astrophysics Data System (ADS)

    Wu, Shuoxian; Wang, Hongwei; Zhao, Yuezhe

    2004-05-01

    The impulse responses and other acoustical parameters of the Huangpu Teenager Palace in Guangzhou were measured. Meanwhile, acoustical simulation and auralization based on the software ODEON were also carried out. A comparison between the parameters obtained from computer simulation and from measurement is given. This case study shows that auralization based on computer simulation can be used to predict the acoustical quality of a hall at its design stage.

  12. Dynamic displacement measurement of large-scale structures based on the Lucas-Kanade template tracking algorithm

    NASA Astrophysics Data System (ADS)

    Guo, Jie; Zhu, Chang'an

    2016-01-01

    The development of optics and computer technologies enables the application of the vision-based technique that uses digital cameras to the displacement measurement of large-scale structures. Compared with traditional contact measurements, the vision-based technique allows for remote measurement, has a non-intrusive characteristic, and does not add mass to the measured structure. In this study, a high-speed camera system is developed to complete the displacement measurement in real time. The system consists of a high-speed camera and a notebook computer. The high-speed camera can capture images at a speed of hundreds of frames per second. To process the captured images on a computer, the Lucas-Kanade template tracking algorithm from the field of computer vision is introduced. Additionally, a modified inverse compositional algorithm is proposed to reduce the computing time of the original algorithm and further improve the efficiency. The modified algorithm can rapidly accomplish one displacement extraction within 1 ms without having to install any pre-designed target panel onto the structures in advance. The accuracy and the efficiency of the system in the remote measurement of dynamic displacement are demonstrated in experiments on a motion platform and a sound barrier on a suspension viaduct. Experimental results show that the proposed algorithm can extract an accurate displacement signal and accomplish the vibration measurement of large-scale structures.
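    As an illustration of the approach (though not the authors' modified inverse compositional algorithm), a standard pyramidal Lucas-Kanade tracker as exposed by OpenCV can extract pixel displacements of one target point. The video file name and point location below are hypothetical:

    ```python
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("structure.avi")          # hypothetical recording
    ok, first = cap.read()
    prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
    pts = np.array([[[320.0, 240.0]]], dtype=np.float32)  # tracked point (px)
    p0 = pts[0, 0].copy()                            # reference position

    displacements = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_pts, status, err = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, pts, None, winSize=(31, 31), maxLevel=3)
        if status[0, 0] == 1:
            displacements.append(new_pts[0, 0] - p0)  # (dx, dy) from start, px
        prev_gray, pts = gray, new_pts
    # Pixel displacements are converted to physical units via camera calibration.
    ```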

  13. A handheld computer as part of a portable in vivo knee joint load monitoring system

    PubMed Central

    Szivek, JA; Nandakumar, VS; Geffre, CP; Townsend, CP

    2009-01-01

    In vivo measurement of loads and pressures acting on articular cartilage in the knee joint during various activities and rehabilitative therapies following focal defect repair will provide a means of designing activities that encourage faster and more complete healing of focal defects. It was the goal of this study to develop a totally portable monitoring system that could be used during various activities and allow continuous monitoring of forces acting on the knee. In order to make the monitoring system portable, a handheld computer with custom software, a USB-powered miniature wireless receiver and a battery-powered coil were developed to replace a currently used computer, AC-powered bench-top receiver and power supply. A Dell handheld running the Windows Mobile operating system (OS), programmed using LabVIEW, was used to collect strain measurements. Measurements collected by the handheld-based system connected to the miniature wireless receiver were compared with the measurements collected by a hardwired system and a computer-based system during bench-top testing and in vivo testing. The newly developed handheld-based system had a maximum accuracy of 99% when compared to the computer-based system. PMID:19789715

  14. Laser Spot Detection Based on Reaction Diffusion.

    PubMed

    Vázquez-Otero, Alejandro; Khikhlukha, Danila; Solano-Altamirano, J M; Dormido, Raquel; Duro, Natividad

    2016-03-01

    Center-location of a laser spot is a problem of interest when the laser is used for processing and performing measurements. Measurement quality depends on correctly determining the location of the laser spot. Hence, improving and proposing algorithms for the correct location of the spots are fundamental issues in laser-based measurements. In this paper we introduce a Reaction Diffusion (RD) system as the main computational framework for robustly finding laser spot centers. The method presented is compared with a conventional approach for locating laser spots, and the experimental results indicate that RD-based computation generates reliable and precise solutions. These results confirm the flexibility of the new computational paradigm based on RD systems for addressing problems that can be reduced to a set of geometric operations.
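    For context, the "conventional approach" such RD methods are compared against is typically an intensity-weighted centroid. A minimal numpy sketch of that baseline (ours, not the paper's reaction-diffusion code):

    ```python
    import numpy as np

    def spot_center(img: np.ndarray, thresh_frac: float = 0.5):
        """(row, col) center of mass of pixels above a relative threshold."""
        weights = np.where(img >= thresh_frac * img.max(), img, 0.0).astype(float)
        rows, cols = np.indices(img.shape)
        total = weights.sum()
        return (rows * weights).sum() / total, (cols * weights).sum() / total
    ```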

  15. Universal quantum computation with little entanglement.

    PubMed

    Van den Nest, Maarten

    2013-02-08

    We show that universal quantum computation can be achieved in the standard pure-state circuit model while the entanglement entropy of every bipartition is small in each step of the computation. The entanglement entropy required for large-scale quantum computation even tends to zero. Moreover we show that the same conclusion applies to many entanglement measures commonly used in the literature. This includes e.g., the geometric measure, localizable entanglement, multipartite concurrence, squashed entanglement, witness-based measures, and more generally any entanglement measure which is continuous in a certain natural sense. These results demonstrate that many entanglement measures are unsuitable tools to assess the power of quantum computers.

  16. The relative effectiveness of computer-based and traditional resources for education in anatomy.

    PubMed

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic model. We conducted a controlled trial in which 60 undergraduate students had ten minutes to study the names of 20 different pelvic structures. The outcome measure was a 25 item short answer test consisting of 15 nominal and 10 functional questions, based on a cadaveric pelvis. All subjects also took a brief mental rotations test (MRT) as a measure of spatial ability, used as a covariate in the analysis. Data were analyzed with repeated measures ANOVA. The group learning from the model performed significantly better than the other two groups on the nominal questions (Model 67%; KV 40%; VR 41%, Effect size 1.19 and 1.29, respectively). There was no difference between the KV and VR groups. There was no difference between the groups on the functional questions (Model 28%; KV, 23%, VR 25%). Computer-based learning resources appear to have significant disadvantages compared to traditional specimens in learning nominal anatomy. Consistent with previous research, virtual reality shows no advantage over static presentation of key views. © 2013 American Association of Anatomists.

  17. Curriculum-Based Measurement of Mathematics Competence: From Computation to Concepts and Applications to Real-Life Problem Solving

    ERIC Educational Resources Information Center

    Fuchs, Lynn S.; Fuchs, Douglas; Courey, Susan J.

    2005-01-01

    In this article, the authors explain how curriculum-based measurement (CBM) differs from other forms of classroom-based assessment. The development of CBM is traced from computation to concepts and applications to real-life problem solving, with examples of the assessments and illustrations of research to document technical features and utility…

  18. Operators manual for a computer controlled impedance measurement system

    NASA Astrophysics Data System (ADS)

    Gordon, J.

    1987-02-01

    Operating instructions for a computer controlled impedance measurement system based on Hewlett Packard instrumentation are given. Hardware details, program listings, flowcharts and a practical application are included.

  19. Method and system for environmentally adaptive fault tolerant computing

    NASA Technical Reports Server (NTRS)

    Copenhaver, Jason L. (Inventor); Ramos, Jeremy (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. An on-board processing system's sensitivity to the measured environmental condition is measured. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.
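    A toy sketch of the decision step this abstract describes, with entirely hypothetical thresholds and mode names (the patent does not specify them):

    ```python
    def select_fault_tolerance(env_level: float, sensitivity: float) -> str:
        """Map a measured environmental condition and the system's measured
        sensitivity to it onto a fault-tolerance configuration."""
        risk = env_level * sensitivity
        if risk < 0.2:
            return "simplex"         # no redundancy
        if risk < 0.6:
            return "duplex-compare"  # detect faults, retry on mismatch
        return "triple-modular"      # mask faults by majority voting

    print(select_fault_tolerance(env_level=0.9, sensitivity=0.8))  # triple-modular
    ```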

  20. A Comparison of Three Methods for Computing Scale Score Conditional Standard Errors of Measurement. ACT Research Report Series, 2013 (7)

    ERIC Educational Resources Information Center

    Woodruff, David; Traynor, Anne; Cui, Zhongmin; Fang, Yu

    2013-01-01

    Professional standards for educational testing recommend that both the overall standard error of measurement and the conditional standard error of measurement (CSEM) be computed on the score scale used to report scores to examinees. Several methods have been developed to compute scale score CSEMs. This paper compares three methods, based on…
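    One classical ingredient of such methods is Lord's binomial-error CSEM for a number-correct raw score; scale-score methods additionally push this through the raw-to-scale transformation. A sketch of the raw-score formula, not of the report's three methods specifically:

    ```python
    import math

    def binomial_csem(x: int, k: int) -> float:
        """Conditional SEM of raw score x on a k-item test (Lord, 1955)."""
        return math.sqrt(x * (k - x) / (k - 1))

    print(binomial_csem(30, 40))  # ~2.77 raw-score points near x = 30
    ```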

  1. Computer controlled multisensor thermocouple apparatus for invasive measurement of temperature.

    PubMed

    Hanus, J; Záhora, J; Volenec, K

    1996-01-01

    The computer controlled apparatus for invasive measurement of the temperature profile of biological systems, based on an original miniature multithermocouple probe, is described in this article. The main properties of the measuring system were verified using an original testing device.

  2. Computer-assisted adjuncts for aneurysmal morphologic assessment: toward more precise and accurate approaches

    NASA Astrophysics Data System (ADS)

    Rajabzadeh-Oghaz, Hamidreza; Varble, Nicole; Davies, Jason M.; Mowla, Ashkan; Shakir, Hakeem J.; Sonig, Ashish; Shallwani, Hussain; Snyder, Kenneth V.; Levy, Elad I.; Siddiqui, Adnan H.; Meng, Hui

    2017-03-01

    Neurosurgeons currently base most of their treatment decisions for intracranial aneurysms (IAs) on morphological measurements made manually from 2D angiographic images. These measurements tend to be inaccurate because 2D measurements cannot capture the complex geometry of IAs and because manual measurements are variable depending on the clinician's experience and opinion. Incorrect morphological measurements may lead to inappropriate treatment strategies. In order to improve the accuracy and consistency of morphological analysis of IAs, we have developed an image-based computational tool, AView. In this study, we quantified the accuracy of computer-assisted adjuncts of AView for aneurysmal morphologic assessment by performing measurements on spheres of known size and anatomical IA models. AView has an average morphological error of 0.56% in size and 2.1% in volume measurement. We also investigate the clinical utility of this tool on a retrospective clinical dataset and compare size and neck diameter measurement between 2D manual and 3D computer-assisted measurement. The average error was 22% and 30% in the manual measurement of size and aneurysm neck diameter, respectively. Inaccuracies due to manual measurements could therefore lead to wrong treatment decisions in 44% and inappropriate treatment strategies in 33% of the IAs. Furthermore, computer-assisted analysis of IAs improves the consistency in measurement among clinicians by 62% in size and 82% in neck diameter measurement. We conclude that AView dramatically improves accuracy for morphological analysis. These results illustrate the necessity of a computer-assisted approach for the morphological analysis of IAs.

  3. In-Flight Pitot-Static Calibration

    NASA Technical Reports Server (NTRS)

    Foster, John V. (Inventor); Cunningham, Kevin (Inventor)

    2016-01-01

    A GPS-based pitot-static calibration system uses global output-error optimization. High data rate measurements of static and total pressure, ambient air conditions, and GPS-based ground speed measurements are used to compute pitot-static pressure errors over a range of airspeed. System identification methods rapidly compute optimal pressure error models with defined confidence intervals.

  4. Verification of hypergraph states

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.

  5. Computer Models of Personality: Implications for Measurement

    ERIC Educational Resources Information Center

    Cranton, P. A.

    1976-01-01

    Current research on computer models of personality is reviewed and categorized under five headings: (1) models of belief systems; (2) models of interpersonal behavior; (3) models of decision-making processes; (4) prediction models; and (5) theory-based simulations of specific processes. The use of computer models in personality measurement is…

  6. A Computer-Based Approach for Deriving and Measuring Individual and Team Knowledge Structure from Essay Questions

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Wallace, Patricia

    2007-01-01

    This proof-of-concept investigation describes a computer-based approach for deriving the knowledge structure of individuals and of groups from their written essays, and considers the convergent criterion-related validity of the computer-based scores relative to human rater essay scores and multiple-choice test scores. After completing a…

  7. Using Computer-Extracted Data from Electronic Health Records to Measure the Quality of Adolescent Well-Care

    PubMed Central

    Gardner, William; Morton, Suzanne; Byron, Sepheen C; Tinoco, Aldo; Canan, Benjamin D; Leonhart, Karen; Kong, Vivian; Scholle, Sarah Hudson

    2014-01-01

    Objective: To determine whether quality measures based on computer-extracted EHR data can reproduce findings based on data manually extracted by reviewers. Data Sources: We studied 12 measures of care indicated for adolescent well-care visits for 597 patients in three pediatric health systems. Study Design: Observational study. Data Collection/Extraction Methods: Manual reviewers collected quality data from the EHR. Site personnel programmed their EHR systems to extract the same data from structured fields in the EHR according to national health IT standards. Principal Findings: Overall performance measured via computer-extracted data was 21.9 percent, compared with 53.2 percent for manual data. Agreement measures were high for immunizations. Otherwise, agreement between computer extraction and manual review was modest (Kappa = 0.36) because computer-extracted data frequently missed care events (sensitivity = 39.5 percent). Measure validity varied by health care domain and setting. A limitation of our findings is that we studied only three domains and three sites. Conclusions: The accuracy of computer-extracted EHR quality reporting depends on the use of structured data fields, with the highest agreement found for measures and in the setting that had the greatest concentration of structured fields. We need to improve documentation of care, data extraction, and adaptation of EHR systems to practice workflow. PMID:24471935
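    The agreement statistic quoted above is Cohen's kappa; a minimal sketch with hypothetical binary indicators (1 = care event documented):

    ```python
    from sklearn.metrics import cohen_kappa_score

    manual   = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # manual chart review
    computer = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0]  # computer-extracted EHR data
    print(cohen_kappa_score(manual, computer))  # ~0.44 here; chance-corrected
    ```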

  8. Qudit quantum computation on matrix product states with global symmetry

    NASA Astrophysics Data System (ADS)

    Wang, Dongsheng; Stephen, David; Raussendorf, Robert

    Resource states that contain nontrivial symmetry-protected topological order are identified for universal measurement-based quantum computation. Our resource states fall into two classes: one as the qudit generalizations of the qubit cluster state, and the other as the higher-symmetry generalizations of the spin-1 Affleck-Kennedy-Lieb-Tasaki (AKLT) state, namely, with unitary, orthogonal, or symplectic symmetry. The symmetry in cluster states protects information propagation (identity gate), while the higher symmetry in AKLT-type states enables nontrivial gate computation. This work demonstrates a close connection between measurement-based quantum computation and symmetry-protected topological order.

  9. Qudit quantum computation on matrix product states with global symmetry

    NASA Astrophysics Data System (ADS)

    Wang, Dong-Sheng; Stephen, David T.; Raussendorf, Robert

    2017-03-01

    Resource states that contain nontrivial symmetry-protected topological order are identified for universal single-qudit measurement-based quantum computation. Our resource states fall into two classes: one as the qudit generalizations of the one-dimensional qubit cluster state, and the other as the higher-symmetry generalizations of the spin-1 Affleck-Kennedy-Lieb-Tasaki (AKLT) state, namely, with unitary, orthogonal, or symplectic symmetry. The symmetry in cluster states protects information propagation (identity gate), while the higher symmetry in AKLT-type states enables nontrivial gate computation. This work demonstrates a close connection between measurement-based quantum computation and symmetry-protected topological order.

  10. Provable classically intractable sampling with measurement-based computation in constant time

    NASA Astrophysics Data System (ADS)

    Sanders, Stephen; Miller, Jacob; Miyake, Akimasa

    We present a constant-time measurement-based quantum computation (MQC) protocol to perform a classically intractable sampling problem. We sample from the output probability distribution of a subclass of the instantaneous quantum polynomial time circuits introduced by Bremner, Montanaro and Shepherd. In contrast with the usual circuit model, our MQC implementation includes additional randomness due to byproduct operators associated with the computation. Despite this additional randomness we show that our sampling task cannot be efficiently simulated by a classical computer. We extend previous results to verify the quantum supremacy of our sampling protocol efficiently using only single-qubit Pauli measurements.

  11. Medical privacy protection based on granular computing.

    PubMed

    Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng

    2004-10-01

    Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains when he received the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000; Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and previous bin size and logical approaches. These two new criteria are implemented in a prototype system Cellsecu 2.0. Preliminary system performance evaluation is conducted and reviewed.

  12. Superior model for fault tolerance computation in designing nano-sized circuit systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, N. S. S., E-mail: narinderjit@petronas.com.my; Muthuvalu, M. S., E-mail: msmuthuvalu@gmail.com; Asirvadam, V. S., E-mail: vijanth-sagayan@petronas.com.my

    2014-10-24

    As CMOS technology scales nano-metrically, reliability turns out to be a decisive subject in the design methodology of nano-sized circuit systems. As a result, several computational approaches have been developed to compute and evaluate the reliability of desired nano-electronic circuits. The process of computing reliability becomes very troublesome and time consuming as the computational complexity builds up with the desired circuit size. Therefore, being able to measure reliability quickly and accurately is fast becoming necessary in designing modern logic integrated circuits. For this purpose, the paper firstly looks into the development of an automated reliability evaluation tool based on the generalization of the Probabilistic Gate Model (PGM) and Boolean Difference-based Error Calculator (BDEC) models. The Matlab-based tool allows users to significantly speed up the task of reliability analysis for a very large number of nano-electronic circuits. Secondly, by using the developed automated tool, the paper explores a comparative study involving reliability computation and evaluation by the PGM and BDEC models for different implementations of same-functionality circuits. Based on the reliability analysis, BDEC gives exact and transparent reliability measures, but as the complexity of the same-functionality circuits with respect to gate error increases, the reliability measure by BDEC tends to be lower than the reliability measure by PGM. The lesser reliability measure by BDEC is well explained in this paper using the distribution of different signal input patterns over time for same-functionality circuits. Simulation results conclude that the reliability measure by BDEC depends not only on faulty gates but also on circuit topology, the probability of input signals being one or zero, and the probability of error on signal lines.
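    A brute-force Monte Carlo baseline against which analytical models like PGM and BDEC can be checked, for a toy three-gate circuit with a uniform per-gate error probability (our sketch, not the paper's Matlab tool):

    ```python
    import random

    def noisy_nand(a, b, eps, rng):
        out = not (a and b)
        return (not out) if rng.random() < eps else out  # flip with prob. eps

    def reliability(eps=0.05, trials=20000, seed=1):
        """P(correct output) of c = NAND(NAND(a,b), NAND(a,b)) = AND(a,b)."""
        rng = random.Random(seed)
        ok = 0
        for _ in range(trials):
            a, b = rng.random() < 0.5, rng.random() < 0.5  # uniform inputs
            g1 = noisy_nand(a, b, eps, rng)
            g2 = noisy_nand(a, b, eps, rng)
            c = noisy_nand(g1, g2, eps, rng)
            ok += c == (a and b)
        return ok / trials

    print(reliability())  # below 1 - eps: errors accumulate with circuit depth
    ```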

  13. Comparing Computer Adaptive and Curriculum-Based Measures of Math in Progress Monitoring

    ERIC Educational Resources Information Center

    Shapiro, Edward S.; Dennis, Minyi Shih; Fu, Qiong

    2015-01-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening…

  14. Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors

    USDA-ARS?s Scientific Manuscript database

    Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...

  15. Efficient universal blind quantum computation.

    PubMed

    Giovannetti, Vittorio; Maccone, Lorenzo; Morimae, Tomoyuki; Rudolph, Terry G

    2013-12-06

    We give a cheat-sensitive protocol for blind universal quantum computation that is efficient in terms of computational and communication resources: it allows one party to perform an arbitrary computation on a second party's quantum computer without revealing either which computation is performed, or its input and output. The first party's computational capabilities can be extremely limited: she must only be able to create and measure single-qubit superposition states. The second party is not required to use measurement-based quantum computation. The protocol requires the (optimal) exchange of O(J log2(N)) single-qubit states, where J is the computational depth and N is the number of qubits needed for the computation.

  16. Comparative study of cranial anthropometric measurement by traditional calipers to computed tomography and three-dimensional photogrammetry.

    PubMed

    Mendonca, Derick A; Naidoo, Sybill D; Skolnick, Gary; Skladman, Rachel; Woo, Albert S

    2013-07-01

    Craniofacial anthropometry by direct caliper measurements is a common method of quantifying the morphology of the cranial vault. New digital imaging modalities including computed tomography and three-dimensional photogrammetry are similarly being used to obtain craniofacial surface measurements. This study sought to compare the accuracy of anthropometric measurements obtained by calipers versus 2 methods of digital imaging. Standard anterior-posterior, biparietal, and cranial index measurements were directly obtained on 19 participants with an age range of 1 to 20 months. Computed tomographic scans and three-dimensional photographs were both obtained on each child within 2 weeks of the clinical examination. Two analysts measured the anterior-posterior and biparietal distances on the digital images. Measures of reliability and bias between the modalities were calculated and compared. Caliper measurements were found to underestimate the anterior-posterior and biparietal distances as compared with those of the computed tomography and the three-dimensional photogrammetry (P < 0.001). Cranial index measurements between the computed tomography and the calipers differed by up to 6%. The difference between the 2 modalities was statistically significant (P = 0.021). The biparietal and cranial index results were similar between the digital modalities, but the anterior-posterior measurement was greater with the three-dimensional photogrammetry (P = 0.002). The coefficients of variation for repeated measures based on the computed tomography and the three-dimensional photogrammetry were 0.008 and 0.007, respectively. In conclusion, measurements based on digital modalities are generally reliable and interchangeable. Caliper measurements lead to underestimation of anterior-posterior and biparietal values compared with digital imaging.

  17. Study on the algorithm of computational ghost imaging based on discrete fourier transform measurement matrix

    NASA Astrophysics Data System (ADS)

    Zhang, Leihong; Liang, Dong; Li, Bei; Kang, Yi; Pan, Zilan; Zhang, Dawei; Gao, Xiumin; Ma, Xiuhua

    2016-07-01

    On the basis of an analysis of the cosine light field with a determined analytic expression and of the pseudo-inverse method, the object is illuminated by a preset light field corresponding to a determined discrete Fourier transform measurement matrix, and the object image is reconstructed by the pseudo-inverse method. The analytic expression of the algorithm of computational ghost imaging based on a discrete Fourier transform measurement matrix is deduced theoretically and compared with the algorithm of compressive computational ghost imaging based on a random measurement matrix; the reconstruction process and the reconstruction error are analyzed, and simulations are run to verify the theoretical analysis. When the number of sampling measurements is comparable to the number of object pixels, the rank of the discrete Fourier transform matrix is the same as that of the random measurement matrix, the PSNR of the images reconstructed by the FGI and PGI algorithms is similar, and the reconstruction error of the traditional CGI algorithm is lower than that of the FGI and PGI reconstructions. As the number of sampling measurements decreases, the PSNR of the FGI reconstruction decreases slowly, whereas the PSNR of the PGI and CGI reconstructions decreases sharply. The reconstruction time of the FGI algorithm is lower than that of the other algorithms and is not affected by the number of sampling measurements. The FGI algorithm can effectively filter out random white noise through a low-pass filter and thus achieves reconstruction denoising with a higher denoising capability than the CGI algorithm. Overall, the FGI algorithm improves both the reconstruction accuracy and the reconstruction speed of computational ghost imaging.
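
    A toy sketch of the pseudo-inverse reconstruction route described above, using the real (cosine) rows of a DFT matrix as preset illumination patterns; the sizes and the 1-D test object are illustrative assumptions, and with fewer measurements than pixels the pseudo-inverse returns a least-squares estimate rather than an exact recovery.

        import numpy as np

        n = 64                                   # object "pixels"
        m = 48                                   # sampling measurements
        x = np.zeros(n); x[20:30] = 1.0          # simple 1-D test object

        F = np.fft.fft(np.eye(n)) / np.sqrt(n)   # unitary DFT matrix
        A = np.real(F[:m])                       # cosine illumination patterns

        y = A @ x                                # bucket-detector values
        x_rec = np.linalg.pinv(A) @ y            # pseudo-inverse reconstruction

        err = np.linalg.norm(x - x_rec) / np.linalg.norm(x)
        print(f"relative reconstruction error: {err:.3f}")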

  18. Measuring Symmetry in Children With Unrepaired Cleft Lip: Defining a Standard for the Three-Dimensional Midfacial Reference Plane.

    PubMed

    Wu, Jia; Heike, Carrie; Birgfeld, Craig; Evans, Kelly; Maga, Murat; Morrison, Clinton; Saltzman, Babette; Shapiro, Linda; Tse, Raymond

    2016-11-01

      Quantitative measures of facial form to evaluate treatment outcomes for cleft lip (CL) are currently limited. Computer-based analysis of three-dimensional (3D) images provides an opportunity for efficient and objective analysis. The purpose of this study was to define a computer-based standard of identifying the 3D midfacial reference plane of the face in children with unrepaired cleft lip for measurement of facial symmetry.   The 3D images of 50 subjects (35 with unilateral CL, 10 with bilateral CL, five controls) were included in this study.   Five methods of defining a midfacial plane were applied to each image, including two human-based (Direct Placement, Manual Landmark) and three computer-based (Mirror, Deformation, Learning) methods.   Six blinded raters (three cleft surgeons, two craniofacial pediatricians, and one craniofacial researcher) independently ranked and rated the accuracy of the defined planes.   Among computer-based methods, the Deformation method performed significantly better than the others. Although human-based methods performed best, there was no significant difference compared with the Deformation method. The average correlation coefficient among raters was .4; however, it was .7 and .9 when the angular difference between planes was greater than 6° and 8°, respectively.   Raters can agree on the 3D midfacial reference plane in children with unrepaired CL using digital surface mesh. The Deformation method performed best among computer-based methods evaluated and can be considered a useful tool to carry out automated measurements of facial symmetry in children with unrepaired cleft lip.

  19. Simultaneous measurements of density field and wavefront distortions in high speed flows

    NASA Astrophysics Data System (ADS)

    George, Jacob; Jenkins, Thomas; Trolinger, James; Hess, Cecil; Buckner, Benjamin

    2017-09-01

    This paper presents results from simultaneous measurements of fluid density and the resulting wavefront distortions in a sonic underexpanded jet. The density measurements were carried out using Rayleigh scattering, and the optical distortions were measured using a wavefront sensor based on phase shifting interferometry. The measurements represent a preliminary step toward relating wavefront distortions to a specific flow structure. The measured density field is used to compute the phase distortions using a wave propagation model based on a geometric-optics approximation, and the computed phase map shows moderate agreement with that obtained using the wavefront sensor.

  20. Computerized Measurement of Negative Symptoms in Schizophrenia

    PubMed Central

    Cohen, Alex S.; Alpert, Murray; Nienow, Tasha M.; Dinzeo, Thomas J.; Docherty, Nancy M.

    2008-01-01

    Accurate measurement of negative symptoms is crucial for understanding and treating schizophrenia. However, current measurement strategies are reliant on subjective symptom rating scales which often have psychometric and practical limitations. Computerized analysis of patients’ speech offers a sophisticated and objective means of evaluating negative symptoms. The present study examined the feasibility and validity of using widely-available acoustic and lexical-analytic software to measure flat affect, alogia and anhedonia (via positive emotion). These measures were examined in their relationships to clinically-rated negative symptoms and social functioning. Natural speech samples were collected and analyzed for 14 patients with clinically-rated flat affect, 46 patients without flat affect and 19 healthy controls. The computer-based inflection and speech rate measures significantly discriminated patients with flat affect from controls, and the computer-based measure of alogia and negative emotion significantly discriminated the flat and non-flat patients. Both the computer and clinical measures of positive emotion/anhedonia corresponded to functioning impairments. The computerized method of assessing negative symptoms offered a number of advantages over the symptom scale-based approach. PMID:17920078

  1. Computer-Based Experiments to Measure RC.

    ERIC Educational Resources Information Center

    Hart, Francis X.

    2000-01-01

    Finds that few electricity and magnetism experiments make use of computers for data acquisition. Reports on the use of a Vernier system for the measurement of the RC time constant for the charging and discharging of a capacitor. (CCM)

  2. Asymmetric Base-Bleed Effect on Aerospike Plume-Induced Base-Heating Environment

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Droege, Alan; D'Agostino, Mark; Lee, Young-Ching; Williams, Robert

    2004-01-01

    A computational heat transfer design methodology was developed to study the dual-engine linear aerospike plume-induced base-heating environment during one power-pack out, in ascent flight. It includes a three-dimensional, finite volume, viscous, chemically reacting, and pressure-based computational fluid dynamics formulation, a special base-bleed boundary condition, and a three-dimensional, finite volume, and spectral-line-based weighted-sum-of-gray-gases absorption computational radiation heat transfer formulation. A separate radiation model was used for diagnostic purposes. The computational methodology was systematically benchmarked. In this study, near-base radiative heat fluxes were computed, and they compared well with those measured during static linear aerospike engine tests. The base-heating environment of 18 trajectory points selected from three power-pack out scenarios was computed. The computed asymmetric base-heating physics were analyzed. The power-pack out condition has the most impact on convective base heating when it happens early in flight. The source of its impact comes from the asymmetric and reduced base bleed.

  3. Characterization of real-time computers

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Krishna, C. M.

    1984-01-01

    A real-time system consists of a computer controller and controlled processes. Despite the synergistic relationship between these two components, they have been traditionally designed and analyzed independently of and separately from each other; namely, computer controllers by computer scientists/engineers and controlled processes by control scientists. As a remedy for this problem, in this report real-time computers are characterized by performance measures based on computer controller response time that are: (1) congruent to the real-time applications, (2) able to offer an objective comparison of rival computer systems, and (3) experimentally measurable/determinable. These measures, unlike others, provide the real-time computer controller with a natural link to controlled processes. In order to demonstrate their utility and power, these measures are first determined for example controlled processes on the basis of control performance functionals. They are then used for two important real-time multiprocessor design applications - the number-power tradeoff and fault-masking and synchronization.

  4. New Severity Indices for Quantifying Single-suture Metopic Craniosynostosis

    PubMed Central

    Ruiz-Correa, Salvador; Starr, Jacqueline R.; Lin, H. Jill; Kapp-Simon, Kathleen A.; Sze, Raymond W.; Ellenbogen, Richard G.; Speltz, Matthew L.; Cunningham, Michael L.

    2012-01-01

    OBJECTIVE To describe novel severity indices with which to quantify severity of trigonocephaly malformation in children diagnosed with isolated metopic synostosis. METHODS Computed tomographic scans of the cranium were obtained from 38 infants diagnosed with isolated metopic synostosis and 53 age-matched control patients. Volumetric reformations of the cranium were used to trace two-dimensional planes defined by the cranium-base plane and well-defined brain landmarks. For each patient, novel trigonocephaly severity indices (TSI) were computed from outline cranium shapes on each of these planes. The metopic severity index based on measurements of interlandmark distances was also computed, and a receiver operating characteristic analysis was used to compare the accuracy of classification based on TSIs versus that based on the metopic severity index. RESULTS The proposed TSIs are a sensitive measure of trigonocephaly malformation that can provide a classification accuracy of 96% with a specificity of 95%, in contrast with 82% for the metopic severity index at the same specificity level. CONCLUSIONS We completed exploratory analysis of outline-based severity measurements computed from computed tomographic image planes of the cranium. These TSIs enable quantitative analysis of cranium features in isolated metopic synostosis that may not be accurately detected by analytic tools derived from a sparse set of traditional interlandmark and semilandmark distances. PMID:18797362

  5. A computer vision-based approach for structural displacement measurement

    NASA Astrophysics Data System (ADS)

    Ji, Yunfeng

    2010-04-01

    Along with the incessant advancement of optics, electronics and computer technologies over the last three decades, commercial digital video cameras have evolved remarkably and can now measure complex motions of objects with sufficient accuracy to assist structural displacement measurement in civil engineering. This paper proposes a computer vision-based approach for dynamic measurement of structures. A single digital camera is used to capture image sequences of planar targets mounted on vibrating structures. The mathematical relationship between the image plane and real space is established based on computer vision theory, and the structural dynamic displacement at the target locations is then quantified using point reconstruction rules. Compared with traditional displacement measurement methods using sensors such as accelerometers, linear variable differential transformers (LVDTs) and the global positioning system (GPS), the proposed approach offers great flexibility, a non-contact working mode and ease of adding measurement points. For validation, four tests are performed: sinusoidal motion of a point, free vibration of a cantilever beam, a wind tunnel test of a bridge cross-section model, and a field test of bridge displacement measurement. Results show that the proposed approach attains excellent accuracy compared with analytical values or measurements from conventional transducers, and delivers an innovative and low-cost solution to structural displacement measurement.
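
    The mapping from image plane to real space for a planar target can be illustrated with a homography fitted by the direct linear transform; the pixel and world coordinates below are made-up calibration numbers, not data from the paper.

        import numpy as np

        def fit_homography(px, world):
            """Direct linear transform: fit the 3x3 homography H mapping
            pixel coordinates to planar world coordinates from >= 4
            point correspondences."""
            rows = []
            for (u, v), (x, y) in zip(px, world):
                rows.append([u, v, 1, 0, 0, 0, -x*u, -x*v, -x])
                rows.append([0, 0, 0, u, v, 1, -y*u, -y*v, -y])
            _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
            return Vt[-1].reshape(3, 3)

        def to_world(H, u, v):
            p = H @ np.array([u, v, 1.0])
            return p[:2] / p[2]

        # Calibration: pixel corners of a planar target of known size (mm).
        px = [(100, 100), (500, 102), (498, 398), (102, 400)]
        world = [(0, 0), (200, 0), (200, 150), (0, 150)]
        H = fit_homography(px, world)

        # Tracked target centre in two frames -> in-plane displacement (mm).
        d = np.subtract(to_world(H, 305, 251), to_world(H, 300, 250))
        print(f"displacement: dx = {d[0]:.2f} mm, dy = {d[1]:.2f} mm")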

  6. Measure the Earth's Radius and the Speed of Light with Simple and Inexpensive Computer-Based Experiments

    ERIC Educational Resources Information Center

    Martin, Michael J.

    2004-01-01

    With new and inexpensive computer-based methods, measuring the speed of light and the Earth's radius--historically difficult endeavors--can be simple enough to be tackled by high school and college students working in labs that have limited budgets. In this article, the author describes two methods of estimating the Earth's radius using two…

  7. Real-World Physics: A Portable MBL for Field Measurements.

    ERIC Educational Resources Information Center

    Albergotti, Clifton

    1994-01-01

    Uses a moderately priced digital multimeter that has output and software compatible with personal computers to make a portable, computer-based data-acquisition system. The system can measure voltage, current, frequency, capacitance, transistor hFE, and temperature. Describes field measures of velocity, acceleration, and temperature as function of…

  8. Using Rasch Measurement to Develop a Computer Modeling-Based Instrument to Assess Students' Conceptual Understanding of Matter

    ERIC Educational Resources Information Center

    Wei, Silin; Liu, Xiufeng; Wang, Zuhao; Wang, Xingqiao

    2012-01-01

    Research suggests that difficulty in making connections among three levels of chemical representations--macroscopic, submicroscopic, and symbolic--is a primary reason for student alternative conceptions of chemistry concepts, and computer modeling is promising to help students make the connections. However, no computer modeling-based assessment…

  9. Training and Generalization Effects of a Reading Comprehension Learning Strategy on Computer and Paper-Pencil Assessments

    ERIC Educational Resources Information Center

    Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa

    2016-01-01

    Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…

  10. Implementation of a High-Speed FPGA and DSP Based FFT Processor for Improving Strain Demodulation Performance in a Fiber-Optic-Based Sensing System

    NASA Technical Reports Server (NTRS)

    Farley, Douglas L.

    2005-01-01

    NASA's Aviation Safety and Security Program is pursuing research in on-board Structural Health Management (SHM) technologies for purposes of reducing or eliminating aircraft accidents due to system and component failures. Under this program, NASA Langley Research Center (LaRC) is developing a strain-based structural health-monitoring concept that incorporates a fiber optic-based measuring system for acquiring strain values. This fiber optic-based measuring system provides for the distribution of thousands of strain sensors embedded in a network of fiber optic cables. The resolution of strain value at each discrete sensor point requires a computationally demanding data reduction software process that, when hosted on a conventional processor, is not suitable for near real-time measurement. This report describes the development and integration of an alternative computing environment using dedicated computing hardware for performing the data reduction. Performance comparison between the existing and the hardware-based system is presented.

  11. Experimental realization of nondestructive discrimination of Bell states using a five-qubit quantum computer

    NASA Astrophysics Data System (ADS)

    Sisodia, Mitali; Shukla, Abhishek; Pathak, Anirban

    2017-12-01

    A scheme for distributed quantum measurement that allows nondestructive or indirect Bell measurement was proposed by Gupta et al. [1]. In the present work, Gupta et al.'s scheme is experimentally realized using the five-qubit superconducting quantum computer that IBM Corporation has recently made accessible via the cloud. The experiment confirmed that the Bell state can be constructed and measured in a nondestructive manner with reasonably high fidelity. A comparison of the outcomes of this study with the results obtained earlier in an NMR-based experiment (Samal et al. (2010) [10]) has also been performed. The study indicates that to build a scalable SQUID-based quantum computer, the errors introduced by the gates (in the present technology) have to be reduced considerably.
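
    The essence of nondestructive Bell discrimination can be sketched with a plain statevector simulation (NumPy here, not the IBM hardware run of the paper): two ancilla qubits read out the parity (ZZ) and phase (XX) syndromes of the Bell pair via CNOTs, so the pair itself is never measured directly.

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

        def apply_1q(psi, gate, q, n=4):
            psi = psi.reshape([2] * n)
            psi = np.moveaxis(np.tensordot(gate, psi, axes=([1], [q])), 0, q)
            return psi.reshape(-1)

        def apply_cnot(psi, ctrl, targ, n=4):
            psi = psi.reshape([2] * n).copy()
            idx = [slice(None)] * n
            idx[ctrl] = 1                          # act only where control = 1
            sub = psi[tuple(idx)].copy()
            psi[tuple(idx)] = np.flip(sub, axis=targ - (targ > ctrl))  # X on target
            return psi.reshape(-1)

        def prob_one(psi, q, n=4):
            p = np.abs(psi.reshape([2] * n)) ** 2
            return p.sum(axis=tuple(i for i in range(n) if i != q))[1]

        # Qubits 0,1: Bell pair; qubit 2: parity ancilla; qubit 3: phase ancilla.
        psi = np.zeros(16); psi[0] = 1.0
        psi = apply_1q(psi, H, 0)                  # prepare |Phi+>
        psi = apply_cnot(psi, 0, 1)

        psi = apply_cnot(psi, 0, 2)                # ZZ syndrome into ancilla 2
        psi = apply_cnot(psi, 1, 2)

        psi = apply_1q(psi, H, 3)                  # XX syndrome into ancilla 3
        psi = apply_cnot(psi, 3, 0)                # (phase kickback)
        psi = apply_cnot(psi, 3, 1)
        psi = apply_1q(psi, H, 3)

        zz = 1 - 2 * prob_one(psi, 2)              # +1 <-> even parity
        xx = 1 - 2 * prob_one(psi, 3)              # +1 <-> '+' relative phase
        print(f"ZZ = {zz:+.0f}, XX = {xx:+.0f} -> |Phi+>, pair left intact")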

  12. The Designing of CALM (Computer Anxiety and Learning Measure): Validation of a Multidimensional Measure of Anxiety and Cognitions Relating to Adult Learning of Computing Skills Using Structural Equation Modeling.

    ERIC Educational Resources Information Center

    McInerney, Valentina; Marsh, Herbert W.; McInerney, Dennis M.

    This paper discusses the process through which a powerful multidimensional measure of affect and cognition in relation to adult learning of computing skills was derived from its early theoretical stages to its validation using structural equation modeling. The discussion emphasizes the importance of ensuring a strong substantive base from which to…

  13. An Attractor-Based Complexity Measurement for Boolean Recurrent Neural Networks

    PubMed Central

    Cabessa, Jérémie; Villa, Alessandro E. P.

    2014-01-01

    We provide a novel refined attractor-based complexity measurement for Boolean recurrent neural networks that represents an assessment of their computational power in terms of the significance of their attractor dynamics. This complexity measurement is achieved by first proving a computational equivalence between Boolean recurrent neural networks and a specific class of ω-automata, and then translating the most refined classification of ω-automata to the Boolean neural network context. As a result, a hierarchical classification of Boolean neural networks based on their attractor dynamics is obtained. These results provide new theoretical insights into the computational and dynamical capabilities of neural networks according to their attractor potentialities. An application of our findings is illustrated by the analysis of the dynamics of a simplified model of the basal ganglia-thalamocortical network simulated by a Boolean recurrent neural network. This example shows the significance of measuring network complexity, and how our results provide new foundational elements for understanding the complexity of real brain circuits. PMID:24727866

  14. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    PubMed

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment. (c) 2015 APA, all rights reserved.

  15. Computer vision-based analysis of foods: a non-destructive colour measurement tool to monitor quality and safety.

    PubMed

    Mogol, Burçe Ataç; Gökmen, Vural

    2014-05-01

    Computer vision-based image analysis has been widely used in the food industry to monitor food quality. It allows low-cost and non-contact measurements of colour to be performed. In this paper, two computer vision-based image analysis approaches are discussed for extracting mean colour or featured colour information from digital images of foods. These types of information may be of particular importance, as colour indicates certain chemical changes or physical properties in foods. As exemplified here, the mean CIE a* value or browning ratio determined by means of computer vision-based image analysis algorithms can be correlated with the acrylamide content of potato chips or cookies, and the porosity index, an important physical property of breadcrumb, can be calculated easily. In this respect, computer vision-based image analysis provides a useful tool for automatic inspection of food products on a manufacturing line, and it can be actively involved in the decision-making process where rapid quality/safety evaluation is needed. © 2013 Society of Chemical Industry.
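
    The mean-colour route described above amounts to a few lines of image processing; in this sketch the random array stands in for a loaded RGB photograph, and the L* < 50 browning threshold is an illustrative assumption rather than a value from the paper.

        import numpy as np
        from skimage import color

        img = np.random.rand(64, 64, 3)             # stand-in for an RGB image in [0, 1]
        lab = color.rgb2lab(img)                    # L* in [0, 100]; a*, b* signed

        mean_a = lab[..., 1].mean()                 # mean CIE a* (red-green axis)
        browning = (lab[..., 0] < 50).mean()        # fraction of "dark" pixels
        print(f"mean a* = {mean_a:.2f}, browning ratio = {browning:.2%}")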

  16. Tyramine Hydrochloride Based Label-Free System for Operating Various DNA Logic Gates and a DNA Caliper for Base Number Measurements.

    PubMed

    Fan, Daoqing; Zhu, Xiaoqing; Dong, Shaojun; Wang, Erkang

    2017-07-05

    DNA is believed to be a promising candidate for molecular logic computation, and the fluorogenic/colorimetric substrates of G-quadruplex DNAzyme (G4zyme) are broadly used as label-free output reporters of DNA logic circuits. Herein, for the first time, tyramine-HCl (a fluorogenic substrate of G4zyme) is applied to DNA logic computation and a series of label-free DNA-input logic gates, including elementary AND, OR, and INHIBIT logic gates, as well as a two to one encoder, are constructed. Furthermore, a DNA caliper that can measure the base number of target DNA as low as three bases is also fabricated. This DNA caliper can also perform concatenated AND-AND logic computation to fulfil the requirements of sophisticated logic computing. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Teaching Grocery Store Purchasing Skills to Students with Intellectual Disabilities Using a Computer-Based Instruction Program

    ERIC Educational Resources Information Center

    Hansen, David L.; Morgan, Robert L.

    2008-01-01

    This research evaluated effects of a multi-media computer-based instruction (CBI) program designed to teach grocery store purchasing skills to three high-school students with intellectual disabilities. A multiple baseline design across participants used measures of computer performance mastery and grocery store probes to evaluate the CBI. All…

  18. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    PubMed

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports both modes, step-by-step analysis and an auto-computing process, for preliminary evaluation and real-time computation, respectively. The proposed model was evaluated by recomputing prior studies of the epidemiological measurement of diseases caused by either heavy-metal exposure in the environment or clinical complications in hospital. The simulation validity was confirmed with commercial statistics software. The model was installed on a stand-alone computer and on a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lo, P., E-mail: pechinlo@mednet.edu.ucla; Brown, M. S.; Kim, H.

    Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements was then computed, quantifying the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantification of the severity of LAM. Adjusted R^2 from multiple linear regression and R^2 from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with currently used density-based CT measurements in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R^2 ranging from 0.42 to 0.59 when regressed against the spirometry measures, with p < 0.05. For the previously used density-based CT measurements in the literature, the best R^2 was 0.46 (for only one instance), with the majority being lower than 0.3 or having p > 0.05. Conclusions: The proposed family of CT-based cyst measurements correlates better with spirometric measures than previously used density-based CT measurements, showing potential as a sensitive tool for quantitatively assessing the severity of LAM.
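
    The threshold-then-watershed pipeline in the Methods can be sketched in two dimensions with standard image-processing tools; the synthetic "slice", the -800 HU cut-off and the peak spacing below are illustrative stand-ins, not the study's parameters.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.segmentation import watershed
        from skimage.feature import peak_local_max

        slice_hu = np.full((128, 128), 40.0)             # soft-tissue background
        yy, xx = np.mgrid[:128, :128]
        for cy, cx in [(40, 40), (40, 60), (90, 80)]:    # three air-filled "cysts"
            slice_hu[(yy - cy) ** 2 + (xx - cx) ** 2 < 120] = -900.0

        cystic = slice_hu < -800                         # threshold: air-like voxels
        dist = ndi.distance_transform_edt(cystic)
        peaks = peak_local_max(dist, labels=cystic, min_distance=5)
        markers = np.zeros(cystic.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        labels = watershed(-dist, markers, mask=cystic)  # split touching cysts

        areas = np.bincount(labels.ravel())[1:]          # per-cyst areas (pixels)
        print(f"{labels.max()} cysts, areas (px): {areas}")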

  20. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    PubMed

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
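
    The stabilizer test at the heart of the protocol is easy to state concretely: for a graph state, each stabilizer K_v (X on vertex v times Z on each neighbor of v) leaves the honest state invariant, so measuring it yields +1. The sketch below (plain NumPy, with a 3-qubit path graph chosen purely for illustration) evaluates these expectation values on the ideal state; the actual protocol instead measures them on randomly chosen copies sent by the server.

        import numpy as np
        from functools import reduce

        I2 = np.eye(2)
        X = np.array([[0.0, 1.0], [1.0, 0.0]])
        Z = np.diag([1.0, -1.0])

        n, edges = 3, [(0, 1), (1, 2)]               # path graph 0-1-2
        plus = np.array([1.0, 1.0]) / np.sqrt(2)
        psi = reduce(np.kron, [plus] * n)            # |+>^n

        for a, b in edges:                           # apply CZ on each edge
            diag = np.ones(2 ** n)
            for i in range(2 ** n):
                bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
                if bits[a] and bits[b]:
                    diag[i] = -1.0
            psi = diag * psi

        def pauli_on(ops):                           # ops: {qubit: 2x2 matrix}
            return reduce(np.kron, [ops.get(q, I2) for q in range(n)])

        for v in range(n):
            nbrs = {w: Z for e in edges for w in e if v in e and w != v}
            K = pauli_on({v: X, **nbrs})             # stabilizer K_v
            print(f"<K_{v}> = {psi @ K @ psi:+.3f}")  # +1 on the honest state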

  1. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-01

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.

  2. Evaluating Procedures for Reducing Measurement Error in Math Curriculum-Based Measurement Probes

    ERIC Educational Resources Information Center

    Methe, Scott A.; Briesch, Amy M.; Hulac, David

    2015-01-01

    At present, it is unclear whether math curriculum-based measurement (M-CBM) procedures provide a dependable measure of student progress in math computation because support for its technical properties is based largely upon a body of correlational research. Recent investigations into the dependability of M-CBM scores have found that evaluating…

  3. Accuracy of a laboratory-based computer implant guiding system.

    PubMed

    Barnea, Eitan; Alt, Ido; Kolerman, Roni; Nissan, Joseph

    2010-05-01

    Computer-guided implant placement is a growing treatment modality in partially and totally edentulous patients, though data about the accuracy of some systems for computer-guided surgery are limited. The purpose of this study was to evaluate the accuracy of a laboratory computer-guided system. A laboratory-based computer guiding system (M Guide; MIS Technologies, Shlomi, Israel) was used to place implants in a fresh sheep mandible. A second computerized tomography (CT) scan was taken after placing the implants. The drill-plan figures of the planned implants were positioned using assigned software (Med3D, Heidelberg, Germany) on the second CT scan to compare the implant positions with the initial planning. Values representing the implant locations of the original drill plan were compared with those of the placed implants using SPSS software. Six measurements (3 vertical, 3 horizontal) were made on each implant to assess the deviation from the initial implant planning. A repeated-measures analysis of variance was performed comparing the location of measurement (center, abutment, apex) and type of deviation (vertical vs. horizontal). The vertical deviation (mean -0.168) was significantly smaller than the horizontal deviation (mean 1.148). The laboratory computer-based guiding system may be a viable treatment concept for placing implants. Copyright (c) 2010 Mosby, Inc. All rights reserved.

  4. A dc model for power switching transistors suitable for computer-aided design and analysis

    NASA Technical Reports Server (NTRS)

    Wilson, P. M.; George, R. T., Jr.; Owen, H. A.; Wilson, T. G.

    1979-01-01

    A model for bipolar junction power switching transistors whose parameters can be readily obtained by the circuit design engineer, and which can be conveniently incorporated into standard computer-based circuit analysis programs is presented. This formulation results from measurements which may be made with standard laboratory equipment. Measurement procedures, as well as a comparison between actual and computed results, are presented.

  5. An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon

    NASA Technical Reports Server (NTRS)

    Rutherford, Brian

    2000-01-01

    The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluating the potential information provided by each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components: one that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small-scale problems, and these examples give encouraging results. Directions for further research are indicated.

  6. Measurement and Evidence of Computer-Based Task Switching and Multitasking by "Net Generation" Students

    ERIC Educational Resources Information Center

    Judd, Terry; Kennedy, Gregor

    2011-01-01

    Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…

  7. Computer Based Instructional Techniques in Undergraduate Introductory Organic Chemistry: Rationale, Developmental Techniques, Programming Strategies and Evaluation.

    ERIC Educational Resources Information Center

    Culp, G. H.; And Others

    Over 100 interactive computer programs for use in general and organic chemistry at the University of Texas at Austin have been prepared. The rationale for the programs is based upon the belief that computer-assisted instruction (CAI) can improve education by, among other things, freeing teachers from routine tasks, measuring entry skills,…

  8. Comparison of hand and semiautomatic tracing methods for creating maxillofacial artificial organs using sequences of computed tomography (CT) and cone beam computed tomography (CBCT) images.

    PubMed

    Szabo, Bence T; Aksoy, Seçil; Repassy, Gabor; Csomo, Krisztian; Dobo-Nagy, Csaba; Orhan, Kaan

    2017-06-09

    The aim of this study was to compare the paranasal sinus volumes obtained by manual and semiautomatic imaging software programs using both CT and CBCT imaging. 121 computed tomography (CT) and 119 cone beam computed tomography (CBCT) examinations were selected from the databases of the authors' institutes. The Digital Imaging and Communications in Medicine (DICOM) images were imported into 3-dimensional imaging software, in which hand-mode and semiautomatic tracing methods were used to measure the volumes of both maxillary sinuses and the sphenoid sinus. The resulting volumetric means were compared to previously published averages. The isometric CBCT-based volume determinations were closer to the real volume conditions, whereas the non-isometric CT-based volume measurements consistently yielded lower volumes. Comparing the 2 measurement modes, the values obtained in hand mode were closer to the literature data, and the CBCT-based measurement results corresponded to the known averages. Our results suggest that CBCT images provide reliable volumetric information that can be depended on for artificial organ construction and that may aid the guidance of the operator prior to or during the intervention.

  9. Meta-analysis of the effectiveness of computer-based laboratory versus traditional hands-on laboratory in college and pre-college science instructions

    NASA Astrophysics Data System (ADS)

    Onuoha, Cajetan O.

    The purpose of this research study was to determine the overall effectiveness of computer-based laboratory compared with traditional hands-on laboratory for improving students' science academic achievement and attitudes towards science subjects at the college and pre-college levels of education in the United States. Meta-analysis was used to synthesize the findings from 38 primary research studies conducted and/or reported in the United States between 1996 and 2006 that compared the effectiveness of computer-based laboratory with traditional hands-on laboratory on measures related to science academic achievement and attitudes towards science subjects. The 38 primary research studies, with a total of 3,824 subjects, generated 67 weighted individual effect sizes that were used in this meta-analysis. The study found that computer-based laboratory had a small positive effect size over traditional hands-on laboratory on measures related to students' science academic achievement (ES = +0.26) and attitudes towards science subjects (ES = +0.22). It was also found that computer-based laboratory produced larger effects for physical science subjects than for biological sciences (ES = +0.34 vs. +0.17).
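
    The weighting step behind such a synthesis is compact enough to show directly; this fixed-effect sketch combines per-study effect sizes with inverse-variance weights, and the numbers are made-up placeholders rather than the study's 67 effect sizes.

        import numpy as np

        es = np.array([0.40, 0.15, 0.30, -0.05, 0.55])   # per-study effect sizes (d)
        var = np.array([0.02, 0.05, 0.03, 0.04, 0.06])   # their sampling variances

        w = 1.0 / var                                    # inverse-variance weights
        es_bar = np.sum(w * es) / np.sum(w)              # pooled effect size
        se = np.sqrt(1.0 / np.sum(w))                    # its standard error
        lo, hi = es_bar - 1.96 * se, es_bar + 1.96 * se
        print(f"ES = {es_bar:+.2f} (95% CI {lo:+.2f} to {hi:+.2f})")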

  10. A simple computer-based measurement and analysis system of pulmonary auscultation sounds.

    PubMed

    Polat, Hüseyin; Güler, Inan

    2004-12-01

    Listening to various lung sounds has proven to be an important diagnostic tool for detecting and monitoring certain types of lung diseases. In this study, a computer-based system was designed for easy measurement and analysis of lung sounds using the software package DasyLAB. The designed system can digitally record lung sounds captured with an electronic stethoscope plugged into a sound card on a portable computer, display the lung sound waveform for each auscultation site, save the lung sound in ASCII format, acoustically reproduce the lung sound, edit and print the sound waveforms, display the time-expanded waveform, compute the Fast Fourier Transform (FFT), and display the power spectrum and spectrogram.
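
    The analysis end of such a system reduces to standard signal processing; in this sketch a synthetic tone-plus-noise signal stands in for a digitized lung sound, and the 8 kHz sample rate is an illustrative assumption.

        import numpy as np
        from scipy import signal

        fs = 8000                                        # Hz, sound-card rate
        t = np.arange(0, 2.0, 1 / fs)
        lung = np.sin(2 * np.pi * 150 * t) + 0.3 * np.random.randn(t.size)

        freqs, psd = signal.welch(lung, fs=fs, nperseg=1024)       # power spectrum
        f, tt, Sxx = signal.spectrogram(lung, fs=fs, nperseg=256)  # spectrogram
        print(f"dominant frequency: {freqs[np.argmax(psd)]:.0f} Hz")
        print(f"spectrogram bins (freq x time): {Sxx.shape}")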

  11. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    NASA Astrophysics Data System (ADS)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  12. Observability-based Local Path Planning and Collision Avoidance Using Bearing-only Measurements

    DTIC Science & Technology

    2012-01-20

    Clark N. Taylor (Sensors Directorate, Air Force Research...); Department of Electrical and Computer Engineering, Brigham Young University, Provo, Utah, 84602. Only fragments of this DTIC record survive extraction; the recoverable text describes a measurement model in which v_t is the measurement noise, assumed to be a zero-mean Gaussian random variable, based on the state transition model expressed by Eqs. (1...

  13. Physics Based Modeling and Rendering of Vegetation in the Thermal Infrared

    NASA Technical Reports Server (NTRS)

    Smith, J. A.; Ballard, J. R., Jr.

    1999-01-01

    We outline a procedure for rendering physically-based thermal infrared images of simple vegetation scenes. Our approach incorporates the biophysical processes that affect the temperature distribution of the elements within a scene. Computer graphics plays a key role in two respects. First, in computing the distribution of scene shaded and sunlit facets and, second, in the final image rendering once the temperatures of all the elements in the scene have been computed. We illustrate our approach for a simple corn scene where the three-dimensional geometry is constructed based on measured morphological attributes of the row crop. Statistical methods are used to construct a representation of the scene in agreement with the measured characteristics. Our results are quite good. The rendered images exhibit realistic behavior in directional properties as a function of view and sun angle. The root-mean-square error in measured versus predicted brightness temperatures for the scene was 2.1 deg C.

  14. Link-Based Similarity Measures Using Reachability Vectors

    PubMed Central

    Yoon, Seok-Ho; Kim, Ji-Soo; Ryu, Minsoo; Choi, Ho-Jin

    2014-01-01

    We present a novel approach for accurately computing link-based similarities among objects by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach that does not suffer from these problems. In the proposed approach, each target object is represented by a vector. Each element of the vector corresponds to one of the objects in the given data, and the value of each element denotes the weight for the corresponding object. For this weight value, we propose to utilize the probability of reaching the specific object from the target object, computed using the "Random Walk with Restart" strategy. We then define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures. PMID:24701188
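
    A reachability vector of this kind can be computed by power iteration; the small link graph and the restart probability below are illustrative assumptions, not parameters from the paper.

        import numpy as np

        A = np.array([[0, 1, 1, 0],              # adjacency of a toy link graph
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
        P = A / A.sum(axis=0, keepdims=True)      # column-stochastic transitions
        c = 0.15                                  # restart probability

        def rwr(seed, n_iter=100):
            """Random Walk with Restart distribution for one target object."""
            e = np.zeros(A.shape[0]); e[seed] = 1.0
            r = e.copy()
            for _ in range(n_iter):               # iterate to the fixed point
                r = (1 - c) * (P @ r) + c * e
            return r

        def cosine(u, v):
            return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

        r0, r1 = rwr(0), rwr(1)
        print(f"sim(object 0, object 1) = {cosine(r0, r1):.3f}")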

  15. Iterative image reconstruction in elastic inhomogenous media with application to transcranial photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Poudel, Joemini; Matthews, Thomas P.; Mitsuhashi, Kenji; Garcia-Uribe, Alejandro; Wang, Lihong V.; Anastasio, Mark A.

    2017-03-01

    Photoacoustic computed tomography (PACT) is an emerging computed imaging modality that exploits optical contrast and ultrasonic detection principles to form images of the photoacoustically induced initial pressure distribution within tissue. The PACT reconstruction problem corresponds to a time-domain inverse source problem, where the initial pressure distribution is recovered from the measurements recorded on an aperture outside the support of the source. A major challenge in transcranial PACT brain imaging is to compensate for aberrations in the measured data due to the propagation of the photoacoustic wavefields through the skull. To properly account for these effects, a wave equation-based inversion method should be employed that can model the heterogeneous elastic properties of the medium. In this study, an iterative image reconstruction method for 3D transcranial PACT is developed based on the elastic wave equation. To accomplish this, a forward model based on a finite-difference time-domain discretization of the elastic wave equation is established. Subsequently, gradient-based methods are employed for computing penalized least squares estimates of the initial source distribution that produced the measured photoacoustic data. The developed reconstruction algorithm is validated and investigated through computer-simulation studies.
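
    The inversion step can be illustrated with a linear toy problem: a random matrix stands in for the (FDTD-based, elastic) forward operator of the paper, and the initial pressure is estimated by gradient descent on a Tikhonov-penalized least-squares objective 0.5*||K p - y||^2 + 0.5*lam*||p||^2; all sizes and the penalty weight are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        n, m = 50, 80
        K = rng.standard_normal((m, n))           # stand-in forward operator
        p_true = np.maximum(rng.standard_normal(n), 0.0)
        y = K @ p_true + 0.01 * rng.standard_normal(m)

        lam = 1e-2                                # Tikhonov penalty weight
        step = 1.0 / np.linalg.norm(K, 2) ** 2    # safe gradient step size
        p = np.zeros(n)
        for _ in range(500):
            grad = K.T @ (K @ p - y) + lam * p    # gradient of the objective
            p -= step * grad

        rel = np.linalg.norm(p - p_true) / np.linalg.norm(p_true)
        print(f"relative reconstruction error: {rel:.3f}")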

  16. Designing for deeper learning in a blended computer science course for middle school students

    NASA Astrophysics Data System (ADS)

    Grover, Shuchi; Pea, Roy; Cooper, Stephen

    2015-04-01

    The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. Prior computing experiences (as measured by a pretest) and math ability were found to be strong predictors of learning outcomes.

  17. A randomised controlled trial testing a web-based, computer-tailored self-management intervention for people with or at risk for chronic obstructive pulmonary disease: a study protocol

    PubMed Central

    2013-01-01

    Background Chronic Obstructive Pulmonary Disease (COPD) is a major cause of morbidity and mortality. Effective self-management support interventions are needed to improve the health and functional status of people with COPD or at risk for COPD. Computer-tailored technology could be an effective way to provide this support. Methods/Design This paper presents the protocol of a randomised controlled trial testing the effectiveness of a web-based, computer-tailored self-management intervention to change health behaviours of people with or at risk for COPD. An intervention group will be compared to a usual care control group, in which the intervention group will receive a web-based, computer-tailored self-management intervention. Participants will be recruited from an online panel and through general practices. Outcomes will be measured at baseline and at 6 months. The primary outcomes will be smoking behaviour, measuring the 7-day point prevalence abstinence and physical activity, measured in minutes. Secondary outcomes will include dyspnoea score, quality of life, stages of change, intention to change behaviour and alternative smoking behaviour measures, including current smoking behaviour, 24-hour point prevalence abstinence, prolonged abstinence, continued abstinence and number of quit attempts. Discussion To the best of our knowledge, this will be the first randomised controlled trial to test the effectiveness of a web-based, computer-tailored self-management intervention for people with or at risk for COPD. The results will be important to explore the possible benefits of computer-tailored interventions for the self-management of people with or at risk for COPD and potentially other chronic health conditions. Dutch trial register NTR3421 PMID:23742208

  18. Situation awareness and trust in computer-based procedures in nuclear power plant operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Throneburg, E. B.; Jones, J. M.

    2006-07-01

    Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)

  19. Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans

    DTIC Science & Technology

    2017-07-13

    Only fragments of this DTIC record survive extraction (including a standard distribution disclaimer). The recoverable text notes that modeling historically consisted of manual measurement of armor systems and translating those measurements to computer-aided design geometry, which can be tedious, and refers to a computer-aided design (CAD) human geometry model (referred to throughout as ORCA man) that is used in the Operational Requirement-based Casualty Assessment.

  20. A dc model for power switching transistors suitable for computer-aided design and analysis

    NASA Technical Reports Server (NTRS)

    Wilson, P. M.; George, R. T., Jr.; Owen, H. A., Jr.; Wilson, T. G.

    1979-01-01

    The proposed dc model for bipolar junction power switching transistors is based on measurements which may be made with standard laboratory equipment. Those nonlinearities which are of importance to power electronics design are emphasized. Measurements procedures are discussed in detail. A model formulation adapted for use with a computer program is presented, and a comparison between actual and computer-generated results is made.

  1. Evaluating measurement invariance across assessment modes of phone interview and computer self-administered survey for the PROMIS measures in a population-based cohort of localized prostate cancer survivors.

    PubMed

    Wang, Mian; Chen, Ronald C; Usinger, Deborah S; Reeve, Bryce B

    2017-11-01

    To evaluate measurement invariance (phone interview vs computer self-administered survey) of 15 PROMIS measures completed by a population-based cohort of localized prostate cancer survivors. Participants were part of the North Carolina Prostate Cancer Comparative Effectiveness and Survivorship Study. Out of the 952 men who took the phone interview at 24 months post-treatment, 401 also completed the same survey online using a home computer. Unidimensionality of the PROMIS measures was examined using single-factor confirmatory factor analysis (CFA) models. Measurement invariance testing was conducted using longitudinal CFA via a model comparison approach. For strongly or partially strongly invariant measures, changes in the latent factors and factor autocorrelations were also estimated and tested. Six measures (sleep disturbance, sleep-related impairment, diarrhea, illness impact-negative, illness impact-positive, and global satisfaction with sex life) had locally dependent items, and therefore model modifications had to be made on these domains prior to measurement invariance testing. Overall, seven measures achieved strong invariance (all items had equal loadings and thresholds), and four measures achieved partial strong invariance (each measure had one item with unequal loadings and thresholds). Three measures (pain interference, interest in sexual activity, and global satisfaction with sex life) failed to establish configural invariance due to between-mode differences in factor patterns. This study supports the use of phone-based live interviewers in lieu of PC-based assessment (when needed) for many of the PROMIS measures.
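
    The model-comparison step in invariance testing of this kind typically comes down to a likelihood-ratio (chi-square difference) test between nested CFA models. A minimal sketch of that single step in Python, assuming the chi-square statistics and degrees of freedom for the configural and strong-invariance models have already been obtained from CFA software (the numbers below are hypothetical):

      import scipy.stats as st

      def chi2_difference_test(chi2_configural, df_configural,
                               chi2_strong, df_strong):
          """Likelihood-ratio test between two nested CFA models.

          The more constrained (strong-invariance) model has more
          degrees of freedom; a non-significant difference supports
          invariance across assessment modes.
          """
          d_chi2 = chi2_strong - chi2_configural
          d_df = df_strong - df_configural
          p_value = st.chi2.sf(d_chi2, d_df)  # survival function = 1 - CDF
          return d_chi2, d_df, p_value

      # Hypothetical fit statistics for one PROMIS domain:
      d_chi2, d_df, p = chi2_difference_test(112.4, 64, 128.9, 76)
      print(f"d_chi2 = {d_chi2:.1f}, d_df = {d_df}, p = {p:.3f}")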

  2. A Stratified Acoustic Model Accounting for Phase Shifts for Underwater Acoustic Networks

    PubMed Central

    Wang, Ping; Zhang, Lin; Li, Victor O. K.

    2013-01-01

    Accurate acoustic channel models are critical for the study of underwater acoustic networks. Existing models include physics-based models and empirical approximation models. The former enjoy good accuracy, but incur heavy computational load, rendering them impractical in large networks. On the other hand, the latter are computationally inexpensive but inaccurate since they do not account for the complex effects of boundary reflection losses, the multi-path phenomenon and ray bending in the stratified ocean medium. In this paper, we propose a Stratified Acoustic Model (SAM) based on frequency-independent geometrical ray tracing, accounting for each ray's phase shift during the propagation. It is a feasible channel model for large scale underwater acoustic network simulation, allowing us to predict the transmission loss with much lower computational complexity than the traditional physics-based models. The accuracy of the model is validated via comparisons with the experimental measurements in two different oceans. Satisfactory agreements with the measurements and with other computationally intensive classical physics-based models are demonstrated. PMID:23669708
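
    The core idea of tracking each ray's phase shift can be illustrated by summing eigenray arrivals coherently rather than adding their intensities. A minimal sketch, assuming each eigenray's amplitude, travel time, and accumulated boundary phase shift have already been produced by geometrical ray tracing (all values below are hypothetical, not from SAM itself):

      import numpy as np

      def transmission_loss(amplitudes, travel_times, phase_shifts, freq_hz):
          """Coherent transmission loss from a set of eigenrays.

          Each ray contributes amplitude * exp(j*(2*pi*f*t + phi)), where
          phi collects the phase shifts picked up at boundary reflections;
          the rays are summed coherently before taking the level in dB.
          """
          omega = 2.0 * np.pi * freq_hz
          field = np.sum(amplitudes *
                         np.exp(1j * (omega * travel_times + phase_shifts)))
          return -20.0 * np.log10(np.abs(field))

      # Three hypothetical eigenrays at 1 kHz:
      amps = np.array([1.0e-3, 6.0e-4, 2.5e-4])         # pressure amplitudes
      times = np.array([0.6667, 0.6681, 0.6702])        # travel times (s)
      phis = np.array([0.0, 0.8 * np.pi, 1.5 * np.pi])  # boundary phases (rad)
      print(f"TL = {transmission_loss(amps, times, phis, 1000.0):.1f} dB")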

  3. A stratified acoustic model accounting for phase shifts for underwater acoustic networks.

    PubMed

    Wang, Ping; Zhang, Lin; Li, Victor O K

    2013-05-13

    Accurate acoustic channel models are critical for the study of underwater acoustic networks. Existing models include physics-based models and empirical approximation models. The former enjoy good accuracy, but incur heavy computational load, rendering them impractical in large networks. On the other hand, the latter are computationally inexpensive but inaccurate since they do not account for the complex effects of boundary reflection losses, the multi-path phenomenon and ray bending in the stratified ocean medium. In this paper, we propose a Stratified Acoustic Model (SAM) based on frequency-independent geometrical ray tracing, accounting for each ray's phase shift during the propagation. It is a feasible channel model for large scale underwater acoustic network simulation, allowing us to predict the transmission loss with much lower computational complexity than the traditional physics-based models. The accuracy of the model is validated via comparisons with the experimental measurements in two different oceans. Satisfactory agreements with the measurements and with other computationally intensive classical physics-based models are demonstrated.

  4. Computing angle of arrival of radio signals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borchardt, John J.; Steele, David K.

    Various technologies pertaining to computing angle of arrival of radio signals are described. A system that is configured for computing the angle of arrival of a radio signal includes a cylindrical sheath wrapped around a cylindrical object, where the cylindrical sheath acts as a ground plane. The system further includes a plurality of antennas that are positioned about an exterior surface of the cylindrical sheath, and receivers respectively coupled to the antennas. The receivers output measurements pertaining to the radio signal. A processing circuit receives the measurements and computes the angle of arrival of the radio signal based upon the measurements.
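
    For a simple two-antenna case, the angle of arrival follows from the measured phase difference of the incoming wavefront. A minimal sketch of that underlying interferometry relation (not the described antenna-sheath system itself; spacing, frequency, and phase values are hypothetical):

      import numpy as np

      def angle_of_arrival(phase_diff_rad, spacing_m, freq_hz, c=3.0e8):
          """Angle of arrival (radians from broadside) for a two-element
          interferometer: delta_phi = 2*pi*d*sin(theta)/lambda."""
          wavelength = c / freq_hz
          sin_theta = phase_diff_rad * wavelength / (2.0 * np.pi * spacing_m)
          return np.arcsin(np.clip(sin_theta, -1.0, 1.0))

      theta = angle_of_arrival(phase_diff_rad=1.2, spacing_m=0.05, freq_hz=2.4e9)
      print(f"AoA = {np.degrees(theta):.1f} deg")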

  5. High level language for measurement complex control based on the computer E-100I

    NASA Technical Reports Server (NTRS)

    Zubkov, B. V.

    1980-01-01

    A high level language was designed to control the process of conducting an experiment using the computer "Elektrinika-1001". Program examples are given to control the measuring and actuating devices. The procedure of including these programs in the suggested high level language is described.

  6. Foresters' Metric Conversions program (version 1.0). [Computer program

    Treesearch

    Jefferson A. Palmer

    1999-01-01

    The conversion of scientific measurements has become commonplace in the fields of engineering, research, and forestry. Foresters' Metric Conversions is a Windows-based computer program that quickly converts user-defined measurements from English to metric and from metric to English. Foresters' Metric Conversions was derived from the publication "Metric...
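
    The kind of bidirectional English/metric conversion such a program performs can be sketched in a few lines. A minimal example with a handful of forestry-relevant factors (the factor table is illustrative, not taken from the program itself):

      # Conversion factors to metric; dividing inverts the direction.
      TO_METRIC = {
          ("inches", "centimeters"): 2.54,
          ("feet", "meters"): 0.3048,
          ("acres", "hectares"): 0.404686,
          ("cubic feet", "cubic meters"): 0.0283168,
      }

      def convert(value, src, dst):
          """Convert between an English unit and its metric counterpart."""
          if (src, dst) in TO_METRIC:
              return value * TO_METRIC[(src, dst)]
          if (dst, src) in TO_METRIC:
              return value / TO_METRIC[(dst, src)]
          raise KeyError(f"no factor for {src} -> {dst}")

      print(convert(100.0, "acres", "hectares"))  # 40.4686
      print(convert(10.0, "meters", "feet"))      # 32.808...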

  7. Unsteady Aerodynamic Force Sensing from Measured Strain

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi

    2016-01-01

    A simple approach for computing unsteady aerodynamic forces from simulated measured strain data is proposed in this study. First, the deflection and slope of the structure are computed from the unsteady strain using a two-step approach. Velocities and accelerations of the structure are computed using the autoregressive moving average model, on-line parameter estimator, low-pass filter, and a least-squares curve fitting method together with analytical derivatives with respect to time. Finally, aerodynamic forces over the wing are computed using modal aerodynamic influence coefficient matrices, a rational function approximation, and a time-marching algorithm. A cantilevered rectangular wing built and tested at the NASA Langley Research Center (Hampton, Virginia, USA) in 1959 is used to validate the simple approach. Unsteady aerodynamic forces as well as wing deflections, velocities, accelerations, and strains are computed using the CFL3D computational fluid dynamics (CFD) code and an MSC/NASTRAN code (MSC Software Corporation, Newport Beach, California, USA), and these CFL3D-based results are treated as the measured quantities. Based on the measured strains, wing deflections, velocities, accelerations, and aerodynamic forces are computed using the proposed approach. These computed deflections, velocities, accelerations, and unsteady aerodynamic forces are compared with the CFL3D/NASTRAN-based results. In general, computed aerodynamic forces based on the lifting surface theory in subsonic speeds are in good agreement with the target aerodynamic forces generated using the CFL3D code with the Euler equation. Excellent aeroelastic responses are obtained even with unsteady strain data at a signal-to-noise ratio of -9.8 dB. The deflections, velocities, and accelerations at each sensor location are independent of the structural and aerodynamic models. Therefore, the distributed strain data together with the proposed approaches can be used as distributed deflection, velocity, and acceleration sensors. This research demonstrates the feasibility of obtaining induced drag and lift forces through the use of distributed sensor technology with measured strain data. An active induced drag control system can thus be designed using the two computed aerodynamic forces, induced drag and lift, to improve the fuel efficiency of an aircraft. Interpolation elements between structural finite element grids and the CFD grids and centroids are successfully incorporated with the unsteady aeroelastic computation scheme. The most critical technology for the success of the proposed approach is the robust on-line parameter estimator, since the least-squares curve fitting method depends heavily on aeroelastic system frequencies and damping factors.
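
    The first step, recovering structural deflection from distributed strain, is at heart a least-squares fit of strain mode shapes to the measured strain. A minimal sketch under that reading (the mode-shape matrices and strain vector below are random placeholders, not the NASTRAN model):

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical strain mode-shape matrix: 12 strain sensors x 4 modes.
      Phi_strain = rng.normal(size=(12, 4))
      q_true = np.array([0.8, -0.3, 0.1, 0.05])  # true modal coordinates

      # Simulated measured strain with noise.
      strain = Phi_strain @ q_true + 0.01 * rng.normal(size=12)

      # Least-squares estimate of modal coordinates from strain.
      q_hat, *_ = np.linalg.lstsq(Phi_strain, strain, rcond=None)

      # Physical deflections follow from displacement mode shapes.
      Phi_disp = rng.normal(size=(20, 4))        # 20 hypothetical grid points
      deflection = Phi_disp @ q_hat
      print(np.round(q_hat, 3))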

  8. A Rural Community's Involvement in the Design and Usability Testing of a Computer-Based Informed Consent Process for the Personalized Medicine Research Project

    PubMed Central

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants’ understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as genetic clinical trials consents. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. PMID:24273095

  9. A rural community's involvement in the design and usability testing of a computer-based informed consent process for the Personalized Medicine Research Project.

    PubMed

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants' understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, including those for trials involving treatment of genetic disorders. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. © 2013 Wiley Periodicals, Inc.

  10. Universal Quantum Computing with Measurement-Induced Continuous-Variable Gate Sequence in a Loop-Based Architecture.

    PubMed

    Takeda, Shuntaro; Furusawa, Akira

    2017-09-22

    We propose a scalable scheme for optical quantum computing using measurement-induced continuous-variable quantum gates in a loop-based architecture. Here, time-bin-encoded quantum information in a single spatial mode is deterministically processed in a nested loop by an electrically programmable gate sequence. This architecture can process any input state and an arbitrary number of modes with almost minimum resources, and offers a universal gate set for both qubits and continuous variables. Furthermore, quantum computing can be performed fault tolerantly by a known scheme for encoding a qubit in an infinite-dimensional Hilbert space of a single light mode.

  11. Universal Quantum Computing with Measurement-Induced Continuous-Variable Gate Sequence in a Loop-Based Architecture

    NASA Astrophysics Data System (ADS)

    Takeda, Shuntaro; Furusawa, Akira

    2017-09-01

    We propose a scalable scheme for optical quantum computing using measurement-induced continuous-variable quantum gates in a loop-based architecture. Here, time-bin-encoded quantum information in a single spatial mode is deterministically processed in a nested loop by an electrically programmable gate sequence. This architecture can process any input state and an arbitrary number of modes with almost minimum resources, and offers a universal gate set for both qubits and continuous variables. Furthermore, quantum computing can be performed fault tolerantly by a known scheme for encoding a qubit in an infinite-dimensional Hilbert space of a single light mode.

  12. Measurement and computer simulation of antennas on ships and aircraft for results of operational reliability

    NASA Astrophysics Data System (ADS)

    Kubina, Stanley J.

    1989-09-01

    The review of the status of computational electromagnetics by Miller and the exposition by Burke of the developments in one of the more important computer codes in the application of the electric field integral equation method, the Numerical Electromagnetic Code (NEC), coupled with Molinet's summary of progress in techniques based on the Geometrical Theory of Diffraction (GTD), provide a clear perspective on the maturity of the modern discipline of computational electromagnetics and its potential. Audone's exposition of the application to the computation of Radar Scattering Cross-section (RCS) is an indication of the breadth of practical applications, and his exploitation of modern near-field measurement techniques reminds one of progress in the measurement discipline, which is essential to the validation or calibration of computational modeling methodology when applied to complex structures such as aircraft and ships. The latter monograph also presents some comparison results with computational models. Some of the results presented for scale-model and flight measurements show serious disagreements in the lobe structure that would require detailed examination. This also applies to the radiation patterns obtained by flight measurement compared with those obtained using wire-grid models and integral equation modeling methods. In the examples that follow, an attempt is made to match measurement results completely over the entire 2 to 30 MHz HF range for antennas on a large patrol aircraft. The problems of validating computer models of HF antennas on a helicopter and of using computer models to generate radiation pattern information that cannot be obtained by measurement are discussed. The use of NEC computer models to analyze top-side ship configurations, where measurement results are not available and only self-validation measures or, at best, comparisons with an alternate GTD computer modeling technique are possible, is also discussed.

  13. Personal Computer System for Automatic Coronary Venous Flow Measurement

    PubMed Central

    Dew, Robert B.

    1985-01-01

    We developed an automated system based on an IBM PC/XT Personal computer to measure coronary venous blood flow during cardiac catheterization. Flow is determined by a thermodilution technique in which a cold saline solution is infused through a catheter into the coronary venous system. Regional temperature fluctuations sensed by the catheter are used to determine great cardiac vein and coronary sinus blood flow. The computer system replaces manual methods of acquiring and analyzing temperature data related to flow measurement, thereby increasing the speed and accuracy with which repetitive flow determinations can be made.
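
    Constant-infusion thermodilution reduces to a simple heat-balance relation once the blood, injectate, and mixed temperatures are known. A minimal sketch, assuming the classical Ganz-type formula with a constant (here taken as 1.08) for the density and specific-heat ratio of saline to blood; the temperatures and infusion rate below are hypothetical, and the paper's exact computation may differ:

      def coronary_flow_ml_min(infusion_rate, t_blood, t_injectate,
                               t_mixed, k=1.08):
          """Constant-infusion thermodilution flow estimate.

          Heat balance of the cold indicator with venous blood gives
            F_blood = k * F_inj * (T_mixed - T_inj) / (T_blood - T_mixed),
          where k ~ 1.08 corrects for the density and specific heat of
          saline relative to blood (assumed constant here).
          """
          return (k * infusion_rate *
                  (t_mixed - t_injectate) / (t_blood - t_mixed))

      # Hypothetical measurement: 40 ml/min of room-temperature saline.
      print(f"{coronary_flow_ml_min(40.0, 37.0, 24.0, 33.5):.0f} ml/min")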

  14. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    PubMed

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed a computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown its high stability, reliability and ease of utility.

  15. Unified derivations of measurement-based schemes for quantum computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Childs, Andrew M.; Leung, Debbie W.; Nielsen, Michael A.

    We present unified, systematic derivations of schemes in the two known measurement-based models of quantum computation. The first model (introduced by Raussendorf and Briegel, [Phys. Rev. Lett. 86, 5188 (2001)]) uses a fixed entangled state, adaptive measurements on single qubits, and feedforward of the measurement results. The second model (proposed by Nielsen, [Phys. Lett. A 308, 96 (2003)] and further simplified by Leung, [Int. J. Quant. Inf. 2, 33 (2004)]) uses adaptive two-qubit measurements that can be applied to arbitrary pairs of qubits, and feedforward of the measurement results. The underlying principle of our derivations is a variant of teleportation introduced by Zhou, Leung, and Chuang, [Phys. Rev. A 62, 052316 (2000)]. Our derivations unify these two measurement-based models of quantum computation and provide significantly simpler schemes.

  16. Identifying Optimal Measurement Subspace for the Ensemble Kalman Filter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ning; Huang, Zhenyu; Welch, Greg

    2012-05-24

    To reduce the computational load of the ensemble Kalman filter while maintaining its efficacy, an optimization algorithm based on the generalized eigenvalue decomposition method is proposed for identifying the most informative measurement subspace. When the number of measurements is large, the proposed algorithm can be used to make an effective tradeoff between computational complexity and estimation accuracy. This algorithm also can be extended to other Kalman filters for measurement subspace selection.
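
    Picking an informative measurement subspace with a generalized eigenvalue decomposition amounts to keeping the directions that maximize a Rayleigh quotient of signal against noise covariance. A minimal illustration of that selection step (the covariance matrices below are random placeholders, not power-system data, and the paper's exact criterion may differ):

      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(1)

      m = 50                                    # number of raw measurements
      A = rng.normal(size=(m, m))
      S = A @ A.T + np.eye(m)                   # "signal" covariance (SPD)
      B = np.diag(rng.uniform(0.5, 2.0, m))     # measurement-noise covariance

      # Generalized eigenproblem S v = lambda B v; a large lambda marks a
      # direction where signal dominates noise.
      vals, vecs = eigh(S, B)
      order = np.argsort(vals)[::-1]
      k = 10                                    # keep 10 informative directions
      T = vecs[:, order[:k]]                    # m x k projection matrix

      # Project a raw measurement vector into the reduced subspace.
      y = rng.normal(size=m)
      y_reduced = T.T @ y
      print(y_reduced.shape)                    # (10,)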

  17. Descriptive study of electromagnetic wave distribution for various seating positions: using digital textbooks.

    PubMed

    Seomun, GyeongAe; Kim, YoungHwan; Lee, Jung-Ah; Jeong, KwangHoon; Park, Seon-A; Kim, Miran; Noh, Wonjung

    2014-04-01

    To better understand environmental electromagnetic wave exposure during the use of digital textbooks by elementary school students, we measured numeric values of the electromagnetic fields produced by tablet personal computers (TPCs). Specifically, we examined the distribution of the electromagnetic waves for various students' seating positions in an elementary school that uses digital textbooks. Electric and magnetic fields from TPCs were measured using the HI-3603 Visual Display Terminal/Very Low Frequency (VDT/VLF) radiation measurement system. Electromagnetic field values from TPCs measured at a student's seat and at a teacher's computer were deemed not harmful to health. However, electromagnetic field values varied based on the distance between students, other electronic devices such as desktop computers, and student posture while using a TPC. Based on these results, it is necessary to guide students to observe proper posture and to arrange seats at an appropriate distance in the classroom.

  18. Comparison of computer-assisted instruction (CAI) versus traditional textbook methods for training in abdominal examination (Japanese experience).

    PubMed

    Qayumi, A K; Kurihara, Y; Imai, M; Pachev, G; Seo, H; Hoshino, Y; Cheifetz, R; Matsuura, K; Momoi, M; Saleem, M; Lara-Guerra, H; Miki, Y; Kariya, Y

    2004-10-01

    This study aimed to compare the effects of computer-assisted, text-based and computer-and-text learning conditions on the performances of 3 groups of medical students in the pre-clinical years of their programme, taking into account their academic achievement to date. A fourth group of students served as a control (no-study) group. Participants were recruited from the pre-clinical years of the training programmes in 2 medical schools in Japan, Jichi Medical School near Tokyo and Kochi Medical School near Osaka. Participants were randomly assigned to 4 learning conditions and tested before and after the study on their knowledge of and skill in performing an abdominal examination, in a multiple-choice test and an objective structured clinical examination (OSCE), respectively. Information about performance in the programme was collected from school records and students were classified as average, good or excellent. Student and faculty evaluations of their experience in the study were explored by means of a short evaluation survey. Compared to the control group, all 3 study groups exhibited significant gains in performance on knowledge and performance measures. For the knowledge measure, the gains of the computer-assisted and computer-assisted plus text-based learning groups were significantly greater than the gains of the text-based learning group. The performances of the 3 groups did not differ on the OSCE measure. Analyses of gains by performance level revealed that high achieving students' learning was independent of study method. Lower achieving students performed better after using computer-based learning methods. The results suggest that computer-assisted learning methods will be of greater help to students who do not find the traditional methods effective. Explorations of the factors behind this are a matter for future research.

  19. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  20. Application of CT-PSF-based computer-simulated lung nodules for evaluating the accuracy of computer-aided volumetry.

    PubMed

    Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji

    2012-07-01

    With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring the nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies for determining the accuracy of volumetry software have been performed using a phantom with artificial nodules. These phantom studies are limited, however, in their ability to reproduce the nodules both accurately and in the variety of sizes and densities required. Therefore, we propose a new approach of using computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement obtained between computer-simulated nodules and phantom nodules regarding the volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software was revealed to be accurate within an error of 20 % for nodules >5 mm and with the difference between nodule density and background (lung) (CT value) being 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We concluded that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
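
    The essence of PSF-based nodule simulation is blurring an ideal nodule model with the scanner's point spread function before inserting it into a clinical image. A minimal 2-D sketch using a Gaussian stand-in for the measured PSF (the actual method uses the PSF measured on the CT system and 3-D volumes; all values here are illustrative):

      import numpy as np
      from scipy.ndimage import gaussian_filter

      # Ideal circular nodule on a zero background (values in HU contrast).
      size, radius = 64, 6                      # pixels
      yy, xx = np.mgrid[:size, :size]
      nodule = np.where((xx - size // 2) ** 2 + (yy - size // 2) ** 2
                        <= radius ** 2, 500.0, 0.0)

      # Blur with the (here Gaussian) point spread function of the scanner.
      simulated = gaussian_filter(nodule, sigma=1.5)

      # Add onto a lung-like background patch with noise, mimicking the
      # paper's evaluation setup (background is a random placeholder).
      rng = np.random.default_rng(2)
      background = -800.0 + 20.0 * rng.normal(size=(size, size))
      image = background + simulated
      print(image.shape, np.round(image.max(), 1))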

  1. Evaluation of Complex Human Performance: The Promise of Computer-Based Simulation

    ERIC Educational Resources Information Center

    Newsom, Robert S.; And Others

    1978-01-01

    For the training and placement of professional workers, multiple-choice instruments are the norm for wide-scale measurement and evaluation efforts. These instruments contain fundamental problems. Computer-based management simulations may provide solutions to these problems, appear scoreable and reliable, offer increased validity, and are better…

  2. Specifying and Refining a Measurement Model for a Computer-Based Interactive Assessment

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in computer-based interactive assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance. This article describes a Bayesian approach to modeling and estimating cognitive models…

  3. Computer-Based and Paper-Based Measurement of Semantic Knowledge

    DTIC Science & Technology

    1989-01-01


  4. Guidelines for Developing Computer Based Resource Units. Revised.

    ERIC Educational Resources Information Center

    State Univ. of New York, Buffalo. Coll. at Buffalo. Educational Research and Development Complex.

    Presented for use with normal and handicapped children are guidelines for the development of computer based resource units, organized into two operations: the production of software, which includes the writing of instructional objectives, content, activities, materials, and measuring devices; and the coding of the software…

  5. Silicon CMOS architecture for a spin-based quantum computer.

    PubMed

    Veldhorst, M; Eenink, H G J; Yang, C H; Dzurak, A S

    2017-12-15

    Recent advances in quantum error correction codes for fault-tolerant quantum computing and physical realizations of high-fidelity qubits in multiple platforms give promise for the construction of a quantum computer based on millions of interacting qubits. However, the classical-quantum interface remains a nascent field of exploration. Here, we propose an architecture for a silicon-based quantum computer processor based on complementary metal-oxide-semiconductor (CMOS) technology. We show how a transistor-based control circuit together with charge-storage electrodes can be used to operate a dense and scalable two-dimensional qubit system. The qubits are defined by the spin state of a single electron confined in quantum dots, coupled via exchange interactions, controlled using a microwave cavity, and measured via gate-based dispersive readout. We implement a spin qubit surface code, showing the prospects for universal quantum computation. We discuss the challenges and focus areas that need to be addressed, providing a path for large-scale quantum computing.

  6. Registration of surface structures using airborne focused ultrasound.

    PubMed

    Sundström, N; Börjesson, P O; Holmer, N G; Olsson, L; Persson, H W

    1991-01-01

    A low-cost measuring system, based on a personal computer combined with standard equipment for complex measurements and signal processing, has been assembled. Such a system increases the possibilities for small hospitals and clinics to finance advanced measuring equipment. A description of equipment developed for airborne ultrasound together with a personal computer-based system for fast data acquisition and processing is given. Two air-adapted ultrasound transducers with high lateral resolution have been developed. Furthermore, a few results for fast and accurate estimation of signal arrival time are presented. The theoretical estimation models developed are applied to skin surface profile registrations.
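
    Fast estimation of signal arrival time is commonly done by cross-correlating the received echo with a reference pulse and locating the correlation peak. A minimal sketch of that general idea (the waveform parameters below are hypothetical, not the paper's estimation models):

      import numpy as np

      fs = 1.0e6                                # 1 MHz sampling rate
      t = np.arange(0, 2.0e-4, 1.0 / fs)

      # Reference pulse: short windowed tone burst at 40 kHz.
      pulse_t = np.arange(0, 2.5e-5, 1.0 / fs)
      pulse = np.sin(2 * np.pi * 40e3 * pulse_t) * np.hanning(pulse_t.size)

      # Received signal: pulse delayed by 120 us plus noise.
      true_delay = 1.2e-4
      rx = np.zeros_like(t)
      start = int(true_delay * fs)
      rx[start:start + pulse.size] += pulse
      rx += 0.05 * np.random.default_rng(3).normal(size=rx.size)

      # Arrival time from the cross-correlation peak.
      corr = np.correlate(rx, pulse, mode="valid")
      est_delay = np.argmax(corr) / fs
      print(f"estimated arrival: {est_delay * 1e6:.0f} us")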

  7. Using Business Simulations as Authentic Assessment Tools

    ERIC Educational Resources Information Center

    Neely, Pat; Tucker, Jan

    2012-01-01

    New modalities for assessing student learning exist as a result of advances in computer technology. Conventional measurement practices have been transformed into computer based testing. Although current testing replicates assessment processes used in college classrooms, a greater opportunity exists to use computer technology to create authentic…

  8. What Is Measured in Mathematics Tests? Construct Validity of Curriculum-Based Mathematics Measures.

    ERIC Educational Resources Information Center

    Thurber, Robin Schul; Shinn, Mark R.; Smolkowski, Keith

    2002-01-01

    Mathematics curriculum-based measurement (M-CBM) is one tool that has been developed for formative evaluation in mathematics. This study examines what constructs M-CBM actually measures in the context of a range of other mathematics measures. Results indicated that a two-factor model of mathematics where Computation and Applications were distinct…

  9. Lattice surgery on the Raussendorf lattice

    NASA Astrophysics Data System (ADS)

    Herr, Daniel; Paler, Alexandru; Devitt, Simon J.; Nori, Franco

    2018-07-01

    Lattice surgery is a method to perform quantum computation fault-tolerantly by using operations on boundary qubits between different patches of the planar code. This technique allows for universal planar code computation without eliminating the intrinsic two-dimensional nearest-neighbor properties of the surface code that eases physical hardware implementations. Lattice surgery approaches to algorithmic compilation and optimization have been demonstrated to be more resource efficient for resource-intensive components of a fault-tolerant algorithm, and consequently may be preferable over braid-based logic. Lattice surgery can be extended to the Raussendorf lattice, providing a measurement-based approach to the surface code. In this paper we describe how lattice surgery can be performed on the Raussendorf lattice and therefore give a viable alternative to computation using braiding in measurement-based implementations of topological codes.

  10. Measurement of information and communication technology experience and attitudes to e-learning of students in the healthcare professions: integrative review.

    PubMed

    Wilkinson, Ann; While, Alison E; Roberts, Julia

    2009-04-01

    This paper is a report of a review to describe and discuss the psychometric properties of instruments used in healthcare education settings measuring experience and attitudes of healthcare students regarding their information and communication technology skills and their use of computers and the Internet for education. Healthcare professionals are expected to be computer and information literate at registration. A previous review of evaluative studies of computer-based learning suggests that methods of measuring learners' attitudes to computers and computer aided learning are problematic. A search of eight health and social science databases located 49 papers, the majority published between 1995 and January 2007, focusing on the experience and attitudes of students in the healthcare professions towards computers and e-learning. An integrative approach was adopted, with narrative description of findings. Criteria for inclusion were quantitative studies using survey tools with samples of healthcare students and concerning computer and information literacy skills, access to computers, experience with computers and use of computers and the Internet for education purposes. Since the 1980s a number of instruments have been developed, mostly in the United States of America, to measure attitudes to computers, anxiety about computer use, information and communication technology skills, satisfaction and more recently attitudes to the Internet and computers for education. The psychometric properties are poorly described. Advances in computers and technology mean that many earlier tools are no longer valid. Measures of the experience and attitudes of healthcare students to the increased use of e-learning require development in line with computer and technology advances.

  11. Characterizing the Properties of a Woven SiC/SiC Composite Using W-CEMCAN Computer Code

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; DiCarlo, James A.

    1999-01-01

    A micromechanics based computer code to predict the thermal and mechanical properties of woven ceramic matrix composites (CMC) is developed. This computer code, W-CEMCAN (Woven CEramic Matrix Composites ANalyzer), predicts the properties of two-dimensional woven CMC at any temperature and takes into account various constituent geometries and volume fractions. This computer code is used to predict the thermal and mechanical properties of an advanced CMC composed of 0/90 five-harness (5 HS) Sylramic fiber which had been chemically vapor infiltrated (CVI) with boron nitride (BN) and SiC interphase coatings and melt-infiltrated (MI) with SiC. The predictions, based on the bulk constituent properties from the literature, are compared with measured experimental data. Based on the comparison, improved or calibrated properties for the constituent materials are then developed for use by material developers/designers. The computer code is then used to predict the properties of a composite with the same constituents but with different fiber volume fractions. The predictions are compared with measured data and a good agreement is achieved.
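
    At its simplest, the micromechanics underlying such property prediction combines constituent properties weighted by volume fraction. A minimal rule-of-mixtures sketch of that idea (constituent moduli and fiber fraction are illustrative, not the calibrated Sylramic/BN/SiC data, and W-CEMCAN's actual formulation is far more detailed):

      def rule_of_mixtures(e_fiber, e_matrix, v_fiber):
          """Longitudinal (Voigt) and transverse (Reuss) stiffness bounds
          for a two-constituent composite."""
          v_matrix = 1.0 - v_fiber
          e_long = v_fiber * e_fiber + v_matrix * e_matrix
          e_trans = 1.0 / (v_fiber / e_fiber + v_matrix / e_matrix)
          return e_long, e_trans

      # Illustrative moduli (GPa) and a 0.36 fiber volume fraction.
      e_l, e_t = rule_of_mixtures(e_fiber=380.0, e_matrix=300.0, v_fiber=0.36)
      print(f"E_longitudinal = {e_l:.0f} GPa, E_transverse = {e_t:.0f} GPa")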

  12. Applications of Technology in Neuropsychological Assessment

    PubMed Central

    Parsey, Carolyn M.; Schmitter-Edgecombe, Maureen

    2013-01-01

    Most neuropsychological assessments include at least one measure that is administered, scored, or interpreted by computers or other technologies. Despite supportive findings for these technology-based assessments, there is resistance in the field of neuropsychology to adopt additional measures that incorporate technology components. This literature review addresses the research findings of technology-based neuropsychological assessments, including computer- and virtual reality-based measures of cognitive and functional abilities. We evaluate the strengths and limitations of each approach, and examine the utility of technology-based assessments to obtain supplemental cognitive and behavioral information that may be otherwise undetected by traditional paper-and-pencil measures. We argue that the potential of technology use in neuropsychological assessment has not yet been realized, and continued adoption of new technologies could result in more comprehensive assessment of cognitive dysfunction and in turn, better informed diagnosis and treatments. Recommendations for future research are also provided. PMID:24041037

  13. Applications of technology in neuropsychological assessment.

    PubMed

    Parsey, Carolyn M; Schmitter-Edgecombe, Maureen

    2013-01-01

    Most neuropsychological assessments include at least one measure that is administered, scored, or interpreted by computers or other technologies. Despite supportive findings for these technology-based assessments, there is resistance in the field of neuropsychology to adopt additional measures that incorporate technology components. This literature review addresses the research findings of technology-based neuropsychological assessments, including computer- and virtual reality-based measures of cognitive and functional abilities. We evaluate the strengths and limitations of each approach, and examine the utility of technology-based assessments to obtain supplemental cognitive and behavioral information that may be otherwise undetected by traditional paper-and-pencil measures. We argue that the potential of technology use in neuropsychological assessment has not yet been realized, and continued adoption of new technologies could result in more comprehensive assessment of cognitive dysfunction and in turn, better informed diagnosis and treatments. Recommendations for future research are also provided.

  14. High-Threshold Fault-Tolerant Quantum Computation with Analog Quantum Error Correction

    NASA Astrophysics Data System (ADS)

    Fukui, Kosuke; Tomita, Akihisa; Okamoto, Atsushi; Fujii, Keisuke

    2018-04-01

    To implement fault-tolerant quantum computation with continuous variables, the Gottesman-Kitaev-Preskill (GKP) qubit has been recognized as an important technological element. However, it is still challenging to experimentally generate the GKP qubit with the required squeezing level, 14.8 dB, of the existing fault-tolerant quantum computation. To reduce this requirement, we propose a high-threshold fault-tolerant quantum computation with GKP qubits using topologically protected measurement-based quantum computation with the surface code. By harnessing analog information contained in the GKP qubits, we apply analog quantum error correction to the surface code. Furthermore, we develop a method to prevent the squeezing level from decreasing during the construction of the large-scale cluster states for the topologically protected, measurement-based, quantum computation. We numerically show that the required squeezing level can be relaxed to less than 10 dB, which is within the reach of the current experimental technology. Hence, this work can considerably alleviate this experimental requirement and take a step closer to the realization of large-scale quantum computation.

  15. Contextual Fraction as a Measure of Contextuality.

    PubMed

    Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane

    2017-08-04

    We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
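
    The linear program referred to can be written down directly for a small scenario: maximize the weight of a noncontextual (globally consistent) part of the empirical model; the contextual fraction is one minus that optimum. A minimal sketch for a CHSH-type table using scipy.optimize.linprog (the empirical model below, a noisy PR box, is a hypothetical example):

      import itertools
      import numpy as np
      from scipy.optimize import linprog

      # Settings a in {0,1} for Alice, b in {0,1} for Bob; binary outcomes.
      # A global assignment fixes outcomes for all four observables at
      # once: g = (a0, a1, b0, b1).
      assignments = list(itertools.product([0, 1], repeat=4))

      # Hypothetical empirical model: noisy PR box with visibility v.
      v = 0.75
      def e(a, b, oa, ob):
          pr = 0.5 if (oa ^ ob) == (a & b) else 0.0  # PR-box correlations
          return v * pr + (1 - v) * 0.25

      # LP: maximize sum_g x_g subject to, for every context and outcome,
      # the total weight of consistent assignments <= empirical probability.
      rows, rhs = [], []
      for a, b in itertools.product([0, 1], repeat=2):
          for oa, ob in itertools.product([0, 1], repeat=2):
              rows.append([1.0 if (g[a] == oa and g[2 + b] == ob) else 0.0
                           for g in assignments])
              rhs.append(e(a, b, oa, ob))

      res = linprog(c=-np.ones(len(assignments)), A_ub=rows, b_ub=rhs,
                    bounds=[(0, None)] * len(assignments), method="highs")
      noncontextual_fraction = -res.fun
      print(f"contextual fraction = {1 - noncontextual_fraction:.3f}")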

  16. Contextual Fraction as a Measure of Contextuality

    NASA Astrophysics Data System (ADS)

    Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane

    2017-08-01

    We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.

  17. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    PubMed

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure is still a challenge within a clinical setting for each case anew. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and able to predict global airflow quantities, as well as local tissue aeration and strains for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing to simulate EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model. Thus they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results with the patient measurements. First promising results obtained in an acute respiratory distress syndrome patient show the potential of this approach for personalized computationally guided optimization of mechanical ventilation in future. Copyright © 2017 the American Physiological Society.

  18. Transient Three-Dimensional Startup Side Load Analysis of a Regeneratively Cooled Nozzle

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    2008-01-01

    The objective of this effort is to develop a computational methodology to capture the startup side load physics and to anchor the computed aerodynamic side loads with the available data from a regeneratively cooled, high-aspect-ratio nozzle, hot-fired at sea level. The computational methodology is based on an unstructured-grid, pressure-based, reacting flow computational fluid dynamics and heat transfer formulation, a transient 5 s inlet history based on an engine system simulation, and a wall temperature distribution to reflect the effect of regenerative cooling. To understand the effect of regenerative wall cooling, two transient computations were performed using the boundary conditions of adiabatic and cooled walls, respectively. The results show that three types of shock evolution are responsible for side loads: generation of combustion wave; transitions among free-shock separation, restricted-shock separation, and simultaneous free-shock and restricted shock separations; along with the pulsation of shocks across the lip, although the combustion wave is commonly eliminated with the sparklers during actual test. The test measured two side load events: a secondary and lower side load, followed by a primary and peak side load. Results from both wall boundary conditions captured the free-shock separation to restricted-shock separation transition with computed side loads matching the measured secondary side load. For the primary side load, the cooled wall transient produced restricted-shock pulsation across the nozzle lip with peak side load matching that of the test, while the adiabatic wall transient captured shock transitions and free-shock pulsation across the lip with computed peak side load 50% lower than that of the measurement. The computed dominant pulsation frequency of the cooled wall nozzle agrees with that of a separate test, while that of the adiabatic wall nozzle is more than 50% lower than that of the measurement. The computed teepee-like formation and the tangential motion of the shocks during lip pulsation also qualitatively agree with those of test observations. Moreover, a third transient computation was performed with a proportionately shortened 1 s sequence, and lower side loads were obtained with the higher ramp rate.

  19. VMEbus based computer and real-time UNIX as infrastructure of DAQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yasu, Y.; Fujii, H.; Nomachi, M.

    1994-12-31

    This paper describes what the authors have constructed as the infrastructure of a data acquisition system (DAQ). The paper reports recent developments concerning the HP VME board computer with LynxOS (HP742rt/HP-RT) and Alpha/OSF1 with a VMEbus adapter. The paper also reports the current status of developing a Benchmark Suite for Data Acquisition (DAQBENCH) for measuring not only the performance of VME/CAMAC access but also that of context switching, inter-process communication, and so on, for various computers including workstation-based systems and VME board computers.

  20. Development of the Telehealth Usability Questionnaire (TUQ).

    PubMed

    Parmanto, Bambang; Lewis, Allen Nelson; Graham, Kristin M; Bertolet, Marnie H

    2016-01-01

    Current telehealth usability questionnaires are designed primarily for older technologies, where telehealth interaction is conducted over dedicated videoconferencing applications. However, telehealth services are increasingly conducted over computer-based systems that rely on commercial software and a user supplied computer interface. Therefore, a usability questionnaire that addresses the changes in telehealth service delivery and technology is needed. The Telehealth Usability Questionnaire (TUQ) was developed to evaluate the usability of telehealth implementation and services. This paper addresses: (1) the need for a new measure of telehealth usability, (2) the development of the TUQ, (3) intended uses for the TUQ, and (4) the reliability of the TUQ. Analyses indicate that the TUQ is a solid, robust, and versatile measure that can be used to measure the quality of the computer-based user interface and the quality of the telehealth interaction and services.

  1. Molecular Dynamics Simulations and Kinetic Measurements to Estimate and Predict Protein-Ligand Residence Times.

    PubMed

    Mollica, Luca; Theret, Isabelle; Antoine, Mathias; Perron-Sierra, Françoise; Charton, Yves; Fourquez, Jean-Marie; Wierzbicki, Michel; Boutin, Jean A; Ferry, Gilles; Decherchi, Sergio; Bottegoni, Giovanni; Ducrot, Pierre; Cavalli, Andrea

    2016-08-11

    Ligand-target residence time is emerging as a key drug discovery parameter because it can reliably predict drug efficacy in vivo. Experimental approaches to binding and unbinding kinetics are nowadays available, but we still lack reliable computational tools for predicting kinetics and residence time. Most attempts have been based on brute-force molecular dynamics (MD) simulations, which are CPU-demanding and not yet particularly accurate. We recently reported a new scaled-MD-based protocol, which showed potential for residence time prediction in drug discovery. Here, we further challenged our procedure's predictive ability by applying our methodology to a series of glucokinase activators that could be useful for treating type 2 diabetes mellitus. We combined scaled MD with experimental kinetics measurements and X-ray crystallography, promptly checking the protocol's reliability by directly comparing computational predictions and experimental measures. The good agreement highlights the potential of our scaled-MD-based approach as an innovative method for computationally estimating and predicting drug residence times.
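
    In scaled-MD protocols of this kind, unbinding times observed at several potential-scaling factors are extrapolated back to the unscaled potential. A minimal sketch of that extrapolation step (scaling factors and unbinding times below are hypothetical, and the published protocol's fitting details may differ):

      import numpy as np

      # Hypothetical mean unbinding times (ns) observed at scaled potentials.
      lam = np.array([0.40, 0.45, 0.50, 0.55])     # potential scaling factors
      t_unbind = np.array([1.2, 2.9, 7.5, 18.0])   # ns, from scaled-MD runs

      # Residence time grows roughly exponentially as lambda -> 1, so fit
      # log(t) linearly in lambda and extrapolate to the unscaled potential.
      slope, intercept = np.polyfit(lam, np.log(t_unbind), 1)
      t_residence = np.exp(slope * 1.0 + intercept)
      print(f"extrapolated residence time ~ {t_residence / 1e3:.1f} us")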

  2. The I-V Measurement System for Solar Cells Based on MCU

    NASA Astrophysics Data System (ADS)

    Fengxiang, Chen; Yu, Ai; Jiafu, Wang; Lisheng, Wang

    2011-02-01

    In this paper, an I-V measurement system for solar cells based on a single-chip microcomputer (MCU) is presented. According to the test principles of solar cells, the measurement system comprises two parts: data collection, and data processing and display. The MCU is mainly used to acquire data; the collected results are then sent to the computer through a serial port. The I-V measurement results are shown in a human-computer interaction interface based on our hardware circuit. By comparing the results of our I-V tester with those of other commercial I-V testers, we found that the errors for most parameters are less than 5%, which shows that our I-V test results are reliable. Because the MCU can be applied in many fields, this I-V measurement system offers a simple prototype for a portable I-V tester for solar cells.
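
    On the computer side, such a system amounts to reading voltage-current pairs from the serial port and deriving the usual cell parameters. A minimal sketch using pyserial (the port name, baud rate, and "voltage,current" line format are assumptions about the hardware, not the paper's actual protocol):

      import numpy as np
      import serial  # pyserial

      def read_iv_curve(port="COM3", baud=9600, n_points=100):
          """Read 'voltage,current' lines streamed by the MCU."""
          pairs = []
          with serial.Serial(port, baud, timeout=2) as ser:
              while len(pairs) < n_points:
                  line = ser.readline().decode(errors="ignore").strip()
                  try:
                      v, i = map(float, line.split(","))
                      pairs.append((v, i))
                  except ValueError:
                      continue  # skip malformed lines
          return np.array(pairs)

      def cell_parameters(iv):
          """Short-circuit current, open-circuit voltage, and fill factor.
          Assumes points sorted by increasing voltage, decreasing current."""
          v, i = iv[:, 0], iv[:, 1]
          i_sc = np.interp(0.0, v, i)       # current at V = 0
          v_oc = np.interp(0.0, -i, v)      # voltage where I crosses zero
          fill_factor = np.max(v * i) / (i_sc * v_oc)
          return i_sc, v_oc, fill_factor

      # iv = read_iv_curve()                # requires connected hardware
      # print(cell_parameters(iv))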

  3. Evaluation of the Relationship between Literacy and Mathematics Skills as Assessed by Curriculum-Based Measures

    ERIC Educational Resources Information Center

    Rutherford-Becker, Kristy J.; Vanderwood, Michael L.

    2009-01-01

    The purpose of this study was to evaluate the extent that reading performance (as measured by curriculum-based measures [CBM] of oral reading fluency [ORF] and Maze reading comprehension), is related to math performance (as measured by CBM math computation and applied math). Additionally, this study examined which of the two reading measures was a…

  4. Comparing Student Performance on Paper-and-Pencil and Computer-Based-Tests

    ERIC Educational Resources Information Center

    Hardcastle, Joseph; Herrmann-Abell, Cari F.; DeBoer, George E.

    2017-01-01

    Can student performance on computer-based tests (CBT) and paper-and-pencil tests (PPT) be considered equivalent measures of student knowledge? States and school districts are grappling with this question, and although studies addressing this question are growing, additional research is needed. We report on the performance of students who took…

  5. Development of a Computer-Based Measure of Listening Comprehension of Science Talk

    ERIC Educational Resources Information Center

    Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien

    2015-01-01

    The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…

  6. Development and Initial Psychometric Properties of the Computer Assisted Maltreatment Inventory (CAMI): A Comprehensive Self-Report Measure of Child Maltreatment History

    ERIC Educational Resources Information Center

    DiLillo, David; Hayes-Skelton, Sarah A.; Fortier, Michelle A.; Perry, Andrea R.; Evans, Sarah E.; Messman Moore, Terri L.; Walsh, Kate; Nash, Cindy; Fauchier, Angele

    2010-01-01

    Objectives: The present study reports on the development and initial psychometric properties of the Computer Assisted Maltreatment Inventory (CAMI), a web-based self-report measure of child maltreatment history, including sexual and physical abuse, exposure to interparental violence, psychological abuse, and neglect. Methods: The CAMI was…

  7. Bayes' theorem application in the measure information diagnostic value assessment

    NASA Astrophysics Data System (ADS)

    Orzechowski, Piotr D.; Makal, Jaroslaw; Nazarkiewicz, Andrzej

    2006-03-01

    The paper presents an application of Bayesian methods to assessing the diagnostic value of measurement information used in a computer-aided diagnosis system. The computer system described here is built on a Bayesian network and is used in Benign Prostatic Hyperplasia (BPH) diagnosis. The graphical diagnostic model makes it possible to juxtapose experts' knowledge with data.

  8. Measuring the Computer-Related Self-Concept

    ERIC Educational Resources Information Center

    Langheinrich, Jessica; Schönfelder, Mona; Bogner, Franz X.

    2016-01-01

    A positive self-concept supposedly affects a student's well-being as well as his or her perception of individual competence at school. As computer-based learning is becoming increasingly important in school, a positive computer-related self-concept (CSC) might help to enhance cognitive achievement. Consequently, we focused on establishing a short,…

  9. On the radiated EMI current extraction of dc transmission line based on corona current statistical measurements

    NASA Astrophysics Data System (ADS)

    Yi, Yong; Chen, Zhengying; Wang, Liming

    2018-05-01

    Corona discharge originating on DC transmission lines is the main cause of the radiated electromagnetic interference (EMI) field in the vicinity of such lines. A joint time-frequency analysis technique is proposed to extract the radiated EMI current (excitation current) of DC corona from statistical measurements of corona current. A reduced-scale experimental platform was set up to measure the statistical distributions of the current waveform parameters of an aluminum conductor steel-reinforced (ACSR) conductor. Based on the measured results, the peak, root-mean-square, and average values of the 0.5 MHz radiated EMI current with 9 kHz and 200 Hz bandwidths were calculated with the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated based on the radiated EMI current, and a wire-to-plate platform was built to check the validity of the RI computation results. The reasons for the observed deviation between computations and measurements are analyzed in detail.

  10. Relative resilience to noise of standard and sequential approaches to measurement-based quantum computation

    NASA Astrophysics Data System (ADS)

    Gallagher, C. B.; Ferraro, A.

    2018-05-01

    A possible alternative to the standard model of measurement-based quantum computation (MBQC) is offered by the sequential model of MBQC—a particular class of quantum computation via ancillae. Although these two models are equivalent under ideal conditions, their relative resilience to noise in practical conditions is not yet known. We analyze this relationship for various noise models in the ancilla preparation and in the entangling-gate implementation. The comparison of the two models is performed utilizing both the gate infidelity and the diamond distance as figures of merit. Our results show that in the majority of instances the sequential model outperforms the standard one in regard to a universal set of operations for quantum computation. Further investigation is made into the performance of sequential MBQC in experimental scenarios, thus setting benchmarks for possible cavity-QED implementations.

  11. Computer ray tracing speeds.

    PubMed

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray-trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. Ray-trace speed has been correlated with the LINPACK benchmark, which allows ray-trace speed to be estimated from LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast as or faster than mainframe computers in compute-bound situations.
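
    Estimating ray-trace speed from LINPACK data, as described, amounts to fitting a simple regression between the two benchmarks. A sketch with made-up data points (the paper's actual measurements are not reproduced here):

    ```python
    import numpy as np

    # Hypothetical (LINPACK MFLOPS, measured rays/s) pairs for four machines;
    # the real benchmark values from the paper are not reproduced here.
    linpack = np.array([0.5, 2.0, 10.0, 40.0])
    rays_per_s = np.array([120.0, 500.0, 2400.0, 9800.0])

    # Least-squares fit of ray-trace speed as a linear function of LINPACK
    # score, then used to estimate the speed of a new machine.
    slope, intercept = np.polyfit(linpack, rays_per_s, 1)
    print(slope * 25.0 + intercept)  # estimated rays/s at 25 MFLOPS
    ```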

  12. Adaptive quantum computation in changing environments using projective simulation

    NASA Astrophysics Data System (ADS)

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-08-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  13. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263

  14. Computer vision based nacre thickness measurement of Tahitian pearls

    NASA Astrophysics Data System (ADS)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian pearl is the most valuable export product of French Polynesia, contributing, with over 61 million Euros, more than 50% of the total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl intended for export has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyze X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely the large variety of shapes and the occurrence of cavities, have so far not been considered. The presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with a purpose-built heuristic circle detection, and segmenting possible cavities with region growing. From the obtained boundaries, the 2-dimensional nacre thickness profile can be calculated. A certainty measure accounting for imaging and segmentation imprecision is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to automatically evaluate the nacre thickness of Tahitian pearls with computer vision. Furthermore, the results show that the automatic measurement is more precise and faster than the manual one.
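
    The paper uses its own heuristic circle detection for the nucleus; as a rough stand-in, a generic Hough-transform circle search conveys the idea. A sketch assuming OpenCV and a hypothetical image file name:

    ```python
    import cv2
    import numpy as np

    # Generic circle detection on an X-ray image as a stand-in for the
    # paper's own nucleus-detection heuristic (file name is hypothetical).
    img = cv2.imread("pearl_xray.png", cv2.IMREAD_GRAYSCALE)
    assert img is not None, "replace with a real image path"
    img = cv2.medianBlur(img, 5)  # suppress noise before edge detection

    circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                               param1=100, param2=30,
                               minRadius=20, maxRadius=200)
    if circles is not None:
        x, y, r = np.round(circles[0, 0]).astype(int)  # strongest candidate
        print(f"nucleus centre ({x}, {y}), radius {r} px")
    ```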

  15. Assessing Functional Performance Using Computer-Based Simulations of Everyday Activities

    PubMed Central

    Czaja, Sara J.; Loewenstein, David A.; Lee, Chin Chin; Fu, Shih Hua; Harvey, Philip D.

    2016-01-01

    Current functional capacity (FC) measures for patients with schizophrenia typically involve informant assessments or are in paper-and-pencil format, requiring in-person administration by a skilled assessor. This approach presents logistic problems and limits the possibilities for remote assessment, an important issue for these patients. This study evaluated the feasibility of using a computer-based assessment battery, including simulations of everyday activities. The battery was compared to in-person standard assessments of cognition and FC with respect to baseline convergence and sensitivity to group differences. The battery, administered on a touch screen computer, included measures of critical everyday activities, including: ATM Banking/Financial Management, Prescription Refills via Telephone/Voice Menu System, and Forms Completion (simulating a clinic and patient history form). The sample included 77 older adult patients with schizophrenia and 24 older adult healthy controls, who were administered the battery at two time points. The results indicated that the battery was sensitive to group differences in FC. Performance on the battery was also moderately correlated with standard measures of cognitive abilities and showed convergence with standard measures of FC, while demonstrating good test-retest reliability. Our results show that it is feasible to use technology-based assessment protocols with older adults and patients with schizophrenia. The battery overcomes logistic constraints associated with current FC assessment protocols, as it is computer-based, can be delivered remotely, and does not require a healthcare professional for administration. PMID:27913159

  16. Task-based image quality assessment in radiation therapy: initial characterization and demonstration with CT simulation images

    NASA Astrophysics Data System (ADS)

    Dolly, Steven R.; Anastasio, Mark A.; Yu, Lifeng; Li, Hua

    2017-03-01

    In current radiation therapy practice, image quality is still assessed subjectively or by utilizing physically-based metrics. Recently, a methodology for objective task-based image quality (IQ) assessment in radiation therapy was proposed by Barrett et al. [1] In this work, we present a comprehensive implementation and evaluation of this new IQ assessment methodology. A modular simulation framework was designed to perform an automated, computer-simulated end-to-end radiation therapy treatment. The fully simulated framework utilizes new learning-based stochastic object models (SOMs) to obtain known organ boundaries, generates a set of images directly from the numerical phantoms created with the SOMs, and automates the image segmentation and treatment planning steps of a radiation therapy workflow. By use of this computational framework, therapeutic operating characteristic (TOC) curves can be computed and the area under the TOC curve (AUTOC) can be employed as a figure of merit to guide optimization of different components of the treatment planning process. The developed computational framework is employed to optimize X-ray CT pre-treatment imaging. We demonstrate that use of the radiation-therapy-based IQ measures leads to different imaging parameters than those obtained by use of physically-based measures.

  17. Adolescent computer use and alcohol use: what are the role of quantity and content of computer use?

    PubMed

    Epstein, Jennifer A

    2011-05-01

    The purpose of this study was to examine the relationship between computer use and alcohol use among adolescents. In particular, the goal of the research was to determine the role of lifetime drinking and past month drinking on quantity as measured by amount of time on the computer (for school work and excluding school work) and on content as measured by the frequency of a variety of activities on the internet (e.g., e-mail, searching for information, social networking, listen to/download music). Participants (aged 13-17 years and residing in the United States) were recruited via the internet to complete an anonymous survey online using a popular survey tool (N=270). Their average age was 16 and the sample was predominantly female (63% girls). A series of analyses was conducted with the computer use measures as dependent variables (hours on the computer per week for school work and excluding school work; various internet activities including e-mail, searching for information, social networking, listen to/download music) controlling for gender, age, academic performance and age of first computer use. Based on the results, past month drinkers used the computer more hours per week excluding school work than those who did not. As expected, there were no differences in hours based on alcohol use for computer use for school work. Drinking also had relationships with more frequent social networking and listening to/downloading music. These findings suggest that both quantity and content of computer use were related to adolescent drinking. Copyright © 2010 Elsevier Ltd. All rights reserved.

  18. Satisfaction with web-based training in an integrated healthcare delivery network: do age, education, computer skills and attitudes matter?

    PubMed Central

    Atreja, Ashish; Mehta, Neil B; Jain, Anil K; Harris, CM; Ishwaran, Hemant; Avital, Michel; Fishleder, Andrew J

    2008-01-01

    Background Healthcare institutions spend enormous time and effort to train their workforce. Web-based training can potentially streamline this process. However, the deployment of web-based training in a large-scale setting with a diverse healthcare workforce has not been evaluated. The aim of this study was to evaluate the satisfaction of healthcare professionals with web-based training and to determine the predictors of such satisfaction, including age, education status and computer proficiency. Methods Observational, cross-sectional survey of healthcare professionals from six hospital systems in an integrated delivery network. We measured overall satisfaction with web-based training and responses to survey items measuring Website Usability, Course Usefulness, Instructional Design Effectiveness, Computer Proficiency and Self-learning Attitude. Results A total of 17,891 healthcare professionals completed the web-based training on the HIPAA Privacy Rule; of these, 13,537 completed the survey (response rate 75.6%). Overall course satisfaction was good (median, 4; scale, 1 to 5), with more than 75% of the respondents satisfied with the training (rating 4 or 5) and 65% preferring web-based training over traditional instructor-led training (rating 4 or 5). Multivariable ordinal regression revealed 3 key predictors of satisfaction with web-based training: Instructional Design Effectiveness, Website Usability and Course Usefulness. Demographic predictors such as gender, age and education did not have an effect on satisfaction. Conclusion The study shows that web-based training, when tailored to learners' background, is perceived as a satisfactory mode of learning by an interdisciplinary group of healthcare professionals, irrespective of age, education level or prior computer experience. Future studies should aim to measure the long-term outcomes of web-based training. PMID:18922178

  19. High-speed linear optics quantum computing using active feed-forward.

    PubMed

    Prevedel, Robert; Walther, Philip; Tiefenbacher, Felix; Böhi, Pascal; Kaltenbaek, Rainer; Jennewein, Thomas; Zeilinger, Anton

    2007-01-04

    As information carriers in quantum computing, photonic qubits have the advantage of undergoing negligible decoherence. However, the absence of any significant photon-photon interaction is problematic for the realization of non-trivial two-qubit gates. One solution is to introduce an effective nonlinearity by measurements resulting in probabilistic gate operations. In one-way quantum computation, the random quantum measurement error can be overcome by applying a feed-forward technique, such that the future measurement basis depends on earlier measurement results. This technique is crucial for achieving deterministic quantum computation once a cluster state (the highly entangled multiparticle state on which one-way quantum computation is based) is prepared. Here we realize a concatenated scheme of measurement and active feed-forward in a one-way quantum computing experiment. We demonstrate that, for a perfect cluster state and no photon loss, our quantum computation scheme would operate with good fidelity and that our feed-forward components function with very high speed and low error for detected photons. With present technology, the individual computational step (in our case the individual feed-forward cycle) can be operated in less than 150 ns using electro-optical modulators. This is an important result for the future development of one-way quantum computers, whose large-scale implementation will depend on advances in the production and detection of the required highly entangled cluster states.

  20. Improving communication when seeking informed consent: a randomised controlled study of a computer-based method for providing information to prospective clinical trial participants.

    PubMed

    Karunaratne, Asuntha S; Korenman, Stanley G; Thomas, Samantha L; Myles, Paul S; Komesaroff, Paul A

    2010-04-05

    To assess the efficacy, with respect to participant understanding of information, of a computer-based approach to communication about complex, technical issues that commonly arise when seeking informed consent for clinical research trials. An open, randomised controlled study of 60 patients with diabetes mellitus, aged 27-70 years, recruited between August 2006 and October 2007 from the Department of Diabetes and Endocrinology at the Alfred Hospital and Baker IDI Heart and Diabetes Institute, Melbourne. Participants were asked to read information about a mock study via a computer-based presentation (n = 30) or a conventional paper-based information statement (n = 30). The computer-based presentation contained visual aids, including diagrams, video, hyperlinks and quiz pages. Understanding of information as assessed by quantitative and qualitative means. Assessment scores used to measure level of understanding were significantly higher in the group that completed the computer-based task than the group that completed the paper-based task (82% v 73%; P = 0.005). More participants in the group that completed the computer-based task expressed interest in taking part in the mock study (23 v 17 participants; P = 0.01). Most participants from both groups preferred the idea of a computer-based presentation to the paper-based statement (21 in the computer-based task group, 18 in the paper-based task group). A computer-based method of providing information may help overcome existing deficiencies in communication about clinical research, and may reduce costs and improve efficiency in recruiting participants for clinical trials.

  1. Evaluating convex roof entanglement measures.

    PubMed

    Tóth, Géza; Moroder, Tobias; Gühne, Otfried

    2015-04-24

    We show a powerful method to compute entanglement measures based on convex roof constructions. In particular, our method is applicable to measures that, for pure states, can be written as low order polynomials of operator expectation values. We show how to compute the linear entropy of entanglement, the linear entanglement of assistance, and a bound on the dimension of the entanglement for bipartite systems. We discuss how to obtain the convex roof of the three-tangle for three-qubit states. We also show how to calculate the linear entropy of entanglement and the quantum Fisher information based on partial information or device independent information. We demonstrate the usefulness of our method by concrete examples.

  2. Universal quantum computation with temporal-mode bilayer square lattices

    NASA Astrophysics Data System (ADS)

    Alexander, Rafael N.; Yokoyama, Shota; Furusawa, Akira; Menicucci, Nicolas C.

    2018-03-01

    We propose an experimental design for universal continuous-variable quantum computation that incorporates recent innovations in linear-optics-based continuous-variable cluster state generation and cubic-phase gate teleportation. The first ingredient is a protocol for generating the bilayer-square-lattice cluster state (a universal resource state) with temporal modes of light. With this state, measurement-based implementation of Gaussian unitary gates requires only homodyne detection. Second, we describe a measurement device that implements an adaptive cubic-phase gate, up to a random phase-space displacement. It requires a two-step sequence of homodyne measurements and consumes a (non-Gaussian) cubic-phase state.

  3. R2 effect-size measures for mediation analysis

    PubMed Central

    Fairchild, Amanda J.; MacKinnon, David P.; Taborga, Marcia P.; Taylor, Aaron B.

    2010-01-01

    R2 effect-size measures are presented to assess variance accounted for in mediation models. The measures offer a means to evaluate both component paths and the overall mediated effect in mediation models. Statistical simulation results indicate acceptable bias across varying parameter and sample-size combinations. The measures are applied to a real-world example using data from a team-based health promotion program to improve the nutrition and exercise habits of firefighters. SAS and SPSS computer code are also provided for researchers to compute the measures in their own data. PMID:19363189
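
    The record notes that SAS and SPSS code accompanies the paper; a rough Python sketch of one commonly cited R² mediated-effect measure follows. The formula is written from the general mediation literature (not transcribed from the article), so it should be checked against the original:

    ```python
    import numpy as np

    def r2_mediated(x, m, y):
        """One commonly cited R^2 mediated-effect measure:
        r^2(M,Y) - (R^2 of Y on X and M jointly - r^2(X,Y)).
        Written from the general literature, not from the paper itself."""
        r_my = np.corrcoef(m, y)[0, 1]
        r_xy = np.corrcoef(x, y)[0, 1]
        X = np.column_stack([np.ones_like(x), x, m])   # intercept, X, M
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        r2_full = 1 - resid.var() / y.var()            # R^2 of Y on X and M
        return r_my**2 - (r2_full - r_xy**2)

    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    m = 0.5 * x + rng.normal(size=500)   # mediator partially driven by x
    y = 0.4 * m + rng.normal(size=500)   # outcome driven by the mediator
    print(r2_mediated(x, m, y))
    ```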

  4. R2 effect-size measures for mediation analysis.

    PubMed

    Fairchild, Amanda J; Mackinnon, David P; Taborga, Marcia P; Taylor, Aaron B

    2009-05-01

    R(2) effect-size measures are presented to assess variance accounted for in mediation models. The measures offer a means to evaluate both component paths and the overall mediated effect in mediation models. Statistical simulation results indicate acceptable bias across varying parameter and sample-size combinations. The measures are applied to a real-world example using data from a team-based health promotion program to improve the nutrition and exercise habits of firefighters. SAS and SPSS computer code are also provided for researchers to compute the measures in their own data.

  5. Near real-time digital holographic microscope based on GPU parallel computing

    NASA Astrophysics Data System (ADS)

    Zhu, Gang; Zhao, Zhixiong; Wang, Huarui; Yang, Yan

    2018-01-01

    A transmission near real-time digital holographic microscope with in-line and off-axis light paths is presented, in which parallel computing technology based on the compute unified device architecture (CUDA) is combined with digital holographic microscopy. Compared to other holographic microscopes, which have to perform reconstruction in multiple focal planes and are therefore time-consuming, the reconstruction speed of the near real-time digital holographic microscope can be greatly improved with CUDA-based parallel computing, making it especially suitable for measurements of particle fields at the micrometer and nanometer scales. Simulations and experiments show that the proposed transmission digital holographic microscope can accurately measure and display the velocity of a particle field at the micrometer scale, with an average velocity error lower than 10%. With graphics processing units (GPUs), the computing time for 100 reconstruction planes (512×512 grids) is lower than 120 ms, versus 4.9 s with the traditional CPU-based reconstruction method: a 40-fold speedup. In other words, the system can process holograms at 8.3 frames per second, realizing near real-time measurement and display of the particle velocity field. Real-time three-dimensional reconstruction of the particle velocity field is expected to be achieved through further optimization of software and hardware.

  6. Improving the psychometric properties of dot-probe attention measures using response-based computation.

    PubMed

    Evans, Travis C; Britton, Jennifer C

    2018-09-01

    Abnormal threat-related attention in anxiety disorders is most commonly assessed and modified using the dot-probe paradigm; however, poor psychometric properties of reaction-time measures may contribute to inconsistencies across studies. Typically, standard attention measures are derived using average reaction-times obtained in experimentally-defined conditions. However, current approaches based on experimentally-defined conditions are limited. In this study, the psychometric properties of a novel response-based computation approach to analyze dot-probe data are compared to standard measures of attention. 148 adults (19.19 ± 1.42 years, 84 women) completed a standardized dot-probe task including threatening and neutral faces. We generated both standard and response-based measures of attention bias, attentional orientation, and attentional disengagement. We compared overall internal consistency, number of trials necessary to reach internal consistency, test-retest reliability (n = 72), and criterion validity obtained using each approach. Compared to standard attention measures, response-based measures demonstrated uniformly high levels of internal consistency with relatively few trials and varying improvements in test-retest reliability. Additionally, response-based measures demonstrated specific evidence of anxiety-related associations above and beyond both standard attention measures and other confounds. Future studies are necessary to validate this approach in clinical samples. Response-based attention measures demonstrate superior psychometric properties compared to standard attention measures, which may improve the detection of anxiety-related associations and treatment-related changes in clinical samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
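
    For context, the standard experimentally defined bias score that the authors compare against is a simple difference of condition means; their response-based computation is a different, trial-level procedure not reproduced here. A sketch of the standard score on simulated reaction times:

    ```python
    import numpy as np

    # Standard (experimentally defined) attention-bias score: mean RT on
    # trials where the probe replaces the neutral face minus mean RT where
    # it replaces the threatening face. Data below are simulated.
    rng = np.random.default_rng(1)
    rt_incongruent = rng.normal(520, 40, 48)  # probe at neutral location (ms)
    rt_congruent = rng.normal(505, 40, 48)    # probe at threat location (ms)

    bias = rt_incongruent.mean() - rt_congruent.mean()
    print(f"attention bias: {bias:.1f} ms (positive = vigilance to threat)")
    ```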

  7. Differences in Growth on Math Curriculum-Based Measures Using Triannual Benchmarks

    ERIC Educational Resources Information Center

    Keller-Margulis, Milena A.; Mercer, Sterett H.; Shapiro, Edward S.

    2014-01-01

    Recent research on annual growth measured using curriculum-based measurement (CBM) indicates that growth may not be linear across the year and instead varies across semesters. Numerous studies in reading have confirmed this phenomenon with only one study of math computation yielding a similar finding. This study further investigated the presence…

  8. Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomised controlled trial

    PubMed Central

    2007-01-01

    Background At postgraduate level, evidence based medicine (EBM) is currently taught through tutor-based lectures. Computer based sessions fit around doctors' workloads and standardise the quality of educational provision. There have been no randomised controlled trials comparing computer based sessions with traditional lectures at postgraduate level within medicine. Methods This was a randomised controlled trial involving six postgraduate education centres in the West Midlands, U.K. Fifty-five newly qualified foundation year one doctors (U.S. internship equivalent) were randomised to either computer based sessions or an equivalent lecture in EBM and systematic reviews. The change from pre- to post-intervention score was measured using a validated questionnaire assessing knowledge (primary outcome) and attitudes (secondary outcome). Results Both groups were similar at baseline. Participants' improvement in knowledge in the computer based group was equivalent to the lecture based group (gain in score: 2.1 [S.D = 2.0] versus 1.9 [S.D = 2.4]; ANCOVA p = 0.078). Attitudinal gains were similar in both groups. Conclusion On the basis of our findings we feel computer based teaching and learning is as effective as typical lecture based teaching sessions for educating postgraduates in EBM and systematic reviews. PMID:17659076

  9. Low Boom Configuration Analysis with FUN3D Adjoint Simulation Framework

    NASA Technical Reports Server (NTRS)

    Park, Michael A.

    2011-01-01

    Off-body pressure, forces, and moments for the Gulfstream Low Boom Model are computed with a Reynolds Averaged Navier Stokes solver coupled with the Spalart-Allmaras (SA) turbulence model. This is the first application of viscous output-based adaptation to reduce estimated discretization errors in off-body pressure for a wing body configuration. The output adaptation approach is compared to an a priori grid adaptation technique designed to resolve the signature on the centerline by stretching and aligning the grid to the freestream Mach angle. The output-based approach produced good predictions of centerline and off-centerline measurements. Eddy viscosity predicted by the SA turbulence model increased significantly with grid adaptation. Computed lift as a function of drag compares well with wind tunnel measurements for positive lift, but predicted lift, drag, and pitching moment as a function of angle of attack has significant differences from the measured data. The sensitivity of longitudinal forces and moment to grid refinement is much smaller than the differences between the computed and measured data.

  10. Computationally-efficient optical coherence elastography to assess degenerative osteoarthritis based on ultrasound-induced fringe washout (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Tong, Minh Q.; Hasan, M. Monirul; Gregory, Patrick D.; Shah, Jasmine; Park, B. Hyle; Hirota, Koji; Liu, Junze; Choi, Andy; Low, Karen; Nam, Jin

    2017-02-01

    We demonstrate a computationally efficient optical coherence elastography (OCE) method based on fringe washout. By introducing ultrasound in alternating depth profiles, we can obtain information on the mechanical properties of a sample within the acquisition of a single image. This is achieved by simply comparing the intensity in adjacent depth profiles to quantify the degree of fringe washout. Phantom agar samples of various densities were measured and quantified by our OCE technique, and a correlation to Young's modulus measurements by atomic force microscopy (AFM) was observed. Knee cartilage samples from monoiodoacetate-induced arthritis (MIA) rat models were used to replicate cartilage damage, and our proposed OCE technique, along with intensity and birefringence analyses and AFM measurements, was applied. The results indicate that the correlations of our OCE technique with techniques such as polarization-sensitive OCT, AFM Young's modulus measurements, and histology are promising. Our OCE method is applicable to any existing OCT system and is demonstrated to be computationally efficient.

  11. Percolation Centrality: Quantifying Graph-Theoretic Impact of Nodes during Percolation in Networks

    PubMed Central

    Piraveenan, Mahendra; Prokopenko, Mikhail; Hossain, Liaquat

    2013-01-01

    A number of centrality measures are available to determine the relative importance of a node in a complex network, and betweenness is prominent among them. However, the existing centrality measures are not adequate in network percolation scenarios (such as during infection transmission in a social network of individuals, spreading of computer viruses on computer networks, or transmission of disease over a network of towns) because they do not account for the changing percolation states of individual nodes. We propose a new measure, percolation centrality, that quantifies relative impact of nodes based on their topological connectivity, as well as their percolation states. The measure can be extended to include random walk based definitions, and its computational complexity is shown to be of the same order as that of betweenness centrality. We demonstrate the usage of percolation centrality by applying it to a canonical network as well as simulated and real world scale-free and random networks. PMID:23349699
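
    A short demonstration of the proposed measure; this assumes the percolation_centrality implementation that later landed in networkx (which follows this paper), with hypothetical node states:

    ```python
    import networkx as nx

    # Small demonstration on a path graph; node "percolation states" in
    # [0, 1] mimic, e.g., infection levels at the time of measurement.
    G = nx.path_graph(5)
    states = {0: 1.0, 1: 0.8, 2: 0.2, 3: 0.0, 4: 0.0}  # hypothetical states
    nx.set_node_attributes(G, states, "percolation")

    # Assumes networkx's percolation_centrality (based on this paper).
    pc = nx.percolation_centrality(G, attribute="percolation")
    print({n: round(v, 3) for n, v in pc.items()})
    ```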

  12. Assessment of Computer and Information Literacy in ICILS 2013: Do Different Item Types Measure the Same Construct?

    ERIC Educational Resources Information Center

    Ihme, Jan Marten; Senkbeil, Martin; Goldhammer, Frank; Gerick, Julia

    2017-01-01

    The combination of different item formats is found quite often in large scale assessments, and analyses on the dimensionality often indicate multi-dimensionality of tests regarding the task format. In ICILS 2013, three different item types (information-based response tasks, simulation tasks, and authoring tasks) were used to measure computer and…

  13. Validating a Computer-Assisted Language Learning Attitude Instrument Used in Iranian EFL Context: An Evidence-Based Approach

    ERIC Educational Resources Information Center

    Aryadoust, Vahid; Mehran, Parisa; Alizadeh, Mehrasa

    2016-01-01

    A few computer-assisted language learning (CALL) instruments have been developed in Iran to measure EFL (English as a foreign language) learners' attitude toward CALL. However, these instruments have no solid validity argument and accordingly would be unable to provide a reliable measurement of attitude. The present study aimed to develop a CALL…

  14. Scalable Quantum Networks for Distributed Computing and Sensing

    DTIC Science & Technology

    2016-04-01

    [Garbled extraction of a DTIC report record; recoverable details: report AFRL-AFOSR-UK-TR-2016-0007, "Scalable Quantum Networks for Distributed Computing and Sensing", Ian Walmsley, The University of Oxford, Final Report, 04/2016. Surviving abstract fragments: "...probabilistic measurement, so we developed quantum memories and guided-wave implementations of same, demonstrating controlled delay of a heralded single..." and "...fundamental scalability requires a method to synchronize protocols based on quantum measurements, which are inherently probabilistic. To meet..."]

  15. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  16. Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System

    ERIC Educational Resources Information Center

    Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-01-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…

  17. Assessing Text Readability Using Cognitively Based Indices

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Greenfield, Jerry; McNamara, Danielle S.

    2008-01-01

    Many programs designed to compute the readability of texts are narrowly based on surface-level linguistic features and take too little account of the processes which a reader brings to the text. This study is an exploratory examination of the use of Coh-Metrix, a computational tool that measures cohesion and text difficulty at various levels of…

  18. Item Difficulty in the Evaluation of Computer-Based Instruction: An Example from Neuroanatomy

    ERIC Educational Resources Information Center

    Chariker, Julia H.; Naaz, Farah; Pani, John R.

    2012-01-01

    This article reports large item effects in a study of computer-based learning of neuroanatomy. Outcome measures of the efficiency of learning, transfer of learning, and generalization of knowledge diverged by a wide margin across test items, with certain sets of items emerging as particularly difficult to master. In addition, the outcomes of…

  19. Assessing Medical Students' Self-Regulation as Aptitude in Computer-Based Learning

    ERIC Educational Resources Information Center

    Song, Hyuksoon S.; Kalet, Adina L.; Plass, Jan L.

    2011-01-01

    We developed a Self-Regulation Measure for Computer-based learning (SRMC) tailored toward medical students, by modifying Zimmerman's Self-Regulated Learning Interview Schedule (SRLIS) for K-12 learners. The SRMC's reliability and validity were examined in 2 studies. In Study 1, 109 first-year medical students were asked to complete the SRMC.…

  20. Computer-Assisted English Learning System Based on Free Conversation by Topic

    ERIC Educational Resources Information Center

    Choi, Sung-Kwon; Kwon, Oh-Woog; Kim, Young-Kil

    2017-01-01

    This paper aims to describe a computer-assisted English learning system using chatbots and dialogue systems, which allow free conversation outside the topic without limiting the learner's flow of conversation. The evaluation was conducted by 20 experimenters. The performance of the system based on a free conversation by topic was measured by the…

  1. Students' Perceptions of Computer-Based Learning Environments, Their Attitude towards Business Statistics, and Their Academic Achievement: Implications from a UK University

    ERIC Educational Resources Information Center

    Nguyen, ThuyUyen H.; Charity, Ian; Robson, Andrew

    2016-01-01

    This study investigates students' perceptions of computer-based learning environments, their attitude towards business statistics, and their academic achievement in higher education. Guided by learning environments concepts and attitudinal theory, a theoretical model was proposed with two instruments, one for measuring the learning environment and…

  2. Computer-Based Assessment of Collaborative Problem Solving: Exploring the Feasibility of Human-to-Agent Approach

    ERIC Educational Resources Information Center

    Rosen, Yigal

    2015-01-01

    How can activities in which collaborative skills of an individual are measured be standardized? In order to understand how students perform on collaborative problem solving (CPS) computer-based assessment, it is necessary to examine empirically the multi-faceted performance that may be distributed across collaboration methods. The aim of this…

  3. Multivariate optical computing using a digital micromirror device for fluorescence and Raman spectroscopy.

    PubMed

    Smith, Zachary J; Strombom, Sven; Wachsmann-Hogiu, Sebastian

    2011-08-29

    A multivariate optical computer has been constructed consisting of a spectrograph, digital micromirror device, and photomultiplier tube that is capable of determining absolute concentrations of individual components of a multivariate spectral model. We present experimental results on ternary mixtures, showing accurate quantification of chemical concentrations based on integrated intensities of fluorescence and Raman spectra measured with a single point detector. We additionally show in simulation that point measurements based on principal component spectra retain the ability to classify cancerous from noncancerous T cells.
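
    The quantification step, recovering component concentrations from measured intensities through a multivariate spectral model, is ordinary linear least squares. A sketch with synthetic pure-component spectra (all numbers illustrative):

    ```python
    import numpy as np

    # Linear spectral mixing: measured spectrum = S @ c, where S holds one
    # pure-component spectrum per column. Concentrations follow by least
    # squares; all spectra here are synthetic placeholders.
    wavelengths = np.linspace(400, 800, 200)
    s1 = np.exp(-((wavelengths - 520) / 30) ** 2)   # component 1 band
    s2 = np.exp(-((wavelengths - 610) / 25) ** 2)   # component 2 band
    s3 = np.exp(-((wavelengths - 700) / 40) ** 2)   # component 3 band
    S = np.column_stack([s1, s2, s3])

    c_true = np.array([0.5, 0.3, 0.2])
    measured = S @ c_true + np.random.default_rng(2).normal(0, 0.01, 200)

    c_est, *_ = np.linalg.lstsq(S, measured, rcond=None)
    print(c_est)  # close to [0.5, 0.3, 0.2]
    ```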

  4. A Comparison of Computation Span and Reading Span Working Memory Measures' Relations With Problem-Solving Criteria.

    PubMed

    Perlow, Richard; Jattuso, Mia

    2018-06-01

    Researchers have operationalized working memory in different ways and although working memory-performance relationships are well documented, there has been relatively less attention devoted to determining whether seemingly similar measures yield comparable relations with performance outcomes. Our objective is to assess whether two working memory measures deploying the same processes but different item content yield different relations with two problem-solving criteria. Participants completed a computation-based working memory measure and a reading-based measure prior to performing a computerized simulation. Results reveal differential relations with one of the two criteria and support the notion that the two working memory measures tap working memory capacity and other cognitive abilities. One implication for theory development is that researchers should consider incorporating other cognitive abilities in their working memory models and that the selection of those abilities should correspond to the criterion of interest. One practical implication is that researchers and practitioners shouldn't automatically assume that different phonological loop-based working memory scales are interchangeable.

  5. Simulating the influence of scatter and beam hardening in dimensional computed tomography

    NASA Astrophysics Data System (ADS)

    Lifton, J. J.; Carmignato, S.

    2017-10-01

    Cone-beam x-ray computed tomography (XCT) is a radiographic scanning technique that allows the non-destructive dimensional measurement of an object’s internal and external features. XCT measurements are influenced by a number of different factors that are poorly understood. This work investigates how non-linear x-ray attenuation caused by beam hardening and scatter influences XCT-based dimensional measurements through the use of simulated data. For the measurement task considered, both scatter and beam hardening are found to influence dimensional measurements when evaluated using the ISO50 surface determination method. On the other hand, only beam hardening is found to influence dimensional measurements when evaluated using an advanced surface determination method. Based on the results presented, recommendations on the use of beam hardening and scatter correction for dimensional XCT are given.
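
    The ISO50 method referenced here places the surface threshold midway between the background and material peaks of the gray-value histogram. A minimal sketch on simulated voxel data:

    ```python
    import numpy as np

    # ISO50 surface determination: threshold halfway between the background
    # and material histogram peaks. Voxel values are simulated stand-ins
    # for a CT reconstruction.
    rng = np.random.default_rng(3)
    background = rng.normal(1000, 50, 50_000)   # air voxels
    material = rng.normal(3000, 80, 50_000)     # part voxels
    volume = np.concatenate([background, material])

    hist, edges = np.histogram(volume, bins=256)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Crude two-peak estimate: one peak in each half of the gray range.
    split = len(centers) // 2
    peak_bg = centers[:split][np.argmax(hist[:split])]
    peak_mat = centers[split:][np.argmax(hist[split:])]

    iso50 = 0.5 * (peak_bg + peak_mat)
    print(f"ISO50 threshold: {iso50:.0f}")  # ~2000
    ```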

  6. An Analysis of Creative Process Learning in Computer Game Activities through Player Experiences

    ERIC Educational Resources Information Center

    Inchamnan, Wilawan

    2016-01-01

    This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning…

  7. Distributed computing for membrane-based modeling of action potential propagation.

    PubMed

    Porras, D; Rogers, J M; Smith, W M; Pollard, A E

    2000-08-01

    Action potential propagation simulations with physiologic membrane currents and macroscopic tissue dimensions are computationally expensive. We, therefore, analyzed distributed computing schemes to reduce execution time in workstation clusters by parallelizing solutions with message passing. Four schemes were considered in two-dimensional monodomain simulations with the Beeler-Reuter membrane equations. Parallel speedups measured with each scheme were compared to theoretical speedups, recognizing the relationship between speedup and code portions that executed serially. A data decomposition scheme based on total ionic current provided the best performance. Analysis of communication latencies in that scheme led to a load-balancing algorithm in which measured speedups at 89 +/- 2% and 75 +/- 8% of theoretical speedups were achieved in homogeneous and heterogeneous clusters of workstations. Speedups in this scheme with the Luo-Rudy dynamic membrane equations exceeded 3.0 with eight distributed workstations. Cluster speedups were comparable to those measured during parallel execution on a shared memory machine.

  8. Comparison of SOM point densities based on different criteria.

    PubMed

    Kohonen, T

    1999-11-15

    Point densities of model (codebook) vectors in self-organizing maps (SOMs) are evaluated in this article. For a few one-dimensional SOMs with finite grid lengths and a given probability density function of the input, the numerically exact point densities have been computed. The point density derived from the SOM algorithm turned out to be different from that minimizing the SOM distortion measure, showing that the model vectors produced by the basic SOM algorithm in general do not exactly coincide with the optimum of the distortion measure. A new computing technique based on the calculus of variations has been introduced. It was applied to the computation of point densities derived from the distortion measure for both the classical vector quantization and the SOM with general but equal dimensionality of the input vectors and the grid, respectively. The power laws in the continuum limit obtained in these cases were found to be identical.
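
    For readers unfamiliar with the algorithm whose point densities are analyzed here, a minimal one-dimensional SOM update loop (parameters are illustrative, not those of the paper's analysis):

    ```python
    import numpy as np

    # Minimal 1-D SOM on scalar inputs: find the best-matching unit, then
    # pull it and its grid neighbours toward the sample.
    rng = np.random.default_rng(4)
    weights = np.sort(rng.uniform(0, 1, 20))  # 20-unit linear grid

    for t in range(5000):
        x = rng.uniform(0, 1)                      # input sample
        bmu = np.argmin(np.abs(weights - x))       # best-matching unit
        for j in range(len(weights)):
            h = np.exp(-((j - bmu) ** 2) / (2 * 1.5 ** 2))  # neighbourhood
            weights[j] += 0.05 * h * (x - weights[j])

    print(weights)  # point density of model vectors over [0, 1]
    ```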

  9. Chromatographic and computational assessment of lipophilicity using sum of ranking differences and generalized pair-correlation.

    PubMed

    Andrić, Filip; Héberger, Károly

    2015-02-06

    Lipophilicity (logP) represents one of the most studied and most frequently used fundamental physicochemical properties. At present there are several possibilities for its quantitative expression, and many of them stem from chromatographic experiments. Numerous attempts have been made to compare different computational methods, chromatographic methods versus computational approaches, as well as chromatographic methods against the direct shake-flask procedure, either without definitive results or with findings that are not generally accepted. In the present work, numerous chromatographically derived lipophilicity measures in combination with diverse computational methods were ranked and clustered using novel variable discrimination and ranking approaches based on the sum of ranking differences and the generalized pair-correlation method. Available literature logP data measured on HILIC and classical reversed-phase systems, combining different classes of compounds, were compared using the most frequently applied multivariate data analysis techniques (principal component and hierarchical cluster analysis) as well as with the conclusions in the original sources. Chromatographic lipophilicity measures obtained under typical reversed-phase conditions outperform the majority of computationally estimated logPs. Conversely, in the case of HILIC, none of the many proposed chromatographic indices outperforms any of the computationally assessed logPs; only two of them (logkmin and kmin) may be recommended as chromatographic lipophilicity measures. Both ranking approaches, the sum of ranking differences and the generalized pair-correlation method, although based on different backgrounds, provide highly similar variable ordering and grouping, leading to the same conclusions. Copyright © 2015. Published by Elsevier B.V.
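
    The sum-of-ranking-differences (SRD) idea is compact enough to sketch: each variable's ranking of the compounds is compared with a reference ranking (often the row-wise average), and the absolute rank differences are summed; lower SRD means closer to the consensus. A sketch with invented logP values:

    ```python
    import numpy as np
    from scipy.stats import rankdata

    def srd(values, reference):
        """Sum of ranking differences of one variable against a reference
        ranking, as in Héberger's SRD method (values here are invented)."""
        return np.abs(rankdata(values) - rankdata(reference)).sum()

    # Hypothetical logP estimates for 6 compounds by 3 methods:
    methods = {
        "chrom_logk": np.array([1.2, 0.8, 2.5, 3.1, 0.3, 1.9]),
        "calc_A": np.array([1.0, 1.1, 2.2, 3.4, 0.2, 2.0]),
        "calc_B": np.array([2.1, 0.2, 2.8, 2.6, 0.9, 1.4]),
    }
    reference = np.mean(np.column_stack(list(methods.values())), axis=1)
    for name, vals in methods.items():
        print(name, srd(vals, reference))  # lower SRD = closer to consensus
    ```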

  10. Automatic computation of 2D cardiac measurements from B-mode echocardiography

    NASA Astrophysics Data System (ADS)

    Park, JinHyeong; Feng, Shaolei; Zhou, S. Kevin

    2012-03-01

    We propose a robust and fully automatic algorithm that computes the 2D echocardiography measurements recommended by the American Society of Echocardiography. The algorithm employs knowledge-based imaging technologies that learn expert knowledge from training images and expert annotations. Based on the models constructed in the learning stage, the algorithm searches for initial locations of the measurement landmark points by utilizing the structure of the left ventricle, including the mitral valve and aortic valve. It employs a pseudo anatomic M-mode image, generated by accumulating line images from the 2D parasternal long-axis view over time, to refine the measurement landmark points. Experimental results on a large volume of data show that the algorithm runs fast and is robust, with performance comparable to experts.
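
    The pseudo anatomic M-mode image described here is built by sampling the same scanline from every frame of a 2D cine loop and stacking the samples over time. A sketch on placeholder cine data (scanline endpoints hypothetical):

    ```python
    import numpy as np

    # Placeholder cine loop with shape (frames, rows, cols).
    frames = np.random.default_rng(5).integers(0, 255, (120, 256, 256))

    row0, col0, row1, col1 = 40, 128, 220, 128  # hypothetical scanline
    n = 200
    rows = np.linspace(row0, row1, n).astype(int)
    cols = np.linspace(col0, col1, n).astype(int)

    # Sample the scanline in every frame and stack the lines over time.
    m_mode = frames[:, rows, cols].T  # (points along line) x (time)
    print(m_mode.shape)               # e.g. (200, 120)
    ```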

  11. Reliability model derivation of a fault-tolerant, dual, spare-switching, digital computer system

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A computer based reliability projection aid, tailored specifically for application in the design of fault-tolerant computer systems, is described. Its more pronounced characteristics include the facility for modeling systems with two distinct operational modes, measuring the effect of both permanent and transient faults, and calculating conditional system coverage factors. The underlying conceptual principles, mathematical models, and computer program implementation are presented.

  12. FPGA-based real-time phase measuring profilometry algorithm design and implementation

    NASA Astrophysics Data System (ADS)

    Zhan, Guomin; Tang, Hongwei; Zhong, Kai; Li, Zhongwei; Shi, Yusheng

    2016-11-01

    Phase measuring profilometry (PMP) has been widely used in many fields, such as computer-aided verification (CAV) and flexible manufacturing systems (FMS). High frame-rate (HFR) real-time vision-based feedback control will be a common demand in the near future. However, the instruction time delay in a computer caused by numerous repetitive operations greatly limits the efficiency of data processing. An FPGA has the advantages of a pipeline architecture and parallel execution, which fit the PMP algorithm well. In this paper, we design a fully pipelined hardware architecture for PMP. The functions of the hardware architecture include rectification, phase calculation, phase shifting, and stereo matching. Experiments verified the performance of this method, and the factors that may influence the computation accuracy were analyzed.
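
    A common choice for the phase-calculation stage in PMP is the four-step phase-shifting formula, phi = atan2(I4 − I2, I1 − I3) for fringes shifted by pi/2; the paper does not state which N-step variant its pipeline implements, so this is illustrative only. A NumPy sketch on synthetic fringes:

    ```python
    import numpy as np

    # Four-step phase shifting: I_k = cos(phi + k*pi/2), k = 0..3, so
    # atan2(I4 - I2, I1 - I3) recovers the wrapped phase.
    x = np.linspace(0, 4 * np.pi, 512)
    true_phase = np.tile(x, (512, 1))

    I = [np.cos(true_phase + k * np.pi / 2) for k in range(4)]  # I1..I4
    wrapped = np.arctan2(I[3] - I[1], I[0] - I[2])

    print(wrapped.min(), wrapped.max())  # wrapped phase in (-pi, pi]
    ```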

  13. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    PubMed

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994, P < 0.001), and no systematic bias was found in Bland-Altman analysis: mean difference was -0.00081 ± 0.0039. Invasive FFR ≤ 0.80 was found in 38 lesions out of 125 and was predicted by the machine-learning algorithm with a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%. The correlation was 0.729 (P < 0.001). Compared with the physics-based computation, average execution time was reduced by more than 80 times, leading to near real-time assessment of FFR. Average execution time went down from 196.3 ± 78.5 s for the CFD model to ∼2.4 ± 0.44 s for the machine-learning model on a workstation with 3.4-GHz Intel i7 8-core processor. Copyright © 2016 the American Physiological Society.
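
    The core idea, training a cheap regressor on anatomies labeled by the expensive physics model, can be sketched with toy features and a toy "physics" function; neither resembles the paper's actual anatomical descriptors or CFD model:

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    # Surrogate-model sketch: learn a fast regressor on synthetic anatomies
    # labeled by a stand-in physics model.
    rng = np.random.default_rng(6)
    X = rng.uniform(size=(5000, 4))  # e.g. stenosis degree, length, radius...
    ffr_physics = 1.0 - 0.6 * X[:, 0] ** 2 - 0.1 * X[:, 1] * X[:, 2]

    model = GradientBoostingRegressor().fit(X[:4000], ffr_physics[:4000])
    pred = model.predict(X[4000:])
    print(np.corrcoef(pred, ffr_physics[4000:])[0, 1])  # high correlation
    ```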

  14. Ag2S atomic switch-based `tug of war' for decision making

    NASA Astrophysics Data System (ADS)

    Lutz, C.; Hasegawa, T.; Chikyow, T.

    2016-07-01

    For a computing process such as making a decision, a software controlled chip of several transistors is necessary. Inspired by how a single cell amoeba decides its movements, the theoretical `tug of war' computing model was proposed but not yet implemented in an analogue device suitable for integrated circuits. Based on this model, we now developed a new electronic element for decision making processes, which will have no need for prior programming. The devices are based on the growth and shrinkage of Ag filaments in α-Ag2+δS gap-type atomic switches. Here we present the adapted device design and the new materials. We demonstrate the basic `tug of war' operation by IV-measurements and Scanning Electron Microscopy (SEM) observation. These devices could be the base for a CMOS-free new computer architecture.

  15. A Comparison of Web-Based and Face-to-Face Functional Measurement Experiments

    ERIC Educational Resources Information Center

    Van Acker, Frederik; Theuns, Peter

    2010-01-01

    Information Integration Theory (IIT) is concerned with how people combine information into an overall judgment. A method is hereby presented to perform Functional Measurement (FM) experiments, the methodological counterpart of IIT, on the Web. In a comparison of Web-based FM experiments, face-to-face experiments, and computer-based experiments in…

  16. Computer Vision-Based Structural Displacement Measurement Robust to Light-Induced Image Degradation for In-Service Bridges

    PubMed Central

    Lee, Junhwa; Lee, Kyoung-Chan; Cho, Soojin

    2017-01-01

    The displacement responses of a civil engineering structure can provide important information regarding structural behaviors that help in assessing safety and serviceability. A displacement measurement using conventional devices, such as the linear variable differential transformer (LVDT), is challenging owing to issues related to inconvenient sensor installation that often requires additional temporary structures. A promising alternative is offered by computer vision, which typically provides a low-cost and non-contact displacement measurement that converts the movement of an object, mostly an attached marker, in the captured images into structural displacement. However, there is limited research on addressing light-induced measurement error caused by the inevitable sunlight in field-testing conditions. This study presents a computer vision-based displacement measurement approach tailored to a field-testing environment with enhanced robustness to strong sunlight. An image-processing algorithm with an adaptive region-of-interest (ROI) is proposed to reliably determine a marker’s location even when the marker is indistinct due to unfavorable light. The performance of the proposed system is experimentally validated in both laboratory-scale and field experiments. PMID:29019950

  17. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    DOE PAGES

    Higdon, Dave; McDonnell, Jordan D.; Schunck, Nicolas; ...

    2015-02-05

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model $\eta(\theta)$, where θ denotes the uncertain, best input setting. Hence the statistical model is of the form $y = \eta(\theta) + \epsilon$, where $\epsilon$ accounts for measurement, and possibly other, error sources. When nonlinearity is present in $\eta(\cdot)$, the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model $\eta(\cdot)$. This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. Lastly, we also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory.
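
    The computational pattern described, Metropolis MCMC driven by an emulator instead of the expensive model, can be sketched end to end with toy stand-ins for both the emulator and the data:

    ```python
    import numpy as np

    # Metropolis sampling of theta in y = eta(theta) + eps, with the
    # expensive model eta replaced by a cheap emulator; both are toy
    # functions here so the sketch runs end to end.
    def emulator(theta):           # stand-in for a trained response surface
        return theta ** 2 + 0.5 * theta

    y_obs, sigma = 1.2, 0.1        # hypothetical measurement and noise s.d.

    def log_post(theta):           # flat prior on [-5, 5]
        if abs(theta) > 5:
            return -np.inf
        return -0.5 * ((y_obs - emulator(theta)) / sigma) ** 2

    rng = np.random.default_rng(7)
    theta, samples = 0.0, []
    for _ in range(20_000):
        prop = theta + rng.normal(0, 0.3)   # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop
        samples.append(theta)
    print(np.mean(samples[5000:]), np.std(samples[5000:]))  # post burn-in
    ```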

  18. Full 3-D OCT-based pseudophakic custom computer eye model

    PubMed Central

    Sun, M.; Pérez-Merino, P.; Martinez-Enriquez, E.; Velasco-Ocana, M.; Marcos, S.

    2016-01-01

    We compared measured wave aberrations in pseudophakic eyes implanted with aspheric intraocular lenses (IOLs) with simulated aberrations from numerical ray tracing on customized computer eye models, built using quantitative 3-D OCT-based patient-specific ocular geometry. Experimental and simulated aberrations show high correlation (R = 0.93; p<0.0001) and similarity (RMS discrepancy for high-order aberrations within 23.58%). This study shows that full OCT-based pseudophakic custom computer eye models make it possible to understand the relative contributions of optical, geometrical, and surgically related factors to image quality, and are an excellent tool for characterizing and improving cataract surgery. PMID:27231608

  19. CAD system for footwear design based on whole real 3D data of last surface

    NASA Astrophysics Data System (ADS)

    Song, Wanzhong; Su, Xianyu

    2000-10-01

    Two major applications of CAD in footwear design are studied: the development (flattening) of the last surface and the computer-aided design of the planar shoe-template. A new quasi-experiential development algorithm for the last surface, based on triangulation approximation, is presented. Compared with other development algorithms for last surfaces, this algorithm consumes less time and needs no interactive operation to achieve a precise development. Based on this algorithm, a software package, SHOEMAKER™, has been developed that provides computer-aided automatic measurement, automatic development of the last surface, and computer-aided design of the shoe-template.

  20. AnnotCompute: annotation-based exploration and meta-analysis of genomics experiments

    PubMed Central

    Zheng, Jie; Stoyanovich, Julia; Manduchi, Elisabetta; Liu, Junmin; Stoeckert, Christian J.

    2011-01-01

    The ever-increasing scale of biological data sets, particularly those arising in the context of high-throughput technologies, requires the development of rich data exploration tools. In this article, we present AnnotCompute, an information discovery platform for repositories of functional genomics experiments such as ArrayExpress. Our system leverages semantic annotations of functional genomics experiments with controlled vocabulary and ontology terms, such as those from the MGED Ontology, to compute conceptual dissimilarities between pairs of experiments. These dissimilarities are then used to support two types of exploratory analysis—clustering and query-by-example. We show that our proposed dissimilarity measures correspond to a user's intuition about conceptual dissimilarity, and can be used to support effective query-by-example. We also evaluate the quality of clustering based on these measures. While AnnotCompute can support a richer data exploration experience, its effectiveness is limited in some cases, due to the quality of available annotations. Nonetheless, tools such as AnnotCompute may provide an incentive for richer annotations of experiments. Code is available for download at http://www.cbil.upenn.edu/downloads/AnnotCompute. Database URL: http://www.cbil.upenn.edu/annotCompute/ PMID:22190598

  1. One-way quantum computing in superconducting circuits

    NASA Astrophysics Data System (ADS)

    Albarrán-Arriagada, F.; Alvarado Barrios, G.; Sanz, M.; Romero, G.; Lamata, L.; Retamal, J. C.; Solano, E.

    2018-03-01

    We propose a method for the implementation of one-way quantum computing in superconducting circuits. Measurement-based quantum computing is a universal quantum computation paradigm in which an initial cluster state provides the quantum resource, while the iteration of sequential measurements and local rotations encodes the quantum algorithm. Up to now, technical constraints have limited a scalable approach to this quantum computing alternative. The initial cluster state can be generated with available controlled-phase gates, while the quantum algorithm makes use of high-fidelity readout and coherent feedforward. With current technology, we estimate that quantum algorithms with above 20 qubits may be implemented in the path toward quantum supremacy. Moreover, we propose an alternative initial state with properties of maximal persistence and maximal connectedness, reducing the required resources of one-way quantum computing protocols.
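
    The elementary step behind such one-way protocols can be checked numerically. The sketch below is a generic illustration (not the authors' superconducting-circuit scheme): a controlled-phase gate entangles the input qubit with an ancilla in |+⟩, the first qubit is measured in a rotated basis, and the second qubit is verified to carry X^m H Rz(θ)|ψ⟩, where the outcome-dependent X^m is the by-product operator that feedforward must correct.

```python
import numpy as np

rng = np.random.default_rng(1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
CZ = np.diag([1, 1, 1, -1]).astype(complex)

def rz(theta):
    return np.diag([1.0, np.exp(-1j * theta)])

theta = 0.7
psi = np.array([0.6, 0.8j])                 # arbitrary input state on qubit 1
plus = np.array([1, 1]) / np.sqrt(2)        # ancilla qubit 2 in |+>

state = CZ @ np.kron(psi, plus)             # two-qubit cluster-type state

# Measure qubit 1 in the basis (|0> +/- e^{i*theta}|1>)/sqrt(2).
b0 = np.array([1, np.exp(1j * theta)]) / np.sqrt(2)
basis = [b0, np.array([1, -np.exp(1j * theta)]) / np.sqrt(2)]
probs = [np.linalg.norm(np.kron(b.conj(), np.eye(2)) @ state) ** 2 for b in basis]
m = rng.choice([0, 1], p=probs)             # random measurement outcome

out = np.kron(basis[m].conj(), np.eye(2)) @ state   # post-measurement qubit 2
out /= np.linalg.norm(out)

expected = np.linalg.matrix_power(X, m) @ H @ rz(theta) @ psi
expected /= np.linalg.norm(expected)

# Equal up to a global phase.
print("overlap with X^m H Rz(theta)|psi> =", round(abs(np.vdot(expected, out)), 10))
```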

  2. Objective measures of prospective memory do not correlate with subjective complaints in schizophrenia.

    PubMed

    Chan, Raymond C K; Wang, Ya; Ma, Zheng; Hong, Xiao-hong; Yuan, Yanbo; Yu, Xin; Li, Zhanjiang; Shum, David; Gong, Qi-yong

    2008-08-01

    While a number of studies have shown that individuals with schizophrenia are impaired on various types of prospective memory, few studies have examined the relationship between subjective and objective measures of this construct in this clinical group. The purpose of the current study was to explore the relationship between computer-based prospective memory tasks and the corresponding subjective complaints in patients with schizophrenia, individuals with schizotypal personality features, and healthy volunteers. The findings showed that patients with schizophrenia demonstrated significantly poorer performance than individuals with schizotypal personality disorder and healthy controls in all domains of memory function except visual memory. More importantly, there was a significant interaction effect of prospective memory type and group. Although patients with schizophrenia were found to show significantly poorer performance on computer-based measures of prospective memory than controls, their level of subjective complaint was not found to be significantly higher. While subjective complaints of prospective memory were found to be significantly associated with self-reported executive dysfunction, significant relationships were not found between these complaints and performance on a computer-based task of prospective memory or other objective measures of memory. Taken together, these findings suggest that subjective and objective measures of prospective memory are two distinct domains that might need to be assessed and addressed separately.

  3. The impact of computer self-efficacy, computer anxiety, and perceived usability and acceptability on the efficacy of a decision support tool for colorectal cancer screening

    PubMed Central

    Lindblom, Katrina; Gregory, Tess; Flight, Ingrid H K; Zajac, Ian

    2011-01-01

    Objective This study investigated the efficacy of an internet-based personalized decision support (PDS) tool designed to aid in the decision to screen for colorectal cancer (CRC) using a fecal occult blood test. We tested whether the efficacy of the tool in influencing attitudes to screening was mediated by perceived usability and acceptability, and considered the role of computer self-efficacy and computer anxiety in these relationships. Methods Eighty-one participants aged 50–76 years worked through the on-line PDS tool and completed questionnaires on computer self-efficacy, computer anxiety, attitudes to and beliefs about CRC screening before and after exposure to the PDS, and perceived usability and acceptability of the tool. Results Repeated measures ANOVA found that PDS exposure led to a significant increase in knowledge about CRC and screening, and more positive attitudes to CRC screening as measured by factors from the Preventive Health Model. Perceived usability and acceptability of the PDS mediated changes in attitudes toward CRC screening (but not CRC knowledge), and computer self-efficacy and computer anxiety were significant predictors of individuals' perceptions of the tool. Conclusion Interventions designed to decrease computer anxiety, such as computer courses and internet training, may improve the acceptability of new health information technologies including internet-based decision support tools, increasing their impact on behavior change. PMID:21857024

  4. Validation of learning style measures: implications for medical education practice.

    PubMed

    Chapman, Dane M; Calhoun, Judith G

    2006-06-01

    It is unclear which learners would most benefit from the more individualised, student-structured, interactive approaches characteristic of problem-based and computer-assisted learning. The validity of learning style measures is uncertain, and there is no unifying learning style construct identified to predict such learners. This study was conducted to validate learning style constructs and to identify the learners most likely to benefit from problem-based and computer-assisted curricula. Using a cross-sectional design, 3 established learning style inventories were administered to 97 post-Year 2 medical students. Cognitive personality was measured by the Group Embedded Figures Test, information processing by the Learning Styles Inventory, and instructional preference by the Learning Preference Inventory. The 11 subscales from the 3 inventories were factor-analysed to identify common learning constructs and to verify construct validity. Concurrent validity was determined by intercorrelations of the 11 subscales. A total of 94 pre-clinical medical students completed all 3 inventories. Five meaningful learning style constructs were derived from the 11 subscales: student- versus teacher-structured learning; concrete versus abstract learning; passive versus active learning; individual versus group learning, and field-dependence versus field-independence. The concurrent validity of 10 of 11 subscales was supported by correlation analysis. Medical students most likely to thrive in a problem-based or computer-assisted learning environment would be expected to score highly on abstract, active and individual learning constructs and would be more field-independent. Learning style measures were validated in a medical student population and learning constructs were established for identifying learners who would most likely benefit from a problem-based or computer-assisted curriculum.

  5. Computational complexity of the landscape II-Cosmological considerations

    NASA Astrophysics Data System (ADS)

    Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire

    2018-05-01

    We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.

  6. Learning to Rank the Severity of Unrepaired Cleft Lip Nasal Deformity on 3D Mesh Data.

    PubMed

    Wu, Jia; Tse, Raymond; Shapiro, Linda G

    2014-08-01

    Cleft lip is a birth defect that results in deformity of the upper lip and nose. Its severity is widely variable and the results of treatment are influenced by the initial deformity. Objective assessment of severity would help to guide prognosis and treatment. However, most assessments are subjective. The purpose of this study is to develop and test quantitative computer-based methods of measuring cleft lip severity. In this paper, a grid-patch based measurement of symmetry is introduced, with which a computer program learns to rank the severity of cleft lip on 3D meshes of human infant faces. Three computer-based methods to define the midfacial reference plane were compared to two manual methods. Four different symmetry features were calculated based upon these reference planes, and evaluated. The result shows that the rankings predicted by the proposed features were highly correlated with the ranking orders provided by experts that were used as the ground truth.

  7. Efficient, graph-based white matter connectivity from orientation distribution functions via multi-directional graph propagation

    NASA Astrophysics Data System (ADS)

    Boucharin, Alexis; Oguz, Ipek; Vachet, Clement; Shi, Yundi; Sanchez, Mar; Styner, Martin

    2011-03-01

    The use of regional connectivity measurements derived from diffusion imaging datasets has become of considerable interest in the neuroimaging community in order to better understand cortical and subcortical white matter connectivity. Current connectivity assessment methods are based on streamline fiber tractography, usually applied in a Monte-Carlo fashion. In this work we present a novel, graph-based method that performs a fully deterministic, efficient and stable connectivity computation. The method handles crossing fibers and deals well with multiple seed regions. The computation is based on a multi-directional graph propagation method applied to sampled orientation distribution functions (ODFs), which can be computed directly from the original diffusion imaging data. We show early results of our method on synthetic and real datasets. The results illustrate the potential of our method towards subject-specific connectivity measurements that are performed in an efficient, stable and reproducible manner. Such individual connectivity measurements would be well suited for application in population studies of neuropathology, such as Autism, Huntington's Disease, Multiple Sclerosis or leukodystrophies. The proposed method is generic and could easily be applied to non-diffusion data as long as local directional data can be derived.
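
    As a generic illustration of deterministic graph-based connectivity (a sketch, not the authors' multi-directional ODF propagation), a priority-queue propagation from a seed over a weighted graph looks as follows; the edge costs here are placeholders where, in the diffusion setting, ODF-derived directional costs would enter:

```python
import heapq

def propagate(graph, seed):
    """Deterministic connectivity from `seed` over a weighted graph.

    graph: dict mapping node -> list of (neighbor, cost) pairs, where cost
           would be derived from the local ODF (small cost along likely
           fiber directions) in the diffusion setting.
    Returns the minimal accumulated cost to every reachable node.
    """
    dist = {seed: 0.0}
    heap = [(0.0, seed)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        for nbr, cost in graph[node]:
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# Toy 4-node example; mapping cost to connectivity, e.g. exp(-dist), is one option.
g = {"A": [("B", 1.0), ("C", 4.0)], "B": [("C", 1.5)], "C": [("D", 1.0)], "D": []}
print(propagate(g, "A"))   # {'A': 0.0, 'B': 1.0, 'C': 2.5, 'D': 3.5}
```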

  8. Computer-implemented remote sensing techniques for measuring coastal productivity and nutrient transport systems

    NASA Technical Reports Server (NTRS)

    Butera, M. K.

    1981-01-01

    An automatic technique has been developed to measure marsh plant production by inference from a species classification derived from Landsat MSS data. A separate computer technique has been developed to calculate the transport path length of detritus and nutrients from their point of origin in the marsh to the shoreline from Landsat data. A nutrient availability indicator, the ratio of production to transport path length, was derived for each marsh-identified Landsat cell. The use of a data base compatible with the Landsat format facilitated data handling and computations.

  9. Observed differences in upper extremity forces, muscle efforts, postures, velocities and accelerations across computer activities in a field study of office workers.

    PubMed

    Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T

    2012-01-01

    This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.

  10. Using the Clinical Interview and Curriculum Based Measurement to Examine Risk Levels

    ERIC Educational Resources Information Center

    Ginsburg, Herbert P.; Lee, Young-Sun; Pappas, Sandra

    2016-01-01

    This paper investigates the power of the computer guided clinical interview (CI) and new curriculum based measurement (CBM) measures to identify and help children at risk of low mathematics achievement. We use data from large numbers of children in Kindergarten through Grade 3 to investigate the construct validity of CBM risk categories. The basic…

  11. The Paradox of Abstraction: Precision Versus Concreteness

    ERIC Educational Resources Information Center

    Iliev, Rumen; Axelrod, Robert

    2017-01-01

    We introduce a novel measure of abstractness based on the amount of information of a concept computed from its position in a semantic taxonomy. We refer to this measure as "precision". We propose two alternative ways to measure precision, one based on the path length from a concept to the root of the taxonomic tree, and another one based…
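
    A toy version of the path-length variant (the miniature taxonomy and function name below are invented for illustration): with the taxonomy as a child-to-parent map, a concept's precision is simply its depth, i.e. the path length from the concept to the root.

```python
# Child -> parent links of a miniature taxonomy (illustrative only).
parent = {
    "entity": None,
    "object": "entity",
    "artifact": "object",
    "vehicle": "artifact",
    "car": "vehicle",
}

def precision(concept):
    """Path length from a concept to the root of the taxonomic tree."""
    depth = 0
    while parent[concept] is not None:
        concept = parent[concept]
        depth += 1
    return depth

print(precision("car"), precision("object"))   # 4 1 -> "car" is more precise
```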

  12. Quantum market games: implementing tactics via measurements

    NASA Astrophysics Data System (ADS)

    Pakula, I.; Piotrowski, E. W.; Sladkowski, J.

    2006-02-01

    A major development in applying the quantum mechanical formalism to various fields has taken place during the last few years. Quantum counterparts of Game Theory and Economics, as well as diverse approaches to Quantum Information Theory, have been found and are currently being explored. Using connections between Quantum Game Theory and Quantum Computation, an application of the universality of measurement-based computation to Quantum Market Theory is presented.

  13. ECHO: A Computer Based Test for the Measurement of Individualistic, Cooperative, Defensive, and Aggressive Models of Behavior. Occasional Paper No. 30.

    ERIC Educational Resources Information Center

    Krus, David J.; And Others

    This paper describes a test which attempts to measure a group of personality traits by analyzing the actual behavior of the participant in a computer-simulated game. ECHO evolved from an extension and computerization of Horstein and Deutsch's allocation game. The computerized version of ECHO requires subjects to make decisions about the allocation…

  14. Culvert analysis program for indirect measurement of discharge

    USGS Publications Warehouse

    Fulford, Janice M.; ,

    1993-01-01

    A program based on the U.S. Geological Survey (USGS) methods for indirectly computing peak discharges through culverts allows users to employ the input data formats used by the water surface profile program (WSPRO). The program can be used to compute discharge rating surfaces or curves that describe the behavior of flow through a particular culvert, or to compute discharges from measurements of upstream water-surface elevation. The computation is based on the gradually varied flow equations and has been adapted slightly to provide solutions that minimize the need for the user to distinguish between different flow regimes. The program source is written in Fortran 77 and has been run on mini-computers and personal computers. The program does not use or require graphics capability, a color monitor, or a mouse.

  15. Computation of transonic flow past projectiles at angle of attack

    NASA Technical Reports Server (NTRS)

    Reklis, R. P.; Sturek, W. B.; Bailey, F. R.

    1978-01-01

    Aerodynamic properties of artillery shells, such as normal force and pitching moment, reach peak values in a narrow transonic Mach number range. In order to compute these quantities, numerical techniques have been developed to obtain solutions to the three-dimensional transonic small disturbance equation about slender bodies at angle of attack. The computation is based on a plane relaxation technique involving Fourier transforms to partially decouple the three-dimensional difference equations. Particular care is taken to ensure accurate solutions near corners found in shell designs. Computed surface pressures are compared to experimental measurements for circular arc and cone-cylinder bodies, which have been selected as test cases. Computed pitching moments are compared to range measurements for a typical projectile shape.

  16. The Effects of CBI Lesson Sequence Type and Field Dependence on Learning from Computer-Based Cooperative Instruction in Web

    ERIC Educational Resources Information Center

    Ipek, Ismail

    2010-01-01

    The purpose of this study was to investigate the effects of CBI lesson sequence type and cognitive style of field dependence on learning from Computer-Based Cooperative Instruction (CBCI) in WEB on the dependent measures, achievement, reading comprehension and reading rate. Eighty-seven college undergraduate students were randomly assigned to…

  17. A singlechip-computer-controlled conductivity meter based on conductance-frequency transformation

    NASA Astrophysics Data System (ADS)

    Chen, Wenxiang; Hong, Baocai

    2005-02-01

    A portable conductivity meter controlled by a single-chip computer was designed. The instrument uses the conductance-frequency transformation method to measure the conductivity of a solution. The circuitry is simple and reliable. Another feature of the instrument is that temperature compensation is realised by changing the counting time of the timing counter. The theoretical basis and usage of the temperature compensation are described.
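
    One way to picture the two ideas in this abstract, under assumed component values rather than the instrument's actual circuit: in a relaxation-oscillator front end the output frequency is roughly proportional to the cell conductance, and referring the reading to 25 °C can be done by scaling the counter's gate (counting) time instead of correcting the count afterwards.

```python
ALPHA = 0.02          # assumed temperature coefficient of the solution, per deg C
K_CELL = 1.0e4        # assumed cell/oscillator constant: f (Hz) = K_CELL * G (S)

def frequency(conductance_S):
    """Relaxation-oscillator model: frequency proportional to conductance."""
    return K_CELL * conductance_S

def compensated_count(conductance_S, temp_C, base_gate_s=1.0):
    # Shrink or stretch the counting (gate) time so the count itself
    # reads the conductivity referred to 25 deg C.
    gate = base_gate_s / (1.0 + ALPHA * (temp_C - 25.0))
    return frequency(conductance_S) * gate

# A solution whose conductance rises ~2%/deg C reads the same at 25 and 35 deg C:
g25 = 1.0e-3
g35 = g25 * (1.0 + ALPHA * 10.0)
print(compensated_count(g25, 25.0), compensated_count(g35, 35.0))  # both 10.0
```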

  18. Silicon synaptic transistor for hardware-based spiking neural network and neuromorphic system

    NASA Astrophysics Data System (ADS)

    Kim, Hyungjin; Hwang, Sungmin; Park, Jungjin; Park, Byung-Gook

    2017-10-01

    Brain-inspired neuromorphic systems have attracted much attention as new computing paradigms for power-efficient computation. Here, we report a silicon synaptic transistor with two electrically independent gates to realize a hardware-based neural network system without any switching components. The spike-timing dependent plasticity characteristics of the synaptic devices are measured and analyzed. With the help of the device model based on the measured data, the pattern recognition capability of the hardware-based spiking neural network systems is demonstrated using the modified national institute of standards and technology handwritten dataset. By comparing systems with and without inhibitory synapse part, it is confirmed that the inhibitory synapse part is an essential element in obtaining effective and high pattern classification capability.

  19. Silicon synaptic transistor for hardware-based spiking neural network and neuromorphic system.

    PubMed

    Kim, Hyungjin; Hwang, Sungmin; Park, Jungjin; Park, Byung-Gook

    2017-10-06

    Brain-inspired neuromorphic systems have attracted much attention as new computing paradigms for power-efficient computation. Here, we report a silicon synaptic transistor with two electrically independent gates to realize a hardware-based neural network system without any switching components. The spike-timing dependent plasticity characteristics of the synaptic devices are measured and analyzed. With the help of the device model based on the measured data, the pattern recognition capability of the hardware-based spiking neural network systems is demonstrated using the modified national institute of standards and technology handwritten dataset. By comparing systems with and without inhibitory synapse part, it is confirmed that the inhibitory synapse part is an essential element in obtaining effective and high pattern classification capability.

  20. Real-time polarization imaging algorithm for camera-based polarization navigation sensors.

    PubMed

    Lu, Hao; Zhao, Kaichun; You, Zheng; Huang, Kaoli

    2017-04-10

    Biologically inspired polarization navigation is a promising approach due to its autonomous nature, high precision, and robustness. Many researchers have built point source-based and camera-based polarization navigation prototypes in recent years. Camera-based prototypes can benefit from their high spatial resolution but incur a heavy computation load. The pattern recognition algorithm in most polarization imaging algorithms involves several nonlinear calculations that impose a significant computation burden. In this paper, the polarization imaging and pattern recognition algorithms are optimized through reduction to several linear calculations by exploiting the orthogonality of the Stokes parameters without affecting precision according to the features of the solar meridian and the patterns of the polarized skylight. The algorithm contains a pattern recognition algorithm with a Hough transform as well as orientation measurement algorithms. The algorithm was loaded and run on a digital signal processing system to test its computational complexity. The test showed that the running time decreased to several tens of milliseconds from several thousand milliseconds. Through simulations and experiments, it was found that the algorithm can measure orientation without reducing precision. It can hence satisfy the practical demands of low computational load and high precision for use in embedded systems.
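
    The linear core of such a pipeline can be sketched with standard Stokes algebra (an illustration, not the authors' full algorithm): from four images taken behind polarizers at 0°, 45°, 90°, and 135°, the Stokes parameters and the angle of polarization follow from per-pixel sums, differences, and a single arctangent.

```python
import numpy as np

def angle_of_polarization(i0, i45, i90, i135):
    """Per-pixel angle of polarization from four polarizer-filtered images.

    All heavy lifting is linear (sums/differences of images); only one
    arctangent per pixel remains, which keeps the computation light.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity (Stokes I)
    s1 = i0 - i90                        # Stokes Q
    s2 = i45 - i135                      # Stokes U
    aop = 0.5 * np.arctan2(s2, s1)       # angle of polarization, radians
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)  # degree of linear pol.
    return aop, dolp
```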

  1. Development of a Cloud Computing-Based Pier Type Port Structure Stability Evaluation Platform Using Fiber Bragg Grating Sensors.

    PubMed

    Jo, Byung Wan; Jo, Jun Ho; Khan, Rana Muhammad Asad; Kim, Jung Hoon; Lee, Yun Sung

    2018-05-23

    Structure Health Monitoring is a topic of great interest in port structures due to the ageing of structures and the limitations of evaluating structures. This paper presents a cloud computing-based stability evaluation platform for a pier type port structure using Fiber Bragg Grating (FBG) sensors in a system consisting of a FBG strain sensor, FBG displacement gauge, FBG angle meter, gateway, and cloud computing-based web server. The sensors were installed on core components of the structure and measurements were taken to evaluate the structures. The measurement values were transmitted to the web server via the gateway to analyze and visualize them. All data were analyzed and visualized in the web server to evaluate the structure based on the safety evaluation index (SEI). The stability evaluation platform for pier type port structures involves the efficient monitoring of the structures which can be carried out easily anytime and anywhere by converging new technologies such as cloud computing and FBG sensors. In addition, the platform has been successfully implemented at “Maryang Harbor” situated in Maryang-Meyon of Korea to test its durability.

  2. Computers in medical education 1: evaluation of a problem-orientated learning package.

    PubMed

    Devitt, P; Palmer, E

    1998-04-01

    A computer-based learning package has been developed, aimed at expanding students' knowledge base, as well as improving data-handling abilities and clinical problem-solving skills. The program was evaluated by monitoring its use by students, canvassing users' opinions and measuring its effectiveness as a learning tool compared to tutorials on the same material. Evaluation was undertaken using three methods: initially, by a questionnaire on computers as a learning tool and the applicability of the content: second, through monitoring by the computer of student use, decisions and performance; finally, through pre- and post-test assessment of fifth-year students who either used a computer package or attended a tutorial on equivalent material. Most students provided positive comments on the learning material and expressed a willingness to see computer-aided learning (CAL) introduced into the curriculum. Over a 3-month period, 26 modules in the program were used on 1246 occasions. Objective measurement showed a significant gain in knowledge, data handling and problem-solving skills. Computer-aided learning is a valuable learning resource that deserves better attention in medical education. When used appropriately, the computer can be an effective learning resource, not only for the delivery of knowledge. but also to help students develop their problem-solving skills.

  3. The effects of Internet or interactive computer-based patient education in the field of breast cancer: a systematic literature review.

    PubMed

    Ryhänen, Anne M; Siekkinen, Mervi; Rankinen, Sirkku; Korvenranta, Heikki; Leino-Kilpi, Helena

    2010-04-01

    The aim of this systematic review was to analyze what kind of Internet or interactive computer-based patient education programs have been developed and to analyze the effectiveness of these programs in the field of breast cancer patient education. Patient education for breast cancer patients is an important intervention to empower the patient. However, we know very little about the effects and potential of Internet-based patient education in the empowerment of breast cancer patients. Complete databases were searched covering the period from the beginning of each database to November 2008. Studies were included if they concerned patient education for breast cancer patients with Internet or interactive computer programs and were based on randomized controlled, on clinical trials or quasi-experimental studies. We identified 14 articles involving 2374 participants. The design was randomized controlled trial in nine papers, in two papers clinical trial and in three quasi-experimental. Seven of the studies were randomized to experimental and control groups, in two papers participants were grouped by ethnic and racial differences and by mode of Internet use and three studies measured the same group pre- and post-tests after using a computer program. The interventions used were described as interactive computer or multimedia programs and use of the Internet. The methodological solutions of the studies varied. The effects of the studies were diverse except for knowledge-related issues. Internet or interactive computer-based patient education programs in the care of breast cancer patients may have positive effect increasing breast cancer knowledge. The results suggest a positive relationship between the Internet or computer-based patient education program use and the knowledge level of patients with breast cancer but a diverse relationship between patient's participation and other outcome measures. There is need to develop and research more Internet-based patient education. 2009 Elsevier Ireland Ltd. All rights reserved.

  4. Factors (Not) Affecting What Students Do with Computers and Internet at Home

    ERIC Educational Resources Information Center

    Hinostroza, J. Enrique; Matamala, Carolina; Labbé, Christian; Claro, Magdalena; Cabello, Tania

    2015-01-01

    This paper presents the results of an analysis of secondary students' computer use, aimed at understanding how different factors influence the profile of activities carried out by students with computers. The analysis is based on the data from a national study aimed at measuring students' Information and Communication Technology (ICT) skills for…

  5. Comparability of Computer- and Paper-Administered Multiple-Choice Tests for K-12 Populations: A Synthesis

    ERIC Educational Resources Information Center

    Kingston, Neal M.

    2009-01-01

    There have been many studies of the comparability of computer-administered and paper-administered tests. Not surprisingly (given the variety of measurement and statistical sampling issues that can affect any one study) the results of such studies have not always been consistent. Moreover, the quality of computer-based test administration systems…

  6. On the usage of ultrasound computational models for decision making under ambiguity

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron

    2018-04-01

    Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.

  7. Using Curriculum-Based Measurement To Monitor Kindergarteners' Mathematics Development

    ERIC Educational Resources Information Center

    Seethaler, Pamela M.; Fuchs, Lynn S.

    2011-01-01

    The purpose of this study was to examine technical and instructional features of a kindergarten curriculum-based measurement (CBM) tool designed to track students' mathematics progress in terms of computational concepts, procedures, and counting strategies. Students in 10 kindergarten classrooms in three elementary schools completed alternate…

  8. Relative velocity change measurement based on seismic noise analysis in exploration geophysics

    NASA Astrophysics Data System (ADS)

    Corciulo, M.; Roux, P.; Campillo, M.; Dubuq, D.

    2011-12-01

    Passive monitoring techniques based on noise cross-correlation analysis are still debated in exploration geophysics, even though recent studies have shown impressive performance in seismology at larger scales. Tracking the time evolution of complex geological structures using noise data involves localization of noise sources and measurement of relative velocity variations. Monitoring relative velocity variations only requires the measurement of phase shifts of seismic noise cross-correlation functions computed for successive time recordings. The existing algorithms, such as the Stretching and Doublet methods, classically demand great effort in terms of computation time, making them impractical when continuous datasets on dense arrays are acquired. We present here an innovative technique for passive monitoring based on the measurement of the instantaneous phase of noise-correlated signals. The Instantaneous Phase Variation (IPV) technique aims at combining the advantages of the Stretching and Doublet methods while providing a faster measurement of the relative velocity change. The IPV takes advantage of the Hilbert transform to compute, in the time domain, the phase difference between two noise correlation functions. The relative velocity variation is measured through the slope of the linear regression of the phase-difference curve as a function of correlation time. The large number of noise correlation functions classically available at exploration scale on dense arrays allows for a statistical analysis that further improves the precision of the velocity-change estimate. In this work, numerical tests first compare the IPV performance to the Stretching and Doublet techniques in terms of accuracy, robustness, and computation time. Experimental results are then presented using a seismic noise dataset with five days of continuous recording on 397 geophones spread over a ~1 km² area.
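
    A compact numerical reading of the IPV measurement, with synthetic waveforms standing in for real noise-correlation functions (a sketch consistent with the description above, not the authors' code): the Hilbert transform gives the instantaneous phases of the reference and current correlation functions, their unwrapped difference is converted to a time shift, and the slope of that shift versus lag time gives −dv/v.

```python
import numpy as np
from scipy.signal import hilbert

fs, f0 = 500.0, 10.0                 # sampling rate (Hz), dominant frequency (Hz)
t = np.arange(0.5, 4.0, 1.0 / fs)    # correlation lag times (s), early coda

dv_v_true = -0.002                   # imposed relative velocity change
ref = np.cos(2 * np.pi * f0 * t) * np.exp(-t / 3.0)
cur = np.cos(2 * np.pi * f0 * t * (1 - dv_v_true)) * np.exp(-t / 3.0)
# note: dt/t = -dv/v, so the current waveform is stretched by (1 - dv/v)

# Instantaneous phase difference via the analytic (Hilbert) signals.
dphi = np.unwrap(np.angle(hilbert(cur) * np.conj(hilbert(ref))))
dt = dphi / (2 * np.pi * f0)         # phase shift -> time shift at each lag

slope = np.polyfit(t, dt, 1)[0]      # linear regression of dt versus lag time
print("estimated dv/v =", -slope)    # ~ -0.002
```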

  9. Development and testing of an automated computer tablet-based method for self-testing of high and low contrast near visual acuity in ophthalmic patients.

    PubMed

    Aslam, Tariq M; Parry, Neil R A; Murray, Ian J; Salleh, Mahani; Col, Caterina Dal; Mirza, Naznin; Czanner, Gabriela; Tahir, Humza J

    2016-05-01

    Many eye diseases require on-going assessment for optimal management, creating an ever-increasing burden on patients and hospitals that could potentially be reduced through home vision monitoring. However, there is limited evidence for the utility of current applications and devices for this. To address this, we present a new automated, computer tablet-based method for self-testing near visual acuity (VA) for both high and low contrast targets. We report on its reliability and agreement with gold standard measures. The Mobile Assessment of Vision by intERactIve Computer (MAVERIC) system consists of a calibrated computer tablet housed in a bespoke viewing chamber. Purpose-built software automatically elicits touch-screen responses from subjects to measure their near VA for either low or high contrast acuity. Near high contrast acuity was measured using both the MAVERIC system and a near Landolt C chart in one eye for 81 patients and low contrast acuity using the MAVERIC system and a 25 % contrast near EDTRS chart in one eye of a separate 95 patients. The MAVERIC near acuity was also retested after 20 min to evaluate repeatability. Repeatability of both high and low contrast MAVERIC acuity measures, and their agreement with the chart tests, was assessed using the Bland-Altman comparison method. One hundred and seventy-three patients (96 %) completed the self- testing MAVERIC system without formal assistance. The resulting MAVERIC vision demonstrated good repeatability and good agreement with the gold-standard near chart measures. This study demonstrates the potential utility of the MAVERIC system for patients with ophthalmic disease to self-test their high and low contrast VA. The technique has a high degree of reliability and agreement with gold standard chart based measurements.

  10. The Effectiveness of a Web-Based Computer-Tailored Intervention on Workplace Sitting: A Randomized Controlled Trial.

    PubMed

    De Cocker, Katrien; De Bourdeaudhuij, Ilse; Cardon, Greet; Vandelanotte, Corneel

    2016-05-31

    Effective interventions to influence workplace sitting are needed, as office-based workers demonstrate high levels of continued sitting, and sitting too much is associated with adverse health effects. Therefore, we developed a theory-driven, Web-based, interactive, computer-tailored intervention aimed at reducing and interrupting sitting at work. The objective of our study was to investigate the effects of this intervention on objectively measured sitting time, standing time, and breaks from sitting, as well as self-reported context-specific sitting among Flemish employees in a field-based approach. Employees (n=213) participated in a 3-group randomized controlled trial that assessed outcomes at baseline, 1-month follow-up, and 3-month follow-up through self-reports. A subsample (n=122) were willing to wear an activity monitor (activPAL) from Monday to Friday. The tailored group received an automated Web-based, computer-tailored intervention including personalized feedback and tips on how to reduce or interrupt workplace sitting. The generic group received an automated Web-based generic advice with tips. The control group was a wait-list control condition, initially receiving no intervention. Intervention effects were tested with repeated-measures multivariate analysis of variance. The tailored intervention was successful in decreasing self-reported total workday sitting (time × group: P<.001), sitting at work (time × group: P<.001), and leisure time sitting (time × group: P=.03), and in increasing objectively measured breaks at work (time × group: P=.07); this was not the case in the other conditions. The changes in self-reported total nonworkday sitting, sitting during transport, television viewing, and personal computer use, objectively measured total sitting time, and sitting and standing time at work did not differ between conditions. Our results point out the significance of computer tailoring for sedentary behavior and its potential use in public health promotion, as the effects of the tailored condition were superior to the generic and control conditions. Clinicaltrials.gov NCT02672215; http://clinicaltrials.gov/ct2/show/NCT02672215 (Archived by WebCite at http://www.webcitation.org/6glPFBLWv).

  11. The Effectiveness of a Web-Based Computer-Tailored Intervention on Workplace Sitting: A Randomized Controlled Trial

    PubMed Central

    De Bourdeaudhuij, Ilse; Cardon, Greet; Vandelanotte, Corneel

    2016-01-01

    Background Effective interventions to influence workplace sitting are needed, as office-based workers demonstrate high levels of continued sitting, and sitting too much is associated with adverse health effects. Therefore, we developed a theory-driven, Web-based, interactive, computer-tailored intervention aimed at reducing and interrupting sitting at work. Objective The objective of our study was to investigate the effects of this intervention on objectively measured sitting time, standing time, and breaks from sitting, as well as self-reported context-specific sitting among Flemish employees in a field-based approach. Methods Employees (n=213) participated in a 3-group randomized controlled trial that assessed outcomes at baseline, 1-month follow-up, and 3-month follow-up through self-reports. A subsample (n=122) were willing to wear an activity monitor (activPAL) from Monday to Friday. The tailored group received an automated Web-based, computer-tailored intervention including personalized feedback and tips on how to reduce or interrupt workplace sitting. The generic group received an automated Web-based generic advice with tips. The control group was a wait-list control condition, initially receiving no intervention. Intervention effects were tested with repeated-measures multivariate analysis of variance. Results The tailored intervention was successful in decreasing self-reported total workday sitting (time × group: P<.001), sitting at work (time × group: P<.001), and leisure time sitting (time × group: P=.03), and in increasing objectively measured breaks at work (time × group: P=.07); this was not the case in the other conditions. The changes in self-reported total nonworkday sitting, sitting during transport, television viewing, and personal computer use, objectively measured total sitting time, and sitting and standing time at work did not differ between conditions. Conclusions Our results point out the significance of computer tailoring for sedentary behavior and its potential use in public health promotion, as the effects of the tailored condition were superior to the generic and control conditions. Trial Registration Clinicaltrials.gov NCT02672215; http://clinicaltrials.gov/ct2/show/NCT02672215 (Archived by WebCite at http://www.webcitation.org/6glPFBLWv) PMID:27245789

  12. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time based or counter based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
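
    The idea can be paraphrased in a few lines (an illustrative sketch, not the patented implementation; the class and method names are invented): periodically re-measure the quiescent level, shift both thresholds by the measured drift, and require the threshold criterion to hold for a qualification-width count before declaring a trigger.

```python
import numpy as np

class DriftingTrigger:
    def __init__(self, low, high, qual_width=3):
        self.low, self.high = low, high       # initial threshold values
        self.qual_width = qual_width          # required consecutive exceedances
        self.baseline = 0.0                   # last measured quiescent level
        self._hits = 0

    def recalibrate(self, quiet_samples):
        """Re-measure the quiescent level and offset both thresholds by the drift."""
        new_baseline = float(np.median(quiet_samples))
        drift = new_baseline - self.baseline
        self.low += drift
        self.high += drift
        self.baseline = new_baseline

    def step(self, sample):
        """Return True when a qualified trigger condition is met."""
        if sample > self.high or sample < self.low:
            self._hits += 1
        else:
            self._hits = 0
        return self._hits >= self.qual_width

trig = DriftingTrigger(low=-1.0, high=1.0)
quiet = np.random.default_rng(0).normal(0.3, 0.05, 100)  # quiescent level drifted
trig.recalibrate(quiet)
print(trig.low, trig.high)   # thresholds shifted by ~0.3, avoiding false triggers
```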

  13. Model-based segmentation in orbital volume measurement with cone beam computed tomography and evaluation against current concepts.

    PubMed

    Wagner, Maximilian E H; Gellrich, Nils-Claudius; Friese, Karl-Ingo; Becker, Matthias; Wolter, Franz-Erich; Lichtenstein, Juergen T; Stoetzer, Marcus; Rana, Majeed; Essig, Harald

    2016-01-01

    Objective determination of the orbital volume is important in the diagnostic process and in evaluating the efficacy of medical and/or surgical treatment of orbital diseases. Tools designed to measure orbital volume with computed tomography (CT) often cannot be used with cone beam CT (CBCT) because of inferior tissue representation, although CBCT has the benefit of greater availability and lower patient radiation exposure. Therefore, a model-based segmentation technique is presented as a new method for measuring orbital volume and compared to alternative techniques. Both eyes from thirty subjects with no known orbital pathology who had undergone CBCT as a part of routine care were evaluated (n = 60 eyes). Orbital volume was measured with manual, atlas-based, and model-based segmentation methods. Volume measurements, volume determination time, and usability were compared between the three methods. Differences in means were tested for statistical significance using two-tailed Student's t tests. Neither atlas-based (26.63 ± 3.15 mm(3)) nor model-based (26.87 ± 2.99 mm(3)) measurements were significantly different from manual volume measurements (26.65 ± 4.0 mm(3)). However, the time required to determine orbital volume was significantly longer for manual measurements (10.24 ± 1.21 min) than for atlas-based (6.96 ± 2.62 min, p < 0.001) or model-based (5.73 ± 1.12 min, p < 0.001) measurements. All three orbital volume measurement methods examined can accurately measure orbital volume, although atlas-based and model-based methods seem to be more user-friendly and less time-consuming. The new model-based technique achieves fully automated segmentation results, whereas all atlas-based segmentations at least required manipulations to the anterior closing. Additionally, model-based segmentation can provide reliable orbital volume measurements when CT image quality is poor.

  14. Flight data acquisition methodology for validation of passive ranging algorithms for obstacle avoidance

    NASA Technical Reports Server (NTRS)

    Smith, Phillip N.

    1990-01-01

    The automation of low-altitude rotorcraft flight depends on the ability to detect, locate, and navigate around obstacles lying in the rotorcraft's intended flightpath. Computer vision techniques provide a passive method of obstacle detection and range estimation, for obstacle avoidance. Several algorithms based on computer vision methods have been developed for this purpose using laboratory data; however, further development and validation of candidate algorithms require data collected from rotorcraft flight. A data base containing low-altitude imagery augmented with the rotorcraft and sensor parameters required for passive range estimation is not readily available. Here, the emphasis is on the methodology used to develop such a data base from flight-test data consisting of imagery, rotorcraft and sensor parameters, and ground-truth range measurements. As part of the data preparation, a technique for obtaining the sensor calibration parameters is described. The data base will enable the further development of algorithms for computer vision-based obstacle detection and passive range estimation, as well as provide a benchmark for verification of range estimates against ground-truth measurements.

  15. Computer-Aided Evaluation of Blood Vessel Geometry From Acoustic Images.

    PubMed

    Lindström, Stefan B; Uhlin, Fredrik; Bjarnegård, Niclas; Gylling, Micael; Nilsson, Kamilla; Svensson, Christina; Yngman-Uhlin, Pia; Länne, Toste

    2018-04-01

    A method for computer-aided assessment of blood vessel geometries based on shape-fitting algorithms from metric vision was evaluated. Acoustic images of cross sections of the radial artery and cephalic vein were acquired, and medical practitioners used a computer application to measure the wall thickness and nominal diameter of these blood vessels with a caliper method and the shape-fitting method. The methods performed equally well for wall thickness measurements. The shape-fitting method was preferable for measuring the diameter, since it reduced systematic errors by up to 63% in the case of the cephalic vein because of its eccentricity. © 2017 by the American Institute of Ultrasound in Medicine.

  16. Sound Velocity and Diffraction Intensity Measurements Based on Raman-Nath Theory of the Interaction of Light and Ultrasound

    ERIC Educational Resources Information Center

    Neeson, John F.; Austin, Stephen

    1975-01-01

    Describes a method for the measurement of the velocity of sound in various liquids based on the Raman-Nath theory of light-sound interaction. Utilizes an analog computer program to calculate the intensity of light scattered into various diffraction orders. (CP)
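
    For reference, in the Raman-Nath regime the fraction of light diffracted into order n is $J_n^2(v)$, where $J_n$ is the Bessel function of the first kind and v is the Raman-Nath parameter (proportional to the acoustic pressure amplitude). A few lines of Python reproduce the kind of intensity table the article's analog computer program produced (a sketch; the value of v is chosen arbitrarily):

```python
import numpy as np
from scipy.special import jv   # Bessel function of the first kind

v = 1.8                        # Raman-Nath parameter (peak phase modulation)
orders = np.arange(-4, 5)
intensity = jv(orders, v) ** 2 # relative intensity of each diffraction order

for n, i_n in zip(orders, intensity):
    print(f"order {n:+d}: {i_n:.4f}")
print("sum over orders ~", intensity.sum())   # -> ~1, energy is conserved
```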

  17. Cooling the Collective Motion of Trapped Ions to Initialize a Quantum Register

    DTIC Science & Technology

    2016-09-13

    computation [1] provides a general framework for fundamental investigations into subjects such as entanglement, quantum measurement, and quantum information theory. Since quantum computation relies on entanglement between qubits, any implementation of a quantum computer must offer isolation from the ... for realizing a quantum computer, which is scalable to an arbitrary number of qubits. Their scheme is based on a collection of trapped atomic ions

  18. 10 CFR Appendix A to Subpart U of... - Sampling Plan for Enforcement Testing of Electric Motors

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....010 is based on a 20 percent tolerance in the total power loss at full-load and fixed output power... measured full-load efficiency of unit i. Step 3. Compute the sample standard deviation (S1) of the measured full-load efficiency of the n1 units in the first sample as follows: ER83AD04.006 Step 4. Compute the...
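
    The inline tokens such as "ER83AD04.006" and the leading "...." in this record (and the two identical records that follow) are remnants of equation images and snippet truncation in the extracted regulation text. The step they stand in for is presumably the ordinary sample standard deviation of the measured efficiencies (a reconstruction from the surrounding wording, not the official formula image):

$$S_1 = \sqrt{\frac{\sum_{i=1}^{n_1}\left(X_i - \bar{X}_1\right)^2}{n_1 - 1}}, \qquad \bar{X}_1 = \frac{1}{n_1}\sum_{i=1}^{n_1} X_i,$$

    where $X_i$ is the measured full-load efficiency of unit $i$ and $n_1$ is the size of the first sample.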

  19. 10 CFR Appendix A to Subpart U of... - Sampling Plan for Enforcement Testing of Electric Motors

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....010 is based on a 20 percent tolerance in the total power loss at full-load and fixed output power... measured full-load efficiency of unit i. Step 3. Compute the sample standard deviation (S1) of the measured full-load efficiency of the n1 units in the first sample as follows: ER83AD04.006 Step 4. Compute the...

  20. 10 CFR Appendix A to Subpart U of... - Sampling Plan for Enforcement Testing of Electric Motors

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....010 is based on a 20 percent tolerance in the total power loss at full-load and fixed output power... measured full-load efficiency of unit i. Step 3. Compute the sample standard deviation (S1) of the measured full-load efficiency of the n1 units in the first sample as follows: ER83AD04.006 Step 4. Compute the...

  1. Relation of Structural and Vibratory Kinematics of the Vocal Folds to Two Acoustic Measures of Breathy Voice Based on Computational Modeling

    ERIC Educational Resources Information Center

    Samlan, Robin A.; Story, Brad H.

    2011-01-01

    Purpose: To relate vocal fold structure and kinematics to 2 acoustic measures: cepstral peak prominence (CPP) and the amplitude of the first harmonic relative to the second (H1-H2). Method: The authors used a computational, kinematic model of the medial surfaces of the vocal folds to specify features of vocal fold structure and vibration in a…
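
    As a reference point for one of the two measures (a generic spectral estimator, not the authors' model-based computation; the windowing and the ±20 Hz search band are assumptions): H1-H2 can be estimated from a voiced frame as the dB difference between the spectral amplitudes at the fundamental and at its second harmonic, given the fundamental frequency f0.

```python
import numpy as np

def h1_h2(frame, fs, f0):
    """Estimate H1-H2 (dB) of a voiced frame, given fundamental frequency f0."""
    n = len(frame)
    spec = np.abs(np.fft.rfft(frame * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, 1.0 / fs)

    def peak_amp(f):
        # amplitude near the target frequency (search +/- 20 Hz around it)
        band = (freqs > f - 20) & (freqs < f + 20)
        return spec[band].max()

    return 20.0 * np.log10(peak_amp(f0) / peak_amp(2.0 * f0))

# Synthetic "breathy-like" source: strong first harmonic relative to the second.
fs, f0 = 16000, 120.0
t = np.arange(0, 0.05, 1.0 / fs)
frame = 1.0 * np.sin(2 * np.pi * f0 * t) + 0.3 * np.sin(2 * np.pi * 2 * f0 * t)
print(round(h1_h2(frame, fs, f0), 1), "dB")   # ~ +10.5 dB
```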

  2. Validating an operational physical method to compute surface radiation from geostationary satellites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Manajit; Dhere, Neelkanth G.; Wohlgemuth, John H.

    We developed models to compute global horizontal irradiance (GHI) and direct normal irradiance (DNI) over the last three decades. These models can be classified as empirical or physical based on the approach. Empirical models relate ground-based observations with satellite measurements and use these relations to compute surface radiation. Physical models consider the physics behind the radiation received at the satellite and create retrievals to estimate surface radiation. Furthermore, while empirical methods have traditionally been used for computing surface radiation for the solar energy industry, the advent of faster computing has made operational physical models viable. The Global Solar Insolation Project (GSIP) is a physical model that computes DNI and GHI using the visible and infrared channel measurements from a weather satellite. GSIP uses a two-stage scheme that first retrieves cloud properties and uses those properties in a radiative transfer model to calculate GHI and DNI. Developed for polar orbiting satellites, GSIP has been adapted to NOAA's Geostationary Operational Environmental Satellite series and can run operationally at high spatial resolutions. Our method holds the possibility of creating high-quality datasets of GHI and DNI for use by the solar energy industry. We present an outline of the methodology and results from running the model, as well as a validation study using ground-based instruments.

  3. Motivation and Performance within a Collaborative Computer-Based Modeling Task: Relations between Students' Achievement Goal Orientation, Self-Efficacy, Cognitive Processing, and Achievement

    ERIC Educational Resources Information Center

    Sins, Patrick H. M.; van Joolingen, Wouter R.; Savelsbergh, Elwin R.; van Hout-Wolters, Bernadette

    2008-01-01

    Purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In order to test the model, group measures of…

  4. An Examination of Alternate Assessment Durations when Assessing Multiple-Skill Computational Fluency: The Generalizability and Dependability of Curriculum-Based Outcomes within the Context of Educational Decisions

    ERIC Educational Resources Information Center

    Christ, Theodore J.; Johnson-Gros, Kristin H.

    2005-01-01

    The current study extended previous research on curriculum-based measurement in mathematics (M-CBM) assessments. The purpose was to examine the generalizability and dependability of multiple-skill M-CBM computation assessments across various assessment durations (1, 2, 3, 4, 5, and 6 minutes). Results of generalizability and dependability studies…

  5. Ancilla-driven quantum computation for qudits and continuous variables

    NASA Astrophysics Data System (ADS)

    Proctor, Timothy; Giulian, Melissa; Korolkova, Natalia; Andersson, Erika; Kendon, Viv

    2017-05-01

    Although qubits are the leading candidate for the basic elements in a quantum computer, there are also a range of reasons to consider using higher-dimensional qudits or quantum continuous variables (QCVs). In this paper, we use a general "quantum variable" formalism to propose a method of quantum computation in which ancillas are used to mediate gates on a well-isolated "quantum memory" register and which may be applied to the setting of qubits, qudits (for d > 2), or QCVs. More specifically, we present a model in which universal quantum computation may be implemented on a register using only repeated applications of a single fixed two-body ancilla-register interaction gate, ancillas prepared in a single state, and local measurements of these ancillas. In order to maintain determinism in the computation, adaptive measurements via a classical feed forward of measurement outcomes are used, with the method similar to that in measurement-based quantum computation (MBQC). We show that our model has the same hybrid quantum-classical processing advantages as MBQC, including the power to implement any Clifford circuit in essentially one layer of quantum computation. In some physical settings, high-quality measurements of the ancillas may be highly challenging or not possible, and hence we also present a globally unitary model which replaces the need for measurements of the ancillas with the requirement for ancillas to be prepared in states from a fixed orthonormal basis. Finally, we discuss settings in which these models may be of practical interest.

  6. Computer constructed imagery of distant plasma interaction boundaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grenstadt, E.W.; Schurr, H.D.; Tsugawa, R.K.

    1982-01-01

    Computer constructed sketches of plasma boundaries arising from the interaction between the solar wind and the magnetosphere can serve as both didactic and research tools. In particular, the structure of the earth's bow shock can be represented as a nonuniform surface according to the instantaneous orientation of the IMF, and temporal changes in structural distribution can be modeled as a sequence of sketches based on observed sequences of spacecraft-based measurements. Viewed rapidly, such a sequence of sketches can be the basis for representation of plasma processes by computer animation.

  7. The Webcam system: a simple, automated, computer-based video system for quantitative measurement of movement in nonhuman primates.

    PubMed

    Togasaki, Daniel M; Hsu, Albert; Samant, Meghana; Farzan, Bijan; DeLanney, Louis E; Langston, J William; Di Monte, Donato A; Quik, Maryka

    2005-06-30

    Investigations using models of neurologic disease frequently involve quantifying animal motor activity. We developed a simple method for measuring motor activity using a computer-based video system (the Webcam system) consisting of an inexpensive video camera connected to a personal computer running customized software. Images of the animals are captured at half-second intervals and movement is quantified as the number of pixel changes between consecutive images. The Webcam system allows measurement of motor activity of the animals in their home cages, without devices affixed to their bodies. Webcam quantification of movement was validated by correlation with measures simultaneously obtained by two other methods: measurement of locomotion by interruption of infrared beams; and measurement of general motor activity using portable accelerometers. In untreated squirrel monkeys, correlations of Webcam and locomotor activity exceeded 0.79, and correlations with general activity counts exceeded 0.65. Webcam activity decreased after the monkeys were rendered parkinsonian by treatment with 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP), but the correlations with the other measures of motor activity were maintained. Webcam activity also correlated with clinical ratings of parkinsonism. These results indicate that the Webcam system is reliable under both untreated and experimental conditions and is an excellent method for quantifying motor activity in animals.
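
    The quantification step described above is simple enough to sketch directly. In the following minimal version, the change threshold separating sensor noise from real movement is an assumption, since the abstract does not give one:

    ```python
    import numpy as np

    def activity_from_frames(frames, diff_threshold=15):
        """Quantify movement as the number of pixels whose intensity changes
        between consecutive frames, in the spirit of the Webcam system.
        `frames`: array of shape (n_frames, height, width), 8-bit grayscale."""
        frames = frames.astype(np.int16)                    # avoid uint8 wrap-around
        diffs = np.abs(np.diff(frames, axis=0))             # frame-to-frame differences
        return (diffs > diff_threshold).sum(axis=(1, 2))    # changed pixels per interval

    # usage: with frames sampled every 0.5 s, the result is an activity time series
    frames = np.random.randint(0, 256, size=(10, 120, 160), dtype=np.uint8)
    print(activity_from_frames(frames))
    ```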

  8. INS/GNSS Tightly-Coupled Integration Using Quaternion-Based AUPF for USV.

    PubMed

    Xia, Guoqing; Wang, Guoqing

    2016-08-02

    This paper addresses the integration of an Inertial Navigation System (INS) and a Global Navigation Satellite System (GNSS) for the purpose of developing a low-cost, robust and highly accurate navigation system for unmanned surface vehicles (USVs). A tightly-coupled integration approach is one of the most promising architectures for fusing GNSS data with INS measurements. However, the resulting system and measurement models are nonlinear, and in a practical system the stochastic sensor measurement errors are non-Gaussian. The particle filter (PF), one of the most theoretically attractive nonlinear/non-Gaussian estimation methods, is becoming increasingly popular in navigation applications; however, its large computational burden limits its practical usage. To reduce the computational burden without degrading estimation accuracy, a quaternion-based adaptive unscented particle filter (AUPF), which combines the adaptive unscented Kalman filter (AUKF) with the PF, is proposed in this paper. The unscented Kalman filter (UKF) is used to improve the proposal distribution and generate posterior estimates, which specify the PF importance density function so that particles are generated more intelligently. In addition, the computational complexity of the filter is reduced by avoiding the re-sampling step. Furthermore, a residual-based covariance matching technique is used to adapt the measurement error covariance. A trajectory simulator based on a dynamic model of the USV is used to test the proposed algorithm. Results show that the quaternion-based AUPF can significantly improve overall navigation accuracy and reliability.
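
    For readers unfamiliar with the PF backbone, here is a minimal bootstrap particle filter for a scalar state. The paper's AUPF additionally uses UKF-derived proposals and residual-based adaptive measurement covariance, which are omitted here; the toy system and noise levels are assumptions for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def particle_filter(zs, f, h, n=500, q=0.1, r=0.5):
        """Bootstrap PF: propagate particles through the process model,
        weight by the measurement likelihood, estimate, resample."""
        x = rng.normal(0.0, 1.0, n)                       # initial particle cloud
        est = []
        for z in zs:
            x = f(x) + rng.normal(0.0, q, n)              # process model + noise
            w = np.exp(-0.5 * ((z - h(x)) / r) ** 2)      # Gaussian likelihood
            w /= w.sum()
            est.append(np.sum(w * x))                     # weighted-mean estimate
            x = x[rng.choice(n, n, p=w)]                  # multinomial resampling
        return np.array(est)

    # toy nonlinear system: x' = 0.9x + w,  z = x + 0.05x^3 + v
    x_true, zs = 1.0, []
    for _ in range(50):
        x_true = 0.9 * x_true + rng.normal(0, 0.1)
        zs.append(x_true + 0.05 * x_true**3 + rng.normal(0, 0.5))
    print(particle_filter(zs, lambda x: 0.9 * x, lambda x: x + 0.05 * x**3)[-5:])
    ```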

  9. dETECT: A Model for the Evaluation of Instructional Units for Teaching Computing in Middle School

    ERIC Educational Resources Information Center

    von Wangenheim, Christiane G.; Petri, Giani; Zibertti, André W.; Borgatto, Adriano F.; Hauck, Jean C. R.; Pacheco, Fernando S.; Filho, Raul Missfeldt

    2017-01-01

    The objective of this article is to present the development and evaluation of dETECT (Evaluating TEaching CompuTing), a model for the evaluation of the quality of instructional units for teaching computing in middle school based on the students' perception collected through a measurement instrument. The dETECT model was systematically developed…

  10. A Communications Modeling System for Swarm-Based Sensors

    DTIC Science & Technology

    2003-09-01

    ...detection and monitoring, and advances in computational capabilities have provided for embedded data analysis and the generation of information from raw... computing and manufacturing technology have made such systems possible. In order to harness this potential for Air Force applications, a method of...

  11. A multi-time-step noise reduction method for measuring velocity statistics from particle tracking velocimetry

    NASA Astrophysics Data System (ADS)

    Machicoane, Nathanaël; López-Caballero, Miguel; Bourgoin, Mickael; Aliseda, Alberto; Volk, Romain

    2017-10-01

    We present a method to improve the accuracy of velocity measurements for fluid flow or particles immersed in it, based on a multi-time-step approach that allows for cancellation of noise in the velocity measurements. Improved velocity statistics, a critical element in turbulent flow measurements, can be computed from the combination of the velocity moments computed using standard particle tracking velocimetry (PTV) or particle image velocimetry (PIV) techniques for data sets that have been collected over different values of time intervals between images. This method produces Eulerian velocity fields and Lagrangian velocity statistics with much lower noise levels compared to standard PIV or PTV measurements, without the need of filtering and/or windowing. Particle displacement between two frames is computed for multiple different time-step values between frames in a canonical experiment of homogeneous isotropic turbulence. The second order velocity structure function of the flow is computed with the new method and compared to results from traditional measurement techniques in the literature. Increased accuracy is also demonstrated by comparing the dissipation rate of turbulent kinetic energy measured from this function against previously validated measurements.
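
    A one-dimensional synthetic illustration of the underlying idea: the finite-difference velocity variance contains a position-noise term scaling as 1/dt^2, so measuring at several time steps and extrapolating to 1/dt^2 -> 0 removes it. The noise model and magnitudes below are assumptions, not the paper's experimental values:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    n, dt0, sigma_x = 200_000, 1e-3, 2e-5
    v_true = rng.normal(0.0, 0.1, n)                 # true velocity samples

    dts = dt0 * np.arange(1, 6)                      # multiple time steps
    var_meas = []
    for dt in dts:
        # velocity from finite differences of noisy positions over interval dt;
        # independent position noise at the two frames adds 2*sigma_x^2/dt^2 variance
        noise = rng.normal(0.0, sigma_x, (2, n))
        v = v_true + (noise[1] - noise[0]) / dt
        var_meas.append(v.var())

    # linear fit of var_meas against 1/dt^2: the intercept is the noise-free variance
    A = np.vstack([np.ones_like(dts), 1.0 / dts**2]).T
    coef, *_ = np.linalg.lstsq(A, np.array(var_meas), rcond=None)
    print(f"true var: {v_true.var():.5f}, noise-cancelled estimate: {coef[0]:.5f}")
    ```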

  12. Prediction of quantitative intrathoracic fluid volume to diagnose pulmonary oedema using LabVIEW.

    PubMed

    Urooj, Shabana; Khan, M; Ansari, A Q; Lay-Ekuakille, Aimé; Salhan, Ashok K

    2012-01-01

    Pulmonary oedema is a life-threatening disease that requires special attention in the areas of research and clinical diagnosis. Computer-based techniques are rarely used to quantify the intrathoracic fluid volume (IFV) for diagnostic purposes. This paper discusses a software program developed in LabVIEW to detect and diagnose pulmonary oedema. The software operates on anthropometric dimensions and physiological parameters, mainly transthoracic electrical impedance (TEI). This technique is accurate and faster than existing manual techniques. The LabVIEW software was used to compute the parameters required to quantify IFV, and an equation relating per cent control and IFV was obtained. The results of predicted TEI and measured TEI were compared with previously reported data to validate the developed program. It was found that the predicted values of TEI obtained from the computer-based technique were much closer to the measured values of TEI. Six new subjects were enrolled to measure and predict transthoracic impedance and hence to quantify IFV. A similar difference between measured and predicted TEI values was also observed for the new subjects.

  13. Radio Frequency Mass Gauging of Propellants

    NASA Technical Reports Server (NTRS)

    Zimmerli, Gregory A.; Vaden, Karl R.; Herlacher, Michael D.; Buchanan, David A.; VanDresar, Neil T.

    2007-01-01

    A combined experimental and computer simulation effort was conducted to measure radio frequency (RF) tank resonance modes in a dewar partially filled with liquid oxygen, and compare the measurements with numerical simulations. The goal of the effort was to demonstrate that computer simulations of a tank's electromagnetic eigenmodes can be used to accurately predict ground-based measurements, thereby providing a computational tool for predicting tank modes in a low-gravity environment. Matching the measured resonant frequencies of several tank modes with computer simulations can be used to gauge the amount of liquid in a tank, thus providing a possible method to gauge cryogenic propellant tanks in low-gravity. Using a handheld RF spectrum analyzer and a small antenna in a 46 liter capacity dewar for experimental measurements, we have verified that the four lowest transverse magnetic eigenmodes can be accurately predicted as a function of liquid oxygen fill level using computer simulations. The input to the computer simulations consisted of tank dimensions, and the dielectric constant of the fluid. Without using any adjustable parameters, the calculated and measured frequencies agree such that the liquid oxygen fill level was gauged to within 2 percent full scale uncertainty. These results demonstrate the utility of using electromagnetic simulations to form the basis of an RF mass gauging technology with the power to simulate tank resonance frequencies from arbitrary fluid configurations.
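
    The gauging step amounts to inverting a simulated frequency-versus-fill curve against a measured resonance. The sketch below uses a hypothetical linear curve as a stand-in for the electromagnetic simulations (the real method matches several transverse magnetic modes, and the curve shape depends on tank geometry and fluid permittivity):

    ```python
    import numpy as np

    # simulated eigenmode frequency vs fill level (hypothetical lookup table,
    # standing in for the electromagnetic eigenmode simulations)
    fill_levels = np.linspace(0.0, 1.0, 11)          # fraction full
    sim_freq = 950e6 - 120e6 * fill_levels           # mode frequency drops with liquid

    def gauge_fill(measured_freq_hz):
        """Invert the monotone simulated frequency curve to estimate fill level."""
        # np.interp needs increasing x, so flip both arrays
        return np.interp(measured_freq_hz, sim_freq[::-1], fill_levels[::-1])

    print(f"estimated fill fraction: {gauge_fill(890e6):.2f}")   # -> 0.50
    ```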

  14. Experimental and Computational Investigation of Triple-rotating Blades in a Mower Deck

    NASA Astrophysics Data System (ADS)

    Chon, Woochong; Amano, Ryoichi S.

    Experimental and computational studies were performed on a 1.27 m wide three-spindle lawn mower deck with a side discharge arrangement. Laser Doppler velocimetry was used to measure the air velocity at 12 different sections under the mower deck. High-speed video provided valuable visual evidence of airflow and grass discharge patterns. Strain gages were attached at several predetermined locations on the mower blades to measure strain. In the computational fluid dynamics work, computer-based analytical studies were performed in two different trials. First, two-dimensional blade shapes at several arbitrary radial sections were selected for flow computations around the blade model. Then, a three-dimensional full deck model was developed and compared with the experimental results.

  15. InteGO2: A web tool for measuring and visualizing gene semantic similarities using Gene Ontology

    DOE PAGES

    Peng, Jiajie; Li, Hongxiang; Liu, Yongzhuang; ...

    2016-08-31

    Here, the Gene Ontology (GO) has been used in high-throughput omics research as a major bioinformatics resource. The hierarchical structure of GO provides users a convenient platform for biological information abstraction and hypothesis testing. Computational methods have been developed to identify functionally similar genes. However, none of the existing measurements take into account all the rich information in GO. Furthermore, the web-based applications constructed using these existing methods to compute gene functional similarities provide purely text-based outputs; without a graphical visualization interface, it is difficult to interpret the results. To address this, we present InteGO2, a web tool that allows researchers to calculate GO-based gene semantic similarities using seven widely used GO-based similarity measurements. We also provide an integrative measurement that synergistically integrates all the individual measurements to improve overall performance. Using HTML5 and cytoscape.js, we provide a graphical interface in InteGO2 to visualize the resulting gene functional association networks. In conclusion, InteGO2 is an easy-to-use HTML5-based web tool with which researchers can conveniently measure gene or gene product functional similarity and visualize the network of functional interactions in a graphical interface.

  16. InteGO2: a web tool for measuring and visualizing gene semantic similarities using Gene Ontology.

    PubMed

    Peng, Jiajie; Li, Hongxiang; Liu, Yongzhuang; Juan, Liran; Jiang, Qinghua; Wang, Yadong; Chen, Jin

    2016-08-31

    The Gene Ontology (GO) has been used in high-throughput omics research as a major bioinformatics resource. The hierarchical structure of GO provides users a convenient platform for biological information abstraction and hypothesis testing. Computational methods have been developed to identify functionally similar genes. However, none of the existing measurements take into account all the rich information in GO. Furthermore, the web-based applications constructed using these existing methods to compute gene functional similarities provide purely text-based outputs; without a graphical visualization interface, it is difficult to interpret the results. We present InteGO2, a web tool that allows researchers to calculate GO-based gene semantic similarities using seven widely used GO-based similarity measurements. We also provide an integrative measurement that synergistically integrates all the individual measurements to improve overall performance. Using HTML5 and cytoscape.js, we provide a graphical interface in InteGO2 to visualize the resulting gene functional association networks. InteGO2 is an easy-to-use HTML5-based web tool with which researchers can conveniently measure gene or gene product functional similarity and visualize the network of functional interactions in a graphical interface. InteGO2 can be accessed via http://mlg.hit.edu.cn:8089/ .

  17. InteGO2: A web tool for measuring and visualizing gene semantic similarities using Gene Ontology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng, Jiajie; Li, Hongxiang; Liu, Yongzhuang

    Here, the Gene Ontology (GO) has been used in high-throughput omics research as a major bioinformatics resource. The hierarchical structure of GO provides users a convenient platform for biological information abstraction and hypothesis testing. Computational methods have been developed to identify functionally similar genes. However, none of the existing measurements take into account all the rich information in GO. Furthermore, the web-based applications constructed using these existing methods to compute gene functional similarities provide purely text-based outputs; without a graphical visualization interface, it is difficult to interpret the results. To address this, we present InteGO2, a web tool that allows researchers to calculate GO-based gene semantic similarities using seven widely used GO-based similarity measurements. We also provide an integrative measurement that synergistically integrates all the individual measurements to improve overall performance. Using HTML5 and cytoscape.js, we provide a graphical interface in InteGO2 to visualize the resulting gene functional association networks. In conclusion, InteGO2 is an easy-to-use HTML5-based web tool with which researchers can conveniently measure gene or gene product functional similarity and visualize the network of functional interactions in a graphical interface.

  18. Computer graphics for management: An abstract of capabilities and applications of the EIS system

    NASA Technical Reports Server (NTRS)

    Solem, B. J.

    1975-01-01

    The Executive Information Services (EIS) system, developed as a computer-based, time-sharing tool for making and implementing management decisions and including computer graphics capabilities, is described. The following resources are available through the EIS languages: a centralized corporate/government data base, customized and working data bases, report writing, general computational capability, specialized routines, modeling/programming capability, and graphics. Nearly all EIS graphs can be created by a single, on-line instruction. A large number of options are available, such as selection of graphic form, line control, shading, placement on the page, multiple images on a page, control of scaling and labeling, plotting of cumulative data sets, optional grid lines, and stacked charts. Examples of areas in which the EIS system may be used include research, estimating services, planning, budgeting, performance measurement, and national computer hook-up negotiations.

  19. Computer-Based Radiographic Quantification of Joint Space Narrowing Progression Using Sequential Hand Radiographs: Validation Study in Rheumatoid Arthritis Patients from Multiple Institutions.

    PubMed

    Ichikawa, Shota; Kamishima, Tamotsu; Sutherland, Kenneth; Fukae, Jun; Katayama, Kou; Aoki, Yuko; Okubo, Takanobu; Okino, Taichi; Kaneda, Takahiko; Takagi, Satoshi; Tanimura, Kazuhide

    2017-10-01

    We have developed a refined computer-based method to detect joint space narrowing (JSN) progression via a joint space narrowing progression index (JSNPI) computed by superimposing sequential hand radiographs. The purpose of this study was to assess the validity of the computer-based method using images of rheumatoid arthritis (RA) patients obtained from multiple institutions. Sequential hand radiographs of 42 patients (37 females and 5 males) with RA from two institutions were analyzed by the computer-based method and by visual scoring systems as a standard of reference. A JSNPI above the smallest detectable difference (SDD) defined JSN progression at the joint level. The sensitivity and specificity of the computer-based method for JSN progression were calculated using the SDD and a receiver operating characteristic (ROC) curve. Out of 314 metacarpophalangeal joints, 34 joints progressed based on the SDD, while 11 joints widened. Twenty-one joints progressed according to the computer-based method, 11 joints according to the scoring systems, and 13 joints according to both methods. Based on the SDD, sensitivity was lower and specificity higher, at 54.2% and 92.8%, respectively. At the most discriminant cutoff point according to the ROC curve, sensitivity and specificity were 70.8% and 81.7%, respectively. The proposed computer-based method provides a quantitative measurement of JSN progression using sequential hand radiographs and may be a useful tool for follow-up assessment of joint damage in RA patients.

  20. Fast estimation of first-order scattering in a medical x-ray computed tomography scanner using a ray-tracing technique.

    PubMed

    Liu, Xin

    2014-01-01

    This study describes a deterministic method for simulating the first-order scattering in a medical computed tomography scanner. The method was developed based on a physics model of x-ray photon interactions with matter and a ray tracing technique. The results from simulated scattering were compared to the ones from an actual scattering measurement. Two phantoms with homogeneous and heterogeneous material distributions were used in the scattering simulation and measurement. It was found that the simulated scatter profile was in agreement with the measurement result, with an average difference of 25% or less. Finally, tomographic images with artifacts caused by scatter were corrected based on the simulated scatter profiles. The image quality improved significantly.
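
    The core building block of such a simulation is the line integral of attenuation along a ray through the object. The sketch below computes Beer-Lambert attenuation along one source-to-detector path through a 2D attenuation map by uniform sampling (a simplified stand-in for the ray-tracing step; first-order scatter then sums source-to-scatter-site-to-detector paths weighted by the differential scattering cross-section, which is not shown):

    ```python
    import numpy as np

    def path_attenuation(mu, p0, p1, n_samples=500):
        """exp(-integral of mu) along the segment p0 -> p1 through a 2D
        attenuation map, by uniform sampling in pixel units."""
        t = np.linspace(0.0, 1.0, n_samples)
        pts = p0[None, :] + t[:, None] * (p1 - p0)[None, :]
        ij = np.clip(pts.astype(int), 0, np.array(mu.shape) - 1)
        ds = np.linalg.norm(p1 - p0) / n_samples          # step length (pixels)
        line_integral = mu[ij[:, 0], ij[:, 1]].sum() * ds
        return np.exp(-line_integral)

    mu = np.full((128, 128), 0.02)                         # water-like map, per pixel
    mu[40:90, 40:90] = 0.05                                # denser insert
    print(path_attenuation(mu, np.array([0.0, 64.0]), np.array([127.0, 64.0])))
    ```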

  1. Baseline estimation in flame's spectra by using neural networks and robust statistics

    NASA Astrophysics Data System (ADS)

    Garces, Hugo; Arias, Luis; Rojas, Alejandro

    2014-09-01

    This work presents a baseline estimation method for flame spectra based on an artificial intelligence structure, a neural network, combining robust statistics with multivariate analysis to automatically discriminate measured wavelengths belonging to the continuous feature for model adaptation, thereby removing the restriction of having to measure the target baseline for training. The main contributions of this paper are: analyzing a flame spectra database by computing Jolliffe statistics from Principal Component Analysis to detect wavelengths that are not correlated with most of the measured data and hence correspond to the baseline; systematically determining the optimal number of neurons in hidden layers based on Akaike's Final Prediction Error; estimating the baseline over the full wavelength range of the sampled spectra; and training a neural network that generalizes the relation between measured and baseline spectra. The main application of our research is to compute total radiation with baseline information, allowing diagnosis of the combustion process state for optimization in early stages.

  2. Quantitative vibro-acoustography of tissue-like objects by measurement of resonant modes

    NASA Astrophysics Data System (ADS)

    Mazumder, Dibbyan; Umesh, Sharath; Mohan Vasu, Ram; Roy, Debasish; Kanhirodan, Rajan; Asokan, Sundarrajan

    2017-01-01

    We demonstrate a simple and computationally efficient method to recover the shear modulus pertaining to the focal volume of an ultrasound transducer from the measured vibro-acoustic spectral peaks. A model that explains the transport of local deformation information with the acoustic wave acting as a carrier is put forth. It is also shown that the peaks correspond to the natural frequencies of vibration of the focal volume, which may be readily computed by solving an eigenvalue problem associated with the vibrating region. Having measured the first natural frequency with a fibre Bragg grating sensor, and armed with an expedient means of computing the same, we demonstrate a simple procedure, based on the method of bisection, to recover the average shear modulus of the object in the ultrasound focal volume. We demonstrate this recovery for four homogeneous agarose slabs of different stiffness and verify the accuracy of the recovery using independent rheometer-based measurements. Extension of the method to anisotropic samples through the measurement of a more complete set of resonant modes and the recovery of an elasticity tensor distribution, as is done in resonant ultrasound spectroscopy, is suggested.
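
    The inversion step is a textbook bisection on a monotone forward model. In the sketch below, the forward model is a hypothetical power-law stand-in (first resonance scaling with shear wave speed sqrt(mu/rho) over a characteristic size L), whereas the paper computes it by solving an eigenvalue problem for the focal volume; only the bisection logic is the point here:

    ```python
    import numpy as np

    def first_natural_freq(mu, rho=1000.0, L=5e-3):
        """Hypothetical forward model: first resonance of the focal volume,
        scaling with shear wave speed sqrt(mu/rho) over a size L."""
        return 0.5 * np.sqrt(mu / rho) / L

    def recover_mu(f_measured, lo=1e2, hi=1e6, tol=1.0):
        """Bisection on the monotone forward model, mirroring the paper's
        recovery procedure."""
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if first_natural_freq(mid) < f_measured:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    f_meas = first_natural_freq(12e3)          # pretend mu = 12 kPa was measured
    print(f"recovered shear modulus ~ {recover_mu(f_meas):.0f} Pa")
    ```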

  3. Segmentation of DTI based on tensorial morphological gradient

    NASA Astrophysics Data System (ADS)

    Rittner, Leticia; de Alencar Lotufo, Roberto

    2009-02-01

    This paper presents a segmentation technique for diffusion tensor imaging (DTI). The technique is based on a tensorial morphological gradient (TMG), defined as the maximum dissimilarity over the neighborhood. Once this gradient is computed, the tensorial segmentation problem becomes a scalar one, which can be solved by conventional techniques such as the watershed transform and thresholding. Similarity functions, namely the dot product, the tensorial dot product, the J-divergence and the Frobenius norm, were compared in order to understand their differences regarding the measurement of tensor dissimilarities. The study showed that the dot product and the tensorial dot product are inappropriate for computation of the TMG, while the Frobenius norm and the J-divergence are both capable of measuring tensor dissimilarities, despite some distortion with the Frobenius norm, since it is not an affine-invariant measure. In order to validate the TMG as a solution for DTI segmentation, its computation was performed using distinct similarity measures and structuring elements, and TMG results were compared to fractional anisotropy. Finally, synthetic and real DTI were used in the method validation. Experiments showed that the TMG enables the segmentation of DTI by the watershed transform or by a simple choice of a threshold. The strength of the proposed segmentation method is its simplicity and robustness, consequences of the TMG computation: it enables the use not only of well-known algorithms and tools from mathematical morphology, but also of any other segmentation method, since the TMG computation transforms tensorial images into scalar ones.
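
    A direct, if slow, implementation of the TMG with the Frobenius norm is straightforward. This sketch assumes a (H, W, 3, 3) tensor field and a square structuring element, both simplifications relative to the paper's setting:

    ```python
    import numpy as np

    def tmg(tensor_field, radius=1):
        """Tensorial morphological gradient with the Frobenius norm: at each
        pixel, the maximum pairwise dissimilarity between tensors inside a
        (2*radius+1)^2 structuring element. Output is a scalar gradient image
        that can be thresholded or fed to a watershed transform."""
        H, W = tensor_field.shape[:2]
        grad = np.zeros((H, W))
        for i in range(H):
            for j in range(W):
                nb = tensor_field[max(i - radius, 0):i + radius + 1,
                                  max(j - radius, 0):j + radius + 1].reshape(-1, 9)
                # max Frobenius distance over all tensor pairs in the neighborhood
                d = np.linalg.norm(nb[:, None, :] - nb[None, :, :], axis=-1)
                grad[i, j] = d.max()
        return grad

    field = np.random.rand(16, 16, 3, 3)
    print(tmg(field).shape)        # (16, 16) scalar gradient image
    ```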

  4. Monitoring task loading with multivariate EEG measures during complex forms of human-computer interaction

    NASA Technical Reports Server (NTRS)

    Smith, M. E.; Gevins, A.; Brown, H.; Karnik, A.; Du, R.

    2001-01-01

    Electroencephalographic (EEG) recordings were made while 16 participants performed versions of a personal-computer-based flight simulation task of low, moderate, or high difficulty. As task difficulty increased, frontal midline theta EEG activity increased and alpha band activity decreased. A participant-specific function that combined multiple EEG features to create a single load index was derived from a sample of each participant's data and then applied to new test data from that participant. Index values were computed for every 4 s of task data. Across participants, mean task load index values increased systematically with increasing task difficulty and differed significantly between the different task versions. Actual or potential applications of this research include the use of multivariate EEG-based methods to monitor task loading during naturalistic computer-based work.
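
    The study combined many participant-specific EEG features into its index; as a much simpler single-channel illustration of the direction of the reported effects (theta up, alpha down with load), one can compute a theta/alpha band-power ratio per 4-s epoch. Sampling rate, band edges, and the ratio itself are assumptions here, not the study's derived function:

    ```python
    import numpy as np

    def band_power(x, fs, f_lo, f_hi):
        """Power of x in [f_lo, f_hi) Hz from the FFT periodogram."""
        freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
        psd = np.abs(np.fft.rfft(x)) ** 2
        return psd[(freqs >= f_lo) & (freqs < f_hi)].sum()

    def load_index(eeg, fs=256, epoch_s=4):
        """Toy load index per 4-s epoch: theta (4-8 Hz) over alpha (8-12 Hz)."""
        n = fs * epoch_s
        epochs = [eeg[i:i + n] for i in range(0, len(eeg) - n + 1, n)]
        return [band_power(e, fs, 4, 8) / band_power(e, fs, 8, 12) for e in epochs]

    eeg = np.random.randn(256 * 20)            # 20 s of synthetic single-channel EEG
    print([round(v, 2) for v in load_index(eeg)])
    ```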

  5. Computer vision based method and system for online measurement of geometric parameters of train wheel sets.

    PubMed

    Zhang, Zhi-Feng; Gao, Zhan; Liu, Yuan-Yuan; Jiang, Feng-Chun; Yang, Yan-Li; Ren, Yu-Fen; Yang, Hong-Jun; Yang, Kun; Zhang, Xiao-Dong

    2012-01-01

    Train wheel sets must be periodically inspected for possible or actual premature failures, and it is important to record the wear history over a wheel set's full service life; an online measuring system could therefore be of great benefit to overall process control. An online non-contact method for measuring a wheel set's geometric parameters based on opto-electronic measuring techniques is presented in this paper. A charge coupled device (CCD) camera with a selected optical lens and a frame grabber was used to capture the image of the light profile of the wheel set illuminated by a linear laser. The analogue signals of the image were transformed into corresponding digital grey-level values. The 'mapping function method' is used to transform image pixel coordinates to space coordinates. Images of wheel sets were captured as trains passed through the measuring system, and the rim inside thickness and flange thickness were measured and analyzed. The spatial resolution of the whole image capturing system is about 0.33 mm. Theoretical and experimental results show that the online measurement system based on computer vision can meet wheel set measurement requirements.

  6. Quantum proofs can be verified using only single-qubit measurements

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki; Nagaj, Daniel; Schuch, Norbert

    2016-02-01

    Quantum Merlin Arthur (QMA) is the class of problems which, though potentially hard to solve, have a quantum solution that can be verified efficiently using a quantum computer. It thus forms a natural quantum version of the classical complexity class NP (and its probabilistic variant MA, Merlin-Arthur games), where the verifier has only classical computational resources. In this paper, we study what happens when we restrict the quantum resources of the verifier to the bare minimum: individual measurements on single qubits received as they come, one by one. We find that despite this grave restriction, it is still possible to soundly verify any problem in QMA for the verifier with the minimum quantum resources possible, without using any quantum memory or multiqubit operations. We provide two independent proofs of this fact, based on measurement-based quantum computation and the local Hamiltonian problem. The former construction also applies to QMA1, i.e., QMA with one-sided error.

  7. Demonstration of blind quantum computing.

    PubMed

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  8. Applications of quantum measurement techniques: Counterfactual quantum computation, spin hall effect of light, and atomic-vapor-based photon detectors

    NASA Astrophysics Data System (ADS)

    Hosten, Onur

    This dissertation investigates several physical phenomena in atomic and optical physics, and quantum information science, by utilizing various types and techniques of quantum measurements. It is the deeper concepts of these measurements, and the way they are integrated into the seemingly unrelated topics investigated, which binds together the research presented here. The research comprises three different topics: Counterfactual quantum computation, the spin Hall effect of light, and ultra-high-efficiency photon detectors based on atomic vapors. Counterfactual computation entails obtaining answers from a quantum computer without actually running it, and is accomplished by preparing the computer as a whole into a superposition of being activated and not activated. The first experimental demonstration is presented, including the best performing implementation of Grover's quantum search algorithm to date. In addition, we develop new counterfactual computation protocols that enable unconditional and completely deterministic operation. These methods stimulated a debate in the literature, on the meaning of counterfactuality in quantum processes, which we also discuss. The spin Hall effect of light entails tiny spin-dependent displacements, unsuspected until 2004, of a beam of light when it changes propagation direction. The first experimental demonstration of the effect during refraction at an air-glass interface is presented, together with a novel enabling metrological tool relying on the concepts of quantum weak measurements. Extensions of the effect to smoothly varying media are also presented, along with utilization of a time-varying version of the weak measurement techniques. Our approach to ultra-high-efficiency photon detection develops and extends a recent novel non-solid-state scheme for photo-detection based on atomic vapors. This approach is in principle capable of resolving the number of photons in a pulse, can be extended to non-destructive detection of photons, and most importantly is proposed to operate with single-photon detection efficiencies exceeding 99%, ideally without dark counts. Such a detector would have tremendous implications, e.g., for optical quantum information processing. The feasibility of operation of this approach at the desired level is studied theoretically and several promising physical systems are investigated.

  9. Microscope self-calibration based on micro laser line imaging and soft computing algorithms

    NASA Astrophysics Data System (ADS)

    Apolinar Muñoz Rodríguez, J.

    2018-06-01

    A technique to perform microscope self-calibration via a micro laser line and soft computing algorithms is presented. In this technique, the microscope vision parameters are computed by means of soft computing algorithms based on laser line projection. To implement the self-calibration, a microscope vision system is constructed by means of a CCD camera and a 38 μm laser line. From this arrangement, the microscope vision parameters are represented via Bezier approximation networks, which are constructed from the laser line position. In this procedure, a genetic algorithm determines the microscope vision parameters by means of laser line imaging, and the approximation networks compute three-dimensional vision from the laser line position. Additionally, the soft computing algorithms re-calibrate the vision parameters when the microscope vision system is modified during the vision task. The proposed self-calibration improves on the accuracy of traditional microscope calibration, which relies on references external to the microscope system. The capability of the self-calibration based on soft computing algorithms is assessed by means of the calibration accuracy and the micro-scale measurement error, corroborated by an evaluation against the accuracy of the traditional microscope calibration.

  10. Measurement of Clavicle Fracture Shortening Using Computed Tomography and Chest Radiography.

    PubMed

    Omid, Reza; Kidd, Chris; Yi, Anthony; Villacis, Diego; White, Eric

    2016-12-01

    Nonoperative management of midshaft clavicle fractures has resulted in widely disparate outcomes and there is growing evidence that clavicle shortening poses the risk of unsatisfactory functional outcomes due to shoulder weakness and nonunion. Unfortunately, the literature does not clearly demonstrate the superiority of one particular method for measuring clavicle shortening. The purpose of this study was to compare the accuracy of clavicle shortening measurements based on plain radiographs with those based on computed tomography (CT) reconstructed images of the clavicle. A total of 51 patients with midshaft clavicle fractures who underwent both a chest CT scan and standardized anteroposterior chest radiography on the day of admission were included in this study. Both an orthopedic surgeon and a musculoskeletal radiologist measured clavicle shortening for all included patients. We then determined the accuracy and intraclass correlation coefficients for the imaging modalities. Bland-Altman plots were created to analyze agreement between the modalities and a paired t-test was used to determine any significant difference between measurements. For injured clavicles, radiographic measurements significantly overestimated the clavicular length by a mean of 8.2 mm (standard deviation [SD], ± 10.2; confidence interval [CI], 95%) compared to CT-based measurements ( p < 0.001). The intraclass correlation was 0.96 for both plain radiograph- and CT-based measurements ( p = 0.17). We found that plain radiograph-based measurements of midshaft clavicle shortening are precise, but inaccurate. When clavicle shortening is considered in the decision to pursue operative management, we do not recommend the use of plain radiograph-based measurements.
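
    The Bland-Altman analysis used to compare the two modalities is easy to reproduce. A minimal sketch with made-up length values (not the study's data):

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Bland-Altman statistics for two measurement methods: mean difference
        (bias) and 95% limits of agreement (bias +/- 1.96 SD of differences)."""
        diff = np.asarray(a, float) - np.asarray(b, float)
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    xray = np.array([152.1, 148.9, 160.4, 155.0, 149.8])   # illustrative lengths, mm
    ct = np.array([144.0, 141.2, 152.9, 146.3, 142.0])
    bias, loa = bland_altman(xray, ct)
    print(f"bias = {bias:.1f} mm, limits of agreement = {loa[0]:.1f} to {loa[1]:.1f} mm")
    ```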

  11. Measurement of Clavicle Fracture Shortening Using Computed Tomography and Chest Radiography

    PubMed Central

    Omid, Reza; Kidd, Chris; Villacis, Diego; White, Eric

    2016-01-01

    Background Nonoperative management of midshaft clavicle fractures has resulted in widely disparate outcomes and there is growing evidence that clavicle shortening poses the risk of unsatisfactory functional outcomes due to shoulder weakness and nonunion. Unfortunately, the literature does not clearly demonstrate the superiority of one particular method for measuring clavicle shortening. The purpose of this study was to compare the accuracy of clavicle shortening measurements based on plain radiographs with those based on computed tomography (CT) reconstructed images of the clavicle. Methods A total of 51 patients with midshaft clavicle fractures who underwent both a chest CT scan and standardized anteroposterior chest radiography on the day of admission were included in this study. Both an orthopedic surgeon and a musculoskeletal radiologist measured clavicle shortening for all included patients. We then determined the accuracy and intraclass correlation coefficients for the imaging modalities. Bland-Altman plots were created to analyze agreement between the modalities and a paired t-test was used to determine any significant difference between measurements. Results For injured clavicles, radiographic measurements significantly overestimated the clavicular length by a mean of 8.2 mm (standard deviation [SD], ± 10.2; confidence interval [CI], 95%) compared to CT-based measurements (p < 0.001). The intraclass correlation was 0.96 for both plain radiograph- and CT-based measurements (p = 0.17). Conclusions We found that plain radiograph-based measurements of midshaft clavicle shortening are precise, but inaccurate. When clavicle shortening is considered in the decision to pursue operative management, we do not recommend the use of plain radiograph-based measurements. PMID:27904717

  12. Measurement system for nitrous oxide based on amperometric gas sensor

    NASA Astrophysics Data System (ADS)

    Siswoyo, S.; Persaud, K. C.; Phillips, V. R.; Sneath, R.

    2017-03-01

    Nitrous oxide is a well-known and important greenhouse gas, so monitoring and controlling its concentration and emission is very important. In this work a nitrous oxide measurement system has been developed, consisting of an amperometric sensor and an appropriate lab-made potentiostat capable of measuring currents in the picoampere range. The sensor was constructed using a gold microelectrode as the working electrode surrounded by a silver wire as a quasi-reference electrode, with tetraethyl ammonium perchlorate as supporting electrolyte and dimethylsulphoxide as solvent. The lab-made potentiostat was built around a transimpedance amplifier capable of picoampere measurements and incorporated a microcontroller-based data acquisition system, controlled by a host personal computer running a dedicated program. The system was capable of detecting N2O concentrations down to 0.07% v/v.

  13. Dynamic Control of Adsorption Sensitivity for Photo-EMF-Based Ammonia Gas Sensors Using a Wireless Network

    PubMed Central

    Vashpanov, Yuriy; Choo, Hyunseung; Kim, Dongsoo Stephen

    2011-01-01

    This paper proposes an adsorption sensitivity control method that uses a wireless network and illumination light intensity in a photo-electromagnetic field (EMF)-based gas sensor for real-time measurement of a wide range of ammonia concentrations. The measurement error for ammonia concentrations from 3 to 800 ppm is minimized when the gas concentration corresponds to the optimal intensity of the illumination light. A simulation with LabView-engineered modules for automatic control of a new intelligent computer system was conducted to improve measurement precision over a wide range of gas concentrations. This gas sensor computer system with wireless network technology could be useful in the chemical industry for automatic detection and measurement of hazardous ammonia gas levels in real time. PMID:22346680

  14. Space Station UCS antenna pattern computation and measurement. [UHF Communication Subsystem

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Lu, Ba P.; Johnson, Larry A.; Fournet, Jon S.; Panneton, Robert J.; Ngo, John D.; Eggers, Donald S.; Arndt, G. D.

    1993-01-01

    The purpose of this paper is to analyze the interference to the Space Station Ultrahigh Frequency (UHF) Communication Subsystem (UCS) antenna radiation pattern due to its environment, the Space Station structure. A hybrid Computational Electromagnetics (CEM) technique was applied in this study. The antenna was modeled using the Method of Moments (MOM), and the radiation patterns were computed using the Uniform Geometrical Theory of Diffraction (GTD), in which the effects of the reflected and diffracted fields from surfaces, edges, and vertices of the Space Station structures were included. In order to validate the CEM techniques and to provide confidence in the computer-generated results, a comparison with experimental measurements was made for a 1/15 scale Space Station mockup. Good agreement between experimental and computed results was obtained, validating the CEM techniques for Space Station UCS antenna pattern predictions.

  15. Modification and fixed-point analysis of a Kalman filter for orientation estimation based on 9D inertial measurement unit data.

    PubMed

    Brückner, Hans-Peter; Spindeldreier, Christian; Blume, Holger

    2013-01-01

    A common approach to high-accuracy sensor fusion based on 9D inertial measurement unit data is Kalman filtering. State-of-the-art floating-point filter algorithms differ in their computational complexity; nevertheless, real-time operation on a low-power microcontroller at high sampling rates is not possible. This work presents algorithmic modifications to reduce the computational demands of a two-step minimum-order Kalman filter. Furthermore, the required bit-width of a fixed-point filter version is explored. For evaluation, real-world data captured using an Xsens MTx inertial sensor is used. Changes in computational latency and orientation estimation accuracy due to the proposed algorithmic modifications and fixed-point number representation are evaluated in detail on a variety of processing platforms, enabling on-board processing on wearable sensor platforms.

  16. Career Oriented Mathematics, Teacher's Manual. [Includes Mastering Computational Skill: A Use-Based Program; Owning an Automobile and Driving as a Career; Retail Sales; Measurement; and Area-Perimeter.

    ERIC Educational Resources Information Center

    Mahaffey, Michael L.; McKillip, William D.

    This manual is designed for teachers using the Career Oriented Mathematics units on owning an automobile and driving as a career, retail sales, measurement, and area-perimeter. The volume begins with a discussion of the philosophy and scheduling of the program which is designed to improve students' attitudes and ability in computation by…

  17. Development of a computational technique to measure cartilage contact area.

    PubMed

    Willing, Ryan; Lapner, Michael; Lalone, Emily A; King, Graham J W; Johnson, James A

    2014-03-21

    Computational measurement of joint contact distributions offers the benefit of non-invasive measurements of joint contact without the use of interpositional sensors or casting materials. This paper describes a technique for indirectly measuring joint contact based on overlapping of articular cartilage computer models derived from CT images and positioned using in vitro motion capture data. The accuracy of this technique when using the physiological nonuniform cartilage thickness distribution, or simplified uniform cartilage thickness distributions, is quantified through comparison with direct measurements of contact area made using a casting technique. The efficacy of using indirect contact measurement techniques for measuring the changes in contact area resulting from hemiarthroplasty at the elbow is also quantified. Using the physiological nonuniform cartilage thickness distribution reliably measured contact area (ICC=0.727), but not better than the assumed bone specific uniform cartilage thicknesses (ICC=0.673). When a contact pattern agreement score (s(agree)) was used to assess the accuracy of cartilage contact measurements made using physiological nonuniform or simplified uniform cartilage thickness distributions in terms of size, shape and location, their accuracies were not significantly different (p>0.05). The results of this study demonstrate that cartilage contact can be measured indirectly based on the overlapping of cartilage contact models. However, the results also suggest that in some situations, inter-bone distance measurement and an assumed cartilage thickness may suffice for predicting joint contact patterns.

  18. Technical Note: spektr 3.0-A computational tool for x-ray spectrum modeling and analysis.

    PubMed

    Punnoose, J; Xu, J; Sisniega, A; Zbijewski, W; Siewerdsen, J H

    2016-08-01

    A computational toolkit (spektr 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS), updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP). The toolkit includes a matlab (The Mathworks, Natick, MA) function library and improved user interface (UI), along with an optimization algorithm to match calculated beam quality with measurements. The spektr code generates x-ray spectra (photons/mm(2)/mAs at 100 cm from the source) using TASMICS by default (with TASMIP as an option) in 1 keV energy bins over beam energies of 20-150 kV, extensible to 640 kV using the TASMICS spectra. An optimization tool was implemented to compute the added filtration (Al and W) that provides the best match between calculated and measured x-ray tube output (mGy/mAs or mR/mAs) for individual x-ray tubes that may differ from those assumed in TASMICS or TASMIP, and to account for factors such as anode angle. The median percent difference in photon counts between TASMICS and TASMIP spectra was 4.15% for tube potentials in the range 30-140 kV, with the largest percentage differences arising in the low- and high-energy bins due to measurement errors in the empirically based TASMIP model and inaccurate polynomial fitting. The optimization tool reported close agreement between measured and calculated spectra, with a Pearson coefficient of 0.98. The computational toolkit, spektr, has been updated to version 3.0, validated against measurements and existing models, and made available as open source code. Video tutorials for the spektr function library, UI, and optimization tool are available.

  19. Quantum computational universality of the Cai-Miyake-Duer-Briegel two-dimensional quantum state from Affleck-Kennedy-Lieb-Tasaki quasichains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Tzu-Chieh; C. N. Yang Institute for Theoretical Physics, State University of New York at Stony Brook, Stony Brook, New York 11794-3840; Raussendorf, Robert

    2011-10-15

    Universal quantum computation can be achieved by simply performing single-qubit measurements on a highly entangled resource state, such as cluster states. Cai, Miyake, Duer, and Briegel recently constructed a ground state of a two-dimensional quantum magnet by combining multiple Affleck-Kennedy-Lieb-Tasaki quasichains of mixed spin-3/2 and spin-1/2 entities and by mapping pairs of neighboring spin-1/2 particles to individual spin-3/2 particles [Phys. Rev. A 82, 052309 (2010)]. They showed that this state enables universal quantum computation by single-spin measurements. Here, we give an alternative understanding of how this state gives rise to universal measurement-based quantum computation: by local operations, each quasichain can be converted to a one-dimensional cluster state and entangling gates between two neighboring logical qubits can be implemented by single-spin measurements. We further argue that a two-dimensional cluster state can be distilled from the Cai-Miyake-Duer-Briegel state.

  20. Let the Data Speak: Gender Differences in Math Curriculum-Based Measurement

    ERIC Educational Resources Information Center

    Yarbrough, Jamie L.; Cannon, Laura; Bergman, Shawn; Kidder-Ashley, Pamela; McCane-Bowling, Sara

    2017-01-01

    Numerous studies have identified differences between males and females in academic performance across the areas of reading, writing, and mathematics. The current study examined whether or not gender differences exist when math curriculum-based measures (M-CBMs) are used to assess basic math computation skills in a sample of third- through…

  1. An Examination of the Relationship between Computation, Problem Solving, and Reading

    ERIC Educational Resources Information Center

    Cormier, Damien C.; Yeo, Seungsoo; Christ, Theodore J.; Offrey, Laura D.; Pratt, Katherine

    2016-01-01

    The purpose of this study is to evaluate the relationship of mathematics calculation rate (curriculum-based measurement of mathematics; CBM-M), reading rate (curriculum-based measurement of reading; CBM-R), and mathematics application and problem solving skills (mathematics screener) among students at four levels of proficiency on a statewide…

  2. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    PubMed

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing the test scores of impaired patients and normal control subjects. Construct-related validity was computed through correlations between computer-based subtests and related conventional neuropsychological subtests. The study was conducted at a university center for memory disorders, with 52 patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, the positive and negative predictive values were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that the normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
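
    The predictive values quoted above follow from sensitivity, specificity, and the assumed prevalence via Bayes' rule; a minimal check, using the abstract's values (small rounding differences relative to the reported figures are to be expected):

    ```python
    def predictive_values(sens, spec, prevalence):
        """Predictive values from sensitivity, specificity, and prevalence
        via Bayes' rule."""
        p_pos = sens * prevalence + (1 - spec) * (1 - prevalence)
        ppv = sens * prevalence / p_pos
        npv = spec * (1 - prevalence) / (1 - p_pos)
        return ppv, npv

    ppv, npv = predictive_values(sens=0.83, spec=0.96, prevalence=0.10)
    print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")   # roughly 0.70 and 0.98
    ```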

  3. [Economic efficiency of computer monitoring of health].

    PubMed

    Il'icheva, N P; Stazhadze, L L

    2001-01-01

    This paper presents a method of computer-based health monitoring that uses modern information technologies in public health. The method helps an outpatient clinic organize its preventive activities at a high level and substantially reduces losses of time and money. The efficiency of such preventive measures, together with the growing number of computer and Internet users, suggests that such methods are promising and that further studies in this field are needed.

  4. Symmetry-protected topological phases with uniform computational power in one dimension

    NASA Astrophysics Data System (ADS)

    Raussendorf, Robert; Wang, Dong-Sheng; Prakash, Abhishodh; Wei, Tzu-Chieh; Stephen, David T.

    2017-07-01

    We investigate the usefulness of ground states of quantum spin chains with symmetry-protected topological order (SPTO) for measurement-based quantum computation. We show that, in spatial dimension 1, if an SPTO phase protects the identity gate, then, subject to an additional symmetry condition that is satisfied in all cases so far investigated, it can also be used for quantum computation.

  5. Shock Location Dominated Transonic Flight Loads on the Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Lokos, William A.; Lizotte, Andrew; Lindsley, Ned J.; Stauf, Rick

    2005-01-01

    During several Active Aeroelastic Wing research flights, the shadow of the over-wing shock could be observed because of natural lighting conditions. As the plane accelerated, the shock location moved aft, and as the shadow passed the aileron and trailing-edge flap hinge lines, their associated hinge moments were substantially affected. The observation of the dominant effect of shock location on aft control surface hinge moments led to this investigation. This report investigates the effect of over-wing shock location on wing loads through flight-measured data and analytical predictions. Wing-root and wing-fold bending moment and torque and leading- and trailing-edge hinge moments have been measured in flight using calibrated strain gages. These same loads have been predicted using a computational fluid dynamics code called the Euler Navier-Stokes Three Dimensional Aeroelastic Code. The computational fluid dynamics study was based on the elastically deformed shape estimated by a twist model, which in turn was derived from in-flight-measured wing deflections provided by a flight deflection measurement system. During level transonic flight, the shock location dominated the wing trailing-edge control surface hinge moments. The computational fluid dynamics analysis based on the shape provided by the flight deflection measurement system produced very similar results and substantially correlated with the measured loads data.

  6. Scalable and responsive event processing in the cloud

    PubMed Central

    Suresh, Visalakshmi; Ezhilchelvan, Paul; Watson, Paul

    2013-01-01

    Event processing involves continuous evaluation of queries over streams of events. Response-time optimization is traditionally done over a fixed set of nodes and/or by using metrics measured at query-operator levels. Cloud computing makes it easy to acquire and release computing nodes as required. Leveraging this flexibility, we propose a novel, queueing-theory-based approach for meeting specified response-time targets against fluctuating event arrival rates by drawing only the necessary amount of computing resources from a cloud platform. In the proposed approach, the entire processing engine of a distinct query is modelled as an atomic unit for predicting response times. Several such units hosted on a single node are modelled as a multiple class M/G/1 system. These aspects eliminate intrusive, low-level performance measurements at run-time, and also offer portability and scalability. Using model-based predictions, cloud resources are efficiently used to meet response-time targets. The efficacy of the approach is demonstrated through cloud-based experiments. PMID:23230164
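
    The paper models several query-processing units on one node as a multiple-class M/G/1 system; as a simpler single-class illustration of the kind of prediction used to size cloud resources, the Pollaczek-Khinchine formula below gives the mean response time from the arrival rate and the first two moments of service time (the numbers are made up):

    ```python
    def mg1_response_time(lam, es, es2):
        """Mean response time of a single-class M/G/1 queue via the
        Pollaczek-Khinchine formula:
        E[T] = E[S] + lam * E[S^2] / (2 * (1 - rho)), with rho = lam * E[S]."""
        rho = lam * es
        assert rho < 1, "queue is unstable"
        return es + lam * es2 / (2.0 * (1.0 - rho))

    # e.g. events at 80/s, service mean 10 ms, second moment 2e-4 s^2
    t = mg1_response_time(lam=80.0, es=0.010, es2=2.0e-4)
    print(f"predicted mean response time: {t * 1000:.1f} ms")   # 50.0 ms
    ```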

  7. ELISA test for anti-neutrophil cytoplasm antibodies detection evaluated by a computer screen photo-assisted technique.

    PubMed

    Filippini, D; Tejle, K; Lundström, I

    2005-08-15

    The computer screen photo-assisted technique (CSPT), a method for substance classification based on spectral fingerprinting that involves just a computer screen and a web camera as the measuring platform, is used here for the evaluation of a prospective enzyme-linked immunosorbent assay (ELISA). An anti-neutrophil cytoplasm antibodies (ANCA) ELISA test, typically used for diagnosing patients suffering from chronic inflammatory disorders of the skin, joints, blood vessels and other tissues, is comparatively tested with a standard microplate reader and CSPT, yielding equivalent results at a fraction of the instrumental cost. The CSPT approach is discussed as a distributed measuring platform allowing decentralized measurements in routine applications, while keeping information management centralized due to its naturally network-embedded operation.

  8. Adaptive compressive ghost imaging based on wavelet trees and sparse representation.

    PubMed

    Yu, Wen-Kai; Li, Ming-Fei; Yao, Xu-Ri; Liu, Xue-Feng; Wu, Ling-An; Zhai, Guang-Jie

    2014-03-24

    Compressed sensing is a theory which can reconstruct an image almost perfectly with only a few measurements by finding its sparsest representation. However, the computation time consumed for large images may be a few hours or more. In this work, we both theoretically and experimentally demonstrate a method that combines the advantages of both adaptive computational ghost imaging and compressed sensing, which we call adaptive compressive ghost imaging, whereby both the reconstruction time and measurements required for any image size can be significantly reduced. The technique can be used to improve the performance of all computational ghost imaging protocols, especially when measuring ultra-weak or noisy signals, and can be extended to imaging applications at any wavelength.
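    For background on the compressive half of the scheme, the sketch below (our generic illustration, not the authors' reconstruction code) recovers a sparse signal from far fewer random measurements than its length using iterative soft-thresholding (ISTA), one of the simplest l1 solvers; recovery quality depends on the sparsity level, the number of measurements, and the weight lam:

    ```python
    # Minimal compressed-sensing sketch: recover sparse x from y = A @ x, m < n.
    import numpy as np

    def ista(A, y, lam=0.01, steps=2000):
        """Solve min_x 0.5*||y - A x||^2 + lam*||x||_1 by soft-thresholding."""
        L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(steps):
            z = x - A.T @ (A @ x - y) / L        # gradient step on the quadratic term
            x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # shrinkage
        return x

    rng = np.random.default_rng(0)
    n, m, k = 200, 60, 5                          # signal length, measurements, sparsity
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    x_hat = ista(A, A @ x_true)
    print(np.linalg.norm(x_hat - x_true))         # small residual: recovery succeeded
    ```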

  9. Discovering Synergistic Drug Combination from a Computational Perspective.

    PubMed

    Ding, Pingjian; Luo, Jiawei; Liang, Cheng; Xiao, Qiu; Cao, Buwen; Li, Guanghui

    2018-03-30

    Synergistic drug combinations play an important role in the treatment of complex diseases. The identification of effective drug combinations is vital to further reduce side effects and improve therapeutic efficiency. In previous years, in vitro methods have been the main route to discovering synergistic drug combinations. However, the in vitro approach suffers from many limitations in time and resource consumption. Therefore, with the rapid development of computational models and the explosive growth of large-scale phenotypic data, computational methods for discovering synergistic drug combinations are an efficient and promising tool and contribute to precision medicine. How to construct the computational model is the key question for these methods, and different computational strategies yield different performance. In this review, recent advancements in computational methods for predicting effective drug combinations are summarized from multiple aspects. First, various datasets utilized to discover synergistic drug combinations are summarized. Second, we discuss feature-based approaches and partition these methods into two classes: feature-based methods in terms of similarity measures, and feature-based methods in terms of machine learning. Third, we discuss network-based approaches for uncovering synergistic drug combinations. Finally, we analyze and offer prospects for computational methods for predicting effective drug combinations. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  10. Continuing challenges for computer-based neuropsychological tests.

    PubMed

    Letz, Richard

    2003-08-01

    A number of issues critical to the development of computer-based neuropsychological testing systems that remain continuing challenges to their widespread use in occupational and environmental health are reviewed. Several computer-based neuropsychological testing systems have been developed over the last 20 years, and they have contributed substantially to the study of neurologic effects of a number of environmental exposures. However, many are no longer supported and do not run on contemporary personal computer operating systems. Issues that are continuing challenges for development of computer-based neuropsychological tests in environmental and occupational health are discussed: (1) some current technological trends that generally make test development more difficult; (2) lack of availability of usable speech recognition of the type required for computer-based testing systems; (3) implementing computer-based procedures and tasks that are improvements over, not just adaptations of, their manually-administered predecessors; (4) implementing tests of a wider range of memory functions than the limited range now available; (5) paying more attention to motivational influences that affect the reliability and validity of computer-based measurements; and (6) increasing the usability of and audience for computer-based systems. Partial solutions to some of these challenges are offered. The challenges posed by current technological trends are substantial and generally beyond the control of testing system developers. Widespread acceptance of the "tablet PC" and implementation of accurate small vocabulary, discrete, speaker-independent speech recognition would enable revolutionary improvements to computer-based testing systems, particularly for testing memory functions not covered in existing systems. Dynamic, adaptive procedures, particularly ones based on item-response theory (IRT) and computerized-adaptive testing (CAT) methods, will be implemented in new tests that will be more efficient, reliable, and valid than existing test procedures. These additional developments, along with implementation of innovative reporting formats, are necessary for more widespread acceptance of the testing systems.

  11. Application of fiber spectrometers for etch depth measurement of binary computer-generated holograms

    NASA Astrophysics Data System (ADS)

    Korolkov, V. P.; Konchenko, A. S.; Poleshchuk, A. G.

    2013-01-01

    A novel spectrophotometric method for measuring the etch depth of computer-generated holograms is presented. It is based on the spectral properties of binary phase multi-order gratings: the zero-order intensity is a periodic function of the wavenumber of the illuminating light, and the groove depth can be calculated because it is inversely proportional to this period. Measurement in reflection increases the phase depth of the grooves by a factor of 2 and allows shallow phase gratings to be measured more precisely. Binary diffraction structures with depths from several hundred to several thousand nanometers can be measured by the method. Measurement uncertainty is mainly determined by the following parameters: shifts of the spectral maxima caused by tilted groove sidewalls, uncertainty in the measured angle of light incidence, and the spectrophotometer wavelength error. It is shown theoretically and experimentally that the method can ensure 0.25-1% error for desktop spectrophotometers. Fiber spectrometers, however, are more convenient for building a practical system for scanning measurement of large-area computer-generated holograms, which are used for optical testing of aspheric optics. Diffractive Fizeau null lenses in particular need to be carefully tested for uniformity of etch depth. An experimental system for characterization of binary computer-generated holograms was developed using the spectrophotometric unit of a CHR-150 confocal sensor (STIL SA).
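    The measurement principle can be made concrete under simplifying assumptions consistent with the abstract (a 50% duty-cycle binary grating, normal incidence, measurement in reflection): the zero-order reflectance oscillates with wavenumber k = 2*pi/lambda, and the groove depth d follows from the period of that oscillation:

    ```latex
    \eta_0(k) = \cos^2(k\,d), \qquad d = \frac{\pi}{\Delta k}
    ```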

  12. Unsupervised learning of discriminative edge measures for vehicle matching between nonoverlapping cameras.

    PubMed

    Shan, Ying; Sawhney, Harpreet S; Kumar, Rakesh

    2008-04-01

    This paper proposes a novel unsupervised algorithm for learning discriminative features in the context of matching road vehicles between two non-overlapping cameras. The matching problem is formulated as a same-different classification problem, which aims to compute the probability of vehicle images from two distinct cameras being from the same vehicle or different vehicle(s). We employ a novel measurement vector that consists of three independent edge-based measures and their associated robust measures computed from a pair of aligned vehicle edge maps. The weight of each measure is determined by an unsupervised learning algorithm that optimally separates the same-different classes in the combined measurement space. This is achieved with a weak classification algorithm that automatically collects representative samples from the same-different classes, followed by a more discriminative classifier based on Fisher's Linear Discriminant and Gibbs sampling. The robustness of the match measures and the use of unsupervised discriminant analysis in the classification ensure that the proposed method performs consistently in the presence of missing/false features, temporally and spatially changing illumination conditions, and systematic misalignment caused by different camera configurations. Extensive experiments based on real data of over 200 vehicles at different times of day demonstrate promising results.

  13. Error Estimates of the Ares I Computed Turbulent Ascent Longitudinal Aerodynamic Analysis

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Ghaffari, Farhad

    2012-01-01

    Numerical predictions of the longitudinal aerodynamic characteristics for the Ares I class of vehicles, along with the associated error estimate derived from an iterative convergence grid refinement, are presented. Computational results are based on an unstructured grid, Reynolds-averaged Navier-Stokes analysis. The validity of the approach to compute the associated error estimates, derived from a base grid to an extrapolated infinite-size grid, was first demonstrated on a sub-scaled wind tunnel model at representative ascent flow conditions for which the experimental data existed. Such analysis at the transonic flow conditions revealed a maximum deviation of about 23% between the computed longitudinal aerodynamic coefficients with the base grid and the measured data across the entire roll angles. This maximum deviation from the wind tunnel data was associated with the computed normal force coefficient at the transonic flow condition and was reduced to approximately 16% based on the infinite-size grid. However, all the computed aerodynamic coefficients with the base grid at the supersonic flow conditions showed a maximum deviation of only about 8% with that level being improved to approximately 5% for the infinite-size grid. The results and the error estimates based on the established procedure are also presented for the flight flow conditions.
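    The extrapolation from a base grid to an infinite-size grid described above is, in essence, Richardson extrapolation over a grid-refinement sequence. A minimal sketch (our illustration with placeholder coefficient values, not NASA's exact procedure or data):

    ```python
    # Richardson extrapolation of a force coefficient from three grids refined
    # by a constant ratio r; also returns the observed order of convergence.
    import math

    def richardson(f_coarse, f_medium, f_fine, r=2.0):
        """Extrapolate a grid-converging quantity to zero grid spacing."""
        p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
        return f_fine + (f_fine - f_medium) / (r ** p - 1.0), p

    f_inf, order = richardson(1.080, 1.020, 1.000)   # hypothetical coefficients
    print(f"extrapolated coefficient = {f_inf:.4f}, observed order = {order:.2f}")
    ```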

  14. The comparison of various approach to evaluation erosion risks and design control erosion measures

    NASA Astrophysics Data System (ADS)

    Kapicka, Jiri

    2015-04-01

    At present, the Czech Republic uses one methodology to compute and compare erosion risks, which also contains a method for designing erosion control measures. Its basis is the Universal Soil Loss Equation (USLE) and its result, the long-term average annual soil loss (G); the methodology is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that much of the trouble and damage comes from local episodic erosion events. The extent of these events and their impact depend on local precipitation, the current plant growth phase, and soil conditions. Such erosion events can damage agricultural land, municipal property, and hydraulic structures even in locations that are in good condition from the point of view of the long-term average annual erosion rate. An alternative way to compute and compare erosion risks is the episode-based approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was carried out for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events had been recorded. The study area is a simple agricultural plot without barriers that could strongly influence water flow and sediment transport. The computation of erosion risks (for all methodologies) was based on laboratory analysis of soil samples taken from the study area. Results of the USLE and MUSLE methodologies and results from the mathematical model Erosion 3D were compared. Variances in the spatial distribution of the places with the highest soil erosion were compared and discussed. Another part presents the variance in designed erosion control measures when the designs are based on different methodologies. The results show the variance in computed erosion risks obtained from the different methodologies; these variances open a discussion about how erosion risks should be computed and evaluated in areas of differing importance.
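    For orientation, the long-term estimate at the base of the Czech methodology is the multiplicative USLE; a minimal sketch with placeholder factor values (not data from the paper):

    ```python
    def usle(R, K, LS, C, P):
        """Long-term average annual soil loss A = R * K * LS * C * P (t/ha/yr)."""
        return R * K * LS * C * P

    # R: rainfall erosivity, K: soil erodibility, LS: slope length and steepness,
    # C: cover management, P: support practices -- illustrative values only.
    print(usle(R=40.0, K=0.4, LS=1.2, C=0.25, P=1.0), "t/ha/yr")
    ```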

  15. Direct biomechanical modeling of trabecular bone using a nonlinear manifold-based volumetric representation

    NASA Astrophysics Data System (ADS)

    Jin, Dakai; Lu, Jia; Zhang, Xiaoliu; Chen, Cheng; Bai, ErWei; Saha, Punam K.

    2017-03-01

    Osteoporosis is associated with increased fracture risk. Recent advancements in in vivo imaging allow segmentation of trabecular bone (TB) microstructure, a known key determinant of bone strength and fracture risk. Accurate biomechanical modelling of TB micro-architecture provides a comprehensive summary measure of bone strength and fracture risk. In this paper, a new direct TB biomechanical modelling method using nonlinear manifold-based volumetric reconstruction of the trabecular network is presented. It is accomplished in two sequential modules. The first module reconstructs a nonlinear manifold-based volumetric representation of TB networks from three-dimensional digital images. Specifically, it starts with fuzzy digital segmentation of a TB network and computes its surface and curve skeletons. An individual trabecula is identified as a topological segment in the curve skeleton. Using geometric analysis, smoothing and optimization techniques, the algorithm generates smooth, curved, and continuous representations of individual trabeculae glued at their junctions. The method also generates a geometrically consistent TB volume at junctions. In the second module, a direct computational biomechanical stress-strain analysis is applied to the reconstructed TB volume to predict mechanical measures. The accuracy of the method was examined using micro-CT imaging of cadaveric distal tibia specimens (N = 12). A high linear correlation (r = 0.95) between TB volume computed using the new manifold-modelling algorithm and that directly derived from the voxel-based micro-CT images was observed. Young's modulus (YM) was computed using direct mechanical analysis on the TB manifold model over a cubical volume of interest (VOI), and its correlation with the YM computed using micro-CT-based conventional finite-element analysis over the same VOI was examined. A moderate linear correlation (r = 0.77) was observed between the two YM measures. These preliminary results show the accuracy of the new nonlinear manifold modelling algorithm for TB and demonstrate the feasibility of direct mechanical stress-strain analysis on a nonlinear manifold model of a highly complex biological structure.

  16. A forward-adjoint operator pair based on the elastic wave equation for use in transcranial photoacoustic computed tomography

    PubMed Central

    Mitsuhashi, Kenji; Poudel, Joemini; Matthews, Thomas P.; Garcia-Uribe, Alejandro; Wang, Lihong V.; Anastasio, Mark A.

    2017-01-01

    Photoacoustic computed tomography (PACT) is an emerging imaging modality that exploits optical contrast and ultrasonic detection principles to form images of the photoacoustically induced initial pressure distribution within tissue. The PACT reconstruction problem corresponds to an inverse source problem in which the initial pressure distribution is recovered from measurements of the radiated wavefield. A major challenge in transcranial PACT brain imaging is compensation for aberrations in the measured data due to the presence of the skull. Ultrasonic waves undergo absorption, scattering and longitudinal-to-shear wave mode conversion as they propagate through the skull. To properly account for these effects, a wave-equation-based inversion method should be employed that can model the heterogeneous elastic properties of the skull. In this work, a forward model based on a finite-difference time-domain discretization of the three-dimensional elastic wave equation is established and a procedure for computing the corresponding adjoint of the forward operator is presented. Massively parallel implementations of these operators employing multiple graphics processing units (GPUs) are also developed. The developed numerical framework is validated and investigated in computer-simulation and experimental phantom studies whose designs are motivated by transcranial PACT applications. PMID:29387291

  17. Perceived problems with computer gaming and internet use among adolescents: measurement tool for non-clinical survey studies

    PubMed Central

    2014-01-01

    Background Existing instruments for measuring problematic computer and console gaming and internet use are often lengthy and often based on a pathological perspective. The objective was to develop and present a new and short non-clinical measurement tool for perceived problems related to computer use and gaming among adolescents and to study the association between screen time and perceived problems. Methods Cross-sectional school survey of 11-, 13-, and 15-year-old students in thirteen schools in the City of Aarhus, Denmark, participation rate 89%, n = 2100. The main exposure was time spent on weekdays on computer and console gaming and on internet use for communication and surfing. The outcome measures were three indexes of perceived problems related to computer and console gaming and internet use. Results The three new indexes showed high face validity and acceptable internal consistency. Most schoolchildren with high screen time did not experience problems related to computer use. Still, there was a strong and graded association between time use and perceived problems related to computer gaming, console gaming (boys only) and internet use, odds ratios ranging from 6.90 to 10.23. Conclusion The three new measures of perceived problems related to computer and console gaming and internet use among adolescents are appropriate, reliable and valid for use in non-clinical surveys about young people's everyday life and behaviour. These new measures do not assess Internet Gaming Disorder as it is listed in the DSM and therefore have no parity with DSM criteria. We found an increasing risk of perceived problems with increasing time spent on gaming and internet use. Nevertheless, most schoolchildren who spent much time on gaming and internet use did not experience problems. PMID:24731270

  18. Clinical application of calculated split renal volume using computed tomography-based renal volumetry after partial nephrectomy: Correlation with technetium-99m dimercaptosuccinic acid renal scan data.

    PubMed

    Lee, Chan Ho; Park, Young Joo; Ku, Ja Yoon; Ha, Hong Koo

    2017-06-01

    To evaluate the clinical application of computed tomography-based measurement of renal cortical volume and split renal volume as a single tool to assess the anatomy and renal function in patients with renal tumors before and after partial nephrectomy, and to compare the findings with technetium-99m dimercaptosuccinic acid renal scan. The data of 51 patients with a unilateral renal tumor managed by partial nephrectomy were retrospectively analyzed. The renal cortical volume of tumor-bearing and contralateral kidneys was measured using ImageJ software. Split estimated glomerular filtration rate and split renal volume calculated using this renal cortical volume were compared with the split renal function measured with technetium-99m dimercaptosuccinic acid renal scan. A strong correlation between split renal function and split renal volume of the tumor-bearing kidney was observed before and after surgery (r = 0.89, P < 0.001 and r = 0.94, P < 0.001). The preoperative and postoperative split estimated glomerular filtration rate of the operated kidney showed a moderate correlation with split renal function (r = 0.39, P = 0.004 and r = 0.49, P < 0.001). The correlation between reductions in split renal function and split renal volume of the operated kidney (r = 0.87, P < 0.001) was stronger than that between split renal function and percent reduction in split estimated glomerular filtration rate (r = 0.64, P < 0.001). The split renal volume calculated using computed tomography-based renal volumetry had a strong correlation with the split renal function measured using technetium-99m dimercaptosuccinic acid renal scan. Computed tomography-based split renal volume measurement before and after partial nephrectomy can be used as a single modality for anatomical and functional assessment of the tumor-bearing kidney. © 2017 The Japanese Urological Association.

  19. The Use of Microcomputer Based Laboratories in Chemistry Secondary Education: Present State of the Art and Ideas for Research-Based Practice

    ERIC Educational Resources Information Center

    Tortosa, Montserrat

    2012-01-01

    In microcomputer-based laboratories (MBL) with data loggers, one or more sensors are connected to an interface, which is in turn connected to a computer. This equipment allows visualization of the variables of an experiment in real time and provides the possibility of measuring magnitudes which are difficult to measure with traditional equipment. Research shows that…

  20. Fast Virtual Fractional Flow Reserve Based Upon Steady-State Computational Fluid Dynamics Analysis: Results From the VIRTU-Fast Study.

    PubMed

    Morris, Paul D; Silva Soto, Daniel Alejandro; Feher, Jeroen F A; Rafiroiu, Dan; Lungu, Angela; Varma, Susheel; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2017-08-01

    Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel "pseudotransient" analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf >24 h for transient analysis) using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.
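    For reference, the quantity being computed (the standard clinical definition, not something specific to this paper) is the ratio of mean pressure distal to the stenosis to mean aortic pressure under hyperaemia; vFFR replaces the invasively measured distal pressure with a CFD-computed value:

    ```latex
    \mathrm{FFR} = \frac{\bar{P}_{\mathrm{distal}}}{\bar{P}_{\mathrm{aortic}}}
    ```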

  1. Effects of computer-based training on procedural modifications to standard functional analyses.

    PubMed

    Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.

  2. Ancilla-driven quantum computation for qudits and continuous variables

    DOE PAGES

    Proctor, Timothy; Giulian, Melissa; Korolkova, Natalia; ...

    2017-05-10

    Although qubits are the leading candidate for the basic elements in a quantum computer, there are also a range of reasons to consider using higher-dimensional qudits or quantum continuous variables (QCVs). In this paper, we use a general “quantum variable” formalism to propose a method of quantum computation in which ancillas are used to mediate gates on a well-isolated “quantum memory” register and which may be applied to the setting of qubits, qudits (for d>2), or QCVs. More specifically, we present a model in which universal quantum computation may be implemented on a register using only repeated applications of a single fixed two-body ancilla-register interaction gate, ancillas prepared in a single state, and local measurements of these ancillas. In order to maintain determinism in the computation, adaptive measurements via a classical feed forward of measurement outcomes are used, with the method similar to that in measurement-based quantum computation (MBQC). We show that our model has the same hybrid quantum-classical processing advantages as MBQC, including the power to implement any Clifford circuit in essentially one layer of quantum computation. In some physical settings, high-quality measurements of the ancillas may be highly challenging or not possible, and hence we also present a globally unitary model which replaces the need for measurements of the ancillas with the requirement for ancillas to be prepared in states from a fixed orthonormal basis. In conclusion, we discuss settings in which these models may be of practical interest.

  3. Continual Response Measurement: Design and Validation.

    ERIC Educational Resources Information Center

    Baggaley, Jon

    1987-01-01

    Discusses reliability and validity of continual response measurement (CRM), a computer-based measurement technique, and its use in social science research. Highlights include the importance of criterion-referencing the data, guidelines for designing studies using CRM, examples typifying their deductive and inductive functions, and a discussion of…

  4. Computer vision in cell biology.

    PubMed

    Danuser, Gaudenz

    2011-11-23

    Computer vision refers to the theory and implementation of artificial systems that extract information from images to understand their content. Although computers are widely used by cell biologists for visualization and measurement, interpretation of image content, i.e., the selection of events worth observing and the definition of what they mean in terms of cellular mechanisms, is mostly left to human intuition. This Essay attempts to outline roles computer vision may play and should play in image-based studies of cellular life. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Determination of high temperature strains using a PC based vision system

    NASA Astrophysics Data System (ADS)

    McNeill, Stephen R.; Sutton, Michael A.; Russell, Samuel S.

    1992-09-01

    With the widespread availability of video digitizers and cheap personal computers, the use of computer vision as an experimental tool is becoming commonplace. These systems are being used to make a wide variety of measurements that range from simple surface characterization to velocity profiles. The Sub-Pixel Digital Image Correlation technique has been developed to measure full-field displacements and gradients of the surface of an object subjected to a driving force. The technique has shown its utility in applications that range from simple translation to fluid velocity profiles to crack-tip deformation of solid rocket fuel. This technique has recently been improved and used to measure the surface displacement field of an object at high temperature. The development of a PC-based Sub-Pixel Digital Image Correlation system has yielded an accurate and easy-to-use system for measuring surface displacements and gradients. Experiments have been performed to show the system is viable for measuring thermal strain.
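    A minimal sketch of the core of such a correlation measurement (our simplified illustration, not the cited system): locate a subset of the reference image in the current image by maximizing normalized cross-correlation over integer shifts, then refine the peak to sub-pixel precision with a parabolic fit; it assumes the true peak lies strictly inside the search window:

    ```python
    import numpy as np

    def ncc(a, b):
        """Normalized cross-correlation of two equally sized subsets."""
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float((a * b).mean())

    def parabolic_peak(c, i):
        """Sub-pixel vertex of a parabola through c[i-1], c[i], c[i+1]."""
        return i + (c[i - 1] - c[i + 1]) / (2.0 * (c[i - 1] - 2.0 * c[i] + c[i + 1]))

    def track(ref, cur, cy, cx, half=10, search=5):
        """Sub-pixel displacement of the subset centred at (cy, cx)."""
        tpl = ref[cy - half:cy + half + 1, cx - half:cx + half + 1]
        score = np.full((2 * search + 1,) * 2, -np.inf)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                win = cur[cy + dy - half:cy + dy + half + 1,
                          cx + dx - half:cx + dx + half + 1]
                score[dy + search, dx + search] = ncc(tpl, win)
        iy, ix = np.unravel_index(np.argmax(score), score.shape)
        return (parabolic_peak(score[:, ix], iy) - search,   # dy
                parabolic_peak(score[iy, :], ix) - search)   # dx

    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))
    cur = np.roll(ref, (2, 3), axis=(0, 1))     # known integer displacement
    print(track(ref, cur, 32, 32))              # ~ (2.0, 3.0)
    ```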

  6. Efficacy of Individual Computer-Based Auditory Training for People with Hearing Loss: A Systematic Review of the Evidence

    PubMed Central

    Henshaw, Helen; Ferguson, Melanie A.

    2013-01-01

    Background Auditory training involves active listening to auditory stimuli and aims to improve performance in auditory tasks. As such, auditory training is a potential intervention for the management of people with hearing loss. Objective This systematic review (PROSPERO 2011: CRD42011001406) evaluated the published evidence-base for the efficacy of individual computer-based auditory training to improve speech intelligibility, cognition and communication abilities in adults with hearing loss, with or without hearing aids or cochlear implants. Methods A systematic search of eight databases and key journals identified 229 articles published since 1996, 13 of which met the inclusion criteria. Data were independently extracted and reviewed by the two authors. Study quality was assessed using ten pre-defined scientific and intervention-specific measures. Results Auditory training resulted in improved performance for trained tasks in 9/10 articles that reported on-task outcomes. Although significant generalisation of learning was shown to untrained measures of speech intelligibility (11/13 articles), cognition (1/1 articles) and self-reported hearing abilities (1/2 articles), improvements were small and not robust. Where reported, compliance with computer-based auditory training was high, and retention of learning was shown at post-training follow-ups. Published evidence was of very-low to moderate study quality. Conclusions Our findings demonstrate that published evidence for the efficacy of individual computer-based auditory training for adults with hearing loss is not robust and therefore cannot be reliably used to guide intervention at this time. We identify a need for high-quality evidence to further examine the efficacy of computer-based auditory training for people with hearing loss. PMID:23675431

  7. Computer-Based Image Analysis for Plus Disease Diagnosis in Retinopathy of Prematurity: Performance of the "i-ROP" System and Image Features Associated With Expert Diagnosis.

    PubMed

    Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Campbell, J Peter; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir; Jonas, Karyn; Chan, R V Paul; Ostmo, Susan; Chiang, Michael F

    2015-11-01

    We developed and evaluated the performance of a novel computer-based image analysis system for grading plus disease in retinopathy of prematurity (ROP), and identified the image features, shapes, and sizes that best correlate with expert diagnosis. A dataset of 77 wide-angle retinal images from infants screened for ROP was collected. A reference standard diagnosis was determined for each image by combining image grading from 3 experts with the clinical diagnosis from ophthalmoscopic examination. Manually segmented images were cropped into a range of shapes and sizes, and a computer algorithm was developed to extract tortuosity and dilation features from arteries and veins. Each feature was fed into our system to identify the set of characteristics that yielded the highest-performing system compared to the reference standard, which we refer to as the "i-ROP" system. Among the tested crop shapes, sizes, and measured features, point-based measurements of arterial and venous tortuosity (combined), and a large circular cropped image (with radius 6 times the disc diameter), provided the highest diagnostic accuracy. The i-ROP system achieved 95% accuracy for classifying preplus and plus disease compared to the reference standard. This was comparable to the performance of the 3 individual experts (96%, 94%, 92%), and significantly higher than the mean performance of 31 nonexperts (81%). This comprehensive analysis of computer-based plus disease suggests that it may be feasible to develop a fully-automated system based on wide-angle retinal images that performs comparably to expert graders at three-level plus disease discrimination. Computer-based image analysis, using objective and quantitative retinal vascular features, has potential to complement clinical ROP diagnosis by ophthalmologists.

  8. HYSEP: A Computer Program for Streamflow Hydrograph Separation and Analysis

    USGS Publications Warehouse

    Sloto, Ronald A.; Crouse, Michele Y.

    1996-01-01

    HYSEP is a computer program that can be used to separate a streamflow hydrograph into base-flow and surface-runoff components. The base-flow component has traditionally been associated with ground-water discharge and the surface-runoff component with precipitation that enters the stream as overland runoff. HYSEP includes three methods of hydrograph separation that are referred to in the literature as the fixed-interval, sliding-interval, and local-minimum methods. The program also describes the frequency and duration of measured streamflow and computed base flow and surface runoff. Daily mean stream discharge is used as input to the program in either an American Standard Code for Information Interchange (ASCII) or binary format. Output from the program includes tables, graphs, and data files. Graphical output may be plotted on the computer screen or output to a printer, plotter, or metafile.
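    Of the three methods, the local-minimum method is the simplest to illustrate. A simplified sketch of its logic (our illustration, not the published program, which also derives the window width from drainage area): days that are minima within a centred window are connected by straight lines, and base flow is capped at the measured streamflow:

    ```python
    import numpy as np

    def local_minimum_baseflow(q, interval=9):
        """q: daily mean discharge; interval: odd window width in days."""
        q = np.asarray(q, dtype=float)
        half = interval // 2
        mins = [i for i in range(len(q))
                if q[i] == q[max(0, i - half):i + half + 1].min()]
        base = np.interp(np.arange(len(q)), mins, q[mins])   # connect the minima
        return np.minimum(base, q)        # base flow cannot exceed streamflow

    q = [10, 8, 5, 12, 30, 22, 14, 11, 9, 7, 6, 8, 15, 10]  # illustrative record
    print(local_minimum_baseflow(q, interval=5).round(2))
    ```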

  9. Contributions of numerical simulation data bases to the physics, modeling and measurement of turbulence

    NASA Technical Reports Server (NTRS)

    Moin, Parviz; Spalart, Philippe R.

    1987-01-01

    The use of simulation data bases for the examination of turbulent flows is an effective research tool. Studies of the structure of turbulence have been hampered by the limited number of probes and the impossibility of measuring all desired quantities. Also, flow visualization is confined to the observation of passive markers with limited field of view and contamination caused by time-history effects. Computer flow fields are a new resource for turbulence research, providing all the instantaneous flow variables in three-dimensional space. Simulation data bases also provide much-needed information for phenomenological turbulence modeling. Three dimensional velocity and pressure fields from direct simulations can be used to compute all the terms in the transport equations for the Reynolds stresses and the dissipation rate. However, only a few, geometrically simple flows have been computed by direct numerical simulation, and the inventory of simulation does not fully address the current modeling needs in complex turbulent flows. The availability of three-dimensional flow fields also poses challenges in developing new techniques for their analysis, techniques based on experimental methods, some of which are used here for the analysis of direct-simulation data bases in studies of the mechanics of turbulent flows.

  11. Security of Personal Computer Systems: A Management Guide.

    ERIC Educational Resources Information Center

    Steinauer, Dennis D.

    This report describes management and technical security considerations associated with the use of personal computer systems as well as other microprocessor-based systems designed for use in a general office environment. Its primary objective is to identify and discuss several areas of potential vulnerability and associated protective measures. The…

  12. Two Computer-Assisted Experiments

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2013-01-01

    Two computer-assisted experiments are described: (i) determination of the speed of ultrasound waves in water and (ii) measurement of the thermal expansion of an aluminum-based alloy. A new data-acquisition system developed by PASCO scientific is used. In both experiments, the "Keep" mode of recording data is employed: the data are…

  13. Wavelet-based algorithm to the evaluation of contrasted hepatocellular carcinoma in CT-images after transarterial chemoembolization.

    PubMed

    Alvarez, Matheus; de Pina, Diana Rodrigues; Romeiro, Fernando Gomes; Duarte, Sérgio Barbosa; Miranda, José Ricardo de Arruda

    2014-07-26

    Hepatocellular carcinoma (HCC) is a primary tumor of the liver and involves different treatment modalities according to the tumor stage. After local therapies, tumor evaluation is based on the mRECIST criteria, which involve measurement of the maximum diameter of the viable lesion. This paper describes a computational algorithm for measuring the maximum diameter of the tumor through the contrast-enhanced area of the lesion. 63 computed tomography (CT) slices from 23 patients were assessed. Non-contrasted liver and typical HCC nodules were evaluated, and a virtual phantom was developed for this purpose. Detection and quantification by the algorithm were optimized using the virtual phantom. After that, we compared the algorithm's findings for the maximum diameter of the target lesions against radiologist measures. Computed results for the maximum diameter are in good agreement with the results obtained by radiologist evaluation, indicating that the algorithm was able to detect the tumor limits properly. A comparison of the maximum diameter estimated by the radiologist versus the algorithm revealed differences on the order of 0.25 cm for large-sized tumors (diameter > 5 cm), whereas differences of less than 1.0 cm were found for small-sized tumors. Differences between algorithm and radiologist measures were acceptable for small-sized tumors, with a trend toward a small decrease for tumors greater than 5 cm. Therefore, traditional methods for measuring lesion diameter should be complemented by non-subjective measurement methods, which would allow a more correct evaluation of the contrast-enhanced areas of HCC according to the mRECIST criteria.

  14. Two-dimensional heat flow apparatus

    NASA Astrophysics Data System (ADS)

    McDougall, Patrick; Ayars, Eric

    2014-06-01

    We have created an apparatus to quantitatively measure two-dimensional heat flow in a metal plate using a grid of temperature sensors read by a microcontroller. Real-time temperature data are collected from the microcontroller by a computer for comparison with a computational model of the heat equation. The microcontroller-based sensor array allows previously unavailable levels of precision at very low cost, and the combination of measurement and modeling makes for an excellent apparatus for the advanced undergraduate laboratory course.
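    A minimal sketch of the kind of comparison model described (our illustration; grid size, boundary conditions and material constants are assumptions, not the apparatus's actual values): an explicit finite-difference (FTCS) update of the two-dimensional heat equation, stepped within its stability limit:

    ```python
    import numpy as np

    def step_heat(T, alpha, dx, dt):
        """One explicit (FTCS) update; edges held at fixed temperature."""
        lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
               - 4.0 * T[1:-1, 1:-1]) / dx ** 2
        out = T.copy()
        out[1:-1, 1:-1] += alpha * dt * lap
        return out

    alpha, dx = 9.7e-5, 0.01            # aluminium diffusivity (m^2/s), 1 cm spacing
    dt = 0.2 * dx ** 2 / alpha          # below the stability limit dt <= dx^2/(4*alpha)
    T = np.full((20, 20), 20.0)
    T[0, :] = 100.0                     # one edge heated
    for _ in range(1000):
        T = step_heat(T, alpha, dx, dt)
    print(T[10, 10])                    # mid-plate value to compare with a sensor
    ```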

  15. Experimental evaluations of wearable ECG monitor.

    PubMed

    Ha, Kiryong; Kim, Youngsung; Jung, Junyoung; Lee, Jeunwoo

    2008-01-01

    The healthcare industry is changing with the ubiquitous computing environment, and wearable ECG measurement is one of the most popular approaches in this industry. Reliability and performance of a healthcare device are fundamental issues for its widespread adoption, and the interdisciplinary nature of wearable ECG monitoring makes evaluation more difficult. In this paper, we propose evaluation criteria that consider the characteristics of both ECG measurement and ubiquitous computing. With our wearable ECG monitors, various levels of experimental analysis are performed based on this evaluation strategy.

  16. Evaluating the generalization of math fact fluency gains across paper and computer performance modalities.

    PubMed

    Duhon, Gary J; House, Sara H; Stinnett, Terry A

    2012-06-01

    Computer-based interventions are being used more in the classroom. Student responses to these interventions often contribute to decision making regarding important outcomes. It is important to understand the effect of these interventions within the context of the intervention as well as across related contexts. The current study examined the generalization of math fact fluency gains resulting from a computer-based intervention to paper-and-pencil performance. A total of 31 second-grade students completed fluency drills on the computer or with paper and pencil. Pretest-posttest performance on both computer and paper and pencil for all students was evaluated using a doubly multivariate repeated-measures ANOVA. Results indicated that gains achieved on the computer did not generalize to paper-and-pencil performance. Copyright © 2012 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  17. Comparison of Various Similarity Measures for Average Image Hash in Mobile Phone Application

    NASA Astrophysics Data System (ADS)

    Farisa Chaerul Haviana, Sam; Taufik, Muhammad

    2017-04-01

    One of the main issues in Content-Based Image Retrieval (CBIR) is the choice of similarity measure for the resulting image hashes. The key challenge is to find the most beneficial distance or similarity measure for calculating similarity, in terms of speed and computing cost, especially on devices with limited computing capability such as mobile phones. In this study we implement twelve of the most common and popular distance or similarity measures in a mobile phone application and compare them. The results show that all similarity measures implemented in this study performed equally well in the mobile phone application. This opens more possibilities for combining methods in image retrieval.
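    For concreteness, a minimal sketch of average hashing with one of the simplest comparison measures (our illustration; the paper's twelve measures are not reproduced): shrink the image to an 8x8 grid by block averaging, threshold at the mean to obtain 64 bits, and score two hashes by normalized Hamming similarity:

    ```python
    import numpy as np

    def average_hash(gray, hash_size=8):
        """gray: 2-D array whose sides are multiples of hash_size (pre-resized)."""
        h, w = gray.shape
        small = gray.reshape(hash_size, h // hash_size,
                             hash_size, w // hash_size).mean(axis=(1, 3))
        return (small > small.mean()).flatten()     # 64 boolean bits

    def hamming_similarity(h1, h2):
        return 1.0 - np.count_nonzero(h1 != h2) / h1.size

    rng = np.random.default_rng(1)
    img = rng.random((64, 64))
    noisy = img + 0.05 * rng.normal(size=img.shape)  # mildly perturbed copy
    print(hamming_similarity(average_hash(img), average_hash(noisy)))
    ```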

  18. Effect of Pt Doping on Nucleation and Crystallization in Li2O.2SiO2 Glass: Experimental Measurements and Computer Modeling

    NASA Technical Reports Server (NTRS)

    Narayan, K. Lakshmi; Kelton, K. F.; Ray, C. S.

    1996-01-01

    Heterogeneous nucleation and its effects on the crystallization of lithium disilicate glass containing small amounts of Pt are investigated. Measurements of the nucleation frequencies and induction times with and without Pt are shown to be consistent with predictions based on the classical nucleation theory. A realistic computer model for the transformation is presented. Computed differential thermal analysis data (such as crystallization rates as a function of time and temperature) are shown to be in good agreement with experimental results. This modeling provides a new, more quantitative method for analyzing calorimetric data.
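    For context, the classical-nucleation-theory rate against which such measurements are compared has the standard textbook form (not the paper's fitted expression), with the work of forming a critical nucleus set by the interfacial energy and the volumetric driving free energy:

    ```latex
    I = A \exp\!\left(-\frac{\Delta G^{*}}{k_{B} T}\right), \qquad
    \Delta G^{*} = \frac{16 \pi \sigma^{3}}{3\, \Delta G_{v}^{2}}
    ```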

  19. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

    This paper shows advanced computer graphics techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots and security applications. The cost of the measurement system is extremely high; therefore, a simulation tool has been designed. The simulation provides an opportunity to execute algorithms such as obstacle avoidance [1], SLAM for robot localization [2], detection of vegetation and water obstacles in the surroundings of the robot chassis [3], and LRF measurement in a crowd of people [1]. An Axis-Aligned Bounding Box (AABB) technique and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.
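    The AABB technique reduces each simulated LRF ray to a cheap box-intersection test. A minimal sketch of the classic slab method (our illustration; it assumes the ray direction has no exactly zero components):

    ```python
    import numpy as np

    def ray_aabb(origin, direction, box_min, box_max):
        """Distance along the ray to the nearest box hit, or None on a miss."""
        inv = 1.0 / direction                       # assumes no zero components
        t1 = (box_min - origin) * inv
        t2 = (box_max - origin) * inv
        t_near = max(np.minimum(t1, t2).max(), 0.0) # entry distance, clamped
        t_far = np.maximum(t1, t2).min()            # exit distance
        return t_near if t_near <= t_far else None

    o = np.array([0.0, 0.0, 0.0])
    d = np.array([1.0, 1e-4, 1e-4]); d /= np.linalg.norm(d)
    print(ray_aabb(o, d, np.array([2.0, -1.0, -1.0]), np.array([3.0, 1.0, 1.0])))
    ```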

  20. Heat Transfer Measurements during DC Casting of Aluminium Part I: Measurement Technique

    NASA Astrophysics Data System (ADS)

    Bakken, J. A.; Bergström, T.

    A method for determining surface heat transfer to the cooling water and mould, based on in-situ temperature measurements in the DC cast ingot, has been developed. Three or more steel-mantled coaxial thermocouples (0.5 mm diameter) are mounted on a wire frame called a "harp". By allowing the "harp" to freeze into the solidifying ingot during casting, time-temperature plots T1(t), T2(t), T3(t) are obtained for three moving points positioned typically 3, 7 and 11 mm from the ingot surface. From these measurements the surface temperature, heat flux and heat transfer coefficients are computed as functions of vertical distance. The computer program is based on steady-state two-dimensional heat balances with convective terms for two fixed volume elements: one around thermocouple T1 and one surface element. A special numerical smoothing procedure is incorporated. The heat of solidification is taken into account.

  1. A computational framework for converting textual clinical diagnostic criteria into the quality data model.

    PubMed

    Hong, Na; Li, Dingcheng; Yu, Yue; Xiu, Qiongying; Liu, Hongfang; Jiang, Guoqian

    2016-10-01

    Constructing standard and computable clinical diagnostic criteria is an important but challenging research field in the clinical informatics community. The Quality Data Model (QDM) is emerging as a promising information model for standardizing clinical diagnostic criteria. To develop and evaluate automated methods for converting textual clinical diagnostic criteria into a structured format using QDM. We used a clinical Natural Language Processing (NLP) tool known as cTAKES to detect sentences and annotate events in diagnostic criteria. We developed a rule-based approach for assigning the QDM datatype(s) to an individual criterion, whereas we invoked a machine learning algorithm based on Conditional Random Fields (CRFs) for annotating attributes belonging to each particular QDM datatype. We manually developed an annotated corpus as the gold standard and used standard measures (precision, recall and f-measure) for the performance evaluation. We harvested 267 individual criteria with the datatypes of Symptom and Laboratory Test from 63 textual diagnostic criteria. We manually annotated attributes and values in 142 individual Laboratory Test criteria. The average performance of our rule-based approach was 0.84 of precision, 0.86 of recall, and 0.85 of f-measure; the performance of CRFs-based classification was 0.95 of precision, 0.88 of recall and 0.91 of f-measure. We also implemented a web-based tool that automatically translates textual Laboratory Test criteria into the QDM XML template format. The results indicated that our approaches leveraging cTAKES and CRFs are effective in facilitating diagnostic criteria annotation and classification. Our NLP-based computational framework is a feasible and useful solution in developing diagnostic criteria representation and computerization. Copyright © 2016 Elsevier Inc. All rights reserved.
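    For reference, the f-measure quoted above is the usual harmonic mean of precision P and recall R; for example, P = 0.84 and R = 0.86 give F of about 0.85:

    ```latex
    F = \frac{2\,P\,R}{P + R}
    ```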

  2. A computer-based measure of resultant achievement motivation.

    PubMed

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  3. Self-management support using an Internet-linked tablet computer (the EDGE platform)-based intervention in chronic obstructive pulmonary disease: protocol for the EDGE-COPD randomised controlled trial.

    PubMed

    Farmer, Andrew; Toms, Christy; Hardinge, Maxine; Williams, Veronika; Rutter, Heather; Tarassenko, Lionel

    2014-01-08

    The potential for telehealth-based interventions to provide remote support, education and improved self-management for long-term conditions is increasingly recognised. This trial aims to determine whether an intervention delivered through an easy-to-use tablet computer can improve the quality of life of patients with chronic obstructive pulmonary disease (COPD) by providing personalised self-management information and education. The EDGE (sElf management anD support proGrammE) for COPD is a multicentre, randomised controlled trial designed to assess the efficacy of an Internet-linked tablet computer-based intervention (the EDGE platform) in improving quality of life in patients with moderate to very severe COPD compared with usual care. Eligible patients are randomly allocated to receive the tablet computer-based intervention or usual care in a 2:1 ratio using a web-based randomisation system. Participants are recruited from respiratory outpatient clinics and pulmonary rehabilitation courses, as well as from those recently discharged from hospital with a COPD-related admission and from primary care clinics. Participants allocated to the tablet computer-based intervention complete a daily symptom diary and record clinical symptoms using a Bluetooth-linked pulse oximeter. Participants allocated to receive usual care are provided with all the information given to those allocated to the intervention but without the use of the tablet computer or the facility to monitor their symptoms or physiological variables. The primary outcome of quality of life is measured using the St George's Respiratory Questionnaire for COPD patients (SGRQ-C) at baseline, 6 and 12 months. Secondary outcome measures are recorded at these intervals and additionally at 3 months. The Research Ethics Committee for Berkshire-South Central has provided ethical approval for the conduct of the study in the recruiting regions. The results of the study will be disseminated through peer-reviewed publications and conference presentations. Current controlled trials ISRCTN40367841.

  4. An in silico method to identify computer-based protocols worthy of clinical study: An insulin infusion protocol use case

    PubMed Central

    Wong, Anthony F; Pielmeier, Ulrike; Haug, Peter J; Andreassen, Steen

    2016-01-01

    Objective Develop an efficient non-clinical method for identifying promising computer-based protocols for clinical study. An in silico comparison can provide information that informs the decision to proceed to a clinical trial. The authors compared two existing computer-based insulin infusion protocols: eProtocol-insulin from Utah, USA, and Glucosafe from Denmark. Materials and Methods The authors used eProtocol-insulin to manage intensive care unit (ICU) hyperglycemia with intravenous (IV) insulin from 2004 to 2010. Recommendations accepted by the bedside clinicians directly link the subsequent blood glucose values to eProtocol-insulin recommendations and provide a unique clinical database. The authors retrospectively compared in silico 18 984 eProtocol-insulin continuous IV insulin infusion rate recommendations from 408 ICU patients with those of Glucosafe, the candidate computer-based protocol. The subsequent blood glucose measurement value (low, on target, high) was used to identify if the insulin recommendation was too high, on target, or too low. Results Glucosafe consistently provided more favorable continuous IV insulin infusion rate recommendations than eProtocol-insulin for on target (64% of comparisons), low (80% of comparisons), or high (70% of comparisons) blood glucose. Aggregated eProtocol-insulin and Glucosafe continuous IV insulin infusion rates were clinically similar though statistically significantly different (Wilcoxon signed rank test P = .01). In contrast, when stratified by low, on target, or high subsequent blood glucose measurement, insulin infusion rates from eProtocol-insulin and Glucosafe were statistically significantly different (Wilcoxon signed rank test, P < .001), and clinically different. Discussion This in silico comparison appears to be an efficient nonclinical method for identifying promising computer-based protocols. Conclusion Preclinical in silico comparison analytical framework allows rapid and inexpensive identification of computer-based protocol care strategies that justify expensive and burdensome clinical trials. PMID:26228765

  5. Computer Proficiency Questionnaire: Assessing Low and High Computer Proficient Seniors

    PubMed Central

    Boot, Walter R.; Charness, Neil; Czaja, Sara J.; Sharit, Joseph; Rogers, Wendy A.; Fisk, Arthur D.; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-01-01

    Purpose of the Study: Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. Design and Methods: To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. Results: The CPQ demonstrated excellent reliability (Cronbach’s α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. Implications: The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults. PMID:24107443

  6. The measurement of boundary layers on a compressor blade in cascade. Volume 1: Experimental technique, analysis and results

    NASA Technical Reports Server (NTRS)

    Zierke, William C.; Deutsch, Steven

    1989-01-01

    Measurements were made of the boundary layers and wakes about a highly loaded, double-circular-arc compressor blade in cascade. These laser Doppler velocimetry measurements have yielded a very detailed and precise data base with which to test the application of viscous computational codes to turbomachinery. In order to test the computational codes at off-design conditions, the data were acquired at a chord Reynolds number of 500,000 and at three incidence angles. Moreover, these measurements have supplied some physical insight into these very complex flows. Although some natural transition is evident, laminar boundary layers usually detach and subsequently reattach as either fully or intermittently turbulent boundary layers. These transitional separation bubbles play an important role in the development of most of the boundary layers and wakes measured in this cascade and the modeling or computing of these bubbles should prove to be the key aspect in computing the entire cascade flow field. In addition, the nonequilibrium turbulent boundary layers on these highly loaded blades always have some region of separation near the trailing edge of the suction surface. These separated flows, as well as the subsequent near wakes, show no similarity and should prove to be a challenging test for the viscous computational codes.

  7. Reversibility and measurement in quantum computing

    NASA Astrophysics Data System (ADS)

    Leão, J. P.

    1998-03-01

    The relation between computation and measurement at a fundamental physical level is yet to be understood. Rolf Landauer was perhaps the first to stress the strong analogy between these two concepts. His early queries have regained pertinence with the recent efforts to develop realizable models of quantum computers. In this context the irreversibility of quantum measurement appears in conflict with the requirement of reversibility of the overall computation associated with the unitary dynamics of quantum evolution. The latter in turn is responsible for the features of superposition and entanglement which make some quantum algorithms superior to classical ones for the same task in speed and resource demand. In this article we advocate an approach to this question which relies on a model of computation designed to enforce the analogy between the two concepts instead of demarcating them as has been the case so far. The model is introduced as a symmetrization of the classical Turing machine model and is then carried on to quantum mechanics, first as an abstract local interaction scheme (symbolic measurement) and finally in a nonlocal noninteractive implementation based on Aharonov-Bohm potentials and modular variables. It is suggested that this implementation leads to the most ubiquitous of quantum algorithms: the Discrete Fourier Transform.

  8. Reliability modeling of fault-tolerant computer based systems

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1987-01-01

    Digital fault-tolerant computer-based systems have become commonplace in military and commercial avionics. These systems hold the promise of increased availability, reliability, and maintainability over conventional analog-based systems through the application of replicated digital computers arranged in fault-tolerant configurations. Three tightly coupled factors of paramount importance, ultimately determining the viability of these systems, are reliability, safety, and profitability. Reliability, the major driver, affects virtually every aspect of design, packaging, and field operations, and eventually produces profit for commercial applications or increased national security. However, the utilization of digital computer systems makes the task of producing credible reliability assessments a formidable one for the reliability engineer. The root of the problem lies in the digital computer's unique adaptability to changing requirements, computational power, and ability to test itself efficiently. Addressed here are the nuances of modeling the reliability of systems with large state sizes, in the Markov sense, which result from systems based on replicated redundant hardware, and the modeling of factors which can reduce reliability without concomitant depletion of hardware. Advanced fault-handling models are described and methods of acquiring and measuring parameters for these models are delineated.

  9. Wind-tunnel based definition of the AFE aerothermodynamic environment. [Aeroassist Flight Experiment

    NASA Technical Reports Server (NTRS)

    Miller, Charles G.; Wells, W. L.

    1992-01-01

    The Aeroassist Flight Experiment (AFE), scheduled to be performed in 1994, will serve as a precursor for aeroassisted space transfer vehicles (ASTV's) and is representative of entry concepts being considered for missions to Mars. Rationale for the AFE is reviewed briefly, as are the various experiments carried aboard the vehicle. The approach used to determine hypersonic aerodynamic and aerothermodynamic characteristics over a wide range of simulation parameters in ground-based facilities is presented. Facilities, instrumentation, and test procedures employed in the establishment of the data base are discussed. Measurements illustrating the effects of hypersonic simulation parameters, particularly normal-shock density ratio (an important parameter for hypersonic blunt bodies), and attitude on aerodynamic and aerothermodynamic characteristics are presented, and predictions from computational fluid dynamics (CFD) codes are compared with the measurements.

  10. Study of Image Qualities From 6D Robot-Based CBCT Imaging System of Small Animal Irradiator.

    PubMed

    Sharma, Sunil; Narayanasamy, Ganesh; Clarkson, Richard; Chao, Ming; Moros, Eduardo G; Zhang, Xin; Yan, Yulong; Boerma, Marjan; Paudel, Nava; Morrill, Steven; Corry, Peter; Griffin, Robert J

    2017-01-01

    To assess the quality of cone beam computed tomography images obtained by a robotic arm-based and image-guided small animal conformal radiation therapy device. The small animal conformal radiation therapy device is equipped with a 40 to 225 kV X-ray tube mounted on a custom-made gantry, a 1024 × 1024 pixel flat panel detector (200 μm resolution), and a programmable 6 degrees of freedom robot for cone beam computed tomography imaging and conformal delivery of radiation doses. A series of 2-dimensional radiographic projection images were recorded in cone beam mode by placing and rotating microcomputed tomography phantoms on the "palm" of the robotic arm. Reconstructed images were studied for image quality (spatial resolution, image uniformity, computed tomography number linearity, voxel noise, and artifacts). Geometric accuracy was measured to be 2%, corresponding to 0.7 mm accuracy, on a Shelley microcomputed tomography QA phantom. Qualitative resolution of reconstructed axial computed tomography slices using the resolution coils was within 200 μm. Quantitative spatial resolution was found to be 3.16 lp/mm. Uniformity of the system was measured within 34 Hounsfield units on a QRM microcomputed tomography water phantom. Computed tomography numbers measured using the linearity plate were linear with material density (R² > 0.995). Cone beam computed tomography images of the QRM multidisk phantom had minimal artifacts. Results showed that the small animal conformal radiation therapy device is capable of producing high-quality cone beam computed tomography images for precise and conformal small animal dose delivery. With its high-caliber imaging capabilities, the small animal conformal radiation therapy device is a powerful tool for small animal research.

  11. A computationally inexpensive model for estimating dimensional measurement uncertainty due to x-ray computed tomography instrument misalignments

    NASA Astrophysics Data System (ADS)

    Ametova, Evelina; Ferrucci, Massimiliano; Chilingaryan, Suren; Dewulf, Wim

    2018-06-01

    The recent emergence of advanced manufacturing techniques such as additive manufacturing and an increased demand on the integrity of components have motivated research on the application of x-ray computed tomography (CT) for dimensional quality control. While CT has shown significant empirical potential for this purpose, there is a need for metrological research to accelerate the acceptance of CT as a measuring instrument. The accuracy in CT-based measurements is vulnerable to the instrument geometrical configuration during data acquisition, namely the relative position and orientation of x-ray source, rotation stage, and detector. Consistency between the actual instrument geometry and the corresponding parameters used in the reconstruction algorithm is critical. Currently available procedures provide users with only estimates of geometrical parameters. Quantification and propagation of uncertainty in the measured geometrical parameters must be considered to provide a complete uncertainty analysis and to establish confidence intervals for CT dimensional measurements. In this paper, we propose a computationally inexpensive model to approximate the influence of errors in CT geometrical parameters on dimensional measurement results. We use surface points extracted from a computer-aided design (CAD) model to model discrepancies in the radiographic image coordinates assigned to the projected edges between an aligned system and a system with misalignments. The efficacy of the proposed method was confirmed on simulated and experimental data in the presence of various geometrical uncertainty contributors.
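
    A minimal sketch of the idea under stated assumptions: CAD-derived surface points are projected through an aligned cone-beam geometry and through one with a small in-plane detector offset, and the discrepancy in radiographic coordinates is tabulated. The geometry, offset, and point cloud are invented for the example, not taken from the paper.

```python
import numpy as np

def project(points, source_to_detector, detector_offset=(0.0, 0.0)):
    """Project 3D points onto a flat detector at z = source_to_detector,
    with the x-ray source at the origin; detector_offset models a simple
    in-plane detector misalignment (same length units throughout)."""
    scale = source_to_detector / points[:, 2]
    u = points[:, 0] * scale + detector_offset[0]
    v = points[:, 1] * scale + detector_offset[1]
    return np.column_stack([u, v])

# Hypothetical CAD surface points located between source and detector (mm).
rng = np.random.default_rng(1)
surface_points = rng.uniform([-5, -5, 90], [5, 5, 110], size=(1000, 3))

aligned = project(surface_points, source_to_detector=1000.0)
misaligned = project(surface_points, source_to_detector=1000.0,
                     detector_offset=(0.2, 0.0))
discrepancy = np.linalg.norm(misaligned - aligned, axis=1)
print(f"mean discrepancy in projected coordinates: {discrepancy.mean():.3f} mm")
```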

  12. Nonlinear dynamics as an engine of computation.

    PubMed

    Kia, Behnam; Lindner, John F; Ditto, William L

    2017-03-06

    Control of chaos teaches that control theory can tame the complex, random-like behaviour of chaotic systems. This alliance between control methods and physics (cybernetical physics) opens the door to many applications, including dynamics-based computing. In this article, we introduce nonlinear dynamics and its rich, sometimes chaotic behaviour as an engine of computation. We review our work that has demonstrated how to compute using nonlinear dynamics. Furthermore, we investigate the interrelationship between invariant measures of a dynamical system and its computing power to strengthen the bridge between physics and computation. This article is part of the themed issue 'Horizons of cybernetical physics'. © 2017 The Author(s).

  13. Nonlinear dynamics as an engine of computation

    PubMed Central

    Lindner, John F.; Ditto, William L.

    2017-01-01

    Control of chaos teaches that control theory can tame the complex, random-like behaviour of chaotic systems. This alliance between control methods and physics—cybernetical physics—opens the door to many applications, including dynamics-based computing. In this article, we introduce nonlinear dynamics and its rich, sometimes chaotic behaviour as an engine of computation. We review our work that has demonstrated how to compute using nonlinear dynamics. Furthermore, we investigate the interrelationship between invariant measures of a dynamical system and its computing power to strengthen the bridge between physics and computation. This article is part of the themed issue ‘Horizons of cybernetical physics’. PMID:28115619

  14. Contextuality as a Resource for Models of Quantum Computation with Qubits

    NASA Astrophysics Data System (ADS)

    Bermejo-Vega, Juan; Delfosse, Nicolas; Browne, Dan E.; Okay, Cihan; Raussendorf, Robert

    2017-09-01

    A central question in quantum computation is to identify the resources that are responsible for quantum speed-up. Quantum contextuality has been recently shown to be a resource for quantum computation with magic states for odd-prime dimensional qudits and two-dimensional systems with real wave functions. The phenomenon of state-independent contextuality poses a priori an obstruction to characterizing the case of regular qubits, the fundamental building block of quantum computation. Here, we establish contextuality of magic states as a necessary resource for a large class of quantum computation schemes on qubits. We illustrate our result with a concrete scheme related to measurement-based quantum computation.

  15. Combined visualization for noise mapping of industrial facilities based on ray-tracing and thin plate splines

    NASA Astrophysics Data System (ADS)

    Ovsiannikov, Mikhail; Ovsiannikov, Sergei

    2017-01-01

    The paper presents a combined approach to mapping and visualizing the noise pollution of industrial facilities using a forward ray-tracing method and thin-plate spline interpolation. It is suggested to cluster the industrial area into separate zones with similar sound levels. An equivalent local source is defined for the range computation of sanitary zones based on a ray-tracing algorithm. Computation of sound pressure levels within the clustered zones is based on two-dimensional spline interpolation of data measured on the perimeter and inside each zone.
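
    A minimal sketch of the interpolation step under stated assumptions: scattered sound-level measurements from the perimeter and interior of one clustered zone are interpolated onto a grid with a thin-plate spline kernel (SciPy's RBFInterpolator); the positions and levels are invented.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical measured sound pressure levels (dB) at scattered points on
# the perimeter and inside one clustered zone.
rng = np.random.default_rng(2)
xy = rng.uniform(0, 100, size=(40, 2))             # measurement positions (m)
spl = 70 + 0.1 * xy[:, 0] + rng.normal(0, 1, 40)   # measured levels (dB)

# Thin-plate spline interpolation over a regular grid covering the zone.
interp = RBFInterpolator(xy, spl, kernel="thin_plate_spline")
gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid_levels = interp(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
print(grid_levels.shape)  # (50, 50) map of interpolated levels
```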

  16. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics, and granular flow.
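
    The framework itself is far larger, but its core scheduling idea, namely farming out many independent, expensive model evaluations to parallel workers, can be sketched as follows; the model, likelihood, and sample set are placeholders, not Π4U code.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def expensive_model(theta):
    """Stand-in for a computationally demanding physical model."""
    return float(np.sum(np.sin(theta) ** 2))

def log_likelihood(theta):
    """Gaussian misfit of the model output against a fixed hypothetical datum."""
    datum, sigma = 0.5, 0.1
    return -0.5 * ((expensive_model(theta) - datum) / sigma) ** 2

if __name__ == "__main__":
    # One TMCMC-like generation of parameter samples, evaluated in parallel.
    samples = np.random.default_rng(7).normal(size=(64, 3))
    with ProcessPoolExecutor() as pool:
        log_likes = list(pool.map(log_likelihood, samples))
    print(max(log_likes))
```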

  17. A fast non-contact imaging photoplethysmography method using a tissue-like model

    NASA Astrophysics Data System (ADS)

    McDuff, Daniel J.; Blackford, Ethan B.; Estepp, Justin R.; Nishidate, Izumi

    2018-02-01

    Imaging photoplethysmography (iPPG) allows non-contact, concomitant measurement and visualization of peripheral blood flow using just an RGB camera. Most iPPG methods require a window of temporal data and complex computation, which makes real-time measurement and spatial visualization impossible. We present a fast, "window-less", non-contact imaging photoplethysmography method, based on a tissue-like model of the skin, that allows accurate measurement of heart rate and heart rate variability parameters. The error in heart rate estimates is equivalent to that of state-of-the-art techniques and the computation is much faster.

  18. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method.

    PubMed

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-07-22

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection, and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods differ in theoretical basis, means of measurement, and causes of measurement error. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with the adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlation among the different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when accompanying merits such as experimental convenience are taken into account.

  19. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method

    PubMed Central

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-01-01

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection, and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods differ in theoretical basis, means of measurement, and causes of measurement error. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with the adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlation among the different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when accompanying merits such as experimental convenience are taken into account. PMID:27445105
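
    The cross-method comparison reported above amounts to correlating porosity estimates for the same samples across techniques; a minimal sketch with invented values:

```python
import numpy as np

# Hypothetical porosity values (%) for the same five sandstone samples,
# as measured by three different methods.
mercury_intrusion = np.array([18.2, 19.1, 20.4, 17.8, 21.0])
helium_injection = np.array([18.9, 19.5, 21.1, 18.3, 21.6])
guided_ct = np.array([18.5, 19.0, 20.8, 18.0, 21.2])

# Pairwise Pearson correlations between the methods (3 x 3 matrix).
print(np.corrcoef([mercury_intrusion, helium_injection, guided_ct]))
```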

  20. Effects of Mobile Phone-Based App Learning Compared to Computer-Based Web Learning on Nursing Students: Pilot Randomized Controlled Trial

    PubMed Central

    2015-01-01

    Objectives This study aimed to determine the effect of mobile-based discussion versus computer-based discussion on self-directed learning readiness, academic motivation, learner-interface interaction, and flow state. Methods This randomized controlled trial was conducted at one university. Eighty-six nursing students who were able to use a computer, had home Internet access, and used a mobile phone were recruited. Participants were randomly assigned to either the mobile phone app-based discussion group (n = 45) or the computer web-based discussion group (n = 41). The effect was measured before and after an online discussion via self-reported surveys that addressed academic motivation, self-directed learning readiness, time distortion, learner-learner interaction, learner-interface interaction, and flow state. Results The change in extrinsic motivation on identified regulation in academic motivation (p = 0.011), as well as in independence and ability to use basic study (p = 0.047) and positive orientation to the future in self-directed learning readiness (p = 0.021), from pre-intervention to post-intervention was significantly more positive in the mobile phone app-based group than in the computer web-based discussion group. Interaction between learner and interface (p = 0.002), having clear goals (p = 0.012), and giving and receiving unambiguous feedback (p = 0.049) in flow state were significantly higher in the mobile phone app-based discussion group than in the computer web-based discussion group at post-test. Conclusions The mobile phone might offer more valuable learning opportunities for discussion teaching and learning methods in terms of self-directed learning readiness, academic motivation, learner-interface interaction, and the flow state of the learning process compared to the computer. PMID:25995965

  1. Computing exact bundle compliance control charts via probability generating functions.

    PubMed

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance to evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of the composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
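
    A minimal sketch of the generating-function idea under stated assumptions: with independent per-item compliance probabilities, the probability generating function of the composite count is the product of per-item factors (1 - p) + p*x, so multiplying the polynomials (a sequence of small convolutions) yields the exact Poisson-binomial PMF. The probabilities below are invented.

```python
import numpy as np

def exact_compliance_pmf(probabilities):
    """Exact PMF of the number of compliant items among independent
    indicators, via the product of per-item PGFs (1 - p) + p*x."""
    pmf = np.array([1.0])
    for p in probabilities:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf  # pmf[k] = P(exactly k items compliant)

pmf = exact_compliance_pmf([0.95, 0.90, 0.85, 0.99])
print(pmf, pmf.sum())  # exact probabilities, including tails; sums to 1
```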

  2. Gate sequence for continuous variable one-way quantum computation

    PubMed Central

    Su, Xiaolong; Hao, Shuhong; Deng, Xiaowei; Ma, Lingyu; Wang, Meihong; Jia, Xiaojun; Xie, Changde; Peng, Kunchi

    2013-01-01

    Measurement-based one-way quantum computation using cluster states as resources provides an efficient model to perform computation and information processing of quantum codes. Arbitrary Gaussian quantum computation can be implemented by sufficiently long single-mode and two-mode gate sequences. However, continuous variable gate sequences have not been realized so far due to the absence of cluster states larger than four submodes. Here we present the first continuous variable gate sequence consisting of a single-mode squeezing gate and a two-mode controlled-phase gate based on a six-mode cluster state. The quantum property of this gate sequence is confirmed by the fidelities and the quantum entanglement of the two output modes, which depend on both the squeezing and controlled-phase gates. The experiment demonstrates the feasibility of implementing Gaussian quantum computation by means of accessible gate sequences.

  3. Computational Power of Symmetry-Protected Topological Phases.

    PubMed

    Stephen, David T; Wang, Dong-Sheng; Prakash, Abhishodh; Wei, Tzu-Chieh; Raussendorf, Robert

    2017-07-07

    We consider ground states of quantum spin chains with symmetry-protected topological (SPT) order as resources for measurement-based quantum computation (MBQC). We show that, for a wide range of SPT phases, the computational power of ground states is uniform throughout each phase. This computational power, defined as the Lie group of executable gates in MBQC, is determined by the same algebraic information that labels the SPT phase itself. We prove that these Lie groups always contain a full set of single-qubit gates, thereby affirming the long-standing conjecture that general SPT phases can serve as computationally useful phases of matter.

  4. Computational Power of Symmetry-Protected Topological Phases

    NASA Astrophysics Data System (ADS)

    Stephen, David T.; Wang, Dong-Sheng; Prakash, Abhishodh; Wei, Tzu-Chieh; Raussendorf, Robert

    2017-07-01

    We consider ground states of quantum spin chains with symmetry-protected topological (SPT) order as resources for measurement-based quantum computation (MBQC). We show that, for a wide range of SPT phases, the computational power of ground states is uniform throughout each phase. This computational power, defined as the Lie group of executable gates in MBQC, is determined by the same algebraic information that labels the SPT phase itself. We prove that these Lie groups always contain a full set of single-qubit gates, thereby affirming the long-standing conjecture that general SPT phases can serve as computationally useful phases of matter.

  5. Waveform inversion with source encoding for breast sound speed reconstruction in ultrasound computed tomography.

    PubMed

    Wang, Kun; Matthews, Thomas; Anis, Fatima; Li, Cuiping; Duric, Neb; Anastasio, Mark A

    2015-03-01

    Ultrasound computed tomography (USCT) holds great promise for improving the detection and management of breast cancer. Because they are based on the acoustic wave equation, waveform inversion-based reconstruction methods can produce images that possess improved spatial resolution properties over those produced by ray-based methods. However, waveform inversion methods are computationally demanding and have not been applied widely in USCT breast imaging. In this work, source encoding concepts are employed to develop an accelerated USCT reconstruction method that circumvents the large computational burden of conventional waveform inversion methods. This method, referred to as the waveform inversion with source encoding (WISE) method, encodes the measurement data using a random encoding vector and determines an estimate of the sound speed distribution by solving a stochastic optimization problem by use of a stochastic gradient descent algorithm. Both computer simulation and experimental phantom studies are conducted to demonstrate the use of the WISE method. The results suggest that the WISE method maintains the high spatial resolution of waveform inversion methods while significantly reducing the computational burden.

  6. Gravity measured at the Apollo 14 landing site.

    PubMed

    Nance, R L

    1971-12-03

    The gravity at the Apollo 14 landing site has been determined from the accelerometer data that were telemetered from the lunar module. The values for the lunar gravity measured at the Apollo 11, 12, and 14 sites were reduced to a common elevation and were then compared between sites. A theoretical gravity, based on the assumption of a spherical moon, was computed for each landing site and compared with the observed value. The observed gravity was also used to compute the lunar radius at each landing site.
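
    A worked version of the "theoretical gravity" computation mentioned above, assuming a spherical Moon and standard values for the lunar gravitational parameter and mean radius (the abstract does not give the paper's own constants):

```python
GM_MOON = 4.9028e12  # m^3/s^2, lunar gravitational parameter (standard value)
R_MOON = 1.7374e6    # m, mean lunar radius (standard value)

def spherical_gravity(radius_m: float) -> float:
    """Surface gravity of a spherical body at the given radius, in m/s^2."""
    return GM_MOON / radius_m ** 2

print(f"theoretical lunar surface gravity: {spherical_gravity(R_MOON):.3f} m/s^2")
# Prints about 1.624 m/s^2; site-specific values differ with the local radius.
```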

  7. A Quantitative Geochemical Target for Modeling the Formation of the Earth and Moon

    NASA Technical Reports Server (NTRS)

    Boyce, Jeremy W.; Barnes, Jessica J.; McCubbin, Francis M.

    2017-01-01

    The past decade has been one of geochemical, isotopic, and computational advances that are bringing the laboratory measurement and computational modeling neighborhoods of the Earth-Moon community into ever closer proximity. We are now, however, in a position to become even better neighbors: modelers can generate testable hypotheses for geochemists, and geochemists can provide quantitative targets for modelers. Here we present a robust example of the latter based on Cl isotope measurements of mare basalts.

  8. Method and apparatus for an optical function generator for seamless tiled displays

    NASA Technical Reports Server (NTRS)

    Johnson, Michael (Inventor); Chen, Chung-Jen (Inventor)

    2004-01-01

    Producing seamless tiled images from multiple displays includes measuring a luminance profile of each of the displays, computing a desired luminance profile for each of the displays, and determining a spatial gradient profile of each of the displays based on the measured luminance profile and the computed desired luminance profile. The determined spatial gradient profile is applied to a spatial filter to be inserted into each of the displays to produce the seamless tiled display image.

  9. Contributed Review: Source-localization algorithms and applications using time of arrival and time difference of arrival measurements

    NASA Astrophysics Data System (ADS)

    Li, Xinya; Deng, Zhiqun Daniel; Rauchenstein, Lynn T.; Carlson, Thomas J.

    2016-04-01

    Locating the position of fixed or mobile sources (i.e., transmitters) based on measurements obtained from sensors (i.e., receivers) is an important research area that is attracting much interest. In this paper, we review several representative localization algorithms that use time of arrival (TOA) and time difference of arrival (TDOA) measurements to achieve high signal source position estimation accuracy when a transmitter is in the line-of-sight of a receiver. Circular (TOA) and hyperbolic (TDOA) position estimation approaches both use nonlinear equations that relate the known locations of receivers to the unknown locations of transmitters. Estimating transmitter locations from the standard nonlinear equations may not be very accurate because of receiver location errors, receiver measurement errors, and the high computational burden of solving the equations. Least squares and maximum likelihood based algorithms have become the most popular computational approaches to transmitter location estimation. In this paper, we summarize the computational characteristics and position estimation accuracies of various positioning algorithms. By improving methods for estimating the time of arrival of transmissions at receivers and the transmitter location estimation algorithms themselves, transmitter location estimation may be applied across a range of applications and technologies such as radar, sonar, the Global Positioning System, wireless sensor networks, underwater animal tracking, mobile communications, and multimedia.
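
    A minimal sketch of circular (TOA) positioning by nonlinear least squares, under stated assumptions: known receiver positions, a known propagation speed (an acoustic value is used here), and synthetic noisy arrival times; all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

SPEED = 343.0  # m/s, propagation speed (acoustic example)

receivers = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
true_position = np.array([30.0, 60.0])

# Synthetic TOAs with a little timing noise.
rng = np.random.default_rng(3)
toas = np.linalg.norm(receivers - true_position, axis=1) / SPEED
toas += rng.normal(0.0, 1e-4, toas.size)

def residuals(position):
    # Circular residuals: predicted ranges minus measured ranges.
    return np.linalg.norm(receivers - position, axis=1) - SPEED * toas

estimate = least_squares(residuals, x0=np.array([50.0, 50.0])).x
print(f"estimated transmitter position: {estimate}")
```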

  10. Developing and Evaluating a Kindergarten to Third Grade CBM Mathematics Assessment

    ERIC Educational Resources Information Center

    Lee, Young-Sun; Lembke, Erica

    2016-01-01

    The present study examined the technical adequacy of a curriculum-based measurement (CBM) measure of early numeracy for kindergarten through third grade students. Our CBM measures were developed to reflect broad and theoretically derived categories of mathematical thinking: quick retrieval, written computation, and number sense. The mastery of these…

  11. Wearable sensors for health monitoring

    NASA Astrophysics Data System (ADS)

    Suciu, George; Butca, Cristina; Ochian, Adelina; Halunga, Simona

    2015-02-01

    In this paper we describe several wearable sensors designed for monitoring the health condition of patients, based on an experimental model. Wearable sensors enable long-term continuous physiological monitoring, which is important for the treatment and management of many chronic illnesses, neurological disorders, and mental health issues. The system is based on a wearable sensor network, which is connected to a computer or smartphone. The wearable sensor network integrates several wearable sensors that can measure different parameters such as body temperature, heart rate, and the quantity of carbon monoxide in the air. After the wearable sensors measure the parameter values, a microprocessor transmits them via Bluetooth to an application developed for the computer or smartphone, where they are interpreted.

  12. Window-based method for approximating the Hausdorff in three-dimensional range imagery

    DOEpatents

    Koch, Mark W [Albuquerque, NM

    2009-06-02

    One approach to pattern recognition is to use a template from a database of objects and match it to a probe image containing the unknown. Accordingly, the Hausdorff distance can be used to measure the similarity of two sets of points. In particular, the Hausdorff can measure the goodness of a match in the presence of occlusion, clutter, and noise. However, existing 3D algorithms for calculating the Hausdorff are computationally intensive, making them impractical for pattern recognition that requires scanning of large databases. The present invention is directed to a new method that can efficiently, in time and memory, compute the Hausdorff for 3D range imagery. The method uses a window-based approach.
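
    For context, the exact symmetric Hausdorff distance between two point sets can be computed as below; this is the straightforward calculation that the patented window-based method approximates more cheaply, and the point clouds here are invented.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(4)
template = rng.uniform(0, 1, size=(500, 3))              # template point cloud
probe = template + rng.normal(0, 0.01, template.shape)   # noisy probe scan

# Symmetric Hausdorff distance: max of the two directed distances.
d = max(directed_hausdorff(template, probe)[0],
        directed_hausdorff(probe, template)[0])
print(f"Hausdorff distance: {d:.4f}")
```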

  13. Adjudicating between face-coding models with individual-face fMRI responses

    PubMed Central

    Kriegeskorte, Nikolaus

    2017-01-01

    The perceptual representation of individual faces is often explained with reference to a norm-based face space. In such spaces, individuals are encoded as vectors where identity is primarily conveyed by direction and distinctiveness by eccentricity. Here we measured human fMRI responses and psychophysical similarity judgments of individual face exemplars, which were generated as realistic 3D animations using a computer-graphics model. We developed and evaluated multiple neurobiologically plausible computational models, each of which predicts a representational distance matrix and a regional-mean activation profile for 24 face stimuli. In the fusiform face area, a face-space coding model with sigmoidal ramp tuning provided a better account of the data than one based on exemplar tuning. However, an image-processing model with weighted banks of Gabor filters performed similarly. Accounting for the data required the inclusion of a measurement-level population averaging mechanism that approximates how fMRI voxels locally average distinct neuronal tunings. Our study demonstrates the importance of comparing multiple models and of modeling the measurement process in computational neuroimaging. PMID:28746335
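
    A minimal sketch of the representational-distance-matrix construction mentioned above, with simulated response patterns standing in for measured fMRI data:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

# Simulated voxel response patterns: 24 face stimuli x 200 voxels.
responses = np.random.default_rng(5).normal(size=(24, 200))

# Representational distance matrix: correlation distance between patterns.
rdm = squareform(pdist(responses, metric="correlation"))
print(rdm.shape)  # (24, 24), symmetric with zero diagonal
```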

  14. Sea surface determination from space: The GSFC geoid

    NASA Technical Reports Server (NTRS)

    Vonbun, F. O.; Mcgoogan, J.; Marsh, J.; Lerch, F. J.

    1975-01-01

    The determination of the sea surface/geoid and its relative variation were investigated, and results of the altimeter experiment on Skylab to test the geoid are discussed. The spaceborne altimeter on Skylab revealed that the sea surface of the world's oceans can be measured with an accuracy in the meter range. Surface variations are discussed as they relate to those computed from satellite orbital dynamics and ground based gravity data. The GSFC geoid was constructed from about 400,000 satellite tracking data points (range, range rate, angles) and about 20,000 ground gravity observations. One of the last experiments on Skylab was to measure and/or test this geoid over almost one orbit. It was found that the computed water surface deviates by 5 to 20 m from the measured one. Further outlined is the influence of orbital errors on the sea surface, and numerical examples are given based upon real tracking data. Orbital height error estimates were computed for geodetic-type satellites and are found to be on the order of 0.2 to 5 meters.

  15. A Computationally-Efficient Inverse Approach to Probabilistic Strain-Based Damage Diagnosis

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.; Leser, William P.; Leser, Patrick E.; Newman, John A.

    2016-01-01

    This work presents a computationally-efficient inverse approach to probabilistic damage diagnosis. Given strain data at a limited number of measurement locations, Bayesian inference and Markov Chain Monte Carlo (MCMC) sampling are used to estimate probability distributions of the unknown location, size, and orientation of damage. Substantial computational speedup is obtained by replacing a three-dimensional finite element (FE) model with an efficient surrogate model. The approach is experimentally validated on cracked test specimens where full field strains are determined using digital image correlation (DIC). Access to full field DIC data allows for testing of different hypothetical sensor arrangements, facilitating the study of strain-based diagnosis effectiveness as the distance between damage and measurement locations increases. The ability of the framework to effectively perform both probabilistic damage localization and characterization in cracked plates is demonstrated and the impact of measurement location on uncertainty in the predictions is shown. Furthermore, the analysis time to produce these predictions is orders of magnitude less than a baseline Bayesian approach with the FE method by utilizing surrogate modeling and effective numerical sampling approaches.
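
    A minimal sketch of the Bayesian machinery described above, under heavy simplification: a toy function stands in for the trained surrogate model, and random-walk Metropolis stands in for the paper's MCMC scheme; every function, bound, and number is illustrative.

```python
import numpy as np

def surrogate_strain(params, sensor_xy):
    """Toy stand-in for a trained surrogate: predicted strains at the sensor
    locations for damage parameters (x, y, size)."""
    x, y, size = params
    r = np.linalg.norm(sensor_xy - np.array([x, y]), axis=1)
    return size / (1.0 + r ** 2)

def log_posterior(params, sensor_xy, measured, sigma=1e-3):
    # Uniform prior bounds on damage location (unit plate) and size.
    if not (0 <= params[0] <= 1 and 0 <= params[1] <= 1 and 0 < params[2] < 1):
        return -np.inf
    residual = measured - surrogate_strain(params, sensor_xy)
    return -0.5 * np.sum(residual ** 2) / sigma ** 2

rng = np.random.default_rng(6)
sensors = rng.uniform(0.0, 1.0, size=(8, 2))
observed = surrogate_strain((0.4, 0.6, 0.1), sensors) + rng.normal(0, 1e-3, 8)

# Random-walk Metropolis over damage location and size.
current = np.array([0.5, 0.5, 0.05])
log_p = log_posterior(current, sensors, observed)
chain = []
for _ in range(5000):
    proposal = current + rng.normal(0.0, 0.02, size=3)
    log_p_new = log_posterior(proposal, sensors, observed)
    if np.log(rng.uniform()) < log_p_new - log_p:
        current, log_p = proposal, log_p_new
    chain.append(current.copy())
print(np.mean(chain[1000:], axis=0))  # posterior mean after burn-in
```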

  16. Algorithm for fast event parameters estimation on GEM acquired data

    NASA Astrophysics Data System (ADS)

    Linczuk, Paweł; Krawczyk, Rafał D.; Poźniak, Krzysztof T.; Kasprowicz, Grzegorz; Wojeński, Andrzej; Chernyshova, Maryna; Czarski, Tomasz

    2016-09-01

    We present a study of a software-hardware environment for developing fast, high-throughput, low-latency computation methods that can be used as the back end in High Energy Physics (HEP) and other High Performance Computing (HPC) systems fed by a large volume of input from electronic sensor-based front ends. Parallelization possibilities are discussed and tested on Intel HPC solutions, with consideration of applications to the Gas Electron Multiplier (GEM) measurement systems presented in this paper.

  17. Novel method for measuring a dense 3D strain map of robotic flapping wings

    NASA Astrophysics Data System (ADS)

    Li, Beiwen; Zhang, Song

    2018-04-01

    Measuring dense 3D strain maps of the inextensible membranous flapping wings of robots is of vital importance to the field of bio-inspired engineering. Conventional high-speed 3D videography methods typically reconstruct the wing geometries through measuring sparse points with fiducial markers, and thus cannot obtain the full-field mechanics of the wings in detail. In this research, we propose a novel system to measure a dense strain map of inextensible membranous flapping wings by developing a superfast 3D imaging system and a computational framework for strain analysis. Specifically, first we developed a 5000 Hz 3D imaging system based on the digital fringe projection technique using the defocused binary patterns to precisely measure the dynamic 3D geometries of rapidly flapping wings. Then, we developed a geometry-based algorithm to perform point tracking on the precisely measured 3D surface data. Finally, we developed a dense strain computational method using the Kirchhoff-Love shell theory. Experiments demonstrate that our method can effectively perform point tracking and measure a highly dense strain map of the wings without many fiducial markers.

  18. Mission-based Scenario Research: Experimental Design And Analysis

    DTIC Science & Technology

    2012-01-01

    This report concerns Brain-Computer Interaction Technologies (BCIT), a class of neurotechnologies that aim to improve task performance by incorporating measures of brain activity to optimize the interactions between operator and system. Subject terms: neuroimaging, EEG, task loading, neurotechnologies, ground... Introduction: Imagine a system that can identify operator fatigue during a long-term...

  19. Technology. The Hot Cup Caper. Probing for Scientific Knowledge.

    ERIC Educational Resources Information Center

    Ramondetta, June

    1994-01-01

    Students can explore temperature and heat conductivity by examining materials that make good cups for hot cocoa. Using temperature probes from computer-based science packages, students can measure gradual change in the liquid's temperature, watch as data are plotted on the computer, and explain why they chose a specific material. (SM)

  20. A Diagnostic Study of Computer Application of Structural Communication Grid

    ERIC Educational Resources Information Center

    Bahar, Mehmet; Aydin, Fatih; Karakirik, Erol

    2009-01-01

    In this article, structural communication grid (SCG), an alternative measurement and evaluation technique, is first summarised, and the design, development, and implementation of a computer-based SCG system are introduced. The system is then tested on a sample of 154 participants consisting of candidate students, science teachers and…

  1. A computer search for asteroid families

    NASA Technical Reports Server (NTRS)

    Lindblad, Bertil A.

    1992-01-01

    The improved proper elements of 4100 numbered asteroids have been searched for clusterings in a, e, i space using a computer technique based on the D-criterion. A list of 14 dynamical families each with more than 15 members is presented. Quantitative measurements of the density and dimensions in phase space of each family are presented.
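
    For illustration only, a hypothetical distance in proper-element (a, e, sin i) space of the general kind used for family identification; the weights and functional form here are invented for the sketch and are not the paper's D-criterion.

```python
import numpy as np

def proper_element_distance(el1, el2, weights=(1.25, 2.0, 2.0)):
    """Hypothetical dissimilarity between two asteroids given proper
    elements (a, e, sin i); small values suggest family membership."""
    a1, e1, si1 = el1
    a2, e2, si2 = el2
    k1, k2, k3 = weights
    da = (a1 - a2) / ((a1 + a2) / 2.0)  # fractional difference in a
    return np.sqrt(k1 * da ** 2 + k2 * (e1 - e2) ** 2 + k3 * (si1 - si2) ** 2)

# Two hypothetical Koronis-like orbits with nearly identical proper elements.
print(proper_element_distance((2.87, 0.05, 0.037), (2.88, 0.048, 0.036)))
```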

  2. Differences in the Mathematics-Vocabulary Knowledge of Fifth-Grade Students with and without Learning Difficulties

    ERIC Educational Resources Information Center

    Forsyth, Suzanne R.; Powell, Sarah R.

    2017-01-01

    The purpose of this pilot study was to explore the impact of mathematics and reading learning difficulties on the mathematics-vocabulary understanding of fifth-grade students. Students (n = 114) completed three measures: mathematics computation, general vocabulary, and mathematics vocabulary. Based on performance on the mathematics computation and…

  3. Superresolution radar imaging based on fast inverse-free sparse Bayesian learning for multiple measurement vectors

    NASA Astrophysics Data System (ADS)

    He, Xingyu; Tong, Ningning; Hu, Xiaowei

    2018-01-01

    Compressive sensing has been successfully applied to inverse synthetic aperture radar (ISAR) imaging of moving targets. By exploiting the block-sparse structure of the target image, sparse solutions for multiple measurement vectors (MMV) can be applied in ISAR imaging, and a substantial performance improvement can be achieved. As an effective sparse recovery method, sparse Bayesian learning (SBL) for MMV involves a matrix inverse at each iteration, so its computational complexity grows significantly with the problem size. To address this problem, we develop a fast inverse-free (IF) SBL method for MMV. A relaxed evidence lower bound (ELBO), which is computationally more amenable than the traditional ELBO used by SBL, is obtained by invoking a fundamental property of smooth functions. A variational expectation-maximization scheme is then employed to maximize the relaxed ELBO, and a computationally efficient IF-MSBL algorithm is proposed. Numerical results based on simulated and real data show that the proposed method can reconstruct row-sparse signals accurately and obtain clear superresolution ISAR images. Moreover, the running time and computational complexity are reduced to a great extent compared with traditional SBL methods.

  4. An Evaluation of the Scattering Law for Light and Heavy Water in ENDF-6 Format, Based on Experimental Data and Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Márquez Damián, J. I.; Granada, J. R.; Malaspina, D. C.

    2014-04-01

    In this work we present an evaluation in ENDF-6 format of the scattering law for light and heavy water, computed using the LEAPR module of NJOY99. The models used in this evaluation are based on experimental data on light water dynamics measured by Novikov, partial structure factors obtained by Soper, and molecular dynamics calculations performed with GROMACS using a reparameterized version of the flexible SPC model by Toukan and Rahman. The models use the Egelstaff-Schofield diffusion equation for translational motion, and a continuous spectrum calculated from the velocity autocorrelation function computed with GROMACS. The scattering law for H in H2O is computed using the incoherent approximation, and the scattering laws for D and O in D2O are computed using the Sköld approximation for coherent scattering. The calculations show significant improvement over ENDF/B-VI and ENDF/B-VII when compared with measurements of the total cross section, differential scattering experiments, and quasi-elastic neutron scattering experiments (QENS).

  5. Iterative Refinement of a Binding Pocket Model: Active Computational Steering of Lead Optimization

    PubMed Central

    2012-01-01

    Computational approaches for binding affinity prediction are most frequently demonstrated through cross-validation within a series of molecules or through performance shown on a blinded test set. Here, we show how such a system performs in an iterative, temporal lead optimization exercise. A series of gyrase inhibitors with known synthetic order formed the set of molecules that could be selected for “synthesis.” Beginning with a small number of molecules, based only on structures and activities, a model was constructed. Compound selection was done computationally, each time making five selections based on confident predictions of high activity and five selections based on a quantitative measure of three-dimensional structural novelty. Compound selection was followed by model refinement using the new data. Iterative computational candidate selection produced rapid improvements in selected compound activity, and incorporation of explicitly novel compounds uncovered much more diverse active inhibitors than strategies lacking active novelty selection. PMID:23046104

  6. Pilot of a computer-based brief multiple-health behavior intervention for college students.

    PubMed

    Moore, Michele J; Werch, Chudley E; Bian, Hui

    2012-01-01

    Given the documented multiple health risks college students engage in, and the dearth of effective programs addressing them, the authors developed a computer-based brief multiple-health behavior intervention. This study reports immediate outcomes and feasibility of a pilot of this program. Two hundred students attending a midsized university participated. Participants were randomly assigned to the intervention or control program, both delivered via computer. Immediate feedback was collected with the computer program. Results indicate that the intervention had an early positive impact on alcohol and cigarette use intentions, as well as related constructs underlying the Behavior-Image Model specific to each of the 3 substances measured. Based on the implementation process, the program proved to be feasible to use and acceptable to the population. Results support the potential efficacy of the intervention to positively impact behavioral intentions and linkages between health promoting and damaging behaviors among college students.

  7. Engineering perceptions of female and male K-12 students: effects of a multimedia overview on elementary, middle-, and high-school students

    NASA Astrophysics Data System (ADS)

    Johnson, Amy M.; Ozogul, Gamze; DiDonato, Matt D.; Reisslein, Martin

    2013-10-01

    Computer-based multimedia presentations employing animated agents (avatars) can positively impact perceptions about engineering; the current research advances our understanding of this effect to pre-college populations, the main target for engineering outreach. The study examines the effectiveness of a brief computer-based intervention with animated agents in improving perceptions about engineering. Five hundred sixty-five elementary, middle-, and high-school students in the southwestern USA viewed a short computer-based multimedia overview of four engineering disciplines (electrical, chemical, biomedical, and environmental) with embedded animated agents. Students completed identical surveys measuring five subscales of engineering perceptions immediately before and after the intervention. Analyses of pre- and post-surveys demonstrated that the computer presentation significantly improved perceptions for each student group, and that effects were stronger for elementary school students, compared to middle- and high-school students.

  8. Initial Progress Toward Development of a Voice-Based Computer-Delivered Motivational Intervention for Heavy Drinking College Students: An Experimental Study

    PubMed Central

    Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L

    2017-01-01

    Background Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow. Objective The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users’ verbal responses, more closely mirroring a human-delivered motivational intervention. Methods We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up. Results Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking. Conclusions Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may mirror more closely human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered intervention, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. PMID:28659259

  9. TBIdoc: 3D content-based CT image retrieval system for traumatic brain injury

    NASA Astrophysics Data System (ADS)

    Li, Shimiao; Gong, Tianxia; Wang, Jie; Liu, Ruizhe; Tan, Chew Lim; Leong, Tze Yun; Pang, Boon Chuan; Lim, C. C. Tchoyoson; Lee, Cheng Kiang; Tian, Qi; Zhang, Zhuo

    2010-03-01

    Traumatic brain injury (TBI) is a major cause of death and disability. Computed Tomography (CT) scanning is widely used in the diagnosis of TBI. Nowadays, a large amount of TBI CT data is stored in hospital radiology departments. Such data and the associated patient information contain valuable information for clinical diagnosis and outcome prediction. However, current hospital database systems do not provide an efficient and intuitive tool for doctors to search out cases relevant to the current study case. In this paper, we present the TBIdoc system: a content-based image retrieval (CBIR) system which works on TBI CT images. In this web-based system, the user can query by uploading CT image slices from one study; the retrieval result is a list of TBI cases ranked according to their 3D visual similarity to the query case. Specifically, cases of TBI CT images often present diffuse or focal lesions. In the TBIdoc system, these pathological image features are represented as bin-based binary feature vectors. We use the Jaccard-Needham measure as the similarity measurement. Based on these, we propose a 3D similarity measure for computing the similarity score between two series of CT slices. nDCG is used to evaluate the system performance, which shows that the system produces satisfactory retrieval results. The system is expected to improve the current hospital data management in TBI and to give better support to the clinical decision-making process. It may also contribute to computer-aided education in TBI.
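
    A minimal sketch of the slice-level similarity measure named above, assuming two bin-based binary feature vectors; SciPy's jaccard returns the dissimilarity, so the similarity is its complement. The vectors are invented.

```python
import numpy as np
from scipy.spatial.distance import jaccard

# Hypothetical bin-based binary feature vectors for two CT slices.
a = np.array([1, 0, 1, 1, 0, 0, 1], dtype=bool)
b = np.array([1, 1, 1, 0, 0, 0, 1], dtype=bool)

similarity = 1.0 - jaccard(a, b)  # complement of the Jaccard dissimilarity
print(f"Jaccard similarity: {similarity:.3f}")
```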

  10. Activities of Combined Sewer Overflows: A Comparison of Measured and Computed Data

    NASA Astrophysics Data System (ADS)

    Ostrowski, M. W.; Koch, J.; Wetzstein, A.

    In order to relieve sewerage systems of excess stormwater during heavy rainfall, overflow structures are necessary for the safe operation of urban drainage and wastewater treatment facilities. Overflow tanks have storage effects, while pure overflows divide the discharges and route the excess water into the next watercourse. The outflows from combined sewer overflows can evoke significant effects on the receiving waters. Hydraulic effects ("hydraulic stress") result from the additional discharges, which are generally introduced at a single point. Toxic effects are caused by the pollutant load of the decanted discharges. In awareness of these effects, an immission-based consideration is required. The lack of reliable, measurement-based data is obvious, although the generally accepted necessity of such data is noted in recent research projects and regulations of public authorities. An immission-based view necessitates data regarding the amount, number, and duration of the overflows. Particularly with regard to the storm overflows, this data is mostly obtained by means of computational simulations. The lack of measured data is the consequence of the adverse conditions in sewer pipes and the complex hydraulic situation at the overflow structures. Reliable data is necessary for the verification, the validation, and the improvement of hydrological models. Within the scope of a research project carried out in the section for Hydrology and Water Management of the Technical University of Darmstadt, a storm overflow was equipped with measuring devices. The aims of the investigations were to discover the limiting boundary conditions in measuring sewer discharges and to record reliable data concerning the overflow activities of the observed structure. The measured data were to be compared with the results of the model SMUSI, which is an evaluation model of the public authorities in the federal state of Hesse, Germany. It is the objective of the presentation to specify the implementation and the performance of the measurement site, describe the processing and evaluation of the measured data, compare the measured data with the computed data based on SMUSI simulations, and discuss the opportunities and limits of measuring in urban hydrological systems with regard to new approaches (measuring, modelling, managing) as well as the attainable accuracies of measurement.

  11. An FPGA-based High Speed Parallel Signal Processing System for Adaptive Optics Testbed

    NASA Astrophysics Data System (ADS)

    Kim, H.; Choi, Y.; Yang, Y.

    In this paper a state-of-the-art FPGA (Field Programmable Gate Array) based high-speed parallel signal processing system (SPS) for an adaptive optics (AO) testbed with a 1 kHz wavefront error (WFE) correction frequency is reported. The AO system consists of a Shack-Hartmann sensor (SHS) and deformable mirror (DM), a tip-tilt sensor (TTS), a tip-tilt mirror (TTM), and an FPGA-based high-performance SPS to correct wavefront aberrations. The SHS is composed of 400 subapertures and the DM of 277 actuators with Fried geometry, requiring an SPS with high-speed parallel computing capability. In this study, the target WFE correction speed is 1 kHz; this requires massive parallel computing capabilities as well as strict hard real-time constraints on measurements from sensors, on the matrix computation latency for correction algorithms, and on the output of control signals for actuators. In order to meet these requirements, an FPGA-based real-time SPS with parallel computing capabilities is proposed. In particular, the SPS is made up of a National Instruments (NI) real-time computer and five FPGA boards based on the state-of-the-art Xilinx Kintex 7 FPGA. Programming is done in NI's LabView environment, providing flexibility when applying different algorithms for WFE correction. It also provides a faster programming and debugging environment compared to conventional ones. One of the five FPGAs is assigned to measure the TTS and calculate control signals for the TTM, while the remaining four are used to receive the SHS signal, calculate slopes for each subaperture, and compute correction signals for the DM. With these parallel processing capabilities of the SPS, an overall closed-loop WFE correction speed of 1 kHz has been achieved. System requirements, architecture, and implementation issues are described; furthermore, experimental results are also given.

  12. System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program is reported. The development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes is described. The space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A Research Computer System consisting of three primary computers (HP-1000F, Perkin-Elmer 3250, and Harris/6) was developed, which provides a wide range of capabilities for interactively processing and displaying large volumes of remote sensing data. The development of a MASS database management and analysis system on the HP-1000F computer, and the extension of these capabilities through integration with the Perkin-Elmer and Harris/6 computers using MSFC's Apple III microcomputer workstations, are described. The objectives are to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.

  13. Quantum computational universality of the Cai-Miyake-Dür-Briegel two-dimensional quantum state from Affleck-Kennedy-Lieb-Tasaki quasichains

    NASA Astrophysics Data System (ADS)

    Wei, Tzu-Chieh; Raussendorf, Robert; Kwek, Leong Chuan

    2011-10-01

    Universal quantum computation can be achieved by simply performing single-qubit measurements on a highly entangled resource state, such as cluster states. Cai, Miyake, Dür, and Briegel recently constructed a ground state of a two-dimensional quantum magnet by combining multiple Affleck-Kennedy-Lieb-Tasaki quasichains of mixed spin-3/2 and spin-1/2 entities and by mapping pairs of neighboring spin-1/2 particles to individual spin-3/2 particles [Phys. Rev. A 82, 052309 (2010)]. They showed that this state enables universal quantum computation by single-spin measurements. Here, we give an alternative understanding of how this state gives rise to universal measurement-based quantum computation: by local operations, each quasichain can be converted to a one-dimensional cluster state and entangling gates between two neighboring logical qubits can be implemented by single-spin measurements. We further argue that a two-dimensional cluster state can be distilled from the Cai-Miyake-Dür-Briegel state.

  14. Computer-Based Internet-Hosted Assessment of L2 Literacy: Computerizing and Administering of the Oxford Quick Placement Test in ExamView and Moodle

    NASA Astrophysics Data System (ADS)

    Meurant, Robert C.

    Sorting of Korean English-as-a-Foreign-Language (EFL) university students by Second Language (L2) aptitude allocates students to classes of compatible ability level, and was here used to screen candidates for interview. Paper-and-pen versions of the Oxford Quick Placement Test were adapted to computer-based testing via online hosting using FSCreations' ExamView. Problems with their online hosting site led to conversion to the popular computer-based learning management system Moodle, hosted on www.ninehub.com. A total of 317 sophomores were tested online to encourage L2 digital literacy. Strategies for effective hybrid implementation of Learning Management Systems in L2 tertiary education include computer-based, Internet-hosted L2 aptitude tests. These potentially provide a convenient measure of student progress in developing L2 fluency, and offer a more objective and relevant means of teacher and course assessment than student evaluations, which tend to confuse entertainment value and teacher popularity with academic credibility and pedagogical effectiveness.

  15. Using a Concept-Grounded, Curriculum-Based Measure in Mathematics To Predict Statewide Test Scores for Middle School Students with LD.

    ERIC Educational Resources Information Center

    Helwig, Robert; Anderson, Lisbeth; Tindal, Gerald

    2002-01-01

    An 11-item math concept curriculum-based measure (CBM) was administered to 171 eighth grade students. Scores were correlated with scores from a computer adaptive test designed in conjunction with the state to approximate the official statewide mathematics achievement tests. Correlations for general education students and students with learning…

  16. A General Approach to Measuring Test-Taking Effort on Computer-Based Tests

    ERIC Educational Resources Information Center

    Wise, Steven L.; Gao, Lingyun

    2017-01-01

    There has been an increased interest in the impact of unmotivated test taking on test performance and score validity. This has led to the development of new ways of measuring test-taking effort based on item response time. In particular, Response Time Effort (RTE) has been shown to provide an assessment of effort down to the level of individual…

  17. Progress Monitoring Instrument Development: Silent Reading Fluency, Vocabulary, and Reading Comprehension. Technical Report #1110

    ERIC Educational Resources Information Center

    Nese, Joseph F. T.; Anderson, Daniel; Hoelscher, Kyle; Tindal, Gerald; Alonzo, Julie

    2011-01-01

    Curriculum-based measurement (CBM) is designed to measure students' academic status and growth so the effectiveness of instruction may be evaluated. In the most popular forms of reading CBM, the student's oral reading fluency is assessed. This behavior is difficult to sample in a computer-based format, a limitation that may be a function of the…

  18. On the development of a computer-based handwriting assessment tool to objectively quantify handwriting proficiency in children.

    PubMed

    Falk, Tiago H; Tam, Cynthia; Schellnus, Heidi; Chau, Tom

    2011-12-01

    Standardized writing assessments such as the Minnesota Handwriting Assessment (MHA) can inform interventions for handwriting difficulties, which are prevalent among school-aged children. However, these tests usually involve the laborious task of subjectively rating the legibility of the written product, precluding their practical use in some clinical and educational settings. This study describes a portable computer-based handwriting assessment tool to objectively measure MHA quality scores and to detect handwriting difficulties in children. Several measures are proposed based on spatial, temporal, and grip force measurements obtained from a custom-built handwriting instrument. Thirty-five first and second grade students participated in the study, nine of whom exhibited handwriting difficulties. Students performed the MHA test and were subjectively scored based on speed and handwriting quality using five primitives: legibility, form, alignment, size, and space. Several spatial parameters are shown to correlate significantly (p<0.001) with subjective scores obtained for alignment, size, space, and form. Grip force and temporal measures, in turn, serve as useful indicators of handwriting legibility and speed, respectively. Using only size and space parameters, promising discrimination between proficient and non-proficient handwriting can be achieved. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. Subpixel displacement measurement method based on the combination of particle swarm optimization and gradient algorithm

    NASA Astrophysics Data System (ADS)

    Guang, Chen; Qibo, Feng; Keqin, Ding; Zhan, Gao

    2017-10-01

    A subpixel displacement measurement method based on the combination of particle swarm optimization (PSO) and the gradient algorithm (GA) was proposed to optimize the accuracy and speed of GA, yielding a subpixel displacement measurement method better suited to engineering practice. An initial integer-pixel value is obtained using the global searching ability of PSO, and gradient operators are then adopted for the subpixel displacement search. A comparison was made between this method and GA using simulated speckle images and rigid-body displacements of metal specimens. The results showed that the computational accuracy of the combined PSO-GA method reached 0.1 pixel in the simulated speckle images, and even 0.01 pixel in the metal specimens. The computational efficiency and noise robustness of the improved method were also markedly enhanced.
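
    A minimal Python sketch of the two-stage idea: a PSO searches integer displacements that maximize the zero-normalized cross-correlation (ZNCC) of a reference patch, and the integer peak is then refined to subpixel accuracy. The refinement shown here is a simple parabolic fit of the correlation peak, used as a stand-in for the paper's gradient-based step; all parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def zncc(a, b):
        """Zero-normalized cross-correlation of two equal-size patches."""
        a = a - a.mean()
        b = b - b.mean()
        return (a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12)

    def pso_integer_shift(ref, cur, y0, x0, h, w, n_particles=20, iters=30, seed=0):
        """Coarse stage: PSO over integer displacements maximizing ZNCC."""
        rng = np.random.default_rng(seed)
        tmpl = ref[y0:y0 + h, x0:x0 + w]
        lim = min(y0, x0, cur.shape[0] - y0 - h, cur.shape[1] - x0 - w) - 1

        def score(d):
            dy, dx = int(round(d[0])), int(round(d[1]))
            return zncc(tmpl, cur[y0 + dy:y0 + dy + h, x0 + dx:x0 + dx + w])

        pos = rng.uniform(-lim, lim, (n_particles, 2))
        vel = rng.uniform(-1.0, 1.0, (n_particles, 2))
        pbest, pbest_val = pos.copy(), np.array([score(p) for p in pos])
        gbest = pbest[pbest_val.argmax()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, 1))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, -lim, lim)
            vals = np.array([score(p) for p in pos])
            better = vals > pbest_val
            pbest[better], pbest_val[better] = pos[better], vals[better]
            gbest = pbest[pbest_val.argmax()].copy()
        return int(round(gbest[0])), int(round(gbest[1]))

    def refine_subpixel(ref, cur, y0, x0, h, w, dy, dx):
        """Fine stage: parabolic fit of the ZNCC peak around the integer shift."""
        tmpl = ref[y0:y0 + h, x0:x0 + w]
        def c(ddy, ddx):
            return zncc(tmpl, cur[y0 + dy + ddy:y0 + dy + ddy + h,
                                  x0 + dx + ddx:x0 + dx + ddx + w])
        c0 = c(0, 0)
        sy = 0.5 * (c(-1, 0) - c(1, 0)) / (c(-1, 0) - 2 * c0 + c(1, 0) - 1e-12)
        sx = 0.5 * (c(0, -1) - c(0, 1)) / (c(0, -1) - 2 * c0 + c(0, 1) - 1e-12)
        return dy + sy, dx + sx
    ```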

  20. Glossiness of Colored Papers based on Computer Graphics Model and Its Measuring Method

    NASA Astrophysics Data System (ADS)

    Aida, Teizo

    In the case of colored papers, the surface color strongly affects the perceived gloss of the paper. A new glossiness measure for such colored papers is suggested in this paper. First, using achromatic and chromatic Munsell colored chips, the author obtained experimental equations representing the relation between lightness V (or V and saturation C) and the psychological glossiness Gph of these chips. The author then defined a new glossiness G for colored papers, based on the above experimental equations for Gph and on the Cook-Torrance reflection model, which is widely used in the field of computer graphics. This new glossiness is shown to be nearly proportional to the psychological glossiness Gph. A measuring system for the new glossiness G is furthermore described. The measuring time for one specimen is within 1 minute.
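
    Since the paper's fitted constants for Gph are not reproduced in the abstract, the following Python sketch only illustrates the structure of such a measure: a Cook-Torrance specular term (in its common modern form) scaled by a hypothetical empirical correction in lightness V and saturation C. All coefficients are placeholders, not the author's values.

    ```python
    import numpy as np

    def cook_torrance_specular(n_l, n_v, n_h, v_h, m=0.3, f0=0.04):
        """Cook-Torrance specular term: Beckmann distribution D, Schlick
        Fresnel F, and the standard geometric attenuation G.  Inputs are the
        usual dot products between normal, light, view, and half vectors."""
        d = np.exp((n_h**2 - 1.0) / (m**2 * n_h**2)) / (np.pi * m**2 * n_h**4)
        f = f0 + (1.0 - f0) * (1.0 - v_h) ** 5
        g = min(1.0, 2 * n_h * n_v / v_h, 2 * n_h * n_l / v_h)
        return d * f * g / (4.0 * n_l * n_v)

    def glossiness(specular_peak, V, C, a=1.0, b=0.05, c=0.02):
        """Hypothetical form of the combined measure: the specular peak from
        the reflection model, corrected by an empirical function of Munsell
        lightness V and saturation C (coefficients a, b, c are placeholders)."""
        return specular_peak * (a - b * V - c * C)
    ```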

  1. A vessel length-based method to compute coronary fractional flow reserve from optical coherence tomography images.

    PubMed

    Lee, Kyung Eun; Lee, Seo Ho; Shin, Eun-Seok; Shim, Eun Bo

    2017-06-26

    Hemodynamic simulation for quantifying fractional flow reserve (FFR) is often performed in a patient-specific geometry of coronary arteries reconstructed from images from various imaging modalities. Because optical coherence tomography (OCT) images can provide more precise vascular lumen geometry, regardless of stenotic severity, hemodynamic simulation based on OCT images may be effective. The aim of this study is to perform OCT-FFR simulations by coupling a three-dimensional (3D) computational fluid dynamics (CFD) model built from geometrically correct OCT images with a lumped parameter model (LPM) based on vessel lengths extracted from coronary X-ray angiography (CAG) data, and to validate the method clinically. To simulate coronary hemodynamics, we developed a fast and accurate method that combines the CFD model of an OCT-based region of interest (ROI) with an LPM of the coronary microvasculature and veins, the LPM being based on vessel lengths extracted from the CAG images. Based on this vessel length-based approach, we describe a theoretical formulation for the total resistance of the LPM from the 3D CFD model of the ROI. To show the utility of this method, we present calculated examples of FFR from OCT images. To validate the OCT-based FFR calculation (OCT-FFR) clinically, we compared the computed OCT-FFR values for 17 vessels of 13 patients with clinically measured FFR (M-FFR) values. A novel formulation for the total resistance of the LPM is introduced to accurately simulate the 3D CFD model of the ROI. The simulated FFR values compared well with clinically measured ones, showing the accuracy of the method. Moreover, the present method is computationally fast, enabling solutions to be obtained within the hospital.
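
    A toy numerical sketch of the two ingredients named above, under stated assumptions: the FFR value itself is the ratio of mean distal to mean aortic pressure taken from the coupled simulation, and the microvascular resistances at the outlets are distributed so that each outlet's flow demand scales with the vessel length it subtends (an illustrative assumption; the paper's actual formulation is not reproduced here).

    ```python
    def outlet_resistances(total_resistance, branch_lengths):
        """Split a total microvascular resistance over outlet branches.
        Assumption: outlet flow demand is proportional to subtended vessel
        length, so each outlet resistance is inversely proportional to that
        length; the parallel combination recovers total_resistance."""
        total_len = float(sum(branch_lengths))
        return [total_resistance * total_len / l for l in branch_lengths]

    def ffr(mean_distal_pressure, mean_aortic_pressure):
        """FFR: mean distal coronary pressure over mean aortic pressure,
        both taken from the hyperemic CFD-LPM simulation."""
        return mean_distal_pressure / mean_aortic_pressure

    # Example: 80 mmHg distal vs 100 mmHg aortic gives FFR = 0.80,
    # at the conventional ischemia cut-off.
    print(ffr(80.0, 100.0))
    ```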

  2. Automatically-computed prehospital severity scores are equivalent to scores based on medic documentation.

    PubMed

    Reisner, Andrew T; Chen, Liangyou; McKenna, Thomas M; Reifman, Jaques

    2008-10-01

    Prehospital severity scores can be used in routine prehospital care, mass casualty care, and military triage. If computers could reliably calculate clinical scores, new clinical and research methodologies would be possible. One obstacle is that vital signs measured automatically can be unreliable. We hypothesized that Signal Quality Indices (SQIs), computer algorithms that differentiate between reliable and unreliable monitored physiologic data, could improve the predictive power of computer-calculated scores. In a retrospective analysis of trauma casualties transported by air ambulance, we computed the Triage Revised Trauma Score (RTS) from archived travel monitor data. We compared the areas under the curve (AUCs) of receiver operating characteristic curves for prediction of mortality and red blood cell transfusion for 187 subjects with comparable quantities of good-quality and poor-quality data. Vital signs deemed reliable by the SQIs led to significantly more discriminatory severity scores than vital signs deemed unreliable. We also compared automatically computed RTS (using the SQIs) with RTS computed from vital signs documented by medics. For the subjects in whom the SQI algorithms identified 15 consecutive seconds of reliable vital signs data (n = 350), the AUCs of the automatically computed scores were the same as those of the medic-based scores. Using the Prehospital Index in place of RTS led to very similar results, corroborating our findings. SQI algorithms improve automatically computed severity scores, and automatically computed scores using SQIs are equivalent to medic-based scores.
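
    For reference, the Triage RTS aggregated here is the sum of three coded vital signs, each mapped to 0-4, giving a 0-12 scale. The Python sketch below implements that standard coding; the SQI gating of the input data, which is the paper's contribution, is omitted.

    ```python
    def coded(value, bins):
        """Map a vital sign to its 0-4 code; bins are (lower bound, code)
        pairs checked from highest to lowest."""
        for lo, code in bins:
            if value >= lo:
                return code
        return 0

    def triage_rts(gcs, sbp, rr):
        """Triage Revised Trauma Score (0-12) from Glasgow Coma Scale,
        systolic blood pressure (mmHg), and respiratory rate (breaths/min)."""
        gcs_c = coded(gcs, [(13, 4), (9, 3), (6, 2), (4, 1)])
        sbp_c = coded(sbp, [(90, 4), (76, 3), (50, 2), (1, 1)])
        rr_c = 3 if rr > 29 else coded(rr, [(10, 4), (6, 2), (1, 1)])
        return gcs_c + sbp_c + rr_c

    print(triage_rts(gcs=15, sbp=120, rr=16))  # healthy vitals -> 12
    ```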

  3. Computer Assisted Thermography And Its Application In Ovulation Detection

    NASA Astrophysics Data System (ADS)

    Rao, K. H.; Shah, A. V.

    1984-08-01

    Hardware and software of a computer-assisted image analyzing system used for infrared images in medical applications are discussed. The application of computer-assisted thermography (CAT) as a complementary diagnostic tool in centralized diagnostic management is proposed. The authors adopted computer-assisted thermography to study physiological changes in the breasts related to the hormones characterizing the menstrual cycle of a woman. Based on clinical experiments followed by thermal image analysis, they suggest that the 'differential skin temperature' (DST) be measured to detect the fertility interval in the menstrual cycle of a woman.

  4. Etch depth mapping of phase binary computer-generated holograms by means of specular spectroscopic scatterometry

    NASA Astrophysics Data System (ADS)

    Korolkov, Victor P.; Konchenko, Alexander S.; Cherkashin, Vadim V.; Mironnikov, Nikolay G.; Poleshchuk, Alexander G.

    2013-09-01

    Detailed analysis of the etch depth map of phase binary computer-generated holograms intended for testing aspheric optics is a very important task. In particular, diffractive Fizeau null lenses need to be carefully tested for uniformity of etch depth. We offer a simplified version of the specular spectroscopic scatterometry method. It is based on the spectral properties of binary phase multi-order gratings: the zero-order intensity is a periodic function of the wave number of the illuminating light, and the groove depth can be calculated because it is inversely proportional to this period. Measurement in reflection increases the phase depth of the grooves by a factor of 2 and thus allows shallow phase gratings to be measured more precisely. The measurement uncertainty is mainly defined by the following parameters: shifts of the spectrum maxima caused by tilted groove sidewalls, uncertainty in the measurement of the light incidence angle, and the wavelength error of the spectrophotometer. It is shown theoretically and experimentally that the described method can keep the error within 1%. For scanning measurements of large-area computer-generated holograms, however, fiber spectrometers are more convenient. Our experimental system for the characterization of binary computer-generated holograms was developed using a fiber spectrometer.
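
    Under a simplified model (normal incidence, 50% duty cycle), the zero-order reflectance varies as cos^2(2*pi*d*nu) with wave number nu = 1/lambda, so the oscillation period is dnu = 1/(2d) and the depth follows directly. The Python sketch below estimates d from a measured spectrum on these assumptions; the paper's corrections for sidewall tilt and incidence angle are not included.

    ```python
    import numpy as np

    def groove_depth(wavelengths_nm, zero_order_intensity):
        """Estimate groove depth from the spectral period of the zero-order
        reflectance (simplified: I0 ~ cos^2(2*pi*d*nu), depth = 1/(2*dnu))."""
        nu = 1.0 / np.asarray(wavelengths_nm, float)      # wave number, 1/nm
        i0 = np.asarray(zero_order_intensity, float)
        order = np.argsort(nu)
        nu, i0 = nu[order], i0[order]
        # locate successive maxima of the oscillation (needs >= 2 peaks)
        peaks = np.where((i0[1:-1] > i0[:-2]) & (i0[1:-1] > i0[2:]))[0] + 1
        dnu = np.diff(nu[peaks]).mean()                   # oscillation period
        return 1.0 / (2.0 * dnu)                          # depth in nm
    ```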

  5. Political leaders and the media. Can we measure political leadership images in newspapers using computer-assisted content analysis?

    PubMed

    Aaldering, Loes; Vliegenthart, Rens

    Despite the large amount of research into both media coverage of politics and political leadership, surprisingly little research has been devoted to the ways political leaders are discussed in the media. This paper studies whether computer-aided content analysis can be applied to examine political leadership images in Dutch newspaper articles. First, it provides a conceptualization of political leader character traits that integrates different perspectives in the literature. Moreover, this paper measures twelve political leadership images in media coverage, based on a large-scale computer-assisted content analysis of Dutch media coverage (including almost 150,000 newspaper articles), and systematically tests the quality of the employed measurement instrument by assessing the relationships between the images, the variance in the measurement, and the over-time development of images for two party leaders, and by comparing the computer results with manual coding. We conclude that the computerized content analysis provides a valid measurement of the leadership images in Dutch newspapers. Moreover, we find that the dimensions political craftsmanship, vigorousness, integrity, communicative performance and consistency are regularly applied in discussing party leaders, but that portrayal of party leaders in terms of responsiveness is almost completely absent in Dutch newspapers.

  6. Physical and Mathematical Questions on Signal Processing in Multibase Phase Direction Finders

    NASA Astrophysics Data System (ADS)

    Denisov, V. P.; Dubinin, D. V.; Meshcheryakov, A. A.

    2018-02-01

    Questions on improving the accuracy of multiple-base phase direction finders by rejecting anomalously large errors in the process of resolving the measurement ambiguities are considered. A physical basis is derived and calculated relationships characterizing the efficiency of the proposed solutions are obtained. Results of a computer simulation of a three-base direction finder are analyzed, along with field measurements of a three-base direction finder along near-ground paths.

  7. Jack Rabbit Pretest Data For TATB Based IHE Model Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, M M; Strand, O T; Bosson, S T

    The Jack Rabbit Pretest series consisted of 5 focused hydrodynamic experiments, 2021E PT3, PT4, PT5, PT6, and PT7. They were fired in March and April of 2008 at the Contained Firing Facility, Site 300, Lawrence Livermore National Laboratory, Livermore, California. These experiments measured dead-zone formation and impulse gradients created during the detonation of TATB based insensitive high explosive. This document contains reference data tables for all 5 experiments. These data tables include: (1) Measured laser velocimetry of the experiment diagnostic plate (2) Computed diagnostic plate profile contours through velocity integration (3) Computed center axis pressures through velocity differentiation. All times are in microseconds, referenced from detonator circuit current start. All dimensions are in millimeters. Schematic axi-symmetric cross sections are shown for each experiment. These schematics detail the materials used and dimensions of the experiment and component parts. This should allow anyone wanting to evaluate their TATB based insensitive high explosive detonation model against experiment. These data are particularly relevant in examining reactive flow detonation model prediction in computational simulation of dead-zone formation and resulting impulse gradients produced by detonating TATB based explosive.

  8. 49 CFR 40.351 - What confidentiality requirements apply to service agents?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... confidentiality and security measures to ensure that confidential employee records are not available to unauthorized persons. This includes protecting the physical security of records, access controls, and computer security measures to safeguard confidential data in electronic data bases. ...

  9. 49 CFR 40.351 - What confidentiality requirements apply to service agents?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... confidentiality and security measures to ensure that confidential employee records are not available to unauthorized persons. This includes protecting the physical security of records, access controls, and computer security measures to safeguard confidential data in electronic data bases. ...

  10. 49 CFR 40.351 - What confidentiality requirements apply to service agents?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... confidentiality and security measures to ensure that confidential employee records are not available to unauthorized persons. This includes protecting the physical security of records, access controls, and computer security measures to safeguard confidential data in electronic data bases. ...

  11. 49 CFR 40.351 - What confidentiality requirements apply to service agents?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... confidentiality and security measures to ensure that confidential employee records are not available to unauthorized persons. This includes protecting the physical security of records, access controls, and computer security measures to safeguard confidential data in electronic data bases. ...

  12. Calculate Your Body Mass Index

    MedlinePlus

    Body mass index (BMI) is a measure of body fat based on height and weight that applies to adult men and women. Enter your weight and height using standard or metric measures. Select "Compute BMI" and your ...
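
    The calculation behind the "Compute BMI" button is a one-liner in both unit systems:

    ```python
    def bmi_metric(weight_kg, height_m):
        """BMI = weight (kg) / height (m) squared."""
        return weight_kg / height_m ** 2

    def bmi_standard(weight_lb, height_in):
        """US standard units: BMI = 703 * weight (lb) / height (in) squared."""
        return 703.0 * weight_lb / height_in ** 2

    # Example: 70 kg at 1.75 m -> 22.9, inside the adult normal range 18.5-24.9.
    print(round(bmi_metric(70, 1.75), 1))
    ```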

  13. Left ventricle segmentation via graph cut distribution matching.

    PubMed

    Ben Ayed, Ismail; Punithakumar, Kumaradevan; Li, Shuo; Islam, Ali; Chong, Jaron

    2009-01-01

    We present a discrete kernel density matching energy for segmenting the left ventricle cavity in cardiac magnetic resonance sequences. The energy and its graph cut optimization, based on an original first-order approximation of the Bhattacharyya measure, have not been proposed previously, and yield competitive results in near real time. The algorithm seeks a region within each frame by optimizing two priors, one geometric (distance-based) and the other photometric, each measuring a distribution similarity between the region and a model learned from the first frame. Based on global rather than pixelwise information, the proposed algorithm does not require complex training or optimization with respect to geometric transformations. Unlike related active contour methods, it does not compute iterative updates of computationally expensive kernel densities. Furthermore, the proposed first-order analysis can be used for other intractable energies and can therefore lead to segmentation algorithms that share the flexibility of active contours and the computational advantages of graph cuts. Quantitative evaluation over 2280 images acquired from 20 subjects demonstrated that the results correlate well with independent manual segmentations by an expert.
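
    The photometric prior rests on the Bhattacharyya measure between the region's intensity distribution and the model learned from the first frame. A minimal Python illustration of the measure itself follows; the paper's contribution, the first-order approximation inside graph cuts, is not reproduced here.

    ```python
    import numpy as np

    def bhattacharyya(p, q):
        """Bhattacharyya coefficient between two discrete distributions,
        e.g. intensity histograms of a candidate region and the model;
        1 means identical distributions, 0 means disjoint support."""
        p = np.asarray(p, float)
        q = np.asarray(q, float)
        p, q = p / p.sum(), q / q.sum()
        return np.sqrt(p * q).sum()
    ```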

  14. On Target Localization Using Combined RSS and AoA Measurements

    PubMed Central

    Beko, Marko; Dinis, Rui

    2018-01-01

    This work reviews existing solutions to the problem of target localization in wireless sensor networks (WSNs) utilizing combined measurements, namely received signal strength (RSS) and angle of arrival (AoA). The problem of RSS/AoA-based target localization has recently become very popular in the research community, owing to its great applicability potential and relatively low implementation cost. A comprehensive study of the state-of-the-art (SoA) solutions and their detailed analysis is therefore presented here. The work begins by considering the SoA approaches based on convex relaxation techniques (more computationally complex in general) and then moves to less computationally complex approaches as well, such as those based on the generalized trust region sub-problems framework and linear least squares. A detailed analysis of the computational complexity of each solution is reviewed, and an extensive set of simulation results is presented. Finally, the main conclusions are summarized, and a set of aspects and trends that might be interesting for future research in this area is identified. PMID:29671832

  15. A novel measure and significance testing in data analysis of cell image segmentation.

    PubMed

    Wu, Jin Chu; Halter, Michael; Kacker, Raghu N; Elliott, John T; Plant, Anne L

    2017-03-14

    Cell image segmentation (CIS) is an essential part of quantitative imaging of biological cells. Designing a performance measure and conducting significance testing are critical for evaluating and comparing CIS algorithms for image-based cell assays in cytometry. Many measures and methods have been proposed and implemented to evaluate segmentation methods. However, how to compute the standard errors (SE) of these measures and their correlation coefficients has not been described, and thus the statistical significance of performance differences between CIS algorithms cannot be assessed. We propose the total error rate (TER), a novel performance measure for segmenting all cells in the supervised evaluation. The TER statistically aggregates all misclassification error rates (MER), taking cell sizes as weights; the MERs refer to the segmentation of each single cell in the population. The TER is fully supported by pairwise comparisons of MERs using 106 manually segmented ground-truth cells with different sizes and seven CIS algorithms taken from ImageJ. Further, the SE and 95% confidence interval (CI) of the TER are computed based on the SE of the MER, which is calculated using the bootstrap method. An algorithm for computing the correlation coefficient of TERs between two CIS algorithms is also provided. Hence, the 95% CI error bars can be used to classify CIS algorithms, and the SEs of TERs and their correlation coefficient can be employed to conduct hypothesis testing, when the CIs overlap, to determine the statistical significance of the performance differences between CIS algorithms. In summary, a novel measure, the TER of CIS, is proposed, its SEs and correlation coefficients are computed, and CIS algorithms can thereby be evaluated and compared statistically by conducting significance testing.
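
    A compact Python sketch of the aggregation and of a bootstrap SE, assuming per-cell MERs and cell sizes (pixel counts) are available; the bootstrap details here (resampling cells with replacement) are an assumption consistent with, but not copied from, the paper.

    ```python
    import numpy as np

    def total_error_rate(mers, cell_sizes):
        """TER: per-cell misclassification error rates aggregated with cell
        sizes as weights."""
        m, w = np.asarray(mers, float), np.asarray(cell_sizes, float)
        return (w * m).sum() / w.sum()

    def bootstrap_se(mers, cell_sizes, n_boot=2000, seed=0):
        """Bootstrap standard error of the TER, resampling cells."""
        rng = np.random.default_rng(seed)
        m, w = np.asarray(mers, float), np.asarray(cell_sizes, float)
        idx = rng.integers(0, len(m), (n_boot, len(m)))
        ters = (w[idx] * m[idx]).sum(axis=1) / w[idx].sum(axis=1)
        return ters.std(ddof=1)
    ```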

  16. Model-based registration for assessment of spinal deformities in idiopathic scoliosis

    NASA Astrophysics Data System (ADS)

    Forsberg, Daniel; Lundström, Claes; Andersson, Mats; Knutsson, Hans

    2014-01-01

    Detailed analysis of spinal deformity is important within orthopaedic healthcare, in particular for assessment of idiopathic scoliosis. This paper addresses this challenge by proposing an image analysis method, capable of providing a full three-dimensional spine characterization. The proposed method is based on the registration of a highly detailed spine model to image data from computed tomography. The registration process provides an accurate segmentation of each individual vertebra and the ability to derive various measures describing the spinal deformity. The derived measures are estimated from landmarks attached to the spine model and transferred to the patient data according to the registration result. Evaluation of the method provides an average point-to-surface error of 0.9 mm ± 0.9 (comparing segmentations), and an average target registration error of 2.3 mm ± 1.7 (comparing landmarks). Comparing automatic and manual measurements of axial vertebral rotation provides a mean absolute difference of 2.5° ± 1.8, which is on a par with other computerized methods for assessing axial vertebral rotation. A significant advantage of our method, compared to other computerized methods for rotational measurements, is that it does not rely on vertebral symmetry for computing the rotational measures. The proposed method is fully automatic and computationally efficient, only requiring three to four minutes to process an entire image volume covering vertebrae L5 to T1. Given the use of landmarks, the method can be readily adapted to estimate other measures describing a spinal deformity by changing the set of employed landmarks. In addition, the method has the potential to be utilized for accurate segmentations of the vertebrae in routine computed tomography examinations, given the relatively low point-to-surface error.

  17. Online Self-Administered Cognitive Testing Using the Amsterdam Cognition Scan: Establishing Psychometric Properties and Normative Data.

    PubMed

    Feenstra, Heleen Em; Vermeulen, Ivar E; Murre, Jaap Mj; Schagen, Sanne B

    2018-05-30

    Online tests enable efficient self-administered assessments and consequently facilitate large-scale data collection for many fields of research. The Amsterdam Cognition Scan is a new online neuropsychological test battery that measures a broad variety of cognitive functions. The aims of this study were to evaluate the psychometric properties of the Amsterdam Cognition Scan and to establish regression-based normative data. The Amsterdam Cognition Scan was self-administered twice from home, with an interval of 6 weeks, by 248 healthy Dutch-speaking adults aged 18 to 81 years. Test-retest reliability was moderate to high and comparable with that of equivalent traditional tests (intraclass correlation coefficients: .45 to .80; .83 for the Amsterdam Cognition Scan total score). Multiple regression analyses indicated that (1) participants' age negatively influenced all (12) cognitive measures, (2) gender was associated with performance on six measures, and (3) education level was positively associated with performance on four measures. In addition, we observed influences of tested computer skills and of the self-reported amount of computer use on cognitive performance. Demographic characteristics that proved to influence Amsterdam Cognition Scan test performance were included in regression-based predictive formulas to establish demographically adjusted normative data. Initial results from a healthy adult sample indicate that the Amsterdam Cognition Scan has high usability and can give reliable measures of various generic cognitive ability areas. For future use, the influence of computer skills and experience should be further studied, and for repeated measurements, the computer configuration should be kept consistent. The reported normative data allow for initial interpretation of Amsterdam Cognition Scan performances. ©Heleen EM Feenstra, Ivar E Vermeulen, Jaap MJ Murre, Sanne B Schagen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 30.05.2018.
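
    Regression-based normative data of this kind are typically applied by predicting the expected score from demographics and standardizing the residual. A generic Python sketch of that recipe (not the authors' fitted coefficients) is:

    ```python
    import numpy as np

    def fit_norms(X, scores):
        """Fit a linear normative model, score ~ demographics (age, gender,
        education, ...); returns coefficients and residual SD."""
        X1 = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(X1, scores, rcond=None)
        resid_sd = (scores - X1 @ beta).std(ddof=X1.shape[1])
        return beta, resid_sd

    def adjusted_z(observed, demographics, beta, resid_sd):
        """Demographically adjusted z-score: (observed - predicted) / SD."""
        predicted = beta[0] + float(np.dot(beta[1:], demographics))
        return (observed - predicted) / resid_sd
    ```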

  18. Numerical Analysis of a Pulse Detonation Cross Flow Heat Load Experiment

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Naples, Andrew; Hoke, John L.; Schauer, Fred

    2011-01-01

    A comparison between experimentally measured and numerically simulated, time-averaged, point heat transfer rates in a pulse detonation engine (PDE) is presented. The comparison includes measurements and calculations for heat transfer to a cylinder in crossflow and to the tube wall itself using a novel spool design. Measurements are obtained at several locations and under several operating conditions. The measured and computed results are shown to be in substantial agreement, thereby validating the modeling approach. The model, which is based on computational fluid dynamics (CFD), is then used to interpret the results. A preheating of the incoming fuel charge is predicted, which results in increased volumetric flow and subsequent overfilling. The effect is validated with additional measurements.

  19. Monitoring minimization of grade B environments based on risk assessment using three-dimensional airflow measurements and computer simulation.

    PubMed

    Katayama, Hirohito; Higo, Takashi; Tokunaga, Yuji; Katoh, Shigeo; Hiyama, Yukio; Morikawa, Kaoru

    2008-01-01

    A practical, risk-based monitoring approach using the combined data collected from actual experiments and computer simulations was developed for the qualification of an EU GMP Annex 1 Grade B, ISO Class 7 area. This approach can locate and minimize the representative number of sampling points used for microbial contamination risk assessment. We conducted a case study on an aseptic clean room, newly constructed and specifically designed for the use of a restricted access barrier system (RABS). Hotspots were located using a previously published empirical measurement method, three-dimensional airflow analysis. Local mean age of air (LMAA) values were calculated based on computer simulations. Comparable results were found using actual measurements and simulations, demonstrating the potential usefulness of such tools in estimating contamination risks based on the airflow characteristics of a clean room. Intensive microbial monitoring and particle monitoring at the Grade B environmental qualification stage, as well as three-dimensional airflow analysis, were also conducted to reveal contamination hotspots. We found representative hotspots were located at perforated panels covering the air exhausts where the major piston airflows collect in the Grade B room, as well as at any locations within the room that were identified as having stagnant air. However, we also found that the floor surface air around the exit airway of the RABS EU GMP Annex 1 Grade A, ISO Class 5 area was always remarkably clean, possibly due to the immediate sweep of the piston airflow, which prevents dispersed human microbes from falling in a Stokes-type manner on settling plates placed on the floor around the Grade A exit airway. In addition, this airflow is expected to be clean, with a significantly low LMAA. Based on these observed results, we propose a simplified daily program for monitoring microbial contamination in Grade B environments. To locate hotspots, we propose using a combination of computer simulation, actual airflow measurements, and intensive environmental monitoring at the qualification stage. Thereafter, instead of particle or microbial air monitoring, we recommend the use of microbial surface monitoring at the main air exhaust. These measures would be sufficient to assure the efficiency of the monitoring program, as well as to minimize the number of surface sampling points used in environments surrounding a RABS.

  20. Predicting Forearm Physical Exposures During Computer Work Using Self-Reports, Software-Recorded Computer Usage Patterns, and Anthropometric and Workstation Measurements.

    PubMed

    Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T

    2017-12-15

    Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which differed only with respect to the candidate predictor variables: (i) a full set of predictor variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predictor variables, including only the self-reported factors and software-recorded computer usage patterns, which are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software, and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) error values, and in terms of exposure classification agreement for low-, medium-, and high-exposure categories (for the practical models only). The full models had R2 values ranging from 0.16 to 0.80, whereas values for the practical models ranged from 0.05 to 0.43. Interquartile ranges were similar for the two models, indicating that the full models performed better only for some physical exposures. Relative RMS errors ranged between 5% and 19% for the full models, and between 10% and 19% for the practical models. When the predicted physical exposures were classified into low, medium, and high, classification agreement ranged from 26% to 71%. The full prediction models, based on self-reported factors, software-recorded computer usage patterns, and additional measurements of anthropometrics and workstation set-up, thus show better predictive quality than the practical models based on self-reported factors and recorded computer usage patterns only. However, predictive quality varied largely across different arm-wrist-hand exposure parameters. Future exploration of the relation between predicted physical exposure and symptoms is therefore only recommended for physical exposures that can be reasonably well predicted. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  1. Comparison of workload measures on computer-generated primary flight displays

    NASA Technical Reports Server (NTRS)

    Nataupsky, Mark; Abbott, Terence S.

    1987-01-01

    Four Air Force pilots were used as subjects to assess a battery of subjective and physiological workload measures in a flight simulation environment in which two computer-generated primary flight display configurations were evaluated. A high- and low-workload task was created by manipulating flight path complexity. Both SWAT and the NASA-TLX were shown to be effective in differentiating the high and low workload path conditions. Physiological measures were inconclusive. A battery of workload measures continues to be necessary for an understanding of the data. Based on workload, opinion, and performance data, it is fruitful to pursue research with a primary flight display and a horizontal situation display integrated into a single display.

  2. Quantum Computational Universality of the 2D Cai-Miyake-Dür-Briegel Quantum State

    NASA Astrophysics Data System (ADS)

    Wei, Tzu-Chieh; Raussendorf, Robert; Kwek, Leong Chuan

    2012-02-01

    Universal quantum computation can be achieved by simply performing single-qubit measurements on a highly entangled resource state, such as cluster states. Cai, Miyake, Dür, and Briegel recently constructed a ground state of a two-dimensional quantum magnet by combining multiple Affleck-Kennedy-Lieb-Tasaki quasichains of mixed spin-3/2 and spin-1/2 entities and by mapping pairs of neighboring spin-1/2 particles to individual spin-3/2 particles [Phys. Rev. A 82, 052309 (2010)]. They showed that this state enables universal quantum computation by constructing single- and two-qubit universal gates. Here, we give an alternative understanding of how this state gives rise to universal measurement-based quantum computation: by local operations, each quasichain can be converted to a one-dimensional cluster state and entangling gates between two neighboring logical qubits can be implemented by single-spin measurements. Furthermore, a two-dimensional cluster state can be distilled from the Cai-Miyake-Dür-Briegel state.

  3. Automated egg grading system using computer vision: Investigation on weight measure versus shape parameters

    NASA Astrophysics Data System (ADS)

    Nasir, Ahmad Fakhri Ab; Suhaila Sabarudin, Siti; Majeed, Anwar P. P. Abdul; Ghani, Ahmad Shahrizan Abdul

    2018-04-01

    Chicken eggs are a food in high demand among humans, and human operators cannot grade eggs perfectly and continuously. Instead of an egg grading system using a weight measure, an automatic system for egg grading using computer vision (using egg shape parameters) can be used to improve the productivity of egg grading. However, an early hypothesis indicated that egg class assignments would change more when using egg shape parameters than when using the weight measure. This paper presents a comparison of egg classification by the two above-mentioned methods. First, 120 images of chicken eggs of various grades (A-D) produced in Malaysia are captured. The egg images are then processed using image pre-processing techniques such as cropping, smoothing and segmentation. Thereafter, eight egg shape features, including area, major axis length, minor axis length, volume, diameter and perimeter, are extracted. Lastly, feature selection (information gain ratio) and feature extraction (principal component analysis) are performed, and a k-nearest neighbour classifier is used in the classification process. Two approaches are compared: supervised learning (labels from the weight measure as graded by the egg supplier) and unsupervised learning (labels from the egg shape parameters as graded by ourselves). The clustering results reveal many changes in egg classes after shape-based grading. On average, the best recognition result is 94.16% using shape-based grading labels, compared with 44.17% using weight-based labels. In conclusion, an automated egg grading system using computer vision is better served by shape-based features, since it works from images, whereas the weight parameter is more suited to a weight-based grading system.

  4. Efficient Mining of Interesting Patterns in Large Biological Sequences

    PubMed Central

    Rashid, Md. Mamunur; Karim, Md. Rezaul; Jeong, Byeong-Soo

    2012-01-01

    Pattern discovery in biological sequences (e.g., DNA sequences) is one of the most challenging tasks in computational biology and bioinformatics. So far, in most approaches, the number of occurrences has been the major measure for determining whether a pattern is interesting or not. In computational biology, however, a pattern that is not frequent may still be considered very informative if its actual support frequency exceeds the prior expectation by a large margin. In this paper, we propose a new interestingness measure that can provide meaningful biological information. We also propose an efficient index-based method for mining such interesting patterns. Experimental results show that our approach can find interesting patterns within an acceptable computation time. PMID:23105928

  5. Efficient mining of interesting patterns in large biological sequences.

    PubMed

    Rashid, Md Mamunur; Karim, Md Rezaul; Jeong, Byeong-Soo; Choi, Ho-Jin

    2012-03-01

    Pattern discovery in biological sequences (e.g., DNA sequences) is one of the most challenging tasks in computational biology and bioinformatics. So far, in most approaches, the number of occurrences has been the major measure for determining whether a pattern is interesting or not. In computational biology, however, a pattern that is not frequent may still be considered very informative if its actual support frequency exceeds the prior expectation by a large margin. In this paper, we propose a new interestingness measure that can provide meaningful biological information. We also propose an efficient index-based method for mining such interesting patterns. Experimental results show that our approach can find interesting patterns within an acceptable computation time.
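
    The notion of "interesting though not frequent" can be made concrete by comparing observed support with its expectation under a background model. The sketch below uses a 0th-order (symbol frequency) background, which is an illustrative assumption rather than the paper's exact measure:

    ```python
    import math
    from collections import Counter

    def expected_support(pattern, sequence):
        """Expected occurrence count of `pattern` under a 0th-order model
        estimated from symbol frequencies in `sequence`."""
        n, m = len(sequence), len(pattern)
        freq = Counter(sequence)
        p = 1.0
        for ch in pattern:
            p *= freq[ch] / n
        return (n - m + 1) * p

    def surprise(pattern, sequence, observed):
        """Log-ratio of observed to expected support; large positive values
        flag rare patterns occurring far more often than chance."""
        return math.log2(observed / max(expected_support(pattern, sequence), 1e-12))

    dna = "ACGT" * 2500 + "TATA" * 5
    print(surprise("TATATA", dna, observed=4))  # rare overall, yet surprising
    ```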

  6. Computer proficiency questionnaire: assessing low and high computer proficient seniors.

    PubMed

    Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-06-01

    Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
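
    The reliability figures quoted above follow the standard Cronbach's alpha formula, which is straightforward to compute from an item-score matrix. A minimal sketch:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix:
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
        items = np.asarray(items, float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)
    ```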

  7. Composite measures of watershed health from a water quality perspective.

    PubMed

    Mallya, Ganeshchandra; Hantush, Mohamed; Govindaraju, Rao S

    2018-05-15

    Water quality data at gaging stations are typically compared with established federal, state, or local water quality standards to determine if violations (concentrations of specific constituents falling outside acceptable limits) have occurred. Based on the frequency and severity of water quality violations, risk metrics such as reliability, resilience, and vulnerability (R-R-V) are computed for assessing water quality-based watershed health. In this study, a modified methodology for computing R-R-V measures is presented, and a new composite watershed health index is proposed. Risk-based assessments for different water quality parameters are carried out using identified national sampling stations within the Upper Mississippi River Basin, the Maumee River Basin, and the Ohio River Basin. The distributional properties of risk measures with respect to water quality parameters are reported. Scaling behaviors of risk measures using stream order, specifically for the watershed health (WH) index, suggest that WH values increased with stream order for suspended sediment concentration, nitrogen, and orthophosphate in the Upper Mississippi River Basin. Spatial distribution of risk measures enable identification of locations exhibiting poor watershed health with respect to the chosen numerical standard, and the role of land use characteristics within the watershed. Copyright © 2018 Elsevier Ltd. All rights reserved.
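
    In simplified textbook form, the three R-R-V ingredients can be computed from a constituent time series and a numeric standard as below; the paper's modified methodology and the composite watershed health index built on it are not reproduced here.

    ```python
    import numpy as np

    def rrv(concentration, limit):
        """Reliability, resilience, vulnerability of a water quality series
        against a standard (violation = concentration above the limit)."""
        c = np.asarray(concentration, float)
        viol = c > limit
        reliability = 1.0 - viol.mean()
        # resilience: chance that a violating step is followed by compliance
        recoveries = np.sum(viol[:-1] & ~viol[1:])
        resilience = recoveries / max(int(viol[:-1].sum()), 1)
        # vulnerability: mean exceedance magnitude during violations
        vulnerability = float((c[viol] - limit).mean()) if viol.any() else 0.0
        return reliability, resilience, vulnerability
    ```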

  8. A Distance Measure for Genome Phylogenetic Analysis

    NASA Astrophysics Data System (ADS)

    Cao, Minh Duc; Allison, Lloyd; Dix, Trevor

    Phylogenetic analyses of species based on single genes or parts of the genomes are often inconsistent because of factors such as variable rates of evolution and horizontal gene transfer. The availability of more and more sequenced genomes allows phylogeny construction from complete genomes, which is less sensitive to such inconsistency. For such long sequences, construction methods like maximum parsimony and maximum likelihood are often not feasible due to their intensive computational requirements. Another class of tree construction methods, namely distance-based methods, requires a measure of the distance between any two genomes. Some measures, such as the evolutionary edit distance of gene order and gene content, are computationally expensive or do not perform well when the gene content of the organisms is similar. This study presents an information-theoretic measure of genetic distance between genomes based on the biological compression algorithm known as the expert model. We demonstrate that our distance measure can be applied to reconstruct the consensus phylogenetic tree of a number of Plasmodium parasites from their genomes, the statistical bias of which would mislead conventional analysis methods. Our approach is also used to successfully construct a plausible evolutionary tree for the γ-Proteobacteria group, whose genomes are known to contain many horizontally transferred genes.
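
    Compression-based distances of this family are typically of the normalized compression distance (NCD) form. The sketch below shows the idea with zlib as a stand-in compressor; the paper itself uses the expert model compressor, which is tailored to DNA and is not reproduced here.

    ```python
    import zlib

    def ncd(x: bytes, y: bytes) -> float:
        """Normalized compression distance between two sequences: near 0 for
        closely related genomes, near 1 for unrelated ones."""
        cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
        cxy = len(zlib.compress(x + y))
        return (cxy - min(cx, cy)) / max(cx, cy)

    # A pairwise NCD matrix over genomes can then feed a distance-based
    # tree builder such as neighbor joining.
    print(ncd(b"ACGT" * 1000, b"ACGT" * 999 + b"TTTT"))
    ```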

  9. Military Vision Research Program

    DTIC Science & Technology

    2011-07-01

    Award 10-1-0392. Title: Military Vision Research Program. Principal Investigator: Dr. Darlene Dartt. Contracting Organization: The Schepens Eye Research... Accomplishments emanating from this research: 3 novel computer-based tasks have been developed that measure visual distortions; these tests are based...

  10. Computer literacy and attitudes towards e-learning among first year medical students

    PubMed Central

    Link, Thomas Michael; Marz, Richard

    2006-01-01

    Background At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. Methods The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. Results While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Conclusion Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes. PMID:16784524

  11. Computer literacy and attitudes towards e-learning among first year medical students.

    PubMed

    Link, Thomas Michael; Marz, Richard

    2006-06-19

    At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes.

  12. Experimental and Computational Study of Sonic and Supersonic Jet Plumes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, E.; Naughton, J. W.; Fletcher, D. G.; Edwards, Thomas A. (Technical Monitor)

    1994-01-01

    Studies of sonic and supersonic jet plumes are relevant to understanding phenomena such as jet noise, plume signatures, and rocket base heating and radiation. Jet plumes are simple to simulate and yet have complex flow structures such as Mach disks, triple points, shear layers, barrel shocks, and shock-shear-layer interactions. Experimental and computational simulations of sonic and supersonic jet plumes have been performed for under- and over-expanded, axisymmetric plume conditions. The computational simulations compare very well with experimental schlieren observations. Experimental data such as temperature measurements with hot-wire probes are yet to be taken and will be compared with computed values. Extensive analysis of the computational simulations presents a clear picture of how the complex flow structure develops and the conditions under which self-similar flow structures evolve. From the computations, the plume structure can be further classified into many sub-groups. In the proposed paper, detailed results from the experimental and computational simulations for single, axisymmetric, under- and over-expanded, sonic and supersonic plumes will be compared and the fluid dynamic aspects of the flow structures will be discussed.

  13. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    PubMed

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
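
    The core SRC step can be sketched compactly: code a test trial over a dictionary of training trials, then assign the class whose atoms reconstruct it best. The Python sketch below uses greedy orthogonal matching pursuit in place of the usual l1 solver, and omits the paper's adaptive dictionary updates; column-normalized data are assumed.

    ```python
    import numpy as np

    def omp(D, y, k):
        """Orthogonal matching pursuit: k-sparse code of y over dictionary D
        (columns of D assumed unit-normalized)."""
        resid, idx = y.copy(), []
        coef = np.zeros(0)
        for _ in range(k):
            idx.append(int(np.argmax(np.abs(D.T @ resid))))
            coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
            resid = y - D[:, idx] @ coef
        x = np.zeros(D.shape[1])
        x[idx] = coef
        return x

    def src_classify(D, labels, y, k=10):
        """Assign y to the class whose training atoms give the smallest
        reconstruction residual under the sparse code."""
        x = omp(D, y, k)
        errs = {}
        for c in set(labels):
            mask = np.array([lab == c for lab in labels])
            errs[c] = np.linalg.norm(y - D[:, mask] @ x[mask])
        return min(errs, key=errs.get)
    ```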

  14. Measurement-based reliability prediction methodology. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Linn, Linda Shen

    1991-01-01

    In the past, analytical and measurement based models were developed to characterize computer system behavior. An open issue is how these models can be used, if at all, for system design improvement. The issue is addressed here. A combined statistical/analytical approach to use measurements from one environment to model the system failure behavior in a new environment is proposed. A comparison of the predicted results with the actual data from the new environment shows a close correspondence.

  15. Quantifying Similarity and Distance Measures for Vector-Based Datasets: Histograms, Signals, and Probability Distribution Functions

    DTIC Science & Technology

    2017-02-01

    In this technical note, a number of different measures implemented as functions in both MATLAB and Python are used to quantify similarity/distance between 2 vector-based... The measures described in this technical note are widely used and may have an important role when computing the distance and similarity of large datasets and when considering high-throughput processes.

  16. Measurement-Based Linear Optics

    NASA Astrophysics Data System (ADS)

    Alexander, Rafael N.; Gabay, Natasha C.; Rohde, Peter P.; Menicucci, Nicolas C.

    2017-03-01

    A major challenge in optical quantum processing is implementing large, stable interferometers. We offer a novel approach: virtual, measurement-based interferometers that are programmed on the fly solely by the choice of homodyne measurement angles. The effects of finite squeezing are captured as uniform amplitude damping. We compare our proposal to existing (physical) interferometers and consider its performance for BosonSampling, which could demonstrate postclassical computational power in the near future. We prove its efficiency in time and squeezing (energy) in this setting.

  17. Visible light scattering properties of irregularly shaped silica microparticles using laser based laboratory simulations for remote sensing and medical applications

    NASA Astrophysics Data System (ADS)

    Boruah, Manash J.; Ahmed, Gazi A.

    2018-01-01

    Laser-based experimental light scattering studies of irregularly shaped silica microparticles have been performed at three incident wavelengths (543.5 nm, 594.5 nm, and 632.8 nm), supported by laboratory-based computations and realistic 3D simulations, using an indigenously fabricated light scattering setup. A comparative analysis of the computational and experimentally acquired results is done, and good agreement is found in the forward scattering lobes in all cases for each of the measured scattering parameters. This study also provides an efficient way of detecting and measuring particle size distribution for irregular micro- and nanoparticles and is highly applicable in remote sensing, atmospheric, astrophysical, and medical applications, and also for finding potential health hazards in the form of inhalable and respirable small particulate matter.

  18. Millimeter-wave spectra of the Jovian planets

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Steffes, Paul G.

    1991-01-01

    The millimeter wave portion of the electromagnetic spectrum is critical for understanding the subcloud atmospheric structure of the Jovian planets (Jupiter, Saturn, Uranus, and Neptune). This research utilizes a combination of laboratory measurements, computer modeling, and radio astronomical observation in order to obtain a better understanding of the millimeter-wave spectra of the Jovian planets. The pressure broadened absorption from gaseous ammonia (NH3) and hydrogen sulfide (H2S) was measured in the laboratory under simulated conditions for the Jovian atmospheres. Researchers developed new formalisms for computing the absorptivity of gaseous NH3 and H2S based on their laboratory measurements. They developed a radiative transfer and thermochemical model to predict the abundance and distribution of absorbing constituents in the Jovian atmospheres. They used the model to compute the millimeter wave emission from the Jovian planets.

  19. Quantum computing with incoherent resources and quantum jumps.

    PubMed

    Santos, M F; Cunha, M Terra; Chaves, R; Carvalho, A R R

    2012-04-27

    Spontaneous emission and the inelastic scattering of photons are two natural processes usually associated with decoherence and the reduction in the capacity to process quantum information. Here we show that, when suitably detected, these photons are sufficient to build all the fundamental blocks needed to perform quantum computation in the emitting qubits while protecting them from deleterious dissipative effects. We exemplify this by showing how to efficiently prepare graph states for the implementation of measurement-based quantum computation.

  20. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    PubMed

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and have used it effectively for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility of assessing brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesia chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  1. Computational modelling of cellular level metabolism

    NASA Astrophysics Data System (ADS)

    Calvetti, D.; Heino, J.; Somersalo, E.

    2008-07-01

    The steady and stationary state inverse problems consist of estimating the reaction and transport fluxes, blood concentrations, and possibly the rates of change of some of the concentrations based on data which are often scarce, noisy, and sampled over a population. The Bayesian framework provides a natural setting for the solution of this inverse problem, because a priori knowledge about the system itself and the unknown reaction fluxes and transport rates can compensate for the insufficiency of measured data, provided that the computational costs do not become prohibitive. This article identifies the computational challenges which have to be met when analyzing the steady and stationary states of a multicompartment model for cellular metabolism and suggests stable and efficient ways to handle the computations. The outline of a computational tool based on the Bayesian paradigm for the simulation and analysis of complex cellular metabolic systems is also presented.
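
    In the simplest Gaussian case, the Bayesian strategy described above reduces, for a linear steady-state model S v = b, to a regularized least-squares (MAP) estimate in which the prior compensates for scarce data. The sketch below illustrates only that special case; the stoichiometric matrix, prior mean, and noise levels are placeholders.

```python
import numpy as np

def map_flux_estimate(S, b, v0, sigma=0.05, gamma=1.0):
    """MAP estimate for S v = b with Gaussian noise (std sigma) and a
    Gaussian prior v ~ N(v0, gamma^2 I), i.e. the minimizer of
    ||S v - b||^2 / sigma^2 + ||v - v0||^2 / gamma^2."""
    n = S.shape[1]
    A = S.T @ S / sigma**2 + np.eye(n) / gamma**2
    rhs = S.T @ b / sigma**2 + v0 / gamma**2
    return np.linalg.solve(A, rhs)
```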

  2. Workload Characterization of CFD Applications Using Partial Differential Equation Solvers

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Workload characterization is used for modeling and evaluating computing systems at different levels of detail. We present workload characterization for a class of Computational Fluid Dynamics (CFD) applications that solve Partial Differential Equations (PDEs). This workload characterization focuses on three high performance computing platforms: SGI Origin2000, IBM SP-2, and a cluster of Intel Pentium Pro based PCs. We execute extensive measurement-based experiments on these platforms to gather statistics of system resource usage, which result in the workload characterization. Our workload characterization approach yields a coarse-grain resource utilization behavior that is being applied for performance modeling and evaluation of distributed high performance metacomputing systems. In addition, this study enhances our understanding of interactions between PDE solver workloads and high performance computing platforms and is useful for tuning these applications.

  3. How Readability and Topic Incidence Relate to Performance on Mathematics Story Problems in Computer-Based Curricula

    ERIC Educational Resources Information Center

    Walkington, Candace; Clinton, Virginia; Ritter, Steven N.; Nathan, Mitchell J.

    2015-01-01

    Solving mathematics story problems requires text comprehension skills. However, previous studies have found few connections between traditional measures of text readability and performance on story problems. We hypothesized that recently developed measures of readability and topic incidence measured by text-mining tools may illuminate associations…

  4. Computer-based cognitive training for ADHD: a review of current evidence.

    PubMed

    Sonuga-Barke, Edmund; Brandeis, Daniel; Holtmann, Martin; Cortese, Samuele

    2014-10-01

    There has been increasing interest in, and use of, computer-based cognitive training as a treatment of attention-deficit/hyperactivity disorder (ADHD). The authors' review of current evidence, based partly on a stringent meta-analysis of 6 randomized controlled trials (RCTs) published in 2013 and an overview of 8 recently published RCTs, highlights the inconsistency of findings between trials and across blinded and nonblinded ADHD measures within trials. Based on this, they conclude that more evidence from well-blinded studies is required before cognitive training can be supported as a frontline treatment of core ADHD symptoms. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Acoustic and Perceptual Effects of Left-Right Laryngeal Asymmetries Based on Computational Modeling

    ERIC Educational Resources Information Center

    Samlan, Robin A.; Story, Brad H.; Lotto, Andrew J.; Bunton, Kate

    2014-01-01

    Purpose: Computational modeling was used to examine the consequences of 5 different laryngeal asymmetries on acoustic and perceptual measures of vocal function. Method: A kinematic vocal fold model was used to impose 5 laryngeal asymmetries: adduction, edge bulging, nodal point ratio, amplitude of vibration, and starting phase. Thirty /a/ and /?/…

  6. Measuring ICT Use and Learning Outcomes: Evidence from Recent Econometric Studies

    ERIC Educational Resources Information Center

    Biagi, Federico; Loi, Massimo

    2013-01-01

    Based on PISA 2009 data, this article studies the relationship between students' computer use and their achievement in reading, mathematics and science in 23 countries. After having categorised computer use into a set of different activities according to the skills they involve, we correlate students' PISA test-scores with an index capturing the…

  7. Effect of Computer-Aided Instruction on Attitude and Achievement of Fifth Grade Math Students

    ERIC Educational Resources Information Center

    Shoemaker, Traci L.

    2013-01-01

    The purpose of this quasi-experimental non-equivalent control group study was to test theories of constructivism and motivation, along with research-based teaching practices of differentiating instruction and instructing within a child's Zone of Proximal Development, in measuring the effect of computer-aided instruction on fifth grade students'…

  8. A Review of EEG-Based Brain-Computer Interfaces as Access Pathways for Individuals with Severe Disabilities

    ERIC Educational Resources Information Center

    Moghimi, Saba; Kushki, Azadeh; Guerguerian, Anne Marie; Chau, Tom

    2013-01-01

    Electroencephalography (EEG) is a non-invasive method for measuring brain activity and is a strong candidate for brain-computer interface (BCI) development. While BCIs can be used as a means of communication for individuals with severe disabilities, the majority of existing studies have reported BCI evaluations by able-bodied individuals.…

  9. Linking Pedagogical Theory of Computer Games to their Usability

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Avni, Einav; Zaphiris, Panayiotis

    2008-01-01

    This article reviews a range of literature of computer games and learning theories and attempts to establish a link between them by proposing a typology of games which we use as a new usability measure for the development of guidelines for game-based learning. First, we examine game literature in order to understand the key elements that…

  10. Towards International and Interdisciplinary Research Collaboration for the Measurements of Quality of Life

    ERIC Educational Resources Information Center

    Mizohata, Sachie; Jadoul, Raynald

    2013-01-01

    This paper focuses on three main subjects: (1) monitoring quality of life (QoL) in old age; (2) international and interdisciplinary collaboration for QoL research; and (3) computer-based technology and infrastructure assisting (1) and (2). This type of computer-supported cooperative work in the social sciences has been termed eHumanities or…

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, A.; Sengupta, M.; Wilcox, S.

    Models to compute Global Horizontal Irradiance (GHI) and Direct Normal Irradiance (DNI) have been in development over the last 3 decades. These models can be classified as empirical or physical, based on the approach. Empirical models relate ground-based observations with satellite measurements and use these relations to compute surface radiation. Physical models consider the radiation received from the earth at the satellite and create retrievals to estimate surface radiation. While empirical methods have traditionally been used for computing surface radiation for the solar energy industry, the advent of faster computing has made operational physical models viable. The Global Solar Insolation Project (GSIP) is an operational physical model from NOAA that computes GHI using the visible and infrared channel measurements from the GOES satellites. GSIP uses a two-stage scheme that first retrieves cloud properties and then uses those properties in a radiative transfer model to calculate surface radiation. NREL, the University of Wisconsin, and NOAA have recently collaborated to adapt GSIP to create a 4 km GHI and DNI product every 30 minutes. This paper presents an outline of the methodology and a comprehensive validation using high quality ground-based solar data from the National Oceanic and Atmospheric Administration (NOAA) Surface Radiation (SURFRAD) (http://www.srrb.noaa.gov/surfrad/sitepage.html) and Integrated Surface Insolation Study (ISIS) (http://www.srrb.noaa.gov/isis/isissites.html) networks, the Solar Radiation Research Laboratory (SRRL) at the National Renewable Energy Laboratory (NREL), and Sun Spot One (SS1) stations.

  12. Technical Note: spektr 3.0—A computational tool for x-ray spectrum modeling and analysis

    PubMed Central

    Punnoose, J.; Xu, J.; Sisniega, A.; Zbijewski, W.; Siewerdsen, J. H.

    2016-01-01

    Purpose: A computational toolkit (spektr 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS) algorithm, updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP). The toolkit includes a MATLAB (The Mathworks, Natick, MA) function library and improved user interface (UI) along with an optimization algorithm to match calculated beam quality with measurements. Methods: The spektr code generates x-ray spectra (photons/mm²/mAs at 100 cm from the source) using TASMICS as default (with TASMIP as an option) in 1 keV energy bins over beam energies 20–150 kV, extensible to 640 kV using the TASMICS spectra. An optimization tool was implemented to compute the added filtration (Al and W) that provides a best match between calculated and measured x-ray tube output (mGy/mAs or mR/mAs) for individual x-ray tubes that may differ from that assumed in TASMICS or TASMIP and to account for factors such as anode angle. Results: The median percent difference in photon counts for a TASMICS and TASMIP spectrum was 4.15% for tube potentials in the range 30–140 kV, with the largest percentage difference arising in the low and high energy bins due to measurement errors in the empirically based TASMIP model and inaccurate polynomial fitting. The optimization tool reported a close agreement between measured and calculated spectra with a Pearson coefficient of 0.98. Conclusions: The computational toolkit, spektr, has been updated to version 3.0, validated against measurements and existing models, and made available as open source code. Video tutorials for the spektr function library, UI, and optimization tool are available. PMID:27487888
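
    The filtration-matching optimization described above can be illustrated as a one-dimensional fit: attenuate a model spectrum by a candidate Al thickness, compute the tube output, and adjust the thickness until the output matches the measurement. In this hedged sketch the spectrum, the Al attenuation coefficients, and the fluence-to-dose weights are placeholder arrays that a real implementation (such as spektr itself) would take from tabulated data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def tube_output(t_al_mm, spectrum, mu_al_per_cm, dose_per_fluence):
    """Tube output (e.g., mGy/mAs) after filtering the spectrum by t_al_mm of Al."""
    filtered = spectrum * np.exp(-mu_al_per_cm * 0.1 * t_al_mm)   # mm -> cm
    return np.sum(filtered * dose_per_fluence)

def fit_added_filtration(measured, spectrum, mu_al_per_cm, dose_per_fluence):
    """Find the Al thickness (mm) whose computed output best matches the measurement."""
    loss = lambda t: (tube_output(t, spectrum, mu_al_per_cm,
                                  dose_per_fluence) - measured) ** 2
    return minimize_scalar(loss, bounds=(0.0, 10.0), method="bounded").x
```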

  13. Technical Note: SPEKTR 3.0—A computational tool for x-ray spectrum modeling and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punnoose, J.; Xu, J.; Sisniega, A.

    2016-08-15

    Purpose: A computational toolkit (SPEKTR 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS) algorithm, updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP). The toolkit includes a MATLAB (The Mathworks, Natick, MA) function library and improved user interface (UI) along with an optimization algorithm to match calculated beam quality with measurements. Methods: The SPEKTR code generates x-ray spectra (photons/mm²/mAs at 100 cm from the source) using TASMICS as default (with TASMIP as an option) in 1 keV energy bins over beam energies 20–150 kV, extensible to 640 kV using the TASMICS spectra. An optimization tool was implemented to compute the added filtration (Al and W) that provides a best match between calculated and measured x-ray tube output (mGy/mAs or mR/mAs) for individual x-ray tubes that may differ from that assumed in TASMICS or TASMIP and to account for factors such as anode angle. Results: The median percent difference in photon counts for a TASMICS and TASMIP spectrum was 4.15% for tube potentials in the range 30–140 kV, with the largest percentage difference arising in the low and high energy bins due to measurement errors in the empirically based TASMIP model and inaccurate polynomial fitting. The optimization tool reported a close agreement between measured and calculated spectra with a Pearson coefficient of 0.98. Conclusions: The computational toolkit, SPEKTR, has been updated to version 3.0, validated against measurements and existing models, and made available as open source code. Video tutorials for the SPEKTR function library, UI, and optimization tool are available.

  14. Fast semivariogram computation using FPGA architectures

    NASA Astrophysics Data System (ADS)

    Lagadapati, Yamuna; Shirvaikar, Mukul; Dong, Xuanliang

    2015-02-01

    The semivariogram is a statistical measure of the spatial distribution of data and is based on Markov Random Fields (MRFs). Semivariogram analysis is a computationally intensive algorithm that has typically seen applications in the geosciences and remote sensing areas. Recently, applications in the area of medical imaging have been investigated, resulting in the need for efficient real-time implementation of the algorithm. The semivariogram is a plot of semivariances for different lag distances between pixels. A semivariance, γ(h), is defined as half the expected squared difference of pixel values between any two data locations with a lag distance of h. Due to the need to examine each pair of pixels in the image or sub-image being processed, the base algorithm complexity for an image window with n pixels is O(n²). Field Programmable Gate Arrays (FPGAs) are an attractive solution for such demanding applications due to their parallel processing capability. FPGAs also tend to operate at relatively modest clock rates measured in a few hundreds of megahertz, but they can perform tens of thousands of calculations per clock cycle while operating in the low range of power. This paper presents a technique for the fast computation of the semivariogram using two custom FPGA architectures. The design consists of several modules dedicated to the constituent computational tasks. A modular architecture approach is chosen to allow for replication of processing units. This allows for high throughput due to concurrent processing of pixel pairs. The current implementation is focused on isotropic semivariogram computations only. An anisotropic semivariogram implementation is anticipated to be an extension of the current architecture, ostensibly based on refinements to the current modules. The algorithm is benchmarked using VHDL on a Xilinx XUPV5-LX110T development kit, which utilizes the Virtex5 FPGA. Medical image data from MRI scans are utilized for the experiments. Computational speedup is measured with respect to a MATLAB implementation on a personal computer with an Intel i7 multi-core processor. Preliminary simulation results indicate that a significant advantage in speed can be attained by the architectures, making the algorithm viable for implementation in medical devices.
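
    For reference, a direct O(n²) software implementation of the isotropic semivariance defined above is sketched below; this is the baseline pairwise computation that the FPGA architectures parallelize, with an illustrative lag-binning tolerance.

```python
import numpy as np

def semivariogram(values, coords, lags, tol=0.5):
    """gamma(h): mean of 0.5 * (z_i - z_j)^2 over all pixel pairs whose
    separation distance falls within tol of the lag h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)   # count each pair once
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)
```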

  15. Relative-Error-Covariance Algorithms

    NASA Technical Reports Server (NTRS)

    Bierman, Gerald J.; Wolff, Peter J.

    1991-01-01

    Two algorithms compute the error covariance of the difference between optimal estimates, based on data acquired during overlapping or disjoint intervals, of the state of a discrete linear system. This provides a quantitative measure of the mutual consistency or inconsistency of the estimates of states. The relative-error-covariance concept is applied to determine the degree of correlation between trajectories calculated from two overlapping sets of measurements and to construct a real-time test of the consistency of state estimates based upon recently acquired data.

  16. Investigating Measurement Invariance in Computer-Based Personality Testing: The Impact of Using Anchor Items on Effect Size Indices

    ERIC Educational Resources Information Center

    Egberink, Iris J. L.; Meijer, Rob R.; Tendeiro, Jorge N.

    2015-01-01

    A popular method to assess measurement invariance of a particular item is based on likelihood ratio tests with all other items as anchor items. The results of this method are often only reported in terms of statistical significance, and researchers proposed different methods to empirically select anchor items. It is unclear, however, how many…

  17. Accuracy of volumetric measurement of simulated root resorption lacunas based on cone beam computed tomography.

    PubMed

    Wang, Y; He, S; Guo, Y; Wang, S; Chen, S

    2013-08-01

    To evaluate the accuracy of volumetric measurement of simulated root resorption cavities based on cone beam computed tomography (CBCT), in comparison with that of Micro-computed tomography (Micro-CT) which served as the reference. The State Key Laboratory of Oral Diseases at Sichuan University. Thirty-two bovine teeth were included for standardized CBCT scanning and Micro-CT scanning before and after the simulation of different degrees of root resorption. The teeth were divided into three groups according to the depths of the root resorption cavity (group 1: 0.15, 0.2, 0.3 mm; group 2: 0.6, 1.0 mm; group 3: 1.5, 2.0, 3.0 mm). Each depth included four specimens. Differences in tooth volume before and after simulated root resorption were then calculated from CBCT and Micro-CT scans, respectively. The overall between-method agreement of the measurements was evaluated using the concordance correlation coefficient (CCC). For the first group, the average volume of resorption cavity was 1.07 mm³, and the between-method agreement of measurement for the volume changes was low (CCC = 0.098). For the second and third groups, the average volumes of resorption cavities were 3.47 and 6.73 mm³ respectively, and the between-method agreements were good (CCC = 0.828 and 0.895, respectively). The accuracy of 3-D quantitative volumetric measurement of simulated root resorption based on CBCT was fairly good in detecting simulated resorption cavities larger than 3.47 mm³, while it was not sufficient for measuring resorption cavities smaller than 1.07 mm³. This method could be applied in future studies of root resorption although further studies are required to improve its accuracy. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Toward computer-aided emphysema quantification on ultralow-dose CT: reproducibility of ventrodorsal gravity effect measurement and correction

    NASA Astrophysics Data System (ADS)

    Wiemker, Rafael; Opfer, Roland; Bülow, Thomas; Rogalla, Patrik; Steinberg, Amnon; Dharaiya, Ekta; Subramanyan, Krishna

    2007-03-01

    Computer-aided quantification of emphysema in high resolution CT data is based on identifying low attenuation areas below clinically determined Hounsfield thresholds. However, the emphysema quantification is prone to error since a gravity effect can influence the mean attenuation of healthy lung parenchyma by up to ±50 HU between ventral and dorsal lung areas. Comparing ultra-low-dose (7 mAs) and standard-dose (70 mAs) CT scans of each patient, we show that measurement of the ventrodorsal gravity effect is patient specific but reproducible. It can be measured and corrected in an unsupervised way using robust fitting of a linear function.

  19. A computer simulation approach to measurement of human control strategy

    NASA Technical Reports Server (NTRS)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.

  20. Device and method for measuring multi-phase fluid flow in a conduit using an elbow flow meter

    DOEpatents

    Ortiz, Marcos G.; Boucher, Timothy J.

    1997-01-01

    A system for measuring fluid flow in a conduit. The system utilizes pressure transducers disposed generally in line upstream and downstream of the flow of fluid in a bend in the conduit. Data from the pressure transducers are transmitted to a microprocessor or computer. The pressure differential measured by the pressure transducers is then used to calculate the fluid flow rate in the conduit. Control signals may then be generated by the microprocessor or computer to control flow, total fluid dispersed (in, for example, an irrigation system), area of dispersal, or other desired effect based on the fluid flow in the conduit.
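
    As a rough single-phase illustration of the elbow-meter principle (not the patented multiphase method itself), the flow rate scales with the square root of the measured pressure differential across the bend; the calibration constant below would be determined empirically for a given elbow geometry.

```python
import math

def elbow_flow_rate(dp_pa, density_kg_m3, k_elbow):
    """Volumetric flow rate (m^3/s) from the differential pressure (Pa)
    measured across the bend in the conduit."""
    return k_elbow * math.sqrt(2.0 * abs(dp_pa) / density_kg_m3)

# Example: 2 kPa differential, water, assumed calibration constant 1e-3 m^2
print(elbow_flow_rate(2000.0, 998.0, 1e-3))   # ~0.002 m^3/s
```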

  1. Analysis of pressure-flow data in terms of computer-derived urethral resistance parameters.

    PubMed

    van Mastrigt, R; Kranse, M

    1995-01-01

    The simultaneous measurement of detrusor pressure and flow rate during voiding is at present the only way to measure or grade infravesical obstruction objectively. Numerous methods have been introduced to analyze the resulting data. These methods differ in aim (measurement of urethral resistance and/or diagnosis of obstruction), method (manual versus computerized data processing), theory or model used, and resolution (continuously variable parameters or a limited number of classes, the so-called nomogram). In this paper, some aspects of these fundamental differences are discussed and illustrated. Subsequently, the properties and clinical performance of two computer-based methods for deriving continuous urethral resistance parameters are treated.

  2. A trunk ranging system based on binocular stereo vision

    NASA Astrophysics Data System (ADS)

    Zhao, Xixuan; Kan, Jiangming

    2017-07-01

    Trunk ranging is an essential function for autonomous forestry robots. Traditional trunk ranging systems based on personal computers are not convenient in practical application. This paper examines the implementation of a trunk ranging system based on binocular vision theory using TI's DaVinci DM37x system. The system is smaller and more reliable than one implemented on a personal computer. It calculates three-dimensional information from the images acquired by the binocular cameras, producing the targeting and ranging results. The experimental results show that the measurement error is small and the system design is feasible for autonomous forestry robots.
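
    The core ranging relation in binocular stereo is depth = focal length × baseline / disparity. A minimal sketch with an assumed calibration follows; the paper's actual pipeline (rectification, trunk detection, disparity estimation) is omitted.

```python
def trunk_distance(focal_px, baseline_m, disparity_px):
    """Depth (m) of a matched trunk feature from its stereo disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_m / disparity_px

# Example with an assumed calibration: f = 800 px, baseline = 12 cm
print(trunk_distance(800.0, 0.12, 24.0))   # 4.0 m to the trunk
```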

  3. Sub-domain methods for collaborative electromagnetic computations

    NASA Astrophysics Data System (ADS)

    Soudais, Paul; Barka, André

    2006-06-01

    In this article, we describe a sub-domain method for electromagnetic computations based on the boundary element method. The benefits of the sub-domain method are that the computation can be split between several companies for collaborative studies, and that the computation time can be reduced by one or more orders of magnitude, especially in the context of parametric studies. The accuracy and efficiency of this technique are assessed by RCS computations on an aircraft air intake with duct and rotating engine mock-up called CHANNEL. Collaborative results, obtained by combining two sets of sub-domains computed by two companies, are compared with measurements on the CHANNEL mock-up. The comparisons are made for several angular positions of the engine to show the benefits of the method for parametric studies. We also discuss the accuracy of two formulations of the sub-domain connecting scheme using edge-based or modal field expansion. To cite this article: P. Soudais, A. Barka, C. R. Physique 7 (2006).

  4. Aeroelastic Modeling of a Nozzle Startup Transient

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2014-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during test. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damage if structural strengthening measures are not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and fluid. The objective of this study is to develop a tightly coupled aeroelastic modeling algorithm by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, while the computational structural dynamics component is developed under the framework of modal analysis. Transient aeroelastic nozzle startup analyses at sea level were performed, and the computed transient nozzle fluid-structure interaction physics are presented.

  5. Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2013-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during test. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damage if structural strengthening measures are not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and fluid. The objective of this study is to develop a coupled aeroelastic modeling capability by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, while the computational structural dynamics component is developed in the framework of modal analysis. Transient aeroelastic nozzle startup analyses of the Block I Space Shuttle Main Engine at sea level were performed. The computed results from the aeroelastic nozzle modeling are presented.

  6. Influence of different setups of the Frankfort horizontal plane on 3-dimensional cephalometric measurements.

    PubMed

    Santos, Rodrigo Mologni Gonçalves Dos; De Martino, José Mario; Haiter Neto, Francisco; Passeri, Luis Augusto

    2017-08-01

    The Frankfort horizontal (FH) is a plane that intersects both porions and the left orbitale. However, other combinations of points have also been used to define this plane in 3-dimensional cephalometry. These variations are based on the hypothesis that they do not affect the cephalometric analysis. We investigated the validity of this hypothesis. The material included cone-beam computed tomography data sets of 82 adult subjects with Class I molar relationship. A third-party method of cone-beam computed tomography-based 3-dimensional cephalometry was performed using 7 setups of the FH plane. Six lateral cephalometric hard tissue measurements relative to the FH plane were carried out for each setup. Measurement differences were calculated for each pair of setups of the FH plane. The number of occurrences of differences greater than the limits of agreement was counted for each of the 6 measurements. Only 3 of 21 pairs of setups had no occurrences for the 6 measurements. No measurement had no occurrences for the 21 pairs of setups. Setups based on left or right porion and both orbitales had the greatest number of occurrences for the 6 measurements. This investigation showed that significant and undesirable measurement differences can be produced by varying the definition of the FH plane. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  7. Measurement of smaller colon polyp in CT colonography images using morphological image processing.

    PubMed

    Manjunath, K N; Siddalingaswamy, P C; Prabhu, G K

    2017-11-01

    Automated measurement of the size and shape of colon polyps is one of the challenges in computed tomography colonography (CTC). The objective of this retrospective study was to improve the sensitivity and specificity of smaller polyp measurement in CTC using image processing techniques. A domain knowledge-based method has been implemented with a hybrid method of colon segmentation, morphological image processing operators for detecting the colonic structures, and a decision-making system for delineating the smaller polyps based on a priori knowledge. The method was applied to 45 CTC datasets. The key finding was that the smaller polyps were accurately measured. In addition to the 6-9 mm range, polyps of even <5 mm were also detected. The results were validated qualitatively and quantitatively using both 2D MPR and 3D views. Implementation was done on a high-performance computer with parallel processing. It takes [Formula: see text] min for measuring the smaller polyp in a dataset of 500 CTC images. With this method, [Formula: see text] and [Formula: see text] were achieved. The domain-based approach with morphological image processing has given good results. The smaller polyps were measured accurately, which helps in making right clinical decisions. Qualitatively and quantitatively, the results were acceptable when compared to the ground truth at [Formula: see text].

  8. Computation of misalignment and primary mirror astigmatism figure error of two-mirror telescopes

    NASA Astrophysics Data System (ADS)

    Gu, Zhiyuan; Wang, Yang; Ju, Guohao; Yan, Changxiang

    2018-01-01

    Active optics currently uses computation models based on numerical methods to correct misalignments and figure errors. These methods can hardly lead to any insight into the aberration field dependencies that arise in the presence of misalignments. An analytical alignment model based on third-order nodal aberration theory is presented for this problem, which can be utilized to compute the primary mirror astigmatic figure error and misalignments for two-mirror telescopes. Alignment simulations are conducted for an R-C telescope based on this analytical alignment model. It is shown that in the absence of wavefront measurement errors, wavefront measurements at only two field points are enough, and the correction process can be completed with only one alignment action. In the presence of wavefront measurement errors, increasing the number of field points for wavefront measurements can enhance the robustness of the alignment model. Monte Carlo simulation shows that, when -2 mm ≤ linear misalignment ≤ 2 mm, -0.1 deg ≤ angular misalignment ≤ 0.1 deg, and -0.2 λ ≤ astigmatism figure error (expressed as fringe Zernike coefficients C5/C6, λ = 632.8 nm) ≤ 0.2 λ, the misaligned systems can be corrected to be close to the nominal state without wavefront testing error. In addition, the root mean square deviation of the RMS wavefront error of all the misaligned samples after being corrected is linearly related to the wavefront testing error.

  9. Application of mobile computers in a measuring system supporting examination of posture diseases

    NASA Astrophysics Data System (ADS)

    Piekarski, Jacek; Klimiec, Ewa; Zaraska, Wiesław

    2013-07-01

    A measuring system designed and manufactured by the authors and based on mobile computers (smartphones and tablets) working as data recorders has been developed to support the diagnosis of orthopedic diseases, especially those of the feet. The basic idea is to examine a patient in his natural environment, during usual activities such as walking or running. The paper describes the proposed system with sensors manufactured from piezoelectric film (PVDF film) and placed in the shoe insole. The mechanical reliability of PVDF film is excellent, though elimination of the pyroelectric effect is required. A possible solution of the problem and the test results are presented in the paper. Data recording is based on wireless transmission to a mobile device used as a data logger.

  10. An analytical and experimental evaluation of a Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. A.; Cosby, R. M.

    1976-01-01

    An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range was performed. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile spreading problem and should enable improved analytical-experimental correlation.

  11. Initial Progress Toward Development of a Voice-Based Computer-Delivered Motivational Intervention for Heavy Drinking College Students: An Experimental Study.

    PubMed

    Kahler, Christopher W; Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L

    2017-06-28

    Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow. The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users' verbal responses, more closely mirroring a human-delivered motivational intervention. We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up. Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking. Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may mirror more closely human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered intervention, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. ©Christopher W Kahler, William J Lechner, James MacGlashan, Tyler B Wray, Michael L Littman. Originally published in JMIR Mental Health (http://mental.jmir.org), 28.06.2017.

  12. Parallel computing works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    An account of the Caltech Concurrent Computation Program (C³P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  13. Use of a turbine in a breath-by-breath computer-based respiratory measurement system.

    PubMed

    Venkateswaran, R S; Gallagher, R R

    1997-01-01

    The Computer-Based Respiratory Measurement System (CBRMS) is capable of analyzing individual breaths to monitor the kinetics of oxygen uptake, carbon dioxide production, tidal volumes, pulmonary ventilation, and other respiratory parameters during rest, exercise, and recovery. Respiratory gas volumes are measured by a calibrated turbine transducer while the respiratory gas concentrations are measured by a calibrated, fast-responding medical gas analyzer. To improve accuracy of the results, the inspiratory volumes and gas concentrations are measured and not assumed to be equal to expiratory volumes or ambient concentrations respectively. The respiratory gas volumes and concentration signals are digitized and stored in arrays. The gas volumes are converted to flow signals by software differentiation. These digitized data arrays are stored as files in a personal computer. Time alignment of the flow and gas concentration signals is performed at each breath for maximum accuracy in analysis. For system verification, data were obtained under resting conditions and under constant load exercises at 50 W, 100 W, and 150 W. These workloads were performed by a healthy, male subject on a bicycle ergometer. A strong correlation existed between the CBRMS steady-state results and the standard end-expirate bag collection technique. Thus, there is reason to believe that the CBRMS is capable of calculating respiratory transient responses accurately, a significant contribution to an understanding of total respiratory system function.
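
    The software-differentiation step mentioned above (converting the turbine's volume signal into a flow signal) can be illustrated with a simple numerical gradient; the sampling rate and the synthetic volume waveform are assumptions for demonstration only.

```python
import numpy as np

fs = 100.0                                             # assumed sampling rate (Hz)
t = np.arange(0.0, 10.0, 1.0 / fs)
volume = 0.5 * (1.0 - np.cos(2.0 * np.pi * 0.25 * t))  # synthetic tidal volume (L)
flow = np.gradient(volume, 1.0 / fs)                   # differentiated flow (L/s)
```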

  14. The Individual Virtual Eye: a Computer Model for Advanced Intraocular Lens Calculation

    PubMed Central

    Einighammer, Jens; Oltrup, Theo; Bende, Thomas; Jean, Benedikt

    2010-01-01

    Purpose To describe the individual virtual eye, a computer model of a human eye with respect to its optical properties. It is based on measurements of an individual person, and one of its major applications is the calculation of intraocular lenses (IOLs) for cataract surgery. Methods The model is constructed from an eye's geometry, including axial length and topographic measurements of the anterior corneal surface. All optical components of a pseudophakic eye are modeled with computer scientific methods. A spline-based interpolation method efficiently includes data from corneal topographic measurements. The geometrical optical properties, such as the wavefront aberration, are simulated with real ray-tracing using Snell's law. Optical components can be calculated using computer scientific optimization procedures. The geometry of customized aspheric IOLs was calculated for 32 eyes and the resulting wavefront aberration was investigated. Results The more complex the calculated IOL is, the lower the residual wavefront error is. Spherical IOLs are only able to correct for the defocus, while toric IOLs also eliminate astigmatism. Spherical aberration is additionally reduced by aspheric and toric aspheric IOLs. The efficient implementation of time-critical numerical ray-tracing and optimization procedures allows for short calculation times, which may lead to a practicable method integrated into a device. Conclusions The individual virtual eye allows for simulations and calculations regarding geometrical optics for individual persons. This leads to clinical applications like IOL calculation, with the potential to overcome the limitations of current calculation methods that are based on paraxial optics, as exemplified by the calculation of customized aspheric IOLs.
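
    Real ray-tracing through the model's refracting surfaces rests on the vectorial form of Snell's law. The sketch below implements that standard formula (it is not the authors' code); d is the unit incident direction and m the unit surface normal pointing toward the incident medium.

```python
import numpy as np

def refract(d, m, n1, n2):
    """Refract unit direction d at a surface with unit normal m,
    passing from refractive index n1 into n2 (vectorial Snell's law)."""
    r = n1 / n2
    c = -np.dot(m, d)                      # cosine of the incidence angle
    disc = 1.0 - r**2 * (1.0 - c**2)
    if disc < 0.0:
        return None                        # total internal reflection
    return r * d + (r * c - np.sqrt(disc)) * m

# Example: a ray entering the cornea (n ~ 1.376) from air at ~30 degrees
d = np.array([np.sin(np.radians(30.0)), 0.0, -np.cos(np.radians(30.0))])
print(refract(d, np.array([0.0, 0.0, 1.0]), 1.0, 1.376))
```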

  15. Accuracy and efficiency of computer-aided anatomical analysis using 3D visualization software based on semi-automated and automated segmentations.

    PubMed

    An, Gao; Hong, Li; Zhou, Xiao-Bing; Yang, Qiong; Li, Mei-Qing; Tang, Xiang-Yang

    2017-03-01

    We investigated and compared the functionality of two 3D visualization software packages, provided by a CT vendor and a third-party vendor, respectively. Using surgical anatomical measurement as the baseline, we evaluated the accuracy of 3D visualization and verified their utility in computer-aided anatomical analysis. The study cohort consisted of 50 adult cadavers fixed with the classical formaldehyde method. The computer-aided anatomical analysis was based on CT images (in DICOM format) acquired by helical scan with contrast enhancement, using a 3D visualization workstation provided by a CT vendor (Syngo) and a third-party 3D visualization software package (Mimics) installed on a PC. Automated and semi-automated segmentations were utilized in the 3D visualization workstation and software, respectively. The functionality and efficiency of the automated and semi-automated segmentation methods were compared. Using surgical anatomical measurement as a baseline, the accuracy of 3D visualization based on automated and semi-automated segmentations was quantitatively compared. In semi-automated segmentation, the Mimics 3D visualization software outperformed the Syngo 3D visualization workstation. No significant difference was observed in anatomical data measurement between the Syngo 3D visualization workstation and the Mimics 3D visualization software (P>0.05). Both the Syngo 3D visualization workstation provided by a CT vendor and the Mimics 3D visualization software from a third-party vendor possessed the functionality, efficiency, and accuracy needed for computer-aided anatomical analysis. Copyright © 2016 Elsevier GmbH. All rights reserved.

  16. Automatic small bowel tumor diagnosis by using multi-scale wavelet-based analysis in wireless capsule endoscopy images.

    PubMed

    Barbosa, Daniel C; Roupar, Dalila B; Ramos, Jaime C; Tavares, Adriano C; Lima, Carlos S

    2012-01-11

    Wireless capsule endoscopy has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy is unable to. However, the output of this technique is an 8-hour video, whose analysis by the expert physician is very time consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economic opportunity. The set of features proposed in this paper to code textural information is based on statistical modeling of second order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of second order textural measures, higher order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second and higher order moments of textural measures are computed from the co-occurrence matrices of images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced by using Principal Component Analysis. The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study regarding the applicability of this algorithm in computer-aided diagnosis systems to assist physicians in their clinical practice.
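
    The second-order textural measures at the heart of the proposed features come from gray-level co-occurrence matrices. The scikit-image sketch below computes such measures for a grayscale image; the wavelet scale selection and the higher-order statistical modeling described in the paper are not reproduced here.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_u8, distances=(1, 2), angles=(0.0, np.pi / 2)):
    """Second-order texture measures from co-occurrence matrices of an
    8-bit grayscale image."""
    glcm = graycomatrix(gray_u8, distances=distances, angles=angles,
                        levels=256, symmetric=True, normed=True)
    return {prop: graycoprops(glcm, prop).ravel()
            for prop in ("contrast", "correlation", "energy", "homogeneity")}
```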

  17. Reconstruction-free sensitive wavefront sensor based on continuous position sensitive detectors.

    PubMed

    Godin, Thomas; Fromager, Michael; Cagniot, Emmanuel; Brunel, Marc; Aït-Ameur, Kamel

    2013-12-01

    We propose a new device that is able to perform highly sensitive wavefront measurements based on the use of continuous position sensitive detectors and without resorting to any reconstruction process. We demonstrate experimentally its ability to measure small wavefront distortions through the characterization of pump-induced refractive index changes in laser material. In addition, it is shown using computer-generated holograms that this device can detect phase discontinuities as well as improve the quality of sharp phase variations measurements. Results are compared to reference Shack-Hartmann measurements, and dramatic enhancements are obtained.

  18. Mirror Measurement Device

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A Small Business Innovation Research (SBIR) contract led to a commercially available instrument used to measure the shape profile of mirror surfaces in scientific instruments. Bauer Associates, Inc.'s Bauer Model 200 Profilometer is based upon a different measurement concept. The local curvature of the mirror's surface is measured at many points, and the collection of data is computer processed to yield the desired shape profile. (Earlier profilometers are based on the principle of interferometry.) The system is accurate and immune to problems like vibration and turbulence. Two profilometers are currently marketed, and a third will soon be commercialized.

  19. Measurement-based quantum communication with resource states generated by entanglement purification

    NASA Astrophysics Data System (ADS)

    Wallnöfer, J.; Dür, W.

    2017-01-01

    We investigate measurement-based quantum communication with noisy resource states that are generated by entanglement purification. We consider the transmission of encoded information via noisy quantum channels using a measurement-based implementation of encoding, error correction, and decoding. We show that such an approach offers advantages over direct transmission, gate-based error correction, and measurement-based schemes with direct generation of resource states. We analyze the noise structure of resource states generated by entanglement purification and show that a local error model, i.e., noise acting independently on all qubits of the resource state, is a good approximation in general, and provides an exact description for Greenberger-Horne-Zeilinger states. The latter are resources for a measurement-based implementation of error-correction codes for bit-flip or phase-flip errors. This provides an approach to link the recently found very high thresholds for fault-tolerant measurement-based quantum information processing based on local error models for resource states with error thresholds for gate-based computational models.

  20. Home-Based Computer Gaming in Vestibular Rehabilitation of Gaze and Balance Impairment.

    PubMed

    Szturm, Tony; Reimer, Karen M; Hochman, Jordan

    2015-06-01

    Disease or damage of the vestibular sense organs causes a range of distressing symptoms and functional problems that can include loss of balance, gaze instability, disorientation, and dizziness. A novel computer-based rehabilitation system with a therapeutic gaming application has been developed. This method allows different gaze and head movement exercises to be coupled to a wide range of inexpensive, commercial computer games. It can be used in standing, and thus graded balance demands using a sponge pad can be incorporated into the program. A case series pre- and postintervention study was conducted of nine adults diagnosed with peripheral vestibular dysfunction who received a 12-week home rehabilitation program. The feasibility and usability of the home computer-based therapeutic program were established. Study findings revealed that using head rotation to interact with computer games, when coupled to demanding balance conditions, resulted in significant improvements in standing balance, dynamic visual acuity, gaze control, and walking performance. Perception of dizziness as measured by the Dizziness Handicap Inventory also decreased significantly. These preliminary findings provide support that a low-cost home game-based exercise program is well suited to train standing balance and gaze control (with active and passive head motion).

  1. BROJA-2PID: A Robust Estimator for Bivariate Partial Information Decomposition

    NASA Astrophysics Data System (ADS)

    Makkeh, Abdullah; Theis, Dirk; Vicente, Raul

    2018-04-01

    Makkeh, Theis, and Vicente found in [8] that the cone programming model is the most robust way to compute the Bertschinger et al. partial information decomposition (BROJA PID) measure [1]. We developed robust, production-quality software that computes the BROJA PID measure based on the cone programming model. In this paper, we prove the important property of strong duality for the cone program and prove an equivalence between the cone program and the original convex problem. We then describe our software in detail and explain how to use it.

  2. Experimental Validation Data for Computational Fluid Dynamics of Forced Convection on a Vertical Flat Plate

    DOE PAGES

    Harris, Jeff R.; Lance, Blake W.; Smith, Barton L.

    2015-08-10

    We present a computational fluid dynamics (CFD) validation dataset for turbulent forced convection on a vertical plate. The design of the apparatus is based on recent validation literature and provides a means to simultaneously measure boundary conditions (BCs) and system response quantities (SRQs). Important inflow quantities for Reynolds-averaged Navier-Stokes (RANS) CFD are also measured. Data are acquired at two heating conditions and cover the ranges 40,000 < Re_x < 300,000, 357 < Re_δ2 < 813, and 0.02 < Gr/Re² < 0.232.

  3. A laser-based ice shape profilometer for use in icing wind tunnels

    NASA Technical Reports Server (NTRS)

    Hovenac, Edward A.; Vargas, Mario

    1995-01-01

    A laser-based profilometer was developed to measure the thickness and shape of ice accretions on the leading edge of airfoils and other models in icing wind tunnels. The instrument is a hand-held device that is connected to a desktop computer with a 10-meter cable. It projects a laser line onto an ice shape and uses solid-state cameras to detect the light scattered by the ice. The instrument corrects the image for camera-angle distortions, displays an outline of the ice shape on the computer screen, saves the data to disk, and can print a full-scale drawing of the ice shape. The profilometer has undergone extensive testing in the laboratory and in the NASA Lewis Icing Research Tunnel. Results of the tests show very good agreement between profilometer measurements and known simulated ice shapes, and fair agreement between profilometer measurements and hand-tracing techniques.

  4. Short-term balance training with computer-based feedback in children with cerebral palsy: A feasibility and pilot randomized trial.

    PubMed

    Saxena, Shikha; Rao, Bhamini K; Senthil, Kumaran D

    2017-04-01

    To assess the feasibility of using short-term balance training with computer-based visual feedback (BTVF) and its effect on standing balance in children with bilateral spastic cerebral palsy (BSCP). Of the fourteen children with BSCP (mean age = 10.31 years), seven children received four sessions of BTVF (two sessions/day, each session = 15 min), in comparison to a control group that received standard care. Feasibility was measured as percentages of recruitment, retention, and safety, and balance was measured using a posturography machine as sway velocity (m/s) and velocity moment (m/s^2) during quiet standing. No serious adverse events occurred in either group. There were no differences in retention percentages or in any clinical outcome measure between the two groups. Use of BTVF is feasible in children with BSCP, but further investigation is required to estimate a dose-effect relationship.

  5. Blind Quantum Signature with Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Ronghua; Guo, Ying

    2017-04-01

    Blind quantum computation allows a client without quantum abilities to interact with a quantum server to perform an unconditionally secure computing protocol while protecting the client's privacy. Motivated by the confidentiality of blind quantum computation, a blind quantum signature scheme with a laconic structure is designed. Different from traditional signature schemes, the signing and verifying operations are performed through measurement-based quantum computation. Inputs of blind quantum computation are securely controlled with multi-qubit entangled states. The unique signature of the transmitted message is generated by the signer without leaking information in imperfect channels, and the receiver can verify the validity of the signature using the quantum matching algorithm. The security is guaranteed by the entanglement of the quantum system for blind quantum computation. It provides a potential practical application for e-commerce in cloud computing and first-generation quantum computation.

  6. Laplace Transform Based Radiative Transfer Studies

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.

    2006-12-01

    Multiple scattering is the major uncertainty in data analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects that are dominated by single scattering, where photons from the laser beam scatter only once with particles in the atmosphere before reaching the receiver and a simple linear relationship between physical property and lidar signal exists. In reality, multiple scattering is always a factor in space-based lidar measurement, and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy, and phytoplankton. While multiple scattering produces clear signals, the lack of a fast-enough lidar multiple-scattering computation tool forces us to treat the signal as unwanted "noise" and to use simple multiple-scattering correction schemes to remove it. Such treatments waste the multiple-scattering signals and may cause orders-of-magnitude errors in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple-scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering is computed with Monte Carlo simulations, which take minutes to hours, are too slow for interactive satellite data analysis, and can only be used to support system/algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows. (1) Physics solution: perform Laplace transforms on the time and spatial dimensions and a Fourier transform on the viewing azimuth dimension, converting the solution of the radiative transfer differential equation into a fast matrix inversion problem; the majority of the radiative transfer computation then goes into matrix inversion, FFTs, and inverse Laplace transforms. (2) Hardware solution: perform the well-defined matrix inversions, FFTs, and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves the data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of this approach, preliminary results based on SRC's FPGA-based Mapstation, and how we may apply it to CALIPSO data analysis.
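
    The transform-then-invert workflow described above can be illustrated on a toy linear system: after a Laplace transform in time, the differential equation becomes a matrix inversion per transform variable, and the time-domain answer is recovered by numerical inversion. A sketch using mpmath (a generic illustration, not the authors' FPGA implementation; the 2 × 2 system matrix is assumed):

    ```python
    import mpmath as mp

    # Toy linear system dx/dt = A x, x(0) = x0; in the Laplace domain this
    # becomes the matrix equation (s I - A) X(s) = x0, i.e. a matrix inversion.
    A = mp.matrix([[-1.0, 0.5], [0.0, -2.0]])
    x0 = mp.matrix([1.0, 0.0])

    def X_hat(s, i):
        M = s * mp.eye(2) - A          # linear operator in the Laplace domain
        return mp.lu_solve(M, x0)[i]   # the "matrix inversion" step

    t = 0.7
    x1 = mp.invertlaplace(lambda s: X_hat(s, 0), t, method='talbot')
    print(x1, mp.e**(-t))  # first component is exactly exp(-t) for this A, x0
    ```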

  7. Universal Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Fitzsimons, Joseph; Kashefi, Elham

    2012-02-01

    Blind Quantum Computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's inputs, outputs and computation remain private. Recently we proposed a universal unconditionally secure BQC scheme, based on the conceptual framework of the measurement-based quantum computing model, where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. Here we present a refinement of the scheme which vastly expands the class of quantum circuits which can be directly implemented as a blind computation, by introducing a new class of resource states which we term dotted-complete graph states and expanding the set of single qubit states the client is required to prepare. These two modifications significantly simplify the overall protocol and remove the previously present restriction that only nearest-neighbor circuits could be implemented as blind computations directly. As an added benefit, the refined protocol admits a substantially more intuitive and simplified verification mechanism, allowing the correctness of a blind computation to be verified with arbitrarily small probability of error.

  8. EDGE COMPUTING AND CONTEXTUAL INFORMATION FOR THE INTERNET OF THINGS SENSORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Levente

    Interpreting sensor data requires knowledge about sensor placement and the surrounding environment. For a single sensor measurement, it is easy to document the context by visual observation; however, for millions of sensors reporting data back to a server, the contextual information needs to be extracted automatically, either from data analysis or by leveraging complementary data sources. Data layers that overlap spatially or temporally with sensor locations can be used to extract the context and to validate the measurement. To minimize the amount of data transmitted through the internet while preserving signal information content, two methods are explored: computation at the edge and compressed sensing. We validate the above methods on wind and chemical sensor data to (1) eliminate redundant measurements from wind sensors and (2) extract the peak value of a chemical sensor measuring a methane plume. We present a general cloud-based framework to validate sensor data based on statistical and physical modeling and contextual data extracted from geospatial data.
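
    As an example of the edge-computation step, extracting only the peak of a chemical sensor's plume signal before transmission takes a few lines of standard signal processing. A hedged sketch on synthetic data (the plume shape and thresholds are assumed; scipy.signal.find_peaks is the only dependency):

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    # Synthetic methane-plume time series: Gaussian plume passage plus noise
    t = np.linspace(0, 600, 6001)                       # 10 min at 10 Hz
    signal = 2.5 * np.exp(-((t - 300) / 20.0) ** 2)     # plume peak at t = 300 s
    signal += 0.05 * np.random.default_rng(0).standard_normal(t.size)

    # Edge computation: report only the peak value and its time, not raw data
    peaks, props = find_peaks(signal, height=0.5, prominence=0.5)
    i = peaks[np.argmax(props["peak_heights"])]
    print(f"peak {signal[i]:.2f} at t = {t[i]:.1f} s")  # one tuple instead of 6001 samples
    ```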

  9. Determining the accuracy of maximum likelihood parameter estimates with colored residuals

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Klein, Vladislav

    1994-01-01

    An important part of building high fidelity mathematical models based on measured data is calculating the accuracy associated with statistical estimates of the model parameters. Indeed, without some idea of the accuracy of parameter estimates, the estimates themselves have limited value. In this work, an expression based on theoretical analysis was developed to properly compute parameter accuracy measures for maximum likelihood estimates with colored residuals. This result is important because experience from the analysis of measured data reveals that the residuals from maximum likelihood estimation are almost always colored. The calculations involved can be appended to conventional maximum likelihood estimation algorithms. Simulated data runs were used to show that the parameter accuracy measures computed with this technique accurately reflect the quality of the parameter estimates from maximum likelihood estimation without the need for analysis of the output residuals in the frequency domain or heuristically determined multiplication factors. The result is general, although the application studied here is maximum likelihood estimation of aerodynamic model parameters from flight test data.
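
    The flavor of such a correction can be shown on an ordinary least-squares problem: when residuals are colored, the usual white-noise covariance understates parameter uncertainty, and a sandwich form built from the estimated residual autocovariance is needed. A generic numpy sketch (not the authors' aerodynamic-parameter code; the AR(1) noise and regressors are assumed):

    ```python
    import numpy as np
    from scipy.linalg import toeplitz

    rng = np.random.default_rng(1)
    N = 500
    X = np.column_stack([np.ones(N), np.linspace(0, 10, N)])  # regressors
    beta_true = np.array([1.0, 0.3])

    # Colored (AR(1)) measurement noise, as is typical of flight-test residuals
    e = np.zeros(N)
    for k in range(1, N):
        e[k] = 0.8 * e[k - 1] + rng.standard_normal()
    y = X @ beta_true + 0.1 * e

    beta = np.linalg.solve(X.T @ X, X.T @ y)
    r = y - X @ beta

    # Naive white-noise covariance (understates the uncertainty here)
    cov_white = np.linalg.inv(X.T @ X) * (r @ r) / (N - 2)

    # Sandwich estimate using the empirical residual autocovariance matrix R
    lags = np.array([r[:N - k] @ r[k:] for k in range(N)]) / N
    R = toeplitz(lags)
    XtXi = np.linalg.inv(X.T @ X)
    cov_colored = XtXi @ X.T @ R @ X @ XtXi

    print(np.sqrt(np.diag(cov_white)), np.sqrt(np.diag(cov_colored)))
    ```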

  10. Two-dimensional (2D) displacement measurement of moving objects using a new MEMS binocular vision system

    NASA Astrophysics Data System (ADS)

    Di, Si; Lin, Hui; Du, Ruxu

    2011-05-01

    Displacement measurement of moving objects is one of the most important issues in the field of computer vision. This paper introduces a new binocular vision system (BVS) based on micro-electro-mechanical system (MEMS) technology. The eyes of the system are two microlenses fabricated on a substrate by MEMS technology. The imaging results of the two microlenses are collected by one complementary metal-oxide-semiconductor (CMOS) array. An algorithm is developed for computing the displacement. Experimental results show that as long as the object is moving in two-dimensional (2D) space, the system can effectively estimate the 2D displacement without camera calibration. It is also shown that the average error of the displacement measurement is about 3.5% at object distances ranging from 10 cm to 35 cm. Because of its low cost, small size, and simple setup, this new method is particularly suitable for 2D displacement measurement applications such as vision-based electronics assembly and biomedical cell culture.

  11. Heart rate measurement based on face video sequence

    NASA Astrophysics Data System (ADS)

    Xu, Fang; Zhou, Qin-Wu; Wu, Peng; Chen, Xing; Yang, Xiaofeng; Yan, Hong-jian

    2015-03-01

    This paper proposes a new non-contact heart rate measurement method based on photoplethysmography (PPG) theory. With this method we can measure heart rate remotely with a camera and ambient light. We collected video sequences of subjects and detected remote PPG signals from the video sequences. Remote PPG signals were analyzed with two methods, Blind Source Separation Technology (BSST) and Cross Spectral Power Technology (CSPT). BSST is a commonly used method, and CSPT is used for the first time in the study of remote PPG signals in this paper. Both methods can acquire heart rate, but compared with BSST, CSPT has a clearer physical meaning, and the computational complexity of CSPT is lower than that of BSST. Our work shows that heart rates detected by the CSPT method have good consistency with the heart rates measured by a finger-clip oximeter. With good accuracy and low computational complexity, the CSPT method has good prospects for application in the field of home medical devices and mobile health devices.
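
    The simplest version of such a pipeline is a spectral peak search on the mean green-channel trace. A minimal sketch in the spirit of spectral-power approaches (not an implementation of CSPT itself; a synthetic pulse signal stands in for the face video, and the frame rate and heart-rate band are assumed):

    ```python
    import numpy as np

    fs = 30.0                                 # assumed camera frame rate (Hz)
    t = np.arange(0, 30, 1 / fs)              # 30 s of "video"
    # Mean green-channel intensity per frame: a 72 bpm pulse plus noise
    g = 0.02 * np.sin(2 * np.pi * 1.2 * t)
    g += 0.01 * np.random.default_rng(2).standard_normal(t.size)

    # Detrend, then locate the spectral peak inside the plausible heart-rate band
    g = g - g.mean()
    freqs = np.fft.rfftfreq(g.size, 1 / fs)
    power = np.abs(np.fft.rfft(g)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)    # 42-240 bpm
    hr_hz = freqs[band][np.argmax(power[band])]
    print(f"estimated heart rate: {hr_hz * 60:.0f} bpm")  # ~72 bpm
    ```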

  12. Quantum computing with Majorana fermion codes

    NASA Astrophysics Data System (ADS)

    Litinski, Daniel; von Oppen, Felix

    2018-05-01

    We establish a unified framework for Majorana-based fault-tolerant quantum computation with Majorana surface codes and Majorana color codes. All logical Clifford gates are implemented with zero-time overhead. This is done by introducing a protocol for Pauli product measurements with tetrons and hexons which only requires local 4-Majorana parity measurements. An analogous protocol is used in the fault-tolerant setting, where tetrons and hexons are replaced by Majorana surface code patches, and parity measurements are replaced by lattice surgery, still only requiring local few-Majorana parity measurements. To this end, we discuss twist defects in Majorana fermion surface codes and adapt the technique of twist-based lattice surgery to fermionic codes. Moreover, we propose a family of codes that we refer to as Majorana color codes, which are obtained by concatenating Majorana surface codes with small Majorana fermion codes. Majorana surface and color codes can be used to decrease the space overhead and stabilizer weight compared to their bosonic counterparts.

  13. Breast ultrasound computed tomography using waveform inversion with source encoding

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Matthews, Thomas; Anis, Fatima; Li, Cuiping; Duric, Neb; Anastasio, Mark A.

    2015-03-01

    Ultrasound computed tomography (USCT) holds great promise for improving the detection and management of breast cancer. Because they are based on the acoustic wave equation, waveform inversion-based reconstruction methods can produce images that possess improved spatial resolution properties over those produced by ray-based methods. However, waveform inversion methods are computationally demanding and have not been applied widely in USCT breast imaging. In this work, source encoding concepts are employed to develop an accelerated USCT reconstruction method that circumvents the large computational burden of conventional waveform inversion methods. This method, referred to as the waveform inversion with source encoding (WISE) method, encodes the measurement data using a random encoding vector and determines an estimate of the speed-of-sound distribution by solving a stochastic optimization problem by use of a stochastic gradient descent algorithm. Computer-simulation studies are conducted to demonstrate the use of the WISE method. Using a single graphics processing unit card, each iteration can be completed within 25 seconds for a 128 × 128 mm^2 reconstruction region. The results suggest that the WISE method maintains the high spatial resolution of waveform inversion methods while significantly reducing the computational burden.
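
    The essence of source encoding can be seen on a linear toy problem: all sources are collapsed into one randomly weighted "encoded" source per iteration, and the encoded gradient is an unbiased estimate of the full-data gradient. A numpy caricature under that linear assumption (not the wave-equation solver used in WISE):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    J, m, n = 16, 40, 20                       # sources, data per source, unknowns
    A = rng.standard_normal((J, m, n))         # one linear "forward model" per source
    x_true = rng.standard_normal(n)
    b = np.einsum('jmn,n->jm', A, x_true)      # noiseless data, one record per source

    x = np.zeros(n)
    step = 2e-4
    for it in range(4000):
        w = rng.choice([-1.0, 1.0], size=J)        # random encoding vector
        A_enc = np.einsum('j,jmn->mn', w, A)       # single encoded operator
        b_enc = w @ b                              # single encoded data record
        grad = 2 * A_enc.T @ (A_enc @ x - b_enc)   # gradient of the encoded misfit
        x -= step * grad                           # stochastic gradient step

    print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # should be tiny
    ```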

  14. Rayleigh wave ellipticity across the Iberian Peninsula and Morocco

    NASA Astrophysics Data System (ADS)

    Gómez García, Clara; Villaseñor, Antonio

    2015-04-01

    Spectral amplitude ratios between horizontal and vertical components (H/V ratios) from seismic records are useful to evaluate site effects, predict ground motion, and invert for S velocity in the top several hundred meters. These spectral ratios can be obtained from both ambient noise and earthquakes. H/V ratios from ambient noise depend on the content and predominant wave types: body waves, Rayleigh waves, a mixture of different waves, etc. The H/V ratio computed in this way is assumed to measure Rayleigh wave ellipticity, since ambient vibrations are dominated by Rayleigh waves. H/V ratios from earthquakes can constrain the local crustal structure in the vicinity of the recording station; these ratios are based on surface wave ellipticity measurements. Although long-period (>20 s) Rayleigh-wave H/V ratios are not widely used, because large scatter has been reported and it is uncertain whether such measurements are compatible with traditional phase and group velocity measurements, we will investigate whether stable estimates can be obtained by collecting statistics over many earthquakes. We will use teleseismic events from shallow earthquakes (depth ≤ 40 km) between 2007 January 1 and 2012 December 31 with M ≥ 6, and we will compute H/V ratios for more than 400 stations from several seismic networks across the Iberian Peninsula and Morocco for periods between 20 and 100 seconds. H/V ratios from cross-correlations of ambient noise in different components will also be computed for each station pair. Shorter-period H/V ratio measurements based on ambient noise cross-correlations are strongly sensitive to near-surface structure, unlike the longer-period earthquake Rayleigh waves. The combination of ellipticity measurements based on earthquakes and ambient noise will allow us to perform a joint inversion with Rayleigh wave phase velocity. Upper crustal structure is better constrained by the joint inversion than by inversions based on phase velocities alone.
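
    The measurement itself is simple to state: at each period, the H/V ratio is the smoothed horizontal-to-vertical spectral amplitude ratio. A minimal numpy sketch (random noise stands in for real three-component traces; the quadratic-mean combination of the two horizontals is one common convention, assumed here):

    ```python
    import numpy as np

    fs = 20.0                                   # sample rate (Hz)
    rng = np.random.default_rng(4)
    n_samp = 4096
    Z, N, E = (rng.standard_normal(n_samp) for _ in range(3))  # stand-in traces

    def amp_spectrum(x):
        x = (x - x.mean()) * np.hanning(x.size)  # detrend and taper
        return np.abs(np.fft.rfft(x))

    freqs = np.fft.rfftfreq(n_samp, 1 / fs)
    H = np.sqrt(0.5 * (amp_spectrum(N) ** 2 + amp_spectrum(E) ** 2))  # quadratic mean
    V = amp_spectrum(Z)

    def smooth(a, w=9):                          # simple moving-average smoothing
        return np.convolve(a, np.ones(w) / w, mode='same')

    hv = smooth(H) / smooth(V)
    band = (freqs > 0.01) & (freqs < 0.05)       # 20-100 s periods
    print(freqs[band], hv[band])
    ```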

  15. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing.

    PubMed

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C; Chien, Tsair-Wei

    2016-01-22

    Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk.
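
    The mechanics behind the burden reduction are easy to sketch: at each step a CAT administers the unused item with maximum Fisher information at the current ability estimate, re-estimates ability, and stops once the standard error is acceptable. A dichotomous-Rasch toy version (an illustration of adaptive item selection, not the authors' PCM calibration; the item bank and stopping rule are assumed):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    bank = np.linspace(-2, 2, 30)              # Rasch item difficulties (assumed bank)
    theta_true = 0.8

    def p(theta, b):                           # Rasch probability of a correct response
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    theta, items, resp = 0.0, [], []
    for _ in range(len(bank)):
        info = p(theta, bank) * (1.0 - p(theta, bank))   # Fisher information
        info[items] = -np.inf                            # never reuse an item
        j = int(np.argmax(info))                         # most informative item
        items.append(j)
        resp.append(int(rng.random() < p(theta_true, bank[j])))  # simulated answer

        b_used = bank[items]
        for _ in range(10):                              # Newton-Raphson ML ability update
            pr = p(theta, b_used)
            g = np.sum(np.array(resp) - pr)              # score function
            h = -np.sum(pr * (1.0 - pr))                 # negative observed information
            theta = float(np.clip(theta - g / h, -4.0, 4.0))
        se = 1.0 / np.sqrt(np.sum(pr * (1.0 - pr)))
        if se < 0.55:                                    # assumed stopping rule
            break

    print(f"administered {len(items)} of {len(bank)} items; theta = {theta:.2f} (SE = {se:.2f})")
    ```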

  16. Precision of a CAD/CAM-engineered surgical template based on a facebow for orthognathic surgery: an experiment with a rapid prototyping maxillary model.

    PubMed

    Lee, Jae-Won; Lim, Se-Ho; Kim, Moon-Key; Kang, Sang-Hoon

    2015-12-01

    We examined the precision of a computer-aided design/computer-aided manufacturing (CAD/CAM)-engineered, facebow-based surgical guide template (facebow wafer) by comparing it with a bite splint-type CAD/CAM-engineered orthognathic surgical guide template (bite wafer). We used 24 rapid prototyping (RP) models of the craniofacial skeleton with maxillary deformities; twelve RP models each were used for the facebow wafer group and the bite wafer group. Experimental maxillary orthognathic surgery was performed on the RP models of both groups, and errors were evaluated through comparisons with the surgical simulations. We measured the minimum distances from three reference planes to determine the vertical, lateral, and anteroposterior errors at specific measurement points. The measured errors were compared between the experimental groups using a t test. There were significant intergroup differences in the lateral error when we compared the absolute values of the 3-D linear distance as well as the vertical, lateral, and anteroposterior errors. The bite wafer method exhibited little lateral error overall and little error in the anterior tooth region, while the facebow wafer method exhibited very little vertical error in the posterior molar region. The clinical precision of the facebow wafer method did not significantly exceed that of the bite wafer method. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Computer-based, Jeopardy™-like game in general chemistry for engineering majors

    NASA Astrophysics Data System (ADS)

    Ling, S. S.; Saffre, F.; Kadadha, M.; Gater, D. L.; Isakovic, A. F.

    2013-03-01

    We report on the design of a Jeopardy™-like computer game for enhancing the learning of general chemistry for engineering majors. While we examine several parameters of student achievement and attitude, our primary concern is the motivation of students, which tends to be low in traditionally run chemistry lectures. The effect of game-playing is tested by comparing a paper-based game quiz, which constitutes the control group, and a computer-based game quiz, constituting the treatment group. Computer-based game quizzes are Java™-based applications that students run once a week in the second part of the last lecture of the week. Overall effectiveness of the semester-long program is measured through pretest-posttest conceptual testing of general chemistry. The objective of this research is to determine to what extent this "gamification" of the course delivery and course evaluation processes may be beneficial to undergraduates' learning of science in general, and chemistry in particular. We present data addressing gender-specific differences in performance, as well as background (pre-college) levels of general science and chemistry preparation. We outline a plan to extend this approach to general physics courses and to modern-science-driven electives, and we offer live, in-lecture examples of our computer gaming experience. We acknowledge support from Khalifa University, Abu Dhabi.

  18. The Effectiveness of Gaze-Contingent Control in Computer Games.

    PubMed

    Orlov, Paul A; Apraksin, Nikolay

    2015-01-01

    Eye-tracking technology and gaze-contingent control in human-computer interaction have become an objective reality. This article reports on a series of eye-tracking experiments in which we concentrated on one aspect of gaze-contingent interaction: its effectiveness compared with mouse-based control in a computer strategy game. We propose a measure for evaluating the effectiveness of interaction based on "the time of recognition" of a game unit. In this article, we use this measure to compare gaze- and mouse-contingent systems, and we present an analysis of the differences as a function of the number of game units. Our results indicate that the performance of gaze-contingent interaction is typically higher than mouse manipulation in a visual search task. When tested on 60 subjects, the results showed that the effectiveness of gaze-contingent systems was over 1.5 times higher. In addition, we found that eye behavior remains quite stable with or without mouse interaction. © The Author(s) 2015.

  19. Learning the ideal observer for SKE detection tasks by use of convolutional neural networks (Cum Laude Poster Award)

    NASA Astrophysics Data System (ADS)

    Zhou, Weimin; Anastasio, Mark A.

    2018-03-01

    It has been advocated that task-based measures of image quality (IQ) should be employed to evaluate and optimize imaging systems. Task-based measures of IQ quantify the performance of an observer on a medically relevant task. The Bayesian Ideal Observer (IO), which employs complete statistical information of the object and noise, achieves the upper limit of the performance for a binary signal classification task. However, computing the IO performance is generally analytically intractable and can be computationally burdensome when Markov-chain Monte Carlo (MCMC) techniques are employed. In this paper, supervised learning with convolutional neural networks (CNNs) is employed to approximate the IO test statistics for a signal-known-exactly and background-known-exactly (SKE/BKE) binary detection task. The receiver operating characteristic (ROC) curve and the area under the ROC curve (AUC) are compared to those produced by the analytically computed IO. The advantages of the proposed supervised learning approach for approximating the IO are demonstrated.
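
    For intuition, in the special case of an SKE/BKE task with additive Gaussian noise the IO reduces to a linear (Hotelling-type) template, which provides a convenient check for any learned approximation. A numpy sketch under that Gaussian assumption (not the paper's CNN; the 1-D signal and covariance are invented):

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(6)
    n = 64                                     # tiny 1-D "image" for illustration
    s = np.exp(-((np.arange(n) - 32) / 3.0) ** 2)   # known signal (SKE)

    # Known stationary Gaussian noise covariance K (known background absorbed into the mean)
    K = 0.5 * 0.9 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

    w = np.linalg.solve(K, s)                  # ideal-observer template for Gaussian noise
    L = np.linalg.cholesky(K)

    def trials(signal_present, m=2000):
        g = (L @ rng.standard_normal((n, m))).T + (s if signal_present else 0.0)
        return g @ w                           # test statistic t(g) = w' g

    t1, t0 = trials(True), trials(False)
    auc = (t1[:, None] > t0[None, :]).mean()   # empirical AUC (rank-sum form)
    snr2 = s @ w                               # d'^2 = s' K^{-1} s
    print(f"empirical AUC {auc:.3f}, analytic {norm.cdf(np.sqrt(snr2 / 2)):.3f}")
    ```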

  20. Vibration extraction based on fast NCC algorithm and high-speed camera.

    PubMed

    Lei, Xiujun; Jin, Yi; Guo, Jie; Zhu, Chang'an

    2015-09-20

    In this study, a high-speed camera system is developed to complete vibration measurement in real time and to overcome the mass loading introduced by conventional contact measurements. The proposed system consists of a notebook computer and a high-speed camera that can capture images at up to 1000 frames per second. In order to process the captured images in the computer, the normalized cross-correlation (NCC) template tracking algorithm with subpixel accuracy is introduced. Additionally, a modified local search algorithm based on the NCC is proposed to reduce the computation time and to increase efficiency significantly. The modified algorithm can accomplish one displacement extraction 10 times faster than traditional template matching, without installing any target panel onto the structures. Two experiments were carried out under laboratory and outdoor conditions to validate the accuracy and efficiency of the system in practice. The results demonstrated the high accuracy and efficiency of the camera system in extracting vibration signals.
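
    The core of such a tracker is a normalized cross-correlation response map followed by parabolic interpolation of the peak for subpixel accuracy. A hedged OpenCV sketch (generic template matching, not the authors' modified local search; the synthetic frame and template location are assumed):

    ```python
    import numpy as np
    import cv2

    rng = np.random.default_rng(7)
    frame = rng.random((240, 320)).astype(np.float32)
    tmpl = frame[100:132, 150:182].copy()        # 32x32 template from a reference frame

    # NCC response map (here over the whole frame; a local window is faster)
    res = cv2.matchTemplate(frame, tmpl, cv2.TM_CCOEFF_NORMED)
    _, _, _, (x, y) = cv2.minMaxLoc(res)         # integer-pixel peak location

    def subpixel(r, x, y):
        """Parabolic interpolation of the correlation peak (subpixel refinement)."""
        dx = 0.5 * (r[y, x - 1] - r[y, x + 1]) / (r[y, x - 1] - 2 * r[y, x] + r[y, x + 1])
        dy = 0.5 * (r[y - 1, x] - r[y + 1, x]) / (r[y - 1, x] - 2 * r[y, x] + r[y + 1, x])
        return x + dx, y + dy

    px, py = subpixel(res, x, y)
    print(f"template found at ({px:.2f}, {py:.2f}); expected near (150, 100)")
    ```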

  1. Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach

    NASA Technical Reports Server (NTRS)

    Mak, Victor W. K.

    1986-01-01

    Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.

  2. Differential item functioning magnitude and impact measures from item response theory models.

    PubMed

    Kleinman, Marjorie; Teresi, Jeanne A

    2016-01-01

    Measures of the magnitude and impact of differential item functioning (DIF) at the item and scale level, respectively, are presented and reviewed in this paper. Most measures are based on item response theory models. Magnitude refers to item-level effect sizes, whereas impact refers to differences between groups at the scale score level. Reviewed are magnitude measures based on group differences in the expected item scores and impact measures based on differences in the expected scale scores. The similarities among these indices are demonstrated. Various software packages that provide magnitude and impact measures are described, and new software is presented that computes all of the available statistics conveniently in one program, with explanations of their relationships to one another.

  3. Perceptions about computers and the internet in a pediatric clinic population.

    PubMed

    Carroll, Aaron E; Zimmerman, Frederick J; Rivara, Frederick P; Ebel, Beth E; Christakis, Dimitri A

    2005-01-01

    A digital divide with respect to computer and Internet access has been noted in numerous studies and reports. Equally important to ownership is comfort with computers and Internet technology, and concerns about privacy of personal data. To measure how households in a pediatric clinic vary in their attitudes toward computers, concerns about Internet confidentiality, and comfort using the Internet and whether these views are associated with household income or education. A phone survey was administered to a population-based sample of parents with children aged 0 to 11 years. All children received medical care from a community-based clinic network serving patients in King County, Wash. Eighty-eight percent of respondents used a computer once a week or more, and 83% of respondents reported favorable feelings toward computers. Although 97% of respondents were willing to share personal information over the Internet, many respondents considered data security important. While household income and parental education were associated with comfort and familiarity with computers, the effect is small. Respondents who already owned a computer and had Internet access did not differ in their perceptions according to socioeconomic or educational attainment. Most families like using computers and feel comfortable using the Internet regardless of socioeconomic status. Fears about the digital divide's impact on the attitudes of parents toward computers or their comfort using the Internet should not be seen as a barrier to developing Internet-based health interventions for a pediatric clinic population.

  4. Comparison of commonly used orthopaedic outcome measures using palm-top computers and paper surveys.

    PubMed

    Saleh, Khaled J; Radosevich, David M; Kassim, Rida A; Moussa, Mohamed; Dykes, Darrell; Bottolfson, Helena; Gioe, Terence J; Robinson, Harry

    2002-11-01

    Measuring patient-perceived outcomes following orthopaedic procedures has become an important component of clinical research and patient care. General and disease-specific outcome measures have been developed and applied in orthopaedics to assess patients' perceived health status. Unfortunately, paper-based, self-administered instruments remain inefficient for collecting data because of (a) missing data, (b) respondent error, and (c) the costs to administer and enter data. To study the comparability of palm-top computer devices and paper-and-pencil self-administered questionnaires in the collection of health-related quality of life (HRQL) information from patients, the comparability of administering HRQL questionnaires using palm-top computers and traditional paper-based forms was tested in a sample of 96 patients with complaints of hip and/or knee pain. Each patient completed mailed versions of the Medical Outcomes Study (MOS) 36-item Health Survey (SF-36) and the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) three weeks prior to presenting to clinic. At the clinic they were asked to complete the same outcome measures using the palm-top computer or a paper-and-pencil version. In the analysis, scale distributions, floor and ceiling effects, internal consistency, and retest reliability of scales were compared across the two data collection methods. Because the baseline characteristics of the groups were not strictly comparable according to age, the data were analyzed for the entire sample and stratified according to age. Few statistically significant differences were found in the means, variances, and intraclass correlation coefficients between the methods of administration. While the scale distribution between the two methods was comparable, the internal consistency of the scales was dissimilar. Administration of HRQL questionnaires using portable palm-top computer devices has the potential advantages of decreased cost and convenience. These data lend some support to the comparability of palm-top computers and paper surveys for outcome measures widely used in the field of orthopaedic surgery. The present study identified a lack of reliability across modes of administration that requires further study in a randomized comparability trial. These mode effects are important for orthopaedic surgeons to appreciate before implementing innovative data-capture technologies in their practices.

  5. RPM-WEBBSYS: A web-based computer system to apply the rational polynomial method for estimating static formation temperatures of petroleum and geothermal wells

    NASA Astrophysics Data System (ADS)

    Wong-Loya, J. A.; Santoyo, E.; Andaverde, J. A.; Quiroz-Ruiz, A.

    2015-12-01

    A web-based computer system (RPM-WEBBSYS) has been developed for the application of the Rational Polynomial Method (RPM) to estimate static formation temperatures (SFT) of geothermal and petroleum wells. The system is also capable of reproducing the full thermal recovery process that occurs during well completion. RPM-WEBBSYS was programmed using advances in information technology to perform SFT computations more efficiently. RPM-WEBBSYS can be easily and rapidly executed from any computing device (e.g., personal computers and portable devices such as tablets or smartphones) with Internet access and a web browser. The computer system was validated using bottomhole temperature (BHT) measurements logged in a synthetic heat transfer experiment, where a good match between predicted and true SFT was achieved. RPM-WEBBSYS was finally applied to BHT logs collected from well drilling and shut-in operations, where the typical problems of under- and over-estimation of the SFT (exhibited by most existing analytical methods) were effectively corrected.
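
    The idea behind the RPM can be sketched in a few lines: fit a rational polynomial to the shut-in build-up of bottomhole temperature and take its horizontal asymptote as the SFT. A hedged curve-fit illustration (a generic degree-(1,1) rational fit, not the exact RPM formulation in RPM-WEBBSYS; the synthetic BHT series is assumed):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic shut-in BHT build-up toward a "true" SFT of 120 °C
    t = np.array([6.0, 12.0, 18.0, 24.0, 30.0, 36.0])    # shut-in time (h)
    bht = 120.0 - 40.0 / (1.0 + 0.15 * t)                # measured BHT (°C)

    def rational(t, a0, a1, b1):
        """Degree-(1,1) rational polynomial; T -> a1/b1 as t -> infinity."""
        return (a0 + a1 * t) / (1.0 + b1 * t)

    (a0, a1, b1), _ = curve_fit(rational, t, bht, p0=[80.0, 10.0, 0.1])
    print(f"estimated SFT = {a1 / b1:.1f} °C")           # asymptote, ~120 °C
    ```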

  6. BIOSSES: a semantic sentence similarity estimation system for the biomedical domain.

    PubMed

    Sogancioglu, Gizem; Öztürk, Hakime; Özgür, Arzucan

    2017-07-15

    The amount of information available in textual format is rapidly increasing in the biomedical domain. Therefore, natural language processing (NLP) applications are becoming increasingly important to facilitate the retrieval and analysis of these data. Computing the semantic similarity between sentences is an important component in many NLP tasks including text retrieval and summarization. A number of approaches have been proposed for semantic sentence similarity estimation for generic English. However, our experiments showed that such approaches do not effectively cover biomedical knowledge and produce poor results for biomedical text. We propose several approaches for sentence-level semantic similarity computation in the biomedical domain, including string similarity measures and measures based on the distributed vector representations of sentences learned in an unsupervised manner from a large biomedical corpus. In addition, ontology-based approaches are presented that utilize general and domain-specific ontologies. Finally, a supervised regression based model is developed that effectively combines the different similarity computation metrics. A benchmark data set consisting of 100 sentence pairs from the biomedical literature is manually annotated by five human experts and used for evaluating the proposed methods. The experiments showed that the supervised semantic sentence similarity computation approach obtained the best performance (0.836 correlation with gold standard human annotations) and improved over the state-of-the-art domain-independent systems up to 42.6% in terms of the Pearson correlation metric. A web-based system for biomedical semantic sentence similarity computation, the source code, and the annotated benchmark data set are available at: http://tabilab.cmpe.boun.edu.tr/BIOSSES/ . gizemsogancioglu@gmail.com or arzucan.ozgur@boun.edu.tr. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
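
    As a baseline for the kind of string-level measures the supervised model combines, TF-IDF cosine similarity between sentences is the standard starting point. A small scikit-learn sketch (a generic baseline, not the BIOSSES system; the example sentences are invented):

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    sentences = [
        "The protein binds to the receptor and activates downstream signaling.",
        "Receptor binding by the protein triggers the downstream signaling cascade.",
        "Mice were housed under standard laboratory conditions.",
    ]

    tfidf = TfidfVectorizer().fit_transform(sentences)  # one row per sentence
    sim = cosine_similarity(tfidf)
    print(sim.round(2))  # pair (0, 1) scores far higher than pairs involving 2
    ```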

  7. ConGEMs: Condensed Gene Co-Expression Module Discovery Through Rule-Based Clustering and Its Application to Carcinogenesis.

    PubMed

    Mallik, Saurav; Zhao, Zhongming

    2017-12-28

    For transcriptomic analysis, there are numerous microarray-based genomic data, especially those generated for cancer research. The typical analysis measures the difference between a cancer sample group and a matched control group for each transcript or gene. Association rule mining is used to discover interesting item sets through a rule-based methodology; thus, it has advantages in finding causal-effect relationships between transcripts. In this work, we introduce two new rule-based similarity measures, the weighted rank-based Jaccard and cosine measures, and then propose a novel computational framework to detect condensed gene co-expression modules (ConGEMs) through an association rule-based learning system and the weighted similarity scores. In practice, the list of evolved condensed markers, which consists of both singular and complex markers in nature, depends on the corresponding condensed gene sets in either the antecedent or the consequent of the rules of the resultant modules. In our evaluation, these markers could be supported by literature evidence, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathways, and Gene Ontology annotations. Specifically, we first identified differentially expressed genes using an empirical Bayes test. A recently developed algorithm, RANWAR, was then utilized to determine the association rules from these genes. Based on that, we computed the integrated similarity scores of these rule-based similarity measures between each rule pair, and the resultant scores were used for clustering to identify the co-expressed rule modules. We applied our method to a gene expression dataset for lung squamous cell carcinoma and a genome methylation dataset for uterine cervical carcinogenesis. Our proposed module discovery method produced better results than traditional gene-module discovery measures. In summary, our proposed rule-based method is useful for exploring biomarker modules from transcriptomic data.
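
    For reference, the standard weighted Jaccard and cosine similarities on non-negative weight vectors are shown below; the rank-based weighting proposed in the paper is a refinement of this idea. A minimal numpy sketch (generic definitions with toy rule-weight vectors, not the exact ConGEMs scores; the equal integration weights are assumed):

    ```python
    import numpy as np

    def weighted_jaccard(u, v):
        """sum(min) / sum(max) for non-negative weight vectors."""
        return np.minimum(u, v).sum() / np.maximum(u, v).sum()

    def cosine(u, v):
        return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

    # Toy per-gene weights for two association rules (e.g., rank-derived)
    u = np.array([0.9, 0.4, 0.0, 0.7])
    v = np.array([0.8, 0.0, 0.3, 0.6])

    # Integrated score; equal weights are an assumption for illustration
    score = 0.5 * weighted_jaccard(u, v) + 0.5 * cosine(u, v)
    print(round(score, 3))
    ```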

  8. Objective measures, sensors and computational techniques for stress recognition and classification: a survey.

    PubMed

    Sharma, Nandita; Gedeon, Tom

    2012-12-01

    Stress is a major and growing concern in our day and age, adversely impacting both individuals and society. Stress research has a wide range of benefits, from improving personal operations, learning, and work productivity to benefiting society, making it an interesting and socially beneficial area of research. This survey reviews sensors that have been used to measure stress and investigates techniques for modelling stress. It discusses non-invasive and unobtrusive sensors for measuring computed stress, a term we coin in the paper. Sensors that do not impede everyday activities and that could be used by those who would like to monitor stress levels on a regular basis (e.g., vehicle drivers, patients with illnesses linked to stress) are the focus of the discussion. Computational techniques have the capacity to determine optimal sensor fusion and automate data analysis for stress recognition and classification. Several computational techniques have been developed to model stress, based on methods such as Bayesian networks, artificial neural networks, and support vector machines, which this survey investigates. The survey concludes with a summary and provides possible directions for further computational stress research. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  9. Blind topological measurement-based quantum computation.

    PubMed

    Morimae, Tomoyuki; Fujii, Keisuke

    2012-01-01

    Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10^-3, which is comparable to that (7.5 × 10^-3) of non-blind topological quantum computation. As an error per gate of the order 10^-3 was already achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.

  10. Blind topological measurement-based quantum computation

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki; Fujii, Keisuke

    2012-09-01

    Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10^-3, which is comparable to that (7.5 × 10^-3) of non-blind topological quantum computation. As an error per gate of the order 10^-3 was already achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.

  11. Using the Microsoft Kinect™ to assess 3-D shoulder kinematics during computer use.

    PubMed

    Xu, Xu; Robertson, Michelle; Chen, Karen B; Lin, Jia-Hua; McGorry, Raymond W

    2017-11-01

    Shoulder joint kinematics has been used as a representative indicator to investigate musculoskeletal symptoms among computer users in office ergonomics studies. The traditional measurement of shoulder kinematics normally requires a laboratory-based motion tracking system, which limits field studies. In the current study, a portable, low-cost, and marker-less Microsoft Kinect™ sensor was examined for the feasibility of shoulder kinematics measurement during computer tasks. Eleven healthy participants performed a standardized computer task, and their shoulder kinematics data were measured by a Kinect sensor and a motion tracking system concurrently. The results indicated that placing the Kinect sensor in front of the participants yielded more accurate shoulder kinematics measurements than placing the Kinect sensor 15° or 30° to one side. The results also showed that the Kinect sensor gave a better estimate of shoulder flexion/extension than of shoulder adduction/abduction and shoulder axial rotation. The RMSE of the front-placed Kinect sensor on shoulder flexion/extension was less than 10° for both the right and the left shoulder. The measurement error of the front-placed Kinect sensor on shoulder adduction/abduction was approximately 10° to 15°, and the magnitude of the error is proportional to the magnitude of that joint angle. After calibration, the RMSE on shoulder adduction/abduction was less than 10°, based on an independent dataset of 5 additional participants. For shoulder axial rotation, the RMSE of the front-placed Kinect sensor ranged from approximately 15° to 30°. The results of the study suggest that the Kinect sensor can provide some insight into shoulder kinematics for improving office ergonomics. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. CALCULATION OF GAMMA SPECTRA IN A PLASTIC SCINTILLATOR FOR ENERGY CALIBRATION AND DOSE COMPUTATION.

    PubMed

    Kim, Chankyu; Yoo, Hyunjun; Kim, Yewon; Moon, Myungkook; Kim, Jong Yul; Kang, Dong Uk; Lee, Daehee; Kim, Myung Soo; Cho, Minsik; Lee, Eunjoong; Cho, Gyuseong

    2016-09-01

    Plastic scintillation detectors have practical advantages in the field of dosimetry. Energy calibration of measured gamma spectra is important for dose computation, but it is not simple in plastic scintillators because of their different characteristics and finite resolution. In this study, the gamma spectra in a polystyrene scintillator were calculated for energy calibration and dose computation. Based on the relationship between the energy resolution and the estimated energy broadening effect in the calculated spectra, the gamma spectra were calculated simply, without many iterations. The calculated spectra were in agreement with a calculation by an existing method and with measurements. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
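
    Energy broadening of this kind is commonly implemented by spreading each ideal-spectrum bin with an energy-dependent Gaussian whose width follows the detector resolution. A generic numpy sketch (the resolution model FWHM = 2·sqrt(E) is an assumption, not the paper's parameters):

    ```python
    import numpy as np

    E = np.linspace(1.0, 2000.0, 2000)          # energy bins (keV)
    ideal = np.zeros_like(E)
    ideal[661] = 1000.0                          # e.g., an ideal full-energy peak near 662 keV

    fwhm = 2.0 * np.sqrt(E)                      # assumed resolution model (keV)
    sigma = fwhm / 2.3548                        # FWHM -> Gaussian sigma

    broadened = np.zeros_like(E)
    for i, counts in enumerate(ideal):
        if counts > 0.0:                         # spread each bin with its own sigma
            g = np.exp(-0.5 * ((E - E[i]) / sigma[i]) ** 2)
            broadened += counts * g / g.sum()    # preserve total counts

    print(E[broadened.argmax()], broadened.max())  # peak position kept, peak now broadened
    ```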

  13. Organ radiation exposure with EOS: GATE simulations versus TLD measurements

    NASA Astrophysics Data System (ADS)

    Clavel, A. H.; Thevenard-Berger, P.; Verdun, F. R.; Létang, J. M.; Darbon, A.

    2016-03-01

    EOS® is an innovative X-ray imaging system allowing the acquisition of two simultaneous images of a patient in the standing position during a vertical scan with two orthogonal fan beams. This study aimed to compute organ radiation exposure of a patient in the particular geometry of this system. Two different positions of the patient in the machine were studied, corresponding to postero-anterior plus left lateral projections (PA-LLAT) and antero-posterior plus right lateral projections (AP-RLAT). To achieve this goal, a Monte Carlo simulation was developed in the GATE environment. To model the physical properties of the patient, a computational phantom was produced based on computed tomography scan data of an anthropomorphic phantom. The simulations provided several organ doses, which were compared to previously published dose results measured with thermoluminescent detectors (TLD) under the same conditions and with the same phantom. The simulation results showed good agreement with measured doses at the TLD locations for both the AP-RLAT and PA-LLAT projections. This study also showed that assessing the organ dose from only a sample of locations, rather than considering the whole organ, introduced significant bias, depending on the organ and projection.

  14. Computation of incompressible viscous flows through artificial heart devices with moving boundaries

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Rogers, Stuart; Kwak, Dochan; Chang, I.-DEE

    1991-01-01

    The extension of computational fluid dynamics techniques to artificial heart flow simulations is illustrated. Unsteady incompressible Navier-Stokes equations written in 3-D generalized curvilinear coordinates are solved iteratively at each physical time step until the incompressibility condition is satisfied. The solution method is based on the pseudocompressibility approach and uses an implicit upwind differencing scheme together with the Gauss-Seidel line relaxation method. The efficiency and robustness of the time-accurate formulation of the algorithm are tested by computing the flow through model geometries. A channel flow with a moving indentation is computed and validated against experimental measurements and other numerical solutions. In order to handle the geometric complexity and the moving boundary problems, a zonal method and an overlapping grid embedding scheme are used, respectively. Steady-state solutions for the flow through a tilting-disk heart valve were compared against experimental measurements, and good agreement was obtained. The flow computation during valve opening and closing is carried out to illustrate the moving-boundary capability.

  15. Fuzzy measures on the Gene Ontology for gene product similarity.

    PubMed

    Popescu, Mihail; Keller, James M; Mitchell, Joyce A

    2006-01-01

    One of the most important objects in bioinformatics is a gene product (protein or RNA). For many gene products, functional information is summarized in a set of Gene Ontology (GO) annotations. For these genes, it is reasonable to include similarity measures based on the terms found in the GO or another taxonomy. In this paper, we introduce several novel measures for computing the similarity of two gene products annotated with GO terms. The fuzzy measure similarity (FMS) has the advantage that it takes into consideration the context of both complete sets of annotation terms when computing the similarity between two gene products. When the two gene products are not annotated by common taxonomy terms, we propose a method that avoids a zero similarity result. To account for variations in annotation reliability, we propose a similarity measure based on the Choquet integral. These similarity measures provide extra tools for the biologist in search of functional information for gene products. Initial testing on a group of 194 sequences representing three protein families shows a higher correlation of the FMS and Choquet similarities with the BLAST sequence similarities than traditional similarity measures such as the pairwise average or pairwise maximum.
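
    The discrete Choquet integral that underlies the reliability-weighted measure aggregates per-criterion matches against a fuzzy measure on subsets of criteria. A minimal numpy sketch (generic; the toy fuzzy measure over three annotation-match criteria is invented, not the paper's GO-derived one):

    ```python
    import numpy as np

    def choquet(values, mu):
        """Discrete Choquet integral of `values` w.r.t. fuzzy measure `mu`
        (a dict mapping frozensets of criterion indices to [0, 1])."""
        order = np.argsort(values)[::-1]             # sort criteria by value, descending
        total, prev = 0.0, 0.0
        for i in range(len(values)):
            subset = frozenset(int(k) for k in order[: i + 1])  # top-(i+1) criteria
            total += values[order[i]] * (mu[subset] - prev)
            prev = mu[subset]
        return total

    # Toy monotone fuzzy measure over three match criteria {0, 1, 2}, mu(full set) = 1
    mu = {frozenset({0}): 0.3, frozenset({1}): 0.4, frozenset({2}): 0.2,
          frozenset({0, 1}): 0.8, frozenset({0, 2}): 0.5, frozenset({1, 2}): 0.6,
          frozenset({0, 1, 2}): 1.0}

    print(choquet(np.array([0.9, 0.5, 0.7]), mu))    # aggregated similarity in [0, 1]
    ```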

  16. Precision of lumbar intervertebral measurements: does a computer-assisted technique improve reliability?

    PubMed

    Pearson, Adam M; Spratt, Kevin F; Genuario, James; McGough, William; Kosman, Katherine; Lurie, Jon; Sengupta, Dilip K

    2011-04-01

    Comparison of intra- and interobserver reliability of digitized manual and computer-assisted intervertebral motion measurements and of the classification of "instability." The objective was to determine whether computer-assisted measurement of lumbar intervertebral motion on flexion-extension radiographs improves reliability compared with digitized manual measurements. Many studies have questioned the reliability of manual intervertebral measurements, although few have compared the reliability of computer-assisted and manual measurements on lumbar flexion-extension radiographs. Intervertebral rotation, anterior-posterior (AP) translation, and change in anterior and posterior disc height were measured with a digitized manual technique by three physicians and by three other observers using computer-assisted quantitative motion analysis (QMA) software. Each observer measured 30 sets of digital flexion-extension radiographs (L1-S1) twice. Shrout-Fleiss intraclass correlation coefficients for intra- and interobserver reliabilities were computed. The stability of each level was also classified (instability defined as >4 mm AP translation or >10° rotation), and the intra- and interobserver reliabilities of the two methods were compared using adjusted percent agreement (APA). Intraobserver reliability intraclass correlation coefficients were substantially higher for the QMA technique than for the digitized manual technique across all measurements: rotation 0.997 versus 0.870, AP translation 0.959 versus 0.557, change in anterior disc height 0.962 versus 0.770, and change in posterior disc height 0.951 versus 0.283. The same pattern was observed for interobserver reliability (rotation 0.962 vs. 0.693, AP translation 0.862 vs. 0.151, change in anterior disc height 0.862 vs. 0.373, and change in posterior disc height 0.730 vs. 0.300). The QMA technique was also more reliable for the classification of "instability." Intraobserver APAs ranged from 87% to 97% for QMA versus 60% to 73% for digitized manual measurements, while interobserver APAs ranged from 91% to 96% for QMA versus 57% to 63% for digitized manual measurements. The use of QMA software substantially improved the reliability of lumbar intervertebral measurements and the classification of instability based on flexion-extension radiographs.
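
    For readers implementing similar comparisons, the Shrout-Fleiss ICC(2,1) used in this kind of reliability analysis follows from a two-way ANOVA decomposition. A numpy sketch (a generic ICC(2,1) with a made-up ratings matrix; statistics packages offer equivalent routines):

    ```python
    import numpy as np

    def icc_2_1(Y):
        """Shrout-Fleiss ICC(2,1) for an (n targets x k raters) matrix."""
        n, k = Y.shape
        grand = Y.mean()
        ss_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum()  # between targets
        ss_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum()  # between raters
        ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols
        msr = ss_rows / (n - 1)
        msc = ss_cols / (k - 1)
        mse = ss_err / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Made-up example: 6 radiographs, rotation (degrees) rated by 2 observers
    Y = np.array([[4.1, 4.3], [7.0, 6.8], [2.2, 2.5],
                  [9.4, 9.1], [5.0, 5.2], [3.3, 3.1]])
    print(f"ICC(2,1) = {icc_2_1(Y):.3f}")  # near 1 for highly repeatable measurements
    ```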

  17. Estimating Relative Positions of Outer-Space Structures

    NASA Technical Reports Server (NTRS)

    Balian, Harry; Breckenridge, William; Brugarolas, Paul

    2009-01-01

    A computer program estimates the relative position and orientation of two structures from measurements, made by use of electronic cameras and laser range finders on one structure, of distances and angular positions of fiducial objects on the other structure. The program was written specifically for use in determining errors in the alignment of large structures deployed in outer space from a space shuttle. The program is based partly on equations for transformations among the various coordinate systems involved in the measurements and on equations that account for errors in the transformation operators. It computes a least-squares estimate of the relative position and orientation. Sequential least-squares estimates, acquired at a measurement rate of 4 Hz, are averaged by passing them through a fourth-order Butterworth filter. The program is executed in a computer aboard the space shuttle, and its position and orientation estimates are displayed to astronauts on a graphical user interface.
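
    The smoothing stage is straightforward to reproduce: a fourth-order Butterworth low-pass applied to the 4 Hz stream of sequential estimates. A scipy sketch (the 0.5 Hz cutoff and the synthetic estimate stream are assumptions; the abstract does not state the original cutoff):

    ```python
    import numpy as np
    from scipy.signal import butter, lfilter

    fs = 4.0                       # estimate rate from the text (Hz)
    cutoff = 0.5                   # assumed low-pass cutoff (Hz)
    b, a = butter(4, cutoff, btype='low', fs=fs)   # fourth-order Butterworth

    # Noisy stream of sequential least-squares position estimates (one axis, meters)
    t = np.arange(0, 60, 1 / fs)
    drift = 0.02 * np.sin(2 * np.pi * 0.01 * t)    # slow true relative motion
    est = drift + 0.005 * np.random.default_rng(8).standard_normal(t.size)

    smoothed = lfilter(b, a, est)  # causal filtering, matching the real-time use
    print(np.abs(est - drift).mean(), np.abs(smoothed - drift).mean())
    ```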

  18. Wavefront measurement using computational adaptive optics.

    PubMed

    South, Fredrick A; Liu, Yuan-Zhi; Bower, Andrew J; Xu, Yang; Carney, P Scott; Boppart, Stephen A

    2018-03-01

    In many optical imaging applications, it is necessary to correct for aberrations to obtain high quality images. Optical coherence tomography (OCT) provides access to the amplitude and phase of the backscattered optical field for three-dimensional (3D) imaging samples. Computational adaptive optics (CAO) modifies the phase of the OCT data in the spatial frequency domain to correct optical aberrations without using a deformable mirror, as is commonly done in hardware-based adaptive optics (AO). This provides improvement of image quality throughout the 3D volume, enabling imaging across greater depth ranges and in highly aberrated samples. However, the CAO aberration correction has a complicated relation to the imaging pupil and is not a direct measurement of the pupil aberrations. Here we present new methods for recovering the wavefront aberrations directly from the OCT data without the use of hardware adaptive optics. This enables both computational measurement and correction of optical aberrations.
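
    At its core, the CAO correction is a phase multiplication in the spatial frequency domain of the complex OCT field. A numpy sketch of the principle for one en face plane (a generic quadratic, defocus-like phase is assumed; estimating the aberration from the data itself is the subject of the paper):

    ```python
    import numpy as np

    # Complex OCT en face field (stand-in data) degraded by a pupil phase aberration
    rng = np.random.default_rng(9)
    N = 256
    field = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))

    fx = np.fft.fftfreq(N)
    FX, FY = np.meshgrid(fx, fx)
    defocus = 40.0 * (FX**2 + FY**2)            # assumed quadratic (defocus-like) phase

    aberrated = np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * 2 * np.pi * defocus))

    # CAO step: multiply by the conjugate phase in the spatial frequency domain
    corrected = np.fft.ifft2(np.fft.fft2(aberrated) * np.exp(-1j * 2 * np.pi * defocus))

    print(np.allclose(corrected, field))        # True: phase-only correction is exact here
    ```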

  19. Design and Development Computer-Based E-Learning Teaching Material for Improving Mathematical Understanding Ability and Spatial Sense of Junior High School Students

    NASA Astrophysics Data System (ADS)

    Nurjanah; Dahlan, J. A.; Wibisono, Y.

    2017-02-01

    This paper describes the design and development of computer-based e-learning teaching materials for improving the mathematical understanding ability and spatial sense of junior high school students. The particular aims are (1) to produce the teaching material design, the evaluation model, and the instruments to measure the mathematical understanding ability and spatial sense of junior high school students; (2) to conduct trials of the computer-based e-learning teaching material model, the assessment, and the instruments; (3) to complete the computer-based e-learning teaching material models and assessments; and (4) to deliver the resulting research product, the computer-based e-learning teaching materials, in the form of an interactive learning disc. The research method of this study is developmental research, conducted by thought experiment and instruction experiment. The results showed that the teaching materials could be used very well. This is based on the validation of the computer-based e-learning teaching materials by five multimedia experts. The five validators gave consistent judgements of the face and content validity of each test item for mathematical understanding ability and spatial sense. The reliability coefficients of the mathematical understanding ability and spatial sense tests are 0.929 and 0.939, respectively, which is very high, and the validity of both tests meets high or very high criteria.

  20. Improving Learning, Retention of Knowledge, and Attitude of Students in a Vocational-Technical College through Interactive Computer Technology.

    ERIC Educational Resources Information Center

    Hitchcock, A. Allen

    The problem that this practicum attempted to solve was that students in a vocational-technical college tended to underachieve in courses that were mainly cognitive in nature, as evidenced by low overall grade-point course averages and other measures. The researcher designed computer-based simulation/gaming instruction that aimed to increase…
