NASA Technical Reports Server (NTRS)
Bean, T. A.; Bowhill, S. A.
1973-01-01
Partial-reflection data collected for the eclipse of July 10, 1972, as well as for July 9 and 11, 1972, are analyzed to determine eclipse effects on D-region electron densities. The partial-reflection experiment was set up to collect data using an on-line PDP-15 computer and DECtape storage. The electron-density profiles show good agreement with results from other eclipses. The partial-reflection programs were changed after the eclipse data collection to improve the operation of the system. These changes, made possible mainly by expanded computer hardware, have simplified operation of the system considerably.
Polarized reflectance and transmittance properties of windblown sea surfaces.
Mobley, Curtis D
2015-05-20
Generation of random sea surfaces using wave variance spectra and Fourier transforms is formulated in a way that guarantees conservation of wave energy and fully resolves wave height and slope variances. Monte Carlo polarized ray tracing, which accounts for multiple scattering between light rays and wave facets, is used to compute effective Mueller matrices for reflection and transmission of air- or water-incident polarized radiance. Irradiance reflectances computed using a Rayleigh sky radiance distribution, sea surfaces generated with Cox-Munk statistics, and unpolarized ray tracing differ by 10%-18% compared with values computed using elevation- and slope-resolving surfaces and polarized ray tracing. Radiance reflectance factors, as used to estimate water-leaving radiance from measured upwelling and sky radiances, are shown to depend on sky polarization, and improved values are given.
NASA Technical Reports Server (NTRS)
Tam, Christopher K. W.; Webb, Jay C.
1994-01-01
In this paper finite-difference solutions of the Helmholtz equation in an open domain are considered. By using a second-order central difference scheme and the Bayliss-Turkel radiation boundary condition, reasonably accurate solutions can be obtained when the number of grid points per acoustic wavelength is large. However, when a smaller number of grid points per wavelength is used, excessive reflections occur which tend to overwhelm the computed solutions. These excessive reflections are due to the incompatibility between the governing finite difference equation and the Bayliss-Turkel radiation boundary condition, which was developed from the asymptotic solution of the partial differential equation. To obtain compatibility, the radiation boundary condition should instead be constructed from the asymptotic solution of the finite difference equation. Examples are provided using the improved radiation boundary condition based on the asymptotic solution of the governing finite difference equation. The computed results are free of reflections even when only five grid points per wavelength are used. The improved radiation boundary condition has also been tested for problems with complex acoustic sources and sources embedded in a uniform mean flow; in all these cases no reflected waves could be detected. The present method of developing a radiation boundary condition is also applicable to higher-order finite difference schemes. The use of finite difference approximation inevitably introduces anisotropy into the governing field equation. The effect of anisotropy is to distort the directional distribution of the amplitude and phase of the computed solution, and it can be quite large when the number of grid points per wavelength used in the computation is small. A way to correct this effect is proposed. The correction factor developed from the asymptotic solutions is source independent and, hence, can be determined once and for all. The effectiveness of the correction factor in improving the computed solution is demonstrated in this paper.
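To make the key idea concrete, the sketch below solves a one-dimensional Helmholtz problem with a second-order central difference scheme and closes the domain with a radiation condition built from the plane-wave (asymptotic) solution of the *difference* equation rather than of the differential equation. It is only a minimal illustration under assumed parameters (the paper's examples are multi-dimensional); the grid size, wavenumber and source are made up.

```python
# Minimal 1-D sketch: discrete radiation condition built from the numerical
# wavenumber of the central-difference scheme.  All parameters are assumptions.
import numpy as np

k = 2 * np.pi              # acoustic wavenumber
ppw = 5                    # grid points per wavelength (coarse, as in the paper)
h = 2 * np.pi / k / ppw
N = 200

# Numerical wavenumber k* of the 2nd-order central scheme:
#   2*(cos(k*h) - 1)/h**2 + k**2 = 0  ->  k* = arccos(1 - (k*h)**2 / 2) / h
kstar = np.arccos(1.0 - (k * h) ** 2 / 2.0) / h

# Point-like source in the middle of the domain
f = np.zeros(N + 1, dtype=complex)
f[N // 2] = 1.0 / h

A = np.zeros((N + 1, N + 1), dtype=complex)
for j in range(1, N):
    A[j, j - 1] = 1.0 / h**2
    A[j, j]     = -2.0 / h**2 + k**2
    A[j, j + 1] = 1.0 / h**2

# Discrete radiation conditions: an outgoing plane wave of the *difference*
# equation satisfies u[N] = exp(i*kstar*h)*u[N-1] on the right and the mirror
# relation on the left.  Using the continuous k here instead of kstar is what
# produces the spurious reflections discussed in the abstract.
A[N, N], A[N, N - 1] = 1.0, -np.exp(1j * kstar * h)
A[0, 0], A[0, 1]     = 1.0, -np.exp(1j * kstar * h)

u = np.linalg.solve(A, f)
print("|u| near the boundaries:", abs(u[:3]), abs(u[-3:]))
```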
NASA Technical Reports Server (NTRS)
Gordy, R. S.
1972-01-01
An improved broadband impedance matching technique was developed. The technique is capable of resolving points in the waveguide which generate reflected energy. A version of the comparison reflectometer was developed and fabricated to determine the mean amplitude of the reflection coefficient excited at points in the guide as a function of distance, and the complex reflection coefficient of a specific discontinuity in the guide as a function of frequency. An impedance matching computer program was developed which is capable of matching the characteristics of each disturbance independently of other reflections in the guide. The characteristics of four standard matching elements were compiled, and their associated curves of reflection coefficient and shunt susceptance as a function of frequency are presented. It is concluded that an economical, fast, and reliable impedance matching technique has been established which can provide broadband impedance matches.
NASA Astrophysics Data System (ADS)
Fasnacht, Z.; Qin, W.; Haffner, D. P.; Loyola, D. G.; Joiner, J.; Krotkov, N. A.; Vasilkov, A. P.; Spurr, R. J. D.
2017-12-01
In order to estimate the surface reflectance used in trace gas retrieval algorithms, radiative transfer models (RTMs) such as the Vector Linearized Discrete Ordinate Radiative Transfer Model (VLIDORT) can be used to simulate top-of-the-atmosphere (TOA) radiances with advanced models of surface properties. With large volumes of satellite data, these model simulations become computationally expensive. Look-up-table interpolation can reduce the computational cost of the calculations, but the non-linear nature of the radiances requires a dense node structure if interpolation errors are to be minimized. In order to reduce our computational effort and improve on the performance of look-up tables, neural networks can be trained to predict these radiances. We investigate the impact of using look-up-table interpolation versus a neural network trained using the smart sampling technique, and show that neural networks can speed up calculations and reduce errors while using significantly less memory and fewer RTM calls. In future work we will implement a neural network in operational processing to meet growing demands for reflectance modeling in support of high-spatial-resolution satellite missions.
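The sketch below illustrates the surrogate-model idea under stated assumptions: a cheap stand-in function replaces the VLIDORT call, "smart sampling" is approximated by plain random sampling, and a small scikit-learn multilayer perceptron learns the mapping from geometry and surface reflectance to TOA radiance. None of the parameter choices come from the paper.

```python
# Illustrative neural-network surrogate for an RTM look-up table (not the
# operational code).  rtm_toa_radiance is a placeholder for a VLIDORT call.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def rtm_toa_radiance(params):
    """Stand-in for an expensive RTM evaluation (sza, vza, raa, surface rho)."""
    sza, vza, raa, rho = params.T
    mu0, mu = np.cos(np.radians(sza)), np.cos(np.radians(vza))
    # smooth, nonlinear toy dependence standing in for the real radiance
    return mu0 * (0.1 + rho * mu / (mu0 + mu)) * (1 + 0.05 * np.cos(np.radians(raa)))

# random sampling of the input space (a stand-in for smart sampling)
X = np.column_stack([rng.uniform(0, 70, 20000),    # solar zenith angle (deg)
                     rng.uniform(0, 70, 20000),    # view zenith angle (deg)
                     rng.uniform(0, 180, 20000),   # relative azimuth (deg)
                     rng.uniform(0, 1, 20000)])    # surface reflectance
y = rtm_toa_radiance(X)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
err = np.abs(net.predict(X_te) - y_te)
print(f"median absolute error of the surrogate: {np.median(err):.2e}")
```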
ERIC Educational Resources Information Center
Wang, Chu-Fu; Lin, Chih-Lung; Deng, Jien-Han
2012-01-01
Testing is an important stage of teaching as it can assist teachers in auditing students' learning results. A good test is able to accurately reflect the capability of a learner. Nowadays, Computer-Assisted Testing (CAT) is greatly improving traditional testing, since computers can automatically and quickly compose a proper test sheet to meet user…
ERIC Educational Resources Information Center
Orey, Michael A.; Nelson, Wayne A.
Arguing that the evolution of intelligent tutoring systems better reflects the recent theoretical developments of cognitive science than traditional computer-based instruction (CBI), this paper describes a general model for an intelligent tutoring system and suggests ways to improve CBI using design principles derived from research in cognitive…
ERIC Educational Resources Information Center
Ilieva, Vessela; Erguner-Tekinalp, Bengu
2012-01-01
This study examined the applications of computer-mediated student collaboration in a graduate multicultural counseling course. The course work included a reflective cultural competency building assignment that utilized online communication and collaboration using a wiki to extend and improve students' multicultural counseling and social justice…
NASA Astrophysics Data System (ADS)
Hersch, Roger David; Crété, Frédérique
2004-12-01
Dot gain is different when dots are printed alone, printed in superposition with one ink or printed in superposition with two inks. In addition, the dot gain may also differ depending on the solid ink on which the considered halftone layer is superposed. In a previous research project, we developed a model for computing the effective surface coverage of a dot according to its superposition conditions. In the present contribution, we improve the Yule-Nielsen modified Neugebauer model by integrating into it our effective dot surface coverage computation model. Calibration of the reproduction curves mapping nominal to effective surface coverages in every superposition condition is carried out by fitting effective dot surfaces which minimize the sum of square differences between the measured reflection density spectra and reflection density spectra predicted according to the Yule-Nielsen modified Neugebauer model. In order to predict the reflection spectrum of a patch, its known nominal surface coverage values are converted into effective coverage values by weighting the contributions from different reproduction curves according to the weights of the contributing superposition conditions. We analyze the colorimetric prediction improvement brought by our extended dot surface coverage model for clustered-dot offset prints, thermal transfer prints and ink-jet prints. The color differences induced by the differences between measured reflection spectra and reflection spectra predicted according to the new dot surface estimation model are quantified on 729 different cyan, magenta, yellow patches covering the full color gamut. As a reference, these differences are also computed for the classical Yule-Nielsen modified spectral Neugebauer model incorporating a single halftone reproduction curve for each ink. Taking into account dot surface coverages according to different superposition conditions considerably improves the predictions of the Yule-Nielsen modified Neugebauer model. In the case of offset prints, the mean difference between predictions and measurements expressed in CIE-LAB CIE-94 ΔE94 values is reduced at 100 lpi from 1.54 to 0.90 (accuracy improvement factor: 1.7) and at 150 lpi from 1.87 to 1.00 (accuracy improvement factor: 1.8). Similar improvements have been observed for a thermal transfer printer at 600 dpi, at lineatures of 50 and 75 lpi. In the case of an ink-jet printer at 600 dpi, the mean ΔE94 value is reduced at 75 lpi from 3.03 to 0.90 (accuracy improvement factor: 3.4) and at 100 lpi from 3.08 to 0.91 (accuracy improvement factor: 3.4).
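For readers unfamiliar with the prediction step, the sketch below shows the core Yule-Nielsen modified spectral Neugebauer (YNSN) computation for a cyan/magenta/yellow halftone: Demichel weights of the eight Neugebauer primaries are combined with the primaries' reflectance spectra raised to 1/n. The primary spectra, the n-value and the coverages are placeholders, and the paper's superposition-dependent dot-gain calibration is not reproduced here.

```python
# Minimal YNSN sketch with assumed inputs (not the paper's calibrated model).
import numpy as np

wl = np.arange(380, 731, 10)      # wavelengths (nm)
n_value = 2.0                     # Yule-Nielsen n (assumed)

def demichel_weights(c, m, y):
    """Fractional areas of the 8 Neugebauer primaries for coverages c, m, y."""
    return np.array([
        (1-c)*(1-m)*(1-y),   # paper white
        c*(1-m)*(1-y),       # cyan
        (1-c)*m*(1-y),       # magenta
        (1-c)*(1-m)*y,       # yellow
        c*m*(1-y),           # blue  (c+m)
        c*(1-m)*y,           # green (c+y)
        (1-c)*m*y,           # red   (m+y)
        c*m*y,               # black (c+m+y)
    ])

def ynsn_reflectance(c, m, y, primaries, n=n_value):
    """primaries: array (8, len(wl)) of measured primary reflectance spectra."""
    a = demichel_weights(c, m, y)
    return (a[:, None] * primaries ** (1.0 / n)).sum(axis=0) ** n

# placeholder (flat) primary spectra, only to make the sketch runnable
primaries = np.linspace(0.95, 0.05, 8)[:, None] * np.ones_like(wl, dtype=float)
print(ynsn_reflectance(0.4, 0.3, 0.1, primaries)[:3])
```

In the paper, the nominal coverages fed to this formula are first replaced by effective coverages taken from superposition-dependent reproduction curves, which is where the accuracy gain comes from.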
NASA Astrophysics Data System (ADS)
Downey, N.; Begnaud, M. L.; Hipp, J. R.; Ballard, S.; Young, C. S.; Encarnacao, A. V.
2017-12-01
The SALSA3D global 3D velocity model of the Earth was developed to improve the accuracy and precision of seismic travel time predictions for a wide suite of regional and teleseismic phases. Recently, the global SALSA3D model was updated to include additional body wave phases, including mantle phases, core phases, reflections off the core-mantle boundary, and underside reflections off the surface of the Earth. We show that this update improves travel time predictions and leads directly to significant improvements in the accuracy and precision of seismic event locations compared with locations computed using standard 1D velocity models like ak135 or 2½D models like RSTT. A key feature of our inversions is that the path-specific model uncertainties of the travel time predictions are calculated using the full 3D model covariance matrix computed during tomography, which results in more realistic uncertainty ellipses that directly reflect tomographic data coverage. This method can also be applied at a regional scale: we present a velocity model with uncertainty obtained using data from the University of Utah Seismograph Stations. These results show a reduction in travel-time residuals for relocated events compared with those obtained using previously published models.
NASA Technical Reports Server (NTRS)
Balanis, Constantine A.; Polka, Lesley A.; Polycarpou, Anastasis C.
1994-01-01
Formulations for scattering from the coated plate and the coated dihedral corner reflector are included. A coated plate model based upon the Uniform Theory of Diffraction (UTD) for impedance wedges was presented in the last report. In order to resolve inaccuracies and discontinuities in the predicted patterns using the UTD-based model, an improved model that uses more accurate diffraction coefficients is presented. A Physical Optics (PO) model for the coated dihedral corner reflector is presented as an intermediary step in developing a high-frequency model for this structure. The PO model is based upon the reflection coefficients for a metal-backed lossy material. Preliminary PO results for the dihedral corner reflector suggest that, in addition to being much faster computationally, this model may be more accurate than existing moment method (MM) models. An improved Physical Optics (PO)/Equivalent Currents model for modeling the Radar Cross Section (RCS) of both square and triangular, perfectly conducting, trihedral corner reflectors is presented. The new model uses the PO approximation at each reflection for the first- and second-order reflection terms. For the third-order reflection terms, a Geometrical Optics (GO) approximation is used for the first reflection; and PO approximations are used for the remaining reflections. The previously reported model used GO for all reflections except the terminating reflection. Using PO for most of the reflections results in a computationally slower model because many integrations must be performed numerically, but the advantage is that the predicted RCS using the new model is much more accurate. Comparisons between the two PO models, Finite-Difference Time-Domain (FDTD) and experimental data are presented for validation of the new model.
NASA Technical Reports Server (NTRS)
Lin, Z.; Stamnes, S.; Jin, Z.; Laszlo, I.; Tsay, S. C.; Wiscombe, W. J.; Stamnes, K.
2015-01-01
A successor version 3 of DISORT (DISORT3) is presented with important upgrades that improve the accuracy, efficiency, and stability of the algorithm. Compared with version 2 (DISORT2, released in 2000) these upgrades include (a) a redesigned BRDF computation that improves both speed and accuracy, (b) a revised treatment of the single scattering correction, and (c) additional efficiency and stability upgrades for beam sources. In DISORT3 the BRDF computation is improved in the following three ways: (i) the Fourier decomposition is prepared "off-line", thus avoiding the repeated internal computations done in DISORT2; (ii) a large enough number of terms in the Fourier expansion of the BRDF is employed to guarantee accurate values of the expansion coefficients (the default is 200 instead of 50 in DISORT2); (iii) in the post-processing step the reflection of the direct attenuated beam from the lower boundary is included, resulting in a more accurate single scattering correction. These improvements in the treatment of the BRDF have led to improved accuracy and a several-fold increase in speed. In addition, the stability of beam sources has been improved by removing a singularity occurring when the cosine of the incident beam angle is too close to the reciprocal of any of the eigenvalues. The efficiency for beam sources has been further improved by reducing by a factor of 2 (compared to DISORT2) the dimension of the linear system of equations that must be solved to obtain the particular solutions, and by replacing the LINPACK routines used in DISORT2 with LAPACK 3.5 in DISORT3. These beam source stability and efficiency upgrades bring enhanced stability and an additional 5-7% improvement in speed. Numerical results are provided to demonstrate and quantify the improvements in accuracy and efficiency of DISORT3 compared to DISORT2.
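The following sketch illustrates what the "off-line" azimuthal Fourier decomposition of a BRDF looks like in practice, in the spirit of upgrade (i). The BRDF is a toy placeholder, and the normalization convention is an assumption; DISORT3's own kernels, conventions, and its default of 200 expansion terms are described in the paper.

```python
# Sketch of precomputing azimuthal Fourier coefficients of a BRDF (assumed
# convention: rho(dphi) = rho_0 + 2 * sum_m rho_m * cos(m*dphi)).
import numpy as np

def toy_brdf(mu, mu_p, dphi):
    """Placeholder bidirectional reflectance (not a DISORT kernel)."""
    return 0.3 + 0.2 * mu * mu_p + 0.1 * np.cos(dphi) + 0.05 * np.cos(2 * dphi)

def brdf_fourier_coeffs(mu, mu_p, n_terms=200, n_phi=1024):
    """Return rho_m(mu, mu_p) for m = 0 .. n_terms-1 by azimuthal quadrature."""
    dphi = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
    vals = toy_brdf(mu, mu_p, dphi)
    coeffs = np.empty(n_terms)
    for m in range(n_terms):
        coeffs[m] = (vals * np.cos(m * dphi)).mean()   # (1/2pi) * integral
    return coeffs

rho_m = brdf_fourier_coeffs(0.6, 0.8)
print(rho_m[:4])   # only the first few terms are non-zero for this toy BRDF
```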
Reflectance analysis of porosity gradient in nanostructured silicon layers
NASA Astrophysics Data System (ADS)
Jurečka, Stanislav; Imamura, Kentaro; Matsumoto, Taketoshi; Kobayashi, Hikaru
2017-12-01
In this work we study the optical properties of nanostructured layers formed on a silicon surface. Nanostructured layers on Si are formed in order to strongly suppress light reflectance. Low spectral reflectance is important for improving the conversion efficiency of solar cells and for other optoelectronic applications. In our approach, an effective method of forming nanostructured layers with ultralow reflectance over a broad wavelength range is based on metal-assisted etching of Si. The Si surface, immersed in an HF and H2O2 solution, is etched in contact with a Pt mesh roller, and the structure of the mesh is transferred onto the etched surface. During this etching procedure the layer density evolves gradually and the spectral reflectance decreases exponentially with depth in the porous layer. We analyzed the layer porosity by incorporating a porosity gradient into the construction of the theoretical model of the layer's spectral reflectance. In our approach the analyzed layer is split into 20 sublayers. The complex dielectric function of each sublayer is computed using Bruggeman effective-medium theory, and the theoretical spectral reflectance of the modelled multilayer system is computed using the Abeles matrix formalism. The porosity gradient is extracted by optimizing the theoretical reflectance model against the experimental values. The resulting porosity-depth profile provides important information for optimizing the technological treatment operations.
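A minimal sketch of such a forward model is given below: 20 sublayers with a depth-dependent porosity, Bruggeman effective-medium mixing of Si and air, and an Abeles (characteristic-matrix) computation of normal-incidence reflectance. The Si dielectric constant, the porosity profile, and the layer thickness are placeholders, and the optimization against measured spectra is not shown.

```python
# Graded porous-Si reflectance sketch: Bruggeman EMA + transfer matrix.
import numpy as np

def bruggeman(eps_si, porosity):
    """Effective dielectric constant of an air/Si mixture (Bruggeman EMA)."""
    f_air, f_si = porosity, 1.0 - porosity
    # quadratic -2*eps^2 + b*eps + eps_air*eps_si = 0 with eps_air = 1
    b = (2*f_air - f_si) * 1.0 + (2*f_si - f_air) * eps_si
    roots = np.roots([-2.0, b, eps_si])
    return roots[np.argmax(roots.real)]          # physical (positive) root

def stack_reflectance(wavelength_nm, eps_si, porosities, total_depth_nm):
    """Normal-incidence reflectance of the graded porous layer on bulk Si."""
    d = total_depth_nm / len(porosities)
    M = np.eye(2, dtype=complex)
    for p in porosities:                         # topmost sublayer first
        n = np.sqrt(bruggeman(eps_si, p))
        delta = 2.0 * np.pi * n * d / wavelength_nm
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    n_sub = np.sqrt(eps_si)                      # bulk Si substrate
    B, C = M @ np.array([1.0, n_sub])
    r = (B - C) / (B + C)                        # incident medium: air (n0 = 1)
    return abs(r) ** 2

eps_si = 15.0 + 0.2j                             # placeholder Si permittivity
porosity_profile = np.linspace(0.85, 0.05, 20)   # high porosity at the surface
R = [stack_reflectance(wl, eps_si, porosity_profile, total_depth_nm=500.0)
     for wl in np.arange(400, 801, 50)]
print(np.round(R, 3))
```

In the analysis described above, the porosity profile would be the quantity adjusted until the modeled spectrum matches the measured one.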
NASA Astrophysics Data System (ADS)
Pon-Barry, Heather; Packard, Becky Wai-Ling; St. John, Audrey
2017-01-01
A dilemma within computer science departments is developing sustainable ways to expand capacity within introductory computer science courses while remaining committed to inclusive practices. Training near-peer mentors for peer code review is one solution. This paper describes the preparation of near-peer mentors for their role, with a focus on regular, consistent feedback via peer code review and inclusive pedagogy. Introductory computer science students provided consistently high ratings of the peer mentors' knowledge, approachability, and flexibility, and credited peer mentor meetings for their strengthened self-efficacy and understanding. Peer mentors noted the value of videotaped simulations with reflection, discussions of inclusion, and the cohort's weekly practicum for improving practice. Adaptations of peer mentoring for different types of institutions are discussed. Computer science educators, with hopes of improving the recruitment and retention of underrepresented groups, can benefit from expanding their peer support infrastructure and improving the quality of peer mentor preparation.
Estimating soil water evaporation using radar measurements
NASA Technical Reports Server (NTRS)
Sadeghi, Ali M.; Scott, H. D.; Waite, W. P.; Asrar, G.
1988-01-01
Field studies were conducted to evaluate the application of radar reflectivity as compared with the shortwave reflectivity (albedo) used in the Idso-Jackson equation for the estimation of daily evaporation under overcast sky and subhumid climatic conditions. Soil water content, water potential, shortwave and radar reflectivity, and soil and air temperatures were monitored during three soil drying cycles. The data from each cycle were used to calculate daily evaporation from the Idso-Jackson equation and from two other standard methods, the modified Penman and plane of zero-flux. All three methods resulted in similar estimates of evaporation under clear sky conditions; however, under overcast sky conditions, evaporation fluxes computed from the Idso-Jackson equation were consistently lower than the other two methods. The shortwave albedo values in the Idso-Jackson equation were then replaced with radar reflectivities and a new set of total daily evaporation fluxes were calculated. This resulted in a significant improvement in computed soil evaporation fluxes from the Idso-Jackson equation, and a better agreement between the three methods under overcast sky conditions.
Improving experimental phases for strong reflections prior to density modification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uervirojnangkoorn, Monarin; University of Lübeck, Ratzeburger Allee 160, 23538 Lübeck; Hilgenfeld, Rolf, E-mail: hilgenfeld@biochem.uni-luebeck.de
A genetic algorithm has been developed to optimize the phases of the strongest reflections in SIR/SAD data. This is shown to facilitate density modification and model building in several test cases. Experimental phasing of diffraction data from macromolecular crystals involves deriving phase probability distributions. These distributions are often bimodal, making their weighted average, the centroid phase, improbable, so that electron-density maps computed using centroid phases are often non-interpretable. Density modification brings in information about the characteristics of electron density in protein crystals. In successful cases, this allows a choice between the modes in the phase probability distributions, and the maps can cross the borderline between non-interpretable and interpretable. Based on the suggestions by Vekhter [Vekhter (2005), Acta Cryst. D61, 899–902], the impact of identifying optimized phases for a small number of strong reflections prior to the density-modification process was investigated while using the centroid phase as a starting point for the remaining reflections. A genetic algorithm was developed that optimizes the quality of such phases using the skewness of the density map as a target function. Phases optimized in this way are then used in density modification. In most of the tests, the resulting maps were of higher quality than maps generated from the original centroid phases. In one of the test cases, the new method sufficiently improved a marginal set of experimental SAD phases to enable successful map interpretation. A computer program, SISA, has been developed to apply this method for phase improvement in macromolecular crystallography.
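A toy sketch of the optimization idea is shown below: the phases of the strongest reflections are treated as genes, the map is synthesized by an inverse FFT while all other reflections keep their centroid phases, and map skewness is used as the fitness function. Everything here (P1 cell, synthetic amplitudes, GA settings) is made up for illustration; the SISA program works on real SIR/SAD data.

```python
# Toy genetic-algorithm phase optimization with map skewness as the target.
import numpy as np

rng = np.random.default_rng(1)
shape = (16, 16, 16)
F_amp = rng.random(shape)                          # stand-in amplitudes
phi_centroid = rng.uniform(-np.pi, np.pi, shape)   # stand-in centroid phases

strong = F_amp > np.quantile(F_amp, 0.99)          # strongest ~1% of reflections
n_strong = int(strong.sum())

def map_skewness(phi_strong):
    phases = phi_centroid.copy()
    phases[strong] = phi_strong
    rho = np.fft.ifftn(F_amp * np.exp(1j * phases)).real
    d = rho - rho.mean()
    return (d**3).mean() / (d**2).mean() ** 1.5    # skewness = fitness

# minimal GA: Gaussian mutation plus truncation selection
pop = rng.uniform(-np.pi, np.pi, (40, n_strong))
for gen in range(50):
    fitness = np.array([map_skewness(p) for p in pop])
    parents = pop[np.argsort(fitness)[-10:]]        # keep the 10 best
    children = np.repeat(parents, 4, axis=0)
    children += rng.normal(0.0, 0.3, children.shape)  # mutate the phases
    pop = children
print("best skewness found:", max(map_skewness(p) for p in pop))
```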
On-line confidence monitoring during decision making.
Dotan, Dror; Meyniel, Florent; Dehaene, Stanislas
2018-02-01
Humans can readily assess their degree of confidence in their decisions. Two models of confidence computation have been proposed: post hoc computation using post-decision variables and heuristics, versus online computation using continuous assessment of evidence throughout the decision-making process. Here, we arbitrate between these theories by continuously monitoring finger movements during a manual sequential decision-making task. Analysis of finger kinematics indicated that subjects kept separate online records of evidence and confidence: finger deviation continuously reflected the ongoing accumulation of evidence, whereas finger speed continuously reflected the momentary degree of confidence. Furthermore, end-of-trial finger speed predicted the post-decisional subjective confidence rating. These data indicate that confidence is computed on-line, throughout the decision process. Speed-confidence correlations were previously interpreted as a post-decision heuristics, whereby slow decisions decrease subjective confidence, but our results suggest an adaptive mechanism that involves the opposite causality: by slowing down when unconfident, participants gain time to improve their decisions. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Fricke, C. L.
1975-01-01
A solution to the problem of reflection from a semi-infinite atmosphere is presented, based upon Chandrasekhar's H-function method for linearly anisotropic phase functions. A modification to the Gauss quadrature formula which gives about the same accuracy with 10 points as the conventional Gauss quadrature does with 100 points was developed. A computer program achieving this solution is described and results are presented for several illustrative cases.
An improved method to estimate reflectance parameters for high dynamic range imaging
NASA Astrophysics Data System (ADS)
Li, Shiying; Deguchi, Koichiro; Li, Renfa; Manabe, Yoshitsugu; Chihara, Kunihiro
2008-01-01
Two methods are described to accurately estimate diffuse and specular reflectance parameters for colors, gloss intensity and surface roughness, over the dynamic range of the camera used to capture input images. Neither method needs to segment color areas on an image, or to reconstruct a high dynamic range (HDR) image. The second method improves on the first, bypassing the requirement for specific separation of diffuse and specular reflection components. For the latter method, diffuse and specular reflectance parameters are estimated separately, using the least squares method. Reflection values are initially assumed to be diffuse-only reflection components, and are subjected to the least squares method to estimate diffuse reflectance parameters. Specular reflection components, obtained by subtracting the computed diffuse reflection components from reflection values, are then subjected to a logarithmically transformed equation of the Torrance-Sparrow reflection model, and specular reflectance parameters for gloss intensity and surface roughness are finally estimated using the least squares method. Experiments were carried out using both methods, with simulation data at different saturation levels, generated according to the Lambert and Torrance-Sparrow reflection models, and the second method, with spectral images captured by an imaging spectrograph and a moving light source. Our results show that the second method can estimate the diffuse and specular reflectance parameters for colors, gloss intensity and surface roughness more accurately and faster than the first one, so that colors and gloss can be reproduced more efficiently for HDR imaging.
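A rough sketch of the two-step least-squares idea is given below for a single colour channel. The simplified Torrance-Sparrow form used here (no Fresnel or geometric attenuation terms), the synthetic measurements, and the parameter values are assumptions; the estimates it produces are correspondingly rough.

```python
# Two-step least squares: fit diffuse first, then log-transformed specular.
import numpy as np

rng = np.random.default_rng(2)

# synthetic geometry: incidence angle, viewing angle, half-angle alpha (rad)
theta_i = rng.uniform(0, np.pi / 3, 500)
theta_r = rng.uniform(0, np.pi / 3, 500)
alpha   = rng.uniform(0, 0.5, 500)

kd_true, ks_true, sigma_true = 0.6, 0.8, 0.12
I = (kd_true * np.cos(theta_i)
     + (ks_true / np.cos(theta_r)) * np.exp(-alpha**2 / (2 * sigma_true**2))
     + rng.normal(0, 0.005, 500))

# Step 1: least-squares fit assuming the values are diffuse-only reflection.
kd_est = np.linalg.lstsq(np.cos(theta_i)[:, None], I, rcond=None)[0][0]

# Step 2: subtract the fitted diffuse part, log-transform the simplified
# Torrance-Sparrow term and fit (ks, sigma) by linear least squares:
#   ln(I_s * cos(theta_r)) = ln(ks) - alpha^2 / (2 * sigma^2)
I_s = I - kd_est * np.cos(theta_i)
mask = I_s > 1e-3                                  # keep usable specular samples
y = np.log(I_s[mask] * np.cos(theta_r[mask]))
A = np.column_stack([np.ones(mask.sum()), alpha[mask] ** 2])
(intercept, slope), *_ = np.linalg.lstsq(A, y, rcond=None)
ks_est, sigma_est = np.exp(intercept), np.sqrt(-1.0 / (2.0 * slope))

print("kd, ks, sigma estimates:", kd_est, ks_est, sigma_est)
```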
Pan, Yijie; Wang, Yongtian; Liu, Juan; Li, Xin; Jia, Jia
2014-03-01
Previous research [Appl. Opt.52, A290 (2013)] has revealed that Fourier analysis of three-dimensional affine transformation theory can be used to improve the computation speed of the traditional polygon-based method. In this paper, we continue our research and propose an improved full analytical polygon-based method developed upon this theory. Vertex vectors of primitive and arbitrary triangles and the pseudo-inverse matrix were used to obtain an affine transformation matrix representing the spatial relationship between the two triangles. With this relationship and the primitive spectrum, we analytically obtained the spectrum of the arbitrary triangle. This algorithm discards low-level angular dependent computations. In order to add diffusive reflection to each arbitrary surface, we also propose a whole matrix computation approach that takes advantage of the affine transformation matrix and uses matrix multiplication to calculate shifting parameters of similar sub-polygons. The proposed method improves hologram computation speed for the conventional full analytical approach. Optical experimental results are demonstrated which prove that the proposed method can effectively reconstruct three-dimensional scenes.
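The sketch below shows one simple reading of the vertex-vector step: an affine matrix relating a primitive (reference) triangle to an arbitrary scene triangle is obtained from homogeneous vertex vectors and a pseudo-inverse. The vertex coordinates are made up, and the subsequent analytical spectrum mapping and diffusive-reflection shifting described in the abstract are not reproduced.

```python
# Affine map between primitive and arbitrary triangles via a pseudo-inverse.
import numpy as np

P = np.array([[0.0, 1.0, 0.0],     # primitive triangle vertices (columns)
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
Q = np.array([[0.2, 1.1, 0.3],     # arbitrary triangle vertices (columns)
              [0.1, 0.4, 1.2],
              [0.5, 0.6, 0.7]])

Ph = np.vstack([P, np.ones(3)])    # homogeneous vertex vectors (4 x 3)
A = Q @ np.linalg.pinv(Ph)         # 3 x 4 affine matrix with A @ Ph = Q
print(np.allclose(A @ Ph, Q))      # True for non-degenerate triangles
```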
Learning experiences of science teachers in a computer-mediated communication context
NASA Astrophysics Data System (ADS)
Chung, Chia-Jung
Computer-mediated communication (CMC) has been applied increasingly in staff development efforts for teachers. Many teacher education programs are looking to CMC, particularly computer conferencing systems, as an effective and low-cost medium for the delivery of teacher education programs anytime, anywhere. Based on constructivist learning theories, this study examined the use of an online discussion board in a graduate course as a place where forty-six inservice teachers shared experiences and ideas. Data collection focused on online discussion transcripts of all the messages from three separate weeks and was supplemented by interviews and teacher self-evaluation reports. The nature and development of the discussions were studied over one semester by analyzing teacher online discussions in two domains: critical reflection and social-interpersonal rapport. In effect, this study provided insights into how to employ computer conferencing technology in facilitating inservice teachers' teaching practices and their professional development. Major findings include: (1) Participation: the level of participation varied during the semester, being higher at the beginning and lower at the end. (2) Critical Reflection: teachers' critical reflection developed over time as a result of the online discussion board, as indicated by mean critical-thinking scores during the three selected weeks; cognitive presence was found mostly in the focused discussion forums, while social presence existed mainly in the unfocused discussion forums. (3) Social-Interpersonal Rapport: the number of social cues in the messages increased initially but declined significantly over time; when teachers focused more on on-task discussions or critical reflection, there was less social conversation. (4) Teaching Practices and Professional Development: the researcher, the instructor, and the teachers identified advantages of using computer conferencing for improving teaching practices and for professional development. The results of this study suggest that applying computer-mediated communication in teacher education would have a positive impact on teachers' growth in critical reflection and social-interpersonal rapport. Furthermore, this study may encourage other researchers to use cognitive and social learning theories as theoretical backgrounds for developing teacher education models that apply computer conferencing.
Processing data, for improved, accuracy, from device for measuring speed of sound in a gas
Owen, Thomas E.
2006-09-19
A method, used in connection with a pulse-echo type sensor for determining the speed of sound in a gas, for improving the accuracy of speed of sound measurements. The sensor operates on the principle that the speed of sound can be derived from the difference between the two-way travel times of signals reflected from two different target faces of the sensor. This time difference is derived by computing the cross correlation between the two reflections. The cross correlation function may be fitted to a parabola whose vertex represents the optimum time coordinate of the coherence peak, thereby providing an accurate measure of the two-way time difference.
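The sketch below demonstrates the core numerical step: cross-correlate the two echoes, locate the integer-lag peak, and refine it with a three-point parabolic vertex fit. The waveform, sample rate, and target spacing are made up for illustration and do not come from the patent.

```python
# Cross-correlation delay estimate with sub-sample parabolic interpolation.
import numpy as np

fs = 10e6                              # sample rate (Hz)
t = np.arange(0, 200e-6, 1 / fs)
pulse = np.exp(-((t - 20e-6) / 2e-6) ** 2) * np.sin(2 * np.pi * 500e3 * t)

true_delay = 37.35e-6                  # two-way travel-time difference (s)
echo1 = pulse
echo2 = np.interp(t - true_delay, t, pulse, left=0.0)   # delayed copy

xc = np.correlate(echo2, echo1, mode="full")
lags = np.arange(-len(t) + 1, len(t))
m = int(np.argmax(xc))

# parabola through the samples at (m-1, m, m+1); vertex offset in samples
y0, y1, y2 = xc[m - 1], xc[m], xc[m + 1]
frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
delay_est = (lags[m] + frac) / fs
print(f"estimated delay: {delay_est*1e6:.3f} us (true {true_delay*1e6:.3f} us)")
```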
Major uncertainties influencing entry probe heat shield design
NASA Technical Reports Server (NTRS)
Congdon, W.
1974-01-01
Factors influencing the design of an outer planet probe heat shield are discussed. Major factors include: uncertainties in the composition and scale height of the planetary atmospheres; the augmentation/attenuation of entry heating by ablation products, which requires more computer study and testing; carbon heat shields, especially carbon phenolic, that possess improved resistance to spallation and need development; and white silica reflecting heat shields with improved resistance to bulk vitrification, which need further development.
Voices from the Classroom: Exceptional Teachers Speak.
ERIC Educational Resources Information Center
Maeroff, Gene I., Ed.
The opinions and experiences reflected in this report are those of exceptional teachers chosen in a national competition, "Thanks to Teachers," sponsored by Apple Computer, Inc., the National Foundation for the Improvement of Education, the National Alliance of Business, and Group W Television. The report is divided into four sections: (1) the…
Lipman, Samantha L; Rouze, Ned C; Palmeri, Mark L; Nightingale, Kathryn R
2016-08-01
Shear waves propagating through interfaces where there is a change in stiffness cause reflected waves that can lead to artifacts in shear wave speed (SWS) reconstructions. Two-dimensional (2-D) directional filters are commonly used to reduce in-plane reflected waves; however, SWS artifacts arise from both in- and out-of-imaging-plane reflected waves. Herein, we introduce 3-D shear wave reconstruction methods as an extension of the previous 2-D estimation methods and quantify the reduction in image artifacts through the use of volumetric SWS monitoring and 4-D directional filters. A Gaussian acoustic radiation force impulse excitation was simulated in phantoms with Young's modulus (E) of 3 kPa and a 5-mm spherical lesion with E = 6, 12, or 18.75 kPa. The 2-D, 3-D, and 4-D directional filters were applied to the displacement profiles to reduce in- and out-of-plane reflected wave artifacts. Contrast-to-noise ratio and SWS bias within the lesion were calculated for each reconstructed SWS image to evaluate image quality. For 2-D SWS image reconstructions, the 3-D directional filters showed greater improvements in image quality than the 2-D filters, and the 4-D directional filters showed marginal improvement over the 3-D filters. Although 4-D directional filters can further reduce the impact of large-magnitude out-of-plane reflection artifacts in SWS images, the computational overhead and transducer costs to acquire 3-D data may outweigh the modest improvements in image quality. The 4-D directional filters have the largest impact in reducing reflection artifacts in 3-D SWS volumes.
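The idea behind such directional filters is easiest to see in two dimensions, as in the sketch below: in the (temporal frequency, spatial frequency) plane, waves traveling away from the push occupy two quadrants, so zeroing the other two quadrants suppresses the reflected (backward-traveling) wave. Grid sizes, speeds, and the sign convention are assumptions; the paper's 3-D and 4-D filters extend the same masking idea to more dimensions.

```python
# 2-D (x, t) directional filter sketch: keep only +x-propagating components.
import numpy as np

dt, dx = 0.1e-3, 0.2e-3                  # time / lateral sample spacing (s, m)
nt, nx = 256, 128
t = np.arange(nt) * dt
x = np.arange(nx) * dx

c = 2.0                                   # shear wave speed (m/s)
fwd = np.sin(2*np.pi*200*(t[:, None] - x[None, :]/c))           # outgoing wave
bwd = 0.5*np.sin(2*np.pi*200*(t[:, None] + (x[None, :] - x[-1])/c))  # reflection
u = fwd + bwd

U = np.fft.fft2(u)
f = np.fft.fftfreq(nt, dt)[:, None]       # temporal frequencies
k = np.fft.fftfreq(nx, dx)[None, :]       # spatial frequencies
# a +x-propagating component has temporal and spatial frequencies of opposite sign
mask = (f * k) <= 0
u_filtered = np.fft.ifft2(U * mask).real

print("residual vs. forward wave, before / after filtering:",
      np.linalg.norm(u - fwd), np.linalg.norm(u_filtered - fwd))
```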
NASA Astrophysics Data System (ADS)
Rathsam, Jonathan
This dissertation seeks to advance the current state of computer-based sound field simulations for room acoustics. The first part of the dissertation assesses the reliability of geometric sound-field simulations, which are approximate in nature. The second part of the dissertation uses the rigorous boundary element method (BEM) to learn more about reflections from finite reflectors: planar and non-planar. Acoustical designers commonly use geometric simulations to predict sound fields quickly. Geometric simulation of reflections from rough surfaces is still under refinement. The first project in this dissertation investigates the scattering coefficient, which quantifies the degree of diffuse reflection from rough surfaces. The main result is that predicted reverberation time varies inversely with scattering coefficient if the sound field is nondiffuse. Additional results include a flow chart that enables acoustical designers to gauge how sensitive predicted results are to their choice of scattering coefficient. Geometric acoustics is a high-frequency approximation to wave acoustics. At low frequencies, more pronounced wave phenomena cause deviations between real-world values and geometric predictions. Acoustical designers encounter the limits of geometric acoustics in particular when simulating the low frequency response from finite suspended reflector panels. This dissertation uses the rigorous BEM to develop an improved low-frequency radiation model for smooth, finite reflectors. The improved low frequency model is suggested in two forms for implementation in geometric models. Although BEM simulations require more computation time than geometric simulations, BEM results are highly accurate. The final section of this dissertation uses the BEM to investigate the sound field around non-planar reflectors. The author has added convex edges rounded away from the source side of finite, smooth reflectors to minimize coloration of reflections caused by interference from boundary waves. Although the coloration could not be fully eliminated, the convex edge increases the sound energy reflected into previously nonspecular zones. This excess reflected energy is marginally audible using a standard of 20 dB below direct sound energy. The convex-edged panel is recommended for use when designers want to extend reflected energy spatially beyond the specular reflection zone of a planar panel.
Research on rolling element bearing fault diagnosis based on genetic algorithm matching pursuit
NASA Astrophysics Data System (ADS)
Rong, R. W.; Ming, T. F.
2017-12-01
In order to solve the problem of slow computation speed, the matching pursuit algorithm is applied to rolling bearing fault diagnosis, and improvements are made in two respects: the construction of the dictionary and the way atoms are searched. Specifically, the Gabor function, which reflects time-frequency localization characteristics well, is used to construct the dictionary, and a genetic algorithm is used to improve the search speed. A time-frequency analysis method based on a genetic algorithm matching pursuit (GAMP) algorithm is proposed, and the setting of the atom parameters to improve the decomposition results is studied. Simulation and experimental results illustrate that weak fault features of rolling bearings can be extracted effectively by the proposed method while the computation speed increases considerably.
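A simplified matching-pursuit sketch with a Gabor dictionary is shown below. The paper searches the atom parameters with a genetic algorithm; here a small brute-force parameter grid stands in for that search, and the signal is a synthetic impulsive "fault" signature with made-up parameters.

```python
# Matching pursuit over a small Gabor dictionary (grid search stands in for the GA).
import numpy as np

fs = 12000
t = np.arange(0, 0.1, 1 / fs)
signal = np.exp(-300 * np.maximum(t - 0.02, 0)) * np.sin(2*np.pi*3000*t) * (t > 0.02)
signal = signal + 0.1 * np.random.default_rng(3).normal(size=t.size)

def gabor(u, s, f, phase):
    """Unit-norm Gabor atom: exp(-pi*((t-u)/s)^2) * cos(2*pi*f*(t-u) + phase)."""
    g = np.exp(-np.pi * ((t - u) / s) ** 2) * np.cos(2*np.pi*f*(t - u) + phase)
    return g / np.linalg.norm(g)

# coarse parameter grid (stand-in for the GA search of u, s, f, phase)
grid = [(u, s, f, ph)
        for u in np.arange(0.0, 0.1, 0.005)
        for s in (0.002, 0.005, 0.01)
        for f in (1000, 2000, 3000, 4000)
        for ph in (0.0, np.pi / 2)]

residual = signal.copy()
atoms = []
for _ in range(5):                                   # five pursuit iterations
    products = [abs(np.dot(residual, gabor(*p))) for p in grid]
    best = grid[int(np.argmax(products))]
    g = gabor(*best)
    coeff = np.dot(residual, g)
    residual = residual - coeff * g                  # subtract the chosen atom
    atoms.append((best, coeff))
print("selected atom parameters:", [a[0] for a in atoms])
```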
Fisheye-Based Method for GPS Localization Improvement in Unknown Semi-Obstructed Areas
Moreau, Julien; Ambellouis, Sébastien; Ruichek, Yassine
2017-01-01
A precise GNSS (Global Navigation Satellite System) localization is vital for autonomous road vehicles, especially in cluttered or urban environments where satellites are occluded, preventing accurate positioning. We propose to fuse GPS (Global Positioning System) data with fisheye stereovision to address this problem independently of additional data that may be outdated, unavailable, or in need of correlation with reality. Our stereoscope is sky-facing, with 360° × 180° fisheye cameras to observe surrounding obstacles. We propose 3D modelling and plane extraction through the following steps: stereoscope self-calibration for robustness to decalibration, stereo matching that considers neighbouring epipolar curves to compute 3D points, and robust plane fitting based on the generated cartography and a Hough transform. We use these 3D data together with GPS raw data to estimate the pseudorange delay of NLOS (Non-Line-Of-Sight) reflected signals. We exploit the extracted planes to build a visibility mask for NLOS detection. A simplified 3D canyon model allows computation of reflection pseudorange delays. In the end, GPS positioning is computed using the corrected pseudoranges. With experiments on real fixed scenes, we show that the generated 3D models reach metric accuracy and that horizontal GPS positioning accuracy is improved by more than 50%. The proposed procedure is effective, and the proposed NLOS detection outperforms CN0-based methods (Carrier-to-receiver Noise density). PMID:28106746
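A geometric sketch of a single-bounce "canyon" delay is given below: for a vertical reflector at perpendicular distance d behind the receiver, the mirror-image construction gives a far-field extra path length of roughly 2·d·|cos θ|, where θ is the angle between the satellite direction and the wall normal. The wall geometry and satellite position are illustrative assumptions, not the paper's model, which derives the reflecting planes from the fisheye stereo data.

```python
# Single-bounce NLOS extra-path sketch (simplified canyon geometry).
import numpy as np

def nlos_extra_path(sat_unit, wall_normal_unit, wall_distance):
    """Far-field extra path length (m) of one specular wall reflection."""
    return 2.0 * wall_distance * abs(np.dot(sat_unit, wall_normal_unit))

# satellite at 30 deg elevation, 45 deg azimuth (ENU unit vector)
el, az = np.radians(30.0), np.radians(45.0)
sat = np.array([np.cos(el)*np.sin(az), np.cos(el)*np.cos(az), np.sin(el)])

wall_normal = np.array([1.0, 0.0, 0.0])       # wall facing east, 15 m away
extra = nlos_extra_path(sat, wall_normal, 15.0)
print(f"extra path ~ {extra:.1f} m, to be removed from the measured pseudorange")
```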
Aerospace technology can be applied to exploration 'back on earth'. [offshore petroleum resources
NASA Technical Reports Server (NTRS)
Jaffe, L. D.
1977-01-01
Applications of aerospace technology to petroleum exploration are described. Attention is given to seismic reflection techniques, sea-floor mapping, remote geochemical sensing, improved drilling methods, and down-hole acoustic concepts such as down-hole seismic tomography. The seismic reflection techniques include monitoring of swept-frequency explosive or solid-propellant seismic sources, as well as aerial seismic surveys. Telemetry and processing of seismic data may also be performed through the use of aerospace technology. Sea-floor sonar imaging and a computer-aided system of geologic analogies for petroleum exploration are also considered.
Radiation transfer in plant canopies - Scattering of solar radiation and canopy reflectance
NASA Technical Reports Server (NTRS)
Verstraete, Michel M.
1988-01-01
The one-dimensional vertical model of radiation transfer in a plant canopy described by Verstraete (1987) is extended to account for the transfer of diffuse radiation. This improved model computes the absorption and scattering of both visible and near-infrared radiation in a multilayer canopy as a function of solar position and leaf orientation distribution. Multiple scattering is allowed, and the spectral reflectance of the vegetation stand is predicted. The results of the model are compared to those of other models and actual observations.
Improved optical design of nontracking concentrators
NASA Astrophysics Data System (ADS)
Kwan, B. M.; Bannerot, R. B.
1984-08-01
Optical designs based on a two reflections or less criterion have been developed for one and two-facet trapezoidal concentrators. Collector designs resulting from this criterion have been evaluated with the aid of a ray-trace computer simulation which includes the effects of nonideal reflectors. Results indicate a marked increase in performance, particularly for the one-facet designs, as compared to the collectors previously designed with the one reflection or less criterion. A significant result is that when a proper accounting is made for the actual acceptance angle for the concentrators, the performances of the optimal one and two-facet designs become nearly identical, indicating that the previously held contention that improved performance could be achieved with multifaceted reflectors (geometrically approaching the compound parabolic shape) may be incorrect.
Computer Algebra, Virtual Learning Environment and Meaningful Learning: Is It Possible?
ERIC Educational Resources Information Center
Abar, Celina A. A. P.; Barbosa, Lisbete Madsen
2011-01-01
A major challenge faced by teachers nowadays relates to the usage of proper educational technology to achieve a true and meaningful learning experience involving time for reflection. Teachers constantly seek new ways to improve instruction, but in virtual learning environments they often find themselves in a new role, interacting in a dynamic…
Carlisle, Daloni
Innovative use of information technology is improving patient outcomes and making nurses' working lives easier. Nurses at a Birmingham trust are using handheld computers to record vital observations and give early warning to senior clinicians if a patient is deteriorating. The system reflects a trend for healthcare technology based around the needs of clinicians and patients.
Roskovensky, John K [Albuquerque, NM
2009-01-20
A method of detecting clouds in a digital image comprising, for an area of the digital image, determining a reflectance value in at least three discrete electromagnetic spectrum bands, computing a first ratio of the difference between two reflectance values to the sum of the same two values, computing a second ratio of one reflectance value to another reflectance value, choosing one of the reflectance values, and concluding that an opaque cloud exists in the area if the results of each of the two computing steps and the choosing step fall within three corresponding predetermined ranges.
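The sketch below is a direct transcription of the claimed test into code. The band choices and the threshold ranges are placeholders, since the abstract does not specify them; it only illustrates the structure of the decision.

```python
# Structure of the three-test opaque-cloud decision (thresholds are assumed).
def opaque_cloud(r_a, r_b, r_c, thresholds):
    """r_a, r_b, r_c: reflectances of one image area in three spectral bands."""
    ratio1 = (r_a - r_b) / (r_a + r_b)       # normalized difference of two bands
    ratio2 = r_a / r_c                        # simple ratio of two bands
    chosen = r_a                              # the single chosen reflectance value
    (lo1, hi1), (lo2, hi2), (lo3, hi3) = thresholds
    return (lo1 <= ratio1 <= hi1) and (lo2 <= ratio2 <= hi2) and (lo3 <= chosen <= hi3)

# hypothetical thresholds for a bright, spectrally flat (cloud-like) area
thresholds = [(-0.1, 0.1), (0.8, 1.2), (0.4, 1.0)]
print(opaque_cloud(0.55, 0.52, 0.50, thresholds))
```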
Bennett, J M; Booty, M J
1966-01-01
A computational method of determining n and k for an evaporated film from the measured reflectance, transmittance, and film thickness has been programmed for an IBM 7094 computer. The method consists of modifications to the NOTS multilayer film program. The basic program computes normal incidence reflectance, transmittance, phase change on reflection, and other parameters from the optical constants and thicknesses of all materials. In the modification, n and k for the film are varied in a prescribed manner, and the computer picks from among these values one n and one k which yield reflectance and transmittance values almost equalling the measured values. Results are given for films of silicon and aluminum.
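The sketch below illustrates the search idea in modern terms: compute R and T of a single absorbing film on a known substrate with the standard characteristic-matrix formulas, then scan (n, k) for the pair that best reproduces the measured values. This is a generic thin-film calculation, not the NOTS multilayer program itself; the film thickness, wavelength, substrate index, and "measured" values are made up.

```python
# Grid search for (n, k) of an absorbing film from measured R and T.
import numpy as np

def film_RT(n, k, d_nm, wavelength_nm, n_sub):
    """Normal-incidence R and T of one absorbing film on a transparent substrate."""
    N = n - 1j * k                                   # complex film index
    delta = 2.0 * np.pi * N * d_nm / wavelength_nm
    M = np.array([[np.cos(delta), 1j * np.sin(delta) / N],
                  [1j * N * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    denom = B + C                                    # incident medium: air (n0 = 1)
    R = abs((B - C) / denom) ** 2
    T = 4.0 * n_sub.real / abs(denom) ** 2
    return R, T

R_meas, T_meas = 0.45, 0.30                          # hypothetical measurements
d_nm, wavelength_nm, n_sub = 50.0, 600.0, complex(1.5)

def sse(nk):
    R, T = film_RT(*nk, d_nm, wavelength_nm, n_sub)
    return (R - R_meas) ** 2 + (T - T_meas) ** 2

best = min(((n, k) for n in np.arange(0.5, 6.0, 0.02)
                   for k in np.arange(0.0, 4.0, 0.02)), key=sse)
print("best-fit n, k:", best)
```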
A Robust Absorbing Boundary Condition for Compressible Flows
NASA Technical Reports Server (NTRS)
Loh, Ching Y.; Jorgenson, Philip C. E.
2005-01-01
An absorbing non-reflecting boundary condition (NRBC) for practical computations in fluid dynamics and aeroacoustics is presented with theoretical proof. This paper is a continuation and improvement of a previous paper by the author. The absorbing NRBC technique is based on a first principle of non-reflection, which captures the essential physics that a plane wave solution of the Euler equations remains intact across the boundary. The technique is theoretically shown to work for a large class of finite volume approaches. When combined with the hyperbolic conservation laws, the NRBC is simple, robust and truly multi-dimensional; no additional implementation is needed except the prescribed physical boundary conditions. Several numerical examples in multi-dimensional spaces using two different finite volume schemes are presented to demonstrate its robustness in practical computations. Limitations and remedies of the technique are also discussed.
Satellite estimation of incident photosynthetically active radiation using ultraviolet reflectance
NASA Technical Reports Server (NTRS)
Eck, Thomas F.; Dye, Dennis G.
1991-01-01
A new satellite remote sensing method for estimating the amount of photosynthetically active radiation (PAR, 400-700 nm) incident at the earth's surface is described and tested. Potential incident PAR for clear sky conditions is computed from an existing spectral model. A major advantage of the UV approach over existing visible band approaches to estimating insolation is the improved ability to discriminate clouds from high-albedo background surfaces. UV spectral reflectance data from the Total Ozone Mapping Spectrometer (TOMS) were used to test the approach for three climatically distinct, midlatitude locations. Estimates of monthly total incident PAR from the satellite technique differed from values computed from ground-based pyranometer measurements by less than 6 percent. This UV remote sensing method can be applied to estimate PAR insolation over ocean and land surfaces which are free of ice and snow.
Improved mathematical and computational tools for modeling photon propagation in tissue
NASA Astrophysics Data System (ADS)
Calabro, Katherine Weaver
Light interacts with biological tissue through two predominant mechanisms: scattering and absorption, which are sensitive to the size and density of cellular organelles, and to biochemical composition (ex. hemoglobin), respectively. During the progression of disease, tissues undergo a predictable set of changes in cell morphology and vascularization, which directly affect their scattering and absorption properties. Hence, quantification of these optical property differences can be used to identify the physiological biomarkers of disease with interest often focused on cancer. Diffuse reflectance spectroscopy is a diagnostic tool, wherein broadband visible light is transmitted through a fiber optic probe into a turbid medium, and after propagating through the sample, a fraction of the light is collected at the surface as reflectance. The measured reflectance spectrum can be analyzed with appropriate mathematical models to extract the optical properties of the tissue, and from these, a set of physiological properties. A number of models have been developed for this purpose using a variety of approaches -- from diffusion theory, to computational simulations, and empirical observations. However, these models are generally limited to narrow ranges of tissue and probe geometries. In this thesis, reflectance models were developed for a much wider range of measurement parameters, and influences such as the scattering phase function and probe design were investigated rigorously for the first time. The results provide a comprehensive understanding of the factors that influence reflectance, with novel insights that, in some cases, challenge current assumptions in the field. An improved Monte Carlo simulation program, designed to run on a graphics processing unit (GPU), was built to simulate the data used in the development of the reflectance models. Rigorous error analysis was performed to identify how inaccuracies in modeling assumptions can be expected to affect the accuracy of extracted optical property values from experimentally-acquired reflectance spectra. From this analysis, probe geometries that offer the best robustness against error in estimation of physiological properties from tissue, are presented. Finally, several in vivo studies demonstrating the use of reflectance spectroscopy for both research and clinical applications are presented.
Lefor, Alan T
2011-08-01
Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.
Efficient Geometric Sound Propagation Using Visibility Culling
NASA Astrophysics Data System (ADS)
Chandak, Anish
2011-07-01
Simulating propagation of sound can improve the sense of realism in interactive applications such as video games and can lead to better designs in engineering applications such as architectural acoustics. In this thesis, we present geometric sound propagation techniques which are faster than prior methods and map well to upcoming parallel multi-core CPUs. We model specular reflections by using the image-source method and model finite-edge diffraction by using the well-known Biot-Tolstoy-Medwin (BTM) model. We accelerate the computation of specular reflections by applying novel visibility algorithms, FastV and AD-Frustum, which compute visibility from a point. We accelerate finite-edge diffraction modeling by applying a novel visibility algorithm which computes visibility from a region. Our visibility algorithms are based on frustum tracing and exploit recent advances in fast ray-hierarchy intersections, data-parallel computations, and scalable, multi-core algorithms. The AD-Frustum algorithm adapts its computation to the scene complexity and allows small errors in computing specular reflection paths for higher computational efficiency. FastV and our visibility algorithm from a region are general, object-space, conservative visibility algorithms that together significantly reduce the number of image sources compared to other techniques while preserving the same accuracy. Our geometric propagation algorithms are an order of magnitude faster than prior approaches for modeling specular reflections and two to ten times faster for modeling finite-edge diffraction. Our algorithms are interactive, scale almost linearly on multi-core CPUs, and can handle large, complex, and dynamic scenes. We also compare the accuracy of our sound propagation algorithms with other methods. Once sound propagation is performed, it is desirable to listen to the propagated sound in interactive and engineering applications. We can generate smooth, artifact-free output audio signals by applying efficient audio-processing algorithms. We also present the first efficient audio-processing algorithm for scenarios with simultaneously moving source and moving receiver (MS-MR) which incurs less than 25% overhead compared to static source and moving receiver (SS-MR) or moving source and static receiver (MS-SR) scenario.
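A minimal image-source sketch for one specular reflection off a planar wall is shown below: the source is mirrored across the wall plane, and the reflected path length (hence the delay) is the straight-line distance from the image source to the listener. The visibility and occlusion tests, which are the expensive part accelerated by FastV and AD-Frustum, are omitted, and the geometry is made up.

```python
# First-order image-source reflection path for a single planar wall.
import numpy as np

def image_source(src, plane_point, plane_normal):
    """Mirror a source position across the plane defined by (point, normal)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return src - 2.0 * np.dot(src - plane_point, n) * n

src = np.array([1.0, 2.0, 1.5])
listener = np.array([4.0, 1.0, 1.5])
wall_point, wall_normal = np.array([0.0, 5.0, 0.0]), np.array([0.0, -1.0, 0.0])

img = image_source(src, wall_point, wall_normal)
c = 343.0                                    # speed of sound (m/s)
direct_delay = np.linalg.norm(listener - src) / c
reflect_delay = np.linalg.norm(listener - img) / c
print(f"direct {direct_delay*1e3:.2f} ms, first-order reflection {reflect_delay*1e3:.2f} ms")
```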
NASA Astrophysics Data System (ADS)
Zhang, Sijin; Austin, Geoff; Sutherland-Stacey, Luke
2014-05-01
Reverse Kessler warm rain processes were implemented within the Weather Research and Forecasting Model (WRF) and coupled with a Newtonian relaxation (nudging) technique designed to improve quantitative precipitation forecasting (QPF) in New Zealand by making use of observed radar reflectivity and modest computing facilities. One of the reasons for developing such a scheme, rather than using 4D-Var for example, is that variational radar assimilation schemes in general, and 4D-Var in particular, require computational resources beyond the capability of most university groups and indeed some national forecasting centres of small countries like New Zealand. The new scheme adjusts the model water vapor mixing ratio profiles based on observed reflectivity at each time step within an assimilation time window. The whole scheme can be divided into the following steps: (i) the radar reflectivity is first converted to rain water; (ii) the rain water is then used to derive cloud water content according to the reverse Kessler scheme; (iii) the water vapor mixing ratio associated with the cloud water content is then calculated based on the saturation adjustment processes; (iv) finally, the adjusted water vapor is nudged into the model and the model background is updated. Thirteen rainfall cases that occurred in the summer of 2011/2012 in New Zealand were used to evaluate the new scheme; different forecast scores were calculated and showed that the new scheme was able to improve precipitation forecasts on average up to around 7 hours ahead depending on different verification thresholds.
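Step (i), converting reflectivity to rain water before nudging the vapor field, can be illustrated with a short sketch. The Z-q_r coefficients below follow a commonly used relation of the Sun-and-Crook type, and the nudging weight is arbitrary; neither is necessarily what the WRF implementation uses.

    import numpy as np

    def rainwater_from_dbz(dbz, rho_air):
        # Invert dBZ = 43.1 + 17.5*log10(rho_air*q_r [g/m^3]) to get q_r in kg/kg.
        rho_qr_gm3 = 10.0 ** ((dbz - 43.1) / 17.5)
        return 1e-3 * rho_qr_gm3 / rho_air

    dbz = np.array([25.0, 35.0, 45.0])      # observed reflectivity column (dBZ)
    rho = np.array([1.10, 1.00, 0.90])      # air density (kg/m^3)
    qr = rainwater_from_dbz(dbz, rho)

    # Newtonian relaxation (nudging) of the model vapor toward a target derived from
    # the diagnosed condensate; the weight sets the relaxation time scale (assumed).
    qv_model, qv_target, weight = 8.0e-3, 9.0e-3, 0.1
    qv_new = qv_model + weight * (qv_target - qv_model)
    print(qr, qv_new)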
Thilak, Vimal; Voelz, David G; Creusere, Charles D
2007-10-20
A passive-polarization-based imaging system records the polarization state of light reflected by objects that are illuminated with an unpolarized and generally uncontrolled source. Such systems can be useful in many remote sensing applications including target detection, object segmentation, and material classification. We present a method to jointly estimate the complex index of refraction and the reflection angle (reflected zenith angle) of a target from multiple measurements collected by a passive polarimeter. An expression for the degree of polarization is derived from the microfacet polarimetric bidirectional reflectance model for the case of scattering in the plane of incidence. Using this expression, we develop a nonlinear least-squares estimation algorithm for extracting an apparent index of refraction and the reflection angle from a set of polarization measurements collected from multiple source positions. Computer simulation results show that the estimation accuracy generally improves with an increasing number of source position measurements. Laboratory results indicate that the proposed method is effective for recovering the reflection angle and that the estimated index of refraction provides a feature vector that is robust to the reflection angle.
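A minimal sketch of the least-squares idea is given below, but with a simpler Fresnel-based degree-of-polarization model for an ideal dielectric in place of the paper's microfacet pBRDF; the synthetic measurement geometry (known angular offsets between source positions) is an assumption made purely for illustration.

    import numpy as np
    from scipy.optimize import least_squares

    def dop_fresnel(theta_i, n):
        # Degree of polarization of unpolarized light specularly reflected by a dielectric.
        sin_t = np.sin(theta_i) / n
        cos_t = np.sqrt(1.0 - sin_t ** 2)
        rs = ((np.cos(theta_i) - n * cos_t) / (np.cos(theta_i) + n * cos_t)) ** 2
        rp = ((n * np.cos(theta_i) - cos_t) / (n * np.cos(theta_i) + cos_t)) ** 2
        return (rs - rp) / (rs + rp)

    # Synthetic data: DoP measured for several source positions, i.e. several known
    # angular offsets about an unknown reference reflection angle theta0.
    offsets = np.radians([-10.0, -5.0, 0.0, 5.0, 10.0])
    theta0_true, n_true = np.radians(50.0), 1.55
    rng = np.random.default_rng(0)
    meas = dop_fresnel(theta0_true + offsets, n_true) + 0.005 * rng.normal(size=offsets.size)

    def residuals(p):
        theta0, n = p
        return dop_fresnel(theta0 + offsets, n) - meas

    fit = least_squares(residuals, x0=[np.radians(40.0), 1.3])
    print(np.degrees(fit.x[0]), fit.x[1])   # estimated reflection angle and apparent index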
NASA Astrophysics Data System (ADS)
Thilak, Vimal; Voelz, David G.; Creusere, Charles D.
2007-10-01
A passive-polarization-based imaging system records the polarization state of light reflected by objects that are illuminated with an unpolarized and generally uncontrolled source. Such systems can be useful in many remote sensing applications including target detection, object segmentation, and material classification. We present a method to jointly estimate the complex index of refraction and the reflection angle (reflected zenith angle) of a target from multiple measurements collected by a passive polarimeter. An expression for the degree of polarization is derived from the microfacet polarimetric bidirectional reflectance model for the case of scattering in the plane of incidence. Using this expression, we develop a nonlinear least-squares estimation algorithm for extracting an apparent index of refraction and the reflection angle from a set of polarization measurements collected from multiple source positions. Computer simulation results show that the estimation accuracy generally improves with an increasing number of source position measurements. Laboratory results indicate that the proposed method is effective for recovering the reflection angle and that the estimated index of refraction provides a feature vector that is robust to the reflection angle.
Achievements and challenges in structural bioinformatics and computational biophysics.
Samish, Ilan; Bourne, Philip E; Najmanovich, Rafael J
2015-01-01
The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years. These developments are captured annually through the 3DSIG meeting, upon which this article reflects. An increase in the accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved along with their cross-validation with other computational and experimental results. The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. © The Author 2014. Published by Oxford University Press.
Achievements and challenges in structural bioinformatics and computational biophysics
Samish, Ilan; Bourne, Philip E.; Najmanovich, Rafael J.
2015-01-01
Motivation: The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years. These developments are captured annually through the 3DSIG meeting, upon which this article reflects. Results: An increase in the accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved along with their cross-validation with other computational and experimental results. Conclusion: The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. Contact: Rafael.Najmanovich@USherbrooke.ca PMID:25488929
Robot computer problem solving system
NASA Technical Reports Server (NTRS)
Merriam, E. W.; Becker, J. D.
1973-01-01
A robot computer problem solving system which represents a robot exploration vehicle in a simulated Mars environment is described. The model incorporates changes and improvements made to a previously designed robot that operated in a city environment. The Martian environment is modeled in Cartesian coordinates; objects are scattered about a plane; arbitrary restrictions on the robot's vision have been removed; and the robot's path contains arbitrary curves. New environmental features, particularly the visual occlusion of objects by other objects, were added to the model. Two different algorithms were developed for computing occlusion. Movement and vision capabilities of the robot were established in the Mars environment, using a LISP/FORTRAN interface for computational efficiency. The graphical display program was redesigned to reflect the change to the Mars-like environment.
Specimen illumination apparatus with optical cavity for dark field illumination
Pinkel, Daniel; Sudar, Damir; Albertson, Donna
1999-01-01
An illumination apparatus with a specimen slide holder, an illumination source, an optical cavity producing multiple reflection of illumination light to a specimen comprising a first and a second reflective surface arranged to achieve multiple reflections of light to a specimen is provided. The apparatus can further include additional reflective surfaces to achieve the optical cavity, a slide for mounting the specimen, a coverslip which is a reflective component of the optical cavity, one or more prisms for directing light within the optical cavity, antifading solutions for improving the viewing properties of the specimen, an array of materials for analysis, fluorescent components, curved reflective surfaces as components of the optical cavity, specimen detection apparatus, optical detection equipment, computers for analysis of optical images, a plane polarizer, fiberoptics, light transmission apertures, microscopic components, lenses for viewing the specimen, and upper and lower mirrors above and below the specimen slide as components of the optical cavity. Methods of using the apparatus are also provided.
EARL: Exoplanet Analytic Reflected Lightcurves package
NASA Astrophysics Data System (ADS)
Haggard, Hal M.; Cowan, Nicolas B.
2018-05-01
EARL (Exoplanet Analytic Reflected Lightcurves) computes the analytic form of a reflected lightcurve, given a spherical harmonic decomposition of the planet albedo map and the viewing and orbital geometries. The EARL Mathematica notebook allows rapid computation of reflected lightcurves, thus making lightcurve numerical experiments accessible.
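As a degenerate special case of what EARL handles analytically, the phase curve of a uniform-albedo Lambertian sphere has a simple closed form; the sketch below evaluates it with placeholder planet parameters. EARL itself generalizes this to arbitrary spherical-harmonic albedo maps and viewing geometries.

    import numpy as np

    def lambert_phase(alpha):
        # Lambert phase function of a uniform-albedo sphere; alpha = phase angle (rad).
        return (np.sin(alpha) + (np.pi - alpha) * np.cos(alpha)) / np.pi

    A_g = 0.3             # geometric albedo (placeholder)
    rp_over_d = 1.0e-4    # planet radius / orbital distance (placeholder)
    alpha = np.linspace(0.0, np.pi, 181)
    contrast = A_g * rp_over_d ** 2 * lambert_phase(alpha)   # reflected-light contrast
    print(contrast.max())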
Channel Model Optimization with Reflection Residual Component for Indoor MIMO-VLC System
NASA Astrophysics Data System (ADS)
Chen, Yong; Li, Tengfei; Liu, Huanlin; Li, Yichao
2017-12-01
A fast channel modeling method is studied in this paper to solve the problem of computing reflection channel gain for multiple-input multiple-output visible light communications (MIMO-VLC). To limit the computational complexity, which grows with the number of reflections, no more than three reflections are taken into consideration in VLC. We treat a higher-order reflection link as a composition of multiple line-of-sight links and introduce a reflection residual component to characterize higher-order reflections (more than two reflections). We present computer simulation results for the point-to-point channel impulse response, received optical power and received signal-to-noise ratio. Based on theoretical analysis and simulation results, the proposed method can effectively reduce the computational complexity of modeling higher-order reflections in channel modeling.
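The line-of-sight building block that the residual component reuses is the standard Lambertian LED channel gain; a short sketch with assumed emitter and receiver parameters follows.

    import numpy as np

    def los_gain(area, d, phi, psi, half_angle, fov, Ts=1.0, g=1.0):
        # Lambertian LOS gain: H = (m+1)A/(2*pi*d^2) * cos^m(phi) * Ts * g * cos(psi).
        m = -np.log(2.0) / np.log(np.cos(half_angle))   # Lambertian order of the LED
        if psi > fov:
            return 0.0                                   # receiver outside its field of view
        return (m + 1) * area / (2 * np.pi * d ** 2) * np.cos(phi) ** m * Ts * g * np.cos(psi)

    # Assumed geometry: 1 cm^2 photodiode, 2.5 m link, 60 deg LED half-angle, 70 deg FOV.
    print(los_gain(1e-4, 2.5, np.radians(20), np.radians(30), np.radians(60), np.radians(70)))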
NASA Technical Reports Server (NTRS)
Yang, P.; Gao, B.-C.; Baum, B. A.; Wiscombe, W.; Hu, Y.; Nasiri, S. L.; Soulen, P. F.; Heymsfield, A. J.; McFarquhar, G. M.; Miloshevich, L. M.
2000-01-01
A common assumption in satellite imager-based cirrus retrieval algorithms is that the radiative properties of a cirrus cloud may be represented by those associated with a specific ice crystal shape (or habit) and a single particle size distribution. However, observations of cirrus clouds have shown that the shapes and sizes of ice crystals may vary substantially with height within the clouds. In this study we investigate the sensitivity of the top-of-atmosphere bidirectional reflectances at two MODIS bands centered at 0.65 micron and 2.11 micron to the cirrus models assumed to be either a single homogeneous layer or three distinct but contiguous layers. First, we define the single- and three-layer cirrus cloud models with respect to ice crystal habit and size distribution on the basis of in situ replicator data acquired during the First ISCCP Regional Experiment (FIRE-II), held in Kansas during the fall of 1991. Subsequently, fundamental light scattering and radiative transfer theory is employed to determine the single scattering and the bulk radiative properties of the cirrus cloud. Regarding the radiative transfer computations, we present a discrete form of the adding/doubling principle by introducing a direct transmission function, which is computationally straightforward and efficient, an improvement over previous methods. For the 0.65 micron band, at which absorption by ice is negligible, there is little difference between the bidirectional reflectances calculated for the one- and three-layer cirrus models, suggesting that the vertical inhomogeneity effect is relatively unimportant. At the 2.11 micron band, the bidirectional reflectances computed for both optically thin (tau = 1) and thick (tau = 10) cirrus clouds show significant differences between the results for the one- and three-layer models. The reflectances computed for the three-layer cirrus model are substantially larger than those computed for the single-layer cirrus. Finally, we find that cloud reflectance is very sensitive to the optical properties of the small crystals that predominate in the top layer of the three-layer cirrus model. It is critical to define the most realistic geometric shape for the small "quasi-spherical" ice crystals in the top layer for obtaining reliable single-scattering parameters and bulk radiative properties of cirrus.
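A generic matrix form of the adding principle, for two plane-parallel layers that are each symmetric with respect to illumination direction, is sketched below; this is only the textbook operator combination, not the paper's discrete formulation with its direct transmission function.

    import numpy as np

    def add_layers(R1, T1, R2, T2):
        # Combine reflection/transmission operators of two layers, summing the
        # geometric series of inter-layer bounces via (I - R1 R2)^-1.
        eye = np.eye(R1.shape[0])
        bounce = np.linalg.inv(eye - R1 @ R2)
        R12 = R1 + T1 @ R2 @ bounce @ T1
        T12 = T2 @ bounce @ T1
        return R12, T12

    # Toy two-stream example (scalars promoted to 1x1 matrices).
    R1, T1 = np.array([[0.2]]), np.array([[0.7]])
    R2, T2 = np.array([[0.4]]), np.array([[0.5]])
    print(add_layers(R1, T1, R2, T2))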
NASA Astrophysics Data System (ADS)
Dettmer, J.; Quijano, J. E.; Dosso, S. E.; Holland, C. W.; Mandolesi, E.
2016-12-01
Geophysical seabed properties are important for the detection and classification of unexploded ordnance. However, current surveying methods such as vertical seismic profiling, coring, or inversion are of limited use when surveying large areas with high spatial sampling density. We consider surveys based on a source and receiver array towed by an autonomous vehicle which produce large volumes of seabed reflectivity data that contain unprecedented and detailed seabed information. The data are analyzed with a particle filter, which requires efficient reflection-coefficient computation, efficient inversion algorithms and efficient use of computer resources. The filter quantifies information content of multiple sequential data sets by considering results from previous data along the survey track to inform the importance sampling at the current point. Challenges arise from environmental changes along the track where the number of sediment layers and their properties change. This is addressed by a trans-dimensional model in the filter which allows layering complexity to change along a track. Efficiency is improved by likelihood tempering of various particle subsets and including exchange moves (parallel tempering). The filter is implemented on a hybrid computer that combines central processing units (CPUs) and graphics processing units (GPUs) to exploit three levels of parallelism: (1) fine-grained parallel computation of spherical reflection coefficients with a GPU implementation of Levin integration; (2) updating particles by concurrent CPU processes which exchange information using automatic load balancing (coarse-grained parallelism); (3) overlapping CPU-GPU communication (a major bottleneck) with GPU computation by staggering CPU access to the multiple GPUs. The algorithm is applied to spherical reflection coefficients for data sets along a 14-km track on the Malta Plateau, Mediterranean Sea. We demonstrate substantial efficiency gains over previous methods. [This research was supported in part by the U.S. Dept. of Defense, through the Strategic Environmental Research and Development Program (SERDP).]
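The parallel-tempering ingredient mentioned above reduces to a simple Metropolis swap test between particle subsets running at different likelihood temperatures; a minimal sketch with made-up log-likelihoods follows.

    import numpy as np
    rng = np.random.default_rng(1)

    def accept_swap(loglike_i, loglike_j, beta_i, beta_j):
        # Metropolis criterion for exchanging states between two tempered chains.
        log_alpha = (beta_i - beta_j) * (loglike_j - loglike_i)
        return np.log(rng.random()) < min(0.0, log_alpha)

    betas = [1.0, 0.7, 0.4]                  # tempering schedule (1.0 = untempered)
    logls = [-1250.0, -1244.0, -1239.0]      # hypothetical data log-likelihoods
    for i in range(len(betas) - 1):
        if accept_swap(logls[i], logls[i + 1], betas[i], betas[i + 1]):
            logls[i], logls[i + 1] = logls[i + 1], logls[i]
    print(logls)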
ERIC Educational Resources Information Center
Kerchner, Charles; And Others
The early stages of a microcomputer-based project to integrate managerial knowledge and practice are described in this report. Analysis of the problem-framing process that effective principals use to reduce complex problems into more manageable ones forms the basis of the project. Three cognitive-mapping techniques are used to understand the…
ERIC Educational Resources Information Center
Coddington, Lorelei R.
2014-01-01
In the past decade, mathematics performance by all students, especially minority students in low socioeconomic schools, has shown limited improvement nationwide (NCES, 2011). Traditionally in the United States, mathematics has consisted of arithmetic and computational fluency; however, mathematics researchers widely believe that this method of…
NASA Technical Reports Server (NTRS)
Fasnacht, Zachary; Qin, Wenhan; Haffner, David P.; Loyola, Diego; Joiner, Joanna; Krotkov, Nickolay; Vasilkov, Alexander; Spurr, Robert
2017-01-01
Surface Lambertian-equivalent reflectivity (LER) is important for trace gas retrievals in the direct calculation of cloud fractions and indirect calculation of the air mass factor. Current trace gas retrievals use climatological surface LERs. Surface properties that impact the bidirectional reflectance distribution function (BRDF), as well as varying satellite viewing geometry, can be important for retrieval of trace gases. Geometry-Dependent LER (GLER) captures these effects with its calculation of sun-normalized radiances (I/F) and can be used in current LER algorithms (Vasilkov et al. 2016). Pixel-by-pixel radiative transfer calculations are computationally expensive for large datasets. Modern satellite missions such as the Tropospheric Monitoring Instrument (TROPOMI) produce very large datasets as they take measurements at much higher spatial and spectral resolutions. Look-up table (LUT) interpolation improves the speed of radiative transfer calculations, but complexity increases for non-linear functions. Neural networks perform fast calculations and can accurately predict both non-linear and linear functions with little effort.
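A neural-network surrogate of a radiative-transfer look-up table can be prototyped in a few lines; the smooth synthetic function below merely stands in for the sun-normalized radiance I/F, and the network size is an arbitrary choice.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    # Inputs: solar zenith, view zenith, relative azimuth (rad), surface reflectance.
    X = rng.uniform([0.0, 0.0, 0.0, 0.0], [1.2, 1.2, np.pi, 0.8], size=(20000, 4))
    y = 0.1 + 0.4 * X[:, 3] * np.cos(X[:, 0]) + 0.05 * np.sin(X[:, 1]) * np.cos(X[:, 2])

    net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
    net.fit(X[:16000], y[:16000])
    rmse = np.sqrt(np.mean((net.predict(X[16000:]) - y[16000:]) ** 2))
    print(rmse)   # once trained, evaluation is far cheaper than per-pixel radiative transfer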
Improved atmospheric 3D BSDF model in earthlike exoplanet using ray-tracing based method
NASA Astrophysics Data System (ADS)
Ryu, Dongok; Kim, Sug-Whan; Seong, Sehyun
2012-10-01
The studies on planetary radiative transfer computation have become important elements of disk-averaged spectral characterization of potential exoplanets. In this paper, we report an improved ray-tracing based atmospheric simulation model as part of a 3-D Earth-like planet model with three principal sub-components, i.e. land, sea and atmosphere. Any changes in ray paths and their characteristics such as radiative power and direction are computed as they experience reflection, refraction, transmission, absorption and scattering. The improved atmospheric BSDF algorithm uses Q. Liu's combined Rayleigh and aerosol Henyey-Greenstein scattering phase function. The input cloud-free atmosphere model consists of 48 layers with vertical absorption profiles and a scattering layer, with their input characteristics taken from the GIOVANNI database. Total Solar Irradiance data are obtained from the Solar Radiation and Climate Experiment (SORCE) mission. Using aerosol scattering computation, we first tested the atmospheric scattering effects with an imaging simulation of HRIV, EPOXI. Then we examined the computational validity of the atmospheric model against measurements of global, direct and diffuse radiation taken from NREL (National Renewable Energy Laboratory) pyranometers and pyrheliometers at a ground station, for cases of a single incident angle and of simultaneous multiple incident angles of the solar beam.
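The scattering kernel referred to above mixes a Rayleigh term for molecules with a Henyey-Greenstein term for aerosols; a sketch with an arbitrary mixing weight (not Liu's exact combination) is given below.

    import numpy as np

    def henyey_greenstein(cos_theta, g):
        # Henyey-Greenstein phase function, normalized over the full sphere.
        return (1.0 - g ** 2) / (4.0 * np.pi * (1.0 + g ** 2 - 2.0 * g * cos_theta) ** 1.5)

    def rayleigh(cos_theta):
        # Rayleigh (molecular) phase function with the same normalization.
        return 3.0 / (16.0 * np.pi) * (1.0 + cos_theta ** 2)

    def mixed_phase(cos_theta, g=0.7, aerosol_weight=0.6):
        # Illustrative weighted mixture of aerosol and molecular scattering.
        return aerosol_weight * henyey_greenstein(cos_theta, g) + (1.0 - aerosol_weight) * rayleigh(cos_theta)

    print(mixed_phase(np.cos(np.radians([0.0, 90.0, 180.0]))))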
NASA Astrophysics Data System (ADS)
MacDonald, Christopher L.; Bhattacharya, Nirupama; Sprouse, Brian P.; Silva, Gabriel A.
2015-09-01
Computing numerical solutions to fractional differential equations can be computationally intensive due to the effect of non-local derivatives in which all previous time points contribute to the current iteration. In general, numerical approaches that depend on truncating part of the system history while efficient, can suffer from high degrees of error and inaccuracy. Here we present an adaptive time step memory method for smooth functions applied to the Grünwald-Letnikov fractional diffusion derivative. This method is computationally efficient and results in smaller errors during numerical simulations. Sampled points along the system's history at progressively longer intervals are assumed to reflect the values of neighboring time points. By including progressively fewer points backward in time, a temporally 'weighted' history is computed that includes contributions from the entire past of the system, maintaining accuracy, but with fewer points actually calculated, greatly improving computational efficiency.
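For reference, the full-history Grünwald-Letnikov sum that the adaptive-memory method approximates looks as follows; this sketch keeps every past point, whereas the paper's contribution is the weighted subsampling of the tail, which is omitted here.

    import numpy as np

    def gl_weights(alpha, n):
        # Recursive Grünwald-Letnikov weights: w_k = w_{k-1} * (1 - (alpha+1)/k).
        w = np.empty(n + 1)
        w[0] = 1.0
        for k in range(1, n + 1):
            w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
        return w

    def gl_derivative(f_hist, alpha, h):
        # Fractional derivative of order alpha at the latest time point,
        # using the full (untruncated) history f_hist[0..n].
        n = len(f_hist) - 1
        w = gl_weights(alpha, n)
        # Newest sample pairs with w_0, oldest with w_n.
        return h ** (-alpha) * np.dot(w, f_hist[::-1])

    # Example: D^0.5 of f(t) = t on [0, 1]; the exact value at t = 1 is 2/sqrt(pi) ~ 1.128.
    h = 1e-3
    t = np.arange(0.0, 1.0 + h, h)
    print(gl_derivative(t, 0.5, h))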
ERIC Educational Resources Information Center
Wayman, Jeffrey C.
2005-01-01
Accountability mandates such as No Child Left Behind (NCLB) have drawn attention to the practical use of student data for school improvement. Nevertheless, schools may struggle with these mandates because student data are often stored in forms that are difficult to access, manipulate, and interpret. Such access barriers additionally preclude the…
Applications of high power lasers. [using reflection holograms for machining and surface treatment
NASA Technical Reports Server (NTRS)
Angus, J. C.
1979-01-01
The use of computer-generated reflection holograms in conjunction with high-power lasers for precision machining of metals and ceramics was investigated. The reflection holograms, which were developed and made to work at both optical (He-Ne, 6328 A) and infrared (CO2, 10.6 micron) wavelengths, meet the primary practical requirement of ruggedness and are relatively economical and simple to fabricate. The technology is sufficiently advanced now that reflection holography could indeed be used as a practical manufacturing device in certain applications requiring low power densities. However, the present holograms are energy inefficient and much of the laser power is lost in the zero-order spot and higher diffraction orders. Improvements of laser machining over conventional methods are discussed and additional applications are listed. Possible uses in the electronics industry include drilling holes in printed circuit boards, making soldered connections, and resistor trimming.
ERIC Educational Resources Information Center
Chou, Pao-Nan; Chang, Chi-Cheng
2011-01-01
This study examines the effects of reflection category and reflection quality on learning outcomes during a Web-based portfolio assessment process. Experimental subjects consist of forty-five eighth-grade students in a "Computer Application" course. Through the Web-based portfolio assessment system, these students write reflections, and join…
Speed and accuracy improvements in FLAASH atmospheric correction of hyperspectral imagery
NASA Astrophysics Data System (ADS)
Perkins, Timothy; Adler-Golden, Steven; Matthew, Michael W.; Berk, Alexander; Bernstein, Lawrence S.; Lee, Jamine; Fox, Marsha
2012-11-01
Remotely sensed spectral imagery of the earth's surface can be used to fullest advantage when the influence of the atmosphere has been removed and the measurements are reduced to units of reflectance. Here, we provide a comprehensive summary of the latest version of the Fast Line-of-sight Atmospheric Analysis of Spectral Hypercubes atmospheric correction algorithm. We also report some new code improvements for speed and accuracy. These include the re-working of the original algorithm in C-language code parallelized with message passing interface and containing a new radiative transfer look-up table option, which replaces executions of the MODTRAN model. With computation times now as low as ~10 s per image per computer processor, automated, real-time, on-board atmospheric correction of hyper- and multi-spectral imagery is within reach.
Human factors aspects of control room design
NASA Technical Reports Server (NTRS)
Jenkins, J. P.
1983-01-01
A plan for the design and analysis of a multistation control room is reviewed. It is found that acceptance of the computer-based information system by the users in the control room is mandatory for mission and system success. Criteria to improve the computer/user interface include: match of system input/output with user; reliability, compatibility and maintainability; easy to learn and little training needed; self-descriptive system; system under user control; transparent language, format and organization; corresponds to user expectations; adaptable to user experience level; fault tolerant; dialog capability with user communications needs reflected in flexibility, complexity, power and information load; integrated system; and documentation.
NASA Technical Reports Server (NTRS)
Otterman, J.; Fraser, R. S.
1976-01-01
Programs for computing atmospheric transmission and scattering of solar radiation were used to compute the ratios of the Earth-atmosphere system (space) directional reflectivities in the vertical direction to the surface reflectivity, for the four bands of the LANDSAT multispectral scanner (MSS). These ratios are presented as graphs for two water vapor levels, as a function of the surface reflectivity, for various sun elevation angles. Space directional reflectivities in the vertical direction are reported for selected arid regions in Asia, Africa and Central America from the spectral radiance levels measured by the LANDSAT MSS. From these space reflectivities, surface vertical reflectivities were computed applying the pertinent graphs. These surface reflectivities were used to estimate the surface albedo for the entire solar spectrum. The estimated albedos are in the range 0.34-0.52, higher than the values reported by most previous researchers from space measurements, but are consistent with laboratory measurements.
Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I
1995-01-01
Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and used it effectively for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patient safety during surgery. The possibility of assessing brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.
Improved haptic interface for colonoscopy simulation.
Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young
2007-01-01
This paper presents an improved haptic interface of the KAIST-Ewha colonoscopy simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing enough workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures profiles of the user. Manipulation of the colonoscope tip is monitored by four deflection sensors, and triggers computation to render accurate graphic images corresponding to the angle knob rotation. Tack switches are attached on the valve-actuation buttons of the colonoscope to simulate air-injection or suction, and the corresponding deformation of the colon.
ERIC Educational Resources Information Center
Erickson, Judith B.; And Others
1980-01-01
Discusses patterns resulting from the monitor of science education proposals which may reflect problems or differing perceptions of NSF. Discusses these areas: proposal submissions from two-year institutions and social and behavioral scientists, trends in project content at the academic-industrial interface and in computer technology, and…
Exact Rayleigh scattering calculations for use with the Nimbus-7 Coastal Zone Color Scanner
NASA Technical Reports Server (NTRS)
Gordon, Howard R.; Brown, James W.; Evans, Robert H.
1988-01-01
The radiance reflected from a plane-parallel atmosphere and flat sea surface in the absence of aerosols has been determined with an exact multiple scattering code to improve the analysis of Nimbus-7 CZCS imagery. It is shown that the single scattering approximation normally used to compute this radiance can result in errors of up to 5 percent for small and moderate solar zenith angles. A scheme to include the effect of variations in the surface pressure in the exact computation of the Rayleigh radiance is discussed. The results of an application of these computations to CZCS imagery suggest that accurate atmospheric corrections can be obtained for solar zenith angles at least as large as 65 deg.
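The single-scattering approximation being compared against can be written down directly; the sketch below neglects surface Fresnel terms and polarization, so it is only the crudest version of the quantity that the exact multiple-scattering code computes.

    import numpy as np

    def rayleigh_ss_reflectance(tau_r, theta_s, theta_v, dphi):
        # Single-scattering Rayleigh reflectance over a black surface
        # (no aerosols, no surface Fresnel terms, scalar approximation).
        mu_s, mu_v = np.cos(theta_s), np.cos(theta_v)
        # Scattering angle for the reflected (upwelling) beam.
        cos_big_theta = -mu_s * mu_v + np.sin(theta_s) * np.sin(theta_v) * np.cos(dphi)
        phase = 0.75 * (1.0 + cos_big_theta ** 2)      # Rayleigh phase function
        return tau_r * phase / (4.0 * mu_s * mu_v)

    print(rayleigh_ss_reflectance(0.15, np.radians(40), np.radians(30), np.radians(90)))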
Modeling and Computing of Stock Index Forecasting Based on Neural Network and Markov Chain
Dai, Yonghui; Han, Dongmei; Dai, Weihui
2014-01-01
The stock index reflects the fluctuation of the stock market. For a long time, there has been a great deal of research on stock index forecasting. However, traditional methods struggle to achieve ideal precision in a dynamic market due to the influence of many factors such as the economic situation, policy changes, and emergency events. Therefore, approaches based on adaptive modeling and conditional probability transfer have attracted new attention from researchers. This paper presents a new forecast method combining an improved back-propagation (BP) neural network and a Markov chain, as well as its modeling and computing technology. The method includes initial forecasting by the improved BP neural network, division of Markov state regions, computation of the state transition probability matrix, and prediction adjustment. Results of the empirical study show that this method can achieve high accuracy in stock index prediction, and it could provide a good reference for investment in the stock market. PMID:24782659
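The Markov-chain adjustment step can be sketched generically: discretize the network's relative forecast errors into states, estimate a transition matrix, and shift the next forecast by the expected error. All numbers below are made up for illustration.

    import numpy as np

    def transition_matrix(states, n_states):
        # Estimate a Markov transition matrix from a sequence of discrete error states.
        P = np.zeros((n_states, n_states))
        for a, b in zip(states[:-1], states[1:]):
            P[a, b] += 1
        rows = P.sum(axis=1, keepdims=True)
        return np.divide(P, rows, out=np.full_like(P, 1.0 / n_states), where=rows > 0)

    # Relative errors of hypothetical BP-network forecasts vs. actual index values.
    rel_err = np.array([-0.012, 0.004, 0.009, -0.003, 0.015, 0.007, -0.008, 0.002])
    edges = np.array([-0.02, -0.005, 0.005, 0.02])     # state boundaries
    centers = np.array([-0.0125, 0.0, 0.0125])         # representative error per state
    states = np.clip(np.digitize(rel_err, edges) - 1, 0, 2)

    P = transition_matrix(states, 3)
    expected_err = P[states[-1]] @ centers              # expected next relative error
    nn_forecast = 3250.0                                 # hypothetical BP-network output
    adjusted = nn_forecast * (1.0 + expected_err)        # Markov-chain adjustment
    print(P, adjusted)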
Automatic Hotspot and Sun Glint Detection in UAV Multispectral Images
Ortega-Terol, Damian; Ballesteros, Rocio
2017-01-01
Recent advances in sensors, photogrammetry and computer vision have led to high-automation levels of 3D reconstruction processes for generating dense models and multispectral orthoimages from Unmanned Aerial Vehicle (UAV) images. However, these cartographic products are sometimes blurred and degraded due to sun reflection effects which reduce the image contrast and colour fidelity in photogrammetry and the quality of radiometric values in remote sensing applications. This paper proposes an automatic approach for detecting sun reflection problems (hotspot and sun glint) in multispectral images acquired with an Unmanned Aerial Vehicle (UAV), based on a photogrammetric strategy included in a flight planning and control software developed by the authors. In particular, two main consequences are derived from the approach developed: (i) different areas of the images can be excluded since they contain sun reflection problems; (ii) the cartographic products obtained (e.g., digital terrain model, orthoimages) and the agronomical parameters computed (e.g., normalized difference vegetation index, NDVI) are improved since radiometric defects in pixels are not considered. Finally, an accuracy assessment was performed in order to analyse the error in the detection process, getting errors of around 10 pixels for a ground sample distance (GSD) of 5 cm, which is perfectly valid for agricultural applications. This error confirms that the precision in the detection of sun reflections can be guaranteed using this approach and the current low-cost UAV technology. PMID:29036930
Automatic Hotspot and Sun Glint Detection in UAV Multispectral Images.
Ortega-Terol, Damian; Hernandez-Lopez, David; Ballesteros, Rocio; Gonzalez-Aguilera, Diego
2017-10-15
Recent advances in sensors, photogrammetry and computer vision have led to high-automation levels of 3D reconstruction processes for generating dense models and multispectral orthoimages from Unmanned Aerial Vehicle (UAV) images. However, these cartographic products are sometimes blurred and degraded due to sun reflection effects which reduce the image contrast and colour fidelity in photogrammetry and the quality of radiometric values in remote sensing applications. This paper proposes an automatic approach for detecting sun reflection problems (hotspot and sun glint) in multispectral images acquired with an Unmanned Aerial Vehicle (UAV), based on a photogrammetric strategy included in a flight planning and control software developed by the authors. In particular, two main consequences are derived from the approach developed: (i) different areas of the images can be excluded since they contain sun reflection problems; (ii) the cartographic products obtained (e.g., digital terrain model, orthoimages) and the agronomical parameters computed (e.g., normalized difference vegetation index, NDVI) are improved since radiometric defects in pixels are not considered. Finally, an accuracy assessment was performed in order to analyse the error in the detection process, getting errors of around 10 pixels for a ground sample distance (GSD) of 5 cm, which is perfectly valid for agricultural applications. This error confirms that the precision in the detection of sun reflections can be guaranteed using this approach and the current low-cost UAV technology.
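The geometric core of hotspot detection is flagging pixels whose viewing ray is nearly parallel to the direction of the incoming sunlight (the anti-solar point); a sketch with an assumed angular threshold follows.

    import numpy as np

    def hotspot_mask(view_dirs, sun_dir, threshold_deg=15.0):
        # view_dirs: unit vectors from the camera toward the ground, shape (..., 3).
        # sun_dir: vector of sunlight propagation (sun -> ground).
        # Pixels are flagged when the two directions are nearly parallel (hotspot region).
        cos_ang = np.clip(view_dirs @ (sun_dir / np.linalg.norm(sun_dir)), -1.0, 1.0)
        return np.degrees(np.arccos(cos_ang)) < threshold_deg

    sun = np.array([0.3, 0.1, -0.95])
    rays = np.array([[0.29, 0.11, -0.95], [0.0, 0.0, -1.0]])
    rays /= np.linalg.norm(rays, axis=1, keepdims=True)
    print(hotspot_mask(rays, sun))    # first pixel is near the hotspot, second is not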
Performance evaluation of a six-axis generalized force-reflecting teleoperator
NASA Technical Reports Server (NTRS)
Hannaford, B.; Wood, L.; Guggisberg, B.; Mcaffee, D.; Zak, H.
1989-01-01
Work in real-time distributed computation and control has culminated in a prototype force-reflecting telemanipulation system having a dissimilar master (cable-driven, force-reflecting hand controller) and a slave (PUMA 560 robot with custom controller), an extremely high sampling rate (1000 Hz), and a low loop computation delay (5 msec). In a series of experiments with this system and five trained test operators covering over 100 hours of teleoperation, performance was measured in a series of generic and application-driven tasks with and without force feedback, and with control shared between teleoperation and local sensor referenced control. Measurements defining task performance included 100-Hz recording of six-axis force/torque information from the slave manipulator wrist, task completion time, and visual observation of predefined task errors. The task consisted of high precision peg-in-hole insertion, electrical connectors, velcro attach-de-attach, and a twist-lock multi-pin connector. Each task was repeated three times under several operating conditions: normal bilateral telemanipulation, forward position control without force feedback, and shared control. In shared control, orientation was locally servo controlled to comply with applied torques, while translation was under operator control. All performance measures improved as capability was added along a spectrum of capabilities ranging from pure position control through force-reflecting teleoperation and shared control. Performance was optimal for the bare-handed operator.
Earth-atmosphere system and surface reflectivities in arid regions from Landsat MSS data
NASA Technical Reports Server (NTRS)
Otterman, J.; Fraser, R. S.
1976-01-01
Previously developed programs for computing atmospheric transmission and scattering of the solar radiation are used to compute the ratios of the earth-atmosphere system (space) directional reflectivities in the nadir direction to the surface Lambertian reflectivity, for the four bands of the Landsat multispectral scanner (MSS). These ratios are presented as graphs for two water vapor levels, as a function of the surface reflectivity, for various sun elevation angles. Space directional reflectivities in the vertical direction are reported for selected arid regions in Asia, Africa, and Central America from the spectral radiance levels measured by the Landsat MSS. From these space reflectivities, surface reflectivities are computed applying the pertinent graphs. These surface reflectivities are used to estimate the surface albedo for the entire solar spectrum. The estimated albedos are in the range 0.34-0.52, higher than the values reported by most previous researchers from space measurements, but are consistent with laboratory and in situ measurements.
Light reflection models for computer graphics.
Greenberg, D P
1989-04-14
During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
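The direct-lighting-only models of the early era reduce to a few lines of local shading; the Lambert-plus-Blinn-Phong sketch below uses arbitrary material coefficients.

    import numpy as np

    def shade(normal, light_dir, view_dir, kd=0.7, ks=0.3, shininess=32.0):
        # Local reflection model: Lambertian diffuse term plus Blinn-Phong specular term,
        # with direct lighting only (no interreflections, no shadows).
        n = normal / np.linalg.norm(normal)
        l = light_dir / np.linalg.norm(light_dir)
        v = view_dir / np.linalg.norm(view_dir)
        h = (l + v) / np.linalg.norm(l + v)           # half-vector
        diffuse = kd * max(0.0, float(n @ l))
        specular = ks * max(0.0, float(n @ h)) ** shininess
        return diffuse + specular

    print(shade(np.array([0.0, 0.0, 1.0]), np.array([0.3, 0.2, 1.0]), np.array([0.0, 0.0, 1.0])))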
NASA Astrophysics Data System (ADS)
Zuo, Shu-Yu; Tian, Ye; Wei, Qi; Cheng, Ying; Liu, Xiao-Jun
2018-03-01
The use of metasurfaces has allowed the provision of a variety of functionalities by ultrathin structures, paving the way toward novel highly compact analog computing devices. Here, we conceptually realize analog computing using an acoustic reflective computational metasurface (RCM) that can independently manipulate the reflection phase and amplitude of an incident acoustic signal. This RCM is composed of coating unit cells and perforated panels, where the first can tune the transmission phase within the full range of 2π and the second can adjust the reflection amplitude in the range of 0-1. We show that this RCM can achieve arbitrary reflection phase and amplitude and can be used to realize a unique linear spatially invariant transfer function. Using the spatial Fourier transform (FT), an acoustic analog computing (AAC) system is proposed based on the RCM together with a focusing lens. Based on numerical simulations, we demonstrate that this AAC system can perform mathematical operations such as spatial differentiation, integration, and convolution on an incident acoustic signal. The proposed system has low complexity and reduced size because the RCM is able to individually adjust the reflection phase and amplitude and because only one block is involved in performing the spatial FT. Our work may offer a practical, efficient, and flexible approach to the design of compact devices for acoustic computing applications, signal processing, equation solving, and acoustic wave manipulations.
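The spatial differentiation operation realized by the metasurface corresponds to applying the transfer function H(k) = ik in the spatial Fourier domain; a purely numerical sketch of that operation, independent of any particular metasurface design, follows.

    import numpy as np

    # Apply a spatially invariant transfer function in the Fourier domain;
    # H(k) = i*k performs first-order spatial differentiation of the incident profile.
    x = np.linspace(-np.pi, np.pi, 512, endpoint=False)
    signal = np.exp(-x ** 2)                   # incident acoustic pressure profile
    k = np.fft.fftfreq(x.size, d=x[1] - x[0]) * 2.0 * np.pi
    derivative = np.fft.ifft(1j * k * np.fft.fft(signal)).real
    print(np.max(np.abs(derivative - (-2.0 * x * signal))))   # compare with analytic d/dx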
Subjective analysis of energy-management projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, R.
The most successful energy conservation projects always reflect human effort to fine-tune engineering and technological improvements. Subjective analysis is a technique for predicting and measuring human interaction before a project begins. The examples of a subjective analysis for office buildings incorporate evaluative questions that are structured to produce numeric values for computer scoring. Each project would need to develop its own pertinent questions and determine appropriate values for the answers.
Computational ghost imaging using deep learning
NASA Astrophysics Data System (ADS)
Shimobaba, Tomoyoshi; Endo, Yutaka; Nishitsuji, Takashi; Takahashi, Takayuki; Nagahama, Yuki; Hasegawa, Satoki; Sano, Marie; Hirayama, Ryuji; Kakue, Takashi; Shiraki, Atsushi; Ito, Tomoyoshi
2018-04-01
Computational ghost imaging (CGI) is a single-pixel imaging technique that exploits the correlation between known random patterns and the measured intensity of light transmitted (or reflected) by an object. Although CGI can obtain two- or three-dimensional images with a single or a few bucket detectors, the quality of the reconstructed images is reduced by noise due to the reconstruction of images from random patterns. In this study, we improve the quality of CGI images using deep learning. A deep neural network is used to automatically learn the features of noise-contaminated CGI images. After training, the network is able to predict low-noise images from new noise-contaminated CGI images.
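Before any learning is applied, a conventional CGI reconstruction is just a correlation between the random patterns and the bucket signal; the sketch below produces exactly the kind of noisy estimate that the deep network is then trained to clean up (the object and pattern count are arbitrary).

    import numpy as np
    rng = np.random.default_rng(0)

    # Ground-truth object (transmission function) and random illumination patterns.
    obj = np.zeros((32, 32)); obj[8:24, 12:20] = 1.0
    n_patterns = 4000
    patterns = rng.random((n_patterns, 32, 32))

    # Bucket-detector signal: total light transmitted by the object for each pattern.
    bucket = np.tensordot(patterns, obj, axes=([1, 2], [0, 1]))

    # Conventional correlation reconstruction: <B*I> - <B><I>; noisy for finite pattern counts.
    recon = (bucket[:, None, None] * patterns).mean(axis=0) - bucket.mean() * patterns.mean(axis=0)
    print(recon.shape, recon.max())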
Tissue classification for laparoscopic image understanding based on multispectral texture analysis
NASA Astrophysics Data System (ADS)
Zhang, Yan; Wirkert, Sebastian J.; Iszatt, Justin; Kenngott, Hannes; Wagner, Martin; Mayer, Benjamin; Stock, Christian; Clancy, Neil T.; Elson, Daniel S.; Maier-Hein, Lena
2016-03-01
Intra-operative tissue classification is one of the prerequisites for providing context-aware visualization in computer-assisted minimally invasive surgeries. As many anatomical structures are difficult to differentiate in conventional RGB medical images, we propose a classification method based on multispectral image patches. In a comprehensive ex vivo study we show (1) that multispectral imaging data is superior to RGB data for organ tissue classification when used in conjunction with widely applied feature descriptors and (2) that combining the tissue texture with the reflectance spectrum improves the classification performance. Multispectral tissue analysis could thus evolve as a key enabling technique in computer-assisted laparoscopy.
Optimization of a Boiling Water Reactor Loading Pattern Using an Improved Genetic Algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobayashi, Yoko; Aiyoshi, Eitaro
2003-08-15
A search method based on genetic algorithms (GA) using deterministic operators has been developed to generate optimized boiling water reactor (BWR) loading patterns (LPs). The search method uses improved GA operators, that is, crossover, mutation, and selection. The handling of the encoding technique and constraint conditions is designed so that the GA reflects the peculiar characteristics of the BWR. In addition, some strategies such as elitism and self-reproduction are effectively used to improve the search speed. LP evaluations were performed with a three-dimensional diffusion code that coupled neutronic and thermal-hydraulic models. Strong axial heterogeneities and three-dimensional-dependent constraints have always necessitated the use of three-dimensional core simulators for BWRs, so that an optimization method is required for computational efficiency. The proposed algorithm is demonstrated by successfully generating LPs for an actual BWR plant applying the Haling technique. In test calculations, candidates that shuffled fresh and burned fuel assemblies within a reasonable computation time were obtained.
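Stripped of the core simulator, the GA machinery (elitism, one-point crossover, mutation, selection) fits in a short sketch; the binary chromosome and toy fitness below are placeholders, not a BWR loading-pattern encoding.

    import numpy as np
    rng = np.random.default_rng(0)

    def fitness(chrom):
        # Placeholder objective; a real evaluation would call a coupled
        # neutronic/thermal-hydraulic core simulator.
        return -abs(chrom.sum() - 20)

    pop = rng.integers(0, 2, size=(40, 64))
    for gen in range(200):
        scores = np.array([fitness(c) for c in pop])
        pop = pop[np.argsort(scores)[::-1]]              # selection: rank by fitness
        elite = pop[:4].copy()                           # elitism: keep best candidates
        children = []
        while len(children) < len(pop) - len(elite):
            p1, p2 = pop[rng.integers(0, 20, size=2)]    # parents from the top half
            cut = rng.integers(1, 63)
            child = np.concatenate([p1[:cut], p2[cut:]]) # one-point crossover
            flip = rng.random(64) < 0.01                 # mutation
            child[flip] ^= 1
            children.append(child)
        pop = np.vstack([elite, children])
    print(fitness(pop[0]))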
Force reflecting hand controller for manipulator teleoperation
NASA Technical Reports Server (NTRS)
Bryfogle, Mark D.
1991-01-01
A force reflecting hand controller based upon a six degree of freedom fully parallel mechanism, often termed a Stewart Platform, has been designed, constructed, and tested as an integrated system with a slave robot manipulator test bed. A force reflecting hand controller comprises a kinesthetic device capable of transmitting position and orientation commands to a slave robot manipulator while simultaneously representing the environmental interaction forces of the slave manipulator back to the operator through actuators driving the hand controller mechanism. The Stewart Platform was chosen as a novel approach to improve force reflecting teleoperation because of its inherently high ratio of load generation capability to system mass content and the correspondingly high dynamic bandwidth. An additional novelty of the program was to implement closed loop force and torque control about the hand controller mechanism by equipping the handgrip with a six degree of freedom force and torque measuring cell. The mechanical, electrical, computer, and control systems are discussed and system tests are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spirakis, C.S.; Condit, C.D.
1975-01-01
LANDSAT-1 (ERTS-1) multispectral reflectance data were used to enhance the detection of alteration around uranium deposits near Cameron, Ariz. The technique involved stretching and ratioing computer-enhanced data from which electronic noise and atmospheric haze had been removed. Using present techniques, the work proves that LANDSAT-1 data are useful in detecting alteration around uranium deposits, but the method may still be improved. Bluish-gray mudstone in the target area could not be differentiated from the altered zones on the ratioed images. Further experiments involving combinations of ratioed and nonratioed data will be required to uniquely define the altered zones.
Conference summary: computers in respiratory care.
Nelson, Steven B
2004-05-01
Computers and data management in respiratory care reflect the larger practices of hospital information systems: the diversity of conference topics provides evidence. Respiratory care computing has shown a steady, slow progression from writing programs that calculate shunt equations to departmental management systems. Wider acceptance and utilization have been stifled by costs, both initial and on-going. Several authors pointed out the savings that were realized from information systems exceeded the costs of implementation and maintenance. The most significant finding from one of the presentations was that no other structure or skilled personnel could provide respiratory care more efficiently or cost-effectively than respiratory therapists. Online information resources have increased, in forms ranging from peer-reviewed journals to corporate-sponsored advertising posing as authoritative treatment regimens. Practitioners and patients need to know how to use these resources as well as how to judge the value of information they present. Departments are using computers for training on a schedule that is more convenient for the staff, providing information in a timely manner and potentially in more useful formats. Portable devices, such as personal digital assistants (PDAs) have improved the ability not only to share data to dispersed locations, but also to collect data at the point of care, thus greatly improving data capture. Ventilators are changing from simple automated bellows to complex systems collecting numerous respiratory parameters and offering feedback to improve ventilation. Clinical databases routinely collect information from a wide variety of resources and can be used for analysis to improve patient outcomes. What could possibly go wrong?
Crenshaw, Tanya L.; Chambers, Erin W.; Heeren, Cinda; Metcalf, Heather E.
2017-01-01
Just over 10 years ago, we conducted a culture study of the Computer Science Department at the flagship University of Illinois at Urbana-Champaign, one of the top five computing departments in the country. The study found that while the department placed an emphasis on research, it did so in a way that, in conjunction with a lack of communication and transparency, devalued teaching and mentoring, and negatively impacted the professional development, education, and sense of belonging of the students. As one part of a multi-phase case study spanning over a decade, this manuscript presents preliminary findings from our latest work at the university. We detail early comparisons between data gathered at the Department of Computer Science at the University of Illinois at Urbana-Champaign in 2005 and our most recent pilot case study, a follow-up research project completed in 2016. Though we have not yet completed the full data collection, we find it worthwhile to reflect on the pilot case study data we have collected thus far. Our data reveals improvements in the perceptions of undergraduate teaching quality and undergraduate peer mentoring networks. However, we also found evidence of continuing feelings of isolation, incidents of bias, policy opacity, and uneven policy implementation that are areas of concern, particularly with respect to historically underrepresented groups. We discuss these preliminary follow-up findings, offer research and methodological reflections, and share next steps for applied research that aims to create positive cultural change in computing. PMID:28579969
Blanson Henkemans, O. A.; Rogers, W. A.; Fisk, A. D.; Neerincx, M. A.; Lindenberg, J.; van der Mast, C. A. P. G.
2014-01-01
Summary. Objectives: We developed an adaptive computer assistant for the supervision of diabetics’ self-care, to support limiting illness and need for acute treatment, and improve health literacy. This assistant monitors self-care activities logged in the patient’s electronic diary. Accordingly, it provides context-aware feedback. The objective was to evaluate whether older adults in general can make use of the computer assistant and to compare an adaptive computer assistant with a fixed one, concerning its usability and contribution to health literacy. Methods: We conducted a laboratory experiment in the Georgia Tech Aware Home wherein 28 older adults participated in a usability evaluation of the computer assistant, while engaged in scenarios reflecting normal and health-critical situations. We evaluated the assistant on effectiveness, efficiency, satisfaction, and educational value. Finally, we studied the moderating effects of the subjects’ personal characteristics. Results: Logging self-care tasks and receiving feedback from the computer assistant enhanced the subjects’ knowledge of diabetes. The adaptive assistant was more effective in dealing with normal and health-critical situations, and, generally, it led to more time efficiency. Subjects’ personal characteristics had substantial effects on the effectiveness and efficiency of the two computer assistants. Conclusions: Older adults were able to use the adaptive computer assistant. In addition, it had a positive effect on the development of health literacy. The assistant has the potential to support older diabetics’ self care while maintaining quality of life. PMID:18213433
Health care interprofessional education: encouraging technology, teamwork, and team performance.
2014-04-01
It is critical to prepare nurses for future practice to work in teams by engaging students in interprofessional education (IPE) that fosters positive attitudes toward teamwork. The purpose of this study was to examine the effects of computer-supported IPE on students’ attitudes and perceptions toward health care teamwork and team performance. A hybrid approach to IPE was used to provide students with an educational experience that combined the benefits of traditional face-to-face communication methodology with a computer-mediated platform that focused on reflection and team building. A statistically significant difference was found in students’ perceptions of team performance after engaging in computer-supported IPE. No statistically significant difference in students’ pretest–posttest composite attitude toward teamwork scores was noted; however, there was a positive trend toward improved scores.
Ultrasound power deposition model for the chest wall.
Moros, E G; Fan, X; Straube, W L
1999-10-01
An ultrasound power deposition model for the chest wall was developed based on secondary-source and plane-wave theories. The anatomic model consisted of a muscle-ribs-lung volume, accounted for wave reflection and refraction at muscle-rib and muscle-lung interfaces, and computed power deposition due to the propagation of both reflected and transmitted waves. Lung tissue was assumed to be air-equivalent. The parts of the theory and numerical program dealing with reflection were experimentally evaluated by comparing simulations with acoustic field measurements using several pertinent reflecting materials. Satisfactory agreement was found. A series of simulations were performed to study the influence of angle of incidence of the beam, frequency, and thickness of muscle tissue overlying the ribs on power deposition distributions that may be expected during superficial ultrasound (US) hyperthermia of chest wall recurrences. Both reflection at major interfaces and attenuation in bone were the determining factors affecting power deposition, the dominance of one vs. the other depending on the angle of incidence of the beam. Sufficient energy is reflected by these interfaces to suggest that improvements in thermal doses to overlying tissues are possible with adequate manipulation of the sound field (advances in ultrasonic heating devices) and prospective treatment planning.
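The reflection at each interface follows the plane-wave fluid-fluid formula; a sketch with assumed tissue parameters (not the paper's values) is given below, returning unit magnitude beyond the critical angle.

    import numpy as np

    def reflection_coefficient(rho1, c1, rho2, c2, theta_i):
        # Plane-wave pressure reflection coefficient at a fluid-fluid interface for
        # incidence angle theta_i (rad); simplified to |R| = 1 past the critical angle.
        sin_t = (c2 / c1) * np.sin(theta_i)
        if abs(sin_t) >= 1.0:
            return 1.0 + 0.0j                  # total reflection (phase ignored here)
        theta_t = np.arcsin(sin_t)
        Z1, Z2 = rho1 * c1, rho2 * c2
        num = Z2 * np.cos(theta_i) - Z1 * np.cos(theta_t)
        den = Z2 * np.cos(theta_i) + Z1 * np.cos(theta_t)
        return num / den

    # Illustrative muscle/bone values (assumed): density in kg/m^3, sound speed in m/s.
    print(abs(reflection_coefficient(1050, 1580, 1900, 3500, np.radians(20))) ** 2)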
Acoustic radiosity for computation of sound fields in diffuse environments
NASA Astrophysics Data System (ADS)
Muehleisen, Ralph T.; Beamer, C. Walter
2002-05-01
The use of image and ray tracing methods (and variations thereof) for the computation of sound fields in rooms is relatively well developed. In their regime of validity, both methods work well for prediction in rooms with small amounts of diffraction and mostly specular reflection at the walls. While extensions to these methods to include diffuse reflections and diffraction have been made, they are limited at best. In the fields of illumination and computer graphics, the ray tracing and image methods are joined by another method called luminous radiative transfer, or radiosity. In radiosity, an energy balance between surfaces is computed assuming diffuse reflection at the reflective surfaces. Because the interaction between surfaces is constant, much of the computation required for sound field prediction with multiple or moving source and receiver positions can be reduced. In acoustics the radiosity method has received little attention because of the problems of diffraction and specular reflection. The utility of radiosity in acoustics and an approach to a useful development of the method for acoustics will be presented. The method looks especially useful for sound level prediction in industrial and office environments. [Work supported by NSF.]
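The radiosity energy balance is a small linear system once the form factors between surfaces are known; the toy example below uses made-up form factors and reflection coefficients.

    import numpy as np

    # Acoustic radiosity as an energy balance: B = E + diag(rho) @ F @ B,
    # where F holds form factors between diffusely reflecting surface patches.
    n = 4
    F = np.full((n, n), 1.0 / (n - 1)); np.fill_diagonal(F, 0.0)   # toy form factors (rows sum to 1)
    rho = np.array([0.9, 0.8, 0.7, 0.6])     # diffuse reflection coefficients (1 - absorption)
    E = np.array([1.0, 0.0, 0.0, 0.0])       # only surface 0 receives direct source energy

    B = np.linalg.solve(np.eye(n) - np.diag(rho) @ F, E)
    print(B)   # steady-state radiated energy of each surface; reusable for any receiver position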
Round-off errors in cutting plane algorithms based on the revised simplex procedure
NASA Technical Reports Server (NTRS)
Moore, J. E.
1973-01-01
This report statistically analyzes computational round-off errors associated with the cutting plane approach to solving linear integer programming problems. Cutting plane methods require that the inverse of a sequence of matrices be computed. The problem basically reduces to one of minimizing round-off errors in the sequence of inverses. Two procedures for minimizing this problem are presented, and their influence on error accumulation is statistically analyzed. One procedure employs a very small tolerance factor to round computed values to zero. The other procedure is a numerical analysis technique for reinverting or improving the approximate inverse of a matrix. The results indicated that round-off accumulation can be effectively minimized by employing a tolerance factor which reflects the number of significant digits carried for each calculation and by applying the reinversion procedure once to each computed inverse. If 18 significant digits plus an exponent are carried for each variable during computations, then a tolerance value of 0.1 x 10 to the minus 12th power is reasonable.
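The two error-control procedures can be sketched directly: a tolerance factor that zeroes negligible entries, and one sweep of Newton-Schulz iteration to improve an approximate inverse. The matrix and tolerance below are illustrative, not taken from the report.

    import numpy as np

    def refine_inverse(A, X, sweeps=1):
        # Newton-Schulz refinement of an approximate inverse: X <- X (2I - A X).
        # One sweep roughly doubles the number of correct digits if X is close to A^-1.
        I = np.eye(A.shape[0])
        for _ in range(sweeps):
            X = X @ (2.0 * I - A @ X)
        return X

    def round_tiny(M, tol=1e-13):
        # Set entries below the tolerance to exactly zero (tolerance-factor rounding).
        M = M.copy()
        M[np.abs(M) < tol] = 0.0
        return M

    A = np.array([[4.0, 1.0], [2.0, 3.0]])
    X0 = np.linalg.inv(A) + 1e-4 * np.ones((2, 2))     # a slightly corrupted inverse
    X1 = round_tiny(refine_inverse(A, X0))
    print(np.linalg.norm(X1 @ A - np.eye(2)))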
Automatic Computer Mapping of Terrain
NASA Technical Reports Server (NTRS)
Smedes, H. W.
1971-01-01
Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.
Analysis on trust influencing factors and trust model from multiple perspectives of online Auction
NASA Astrophysics Data System (ADS)
Yu, Wang
2017-10-01
Current reputation models largely neglect online auction trading, so they cannot fully reflect the reputation status of users and may cause operability problems. To evaluate user trust in online auctions correctly, a trust computing model based on multiple influencing factors is established. It aims at overcoming the limited efficiency of current trust computing methods and the limitations of traditional theoretical trust models. The improved model comprehensively considers the trust degree evaluation factors of three types of participants, according to the different participation modes of online auctioneers, to improve the accuracy, effectiveness and robustness of the trust degree. The experiments test the efficiency and performance of our model under different proportions of malicious users, in environments like eBay and the Sporas model. Analysis of the experimental results shows that the proposed model makes up for the deficiencies of existing models and also offers better feasibility.
Error in the Sampling Area of an Optical Disdrometer: Consequences in Computing Rain Variables
Fraile, R.; Castro, A.; Fernández-Raga, M.; Palencia, C.; Calvo, A. I.
2013-01-01
The aim of this study is to improve the estimation of the characteristic uncertainties of optical disdrometers, in an attempt to calculate the effective sampling area according to drop size and to study how this influences the computation of other parameters, taking into account that the real sampling area is always smaller than the nominal area. For large raindrops (a little over 6 mm), the effective sampling area may be half the area indicated by the manufacturer. The error committed in the sampling area is propagated to all the variables that depend on this surface, such as the rain intensity and the reflectivity factor. Both variables tend to underestimate the real value if the sampling area is not corrected. For example, rainfall intensity errors may be up to 50% for large drops, those slightly larger than 6 mm. The same occurs with reflectivity values, which may be up to twice the reflectivity calculated using the uncorrected constant sampling area. The Z-R relationships appear to have little dependence on the sampling area, because both variables depend on it in the same way. These results were obtained by studying one particular rain event that occurred on April 16, 2006. PMID:23844393
The Computer as a Tool for Learning through Reflection. Technical Report No. 376.
ERIC Educational Resources Information Center
Collins, Allan; Brown, John Seely
Because of its ability to record and represent process, the computer can provide a powerful, motivating, and as yet untapped tool for focusing the students' attention directly on their own thought processes and learning through reflection. Properly abstracted and structured, the computational medium can capture the processes by which a novice or…
ERIC Educational Resources Information Center
Chang, Chi-Cheng; Chen, Cheng-Chuan; Chen, Yi-Hui
2012-01-01
This research attempted to categorize reflection in a Web-based portfolio assessment using the Chinese Word Segmenting System (CWSS). Another aim of this research was to explore reflective performance, in which individual differences were further examined. Participants were 45 eighth-grade students from a junior high school taking a computer course.…
NASA Technical Reports Server (NTRS)
Mittra, R.; Rushdi, A.
1979-01-01
An approach for computing the geometrical optics fields reflected from a numerically specified surface is presented. The approach includes a step for locating the specular point and begins by computing the rays reflected off the surface at points where the coordinates, as well as the partial derivatives (or, equivalently, the direction of the normal), are numerically specified. A cluster of three adjacent rays is then chosen to define a 'mean ray' and the divergence factor associated with this mean ray. Finally, the amplitude, phase, and vector direction of the reflected field at a given observation point are derived by associating that point with the nearest mean ray and determining its position relative to such a ray.
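A minimal sketch of the geometrical-optics ingredients mentioned above: reflecting rays off a surface whose normals are numerically specified, and forming a 'mean ray' from a cluster of three adjacent reflected rays. The divergence factor and the field amplitude/phase bookkeeping are omitted, and the sample points and normals are hypothetical.

```python
import numpy as np

def reflect(d, n):
    """Geometrical-optics reflection of a unit ray direction d off a facet
    with unit normal n: r = d - 2 (d . n) n."""
    return d - 2.0 * np.dot(d, n) * n

def mean_ray(points, directions):
    """Define a 'mean ray' from a cluster of three adjacent reflected rays by
    averaging their launch points and (renormalized) directions."""
    p = np.mean(points, axis=0)
    d = np.mean(directions, axis=0)
    return p, d / np.linalg.norm(d)

# Usage: three neighbouring surface samples with numerically specified normals.
incident = np.array([0.0, 0.0, -1.0])
normals = [np.array([0.0, 0.1, 1.0]), np.array([0.05, 0.0, 1.0]), np.array([0.0, -0.05, 1.0])]
normals = [n / np.linalg.norm(n) for n in normals]
pts = [np.array([0.0, 0.0, 0.0]), np.array([0.1, 0.0, 0.0]), np.array([0.0, 0.1, 0.0])]
rays = [reflect(incident, n) for n in normals]
print(mean_ray(pts, rays))
```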
Forward and backward inference in spatial cognition.
Penny, Will D; Zeidman, Peter; Burgess, Neil
2013-01-01
This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of 'lower-level' computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus.
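A minimal sketch of the first example above, forward inference that optimally combines a path-integration estimate with a sensory estimate, here reduced to precision-weighted fusion of two one-dimensional Gaussian location estimates with assumed variances (not the paper's full probabilistic model).

```python
import numpy as np

def fuse_gaussian(mu_pi, var_pi, mu_obs, var_obs):
    """Precision-weighted fusion of a path-integration estimate and a sensory
    estimate of location (1-D Gaussian case): precisions add, and the posterior
    mean is the inverse-variance-weighted average."""
    w_pi, w_obs = 1.0 / var_pi, 1.0 / var_obs
    var_post = 1.0 / (w_pi + w_obs)
    mu_post = var_post * (w_pi * mu_pi + w_obs * mu_obs)
    return mu_post, var_post

# Usage: dead reckoning says x ~ N(2.0, 0.5); a landmark observation says x ~ N(2.6, 0.2).
print(fuse_gaussian(2.0, 0.5, 2.6, 0.2))   # posterior is pulled toward the sharper estimate
```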
A general method for computing the total solar radiation force on complex spacecraft structures
NASA Technical Reports Server (NTRS)
Chan, F. K.
1981-01-01
The method circumvents many of the existing difficulties in computational logic presently encountered in the direct analytical or numerical evaluation of the appropriate surface integral. It may be applied to complex spacecraft structures for computing the total force arising from either specular or diffuse reflection or even from non-Lambertian reflection and re-radiation.
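For context, a commonly used flat-facet radiation-force model (not necessarily the report's formulation) sums, over facets, a term along the Sun line and a term along the surface normal weighted by the specular and diffuse reflection fractions. The sketch below implements that textbook model with assumed coefficient values.

```python
import numpy as np

C = 299_792_458.0          # speed of light, m/s
FLUX = 1361.0              # solar irradiance near Earth, W/m^2

def facet_srp_force(area, n_hat, s_hat, rho_s, rho_d):
    """Solar radiation force on one flat facet: specular fraction rho_s, diffuse
    (Lambertian) fraction rho_d, absorbed fraction 1 - rho_s - rho_d.
    s_hat points from the surface toward the Sun; n_hat is the outward normal."""
    cos_t = np.dot(n_hat, s_hat)
    if cos_t <= 0.0:                      # facet not illuminated
        return np.zeros(3)
    p = FLUX / C                          # radiation pressure
    # Force per unit area: -(P cos)[ (1 - rho_s) s_hat + 2 (rho_s cos + rho_d/3) n_hat ]
    return -p * area * cos_t * ((1.0 - rho_s) * s_hat
                                + 2.0 * (rho_s * cos_t + rho_d / 3.0) * n_hat)

# Usage: a 1 m^2 facet tilted 30 degrees from the Sun line, mostly specular.
n = np.array([np.sin(np.radians(30.0)), 0.0, np.cos(np.radians(30.0))])
s = np.array([0.0, 0.0, 1.0])
print(facet_srp_force(1.0, n, s, rho_s=0.6, rho_d=0.2))   # total force = sum over facets
```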
NASA Astrophysics Data System (ADS)
Kang, Zhizhong
2013-10-01
This paper presents a new approach to the automatic registration of terrestrial laser scanning (TLS) point clouds, using a novel robust estimation method, an efficient BaySAC (BAYes SAmpling Consensus). The proposed method directly generates reflectance images from the 3D point clouds and then extracts keypoints with the SIFT algorithm to identify corresponding image points. The 3D corresponding points, from which the transformation parameters between point clouds are computed, are acquired by mapping the 2D correspondences onto the point cloud. To remove falsely accepted correspondences, we implement a conditional sampling method that selects the n data points with the highest inlier probabilities as a hypothesis set and updates the inlier probability of each data point using a simplified Bayes rule, in order to improve computational efficiency. The prior probability is estimated by verifying the distance invariance between correspondences. The proposed approach is tested on four data sets acquired by three different scanners. The results show that, compared with RANSAC, BaySAC requires fewer iterations and lower computational cost when the hypothesis set is contaminated with more outliers. The registration results also indicate that the proposed algorithm achieves high registration accuracy on all experimental datasets.
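The conditional-sampling idea can be illustrated on a much simpler model than point-cloud registration. The sketch below runs a BaySAC-style loop for 2-D line fitting: the hypothesis set is always the points with the highest current inlier probabilities, and those probabilities are reduced when a hypothesis explains little of the data. The update rule and thresholds are simplified assumptions, not the paper's exact formulation.

```python
import numpy as np

def baysac_line(points, prior, iters=50, tol=0.05):
    """BaySAC-style loop for 2-D line fitting (an illustration of conditional
    sampling, not the paper's registration pipeline). Instead of random sampling,
    the hypothesis set is always the two points with the highest current inlier
    probabilities; their probabilities are lowered when a hypothesis explains
    little of the data."""
    p = np.asarray(prior, float).copy()
    best_model, best_count = None, 0
    for _ in range(iters):
        i, j = np.argsort(p)[-2:]                       # highest-probability pair
        (x1, y1), (x2, y2) = points[i], points[j]
        a, b, c = y2 - y1, x1 - x2, x2 * y1 - x1 * y2   # line a*x + b*y + c = 0
        d = np.abs(a * points[:, 0] + b * points[:, 1] + c) / np.hypot(a, b)
        inliers = d < tol
        if inliers.sum() > best_count:
            best_model, best_count = (a, b, c), int(inliers.sum())
        if inliers.mean() >= 0.5:
            break                                       # hypothesis explains most of the data
        p[[i, j]] *= 0.5                                # simplified Bayes-style down-weighting
    return best_model, best_count

# Usage: exact points on y = x plus two gross outliers that (wrongly) start with
# the highest prior; the update demotes them and the line is then recovered.
t = np.linspace(0.0, 1.0, 20)
pts = np.vstack([np.column_stack([t, t]), [[0.2, 5.0], [0.8, -4.0]]])
prior = np.concatenate([np.full(20, 0.6), [0.9, 0.9]])
print(baysac_line(pts, prior))       # ~20 of 22 points explained
```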
Yeung, E.S.; Woodruff, S.D.
1984-06-19
A refractive index and absorption detector is disclosed for liquid chromatography. It is based in part on a Fabry-Perot interferometer and is used for the improved detection of refractive index and absorption. It includes a Fabry-Perot interferometer having a normally fixed first partially reflecting mirror and a movable second partially reflecting mirror. A chromatographic flow cell is positioned between the mirrors along the optical axis of a monochromatic laser beam passing through the interferometer. A means for deriving information about the interference fringes coming out of the interferometer is used with a minicomputer to compute the refractive index of the specimen injected into the flow cell. The minicomputer continuously scans the interferometer for continuous refractive index readings and outputs the continuous results of the scans on a chart recorder. The absorption of the specimen can concurrently be scanned by including a second optical path for an excitation laser which will not interfere with the first laser, but will affect the specimen so that absorption properties can be detected. By first scanning for the refractive index of the specimen, then immediately adding the excitation laser and subsequently scanning for the refractive index again, the absorption of the specimen can be computed and recorded. 10 figs.
NASA Astrophysics Data System (ADS)
Song, Wanjun; Zhang, Hou
2017-11-01
By introducing the alternating direction implicit (ADI) technique and a memory-optimized algorithm into the shift-operator (SO) finite-difference time-domain (FDTD) method, a memory-optimized SO-ADI FDTD method for nonmagnetized collisional plasma is proposed, and the corresponding formulae for programming are deduced. To further improve computational efficiency, an iterative method rather than Gaussian elimination is employed to solve the equation set in the derivation of the formulae. Complicated transformations and convolutions are avoided in the proposed method compared with the Z-transform (ZT) ADI FDTD method and the piecewise linear JE recursive convolution (PLJERC) ADI FDTD method. The numerical dispersion of the SO-ADI FDTD method for different plasma frequencies and electron collision frequencies is analyzed, and an appropriate ratio of grid size to minimum wavelength is given. The accuracy of the proposed method is validated by a reflection-coefficient test on a nonmagnetized collisional plasma sheet. The test results show that the proposed method improves computational efficiency and saves computer memory. The reflection coefficient of a perfect electric conductor (PEC) sheet covered by multilayer plasma and the RCS of objects coated by plasma are calculated with the proposed method, and the simulation results are analyzed.
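As a point of reference for the reflection-coefficient test mentioned above, the normal-incidence reflection from a homogeneous collisional (Drude) plasma layer backed by a PEC sheet can be computed analytically with a transmission-line model. The sketch below is that analytic check, not the proposed FDTD scheme; the layer parameters are illustrative.

```python
import numpy as np

ETA0 = 376.730313668        # free-space impedance, ohms
C0 = 299_792_458.0

def plasma_eps_r(freq, fp, nu):
    """Drude relative permittivity of a nonmagnetized collisional plasma,
    exp(+j omega t) convention: eps_r = 1 - wp^2 / (w (w - j nu))."""
    w, wp = 2.0 * np.pi * freq, 2.0 * np.pi * fp
    return 1.0 - wp**2 / (w * (w - 1j * nu))

def reflection_pec_backed_slab(freq, fp, nu, d):
    """Normal-incidence reflection coefficient of a homogeneous collisional
    plasma layer of thickness d on a PEC sheet (transmission-line model)."""
    eps_r = plasma_eps_r(freq, fp, nu)
    n = np.sqrt(eps_r)                       # principal branch: Im(n) <= 0 here
    k0 = 2.0 * np.pi * freq / C0
    gamma = 1j * k0 * n                      # propagation constant in the layer
    z_in = (ETA0 / n) * np.tanh(gamma * d)   # short-circuit (PEC) termination
    return (z_in - ETA0) / (z_in + ETA0)

# Usage: 1 cm plasma layer, fp = 20 GHz, nu = 10 GHz, probed at 30 GHz.
print(20 * np.log10(abs(reflection_pec_backed_slab(30e9, 20e9, 1e10, 0.01))))
```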
New opportunities for quality enhancing of images captured by passive THz camera
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.
2014-10-01
As is well known, a passive THz camera allows concealed objects to be seen without contact with a person, and the camera is not dangerous to the person. Obviously, the efficiency of a passive THz camera depends on its temperature resolution; this characteristic determines the detection possibilities for a concealed object: the minimal size of the object, the maximal detection distance, and the image quality. Computer processing of THz images can improve image quality many times over without any additional engineering effort, so developing modern computer codes for application to THz images is an urgent problem. Using appropriate new methods, one may expect a temperature resolution that would allow a banknote in a person's pocket to be seen without any physical contact. Modern algorithms for computer processing of THz images also allow objects inside the human body to be seen using a temperature trace on the human skin. This circumstance substantially enhances the opportunities for applying passive THz cameras to counterterrorism problems. We demonstrate the opportunities achieved so far for the detection of both concealed objects and clothing components through computer processing of images captured by passive THz cameras manufactured by various companies. Another important result discussed in the paper is the observation of both THz radiation emitted by an incandescent lamp and an image reflected from a ceramic floorplate. We consider images produced by passive THz cameras manufactured by Microsemi Corp., ThruVision Corp., and Capital Normal University (Beijing, China). All algorithms for computer processing of the THz images considered in this paper were developed by the Russian part of the author list. Keywords: THz wave, passive imaging camera, computer processing, security screening, concealed and forbidden objects, reflected image, hand seeing, banknote seeing, ceramic floorplate, incandescent lamp.
Computed atmospheric corrections for satellite data. [in visible and near IR spectra
NASA Technical Reports Server (NTRS)
Fraser, R. S.
1975-01-01
The corrections are presented for the visible and near infrared spectrum. The specifications of earth-atmosphere models are given. Herman's and Dave's methods of computing the four Stokes parameters are presented. The relative differences between the two sets of values are one percent. The absolute accuracy of the computations can be established only by comparisons with measured data. Suitable observations do not yet exist. Nevertheless, comparisons are made between computed and aircraft and satellite measured radiances. Particulates are the principal atmospheric variable in the window bands. They have a large effect on the radiances when the surface reflectivity is low. When the surface reflectivity exceeds 0.1, only absorbing particulates have a large effect on the reflectivity, unless the atmospheric turbidity is high. The ranges of the Multispectral Scanner responses to atmospheric effects are computed.
Computer driven optical keratometer and method of evaluating the shape of the cornea
NASA Technical Reports Server (NTRS)
Baroth, Edmund C. (Inventor); Mouneimme, Samih A. (Inventor)
1994-01-01
An apparatus and method for measuring the shape of the cornea utilize only one reticle to generate a pattern of rings projected onto the surface of a subject's eye. The reflected pattern is focused onto an imaging device such as a video camera, and a computer compares the reflected pattern with a reference pattern stored in the computer's memory. The differences between the reflected and stored patterns are used to calculate the deformation of the cornea, which may be useful for pre- and post-operative evaluation of the eye by surgeons.
External self-representations improve self-awareness in a child with autism.
Root, Nicholas B; Case, Laura K; Burrus, Caley J; Ramachandran, V S
2015-01-01
We have previously suggested that the social symptoms of autism spectrum disorder (ASD) could be caused in part by a dysfunctional mirror neuron system (MNS). Since the recursive activity of a functioning MNS might enable the brain to integrate visual and motor sensations into a coherent body schema, the deficits in self-awareness often seen in ASD might be caused by the same mirror neuron dysfunction. CL is an autistic adolescent who is profoundly fascinated with his reflection, looking in mirrors at every opportunity. We demonstrate that CL's abnormal gait improves significantly when using a mirror for visual feedback. We also show that both the fascination and the happiness that CL derives from looking at a computer-generated reflection diminish when a delay is introduced between the camera input and screen output. We believe that immediate, real-time visual feedback allows CL to integrate motor sensations with external visual ones into a coherent body schema that he cannot internally generate, perhaps due to a dysfunctional MNS.
NASA Astrophysics Data System (ADS)
Aubé, M.; Simoneau, A.
2018-05-01
Illumina is one of the most physically detailed artificial night sky brightness models to date. It has been in continuous development since 2005 [1]. In 2016-17, many improvements were made to the Illumina code, including an overhead cloud scheme, an improved blocking scheme for subgrid obstacles (trees and buildings), and, most importantly, a full hyperspectral modeling approach. Code optimization resulted in a significant reduction in execution time, enabling users to run the model on standard personal computers for some applications. After describing the new schemes introduced in the model, we give some examples of applications for a peri-urban and a rural site, both located inside the International Dark Sky Reserve of Mont-Mégantic (QC, Canada).
NASA Technical Reports Server (NTRS)
Wilson, Daniel W. (Inventor); Johnson, William R. (Inventor); Bearman, Gregory H. (Inventor)
2011-01-01
Computed tomography imaging spectrometers ("CTISs") employing a single lens are provided. The CTISs may be either transmissive or reflective, and the single lens is either configured to transmit and receive uncollimated light (in transmissive systems), or is configured to reflect and receive uncollimated light (in reflective systems). An exemplary transmissive CTIS includes a focal plane array detector, a single lens configured to transmit and receive uncollimated light, a two-dimensional grating, and a field stop aperture. An exemplary reflective CTIS includes a focal plane array detector, a single mirror configured to reflect and receive uncollimated light, a two-dimensional grating, and a field stop aperture.
The whole space three-dimensional magnetotelluric inversion algorithm with static shift correction
NASA Astrophysics Data System (ADS)
Zhang, K.
2016-12-01
Based on previous studies of static shift correction and 3D inversion algorithms, we improve the NLCG 3D inversion method and propose a new static shift correction method that works within the inversion. The static shift correction method is based on 3D theory and real data: the static shift can be detected by quantitative analysis of the apparent parameters (apparent resistivity and impedance phase) of MT in the high-frequency range, and the correction is completed within the inversion. The method is an automatic computer processing technique with no additional cost, avoiding extra field work and indoor processing, and gives good results. The 3D inversion algorithm is improved (Zhang et al., 2013) based on the NLCG method of Newman & Alumbaugh (2000) and Rodi & Mackie (2001). We added a parallel structure, improved computational efficiency, reduced memory requirements, and added topographic and marine factors, so the 3D inversion can run on a general PC with high efficiency and accuracy, and all MT data from surface, seabed, and underground stations can be used in the inversion. A verification and application example of the 3D inversion algorithm is shown in Figure 1. From the comparison in Figure 1, the inversion model reflects all the anomalous bodies and the terrain clearly regardless of the type of data used (impedance, tipper, or impedance and tipper), and the resolution of the bodies' boundaries can be improved by using tipper data. The algorithm is very effective for inversion with terrain, so it is useful for studies of the continental shelf with continuous exploration of land, marine, and underground data. The three-dimensional electrical model of the ore zone reflects basic information on strata, rocks, and structure. Although it cannot indicate the ore body position directly, important clues for prospecting are provided by the delineation of the diorite pluton uplift range. The test results show that high-quality data processing and an efficient inversion method for electromagnetic data are an important guarantee for porphyry ore exploration.
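The 'quantitative analysis of apparent parameters' mentioned above rests on the standard magnetotelluric relations between impedance, apparent resistivity, and phase; a static shift scales the impedance by a real, frequency-independent factor, changing the apparent resistivity but not the phase. A minimal sketch, with a hypothetical impedance value:

```python
import numpy as np

MU0 = 4e-7 * np.pi

def apparent_parameters(Z, freq):
    """Magnetotelluric apparent resistivity (ohm-m) and impedance phase (deg)
    from a complex impedance element Z (e.g. Zxy) at frequency freq (Hz):
    rho_a = |Z|^2 / (omega * mu0), phi = arg(Z)."""
    omega = 2.0 * np.pi * freq
    rho_a = np.abs(Z) ** 2 / (omega * MU0)
    phi = np.degrees(np.angle(Z))
    return rho_a, phi

# Usage: a static shift multiplies Z by a real, frequency-independent factor,
# scaling rho_a while leaving the phase unchanged -- which is what the
# high-frequency quantitative analysis looks for.
Z = 0.05 * np.exp(1j * np.radians(45.0))     # hypothetical impedance, ohm
print(apparent_parameters(Z, 100.0))
print(apparent_parameters(1.5 * Z, 100.0))   # shifted: rho_a changes, phase does not
```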
Exploring Experienced Professionals' Reflections on Computing Education
ERIC Educational Resources Information Center
Exter, Marisa; Turnage, Nichole
2012-01-01
This exploratory qualitative study examines computing professionals' memories of their own formal and non-formal educational experiences, their reflections on how these have prepared them for their professional roles, and their recommendations for an "ideal" undergraduate degree program. Data was collected through semi-structured interviews of…
Embo, M; Driessen, E; Valcke, M; van der Vleuten, C P M
2015-01-01
Increasingly, reflection is highlighted as integral to core practice competencies, but empirical research into the relationship between reflection and performance in the clinical workplace is scarce. This study investigated the relationship between reflection ability and clinical performance. We designed a cross-sectional and a retrospective-longitudinal cohort study. Data from first-, second- and third-year midwifery students were collected to study the variables 'clinical performance' and 'reflection ability'. Data were analysed with SPSS for Windows, Release 20.0. Descriptive statistics, Pearson's product-moment correlation coefficients (r) and r² values were computed to investigate associations between the research variables. The results showed a moderate observed correlation between reflection ability and clinical performance scores. When adopting a cross-sectional perspective, all correlation values were significant (p<0.01) and above 0.4, with the exception of the third-year correlations. Assuming perfect reliability in the measurement, the adjusted correlations for year 2 and year 3 indicated a high association between reflection ability and clinical performance (>0.6). The results based on the retrospective-longitudinal data set explained a moderate proportion of the variance after correction for attenuation. Finally, the results indicate that 'reflection ability' scores of earlier years are significantly related to 'clinical performance' scores of subsequent years. These results suggest that (1) reflection ability is linked to clinical performance; (2) written reflections are an important, but not the sole, way to assess professional competence; and (3) reflection is a contributor to clinical performance improvement. The data showed a moderate but significant relationship between 'reflection ability' and 'clinical performance' scores in the clinical practice of midwifery students. Reflection therefore seems an important component of professional competence. Copyright © 2014 Elsevier Ltd. All rights reserved.
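For reference, the two statistics central to the analysis above, the Pearson correlation and the correction for attenuation (disattenuation), are sketched below with hypothetical scores and reliability values, not the study's data.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

def disattenuated_r(r_xy, rel_x, rel_y):
    """Correction for attenuation: the correlation that would be observed with
    perfectly reliable measures, r_xy / sqrt(rel_x * rel_y)."""
    return r_xy / np.sqrt(rel_x * rel_y)

# Usage with hypothetical scores and reliabilities (not the study's data):
reflection = [62, 70, 55, 80, 68, 74, 59, 77]
performance = [65, 72, 58, 78, 66, 75, 61, 74]
r = pearson_r(reflection, performance)
print(r, r**2, disattenuated_r(r, rel_x=0.75, rel_y=0.80))
```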
Pore-level determination of spectral reflection behaviors of high-porosity metal foam sheets
NASA Astrophysics Data System (ADS)
Li, Yang; Xia, Xin-Lin; Ai, Qing; Sun, Chuang; Tan, He-Ping
2018-03-01
Open-cell metal foams are currently attracting attention, and their radiative behaviors are of primary importance in high-temperature applications. The spectral reflection behaviors of high-porosity metal foam sheets, namely the bidirectional reflectance distribution function (BRDF) and the directional-hemispherical reflectivity, were numerically investigated. A set of realistic nickel foams with porosity from 0.87 to 0.97 and pore density from 10 to 40 pores per inch were scanned by tomography to obtain their 3-D digital cell networks. A Monte Carlo ray-tracing method was employed to compute the pore-level radiative transfer inside the network within the limit of geometrical optics. The apparent reflection behaviors and their dependency on the textural parameters and strut optical properties were comprehensively computed and analysed. The results show a backward scattering of the reflected energy at the foam sheet surface. Except in the cases of large incident angles, an energy peak is located almost along the incident direction and increases with increasing incident angles. Through an analytical relation established here, the directional-hemispherical reflectivity can be related directly to the porosity of the foam sheet, to the complex refractive index of the solid phase, and to the specularity parameter which characterizes the local reflection model. The computations show a linear decrease in normal-hemispherical reflectivity with increasing porosity. The rate of this decrease is directly proportional to the strut normal reflectivity. In addition, the hemispherical reflectivity increases as a power function of the incident angle cosine.
Li, Huahui; Kong, Lingzhi; Wu, Xihong; Li, Liang
2013-01-01
In reverberant rooms with multiple people talking, spatial separation between speech sources improves recognition of attended speech, even though both the head-shadowing and interaural-interaction unmasking cues are limited by numerous reflections. It is the perceptual integration between the direct wave and its reflections that bridges the direct-reflection temporal gaps and results in the spatial unmasking under reverberant conditions. This study further investigated (1) the temporal dynamics of the direct-reflection-integration-based spatial unmasking as a function of the reflection delay, and (2) whether these temporal dynamics are correlated with the listeners' auditory ability to temporally retain raw acoustic signals (i.e., the fast-decaying primitive auditory memory, PAM). The results showed that recognition of the target speech against the speech-masker background is a descending exponential function of the delay of the simulated target reflection. In addition, the temporal extent of PAM is frequency dependent and markedly longer than that for perceptual fusion. More importantly, the temporal dynamics of the speech-recognition function are significantly correlated with the temporal extent of the PAM of low-frequency raw signals. Thus, we propose that a chain process, which links the earlier-stage PAM with the later-stage correlation computation, perceptual integration, and attention facilitation, plays a role in spatially unmasking target speech under reverberant conditions. PMID:23658664
Legal issues of computer imaging in plastic surgery: a primer.
Chávez, A E; Dagum, P; Koch, R J; Newman, J P
1997-11-01
Although plastic surgeons are increasingly incorporating computer imaging techniques into their practices, many fear the possibility of legally binding themselves to achieve surgical results identical to those reflected in computer images. Computer imaging allows surgeons to manipulate digital photographs of patients to project possible surgical outcomes. Some of the many benefits imaging techniques pose include improving doctor-patient communication, facilitating the education and training of residents, and reducing administrative and storage costs. Despite the many advantages computer imaging systems offer, however, surgeons understandably worry that imaging systems expose them to immense legal liability. The possible exploitation of computer imaging by novice surgeons as a marketing tool, coupled with the lack of consensus regarding the treatment of computer images, adds to the concern of surgeons. A careful analysis of the law, however, reveals that surgeons who use computer imaging carefully and conservatively, and adopt a few simple precautions, substantially reduce their vulnerability to legal claims. In particular, surgeons face possible claims of implied contract, failure to instruct, and malpractice from their use or failure to use computer imaging. Nevertheless, legal and practical obstacles frustrate each of those causes of actions. Moreover, surgeons who incorporate a few simple safeguards into their practice may further reduce their legal susceptibility.
1983-07-15
categories, however, represent the reality in major acquisition and are often overlooked. Although Figure 1 does not reflect the dynamics and interactions... networking and improved computer capabilities, probabilistic network simulation became a reality. The Naval Sea Systems Command became involved in... reasons for using the WBS are plain: 1. Virtually all risk-prone activities are performed by the contractor, not Government. Government is responsible
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shirasaka, Y.; Ito, M.; Okuno, T.
Sequential {sup 123}I-N-isopropyl-p-iodoamphetamine (IMP) single-photon emission computed tomography (SPECT) was performed in 2 patients with acute infantile hemiplegia. In both patients, low uptake of IMP was detected in the targeted abnormal hemisphere. The {sup 123}I-IMP-SPECT findings indicative of a pathologic condition persisted even when the clinical findings and electroencephalographic abnormalities improved. Because of its sensitivity, noninvasiveness, and accurate reflection of the cerebral blood flow distribution, {sup 123}I-IMP-SPECT is useful in the examination of acute infantile hemiplegia and in the evaluation of prognosis.
Creating photorealistic virtual model with polarization-based vision system
NASA Astrophysics Data System (ADS)
Shibata, Takushi; Takahashi, Toru; Miyazaki, Daisuke; Sato, Yoichi; Ikeuchi, Katsushi
2005-08-01
Recently, 3D models have been used in many fields such as education, medical services, entertainment, art, and digital archiving, and the demand for photorealistic virtual models is increasing as computation becomes faster and the desire for higher reality grows. In the computer vision field, a number of techniques have been developed for creating virtual models by observing real objects. In this paper, we propose a method for creating photorealistic virtual models using a laser range sensor and a polarization-based image capture system. We capture range and color images of an object rotated on a rotary table. Using the reconstructed object shape and the sequence of color images of the object, the parameters of a reflection model are estimated in a robust manner. As a result, we can build a photorealistic 3D model that accounts for surface reflection. The key point of the proposed method is that the diffuse and specular reflection components are first separated from the color image sequence, and the reflectance parameters of each reflection component are then estimated separately. In separating the reflection components, we use a polarization filter. This approach enables estimation of the reflectance properties of real objects whose surfaces show specularity as well as diffusely reflected light. The recovered object shape and reflectance properties are then used for synthesizing object images with realistic shading effects under arbitrary illumination conditions.
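A minimal sketch of the polarization-based separation step, under the common simplifying assumption that the specular component is fully polarized and the diffuse component unpolarized, so that the per-pixel minimum and maximum over polarizer angles give the two components. The paper's actual pipeline (robust reflectance-parameter estimation over range and color data) is not reproduced here.

```python
import numpy as np

def separate_reflection(images):
    """Per-pixel separation of diffuse and specular components from images taken
    at several polarizer angles, assuming the specular reflection is fully
    polarized and the diffuse reflection unpolarized, so that
    I(theta) = I_d / 2 + I_s * cos^2(theta - phi),
    giving I_d = 2 * I_min and I_s = I_max - I_min."""
    stack = np.stack(images, axis=0).astype(float)
    return 2.0 * stack.min(axis=0), stack.max(axis=0) - stack.min(axis=0)

# Synthetic single-pixel check: I_d = 40, I_s = 100, specular phase phi = 30 deg.
angles = np.radians(np.arange(0.0, 180.0, 5.0))
frames = [np.array([[40.0 / 2 + 100.0 * np.cos(a - np.radians(30.0)) ** 2]]) for a in angles]
print(separate_reflection(frames))   # recovers (about 40, about 100)
```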
Software Validation via Model Animation
NASA Technical Reports Server (NTRS)
Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.
2015-01-01
This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
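A minimal sketch of the comparison step, assuming the formal model and the implementation are available as callable functions; the toy kinematic example and the tolerance are illustrative, not the PVS/PVSio toolchain used in the paper.

```python
import random

def animate_and_compare(formal_model, implementation, inputs, tol=1e-9):
    """Validation-by-animation sketch: evaluate the formal model and the software
    implementation on the same inputs and flag any outputs that differ by more
    than a tolerance."""
    failures = []
    for x in inputs:
        expected, actual = formal_model(*x), implementation(*x)
        if abs(expected - actual) > tol:
            failures.append((x, expected, actual))
    return failures

# Usage with a toy kinematic projection: position after t seconds at speed v.
formal = lambda p, v, t: p + v * t                     # "formal model" reference
impl = lambda p, v, t: p + v * t * (1 + 1e-16)         # software under test
cases = [(random.uniform(0, 1e4), random.uniform(-300, 300), random.uniform(0, 3600))
         for _ in range(1000)]
print(len(animate_and_compare(formal, impl, cases)))   # 0 means all cases agree within tol
```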
Computer-aided communication satellite system analysis and optimization
NASA Technical Reports Server (NTRS)
Stagl, T. W.; Morgan, N. H.; Morley, R. E.; Singh, J. P.
1973-01-01
The capabilities and limitations of the various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. A Satellite Telecommunication Analysis and Modeling Program (STAMP) for costing and sensitivity analysis work in the application of communication satellites to educational development is given. The modifications made to STAMP include: extension of the six-beam capability to eight; addition of the generation of multiple beams from a single reflector system with an array of feeds; improved system costing to reflect the time value of money and the growth in earth terminal population with time, and to account for various measures of system reliability; inclusion of a model for scintillation at microwave frequencies in the communication link loss model; and an updated technological environment.
Butler, Samuel D; Nauyoks, Stephen E; Marciniak, Michael A
2015-06-01
Of the many classes of bidirectional reflectance distribution function (BRDF) models, two popular classes of models are the microfacet model and the linear systems diffraction model. The microfacet model has the benefit of speed and simplicity, as it uses geometric optics approximations, while linear systems theory uses a diffraction approach to compute the BRDF, at the expense of greater computational complexity. In this Letter, nongrazing BRDF measurements of rough and polished surface-reflecting materials at multiple incident angles are scaled by the microfacet cross section conversion term, but in the linear systems direction cosine space, resulting in great alignment of BRDF data at various incident angles in this space. This results in a predictive BRDF model for surface-reflecting materials at nongrazing angles, while avoiding some of the computational complexities in the linear systems diffraction model.
Topography of Cells Revealed by Variable-Angle Total Internal Reflection Fluorescence Microscopy.
Cardoso Dos Santos, Marcelina; Déturche, Régis; Vézy, Cyrille; Jaffiol, Rodolphe
2016-09-20
We propose an improved version of variable-angle total internal reflection fluorescence microscopy (vaTIRFM) adapted to modern TIRF setup. This technique involves the recording of a stack of TIRF images, by gradually increasing the incident angle of the light beam on the sample. A comprehensive theory was developed to extract the membrane/substrate separation distance from fluorescently labeled cell membranes. A straightforward image processing was then established to compute the topography of cells with a nanometric axial resolution, typically 10-20 nm. To highlight the new opportunities offered by vaTIRFM to quantify adhesion process of motile cells, adhesion of MDA-MB-231 cancer cells on glass substrate coated with fibronectin was examined. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
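The angle dependence that vaTIRFM exploits comes from the evanescent-field penetration depth; a minimal sketch with commonly assumed refractive indices and wavelength (not the paper's calibration) is given below.

```python
import numpy as np

def penetration_depth(theta_deg, wavelength_nm=488.0, n1=1.518, n2=1.37):
    """Evanescent-field penetration depth for TIRF illumination,
    d = lambda / (4 pi sqrt(n1^2 sin^2(theta) - n2^2)), valid beyond the
    critical angle. n1: glass/immersion, n2: cell/buffer (assumed values)."""
    theta = np.radians(theta_deg)
    arg = (n1 * np.sin(theta)) ** 2 - n2 ** 2
    if arg <= 0:
        raise ValueError("angle below the critical angle, no evanescent wave")
    return wavelength_nm / (4.0 * np.pi * np.sqrt(arg))

# Varying the incidence angle changes d, which is what vaTIRFM exploits to
# recover the membrane-substrate separation from an image stack.
for ang in (65.0, 68.0, 72.0):
    print(ang, round(penetration_depth(ang), 1), "nm")
```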
Receive Mode Analysis and Design of Microstrip Reflectarrays
NASA Technical Reports Server (NTRS)
Rengarajan, Sembiam
2011-01-01
Traditionally microstrip or printed reflectarrays are designed using the transmit mode technique. In this method, the size of each printed element is chosen so as to provide the required value of the reflection phase such that a collimated beam results along a given direction. The reflection phase of each printed element is approximated using an infinite array model. The infinite array model is an excellent engineering approximation for a large microstrip array since the size or orientation of elements exhibits a slow spatial variation. In this model, the reflection phase from a given printed element is approximated by that of an infinite array of elements of the same size and orientation when illuminated by a local plane wave. Thus the reflection phase is a function of the size (or orientation) of the element, the elevation and azimuth angles of incidence of a local plane wave, and polarization. Typically, one computes the reflection phase of the infinite array as a function of several parameters such as size/orientation, elevation and azimuth angles of incidence, and in some cases for vertical and horizontal polarization. The design requires the selection of the size/orientation of the printed element to realize the required phase by interpolating or curve fitting all the computed data. This is a substantially complicated problem, especially in applications requiring a computationally intensive commercial code to determine the reflection phase. In dual polarization applications requiring rectangular patches, one needs to determine the reflection phase as a function of five parameters (dimensions of the rectangular patch, elevation and azimuth angles of incidence, and polarization). This is an extremely complex problem. The new method employs the reciprocity principle and reaction concept, two well-known concepts in electromagnetics to derive the receive mode analysis and design techniques. In the "receive mode design" technique, the reflection phase is computed for a plane wave incident on the reflectarray from the direction of the beam peak. In antenna applications with a single collimated beam, this method is extremely simple since all printed elements see the same angles of incidence. Thus the number of parameters is reduced by two when compared to the transmit mode design. The reflection phase computation as a function of five parameters in the rectangular patch array discussed previously is reduced to a computational problem with three parameters in the receive mode. Furthermore, if the beam peak is in the broadside direction, the receive mode design is polarization independent and the reflection phase computation is a function of two parameters only. For a square patch array, it is a function of the size, one parameter only, thus making it extremely simple.
Development of an oximeter for neurology
NASA Astrophysics Data System (ADS)
Aleinik, A.; Serikbekova, Z.; Zhukova, N.; Zhukova, I.; Nikitina, M.
2016-06-01
Cerebral desaturation can occur during surgical manipulation while other parameters vary insignificantly, and prolonged intervals of cerebral anoxia can cause serious damage to the nervous system. The commonly used method for measuring cerebral blood flow uses invasive catheters. Other techniques include single-photon emission computed tomography (SPECT), positron emission tomography (PET), and magnetic resonance imaging (MRI). Tomographic methods frequently require isotope administration, which may result in anaphylactic reactions to contrast media and associated nerve diseases. Moreover, the high cost and the need for continuous monitoring make it difficult to apply these techniques in clinical practice. Cerebral oximetry is a method for measuring oxygen saturation using infrared spectrometry, and reflectance pulse oximetry can also detect sudden changes in sympathetic tone. For this purpose, a reflectance pulse oximeter for use in neurology was developed. A reflectance oximeter has a definite advantage in that it can measure oxygen saturation on any part of the body. Preliminary results indicate that the device has good resolution and high reliability. The modern circuit design applied here has improved the device characteristics compared with existing devices.
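For context, reflectance pulse oximeters commonly estimate oxygen saturation from the 'ratio of ratios' of the pulsatile (AC) and steady (DC) signal components at red and infrared wavelengths; the sketch below uses the widely quoted linear calibration as an assumption, not this device's calibration.

```python
import numpy as np

def spo2_ratio_of_ratios(red_ac, red_dc, ir_ac, ir_dc):
    """Classic pulse-oximetry estimate from the 'ratio of ratios'
    R = (AC_red/DC_red) / (AC_ir/DC_ir), mapped to SpO2 with the widely quoted
    empirical linear calibration SpO2 ~ 110 - 25 R. Real devices use a
    device-specific calibration curve; this is only an illustration."""
    r = (red_ac / red_dc) / (ir_ac / ir_dc)
    return float(np.clip(110.0 - 25.0 * r, 0.0, 100.0))

# Usage with hypothetical photoplethysmogram amplitudes from a reflectance probe.
print(spo2_ratio_of_ratios(red_ac=0.012, red_dc=1.9, ir_ac=0.021, ir_dc=2.1))
```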
NASA Technical Reports Server (NTRS)
Simard, M.; Riel, Bryan; Hensley, S.; Lavalle, Marco
2011-01-01
Radar backscatter data contain both geometric and radiometric distortions due to underlying topography and the radar viewing geometry. Our objective is to develop a radiometric correction algorithm specific to the UAVSAR system configuration that would improve retrieval of forest structure parameters. UAVSAR is an airborne L-band radar capable of repeat-pass interferometry producing images with a spatial resolution of 5 m. It is characterized by an electronically steerable antenna to compensate for aircraft attitude. Thus, the computation of viewing angles (i.e. look, incidence and projection) must include aircraft attitude angles (i.e. yaw, pitch and roll) in addition to the antenna steering angle. In this presentation, we address two components of radiometric correction: area projection and vegetation reflectivity. The first correction is applied by normalization of the radar backscatter by the local ground area illuminated by the radar beam. The second is a correction due to changes in vegetation reflectivity with viewing geometry.
Chan, Sam C C; Chan, Chetwyn C H; Derbie, Abiot Y; Hui, Irene; Tan, Davynn G H; Pang, Marco Y C; Lau, Stephen C L; Fong, Kenneth N K
2017-01-01
Nonpharmacological interventions for individuals with mild cognitive impairment (MCI) need further investigation. This study tested the efficacy of an eight-week Chinese calligraphy writing training course in improving attentional control and working memory. Ninety-nine participants with MCI were randomized into eight weeks of calligraphy writing (n = 48) or control (tablet computer) training (n = 51). Outcomes of the interventions were attentional control, working memory, visual scanning, and processing speed, measured at baseline, post-training, and six-month follow-up. Calligraphy writing, when compared with the control, significantly improved working memory, as reflected in DST-Backward sequence (p = 0.009) and span scores (p = 0.002), and divided attention, as reflected in CTT2 (p < 0.001), at post-training. A unique improvement in working memory (span: p < 0.001; sequence: p = 0.008) in the intervention group was also found at follow-up compared with baseline. Changes in the other outcome measures were not statistically significant. The findings support the view that eight weeks of Chinese calligraphy writing training using a cognitive approach improves working memory and, to a lesser extent, attentional control functions in patients with early MCI. They also demonstrate the usefulness of mind-and-body practice for improving specific cognitive functions.
Pharmacist Computer Skills and Needs Assessment Survey
Jewesson, Peter J
2004-01-01
Background To use technology effectively for the advancement of patient care, pharmacists must possess a variety of computer skills. We recently introduced a novel applied informatics program in this Canadian hospital clinical service unit to enhance the informatics skills of our members. Objective This study was conducted to gain a better understanding of the baseline computer skills and needs of our hospital pharmacists immediately prior to the implementation of an applied informatics program. Methods In May 2001, an 84-question written survey was distributed by mail to 106 practicing hospital pharmacists in our multi-site, 1500-bed, acute-adult-tertiary care Canadian teaching hospital in Vancouver, British Columbia. Results Fifty-eight surveys (55% of total) were returned within the two-week study period. The survey responses reflected the opinions of licensed BSc and PharmD hospital pharmacists with a broad range of pharmacy practice experience. Most respondents had home access to personal computers, and regularly used computers in the work environment for drug distribution, information management, and communication purposes. Few respondents reported experience with handheld computers. Software use experience varied according to application. Although patient-care information software and e-mail were commonly used, experience with spreadsheet, statistical, and presentation software was negligible. The respondents were familiar with Internet search engines, and these were reported to be the most common method of seeking clinical information online. Although many respondents rated themselves as being generally computer literate and not particularly anxious about using computers, the majority believed they required more training to reach their desired level of computer literacy. Lack of familiarity with computer-related terms was prevalent. Self-reported basic computer skill was typically at a moderate level, and varied depending on the task. Specifically, respondents rated their ability to manipulate files, use software help features, and install software as low, but rated their ability to access and navigate the Internet as high. Respondents were generally aware of what online resources were available to them and Clinical Pharmacology was the most commonly employed reference. In terms of anticipated needs, most pharmacists believed they needed to upgrade their computer skills. Medical database and Internet searching skills were identified as those in greatest need of improvement. Conclusions Most pharmacists believed they needed to upgrade their computer skills. Medical database and Internet searching skills were identified as those in greatest need of improvement for the purposes of improving practice effectiveness. PMID:15111277
Characterization of Meta-Materials Using Computational Electromagnetic Methods
NASA Technical Reports Server (NTRS)
Deshpande, Manohar; Shin, Joon
2005-01-01
An efficient and powerful computational method is presented to synthesize a meta-material with specified electromagnetic properties. Using the periodicity of meta-materials, a Finite Element Method (FEM) is developed to estimate the reflection and transmission through the meta-material structure for normal plane-wave incidence. For efficient computation of the reflection and transmission through a meta-material over a wide frequency band, a Finite Difference Time Domain (FDTD) approach is also developed. Using the Nicholson-Ross method and genetic algorithms, a robust procedure to extract the electromagnetic properties of a meta-material from knowledge of its reflection and transmission coefficients is described. A few numerical examples are also presented to validate the present approach.
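A sketch of the Nicholson-Ross-Weir-style extraction step for the free-space, normal-incidence case is given below; the branch handling is minimal and the slab parameters are hypothetical, whereas the paper couples the extraction with a genetic-algorithm search.

```python
import numpy as np

C0 = 299_792_458.0

def nrw_extract(s11, s21, freq, d, branch=0):
    """Nicholson-Ross-Weir-style extraction of (eps_r, mu_r) from normal-incidence
    S-parameters of a slab of thickness d (free-space/TEM case, exp(+j w t)
    convention). A sketch of the standard algebra only."""
    k0 = 2.0 * np.pi * freq / C0
    K = (s11**2 - s21**2 + 1.0) / (2.0 * s11)
    gamma = K + np.sqrt(K**2 - 1.0)
    if abs(gamma) > 1.0:                          # pick the physical root, |Gamma| <= 1
        gamma = K - np.sqrt(K**2 - 1.0)
    T = (s11 + s21 - gamma) / (1.0 - (s11 + s21) * gamma)
    z = (1.0 + gamma) / (1.0 - gamma)             # normalized wave impedance
    n = 1j * (np.log(T) + 2j * np.pi * branch) / (k0 * d)   # refractive index, branch m
    return n / z, n * z                           # eps_r, mu_r

# Round-trip check with a hypothetical slab: eps_r = 4 - 0.1j, mu_r = 1, d = 5 mm, 10 GHz.
eps, mu, d, f = 4 - 0.1j, 1.0 + 0j, 0.005, 10e9
n, z, k0 = np.sqrt(eps * mu), np.sqrt(mu / eps), 2 * np.pi * f / C0
r, t = (z - 1) / (z + 1), np.exp(-1j * k0 * n * d)
s11 = r * (1 - t**2) / (1 - r**2 * t**2)
s21 = t * (1 - r**2) / (1 - r**2 * t**2)
print(nrw_extract(s11, s21, f, d))                # should recover ~ (4-0.1j, 1)
```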
The effects of core-reflected waves on finite fault inversions with teleseismic body wave data
NASA Astrophysics Data System (ADS)
Qian, Yunyi; Ni, Sidao; Wei, Shengji; Almeida, Rafael; Zhang, Han
2017-11-01
Teleseismic body waves are essential for imaging rupture processes of large earthquakes. Earthquake source parameters are usually characterized by waveform analyses such as finite fault inversions using only turning (direct) P and SH waves without considering the reflected phases from the core-mantle boundary (CMB). However, core-reflected waves such as ScS usually have amplitudes comparable to direct S waves due to the total reflection from the CMB and might interfere with the S waves used for inversion, especially at large epicentral distances for long duration earthquakes. In order to understand how core-reflected waves affect teleseismic body wave inversion results, we develop a procedure named Multitel3 to compute Green's functions that contain turning waves (direct P, pP, sP, direct S, sS and reverberations in the crust) and core-reflected waves (PcP, pPcP, sPcP, ScS, sScS and associated reflected phases from the CMB). This ray-based method can efficiently generate synthetic seismograms for turning and core-reflected waves independently, with the flexibility to take into account the 3-D Earth structure effect on the timing between these phases. The performance of this approach is assessed through a series of numerical inversion tests on synthetic waveforms of the 2008 Mw7.9 Wenchuan earthquake and the 2015 Mw7.8 Nepal earthquake. We also compare this improved method with the turning-wave only inversions and explore the stability of the new procedure when there are uncertainties in a priori information (such as fault geometry and epicentre location) or arrival time of core-reflected phases. Finally, a finite fault inversion of the 2005 Mw8.7 Nias-Simeulue earthquake is carried out using the improved Green's functions. Using enhanced Green's functions yields better inversion results as expected. While the finite source inversion with conventional P and SH waves is able to recover large-scale characteristics of the earthquake source, by adding PcP and ScS phases, the inverted slip model and moment rate function better match previous results incorporating field observations, geodetic and seismic data.
Optimizing Hardware Compatibility for Scaling Up Superconducting Qubits
NASA Astrophysics Data System (ADS)
Fang, Michael; Campbell, Brooks; Chen, Zijun; Chiaro, Ben; Dunsworth, Andrew; Kelly, Julian; Megrant, Anthony; Neill, Charles; O'Malley, Peter; Quintana, Chris; Vainsencher, Amit; Wenner, Jim; White, Ted; Barends, Rami; Chen, Yu; Fowler, Austin; Jeffrey, Evan; Mutus, Josh; Roushan, Pedram; Sank, Daniel; Martinis, John
2015-03-01
Since quantum computation relies on the manipulation of fragile quantum states, qubit devices must be isolated from the noisy environment to prevent decoherence. Custom-made components make isolation from thermal and infrared radiation possible, but have been unreliable, massive, and show sub-ideal microwave performance. Infrared isolation for large-scale experiments (> 8 qubits) was achieved with compact impedance-matched microwave filters which attenuate stray infrared signals on cryogenic cables with only -25 dB reflection up to 7.5 GHz. In addition, a thermal anchoring system was designed to effectively transfer unwanted heat from more than 100 coaxial cables in the dilution refrigerator and yielded a 33% improvement in base temperature and a 50% improvement in hold time.
On a Non-Reflecting Boundary Condition for Hyperbolic Conservation Laws
NASA Technical Reports Server (NTRS)
Loh, Ching Y.
2003-01-01
A non-reflecting boundary condition (NRBC) for practical computations in fluid dynamics and aeroacoustics is presented. The technique is based on the first principle of non-reflecting, plane wave propagation and the hyperbolicity of the Euler equation system. The NRBC is simple and effective, provided the numerical scheme maintains locally a C¹ continuous solution at the boundary. Several numerical examples in 1D, 2D, and 3D space are illustrated to demonstrate its robustness in practical computations.
ERIC Educational Resources Information Center
Solhaug, T.
2009-01-01
The context of this article is the new technological environment and the struggle to use meaningful teaching practices in Norwegian schools. Students' critical reflections in two different technological learning environments in six upper secondary schools are compared. Three of these schools offer Internet-connected computers in special computer…
Multi-ray medical ultrasound simulation without explicit speckle modelling.
Tuzer, Mert; Yazıcı, Abdulkadir; Türkay, Rüştü; Boyman, Michael; Acar, Burak
2018-05-04
To develop a medical ultrasound (US) simulation method using T1-weighted magnetic resonance images (MRI) as the input that offers a compromise between low-cost ray-based and high-cost realistic wave-based simulations. The proposed method uses a novel multi-ray image formation approach with a virtual phased-array transducer probe. A domain model is built from the input MR images. Multiple virtual acoustic rays emerge from each element of the linear transducer array. Reflected and transmitted acoustic energy at discrete points along each ray is computed independently. Simulated US images are computed by fusing the reflected energy along multiple rays from multiple transducers, while phase delays due to differences in distances to the transducers are taken into account. A preliminary implementation using GPUs is presented. Preliminary results show that the multi-ray approach is capable of generating viewpoint-dependent, realistic US images with an inherent Rician-distributed speckle pattern automatically. The proposed simulator can reproduce shadowing artefacts and demonstrates frequency dependence apt for practical training purposes. We also present preliminary results towards the utilization of the method for real-time simulations. The proposed method offers a low-cost, near-real-time, wave-like simulation of realistic US images from input MR data. It can further be improved to cover pathological findings using an improved domain model, without any algorithmic updates. Such a domain model would require lesion segmentation or manual embedding of virtual pathologies for training purposes.
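At each interface along a ray, the split between reflected and transmitted energy is governed by the acoustic impedance mismatch; a minimal sketch with textbook impedance values (assumed, not the paper's domain model) is shown below.

```python
def interface_reflection(z1, z2):
    """Pressure reflection and transmission coefficients at a planar interface
    between tissues with acoustic impedances z1 and z2 (normal incidence):
    R = (z2 - z1) / (z2 + z1), T = 1 + R. The reflected fraction of the
    incident intensity is R**2."""
    r = (z2 - z1) / (z2 + z1)
    return r, 1.0 + r

# Approximate impedances in MRayl (typical textbook values, used here only as
# an illustration of why fat/muscle boundaries reflect weakly and bone strongly).
Z = {"fat": 1.38, "muscle": 1.70, "bone": 7.8}
for a, b in [("fat", "muscle"), ("muscle", "bone")]:
    r, t = interface_reflection(Z[a], Z[b])
    print(f"{a}->{b}: R={r:+.3f}, reflected intensity={r*r:.1%}")
```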
NASA Astrophysics Data System (ADS)
Deglint, Jason; Chung, Audrey G.; Chwyl, Brendan; Amelard, Robert; Kazemzadeh, Farnoud; Wang, Xiao Yu; Clausi, David A.; Wong, Alexander
2016-03-01
Traditional photoplethysmographic imaging (PPGI) systems use the red, green, and blue (RGB) broadband measurements of a consumer digital camera to remotely estimate a patient's heart rate; however, these broadband RGB signals are often corrupted by ambient noise, making the extraction of the subtle fluctuations indicative of heart rate difficult. Therefore, the use of narrow-band spectral measurements can significantly improve the accuracy. We propose a novel digital spectral demultiplexing (DSD) method to infer narrow-band spectral information from acquired broadband RGB measurements in order to estimate heart rate via the computation of motion-compensated skin erythema fluctuation. Using high-resolution video recordings of human participants, multiple measurement locations are automatically identified on the cheeks of an individual, and motion-compensated broadband reflectance measurements are acquired at each measurement location over time via measurement-location tracking. The motion-compensated broadband reflectance measurements are spectrally demultiplexed using a non-linear inverse model based on the spectral sensitivity of the camera's detector. A PPG signal is then computed from the demultiplexed narrow-band spectral information via skin erythema fluctuation analysis, with improved signal-to-noise ratio allowing for reliable remote heart rate measurements. To assess the effectiveness of the proposed system, a set of experiments involving human motion in a front-facing position were performed under ambient lighting conditions. Experimental results indicate that the proposed system achieves robust and accurate heart rate measurements and can provide additional information about the participant beyond the capabilities of traditional PPGI methods.
Hemispherical reflectance model for passive images in an outdoor environment.
Kim, Charles C; Thai, Bea; Yamaoka, Neil; Aboutalib, Omar
2015-05-01
We present a hemispherical reflectance model for simulating passive images in an outdoor environment where illumination is provided by natural sources such as the sun and the clouds. While the bidirectional reflectance distribution function (BRDF) accurately produces radiance from any objects after the illumination, using the BRDF in calculating radiance requires double integration. Replacing the BRDF by hemispherical reflectance under the natural sources transforms the double integration into a multiplication. This reduces both storage space and computation time. We present the formalism for the radiance of the scene using hemispherical reflectance instead of BRDF. This enables us to generate passive images in an outdoor environment taking advantage of the computational and storage efficiencies. We show some examples for illustration.
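The computational saving claimed above, replacing a double integral over the sky hemisphere with a single multiplication, can be checked for the simplest case of a Lambertian surface under a uniform sky; the reflectance and radiance values below are illustrative only.

```python
import numpy as np

RHO = 0.3            # hemispherical (Lambertian) reflectance, illustrative value
L_SKY = 100.0        # uniform sky radiance, W m^-2 sr^-1

def reflected_radiance_brdf(n_theta=200, n_phi=400):
    """Brute-force the double integral L_r = integral of f_r * L_i * cos(theta) dw
    over the sky hemisphere for a Lambertian BRDF f_r = rho / pi and a uniform sky."""
    theta = np.linspace(0.0, np.pi / 2, n_theta)
    phi = np.linspace(0.0, 2.0 * np.pi, n_phi)
    dth, dph = theta[1] - theta[0], phi[1] - phi[0]
    th, _ = np.meshgrid(theta, phi, indexing="ij")
    integrand = (RHO / np.pi) * L_SKY * np.cos(th) * np.sin(th)
    return integrand.sum() * dth * dph

def reflected_radiance_hemispherical():
    """Same quantity via the hemispherical-reflectance shortcut: the sky
    irradiance E = pi * L_sky is simply multiplied by rho / pi."""
    E = np.pi * L_SKY
    return RHO * E / np.pi

print(reflected_radiance_brdf(), reflected_radiance_hemispherical())   # both ~ 30.0
```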
NASA Technical Reports Server (NTRS)
Kraft, R. E.
1996-01-01
A computational method to predict modal reflection coefficients in cylindrical ducts has been developed based on the work of Homicz, Lordi, and Rehm, which uses the Wiener-Hopf method to account for the boundary conditions at the termination of a thin cylindrical pipe. The purpose of this study is to develop a computational routine to predict the reflection coefficients of higher order acoustic modes impinging on the unflanged termination of a cylindrical duct. This effort was conducted under Task Order 5 of the NASA Lewis LET Program, Active Noise Control of Aircraft Engines: Feasibility Study, and will be used as part of the development of an integrated source noise, acoustic propagation, ANC actuator coupling, and control system algorithm simulation. The reflection coefficient prediction will be incorporated into an existing cylindrical duct modal analysis to account for the reflection of modes from the duct termination. This will provide a more accurate, rapid-computation design tool for evaluating the effect of reflected waves on active noise control systems mounted in the duct, as well as a tool for the design of acoustic treatment in inlet ducts. As an active noise control system design tool, the method can be used as a preliminary to more accurate but more numerically intensive acoustic propagation models such as finite element methods. The resulting computer program has been shown to give reasonable results, some examples of which are presented. Reliable comparison data are scarce, so complete checkout is difficult, and further checkout is needed over a wider range of system parameters. In future efforts the method will be adapted as a subroutine to the GEAE segmented cylindrical duct modal analysis program.
Limited-memory BFGS based least-squares pre-stack Kirchhoff depth migration
NASA Astrophysics Data System (ADS)
Wu, Shaojiang; Wang, Yibo; Zheng, Yikang; Chang, Xu
2015-08-01
Least-squares migration (LSM) is a linearized inversion technique for subsurface reflectivity estimation. Compared to conventional migration algorithms, it can improve spatial resolution significantly within a few iterations. There are three key steps in LSM: (1) calculate data residuals between observed data and data demigrated from the inverted reflectivity model; (2) migrate the data residuals to form the reflectivity gradient; and (3) update the reflectivity model using optimization methods. In order to obtain an accurate and high-resolution inversion result, a good estimate of the inverse Hessian matrix plays a crucial role. However, due to the large size of the Hessian matrix, computing its inverse is always a demanding task. The limited-memory BFGS (L-BFGS) method approximates the inverse Hessian indirectly using a limited amount of computer memory, maintaining only a history of the past m gradients (often m < 10). We combine the L-BFGS method with least-squares pre-stack Kirchhoff depth migration. We then validate the approach on the 2-D Marmousi synthetic data set and a 2-D marine data set. The results show that the method effectively recovers the reflectivity model and converges faster than two comparison gradient methods, which makes it attractive for imaging complex subsurface structures.
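A minimal sketch of the three steps using an off-the-shelf L-BFGS solver is given below. Here demigrate and migrate stand in for the (unspecified) Kirchhoff forward and adjoint operators, and the option names follow SciPy rather than the authors' code.

```python
import numpy as np
from scipy.optimize import minimize

def lsm_lbfgs(demigrate, migrate, d_obs, m0, history=8, iters=20):
    """Least-squares migration driven by L-BFGS.

    demigrate(m) : Born/Kirchhoff forward modelling of reflectivity m
    migrate(r)   : adjoint operator, i.e. migration of data residuals r
    """
    def misfit(m):
        r = demigrate(m) - d_obs        # step 1: data residuals
        grad = migrate(r)               # step 2: migrate residuals -> gradient
        return 0.5 * np.dot(r, r), grad

    res = minimize(misfit, m0, jac=True, method="L-BFGS-B",
                   options={"maxcor": history, "maxiter": iters})  # step 3: model update
    return res.x
```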
A comparison between physicians and computer algorithms for form CMS-2728 data reporting.
Malas, Mohammed Said; Wish, Jay; Moorthi, Ranjani; Grannis, Shaun; Dexter, Paul; Duke, Jon; Moe, Sharon
2017-01-01
The CMS-2728 form (Medical Evidence Report) assesses 23 comorbidities chosen to reflect poor outcomes and increased mortality risk. Previous studies have questioned the validity of physician reporting on form CMS-2728. We hypothesize that reporting of comorbidities by computer algorithms identifies more comorbidities than physician completion and, therefore, is more reflective of the underlying disease burden. We collected data from CMS-2728 forms for all 296 patients who had an incident ESRD diagnosis and received chronic dialysis from 2005 through 2014 at Indiana University outpatient dialysis centers. We analyzed patients' data from electronic medical records systems that collated information from multiple health care sources. Previously utilized algorithms or natural language processing were used to extract data on 10 comorbidities for a period of up to 10 years prior to ESRD incidence. These algorithms incorporate billing codes, prescriptions, and other relevant elements. We compared the presence or unchecked status of these comorbidities on the forms to their presence or absence according to the algorithms. Computer algorithms reported more comorbidities than form completion by physicians. This remained true when decreasing the data span to one year and using only a single health center source. The algorithms' determinations were well accepted by a physician panel. Importantly, use of the algorithms significantly increased the expected deaths and lowered the standardized mortality ratios. Using computer algorithms showed superior identification of comorbidities for form CMS-2728 and altered standardized mortality ratios. Adapting similar algorithms in available EMR systems may offer a more thorough evaluation of comorbidities and improve quality reporting. © 2016 International Society for Hemodialysis.
Colour computer-generated holography for point clouds utilizing the Phong illumination model.
Symeonidou, Athanasia; Blinder, David; Schelkens, Peter
2018-04-16
A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm provides realistic looking objects without any noteworthy increase to the computational cost.
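For reference, the classic Phong model that the method integrates can be written compactly as below; the coefficient values are placeholders, not those used in the paper.

```python
import numpy as np

def phong(normal, light_dir, view_dir, ka=0.1, kd=0.7, ks=0.2, shininess=16.0):
    """Classic Phong shading for a single point light of unit intensity."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    diff = max(np.dot(n, l), 0.0)
    r = 2.0 * diff * n - l                                   # mirror reflection of the light direction
    spec = max(np.dot(r, v), 0.0) ** shininess if diff > 0 else 0.0
    return ka + kd * diff + ks * spec                        # ambient + diffuse + specular
```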
Algorithm for Atmospheric Corrections of Aircraft and Satellite Imagery
NASA Technical Reports Server (NTRS)
Fraser, Robert S.; Kaufman, Yoram J.; Ferrare, Richard A.; Mattoo, Shana
1989-01-01
A simple and fast atmospheric correction algorithm is described which is used to correct radiances of scattered sunlight measured by aircraft and/or satellite above a uniform surface. The atmospheric effect, the basic equations, a description of the computational procedure, and a sensitivity study are discussed. The program is designed to take the measured radiances, view and illumination directions, and the aerosol and gaseous absorption optical thickness to compute the radiance just above the surface, the irradiance on the surface, and surface reflectance. Alternatively, the program will compute the upward radiance at a specific altitude for a given surface reflectance, view and illumination directions, and aerosol and gaseous absorption optical thickness. The algorithm can be applied for any view and illumination directions and any wavelength in the range 0.48 micron to 2.2 micron. The relation between the measured radiance and surface reflectance, which is expressed as a function of atmospheric properties and measurement geometry, is computed using a radiative transfer routine. The results of the computations are presented in a table which forms the basis of the correction algorithm. The algorithm can be used for atmospheric corrections in the presence of a rural aerosol. The sensitivity of the derived surface reflectance to uncertainties in the model and input data is discussed.
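Since the correction is driven by a precomputed table, the inversion step can be sketched as a one-dimensional table lookup; the array and function names below are placeholders for the tabulated radiative-transfer results described above.

```python
import numpy as np

def surface_reflectance(L_measured, lut_rho, lut_radiance):
    """Invert the table relating surface reflectance to measured radiance.

    lut_rho, lut_radiance : 1-D arrays tabulated by a radiative-transfer run for the
    given view/illumination geometry and aerosol/gaseous optical thickness.
    """
    # Radiance increases monotonically with surface reflectance, so interpolating
    # the inverted table recovers the reflectance just above the surface.
    return np.interp(L_measured, lut_radiance, lut_rho)
```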
Algorithm for atmospheric corrections of aircraft and satellite imagery
NASA Technical Reports Server (NTRS)
Fraser, R. S.; Ferrare, R. A.; Kaufman, Y. J.; Markham, B. L.; Mattoo, S.
1992-01-01
A simple and fast atmospheric correction algorithm is described which is used to correct radiances of scattered sunlight measured by aircraft and/or satellite above a uniform surface. The atmospheric effect, the basic equations, a description of the computational procedure, and a sensitivity study are discussed. The program is designed to take the measured radiances, view and illumination directions, and the aerosol and gaseous absorption optical thickness to compute the radiance just above the surface, the irradiance on the surface, and surface reflectance. Alternatively, the program will compute the upward radiance at a specific altitude for a given surface reflectance, view and illumination directions, and aerosol and gaseous absorption optical thickness. The algorithm can be applied for any view and illumination directions and any wavelength in the range 0.48 micron to 2.2 microns. The relation between the measured radiance and surface reflectance, which is expressed as a function of atmospheric properties and measurement geometry, is computed using a radiative transfer routine. The results of the computations are presented in a table which forms the basis of the correction algorithm. The algorithm can be used for atmospheric corrections in the presence of a rural aerosol. The sensitivity of the derived surface reflectance to uncertainties in the model and input data is discussed.
Application of a neural network for reflectance spectrum classification
NASA Astrophysics Data System (ADS)
Yang, Gefei; Gartley, Michael
2017-05-01
Traditional reflectance spectrum classification algorithms are based on comparing spectra across the electromagnetic spectrum, anywhere from the ultraviolet to the thermal infrared regions. These methods analyze reflectance on a pixel-by-pixel basis. Inspired by the high performance that Convolutional Neural Networks (CNNs) have demonstrated in image classification, we applied a neural network to analyze directional reflectance pattern images. Using bidirectional reflectance distribution function (BRDF) data, we reformulate the 4-dimensional BRDF as 2-dimensional images with channels, namely incident direction × reflected direction × channels. Meanwhile, RIT's micro-DIRSIG model is utilized to simulate additional training samples to improve the robustness of the neural network training. Unlike traditional classification using hand-designed feature extraction with a trainable classifier, neural networks stack several layers to learn a feature hierarchy from pixels to classifier, and all layers are trained jointly. Hence, our approach, which exploits angular features, differs from traditional methods that exploit spatial features. Although training typically has a large computational cost, simple classifiers work well when subsequently using the neural-network-generated features. Currently, most popular neural networks such as VGG, GoogLeNet and AlexNet are trained on RGB spatial image data. Our approach aims to build a neural network based on the directional reflectance spectrum to offer a complementary perspective. At the end of this paper, we compare several classifiers and analyze the trade-offs among the neural network parameters.
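A minimal sketch of such a network operating on directional-reflectance "images" is shown below (PyTorch); the input size, channel counts and class count are assumptions, not the architecture used in the paper.

```python
import torch
import torch.nn as nn

class BRDFPatternNet(nn.Module):
    """Small CNN for 2-D directional-reflectance images
    (incident direction x reflected direction, with channels); sizes are assumptions."""
    def __init__(self, in_channels=3, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)   # for 64x64 inputs

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))
```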
PHYSICS OF ECLIPSING BINARIES. II. TOWARD THE INCREASED MODEL FIDELITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prša, A.; Conroy, K. E.; Horvat, M.
The precision of photometric and spectroscopic observations has been systematically improved in the last decade, mostly thanks to space-borne photometric missions and ground-based spectrographs dedicated to finding exoplanets. The field of eclipsing binary stars strongly benefited from this development. Eclipsing binaries serve as critical tools for determining fundamental stellar properties (masses, radii, temperatures, and luminosities), yet the models are not capable of reproducing observed data well, either because of the missing physics or because of insufficient precision. This led to a predicament where radiative and dynamical effects, hitherto buried in noise, started showing up routinely in the data but were not accounted for in the models. PHOEBE (PHysics Of Eclipsing BinariEs; http://phoebe-project.org) is an open source modeling code for computing theoretical light and radial velocity curves that addresses both problems by incorporating missing physics and by increasing the computational fidelity. In particular, we discuss triangulation as a superior surface discretization algorithm, meshing of rotating single stars, light travel time effects, advanced phase computation, volume conservation in eccentric orbits, and improved computation of local intensity across the stellar surfaces that includes the photon-weighted mode, the enhanced limb darkening treatment, an improved reflection treatment, and Doppler boosting. Here we present the concepts on which PHOEBE is built and proofs of concept that demonstrate the increased model fidelity.
Wireless, relative-motion computer input device
Holzrichter, John F.; Rosenbury, Erwin T.
2004-05-18
The present invention provides a system for controlling a computer display in a workspace using an input unit/output unit. A train of EM waves are sent out to flood the workspace. EM waves are reflected from the input unit/output unit. A relative distance moved information signal is created using the EM waves that are reflected from the input unit/output unit. Algorithms are used to convert the relative distance moved information signal to a display signal. The computer display is controlled in response to the display signal.
NASA Astrophysics Data System (ADS)
Xiong, Chuan; Shi, Jiancheng
2014-01-01
To date, light scattering models of snow have taken little account of real snow microstructure. The idealized spherical or other single-shape particle assumptions in previous snow light scattering models can cause errors in light scattering modeling of snow and, in turn, in remote sensing inversion algorithms. This paper builds a polarized snow reflectance model based on a bicontinuous medium, in which real snow microstructure is taken into account. The specific surface area of the bicontinuous medium can be derived analytically. The polarized Monte Carlo ray tracing technique is applied to the computer-generated bicontinuous medium. With proper algorithms, the snow surface albedo, bidirectional reflectance distribution function (BRDF) and polarized BRDF can be simulated. Validation of the model-predicted spectral albedo and bidirectional reflectance factor (BRF) against experimental data shows good agreement. The relationship between snow surface albedo and snow specific surface area (SSA) was predicted, and this relationship can be used to improve future SSA inversion algorithms. The model-predicted polarized reflectance is also validated and proves accurate, so it can be further applied in polarized remote sensing.
NASA Astrophysics Data System (ADS)
Maas, Christian; Schmalzl, Jörg
2013-08-01
Ground Penetrating Radar (GPR) is used for the localization of supply lines, land mines, pipes and many other buried objects. These objects can be recognized in the recorded data as reflection hyperbolas with a typical shape depending on the depth and material of the object and the surrounding material. To obtain these parameters, the shape of the hyperbola has to be fitted. In recent years, several methods have been developed to automate this task during post-processing. In this paper we present another approach for the automated localization of reflection hyperbolas in GPR data by solving a pattern recognition problem in grayscale images. In contrast to other methods, our detection program is also able to immediately mark potential objects in real-time. For this task we use a version of the Viola-Jones learning algorithm, which is part of the open source library "OpenCV". This algorithm was initially developed for face recognition, but can be adapted to any other simple shape. In our program it is used to narrow down the location of reflection hyperbolas to certain areas in the GPR data. In order to extract the exact location and the velocity of the hyperbolas we apply a simple Hough Transform for hyperbolas. Because the Viola-Jones algorithm dramatically reduces the input to the computationally expensive Hough Transform, the detection system can also be implemented on normal field computers, so on-site application is possible. The developed detection system shows promising results and detection rates in unprocessed radargrams. To improve the detection results and apply the program to noisy radar images, more data from different GPR systems are needed as input for the learning algorithm.
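Assuming a cascade trained on labelled hyperbola patches, the candidate-region step can be sketched with OpenCV as follows; the cascade file name is hypothetical, and the subsequent Hough step is omitted.

```python
import cv2

# The cascade file is an assumption: it would be produced by training OpenCV's
# cascade tooling on labelled hyperbola patches cut from radargrams.
cascade = cv2.CascadeClassifier("hyperbola_cascade.xml")

def detect_hyperbola_regions(radargram_gray):
    """Return candidate bounding boxes to pass on to the (expensive) Hough step."""
    return cascade.detectMultiScale(radargram_gray, scaleFactor=1.1, minNeighbors=3)
```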
A technology training protocol for meeting QSEN goals: Focusing on meaningful learning.
Luo, Shuhong; Kalman, Melanie
2018-01-01
The purpose of this paper is to describe and discuss how we designed and developed a 12-step technology training protocol. The protocol is meant to improve meaningful learning in technology education so that nursing students are able to meet the informatics requirements of Quality and Safety Education in Nursing competencies. When designing and developing the training protocol, we used a simplified experiential learning model that addressed the core features of meaningful learning: to connect new knowledge with students' prior knowledge and real-world workflow. Before training, we identified students' prior knowledge and workflow tasks. During training, students learned by doing, reflected on their prior computer skills and workflow, designed individualized procedures for integration into their workflow, and practiced the self-designed procedures in real-world settings. The trainer was a facilitator who provided a meaningful learning environment, asked the right questions to guide reflective conversation, and offered scaffolding at critical moments. This training protocol could significantly improve nurses' competencies in using technologies and increase their desire to adopt new technologies. © 2017 Wiley Periodicals, Inc.
Numerical modelling of GPR electromagnetic fields for locating burial sites
NASA Astrophysics Data System (ADS)
Carcione, José M.; Karczewski, Jerzy; Mazurkiewicz, Ewelina; Tadeusiewicz, Ryszard; Tomecka-Suchoń, Sylwia
2017-11-01
Ground-penetrating radar (GPR) is commonly used for locating burial sites. In this article, we acquired radargrams at a site where a domestic pig cadaver was buried. The measurements were conducted with the ProEx System GPR manufactured by the Swedish company Mala Geoscience, using a 500 MHz antenna. The event corresponding to the pig can be clearly seen in the measurements. In order to improve the interpretation, the electromagnetic field is compared to numerical simulations computed with the pseudo-spectral Fourier method. A geological model has been defined on the basis of assumed electromagnetic properties (permittivity, conductivity and magnetic permeability). The results, when compared with the GPR measurements, show a dissimilar amplitude behaviour, with a stronger reflection event from the bottom of the pit. We have therefore performed another simulation in which the electrical conductivity of the body is decreased to a value close to that of air. The comparison improved, showing more reflections, which could be an indication that the body contains air or has degraded to the extent that its electrical resistivity has greatly increased.
ERIC Educational Resources Information Center
Lin, Feng; Chan, Carol K. K.
2018-01-01
This study examined the role of computer-supported knowledge-building discourse and epistemic reflection in promoting elementary-school students' scientific epistemology and science learning. The participants were 39 Grade 5 students who were collectively pursuing ideas and inquiry for knowledge advance using Knowledge Forum (KF) while studying a…
ERIC Educational Resources Information Center
Nguyen, Long V.
2011-01-01
The paper examines Vietnamese learners' reflections on and perceptions of the application of computer-mediated communication (CMC) into collaborative learning. Data for analysis included an evaluation questionnaire, consisting of 24 4-point Likert scale items, appended with six open-ended questions, and transcripts of 15, out of 30, teacher…
A Charge Coupled Device Imaging System For Ophthalmology
NASA Astrophysics Data System (ADS)
Rowe, R. Wanda; Packer, Samuel; Rosen, James; Bizais, Yves
1984-06-01
A digital camera system has been constructed for obtaining reflectance images of the fundus of the eye with monochromatic light. Images at wavelengths in the visible and near infrared regions of the spectrum are recorded by a charge-coupled device array and transferred to a computer. A variety of image processing operations are performed to restore the pictures, correct for distortions in the image formation process, and extract new and diagnostically useful information. The steps involved in calibrating the system to permit quantitative measurement of fundus reflectance are discussed. Three clinically important applications of such a quantitative system are addressed: the characterization of changes in the optic nerve arising from glaucoma, the diagnosis of choroidal melanoma through spectral signatures, and the early detection and improved management of diabetic retinopathy by measurement of retinal tissue oxygen saturation.
Multi-spectral temperature measurement method for gas turbine blade
NASA Astrophysics Data System (ADS)
Gao, Shan; Feng, Chi; Wang, Lixin; Li, Dong
2016-02-01
One of the basic methods to improve both the thermal efficiency and power output of a gas turbine is to increase the firing temperature. However, gas turbine blades are easily damaged in harsh high-temperature and high-pressure environments. Therefore, ensuring that the blade temperature remains within the design limits is very important. There are unsolved problems in blade temperature measurement, relating to the emissivity of the blade surface, influences of the combustion gases, and reflections of radiant energy from the surroundings. In this study, the emissivity of blade surfaces has been measured, with errors reduced by a fitting method, influences of the combustion gases have been calculated for different operational conditions, and a reflection model has been built. An iterative computing method is proposed for calculating blade temperatures, and the experimental results show that this method has high precision.
Cone-beam x-ray luminescence computed tomography based on x-ray absorption dosage
NASA Astrophysics Data System (ADS)
Liu, Tianshuai; Rong, Junyan; Gao, Peng; Zhang, Wenli; Liu, Wenlei; Zhang, Yuanke; Lu, Hongbing
2018-02-01
With the advances of x-ray excitable nanophosphors, x-ray luminescence computed tomography (XLCT) has become a promising hybrid imaging modality. In particular, a cone-beam XLCT (CB-XLCT) system has demonstrated its potential in in vivo imaging with the advantage of fast imaging speed over other XLCT systems. Currently, the imaging models of most XLCT systems assume that nanophosphors emit light based on the intensity distribution of x-ray within the object, not completely reflecting the nature of the x-ray excitation process. To improve the imaging quality of CB-XLCT, an imaging model that adopts an excitation model of nanophosphors based on x-ray absorption dosage is proposed in this study. To solve the ill-posed inverse problem, a reconstruction algorithm that combines the adaptive Tikhonov regularization method with the imaging model is implemented for CB-XLCT reconstruction. Numerical simulations and phantom experiments indicate that compared with the traditional forward model based on x-ray intensity, the proposed dose-based model could improve the image quality of CB-XLCT significantly in terms of target shape, localization accuracy, and image contrast. In addition, the proposed model behaves better in distinguishing closer targets, demonstrating its advantage in improving spatial resolution.
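The regularized inversion at the core of the reconstruction can be sketched as a standard Tikhonov solve; an adaptive variant would re-select the regularization parameter between iterations. The matrix names below are placeholders for the dose-based system matrix and the measured data.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min ||A x - b||^2 + lam * ||x||^2 via the normal equations.

    A : system matrix of the dose-based forward model (assumed given)
    b : measured surface photon data
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```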
Bayesian aerosol retrieval algorithm for MODIS AOD retrieval over land
NASA Astrophysics Data System (ADS)
Lipponen, Antti; Mielonen, Tero; Pitkänen, Mikko R. A.; Levy, Robert C.; Sawyer, Virginia R.; Romakkaniemi, Sami; Kolehmainen, Ville; Arola, Antti
2018-03-01
We have developed a Bayesian aerosol retrieval (BAR) algorithm for the retrieval of aerosol optical depth (AOD) over land from the Moderate Resolution Imaging Spectroradiometer (MODIS). In the BAR algorithm, we simultaneously retrieve all dark land pixels in a granule, utilize spatial correlation models for the unknown aerosol parameters, use a statistical prior model for the surface reflectance, and take into account the uncertainties due to fixed aerosol models. The retrieved parameters are total AOD at 0.55 µm, fine-mode fraction (FMF), and surface reflectances at four different wavelengths (0.47, 0.55, 0.64, and 2.1 µm). The accuracy of the new algorithm is evaluated by comparing the AOD retrievals to Aerosol Robotic Network (AERONET) AOD. The results show that the BAR significantly improves the accuracy of AOD retrievals over the operational Dark Target (DT) algorithm. A reduction of about 29 % in the AOD root mean square error and decrease of about 80 % in the median bias of AOD were found globally when the BAR was used instead of the DT algorithm. Furthermore, the fraction of AOD retrievals inside the ±(0.05+15 %) expected error envelope increased from 55 to 76 %. In addition to retrieving the values of AOD, FMF, and surface reflectance, the BAR also gives pixel-level posterior uncertainty estimates for the retrieved parameters. The BAR algorithm always results in physical, non-negative AOD values, and the average computation time for a single granule was less than a minute on a modern personal computer.
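In spirit, the retrieval is a Gaussian MAP estimate with a spatially correlated prior. The sketch below, written for a linearized forward model and with all symbols standing in for the operational quantities, shows how such an estimate and its posterior covariance (the pixel-level uncertainties) could be computed.

```python
import numpy as np

def map_retrieval(K, y, x_prior, C_prior, C_noise):
    """Gaussian MAP estimate for a linearized forward model y = K x + noise.

    C_prior encodes the spatial correlation of the unknown aerosol parameters;
    C_noise includes observation noise plus forward-model (fixed aerosol model) uncertainty.
    """
    Ci_prior = np.linalg.inv(C_prior)
    Ci_noise = np.linalg.inv(C_noise)
    A = K.T @ Ci_noise @ K + Ci_prior
    b = K.T @ Ci_noise @ y + Ci_prior @ x_prior
    x_hat = np.linalg.solve(A, b)          # MAP estimate of AOD/FMF/reflectance vector
    posterior_cov = np.linalg.inv(A)       # pixel-level posterior uncertainty
    return x_hat, posterior_cov
```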
Future experimental needs to support applied aerodynamics - A transonic perspective
NASA Technical Reports Server (NTRS)
Gloss, Blair B.
1992-01-01
Advancements in facilities, test techniques, and instrumentation are needed to provide data required for the development of advanced aircraft and to verify computational methods. An industry survey of major users of wind tunnel facilities at Langley Research Center (LaRC) was recently carried out to determine future facility requirements, test techniques, and instrumentation requirements; results from this survey are reflected in this paper. In addition, areas related to transonic testing at LaRC which are either currently being developed or are recognized as needing improvements are discussed.
Dynamic, diagnostic, and pharmacological radionuclide studies of the esophagus in achalasia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozen, P.; Gelfond, M.; Zaltzman, S.
1982-08-01
The esophagus was evaluated in 15 patients with achalasia by continuous gamma camera imaging following ingestion of a semi-solid meal labeled with 99mTc. The images were displayed and recorded on a simple computerized data processing/display system. Subsequent cine mode images of esophageal emptying demonstrated abnormalities of the body of the esophagus not reflected by the manometric examination. Computer-generated time-activity curves representing specific regions of interest were better than manometry in evaluating the results of myotomy, dilatation, and drug therapy. Isosorbide dinitrate significantly improved esophageal emptying.
Control of the transition between regular and Mach reflection of shock waves
NASA Astrophysics Data System (ADS)
Alekseev, A. K.
2012-06-01
A control problem was considered that makes it possible to switch the flow between stationary Mach and regular reflection of shock waves within the dual solution domain. The sensitivity of the flow was computed by solving adjoint equations. A control disturbance was sought by applying gradient optimization methods. According to the computational results, the transition from regular to Mach reflection can be executed by raising the temperature. The transition from Mach to regular reflection can be achieved by lowering the temperature at moderate Mach numbers and is impossible at large numbers. The reliability of the numerical results was confirmed by verifying them with the help of a posteriori analysis.
NASA Technical Reports Server (NTRS)
Ledbetter, Kenneth W.
1992-01-01
Four trends in spacecraft flight operations are discussed which will reduce overall program costs. These trends are the use of high-speed, highly reliable data communications systems for distributing operations functions to more convenient and cost-effective sites; the improved capability for remote operation of sensors; a continued rapid increase in memory and processing speed of flight qualified computer chips; and increasingly capable ground-based hardware and software systems, notably those augmented by artificial intelligence functions. Changes reflected by these trends are reviewed starting from the NASA Viking missions of the early 70s, when mission control was conducted at one location using expensive and cumbersome mainframe computers and communications equipment. In the 1980s, powerful desktop computers and modems enabled the Magellan project team to operate the spacecraft remotely. In the 1990s, the Hubble Space Telescope project uses multiple color screens and automated sequencing software on small computers. Given a projection of current capabilities, future control centers will be even more cost-effective.
NASA Technical Reports Server (NTRS)
Goel, Narendra S.; Rozehnal, Ivan; Thompson, Richard L.
1991-01-01
A computer-graphics-based model, named DIANA, is presented for generation of objects of arbitrary shape and for calculating bidirectional reflectances and scattering from them, in the visible and infrared region. The computer generation is based on a modified Lindenmayer system approach which makes it possible to generate objects of arbitrary shapes and to simulate their growth, dynamics, and movement. Rendering techniques are used to display an object on a computer screen with appropriate shading and shadowing and to calculate the scattering and reflectance from the object. The technique is illustrated with scattering from canopies of simulated corn plants.
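For readers unfamiliar with Lindenmayer systems, the core string-rewriting idea can be sketched in a few lines; the rule set below is a toy branching example, not the modified rules used by DIANA.

```python
def lsystem(axiom, rules, iterations):
    """Minimal Lindenmayer-system rewriter: repeatedly replace each symbol by its rule."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Toy branching rule set ('[' and ']' push/pop turtle state when rendering).
print(lsystem("F", {"F": "F[+F]F[-F]F"}, 2))
```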
ERIC Educational Resources Information Center
Fridge, Evorell; Bagui, Sikha
2016-01-01
The goal of this research was to investigate the effects of automated testing software on levels of student reflection and student performance. This was a self-selecting, between subjects design that examined the performance of students in introductory computer programming classes. Participants were given the option of using the Web-CAT…
Computational Planning in Facial Surgery.
Zachow, Stefan
2015-10-01
This article reflects the research of the last two decades in computational planning for cranio-maxillofacial surgery. Model-guided and computer-assisted surgery planning has tremendously developed due to ever increasing computational capabilities. Simulators for education, planning, and training of surgery are often compared with flight simulators, where maneuvers are also trained to reduce a possible risk of failure. Meanwhile, digital patient models can be derived from medical image data with astonishing accuracy and thus can serve for model surgery to derive a surgical template model that represents the envisaged result. Computerized surgical planning approaches, however, are often still explorative, meaning that a surgeon tries to find a therapeutic concept based on his or her expertise using computational tools that are mimicking real procedures. Future perspectives of an improved computerized planning may be that surgical objectives will be generated algorithmically by employing mathematical modeling, simulation, and optimization techniques. Planning systems thus act as intelligent decision support systems. However, surgeons can still use the existing tools to vary the proposed approach, but they mainly focus on how to transfer objectives into reality. Such a development may result in a paradigm shift for future surgery planning.
Reflections from the Computer Equity Training Project.
ERIC Educational Resources Information Center
Sanders, Jo Shuchat
This paper addresses girls' patterns of computer avoidance at the middle school and other grade levels. It reviews the evidence for a gender gap in computer use in several areas: in school, at home, in computer camps, in computer magazines, and in computer-related jobs. It compares the computer equity issue to math avoidance, and cites the middle…
Numerical Predictions of Mode Reflections in an Open Circular Duct: Comparison with Theory
NASA Technical Reports Server (NTRS)
Dahl, Milo D.; Hixon, Ray
2015-01-01
The NASA Broadband Aeroacoustic Stator Simulation code was used to compute the acoustic field for higher-order modes in a circular duct geometry. To test the accuracy of the results computed by the code, the duct was terminated by an open end with an infinite flange or no flange. Both open end conditions have a theoretical solution that was used to compare with the computed results. Excellent comparison for reflection matrix values was achieved after suitable refinement of the grid at the open end. The study also revealed issues with the level of the mode amplitude introduced into the acoustic field from the source boundary and the amount of reflection that occurred at the source boundary when a general nonreflecting boundary condition was applied.
Franz, Berkeley; Murphy, John W
2015-01-01
Electronic medical records are regarded as an important tool in primary health-care settings. Because these records are thought to standardize medical information, facilitate provider communication, and improve office efficiency, many practices are transitioning to these systems. However, much of the concern with improving the practice of record keeping has related to technological innovations and human-computer interaction. Drawing on the philosophical reflection raised in Jacques Ellul's work, this article questions the technological imperative that may be supporting medical record keeping. Furthermore, given the growing emphasis on community-based care, this article discusses important non-technological aspects of electronic medical records that might bring the use of these records in line with participatory primary-care medicine.
Smolarski, D C; Whitehead, T
2000-04-01
In this paper, we describe our recent approaches to introducing students in a beginning computer science class to the study of ethical issues related to computer science and technology. This consists of three components: lectures on ethics and technology, in-class discussion of ethical scenarios, and a reflective paper on a topic related to ethics or the impact of technology on society. We give both student reactions to these aspects, and instructor perspective on the difficulties and benefits in exposing students to these ideas.
NASA Astrophysics Data System (ADS)
Karagiannis, Georgios Th.
2016-04-01
The development of non-destructive techniques is a reality in the field of conservation science. These techniques are usually not as accurate as analytical micro-sampling techniques; however, properly developed soft-computing techniques can improve their accuracy. In this work, we propose a real-time, fast-acquisition spectroscopic mapping imaging system that operates from the ultraviolet to the mid-infrared (UV/Vis/nIR/mIR) region of the electromagnetic spectrum and is supported by a set of soft-computing methods to identify the materials present in a stratigraphic structure of paint layers. In particular, the system acquires spectra in diffuse-reflectance mode, scanning a Region-Of-Interest (ROI) over a wavelength range from 200 to 5000 nm. A fuzzy c-means clustering algorithm, the particular soft-computing method employed, produces the mapping images. The method was evaluated on a Byzantine painted icon.
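A minimal fuzzy c-means loop over per-pixel spectra might look like the sketch below (plain NumPy); the fuzzifier m, iteration count and random initialization are assumptions, not the settings used in the work.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: X is (n_pixels, n_wavelengths) reflectance spectra."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                                    # fuzzy memberships, columns sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W @ X) / W.sum(axis=1, keepdims=True)  # weighted cluster centroids
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        # Standard membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        U = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0)), axis=1)
    return centers, U
```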
Small subchondral drill holes improve marrow stimulation of articular cartilage defects.
Eldracher, Mona; Orth, Patrick; Cucchiarini, Magali; Pape, Dietrich; Madry, Henning
2014-11-01
Subchondral drilling is an established marrow stimulation technique. Osteochondral repair is improved when the subchondral bone is perforated with small drill holes, reflecting the physiological subchondral trabecular distance. Controlled laboratory study. A rectangular full-thickness chondral defect was created in the trochlea of adult sheep (n = 13) and treated with 6 subchondral drillings of either 1.0 mm (reflective of the trabecular distance) or 1.8 mm in diameter. Osteochondral repair was assessed after 6 months in vivo by macroscopic, histological, and immunohistochemical analyses and by micro-computed tomography. The application of 1.0-mm subchondral drill holes led to significantly improved histological matrix staining, cellular morphological characteristics, subchondral bone reconstitution, and average total histological score as well as significantly higher immunoreactivity to type II collagen and reduced immunoreactivity to type I collagen in the repair tissue compared with 1.8-mm drill holes. Analysis of osteoarthritic changes in the cartilage adjacent to the defects revealed no significant differences between treatment groups. Restoration of the microstructure of the subchondral bone plate below the chondral defects was significantly improved after 1.0-mm compared to 1.8-mm drilling, as shown by higher bone volume and reduced thickening of the subchondral bone plate. Likewise, the microarchitecture of the drilled subarticular spongiosa was better restored after 1.0-mm drilling, indicated by significantly higher bone volume and more and thinner trabeculae. Moreover, the bone mineral density of the subchondral bone in 1.0-mm drill holes was similar to the adjacent subchondral bone, whereas it was significantly reduced in 1.8-mm drill holes. No significant correlations existed between cartilage and subchondral bone repair. Small subchondral drill holes that reflect the physiological trabecular distance improve osteochondral repair in a translational model more effectively than larger drill holes. These results have important implications for the use of subchondral drilling for marrow stimulation, as they support the use of small-diameter bone-cutting devices. © 2014 The Author(s).
A Big Data and Learning Analytics Approach to Process-Level Feedback in Cognitive Simulations.
Pecaric, Martin; Boutis, Kathy; Beckstead, Jason; Pusic, Martin
2017-02-01
Collecting and analyzing large amounts of process data for the purposes of education can be considered a big data/learning analytics (BD/LA) approach to improving learning. However, in the education of health care professionals, the application of BD/LA is limited to date. The authors discuss the potential advantages of the BD/LA approach for the process of learning via cognitive simulations. Using the lens of a cognitive model of radiograph interpretation with four phases (orientation, searching/scanning, feature detection, and decision making), they reanalyzed process data from a cognitive simulation of pediatric ankle radiography where 46 practitioners from three expertise levels classified 234 cases online. To illustrate the big data component, they highlight the data available in a digital environment (time-stamped, click-level process data). Learning analytics were illustrated using algorithmic computer-enabled approaches to process-level feedback. For each phase, the authors were able to identify examples of potentially useful BD/LA measures. For orientation, the trackable behavior of re-reviewing the clinical history was associated with increased diagnostic accuracy. For searching/scanning, evidence of skipping views was associated with an increased false-negative rate. For feature detection, heat maps overlaid on the radiograph can provide a metacognitive visualization of common novice errors. For decision making, the measured influence of sequence effects can reflect susceptibility to bias, whereas computer-generated path maps can provide insights into learners' diagnostic strategies. In conclusion, the augmented collection and dynamic analysis of learning process data within a cognitive simulation can improve feedback and prompt more precise reflection on a novice clinician's skill development.
Reflections in computer modeling of rooms: Current approaches and possible extensions
NASA Astrophysics Data System (ADS)
Svensson, U. Peter
2005-09-01
Computer modeling of rooms is most commonly done by calculation techniques that are based on decomposing the sound field into separate reflection components. In a first step, a list of possible reflection paths is found, and in a second step, an impulse response is constructed from the list of reflections. Alternatively, the list of reflections is used to generate a simpler echogram, the energy decay as a function of time. A number of geometrical acoustics-based methods can handle specular reflections, diffuse reflections, edge diffraction, curved surfaces, and locally/non-locally reacting surfaces to various degrees. This presentation gives an overview of how reflections are handled in the image source method and in variants of the ray-tracing methods, which dominate today's commercial software, as well as in the radiosity method and edge diffraction methods. The use of the recently standardized scattering and diffusion coefficients of surfaces is discussed. Possibilities for combining edge diffraction, surface scattering, and impedance boundaries are demonstrated for an example surface. Finally, the number of reflection paths becomes prohibitively high when all such combinations are included, as demonstrated for a simple concert hall model. [Work supported by the Acoustic Research Centre through NFR, Norway.]
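As a concrete example of the specular-reflection bookkeeping underlying the image source method, the sketch below computes first-order image sources and their arrival delays for a rectangular room; real room models handle arbitrary geometry, higher reflection orders, and the other phenomena discussed above.

```python
import numpy as np

def first_order_image_sources(src, room_dims):
    """First-order image sources for a rectangular (shoebox) room.

    src       : (x, y, z) source position
    room_dims : (Lx, Ly, Lz); walls lie at 0 and L along each axis
    """
    images = []
    for axis, L in enumerate(room_dims):
        for wall in (0.0, L):
            img = np.array(src, dtype=float)
            img[axis] = 2.0 * wall - img[axis]     # mirror the source in the wall
            images.append(img)
    return images

def reflection_delays(images, receiver, c=343.0):
    """Arrival time of each specular reflection at the receiver [s]."""
    return [np.linalg.norm(np.asarray(receiver, float) - im) / c for im in images]
```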
Test-bench system for a borehole azimuthal acoustic reflection imaging logging tool
NASA Astrophysics Data System (ADS)
Liu, Xianping; Ju, Xiaodong; Qiao, Wenxiao; Lu, Junqiang; Men, Baiyong; Liu, Dong
2016-06-01
The borehole azimuthal acoustic reflection imaging logging tool (BAAR) is a new generation of imaging logging tool, which is able to investigate strata within a relatively large volume around the borehole. The BAAR is designed on the principle of modularization and has a very complex structure, so a dedicated test-bench system is needed to debug each module of the BAAR. With the test-bench system introduced in this paper, testing and calibration of the BAAR can be easily achieved. The test-bench system is designed based on the client/server model. The hardware system mainly consists of a host computer, an embedded controlling board, a bus interface board, a data acquisition board and a telemetry communication board. The host computer serves as the human-machine interface and processes the uploaded data. The software running on the host computer is developed in VC++. The embedded controlling board uses an Advanced RISC Machine 7 (ARM7) microcontroller and communicates with the host computer via Ethernet. The software for the embedded controlling board is developed on the uClinux operating system. The bus interface board, data acquisition board and telemetry communication board are designed around a field programmable gate array (FPGA) and provide test interfaces for the logging tool. To examine the feasibility of the test-bench system, it was set up to perform a test on the BAAR. By analyzing the test results, an out-of-specification channel of the electronic receiving cabin was discovered. This suggests that the test-bench system can be used to quickly determine the working condition of the BAAR's sub-modules and is of great value in improving production efficiency and accelerating industrial production of the logging tool.
Calibration of a laboratory spectrophotometer for specular light by means of stacked glass plates.
NASA Technical Reports Server (NTRS)
Allen, W. A.; Richardson, A. J.
1971-01-01
Stacked glass plates have been used to calibrate a laboratory spectrophotometer, over the spectral range 0.5-2.5 microns, for specular light. The uncalibrated instrument was characterized by systematic errors when used to measure the reflectance and transmittance of stacked glass plates. Calibration included first, a determination of the reflectance of a standard composed of barium sulfate paint deposited on an aluminum plate; second, the approximation of the reflectance and transmittance residuals between observed and computed values by means of cubic equations; and, finally, the removal of the systematic errors by a computer. The instrument, after calibration, was accurate to 1% when used to measure the reflectance and transmittance of stacked glass plates.
Empirical conversion of the vertical profile of reflectivity from Ku-band to S-band frequency
NASA Astrophysics Data System (ADS)
Cao, Qing; Hong, Yang; Qi, Youcun; Wen, Yixin; Zhang, Jian; Gourley, Jonathan J.; Liao, Liang
2013-02-01
This paper presents an empirical method for converting reflectivity from Ku-band (13.8 GHz) to S-band (2.8 GHz) for several hydrometeor species, which facilitates the incorporation of Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR) measurements into quantitative precipitation estimation (QPE) products from the U.S. Next-Generation Radar (NEXRAD). The development of empirical dual-frequency relations is based on theoretical simulations, which have assumed appropriate scattering and microphysical models for liquid and solid hydrometeors (raindrops, snow, and ice/hail). Particle phase, shape, orientation, and density (especially for snow particles) have been considered in applying the T-matrix method to compute the scattering amplitudes. Gamma particle size distribution (PSD) is utilized to model the microphysical properties in the ice region, melting layer, and raining region of precipitating clouds. The variability of PSD parameters is considered to study the characteristics of dual-frequency reflectivity, especially the variations in radar dual-frequency ratio (DFR). The empirical relations between DFR and Ku-band reflectivity have been derived for particles in different regions within the vertical structure of precipitating clouds. The reflectivity conversion using the proposed empirical relations has been tested using real data collected by TRMM-PR and a prototype polarimetric WSR-88D (Weather Surveillance Radar 88 Doppler) radar, KOUN. The processing and analysis of collocated data demonstrate the validity of the proposed empirical relations and substantiate their practical significance for reflectivity conversion, which is essential to the TRMM-based vertical profile of reflectivity correction approach in improving NEXRAD-based QPE.
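Applied to data, the conversion amounts to adding a species-dependent DFR term to the Ku-band reflectivity. The sketch below illustrates the form of such a relation; the polynomial coefficients are hypothetical placeholders, not the values derived in the paper.

```python
import numpy as np

# Hypothetical polynomial coefficients (highest degree first); the paper derives
# separate DFR(Z_Ku) relations for rain, snow, and the melting layer from T-matrix simulations.
DFR_COEFFS = {"rain": [1.0e-4, -5.0e-3, 0.1], "snow": [2.0e-4, -1.0e-2, 0.5]}

def ku_to_s(z_ku_dbz, species="rain"):
    """Convert Ku-band reflectivity (dBZ) to S-band using Z_S = Z_Ku + DFR(Z_Ku)."""
    dfr = np.polyval(DFR_COEFFS[species], z_ku_dbz)
    return z_ku_dbz + dfr
```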
How to build better memory training games
Deveau, Jenni; Jaeggi, Susanne M.; Zordan, Victor; Phung, Calvin; Seitz, Aaron R.
2015-01-01
Can we create engaging training programs that improve working memory (WM) skills? While there are numerous procedures that attempt to do so, there is a great deal of controversy regarding their efficacy. Nonetheless, recent meta-analytic evidence shows consistent improvements across studies on lab-based tasks generalizing beyond the specific training effects (Au et al., 2014; Karbach and Verhaeghen, 2014); however, there is little research into how WM training aids participants in their daily life. Here we propose that incorporating design principles from the fields of Perceptual Learning (PL) and Computer Science might augment the efficacy of WM training and ultimately lead to greater learning and transfer. In particular, the field of PL has identified numerous mechanisms (including attention, reinforcement, multisensory facilitation and multi-stimulus training) that promote brain plasticity. Also, computer science has made great progress in the scientific approach to game design that can be used to create engaging environments for learning. We suggest that approaches integrating knowledge across these fields may lead to more effective WM interventions and better reflect real-world conditions.
Speckle interferometry using fiber optic phase stepping
NASA Technical Reports Server (NTRS)
Mercer, Carolyn R.; Beheim, Glenn
1989-01-01
A system employing closed-loop phase-stepping is used to measure the out-of-plane deformation of a diffusely reflecting object. Optical fibers are used to provide reference and object beam illumination for a standard two-beam speckle interferometer, providing set-up flexibility and ease of alignment. Piezoelectric fiber-stretchers and a phase-measurement/servo system are used to provide highly accurate phase steps. Intensity data is captured with a charge-injection-device camera, and is converted into a phase map using a desktop computer. The closed-loop phase-stepping system provides 90 deg phase steps which are accurate to 0.02 deg, greatly improving this system relative to open-loop interferometers. The system is demonstrated on a speckle interferometer, measuring the rigid-body translation of a diffusely reflecting object with an accuracy of ±10 deg, or roughly ±15 nanometers. This accuracy is achieved without the use of a pneumatically mounted optics table.
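With 90-degree phase steps, the wrapped phase follows from the standard four-bucket formula, sketched below; the variable names are assumptions, and unwrapping and calibration are omitted.

```python
import numpy as np

def four_step_phase(I0, I90, I180, I270):
    """Wrapped phase map from four intensity frames taken at 0, 90, 180, 270 degree steps."""
    return np.arctan2(I270 - I90, I0 - I180)
```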
Quantifying color variation: Improved formulas for calculating hue with segment classification.
Smith, Stacey D
2014-03-01
Differences in color form a major component of biological variation, and quantifying these differences is the first step to understanding their evolutionary and ecological importance. One common method for measuring color variation is segment classification, which uses three variables (chroma, hue, and brightness) to describe the height and shape of reflectance curves. This study provides new formulas for calculating hue (the variable that describes the "type" of color) to give correct values in all regions of color space. • Reflectance spectra were obtained from the literature, and chroma, hue, and brightness were computed for each spectrum using the original formulas as well as the new formulas. Only the new formulas result in correct values in the blue-green portion of color space. • Use of the new formulas for calculating hue will result in more accurate color quantification for a broad range of biological applications.
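The gist of the corrected hue formula is to compute the angle with a quadrant-aware arctangent so that blue-green spectra are handled correctly. The sketch below follows the usual segment-classification quantities (spectrum ordered from short to long wavelengths); exact conventions should be taken from the published formulas.

```python
import numpy as np

def segment_classification(reflectance):
    """Brightness, chroma, and hue via segment classification.

    reflectance : spectrum sampled from short (blue) to long (red) wavelengths,
    split into four equal segments Qb, Qg, Qy, Qr.
    """
    segs = np.array_split(np.asarray(reflectance, dtype=float), 4)
    Qb, Qg, Qy, Qr = [s.sum() for s in segs]
    B = Qb + Qg + Qy + Qr                        # brightness
    x, y = (Qr - Qg) / B, (Qy - Qb) / B          # red-green and yellow-blue axes
    chroma = np.hypot(x, y)
    hue = np.degrees(np.arctan2(y, x)) % 360.0   # quadrant-aware hue angle
    return B, chroma, hue
```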
Brygoo, Stephanie; Millot, Marius; Loubeyre, Paul; ...
2015-11-16
Megabar (1 Mbar = 100 GPa) laser shocks on precompressed samples allow reaching unprecedentedly high densities and moderately high temperatures of ~10^3–10^4 K. We describe in this paper a complete analysis framework for the velocimetry (VISAR) and pyrometry (SOP) data produced in these experiments. Since the precompression increases the initial density of both the sample of interest and the quartz reference for pressure-density, reflectivity, and temperature measurements, we describe analytical corrections based on available experimental data on warm dense silica and density-functional-theory based molecular dynamics computer simulations. Finally, using our improved analysis framework, we report a re-analysis of previously published data on warm dense hydrogen and helium, compare the newly inferred pressure, density, and temperature data with the most advanced equation-of-state models, and provide updated reflectivity values.
Numerical techniques for high-throughput reflectance interference biosensing
NASA Astrophysics Data System (ADS)
Sevenler, Derin; Ünlü, M. Selim
2016-06-01
We have developed a robust and rapid computational method for processing the raw spectral data collected from thin film optical interference biosensors. We have applied this method to Interference Reflectance Imaging Sensor (IRIS) measurements and observed a 10,000 fold improvement in processing time, unlocking a variety of clinical and scientific applications. Interference biosensors have advantages over similar technologies in certain applications, for example highly multiplexed measurements of molecular kinetics. However, processing raw IRIS data into useful measurements has been prohibitively time consuming for high-throughput studies. Here we describe the implementation of a lookup table (LUT) technique that provides accurate results in far less time than naive methods. We also discuss an additional benefit that the LUT method can be used with a wider range of interference layer thickness and experimental configurations that are incompatible with methods that require fitting the spectral response.
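The LUT idea can be sketched as a precompute-then-match step. In the sketch below, model_spectrum stands in for whatever thin-film forward model is used, and the nearest-neighbour match is a simplification of the actual spectral fitting.

```python
import numpy as np

def build_lut(model_spectrum, thicknesses):
    """Precompute modelled reflectance spectra over a grid of film thicknesses.

    model_spectrum(t) -> (n_wavelengths,) array; an assumed forward model of the
    spectral response for a given biomass/film thickness t.
    """
    return np.stack([model_spectrum(t) for t in thicknesses]), np.asarray(thicknesses)

def lookup_thickness(measured, lut_spectra, lut_thicknesses):
    """Match a measured spectrum against the table (nearest neighbour),
    replacing a per-pixel nonlinear fit with a fast lookup."""
    errors = np.sum((lut_spectra - measured) ** 2, axis=1)
    return lut_thicknesses[np.argmin(errors)]
```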
New reflective symmetry design capability in the JPL-IDEAS Structure Optimization Program
NASA Technical Reports Server (NTRS)
Strain, D.; Levy, R.
1986-01-01
The JPL-IDEAS antenna structure analysis and design optimization computer program was modified to process half structure models of symmetric structures subjected to arbitrary external static loads, synthesize the performance, and optimize the design of the full structure. Significant savings in computation time and cost (more than 50%) were achieved compared to the cost of full model computer runs. The addition of the new reflective symmetry analysis design capabilities to the IDEAS program allows processing of structure models whose size would otherwise prevent automated design optimization. The new program produced synthesized full model iterative design results identical to those of actual full model program executions at substantially reduced cost, time, and computer storage.
NASA Technical Reports Server (NTRS)
Reynolds, W. C. (Editor); Maccormack, R. W.
1981-01-01
Topics discussed include polygon transformations in fluid mechanics, computation of three-dimensional horseshoe vortex flow using the Navier-Stokes equations, an improved surface velocity method for transonic finite-volume solutions, transonic flow calculations with higher order finite elements, the numerical calculation of transonic axial turbomachinery flows, and the simultaneous solutions of inviscid flow and boundary layer at transonic speeds. Also considered are analytical solutions for the reflection of unsteady shock waves and relevant numerical tests, reformulation of the method of characteristics for multidimensional flows, direct numerical simulations of turbulent shear flows, the stability and separation of freely interacting boundary layers, computational models of convective motions at fluid interfaces, viscous transonic flow over airfoils, and mixed spectral/finite difference approximations for slightly viscous flows.
Designing a Reflective Teacher Education Course and Its Contribution to ELT Teachers' Reflectivity
ERIC Educational Resources Information Center
Tajik, Leila; Pakzad, Kazem
2016-01-01
Researchers in the present study planned a reflective teacher education course and documented the contribution of such a course to improving teachers' reflectivity. Five English teachers took part in the reflective teacher education course designed by the researchers. To record how the course could help improve reflective teaching, researchers…
NASA Astrophysics Data System (ADS)
Bouroubi, Mohamed Yacine
Multi-spectral satellite imagery, especially at high spatial resolution (finer than 30 m on the ground), represents an invaluable source of information for decision making in various domains related to natural resources management, environment preservation and urban planning and management. The mapping scales may range from local (resolution finer than 5 m) to regional (resolution coarser than 5 m). The images are characterized by objects' reflectance in the electromagnetic spectrum, which represents the key information in many applications. However, satellite sensor measurements are also affected by parasitic inputs due to illumination and observation conditions, the atmosphere, topography and sensor properties. Two questions have oriented this research. What is the best approach to retrieve surface reflectance from the measured values while taking these parasitic factors into account? Is this retrieval a sine qua non condition for reliable information extraction from the images in their diverse domains of application (mapping, environmental monitoring, landscape change detection, resources inventory, etc.)? The goals we have delineated for this research are as follows: (1) develop software to retrieve ground reflectance while taking into account the aspects mentioned earlier; this software had to be modular enough to allow improvement and adaptation to diverse remote sensing application problems; and (2) apply this software in various contexts (urban, agricultural, forest) and analyse the results to evaluate the accuracy gain of information extracted from remote sensing imagery transformed into ground reflectance images, in order to demonstrate the necessity of operating in this way, whatever the type of application. During this research, we have developed a tool to retrieve ground reflectance (the new version of the REFLECT software). This software is based on the formulas (and routines) of the 6S code (Second Simulation of Satellite Signal in the Solar Spectrum) and on the dark-targets method to estimate the aerosol optical depth (AOD), the most difficult factor to correct. Substantial improvements have been made to the existing models. These improvements essentially concern the aerosol properties (integration of a more recent model, improvement of the dark-target selection used to estimate the AOD), the adjacency effect, the adaptation to the most widely used high-resolution (Landsat TM and ETM+, all HR SPOT 1 to 5, EO-1 ALI and ASTER) and very high resolution (QuickBird and Ikonos) sensors, and the correction of topographic effects with a model that separates the direct and diffuse solar radiation components, together with the adaptation of this model to forest canopy. Validation has shown that ground reflectance estimation with REFLECT is performed with an accuracy of approximately ±0.01 in reflectance units (in the visible, near-infrared and middle-infrared spectral bands), even for surfaces with varying topography. This software has made it possible to demonstrate, through apparent reflectance simulations, how strongly the parasitic factors affecting the numerical values of the images may alter the retrieved ground reflectance (errors ranging from 10 to 50%). REFLECT has also been used to examine the usefulness of ground reflectance instead of raw data for various common remote sensing applications in domains such as classification, change detection, agriculture and forestry. In most applications (multi-temporal change monitoring, use of vegetation indices, biophysical parameters estimation, etc.)
image correction is a crucial step to obtain reliable results. From the computing environment standpoint, REFLECT is organized as a series of menus corresponding to the different steps: entering the input parameters, calculating gas transmittances, estimating the AOD, and finally applying the image correction, with the possibility of using a fast option which processes an image of 5000 by 5000 pixels in approximately 15 minutes. (Abstract shortened by UMI.)
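As a concrete illustration of the kind of atmospheric correction REFLECT performs, the sketch below inverts the standard 6S-style signal equation for a uniform Lambertian target. The function and variable names are hypothetical and the formula is the generic textbook form, not necessarily REFLECT's exact implementation.

```python
import numpy as np

def toa_to_surface_reflectance(rho_toa, t_gas, rho_atm, t_down, t_up, s_alb):
    """Invert the standard 6S-style signal equation (Lambertian, uniform target):
        rho_toa = t_gas * (rho_atm + t_down * t_up * rho_s / (1 - s_alb * rho_s))
    All inputs are dimensionless; arrays broadcast per pixel and per band.
    """
    y = (rho_toa / t_gas - rho_atm) / (t_down * t_up)
    return y / (1.0 + s_alb * y)

# Example with plausible mid-visible values (illustrative only).
rho_s = toa_to_surface_reflectance(
    rho_toa=np.array([0.12]), t_gas=0.95, rho_atm=0.05,
    t_down=0.85, t_up=0.90, s_alb=0.10)
print(rho_s)  # estimated surface reflectance
```

The aerosol-dependent terms (rho_atm, the transmittances and the spherical albedo s_alb) are exactly where the dark-target AOD estimate enters in practice.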
Design of teleoperation system with a force-reflecting real-time simulator
NASA Technical Reports Server (NTRS)
Hirata, Mitsunori; Sato, Yuichi; Nagashima, Fumio; Maruyama, Tsugito
1994-01-01
We developed a force-reflecting teleoperation system that uses a real-time graphic simulator. This system eliminates the effects of communication time delays in remote robot manipulation. The simulator provides the operator with a predictive display and feedback of computed contact forces through a six-degree-of-freedom (6-DOF) master arm on a real-time basis. With this system, peg-in-hole tasks involving round-trip communication time delays of up to a few seconds were performed at three support levels: a real image alone, a predictive display with a real image, and a real-time graphic simulator with computed-contact-force reflection and a predictive display. The experimental results indicate that the best teleoperation efficiency was achieved by using the force-reflecting simulator with the two images: the shortest work time, the lowest maximum sensor reading, and a 100 percent success rate were obtained. These results demonstrate the effectiveness of simulated force reflection in improving teleoperation efficiency.
Marchan-Hernandez, Juan Fernando; Camps, Adriano; Rodriguez-Alvarez, Nereida; Bosch-Lluis, Xavier; Ramos-Perez, Isaac; Valencia, Enric
2008-01-01
Signals from Global Navigation Satellite Systems (GNSS) were originally conceived for position and speed determination, but they can be used as signals of opportunity as well. The reflection process over a given surface modifies the properties of the scattered signal, and therefore, by processing the reflected signal, relevant geophysical data regarding the surface under study (land, sea, ice…) can be retrieved. In essence, a GNSS-R receiver is a multi-channel GNSS receiver that computes the received power from a given satellite at a number of different delay and Doppler bins of the incoming signal. The first approaches to building such a receiver consisted of sampling and storing the scattered signal for later post-processing. However, a real-time approach to the problem is desirable to obtain useful geophysical variables immediately and to reduce the amount of data. The use of FPGA technology makes this possible, while at the same time the system can be easily reconfigured. The signal tracking and processing constraints made it necessary to fully design several new blocks. The uniqueness of the implemented system described in this work is its capability to compute Delay-Doppler maps (DDMs) in real time, either for four simultaneous satellites or for just one but with a larger number of bins. The first tests have been conducted from a cliff over the sea and demonstrate the successful performance of the instrument in computing DDMs in real time from the measured reflected GNSS-R signals. The processing of these measurements shall yield quantitative relationships between the sea state (mainly driven by the surface wind and the swell) and the overall DDM shape. The ultimate goal is to use the DDM shape to correct the sea state influence on the L-band brightness temperature to improve the retrieval of the sea surface salinity (SSS). PMID:27879862
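For readers unfamiliar with DDMs, the following minimal sketch shows the underlying computation: coherent correlation of the reflected signal against delayed and Doppler-shifted code replicas. It is written in floating-point NumPy rather than the fixed-point FPGA logic described in the abstract, and all names and bin choices are illustrative.

```python
import numpy as np

def delay_doppler_map(signal, replica, delay_bins, doppler_bins, fs):
    """Compute one coherent DDM: |sum_n s[n] c*[n - tau] exp(-j 2 pi f n / fs)|^2.
    `signal` is the down-converted reflected GNSS signal and `replica` the local
    code replica of the same length; delays are in samples, Dopplers in Hz.
    """
    n = np.arange(len(signal))
    ddm = np.empty((len(doppler_bins), len(delay_bins)))
    for i, f in enumerate(doppler_bins):
        wiped = signal * np.exp(-2j * np.pi * f * n / fs)   # Doppler wipe-off
        for j, tau in enumerate(delay_bins):
            # vdot conjugates its first argument, giving the correlation sum.
            ddm[i, j] = np.abs(np.vdot(np.roll(replica, tau), wiped)) ** 2
    return ddm
```

In an operational receiver several such coherent maps are averaged incoherently to reduce speckle before relating the DDM shape to the sea state.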
A Detailed Study of Sonar Tomographic Imaging
2013-08-01
BPA) to form an object image. As the data is collected radially about the axis of rotation, one computation method computes an inverse Fourier... images are not quite as sharp. It is concluded... that polar BPA processing requires an appropriate choice of... attenuation factor to reduce the effect of the specular reflections, while for the 2DIFT BPA approach the degrading effect from these reflections is
Improving Seismic Data Accessibility and Performance Using HDF Containers
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Wang, J.; Yang, R.
2017-12-01
The performance of computational geophysical data processing and forward modelling relies on both computation and data. Significant efforts on developing new data formats and libraries have been made by the community, such as IRIS/PASSCAL and ASDF on the data side, and programs and utilities such as ObsPy and SPECFEM. The National Computational Infrastructure hosts a nationally significant geophysical data collection that is co-located with a high performance computing facility, providing an opportunity to investigate how to improve the data formats from both a data management and a performance point of view. This paper investigates how to enhance data usability from several perspectives: 1) propose a convention for the seismic (both active and passive) community to improve data accessibility and interoperability; 2) recommend the convention for use in the HDF container when data is made available in PH5 or ASDF formats; 3) provide tools to convert between various seismic data formats; 4) provide performance benchmark cases using the ObsPy library and SPECFEM3D to demonstrate how different data organization, in terms of chunking size and compression, impacts performance, by comparing new data formats such as PH5 and ASDF to traditional formats such as SEGY, SEED, SAC, etc. In this work we apply our knowledge and experience of data standards and conventions, such as CF and ACDD from the climate community, to the seismology community. The generic global attributes widely used in the climate community are combined with existing conventions in the seismology community, such as CMT, QuakeML, StationXML, and the SEGY header convention. We also extend the convention by including provenance and benchmarking records so that the user can learn the footprint of the data together with its baseline performance. In practice we convert example wide-angle reflection seismic data from SEGY to PH5 or ASDF using the ObsPy and pyasdf libraries. This quantitatively demonstrates how accessibility can be improved when seismic data are stored in an HDF container.
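A minimal sketch of the SEGY-to-ASDF conversion path mentioned above, assuming ObsPy and pyasdf are installed. The file names, tag, and placeholder station codes are assumptions; real SEG-Y trace headers carry no SEED identifiers, so some assignment of this kind is usually needed before ASDF will organize the waveforms.

```python
import obspy
import pyasdf

# Read an active-source SEG-Y line (file name is a placeholder).
stream = obspy.read("line01.segy", format="SEGY")

# Assign placeholder network/station codes, since ASDF groups waveforms by station.
for i, tr in enumerate(stream):
    tr.stats.network = "XX"
    tr.stats.station = f"R{i:04d}"

# Write the traces into an HDF5-based ASDF container.
ds = pyasdf.ASDFDataSet("line01.h5")
ds.add_waveforms(stream, tag="raw_recording")
del ds  # flush and close the underlying HDF5 file
```

Chunking and compression settings of the HDF5 container are what the benchmark cases in the abstract vary when comparing against SEGY, SEED and SAC.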
76 FR 66135 - Investment Advice-Participants and Beneficiaries
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
... ``computer model'' requirement (requiring use of a certified computer model). Both types of arrangements also... that use fee-leveling] or paragraph (b)(4) [describing investment advice arrangements that use computer...-leveling and computer modeling provisions of the final rule. We note that, as also reflected in paragraph...
Defining Computational Thinking for Mathematics and Science Classrooms
ERIC Educational Resources Information Center
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-01-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…
NASA Technical Reports Server (NTRS)
Gordon, S.; Mcbride, B. J.
1976-01-01
A detailed description of the equations and computer program for computations involving chemical equilibria in complex systems is given. A free-energy minimization technique is used. The program permits calculations such as (1) chemical equilibrium for assigned thermodynamic states (T,P), (H,P), (S,P), (T,V), (U,V), or (S,V), (2) theoretical rocket performance for both equilibrium and frozen compositions during expansion, (3) incident and reflected shock properties, and (4) Chapman-Jouguet detonation properties. The program considers condensed species as well as gaseous species.
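For reference, the free-energy minimization underlying such programs can be stated as the following standard constrained problem; this is the generic textbook formulation, not a transcription of the report's own equations.

```latex
\min_{n_j \ge 0}\;\frac{G}{RT}
  \;=\; \sum_{j \in \text{gas}} n_j\!\left[\frac{\mu_j^{\circ}}{RT}
      + \ln\frac{n_j}{n} + \ln\frac{P}{P^{\circ}}\right]
  \;+\; \sum_{j \in \text{cond}} n_j\,\frac{\mu_j^{\circ}}{RT},
\qquad \text{subject to}\quad \sum_j a_{ij}\,n_j = b_i ,
```

where the n_j are species mole numbers, n is the total moles of gas, a_ij is the number of atoms of element i in species j, and b_i is the total amount of element i; the stationary conditions of the corresponding Lagrangian are typically solved by Newton-Raphson iteration.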
TERRA REF: Advancing phenomics with high resolution, open access sensor and genomics data
NASA Astrophysics Data System (ADS)
LeBauer, D.; Kooper, R.; Burnette, M.; Willis, C.
2017-12-01
Automated plant measurement has the potential to improve understanding of genetic and environmental controls on plant traits (phenotypes). The application of sensors and software in the automation of high throughput phenotyping reflects a fundamental shift from labor intensive hand measurements to drone, tractor, and robot mounted sensing platforms. These tools are expected to speed the rate of crop improvement by enabling plant breeders to more accurately select plants with improved yields, resource use efficiency, and stress tolerance. However, there are many challenges facing high throughput phenomics: sensors and platforms are expensive, currently there are few standard methods of data collection and storage, and the analysis of large data sets requires high performance computers and automated, reproducible computing pipelines. To overcome these obstacles and advance the science of high throughput phenomics, the TERRA Phenotyping Reference Platform (TERRA-REF) team is developing an open-access database of high resolution sensor data. TERRA REF is an integrated field and greenhouse phenotyping system that includes: a reference field scanner with fifteen sensors that can generate terabytes of data each day at mm resolution; UAV, tractor, and fixed field sensing platforms; and an automated controlled-environment scanner. These platforms will enable investigation of diverse sensing modalities and of traits under both controlled and field environments. It is the goal of TERRA REF to lower the barrier to entry for academic and industry researchers by providing high-resolution data, open source software, and online computing resources. Our project is unique in that all data will be made fully public in November 2018, and is already available to early adopters through the beta-user program. We will describe the datasets and how to use them as well as the databases and computing pipeline and how these can be reused and remixed in other phenomics pipelines. Finally, we will describe the National Data Service workbench, a cloud computing platform that can access the petabyte scale data while supporting reproducible research.
NASA Technical Reports Server (NTRS)
Clark, T. A.; Brainard, G.; Salazar, G.; Johnston, S.; Schwing, B.; Litaker, H.; Kolomenski, A.; Venus, D.; Tran, K.; Hanifin, J.;
2017-01-01
NASA has demonstrated an interest in improving astronaut health and performance through the installation of a new lighting countermeasure on the International Space Station. The Solid State Lighting Assembly (SSLA) system is designed to positively influence astronaut health by providing a daily change to the light spectrum to improve circadian entrainment. Unfortunately, existing NASA standards and requirements define ambient light level requirements for crew sleep and other tasks, yet the number of light-emitting diode (LED) indicators and displays within a habitable volume is currently uncontrolled. Because each of these light sources has its own unique spectral properties, the additive lighting environment ends up becoming something different from what was planned or researched. Restricting the use of displays and indicators is not a solution because these systems provide beneficial feedback to the crew. The research team for this grant used computational modeling and real-world lighting mockups to document the contribution that light sources other than the ambient lighting system make to the ambient spectral lighting environment. In particular, the team focused on understanding the impacts of long-term tasks located in front of avionics or computer displays. The team also wanted to understand options for mitigating changes to the ambient light spectrum in the interest of maintaining the performance of a lighting countermeasure. The project utilized a variety of physical and computer-based simulations to determine direct relationships between system implementation and light spectrum. Using real-world data, computer models were built in the commercially available optics analysis software Zemax Optics Studio(c). The team also built a mockup test facility that had the same volume and configuration as one of the Zemax models. The team collected over 1200 spectral irradiance measurements, each representing a different configuration of the mockup. Analysis of the data showed a measurable impact on the ambient light spectrum. The data also showed that straightforward design techniques exist that can be used to keep the ambient light spectrum closer to the planned spectral operating environment at the observer's eye point. The following observations should be considered when designing an operational environment that is dominated by computer displays. The more light is directed into the observer's field of view, the greater its impact on the various human factors issues that depend on spectral shape and intensity. Because viewing angle plays a large part in the amount of light flux reaching the crewmember's retina, beam shape combined with light source location is an important factor in determining the percent probable incident flux on the observer from any combination of light sources. Computer graphics design and display lumen output are major factors influencing the amount of spectrally intense light projected into the environment and in the viewer's direction. Adjustable white-point display software was useful only if the predominant background color was white and matched the ambient lighting system's color. Display graphics that used a predominantly black background had the least influence on unplanned spectral energy projected into the environment.
Percent reflectance makes a difference in the total energy reflected back into an environment, and within certain architectural geometries, reflectance can be used to control how much of a light spectrum is allowed to persist in the environment. The data showed that room volume and distance from significant light sources influence the total spectrum in a room: smaller environments had a homogenizing effect on the total light spectrum, whereas light from multiple sources in larger environments was less well mixed. The findings indicated above should be considered when making recommendations for practice or standards for architectural systems. The ambient lighting system, surface reflectance, and display and indicator implementation all factor into the user's spectral environment. A variety of low-cost solutions exist to mitigate the impact of light from non-architectural lighting systems, and there is much potential for system automation and for integration of display systems with the ambient environment. The team believes that proper planning can be used to avoid integration problems, and also that human-in-the-loop evaluations, real-world test and measurement, and computer modeling can be used to determine how changes to a process, display graphics, and architecture will help maintain the planned spectral operating lighting environment.
NASA Astrophysics Data System (ADS)
Pascal, Christophe
2004-04-01
Stress inversion programs are nowadays frequently used in tectonic analysis. The purpose of this family of programs is to reconstruct the stress tensor characteristics from fault slip data acquired in the field or derived from earthquake focal mechanisms (i.e. inverse methods). Until now, little attention has been paid to direct methods (i.e. to determine fault slip directions from an inferred stress tensor). During the 1990s, the fast increase in resolution in 3D seismic reflection techniques made it possible to determine the geometry of subsurface faults with a satisfactory accuracy but not to determine precisely their kinematics. This recent improvement allows the use of direct methods. A computer program, namely SORTAN, is introduced. The program is highly portable on Unix platforms, straightforward to install and user-friendly. The computation is based on classical stress-fault slip relationships and allows for fast treatment of a set of faults and graphical presentation of the results (i.e. slip directions). In addition, the SORTAN program permits one to test the sensitivity of the results to input uncertainties. It is a complementary tool to classical stress inversion methods and can be used to check the mechanical consistency and the limits of structural interpretations based upon 3D seismic reflection surveys.
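The direct problem solved by such a program reduces, under the usual Wallace-Bott hypothesis, to projecting the traction of the inferred stress tensor onto each fault plane. The sketch below is a generic illustration of that step, not SORTAN's code; the function name, sign convention, and example stress state are hypothetical.

```python
import numpy as np

def predicted_slip_direction(stress, normal):
    """Resolved-shear (Wallace-Bott) direction on a fault plane.
    `stress` is the 3x3 stress tensor (compression positive here) and `normal`
    the fault-plane normal; returns the unit slip direction within the plane.
    """
    normal = normal / np.linalg.norm(normal)
    traction = stress @ normal                        # traction on the plane
    shear = traction - (traction @ normal) * normal   # remove normal component
    return shear / np.linalg.norm(shear)

# Illustrative principal stresses (MPa) aligned with the coordinate axes,
# and the normal of a plane dipping 60 degrees toward +x (z vertical).
sigma = np.diag([60.0, 40.0, 20.0])
n = np.array([np.sin(np.radians(60)), 0.0, np.cos(np.radians(60))])
print(predicted_slip_direction(sigma, n))
```

Comparing such predicted slip directions with striae or focal-mechanism slip vectors is what allows the mechanical consistency check mentioned in the abstract.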
Simulation of cold magnetized plasmas with the 3D electromagnetic software CST Microwave Studio®
NASA Astrophysics Data System (ADS)
Louche, Fabrice; Křivská, Alena; Messiaen, André; Wauters, Tom
2017-10-01
Detailed designs of ICRF antennas were made possible by the development of sophisticated commercial 3D codes like CST Microwave Studio® (MWS). This program allows for very detailed geometries of the radiating structures, but it originally considered only simple materials, like equivalent isotropic dielectrics, to simulate the reflection and refraction of RF waves at the vacuum/plasma interface. The code was nevertheless used intensively, notably for computing the coupling properties of the ITER ICRF antenna. Until recently it was not possible to simulate gyrotropic media like magnetized plasmas, but recent improvements have allowed programming any material described by a general dielectric and/or diamagnetic tensor. A Visual Basic macro was developed to exploit this feature and was tested for the specific case of a monochromatic plane wave propagating longitudinally with respect to the magnetic field direction. For specific cases the exact solution can be expressed in 1D as the sum of two circularly polarized waves connected by a reflection coefficient that can be computed analytically. Solutions for stratified media can also be derived. This allows for a direct comparison with MWS results. The agreement is excellent, but accurate simulations for realistic geometries require large memory resources that could significantly restrict the possibility of simulating cold plasmas to small-scale machines.
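The 1D benchmark described above relies on textbook cold-plasma results: for a wave of frequency omega propagating parallel to the magnetic field in an electron-only cold plasma, the two circular polarizations obey

```latex
n_{R,L}^{2} \;=\; 1 \;-\; \frac{\omega_{pe}^{2}}{\omega\,\left(\omega \mp \omega_{ce}\right)},
\qquad
r \;=\; \frac{n_1 - n_2}{n_1 + n_2},
```

where omega_pe is the plasma frequency, omega_ce the electron cyclotron frequency, and r the normal-incidence amplitude reflection coefficient between two homogeneous layers of refractive indices n_1 and n_2; stratified-media solutions follow by chaining such interface relations. The precise coefficient used by the authors may be defined differently.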
In-App Reflection Guidance: Lessons Learned Across Four Field Trials at the Workplace
ERIC Educational Resources Information Center
Fessl, Angela; Wesiak, Gudrun; Rivera-Pelayo, Verónica; Feyertag, Sandra; Pammer, Viktoria
2017-01-01
This paper presents a concept for in-app reflection guidance and its evaluation in four work-related field trials. By synthesizing across four field trials, we can show that computer-based reflection guidance can function in the workplace, in the sense of being accepted as technology, being perceived as useful and leading to reflective learning.…
NASA Astrophysics Data System (ADS)
Lukosi, Eric D.; Herrera, Elan H.; Hamm, Daniel S.; Burger, Arnold; Stowe, Ashley C.
2017-11-01
An array of lithium indium diselenide (LISe) scintillators was investigated for application in neutron imaging. The sensors, varying in thickness and surface roughness, were tested using both reflective and anti-reflective mounting to an aluminum window. The spatial resolution of each LISe scintillator was calculated using the knife-edge test and a modulation transfer function analysis. It was found that the anti-reflective backing case yielded spatial resolutions higher by up to a factor of two over the reflective backing case, despite a reduction in measured light yield by an average factor of 1.97. In most cases, the use of an anti-reflective backing resulted in a higher spatial resolution than the 50 μm-thick ZnS(Cu):6LiF comparison scintillation screen. The effect of surface roughness was not directly correlated to measured light yield or observed spatial resolution, but weighting the reflective backing case by the random surface roughness revealed that a linear relationship exists between the fractional change (RB/ARB) of the two. Finally, the LISe scintillator array was used in neutron computed tomography to investigate the features of Halyomorpha halys with the reflective and anti-reflective backing.
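A minimal sketch of the knife-edge analysis mentioned above: the edge spread function is differentiated into a line spread function, which is Fourier transformed into an MTF. The finite-difference derivative and normalization choices are illustrative, not necessarily those used in the study.

```python
import numpy as np

def mtf_from_edge(edge_profile, pixel_pitch_mm):
    """Knife-edge analysis: ESF -> LSF (derivative) -> MTF (|FFT|, normalized).
    `edge_profile` is a 1-D intensity profile across the imaged edge.
    Returns spatial frequencies (cycles/mm) and the corresponding MTF values.
    """
    esf = np.asarray(edge_profile, dtype=float)
    lsf = np.gradient(esf)                  # line spread function
    lsf /= lsf.sum()                        # normalize area to 1
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                           # force MTF(0) = 1
    freq = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
    return freq, mtf
```

The spatial resolution is then typically quoted as the frequency at which the MTF drops to a chosen threshold (e.g. 10%).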
NASA Technical Reports Server (NTRS)
Russell, O. R. (Principal Investigator); Nichols, D. A.; Anderson, R.
1977-01-01
The author has identified the following significant results. Evaluation of LANDSAT imagery indicates severe limitations in its utility for surface mine land studies. Image striping resulting from unequal detector response on the satellite degrades the image quality to the extent that images at scales larger than 1:125,000 are of limited value for manual interpretation. Computer processing of LANDSAT data to improve image quality is essential; the removal of scan-line striping and enhancement of mine land reflectance data, combined with color composite printing, permits useful photographic enlargements to approximately 1:60,000.
Can Hyperspectral Remote Sensing Detect Species Specific Biochemicals ?
NASA Astrophysics Data System (ADS)
Vanderbilt, V. C.; Daughtry, C. S.
2011-12-01
Discrimination of a few plants scattered among many plants is a goal common to detection of agricultural weeds, invasive plant species and illegal Cannabis clandestinely grown outdoors, the subject of this research. Remote sensing technology provides an automated, computer based, land cover classification capability that holds promise for improving upon the existing approaches to Cannabis detection. In this research, we investigated whether hyperspectral reflectance of recently harvested, fully turgid Cannabis leaves and buds depends upon the concentration of the psychoactive ingredient Tetrahydrocannabinol (THC) that, if present at sufficient concentration, presumably would allow species-specific identification of Cannabis.
Analysis and improvement of gas turbine blade temperature measurement error
NASA Astrophysics Data System (ADS)
Gao, Shan; Wang, Lixin; Feng, Chi; Daniel, Ketui
2015-10-01
Gas turbine blade components are easily damaged; they also operate in harsh high-temperature, high-pressure environments over extended durations. Therefore, ensuring that the blade temperature remains within the design limits is very important. In this study, measurement errors in turbine blade temperatures were analyzed, taking into account detector lens contamination, the reflection of environmental energy from the target surface, the effects of the combustion gas, and the emissivity of the blade surface. In this paper, each of the above sources of measurement error is discussed, and an iterative computing method for calculating blade temperature is proposed.
NASA Technical Reports Server (NTRS)
Malila, W. A.; Cicone, R. C.; Gleason, J. M.
1976-01-01
Simulated scanner system data values generated in support of LACIE (Large Area Crop Inventory Experiment) research and development efforts are presented. Synthetic inband (LANDSAT) wheat radiances and radiance components were computed and are presented for various wheat canopy and atmospheric conditions and scanner view geometries. Values include: (1) inband bidirectional reflectances for seven stages of wheat crop growth; (2) inband atmospheric features; and (3) inband radiances corresponding to the various combinations of wheat canopy and atmospheric conditions. Analyses of these data values are presented in the main report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozen, P.; Gelfond, M.; Zaltzman, S.
1982-08-01
The esophagus was evaluated in 15 patients with achalasia by continuous gamma camera imaging following ingestion of a semi-solid meal labeled with (99m)Tc. The images were displayed and recorded on a simple computerized data processing/display system. Subsequent cine-mode images of esophageal emptying demonstrated abnormalities of the body of the esophagus not reflected by the manometric examination. Computer-generated time-activity curves representing specific regions of interest were better than manometry in evaluating the results of myotomy, dilatation, and drug therapy. Isosorbide dinitrate significantly improved esophageal emptying.
Euclidean mirrors: enhanced vacuum decay from reflected instantons
NASA Astrophysics Data System (ADS)
Akal, Ibrahim; Moortgat-Pick, Gudrid
2018-05-01
We study the tunnelling of virtual matter–antimatter pairs from the quantum vacuum in the presence of a spatially uniform, time-dependent electric background composed of a strong, slow field superimposed with a weak, rapid field. After analytic continuation to Euclidean spacetime, we obtain from the instanton equations two critical points. While one of them is the closing point of the instanton path, the other serves as a Euclidean mirror which reflects and squeezes the instanton. It is this reflection and shrinking which is responsible for an enormous enhancement of the vacuum pair production rate. We discuss how important features of two different mechanisms can be analysed and understood via such a rotation in the complex plane. (a) Consistent with previous studies, we first discuss the standard assisted mechanism with a static strong field and certain weak fields with a distinct pole structure in order to show that the reflection takes place exactly at the poles. We also discuss the effect of possible sub-cycle structures. We extend this reflection picture then to weak fields which have no poles present and illustrate the effective reflections with explicit examples. An additional field strength dependence for the rate occurs in such cases. We analytically compute the characteristic threshold for the assisted mechanism given by the critical combined Keldysh parameter. We discuss significant differences between these two types of fields. For various backgrounds, we present the contributing instantons and perform analytical computations for the corresponding rates treating both fields nonperturbatively. (b) In addition, we also study the case with a nonstatic strong field which gives rise to the assisted dynamical mechanism. For different strong field profiles we investigate the impact on the critical combined Keldysh parameter. As an explicit example, we analytically compute the rate by employing the exact reflection points. The validity of the predictions for both mechanisms is confirmed by numerical computations.
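For orientation, the dimensionless parameter referred to above is conventionally defined, in natural units and for a field of amplitude E and frequency omega, as

```latex
\gamma \;=\; \frac{m\,\omega}{e\,E},
```

and in the assisted mechanism the combined parameter is usually built from the weak field's frequency and the strong field's amplitude, gamma_comb = m Omega_weak / (e E_strong); the authors' precise definition and threshold value should be taken from the paper itself.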
NASA Technical Reports Server (NTRS)
Minnis, Patrick; Liou, Kuo-Nan; Takano, Yoshihide
1993-01-01
The impact of using phase functions for spherical droplets and hexagonal ice crystals to analyze radiances from cirrus is examined. Adding-doubling radiative transfer calculations are employed to compute radiances for different cloud thicknesses and heights over various backgrounds. These radiances are used to develop parameterizations of top-of-the-atmosphere visible reflectance and IR emittance using tables of reflectances as a function of cloud optical depth, viewing and illumination angles, and microphysics. This parameterization, which includes Rayleigh scattering, ozone absorption, variable cloud height, and an anisotropic surface reflectance, reproduces the computed top-of-the-atmosphere reflectances with an accuracy of +/- 6 percent for four microphysical models: 10-micron water droplet, small symmetric crystal, cirrostratus, and cirrus uncinus. The accuracy is twice that of previous models.
Khersonsky, Olga; Röthlisberger, Daniela; Wollacott, Andrew M.; Murphy, Paul; Dym, Orly; Albeck, Shira; Kiss, Gert; Houk, K. N.; Baker, David; Tawfik, Dan S.
2013-01-01
Although de novo computational enzyme design has been shown to be feasible, the field is still in its infancy: the kinetic parameters of designed enzymes are still orders of magnitude lower than those of naturally occurring ones. Nonetheless, designed enzymes can be improved by directed evolution, as recently exemplified for the designed Kemp eliminase KE07. Random mutagenesis and screening resulted in variants with >200-fold higher catalytic efficiency, and provided insights about features missing in the designed enzyme. Here we describe the optimization of KE70, another designed Kemp eliminase. Amino acid substitutions predicted to improve catalysis in design calculations involving extensive backbone sampling were individually tested. Those proven beneficial were combinatorially incorporated into the originally designed KE70 along with random mutations, and the resulting libraries were screened for improved eliminase activity. Nine rounds of mutation and selection resulted in >400-fold improvement in the catalytic efficiency of the original KE70 design, reflected in both higher kcat and lower KM values, with the best variants exhibiting kcat/KM values of >5 x 10^4 s^-1 M^-1. The optimized KE70 variants were characterized structurally and biochemically providing insights into the origins of the improvements in catalysis. Three primary contributions were identified: first, the reshaping of the active site cavity to achieve tighter substrate binding; second, the fine-tuning of the electrostatics around the catalytic His-Asp dyad; and third, stabilization of the active-site dyad in a conformation optimal for catalysis. PMID:21277311
The PLATO System and Language Study.
ERIC Educational Resources Information Center
Hart, Robert S., Ed.
1981-01-01
This issue presents an overview of research in computer-based language instruction using the PLATO IV computer system. The following articles are presented: (1) "Language Study and the PLATO system," by R. Hart; (2) "Reflections on the Use of Computers in Second-Language Acquisition," by F. Marty; (3) "Computer-Based…
Videodisc-Computer Interfaces.
ERIC Educational Resources Information Center
Zollman, Dean
1984-01-01
Lists microcomputer-videodisc interfaces currently available from 26 sources, including home use systems connected through remote control jack and industrial/educational systems utilizing computer ports and new laser reflective and stylus technology. Information provided includes computer and videodisc type, language, authoring system, educational…
Cone-beam x-ray luminescence computed tomography based on x-ray absorption dosage.
Liu, Tianshuai; Rong, Junyan; Gao, Peng; Zhang, Wenli; Liu, Wenlei; Zhang, Yuanke; Lu, Hongbing
2018-02-01
With the advances of x-ray excitable nanophosphors, x-ray luminescence computed tomography (XLCT) has become a promising hybrid imaging modality. In particular, a cone-beam XLCT (CB-XLCT) system has demonstrated its potential in in vivo imaging with the advantage of fast imaging speed over other XLCT systems. Currently, the imaging models of most XLCT systems assume that nanophosphors emit light based on the intensity distribution of x-ray within the object, not completely reflecting the nature of the x-ray excitation process. To improve the imaging quality of CB-XLCT, an imaging model that adopts an excitation model of nanophosphors based on x-ray absorption dosage is proposed in this study. To solve the ill-posed inverse problem, a reconstruction algorithm that combines the adaptive Tikhonov regularization method with the imaging model is implemented for CB-XLCT reconstruction. Numerical simulations and phantom experiments indicate that compared with the traditional forward model based on x-ray intensity, the proposed dose-based model could improve the image quality of CB-XLCT significantly in terms of target shape, localization accuracy, and image contrast. In addition, the proposed model behaves better in distinguishing closer targets, demonstrating its advantage in improving spatial resolution. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
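A minimal sketch of the Tikhonov-regularized reconstruction step described above, with a fixed regularization parameter standing in for the adaptive choice and a synthetic system matrix in place of the dose-based forward model; all names and sizes are illustrative.

```python
import numpy as np

def tikhonov_reconstruction(A, b, lam):
    """Solve min_x ||A x - b||^2 + lam ||x||^2 via the normal equations.
    A is the (dose-based) forward system matrix, b the boundary measurements,
    lam the regularization parameter.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Tiny synthetic example with two "targets".
rng = np.random.default_rng(0)
A = rng.random((200, 50))
x_true = np.zeros(50); x_true[[10, 30]] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(200)
x_rec = tikhonov_reconstruction(A, b, lam=1e-2)
```

In the dose-based imaging model, the entries of A weight each voxel by its absorbed x-ray dose rather than by the local beam intensity alone.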
NASA Astrophysics Data System (ADS)
Chen, Yung-Sheng; Wang, Jeng-Yau
2015-09-01
The light source plays a significant role in acquiring a high-quality image of objects for image processing and pattern recognition. For objects possessing a specular surface, the phenomena of reflection and halo appearing in the acquired image increase the difficulty of information processing. Such a situation may be improved with the assistance of a suitable diffuse light source. Consider reading a resistor via computer vision: due to the resistor's specular reflective surface, the image will exhibit severely non-uniform luminous intensity, yielding a higher recognition error rate without a well-controlled light source. A measurement system is presented in this paper, consisting mainly of a digital microscope embedded in a replaceable diffuse cover, a ring-type LED embedded onto a small pad carrying a resistor for evaluation, and Arduino microcontrollers connected to a PC. Several replaceable, cost-effective diffuse covers made from a paper bowl, cup, and box lined inside with white paper are presented for reducing specular reflection and halo effects, and are compared with a commercial diffuse dome. The ring-type LED can be flexibly configured for full or partial lighting depending on the application. For each self-made diffuse cover, a set of resistors with 4 or 5 color bands is captured via the digital microscope for experiments. The signal-to-noise ratio from the segmented resistor image is used for performance evaluation. The detected principal axis of the resistor body is used for the partial LED configuration to further improve the lighting condition. Experimental results confirm that the proposed mechanism can not only evaluate the cost-effective diffuse light source but also be extended to an automatic recognition system for resistor reading.
Scanning computed confocal imager
George, John S.
2000-03-14
There is provided a confocal imager comprising a light source emitting light, with a light modulator in optical communication with the light source for varying the spatial and temporal pattern of the light. A beam splitter receives the scanned light, directs it onto a target, and passes light reflected from the target to a video capture device that receives the reflected light and transfers a digital image of the reflected light to a computer for creating a virtual aperture and outputting the digital image. In a transmissive mode of operation the invention omits the beam splitter means and captures light passed through the target.
Polymeric and Molecular Materials for Advanced Organic Electronics
2014-10-20
x-ray reflectivity, grazing incidence x-ray scattering, cyclic voltammetry... These materials are characterized by AFM, conducting AFM, XPS, x-ray reflectivity (XRR), standing wave x-ray reflectivity (SWXRR), x-ray ...radiation hardness measurements, and quantum chemical computation of dielectric constants. Remarkably, for semiconductors as diverse
Improving the MODIS Global Snow-Mapping Algorithm
NASA Technical Reports Server (NTRS)
Klein, Andrew G.; Hall, Dorothy K.; Riggs, George A.
1997-01-01
An algorithm (Snowmap) is under development to produce global snow maps at 500 meter resolution on a daily basis using data from the NASA MODIS instrument. MODIS, the Moderate Resolution Imaging Spectroradiometer, will be launched as part of the first Earth Observing System (EOS) platform in 1998. Snowmap is a fully automated, computationally frugal algorithm that will be ready to implement at launch. Forests represent a major limitation to the global mapping of snow cover as a forest canopy both obscures and shadows the snow underneath. Landsat Thematic Mapper (TM) and MODIS Airborne Simulator (MAS) data are used to investigate the changes in reflectance that occur as a forest stand becomes snow covered and to propose changes to the Snowmap algorithm that will improve snow classification accuracy in forested areas.
DOT National Transportation Integrated Search
1997-08-01
This handbook, 1997 Freeway Management Handbook, is an update of the 1983 Freeway Management Handbook and reflects the tremendous developments in computing and communications technology. It also reflects the importance of Integrated Transportation Ma...
Virtual gonio-spectrophotometer for validation of BRDF designs
NASA Astrophysics Data System (ADS)
Mihálik, Andrej; Ďurikovič, Roman
2011-10-01
Measurement of the appearance of an object consists of a group of measurements to characterize the color and surface finish of the object. This group of measurements involves the spectral energy distribution of propagated light measured in terms of reflectance and transmittance, and the spatial energy distribution of that light measured in terms of the bidirectional reflectance distribution function (BRDF). In this article we present the virtual gonio-spectrophotometer, a device that measures flux (power) as a function of illumination and observation. Virtual gonio-spectrophotometer measurements allow the determination of the scattering profile of specimens that can be used to verify the physical characteristics of the computer model used to simulate the scattering profile. Among the characteristics that we verify is the energy conservation of the computer model. A virtual gonio-spectrophotometer is utilized to find the correspondence between industrial measurements obtained from gloss meters and the parameters of a computer reflectance model.
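For reference, the quantity measured by such a device is the BRDF, and the energy-conservation check mentioned above corresponds to the standard directional-hemispherical bound:

```latex
f_r(\omega_i,\omega_o) \;=\; \frac{\mathrm{d}L_o(\omega_o)}{L_i(\omega_i)\,\cos\theta_i\,\mathrm{d}\omega_i},
\qquad
\int_{\Omega} f_r(\omega_i,\omega_o)\,\cos\theta_o\,\mathrm{d}\omega_o \;\le\; 1 \quad \text{for all } \omega_i ,
```

i.e. for any incident direction the reflected flux integrated over the outgoing hemisphere may not exceed the incident flux; a virtual gonio-spectrophotometer can evaluate this integral numerically for a candidate reflectance model.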
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heng, Kevin; Kitzmann, Daniel, E-mail: kevin.heng@csh.unibe.ch, E-mail: daniel.kitzmann@csh.unibe.ch
We present a novel generalization of the two-stream method of radiative transfer, which allows for the accurate treatment of radiative transfer in the presence of strong infrared scattering by aerosols. We prove that this generalization involves only a simple modification of the coupling coefficients and transmission functions in the hemispheric two-stream method. This modification originates from allowing the ratio of the first Eddington coefficients to depart from unity. At the heart of the method is the fact that this ratio may be computed once and for all over the entire range of values of the single-scattering albedo and scattering asymmetry factor. We benchmark our improved two-stream method by calculating the fraction of flux reflected by a single atmospheric layer (the reflectivity) and comparing these calculations to those performed using a 32-stream discrete-ordinates method. We further compare our improved two-stream method to the two-stream source function (16 streams) and delta-Eddington methods, demonstrating that it is often more accurate at the order-of-magnitude level. Finally, we illustrate its accuracy using a toy model of the early Martian atmosphere hosting a cloud layer composed of carbon dioxide ice particles. The simplicity of implementation and accuracy of our improved two-stream method renders it suitable for implementation in three-dimensional general circulation models. In other words, our improved two-stream method has the ease of implementation of a standard two-stream method, but the accuracy of a 32-stream method.
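For context, in the standard hemispheric two-stream method the coupling coefficients for single-scattering albedo omega_0 and asymmetry factor g take the familiar form (e.g., the hemispheric-mean closure)

```latex
\gamma_1 \;=\; 2 - \omega_0\,(1+g), \qquad \gamma_2 \;=\; \omega_0\,(1-g);
```

the generalization described above rescales such coefficients and the associated transmission functions through the ratio of first Eddington coefficients, whose exact form is given in the paper.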
NASA Astrophysics Data System (ADS)
Chang, Kuo-En; Hsiao, Ta-Chih; Hsu, N. Christina; Lin, Neng-Huei; Wang, Sheng-Hsiang; Liu, Gin-Rong; Liu, Chian-Yi; Lin, Tang-Huang
2016-08-01
In this study, an approach for determining the effective mixing weight of soot aggregates in dust-soot aerosols is proposed to improve the accuracy of retrieving the properties of polluted dust by means of satellite remote sensing. Based on a pre-computed database containing several variables (such as wavelength, refractive index, soot mixing weight, surface reflectivity, observation geometries and aerosol optical depth (AOD)), fan-shaped look-up tables can be drawn accordingly for determining the mixing weight, AOD and single scattering albedo (SSA) of polluted dust simultaneously, with auxiliary regional dust properties and surface reflectivity. To validate the performance of the approach, six case studies of polluted dust (dust-soot aerosols) in Lower Egypt and Israel were examined against ground-based measurements from the AErosol RObotic NETwork (AERONET). The results show that the mean absolute differences could be reduced from 32.95% to 6.56% in AOD and from 2.67% to 0.83% in SSA retrievals for MODIS aerosol products when referenced to AERONET measurements, demonstrating the soundness of the proposed approach under different levels of dust loading, mixing weight and surface reflectivity. Furthermore, the developed algorithm is capable of providing the spatial distribution of the mixing weights and removes the need to assume that the dust plume properties are uniform. The case study further shows that the spatially variant dust-soot mixing weight would improve the retrieval accuracy in AOD(mixture) and SSA(mixture) by about 10.0% and 1.4%, respectively.
Hermite regularization of the lattice Boltzmann method for open source computational aeroacoustics.
Brogi, F; Malaspinas, O; Chopard, B; Bonadonna, C
2017-10-01
The lattice Boltzmann method (LBM) is emerging as a powerful engineering tool for aeroacoustic computations. However, the LBM has been shown to present accuracy and stability issues in the medium-low Mach number range, which is of interest for aeroacoustic applications. Several solutions have been proposed but are often too computationally expensive, do not retain the simplicity and the advantages typical of the LBM, or are not described well enough to be usable by the community due to proprietary software policies. An original regularized collision operator is proposed, based on the expansion of Hermite polynomials, that greatly improves the accuracy and stability of the LBM without significantly altering its algorithm. The regularized LBM can be easily coupled with both non-reflective boundary conditions and a multi-level grid strategy, essential ingredients for aeroacoustic simulations. Excellent agreement was found between this approach and both experimental and numerical data on two different benchmarks: the laminar, unsteady flow past a 2D cylinder and the 3D turbulent jet. Finally, most of the aeroacoustic computations with LBM have been done with commercial software, while here the entire theoretical framework is implemented using an open source library (palabos).
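For context, the standard (second-order) Hermite regularization replaces the non-equilibrium populations by their projection onto the second-order Hermite polynomial before collision; the operator proposed in the abstract extends this idea, so the expressions below are the baseline form rather than the authors' new operator:

```latex
f_i^{\mathrm{neq,reg}} \;=\; \frac{w_i}{2\,c_s^{4}}\;\boldsymbol{H}_i^{(2)} : \boldsymbol{\Pi}^{(1)},
\qquad
\boldsymbol{\Pi}^{(1)} \;=\; \sum_i \left(f_i - f_i^{\mathrm{eq}}\right)\boldsymbol{c}_i\,\boldsymbol{c}_i,
\qquad
\boldsymbol{H}_i^{(2)} \;=\; \boldsymbol{c}_i\,\boldsymbol{c}_i - c_s^{2}\,\boldsymbol{I},
```

after which the BGK collision acts on f_i^eq + f_i^neq,reg, filtering out the higher-order non-hydrodynamic content that drives the instabilities mentioned above.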
Proposal for hierarchical description of software systems
NASA Technical Reports Server (NTRS)
Thauboth, H.
1973-01-01
The programming of digital computers has developed into a new dimension full of difficulties, because the hardware of computers has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward a further increase in the sophistication of computer applications and consequently of software. To obtain better visibility into software systems and to improve their structure for better testing, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend the present methods in order to obtain documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail, without losing precision of expression. This is done by the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automated updating of significant portions of the documentation for better software change control.
NASA Technical Reports Server (NTRS)
Garcia, J.; Dauser, T.; Reynolds, C. S.; Kallman, T. R.; McClintock, J. E.; Wilms, J.; Ekmann, W.
2013-01-01
We present a new and complete library of synthetic spectra for modeling the component of emission that is reflected from an illuminated accretion disk. The spectra were computed using an updated version of our code xillver that incorporates new routines and a richer atomic data base. We offer in the form of a table model an extensive grid of reflection models that cover a wide range of parameters. Each individual model is characterized by the photon index Gamma of the illuminating radiation, the ionization parameter zeta at the surface of the disk (i.e., the ratio of the X-ray flux to the gas density), and the iron abundance A(sub Fe) relative to the solar value. The ranges of the parameters covered are: 1.2 <= Gamma <= 3.4, 1 <= zeta <= 10(exp 4), and 0.5 <= A(sub Fe) <= 10. These ranges capture the physical conditions typically inferred from observations of active galactic nuclei, and also stellar-mass black holes in the hard state. This library is intended for use when the thermal disk flux is faint compared to the incident power-law flux. The models are expected to provide an accurate description of the Fe K emission line, which is the crucial spectral feature used to measure black hole spin. A total of 720 reflection spectra are provided in a single FITS file suitable for the analysis of X-ray observations via the atable model in xspec. Detailed comparisons with previous reflection models illustrate the improvements incorporated in this version of xillver.
Improving Fidelity of Launch Vehicle Liftoff Acoustic Simulations
NASA Technical Reports Server (NTRS)
Liever, Peter; West, Jeff
2016-01-01
Launch vehicles experience high acoustic loads during ignition and liftoff affected by the interaction of rocket plume generated acoustic waves with launch pad structures. Application of highly parallelized Computational Fluid Dynamics (CFD) analysis tools optimized for application on the NAS computer systems such as the Loci/CHEM program now enable simulation of time-accurate, turbulent, multi-species plume formation and interaction with launch pad geometry and capture the generation of acoustic noise at the source regions in the plume shear layers and impingement regions. These CFD solvers are robust in capturing the acoustic fluctuations, but they are too dissipative to accurately resolve the propagation of the acoustic waves throughout the launch environment domain along the vehicle. A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed to improve such liftoff acoustic environment predictions. The framework combines the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin (DG) solver, Loci/THRUST, developed in the same computational framework. Loci/THRUST employs a low dissipation, high-order, unstructured DG method to accurately propagate acoustic waves away from the source regions across large distances. The DG solver is currently capable of solving up to 4th order solutions for non-linear, conservative acoustic field propagation. Higher order boundary conditions are implemented to accurately model the reflection and refraction of acoustic waves on launch pad components. The DG solver accepts generalized unstructured meshes, enabling efficient application of common mesh generation tools for CHEM and THRUST simulations. The DG solution is coupled with the CFD solution at interface boundaries placed near the CFD acoustic source regions. Both simulations are executed simultaneously with coordinated boundary condition data exchange.
Reduction of lighting energy consumption in office buildings through improved daylight design
NASA Astrophysics Data System (ADS)
Papadouri, Maria Violeta Prado
This study investigates lighting energy consumption in office buildings and the options for its reduction. One way to reduce lighting energy consumption is to improve the daylight design. Better use of daylight in buildings can result from efforts in several directions: the building's fabric and layout, the materials, and even the furniture in a space influence daylight quality considerably. The development of more efficient lighting technology also plays a very important role, in particular electric lighting control systems such as photosensors and occupancy sensors, both of which ensure that electric light is not used unnecessarily. Since the focus of this study is to find ways to improve daylight use in buildings, a consequent question is which methods are available to achieve this. The accuracy of the methodology used is also an important issue for achieving reliable results. The methodology applied in this study includes the analysis of a case study through field measurements and computer simulations. The first stage included gathering information about the lighting design of the building and monitoring the light levels from both natural and electric lighting. The second stage involved testing, with computer simulations, different parameters that were expected to improve the daylight exploitation of the specific area. The results of the field measurements showed that the main problems of the space were the low natural light levels and the poor daylight distribution. The annual electric lighting energy consumption, as calculated with the computer simulations, corresponded to the annual energy consumption of a typical air-conditioned prestige office building (Energy Consumption Guide 19, Energy Use in Offices, 2000). After several computer simulations, the results showed that the initial design parameters of the building can affect the lighting energy consumption of the space significantly. On the other hand, relatively small changes, such as changing the reflectance of the surfaces and the lighting control systems, can make an even greater difference to the light quality of the space and the reduction of lighting energy consumption.
At-risk children's use of reflection and revision in hands-on experimental activities
NASA Astrophysics Data System (ADS)
Petrosino, Anthony J., Jr.
The goal of this study was to investigate the effects of incorporating opportunities for reflection and revision in hands-on science instruction which emphasized experimentation using model rockets. The participants were low achieving sixth grade summer school students (n = 23) designated as at-risk for school failure by their district. The group was asked a series of interview questions based on work by Schauble et al. (1995) relating to experimentation. The interviews took place over three distinct time points corresponding to a "hands-on only" condition, a "hands-on with reflection and revision" condition and a "hands-on with repeated reflection and revision" condition. A Friedman two-way analysis of variance by ranks indicates that students scored low at first with traditional hands-on instruction but improved significantly with opportunities to reflect on and revise their experiments. In addition, a sociocultural analysis was conducted during the summer school session to assess the model rocket activity as an apprenticeship, as guided participation and as participatory appropriation, using a framework established by Rogoff (1994). Finally, a survey (the Classroom Environment Survey) was administered to the students, measuring five constructs consistent with a constructivist classroom: participation, autonomy, relevance, commitment to learning and disruptions to learning. Analyses indicate that students in the summer school model rocket intervention experienced a greater sense of constructivist principles during the activity than a similar comparison group that used reform-minded instruction but did not include opportunities for reflection and revision cycles. This research provides important evidence that, like scientists, students in school can learn effectively from extended practice in a varied context. Importantly, the data indicate that hands-on instruction is best utilized when opportunities for reflection and revision are made explicit. Implications are discussed for designing instruction, incorporating computer-supported scaffolding, and future research.
Lungu, Angela; Swift, Andrew J; Capener, David; Kiely, David; Hose, Rod; Wild, Jim M
2016-06-01
Accurately identifying patients with pulmonary hypertension (PH) using noninvasive methods is challenging, and right heart catheterization (RHC) is the gold standard. Magnetic resonance imaging (MRI) has been proposed as an alternative to echocardiography and RHC in the assessment of cardiac function and pulmonary hemodynamics in patients with suspected PH. The aim of this study was to assess whether machine learning using computational modeling techniques and image-based metrics of PH can improve the diagnostic accuracy of MRI in PH. Seventy-two patients with suspected PH attending a referral center underwent RHC and MRI within 48 hours. Fifty-seven patients were diagnosed with PH, and 15 had no PH. A number of functional and structural cardiac and cardiovascular markers derived from 2 mathematical models and also solely from MRI of the main pulmonary artery and heart were integrated into a classification algorithm to investigate the diagnostic utility of the combination of the individual markers. A physiological marker based on the quantification of wave reflection in the pulmonary artery was shown to perform best individually, but optimal diagnostic performance was found by the combination of several image-based markers. Classifier results, validated using leave-one-out cross validation, demonstrated that combining computation-derived metrics reflecting hemodynamic changes in the pulmonary vasculature with measurement of right ventricular morphology and function, in a decision support algorithm, provides a method to noninvasively diagnose PH with high accuracy (92%). The high diagnostic accuracy of these MRI-based model parameters may reduce the need for RHC in patients with suspected PH.
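A minimal sketch of the leave-one-out validation strategy described above, using scikit-learn with a plain logistic-regression classifier and random placeholder data in place of the study's MRI- and model-derived markers; the study's actual decision-support algorithm may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

# X: one row per patient with image- and model-derived markers (e.g. a
# wave-reflection index, RV volumes and function); y: 1 = PH at RHC, 0 = no PH.
# Random placeholders stand in for the real 72-patient dataset.
rng = np.random.default_rng(0)
X = rng.standard_normal((72, 5))
y = (rng.random(72) < 0.8).astype(int)

scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                         cv=LeaveOneOut(), scoring="accuracy")
print("leave-one-out accuracy:", scores.mean())
```

With leave-one-out folds each test set contains a single patient, so the mean fold accuracy is exactly the overall diagnostic accuracy reported in such studies.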
Realistic tissue visualization using photoacoustic image
NASA Astrophysics Data System (ADS)
Cho, Seonghee; Managuli, Ravi; Jeon, Seungwan; Kim, Jeesu; Kim, Chulhong
2018-02-01
Visualization methods are very important in biomedical imaging. As a technology for understanding living systems, biomedical imaging has the unique advantage of providing highly intuitive information in the image, and this advantage can be greatly enhanced by choosing a suitable visualization method. The situation is more complicated for volumetric data. Volume data have the advantage of containing 3D spatial information; unfortunately, the data themselves cannot directly convey that value. Because images are always displayed in 2D space, visualization is the key step that creates the real value of volume data. However, image processing of 3D data requires complicated algorithms for visualization and a high computational burden, so specialized algorithms and computational optimization are important issues for volume data. Photoacoustic imaging is a unique imaging modality that can visualize the optical properties of deep tissue. Because the color of an organism is mainly determined by its light-absorbing components, photoacoustic data can provide color information of tissue that is closer to the real tissue color. In this research, we developed realistic tissue visualization using acoustic-resolution photoacoustic volume data. To achieve realistic visualization, we designed a specialized color transfer function that depends on the depth of the tissue from the skin. We used a direct ray-casting method and processed color while computing the shader parameters. In the rendering results, we succeeded in obtaining realistic tissue-like texture from the photoacoustic data: surface-reflected rays were visualized in white, and the color reflected from deep tissue was visualized in red, like skin tissue. We also implemented the CUDA algorithm in an OpenGL environment for real-time interactive imaging.
Use of Failure in IS Development Statistics: Lessons for IS Curriculum Design
ERIC Educational Resources Information Center
Longenecker, Herbert H., Jr.; Babb, Jeffry; Waguespack, Leslie; Tastle, William; Landry, Jeff
2016-01-01
The evolution of computing education reflects the history of the professional practice of computing. Keeping computing education current has been a major challenge due to the explosive advances in technologies. Academic programs in Information Systems, a long-standing computing discipline, develop and refine the theory and practice of computing…
Palamar, Borys I; Vaskivska, Halyna O; Palamar, Svitlana P
This article addresses the significance of computer equipment for organizing cooperation between professors and future specialists. Such subject-subject interaction may be directed toward forming the professional skills of future specialists. By using information and communication technologies in the education system, a range of didactic tasks can be addressed: improving the teaching of subjects in higher education, supporting the self-directed learning of future specialists, motivating learning and self-learning, and developing reflection in the learning process. The authors consider computer equipment an instrument for developing the intellectual skills, potential, and willingness of future specialists to solve communicative and communication tasks and problems on a creative basis. Based on the research results, the authors draw conclusions about the effectiveness of using computer technologies in teaching future specialists and in their self-directed learning. Inadequate provision of computer equipment in higher education institutions, the lack of appropriate educational programs, and professors' poor knowledge and use of computers have a negative impact on the organization of teaching. Computer equipment and ICT in general are instruments for developing the intellectual skills, potential, and willingness of future specialists to solve communicative and communication tasks and problems. Thus, forming a psychosocial environment for the development of future specialists is a multifaceted, complex, and didactically important issue.
Computer image processing: Geologic applications
NASA Technical Reports Server (NTRS)
Abrams, M. J.
1978-01-01
Computer image processing of digital data was performed to support several geological studies. The specific goals were to: (1) relate the mineral content to the spectral reflectance of certain geologic materials, (2) determine the influence of environmental factors, such as atmosphere and vegetation, and (3) improve image processing techniques. For detection of spectral differences related to mineralogy, the technique of band ratioing was found to be the most useful. The influence of atmospheric scattering and methods to correct for the scattering were also studied. Two techniques were used to correct for atmospheric effects: (1) dark object subtraction, and (2) normalization using ground spectral measurements. Of the two, the first technique proved to be the more successful for removing the effects of atmospheric scattering. A digital mosaic was produced from two side-lapping LANDSAT frames. The advantages were that the same enhancement algorithm could be applied to both frames, and there was no seam where the two images were joined.
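As a loose illustration of the two correction and enhancement techniques named above (dark-object subtraction and band ratioing), the sketch below applies them to a hypothetical two-band scene; the use of a low percentile as the dark-object value and the band names are simplifying assumptions.

```python
import numpy as np

def dark_object_subtraction(band):
    """Remove a constant atmospheric-scattering offset estimated from the darkest pixels."""
    haze = np.percentile(band, 0.1)          # assumed dark-object radiance
    return np.clip(band - haze, 0, None)

def band_ratio(band_a, band_b, eps=1e-6):
    """Ratio image used to suppress topographic shading and highlight spectral differences."""
    return band_a / (band_b + eps)

# toy 2-band scene (digital numbers), purely illustrative
b4 = np.random.randint(20, 200, (512, 512)).astype(float)
b5 = np.random.randint(20, 200, (512, 512)).astype(float)
ratio = band_ratio(dark_object_subtraction(b4), dark_object_subtraction(b5))
```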
Laser Signature Prediction Using The VALUE Computer Program
NASA Astrophysics Data System (ADS)
Akerman, Alexander; Hoffman, George A.; Patton, Ronald
1989-09-01
A variety of enhancements are being made to the 1976-vintage LASERX computer code. These include: surface characterization with BRDF tabular data; specular reflection from transparent surfaces; generation of glint direction maps; generation of relative-range imagery; an interface to the LOWTRAN atmospheric transmission code; an interface to the LEOPS laser sensor code; and user-friendly menu prompting for easy setup. Versions of VALUE have been written for both VAX/VMS and PC/DOS computer environments. Outputs have also been revised to be user friendly and include tables, plots, and images for (1) intensity, (2) cross section, (3) reflectance, (4) relative range, (5) region type, and (6) silhouette.
NASA Astrophysics Data System (ADS)
Shortell, Matthew P.; Althomali, Marwan A. M.; Wille, Marie-Luise; Langton, Christian M.
2017-11-01
We demonstrate a simple technique for quantitative ultrasound imaging of the cortical shell of long bone replicas. Traditional ultrasound computed tomography instruments use the transmitted or reflected waves for separate reconstructions but suffer from strong refraction artefacts in highly heterogeneous samples such as bones in soft tissue. The technique described here simplifies the long bone to a two-component composite and uses both the transmitted and reflected waves for reconstruction, allowing the speed of sound and thickness of the cortical shell to be calculated accurately. The technique is simple to implement and computationally inexpensive, and sample positioning errors are minimal.
Reflection Patterns Generated by Condensed-Phase Oblique Detonation Interaction with a Rigid Wall
NASA Astrophysics Data System (ADS)
Short, Mark; Chiquete, Carlos; Bdzil, John; Meyer, Chad
2017-11-01
We examine numerically the wave reflection patterns generated by a detonation in a condensed phase explosive inclined obliquely but traveling parallel to a rigid wall as a function of incident angle. The problem is motivated by the characterization of detonation-material confiner interactions. We compare the reflection patterns for two detonation models, one where the reaction zone is spatially distributed, and the other where the reaction is instantaneous (a Chapman-Jouguet detonation). For the Chapman-Jouguet model, we compare the results of the computations with an asymptotic study recently conducted by Bdzil and Short for small detonation incident angles. We show that the ability of a spatially distributed reaction energy release to turn flow streamlines has a significant impact on the nature of the observed reflection patterns. The computational approach uses a shock-fit methodology.
Evaluation of several non-reflecting computational boundary conditions for duct acoustics
NASA Technical Reports Server (NTRS)
Watson, Willie R.; Zorumski, William E.; Hodge, Steve L.
1994-01-01
Several non-reflecting computational boundary conditions that meet certain criteria and have potential applications to duct acoustics are evaluated for their effectiveness. The same interior solution scheme, grid, and order of approximation are used to evaluate each condition. Sparse matrix solution techniques are applied to solve the matrix equation resulting from the discretization. Modal series solutions for the sound attenuation in an infinite duct are used to evaluate the accuracy of each non-reflecting boundary condition. The evaluations are performed for sound propagation in a softwall duct, for several sources, sound frequencies, and duct lengths. It is shown that a recently developed nonlocal boundary condition leads to considerably more accurate sound attenuation predictions for short ducts, which in turn allows a substantial reduction in the number of grid points compared with other non-reflecting conditions.
Digital Video for Fostering Self-Reflection in an ePortfolio Environment
ERIC Educational Resources Information Center
Cheng, Gary; Chau, Juliana
2009-01-01
The ability to self-reflect is widely recognized as a desirable learner attribute that can induce deep learning. Advances in computer-mediated communication technologies have led to intense interest in higher education in exploring the potential of digital tools, particularly digital video, for fostering self-reflection. While there are reports…
NASA Technical Reports Server (NTRS)
Dave, J. V.
1976-01-01
Two computer algorithms are described. These algorithms were used for computing the azimuth-independent component of the intensity of the monochromatic radiation emerging at the top of a pseudo-spherical atmosphere with an arbitrary vertical distribution of ozone, and with any arbitrary height distribution of up to two different kinds of aerosol. This atmospheric model was assumed to rest on a surface obeying Lambert's law of reflection.
Advances in edge-diffraction modeling for virtual-acoustic simulations
NASA Astrophysics Data System (ADS)
Calamia, Paul Thomas
In recent years there has been growing interest in modeling sound propagation in complex, three-dimensional (3D) virtual environments. With diverse applications for the military, the gaming industry, psychoacoustics researchers, architectural acousticians, and others, advances in computing power and 3D audio-rendering techniques have driven research and development aimed at closing the gap between the auralization and visualization of virtual spaces. To this end, this thesis focuses on improving the physical and perceptual realism of sound-field simulations in virtual environments through advances in edge-diffraction modeling. To model sound propagation in virtual environments, acoustical simulation tools commonly rely on geometrical-acoustics (GA) techniques that assume asymptotically high frequencies, large flat surfaces, and infinitely thin ray-like propagation paths. Such techniques can be augmented with diffraction modeling to compensate for the effect of surface size on the strength and directivity of a reflection, to allow for propagation around obstacles and into shadow zones, and to maintain soundfield continuity across reflection and shadow boundaries. Using a time-domain, line-integral formulation of the Biot-Tolstoy-Medwin (BTM) diffraction expression, this thesis explores various aspects of diffraction calculations for virtual-acoustic simulations. Specifically, we first analyze the periodic singularity of the BTM integrand and describe the relationship between the singularities and higher-order reflections within wedges with open angle less than 180°. Coupled with analytical approximations for the BTM expression, this analysis allows for accurate numerical computations and a continuous sound field in the vicinity of an arbitrary wedge geometry insonified by a point source. Second, we describe an edge-subdivision strategy that allows for fast diffraction calculations with low error relative to a numerically more accurate solution. Third, to address the considerable increase in propagation paths due to diffraction, we describe a simple procedure for identifying and culling insignificant diffraction components during a virtual-acoustic simulation. Finally, we present a novel method to find GA components using diffraction parameters that ensures continuity at reflection and shadow boundaries.
Fast solar radiation pressure modelling with ray tracing and multiple reflections
NASA Astrophysics Data System (ADS)
Li, Zhen; Ziebart, Marek; Bhattarai, Santosh; Harrison, David; Grey, Stuart
2018-05-01
Physics-based SRP (Solar Radiation Pressure) models using ray tracing methods are powerful tools when modelling the forces on complex real-world space vehicles. Currently, high-resolution (1 mm) ray tracing with secondary intersections is done on high-performance computers at UCL (University College London). This study introduces the BVH (Bounding Volume Hierarchy) into the ray tracing approach for physics-based SRP modelling and makes it possible to run high-resolution analysis on personal computers. The ray tracer is both general and efficient enough to cope with the complex shapes of satellites and multiple reflections (three or more, with no upper limit). In this study, the traditional ray tracing technique is introduced first and the BVH is then integrated into the ray tracing. Four aspects of the ray tracer were tested to investigate its performance: runtime, accuracy, the effects of multiple reflections, and the effects of pixel array resolution. Runtime tests on GPS IIR and Galileo IOV (In Orbit Validation) satellites show that the BVH can make the force model computation 30-50 times faster. The ray tracer has an absolute accuracy of several nanonewtons, established by comparing the test results for spheres and planes with analytical computations. The multiple-reflection effects are investigated both in the number of intersections and in the acceleration on GPS IIR, Galileo IOV and Sentinel-1 spacecraft. Considering the number of intersections, the 3rd reflection captures 99.12%, 99.14%, and 91.34% of the total reflections for the GPS IIR and Galileo IOV satellite buses and the Sentinel-1 spacecraft, respectively. In terms of the multiple-reflection effects on the acceleration, the secondary reflection effect for the Galileo IOV satellite and Sentinel-1 can reach 0.2 nm/s² and 0.4 nm/s², respectively. The error percentages in the acceleration magnitudes show that the 3rd reflection should be considered in order to keep the error below 0.035%. The pixel array resolution tests show that the dimensions of the components have to be considered when choosing the pixel spacing, so as not to miss some components of the satellite in ray tracing. This paper presents the first systematic and quantitative study of the secondary and higher-order intersection effects. It shows conclusively that the effect is non-negligible for certain classes of mission.
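The following is not the UCL ray tracer, only a hedged flat-plate sketch of the per-surface momentum bookkeeping (absorption plus one specular bounce) that a pixel-array SRP ray tracer accumulates; the plate geometry, reflectivities, and nominal 1 AU flux are illustrative, and shadowing and multiple reflections are deliberately omitted.

```python
import numpy as np

SOLAR_FLUX = 1361.0          # W/m^2 at 1 AU (nominal value)
C_LIGHT = 299_792_458.0      # m/s

def flat_plate_srp(normals, areas, spec_refl, sun_dir):
    """Force on a set of flat plates from absorbed light plus one specular bounce.
    No shadowing or multiple reflections; those are what the ray tracer adds."""
    s = np.asarray(sun_dir, float)
    s /= np.linalg.norm(s)                      # direction of photon travel
    P = SOLAR_FLUX / C_LIGHT                    # radiation pressure [N/m^2]
    F = np.zeros(3)
    for n, A, rho in zip(normals, areas, spec_refl):
        n = np.asarray(n, float) / np.linalg.norm(n)
        cos_t = -np.dot(s, n)                   # plate is lit only if it faces the Sun
        if cos_t <= 0.0:
            continue
        # absorbed fraction pushes along s; specular fraction pushes along -n
        F += P * A * cos_t * ((1.0 - rho) * s - 2.0 * rho * cos_t * n)
    return F

# two-plate toy "bus + panel", purely illustrative
F = flat_plate_srp(normals=[[0, 0, 1], [0, 1, 0]],
                   areas=[2.0, 10.0],
                   spec_refl=[0.2, 0.1],
                   sun_dir=[0, -0.3, -1])
print(F, np.linalg.norm(F))   # force in newtons, of order tens of micronewtons
```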
Computational Aeroacoustics: An Overview
NASA Technical Reports Server (NTRS)
Tam, Christopher K. W.
2003-01-01
An overview of recent advances in computational aeroacoustics (CAA) is presented. CAA algorithms must be neither dispersive nor dissipative: they should propagate waves supported by the Euler equations with the correct group velocities. Computation domains are inevitably finite in size. To avoid the reflection of acoustic and other outgoing waves at the boundaries of the computation domain, special boundary conditions must be imposed in the boundary region. These boundary conditions either absorb all the outgoing waves without reflection or allow the waves to exit smoothly. High-order schemes invariably support spurious short waves. These spurious waves tend to pollute the numerical solution, so they must be selectively damped or filtered out. All these issues and the relevant computational methods are briefly reviewed. Jet screech tones are known to have caused structural fatigue in military combat aircraft. Numerical simulation of the jet screech phenomenon is presented as an example of a successful application of CAA.
Binary phase digital reflection holograms - Fabrication and potential applications
NASA Technical Reports Server (NTRS)
Gallagher, N. C., Jr.; Angus, J. C.; Coffield, F. E.; Edwards, R. V.; Mann, J. A., Jr.
1977-01-01
A novel technique for the fabrication of binary-phase computer-generated reflection holograms is described. By use of integrated circuit technology, the holographic pattern is etched into a silicon wafer and then aluminum coated to make a reflection hologram. Because these holograms reflect virtually all the incident radiation, they may find application in machining with high-power lasers. A number of possible modifications of the hologram fabrication procedure are discussed.
Shading of a computer-generated hologram by zone plate modulation.
Kurihara, Takayuki; Takaki, Yasuhiro
2012-02-13
We propose a hologram calculation technique that enables reconstructing a shaded three-dimensional (3D) image. The amplitude distributions of zone plates, which generate the object points that constitute a 3D object, were two-dimensionally modulated. Two-dimensional (2D) amplitude modulation was determined on the basis of the Phong reflection model developed for computer graphics, which considers the specular, diffuse, and ambient reflection light components. The 2D amplitude modulation added variable and constant modulations: the former controlled the specular light component and the latter controlled the diffuse and ambient components. The proposed calculation technique was experimentally verified. The reconstructed image showed specular reflection that varied depending on the viewing position.
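For reference, the sketch below evaluates the standard Phong terms (ambient, diffuse, specular) that the zone-plate amplitude modulation is based on; the vectors, material coefficients, and shininess exponent are illustrative, and the paper's mapping from intensity to zone-plate amplitude is not reproduced.

```python
import numpy as np

def normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def phong_intensity(n, l, v, ka=0.1, kd=0.6, ks=0.3, shininess=20.0):
    """Classic Phong model: ambient + diffuse + specular terms for one point light."""
    n, l, v = normalize(n), normalize(l), normalize(v)
    diff = max(np.dot(n, l), 0.0)
    r = 2.0 * np.dot(n, l) * n - l          # mirror reflection of the light direction
    spec = max(np.dot(r, v), 0.0) ** shininess if diff > 0 else 0.0
    return ka + kd * diff + ks * spec

# the specular term peaks when the viewpoint lines up with the mirror direction
print(phong_intensity(n=[0, 0, 1], l=[0, 1, 1], v=[0, 0, 1]))
print(phong_intensity(n=[0, 0, 1], l=[0, 1, 1], v=[0, -1, 1]))
```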
NASA Astrophysics Data System (ADS)
Coffman, Mitchell Ward
The purpose of this dissertation was to examine the relationship between student access to a computer at home and academic achievement. The 2009 National Assessment of Educational Progress (NAEP) dataset was probed using the National Data Explorer (NDE) to investigate correlations in the subsets of SES, parental education, race, and gender as they relate to access to a home computer and improved performance scores for U.S. public school grade 12 science students. A causal-comparative approach was employed to seek clarity on the relationship between home access and performance scores. The influence of home access cannot overcome the challenges faced by students of lower SES. The achievement gap, or second digital divide, for underprivileged classes of students, including minorities, does not appear to contract with student access to a home computer. Nonetheless, in tests for significance, statistically significant improvement in science performance scores was reported for those having access to a computer at home compared with those not having access. Additionally, regression models showed evidence of correlations between and among subsets of controls for the demographic factors gender, race, and socioeconomic status. Variability in these correlations was high, suggesting that unobserved factors may have more impact upon the dependent variable. Having access to a computer at home increases performance scores for grade 12 general science students of all races, genders, and socioeconomic levels. However, this performance gap is roughly equivalent to the existing gap in the national average for science scores, suggesting little influence from access to a computer on academic achievement. The variability of scores in the regression analysis models reflects a moderate to low effect, suggesting an absence of causation. These results confirm the literature review: access to a computer at home and the other predictor variables were found to have a statistically significant association with performance scores, although the data suggest that home computer access is less influential on performance scores than poverty and its correlates.
Reflective Practice: Creating Capacities for School Improvement.
ERIC Educational Resources Information Center
Montie, Jo; York-Barr, Jennifer; Kronberg, Robi; Stevenson, Jane; Vallejo, Barb; Lunders, Cheri
This monograph addresses the importance of and strategies for improving education through reflective practice, defined as cognitive processes and an open perspective that involve conscious self-examination in order to gain understandings and improve the lives of students. Chapter 1 provides an overview and explains origins of reflective practice…
NASA Astrophysics Data System (ADS)
Cao, Bin; Liao, Ningfang; Li, Yasheng; Cheng, Haobo
2017-05-01
The use of spectral reflectance as fundamental color information finds application in diverse fields related to imaging. Many approaches use training sets to train the algorithm used for color classification. In this context, we note that the modification of training sets obviously impacts the accuracy of reflectance reconstruction based on classical reflectance reconstruction methods. Different modifying criteria are not always consistent with each other, since they have different emphases; spectral reflectance similarity focuses on the deviation of reconstructed reflectance, whereas colorimetric similarity emphasizes human perception. We present a method to improve the accuracy of the reconstructed spectral reflectance by adaptively combining colorimetric and spectral reflectance similarities. The different exponential factors of the weighting coefficients were investigated. The spectral reflectance reconstructed by the proposed method exhibits considerable improvements in terms of the root-mean-square error and goodness-of-fit coefficient of the spectral reflectance errors as well as color differences under different illuminants. Our method is applicable to diverse areas such as textiles, printing, art, and other industries.
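A minimal sketch of training-set-based reflectance reconstruction under an assumed adaptive weighting: a first-pass least-squares estimate supplies a spectral-similarity term, which is blended with a colorimetric (response-space) distance to weight the training samples before a second regression. The blending form, exponents, and toy data are assumptions; the paper's exact weighting scheme is not reproduced.

```python
import numpy as np

def reconstruct(responses_train, reflectances_train, response_target, p=2.0, q=2.0):
    """Two-pass weighted regression blending colorimetric and spectral similarity."""
    # first pass: ordinary least-squares mapping from responses to reflectances
    M0, *_ = np.linalg.lstsq(responses_train, reflectances_train, rcond=None)
    r0 = response_target @ M0                              # rough reflectance estimate

    d_col = np.linalg.norm(responses_train - response_target, axis=1)   # colorimetric proxy
    d_spec = np.sqrt(np.mean((reflectances_train - r0) ** 2, axis=1))   # spectral RMSE
    w = 1.0 / (1e-6 + d_col ** p) + 1.0 / (1e-6 + d_spec ** q)          # assumed blend

    W = np.sqrt(w)[:, None]                                # weighted least squares
    M, *_ = np.linalg.lstsq(W * responses_train, W * reflectances_train, rcond=None)
    return response_target @ M

# toy data: 200 training samples, 6 camera channels, 31 spectral bands
rng = np.random.default_rng(0)
R = rng.random((200, 31))
C = R @ rng.random((31, 6))
print(reconstruct(C, R, C[0]).shape)        # -> (31,)
```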
Combining computational models, semantic annotations and simulation experiments in a graph database
Henkel, Ron; Wolkenhauer, Olaf; Waltemath, Dagmar
2015-01-01
Model repositories such as the BioModels Database, the CellML Model Repository or JWS Online are frequently accessed to retrieve computational models of biological systems. However, their storage concepts support only restricted types of queries and not all data inside the repositories can be retrieved. In this article we present a storage concept that meets this challenge. It grounds on a graph database, reflects the models’ structure, incorporates semantic annotations and simulation descriptions and ultimately connects different types of model-related data. The connections between heterogeneous model-related data and bio-ontologies enable efficient search via biological facts and grant access to new model features. The introduced concept notably improves the access of computational models and associated simulations in a model repository. This has positive effects on tasks such as model search, retrieval, ranking, matching and filtering. Furthermore, our work for the first time enables CellML- and Systems Biology Markup Language-encoded models to be effectively maintained in one database. We show how these models can be linked via annotations and queried. Database URL: https://sems.uni-rostock.de/projects/masymos/ PMID:25754863
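The toy sketch below is not the Neo4j-backed implementation described in the article; it only illustrates the idea of linking models, ontology annotations, and simulation descriptions in a graph and querying by biological term, and all node names are made up.

```python
import networkx as nx

# Toy graph linking models, ontology terms, and simulation setups (names are invented).
G = nx.MultiDiGraph()
G.add_edge("BIOMD0000000012", "GO:0006096", kind="annotated_with")   # glycolysis term
G.add_edge("model_cellml_42", "GO:0006096", kind="annotated_with")
G.add_edge("SEDML_experiment_7", "BIOMD0000000012", kind="simulates")

def models_for_term(graph, term):
    """Return every model node that carries the given ontology annotation."""
    return [u for u, v, d in graph.edges(data=True)
            if v == term and d.get("kind") == "annotated_with"]

print(models_for_term(G, "GO:0006096"))
```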
Tangible display systems: direct interfaces for computer-based studies of surface appearance
NASA Astrophysics Data System (ADS)
Darling, Benjamin A.; Ferwerda, James A.
2010-02-01
When evaluating the surface appearance of real objects, observers engage in complex behaviors involving active manipulation and dynamic viewpoint changes that allow them to observe the changing patterns of surface reflections. We are developing a class of tangible display systems to provide these natural modes of interaction in computer-based studies of material perception. A first-generation tangible display was created from an off-the-shelf laptop computer containing an accelerometer and webcam as standard components. Using these devices, custom software estimated the orientation of the display and the user's viewing position. This information was integrated with a 3D rendering module so that rotating the display or moving in front of the screen would produce realistic changes in the appearance of virtual objects. In this paper, we consider the design of a second-generation system to improve the fidelity of the virtual surfaces rendered to the screen. With a high-quality display screen and enhanced tracking and rendering capabilities, a second-generation system will be better able to support a range of appearance perception applications.
NASA Technical Reports Server (NTRS)
Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug
2005-01-01
Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but they do not have access for modifying analysis criteria: that access is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, making it possible to improve assessments without the need to rewrite computer code or rehire experts, and thereby further reducing the cost of maintaining and upgrading the software. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.
Reflective mulch enhances ripening and health compounds in apple fruit.
Overbeck, Verena; Schmitz-Eiberger, Michaela A; Blanke, Michael M
2013-08-15
The objective of the study was to improve fruit quality, including health compounds, by improving light utilization for fruit crops under hail net. Four reflective mulches, including plastics such as Extenday® and a bio-degradable paper, were spread in the alleyways of a cv. 'Gala Mondial' apple orchard on 10 August 2010, five weeks before the anticipated harvest. Reflective mulch affected neither fruit firmness nor sugar, but it accelerated starch breakdown, indicative of riper fruit (smaller Streif index), compared with the uncovered grass alleyway (control). Reflective mulches also improved fruit quality such as the red coloration of cv. 'Gala Mondial' apples, owing to significantly enhanced flavonoids and anthocyanins. Flavonoids increased by up to 52.4% in the Extenday® treatment (29.2 nmol cm(-2) fruit peel in the grass control versus 44.5 nmol cm(-2) with reflective mulch). Similarly, reflective mulch improved the anthocyanin content in cv. 'Gala Mondial' peel by up to 66% compared with the grass control (14.5 nmol cm(-2) in control fruit versus 24.1 nmol cm(-2) with reflective mulch). The reflective mulch did not affect the chlorophyll and carotenoid content of the 'Gala' fruit peel. Overall, the application of reflective mulches improved fruit quality in terms of better coloration and health compounds and accelerated ripening, leading to higher market value. © 2013 Society of Chemical Industry.
Using MCNP6 to Estimate Fission Neutron Properties of a Reflected Plutonium Sphere
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Alexander Rich; Nelson, Mark Andrew; Hutchinson, Jesson D.
The purpose of this project was to determine the fission multiplicity distribution, p(v), for the Beryllium Reflected Plutonium (BeRP) ball and to determine whether or not it changed appreciably for various High Density Polyethylene (HDPE) reflected configurations. The motivation for this project was to determine whether or not the average number of neutrons emitted per fission, v, changed significantly enough to reduce the discrepancy between MCNP6 and Robba, Dowdy, Atwater (RDA) point kinetic model estimates of multiplication. The energy spectrum of neutrons that induced fissions in the BeRP ball, NIF(E), was also computed in order to determine the average energy of neutrons inducing fissions, NIF. p(v) was computed using the FMULT card, NIF(E) and NIF were computed using an F4 tally with an FM tally modifier (F4/FM) card, and the multiplication factor, k_eff, was computed using the KCODE card. Although NIF(E) changed significantly between bare and HDPE reflected configurations of the BeRP ball, the change in p(v), and thus the change in v, was insignificant. This is likely due to a difference between the way that NIF is computed using the FMULT and F4/FM cards. The F4/FM card indicated that NIF(E) was essentially Watt-fission distributed for a bare configuration and highly thermalized for all HDPE reflected configurations, while the FMULT card returned an average energy between 1 and 2 MeV for all configurations, which would indicate that the spectrum is Watt-fission distributed regardless of the amount of HDPE reflector. The spectrum computed with the F4/FM card is more physically meaningful, and so the discrepancy between it and the FMULT card result is being investigated. It is hoped that resolving the discrepancy between the FMULT and F4/FM card estimates of NIF(E) will provide better v estimates that will lead to RDA multiplication estimates that are in better agreement with MCNP6 simulations.
NASA Astrophysics Data System (ADS)
Liu, Fei; Xu, Guanghua; Zhang, Qing; Liang, Lin; Liu, Dan
2015-11-01
As one of the Geometrical Product Specifications that are widely applied in industrial manufacturing and measurement, sphericity error comprehensively characterizes a 3D structure and reflects the machining quality of a spherical workpiece. With increasing demands on the motion performance of spherical parts, sphericity error is becoming an indispensable component in the evaluation of form error. However, the evaluation of sphericity error is still considered a complex mathematical problem, and research on the development of available models is lacking. In this paper, an intersecting chord method is first proposed to solve the minimum circumscribed sphere and maximum inscribed sphere evaluations of sphericity error. This new modelling method leverages chord relationships to replace the characteristic points, thereby significantly reducing the computational complexity and improving the computational efficiency. Using the intersecting chords to generate a virtual centre, the reference sphere in two concentric spheres is simplified as a space intersecting structure. The position of the virtual centre on the space intersecting structure is determined by characteristic chords, which may reduce the deviation between the virtual centre and the centre of the reference sphere. In addition, two experiments with real datasets of Cartesian coordinates are used to verify the effectiveness of the proposed method. The results indicate that the estimated errors are in perfect agreement with those of the published methods, while the computational efficiency is improved. For the evaluation of sphericity error, the use of high-performance computing is a remarkable change.
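For context only, and not the paper's intersecting-chord construction: a common baseline is an algebraic least-squares sphere fit followed by the peak-to-valley spread of radial distances, sketched below for Cartesian coordinate data.

```python
import numpy as np

def sphericity_error_lsq(points):
    """Algebraic least-squares sphere fit, then sphericity error as the
    peak-to-valley spread of radial distances about the fitted centre."""
    P = np.asarray(points, dtype=float)
    # |p - c|^2 = R^2  ->  2 p.c + (R^2 - |c|^2) = |p|^2, linear in (c, d)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = np.sum(P ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = sol[:3]
    r = np.linalg.norm(P - centre, axis=1)
    return r.max() - r.min(), centre

# toy data: slightly noisy points on a unit sphere
rng = np.random.default_rng(1)
u = rng.normal(size=(500, 3))
pts = u / np.linalg.norm(u, axis=1, keepdims=True) + 0.001 * rng.normal(size=(500, 3))
err, c = sphericity_error_lsq(pts)
print(round(err, 4))
```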
Reflection Effects in Multimode Fiber Systems Utilizing Laser Transmitters
NASA Technical Reports Server (NTRS)
Bates, Harry E.
1991-01-01
A number of optical communication lines are now in use at NASA-Kennedy for the transmission of voice, computer data, and video signals. At present, all of these channels use a single carrier wavelength centered near 1300 or 1550 nm. Engineering tests in the past have given indications of the growth of systematic and random noise in the RF spectrum of a fiber network as the number of connector pairs is increased. This noise seems to occur when a laser transmitter is used instead of an LED. It has been suggested that the noise is caused by back reflections created at connector-fiber interfaces. Experiments were performed to explore the effect of reflection on the transmitting laser under conditions of reflective feedback. This effort included computer integration of some of the instrumentation in the fiber optic lab using the LabVIEW software recently acquired by the lab group. The main goal was to interface the Anritsu optical and RF spectrum analyzers to the Macintosh II computer so that laser spectra and network RF spectra could be simultaneously and rapidly acquired in a form convenient for analysis. Both single-mode and multimode fiber is installed at Kennedy; since most of it is multimode, this effort concentrated on multimode systems.
On a Non-Reflecting Boundary Condition for Hyperbolic Conservation Laws
NASA Technical Reports Server (NTRS)
Loh, Ching Y.
2003-01-01
A non-reflecting boundary condition (NRBC) for practical computations in fluid dynamics and aeroacoustics is presented. The technique is based on the hyperbolicity of the Euler equation system and the first principle of plane (simple) wave propagation. The NRBC is simple and effective, provided the numerical scheme maintains a locally C^1 continuous solution at the boundary. Several numerical examples in 1D, 2D and 3D space are illustrated to demonstrate its robustness in practical computations.
Sensing And Force-Reflecting Exoskeleton
NASA Technical Reports Server (NTRS)
Eberman, Brian; Fontana, Richard; Marcus, Beth
1993-01-01
Sensing and force-reflecting exoskeleton (SAFiRE) provides control signals to robot hand and force feedback from robot hand to human operator. Operator makes robot hand touch objects gently and manipulates them finely without exerting excessive forces. Device attaches to operator's hand; comfortable and lightweight. Includes finger exoskeleton, cable mechanical transmission, two dc servomotors, partial thumb exoskeleton, harness, amplifier box, two computer circuit boards, and software. Transduces motion of index finger and thumb. Video monitor of associated computer displays image corresponding to motion.
Context Becomes Content: Sensor Data for Computer-Supported Reflective Learning
ERIC Educational Resources Information Center
Muller, Lars; Divitini, Monica; Mora, Simone; Rivera-Pelayo, Veronica; Stork, Wilhelm
2015-01-01
Wearable devices and ambient sensors can monitor a growing number of aspects of daily life and work. We propose to use this context data as content for learning applications in workplace settings to enable employees to reflect on experiences from their work. Learning by reflection is essential for today's dynamic work environments, as employees…
Force-reflective teleoperated system with shared and compliant control capabilities
NASA Technical Reports Server (NTRS)
Szakaly, Z.; Kim, W. S.; Bejczy, A. K.
1989-01-01
The force-reflecting teleoperator breadboard is described. It is the first system among available Research and Development systems with the following combined capabilities: (1) The master input device is not a replica of the slave arm. It is a general purpose device which can be applied to the control of different robot arms through proper mathematical transformations. (2) Force reflection generated in the master hand controller is referenced to forces and moments measured by a six DOF force-moment sensor at the base of the robot hand. (3) The system permits a smooth spectrum of operations between full manual, shared manual and automatic, and full automatic (called traded) control. (4) The system can be operated with variable compliance or stiffness in force-reflecting control. Some of the key points of the system are the data handling and computing architecture, the communication method, and the handling of mathematical transformations. The architecture is a fully synchronized pipeline. The communication method achieves optimal use of a parallel communication channel between the local and remote computing nodes. A time delay box is also implemented in this communication channel permitting experiments with up to 8 sec time delay. The mathematical transformations are computed faster than 1 msec so that control at each node can be operated at 1 kHz servo rate without interpolation. This results in an overall force-reflecting loop rate of 200 Hz.
Microarthroscopy System With Image Processing Technology Developed for Minimally Invasive Surgery
NASA Technical Reports Server (NTRS)
Steele, Gynelle C.
2001-01-01
In a joint effort, NASA, Micro Medical Devices, and the Cleveland Clinic have developed a microarthroscopy system with digital image processing. This system consists of a disposable endoscope the size of a needle that is aimed at expanding the use of minimally invasive surgery on the knee, ankle, and other small joints. This device not only allows surgeons to make smaller incisions (by improving the clarity and brightness of images), but it also gives them a better view of the injured area so they can make more accurate diagnoses. Because of its small size, the endoscope helps reduce physical trauma and speeds patient recovery. The faster recovery rate also makes the system cost effective for patients. The digital image processing software used with the device was originally developed by the NASA Glenn Research Center to conduct computer simulations of satellite positioning in space. It was later modified to reflect lessons learned in enhancing photographic images in support of the Center's microgravity program. Glenn's Photovoltaic Branch and Graphics and Visualization Lab (G-VIS) computer programmers and software developers enhanced and sped up the graphic imaging for this application. Mary Vickerman at Glenn developed algorithms that enabled Micro Medical Devices to eliminate interference and improve the images.
Roth, Alexis M; Ackermann, Ronald T; Downs, Stephen M; Downs, Anne M; Zillich, Alan J; Holmes, Ann M; Katz, Barry P; Murray, Michael D; Inui, Thomas S
2010-06-01
In 2003, the Indiana Office of Medicaid Policy and Planning launched the Indiana Chronic Disease Management Program (ICDMP), a programme intended to improve the health and healthcare utilization of 15,000 Aged, Blind and Disabled Medicaid members living with diabetes and/or congestive heart failure in Indiana. Within ICDMP, programme components derived from the Chronic Care Model and education based on an integrated theoretical framework were utilized to create a telephonic care management intervention that was delivered by trained, non-clinical Care Managers (CMs) working under the supervision of a Registered Nurse. CMs utilized computer-assisted health education scripts to address clinically important topics, including medication adherence, diet, exercise and prevention of disease-specific complications. Employing reflective listening techniques, barriers to optimal self-management were assessed and members were encouraged to engage in health-improving actions. ICDMP evaluation results suggest that this low-intensity telephonic intervention shifted utilization and lowered costs. We discuss this patient-centred method for motivating behaviour change, the theoretical constructs underlying the scripts and the branched-logic format that makes them suitable to use as a computer-based application. Our aim is to share these public-domain materials with other programmes.
The Construction of Knowledge through Social Interaction via Computer-Mediated Communication
ERIC Educational Resources Information Center
Saritas, Tuncay
2008-01-01
With the advance in information and communication technologies, computer-mediated communication--more specifically computer conferencing systems (CCS)--has captured the interest of educators as an ideal tool to create a learning environment featuring active, participative, and reflective learning. Educators are increasingly adapting the features…
Evaluation of Student Reflection as a Route to Improve Oral Communication
ERIC Educational Resources Information Center
Mineart, Kenneth P.; Cooper, Matthew E.
2016-01-01
This study describes the use of guided self-reflection and peer feedback activities to improve student oral communication in a large ChE class (n ~ 100) setting. Student performance tracked throughout an experimental semester indicated both reflection activities accelerated improvement in oral communication over control; student perception of the…
NASA Technical Reports Server (NTRS)
Xie, Yu; Minnis, Patrick; Hu, Yong X.; Kattawar, George W.; Yang, Ping
2008-01-01
Spherical or spheroidal air bubbles are generally trapped during the formation of rapidly growing ice crystals. In this study the single-scattering properties of inhomogeneous ice crystals containing air bubbles are investigated. Specifically, a computational model based on an improved geometric-optics method (IGOM) has been developed to simulate the scattering of light by randomly oriented hexagonal ice crystals containing spherical or spheroidal air bubbles. A combination of the ray-tracing technique and the Monte Carlo method is used. The effect of the air bubbles within ice crystals is to smooth the phase functions, diminish the 22° and 46° halo peaks, and substantially reduce the backscatter relative to bubble-free particles. These features vary with the number, sizes, locations and shapes of the air bubbles within the ice crystals. Moreover, the asymmetry factors of inhomogeneous ice crystals decrease as the volume of air bubbles increases. Cloud reflectance lookup tables were generated at wavelengths of 0.65 μm and 2.13 μm with different air-bubble conditions to examine the impact of the bubbles on retrieving ice cloud optical thickness and effective particle size. The reflectances simulated for inhomogeneous ice crystals are slightly larger than those computed for homogeneous ice crystals at a wavelength of 0.65 μm. Thus, the retrieved cloud optical thicknesses are reduced by employing inhomogeneous ice cloud models. At a wavelength of 2.13 μm, including air bubbles in ice cloud models may also increase the reflectance. This effect implies that the effective particle sizes retrieved for inhomogeneous ice crystals are larger than those retrieved for homogeneous ice crystals, particularly in the case of large air bubbles.
NASA Astrophysics Data System (ADS)
Meléndez, A.; Korenaga, J.; Sallarès, V.; Miniussi, A.; Ranero, C. R.
2015-10-01
We present a new 3-D traveltime tomography code (TOMO3D) for the modelling of active-source seismic data that uses the arrival times of both refracted and reflected seismic phases to derive the velocity distribution and the geometry of reflecting boundaries in the subsurface. This code is based on its popular 2-D version TOMO2D, from which it inherited the methods to solve the forward and inverse problems. The traveltime calculations are done using a hybrid ray-tracing technique combining the graph and bending methods. The LSQR algorithm is used to perform the iterative regularized inversion to improve the initial velocity and depth models. In order to cope with the increased computational demand due to the incorporation of the third dimension, the forward problem solver, which takes most of the run time (~90 per cent in the test presented here), has been parallelized with a combination of multi-processing and message passing interface standards. This parallelization distributes the ray-tracing and traveltime calculations among the available computational resources. The code's performance is illustrated with a realistic synthetic example, including a checkerboard anomaly and two reflectors, which simulates the geometry of a subduction zone. The code is designed to invert for a single reflector at a time, so a data-driven layer-stripping strategy is proposed for cases involving multiple reflectors, and it is tested with the successive inversion of the two reflectors. Layers are bound by consecutive reflectors, and the initial velocity model for each inversion step incorporates the results from previous steps. This strategy poses simpler inversion problems at each step, allowing the recovery of strong velocity discontinuities that would otherwise be smoothed out.
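A toy illustration of the damped LSQR step (not TOMO3D itself): a sparse tomographic system J·dm ≈ dt is solved with SciPy's lsqr under Tikhonov-style damping; the Jacobian, residuals, damping value, and problem size are all made up.

```python
import numpy as np
from scipy.sparse import random as sprandom
from scipy.sparse.linalg import lsqr

n_rays, n_cells = 2000, 500
J = sprandom(n_rays, n_cells, density=0.02, random_state=42)   # toy ray-path Jacobian
dt = np.random.default_rng(2).normal(size=n_rays)              # traveltime residuals

# damped (Tikhonov-regularized) least squares: minimize |J dm - dt|^2 + damp^2 |dm|^2
dm, istop, itn, r1norm = lsqr(J, dt, damp=0.1)[:4]
print(istop, itn, round(float(r1norm), 3))
```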
Revision of IRIS/IDA Seismic Station Metadata
NASA Astrophysics Data System (ADS)
Xu, W.; Davis, P.; Auerbach, D.; Klimczak, E.
2017-12-01
Trustworthy data quality assurance has always been one of the goals of seismic network operators and data management centers. This task is complex and evolving due to the huge quantities as well as the rapidly changing characteristics and complexities of seismic data. Published metadata usually reflect instrument response characteristics and their accuracies, which include the zero-frequency sensitivity for both the seismometer and the data logger as well as other, frequency-dependent elements. In this work, we focus mainly on the variation of seismometer sensitivity with time in IRIS/IDA seismic recording systems, with the goal of improving the metadata accuracy for the history of the network. There are several ways to measure the accuracy of seismometer sensitivity for seismic stations in service. An effective practice developed recently is to collocate a reference seismometer in proximity to verify the in-situ sensors' calibration. For those stations with a secondary broadband seismometer, IRIS' MUSTANG metric computation system introduced a transfer-function metric to reflect the two sensors' gain ratios in the microseism frequency band. In addition, a simulation approach based on M2 tidal measurements has been proposed and proven to be effective. In this work, we compare and analyze the results from the three methods and conclude that the collocated-sensor method is the most stable and reliable, with the smallest uncertainties throughout. However, for epochs without both a collocated sensor and a secondary seismometer, we rely on the results from the tide method. For the data since 1992 at IDA stations, we computed over 600 revised seismometer sensitivities covering all the IRIS/IDA network calibration epochs. Further revision procedures should help guarantee that the metadata of these stations accurately reflect the data.
Simulation of the Reflected Blast Wave froma C-4 Charge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howard, W M; Kuhl, A L; Tringe, J W
2011-08-01
The reflection of a blast wave from a C4 charge detonated above a planar surface is simulated with our ALE3D code. We used a finely-resolved, fixed Eulerian 2-D mesh (167 μm per cell) to capture the detonation of the charge, the blast wave propagation in nitrogen, and its reflection from the surface. The thermodynamic properties of the detonation products and nitrogen were specified by the Cheetah code. A programmed-burn model was used to detonate the charge at a rate based on measured detonation velocities. Computed pressure histories are compared with pressures measured by Kistler 603B piezoelectric gauges at 8 ranges (GR = 0, 2, 4, 8, 10, and 12 inches) along the reflecting surface. Computed and measured waveforms and positive-phase impulses were similar, except at close-in ranges (GR < 2 inches), which were dominated by jetting effects.
Simulation of the reflected blast wave from a C-4 charge
NASA Astrophysics Data System (ADS)
Howard, W. Michael; Kuhl, Allen L.; Tringe, Joseph
2012-03-01
The reflection of a blast wave from a C4 charge detonated above a planar surface is simulated with our ALE3D code. We used a finely-resolved, fixed Eulerian 2-D mesh (167 μm per cell) to capture the detonation of the charge, the blast wave propagation in nitrogen, and its reflection from the surface. The thermodynamic properties of the detonation products and nitrogen were specified by the Cheetah code. A programmed-burn model was used to detonate the charge at a rate based on measured detonation velocities. Computed pressure histories are compared with pressures measured by Kistler 603B piezoelectric gauges at 7 ranges (GR = 0, 5.08, 10.16, 15.24, 20.32, 25.4, and 30.48 cm) along the reflecting surface. Computed and measured waveforms and positive-phase impulses were similar, except at close-in ranges (GR < 5 cm), which were dominated by jetting effects.
NASA Astrophysics Data System (ADS)
Escobar-Cerezo, J.; Penttilä, A.; Kohout, T.; Muñoz, O.; Moreno, F.; Muinonen, K.
2018-01-01
Lunar soil spectra differ from pulverized lunar rocks spectra by reddening and darkening effects, and shallower absorption bands. These effects have been described in the past as a consequence of space weathering. In this work, we focus on the effects of nanophase iron (npFe0) inclusions on the experimental reflectance spectra of lunar regolith particles. The reflectance spectra are computed using SIRIS3, a code that combines ray optics with radiative-transfer modeling to simulate light scattering by different types of scatterers. The imaginary part of the refractive index as a function of wavelength of immature lunar soil is derived by comparison with the measured spectra of the corresponding material. Furthermore, the effect of adding nanophase iron inclusions on the reflectance spectra is studied. The computed spectra qualitatively reproduce the observed effects of space weathered lunar regolith.
Observation model and parameter partials for the JPL VLBI parameter estimation software MODEST/1991
NASA Technical Reports Server (NTRS)
Sovers, O. J.
1991-01-01
A revision of MASTERFIT-1987, which it supersedes, is presented. Changes during 1988 to 1991 included the introduction of the octupole component of solid Earth tides, the NUVEL tectonic motion model, partial derivatives for the precession constant and source position rates, the option to correct for source structure, a refined model for antenna offsets, modeling of the unique antenna at Richmond, FL, the improved nutation series of Zhu, Groten, and Reigber, and the reintroduction of the old (Woolard) nutation series for simulation purposes. Text describing the relativistic transformations and gravitational contributions to the delay model was also revised in order to reflect the computer code more faithfully.
A generative, probabilistic model of local protein structure.
Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas
2008-07-01
Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.
NASA Astrophysics Data System (ADS)
Vilagosh, Zoltan; Lajevardipour, Alireza; Wood, Andrew
2018-01-01
Finite-difference time-domain (FDTD) computational phantoms aid the analysis of THz radiation interaction with human skin. The phantoms presented here have accurate anatomical layering and electromagnetic properties. A novel "large sheet" simulation technique is used, allowing a realistic representation of the lateral absorption and reflection seen in in-vivo measurements. Simulations carried out to date indicate that hair follicles act as THz propagation channels and confirm the possible role of melanin, both in nevi and in skin pigmentation, as a significant absorber of THz radiation. A novel freezing technique shows promise for increasing the depth of skin penetration of THz radiation to aid diagnostic imaging.
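As background on the method rather than the authors' skin phantom, here is a minimal one-dimensional FDTD update loop in normalized free-space units with a hard Gaussian source; the grid size, time-step count, and pulse parameters are arbitrary.

```python
import numpy as np

# Minimal 1D FDTD (Yee) update in free space, normalized units, hard Gaussian source.
nz, nt = 400, 800
ez = np.zeros(nz)          # electric field samples
hy = np.zeros(nz - 1)      # magnetic field, staggered by half a cell
S = 1.0                    # Courant number (the 1D stability limit)

for n in range(nt):
    hy += S * (ez[1:] - ez[:-1])              # update H from the curl of E
    ez[1:-1] += S * (hy[1:] - hy[:-1])        # update E from the curl of H
    ez[50] = np.exp(-((n - 60) / 15.0) ** 2)  # hard source injects a pulse

print(float(np.max(np.abs(ez))))
```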
Computers and the internet: tools for youth empowerment.
Valaitis, Ruta K
2005-10-04
Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions to increasing youth participation in communities. This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth's and adults' perceptions and use of the technologies. Constant comparison method and between-method triangulation were used in the analysis to satisfy the existence of themes. Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives.
Improving Teaching through Collaborative Reflective Teaching Cycles
ERIC Educational Resources Information Center
Murray, Eileen
2015-01-01
Reflection and collaboration are two activities teachers can use to change and improve their practice. However, finding the time and space to do so can be challenging. The collaborative reflective teaching cycle is a structured activity teachers can use to engage in reflection and collaboration. This article describes how a seventh grade teaching…
Professionalizing the Self-Reflection of Student Teachers by Using a Wiki
ERIC Educational Resources Information Center
Wegner, Claas; Remmert, Kathrin; Strehlke, Friederike
2014-01-01
Critics encourage the process of "reflection" as a prerequisite for professionalizing how teachers behave in the classroom. Reflection helps in recognizing areas in need of improvement. Self-reflection is hence one of the teacher's most important skills in order to work constantly on one's teaching and how to improve it. However, the…
NASA Astrophysics Data System (ADS)
Kim, H. W.; Yeom, J. M.; Woo, S. H.
2017-12-01
Over thin-cloud regions, a satellite simultaneously detects reflectance from the thin cloud and from the land surface. Since this mixed reflectance does not represent the cloud alone, the background surface reflectance should be removed to accurately distinguish thin clouds such as cirrus. In previous research, Kim et al. (2017) developed a cloud-masking algorithm for the Geostationary Ocean Color Imager (GOCI), one of the instruments on the Communication, Ocean, and Meteorology Satellite (COMS). Although GOCI has only 8 spectral channels covering the visible and near-infrared ranges, the cloud mask shows quantitatively reasonable agreement with the MODIS cloud mask (Collection 6 MYD35). In particular, validation against Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) data showed that the algorithm is especially effective for thin-cloud detection, because it concentrates on removing background surface effects from the top-of-atmosphere (TOA) reflectance. Using the difference between the TOA reflectance and a bi-directional reflectance distribution function (BRDF) model-based background surface reflectance, both thick and thin clouds can be discriminated without the infrared channels that are usually used for cloud detection. Moreover, when the cloud-mask result is fed back as input to the BRDF model and the optimized BRDF-based surface reflectance is used for an optimized cloud mask, the probability of detection (POD) is higher than that of the original mask. In this study, we examine the correlation between cloud optical depth (COD) and the cloud-mask result. Cloud optical depth depends mainly on cloud thickness and on the nature and size of the cloud particles, ranging from less than 0.1 for thin clouds to over 1000 for large cumulus because of scattering by droplets. With the cloud optical depth from CALIPSO, the cloud-mask result can be further improved because the cloud depth is known. To validate the cloud mask and the correlation result, an atmospheric retrieval will be computed to compare the difference between the TOA reflectance and the simulated surface reflectance.
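A minimal sketch of the reflectance-difference idea, assuming per-pixel arrays of TOA reflectance and BRDF-modelled clear-sky surface reflectance and a single illustrative threshold; the operational algorithm's channels, BRDF kernels, and thresholds are not reproduced.

```python
import numpy as np

def cloud_mask(toa_reflectance, brdf_surface_reflectance, threshold=0.08):
    """Flag a pixel as cloudy when the observed TOA reflectance exceeds the
    BRDF-predicted clear-sky surface reflectance by more than a threshold."""
    diff = toa_reflectance - brdf_surface_reflectance
    return diff > threshold

# toy scene: a bright patch over a darker, uniform surface
surface = np.full((100, 100), 0.10)
toa = surface.copy()
toa[40:60, 40:60] += 0.15          # "cloudy" pixels
mask = cloud_mask(toa, surface)
print(int(mask.sum()))             # -> 400
```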
Use of Computer Kiosks for Breast Cancer Education in Five Community Settings
ERIC Educational Resources Information Center
Kreuter, Matthew W.; Black, Wynona J.; Friend, LaBraunna; Booker, Angela C.; Klump, Paula; Bobra, Sonal; Holt, Cheryl L.
2006-01-01
Finding ways to bring effective computer-based behavioral interventions to those with limited access to technology is a continuing challenge for health educators. Computer kiosks placed in community settings may help reach such populations. The "Reflections of You" kiosk generates individually tailored magazines on breast cancer and…
Metacognitive Support Accelerates Computer Assisted Learning for Novice Programmers
ERIC Educational Resources Information Center
Rum, Siti Nurulain Mohd; Ismail, Maizatul Akmar
2017-01-01
Computer programming is a part of the curriculum in computer science education, and high dropout rates for this subject are a universal problem. Development of metacognitive skills, including the conceptual framework provided by socio-cognitive theories that afford reflective thinking, such as actively monitoring, evaluating, and modifying one's…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitcham, C.
This essay surveys recent studies concerning the social, cultural, ethical and religious dimensions of computers. The argument is that computers have certain cultural influences which call for ethical analysis. Further suggestions are that American culture is itself reflected in new ways in the high-technology computer milieu, and that ethical issues entail religious ones which are being largely ignored. 28 references.
Computer assisted holographic moire contouring
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.
2000-01-01
Theoretical analyses and experimental results on holographic moire contouring on diffusely reflecting objects are presented. The sensitivity and limitations of the method are discussed. Particular emphasis is put on computer-assisted data retrieval, processing, and recording.
Monte Carlo Computer Simulation of a Rainbow.
ERIC Educational Resources Information Center
Olson, Donald; And Others
1990-01-01
Discusses making a computer-simulated rainbow using principles of physics, such as reflection and refraction. Provides BASIC program for the simulation. Appends a program illustrating the effects of dispersion of the colors. (YP)
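The article's BASIC listing is not reproduced here. As a rough, assumed re-creation of the same physics in Python, one can trace many parallel rays through a spherical water drop (refraction in, one internal reflection, refraction out for the primary bow) and histogram the scattering angles; the pile-up near 42 degrees is the rainbow.

```python
import numpy as np

# Monte Carlo sketch of a primary rainbow; refractive index is an assumed value for red light.
n = 1.331                                   # refractive index of water (assumed)
b = np.random.uniform(0.0, 1.0, 100_000)    # impact parameters on a unit-radius drop
i = np.arcsin(b)                            # angle of incidence
r = np.arcsin(np.sin(i) / n)                # refraction angle (Snell's law)
deviation = 2 * (i - r) + (np.pi - 2 * r)   # refract, reflect once, refract
scatter_deg = 180.0 - np.degrees(deviation) # angle measured from the antisolar point
hist, edges = np.histogram(scatter_deg, bins=90, range=(0, 90))
print("brightest bin near", edges[np.argmax(hist)], "degrees")  # ~42 degrees
```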
Fiber-optic projected-fringe digital interferometry
NASA Technical Reports Server (NTRS)
Mercer, Carolyn R.; Beheim, Glenn
1990-01-01
A phase-stepped projected-fringe interferometer was developed which uses a closed-loop fiber-optic phase-control system to make very accurate surface profile measurements. The closed-loop phase-control system greatly reduces phase-stepping error, which is frequently the dominant source of error in digital interferometers. Two beams emitted from a fiber-optic coupler are combined to form an interference fringe pattern on a diffusely reflecting object. Reflections off the fibers' output faces are used to create a phase-indicating signal for the closed-loop optical phase controller. The controller steps the phase difference between the two beams by pi/2 radians in order to determine the object's surface profile using a solid-state camera and a computer. The system combines the ease of alignment and automated data reduction of phase-stepping projected-fringe interferometry with the greatly improved phase-stepping accuracy of our closed-loop phase-controller. The system is demonstrated by measuring the profile of a plate containing several convex surfaces whose heights range from 15 to 25 microns.
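A common way to recover surface height from fringe patterns stepped by pi/2 is the standard four-step phase-shifting formula. The sketch below is a generic illustration of that calculation, not the authors' instrument software; the fringe-to-height scale factor is an assumed placeholder.

```python
import numpy as np

def four_step_phase(I0, I1, I2, I3, fringe_spacing_um=10.0):
    """Recover wrapped phase from four fringe images stepped by pi/2,
    unwrap along each row, and convert to height.

    I0..I3 : 2-D intensity images at phase shifts 0, pi/2, pi, 3*pi/2.
    fringe_spacing_um : assumed height change per 2*pi of phase (placeholder).
    """
    wrapped = np.arctan2(I3 - I1, I0 - I2)     # standard four-step formula
    unwrapped = np.unwrap(wrapped, axis=1)     # remove 2*pi jumps row by row
    return unwrapped * fringe_spacing_um / (2 * np.pi)

# Synthetic check: a tilted plane gives a linear phase ramp
x = np.linspace(0, 4 * np.pi, 200)
phase = np.tile(x, (50, 1))
frames = [1 + 0.5 * np.cos(phase + k * np.pi / 2) for k in range(4)]
height = four_step_phase(*frames)
print(height[0, -1] - height[0, 0])            # ~20.0 (two fringe spacings)
```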
NASA Astrophysics Data System (ADS)
Shen, Jian; Liu, Shouhua; Shen, Zicai; Shao, Jianda; Fan, Zhengxiu
2006-03-01
A model for the refractive index of a stratified dielectric substrate was put forward according to the theory of inhomogeneous coatings. The substrate was divided into a surface layer, a subsurface layer, and a bulk layer along the direction normal to its surface. The surface layer (separated into N1 sublayers of uniform thickness) and the subsurface layer (separated into N2 sublayers of uniform thickness), whose refractive indices follow different statistical distributions, were each treated as equivalent inhomogeneous coatings, and the theoretical derivation was carried out by employing the characteristic matrix method of optical coatings. An example calculation of the optical properties of dielectric coatings was presented. The computed results indicate that substrate subsurface defects can bring about additional bulk scattering and change the propagation characteristics in the thin film and the substrate. Therefore, the reflectance, reflective phase shift, and phase difference of the coating-substrate assembly deviate from the ideal case. The model provides theoretical guidance for improving the optical properties of dielectric coatings via substrate surface modification.
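The characteristic matrix method referred to above is the standard thin-film transfer-matrix calculation. The following sketch evaluates normal-incidence reflectance for a stack of sublayers whose index drifts toward the bulk value; the indices, thicknesses, and wavelength are illustrative assumptions, not values from the paper.

```python
import numpy as np

def stack_reflectance(n_layers, d_layers, wavelength, n_in=1.0, n_sub=1.52):
    """Normal-incidence reflectance of a dielectric multilayer via the
    characteristic matrix method.

    n_layers, d_layers : refractive index and physical thickness of each sublayer
    wavelength         : vacuum wavelength (same length units as d_layers)
    n_in, n_sub        : incident-medium and substrate bulk indices (assumed values)
    """
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength          # phase thickness
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B, C = M @ np.array([1.0, n_sub])
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Example: a graded subsurface region approximated by 20 sublayers whose index
# drifts from 1.60 down to the bulk value 1.52 (illustrative numbers only).
ns = np.linspace(1.60, 1.52, 20)
ds = np.full(20, 10.0)                                  # 10 nm sublayers
print(stack_reflectance(ns, ds, wavelength=550.0))
```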
Irradiation control parameters for computer-assisted laser photocoagulation of the retina
NASA Astrophysics Data System (ADS)
Naess, Espen; Molvik, Torstein; Barrett, Steven F.; Wright, Cameron H. G.; de Graaf, Peter W.
2001-06-01
A system for robotically assisted retinal surgery has been developed to rapidly and safely place lesions on the retina for photocoagulation therapy. This system provides real-time, motion-stabilized lesion placement for typical irradiation times of 100 ms. The system consists of three main subsystems: a global, digital-based tracking subsystem; a fast, local analog tracking subsystem; and a confocal reflectance subsystem to control lesion parameters dynamically. We have reported on these subsystems in previous SPIE presentations. This paper concentrates on the development of the second hybrid system prototype. Considerable progress has been made toward reducing the footprint of the optical system, simplifying the user interface, fully characterizing the analog tracking system and using measurable lesion reflectance growth parameters to develop a noninvasive method to infer lesion depth. This method will allow dynamic control of laser dosimetry to provide similar lesions across the non-uniform retinal surface. These system improvements and progress toward a clinically significant system are covered in detail within this paper.
Quantifying color variation: Improved formulas for calculating hue with segment classification
Smith, Stacey D.
2014-01-01
• Premise of the study: Differences in color form a major component of biological variation, and quantifying these differences is the first step to understanding their evolutionary and ecological importance. One common method for measuring color variation is segment classification, which uses three variables (chroma, hue, and brightness) to describe the height and shape of reflectance curves. This study provides new formulas for calculating hue (the variable that describes the “type” of color) to give correct values in all regions of color space. • Methods and Results: Reflectance spectra were obtained from the literature, and chroma, hue, and brightness were computed for each spectrum using the original formulas as well as the new formulas. Only the new formulas result in correct values in the blue-green portion of color space. • Conclusions: Use of the new formulas for calculating hue will result in more accurate color quantification for a broad range of biological applications. PMID:25202612
Lewis, Kadriye O; Farber, Susan; Chen, Haiqin; Peska, Don N
2015-11-01
The value of reflective practices has gained momentum in osteopathic medical education. However, the use of reflective pedagogies has not been explored in the larger context of medical course delivery and design, to the authors' knowledge. To determine the types of reflection demonstrated by osteopathic medical students on an online discussion board and to explore differences in discussion engagement caused by the use of a reflective learning self-assessment tool. Using a mixed-method approach, reflection processes in an osteopathic surgery clinical clerkship online module were investigated in third-year osteopathic medical students. Discussion board messages were captured and coded. Both manual coding techniques and automated interrogation using NVivo9 (a computer program) for qualitative data were applied. Correlations of scores across 4 case-based discussion tasks and scores for self-reflection were computed as quantitative data. Twenty-eight students were included. Four main types of reflection (ie, content, contextual, dialogic, and personal) along with corresponding differentiated subthemes for each type of case-based discussion board group message were identified. Group collaboration revealed insights about the reflection process itself and also about the evidence of collective efforts, group engagements, and intragroup support among students. Student preparation revealed that students' metacognition was triggered when they judged their own contributions to group work. Challenges in completing readings and meeting deadlines were related to the students' long work hours. Reflective practices are essential to the practice of osteopathic medicine and medical education. Curricula can promote the development of reflective skills by integrating these deliberate practices in educational activities.
Skilled Metro Workers Get Highest Payoffs for Using a Computer at Work.
ERIC Educational Resources Information Center
Kusmin, Lorin D.
2000-01-01
Workers who use computers on the job receive higher wages, reflecting both computer-specific and broader skills. This accounts for a small portion of the metro-nonmetro wage gap. The payoff for using a computer on the job is higher for college graduates and more-experienced workers than their counterparts and is higher for rural than urban…
NASA Astrophysics Data System (ADS)
Pérez-Huerta, J. S.; Ariza-Flores, D.; Castro-García, R.; Mochán, W. L.; Ortiz, G. P.; Agarwal, V.
2018-04-01
We report the reflectivity of one-dimensional finite and semi-infinite photonic crystals, computed through the coupling to Bloch modes (BM) and through a transfer matrix method (TMM), and their comparison to the experimental spectral line shapes of porous silicon (PS) multilayer structures. Both methods reproduce a forbidden photonic bandgap (PBG), but slowly converging oscillations are observed in the TMM as the number of layers increases to infinity, while a smooth, converged behavior is obtained with BM. The experimental reflectivity spectra are in good agreement with the TMM results for multilayer structures with a small number of periods. However, for structures with a large number of periods, the measured spectral line shapes exhibit better agreement with the smooth behavior predicted by BM.
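As an illustration of how the forbidden gap appears in the Bloch-mode picture, the sketch below evaluates the standard normal-incidence Bloch dispersion condition for a two-layer unit cell; frequencies where |cos(KL)| > 1 have no real Bloch wavevector and lie in the bandgap. The porous-silicon-like indices and quarter-wave thicknesses are assumed, illustrative values.

```python
import numpy as np

def bloch_gap_map(n1, n2, d1, d2, wavelengths):
    """Evaluate the Bloch condition for a two-layer unit cell at normal incidence:
    cos(K*L) = cos(k1*d1)*cos(k2*d2) - 0.5*(n1/n2 + n2/n1)*sin(k1*d1)*sin(k2*d2).
    Where |cos(K*L)| > 1 there is no real Bloch wavevector, i.e. a photonic bandgap."""
    k1 = 2 * np.pi * n1 / wavelengths
    k2 = 2 * np.pi * n2 / wavelengths
    rhs = (np.cos(k1 * d1) * np.cos(k2 * d2)
           - 0.5 * (n1 / n2 + n2 / n1) * np.sin(k1 * d1) * np.sin(k2 * d2))
    return np.abs(rhs) > 1.0          # True where propagation is forbidden

# Illustrative porous-silicon-like indices, quarter-wave thicknesses at 800 nm
wl = np.linspace(600.0, 1100.0, 501)
gap = bloch_gap_map(n1=1.4, n2=2.1, d1=800 / (4 * 1.4), d2=800 / (4 * 2.1), wavelengths=wl)
print("bandgap spans roughly", wl[gap].min(), "to", wl[gap].max(), "nm")
```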
NASA Astrophysics Data System (ADS)
Poudyal, R.; Singh, M. K.; Gatebe, C. K.; Gautam, R.; Varnai, T.
2015-12-01
Using airborne Cloud Absorption Radiometer (CAR) reflectance measurements of smoke, this study establishes an empirical relationship between reflectances measured at different sun-satellite geometries. We observe that the reflectance of smoke aerosol at any viewing zenith angle can be computed as a linear combination of the reflectances at two viewing zenith angles, one less than 30° and the other greater than 60°. The coefficients of this linear combination follow a third-order polynomial in the viewing geometry. Similar relationships were established for different relative azimuth angles: reflectance at any azimuth angle can be written as a linear combination of measurements at two azimuth angles, one in the forward-scattering direction and the other in the backward-scattering direction, both close to the principal plane. These relationships allowed us to create an Angular Distribution Model (ADM) for smoke, which can estimate reflectances in any direction from measurements taken in four view directions. The model was tested by calculating the ADM parameters using CAR data from the SCAR-B campaign and applying these parameters to different smoke cases at three spectral channels (340 nm, 380 nm and 470 nm). We also tested our modelled smoke ADM formulas against the Absorbing Aerosol Index (AAI) computed directly from the CAR data at 340 nm and 380 nm, which is probably the first study to analyze the complete multi-angular distribution of AAI for smoke aerosols. The RMSE (and mean error) of predicted reflectance for the SCAR-B and ARCTAS smoke ADMs were found to be 0.002 (1.5%) and 0.047 (6%), respectively. The accuracy of the ADM formulation is also tested through radiative transfer simulations for a wide variety of situations (varying smoke loading, underlying surface types, etc.).
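One way such an ADM could be built is to solve, for each target view angle, the least-squares coefficients of the two anchor reflectances and then fit those coefficients with third-order polynomials in view angle. The sketch below illustrates this under assumed, synthetic data; it is not the authors' fitting code, and the array names are placeholders.

```python
import numpy as np

def fit_adm_coefficients(R_target, R_near, R_far, view_angles):
    """R_target : (n_samples, n_angles) reflectances at the angles to be predicted
       R_near, R_far : (n_samples,) reflectances at the two anchor view angles
       view_angles   : (n_angles,) viewing zenith angles in degrees
    Returns cubic polynomial coefficients for a(theta) and b(theta) such that
    R(theta) ~ a(theta)*R_near + b(theta)*R_far."""
    a = np.empty(len(view_angles))
    b = np.empty(len(view_angles))
    A = np.column_stack([R_near, R_far])          # same anchors for every angle
    for j in range(len(view_angles)):
        coeffs, *_ = np.linalg.lstsq(A, R_target[:, j], rcond=None)
        a[j], b[j] = coeffs
    return np.polyfit(view_angles, a, 3), np.polyfit(view_angles, b, 3)

# Synthetic check: reflectances that are exactly 0.6*R_near + 0.4*R_far at every angle
rng = np.random.default_rng(0)
near, far = rng.uniform(0.1, 0.5, 200), rng.uniform(0.1, 0.5, 200)
angles = np.linspace(0.0, 70.0, 8)
target = 0.6 * near[:, None] + 0.4 * far[:, None] + 0.0 * angles
pa, pb = fit_adm_coefficients(target, near, far, angles)
print(np.polyval(pa, 35.0), np.polyval(pb, 35.0))   # ~0.6 and ~0.4
```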
NASA Astrophysics Data System (ADS)
Kramer, Michael J.; Bellman, Kirstie L.; Landauer, Christopher
2002-07-01
This paper will review and examine the definitions of Self-Reflection and Active Middleware. It will then illustrate a conceptual framework for understanding and enumerating the costs of Self-Reflection and Active Middleware at increasing levels of application, review some applications of Self-Reflection and Active Middleware to simulations, and finally consider the application, and the additional kinds of costs, of applying Self-Reflection and Active Middleware to sharing information among the organizations expected to participate in Homeland Defense.
Optimization of Boiling Water Reactor Loading Pattern Using Two-Stage Genetic Algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobayashi, Yoko; Aiyoshi, Eitaro
2002-10-15
A new two-stage optimization method based on genetic algorithms (GAs) using an if-then heuristic rule was developed to generate optimized boiling water reactor (BWR) loading patterns (LPs). In the first stage, the LP is optimized using an improved GA operator. In the second stage, an exposure-dependent control rod pattern (CRP) is sought using GA with an if-then heuristic rule. The procedure of the improved GA is based on deterministic operators that consist of crossover, mutation, and selection. The handling of the encoding technique and constraint conditions by that GA reflects the peculiar characteristics of the BWR. In addition, strategies such as elitism and self-reproduction are effectively used in order to improve the search speed. The LP evaluations were performed with a three-dimensional diffusion code that coupled neutronic and thermal-hydraulic models. Strong axial heterogeneities and constraints dependent on three dimensions have always necessitated the use of three-dimensional core simulators for BWRs, so optimization of computational efficiency is required. The proposed algorithm is demonstrated by successfully generating LPs for an actual BWR plant in two phases. One phase is LP optimization alone, applying the Haling technique. The other phase is an LP optimization that considers the CRP during reactor operation. In test calculations, candidate LPs that shuffle fresh and burned fuel assemblies were obtained within a reasonable computation time.
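For readers unfamiliar with the GA operators mentioned (crossover, mutation, selection, elitism), the following is a generic permutation-GA sketch, not the authors' two-stage method: the loading pattern is represented as a permutation of fuel-assembly indices and the fitness function is a toy placeholder standing in for the 3-D neutronic/thermal-hydraulic evaluation.

```python
import random

def genetic_search(fitness, n_positions, pop_size=40, generations=200,
                   mutation_rate=0.2, seed=0):
    """Minimal permutation GA: a loading pattern is a permutation of fuel-assembly
    indices; `fitness` maps a permutation to a score to maximize."""
    rng = random.Random(seed)
    pop = [rng.sample(range(n_positions), n_positions) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]                       # elitism
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, n_positions)            # simple order crossover
            child = p1[:cut] + [g for g in p2 if g not in p1[:cut]]
            if rng.random() < mutation_rate:               # swap mutation
                i, j = rng.sample(range(n_positions), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Toy fitness (placeholder only): prefer high-index assemblies at high positions
best = genetic_search(lambda p: sum(i * g for i, g in enumerate(p)), n_positions=12)
print(best)
```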
A method to calculate synthetic waveforms in stratified VTI media
NASA Astrophysics Data System (ADS)
Wang, W.; Wen, L.
2012-12-01
Transverse isotropy with a vertical axis of symmetry (VTI) may be an important material property in the Earth's interior. In this presentation, we develop a method to calculate synthetic seismograms for wave propagation in stratified VTI media. Our method is based on the generalized reflection and transmission method (GRTM) (Luco & Apsel 1983), which we extend to transversely isotropic VTI media. Because it explicitly excludes the exponential growth terms in the propagation matrix, GRTM has the advantage of remaining stable in high-frequency calculations, whereas the Haskell matrix method (Haskell 1964) is limited to low-frequency computation. In the implementation, we also improve GRTM in two respects. 1) We apply the Shanks transformation (Bender & Orszag 1999) to improve the rate of convergence; this improvement is especially important when the depths of the source and receiver are close. 2) We adopt a self-adaptive Simpson integration method (Chen & Zhang 2001) in the discrete wavenumber integration so that the integration can still be carried out efficiently at large epicentral distances. Because the calculation is independent for each frequency, the program can also be implemented effectively on parallel computers. Our method provides a powerful tool for synthesizing broadband seismograms in VTI media over a large range of epicentral distances. We will present examples of using the method to study possible transverse isotropy in the upper mantle and the lowermost mantle.
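The Shanks transformation mentioned above is a standard series-acceleration technique. As a minimal, generic illustration (not the authors' wavenumber-integration code), one pass over a sequence of partial sums is:

```python
import numpy as np

def shanks(partial_sums):
    """One application of the Shanks transformation to a sequence of partial sums:
    S_n = (A_{n+1}*A_{n-1} - A_n**2) / (A_{n+1} + A_{n-1} - 2*A_n)."""
    A = np.asarray(partial_sums, dtype=float)
    return (A[2:] * A[:-2] - A[1:-1] ** 2) / (A[2:] + A[:-2] - 2 * A[1:-1])

# Example: partial sums of the slowly converging series ln(2) = 1 - 1/2 + 1/3 - ...
terms = (-1.0) ** np.arange(12) / np.arange(1, 13)
A = np.cumsum(terms)
print(A[-1], shanks(A)[-1], shanks(shanks(A))[-1])   # each pass gets closer to ln(2)
```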
Attenuation of harmonic noise in vibroseis data using Simulated Annealing
NASA Astrophysics Data System (ADS)
Sharma, S. P.; Tildy, Peter; Iranpour, Kambiz; Scholtz, Peter
2009-04-01
Processing of high-productivity vibroseis seismic data (such as slip-sweep acquisition records) suffers from the well-known problem of harmonic distortion. Harmonic distortion appears after cross-correlation of the recorded seismic signal with the pilot sweep and affects the signal at negative time (before the actual strong reflection event): weak reflection events of the earlier sweeps falling in the negative-time window of the cross-correlation sequence are masked by the harmonic distortion. Although the amplitude of the harmonic distortion is small (up to 10-20%) compared with the fundamental amplitude of the reflection events, it is significant enough to mask weak reflected signals. Eliminating harmonic noise due to source-signal distortion from the cross-correlated seismic trace has been a challenging task since vibratory sources came into use, and it still needs improvement. An approach has been worked out that minimizes the level of harmonic distortion by designing a signal similar to the harmonic distortion. An arbitrary-length filter is optimized using the Simulated Annealing global optimization approach to design the harmonic signal. The approach convolves a ratio trace (the ratio of the harmonics to the fundamental sweep) with the correlated "positive time" recorded signal and an arbitrary filter. A synthetic data study has shown that designing a signal similar to the desired harmonics, by convolving a suitable filter with the theoretical ratio of harmonics to fundamental sweep, helps reduce the problem of harmonic distortion. Once such a signal is generated for a vibroseis source using an optimized filter, the filter can be used to generate harmonics that are subtracted from the main cross-correlated trace to obtain a better, undistorted image of the subsurface. Designing the predicted harmonics to reduce the energy in the trace, by considering the weak reflections and the observed harmonics together, yields the desired result (resolution of weak reflected signals from the harmonic distortion). As the optimization proceeds, difference plots of the desired and predicted harmonics show how weak reflections gradually emerge from the harmonic distortion during later iterations of the global optimization. The procedure is applied to resolve weak reflections from a number of traces considered together. A more precise design of the harmonics requires a longer SA computation time, which is impractical for voluminous seismic data; however, the objective of resolving weak reflection signals within the strong harmonic noise can be achieved with fast computation by using a faster cooling schedule and fewer iterations and moves in the simulated annealing procedure. This process can help reduce the harmonic distortion and recover the lost weak reflection events in the cross-correlated seismic traces. Acknowledgements: The research was supported under the European Marie Curie Host Fellowships for Transfer of Knowledge (TOK) Development Host Scheme (contract no. MTKD-CT-2006-042537).
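To make the optimization step concrete, the sketch below is a generic simulated-annealing loop that tunes filter taps so that a predicted-harmonic trace (reference convolved with the filter) removes as much energy as possible from a recorded trace. All names, the perturbation size, and the cooling schedule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def anneal_filter(reference, recorded, n_taps=15, n_iter=5000,
                  t0=1.0, cooling=0.999, seed=0):
    """Simulated annealing sketch: find filter taps f so that
    recorded - conv(reference, f) has minimum energy; the residual is then
    interpreted as the signal freed from the predicted harmonic noise."""
    rng = np.random.default_rng(seed)
    f = np.zeros(n_taps)

    def energy(taps):
        predicted = np.convolve(reference, taps, mode="same")
        return float(np.sum((recorded - predicted) ** 2))

    e, T = energy(f), t0
    for _ in range(n_iter):
        trial = f.copy()
        trial[rng.integers(n_taps)] += rng.normal(scale=0.05)   # perturb one tap
        e_trial = energy(trial)
        if e_trial < e or rng.random() < np.exp((e - e_trial) / T):
            f, e = trial, e_trial                                # accept the move
        T *= cooling                                             # cooling schedule
    return f, e

# Toy usage: 'recorded' is a scaled, noisy copy of 'reference'
ref = np.sin(np.linspace(0, 20, 400))
rec = 0.3 * ref + 0.01 * np.random.default_rng(1).normal(size=400)
taps, residual = anneal_filter(ref, rec)
print(residual)
```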
Spectrophotometer-Integrating-Sphere System for Computing Solar Absorptance
NASA Technical Reports Server (NTRS)
Witte, William G., Jr.; Slemp, Wayne S.; Perry, John E., Jr.
1991-01-01
A commercially available ultraviolet, visible, near-infrared spectrophotometer was modified to utilize an 8-inch-diameter modified Edwards-type integrating sphere. Software was written so that the reflectance spectra could be used to obtain solar absorptance values of 1-inch-diameter specimens. A description of the system, the spectral reflectance measurements, and the software for calculating solar absorptance from reflectance data is presented.
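The usual calculation behind such software weights the measured spectral reflectance by the solar spectral irradiance: alpha_s = 1 - integral(R*E)/integral(E) for an opaque specimen. The sketch below illustrates that integral with a crude placeholder solar spectrum; it is not the NASA program described in the report.

```python
import numpy as np

def solar_absorptance(wavelength_nm, reflectance, solar_irradiance):
    """Solar absorptance of an opaque specimen from its spectral reflectance,
    weighted by the solar spectral irradiance E(lambda):
        alpha_s = 1 - integral(R*E) / integral(E)."""
    weighted_reflectance = np.trapz(reflectance * solar_irradiance, wavelength_nm)
    total_irradiance = np.trapz(solar_irradiance, wavelength_nm)
    return 1.0 - weighted_reflectance / total_irradiance

# Illustrative numbers only: flat 20% reflectance under a placeholder spectral weighting
wl = np.linspace(250, 2500, 500)
E = np.exp(-((wl - 550.0) / 600.0) ** 2)                 # placeholder solar spectrum shape
print(solar_absorptance(wl, np.full_like(wl, 0.2), E))   # -> 0.8
```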
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, J.; McClintock, J. E.; Dauser, T.
2013-05-10
We present a new and complete library of synthetic spectra for modeling the component of emission that is reflected from an illuminated accretion disk. The spectra were computed using an updated version of our code XILLVER that incorporates new routines and a richer atomic database. We offer in the form of a table model an extensive grid of reflection models that cover a wide range of parameters. Each individual model is characterized by the photon index Γ of the illuminating radiation, the ionization parameter ξ at the surface of the disk (i.e., the ratio of the X-ray flux to the gas density), and the iron abundance A_Fe relative to the solar value. The ranges of the parameters covered are 1.2 ≤ Γ ≤ 3.4, 1 ≤ ξ ≤ 10^4, and 0.5 ≤ A_Fe ≤ 10. These ranges capture the physical conditions typically inferred from observations of active galactic nuclei, and also stellar-mass black holes in the hard state. This library is intended for use when the thermal disk flux is faint compared to the incident power-law flux. The models are expected to provide an accurate description of the Fe K emission line, which is the crucial spectral feature used to measure black hole spin. A total of 720 reflection spectra are provided in a single FITS file (http://hea-www.cfa.harvard.edu/~javier/xillver/) suitable for the analysis of X-ray observations via the atable model in XSPEC. Detailed comparisons with previous reflection models illustrate the improvements incorporated in this version of XILLVER.
Yield prediction by analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Colwell, J. E.; Suits, G. H.
1975-01-01
A preliminary model describing the growth and grain yield of wheat was developed. The modeled growth characteristics of the wheat crop were used to compute wheat canopy reflectance using a model of vegetation canopy reflectance. The modeled reflectance characteristics were compared with the corresponding growth characteristics and grain yield in order to infer their relationships. It appears that periodic wheat canopy reflectance characteristics potentially derivable from earth satellites will be useful in forecasting wheat grain yield.
Analyzing the relationships between reflection source DPOAEs and SFOAEs using a computational model
NASA Astrophysics Data System (ADS)
Wen, Haiqi; Bowling, Thomas; Meaud, Julien
2018-05-01
Distortion product otoacoustic emissions (DPOAEs) are sounds generated by the cochlea in response to a stimulus that consists of two primary tones. DPOAEs consist of a mixture of emissions arising from two different mechanisms: nonlinear distortion and coherent reflection. Stimulus Frequency Otoacoustic Emissions (SFOAEs) are sounds generated by the cochlea in response to a pure tone; SFOAEs are commonly hypothesized to be generated due to coherent reflection. Nonlinearity of the outer hair cells (OHCs) provides nonlinear amplification to the traveling wave, while reflections occur due to pre-existing micromechanical impedance perturbations. In this work, DPOAEs are obtained from a time-domain computational model coupling a lumped parameter middle ear model with a multiphysics mechanical-electrical-acoustical model of the cochlea. Cochlear roughness is introduced by perturbing the value of the OHC electromechanical coupling coefficient to account for the putative inhomogeneities inside the cochlea. The DPOAEs emitted in the ear canal are decomposed into distortion source and reflection source components. The reflection source component of DPOAEs is compared to SFOAEs obtained using a frequency-domain implementation of the model, to help us understand how the distortion and reflection sources contribute to the total DPOAE. Moreover, the group delays of reflection-source OAEs are compared to group delays in the basilar membrane velocity to clarify the relationship between basilar membrane and OAE group delays.
NASA Astrophysics Data System (ADS)
Kikuchi, N.; Yoshida, Y.; Uchino, O.; Morino, I.; Yokota, T.
2016-11-01
We present an algorithm for retrieving column-averaged dry air mole fraction of carbon dioxide (XCO2) and methane (XCH4) from reflected spectra in the shortwave infrared (SWIR) measured by the TANSO-FTS (Thermal And Near infrared Sensor for carbon Observation Fourier Transform Spectrometer) sensor on board the Greenhouse gases Observing SATellite (GOSAT). The algorithm uses the two linear polarizations observed by TANSO-FTS to improve corrections to the interference effects of atmospheric aerosols, which degrade the accuracy in the retrieved greenhouse gas concentrations. To account for polarization by the land surface reflection in the forward model, we introduced a bidirectional reflection matrix model that has two parameters to be retrieved simultaneously with other state parameters. The accuracy in XCO2 and XCH4 values retrieved with the algorithm was evaluated by using simulated retrievals over both land and ocean, focusing on the capability of the algorithm to correct imperfect prior knowledge of aerosols. To do this, we first generated simulated TANSO-FTS spectra using a global distribution of aerosols computed by the aerosol transport model SPRINTARS. Then the simulated spectra were submitted to the algorithms as measurements both with and without polarization information, adopting a priori profiles of aerosols that differ from the true profiles. We found that the accuracy of XCO2 and XCH4, as well as profiles of aerosols, retrieved with polarization information was considerably improved over values retrieved without polarization information, for simulated observations over land with aerosol optical thickness greater than 0.1 at 1.6 μm.
Drewry, Darren T; Kumar, Praveen; Long, Stephen P
2014-06-01
Spanning 15% of the global ice-free terrestrial surface, agricultural lands provide an immense and near-term opportunity to address climate change, food, and water security challenges. Through the computationally informed breeding of canopy structural traits away from those of modern cultivars, we show that solutions exist that increase productivity and water use efficiency, while increasing land-surface reflectivity to offset greenhouse gas warming. Plants have evolved to maximize capture of radiation in the upper leaves, thus shading competitors. While important for survival in the wild, this is suboptimal in monoculture crop fields for maximizing productivity and other biogeophysical services. Crop progenitors evolved over the last 25 million years in an atmosphere with less than half the [CO2] projected for 2050. By altering leaf photosynthetic rates, rising [CO2] and temperature may also alter the optimal canopy form. Here, using soybean, the world's most important protein crop, as an example, we show, by applying optimization routines to a micrometeorological leaf canopy model linked to a steady-state model of photosynthesis, that significant gains in production, water use, and reflectivity are possible with no additional demand on resources. By modifying total canopy leaf area, its vertical profile and angular distribution, and shortwave radiation reflectivity, all traits available in most major crop germplasm collections, increases in productivity (7%) are possible with no change in water use or albedo. Alternatively, improvements in water use (13%) or albedo (34%) can likewise be made with no loss of productivity, under Corn Belt climate conditions. © 2014 California Institute of Technology. Government sponsorship acknowledged.
Computer-intensive simulation of solid-state NMR experiments using SIMPSON.
Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr; Vosegaard, Thomas
2014-09-01
Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scan, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation, including internal matrix manipulations, propagator setups, and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher precision gradients in combination with the efficient optimization algorithm known as limited memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations. Copyright © 2014 Elsevier Inc. All rights reserved.
Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing
NASA Astrophysics Data System (ADS)
Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo
2016-04-01
The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both the position and depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) over 2D acquisitions are well known. Nonetheless, the amount of data acquired in such 3D surveys does not usually allow processing and interpretation to be completed directly in the field and in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low impact mini-trench" technique (addressed in the ITU - International Telecommunication Union - L.83 recommendation) requires that non-destructive mapping of buried services enhance its productivity to match the improvements of new digging equipment. Nowadays, multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high-quality subsurface images while taking full advantage of 3D data: the development of a fully automated, real-time 3D GPR processing system plays a key role in the overall profitability of optical network deployment. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets and which can be applied in real time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas at 900 MHz central frequency, and 34 antennas at 600 MHz central frequency). The proposed processing scheme takes advantage of 3D data multiplicity by continuous real-time data focusing. Pre-stack reflection angle gathers G(x, θ; v) are computed at nv different velocities (by means of Kirchhoff depth-migration kernels, which can naturally cope with any acquisition pattern and handle irregular sampling issues). The analysis of pre-stack reflection angle gathers plays a key role in automated detection: targets are identified and the best local propagation velocities are recovered through a correlation estimate computed for all nv reflection angle gathers, as sketched below. Indeed, the data redundancy of 3D GPR acquisitions greatly improves the reliability of the proposed automatic detection. The goal of real-time automated processing has been pursued without the need for specific high-performance processing hardware (a simple laptop is sufficient). Moreover, the automation of the entire surveying process makes it possible to obtain high-quality, repeatable results without the need for skilled interpreters. The proposed acquisition procedure has been extensively tested: more than 100 km of acquired data prove the feasibility of the proposed approach.
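The sketch below is a simplified illustration of the velocity-selection idea: at each depth, pick the migration velocity whose angle gather is most coherent across reflection angles, using a semblance-like measure. The array layout, the coherence definition, and the picking rule are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def pick_velocities(angle_gathers, velocities):
    """angle_gathers : array of shape (n_velocities, n_angles, n_depths),
    the pre-stack reflection angle gathers G(theta, z; v) at one lateral position.
    Returns, for each depth, the velocity whose gather is most coherent across
    angles (semblance-like measure) and the corresponding coherence value."""
    num = np.sum(angle_gathers, axis=1) ** 2                    # (n_v, n_z)
    den = angle_gathers.shape[1] * np.sum(angle_gathers ** 2, axis=1) + 1e-12
    coherence = num / den                                       # semblance in [0, 1]
    best = np.argmax(coherence, axis=0)                         # (n_z,)
    return velocities[best], coherence[best, np.arange(coherence.shape[1])]
```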
NASA Astrophysics Data System (ADS)
He, Xiao Dong
This thesis studies light scattering processes off rough surfaces. Analytic models for reflection, transmission and subsurface scattering of light are developed. The results are applicable to realistic image generation in computer graphics. The investigation focuses on the basic issue of how light is scattered locally by general surfaces which are neither diffuse nor specular. Physical optics is employed to account for diffraction and interference which play a crucial role in the scattering of light for most surfaces. The thesis presents: (1) A new reflectance model; (2) A new transmittance model; (3) A new subsurface scattering model. All of these models are physically-based, depend on only physical parameters, apply to a wide range of materials and surface finishes and more importantly, provide a smooth transition from diffuse-like to specular reflection as the wavelength and incidence angle are increased or the surface roughness is decreased. The reflectance and transmittance models are based on the Kirchhoff Theory and the subsurface scattering model is based on Energy Transport Theory. They are valid only for surfaces with shallow slopes. The thesis shows that predicted reflectance distributions given by the reflectance model compare favorably with experiment. The thesis also investigates and implements fast ways of computing the reflectance and transmittance models. Furthermore, the thesis demonstrates that a high level of realistic image generation can be achieved due to the physically correct treatment of the scattering processes by the reflectance model.
Rao, Naren; Menon, Sangeetha
2016-06-01
Preliminary evidence suggests efficacy of yoga as add-on treatment for schizophrenia, but the underlying mechanism by which yoga improves the symptoms of schizophrenia is not completely understood. Yoga improves self-reflection in healthy individuals, and self-reflection abnormalities are typically seen in schizophrenia. However, whether yoga treatment improves impairments in self-reflection typically seen in patients with schizophrenia is not examined. This paper discusses the potential mechanism of yoga in the treatment of schizophrenia and proposes a testable hypothesis for further empirical studies. It is proposed that self-reflection abnormalities in schizophrenia improve with yoga and the neurobiological changes associated with this can be examined using empirical behavioural measures and neuroimaging measures such as magnetic resonance imaging.
Diagnosing hypoxia in murine models of rheumatoid arthritis from reflectance multispectral images
NASA Astrophysics Data System (ADS)
Glinton, Sophie; Naylor, Amy J.; Claridge, Ela
2017-07-01
Spectra computed from multispectral images of murine models of Rheumatoid Arthritis show a characteristic decrease in reflectance within the 600-800nm region which is indicative of the reduction in blood oxygenation and is consistent with hypoxia.
Neutron reflecting supermirror structure
Wood, J.L.
1992-12-01
An improved neutron reflecting supermirror structure comprising a plurality of stacked sets of bilayers of neutron reflecting materials. The improved neutron reflecting supermirror structure is adapted to provide extremely good performance at high incidence angles, i.e. up to four times the critical angle of standard neutron mirror structures. The reflection of neutrons striking the supermirror structure at a high critical angle provides enhanced neutron throughput, and hence more efficient and economical use of neutron sources. 2 figs.
Neutron reflecting supermirror structure
Wood, James L.
1992-01-01
An improved neutron reflecting supermirror structure comprising a plurality of stacked sets of bilayers of neutron reflecting materials. The improved neutron reflecting supermirror structure is adapted to provide extremely good performance at high incidence angles, i.e. up to four times the critical angle of standard neutron mirror structures. The reflection of neutrons striking the supermirror structure at a high critical angle provides enhanced neutron throughput, and hence more efficient and economical use of neutron sources.
Sharma, Ronesh; Bayarjargal, Maitsetseg; Tsunoda, Tatsuhiko; Patil, Ashwini; Sharma, Alok
2018-01-21
Intrinsically Disordered Proteins (IDPs) lack stable tertiary structure and actively participate in various biological functions. These IDPs expose short binding regions called Molecular Recognition Features (MoRFs) that permit interaction with structured protein regions; upon interaction they undergo a disorder-to-order transition from which their functionality arises. Predicting these MoRFs in disordered protein sequences is a challenging task. In this study, we present MoRFpred-plus, an improved predictor over our previously proposed predictor, to identify MoRFs in disordered protein sequences. Two separate, independent propensity scores are computed by incorporating physicochemical properties and HMM profiles, and these scores are combined to predict the final MoRF propensity score for a given residue. The first score reflects the likelihood of a query residue being part of a MoRF region based on the composition and similarity of assumed MoRF and flank regions. The second score reflects the likelihood of a query residue being part of a MoRF region based on the properties of the flanks around the given residue in the query protein sequence. The propensity scores are processed, and common averaging is applied to generate the final prediction score of MoRFpred-plus. The performance of the proposed predictor is compared with that of the available MoRF predictors MoRFchibi, MoRFpred, and ANCHOR. Using the training and test sets previously used to evaluate these predictors, the proposed predictor outperforms them and generates a lower false positive rate. In addition, MoRFpred-plus is downloadable, which makes it useful as input to other computational tools: https://github.com/roneshsharma/MoRFpred-plus/wiki/MoRFpred-plus:-Download. Copyright © 2017 Elsevier Ltd. All rights reserved.
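Only the score-averaging step is described in the abstract; the fragment below is a hedged sketch of what combining and smoothing two per-residue propensity scores could look like. The score names, window length, and threshold are illustrative assumptions, not parameters of MoRFpred-plus.

```python
import numpy as np

def combine_scores(score_physchem, score_hmm, window=7, threshold=0.5):
    """Average two per-residue propensity scores, smooth with a sliding window
    (a simple form of "common averaging"), and threshold to flag MoRF residues."""
    combined = (np.asarray(score_physchem, float) + np.asarray(score_hmm, float)) / 2.0
    kernel = np.ones(window) / window
    smoothed = np.convolve(combined, kernel, mode="same")
    return smoothed, smoothed >= threshold
```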
Tsingos-Lucas, Cherie; Bosnic-Anticevich, Sinthia; Schneider, Carl R; Smith, Lorraine
2016-05-25
Objective. To determine the effectiveness of integrating reflective practice activities into a second-year undergraduate pharmacy curriculum and their impact on reflective thinking ability. Design. A cross-over design with repeated measures was employed. Newly developed reflective modules based on real hospital and community pharmacy cases were integrated into the second-year pharmacy practice curriculum. A novel strategy, the Reflective Ability Clinical Assessment (RACA), was introduced to enhance self- and peer reflection. Assessment. Student responses (n=214) to the adapted Kember et al(1) Reflective Thinking Questionnaire (RTQ) were compared before and after reflective activities were undertaken. Significant improvement in three indicators of reflective thinking was shown after students engaged in reflective activities. Conclusion. Integration of reflective activities into a pharmacy curriculum increased the reflective thinking capacity of students. Enhancing reflective thinking ability may help students make better informed decisions and clinical judgments, thus improving future practice.
Bosnic-Anticevich, Sinthia; Schneider, Carl R.; Smith, Lorraine
2016-01-01
Objective. To determine the effectiveness of integrating reflective practice activities into a second-year undergraduate pharmacy curriculum and their impact on reflective thinking ability. Design. A cross-over design with repeated measures was employed. Newly developed reflective modules based on real hospital and community pharmacy cases were integrated into the second-year pharmacy practice curriculum. A novel strategy, the Reflective Ability Clinical Assessment (RACA), was introduced to enhance self- and peer reflection. Assessment. Student responses (n=214) to the adapted Kember et al1 Reflective Thinking Questionnaire (RTQ) were compared before and after reflective activities were undertaken. Significant improvement in three indicators of reflective thinking was shown after students engaged in reflective activities. Conclusion. Integration of reflective activities into a pharmacy curriculum increased the reflective thinking capacity of students. Enhancing reflective thinking ability may help students make better informed decisions and clinical judgments, thus improving future practice. PMID:27293232
NASA Astrophysics Data System (ADS)
Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh
2016-11-01
The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
NASA Astrophysics Data System (ADS)
Poudyal, R.; Singh, M.; Gautam, R.; Gatebe, C. K.
2016-12-01
The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR)- http://car.gsfc.nasa.gov/. Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
NASA Technical Reports Server (NTRS)
Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh
2016-01-01
The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wild fire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
A fast color image enhancement algorithm based on Max Intensity Channel
Sun, Wei; Han, Long; Guo, Baolong; Jia, Wenyan; Sun, Mingui
2014-01-01
In this paper, we extend image enhancement techniques based on the retinex theory imitating human visual perception of scenes containing high illumination variations. This extension achieves simultaneous dynamic range modification, color consistency, and lightness rendition without multi-scale Gaussian filtering which has a certain halo effect. The reflection component is analyzed based on the illumination and reflection imaging model. A new prior named Max Intensity Channel (MIC) is implemented assuming that the reflections of some points in the scene are very high in at least one color channel. Using this prior, the illumination of the scene is obtained directly by performing a gray-scale closing operation and a fast cross-bilateral filtering on the MIC of the input color image. Consequently, the reflection component of each RGB color channel can be determined from the illumination and reflection imaging model. The proposed algorithm estimates the illumination component which is relatively smooth and maintains the edge details in different regions. A satisfactory color rendition is achieved for a class of images that do not satisfy the gray-world assumption implicit to the theoretical foundation of the retinex. Experiments are carried out to compare the new method with several spatial and transform domain methods. Our results indicate that the new method is superior in enhancement applications, improves computation speed, and performs better than other methods for images with high illumination variations. Further comparisons of images from the National Aeronautics and Space Administration and a wearable camera eButton have shown a high performance of the new method with better color restoration and preservation of image details. PMID:25110395
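The illumination-estimation step lends itself to a short sketch: take the per-pixel maximum over color channels, close it morphologically, smooth it edge-preservingly, and divide it out. The code below is a simplified illustration of that idea, not the authors' implementation; cv2.bilateralFilter stands in for the paper's fast cross-bilateral filter, and the kernel size and filter parameters are assumed values.

```python
import cv2
import numpy as np

def estimate_illumination_and_reflectance(bgr_image):
    """Sketch of the Max Intensity Channel idea: estimate a smooth illumination
    map from the channel-wise maximum and recover per-channel reflectance."""
    img = bgr_image.astype(np.float32) / 255.0
    mic = img.max(axis=2)                                      # Max Intensity Channel
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    closed = cv2.morphologyEx(mic, cv2.MORPH_CLOSE, kernel)    # gray-scale closing
    illumination = cv2.bilateralFilter(closed, d=9, sigmaColor=0.1, sigmaSpace=15)
    illumination = np.clip(illumination, 1e-3, 1.0)
    reflectance = np.clip(img / illumination[..., None], 0.0, 1.0)
    return illumination, reflectance

# Toy usage on a random image
bgr = np.random.default_rng(0).integers(0, 256, (64, 64, 3), dtype=np.uint8)
illum, refl = estimate_illumination_and_reflectance(bgr)
```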
A fast color image enhancement algorithm based on Max Intensity Channel.
Sun, Wei; Han, Long; Guo, Baolong; Jia, Wenyan; Sun, Mingui
2014-03-30
In this paper, we extend image enhancement techniques based on the retinex theory imitating human visual perception of scenes containing high illumination variations. This extension achieves simultaneous dynamic range modification, color consistency, and lightness rendition without multi-scale Gaussian filtering which has a certain halo effect. The reflection component is analyzed based on the illumination and reflection imaging model. A new prior named Max Intensity Channel (MIC) is implemented assuming that the reflections of some points in the scene are very high in at least one color channel. Using this prior, the illumination of the scene is obtained directly by performing a gray-scale closing operation and a fast cross-bilateral filtering on the MIC of the input color image. Consequently, the reflection component of each RGB color channel can be determined from the illumination and reflection imaging model. The proposed algorithm estimates the illumination component which is relatively smooth and maintains the edge details in different regions. A satisfactory color rendition is achieved for a class of images that do not satisfy the gray-world assumption implicit to the theoretical foundation of the retinex. Experiments are carried out to compare the new method with several spatial and transform domain methods. Our results indicate that the new method is superior in enhancement applications, improves computation speed, and performs better than other methods for images with high illumination variations. Further comparisons of images from the National Aeronautics and Space Administration and a wearable camera eButton have shown a high performance of the new method with better color restoration and preservation of image details.
A fast color image enhancement algorithm based on Max Intensity Channel
NASA Astrophysics Data System (ADS)
Sun, Wei; Han, Long; Guo, Baolong; Jia, Wenyan; Sun, Mingui
2014-03-01
In this paper, we extend image enhancement techniques based on the retinex theory imitating human visual perception of scenes containing high illumination variations. This extension achieves simultaneous dynamic range modification, color consistency, and lightness rendition without multi-scale Gaussian filtering which has a certain halo effect. The reflection component is analyzed based on the illumination and reflection imaging model. A new prior named Max Intensity Channel (MIC) is implemented assuming that the reflections of some points in the scene are very high in at least one color channel. Using this prior, the illumination of the scene is obtained directly by performing a gray-scale closing operation and a fast cross-bilateral filtering on the MIC of the input color image. Consequently, the reflection component of each RGB color channel can be determined from the illumination and reflection imaging model. The proposed algorithm estimates the illumination component which is relatively smooth and maintains the edge details in different regions. A satisfactory color rendition is achieved for a class of images that do not satisfy the gray-world assumption implicit to the theoretical foundation of the retinex. Experiments are carried out to compare the new method with several spatial and transform domain methods. Our results indicate that the new method is superior in enhancement applications, improves computation speed, and performs better than other methods for images with high illumination variations. Further comparisons of images from the National Aeronautics and Space Administration and a wearable camera eButton have shown a high performance of the new method with better color restoration and preservation of image details.
A Binaural Grouping Model for Predicting Speech Intelligibility in Multitalker Environments
Colburn, H. Steven
2016-01-01
Spatially separating speech maskers from target speech often leads to a large intelligibility improvement. Modeling this phenomenon has long been of interest to binaural-hearing researchers for uncovering brain mechanisms and for improving signal-processing algorithms in hearing-assistive devices. Much of the previous binaural modeling work focused on the unmasking enabled by binaural cues at the periphery, and little quantitative modeling has been directed toward the grouping or source-separation benefits of binaural processing. In this article, we propose a binaural model that focuses on grouping, specifically on the selection of time-frequency units that are dominated by signals from the direction of the target. The proposed model uses Equalization-Cancellation (EC) processing with a binary decision rule to estimate a time-frequency binary mask. EC processing is carried out to cancel the target signal and the energy change between the EC input and output is used as a feature that reflects target dominance in each time-frequency unit. The processing in the proposed model requires little computational resources and is straightforward to implement. In combination with the Coherence-based Speech Intelligibility Index, the model is applied to predict the speech intelligibility data measured by Marrone et al. The predicted speech reception threshold matches the pattern of the measured data well, even though the predicted intelligibility improvements relative to the colocated condition are larger than some of the measured data, which may reflect the lack of internal noise in this initial version of the model. PMID:27698261
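The mask-estimation idea described above can be sketched compactly: cancel the target direction in each time-frequency unit and flag units where the cancellation removes most of the energy. The fragment below is a heavily simplified illustration, assuming the target sits at zero azimuth (so equalization is trivial and cancellation is a left-minus-right subtraction) and using an arbitrary threshold; it is not the published model.

```python
import numpy as np
from scipy.signal import stft

def ec_binary_mask(left, right, fs, threshold_db=3.0):
    """Binary mask marking time-frequency units where EC-style cancellation of a
    frontal target removes most of the energy (i.e., target-dominated units)."""
    f, t, L = stft(left, fs, nperseg=512)
    _, _, R = stft(right, fs, nperseg=512)
    cancelled = L - R                               # trivial EC for a 0-azimuth target
    in_energy = np.abs(L) ** 2 + np.abs(R) ** 2
    out_energy = np.abs(cancelled) ** 2 + 1e-12
    reduction_db = 10 * np.log10(in_energy / out_energy)
    return (reduction_db > threshold_db).astype(float)

# Toy usage: identical signals at both ears are fully cancelled -> all units flagged
fs = 16000
sig = np.random.default_rng(0).normal(size=fs)
print(ec_binary_mask(sig, sig, fs).mean())          # -> 1.0
```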
A Binaural Grouping Model for Predicting Speech Intelligibility in Multitalker Environments.
Mi, Jing; Colburn, H Steven
2016-10-03
Spatially separating speech maskers from target speech often leads to a large intelligibility improvement. Modeling this phenomenon has long been of interest to binaural-hearing researchers for uncovering brain mechanisms and for improving signal-processing algorithms in hearing-assistive devices. Much of the previous binaural modeling work focused on the unmasking enabled by binaural cues at the periphery, and little quantitative modeling has been directed toward the grouping or source-separation benefits of binaural processing. In this article, we propose a binaural model that focuses on grouping, specifically on the selection of time-frequency units that are dominated by signals from the direction of the target. The proposed model uses Equalization-Cancellation (EC) processing with a binary decision rule to estimate a time-frequency binary mask. EC processing is carried out to cancel the target signal and the energy change between the EC input and output is used as a feature that reflects target dominance in each time-frequency unit. The processing in the proposed model requires little computational resources and is straightforward to implement. In combination with the Coherence-based Speech Intelligibility Index, the model is applied to predict the speech intelligibility data measured by Marrone et al. The predicted speech reception threshold matches the pattern of the measured data well, even though the predicted intelligibility improvements relative to the colocated condition are larger than some of the measured data, which may reflect the lack of internal noise in this initial version of the model. © The Author(s) 2016.
ERIC Educational Resources Information Center
Wilkerson-Jerde, Michelle Hoda
2014-01-01
There are increasing calls to prepare K-12 students to use computational tools and principles when exploring scientific or mathematical phenomena. The purpose of this paper is to explore whether and how constructionist computer-supported collaborative environments can explicitly engage students in this practice. The Categorizer is a…
ERIC Educational Resources Information Center
Chan, Kit Yu Karen; Yang, Sylvia; Maliska, Max E.; Grunbaum, Daniel
2012-01-01
The National Science Education Standards have highlighted the importance of active learning and reflection for contemporary scientific methods in K-12 classrooms, including the use of models. Computer modeling and visualization are tools that researchers employ in their scientific inquiry process, and often computer models are used in…
Reflections on Component Computing from the Boxer Project's Perspective
ERIC Educational Resources Information Center
diSessa, Andrea A.
2004-01-01
The Boxer Project conducted the research that led to the synthetic review "Issues in Component Computing." This brief essay provides a platform from which to develop our general perspective on educational computing and how it relates to components. The two most important lines of our thinking are (1) the goal to open technology's creative…
Children as Educational Computer Game Designers: An Exploratory Study
ERIC Educational Resources Information Center
Baytak, Ahmet; Land, Susan M.; Smith, Brian K.
2011-01-01
This study investigated how children designed computer games as artifacts that reflected their understanding of nutrition. Ten 5th grade students were asked to design computer games with the software "Game Maker" for the purpose of teaching 1st graders about nutrition. The results from the case study show that students were able to…
Handheld Computers: A Boon for Principals
ERIC Educational Resources Information Center
Brazell, Wayne
2005-01-01
As I reflect on my many years as an elementary school principal, I realize how much more effective I would have been if I had owned a wireless handheld computer. This relatively new technology can provide considerable assistance to today's principals and recent advancements have increased its functions and capacity. Handheld computers are…
ERIC Educational Resources Information Center
Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa
2016-01-01
Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…
NASA Technical Reports Server (NTRS)
Angus, J. C.; Coffield, F. E.; Edwards, R. V.; Mann, J. A., Jr.; Rugh, R. W.; Gallagher, N. C.
1977-01-01
Computer-generated reflection holograms hold substantial promise as a means of carrying out complex machining, marking, scribing, welding, soldering, heat treating, and similar processing operations simultaneously and without moving the work piece or laser beam. In the study described, a photographically reduced transparency of a 64 x 64 element Lohmann hologram was used to make a mask which, in turn, was used (with conventional photoresist techniques) to produce a holographic reflector. Images from a commercial CO2 laser (150W TEM(00)) and the holographic reflector are illustrated and discussed.
ERIC Educational Resources Information Center
Landa-Jiménez, M. A.; González-Gaspar, P.; Pérez-Estudillo, C.; López-Meraz, M. L.; Morgado-Valle, C.; Beltran-Parrazal, L.
2016-01-01
A Muscle-Computer Interface (muCI) is a human-machine system that uses electromyographic (EMG) signals to communicate with a computer. Surface EMG (sEMG) signals are currently used to command robotic devices, such as robotic arms and hands, and mobile robots, such as wheelchairs. These signals reflect the motor intention of a user before the…
NASA Astrophysics Data System (ADS)
Liu, Xiwu; Guo, Zhiqi; Han, Xu
2018-06-01
A set of parallel vertical fractures embedded in a vertically transverse isotropy (VTI) background leads to orthorhombic anisotropy and corresponding azimuthal seismic responses. We conducted seismic modeling of full waveform amplitude variation versus azimuth (AVAZ) responses of anisotropic shale by integrating a rock physics model and a reflectivity method. The results indicate that the azimuthal variation of P-wave velocity tends to be more complicated for an orthorhombic medium than for the horizontally transverse isotropy (HTI) case, especially at high polar angles. Correspondingly, for the HTI layer in the theoretical model, the short axis of the azimuthal PP amplitudes at the top interface is parallel to the fracture strike, while the long axis of the bottom reflection is aligned with the fracture strike. In contrast, the orthorhombic layer in the theoretical model shows distinct AVAZ responses in terms of PP reflections. Nevertheless, the azimuthal signatures of the R- and T-components of the mode-converted PS reflections show similar AVAZ features for the HTI and orthorhombic layers, which may imply that the PS responses are dominated by fractures. For the application to real data, a seismic-well tie based on upscaled data and a reflectivity method illustrates good agreement between the reference layers and the corresponding reflected events. Finally, the full waveform seismic AVAZ responses of the Longmaxi shale formation are computed for the cases of HTI and orthorhombic anisotropy for comparison. For the two cases, the azimuthal features differ mainly in the amplitudes, and only slightly in the phases, of the reflected waveforms. Azimuthal variations in the PP reflections from the reference layers show distinct behaviors for the HTI and orthorhombic cases, while the mode-converted PS reflections in terms of the R- and T-components show little difference in azimuthal features. This may suggest that the behavior of the PS waves is dominated by vertically aligned fractures. This work provides further insight into the azimuthal seismic response of orthorhombic shales. The proposed method may help to improve the seismic-well tie, seismic interpretation, and inversion results using an azimuthal anisotropy dataset.
Tissues viability and blood flow sensing based on a new nanophotonics method
NASA Astrophysics Data System (ADS)
Yariv, Inbar; Haddad, Menashe; Duadi, Hamootal; Motiei, Menachem; Fixler, Dror
2018-02-01
Extracting optical parameters of turbid medium (e.g. tissue) by light reflectance signals is of great interest and has many applications in the medical world, life science, material analysis and biomedical optics. The reemitted light from an irradiated tissue is affected by the light's interaction with the tissue components and contains the information about the tissue structure and physiological state. In this research we present a novel noninvasive nanophotonics technique, i.e., iterative multi-plane optical property extraction (IMOPE) based on reflectance measurements. The reflectance based IMOPE was applied for tissue viability examination, detection of gold nanorods (GNRs) within the blood circulation as well as blood flow detection using the GNRs presence within the blood vessels. The basics of the IMOPE combine a simple experimental setup for recording light intensity images with an iterative Gerchberg-Saxton (G-S) algorithm for reconstructing the reflected light phase and computing its standard deviation (STD). Changes in tissue composition affect its optical properties which results in changes in the light phase that can be measured by its STD. This work presents reflectance based IMOPE tissue viability examination, producing a decrease in the computed STD for older tissues, as well as investigating their organic material absorption capability. Finally, differentiation of the femoral vein from adjacent tissues using GNRs and the detection of their presence within blood circulation and tissues are also presented with high sensitivity (better than computed tomography) to low quantities of GNRs (<3 mg).
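To make the iterative phase-reconstruction step concrete, the following is a minimal two-plane Gerchberg-Saxton sketch in Python (NumPy), with plain Fourier-transform propagation between planes; the propagation model, geometry, and variable names are illustrative assumptions for this sketch, not the authors' multi-plane IMOPE implementation.

import numpy as np

def gerchberg_saxton(amp_plane1, amp_plane2, n_iter=100):
    """Recover a phase consistent with measured amplitudes in two planes.

    amp_plane1, amp_plane2 : 2-D arrays of measured light amplitudes
    (square roots of recorded intensities) in the two planes.
    Propagation between planes is modeled by a plain FFT, a simplifying
    assumption of this sketch.
    """
    phase = np.zeros_like(amp_plane1)                      # initial phase guess
    for _ in range(n_iter):
        field1 = amp_plane1 * np.exp(1j * phase)
        field2 = np.fft.fft2(field1)                       # propagate to plane 2
        field2 = amp_plane2 * np.exp(1j * np.angle(field2))  # enforce measured amplitude
        field1 = np.fft.ifft2(field2)                      # propagate back to plane 1
        phase = np.angle(field1)                           # keep reconstructed phase
    return phase

# The standard deviation of the reconstructed phase is the scalar feature
# used to track changes in tissue optical properties.
rng = np.random.default_rng(0)
a1 = rng.random((64, 64))
a2 = np.abs(np.fft.fft2(a1 * np.exp(1j * rng.random((64, 64)))))
phi = gerchberg_saxton(a1, a2)
print("phase STD:", phi.std())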
NASA Astrophysics Data System (ADS)
Chang, Chun-Hung; Myers, Erinn M.; Kennelly, Michael J.; Fried, Nathaniel M.
2017-01-01
Near-infrared laser energy in conjunction with applied tissue cooling is being investigated for thermal remodeling of the endopelvic fascia during minimally invasive treatment of female stress urinary incontinence. Previous computer simulations of light transport, heat transfer, and tissue thermal damage have shown that a transvaginal approach is more feasible than a transurethral approach. However, results were suboptimal, and some undesirable thermal insult to the vaginal wall was still predicted. This study uses experiments and computer simulations to explore whether application of an optical clearing agent (OCA) can further improve optical penetration depth and completely preserve the vaginal wall during subsurface treatment of the endopelvic fascia. Several different mixtures of OCA's were tested, and 100% glycerol was found to be the optimal agent. Optical transmission studies, optical coherence tomography, reflection spectroscopy, and computer simulations [including Monte Carlo (MC) light transport, heat transfer, and Arrhenius integral model of thermal damage] using glycerol were performed. The OCA produced a 61% increase in optical transmission through porcine vaginal wall at 37°C after 30 min. The MC model showed improved energy deposition in endopelvic fascia using glycerol. Without OCA, 62%, 37%, and 1% of energy was deposited in vaginal wall, endopelvic fascia, and urethral wall, respectively, compared with 50%, 49%, and 1% using OCA. Use of OCA also resulted in 0.5-mm increase in treatment depth, allowing potential thermal tissue remodeling at a depth of 3 mm with complete preservation of the vaginal wall.
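For reference, the Arrhenius integral model of thermal damage mentioned above accumulates damage as Omega = integral of A*exp(-Ea/(R*T(t))) dt over the heating history. A minimal numerical sketch follows; the default frequency factor A and activation energy Ea are commonly cited soft-tissue placeholder values, not the coefficients used in this study.

import numpy as np

def arrhenius_damage(temps_kelvin, times_s, A=3.1e98, Ea=6.28e5, R=8.314):
    """Accumulate the Arrhenius damage integral Omega for a temperature history.

    temps_kelvin : array of tissue temperatures (K) at the sample times
    times_s      : array of sample times (s)
    A, Ea        : frequency factor (1/s) and activation energy (J/mol); the
                   defaults are generic literature values used here only as
                   placeholders, not the paper's coefficients.
    Omega >= 1 is conventionally taken as the threshold for irreversible damage.
    """
    rate = A * np.exp(-Ea / (R * np.asarray(temps_kelvin, dtype=float)))
    return np.trapz(rate, np.asarray(times_s, dtype=float))

# Example: tissue held at a constant 60 degrees C for 30 s
t = np.linspace(0.0, 30.0, 301)
T = np.full_like(t, 273.15 + 60.0)
print("Omega:", arrhenius_damage(T, t))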
75 FR 60415 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-30
... computer systems and networks. This information collection is required to obtain the necessary data... card reflecting those benefits and privileges, and to maintain a centralized database of the eligible...
Exact Rayleigh scattering calculations for use with the Nimbus-7 Coastal Zone Color Scanner.
Gordon, H R; Brown, J W; Evans, R H
1988-03-01
For improved analysis of Coastal Zone Color Scanner (CZCS) imagery, the radiance reflected from a plane-parallel atmosphere and flat sea surface in the absence of aerosols (Rayleigh radiance) has been computed with an exact multiple scattering code, i.e., including polarization. The results indicate that the single scattering approximation normally used to compute this radiance can cause errors of up to 5% for small and moderate solar zenith angles. At large solar zenith angles, such as encountered in the analysis of high-latitude imagery, the errors can become much larger, e.g., >10% in the blue band. The single scattering error also varies along individual scan lines. Comparison with multiple scattering computations using scalar transfer theory, i.e., ignoring polarization, shows that scalar theory can yield errors of approximately the same magnitude as single scattering when compared with exact computations at small to moderate values of the solar zenith angle. The exact computations can be easily incorporated into CZCS processing algorithms, and, for application to future instruments with higher radiometric sensitivity, a scheme is developed with which the effect of variations in the surface pressure could be easily and accurately included in the exact computation of the Rayleigh radiance. Direct application of these computations to CZCS imagery indicates that accurate atmospheric corrections can be made with solar zenith angles at least as large as 65 degrees and probably up to at least 70 degrees with a more sensitive instrument. This suggests that the new Rayleigh radiance algorithm should produce more consistent pigment retrievals, particularly at high latitudes.
[The research on bidirectional reflectance computer simulation of forest canopy at pixel scale].
Song, Jin-Ling; Wang, Jin-Di; Shuai, Yan-Min; Xiao, Zhi-Qiang
2009-08-01
Computer simulation uses computer graphics to generate a realistic 3D structural scene of vegetation and simulates the canopy radiation regime using the radiosity method. In the present paper, the authors extend the computer simulation model to simulate forest canopy bidirectional reflectance at pixel scale. However, trees are complex structures, tall and with many branches, so hundreds of thousands or even millions of facets are typically needed to build a realistic structural scene of a forest, and it is difficult for the radiosity method to compute so many facets. To enable the radiosity method to simulate the forest scene at pixel scale, the authors propose simplifying the structure of the forest crowns by abstracting them as ellipsoids. Based on the optical characteristics of the tree components and on the internal photon transport within a real crown, optical characteristics are assigned to the ellipsoid surface facets. In the forest simulation, following the idea of geometric optics models, a gap model is incorporated to obtain the forest canopy bidirectional reflectance at pixel scale. Comparing the computer simulation results with the GOMS model and with Multi-angle Imaging SpectroRadiometer (MISR) multi-angle remote sensing data, the simulation results are in agreement with the GOMS simulation result and the MISR BRF. Although some problems remain to be solved, the study has important value for the application of multi-angle remote sensing and the inversion of vegetation canopy structure parameters.
Perioperative self-reflection among surgical residents.
Peshkepija, Andi N; Basson, Marc D; Davis, Alan T; Ali, Muhammad; Haan, Pam S; Gupta, Rama N; Hardaway, John C; Nebeker, Cody A; McLeod, Michael K; Osmer, Robert L; Anderson, Cheryl I
2017-09-01
We studied the prevalence and predictors of meaningful self-reflection among surgical residents and, with prompting and structured interventions, sought to improve and sustain resident skills. Residents from six programs (120 residents) recorded 1032 narrative self-reflective comments using a web-based platform. If residents identified something learned or to be improved, self-reflection was deemed meaningful. Independent variables (PGY level, resident/surgeon gender, study site, and study phase: Phase 1, July 2014-August 2015, vs. Phase 2, September 2015-September 2016) were analyzed. Meaningful self-reflection was documented in 40.6% (419/1032) of entries. PGY5s meaningfully self-reflected less than PGY1-4s, 26.1% vs. 49.6% (p = 0.002). In multivariate analysis, resident narratives during Phase 2 were 4.7 times more likely to engage in meaningful self-reflection compared to Phase 1 entries (p < 0.001). Iterative changes during Phase 2 showed a 236% increase in meaningful self-reflection compared to Phase 1. Surgical residents uncommonly practice meaningful self-reflection, even when prompted, and PGY5/chief residents reflect less than more junior residents. Substantial and sustained improvements in resident self-reflection can occur with both training and interventions. Copyright © 2017 Elsevier Inc. All rights reserved.
Design and Construction of a Field Capable Snapshot Hyperspectral Imaging Spectrometer
NASA Technical Reports Server (NTRS)
Arik, Glenda H.
2005-01-01
The computed-tomography imaging spectrometer (CTIS) is a device which captures the spatial and spectral content of a rapidly evolving scene in a single image frame. The most recent CTIS design is optically all-reflective and uses as its dispersive device a state-of-the-art reflective computer-generated hologram (CGH). This project focuses on the instrument's transition from laboratory to field. This design will enable the CTIS to withstand a harsh desert environment. The system is modeled in optical design software using a tolerance analysis. The tolerances guide the design of the athermal mount and component parts. The parts are assembled into a working mount shell where the performance of the mounts is tested for thermal integrity. An interferometric analysis of the reflective CGH is also performed.
The effect of oceanic whitecaps and foams on pulse-limited radar altimeters
NASA Technical Reports Server (NTRS)
Zheng, Q. A.; Klemas, V.; Hayne, G. S.; Huang, N. E.
1983-01-01
Based on the electromagnetic field theory of stratified media, the microwave reflectivity of a sea surface covered by whitecaps and foams at 13.9 GHz was computed. The computed results show that the reflectivity declines with increasing foam thickness. The reflectivity of the sea surface without any whitecaps or foams is 0.6066 (20 °C, S = 35 per thousand), but it will be less than 0.15 when the thickness of the foam cover is more than 0.3 cm. Combining in situ measurements of whitecap and foam coverage with whitecapping models, it can be shown that the effect of oceanic whitecaps and foams on the measured results of a pulse-limited radar altimeter working at high frequencies will not be negligible in high sea state conditions.
NASA Astrophysics Data System (ADS)
Kun, Luis G.
1994-12-01
On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called `Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.
Choi, Bernard C.K.
2015-01-01
This article provides insights into the future based on a review of the past and present of public health surveillance—the ongoing systematic collection, analysis, interpretation, and dissemination of health data for the planning, implementation, and evaluation of public health action. Public health surveillance dates back to the first recorded epidemic in 3180 BC in Egypt. A number of lessons and items of interest are summarised from a review of historical perspectives in the past 5,000 years and the current practice of surveillance. Some future scenarios are presented: exploring new frontiers; enhancing computer technology; improving epidemic investigations; improving data collection, analysis, dissemination and use; building on lessons from the past; building capacity; and enhancing global surveillance. It is concluded that learning from the past, reflecting on the present, and planning for the future can further enhance public health surveillance. PMID:29546093
Vectorial approach of determining the wave propagation at metasurfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Daniel, E-mail: D.Smith1966@outlook.com; Campbell, Michael, E-mail: mhl.campbell@gmail.com; Bergmann, Andreas, E-mail: a.bergmann@hotmail.com
2015-10-15
Vector approach often benefits optical engineers and physicists, and a vector formulation of the laws of reflection and refraction has been studied (Tkaczyk, 2012). However, the conventional reflection and refraction laws may be violated in the presence of a metasurface, and reflection and refraction at the metasurface obey generalized laws of reflection and refraction (Yu et al., 2011). In this letter, the vectorial laws of reflection and refraction at the metasurface were derived, and the matrix formulation of these vectorial laws are also obtained. These results enable highly efficient and unambiguous computations in ray-tracing problems that involve a metasurface.
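To make the "generalized laws" concrete, the scalar form of the generalized refraction law of Yu et al. (2011) cited above is n_t sin(theta_t) = n_i sin(theta_i) + (lambda0 / 2 pi) dPhi/dx, where dPhi/dx is the metasurface phase gradient along the interface. The short Python sketch below evaluates it numerically; it illustrates only the scalar case, not the vectorial and matrix formulations derived in the letter, and the example numbers are arbitrary.

import numpy as np

def generalized_refraction(theta_i_deg, n_i, n_t, lambda0, dphi_dx):
    """Refraction angle at a metasurface with an interfacial phase gradient.

    theta_i_deg : angle of incidence (degrees)
    n_i, n_t    : refractive indices of the incident and transmission media
    lambda0     : free-space wavelength (same length unit as 1/dphi_dx)
    dphi_dx     : phase gradient along the interface (rad per length unit)
    Returns the refraction angle in degrees, or None if no propagating
    refracted wave exists for this geometry.
    """
    theta_i = np.radians(theta_i_deg)
    s = (n_i * np.sin(theta_i) + lambda0 * dphi_dx / (2.0 * np.pi)) / n_t
    if abs(s) > 1.0:
        return None
    return np.degrees(np.arcsin(s))

# Example: normal incidence in air across a linear phase gradient of 2*pi rad
# per 10 micrometers, at an 800 nm wavelength (arbitrary illustrative values).
print(generalized_refraction(0.0, 1.0, 1.0, 0.8e-6, 2.0 * np.pi / 10e-6))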
Computer usage and national energy consumption: Results from a field-metering study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desroches, Louis-Benoit; Fuchs, Heidi; Greenblatt, Jeffery
The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflective of laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses. Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated power of power supplies to computing needs, and improving the efficiency of individual components.
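As a quick consistency check on the reported desktop figures, the mean On-mode power implied by a unit annual energy consumption and daily usage can be back-computed. The sketch below does this under the simplifying assumption, made only for this illustration and not stated in the report, that essentially all of the annual energy is consumed during active use (i.e., neglecting sleep and off power).

# Back-of-the-envelope check: implied mean On-mode power from the reported
# desktop figures (194 kWh/yr unit AEC, 7.3 h/day of use). Neglecting
# sleep/off consumption is a simplifying assumption of this sketch.
aec_kwh_per_year = 194.0
hours_per_day = 7.3

on_hours_per_year = hours_per_day * 365.0
mean_on_power_w = aec_kwh_per_year * 1000.0 / on_hours_per_year
print(f"Implied mean On-mode power: {mean_on_power_w:.0f} W")  # roughly 73 W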
Normalized Temperature Contrast Processing in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2016-01-01
The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, along with methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves use of a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as the emissivity of the object, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
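For orientation, a commonly used form of normalized pixel-intensity contrast in flash thermography subtracts the pre-flash frame from both the inspected pixel and a reference (sound or reference-material) region and takes their ratio. The sketch below implements that generic definition only as an illustration; it is an assumption of this sketch, not necessarily the exact normalization defined in the paper, and the array names are invented.

import numpy as np

def normalized_contrast(test_pixel, ref_region, pre_flash_frame=0):
    """Generic normalized pixel-intensity contrast for flash thermography.

    test_pixel : 1-D array, intensity of the inspected pixel over time
    ref_region : 1-D array, mean intensity of a reference region over the
                 same frames
    The pre-flash frame is subtracted from both signals and the test signal
    is normalized by the reference signal. This is a commonly used generic
    definition, given for illustration only.
    """
    test_pixel = np.asarray(test_pixel, dtype=float)
    ref_region = np.asarray(ref_region, dtype=float)
    d_test = test_pixel - test_pixel[pre_flash_frame]
    d_ref = ref_region - ref_region[pre_flash_frame]
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(d_ref != 0.0, d_test / d_ref, 0.0)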
NASA Astrophysics Data System (ADS)
Bloomquist, Debra L.
This embedded mixed-methods study examined whether the use of a reflective framework with guiding prompts could support early childhood science teachers in improving their reflective practice and subsequently changing their pedagogy. It further investigated whether the type of cognitive coaching group, individual or collaborative, impacted teacher depth of reflection and change in practice. Data included teacher reflections that were rated using the Level of Reflection-On-Action Assessment, reflective codes and inductive themes, as well as videos of participants' lessons coded using the SCIIENCE instrument. Findings demonstrated that through guided reflection, teachers developed reflective thinking skills, and through this reflection became more critical and began to improve their pedagogical practice. Further findings supported that collaborative cognitive coaching may not be the most effective professional development for all teachers, as some teachers in the study were found to have difficulty improving their reflectivity and thus their teaching practice. Based on these findings it is recommended that coaches and designers of professional development continue to use reflective frameworks with guiding prompts to support teachers in the reflective process, but take into consideration that coaching may need to be differentiated for the various reflective levels demonstrated by teachers. Future studies will be needed to establish why some teachers have difficulty with the reflective process and how coaches or designers of professional development can further assist these teachers in becoming more critical reflectors.
ERIC Educational Resources Information Center
Iowa Univ., Iowa City. Computer Center.
In most instances, the papers in this collection present information reflecting the current status of computer usage in education and offer substantive forecasts for academic computing. Two speeches from the special ceremony for the renaming of the computing center in honor of Gerard P. Weeg, which was held as part of the two-day national computer…
NASA Astrophysics Data System (ADS)
Fonseca, E. S. R.; de Jesus, M. E. P.
2007-07-01
The estimation of optical properties of highly turbid and opaque biological tissue is a difficult task since conventional purely optical methods rapidly lose sensitivity as the mean photon path length decreases. Photothermal methods, such as pulsed or frequency domain photothermal radiometry (FD-PTR), on the other hand, show remarkable sensitivity in experimental conditions that produce very feeble optical signals. Photothermal radiometry is primarily sensitive to the absorption coefficient, yielding considerably higher estimation errors on scattering coefficients. Conversely, purely optical methods such as Local Diffuse Reflectance (LDR) depend mainly on the scattering coefficient and yield much better estimates of this parameter. Therefore, at moderate transport albedos, the combination of photothermal and reflectance methods can improve considerably the sensitivity of detection of tissue optical properties. The authors have recently proposed a novel method that combines FD-PTR with LDR, aimed at improving sensitivity in the determination of both optical properties. Signal analysis was performed by globally fitting the experimental data to forward models based on Monte Carlo simulations. Although this approach is accurate, the associated computational burden often limits its use as a forward model. Therefore, the application of analytical models based on the diffusion approximation offers a faster alternative. In this work, we propose the calculation of the diffuse reflectance and the fluence rate profiles under the δ-P1 approximation. This approach is known to approximate fluence rate expressions close to collimated sources and boundaries better than the standard diffusion approximation (SDA). We extend this study to the calculation of the diffuse reflectance profiles. The ability of the δ-P1-based model to provide good estimates of the absorption, scattering and anisotropy coefficients is tested against Monte Carlo simulations over a wide range of scattering to absorption ratios. Experimental validation of the proposed method is accomplished by a set of measurements on solid absorbing and scattering phantoms.
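For orientation, the standard diffusion approximation quantities that underlie such reflectance models follow directly from the absorption and scattering coefficients; a minimal sketch is given below. These are the SDA formulas only, not the δ-P1 expressions derived in the paper, and the example numbers are generic tissue-like placeholders.

import numpy as np

def sda_parameters(mu_a, mu_s, g):
    """Standard diffusion approximation (SDA) transport quantities.

    mu_a : absorption coefficient (1/mm)
    mu_s : scattering coefficient (1/mm)
    g    : scattering anisotropy factor
    Returns (reduced scattering coefficient, diffusion coefficient,
    effective attenuation coefficient). The delta-P1 model discussed in
    the paper modifies the behavior near collimated sources and
    boundaries; only the SDA quantities are computed here.
    """
    mu_s_red = mu_s * (1.0 - g)            # reduced scattering coefficient (1/mm)
    D = 1.0 / (3.0 * (mu_a + mu_s_red))    # diffusion coefficient (mm)
    mu_eff = np.sqrt(mu_a / D)             # effective attenuation (1/mm)
    return mu_s_red, D, mu_eff

# Example: generic soft-tissue-like values at a red/NIR wavelength
print(sda_parameters(mu_a=0.01, mu_s=10.0, g=0.9))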
NASA Technical Reports Server (NTRS)
Molthan, Andrew L.; Petersen, Walter A.; Case, Jonathan L.; Dembek, Scott R.
2009-01-01
Increases in computational resources have allowed operational forecast centers to pursue experimental, high resolution simulations that resolve the microphysical characteristics of clouds and precipitation. These experiments are motivated by a desire to improve the representation of weather and climate, but will also benefit current and future satellite campaigns, which often use forecast model output to guide the retrieval process. The combination of reliable cloud microphysics and radar reflectivity may constrain radiative transfer models used in satellite simulators during future missions, including EarthCARE and the NASA Global Precipitation Measurement. Aircraft, surface, and radar data from the Canadian CloudSat/CALIPSO Validation Project are used to check the validity of size distribution and density characteristics for snowfall simulated by the NASA Goddard six-class, single-moment bulk water microphysics scheme, currently available within the Weather Research and Forecasting (WRF) Model. Widespread snowfall developed across the region on January 22, 2007, forced by the passage of a midlatitude cyclone, and was observed by the dual-polarimetric C-band radar at King City, Ontario, as well as the NASA 94 GHz CloudSat Cloud Profiling Radar. Combined, these data sets provide key metrics for validating model output: estimates of size distribution parameters fit to the inverse-exponential equations prescribed within the model, bulk density and crystal habit characteristics sampled by the aircraft, and representation of size characteristics as inferred by the radar reflectivity at C- and W-band. Specified constants for the distribution intercept and density differ significantly from observations throughout much of the cloud depth. Alternate parameterizations are explored, using column-integrated values of vapor excess to avoid problems encountered with temperature-based parameterizations in an environment where inversions and isothermal layers are present. Simulation of CloudSat reflectivity is performed by adopting the discrete-dipole parameterizations and databases provided in the literature, and demonstrates an improved capability in simulating radar reflectivity at W-band versus Mie scattering assumptions.
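For the inverse-exponential size distributions mentioned above, N(D) = N0 exp(-lambda D), the Rayleigh-regime radar reflectivity factor is simply the sixth moment of the distribution, Z = N0 * 6! / lambda^7. The sketch below evaluates this closed form; it uses the Rayleigh simplification (which the abstract notes is improved upon by discrete-dipole databases at W-band), ignores ice density/dielectric corrections, and the intercept and slope values are placeholders, not values from the study.

import math

def rayleigh_reflectivity_exponential(N0, lam):
    """Sixth moment of an inverse-exponential size distribution.

    N(D) = N0 * exp(-lam * D), with N0 in m^-3 mm^-1 and lam in mm^-1,
    gives a Rayleigh-regime reflectivity factor
        Z = integral N(D) D^6 dD = N0 * 6! / lam^7   [mm^6 m^-3].
    Ice-dielectric/density corrections and Mie (non-Rayleigh) effects,
    which matter at W-band, are ignored in this sketch.
    """
    return N0 * math.factorial(6) / lam ** 7

# Placeholder distribution parameters, for illustration only
N0 = 1.0e3      # intercept, m^-3 mm^-1
lam = 2.0       # slope, mm^-1
Z = rayleigh_reflectivity_exponential(N0, lam)
print("Z =", Z, "mm^6 m^-3 =", 10.0 * math.log10(Z), "dBZ")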
Prerequisites for sustainable care improvement using the reflective team as a work model.
Jonasson, Lise-Lotte; Carlsson, Gunilla; Nyström, Maria
2014-01-01
Several work models for care improvement have been developed in order to meet the requirement for evidence-based care. This study examines a work model for reflection, entitled the reflective team (RT). The main idea behind RTs is that caring skills exist among those who work closest to the patients. The team leader (RTL) encourages sustainable care improvement, rooted in research and proven experience, by using a lifeworld perspective to stimulate further reflection and a developmental process leading to research-based caring actions within the team. In order to maintain focus, it is important that the RTL has a clear idea of what sustainable care improvement means, and what the prerequisites are for such improvement. The aim of the present study is, therefore, to explore the prerequisites for improving sustainable care, seeking to answer how RTLs perceive these and use RTs for concrete planning. Nine RTLs were interviewed, and their statements were phenomenographically analysed. The analysis revealed three separate qualitative categories, which describe personal, interpersonal, and structural aspects of the prerequisites. In the discussion, these categories are compared with previous research on reflection, and the conclusion is reached that the optimal conditions for RTs to work, when focussed on sustainable care improvement, occur when the various aspects of the prerequisites are intertwined and become a natural part of the reflective work.
ERIC Educational Resources Information Center
Ndirangu, Mwangi; Kiboss, Joel K.; Wekesa, Eric W.
2005-01-01
The application of computer technology in education is a relatively new approach that is trying to justify inclusion in the Kenyan school curriculum. Being abstract, with a dynamic nature that does not manifest itself visibly, the process of cell division has posed difficulties for teachers. Consequently, a computer simulation program, using…
Integrated Visible Photonics for Trapped-Ion Quantum Computing
2017-06-10
Kharas, Dave; Sorace-Agaskar, Cheryl; Bramhavar, Suraj; Loh, William; Sage, Jeremy M.; Juodawlkis, Paul W.; John...
A scalable trapped-ion-based quantum-computing architecture requires the... coherence times, strong Coulomb interactions, and optical addressability hold great promise for implementation of practical quantum information... (...does not necessarily reflect the views of the Department of Defense.)
26 CFR 301.6231(f)-1 - Disallowance of losses and credits in certain cases.
Code of Federal Regulations, 2011 CFR
2011-04-01
... United States. (b) Computational adjustment permitted if return is not filed after mailing of notice... computational adjustment to that partner to reflect the disallowance of any loss (including a capital loss) or... computational adjustment referred to in paragraph (b) of this section may be mailed on a day on which— (1) The...
26 CFR 301.6231(f)-1 - Disallowance of losses and credits in certain cases.
Code of Federal Regulations, 2014 CFR
2014-04-01
... United States. (b) Computational adjustment permitted if return is not filed after mailing of notice... computational adjustment to that partner to reflect the disallowance of any loss (including a capital loss) or... computational adjustment referred to in paragraph (b) of this section may be mailed on a day on which— (1) The...
26 CFR 301.6231(f)-1 - Disallowance of losses and credits in certain cases.
Code of Federal Regulations, 2010 CFR
2010-04-01
... United States. (b) Computational adjustment permitted if return is not filed after mailing of notice... computational adjustment to that partner to reflect the disallowance of any loss (including a capital loss) or... computational adjustment referred to in paragraph (b) of this section may be mailed on a day on which— (1) The...
26 CFR 301.6231(f)-1 - Disallowance of losses and credits in certain cases.
Code of Federal Regulations, 2013 CFR
2013-04-01
... United States. (b) Computational adjustment permitted if return is not filed after mailing of notice... computational adjustment to that partner to reflect the disallowance of any loss (including a capital loss) or... computational adjustment referred to in paragraph (b) of this section may be mailed on a day on which— (1) The...
26 CFR 301.6231(f)-1 - Disallowance of losses and credits in certain cases.
Code of Federal Regulations, 2012 CFR
2012-04-01
... United States. (b) Computational adjustment permitted if return is not filed after mailing of notice... computational adjustment to that partner to reflect the disallowance of any loss (including a capital loss) or... computational adjustment referred to in paragraph (b) of this section may be mailed on a day on which— (1) The...
The Intersection of Community-Based Writing and Computer-Based Writing: A Cyberliteracy Case Study.
ERIC Educational Resources Information Center
Gabor, Catherine
The learning goals that inform service learning as a whole can contribute to the computers and writing field significantly. This paper demonstrates how two lines of inquiry can be furthered, community-based writing and computers and writing, through new data and critical reflection on learning goals and communication tools. The paper presents a…
Automated Interval velocity picking for Atlantic Multi-Channel Seismic Data
NASA Astrophysics Data System (ADS)
Singh, Vishwajit
2016-04-01
This paper describes the challenges in developing and testing a fully automated routine for measuring interval velocities from multi-channel seismic data. Various approaches are employed to generate an algorithm that picks interval velocities for 1000-5000 consecutive normal-moveout (NMO) corrected gathers, replacing the interpreter's effort of manually picking coherent reflections. The detailed steps and pitfalls of picking interval velocities from seismic reflection time measurements are described for these approaches. Key ingredients of the velocity-analysis stage are the semblance grid and the starting interval-velocity model. Basin-hopping optimization is employed to minimize the misfit function across local minima. A SLiding-Overlapping Window (SLOW) algorithm is designed to mitigate the non-linearity and ill-posedness of the root-mean-square velocity problem. Synthetic-data case studies address the performance of the velocity picker, generating models that fit the semblance peaks. A similar linear relationship between average depth and reflection time for the synthetic and estimated models suggests that the picked interval velocities can serve as a starting model for full waveform inversion to project a more accurate velocity structure of the subsurface. The challenges can be categorized as (1) building an accurate starting model for projecting a more accurate velocity structure of the subsurface, and (2) reducing the computational cost of the algorithm by pre-calculating the semblance grid to make auto-picking more feasible.
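A minimal sketch of how basin-hopping can be wrapped around a velocity-misfit function, using scipy.optimize.basinhopping. The misfit here is a toy least-squares comparison of Dix-style RMS velocities against made-up observations, standing in for the semblance-based misfit described above; all data values and names are illustrative.

import numpy as np
from scipy.optimize import basinhopping

# Toy "observed" RMS velocities at a few reflection times (placeholders).
t_obs = np.array([0.5, 1.0, 1.5, 2.0])               # s
v_obs = np.array([1800.0, 2100.0, 2450.0, 2700.0])   # m/s

def misfit(v_int):
    """Least-squares misfit between RMS velocities predicted from interval
    velocities v_int (one per equal-travel-time layer) and the toy
    observations above. Stands in for the semblance-based misfit."""
    v_int = np.abs(v_int)
    # RMS velocity for equal-time layers: sqrt of the cumulative mean of v^2
    v_rms = np.sqrt(np.cumsum(v_int**2) / np.arange(1, v_int.size + 1))
    return np.sum((v_rms - v_obs) ** 2)

x0 = np.full(4, 2000.0)   # starting interval-velocity model
result = basinhopping(misfit, x0, niter=200, seed=1,
                      minimizer_kwargs={"method": "L-BFGS-B"})
print("picked interval velocities:", result.x)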
Reconstruction of structural damage based on reflection intensity spectra of fiber Bragg gratings
NASA Astrophysics Data System (ADS)
Huang, Guojun; Wei, Changben; Chen, Shiyuan; Yang, Guowei
2014-12-01
We present an approach for structural damage reconstruction based on the reflection intensity spectra of fiber Bragg gratings (FBGs). Our approach incorporates the finite element method, transfer matrix (T-matrix), and genetic algorithm to solve the inverse photo-elastic problem of damage reconstruction, i.e. to identify the location, size, and shape of a defect. By introducing a parameterized characterization of the damage information, the inverse photo-elastic problem is reduced to an optimization problem, and a relevant computational scheme was developed. The scheme iteratively searches for the solution to the corresponding direct photo-elastic problem until the simulated and measured (or target) reflection intensity spectra of the FBGs near the defect coincide within a prescribed error. Proof-of-concept validations of our approach were performed numerically and experimentally using both holed and cracked plate samples as typical cases of plane-stress problems. The damage identifiability was simulated by changing the deployment of the FBG sensors, including the total number of sensors and their distance to the defect. Both the numerical and experimental results demonstrate that our approach is effective and promising. It provides us with a photo-elastic method for developing a remote, automatic damage-imaging technique that substantially improves damage identification for structural health monitoring.
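The inverse search can be prototyped with any global optimizer over the parameterized damage description. The sketch below uses SciPy's differential evolution, a relative of the genetic algorithm used in the paper, to match simulated FBG reflection-intensity spectra to a target; the forward model here is a toy stand-in for the FEM/T-matrix computation, and all parameter names and bounds are illustrative.

import numpy as np
from scipy.optimize import differential_evolution

def simulate_fbg_spectra(params, wavelengths):
    """Toy stand-in forward model: returns an FBG reflection intensity
    spectrum for a damage description params = (x, y, radius). In the actual
    approach this would be the FEM strain field fed through the
    transfer-matrix (T-matrix) model of each grating."""
    x, y, r = params
    return (np.exp(-((wavelengths - 1550.0 - 0.1 * x) ** 2) / (0.01 + 0.001 * r))
            / (1.0 + 0.05 * y))

wavelengths = np.linspace(1549.0, 1551.0, 200)
target = simulate_fbg_spectra((2.0, 1.0, 3.0), wavelengths)   # "measured" spectrum

def objective(params):
    # Misfit between simulated and measured reflection intensity spectra.
    return np.sum((simulate_fbg_spectra(params, wavelengths) - target) ** 2)

bounds = [(0.0, 10.0), (0.0, 10.0), (0.5, 5.0)]   # defect location x, y and radius
result = differential_evolution(objective, bounds, seed=0, tol=1e-8)
print("recovered damage parameters:", result.x)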
[Modification of nursing practice through reflection: participatory action research].
Delgado Hito, P; Sola Prado, A; Mirabete Rodríguez, I; Torrents Ros, R; Blasco Afonso, M; Barrero Pedraza, R; Catalá Gil, N; Mateos Dávila, A; Quinteiro Canedo, M
2001-01-01
Technology and complex techniques are inevitably playing an increasing role in intensive care units. They continue to characterize nursing care and in some cases dehumanize it. The general aim of this study was to stimulate reflection on nursing care. The study was based on the participation of the investigators with the goal of producing changes in nursing practice. Qualitative methodology in the form of participatory action research and the Kemmis and McTaggart method were used. Data were collected through systematic observation, seven group meetings and document analysis. Eight nurses took part in the study. The meetings were recorded and transcribed verbatim into a computer. This process and the meaning of the verbatim transcriptions (codification/categorization process and document synthesis cards) were analyzed. The results of this study enabled exploration of the change in nursing practice and showed that the reflection-in-action method stimulates changes in practice. The new way of conceiving nursing action has increased the quality and humanization of nursing care, since it shows greater respect for the patient, provides families with closer contact and greater support, improves coordination of nursing care acts and increases collaboration among professionals. In conclusion, participatory action research is a valid and appropriate method that nurses can use to modify their daily practice.
Real-time analysis keratometer
NASA Technical Reports Server (NTRS)
Adachi, Iwao P. (Inventor); Adachi, Yoshifumi (Inventor); Frazer, Robert E. (Inventor)
1987-01-01
A computer-assisted keratometer in which a fiducial line pattern reticle illuminated by CW or pulsed laser light is projected on a corneal surface through lenses, a prismoidal beamsplitter, a quarterwave plate, and objective optics. The reticle surface is curved as a conjugate of an ideal corneal curvature. The fiducial image reflected from the cornea undergoes a polarization shift through the quarterwave plate and beamsplitter, whereby the projected and reflected beams are separated and directed orthogonally. The reflected-beam fiducial pattern forms a moire pattern with a replica of the first reticle. This moire pattern contains transverse aberration due to differences in curvature between the cornea and the ideal corneal curvature. The moire pattern is analyzed in real time by a computer which displays either the CW moire pattern or a pulsed-mode analysis of the transverse aberration of the cornea under observation, in real time. With the eye focused on a plurality of fixation points in succession, a survey of the entire corneal topography is made, and a contour map or three-dimensional plot of the cornea can be made as a computer readout in addition to corneal radius and refractive power analysis.
NASA Astrophysics Data System (ADS)
Todoran, D.; Todoran, R.; Anitas, E. M.; Szakacs, Zs.
2017-12-01
This paper presents results concerning the optical and electrical properties of natural galena mineral and of the interface layer formed between it and a potassium ethyl xanthate solution. The applied experimental method was differential optical reflectance spectroscopy over the UV-Vis/NIR spectral domain. Computations were made using the Kramers-Kronig formalism. Spectral dependencies of the electron loss functions, determined from the reflectance data obtained from the polished mineral surface, display van Hove singularities, leading to the determination of its valence band gap and electron plasma energy. Time-dependent measurement of the spectral dispersion of the relative reflectance of the film formed at the interface, using the same computational formalism, leads to the dynamical determination of the spectral variation of its optical and electrical properties. We computed the behavior of the dielectric constant (dielectric permittivity), the dielectric loss function, the refractive index and extinction coefficient, the effective valence number, and the electron loss functions. The measurements tend to stabilize when the dynamic adsorption-desorption equilibrium is reached at the interface level.
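As an illustration of the Kramers-Kronig step, the phase of the complex reflectivity can be recovered from a measured normal-incidence reflectance spectrum and then inverted through the Fresnel relation to obtain n and k. The sketch below uses a simple trapezoid rule that skips the singular point and omits the out-of-range extrapolation a real analysis needs; it is a generic KK sketch under one common sign convention, not the authors' procedure.

import numpy as np

def kk_phase(energy, reflectance):
    """Kramers-Kronig phase of the complex reflectivity from a measured
    normal-incidence reflectance spectrum R(E).

    energy      : photon energies (eV), ascending
    reflectance : measured reflectance R at those energies
    The principal-value integral is evaluated with a trapezoid rule that
    skips the singular point; extrapolation of R outside the measured
    range is omitted in this sketch.
    """
    E = np.asarray(energy, dtype=float)
    lnr = 0.5 * np.log(np.asarray(reflectance, dtype=float))  # ln sqrt(R)
    theta = np.zeros_like(E)
    for i, Ei in enumerate(E):
        denom = E**2 - Ei**2
        integrand = np.where(denom != 0.0, lnr / denom, 0.0)
        theta[i] = -(2.0 * Ei / np.pi) * np.trapz(integrand, E)
    return theta

def optical_constants(reflectance, theta):
    """n and k from R and the KK phase, assuming the normal-incidence Fresnel
    relation r = (n_complex - 1)/(n_complex + 1); sign conventions for the
    phase differ between references."""
    r = np.sqrt(reflectance) * np.exp(1j * theta)
    n_complex = (1.0 + r) / (1.0 - r)
    return n_complex.real, n_complex.imag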
Challenges and solutions for realistic room simulation
NASA Astrophysics Data System (ADS)
Begault, Durand R.
2002-05-01
Virtual room acoustic simulation (auralization) techniques have traditionally focused on answering questions related to speech intelligibility or musical quality, typically in large volumetric spaces. More recently, auralization techniques have been found to be important for the externalization of headphone-reproduced virtual acoustic images. Although externalization can be accomplished using a minimal simulation, data indicate that realistic auralizations need to be responsive to head motion cues for accurate localization. Computational demands increase when providing for the simulation of coupled spaces, small rooms lacking meaningful reverberant decays, or reflective surfaces in outdoor environments. Auditory threshold data for both early reflections and late reverberant energy levels indicate that much of the information captured in acoustical measurements is inaudible, minimizing the intensive computational requirements of real-time auralization systems. Results are presented for early reflection thresholds as a function of azimuth angle, arrival time, and sound-source type, and reverberation thresholds as a function of reverberation time and level within 250-Hz-2-kHz octave bands. Good agreement is found between data obtained in virtual room simulations and those obtained in real rooms, allowing a strategy for minimizing computational requirements of real-time auralization systems.
Three-Dimensional Printed Prosthesis for Repair of Superior Canal Dehiscence.
Kozin, Elliott D; Remenschneider, Aaron K; Cheng, Song; Nakajima, Hideko Heidi; Lee, Daniel J
2015-10-01
Outcomes following repair of superior canal dehiscence (SCD) are variable, and surgery carries a risk of persistent or recurrent SCD symptoms, as well as a risk of hearing loss and vestibulopathy. Poor outcomes may occur from inadequate repair of the SCD or mechanical insult to the membranous labyrinth. Repair of SCD using a customized, fixed-length prosthesis may address current operative limitations and improve surgical outcomes. We aim to 3-dimensionally print customized prostheses to resurface or occlude bony SCD defects. Dehiscences were created along the arcuate eminence of superior semicircular canals in cadaveric temporal bones. Prostheses were designed and created using computed tomography and a 3-dimensional printer. The prostheses occupied the superior semicircular canal defect, reflected in postrepair computed tomography scans. This novel approach to SCD repair could have advantages over current techniques. Refinement of prosthesis design and materials will be important if this approach is translated into clinical use. © American Academy of Otolaryngology-Head and Neck Surgery Foundation 2015.
Graphical Interface for the Study of Gas-Phase Reaction Kinetics: Cyclopentene Vapor Pyrolysis
NASA Astrophysics Data System (ADS)
Marcotte, Ronald E.; Wilson, Lenore D.
2001-06-01
The undergraduate laboratory experiment on the pyrolysis of gaseous cyclopentene has been modernized to improve safety, speed, and precision and to better reflect the current practice of physical chemistry. It now utilizes virtual instrument techniques to create a graphical computer interface for the collection and display of experimental data. An electronic pressure gauge has replaced the mercury manometer formerly needed in proximity to the 500 °C pyrolysis oven. Students have much better real-time information available to them and no longer require multiple lab periods to get rate constants and acceptable Arrhenius parameters. The time saved on manual data collection is used to give the students a tour of the computer interfacing hardware and software and a hands-on introduction to gas-phase reagent preparation using a research-grade high-vacuum system. This includes loading the sample, degassing it by the freeze-pump-thaw technique, handling liquid nitrogen and working through the logic necessary for each reconfiguration of the diffusion pump section and the submanifolds.
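Since the students extract rate constants and Arrhenius parameters from the pressure-time data, the final analysis step is a linear fit of ln k against 1/T. A minimal sketch follows; the rate constants in the example are made up purely to illustrate the fitting step, not measured values.

import numpy as np

# Made-up first-order rate constants k (1/s) at several oven temperatures (K),
# purely to illustrate the Arrhenius fit.
T = np.array([750.0, 765.0, 780.0, 795.0])           # K
k = np.array([2.1e-4, 4.5e-4, 9.2e-4, 1.8e-3])       # 1/s

R = 8.314  # J mol^-1 K^-1
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)  # ln k = ln A - Ea/(R T)
Ea = -slope * R                                       # activation energy, J/mol
A = np.exp(intercept)                                 # pre-exponential factor, 1/s
print(f"Ea = {Ea / 1000:.0f} kJ/mol, A = {A:.2e} s^-1")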
Blood flow estimation in gastroscopic true-color images
NASA Astrophysics Data System (ADS)
Jacoby, Raffael S.; Herpers, Rainer; Zwiebel, Franz M.; Englmeier, Karl-Hans
1995-05-01
The assessment of blood flow in the gastrointestinal mucosa might be an important factor for the diagnosis and treatment of several diseases such as ulcers, gastritis, colitis, or early cancer. The quantity of blood flow is roughly estimated by computing the spatial hemoglobin distribution in the mucosa. The presented method enables a practical realization by calculating approximately the hemoglobin concentration based on a spectrophotometric analysis of endoscopic true-color images, which are recorded during routine examinations. A system model based on the reflectance spectroscopic law of Kubelka-Munk is derived which enables an estimation of the hemoglobin concentration by means of the color values of the images. Additionally, a transformation of the color values is developed in order to improve the luminance independence. Applying this transformation and estimating the hemoglobin concentration for each pixel of interest, the hemoglobin distribution can be computed. The obtained results are mostly independent of luminance. An initial validation of the presented method is performed by a quantitative estimation of the reproducibility.
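The Kubelka-Munk relation referred to above links diffuse reflectance R to the absorption-to-scattering ratio as K/S = (1 - R)^2 / (2R), which is what makes a per-pixel hemoglobin-related index possible. The sketch below evaluates that function; the two-channel combination at the end is only an illustrative placeholder, not the calibrated, luminance-corrected transformation developed in the paper.

import numpy as np

def kubelka_munk(reflectance):
    """Kubelka-Munk function K/S = (1 - R)^2 / (2 R) for diffuse reflectance R
    in (0, 1]. K is the absorption and S the scattering coefficient of the
    layer; K/S grows with absorber (e.g., hemoglobin) concentration."""
    R = np.clip(np.asarray(reflectance, dtype=float), 1e-6, 1.0)
    return (1.0 - R) ** 2 / (2.0 * R)

# Illustrative per-pixel index from two color channels of an endoscopic image:
# the ratio of K/S at a hemoglobin-sensitive channel to a reference channel.
green = np.array([[0.32, 0.30], [0.28, 0.35]])   # normalized green reflectance
red   = np.array([[0.55, 0.57], [0.52, 0.58]])   # normalized red reflectance
index = kubelka_munk(green) / kubelka_munk(red)
print(index)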
Cestari, Andrea
2013-01-01
Predictive modeling is emerging as an important knowledge-based technology in healthcare. The interest in the use of predictive modeling reflects advances on different fronts such as the availability of health information from increasingly complex databases and electronic health records, a better understanding of causal or statistical predictors of health, disease processes and multifactorial models of ill-health and developments in nonlinear computer models using artificial intelligence or neural networks. These new computer-based forms of modeling are increasingly able to establish technical credibility in clinical contexts. The current state of knowledge is still quite young in understanding the likely future direction of how this so-called 'machine intelligence' will evolve and therefore how current relatively sophisticated predictive models will evolve in response to improvements in technology, which is advancing along a wide front. Predictive models in urology are gaining progressive popularity not only for academic and scientific purposes but also into the clinical practice with the introduction of several nomograms dealing with the main fields of onco-urology.
Brian hears: online auditory processing using vectorization over channels.
Fontaine, Bertrand; Goodman, Dan F M; Benichoux, Victor; Brette, Romain
2011-01-01
The human cochlea includes about 3000 inner hair cells which filter sounds at frequencies between 20 Hz and 20 kHz. This massively parallel frequency analysis is reflected in models of auditory processing, which are often based on banks of filters. However, existing implementations do not exploit this parallelism. Here we propose algorithms to simulate these models by vectorizing computation over frequency channels, which are implemented in "Brian Hears," a library for the spiking neural network simulator package "Brian." This approach allows us to use high-level programming languages such as Python, because with vectorized operations, the computational cost of interpretation represents a small fraction of the total cost. This makes it possible to define and simulate complex models in a simple way, while all previous implementations were model-specific. In addition, we show that these algorithms can be naturally parallelized using graphics processing units, yielding substantial speed improvements. We demonstrate these algorithms with several state-of-the-art cochlear models, and show that they compare favorably with existing, less flexible, implementations.
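The key implementation idea, vectorizing over frequency channels instead of looping over them in Python, can be illustrated with a plain NumPy sketch that filters a sound through a whole bank of bandpass filters with one broadcasted operation. This generic FFT-domain Gaussian filterbank is only an illustration of channel vectorization; it is not the Brian Hears API, nor the gammatone IIR filters used in real cochlear models, and the bandwidth rule is an assumption.

import numpy as np

fs = 44100.0
t = np.arange(int(0.1 * fs)) / fs
sound = np.random.randn(t.size)                  # white-noise test signal

# Log-spaced center frequencies, as in cochlear filterbanks.
cfs = np.logspace(np.log10(100.0), np.log10(8000.0), 32)[:, None]
bw = 0.2 * cfs                                   # crude bandwidth rule (assumption)

# Frequency-domain bank of Gaussian bandpass filters, one row per channel.
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)[None, :]
H = np.exp(-0.5 * ((freqs - cfs) / bw) ** 2)

# Vectorization over channels: a single broadcasted multiply filters the sound
# through all 32 channels at once, with no Python-level channel loop.
bank_out = np.fft.irfft(H * np.fft.rfft(sound)[None, :], n=t.size, axis=1)
print(bank_out.shape)    # (32, 4410): one filtered waveform per channel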
Multidimensional Modeling of Atmospheric Effects and Surface Heterogeneities on Remote Sensing
NASA Technical Reports Server (NTRS)
Gerstl, S. A. W.; Simmer, C.; Zardecki, A. (Principal Investigator)
1985-01-01
The overall goal of this project is to establish a modeling capability that allows a quantitative determination of atmospheric effects on remote sensing including the effects of surface heterogeneities. This includes an improved understanding of aerosol and haze effects in connection with structural, angular, and spatial surface heterogeneities. One important objective of the research is the possible identification of intrinsic surface or canopy characteristics that might be invariant to atmospheric perturbations so that they could be used for scene identification. Conversely, an equally important objective is to find a correction algorithm for atmospheric effects in satellite-sensed surface reflectances. The technical approach is centered around a systematic model and code development effort based on existing, highly advanced computer codes that were originally developed for nuclear radiation shielding applications. Computational techniques for the numerical solution of the radiative transfer equation are adapted on the basis of the discrete-ordinates finite-element method which proved highly successful for one and two-dimensional radiative transfer problems with fully resolved angular representation of the radiation field.
Continuous time transfer using GPS carrier phase.
Dach, Rolf; Schildknecht, Thomas; Springer, Tim; Dudle, Gregor; Prost, Leon
2002-11-01
The Astronomical Institute of the University of Berne is hosting one of the Analysis Centers (AC) of the International GPS Service (IGS). A network of a few GPS stations in Europe and North America is routinely analyzed for time transfer purposes, using the carrier phase observations. This work is done in the framework of a joint project with the Swiss Federal Office of Metrology and Accreditation (METAS). The daily solutions are computed independently. The resulting time transfer series show jumps of up to 1 ns at the day boundaries. A method to concatenate the daily time transfer solutions to a continuous series was developed. A continuous time series is available for a time span of more than 4 mo. The results were compared with the time transfer results from other techniques such as two-way satellite time and frequency transfer. This concatenation improves the results obtained in a daily computing scheme because a continuous time series better reflects the characteristics of continuously working clocks.
Web-Enabled Optoelectronic Particle-Fallout Monitor
NASA Technical Reports Server (NTRS)
Lineberger, Lewis P.
2008-01-01
A Web-enabled optoelectronic particle-fallout monitor has been developed as a prototype of future such instruments that (1) would be installed in multiple locations for which assurance of cleanliness is required and (2) could be interrogated and controlled in nearly real time by multiple remote users. Like prior particle-fallout monitors, this instrument provides a measure of particles that accumulate on a surface as an indication of the quantity of airborne particulate contaminants. The design of this instrument reflects requirements to: reduce the cost and complexity of its optoelectronic sensory subsystem relative to those of prior optoelectronic particle-fallout monitors while maintaining or improving capabilities; use existing network and office computers for distributed display and control; derive electric power for the instrument from a computer network, a wall outlet, or a battery; provide for Web-based retrieval and analysis of measurement data and of a file containing such ancillary data as a log of command attempts at remote units; and use the User Datagram Protocol (UDP) for maximum performance and minimal network overhead.
MaMR: High-performance MapReduce programming model for material cloud applications
NASA Astrophysics Data System (ADS)
Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng
2017-02-01
With increasing data sizes in materials science, existing programming models no longer satisfy application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data sets, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined MaMR, a programming model for material cloud applications based on hybrid shared-memory BSP that supports multiple different Map and Reduce functions running concurrently. An optimized data-sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework provide effective performance improvements over previous work.
A laparoscopy-based method for BRDF estimation from in vivo human liver.
Nunes, A L P; Maciel, A; Cavazzola, L T; Walter, M
2017-01-01
While improved visual realism is known to enhance training effectiveness in virtual surgery simulators, advances in realistic rendering for these simulators have been slower than for comparable simulations of man-made scenes. One of the main reasons is that in vivo data are hard to gather and process. In this paper, we propose the analysis of videolaparoscopy data to compute the Bidirectional Reflectance Distribution Function (BRDF) of living organs as an input to physically based rendering algorithms. From the interplay between light and organic matter recorded in video images, we define a process capable of establishing the BRDF for inside-the-body organic surfaces. We present a case study around the liver with patient-specific rendering under global illumination. Results show that, despite the limited range of motion allowed within the body, the computed BRDF presents high coverage of the sampled regions and produces plausible renderings. Copyright © 2016 Elsevier B.V. All rights reserved.
Processing and evaluation of riverine waveforms acquired by an experimental bathymetric LiDAR
NASA Astrophysics Data System (ADS)
Kinzel, P. J.; Legleiter, C. J.; Nelson, J. M.
2010-12-01
Accurate mapping of fluvial environments with airborne bathymetric LiDAR is challenged not only by environmental characteristics but also by the development and application of software routines to post-process the recorded laser waveforms. During a bathymetric LiDAR survey, the transmission of the green-wavelength laser pulses through the water column is influenced by a number of factors, including turbidity, the presence of organic material, and the reflectivity of the streambed. For backscattered laser pulses returned from the river bottom and digitized by the LiDAR detector, post-processing software is needed to interpret and identify distinct inflections in the reflected waveform. Relevant features of this energy signal include the air-water interface, volume reflection from the water column itself, and, ideally, a strong return from the bottom. We discuss our efforts to acquire, analyze, and interpret riverine surveys using the USGS Experimental Advanced Airborne Research LiDAR (EAARL) in a variety of fluvial environments. Initial processing of data collected in the Trinity River, California, using the EAARL Airborne Lidar Processing Software (ALPS) highlighted the difficulty of retrieving a distinct bottom signal in deep pools. Examination of laser waveforms from these pools indicated that weak bottom reflections were often neglected by a trailing-edge algorithm used by ALPS to process shallow riverine waveforms. For the Trinity waveforms, this algorithm had a tendency to identify earlier inflections as the bottom, resulting in a shallow bias. Similarly, an EAARL survey along the upper Colorado River, Colorado, also revealed the inadequacy of the trailing-edge algorithm for detecting weak bottom reflections. We developed an alternative waveform-processing routine by exporting digitized laser waveforms from ALPS, computing the local extrema, and fitting Gaussian curves to the convolved backscatter. Our field data indicate that these techniques improved the definition of pool areas dominated by weak bottom reflections. These processing techniques are also being tested for EAARL surveys collected along the Platte and Klamath Rivers, where environmental conditions have resulted in suppressed or convolved bottom reflections.
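The alternative waveform-processing step, i.e. locating local extrema and fitting Gaussians to the digitized backscatter, can be sketched as follows. This is a minimal illustration assuming a single-channel waveform held in NumPy arrays; the fitting window, thresholds, and the rule that the latest fitted peak is the bottom return are assumptions for the sketch, not the ALPS or USGS implementation.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.signal import find_peaks

def gaussian(t, a, mu, sigma):
    return a * np.exp(-0.5 * ((t - mu) / sigma) ** 2)

def fit_waveform_peaks(t, w, min_height=5.0, half_window=15):
    """Find prominent local maxima in a digitized return waveform and fit a
    Gaussian to a window around each one. Returns (amplitude, center, width) tuples."""
    peaks, _ = find_peaks(w, height=min_height, prominence=2.0)
    fits = []
    for p in peaks:
        lo, hi = max(p - half_window, 0), min(p + half_window + 1, len(t))
        try:
            popt, _ = curve_fit(gaussian, t[lo:hi], w[lo:hi],
                                p0=[w[p], t[p], 1.0], maxfev=2000)
            fits.append(tuple(popt))
        except RuntimeError:
            continue
    return fits

# Toy waveform: strong surface return near 10 ns, weak bottom return near 40 ns.
t = np.linspace(0.0, 60.0, 300)
rng = np.random.default_rng(1)
w = gaussian(t, 80.0, 10.0, 1.5) + gaussian(t, 8.0, 40.0, 2.0) + rng.normal(0.0, 0.5, t.size)
returns = fit_waveform_peaks(t, w)
bottom = max(returns, key=lambda f: f[1])   # latest fitted peak taken as the bottom
print(round(bottom[1], 1))                  # ~40 ns
```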
Modified fuzzy c-means applied to a Bragg grating-based spectral imager for material clustering
NASA Astrophysics Data System (ADS)
Rodríguez, Aida; Nieves, Juan Luis; Valero, Eva; Garrote, Estíbaliz; Hernández-Andrés, Javier; Romero, Javier
2012-01-01
We have modified the fuzzy c-means algorithm for an application related to the segmentation of hyperspectral images. The classical fuzzy c-means algorithm uses the Euclidean distance for computing sample membership to each cluster. We have introduced a different distance metric, the Spectral Similarity Value (SSV), in order to have a more suitable similarity measure for reflectance information. The SSV metric considers both magnitude differences (through the Euclidean distance) and spectral shape (through the Pearson correlation). Experiments confirmed that the introduction of this metric improves the quality of hyperspectral image segmentation, creating spectrally denser clusters and increasing the number of correctly classified pixels.
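A minimal sketch of the SSV metric described above is given below, assuming the common formulation SSV = sqrt(d_e^2 + (1 - r^2)), with d_e the per-band RMS difference and r the Pearson correlation; the exact normalisation used by the authors may differ.

```python
import numpy as np

def ssv_distance(x, y):
    """Spectral Similarity Value between two reflectance spectra.
    Assumed formulation: SSV = sqrt(d_e**2 + (1 - r**2)), where d_e is the
    per-band RMS difference (magnitude) and r the Pearson correlation (shape)."""
    d_e = np.linalg.norm(x - y) / np.sqrt(x.size)
    r = np.corrcoef(x, y)[0, 1]
    return float(np.sqrt(d_e ** 2 + (1.0 - r ** 2)))

# Two toy spectra: one differs from the base only in magnitude, one in shape.
wl = np.linspace(400.0, 1000.0, 61)
base = 0.2 + 0.1 * np.sin(wl / 100.0)
scaled = 1.5 * base                          # same shape, different magnitude
reshaped = 0.2 + 0.1 * np.cos(wl / 100.0)    # similar magnitude, different shape
print(ssv_distance(base, scaled), ssv_distance(base, reshaped))
```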
NASA Technical Reports Server (NTRS)
Manro, M. E.
1983-01-01
Two separated-flow computer programs and a semiempirical method for incorporating experimentally measured separated-flow effects into a linear aeroelastic analysis were evaluated. The three-dimensional leading-edge vortex (LEV) code is an improved panel method for three-dimensional inviscid flow over a wing with leading-edge vortex separation. The governing equations are the linear flow differential equation with nonlinear boundary conditions. The solution is iterative; the position as well as the strength of the vortex is determined. Cases for both full- and partial-span vortices were executed. The predicted pressures are good and adequately reflect changes in configuration.
A far-field non-reflecting boundary condition for two-dimensional wake flows
NASA Technical Reports Server (NTRS)
Danowitz, Jeffrey S.; Abarbanel, Saul A.; Turkel, Eli
1995-01-01
Far-field boundary conditions for external flow problems have been developed based upon long-wave perturbations of the linearized flow equations about a steady-state far-field solution. The boundary condition improves convergence to steady state in single-grid temporal integration schemes using both regular time stepping and local time stepping. The far-field boundary may be placed near the trailing edge of the body, which significantly reduces the number of grid points, and therefore the computational time, in the numerical calculation. In addition, the solution produced is smoother in the far field than when using extrapolation conditions. The boundary condition maintains the convergence rate to steady state in schemes utilizing multigrid acceleration.
Abdellah, Marwan; Eldeib, Ayman; Owis, Mohamed I
2015-01-01
This paper features an advanced implementation of the X-ray rendering algorithm that harnesses the computing power of current commodity graphics processors to accelerate the generation of high-resolution digitally reconstructed radiographs (DRRs). The presented pipeline exploits the latest features of NVIDIA Graphics Processing Unit (GPU) architectures, mainly bindless texture objects and dynamic parallelism. The rendering throughput is substantially improved by exploiting the interoperability mechanisms between CUDA and OpenGL. The benchmarks of our optimized rendering pipeline reflect its capability of generating DRRs with resolutions of 2048² and 4096² at interactive and semi-interactive frame rates using an NVIDIA GeForce GTX 970 device.
NASA Astrophysics Data System (ADS)
Larson, Stephen
2007-05-01
The state and discovery rate of current NEO surveys reflect incremental improvements in a number of areas, such as detector size and sensitivity, computing capacity, and the availability of larger apertures. The result has been an increased discovery rate even as the number of objects left to discover declines. There are currently about 10 telescopes, ranging in size from 0.5 to 1.5 m, carrying out full- or part-time, regular surveying in both hemispheres. The sky is covered between 1 and 2 times per lunation to V~19, with a band near the ecliptic to V~20.5. We review the current survey programs and their contribution towards the Spaceguard goal of discovering at least 90% of the NEOs larger than 1 km.
Use of knowledge-sharing web-based portal in gross and microscopic anatomy.
Durosaro, Olayemi; Lachman, Nirusha; Pawlina, Wojciech
2008-12-01
Changes in worldwide healthcare delivery require review of the structure of current medical school curricula to develop learning outcomes that ensure mastery of knowledge and clinical competency. In the last 3 years, Mayo Medical School implemented an outcomes-based curriculum to encompass new graduate outcomes. Standard courses were replaced by 6-week clinically integrated didactic blocks separated by student-selected academic enrichment activities. Gross and microscopic anatomy were integrated with radiology and genetics, respectively. Laboratory components include virtual microscopy and anatomical dissection. Students assigned to teams utilise computer portals to share learning experiences. High-resolution computed tomographic (CT) scans of cadavers prior to dissection were made available for correlative learning between the cadaveric material and radiologic images. Students work in teams on assigned presentations that include histology, cell and molecular biology, genetics, and genomics, using the Nexus Portal, based on DrupalEd, to share their observations, reflections, and dissection findings. The new generation of medical students is clearly comfortable utilising web-based programmes that maximise their learning potential in conceptually difficult and labour-intensive courses. A team-based learning approach emphasising the use of knowledge-sharing computer portals maximises opportunities for students to master their knowledge and improve cognitive skills to ensure clinical competency.
Eppig, Janan T
2017-07-01
The Mouse Genome Informatics (MGI) Resource supports basic, translational, and computational research by providing high-quality, integrated data on the genetics, genomics, and biology of the laboratory mouse. MGI serves a strategic role for the scientific community in facilitating biomedical, experimental, and computational studies investigating the genetics and processes of diseases and enabling the development and testing of new disease models and therapeutic interventions. This review describes the nexus of the body of growing genetic and biological data and the advances in computer technology in the late 1980s, including the World Wide Web, that together launched the beginnings of MGI. MGI develops and maintains a gold-standard resource that reflects the current state of knowledge, provides semantic and contextual data integration that fosters hypothesis testing, continually develops new and improved tools for searching and analysis, and partners with the scientific community to assure research data needs are met. Here we describe one slice of MGI relating to the development of community-wide large-scale mutagenesis and phenotyping projects and introduce ways to access and use these MGI data. References and links to additional MGI aspects are provided. © The Author 2017. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Lin, Feng; Chan, Carol K. K.
2018-04-01
This study examined the role of computer-supported knowledge-building discourse and epistemic reflection in promoting elementary-school students' scientific epistemology and science learning. The participants were 39 Grade 5 students who were collectively pursuing ideas and inquiry for knowledge advance using Knowledge Forum (KF) while studying a unit on electricity; they also reflected on the epistemic nature of their discourse. A comparison class of 22 students, taught by the same teacher, studied the same unit using the school's established scientific investigation method. We hypothesised that engaging students in idea-driven and theory-building discourse, as well as scaffolding them to reflect on the epistemic nature of their discourse, would help them understand their own scientific collaborative discourse as a theory-building process, and therefore understand scientific inquiry as an idea-driven and theory-building process. As hypothesised, we found that students engaged in knowledge-building discourse and reflection outperformed comparison students in scientific epistemology and science learning, and that students' understanding of collaborative discourse predicted their post-test scientific epistemology and science learning. To further understand the epistemic change process among knowledge-building students, we analysed their KF discourse to examine whether and how their epistemic practice had changed after epistemic reflection. The implications for ways of promoting epistemic change are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langer, S; Rotman, D; Schwegler, E
The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflect the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high-performance processors within a shared-memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL. We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data-intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.
NASA Technical Reports Server (NTRS)
Molthan, A. L.; Haynes, J. A.; Case, J. L.; Jedlovec, G. L.; Lapenta, W. M.
2008-01-01
As computational power increases, operational forecast models are performing simulations with higher spatial resolution, allowing for the transition from sub-grid-scale cloud parameterizations to an explicit forecast of cloud characteristics and precipitation through the use of single- or multi-moment bulk water microphysics schemes. Investments in space-borne and terrestrial remote sensing have produced the NASA CloudSat Cloud Profiling Radar and the NOAA National Weather Service NEXRAD system, each providing observations related to the bulk properties of clouds and precipitation through measurements of reflectivity. CloudSat and NEXRAD system radars observed light to moderate snowfall in association with a cold-season, midlatitude cyclone traversing the Central United States in February 2007. Such systems are responsible for widespread cloud cover and various types of precipitation, are of economic consequence, and pose a challenge to operational forecasters. This event is simulated with the Weather Research and Forecast (WRF) Model, utilizing the NASA Goddard Cumulus Ensemble microphysics scheme. Comparisons are made between WRF-simulated and observed reflectivity available from the CloudSat and NEXRAD systems. The application of CloudSat reflectivity is made possible through the QuickBeam radiative transfer model, with cautious application in light of single-scattering characteristics and spherical target assumptions. Significant differences are noted between modeled and observed cloud profiles, based upon simulated reflectivity, and modifications to the single-moment scheme are tested through a supplemental WRF forecast that incorporates a temperature-dependent snow crystal size distribution.
A normalisation framework for (hyper-)spectral imagery
NASA Astrophysics Data System (ADS)
Grumpe, Arne; Zirin, Vladimir; Wöhler, Christian
2015-06-01
It is well known that topography has an influence on observed reflectance spectra. This influence is not compensated by spectral ratios, i.e. the effect is wavelength dependent. In this work, we present a complete normalisation framework. The surface temperature is estimated based on the measured surface reflectance. To normalise the spectral reflectance with respect to a standard illumination geometry, spatially varying reflectance parameters are estimated based on a non-linear reflectance model. The reflectance parameter estimation has one free parameter, a low-pass function, which sets the scale of the spatial variance, i.e. the lateral resolution of the reflectance parameter maps. Since the local surface topography has a major influence on the measured reflectance, often-neglected shading information is extracted from the spectral imagery and an existing topography model is refined to image resolution. All methods are demonstrated on the Moon Mineralogy Mapper dataset. Additionally, two empirical methods are introduced that deal with observed systematic reflectance changes in co-registered images acquired at different phase angles. These effects, however, may also be caused by the sensor temperature, due to its correlation with the phase angle. Surface temperatures above 300 K are detected and are very similar to those of a reference method. The proposed method, however, seems more robust in the case of absorptions visible in the reflectance spectrum near 2000 nm. By introducing a low-pass function into the computation of the reflectance parameters, the reflectance behaviour of the surfaces may be derived at different scales. This allows for an iterative refinement of the local surface topography using shape from shading and the computation of reflectance parameters. The inferred parameters are derived from all available co-registered images and do not show significant influence of the local surface topography. The results of the empirical correction show that both proposed methods greatly reduce the influence of different phase angles or sensor temperatures.
Standard surface-reflectance model and illuminant estimation
NASA Technical Reports Server (NTRS)
Tominaga, Shoji; Wandell, Brian A.
1989-01-01
A vector analysis technique was adopted to test the standard reflectance model. A computational model was developed to determine the components of the observed spectra and an estimate of the illuminant was obtained without using a reference white standard. The accuracy of the standard model is evaluated.
Computation of Bragg Reflection for Layered Microstructures
NASA Technical Reports Server (NTRS)
Underwood, J. W.; Barbee, T. W.
1984-01-01
Bragg diffractors are analyzed for use in X-ray mirrors and other applications. SLMs are tailored to specific applications by varying layer thicknesses and the number of layers to control reflectivity, diffraction width, and wavelength resolution. Applications include glancing-incidence mirrors and filters for wavelengths of a few to a few hundred angstroms.
Effects of the Sea-Bed on Acoustic Propagation.
1983-11-15
...from the plane-wave reflection curves presented in Fig. 7, which have been computed from a numerical model developed by Hastrup [8]. Since good... La Spezia, Italy, SACLANT ASW Research Centre, 1983. 8. Hastrup, O.F. Digital analysis of acoustic reflectivity in the Tyrrhenian abyssal plain. J...
APC: A New Code for Atmospheric Polarization Computations
NASA Technical Reports Server (NTRS)
Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.
2014-01-01
A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikhalevich, V.S.; Sergienko, I.V.; Zadiraka, V.K.
1994-11-01
This article examines some topics of optimization of computations, which have been discussed at 25 seminar-schools and symposia organized by the V.M. Glushkov Institute of Cybernetics of the Ukrainian Academy of Sciences since 1969. We describe the main directions in the development of computational mathematics and present some of our own results that reflect a certain design conception of speed-optimal and accuracy-optimal (or nearly optimal) algorithms for various classes of problems, as well as a certain approach to optimization of computer computations.
Aberg, Kristoffer C; Müller, Julia; Schwartz, Sophie
2017-01-01
Anticipation and delivery of rewards improves memory formation, but little effort has been made to disentangle their respective contributions to memory enhancement. Moreover, it has been suggested that the effects of reward on memory are mediated by dopaminergic influences on hippocampal plasticity. Yet, evidence linking memory improvements to actual reward computations reflected in the activity of the dopaminergic system, i.e., prediction errors and expected values, is scarce and inconclusive. For example, different previous studies reported that the magnitude of prediction errors during a reinforcement learning task was a positive, negative, or non-significant predictor of successfully encoding simultaneously presented images. Individual sensitivities to reward and punishment have been found to influence the activation of the dopaminergic reward system and could therefore help explain these seemingly discrepant results. Here, we used a novel associative memory task combined with computational modeling and showed independent effects of reward-delivery and reward-anticipation on memory. Strikingly, the computational approach revealed positive influences from both reward delivery, as mediated by prediction error magnitude, and reward anticipation, as mediated by magnitude of expected value, even in the absence of behavioral effects when analyzed using standard methods, i.e., by collapsing memory performance across trials within conditions. We additionally measured trait estimates of reward and punishment sensitivity and found that individuals with increased reward (vs. punishment) sensitivity had better memory for associations encoded during positive (vs. negative) prediction errors when tested after 20 min, but a negative trend when tested after 24 h. In conclusion, modeling trial-by-trial fluctuations in the magnitude of reward, as we did here for prediction errors and expected value computations, provides a comprehensive and biologically plausible description of the dynamic interplay between reward, dopamine, and associative memory formation. Our results also underline the importance of considering individual traits when assessing reward-related influences on memory.
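For readers unfamiliar with the quantities involved, the sketch below computes trial-by-trial expected values and prediction errors with a minimal Rescorla-Wagner learner; this is a simplifying assumption for illustration, not the computational model fitted in the study.

```python
import numpy as np

def rescorla_wagner(rewards, alpha=0.3, v0=0.0):
    """Trial-by-trial expected values and prediction errors under a minimal
    Rescorla-Wagner model (an illustrative assumption; the study's own model
    may differ). `rewards` is a 1-D array of delivered rewards."""
    v = v0
    evs, pes = [], []
    for r in rewards:
        evs.append(v)          # expected value before the outcome (anticipation)
        delta = r - v          # prediction error at reward delivery
        pes.append(delta)
        v += alpha * delta     # learning update
    return np.array(evs), np.array(pes)

# Toy session: reward probability switches from 0.8 to 0.2 halfway through.
rng = np.random.default_rng(2)
rewards = np.concatenate([rng.binomial(1, 0.8, 40), rng.binomial(1, 0.2, 40)])
ev, pe = rescorla_wagner(rewards)
# ev and |pe| could then serve as trial-wise predictors of later memory performance.
print(ev[:5].round(2), pe[:5].round(2))
```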
Wroe, Stephen; Parr, William C H; Ledogar, Justin A; Bourke, Jason; Evans, Samuel P; Fiorenza, Luca; Benazzi, Stefano; Hublin, Jean-Jacques; Stringer, Chris; Kullmer, Ottmar; Curry, Michael; Rae, Todd C; Yokley, Todd R
2018-04-11
Three adaptive hypotheses have been forwarded to explain the distinctive Neanderthal face: (i) an improved ability to accommodate high anterior bite forces, (ii) more effective conditioning of cold and/or dry air, and (iii) adaptation to facilitate greater ventilatory demands. We test these hypotheses using three-dimensional models of Neanderthals, modern humans, and a close outgroup (Homo heidelbergensis), applying finite-element analysis (FEA) and computational fluid dynamics (CFD). This is the most comprehensive application of either approach to date and the first to include both. FEA reveals few differences between H. heidelbergensis, modern humans, and Neanderthals in their capacities to sustain high anterior tooth loadings. CFD shows that the nasal cavities of Neanderthals and especially modern humans condition air more efficiently than does that of H. heidelbergensis, suggesting that both evolved to better withstand cold and/or dry climates than less derived Homo. We further find that Neanderthals could move considerably more air through the nasal pathway than could H. heidelbergensis or modern humans, consistent with the propositions that, relative to our outgroup Homo, Neanderthal facial morphology evolved to reflect improved capacities to condition cold, dry air and to move greater air volumes in response to higher energetic requirements. © 2018 The Author(s).
1983-09-01
... 6. CALLING ROUTINE: FLDDRV ... 1. NAME: PLAINT (GTD). 2. PURPOSE: To determine if a ray traveling from a given source location... determine if a source ray reflection from plate MP occurs. If a ray traveling from the source image location in the reflected ray direction passes through...
Bistability By Self-Reflection In A Saturable Absorber
NASA Astrophysics Data System (ADS)
Roso-Franco, Luis
1987-01-01
Propagation of laser light through a saturable absorber is theoretically studied. Computed steady-state solutions of the Maxwell equations describing the one-dimensional propagation of a plane monochromatic wave, obtained without introducing the slowly varying envelope approximation, are presented, showing how saturation effects can influence the absorption of the field. Over a certain range of refractive index and extinction coefficient values, the computed solutions display a very surprising behaviour, and a self-reflected wave appears inside the absorber. This can be useful for a new kind of bistable device, similar to a standard bistable cavity but with the back mirror self-induced by the light.
Terahertz wide aperture reflection tomography.
Pearce, Jeremy; Choi, Hyeokho; Mittleman, Daniel M; White, Jeff; Zimdars, David
2005-07-01
We describe a powerful imaging modality for terahertz (THz) radiation, THz wide aperture reflection tomography (WART). Edge maps of an object's cross section are reconstructed from a series of time-domain reflection measurements at different viewing angles. Each measurement corresponds to a parallel line projection of the object's cross section. The filtered backprojection algorithm is applied to recover the image from the projection data. To our knowledge, this is the first demonstration of a reflection computed tomography technique using electromagnetic waves. We demonstrate the capabilities of THz WART by imaging the cross sections of two test objects.
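The reconstruction principle, filtered backprojection applied to parallel line projections, can be illustrated with a simulated cross-section; the sketch assumes scikit-image is available and substitutes a synthetic phantom for real THz time-domain reflection data.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import iradon, radon, rescale

# Forward-project a synthetic cross-section into a sinogram (one parallel line
# projection per viewing angle), then recover it with filtered backprojection.
# Real THz WART data would supply the projections from time-domain reflections.
image = rescale(shepp_logan_phantom(), 0.5)            # 200x200 test cross-section
angles = np.linspace(0.0, 180.0, 90, endpoint=False)   # viewing angles (degrees)
sinogram = radon(image, theta=angles)                  # parallel projections
reconstruction = iradon(sinogram, theta=angles)        # ramp-filtered backprojection
print(image.shape, reconstruction.shape)
```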
NASA Astrophysics Data System (ADS)
Barbieux, Kévin; Nouchi, Vincent; Merminod, Bertrand
2016-10-01
Retrieving the water-leaving reflectance from airborne hyperspectral data involves three steps. Firstly, the radiance recorded by an airborne sensor comes from several sources: the real radiance of the object, atmospheric scattering, sky and sun glint, and the dark current of the sensor. Secondly, the dispersive element inside the sensor (usually a diffraction grating or a prism) can move during the flight, thus shifting the observed spectra along the wavelength axis. Thirdly, to compute the reflectance, it is necessary to estimate, for each band, the value of irradiance that corresponds to a 100% reflectance. We present here our calibration method, relying on the absorption features of the atmosphere and the near-infrared properties of common materials. By choosing a proper flight height and flight-line angle, we can ignore atmospheric and sun glint contributions. Autocorrelation plots allow us to identify and reduce the noise in our signals. We then compute a signal that represents the high frequencies of the spectrum to localize the atmospheric absorption peaks (mainly the dioxygen peak around 760 nm). Matching these peaks removes the shift induced by the moving dispersive element. Finally, we use the signal collected over a Lambertian, unit-reflectance surface to estimate the ratio of the system's transmittances to its near-infrared transmittance. This transmittance is computed assuming an average 50% reflectance for vegetation and nearly 0% for water in the near infrared. Results show strong correlation between the output spectra and ground measurements from a TriOS RAMSES radiometer and the Water Insight WISP-3.
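Two of the calibration steps, estimating the spectral shift from the dioxygen absorption dip near 760 nm and converting radiance to reflectance with a reference surface, are sketched below; the window width, the Gaussian dip shape, and the panel-based reflectance formula are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np

def o2_shift(wavelengths, radiance, expected_nm=760.0, window_nm=15.0):
    """Estimate the spectral shift (nm) from the position of the O2 A-band
    absorption dip near 760 nm (a simplified stand-in for peak matching)."""
    sel = np.abs(wavelengths - expected_nm) < window_nm
    dip_nm = wavelengths[sel][np.argmin(radiance[sel])]
    return dip_nm - expected_nm

def to_reflectance(radiance, panel_radiance, panel_reflectance=1.0):
    """Convert at-sensor radiance to reflectance using a (near-)Lambertian
    reference surface of known reflectance viewed under the same illumination."""
    return panel_reflectance * radiance / panel_radiance

# Toy spectrum whose O2 dip is shifted by 2 nm from its nominal position.
wl = np.linspace(700.0, 820.0, 241)
radiance = 1.0 - 0.4 * np.exp(-0.5 * ((wl - 762.0) / 1.5) ** 2)
print(o2_shift(wl, radiance))                            # ~2.0 nm
print(to_reflectance(radiance, np.full_like(wl, 2.0))[:3])
```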
Radar probing of surfactant films on the water surface using dual co-polarized SAR
NASA Astrophysics Data System (ADS)
Ermakov, S.; da Silva, J. C. B.; Kapustin, I.; Molkov, A.; Sergievskaya, I.; Shomina, O.
2016-10-01
Color image watermarking against fog effects
NASA Astrophysics Data System (ADS)
Chotikawanid, Piyanart; Amornraksa, Thumrongrat
2017-07-01
Fog effects applied by various computer and camera software packages can partially or fully damage the watermark information within a watermarked image. In this paper, we propose a color image watermarking method based on the modification of the reflectance component, designed to withstand fog effects. The reflectance component is extracted from the blue channel in the RGB color space of a host image and then used to carry the watermark signal. The watermark extraction is blindly achieved by subtracting an estimate of the original reflectance component from the watermarked component. The performance of the proposed watermarking method in terms of wPSNR and NC is evaluated and then compared with the previous method. The experimental results on robustness against various levels of fog effect, from both computer software and a mobile application, demonstrate the higher robustness of our proposed method compared to the previous one.
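As a rough illustration of embedding a signal in a reflectance component, the sketch below uses a retinex-style log/low-pass split of the blue channel and a non-blind extraction; both the decomposition and the extraction are assumptions for demonstration and differ from the paper's method, which blindly estimates the original reflectance component.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_reflectance(blue, sigma=5.0):
    """Retinex-style split of the blue channel: a smooth illumination part
    (low-pass of the log image) and a detail 'reflectance' residual.
    Hypothetical decomposition chosen for illustration only."""
    log_img = np.log1p(blue.astype(float))
    illumination = gaussian_filter(log_img, sigma)
    return illumination, log_img - illumination

def embed(blue, watermark, strength=0.02, sigma=5.0):
    """Additively embed a +/-1 watermark in the reflectance component and
    recombine into a watermarked blue channel."""
    illumination, reflectance = split_reflectance(blue, sigma)
    return np.expm1(illumination + reflectance + strength * watermark)

def extract(marked_blue, original_blue, sigma=5.0):
    """Non-blind extraction for this sketch: subtract the original reflectance
    component from the watermarked one and keep the sign."""
    _, r_marked = split_reflectance(marked_blue, sigma)
    _, r_orig = split_reflectance(original_blue, sigma)
    return np.sign(r_marked - r_orig)

rng = np.random.default_rng(3)
blue = rng.integers(0, 256, (64, 64))
wm = rng.choice([-1.0, 1.0], size=(64, 64))
recovered = extract(embed(blue, wm), blue)
print(float(np.mean(recovered == wm)))   # fraction of watermark bits recovered
```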
NASA Technical Reports Server (NTRS)
Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor); Maker, Paul D. (Inventor); Wilson, Daniel W. (Inventor)
2003-01-01
The optical system of this invention is a unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTISs have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through the long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for the investigation of some slow-moving phenomena, as in the life sciences.
Back focal plane microscopic ellipsometer with internal reflection geometry
NASA Astrophysics Data System (ADS)
Otsuki, Soichi; Murase, Norio; Kano, Hiroshi
2013-05-01
A back focal plane (BFP) ellipsometer is presented to measure a thin film on a cover glass using an oil-immersion, high-numerical-aperture objective lens. The internal reflection geometry lowers the pseudo-Brewster angle (ϕB) into the range over which the light distribution is observed in the BFP of the objective. A calculation based on the Mueller matrix formalism was developed to compute ellipsometric parameters from the intensity distribution in the BFP. The center and radius of the partial-reflection region below the critical angle were determined and used to define a polar coordinate system on the BFP. Harmonic components were computed from the intensities along the azimuthal direction and transformed into ellipsometric parameters at multiple incident angles around ϕB. The refractive index and thickness of the film and the contributions of the objective effect were estimated simultaneously by fitting.
Utilization of ERTS-1 data to monitor and classify eutrophication of inland lakes
NASA Technical Reports Server (NTRS)
Rogers, R. H.; Smith, V. E. (Principal Investigator)
1973-01-01
The author has identified the following significant results: (1) one-acre lakes and one-acre islands are detectable; (2) removal of atmospheric parameters derived from RPMI measurements shows test lakes to have reflectances of 3.1 to 5.5% in band 4 and 0.3 to 2.3% in band 5; (3) failure to remove reflectance caused by the atmosphere results in errors of up to 500% in computing lake reflectance from ERTS-1 data; (4) in band 4, up to seven reflectance levels were observed in test lakes; (5) reflectance patterns have been displayed on a color-coded TV monitor and on computer-generated gray scales; (6) deep and shallow water can be separated by a trained photointerpreter and by automatic machine processing, with estimates of water depth possible in some cases; (7) RPMI provides direct spectral-signature measurements of lakes and lake features such as algal scums and floating plants; (8) a method is reported for obtaining lake color, as estimated by Forel-Ule standards, from ERTS-1 data; (9) a strong correlation exists between browner water color and diminishing water transparency; and (10) classifying lake eutrophication by observation of surface scums or macrophytes in shallow water seems straightforward.
Build MyTune: Children's Reflective Practice during Music Creativity Processes
ERIC Educational Resources Information Center
Hsu, Chia-Pao
2015-01-01
The current study examined how components of reflective practice interplay with children's music-making and sharing processes. This study employed a qualitative approach with 11 children who played classroom instruments and researcher-designed computer programs ("Build MyTune I" and "Build MyTune II") while attending music…
Augmenting Literacy: The Role of Expertise in Digital Writing
ERIC Educational Resources Information Center
Van Ittersum, Derek
2011-01-01
This essay presents a model of reflective use of writing technologies, one that provides a means of more fully exploiting the possibilities of these tools for transforming writing activity. Derived from the work of computer designer Douglas Engelbart, the "bootstrapping" model of reflective use extends current arguments in the field…
NASA Astrophysics Data System (ADS)
Lerot, C.; Van Roozendael, M.; Spurr, R.; Loyola, D.; Coldewey-Egbers, M.; Kochenova, S.; van Gent, J.; Koukouli, M.; Balis, D.; Lambert, J.-C.; Granville, J.; Zehner, C.
2014-02-01
Within the European Space Agency's Climate Change Initiative, total ozone column records from GOME (Global Ozone Monitoring Experiment), SCIAMACHY (SCanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY), and GOME-2 have been reprocessed with GODFIT version 3 (GOME-type Direct FITting). This algorithm is based on the direct fitting of reflectances simulated in the Huggins bands to the observations. We report on new developments in the algorithm from the version implemented in the operational GOME Data Processor v5. The a priori ozone profile database TOMSv8 is now combined with a recently compiled OMI/MLS tropospheric ozone climatology to improve the representativeness of a priori information. The Ring procedure that corrects simulated radiances for the rotational Raman inelastic scattering signature has been improved using a revised semi-empirical expression. Correction factors are also applied to the simulated spectra to account for atmospheric polarization. In addition, the computational performance has been significantly enhanced through the implementation of new radiative transfer tools based on principal component analysis of the optical properties. Furthermore, a soft-calibration scheme for measured reflectances and based on selected Brewer measurements has been developed in order to reduce the impact of level-1 errors. This soft-calibration corrects not only for possible biases in backscattered reflectances, but also for artificial spectral features interfering with the ozone signature. Intersensor comparisons and ground-based validation indicate that these ozone data sets are of unprecedented quality, with stability better than 1% per decade, a precision of 1.7%, and systematic uncertainties less than 3.6% over a wide range of atmospheric states.
NASA Technical Reports Server (NTRS)
Molthan, Andrew L.; Petersen, Walter A.; Case, Jonathan L.; Dembek, Scott R.; Jedlovec, Gary J.
2009-01-01
Increases in computational resources have allowed operational forecast centers to pursue experimental, high-resolution simulations that resolve the microphysical characteristics of clouds and precipitation. These experiments are motivated by a desire to improve the representation of weather and climate, but will also benefit current and future satellite campaigns, which often use forecast model output to guide the retrieval process. Aircraft, surface, and radar data from the Canadian CloudSat/CALIPSO Validation Project are used to check the validity of size distribution and density characteristics for snowfall simulated by the NASA Goddard six-class, single-moment bulk water microphysics scheme, currently available within the Weather Research and Forecast (WRF) Model. Widespread snowfall developed across the region on January 22, 2007, forced by the passage of a midlatitude cyclone, and was observed by the dual-polarimetric C-band radar at King City, Ontario, as well as the NASA 94 GHz CloudSat Cloud Profiling Radar. Combined, these data sets provide key metrics for validating model output: estimates of size distribution parameters fit to the inverse-exponential equations prescribed within the model, bulk density and crystal habit characteristics sampled by the aircraft, and representation of size characteristics as inferred by the radar reflectivity at C- and W-band. Specified constants for the distribution intercept and density differ significantly from observations throughout much of the cloud depth. Alternate parameterizations are explored, using column-integrated values of vapor excess to avoid problems encountered with temperature-based parameterizations in an environment where inversions and isothermal layers are present. Simulation of CloudSat reflectivity is performed by adopting the discrete-dipole parameterizations and databases provided in the literature, and demonstrates an improved capability in simulating radar reflectivity at W-band versus Mie scattering assumptions.
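For context, the inverse-exponential size distribution used by such single-moment schemes, N(D) = N0 exp(-λD), ties the slope λ to the snow water content once the intercept N0 and a bulk snow density are fixed; the sketch below works through that mass closure with illustrative constants rather than the Goddard scheme's exact values.

```python
import numpy as np

def exp_psd_slope(snow_content, n0=1.6e7, rho_snow=100.0):
    """Slope lambda (1/m) of an inverse-exponential snow size distribution
    N(D) = N0 * exp(-lambda * D), diagnosed from the snow water content
    (kg m^-3) under single-moment assumptions of a fixed intercept N0 (m^-4)
    and fixed bulk snow density (kg m^-3). Constants are illustrative only.
    Mass closure: M = pi * rho_snow * N0 / lambda**4."""
    return (np.pi * rho_snow * n0 / snow_content) ** 0.25

# Example: 0.1 g m^-3 of snow.
lam = exp_psd_slope(1e-4)
print(f"lambda = {lam:.0f} m^-1, median-volume diameter ~ {3.67 / lam * 1e3:.2f} mm")
```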
Ribot, Miguel Angel; Botteron, Cyril; Farine, Pierre-André
2016-12-05
The use of reflected Global Navigation Satellite Systems' (GNSS) signals in Earth observation applications, referred to as GNSS reflectometry (GNSS-R), has already been studied for more than two decades. However, the estimation precision that can be achieved by GNSS-R sensors in some particular scenarios is still not fully understood. In an effort to partially fill this gap, in this paper we compute the Cramér-Rao bound (CRB) for the specific case of static ground-based GNSS-R receivers and scenarios where the coherent component of the reflected signal is dominant. We compute the CRB for GNSS signals with different modulations: GPS L1 C/A and GPS L5 I/Q, which use binary phase-shift keying, and Galileo E1 B/C and E5, which use the binary offset carrier. The CRB for these signals is evaluated as a function of the receiver bandwidth and different scenario parameters, such as the height of the receiver or the properties of the reflection surface. The CRB computation presented considers observation times of up to several tens of seconds, over which the observed satellite elevation angle changes significantly. Finally, the results obtained show the theoretical benefit of using modern GNSS signals with GNSS-R techniques using long observation times, such as the interference pattern technique.
Combining Critical Reflection and Action Research to Improve Pedagogy
ERIC Educational Resources Information Center
Badia, Giovanna
2017-01-01
Educators need to reflect critically on their instruction to continue to be effective. This paper will employ case studies to demonstrate how librarians can improve their teaching by applying critical reflection and action research to their information literacy (IL) sessions. The four lenses model of Stephen Brookfield, an adult education expert,…
ERIC Educational Resources Information Center
Kurt, Mustafa; Kurt, Sevinc
2017-01-01
The main aim of this study was to investigate and discover whether going through the process of reflection by keeping reflective design journals (RDJ) enhances architecture students' metacognition and whether this enhanced metacognition improves their design understandings and skills. The study was a mixed-methods design and utilised content…
Genomic Prediction Accounting for Residual Heteroskedasticity
Ou, Zhining; Tempelman, Robert J.; Steibel, Juan P.; Ernst, Catherine W.; Bates, Ronald O.; Bello, Nora M.
2015-01-01
Whole-genome prediction (WGP) models that use single-nucleotide polymorphism marker information to predict genetic merit of animals and plants typically assume homogeneous residual variance. However, variability is often heterogeneous across agricultural production systems and may subsequently bias WGP-based inferences. This study extends classical WGP models based on normality, heavy-tailed specifications and variable selection to explicitly account for environmentally-driven residual heteroskedasticity under a hierarchical Bayesian mixed-models framework. WGP models assuming homogeneous or heterogeneous residual variances were fitted to training data generated under simulation scenarios reflecting a gradient of increasing heteroskedasticity. Model fit was based on pseudo-Bayes factors and also on prediction accuracy of genomic breeding values computed on a validation data subset one generation removed from the simulated training dataset. Homogeneous vs. heterogeneous residual variance WGP models were also fitted to two quantitative traits, namely 45-min postmortem carcass temperature and loin muscle pH, recorded in a swine resource population dataset prescreened for high and mild residual heteroskedasticity, respectively. Fit of competing WGP models was compared using pseudo-Bayes factors. Predictive ability, defined as the correlation between predicted and observed phenotypes in validation sets of a five-fold cross-validation was also computed. Heteroskedastic error WGP models showed improved model fit and enhanced prediction accuracy compared to homoskedastic error WGP models although the magnitude of the improvement was small (less than two percentage points net gain in prediction accuracy). Nevertheless, accounting for residual heteroskedasticity did improve accuracy of selection, especially on individuals of extreme genetic merit. PMID:26564950
NASA Astrophysics Data System (ADS)
Fabian, A. C.; Ross, R. R.
2010-12-01
Material irradiated by X-rays produces backscattered radiation which is commonly known as the reflection spectrum. It consists of a structured continuum, shaped at high energies by the competition between photoelectric absorption and electron scattering and enhanced at low energies by emission from the material itself, together with a complex line spectrum. We briefly review the history of X-ray reflection in astronomy and discuss various methods for computing the reflection spectrum from cold and ionized gas, illustrated with results from our own code, reflionx. We discuss how the reflection spectrum can be used to infer the geometry of the accretion flow, particularly in the inner regions around black holes and neutron stars.
Surface roughness effects on the solar reflectance of cool asphalt shingles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akbari, Hashem; Berdahl, Paul; Akbari, Hashem
2008-02-17
We analyze the solar reflectance of asphalt roofing shingles that are covered with pigmented mineral roofing granules. The reflecting surface is rough, with a total area approximately twice the nominal area. We introduce a simple analytical model that relates the 'micro-reflectance' of a small surface region to the 'macro-reflectance' of the shingle. This model uses a mean field approximation to account for multiple scattering effects. The model is then used to compute the reflectance of shingles with a mixture of different colored granules, when the reflectances of the corresponding mono-color shingles are known. Simple linear averaging works well, with small corrections to linear averaging derived for highly reflective materials. Reflective base granules and reflective surface coatings aid achievement of high solar reflectance. Other factors that influence the solar reflectance are the size distribution of the granules, coverage of the asphalt substrate, and orientation of the granules as affected by rollers during fabrication.
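The linear-averaging result can be stated in a couple of lines of code; the sketch below gives only the area-weighted mixture estimate and omits the paper's multiple-scattering corrections for highly reflective granules.

```python
def mixed_shingle_reflectance(fractions, mono_reflectances):
    """First-order estimate of the solar reflectance of a shingle covered with a
    mixture of granule colors: the area-weighted average of the reflectances of
    the corresponding mono-color shingles (corrections for highly reflective
    granules are omitted in this sketch)."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "area fractions must sum to 1"
    return sum(f * r for f, r in zip(fractions, mono_reflectances))

# Example: 60% light granules (R = 0.70), 40% dark granules (R = 0.10).
print(mixed_shingle_reflectance([0.6, 0.4], [0.70, 0.10]))   # -> 0.46
```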
Hackley, Paul C.; Araujo, Carla Viviane; Borrego, Angeles G.; Bouzinos, Antonis; Cardott, Brian; Cook, Alan C.; Eble, Cortland; Flores, Deolinda; Gentzis, Thomas; Gonçalves, Paula Alexandra; Filho, João Graciano Mendonça; Hámor-Vidó, Mária; Jelonek, Iwona; Kommeren, Kees; Knowles, Wayne; Kus, Jolanta; Mastalerz, Maria; Menezes, Taíssa Rêgo; Newman, Jane; Pawlewicz, Mark; Pickel, Walter; Potter, Judith; Ranasinghe, Paddy; Read, Harold; Reyes, Julito; Rodriguez, Genaro De La Rosa; de Souza, Igor Viegas Alves Fernandes; Suarez-Ruiz, Isabel; Sýkorová, Ivana; Valentine, Brett J.
2015-01-01
Vitrinite reflectance generally is considered the most robust thermal maturity parameter available for application to hydrocarbon exploration and petroleum system evaluation. However, until 2011 there was no standardized methodology available to provide guidelines for vitrinite reflectance measurements in shale. Efforts to correct this deficiency resulted in publication of ASTM D7708: Standard test method for microscopical determination of the reflectance of vitrinite dispersed in sedimentary rocks. In 2012-2013, an interlaboratory exercise was conducted to establish precision limits for the D7708 measurement technique. Six samples, representing a wide variety of shale, were tested in duplicate by 28 analysts in 22 laboratories from 14 countries. Samples ranged from immature to overmature (0.31-1.53% Ro), from organic-lean to organic-rich (1-22 wt.% total organic carbon), and contained Type I (lacustrine), Type II (marine), and Type III (terrestrial) kerogens. Repeatability limits (maximum difference between valid repetitive results from same operator, same conditions) ranged from 0.03-0.11% absolute reflectance, whereas reproducibility limits (maximum difference between valid results obtained on same test material by different operators, different laboratories) ranged from 0.12-0.54% absolute reflectance. Repeatability and reproducibility limits degraded consistently with increasing maturity and decreasing organic content. However, samples with terrestrial kerogens (Type III) fell off this trend, showing improved levels of reproducibility due to higher vitrinite content and improved ease of identification. Operators did not consistently meet the reporting requirements of the test method, indicating that a common reporting template is required to improve data quality. The most difficult problem encountered was the petrographic distinction of solid bitumens and low-reflecting inert macerals from vitrinite when vitrinite occurred with reflectance ranges overlapping the other components. Discussion among participants suggested this problem could not be easily corrected via kerogen concentration or solvent extraction and is related to operator training and background. No statistical difference in mean reflectance was identified between participants reporting bitumen reflectance vs. vitrinite reflectance vs. a mixture of bitumen and vitrinite reflectance values, suggesting empirical conversion schemes should be treated with caution. Analysis of reproducibility limits obtained during this exercise in comparison to reproducibility limits from historical interlaboratory exercises suggests use of a common methodology (D7708) improves interlaboratory precision. Future work will investigate opportunities to improve reproducibility in high maturity, organic-lean shale varieties.
NASA Astrophysics Data System (ADS)
Wang, Chenxi; Yang, Ping; Nasiri, Shaima L.; Platnick, Steven; Baum, Bryan A.; Heidinger, Andrew K.; Liu, Xu
2013-02-01
A computationally efficient radiative transfer model (RTM) for calculating visible (VIS) through shortwave infrared (SWIR) reflectances is developed for use in satellite and airborne cloud property retrievals. The full radiative transfer equation (RTE) for combinations of cloud, aerosol, and molecular layers is solved approximately by using six independent RTEs that assume the plane-parallel approximation along with a single-scattering approximation for Rayleigh scattering. Each of the six RTEs can be solved analytically if the bidirectional reflectance/transmittance distribution functions (BRDF/BTDF) of the cloud/aerosol layers are known. The adding/doubling (AD) algorithm is employed to account for overlapped cloud/aerosol layers and non-Lambertian surfaces. Two approaches are used to mitigate the significant computational burden of the AD algorithm. First, the BRDF and BTDF of single cloud/aerosol layers are pre-computed using the discrete ordinates radiative transfer program (DISORT) implemented with 128 streams, and second, the required integral in the AD algorithm is numerically implemented on a twisted icosahedral mesh. A concise surface BRDF simulator associated with the MODIS land surface product (MCD43) is merged into a fast RTM to accurately account for non-isotropic surface reflectance. The resulting fast RTM is evaluated with respect to its computational accuracy and efficiency. The simulation bias between DISORT and the fast RTM is large (e.g., relative error >5%) only when both the solar zenith angle (SZA) and the viewing zenith angle (VZA) are large (i.e., SZA>45° and VZA>70°). For general situations, i.e., cloud/aerosol layers above a non-Lambertian surface, the fast RTM calculation rate is faster than that of the 128-stream DISORT by approximately two orders of magnitude.
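The core of the adding step can be illustrated with a scalar analogue: for two plane-parallel layers, the inter-layer bounces form a geometric series. The sketch below assumes symmetric, azimuthally averaged layers characterized by single reflectance and transmittance values, whereas the AD algorithm described above operates on angle-resolved BRDF/BTDF matrices.

```python
def add_layers(r1, t1, r2, t2):
    """Scalar adding equations for two plane-parallel layers (symmetric,
    azimuthally averaged layers assumed). Multiple reflections between the
    layers form a geometric series with ratio r1 * r2."""
    m = 1.0 / (1.0 - r1 * r2)       # sum of the inter-layer bounce series
    r12 = r1 + t1 * r2 * t1 * m     # combined reflectance for illumination from above
    t12 = t1 * t2 * m               # combined transmittance of the two-layer system
    return r12, t12

# Cloud layer over a bright, opaque reflecting surface (illustrative values).
r_cloud, t_cloud = 0.55, 0.40
r_surf, t_surf = 0.30, 0.0
print(add_layers(r_cloud, t_cloud, r_surf, t_surf))
```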
ERIC Educational Resources Information Center
Schwienhorst, Klaus
2002-01-01
Discussion of computer-assisted language learning focuses on the benefits of virtual reality environments, particularly for foreign language contexts. Topics include three approaches to learner autonomy; supporting reflection, including self-awareness; supporting interaction, including collaboration; and supporting experimentation, including…
TFaNS-Tone Fan Noise Design/Prediction System: Users' Manual TFaNS Version 1.5
NASA Technical Reports Server (NTRS)
Topol, David A.; Huff, Dennis L. (Technical Monitor)
2003-01-01
TFaNS is the Tone Fan Noise Design/Prediction System developed by Pratt & Whitney under contract to NASA Glenn. The purpose of this system is to predict tone noise emanating from a fan stage, including the effects of reflection and transmission by the rotor and stator and by the duct inlet and nozzle. The first version of this design system was developed under a previous NASA contract, and several improvements have since been made to TFaNS. This users' manual shows how to run the new system. TFaNS consists of three parts: the codes that compute the acoustic properties (reflection and transmission coefficients) of the various elements and write them to files; the CUP3D Fan Noise Coupling Code, which reads these files, solves the coupling problem, and outputs the desired noise predictions; and the AWAKEN CFD/Measured Wake Postprocessor, which reformats CFD wake predictions and/or measured wake data so they can be used by the system. This report provides information on code input and file structure essential for potential users of TFaNS.
Transient radiative transfer in a scattering slab considering polarization.
Yi, Hongliang; Ben, Xun; Tan, Heping
2013-11-04
Both transient effects and polarization must be considered for a complete and correct description of short-pulse laser transfer in a scattering medium. A Monte Carlo (MC) method combined with a time-shift and superposition principle is developed to simulate transient vector (polarized) radiative transfer in a scattering medium. The transient vector radiative transfer matrix (TVRTM) is defined to describe the transient polarization behavior of a short-pulse laser propagating in the scattering medium. Based on the definition of reflectivity, a new criterion for reflection at a Fresnel surface is presented. To improve computational efficiency and accuracy, the time-shift and superposition principle is applied to the MC model for transient vector radiative transfer. The results for transient scalar radiative transfer and steady-state vector radiative transfer are compared with those in the published literature, and excellent agreement is observed, which validates the present model. Finally, transient radiative transfer is simulated considering the polarization of a short-pulse laser in a scattering medium, and the distributions of the Stokes vector in angular and temporal space are presented.
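For illustration only, the sketch below is a minimal scalar (unpolarized) Monte Carlo for a plane-parallel slab that records the transit times of transmitted photons; it omits the Stokes-vector bookkeeping, the Fresnel reflection criterion, and the time-shift and superposition machinery of the TVRTM model, and all quantities are in optical-depth units with placeholder values.

```python
import numpy as np

rng = np.random.default_rng(0)

def slab_transit_times(tau_slab=1.0, albedo=0.9, n_photons=20000):
    """Scalar Monte Carlo for a slab of optical depth tau_slab, illuminated by
    a normally incident pulse at t = 0.  Records the path length of photons
    that exit the rear face; with distances in optical-depth units, time is
    in units of 1/(c * extinction coefficient)."""
    exit_paths = []
    for _ in range(n_photons):
        tau, mu, path = 0.0, 1.0, 0.0              # depth, direction cosine, path travelled
        while True:
            step = -np.log(rng.random())            # exponentially distributed free path
            tau_new = tau + step * mu
            if mu > 0.0 and tau_new >= tau_slab:    # exits the rear face (transmitted)
                exit_paths.append(path + (tau_slab - tau) / mu)
                break
            if mu < 0.0 and tau_new <= 0.0:         # exits the front face (reflected)
                break
            tau, path = tau_new, path + step
            if rng.random() > albedo:               # absorbed
                break
            mu = 2.0 * rng.random() - 1.0           # isotropic scattering
    return np.asarray(exit_paths)

times = slab_transit_times()
print(times.size, "transmitted photons, mean transit time", times.mean())
```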
[A prognostic model of a cholera epidemic].
Boev, B V; Bondarenko, V M; Prokop'eva, N V; San Román, R T; Raygoza-Anaya, M; García de Alba, R
1994-01-01
A new model for forecasting a cholera epidemic on the territory of a large city is proposed. The model reflects the characteristic feature that susceptible individuals contract the infection through contact with Vibrio cholerae persisting in its water habitat. The mathematical model quantitatively describes the spread of infection by kinetic equations for the interacting streams of infected persons, the causative agent, and susceptible persons. The functions and parameters of the model are linked with the distribution of individuals according to the duration of the incubation period and of the infectious process, as well as the period of asymptomatic carriage. Implementing the model on an IBM PC/AT made it possible to study the cholera epidemic that took place in Mexico in 1833. The verified model was then used to forecast the possible spread of the infection in Guadalajara, taking into account changes in the epidemiological situation and the size of the population, as well as improvements in the sanitary and hygienic conditions of the city.
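The abstract does not reproduce the kinetic equations; as a generic illustration of the water-reservoir transmission structure it describes, a commonly used SIR-plus-water-compartment system is sketched below. The compartment names, equations, and all parameter values are placeholders, not the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def siwr(t, y, beta_i, beta_w, gamma, xi, delta, N):
    """SIR dynamics with a water-reservoir compartment W for V. cholerae.
    S, I, R are persons; W is a dimensionless pathogen concentration."""
    S, I, R, W = y
    infection = (beta_i * I / N + beta_w * W) * S   # contact plus waterborne routes
    dS = -infection
    dI = infection - gamma * I
    dR = gamma * I
    dW = xi * I - delta * W                         # shedding into and decay in water
    return [dS, dI, dR, dW]

N = 5e5                                             # city population (placeholder)
sol = solve_ivp(siwr, (0, 180), [N - 10, 10, 0, 0],
                args=(0.2, 0.5, 1 / 5, 1e-5, 1 / 10, N))
print(sol.y[1].max())                               # peak number of infectious persons
```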
NASA Astrophysics Data System (ADS)
Cutrim, E. M.; Rudge, D.; Kits, K.; Mitchell, J.; Nogueira, R.
2006-06-01
Responding to the call for reform in science education, changes were made in an introductory meteorology and climate course offered at a large public university. These changes were part of a larger project aimed at deepening and extending a program of science content courses that model effective teaching strategies for prospective middle school science teachers. Revisions were therefore made to address misconceptions about meteorological phenomena, foster deeper understanding of key concepts, encourage engagement with the text, and promote inquiry-based learning. Techniques introduced include the use of flash cards, student reflection questionnaires, writing assignments, and interactive discussions of weather and forecast data using computer technology such as the Integrated Data Viewer (IDV). The revision process is described in a case study format. Preliminary results (self-reflection by the instructor, surveys of student opinion, and measurements of student achievement) suggest that student learning has been positively influenced. This study is supported by three grants: NSF grant No. 0202923, the Unidata Equipment Award, and the Lucia Harrison Endowment Fund.
Cassetta, Michele; Altieri, Federica; Pandolfi, Stefano; Giansanti, Matteo
2017-01-01
The aim of this case report was to describe an innovative orthodontic treatment method that combined surgical and orthodontic techniques. The novel method was used to achieve a positive result in a case of moderate crowding by employing a computer-guided piezocision procedure followed by the use of clear aligners. A 23-year-old woman had a malocclusion with moderate crowding. Her periodontal indices, oral health-related quality of life (OHRQoL), and treatment time were evaluated. The treatment included interproximal corticotomy cuts extending through the entire thickness of the cortical layer, without a full-thickness flap reflection. This was achieved with a three-dimensionally printed surgical guide using computer-aided design and computer-aided manufacturing. Orthodontic force was applied to the teeth immediately after surgery by using clear appliances for better control of tooth movement. The total treatment time was 8 months. The periodontal indices improved after crowding correction, but the oral health impact profile showed a slight deterioration of OHRQoL during the 3 days following surgery. At the 2-year retention follow-up, the stability of treatment was excellent. The reduction in surgical time and patient discomfort, increased periodontal safety and patient acceptability, and accurate control of orthodontic movement without the risk of losing anchorage may encourage the use of this combined technique in appropriate cases. PMID:28337422
Alpha absolute power measurement in panic disorder with agoraphobia patients.
de Carvalho, Marcele Regine; Velasques, Bruna Brandão; Freire, Rafael C; Cagy, Maurício; Marques, Juliana Bittencourt; Teixeira, Silmar; Rangé, Bernard P; Piedade, Roberto; Ribeiro, Pedro; Nardi, Antonio Egidio; Akiskal, Hagop Souren
2013-10-01
Panic attacks are thought to result from dysfunctional coordination of cortical and brainstem sensory information, leading to heightened amygdala activity with subsequent neuroendocrine, autonomic, and behavioral activation. Prefrontal areas may be responsible for inhibitory top-down control processes, and alpha synchronization seems to reflect this modulation. The objective of this study was to measure frontal absolute alpha-power with qEEG in 24 subjects with panic disorder and agoraphobia (PDA) compared to 21 healthy controls. qEEG data were acquired while participants watched a computer simulation consisting of moments classified as "high anxiety" (HAM) and "low anxiety" (LAM). qEEG data were also acquired during two rest conditions, before and after the computer simulation display. We observed higher absolute alpha-power in controls than in PDA patients while they watched the computer simulation. The main finding was an interaction between the moment and group factors over the frontal cortex. Our findings suggest that the decreased alpha-power in the frontal cortex for the PDA group may reflect a state of high excitability. Our results suggest a possible deficiency in top-down control of anxiety, reflected in low absolute alpha-power in the PDA group while watching the computer simulation, and they highlight that prefrontal regions and the frontal region near the temporal area are recruited during exposure to anxiogenic stimuli.
ERIC Educational Resources Information Center
Chen, Nian-Shing; Kinshuk; Wei, Chun-Wang; Liu, Chia-Chi
2011-01-01
Reflection plays an important role in improving learning performance. This study, therefore, attempted to explore whether learners' reflection levels can be improved if teaching strategies are adapted to fit with learners' thinking styles in an online learning environment. Three teaching strategies, namely constructive, guiding, and inductive,…
ERIC Educational Resources Information Center
Arnold, Cath, Ed.
2012-01-01
"Improving Your Reflective Practice through Stories of Practitioner Research" shows how research has informed and created effective and valuable reflective practice in early years education, and offers depth to the arguments for a research-orientated stance to this vital field of study. This thought-provoking text explores and documents a variety…
ERIC Educational Resources Information Center
Vachon, Brigitte; Durand, Marie-Jose; LeBlanc, Jeannette
2010-01-01
Reflective learning has been described as a promising approach for ameliorating the impact of continuing education (CE) programs. However, there are still very few studies that have investigated how occupational therapists use reflection to improve the integration of CE program content in their decision-making processes. The study objectives were…
Measuring Senior High School Students' Self-Induced Self-Reflective Thinking
ERIC Educational Resources Information Center
van Velzen, Joke H.
2017-01-01
Theoretically, reflection is known to be an essential skill for improving learning on a metacognitive level. In practice, students may not use it of their own accord to improve this kind of learning because it can be mentally demanding. The author reports on the legitimation of an instrument measuring self-induced self-reflective thinking, which…
Chang, Chun-Hung; Myers, Erinn M.; Kennelly, Michael J.; Fried, Nathaniel M.
2017-01-01
Near-infrared laser energy in conjunction with applied tissue cooling is being investigated for thermal remodeling of the endopelvic fascia during minimally invasive treatment of female stress urinary incontinence. Previous computer simulations of light transport, heat transfer, and tissue thermal damage have shown that a transvaginal approach is more feasible than a transurethral approach. However, results were suboptimal, and some undesirable thermal insult to the vaginal wall was still predicted. This study uses experiments and computer simulations to explore whether application of an optical clearing agent (OCA) can further improve optical penetration depth and completely preserve the vaginal wall during subsurface treatment of the endopelvic fascia. Several different mixtures of OCAs were tested, and 100% glycerol was found to be the optimal agent. Optical transmission studies, optical coherence tomography, reflection spectroscopy, and computer simulations [including Monte Carlo (MC) light transport, heat transfer, and an Arrhenius integral model of thermal damage] using glycerol were performed. The OCA produced a 61% increase in optical transmission through porcine vaginal wall at 37°C after 30 min. The MC model showed improved energy deposition in the endopelvic fascia using glycerol. Without OCA, 62%, 37%, and 1% of energy was deposited in the vaginal wall, endopelvic fascia, and urethral wall, respectively, compared with 50%, 49%, and 1% using OCA. Use of the OCA also resulted in a 0.5-mm increase in treatment depth, allowing potential thermal tissue remodeling at a depth of 3 mm with complete preservation of the vaginal wall. PMID:28301637
NASA Technical Reports Server (NTRS)
Tam, Christopher K. W.; Aganin, Alexei
2000-01-01
The transonic nozzle transmission problem and the open rotor noise radiation problem are solved computationally. Both are problems with multiple length scales. For efficient and accurate numerical simulation, the multiple-size-mesh multiple-time-step Dispersion-Relation-Preserving scheme is used to calculate the time-periodic solution. To ensure an accurate solution, high-quality numerical boundary conditions are also needed. For the nozzle problem, a set of nonhomogeneous outflow boundary conditions is required. The nonhomogeneous boundary conditions not only generate the incoming sound waves but also, at the same time, allow the reflected acoustic waves and entropy waves, if present, to exit the computational domain without reflection. For the open rotor problem, there is an apparent singularity at the axis of rotation. An analytic extension approach is developed to provide a high-quality axis boundary treatment.
ReflectED: Evaluation Report and Executive Summary
ERIC Educational Resources Information Center
Motteram, Gary; Choudry, Sophina; Kalambouka, Afroditi; Hutcheson, Graeme; Barton, Hutcheson
2016-01-01
The ReflectED programme was developed by Rosendale Primary School to improve pupils' metacognition--their ability to think about and manage their own learning. This includes the skills of setting and monitoring goals, assessing progress, and identifying personal strengths and challenges. ReflectED consists of 28, weekly, half-hour lessons, which…
ERIC Educational Resources Information Center
Campbell, Donald S.; And Others
Two studies examined the effectiveness of self-instruction training via a specially developed computer program to modify the impulsive problem-solving behavior of 16 deaf and 10 learning disabled (aphasic) adolescents attending two special residential schools in Canada. In the control condition, students learned the Apple LOGO computing language…
Applications of personal computers in geophysics
NASA Astrophysics Data System (ADS)
Lee, W. H. K.; Lahr, J. C.; Habermann, R. E.
Since 1981, the use of personal computers (PCs) to increase productivity has become widespread. At present, more than 5 million personal computers are in operation for business, education, engineering, and scientific purposes. Activities within AGU reflect this trend: KOSMOS, the AGU electronic network, was introduced this year, and the AGU Committee on Personal Computers, chaired by W. H. K. Lee (U.S. Geological Survey, Menlo Park, Calif.), was recently formed. In addition, in conjunction with the 1986 AGU Fall Meeting, this committee is organizing a personal computer session and hands-on demonstrations to promote applications of personal computers in geophysics.
Jet formation of SF6 bubble induced by incident and reflected shock waves
NASA Astrophysics Data System (ADS)
Zhu, Yuejin; Yu, Lei; Pan, Jianfeng; Pan, Zhenhua; Zhang, Penggang
2017-12-01
Computational results for two different cases of the evolution of the interaction between a shock and an SF6 heavy bubble are presented. The shock-focusing processes and jet-formation mechanisms are analyzed using high-resolution computational schemes, and the influence of reflected shock waves is also investigated. It is concluded that there are two steps in the shock-focusing process behind the incident shock wave, and the density and pressure values increase markedly once shock focusing is complete. The local high pressure and vorticity near the downstream pole drive the formation of the jet behind the incident shock wave. In addition, the gas has a rightward velocity before the reflected shock wave impinges on the bubble; therefore, the evolution of the waves and the bubble is more complicated when the reflected shock wave impinges on the SF6 bubble. Furthermore, the end-wall distance affects the degree of bubble deformation before the reflected shock wave arrives; consequently, a different left-jet formation process is found after the impingement of the reflected shock wave when L = 27 mm. Local high-pressure zones near the left bubble interface, together with the impingement of the various shock waves, can reverse the local gas velocity from rightward to leftward, which further promotes the formation of jets.
Reflections on the Scholarly Contributions of Professor David H. Jonassen
ERIC Educational Resources Information Center
Reeves, Thomas C.; Lee, Chwee Beng; Hung, Woei
2013-01-01
The six papers in this special issue of "Computers and Education" honoring Professor David H. Jonassen are diverse in nature. They also reflect differing interpretations of the implications of Jonassen's work for research and development focused on instructional models and the factors influencing instruction as well as the directions for future…
Numerical study of unsteady shockwave reflections using an upwind TVD scheme
NASA Technical Reports Server (NTRS)
Hsu, Andrew T.; Liou, Meng-Sing
1990-01-01
An unsteady TVD Navier-Stokes solver was developed and applied to the problem of shock reflection on a circular cylinder. The numerical results were compared with Schlieren photographs from an experimental study and show that the present computer code is able to capture moving shocks.
Promoting Pre-Service Teachers' Reflections through a Cross-Cultural Keypal Project
ERIC Educational Resources Information Center
Wach, Aleksandra
2015-01-01
This paper reports the results of an action research-based study that investigated participants' reflections on EFL learning and teaching in a computer-mediated communication (CMC)-based project. Forty pre-service teachers from two universities, in Poland and in Romania, exchanged emails on class-related topics; the email exchange was followed by…
Optimizations and Applications in Head-Mounted Video-Based Eye Tracking
ERIC Educational Resources Information Center
Li, Feng
2011-01-01
Video-based eye tracking techniques have become increasingly attractive in many research fields, such as visual perception and human-computer interface design. The technique primarily relies on the positional difference between the center of the eye's pupil and the first-surface reflection at the cornea, the corneal reflection (CR). This…
Building Computer Free Sorting Devices Based on Reflection of Visible and NIR Wavelengths
USDA-ARS?s Scientific Manuscript database
NIR and visible light reflection from food products has long been the basis of scientific research for the detection of defects and contaminants as well as food quality attributes. Most of the research in this area reports derived calibration equations that indicate the potential of using the techno...
A Method to Measure and Estimate Normalized Contrast in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2016-01-01
The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided. Methods of converting image contrast to temperature contrast and vice versa are provided. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data-acquisition set-up with a high-reflectivity foil and a high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods of assessing other quantitative parameters, such as object emissivity, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography, are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in the images and measurements, thereby providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
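The paper's exact normalization is not reproduced here; the sketch below shows one common way to form a normalized pixel-intensity contrast from a flash-thermography image sequence. The reference-region convention, frame indexing, and the "subtract the pre-flash level, divide by a sound-area reference" definition are assumptions for illustration.

```python
import numpy as np

def normalized_contrast(frames, defect_px, ref_region, pre_flash_frame=0):
    """Normalized pixel-intensity contrast for a flash-thermography sequence.

    frames     : ndarray (n_frames, H, W) of IR image intensities
    defect_px  : (row, col) of the pixel of interest
    ref_region : (row_slice, col_slice) over a sound (defect-free) area
    Contrast(t) = (I_def(t) - I_def(0)) / (I_ref(t) - I_ref(0)) - 1,
    one of several normalizations in use; values near 0 indicate sound material.
    """
    i_def = frames[:, defect_px[0], defect_px[1]].astype(float)
    i_ref = frames[:, ref_region[0], ref_region[1]].mean(axis=(1, 2)).astype(float)
    d_def = i_def - i_def[pre_flash_frame]          # rise above the pre-flash level
    d_ref = i_ref - i_ref[pre_flash_frame]
    with np.errstate(divide="ignore", invalid="ignore"):
        return d_def / d_ref - 1.0

# Example with synthetic data: 50 frames of a 64 x 64 image
frames = np.random.default_rng(1).normal(100, 1, (50, 64, 64)).cumsum(axis=0)
c = normalized_contrast(frames, (10, 10), (np.s_[40:60], np.s_[40:60]))
```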
Evaluation of general non-reflecting boundary conditions for industrial CFD applications
NASA Astrophysics Data System (ADS)
Basara, Branislav; Frolov, Sergei; Lidskii, Boris; Posvyanskii, Vladimir
2007-11-01
The importance of having proper boundary conditions for the calculation domain is a well-known issue in Computational Fluid Dynamics (CFD). In many situations, it is very difficult to define a correct boundary condition. The flow may enter and leave the computational domain at the same time and at the same boundary. In such circumstances, it is important that the numerical implementation of boundary conditions enforces certain physical constraints, leading to correct results and thus a better convergence rate. The aim of this paper is to evaluate recently proposed non-reflecting boundary conditions (Frolov et al., 2001, Advances in Chemical Propulsion) in industrial CFD applications. Derivation of the local non-reflecting boundary conditions at the open boundary is based on finding the solution of the linearized Euler equations vanishing at infinity for both incompressible and compressible formulations. This is implemented in the in-house CFD package AVL FIRE, and some numerical details are presented as well. The key applications in this paper are from the automotive industry, e.g., external car aerodynamics and an intake port. The results show the benefits of using effective non-reflecting boundary conditions.
Reflection of a polarized light cone
NASA Astrophysics Data System (ADS)
Brody, Jed; Weiss, Daniel; Berland, Keith
2013-01-01
We introduce a visually appealing experimental demonstration of Fresnel reflection. In this simple optical experiment, a polarized light beam travels through a high numerical-aperture microscope objective, reflects off a glass slide, and travels back through the same objective lens. The return beam is sampled with a polarizing beam splitter and produces a surprising geometric pattern on an observation screen. Understanding the origin of this pattern requires careful attention to geometry and an understanding of the Fresnel coefficients for S and P polarized light. We demonstrate that in addition to a relatively simple experimental implementation, the shape of the observed pattern can be computed both analytically and by using optical modeling software. The experience of working through complex mathematical computations and demonstrating their agreement with a surprising experimental observation makes this a highly educational experiment for undergraduate optics or advanced-lab courses. It also provides a straightforward yet non-trivial system for teaching students how to use optical modeling software.
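The pattern described above is shaped by the difference between the S and P Fresnel amplitude coefficients; the sketch below computes them for an uncoated dielectric interface. The refractive index n2 = 1.515 and the angle grid are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def fresnel_rs_rp(theta_i, n1=1.0, n2=1.515):
    """Amplitude reflection coefficients for S and P polarization at a
    dielectric interface (air to glass by default), using Snell's law."""
    theta_i = np.asarray(theta_i, dtype=float)
    theta_t = np.arcsin(np.clip(n1 * np.sin(theta_i) / n2, -1.0, 1.0))
    r_s = (n1 * np.cos(theta_i) - n2 * np.cos(theta_t)) / \
          (n1 * np.cos(theta_i) + n2 * np.cos(theta_t))
    r_p = (n2 * np.cos(theta_i) - n1 * np.cos(theta_t)) / \
          (n2 * np.cos(theta_i) + n1 * np.cos(theta_t))
    return r_s, r_p

# Reflectance differs strongly for S and P near Brewster's angle (about 56.6
# degrees for n2 = 1.515), which is what the polarizing beam splitter reveals.
angles = np.radians(np.linspace(0, 80, 9))
r_s, r_p = fresnel_rs_rp(angles)
print(np.round(r_s**2, 3), np.round(r_p**2, 3))
```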
Optical function of the finite-thickness corrugated pellicle of euglenoids.
Inchaussandague, Marina E; Skigin, Diana C; Dolinko, Andrés E
2017-06-20
We explore the electromagnetic response of the pellicle of selected species of euglenoids. These microorganisms are bounded by a typical surface pellicle formed by S-shaped overlapping bands that resemble a corrugated film. We investigate the role played by this structure in the protection of the cell against UV radiation. By considering the pellicle as a periodically corrugated film of finite thickness, we applied the C-method to compute the reflectance spectra. The far-field results revealed reflectance peaks with a Q-factor larger than 10³ in the UV region for all the illumination conditions investigated. The resonant behavior responsible for this enhancement has also been illustrated by near-field computations performed by a photonic simulation method. These results confirm that the corrugated pellicle of euglenoids shields the cell from harmful UV radiation and open up new possibilities for the design of highly UV-reflective surfaces.
Polarized BRDF for coatings based on three-component assumption
NASA Astrophysics Data System (ADS)
Liu, Hong; Zhu, Jingping; Wang, Kai; Xu, Rong
2017-02-01
A pBRDF (polarized bidirectional reflectance distribution function) model for coatings is given based on a three-component reflection assumption in order to improve polarized-scattering simulation capability for space objects. In this model, the specular reflection is described by microfacet theory, while the multiple reflection and volume scattering are given separately according to experimental results. The polarization of the specular reflection follows from Fresnel's equations, and both the multiple reflection and the volume scattering are assumed to be depolarized. Simulation and measurement results for two satellite coating samples, SR107 and S781, show that the three-component model significantly improves pBRDF modeling accuracy.
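The paper's fitted component expressions are not given in the abstract; the sketch below is a generic scalar illustration of a three-component decomposition: a microfacet specular lobe weighted by unpolarized Fresnel reflectance, plus constant depolarized multiple-reflection and volume terms. The Beckmann facet distribution, the omission of shadowing/masking, and all parameter values are assumptions for illustration only.

```python
import numpy as np

def three_component_brdf(theta_i, theta_r, dphi, m=0.2, n=1.5,
                         k_multiple=0.05, k_volume=0.03):
    """Scalar sketch of a three-component coating BRDF: microfacet specular
    lobe (Fresnel-weighted, shadowing/masking omitted) plus depolarized
    multiple-reflection and volume-scattering terms treated as constants."""
    wi = np.array([np.sin(theta_i), 0.0, np.cos(theta_i)])            # incident direction
    wr = np.array([np.sin(theta_r) * np.cos(dphi),
                   np.sin(theta_r) * np.sin(dphi), np.cos(theta_r)])  # reflected direction
    h = (wi + wr) / np.linalg.norm(wi + wr)                           # microfacet normal
    cos_a = h[2]
    tan2 = (1.0 - cos_a**2) / cos_a**2
    D = np.exp(-tan2 / m**2) / (np.pi * m**2 * cos_a**4)              # Beckmann distribution
    cos_b = np.clip(np.dot(wi, h), -1.0, 1.0)                         # local incidence angle
    theta_b = np.arccos(cos_b)
    theta_t = np.arcsin(np.clip(np.sin(theta_b) / n, -1.0, 1.0))
    rs = (np.cos(theta_b) - n * np.cos(theta_t)) / (np.cos(theta_b) + n * np.cos(theta_t))
    rp = (n * np.cos(theta_b) - np.cos(theta_t)) / (n * np.cos(theta_b) + np.cos(theta_t))
    F = 0.5 * (rs**2 + rp**2)                                         # unpolarized Fresnel reflectance
    specular = F * D / (4.0 * np.cos(theta_i) * np.cos(theta_r))
    return specular + k_multiple + k_volume

print(three_component_brdf(np.radians(30), np.radians(30), np.pi))    # in-plane specular direction
```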
Varela, P; Silva, A; da Silva, F; da Graça, S; Manso, M E; Conway, G D
2010-10-01
The spectrogram is one of the best-known time-frequency distributions suitable for analyzing signals whose energy varies both in time and frequency. In reflectometry, it has been used to obtain the frequency content of FM-CW signals for density profile inversion and also to study plasma density fluctuations from swept and fixed-frequency data. Because it is implemented via the short-time Fourier transform, the spectrogram is limited in resolution, and for that reason several methods have been developed to overcome this problem. Among those, we focus on the reassigned spectrogram technique, which is both easily automated and computationally efficient, requiring only the calculation of two additional spectrograms. In each time-frequency window, the technique reallocates the spectrogram coordinates to the region that most contributes to the signal energy. The application to ASDEX Upgrade reflectometry data results in better energy concentration and improved localization of the spectral content of the reflected signals. When combined with the automatic (data-driven) window-length spectrogram, this technique provides improved profile accuracy, in particular in regions where the frequency content varies most rapidly, such as the edge pedestal shoulder.
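A minimal sketch of the reassignment step (Auger and Flandrin's centre-of-gravity relocation using two auxiliary STFTs) is given below. The Hann window, window lengths, sampling rate, and chirp parameters are placeholders, not those of the ASDEX Upgrade diagnostic.

```python
import numpy as np
from scipy.signal import stft, get_window

def reassigned_spectrogram(x, fs, nperseg=256, noverlap=192):
    """Reassigned spectrogram: each (t, f) energy value is relocated to the
    local centre of gravity, computed from two additional STFTs taken with a
    time-weighted window and with the derivative of the window."""
    h = get_window("hann", nperseg)
    t_c = (np.arange(nperseg) - (nperseg - 1) / 2) / fs   # window-centred time axis
    th = h * t_c                                          # time-weighted window
    dh = np.gradient(h) * fs                              # time derivative of the window
    kw = dict(fs=fs, nperseg=nperseg, noverlap=noverlap, boundary=None, padded=False)
    f, t, Xh = stft(x, window=h, **kw)
    _, _, Xth = stft(x, window=th, **kw)
    _, _, Xdh = stft(x, window=dh, **kw)
    eps = 1e-12
    t_hat = t[None, :] + np.real(Xth * np.conj(Xh) / (np.abs(Xh)**2 + eps))
    f_hat = f[:, None] - np.imag(Xdh * np.conj(Xh) / (np.abs(Xh)**2 + eps)) / (2 * np.pi)
    S = np.abs(Xh)**2                                     # energies to reassign to (t_hat, f_hat)
    return t_hat, f_hat, S

# Example: a linear chirp, loosely standing in for an FM-CW sweep
fs = 1e6
t = np.arange(0, 2e-3, 1 / fs)
x = np.cos(2 * np.pi * (50e3 * t + 0.5 * 100e6 * t**2))
t_hat, f_hat, S = reassigned_spectrogram(x, fs)
```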
Engaging Students in a Service-Learning Community through Computer-Mediated Communication
ERIC Educational Resources Information Center
Bair, Beth Teagarden
2017-01-01
In 2015, a university in rural Maryland offered an undergraduate service-learning leadership course, which collaborated with a service-learning community of practice. This interdisciplinary leadership course initiated and sustained personal and critical reflection and social interactions by integrating Computer-Mediated Communication (CMC)…
Operation of the computer model for microenvironment solar exposure
NASA Technical Reports Server (NTRS)
Gillis, J. R.; Bourassa, R. J.; Gruenbaum, P. E.
1995-01-01
A computer model for microenvironmental solar exposure was developed to predict solar exposure to satellite surfaces which may shadow or reflect on one another. This document describes the technical features of the model as well as instructions for the installation and use of the program.
Computer-Mediated Intersensory Learning Model for Students with Learning Disabilities
ERIC Educational Resources Information Center
Seok, Soonhwa; DaCosta, Boaventura; Kinsell, Carolyn; Poggio, John C.; Meyen, Edward L.
2010-01-01
This article proposes a computer-mediated intersensory learning model as an alternative to traditional instructional approaches for students with learning disabilities (LDs) in the inclusive classroom. Predominant practices of classroom inclusion today reflect the six principles of zero reject, nondiscriminatory evaluation, appropriate education,…
Ghanbari, Cheryl M; Ho, Clifford K; Kolb, Gregory J
2014-03-04
Various technologies described herein pertain to evaluating a beam reflected by a heliostat. A portable target that has an array of sensors mounted thereupon is configured to capture the beam reflected by the heliostat. The sensors in the array output measured values indicative of a characteristic of the beam reflected by the heliostat. Moreover, a computing device can generate and output data corresponding to the beam reflected by the heliostat based on the measured values indicative of the characteristic of the beam received from the sensors in the array.
NDE Imaging of Time Differential Terahertz Waves
NASA Technical Reports Server (NTRS)
Trinh, Long B.
2008-01-01
Natural voids are present in the vicinity of a conathane interface that bonds two different foam materials. These voids are out of focus with the terahertz imaging system, and multiple optical reflections also make it difficult to determine their depths. However, waves passing through the top foam article at normal incidence are partially reflected at the denser conathane layer prior to total reflection at the tank's wall. Reflections embedded in the oscillating noise segment prior to the main signals can be extracted with dual applications of filtering and time differentiation. The void depth is then computed from the direct path's time of flight.
Wille, M-L; Zapf, M; Ruiter, N V; Gemmeke, H; Langton, C M
2015-06-21
The quality of ultrasound computed tomography imaging is primarily determined by the accuracy of ultrasound transit time measurement. A major problem in analysis is the overlap of signals making it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active set deconvolution to derive a transit time spectrum from a coded excitation chirp signal and the measured output signal. The ultrasound wave travels in a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched-filtering has a better accuracy (0.13 μs versus 0.18 μs standard deviations), deconvolution has a 3.5 times improved side-lobe to main-lobe ratio. A higher side-lobe suppression is important to further improve image fidelity. These results suggest that a future combination of both techniques would provide improved signal detection and hence improved image fidelity.
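For illustration, the sketch below applies matched filtering (cross-correlation with the known chirp) to a synthetic received signal containing a direct and a weaker reflected path. The sampling rate, chirp band, delays, and noise level are placeholders, and the active-set deconvolution alternative is not implemented.

```python
import numpy as np
from scipy.signal import chirp, correlate

fs = 20e6                                        # sampling rate, Hz (placeholder)
t = np.arange(0, 20e-6, 1 / fs)
tx = chirp(t, f0=1e6, f1=3e6, t1=t[-1])          # coded-excitation chirp

def delayed(sig, delay_s, total_len):
    """Place a delayed copy of sig into a zero-padded record of total_len samples."""
    out = np.zeros(total_len)
    n = int(round(delay_s * fs))
    out[n:n + sig.size] = sig
    return out

# Synthetic output: direct path at 5.0 us plus a reflected path at 7.5 us
N = tx.size + 200
rx = delayed(tx, 5.0e-6, N) + 0.5 * delayed(tx, 7.5e-6, N)
rx += 0.02 * np.random.default_rng(0).normal(size=rx.size)

# Matched filtering: cross-correlate the measured output with the known input;
# peaks of the correlation magnitude give the transit times of the two paths.
mf = correlate(rx, tx, mode="full")
lags = (np.arange(mf.size) - (tx.size - 1)) / fs
print(lags[np.abs(mf).argmax()] * 1e6, "us (strongest path)")
```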
Simple vertex correction improves GW band energies of bulk and two-dimensional crystals
NASA Astrophysics Data System (ADS)
Schmidt, Per S.; Patrick, Christopher E.; Thygesen, Kristian S.
2017-11-01
The GW self-energy method has long been recognized as the gold standard for quasiparticle (QP) calculations of solids in spite of the fact that the neglect of vertex corrections and the use of a density-functional theory starting point lack rigorous justification. In this work we remedy this situation by including a simple vertex correction that is consistent with a local-density approximation starting point. We analyze the effect of the self-energy by splitting it into short-range and long-range terms which are shown to govern, respectively, the center and size of the band gap. The vertex mainly improves the short-range correlations and therefore has a small effect on the band gap, while it shifts the band gap center up in energy by around 0.5 eV, in good agreement with experiments. Our analysis also explains how the relative importance of short- and long-range interactions in structures of different dimensionality is reflected in their QP energies. Inclusion of the vertex comes at practically no extra computational cost and even improves the basis set convergence compared to GW. Taken together, the method provides an efficient and rigorous improvement over the GW approximation.
Using Systematic Feedback and Reflection to Improve Adventure Education Teaching Skills
ERIC Educational Resources Information Center
Richardson, Rick; Kalvaitis, Darius; Delparte, Donna
2014-01-01
This study examined how adventure educators could use systematic feedback to improve their teaching skills. Evaluative instruments demonstrated a statistically significant improvement in teaching skills when applied at an outdoor education center in Western Canada. Concurrent focus group interviews enabled instructors to reflect on student…
Reflect and Improve: Instructional Development through a Teaching Journal
ERIC Educational Resources Information Center
Boyd, Josh; Boyd, Steve
2005-01-01
This article recommends the teaching journal as a method of instructional improvement. Drawing on teacher education literature, the article reviews the concept of reflective teaching and then describes uses of the teaching journal for college instructors in descriptive, comparative, and critical dimensions. Teaching journals can improve the…
Atmospheric effects on cluster analyses. [for remote sensing application
NASA Technical Reports Server (NTRS)
Kiang, R. K.
1979-01-01
Ground-reflected radiance, from which information is extracted through cluster-analysis techniques for remote sensing applications, is altered by the atmosphere before it reaches the satellite. It is therefore essential to understand the effects of the atmosphere on Landsat measurements, cluster characteristics, and analysis accuracy. A doubling model is employed to compute the effective reflectivity, observed from the satellite, as a function of ground reflectivity, solar zenith angle, and aerosol optical thickness for a standard atmosphere. The relation between the effective reflectivity and ground reflectivity is approximately linear. It is shown that for a horizontally homogeneous atmosphere, the classification statistics from a maximum likelihood classifier remain unchanged under this transformation. If inhomogeneity is present, the divergence between clusters is reduced and the correlation between spectral bands increases. Radiance reflected by the background area surrounding the target may also reach the satellite. The influence of background reflectivity on effective reflectivity is discussed.
Bidirectional reflectance distribution function effects in ladar-based reflection tomography.
Jin, Xuemin; Levine, Robert Y
2009-07-20
Light reflection from a surface is described by the bidirectional reflectance distribution function (BRDF). In this paper, BRDF effects in reflection tomography are studied using modeled range-resolved reflection from well-characterized geometrical surfaces. It is demonstrated that BRDF effects can cause a darkening at the interior boundary of the reconstructed surface analogous to the well-known beam hardening artifact in x-ray transmission computed tomography (CT). This artifact arises from reduced reflection at glancing incidence angles to the surface. It is shown that a purely Lambertian surface without shadowed components is perfectly reconstructed from range-resolved measurements. This result is relevant to newly fabricated carbon nanotube materials. Shadowing is shown to cause crossed streak artifacts similar to limited-angle effects in CT reconstruction. In tomographic reconstruction, these effects can overwhelm highly diffuse components in proximity to specularly reflecting elements. Diffuse components can be recovered by specialized processing, such as reducing glints via thresholded measurements.
NASA Technical Reports Server (NTRS)
Miles, J. H.
1975-01-01
Ground reflection effects on the propagation of jet noise over an asphalt surface are discussed for data obtained using a 33.02-cm diameter nozzle with microphones at several heights and distances from the nozzle axis. Ground reflection effects are analyzed using the concept of a reflected signal transfer function which represents the influence of both the reflecting surface and the atmosphere on the propagation of the reflected signal in a mathematical model. The mathematical model used as a basis for the computer program was successful in significantly reducing the ground reflection effects. The range of values of the single complex number used to define the reflected signal transfer function was larger than expected when determined only by the asphalt surface. This may indicate that the atmosphere is affecting the propagation of the reflected signal more than the asphalt surface. The selective placement of the reinforcements and cancellations in the design of an experiment to minimize ground reflection effects is also discussed.
NASA Technical Reports Server (NTRS)
Caulfield, John; Crosson, William L.; Inguva, Ramarao; Laymon, Charles A.; Schamschula, Marius
1998-01-01
This is a follow-up to the preceding presentation by Crosson and Schamschula. The grid size for remote microwave measurements is much coarser than that of the hydrological model computational grids. To validate the hydrological models with measurements, we propose mechanisms to disaggregate the microwave measurements to allow comparison with outputs from the hydrological models. Weighted interpolation and Bayesian methods are proposed to facilitate the comparison. While remote measurements occur at a large scale, they reflect underlying small-scale features. Continuing estimates of the small-scale features can be obtained by correcting a simple zeroth-order estimate, updating each small-scale model with each large-scale measurement using a straightforward method based on Kalman filtering.
NASA Astrophysics Data System (ADS)
Zhang, Jin-Zhao; Tuo, Xian-Guo
2014-07-01
We present the design and optimization of a prompt γ-ray neutron activation analysis (PGNAA) thermal neutron output setup based on Monte Carlo simulations using the MCNP5 computer code. In these simulations, the moderator materials, reflector materials, and structure of the PGNAA 252Cf thermal neutron output setup are optimized. The simulation results reveal that a thin layer of paraffin and a thick layer of heavy water moderate the 252Cf neutron spectrum most effectively. Our new design shows significantly improved thermal neutron flux and flux rate, which are increased by factors of 3.02 and 3.27, respectively, compared with the conventional neutron source design.
Teleoperated position control of a PUMA robot
NASA Technical Reports Server (NTRS)
Austin, Edmund; Fong, Chung P.
1987-01-01
A laboratory distributed computer control teleoperator system is developed to support NASA's future space telerobotic operations. This teleoperator system uses a universal force-reflecting hand controller at the local site as the operator's input device. At the remote site, a PUMA controller receives the Cartesian position commands and implements PID control laws to position the PUMA robot. The local site uses two microprocessors while the remote site uses three. The processors communicate with each other through shared memory. The PUMA robot controller was interfaced through custom-made electronics to bypass VAL. The development status of this teleoperator system is reported. The execution time of each processor is analyzed, and the overall system throughput rate is reported. Methods to improve the efficiency and performance are discussed.
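The abstract names PID position control of Cartesian commands; the sketch below is a generic discrete-time PID loop for a single axis driving a unit-mass plant. The gains, sample time, and plant model are placeholders, not the actual PUMA/VAL interface.

```python
class PID:
    """Discrete-time PID position controller for one Cartesian axis
    (illustrative only; gains and sample time are placeholders)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, command, measured):
        error = command - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Track a 0.1 m step command with a crude unit-mass (double-integrator) plant
pid = PID(kp=80.0, ki=5.0, kd=20.0, dt=0.001)
pos, vel = 0.0, 0.0
for _ in range(2000):
    u = pid.update(command=0.1, measured=pos)   # control force (placeholder units)
    vel += u * pid.dt
    pos += vel * pid.dt
print(round(pos, 4))
```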
Spatial Modulation Improves Performance in CTIS
NASA Technical Reports Server (NTRS)
Bearman, Gregory H.; Wilson, Daniel W.; Johnson, William R.
2009-01-01
Suitably formulated spatial modulation of a scene imaged by a computed-tomography imaging spectrometer (CTIS) has been found to be useful as a means of improving the imaging performance of the CTIS. As used here, "spatial modulation" signifies the imposition of additional, artificial structure on a scene from within the CTIS optics. The basic principles of a CTIS were described in "Improvements in Computed-Tomography Imaging Spectrometry" (NPO-20561), NASA Tech Briefs, Vol. 24, No. 12 (December 2000), page 38, and "All-Reflective Computed-Tomography Imaging Spectrometers" (NPO-20836), NASA Tech Briefs, Vol. 26, No. 11 (November 2002), page 7a. To recapitulate: A CTIS offers capabilities for imaging a scene with spatial, spectral, and temporal resolution. The spectral disperser in a CTIS is a two-dimensional diffraction grating. It is positioned between two relay lenses (or on one of two relay mirrors) in a video imaging system. If the disperser were removed, the system would produce ordinary images of the scene in its field of view. In the presence of the grating, the image on the focal plane of the system contains both spectral and spatial information because the multiple diffraction orders of the grating give rise to multiple, spectrally dispersed images of the scene. By use of algorithms adapted from computed tomography, the image on the focal plane can be processed into an image cube: a three-dimensional collection of data on the image intensity as a function of the two spatial dimensions (x and y) in the scene and of wavelength (lambda). Thus, both spectrally and spatially resolved information on the scene at a given instant of time can be obtained, without scanning, from a single snapshot; this is what makes the CTIS such a potentially powerful tool for spatially, spectrally, and temporally resolved imaging. A CTIS performs poorly in imaging some types of scenes, in particular scenes that contain little spatial or spectral variation. The computed spectra of such scenes tend to approximate correct values to within acceptably small errors near the edges of the field of view but to be poor approximations away from the edges. The additional structure imposed on a scene according to the present method enables the CTIS algorithms to reconstruct acceptable approximations of the spectral data throughout the scene.
Musicians Crossing Musical Instrument Gender Stereotypes: A Study of Computer-Mediated Communication
ERIC Educational Resources Information Center
Abeles, Harold F.; Hafeli, Mary; Sears, Colleen
2014-01-01
This study examined computer-mediated communication (CMC) -- blogs and responses to YouTube postings -- to better understand how CMCs reflect adolescents' attitudes towards musicians playing instruments that cross gender stereotypes. Employing purposive sampling, we used specific search terms, such as "girl drummer", to identify a…
Nurturing Students' Problem-Solving Skills and Engagement in Computer-Mediated Communications (CMC)
ERIC Educational Resources Information Center
Chen, Ching-Huei
2014-01-01
The present study sought to investigate how to enhance students' well- and ill-structured problem-solving skills and increase productive engagement in computer-mediated communication with the assistance of external prompts, namely procedural and reflection. Thirty-three graduate students were randomly assigned to two conditions: procedural and…
ERIC Educational Resources Information Center
Wu, Zhiwei
2018-01-01
Framed from positioning theory and dynamic systems theory, the paper reports on a naturalistic study involving four Chinese participants and their American peers in an intercultural asynchronous computer-mediated communication (ACMC) activity. Based on the moment-by-moment analysis and triangulation of forum posts, reflective essays, and…
A Relational Frame and Artificial Neural Network Approach to Computer-Interactive Mathematics
ERIC Educational Resources Information Center
Ninness, Chris; Rumph, Robin; McCuller, Glen; Vasquez III, Eleazar; Harrison, Carol; Ford, Angela M.; Capt, Ashley; Ninness, Sharon K.; Bradfield, Anna
2005-01-01
Fifteen participants unfamiliar with mathematical operations relative to reflections and vertical and horizontal shifts were exposed to an introductory lecture regarding the fundamentals of the rectangular coordinate system and the relationship between formulas and their graphed analogues. The lecture was followed immediately by computer-assisted…
Education & Technology: Reflections on Computing in Classrooms.
ERIC Educational Resources Information Center
Fisher, Charles, Ed.; Dwyer, David C., Ed.; Yocam, Keith, Ed.
This volume examines learning in the age of technology, describes changing practices in technology-rich classrooms, and proposes new ways to support teachers as they incorporate technology into their work. It commemorates the eleventh anniversary of the Apple Classrooms of Tomorrow (ACOT) Project, when Apple Computer, Inc., in partnership with a…
Reflections on a Strategic Vision for Computer Network Operations
2010-05-25
Computer Support for the Rhythms of Writing.
ERIC Educational Resources Information Center
Sharples, Mike
1994-01-01
Suggests that writing is a rhythmic activity. Claims that the combined effect of rapidly switching between composing and revising is to set up complex cycles of engagement and reflection that may disrupt the flow of composition. Describes "Writer's Assistant," a writing environment designed to study computer support for writing processes. Proposes…
Single photon emission computed tomography in motor neuron disease with dementia.
Sawada, H; Udaka, F; Kishi, Y; Seriu, N; Mezaki, T; Kameyama, M; Honda, M; Tomonobu, M
1988-01-01
Single photon emission computed tomography with [123I] isopropylamphetamine was carried out on a patient with motor neuron disease with dementia. [123I] uptake was decreased in the frontal lobes. This would reflect the histopathological findings such as neuronal loss and gliosis in the frontal lobes.
Operation of the computer model for microenvironment atomic oxygen exposure
NASA Technical Reports Server (NTRS)
Bourassa, R. J.; Gillis, J. R.; Gruenbaum, P. E.
1995-01-01
A computer model for microenvironment atomic oxygen exposure has been developed to extend atomic oxygen modeling capability to include shadowing and reflections. The model uses average exposure conditions established by the direct exposure model and extends the application of these conditions to treat surfaces of arbitrary shape and orientation.
Reflections on John Monaghan's "Computer Algebra, Instrumentation, and the Anthropological Approach"
ERIC Educational Resources Information Center
Blume, Glen
2007-01-01
Reactions to John Monaghan's "Computer Algebra, Instrumentation and the Anthropological Approach" focus on a variety of issues related to the ergonomic approach (instrumentation) and anthropological approach to mathematical activity and practice. These include uses of the term technique; several possibilities for integration of the two approaches;…
Assistive Technology for Every Child
ERIC Educational Resources Information Center
Boyd, Barbara Foulks
2008-01-01
The Montessori philosophy advocates that the classroom be a reflection of the home, the community, and the world. Now, a century after Maria Montessori founded her Casa dei Bambini, the world is becoming a high-technology society, with computers a part of everyday American lives. Computers are almost a household necessity, and basic…
NASA Astrophysics Data System (ADS)
Christensen, David B.; Basaeri, Hamid; Roundy, Shad
2017-12-01
In acoustic power transfer systems, a receiver is displaced from a transmitter by an axial depth, a lateral offset (alignment), and a rotation angle (orientation). In systems where the receiver's position is not fixed, such as a receiver implanted in biological tissue, slight variations in depth, orientation, or alignment can cause significant variations in the received voltage and power. To address this concern, this paper presents a computationally efficient technique to model the effects of depth, orientation, and alignment via ray tracing (DOART) on received voltage and power in acoustic power transfer systems. DOART combines equivalent-circuit transducer models, a modified version of Huygens' principle, and ray tracing to simulate pressure wave propagation and reflection between a transmitter and a receiver in a homogeneous medium. A reflected grid method is introduced to calculate propagation distances, reflection coefficients, and initial vectors between a point on the transmitter and a point on the receiver for an arbitrary number of reflections. DOART convergence and simulation time per data point are discussed as a function of the number of reflections and elements chosen. Finally, experimental data are compared to DOART simulation data in terms of the magnitude and shape of the received voltage signal.
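The reflected grid method itself is not specified in the abstract; as an illustration of how multipath lengths can be enumerated, the sketch below uses the closely related image-source construction for a transmitter point and a receiver point lying on two parallel faces. The geometry, reflection coefficient, spreading law, and sound speed are placeholders.

```python
import numpy as np

def reflected_path_lengths(depth, lateral_offset, n_reflection_pairs, refl_coeff=0.8):
    """Image-source enumeration of multipath between two parallel faces.

    A transmitter point on the plane z = 0 and a receiver point on the plane
    z = depth (laterally offset by lateral_offset) exchange a direct ray plus
    rays that bounce back and forth between the two faces.  The path with k
    round trips unfolds, via image sources, into a straight ray of axial
    length (2k + 1) * depth, attenuated by refl_coeff ** (2k).
    """
    k = np.arange(n_reflection_pairs + 1)
    axial = (2 * k + 1) * depth
    lengths = np.hypot(lateral_offset, axial)
    amplitudes = refl_coeff ** (2 * k) / lengths     # simple 1/r spreading (placeholder)
    return lengths, amplitudes

lengths, amps = reflected_path_lengths(depth=0.02, lateral_offset=0.002,
                                       n_reflection_pairs=3)
c = 1540.0                                           # speed of sound in soft tissue, m/s
print(np.round(lengths / c * 1e6, 2), "us arrival times")
```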
NASA Astrophysics Data System (ADS)
Bostater, Charles R., Jr.; Rebbman, Jan; Hall, Carlton; Provancha, Mark; Vieglais, David
1995-11-01
Measurements of temporal reflectance signatures as a function of growing season for sand live oak (Quercus geminata), myrtle oak (Q. myrtifolia), and saw palmetto (Serenoa repens) were collected during a two-year study period. Canopy-level spectral reflectance signatures, as a function of 252 channels between 368 and 1115 nm, were collected using near-nadir viewing geometry and a consistent sun illumination angle. Leaf-level reflectance measurements were made in the laboratory using a halogen light source and an environmental optics chamber with a barium sulfate reflectance coating. Spectral measurements were related to several biophysical measurements utilizing the optimal passive ambient correlation spectroscopy (OPACS) technique. Biophysical parameters included percent moisture, water potential (MPa), total chlorophyll, and total Kjeldahl nitrogen. Quantitative data processing techniques were used to determine optimal bands based on a second-order derivative (inflection) estimator. An optical cleanup procedure was then employed that computes the double inflection ratio (DIR) spectra for all possible three-band combinations normalized to the previously computed optimal bands. These results demonstrate a unique approach to the analysis of high-spectral-resolution reflectance signatures for estimation of several biophysical measures of plants at the leaf and canopy level from optimally selected bands or bandwidths.
Remote sensing applied to agriculture: Basic principles, methodology, and applications
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Mendonca, F. J.
1981-01-01
The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described. The theoretical spectral responses of crops; the reflectance, transmittance, and absorptance of plants; the interactions of plants and soils with reflected energy; leaf morphology; and the factors which affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study wherein infrared film was used to detect crop anomalies and other data applications are described.
NASA Technical Reports Server (NTRS)
Fieno, D.; Fox, T.; Mueller, R.
1972-01-01
Clean criticality data were obtained from molybdenum-reflected cylindrical uranyl-fluoride-water solution reactors. Using ENDF/B molybdenum cross sections, a nine energy group two-dimensional transport calculation of a reflected reactor configuration predicted criticality to within 7 cents of the experimental value. For these reactors, it was necessary to compute the reflector resonance integral by a detailed transport calculation at the core-reflector interface volume in the energy region of the two dominant resonances of natural molybdenum.
A radiosity-based model to compute the radiation transfer of soil surface
NASA Astrophysics Data System (ADS)
Zhao, Feng; Li, Yuguang
2011-11-01
A good understanding of the interactions of electromagnetic radiation with the soil surface is important for further improvement of remote sensing methods. In this paper, a radiosity-based analytical model for soil Directional Reflectance Factor (DRF) distributions was developed and evaluated. The model was specifically dedicated to the study of radiation transfer for soil surfaces under tillage practices. The soil was abstracted as two-dimensional U-shaped or V-shaped geometric structures with periodic macroscopic variations. The roughness of the simulated surfaces was expressed as the ratio of the height to the width of the U- and V-shaped structures. The assumption was made that the shadowing of the soil surface, simulated by U- or V-shaped grooves, has a greater influence on the soil reflectance distribution than the scattering properties of basic soil particles of silt and clay. Another assumption was that the soil is a perfectly diffuse reflector at the microscopic level, which is a prerequisite for the application of the radiosity method. This radiosity-based analytical model was evaluated against a forward Monte Carlo ray-tracing model under the same structural scenes and identical spectral parameters. The statistics of the two models' BRF fitting results for several soil structures under the same conditions showed good agreement. Using the model, the physical mechanism of the soil bidirectional reflectance pattern was revealed.
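As an illustration of the radiosity step, the sketch below solves the standard Lambertian radiosity system B = E + diag(rho) F B for a toy two-facet groove. The form factors, reflectances, and irradiances are placeholders, not the paper's U- or V-groove geometry or its analytical view factors.

```python
import numpy as np

def solve_radiosity(rho, F, E_direct):
    """Solve the radiosity system B = E + diag(rho) @ F @ B for Lambertian facets.

    B        : radiosity leaving each facet
    E_direct : first-bounce reflection of the direct solar irradiance (rho * incident)
    rho      : facet reflectances
    F        : form-factor matrix (row sums <= 1)
    """
    n = len(rho)
    A = np.eye(n) - np.diag(rho) @ F
    return np.linalg.solve(A, E_direct)

# Toy V-groove: two facing facets; all numbers are placeholders.
rho = np.array([0.3, 0.3])                  # soil facet reflectance
F = np.array([[0.0, 0.4],                   # fraction of facet 1's radiosity reaching facet 2
              [0.4, 0.0]])
E_direct = rho * np.array([800.0, 200.0])   # sunlit vs. partly shadowed facet (W m^-2)
print(solve_radiosity(rho, F, E_direct))
```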
Prisms with total internal reflection as solar reflectors
Rabl, Arnulf; Rabl, Veronika
1978-01-01
An improved reflective wall for radiant energy collection and concentration devices is provided. The wall is comprised of a plurality of prisms whose frontal faces are adjacent and which reflect the desired radiation by total internal reflection.
Estimation of Catchment Transit Time in Fuji River Basin by using an improved Tank model
NASA Astrophysics Data System (ADS)
Wenchao, M.; Yamanaka, T.; Wakiyama, Y.; Wang, P.
2013-12-01
As an important parameter reflecting the characteristics of catchments, the catchment transit time (CTT) has received increasingly wide attention in recent years. The CTT is defined as the time water spends travelling through a catchment to the stream network [1], and it describes how catchments retain and release water and solutes and thus control geochemical and biogeochemical cycling and contamination persistence [2]. The objectives of the present study are to develop a new approach for estimating CTT without prior information on transit time distribution (TTD) functions and to apply it to the Fuji River basin in the Central Japan Alps Region. In this study, an improved Tank model was used to compute the mean CTT and TTD functions simultaneously. It involves water fluxes and isotope mass balance. Water storage capacity in the catchment, which strongly affects CTT, is reflected more sensitively in the isotope mass balance than in the water fluxes. A model calibrated with observed discharge and isotope data is used for virtual age tracer computation to estimate CTT. This model not only considers the hydrological data and physical processes of the research area but also reflects the actual TTD, taking into account the geological conditions, land use, and other catchment-hydrological conditions. For calibration of the model, we used river discharge records obtained by the Ministry of Land, Infrastructure and Transportation, and are collecting isotope data of precipitation and river waters monthly or semi-weekly. Three sub-catchments (SC1-SC3) in the Fuji River basin were selected to test the model with five layers: the surface layer, upper-soil layer, lower-soil layer, groundwater aquifer layer, and bedrock layer (Layer 1-Layer 5). Model output was evaluated using the Nash-Sutcliffe efficiency (NSE), the root mean square error-observations standard deviation ratio (RSR), and percent bias (PBIAS), as sketched below. Using a long time series of discharge records for calibration, the simulated discharge basically satisfied the requirements of reproducing water fluxes and their balance, although improvements in the parameter estimates relating to isotope mass balance are still necessary. Water and isotope balances were exercised in a large number of simulations using a Monte Carlo method, and the optimal parameter combination generated reliable results. We then derived the time-variant mean transit time (MTT) as well as the degree of influence of precipitation events; the results showed an inverse relationship between precipitation amount and MTT. References: [1] McDonnell, J. J., McGuire, K. J., Aggarwal, P., et al. 2010. How old is stream water? Open questions in catchment transit time conceptualization, modeling and analysis. Hydrol. Process. 24, 1745-1754. [2] McGuire, K. J., McDonnell, J. J. 2006. A review and evaluation of catchment transit time modeling. Journal of Hydrology 330, 543-563.
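For reference, the three evaluation statistics named above have standard definitions; a minimal sketch using their common forms (sign conventions may differ from the authors' software) is:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rsr(obs, sim):
    """RMSE normalized by the standard deviation of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.sum((obs - sim) ** 2)) / np.sqrt(np.sum((obs - obs.mean()) ** 2))

def pbias(obs, sim):
    """Percent bias; with this convention, positive values indicate underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)
```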
2015-01-01
Pegasus Airfield Repair and Protection: Laboratory Trials of White Ice Paint to Improve the Energy Reflectance Properties of the Glacial-Ice Runway Surface (Cold Regions Research and Engineering Laboratory, ERDC/CRREL TN-15-1, January 2015). 1 Introduction: The Pegasus White Ice Runway at McMurdo Station, Antarctica, has experienced significant melting during the past two austral...
Neutron reflecting supermirror structure
Wood, James L.
1992-01-01
An improved neutron reflecting supermirror structure comprising a plurality of stacked sets of bilayers of neutron reflecting materials. The improved neutron reflecting supermirror structure is adapted to provide extremely good performance at high incidence angles, i.e., up to four times the critical angle of standard neutron mirror structures. The reflection of neutrons striking the supermirror structure at such high angles provides enhanced neutron throughput, and hence more efficient and economical use of neutron sources. One layer of each set of bilayers consists of titanium, and the second layer of each set consists of an alloy of nickel with carbon interstitially present in the nickel alloy.
ERIC Educational Resources Information Center
Duijnhouwer, Hendrien; Prins, Frans J.; Stokking, Karel M.
2012-01-01
This study investigated the effects of feedback providing improvement strategies and a reflection assignment on students' writing motivation, process, and performance. Students in the experimental feedback condition (n = 41) received feedback including improvement strategies, whereas students in the control feedback condition (n = 41) received…
Method and tool for network vulnerability analysis
Swiler, Laura Painton [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM
2006-03-14
A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
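A toy illustration of the underlying idea (not the patented tool): attack states become graph nodes, edges carry attacker-effort weights, and a least-effort path, approximating the high-risk paths worth defending first, is extracted with Dijkstra's algorithm. Node names and weights below are invented.

```python
import heapq

def least_effort_path(graph, start, goal):
    """Dijkstra search over an attack graph whose edge weights encode attacker
    effort (or, e.g., -log of success probability).
    graph: dict mapping state -> list of (next_state, effort) edges."""
    frontier = [(0.0, start, [start])]
    best = {}
    while frontier:
        cost, state, path = heapq.heappop(frontier)
        if state == goal:
            return cost, path
        if cost >= best.get(state, float("inf")):
            continue
        best[state] = cost
        for nxt, effort in graph.get(state, []):
            heapq.heappush(frontier, (cost + effort, nxt, path + [nxt]))
    return float("inf"), []

# Toy attack graph: nodes are attack states, weights are attacker effort.
g = {"foothold": [("user", 2.0), ("scan", 0.5)],
     "scan": [("user", 1.0)],
     "user": [("root", 3.0)]}
print(least_effort_path(g, "foothold", "root"))
```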
Using Dynamic Software in Mathematics: The Case of Reflection Symmetry
ERIC Educational Resources Information Center
Tatar, Enver; Akkaya, Adnan; Kagizmanli, Türkan Berrin
2014-01-01
This study was carried out to examine the effects of computer-assisted instruction (CAI) using dynamic software on the achievement of students in mathematics in the topic of reflection symmetry. The study also aimed to ascertain the pre-service mathematics teachers' opinions on the use of CAI in mathematics lessons. In the study, a mixed research…
ERIC Educational Resources Information Center
Levy, Philippa
2006-01-01
This paper focuses on learners' experiences of text-based computer-mediated communication (CMC) as a means of self-expression, dialogue and debate. A detailed case study narrative and a reflective commentary are presented, drawn from a personal, practice-based inquiry into the design and facilitation of a professional development course for which…
Low-Frequency Shallow Water Acoustics (20 to 500 Hz),
1986-05-01
Bottom-reflection loss computations based on the method developed by Hastrup [49] are shown in Fig. V-1 ("SILT: computed reflection loss"); see also Table II-4. Reference 49: O. F. Hastrup, "Some bottom-reflection loss anomalies near grazing," University of Washington, Applied Physics Laboratory, APL-UW 8606.
Simulated BRDF based on measured surface topography of metal
NASA Astrophysics Data System (ADS)
Yang, Haiyue; Haist, Tobias; Gronle, Marc; Osten, Wolfgang
2017-06-01
The radiative reflective properties of a calibration-standard rough surface were simulated by ray tracing and the finite-difference time-domain (FDTD) method. The simulation results were used to compute the bidirectional reflectance distribution functions (BRDFs) of metal surfaces and were compared with experimental measurements. The experimental and simulated results are in good agreement.
ERIC Educational Resources Information Center
Kearney, Kerri S.; Damron, Rebecca; Sohoni, Sohum
2015-01-01
This paper investigates group/team development in computer engineering courses at a University in the Central USA from the perspective of organization behavior theory, specifically Tuckman's model of the stages of group development. The investigation, conducted through linguistic analysis of student reflection essays, and through focus group…
Use of Reflection-in-Action and Self-Assessment to Promote Critical Thinking Among Pharmacy Students
Gregory, Paul AM; Chiu, Stephanie
2008-01-01
Objective To examine whether self-assessment and reflection-in-action improves critical thinking among pharmacy students. Methods A 24-item standardized test of critical thinking was developed utilizing previously-validated questions. Participants were divided into 2 groups (conditions). Those in condition 1 completed the test with no interference; those in condition 2 completed the test but were prompted at specific points during the test to reflect and self-assess. Results A total of 94 undergraduate (BScPhm) pharmacy students participated in this study. Significant differences (p < 0.05) were observed between those who completed the test under condition 1 and condition 2, suggesting reflection and self-assessment may contribute positively to improvement in critical thinking. Conclusions Structured opportunities to reflect-in-action and self-assess may be associated with improvements among pharmacy students in performance of tasks related to critical thinking. PMID:18698383
Using computer simulations to facilitate conceptual understanding of electromagnetic induction
NASA Astrophysics Data System (ADS)
Lee, Yu-Fen
This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education categorized by three different learning frameworks and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of reviewed literature, I proposed effective approaches to integrate computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit their revised answers electronically. Students in the TRAD group were not granted access to the CLCS material and followed their normal classroom routine. At the end of the study, both the CLCS and TRAD students took a post-test. Questions on the post-test were divided into "what" questions, "how" questions, and an open response question. Analysis of students' post-test performance showed mixed results. While the TRAD students scored higher on the "what" questions, the CLCS students scored higher on the "how" questions and the open response question. This result suggested that more TRAD students knew what kinds of conditions may or may not cause electromagnetic induction without understanding how electromagnetic induction works. Analysis of the CLCS students' learning also suggested that frequent disruption and technical trouble might pose threats to the effectiveness of the CLCS learning framework. Beyond the mixed post-test results, the study also revealed some limitations of the CLCS learning framework in promoting conceptual understanding in physics. Improvement can be made by providing students with the background knowledge necessary to understand model reasoning and by incorporating the CLCS learning framework with other learning frameworks to promote integration of various physics concepts. In addition, the reflective questions in the CLCS learning framework may be refined to better address students' difficulties. Limitations of the study, as well as suggestions for future research, are also presented in this study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barker, Ashley D.; Bernholdt, David E.; Bland, Arthur S.
Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern California to date. The Titan system provides the largest extant heterogeneous architecture for computing and computational science. Usage is high, delivering on the promise of a system well-suited for capability simulations for science. This success is due in part to innovations in tracking and reporting the activity on the compute nodes, and using this information to further enable and optimize applications, extending and balancing workload across the entire node. The OLCF continues to invest in innovative processes, tools, and resources necessary to meet continuing user demand. The facility’s leadership in data analysis and workflows was featured at the Department of Energy (DOE) booth at SC15, for the second year in a row, highlighting work with researchers from the National Library of Medicine coupled with unique computational and data resources serving experimental and observational data across facilities. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. Building on the exemplary year of 2014, as shown by the 2014 Operational Assessment Report (OAR) review committee response in Appendix A, this OAR delineates the policies, procedures, and innovations implemented by the OLCF to continue delivering a multi-petaflop resource for cutting-edge research. This report covers CY 2015, which, unless otherwise specified, denotes January 1, 2015, through December 31, 2015.
Simulation of laser beam reflection at the sea surface: modeling and validation
NASA Astrophysics Data System (ADS)
Schwenger, Frédéric; Repasi, Endre
2013-06-01
A 3D simulation of the reflection of a Gaussian-shaped laser beam on the dynamic sea surface is presented. The simulation is suitable for the pre-calculation of images for cameras operating in different spectral wavebands (visible, short-wave infrared) for a bistatic configuration of laser source and receiver under different atmospheric conditions. In the visible waveband, the calculated detected total power of reflected laser light from a 660 nm laser source is compared with data collected in a field trial. Our computer simulation comprises the 3D simulation of a maritime scene (open sea/clear sky) and the simulation of the laser beam reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth, wind-driven gravity waves. To predict the view of a camera, the sea surface radiance must be calculated for the specific waveband. Additionally, the radiances of laser light specularly reflected at the wind-roughened sea surface are modeled considering an analytical statistical sea surface BRDF (bidirectional reflectance distribution function). Validation of simulation results is a prerequisite before applying the computer simulation to maritime laser applications. For validation purposes, data (images and meteorological data) were selected from field measurements, using a 660 nm cw laser diode to produce laser beam reflection at the water surface and recording images with a TV camera. The validation is done by numerical comparison of measured total laser power extracted from recorded images with the corresponding simulation results. The results of the comparison are presented for different incident (zenith/azimuth) angles of the laser beam.
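A minimal sketch of one ingredient of such a statistical sea-surface model (a generic illustration, not the authors' simulation code): the isotropic Cox-Munk slope distribution evaluated at the facet slope that produces a specular glint for a given source/receiver geometry.

```python
import numpy as np

def cox_munk_slope_pdf(zx, zy, wind_speed):
    """Isotropic Cox-Munk slope distribution: probability density of a wave
    facet having slopes (zx, zy) at a given wind speed (m/s, ~12.5 m height)."""
    mss = 0.003 + 5.12e-3 * wind_speed          # total mean-square slope
    return np.exp(-(zx**2 + zy**2) / mss) / (np.pi * mss)

def specular_slope(incident_dir, reflected_dir):
    """Slope components of the facet whose normal bisects the (unit) incident
    and reflected propagation directions, i.e. the facet producing a glint."""
    n = reflected_dir - incident_dir            # un-normalized bisector normal
    n = n / np.linalg.norm(n)
    return -n[0] / n[2], -n[1] / n[2]           # zx = -nx/nz, zy = -ny/nz

# Example: beam incident 30 deg from zenith, receiver at 20 deg in the same plane.
inc = np.array([np.sin(np.radians(30)), 0.0, -np.cos(np.radians(30))])
ref = np.array([np.sin(np.radians(20)), 0.0,  np.cos(np.radians(20))])
zx, zy = specular_slope(inc, ref)
print(cox_munk_slope_pdf(zx, zy, wind_speed=5.0))
```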
Improving Inquiry Teaching through Reflection on Practice
NASA Astrophysics Data System (ADS)
Lotter, Christine R.; Miller, Cory
2017-08-01
In this paper, we explore middle school science teachers' learning of inquiry-based instructional strategies through reflection on practice teaching sessions during a summer enrichment program with middle level students. The reflection sessions were part of a larger year-long inquiry professional development program in which teachers learned science content and inquiry pedagogy. The program included a 2-week summer institute in which teachers participated in science content sessions, practice teaching to middle level students, and small group-facilitated reflection sessions on their teaching. For this study, data collection focused on teachers' recorded dialogue during the facilitator-run reflection sessions, the teachers' daily written reflections, a final written reflection, and a written reflection on a videotaped teaching session. We investigated the teachers' reflection levels and the themes teachers focused on during their reflection sessions. Teachers were found to reflect at various reflection levels, from simple description to a more sophisticated focus on how to improve student learning. Recurrent themes point to the importance of providing situated learning environments, such as the practice teaching with immediate reflection for teachers to have time to practice new instructional strategies and gain insight from peers and science educators on how to handle student learning issues.
Peer group reflection helps clinical teachers to critically reflect on their teaching.
Boerboom, Tobias B B; Jaarsma, Debbie; Dolmans, Diana H J M; Scherpbier, Albert J J A; Mastenbroek, Nicole J J M; Van Beukelen, Peter
2011-01-01
Student evaluations can help clinical teachers to reflect on their teaching skills and find ways to improve their teaching. Studies have shown that the mere presentation of student evaluations is not a sufficient incentive for teachers to critically reflect on their teaching. We evaluated and compared the effectiveness of two feedback facilitation strategies that were identical except for a peer reflection meeting. In this study, 54 clinical teachers were randomly assigned to two feedback strategies. In one strategy, a peer reflection was added as an additional step. All teachers completed a questionnaire evaluating the strategy that they had experienced. We analysed the reflection reports and the evaluation questionnaire. Both strategies stimulated teachers to reflect on feedback and formulate alternative actions for their teaching practice. The teachers who had participated in the peer reflection meeting showed deeper critical reflection and more concrete plans to change their teaching. All feedback strategies were considered effective by the majority of the teachers. Strategies with student feedback and self-assessment stimulated reflection on teaching and helped clinical teachers to formulate plans for improvement. A peer reflection meeting seemed to enhance reflection quality. Further research should establish whether it can have lasting effects on teaching quality.
Numerical study of the effects of icing on viscous flow over wings
NASA Technical Reports Server (NTRS)
Sankar, L. N.
1994-01-01
An improved hybrid method for computing unsteady compressible viscous flows is presented. This method divides the computational domain into two zones. In the outer zone, the unsteady full-potential equation (FPE) is solved. In the inner zone, the Navier-Stokes equations are solved using a diagonal form of an alternating-direction implicit (ADI) approximate factorization procedure. The two zones are tightly coupled so that steady and unsteady flows may be efficiently solved. Characteristic-based viscous/inviscid interface boundary conditions are employed to avoid spurious reflections at that interface. The resulting CPU times are less than 60 percent of that required for a full-blown Navier-Stokes analysis for steady flow applications and about 60 percent of the Navier-Stokes CPU times for unsteady flows in non-vector processing machines. Applications of the method are presented for a rectangular NACA 0012 wing in low subsonic steady flow at moderate and high angles of attack, and for an F-5 wing in steady and unsteady subsonic and transonic flows. Steady surface pressures are in very good agreement with experimental data and are essentially identical to Navier-Stokes predictions. Density contours show that shocks cross the viscous/inviscid interface smoothly, so that the accuracy of full Navier-Stokes equations can be retained with a significant savings in computational time.
Analog Computer-Aided Detection (CAD) information can be more effective than binary marks.
Cunningham, Corbin A; Drew, Trafton; Wolfe, Jeremy M
2017-02-01
In socially important visual search tasks, such as baggage screening and diagnostic radiology, experts miss more targets than is desirable. Computer-aided detection (CAD) programs have been developed specifically to improve performance in these professional search tasks. For example, in breast cancer screening, many CAD systems are capable of detecting approximately 90% of breast cancer, with approximately 0.5 false-positive detections per image. Nevertheless, benefits of CAD in clinical settings tend to be small (Birdwell, 2009) or even absent (Meziane et al., 2011; Philpotts, 2009). The marks made by a CAD system can be "binary," giving the same signal to any location where the signal is above some threshold. Alternatively, a CAD system presents an analog signal that reflects strength of the signal at a location. In the experiments reported, we compare analog and binary CAD presentations using nonexpert observers and artificial stimuli defined by two noisy signals: a visible color signal and an "invisible" signal that informed our simulated CAD system. We found that analog CAD generally yielded better overall performance than binary CAD. The analog benefit is similar at high and low target prevalence. Our data suggest that the form of the CAD signal can directly influence performance. Analog CAD may allow the computer to be more helpful to the searcher.
Orthopaedic Application Of Spatio Temporal Analysis Of Body Form And Function
NASA Astrophysics Data System (ADS)
Tauber, C.; Au, J.; Bernstein, S.; Grant, A.; Pugh, J.
1983-07-01
Spatial and temporal analysis of walking provides the orthopaedist with objective evidence of functional ability and improvement in a patient. Patients with orthopaedic problems experiencing extreme pain and, consequently, irregularities in joint motions on weightbearing are videorecorded before, during and after a course of rehabilitative treatment and/or surgical correction of their disability. A specially-programmed computer analyzes these tapes for the parameters of walking by locating reflective spots which indicate the centers of the lower limb joints. The following parameters of gait are then generated: dynamic hip, knee and foot angles at various intervals during walking; vertical, horizontal and lateral displacements of each joint at various time intervals; linear and angular velocities of each joint; and the relationships between the joints during various phases of the gait cycle. The systematic sampling and analysis of the videorecordings by computer enable such information to be converted into and presented as computer graphics, as well as organized into tables of gait variables. This format of presentation of the skeletal adjustments involved in normal human motion provides the clinician with a visual format of gait information which objectively illuminates the multifaceted and complex factors involved. This system provides the clinician a method by which to evaluate the success of the regimen in terms of patient comfort and function.
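As a small illustration of the kind of computation involved (hypothetical marker coordinates, not the clinical system's software), a planar joint angle can be derived from the image positions of three reflective markers:

```python
import numpy as np

def joint_angle(prox, joint, dist):
    """Planar included angle (degrees) at `joint`, from the 2-D image
    coordinates of three reflective markers, e.g. hip, knee and ankle centers."""
    v1 = np.asarray(prox, float) - np.asarray(joint, float)
    v2 = np.asarray(dist, float) - np.asarray(joint, float)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hip at (0, 100), knee at (5, 55), ankle at (0, 10), in image coordinates (cm).
print(joint_angle((0, 100), (5, 55), (0, 10)))  # included knee angle
```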
NASA Technical Reports Server (NTRS)
Mulligan, Jeffrey B.
2017-01-01
A color algebra refers to a system for computing sums and products of colors, analogous to additive and subtractive color mixtures. We would like it to match the well-defined algebra of spectral functions describing lights and surface reflectances, but an exact correspondence is impossible after the spectra have been projected to a three-dimensional color space, because of metamerism: physically different spectra can produce the same color sensation. Metameric spectra are interchangeable for the purposes of addition, but not multiplication, so any color algebra is necessarily an approximation to physical reality. Nevertheless, because the majority of naturally-occurring spectra are well-behaved (e.g., continuous and slowly-varying), color algebras can be formulated that are largely accurate and agree well with human intuition. Here we explore the family of algebras that result from associating each color with a member of a three-dimensional manifold of spectra. This association can be used to construct a color product, defined as the color of the spectrum of the wavelength-wise product of the spectra associated with the two input colors. The choice of the spectral manifold determines the behavior of the resulting system, and certain special subspaces allow computational efficiencies. The resulting systems can be used to improve computer graphic rendering techniques, and to model various perceptual phenomena such as color constancy.
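A minimal sketch of the idea with toy basis spectra and sensor curves (all numerical choices below are assumptions, not the paper's manifold): each color is lifted to a spectrum in a three-dimensional linear span, the spectra are multiplied wavelength-wise, and the product is projected back to a color.

```python
import numpy as np

wl = np.linspace(400, 700, 31)                        # wavelengths, nm
basis = np.stack([np.exp(-((wl - c) / 80.0) ** 2)     # smooth toy basis spectra
                  for c in (450.0, 550.0, 650.0)], axis=1)       # (31, 3)
sensors = np.stack([np.exp(-((wl - c) / 40.0) ** 2)   # toy "cone" sensitivities
                    for c in (600.0, 550.0, 450.0)], axis=0)     # (3, 31)

A = sensors @ basis                                    # color of each basis spectrum

def spectrum_of(color):
    """Spectrum on the manifold whose projected color equals `color`."""
    return basis @ np.linalg.solve(A, np.asarray(color, float))

def color_product(c1, c2):
    """Color of the wavelength-wise product of the associated spectra."""
    return sensors @ (spectrum_of(c1) * spectrum_of(c2))

print(color_product([1.0, 0.8, 0.2], [0.3, 0.9, 0.9]))
```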
Mobile computing device configured to compute irradiance, glint, and glare of the sun
Gupta, Vipin P; Ho, Clifford K; Khalsa, Siri Sahib
2014-03-11
Described herein are technologies pertaining to computing the solar irradiance distribution on a surface of a receiver in a concentrating solar power system or glint/glare emitted from a reflective entity. A mobile computing device includes at least one camera that captures images of the Sun and the entity of interest, wherein the images have pluralities of pixels having respective pluralities of intensity values. Based upon the intensity values of the pixels in the respective images, the solar irradiance distribution on the surface of the entity or glint/glare corresponding to the entity is computed by the mobile computing device.
New advances in the partial-reflection-drifts experiment using microprocessors
NASA Technical Reports Server (NTRS)
Ruggerio, R. L.; Bowhill, S. A.
1982-01-01
Improvements to the partial reflection drifts experiment have been completed. The results of the improvements include real-time processing and simultaneous measurements of the D region with coherent scatter. Preliminary results indicate a positive correlation between drift velocities calculated by both methods during a two day interval. The possibility now exists for extended observations between partial reflection and coherent scatter. In addition, preliminary measurements could be performed between partial reflection and meteor radar to complete a comparison of methods used to determine velocities in the D region.
Mynard, Jonathan P; Penny, Daniel J; Smolich, Joseph J
2018-03-15
Coronary wave intensity analysis (WIA) is an emerging technique for assessing upstream and downstream influences on myocardial perfusion. It is thought that a dominant backward decompression wave (BDWdia) is generated by a distal suction effect, while early-diastolic forward decompression (FDWdia) and compression (FCWdia) waves originate in the aorta. We show that wave reflection also makes a substantial contribution to FDWdia, FCWdia and BDWdia, as quantified by a novel method. In 18 sheep, wave reflection accounted for ∼70% of BDWdia, whereas distal suction dominated in a computer model representing a hypertensive human. Non-linear addition/subtraction of mechanistically distinct waves (e.g. wave reflection and distal suction) obfuscates the true contribution of upstream and downstream forces on measured waves (the 'smoke and mirrors' effect). The mechanisms underlying coronary WIA are more complex than previously thought and the impact of wave reflection should be considered when interpreting clinical and experimental data. Coronary arterial wave intensity analysis (WIA) is thought to provide clear insight into upstream and downstream forces on coronary flow, with a large early-diastolic surge in coronary flow accompanied by a prominent backward decompression wave (BDWdia), as well as a forward decompression wave (FDWdia) and forward compression wave (FCWdia). The BDWdia is believed to arise from distal suction due to release of extravascular compression by relaxing myocardium, while FDWdia and FCWdia are thought to be transmitted from the aorta into the coronary arteries. Based on an established multi-scale computational model and high-fidelity measurements from the proximal circumflex artery (Cx) of 18 anaesthetized sheep, we present evidence that wave reflection has a major impact on each of these three waves, with a non-linear addition/subtraction of reflected waves obscuring the true influence of upstream and downstream forces through concealment and exaggeration, i.e. a 'smoke and mirrors' effect. We also describe methods, requiring additional measurement of aortic WIA, for unravelling the separate influences of wave reflection versus active upstream/downstream forces on coronary waves. Distal wave reflection accounted for ∼70% of the BDWdia in sheep, but had a lesser influence (∼25%) in the computer model representing a hypertensive human. Negative reflection of the BDWdia at the coronary-aortic junction attenuated the Cx FDWdia (by ∼40% in sheep) and augmented Cx FCWdia (∼5-fold), relative to the corresponding aortic waves. We conclude that wave reflection has a major influence on early-diastolic WIA, and thus needs to be considered when interpreting coronary WIA profiles. © 2018 The Authors. The Journal of Physiology © 2018 The Physiological Society.
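For context, the standard single-point separation of net wave intensity into forward and backward components (the water-hammer relations) can be sketched as follows; the parameter values are illustrative, and this is not the authors' reflection-unravelling method.

```python
import numpy as np

def wave_intensity_separation(P, U, rho=1050.0, c=8.0):
    """Separate net wave intensity into forward (+) and backward (-) components.
    P: pressure samples (Pa), U: velocity samples (m/s), rho: blood density
    (kg/m^3), c: local wave speed (m/s), e.g. from the single-point method."""
    dP, dU = np.diff(P), np.diff(U)
    dI_plus  =  (dP + rho * c * dU) ** 2 / (4.0 * rho * c)
    dI_minus = -(dP - rho * c * dU) ** 2 / (4.0 * rho * c)
    return dI_plus, dI_minus      # their sum equals the net wave intensity dP*dU
```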
NASA Astrophysics Data System (ADS)
Li, Xiumin; Wang, Wei; Xue, Fangzheng; Song, Yongduan
2018-02-01
Recently there has been steadily increasing interest in building computational models of spiking neural networks (SNN), such as the Liquid State Machine (LSM). Biologically inspired self-organized neural networks with neural plasticity can enhance computational performance, with the characteristic features of dynamical memory and recurrent connection cycles which distinguish them from the more widely used feedforward neural networks. Although a variety of computational models for brain-like learning and information processing have been proposed, the modeling of self-organized neural networks with multiple forms of neural plasticity is still an important open challenge. The main difficulties lie in the interplay among different neural plasticity rules and in understanding how the structure and dynamics of neural networks shape computational performance. In this paper, we propose a novel approach to develop LSM models with a biologically inspired self-organizing network based on two neural plasticity learning rules. The connectivity among excitatory neurons is adapted by spike-timing-dependent plasticity (STDP) learning; meanwhile, the degrees of neuronal excitability are regulated to maintain a moderate average activity level by another learning rule: intrinsic plasticity (IP). Our study shows that LSM with STDP+IP performs better than LSM with a random SNN or an SNN obtained by STDP alone. The noticeable improvement with the proposed method is due to the better-reflected competition among different neurons in the developed SNN model, as well as the more effectively encoded and processed relevant dynamic information through its learning and self-organizing mechanism. This result gives insights into the optimization of computational models of spiking neural networks with neural plasticity.
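A minimal sketch of the two plasticity rules in generic pair-based form (the parameter values and the IP update below are illustrative assumptions, not the paper's exact rules):

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP: dt = t_post - t_pre (ms). Pre-before-post potentiates,
    post-before-pre depresses, with exponentially decaying windows."""
    return a_plus * np.exp(-dt / tau_plus) if dt >= 0 else -a_minus * np.exp(dt / tau_minus)

def intrinsic_plasticity(threshold, rate, target_rate=5.0, eta=1e-3):
    """Toy IP rule: nudge a neuron's firing threshold so that its average
    firing rate (Hz) moves toward a target, keeping activity moderate."""
    return threshold + eta * (rate - target_rate)

# Example: presynaptic spike precedes the postsynaptic spike by 5 ms.
w = 0.5 + stdp_dw(5.0)
thr = intrinsic_plasticity(threshold=1.0, rate=8.0)
print(w, thr)
```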
Choi, Bernard C K
2015-01-01
This article provides insights into the future based on a review of the past and present of public health surveillance-the ongoing systematic collection, analysis, interpretation, and dissemination of health data for the planning, implementation, and evaluation of public health action. Public health surveillance dates back to the first recorded epidemic in 3180 BC in Egypt. A number of lessons and items of interest are summarised from a review of historical perspectives in the past 5,000 years and the current practice of surveillance. Some future scenarios are presented: exploring new frontiers; enhancing computer technology; improving epidemic investigations; improving data collection, analysis, dissemination and use; building on lessons from the past; building capacity; and enhancing global surveillance. It is concluded that learning from the past, reflecting on the present, and planning for the future can further enhance public health surveillance.
XenoSite: accurately predicting CYP-mediated sites of metabolism with neural networks.
Zaretzki, Jed; Matlock, Matthew; Swamidass, S Joshua
2013-12-23
Understanding how xenobiotic molecules are metabolized is important because it influences the safety, efficacy, and dose of medicines and how they can be modified to improve these properties. The cytochrome P450s (CYPs) are proteins responsible for metabolizing 90% of drugs on the market, and many computational methods can predict which atomic sites of a molecule--sites of metabolism (SOMs)--are modified during CYP-mediated metabolism. This study improves on prior methods of predicting CYP-mediated SOMs by using new descriptors and machine learning based on neural networks. The new method, XenoSite, is faster to train and more accurate by as much as 4% or 5% for some isozymes. Furthermore, some "incorrect" predictions made by XenoSite were subsequently validated as correct predictions by re-evaluation of the source literature. Moreover, XenoSite output is interpretable as a probability, which reflects both the confidence of the model that a particular atom is metabolized and the statistical likelihood that its prediction for that atom is correct.
3D ultrasound computer tomography: Hardware setup, reconstruction methods and first clinical results
NASA Astrophysics Data System (ADS)
Gemmeke, Hartmut; Hopp, Torsten; Zapf, Michael; Kaiser, Clemens; Ruiter, Nicole V.
2017-11-01
A promising candidate for improved imaging of breast cancer is ultrasound computer tomography (USCT). Current experimental USCT systems are still focused in the elevation dimension, resulting in a large slice thickness, limited depth of field, loss of out-of-plane reflections, and a large number of movement steps to acquire a stack of images. 3D USCT emitting and receiving spherical wave fronts overcomes these limitations. We built an optimized 3D USCT, realizing for the first time the full benefits of a 3D system. The point spread function could be shown to be nearly isotropic in 3D, to have very low spatial variability, and to fit the predicted values. The contrast of the phantom images is very satisfactory in spite of imaging with a sparse aperture. The resolution and imaged details of the reflectivity reconstruction are comparable to a 3 T MRI volume. Important for the obtained resolution are the simultaneously obtained results of the transmission tomography. The KIT 3D USCT was then tested in a pilot study on ten patients. The primary goals of the pilot study were to test the USCT device, the data acquisition protocols, the image reconstruction methods and the image fusion techniques in a clinical environment. The study was conducted successfully; the data acquisition could be carried out for all patients with an average imaging time of six minutes per breast. The reconstructions provide promising images. Overlaid volumes of the modalities show qualitative and quantitative information at a glance. This paper gives a summary of the involved techniques, methods, and first results.
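As a generic illustration of reflectivity imaging with unfocused spherical-wave data (a delay-and-sum / synthetic-aperture sum over emitter-receiver pairs, not the KIT reconstruction code), the core loop looks like:

```python
import numpy as np

def delay_and_sum(a_scans, emitters, receivers, voxels, c=1500.0, fs=10e6):
    """Synthetic-aperture reflectivity image: for every voxel, sum each
    emitter-receiver A-scan at the sample matching the two-way time of flight.
    a_scans: array (n_emitters, n_receivers, n_samples); positions in metres."""
    image = np.zeros(len(voxels))
    for i, e in enumerate(emitters):
        for j, r in enumerate(receivers):
            trace = a_scans[i, j]
            tof = (np.linalg.norm(voxels - e, axis=1) +
                   np.linalg.norm(voxels - r, axis=1)) / c      # seconds
            idx = np.clip((tof * fs).astype(int), 0, trace.size - 1)
            image += trace[idx]
    return image
```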
Bidirectional Reflectance Functions for Application to Earth Radiation Budget Studies
NASA Technical Reports Server (NTRS)
Manalo-Smith, N.; Tiwari, S. N.; Smith, G. L.
1997-01-01
Reflected solar radiative fluxes emerging from the top of the Earth's atmosphere are inferred from satellite broadband radiance measurements by applying bidirectional reflectance functions (BDRFs) to account for the anisotropy of the radiation field. BDRFs depend on the viewing geometry (i.e., solar zenith angle, view zenith angle, and relative azimuth angle), the amount and type of cloud cover, the condition of the intervening atmosphere, and the reflectance characteristics of the underlying surface. A set of operational Earth Radiation Budget Experiment (ERBE) BDRFs is available which was developed from the Nimbus 7 ERB (Earth Radiation Budget) scanner data for a three-angle grid system. An improved set of bidirectional reflectance models is required for mission planning and data analysis of future earth radiation budget instruments, such as the Clouds and Earth's Radiant Energy System (CERES), and for the enhancement of existing radiation budget data products. This study presents an analytic expression for BDRFs formulated by applying a fit to the ERBE operational model tabulations. A set of model coefficients applicable to any viewing condition is computed for overcast and clear-sky scenes over four geographical surface types: ocean, land, snow, and desert, and for partly cloudy scenes over ocean and land. The models are smooth in terms of the directional angles and adhere to the principle of reciprocity, i.e., they are invariant with respect to the interchange of the incoming and outgoing directional angles. The analytic BDRFs and the radiance standard deviations are compared with the operational ERBE models and validated with ERBE data. The clear ocean model is validated with Dlhopolsky's clear ocean model; Dlhopolsky developed a BDRF of higher angular resolution for clear sky ocean from ERBE radiances. Additionally, the effectiveness of the models in accounting for anisotropy over various viewing directions is tested with the ERBE along-track data. An area viewed from nadir and from the side gives two different radiance measurements but should yield the same flux when converted by the BDRF. The analytic BDRFs are in very good qualitative agreement with the ERBE models. The overcast scenes exhibit constant retrieved albedo over viewing zenith angles for solar zenith angles less than 60 degrees. The clear ocean model does not produce constant retrieved albedo over viewing zenith angles but gives an improvement over the ERBE operational clear sky ocean BDRF.
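For context, the radiance-to-flux inversion that such anisotropic-factor models support can be sketched as follows (generic ERBE-style usage with invented numbers, not the authors' code): two views of the same area should yield approximately the same flux once each radiance is divided by its anisotropic factor.

```python
import numpy as np

def radiance_to_flux(radiance, anisotropic_factor):
    """Convert a broadband shortwave radiance (W m^-2 sr^-1) observed in one
    direction into a TOA flux (W m^-2) using the scene's BDRF-derived
    anisotropic factor R; R = 1 corresponds to a Lambertian (isotropic) scene."""
    return np.pi * radiance / anisotropic_factor

# Nadir and oblique views of the same area (invented values): similar fluxes.
print(radiance_to_flux(95.0, 0.97), radiance_to_flux(120.0, 1.22))
```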
Genomic Prediction Accounting for Residual Heteroskedasticity.
Ou, Zhining; Tempelman, Robert J; Steibel, Juan P; Ernst, Catherine W; Bates, Ronald O; Bello, Nora M
2015-11-12
Whole-genome prediction (WGP) models that use single-nucleotide polymorphism marker information to predict genetic merit of animals and plants typically assume homogeneous residual variance. However, variability is often heterogeneous across agricultural production systems and may subsequently bias WGP-based inferences. This study extends classical WGP models based on normality, heavy-tailed specifications and variable selection to explicitly account for environmentally-driven residual heteroskedasticity under a hierarchical Bayesian mixed-models framework. WGP models assuming homogeneous or heterogeneous residual variances were fitted to training data generated under simulation scenarios reflecting a gradient of increasing heteroskedasticity. Model fit was based on pseudo-Bayes factors and also on prediction accuracy of genomic breeding values computed on a validation data subset one generation removed from the simulated training dataset. Homogeneous vs. heterogeneous residual variance WGP models were also fitted to two quantitative traits, namely 45-min postmortem carcass temperature and loin muscle pH, recorded in a swine resource population dataset prescreened for high and mild residual heteroskedasticity, respectively. Fit of competing WGP models was compared using pseudo-Bayes factors. Predictive ability, defined as the correlation between predicted and observed phenotypes in validation sets of a five-fold cross-validation was also computed. Heteroskedastic error WGP models showed improved model fit and enhanced prediction accuracy compared to homoskedastic error WGP models although the magnitude of the improvement was small (less than two percentage points net gain in prediction accuracy). Nevertheless, accounting for residual heteroskedasticity did improve accuracy of selection, especially on individuals of extreme genetic merit. Copyright © 2016 Ou et al.
Integrated Emissivity And Temperature Measurement
Poulsen, Peter
2005-11-08
A multi-channel spectrometer and a light source are used to measure both the emitted and the reflected light from a surface which is at an elevated temperature relative to its environment. In a first method, the temperature of the surface and emissivity in each wavelength is calculated from a knowledge of the spectrum and the measurement of the incident and reflected light. In the second method, the reflected light is measured from a reference surface having a known reflectivity and the same geometry as the surface of interest and the emitted and the reflected light are measured for the surface of interest. These measurements permit the computation of the emissivity in each channel of the spectrometer and the temperature of the surface of interest.
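A minimal sketch of the first method under simplifying assumptions (opaque surface, so emissivity = 1 - reflectivity in each channel, and a single-channel inversion of Planck's law; constants in SI units). The numbers are toy values, and the actual instrument processing may differ.

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann (SI)

def spectral_emissivity(reflected, incident):
    """Opaque surface: emissivity = 1 - reflectivity in each spectrometer channel."""
    return 1.0 - np.asarray(reflected, float) / np.asarray(incident, float)

def temperature_from_channel(wavelength, emitted_radiance, emissivity):
    """Invert Planck's law in one channel: wavelength in m, emitted spectral
    radiance in W m^-3 sr^-1 (per metre of wavelength)."""
    c1 = 2.0 * H * C**2 / wavelength**5
    c2 = H * C / (wavelength * KB)
    return c2 / np.log(1.0 + emissivity * c1 / emitted_radiance)

eps = spectral_emissivity(reflected=[0.02], incident=[0.10])     # toy numbers
print(eps, temperature_from_channel(1.0e-6, 2.0e9, eps[0]))
```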
Brian Hears: Online Auditory Processing Using Vectorization Over Channels
Fontaine, Bertrand; Goodman, Dan F. M.; Benichoux, Victor; Brette, Romain
2011-01-01
The human cochlea includes about 3000 inner hair cells which filter sounds at frequencies between 20 Hz and 20 kHz. This massively parallel frequency analysis is reflected in models of auditory processing, which are often based on banks of filters. However, existing implementations do not exploit this parallelism. Here we propose algorithms to simulate these models by vectorizing computation over frequency channels, which are implemented in “Brian Hears,” a library for the spiking neural network simulator package “Brian.” This approach allows us to use high-level programming languages such as Python, because with vectorized operations, the computational cost of interpretation represents a small fraction of the total cost. This makes it possible to define and simulate complex models in a simple way, while all previous implementations were model-specific. In addition, we show that these algorithms can be naturally parallelized using graphics processing units, yielding substantial speed improvements. We demonstrate these algorithms with several state-of-the-art cochlear models, and show that they compare favorably with existing, less flexible, implementations. PMID:21811453
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potash, Peter J.; Bell, Eric B.; Harrison, Joshua J.
Predictive models for tweet deletion have been a relatively unexplored area of Twitter-related computational research. We first approach the deletion of tweets as a spam detection problem, applying a small set of handcrafted features to improve upon the current state-of-the-art in predicting deleted tweets. Next, we apply our approach to a dataset of deleted tweets that better reflects the current deletion rate. Since tweets are deleted for reasons beyond just the presence of spam, we apply topic modeling and text embeddings in order to capture the semantic content of tweets that can lead to tweet deletion. Our goal is to create an effective model that has a low-dimensional feature space and is also language-independent. A lean model would be computationally advantageous for processing high volumes of Twitter data, which can reach 9,885 tweets per second. Our results show that a small set of spam-related features combined with word topics and character-level text embeddings provide the best F1 score when trained with a random forest model. The highest precision of the deleted tweet class is achieved by a modification of paragraph2vec to capture author identity.
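A minimal sketch of this modelling setup with invented feature names, dimensions, and random placeholder data (not the study's dataset or exact feature set): handcrafted spam-style features are concatenated with topic proportions and character-level embeddings, then a random forest is fitted.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def build_features(spam_feats, topic_vecs, char_embeds):
    """Concatenate the three feature groups into one low-dimensional matrix."""
    return np.hstack([spam_feats, topic_vecs, char_embeds])

rng = np.random.default_rng(0)
X = build_features(rng.random((1000, 5)),      # e.g. URL count, mention ratio, ...
                   rng.random((1000, 20)),     # word-topic proportions
                   rng.random((1000, 50)))     # character-level text embedding
y = rng.integers(0, 2, 1000)                   # 1 = tweet was later deleted

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict_proba(X[:3]))
```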
Exploring differences between left and right hand motor imagery via spatio-temporal EEG microstate.
Liu, Weifeng; Liu, Xiaoming; Dai, Ruomeng; Tang, Xiaoying
2017-12-01
EEG-based motor imagery is very useful in brain-computer interfaces. How to identify the imagined movement is still an open research question. Electroencephalography (EEG) microstates reflect the spatial configuration of quasi-stable electrical potential topographies, and different microstates represent different brain functions. In this paper, the microstate method was used to process EEG-based motor imagery data. Differences in single-trial EEG microstate sequences between two motor imagery tasks - imagination of left- and right-hand movement - were investigated. The microstate parameters - duration, time coverage and occurrence per second - as well as the transition probabilities of the microstate sequences were obtained with spatio-temporal microstate analysis. The results showed significant differences (P < 0.05) between the two tasks with a paired t-test. These microstate parameters were then used as features, and a linear support vector machine (SVM) was utilized to classify the two tasks with a mean accuracy of 89.17%, a superior performance compared to the other methods. These results indicate that microstates can be promising features for improving the performance of brain-computer interface classification.
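A minimal sketch of this analysis pipeline on placeholder data (feature dimensions and values are invented): a paired t-test per microstate parameter, followed by cross-validated linear-SVM classification.

```python
import numpy as np
from scipy.stats import ttest_rel
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical per-trial microstate features (duration, time coverage,
# occurrence/s, transition probabilities) for left- vs right-hand imagery.
rng = np.random.default_rng(1)
left, right = rng.random((40, 12)), rng.random((40, 12)) + 0.05

t, p = ttest_rel(left, right, axis=0)          # paired t-test per parameter
X = np.vstack([left, right])
y = np.r_[np.zeros(40), np.ones(40)]
acc = cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean()
print(p.round(3), acc)
```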
Simulating compressible-incompressible two-phase flows
NASA Astrophysics Data System (ADS)
Denner, Fabian; van Wachem, Berend
2017-11-01
Simulating compressible gas-liquid flows, e.g. air-water flows, presents considerable numerical issues and requires substantial computational resources, particularly because of the stiff equation of state for the liquid and the different Mach number regimes. Treating the liquid phase (low Mach number) as incompressible, yet concurrently considering the gas phase (high Mach number) as compressible, can improve the computational performance of such simulations significantly without sacrificing important physical mechanisms. A pressure-based algorithm for the simulation of two-phase flows is presented, in which a compressible and an incompressible fluid are separated by a sharp interface. The algorithm is based on a coupled finite-volume framework, discretised in conservative form, with a compressive VOF method to represent the interface. The bulk phases are coupled via a novel acoustically-conservative interface discretisation method that retains the acoustic properties of the compressible phase and does not require a Riemann solver. Representative test cases are presented to scrutinize the proposed algorithm, including the reflection of acoustic waves at the compressible-incompressible interface, shock-drop interaction and gas-liquid flows with surface tension. Financial support from the EPSRC (Grant EP/M021556/1) is gratefully acknowledged.
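One of the named test cases has a simple analytical reference: the normal-incidence pressure reflection coefficient at the interface between two acoustic media. A minimal sketch of this standard textbook relation (not the authors' solver) is useful when checking how a two-phase scheme reflects acoustic waves at the compressible-incompressible interface.

```python
def interface_reflection_coefficient(rho1, c1, rho2, c2):
    """Pressure reflection coefficient for a plane acoustic wave at normal
    incidence, R = (Z2 - Z1) / (Z2 + Z1), with impedance Z = rho * c."""
    z1, z2 = rho1 * c1, rho2 * c2
    return (z2 - z1) / (z2 + z1)

# Air -> water at standard conditions: nearly total reflection (R close to 1).
print(interface_reflection_coefficient(1.2, 340.0, 1000.0, 1480.0))
```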
East-West paths to unconventional computing.
Adamatzky, Andrew; Akl, Selim; Burgin, Mark; Calude, Cristian S; Costa, José Félix; Dehshibi, Mohammad Mahdi; Gunji, Yukio-Pegio; Konkoli, Zoran; MacLennan, Bruce; Marchal, Bruno; Margenstern, Maurice; Martínez, Genaro J; Mayne, Richard; Morita, Kenichi; Schumann, Andrew; Sergeyev, Yaroslav D; Sirakoulis, Georgios Ch; Stepney, Susan; Svozil, Karl; Zenil, Hector
2017-12-01
Unconventional computing is about breaking boundaries in thinking, acting and computing. Typical topics of this non-typical field include, but are not limited to, physics of computation, non-classical logics, new complexity measures, novel hardware, and mechanical, chemical and quantum computing. Unconventional computing encourages a new style of thinking, while practical applications are obtained from uncovering and exploiting principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems; in particular, efficient algorithms are developed, (almost) optimal architectures are designed and working prototypes of future computing devices are manufactured. This article includes idiosyncratic accounts of 'unconventional computing' scientists reflecting on their personal experiences, what attracted them to the field, their inspirations and discoveries. Copyright © 2017 Elsevier Ltd. All rights reserved.
Rotation, Reflection, and Frame Changes; Orthogonal tensors in computational engineering mechanics
NASA Astrophysics Data System (ADS)
Brannon, R. M.
2018-04-01
Whilst a vast literature is available for the most common rotation-related tasks such as coordinate changes, most reference books tend to cover one or two methods, and resources for less-common tasks are scarce. Specialized research applications can be found in disparate journal articles, but a self-contained comprehensive review that covers both elementary and advanced concepts in a manner comprehensible to engineers is rare. Rotation, Reflection, and Frame Changes surveys a refreshingly broad range of rotation-related research that is routinely needed in engineering practice. By illustrating key concepts in computer source code, this book stands out as an unusually accessible guide for engineers and scientists in engineering mechanics.
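Two of the elementary constructions covered by such treatments can be sketched directly (a generic illustration, not the book's source code): a proper-orthogonal rotation tensor from an axis-angle pair, and an improper-orthogonal reflection tensor across a plane.

```python
import numpy as np

def rotation_from_axis_angle(axis, angle):
    """Proper-orthogonal rotation tensor via the Euler-Rodrigues formula."""
    a = np.asarray(axis, float)
    a = a / np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]],
                  [a[2], 0, -a[0]],
                  [-a[1], a[0], 0]])          # skew-symmetric axial tensor
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def reflection_across_plane(normal):
    """Improper-orthogonal (Householder) reflection across the plane with unit normal n."""
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    return np.eye(3) - 2.0 * np.outer(n, n)

R = rotation_from_axis_angle([0, 0, 1], np.pi / 2)
H = reflection_across_plane([1, 0, 0])
print(np.linalg.det(R), np.linalg.det(H))    # +1 (rotation) and -1 (reflection)
```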