Neto, Jose Osni Bruggemann; Gesser, Rafael Lehmkuhl; Steglich, Valdir; Bonilauri Ferreira, Ana Paula; Gandhi, Mihir; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo
2013-01-01
Background The validation of widely used scales facilitates comparison across international patient samples. Objective The objective of this study was to translate, culturally adapt, and validate the Simple Shoulder Test into Brazilian Portuguese, and to test the stability of its factor structure across cultures. Methods The Simple Shoulder Test was translated from English into Brazilian Portuguese, translated back into English, and evaluated for accuracy by an expert committee. It was then administered to 100 patients with shoulder conditions. Psychometric properties were analyzed, including factor analysis, internal reliability, test-retest reliability at seven days, and construct validity in relation to the Short Form 36 health survey (SF-36). Results Factor analysis demonstrated a three-factor solution. Cronbach's alpha was 0.82. Test-retest reliability, as measured by the intra-class correlation coefficient (ICC), was 0.84. Associations with all subscales of the SF-36 questionnaire were observed in the hypothesized direction. Conclusion The Brazilian-Portuguese translation and cultural adaptation of the Simple Shoulder Test demonstrated adequate factor structure, internal reliability, and validity, ultimately allowing its use in comparisons with international patient samples. PMID:23675436
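As an illustrative aside, the internal-consistency statistic reported above (Cronbach's alpha) is straightforward to compute. The sketch below uses synthetic yes/no item scores, not the study's data; the 12-item shape merely mirrors the Simple Shoulder Test's format.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Hypothetical binary responses: 100 respondents, 12 items sharing one
# latent "shoulder function" factor (entirely made-up data).
rng = np.random.default_rng(0)
ability = rng.normal(size=(100, 1))
scores = (ability + rng.normal(scale=1.0, size=(100, 12)) > 0).astype(float)
alpha = cronbach_alpha(scores)
```

With correlated items like these, alpha lands in the "good" range the abstracts describe; uncorrelated items would drive it toward zero.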
Translation, Cultural Adaptation and Validation of the Simple Shoulder Test to Spanish
Arcuri, Francisco; Barclay, Fernando; Nacul, Ivan
2015-01-01
Background: The validation of widely used scales facilitates comparison across international patient samples. Objective: The objective was to translate, culturally adapt, and validate the Simple Shoulder Test into Argentinian Spanish. Methods: The Simple Shoulder Test was translated from English into Argentinian Spanish by two independent translators, translated back into English, and evaluated for accuracy by an expert committee to resolve any discrepancies. It was then administered to 50 patients with different shoulder conditions. Psychometric properties were analyzed, including internal consistency, measured with Cronbach's alpha, and test-retest reliability at 15 days, measured with the intra-class correlation coefficient. Results: Internal consistency was good, with a Cronbach's alpha of 0.808. The test-retest reliability index, as measured by the intra-class correlation coefficient (ICC), was 0.835, evaluated as excellent. Conclusion: The Simple Shoulder Test translation and its cultural adaptation to Argentinian Spanish demonstrated adequate internal reliability and validity, ultimately allowing its use in comparisons with international patient samples.
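The test-retest index used in these validation studies, the intra-class correlation coefficient, can likewise be sketched in a few lines. This assumes the common two-way random-effects, absolute-agreement, single-measurement form ICC(2,1); the scores below are simulated, not the study's.

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rating,
    for an (n_subjects, n_sessions) score matrix."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    col_means = scores.mean(axis=0)
    ms_r = k * ((row_means - grand) ** 2).sum() / (n - 1)   # between subjects
    ms_c = n * ((col_means - grand) ** 2).sum() / (k - 1)   # between sessions
    sse = ((scores - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical test-retest data: 50 patients scored twice, 15 days apart.
rng = np.random.default_rng(1)
true = rng.normal(10, 2, size=(50, 1))
test_retest = true + rng.normal(0, 0.8, size=(50, 2))
icc = icc_2_1(test_retest)
```

Here the simulated measurement error is small relative to between-patient spread, so the ICC comes out high, mirroring the "excellent" reliability reported.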
Majaj, Najib J.; Hong, Ha; Solomon, Ethan A.; DiCarlo, James J.
2015-01-01
To go beyond qualitative models of the biological substrate of object recognition, we ask: can a single ventral stream neuronal linking hypothesis quantitatively account for core object recognition performance over a broad range of tasks? We measured human performance in 64 object recognition tests using thousands of challenging images that explore shape similarity and identity preserving object variation. We then used multielectrode arrays to measure neuronal population responses to those same images in visual areas V4 and inferior temporal (IT) cortex of monkeys and simulated V1 population responses. We tested leading candidate linking hypotheses and control hypotheses, each postulating how ventral stream neuronal responses underlie object recognition behavior. Specifically, for each hypothesis, we computed the predicted performance on the 64 tests and compared it with the measured pattern of human performance. All tested hypotheses based on low- and mid-level visually evoked activity (pixels, V1, and V4) were very poor predictors of the human behavioral pattern. However, simple learned weighted sums of distributed average IT firing rates exactly predicted the behavioral pattern. More elaborate linking hypotheses relying on IT trial-by-trial correlational structure, finer IT temporal codes, or ones that strictly respect the known spatial substructures of IT (“face patches”) did not improve predictive power. Although these results do not reject those more elaborate hypotheses, they suggest a simple, sufficient quantitative model: each object recognition task is learned from the spatially distributed mean firing rates (100 ms) of ∼60,000 IT neurons and is executed as a simple weighted sum of those firing rates. SIGNIFICANCE STATEMENT We sought to go beyond qualitative models of visual object recognition and determine whether a single neuronal linking hypothesis can quantitatively account for core object recognition behavior. 
To achieve this, we designed a database of images for evaluating object recognition performance. We used multielectrode arrays to characterize hundreds of neurons in the visual ventral stream of nonhuman primates and measured the object recognition performance of >100 human observers. Remarkably, we found that simple learned weighted sums of firing rates of neurons in monkey inferior temporal (IT) cortex accurately predicted human performance. Although previous work led us to expect that IT would outperform V4, we were surprised by the quantitative precision with which simple IT-based linking hypotheses accounted for human behavior. PMID:26424887
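The paper's core linking hypothesis, a learned weighted sum of distributed firing rates, amounts to a regularized linear readout. A minimal sketch on synthetic "rates" (not real IT recordings; the population size, ridge penalty, and class structure are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n_images, n_neurons = 200, 500

# Synthetic "mean firing rates": two object classes offset along a random
# population axis, plus per-neuron noise (entirely made-up data).
labels = rng.integers(0, 2, size=n_images)
axis = rng.normal(size=n_neurons)
rates = rng.normal(size=(n_images, n_neurons)) + np.outer(labels - 0.5, axis)

# Learn a weighted sum: ridge-regularized least squares on half the images.
train = np.arange(n_images) < n_images // 2
X, y = rates[train], labels[train] * 2.0 - 1.0
w = np.linalg.solve(X.T @ X + 10.0 * np.eye(n_neurons), X.T @ y)

# Execute the task on held-out images as a simple weighted sum of rates.
pred = (rates[~train] @ w > 0).astype(int)
accuracy = (pred == labels[~train]).mean()
```

The point of the sketch is the architecture, one learned weight per neuron and a thresholded dot product, not the synthetic numbers.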
Gonzalez-Neira, Eliana Maria; Jimenez-Mendoza, Claudia Patricia; Suarez, Daniel R; Rugeles-Quintero, Saul
2016-01-01
Objective: This study aims to determine whether a collection of 16 motor tests on a physical simulator can objectively discriminate and evaluate practitioners' competency level, i.e., novice, resident, and expert. Methods: An experimental design with three study groups (novice, resident, and expert) was developed to test the discriminative power of each of the 16 simple tests. An ANOVA and a Student-Newman-Keuls (SNK) test were used to analyze the results of each test and determine which of them can discriminate participants' competency level. Results: Four of the 16 tests discriminated all three competency levels, and 15 discriminated at least two of the three groups (α = 0.05). Moreover, two other tests differentiated the novice level from the resident level, and seven other tests differentiated the resident level from the expert level. Conclusion: The competency level of a practitioner of minimally invasive surgery can be evaluated by a specific collection of basic tests in a physical surgical simulator. Reducing the number of tests needed to discriminate surgeons' competency level can be the aim of future research. PMID:27226664
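The analysis pipeline described, a one-way ANOVA followed by post-hoc comparisons per test, can be sketched as follows. The completion times are invented, and Bonferroni-corrected pairwise t-tests stand in for the SNK procedure, which has no standard SciPy implementation:

```python
from itertools import combinations
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical task-completion times (s) for one simulator test (made-up data).
groups = {
    "novice":   rng.normal(120, 15, 20),
    "resident": rng.normal(95, 15, 20),
    "expert":   rng.normal(70, 15, 20),
}

# One-way ANOVA across the three competency groups.
f_stat, p_value = stats.f_oneway(*groups.values())

# Pairwise comparisons, Bonferroni-corrected (a stand-in for SNK).
pairwise_p = {}
for a, b in combinations(groups, 2):
    _, p = stats.ttest_ind(groups[a], groups[b])
    pairwise_p[(a, b)] = min(p * 3, 1.0)

# A test "discriminates all three levels" if every comparison is significant.
discriminates_all = p_value < 0.05 and all(p < 0.05 for p in pairwise_p.values())
```

With the large simulated group differences, this test discriminates all three levels, as 4 of the study's 16 tests did.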
Visual conspicuity: a new simple standard, its reliability, validity and applicability.
Wertheim, A H
2010-03-01
A general standard for quantifying conspicuity is described. It derives from a simple and easy method to quantitatively measure the visual conspicuity of an object. The method stems from the theoretical view that the conspicuity of an object is not a property of that object, but describes the degree to which the object is perceptually embedded in, i.e. laterally masked by, its visual environment. First, three variations of a simple method to measure the strength of such lateral masking are described, and empirical evidence for its reliability and validity is presented, as are several tests of predictions concerning the effects of viewing distance and ambient light. It is then shown how this method yields a conspicuity standard, expressed as a number, which can be made part of a rule of law, and which can be used to test whether or not, and to what extent, the conspicuity of a particular object, e.g. a traffic sign, meets a predetermined criterion. An additional feature is that, when used under different ambient light conditions, the method may also yield an index of the amount of visual clutter in the environment. Taken together, the evidence illustrates the method's applicability both in the laboratory and in real-life situations. STATEMENT OF RELEVANCE: This paper concerns a proposal for a new method to measure visual conspicuity, yielding a numerical index that can be used in a rule of law. It is of importance to ergonomists and human factors specialists who are asked to measure the conspicuity of an object, such as a traffic or railroad sign, or any other object. The new method is simple and circumvents the need to perform elaborate (search) experiments, and thus has great relevance as a simple tool for applied research.
Coins and Costs: A Simple and Rapid Assessment of Basic Financial Knowledge
ERIC Educational Resources Information Center
Willner, Paul; Bailey, Rebecca; Dymond, Simon; Parry, Rhonwen
2011-01-01
Introduction: We describe a simple and rapid screening test for basic financial knowledge that is suitable for administration to people with mild intellectual disabilities. Method: The Coins and Costs test asks respondents to name coins, and to estimate prices of objects ranging between 1 British Pound (an ice cream) and 100K British Pounds (a…
Object permanence in orangutans (Pongo pygmaeus) and squirrel monkeys (Saimiri sciureus).
de Blois, S T; Novak, M A; Bond, M
1998-06-01
The authors tested orangutans (Pongo pygmaeus) and squirrel monkeys (Saimiri sciureus) on object permanence tasks. In Experiment 1, orangutans solved all visible displacements and most invisible displacements except those involving movements into 2 boxes successively. In Experiment 2, performance of orangutans on double invisible displacements and control displacements (assessing simple strategies) was compared. Orangutans did not use the simple strategy of selecting the box visited last by the experimenter. Instead, poorer performance on double invisible displacements may have been related to increased memory requirements. In Experiment 3, squirrel monkeys were tested using the procedure of Experiment 1. Squirrel monkeys solved visible but did not comprehend invisible displacements. Results suggest that orangutans but not squirrel monkeys possess Stage 6 object permanence capabilities.
NASA Astrophysics Data System (ADS)
Youn, Younghan; Koo, Jeong-Seo
The complete evaluation of a side vehicle structure and its occupant protection is only possible by means of a full-scale side impact crash test. However, auto-part manufacturers, such as door trim makers, cannot conduct this test, especially while the vehicle is still under development. The main objective of this study is to obtain design guidelines from a simple component-level impact test. The relationship between the target absorption energy and the impactor speed was examined using the energy absorbed by the door trim, since each vehicle type requires a different energy level on the door trim. A simple impact test method was developed to estimate abdominal injury by measuring the reaction force of the impactor; the reaction force is converted to an energy level by the proposed formula. The target absorption energy for the door trim alone and the impact speed of the simple impactor are derived theoretically from the conservation of energy. With the calculated dummy speed and the effective mass of the abdomen, the energy allocated to the abdomen area of the door trim was calculated. The impactor speed can then be calculated from the equivalent energy absorbed by the door trim during the full crash test. The proposed door-trim design procedure using the simple impact test method was demonstrated by evaluating abdominal injury. This paper also describes a study conducted to determine the sensitivity of several design factors for reducing abdominal injury values, using an orthogonal-array matrix. In conclusion, with theoretical considerations and empirical test data, the main objective, standardization of door trim design using the simple impact test method, was established.
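The central conversion, choosing an impactor speed whose kinetic energy equals the door trim's target absorption energy, follows directly from E = ½mv². A sketch with invented numbers (the paper's actual masses and energy shares are not given here):

```python
import math

def impactor_speed(target_energy_j: float, impactor_mass_kg: float) -> float:
    """Speed at which the impactor's kinetic energy equals the door-trim
    target absorption energy: E = 1/2 * m * v**2  =>  v = sqrt(2E/m)."""
    return math.sqrt(2.0 * target_energy_j / impactor_mass_kg)

# Hypothetical values (not from the paper): effective abdomen mass and the
# share of full-crash energy allocated to the door trim.
effective_mass = 12.0   # kg
target_energy = 150.0   # J
v = impactor_speed(target_energy, effective_mass)  # -> 5.0 m/s
```

The same relation run in reverse (measure reaction force, integrate to energy, compare with the target) is what the component-level test exploits.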
NASA Astrophysics Data System (ADS)
Mundhenk, T. Nathan; Ni, Kang-Yu; Chen, Yang; Kim, Kyungnam; Owechko, Yuri
2012-01-01
An aerial multiple-camera tracking paradigm must not only spot and track unknown targets, but also handle target reacquisition as well as target handoff to other cameras in the operating theater. Here we discuss such a system, designed to spot unknown targets, track them, segment the useful features, and then create a signature fingerprint for the object so that it can be reacquired or handed off to another camera. The tracking system spots unknown objects by subtracting background motion from observed motion, allowing it to find targets in motion even if the camera platform itself is moving. The area of motion is then matched to segmented regions returned by the EDISON mean-shift segmentation tool. Whole segments which have common motion and which are contiguous to each other are grouped into a master object. Once master objects are formed, we have a tight bound within which to extract features for the purpose of forming a fingerprint. This is done using color and simple entropy features, which can be placed into a myriad of different fingerprints. To keep data transmission and storage size low for camera handoff of targets, we try several simple techniques: a histogram, a spatiogram, and a single Gaussian model. These are tested by simulating a very large number of target losses in six videos, over an interval of 1000 frames each, from the DARPA VIVID video set. Since the fingerprints are very simple, they are not expected to be valid for long periods of time. As such, we test the shelf life of fingerprints: how long a fingerprint remains good when stored away between target appearances. Shelf life gives us a second metric of goodness and tells us whether a fingerprint method has better accuracy over longer periods.
In videos which contain multiple vehicle occlusions and vehicles of highly similar appearance we obtain a reacquisition rate for automobiles of over 80% using the simple single Gaussian model compared with the null hypothesis of <20%. Additionally, the performance for fingerprints stays well above the null hypothesis for as much as 800 frames. Thus, a simple and highly compact single Gaussian model is useful for target reacquisition. Since the model is agnostic to view point and object size, it is expected to perform as well on a test of target handoff. Since some of the performance degradation is due to problems with the initial target acquisition and tracking, the simple Gaussian model may perform even better with an improved initial acquisition technique. Also, since the model makes no assumption about the object to be tracked, it should be possible to use it to fingerprint a multitude of objects, not just cars. Further accuracy may be obtained by creating manifolds of objects from multiple samples.
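A single-Gaussian color fingerprint of the kind described, reduced here to a per-channel mean and variance compared with a Bhattacharyya distance, can be sketched as follows. The pixel patches are synthetic, and the distance metric is one plausible choice, not necessarily the authors':

```python
import numpy as np

def gaussian_fingerprint(pixels: np.ndarray):
    """Single-Gaussian color fingerprint of an (n_pixels, 3) RGB patch:
    per-channel mean and variance -- just 6 numbers per target."""
    return pixels.mean(axis=0), pixels.var(axis=0) + 1e-6

def bhattacharyya(fp_a, fp_b) -> float:
    """Bhattacharyya distance between diagonal Gaussians (lower = more similar)."""
    (ma, va), (mb, vb) = fp_a, fp_b
    v = (va + vb) / 2.0
    return float(0.125 * ((ma - mb) ** 2 / v).sum()
                 + 0.5 * np.log(v / np.sqrt(va * vb)).sum())

rng = np.random.default_rng(4)
red_car = rng.normal([200, 40, 40], 10, size=(400, 3))   # stored target
blue_car = rng.normal([40, 40, 200], 10, size=(400, 3))  # distractor
stored = gaussian_fingerprint(red_car)

# Reacquisition: the red car, seen again later, matches its stored
# fingerprint better than the distractor does.
red_again = rng.normal([200, 40, 40], 10, size=(400, 3))
d_red = bhattacharyya(stored, gaussian_fingerprint(red_again))
d_blue = bhattacharyya(stored, gaussian_fingerprint(blue_car))
```

The compactness is the draw: six numbers survive a camera handoff far more cheaply than a full appearance template.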
[Fundamental biological model for trials of wound ballistics].
Krajsa, J; Hirt, M
2006-10-01
The aim of our experiment was to test the effects of common ammunition on usable and readily accessible biological tissue, and thereby to create a fundamental, simple biological model for wound ballistics trials. The target tissue selected was biological material: pork and beef hind limbs, a pork head, and a pork body cavity. It was found that the target tissue reproduced the full spectrum of results for the individual shot types, from a simple smooth penetration wound to a splintered fracture, depending on the kind of ammunition used. The pork hind limb was evaluated as the most suitable biological material for the given purpose.
A Novel Optical/digital Processing System for Pattern Recognition
NASA Technical Reports Server (NTRS)
Boone, Bradley G.; Shukla, Oodaye B.
1993-01-01
This paper describes two processing algorithms that can be implemented optically: the Radon transform and angular correlation. These two algorithms can be combined in one optical processor to extract all the basic geometric and amplitude features from objects embedded in video imagery. We show that the internal amplitude structure of objects is recovered by the Radon transform, which is a well-known result; in addition, we show simulation results for angular correlation, a simple but unique algorithm that extracts object boundaries from suitably thresholded images, from which length, width, area, aspect ratio, and orientation can be derived. In addition to circumventing scale and rotation distortions, these simulations indicate that the features derived from the angular correlation algorithm are relatively insensitive to tracking shifts and image noise. Some optical architecture concepts, including one based on micro-optical lenslet arrays, have been developed to implement these algorithms. Simulation test and evaluation using simple synthetic object data are described, including results of a study that uses object boundaries (derivable from angular correlation) to classify simple objects using a neural network.
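A digital stand-in for one step, a single Radon projection by rotate-and-sum, illustrates how projections recover an object's extent along each direction. The nearest-neighbor rotation below is purely illustrative, not the optical implementation:

```python
import numpy as np

def radon_projection(image: np.ndarray, angle_deg: float) -> np.ndarray:
    """One Radon projection: rotate the image about its center
    (nearest-neighbor sampling), then integrate along columns."""
    n = image.shape[0]
    c = (n - 1) / 2.0
    theta = np.deg2rad(angle_deg)
    ys, xs = np.mgrid[0:n, 0:n]
    xr = np.round(c + (xs - c) * np.cos(theta) - (ys - c) * np.sin(theta)).astype(int)
    yr = np.round(c + (xs - c) * np.sin(theta) + (ys - c) * np.cos(theta)).astype(int)
    ok = (0 <= xr) & (xr < n) & (0 <= yr) & (yr < n)
    rotated = np.zeros_like(image, dtype=float)
    rotated[ys[ok], xs[ok]] = image[yr[ok], xr[ok]]
    return rotated.sum(axis=0)

# A 4-pixel-tall, 8-pixel-wide bright rectangle in a 32x32 image: the
# 0-degree projection recovers its width, the 90-degree one its height.
img = np.zeros((32, 32))
img[14:18, 8:16] = 1.0
p0, p90 = radon_projection(img, 0.0), radon_projection(img, 90.0)
width = int((p0 > 0).sum())   # -> 8
height = int((p90 > 0).sum())  # -> 4
```

Sweeping the angle yields the full projection set from which the amplitude structure and geometric features above are derived.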
Identification and detection of simple 3D objects with severely blurred vision.
Kallie, Christopher S; Legge, Gordon E; Yu, Deyue
2012-12-05
Detecting and recognizing three-dimensional (3D) objects is an important component of the visual accessibility of public spaces for people with impaired vision. The present study investigated the impact of environmental factors and object properties on the recognition of objects by subjects who viewed physical objects with severely reduced acuity. The experiment was conducted in an indoor testing space. We examined detection and identification of simple convex objects by normally sighted subjects wearing diffusing goggles that reduced effective acuity to 20/900. We used psychophysical methods to examine the effect on performance of important environmental variables: viewing distance (from 10-24 feet, or 3.05-7.32 m) and illumination (overhead fluorescent and artificial window), and object variables: shape (boxes and cylinders), size (heights from 2-6 feet, or 0.61-1.83 m), and color (gray and white). Object identification was significantly affected by distance, color, height, and shape, as well as interactions between illumination, color, and shape. A stepwise regression analysis showed that 64% of the variability in identification could be explained by object contrast values (58%) and object visual angle (6%). When acuity is severely limited, illumination, distance, color, height, and shape influence the identification and detection of simple 3D objects. These effects can be explained in large part by the impact of these variables on object contrast and visual angle. Basic design principles for improving object visibility are discussed.
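The stepwise-regression logic, measuring how much variance a second predictor adds over the first, can be sketched with ordinary least squares on made-up identification scores (the coefficients below are invented, not the study's):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 160
contrast = rng.uniform(0.05, 0.9, n)
visual_angle = rng.uniform(1.0, 10.0, n)
# Hypothetical identification scores driven mostly by contrast, with a
# smaller visual-angle contribution (a made-up generative model).
score = 0.9 * contrast + 0.2 * (visual_angle / 10.0) + rng.normal(0, 0.1, n)

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R^2 of an OLS fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

r2_contrast = r_squared(contrast[:, None], score)
r2_both = r_squared(np.column_stack([contrast, visual_angle]), score)
gain = r2_both - r2_contrast  # incremental variance explained by visual angle
```

This mirrors the study's decomposition, where contrast alone explained most of the variability (58%) and visual angle added a smaller increment (6%).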
STRATOP: A Model for Designing Effective Product and Communication Strategies. Paper No. 470.
ERIC Educational Resources Information Center
Pessemier, Edgar A.
The STRATOP algorithm was developed to help planners and proponents find and test effectively designed choice objects and communication strategies. Choice objects can range from complex social, scientific, military, or educational alternatives to simple economic alternatives between assortments of branded convenience goods. Two classes of measured…
Visual short-term memory capacity for simple and complex objects.
Luria, Roy; Sessa, Paola; Gotler, Alex; Jolicoeur, Pierre; Dell'Acqua, Roberto
2010-03-01
Does the capacity of visual short-term memory (VSTM) depend on the complexity of the objects represented in memory? Although some previous findings indicated lower capacity for more complex stimuli, other results suggest that complexity effects arise during retrieval (due to errors in the comparison process with what is in memory) and are not related to storage limitations of VSTM per se. We used ERPs to track neuronal activity specifically related to retention in VSTM by measuring the sustained posterior contralateral negativity during a change detection task (which required detecting whether an item changed between a memory array and a test array). The sustained posterior contralateral negativity during the retention interval was larger for complex objects than for simple objects, suggesting that neurons mediating VSTM needed to work harder to maintain more complex objects. This, in turn, is consistent with the view that VSTM capacity depends on complexity.
Agnosic vision is like peripheral vision, which is limited by crowding.
Strappini, Francesca; Pelli, Denis G; Di Pace, Enrico; Martelli, Marialuisa
2017-04-01
Visual agnosia is a neuropsychological impairment of visual object recognition despite near-normal acuity and visual fields. A century of research has provided only a rudimentary account of the functional damage underlying this deficit. We find that the object-recognition ability of agnosic patients viewing an object directly is like that of normally-sighted observers viewing it indirectly, with peripheral vision. Thus, agnosic vision is like peripheral vision. We obtained 14 visual-object-recognition tests that are commonly used for diagnosis of visual agnosia. Our "standard" normal observer took these tests at various eccentricities in his periphery. Analyzing the published data of 32 apperceptive agnosia patients and a group of 14 posterior cortical atrophy (PCA) patients on these tests, we find that each patient's pattern of object recognition deficits is well characterized by one number, the equivalent eccentricity at which our standard observer's peripheral vision is like the central vision of the agnosic patient. In other words, each agnosic patient's equivalent eccentricity is conserved across tests. Across patients, equivalent eccentricity ranges from 4 to 40 deg, which rates severity of the visual deficit. In normal peripheral vision, the required size to perceive a simple image (e.g., an isolated letter) is limited by acuity, and that for a complex image (e.g., a face or a word) is limited by crowding. In crowding, adjacent simple objects appear unrecognizably jumbled unless their spacing exceeds the crowding distance, which grows linearly with eccentricity. Besides conservation of equivalent eccentricity across object-recognition tests, we also find conservation, from eccentricity to agnosia, of the relative susceptibility of recognition of ten visual tests. These findings show that agnosic vision is like eccentric vision. Whence crowding? 
Peripheral vision, strabismic amblyopia, and possibly apperceptive agnosia are all limited by crowding, making it urgent to know what drives crowding. Acuity does not (Song et al., 2014), but neural density might: neurons per deg² in the crowding-relevant cortical area.
ERIC Educational Resources Information Center
Valdez, Pablo; Reilly, Thomas; Waterhouse, Jim
2008-01-01
Cognitive performance is affected by an individual's characteristics and the environment, as well as by the nature of the task and the amount of practice at it. Mental performance tests range in complexity and include subjective estimates of mood, simple objective tests (reaction time), and measures of complex performance that require decisions to…
Maui Space Surveillance System Satellite Categorization Laboratory
NASA Astrophysics Data System (ADS)
Deiotte, R.; Guyote, M.; Kelecy, T.; Hall, D.; Africano, J.; Kervin, P.
The MSSS satellite categorization laboratory is a fusion of robotics and digital imaging processes that aims to decompose satellite photometric characteristics and behavior in a controlled setting. By combining a robot, a light source, and a camera to acquire non-resolved images of a model satellite, detailed photometric analyses can be performed to extract relevant information about shape features, elemental makeup, and ultimately attitude and function. In the laboratory setting, a detailed analysis can be done on any type of material or design, and the results catalogued in a database that will facilitate object identification by "curve-fitting" individual elements in the basis set to observational data that might otherwise be unidentifiable. Currently, an ST-Robotics five-degree-of-freedom robotic arm, a collimated light source, and a non-focused Apogee camera have all been integrated into a MATLAB-based software package that facilitates automatic data acquisition and analysis. Efforts to date have been aimed at construction of the lab as well as validation and verification with simple geometric objects. Simple tests on spheres, cubes, and simple satellites show promising results that could lead to a much better understanding of non-resolvable space object characteristics. This paper presents a description of the laboratory configuration and validation test results, with emphasis on the non-resolved photometric characteristics for a variety of object shapes, spin dynamics, and orientations. The future vision, utility, and benefits of the laboratory to the SSA community as a whole are also discussed.
Prefrontal Engagement during Source Memory Retrieval Depends on the Prior Encoding Task
Kuo, Trudy Y.; Van Petten, Cyma
2008-01-01
The prefrontal cortex is strongly engaged by some, but not all, episodic memory tests. Prior work has shown that source recognition tests—those that require memory for conjunctions of studied attributes—yield deficient performance in patients with prefrontal damage and greater prefrontal activity in healthy subjects, as compared to simple recognition tests. Here, we tested the hypothesis that there is no intrinsic relationship between the prefrontal cortex and source memory, but that the prefrontal cortex is engaged by the demand to retrieve weakly encoded relationships. Subjects attempted to remember object/color conjunctions after an encoding task that focused on object identity alone, and an integrative encoding task that encouraged attention to object/color relationships. After the integrative encoding task, the late prefrontal brain electrical activity that typically occurs in source memory tests was eliminated. Earlier brain electrical activity related to successful recognition of the objects was unaffected by the nature of prior encoding. PMID:16839287
NASA Technical Reports Server (NTRS)
Levy, G.; Brown, R. A.
1986-01-01
A simple economical objective analysis scheme is devised and tested on real scatterometer data. It is designed to treat dense data such as those of the Seasat A Satellite Scatterometer (SASS) for individual or multiple passes, and preserves subsynoptic scale features. Errors are evaluated with the aid of sampling ('bootstrap') statistical methods. In addition, sensitivity tests have been performed which establish qualitative confidence in calculated fields of divergence and vorticity. The SASS wind algorithm could be improved; however, the data at this point are limited by instrument errors rather than analysis errors. The analysis error is typically negligible in comparison with the instrument error, but amounts to 30 percent of the instrument error in areas of strong wind shear. The scheme is very economical, and thus suitable for large volumes of dense data such as SASS data.
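The "bootstrap" error evaluation mentioned above amounts to resampling the data with replacement and measuring the spread of the recomputed statistic. A minimal illustrative sketch (the data values and function names are hypothetical, not the SASS analysis itself):

```python
import random
import statistics

def bootstrap_se(sample, n_resamples=1000, seed=0):
    """Estimate the standard error of the mean by resampling with replacement."""
    rng = random.Random(seed)  # fixed seed so the estimate is reproducible
    means = []
    for _ in range(n_resamples):
        resample = [rng.choice(sample) for _ in sample]
        means.append(statistics.mean(resample))
    return statistics.stdev(means)

# Hypothetical wind-speed retrievals (m/s)
winds = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2, 4.4]
se = bootstrap_se(winds)
```

The bootstrap standard error should land near the analytic value, stdev/sqrt(n), which for these numbers is roughly 0.26 m/s.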
Are we in the dark ages of environmental toxicology?
McCarty, L S
2013-12-01
Environmental toxicology is judged to be in a "dark ages" period due to longstanding limitations in the implementation of the simple conceptual model that is the basis of current aquatic toxicity testing protocols. Fortunately, the environmental regulatory revolution of the last half-century is not substantially compromised, as development of past regulatory guidance was designed to deal with limited amounts of relatively poor-quality toxicity data. However, as regulatory objectives have substantially increased in breadth and depth, aquatic toxicity data derived with old testing methods are no longer adequate. In the near term, explicit model description and routine assumption validation should be mandatory. Updated testing methods could provide some improvements in toxicological data quality. A thorough reevaluation of toxicity testing objectives and methods, resulting in substantially revised standard testing methods, plus a comprehensive scheme for classification of modes/mechanisms of toxic action, should be the long-term objective. Copyright © 2013 Elsevier Inc. All rights reserved.
A simple Lagrangian forecast system with aviation forecast potential
NASA Technical Reports Server (NTRS)
Petersen, R. A.; Homan, J. H.
1983-01-01
A trajectory forecast procedure is developed which uses geopotential tendency fields obtained from a simple, multiple layer, potential vorticity conservative isentropic model. This model can objectively account for short-term advective changes in the mass field when combined with fine-scale initial analyses. This procedure for producing short-term, upper-tropospheric trajectory forecasts employs a combination of a detailed objective analysis technique, an efficient mass advection model, and a diagnostically proven trajectory algorithm, none of which require extensive computer resources. Results of initial tests are presented, which indicate an exceptionally good agreement for trajectory paths entering the jet stream and passing through an intensifying trough. It is concluded that this technique not only has potential for aiding in route determination, fuel use estimation, and clear air turbulence detection, but also provides an example of the types of short range forecasting procedures which can be applied at local forecast centers using simple algorithms and a minimum of computer resources.
Automatic classification of bottles in crates
NASA Astrophysics Data System (ADS)
Aas, Kjersti; Eikvil, Line; Bremnes, Dag; Norbryhn, Andreas
1995-03-01
This paper presents a statistical method for classification of bottles in crates for use in automatic return-bottle machines. For the machines to reimburse the correct deposit, reliable recognition is important. The images are acquired by a laser range scanner co-registering the distance to the object and the strength of the reflected signal. The objective is to identify the crate and the bottles from a library of legal types. Bottles with significantly different sizes are separated using quite simple methods, while a more sophisticated recognizer is required to distinguish the more similar bottle types. Good results have been obtained when testing the method on bottle types that are difficult to distinguish using simple methods.
Manual lateralization in macaques: handedness, target laterality and task complexity.
Regaiolli, Barbara; Spiezio, Caterina; Vallortigara, Giorgio
2016-01-01
Non-human primates represent models for understanding the evolution of handedness in humans. Although several studies have investigated non-human primate handedness, few have examined the relationship between target position, hand preference, and task complexity. This study aimed to investigate macaque handedness in relation to target laterality and tastiness, as well as task complexity. Seven pig-tailed macaques (Macaca nemestrina) were involved in three different "two alternative choice" tests: one low-level task and two high-level tasks (HLTs). During the first and the third tests, macaques could select a preferred food and a non-preferred food, whereas by modifying the design of the second test, macaques were presented with no-difference alternatives per trial. Furthermore, a simple-reaching test was administered to assess hand preference in a social context. Macaques showed hand preference at the individual level both in simple and complex tasks, but not in the simple-reaching test. Moreover, target position seemed to affect hand preference in retrieving an object in the low-level task, but not in the HLTs. Additionally, individual hand preference seemed to be affected by the tastiness of the item to be retrieved. The results suggest that both target laterality and individual motivation might influence hand preference in macaques, especially in simple tasks.
Simple shear of deformable square objects
NASA Astrophysics Data System (ADS)
Treagus, Susan H.; Lan, Labao
2003-12-01
Finite element models of square objects in a contrasting matrix in simple shear show that the objects deform to a variety of shapes. For a range of viscosity contrasts, we catalogue the changing shapes and orientations of objects in progressive simple shear. At moderate simple shear (γ = 1.5), the shapes are virtually indistinguishable from those in equivalent pure shear models with the same bulk strain (Rs = 4), examined in a previous study. In theory, differences would be expected, especially for very stiff objects or at very large strain. In all our simple shear models, relatively competent square objects become asymmetric barrel shapes with concave shortened edges, similar to some types of boudin. Incompetent objects develop shapes surprisingly similar to mica fish described in mylonites.
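The correspondence between simple shear at γ = 1.5 and pure shear at Rs = 4 quoted above follows from the standard strain-ellipse axial ratio for simple shear (a textbook strain-analysis result, not a formula stated in this paper):

```latex
R_s = \frac{\gamma^{2} + 2 + \gamma\sqrt{\gamma^{2} + 4}}{2},
\qquad
\gamma = 1.5 \;\Rightarrow\;
R_s = \frac{2.25 + 2 + 1.5 \times 2.5}{2} = \frac{8}{2} = 4 .
```

As a consistency check, this ratio satisfies \(R_s + 1/R_s = \gamma^{2} + 2\): with \(R_s = 4\), both sides equal 4.25.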
Shih, Ching-Hsiang; Chang, Man-Ling
2012-01-01
Recent research has adopted software technology, turning the Nintendo Wii Balance Board into a high-performance standing-location detector with a newly developed standing location detection program (SLDP). This study extended SLDP functionality to assess whether two people with developmental disabilities would be able to actively perform simple occupational activities by controlling their favorite environmental stimulation using Nintendo Wii Balance Boards and SLDP software. An ABAB design was adopted in this study to perform the tests. The test results showed that, during the intervention phases, both participants significantly increased their target response (i.e. simple occupational activity) to activate the control system to produce environmental stimulation. The practical and developmental implications of the findings are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Object discrimination using electrotactile feedback.
Arakeri, Tapas J; Hasse, Brady A; Fuglevand, Andrew J
2018-04-09
A variety of bioengineering systems are being developed to restore tactile sensations in individuals who have lost somatosensory feedback because of spinal cord injury, stroke, or amputation. These systems typically detect tactile force with sensors placed on an insensate hand (or prosthetic hand in the case of amputees) and deliver touch information by electrically or mechanically stimulating sensate skin above the site of injury. Successful object manipulation, however, also requires proprioceptive feedback representing the configuration and movements of the hand and digits. Therefore, we developed a simple system that simultaneously provides information about tactile grip force and hand aperture using current amplitude-modulated electrotactile feedback. We evaluated the utility of this system by testing the ability of eight healthy human subjects to distinguish among 27 objects of varying sizes, weights, and compliances based entirely on electrotactile feedback. The feedback was modulated by grip-force and hand-aperture sensors placed on the hand of an experimenter (not visible to the subject) grasping and lifting the test objects. We were also interested to determine the degree to which subjects could learn to use such feedback when tested over five consecutive sessions. The average percentage correct identifications on day 1 (28.5% ± 8.2% correct) was well above chance (3.7%) and increased significantly with training to 49.2% ± 10.6% on day 5. Furthermore, this training transferred reasonably well to a set of novel objects. These results suggest that simple, non-invasive methods can provide useful multisensory feedback that might prove beneficial in improving the control over prosthetic limbs.
The objective of this research is to test the utility of simple functions of spatially integrated and temporally averaged ground water residence times in shallow "groundwatersheds" with field observations and more detailed computer simulations. The residence time of water in the...
A simple bedside test to assess the swallowing dysfunction in Parkinson's disease
Kanna, S. Vinoth; Bhanu, K.
2014-01-01
Background: Swallowing changes are common in Parkinson's disease (PD). Early identification is essential to avoid complications of aspiration. Objectives: To evaluate the swallowing ability of PD patients and to correlate it with the indicators of disease progression. Materials and Methods: A total of 100 PD patients (70 males and 30 females) aged between 50 years and 70 years with varying stage, duration, and severity were enrolled in a cross-sectional study carried out between January and May 2012. A simple bedside water swallowing test was performed using a standard 150 ml of water. The swallowing process was assessed under three categories: swallowing speed (ml/s), swallowing volume (ml/swallow), and swallowing duration (s/swallow). An equal number of age- and sex-matched controls were also evaluated. Results: All of them completed the task of swallowing. A mean swallowing speed (27.48 ml/s), swallowing volume (28.5 ml/swallow), and swallowing duration (1.05 s/swallow) were established by the control group. The PD patients showed decreased swallowing speed (7.15 ml/s in males and 6.61 ml/s in females), decreased swallowing volume (14.59 ml/swallow in males and 14 ml/swallow in females), and increased swallowing duration (2.37 s/swallow in males and 2.42 s/swallow in females), all statistically significant. There was a significant positive correlation between the severity, duration, and staging of the disease and swallowing performance, and a poor correlation between subjective reports of dysphagia and objective performance on the water swallow test. Conclusion: The water swallowing test is a simple bedside test to identify swallowing changes early in PD. It is recommended in all PD patients to detect dysphagia early and to intervene appropriately. PMID:24753662
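The three indices above are simple ratios of the raw measurements (total volume, total time, number of swallows). A minimal sketch, with hypothetical measurements rather than the study's data:

```python
def swallow_indices(volume_ml, total_time_s, n_swallows):
    """Compute the three water-swallow-test indices from raw measurements."""
    speed = volume_ml / total_time_s              # swallowing speed, ml/s
    vol_per_swallow = volume_ml / n_swallows      # swallowing volume, ml/swallow
    dur_per_swallow = total_time_s / n_swallows   # swallowing duration, s/swallow
    return speed, vol_per_swallow, dur_per_swallow

# Hypothetical control-like trial: 150 ml drunk in 5.5 s with 5 swallows
speed, vol, dur = swallow_indices(150, 5.5, 5)
```

For this hypothetical trial the indices come out near the control-group means reported above (about 27 ml/s, 30 ml/swallow, 1.1 s/swallow).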
ERIC Educational Resources Information Center
Myerscough, Don; And Others
1996-01-01
Describes an activity whose objectives are to encode and decode messages using linear functions and their inverses; to use modular arithmetic, including use of the reciprocal for simple equation solving; to analyze patterns and make and test conjectures; to communicate procedures and algorithms; and to use problem-solving strategies. (ASK)
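The encode/decode activity with linear functions, inverses, and modular reciprocals can be illustrated with an affine cipher over the 26-letter alphabet. A minimal sketch (the key values a = 5, b = 8 are illustrative choices, not from the activity):

```python
def affine_encode(text, a=5, b=8, m=26):
    """Encrypt with the linear function E(x) = (a*x + b) mod m; a must be coprime with m."""
    return "".join(chr((a * (ord(c) - 65) + b) % m + 65) for c in text)

def affine_decode(cipher, a=5, b=8, m=26):
    """Decrypt with the inverse D(y) = a_inv * (y - b) mod m."""
    a_inv = pow(a, -1, m)  # modular reciprocal of a, used for simple equation solving
    return "".join(chr((a_inv * (ord(c) - 65 - b)) % m + 65) for c in cipher)

cipher = affine_encode("MATH")   # -> "QIZR"
plain = affine_decode(cipher)    # -> "MATH"
```

Decoding works because multiplying by the modular reciprocal of a undoes the multiplication, exactly as dividing would over the reals.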
The Diagnosticity of Color for Emotional Objects
McMenamin, Brenton W.; Radue, Jasmine; Trask, Joanna; Huskamp, Kristin; Kersten, Daniel; Marsolek, Chad J.
2012-01-01
Object classification can be facilitated if simple diagnostic features can be used to determine class membership. Previous studies have found that simple shapes may be diagnostic for emotional content and automatically alter the allocation of visual attention. In the present study, we analyzed whether color is diagnostic of emotional content and tested whether emotionally diagnostic hues alter the allocation of visual attention. Reddish-yellow hues are more common in (i.e., diagnostic of) emotional images, particularly images with positive emotional content. An exogenous cueing paradigm was employed to test whether these diagnostic hues orient attention differently from other hues due to the emotional diagnosticity. In two experiments, we found that participants allocated attention differently to diagnostic hues than to non-diagnostic hues, in a pattern indicating a broadening of spatial attention when cued with diagnostic hues. Moreover, the attentional broadening effect was predicted by self-reported measures of affective style, linking the behavioral effect to emotional processes. These results confirm the existence and use of diagnostic features for the rapid detection of emotional content. PMID:24659831
Testing effects in visual short-term memory: The case of an object's size.
Makovski, Tal
2018-05-29
In many daily activities, we need to form and retain temporary representations of an object's size. Typically, such visual short-term memory (VSTM) representations follow perception and are considered reliable. Here, participants were asked to hold in mind a single simple object for a short duration and to reproduce its size by adjusting the length and width of a test probe. Experiment 1 revealed two powerful findings: First, similar to a recently reported perceptual illusion, participants greatly overestimated the size of open objects - ones with missing boundaries - relative to the same-size fully closed objects. This finding confirms that object boundaries are critical for size perception and memory. Second, and in contrast to perception, even the size of the closed objects was largely overestimated. Both inflation effects were substantial and were replicated and extended in Experiments 2-5. Experiments 6-8 used a different testing procedure to examine whether the overestimation effects are due to inflation of size in VSTM representations or to biases introduced during the reproduction phase. These data showed that while the overestimation of the open objects was repeated, the overestimation of the closed objects was not. Taken together, these findings suggest that similar to perception, only the size representation of open objects is inflated in VSTM. Importantly, they demonstrate the considerable impact of the testing procedure on VSTM tasks and further question the use of reproduction procedures for measuring VSTM.
Using an Integrative Approach To Teach Hebrew Grammar in an Elementary Immersion Class.
ERIC Educational Resources Information Center
Eckstein, Peter
The 12-week program described here was designed to improve a Hebrew language immersion class' ability to correctly use the simple past and present tenses. The target group was a sixth-grade class that achieved a 65.68 percent error-free rate on a pre-test; the project's objective was to achieve 90 percent error-free tests, using student…
ERIC Educational Resources Information Center
Lee, Yeung Chung; Kwok, Ping Wai
2010-01-01
Traditional methods used to teach the concept of density that employ solid objects of different masses and volumes can be supplemented by enquiry activities in which students vary the mass-to-volume ratio of the same object to test ideas about density and flotation. A simple substance, Blu-Tack, is an ideal material to use in this case. The…
Guidance of eruption for general practitioners.
Ngan, Peter W; Kao, Elizabeth C; Wei, Stephen H
2003-04-01
The principle of early treatment through well-planned extraction of primary teeth followed by removal of permanent teeth has stood the test of time. The objective of this article is to develop some simple guidelines for general dental practitioners to perform 'guidance of eruption' in malocclusion with severe crowding.
Circular common-path point diffraction interferometer.
Du, Yongzhao; Feng, Guoying; Li, Hongru; Vargas, J; Zhou, Shouhuan
2012-10-01
A simple and compact point-diffraction interferometer with a circular common-path geometry configuration is developed. The interferometer is constructed from a beam splitter, two reflection mirrors, and a telescope system composed of two lenses. The signal and reference waves travel along the same path. Furthermore, an opaque mask containing a reference pinhole and a test object holder or test window is positioned in the common focal plane of the telescope system. The object wave is divided into two beams that take opposite paths along the interferometer. The reference wave is filtered by the reference pinhole, while the signal wave is transmitted through the object holder. The reference and signal waves are combined again in the beam splitter and their interference is imaged on the CCD. The new design is compact, vibration insensitive, and suitable for the measurement of moving objects or dynamic processes.
Clinical history and biologic age predicted falls better than objective functional tests.
Gerdhem, Paul; Ringsberg, Karin A M; Akesson, Kristina; Obrant, Karl J
2005-03-01
Fall risk assessment is important because the consequences, such as a fracture, may be devastating. The objective of this study was to find the test or tests that best predicted falls in a population-based sample of elderly women. The fall-predictive ability of a questionnaire, a subjective estimate of biologic age and objective functional tests (gait, balance [Romberg and sway test], thigh muscle strength, and visual acuity) were compared in 984 randomly selected women, all 75 years of age. A recalled fall was the most important predictor for future falls. Only recalled falls and intake of psycho-active drugs independently predicted future falls. Women with at least five of the most important fall predictors (previous falls, conditions affecting the balance, tendency to fall, intake of psychoactive medication, inability to stand on one leg, high biologic age) had an odds ratio of 11.27 (95% confidence interval 4.61-27.60) for a fall (sensitivity 70%, specificity 79%). The more time-consuming objective functional tests were of limited importance for fall prediction. A simple clinical history, the inability to stand on one leg, and a subjective estimate of biologic age were more important as part of the fall risk assessment.
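The reported sensitivity, specificity, and odds ratio all derive from a 2x2 cross-classification of predictor status against observed falls. A generic sketch of the computation (the counts below are hypothetical, chosen only to reproduce the quoted 70% sensitivity and 79% specificity; they are not the study's data):

```python
def diagnostics(tp, fp, fn, tn):
    """Odds ratio, sensitivity, and specificity from a 2x2 table of counts."""
    odds_ratio = (tp * tn) / (fp * fn)   # cross-product ratio
    sensitivity = tp / (tp + fn)         # fallers correctly flagged
    specificity = tn / (tn + fp)         # non-fallers correctly cleared
    return odds_ratio, sensitivity, specificity

# Hypothetical counts: rows = predictor positive/negative, columns = faller/non-faller
or_, sens, spec = diagnostics(tp=70, fp=21, fn=30, tn=79)
```

Note that the paper's odds ratio of 11.27 comes from a multivariable model, so a raw 2x2 cross-product ratio like this one would not match it exactly.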
Visual Short-Term Memory for Complex Objects in 6- and 8-Month-Old Infants
ERIC Educational Resources Information Center
Kwon, Mee-Kyoung; Luck, Steven J.; Oakes, Lisa M.
2014-01-01
Infants' visual short-term memory (VSTM) for simple objects undergoes dramatic development: Six-month-old infants can store information in VSTM about only a single simple object presented in isolation, whereas 8-month-old infants can store information about simple objects presented in multiple-item arrays. This study extended this work to examine…
Does linear separability really matter? Complex visual search is explained by simple search
Vighneshvel, T.; Arun, S. P.
2013-01-01
Visual search in real life involves complex displays with a target among multiple types of distracters, but in the laboratory, it is often tested using simple displays with identical distracters. Can complex search be understood in terms of simple searches? This link may not be straightforward if complex search has emergent properties. One such property is linear separability, whereby search is hard when a target cannot be separated from its distracters using a single linear boundary. However, evidence in favor of linear separability is based on testing stimulus configurations in an external parametric space that need not be related to their true perceptual representation. We therefore set out to assess whether linear separability influences complex search at all. Our null hypothesis was that complex search performance depends only on classical factors such as target-distracter similarity and distracter homogeneity, which we measured using simple searches. Across three experiments involving a variety of artificial and natural objects, differences between linearly separable and nonseparable searches were explained using target-distracter similarity and distracter heterogeneity. Further, simple searches accurately predicted complex search regardless of linear separability (r = 0.91). Our results show that complex search is explained by simple search, refuting the widely held belief that linear separability influences visual search. PMID:24029822
A step-by-step solution for embedding user-controlled cines into educational Web pages.
Cornfeld, Daniel
2008-03-01
The objective of this article is to introduce a simple method for embedding user-controlled cines into a Web page using a simple JavaScript. Step-by-step instructions are included and the source code is made available. This technique allows the creation of portable Web pages that allow the user to scroll through cases as if seated at a PACS workstation. A simple JavaScript allows scrollable image stacks to be included on Web pages. With this technique, you can quickly and easily incorporate entire stacks of CT or MR images into online teaching files. This technique has the potential for use in case presentations, online didactics, teaching archives, and resident testing.
Designing, engineering, and testing wood structures
NASA Technical Reports Server (NTRS)
Gorman, Thomas M.
1992-01-01
The objective of this paper is to introduce basic structural engineering concepts in a clear, simple manner while actively involving students. This project emphasizes the fact that a good design uses materials efficiently. The test structure in this experiment can easily be built and has various design options. Even when the structure is loaded to collapsing, only one or two pieces usually break, leaving the remaining pieces intact and reusable.
Annual Research Progress Report, Fiscal Year 1980,
1980-10-01
Uric Acid Levels at 36 Weeks Gestation as a Screening Test for Preeclampsia as an Aid to Further Management. Investigator: CPT Ellis M. Knight, MC. Key words: serum uric acid, preeclampsia. Approved for continuation. Study objective: To demonstrate that serum uric acid level is a simple, specific screening test for preeclampsia.
Training Medical Students about Hazardous Drinking Using Simple Assessment Techniques
ERIC Educational Resources Information Center
Hidalgo, Jesús López-Torres; Pretel, Fernando Andrés; Bravo, Beatriz Navarro; Rabadan, Francisco Escobar; Serrano Selva, Juan Pedro; Latorre Postigo, Jose Miguel; Martínez, Ignacio Párraga
2014-01-01
Objective: To examine the ability of medical students to identify hazardous drinkers using screening tools recommended in clinical practice. Design: Observational cross-sectional study. Setting: Faculty of Medicine of Castilla-La Mancha, Spain. Method: The medical students learnt to use Alcohol Use Disorders Identification Test (AUDIT) and…
Application of Transformations in Parametric Inference
ERIC Educational Resources Information Center
Brownstein, Naomi; Pensky, Marianna
2008-01-01
The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…
The zebrafish world of colors and shapes: preference and discrimination.
Oliveira, Jessica; Silveira, Mayara; Chacon, Diana; Luchiari, Ana
2015-04-01
The natural environment imposes many challenges on animals, which have to use cognitive abilities to cope with and exploit it to enhance their fitness. Since the zebrafish is a well-established model for cognitive studies and high-throughput screening for drugs and diseases that affect cognition, we tested their ability for ambient color preference and 3D object discrimination to establish a protocol for memory evaluation. For the color preference test, zebrafish were observed in a multiple-chamber tank with different environmental color options. Zebrafish showed preference for blue and green, and avoided yellow and red. For the 3D object discrimination, zebrafish were allowed to explore two identical objects and then observed in a one-trial test in which a new color, size, or shape of the object was presented. Zebrafish showed discrimination for color, shape, and color+shape combined, but not size. These results imply that zebrafish seem to use some categorical system to discriminate items, and distracters affect their ability for discrimination. The type of variables available (color and shape) may favor zebrafish object perception and facilitate discrimination processing. We suggest that this easy and simple memory test could serve as a useful screening tool for cognitive dysfunction and neurotoxicological studies.
Perceptual advantage for category-relevant perceptual dimensions: the case of shape and motion.
Folstein, Jonathan R; Palmeri, Thomas J; Gauthier, Isabel
2014-01-01
Category learning facilitates perception along relevant stimulus dimensions, even when tested in a discrimination task that does not require categorization. While this general phenomenon has been demonstrated previously, perceptual facilitation along dimensions has been documented by measuring different specific phenomena in different studies using different kinds of objects. Across several object domains, there is support for acquired distinctiveness, the stretching of a perceptual dimension relevant to learned categories. Studies using faces and studies using simple separable visual dimensions have also found evidence of acquired equivalence, the shrinking of a perceptual dimension irrelevant to learned categories, and categorical perception, the local stretching across the category boundary. These latter two effects are rarely observed with complex non-face objects. Failures to find these effects with complex non-face objects may have been because the dimensions tested previously were perceptually integrated. Here we tested effects of category learning with non-face objects categorized along dimensions that have been found to be processed by different areas of the brain: shape and motion. While we replicated acquired distinctiveness, we found no evidence for acquired equivalence or categorical perception.
Simental-Mendía, Luis E; Simental-Mendía, Esteban; Rodríguez-Hernández, Heriberto; Rodríguez-Morán, Martha; Guerrero-Romero, Fernando
2016-01-01
Introduction and aim. Given that early identification of non-alcoholic fatty liver disease (NAFLD) is an important issue for primary prevention of hepatic disease, the objectives of this study were to evaluate the efficacy of the product of triglyceride and glucose levels (TyG) for screening simple steatosis and non-alcoholic steatohepatitis (NASH) in asymptomatic women, and to compare its efficacy vs. other biomarkers for recognizing NAFLD. Asymptomatic women aged 20 to 65 years were enrolled into a cross-sectional study. The optimal values of TyG for screening simple steatosis and NASH were established on a Receiver Operating Characteristic scatter plot; the sensitivity, specificity, and likelihood ratios of the TyG index were estimated versus liver biopsy. According to sensitivity and specificity, the efficacy of TyG was compared versus well-known clinical biomarkers for recognizing NAFLD. A total of 50 asymptomatic women were enrolled. The best cutoff point of TyG for screening simple steatosis was 4.58 (sensitivity 0.94, specificity 0.69); in addition, the best cutoff point of the TyG index for screening NASH was 4.59 (sensitivity 0.87, specificity 0.69). The positive and negative likelihood ratios were 3.03 and 0.08 for simple steatosis, and 2.80 and 0.18 for NASH. Compared with the SteatoTest, NashTest, Fatty Liver Index, and Algorithm, the TyG proved to be the best screening test. The TyG has high sensitivity and a low negative likelihood ratio; compared with other clinical biomarkers, the TyG proved to be the best test for screening simple steatosis and NASH.
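The likelihood ratios above follow directly from the reported sensitivity and specificity via the standard definitions LR+ = sensitivity / (1 − specificity) and LR− = (1 − sensitivity) / specificity. A sketch that approximately reproduces the paper's figures (small discrepancies from the published 0.08 reflect rounding of the reported operating points):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios for a diagnostic cutoff."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

# Reported operating points for the two TyG cutoffs
steatosis = likelihood_ratios(0.94, 0.69)  # cutoff 4.58
nash = likelihood_ratios(0.87, 0.69)       # cutoff 4.59
```

The steatosis values come out near (3.03, 0.09) and the NASH values near (2.81, 0.19), matching the published 3.03/2.80 for LR+ closely.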
The Zombie Plot: A Simple Graphic Method for Visualizing the Efficacy of a Diagnostic Test.
Richardson, Michael L
2016-08-09
One of the most important jobs of a radiologist is to pick the most appropriate imaging test for a particular clinical situation. Making a proper selection sometimes requires statistical analysis. The objective of this article is to introduce a simple graphic technique, an ROC plot that has been divided into zones of mostly bad imaging efficacy (ZOMBIE, hereafter referred to as the "zombie plot"), that transforms information about imaging efficacy from the numeric domain into the visual domain. The numeric rationale for the use of zombie plots is given, as are several examples of the clinical use of these plots. Two online calculators are described that simplify the process of producing a zombie plot.
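On any ROC plot, a test occupies the point (1 − specificity, sensitivity), and zoning the plot amounts to partitioning that space. A minimal sketch of the idea; the zone boundary used here is an illustrative assumption, not the article's exact ZOMBIE definition:

```python
def roc_point(sensitivity, specificity):
    """Coordinates of a test on an ROC plot: x = 1 - specificity, y = sensitivity."""
    return (1 - specificity, sensitivity)

def in_bad_zone(sensitivity, specificity, min_gain=0.2):
    """Flag tests whose ROC point lies on or near the chance diagonal (assumed margin)."""
    x, y = roc_point(sensitivity, specificity)
    return y - x < min_gain  # little vertical gain over chance: mostly bad efficacy

strong = in_bad_zone(0.94, 0.69)      # well above the diagonal -> False
near_chance = in_bad_zone(0.55, 0.50) # barely better than guessing -> True
```

The visual version of this check, reading a point's zone off the plot rather than computing a margin, is exactly what the zombie plot provides.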
Haga, Yoshihiro; Chida, Koichi; Inaba, Yohei; Kaga, Yuji; Meguro, Taiichiro; Zuguchi, Masayuki
2016-02-01
As the use of diagnostic X-ray equipment with flat panel detectors (FPDs) has increased, so has the importance of proper management of FPD systems. To ensure quality control (QC) of FPD systems, an easy method for evaluating FPD imaging performance for both stationary and moving objects is required. Until now, simple rotatable QC phantoms have not been available for easy evaluation of the performance (spatial resolution and dynamic range) of FPDs in imaging moving objects. We developed a QC phantom for this purpose. It consists of three thicknesses of copper and a rotatable test pattern of piano wires of various diameters. Initial tests confirmed its stable performance. Our moving phantom is very useful for QC of FPD images of moving objects because it enables easy visual evaluation of imaging performance (spatial resolution and dynamic range).
ERIC Educational Resources Information Center
Molina, Carmen Eneida, Ed.; And Others
This guide for teachers, in English and Spanish, examines how assigned sex roles affect grade school girls in competitive sports, simple games, pastimes, and other extracurricular activities. A pre-test and post-test are included to measure the user's awareness of sexual stereotypes. Five object lessons cover the following topics: (1) myths that…
Development of a Fluid Structures Interaction Test Technique for Fabrics
NASA Technical Reports Server (NTRS)
Zilliac, Gregory G.; Heineck, James T.; Schairer, Edward T.; Mosher, Robert N.; Garbeff, Theodore Joseph
2012-01-01
Application of fluid structures interaction (FSI) computational techniques to configurations of interest to the entry, descent and landing (EDL) community is limited by two factors: limited characterization of the material properties for fabrics of interest and insufficient experimental data to validate the FSI codes. Recently, ILC Dover Inc. performed standard tests to characterize the static stress-strain response of four candidate fabrics for use in EDL applications. The objective of the tests described here is to address the need for an FSI dataset for CFD validation purposes. To reach this objective, the structural response of fabrics was measured in a very simple aerodynamic environment with well-controlled boundary conditions. Two test series were undertaken. The first series covered a range of tunnel conditions and the second focused on conditions that resulted in fabric panel buckling.
Mac A. Callaham; Arthur J. Stewart; Clara Alarcon; Sara J. McMillen
2002-01-01
Current bioremediation techniques for petroleum-contaminated soils are designed to remove contaminants as quickly and efficiently as possible, but not necessarily with postremediation soil biological quality as a primary objective. To test a simple postbioremediation technique, we added earthworms (Eisenia fetida) or wheat (Triticum aestivum...
Fathers Show Modifications of Infant-Directed Action Similar to that of Mothers
ERIC Educational Resources Information Center
Rutherford, M. D.; Przednowek, Malgorzata
2012-01-01
Mothers' actions are more enthusiastic, simple, and repetitive when demonstrating novel object properties to their infants than to adults, a behavioral modification called "infant-directed action" by Brand and colleagues (2002). The current study tested whether fathers also tailor their behavior when interacting with infants and whether this…
Personality Tests: Self-Disclosures or Self-Presentations?
ERIC Educational Resources Information Center
Johnson, John A.
When people talk about themselves, psychologists have noted that their verbal reports can be categorized as simple factual communications about the self, i.e., self-disclosure, or as ways to instruct others about how one is to be regarded, i.e., self-presentation. Responses to items on objective self-report measures of personality similarly can be…
Clinical prediction of fall risk and white matter abnormalities: a diffusion tensor imaging study
USDA-ARS?s Scientific Manuscript database
The Tinetti scale is a simple clinical tool designed to predict risk of falling by focusing on gait and stance impairment in elderly persons. Gait impairment is also associated with white matter (WM) abnormalities. Objective: To test the hypothesis that elderly subjects at risk for falling, as deter...
Gripping characteristics of an electromagnetically activated magnetorheological fluid-based gripper
NASA Astrophysics Data System (ADS)
Choi, Young T.; Hartzell, Christine M.; Leps, Thomas; Wereley, Norman M.
2018-05-01
The design and testing of a magnetorheological fluid (MRF)-based universal gripper (MR gripper) are presented in this study. The MR gripper was developed to have a simple design yet reliably grip and handle a wide range of simple objects. The design consists of a bladder mounted atop an electromagnet, where the bladder is filled with an MRF formulated for long-term sedimentation stability, synthesized from a high viscosity linear polysiloxane (HVLP) carrier fluid with a carbonyl iron particle (CIP) volume fraction of 35%. Two bladders were fabricated: a magnetizable bladder using a magnetorheological elastomer (MRE), and a passive (non-magnetizable) silicone rubber bladder. The holding force and applied (initial compression) force of the MR gripper with a bladder fill volume of 75% were measured experimentally, for both magnetizable and passive bladders, using a servohydraulic material testing machine and a range of objects. The gripping performance of the MR gripper using the MRE bladder was then compared to that of the MR gripper using the passive bladder.
Portable Microleak-Detection System
NASA Technical Reports Server (NTRS)
Rivers, H. Kevin; Sikora, Joseph G.; Sankaran, Sankara N.
2007-01-01
The figure schematically depicts a portable microleak-detection system that has been built especially for use in testing hydrogen tanks made of polymer-matrix composite materials. (As used here, microleak signifies a leak that is too small to be detectable by the simple soap-bubble technique.) The system can also be used to test for microleaks in tanks that are made of other materials and that contain gases other than hydrogen. Results of calibration tests have shown that measurement errors are less than 10 percent for leak rates ranging from 0.3 to 200 cm3/min. Like some other microleak-detection systems, this system includes a vacuum pump and associated plumbing for sampling the leaking gas, and a mass spectrometer for analyzing the molecular constituents of the gas. The system includes a flexible vacuum chamber that can be attached to the outer surface of a tank or other object of interest that is to be tested for leakage (hereafter denoted, simply, the test object). The gas used in a test can be the gas or vapor (e.g., hydrogen in the original application) to be contained by the test object. Alternatively, following common practice in leak testing, helium can be used as a test gas. In either case, the mass spectrometer can be used to verify that the gas measured by the system is the test gas rather than a different gas and, hence, that the leak is indeed from the test object.
Hypervelocity Impact (HVI). Volume 7; WLE High Fidelity Specimen RCC16R
NASA Technical Reports Server (NTRS)
Gorman, Michael R.; Ziola, Steven M.
2007-01-01
During 2003 and 2004, the Johnson Space Center's White Sands Testing Facility in Las Cruces, New Mexico conducted hypervelocity impact tests on the space shuttle wing leading edge. Hypervelocity impact tests were conducted to determine if Micro-Meteoroid/Orbital Debris impacts could be reliably detected and located using simple passive ultrasonic methods. The objective of Target RCC16R was to study hypervelocity impacts through the reinforced carbon-carbon (RCC) panels of the Wing Leading Edge. Impact damage was detected using lightweight, low power instrumentation capable of being used in flight.
Quasi-Uniform High Speed Foam Crush Testing Using a Guided Drop Mass Impact
NASA Technical Reports Server (NTRS)
Jones, Lisa E. (Technical Monitor); Kellas, Sotiris
2004-01-01
A relatively simple method for measuring the dynamic crush response of foam materials at various loading rates is described. The method utilizes a drop mass impact configuration with mass and impact velocity selected such that the crush speed remains approximately uniform during the entire sample crushing event. Instrumentation, data acquisition, and data processing techniques are presented, and limitations of the test method are discussed. The objective of the test method is to produce input data for dynamic finite element modeling involving crash and energy absorption characteristics of foam materials.
Drawing skill is related to the efficiency of encoding object structure.
Perdreau, Florian; Cavanagh, Patrick
2014-01-01
Accurate drawing calls on many skills beyond simple motor coordination. A good internal representation of the target object's structure is necessary to capture its proportion and shape in the drawing. Here, we assess two aspects of the perception of object structure and relate them to participants' drawing accuracy. First, we assessed drawing accuracy by computing the geometrical dissimilarity of their drawing to the target object. We then used two tasks to evaluate the efficiency of encoding object structure. First, to examine the rate of temporal encoding, we varied presentation duration of a possible versus impossible test object in the fovea using two different test sizes (8° and 28°). More skilled participants were faster at encoding an object's structure, but this difference was not affected by image size. A control experiment showed that participants skilled in drawing did not have a general advantage that might have explained their faster processing for object structure. Second, to measure the critical image size for accurate classification in the periphery, we varied image size with possible versus impossible object tests centered at two different eccentricities (3° and 8°). More skilled participants were able to categorise object structure at smaller sizes, and this advantage did not change with eccentricity. A control experiment showed that the result could not be attributed to differences in visual acuity, leaving attentional resolution as a possible explanation. Overall, we conclude that drawing accuracy is related to faster encoding of object structure and better access to crowded details. PMID:25469216
Diagnosis of asthma: diagnostic testing.
Brigham, Emily P; West, Natalie E
2015-09-01
Asthma is a heterogeneous disease, encompassing both atopic and non-atopic phenotypes. Diagnosis of asthma is based on the combined presence of typical symptoms and objective tests of lung function. Objective diagnostic testing consists of 2 components: (1) demonstration of airway obstruction, and (2) documentation of variability in degree of obstruction. A review of current guidelines and literature was performed regarding diagnostic testing for asthma. Spirometry with bronchodilator reversibility testing remains the mainstay of asthma diagnostic testing for children and adults. Repetition of the test over several time points may be necessary to confirm airway obstruction and variability thereof. Repeated peak flow measurement is relatively simple to implement in a clinical and home setting. Bronchial challenge testing is reserved for patients in whom the aforementioned testing has been unrevealing but clinical suspicion remains, though is associated with low specificity. Demonstration of eosinophilic inflammation, via fractional exhaled nitric oxide measurement, or atopy, may be supportive of atopic asthma, though diagnostic utility is limited particularly in nonatopic asthma. All efforts should be made to confirm the diagnosis of asthma in those who are being presumptively treated but have not had objective measurements of variability in the degree of obstruction. Multiple testing modalities are available for objective confirmation of airway obstruction and variability thereof, consistent with a diagnosis of asthma in the appropriate clinical context. Providers should be aware that both these characteristics may be present in other disease states, and may not be specific to a diagnosis of asthma. © 2015 ARS-AAOA, LLC.
Infants' Attribution of a Goal to a Morphologically Unfamiliar Agent
ERIC Educational Resources Information Center
Shimizu, Y. Alpha; Johnson, Susan C.
2004-01-01
How do infants identify the psychological actors in their environments? Three groups of 12-month-old infants were tested for their willingness to encode a simple approach behavior as goal-directed as a function of whether it was performed by (1) a human hand, (2) a morphologically unfamiliar green object that interacted with a confederate and…
Multiobjective Optimization Using a Pareto Differential Evolution Approach
NASA Technical Reports Server (NTRS)
Madavan, Nateri K.; Biegel, Bryan A. (Technical Monitor)
2002-01-01
Differential Evolution is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. In this paper, the Differential Evolution algorithm is extended to multiobjective optimization problems by using a Pareto-based approach. The algorithm performs well when applied to several test optimization problems from the literature.
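The DE update loop the abstract refers to (mutation, crossover, greedy selection) can be sketched in a few lines. This is a minimal single-objective DE/rand/1/bin illustration, not the paper's Pareto-based multiobjective extension; the function and parameter names are our own.

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal single-objective DE/rand/1/bin (a sketch, not the paper's code)."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct population members (none equal to i).
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover, forcing at least one gene from the mutant.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: keep the trial vector only if it is no worse.
            f_trial = f(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = int(np.argmin(fitness))
    return pop[best], fitness[best]
```

A Pareto-based extension would replace the scalar greedy selection with a dominance check against the trial vector, as the paper describes.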
Young Children's Spontaneous Use of Geometry in Maps
ERIC Educational Resources Information Center
Shusterman, Anna; Lee, Sang Ah; Spelke, Elizabeth S.
2008-01-01
Two experiments tested whether 4-year-old children extract and use geometric information in simple maps without task instruction or feedback. Children saw maps depicting an arrangement of three containers and were asked to place an object into a container designated on the map. In Experiment 1, one of the three locations on the map and the array…
Attentional gating models of object substitution masking.
Põder, Endel
2013-11-01
Di Lollo, Enns, and Rensink (2000) proposed the computational model of object substitution (CMOS) to explain their experimental results with sparse visual maskers. This model supposedly is based on reentrant hypotheses testing in the visual system, and the modeled experiments are believed to demonstrate these reentrant processes in human vision. In this study, I analyze the main assumptions of this model. I argue that CMOS is a version of the attentional gating model and that its relationship with reentrant processing is rather illusory. The fit of this model to the data indicates that reentrant hypotheses testing is not necessary for the explanation of object substitution masking (OSM). Further, the original CMOS cannot predict some important aspects of the experimental data. I test 2 new models incorporating an unselective processing (divided attention) stage; these models are more consistent with data from OSM experiments. My modeling shows that the apparent complexity of OSM can be reduced to a few simple and well-known mechanisms of perception and memory. PsycINFO Database Record (c) 2013 APA, all rights reserved.
NASA Technical Reports Server (NTRS)
1985-01-01
The primary objective of the Test Active Control Technology (ACT) System laboratory tests was to verify and validate the system concept, hardware, and software. The initial lab tests were open loop hardware tests of the Test ACT System as designed and built. During the course of the testing, minor problems were uncovered and corrected. Major software tests were run. The initial software testing was also open loop. These tests examined pitch control laws, wing load alleviation, signal selection/fault detection (SSFD), and output management. The Test ACT System was modified to interface with the direct drive valve (DDV) modules. The initial testing identified problem areas with DDV nonlinearities, valve friction induced limit cycling, DDV control loop instability, and channel command mismatch. The other DDV issue investigated was the ability to detect and isolate failures. Some simple schemes for failure detection were tested but were not completely satisfactory. The Test ACT System architecture continues to appear promising for ACT/FBW applications in systems that must be immune to worst case generic digital faults, and be able to tolerate two sequential nongeneric faults with no reduction in performance. The challenge in such an implementation would be to keep the analog element sufficiently simple to achieve the necessary reliability.
Some thoughts on the management of large, complex international space ventures
NASA Technical Reports Server (NTRS)
Lee, T. J.; Kutzer, Ants; Schneider, W. C.
1992-01-01
Management issues relevant to the development and deployment of large international space ventures are discussed, with particular attention given to previous experience. Management approaches utilized in the past are labeled as either simple or complex, and signs of efficient management are examined. Simple approaches include those in which experiments and subsystems are developed for integration into spacecraft; the Apollo-Soyuz Test Project is given as an example of a simple multinational approach. Complex approaches include those of ESA's Spacelab Project and the Space Station Freedom, in which functional interfaces cross agency and political boundaries. It is concluded that individual elements of space programs should be managed by the individual participating agencies, with overall configuration control coordinated by level and a program director acting to manage overall objectives and project interfaces.
Patrone, Tatiana
2017-11-01
The paper asks whether Kant's ethical theory can be applied to issues in assisted reproductive technology (ART). It argues against three objections to applying Kant's ethics to ART: (i) the non-identity objection, (ii) the gen-ethics objection, and (iii) the care-ethics objection. After showing that none of the three objections is sufficiently persuasive, the paper proposes a reading of Kant's 'formula of humanity,' and especially of its negative clause (i.e., the 'merely as means' clause), that can offer some guidance in ART. The paper concludes that although Kant's 'formula of humanity' cannot be used as a simple litmus test for determining whether an ART practice is morally permissible, it can nonetheless supply some guidance in our moral deliberation.
Afzal, Naveed; Sohn, Sunghwan; Abram, Sara; Scott, Christopher G.; Chaudhry, Rajeev; Liu, Hongfang; Kullo, Iftikhar J.; Arruda-Olson, Adelaide M.
2016-01-01
Objective Lower extremity peripheral arterial disease (PAD) is highly prevalent and affects millions of individuals worldwide. We developed a natural language processing (NLP) system for automated ascertainment of PAD cases from clinical narrative notes and compared the performance of the NLP algorithm to billing code algorithms, using ankle-brachial index (ABI) test results as the gold standard. Methods We compared the performance of the NLP algorithm to 1) results of gold standard ABI; 2) previously validated algorithms based on relevant ICD-9 diagnostic codes (simple model) and 3) a combination of ICD-9 codes with procedural codes (full model). A dataset of 1,569 PAD patients and controls was randomly divided into training (n= 935) and testing (n= 634) subsets. Results We iteratively refined the NLP algorithm in the training set including narrative note sections, note types and service types, to maximize its accuracy. In the testing dataset, when compared with both simple and full models, the NLP algorithm had better accuracy (NLP: 91.8%, full model: 81.8%, simple model: 83%, P<.001), PPV (NLP: 92.9%, full model: 74.3%, simple model: 79.9%, P<.001), and specificity (NLP: 92.5%, full model: 64.2%, simple model: 75.9%, P<.001). Conclusions A knowledge-driven NLP algorithm for automatic ascertainment of PAD cases from clinical notes had greater accuracy than billing code algorithms. Our findings highlight the potential of NLP tools for rapid and efficient ascertainment of PAD cases from electronic health records to facilitate clinical investigation and eventually improve care by clinical decision support. PMID:28189359
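The accuracy, PPV, and specificity figures quoted above follow from a standard confusion matrix against the ABI gold standard. A minimal sketch of those definitions (illustrative only, not the authors' code):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Accuracy, positive predictive value, and specificity from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)   # fraction of all calls that are correct
    ppv = tp / (tp + fp)                          # fraction of positive calls that are true
    specificity = tn / (tn + fp)                  # fraction of true negatives correctly called
    return accuracy, ppv, specificity
```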
Automation Hooks Architecture Trade Study for Flexible Test Orchestration
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Maclean, John R.; Graffagnino, Frank J.; McCartney, Patrick A.
2010-01-01
We describe the conclusions of a technology and communities survey, supported by concurrent and follow-on proof-of-concept prototyping, to evaluate the feasibility of defining a durable, versatile, reliable, visible software interface to support strategic modularization of test software development. The objective is that test sets and support software with diverse origins, ages, and abilities can be reliably integrated into test configurations that assemble, tear down, and reassemble with scalable complexity in order to conduct both parametric tests and monitored trial runs. The resulting approach is based on the integration of three recognized technologies that are currently gaining acceptance within the test industry and that, when combined, provide a simple, open, and scalable test orchestration architecture addressing the objectives of the Automation Hooks task. The technologies are automated discovery using multicast DNS Zero Configuration Networking (zeroconf), commanding and data retrieval using resource-oriented RESTful Web Services, and XML data transfer formats based on Automatic Test Markup Language (ATML). This open-source, standards-based approach provides direct integration with existing commercial off-the-shelf (COTS) analysis software tools.
Evaluation of seismic spatial interaction effects through an impact testing program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, B.D.; Driesen, G.E.
The consequences of non-seismically qualified objects falling and striking essential, seismically qualified objects are analytically difficult to assess. Analytical solutions to impact problems are conservative and available only for simple situations. In a nuclear facility, the numerous "sources" and "targets" requiring evaluation often have complex geometric configurations, which makes calculations and computer modeling difficult. Few industry or regulatory rules are available for this specialized assessment. A drop test program was recently conducted to "calibrate" the judgment of seismic qualification engineers who perform interaction evaluations and to further develop seismic interaction criteria. Impact tests on varying combinations of sources and targets were performed by dropping the sources from various heights onto targets that were connected to instruments. This paper summarizes the scope, test configurations, and some results of the drop test program. Force and acceleration time history data and general observations are presented on the ruggedness of various targets when subjected to impacts from different types of sources.
Validation of the thermal code of RadTherm-IR, IR-Workbench, and F-TOM
NASA Astrophysics Data System (ADS)
Schwenger, Frédéric; Grossmann, Peter; Malaplate, Alain
2009-05-01
System assessment by image simulation requires synthetic scenarios that can be viewed by the device to be simulated. In addition to physical modeling of the camera, reliable modeling of scene elements is necessary. Software products for modeling target data in the IR should be capable of (i) predicting surface temperatures of scene elements over a long period of time and (ii) computing sensor views of the scenario. For such applications, FGAN-FOM acquired the software products RadTherm-IR (ThermoAnalytics Inc., Calumet, USA) and IR-Workbench (OKTAL-SE, Toulouse, France). The accuracy of their simulation results must be validated before these products are used in applications. In the first validation step, the performance of both "thermal solvers" was determined by comparing the computed diurnal surface temperatures of a simple object with the corresponding measured values. CUBI, a rather simple geometric object with well-known material parameters, is suitable for testing and validating object models in the IR and was used in this study as a test body. Comparisons of calculated and measured surface temperature values are presented, together with results from the FGAN-FOM thermal object code F-TOM. In the second validation step, radiances of the simulated sensor views computed by RadTherm-IR and IR-Workbench are compared with radiances retrieved from images recorded by the sensor being simulated. Strengths and weaknesses of the models RadTherm-IR, IR-Workbench, and F-TOM are discussed.
Hypervelocity Impact (HVI). Volume 6; WLE High Fidelity Specimen Fg(RCC)-2
NASA Technical Reports Server (NTRS)
Gorman, Michael R.; Ziola, Steven M.
2007-01-01
During 2003 and 2004, the Johnson Space Center's White Sands Testing Facility in Las Cruces, New Mexico conducted hypervelocity impact tests on the space shuttle wing leading edge. Hypervelocity impact tests were conducted to determine if Micro-Meteoroid/Orbital Debris impacts could be reliably detected and located using simple passive ultrasonic methods. The objective of Target Fg(RCC)-2 was to study hypervelocity impacts through the reinforced carbon-carbon (RCC) panels of the Wing Leading Edge. Fiberglass was used in place of RCC in the initial tests. Impact damage was detected using lightweight, low power instrumentation capable of being used in flight.
Hypervelocity Impact (HVI). Volume 4; WLE Small-Scale Fiberglass Panel Flat Target C-2
NASA Technical Reports Server (NTRS)
Gorman, Michael R.; Ziola, Steven M.
2007-01-01
During 2003 and 2004, the Johnson Space Center's White Sands Testing Facility in Las Cruces, New Mexico conducted hypervelocity impact tests on the space shuttle wing leading edge. Hypervelocity impact tests were conducted to determine if Micro-Meteoroid/Orbital Debris impacts could be reliably detected and located using simple passive ultrasonic methods. The objective of Target C-2 was to study impacts through the reinforced carbon-carbon (RCC) panels of the Wing Leading Edge. Fiberglass was used in place of RCC in the initial tests. Impact damage was detected using lightweight, low power instrumentation capable of being used in flight.
Hypervelocity Impact (HVI). Volume 5; WLE High Fidelity Specimen Fg(RCC)-1
NASA Technical Reports Server (NTRS)
Gorman, Michael R.; Ziola, Steven M.
2007-01-01
During 2003 and 2004, the Johnson Space Center's White Sands Testing Facility in Las Cruces, New Mexico conducted hypervelocity impact tests on the space shuttle wing leading edge. Hypervelocity impact tests were conducted to determine if Micro-Meteoroid/Orbital Debris impacts could be reliably detected and located using simple passive ultrasonic methods. The objective of Target Fg(RCC)-1 was to study hypervelocity impacts through the reinforced carbon-carbon (RCC) panels of the Wing Leading Edge. Fiberglass was used in place of RCC in the initial tests. Impact damage was detected using lightweight, low power instrumentation capable of being used in flight.
Hypervelocity Impact (HVI). Volume 3; WLE Small-Scale Fiberglass Panel Flat Target C-1
NASA Technical Reports Server (NTRS)
Gorman, Michael R.; Ziola, Steven M.
2007-01-01
During 2003 and 2004, the Johnson Space Center's White Sands Testing Facility in Las Cruces, New Mexico conducted hypervelocity impact tests on the space shuttle wing leading edge. Hypervelocity impact tests were conducted to determine if Micro-Meteoroid/Orbital Debris impacts could be reliably detected and located using simple passive ultrasonic methods. The objective of Target C-1 was to study hypervelocity impacts on the reinforced carbon-carbon (RCC) panels of the Wing Leading Edge. Fiberglass was used in place of RCC in the initial tests. Impact damage was detected using lightweight, low power instrumentation capable of being used in flight.
A Rigorous Framework for Optimization of Expensive Functions by Surrogates
NASA Technical Reports Server (NTRS)
Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.
1998-01-01
The goal of the research reported here is to develop rigorous optimization algorithms for engineering design problems in which application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and for managing the use of these approximations as surrogates for optimization. The result is convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
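The surrogate idea, approximating an expensive objective and optimizing the cheap approximation instead, can be illustrated with a toy one-dimensional loop. This sketch is ours, not the paper's pattern-search framework; all names and parameters are hypothetical.

```python
import numpy as np

def surrogate_minimize(f, lo, hi, n_init=5, iters=10, seed=0):
    """Toy 1-D surrogate-based minimization: fit a quadratic surrogate to all
    samples, move to the surrogate's minimizer on a grid, re-evaluate the true
    (expensive) function there, and repeat."""
    rng = np.random.default_rng(seed)
    xs = list(rng.uniform(lo, hi, n_init))     # initial design points
    ys = [f(x) for x in xs]                    # expensive evaluations
    grid = np.linspace(lo, hi, 401)
    for _ in range(iters):
        coeffs = np.polyfit(xs, ys, 2)         # cheap quadratic surrogate
        x_next = float(grid[np.argmin(np.polyval(coeffs, grid))])
        xs.append(x_next)
        ys.append(f(x_next))                   # one new expensive evaluation
    i = int(np.argmin(ys))
    return xs[i], ys[i]
```

The framework in the paper manages surrogates far more carefully (and provably converges under simple constraints); this sketch only shows the evaluate-fit-optimize cycle.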
Image quality evaluation of full reference algorithm
NASA Astrophysics Data System (ADS)
He, Nannan; Xie, Kai; Li, Tong; Ye, Yushan
2018-03-01
Image quality evaluation is a classic research topic; the goal is to design an algorithm whose evaluation values are consistent with subjective impressions. This paper mainly introduces several typical full-reference objective evaluation methods: Mean Squared Error (MSE), Peak Signal to Noise Ratio (PSNR), Structural Similarity Image Metric (SSIM), and Feature Similarity (FSIM). The different evaluation methods are tested in Matlab, and their advantages and disadvantages are obtained by analysis and comparison. MSE and PSNR are simple, but they do not incorporate human visual system (HVS) characteristics into image quality evaluation, so their results are not ideal. SSIM correlates well with subjective judgments and is simple to compute because it takes the human visual effect into account; however, the SSIM method is based on a hypothesis, so its results are limited. The FSIM method can be used to test both gray and color images, with better results. Experimental results show that the new image quality evaluation algorithm based on FSIM is more accurate.
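The two simplest full-reference measures discussed, MSE and PSNR, can be written directly from their definitions; a sketch in Python rather than the paper's Matlab (SSIM and FSIM require windowed local statistics and are omitted):

```python
import numpy as np

def mse(ref, test):
    """Mean squared error between a reference image and a test image."""
    ref = np.asarray(ref, dtype=np.float64)
    test = np.asarray(test, dtype=np.float64)
    return float(np.mean((ref - test) ** 2))

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    m = mse(ref, test)
    return float('inf') if m == 0 else 10.0 * np.log10(peak ** 2 / m)
```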
NASA Astrophysics Data System (ADS)
Georgiev, Bozhidar; Georgieva, Adriana
2013-12-01
In this paper, some possibilities concerning the implementation of test-driven development as a programming method are presented. A different point of view on creating advanced programming techniques is offered: build tests before the program source, with all necessary software tools and modules. This nontraditional approach of easing the programmer's work by building tests first is a preferable way of software development. It allows comparatively simple programming (applicable in different object-oriented programming languages, for example JAVA, XML, PYTHON, etc.). It is a predictable way to develop software tools and helps in creating better software that is also easier to maintain. Test-driven programming is able to replace more complicated conventional paradigms used by many programmers.
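The test-first workflow described can be illustrated with a minimal example; the `slugify` function and its test are hypothetical, and the snippet is in Python (one of the languages the abstract lists). In TDD the test case is written first and fails until the implementation below it is added.

```python
import unittest

class TestSlugify(unittest.TestCase):
    """Written first, before slugify() exists: the test specifies the behavior."""
    def test_basic(self):
        self.assertEqual(slugify("Test Driven Development"),
                         "test-driven-development")

def slugify(title):
    """Implementation added afterwards to make the test pass:
    lower-case, hyphen-separated slug."""
    return "-".join(title.lower().split())

# Run the suite programmatically (avoids unittest.main(), which would exit
# the interpreter when the block is executed as a script).
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestSlugify))
```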
Schmidt, Filipp; Weber, Andreas; Schmidt, Thomas
2014-08-21
Most objects can be recognized easily even when they are partly occluded. This also holds when several overlapping objects share the same surface features (self-splitting objects), which is an illustration of the grouping principle of Good Gestalt. We employed outline and filled contour stimuli in a primed flanker task to test whether the processing of self-splitting objects is in accordance with a simple feedforward model. We obtained priming effects in response time and response force for both types of stimuli, even when increasing the number of occluders up to three. The results for outline contours were in full accordance with a feedforward account. This was not the case for the results for filled contours (i.e., for self-splitting objects), especially under conditions of strong occlusion. We conclude that the implementation of the Good Gestalt principle is fast but still based on recurrent processing. © 2014 ARVO.
NASA Astrophysics Data System (ADS)
Wang, Xuan-yu; Hu, Rui; Wang, Rui-xin
2015-10-01
A simple method has been developed to quickly test emissivity with an infrared thermal imaging system over a small distance, according to the theory of infrared temperature measurement, which is based on the Planck radiation law and the Lambert-Beer law. The object's temperature is raised and held by a heater so that a temperature difference forms between the target and the environment. The emissivities of human skin, galvanized iron plate, black rubber, and liquid water were tested with the emissivity setting at 1.0 and a testing distance of 1 m. Exploiting the constancy of human body temperature, a testing curve was established describing how the thermal-imaging temperature varies as the emissivity setting is swept from 0.9 to 1.0; in this way the method was verified. The test results show that the emissivity of human skin is 0.95. The emissivities of galvanized iron plate, black rubber, and liquid water decrease as the object's temperature increases; the emissivity of galvanized iron plate is far smaller than that of human skin, black rubber, or water, and the emissivity of water decreases slowly and linearly with increasing temperature. The study shows that, over a small distance and in clean atmosphere, the infrared emissivity of objects may be conveniently tested with an infrared thermal imaging system by raising the object's temperature above the environment temperature and then simultaneously measuring the environmental temperature, the real temperature, and the thermal-imaging temperature of the object with the emissivity set to 1.0 and the testing distance at 1.0 m.
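The emissivity recovery described above can be sketched under a broadband Stefan-Boltzmann approximation (real imagers integrate over a finite spectral band, so this is an idealisation; the function name and signature are illustrative, not from the paper):

```python
def emissivity_from_apparent_temp(t_apparent, t_true, t_env):
    """Estimate emissivity from the temperature the imager reports with its
    emissivity setting at 1.0 (t_apparent), the object's real temperature
    (t_true), and the ambient temperature (t_env), all in kelvin.

    Broadband approximation: measured radiance ~ eps*T_true^4 +
    (1 - eps)*T_env^4 (reflected ambient), which an imager set to eps = 1.0
    reports as T_apparent^4.  Solving for eps gives the ratio below.
    """
    return (t_apparent ** 4 - t_env ** 4) / (t_true ** 4 - t_env ** 4)
```

Plugging a synthetic skin measurement (true temperature 310 K, ambient 293 K, emissivity 0.95) back through this formula recovers the 0.95 value, which is the consistency the paper's testing curve exploits.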
The memorial consequences of multiple-choice testing.
Marsh, Elizabeth J; Roediger, Henry L; Bjork, Robert A; Bjork, Elizabeth L
2007-04-01
The present article addresses whether multiple-choice tests may change knowledge even as they attempt to measure it. Overall, taking a multiple-choice test boosts performance on later tests, as compared with non-tested control conditions. This benefit is not limited to simple definitional questions, but holds true for SAT II questions and for items designed to tap concepts at a higher level in Bloom's (1956) taxonomy of educational objectives. Students, however, can also learn false facts from multiple-choice tests; testing leads to persistence of some multiple-choice lures on later general knowledge tests. Such persistence appears due to faulty reasoning rather than to an increase in the familiarity of lures. Even though students may learn false facts from multiple-choice tests, the positive effects of testing outweigh this cost.
Uncertainty-Based Multi-Objective Optimization of Groundwater Remediation Design
NASA Astrophysics Data System (ADS)
Singh, A.; Minsker, B.
2003-12-01
Management of groundwater contamination is a cost-intensive undertaking filled with conflicting objectives and substantial uncertainty. A critical source of this uncertainty in groundwater remediation design problems comes from the hydraulic conductivity values for the aquifer, upon which the prediction of flow and transport of contaminants depends. For a remediation solution to be reliable in practice, it is important that it is robust to the potential error in the model predictions. This work focuses on incorporating such uncertainty within a multi-objective optimization framework, to obtain solutions that are both reliable and Pareto optimal. Previous research has shown that small amounts of sampling within a single-objective genetic algorithm can produce highly reliable solutions. With multiple objectives, however, the noise can interfere with the basic operations of a multi-objective solver, such as determining non-domination of individuals, diversity preservation, and elitism. This work proposes several approaches to improve the performance of noisy multi-objective solvers. These include a simple averaging approach, taking samples across the population (which we call extended averaging), and a stochastic optimization approach. All the approaches are tested on standard multi-objective benchmark problems and a hypothetical groundwater remediation case study; the best-performing approach is then tested on a field-scale case at Umatilla Army Depot.
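The "simple averaging" approach can be sketched as follows: each noisy objective vector is replaced by a sample mean before the usual Pareto non-domination check. This is a minimal illustration of the idea, not the authors' solver:

```python
import numpy as np

def averaged_objectives(evaluate, solution, n_samples=20):
    """Simple averaging for noisy objectives: evaluate the solution several
    times and use the sample mean in later non-domination checks."""
    samples = np.array([evaluate(solution) for _ in range(n_samples)])
    return samples.mean(axis=0)

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimisation):
    no worse in every objective and strictly better in at least one."""
    f_a, f_b = np.asarray(f_a), np.asarray(f_b)
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))
```

With enough samples the averaged vectors recover the true dominance relation that single noisy evaluations can scramble; the paper's extended-averaging and stochastic variants refine how those samples are spent.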
Moscow Test Well, INEL Oversight Program: Aqueous geochemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurry, M.; Fromm, J.; Welhan, J.
1992-09-29
This report presents a summary and interpretation of data gathered during sampling of the Moscow Test Well at Moscow, Idaho during April and May of 1992. The principal objectives of this chemical survey were to validate sampling procedures with a new straddle packer sampling tool in a hydrologically simple, previously well-characterized sampling environment, and to compare analytical results from two independent labs for reproducibility. Analytes included a wide range of metals, anions, nutrients, BNAs, and VOCs. Secondary objectives included analyzing waters from a large distilled water tank (utilized for all field laboratory purposes as "pure" stock water), water which passed through a steamer used to clean the packer, and rinsates from the packer tool itself before it was lowered into the test well. Analyses were also obtained of blanks and spikes for data validation purposes.
Model of ballistic targets' dynamics used for trajectory tracking algorithms
NASA Astrophysics Data System (ADS)
Okoń-Fąfara, Marta; Kawalec, Adam; Witczak, Andrzej
2017-04-01
Only a few ballistic-object tracking algorithms are known. To develop such algorithms and to test them further, it is necessary to implement a reasonably simple and reliable model of the objects' dynamics. The article presents the dynamics model of a tactical ballistic missile (TBM), including the three stages of flight: the boost stage and two passive stages, the ascending one and the descending one. Additionally, the procedure for transformation from the local coordinate system to the polar radar-oriented and global systems is presented. The prepared theoretical data may be used to determine tracking algorithm parameters and for their further verification.
ERIC Educational Resources Information Center
Munyofu, Mine
2008-01-01
The purpose of this study was to examine the instructional effectiveness of different levels of chunking (simple visual/text and complex visual/text), different forms of feedback (item-by-item feedback, end-of-test feedback and no feedback), and use of instructional gaming (game and no game) in complementing animated programmed instruction on a…
Approach to the patient with dysphagia.
Abdel Jalil, Ala' A; Katzka, David A; Castell, Donald O
2015-10-01
Dysphagia is a fascinating symptom. It is ostensibly simple when defined by trouble swallowing, yet its subtleties in deciphering and its variations in pathophysiology almost mandate a thorough knowledge of medicine itself. With patience and careful questioning, a multitude of various disorders may be suggested before an objective test is performed. Indeed, the ability to diligently and comprehensively explore the symptom of dysphagia is not only rewarding but also a real test for a physician who prides himself or herself on good history taking. Copyright © 2015 Elsevier Inc. All rights reserved.
The design of an intelligent human-computer interface for the test, control and monitor system
NASA Technical Reports Server (NTRS)
Shoaff, William D.
1988-01-01
The graphical intelligence and assistance capabilities of a human-computer interface for the Test, Control, and Monitor System at Kennedy Space Center are explored. The report focuses on how a particular commercial off-the-shelf graphical software package, Data Views, can be used to produce tools that build widgets such as menus, text panels, graphs, icons, windows, and ultimately complete interfaces for monitoring data from an application; controlling an application by providing input data to it; and testing an application by both monitoring and controlling it. A complete set of tools for building interfaces is described in a manual for the TCMS toolkit. Simple tools create primitive widgets such as lines, rectangles and text strings. Intermediate level tools create pictographs from primitive widgets, and connect processes to either text strings or pictographs. Other tools create input objects; Data Views supports output objects directly, thus output objects are not considered. Finally, a set of utilities for executing, monitoring use, editing, and displaying the content of interfaces is included in the toolkit.
Testing Installed Propulsion for Shielded Exhaust Configurations
NASA Technical Reports Server (NTRS)
Bridges, James E.; Podboy, Gary G.; Brown, Clifford A.
2016-01-01
Jet-surface interaction (JSI) can be a significant factor in the exhaust noise of installed propulsion systems. Tests to further the understanding and prediction of the acoustic impacts of JSI have been described. While there were many objectives for the test, the overall objective was to prepare for a future test validating the design of a low-noise, low-boom supersonic commercial airliner. In this paper we explore design requirements for a partial aircraft model to be used in subscale acoustic testing, especially focusing on the amount of aircraft body that must be included to produce the acoustic environment between propulsion exhaust system and observer. We document the dual-stream jets, both nozzle and flow conditions, which were tested to extend JSI acoustic modeling from simple single-stream jets to realistic dual-stream exhaust nozzles. Sample observations are provided of changes to far-field sound as surface geometry and flow conditions were varied. Initial measurements are presented for integrating the propulsion on the airframe for a supersonic airliner with simulated airframe geometries and nozzles. Acoustic impacts of installation were modest, resulting in variations of less than 3 EPNdB in most configurations.
Testing Installed Propulsion For Shielded Exhaust Configurations
NASA Technical Reports Server (NTRS)
Bridges, James; Podboy, Gary G.; Brown, Clifford A.
2016-01-01
Jet-surface interaction (JSI) can be a significant factor in the exhaust noise of installed propulsion. Tests to further understanding and prediction of the acoustic impacts of JSI have been described. While there were many objectives for the NASA JSI1044 test, the overall objective was to prepare for a 2016 test validating the design of a low-noise, low-boom supersonic commercial airliner. In this paper we explore design requirements for a partial aircraft model to be used in subscale acoustic testing, especially focusing on the amount of shielding surface that must be provided to simulate the acoustic environment between propulsion exhaust system and observer. We document the dual-stream jets, both nozzle and flow conditions, which were tested to extend JSI acoustic modeling from simple single-stream jets to realistic dual-stream exhaust nozzles. Examples are provided of observations made as surface geometry and flow conditions were varied, and we present initial measurements of the installation impacts of integrating the propulsion on the airframe for a supersonic airliner with realistic airframe geometries and nozzles.
NASA Technical Reports Server (NTRS)
Matney, Mark
2011-01-01
A number of statistical tools have been developed over the years for assessing the risk of reentering objects to human populations. These tools make use of the characteristics (e.g., mass, material, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. Because this information is used in making policy and engineering decisions, it is important that these assumptions be tested using empirical data. This study uses the latest database of known uncontrolled reentry locations measured by the United States Department of Defense. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave basically like simple Kepler orbits. However, there are a number of factors in the final stages of reentry - including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere - that could cause them to diverge from simple Kepler orbit behavior and possibly change the probability of reentering over a given location. In this paper, the measured latitude and longitude distributions of these objects are directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.
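For a circular Kepler orbit with inclination i that reenters at a uniformly random point along its track, the predicted latitude density is the standard cos(phi) / (pi * sqrt(sin^2 i - sin^2 phi)) for |phi| < i. The sketch below (illustrative, not the paper's code) gives that predicted density together with a Monte Carlo sampler that measured reentry latitudes could be compared against:

```python
import math
import random

def latitude_pdf(phi, inclination):
    """Predicted latitude density (per radian) of reentry locations for a
    simple circular Kepler orbit, uniform in time along the ground track."""
    s_i, s_p = math.sin(inclination), math.sin(phi)
    if abs(s_p) >= s_i:
        return 0.0                      # no coverage above the inclination
    return math.cos(phi) / (math.pi * math.sqrt(s_i ** 2 - s_p ** 2))

def sample_latitudes(inclination, n, seed=0):
    """Monte Carlo latitudes: argument of latitude u uniform on [0, 2*pi),
    with sin(lat) = sin(i) * sin(u) for a circular orbit."""
    rng = random.Random(seed)
    return [math.asin(math.sin(inclination) *
                      math.sin(rng.uniform(0.0, 2.0 * math.pi)))
            for _ in range(n)]
```

The density integrates to 1 over (-i, i) and peaks sharply near the extreme latitudes, which is why uncontrolled reentries concentrate near the inclination bands in the DoD database.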
Development of a sonar-based object recognition system
NASA Astrophysics Data System (ADS)
Ecemis, Mustafa Ihsan
2001-02-01
Sonars are used extensively in mobile robotics for obstacle detection, ranging and avoidance. However, these range-finding applications do not exploit the full range of information carried in sonar echoes. In addition, mobile robots need robust object recognition systems. Therefore, a simple and robust object recognition system using ultrasonic sensors may have a wide range of applications in robotics. This dissertation develops and analyzes an object recognition system that uses ultrasonic sensors of the type commonly found on mobile robots. Three principal experiments are used to test the sonar recognition system: object recognition at various distances, object recognition during unconstrained motion, and softness discrimination. The hardware setup, consisting of an inexpensive Polaroid sonar and a data acquisition board, is described first. The software for ultrasound signal generation, echo detection, data collection, and data processing is then presented. Next, the dissertation describes two methods to extract information from the echoes, one in the frequency domain and the other in the time domain. The system uses the fuzzy ARTMAP neural network to recognize objects on the basis of the information content of their echoes. In order to demonstrate that the performance of the system does not depend on the specific classification method being used, the K-Nearest Neighbors (KNN) algorithm is also implemented. KNN yields a test accuracy similar to fuzzy ARTMAP in all experiments. Finally, the dissertation describes a method for extracting features from the envelope function in order to reduce the dimension of the input vector used by the classifiers. Decreasing the size of the input vectors reduces the memory requirements of the system and makes it run faster. It is shown that this method does not affect the performance of the system dramatically and is more appropriate for some tasks.
The results of these experiments demonstrate that sonar can be used to develop a low-cost, low-computation system for real-time object recognition tasks on mobile robots. This system differs from all previous approaches in that it is relatively simple, robust, fast, and inexpensive.
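The KNN baseline used in the dissertation is straightforward; a minimal sketch over generic echo feature vectors follows (the feature values and object labels are made up for illustration):

```python
import numpy as np

def knn_classify(train_x, train_y, query, k=3):
    """K-Nearest-Neighbors classification of an echo feature vector:
    majority vote among the k training vectors closest in Euclidean
    distance -- the simple alternative classifier run alongside fuzzy
    ARTMAP to show the results do not hinge on the classifier choice."""
    d = np.linalg.norm(np.asarray(train_x, dtype=float) -
                       np.asarray(query, dtype=float), axis=1)
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(np.asarray(train_y)[nearest],
                               return_counts=True)
    return labels[np.argmax(counts)]
```

In the dissertation's pipeline the feature vectors would come from the echo envelope (time- or frequency-domain features); here any numeric vectors stand in for them.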
Dai, Meiling; Yang, Fujun; He, Xiaoyuan
2012-04-20
A simple but effective fringe projection profilometry method is proposed to measure 3D shape using a single snapshot of a color sinusoidal fringe pattern. One color fringe pattern, encoding a sinusoidal fringe (as the red component) and a uniform intensity pattern (as the blue component), is projected by a digital video projector, and the deformed fringe pattern is recorded by a color CCD camera. The captured color fringe pattern is separated into its RGB components, and a division operation is applied to the red and blue channels to reduce the variable reflection intensity. Shape information of the tested object is decoded by applying an arcsine algorithm to the normalized fringe pattern with subpixel resolution. In the case of fringe discontinuities caused by height steps or spatially isolated surfaces, the separated blue component is binarized and used to correct the phase demodulation. A simple and robust method is also introduced to compensate for the nonlinear intensity response of the digital video projector. The experimental results demonstrate the validity of the proposed method.
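The normalisation and arcsine decoding steps can be sketched as below. This is a simplified single-period illustration: real data also need phase unwrapping and the binarized blue-channel mask for discontinuities, both omitted here:

```python
import numpy as np

def demodulate_arcsin(red, blue, eps=1e-6):
    """Sketch of the normalisation step: divide the sinusoidal red channel
    by the uniform blue channel to cancel surface reflectivity, rescale the
    ratio to [-1, 1], and decode the wrapped phase with arcsine."""
    ratio = red.astype(np.float64) / np.maximum(blue.astype(np.float64), eps)
    lo, hi = ratio.min(), ratio.max()
    normalised = 2.0 * (ratio - lo) / (hi - lo) - 1.0   # rescale to [-1, 1]
    return np.arcsin(normalised)                         # wrapped phase
```

On a synthetic fringe whose phase stays within one arcsine branch, this recovers the phase exactly; the paper's full method extends it across branches and discontinuities.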
Gobbi, Erica; Elliot, Catherine; Varnier, Maurizio; Carraro, Attilio
2016-01-01
The purpose of this research was to assess an Italian version of the Physical Activity Questionnaire for Older Children (PAQ-C-It). Three separate studies were conducted, testing general psychometric properties, construct validity, concurrent validity, and the factor structure of the PAQ-C-It among general and clinical pediatric populations. Study 1 (n = 1170) examined the psychometric properties, internal consistency, factor structure (exploratory factor analysis, EFA) and construct validity with enjoyment perception during physical activity. Study 2 (n = 59) reported on reliability, construct validity with enjoyment and BMI, and on cross-sectional concurrent validity with objectively measured MVPA (tri-axial accelerometry) over the span of seven consecutive days. Study 3 (n = 58) examined the PAQ-C-It reliability and construct validity with BMI and VO2max as the objective measurement among a population of children with congenital heart defects (CHD). In studies 2 and 3, the factor structure of the PAQ-C-It was re-examined with an EFA. The PAQ-C-It showed acceptable to good reliability (alpha .70 to .83). Results on construct validity showed moderate but significant associations with enjoyment perception (r = .30 and .36), with BMI (r = -.30 and -.79 for the CHD simple form), and with VO2max (r = .55 for the CHD simple form). Significant concurrent validity with the objectively measured MVPA was reported (rho = .30, p < .05). Findings of the EFA suggested a two-factor structure for the PAQ-C-It, with items 2, 3, and 4 contributing little to the total score. This study supports the PAQ-C-It as an appropriate instrument to assess the MVPA levels of Italian children, including children with simple forms of CHD, and supports the instrument's potential usefulness, from a broad international perspective, for standardizing data gathering across the globe.
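The internal-consistency statistic reported here (alpha .70 to .83) is Cronbach's alpha, which is simple to compute from a respondents-by-items score matrix; a generic sketch, not the authors' analysis code:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=np.float64)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of row totals
    return k / (k - 1) * (1.0 - item_var / total_var)
```

Perfectly correlated items give alpha = 1; values in the .70-.83 range, as reported for the PAQ-C-It, indicate acceptable to good internal consistency.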
Hypervelocity Impact (HVI). Volume 8; Tile Small Targets A-1, Ag-1, B-1, and Bg-1
NASA Technical Reports Server (NTRS)
Gorman, Michael R.; Ziola, Steven M.
2007-01-01
During 2003 and 2004, the Johnson Space Center's White Sands Testing Facility in Las Cruces, New Mexico conducted hypervelocity impact tests on the space shuttle wing leading edge. Hypervelocity impact tests were conducted to determine if Micro-Meteoroid/Orbital Debris impacts could be reliably detected and located using simple passive ultrasonic methods. The objective of Targets A-1, Ag-1, B-1, and Bg-1 was to study hypervelocity impacts on the reinforced Shuttle Heat Shield Tiles of the Wing. Impact damage was detected using lightweight, low power instrumentation capable of being used in flight.
NASA Technical Reports Server (NTRS)
Gorman, Michael R.; Ziola, Steven M.
2007-01-01
During 2003 and 2004, the Johnson Space Center's White Sands Testing Facility in Las Cruces, New Mexico conducted hypervelocity impact tests on the space shuttle wing leading edge. Hypervelocity impact tests were conducted to determine if Micro-Meteoroid/Orbital Debris impacts could be reliably detected and located using simple passive ultrasonic methods. The objective of Targets A-1, A-2, and B-2 was to study hypervelocity impacts through multi-layered panels simulating Whipple shields on spacecraft. Impact damage was detected using lightweight, low power instrumentation capable of being used in flight.
Behrmann, Marlene; Peterson, Mary A; Moscovitch, Morris; Suzuki, Satoru
2006-10-01
Whether objects are represented as a collection of parts whose relations are coded independently remains a topic of ongoing discussion among theorists in the domain of shape perception. S. M., an individual with integrative agnosia, and neurologically intact ("normal") individuals learned initially to identify 4 target objects constructed of 2 simple volumetric parts. At test, the targets were mixed with distractors, some of which could be discriminated from the targets on the basis of a mismatching part, whereas the rest could be discriminated only on the basis of the altered spatial arrangements of parts. S. M. learned to identify the target objects, although at a rate slower than that of the normal participants. At test, he correctly rejected distractors on the basis of mismatching parts but was profoundly impaired at rejecting distractors made of the same local components but with mismatching spatial arrangements. These results suggest that encoding the spatial arrangements of parts of an object requires a mechanism that is different from that required for encoding the shape of individual parts, with the former selectively compromised in integrative agnosia. Copyright 2006 APA.
Object permanence in common marmosets (Callithrix jacchus).
Mendes, Natacha; Huber, Ludwig
2004-03-01
A series of 9 search tasks corresponding to the Piagetian Stages 3-6 of object permanence were administered to 11 common marmosets (Callithrix jacchus). Success rates varied strongly among tasks and marmosets, but the performances of most subjects were above chance level on the majority of tasks of visible and invisible displacements. Although up to 24 trials were administered in the tests, subjects did not improve their performance across trials. Errors were due to preferences for specific locations or boxes, simple search strategies, and attentional deficits. The performances of at least 2 subjects that achieved very high scores up to the successive invisible displacement task suggest that this species is able to represent the existence and the movements of unperceived objects. ((c) 2004 APA, all rights reserved)
Incorporating uncertainty into medical decision making: an approach to unexpected test results.
Bianchi, Matt T; Alexander, Brian M; Cash, Sydney S
2009-01-01
The utility of diagnostic tests derives from the ability to translate the population concepts of sensitivity and specificity into information that will be useful for the individual patient: the predictive value of the result. As the array of available diagnostic testing broadens, there is a temptation to de-emphasize history and physical findings and defer to the objective rigor of technology. However, diagnostic test interpretation is not always straightforward. One significant barrier to routine use of probability-based test interpretation is the uncertainty inherent in pretest probability estimation, the critical first step of Bayesian reasoning. The context in which this uncertainty presents the greatest challenge is when test results oppose clinical judgment. It is this situation when decision support would be most helpful. The authors propose a simple graphical approach that incorporates uncertainty in pretest probability and has specific application to the interpretation of unexpected results. This method quantitatively demonstrates how uncertainty in disease probability may be amplified when test results are unexpected (opposing clinical judgment), even for tests with high sensitivity and specificity. The authors provide a simple nomogram for determining whether an unexpected test result suggests that one should "switch diagnostic sides." This graphical framework overcomes the limitation of pretest probability uncertainty in Bayesian analysis and guides decision making when it is most challenging: interpretation of unexpected test results.
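The Bayesian update at the heart of this argument is compact; the sketch below (illustrative function name, not from the article) shows how even a good test that contradicts a confident clinical judgment leaves substantial residual uncertainty:

```python
def post_test_probability(pretest_prob, sensitivity, specificity, positive):
    """Bayes' theorem for diagnostic tests: convert the pretest probability
    to odds, multiply by the likelihood ratio of the observed result, and
    convert back to a probability (the post-test predictive value)."""
    if positive:
        lr = sensitivity / (1.0 - specificity)       # LR+ for a positive test
    else:
        lr = (1.0 - sensitivity) / specificity       # LR- for a negative test
    prior_odds = pretest_prob / (1.0 - pretest_prob)
    post_odds = prior_odds * lr
    return post_odds / (1.0 + post_odds)
```

For example, with a pretest probability of 0.9, a negative result from a test with 95% sensitivity and 95% specificity lowers the probability only to about 0.32, not near zero, illustrating why unexpected results opposing clinical judgment demand the extra care the authors describe.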
The validity of a simple outcome measure to assess stuttering therapy.
Huinck, Wendy; Rietveld, Toni
2007-01-01
The validity of a simple and not time-consuming self-assessment (SA) scale was tested to establish progress during or after stuttering therapy. The scores on the SA scale were related to (1) objective measures (percentage of stuttered syllables and syllables per minute) and (2) (self-)evaluation tests (self-evaluation questionnaires and perceptual evaluations, or judgments of disfluency, naturalness, and comfort by naïve listeners). Data were collected from two groups of stutterers at four measurement times: pretherapy, posttherapy, 12 months after therapy, and 24 months after therapy. The first group attended the Comprehensive Stuttering Program, an integrated program based on fluency-shaping techniques, and the second group participated in a Dutch group therapy, the Doetinchem Method, which focuses on emotions and cognitions related to stuttering. Results showed similar score patterns over time on the SA scale, the self-evaluation questionnaires, and the objective measures, and significant correlations between the SA scale and syllables per minute, percentage of stuttered syllables, the Struggle subscale of the Perceptions of Stuttering Inventory, and judged fluency on the T1-T2 difference scores. We concluded that the validity of the SA measure was supported and therefore encourage the use of such an instrument when (stuttering) treatment efficacy is studied.
THE ROLE OF THE HIPPOCAMPUS IN OBJECT DISCRIMINATION BASED ON VISUAL FEATURES.
Levcik, David; Nekovarova, Tereza; Antosova, Eliska; Stuchlik, Ales; Klement, Daniel
2018-06-07
The role of the rodent hippocampus has been intensively studied in different cognitive tasks. However, its role in the discrimination of objects remains controversial due to conflicting findings. We tested whether the number and type of features available for the identification of objects might affect the strategy (hippocampal-independent vs. hippocampal-dependent) that rats adopt to solve object discrimination tasks. We trained rats to discriminate 2D visual objects presented on a computer screen. The objects were defined either by their shape only or by multiple features (a combination of filling pattern and brightness in addition to the shape). Our data showed that objects displayed as simple geometric shapes are not discriminated by trained rats after their hippocampi have been bilaterally inactivated by the GABA-A agonist muscimol. On the other hand, objects containing a specific combination of non-geometric features in addition to the shape are discriminated even without the hippocampus. Our results suggest that the involvement of the hippocampus in visual object discrimination depends on the abundance of the object's features. Copyright © 2018. Published by Elsevier Inc.
Mental Representation of Spatial Cues During Spaceflight (3D-SPACE)
NASA Astrophysics Data System (ADS)
Clement, Gilles; Lathan, Corinna; Skinner, Anna; Lorigny, Eric
2008-06-01
The 3D-SPACE experiment is a joint effort between ESA and NASA to develop a simple virtual reality platform to enable astronauts to complete a series of tests while aboard the International Space Station (ISS). These tests will provide insights into the effects of the space environment on: (a) depth perception, by presenting 2D geometric illusions and 3D objects that subjects adjust with a finger trackball; (b) distance perception, by presenting natural or computer-generated 3D scenes where subjects estimate and report absolute distances or adjust distances; and (c) handwriting/drawing, by analyzing trajectories and velocities when subjects write or draw memorized objects with an electronic pen on a digitizing tablet. The objective of these tasks is to identify problems associated with 3D perception in astronauts, with the goal of developing countermeasures to alleviate any associated performance risks. The equipment was uploaded to the ISS in April 2008, and the first measurements should take place during Increment 17.
The Grid File: A Data Structure Designed to Support Proximity Queries on Spatial Objects.
1983-06-01
dimensional space. The technique to be presented for storing spatial objects works for any choice of parameters by which simple objects can be represented. However, depending on characteristics of the data to be processed, some choices of parameters are better than others. Let us discuss some considerations that may determine the choice of parameters. 1) Distinction between location parameters and extension parameters: for some classes of simple objects it
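The core idea of bucketing objects by their parameter values can be illustrated with a fixed uniform grid over point objects. This is a deliberately simplified sketch: the actual grid file adapts its grid lines to the data and maps buckets to disk pages, which this toy structure does not:

```python
from collections import defaultdict

class SimpleGrid:
    """Toy grid index for proximity queries: points (the simplest 'simple
    objects', described by location parameters only) are hashed into
    fixed-size cells, so a proximity query inspects only the query point's
    cell and its neighbours instead of the whole data set."""

    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def _key(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def insert(self, x, y):
        self.cells[self._key(x, y)].append((x, y))

    def near(self, x, y):
        """All stored points in the query cell and its 8 neighbouring cells;
        candidates for any proximity predicate with radius <= cell_size."""
        cx, cy = self._key(x, y)
        out = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                out.extend(self.cells.get((cx + dx, cy + dy), []))
        return out
```

Extended objects would add extension parameters (e.g., a rectangle's widths) alongside the location parameters, turning each object into a point in a higher-dimensional parameter space, which is the representation the report builds on.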
Size Constancy in Bat Biosonar? Perceptual Interaction of Object Aperture and Distance
Heinrich, Melina; Wiegrebe, Lutz
Size constancy in bat biosonar? Perceptual interaction of object aperture and distance.
Heinrich, Melina; Wiegrebe, Lutz
2013-01-01
Perception and encoding of object size is an important feature of sensory systems. In the visual system object size is encoded by the visual angle (visual aperture) on the retina, but the aperture depends on the distance of the object. As object distance is not unambiguously encoded in the visual system, higher computational mechanisms are needed. This phenomenon is termed "size constancy". It is assumed to reflect an automatic re-scaling of visual aperture with perceived object distance. Recently, it was found that in echolocating bats, the 'sonar aperture', i.e., the range of angles from which sound is reflected from an object back to the bat, is unambiguously perceived and neurally encoded. Moreover, it is well known that object distance is accurately perceived and explicitly encoded in bat sonar. Here, we addressed size constancy in bat biosonar, recruiting virtual-object techniques. Bats of the species Phyllostomus discolor learned to discriminate two simple virtual objects that only differed in sonar aperture. Upon successful discrimination, test trials were randomly interspersed using virtual objects that differed in both aperture and distance. It was tested whether the bats spontaneously assigned absolute width information to these objects by combining distance and aperture. The results showed that while the isolated perceptual cues encoding object width, aperture, and distance were all perceptually well resolved by the bats, the animals did not assign absolute width information to the test objects. This lack of sonar size constancy may result from the bats relying on different modalities to extract size information at different distances. Alternatively, it is conceivable that familiarity with a behaviorally relevant, conspicuous object is required for sonar size constancy, as it has been argued for visual size constancy. Based on the current data, it appears that size constancy is not necessarily an essential feature of sonar perception in bats.
PMID:23630598
Test model designs for advanced refractory ceramic materials
NASA Technical Reports Server (NTRS)
Tran, Huy Kim
1993-01-01
The next generation of space vehicles will be subjected to severe aerothermal loads and will require an improved thermal protection system (TPS) and other advanced vehicle components. In order to ensure the satisfactory performance of the TPS and other advanced vehicle materials and components, testing is to be performed in environments similar to space flight. The design and fabrication of the test models should be fairly simple but still accomplish the test objectives. In the Advanced Refractory Ceramic Materials test series, the models and model holders will need to withstand the required heat fluxes of 340 to 817 W/sq cm or surface temperatures in the range of 2700 K to 3000 K. The model holders should provide one-dimensional (1-D) heat transfer to the samples and the appropriate flow field without compromising the primary test objectives. Optical properties such as the effective emissivity, catalytic efficiency coefficients, thermal properties, and mass loss measurements are also taken into consideration in the design process. Therefore, it is the intent of this paper to demonstrate the design schemes for different models and model holders that accommodate these test requirements and ensure safe operation in a typical arc jet facility.
NASA Technical Reports Server (NTRS)
Madavan, Nateri K.
2004-01-01
Differential Evolution (DE) is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. The DE algorithm has recently been extended to multiobjective optimization problems by using a Pareto-based approach. In this paper, a Pareto DE algorithm is applied to multiobjective aerodynamic shape optimization problems that are characterized by computationally expensive objective function evaluations. To reduce the computational expense, the algorithm is coupled with generalized response surface meta-models based on artificial neural networks. Results are presented for some test optimization problems from the literature to demonstrate the capabilities of the method.
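The core DE generation step the abstract builds on can be illustrated with a minimal single-objective sketch (not the authors' Pareto extension or their neural-network meta-models; function names, parameters, and the toy objective are illustrative):

```python
import random

def de_step(population, f, cr, objective):
    """One generation of basic DE/rand/1/bin with greedy selection."""
    dim = len(population[0])
    new_pop = []
    for i, target in enumerate(population):
        # Mutation: base vector plus a scaled difference of two others.
        a, b, c = random.sample([p for j, p in enumerate(population) if j != i], 3)
        mutant = [a[k] + f * (b[k] - c[k]) for k in range(dim)]
        # Binomial crossover; j_rand guarantees at least one mutant gene.
        j_rand = random.randrange(dim)
        trial = [mutant[k] if (random.random() < cr or k == j_rand) else target[k]
                 for k in range(dim)]
        # Selection: keep the trial only if it is at least as good.
        new_pop.append(trial if objective(trial) <= objective(target) else target)
    return new_pop

# Toy run: minimize the sphere function in three dimensions.
sphere = lambda x: sum(v * v for v in x)
random.seed(1)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
for _ in range(100):
    pop = de_step(pop, f=0.8, cr=0.9, objective=sphere)
best = min(pop, key=sphere)
```

Because selection is greedy, each individual's objective value is non-increasing across generations; the Pareto-based extension replaces this scalar comparison with a dominance check.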
Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment
NASA Technical Reports Server (NTRS)
Carpenter, James R.; Markley, F Landis
2014-01-01
This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
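The decision logic of a Wald sequential probability ratio test can be sketched minimally, here for a simple Gaussian-mean hypothesis pair rather than the collision-probability densities used in the paper (names and parameters are illustrative):

```python
import math

def wald_sprt(samples, mu0, mu1, sigma, alpha, beta):
    """Wald SPRT for H0: mean = mu0 vs H1: mean = mu1 (known sigma).

    alpha = target false-alarm rate, beta = target missed-detection rate.
    Returns the decision and the number of samples consumed."""
    upper = math.log((1 - beta) / alpha)  # cross above: accept H1
    lower = math.log(beta / (1 - alpha))  # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Log-likelihood-ratio increment for one Gaussian observation.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return 'H1', n
        if llr <= lower:
            return 'H0', n
    return 'undecided', len(samples)

# Deterministic toy data drawn exactly at the H1 mean:
decision, n_used = wald_sprt([1.0] * 50, mu0=0.0, mu1=1.0,
                             sigma=1.0, alpha=0.01, beta=0.01)
```

The test stops as soon as the accumulated evidence crosses either threshold, which is why a sequential formulation fits an operational conjunction assessment timeline.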
[Influence of mental rotation of objects on psychophysiological functions of women].
Chikina, L V; Fedorchuk, S V; Trushina, V A; Ianchuk, P I; Makarchuk, M Iu
2012-01-01
An integral part of modern human activity is work with computer systems, which in turn produces nervous-emotional tension. Hence, the problems of monitoring the psychophysiological state of workers, with the purpose of preserving health and supporting successful activity, and of applying rehabilitation measures, are topical. It is now known that the efficiency of rehabilitation procedures rises when a complex of restorative programs is applied. Our previous investigation showed that mental rotation is capable of compensating for the consequences of nervous-emotional tension. Therefore, in the present work we investigated how the complex of spatial tasks we developed influences the psychophysiological performance of female subjects, for whom the psycho-emotional tension associated with the use of computer technologies is more pronounced, and for whom the procedure of mental rotation is a more complex task than for men. The complex of spatial tasks applied in this work included: mental rotation of simple objects (letters and digits), mental rotation of complex objects (geometrical figures), and mental rotation of complex objects with the use of short-term memory. Execution of the complex of spatial tasks reduced the time of simple and complex sensorimotor responses, raised parameters of short-term memory and brain work capacity, and improved nervous processes. Collectively, mental rotation of objects can be recommended as a rehabilitation resource for compensating the consequences of psycho-emotional strain, both for men and for women.
Plescia, Fulvio; Sardo, Pierangelo; Rizzo, Valerio; Cacace, Silvana; Marino, Rosa Anna Maria; Brancato, Anna; Ferraro, Giuseppe; Carletti, Fabio; Cannizzaro, Carla
2014-01-01
Neurosteroids can alter neuronal excitability by interacting with specific neurotransmitter receptors, thus affecting several functions such as cognition and emotionality. In this study we investigated, in adult male rats, the effects of acute administration of pregnenolone sulfate (PREGS) (10 mg/kg, s.c.) on cognitive processes using the Can test, a non-aversive spatial/visual task which allows the assessment of both spatial orientation-acquisition and object discrimination in a simple and in a complex version of the visual task. Electrophysiological recordings were also performed in vivo after acute systemic PREGS administration, in order to investigate neuronal activation in the hippocampus and the perirhinal cortex. Our results indicate that PREGS induces an improvement in spatial orientation-acquisition and in object discrimination in both the simple and the complex visual task; the behavioural responses were also confirmed by electrophysiological recordings showing a potentiation of neuronal activity in the hippocampus and the perirhinal cortex. In conclusion, this study demonstrates that systemic PREGS administration in rats exerts cognitive-enhancing properties involving both the acquisition and utilization of spatial information and object discrimination memory, and it correlates the behavioural potentiation observed with an increase in the neuronal firing of discrete cerebral areas critical for spatial learning and object recognition. This provides further evidence in support of the role of PREGS in exerting a protective and enhancing role on human memory. Copyright © 2013. Published by Elsevier B.V.
Tamaru, Yoshiki; Naito, Yasuo; Nishikawa, Takashi
2017-11-01
Elderly people are less able to manipulate objects skilfully than young adults. Although previous studies have examined age-related deterioration of hand movements with a focus on the phase after grasping objects, the changes in the reaching phase have not been studied thus far. We aimed to examine whether changes in hand shape patterns during the reaching phase of grasping movements differ between young adults and the elderly. Ten healthy elderly adults and 10 healthy young adults were examined using the Simple Test for Evaluating Hand Functions and kinetic analysis of hand pre-shaping reach-to-grasp tasks. The results were then compared between the two groups. For kinetic analysis, we measured the time of peak tangential velocity of the wrist and the inter-fingertip distance (the distance between the tips of the thumb and index finger) at different time points. The results showed that the elderly group's performance on the Simple Test for Evaluating Hand Functions was significantly lower than that of the young adult group, irrespective of whether the dominant or non-dominant hand was used, indicating deterioration of hand movement in the elderly. The peak tangential velocity of the wrist in either hand appeared significantly earlier in the elderly group than in the young adult group. The elderly group also showed larger inter-fingertip distances with arch-like fingertip trajectories compared to the young adult group for all object sizes. To perform accurate prehension, elderly people have an earlier peak tangential velocity point than young adults. This allows for a longer adjustment time for reaching and grasping movements and for reducing errors in object prehension by opening the hand and fingers wider. Elderly individuals gradually modify their strategy based on previous successes and failures during daily living to compensate for their decline in dexterity and operational capabilities. © 2017 Japanese Psychogeriatric Society.
Microburst vertical wind estimation from horizontal wind measurements
NASA Technical Reports Server (NTRS)
Vicroy, Dan D.
1994-01-01
The vertical wind or downdraft component of a microburst-generated wind shear can significantly degrade airplane performance. Doppler radar and lidar are two sensor technologies being tested to provide flight crews with early warning of the presence of hazardous wind shear. An inherent limitation of Doppler-based sensors is the inability to measure velocities perpendicular to the line of sight, which results in an underestimate of the total wind shear hazard. One solution to the line-of-sight limitation is to use a vertical wind model to estimate the vertical component from the horizontal wind measurement. The objective of this study was to assess the ability of simple vertical wind models to improve the hazard prediction capability of an airborne Doppler sensor in a realistic microburst environment. Both simulation and flight test measurements were used to test the vertical wind models. The results indicate that in the altitude region of interest (at or below 300 m), the simple vertical wind models improved the hazard estimate. The radar simulation study showed that the magnitude of the performance improvement was altitude dependent. The altitude of maximum performance improvement occurred at about 300 m.
A simple rule of thumb for elegant prehension.
Mon-Williams, M; Tresilian, J R
2001-07-10
Reaching out to grasp an object (prehension) is a deceptively elegant and skilled behavior. The movement prior to object contact can be described as having two components, the movement of the hand to an appropriate location for gripping the object, the "transport" component, and the opening and closing of the aperture between the fingers as they prepare to grip the target, the "grasp" component. The grasp component is sensitive to the size of the object, so that a larger grasp aperture is formed for wider objects; the maximum grasp aperture (MGA) is a little wider than the width of the target object and occurs later in the movement for larger objects. We present a simple model that can account for the temporal relationship between the transport and grasp components. We report the results of an experiment providing empirical support for our "rule of thumb." The model provides a simple, but plausible, account of a neural control strategy that has been the center of debate over the last two decades.
Optical Measurement Technique for Space Column Characterization
NASA Technical Reports Server (NTRS)
Barrows, Danny A.; Watson, Judith J.; Burner, Alpheus W.; Phelps, James E.
2004-01-01
A simple optical technique for the structural characterization of lightweight space columns is presented. The technique is useful for determining the coefficient of thermal expansion during cool down as well as the induced strain during tension and compression testing. The technique is based upon object-to-image plane scaling and does not require any photogrammetric calibrations or computations. Examples of the measurement of the coefficient of thermal expansion are presented for several lightweight space columns. Examples of strain measured during tension and compression testing are presented along with comparisons to results obtained with Linear Variable Differential Transformer (LVDT) position transducers.
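The object-to-image-plane scaling idea can be sketched in a few lines: because strain is a ratio of lengths, pixel measurements of the same gauge length in two frames suffice, and the pixel-to-metre scale factor cancels. This is a minimal illustration under the assumption of fixed camera geometry, not the instrument's actual processing:

```python
def strain_from_image(ref_len_px, current_len_px):
    """Engineering strain from image-plane lengths of the same gauge section.

    Strain is dimensionless, so the scale factor between pixels and metres
    cancels and no photogrammetric calibration or computation is required,
    provided the camera geometry stays fixed between the two frames."""
    return (current_len_px - ref_len_px) / ref_len_px

# A column section imaged at 1024.0 px long, then 1022.5 px under compression:
eps = strain_from_image(1024.0, 1022.5)  # negative value: compressive strain
```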
NASA Astrophysics Data System (ADS)
Mustak, S.
2013-09-01
The correction of atmospheric effects is essential because visible bands of shorter wavelength are strongly affected by atmospheric scattering, especially Rayleigh scattering. The objectives of this paper are to find the haze values present in all spectral bands and to correct them for urban analysis. In this paper, the Improved Dark Object Subtraction method of P. Chavez (1988) is applied for the correction of atmospheric haze in a Resourcesat-1 LISS-4 multispectral satellite image. Dark Object Subtraction is a very simple image-based method of atmospheric haze correction which assumes that there are at least a few pixels within an image which should be black (0% reflectance); such black pixels, termed dark objects, are clear water bodies and shadows whose DN values are zero (0) or close to zero in the image. The Simple Dark Object Subtraction method is a first-order atmospheric correction, whereas the Improved Dark Object Subtraction method corrects the haze in terms of atmospheric scattering and path radiance based on a power law for the relative scattering effect of the atmosphere. The haze values extracted using the Simple Dark Object Subtraction method for the green band (Band 2), red band (Band 3), and NIR band (Band 4) are 40, 34, and 18, while the haze values extracted using the Improved Dark Object Subtraction method are 40, 18.02, and 11.80 for the aforesaid bands. It is concluded that the haze values extracted by the Improved Dark Object Subtraction method provide more realistic results than the Simple Dark Object Subtraction method.
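The two corrections described above can be sketched minimally (illustrative only; the toy band values and the scattering exponent are assumptions, with n = 4 corresponding to a very clear Rayleigh-dominated atmosphere):

```python
def simple_dos(band):
    """Simple Dark Object Subtraction: assume the darkest pixel is a
    zero-reflectance dark object and subtract its DN from every pixel."""
    haze = min(min(row) for row in band)
    return [[max(dn - haze, 0) for dn in row] for row in band]

def improved_haze(haze_ref, wl_ref, wl, n=4.0):
    """Improved DOS: scale the reference-band haze value to another band
    using the relative-scattering power law haze ~ wavelength ** -n."""
    return haze_ref * (wl / wl_ref) ** (-n)

# Toy single-band image with a 40-DN haze offset:
band = [[40, 60, 120], [200, 40, 90]]
corrected = simple_dos(band)              # darkest DN (40) removed everywhere
# Scale an assumed 40-DN green-band haze (0.56 um) to the red band (0.65 um):
red_haze = improved_haze(40.0, 0.56, 0.65)
```

Scaling the haze by wavelength, rather than subtracting each band's own dark-object DN independently, is what makes the improved method consistent with a single physical scattering model across bands.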
A Hammer-Impact, Aluminum, Shear-Wave Seismic Source
Haines, Seth
2007-01-01
Near-surface seismic surveys often employ hammer impacts to create seismic energy. Shear-wave surveys using horizontally polarized waves require horizontal hammer impacts against a rigid object (the source) that is coupled to the ground surface. I have designed, built, and tested a source made out of aluminum and equipped with spikes to improve coupling. The source is effective in a variety of settings, and it is relatively simple and inexpensive to build.
Purkinje cells signal hand shape and grasp force during reach-to-grasp in the monkey.
Mason, Carolyn R; Hendrix, Claudia M; Ebner, Timothy J
2006-01-01
The cerebellar cortex and nuclei play important roles in the learning, planning, and execution of reach-to-grasp and prehensile movements. However, few studies have investigated the signals carried by cerebellar neurons during reach-to-grasp, particularly signals relating to target object properties, hand shape, and grasp force. In this study, the simple spike discharge of 77 Purkinje cells was recorded as two rhesus monkeys reached and grasped 16 objects. The objects varied systematically in volume, shape, and orientation and each was grasped at five different force levels. Linear multiple regression analyses showed the simple spike discharge was significantly modulated in relation to objects and force levels. Object related modulation occurred preferentially during reach or early in the grasp and was linearly related to grasp aperture. The simple spike discharge was positively correlated with grasp force during both the reach and the grasp. There was no significant interaction between object and grasp force modulation, supporting previous kinematic findings that grasp kinematics and force are signaled independently. Singular value decomposition (SVD) was used to quantify the temporal patterns in the simple spike discharge. Most cells had a predominant discharge pattern that remained relatively constant across object grasp dimensions and force levels. A single predominant simple spike discharge pattern that spans reach and grasp and accounts for most of the variation (>60%) is consistent with the concept that the cerebellum is involved with synergies underlying prehension. Therefore Purkinje cells are involved with the signaling of prehension, providing independent signals for hand shaping and grasp force.
A simple bedside test to assess the swallowing dysfunction in Parkinson's disease.
Kanna, S Vinoth; Bhanu, K
2014-01-01
Swallowing changes are common in Parkinson's disease (PD). Early identification is essential to avoid complications of aspiration. The aims were to evaluate the swallowing ability of PD patients and to correlate it with indicators of disease progression. A total of 100 PD patients (70 males and 30 females) aged between 50 and 70 years with varying stage, duration, and severity were enrolled in a cross-sectional study carried out between January and May 2012. A simple bedside water swallowing test was performed using a standard 150 ml of water. The swallowing process was assessed under three categories: swallowing speed (ml/s), swallowing volume (ml/swallow), and swallowing duration (s/swallow). An equal number of age- and sex-matched controls were also evaluated. All of them completed the task of swallowing. A mean swallowing speed of 27.48 ml/s, swallowing volume of 28.5 ml/swallow, and swallowing duration of 1.05 s/swallow were established by the control group. The PD patients showed decreased swallowing speed (7.15 ml/s in males and 6.61 ml/s in females), decreased swallowing volume (14.59 ml/swallow in males and 14 ml/swallow in females), and increased swallowing duration (2.37 s/swallow in males and 2.42 s/swallow in females), all statistically significant. There was a significant positive correlation between the severity, duration, and staging of the disease and swallowing performance, and a poor correlation between subjective reports of dysphagia and objective performance on the water swallow test. The water swallowing test is a simple bedside test to identify swallowing changes early in PD. It is recommended in all PD patients to detect dysphagia early and to intervene appropriately.
Testing stellar evolution models with detached eclipsing binaries
NASA Astrophysics Data System (ADS)
Higl, J.; Weiss, A.
2017-12-01
Stellar evolution codes, as all other numerical tools, need to be verified. One of the standard stellar objects that allow stringent tests of stellar evolution theory and models, are detached eclipsing binaries. We have used 19 such objects to test our stellar evolution code, in order to see whether standard methods and assumptions suffice to reproduce the observed global properties. In this paper we concentrate on three effects that contain a specific uncertainty: atomic diffusion as used for standard solar model calculations, overshooting from convective regions, and a simple model for the effect of stellar spots on stellar radius, which is one of the possible solutions for the radius problem of M dwarfs. We find that in general old systems need diffusion to allow for, or at least improve, an acceptable fit, and that systems with convective cores indeed need overshooting. Only one system (AI Phe) requires the absence of it for a successful fit. To match stellar radii for very low-mass stars, the spot model proved to be an effective approach, but depending on model details, requires a high percentage of the surface being covered by spots. We briefly discuss improvements needed to further reduce the freedom in modelling and to allow an even more restrictive test by using these objects.
Han, Kuk-Il; Kim, Do-Hwi; Choi, Jun-Hyuk; Kim, Tae-Kuk
2018-04-20
The threat posed by detection through infrared (IR) signals is greater than for other signals such as radar or sonar, because an object detected by an IR sensor cannot easily recognize that it is being detected. Recently, research on actively reducing the IR signal by adjusting the surface temperature of the object has been conducted. In this paper, we propose an active IR stealth algorithm to synchronize the IR signals from an object and the background around the object. The proposed method uses the repulsive particle swarm optimization statistical optimization algorithm to estimate the IR stealth surface temperature, which results in a synchronization between the IR signals from the object and the surrounding background by setting the inverse-distance-weighted contrast radiant intensity (CRI) equal to zero. We tested the IR stealth performance in the mid-wavelength infrared (MWIR) and long-wavelength infrared (LWIR) bands for a test plate located at three different positions in a forest scene to verify the proposed method. Our results show that the inverse-distance-weighted active IR stealth technique proposed in this study is an effective method for reducing the contrast radiant intensity between the object and background by up to 32% as compared to the previous method using the CRI determined as the simple signal difference between the object and the background.
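The zero-CRI condition can be illustrated with a small sketch: if the object-background contrast is weighted by inverse distance, the object radiance that zeroes the weighted contrast sum is simply the IDW-weighted mean of the background radiances. This is a deliberate simplification of the paper's temperature-optimization method; the function name, the weighting exponent, and the toy values are assumptions:

```python
def stealth_radiance(bg_radiances, distances, p=2.0):
    """Object radiance L_obj that zeroes the inverse-distance-weighted
    contrast  sum_i w_i * (L_obj - L_bg_i)  with  w_i = 1 / d_i ** p.

    Solving for L_obj gives the IDW-weighted mean of the backgrounds."""
    weights = [1.0 / d ** p for d in distances]
    return sum(w * L for w, L in zip(weights, bg_radiances)) / sum(weights)

# Two background patches: a near one (radiance 10) and a far one (radiance 20).
target = stealth_radiance([10.0, 20.0], [1.0, 2.0])  # near patch dominates
```

In the paper the free variable is the plate's surface temperature rather than its radiance directly, with the optimizer searching for the temperature whose emitted signal satisfies this zero-contrast condition.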
Wilson, N; Vickers, H; Taylor, G; Silverman, M
1982-01-01
Ten asthmatic children with a history of cough and wheeze after drinking a cola drink performed histamine inhalation tests before and 30 minutes after a drink of Pepsi-Cola, soda water, and water on three separate study days. There was no significant change in baseline peak expiratory flow after any of the three drinks. Sensitivity to histamine was increased after the cola drink (p less than 0.005) but was not significantly different after soda water or water. The detection of a change in sensitivity to histamine appears to be a simple and effective method of testing for food sensitivity in asthma. PMID:6803911
Prime Numbers Comparison using Sieve of Eratosthenes and Sieve of Sundaram Algorithm
NASA Astrophysics Data System (ADS)
Abdullah, D.; Rahim, R.; Apdilah, D.; Efendi, S.; Tulus, T.; Suwilo, S.
2018-03-01
Prime numbers hold particular appeal for researchers due to their complexity. Many algorithms can be used to generate prime numbers, ranging from simple to computationally complex. The Sieve of Eratosthenes and the Sieve of Sundaram are two algorithms that can be used to generate prime numbers from randomly generated or sequentially numbered inputs. The testing in this study aims to find out which algorithm is better suited to large primes in terms of time complexity. The tests were assisted by an application written in Java, with code optimization and maximum memory usage configured so that the testing processes could run simultaneously and the results obtained would be objective.
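For reference, the two sieves compared in the study can be sketched compactly (in Python rather than the study's Java, purely as an illustration of the algorithms, for n >= 2):

```python
def sieve_eratosthenes(n):
    """All primes <= n: repeatedly strike out multiples of each prime."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, n + 1, p):  # start at p*p: smaller done
                is_prime[m] = False
    return [i for i, flag in enumerate(is_prime) if flag]

def sieve_sundaram(n):
    """All primes <= n: strike every i + j + 2ij (1 <= i <= j), then map
    each surviving k to the odd prime 2k + 1, and prepend 2."""
    k = (n - 1) // 2
    marked = [False] * (k + 1)
    for i in range(1, k + 1):
        j = i
        while i + j + 2 * i * j <= k:
            marked[i + j + 2 * i * j] = True
            j += 1
    return [2] + [2 * m + 1 for m in range(1, k + 1) if not marked[m]]

primes = sieve_eratosthenes(30)  # both sieves produce the same list
```

Both run in roughly O(n log log n) time; the practical difference the study measures comes from constant factors and memory behaviour at large n.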
Virtual-stereo fringe reflection technique for specular free-form surface testing
NASA Astrophysics Data System (ADS)
Ma, Suodong; Li, Bo
2016-11-01
Due to their excellent ability to improve the performance of optical systems, free-form optics have attracted extensive interest in many fields, e.g. the optical design of astronomical telescopes, laser beam expanders, spectral imagers, etc. However, compared with traditional simple optics, testing such optics is usually more complex and difficult, which has long been a major barrier to the manufacture and application of these optics. Fortunately, owing to the rapid development of electronic devices and computer vision technology, the fringe reflection technique (FRT), with its advantages of simple system structure, high measurement accuracy, and large dynamic range, is becoming a powerful tool for specular free-form surface testing. In order to obtain absolute surface shape distributions of test objects, two or more cameras are often required in the conventional FRT, which makes the system structure more complex and the measurement cost much higher. Furthermore, high-precision synchronization between the cameras is also a troublesome issue. To overcome these drawbacks, a virtual-stereo FRT for specular free-form surface testing is put forward in this paper. It achieves absolute profiles with the help of only a single biprism and one camera, thereby avoiding the problems of stereo FRT based on binocular or multi-ocular cameras. Preliminary experimental results demonstrate the feasibility of the proposed technique.
Jacks--A Study of Simple Machines.
ERIC Educational Resources Information Center
Parsons, Ralph
This vocational physics individualized student instructional module on jacks (simple machines used to lift heavy objects) contains student prerequisites and objectives, an introduction, and sections on the ratchet bumper jack, the hydraulic jack, the screw jack, and load limitations. Designed with a laboratory orientation, each section consists of…
Test of the efficiency of three storm water quality models with a rich set of data.
Ahyerre, M; Henry, F O; Gogien, F; Chabanel, M; Zug, M; Renaudet, D
2005-01-01
The objective of this article is to test the efficiency of three different storm water quality models (SWQM) on the same data set (34 rain events, SS measurements) sampled on a 42 ha watershed in the centre of Paris. The models were calibrated at the scale of the rain event. Considering the mass of pollution calculated per event, the results of the models are satisfactory, but they are of the same order of magnitude as a simple hydraulic approach associated with a constant concentration. Secondly, the mass of pollutant at the outlet of the catchment over the full set of 34 events was calculated. This approach shows that the simple hydraulic calculation gives better results than the SWQM. Finally, the pollutographs are analysed, showing that storm water quality models are interesting tools for representing the shape of the pollutographs and the dynamics of the phenomenon, which can be useful to managers in some projects.
Davila-Ross, Marina; Hutchinson, Johanna; Russell, Jamie L; Schaeffer, Jennifer; Billard, Aude; Hopkins, William D; Bard, Kim A
2014-05-01
Even the most rudimentary social cues may evoke affiliative responses in humans and promote social communication and cohesion. The present work tested whether such cues of an agent may also promote communicative interactions in a nonhuman primate species, by examining interaction-promoting behaviours in chimpanzees. Here, chimpanzees were tested during interactions with an interactive humanoid robot, which showed simple bodily movements and sent out calls. The results revealed that chimpanzees exhibited two types of interaction-promoting behaviours during relaxed or playful contexts. First, the chimpanzees showed prolonged active interest when they were imitated by the robot. Second, the subjects requested 'social' responses from the robot, i.e. by showing play invitations and offering toys or other objects. This study thus provides evidence that even rudimentary cues of a robotic agent may promote social interactions in chimpanzees, like in humans. Such simple and frequent social interactions most likely provided a foundation for sophisticated forms of affiliative communication to emerge.
Atzori, Manfredo; Cognolato, Matteo; Müller, Henning
2016-01-01
Natural control methods based on surface electromyography (sEMG) and pattern recognition are promising for hand prosthetics. However, the control robustness offered by scientific research is still not sufficient for many real-life applications, and commercial prostheses are capable of offering natural control for only a few movements. In recent years deep learning has revolutionized several fields of machine learning, including computer vision and speech recognition. Our objective is to test its methods for natural control of robotic hands via sEMG using a large number of intact subjects and amputees. We tested convolutional networks for the classification of an average of 50 hand movements in 67 intact subjects and 11 transradial amputees. The simple architecture of the neural network allowed us to make several tests in order to evaluate the effect of pre-processing, layer architecture, data augmentation and optimization. The classification results are compared with a set of classical classification methods applied on the same datasets. The classification accuracy obtained with convolutional neural networks using the proposed architecture is higher than the average results obtained with the classical classification methods, but lower than the results obtained with the best reference methods in our tests. The results show that convolutional neural networks with a very simple architecture can produce accurate results comparable to the average classical classification methods. They show that several factors (including pre-processing, the architecture of the net and the optimization parameters) can be fundamental for the analysis of sEMG data. Larger networks can achieve higher accuracy on computer vision and object recognition tasks. This fact suggests that it may be interesting to evaluate if larger networks can increase sEMG classification accuracy too. PMID:27656140
Study of Pressure Oscillations in Supersonic Parachute
NASA Astrophysics Data System (ADS)
Dahal, Nimesh; Fukiba, Katsuyoshi; Mizuta, Kazuki; Maru, Yusuke
2018-04-01
Supersonic parachutes are a critical element of planetary missions; their simple structure, light weight and high aerodynamic drag make them the most suitable aerodynamic decelerators. The use of a parachute in supersonic flow produces complex shock/shock and wake/shock interactions that give rise to dynamic pressure oscillations. Supersonic parachutes are difficult to study because their highly flexible structure makes experimental pressure data hard to obtain. In this study, a supersonic wind tunnel test using two rigid bodies was performed. The test was conducted at Mach 3 while varying the distance between the front and rear objects and the distance of a bundle point that divides the suspension lines and a riser. Analysis of Schlieren movies revealed a repetitive shock wave oscillation with large pressure variation. The pressure variation differed with each change in the distance between the front and rear objects and between the riser and the rear object. The causes of the pressure oscillation are: interaction of the wake caused by the front object with the shock wave, fundamental harmonic vibration of the suspension lines, interference between shock waves, and the boundary layer of the suspension lines.
Color constancy in a scene with bright colors that do not have a fully natural surface appearance.
Fukuda, Kazuho; Uchikawa, Keiji
2014-04-01
Theoretical and experimental approaches have proposed that color constancy involves a correction related to some average of stimulation over the scene, and some of the studies showed that the average gives greater weight to surrounding bright colors. However, in a natural scene, high-luminance elements do not necessarily carry information about the scene illuminant when the luminance is too high for it to appear as a natural object color. The question is how a surrounding color's appearance mode influences its contribution to the degree of color constancy. Here the stimuli were simple geometric patterns, and the luminance of surrounding colors was tested over the range beyond the luminosity threshold. Observers performed perceptual achromatic setting on the test patch in order to measure the degree of color constancy and evaluated the surrounding bright colors' appearance mode. Broadly, our results support the assumption that the visual system counts only the colors in the object-color appearance for color constancy. However, detailed analysis indicated that surrounding colors without a fully natural object-color appearance had some sort of influence on color constancy. Consideration of this contribution of unnatural object color might be important for precise modeling of human color constancy.
Case studies evaluating the quality of synthetic environments
NASA Astrophysics Data System (ADS)
Deisinger, Joachim; Blach, Roland; Simon, Andreas
1999-03-01
Multi-wall stereo projection systems (MWSP) are an emerging display paradigm promising a new quality of 3D real-time interaction. Not much is known about the ergonomics of these systems. In this paper, some basics of perception and approaches to improving visual quality are discussed, and the results of four experiments are presented in order to obtain a better understanding of user interaction with existing projection technology. Due to the limited number of participants, the experiments are considered case studies only. The first task was the estimation of absolute geometrical dimensions of simple objects. The second task was grabbing simple objects of different sizes. In order to classify MWSP, these tasks were compared across other display devices and against physical reality. We conducted two further experiments to compare different viewing devices for virtual reality (VR), such as head-mounted displays (HMD), monitors, and the MWSP. For all of these experiments, quantitative data were collected as a measure of interaction quality. The last two tests were supplemented by pre- and post-questionnaires to obtain subjective judgements of the displays as well.
A simple but powerful test of perseverative search in dogs and toddlers.
Péter, András; Gergely, Anna; Topál, József; Miklósi, Ádám; Pongrácz, Péter
2015-01-01
Perseverative (A-not-B) errors during the search of a hidden object were recently described in both dogs and 10-month-old infants. It was found that ostensive cues indicating a communicative intent of the person who hides the object played a major role in eliciting perseverative errors in both species. However, the employed experimental set-up gave rise to several alternative explanations regarding the source of these errors. Here we present a simplified protocol that eliminates the ambiguities present in the original design. Using five consecutive object hiding events to one of two locations in a fixed order ("AABBA"), we tested adult companion dogs and human children (24 months old). The experimenter performed the hiding actions while giving ostensive cues in each trial and moved the target object to the given location in a straight line. Our results show that in the B trials, both 24-month-old children and dogs could not reliably find the hidden object, and their performance in the first B trials was significantly below that of any of the A trials. These results are the first to show that the tendency for perseverative errors in an ostensive-communicative context is a robust phenomenon among 2-year-old children and dogs, and not the by-product of a topographically elaborate hiding event.
Estimated capacity of object files in visual short-term memory is not improved by retrieval cueing.
Saiki, Jun; Miyatsuji, Hirofumi
2009-03-23
Visual short-term memory (VSTM) has been claimed to maintain three to five feature-bound object representations. Some results showing smaller capacity estimates for feature binding memory have been interpreted as the effects of interference in memory retrieval. However, change-detection tasks may not properly evaluate complex feature-bound representations such as triple conjunctions in VSTM. To understand the general type of feature-bound object representation, evaluation of triple conjunctions is critical. To test whether interference occurs in memory retrieval for complete object file representations in a VSTM task, we cued retrieval in novel paradigms that directly evaluate the memory for triple conjunctions, in comparison with a simple change-detection task. In our multiple object permanence tracking displays, observers monitored for a switch in feature combination between objects during an occlusion period, and we found that a retrieval cue provided no benefit with the triple conjunction tasks, but significant facilitation with the change-detection task, suggesting that low capacity estimates of object file memory in VSTM reflect a limit on maintenance, not retrieval.
Koen, Joshua D; Borders, Alyssa A; Petzold, Michael T; Yonelinas, Andrew P
2017-02-01
The medial temporal lobe (MTL) plays a critical role in episodic long-term memory, but whether the MTL is necessary for visual short-term memory is controversial. Some studies have indicated that MTL damage disrupts visual short-term memory performance whereas other studies have failed to find such evidence. To account for these mixed results, it has been proposed that the hippocampus is critical in supporting short-term memory for high resolution complex bindings, while the cortex is sufficient to support simple, low resolution bindings. This hypothesis was tested in the current study by assessing visual short-term memory in patients with damage to the MTL and controls for high resolution and low resolution object-location and object-color associations. In the location tests, participants encoded sets of two or four objects in different locations on the screen. After each set, participants performed a two-alternative forced-choice task in which they were required to discriminate the object in the target location from the object in a high or low resolution lure location (i.e., the object locations were very close or far away from the target location, respectively). Similarly, in the color tests, participants were presented with sets of two or four objects in a different color and, after each set, were required to discriminate the object in the target color from the object in a high or low resolution lure color (i.e., the lure color was very similar or very different, respectively, to the studied color). The patients were significantly impaired in visual short-term memory, but importantly, they were more impaired for high resolution object-location and object-color bindings. The results are consistent with the proposal that the hippocampus plays a critical role in forming and maintaining complex, high resolution bindings. © 2016 Wiley Periodicals, Inc.
A smart sensor architecture based on emergent computation in an array of outer-totalistic cells
NASA Astrophysics Data System (ADS)
Dogaru, Radu; Dogaru, Ioana; Glesner, Manfred
2005-06-01
A novel smart-sensor architecture is proposed that is capable of segmenting and recognizing characters in a monochrome image, providing a list of ASCII codes representing the characters recognized in the monochrome visual field. It can operate as an aid for the blind or in industrial applications. A bio-inspired cellular model with simple linear neurons was found to perform best at the nontrivial task of cropping isolated compact objects such as handwritten digits or characters. By attaching a simple outer-totalistic cell to each pixel sensor, emergent computation in the resulting cellular automata lattice provides a straightforward and compact solution to the otherwise computationally intensive problem of character segmentation. A simple and robust recognition algorithm is built into a compact sequential controller accessing the array of cells, so that the integrated device can directly provide a list of codes of the recognized characters. Preliminary simulation tests indicate good performance and robustness to various distortions of the visual field.
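An outer-totalistic cellular automaton of the kind mentioned above updates each cell from its own state plus the sum of its neighbours' states. A minimal sketch follows; the example rule is Conway's Game of Life, a classic outer-totalistic rule, not the segmentation rule of the paper (which the abstract does not specify):

```python
def ca_step(grid, rule):
    # One synchronous update of an outer-totalistic CA on a toroidal grid.
    # grid: list of lists of 0/1; rule(state, neighbor_sum) -> next state
    h, w = len(grid), len(grid[0])
    nxt = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            s = sum(grid[(i + di) % h][(j + dj) % w]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di, dj) != (0, 0))
            nxt[i][j] = rule(grid[i][j], s)
    return nxt

def life_rule(state, s):
    # Example outer-totalistic rule: birth on 3 neighbours, survival on 2 or 3
    return 1 if s == 3 or (state == 1 and s == 2) else 0
```

A segmentation rule would replace `life_rule` with one that grows labels across connected foreground pixels, but the update machinery is the same.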
A simple experimental method to study depigmenting agents.
Abella, M L; de Rigal, J; Neveux, S
2007-08-01
The first objective of the study was to verify that controlled UV exposure of four areas of the forearms together with randomized product application enables the comparison of treatment efficacy, and then to compare the depigmenting efficacy of different products with a simple experimental method. Sixteen volunteers received 0.7 minimal erythemal dose for four consecutive days. The products tested were ellagic acid (0.5%), vitamin C (5%) and C8-LHA (2%). Product application started 72 h after the last exposure and was repeated for 42 days; the control zone was exposed but untreated. Colour measurements included Chromameter, Chromasphere, Spectro-colorimeter and visual assessment. Comparison of colour values at day 1 and day 7 showed that all zones were comparably tanned, allowing a rigorous comparison of the treatments. We report a new, simple experimental model that enables the rapid comparison of different depigmenting products. The efficacy and good tolerance of C8-LHA make it an excellent candidate for the treatment of hyperpigmentary disorders.
Pruning Neural Networks with Distribution Estimation Algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cantu-Paz, E
2003-01-15
This paper describes the application of four evolutionary algorithms to the pruning of neural networks used in classification problems. Besides a simple genetic algorithm (GA), the paper considers three distribution estimation algorithms (DEAs): a compact GA, an extended compact GA, and the Bayesian Optimization Algorithm. The objective is to determine whether the DEAs present advantages over the simple GA in terms of accuracy or speed on this problem. The experiments used a feed-forward neural network trained with standard back propagation on public-domain and artificial data sets. The pruned networks seemed to have better or equal accuracy compared with the original fully-connected networks; only in a few cases did pruning result in less accurate networks. We found few differences in the accuracy of the networks pruned by the four EAs, but important differences in execution time. The results suggest that a simple GA with a small population might be the best algorithm for pruning networks on the data sets we tested.
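Of the DEAs listed above, the compact GA is the simplest: it replaces the population with a single probability vector over the pruning bits and nudges that vector toward the winner of pairwise tournaments. A hedged sketch on a toy fitness function (in the paper the fitness would be pruned-network accuracy, which is not reproduced here):

```python
import random

def compact_ga(fitness, n_bits, pop_size=50, iters=2000, seed=0):
    # Compact GA: evolve a probability vector p instead of a population.
    rng = random.Random(seed)
    p = [0.5] * n_bits

    def sample():
        return [1 if rng.random() < p[i] else 0 for i in range(n_bits)]

    for _ in range(iters):
        a, b = sample(), sample()
        if fitness(b) > fitness(a):
            a, b = b, a                    # a is the tournament winner
        for i in range(n_bits):
            if a[i] != b[i]:               # shift p toward the winner's bit
                p[i] += (1.0 / pop_size) if a[i] == 1 else -(1.0 / pop_size)
                p[i] = min(1.0, max(0.0, p[i]))
    return p
```

For pruning, each bit would mark one connection as kept (1) or removed (0), and `fitness` would evaluate the masked network on validation data.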
Cosmic Star Formation: A Simple Model of the SFRD(z)
NASA Astrophysics Data System (ADS)
Chiosi, Cesare; Sciarratta, Mauro; D’Onofrio, Mauro; Chiosi, Emanuela; Brotto, Francesca; De Michele, Rosaria; Politino, Valeria
2017-12-01
We investigate the evolution of the cosmic star formation rate density (SFRD) from redshift z = 20 to z = 0 and compare it with the observational one by Madau and Dickinson derived from recent compilations of ultraviolet (UV) and infrared (IR) data. The theoretical SFRD(z) and its evolution are obtained using a simple model that folds together the star formation histories of prototype galaxies designed to represent real objects of different morphological type along the Hubble sequence, and the hierarchical growth of structures under the action of gravity from small perturbations to large-scale objects in Λ-CDM cosmogony, i.e., the number density of dark matter halos N(M,z). Although the overall model is very simple and easy to set up, it closely mimics results obtained from highly complex large-scale N-body simulations. The simplicity of our approach allows us to test different assumptions for the star formation law in galaxies, the effects of energy feedback from stars to interstellar gas, the efficiency of galactic winds, and also the effect of N(M,z). The result of our analysis is that in the framework of the hierarchical assembly of galaxies, the so-called time-delayed star formation under plain assumptions, mainly for the energy feedback and galactic winds, can reproduce the observational SFRD(z).
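For orientation, the observational SFRD(z) of Madau and Dickinson referenced above is commonly summarized by their analytic fitting function. The form below is quoted from memory of Madau & Dickinson (2014) and should be checked against the original before reuse:

```python
import math

def sfrd(z):
    # Madau & Dickinson (2014) fitting function for the cosmic star
    # formation rate density, in M_sun / yr / Mpc^3 (form assumed here):
    # psi(z) = 0.015 (1+z)^2.7 / (1 + ((1+z)/2.9)^5.6)
    return 0.015 * (1.0 + z) ** 2.7 / (1.0 + ((1.0 + z) / 2.9) ** 5.6)
```

The function rises from z = 0, peaks near z ≈ 2 ("cosmic noon"), and declines toward high redshift, which is the shape any theoretical SFRD(z) model is compared against.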
Fiene, Marina; Rufener, Katharina S; Kuehne, Maria; Matzke, Mike; Heinze, Hans-Jochen; Zaehle, Tino
2018-03-01
Fatigue is one of the most common and debilitating symptoms affecting patients with multiple sclerosis (MS). Sustained cognitive effort induces cognitive fatigue, operationalized as subjective exhaustion and fatigue-related objective alertness decrements with time-on-task. During prolonged cognitive testing, MS patients show increased simple reaction times (RT) accompanied by lower amplitudes and prolonged latencies of the P300 event-related potential. Previous studies suggested a major role of structural and functional abnormalities in the frontal cortex including a frontal hypo-activation in fatigue pathogenesis. In the present study we investigated the neuromodulatory effect of transcranial direct current stimulation (tDCS) over the left dorsolateral prefrontal cortex (DLPFC) on objective measures of fatigue-related decrements in cognitive performance in MS patients. P300 during an auditory oddball task and simple reaction times in an alertness test were recorded at baseline, during and after stimulation. Compared to sham, anodal tDCS caused an increase in P300 amplitude that persisted after the end of stimulation and eliminated the fatigue-related increase in RT over the course of a testing session. Our findings demonstrate that anodal tDCS over the left DLPFC can counteract performance decrements associated with fatigue thereby leading to an improvement in the patient's ability to cope with sustained cognitive demands. This provides causal evidence for the functional relevance of the left DLPFC in fatigue pathophysiology. The results indicate that tDCS-induced modulations of frontal activity can be an effective therapeutic option for the treatment of fatigue-related declines in cognitive performance in MS patients.
Herrera-Guzmán, I; Peña-Casanova, J; Lara, J P; Gudayol-Ferré, E; Böhm, P
2004-08-01
The assessment of visual perception and cognition forms an important part of any general cognitive evaluation. We studied the possible influence of age, sex, and education on performance in visual perception tasks in a normal elderly Spanish population (90 healthy subjects). To evaluate visual perception and cognition, we used the subjects' performance on the Visual Object and Space Perception Battery (VOSP). The test consists of 8 subtests: 4 measure visual object perception (Incomplete Letters, Silhouettes, Object Decision, and Progressive Silhouettes) while the other 4 measure visual space perception (Dot Counting, Position Discrimination, Number Location, and Cube Analysis). The statistical procedures employed were either simple or multiple linear regression analyses (subtests with normal distribution) or Mann-Whitney tests followed by ANOVA with Scheffe correction (subtests without normal distribution). Age and sex were found to be significant modifying factors in the Silhouettes, Object Decision, Progressive Silhouettes, Position Discrimination, and Cube Analysis subtests. Educational level was found to be a significant predictor of performance on the Silhouettes and Object Decision subtests. The results of the sample were adjusted in line with the differences observed. Our study also offers preliminary normative data for the administration of the VOSP to an elderly Spanish population. The results are discussed and compared with similar studies performed in different cultural backgrounds.
To call a cloud 'cirrus': sound symbolism in names for categories or items.
Ković, Vanja; Sučević, Jelena; Styles, Suzy J
2017-01-01
The aim of the present paper is to experimentally test whether sound symbolism has selective effects on labels with different ranges of reference within a simple noun hierarchy. In two experiments, adult participants learned the makeup of two categories of unfamiliar objects ('alien life forms') and were passively exposed to either category labels or item labels in a learning-by-guessing categorization task. Following category training, participants were tested on their visual discrimination of object pairs. For different groups of participants, the labels were either congruent or incongruent with the objects. In Experiment 1, when trained on items with individual labels, participants were worse (made more errors) at detecting visual object mismatches when the trained labels were incongruent. In Experiment 2, when participants were trained on items in labelled categories, they were faster at detecting a match if the trained labels were congruent, and faster at detecting a mismatch if the trained labels were incongruent. This pattern of results suggests that sound symbolism in category labels facilitates later similarity judgements when congruent and discrimination when incongruent, whereas for item labels incongruence generates error in judgements of visual object differences. These findings reveal that sound symbolic congruence has a different outcome at different levels of labelling within a noun hierarchy. These effects emerged in the absence of the label itself, indicating subtle but pervasive effects on visual object processing.
Sopharat, Jessada; Gay, Frederic; Thaler, Philippe; Sdoodee, Sayan; Isarangkool Na Ayutthaya, Supat; Tanavud, Charlchai; Hammecker, Claude; Do, Frederic C.
2015-01-01
Climate change and rapid expansion into climatically suboptimal areas threaten the sustainability of rubber tree cultivation. A simple framework based on reduction factors applied to potential transpiration was tested to evaluate water constraints on seasonal transpiration in tropical sub-humid climates according to pedoclimatic conditions. We selected a representative mature stand in a drought-prone area. Tree transpiration, evaporative demand and soil water availability were measured every day over 15 months. The results showed that the basic relationships with evaporative demand, leaf area index and soil water availability were globally supported. However, implementing a regulation of transpiration at high evaporative demand, whatever the soil water availability, was necessary to avoid large overestimates of transpiration. The details of this regulation were confirmed by analysis of the canopy conductance response to vapor pressure deficit. The final objective of establishing a hierarchy among the main regulation factors of seasonal and annual transpiration was achieved. Under the tested environmental conditions, and contrary to expectations, the impact of atmospheric drought appeared to be of greater importance than that of soil drought. Our results support the interest of simple models in providing a first diagnosis of water constraints on transpiration with limited data, and in helping decision making toward more sustainable rubber plantations. PMID:25610443
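The reduction-factor framework mentioned above multiplies potential transpiration by factors for canopy development, soil water availability and atmospheric demand. The functional forms and thresholds below are invented for illustration; the paper's actual factors are not given in the abstract:

```python
def transpiration_estimate(e_pot, lai, rew, vpd):
    # e_pot: potential transpiration (e.g., mm/day), lai: leaf area index,
    # rew: relative extractable soil water (0-1), vpd: vapor pressure deficit (kPa).
    # All forms and thresholds below are assumptions, not the paper's values.
    f_canopy = min(1.0, lai / 3.0)              # leaf area limitation
    f_soil = min(1.0, rew / 0.4)                # soil drought below a 0.4 threshold
    f_atmos = 1.0 if vpd <= 1.5 else 1.5 / vpd  # regulation at high evaporative demand
    return e_pot * f_canopy * f_soil * f_atmos
```

The last factor encodes the key finding of the abstract: transpiration is capped at high evaporative demand even when soil water is plentiful.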
Thermal Analysis and Testing of Fastrac Gas Generator Design
NASA Technical Reports Server (NTRS)
Nguyen, H.
1998-01-01
The Fastrac Engine is being developed by the Marshall Space Flight Center (MSFC) to help meet the goal of substantially reducing the cost of access to space. The engine relies on a simple gas-generator cycle, which burns a small amount of RP-1 and oxygen to provide gas to drive the turbine and then exhausts the spent fuel. The Fastrac program envisions a combination of analysis, design and hot-fire evaluation testing. This paper provides the supporting thermal analysis of the gas generator design. In order to ensure that the design objectives were met, the evaluation tests started at the component level, and a total of 15 tests of different durations have been completed to date at MSFC. The correlated thermal model results will also be compared against the hot-fire thermocouple data gathered.
Application of digital control techniques for satellite medium power DC-DC converters
NASA Astrophysics Data System (ADS)
Skup, Konrad R.; Grudzinski, Pawel; Nowosielski, Witold; Orleanski, Piotr; Wawrzaszek, Roman
2010-09-01
The objective of this paper is to present work on a digital control loop system for satellite medium-power DC-DC converters carried out at the Space Research Centre. The whole control process of the described power converter is based on high-speed digital signal processing. The paper presents the development of an FPGA digital controller for voltage-mode stabilization, implemented in VHDL. The described controllers are a classical digital PID controller and a bang-bang controller. The converter used for testing is a simple model of a 5-20 W, 200 kHz buck power converter. A high-resolution digital PWM approach is presented, together with a simple and effective solution for filtering the output of an analog-to-digital converter.
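The classical digital PID loop mentioned above can be sketched in a few lines. The gains, sampling period and first-order plant model used below are illustrative assumptions, not values from the paper (which targets an FPGA/VHDL implementation rather than software):

```python
class DigitalPID:
    # Discrete PID: u[k] = Kp*e[k] + Ki*Ts*sum(e) + Kd*(e[k]-e[k-1])/Ts,
    # with the output clamped to a valid duty-cycle range.
    def __init__(self, kp, ki, kd, ts, u_min=0.0, u_max=1.0):
        self.kp, self.ki, self.kd, self.ts = kp, ki, kd, ts
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0
        self.e_prev = 0.0

    def update(self, setpoint, measurement):
        e = setpoint - measurement
        self.integral += e * self.ts
        d = (e - self.e_prev) / self.ts
        self.e_prev = e
        u = self.kp * e + self.ki * self.integral + self.kd * d
        return min(self.u_max, max(self.u_min, u))
```

Driving an assumed first-order buck-converter model with this loop at a 200 kHz update rate converges to the voltage setpoint; a hardware version would add anti-windup and fixed-point arithmetic.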
Height Measuring System On Video Using Otsu Method
NASA Astrophysics Data System (ADS)
Sandy, C. L. M.; Meiyanti, R.
2017-01-01
Height measurement compares the magnitude of an object against a standard measuring tool. A remaining problem in such measurement is the use of simple apparatus, one example being a meter stick, which requires a relatively long time. To overcome this problem, this research aims to create image processing software for height measurement. The captured image is then tested: an object captured by the video camera is detected so that its height can be measured using Otsu's method. The system was built using Delphi 7 with the Vision Lab VCL 4.5 component. To increase the quality of the system in future research, the developed system can be combined with other methods.
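Otsu's method, used above to separate the object from the background before measuring it, chooses the gray-level threshold that maximizes the between-class variance of the two resulting pixel classes. A minimal sketch of the standard algorithm:

```python
def otsu_threshold(pixels, levels=256):
    # Exhaustively search for the threshold maximizing between-class variance.
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(levels))
    w_b = sum_b = 0.0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]                 # background weight (levels <= t)
        if w_b == 0:
            continue
        w_f = total - w_b              # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b              # background mean
        m_f = (sum_all - sum_b) / w_f  # foreground mean
        var = w_b * w_f * (m_b - m_f) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Once the frame is binarized at this threshold, the object's pixel height can be converted to real-world units with a known camera calibration factor.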
Study on shear properties of coral sand under cyclic simple shear condition
NASA Astrophysics Data System (ADS)
Ji, Wendong; Zhang, Yuting; Jin, Yafei
2018-05-01
In recent years, ocean development in our country has urgently needed to be accelerated, and the construction of artificial coral reefs has become an important development direction. In this paper, experimental studies of simple shear and cyclic simple shear of coral sand are carried out, and the shear properties and particle breakage of coral sand are analyzed. The results show that the coral sand samples exhibit overall shear failure in the simple shear test, which makes it more accurate and effective for studying particle breakage. The shear displacement corresponding to the peak shear stress in the simple shear test is significantly larger than that in the direct shear test. The degree of particle breakage caused by the simple shear test is significantly related to the normal stress level. The particle breakage of coral sand after the cyclic simple shear test increases markedly compared with that of the simple shear test, and widespread particle breakage occurs over the whole particle size range. Increasing the number of cycles in the cyclic simple shear test results in continuous compaction of the sample, so that the envelope curve of peak shearing force increases with the accumulated shear displacement.
Similarity, not complexity, determines visual working memory performance.
Jackson, Margaret C; Linden, David E J; Roberts, Mark V; Kriegeskorte, Nikolaus; Haenschel, Corinna
2015-11-01
A number of studies have shown that visual working memory (WM) is poorer for complex versus simple items, traditionally accounted for by a higher information load placing greater demands on encoding and storage capacity limits. Other research suggests that it may not be complexity that determines WM performance per se, but rather increased perceptual similarity between complex items as a result of a large amount of overlapping information. Increased similarity is thought to lead to greater comparison errors between items encoded into WM and the test item(s) presented at retrieval. However, previous studies have used different object categories to manipulate complexity and similarity, raising questions as to whether these effects are simply due to cross-category differences. Here, for the first time, the relationship between complexity and similarity in WM is investigated using the same stimulus category (abstract polygons). The authors used a delayed discrimination task to measure WM for 1-4 complex versus simple simultaneously presented items and manipulated the similarity between the single test item at retrieval and the sample items at encoding. WM was poorer for complex than simple items only when the test item was similar to 1 of the encoding items, and not when it was dissimilar or identical. The results provide clear support for a reinterpretation of the complexity effect in WM as a similarity effect and highlight the importance of the retrieval stage in governing WM performance. The authors discuss how these findings can be reconciled with current models of WM capacity limits. (c) 2015 APA, all rights reserved.
Zellner, Eric M; Hedlund, Cheryl S; Kraus, Karl H; Burton, Andrew F; Kieves, Nina R
2016-06-15
OBJECTIVE To compare suture placement time, tension at skin separation and suture line failure, and mode of failure among 4 suture patterns. DESIGN Randomized trial. SAMPLE 60 skin specimens from the pelvic limbs of 30 purpose-bred Beagles. PROCEDURES Skin specimens were harvested within 2 hours after euthanasia and tested within 6 hours after harvest. An 8-cm incision was made in each specimen and sutured with 1 of 4 randomly assigned suture patterns (simple interrupted, cruciate, intradermal, or subdermal). Suture placement time and percentage of skin apposition were evaluated. Specimens were mounted in a calibrated material testing machine and distracted until suture line failure. Tensile strength at skin-edge separation and suture-line failure and mode of failure were compared among the 4 patterns. RESULTS Mean suture placement time for the cruciate pattern was significantly less than that for other patterns. Percentage of skin apposition did not differ among the 4 patterns. Mean tensile strength at skin-edge separation and suture-line failure for the simple interrupted and cruciate patterns were significantly higher than those for the intradermal and subdermal patterns. Mean tensile strength at skin-edge separation and suture-line failure did not differ significantly between the intradermal and subdermal patterns or the simple interrupted and cruciate patterns. The primary mode of failure for the simple interrupted pattern was suture breakage, whereas that for the cruciate, intradermal, and subdermal patterns was tissue failure. CONCLUSIONS AND CLINICAL RELEVANCE Results suggested external skin sutures may be preferred for closure of incisions under tension to reduce risk of dehiscence.
A Flight Control System Architecture for the NASA AirSTAR Flight Test Infrastructure
NASA Technical Reports Server (NTRS)
Murch, Austin M.
2008-01-01
A flight control system architecture for the NASA AirSTAR infrastructure has been designed to address the challenges associated with safe and efficient flight testing of research control laws in adverse flight conditions. The AirSTAR flight control system provides a flexible framework that enables NASA Aviation Safety Program research objectives, and includes the ability to rapidly integrate and test research control laws, emulate component or sensor failures, inject automated control surface perturbations, and provide a baseline control law for comparison to research control laws and to increase operational efficiency. The current baseline control law uses an angle of attack command augmentation system for the pitch axis and simple stability augmentation for the roll and yaw axes.
Visualization of Data Regarding Infections Using Eye Tracking Techniques
Yoon, Sunmoo; Cohen, Bevin; Cato, Kenrick D.; Liu, Jianfang; Larson, Elaine L.
2016-01-01
Objective To evaluate ease of use and usefulness for nurses of visualizations of infectious disease transmission in a hospital. Design An observational study was used to evaluate perceptions of several visualizations of data extracted from electronic health records, designed using a participatory approach. Twelve nurses in the master's program of an urban research-intensive nursing school participated in May 2015. Methods A convergent parallel mixed method was used to evaluate nurses' perceptions of the ease of use and usefulness of five visualizations conveying trends in hospital infection transmission, applying think-aloud, interview, and eye-tracking techniques. Findings Subjective data from the interview and think-aloud techniques indicated that participants preferred the traditional line graphs for simple data representation due to their familiarity, clarity, and ease of reading. An objective quantitative measure from eye movement analysis (444,421 gaze events) indicated that participants attended closely to the infographics in all three scenarios. All participants responded with the correct answer within 1 min in the comprehension tests. Conclusions A user-centric approach was effective in developing and evaluating visualizations for hospital infection transmission. For the visualizations designed by the users, the participants were easily able to comprehend the infection visualizations, on both line graphs and infographics, for simple visualization. The findings from the objective comprehension test and eye movements, and the subjective attitudes, support the feasibility of integrating user-centric visualization designs into electronic health records, which may inspire clinicians to be mindful of hospital infection transmission. Future studies are needed to investigate visualizations and motivation, and the effectiveness of visualization on infection rates.
Clinical Relevance This study designed visualization images using clinical data from electronic health records applying a user-centric approach. The design insights can be applied for visualizing patient data in electronic health records. PMID:27061619
Real-time Human Activity Recognition
NASA Astrophysics Data System (ADS)
Albukhary, N.; Mustafah, Y. M.
2017-11-01
The traditional closed-circuit television (CCTV) system requires humans to monitor the CCTV 24/7, which is inefficient and costly. Therefore, there is a need for a system which can recognize human activity effectively in real time. This paper concentrates on recognizing simple activities such as walking, running, sitting, standing and landing by using image processing techniques. Firstly, object detection is done by using background subtraction to detect moving objects. Then, object tracking and object classification are constructed so that different persons can be differentiated by using feature detection. Geometrical attributes of the tracked object, namely its centroid and aspect ratio, are manipulated so that simple activities can be detected.
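The pipeline described above (background subtraction, then posture cues from blob geometry) can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code; the function names, threshold value, and synthetic frame are assumptions:

```python
import numpy as np

def segment_moving_object(frame, background, threshold=30):
    """Background subtraction: pixels differing from the static
    background by more than `threshold` are foreground."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    return diff > threshold  # boolean foreground mask

def centroid_and_aspect_ratio(mask):
    """Centroid and bounding-box aspect ratio (height/width) of the
    foreground blob; the paper uses such attributes to label activity."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None, None
    cy, cx = ys.mean(), xs.mean()
    h = ys.max() - ys.min() + 1
    w = xs.max() - xs.min() + 1
    return (cy, cx), h / w

# A standing person yields a tall, narrow blob (ratio >> 1).
background = np.zeros((40, 40), dtype=np.uint8)
frame = background.copy()
frame[5:35, 18:22] = 255          # tall vertical blob
mask = segment_moving_object(frame, background)
(cy, cx), ratio = centroid_and_aspect_ratio(mask)
```

Centroid displacement across frames would then distinguish walking from running, while a ratio dropping below 1 suggests a horizontal posture.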
Identification and Illustration of Insecure Direct Object References and their Countermeasures
NASA Astrophysics Data System (ADS)
KumarShrestha, Ajay; Singh Maharjan, Pradip; Paudel, Santosh
2015-03-01
The insecure direct object reference simply represents a flaw in the system design without a full protection mechanism for sensitive system resources or data. It basically occurs when the web application developer provides direct access to objects in accordance with user input. Any attacker can then exploit this web vulnerability and gain access to privileged information by bypassing authorization. The main aim of this paper is to demonstrate the real effect and the identification of insecure direct object references and then to provide feasible preventive solutions such that web applications do not allow direct object references to be manipulated by attackers. The experiment on insecure direct object referencing is carried out using the insecure J2EE web application called WebGoat, and its security testing is performed using another Java-based tool called Burp Suite. The experimental result shows that the access control check for gaining access to privileged information is a very simple problem, but at the same time its correct implementation is a tricky task. The paper finally presents some ways to overcome this web vulnerability.
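A standard countermeasure of the kind such papers recommend is an indirect reference map combined with an explicit access-control check: clients only ever see opaque per-session tokens, and every resolution is still authorized. The sketch below is a generic illustration in Python; the class and names are hypothetical, not code from WebGoat or Burp Suite:

```python
import secrets

class IndirectReferenceMap:
    """Per-session map from opaque tokens to real object IDs, so
    direct keys (e.g. database IDs) never appear in user input."""
    def __init__(self):
        self._token_to_id = {}

    def token_for(self, object_id):
        token = secrets.token_urlsafe(16)   # unguessable handle
        self._token_to_id[token] = object_id
        return token

    def resolve(self, token, user, acl):
        object_id = self._token_to_id.get(token)
        # The mapping lookup alone is not enough: even a valid token
        # must pass an authorization check against the ACL.
        if object_id is None or user not in acl.get(object_id, set()):
            raise PermissionError("access denied")
        return object_id

acl = {"invoice-42": {"alice"}}
refs = IndirectReferenceMap()
t = refs.token_for("invoice-42")
owner_view = refs.resolve(t, "alice", acl)
```

A tampered or replayed token, or a valid token submitted by a different user, both fail the same `resolve` check, which is the point the paper makes: the check is conceptually simple but must be applied on every access.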
A plane mirror experiment inspired by a comic strip
NASA Astrophysics Data System (ADS)
Lúcio Prados Ribeiro, Jair
2016-01-01
A comic strip about a plane mirror was used in a high school optics test, and it was perceived that a large portion of the students believed that the mirror should be larger than the object for the virtual image to be entirely visible. Inspired by the comic strip, an experimental demonstration with flat mirrors was developed in order to readdress the learning of this topic. Students were encouraged to create their own investigation of the phenomenon with a simple instrumental apparatus and also to suggest different experimental approaches.
NASA Astrophysics Data System (ADS)
Tejeda, E.
2018-04-01
We present a simple, analytic model of an incompressible fluid accreting onto a moving gravitating object. This solution allows us to probe the highly subsonic regime of wind accretion. Moreover, it corresponds to the Newtonian limit of a previously known relativistic model of a stiff fluid accreting onto a black hole. Besides filling this gap in the literature, the new solution should be useful as a benchmark test for numerical hydrodynamics codes. Given its simplicity, it can also be used as an illustrative example in a gas dynamics course.
SU-E-I-42: Some New Aspects of the Energy Weighting Technique
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganezer, K; Krmar, M; Josipovic, I
2015-06-15
Purpose: The development in the last few years of photon-counting pixel detectors creates an important and significant opportunity for x-ray spectroscopy to be applied in diagnostics. The energy weighting technique was originally developed to obtain the maximum benefit from the spectroscopic information. In all previously published papers the concept of an energy weighting function was tested on relatively simple test objects having only two materials with different absorption coefficients. Methods: In this study the shape of the energy weighting function was investigated with a set of ten trabecular bone test objects, each with a similar geometry and structure but with different attenuation properties. In previous publications it was determined that the function E^-3 was a very good choice for the weighting function (w_i). Results: The most important result from this study was the discovery that a single function of the form E^-b was not sufficient to explain the energy dependence of the different types of materials that might be used to describe common bodily tissues such as trabecular bone. It was also determined from the data contained in this paper that the exponent b is often significantly dependent upon the attenuation properties of the materials that were used to make the test objects. Conclusion: Studies of the attenuation properties will be useful in further studies involving energy weighting.
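The role of the exponent b can be illustrated numerically: if the optimal weight at energy E follows a power law w(E) = E^-b, then b is recoverable from a log-log fit. The sketch below uses synthetic values (the energy grid and the exact E^-3 law are illustrative assumptions, not the study's measured data):

```python
import numpy as np

def fit_weighting_exponent(energies, weights):
    """Fit w = E**-b by linear regression in log-log space:
    log w = -b * log E + const, so b is minus the slope."""
    slope, _ = np.polyfit(np.log(energies), np.log(weights), 1)
    return -slope

E = np.array([20.0, 30.0, 40.0, 50.0, 60.0])  # keV, illustrative grid
w = E ** -3.0                                 # the E^-3 rule cited above
b = fit_weighting_exponent(E, w)
```

With real measurements for different materials, the fitted b would differ per test object, which is the paper's central observation that a single exponent does not suffice.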
Defining Simple nD Operations Based on Prismatic nD Objects
NASA Astrophysics Data System (ADS)
Arroyo Ohori, K.; Ledoux, H.; Stoter, J.
2016-10-01
An alternative to the traditional approaches to model separately 2D/3D space, time, scale and other parametrisable characteristics in GIS lies in the higher-dimensional modelling of geographic information, in which a chosen set of non-spatial characteristics, e.g. time and scale, are modelled as extra geometric dimensions perpendicular to the spatial ones, thus creating a higher-dimensional model. While higher-dimensional models are undoubtedly powerful, they are also hard to create and manipulate due to our lack of an intuitive understanding of dimensions higher than three. As a solution to this problem, this paper proposes a methodology that makes nD object generation easier by splitting the creation and manipulation process into three steps: (i) constructing simple nD objects based on nD prismatic polytopes (analogous to prisms in 3D), (ii) defining simple modification operations at the vertex level, and (iii) simple postprocessing to fix errors introduced in the model. As a use case, we show how two sets of operations can be defined and implemented in a dimension-independent manner using this methodology: the most common transformations (i.e. translation, scaling and rotation) and the collapse of objects. The nD objects generated in this manner can then be used as a basis for an nD GIS.
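The vertex-level, dimension-independent operations of step (ii) can be sketched directly with NumPy arrays; note that in n dimensions a rotation is defined per coordinate plane (a pair of axes) rather than about an axis. This is an illustrative sketch of the idea, not the authors' implementation:

```python
import numpy as np

def translate(vertices, offset):
    """Translate an (m, n) array of n-D vertices by an n-vector."""
    return vertices + np.asarray(offset)

def scale(vertices, factors):
    """Scale each of the n axes independently."""
    return vertices * np.asarray(factors)

def rotate(vertices, i, j, theta):
    """Rotate in the plane spanned by axes i and j (a Givens
    rotation), which generalizes 2D/3D rotation to any dimension."""
    R = np.eye(vertices.shape[1])
    c, s = np.cos(theta), np.sin(theta)
    R[i, i], R[i, j], R[j, i], R[j, j] = c, -s, s, c
    return vertices @ R.T

# A 4D unit tesseract's 16 vertices, rotated a quarter-turn
# in the plane of the first spatial axis and the fourth dimension.
verts = np.array(np.meshgrid(*[[0, 1]] * 4)).reshape(4, -1).T.astype(float)
out = rotate(verts, 0, 3, np.pi / 2)
```

Because all three functions only read `vertices.shape[1]`, the same code serves 2D, 3D, 4D (space + time), or 5D (space + time + scale) models unchanged.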
Ford, Jennifer Lynn; Green, Joanne Balmer; Lietz, Georg; Oxley, Anthony; Green, Michael H
2017-09-01
Background: Provitamin A carotenoids are an important source of dietary vitamin A for many populations. Thus, accurate and simple methods for estimating carotenoid bioefficacy are needed to evaluate the vitamin A value of test solutions and plant sources. β-Carotene bioefficacy is often estimated from the ratio of the areas under plasma isotope response curves after subjects ingest labeled β-carotene and a labeled retinyl acetate reference dose [isotope reference method (IRM)], but to our knowledge, the method has not yet been evaluated for accuracy. Objectives: Our objectives were to develop and test a physiologically based compartmental model that includes both absorptive and postabsorptive β-carotene bioconversion and to use the model to evaluate the accuracy of the IRM and a simple plasma retinol isotope ratio [(RIR), labeled β-carotene-derived retinol/labeled reference-dose-derived retinol in one plasma sample] for estimating relative bioefficacy. Methods: We used model-based compartmental analysis (Simulation, Analysis and Modeling software) to develop and apply a model that provided known values for β-carotene bioefficacy. Theoretical data for 10 subjects were generated by the model and used to determine bioefficacy by RIR and IRM; predictions were compared with known values. We also applied RIR and IRM to previously published data. Results: Plasma RIR accurately predicted β-carotene relative bioefficacy at 14 d or later. IRM also accurately predicted bioefficacy by 14 d, except that, when there was substantial postabsorptive bioconversion, IRM underestimated bioefficacy. Based on our model, 1-d predictions of relative bioefficacy include absorptive plus a portion of early postabsorptive conversion. Conclusion: The plasma RIR is a simple tracer method that accurately predicts β-carotene relative bioefficacy based on analysis of one blood sample obtained at ≥14 d after co-ingestion of labeled β-carotene and retinyl acetate. 
The method also provides information about the contributions of absorptive and postabsorptive conversion to total bioefficacy if an additional sample is taken at 1 d. © 2017 American Society for Nutrition.
Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira
2015-01-01
Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typical complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated based on responses evoked from the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Even with an incorrectly specified template, Waveform Similarity Analysis outperformed Trained Eye Analysis for predicting signal amplitude, but produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when the signal-to-noise ratio is poor.
Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies. PMID:26325291
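The core of a cross-correlation template comparison of this kind can be sketched as follows; the function names, the least-squares magnitude estimate, and the synthetic signal are illustrative assumptions, not the published Waveform Similarity Analysis algorithm:

```python
import numpy as np

def detect_event(signal, template):
    """Slide the template along the signal via cross-correlation;
    the peak location gives the event latency (in samples) and the
    peak value, normalized by template energy, gives a least-squares
    amplitude estimate."""
    xc = np.correlate(signal, template, mode="valid")
    latency = int(np.argmax(np.abs(xc)))
    magnitude = xc[latency] / np.dot(template, template)
    return latency, magnitude

# Synthetic "compound action potential": one sine cycle embedded
# in noise at a known latency and amplitude.
template = np.sin(np.linspace(0, 2 * np.pi, 50))
signal = np.zeros(300)
signal[120:170] += 0.8 * template          # event at sample 120, amp 0.8
rng = np.random.default_rng(0)
signal += 0.05 * rng.standard_normal(300)  # measurement noise
lat, mag = detect_event(signal, template)
```

Running several templates (or one template repeatedly over residuals) would extend this to the clustered, dispersed events described in the abstract.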
Tinsley, Chris J.; Fontaine-Palmer, Nadine S.; Vincent, Maria; Endean, Emma P.E.; Aggleton, John P.; Brown, Malcolm W.; Warburton, E. Clea
2011-01-01
The roles of muscarinic and nicotinic cholinergic receptors in perirhinal cortex in object recognition memory were compared. Rats' discrimination of novel objects in a novel object preference (NOP) test was measured after either systemic or local infusion into the perirhinal cortex of the nicotinic receptor antagonist methyllycaconitine (MLA), which targets alpha-7 (α7) amongst other nicotinic receptors, or the muscarinic receptor antagonists scopolamine, AFDX-384, and pirenzepine. Methyllycaconitine administered systemically or intraperirhinally before acquisition impaired recognition memory tested after a 24-h, but not a 20-min delay. In contrast, all three muscarinic antagonists produced a similar, unusual pattern of impairment with amnesia after a 20-min delay, but remembrance after a 24-h delay. Thus, the amnesic effects of nicotinic and muscarinic antagonism were doubly dissociated across the 20-min and 24-h delays. The same pattern of shorter-term but not longer-term memory impairment was found for scopolamine whether the object preference test was carried out in a square arena or a Y-maze and whether rats of the Dark Agouti or Lister-hooded strains were used. Coinfusion of MLA and either scopolamine or AFDX-384 produced an impairment profile matching that for MLA. Hence, the antagonists did not act additively when coadministered. These findings establish an important role in recognition memory for both nicotinic and muscarinic cholinergic receptors in perirhinal cortex, and provide a challenge to simple ideas about the role of cholinergic processes in recognition memory: The effects of muscarinic and nicotinic antagonism are neither independent nor additive. PMID:21693636
Freezer or non-freezer: clinical assessment of freezing of gait.
Snijders, Anke H; Haaxma, Charlotte A; Hagen, Yolien J; Munneke, Marten; Bloem, Bastiaan R
2012-02-01
Freezing of gait (FOG) is both common and debilitating in patients with Parkinson's disease (PD). Future pathophysiology studies will depend critically upon adequate classification of patients as being either 'freezers' or 'non-freezers'. This classification should ideally be based upon objective confirmation by an experienced observer during clinical assessment. Given the known difficulties in eliciting FOG when examining patients, we aimed to investigate which simple clinical test would be the most sensitive to provoke FOG objectively. We examined 50 patients with PD, including 32 off-state freezers (defined as experiencing subjective 'gluing of the feet to the floor'). Assessment included a FOG trajectory (three trials: normal speed, fast speed, and with dual tasking) and several turning variants (180° vs. 360° turns; leftward vs. rightward turns; wide vs. narrow turning; and slow vs. fast turns). Sensitivity of the entire assessment to provoke FOG in subjective freezers was 0.74; specificity was 0.94. The most effective test to provoke FOG was rapid 360° turns in both directions and, if negative, combined with a gait trajectory with dual tasking. Repeated testing improved the diagnostic yield. The least informative tests included wide turns, 180° turns or normal-speed full turns. Sensitivity to provoke objective FOG in subjective freezers was 0.65 for the rapid full turns in both directions and 0.63 for the FOG trajectory. The most efficient way to objectively ascertain FOG is asking patients to repeatedly make rapid 360° narrow turns from standstill, on the spot and in both directions. Copyright © 2011 Elsevier Ltd. All rights reserved.
Fabrication of wrist-like SMA-based actuator by double smart soft composite casting
NASA Astrophysics Data System (ADS)
Rodrigue, Hugo; Wei, Wang; Bhandari, Binayak; Ahn, Sung-Hoon
2015-12-01
A new manufacturing method for smart soft composite (SSC) actuators that consists of double casting an SSC actuator to produce an actuator with non-linear shape memory alloy (SMA) wire positioning is proposed. This method is used to manufacture a tube-shaped SSC actuator in which the SMA wires follow the curvature of the tube and which is capable of pure-twisting deformations while sustaining a cantilever load. The concept is tested by measuring the maximum twisting angle, and a simple control method is proposed to control the twisting angle of the actuator. Then, a soft robotic wrist with a length of 18 cm is built; its load-carrying capability is tested by measuring the cantilever force required to deform the actuator, and its load-carrying capability during actuation is tested by loading one end with different objects and actuating the actuator. This wrist actuator shows good repeatability, is capable of twisting deformations up to 25° while holding objects weighing 100 g, and can sustain loads above 2 N without undergoing buckling.
Non-symbolic halving in an Amazonian indigene group
McCrink, Koleen; Spelke, Elizabeth S.; Dehaene, Stanislas; Pica, Pierre
2014-01-01
Much research supports the existence of an Approximate Number System (ANS) that is recruited by infants, children, adults, and non-human animals to generate coarse, non-symbolic representations of number. This system supports simple arithmetic operations such as addition, subtraction, and ordering of amounts. The current study tests whether an intuition of a more complex calculation, division, exists in an indigene group in the Amazon, the Mundurucu, whose language includes no words for large numbers. Mundurucu children were presented with a video event depicting a division transformation of halving, in which pairs of objects turned into single objects, reducing the array's numerical magnitude. Then they were tested on their ability to calculate the outcome of this division transformation with other large-number arrays. The Mundurucu children effected this transformation even when non-numerical variables were controlled, performed above chance levels on the very first set of test trials, and exhibited performance similar to urban children who had access to precise number words and a surrounding symbolic culture. We conclude that a halving calculation is part of the suite of intuitive operations supported by the ANS. PMID:23587042
Simple and Effective Algorithms: Computer-Adaptive Testing.
ERIC Educational Resources Information Center
Linacre, John Michael
Computer-adaptive testing (CAT) allows improved security, greater scoring accuracy, shorter testing periods, quicker availability of results, and reduced guessing and other undesirable test behavior. Simple approaches can be applied by the classroom teacher, or other content specialist, who possesses simple computer equipment and elementary…
An object location memory paradigm for older adults with and without mild cognitive impairment.
Külzow, Nadine; Kerti, Lucia; Witte, Veronica A; Kopp, Ute; Breitenstein, Caterina; Flöel, Agnes
2014-11-30
Object-location memory is critical in everyday life and known to deteriorate early in the course of neurodegenerative disease. We adapted the previously established learning paradigm "LOCATO" for use in healthy older adults and patients with mild cognitive impairment (MCI). Pictures of real-life buildings were associated with positions on a two-dimensional street map by repetitions of "correct" object-location pairings over the course of five training blocks, followed by a recall task. Correct/incorrect associations were indicated by button presses. The original two 45-item sets were reduced to two 15-item sets and tested in healthy older adults and MCI patients for learning curve, recall, and re-test effects. The two 15-item versions showed comparable learning curves and recall scores within each group. While learning curves increased linearly in both groups, MCI patients performed significantly worse on learning and recall compared to healthy controls. Re-testing after 6 months showed only small practice effects. LOCATO is a simple standardized task that overcomes several limitations of previously employed visuospatial tasks by using real-life stimuli, minimizing verbal encoding, avoiding fine motor responses, combining explicit and implicit statistical learning, and allowing assessment of the learning curve in addition to recall. Results show that the shortened version of LOCATO meets the requirements for a robust and ecologically meaningful assessment of object-location memory in older adults with and without MCI. It can now be used to systematically assess acquisition of object-location memory and its modulation through adjuvant therapies like pharmacological or non-invasive brain stimulation. Copyright © 2014 Elsevier B.V. All rights reserved.
Mobility Lab to Assess Balance and Gait with Synchronized Body-worn Sensors
Mancini, Martina; King, Laurie; Salarian, Arash; Holmstrom, Lars; McNames, James; Horak, Fay B
2014-01-01
This paper is a commentary to introduce how rehabilitation professionals can use a new, body-worn sensor system to obtain objective measures of balance and gait. Current assessments of balance and gait in clinical rehabilitation are largely limited to subjective scales, simple stop-watch measures, or complex, expensive machines that are not practical or widely available. Although accelerometers and gyroscopes have been shown to accurately quantify many aspects of gait and balance kinematics, only recently has a comprehensive, portable system become available for clinicians. By measuring body motion during tests that clinicians are already performing, such as the Timed Up and Go test (TUG) and the Clinical Test of Sensory Integration for Balance (CITSIB), the additional time for assessment is minimal. Because the system provides instant analysis of balance and gait and compares a patient's performance to age-matched control values, therapists receive an objective, sensitive screening profile of balance and gait strategies. This motion screening profile can be used to identify mild abnormalities not obvious with traditional clinical testing, measure small changes due to rehabilitation, and design customized rehabilitation programs for each individual's specific balance and gait deficits. PMID:24955286
Industrial inspection of specular surfaces using a new calibration procedure
NASA Astrophysics Data System (ADS)
Aswendt, Petra; Hofling, Roland; Gartner, Soren
2005-06-01
The methodology of phase-encoded reflection measurements has become a valuable tool for the industrial inspection of components with glossy surfaces. The measuring principle provides outstanding sensitivity to tiny variations of surface curvature, so that sub-micron waviness and flaws are reliably detected. Quantitative curvature measurements can be obtained from a simple approach if the object is almost flat. 3D objects with a high aspect ratio require more effort to determine both the coordinates and the normal direction of a surface point unambiguously. Stereoscopic solutions have been reported using more than one camera for a certain surface area. This paper will describe the combined double camera steady surface approach (DCSS) that is well suited for implementation in industrial testing stations.
Prevalence of Sickle Cell Trait in the Southern Suburb of Beirut, Lebanon
El Ariss, Abdel Badih; Younes, Mohamad; Matar, Jad; Berjaoui, Zeina
2016-01-01
Objective The objective of this study was to assess the prevalence, gender differences, and time trends of Sickle Cell Trait in the Southern Suburb of Beirut, Lebanon, as well as to highlight the importance of screening for Sickle Cell Trait carriers in this population. Another objective was to describe a new screening technique for Sickle Cell Trait carriers. Methods This was a retrospective cohort study carried out at a private laboratory in the Southern Suburb of Beirut, Lebanon between 2002 and 2014. The sickling test was carried out for each patient using two methods: the classical "sodium metabisulfite sickling test" and the new "sickling test method" used in the private lab. As a confirmatory test, hemoglobin electrophoresis was run on a random sample of 223 cases which were found to be positive using the two sickling tests. Results A total of 899 cases were found to be positive for the sickle cell trait out of 184,105 subjects screened during the 12-year period, prevalence = 0.49% (95% CI: 0.46 – 0.52). Among the total sample, females were found to have a higher prevalence, while no time trend over the studied period was noted. The hemoglobin electrophoresis method confirmed the results of this new sickling test technique among the random sample of 223 cases. Conclusion We found that the prevalence of sickle cell trait is lower than in other Arab countries and higher in females, with no significant time trend. The sickle cell test was found to be an accurate, simple and cheap test that could easily be added as a requirement for pre-marital testing to screen for Sickle Cell Trait carriers. PMID:26977274
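The reported prevalence and its 95% confidence interval can be reproduced from the counts given in the abstract with a normal-approximation (Wald) interval:

```python
import math

# 899 positive cases out of 184,105 screened subjects.
cases, screened = 899, 184105
p = cases / screened                       # point estimate
se = math.sqrt(p * (1 - p) / screened)     # standard error
low, high = p - 1.96 * se, p + 1.96 * se   # 95% Wald interval
# p ≈ 0.49%, CI ≈ (0.46%, 0.52%), matching the abstract.
```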
Object Persistence Enhances Spatial Navigation: A Case Study in Smartphone Vision Science.
Liverence, Brandon M; Scholl, Brian J
2015-07-01
Violations of spatiotemporal continuity disrupt performance in many tasks involving attention and working memory, but experiments on this topic have been limited to the study of moment-by-moment on-line perception, typically assessed by passive monitoring tasks. We tested whether persisting object representations also serve as underlying units of longer-term memory and active spatial navigation, using a novel paradigm inspired by the visual interfaces common to many smartphones. Participants used key presses to navigate through simple visual environments consisting of grids of icons (depicting real-world objects), only one of which was visible at a time through a static virtual window. Participants found target icons faster when navigation involved persistence cues (via sliding animations) than when persistence was disrupted (e.g., via temporally matched fading animations), with all transitions inspired by smartphone interfaces. Moreover, this difference occurred even after explicit memorization of the relevant information, which demonstrates that object persistence enhances spatial navigation in an automatic and irresistible fashion. © The Author(s) 2015.
Humanoid Robotics: Real-Time Object Oriented Programming
NASA Technical Reports Server (NTRS)
Newton, Jason E.
2005-01-01
Programming of robots in today's world is often done in a procedural oriented fashion, where object oriented programming is not incorporated. In order to keep a robust architecture allowing for easy expansion of capabilities and a truly modular design, object oriented programming is required. However, concepts in object oriented programming are not typically applied to a real time environment. The Fujitsu HOAP-2 is the test bed for the development of a humanoid robot framework abstracting control of the robot into simple logical commands in a real time robotic system while allowing full access to all sensory data. In addition to interfacing between the motor and sensory systems, this paper discusses the software which operates multiple independently developed control systems simultaneously and the safety measures which keep the humanoid from damaging itself and its environment while running these systems. The use of this software decreases development time and costs and allows changes to be made while keeping results safe and predictable.
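The two ideas in this abstract, abstracting control into simple logical commands and interposing safety measures before the motor system, can be sketched as follows. All class, joint, and limit names are hypothetical illustrations, not the actual HOAP-2 framework API:

```python
class JointCommand:
    """One logical command in a hypothetical humanoid framework:
    higher layers issue targets in degrees, without touching the
    motor system directly."""
    def __init__(self, joint, target_deg):
        self.joint = joint
        self.target_deg = target_deg

class SafetyLayer:
    """Clamps every command to configured joint limits before it
    reaches the hardware, keeping results safe and predictable."""
    def __init__(self, limits):
        self.limits = limits  # joint name -> (min_deg, max_deg)

    def filter(self, cmd):
        lo, hi = self.limits[cmd.joint]
        cmd.target_deg = max(lo, min(hi, cmd.target_deg))
        return cmd

safety = SafetyLayer({"elbow": (-10.0, 120.0)})
clamped = safety.filter(JointCommand("elbow", 150.0))   # clamped to 120.0
passed = safety.filter(JointCommand("elbow", 45.0))     # unchanged
```

Keeping the safety layer separate from the command abstraction is what lets independently developed control systems run simultaneously without endangering the robot.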
Instrument Package Manipulation Through the Generation and Use of an Attenuated-Fluent Gas Fold
NASA Technical Reports Server (NTRS)
Breen, Daniel P.
2012-01-01
This document discusses a technique that provides a means for suspending large, awkward loads, instrument packages, components, and machinery in a stable, controlled, and precise manner. In the baseplate of the test machine, a pattern of grooves and ports is installed that, when pressurized, generates an attenuated-fluent gas fold providing a low-cost, near-zero-coefficient-of-friction lubrication boundary layer that supports the object evenly and in a predictable manner. Package movement control requires minimal force. Aids to repeatable travel and positional accuracy can be added via simple guide bars and stops on the floor or the object being moved. This allows easily regulated three-axis motions. Loads of extreme weight and size can be moved and guided by a single person, or by automated means, using minimal force. Upon removal of the attenuated-fluent gas fold, the object returns to a stable resting position without impact forces affecting the object.
Application of shift-and-add algorithms for imaging objects within biological media
NASA Astrophysics Data System (ADS)
Aizert, Avishai; Moshe, Tomer; Abookasis, David
2017-01-01
The Shift-and-Add (SAA) technique is a simple mathematical operation developed to reconstruct, at high spatial resolution, atmospherically degraded solar images obtained from stellar speckle interferometry systems. This method shifts and assembles individual degraded short-exposure images into a single average image with significantly improved contrast and detail. Since the inhomogeneous refractive indices of biological tissue cause light scattering similar to that induced by optical turbulence in the atmospheric layers, we assume that SAA methods can be successfully implemented to reconstruct the image of an object within a scattering biological medium. To test this hypothesis, five SAA algorithms were evaluated for reconstructing images acquired from multiple viewpoints. After successfully retrieving the hidden object's shape, quantitative image quality metrics were derived, enabling comparison of imaging error across a spectrum of layer thicknesses and demonstrating the relative efficacy of each SAA algorithm for biological imaging.
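A minimal version of the classic shift-and-add operation aligns each short-exposure frame on its brightest pixel and averages the aligned stack. This is a generic sketch of the basic technique on synthetic data, not one of the five algorithm variants evaluated in the paper:

```python
import numpy as np

def shift_and_add(frames):
    """Classic SAA: shift each frame so its brightest pixel lands at
    a common reference position, then average the stack. The random
    displacements cancel while the object reinforces."""
    ref = np.array(frames[0].shape) // 2
    acc = np.zeros_like(frames[0], dtype=float)
    for f in frames:
        peak = np.unravel_index(np.argmax(f), f.shape)
        acc += np.roll(f, tuple(ref - np.array(peak)), axis=(0, 1))
    return acc / len(frames)

# A point-like object randomly displaced in each noisy frame,
# mimicking speckle motion.
rng = np.random.default_rng(1)
frames = []
for _ in range(20):
    img = 0.2 * rng.random((33, 33))       # background noise
    dy, dx = rng.integers(-5, 6, size=2)   # random displacement
    img[16 + dy, 16 + dx] += 1.0           # bright speckle
    frames.append(img)
result = shift_and_add(frames)
```

After averaging, the aligned peak stands out at the frame center while the uncorrelated background flattens toward its mean, which is the contrast gain the abstract describes.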
ERIC Educational Resources Information Center
Shih, Ching-Hsiang
2011-01-01
This study assessed whether two persons with developmental disabilities would be able to actively perform simple occupational activities by controlling their favorite environmental stimulation using battery-free wireless mice with a newly developed object location detection program (OLDP, i.e., a new software program turning a battery-free…
Combining local and global limitations of visual search.
Põder, Endel
2017-04-01
There are different opinions about the roles of local interactions and central processing capacity in visual search. This study attempts to clarify the problem using a new version of relevant set cueing. A central precue indicates two symmetrical segments (that may contain a target object) within a circular array of objects presented briefly around the fixation point. The number of objects in the relevant segments, and density of objects in the array were varied independently. Three types of search experiments were run: (a) search for a simple visual feature (color, size, and orientation); (b) conjunctions of simple features; and (c) spatial configuration of simple features (rotated Ts). For spatial configuration stimuli, the results were consistent with a fixed global processing capacity and standard crowding zones. For simple features and their conjunctions, the results were different, dependent on the features involved. While color search exhibits virtually no capacity limits or crowding, search for an orientation target was limited by both. Results for conjunctions of features can be partly explained by the results from the respective features. This study shows that visual search is limited by both local interference and global capacity, and the limitations are different for different visual features.
Portable design rules for bulk CMOS
NASA Technical Reports Server (NTRS)
Griswold, T. W.
1982-01-01
It is pointed out that for the past several years, one school of IC designers has used a simplified set of nMOS geometric design rules (GDR) which is 'portable', in that it can be used by many different nMOS manufacturers. The present investigation is concerned with a preliminary set of design rules for bulk CMOS which has been verified for simple test structures. The GDR are defined in terms of Caltech Intermediate Form (CIF), which is a geometry-description language that defines simple geometrical objects in layers. The layers are abstractions of physical mask layers. The design rules do not presume the existence of any particular design methodology. Attention is given to p-well and n-well CMOS processes, bulk CMOS and CMOS-SOS, CMOS geometric rules, and a description of the advantages of CMOS technology.
Tutorial on Fourier space coverage for scattering experiments, with application to SAR
NASA Astrophysics Data System (ADS)
Deming, Ross W.
2010-04-01
The Fourier Diffraction Theorem relates the data measured during electromagnetic, optical, or acoustic scattering experiments to the spatial Fourier transform of the object under test. The theorem is well-known, but since it is based on integral equations and complicated mathematical expansions, the typical derivation may be difficult for the non-specialist. In this paper, the theorem is derived and presented using simple geometry, plus undergraduate-level physics and mathematics. For practitioners of synthetic aperture radar (SAR) imaging, the theorem is important to understand because it leads to a simple geometric and graphical understanding of image resolution and sampling requirements, and how they are affected by radar system parameters and experimental geometry. Also, the theorem can be used as a starting point for imaging algorithms and motion compensation methods. Several examples are given in this paper for realistic scenarios.
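The practical payoff of the Fourier-coverage picture is that image resolution follows directly from the extent of the measured Fourier-space region. A hedged sketch of the standard small-angle relations (the bandwidth, centre frequency, and aperture angle below are illustrative values, not from the paper): slant-range resolution is set by the transmitted bandwidth, cross-range resolution by the angular aperture spanned during the collection.

```python
# Resolution limits implied by Fourier-space coverage (small-angle forms).
C = 3.0e8  # speed of light, m/s

def range_resolution(bandwidth_hz):
    """Slant-range resolution set by waveform bandwidth: c / (2B)."""
    return C / (2.0 * bandwidth_hz)

def cross_range_resolution(center_freq_hz, aperture_angle_rad):
    """Cross-range resolution set by the angular aperture: lambda / (2*dtheta)."""
    lam = C / center_freq_hz
    return lam / (2.0 * aperture_angle_rad)

dr = range_resolution(600e6)               # 600 MHz chirp -> 0.25 m
dcr = cross_range_resolution(10e9, 0.05)   # X-band, ~2.9 deg aperture -> 0.3 m
```

Widening either the bandwidth or the collection angle enlarges the covered Fourier region and sharpens the corresponding image dimension.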
Security Applications Of Computer Motion Detection
NASA Astrophysics Data System (ADS)
Bernat, Andrew P.; Nelan, Joseph; Riter, Stephen; Frankel, Harry
1987-05-01
An important area of application of computer vision is the detection of human motion in security systems. This paper describes the development of a computer vision system which can detect and track human movement across the international border between the United States and Mexico. Because of the wide range of environmental conditions, this application represents a stringent test of computer vision algorithms for motion detection and object identification. The desired output of this vision system is accurate, real-time locations for individual aliens and accurate statistical data as to the frequency of illegal border crossings. Because most detection and tracking routines assume rigid body motion, which is not characteristic of humans, new algorithms capable of reliable operation in our application are required. Furthermore, most current detection and tracking algorithms assume a uniform background against which motion is viewed - the urban environment along the US-Mexican border is anything but uniform. The system works in three stages: motion detection, object tracking and object identification. We have implemented motion detection using simple frame differencing, maximum likelihood estimation, mean and median tests, and are evaluating them for accuracy and computational efficiency. Due to the complex nature of the urban environment (background and foreground objects consisting of buildings, vegetation, vehicles, wind-blown debris, animals, etc.), motion detection alone is not sufficiently accurate. Object tracking and identification are handled by an expert system which takes shape, location and trajectory information as input and determines if the moving object is indeed representative of an illegal border crossing.
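The first stage, frame differencing, is simple enough to sketch directly: pixels whose intensity changes between consecutive frames by more than a threshold are flagged as motion. A minimal numpy version (threshold and frame sizes are illustrative, not values from the deployed system):

```python
import numpy as np

def detect_motion(prev, curr, threshold=25):
    """Flag pixels whose absolute intensity change exceeds a threshold;
    return a binary motion mask and the count of changed pixels."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    mask = diff > threshold
    return mask, int(mask.sum())

# Static background with a small bright "moving object" in the second frame
bg = np.full((48, 64), 100, dtype=np.uint8)
frame2 = bg.copy()
frame2[20:24, 30:34] = 200            # 4x4 patch of changed pixels
mask, changed = detect_motion(bg, frame2)
```

The cast to a signed type before subtracting avoids unsigned-integer wraparound; in a cluttered scene this raw mask would then feed the tracking and identification stages described above.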
Characterization of echoes: A Dyson-series representation of individual pulses
NASA Astrophysics Data System (ADS)
Correia, Miguel R.; Cardoso, Vitor
2018-04-01
The ability to detect and scrutinize gravitational waves from the merger and coalescence of compact binaries opens up the possibility to perform tests of fundamental physics. One such test concerns the dark nature of compact objects: are they really black holes? It was recently pointed out that the absence of horizons—while keeping the external geometry very close to that of General Relativity—would manifest itself in a series of echoes in gravitational wave signals. The observation of echoes by LIGO/Virgo or upcoming facilities would likely inform us on quantum gravity effects or unseen types of matter. Detection of such signals is in principle feasible with relatively simple tools but would benefit enormously from accurate templates. Here we analytically individualize each echo waveform and show that it can be written as a Dyson series, for arbitrary effective potential and boundary conditions. We further apply the formalism to explicitly determine the echoes of a simple toy model: the Dirac delta potential. Our results allow one to read off a few known features of echoes and may find application in modeling for data analysis.
Experimental Demonstration and Circuitry for a Very Compact Coil-Only Pulse Echo EMAT
Rueter, Dirk
2017-01-01
This experimental study demonstrates for the first time a solid-state circuitry and design for a simple compact copper coil (without an additional bulky permanent magnet or bulky electromagnet) as a contactless electromagnetic acoustic transducer (EMAT) for pulse echo operation at MHz frequencies. A pulsed ultrasound emission into a metallic test object is electromagnetically excited by an intense MHz burst at up to 500 A through the 0.15 mm filaments of the transducer. Immediately thereafter, a smoother and quasi “DC-like” current of 100 A is applied for about 1 ms and allows an echo detection. The ultrasonic pulse echo operation for a simple, compact, non-contacting copper coil is new. Application scenarios for compact transducer techniques include very narrow and hostile environments, in which, e.g., quickly moving metal parts must be tested with only one, non-contacting ultrasound shot. The small transducer coil can be operated remotely with a cable connection, separate from the much bulkier supply circuitry. Several options for more technical and fundamental progress are discussed. PMID:28441722
Prototyping and Simulation of Robot Group Intelligence using Kohonen Networks.
Wang, Zhijun; Mirdamadi, Reza; Wang, Qing
2016-01-01
Intelligent agents such as robots can form ad hoc networks and replace human beings in many dangerous scenarios, such as a complicated disaster relief site. This project prototypes and builds a computer simulator to simulate robot kinetics, unsupervised learning using Kohonen networks, as well as group intelligence when an ad hoc network is formed. Each robot is modeled using an object with a simple set of attributes and methods that define its internal states and the possible actions it may take under certain circumstances. As a result, simple, reliable, and affordable robots can be deployed to form the network. The simulator simulates a group of robots as an unsupervised learning unit and tests the learning results under scenarios with different complexities. The simulation results show that a group of robots could demonstrate highly collaborative behavior on a complex terrain. This study could potentially provide a software simulation platform for testing the individual and group capability of robots before the design process and manufacturing of robots. Therefore, results of the project have the potential to reduce the cost and improve the efficiency of robot design and building. PMID:28540284
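As an illustration of the unsupervised learning component, here is a minimal 1-D Kohonen (self-organizing map) training loop in numpy. The network size, learning-rate and neighbourhood schedules, and the toy two-cluster data are assumptions for the sketch, not values from the project.

```python
import numpy as np

def train_som(data, n_units=8, epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Train a 1-D Kohonen map: move the best-matching unit (BMU) and its
    lattice neighbours toward each input sample, with decaying learning
    rate and neighbourhood width."""
    rng = np.random.default_rng(seed)
    w = rng.random((n_units, data.shape[1]))          # random initial weights
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1 - frac)                         # linear decay
        sigma = max(sigma0 * (1 - frac), 0.5)
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))
            dist = np.abs(np.arange(n_units) - bmu)   # lattice distance to BMU
            h = np.exp(-(dist ** 2) / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)
    return w

# Two well-separated 2-D clusters; map units should settle near the centres
data = np.vstack([np.full((20, 2), 0.1), np.full((20, 2), 0.9)])
weights = train_som(data)
```

After training, different regions of the map specialize on different clusters, which is the mechanism by which a robot group could organize sensor input without supervision.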
Johnson, Philip J.; Berhane, Sarah; Kagebayashi, Chiaki; Satomura, Shinji; Teng, Mabel; Reeves, Helen L.; O'Beirne, James; Fox, Richard; Skowronska, Anna; Palmer, Daniel; Yeo, Winnie; Mo, Frankie; Lai, Paul; Iñarrairaegui, Mercedes; Chan, Stephen L.; Sangro, Bruno; Miksad, Rebecca; Tada, Toshifumi; Kumada, Takashi; Toyoda, Hidenori
2015-01-01
Purpose Most patients with hepatocellular carcinoma (HCC) have associated chronic liver disease, the severity of which is currently assessed by the Child-Pugh (C-P) grade. In this international collaboration, we identify objective measures of liver function/dysfunction that independently influence survival in patients with HCC and then combine these into a model that could be compared with the conventional C-P grade. Patients and Methods We developed a simple model to assess liver function, based on 1,313 patients with HCC of all stages from Japan, that involved only serum bilirubin and albumin levels. We then tested the model using similar cohorts from other geographical regions (n = 5,097) and other clinical situations (patients undergoing resection [n = 525] or sorafenib treatment for advanced HCC [n = 1,132]). The specificity of the model for liver (dys)function was tested in patients with chronic liver disease but without HCC (n = 501). Results The model, the Albumin-Bilirubin (ALBI) grade, performed at least as well as the C-P grade in all geographic regions. The majority of patients with HCC had C-P grade A disease at presentation, and within this C-P grade, ALBI revealed two classes with clearly different prognoses. Its utility in patients with chronic liver disease alone supported the contention that the ALBI grade was indeed an index of liver (dys)function. Conclusion The ALBI grade offers a simple, evidence-based, objective, and discriminatory method of assessing liver function in HCC that has been extensively tested in an international setting. This new model eliminates the need for subjective variables such as ascites and encephalopathy, a requirement in the conventional C-P grade. PMID:25512453
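The published model reduces to a linear predictor in log bilirubin and albumin. A small sketch using the coefficients and grade cut-points commonly cited for the ALBI model (bilirubin in µmol/L, albumin in g/L); the constants are taken as reported, not re-derived here, and the example values are illustrative:

```python
import math

def albi_score(bilirubin_umol_l, albumin_g_l):
    """ALBI linear predictor: 0.66*log10(bilirubin) - 0.085*albumin."""
    return 0.66 * math.log10(bilirubin_umol_l) - 0.085 * albumin_g_l

def albi_grade(score):
    """Reported cut-points: <= -2.60 -> grade 1; <= -1.39 -> grade 2; else 3."""
    if score <= -2.60:
        return 1
    if score <= -1.39:
        return 2
    return 3

# Example: bilirubin 10 umol/L, albumin 40 g/L (well-preserved liver function)
s = albi_score(10, 40)       # 0.66*1 - 3.4 = -2.74
g = albi_grade(s)            # grade 1
```

Because both inputs are objective laboratory values, the grade requires no assessment of ascites or encephalopathy, which is the point of the model.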
A critique of Rasch residual fit statistics.
Karabatsos, G
2000-01-01
In test analysis involving the Rasch model, a large degree of importance is placed on the "objective" measurement of individual abilities and item difficulties. The degree to which the objectivity properties are attained, of course, depends on the degree to which the data fit the Rasch model. It is therefore important to utilize fit statistics that accurately and reliably detect the person-item response inconsistencies that threaten the measurement objectivity of persons and items. Given this argument, it is somewhat surprising that far more emphasis is placed on the objective measurement of persons and items than on the measurement quality of Rasch fit statistics. This paper provides a critical analysis of the residual fit statistics of the Rasch model, arguably the most often used fit statistics, in an effort to illustrate that the task of Rasch fit analysis is not as simple and straightforward as it appears to be. The faulty statistical properties of the residual fit statistics do not allow either a convenient or a straightforward approach to Rasch fit analysis. For instance, given a residual fit statistic, the use of a single minimum critical value for misfit diagnosis across different testing situations, where the situations vary in sample and test properties, leads to both the overdetection and underdetection of misfit. To improve this situation, it is argued that psychometricians need to implement residual-free Rasch fit statistics that are based on the number of Guttman response errors, or use indices that are statistically optimal in detecting measurement disturbances.
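For concreteness, the residual fit statistics under critique are typically computed as follows for a dichotomous Rasch model: standardized residuals are squared and then averaged, unweighted for the outfit mean-square and information-weighted for the infit mean-square. This is the generic textbook formulation, not this paper's notation, and the ability/difficulty values are illustrative.

```python
import math

def rasch_p(theta, b):
    """Rasch probability of a correct response: P = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def fit_statistics(responses, theta, difficulties):
    """Outfit (unweighted) and infit (information-weighted) mean-squares
    for one person's dichotomous response string."""
    z2, num, den = [], 0.0, 0.0
    for x, b in zip(responses, difficulties):
        p = rasch_p(theta, b)
        var = p * (1 - p)                 # binomial information
        z2.append((x - p) ** 2 / var)     # squared standardized residual
        num += (x - p) ** 2
        den += var
    return sum(z2) / len(z2), num / den   # (outfit MS, infit MS)

# A perfectly Guttman-consistent pattern on an easy-to-hard five-item test
outfit, infit = fit_statistics([1, 1, 1, 0, 0], theta=0.0,
                               difficulties=[-2, -1, 0, 1, 2])
```

For this overly deterministic (Guttman) pattern both mean-squares fall below their expected value of 1, illustrating why a single critical value cannot serve every sample and test configuration.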
Evaluating a NoSQL Alternative for Chilean Virtual Observatory Services
NASA Astrophysics Data System (ADS)
Antognini, J.; Araya, M.; Solar, M.; Valenzuela, C.; Lira, F.
2015-09-01
Currently, the standards and protocols for data access in the Virtual Observatory architecture (DAL) are generally implemented with relational databases based on SQL. In particular, the Astronomical Data Query Language (ADQL), the language used by the IVOA to represent queries to VO services, was created to satisfy the different data access protocols, such as Simple Cone Search. ADQL is based on SQL92, with extra functionality implemented using PgSphere. An emergent alternative to SQL is the family of so-called NoSQL databases, which can be classified into several categories such as Column, Document, Key-Value, Graph, and Object stores, each recommended for different scenarios. Among their notable characteristics are schema-free design, easy replication support, simple APIs, and suitability for Big Data. The Chilean Virtual Observatory (ChiVO) is developing a functional prototype based on the IVOA architecture, with the following relevant factors: performance, scalability, flexibility, complexity, and functionality. Currently, it is very difficult to compare these factors due to a lack of alternatives. The objective of this paper is to compare NoSQL alternatives with SQL through the implementation of a REST Web API that satisfies ChiVO's needs: a SESAME-style name resolver for the data from ALMA. We therefore propose a test scenario by configuring a NoSQL database with data from different sources and evaluating the feasibility of creating a Simple Cone Search service and its performance. This comparison will help pave the way for the application of Big Data databases in the Virtual Observatory.
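Whatever the backend, the core predicate a Simple Cone Search service must evaluate is an angular-separation cut around the query position. A backend-agnostic sketch of that predicate (the catalog entries are made up for illustration):

```python
import math

def angular_separation(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine form; inputs in degrees)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    s = (math.sin((dec2 - dec1) / 2) ** 2
         + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(s)))

def cone_search(catalog, ra, dec, radius_deg):
    """Return the catalog entries within radius_deg of (ra, dec)."""
    return [src for src in catalog
            if angular_separation(ra, dec, src["ra"], src["dec"]) <= radius_deg]

catalog = [{"name": "A", "ra": 10.00, "dec": -30.00},
           {"name": "B", "ra": 10.02, "dec": -30.01},
           {"name": "C", "ra": 12.00, "dec": -31.00}]
hits = cone_search(catalog, 10.0, -30.0, radius_deg=0.1)
```

In a SQL service this filter is what ADQL/PgSphere expresses; in a NoSQL document store it becomes an application-side or index-backed query over documents shaped like the ones above.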
Fully Resolved Simulations of 3D Printing
NASA Astrophysics Data System (ADS)
Tryggvason, Gretar; Xia, Huanxiong; Lu, Jiacai
2017-11-01
Numerical simulations of Fused Deposition Modeling (FDM) (or Fused Filament Fabrication), where a filament of hot, viscous polymer is deposited to "print" a three-dimensional object, layer by layer, are presented. A finite volume/front tracking method is used to follow the injection, cooling, solidification and shrinking of the filament. The injection of the hot melt is modeled using a volume source, combined with a nozzle, modeled as an immersed boundary, that follows a prescribed trajectory. The viscosity of the melt depends on the temperature and the shear rate, and the polymer becomes immobile as its viscosity increases. As the polymer solidifies, the stress is found by assuming a hyperelastic constitutive equation. The method is described and its accuracy and convergence properties are tested by grid refinement studies for a simple setup involving two short filaments, one on top of the other. The effect of the various injection parameters, such as nozzle velocity and injection velocity, is briefly examined, and the applicability of the approach to simulate the construction of simple multilayer objects is shown. The role of fully resolved simulations for additive manufacturing and their use for novel processes and as the "ground truth" for reduced order models is discussed.
A Simple, Inexpensive Acoustic Levitation Apparatus
NASA Astrophysics Data System (ADS)
Schappe, R. Scott; Barbosa, Cinthya
2017-01-01
Acoustic levitation uses a resonant ultrasonic standing wave to suspend small objects; it is used in a variety of research disciplines, particularly in the study of phase transitions and materials susceptible to contamination, or as a stabilization mechanism in microgravity environments. The levitation equipment used for such research is quite costly; we wanted to develop a simple, inexpensive system to demonstrate this visually striking example of standing waves. A search of the literature produced only one article relevant to creating such an apparatus, but the authors' approach uses a test tube, which limits the access to the standing wave. Our apparatus, shown in Fig. 1, can levitate multiple small (1-2 mm) pieces of expanded polystyrene (Styrofoam) using components readily available to most instructors of introductory physics. Acoustic levitation occurs in small, stable equilibrium locations where the weight of the object is balanced by the acoustic radiation force created by an ultrasonic standing wave; these locations are slightly below the pressure nodes. The levitation process also creates a horizontal restoring force. Since the pressure nodes are also velocity antinodes, this transverse stability may be analogous to the effect of an upward air stream supporting a ball.
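The geometry of such a trap follows from the standing-wave condition: pressure nodes, and hence the levitation sites just below them, repeat every half wavelength between transducer and reflector. A quick estimate (the 28 kHz drive frequency and 5 cm gap are illustrative assumptions, not the apparatus's actual values):

```python
# Spacing of levitation sites in an ultrasonic standing wave.
speed_of_sound = 343.0        # m/s in air at ~20 C
frequency = 28_000.0          # Hz, illustrative ultrasonic transducer
wavelength = speed_of_sound / frequency     # ~12.25 mm
node_spacing = wavelength / 2               # pressure nodes every lambda/2
n_sites = int(0.05 // node_spacing)         # sites fitting in a 5 cm gap
```

At this frequency the sites sit about 6 mm apart, which is why only small (1-2 mm) objects can be trapped well clear of neighbouring nodes.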
Simple proteomics data analysis in the object-oriented PowerShell.
Mohammed, Yassene; Palmblad, Magnus
2013-01-01
Scripting languages such as Perl and Python are appreciated for solving simple, everyday tasks in bioinformatics. A more recent, object-oriented command shell and scripting language, Windows PowerShell, has many attractive features: an object-oriented interactive command line, fluent navigation and manipulation of XML files, ability to consume Web services from the command line, consistent syntax and grammar, rich regular expressions, and advanced output formatting. The key difference between classical command shells and scripting languages, such as bash, and object-oriented ones, such as PowerShell, is that in the latter the result of a command is a structured object with inherited properties and methods rather than a simple stream of characters. Conveniently, PowerShell is included in all new releases of Microsoft Windows and therefore already installed on most computers in classrooms and teaching labs. In this chapter we demonstrate how PowerShell in particular allows easy interaction with mass spectrometry data in XML formats, connection to Web services for tools such as BLAST, and presentation of results as formatted text or graphics. These features make PowerShell much more than "yet another scripting language."
Propellant-Flow-Actuated Rocket Engine Igniter
NASA Technical Reports Server (NTRS)
Wollen, Mark
2013-01-01
A rocket engine igniter has been created that uses a pneumatically driven hammer which, by specialized geometry, is induced into an oscillatory state. The oscillation can be used either to repeatedly impact a piezoelectric crystal with sufficient force to generate a spark capable of initiating combustion, or with any other system capable of generating a spark from direct oscillatory motion. This innovation uses the energy of flowing gaseous propellant, which, by means of pressure differentials and kinetic motion, causes a hammer object to oscillate. The concept works by mass flows being induced through orifices on both sides of a cylindrical tube with one or more vent paths. As the mass flow enters the chamber, a pressure differential arises because the hammer object is supplied with flow on one side while the other side opens onto the vent path. The object then crosses the vent opening and begins to slow because the pressure differential across it reverses due to the geometry of the tube. Eventually, the object stops because of the increasing pressure differential, once all of its kinetic energy has been transferred to the gas via compression. At this point the object reverses direction because of the pressure differential. This behavior excites a piezoelectric crystal via direct impact from the hammer object: the hammer strikes the crystal, then reverses direction, and the resultant high voltage created by the crystal is transferred via an electrode to a spark gap in the ignition zone, thereby providing a spark to ignite the engine. Magnets, or other retention methods, might be employed to favorably position the hammer object prior to start, but are not necessary to maintain the oscillatory behavior.
Various manifestations of the igniter have been developed and tested to improve device efficiency, and some improved designs are capable of operation at gas flow rates of a fraction of a gram per second (0.001 lb/s) and pressure drops on the order of 30 to 50 kilopascal (a few psi). An analytical model has been created and tested in conjunction with a precisely calibrated reference model. The analytical model accurately captures the overall behavior of this innovation. The model is a simple "volume-orifice" concept, with each chamber considered a single temperature and pressure "node" connected to adjacent nodes, or to vent paths through flow control orifices. Mass and energy balances are applied to each node, with gas flow predicted using simple compressible flow equations.
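The flow elements of a "volume-orifice" model can be sketched with the standard compressible-flow result for a choked orifice, valid when the downstream pressure is below the critical ratio (about 0.53 of upstream for air). The orifice size, discharge coefficient, and reservoir conditions below are illustrative assumptions, not the device's measured parameters.

```python
import math

def choked_mass_flow(p0, t0, area, cd=0.8, gamma=1.4, r_gas=287.0):
    """Choked (sonic) mass flow [kg/s] through an orifice fed from a
    reservoir at stagnation pressure p0 [Pa] and temperature t0 [K]."""
    crit = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
    return cd * area * p0 * math.sqrt(gamma / (r_gas * t0)) * crit

# Illustrative numbers: 1 mm diameter orifice, 200 kPa air reservoir at 300 K
area = math.pi * (0.5e-3) ** 2
mdot = choked_mass_flow(2.0e5, 300.0, area)     # kg/s
```

At these conditions the predicted flow is roughly 0.3 g/s, the same order as the low-flow operation quoted above; in the full model each chamber node balances such orifice flows against its mass and energy storage.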
Phase derivative method for reconstruction of slightly off-axis digital holograms.
Guo, Cheng-Shan; Wang, Ben-Yi; Sha, Bei; Lu, Yu-Jie; Xu, Ming-Yuan
2014-12-15
A phase derivative (PD) method is proposed for the reconstruction of off-axis holograms. In this method, a phase distribution of the tested object wave, constrained to the range 0 to π radians, is first obtained from a simple analytical formula; it is then corrected to its proper range of −π to π according to the sign of its first-order derivative. A theoretical analysis indicates that this PD method is particularly suitable for the reconstruction of slightly off-axis holograms because, in principle, it only requires the spatial frequency of the reference beam to be larger than that of the tested object wave. In addition, because the PD method is a purely local method requiring no integral operation or phase-shifting algorithm in the phase-retrieval process, it can reduce the computational load and memory requirements of the image-processing system. Experimental results are given to demonstrate the feasibility of the method.
A simple physical model for X-ray burst sources
NASA Technical Reports Server (NTRS)
Joss, P. C.; Rappaport, S.
1977-01-01
Building on considerations of Illarionov and Sunyaev (1975) and van den Heuvel (1975), a simple physical model for an X-ray burst source in the galactic disk is proposed. The model comprises an unevolved OB star with a relatively weak stellar wind and a compact object in a close binary system. For some reason, the stellar wind from the OB star is unable to accrete steadily onto the compact object. When the stellar wind is sufficiently weak, the compact object accretes irregularly, leading to X-ray bursts.
ERIC Educational Resources Information Center
Shih, Ching-Hsiang; Chang, Man-Ling; Mohua, Zhang
2012-01-01
This study evaluated whether two people with developmental disabilities would be able to actively perform simple occupational activities to control their preferred environmental stimulation using a Nintendo Wii Remote Controller with a newly developed three-dimensional object orientation detection program (TDOODP, i.e. a new software program,…
The neural basis of precise visual short-term memory for complex recognisable objects.
Veldsman, Michele; Mitchell, Daniel J; Cusack, Rhodri
2017-10-01
Recent evidence suggests that visual short-term memory (VSTM) capacity estimated using simple objects, such as colours and oriented bars, may not generalise well to more naturalistic stimuli. More visual detail can be stored in VSTM when complex, recognisable objects are maintained compared to simple objects. It is not yet known if it is recognisability that enhances memory precision, nor whether maintenance of recognisable objects is achieved with the same network of brain regions supporting maintenance of simple objects. We used a novel stimulus generation method to parametrically warp photographic images along a continuum, allowing separate estimation of the precision of memory representations and the number of items retained. The stimulus generation method was also designed to create unrecognisable, though perceptually matched, stimuli, to investigate the impact of recognisability on VSTM. We adapted the widely-used change detection and continuous report paradigms for use with complex, photographic images. Across three functional magnetic resonance imaging (fMRI) experiments, we demonstrated greater precision for recognisable objects in VSTM compared to unrecognisable objects. This clear behavioural advantage was not the result of recruitment of additional brain regions, or of stronger mean activity within the core network. Representational similarity analysis revealed greater variability across item repetitions in the representations of recognisable, compared to unrecognisable complex objects. We therefore propose that a richer range of neural representations support VSTM for complex recognisable objects. Copyright © 2017 Elsevier Inc. All rights reserved.
Doherty, Orla; Conway, Thomas; Conway, Richard; Murray, Gerard; Casey, Vincent
2017-01-01
Noseband tightness is difficult to assess in horses participating in equestrian sports such as dressage, show jumping and three-day-eventing. There is growing concern that nosebands are commonly tightened to such an extent as to restrict normal equine behaviour and possibly cause injury. In the absence of a clear, agreed definition of noseband tightness, a simple model of the equine nose-noseband interface was developed in order to guide further studies in this area. The normal force component of the noseband tensile force was identified as the key contributor to sub-noseband tissue compression. The model was used to inform the design of a digital tightness gauge which could reliably measure this normal force component under nosebands fitted to bridled horses. Results are presented for field tests using two prototype designs. Prototype version 3 was used in field trial 1 (n = 15, frontal nasal plane sub-noseband site). Results of this trial were used to develop an ergonomically designed prototype, version 4, which was tested in a second field trial (n = 12, frontal nasal plane and lateral sub-noseband sites). Nosebands were set to three tightness settings in each trial, as judged by a single rater using an International Society for Equitation Science (ISES) taper gauge. Normal forces in the range 7-95 N were recorded at the frontal nasal plane, while a lower range of 1-28 N was found at the lateral site for the taper gauge range used in the trials. The digital tightness gauge was found to be simple to use, reliable and safe, and its use did not agitate the animals in any discernible way. A simple six-point tightness scale is suggested to aid regulation implementation and the control of noseband tightness using normal force measurement as the objective tightness discriminant.
Deep learning-based artificial vision for grasp classification in myoelectric hands
NASA Astrophysics Data System (ADS)
Ghazaei, Ghazal; Alameer, Ali; Degenaar, Patrick; Morgan, Graham; Nazarpour, Kianoush
2017-06-01
Objective. Computer vision-based assistive technology solutions can revolutionise the quality of care for people with sensorimotor disorders. The goal of this work was to enable trans-radial amputees to use a simple, yet efficient, computer vision system to grasp and move common household objects with a two-channel myoelectric prosthetic hand. Approach. We developed a deep learning-based artificial vision system to augment the grasp functionality of a commercial prosthesis. Our main conceptual novelty is that we classify objects with regards to the grasp pattern without explicitly identifying them or measuring their dimensions. A convolutional neural network (CNN) structure was trained with images of over 500 graspable objects. For each object, 72 images, at 5° intervals, were available. Objects were categorised into four grasp classes, namely: pinch, tripod, palmar wrist neutral and palmar wrist pronated. The CNN setting was first tuned and tested offline and then in realtime with objects or object views that were not included in the training set. Main results. The classification accuracy in the offline tests reached 85% for the seen and 75% for the novel objects, reflecting the generalisability of grasp classification. We then implemented the proposed framework in realtime on a standard laptop computer and achieved an overall score of 84% in classifying a set of novel as well as seen but randomly-rotated objects. Finally, the system was tested with two trans-radial amputee volunteers controlling an i-limb Ultra™ prosthetic hand and a motion control™ prosthetic wrist, augmented with a webcam. After training, subjects successfully picked up and moved the target objects with an overall success of up to 88%. In addition, we show that with training, subjects' performance improved in terms of time required to accomplish a block of 24 trials despite a decreasing level of visual feedback. Significance.
The proposed design constitutes a substantial conceptual improvement for the control of multi-functional prosthetic hands. We show for the first time that deep-learning based computer vision systems can enhance the grip functionality of myoelectric hands considerably.
Color appearance in stereoscopy
NASA Astrophysics Data System (ADS)
Gadia, Davide; Rizzi, Alessandro; Bonanomi, Cristian; Marini, Daniele; Galmonte, Alessandra; Agostini, Tiziano
2011-03-01
The relationship between color and lightness appearance and the perception of depth has been studied for some time in the fields of perceptual psychology and psychophysiology. It has been found that depth perception affects the final object color and lightness appearance. In the stereoscopy research field, many studies have addressed human physiological effects, considering e.g. geometry, motion sickness, etc., but little has been done regarding lightness and color information. The goal of this paper is to carry out preliminary experiments in Virtual Reality in order to determine the effects of depth perception on object color and lightness appearance. We created a virtual test scene with a simple 3D simultaneous contrast configuration, in three different versions, each with different choices of relative positions and apparent sizes of the objects. We collected the perceptual responses of several users after observation of the test scene in the Virtual Theater of the University of Milan, a VR immersive installation characterized by a semi-cylindrical screen that covers 120° of horizontal field of view at an observation distance of 3.5 m. We present a description of the experimental setup and procedure, and we discuss the obtained results.
Banach Gelfand Triples for Applications in Physics and Engineering
NASA Astrophysics Data System (ADS)
Feichtinger, Hans G.
2009-07-01
The principle of extension is widespread within mathematics. Starting from simple objects one constructs more sophisticated ones, with a kind of natural embedding from the set of old objects to the new, enlarged set. Usually a set of operations on the old set can still be carried out, but maybe also some new ones. Done properly, one obtains more complete objects of a similar kind, with additional useful properties. Let us give a simple example: While multiplication and addition can be done exactly and perfectly in the setting of Q, the rational numbers, the field R of real numbers has the advantage of being complete (Cauchy sequences have a limit…) and hence allows for numbers like π or √2. Finally the even "more complicated" field C of complex numbers allows one to find solutions to equations like z2 = -1. The chain of inclusions of fields, Q⊂R⊂C, is a good motivating example in the domain of "numbers." The main subject of the present survey-type article is a new theory of Banach Gelfand triples (BGTs), providing a similar setting in the context of (generalized) functions. Test functions are the simple objects; elements of the Hilbert space L2(Rd) are well suited to describe concepts of orthogonality, and they can be approximated to any given precision (in the ‖·‖2-norm) by test functions. Finally one needs an even larger (Banach) space of generalized functions resp. distributions, containing among others pure frequencies and Dirac measures, in order to describe various mappings between such Banach Gelfand triples in terms of the most important "elementary building blocks," in clear analogy to the finite/discrete setting (where Dirac measures correspond to unit vectors). Our concrete Banach Gelfand triple is based on the Segal algebra S0(Rd), which coincides with the modulation space M1(Rd) = M01,1(Rd), and plays a very important and natural role in time-frequency analysis.
We will point out that it provides the appropriate setting for a description of many problems in engineering or physics, including the classical Fourier transform or the Kohn-Nirenberg or Weyl calculus for pseudo-differential operators. Particular emphasis will be given to the concept of w*-convergence and w*-continuity of operators, which allows one to prove conceptual uniqueness results and to give a correct interpretation to certain formal expressions coming up in various versions of the Dirac formalism.
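In the notation of the abstract, the resulting chain of embeddings can be sketched as follows (writing S0′(Rd) for the dual space of generalized functions, which the text describes but does not name explicitly):

```latex
% Banach Gelfand triple built on the Segal algebra S_0(R^d):
% test functions embed into the Hilbert space, which embeds into
% the space of distributions -- in analogy with the number fields.
S_0(\mathbb{R}^d)\ \hookrightarrow\ L^2(\mathbb{R}^d)\ \hookrightarrow\ S_0'(\mathbb{R}^d),
\qquad\text{just as}\qquad
\mathbb{Q}\subset\mathbb{R}\subset\mathbb{C}.
```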
Faillace, M P; Pisera-Fuster, A; Medrano, M P; Bejarano, A C; Bernabeu, R O
2017-03-01
Zebrafish have a sophisticated color- and shape-sensitive visual system, so we examined color cue-based novel object recognition in zebrafish. We evaluated preference in the absence or presence of drugs that affect attention and memory retention in rodents: nicotine and the histone deacetylase inhibitor (HDACi) phenylbutyrate (PhB). The objective of this study was to evaluate whether nicotine and PhB affect innate preferences of zebrafish for familiar and novel objects after short and long retention intervals. We developed modified object recognition (OR) tasks using neutral novel and familiar objects in different colors. We also tested objects which differed with respect to the exploratory behavior they elicited from naïve zebrafish. Zebrafish showed an innate preference for exploring red or green objects rather than yellow or blue objects. Zebrafish were better at discriminating color changes than changes in object shape or size. Nicotine significantly enhanced or changed short-term innate novel object preference, whereas PhB had similar effects when preference was assessed 24 h after training. Analysis of other zebrafish behaviors corroborated these results. Zebrafish were innately reluctant or prone to explore colored novel objects, so drug effects on innate preference for objects can be evaluated by changing the color of objects with a simple geometry. Zebrafish exhibited recognition memory for novel objects with similar innate significance. Interestingly, nicotine and PhB significantly modified innate object preference.
Simple Test Functions in Meshless Local Petrov-Galerkin Methods
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.
2016-01-01
Two meshless local Petrov-Galerkin (MLPG) methods, based on two different trial functions but using the same simple linear test function, were developed for beam and column problems. These methods used generalized moving least squares (GMLS) and radial basis (RB) interpolation functions as trial functions. The two methods were tested on various patch test problems, and both passed the patch tests successfully. The methods were then applied to various beam vibration problems and to problems involving Euler and Beck's columns. Both methods yielded accurate solutions for all problems studied. The simple linear test function offers considerable savings in computing effort, as the domain integrals involved in the weak form are avoided. The two methods based on this simple linear test function produced accurate results for frequencies and buckling loads. Of the two methods studied, the method with radial basis trial functions is particularly attractive, as it is simple, accurate, and robust.
The Effects of Shoulder Slings on Balance in Patients With Hemiplegic Stroke
Sohn, Min Kyun; Jee, Sung Ju; Hwang, Pyoungsik; Jeon, Yumi
2015-01-01
Objective To investigate the effects of a shoulder sling on balance in patients with hemiplegia. Methods Twenty-seven hemiplegic stroke patients (right 13, left 14) were enrolled in this study. The subjects' movement in their centers of gravity (COGs) during static and dynamic balance tests was measured with their eyes open in each sling condition: without a sling, with Bobath's axillary support (Bobath sling), and with a simple arm sling. The percent time in quadrant and the overall, anterior/posterior, and medial/lateral stability indexes were measured using a posturography platform (Biodex Balance System SD). Functional balance was evaluated using the Berg Balance Scale and the Trunk Impairment Scale. All balance tests were performed with each sling in random order. Results The COGs of right hemiplegic stroke patients and of all hemiplegic stroke patients shifted to, respectively, the right and posterior quadrants during the static balance test without a sling (p<0.05). This weight asymmetry pattern did not improve with either the Bobath or the simple arm sling. There was no significant improvement in any stability index during either the static or the dynamic balance tests in any sling condition. Conclusion The right and posterior deviations of the hemiplegic stroke patients' COGs were maintained during application of the shoulder slings, and the shoulder slings had no significant effect on the patients' balance in the quiet standing position. PMID:26798614
Voids as alternatives to dark energy and the propagation of γ rays through the universe.
DeLavallaz, Arnaud; Fairbairn, Malcolm
2012-04-27
We test the opacity of a void universe to TeV energy γ rays having obtained the extragalactic background light in that universe using a simple model and the observed constraints on the star formation rate history. We find that the void universe has significantly more opacity than a Λ cold dark matter universe, putting it at odds with observations of BL-Lac objects. We argue that while this method of distinguishing between the two cosmologies contains uncertainties, it circumvents any debates over fine-tuning.
Self-referenced interferometer for cylindrical surfaces.
Šarbort, Martin; Řeřucha, Šimon; Holá, Miroslava; Buchta, Zdeněk; Lazar, Josef
2015-11-20
We present a new interferometric method for shape measurement of hollow cylindrical tubes. We propose a simple and robust self-referenced interferometer where the reference and object waves are represented by the central and peripheral parts, respectively, of the conical wave generated by a single axicon lens. The interferogram detected by a digital camera is characterized by a closed-fringe pattern with a circular carrier. The interference phase is demodulated using spatial synchronous detection. The capabilities of the interferometer are experimentally tested for various hollow cylindrical tubes with lengths up to 600 mm.
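The demodulation step can be illustrated with a minimal one-dimensional sketch of spatial synchronous detection: multiply the fringe signal by a complex reference at the carrier frequency, low-pass filter the product, and take its argument. The signal model, carrier frequency, and moving-average filter below are illustrative assumptions, not the authors' implementation (which operates on two-dimensional interferograms with a circular carrier):

```python
import numpy as np

def demodulate_phase(signal, x, f0, kernel=101):
    """Spatial synchronous detection (1-D sketch): multiply the fringe
    signal by a complex reference at the carrier frequency f0, low-pass
    filter the product, and take the argument of the result."""
    ref = np.exp(-2j * np.pi * f0 * x)           # complex reference wave
    product = signal * ref                       # shifts the phase term to baseband
    win = np.ones(kernel) / kernel               # simple moving-average low-pass
    baseband = np.convolve(product, win, mode='same')
    return np.angle(baseband)                    # demodulated interference phase

# Synthetic fringe pattern: linear carrier plus a slowly varying phase.
x = np.linspace(0.0, 1.0, 4000)
f0 = 40.0                                        # carrier frequency (cycles per unit)
phi = 0.8 * np.sin(2 * np.pi * 1.5 * x)          # "shape" phase to recover
fringes = 1.0 + 0.5 * np.cos(2 * np.pi * f0 * x + phi)

phi_est = demodulate_phase(fringes, x, f0)
# Away from the window edges the recovered phase tracks the true phase.
err = np.max(np.abs(phi_est[300:-300] - phi[300:-300]))
```

The residual error comes from carrier leakage through the crude moving-average filter; a real implementation would use a better low-pass kernel and a 2-D carrier.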
Real-world spatial regularities affect visual working memory for objects.
Kaiser, Daniel; Stein, Timo; Peelen, Marius V
2015-12-01
Traditional memory research has focused on measuring and modeling the capacity of visual working memory for simple stimuli such as geometric shapes or colored disks. Although these studies have provided important insights, it is unclear how their findings apply to memory for more naturalistic stimuli. An important aspect of real-world scenes is that they contain a high degree of regularity: For instance, lamps appear above tables, not below them. In the present study, we tested whether such real-world spatial regularities affect working memory capacity for individual objects. Using a delayed change-detection task with concurrent verbal suppression, we found enhanced visual working memory performance for objects positioned according to real-world regularities, as compared to irregularly positioned objects. This effect was specific to upright stimuli, indicating that it did not reflect low-level grouping, because low-level grouping would be expected to equally affect memory for upright and inverted displays. These results suggest that objects can be held in visual working memory more efficiently when they are positioned according to frequently experienced real-world regularities. We interpret this effect as the grouping of single objects into larger representational units.
Recommendations for Hypersonic Boundary Layer Transition Flight Testing
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Kimmel, Roger; Reshotko, Eli
2011-01-01
Much has been learned about the physics underlying the transition process at supersonic and hypersonic speeds through years of analysis, experiment and computation. Generally, the application of this knowledge has been restricted to simple shapes like plates, cones and spherical bodies. However, flight reentry vehicles are in reality never simple. They are typically highly complex geometries flown at angle of attack, so three-dimensional effects are very important, as are roughness effects due to surface features and/or ablation. This paper will review our present understanding of the physics of the transition process and look back at some of the recent flight test programs for their successes and failures. The goal of this paper is to develop the rationale for new hypersonic boundary layer transition flight experiments. Motivations will be derived both from an inward look at what we believe constitutes a good flight test program and from an outward review of the goals and objectives of some recent US-based unclassified proposals and programs. As part of our recommendations, this paper will address the need for careful experimental work as per the guidelines enunciated years ago by the U.S. Transition Study Group. Following these guidelines is essential to obtaining reliable, usable data that allow refinement of transition estimation techniques.
Development of test methods for textile composites
NASA Technical Reports Server (NTRS)
Masters, John E.; Ifju, Peter G.; Fedro, Mark J.
1993-01-01
NASA's Advanced Composite Technology (ACT) Program was initiated in 1990 with the purpose of developing less costly composite aircraft structures. A number of innovative materials and processes were evaluated as a part of this effort. Chief among them are composite materials reinforced with textile preforms. These new forms of composite materials bring with them potential testing problems. Methods currently in practice were developed over the years for composite materials made from prepreg tape or simple 2-D woven fabrics. A wide variety of 2-D and 3-D braided, woven, stitched, and knit preforms were suggested for application in the ACT program. The applicability of existing test methods to the wide range of emerging materials bears investigation. The overriding concern is that the values measured are accurate representations of the true material response. The ultimate objective of this work is to establish a set of test methods to evaluate the textile composites developed for the ACT Program.
Method for targetless tracking subpixel in-plane movements.
Espinosa, Julian; Perez, Jorge; Ferrer, Belen; Mas, David
2015-09-01
We present a targetless motion tracking method for detecting planar movements with subpixel accuracy. This method is based on the computation and tracking of the intersection of two nonparallel straight-line segments in the image of a moving object in a scene. The method is simple and easy to implement because no complex structures have to be detected. It has been tested and validated using a lab experiment consisting of a vibrating object recorded with a high-speed camera working at 1000 fps. We managed to track displacements with an accuracy of hundredths of a pixel, or even thousandths of a pixel in the case of tracking harmonic vibrations. The method is widely applicable because it can be used to measure the amplitude and frequency of vibrations at a distance with a vision system.
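The core geometric step, intersecting the two straight lines through the detected segments, naturally yields floating-point (subpixel) coordinates even when the segment endpoints lie on integer pixels. A minimal sketch (the endpoints below are hypothetical, not data from the paper):

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of the (infinite) lines through segments p1-p2 and
    p3-p4, from the standard determinant formula. Returns floating-point
    coordinates, i.e. subpixel precision; None if the lines are
    (numerically) parallel."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-12:
        return None  # parallel or coincident lines
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    px = (a * (x3 - x4) - (x1 - x2) * b) / denom
    py = (a * (y3 - y4) - (y1 - y2) * b) / denom
    return px, py

# Two segments detected in a frame; the crossing point is recovered
# with subpixel precision from integer endpoints.
pt = line_intersection((0, 0), (10, 10), (0, 7), (10, 3))  # → (5.0, 5.0)
```

Tracking this intersection frame by frame gives the planar displacement signal without attaching any physical target to the object.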
Deep learning-based artificial vision for grasp classification in myoelectric hands.
Ghazaei, Ghazal; Alameer, Ali; Degenaar, Patrick; Morgan, Graham; Nazarpour, Kianoush
2017-06-01
Computer vision-based assistive technology solutions can revolutionise the quality of care for people with sensorimotor disorders. The goal of this work was to enable trans-radial amputees to use a simple, yet efficient, computer vision system to grasp and move common household objects with a two-channel myoelectric prosthetic hand. We developed a deep learning-based artificial vision system to augment the grasp functionality of a commercial prosthesis. Our main conceptual novelty is that we classify objects with regard to the grasp pattern without explicitly identifying them or measuring their dimensions. A convolutional neural network (CNN) structure was trained with images of over 500 graspable objects. For each object, 72 images, at 5° intervals, were available. Objects were categorised into four grasp classes, namely: pinch, tripod, palmar wrist neutral and palmar wrist pronated. The CNN setting was first tuned and tested offline and then in realtime with objects or object views that were not included in the training set. The classification accuracy in the offline tests reached 85% for the seen and 75% for the novel objects, reflecting the generalisability of grasp classification. We then implemented the proposed framework in realtime on a standard laptop computer and achieved an overall score of 84% in classifying a set of novel as well as seen but randomly-rotated objects. Finally, the system was tested with two trans-radial amputee volunteers controlling an i-limb Ultra™ prosthetic hand and a Motion Control™ prosthetic wrist, augmented with a webcam. After training, subjects successfully picked up and moved the target objects with an overall success rate of up to 88%. In addition, we show that with training, subjects' performance improved in terms of the time required to accomplish a block of 24 trials despite a decreasing level of visual feedback.
The proposed design constitutes a substantial conceptual improvement for the control of multi-functional prosthetic hands. We show for the first time that deep-learning based computer vision systems can enhance the grip functionality of myoelectric hands considerably.
Kosack, Cara S.; Shanks, Leslie; Beelaert, Greet; Benson, Tumwesigye; Savane, Aboubacar; Ng'ang'a, Anne; Bita, André; Zahinda, Jean-Paul B. N.; Fransen, Katrien
2017-01-01
Our objective was to evaluate the performance of HIV testing algorithms based on WHO recommendations, using data from specimens collected at six HIV testing and counseling sites in sub-Saharan Africa (Conakry, Guinea; Kitgum and Arua, Uganda; Homa Bay, Kenya; Douala, Cameroon; Baraka, Democratic Republic of Congo). A total of 2,780 samples, including 1,306 HIV-positive samples, were included in the analysis. HIV testing algorithms were designed using Determine as a first test. Second and third rapid diagnostic tests (RDTs) were selected based on site-specific performance, adhering where possible to the WHO-recommended minimum requirements of ≥99% sensitivity and specificity. The threshold for specificity was reduced to 98% or 96% if necessary. We also simulated algorithms consisting of one RDT followed by a simple confirmatory assay. The positive predictive values (PPV) of the simulated algorithms ranged from 75.8% to 100% using strategies recommended for high-prevalence settings, 98.7% to 100% using strategies recommended for low-prevalence settings, and 98.1% to 100% using a rapid test followed by a simple confirmatory assay. Although we were able to design algorithms that met the recommended PPV of ≥99% in five of six sites using the applicable high-prevalence strategy, options were often very limited due to suboptimal performance of individual RDTs and to shared falsely reactive results. These results underscore the impact of the sequence of HIV tests and of shared false-reactivity data on algorithm performance. Where it is not possible to identify tests that meet WHO-recommended specifications, the low-prevalence strategy may be more suitable. PMID:28747371
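The dependence of PPV on prevalence, which drives the difference between the high- and low-prevalence strategies, follows directly from Bayes' rule. A sketch with illustrative numbers (not the measured site data from the study):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' rule: the fraction of
    positive results that are true positives, for a test with the given
    sensitivity and specificity applied in a population with the given
    prevalence."""
    tp = sensitivity * prevalence                  # true-positive rate
    fp = (1.0 - specificity) * (1.0 - prevalence)  # false-positive rate
    return tp / (tp + fp)

# A single RDT just meeting the WHO minimum (99%/99%) falls far short of
# 99% PPV at low prevalence, which is why algorithms chain several tests.
low  = ppv(0.99, 0.99, 0.01)   # ~0.50 at 1% prevalence
high = ppv(0.99, 0.99, 0.20)   # ~0.96 at 20% prevalence
```

Chaining independent tests multiplies down the false-positive rate, which is how the simulated multi-test algorithms recover an acceptable PPV; shared false reactivity between tests breaks that independence assumption.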
Akbar, Umer; Raike, Robert S.; Hack, Nawaz; Hess, Christopher W.; Skinner, Jared; Martinez‐Ramirez, Daniel; DeJesus, Sol
2016-01-01
Objectives Evidence suggests that nonconventional programming may improve deep brain stimulation (DBS) therapy for movement disorders. The primary objective was to assess feasibility of testing the tolerability of several nonconventional settings in Parkinson's disease (PD) and essential tremor (ET) subjects in a single office visit. Secondary objectives were to explore for potential efficacy signals and to assess the energy demand on the implantable pulse‐generators (IPGs). Materials and Methods A custom firmware (FW) application was developed and acutely uploaded to the IPGs of eight PD and three ET subjects, allowing delivery of several nonconventional DBS settings, including narrow pulse widths, square biphasic pulses, and irregular pulse patterns. Standard clinical rating scales and several objective measures were used to compare motor outcomes with sham, clinically‐optimal and nonconventional settings. Blinded and randomized testing was conducted in a traditional office setting. Results Overall, the nonconventional settings were well tolerated. Under these conditions it was also possible to detect clinically‐relevant differences in DBS responses using clinical rating scales but not objective measures. Compared to the clinically‐optimal settings, some nonconventional settings appeared to offer similar benefit (e.g., narrow pulse widths) and others lesser benefit. Moreover, the results suggest that square biphasic pulses may deliver greater benefit. No unexpected IPG efficiency disadvantages were associated with delivering nonconventional settings. Conclusions It is feasible to acutely screen nonconventional DBS settings using controlled study designs in traditional office settings. Simple IPG FW upgrades may provide more DBS programming options for optimizing therapy. Potential advantages of narrow and biphasic pulses deserve follow up. PMID:27000764
Optimizing an experimental design for an electromagnetic experiment
NASA Astrophysics Data System (ADS)
Roux, Estelle; Garcia, Xavier
2013-04-01
Most geophysical studies focus on data acquisition and analysis, but another aspect that is gaining importance is the acquisition of suitable datasets. This can be done through the design of an optimal experiment. Optimizing an experimental design implies a compromise between maximizing the information we get about the target and reducing the cost of the experiment, considering a wide range of constraints (logistical, financial, experimental …). We are currently developing a method to design an optimal controlled-source electromagnetic (CSEM) experiment to detect a potential CO2 reservoir and monitor this reservoir during and after CO2 injection. Our statistical algorithm combines the use of linearized inverse theory (to evaluate the quality of a given design via the objective function) and stochastic optimization methods such as the genetic algorithm (to examine a wide range of possible surveys). The particularity of our method is that it uses a multi-objective genetic algorithm that searches for designs that fit several objective functions simultaneously. One main advantage of this kind of technique for designing an experiment is that it does not require the acquisition of any data and can thus easily be conducted before any geophysical survey. Our new experimental design algorithm has been tested with a realistic one-dimensional resistivity model of the Earth in the region of study (northern Spain CO2 sequestration test site). We show that a small number of well distributed observations have the potential to resolve the target. This simple test also points out the importance of a well-chosen objective function. Finally, in the context of the CO2 sequestration that motivates this study, we might be interested in maximizing the information we get about the reservoir layer. In that case, we show how the combination of two different objective functions considerably improves its resolution.
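The multi-objective selection idea can be illustrated by the non-dominated (Pareto) filtering step that such genetic algorithms rely on. The designs and objective values below are hypothetical, and a full implementation (e.g. NSGA-II) adds crossover, mutation and crowding on top of this step:

```python
def pareto_front(designs):
    """Return the names of the non-dominated designs, where each design
    is (name, info_gain, cost): info_gain is maximised, cost minimised.
    A design is dominated if another is at least as good on both
    objectives and strictly better on at least one."""
    front = []
    for i, (name_i, g_i, c_i) in enumerate(designs):
        dominated = any(
            (g_j >= g_i and c_j <= c_i) and (g_j > g_i or c_j < c_i)
            for j, (name_j, g_j, c_j) in enumerate(designs) if j != i
        )
        if not dominated:
            front.append(name_i)
    return front

# Hypothetical CSEM survey designs: (label, target-resolution score, cost).
designs = [("A", 0.9, 10.0), ("B", 0.7, 4.0), ("C", 0.6, 9.0), ("D", 0.9, 12.0)]
best = pareto_front(designs)   # → ['A', 'B']
```

C is dominated by B (worse on both objectives) and D by A (same resolution, higher cost); the surviving trade-off set is what a multi-objective search hands back to the experimenter.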
Does Variability Across Events Affect Verb Learning in English, Mandarin and Korean?
Childers, Jane B.; Paik, Jae H.; Flores, Melissa; Lai, Gabrielle; Dolan, Megan
2016-01-01
Extending new verbs is important to becoming a productive speaker of a language. Prior results show children have difficulty extending verbs when they have seen events with varied agents. This paper further examines the impact of variability on verb learning, and asks whether this interacts with event complexity or differs by language. Children (aged 2½ to 3 years) in the U.S., China, Korea and Singapore learned verbs linked to simple and complex events. Sets of events included one or three agents, and children were asked to extend the verb at test. Children learning verbs linked to simple movements performed similarly across conditions. However, children learning verbs linked to events with multiple objects were less successful if those events were enacted by multiple agents. A follow-up study rules out an influence of event order. Overall, similar patterns of results emerged across languages, suggesting that common cognitive processes support children's verb learning. PMID:27457679
Phrase frequency effects in language production.
Janssen, Niels; Barber, Horacio A
2012-01-01
A classic debate in the psychology of language concerns the question of the grain-size of the linguistic information that is stored in memory. One view is that only morphologically simple forms are stored (e.g., 'car', 'red'), and that more complex forms of language such as multi-word phrases (e.g., 'red car') are generated on-line from the simple forms. In two experiments we tested this view. In Experiment 1, participants produced noun+adjective and noun+noun phrases that were elicited by experimental displays consisting of colored line drawings and two superimposed line drawings. In Experiment 2, participants produced noun+adjective and determiner+noun+adjective utterances elicited by colored line drawings. In both experiments, naming latencies decreased with increasing frequency of the multi-word phrase, and were unaffected by the frequency of the object name in the utterance. These results suggest that the language system is sensitive to the distribution of linguistic information at grain-sizes beyond individual words.
Integration and validation of a data grid software
NASA Astrophysics Data System (ADS)
Carenton-Madiec, Nicolas; Berger, Katharina; Cofino, Antonio
2014-05-01
The Earth System Grid Federation (ESGF) Peer-to-Peer (P2P) system is a software infrastructure for the management, dissemination, and analysis of model output and observational data. The ESGF grid is composed of several types of nodes with different roles. About 40 data nodes host model outputs and datasets using THREDDS catalogs. About 25 compute nodes offer remote visualization and analysis tools. About 15 index nodes crawl data node catalogs and implement faceted and federated search in a web interface. About 15 identity provider nodes manage accounts, authentication and authorization. Here we present an actual-size test federation spread across different institutes in different countries, together with a Python test suite, both started in December 2013. The first objective of the test suite is to provide a simple tool that helps to test and validate a single data node and its closest index, compute and identity provider peers. The next objective will be to run this test suite on every data node of the federation and therefore test and validate every single node of the whole federation. The suite already uses the nosetests, requests, myproxy-logon, subprocess, selenium and fabric Python libraries in order to test web front ends, back ends and security services. The goal of this project is to improve the quality of deliverables in the context of a small team of developers who are widely spread around the world and work collaboratively and without hierarchy. This kind of working organization highlighted the need for a federated integration test and validation process.
Sundrić, Zvonko; Rajsić, Nenad; Lakocević, Milan; Nikolić-Djorić, Emilija
2010-01-01
A decrease in daily alertness is a common cause of accidents in the workplace, and especially of traffic accidents. There is therefore increasing interest in determining reliable indicators of a tendency to fall asleep involuntarily. To determine an optimal electroencephalographic (EEG) indicator of an involuntary tendency to fall asleep, we performed a study on neurologically healthy subjects after one night of sleep deprivation. Total sleep deprivation was aimed at increasing daily sleepiness in healthy subjects, providing us with an opportunity to test different methods of evaluation. We applied a visual analogue scale for sleepiness (VASS) and EEG registration with a specific test of alpha activity attenuation (TAA) in 87 healthy subjects. The test was performed in the standard way (sTAA) as well as with new modifications related to changes of the EEG filter width in the range from 5 to 32 Hz (mTAA). After sleep deprivation, we observed involuntary falling asleep in 54 subjects. The comparison of VASS results showed no differences, in contrast to the more objective TAA. Of the two variants of TAA, the modified test provided a better prediction of which subjects would fall asleep involuntarily. The application of a more objective EEG test in the evaluation of daily alertness represents the optimal method of testing. The modified TAA deserves special attention, offering a simple solution for reliable testing of decreased daily alertness in medical services related to professional aircraft personnel.
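The alpha attenuation idea rests on comparing alpha-band (roughly 8-12 Hz) EEG power between eyes-closed and eyes-open conditions: an alert subject shows strong attenuation on eye opening. A generic band-power sketch on synthetic data; the actual TAA protocol and the 5-32 Hz filter-width modifications are described in the paper and not reproduced here:

```python
import numpy as np

def band_power(x, fs, lo, hi):
    """Power of signal x in the [lo, hi] Hz band via the FFT."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return spec[(freqs >= lo) & (freqs <= hi)].sum()

def alpha_attenuation_ratio(eeg_closed, eeg_open, fs):
    """Eyes-closed / eyes-open alpha (8-12 Hz) power ratio: an alert
    subject shows strong attenuation on eye opening (a large ratio)."""
    return band_power(eeg_closed, fs, 8, 12) / band_power(eeg_open, fs, 8, 12)

# Synthetic demo: a strong 10 Hz rhythm with eyes closed, mostly noise open.
fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
closed = 3.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
open_  = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
ratio = alpha_attenuation_ratio(closed, open_, fs)   # clearly > 1
```

A sleepy subject maintains less alpha with eyes closed, so the ratio drops toward 1; that drop is the kind of signal the TAA quantifies.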
Large-Scale Low-Boom Inlet Test Overview
NASA Technical Reports Server (NTRS)
Hirt, Stefanie
2011-01-01
This presentation provides a high-level overview of the Large-Scale Low-Boom Inlet Test and was presented at the Fundamental Aeronautics 2011 Technical Conference. In October 2010 a low-boom supersonic inlet concept with flow control was tested in the 8'x6' supersonic wind tunnel at NASA Glenn Research Center (GRC). The primary objectives of the test were to evaluate the inlet stability and operability of a large-scale low-boom supersonic inlet concept by acquiring performance and flowfield validation data, as well as to evaluate simple, passive, bleedless inlet boundary layer control options. During this effort two models were tested: a dual stream inlet intended to model potential flight hardware and a single stream design to study a zero-degree external cowl angle and to permit surface flow visualization of the vortex generator flow control on the internal centerbody surface. The tests were conducted by a team of researchers from NASA GRC, Gulfstream Aerospace Corporation, the University of Illinois at Urbana-Champaign, and the University of Virginia.
The Effect of Slipstream Obstructions on Air Propellers
NASA Technical Reports Server (NTRS)
Lesley, E P; Woods, B M
1924-01-01
The screw propeller on airplanes is usually placed near other objects, and hence its performance may be modified by them. Results of tests on propellers free from slipstream obstructions, both fore and aft, are therefore subject to correction for the effect of such obstructions. The purpose of the investigation was to determine the effect, upon the thrust and torque coefficients and the efficiency of previously tested air propellers, of obstructions placed in the slipstream, it being realized that such previous tests had been conducted under somewhat ideal conditions that are impracticable of realization in flight. Simple geometrical forms were used for the initial investigation. Such forms offered the advantage of easy, exact reproduction at another time or in other laboratories, and it was believed that the effects of obstructions usually encountered might be deduced or surmised from those chosen.
NASA Astrophysics Data System (ADS)
Howell, Robert R.; Radebaugh, Jani; Lopes, Rosaly M. C.; Kerber, Laura; Solomonidou, Anezina; Watkins, Bryn
2017-10-01
Using remote sensing of planetary volcanism on objects such as Io to determine eruption conditions is challenging because the emitting region is typically not resolved and because exposed lava cools so quickly. A model of the cooling rate and eruption mechanism is typically used to predict the amount of surface area at different temperatures; that areal distribution is then convolved with a Planck blackbody emission curve, and the predicted spectrum is compared with observation. Often the broad nature of the Planck curve makes the interpretation non-unique. However, different eruption mechanisms (for example, cooling fire-fountain droplets vs. cooling flows) have very different area vs. temperature distributions, which can often be characterized by simple power laws. Furthermore, different composition magmas have significantly different upper-limit cutoff temperatures. To test these models, in August 2016 and May 2017 we obtained spatially resolved observations of spreading Kilauea pahoehoe flows and fire fountains using a three-wavelength near-infrared prototype camera system. We have measured the area vs. temperature distribution for the flows and find that over a relatively broad temperature range the distribution does follow a power law matching the theoretical predictions. As one approaches the solidus temperature, the observed area drops below the simple model predictions by an amount that seems to vary inversely with the vigor of the spreading rate. At these highest temperatures the simple models are probably inadequate; it appears necessary to model the visco-elastic stretching of the very thin crust that covers even the most recently formed surfaces. That deviation between observations and the simple models may be particularly important when using such remote sensing observations to determine magma eruption temperatures.
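The forward model sketched in the abstract, a power-law area-temperature distribution convolved with Planck curves, can be written compactly. The power-law exponent, temperature range and wavelength bands below are illustrative assumptions, not the measured Kilauea values:

```python
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def planck(wavelength, temp):
    """Blackbody spectral radiance at the given wavelength(s) and temperature."""
    return (2 * H * C**2 / wavelength**5 /
            np.expm1(H * C / (wavelength * K * temp)))

def flow_spectrum(wavelengths, t_min, t_max, alpha, n=500):
    """Unresolved-source spectrum sketch: total emission from a surface
    whose area per temperature interval follows a power law
    dA/dT ∝ T**(-alpha), up to a compositional cutoff temperature t_max."""
    temps = np.linspace(t_min, t_max, n)
    weights = temps ** (-alpha)                  # power-law area distribution
    weights /= weights.sum()
    # Convolve the area distribution with the Planck curve at each temperature.
    return sum(w * planck(wavelengths, t) for w, t in zip(weights, temps))

# Hypothetical basaltic flow: crust from 500 K up to a ~1450 K cutoff,
# observed in three near-infrared bands (metres).
wl = np.array([1.6e-6, 2.2e-6, 3.8e-6])
spec = flow_spectrum(wl, 500.0, 1450.0, 2.0)
```

Raising the cutoff temperature (e.g. for a hotter magma composition) shifts relatively more flux into the short-wavelength bands, which is the lever that makes the cutoff observable despite the broad Planck curves.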
Preprocessing of A-scan GPR data based on energy features
NASA Astrophysics Data System (ADS)
Dogan, Mesut; Turhan-Sayan, Gonul
2016-05-01
There is an increasing demand for noninvasive real-time detection and classification of buried objects in various civil and military applications. The problem of detection and annihilation of landmines is particularly important due to strong safety concerns. The requirement for a fast real-time decision process is as important as the requirements for high detection rates and low false alarm rates. In this paper, we introduce and demonstrate a computationally simple, time-efficient, energy-based preprocessing approach that can be used in ground penetrating radar (GPR) applications to eliminate reflections from the air-ground boundary and to locate the buried objects, simultaneously, in one easy step. The instantaneous power signals, the total energy values and the cumulative energy curves are extracted from the A-scan GPR data. The cumulative energy curves, in particular, are shown to be useful to detect the presence and location of buried objects in a fast and simple way while preserving the spectral content of the original A-scan data for further steps of physics-based target classification. The proposed method is demonstrated using the GPR data collected at the outdoor test lanes of IPA Defense, Ankara. Cylindrically shaped plastic containers were buried in fine-medium sand to simulate buried landmines. These plastic containers were half-filled with ammonium nitrate including metal pins. Results of this pilot study are highly promising and motivate further research on the use of energy-based preprocessing features in the landmine detection problem.
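The energy features named in the abstract (instantaneous power, total energy, cumulative energy curve) are simple to compute from a single A-scan trace. A minimal sketch on a synthetic trace follows; the spike positions and amplitudes are invented for illustration, not real GPR data.

```python
import numpy as np

def energy_features(a_scan):
    """Instantaneous power, total energy, and normalized cumulative
    energy curve of one A-scan trace."""
    power = np.asarray(a_scan, dtype=float) ** 2
    total = power.sum()
    cumulative = np.cumsum(power) / total
    return power, total, cumulative

# Synthetic trace: a strong air-ground reflection followed by a weaker
# target echo (illustrative stand-in for a real A-scan).
trace = np.zeros(100)
trace[20] = 1.0   # air-ground boundary
trace[60] = 0.5   # buried object
power, total, cum = energy_features(trace)
```

Each reflection appears as a step in the normalized cumulative curve, which is why the curve reveals both the presence and the depth (sample index) of a buried object while leaving the raw trace untouched for later spectral classification.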
X-ray system simulation software tools for radiology and radiography education.
Kengyelics, Stephen M; Treadgold, Laura A; Davies, Andrew G
2018-02-01
To develop x-ray simulation software tools to support delivery of radiological science education for a range of learning environments and audiences including individual study, lectures, and tutorials. Two software tools were developed; one simulated x-ray production for a simple two dimensional radiographic system geometry comprising an x-ray source, beam filter, test object and detector. The other simulated the acquisition and display of two dimensional radiographic images of complex three dimensional objects using a ray casting algorithm through three dimensional mesh objects. Both tools were intended to be simple to use, produce results accurate enough to be useful for educational purposes, and have an acceptable simulation time on modest computer hardware. The radiographic factors and acquisition geometry could be altered in both tools via their graphical user interfaces. A comparison of radiographic contrast measurements of the simulators to a real system was performed. The contrast output of the simulators had excellent agreement with measured results. The software simulators were deployed to 120 computers on campus. The software tools developed are easy-to-use, clearly demonstrate important x-ray physics and imaging principles, are accessible within a standard University setting and could be used to enhance the teaching of x-ray physics to undergraduate students. Current approaches to teaching x-ray physics in radiological science lack immediacy when linking theory with practice. This method of delivery allows students to engage with the subject in an experiential learning environment. Copyright © 2017. Published by Elsevier Ltd.
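The radiographic contrast such simulators compute ultimately rests on Beer-Lambert attenuation along each ray. The sketch below illustrates that physics only; the attenuation coefficients are round illustrative numbers, not data from the described tools.

```python
import math

def transmitted_intensity(i0, segments):
    """Beer-Lambert attenuation along one ray; `segments` lists
    (linear attenuation coefficient in 1/cm, path length in cm)
    for each material the ray crosses."""
    return i0 * math.exp(-sum(mu * d for mu, d in segments))

def contrast(i_background, i_object):
    """Simple radiographic contrast between two detector readings."""
    return (i_background - i_object) / i_background

# A ray through 10 cm of soft tissue vs. one that also crosses 2 cm
# of denser material (coefficients are illustrative assumptions).
i_bg = transmitted_intensity(1000.0, [(0.2, 10.0)])
i_ob = transmitted_intensity(1000.0, [(0.2, 8.0), (0.5, 2.0)])
```

A ray-casting simulator repeats this per-ray calculation over a mesh of path lengths, which is how beam filtration and object composition changes propagate into the displayed image contrast.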
Yeo, Lami; Romero, Roberto; Jodicke, Cristiano; Kim, Sun Kwon; Gonzalez, Juan M.; Oggè, Giovanna; Lee, Wesley; Kusanovic, Juan Pedro; Vaisbuch, Edi; Hassan, Sonia S.
2010-01-01
Objective To describe a novel and simple technique (STAR: Simple Targeted Arterial Rendering) to visualize the fetal cardiac outflow tracts from dataset volumes obtained with spatiotemporal image correlation (STIC) and applying a new display technology (OmniView). Methods We developed a technique to image the outflow tracts by drawing three dissecting lines through the four-chamber view of the heart contained in a STIC volume dataset. Each line generated the following plane: 1) Line 1: ventricular septum “en face” with both great vessels (pulmonary artery anterior to the aorta); 2) Line 2: pulmonary artery with continuation into the longitudinal view of the ductal arch; and 3) Line 3: long axis view of the aorta arising from the left ventricle. The pattern formed by all 3 lines intersecting approximately through the crux of the heart resembles a “star”. The technique was then tested in 50 normal hearts (15.3 – 40.4 weeks of gestation). To determine if the technique could identify planes that departed from the normal images, we tested the technique in 4 cases with proven congenital heart defects (ventricular septal defect, transposition of great vessels, tetralogy of Fallot, and pulmonary atresia with intact ventricular septum). Results The STAR technique was able to generate the intended planes in all 50 normal cases. In the abnormal cases, the STAR technique allowed identification of the ventricular septal defect, demonstrated great vessel anomalies, and displayed views that deviated from what was expected from the examination of normal hearts. Conclusions This novel and simple technique can be used to visualize the outflow tracts and ventricular septum “en face” in normal fetal hearts. The inability to obtain expected views or the appearance of abnormal views in the generated planes should raise the index of suspicion for congenital heart disease involving the great vessels and/or the ventricular septum. 
The STAR technique may simplify examination of the fetal heart and could reduce operator dependency. PMID:20878672
Shih, Ching-Hsiang; Chang, Man-Ling; Mohua, Zhang
2012-01-01
This study evaluated whether two people with developmental disabilities would be able to actively perform simple occupational activities to control their preferred environmental stimulation using a Nintendo Wii Remote Controller with a newly developed three-dimensional object orientation detection program (TDOODP, a new software program that turns a Wii Remote Controller into a three-dimensional object orientation detector). An ABAB design, in which A represented the baseline and B represented intervention phases, was adopted in this study. The data show that the performance of both participants increased significantly during the intervention phases (i.e. they performed more simple occupational activities to activate the control system and produce environmental stimulation). The practical and developmental implications of the findings are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.
Measuring Drag Force in Newtonian Liquids
NASA Astrophysics Data System (ADS)
Mawhinney, Matthew T.; O'Donnell, Mary Kate; Fingerut, Jonathan; Habdas, Piotr
2012-03-01
The experiments described in this paper have two goals. The first goal is to show how students can perform simple but fundamental measurements of objects moving through simple liquids (such as water, oil, or honey). In doing so, students can verify Stokes' law, which governs the motion of spheres through simple liquids, and see how it fails at higher object speeds. Moreover, they can qualitatively study fluid patterns at various object speeds (Reynolds numbers). The second goal is to help students make connections between physics and other sciences. Specifically, the results of these experiments can be used to help students understand the role of fluid motion in determining the shape of an organism, or where it lives. At Saint Joseph's University we developed these experiments as part of a new course in biomechanics where both physics and biology undergraduate students bring their ideas and expertise to enrich a shared learning environment.
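The verification the abstract describes reduces to a few textbook formulas: Stokes drag, the terminal velocity where drag balances buoyant weight, and the Reynolds number that marks where Stokes' law fails. A small sketch with illustrative material values (a 5 mm steel ball in a honey-like liquid, not the paper's specific apparatus):

```python
import math

def stokes_drag(mu, radius, speed):
    """Stokes' law drag on a sphere: F = 6*pi*mu*R*v (valid for Re << 1)."""
    return 6 * math.pi * mu * radius * speed

def terminal_velocity(radius, rho_sphere, rho_fluid, mu, g=9.81):
    """Terminal speed where Stokes drag balances the buoyant weight."""
    return 2 * (rho_sphere - rho_fluid) * g * radius**2 / (9 * mu)

def reynolds_number(rho_fluid, speed, radius, mu):
    """Re based on sphere diameter; Stokes' law holds only for Re < ~1."""
    return rho_fluid * speed * 2 * radius / mu

# 5 mm radius steel ball (7800 kg/m^3) in a honey-like liquid
# (density 1400 kg/m^3, viscosity 10 Pa*s) -- illustrative values.
v_t = terminal_velocity(0.005, 7800.0, 1400.0, mu=10.0)
re = reynolds_number(1400.0, v_t, 0.005, mu=10.0)
```

Comparing the measured fall speed against `v_t`, and checking that `re` stays well below 1, is exactly the kind of consistency check students can perform; at higher speeds the measured drag exceeds the Stokes prediction.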
Image Formation in Lenses and Mirrors, a Complete Representation
ERIC Educational Resources Information Center
Bartlett, Albert A.
1976-01-01
Provides tables and graphs that give a complete and simple picture of the relationships of image distance, object distance, and magnification in all formations of images by simple lenses and mirrors. (CP)
A Systematic Study of Simple Combinatorial Configurations.
ERIC Educational Resources Information Center
Dubois, Jean-Guy
1984-01-01
A classification of the simple combinatorial configurations which correspond to various cases of distribution and ordering of objects into boxes is given (in French). Concrete descriptions, structured relations, translations, and formalizations are discussed. (MNS)
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Debi, R; Elbaz, A; Mor, A; Kahn, G; Peskin, B; Beer, Y; Agar, G; Morag, G; Segal, G
2017-06-01
The purpose of the current study was to compare the gait patterns of patients with three differing knee pathologies - knee osteoarthritis (OA), degenerative meniscal lesion (DML) and spontaneous osteonecrosis of the knee (SONK) - and a group of healthy controls. We hypothesized that a simple gait test would detect differences between the different knee pathologies. Forty-seven patients with bilateral knee OA, 47 patients with DML, 28 patients with SONK and 27 healthy controls were included in this analysis. Patients underwent a spatiotemporal gait assessment and were asked to complete the Western Ontario and McMaster University (WOMAC) Index and the Short-Form (SF)-36 Health Survey. ANOVA tests, followed by Bonferroni multiple comparison tests, and chi-squared tests were performed for continuous and categorical variables, respectively. Significant differences were found for all gait measures and clinical questionnaires between healthy controls and all knee conditions. Patients with SONK differed from patients with bilateral knee OA and DML in all gait measures and clinical questionnaires, except for WOMAC subscales. There were no significant differences between patients with bilateral knee OA and patients with DML. Symmetry was also examined and revealed asymmetry in some gait parameters in patients with SONK and DML. Based on the differences in gait parameters found in the current study, adding an objective functional spatiotemporal gait test may assist in the diagnostic process of knee pathologies. Case-control study, Level III. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
A Deficit in Movement-Derived Sentences in German-Speaking Hearing-Impaired Children
Ruigendijk, Esther; Friedmann, Naama
2017-01-01
Children with hearing impairment (HI) show disorders in syntax and morphology. The question is whether and how these disorders are connected to problems in the auditory domain. The aim of this paper is to examine whether moderate to severe hearing loss at a young age affects the ability of German-speaking orally trained children to understand and produce sentences. We focused on sentence structures that are derived by syntactic movement, which have been identified as a sensitive marker for syntactic impairment in other languages and in other populations with syntactic impairment. Therefore, our study tested subject and object relatives, subject and object Wh-questions, passive sentences, and topicalized sentences, as well as sentences with verb movement to second sentential position. We tested 19 HI children aged 9;5–13;6 and compared their performance with hearing children using comprehension tasks of sentence-picture matching and sentence repetition tasks. For the comprehension tasks, we included HI children who passed an auditory discrimination task; for the sentence repetition tasks, we selected children who passed a screening task of simple sentence repetition without lip-reading; this made sure that they could perceive the words in the tests, so that we could test their grammatical abilities. The results clearly showed that most of the participants with HI had considerable difficulties in the comprehension and repetition of sentences with syntactic movement: they had significant difficulties understanding object relatives, Wh-questions, and topicalized sentences, and in the repetition of object who and which questions and subject relatives, as well as in sentences with verb movement to second sentential position. Repetition of passives was only problematic for some children. Object relatives were still difficult at this age for both HI and hearing children. 
An additional important outcome of the study is that not all sentence structures are impaired—passive structures were not problematic for most of the HI children PMID:28659836
Testing and Validation of the Dynamic Inertia Measurement Method
NASA Technical Reports Server (NTRS)
Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David
2015-01-01
The Dynamic Inertia Measurement (DIM) method uses a ground vibration test setup to determine the mass properties of an object using information from frequency response functions. Most conventional mass properties testing involves using spin tables or pendulum-based swing tests, which for large aerospace vehicles becomes increasingly difficult and time-consuming, and therefore expensive, to perform. The DIM method has been validated on small test articles but has not been successfully proven on large aerospace vehicles. In response, the National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) conducted mass properties testing on an "iron bird" test article that is comparable in mass and scale to a fighter-type aircraft. The simple two-I-beam design of the "iron bird" was selected to ensure accurate analytical mass properties. Traditional swing testing was also performed to compare the level of effort, amount of resources, and quality of data with the DIM method. The DIM test showed favorable results for the center of gravity and moments of inertia; however, the products of inertia showed disagreement with analytical predictions.
Accuracy of simple urine tests for diagnosis of urinary tract infections in low-risk pregnant women.
Feitosa, Danielle Cristina Alves; da Silva, Márcia Guimarães; de Lima Parada, Cristina Maria Garcia
2009-01-01
Anatomic and physiological alterations during pregnancy predispose pregnant women to urinary tract infections (UTI). This study aimed to identify the accuracy of the simple urine test for UTI diagnosis in low-risk pregnant women. Diagnostic test performance was assessed in Botucatu, SP, involving 230 pregnant women, between 2006 and 2008. Results showed 10% UTI prevalence. Sensitivity, specificity and accuracy of the simple urine test were 95.6%, 63.3% and 66.5%, respectively, in relation to UTI diagnoses. Analysis of the positive (PPV) and negative (NPV) predictive values showed that when the simple urine test was normal, UTI was very unlikely (NPV 99.2%), whereas an altered test result still corresponded to only a small probability of UTI (PPV 22.4%). It was concluded that the accuracy of the simple urine test as a diagnostic means for UTI was low, and that performing a urine culture is essential for appropriate diagnosis.
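The reported figures can be reproduced from a 2x2 table. The counts below are back-calculated from the stated prevalence (10% of 230) and the accuracy figures; they are a hypothetical reconstruction for illustration, not the study's raw data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic accuracy measures."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# 23 women with UTI (10% of 230): 22 true positives, 1 false negative;
# 207 without UTI: 131 true negatives, 76 false positives
# (counts inferred from the reported percentages).
m = diagnostic_metrics(tp=22, fp=76, fn=1, tn=131)
```

With these counts the five reported statistics (95.6%, 63.3%, 22.4%, 99.2%, 66.5%) all fall out consistently, which also illustrates why a high-sensitivity, low-specificity screen at 10% prevalence yields a high NPV but a low PPV.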
Matsumoto, Narihisa; Eldridge, Mark A G; Saunders, Richard C; Reoli, Rachel; Richmond, Barry J
2016-01-06
In primates, visual recognition of complex objects depends on the inferior temporal lobe. By extension, categorizing visual stimuli based on similarity ought to depend on the integrity of the same area. We tested three monkeys before and after bilateral anterior inferior temporal cortex (area TE) removal. Although mildly impaired after the removals, they retained the ability to assign stimuli to previously learned categories, e.g., cats versus dogs, and human versus monkey faces, even with trial-unique exemplars. After the TE removals, they learned in one session to classify members from a new pair of categories, cars versus trucks, as quickly as they had learned the cats versus dogs before the removals. As with the dogs and cats, they generalized across trial-unique exemplars of cars and trucks. However, as seen in earlier studies, these monkeys with TE removals had difficulty learning to discriminate between two simple black and white stimuli. These results raise the possibility that TE is needed for memory of simple conjunctions of basic features, but that it plays only a small role in generalizing overall configural similarity across a large set of stimuli, such as would be needed for perceptual categorical assignment. The process of seeing and recognizing objects is attributed to a set of sequentially connected brain regions stretching forward from the primary visual cortex through the temporal lobe to the anterior inferior temporal cortex, a region designated area TE. Area TE is considered the final stage for recognizing complex visual objects, e.g., faces. It has been assumed, but not tested directly, that this area would be critical for visual generalization, i.e., the ability to place objects such as cats and dogs into their correct categories. 
Here, we demonstrate that monkeys rapidly and seemingly effortlessly categorize large sets of complex images (cats vs dogs, cars vs trucks), surprisingly, even after removal of area TE, leaving a puzzle about how this generalization is done. Copyright © 2016 the authors 0270-6474/16/360043-11$15.00/0.
Statistical Issues for Uncontrolled Reentry Hazards
NASA Technical Reports Server (NTRS)
Matney, Mark
2008-01-01
A number of statistical tools have been developed over the years for assessing the risk of reentering objects to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper looks at a number of these theoretical assumptions, examining the mathematical basis for the hazard calculations, and outlining the conditions under which the simplifying assumptions hold. In addition, this paper will also outline some new tools for assessing ground hazard risk in useful ways. Also, this study is able to make use of a database of known uncontrolled reentry locations measured by the United States Department of Defense. By using data from objects that were in orbit more than 30 days before reentry, sufficient time is allowed for the orbital parameters to be randomized in the way the models are designed to compute. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave basically like simple Kepler orbits. However, there are a number of factors - including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere - that could cause them to diverge from simple Kepler orbit behavior and change the ground footprints. 
The measured latitude and longitude distributions of these objects provide data that can be directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.
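For a circular Kepler orbit with randomized node and phase, the sub-satellite latitude has a closed-form density that footprint models of this kind rely on. A minimal sketch follows; the 51.6 degree inclination is just an example value, and the formula is the standard idealized result, not the paper's full model.

```python
import math

def latitude_pdf(lat, inc):
    """Probability density of sub-satellite latitude (radians) for a
    randomized circular orbit of inclination `inc`. The density
    diverges toward lat = +/- inc, which is why reentry risk
    concentrates near the inclination bands."""
    s = math.sin(inc) ** 2 - math.sin(lat) ** 2
    if s <= 0.0:
        return 0.0
    return math.cos(lat) / (math.pi * math.sqrt(s))

inc = math.radians(51.6)  # example inclination
# Midpoint-rule check that the density integrates to 1 over [-inc, inc]
n = 20000
h = 2 * inc / n
total = sum(latitude_pdf(-inc + (j + 0.5) * h, inc) for j in range(n)) * h
```

Comparing measured reentry latitudes against this idealized density is the kind of empirical test the paper describes; systematic deviations would point to the perturbations (gravitational harmonics, atmospheric rotation, the equatorial bulge) listed above.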
Self testing for diabetes mellitus.
Davies, M; Alban-Davies, H; Cook, C; Day, J
1991-01-01
OBJECTIVE--To develop a simple, economically viable, and effective means of population screening for diabetes mellitus. DESIGN--A postal request system for self testing for glycosuria with foil wrapped dipsticks. Preprandial and postprandial tests were compared with a single postprandial test. The subjects were instructed how to test, and a result card was supplied on which to record and return the result. All those recording a positive test result and 50 people recording a negative result were invited for an oral glucose tolerance test. SETTING--General practice in east Suffolk, list size 11534. PATIENTS--All subjects aged 45-70 years registered with the practice were identified by Suffolk Family Health Services Authority (n = 3057). The 73 subjects known to have diabetes from the practice's register were excluded, leaving 2984 subjects, 2363 (79.2%) of whom responded. 1167 subjects completed the single test and 1196 the two tests. MAIN OUTCOME MEASURES--Response rate and number of patients with glycosuria. Sensitivity, specificity, and positive predictive value of a single postprandial test and preprandial and postprandial tests. Number of new cases of diabetes identified and cost of screening. RESULTS--Of the patients completing the single postprandial test, 29 had a positive result; an oral glucose tolerance test showed that eight (28%) had diabetes, six (21%) impaired glucose tolerance, and 14 (48%) normal glucose tolerance. 44 of the group who tested before and after eating had a positive result; nine (20%) had diabetes, five (11%) impaired tolerance, and 26 (59%) normal tolerance. Screening cost 59p per subject and 81 pounds per case detected. Of the 17 people with previously undiagnosed diabetes, eight were asymptomatic and 11 had not visited their general practitioner in the past three months.
CONCLUSIONS--A postal request system for self testing for postprandial glycosuria in people aged 45-70 is a simple and effective method of population screening for diabetes mellitus. PMID:1912918
Contribution to interplay between a delamination test and a sensory analysis of mid-range lipsticks.
Richard, C; Tillé-Salmon, B; Mofid, Y
2016-02-01
Lipstick is currently one of the most sold products of the cosmetics industry, and the competition between the various manufacturers is significant. Customers mainly seek products with high spreadability, especially long-lasting or long wear on the lips. Evaluation tests of cosmetics are usually performed by sensory analysis, which can represent a considerable cost. The object of this study was to develop a fast and simple delamination test (an objective method with calibrated instruments) and to compare the obtained results with those of a discriminative sensory analysis (a subjective method) in order to show the relevance of the instrumental test. Three mid-range lipsticks were randomly chosen and tested. They were made of compositions as described by the International Nomenclature of Cosmetic Ingredients (INCI). Instrumental characterization was performed by texture profile analysis and by a special delamination test. The sensory analysis was deliberately conducted with an untrained panel as a blind test to confirm or refute the possible correspondence. The two approaches gave the same type of classification. The high-fat lipstick had the worst behaviour in the delamination test and the worst rating of the intensity of descriptors in the sensory analysis. There is a high correlation between the sensory analysis and the instrumental measurements in this study. The delamination test carried out should make it possible to quickly determine lasting performance (a screening test) and consequently optimize the basic formula of lipsticks. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
Development of Three-Dimensional Completion of Complex Objects
ERIC Educational Resources Information Center
Soska, Kasey C.; Johnson, Scott P.
2013-01-01
Three-dimensional (3D) object completion, the ability to perceive the backs of objects seen from a single viewpoint, emerges at around 6 months of age. Yet, only relatively simple 3D objects have been used in assessing its development. This study examined infants' 3D object completion when presented with more complex stimuli. Infants…
Kis, Anna; Gácsi, Márta; Range, Friederike; Virányi, Zsófia
2012-01-01
In this paper, we describe a behaviour pattern similar to the "A-not-B" error found in human infants and young apes in a monkey species, the common marmosets (Callithrix jacchus). In contrast to the classical explanation, recently it has been suggested that the "A-not-B" error committed by human infants is at least partially due to misinterpretation of the hider's ostensively communicated object hiding actions as potential 'teaching' demonstrations during the A trials. We tested whether this so-called Natural Pedagogy hypothesis would account for the A-not-B error that marmosets commit in a standard object permanence task, but found no support for the hypothesis in this species. Alternatively, we present evidence that lower level mechanisms, such as attention and motivation, play an important role in committing the "A-not-B" error in marmosets. We argue that these simple mechanisms might contribute to the effect of undeveloped object representational skills in other species including young non-human primates that commit the A-not-B error.
NASA Astrophysics Data System (ADS)
Giuffre, Christopher James
In the natural world there is no such thing as a perfectly sharp edge; whether through wear or machining imprecision, at the macroscopic scale all edges have curvature. This curvature can have significant impact when comparing results with theory. Both numerical and analytic models for the contact of an object with a sharp edge predict infinite stresses which are not present in the physical world. It is for this reason that the influence of rounded edges must be studied to better understand how they affect model response. Using a commercially available finite element package, this influence is studied in two different problems: how edge geometry affects the shape of a contusion (bruise), and the accuracy of analytic models for the shaft-loaded blister test (SLBT). The contusion study presents work that can be used to enable medical examiners to better determine if the object in question was capable of causing the contusions present. Using a simple layered tissue model which represents a generic location on the human body, a sweep of objects with different edge properties is studied using a simple strain-based injury metric. This analysis aims to examine the role that contact area and energy have on the formation, location, and shape of the resulting contusion. In studying the SLBT with finite element analysis and cohesive zone modeling, the assessment of various analytic models will provide insight into how to accurately measure the fracture energy for both the simulation and experiment. This provides insight into the interactions between a film, the substrate it is bonded to, and the loading plug. In addition, parametric studies are used to examine potential experimental designs and enable future work in this field. The final product of this project provides tools and insight for future study of the effect rounded edges have on contact, enabling more focused studies within desired regimes of interest.
Measuring Drag Force in Newtonian Liquids
ERIC Educational Resources Information Center
Mawhinney, Matthew T.; O'Donnell, Mary Kate; Fingerut, Jonathan; Habdas, Piotr
2012-01-01
The experiments described in this paper have two goals. The first goal is to show how students can perform simple but fundamental measurements of objects moving through simple liquids (such as water, oil, or honey). In doing so, students can verify Stokes' law, which governs the motion of spheres through simple liquids, and see how it fails at…
NASA Astrophysics Data System (ADS)
Civera Lorenzo, Tamara
2017-10-01
Brief presentation about the J-PLUS EDR data access web portal (http://archive.cefca.es/catalogues/jplus-edr), describing the different services available to retrieve images and catalogue data. The J-PLUS Early Data Release (EDR) archive includes two types of data: images, and dual and single catalogue data, which include parameters measured from the images. The J-PLUS web portal offers catalogue data and images through several different online data access tools or services, each suited to a particular need: a coverage map, a sky navigator, object visualization, image search, cone search, object list search, and Virtual Observatory services (Simple Cone Search, Simple Image Access Protocol, Simple Spectral Access Protocol, and Table Access Protocol).
Simple Proof of Jury Test for Complex Polynomials
NASA Astrophysics Data System (ADS)
Choo, Younseok; Kim, Dongmin
Recently some attempts have been made in the literature to give simple proofs of the Jury test for real polynomials. This letter presents a similar result for complex polynomials. A simple proof of the Jury test for complex polynomials is provided, based on Rouché's Theorem and a single-parameter characterization of the Schur stability property for complex polynomials.
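The stability property the Jury test certifies (all roots strictly inside the unit circle) can be checked by a classical Schur-Cohn style reduction, a single-parameter recursion in the same spirit. This is an equivalent textbook criterion, sketched for illustration, not the letter's specific proof or tabular form.

```python
def is_schur_stable(coeffs, tol=1e-12):
    """True iff every root of the polynomial lies strictly inside the
    unit circle. `coeffs` are (possibly complex) coefficients, highest
    degree first. Each step requires |trailing| < |leading| (a necessary
    condition, since the product of the roots is trailing/leading), then
    reduces the degree via conj(a_n)*p(z) - a_0*p*(z), whose constant
    term cancels, where p*(z) is the reversed-conjugate polynomial."""
    c = [complex(a) for a in coeffs]
    while c and abs(c[0]) <= tol:  # drop vanishing leading terms
        c.pop(0)
    if len(c) < 2:
        raise ValueError("need a polynomial of degree >= 1")
    while len(c) > 1:
        lead, trail = c[0], c[-1]
        if abs(trail) >= abs(lead):
            return False
        star = [a.conjugate() for a in reversed(c)]  # reversed conjugate
        c = [lead.conjugate() * a - trail * b for a, b in zip(c, star)][:-1]
    return True
```

Each pass lowers the degree by one, so a degree-n polynomial is decided after n reductions, mirroring the rows of a Jury table.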
A simple prescription for simulating and characterizing gravitational arcs
NASA Astrophysics Data System (ADS)
Furlanetto, C.; Santiago, B. X.; Makler, M.; de Bom, C.; Brandt, C. H.; Neto, A. F.; Ferreira, P. C.; da Costa, L. N.; Maia, M. A. G.
2013-01-01
Simple models of gravitational arcs are crucial for simulating large samples of these objects with full control of the input parameters. These models also provide approximate and automated estimates of the shape and structure of the arcs, which are necessary for detecting and characterizing these objects in massive wide-area imaging surveys. Here we present and explore the ArcEllipse, a simple prescription for creating objects with a shape similar to gravitational arcs. We also present PaintArcs, a code that couples this geometrical form with a brightness distribution and adds the resulting object to images. Finally, we introduce ArcFitting, a tool that fits ArcEllipses to images of real gravitational arcs. We validate this fitting technique using simulated arcs and apply it to CFHTLS and HST images of tangential arcs around clusters of galaxies. Our simple ArcEllipse model for the arc, combined with a Sérsic profile for the source, recovers the total signal in real images typically within 10%-30%. The ArcEllipse+Sérsic models also automatically recover visual estimates of length-to-width ratios of real arcs. Residual maps between data and model images reveal the incidence of arc substructure. They may thus be used as a diagnostic for arcs formed by the merging of multiple images. The incidence of these substructures is the main factor that prevents ArcEllipse models from accurately describing real lensed systems.
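The core geometric idea, an ellipse whose major axis is bent along a circle of curvature radius r_c, can be sketched as below. This parametrization is our guess at the geometry from the description above, not the paper's exact ArcEllipse prescription, and the numeric parameters are arbitrary.

```python
import math

def arc_ellipse_points(r_c, length, width, n=200):
    """Boundary points of an ellipse of given arc length and width whose
    major axis follows a circle of radius r_c centered at the origin
    (an illustrative guess at an ArcEllipse-like shape)."""
    pts = []
    for k in range(n):
        t = 2 * math.pi * k / n
        theta = (length / 2) * math.cos(t) / r_c  # angle along the arc
        rho = r_c + (width / 2) * math.sin(t)     # radial offset
        pts.append((rho * math.cos(theta), rho * math.sin(theta)))
    return pts

pts = arc_ellipse_points(r_c=10.0, length=8.0, width=1.0)
```

By construction the curve stays inside an annulus of half-width `width/2` around the circle of radius `r_c` and spans an angle `length/r_c`, so length-to-width ratio and curvature fall directly out of the fitted parameters, which is what makes such a model useful for automated arc characterization.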
Choose and choose again: appearance-reality errors, pragmatics and logical ability.
Deák, Gedeon O; Enright, Brian
2006-05-01
In the Appearance/Reality (AR) task some 3- and 4-year-old children make perseverative errors: they choose the same word for the appearance and the function of a deceptive object. Are these errors specific to the AR task, or signs of a general question-answering problem? Preschoolers completed five tasks: AR; simple successive forced-choice question pairs (QP); flexible naming of objects (FN); working memory (WM) span; and indeterminacy detection (ID). AR errors correlated with QP errors. Insensitivity to indeterminacy predicted perseveration in both tasks. Neither WM span nor flexible naming predicted other measures. Age predicted sensitivity to indeterminacy. These findings suggest that AR tests measure a pragmatic understanding; specifically, different questions about a topic usually call for different answers. This understanding is related to the ability to detect indeterminacy of each question in a series. AR errors are unrelated to the ability to represent an object as belonging to multiple categories, to working memory span, or to inhibiting previously activated words.
Cooper, Elisa; Henson, Richard N.
2013-01-01
A simple cue can be sufficient to elicit vivid recollection of a past episode. Theoretical models suggest that upon perceiving such a cue, disparate episodic elements held in neocortex are retrieved through hippocampal pattern completion. We tested this fundamental assumption by applying functional magnetic resonance imaging (fMRI) while objects or scenes were used to cue participants' recall of previously paired scenes or objects, respectively. We first demonstrate functional segregation within the medial temporal lobe (MTL), showing domain specificity in perirhinal and parahippocampal cortices (for object-processing vs scene-processing, respectively), but domain generality in the hippocampus (retrieval of both stimulus types). Critically, using fMRI latency analysis and dynamic causal modeling, we go on to demonstrate functional integration between these MTL regions during successful memory retrieval, with reversible signal flow from the cue region to the target region via the hippocampus. This supports the claim that the human hippocampus provides the vital associative link that integrates information held in different parts of cortex. PMID:23986252
Low-cost oblique illumination: an image quality assessment.
Ruiz-Santaquiteria, Jesus; Espinosa-Aranda, Jose Luis; Deniz, Oscar; Sanchez, Carlos; Borrego-Ramos, Maria; Blanco, Saul; Cristobal, Gabriel; Bueno, Gloria
2018-01-01
We study the effectiveness of several low-cost oblique illumination filters in improving overall image quality, in comparison with standard bright-field imaging. For this purpose, a dataset of 3360 diatom images belonging to 21 taxa was acquired. Subjective and objective image quality assessments were performed. The subjective evaluation was carried out by a group of diatom experts using a psychophysical test in which resolution, focus, and contrast were assessed. Moreover, several objective no-reference image quality metrics were applied to the same image dataset to complete the study, together with the calculation of several texture features to analyze the effect of these filters on textural properties. Both image quality evaluation methods, subjective and objective, showed better results for images acquired using these illumination filters than for the unfiltered images. These promising results confirm that this kind of illumination filter can be a practical way to improve image quality, thanks to the simplicity and low cost of the design and manufacturing process. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
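One common no-reference metric of the kind applied in such studies is the variance of the Laplacian, which scores focus/sharpness without needing a reference image. A minimal sketch (the specific metrics used in the paper may differ):

```python
import numpy as np

def laplacian_variance(img):
    """No-reference focus measure: variance of a 4-neighbour Laplacian.
    Higher values indicate sharper (better focused) images."""
    img = np.asarray(img, dtype=float)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
# crude blur: average each pixel with its four neighbours
blurred = (sharp[:-2, 1:-1] + sharp[2:, 1:-1] + sharp[1:-1, :-2]
           + sharp[1:-1, 2:] + sharp[1:-1, 1:-1]) / 5.0
print(laplacian_variance(sharp) > laplacian_variance(blurred))  # True
```

Comparing such a score between filtered and unfiltered acquisitions of the same specimen is the essence of the objective evaluation described.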
Identification of simple objects in image sequences
NASA Astrophysics Data System (ADS)
Geiselmann, Christoph; Hahn, Michael
1994-08-01
We present an investigation into the identification and location of simple objects in color image sequences, using the identification of traffic signs as an example. Three aspects are of special interest. First, regions that may contain the object have to be detected. The separation of those regions from the background can be based on color, motion, and contours; in the experiments, all three possibilities are investigated. The second aspect focuses on the extraction of suitable features for the identification of the objects. For that purpose, the border line of the region of interest is used. For planar objects, a sufficient approximation of perspective projection is affine mapping. Consequently, it is natural to extract affine-invariant features from the border line. The investigation includes invariant features based on Fourier descriptors and moments. Finally, the object is identified by maximum likelihood classification. In the experiments, all three basic object types are correctly identified, and the probabilities of misclassification were found to be below 1%.
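Fourier descriptors of a closed contour are a standard route to such invariant features. The sketch below achieves translation, scale, rotation, and starting-point invariance (a simpler invariance class than the full affine invariants used in the paper): the contour is treated as complex samples, the DC term is dropped, magnitudes are normalized by the first harmonic, and phases are discarded.

```python
import numpy as np

def fourier_descriptors(x, y, k=8):
    """Invariant Fourier descriptors of a closed contour.
    z = x + iy; dropping the DC term removes translation, dividing by
    |c1| removes scale, and taking magnitudes removes rotation and
    starting-point dependence."""
    z = np.asarray(x) + 1j * np.asarray(y)
    c = np.fft.fft(z)
    return np.abs(c[1:k + 1]) / np.abs(c[1])

t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
x, y = 2 * np.cos(t), np.sin(t)              # an ellipse
fd1 = fourier_descriptors(x, y)
# rotate by 30 degrees, scale by 3, translate: descriptors are unchanged
w = 3 * np.exp(1j * np.pi / 6) * (x + 1j * y) + (5 - 2j)
fd2 = fourier_descriptors(w.real, w.imag)
print(np.allclose(fd1, fd2))  # True
```

A maximum likelihood classifier would then operate on these descriptor vectors rather than on the raw border line.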
A Simple Apparatus for Demonstrating Fluid Forces and Newton's Third Law
NASA Astrophysics Data System (ADS)
Mohazzabi, Pirooz; James, Mark C.
2012-12-01
Over 2200 years ago, in order to determine the purity of a golden crown of the king of Syracuse, Archimedes submerged the crown in water and determined its volume by measuring the volume of the displaced water. This simple experiment became the foundation of what is now known as Archimedes' principle: an object fully or partially immersed in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the object. The principle is used to answer all questions regarding buoyancy, and the method is still prescribed for determining the volume of irregularly shaped objects.
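The displacement method reduces to simple arithmetic: density = mass / displaced volume, and the buoyant force equals ρ_fluid · V · g. A sketch with illustrative numbers (the crown's mass and volume below are made up):

```python
RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

def density_by_displacement(mass_kg, displaced_volume_m3):
    """Archimedes' method: the object's volume equals the volume of
    water it displaces, so density = mass / displaced volume."""
    return mass_kg / displaced_volume_m3

def buoyant_force(displaced_volume_m3, rho_fluid=RHO_WATER, g=G):
    """Archimedes' principle: buoyant force equals the weight of the
    displaced fluid."""
    return rho_fluid * displaced_volume_m3 * g

# A hypothetical 1.0 kg "crown" displacing 60 cm^3 of water:
rho = density_by_displacement(1.0, 60e-6)
print(round(rho))             # 16667 kg/m^3 -- below gold's ~19300
print(buoyant_force(60e-6))   # ~0.59 N of buoyant force while submerged
```

A density well below that of pure gold is exactly the evidence Archimedes needed that the crown had been adulterated.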
Using Spatial Correlations of SPDC Sources for Increasing the Signal to Noise Ratio in Images
NASA Astrophysics Data System (ADS)
Ruíz, A. I.; Caudillo, R.; Velázquez, V. M.; Barrios, E.
2017-05-01
We experimentally show that, by using the spatial correlations of photon pairs produced by spontaneous parametric down-conversion, it is possible to increase the signal-to-noise ratio in images of objects illuminated with those photons; in comparison, objects illuminated with light from a laser show a lower ratio. Our simple experimental set-up was capable of producing an average signal-to-noise improvement of 11 dB for parametric down-converted light over laser light. This simple method can be easily implemented to obtain high-contrast images of faint objects and to transmit information with low noise.
Pikto-Pietkiewicz, Witold; Przewłocka, Monika; Chybowska, Barbara; Cyciwa, Alona; Pasierski, Tomasz
2014-01-01
Type 2 diabetes markedly increases the risk of coronary heart disease (CHD), and screening for CHD is suggested by the guidelines. The aim of the study was to compare the diagnostic usefulness of the simple exercise test score, incorporating the clinical data and cardiac stress test results, with the standard stress test in patients with type 2 diabetes. A total of 62 consecutive patients (aged 65.4 ±8.5 years; 32 men) with type 2 diabetes and clinical symptoms suggesting CHD underwent a stress test followed by coronary angiography. The simple score was calculated for all patients. Significant coronary stenosis was observed in 41 patients (66.1%). Stress test results were positive in 36 patients (58.1%). The mean simple score was high (65.5 ±14.3 points). A positive linear relationship was observed between the score and the prevalence of CHD (R² = 0.19; P <0.001) as well as its severity (R² = 0.23; P <0.001). The area under the receiver-operating characteristic curve for the simple score was 0.74 (95% confidence interval [CI], 0.62-0.86). At the original cut-off value of 60 points, the score had a similar prognostic value to that of the standard stress test. However, in a multivariate analysis, only the simple score (odds ratio [OR], 1.46; 95% CI, 1.11-1.94; P <0.01 for an increase in the score by 1 point) and male sex (OR, 1.57; 95% CI, 1.24-1.98; P <0.001) remained independent predictors of CHD. In patients with type 2 diabetes, the simple score correlated with the prevalence and severity of CHD. However, the cut-off value of 60 points was inadequate in the population of diabetic patients with high risk of CHD. The simple score used instead of or together with the stress test was a better predictor of CHD than the stress test alone.
An objective measure of physical function of elderly outpatients. The Physical Performance Test.
Reuben, D B; Siu, A L
1990-10-01
Direct observation of physical function has the advantage of providing an objective, quantifiable measure of functional capabilities. We have developed the Physical Performance Test (PPT), which assesses multiple domains of physical function using observed performance of tasks that simulate activities of daily living of various degrees of difficulty. Two versions are presented: a nine-item scale that includes writing a sentence, simulated eating, turning 360 degrees, putting on and removing a jacket, lifting a book and putting it on a shelf, picking up a penny from the floor, a 50-foot walk test, and climbing stairs (scored as two items); and a seven-item scale that does not include stairs. The PPT can be completed in less than 10 minutes and requires only a few simple props. We then tested the validity of PPT using 183 subjects (mean age, 79 years) in six settings including four clinical practices (one of Parkinson's disease patients), a board-and-care home, and a senior citizens' apartment. The PPT was reliable (Cronbach's alpha = 0.87 and 0.79, interrater reliability = 0.99 and 0.93 for the nine-item and seven-item tests, respectively) and demonstrated concurrent validity with self-reported measures of physical function. Scores on the PPT for both scales were highly correlated (.50 to .80) with modified Rosow-Breslau, Instrumental and Basic Activities of Daily Living scales, and Tinetti gait score. Scores on the PPT were more moderately correlated with self-reported health status, cognitive status, and mental health (.24 to .47), and negatively with age (-.24 and -.18). Thus, the PPT also demonstrated construct validity. The PPT is a promising objective measurement of physical function, but its clinical and research value for screening, monitoring, and prediction will have to be determined.
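The internal-consistency statistic reported for the PPT (Cronbach's alpha) is straightforward to compute from an item-score matrix; a minimal sketch with hypothetical scores:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical scores for 5 subjects on 3 correlated items:
scores = [[4, 4, 5],
          [2, 3, 2],
          [3, 3, 4],
          [5, 4, 5],
          [1, 2, 1]]
a = cronbach_alpha(scores)
print(0.0 < a <= 1.0)  # True -- and high, since the items move together
```

The PPT's reported alphas of 0.87 and 0.79 come from exactly this kind of calculation over the nine-item and seven-item scales.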
Simple Spectral Lines Data Model Version 1.0
NASA Astrophysics Data System (ADS)
Osuna, Pedro; Salgado, Jesus; Guainazzi, Matteo; Dubernet, Marie-Lise; Roueff, Evelyne; Osuna, Pedro; Salgado, Jesus
2010-12-01
This document presents a data model to describe spectral line transitions in the context of the Simple Line Access Protocol defined by the IVOA (c.f. Ref [13], IVOA Simple Line Access Protocol). The main objective of the model is to integrate with and support the Simple Line Access Protocol, with which it forms a compact unit. This integration allows seamless access to spectral line transitions available worldwide in the VO context. The model does not provide a complete description of atomic and molecular physics, whose scope is outside this document. In the astrophysical sense, a line is considered the result of a transition between two energy levels. On the basis of this assumption, a whole set of objects and attributes has been derived to properly define the information necessary to describe lines appearing in astrophysical contexts. The document has been written taking into account available information from many different line data providers (see the acknowledgments section).
Electronic test and calibration circuits, a compilation
NASA Technical Reports Server (NTRS)
1972-01-01
A wide variety of simple test and calibration circuits are compiled for the engineer and laboratory technician. The majority of the circuits were found to be inexpensive to assemble. Testing of electronic devices and components, instrument and system test, calibration and reference circuits, and simple test procedures are presented.
Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley; Lung, Shun-fat
2008-01-01
An object-oriented multidisciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, based on an optimization problem, have been successfully integrated with the MDAO tool. Closer synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
Matrix Fatigue Cracking Mechanisms of Alpha(2) TMC for Hypersonic Applications
NASA Technical Reports Server (NTRS)
Gabb, Timothy P.; Gayda, John
1994-01-01
The objective of this work was to understand matrix cracking mechanisms in a unidirectional α₂ TMC in possible hypersonic applications. A [0]₈ SCS-6/Ti-24Al-11Nb (at.%) TMC was first subjected to a variety of simple isothermal and nonisothermal fatigue cycles to evaluate the damage mechanisms under simple conditions. A modified ascent mission cycle test was then performed to evaluate the combined effects of loading modes. This cycle mixes mechanical cycling at 150 and 483 C, sustained loads, and a slow thermal cycle to 815 C. At the low cyclic stresses and strains more common in hypersonic applications, environment-assisted surface cracking limited fatigue resistance. This damage mechanism was most acute for out-of-phase nonisothermal cycles having extended cycle periods and for the ascent mission cycle. A simple linear fraction damage model was employed to help understand this damage mechanism. Time-dependent environmental damage was found to strongly influence out-of-phase and mission life, with mechanical cycling damage due to the combination of external loading and CTE mismatch stresses playing a smaller role. The mechanical cycling and sustained loads in the mission cycle also had a smaller role.
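A linear fraction damage model of the kind mentioned sums, over the loading blocks, the cycles applied divided by the cycles-to-failure for that condition (Miner-type accumulation); failure is predicted when the sum reaches 1. The numbers below are illustrative, not from the test program:

```python
def linear_damage_fraction(blocks):
    """blocks: iterable of (cycles_applied, cycles_to_failure) pairs,
    one per loading/environment condition. Returns accumulated damage;
    failure is predicted when the total reaches 1.0."""
    return sum(n / nf for n, nf in blocks)

# Hypothetical mission mix: mechanical cycles at two temperatures plus
# slow thermal cycles, each with an assumed life under that condition.
mission = [(500, 2000),    # cycles at 150 C
           (300, 1500),    # cycles at 483 C
           (100, 400)]     # slow thermal cycles to 815 C
d = linear_damage_fraction(mission)
print(d)         # 0.7
print(d < 1.0)   # True -- failure not yet predicted
```

Comparing the measured mission life against such a linear prediction is what reveals whether the damage modes interact or simply add.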
NASA Technical Reports Server (NTRS)
Rao, R. G. S.; Ulaby, F. T.
1977-01-01
The paper examines optimal sampling techniques for obtaining accurate spatial averages of soil moisture, at various depths and for cell sizes in the range 2.5-40 acres, with a minimum number of samples. Both simple random sampling and stratified sampling procedures are used to reach a set of recommended sample sizes for each depth and each cell size. The major conclusions from the statistical sampling test results are that (1) the number of samples required decreases with increasing depth; (2) when the total number of samples cannot be prespecified or the moisture in only a single layer is of interest, a simple random sampling procedure should be used, based on the observed mean and SD for data from a single field; (3) when the total number of samples can be prespecified and the objective is to measure the soil moisture profile with depth, stratified random sampling based on optimal allocation should be used; and (4) decreasing the sensor resolution cell size leads to fairly large decreases in sample sizes with stratified sampling procedures, whereas only a moderate decrease is obtained with simple random sampling procedures.
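The "optimal allocation" used with stratified sampling is commonly Neyman allocation, which assigns samples in proportion to each stratum's size times its standard deviation. A sketch under that assumption (the field data behind the paper's actual recommendations is not reproduced here):

```python
def neyman_allocation(n_total, strata):
    """Neyman optimal allocation for stratified random sampling.
    strata: list of (N_h, sd_h) pairs (stratum size, stratum std. dev.).
    Returns per-stratum sample sizes n_h proportional to N_h * sd_h."""
    weights = [N * sd for N, sd in strata]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# Three hypothetical depth strata: equal sizes, decreasing moisture
# variability with depth (consistent with conclusion (1) above).
strata = [(100, 4.0), (100, 2.0), (100, 1.0)]
print(neyman_allocation(70, strata))  # [40, 20, 10]
```

The more variable shallow stratum receives the most samples, which is why stratified designs gain so much when variability differs across strata.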
Sandwich Structure Risk Reduction in Support of the Payload Adapter Fitting
NASA Technical Reports Server (NTRS)
Nettles, A. T.; Jackson, J. R.; Guin, W. E.
2018-01-01
Reducing the risk of utilizing honeycomb sandwich structure for the Space Launch System payload adapter fitting includes determining which parameters need to be tested for damage tolerance to ensure a safe structure. Specimen size and boundary conditions are the most practical parameters to use in damage tolerance inspection. The effect of impacts over core splices and of foreign object debris between the facesheet and core is assessed. The effect of enhancing damage tolerance by applying an outer layer of woven carbon-fiber cloth is examined. A simple repair technique for barely visible impact damage that restores full compression strength is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, D.L.
1995-11-01
The objective of this work was to develop an improved performance model for modules and systems, valid for all operating conditions, for use in module specifications, system and BOS component design, and system rating or monitoring. The approach taken was to identify and quantify the influence of the dominant factors of solar irradiance, cell temperature, angle of incidence, and solar spectrum; to use outdoor test procedures to separate the effects of electrical, thermal, and optical performance; to use fundamental cell characteristics to improve the analysis; and to combine these factors in a simple model using common variables.
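A minimal sketch of the kind of "simple model combining dominant factors" described: power scales with irradiance and derates linearly with cell temperature, and optical (angle-of-incidence) and spectral factors would multiply in similarly. The coefficients below are typical illustrative values, not taken from the report:

```python
def pv_power(irradiance, cell_temp_c,
             p_ref=100.0,     # W at reference conditions (assumed rating)
             g_ref=1000.0,    # W/m^2 reference irradiance
             gamma=-0.004,    # per-degC power temperature coefficient
             t_ref=25.0):     # degC reference cell temperature
    """Simple PV module performance model: linear in irradiance with a
    linear cell-temperature derating."""
    return p_ref * (irradiance / g_ref) * (1.0 + gamma * (cell_temp_c - t_ref))

print(pv_power(1000.0, 25.0))  # 100.0 W at reference conditions
print(pv_power(800.0, 50.0))   # 80 W scaled, derated 10% for +25 C -> 72.0
```

Outdoor test data would be used to fit the coefficients for a specific module, which is precisely the separation of electrical, thermal, and optical effects the abstract describes.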
NASA Technical Reports Server (NTRS)
Ellis, J. R.; Sandlass, G. S.; Bayyari, M.
2001-01-01
A design study was undertaken to investigate the feasibility of using simple specimen designs and reusable fixturing for in-plane biaxial tests planned for advanced aeropropulsion materials. Materials of interest in this work include advanced metallics, polymeric matrix composites, metal and intermetallic matrix composites, and ceramic matrix composites. Early experience with advanced metallics showed that the cruciform specimen design typically used in this type of testing was impractical for these materials, primarily because of concerns regarding complexity and cost. The objective of this research was to develop specimen designs, fixturing, and procedures that would allow in-plane biaxial tests to be conducted on a wide range of aeropropulsion materials while keeping costs within acceptable limits. With this goal in mind, a conceptual design was developed, centered on a specimen incorporating a relatively simple arrangement of slots and fingers for attachment and loading purposes. The ANSYS finite element code was used to demonstrate the feasibility of the approach and also to develop a number of optimized specimen designs. The same computer code was used to develop the reusable fixturing needed to position and grip the specimens in the load frame. The design adopted uses an assembly of slotted fingers that can be reconfigured as necessary to obtain optimum biaxial stress states in the specimen gage area. Most recently, prototype fixturing was manufactured and is being evaluated over a range of uniaxial and biaxial loading conditions.
The mere exposure effect in the domain of haptics.
Jakesch, Martina; Carbon, Claus-Christian
2012-01-01
Zajonc showed that the attitude towards stimuli that one has previously been exposed to is more positive than towards novel stimuli. This mere exposure effect (MEE) has been tested extensively using various visual stimuli; research on the MEE is sparse, however, for other sensory modalities. We used objects of two material categories (stone and wood) and two complexity levels (simple and complex) to test the influence of exposure frequency (F0 = novel stimuli, F2 = stimuli exposed twice, F10 = stimuli exposed ten times) under two sensory modalities (haptics only, and haptics and vision). Effects of exposure frequency were found for highly complex stimuli, with liking increasing significantly from F0 to F2 and F10, but only for the stone category. Analysis of "Need for Touch" data showed the MEE in participants with a high need for touch, which suggests different sensitivity or saturation levels of the MEE; these differences might also reflect the effects of expertise on the haptic evaluation of objects. It seems that haptic and cross-modal MEEs are influenced by factors similar to those in the visual domain, indicating a common cognitive basis.
Digital image classification with the help of artificial neural network by simple histogram
Dey, Pranab; Banerjee, Nirmalya; Kaur, Rajwant
2016-01-01
Background: Visual image classification is a great challenge for the cytopathologist in routine day-to-day work. An artificial neural network (ANN) may be helpful in this matter. Aims and Objectives: In this study, we attempted to classify digital images of malignant and benign cells in effusion cytology smears with the help of simple histogram data and an ANN. Materials and Methods: A total of 404 digital images, consisting of 168 benign cells and 236 malignant cells, were selected for this study. Simple histogram data were extracted from these digital images and an ANN was constructed with the help of Neurointelligence software [Alyuda Neurointelligence 2.2 (577), Cupertino, California, USA]. The network architecture was 6-3-1. The images were divided into a training set (281), a validation set (63), and a test set (60). The online backpropagation training algorithm was used for this study. Result: A total of 10,000 iterations were run to train the ANN system, at a speed of 609.81 iterations per second. After adequate training of this ANN model, the system was able to identify all 34 malignant cell images and 24 out of 26 benign cells. Conclusion: The ANN model can be used for the identification of individual malignant cells with the help of simple histogram data. This study will be helpful in the future for identifying malignant cells in unknown situations. PMID:27279679
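The described 6-3-1 network can be sketched as a tiny numpy MLP trained by backpropagation on six histogram features. The data below is synthetic, and the training options of the Neurointelligence software are not reproduced; this only illustrates the architecture and learning rule:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic stand-in for 6-bin histogram features of two cell classes.
X0 = rng.normal(0.3, 0.1, size=(60, 6))   # "benign"
X1 = rng.normal(0.7, 0.1, size=(60, 6))   # "malignant"
X = np.vstack([X0, X1])
y = np.array([0] * 60 + [1] * 60, dtype=float)

# 6-3-1 architecture, as in the paper.
W1 = rng.normal(0, 0.5, size=(6, 3)); b1 = np.zeros(3)
W2 = rng.normal(0, 0.5, size=(3, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(2000):                     # batch gradient descent
    h = sigmoid(X @ W1 + b1)              # hidden layer (3 units)
    p = sigmoid(h @ W2 + b2).ravel()      # output unit
    err = (p - y)[:, None] / len(y)       # d(cross-entropy)/d(logit)
    dh = err @ W2.T * h * (1 - h)         # backpropagated hidden error
    W2 -= lr * (h.T @ err); b2 -= lr * err.sum(0)
    W1 -= lr * (X.T @ dh);  b1 -= lr * dh.sum(0)

acc = float(((p > 0.5) == (y > 0.5)).mean())
print(acc > 0.9)  # True -- the synthetic classes are well separated
```

The paper's online (per-sample) backpropagation differs only in updating the weights after each image rather than after the whole batch.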
Jung, Eun-hong; Jang, Seok-heun; Lee, Jae-won
2011-01-01
Purpose The aim of this study was to categorize concealed penis and buried penis by preoperative physical examination including the manual prepubic compression test and to describe a simple surgical technique to correct buried penis that was based on surgical experience and comprehension of the anatomical components. Materials and Methods From March 2007 to November 2010, 17 patients were diagnosed with buried penis after differentiation of this condition from concealed penis. The described surgical technique consisted of a minimal incision and simple fixation of the penile shaft skin and superficial fascia to the prepubic deep fascia, without degloving the penile skin. Results The mean age of the patients was 10.2 years, ranging from 8 years to 15 years. The median follow-up was 19 months (range, 5 to 49 months). The mean penile lengths were 1.8 cm (range, 1.1 to 2.5 cm) preoperatively and 4.5 cm (range, 3.3 to 5.8 cm) postoperatively. The median difference between preoperative and postoperative penile lengths was 2.7 cm (range, 2.1 to 3.9 cm). There were no serious intra- or postoperative complications. Conclusions With the simple anchoring of the penopubic skin to the prepubic deep fascia, we obtained successful subjective and objective outcomes without complications. We suggest that this is a promising surgical method for selected patients with buried penis. PMID:22195270
Ameye, Lieveke; Fischerova, Daniela; Epstein, Elisabeth; Melis, Gian Benedetto; Guerriero, Stefano; Van Holsbeke, Caroline; Savelli, Luca; Fruscio, Robert; Lissoni, Andrea Alberto; Testa, Antonia Carla; Veldman, Joan; Vergote, Ignace; Van Huffel, Sabine; Bourne, Tom; Valentin, Lil
2010-01-01
Objectives To prospectively assess the diagnostic performance of simple ultrasound rules to predict benignity/malignancy in an adnexal mass and to test the performance of the risk of malignancy index, two logistic regression models, and subjective assessment of ultrasonic findings by an experienced ultrasound examiner in adnexal masses for which the simple rules yield an inconclusive result. Design Prospective temporal and external validation of simple ultrasound rules to distinguish benign from malignant adnexal masses. The rules comprised five ultrasonic features (including shape, size, solidity, and results of colour Doppler examination) to predict a malignant tumour (M features) and five to predict a benign tumour (B features). If one or more M features were present in the absence of a B feature, the mass was classified as malignant. If one or more B features were present in the absence of an M feature, it was classified as benign. If both M features and B features were present, or if none of the features was present, the simple rules were inconclusive. Setting 19 ultrasound centres in eight countries. Participants 1938 women with an adnexal mass examined with ultrasound by the principal investigator at each centre with a standardised research protocol. Reference standard Histological classification of the excised adnexal mass as benign or malignant. Main outcome measures Diagnostic sensitivity and specificity. Results Of the 1938 patients with an adnexal mass, 1396 (72%) had benign tumours, 373 (19.2%) had primary invasive tumours, 111 (5.7%) had borderline malignant tumours, and 58 (3%) had metastatic tumours in the ovary. The simple rules yielded a conclusive result in 1501 (77%) masses, for which they resulted in a sensitivity of 92% (95% confidence interval 89% to 94%) and a specificity of 96% (94% to 97%). The corresponding sensitivity and specificity of subjective assessment were 91% (88% to 94%) and 96% (94% to 97%). 
In the 357 masses for which the simple rules yielded an inconclusive result and with available results of CA-125 measurements, the sensitivities were 89% (83% to 93%) for subjective assessment, 50% (42% to 58%) for the risk of malignancy index, 89% (83% to 93%) for logistic regression model 1, and 82% (75% to 87%) for logistic regression model 2; the corresponding specificities were 78% (72% to 83%), 84% (78% to 88%), 44% (38% to 51%), and 48% (42% to 55%). Use of the simple rules as a triage test and subjective assessment for those masses for which the simple rules yielded an inconclusive result gave a sensitivity of 91% (88% to 93%) and a specificity of 93% (91% to 94%), compared with a sensitivity of 90% (88% to 93%) and a specificity of 93% (91% to 94%) when subjective assessment was used in all masses. Conclusions The use of the simple rules has the potential to improve the management of women with adnexal masses. In adnexal masses for which the rules yielded an inconclusive result, subjective assessment of ultrasonic findings by an experienced ultrasound examiner was the most accurate diagnostic test; the risk of malignancy index and the two regression models were not useful. PMID:21156740
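The simple-rules logic is an explicit decision rule and can be written down directly. Only the M/B combination logic from the abstract is implemented here; the definitions of the individual ultrasonic features are abbreviated to counts:

```python
def simple_rules(m_features_present, b_features_present):
    """Simple-rules triage for an adnexal mass, per the abstract:
    one or more M features and no B feature  -> 'malignant'
    one or more B features and no M feature  -> 'benign'
    both present, or neither present         -> 'inconclusive'
    Arguments are counts (or truthy collections) of features present."""
    m, b = bool(m_features_present), bool(b_features_present)
    if m and not b:
        return "malignant"
    if b and not m:
        return "benign"
    return "inconclusive"

print(simple_rules(2, 0))  # malignant
print(simple_rules(0, 1))  # benign
print(simple_rules(1, 1))  # inconclusive
print(simple_rules(0, 0))  # inconclusive
```

The study's triage strategy routes exactly the "inconclusive" output of this rule to subjective assessment by an experienced examiner.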
Kashiwagi, Mitsuru; Suzuki, Shuhei
2009-09-01
Many children with developmental disorders are known to have motor impairments such as clumsiness and poor physical ability; however, the objective evaluation of such difficulties is not easy in routine clinical practice. In this study, we aimed to establish a simple method for evaluating motor difficulty in childhood. The method employs a scored interview and an examination for detecting soft neurological signs (SNSs). After a preliminary survey with 22 normal children, we set the items and the cutoffs for the interview and the SNS examination. The interview consisted of questions pertaining to 12 items related to a child's motor skills in his/her past and current life, such as skipping, jumping rope, ball sports, origami, and using chopsticks. The SNS evaluation included 5 tests, namely, standing on one leg with eyes closed, diadochokinesia, associated movements during diadochokinesia, a finger opposition test, and laterally fixed gaze. We applied this method to 43 children, including 25 cases of developmental disorders. Children showing significantly high scores in both the interview and the SNS examination were assigned to the "with motor difficulty" group, while those with low scores in both tests were assigned to the "without motor difficulty" group. The remaining children were assigned to the "with suspicious motor difficulty" group. More than 90% of the children in the "with motor difficulty" group had high impairment scores on the Movement Assessment Battery for Children (M-ABC), a standardized motor test, whereas 82% of the children in the "without motor difficulty" group showed no motor impairment. Thus, we conclude that our simple method and criteria would be useful for the evaluation of motor difficulty in childhood. Further, we discuss the diagnostic process for developmental coordination disorder using our evaluation method.
Gentry, Christina M; Messinger, Linda
2016-10-01
Intradermal testing (IDT) in cats has potential limitations; this has led to an interest in novel testing methods. A pilot study demonstrated that healthy cats produced reliable percutaneous glycerinated (PG) histamine wheals, whereas percutaneously applied glycerosaline did not lead to wheal formation. The purpose of this study was to determine if percutaneously applied aqueous and glycerinated allergens would lead to irritant reactions in healthy cats. Percutaneous testing (PCT) with both glycerinated and aqueous allergens and IDT were compared in twelve healthy cats. The lateral thorax was clipped and histamine, saline and nine allergens were tested in rows. Objective and subjective evaluations were performed at 15, 20 and 25 min, and 4 h. Results were evaluated as positive or negative at 15, 20, 25 min and 4 h. Skin test reactions for intradermal (ID) histamine wheals were larger when compared to PG and percutaneous aqueous (PA) at the immediate reading points (P < 0.05) subjectively and objectively; however, PG was not significantly different from ID when compared as either positive (2-4) or negative (0-1). PG histamine and allergen reactions, when present, were larger than equivalent PA reactions. PG and PA allergens did not cause irritant reactions at tested concentrations. Bassia scoparia (kochia), when tested at 1000 PNU/mL with IDT, was suspected to be an irritant. Percutaneously (PCT) applied allergens did not cause irritant reactions in healthy cats. PG histamine wheals, although smaller than ID histamine wheals, were easily recognizable and PCT was simple to perform. © 2016 ESVD and ACVD.
Forecasting in foodservice: model development, testing, and evaluation.
Miller, J L; Thompson, P A; Orabella, M M
1991-05-01
This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. The objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with the methods currently in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than the current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spreadsheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
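The core of the forecasting scheme, simple exponential smoothing of customer counts plus a MAPE accuracy check, can be sketched as follows. Deseasonalization and the preference statistic are omitted, and the smoothing constant alpha is an assumed value:

```python
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing: the level is a weighted blend of the
    latest observation and the previous level. Returns the one-step-ahead
    forecast after the whole series has been observed."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1.0 - alpha) * level
    return level

def mape(actual, forecast):
    """Mean absolute percentage error, one of the paper's accuracy metrics."""
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

counts = [400, 420, 410, 430, 425, 415]      # hypothetical daily counts
print(round(ses_forecast(counts), 1))        # 416.6
print(round(mape([100, 200], [90, 210]), 1)) # 7.5
```

Menu-item demand would then be the count forecast multiplied by each item's predicted preference fraction, e.g. `ses_forecast(counts) * 0.25` for an item historically chosen by a quarter of customers.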
Spin Entanglement Witness for Quantum Gravity.
Bose, Sougato; Mazumdar, Anupam; Morley, Gavin W; Ulbricht, Hendrik; Toroš, Marko; Paternostro, Mauro; Geraci, Andrew A; Barker, Peter F; Kim, M S; Milburn, Gerard
2017-12-15
Understanding gravity in the framework of quantum mechanics is one of the great challenges in modern physics. However, the lack of empirical evidence has led to a debate on whether gravity is a quantum entity. Despite varied proposed probes for quantum gravity, it is fair to say that there are no feasible ideas yet to test its quantum coherent behavior directly in a laboratory experiment. Here, we introduce an idea for such a test based on the principle that two objects cannot be entangled without a quantum mediator. We show that despite the weakness of gravity, the phase evolution induced by the gravitational interaction of two micron-size test masses in adjacent matter-wave interferometers can detectably entangle them even when they are placed far enough apart to keep Casimir-Polder forces at bay. We provide a prescription for witnessing this entanglement, which certifies gravity as a quantum coherent mediator, through simple spin correlation measurements.
Permeability After Impact Testing of Composite Laminates
NASA Technical Reports Server (NTRS)
Nettles, Alan T.
2003-01-01
Since composite laminates are beginning to be identified for use in reusable launch vehicle propulsion systems, an understanding of their permeance is needed. A foreign object impact event can cause a localized area of permeability (leakage) in a polymer matrix composite and it is the aim of this study to assess a method of quantifying permeability-after-impact results. A simple test apparatus is presented and variables that could affect the measured values of permeability-after-impact were assessed. Once it was determined that valid numbers were being measured, a fiber/resin system was impacted at various impact levels and the resulting permeability measured, first with a leak check solution (qualitative) then using the new apparatus (quantitative). The results showed that as the impact level increased, so did the measured leakage. As the pressure to the specimen was increased, the leak rate was seen to increase in a non-linear fashion for almost all of the specimens tested.
Permeability After Impact Testing of Composite Laminates
NASA Technical Reports Server (NTRS)
Nettles, A.T.; Munafo, Paul (Technical Monitor)
2002-01-01
Since composite laminates are beginning to be identified for use in reusable launch vehicle propulsion systems, an understanding of their permeance is needed. A foreign object impact event can cause a localized area of permeability (leakage) in a polymer matrix composite and it is the aim of this study to assess a method of quantifying permeability-after-impact results. A simple test apparatus is presented and variables that could affect the measured values of permeability-after-impact were assessed. Once it was determined that valid numbers were being measured, a fiber/resin system was impacted at various impact levels and the resulting permeability measured, first with a leak check solution (qualitative) then using the new apparatus (quantitative). The results showed that as the impact level increased, so did the measured leakage. As the pressure to the specimen was increased, the leak rate was seen to increase in a non-linear fashion for almost all of the specimens tested.
[Transcutaneous electrical nervous stimulation in the prognosis of Bell's palsy].
Sabag-Ruiz, Enrique; Osuna-Bernal, Janeth; Brito-Zurita, Olga Rosa; Gómez-Alcalá, Alejandro Vidal; Ornelas-Aguirre, José Manuel
2009-01-01
Peripheral facial palsy (PFP) is the commonest acute cranial neuropathy. PFP has a striking clinical presentation that contrasts with its favorable course. Our objective was to determine the sensitivity and specificity of the nervous excitability test (NET) with transcutaneous electrical nerve stimulation (TENS) and the time required to obtain facial symmetry. An analytical cross-sectional study was made in 22 patients with PFP. The endpoint was the time (in days) to obtain facial symmetry. Sensitivity and specificity analyses were carried out; both the sensitivity and specificity of the NET were 100%. The correlation between the two variables, corrected for sex and age, was 0.89. The average number of days to recovery was smaller in those with a positive NET (p < 0.05). The nervous excitability test for PFP with TENS is safe and simple to use in primary care and emergency services.
Spin Entanglement Witness for Quantum Gravity
NASA Astrophysics Data System (ADS)
Bose, Sougato; Mazumdar, Anupam; Morley, Gavin W.; Ulbricht, Hendrik; Toroš, Marko; Paternostro, Mauro; Geraci, Andrew A.; Barker, Peter F.; Kim, M. S.; Milburn, Gerard
2017-12-01
Understanding gravity in the framework of quantum mechanics is one of the great challenges in modern physics. However, the lack of empirical evidence has led to a debate on whether gravity is a quantum entity. Despite varied proposed probes for quantum gravity, it is fair to say that there are no feasible ideas yet to test its quantum coherent behavior directly in a laboratory experiment. Here, we introduce an idea for such a test based on the principle that two objects cannot be entangled without a quantum mediator. We show that despite the weakness of gravity, the phase evolution induced by the gravitational interaction of two micron-size test masses in adjacent matter-wave interferometers can detectably entangle them even when they are placed far enough apart to keep Casimir-Polder forces at bay. We provide a prescription for witnessing this entanglement, which certifies gravity as a quantum coherent mediator, through simple spin correlation measurements.
Acoustic Tactile Representation of Visual Information
NASA Astrophysics Data System (ADS)
Silva, Pubudu Madhawa
Our goal is to explore the use of hearing and touch to convey graphical and pictorial information to visually impaired people. Our focus is on dynamic, interactive display of visual information using existing, widely available devices, such as smart phones and tablets with touch sensitive screens. We propose a new approach for acoustic-tactile representation of visual signals that can be implemented on a touch screen and allows the user to actively explore a two-dimensional layout consisting of one or more objects with a finger or a stylus while listening to auditory feedback via stereo headphones. The proposed approach is acoustic-tactile because sound is used as the primary source of information for object localization and identification, while touch is used for pointing and kinesthetic feedback. A static overlay of raised-dot tactile patterns can also be added. A key distinguishing feature of the proposed approach is the use of spatial sound (directional and distance cues) to facilitate the active exploration of the layout. We consider a variety of configurations for acoustic-tactile rendering of object size, shape, identity, and location, as well as for the overall perception of simple layouts and scenes. While our primary goal is to explore the fundamental capabilities and limitations of representing visual information in acoustic-tactile form, we also consider a number of relatively simple configurations that can be tied to specific applications. In particular, we consider a simple scene layout consisting of objects in a linear arrangement, each with a distinct tapping sound, which we compare to a ''virtual cane.'' We will also present a configuration that can convey a ''Venn diagram.'' We present systematic subjective experiments to evaluate the effectiveness of the proposed display for shape perception, object identification and localization, and 2-D layout perception, as well as the applications. 
Our experiments were conducted with visually blocked subjects. The results are evaluated in terms of accuracy and speed, and they demonstrate the advantages of spatial sound for guiding the scanning finger or pointer in shape perception, object localization, and layout exploration. We show that these advantages increase with the amount of detail (smaller object size) in the display. Our experimental results show that the proposed system outperforms the state of the art in shape perception, including variable friction displays. We also demonstrate that, even though they are currently available only as static overlays, raised-dot patterns provide the best shape rendition in terms of both accuracy and speed. Our experiments with layout rendering and perception demonstrate that simultaneous representation of objects, using the most effective approaches for directionality and distance rendering, approaches the optimal performance level provided by visual layout perception. Finally, experiments with the virtual cane and Venn diagram configurations demonstrate that the proposed techniques can be used effectively in simple but nontrivial real-world applications. One of the most important conclusions of our experiments is that there is a clear performance gap between experienced and inexperienced subjects, which indicates that there is a lot of room for improvement with appropriate and extensive training. By exploring a wide variety of design alternatives and focusing on different aspects of the acoustic-tactile interfaces, our results offer many valuable insights and great promise for the design of future systematic tests with visually impaired and visually blocked subjects, utilizing the most effective configurations.
Sharp, Madeleine E.; Viswanathan, Jayalakshmi; Lanyon, Linda J.; Barton, Jason J. S.
2012-01-01
Background There are few clinical tools that assess decision-making under risk. Tests that characterize sensitivity and bias in decisions between prospects varying in magnitude and probability of gain may provide insights in conditions with anomalous reward-related behaviour. Objective We designed a simple test of how subjects integrate information about the magnitude and the probability of reward, which can determine discriminative thresholds and choice bias in decisions under risk. Design/Methods Twenty subjects were required to choose between two explicitly described prospects, one with higher probability but lower magnitude of reward than the other, with the difference in expected value between the two prospects varying from 3 to 23%. Results Subjects showed a mean threshold sensitivity of 43% difference in expected value. Regarding choice bias, there was a ‘risk premium’ of 38%, indicating a tendency to choose higher probability over higher reward. An analysis using prospect theory showed that this risk premium is the predicted outcome of hypothesized non-linearities in the subjective perception of reward value and probability. Conclusions This simple test provides a robust measure of discriminative value thresholds and biases in decisions under risk. Prospect theory can also make predictions about decisions when subjective perception of reward or probability is anomalous, as may occur in populations with dopaminergic or striatal dysfunction, such as Parkinson's disease and schizophrenia. PMID:22493669
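The core of the task above, comparing explicitly described prospects by expected value, can be illustrated with a hypothetical trial. The probabilities and magnitudes below are invented for illustration, not the study's stimuli:

```python
# One hypothetical trial of the decision task: a safer prospect
# (higher probability, lower reward) versus a riskier one, with a
# controlled difference in expected value between them.

def expected_value(probability, magnitude):
    return probability * magnitude

safe = (0.80, 50.0)    # 80% chance of winning 50
risky = (0.40, 120.0)  # 40% chance of winning 120

ev_safe = expected_value(*safe)
ev_risky = expected_value(*risky)

# Percentage advantage of the risky prospect in expected value. A
# subject who still picks the safe prospect at this difference is
# paying a "risk premium" of at least this size.
ev_difference = (ev_risky - ev_safe) / ev_safe * 100
```

Sweeping `ev_difference` across trials (the study used 3 to 23%) and fitting the choice proportions yields the discriminative threshold and bias the abstract reports.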
High Lift Common Research Model for Wind Tunnel Testing: An Active Flow Control Perspective
NASA Technical Reports Server (NTRS)
Lin, John C.; Melton, Latunia P.; Viken, Sally A.; Andino, Marlyn Y.; Koklu, Mehti; Hannon, Judith A.; Vatsa, Veer N.
2017-01-01
This paper provides an overview of a research and development effort sponsored by the NASA Advanced Air Transport Technology Project to achieve the required high-lift performance using active flow control (AFC) on simple hinged flaps while reducing the cruise drag associated with the external mechanisms on slotted flaps of a generic modern transport aircraft. The removal of the external fairings for the Fowler flap mechanism could help to reduce drag by 3.3 counts. The main challenge is to develop an AFC system that can provide the necessary lift recovery on a simple hinged flap high-lift system while using the limited pneumatic power available on the aircraft. Innovative low-power AFC concepts will be investigated in the flap shoulder region. The AFC concepts being explored include steady blowing and unsteady blowing operating in the spatial and/or temporal domain. Both conventional and AFC-enabled high-lift configurations were designed for the current effort. The high-lift configurations share the cruise geometry that is based on the NASA Common Research Model, and therefore, are also open geometries. A 10%-scale High Lift Common Research Model (HL-CRM) is being designed for testing at the NASA Langley Research Center 14- by 22-Foot Subsonic Tunnel during fiscal year 2018. The overall project plan, status, HL-CRM configurations, and AFC objectives for the wind tunnel test are described.
Wang, Bing; Huang, Ping; Ou, Caiwen; Li, Kaikai; Yan, Biao; Lu, Wei
2013-01-01
Magnesium and its alloys—a new class of degradable metallic biomaterials—are being increasingly investigated as a promising alternative for medical implant and device applications due to their advantageous mechanical and biological properties. However, the high corrosion rate in physiological environments prevents the clinical application of Mg-based materials. Therefore, the objective of this study was to develop a hydroxyapatite (HA) coating on ZK60 magnesium alloy substrates to mediate the rapid degradation of Mg while improving its cytocompatibility for orthopedic applications. A simple chemical conversion process was applied to prepare HA coating on ZK60 magnesium alloy. Surface morphology, elemental compositions, and crystal structures were characterized using scanning electron microscopy, energy dispersive spectroscopy, and X-ray diffraction, respectively. The corrosion properties of samples were investigated by immersion test and electrochemical test. Murine fibroblast L-929 cells were harvested and cultured with coated and non-coated ZK60 samples to determine cytocompatibility. The degradation results suggested that the HA coatings decreased the degradation of ZK60 alloy. No significant deterioration in compression strength was observed for all the uncoated and coated samples after 2 and 4 weeks’ immersion in simulated body fluid (SBF). Cytotoxicity test indicated that the coatings, especially HA coating, improved cytocompatibility of ZK60 alloy for L929 cells. PMID:24300096
The role of transparency in da Vinci stereopsis.
Zannoli, Marina; Mamassian, Pascal
2011-10-15
The majority of natural scenes contains zones that are visible to one eye only. Past studies have shown that these monocular regions can be seen at a precise depth even though there are no binocular disparities that uniquely constrain their locations in depth. In the so-called da Vinci stereopsis configuration, the monocular region is a vertical line placed next to a binocular rectangular occluder. The opacity of the occluder has been mentioned to be a necessary condition to obtain da Vinci stereopsis. However, this opacity constraint has never been empirically tested. In the present study, we tested whether da Vinci stereopsis and perceptual transparency can interact using a classical da Vinci configuration in which the opacity of the occluder varied. We used two different monocular objects: a line and a disk. We found no effect of the opacity of the occluder on the perceived depth of the monocular object. A careful analysis of the distribution of perceived depth revealed that the monocular object was perceived at a depth that increased with the distance between the object and the occluder. The analysis of the skewness of the distributions was not consistent with a double fusion explanation, favoring an implication of occlusion geometry in da Vinci stereopsis. A simple model that includes the geometry of the scene could account for the results. In summary, the mechanism responsible to locate monocular regions in depth is not sensitive to the material properties of objects, suggesting that da Vinci stereopsis is solved at relatively early stages of disparity processing. Copyright © 2011 Elsevier Ltd. All rights reserved.
Do domestic dogs learn words based on humans' referential behaviour?
Tempelmann, Sebastian; Kaminski, Juliane; Tomasello, Michael
2014-01-01
Some domestic dogs learn to comprehend human words, although the nature and basis of this learning is unknown. In the studies presented here we investigated whether dogs learn words through an understanding of referential actions by humans rather than simple association. In three studies, each modelled on a study conducted with human infants, we confronted four word-experienced dogs with situations involving no spatial-temporal contiguity between the word and the referent; the only available cues were referential actions displaced in time from exposure to their referents. We found that no dogs were able to reliably link an object with a label based on social-pragmatic cues alone in all the tests. However, one dog did show skills in some tests, possibly indicating an ability to learn based on social-pragmatic cues.
Simulated parallel annealing within a neighborhood for optimization of biomechanical systems.
Higginson, J S; Neptune, R R; Anderson, F C
2005-09-01
Optimization problems for biomechanical systems have become extremely complex. Simulated annealing (SA) algorithms have performed well in a variety of test problems and biomechanical applications; however, despite advances in computer speed, convergence to optimal solutions for systems of even moderate complexity has remained prohibitive. The objective of this study was to develop a portable parallel version of a SA algorithm for solving optimization problems in biomechanics. The algorithm for simulated parallel annealing within a neighborhood (SPAN) was designed to minimize interprocessor communication time and closely retain the heuristics of the serial SA algorithm. The computational speed of the SPAN algorithm scaled linearly with the number of processors on different computer platforms for a simple quadratic test problem and for a more complex forward dynamic simulation of human pedaling.
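For context, a minimal serial simulated-annealing loop on a simple quadratic test problem (like the paper's benchmark) might look like the sketch below. SPAN's contribution is to parallelize candidate evaluation within a neighborhood while closely retaining these serial heuristics; the parameter values here are illustrative assumptions:

```python
# Serial simulated annealing on a 1-D quadratic, for illustration only.
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=2000, seed=0):
    rng = random.Random(seed)
    x, fx, t = x0, cost(x0), t0
    for _ in range(iters):
        # Propose a candidate within a neighborhood of the current point.
        candidate = x + rng.uniform(-step, step)
        fc = cost(candidate)
        # Accept downhill moves always; uphill moves with Boltzmann
        # probability that shrinks as the temperature cools.
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = candidate, fc
        t *= cooling  # geometric cooling schedule
    return x, fx

# Simple quadratic test problem with minimum at x = 3.
x_best, f_best = simulated_annealing(lambda x: (x - 3.0) ** 2, x0=0.0)
```

In the parallel version, the candidate proposals and cost evaluations inside the loop are the part distributed across processors, which is why the speedup scales with processor count when cost evaluation (e.g., a forward dynamic simulation) dominates.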
NASA Technical Reports Server (NTRS)
Dress, David A.
1989-01-01
Low speed wind tunnel drag force measurements were taken on a laminar flow body of revolution free of support interference. This body was tested at zero incidence in the NASA Langley 13 in. Magnetic Suspension and Balance System (MSBS). The primary objective of these tests was to substantiate the drag force measuring capabilities of the 13 in. MSBS. The drag force calibrations and wind-on repeatability data provide a means of assessing these capabilities. Additional investigations include: (1) the effects of fixing transition; (2) the effects of fins installed in the tail; and (3) surface flow visualization using both liquid crystals and oil flow. Also two simple drag prediction codes were used to assess their usefulness in estimating overall body drag.
Akbar, Umer; Raike, Robert S; Hack, Nawaz; Hess, Christopher W; Skinner, Jared; Martinez-Ramirez, Daniel; DeJesus, Sol; Okun, Michael S
2016-06-01
Evidence suggests that nonconventional programming may improve deep brain stimulation (DBS) therapy for movement disorders. The primary objective was to assess feasibility of testing the tolerability of several nonconventional settings in Parkinson's disease (PD) and essential tremor (ET) subjects in a single office visit. Secondary objectives were to explore for potential efficacy signals and to assess the energy demand on the implantable pulse-generators (IPGs). A custom firmware (FW) application was developed and acutely uploaded to the IPGs of eight PD and three ET subjects, allowing delivery of several nonconventional DBS settings, including narrow pulse widths, square biphasic pulses, and irregular pulse patterns. Standard clinical rating scales and several objective measures were used to compare motor outcomes with sham, clinically-optimal and nonconventional settings. Blinded and randomized testing was conducted in a traditional office setting. Overall, the nonconventional settings were well tolerated. Under these conditions it was also possible to detect clinically-relevant differences in DBS responses using clinical rating scales but not objective measures. Compared to the clinically-optimal settings, some nonconventional settings appeared to offer similar benefit (e.g., narrow pulse widths) and others lesser benefit. Moreover, the results suggest that square biphasic pulses may deliver greater benefit. No unexpected IPG efficiency disadvantages were associated with delivering nonconventional settings. It is feasible to acutely screen nonconventional DBS settings using controlled study designs in traditional office settings. Simple IPG FW upgrades may provide more DBS programming options for optimizing therapy. Potential advantages of narrow and biphasic pulses deserve follow up. © 2016 The Authors. Neuromodulation: Technology at the Neural Interface published by Wiley Periodicals, Inc. on behalf of International Neuromodulation Society.
A Survey of Complex Object Technologies for Digital Libraries
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Argue, Brad; Efron, Miles; Denn, Sheila; Pattuelli, Maria Cristina
2001-01-01
Many early web-based digital libraries (DLs) had implicit assumptions reflected in their architecture that the unit of focus in the DL (frequently "reports" or "e-prints") would only be manifested in a single, or at most a few, common file formats such as PDF or PostScript. DLs have now matured to the point where their contents are commonly no longer simple files. Complex objects in DLs have emerged in response to various requirements, including: simple aggregation of formats and supporting files, bundling additional information to aid digital preservation, creating opaque digital objects for e-commerce applications, and the incorporation of dynamic services with the traditional data files. We examine a representative (but not necessarily exhaustive) number of current and recent historical web-based complex object technologies and projects that are applicable to DLs: Aurora, Buckets, ComMentor, Cryptolopes, Digibox, Document Management Alliance, FEDORA, Kahn-Wilensky Framework Digital Objects, Metadata Encoding & Transmission Standard, Multivalent Documents, Open eBooks, VERS Encapsulated Objects, and the Warwick Framework.
ERIC Educational Resources Information Center
Anyanwu, Godson Emeka; Agu, Augustine Uchechukwu; Anyaehie, Ugochukwu Bond
2012-01-01
The impact and perception of students on the use of a simple, low technology-driven version of a virtual microscope in teaching and assessments in cellular physiology and histology were studied. Its impact on the time and resources of the faculty were also assessed. Simple virtual slides and conventional microscopes were used to conduct the same…
ERIC Educational Resources Information Center
2000
All kids know the word "work." But they probably don't understand that work happens whenever a force is used to move something--whether it's lifting a heavy object or playing on a see-saw. All About Simple Machines introduces kids to the concepts of forces, work and how machines are used to make work easier. Six simple machines are…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauer, T. H.; Robinson, W. R.; Holland, J. W.
1989-12-01
Results and analyses of margin to cladding failure and pre-failure axial expansion of metallic fuel are reported for TREAT in-pile transient overpower tests M5--M7. These are the first such tests on reference binary and ternary alloy fuel of the Integral Fast Reactor (IFR) concept with burnup ranging from 1 to 10 at. %. In all cases, test fuel was subjected to an exponential power rise on an 8 s period until either incipient or actual cladding failure was achieved. Objectives, designs and methods are described with emphasis on developments unique to metal fuel safety testing. The resulting database for cladding failure threshold and pre-failure fuel expansion is presented. The nature of the observed cladding failure and resultant fuel dispersals is described. Simple models of cladding failures and pre-failure axial expansions are described and compared with experimental results. Reported results include: temperature, flow, and pressure data from test instrumentation; fuel motion diagnostic data principally from the fast neutron hodoscope; and test remains described from both destructive and non-destructive post-test examination. 24 refs., 144 figs., 17 tabs.
A simple and low-cost structured illumination microscopy using a pico-projector
NASA Astrophysics Data System (ADS)
Özgürün, Baturay
2018-02-01
Here, the development of a low-cost structured illumination microscopy (SIM) system based on a pico-projector is presented. The pico-projector consists of independent red, green and blue LEDs, removing the need for an external illumination source. Moreover, the display element of the pico-projector serves as a pattern-generating spatial light modulator. A simple lens group is employed to couple light from the projector to an epi-illumination port of a commercial microscope system. 2D sub-SIM images are acquired and synthesized to surpass the diffraction limit using a 40x (0.75 NA) objective. The resolution of the reconstructed SIM images is verified with a test object and a fixed cell sample.
PCANet: A Simple Deep Learning Baseline for Image Classification?
Chan, Tsung-Han; Jia, Kui; Gao, Shenghua; Lu, Jiwen; Zeng, Zinan; Ma, Yi
2015-12-01
In this paper, we propose a very simple deep learning network for image classification that is based on very basic data processing components: 1) cascaded principal component analysis (PCA); 2) binary hashing; and 3) blockwise histograms. In the proposed architecture, the PCA is employed to learn multistage filter banks. This is followed by simple binary hashing and block histograms for indexing and pooling. This architecture is thus called the PCA network (PCANet) and can be extremely easily and efficiently designed and learned. For comparison and to provide a better understanding, we also introduce and study two simple variations of PCANet: 1) RandNet and 2) LDANet. They share the same topology as PCANet, but their cascaded filters are either randomly selected or learned from linear discriminant analysis. We have extensively tested these basic networks on many benchmark visual data sets for different tasks, including Labeled Faces in the Wild (LFW) for face verification; the MultiPIE, Extended Yale B, AR, Facial Recognition Technology (FERET) data sets for face recognition; and MNIST for hand-written digit recognition. Surprisingly, for all tasks, such a seemingly naive PCANet model is on par with the state-of-the-art features either prefixed, highly hand-crafted, or carefully learned [by deep neural networks (DNNs)]. Even more surprisingly, the model sets new records for many classification tasks on the Extended Yale B, AR, and FERET data sets and on MNIST variations. Additional experiments on other public data sets also demonstrate the potential of PCANet to serve as a simple but highly competitive baseline for texture classification and object recognition.
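A schematic single-stage version of the PCANet pipeline (PCA filter learning on image patches, convolution, binary hashing, and histogramming) can be sketched as follows. This is a simplified illustration, not the authors' implementation, which cascades two PCA stages and pools blockwise histograms:

```python
# Simplified one-stage PCANet-style feature extractor.
import numpy as np

def pca_filters(images, k=7, n_filters=4):
    """Learn a k x k convolution filter bank as the top principal
    components of mean-removed image patches."""
    patches = []
    for img in images:
        h, w = img.shape
        for i in range(h - k + 1):
            for j in range(w - k + 1):
                p = img[i:i + k, j:j + k].ravel()
                patches.append(p - p.mean())
    X = np.array(patches)
    # Eigenvectors of the patch scatter matrix = PCA filter bank
    # (np.linalg.eigh returns eigenvalues in ascending order).
    _, vecs = np.linalg.eigh(X.T @ X)
    return vecs[:, -n_filters:].T.reshape(n_filters, k, k)

def pcanet_features(img, filters):
    """Convolve with each PCA filter, binarize, pack bits into a
    per-pixel hash code, and histogram the codes."""
    k = filters.shape[1]
    h, w = img.shape
    maps = []
    for f in filters:
        out = np.zeros((h - k + 1, w - k + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(img[i:i + k, j:j + k] * f)
        maps.append(out)
    code = np.zeros_like(maps[0], dtype=int)
    for bit, m in enumerate(maps):
        code += (m > 0).astype(int) << bit  # binary hashing
    n_codes = 2 ** len(maps)
    hist, _ = np.histogram(code, bins=n_codes, range=(0, n_codes))
    return hist

# Toy run on random 16 x 16 "images" just to exercise the pipeline.
rng = np.random.default_rng(0)
imgs = [rng.standard_normal((16, 16)) for _ in range(5)]
filters = pca_filters(imgs)
feat = pcanet_features(imgs[0], filters)
```

The resulting histogram of hash codes is the (local) feature vector; the full PCANet feeds the stage-one maps into a second PCA stage before hashing.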
Fault detection and accommodation testing on an F100 engine in an F-15 airplane
NASA Technical Reports Server (NTRS)
Myers, L. P.; Baer-Riedhart, J. L.; Maxwell, M. D.
1985-01-01
The fault detection and accommodation (FDA) methodology for digital engine-control systems may range from simple comparisons of redundant parameters to the more complex and sophisticated observer models of the entire engine system. Evaluations of the various FDA schemes are done using analytical methods, simulation, and limited-altitude-facility testing. Flight testing of the FDA logic has been minimal because of the difficulty of inducing realistic faults in flight. A flight program was conducted to evaluate the fault detection and accommodation capability of a digital electronic engine control in an F-15 aircraft. The objective of the flight program was to induce selected faults and evaluate the resulting actions of the digital engine controller. Comparisons were made between the flight results and predictions. Several anomalies were found in flight and during the ground test. Simulation results showed that the inducement of dual pressure failures was not feasible since the FDA logic was not designed to accommodate these types of failures.
NASA Technical Reports Server (NTRS)
Anderson, David J.; Lambert, Heather H.; Mizukami, Masashi
1992-01-01
Experimental results from a wind tunnel test conducted to investigate propulsion/airframe integration (PAI) effects are presented. The objectives of the test were to examine rough order-of-magnitude changes in the acoustic characteristics of a mixer/ejector nozzle due to the presence of a wing and to obtain limited wing and nozzle flow-field measurements. A simple representative supersonic transport wing planform, with deflecting flaps, was installed above a two-dimensional mixer/ejector nozzle that was supplied with high-pressure heated air. Various configurations and wing positions with respect to the nozzle were studied. Because of hardware problems, no acoustics and only a limited set of flow-field data were obtained. For most hardware configurations tested, no significant propulsion/airframe integration effects were identified. Significant effects were seen for extreme flap deflections. The combination of the exploratory nature of the test and the limited flow-field instrumentation made it impossible to identify definitive propulsion/airframe integration effects.
Testing the Paradigm that Ultra-Luminous X-Ray Sources as a Class Represent Accreting Intermediate-Mass Black Holes
NASA Technical Reports Server (NTRS)
Berghea, C. T.; Weaver, K. A.; Colbert, E. J. M.; Roberts, T. P.
2008-01-01
To test the idea that ultraluminous X-ray sources (ULXs) in external galaxies represent a class of accreting Intermediate-Mass Black Holes (IMBHs), we have undertaken a program to identify ULXs and a lower luminosity X-ray comparison sample with the highest quality data in the Chandra archive. We establish as a general property of ULXs that the most X-ray-luminous objects possess the flattest X-ray spectra (in the Chandra bandpass). No prior sample studies have established the general hardening of ULX spectra with luminosity. This hardening occurs at the highest luminosities (absorbed luminosity > or = 5x10(exp 39) ergs/s) and is in line with recent models arguing that ULXs are actually stellar-mass black holes. From spectral modeling, we show that the evidence originally taken to mean that ULXs are IMBHs - i.e., the "simple IMBH model" - is nowhere near as compelling when a large sample of ULXs is looked at properly. During the last couple of years, XMM-Newton spectroscopy of ULXs has to a large extent begun to negate the simple IMBH model based on fewer objects. We confirm and expand these results, which validates the XMM-Newton work in a broader sense with independent X-ray data. We find (1) that cool disk components are present with roughly equal probability and total flux fraction for any given ULX, regardless of luminosity, and (2) that cool disk components extend below the standard ULX luminosity cutoff of 10(exp 39) ergs/s, down to our sample limit of 10(exp 38.3) ergs/s. The fact that cool disk components are not correlated with luminosity damages the argument that cool disks indicate IMBHs in ULXs, for which strong statistical support was never found.
Simple optical method of qualitative assessment of sperm motility: preliminary results
NASA Astrophysics Data System (ADS)
Sozanska, Agnieszka; Kolwas, Krystyna; Galas, Jacek; Blocki, Narcyz; Czyzewski, Adam
2005-09-01
The examination of the quality of the sperm ejaculate is one of the most important steps in the artificial fertilization procedure. The main aim of semen storage centres is to characterise the best semen quality for fertilization. Reliable information about sperm motility is also one of the most important parameters for in vitro laboratory procedures. Very expensive automated methods for semen analysis exist, but they are out of reach for most laboratories and semen storage centres. The motivation for this study is to elaborate a simple, cheap, objective and repeatable method for semen motility assessment. The method enables the detection of even small changes in motility introduced by medical, physical or chemical factors. To test the reliability of the method we used cryopreserved bull semen from the Lowicz Semen Storage Centre. The examined sperm specimen was warmed in a water bath and then centrifuged. The best semen was collected by the swim-up technique and diluted to a proper concentration. Several semen concentrations and dilutions were tested in order to find the best probe parameters giving repeatable results. For semen visualization we used a phase-contrast microscope with a CCD camera. A PC computer was used to acquire and analyse the data. The microscope table, equipped with a microscope glass pool 0.7 mm deep instead of a conventional plane microscope slide, was stabilised at a temperature of 37°C. The main idea of our method is based on numerical processing of the optical contrast of the sperm images, which illustrates the dynamics of the sperm cells' movement, and on appropriate analysis of the grey-scale level of the superimposed images. An elaborated numerical algorithm allows us to find the relative amount of motile sperm cells. The proposed method of sperm motility assessment appears to be objective and repeatable.
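The contrast-processing idea described above, superimposing grayscale frames and analysing the spread of grey levels produced by moving cells, might be sketched as follows. The threshold and frame data are illustrative assumptions, not the paper's parameters:

```python
# Sketch: per-pixel intensity spread across frames as a motility index.
import numpy as np

def motility_index(frames, diff_threshold=10):
    """frames: sequence of 2-D uint8 grayscale images from the CCD camera.
    Returns the fraction of pixels whose intensity varies across frames,
    i.e. pixels swept by motile sperm cells."""
    stack = np.stack([f.astype(np.int16) for f in frames])
    # Static background and immotile cells keep a flat grey level over
    # time; motile cells smear intensity across the superimposed images.
    spread = stack.max(axis=0) - stack.min(axis=0)
    return float((spread > diff_threshold).mean())

# Toy example: one bright "cell" moving one pixel between two frames.
f1 = np.zeros((8, 8), dtype=np.uint8); f1[1, 1] = 255
f2 = np.zeros((8, 8), dtype=np.uint8); f2[1, 2] = 255
index = motility_index([f1, f2])
```

Relating such an index to the relative amount of motile cells would additionally require normalizing by the total area occupied by cells, which is the role of the elaborated algorithm the abstract mentions.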
NASA Astrophysics Data System (ADS)
Berghea, C. T.; Weaver, K. A.; Colbert, E. J. M.; Roberts, T. P.
2008-11-01
To test the idea that ultraluminous X-ray sources (ULXs) in external galaxies represent a class of accreting intermediate-mass black holes (IMBHs), we have undertaken a program to identify ULXs and a lower luminosity X-ray comparison sample with the highest quality data in the Chandra archive. We establish as a general property of ULXs that the most X-ray-luminous objects possess the flattest X-ray spectra (in the Chandra bandpass). No prior sample studies have established the general hardening of ULX spectra with luminosity. This hardening occurs at the highest luminosities (absorbed luminosity >=5 × 1039 erg s-1) and is in line with recent models arguing that ULXs are actually stellar mass black holes. From spectral modeling, we show that the evidence originally taken to mean that ULXs are IMBHs—i.e., the "simple IMBH model"—is nowhere near as compelling when a large sample of ULXs is looked at properly. During the last couple of years, XMM-Newton spectroscopy of ULXs has to a large extent begun to negate the simple IMBH model based on fewer objects. We confirm and expand these results, which validates the XMM-Newton work in a broader sense with independent X-ray data. We find that (1) cool-disk components are present with roughly equal probability and total flux fraction for any given ULX, regardless of luminosity, and (2) cool-disk components extend below the standard ULX luminosity cutoff of 1039 erg s-1, down to our sample limit of 1038.3 erg s-1. The fact that cool-disk components are not correlated with luminosity damages the argument that cool disks indicate IMBHs in ULXs, for which strong statistical support was never found.
Visualization of Data Regarding Infections Using Eye Tracking Techniques.
Yoon, Sunmoo; Cohen, Bevin; Cato, Kenrick D; Liu, Jianfang; Larson, Elaine L
2016-05-01
To evaluate the ease of use and usefulness for nurses of visualizations of infectious disease transmission in a hospital. An observational study was used to evaluate perceptions of several visualizations of data extracted from electronic health records, designed using a participatory approach. Twelve nurses in the master's program of an urban research-intensive nursing school participated in May 2015. A convergent parallel mixed method was used to evaluate nurses' perceptions of the ease of use and usefulness of five visualizations conveying trends in hospital infection transmission, applying think-aloud, interview, and eye-tracking techniques. Subjective data from the interview and think-aloud techniques indicated that participants preferred traditional line graphs for simple data representation due to their familiarity, clarity, and ease of reading. An objective quantitative eye movement analysis (444,421 gaze events) identified a high degree of participant attention to the infographics in all three scenarios. All participants responded with the correct answer within 1 min in comprehension tests. A user-centric approach was effective in developing and evaluating visualizations for hospital infection transmission. For the visualizations designed by the users, the participants were easily able to comprehend the infection visualizations in both line graphs and infographics for simple visualization. The findings from the objective comprehension test and eye movement analysis, together with the subjective attitudes, support the feasibility of integrating user-centric visualization designs into electronic health records, which may inspire clinicians to be mindful of hospital infection transmission. Future studies are needed to investigate visualizations and motivation, and the effectiveness of visualization on infection rates. This study designed visualization images using clinical data from electronic health records applying a user-centric approach.
The design insights can be applied for visualizing patient data in electronic health records. © 2016 Sigma Theta Tau International.
Frontal crashworthiness characterisation of a vehicle segment using curve comparison metrics.
Abellán-López, D; Sánchez-Lozano, M; Martínez-Sáez, L
2018-08-01
The objective of this work is to propose a methodology for the characterization of the collision behaviour and crashworthiness of a segment of vehicles, by selecting the vehicle that best represents that group. It would be useful in the development of deformable barriers to be used in crash tests intended to study vehicle compatibility, as well as for the definition of the representative standard pulses used in numerical simulations or component testing. The characterisation and selection of representative vehicles is based on the objective comparison of the occupant compartment acceleration and barrier force pulses, obtained during crash tests, by using appropriate comparison metrics. This method is complemented with another one, based exclusively on the comparison of a few characteristic parameters of crash behaviour obtained from the previous curves. The method has been applied to different vehicle groups, using test data from a sample of vehicles. During this application, the performance of several metrics usually employed in the validation of simulation models has been analysed, and the most efficient ones have been selected for the task. The methodology finally defined is useful for vehicle segment characterization, taking into account aspects of crash behaviour related to the shape of the curves that are difficult to represent by simple numerical parameters, and it may be tuned in future works when applied to larger and different samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
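The abstract does not name which comparison metrics were retained. One metric commonly used for validating crash simulation curves is the Sprague & Geers magnitude/phase metric; the sketch below is a generic implementation of that metric, not the paper's selected one, and the sine-pulse example is purely illustrative:

```python
import numpy as np

def sprague_geers(measured, computed):
    """Sprague & Geers magnitude (M), phase (P) and combined (C) errors
    between two sampled curves on the same time base."""
    m = np.asarray(measured, dtype=float)
    c = np.asarray(computed, dtype=float)
    mm = np.sum(m * m)
    cc = np.sum(c * c)
    mc = np.sum(m * c)
    M = np.sqrt(cc / mm) - 1.0                      # amplitude error
    P = np.arccos(np.clip(mc / np.sqrt(mm * cc), -1.0, 1.0)) / np.pi
    C = np.sqrt(M * M + P * P)                      # combined error
    return M, P, C

t = np.linspace(0.0, 1.0, 500)
pulse = np.sin(2 * np.pi * t)                # toy reference crash pulse
M, P, C = sprague_geers(pulse, 1.1 * pulse)  # 10% amplitude error, no phase shift
```

For two curves that differ only by a 10% amplitude scaling, M is 0.1 and P is 0, which is why such metrics separate shape agreement from magnitude agreement, the property the abstract relies on.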
Seven Steps You Can Take to Improve Your Objectives. Third Edition.
ERIC Educational Resources Information Center
Alvir, Claire Gelinas
This document gives examples of seven steps a classroom teacher can take to improve both instructional and learning objectives. This improvement is to be measured by increased student learning. The seven steps are as follows: a) write a simple behavioral objective, b) edit, c) revise objective to make it learner centered, d) clarify, e) evaluate…
The Aggregate Representation of Terrestrial Land Covers Within Global Climate Models (GCM)
NASA Technical Reports Server (NTRS)
Shuttleworth, W. James; Sorooshian, Soroosh
1996-01-01
This project had four initial objectives: (1) to create a realistic coupled surface-atmosphere model to investigate the aggregate description of heterogeneous surfaces; (2) to develop a simple heuristic model of surface-atmosphere interactions; (3) using the above models, to test aggregation rules for a variety of realistic cover and meteorological conditions; and (4) to reconcile biosphere-atmosphere transfer scheme (BATS) land covers with those that can be recognized from space. Our progress in meeting these objectives can be summarized as follows. Objective 1: The first objective was achieved in the first year of the project by coupling the Biosphere-Atmosphere Transfer Scheme (BATS) with a proven two-dimensional model of the atmospheric boundary layer. The resulting model, BATS-ABL, is described in detail in a Masters thesis and reported in a paper in the Journal of Hydrology. Objective 2: The potential value of the heuristic model was re-evaluated early in the project and a decision was made to focus subsequent research around modeling studies with the BATS-ABL model. The value of using such coupled surface-atmosphere models in this research area was further confirmed by the success of the Tucson Aggregation Workshop. Objective 3: There was excellent progress in using the BATS-ABL model to test aggregation rules for a variety of realistic covers. The foci of attention have been the site of the First International Satellite Land Surface Climatology Project Field Experiment (FIFE) in Kansas and one of the study sites of the Anglo-Brazilian Amazonian Climate Observational Study (ABRACOS) near the city of Manaus, Amazonas, Brazil. These two sites were selected because of the ready availability of relevant field data to validate and initiate the BATS-ABL model. The results of these tests are given in a Masters thesis, and reported in two papers.
Objective 4: Progress far exceeded original expectations not only in reconciling BATS land covers with those that can be recognized from space, but also in then applying remotely-sensed land cover data to map aggregate values of BATS parameters for heterogeneous covers and interpreting these parameters in terms of surface-atmosphere exchanges.
Physics-Based Imaging Methods for Terahertz Nondestructive Evaluation Applications
NASA Astrophysics Data System (ADS)
Kniffin, Gabriel Paul
Lying between the microwave and far infrared (IR) regions, the "terahertz gap" is a relatively unexplored frequency band in the electromagnetic spectrum that exhibits a unique combination of properties from its neighbors. Like in IR, many materials have characteristic absorption spectra in the terahertz (THz) band, facilitating the spectroscopic "fingerprinting" of compounds such as drugs and explosives. In addition, non-polar dielectric materials such as clothing, paper, and plastic are transparent to THz, just as they are to microwaves and millimeter waves. These factors, combined with sub-millimeter wavelengths and non-ionizing energy levels, make sensing in the THz band uniquely suited for many NDE applications. In a typical nondestructive test, the objective is to detect a feature of interest within the object and provide an accurate estimate of some geometrical property of the feature. Notable examples include the thickness of a pharmaceutical tablet coating layer or the 3D location, size, and shape of a flaw or defect in an integrated circuit. While the material properties of the object under test are often tightly controlled and are generally known a priori, many objects of interest exhibit irregular surface topographies such as varying degrees of curvature over the extent of their surfaces. Common THz pulsed imaging (TPI) methods originally developed for objects with planar surfaces have been adapted for objects with curved surfaces through use of mechanical scanning procedures in which measurements are taken at normal incidence over the extent of the surface. While effective, these methods often require expensive robotic arm assemblies, the cost and complexity of which would likely be prohibitive should a large volume of tests need to be carried out on a production line.
This work presents a robust and efficient physics-based image processing approach based on the mature field of parabolic equation methods, common to undersea acoustics, seismology, and other areas of science and engineering. The method allows the generation of accurate 3D THz tomographic images of objects with irregular, non-planar surfaces using a simple planar scan geometry, thereby facilitating the integration of 3D THz imaging into mainstream NDE use.
Galy, Bertrand; Lan, André
2018-03-01
Among the many occupational risks construction workers encounter every day, falling from a height is the most dangerous. The objective of this article is to propose a simple analytical design method for horizontal lifelines (HLLs) that considers anchorage flexibility. The article presents a short review of the standards and regulations/acts/codes concerning HLLs in Canada, the USA and Europe. A static analytical approach that accounts for anchorage flexibility is proposed. The analytical results are compared with a series of 42 dynamic fall tests and a SAP2000 numerical model. The experimental results show that the analytical method is slightly conservative and overestimates the line tension in most cases, by a maximum of 17%. The static SAP2000 results show a maximum 2.1% difference from the analytical method. The analytical method is accurate enough to safely design HLLs, and quick design abaci are provided to allow the engineer to make quick on-site verifications if needed.
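The paper's method additionally accounts for anchorage flexibility, which the abstract does not detail; the basic statics behind lifeline tension can still be illustrated. For a single arrest load P at midspan of a line with sag s over span L, vertical equilibrium gives 2·T·sin(θ) = P with tan(θ) = 2s/L, so T ≈ PL/(4s) for small sag. The numbers below are a made-up example, not the article's test data:

```python
import math

def midspan_lifeline_tension(P, span, sag):
    """Static tension in a horizontal lifeline carrying a single point
    load P at midspan: 2*T*sin(theta) = P, tan(theta) = 2*sag/span."""
    theta = math.atan2(2.0 * sag, span)
    return P / (2.0 * math.sin(theta))

# Example: 6 kN arrest load, 12 m span, 0.5 m sag
T = midspan_lifeline_tension(6.0, 12.0, 0.5)   # roughly 36 kN
```

Note how a small sag magnifies the tension well beyond the applied load, which is why HLL design standards pay so much attention to line tension and anchorage strength.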
Chen, Qi; Li, Yang; Shi, Bing; Yin, Heng; Zheng, Guang-Ning; Zheng, Qian
2013-12-01
The objective of this study was to analyze the factors correlated with velopharyngeal closure in patients with cleft palate after primary repair. Ninety-five nonsyndromic patients with cleft palate were enrolled. Two surgical techniques were applied: simple palatoplasty, and palatoplasty combined with pharyngoplasty. All patients were assessed 6 months after the operation. The postoperative velopharyngeal closure (VPC) rate was compared by χ² test and the correlative factors were analyzed with a logistic regression model. The postoperative VPC rate was higher in young patients than in old patients, higher in the group with incomplete cleft palate than in the group with complete cleft palate, and higher with combined palatoplasty and pharyngoplasty than with simple palatoplasty. Operative age, cleft type, and surgical technique were the contributing factors for the postoperative VPC rate, and all three were significant factors influencing the postoperative VPC rate of patients with cleft palate. Copyright © 2013 Elsevier Inc. All rights reserved.
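The χ² comparison of VPC rates between two techniques reduces to a 2x2 contingency table. The sketch below uses the standard Pearson formula; the counts are hypothetical, invented for illustration, and are NOT the study's data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]], e.g. VPC achieved vs. not, by technique."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical counts (NOT from the study): combined palatoplasty with
# pharyngoplasty 40/50 achieved VPC; simple palatoplasty 25/45.
chi2 = chi_square_2x2(40, 10, 25, 20)   # compare to 3.841 (df=1, p=0.05)
```

A statistic above the 3.841 critical value (df = 1, α = 0.05) would indicate a significant difference in VPC rate between techniques, which is the kind of comparison the abstract reports.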
Phrase Frequency Effects in Language Production
Janssen, Niels; Barber, Horacio A.
2012-01-01
A classic debate in the psychology of language concerns the question of the grain-size of the linguistic information that is stored in memory. One view is that only morphologically simple forms are stored (e.g., ‘car’, ‘red’), and that more complex forms of language such as multi-word phrases (e.g., ‘red car’) are generated on-line from the simple forms. In two experiments we tested this view. In Experiment 1, participants produced noun+adjective and noun+noun phrases that were elicited by experimental displays consisting of colored line drawings and two superimposed line drawings. In Experiment 2, participants produced noun+adjective and determiner+noun+adjective utterances elicited by colored line drawings. In both experiments, naming latencies decreased with increasing frequency of the multi-word phrase, and were unaffected by the frequency of the object name in the utterance. These results suggest that the language system is sensitive to the distribution of linguistic information at grain-sizes beyond individual words. PMID:22479370
Prevalence of Sickle Cell Trait in the Southern Suburb of Beirut, Lebanon.
El Ariss, Abdel Badih; Younes, Mohamad; Matar, Jad; Berjaoui, Zeina
2016-01-01
The objective of this study was to assess the prevalence, gender differences, and time trends of Sickle Cell Trait in the Southern Suburb of Beirut, Lebanon, as well as to highlight the importance of screening for Sickle Cell Trait carriers in this population. Another objective was to describe a new screening technique for Sickle Cell Trait carriers. This was a retrospective cohort study carried out at a private laboratory in the Southern Suburb of Beirut, Lebanon between 2002 and 2014. The sickling test was carried out for each patient using two methods: the classical "sodium metabisulfite sickling test", and the new "sickling test method" used in the private lab. As a confirmatory test, hemoglobin electrophoresis was run on a random sample of 223 cases which were found to be positive using the two sickling tests. A total of 899 cases were found to be positive for the sickle cell trait out of 184,105 subjects screened during the 12-year period, prevalence = 0.49% (95% CI: 0.46 - 0.52). Among the total sample, females were found to have a higher prevalence, while no time trend over the studied period was noted. The hemoglobin electrophoresis method confirmed the results of this new sickling test technique among the random sample of the 223 cases. We found that the prevalence of sickle cell trait is lower compared with other Arab countries and higher in females, with no significant time trend. The sickle cell test was found to be an accurate, simple and cheap test that could be easily added as a requirement for pre-marital testing to screen for Sickle Cell Trait carriers.
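The reported prevalence and its 95% confidence interval can be reproduced from the raw counts with a normal-approximation (Wald) interval, sketched below; this is a standard textbook formula, not necessarily the exact method the authors used:

```python
import math

def prevalence_ci(cases, n, z=1.96):
    """Point prevalence with a Wald 95% confidence interval:
    p +/- z * sqrt(p*(1-p)/n)."""
    p = cases / n
    se = math.sqrt(p * (1.0 - p) / n)
    return p, p - z * se, p + z * se

p, lo, hi = prevalence_ci(899, 184105)
# p rounds to 0.49%, CI to (0.46%, 0.52%), matching the abstract
```

The agreement with the published interval is a quick sanity check that the abstract's counts and percentages are internally consistent.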
A novel method for objective vision testing in canine models of inherited retinal disease.
Gearhart, Patricia M; Gearhart, Chris C; Petersen-Jones, Simon M
2008-08-01
The use of canine models of retinal disease in the development of therapeutic strategies for inherited retinal disorders is a growing area of research. To evaluate accurately the success of potential vision-enhancing treatments, reliable methods for objectively assessing visual function in canine models are necessary. A simple vision-testing device was constructed that consisted of a junction box with four exit tunnels. Dogs were placed in the junction box and given one vision-based choice for exit. The first-choice tunnel and time to exit were recorded and analyzed. Two canine models of retinal disease with distinct molecular defects, a null mutation in the gene encoding the alpha subunit of rod cyclic GMP phosphodiesterase (PDE6A), and a null mutation in the gene encoding a retinal pigment epithelium-specific protein (RPE65), were tested and compared with unaffected dogs. With the use of bright light versus dim red light, the test differentiated between unaffected dogs and dogs affected with either mutation with a high degree of certainty. The white-light intensity series showed a significantly different performance between the unaffected and affected dogs. A significant difference in performance was also detected between the dogs with each mutation. The results indicate that this novel canine vision-testing method is an accurate and sensitive means of distinguishing between unaffected dogs and dogs affected with two different forms of inherited retinal disease and should be useful as a means of assessing response to therapy in future studies.
On Improving Efficiency of Differential Evolution for Aerodynamic Shape Optimization Applications
NASA Technical Reports Server (NTRS)
Madavan, Nateri K.
2004-01-01
Differential Evolution (DE) is a simple and robust evolutionary strategy that has proven effective in determining the global optimum for several difficult optimization problems. Although DE offers several advantages over traditional optimization approaches, its use in applications such as aerodynamic shape optimization, where the objective function evaluations are computationally expensive, is limited by the large number of function evaluations often required. In this paper various approaches for improving the efficiency of DE are reviewed and discussed. Several approaches that have proven effective for other evolutionary algorithms are modified and implemented in a DE-based aerodynamic shape optimization method that uses a Navier-Stokes solver for the objective function evaluations. Parallelization techniques on distributed computers are used to reduce turnaround times. Results are presented for standard test optimization problems and for the inverse design of a turbine airfoil. The efficiency improvements achieved by the different approaches are evaluated and compared.
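The baseline scheme the paper builds on is classic DE/rand/1/bin: mutate with a scaled difference of two random population members, cross over binomially, and keep the trial vector only if it improves. A minimal sketch on a cheap test function follows; the population size, F, and CR values are generic defaults, and this is of course not the paper's Navier-Stokes-coupled implementation:

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin optimizer."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(lo)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fitness = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: three distinct random members, none equal to i
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), lo, hi)
            # Binomial crossover with one guaranteed mutant gene
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection
            ft = f(trial)
            if ft <= fitness[i]:
                pop[i], fitness[i] = trial, ft
    best = np.argmin(fitness)
    return pop[best], fitness[best]

sphere = lambda x: float(np.sum(x * x))
x_best, f_best = differential_evolution(sphere, [(-5, 5)] * 3)
```

The inner loop makes the paper's cost problem concrete: every trial vector requires one objective evaluation, so pop_size × generations flow-solver runs in the aerodynamic setting, which is exactly what the efficiency improvements target.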
NASA Astrophysics Data System (ADS)
Baumgart, M.; Druml, N.; Consani, M.
2018-05-01
This paper presents a simulation approach for Time-of-Flight cameras to estimate sensor performance and accuracy, as well as to help in understanding experimentally discovered effects. The main scope is the detailed simulation of the optical signals. We take a raytracing-based approach and use the optical path length as the master parameter for depth calculations. The procedure is described in detail with references to our implementation in Zemax OpticStudio and Python. Our simulation approach supports multiple and extended light sources and allows accounting for all effects within the geometrical optics model. In particular, multi-object reflection/scattering ray-paths, translucent objects, and aberration effects (e.g. distortion caused by the ToF lens) are supported. The optical path length approach also enables the implementation of different ToF sensor types and transient imaging evaluations. The main features are demonstrated on a simple 3D test scene.
A new blood vessel extraction technique using edge enhancement and object classification.
Badsha, Shahriar; Reza, Ahmed Wasif; Tan, Kim Geok; Dimyati, Kaharudin
2013-12-01
Diabetic retinopathy (DR) is increasing progressively, pushing up the demand for automatic extraction and classification of disease severity. Blood vessel extraction from the fundus image is a vital and challenging task. Therefore, this paper presents a new, computationally simple, and automatic method to extract the retinal blood vessels. The proposed method comprises several basic image processing techniques, namely edge enhancement by a standard template, noise removal, thresholding, morphological operations, and object classification. The proposed method has been tested on a set of retinal images collected from the DRIVE database, and we have employed robust performance analysis to evaluate the accuracy. The results obtained from this study reveal that the proposed method offers an average accuracy of about 97%, sensitivity of 99%, specificity of 86%, and predictive value of 98%, which is superior to various well-known techniques.
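The first two pipeline stages, template-based edge enhancement followed by thresholding, can be sketched with a generic Laplacian-style template; the paper's actual standard templates and threshold are not given in the abstract, so the kernel, threshold, and toy image below are illustrative assumptions:

```python
import numpy as np

def edge_enhance_threshold(img, thresh):
    """Enhance edges with a 3x3 Laplacian-style template and binarize."""
    img = np.asarray(img, dtype=float)
    # Laplacian response via array slicing (no SciPy needed);
    # output shrinks by one pixel on each border
    core = img[1:-1, 1:-1]
    lap = (img[:-2, 1:-1] + img[2:, 1:-1] +
           img[1:-1, :-2] + img[1:-1, 2:] - 4.0 * core)
    return np.abs(lap) > thresh

# Toy 'vessel': a bright vertical line on a dark background
img = np.zeros((9, 9))
img[:, 4] = 200.0
mask = edge_enhance_threshold(img, 100.0)
```

On the toy image the mask fires on the line and its two flanking columns and nowhere else, showing why edge enhancement is followed by noise removal and morphology in the full pipeline: raw edge responses are wider than the structures they mark.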
Safety and licensing of a small modular gas-cooled reactor system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, N.W.; Kelley, A.P. Jr.
A modular side-by-side high-temperature gas-cooled reactor (SBS-HTGR) is being developed by Interatom/Kraftwerk Union (KWU). The General Electric Company and Interatom/KWU entered into a proprietary working agreement to continue joint development of the SBS-HTGR. A study on adapting the SBS-HTGR for application in the US has been completed. The study investigated the safety characteristics and the use of this type of design in an innovative approach to licensing. The safety objective guiding the design of the modular SBS-HTGR is to control radionuclide release by the retention of fission products within the fuel particles with minimal reliance on active design features. The philosophy on which this objective is predicated is that by providing a simple safety case, the safety criteria can be demonstrated as being met with high confidence through conduct of a full-scale module safety test.
Reflection symmetry detection using locally affine invariant edge correspondence.
Wang, Zhaozhong; Tang, Zesheng; Zhang, Xiao
2015-04-01
Reflection symmetry detection has received increasing attention in recent years. The state-of-the-art algorithms mainly use the matching of intensity-based features (such as the SIFT) within a single image to find symmetry axes. This paper proposes a novel approach that establishes the correspondence of locally affine invariant edge-based features, which are superior to intensity-based features in that they are insensitive to illumination variations and applicable to textureless objects. The locally affine invariance is achieved by simple linear algebra for efficient and robust computation, making the algorithm suitable for detection under object distortions like perspective projection. Commonly used edge detectors and a voting process are, respectively, used before and after the edge description and matching steps to form a complete reflection detection pipeline. Experiments are performed using synthetic and real-world images with both multiple and single reflection symmetry axes. The test results are compared with existing algorithms to validate the proposed method.
NASA Astrophysics Data System (ADS)
Cardall, Christian Y.; Budiardja, Reuben D.
2018-01-01
The large-scale computer simulation of a system of physical fields governed by partial differential equations requires some means of approximating the mathematical limit of continuity. For example, conservation laws are often treated with a 'finite-volume' approach in which space is partitioned into a large number of small 'cells,' with fluxes through cell faces providing an intuitive discretization modeled on the mathematical definition of the divergence operator. Here we describe and make available Fortran 2003 classes furnishing extensible object-oriented implementations of simple meshes and the evolution of generic conserved currents thereon, along with individual 'unit test' programs and larger example problems demonstrating their use. These classes inaugurate the Mathematics division of our developing astrophysics simulation code GENASIS (General Astrophysical Simulation System), which will be expanded over time to include additional meshing options, mathematical operations, solver types, and solver variations appropriate for many multiphysics applications.
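The finite-volume idea the abstract describes (fluxes through cell faces discretizing the divergence) can be illustrated on a 1D scalar conservation law. GENASIS itself provides Fortran 2003 classes; the sketch below is a minimal upwind scheme in Python, not GENASIS code:

```python
import numpy as np

def advect_step(u, speed, dx, dt):
    """One finite-volume update for du/dt + d(speed*u)/dx = 0 using
    upwind face fluxes and periodic boundaries (speed > 0 assumed)."""
    flux = speed * np.roll(u, 1)             # flux through each cell's left face
    return u - dt / dx * (np.roll(flux, -1) - flux)

n, speed = 100, 1.0
dx = 1.0 / n
dt = 0.5 * dx / speed                        # CFL-stable step
u = np.where(np.arange(n) < 50, 1.0, 0.0)    # square-wave initial data
total_before = u.sum() * dx
for _ in range(200):
    u = advect_step(u, speed, dx, dt)
total_after = u.sum() * dx                   # conserved to round-off
```

Because the flux leaving one cell's right face is exactly the flux entering its neighbor's left face, the total integral of u is conserved to machine precision, the defining property of the finite-volume formulation.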
NASA Technical Reports Server (NTRS)
Alhorn, D. C.; Howard, D. E.; Smith, D. A.
2005-01-01
The Advanced Sensor Concepts project was conducted under the Center Director's Discretionary Fund at the Marshall Space Flight Center. Its objective was to advance the technology originally developed for the Glovebox Integrated Microgravity Isolation Technology project. The objective of this effort was to develop and test several new motion sensors. To date, the investigators have invented seven new technologies during this endeavor and have conceived several others. The innovative basic sensor technology is an absolute position sensor. It employs only two active components, and it is simple, inexpensive, reliable, repeatable, lightweight, and relatively unobtrusive. Two sensors can be utilized in the same physical space to achieve redundancy. The sensor has micrometer positional accuracy and can be configured as a two- or three-dimensional sensor. The sensor technology has the potential to pioneer a new class of linear and rotary sensors. This sensor is the enabling technology for autonomous assembly of modular structures in space and on extraterrestrial locations.
The effect of postsurgical pain on attentional processing in horses.
Dodds, Louise; Knight, Laura; Allen, Kate; Murrell, Joanna
2017-07-01
To investigate the effect of postsurgical pain on the performance of horses in a novel object and auditory startle task. Prospective clinical study. Twenty horses undergoing different types of surgery and 16 control horses that did not undergo surgery. The interaction of 36 horses with novel objects and a response to an auditory stimulus were measured at two time points; the day before surgery (T1) and the day after surgery (T2) for surgical horses (G1), and at a similar time interval for control horses (G2). Pain and sedation were measured using simple descriptive scales at the time the tests were carried out. Total time or score attributed to each of the behavioural categories was compared between groups (G1 and G2) for each test and between tests (T1 and T2) for each group. The median (range) time spent interacting with novel objects was reduced in G1 from 58 (6-367) seconds in T1 to 12 (0-495) seconds in T2 (p=0.0005). In G2 the change in interaction time between T1 and T2 was not statistically significant. Median (range) total auditory score was 7 (3-12) and 10 (1-12) in G1 and G2, respectively, at T1, decreasing to 6 (0-10) in G1 after surgery and 9.5 (1-12) in G2 (p=0.0003 and p=0.94, respectively). There was a difference in total auditory score between G1 and G2 at T2 (p=0.0169), with the score being lower in G1 than G2. Postsurgical pain negatively impacts attention towards novel objects and causes a decreased responsiveness to an auditory startle test. In horses, tasks demanding attention may be useful as a biomarker of pain. Copyright © 2017 Association of Veterinary Anaesthetists and American College of Veterinary Anesthesia and Analgesia. All rights reserved.
DOT National Transportation Integrated Search
2017-06-01
The objective of this study was to develop an objective, quantitative method for evaluating damage to bridge girders by using artificial neural networks (ANNs). This evaluation method, which is a supplement to visual inspection, requires only the res...
A cortical framework for invariant object categorization and recognition.
Rodrigues, João; Hans du Buf, J M
2009-08-01
In this paper we present a new model for invariant object categorization and recognition. It is based on explicit multi-scale features: lines, edges and keypoints are extracted from responses of simple, complex and end-stopped cells in cortical area V1, and keypoints are used to construct saliency maps for Focus-of-Attention. The model is a functional but dichotomous one, because keypoints are employed to model the "where" data stream, with dynamic routing of features from V1 to higher areas to obtain translation, rotation and size invariance, whereas lines and edges are employed in the "what" stream for object categorization and recognition. Furthermore, both the "where" and "what" pathways are dynamic in that information at coarse scales is employed first, after which information at progressively finer scales is added in order to refine the processes, i.e., both the dynamic feature routing and the categorization level. The construction of group and object templates, which are thought to be available in the prefrontal cortex with "what" and "where" components in PF46d and PF46v, is also illustrated. The model was tested in the framework of an integrated and biologically plausible architecture.
Wen, Jianming
2012-09-01
A recent thermal ghost imaging experiment implemented in Wu's group [Chin. Phys. Lett. 279, 074216 (2012)] showed that both positive and negative images can be constructed by applying a novel algorithm. This algorithm allows us to form the images using partial measurements from the reference arm (which never even passes through the object), conditioned on the object arm. In this paper, we present a simple theory that explains the experimental observation and provides an in-depth understanding of conventional ghost imaging. In particular, we theoretically show that the visibility of images formed through such an algorithm is not bounded by the standard value 1/3. In fact, it can ideally grow up to unity (with reduced imaging quality). Thus, the algorithm described here not only offers an alternative way to decode the spatial correlation of thermal light, but also mimics a "bandpass filter" that removes the constant background so that the visibility, or imaging contrast, is improved. We further show that, conditioned on one still object present in the test arm, it is possible to construct the object's image by sampling the available reference data.
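The conventional ghost imaging baseline the paper analyzes correlates the reference-arm speckle patterns with the object-arm bucket (single-pixel) signal. The toy simulation below illustrates that correlation; the pattern statistics, object, and shot count are made-up illustrative choices, and this is not Wu's conditional algorithm:

```python
import numpy as np

def ghost_image(patterns, bucket):
    """Conventional ghost imaging: covariance of the reference speckle
    patterns with the bucket signal, G(x) = <dS * dI(x)>."""
    I = np.asarray(patterns, dtype=float)        # (shots, H, W)
    S = np.asarray(bucket, dtype=float)          # (shots,)
    return np.tensordot(S - S.mean(), I - I.mean(axis=0), axes=1) / len(S)

rng = np.random.default_rng(1)
obj = np.zeros((16, 16))
obj[4:12, 6:10] = 1.0                            # transmissive aperture
patterns = rng.random((5000, 16, 16))            # pseudo-thermal speckle
bucket = (patterns * obj).sum(axis=(1, 2))       # object-arm bucket signal
G = ghost_image(patterns, bucket)                # image emerges in G
```

Inside the aperture the covariance equals the per-pixel intensity variance, while outside it averages to zero, so the object's shape appears in G; the paper's point is that conditioning the averaging on the bucket values can push the visibility of such images beyond the usual 1/3 bound.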
Comparison of ORSAT and SCARAB Reentry Analysis Tools for a Generic Satellite Test Case
NASA Technical Reports Server (NTRS)
Kelley, Robert L.; Hill, Nicole M.; Rochelle, W. C.; Johnson, Nicholas L.; Lips, T.
2010-01-01
Reentry analysis is essential to understanding the consequences of the full life cycle of a spacecraft. Since reentry is a key factor in spacecraft development, NASA and ESA have separately developed tools to assess the survivability of objects during reentry. Criteria such as debris casualty area and impact energy are particularly important to understanding the risks posed to people on Earth. Therefore, NASA and ESA have undertaken a series of comparison studies of their respective reentry codes for verification and improvements in accuracy. The NASA Object Reentry Survival Analysis Tool (ORSAT) and the ESA Spacecraft Atmospheric Reentry and Aerothermal Breakup (SCARAB) reentry analysis tools serve as standard codes for reentry survivability assessment of satellites. These programs predict whether an object will demise during reentry and calculate the debris casualty area of objects determined to survive, establishing the reentry risk posed to the Earth's population by surviving debris. A series of test cases has been studied for comparison; the most recent uses "Testsat," a conceptual satellite composed of generic parts, defined to use numerous simple shapes and various materials for a better comparison of the predictions of these two codes. This study is an improvement on the others in the series because of increased consistency in modeling techniques and variables. The overall comparison demonstrated that the two codes arrive at similar results: most objects modeled showed close agreement between the two codes, and where the difference was significant, the variance could be explained as a case of semantics in the model definitions. This paper presents the main results of ORSAT and SCARAB for the Testsat case and discusses the sources of any discovered differences. The results of previous comparisons are also discussed, summarizing the differences between the codes and the lessons learned from this series of tests.
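The debris casualty area criterion the abstract cites is commonly computed by inflating each surviving fragment's cross-section by the projected size of a standing person; a term of 0.6 m (the square root of an assumed 0.36 m² human cross-section) is a typical choice. Whether ORSAT and SCARAB use exactly this expression is not stated in the abstract, so treat the formula and the fragment areas below as illustrative:

```python
import math

def debris_casualty_area(fragment_areas, human_term=0.6):
    """Total debris casualty area (m^2): each surviving fragment of
    cross-section A_i contributes (human_term + sqrt(A_i))^2."""
    return sum((human_term + math.sqrt(a)) ** 2 for a in fragment_areas)

# Three hypothetical surviving fragments with cross-sections in m^2
dca = debris_casualty_area([0.25, 0.04, 1.0])
```

Note that even a tiny fragment contributes at least (0.6 m)² = 0.36 m², which is why the number of surviving fragments, not just their total area, drives the computed risk.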
Evaluation of a New and Rapid Serologic Test for Detecting Brucellosis: Brucella Coombs Gel Test.
Hanci, Hayrunisa; Igan, Hakan; Uyanik, Muhammet Hamidullah
2017-01-01
Many serological tests have been used for the diagnosis of human brucellosis. A new serological method, the Brucella Coombs gel test, is based on the principle of a centrifugation gel system similar to the gel system used in blood group determination. In this system, if Brucella antibodies are present in the serum, antigen and antibody remain as a pink complex on the gel; otherwise, the pink Brucella antigens precipitate at the bottom of the gel card system. In this study, we aimed to compare the Brucella Coombs gel test, a new, rapid screening and titration method for detection of non-agglutinating IgG, with the Brucella Coombs test. For this study, a total of 88 serum samples were obtained from 45 healthy persons and 43 individuals who had clinical signs and symptoms of brucellosis. For each specimen, the Rose Bengal test, standard agglutination test, Coombs test, and Brucella Coombs gel test were carried out. The sensitivity and specificity of the Brucella Coombs gel test were found to be 100.0 and 82.2%, respectively. The Brucella Coombs gel test can be used as a screening test with high sensitivity. Because of the pink Brucella antigen precipitation, the test's evaluation is simple and objective. In addition, determination of Brucella antibody by rapid titration offers another important advantage.
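The reported sensitivity and specificity follow from the standard confusion-matrix definitions. A hedged illustration of that arithmetic; the raw counts below are reconstructed assumptions consistent with the abstract's figures (43 brucellosis sera, 45 healthy sera), not reported data:

```python
def sensitivity(tp, fn):
    # fraction of diseased subjects the test flags (true-positive rate)
    return tp / (tp + fn)

def specificity(tn, fp):
    # fraction of healthy subjects the test clears (true-negative rate)
    return tn / (tn + fp)

tp, fn = 43, 0   # assumed: all 43 clinical sera reacted -> sensitivity 100%
tn, fp = 37, 8   # assumed: 37 of 45 healthy sera negative -> 37/45 ≈ 82.2%

print(round(100 * sensitivity(tp, fn), 1))  # → 100.0
print(round(100 * specificity(tn, fp), 1))  # → 82.2
```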
Design of efficient and simple interface testing equipment for opto-electric tracking system
NASA Astrophysics Data System (ADS)
Liu, Qiong; Deng, Chao; Tian, Jing; Mao, Yao
2016-10-01
Interface testing of an opto-electric tracking system verifies, at several levels, whether the design of each electronic interface matches its communication protocol, and is essential to assuring system performance. Opto-electric tracking systems are now complex and composed of many functional units. Interface testing is usually executed between completely manufactured units, so it depends heavily on unit design and manufacturing progress, as well as on the people involved; as a result, it often takes days or weeks. To solve this problem, this paper proposes efficient and simple interface testing equipment for opto-electric tracking systems, consisting of optional interface circuit cards, a processor, and a test program. The hardware cards provide matched hardware interfaces that are easily supplied by a hardware engineer. Automatic code generation is used to adapt to new communication protocols: test items are acquired automatically, the code architecture is constructed automatically, and encoding is performed automatically, so a new test program can be formed quickly. After a few simple steps, customized interface testing equipment with a matching test program and interfaces is ready for a system under test within minutes. The equipment has been used to test all or part of the interfaces of many opto-electric tracking systems, reducing test time from days to hours and greatly improving test efficiency, with high software quality and stability and without manual coding. Used as a common tool, the interface testing equipment proposed in this paper has changed the traditional interface testing method and achieved much higher efficiency.
A knowledge-based machine vision system for space station automation
NASA Technical Reports Server (NTRS)
Chipman, Laure J.; Ranganath, H. S.
1989-01-01
A simple knowledge-based approach to the recognition of objects in man-made scenes is being developed. Specifically, the system under development is a proposed enhancement to a robot arm for use in the space station laboratory module. The system will take a request from a user to find a specific object, and locate that object by using its camera input and information from a knowledge base describing the scene layout and attributes of the object types included in the scene. In order to use realistic test images in developing the system, researchers are using photographs of actual NASA simulator panels, which provide similar types of scenes to those expected in the space station environment. Figure 1 shows one of these photographs. In traditional approaches to image analysis, the image is transformed step by step into a symbolic representation of the scene. Often the first steps of the transformation are done without any reference to knowledge of the scene or objects. Segmentation of an image into regions generally produces a counterintuitive result in which regions do not correspond to objects in the image. After segmentation, a merging procedure attempts to group regions into meaningful units that will more nearly correspond to objects. Here, researchers avoid segmenting the image as a whole, and instead use a knowledge-directed approach to locate objects in the scene. The knowledge-based approach to scene analysis is described and the categories of knowledge used in the system are discussed.
Aragón, Alfredo S; Kalberg, Wendy O; Buckley, David; Barela-Scott, Lindsey M; Tabachnick, Barbara G; May, Philip A
2008-12-01
Although a large body of literature exists on cognitive functioning in alcohol-exposed children, it is unclear if there is a signature neuropsychological profile in children with Fetal Alcohol Spectrum Disorders (FASD). This study assesses cognitive functioning in children with FASD from several American Indian reservations in the Northern Plains States, and it applies a hierarchical model of simple versus complex information processing to further examine cognitive function. We hypothesized that complex tests would discriminate between children with FASD and culturally similar controls, while children with FASD would perform similar to controls on relatively simple tests. Our sample includes 32 control children and 24 children with a form of FASD [fetal alcohol syndrome (FAS) = 10, partial fetal alcohol syndrome (PFAS) = 14]. The test battery measures general cognitive ability, verbal fluency, executive functioning, memory, and fine-motor skills. Many of the neuropsychological tests produced results consistent with a hierarchical model of simple versus complex processing. The complexity of the tests was determined "a priori" based on the number of cognitive processes involved in them. Multidimensional scaling was used to statistically analyze the accuracy of classifying the neurocognitive tests into a simple versus complex dichotomy. Hierarchical logistic regression models were then used to define the contribution made by complex versus simple tests in predicting the significant differences between children with FASD and controls. Complex test items discriminated better than simple test items. The tests that conformed well to the model were the Verbal Fluency, Progressive Planning Test (PPT), the Lhermitte memory tasks, and the Grooved Pegboard Test (GPT). The FASD-grouped children, when compared with controls, demonstrated impaired performance on letter fluency, while their performance was similar on category fluency. 
On the more complex PPT trials (problems 5 to 8), as well as the Lhermitte logical tasks, the FASD group performed the worst. The differential performance between children with FASD and controls was evident across various neuropsychological measures. The children with FASD performed significantly more poorly on the complex tasks than did the controls. The identification of a neurobehavioral profile in children with prenatal alcohol exposure will help clinicians identify and diagnose children with FASD.
Multi-Frame Convolutional Neural Networks for Object Detection in Temporal Data
2017-03-01
Given the problem of detecting objects in video, existing neural-network solutions rely on a post-processing step to combine information across frames and strengthen conclusions. This technique has been successful for videos with simple, dominant objects, but it cannot detect objects...
Ductile Fracture Initiation of Anisotropic Metal Sheets
NASA Astrophysics Data System (ADS)
Dong, Liang; Li, Shuhui; He, Ji
2017-07-01
The objective of this research is to investigate the influence of material plastic anisotropy on ductile fracture in the strain space under the assumption of a plane stress state for sheet metals. For convenient application, a simple expression is formulated by the method of total strain theory under the assumption of proportional loading. The Hill 1948 quadratic anisotropic yield model and an isotropic hardening flow rule are adopted to describe the plastic response of the material. The Mohr-Coulomb model is revisited to describe ductile fracture in the stress space. In addition, the fracture locus for DP590 in different loading directions is obtained by experiments. Four different types of tensile test specimens, including the classical dog bone, flat with cutouts, flat with center holes, and pure shear, are loaded to fracture. All these specimens are prepared with their longitudinal axis inclined at angles of 0°, 45°, and 90° to the rolling direction, respectively. A 3D digital image correlation system is used in this study to measure the anisotropy parameters r0, r45, and r90 and the equivalent strains to fracture for all the tests. The results show that the material plastic anisotropy has a remarkable influence on the fracture locus in the strain space and can be predicted accurately by the simple expression proposed in this study.
Early Cerebral Small Vessel Disease and Brain Volume, Cognition, and Gait
Smith, Eric E; O'Donnell, Martin; Dagenais, Gilles; Lear, Scott A; Wielgosz, Andreas; Sharma, Mukul; Poirier, Paul; Stotts, Grant; Black, Sandra E; Strother, Stephen; Noseworthy, Michael D; Benavente, Oscar; Modi, Jayesh; Goyal, Mayank; Batool, Saima; Sanchez, Karla; Hill, Vanessa; McCreary, Cheryl R; Frayne, Richard; Islam, Shofiqul; DeJesus, Jane; Rangarajan, Sumathy; Teo, Koon; Yusuf, Salim
2015-01-01
Objective Decline in cognitive function begins by the 40s, and may be related to future dementia risk. We used data from a community-representative study to determine whether there are age-related differences in simple cognitive and gait tests by the 40s, and whether these differences were associated with covert cerebrovascular disease on magnetic resonance imaging (MRI). Methods Between 2010 and 2012, 803 participants aged 40 to 75 years in the Prospective Urban Rural Epidemiological (PURE) study, recruited from prespecified postal code regions centered on 4 Canadian cities, underwent brain MRI and simple tests of cognition and gait as part of a substudy (PURE-MIND). Results Mean age was 58 ± 8 years. Linear decreases in performance on the Montreal Cognitive Assessment, Digit Symbol Substitution Test (DSST), and Timed Up and Go test of gait were seen with each age decade from the 40s to the 70s. Silent brain infarcts were observed in 3% of 40- to 49-year-olds, with increasing prevalence up to 18.9% in 70-year-olds. Silent brain infarcts were associated with slower timed gait and lower volume of supratentorial white matter. Higher volume of supratentorial MRI white matter hyperintensity was associated with slower timed gait and worse performance on DSST, and lower volumes of the supratentorial cortex and white matter, and cerebellum. Interpretation Covert cerebrovascular disease and its consequences on cognitive and gait performance and brain atrophy are manifest in some clinically asymptomatic persons as early as the 5th decade of life. Ann Neurol 2015;77:251–261 PMID:25428654
A Simple Approach To Assessing Copper Pitting Corrosion Tendencies and Developing Control Strategies
The objective of this research was to assess the effectiveness of a simple pipe loop system and protocol to predict localized corrosion, and to assess treatment alternatives for a drinking water that has been associated with customer complaints of pinhole leaks.
Evaluating a smartphone digits-in-noise test as part of the audiometric test battery.
Potgieter, Jenni-Mari; Swanepoel, De Wet; Smits, Cas
2018-05-21
Speech-in-noise tests have become a valuable part of the audiometric test battery providing an indication of a listener's ability to function in background noise. A simple digits-in-noise (DIN) test could be valuable to support diagnostic hearing assessments, hearing aid fittings and counselling for both paediatric and adult populations. Objective: The objective of this study was to evaluate the South African English smartphone DIN test's performance as part of the audiometric test battery. Design: This descriptive study evaluated 109 adult subjects (43 male and 66 female subjects) with and without sensorineural hearing loss by comparing pure-tone air conduction thresholds, speech recognition monaural performance scores (SRS dB) and the DIN speech reception threshold (SRT). An additional nine adult hearing aid users (four male and five female subjects) were included in a subset to determine aided and unaided DIN SRTs. Results: The DIN SRT is strongly associated with the best ear 4 frequency pure-tone average (4FPTA) (rs = 0.81) and maximum SRS dB (r = 0.72). The DIN test had high sensitivity and specificity to identify abnormal pure-tone (0.88 and 0.88, respectively) and SRS dB (0.76 and 0.88, respectively) results. There was a mean signal-to-noise ratio (SNR) improvement in the aided condition that demonstrated an overall benefit of 0.84 SNR dB. Conclusion: The DIN SRT was significantly correlated with the best ear 4FPTA and maximum SRS dB. The DIN SRT provides a useful measure of speech recognition in noise that can evaluate hearing aid fittings, manage counselling and hearing expectations.
Simple Spreadsheet Models For Interpretation Of Fractured Media Tracer Tests
An analysis of a gas-phase partitioning tracer test conducted through fractured media is discussed within this paper. The analysis employed matching eight simple mathematical models to the experimental data to determine transport parameters. All of the models tested; two porous...
Speededness and Adaptive Testing
ERIC Educational Resources Information Center
van der Linden, Wim J.; Xiong, Xinhui
2013-01-01
Two simple constraints on the item parameters in a response--time model are proposed to control the speededness of an adaptive test. As the constraints are additive, they can easily be included in the constraint set for a shadow-test approach (STA) to adaptive testing. Alternatively, a simple heuristic is presented to control speededness in plain…
Distance comparisons in virtual reality: effects of path, context, and age
van der Ham, Ineke J. M.; Baalbergen, Heleen; van der Heijden, Peter G. M.; Postma, Albert; Braspenning, Merel; van der Kuil, Milan N. A.
2015-01-01
In this large scale, individual differences study (N = 521), the effects of cardinal axes of an environment and the path taken between locations on distance comparisons were assessed. The main goal was to identify if and to what extent previous findings in simple 2D tasks can be generalized to a more dynamic, three-dimensional virtual reality environment. Moreover, effects of age and gender were assessed. After memorizing the locations of six objects in a circular environment, participants were asked to judge the distance between objects they encountered. Results indicate that categorization (based on the cardinal axes) was present, as distances within one quadrant were judged as being closer together, even when no visual indication of the cardinal axes was given. Moreover, strong effects of the path taken between object locations were found; objects that were near on the path taken were perceived as being closer together than objects that were further apart on this path, regardless of the metric distance between the objects. Males outperformed females in distance comparison, but did not differ in the extent of the categorization and path effects. Age also affected performance; the categorization and path effects were highly similar across the age range tested, but the general ability to estimate distances does show a clear pattern increase during development and decrease with aging. PMID:26321968
NASA Astrophysics Data System (ADS)
Amato, Gabriele; Eisank, Clemens; Albrecht, Florian
2017-04-01
Landslide detection from Earth observation imagery is an important preliminary step for landslide mapping, landslide inventories, and landslide hazard assessment. In this context, the object-based image analysis (OBIA) concept has been increasingly used over the last decade. Within the framework of the Land@Slide project (Earth observation based landslide mapping: from methodological developments to automated web-based information delivery), a simple, unsupervised, semi-automatic, object-based approach for the detection of shallow landslides has been developed and implemented in the InterIMAGE open-source software. The method was applied to an Alpine case study in western Austria, exploiting spectral information from pansharpened 4-band WorldView-2 satellite imagery (0.5 m spatial resolution) in combination with digital elevation models. First, we divided the image into sub-images, i.e. tiles, and then applied the workflow to each of them without changing the parameters. The workflow was implemented as a top-down approach: at the image tile level, an over-classification of the potential landslide area was produced; the over-estimated area was then re-segmented and re-classified over several processing cycles until most false-positive objects had been eliminated. In every step, a segmentation based on the Baatz algorithm generates candidate landslide polygons. At the same time, the average values of the normalized difference vegetation index (NDVI) and brightness are calculated for these polygons; these values are then used as thresholds to select objects and improve the quality of the classification results. Empirically determined values of slope and roughness are also used in the selection process. Results for each tile were merged to obtain the landslide map for the test area. For final validation, the landslide map was compared to a geological map and a supervised landslide classification in order to estimate its accuracy.
Results for the test area showed that the proposed method is capable of accurately distinguishing landslides from roofs and trees. Implementation of the workflow into InterIMAGE was straightforward. We conclude that the method is able to extract landslides in forested areas, but that there is still room for improvements concerning the extraction in non-forested high-alpine regions.
AG Dra -- a high density plasma laboratory
NASA Astrophysics Data System (ADS)
Young, Peter
2002-07-01
A STIS observation of the symbiotic star AG Draconis yielding spectra in the range 1150--10 000 Angstrom is requested. AG Dra is a non-eclipsing binary that shows strong, narrow nebular emission lines that originate in the wind of a K giant, photoionized by a hot white dwarf. The density of the nebula is around 10^10 electrons/cm^3, making it the perfect laboratory for testing the plasma modeling codes cloudy and xstar at high densities. These codes are used for a wide range of astrophysical objects including stellar winds, accretion disks, active galactic nuclei, and Seyfert galaxies, and calibrating them against high signal-to-noise spectra from comparatively simple systems is essential. AG Dra is the perfect high density laboratory for this work. In addition, many previously undetected emission lines will be found through the high sensitivity of STIS, which will allow new plasma diagnostics to be tested. These twin objectives are particularly pertinent as the high sensitivity of HST/COS will permit similar high resolution spectroscopy to be applied to a whole new regime of extragalactic objects. By combining far-UV data from FUSE with complementary data from STIS, we will determine ratios of emission lines from the same ion, or from ions of similar ionization level. These will permit a more complete set of diagnostics than is obtainable from one instrument alone.
Shape detection of Gaborized outline versions of everyday objects
Sassi, Michaël; Machilsen, Bart; Wagemans, Johan
2012-01-01
We previously tested the identifiability of six versions of Gaborized outlines of everyday objects, differing in the orientations assigned to elements inside and outside the outline. We found significant differences in identifiability between the versions, and related a number of stimulus metrics to identifiability [Sassi, M., Vancleef, K., Machilsen, B., Panis, S., & Wagemans, J. (2010). Identification of everyday objects on the basis of Gaborized outline versions. i-Perception, 1(3), 121–142]. In this study, after retesting the identifiability of new variants of three of the stimulus versions, we tested their robustness to local orientation jitter in a detection experiment. In general, our results replicated the key findings from the previous study, and allowed us to substantiate our earlier interpretations of the effects of our stimulus metrics and of the performance differences between the different stimulus versions. The results of the detection task revealed a different ranking order of stimulus versions than the identification task. By examining the parallels and differences between the effects of our stimulus metrics in the two tasks, we found evidence for a trade-off between shape detectability and identifiability. The generally simple and smooth shapes that yield the strongest contour integration and most robust detectability tend to lack the distinguishing features necessary for clear-cut identification. Conversely, contours that do contain such identifying features tend to be inherently more complex and, therefore, yield weaker integration and less robust detectability. PMID:23483752
Security aspects of RFID communication systems
NASA Astrophysics Data System (ADS)
Bîndar, Valericǎ; Popescu, Mircea; Bǎrtuşicǎ, Rǎzvan; Craciunescu, Razvan; Halunga, Simona
2015-02-01
The objective of this study is to provide an overview of the basic technical elements and security risks of RFID communication systems and to analyze the possible threats arising from the use of RFID systems. A number of measurements were performed on a communication system comprising an RFID transponder and a tag reader, and it was determined that the uplink signal level is 62 dB larger than the average noise level at a distance of 1 m from the tag; the shielding effectiveness therefore has to exceed this threshold. Next, the card was covered with several shielding materials and measurements were carried out under similar conditions to test the recovery of compromising signals. A very simple protection measure to prevent unauthorized reading of the data stored on the card is proposed, and some electromagnetic shielding materials are proposed and tested.
Do Domestic Dogs Learn Words Based on Humans’ Referential Behaviour?
Tempelmann, Sebastian; Kaminski, Juliane; Tomasello, Michael
2014-01-01
Some domestic dogs learn to comprehend human words, although the nature and basis of this learning is unknown. In the studies presented here we investigated whether dogs learn words through an understanding of referential actions by humans rather than simple association. In three studies, each modelled on a study conducted with human infants, we confronted four word-experienced dogs with situations involving no spatial-temporal contiguity between the word and the referent; the only available cues were referential actions displaced in time from exposure to their referents. We found that no dogs were able to reliably link an object with a label based on social-pragmatic cues alone in all the tests. However, one dog did show skills in some tests, possibly indicating an ability to learn based on social-pragmatic cues. PMID:24646732
Autonomous and Autonomic Swarms
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy
2005-01-01
A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design and testing and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.
Implicit Self-Importance in an Interpersonal Pronoun Categorization Task.
Fetterman, Adam K; Robinson, Michael D; Gilbertson, Elizabeth P
2014-06-01
Object relations theories emphasize the manner in which the salience/importance of implicit representations of self and other guide interpersonal functioning. Two studies and a pilot test (total N = 304) sought to model such representations. In dyadic contexts, the self is a "you" and the other is a "me", as verified in a pilot test. Study 1 then used a simple categorization task and found evidence for implicit self-importance: The pronoun "you" was categorized more quickly and accurately when presented in a larger font size, whereas the pronoun "me" was categorized more quickly and accurately when presented in a smaller font size. Study 2 showed that this pattern possesses value in understanding individual differences in interpersonal functioning. As predicted, arrogant people scored higher in implicit self-importance in the paradigm. Findings are discussed from the perspective of dyadic interpersonal dynamics.
Lee, Kyung J.; Park, Seong-Beom; Lee, Inah
2014-01-01
Learning theories categorize learning systems into elemental and contextual systems, the former being processed by non-hippocampal regions and the latter being processed in the hippocampus. A set of complex stimuli such as a visual background is often considered a contextual stimulus, and simple sensory stimuli such as pure tone and light are considered elemental stimuli. However, this elemental-contextual categorization scheme has only been tested in limited behavioral paradigms and it is largely unknown whether it can be generalized across different learning situations. By requiring rats to respond differently to a common object in association with various types of sensory cues including contextual and elemental stimuli, we tested whether different types of elemental and contextual sensory stimuli depended on the hippocampus to different degrees. In most rats, a surrounding visual background and a tactile stimulus served as contextual (hippocampal dependent) and elemental (non-hippocampal dependent) stimuli, respectively. However, simple tone and light stimuli frequently used as elemental cues in traditional experiments required the hippocampus to varying degrees among rats. Specifically, one group of rats showed a normal contextual bias when both contextual and elemental cues were present. These rats effectively switched to using elemental cues when the hippocampus was inactivated. The other group showed a strong contextual bias (and hippocampal dependence) because these rats were not able to use elemental cues when the hippocampus was unavailable. It is possible that the latter group of rats might have interpreted the elemental cues (light and tone) as background stimuli and depended more on the hippocampus in associating the cues with choice responses. Although the exact mechanisms underlying these individual variances are unclear, our findings recommend caution in adopting a simple sensory stimulus as a non-hippocampal sensory cue based only on the literature.
PMID:24982624
Interobject grouping facilitates visual awareness.
Stein, Timo; Kaiser, Daniel; Peelen, Marius V
2015-01-01
In organizing perception, the human visual system takes advantage of regularities in the visual input to perceptually group related image elements. Simple stimuli that can be perceptually grouped based on physical regularities, for example by forming an illusory contour, have a competitive advantage in entering visual awareness. Here, we show that regularities that arise from the relative positioning of complex, meaningful objects in the visual environment also modulate visual awareness. Using continuous flash suppression, we found that pairs of objects that were positioned according to real-world spatial regularities (e.g., a lamp above a table) accessed awareness more quickly than the same object pairs shown in irregular configurations (e.g., a table above a lamp). This advantage was specific to upright stimuli and abolished by stimulus inversion, meaning that it did not reflect physical stimulus confounds or the grouping of simple image elements. Thus, knowledge of the spatial configuration of objects in the environment shapes the contents of conscious perception.
Time-dependent inhomogeneous jet models for BL Lac objects
NASA Technical Reports Server (NTRS)
Marlowe, A. T.; Urry, C. M.; George, I. M.
1992-01-01
Relativistic beaming can explain many of the observed properties of BL Lac objects (e.g., rapid variability, high polarization, etc.). In particular, the broadband radio through X-ray spectra are well modeled by synchrotron-self Compton emission from an inhomogeneous relativistic jet. We have done a uniform analysis on several BL Lac objects using a simple but plausible inhomogeneous jet model. For all objects, we found that the assumed power-law distribution of the magnetic field and the electron density can be adjusted to match the observed BL Lac spectrum. While such models are typically unconstrained, consideration of spectral variability strongly restricts the allowed parameters, although to date the sampling has generally been too sparse to constrain the current models effectively. We investigate the time evolution of the inhomogeneous jet model for a simple perturbation propagating along the jet. The implications of this time evolution model and its relevance to observed data are discussed.
Simple Perfusion Apparatus (SPA) for Manipulation, Tracking and Study of Oocytes and Embryos
Angione, Stephanie L.; Oulhen, Nathalie; Brayboy, Lynae M.; Tripathi, Anubhav; Wessel, Gary M.
2016-01-01
Objective To develop and implement a device and protocol for oocyte analysis at a single cell level. The device must be capable of high resolution imaging, temperature control, perfusion of media, drugs, sperm, and immunolabeling reagents all at defined flow-rates. Each oocyte and resultant embryo must remain spatially separated and defined. Design Experimental laboratory study Setting University and Academic Center for reproductive medicine. Patients/Animals Women with eggs retrieved for ICSI cycles, adult female FVBN and B6C3F1 mouse strains, sea stars. Intervention Real-time, longitudinal imaging of oocytes following fluorescent labeling, insemination, and viability tests. Main outcome measure(s) Cell and embryo viability, immunolabeling efficiency, live cell endocytosis quantitation, precise metrics of fertilization and embryonic development. Results Single oocytes were longitudinally imaged following significant changes in media, markers, endocytosis quantitation, and development, all with supreme control by microfluidics. Cells remained viable, enclosed, and separate for precision measurements, repeatability, and imaging. Conclusions We engineered a simple device to load, visualize, experiment, and effectively record individual oocytes and embryos, without loss of cells. Prolonged incubation capabilities provide longitudinal studies without need for transfer and potential loss of cells. This simple perfusion apparatus (SPA) provides for careful, precise, and flexible handling of precious samples facilitating clinical in vitro fertilization approaches. PMID:25450296
Validation of Scratching Severity as an Objective Assessment for Itch.
Udkoff, Jeremy; Silverberg, Jonathan I
2018-05-01
There are currently no simple, standardized, objective assessments of itch for clinical trials and practice. We sought to validate and test the severity of scratching as an objective measure of itch (4-point ordinal scale ranging from 0 [not present] to 3 [very prominent] based on the observation of scratching lesions). We performed a prospective outpatient study using questionnaires and evaluations by a dermatologist in adults with atopic dermatitis (n = 261). Severity of scratching best correlated with patient-reported global atopic dermatitis severity (Kendall τ = 0.336, P < 0.0001) and the numeric rating scale of itch in the past 24 hours (τ = 0.266, P = 0.0010) and 3 days (τ = 0.296, P < 0.0001). Severity of scratching showed responsiveness over time. Patients experiencing improvement of scratching severity of 1 point or greater had significantly lower itch based on the numeric rating scale in the past 3 days (Wilcoxon rank sum test, P = 0.0175), 5-D itch scale (P = 0.0146), and Patient-Oriented Eczema Measure scores (P = 0.0146). There was a significant decrease in scratching severity for patients experiencing itch improvement of 4 points or greater in the past 3 days on the numeric rating scale (Fisher exact test, P = 0.0026), Patient-Oriented Eczema Measure (P < 0.0001), and Dermatology Life Quality Index (P = 0.0285). Severity of scratching may be a useful endpoint in clinical trials and practice across the gamut of pruritic disorders. Future studies are needed to validate severity of scratching in other pruritic diseases. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Thinking in Terms of Sensors: Personification of Self as an Object in Physics Problem Solving
ERIC Educational Resources Information Center
Tabor-Morris, A. E.
2015-01-01
How can physics teachers help students develop consistent problem-solving techniques for both simple and complicated physics problems, such as those that encompass objects undergoing multiple forces (mechanical or electrical) as individually portrayed in free-body diagrams and/or phenomena involving multiple objects, such as the Doppler effect…
A Bayesian Model of the Memory Colour Effect.
Witzel, Christoph; Olkkonen, Maria; Gegenfurtner, Karl R
2018-01-01
According to the memory colour effect, the colour of a colour-diagnostic object is not perceived independently of the object itself. Instead, it has been shown through an achromatic adjustment method that colour-diagnostic objects still appear slightly in their typical colour, even when they are colourimetrically grey. Bayesian models provide a promising approach to capture the effect of prior knowledge on colour perception and to link these effects to more general effects of cue integration. Here, we model memory colour effects using prior knowledge about typical colours as priors for the grey adjustments in a Bayesian model. This simple model does not involve any fitting of free parameters. The Bayesian model roughly captured the magnitude of the measured memory colour effect for photographs of objects. To some extent, the model predicted observed differences in memory colour effects across objects. The model could not account for the differences in memory colour effects across different levels of realism in the object images. The Bayesian model provides a particularly simple account of memory colour effects, capturing some of the multiple sources of variation of these effects.
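The parameter-free Bayesian combination described above can be sketched as a reliability-weighted average of a prior (the object's typical colour) and a sensory estimate. This is a minimal illustration, not the authors' implementation; the Gaussian means and variances below are hypothetical.

```python
def bayes_grey_adjustment(prior_mean, prior_var, sensory_mean, sensory_var):
    """Posterior mean of a Gaussian prior (typical object colour) combined
    with a Gaussian likelihood (the sensory estimate) on one chromaticity
    axis. With both distributions fixed, there are no free parameters:
    the posterior is simply the reliability-weighted mean."""
    w_prior = (1.0 / prior_var) / (1.0 / prior_var + 1.0 / sensory_var)
    return w_prior * prior_mean + (1.0 - w_prior) * sensory_mean

# Hypothetical example: a colour-diagnostic object's typical hue (+1.0 on a
# blue-yellow axis) pulls a colourimetrically grey stimulus (0.0) slightly
# towards that hue; the small shift is the predicted memory colour effect.
shift = bayes_grey_adjustment(prior_mean=1.0, prior_var=4.0,
                              sensory_mean=0.0, sensory_var=0.5)
```

A broad prior (large `prior_var`) yields only a small shift away from grey, consistent with the modest effect sizes such studies report.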
A Bayesian Model of the Memory Colour Effect
Olkkonen, Maria; Gegenfurtner, Karl R.
2018-01-01
According to the memory colour effect, the colour of a colour-diagnostic object is not perceived independently of the object itself. Instead, it has been shown through an achromatic adjustment method that colour-diagnostic objects still appear slightly in their typical colour, even when they are colourimetrically grey. Bayesian models provide a promising approach to capture the effect of prior knowledge on colour perception and to link these effects to more general effects of cue integration. Here, we model memory colour effects using prior knowledge about typical colours as priors for the grey adjustments in a Bayesian model. This simple model does not involve any fitting of free parameters. The Bayesian model roughly captured the magnitude of the measured memory colour effect for photographs of objects. To some extent, the model predicted observed differences in memory colour effects across objects. The model could not account for the differences in memory colour effects across different levels of realism in the object images. The Bayesian model provides a particularly simple account of memory colour effects, capturing some of the multiple sources of variation of these effects. PMID:29760874
Lomber, S G; Payne, B R; Cornwell, P
1996-01-01
Extrastriate visual cortex of the ventral-posterior suprasylvian gyrus (vPS cortex) of freely behaving cats was reversibly deactivated with cooling to determine its role in performance on a battery of simple or masked two-dimensional pattern discriminations, and three-dimensional object discriminations. Deactivation of vPS cortex by cooling profoundly impaired the ability of the cats to recall the difference between all previously learned pattern and object discriminations. However, the cats' ability to learn or relearn pattern and object discriminations while vPS was deactivated depended upon the nature of the pattern or object and the cats' prior level of exposure to them. During cooling of vPS cortex, the cats could neither learn the novel object discriminations nor relearn a highly familiar masked or partially occluded pattern discrimination, although they could relearn both the highly familiar object and simple pattern discriminations. These cooling-induced deficits resemble those induced by cooling of the topologically equivalent inferotemporal cortex of monkeys and provide evidence that the equivalent regions contribute to visual processing in similar ways. PMID:8643686
Monostatic Radar Cross Section Estimation of Missile Shaped Object Using Physical Optics Method
NASA Astrophysics Data System (ADS)
Sasi Bhushana Rao, G.; Nambari, Swathi; Kota, Srikanth; Ranga Rao, K. S.
2017-08-01
Stealth technology manages many signatures for a target, of which most radar systems use the radar cross section (RCS) for discriminating and classifying targets with regard to stealth. In wartime, a target's RCS must be very small to render it invisible to enemy radar. In this study, the radar cross section of perfectly conducting objects such as a cylinder, a truncated cone (frustum), and a circular flat plate is estimated with respect to parameters such as size, frequency, and aspect angle. Because exactly predicting the RCS is difficult, approximate methods become the alternative. The majority of approximate methods are valid in the optical region, where each has its own strengths and weaknesses. The analysis given in this study is therefore based purely on far-field monostatic RCS measurements in the optical region. Computation is done using the Physical Optics (PO) method for determining the RCS of simple models. Not only the RCS of the simple models but also that of missile-shaped and rocket-shaped models, obtained by cascading the simple objects, has been computed with backscatter using Matlab simulation. Rectangular plots of RCS in dBsm versus aspect angle are obtained for the simple and missile-shaped objects. The treatment of RCS in this study is based on narrowband operation.
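For orientation, the optical-region PO result for one of the simple shapes can be sketched in Python. The formula below is the standard normal-incidence flat-plate expression, not code from the study, and the radius and frequency values are arbitrary.

```python
import math

def rcs_flat_plate_normal(radius_m, freq_hz):
    """Monostatic RCS (m^2) of a perfectly conducting circular flat plate
    at normal incidence: sigma = 4*pi*A^2 / lambda^2, with A = pi*r^2.
    Valid only in the optical region (plate large compared to wavelength)."""
    lam = 3e8 / freq_hz               # wavelength in metres
    area = math.pi * radius_m ** 2
    return 4.0 * math.pi * area ** 2 / lam ** 2

def to_dbsm(sigma_m2):
    """Convert an RCS in m^2 to dBsm, the unit used in the plots."""
    return 10.0 * math.log10(sigma_m2)

# A 10 cm radius plate at 10 GHz (3 cm wavelength) sits well inside the
# optical region, so the PO estimate is reasonable here.
sigma = rcs_flat_plate_normal(0.1, 10e9)
```

The strong 1/lambda^2 dependence is why the same plate looks far larger to an X-band radar than to a VHF one.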
SU-E-J-197: Investigation of Microsoft Kinect 2.0 Depth Resolution for Patient Motion Tracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silverstein, E; Snyder, M
2015-06-15
Purpose: Investigate the use of the Kinect 2.0 for patient motion tracking during radiotherapy by studying spatial and depth resolution capabilities. Methods: Using code written in C#, depth map data was extracted from the Kinect to create an initial depth map template indicative of the initial position of an object, to be compared to the depth map of the object over time. To test this process, a simple setup was created in which two objects were imaged: a 40 cm × 40 cm board covered in non-reflective material and a 15 cm × 26 cm textbook with a slightly reflective, glossy cover. Each object, imaged and measured separately, was placed on a movable platform with the object-to-camera distance measured. The object was then moved a specified amount to ascertain whether the Kinect's depth camera would visualize the difference in position of the object. Results: Initial investigations have shown the Kinect depth resolution is dependent on the object-to-camera distance. Measurements indicate that movements as small as 1 mm can be visualized for objects as close as 50 cm away. This depth resolution decreases linearly with object-to-camera distance; at 4 m, the minimum observable movement had grown to 1 cm. Conclusion: The improved resolution and advanced hardware of the Kinect 2.0 allow for increased depth resolution over the Kinect 1.0. Although the depth resolution is expected to decrease with increasing distance from an object, given the decrease in the number of pixels representing said object, the depth resolution at large distances indicates its usefulness in a clinical setting.
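The reported linear fall-off in depth resolution can be captured by a two-point linear model. The endpoints (1 mm at 0.5 m, 10 mm at 4 m) come from the abstract; the linear interpolation between them is our assumption, following the abstract's "decreases linearly" claim.

```python
def depth_resolution_mm(distance_m):
    """Minimum observable movement (mm) as a function of object-to-camera
    distance, linearly interpolated between the two reported data points:
    ~1 mm at 0.5 m and ~10 mm at 4 m."""
    slope = (10.0 - 1.0) / (4.0 - 0.5)   # mm of resolution lost per metre
    return 1.0 + slope * (distance_m - 0.5)
```

At a typical 2 m camera-to-couch distance this predicts roughly 5 mm resolution, which bounds the motion amplitudes such a system could plausibly track in a clinic.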
Deeny, Sean; Chicoine, Caitlin; Hargrove, Levi; Parrish, Todd; Jayaraman, Arun
2014-01-01
Common goals in the development of human-machine interface (HMI) technology are to reduce cognitive workload and increase function. However, objective and quantitative outcome measures assessing cognitive workload have not been standardized for HMI research. The present study examines the efficacy of a simple event-related potential (ERP) measure of cortical effort during myoelectric control of a virtual limb for use as an outcome tool. Participants trained and tested on two methods of control, direct control (DC) and pattern recognition control (PRC), while electroencephalographic (EEG) activity was recorded. Eighteen healthy participants with intact limbs were tested using DC and PRC under three conditions: passive viewing, easy, and hard. Novel auditory probes were presented at random intervals during testing, and significant task-difficulty effects were observed in the P200, P300, and a late positive potential (LPP), supporting the efficacy of ERPs as a cognitive workload measure in HMI tasks. LPP amplitude distinguished DC from PRC in the hard condition with higher amplitude in PRC, consistent with lower cognitive workload in PRC relative to DC for complex movements. Participants completed trials faster in the easy condition using DC relative to PRC, but completed trials more slowly using DC relative to PRC in the hard condition. The results provide promising support for ERPs as an outcome measure for cognitive workload in HMI research such as prosthetics, exoskeletons, and other assistive devices, and can be used to evaluate and guide new technologies for more intuitive HMI control.
Study of chromatic adaptation using memory color matches, Part I: neutral illuminants.
Smet, Kevin A G; Zhai, Qiyan; Luo, Ming R; Hanselaer, Peter
2017-04-03
Twelve corresponding color data sets have been obtained using the long-term memory colors of familiar objects as target stimuli. Data were collected for familiar objects with neutral, red, yellow, green and blue hues under 4 approximately neutral illumination conditions on or near the blackbody locus. The advantages of the memory color matching method are discussed in light of other more traditional asymmetric matching techniques. Results were compared to eight corresponding color data sets available in literature. The corresponding color data was used to test several linear (von Kries, RLAB, etc.) and nonlinear (Hunt & Nayatani) chromatic adaptation transforms (CAT). It was found that a simple two-step von Kries, whereby the degree of adaptation D is optimized to minimize the ΔEu'v' prediction errors, outperformed all other tested models for both memory color and literature corresponding color sets, with prediction errors lower for the memory color sets. The predictive errors were substantially smaller than the standard uncertainty on the average observer and were comparable to what are considered just-noticeable-differences in the CIE u'v' chromaticity diagram, supporting the use of memory color based internal references to study chromatic adaptation mechanisms.
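The core of a von Kries chromatic adaptation transform is a per-cone-channel gain. The sketch below shows one such step with a degree-of-adaptation parameter D, leaving out the study's optimization of D against prediction error; the cone white points used in the example are hypothetical.

```python
def von_kries_gain(w_dest, w_src, D=1.0):
    """Per-channel gain taking a cone response adapted to the source white
    towards the destination white; D in [0, 1] is the degree of adaptation
    (D = 0 leaves the response unchanged, D = 1 adapts completely)."""
    return D * (w_dest / w_src) + (1.0 - D)

def adapt(lms, w_src, w_dest, D=1.0):
    """One von Kries step on an LMS triplet. A two-step transform chains
    two of these steps through an intermediate reference illuminant."""
    return [s * von_kries_gain(wd, ws, D)
            for s, ws, wd in zip(lms, w_src, w_dest)]
```

With D = 1 the source white maps exactly onto the destination white; with D = 0 the transform is the identity, so partial adaptation interpolates between the two.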
The Mere Exposure Effect in the Domain of Haptics
Jakesch, Martina; Carbon, Claus-Christian
2012-01-01
Background Zajonc showed that the attitude towards stimuli that one had been previously exposed to is more positive than towards novel stimuli. This mere exposure effect (MEE) has been tested extensively using various visual stimuli. Research on the MEE is sparse, however, for other sensory modalities. Methodology/Principal Findings We used objects of two material categories (stone and wood) and two complexity levels (simple and complex) to test the influence of exposure frequency (F0 = novel stimuli, F2 = stimuli exposed twice, F10 = stimuli exposed ten times) under two sensory modalities (haptics only and haptics & vision). Effects of exposure frequency were found for high complex stimuli with significantly increasing liking from F0 to F2 and F10, but only for the stone category. Analysis of “Need for Touch” data showed the MEE in participants with high need for touch, which suggests different sensitivity or saturation levels of MEE. Conclusions/Significance These different sensitivity or saturation levels might also reflect the effects of expertise on the haptic evaluation of objects. It seems that haptic and cross-modal MEEs are influenced by factors similar to those in the visual domain, indicating a common cognitive basis. PMID:22347451
A Hexapod Robot to Demonstrate Mesh Walking in a Microgravity Environment
NASA Technical Reports Server (NTRS)
Foor, David C.
2005-01-01
The JPL Micro-Robot Explorer (MRE) Spiderbot is a robot that takes advantage of its small size to perform precision tasks suitable for space applications. The Spiderbot is a legged robot that can traverse harsh terrain otherwise inaccessible to wheeled robots. A team of Spiderbots can network and can exhibit collaborative efforts to successfully complete a set of tasks. The Spiderbot is designed and developed to demonstrate hexapods that can walk on flat surfaces, crawl on meshes, and assemble simple structures. The robot has six legs consisting of two spring-compliant joints and a gripping actuator. A hard-coded set of gaits allows the robot to move smoothly in a zero-gravity environment along the mesh. The primary objective of this project is to create a Spiderbot that traverses a flexible, deployable mesh, for use in space repair. Verification of this task will take place aboard a zero-gravity test flight. The secondary objective of this project is to adapt feedback from the joints to allow the robot to test each arm for a successful grip of the mesh. The end result of this research lends itself to a fault-tolerant robot suitable for a wide variety of space applications.
Stojanoski, Bobby Boge; Niemeier, Matthias
2015-10-01
It is well known that visual expectation and attention modulate object perception. Yet, the mechanisms underlying these top-down influences are not completely understood. Event-related potentials (ERPs) indicate late contributions of expectations to object processing around the P2 or N2. This is true independent of whether people expect objects (vs. no objects) or specific shapes, hence when expectations pertain to complex visual features. However, object perception can also benefit from expecting colour information, which can facilitate figure/ground segregation. Studies on attention to colour show attention-sensitive modulations of the P1, but are limited to simple transient detection paradigms. The aim of the current study was to examine whether expecting simple features (colour information) during challenging object perception tasks produce early or late ERP modulations. We told participants to expect an object defined by predominantly black or white lines that were embedded in random arrays of distractor lines and then asked them to report the object's shape. Performance was better when colour expectations were met. ERPs revealed early and late phases of modulation. An early modulation at the P1/N1 transition arguably reflected earlier stages of object processing. Later modulations, at the P3, could be consistent with decisional processes. These results provide novel insights into feature-specific contributions of visual expectations to object perception.
ERIC Educational Resources Information Center
van der Linden, Wim J.
Latent class models for mastery testing differ from continuum models in that they do not postulate a latent mastery continuum but conceive mastery and non-mastery as two latent classes, each characterized by different probabilities of success. Several researchers use a simple latent class model that is basically a simultaneous application of the…
Schädler, Marc R; Warzybok, Anna; Kollmeier, Birger
2018-01-01
The simulation framework for auditory discrimination experiments (FADE) was adopted and validated to predict the individual speech-in-noise recognition performance of listeners with normal and impaired hearing with and without a given hearing-aid algorithm. FADE uses a simple automatic speech recognizer (ASR) to estimate the lowest achievable speech reception thresholds (SRTs) from simulated speech recognition experiments in an objective way, independent from any empirical reference data. Empirical data from the literature were used to evaluate the model in terms of predicted SRTs and benefits in SRT with the German matrix sentence recognition test when using eight single- and multichannel binaural noise-reduction algorithms. To allow individual predictions of SRTs in binaural conditions, the model was extended with a simple better ear approach and individualized by taking audiograms into account. In a realistic binaural cafeteria condition, FADE explained about 90% of the variance of the empirical SRTs for a group of normal-hearing listeners and predicted the corresponding benefits with a root-mean-square prediction error of 0.6 dB. This highlights the potential of the approach for the objective assessment of benefits in SRT without prior knowledge about the empirical data. The predictions for the group of listeners with impaired hearing explained 75% of the empirical variance, while the individual predictions explained less than 25%. Possibly, additional individual factors should be considered for more accurate predictions with impaired hearing. A competing talker condition clearly showed one limitation of current ASR technology, as the empirical performance with SRTs lower than -20 dB could not be predicted.
Schädler, Marc R.; Warzybok, Anna; Kollmeier, Birger
2018-01-01
The simulation framework for auditory discrimination experiments (FADE) was adopted and validated to predict the individual speech-in-noise recognition performance of listeners with normal and impaired hearing with and without a given hearing-aid algorithm. FADE uses a simple automatic speech recognizer (ASR) to estimate the lowest achievable speech reception thresholds (SRTs) from simulated speech recognition experiments in an objective way, independent from any empirical reference data. Empirical data from the literature were used to evaluate the model in terms of predicted SRTs and benefits in SRT with the German matrix sentence recognition test when using eight single- and multichannel binaural noise-reduction algorithms. To allow individual predictions of SRTs in binaural conditions, the model was extended with a simple better ear approach and individualized by taking audiograms into account. In a realistic binaural cafeteria condition, FADE explained about 90% of the variance of the empirical SRTs for a group of normal-hearing listeners and predicted the corresponding benefits with a root-mean-square prediction error of 0.6 dB. This highlights the potential of the approach for the objective assessment of benefits in SRT without prior knowledge about the empirical data. The predictions for the group of listeners with impaired hearing explained 75% of the empirical variance, while the individual predictions explained less than 25%. Possibly, additional individual factors should be considered for more accurate predictions with impaired hearing. A competing talker condition clearly showed one limitation of current ASR technology, as the empirical performance with SRTs lower than −20 dB could not be predicted. PMID:29692200
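FADE's central output, an SRT, is the SNR at which recognition reaches a criterion rate (50% here). A minimal sketch of that read-out from a simulated psychometric sweep follows; the SNR levels and hit rates are made up for illustration, not taken from the study.

```python
def estimate_srt(snrs, accuracies, target=0.5):
    """Speech reception threshold: the SNR at which recognition first
    reaches the target rate, linearly interpolated between the two
    bracketing points of an ascending simulated SNR sweep."""
    points = list(zip(snrs, accuracies))
    for (s0, a0), (s1, a1) in zip(points, points[1:]):
        if a0 < target <= a1:
            return s0 + (target - a0) * (s1 - s0) / (a1 - a0)
    raise ValueError("target accuracy not bracketed by the sweep")
```

For hypothetical hit rates of 20/60/90% at -10/-5/0 dB SNR, the interpolated SRT is -6.25 dB; a hearing-aid algorithm's benefit is then simply the difference between two such SRTs.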
Paiva, Joana S.; Dias, Duarte
2017-01-01
In recent years, safer and more reliable biometric methods have been developed. Apart from the need for enhanced security, the media and entertainment sectors have also been applying biometrics in the emerging market of user-adaptable objects/systems to make these systems more user-friendly. However, the complexity of some state-of-the-art biometric systems (e.g., iris recognition) or their high false rejection rate (e.g., fingerprint recognition) is neither compatible with the simple hardware architecture required by reduced-size devices nor the new trend of implementing smart objects within the dynamic market of the Internet of Things (IoT). It was recently shown that an individual can be recognized by extracting features from their electrocardiogram (ECG). However, most current ECG-based biometric algorithms are computationally demanding and/or rely on relatively large (several seconds) ECG samples, which are incompatible with the aforementioned application fields. Here, we present a computationally low-cost method (patent pending), including simple mathematical operations, for identifying a person using only three ECG morphology-based characteristics from a single heartbeat. The algorithm was trained/tested using ECG signals of different duration from the Physionet database on more than 60 different training/test datasets. The proposed method achieved maximal averaged accuracy of 97.450% in distinguishing each subject from a ten-subject set and false acceptance and rejection rates (FAR and FRR) of 5.710±1.900% and 3.440±1.980%, respectively, placing Beat-ID in a very competitive position in terms of the FRR/FAR among state-of-the-art methods. Furthermore, the proposed method can identify a person using an average of 1.020 heartbeats. It therefore has FRR/FAR behavior similar to obtaining a fingerprint, yet it is simpler and requires less expensive hardware. 
This method targets low-computational/energy-cost scenarios, such as tiny wearable devices (e.g., a smart object that automatically adapts its configuration to the user). A hardware proof-of-concept implementation is presented as an annex to this paper. PMID:28719614
Paiva, Joana S; Dias, Duarte; Cunha, João P S
2017-01-01
In recent years, safer and more reliable biometric methods have been developed. Apart from the need for enhanced security, the media and entertainment sectors have also been applying biometrics in the emerging market of user-adaptable objects/systems to make these systems more user-friendly. However, the complexity of some state-of-the-art biometric systems (e.g., iris recognition) or their high false rejection rate (e.g., fingerprint recognition) is neither compatible with the simple hardware architecture required by reduced-size devices nor the new trend of implementing smart objects within the dynamic market of the Internet of Things (IoT). It was recently shown that an individual can be recognized by extracting features from their electrocardiogram (ECG). However, most current ECG-based biometric algorithms are computationally demanding and/or rely on relatively large (several seconds) ECG samples, which are incompatible with the aforementioned application fields. Here, we present a computationally low-cost method (patent pending), including simple mathematical operations, for identifying a person using only three ECG morphology-based characteristics from a single heartbeat. The algorithm was trained/tested using ECG signals of different duration from the Physionet database on more than 60 different training/test datasets. The proposed method achieved maximal averaged accuracy of 97.450% in distinguishing each subject from a ten-subject set and false acceptance and rejection rates (FAR and FRR) of 5.710±1.900% and 3.440±1.980%, respectively, placing Beat-ID in a very competitive position in terms of the FRR/FAR among state-of-the-art methods. Furthermore, the proposed method can identify a person using an average of 1.020 heartbeats. It therefore has FRR/FAR behavior similar to obtaining a fingerprint, yet it is simpler and requires less expensive hardware. 
This method targets low-computational/energy-cost scenarios, such as tiny wearable devices (e.g., a smart object that automatically adapts its configuration to the user). A hardware proof-of-concept implementation is presented as an annex to this paper.
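The FAR/FRR figures quoted above are threshold-dependent rates. A minimal sketch of how they are computed from match scores follows; the scores and threshold are hypothetical and stand in for Beat-ID's actual single-beat ECG features.

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """False rejection rate: fraction of genuine match scores falling below
    the acceptance threshold. False acceptance rate: fraction of impostor
    scores at or above it. Raising the threshold trades FAR for FRR."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr
```

Sweeping the threshold over the score range traces the trade-off curve on which operating points like the paper's FAR of ~5.7% and FRR of ~3.4% sit.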
Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng
2015-01-01
Switching between different alternative polyadenylation (APA) sites plays an important role in the fine tuning of gene expression. New technologies for the execution of 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length. In theory, the linear trend test is only effective in detecting these simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all observed switching events that happen between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice. First, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, those complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641
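The distinction drawn above can be made concrete with a toy contingency table: in a complex switching event, usage moves from the outer APA sites to the middle one, so the average 3'-UTR length (what a linear trend test tracks) is unchanged while the independence test fires. The counts and site distances below are invented for illustration, and the chi-square statistic is computed from scratch to keep the sketch dependency-free.

```python
def chi2_independence(table):
    """Pearson chi-square statistic for an R x C contingency table
    (rows = samples, columns = read counts per APA site)."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    return sum((table[i][j] - row[i] * col[j] / total) ** 2
               / (row[i] * col[j] / total)
               for i in range(len(table)) for j in range(len(table[0])))

def mean_utr_length(counts, lengths):
    """Average 3'-UTR length implied by the usage counts of each APA site."""
    return sum(c * l for c, l in zip(counts, lengths)) / sum(counts)

# Complex switching: the outer sites lose reads to the middle site.
a, b = [100, 50, 100], [50, 150, 50]
lengths = [1.0, 2.0, 3.0]            # relative 3'-UTR lengths of the sites
stat = chi2_independence([a, b])      # far above 5.99, the 5% cutoff for df=2
same_mean = mean_utr_length(a, lengths) == mean_utr_length(b, lengths)
```

Because the two samples share the same mean length, a trend test sees nothing here, which is exactly why the authors propose running the independence test first and using the trend test only to separate simple from complex events.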
A simple and novel grading method for retraction and overshoot in Duane retraction syndrome.
Kekunnaya, Ramesh; Moharana, Ruby; Tibrewal, Shailja; Chhablani, Preeti-Patil; Sachdeva, Virender
2016-11-01
Strabismus in Duane retraction syndrome is frequently associated with significant globe retraction and overshoots. However, there is no method to objectively grade retraction and overshoot. Our purpose is to describe a novel and simple objective grading method, which showed excellent agreement. It will help standardise measurements and guide the clinician in deciding on surgery and predicting its outcome. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Global Contrast Based Salient Region Detection.
Cheng, Ming-Ming; Mitra, Niloy J; Huang, Xiaolei; Torr, Philip H S; Hu, Shi-Min
2015-03-01
Automatic estimation of salient object regions across images, without any prior assumption or knowledge of the contents of the corresponding scenes, enhances many computer vision and computer graphics applications. We introduce a regional contrast based salient object detection algorithm, which simultaneously evaluates global contrast differences and spatial weighted coherence scores. The proposed algorithm is simple, efficient, naturally multi-scale, and produces full-resolution, high-quality saliency maps. These saliency maps are further used to initialize a novel iterative version of GrabCut, namely SaliencyCut, for high quality unsupervised salient object segmentation. We extensively evaluated our algorithm using traditional salient object detection datasets, as well as a more challenging Internet image dataset. Our experimental results demonstrate that our algorithm consistently outperforms 15 existing salient object detection and segmentation methods, yielding higher precision and better recall rates. We also show that our algorithm can be used to efficiently extract salient object masks from Internet images, enabling effective sketch-based image retrieval (SBIR) via simple shape comparisons. Despite such noisy internet images, where the saliency regions are ambiguous, our saliency guided image retrieval achieves a superior retrieval rate compared with state-of-the-art SBIR methods, and additionally provides important target object region information.
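The global-contrast idea can be sketched with a 1-D histogram-based variant, in the spirit of the paper's histogram-accelerated contrast but reduced here to quantized greyscale, which is our simplification (the paper works in Lab colour space with regional, spatially weighted contrast).

```python
def histogram_contrast_saliency(pixels, levels=8):
    """Per-level saliency: each quantized grey level's summed distance to
    every pixel's level, weighted by occurrence, then normalized so the
    most distinctive level scores 1.0. Rare levels that differ strongly
    from the bulk of the image come out most salient."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    sal = [sum(hist[j] * abs(i - j) for j in range(levels))
           for i in range(levels)]
    m = max(sal) or 1                 # guard against an all-zero histogram
    return [s / m for s in sal]

# A small bright region on a large dark background is the salient one.
saliency = histogram_contrast_saliency([0] * 90 + [7] * 10)
```

Binning by level rather than looping over pixel pairs is what makes this global contrast computation cheap: cost grows with the number of levels, not the square of the number of pixels.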
Morrow, Sarah A; Menon, Suresh; Rosehart, Heather; Sharma, Manas
2017-02-01
One of the most frequently disabling symptoms in Multiple Sclerosis (MS) is cognitive impairment, which is often insidious in onset and therefore difficult to recognize in the early stages, for both persons with MS and clinicians. A biomarker that would help identify those at risk of cognitive impairment, or with only mild impairment, would be a useful tool for clinicians. Using MRI, already an integral tool in the diagnosis and monitoring of disease activity in MS, would be ideal. Thus, this study aimed to determine if simple measures on routine MRI could serve as potential biomarkers for cognitive impairment in MS. We retrospectively identified 51 persons with MS who had a cognitive assessment and an MRI within six months of each other. Simple linear measurements were made of the hippocampi, bifrontal and third ventricular width, bicaudate width, and the anterior, mid and posterior corpus callosum. Pearson's correlations examined the relationship between these MRI measures and cognitive tests, and MRI measures were compared between persons with MS who were either normal or cognitively impaired on objective cognitive tests using Analysis of Covariance (ANCOVA). Bicaudate span and third ventricular width were negatively correlated, while corpus callosal measures were positively correlated, with cognitive test performance. After controlling for potential confounders, bicaudate span was significantly different on measures of immediate recall. Both anterior and posterior corpus callosal measures were significantly different on measures of verbal fluency, immediate recall and higher executive function, while the anterior corpus callosum was also significantly different on processing speed. The middle corpus callosal measure was significantly different on immediate recall and higher executive function. This study presents data demonstrating that simple-to-apply MRI measures of atrophy may serve as biomarkers for cognitive impairment in persons with MS.
Further prospective studies are needed to validate these findings. Copyright © 2016 Elsevier B.V. All rights reserved.
Deng, Huiqiong; Durfee, William K.; Nuckley, David J.; Rheude, Brandon S.; Severson, Amy E.; Skluzacek, Katie M.; Spindler, Kristen K.; Davey, Cynthia S.
2012-01-01
Background Telerehabilitation allows rehabilitative training to continue remotely after discharge from acute care and can include complex tasks known to create rich conditions for neural change. Objectives The purposes of this study were: (1) to explore the feasibility of using telerehabilitation to improve ankle dorsiflexion during the swing phase of gait in people with stroke and (2) to compare complex versus simple movements of the ankle in promoting behavioral change and brain reorganization. Design This study was a pilot randomized controlled trial. Setting Training was done in the participant's home. Testing was done in separate research labs involving functional magnetic resonance imaging (fMRI) and multi-camera gait analysis. Patients Sixteen participants with chronic stroke and impaired ankle dorsiflexion were assigned randomly to receive 4 weeks of telerehabilitation of the paretic ankle. Intervention Participants received either computerized complex movement training (track group) or simple movement training (move group). Measurements Behavioral changes were measured with the 10-m walk test and gait analysis using a motion capture system. Brain reorganization was measured with ankle tracking during fMRI. Results Dorsiflexion during gait was significantly larger in the track group compared with the move group. For fMRI, although the volume, percent volume, and intensity of cortical activation failed to show significant changes, the frequency count of the number of participants showing an increase versus a decrease in these values from pretest to posttest measurements was significantly different between the 2 groups, with the track group decreasing and the move group increasing. Limitations Limitations of this study were that no follow-up test was conducted and that a small sample size was used. 
Conclusions The results suggest that telerehabilitation, emphasizing complex task training with the paretic limb, is feasible and can be effective in promoting further dorsiflexion in people with chronic stroke. PMID:22095209
NASA Astrophysics Data System (ADS)
Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J.; Savenije, H. H. G.; Gascuel-Odoux, C.
2014-09-01
Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus, ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study, the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by four calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce a suite of hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by "prior constraints," inferred from expert knowledge to ensure a model which behaves well with respect to the modeler's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model setup exhibited increased performance in the independent test period and skill to better reproduce all tested signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if counter-balanced by prior constraints, can significantly increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge-driven strategy of constraining models.
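Signature-based evaluation of the kind described above is simple to sketch in code. The following minimal example is our own illustration, computing three commonly used hydrological signatures (runoff ratio, flow duration curve slope, baseflow index) from daily discharge and precipitation; the study's exact signature suite and constraints are not given in the abstract.

```python
import numpy as np

def signatures(q, p):
    """A few common hydrological signatures from daily discharge q and
    precipitation p (illustrative choices, not the study's exact suite)."""
    q, p = np.asarray(q, float), np.asarray(p, float)
    runoff_ratio = q.sum() / p.sum()
    # Slope of the flow duration curve between the 33% and 66% exceedance
    # flows, in log space -- a measure of flow variability
    q33, q66 = np.percentile(q, [66, 33])
    fdc_slope = (np.log(q33) - np.log(q66)) / (0.66 - 0.33)
    # Baseflow index from the standard one-parameter Lyne-Hollick-style
    # low-pass filter, with baseflow constrained not to exceed total flow
    bf = np.zeros_like(q)
    bf[0] = q[0]
    a = 0.925
    for t in range(1, len(q)):
        bf[t] = min(q[t], a * bf[t - 1] + 0.5 * (1 - a) * (q[t] + q[t - 1]))
    bfi = bf.sum() / q.sum()
    return runoff_ratio, fdc_slope, bfi
```

Signatures like these can then serve as independent checks on a calibrated model, alongside the hydrograph-based objective functions.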
ERIC Educational Resources Information Center
Young, Timothy; Guy, Mark
2011-01-01
Students have a difficult time understanding force, especially when dealing with a moving object. Many forces can be acting on an object at the same time, causing it to stay in one place or move. By directly observing these forces, students can better understand the effect these forces have on an object. With a simple, student-built device called…
ERIC Educational Resources Information Center
de Oliveira, Clara Amelia; Conte, Marcos Fernando; Riso, Bernardo Goncalves
This work presents a Teaching/Learning proposal on Object Oriented Programming for entry-level Engineering and Computer Science courses at the university. The philosophy of Object Oriented Programming comes as a new pattern of solution for problems, where flexibility and reusability appear over the simple data structure and sequential…
Permeability Testing of Impacted Composite Laminates for Use on Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Nettles, A. T.
2001-01-01
Since composite laminates are beginning to be identified for use in reusable launch vehicle propulsion systems, an understanding of their permeance is needed. A foreign object impact event can cause a localized area of permeability (leakage) in a polymer matrix composite, and it is the aim of this study to assess a method of quantifying permeability-after-impact results. A simple test apparatus is presented, and variables that could affect the measured values of permeability-after-impact were assessed. Once it was determined that valid numbers were being measured, a fiber/resin system was impacted at various impact levels and the resulting permeability measured, first with a leak check solution (qualitative) then using the new apparatus (quantitative). The results showed that as the impact level increased, so did the measured leakage. As the pressure to the specimen was increased, the leak rate was seen to increase in a nonlinear fashion for almost all the specimens tested.
AN OBJECTIVE CLIMATOLOGY OF CAROLINA COASTAL FRONTS
This study describes a simple objective method to identify cases of coastal frontogenesis offshore of the Carolinas and to characterize the sensible weather associated with frontal passage at measurement sites near the coast. The identification method, based on surface hourly d...
Characterization of Louisiana asphalt mixtures using simple performance tests and MEPDG.
DOT National Transportation Integrated Search
2014-04-01
The National Cooperative Highway Research Program (NCHRP) Project 9-19, Superpave Support and Performance Models Management, recommended three Simple Performance Tests (SPTs) to complement the Superpave volumetric mixture design method. These are...
An Inexpensive and Simple Method to Demonstrate Soil Water and Nutrient Flow
ERIC Educational Resources Information Center
Nichols, K. A.; Samson-Liebig, S.
2011-01-01
Soil quality, soil health, and soil sustainability are concepts that are being widely used but are difficult to define and illustrate, especially to a non-technical audience. The objectives of this manuscript were to develop simple and inexpensive methodologies to both qualitatively and quantitatively estimate water infiltration rates (IR),…
A simple, gravimetric method to quantify inorganic carbon in calcareous soils
USDA-ARS?s Scientific Manuscript database
Total carbon (TC) in calcareous soils has two components: inorganic carbon (IC) as calcite and/or dolomite and organic carbon (OC) in the soil organic matter. The IC must be measured and subtracted from TC to obtain OC. Our objective was to develop a simple gravimetric technique to quantify IC. Th...
Simulated Holograms: A Simple Introduction to Holography.
ERIC Educational Resources Information Center
Dittmann, H.; Schneider, W. B.
1992-01-01
Describes a project that uses a computer and a dot matrix printer to simulate the holographic recording process of simple object structures. The process' four steps are (1) superposition of waves; (2) representing the superposition of a plane reference wave on the monitor screen; (3) photographic reduction of the images; and (4) reconstruction of…
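Step (1), the superposition of waves, is straightforward to simulate. The sketch below is our own generic illustration (the geometry, wavelength, and array sizes are assumed, not taken from the article): it records the interference of the spherical wave from a single point object with an on-axis plane reference wave on a virtual film plane, producing the familiar zone-plate fringe pattern.

```python
import numpy as np

# Interference of a point object's spherical wave with a plane reference
# wave on a virtual "film" plane; all values are illustrative assumptions.
wavelength = 0.5e-6                           # 500 nm light
k = 2 * np.pi / wavelength
n = 512
xs = np.linspace(-1e-3, 1e-3, n)              # 2 mm x 2 mm film plane
X, Y = np.meshgrid(xs, xs)
z0 = 0.1                                      # point object 10 cm behind the film
r = np.sqrt(X**2 + Y**2 + z0**2)              # object-to-film distances
object_wave = (z0 / r) * np.exp(1j * k * r)   # spherical wave, amplitude ~1
reference_wave = np.exp(1j * k * z0)          # on-axis plane wave (constant phase here)
intensity = np.abs(object_wave + reference_wave) ** 2   # recorded fringe pattern
```

Rendering `intensity` as dots versus spaces on a printer reproduces the circular fringes of the simulated hologram.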
The Real-Time ObjectAgent Software Architecture for Distributed Satellite Systems
2001-01-01
real-time operating system selection are also discussed. The fourth section describes a simple demonstration of real-time ObjectAgent. Finally, the... experience with C++. After selecting the programming language, it was necessary to select a target real-time operating system (RTOS) and embedded... ObjectAgent software to run on the OSE Real Time Operating System. In addition, she is responsible for the integration of ObjectAgent
Distributed Computerized Catalog System
NASA Technical Reports Server (NTRS)
Borgen, Richard L.; Wagner, David A.
1995-01-01
DarkStar Distributed Catalog System describes arbitrary data objects in a unified manner, providing end users with a versatile, yet simple search mechanism for locating and identifying objects. Provides built-in generic and dynamic graphical user interfaces. Design of system avoids some of the problems of standard DBMS, and system provides more flexibility than do conventional relational or object-oriented data bases. Data-collection lattice is a partly hierarchical representation of relationships among collections, subcollections, and data objects.
Bock, Eduardo; Antunes, Pedro; Leao, Tarcisio; Uebelhart, Beatriz; Fonseca, Jeison; Leme, Juliana; Utiyama, Bruno; da Silva, Cibele; Cavalheiro, Andre; Filho, Diolino Santos; Dinkhuysen, Jarbas; Biscegli, Jose; Andrade, Aron; Arruda, Celso
2011-05-01
An implantable centrifugal blood pump has been developed with original features for a left ventricular assist device. This pump is part of a multicenter and international study with the objective to offer simple, affordable, and reliable devices to developing countries. Previous computational fluid dynamics investigations and wear evaluation in bearing system were performed followed by prototyping and in vitro tests. In addition, previous blood tests for assessment of normalized index of hemolysis show results of 0.0054±2.46 × 10⁻³ mg/100 L. An electromechanical actuator was tested in order to define the best motor topology and controller configuration. Three different topologies of brushless direct current motor (BLDCM) were analyzed. An electronic driver was tested in different situations, and the BLDCM had its mechanical properties tested in a dynamometer. Prior to evaluation of performance during in vivo animal studies, anatomical studies were necessary to achieve the best configuration and cannulation for left ventricular assistance. The results were considered satisfactory, and the next step is to test the performance of the device in vivo. © 2011, Copyright the Authors. Artificial Organs © 2011, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
Mathematical Modelling of the Infusion Test
NASA Astrophysics Data System (ADS)
Cieslicki, Krzysztof
2007-01-01
The objective of this paper was to improve the Marmarou model for intracranial volume-pressure compensation, well established in clinical practice, by adding pulsatile components. It was demonstrated that the complicated pulsation and growth of intracranial pressure during an infusion test could be successfully modeled by the relatively simple analytical expression derived in this paper. The CSF dynamics were tested in 25 patients with clinical symptoms of hydrocephalus. Based on the frequency spectrum of the patient's baseline pressure and the identified parameters of CSF dynamics, an "ideal" infusion test curve free from artefacts and slow waves was simulated for each patient. The degree of correlation between the simulated and real curves obtained from clinical observations gave insight into the adequacy of the assumptions of the Marmarou model. The proposed method of infusion test analysis determines more precisely the value of the reference pressure, which is usually treated as secondary and of uncertain significance. The properly identified value of the reference pressure determines the degree of pulsation amplitude growth during the infusion test, as well as the value of the elastance coefficient. Artificially generated tests with various pulsation components were also used to examine the correctness of the algorithm for identifying the original Marmarou model parameters.
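For orientation, the Marmarou-type single-compartment model being extended here is conventionally written as follows; this is a sketch using the standard textbook symbols, which are not given in the abstract.

```latex
% Compliance inversely proportional to pressure above the reference pressure p_0:
\[
  C(p) \;=\; \frac{1}{E\,(p - p_0)},
  \qquad
  C(p)\,\frac{dp}{dt} \;=\; I_{\mathrm{inf}} + \frac{p_b - p}{R},
\]
% where E is the elastance coefficient, R the CSF outflow resistance,
% I_inf the infusion rate and p_b the baseline pressure. At equilibrium
% (dp/dt = 0) the pressure plateaus at p_b + I_inf R; the reference
% pressure p_0 sets the compliance, and hence how strongly pulsatile
% components are amplified as p rises during the infusion test.
```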
Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models
NASA Astrophysics Data System (ADS)
Dickes, Amanda Catherine; Sengupta, Pratim
2013-06-01
In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem, through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these agents obey simple rules assigned or manipulated by the user (e.g., speeding up, slowing down, etc.). It is the interactions between these agents, based on the rules assigned by the user, that give rise to emergent, aggregate-level behavior (e.g., formation and movement of a traffic jam). Natural selection is such an emergent phenomenon, which has been shown to be challenging for novices (K-16 students) to understand. Whereas prior research on learning evolutionary phenomena with MABMs has typically focused on high school students and beyond, we investigate how elementary students (4th graders) develop multi-level explanations of some introductory aspects of natural selection—species differentiation and population change—through scaffolded interactions with an MABM that simulates predator-prey dynamics in a simple birds-butterflies ecosystem. We conducted a semi-clinical interview-based study with ten participants, in which we focused on the following: a) identifying the nature of learners' initial interpretations of salient events or elements of the represented phenomena, b) identifying the roles these interpretations play in the development of their multi-level explanations, and c) how attending to different levels of the relevant phenomena can make explicit different mechanisms to the learners. In addition, our analysis also shows that although there were differences between high- and low-performing students (in terms of being able to explain population-level behaviors) in the pre-test, these differences disappeared in the post-test.
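The agent-rules-to-aggregate-behavior idea can be illustrated in a few lines of code. The sketch below is our own minimal stand-in: every rule and parameter is invented for illustration and is not taken from the study's birds-butterflies model. Each butterfly agent carries a camouflage trait; a simple predation rule removes poorly camouflaged agents, and the population-level trait distribution shifts as an emergent outcome.

```python
import random

# Minimal agent-based sketch of selection in a birds-butterflies ecosystem.
# All rules and parameters here are illustrative assumptions.
rng = random.Random(42)
butterflies = [rng.random() for _ in range(200)]   # camouflage trait per agent

def generation(pop):
    survivors = [c for c in pop if rng.random() < c]          # predation rule
    offspring = [min(1.0, max(0.0, c + rng.gauss(0, 0.05)))   # inherit + vary
                 for c in survivors for _ in range(2)]
    return offspring[:200]                                    # carrying capacity

for _ in range(30):
    butterflies = generation(butterflies)
mean_camouflage = sum(butterflies) / len(butterflies)
# Aggregate-level outcome: mean camouflage rises well above the initial ~0.5,
# even though no individual rule mentions the population at all.
```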
Perney, Pascal; Lehert, Philippe; Mason, Barbara J
2012-01-01
Sleep disturbance symptoms (SDS) are commonly reported in alcoholic patients. Polysomnography studies suggested that acamprosate decreased SDS. We assessed this hypothesis using data from a randomized controlled trial. As a secondary objective, we proposed and tested the validity of a simple measurement of SDS based on a subset of the Hamilton depression and anxiety inventories. We re-analysed a multi-center study evaluating the efficacy of acamprosate compared with placebo in alcohol-dependent patients, concentrating on SDS change over time. The Sleep sum score index (SAEI) was built from check-lists of adverse effects reported at each visit and constituted our main endpoint. We also tested the validity of the short sleep index (SSI) defined by the four sleep items of the Hamilton depression and anxiety scales. Statistical analyses were conducted on an intention-to-treat basis. A total of 592 patients were included, and 292 completed the 6-month trial. Compared with the SAEI, considered as our reference, the observed specificity and sensitivity of the SSI were 91.6 and 87.6%. The proportion of patients experiencing SDS decreased from 40.2% at baseline to 26.1% at M6 in the placebo group and 19.5% in the acamprosate group (relative risk placebo/acamprosate = 1.49, 95% confidence interval 1.10, 1.98, P = 0.04). Treating alcoholic patients to enhance abstinence has a beneficial effect in reducing SDS, and the duration of abstinence during the treatment constitutes the main positive factor. An additional effect of acamprosate is conjectured from its effect on glutamatergic tone. The SSI constitutes a simple, reasonably sensitive and specific instrument to measure SDS.
NASA Astrophysics Data System (ADS)
Rodriguez Gonzalez, Beatriz
2008-04-01
Much of the homotopical and homological structure of the categories of chain complexes and topological spaces can be deduced from the existence and properties of the 'simple' functors Tot : {double chain complexes} -> {chain complexes} and geometric realization : {sSets} -> {Top}, or similarly, Tot : {simplicial chain complexes} -> {chain complexes} and | | : {sTop} -> {Top}. The purpose of this thesis is to abstract this situation, and to this end we introduce the notion of '(co)simplicial descent category'. It is inspired by Guillen-Navarro's '(cubical) descent categories'. The key ingredients in a (co)simplicial descent category D are a class E of morphisms in D, called equivalences, and a 'simple' functor s : {(co)simplicial objects in D} -> D. They must satisfy axioms like 'Eilenberg-Zilber', 'exactness' and 'acyclicity'. This notion covers a wide class of examples, such as chain complexes, sSets, topological spaces, filtered cochain complexes (where E = filtered quasi-isomorphisms or E = E_2-isomorphisms), commutative differential graded algebras (with s = Navarro's Thom-Whitney simple), DG-modules over a DG-category and mixed Hodge complexes, where s = Deligne's simple. From the simplicial descent structure we obtain homotopical structure on D, such as cone and cylinder objects. We use them to i) explicitly describe the morphisms of HoD = D[E^{-1}] similarly to the case of calculus of fractions; ii) endow HoD with a non-additive pre-triangulated structure, that becomes triangulated in the stable additive case. These results use the properties of a 'total functor', which associates to any biaugmented bisimplicial object a simplicial object. It is the simplicial analogue of the total chain complex of a double complex, and it is left adjoint to Illusie's 'décalage' functor.
Takahama, Sachiko; Saiki, Jun
2014-01-01
Information on an object's features bound to its location is very important for maintaining object representations in visual working memory. Interactions with dynamic multi-dimensional objects in an external environment require complex cognitive control, including the selective maintenance of feature-location binding. Here, we used event-related functional magnetic resonance imaging to investigate brain activity and functional connectivity related to the maintenance of complex feature-location binding. Participants were required to detect task-relevant changes in feature-location binding between objects defined by color, orientation, and location. We compared a complex binding task requiring complex feature-location binding (color-orientation-location) with a simple binding task in which simple feature-location binding, such as color-location, was task-relevant and the other feature was task-irrelevant. Univariate analyses showed that the dorsolateral prefrontal cortex (DLPFC), hippocampus, and frontoparietal network were activated during the maintenance of complex feature-location binding. Functional connectivity analyses indicated cooperation between the inferior precentral sulcus (infPreCS), DLPFC, and hippocampus during the maintenance of complex feature-location binding. In contrast, the connectivity for the spatial updating of simple feature-location binding determined by reanalyzing the data from Takahama et al. (2010) demonstrated that the superior parietal lobule (SPL) cooperated with the DLPFC and hippocampus. These results suggest that the connectivity for complex feature-location binding does not simply reflect general memory load and that the DLPFC and hippocampus flexibly modulate the dorsal frontoparietal network, depending on the task requirements, with the infPreCS involved in the maintenance of complex feature-location binding and the SPL involved in the spatial updating of simple feature-location binding. PMID:24917833
The effectiveness of immediate feedback during the objective structured clinical examination.
Hodder, R V; Rivington, R N; Calcutt, L E; Hart, I R
1989-03-01
Using eight different physical examination or technical stations, 400 examinations were conducted to evaluate the effectiveness of immediate feedback during the Objective Structured Clinical Examination (OSCE). The test group comprised 50 medical students who underwent a standard 4-minute examination followed by 2 minutes of feedback. Immediately following feedback the students repeated an identical 4-minute examination scored by the same examiners. The control group consisted of 50 students from the same class who underwent an identical testing sequence, but instead of receiving feedback, they were instructed to continue their examinations for an additional 2 minutes before repeating the stations. Simple repetition of the task did not significantly improve score (mean increase 2.0%, NS). Extending the testing period from 4 to 6 minutes resulted in a small but significant increase in score (mean 6.7%, P < 0.001). However, there was a much larger increase in the scores obtained following 2 minutes of immediate feedback compared to pre-feedback performance (mean 26.3%, P < 0.0001). The majority of students and examiners felt that feedback, as administered in this study, was valuable both as a learning and teaching experience. Short periods of immediate feedback during an OSCE are practical and can improve competency in the performance of criterion-based tasks, at least over the short term. In addition, such feedback provides students with valuable self-assessment that may stimulate further learning.
Is simple nephrectomy truly simple? Comparison with the radical alternative.
Connolly, S S; O'Brien, M Frank; Kunni, I M; Phelan, E; Conroy, R; Thornhill, J A; Grainger, R
2011-03-01
The Oxford English dictionary defines the term "simple" as "easily done" and "uncomplicated". We tested the validity of this terminology in relation to open nephrectomy surgery. Retrospective review of 215 patients undergoing open, simple (n = 89) or radical (n = 126) nephrectomy in a single university-affiliated institution between 1998 and 2002. Operative time (OT), estimated blood loss (EBL), operative complications (OC) and length of stay in hospital (LOS) were analysed. Statistical analysis employed Fisher's exact test and Stata Release 8.2. Simple nephrectomy was associated with shorter OT (mean 126 vs. 144 min; p = 0.002), reduced EBL (mean 729 vs. 859 cc; p = 0.472), lower OC (9 vs. 17%; p = 0.087), and shorter LOS (mean 6 vs. 8 days; p < 0.001). All parameters suggest a favourable outcome for the simple nephrectomy group, supporting the use of this terminology. This implies "simple" nephrectomies are truly easier to perform with fewer complications than their radical counterpart.
Development of specification for the superpave simple performance tests (SPT).
DOT National Transportation Integrated Search
2009-05-16
This report describes the development and establishment of a proposed Simple Performance Test (SPT) specification in order to contribute to asphalt materials technology in the state of Michigan. The properties and characteristics of materials,...
Health Information in Bengali (Bangla / বাংলা)
American Cancer Society Guidelines for the Early Detection of Cancer (English PDF); It's a Simple Test - Cervical Cancer Screening (English PDF)
Improved multi-objective ant colony optimization algorithm and its application in complex reasoning
NASA Astrophysics Data System (ADS)
Wang, Xinqing; Zhao, Yang; Wang, Dong; Zhu, Huijie; Zhang, Qing
2013-09-01
The problem of fault reasoning has aroused great concern in scientific and engineering fields. However, fault investigation and reasoning for a complex system is not a simple reasoning decision-making problem. It has become a typical multi-constraint and multi-objective reticulate optimization decision-making problem under many influencing factors and constraints. So far, little research has been carried out in this field. This paper transforms the fault reasoning problem of a complex system into a path-searching problem from known symptoms to fault causes. Three optimization objectives are considered simultaneously: maximum average fault probability, maximum average importance, and minimum average complexity of test. Under the constraints of both known symptoms and the causal relationships among different components, a multi-objective optimization mathematical model is set up, taking the minimization of fault reasoning cost as the objective function. Since the problem is non-deterministic polynomial-hard (NP-hard), a modified multi-objective ant colony algorithm is proposed, in which a reachability matrix is set up to constrain the feasible search nodes of the ants, and a new pseudo-random-proportional rule and a pheromone adjustment mechanism are constructed to balance conflicts between the optimization objectives. At last, a Pareto optimal set is acquired. Evaluation functions based on the validity and tendency of reasoning paths are defined to optimize the noninferior set, through which the final fault causes can be identified according to decision-making demands, thus realizing fault reasoning for the multi-constraint and multi-objective complex system.
Reasoning results demonstrate that the improved multi-objective ant colony optimization (IMACO) can realize reasoning and locate fault positions precisely by solving the multi-objective fault diagnosis model, which provides a new method for multi-constraint and multi-objective fault diagnosis and reasoning of complex systems.
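The pseudo-random-proportional transition rule mentioned in this abstract follows the well-known ant colony system form. The sketch below is our own generic illustration (the pheromone and heuristic tables, parameter values, and function names are hypothetical, not the paper's): with probability q0 the ant exploits the best edge, otherwise it samples proportionally to pheromone and heuristic strength, restricted to the feasible successors given by a reachability matrix.

```python
import random

def choose_next(current, feasible, tau, eta, alpha=1.0, beta=2.0, q0=0.9, rng=random):
    """Pseudo-random-proportional state transition (ACS-style sketch):
    with probability q0 exploit the best edge, otherwise sample a node
    proportionally to tau^alpha * eta^beta. `feasible` is the set of nodes
    reachable from `current` (e.g. read off a reachability matrix)."""
    weights = {j: (tau[current][j] ** alpha) * (eta[current][j] ** beta)
               for j in feasible}
    if rng.random() < q0:
        return max(weights, key=weights.get)          # exploitation
    total = sum(weights.values())
    r, acc = rng.random() * total, 0.0
    for j, w in weights.items():                      # biased exploration
        acc += w
        if acc >= r:
            return j
    return j
```

Restricting `feasible` to nodes marked reachable keeps every constructed path consistent with the known symptom-to-cause causal structure.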
Simple Screening Test for Exercise-Induced Bronchospasm in the Middle School Athlete
ERIC Educational Resources Information Center
Weiss, Tyler J.; Baker, Rachel H.; Weiss, Jason B.; Weiss, Michelle M.
2013-01-01
This article recommends and provides results from a simple screening test that could be incorporated into a standardized school evaluation for all children participating in sports and physical education classes. The test can be employed by physical educators utilizing their own gym to identify children who demonstrate signs of exercise-induced…
A Simple Hypnotic Approach to Treat Test Anxiety in Medical Students and Residents.
ERIC Educational Resources Information Center
Hebert, Stephen W.
1984-01-01
A simple hypnotic procedure to treat test anxiety is described that was used successfully with medical students and residents at the Wake Forest University Medical Center. A light trance is obtained and then the student is told to take such a hypnotic "journey" the evening prior to the test. (MLW)
Automated optical testing of LWIR objective lenses using focal plane array sensors
NASA Astrophysics Data System (ADS)
Winters, Daniel; Erichsen, Patrik; Domagalski, Christian; Peter, Frank; Heinisch, Josef; Dumitrescu, Eugen
2012-10-01
The image quality of today's state-of-the-art IR objective lenses is constantly improving, while at the same time the market for thermography and vision grows strongly. Because of increasing demands on the quality of IR optics and increasing production volumes, the standards for image quality testing rise and tests need to be performed in shorter time. Most high-precision MTF testing equipment for the IR spectral bands in use today relies on the scanning slit method, which scans a 1D detector over a pattern in the image generated by the lens under test, followed by image analysis to extract performance parameters. The disadvantages of this approach are that it is relatively slow, it requires highly trained operators for aligning the sample, and the number of parameters that can be extracted is limited. In this paper we present lessons learned from the R&D process of using focal plane array (FPA) sensors for testing long-wave IR (LWIR, 8-12 μm) optics. Factors that need to be taken into account when switching from scanning slit to FPAs include: the thermal background from the environment, the low scene contrast in the LWIR, the need for advanced image processing algorithms to pre-process camera images for analysis, and camera artifacts. Finally, we discuss two measurement systems for LWIR lens characterization that we recently developed with different target applications: 1) A fully automated system suitable for production testing and metrology that uses uncooled microbolometer cameras to automatically measure MTF (on-axis and at several off-axis positions) and parameters like EFL, FFL, autofocus curves, image plane tilt, etc. for LWIR objectives with an EFL between 1 and 12 mm. The measurement cycle time for one sample is typically between 6 and 8 s. 2) A high-precision research-grade system, again using an uncooled LWIR camera as detector, that is very simple to align and operate. A wide range of lens parameters (MTF, EFL, astigmatism, distortion, etc.)
can be easily and accurately measured with this system.
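For reference, camera-based MTF measurement ultimately reduces to Fourier analysis of a measured spread function. The sketch below is a generic illustration, not the authors' systems: the line spread function here is a synthetic Gaussian standing in for a slit or edge-derived profile, and the 17 μm pixel pitch is an assumed, typical microbolometer value.

```python
import numpy as np

# MTF as the normalized magnitude of the Fourier transform of the
# line spread function (LSF); all values below are illustrative.
n = 256
pixel_pitch = 17e-6                         # 17 um, a typical microbolometer pitch
x = (np.arange(n) - n // 2) * pixel_pitch
sigma = 25e-6                               # width of the synthetic LSF
lsf = np.exp(-0.5 * (x / sigma) ** 2)       # stand-in for a measured slit profile
mtf = np.abs(np.fft.rfft(lsf))              # magnitude spectrum of the LSF
mtf /= mtf[0]                               # normalize so that MTF(0) = 1
freqs = np.fft.rfftfreq(n, d=pixel_pitch)   # spatial frequencies, cycles per meter
```

Plotting `mtf` against `freqs` gives the familiar monotonically falling MTF curve for this diffraction-free toy case.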
A Simple Method for Causal Analysis of Return on IT Investment
Alemi, Farrokh; Zargoush, Manaf; Oakes, James L.; Edrees, Hanan
2011-01-01
This paper proposes a method for examining the causal relationship between investment in information technology (IT) and the organization's productivity. In this method, first a strong relationship among (1) investment in IT, (2) use of IT and (3) the organization's productivity is verified using correlations. Second, the assumption that IT investment preceded improved productivity is tested using partial correlation. Finally, the assumption of what may have happened in the absence of IT investment, the so-called counterfactual, is tested by forecasting productivity at different levels of investment. The paper applies the proposed method to investment in the Veterans Health Information Systems and Technology Architecture (VISTA) system. Results show that the causal analysis can be done, even with limited data. Furthermore, because the procedure relies on the organization's overall productivity, it might be more objective than when the analyst picks and chooses which costs and benefits should be included in the analysis. PMID:23019515
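The partial-correlation step can be sketched with the standard residual method: regress both variables on the control, then correlate the residuals. This is a generic illustration with simulated data, not the VISTA analysis itself; all names and numbers are ours.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after removing the linear effect of the
    control variable z from both (standard residual method)."""
    x, y, z = (np.asarray(v, float) for v in (x, y, z))
    Z = np.column_stack([np.ones_like(z), z])          # intercept + control
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # residual of x on z
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]  # residual of y on z
    return np.corrcoef(rx, ry)[0, 1]
```

On simulated data where a common driver z induces a raw correlation between x and y, the partial correlation controlling for z collapses toward zero, which is exactly the confounding check the method relies on.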
Sorokin, Anatoly; Selkov, Gene; Goryanin, Igor
2012-07-16
The volume of the experimentally measured time series data is rapidly growing, while storage solutions offering better data types than simple arrays of numbers or opaque blobs for keeping series data are sorely lacking. A number of indexing methods have been proposed to provide efficient access to time series data, but none has so far been integrated into a tried-and-proven database system. To explore the possibility of such integration, we have developed a data type for time series storage in PostgreSQL, an object-relational database system, and equipped it with an access method based on SAX (Symbolic Aggregate approXimation). This new data type has been successfully tested in a database supporting a large-scale plant gene expression experiment, and it was additionally tested on a very large set of simulated time series data. Copyright © 2011 Elsevier B.V. All rights reserved.
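A minimal sketch of the SAX transform the access method is built on: z-normalize the series, reduce it with piecewise aggregate approximation (PAA), then map each segment mean to a letter via N(0,1) breakpoints. The breakpoint values are the standard SAX quantiles; the implementation is illustrative, not the PostgreSQL data type itself:

```python
import bisect
import statistics

BREAKPOINTS = {3: [-0.43, 0.43], 4: [-0.6745, 0.0, 0.6745]}  # N(0,1) quantiles

def sax(series, n_segments, alphabet=4):
    """Symbolic Aggregate approXimation: z-normalize, reduce with PAA,
    then map each segment mean to a letter via Gaussian breakpoints."""
    mu = statistics.fmean(series)
    sd = statistics.pstdev(series) or 1.0
    z = [(v - mu) / sd for v in series]
    seg = len(z) // n_segments
    paa = [statistics.fmean(z[i * seg:(i + 1) * seg]) for i in range(n_segments)]
    cuts = BREAKPOINTS[alphabet]
    return "".join(chr(ord("a") + bisect.bisect(cuts, m)) for m in paa)

word = sax([1, 1, 2, 2, 8, 8, 9, 9], n_segments=4, alphabet=4)  # -> "aadd"
```

Indexing such words (e.g. in a B-tree or trie) gives fast approximate matching over long series, which is the point of wiring SAX into the database's access-method machinery.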
Juicy lemons for measuring basic empathic resonance.
Hagenmuller, Florence; Rössler, Wulf; Wittwer, Amrei; Haker, Helene
2014-10-30
Watch or even think of someone biting into a juicy lemon and your saliva will flow. This is a phenomenon of resonance, best described by the Perception-Action Model, in which a physiological state in one person is activated through observation of this state in another. Within a broad framework of empathy, including manifold abilities depending on the Perception-Action link, resonance has been proposed as one physiological substrate for empathy. Using 49 healthy subjects, we developed a standardized salivation paradigm to assess empathic resonance at the autonomic level. Our results showed that this physiological resonance correlated positively with self-reported empathic concern. The salivation test delivered an objective and continuous measure, was simple to implement in terms of setup and instruction, and could not easily be unintentionally biased or intentionally manipulated by participants. These advantages make such a test a useful tool for assessing empathy-related abilities in psychiatric populations. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Hand stereotypies distinguish Rett syndrome from autism disorder.
Goldman, Sylvie; Temudo, Teresa
2012-07-01
Rett syndrome (RTT) and autism disorder (AD) are 2 neurodevelopmental disorders of early life that share phenotypic features, one being hand stereotypies. Distinguishing RTT from AD often represents a challenge, and given their distinct long-term prognoses, this issue may have far-reaching implications. With the advances in genetic testing, the contribution of clinical manifestations in distinguishing RTT from AD has been overlooked. A comparison of hand stereotypies in 20 children with RTT and 20 with AD was performed using detailed analyses of videotaped standardized observations. Striking differences are observed between RTT and AD children. In RTT, hand stereotypies are predominantly complex, continuous, localized to the body midline, and involving mouthing. Conversely, in AD children, hand stereotypies are simple, bilateral, intermittent, and often involving objects. These results provide important clinical signs useful to the differential diagnosis of RTT versus AD, especially when genetic testing for RTT is not an option. Copyright © 2012 Movement Disorder Society.
Implicit Self-Importance in an Interpersonal Pronoun Categorization Task
Fetterman, Adam K.; Robinson, Michael D.; Gilbertson, Elizabeth P.
2014-01-01
Object relations theories emphasize the manner in which the salience/importance of implicit representations of self and other guide interpersonal functioning. Two studies and a pilot test (total N = 304) sought to model such representations. In dyadic contexts, the self is a “you” and the other is a “me”, as verified in a pilot test. Study 1 then used a simple categorization task and found evidence for implicit self-importance: The pronoun “you” was categorized more quickly and accurately when presented in a larger font size, whereas the pronoun “me” was categorized more quickly and accurately when presented in a smaller font size. Study 2 showed that this pattern possesses value in understanding individual differences in interpersonal functioning. As predicted, arrogant people scored higher in implicit self-importance in the paradigm. Findings are discussed from the perspective of dyadic interpersonal dynamics. PMID:25419089
A method for the microlensed flux variance of QSOs
NASA Astrophysics Data System (ADS)
Goodman, Jeremy; Sun, Ai-Lei
2014-06-01
A fast and practical method is described for calculating the microlensed flux variance of an arbitrary source by uncorrelated stars. The required inputs are the mean convergence and shear due to the smoothed potential of the lensing galaxy, the stellar mass function, and the absolute square of the Fourier transform of the surface brightness in the source plane. The mathematical approach follows previous authors but has been generalized, streamlined, and implemented in publicly available code. Examples of its application are given for Dexter and Agol's inhomogeneous-disc models as well as the usual Gaussian sources. Since the quantity calculated is a second moment of the magnification, it is only logarithmically sensitive to the sizes of very compact sources. However, for the inferred sizes of actual quasi-stellar objects (QSOs), it has some discriminatory power and may lend itself to simple statistical tests. At the very least, it should be useful for testing the convergence of microlensing simulations.
Calibration with confidence: a principled method for panel assessment.
MacKay, R S; Kenna, R; Low, R J; Parker, S
2017-02-01
Frequently, a set of objects has to be evaluated by a panel of assessors, but not every object is assessed by every assessor. A problem facing such panels is how to take into account different standards among panel members and varying levels of confidence in their scores. Here, a mathematically based algorithm is developed to calibrate the scores of such assessors, addressing both of these issues. The algorithm is based on the connectivity of the graph of assessors and objects evaluated, incorporating declared confidences as weights on its edges. If the graph is sufficiently well connected, relative standards can be inferred by comparing how assessors rate objects they assess in common, weighted by the levels of confidence of each assessment. By removing these biases, 'true' values are inferred for all the objects. Reliability estimates for the resulting values are obtained. The algorithm is tested in two case studies: one by computer simulation and another based on realistic evaluation data. The process is compared to the simple averaging procedure in widespread use, and to Fisher's additive incomplete block analysis. It is anticipated that the algorithm will prove useful in a wide variety of situations such as evaluation of the quality of research submitted to national assessment exercises; appraisal of grant proposals submitted to funding panels; ranking of job applicants; and judgement of performances on degree courses wherein candidates can choose from lists of options.
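The calibration idea can be illustrated with a simplified sketch: assume the additive model score = true value + assessor offset, weight each score by its declared confidence, and alternate the two weighted-mean updates until they settle. The published algorithm solves the corresponding weighted graph system directly, so this is an approximation of the approach, not the authors' code, and the panel data are invented:

```python
def calibrate(scores, weights, n_iter=200):
    """Alternating estimates of per-assessor offsets b[a] and 'true' object
    values v[o] for the model score = v[o] + b[a], with declared
    confidences as weights. Offsets are gauged to sum to zero."""
    assessors = {a for a, _ in scores}
    objects = {o for _, o in scores}
    b = {a: 0.0 for a in assessors}
    v = {o: 0.0 for o in objects}
    for _ in range(n_iter):
        for o in objects:
            num = sum(weights[a, o2] * (scores[a, o2] - b[a]) for (a, o2) in scores if o2 == o)
            den = sum(weights[a, o2] for (a, o2) in scores if o2 == o)
            v[o] = num / den
        for a in assessors:
            num = sum(weights[a2, o] * (scores[a2, o] - v[o]) for (a2, o) in scores if a2 == a)
            den = sum(weights[a2, o] for (a2, o) in scores if a2 == a)
            b[a] = num / den
        shift = sum(b.values()) / len(b)   # gauge freedom: offsets sum to zero
        b = {a: x - shift for a, x in b.items()}
        v = {o: x + shift for o, x in v.items()}
    return v, b

# Hypothetical panel: assessor A scores 1 point higher than B on shared objects.
scores = {("A", "x"): 6.0, ("A", "y"): 3.0,
          ("B", "x"): 5.0, ("B", "y"): 2.0, ("B", "z"): 4.0}
weights = {k: 1.0 for k in scores}
v, b = calibrate(scores, weights)
```

Because A and B assess objects x and y in common, the iteration attributes their systematic disagreement to an offset rather than to the objects, which is exactly the bias removal the abstract describes; it only works if the assessor-object graph is connected.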
Calibration with confidence: a principled method for panel assessment
MacKay, R. S.; Low, R. J.; Parker, S.
2017-01-01
Frequently, a set of objects has to be evaluated by a panel of assessors, but not every object is assessed by every assessor. A problem facing such panels is how to take into account different standards among panel members and varying levels of confidence in their scores. Here, a mathematically based algorithm is developed to calibrate the scores of such assessors, addressing both of these issues. The algorithm is based on the connectivity of the graph of assessors and objects evaluated, incorporating declared confidences as weights on its edges. If the graph is sufficiently well connected, relative standards can be inferred by comparing how assessors rate objects they assess in common, weighted by the levels of confidence of each assessment. By removing these biases, ‘true’ values are inferred for all the objects. Reliability estimates for the resulting values are obtained. The algorithm is tested in two case studies: one by computer simulation and another based on realistic evaluation data. The process is compared to the simple averaging procedure in widespread use, and to Fisher's additive incomplete block analysis. It is anticipated that the algorithm will prove useful in a wide variety of situations such as evaluation of the quality of research submitted to national assessment exercises; appraisal of grant proposals submitted to funding panels; ranking of job applicants; and judgement of performances on degree courses wherein candidates can choose from lists of options. PMID:28386432
Hsiao, Yaling; Gao, Yannan; MacDonald, Maryellen C.
2014-01-01
Interference effects from semantically similar items are well-known in studies of single word production, where the presence of semantically similar distractor words slows picture naming. This article examines the consequences of this interference in sentence production and tests the hypothesis that in situations of high similarity-based interference, producers are more likely to omit one of the interfering elements than when there is low semantic similarity and thus low interference. This work investigated language production in Mandarin, which allows subject noun phrases to be omitted in discourse contexts in which the subject entity has been previously mentioned in the discourse. We hypothesize that Mandarin speakers omit the subject more often when the subject and the object entities are conceptually similar. A corpus analysis of simple transitive sentences found higher rates of subject omission when both the subject and object were animate (potentially yielding similarity-based interference) than when the subject was animate and object was inanimate. A second study manipulated subject-object animacy in a picture description task and replicated this result: participants omitted the animate subject more often when the object was also animate than when it was inanimate. These results suggest that similarity-based interference affects sentence forms, particularly when the agent of the action is mentioned in the sentence. Alternatives and mechanisms for this effect are discussed. PMID:25278915
A simple test for thermomechanical evaluation of ceramic fibers
NASA Technical Reports Server (NTRS)
Morscher, Gregory N.; Dicarlo, James A.
1991-01-01
A simple bend stress relaxation (BSR) test was developed to measure the creep related properties of ceramic fibers and whiskers. The test was applied to a variety of commercial and developmental Si based fibers to demonstrate capabilities and to evaluate the relative creep resistance of the fibers at 1200 to 1400 C. The implications of these results and the advantages of the BSR test over typical tensile creep tests are discussed.
NASA Technical Reports Server (NTRS)
Wightman, Frederic L.; Jenison, Rick
1995-01-01
All auditory sensory information is packaged in a pair of acoustical pressure waveforms, one at each ear. While there is obvious structure in these waveforms, that structure (temporal and spectral patterns) bears no simple relationship to the structure of the environmental objects that produced them. The properties of auditory objects and their layout in space must be derived completely from higher level processing of the peripheral input. This chapter begins with a discussion of the peculiarities of acoustical stimuli and how they are received by the human auditory system. A distinction is made between the ambient sound field and the effective stimulus to differentiate the perceptual distinctions among various simple classes of sound sources (ambient field) from the known perceptual consequences of the linear transformations of the sound wave from source to receiver (effective stimulus). Next, the definition of an auditory object is dealt with, specifically the question of how the various components of a sound stream become segregated into distinct auditory objects. The remainder of the chapter focuses on issues related to the spatial layout of auditory objects, both stationary and moving.
Acoustic-tactile rendering of visual information
NASA Astrophysics Data System (ADS)
Silva, Pubudu Madhawa; Pappas, Thrasyvoulos N.; Atkins, Joshua; West, James E.; Hartmann, William M.
2012-03-01
In previous work, we have proposed a dynamic, interactive system for conveying visual information via hearing and touch. The system is implemented with a touch screen that allows the user to interrogate a two-dimensional (2-D) object layout by active finger scanning while listening to spatialized auditory feedback. Sound is used as the primary source of information for object localization and identification, while touch is used both for pointing and for kinesthetic feedback. Our previous work considered shape and size perception of simple objects via hearing and touch. The focus of this paper is on the perception of a 2-D layout of simple objects with identical size and shape. We consider the selection and rendition of sounds for object identification and localization. We rely on the head-related transfer function for rendering sound directionality, and consider variations of sound intensity and tempo as two alternative approaches for rendering proximity. Subjective experiments with visually-blocked subjects are used to evaluate the effectiveness of the proposed approaches. Our results indicate that intensity outperforms tempo as a proximity cue, and that the overall system for conveying a 2-D layout is quite promising.
The evolution of hydrocarbons past the asymptotic giant branch: the case of MSX SMC 029
NASA Astrophysics Data System (ADS)
Pauly, Tyler; Sloan, Gregory C.; Kraemer, Kathleen E.; Bernard-Salas, Jeronimo; Lebouteiller, Vianney; Goes, Christopher; Barry, Donald
2015-01-01
We present an optimally extracted high-resolution spectrum of MSX SMC 029 obtained by the Infrared Spectrograph on the Spitzer Space Telescope. MSX SMC 029 is a carbon-rich object in the Small Magellanic Cloud that has evolved past the asymptotic giant branch (AGB). The spectrum reveals a cool carbon-rich dust continuum with emission from polycyclic aromatic hydrocarbons (PAHs) and absorption from simpler hydrocarbons, both aliphatic and aromatic, including acetylene and benzene. The spectrum shows many similarities to the carbon-rich post-AGB objects SMP LMC 011 in the Large Magellanic Cloud and AFGL 618 in the Galaxy. Both of these objects also show infrared absorption features from simple hydrocarbons. All three spectra lack strong atomic emission lines in the infrared, indicating that we are observing the evolution of carbon-rich dust and free hydrocarbons in objects between the AGB and planetary nebulae. These three objects give us a unique view of the elusive phase when hydrocarbons exist both as relatively simple molecules and the much more complex and ubiquitous PAHs. We may be witnessing the assembly of amorphous carbon into PAHs.
Towards discrete wavelet transform-based human activity recognition
NASA Astrophysics Data System (ADS)
Khare, Manish; Jeon, Moongu
2017-06-01
Providing accurate recognition of human activities is a challenging problem for visual surveillance applications. In this paper, we present a simple and efficient algorithm for human activity recognition based on a wavelet transform. We adopt discrete wavelet transform (DWT) coefficients as a feature of human objects to obtain advantages of its multiresolution approach. The proposed method is tested on multiple levels of DWT. Experiments are carried out on different standard action datasets including KTH and i3D Post. The proposed method is compared with other state-of-the-art methods in terms of different quantitative performance measures. The proposed method is found to have better recognition accuracy in comparison to the state-of-the-art methods.
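The feature-extraction step can be illustrated with a single level of the 2-D Haar DWT, the simplest wavelet; the resulting subband coefficients would then feed a classifier. This is a generic sketch of multiresolution feature extraction, not the paper's exact pipeline:

```python
def haar2d(img):
    """One level of the 2-D Haar DWT: returns the approximation (LL) and
    three detail subbands of an even-sized grayscale image (list of rows)."""
    rows, cols = len(img), len(img[0])
    # Averaging / differencing pass along rows
    lo = [[(r[2 * j] + r[2 * j + 1]) / 2 for j in range(cols // 2)] for r in img]
    hi = [[(r[2 * j] - r[2 * j + 1]) / 2 for j in range(cols // 2)] for r in img]

    def cols_pass(m):
        a = [[(m[2 * i][j] + m[2 * i + 1][j]) / 2 for j in range(len(m[0]))] for i in range(rows // 2)]
        d = [[(m[2 * i][j] - m[2 * i + 1][j]) / 2 for j in range(len(m[0]))] for i in range(rows // 2)]
        return a, d

    LL, LH = cols_pass(lo)   # approximation + vertical detail
    HL, HH = cols_pass(hi)   # horizontal + diagonal detail
    return LL, LH, HL, HH

img = [[1, 1, 5, 5],
       [1, 1, 5, 5],
       [2, 2, 2, 2],
       [2, 2, 2, 2]]
LL, LH, HL, HH = haar2d(img)
```

Applying the transform recursively to LL yields the multiple decomposition levels the paper evaluates; coarse levels capture body shape, fine levels capture motion detail.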
Generation of multiple Bessel beams for a biophotonics workstation.
Cizmár, T; Kollárová, V; Tsampoula, X; Gunn-Moore, F; Sibbett, W; Bouchal, Z; Dholakia, K
2008-09-01
We present a simple method using an axicon and spatial light modulator to create multiple parallel Bessel beams and precisely control their individual positions in three dimensions. This technique is tested as an alternative to classical holographic beam shaping commonly used now in optical tweezers. Various applications of precise control of multiple Bessel beams are demonstrated within a single microscope giving rise to new methods for three-dimensional positional control of trapped particles or active sorting of micro-objects as well as "focus-free" photoporation of living cells. Overall this concept is termed a 'biophotonics workstation' where users may readily trap, sort and porate material using Bessel light modes in a microscope.
Autoantibody Approach for Serum-Based Detection of Head and Neck Cancer — EDRN Public Portal
Our long term goal is to improve survival of patients with head and neck squamous cell carcinoma (HNSCC) through early detection using simple noninvasive serum assays in an ELISA-like platform. The objective of this proposal is to improve and confirm the validity of a diagnostic serum assay based on a panel of cancer-specific biomarkers for early cancer detection in patients with HNSCC. Our central hypothesis is that the detection of antibody responses to HNSCC-specific antigens, using a panel of biomarkers, can provide sufficient sensitivity and specificity suitable for clinical testing in the primary setting to screen and diagnose HNSCC in high risk populations to improve early detection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gratia, Pierre; Hu, Wayne; Enrico Fermi Institute and Kavli Institute for Cosmological Physics, University of Chicago,South Ellis Avenue, Chicago, IL 60637
Attempts to modify gravity in the infrared typically require a screening mechanism to ensure consistency with local tests of gravity. These screening mechanisms fit into three broad classes; we investigate theories which are capable of exhibiting more than one type of screening. Specifically, we focus on a simple model which exhibits both Vainshtein and kinetic screening. We point out that due to the two characteristic length scales in the problem, the type of screening that dominates depends on the mass of the sourcing object, allowing for different phenomenology at different scales. We consider embedding this double screening phenomenology in a broader cosmological scenario and show that the simplest examples that exhibit double screening are radiatively stable.
Smartphone-based colorimetric analysis for detection of saliva alcohol concentration.
Jung, Youngkee; Kim, Jinhee; Awofeso, Olumide; Kim, Huisung; Regnier, Fred; Bae, Euiwon
2015-11-01
A simple device and associated analytical methods are reported. We provide objective and accurate determination of saliva alcohol concentrations using smartphone-based colorimetric imaging. The device utilizes any smartphone with a miniature attachment that positions the sample and provides constant illumination for sample imaging. Analyses of histograms based on channel imaging of red-green-blue (RGB) and hue-saturation-value (HSV) color space provide unambiguous determination of blood alcohol concentration from color changes on sample pads. A smartphone-based sample analysis by colorimetry was developed and tested with blind samples that matched the training sets. This technology can be adapted to any smartphone and used to conduct color-change assays.
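The color-space step at the heart of such an analysis can be sketched with the standard library; the pad pixels below are hypothetical, and a real assay would map the HSV statistics onto a calibration curve built from samples of known concentration:

```python
import colorsys

def pad_hsv(rgb_pixels):
    """Mean hue/saturation/value of a sample-pad region: average the 8-bit
    RGB channels, then convert once (colorsys expects values in [0, 1])."""
    n = len(rgb_pixels)
    r = sum(p[0] for p in rgb_pixels) / (255.0 * n)
    g = sum(p[1] for p in rgb_pixels) / (255.0 * n)
    b = sum(p[2] for p in rgb_pixels) / (255.0 * n)
    return colorsys.rgb_to_hsv(r, g, b)

# Hypothetical pixels sampled from a reacted test pad
pad = [(200, 180, 40), (198, 182, 42), (202, 179, 41)]
h, s, v = pad_hsv(pad)
```

Working in HSV separates the chemistry-driven hue shift from illumination-driven brightness changes, which is why the paper analyzes both color spaces.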
NecroQuant: quantitative assessment of radiological necrosis
NASA Astrophysics Data System (ADS)
Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay
2017-11-01
Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from the conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be re-configured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types to quantify tumor necrosis.
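In its simplest form, Hounsfield-unit-based necrosis quantification reduces to counting ROI voxels that fall inside a necrosis window. The thresholds and samples below are illustrative only, not NecroQuant's calibrated values:

```python
def necrosis_fraction(hu_values, lo=-20, hi=30):
    """Fraction of tumor-ROI voxels whose Hounsfield units fall inside an
    assumed necrosis window (non-enhancing, near-fluid attenuation)."""
    necrotic = sum(1 for v in hu_values if lo <= v <= hi)
    return necrotic / len(hu_values)

# Hypothetical post-contrast HU samples from a tumor region of interest
roi = [120, 115, 10, 5, -5, 130, 20, 125]
frac = necrosis_fraction(roi)  # 4 of 8 voxels in the window -> 0.5
```

A production tool would add the interactive ROI definition and multiphase enhancement comparison the abstract describes; the thresholding above is only the final arithmetic.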
Posture Alignment of Adolescent Idiopathic Scoliosis: Photogrammetry in Scoliosis School Screening.
Penha, Patrícia Jundi; Penha, Nárima Lívia Jundi; De Carvalho, Bárbarah Kelly Gonçalves; Andrade, Rodrigo Mantelatto; Schmitt, Ana Carolina Basso; João, Sílvia Maria Amado
The objective of this study was to describe the posture patterns of adolescents diagnosed with adolescent idiopathic scoliosis (AIS) in a scoliosis school screening (SSS). Two-dimensional photogrammetry was used to assess the posture of 37 adolescents diagnosed with scoliosis (scoliosis group, SG) (Cobb angle ≥10°) and 76 adolescents with a false positive diagnosis (false positive group, FPG) (Cobb angle <10°, angle of trunk rotation ≥7°). In total, 2562 10- to 14-year-old adolescents were enrolled in the SSS, which was performed in public schools in the cities of Amparo, Pedreira, and Mogi Mirim in the state of São Paulo, Brazil. Their posture was analyzed using Postural Analysis Software. Continuous variables were tested using Student t test, and categorical variables were tested using a χ2 test. The SG, FPG, simple curve group, and double curve group were all compared. Bivariate analysis was used to identify associations between postural deviations and scoliosis. The adopted significance level was α = .05. The SG (2.7 ± 1.9°) had greater shoulder obliquity than the FPG (1.9 ± 1.4°) (P = .010), and this deviation was associated with scoliosis (odds ratio [95% CI] = 1.4 [1.1-1.8]; P = .011). The SG had asymmetry between the right- and left-side lower limb frontal angle, shoulder sagittal alignment, and knee angle. The double curve group (3 ± 1.7°) presented a greater value of the vertical alignment of the torso than the simple curve group did (1.9 ± 1°; P = .032). Adolescents diagnosed with AIS in an SSS had greater shoulder obliquity and asymmetry between the right and left sides. Shoulder obliquity was the only postural deviation associated with AIS. Copyright © 2017. Published by Elsevier Inc.
Assaf, Tareq; Roke, Calum; Rossiter, Jonathan; Pipe, Tony; Melhuish, Chris
2014-02-07
Effective tactile sensing for artificial platforms remains an open issue in robotics. This study investigates the performance of a soft biologically-inspired artificial fingertip in active exploration tasks. The fingertip sensor replicates the mechanisms within human skin and offers a robust solution that can be used both for tactile sensing and gripping/manipulating objects. The softness of the optical sensor's contact surface also allows safer interactions with objects. High-level tactile features such as edges are extrapolated from the sensor's output and the information is used to generate a tactile image. The work presented in this paper aims to investigate and evaluate this artificial fingertip for 2D shape reconstruction. The sensor was mounted on a robot arm to allow autonomous exploration of different objects. The sensor and a number of human participants were then tested for their abilities to track the raised perimeters of different planar objects and compared. By observing the technique and accuracy of the human subjects, simple but effective parameters were determined in order to evaluate the artificial system's performance. The results prove the capability of the sensor in such active exploration tasks, with a comparable performance to the human subjects despite it using tactile data alone whereas the human participants were also able to use proprioceptive cues.
NASA Astrophysics Data System (ADS)
McIntire, John P.; Wright, Steve T.; Harrington, Lawrence K.; Havig, Paul R.; Watamaniuk, Scott N. J.; Heft, Eric L.
2014-06-01
Twelve participants were tested on a simple virtual object precision placement task while viewing a stereoscopic three-dimensional (S3-D) display. Inclusion criteria included uncorrected or best corrected vision of 20/20 or better in each eye and stereopsis of at least 40 arc sec using the Titmus stereotest. Additionally, binocular function was assessed, including measurements of distant and near phoria (horizontal and vertical) and distant and near horizontal fusion ranges using standard optometric clinical techniques. Before each of six 30 min experimental sessions, measurements of phoria and fusion ranges were repeated using a Keystone View Telebinocular and an S3-D display, respectively. All participants completed experimental sessions in which the task required the precision placement of a virtual object in depth at the same location as a target object. Subjective discomfort was assessed using the simulator sickness questionnaire. Individual placement accuracy in S3-D trials was significantly correlated with several of the binocular screening outcomes: viewers with larger convergent fusion ranges (measured at near distance), larger total fusion ranges (convergent plus divergent ranges, measured at near distance), and/or lower (better) stereoscopic acuity thresholds were more accurate on the placement task. No screening measures were predictive of subjective discomfort, perhaps due to the low levels of discomfort induced.
Finding the Density of Objects without Measuring Mass and Volume
ERIC Educational Resources Information Center
Mumba, Frackson; Tsige, Mesfin
2007-01-01
A simple method based on the moment of forces and Archimedes' principle is described for finding density without measuring the mass and volume of an object. The method involves balancing two unknown objects of masses M[subscript 1] and M[subscript 2] on each side of a pivot on a metre rule and measuring their corresponding moment arms. The object…
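The balance argument can be reconstructed as follows (a hedged sketch: primed moment arms denote the re-balanced configuration after submerging object 1 in water of density ρ_w; the notation is assumed rather than taken from the article):

```latex
% In air, balancing moments about the pivot:
M_1 g\, d_1 = M_2 g\, d_2
  \quad\Rightarrow\quad \frac{M_1}{M_2} = \frac{d_2}{d_1}
% With object 1 fully submerged (buoyant force \rho_w V_1 g), re-balanced at arms d_1', d_2':
(M_1 - \rho_w V_1)\, g\, d_1' = M_2\, g\, d_2'
% Eliminating M_1, M_2 and V_1 gives the density from moment arms alone:
\rho_1 \;=\; \frac{M_1}{V_1}
       \;=\; \rho_w\, \frac{d_2\, d_1'}{\,d_2\, d_1' - d_1\, d_2'\,}
```

Only the four moment arms and the known density of water appear in the result, which is why neither mass nor volume needs to be measured.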
Management of Simple Clavicle Fractures by Primary Care Physicians.
Stepanyan, Hayk; Gendelberg, David; Hennrikus, William
2017-05-01
The clavicle is the most commonly fractured bone. Children with simple fractures are often referred to orthopedic surgeons by primary care physicians to ensure adequate care. The objective of this study was to show that simple clavicle fractures have excellent outcomes and are within the scope of primary care practice. We performed a retrospective chart review of 16 adolescents with simple clavicle fractures treated with a sling. Primary outcomes were bony union, pain, and function. The patients with simple clavicle fractures had excellent outcomes, with no complications or complaints of pain or restriction of their activities of daily living. The outcomes are similar whether the fracture is treated by an orthopedic surgeon or a primary care physician, and the cost to society and the patient is lower when the primary care physician manages the fracture. Therefore, primary care physicians should manage simple clavicle fractures.
A Simple Model for the Orbital Debris Environment in GEO
NASA Astrophysics Data System (ADS)
Anilkumar, A. K.; Ananthasayanam, M. R.; Subba Rao, P. V.
The increase of space debris and its threat to commercial space activities in the geosynchronous Earth orbit (GEO) predictably causes concern about the environment over the long term. A variety of space debris studies, covering detection, modeling, protection and mitigation measures, has been pursued over the past couple of decades. Due to the absence of atmospheric drag to remove debris in GEO and the increasing number of utility satellites therein, the number of objects in GEO will continue to increase. Characterization of the GEO environment is critical for risk assessment and protection of future satellites, and for incorporating effective debris mitigation measures in design and operations. Debris measurements in GEO have been limited to objects larger than 60 cm. This paper provides an engineering model of the GEO environment by utilizing the philosophy and approach laid out in the SIMPLE model recently proposed for LEO by the authors. The present study analyses the statistical characteristics of the GEO catalogued objects in order to arrive at a model for the GEO space debris environment. It is noted that the roughly 800 objects catalogued by USSPACECOM across the years 1998 to 2004 have the same semi-major axis mode (highest number density) around 35750 km above the Earth. After removing the objects in the small bin around the mode, (35700, 35800) km, containing around 40 percent of the objects (a value that is nearly constant across the years), the number density of the remaining objects follows a single Laplace distribution with two parameters, namely location and scale. Across the years the location parameter of this distribution does not vary significantly, but the scale parameter shows a definite trend. These observations are successfully utilized in proposing a simple model for the GEO debris environment. References: Ananthasayanam, M. R., Anil Kumar, A. K., and Subba Rao, P. V., "A New Stochastic Impressionistic Low Earth (SIMPLE) Model of the Space Debris Scenario", Conference Abstract COSPAR 02-A-01772, 2002. Ananthasayanam, M. R., Anilkumar, A. K., Subba Rao, P. V., and Adimurthy, V., "Characterization of Eccentricity and Ballistic Coefficients of Space Debris in Altitude and Perigee Bins", IAC-03-IAA5.p.04, presented at the IAF Conference, Bremen, October 2003; also in the Proceedings of the IAF Conference, Science and Technology Series, 2003.
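The Laplace fit used above has a closed-form maximum-likelihood solution, which makes the model easy to reproduce: the location is the sample median and the scale is the mean absolute deviation about it. A sketch with hypothetical residuals, not the actual catalogue data:

```python
import statistics

def fit_laplace(samples):
    """Maximum-likelihood fit of a Laplace distribution: location is the
    sample median, scale is the mean absolute deviation about it."""
    loc = statistics.median(samples)
    scale = sum(abs(x - loc) for x in samples) / len(samples)
    return loc, scale

# Hypothetical semi-major-axis residuals (km) after removing the GEO mode bin
residuals = [-400, -150, -60, -20, 0, 15, 70, 160, 380]
loc, scale = fit_laplace(residuals)
```

Tracking how the fitted scale parameter evolves year by year, with the location roughly fixed, is precisely the trend the abstract builds its GEO model on.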
Redundant correlation effect on personalized recommendation
NASA Astrophysics Data System (ADS)
Qiu, Tian; Han, Teng-Yue; Zhong, Li-Xin; Zhang, Zi-Ke; Chen, Guang
2014-02-01
The high-order redundant correlation effect is investigated for a hybrid algorithm of heat conduction and mass diffusion (HHM), through both heat conduction biased (HCB) and mass diffusion biased (MDB) correlation redundancy elimination processes. The HCB and MDB algorithms do not introduce any additional tunable parameters, but keep the simple character of the original HHM. Based on two empirical datasets, the Netflix and MovieLens, the HCB and MDB are found to show better recommendation accuracy for both the overall objects and the cold objects than the HHM algorithm. Our work suggests that properly eliminating the high-order redundant correlations can provide a simple and effective approach to accurate recommendation.
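The HHM baseline that HCB and MDB start from, a heat-conduction/mass-diffusion blend on the user-object bipartite graph, can be sketched on a toy dataset. The adjacency matrix below is invented, and this illustrates only the hybrid scoring, not the redundancy-elimination step itself:

```python
def hybrid_scores(adj, target_user, lam=0.5):
    """Hybrid heat-conduction/mass-diffusion object-object weights
    W[a][b] = k(a)**(lam-1) * k(b)**(-lam) * sum_u adj[u][a]*adj[u][b]/k(u),
    then recommendation scores for the target user's uncollected objects."""
    users, objects = range(len(adj)), range(len(adj[0]))
    ko = [sum(adj[u][o] for u in users) for o in objects]   # object degrees
    ku = [sum(adj[u][o] for o in objects) for u in users]   # user degrees
    W = [[(ko[a] ** (lam - 1)) * (ko[b] ** (-lam)) *
          sum(adj[u][a] * adj[u][b] / ku[u] for u in users)
          for b in objects] for a in objects]
    collected = [o for o in objects if adj[target_user][o]]
    return {o: sum(W[o][b] for b in collected)
            for o in objects if not adj[target_user][o]}

# Toy user-object adjacency matrix (rows: users, columns: objects)
adj = [[1, 1, 0, 0],
       [1, 0, 1, 0],
       [0, 1, 1, 1]]
scores = hybrid_scores(adj, target_user=0)
```

The lam parameter interpolates between pure heat conduction and pure mass diffusion; the HCB/MDB algorithms keep this parameter-free character while pruning high-order redundant correlations from W.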
On the estimation variance for the specific Euler-Poincaré characteristic of random networks.
Tscheschel, A; Stoyan, D
2003-07-01
The specific Euler number is an important topological characteristic in many applications. It is considered here for the case of random networks, which may appear in microscopy either as primary objects of investigation or as secondary objects describing in an approximate way other structures such as, for example, porous media. For random networks there is a simple and natural estimator of the specific Euler number. For its estimation variance, a simple Poisson approximation is given. It is based on the general exact formula for the estimation variance. In two examples of quite different nature and topology application of the formulas is demonstrated.
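For a network viewed as a 1-complex, the Euler-Poincaré characteristic is simply vertices minus edges, which is what makes a simple, natural estimator possible. A minimal sketch (the specific Euler number would divide χ by the volume of the observation window):

```python
def euler_characteristic(nodes, edges):
    """Euler-Poincare characteristic of a network (1-complex): chi = V - E.
    Each independent cycle in the network lowers chi by one."""
    return len(nodes) - len(edges)

# A small network: a triangle (one independent loop) with one pendant vertex
nodes = ["a", "b", "c", "d"]
edges = [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]
chi = euler_characteristic(nodes, edges)  # 4 - 4 = 0
```

In microscopy practice the counts come from the portion of the network inside the window, with edge-correction at the boundary; the variance of that count estimator is what the Poisson approximation in the paper addresses.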
A simple water-immersion condenser for imaging living brain slices on an inverted microscope.
Prusky, G T
1997-09-05
Due to some physical limitations of conventional condensers, inverted compound microscopes are not optimally suited for imaging living brain slices with transmitted light. Herein is described a simple device that converts an inverted microscope into an effective tool for this application by utilizing an objective as a condenser. The device is mounted on a microscope in place of the condenser, is threaded to accept a water immersion objective, and has a slot for a differential interference contrast (DIC) slider. When combined with infrared video techniques, this device allows an inverted microscope to effectively image living cells within thick brain slices in an open perfusion chamber.
3D visualization of two-phase flow in the micro-tube by a simple but effective method
NASA Astrophysics Data System (ADS)
Fu, X.; Zhang, P.; Hu, H.; Huang, C. J.; Huang, Y.; Wang, R. Z.
2009-08-01
The present study provides a simple but effective method for 3D visualization of the two-phase flow in the micro-tube. An isosceles right-angle prism combined with a mirror mounted at a 45° bevel to the prism is employed to synchronously obtain the front and side views of the flow patterns with a single camera, where the locations of the prism and the micro-tube for clear imaging should satisfy a fixed relationship which is specified in the present study. The optical design proved successful in demanding visualization work at cryogenic temperatures. The image deformation due to the refraction and geometrical configuration of the test section is quantitatively investigated. It is calculated that the image is enlarged by about 20% in inner diameter compared to the real object, which is validated by the experimental results. Meanwhile, the image deformation by adding a rectangular optical correction box outside the circular tube is comparatively investigated. It is calculated that the image is reduced by about 20% in inner diameter with a rectangular optical correction box compared to the real object. The 3D re-construction process based on the two views is conducted through three steps, which shows that the 3D visualization method can easily be applied for two-phase flow research in micro-scale channels and improves the measurement accuracy of some important parameters of the two-phase flow such as void fraction, spatial distribution of bubbles, etc.
Kerkar, N; Ma, Y; Hussain, M; Muratori, L; Targett, C; Williams, R; Bianchi, F B; Mieli-Vergani, G; Vergani, D
1999-03-04
Liver Kidney Microsomal type 1 (LKM1) antibody, the diagnostic marker of autoimmune hepatitis type 2, is also found in a proportion of patients with hepatitis C virus infection (HCV). It is detected conventionally by the subjective immunofluorescence technique. Our aim was to establish a simple and objective enzyme-linked immunosorbent assay (ELISA) that measures antibodies to cytochrome P4502D6 (CYP2D6), the target of LKM1. An indirect ELISA using eukaryotically expressed CYP2D6 was designed. Absorbance values obtained against a reference microsomal preparation were subtracted from those obtained against a microsomal preparation over-expressing CYP2D6, thus removing the non-CYP2D6-specific reaction. Sera from 51 LKM1 positive patients (21 autoimmune hepatitis and 30 with HCV infection), 111 LKM1 negative patients with chronic liver disease (including 20 with HCV infection) and 43 healthy controls were tested. Of 51 patients positive by immunofluorescence, 48 were also positive by ELISA while all the 154 LKM1 negative subjects were also negative by ELISA. There was a high degree of association between IFL and ELISA as demonstrated by a kappa reliability value of 0.96. The absorbance values by ELISA correlated with immunofluorescence LKM1 titres both in autoimmune hepatitis (r = 0.74, p < 0.001) and HCV infection (r = 0.67, p < 0.001). The simple, objective ELISA described has the potential to replace the standard immunofluorescence technique.
Recognition of Simple 3D Geometrical Objects under Partial Occlusion
NASA Astrophysics Data System (ADS)
Barchunova, Alexandra; Sommer, Gerald
In this paper we present a novel procedure for contour-based recognition of partially occluded three-dimensional objects. In our approach we use images of real and rendered objects whose contours have been deformed by a restricted change of the viewpoint. The preparatory part consists of contour extraction, preprocessing, local structure analysis and feature extraction. The main part deals with an extended construction and functionality of the classifier ensemble Adaptive Occlusion Classifier (AOC). It relies on a hierarchical fragmenting algorithm to perform a local structure analysis which is essential when dealing with occlusions. In the experimental part of this paper we present classification results for five classes of simple geometrical figures: prism, cylinder, half cylinder, cube, and bridge. We compare classification results for three classical feature extractors: Fourier descriptors, pseudo-Zernike and Zernike moments.
The effect of music on parental participation during pediatric laceration repair.
Sobieraj, Gregory; Bhatt, Maala; LeMay, Sylvie; Rennick, Janet; Johnston, Celeste
2009-12-01
The purpose of this quasi-experimental study was to test an intervention on the use of music during simple laceration repair to promote parent-led distraction in children aged 1 to 5. Children's songs were broadcast via speakers during laceration repair and parents were encouraged to participate in distracting their child. The proportion of parental participation was determined. Laceration procedures were videotaped and objectively scored using the Procedure Behavior Check List. A total of 57 children participated in the study. There was no difference in parental involvement between the control and intervention groups. When age, sex, and condition were controlled for, distress scores were significantly higher if the father was present in the procedure room than if only the mother was present (43.68 vs. 23.39, t(54) = 4.296, p < 0.001). It was concluded that distress varies with the age of the child and the parent who is present during the procedure. Providing music during simple laceration repair did not increase the proportion of parents who were involved in distraction.
Comparison of Two Simplification Methods for Shoreline Extraction from Digital Orthophoto Images
NASA Astrophysics Data System (ADS)
Bayram, B.; Sen, A.; Selbesoglu, M. O.; Vārna, I.; Petersons, P.; Aykut, N. O.; Seker, D. Z.
2017-11-01
Coastal ecosystems are very sensitive to external influences. Coastal resources such as sand dunes, coral reefs, and mangroves are vitally important in preventing coastal erosion. Human activities also threaten coastal areas. Therefore, changes in coastal areas should be monitored. Up-to-date, accurate shoreline information is indispensable for coastal managers and decision makers. Remote sensing and image processing techniques offer a great opportunity to obtain reliable shoreline information. In the presented study, NIR bands of seven 1:5000-scale digital orthophoto images of Riga Bay, Latvia have been used. The Object-oriented Simple Linear Clustering method has been utilized to extract the shoreline of Riga Bay. The Bend and Douglas-Peucker methods have been used to simplify the extracted shoreline in order to test the effect of both methods. A photogrammetrically digitized shoreline has been taken as reference data to compare the obtained results. The accuracy assessment has been carried out with the Digital Shoreline Analysis tool. As a result, the shoreline simplified with the Bend method has been found to be closer to the shoreline extracted with the Simple Linear Clustering method.
A matrix-inversion method for gamma-source mapping from gamma-count data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adsley, Ian; Burgess, Claire; Bull, Richard K
In a previous paper it was proposed that a simple matrix inversion method could be used to extract source distributions from gamma-count maps, using simple models to calculate the response matrix. The method was tested using numerically generated count maps. In the present work a 100 kBq Co{sup 60} source has been placed on a gridded surface and the count rate measured using a NaI scintillation detector. The resulting map of gamma counts was used as input to the matrix inversion procedure and the source position recovered. A multi-source array was simulated by superposition of several single-source count maps and the source distribution was again recovered using matrix inversion. The measurements were performed for several detector heights. The effects of uncertainties in source-detector distances on the matrix inversion method are also examined. The results from this work give confidence in the application of the method to practical applications, such as the segregation of highly active objects amongst fuel-element debris. (authors)
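The inversion idea in the abstract above can be sketched in a few lines: measured counts c relate to unknown source strengths s through a response matrix R, so the source map follows from solving R s = c. The inverse-square response model and grid geometry below are illustrative assumptions, not the authors' actual detector model.

```python
import numpy as np

# Hypothetical response model: R[i, j] = count rate at detector position i
# per unit activity in grid cell j, taken here as a bare 1/(4*pi*d^2) law.
def response_matrix(det_xy, src_xy, height):
    det = np.asarray(det_xy, float)[:, None, :]   # shape (ndet, 1, 2)
    src = np.asarray(src_xy, float)[None, :, :]   # shape (1, nsrc, 2)
    d2 = ((det - src) ** 2).sum(axis=2) + height ** 2
    return 1.0 / (4.0 * np.pi * d2)

def recover_sources(counts, R):
    """Invert the count map; lstsq also handles overdetermined R."""
    s, *_ = np.linalg.lstsq(R, np.asarray(counts, float), rcond=None)
    return s

# Example: detectors scanned over the same 3x3 grid that holds one source.
grid = [(x, y) for x in range(3) for y in range(3)]
R = response_matrix(grid, grid, height=0.5)
true_s = np.zeros(9)
true_s[4] = 100.0                        # single source at the grid centre
recovered = recover_sources(R @ true_s, R)
print(np.round(recovered, 4))
```

With noisy counts or uncertain source-detector distances (as studied in the abstract), least squares degrades gracefully where a direct inverse would amplify the error.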
NASA Astrophysics Data System (ADS)
Asfahani, J.; Tlas, M.
2015-10-01
An easy and practical method for interpreting residual gravity anomalies due to simple geometrically shaped models such as cylinders and spheres has been proposed in this paper. This proposed method is based on both the deconvolution technique and the simplex algorithm for linear optimization to most effectively estimate the model parameters, e.g., the depth from the surface to the center of a buried structure (sphere or horizontal cylinder) or the depth from the surface to the top of a buried object (vertical cylinder), and the amplitude coefficient from the residual gravity anomaly profile. The method was tested on synthetic data sets corrupted by different white Gaussian random noise levels to demonstrate the capability and reliability of the method. The results acquired show that the estimated parameter values derived by this proposed method are close to the assumed true parameter values. The validity of this method is also demonstrated using real field residual gravity anomalies from Cuba and Sweden. Comparable and acceptable agreement is shown between the results derived by this method and those derived from real field data.
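As a concrete illustration of the sphere case above: the residual gravity anomaly of a buried sphere along a profile x is commonly modelled as g(x) = A·z/(x² + z²)^(3/2), with depth-to-centre z and amplitude coefficient A. The paper estimates these via deconvolution plus a simplex algorithm; the sketch below instead uses a plain grid search over z with a linear least-squares amplitude, purely to show the inversion idea.

```python
import numpy as np

def sphere_anomaly(x, z, A):
    # Standard forward model for a buried sphere's residual gravity anomaly.
    return A * z / (x ** 2 + z ** 2) ** 1.5

def fit_sphere(x, g, z_grid):
    # For each candidate depth, the amplitude enters linearly, so the best
    # A is a one-line least-squares projection; keep the depth with least error.
    best = (np.inf, None, None)
    for z in z_grid:
        basis = sphere_anomaly(x, z, 1.0)
        A = (basis @ g) / (basis @ basis)
        err = ((g - A * basis) ** 2).sum()
        if err < best[0]:
            best = (err, z, A)
    return best[1], best[2]

x = np.linspace(-50.0, 50.0, 101)
g = sphere_anomaly(x, z=10.0, A=2000.0)       # noise-free synthetic profile
z_hat, A_hat = fit_sphere(x, g, np.arange(5.0, 20.5, 0.5))
print(z_hat, round(A_hat, 2))
```

On the noise-free profile the true depth and amplitude are recovered; the abstract's Gaussian-noise experiments correspond to repeating this fit on perturbed g.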
3D Boolean operations in virtual surgical planning.
Charton, Jerome; Laurentjoye, Mathieu; Kim, Youngjun
2017-10-01
Boolean operations in computer-aided design or computer graphics are a set of operations (e.g. intersection, union, subtraction) between two objects (e.g. a patient model and an implant model) that are important in performing accurate and reproducible virtual surgical planning. This requires accurate and robust techniques that can handle various types of data, such as a surface extracted from volumetric data, synthetic models, and 3D scan data. This article compares the performance of the proposed method (Boolean operations by a robust, exact, and simple method between two colliding shells (BORES)) and an existing method based on the Visualization Toolkit (VTK). In all tests presented in this article, BORES could handle complex configurations as well as report impossible configurations of the input. In contrast, the VTK implementations were unstable, did not deal with singular edges and coplanar collisions, and created several defects. The proposed method of Boolean operations, BORES, is efficient and appropriate for virtual surgical planning. Moreover, it is simple and easy to implement. In future work, we will extend the proposed method to handle non-colliding components.
Enhancing activated-peroxide formulations for porous materials: Test methods and results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krauter, Paula; Tucker, Mark D.; Tezak, Matthew S.
2012-12-01
During an urban wide-area incident involving the release of a biological warfare agent, the recovery/restoration effort will require extensive resources and will tax the current capabilities of the government and private contractors. In fact, resources may be so limited that decontamination by facility owners/occupants may become necessary, and a simple decontamination process and material should be available for this use. One potential process for use by facility owners/occupants would be a liquid sporicidal decontaminant, such as pH-amended bleach or activated peroxide, and simple application devices. While pH-amended bleach is currently the recommended low-tech decontamination solution, a less corrosive and toxic decontaminant is desirable. The objective of this project is to provide an operational assessment of an alternative to chlorine bleach for low-tech decontamination applications: activated hydrogen peroxide. This report provides the methods and results for activated-peroxide evaluation experiments. The results suggest that the efficacy of an activated-peroxide decontaminant is similar to pH-amended bleach on many common materials.
Paula Vaz Cardoso, Ludimila; Dias, Ronaldo Ferreira; Freitas, Aline Araújo; Hungria, Emerith Mayra; Oliveira, Regiane Morillas; Collovati, Marco; Reed, Steven G; Duthie, Malcolm S; Martins Araújo Stefani, Mariane
2013-10-23
Despite efforts to eliminate leprosy as a public health problem, delayed diagnosis and disabilities still occur in many countries. Leprosy diagnosis remains based on clinical manifestations, and the number of clinicians with expertise in leprosy diagnosis is in decline. We have developed a new immunochromatographic test with the goal of producing a simple and rapid system that can be used, with a minimal amount of training, to provide an objective and consistent diagnosis of multibacillary leprosy. The test immobilizes two antigens that have been recognized as excellent candidates for serologic diagnosis (the PGL-I mimetic, ND-O, and LID-1) on a nitrocellulose membrane. This allows the detection of specific IgM and IgG antibodies within 20 minutes of the addition of patient sera. Furthermore, we coupled the NDO-LID® rapid tests with a new cell phone-based test reader platform (Smart Reader®) to provide objective interpretation that was both quantifiable and consistent. Direct comparison of serologic responses indicated that the rapid test detected a greater proportion of leprosy patients than a lab-based PGL-I ELISA. While positive responses were detected by PGL-I ELISA in 83.3% of multibacillary patients and 15.4% of paucibacillary patients, these numbers increased to 87% and 21.2%, respectively, when a combination of the NDO-LID® test and Smart Reader® was used. Among multibacillary leprosy patients, the sensitivity of the NDO-LID® test assessed by Smart Reader® was 87% (95% CI, 79.2-92.7%) and the specificity was 96.1% (95% CI, 91.7-98.6%). The positive predictive value and the negative predictive value of NDO-LID® tests were 94% (95% CI, 87.4-97.8%) and 91.4% (95% CI, 85.9-95.2%), respectively. The widespread provision of rapid diagnostic tests to facilitate the diagnosis or prognosis of multibacillary leprosy could impact leprosy control programs by aiding early detection, directing appropriate treatment, and potentially interrupting Mycobacterium leprae transmission.
Manipulation Capabilities with Simple Hands
2010-01-01
allowing it to interpret online kinesthetic data, addressing two objectives: • Grasp classification: Distinguish between successful and unsuccessful ... determining the grasp outcome before the grasping process is complete, by using the entire time series or kinesthetic signature of the grasping process. As ... the grasp proceeds and additional kinesthetic data accumulates, the confidence also increases. In some cases ...
An annotated genetic map of loblolly pine based on microsatellite and cDNA markers
USDA-ARS?s Scientific Manuscript database
Previous loblolly pine (Pinus taeda L.) genetic linkage maps have been based on a variety of DNA polymorphisms, such as AFLPs, RAPDs, RFLPs, and ESTPs, but only a few SSRs (simple sequence repeats), also known as simple tandem repeats or microsatellites, have been mapped in P. taeda. The objective o...
Atypical Brain Activation during Simple & Complex Levels of Processing in Adult ADHD: An fMRI Study
ERIC Educational Resources Information Center
Hale, T. Sigi; Bookheimer, Susan; McGough, James J.; Phillips, Joseph M.; McCracken, James T.
2007-01-01
Objective: Executive dysfunction in ADHD is well supported. However, recent studies suggest that more fundamental impairments may be contributing. We assessed brain function in adults with ADHD during simple and complex forms of processing. Method: We used functional magnetic resonance imaging with forward and backward digit spans to investigate…
Do Simple Warning Signs Enhance the Use of Stairs?
ERIC Educational Resources Information Center
Aksay, Ebubekir
2014-01-01
Objective: The aim of this study was to investigate the use of stairways/moving stairways in shopping malls and examine the extent to which simple warning signs determined whether people took the stairs. Design: Large posters that could readily be seen by mall visitors were situated between the stairs and moving stairways in shopping malls.…
Designing a Simple Apparatus for Measuring Kinematic Variables
ERIC Educational Resources Information Center
Temiz, Burak Kagan
2014-01-01
This study was conducted to develop a simple and inexpensive experimental apparatus that can measure the position of an object moving along a straight line at certain time intervals. For the construction of the apparatus, a battery-powered toy car, a fine-tipped paint brush, gouache (or watercolour) paint and paper tape were used. The working…
A Simple Apparatus for Demonstrating Fluid Forces and Newton's Third Law
ERIC Educational Resources Information Center
Mohazzabi, Pirooz; James, Mark C.
2012-01-01
Over 2200 years ago, in order to determine the purity of a golden crown of the king of Syracuse, Archimedes submerged the crown in water and determined its volume by measuring the volume of the displaced water. This simple experiment became the foundation of what eventually became known as Archimedes' principle: An object fully or partially…
Comparison of different objective functions for parameterization of simple respiration models
M.T. van Wijk; B. van Putten; D.Y. Hollinger; A.D. Richardson
2008-01-01
The eddy covariance measurements of carbon dioxide fluxes collected around the world offer a rich source for detailed data analysis. Simple, aggregated models are attractive tools for gap filling, budget calculation, and upscaling in space and time. Key in the application of these models is their parameterization and a robust estimate of the uncertainty and reliability...
Global ensemble texture representations are critical to rapid scene perception.
Brady, Timothy F; Shafer-Skelton, Anna; Alvarez, George A
2017-06-01
Traditionally, recognizing the objects within a scene has been treated as a prerequisite to recognizing the scene itself. However, research now suggests that the ability to rapidly recognize visual scenes could be supported by global properties of the scene itself rather than the objects within the scene. Here, we argue for a particular instantiation of this view: That scenes are recognized by treating them as a global texture and processing the pattern of orientations and spatial frequencies across different areas of the scene without recognizing any objects. To test this model, we asked whether there is a link between how proficient individuals are at rapid scene perception and how proficiently they represent simple spatial patterns of orientation information (global ensemble texture). We find a significant and selective correlation between these tasks, suggesting a link between scene perception and spatial ensemble tasks but not nonspatial summary statistics. In a second and third experiment, we additionally show that global ensemble texture information is not only associated with scene recognition, but that preserving only global ensemble texture information from scenes is sufficient to support rapid scene perception; however, preserving the same information is not sufficient for object recognition. Thus, global ensemble texture alone is sufficient to allow activation of scene representations but not object representations. Together, these results provide evidence for a view of scene recognition based on global ensemble texture rather than a view based purely on objects or on nonspatially localized global properties. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Souissi, Makram; Abedelmalek, Salma; Chtourou, Hamdi; Atheymen, Rim; Hakim, Ahmed; Sahnoun, Zouhair
2012-01-01
Purpose The purpose of the present study was to evaluate the ergogenic effect of caffeine ingestion on mood state, simple reaction time, and muscle power during the Wingate test recorded in the morning in elite judoists. Methods Twelve elite judoists (age: 21.08 ± 1.16 years, body mass: 83.75 ± 20.2 kg, height: 1.76 ± 6.57 m) participated in this study. Mood states, simple reaction time, and muscle power during the Wingate test were measured during two test sessions at 07:00 h after placebo or caffeine ingestion (i.e. 5 mg/kg). Plasma concentrations of caffeine were measured before (T0) and 1 h after caffeine ingestion (T1) and after the Wingate test (T3). Results Our results revealed an increase in anxiety and vigor (P<0.01), a reduction of the simple reaction time (P<0.001), and an improvement of the peak and mean powers during the Wingate test. However, the fatigue index during this test was unaffected by caffeine ingestion. In addition, the plasma concentration of caffeine was significantly higher at T1 in comparison with T0. Conclusions In conclusion, the results of this study suggest that morning caffeine ingestion has ergogenic properties with the potential to benefit performance, increase anxiety and vigor, and decrease the simple reaction time. PMID:23012635
WWWinda Orchestrator: a mechanism for coordinating distributed flocks of Java Applets
NASA Astrophysics Data System (ADS)
Gutfreund, Yechezkal-Shimon; Nicol, John R.
1997-01-01
The WWWinda Orchestrator is a simple but powerful tool for coordinating distributed Java applets. Loosely derived from the Linda programming language developed by David Gelernter and Nicholas Carriero of Yale, WWWinda implements a distributed shared object space called TupleSpace, where applets can post, read, or permanently store arbitrary Java objects. In this manner, applets can easily share information without being aware of the underlying communication mechanisms. WWWinda is very useful for orchestrating flocks of distributed Java applets. Coordination events can be posted to the WWWinda TupleSpace and used to orchestrate the actions of remote applets. Applets can easily share information via the TupleSpace. The technology combines several functions in one simple metaphor: distributed web objects, remote messaging between applets, distributed synchronization mechanisms, an object-oriented database, and a distributed event-signaling mechanism. WWWinda can be used as a platform for implementing shared VRML environments, shared groupware environments, controlling remote devices such as cameras, distributed karaoke, distributed gaming, and shared audio and video experiences.
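The Linda-style coordination model described above (post, read, or take objects from a shared TupleSpace) can be sketched in a few lines. This is a minimal illustration of the metaphor, not WWWinda's actual API: the classic Linda operation names `out`, `rd`, and `in_` are used, and `None` acts as a wildcard field when matching.

```python
import threading

class TupleSpace:
    """A minimal in-process tuple space in the Linda spirit."""
    def __init__(self):
        self._tuples = []
        self._cv = threading.Condition()

    def out(self, tup):
        """Post a tuple into the space and wake any waiting readers."""
        with self._cv:
            self._tuples.append(tup)
            self._cv.notify_all()

    def _match(self, pattern):
        # None in a pattern acts as a wildcard field.
        for t in self._tuples:
            if len(t) == len(pattern) and all(
                p is None or p == v for p, v in zip(pattern, t)
            ):
                return t
        return None

    def rd(self, pattern):
        """Read (non-destructively) a matching tuple, blocking until one exists."""
        with self._cv:
            while (t := self._match(pattern)) is None:
                self._cv.wait()
            return t

    def in_(self, pattern):
        """Take (remove) a matching tuple, blocking until one exists."""
        with self._cv:
            while (t := self._match(pattern)) is None:
                self._cv.wait()
            self._tuples.remove(t)
            return t

space = TupleSpace()
space.out(("camera", "pan", 45))
print(space.rd(("camera", None, None)))   # -> ('camera', 'pan', 45)
```

Decoupling producers from consumers this way is exactly what lets "flocks" of applets coordinate without knowing about each other or the underlying transport.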
Laparoscopic repair of perforated peptic ulcer: simple closure versus omentopexy.
Lin, Being-Chuan; Liao, Chien-Hung; Wang, Shang-Yu; Hwang, Tsann-Long
2017-12-01
This report presents our experience with laparoscopic repair performed in 118 consecutive patients diagnosed with a perforated peptic ulcer (PPU). We compared the surgical outcome of simple closure with modified Cellan-Jones omentopexy and report the safety and benefit of simple closure. From January 2010 to December 2014, 118 patients with PPU underwent laparoscopic repair with simple closure (n = 27) or omentopexy (n = 91). Charts were retrospectively reviewed for demographic characteristics and outcome. The data were compared by Fisher's exact test, the Mann-Whitney U test, Pearson's chi-square test, and the Kruskal-Wallis test. The results were considered statistically significant if P < 0.05. No patients died, while three had leakage. After matching, the simple closure and omentopexy groups were similar in sex, systolic blood pressure, pulse rate, respiratory rate, Boey score, Charlson comorbidity index, Mannheim peritonitis index, and leakage. There were statistically significant differences in age, length of hospital stay, perforation size, and operating time. Comparison of the operating time in the ≤4.0 mm and 5.0-12 mm groups revealed that simple closure took less time than omentopexy in both groups (≤4.0 mm, 76 versus 133 minutes, P < 0.0001; 5.0-12 mm, 97 versus 139.5 minutes, P = 0.006). Compared with omentopexy, laparoscopic simple closure is a safe procedure and shortens the operating time. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Marschollek, M; Nemitz, G; Gietzelt, M; Wolf, K H; Meyer Zu Schwabedissen, H; Haux, R
2009-08-01
Falls are among the predominant causes for morbidity and mortality in elderly persons and occur most often in geriatric clinics. Despite several studies that have identified parameters associated with elderly patients' fall risk, prediction models -- e.g., based on geriatric assessment data -- are currently not used on a regular basis. Furthermore, technical aids to objectively assess mobility-associated parameters are currently not used. To assess group differences in clinical as well as common geriatric assessment data and sensory gait measurements between fallers and non-fallers in a geriatric sample, and to derive and compare two prediction models based on assessment data alone (model #1) and added sensory measurement data (model #2). For a sample of n=110 geriatric in-patients (81 women, 29 men) the following fall risk-associated assessments were performed: Timed 'Up & Go' (TUG) test, STRATIFY score and Barthel index. During the TUG test the subjects wore a triaxial accelerometer, and sensory gait parameters were extracted from the data recorded. Group differences between fallers (n=26) and non-fallers (n=84) were compared using Student's t-test. Two classification tree prediction models were computed and compared. Significant differences between the two groups were found for the following parameters: time to complete the TUG test, transfer item (Barthel), recent falls (STRATIFY), pelvic sway while walking and step length. Prediction model #1 (using common assessment data only) showed a sensitivity of 38.5% and a specificity of 97.6%; prediction model #2 (assessment data plus sensory gait parameters) performed with 57.7% and 100%, respectively. Significant differences between fallers and non-fallers among geriatric in-patients can be detected for several assessment subscores as well as parameters recorded by simple accelerometric measurements during a common mobility test. Existing geriatric assessment data may be used for falls prediction on a regular basis. Adding sensory data improves the specificity of our test markedly.
Development of a Three-Tier Test to Assess Misconceptions about Simple Electric Circuits
ERIC Educational Resources Information Center
Pesman, Haki; Eryilmaz, Ali
2010-01-01
The authors aimed to propose a valid and reliable diagnostic instrument by developing a three-tier test on simple electric circuits. Based on findings from the interviews, open-ended questions, and the related literature, the test was developed and administered to 124 high school students. In addition to some qualitative techniques for…
Flight Research into Simple Adaptive Control on the NASA FAST Aircraft
NASA Technical Reports Server (NTRS)
Hanson, Curtis E.
2011-01-01
A series of simple adaptive controllers with varying levels of complexity were designed, implemented and flight tested on the NASA Full-Scale Advanced Systems Testbed (FAST) aircraft. Lessons learned from the development and flight testing are presented.
Spiers, Adam J; Liarokapis, Minas V; Calli, Berk; Dollar, Aaron M
2016-01-01
Classical robotic approaches to tactile object identification often involve rigid mechanical grippers, dense sensor arrays, and exploratory procedures (EPs). Though EPs are a natural method for humans to acquire object information, evidence also exists for meaningful tactile property inference from brief, non-exploratory motions (a 'haptic glance'). In this work, we implement tactile object identification and feature extraction techniques on data acquired during a single, unplanned grasp with a simple, underactuated robot hand equipped with inexpensive barometric pressure sensors. Our methodology utilizes two cooperating schemes based on an advanced machine learning technique (random forests) and parametric methods that estimate object properties. The available data is limited to actuator positions (one per two-link finger) and force sensor values (eight per finger). The schemes are able to work both independently and collaboratively, depending on the task scenario. When collaborating, the results of each method contribute to the other, improving the overall result in a synergistic fashion. Unlike prior work, the proposed approach does not require object exploration, re-grasping, grasp-release, or force modulation and works for arbitrary object start positions and orientations. Due to these factors, the technique may be integrated into practical robotic grasping scenarios without adding time or manipulation overheads.
Profile fitting in crowded astronomical images
NASA Astrophysics Data System (ADS)
Manish, Raja
Around 18,000 known objects currently populate the near-Earth space. These constitute active space assets as well as space debris objects. The tracking and cataloging of such objects relies on observations, most of which are ground based. Also, because of the great distance to the objects, only non-resolved object images can be obtained from the observations. Optical systems consist of telescope optics and a detector. Nowadays, usually CCD detectors are used. The information to be extracted from the frames is each object's astrometric position. In order to do so, the center of the object's image on the CCD frame has to be found. However, the observation frames that are read out of the detector are subject to noise. There are three different sources of noise: celestial background sources, the object signal itself, and the sensor noise. The noise statistics are usually modeled as Gaussian or Poisson distributed, or as their combined distribution. In order to achieve near real-time processing, computationally fast and reliable methods for the so-called centroiding are desired; analytical methods are preferred over numerical ones of comparable accuracy. In this work, an analytic method for centroiding is investigated and compared to numerical methods. Though the work focuses mainly on astronomical images, the same principle could be applied to non-celestial images containing similar data. The method is based on minimizing the weighted least-squares (LS) error between the observed data and the theoretical model of point sources in a novel yet simple way. Synthetic image frames have been simulated. The newly developed method is tested in both crowded and non-crowded fields, where the former needs additional image-handling procedures to separate closely packed objects. Subsequent analysis of real celestial images corroborates the effectiveness of the approach.
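The simplest closed-form centroiding step can be sketched on a synthetic frame. Note this is a background-subtracted intensity-weighted moment estimate, an assumed stand-in for illustration only, not the thesis's weighted least-squares formulation (which additionally fits a point-source model).

```python
import numpy as np

def centroid(frame, background=0.0):
    """Return the (x, y) intensity-weighted centroid of a 2D frame."""
    img = np.clip(np.asarray(frame, float) - background, 0.0, None)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total

# Synthetic non-resolved point source: a Gaussian PSF centred at
# (12.3, 7.6) on a 24x16 pixel frame with sigma = 1.5 px.
ys, xs = np.indices((16, 24))
psf = 100.0 * np.exp(-((xs - 12.3) ** 2 + (ys - 7.6) ** 2) / (2 * 1.5 ** 2))
x0, y0 = centroid(psf)
print(round(x0, 2), round(y0, 2))
```

On a clean frame the moment estimate recovers the sub-pixel centre; with background noise or overlapping objects (the crowded-field case above) a fitted PSF model becomes necessary.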
Zanghi, Brian M; Araujo, Joseph; Milgram, Norton W
2015-05-01
Cognition in dogs, like in humans, is not a unitary process. Some functions, such as simple discrimination learning, are relatively insensitive to age; others, such as visuospatial learning can provide behavioral biomarkers of age. The present experiment sought to further establish the relationship between various cognitive domains, namely visuospatial memory, object discrimination learning (ODL), and selective attention (SA). In addition, we also set up a task to assess motor learning (ML). Thirty-six beagles (9-16 years) performed a variable delay non-matching to position (vDNMP) task using two objects with 20- and 90-s delay and were divided into three groups based on a combined score (HMP = 88-93 % accuracy [N = 12]; MMP = 79-86 % accuracy [N = 12]; LMP = 61-78 % accuracy [N = 12]). Variable object oddity task was used to measure ODL (correct or incorrect object) and SA (0-3 incorrect distractor objects with same [SA-same] or different [SA-diff] correct object as ODL). ML involved reaching various distances (0-15 cm). Age did not differ between memory groups (mean 11.6 years). ODL (ANOVA P = 0.43), or SA-same and SA-different (ANOVA P = 0.96), performance did not differ between the three vDNMP groups, although mean errors during ODL was numerically higher for LMP dogs. Errors increased (P < 0.001) for all dogs with increasing number of distractor objects during both SA tasks. vDNMP groups remained different (ANOVA P < 0.001) when re-tested with vDNMP task 42 days later. Maximum ML distance did not differ between vDNMP groups (ANOVA P = 0.96). Impaired short-term memory performance in aged dogs does not appear to predict performance of cognitive domains associated with object learning, SA, or maximum ML distance.
Black swans or dragon-kings? A simple test for deviations from the power law
NASA Astrophysics Data System (ADS)
Janczura, J.; Weron, R.
2012-05-01
We develop a simple test for deviations from power-law tails - in fact, from the tails of any distribution. We use this test - which is based on the asymptotic properties of the empirical distribution function - to answer the question of whether great natural disasters, financial crashes or electricity price spikes should be classified as dragon-kings or 'only' as black swans.
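The flavour of such an EDF-based test can be sketched as follows. This is an illustrative statistic only: the Hill-estimator fit and the deviation measure are assumptions for the example, not the authors' exact test.

```python
import numpy as np

def tail_deviation(sample, tail_frac=0.1):
    """Largest absolute deviation between the empirical tail survival
    function and a power law fitted to the upper tail (Hill estimator).
    Illustrative statistic, not the authors' exact test."""
    x = np.sort(np.asarray(sample, dtype=float))
    k = max(int(tail_frac * len(x)), 10)
    tail = x[-k:]                            # upper-tail order statistics
    xmin = tail[0]
    alpha = k / np.log(tail / xmin).sum()    # Hill estimator
    emp_sf = np.arange(k, 0, -1) / k         # empirical survival function
    fit_sf = (tail / xmin) ** (-alpha)       # fitted power-law survival
    return float(np.abs(emp_sf - fit_sf).max())

rng = np.random.default_rng(1)
# Genuine power-law tail (Pareto, alpha = 2): small deviation expected.
dev_power = tail_deviation(rng.pareto(2.0, 50_000) + 1.0)
# Exponential tail decays faster than any fitted power law: larger deviation.
dev_exp = tail_deviation(rng.exponential(1.0, 50_000))
```

A real test would compare such a deviation statistic against its asymptotic distribution under the power-law null rather than eyeballing the two values.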
Optical 3D-coordinate measuring system using structured light
NASA Astrophysics Data System (ADS)
Schreiber, Wolfgang; Notni, Gunther; Kuehmstedt, Peter; Gerber, Joerg; Kowarschik, Richard M.
1996-09-01
The paper is aimed at the description of an optical shape measuring technique based on a consistent principle using the fringe projection technique. We demonstrate a real 3D-coordinate measuring system where the scale of coordinates is given only by the illumination structures. This method has the advantages that the aberration of the observing system and the depth-dependent imaging scale have no influence on the measuring accuracy and, moreover, the measurements are independent of the position of the camera with respect to the object under test. Furthermore, it is shown that the influence of specular effects of the surface on the measuring result can be eliminated. Moreover, we developed a very simple algorithm to calibrate the measuring system. The measuring examples show that a measuring accuracy of 10^-4 (i.e. 10 micrometers) within an object volume of 100 × 100 × 70 mm³ is achievable. Furthermore, it is demonstrated that the set of coordinate values can be processed in CNC and CAD systems.
Attracted by a magnet: Exploration behaviour of rodents in the presence of magnetic objects.
Malewski, Sandra; Malkemper, E Pascal; Sedláček, František; Šumbera, Radim; Caspar, Kai R; Burda, Hynek; Begall, Sabine
2018-06-01
Magnetosensitivity is widespread among animals, with rodents being the most intensively studied mammalian group. The available behavioural assays for magnetoreception are time-consuming, which impedes screens for treatment effects that could characterize the enigmatic magnetoreceptors. Here, we present a fast and simple approach to test if an animal responds to magnetic stimuli: the magnetic object assay (MOA). The MOA focuses on investigating an animal's spontaneous exploration behaviour in the presence of a bar magnet compared to a demagnetised control. We found consistently longer exploration of the magnet in three different rodent species: Ansell's mole-rat (Fukomys anselli), the C57BL/6J laboratory mouse, and the naked mole-rat (Heterocephalus glaber). For the naked mole-rat, this is the first report that the species reacts to magnetic stimuli. We conclude that the MOA holds the potential to screen whether an animal responds to magnetic stimuli, indicating the possession of a magnetic sense. Copyright © 2018 Elsevier B.V. All rights reserved.
Underscreening in concentrated electrolytes.
Lee, Alpha A; Perez-Martinez, Carla S; Smith, Alexander M; Perkin, Susan
2017-07-01
Screening of a surface charge by an electrolyte and the resulting interaction energy between charged objects is of fundamental importance in scenarios from bio-molecular interactions to energy storage. The conventional wisdom is that the interaction energy decays exponentially with object separation and the decay length is a decreasing function of ion concentration; the interaction is thus negligible in a concentrated electrolyte. Contrary to this conventional wisdom, we have shown by surface force measurements that the decay length is an increasing function of ion concentration and Bjerrum length for concentrated electrolytes. In this paper we report surface force measurements to test directly the scaling of the screening length with Bjerrum length. Furthermore, we identify a relationship between the concentration dependence of this screening length and empirical measurements of activity coefficient and differential capacitance. The dependence of the screening length on the ion concentration and the Bjerrum length can be explained by a simple scaling conjecture based on the physical intuition that solvent molecules, rather than ions, are charge carriers in a concentrated electrolyte.
Intervals for posttest probabilities: a comparison of 5 methods.
Mossman, D; Berger, J O
2001-01-01
Several medical articles discuss methods of constructing confidence intervals for single proportions and the likelihood ratio, but scant attention has been given to the systematic study of intervals for the posterior odds, or the positive predictive value, of a test. The authors describe 5 methods of constructing confidence intervals for posttest probabilities when estimates of sensitivity, specificity, and the pretest probability of a disorder are derived from empirical data. They then evaluate each method to determine how well the intervals' coverage properties correspond to their nominal value. When the estimates of pretest probabilities, sensitivity, and specificity are derived from more than 80 subjects and are not close to 0 or 1, all methods generate intervals with appropriate coverage properties. When these conditions are not met, however, the best-performing method is an objective Bayesian approach implemented by a simple simulation using a spreadsheet. Physicians and investigators can generate accurate confidence intervals for posttest probabilities in small-sample situations using the objective Bayesian approach.
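The simulation approach the authors describe can be sketched in a few lines. This is a hedged illustration of the general idea: the Jeffreys Beta(0.5, 0.5) priors and the example counts are assumptions for the sketch, not the paper's exact recipe.

```python
import numpy as np

def posttest_interval(tp, fn, tn, fp, d_pos, d_neg, n_sim=100_000, seed=0):
    """Simulation-based interval for the positive predictive value.
    tp/fn: diseased subjects testing positive/negative;
    tn/fp: healthy subjects testing negative/positive;
    d_pos/d_neg: diseased/healthy counts in the prevalence sample.
    Jeffreys Beta(0.5, 0.5) priors are an illustrative assumption."""
    rng = np.random.default_rng(seed)
    sens = rng.beta(tp + 0.5, fn + 0.5, n_sim)   # posterior draws: sensitivity
    spec = rng.beta(tn + 0.5, fp + 0.5, n_sim)   # specificity
    prev = rng.beta(d_pos + 0.5, d_neg + 0.5, n_sim)  # pretest probability
    # Bayes' theorem applied draw by draw gives a PPV distribution.
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    return np.percentile(ppv, [2.5, 97.5])

# Hypothetical example counts: sens ~ 0.90, spec ~ 0.90, prevalence ~ 0.20.
lo, hi = posttest_interval(tp=45, fn=5, tn=90, fp=10, d_pos=20, d_neg=80)
```

Each Monte Carlo draw propagates the sampling uncertainty in all three inputs through Bayes' theorem, which is exactly what a spreadsheet implementation of this scheme would do cell by cell.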
Coons, David A; Barber, F Alan; Herbert, Morley A
2006-11-01
This study evaluated the strength and suture-tendon interface security of different suture configurations from triple-suture-loaded anchors. A juvenile bovine infraspinatus tendon was detached and repaired by use of 4 different suture combinations from 2 suture anchors: 3 simple sutures in each anchor (ThreeVo anchor; Linvatec, Largo, FL); 2 peripheral simple stitches and 1 central horizontal mattress suture passed deeper into the tendon, creating a larger footprint (bigfoot-print anchor); 2 peripheral simple stitches with 1 central horizontal mattress stitch passed through the same holes as the simple sutures (stitch-of-Burns); and 2 simple stitches (TwoVo anchor; Linvatec). The constructs were cyclically loaded between 10 N and 180 N for 3,500 cycles and then destructively tested. The number of cycles required to create a 5-mm gap and a 10-mm gap, the ultimate load to failure, and the failure mode were recorded. The ThreeVo anchor was strongest and most resistant to cyclic loading (P < .01). The TwoVo anchor was least resistant to cyclic loading. The stitch-of-Burns anchor was more resistant to cyclic loading than both the bigfoot-print anchor and the TwoVo anchor (P < .03). The ThreeVo, stitch-of-Burns, and TwoVo anchors were stronger than the bigfoot-print anchor (P < .05). Three simple sutures in an anchor hold better than two simple sutures, and provide greater suture-tendon security than combinations of one mattress and two simple stitches subjected to cyclic loading. A central mattress stitch placed more medially than two peripheral simple stitches (bigfoot-print anchor), configured to enlarge the tendon-suture footprint, was not as resistant to cyclic loading or destructive testing as three simple stitches (ThreeVo anchor).
Sound effects: Multimodal input helps infants find displaced objects.
Shinskey, Jeanne L
2017-09-01
Before 9 months, infants use sound to retrieve a stationary object hidden by darkness but not one hidden by occlusion, suggesting auditory input is more salient in the absence of visual input. This article addresses how audiovisual input affects 10-month-olds' search for displaced objects. In AB tasks, infants who previously retrieved an object at A subsequently fail to find it after it is displaced to B, especially following a delay between hiding and retrieval. Experiment 1 manipulated auditory input by keeping the hidden object audible versus silent, and visual input by presenting the delay in the light versus dark. Infants succeeded more at B with audible than silent objects and, unexpectedly, more after delays in the light than dark. Experiment 2 presented both the delay and search phases in darkness. The unexpected light-dark difference disappeared. Across experiments, the presence of auditory input helped infants find displaced objects, whereas the absence of visual input did not. Sound might help by strengthening object representation, reducing memory load, or focusing attention. This work provides new evidence on when bimodal input aids object processing, corroborates claims that audiovisual processing improves over the first year of life, and contributes to multisensory approaches to studying cognition. Statement of contribution. What is already known on this subject: Before 9 months, infants use sound to retrieve a stationary object hidden by darkness but not one hidden by occlusion. This suggests they find auditory input more salient in the absence of visual input in simple search tasks. After 9 months, infants' object processing appears more sensitive to multimodal (e.g., audiovisual) input. What does this study add? This study tested how audiovisual input affects 10-month-olds' search for an object displaced in an AB task. Sound helped infants find displaced objects in both the presence and absence of visual input.
Object processing becomes more sensitive to bimodal input as multisensory functions develop across the first year. © 2016 The British Psychological Society.
NASA Astrophysics Data System (ADS)
Kan, Brandon K.
A pulsed detonation rocket engine concept was explored through the use of hypergolic propellants in a fuel-centered pintle injector combustor. The combustor design yielded a simple open-ended chamber with a pintle-type injection element and pressure instrumentation. High-frequency pressure measurements from the first test series showed the presence of large pressure oscillations in excess of 2000 psia at frequencies between 400-600 Hz during operation. High-speed video confirmed the high-frequency pulsed behavior and large amounts of afterburning. Damaged hardware and instrumentation failure limited the amount of data gathered in the first test series, but the experiments met the original test objectives of producing large over-pressures in an open chamber. A second test series proceeded by replacing hardware and instrumentation, and new data showed that pulsed events produced underexpanded exhaust prior to pulsing, peak pressures around 8000 psi, and operating frequencies between 400-800 Hz. Later hot-fires produced no pulsed behavior despite undamaged hardware. The research succeeded in producing pulsed combustion behavior using hypergolic fuels in a pintle injector setup and provided insights into design concepts that will assist future injector designs and experimental test setups.
NASA Astrophysics Data System (ADS)
Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.
2016-05-01
The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.
Simple new test for rapid differentiation of Prototheca wickerhamii from Prototheca zopfii.
Casal, M J; Gutierrez, J
1983-01-01
A simple new test to differentiate Prototheca wickerhamii from Prototheca zopfii by determining susceptibility to clotrimazole is described. A 50-microgram clotrimazole disk provides a rapid and reliable means of distinguishing P. wickerhamii from P. zopfii. PMID:6630477
On the Essence of the Mind and the Object of Psychology
1960-07-26
a simple reflectional process. At one time, Wundt, unable to discriminate between the object of psychology and the object of physiology, introduced the new term "physiological psychology." As the objects of this science Wundt lists those vital processes which have an external as well as an ... physiology alone." According to Wundt, perception represents, on the one hand, only a psychological fact and, on the other hand, only a physiological act. It
A recipe for echoes from exotic compact objects
NASA Astrophysics Data System (ADS)
Mark, Zachary; Zimmerman, Aaron; Du, Song Ming; Chen, Yanbei
2017-10-01
Gravitational wave astronomy provides an unprecedented opportunity to test the nature of black holes and search for exotic, compact alternatives. Recent studies have shown that exotic compact objects (ECOs) can ring down in a manner similar to black holes, but can also produce a sequence of distinct pulses resembling the initial ringdown. These "echoes" would provide definite evidence for the existence of ECOs. In this work we study the generation of these echoes in a generic, parametrized model for the ECO, using Green's functions. We show how to reprocess radiation in the near-horizon region of a Schwarzschild black hole into the asymptotic radiation from the corresponding source in an ECO spacetime. Our methods allow us to understand the connection between distinct echoes and ringing at the resonant frequencies of the compact object. We find that the quasinormal mode ringing in the black hole spacetime plays a central role in determining the shape of the first few echoes. We use this observation to develop a simple template for echo waveforms. This template performs well over a variety of ECO parameters, and with improvements may prove useful in the analysis of gravitational waves.
NASA Astrophysics Data System (ADS)
Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang
2016-07-01
This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
Dypas: A dynamic payload scheduler for shuttle missions
NASA Technical Reports Server (NTRS)
Davis, Stephen
1988-01-01
Decision and analysis systems have had broad and very practical applications in the human decision making process. These software systems range from the help sections in simple accounting packages to the more complex computer configuration programs. Dypas is a decision and analysis system that aids prelaunch shuttle scheduling, and has added functionality to aid the rescheduling done in flight. Dypas is written in Common Lisp on a Symbolics Lisp machine. Dypas differs from other scheduling programs in that it can draw its knowledge from different rule bases and apply them to different rule interpretation schemes. The system has been coded with Flavors, an object-oriented extension to Common Lisp on the Symbolics hardware. This allows implementation of objects (experiments) to better match the problem definition, and allows a more coherent solution space to be developed. Dypas was originally developed to test a programmer's aptitude toward Common Lisp and the Symbolics software environment. Since then the system has grown into a large software effort involving several programmers and researchers. Dypas currently uses two expert systems and three inferencing procedures to generate a schedule for many objects. The paper reviews the abilities of Dypas and comments on its functionality.
Metallurgical characterization of brass objects from the Akko 1 shipwreck, Israel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashkenazi, D., E-mail: dana@eng.tau.ac.il; Cvikel, D.; Stern, A.
2014-06-01
The Akko 1 shipwreck was a small Egyptian armed vessel or auxiliary naval brig built in the eastern Mediterranean at the beginning of the 19th century. During the underwater excavations, about 230 brass hook-and-eye closures were found, mainly in the bow area. In addition, 158 brass cases were found, mainly between midships and the aft extremity of the shipwreck. Metallurgical non-destructive and destructive characterizations of selected items were performed, including radiographic testing, XRF, lead isotope analysis, optical microscopy, SEM–EDS and microhardness tests. The hook-and-eye closures and the cases were both found to be made of binary copper–zinc alloy (about 30 wt.% zinc). While the brass cases were made from rolled sheets, hand-made using simple tools, and joined by tin–lead soldering material, the brass hook-and-eye closures were hand-made from drawn brass wire, and manufactured from commercial drawn brass bars by a cold-working process. The lead isotope analyses suggest different provenances of the raw materials used for making the brass objects; thus the different origins of the ores may hint that the brass wire and sheet were imported to the workshops in which the objects were manufactured. - Highlights: • Brass cases and hook-and-eye closures were retrieved from the Akko 1 shipwreck. • Both types of objects were made of binary copper–zinc alloy (about 30 wt.% zinc). • The cases were hand-made from rolled sheets and joined by tin–lead soldering. • Hook-and-eye closures were made from drawn brass wire manufactured by cold-working. • Lead isotope analyses suggest that the origins of the raw material were diverse.
Knowledge base rule partitioning design for CLIPS
NASA Technical Reports Server (NTRS)
Mainardi, Joseph D.; Szatkowski, G. P.
1990-01-01
This paper describes a knowledge base (KB) partitioning approach to solve the problem of real-time performance using the CLIPS AI shell when containing large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The Expert System advanced development project (ADP-2302) main objective is to provide robust systems responding to new data frames at 0.1 to 1.0 second intervals. The intelligent system control must be performed within the specified real-time window in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance without undue impact during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification and validation (V&V) task. Partitioning the KB reduces rule interaction complexity overall. Reduced interaction simplifies the V&V testing necessary by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading rules that directly apply for that condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts, while allowing real-time operations to run freely. Timing tests of real-time performance for specific machines under real-time operating systems have not been completed but are planned as part of the analysis process to validate the design.
Peter Bedker; Joseph O' Brien; Manfred Mielke
2012-01-01
The objective of pruning is to produce strong, healthy, attractive plants. By understanding how, when and why to prune, and by following a few simple principles, this objective can be achieved. (How to Prune Trees, Revised 2012; published by the Agriculture Department, Forest Service, Northeastern Area State and Private Forestry.)
Museum Theatre: Telling Stories through Objects.
ERIC Educational Resources Information Center
Schindel, Dorothy Napp
2002-01-01
Explains that Museum Theatre's goal is to teach through drama by using experiential interpretive strategies that bypass the lecture format. Outlines a production of Museum Theatre which helped a museum redefine itself. Concludes that Museum Theatre helps shift the focus of programming from simple object display to an emphasis on the human…
Clustering approaches to feature change detection
NASA Astrophysics Data System (ADS)
G-Michael, Tesfaye; Gunzburger, Max; Peterson, Janet
2018-05-01
The automated detection of changes occurring between multi-temporal images is of significant importance in a wide range of medical, environmental, safety, as well as many other settings. The usage of k-means clustering is explored as a means for detecting objects added to a scene. The silhouette score for the clustering is used to define the optimal number of clusters that should be used. For simple images having a limited number of colors, new objects can be detected by examining the change between the optimal number of clusters for the original and modified images. For more complex images, new objects may need to be identified by examining the relative areas covered by corresponding clusters in the original and modified images. Which method is preferable depends on the composition and range of colors present in the images. In addition to describing the clustering and change detection methodology of our proposed approach, we provide some simple illustrations of its application.
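The described procedure can be sketched in a minimal form. The k-means and silhouette routines below are pure-NumPy stand-ins for library implementations, and the synthetic colour "scenes" are illustrative assumptions: the modified scene adds a new colour cluster, so the silhouette-optimal cluster count shifts.

```python
import numpy as np

def kmeans(X, k, iters=25, n_init=30, seed=0):
    """Plain Lloyd k-means with random restarts (stand-in for a
    library implementation)."""
    rng = np.random.default_rng(seed)
    best, best_inertia = None, np.inf
    for _ in range(n_init):
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            d = ((X[:, None] - centers[None]) ** 2).sum(axis=2)
            labels = d.argmin(axis=1)
            centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        inertia = d.min(axis=1).sum()
        if inertia < best_inertia:
            best, best_inertia = labels, inertia
    return best

def mean_silhouette(X, labels):
    """Mean silhouette score computed directly from its definition."""
    n = len(X)
    d = np.sqrt(((X[:, None] - X[None]) ** 2).sum(axis=2))
    idx = np.arange(n)
    s = []
    for i in range(n):
        own = labels == labels[i]
        if own.sum() == 1:
            s.append(0.0)            # singleton cluster: silhouette 0
            continue
        a = d[i, own & (idx != i)].mean()
        b = min(d[i, labels == c].mean()
                for c in np.unique(labels) if c != labels[i])
        s.append((b - a) / max(a, b))
    return float(np.mean(s))

def optimal_k(X, k_max=6):
    """Cluster count with the best mean silhouette score."""
    scores = {k: mean_silhouette(X, kmeans(X, k)) for k in range(2, k_max + 1)}
    return max(scores, key=scores.get)

# "Original" scene: two colour clusters. "Modified" scene: a new object
# adds a third cluster, shifting the optimal cluster count from 2 to 3.
rng = np.random.default_rng(2)
two = np.vstack([rng.normal(m, 0.3, (60, 3))
                 for m in [(0, 0, 0), (5, 5, 5)]])
three = np.vstack([two, rng.normal((10, 0, 0), 0.3, (60, 3))])
```

Comparing `optimal_k` between the original and modified scenes implements the first of the two detection strategies the abstract describes; comparing per-cluster areas would handle the more complex case.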
Computer memory management system
Kirk, III, Whitson John
2002-01-01
A computer memory management system utilizing a memory structure system of "intelligent" pointers in which information related to the use status of the memory structure is designed into the pointer. Through this pointer system, the present invention provides essentially automatic memory management (often referred to as garbage collection) by allowing relationships between objects to have definite memory management behavior through a coding protocol which describes when relationships should be maintained and when they should be broken. In one aspect, the system allows automatic breaking of strong links to facilitate object garbage collection, coupled with relationship adjectives which define deletion of associated objects. In another aspect, the present invention includes simple-to-use infinite undo/redo functionality, in that it has the capability, through a simple function call, to undo all of the changes made to a data model since the previous 'valid state' was noted.
NASA Astrophysics Data System (ADS)
Yu, Liping; Pan, Bing
2017-08-01
Full-frame, high-speed 3D shape and deformation measurement using the stereo-digital image correlation (stereo-DIC) technique and a single high-speed color camera is proposed. With the aid of a skillfully designed pseudo stereo-imaging apparatus, color images of a test object surface, composed of blue and red channel images from two different optical paths, are recorded by a high-speed color CMOS camera. The recorded color images can be separated into red and blue channel sub-images using a simple but effective color crosstalk correction method. These separated blue and red channel sub-images are processed by the regular stereo-DIC method to retrieve full-field 3D shape and deformation on the test object surface. Compared with existing two-camera high-speed stereo-DIC or four-mirror-adapter-assisted single-camera high-speed stereo-DIC, the proposed single-camera high-speed stereo-DIC technique offers the prominent advantage of full-frame measurements using a single high-speed camera without sacrificing its spatial resolution. Two real experiments, including shape measurement of a curved surface and vibration measurement of a Chinese double-side drum, demonstrated the effectiveness and accuracy of the proposed technique.
Foreground extraction for moving RGBD cameras
NASA Astrophysics Data System (ADS)
Junejo, Imran N.; Ahmed, Naveed
2017-02-01
In this paper, we propose a simple method to perform foreground extraction for a moving RGBD camera. These cameras have now been available for quite some time. Their popularity is primarily due to their low cost and ease of availability. Although the field of foreground extraction or background subtraction has been explored by computer vision researchers for a long time, depth-based subtraction is relatively new and has not been extensively addressed as of yet. Most of the current methods make heavy use of geometric reconstruction, making the solutions quite restrictive. In this paper, we make novel use of RGB and RGBD data: from the RGB frame, we extract corner features (FAST) and then represent these features with the histogram of oriented gradients (HoG) descriptor. We train a non-linear SVM on these descriptors. During the test phase, we make use of the fact that the foreground object has distinct depth ordering with respect to the rest of the scene. That is, we use the positively classified FAST features on the test frame to initiate a region growing to obtain an accurate segmentation of the foreground object from just the RGBD data. We demonstrate the proposed method on synthetic datasets, with encouraging quantitative and qualitative results.
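The depth-ordering step can be sketched as a simple region growing on the depth channel. This is an illustrative toy: the FAST/HoG/SVM classification stages are replaced here by a hand-picked seed pixel, and the depth tolerance is an assumed parameter.

```python
import numpy as np
from collections import deque

def grow_foreground(depth, seeds, tol=0.1):
    """Flood-fill from seed pixels across 4-connected neighbours whose
    depth differs by less than `tol`, exploiting the foreground object's
    distinct depth ordering (illustrative sketch of the idea)."""
    h, w = depth.shape
    mask = np.zeros((h, w), dtype=bool)
    q = deque(seeds)
    for y, x in seeds:
        mask[y, x] = True
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(depth[ny, nx] - depth[y, x]) < tol):
                mask[ny, nx] = True
                q.append((ny, nx))
    return mask

# Synthetic scene: background at depth 5.0 m, a foreground box at 2.0 m.
depth = np.full((40, 40), 5.0)
depth[10:25, 8:20] = 2.0            # 15 x 12 pixel foreground object
mask = grow_foreground(depth, seeds=[(15, 12)])
```

The depth discontinuity at the object boundary stops the growth, so the mask covers exactly the foreground region regardless of its colour or texture.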
The impact of group therapy training on social communications of Afghan immigrants
Mehrabi, Tayebeh; Musavi, Tayebeh; Ghazavi, Zahra; Zandieh, Zahra; Zamani, Ahmadreza
2011-01-01
BACKGROUND: Mental training considers sharing of mental health care information as the primary objective. The secondary objectives include facilitating dialogue about feelings such as isolation, sadness, labeling and loneliness, and possible strategies for confronting these feelings. Group therapy trainings have a supportive function in accepting the environment so that the members are able to be part of the indigenous groups. However, no study has ever been done on the impact of this educational method on the communication problems of this group. This study aimed to determine the impact of group therapy training on the communication problems of Afghan immigrants. METHODS: This was a clinical trial study. Eighty-eight Afghan men were investigated. Simple random sampling was used. Thereafter, the study subjects were divided randomly into test and control groups based on the inclusion criteria. The data collection tool was a self-made questionnaire about social problems. For analyzing the data, SPSS software with independent and paired t-tests was used. RESULTS: The data indicated a lower mean score of social problems in social communication after implementing the group therapy training than before. The paired t-test showed a significant difference between mean scores of social communication problems before and after the implementation of group therapy training. CONCLUSIONS: Given the effectiveness of the intervention, group therapy training on social problems in social communication of Afghan immigrants is recommended. This program should be part of continuous education and training of the Afghan immigrants. PMID:22224098
Fischer, K; Colombani, P C; Langhans, W; Wenk, C
2001-03-01
The effect of carbohydrate, protein and fat ingestion on simple as well as complex cognitive functions and the relationship between the respective postprandial metabolic changes and changes in cognitive performance were studied in fifteen healthy male students. Subjects were tested in three sessions, separated by 1 week, for short-term changes in blood variables, indirect calorimetry, subjective performance and different objective performance tasks using a repeated-measures counterbalanced cross-over design. Measurements were made after an overnight fast before and hourly during 3 h after test meal ingestion. Test meals consisted of either pure carbohydrates, protein or fat and were served as isoenergetic (1670 kJ) spoonable creams with similar sensory properties. Most aspects of subjective performance did not differ between test meals. For all objective tasks, however, postprandial cognitive performance was best after fat ingestion concomitant with an almost constant glucose metabolism and constant metabolic activation state measured by glucagon:insulin (G:I). In contrast, carbohydrate as well as protein ingestion resulted in lower overall cognitive performance, both together with partly marked changes in glucose metabolism and metabolic activation. They also differently affected specific cognitive functions in relation to their specific effect on metabolism. Carbohydrate ingestion resulted in relatively better short-term memory and accuracy of tasks concomitant with low metabolic activation, whereas protein ingestion resulted in better attention and efficiency of tasks concomitant with higher metabolic activation. Our findings support the concept that good and stable cognitive performance is related to a balanced glucose metabolism and metabolic activation state.
Experimental Verification of Electric Drive Technologies Based on Artificial Intelligence Tools
NASA Technical Reports Server (NTRS)
Rubaai, Ahmed; Kankam, David (Technical Monitor)
2003-01-01
A laboratory implementation of a fuzzy logic tracking controller using a low cost Motorola MC68HC11E9 microprocessor is described in this report. The objective is to design an optimal yet practical controller that can be implemented and marketed, and which gives respectable performance even when the system loads, inertia and parameters are varying. A distinguishing feature of this work is the by-product goal of developing a marketable, simple, functional and low cost controller. Additionally, real-time nonlinearities are not ignored, and a mathematical model is not required. A number of components have been designed, built and tested individually, and in various combinations of hardware and software segments. These components have been integrated with a brushless motor to constitute the drive system. A microprocessor-based fuzzy logic controller (FLC) is incorporated to provide robust speed and position control. Design objectives that are difficult to express mathematically can be easily incorporated in a fuzzy logic-based controller through linguistic information (in the form of fuzzy IF-THEN rules). The theory and design are tested in the laboratory using a hardware setup. Several test cases have been conducted to confirm the effectiveness of the proposed controller. The results indicate excellent tracking performance for both speed and position trajectories. For the purpose of comparison, a bang-bang controller has been tested. The fuzzy logic controller performs significantly better than the traditional bang-bang controller, which has been shown to be relatively inaccurate and lacking in robustness. A description of the implementation hardware system is also given.
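A fuzzy IF-THEN controller of the kind described can be sketched as follows. The triangular membership functions, the three linguistic sets, and the rule table are illustrative assumptions, not the paper's tuned design.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to a peak at b
    and falling back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error, d_error):
    """Minimal fuzzy IF-THEN controller sketch: fuzzify the tracking
    error and its rate, fire a 3x3 rule table with min as AND, and
    defuzzify by weighted average. Returns a value in [-1, 1]."""
    # Fuzzify: membership in Negative / Zero / Positive linguistic sets.
    sets = lambda x: {'N': tri(x, -2, -1, 0), 'Z': tri(x, -1, 0, 1),
                      'P': tri(x, 0, 1, 2)}
    e, de = sets(error), sets(d_error)
    # Rule table: IF error is X AND d_error is Y THEN output is table[X, Y].
    out = {'N': -1.0, 'Z': 0.0, 'P': 1.0}
    table = {('N', 'N'): 'N', ('N', 'Z'): 'N', ('N', 'P'): 'Z',
             ('Z', 'N'): 'N', ('Z', 'Z'): 'Z', ('Z', 'P'): 'P',
             ('P', 'N'): 'Z', ('P', 'Z'): 'P', ('P', 'P'): 'P'}
    num = den = 0.0
    for (ei, dei), o in table.items():
        w = min(e[ei], de[dei])          # AND via min
        num += w * out[o]
        den += w
    return num / den if den else 0.0
```

With a large positive error and no error rate the controller commands full positive action, and it interpolates smoothly in between, which is the behaviour a bang-bang controller cannot provide.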
Noer, Christina Lehmkuhl; Needham, Esther Kjær; Wiese, Ann-Sophie; Balsby, Thorsten Johannes Skovbjerg; Dabelsteen, Torben
2015-01-01
Animal personality research is receiving increasing interest from related fields, such as evolutionary personality psychology. By merging the conceptual understanding of personality, the contributions to both fields of research may be enhanced. In this study, we investigate animal personality based on the definition of personality traits as underlying dispositional factors, which are not directly measurable, but which predispose individuals to react through different behavioural patterns. We investigated the shyness-boldness continuum reflected in the consistency of inter-individual variation in behavioural responses towards novelty in 47 farmed American mink (Neovison vison), which were raised in identical housing conditions. Different stages of approach behaviour towards novelty, and how these related within and across contexts, were explored. Our experimental design contained four tests: two novel object tests (non-social contexts) and two novel animated stimuli tests (social contexts). Our results showed consistency in shyness measures across multiple tests, indicating the existence of personality in farmed American mink. It was found that consistency in shyness measures differs across non-social and social contexts, as well as across the various stages in the approach towards novel objects, revealing that different aspects of shyness exist in the farmed American mink. To our knowledge this is the first study to reveal aspects of the shyness-boldness continuum in the American mink. Since the mink were raised in identical housing conditions, inherited factors may have been important in shaping the consistent inter-individual variation. Body weight and sex had no effect on the personality of the mink. Altogether, our results suggest that the shyness-boldness continuum cannot be explained by a simple underlying dispositional factor, but instead encompasses a broader term of hesitating behaviour that might comprise several different personality traits.
3D shape representation with spatial probabilistic distribution of intrinsic shape keypoints
NASA Astrophysics Data System (ADS)
Ghorpade, Vijaya K.; Checchin, Paul; Malaterre, Laurent; Trassoudaine, Laurent
2017-12-01
The accelerated advancement in modeling, digitizing, and visualizing techniques for 3D shapes has led to an increasing amount of 3D model creation and usage, thanks to 3D sensors that are readily available and easy to use. As a result, determining the similarity between 3D shapes has become consequential and is a fundamental task in shape-based recognition, retrieval, clustering, and classification. Several decades of research in Content-Based Information Retrieval (CBIR) have resulted in diverse techniques for 2D and 3D shape or object classification/retrieval and many benchmark data sets. In this article, a novel technique for 3D shape representation and object classification is proposed based on analyses of the spatial, geometric distributions of 3D keypoints. These distributions capture the intrinsic geometric structure of 3D objects. The result of the approach is a probability distribution function (PDF) produced from the spatial disposition of 3D keypoints that are stable on the object surface and invariant to pose changes. Each class/instance of an object can be uniquely represented by a PDF. This shape representation is robust yet conceptually simple, easy to implement, and fast to compute. Both Euclidean and topological space on the object's surface are considered to build the PDFs. Topology-based geodesic distances between keypoints exploit the non-planar surface properties of the object. The performance of the novel shape signature is tested with object classification accuracy. The classification efficacy of the new shape analysis method is evaluated on a new dataset acquired with a Time-of-Flight camera, and a comparative evaluation against state-of-the-art methods is performed on a standard benchmark dataset. Experimental results demonstrate superior classification performance of the new approach on the RGB-D dataset and depth data.
A Simple Homemade Polarised Sunglasses Test Card
ERIC Educational Resources Information Center
Bamdad, Farzad
2016-01-01
This article describes the construction of a simple and inexpensive test card that can be used to demonstrate the polarisation ability of sunglasses. The card is fabricated simply from a piece of polariser sheet with one to three layers of cellophane tape fixed on it.
Simple and Hierarchical Models for Stochastic Test Misgrading.
ERIC Educational Resources Information Center
Wang, Jianjun
1993-01-01
Test misgrading is treated as a stochastic process. The expected number of misgradings, inter-occurrence time of misgradings, and waiting time for the "n"th misgrading are discussed based on a simple Poisson model and a hierarchical Beta-Poisson model. Examples of model construction are given. (SLD)
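These quantities follow from standard Poisson-process theory; a minimal sketch of the simple Poisson model only (the hierarchical Beta-Poisson variant is not shown, and all function names are illustrative):

```python
import math

# Simple Poisson model of test misgrading: each item is misgraded
# independently at a fixed rate, so the count is Poisson, inter-occurrence
# times are exponential, and the wait for the n-th misgrading is Gamma.
def expected_misgradings(rate, n_items):
    """Expected number of misgradings among n_items (Poisson mean = rate * n_items)."""
    return rate * n_items

def prob_k_misgradings(rate, n_items, k):
    """P(exactly k misgradings) under a Poisson(rate * n_items) model."""
    mu = rate * n_items
    return math.exp(-mu) * mu**k / math.factorial(k)

def mean_waiting_time(rate, n):
    """Expected wait (in graded items) until the n-th misgrading:
    the Gamma(n, 1/rate) mean, i.e. n / rate."""
    return n / rate
```

For example, with a misgrading rate of 0.01 per item and 500 items, the expected count is 5 and the mean wait until the third misgrading is 300 items.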
NASA Technical Reports Server (NTRS)
Armand, Sasan
1995-01-01
A spacecraft payload flown on a launch vehicle experiences dynamic loads. The dynamic loads are caused by various phenomena ranging from the start-up of the launch vehicle engine to wind gusts. A spacecraft payload should be designed to meet launch vehicle dynamic loads. One of the major steps taken towards determining the dynamic loads is to correlate the finite element model of the spacecraft with the test results of a modal survey test. A test-verified finite element model of the spacecraft should possess the same spatial properties (stiffness, mass, and damping) and modal properties (frequencies and mode shapes) as the test hardware representing the spacecraft. The test-verified and correlated finite element model of the spacecraft is then coupled with the finite element model of the launch vehicle for analysis of loads and stress. Modal survey testing, verification of a finite element model, and modification of the finite element model to match the modal survey test results can easily be accomplished if the spacecraft structure is simple. However, this is rarely the case. A simple structure here is defined as a structure where the influence of nonlinearity between force and displacement (uncertainty in a test, for example, with errors in input and output), and the influence of damping (structural, coulomb, and viscous) are not pronounced. The objective of this study is to develop system identification and correlation methods with the focus on the structural systems that possess nonproportional damping. Two approaches to correct the nonproportional damping matrix of a truss structure were studied, and have been implemented on truss-like structures such as the National Aeronautics and Space Administration's space station truss. The results of this study showed nearly 100 percent improvement of the correlated eigensystem over the analytical eigensystem. The first method showed excellent results with up to three modes used in the system identification process. The second method could handle more modes, but required more computer usage time, and the results were less accurate than those of the first method.
Robie, Alice A.; Straw, Andrew D.; Dickinson, Michael H.
2010-01-01
Walking fruit flies, Drosophila melanogaster, use visual information to orient towards salient objects in their environment, presumably as a search strategy for finding food, shelter or other resources. Less is known, however, about the role of vision or other sensory modalities such as mechanoreception in the evaluation of objects once they have been reached. To study the role of vision and mechanoreception in exploration behavior, we developed a large arena in which we could track individual fruit flies as they walked through either simple or more topologically complex landscapes. When exploring a simple, flat environment lacking three-dimensional objects, flies used visual cues from the distant background to stabilize their walking trajectories. When exploring an arena containing an array of cones, differing in geometry, flies actively oriented towards, climbed onto, and explored the objects, spending most of their time on the tallest, steepest object. A fly's behavioral response to the geometry of an object depended upon the intrinsic properties of each object and not a relative assessment to other nearby objects. Furthermore, the preference was not due to a greater attraction towards tall, steep objects, but rather a change in locomotor behavior once a fly reached and explored the surface. Specifically, flies are much more likely to stop walking for long periods when they are perched on tall, steep objects. Both the vision system and the antennal chordotonal organs (Johnston's organs) provide sufficient information about the geometry of an object to elicit the observed change in locomotor behavior. Only when both these sensory systems were impaired did flies not show the behavioral preference for the tall, steep objects. PMID:20581279
System and method for automated object detection in an image
Kenyon, Garrett T.; Brumby, Steven P.; George, John S.; Paiton, Dylan M.; Schultz, Peter F.
2015-10-06
A contour/shape detection model may use relatively simple and efficient kernels to detect target edges in an object within an image or video. A co-occurrence probability may be calculated for two or more edge features in an image or video using an object definition. Edge features may be differentiated between in response to measured contextual support, and prominent edge features may be extracted based on the measured contextual support. The object may then be identified based on the extracted prominent edge features.
CUTE: A Concolic Unit Testing Engine for C
2005-01-01
We also introduce program units of a simple C-like language (cf. [20]). We present how CUTE instruments programs and performs concolic execution. We...works for a simple C-like language shown in Figure 2. START represents the first statement of a program under test. Each statement has an optional...is a variable, c is a constant p ::= v = v | v ≠ v | v < v | v ≤ v | v ≥ v | v > v Figure 2: Syntax of a simple C-like language the inputs at the
Software Techniques for Non-Von Neumann Architectures
1990-01-01
[Fragment of a machine-specification table: communication topology a programmable Benes network and a hypercubic lattice for QCD; centralized control; static assignment; shared memory; universal synchronization; max CPUs 566; processor boards (each = 4 floating point units, 2 multipliers); 32-bit floating point CPU chips; performance 11.4 Gflops; market: quantum chromodynamics (QCD).] ...functions there should exist a capability to define hierarchies and lattices of complex objects. A complex object can be made up of a set of simple objects
HIFIRE Flight 2 Overview and Status Update 2011
NASA Technical Reports Server (NTRS)
Jackson, Kevin R.; Gruber, Mark R.; Buccellato, Salvatore
2011-01-01
A collaborative international effort, the Hypersonic International Flight Research Experimentation (HIFiRE) Program aims to study basic hypersonic phenomena through flight experimentation. HIFiRE Flight 2 teams the United States Air Force Research Lab (AFRL), NASA, and the Australian Defence Science and Technology Organisation (DSTO). Flight 2 will develop an alternative test technique for acquiring high enthalpy scramjet flight test data, allowing exploration of accelerating hydrocarbon-fueled scramjet performance and dual-to-scram mode transition up to and beyond Mach 8 flight. The generic scramjet flowpath is research quality and the test fuel is a simple surrogate for an endothermically cracked liquid hydrocarbon fuel. HIFiRE Flight 2 will be a first of its kind in contribution to scramjets. The HIFiRE program builds upon the HyShot and HYCAUSE programs and aims to leverage the low-cost flight test technique developed in those programs. It will explore suppressed trajectories of a sounding rocket propelled test article and their utility in studying ramjet-scramjet mode transition and flame extinction limits research. This paper describes the overall scramjet flight test experiment mission goals and objectives, flight test approach and strategy, ground test and analysis summary, development status and project schedule. A successful launch and operation will present to the scramjet community valuable flight test data in addition to a new tool, and vehicle, with which to explore high enthalpy scramjet technologies.
Multiscale moment-based technique for object matching and recognition
NASA Astrophysics Data System (ADS)
Thio, HweeLi; Chen, Liya; Teoh, Eam-Khwang
2000-03-01
A new method is proposed to extract features from an object for matching and recognition. The features proposed are a combination of local and global characteristics: local characteristics from the 1-D signature function defined for each pixel on the object boundary, and global characteristics from the moments generated from that signature function. The boundary of the object is first extracted, then the signature function is generated by computing the angle between two lines from every point on the boundary, as a function of position along the boundary. This signature function is position, scale and rotation invariant (PSRI). The shape of the signature function is then described quantitatively using moments. The moments of the signature function are thus the global characteristics of a local feature set. Using moments as the eventual features instead of the signature function reduces the time and complexity of an object matching application. Multiscale moments are implemented to produce several sets of moments that generate more accurate matching. The multiscale technique is essentially a coarse-to-fine procedure and makes the proposed method more robust to noise. This method is proposed to match and recognize objects under simple transformations, such as translation, scale changes, rotation and skewing. A simple logo indexing system is implemented to illustrate the performance of the proposed method.
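The signature-plus-moments pipeline can be sketched as follows; the step size k, the exact angle computation, and the use of central moments are illustrative assumptions, not the authors' precise formulation.

```python
import math

# Sketch: for each boundary point, measure the angle between the lines to
# the points k steps behind and ahead (the 1-D signature), then summarize
# that signature with its statistical moments.
def signature(boundary, k=1):
    """Angle-based signature along a closed boundary given as (x, y) points."""
    n = len(boundary)
    sig = []
    for i in range(n):
        (x0, y0) = boundary[(i - k) % n]
        (xc, yc) = boundary[i]
        (x1, y1) = boundary[(i + k) % n]
        a1 = math.atan2(y0 - yc, x0 - xc)
        a2 = math.atan2(y1 - yc, x1 - xc)
        d = abs(a1 - a2) % (2 * math.pi)
        sig.append(min(d, 2 * math.pi - d))  # angle between the two lines, in [0, pi]
    return sig

def central_moments(sig, orders=(1, 2, 3)):
    """Central moments of the signature: global descriptors of a local feature set."""
    n = len(sig)
    mean = sum(sig) / n
    return [sum((s - mean) ** p for s in sig) / n for p in orders]
```

On a square boundary every point sees a right angle, so the signature is constant and all central moments above order zero vanish.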
Brady, Timothy F.; Störmer, Viola S.; Alvarez, George A.
2016-01-01
Visual working memory is the cognitive system that holds visual information active to make it resistant to interference from new perceptual input. Information about simple stimuli—colors and orientations—is encoded into working memory rapidly: In under 100 ms, working memory ‟fills up,” revealing a stark capacity limit. However, for real-world objects, the same behavioral limits do not hold: With increasing encoding time, people store more real-world objects and do so with more detail. This boost in performance for real-world objects is generally assumed to reflect the use of a separate episodic long-term memory system, rather than working memory. Here we show that this behavioral increase in capacity with real-world objects is not solely due to the use of separate episodic long-term memory systems. In particular, we show that this increase is a result of active storage in working memory, as shown by directly measuring neural activity during the delay period of a working memory task using EEG. These data challenge fixed-capacity working memory models and demonstrate that working memory and its capacity limitations are dependent upon our existing knowledge. PMID:27325767
Brady, Timothy F; Störmer, Viola S; Alvarez, George A
2016-07-05
Visual working memory is the cognitive system that holds visual information active to make it resistant to interference from new perceptual input. Information about simple stimuli-colors and orientations-is encoded into working memory rapidly: In under 100 ms, working memory ‟fills up," revealing a stark capacity limit. However, for real-world objects, the same behavioral limits do not hold: With increasing encoding time, people store more real-world objects and do so with more detail. This boost in performance for real-world objects is generally assumed to reflect the use of a separate episodic long-term memory system, rather than working memory. Here we show that this behavioral increase in capacity with real-world objects is not solely due to the use of separate episodic long-term memory systems. In particular, we show that this increase is a result of active storage in working memory, as shown by directly measuring neural activity during the delay period of a working memory task using EEG. These data challenge fixed-capacity working memory models and demonstrate that working memory and its capacity limitations are dependent upon our existing knowledge.
SVGenes: a library for rendering genomic features in scalable vector graphic format.
Etherington, Graham J; MacLean, Daniel
2013-08-01
Drawing genomic features in attractive and informative ways is a key task in visualization of genomics data. Scalable Vector Graphics (SVG) format is a modern and flexible open standard that provides advanced features including modular graphic design, advanced web interactivity and animation within a suitable client. SVGs do not suffer from loss of image quality on re-scaling and provide the ability to edit individual elements of a graphic on the whole object level independent of the whole image. These features make SVG a potentially useful format for the preparation of publication quality figures including genomic objects such as genes or sequencing coverage and for web applications that require rich user-interaction with the graphical elements. SVGenes is a Ruby-language library that uses SVG primitives to render typical genomic glyphs through a simple and flexible Ruby interface. The library implements a simple Page object that spaces and contains horizontal Track objects that in turn style, colour and positions features within them. Tracks are the level at which visual information is supplied providing the full styling capability of the SVG standard. Genomic entities like genes, transcripts and histograms are modelled in Glyph objects that are attached to a track and take advantage of SVG primitives to render the genomic features in a track as any of a selection of defined glyphs. The feature model within SVGenes is simple but flexible and not dependent on particular existing gene feature formats meaning graphics for any existing datasets can easily be created without need for conversion. The library is provided as a Ruby Gem from https://rubygems.org/gems/bio-svgenes under the MIT license, and open source code is available at https://github.com/danmaclean/bioruby-svgenes also under the MIT License. dan.maclean@tsl.ac.uk.
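SVGenes itself is a Ruby gem; as a language-neutral illustration of the Page, Track and Glyph layering (written here in Python with plain SVG primitives, not the library's actual API), a sketch might look like:

```python
# Illustrative sketch of the layering described above: a page spaces
# horizontal tracks, and each track renders its features as SVG glyphs.
# Element names are plain SVG; track/feature structures are made up.
def render_page(tracks, width=800, track_height=40):
    """tracks: list of tracks; each track is a list of (start, end, colour) features."""
    parts = [f'<svg xmlns="http://www.w3.org/2000/svg" width="{width}" '
             f'height="{len(tracks) * track_height}">']
    for i, features in enumerate(tracks):
        y = i * track_height + 10  # vertical spacing handled at the page level
        for start, end, colour in features:
            # one rectangular "gene" glyph per feature; styling is per-track data
            parts.append(f'<rect x="{start}" y="{y}" width="{end - start}" '
                         f'height="20" fill="{colour}"/>')
    parts.append('</svg>')
    return '\n'.join(parts)
```

Because every glyph is an individual SVG element, each remains editable on its own after export, which is the property the abstract highlights.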
Histamine 50-skin-prick test: a tool to diagnose histamine intolerance.
Kofler, Lukas; Ulmer, Hanno; Kofler, Heinz
2011-01-01
Background. Histamine intolerance results from an imbalance between histamine intake and degradation. In healthy persons, dietary histamine can be sufficiently metabolized by amine oxidases, whereas persons with low amine oxidase activity are at risk of histamine toxicity. Diamine oxidase (DAO) is the key enzyme in degradation. Histamine elicits a wide range of effects. Histamine intolerance displays symptoms such as rhinitis, headache, gastrointestinal symptoms, palpitations, urticaria and pruritus. Objective. Diagnosis of histamine intolerance until now has been based on case history; neither a validated questionnaire nor a routine test is available. The aim of this trial was to evaluate the usefulness of a prick test for the diagnosis of histamine intolerance. Methods. Prick-testing with 1% histamine solution and wheal-size measurement were used to assess the relation between the wheal in the prick test, read after 20 to 50 minutes, as a sign of slowed histamine degradation, and the history and symptoms of histamine intolerance. Results. Following a pretest with 17 patients with histamine intolerance (HIT), we investigated 156 persons (81 with HIT, 75 controls): 64 out of 81 persons with HIT, but only 14 out of 75 persons from the control group, presented with a histamine wheal ≥3 mm after 50 minutes (P < .0001). Conclusion and Clinical Relevance. The histamine 50 skin-prick test offers a simple tool with clinical relevance.
Histamine 50-Skin-Prick Test: A Tool to Diagnose Histamine Intolerance
Kofler, Lukas; Ulmer, Hanno; Kofler, Heinz
2011-01-01
Background. Histamine intolerance results from an imbalance between histamine intake and degradation. In healthy persons, dietary histamine can be sufficiently metabolized by amine oxidases, whereas persons with low amine oxidase activity are at risk of histamine toxicity. Diamine oxidase (DAO) is the key enzyme in degradation. Histamine elicits a wide range of effects. Histamine intolerance displays symptoms such as rhinitis, headache, gastrointestinal symptoms, palpitations, urticaria and pruritus. Objective. Diagnosis of histamine intolerance until now has been based on case history; neither a validated questionnaire nor a routine test is available. The aim of this trial was to evaluate the usefulness of a prick test for the diagnosis of histamine intolerance. Methods. Prick-testing with 1% histamine solution and wheal-size measurement were used to assess the relation between the wheal in the prick test, read after 20 to 50 minutes, as a sign of slowed histamine degradation, and the history and symptoms of histamine intolerance. Results. Following a pretest with 17 patients with histamine intolerance (HIT), we investigated 156 persons (81 with HIT, 75 controls): 64 out of 81 persons with HIT, but only 14 out of 75 persons from the control group, presented with a histamine wheal ≥3 mm after 50 minutes (P < .0001). Conclusion and Clinical Relevance. The histamine 50 skin-prick test offers a simple tool with clinical relevance. PMID:23724226
Ribeiro, Madelon Novato; Pimentel, Maria Inês Fernandes; Schubach, Armando de Oliveira; Oliveira, Raquel de Vasconcellos Carvalhães de; Teixeira, José Liporage; Leite, Madson Pedro da Silva; Fonseca, Monique; Santos, Ginelza Peres Lima dos; Salgueiro, Mariza Matos; Ferreira e Vasconcellos, Erica de Camargo; Lyra, Marcelo Rosandiski; Saheki, Mauricio Naoto; Valete-Rosalino, Claudia Maria
2014-01-01
The favorable outcome of the treatment of a disease is influenced by the adherence to therapy. Our objective was to assess factors associated with adherence to treatment of patients included in a clinical trial of equivalence between the standard and alternative treatment schemes with meglumine antimoniate (MA) in the treatment of cutaneous leishmaniasis (CL), in the state of Rio de Janeiro. Between 2008 and 2011, 57 patients with CL were interviewed using a questionnaire to collect socioeconomic data. The following methods were used for adherence monitoring: counting of vial surplus, monitoring card, Morisky test and modified Morisky test (without the question regarding the schedule); we observed 82.1% (vial return), 86.0% (monitoring card), 66.7% (Morisky test) and 86.0% (modified Morisky test) adherence. There was a strong correlation between the method of vial counting and the monitoring card and modified Morisky test. A significant association was observed between greater adherence to treatment and low dose of MA, as well as with a lower number of people sleeping in the same room. We recommend the use of the modified Morisky test to assess adherence to treatment of CL with MA, because it is a simple method and with a good performance, when compared to other methods.
Popoviç, M; Biessels, G J; Isaacson, R L; Gispen, W H
2001-08-01
Diabetes mellitus is associated with disturbances of cognitive functioning. The aim of this study was to examine cognitive functioning in diabetic rats using the 'Can test', a novel spatial/object learning and memory task, without the use of aversive stimuli. Rats were trained to select a single rewarded can from seven cans. Mild water deprivation provided the motivation to obtain the reward (0.3 ml of water). After 5 days of baseline training, in which the rewarded can was marked by its surface and position in an open field, the animals were divided into two groups. Diabetes was induced in one group, by an intravenous injection of streptozotocin. Retention of baseline training was tested at 2-weekly intervals for 10 weeks. Next, two adapted versions of the task were used, with 4 days of training in each version. The rewarded can was a soft-drink can with coloured print. In a 'simple visual task' the soft-drink can was placed among six white cans, whereas in a 'complex visual task' it was placed among six soft-drink cans from different brands with distinct prints. In diabetic rats the number of correct responses was lower and number of reference and working memory errors higher than in controls in the various versions of the test. Switches between tasks and increases in task complexity accentuated the performance deficits, which may reflect an inability of diabetic rats to adapt behavioural strategies to the demands of the tasks.
Why Bother to Calibrate? Model Consistency and the Value of Prior Information
NASA Astrophysics Data System (ADS)
Hrachowitz, Markus; Fovet, Ophelie; Ruiz, Laurent; Euser, Tanja; Gharari, Shervan; Nijzink, Remko; Savenije, Hubert; Gascuel-Odoux, Chantal
2015-04-01
Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by 4 calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce 20 hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by using prior information about the system to impose "prior constraints", inferred from expert knowledge and to ensure a model which behaves well with respect to the modeller's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model set-up exhibited increased performance in the independent test period and skill to reproduce all 20 signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if efficiently counter-balanced by available prior constraints, can increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge driven strategy of constraining models.
Why Bother and Calibrate? Model Consistency and the Value of Prior Information.
NASA Astrophysics Data System (ADS)
Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J. E.; Savenije, H.; Gascuel-Odoux, C.
2014-12-01
Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by 4 calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce 20 hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by using prior information about the system to impose "prior constraints", inferred from expert knowledge and to ensure a model which behaves well with respect to the modeller's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model set-up exhibited increased performance in the independent test period and skill to reproduce all 20 signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if efficiently counter-balanced by available prior constraints, can increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge driven strategy of constraining models.
A Simple test for the existence of two accretion modes in active galactic nuclei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jester, Sebastian; /Fermilab
2005-02-01
By analogy to the different accretion states observed in black-hole X-ray binaries (BHXBs), it appears plausible that accretion disks in active galactic nuclei (AGN) undergo a state transition between a radiatively efficient and inefficient accretion flow. If the radiative efficiency changes at some critical accretion rate, there will be a change in the distribution of black hole masses and bolometric luminosities at the corresponding transition luminosity. To test this prediction, the author considers the joint distribution of AGN black hole masses and bolometric luminosities for a sample taken from the literature. The small number of objects with low Eddington-scaled accretion rates m < 0.01 and black hole masses M_BH < 10^9 M_⊙ constitutes tentative evidence for the existence of such a transition in AGN. Selection effects, in particular those associated with flux-limited samples, systematically exclude objects in particular regions of the (M_BH, L_bol) plane. Therefore, they require particular attention in the analysis of distributions of black hole mass, bolometric luminosity, and derived quantities like the accretion rate. The author suggests further observational tests of the BHXB-AGN unification scheme which are based on the jet domination of the energy output of BHXBs in the hard state, and on the possible equivalence of BHXB in the very high (or steep power-law) state showing ejections and efficiently accreting quasars and radio galaxies with powerful radio jets.
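The quantity under test can be computed directly: the Eddington-scaled accretion rate is m = L_bol / L_Edd, with L_Edd ≈ 1.26e38 (M_BH / M_sun) erg/s for pure hydrogen. A small sketch (the 0.01 threshold is the value quoted above, not a universal constant):

```python
# Eddington luminosity per solar mass of black hole, erg/s (pure hydrogen).
L_EDD_PER_MSUN = 1.26e38

def eddington_ratio(l_bol_erg_s, m_bh_msun):
    """Eddington-scaled accretion rate m = L_bol / L_Edd."""
    return l_bol_erg_s / (L_EDD_PER_MSUN * m_bh_msun)

def radiatively_efficient(l_bol_erg_s, m_bh_msun, m_crit=0.01):
    """True if the source lies above the proposed state-transition rate."""
    return eddington_ratio(l_bol_erg_s, m_bh_msun) > m_crit
```

For a 10^8 M_sun black hole, L_Edd is 1.26e46 erg/s, so a bolometric luminosity of 1.26e45 erg/s corresponds to m = 0.1, well above the proposed transition.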
Openlobby: an open game server for lobby and matchmaking
NASA Astrophysics Data System (ADS)
Zamzami, E. M.; Tarigan, J. T.; Jaya, I.; Hardi, S. M.
2018-03-01
Online multiplayer is one of the most essential features in modern games. However, while a multiplayer feature can be developed with simple computer-network programming, creating a balanced multiplayer session requires additional player-management components such as a game lobby and a matchmaking system. Our objective is to develop OpenLobby, a server available to other developers to support their multiplayer applications. The proposed system acts as a lobby and matchmaker, where queueing players are matched to other players according to criteria defined by the developer. The solution provides an application programming interface that developers can use to interact with the server. For testing purposes, we developed a game that uses the server as its multiplayer server.
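A toy version of the queue-and-match idea is easy to sketch; the rating-gap criterion and all names below are illustrative, and OpenLobby's real API is not reproduced here.

```python
from collections import deque

# Toy matchmaker: players wait in a queue and are paired when a
# developer-supplied criterion (here: maximum rating gap) is satisfied.
def matchmake(players, max_gap):
    """Pair queued (name, rating) players whose ratings differ by at most max_gap."""
    queue = deque(sorted(players, key=lambda p: p[1]))  # sort so neighbours are closest
    matches = []
    while len(queue) >= 2:
        a = queue.popleft()
        if abs(a[1] - queue[0][1]) <= max_gap:
            matches.append((a[0], queue.popleft()[0]))
        # otherwise a stays unmatched this round in this toy sketch
    return matches
```

Tightening the gap criterion trades shorter queues for better-balanced sessions, which is the tuning knob a real matchmaking server exposes to the developer.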
Prediction of the bending behavior after pre-strain of an aluminum alloy
NASA Astrophysics Data System (ADS)
Pradeau, A.; Thuillier, S.; Yoon, J. W.
2016-10-01
The present work is focused on the modeling of sheet metal mechanical behavior up to rupture, including anisotropy and hardening. The mechanical behavior of an AA6016 alloy was characterized at room temperature in tension, simple shear and hydraulic bulging. The initial anisotropy was described with the Yld2004-18p yield criterion coupled to a mixed hardening law. Concerning rupture, an uncoupled phenomenological criterion of Mohr-Coulomb type was used. For the material parameter identification, an inverse methodology was used with the objective of reducing the gap between experimental and numerical data. Finally, validation of the results was performed on bending tests with different amplitudes of tension pre-strain, in order to reach or not reach rupture in the bent area.
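As background for the rupture criterion, the classic Mohr-Coulomb form evaluates a weighted combination of shear and normal stress over all plane orientations; the friction constant c and the critical value below are illustrative, not the parameters identified for AA6016.

```python
import math

# Classic Mohr-Coulomb check: rupture when max over plane orientations of
# (shear stress + c * normal stress) reaches a critical value. For
# principal stresses sigma1 >= sigma3 the maximum has a closed form.
def mohr_coulomb(sigma1, sigma3, c):
    """Largest value of (tau + c * sigma_n) over all plane orientations."""
    radius = (sigma1 - sigma3) / 2.0   # Mohr's circle radius
    centre = (sigma1 + sigma3) / 2.0   # Mohr's circle centre
    return radius * math.sqrt(1.0 + c * c) + c * centre

def ruptures(sigma1, sigma3, c, tau_crit):
    return mohr_coulomb(sigma1, sigma3, c) >= tau_crit
```

With c = 0 the criterion reduces to maximum shear (Tresca), which gives a quick sanity check on the closed form.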
Numerical implementation of the S-matrix algorithm for modeling of relief diffraction gratings
NASA Astrophysics Data System (ADS)
Yaremchuk, Iryna; Tamulevičius, Tomas; Fitio, Volodymyr; Gražulevičiūte, Ieva; Bobitski, Yaroslav; Tamulevičius, Sigitas
2013-11-01
A new numerical implementation is developed to calculate the diffraction efficiency of relief diffraction gratings. In the new formulation, vectors containing the expansion coefficients of electric and magnetic fields on boundaries of the grating layer are expressed by additional constants. An S-matrix algorithm has been systematically described in detail and adapted to a simple matrix form. This implementation is suitable for the study of optical characteristics of periodic structures by using modern object-oriented programming languages and different standard mathematical software. The modeling program has been developed on the basis of this numerical implementation and tested by comparison with other commercially available programs and experimental data. Numerical examples are given to show the usefulness of the new implementation.
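The layer-stacking step at the heart of any S-matrix algorithm is the Redheffer star product, which combines two scattering matrices without the exponentially growing terms that make transfer matrices unstable. The sketch below shows the scalar (single-mode) form; in full grating modeling each scalar becomes a matrix over diffraction orders. This is a generic illustration of the algorithm family, not the authors' specific formulation.

```python
# Hedged sketch: scalar Redheffer star product for combining two
# scattering matrices A and B, each given as (S11, S12, S21, S22).

def star(A, B):
    A11, A12, A21, A22 = A
    B11, B12, B21, B22 = B
    d = 1.0 / (1.0 - A22 * B11)          # multiple-reflection term
    S11 = A11 + A12 * B11 * d * A21      # reflection from the A side
    S12 = A12 * d * B12                  # transmission B -> A
    S21 = B21 * d * A21                  # transmission A -> B
    S22 = B22 + B21 * A22 * d * B12      # reflection from the B side
    return (S11, S12, S21, S22)

# Sanity check: stacking with a perfectly transparent "identity" layer
# (no reflection, unit transmission) leaves a layer unchanged.
identity = (0.0, 1.0, 1.0, 0.0)
layer = (0.2, 0.8, 0.8, 0.2)
print(star(layer, identity) == layer)  # True
```

Expressing the boundary expansion coefficients through such products, as the paper does, is what keeps the recursion numerically stable for deep or many-layered gratings.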
Control technology development
NASA Astrophysics Data System (ADS)
Schaechter, D. B.
1982-03-01
The main objectives of the control technology development task are given in the slide below. The first is to develop control design techniques based on flexible structural models, rather than simple rigid-body models. Since large space structures are distributed parameter systems, a new degree of freedom, that of sensor/actuator placement, may be exercised for improving control system performance. Another characteristic of large space structures is numerous oscillatory modes within the control bandwidth. Reduced-order controller design models must be developed which produce stable closed-loop systems when combined with the full-order system. Since the date of an actual large-space-structure flight is rapidly approaching, it is vitally important that theoretical developments are tested in actual hardware. Experimental verification is a vital counterpart of all current theoretical developments.
ERIC Educational Resources Information Center
Rossi, Sergio; Benaglia, Maurizio; Brenna, Davide; Porta, Riccardo; Orlandi, Manuel
2015-01-01
A simple procedure to convert protein data bank files (.pdb) into a stereolithography file (.stl) using VMD software (Virtual Molecular Dynamic) is reported. This tutorial allows generating, with a very simple protocol, three-dimensional customized structures that can be printed by a low-cost 3D-printer, and used for teaching chemical education…
Creating Simple Windchill Admin Tools Using Info*Engine
NASA Technical Reports Server (NTRS)
Jones, Corey; Kapatos, Dennis; Skradski, Cory
2012-01-01
Being a Windchill administrator often requires performing simple yet repetitive tasks on large sets of objects. These can include renaming, deleting, checking in, undoing checkout, and much more. This is especially true during a migration. Fortunately, PTC has provided a simple way to dynamically interact with Windchill using Info*Engine. This presentation will describe how to create simple Info*Engine tasks capable of saving Windchill 10.0 administrators hours of tedious work. It will also show how these tasks can be combined and displayed on a simple JSP page that acts as a "Windchill Administrator Dashboard/Toolbox". The attendee will learn some valuable tasks Info*Engine is capable of performing, gain a basic understanding of how to perform and implement Info*Engine tasks, and learn what is involved in creating a JSP page that displays Info*Engine tasks.
Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results
NASA Technical Reports Server (NTRS)
Jones, Scott
2015-01-01
Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow.
Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial compressors and turbines at design and off-design conditions.
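The implicit formulation mentioned above relies on a Newton-Raphson solver driving the blade-row residual equations to zero. The sketch below shows the generic iteration with a finite-difference Jacobian on a toy two-equation residual; it illustrates the solver pattern only, not OTAC's actual governing equations or the NPSS solver internals.

```python
# Hedged sketch: Newton-Raphson iteration of the kind an implicit
# flowthrough solver uses, with a finite-difference Jacobian.
# The residual here is a toy system, not OTAC's blade-row equations.

import numpy as np

def newton(residual, x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        eps = 1e-7
        # Jacobian column j = d r / d x_j, by forward differences
        J = np.column_stack([
            (residual(x + eps * e) - r) / eps
            for e in np.eye(len(x))
        ])
        x = x - np.linalg.solve(J, r)
    return x

# Toy residual: a "conservation" equation and a nonlinear closure.
sol = newton(lambda x: np.array([x[0] + x[1] - 3.0,
                                 x[0] * x[1] - 2.0]),
             [0.5, 2.5])
print(np.round(sol, 6))
```

In a code like OTAC the unknowns are the exit conditions of every blade row and the residuals express the flow equations, but the outer iteration has this same shape.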
Fu, Min; Wu, Wenming; Hong, Xiafei; Liu, Qiuhua; Jiang, Jialin; Ou, Yaobin; Zhao, Yupei; Gong, Xinqi
2018-04-24
Efficient computational recognition and segmentation of target organs from medical images are foundational in diagnosis and treatment, especially for pancreatic cancer. In practice, the diversity in the appearance of the pancreas and other abdominal organs makes detailed texture information important for segmentation algorithms. According to our observations, however, the structures of previous networks, such as the Richer Feature Convolutional Network (RCF), are too coarse to segment the object (pancreas) accurately, especially at the edges. In this paper, we extend the RCF, originally proposed for edge detection, to the challenging task of pancreas segmentation, and put forward a novel pancreas segmentation network. By employing a multi-layer up-sampling structure in place of the simple up-sampling operation in all stages, the proposed network fully exploits the multi-scale detailed contextual information of the object (pancreas) to perform per-pixel segmentation. Additionally, we train our network on CT scans, yielding an effective pipeline. With this multi-layer up-sampling model, our pipeline achieves better performance than RCF in the task of single-object (pancreas) segmentation. Moreover, combined with multi-scale input, we achieve a 76.36% DSC (Dice Similarity Coefficient) value on testing data. The results of our experiments show that our advanced model works better than previous networks on our dataset; in other words, it is better at capturing detailed contextual information. Therefore, our new single-object segmentation model has practical value for automatic computational diagnosis.
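The DSC metric used above to score the segmentation is simple to state: twice the overlap of the predicted and ground-truth masks divided by their combined size. A minimal sketch on binary masks (our own helper, not the paper's evaluation code):

```python
# Hedged sketch: Dice Similarity Coefficient on binary masks,
# DSC = 2|A ∩ B| / (|A| + |B|).

import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = int(pred.sum()) + int(truth.sum())
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * int(np.logical_and(pred, truth).sum()) / denom

a = np.array([[1, 1, 0],
              [0, 1, 0]])
b = np.array([[1, 0, 0],
              [0, 1, 1]])
print(dice(a, b))  # 2*2 / (3+3) ≈ 0.667
```

A reported 76.36% DSC thus means the predicted pancreas mask overlaps the expert annotation at roughly this level across the test scans.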
Wang, Yuanye; Luo, Huan
2017-01-01
In order to deal with the external world efficiently, the brain constantly generates predictions about incoming sensory inputs, a process known as "predictive coding." Our recent studies, employing visual priming paradigms in combination with a time-resolved behavioral measurement, reveal that perceptual predictions about simple features (e.g., left or right orientation) return to low sensory areas not continuously but recurrently in a theta-band (3-4Hz) rhythm. However, it remains unknown whether high-level object processing is also mediated by this oscillatory mechanism and, if so, at which rhythm the mechanism works. In the present study, we employed a morph-face priming paradigm and time-resolved behavioral measurements to examine the fine temporal dynamics of face identity priming performance. First, we reveal classical priming effects and a rhythmic trend within the prime-to-probe SOA of 600ms (Experiment 1). Next, we densely sampled the face priming behavioral performances within this SOA range (Experiment 2). Our results demonstrate a significant ~5Hz oscillatory component in the face priming behavioral performances, suggesting that a rhythmic process also coordinates object-level prediction (i.e., face identity here). In comparison to our previous studies, the results suggest that the rhythm for the high-level object is faster than that for simple features. We propose that these seemingly distinct priming rhythms might arise because object-level and simple-feature-level predictions return to different stages along the visual pathway (e.g., the FFA for face priming and V1 for simple feature priming). In summary, the findings support a general theta-band (3-6Hz) temporal organization mechanism in predictive coding; such a waxing-and-waning pattern may help the brain remain readily updatable for new inputs. © 2017 Elsevier B.V. All rights reserved.
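Detecting an oscillatory component in a densely sampled behavioral time course is, at its core, a spectral analysis. The sketch below builds a synthetic 600 ms performance curve with a 5 Hz component plus noise and finds its spectral peak with the FFT; the sampling rate, amplitudes, and noise level are illustrative assumptions, not the study's actual parameters or analysis pipeline.

```python
# Hedged sketch: finding a ~5 Hz rhythmic component in a time-resolved
# behavioral measurement via the FFT (synthetic data).

import numpy as np

rng = np.random.default_rng(0)

fs = 50.0                                  # assumed sampling rate, Hz
t = np.arange(0, 0.6, 1 / fs)              # 600 ms of SOA samples
signal = (0.3 * np.sin(2 * np.pi * 5 * t)  # 5 Hz behavioral oscillation
          + 0.05 * rng.standard_normal(t.size))  # measurement noise

spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
peak = freqs[np.argmax(spectrum)]
print(peak)  # ≈ 5 Hz
```

In practice such an analysis is run on detrended group-averaged performance curves, with the significance of the spectral peak assessed against a permutation-based null distribution.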
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okura, Yuki; Futamase, Toshifumi, E-mail: yuki.okura@riken.jp
We improve the ellipticity of re-smeared artificial image (ERA) method of point-spread function (PSF) correction in a weak lensing shear analysis in order to treat the realistic shape of galaxies and the PSF. This is done by re-smearing the PSF and the observed galaxy image using a re-smearing function (RSF) and allows us to use a new PSF with a simple shape and to correct the PSF effect without any approximations or assumptions. We perform a numerical test to show that the method applied for galaxies and PSF with some complicated shapes can correct the PSF effect with a systematic error of less than 0.1%. We also apply the ERA method for real data of the Abell 1689 cluster to confirm that it is able to detect the systematic weak lensing shear pattern. The ERA method requires less than 0.1 or 1 s to correct the PSF for each object in a numerical test and a real data analysis, respectively.
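The re-smearing step described above amounts to convolving the observed image with a chosen RSF so that the effective PSF takes a simple shape. The sketch below shows only that convolution step with a Gaussian RSF; it is a schematic of the idea, not the ERA method's ellipticity correction itself.

```python
# Hedged sketch: re-smearing an image with a Gaussian re-smearing
# function (RSF), illustrating only the convolution step of the idea.

import numpy as np
from scipy.ndimage import gaussian_filter

def resmear(image: np.ndarray, rsf_sigma: float) -> np.ndarray:
    """Convolve an image with a Gaussian RSF of width rsf_sigma (pixels)."""
    return gaussian_filter(image, sigma=rsf_sigma)

# A point source becomes a Gaussian of width rsf_sigma after re-smearing,
# and the convolution conserves the total flux.
img = np.zeros((31, 31))
img[15, 15] = 1.0
out = resmear(img, rsf_sigma=2.0)
print(np.isclose(out.sum(), 1.0))  # True: flux conserved
```

In the full method both the PSF and the galaxy image are re-smeared with the same RSF, so that the simple effective PSF can be divided out of the measured ellipticity.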
Piloted Evaluation of a UH-60 Mixer Equivalent Turbulence Simulation Model
NASA Technical Reports Server (NTRS)
Lusardi, Jeff A.; Blanken, Chris L.; Tischler, Mark B.
2002-01-01
A simulation study of a recently developed hover/low speed Mixer Equivalent Turbulence Simulation (METS) model for the UH-60 Black Hawk helicopter was conducted in the NASA Ames Research Center Vertical Motion Simulator (VMS). The experiment was a continuation of previous work to develop a simple, but validated, turbulence model for hovering rotorcraft. To validate the METS model, two experienced test pilots replicated precision hover tasks that had been conducted in an instrumented UH-60 helicopter in turbulence. Objective simulation data were collected for comparison with flight test data, and subjective data were collected that included handling qualities ratings and pilot comments for increasing levels of turbulence. Analyses of the simulation results show good analytic agreement between the METS model and flight test data, with favorable pilot perception of the simulated turbulence. Precision hover tasks were also repeated using the more complex rotating-frame SORBET (Simulation Of Rotor Blade Element Turbulence) model to generate turbulence. Comparisons of the empirically derived METS model with the theoretical SORBET model show good agreement providing validation of the more complex blade element method of simulating turbulence.