Matheson, Heath E; Buxbaum, Laurel J; Thompson-Schill, Sharon L
2017-11-01
Our use of tools is situated in different contexts. Prior evidence suggests that diverse regions within the ventral and dorsal streams represent information supporting common tool use. However, given the flexibility of object concepts, these regions may be tuned to different types of information when generating novel or uncommon uses of tools. To investigate this, we collected fMRI data from participants who reported common or uncommon tool uses in response to visually presented familiar objects. We performed a pattern dissimilarity analysis in which we correlated cortical patterns with behavioral measures of visual, action, and category information. The results showed that evoked cortical patterns within the dorsal tool use network reflected action and visual information to a greater extent in the uncommon use group, whereas evoked neural patterns within the ventral tool use network reflected categorical information more strongly in the common use group. These results reveal the flexibility of cortical representations of tool use and the situated nature of cortical representations more generally.
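The pattern-dissimilarity analysis described above can be sketched in outline: build a representational dissimilarity matrix (RDM) from evoked cortical patterns and rank-correlate its upper triangle with a behavioral model RDM. The following is a generic RSA-style sketch in Python, not the authors' actual pipeline; all names are illustrative.

```python
import numpy as np

def rdm(patterns):
    # Representational dissimilarity matrix: 1 - Pearson r between
    # condition patterns (rows = conditions, columns = voxels).
    return 1.0 - np.corrcoef(patterns)

def rsa_score(neural_rdm, model_rdm):
    # Rank-correlate the upper triangles of two RDMs (Spearman-style).
    iu = np.triu_indices_from(neural_rdm, k=1)
    a, b = neural_rdm[iu], model_rdm[iu]
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return float(np.corrcoef(rank(a), rank(b))[0, 1])
```

A higher score for, say, an action-based model RDM in one group than another is the kind of evidence the abstract reports.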
Regulatory sequence analysis tools.
van Helden, Jacques
2003-07-01
The web resource Regulatory Sequence Analysis Tools (RSAT) (http://rsat.ulb.ac.be/rsat) offers a collection of software tools dedicated to the prediction of regulatory sites in non-coding DNA sequences. These tools include sequence retrieval, pattern discovery, pattern matching, genome-scale pattern matching, feature-map drawing, random sequence generation and other utilities. Alternative formats are supported for the representation of regulatory motifs (strings or position-specific scoring matrices) and several algorithms are proposed for pattern discovery. RSAT currently holds >100 fully sequenced genomes and these data are regularly updated from GenBank.
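Pattern discovery of the kind RSAT performs often begins by counting oligonucleotide (k-mer) occurrences in non-coding sequences and comparing them with a background expectation. A minimal, hypothetical sketch (not RSAT's actual algorithm):

```python
from collections import Counter

def kmer_counts(seqs, k):
    # Count k-mer occurrences over a set of DNA sequences.
    counts = Counter()
    for s in seqs:
        s = s.upper()
        for i in range(len(s) - k + 1):
            counts[s[i:i + k]] += 1
    return counts

def expected_uniform(seqs, k):
    # Expected count per k-mer under a uniform (equiprobable) background.
    positions = sum(max(len(s) - k + 1, 0) for s in seqs)
    return positions / 4 ** k
```

k-mers whose observed count greatly exceeds the background expectation are candidate regulatory motifs.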
PatternCoder: A Programming Support Tool for Learning Binary Class Associations and Design Patterns
ERIC Educational Resources Information Center
Paterson, J. H.; Cheng, K. F.; Haddow, J.
2009-01-01
PatternCoder is a software tool to aid student understanding of class associations. It has a wizard-based interface which allows students to select an appropriate binary class association or design pattern for a given problem. Java code is then generated which allows students to explore the way in which the class associations are implemented in a…
NASA Astrophysics Data System (ADS)
Zhou, Yuping; Zhang, Qi
2018-04-01
In the information environment, the digitization and processing of Li brocade patterns is an important means of revealing the Li ethnic style and passing on the national culture. In this paper, Adobe Illustrator CS3 and the Java language were used to apply "variation" processing to Li brocade patterns and to generate "Li brocade pattern mutant genes". The generation of pattern mutant genes includes color mutation, shape mutation, addition and deletion transforms, and twisting transforms. The research shows that Li brocade pattern mutant genes can be generated using Adobe Illustrator CS3 together with image-processing tools written in Java.
TOOLS FOR PRESENTING SPATIAL AND TEMPORAL PATTERNS OF ENVIRONMENTAL MONITORING DATA
The EPA Health Effects Research Laboratory has developed this data presentation tool for use with a variety of types of data which may contain spatial and temporal patterns of interest. The technology links mainframe computing power to the new generation of "desktop publishing" ha...
A visual analytics approach for pattern-recognition in patient-generated data.
Feller, Daniel J; Burgermaster, Marissa; Levine, Matthew E; Smaldone, Arlene; Davidson, Patricia G; Albers, David J; Mamykina, Lena
2018-06-13
To develop and test a visual analytics tool to help clinicians identify systematic and clinically meaningful patterns in patient-generated data (PGD) while decreasing perceived information overload. Participatory design was used to develop Glucolyzer, an interactive tool featuring hierarchical clustering and a heatmap visualization to help registered dietitians (RDs) identify associative patterns between blood glucose levels and per-meal macronutrient composition for individuals with type 2 diabetes (T2DM). Ten RDs participated in a within-subjects experiment to compare Glucolyzer to a static logbook format. For each representation, participants had 25 minutes to examine 1 month of diabetes self-monitoring data captured by an individual with T2DM and identify clinically meaningful patterns. We compared the quality and accuracy of the observations generated using each representation. Participants generated 50% more observations when using Glucolyzer (98) than when using the logbook format (64) without any loss in accuracy (69% accuracy vs 62%, respectively, p = .17). Participants identified more observations that included ingredients other than carbohydrates using Glucolyzer (36% vs 16%, p = .027). Fewer RDs reported feelings of information overload using Glucolyzer compared to the logbook format. Study participants displayed variable acceptance of hierarchical clustering. Visual analytics have the potential to mitigate provider concerns about the volume of self-monitoring data. Glucolyzer helped dietitians identify meaningful patterns in self-monitoring data without incurring perceived information overload. Future studies should assess whether similar tools can support clinicians in personalizing behavioral interventions that improve patient outcomes.
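The core of a Glucolyzer-style analysis, hierarchically clustering per-meal feature vectors so that similar meals group together in a heatmap, can be sketched with SciPy. The feature names and parameters here are illustrative, not the tool's actual implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_meals(features, n_clusters=2):
    # Ward-linkage hierarchical clustering of per-meal feature vectors,
    # e.g. [carbs_g, protein_g, fat_g, post_meal_glucose_rise].
    z = linkage(features, method="ward")
    return fcluster(z, t=n_clusters, criterion="maxclust")
```

Rows of a heatmap reordered by these cluster labels make associative patterns between macronutrients and glucose response visually apparent.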
Mask pattern generator employing EPL technology
NASA Astrophysics Data System (ADS)
Yoshioka, Nobuyuki; Yamabe, Masaki; Wakamiya, Wataru; Endo, Nobuhiro
2003-08-01
Mask cost is one of the crucial issues in device fabrication, especially for SoC (System on a Chip) products with small-volume production. The cost depends mainly on the productivity of mask manufacturing tools such as mask writers and defect inspection tools. EPL (Electron Projection Lithography) has been developed as a high-throughput electron-beam exposure technology intended to succeed optical lithography. Applying EPL technology to mask writing promises high productivity and should help reduce mask cost. This paper proposes the concept of a mask pattern generator employing EPL technology. It is very similar to the EPL technology used for pattern printing on a wafer: the mask patterns on the glass substrate are exposed by projecting basic circuit patterns formed on a mother EPL mask. One example of the mother EPL mask is a stencil type made from a 200-mm Si wafer. The basic circuit patterns are IP patterns and logical primitive patterns such as cell libraries (AND, OR, inverter, flip-flop, etc.) used to express the SoC device patterns. Since the SoC patterns are exposed in collective units such as IP blocks and logical primitives, higher throughput is expected compared with conventional mask e-beam writers. The concept, its advantages, and the issues to be solved are discussed.
Fast in-situ tool inspection based on inverse fringe projection and compact sensor heads
NASA Astrophysics Data System (ADS)
Matthias, Steffen; Kästner, Markus; Reithmeier, Eduard
2016-11-01
Inspection of machine elements is an important task in production processes in order to ensure the quality of produced parts and to gather feedback for the continuous improvement process. A new measuring system is presented which is capable of performing the inspection of critical tool geometries, such as gearing elements, inside the forming machine. To meet the constraints on sensor head size and inspection time imposed by the limited space inside the machine and the cycle time of the process, the measuring device combines endoscopy techniques with the fringe projection principle. Gradient-index lenses enable a compact design of the sensor head, which is connected to a CMOS camera and a flexible micro-mirror-based projector via flexible fiber bundles. Using common fringe projection patterns, the system achieves measuring times of less than five seconds. To further reduce the time required for inspection, the generation of inverse fringe projection patterns has been implemented for the system. Inverse fringe projection speeds up the inspection process by employing object-adapted patterns, which enable the detection of geometry deviations in a single image. Two different approaches to generate object-adapted patterns are presented. The first approach uses a reference measurement of a manufactured tool master to generate the inverse pattern. The second approach is based on a virtual master geometry in the form of a CAD file and a ray-tracing model of the measuring system. Virtual modeling of the measuring device and inspection setup allows the tool designer to specify geometric tolerances for free-form surfaces in the CAD file. A new approach is presented which uses virtual tolerance specifications and additional simulation steps to enable fast checking of metric tolerances. Following the description of the pattern generation process, the image processing steps required for inspection are demonstrated on captured images of gearing geometries.
Method and apparatus for characterizing and enhancing the functional performance of machine tools
Barkman, William E; Babelay, Jr., Edwin F; Smith, Kevin Scott; Assaid, Thomas S; McFarland, Justin T; Tursky, David A; Woody, Bethany; Adams, David
2013-04-30
Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include workpiece surface finish, and the ability to generate chips of the desired length.
Early stage hot spot analysis through standard cell base random pattern generation
NASA Astrophysics Data System (ADS)
Jeon, Joong-Won; Song, Jaewan; Kim, Jeong-Lim; Park, Seongyul; Yang, Seung-Hune; Lee, Sooryong; Kang, Hokyu; Madkour, Kareem; ElManhawy, Wael; Lee, SeungJo; Kwan, Joe
2017-04-01
Because DRC-clean patterns are available only in limited numbers during process and RET recipe development, OPC recipes are not tested with high pattern coverage. A wider variety of patterns can help OPC engineers detect patterns that are sensitive to lithographic effects, so random pattern generation is needed to secure a robust OPC recipe. However, simple random patterns that ignore real product layout styles cannot cover patterning hotspots at production level, and using them for OPC optimization is ineffective; it is therefore important to generate random patterns similar to real product patterns. This paper presents a strategy for generating random patterns based on design architecture information and for preventing hotspots at an early process development stage, using a tool called Layout Schema Generator (LSG). Using LSG, we generate standard-cell-based random patterns reflecting real design cell structure: fin pitch, gate pitch, and cell height. The output standard cells from LSG are fed into an analysis methodology that assesses their hotspot severity by assigning a score according to their optical image parameters (NILS, MEEF, %PV band), so that potential hotspots can be identified by ranking. This flow is demonstrated on Samsung 7 nm technology, optimizing the OPC recipe early enough in the process to avoid problematic patterns.
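The idea of pitch-constrained random layout clips can be sketched as emitting rectangles snapped to fixed fin and gate pitches, with random gate drops providing variety. Everything below (names, pitches, drop rate) is illustrative and not LSG's actual behavior.

```python
import random

def random_cell_clip(gate_pitch, fin_pitch, n_gates, n_fins, seed=0):
    # Emit hypothetical gate/fin rectangles as (x, y, width, height),
    # snapped to the design-rule pitches of a standard-cell template.
    rng = random.Random(seed)
    gates = [(i * gate_pitch, 0, gate_pitch // 3, n_fins * fin_pitch)
             for i in range(n_gates) if rng.random() < 0.8]  # drop ~20% of gates
    fins = [(0, j * fin_pitch, n_gates * gate_pitch, fin_pitch // 4)
            for j in range(n_fins)]
    return gates + fins
```

Snapping every shape to the cell template's pitches is what keeps such random clips representative of real product layouts.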
2013-01-01
Background Multicellular organisms consist of cells of many different types that are established during development. Each type of cell is characterized by the unique combination of expressed gene products as a result of spatiotemporal gene regulation. Currently, a fundamental challenge in regulatory biology is to elucidate the gene expression controls that generate the complex body plans during development. Recent advances in high-throughput biotechnologies have generated spatiotemporal expression patterns for thousands of genes in the model organism fruit fly Drosophila melanogaster. Enhancing existing qualitative methods with the quantitative analysis enabled by the computational tools we present in this paper provides a promising way to address key scientific questions. Results We develop a set of computational methods and open source tools for identifying co-expressed embryonic domains and the associated genes simultaneously. To map the expression patterns of many genes into the same coordinate space and account for the embryonic shape variations, we develop a mesh generation method to deform a meshed generic ellipse to each individual embryo. We then develop a co-clustering formulation to cluster the genes and the mesh elements, thereby identifying co-expressed embryonic domains and the associated genes simultaneously. Experimental results indicate that the gene and mesh co-clusters can be correlated to key developmental events during the stages of embryogenesis we study. The open source software tool has been made available at http://compbio.cs.odu.edu/fly/. Conclusions Our mesh generation and machine learning methods and tools improve upon the flexibility, ease-of-use and accuracy of existing methods. PMID:24373308
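The co-clustering idea, simultaneously grouping genes and mesh elements by their expression matrix, can be illustrated with scikit-learn's spectral co-clustering on a toy gene-by-element matrix. This stands in for, and is not, the authors' own formulation.

```python
import numpy as np
from sklearn.cluster import SpectralCoclustering

# Toy expression matrix: rows = genes, columns = mesh elements.
# Two planted blocks mimic two co-expressed embryonic domains.
X = np.array([[9.0, 9.0, 0.1, 0.1],
              [8.5, 9.5, 0.2, 0.1],
              [0.1, 0.2, 9.0, 8.5],
              [0.2, 0.1, 9.5, 9.0]])

model = SpectralCoclustering(n_clusters=2, random_state=0).fit(X)
rows, cols = model.row_labels_, model.column_labels_
```

Each recovered bicluster pairs a gene group with the spatial domain (set of mesh elements) in which those genes are co-expressed.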
Demonstration of lithography patterns using reflective e-beam direct write
NASA Astrophysics Data System (ADS)
Freed, Regina; Sun, Jeff; Brodie, Alan; Petric, Paul; McCord, Mark; Ronse, Kurt; Haspeslagh, Luc; Vereecke, Bart
2011-04-01
Traditionally, e-beam direct write lithography has been too slow for most lithography applications. E-beam direct write lithography has been used for mask writing rather than wafer processing since the maximum blur requirements limit column beam current, which drives e-beam throughput. Printing small features at a fine pitch with an e-beam tool requires a sacrifice in processing time unless the total number of beams on a single writing tool is significantly increased. Because of the uncertainty with regard to the optical lithography roadmap beyond the 22 nm technology node, the semiconductor equipment industry is in the process of designing and testing e-beam lithography tools with the potential for high volume wafer processing. For this work, we report on the development and current status of a new maskless, direct write e-beam lithography tool which has the potential for high volume lithography at and below the 22 nm technology node. A Reflective Electron Beam Lithography (REBL) tool is being developed for high throughput electron beam direct write maskless lithography. The system is targeting critical patterning steps at the 22 nm node and beyond at a capital cost equivalent to conventional lithography. Reflective Electron Beam Lithography incorporates a number of novel technologies to generate and expose lithographic patterns with a throughput and footprint comparable to current 193 nm immersion lithography systems. A patented, reflective electron optic or Digital Pattern Generator (DPG) enables the unique approach. The Digital Pattern Generator is a CMOS ASIC chip with an array of small, independently controllable lens elements (lenslets), which act as an array of electron mirrors. In this way, the REBL system is capable of generating the pattern to be written using massively parallel exposure by ~1 million beams at extremely high data rates (~1 Tbps).
A rotary stage concept using a rotating platen carrying multiple wafers optimizes the writing strategy of the DPG to achieve the capability of high throughput for sparse pattern wafer levels. The lens elements on the DPG are fabricated at IMEC (Leuven, Belgium) under IMEC's CMORE program. The CMOS fabricated DPG contains ~ 1,000,000 lens elements, allowing for 1,000,000 individually controllable beamlets. A single lens element consists of 5 electrodes, each of which can be set at controlled voltage levels to either absorb or reflect the electron beam. A system using a linear movable stage and the DPG integrated into the electron optics module was used to expose patterns on device representative wafers. Results of these exposure tests are discussed.
Economic consequences of high throughput maskless lithography
NASA Astrophysics Data System (ADS)
Hartley, John G.; Govindaraju, Lakshmi
2005-11-01
Many people in the semiconductor industry bemoan the high costs of masks and view mask cost as one of the significant barriers to bringing new chip designs to market. All that is needed is a viable maskless technology and the problem will go away. Numerous sites around the world are working on maskless lithography but inevitably, the question asked is "Wouldn't a one-wafer-per-hour maskless tool make a really good mask writer?" Of course, the answer is yes; the hesitation you hear in the answer isn't based on technology concerns, it's financial. The industry needs maskless lithography because mask costs are too high. Mask costs are too high because mask pattern generators (PGs) are slow and expensive. If mask PGs become much faster, mask costs go down, the maskless market goes away, and the PG supplier is faced with an even smaller tool demand from the mask shops. Technical success becomes financial suicide - or does it? In this paper we present the results of a model that examines some of the consequences of introducing high-throughput maskless pattern generation. Specific features in the model include tool throughput for masks and wafers, market segmentation by node for masks and wafers, and mask cost as an entry barrier to new chip designs. How does the availability of low-cost masks and maskless tools affect the industry's tool makeup, and what is the ultimate potential market for high-throughput maskless pattern generators?
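The economic tension the model explores can be illustrated with a toy breakeven calculation: a mask set is a fixed cost amortized over wafer volume, while maskless direct write trades that fixed cost for a higher per-wafer cost. All numbers and names below are hypothetical and not from the authors' model.

```python
def breakeven_wafers(mask_set_cost, masked_wafer_cost, maskless_wafer_cost):
    # Wafer volume above which buying the mask set is cheaper than
    # maskless direct write (assumes maskless_wafer_cost is higher).
    return mask_set_cost / (maskless_wafer_cost - masked_wafer_cost)
```

With a hypothetical $1M mask set, $100 masked and $600 maskless per-wafer cost, designs expecting fewer than 2000 wafers would favor maskless, which is why low-volume designs are the natural entry market.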
Gerth, Victor E; Vize, Peter D
2005-04-01
The Gene Expression Viewer is a web-launched three-dimensional visualization tool, tailored to compare surface reconstructions of multi-channel image volumes generated by confocal microscopy or micro-CT.
Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools
Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.
2014-01-01
Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparisons of the archaeological record with extractive foraging behaviors in nonhuman primates have focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue has been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303
BrEPS 2.0: Optimization of sequence pattern prediction for enzyme annotation.
Dudek, Christian-Alexander; Dannheim, Henning; Schomburg, Dietmar
2017-01-01
The prediction of gene functions is crucial for a large number of different life science areas. Faster high-throughput sequencing techniques generate more and larger datasets. Manual annotation by classical wet-lab experiments is not suitable for these large amounts of data. We showed earlier that the automatic sequence pattern-based BrEPS protocol, based on manually curated sequences, can be used for the prediction of enzymatic functions of genes. The growing sequence databases provide the opportunity for more reliable patterns, but are also a challenge for the implementation of automatic protocols. We reimplemented and optimized the BrEPS pattern generation to be applicable to larger datasets in an acceptable timescale. The primary improvement of the new BrEPS protocol is the enhanced data selection step. Manually curated annotations from Swiss-Prot are used as a reliable source for function prediction of enzymes observed at the protein level. The pool of sequences is extended by highly similar sequences from TrEMBL and Swiss-Prot. This allows us to restrict the selection of Swiss-Prot entries without losing the diversity of sequences needed to generate significant patterns. Additionally, a supporting pattern type was introduced by extending the patterns at semi-conserved positions with highly similar amino acids. Extended patterns have an increased complexity, increasing the chance of matching more sequences without losing the essential structural information of the pattern. To enhance the usability of the database, we introduced enzyme function prediction based on consensus EC numbers and IUBMB enzyme nomenclature. BrEPS is part of the Braunschweig Enzyme Database (BRENDA) and is available on a completely redesigned website and as a download. The database can be downloaded and used with the BrEPScmd command line tool for large-scale sequence analysis.
The BrEPS website and downloads for the database creation tool, command line tool and database are freely accessible at http://breps.tu-bs.de.
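Sequence patterns of the kind BrEPS derives are conceptually similar to PROSITE-style patterns, which can be matched against protein sequences by translating them into regular expressions. The helper below is a hypothetical illustration of that idea, not the BrEPS implementation.

```python
import re

def prosite_to_regex(pattern):
    # Convert a simple PROSITE-style pattern (e.g. "G-x(2)-[ST]-K")
    # into a Python regex. Handles x wildcards, [..]/{..} classes and
    # (n)/(n,m) repeats; ranges with "-" inside classes are not supported.
    rx = pattern.replace("{", "[^").replace("}", "]")   # {P} = "not P"
    rx = re.sub(r"\((\d+)(?:,(\d+))?\)",
                lambda m: "{" + m.group(1)
                + ("," + m.group(2) if m.group(2) else "") + "}",
                rx)                                     # (2) -> {2}
    rx = rx.replace("-", "")                            # drop separators
    return rx.replace("x", ".")                         # x = any residue

def matches(pattern, seq):
    # True if the pattern occurs anywhere in the sequence.
    return re.search(prosite_to_regex(pattern), seq) is not None
```

Extended patterns with highly similar amino acids at semi-conserved positions would correspond to widening the character classes in such a regex.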
Uomini, Natalie Thaïs; Meyer, Georg Friedrich
2013-01-01
The popular theory that complex tool-making and language co-evolved in the human lineage rests on the hypothesis that both skills share underlying brain processes and systems. However, language and stone tool-making have so far only been studied separately using a range of neuroimaging techniques and diverse paradigms. We present the first-ever study of brain activation that directly compares active Acheulean tool-making and language. Using functional transcranial Doppler ultrasonography (fTCD), we measured brain blood flow lateralization patterns (hemodynamics) in subjects who performed two tasks designed to isolate the planning component of Acheulean stone tool-making and cued word generation as a language task. We show highly correlated hemodynamics in the initial 10 seconds of task execution. Stone tool-making and cued word generation cause common cerebral blood flow lateralization signatures in our participants. This is consistent with a shared neural substrate for prehistoric stone tool-making and language, and is compatible with language evolution theories that posit a co-evolution of language and manual praxis. In turn, our results support the hypothesis that aspects of language might have emerged as early as 1.75 million years ago, with the start of Acheulean technology.
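The fTCD comparison rests on a lateralization index computed from left and right blood-flow velocities and on correlating the two tasks' index time courses. A minimal sketch, illustrative rather than the study's analysis pipeline:

```python
import numpy as np

def lateralization_index(left, right):
    # Per-sample blood-flow lateralization: (L - R) / (L + R),
    # positive = left-lateralized, negative = right-lateralized.
    left, right = np.asarray(left, float), np.asarray(right, float)
    return (left - right) / (left + right)

def task_similarity(li_a, li_b):
    # Pearson correlation between two tasks' LI time courses.
    return float(np.corrcoef(li_a, li_b)[0, 1])
```

A high correlation between the tool-making and word-generation LI time courses is the "common lateralization signature" the abstract describes.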
Layout pattern analysis using the Voronoi diagram of line segments
NASA Astrophysics Data System (ADS)
Dey, Sandeep Kumar; Cheilaris, Panagiotis; Gabrani, Maria; Papadopoulou, Evanthia
2016-01-01
Early identification of problematic patterns in very large scale integration (VLSI) designs is of great value as the lithographic simulation tools face significant timing challenges. To reduce the processing time, such a tool selects only a fraction of possible patterns which have a probable area of failure, with the risk of missing some problematic patterns. We introduce a fast method to automatically extract patterns based on their structure and context, using the Voronoi diagram of line-segments as derived from the edges of VLSI design shapes. Designers put line segments around the problematic locations in patterns called "gauges," along which the critical distance is measured. The gauge center is the midpoint of a gauge. We first use the Voronoi diagram of VLSI shapes to identify possible problematic locations, represented as gauge centers. Then we use the derived locations to extract windows containing the problematic patterns from the design layout. The problematic locations are prioritized by the shape and proximity information of the design polygons. We perform experiments for pattern selection in a portion of a 22-nm random logic design layout. The design layout had 38,584 design polygons (consisting of 199,946 line segments) on layer Mx, and 7079 markers generated by an optical rule checker (ORC) tool. The optical rules specify requirements for printing circuits with minimum dimension. Markers are the locations of some optical rule violations in the layout. We verify our approach by comparing the coverage of our extracted patterns to the ORC-generated markers. We further derive a similarity measure between patterns and between layouts. The similarity measure helps to identify a set of representative gauges that reduces the number of patterns for analysis.
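A Voronoi diagram of line segments can be approximated by densely sampling points along each segment and computing an ordinary point Voronoi diagram, a common shortcut when an exact segment Voronoi implementation is unavailable. The sketch below uses SciPy and is not the authors' exact method, which operates on the true segment Voronoi diagram.

```python
import numpy as np
from scipy.spatial import Voronoi

def sample_segment(p, q, n=20):
    # n evenly spaced points along the segment from p to q.
    t = np.linspace(0.0, 1.0, n)[:, None]
    return (1 - t) * np.asarray(p, float) + t * np.asarray(q, float)

def approx_segment_voronoi(segments, n=20):
    # Point Voronoi diagram over dense samples of all segments;
    # its vertices approximate the segment Voronoi structure.
    pts = np.vstack([sample_segment(p, q, n) for p, q in segments])
    return Voronoi(pts)
```

Voronoi vertices and edges between samples from different design polygons approximate the proximity structure used to locate candidate problematic patterns.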
DMT-TAFM: a data mining tool for technical analysis of futures market
NASA Astrophysics Data System (ADS)
Stepanov, Vladimir; Sathaye, Archana
2002-03-01
Technical analysis of financial markets describes many patterns of market behavior. For practical use, all these descriptions need to be adjusted for each particular trading session. In this paper, we develop a data mining tool for technical analysis of the futures markets (DMT-TAFM), which dynamically generates rules based on the notion of price pattern similarity. The tool consists of three main components. The first component provides visualization of data series on a chart with different ranges, scales, and chart sizes and types. The second component constructs pattern descriptions using sets of polynomials. The third component specifies the training set for mining, defines the similarity notion, and searches for a set of similar patterns. DMT-TAFM is useful for preparing the data and then revealing and systematizing statistical information about similar patterns found in any type of historical price series. We performed experiments with our tool on three decades of trading data for a hundred types of futures. Our results for this data set show that we can prove or disprove many well-known patterns based on real data, as well as reveal new ones, and use the set of relatively consistent patterns found during data mining to develop better futures trading strategies.
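Describing price windows by polynomial fits and comparing the resulting coefficient vectors is one simple way to define pattern similarity of the kind the second and third components use. A hypothetical sketch (normalization, degree, and distance are illustrative choices, not the tool's):

```python
import numpy as np

def window_signature(prices, degree=3):
    # Describe a price window by the coefficients of a polynomial
    # fitted to the z-normalized prices over x in [-1, 1].
    prices = np.asarray(prices, float)
    x = np.linspace(-1.0, 1.0, len(prices))
    y = (prices - prices.mean()) / (prices.std() + 1e-12)
    return np.polyfit(x, y, degree)

def similarity(w1, w2, degree=3):
    # Smaller = more similar: Euclidean distance between signatures.
    return float(np.linalg.norm(window_signature(w1, degree)
                                - window_signature(w2, degree)))
```

Z-normalizing before fitting makes the signature invariant to price level and volatility scale, so windows with the same shape compare as identical.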
From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool
NASA Astrophysics Data System (ADS)
Scheibler, Thorsten; Leymann, Frank
One of the predominant problems IT companies face today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance for non-technical business users as a way to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way, so they can be used to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually; they are not intended to stand for artefacts that can be executed immediately. This paper presents a tool supporting a method by which EAI patterns can be used to automatically generate executable artefacts for various target platforms using a model-driven development approach, hence turning patterns into something executable. To this end, we introduce a continuous tool chain beginning at the design phase and ending with the execution of an integration solution in a completely automated manner. For evaluation purposes we present a scenario demonstrating how the tool is used to model and actually execute an integration scenario.
NASA Astrophysics Data System (ADS)
Simpson, R. A.; Davis, D. E.
1982-09-01
This paper describes techniques to detect submicron pattern defects on optical photomasks with an enhanced direct-write, electron-beam lithographic tool. EL-3 is a third-generation, shaped-spot, electron-beam lithography tool developed by IBM to fabricate semiconductor devices and masks. This tool is being upgraded to provide 100% inspection of optical photomasks for submicron pattern defects, which are subsequently repaired. Fixed-size overlapped spots are stepped over the mask patterns while a signal derived from the back-scattered electrons is monitored to detect pattern defects. Inspection does not require pattern recognition because the inspection scan patterns are derived from the original design data. The inspection spot is square and larger than the minimum defect to be detected, to improve throughput. A new registration technique provides the beam-to-pattern overlay required to locate submicron defects. The "guard banding" of inspection shapes prevents mask and system tolerances from producing false alarms that would occur should the spots be mispositioned such that they only partially covered a shape being inspected. A rescanning technique eliminates noise-related false alarms and significantly improves throughput. Data are accumulated during inspection and processed offline, as required for defect repair. EL-3 will detect 0.5 μm pattern defects at throughputs compatible with mask manufacturing.
Martin, Teresa A.; Herman, Christine T.; Limpoco, Francis T.; Michael, Madeline C.; Potts, Gregory K.; Bailey, Ryan C.
2014-01-01
Methods for the generation of substrates presenting biomolecules in a spatially controlled manner are enabling tools for applications in biosensor systems, microarray technologies, fundamental biological studies and biointerface science. We have implemented a method to create biomolecular patterns by using light to control the direct covalent immobilization of biomolecules onto benzophenone-modified glass substrates. We have generated substrates presenting up to three different biomolecules patterned in sequence, and demonstrate biomolecular photopatterning on corrugated substrates. The chemistry of the underlying monolayer was optimized to incorporate poly(ethylene glycol) to enable selective cell adhesion onto patterned extracellular matrix proteins. Substrates were characterized with contact angle goniometry, AFM, and immunofluorescence microscopy. Importantly, radioimmunoassays were performed to quantify the site density of immobilized biomolecules on photopatterned substrates. Retention of function of photopatterned proteins was demonstrated both by native ligand recognition and cell adhesion to photopatterned substrates, revealing that substrates generated with this method are suitable for probing specific cell receptor-ligand interactions. This molecularly general photochemical patterning method is an enabling tool that will allow the creation of substrates presenting both biochemical and topographical variation, which is an important feature of many native biointerfaces. PMID:21793535
Laser pattern generator challenges in airborne molecular contamination protection
NASA Astrophysics Data System (ADS)
Ekberg, Mats; Skotte, Per-Uno; Utterback, Tomas; Paul, Swaraj; Kishkovich, Oleg P.; Hudzik, James S.
2003-08-01
The introduction of photomask laser pattern generators presents new challenges to system designers and manufacturers. One of the laser pattern generator's environmental operating challenges is Airborne Molecular Contamination (AMC), which affects both chemically amplified resists (CAResist) and laser optics. Similar challenges in CAResist protection have already been addressed in semiconductor wafer lithography, with reasonable solutions and experience gained by all those involved. However, photomask and photomask-equipment manufacturers have not previously had comparable experience, and some photomask AMC issues differ from those seen in semiconductor wafer lithography. Drawing on years of AMC experience, the authors discuss the specific requirements of photomask AMC. Air sampling and materials-of-construction analyses were performed to understand these particular AMC challenges and were used to develop an appropriate filtration specification for different classes of contaminants. The authors highlight the importance of cooperation between tool designers and AMC experts early in the design stage to ensure that goals are met and that both process stability and machine productivity are maximized in advanced mask making. In conclusion, the authors provide valuable recommendations to both laser tool users and other equipment manufacturers.
RSAT: regulatory sequence analysis tools.
Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques
2008-07-01
The Regulatory Sequence Analysis Tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundred researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
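The word-based pattern-discovery idea behind oligo-analysis can be illustrated with a minimal sketch: count k-mer occurrences across a set of sequences and rank the k-mers by a binomial P-value of over-representation. This is our toy approximation with a uniform Bernoulli background, not RSAT code; the real tools also support Markov and organism-specific background models.

```python
from math import comb

def overrepresented_kmers(seqs, k, top=3):
    """Rank k-mers by a binomial P-value of over-representation under a
    uniform Bernoulli background (p = 1/4**k per position): a toy version
    of word-based pattern discovery."""
    counts, n_positions = {}, 0
    for s in seqs:
        for i in range(len(s) - k + 1):
            w = s[i:i + k]
            counts[w] = counts.get(w, 0) + 1
            n_positions += 1
    p = 1.0 / 4 ** k

    def pval(c):
        # P(X >= c) for X ~ Binomial(n_positions, p)
        return sum(comb(n_positions, j) * p ** j * (1 - p) ** (n_positions - j)
                   for j in range(c, n_positions + 1))

    return sorted(counts, key=lambda w: pval(counts[w]))[:top]

# Toy promoters: the motif "TGACGT" is planted once in every sequence.
seqs = ["AAATGACGTCCA", "GGTGACGTATTC", "CTTTGACGTGGA", "TGACGTAAACCC"]
ranked = overrepresented_kmers(seqs, 6)
# The planted motif ranks first.
```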
Washburne, Alex D; Silverman, Justin D; Leff, Jonathan W; Bennett, Dominic J; Darcy, John L; Mukherjee, Sayan; Fierer, Noah; David, Lawrence A
2017-01-01
Marker gene sequencing of microbial communities has generated big datasets of microbial relative abundances varying across environmental conditions, sample sites and treatments. These data often come with putative phylogenies, providing unique opportunities to investigate how shared evolutionary history affects microbial abundance patterns. Here, we present a method to identify the phylogenetic factors driving patterns in microbial community composition. We use the method, "phylofactorization," to re-analyze datasets from the human body and soil microbial communities, demonstrating how phylofactorization is a dimensionality-reducing tool, an ordination-visualization tool, and an inferential tool for identifying edges in the phylogeny along which putative functional ecological traits may have arisen.
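The core contrast in phylofactorization is an isometric-log-ratio-style "balance" that compares the geometric mean abundances of the taxa on the two sides of a candidate phylogeny edge. A minimal sketch follows; the taxon names and abundances are hypothetical, and the full method iterates this contrast over edges and factors out the winner:

```python
from math import exp, log, sqrt

def gmean(xs):
    """Geometric mean of strictly positive values."""
    return exp(sum(log(x) for x in xs) / len(xs))

def edge_balance(abund, side_a, side_b):
    """ILR-style balance for a candidate phylogeny edge: contrasts the
    geometric mean relative abundance of the taxa on one side of the edge
    against the other. A large |balance| marks an edge along which
    community composition shifts."""
    r, s = len(side_a), len(side_b)
    a = [abund[t] for t in side_a]
    b = [abund[t] for t in side_b]
    return sqrt(r * s / (r + s)) * log(gmean(a) / gmean(b))

# Hypothetical clade split: t1,t2 on one side of the edge, t3,t4 on the other.
abundances = {"t1": 0.4, "t2": 0.4, "t3": 0.1, "t4": 0.1}
balance = edge_balance(abundances, ("t1", "t2"), ("t3", "t4"))
```

Swapping the two sides flips the sign of the balance, so only its magnitude matters when ranking edges.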
Initial benchmarking of a new electron-beam raster pattern generator for 130-100 nm maskmaking
NASA Astrophysics Data System (ADS)
Sauer, Charles A.; Abboud, Frank E.; Babin, Sergey V.; Chakarian, Varoujan; Ghanbari, Abe; Innes, Robert; Trost, David; Raymond, Frederick, III
2000-07-01
The decision by the Semiconductor Industry Association (SIA) to accelerate the continuing evolution to smaller linewidths is consistent with the commitment by Etec Systems, Inc. to rapidly develop new technologies for pattern generation systems with improved resolution, critical dimension (CD) uniformity, positional accuracy, and throughput. Current pattern generation designs are inadequate to meet the more advanced requirements for masks, particularly at or below the 100 nm node. Major changes to all pattern generation tools will be essential to meet future market requirements. An electron-beam (e-beam) system that is designed to meet the challenges of the 130-100 nm device generations, with extendibility to the 70 nm range, will be discussed. This system has an architecture that includes a graybeam writing strategy, a new stage system, and improved thermal management. Detailed changes include a pulse-width-modulated blanking system, per-pixel deflection, retrograde-scanning multipass writing, and a column with a 50 kV accelerating voltage that supports a dose of up to 45 μC/cm² with minimal resist heating. This paper examines current issues, our approach to meeting International Technology Roadmap for Semiconductors (ITRS) requirements, and some preliminary results from a new pattern generator.
Pattern of Plagiarism in Novice Students' Generated Programs: An Experimental Approach
ERIC Educational Resources Information Center
Ahmadzadeh, Marzieh; Mahmoudabadi, Elham; Khodadadi, Farzad
2011-01-01
Anecdotal evidence shows that in computer programming courses plagiarism is a widespread problem. With the growing number of students in such courses, manual plagiarism detection is impractical. This requires instructors to use one of the many available plagiarism detection tools. Prior to choosing one of such tools, a metric that assures the…
NASA Astrophysics Data System (ADS)
Nishino, Takayuki
The face hobbing process has been widely applied in the automotive industry, but so far few analytical tools have been developed for it, which makes it difficult to optimize gear designs. To address this, the present study aims at developing a computerized tool to predict running performances such as the loaded tooth contact pattern and static transmission error. First, based upon a kinematical analysis of the cutting machine, a mathematical description of tooth surface generation is given. Second, based upon the theory of gearing and differential geometry, conjugate tooth surfaces are studied and contact lines are generated. Third, the load distribution along the contact lines is formulated. Last, the numerical model is validated by measuring the loaded transmission error and the loaded tooth contact pattern.
Zhu, Zhiwei; To, Suet; Zhang, Shaojian
2015-09-01
Residual tool marks (RTM) with particular patterns are inherent to fast tool servo or slow tool servo (FTS/STS) diamond turning and strongly affect the optical function of the generated freeform optics. In the present study, a novel biaxial servo assisted fly cutting (BSFC) method is developed for flexible control of the RTM so that they become a functional micro/nanotexture in freeform optics generation, which is generally hard to achieve in FTS/STS diamond turning. In the BSFC system, biaxial servo motions along the z-axis and side-feeding directions are adopted mainly for primary surface generation and RTM control, respectively. Active control of the RTM in two respects, namely elimination of undesired effects and effective functionalization, is experimentally demonstrated by fabricating a typical F-theta freeform surface with scattering homogenization and two functional microstructures with imposition of secondary phase gratings integrating both reflective and diffractive functions.
Community health assessment tool: a patterns approach to data collection and diagnosis.
Kriegler, N F; Harton, M K
1992-01-01
Creation of an assessment tool to apply Gordon's functional patterns to the community as a client was a rewarding and stimulating project. Through use of the CHAT, students developed an appreciation of the complexity and inter-relationship of numerous aspects of the community. They completed the nursing process by developing appropriate nursing diagnoses, and planning, implementing, and evaluating a health promotion project. As the students continue to use this tool in the health promotion course, the diagnoses which they generate are being collected. From this accumulated input the plan is to compile a list of common diagnoses which are appropriate to use when the community is the client.
NASA Astrophysics Data System (ADS)
Otanocha, Omonigho B.; Li, Lin; Zhong, Shan; Liu, Zhu
2016-03-01
H13 tool steels are often used as dies and moulds for injection moulding of plastic components. Certain injection-moulded components require micro-patterns on their surfaces in order to modify the physical properties of the components or for better mould release to reduce mould contamination. For these applications it is necessary to study the micro-patterning of moulds and to ensure effective pattern transfer and replication onto the plastic component during moulding. In this paper, we report an investigation into high-average-power (100 W) picosecond laser interactions with H13 tool steel during surface micro-patterning (texturing) and the subsequent pattern replication on ABS plastic material through injection moulding. Design of experiments and statistical modelling were used to understand the influences of laser pulse repetition rate, laser fluence, scanning velocity, and number of scans on the depth of cut, kerf width, and heat-affected zone (HAZ) size. The characteristics of the surface patterns are analysed, and the interactions and significance of the process parameters with respect to processing quality and efficiency are characterised. An optimum operating window is recommended. The transferred geometry is compared with the patterns generated on the dies, and a discussion explains the characteristics of laser texturing and pattern replication on plastics.
Computer Simulation Of An In-Process Surface Finish Sensor.
NASA Astrophysics Data System (ADS)
Rakels, Jan H.
1987-01-01
It is generally accepted that optical methods are the most promising for the in-process measurement of surface finish. These methods have the advantages of being non-contacting and of offering fast data acquisition. Furthermore, such optical instruments can easily be retrofitted to existing machine tools. In the Micro-Engineering Centre at the University of Warwick, an optical sensor has been developed which can measure the rms roughness, slope and wavelength of turned and precision-ground surfaces during machining. The operation of this device is based upon the Kirchhoff-Fresnel diffraction integral. Application of this theory to ideal turned and ground surfaces is straightforward, and indeed the calculated diffraction patterns are in close agreement with patterns produced by an actual optical instrument. Since it is mathematically difficult to introduce real machine-tool behaviour into the diffraction integral, a computer program has been devised which simulates the operation of the optical sensor. The program produces a diffraction pattern as a graphical output. Comparison between computer-generated and actual diffraction patterns of the same surfaces shows a high correlation. The main aim of this program is to construct an atlas which maps known machine-tool errors to optical diffraction patterns. This atlas can then be used for machine-tool condition diagnostics. It has been found that optical monitoring is very sensitive to minor defects; therefore machine-tool deterioration can be detected before it becomes detrimental.
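A toy one-dimensional version of such a simulation can be sketched from the scalar Kirchhoff approximation, in which the far-field intensity at normal incidence is the squared magnitude of the Fourier transform of the surface phase factor exp(i·2k·h(x)). All parameters below (wavelength, amplitude, sampling) are our illustrative choices, not the actual Warwick sensor model:

```python
import cmath
import math

def diffraction_pattern(h, wavelength):
    """Far-field intensity |DFT(exp(i*2k*h))|^2 of a sampled height
    profile h: a discrete scalar-Kirchhoff toy model at normal incidence."""
    n = len(h)
    k = 2 * math.pi / wavelength
    field = [cmath.exp(2j * k * z) for z in h]
    intensity = []
    for m in range(n):
        s = sum(f * cmath.exp(-2j * math.pi * m * j / n)
                for j, f in enumerate(field))
        intensity.append(abs(s) ** 2 / n ** 2)
    return intensity

# Ideal turned surface: sinusoidal tool marks (amplitude and wavelength
# in micrometres are illustrative choices).
n, wavelength, cycles = 64, 0.633, 8
h = [0.02 * math.sin(2 * math.pi * cycles * j / n) for j in range(n)]
I = diffraction_pattern(h, wavelength)
# For a sinusoidal profile the energy sits in the specular order (m = 0)
# and in side orders at multiples of the tool-mark frequency (m = 8, 56, ...).
```

Changing the amplitude or period of h shifts energy between the specular and side orders, which is the basis for inferring roughness and feed wavelength from the pattern.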
Pain assessment tools: is the content appropriate for use in palliative care?
Hølen, Jacob Chr; Hjermstad, Marianne Jensen; Loge, Jon Håvard; Fayers, Peter M; Caraceni, Augusto; De Conno, Franco; Forbes, Karen; Fürst, Carl Johan; Radbruch, Lukas; Kaasa, Stein
2006-12-01
Inadequate pain assessment prevents optimal treatment in palliative care. The content of pain assessment tools might limit their usefulness for proper pain assessment, but data on the content validity of the tools are scarce. The objective of this study was to examine the content of the existing pain assessment tools, and to evaluate the appropriateness of different dimensions and items for pain assessment in palliative care. A systematic search was performed to find pain assessment tools for patients with advanced cancer who were receiving palliative care. An ad hoc search with broader search criteria supplemented the systematic search. The items of the identified tools were allocated to appropriate dimensions. This was reviewed by an international panel of experts, who also evaluated the relevance of the different dimensions for pain assessment in palliative care. The systematic literature search generated 16 assessment tools while the ad hoc search generated 64. Ten pain dimensions containing 1,011 pain items were identified by the experts. The experts ranked intensity, temporal pattern, treatment and exacerbating/relieving factors, location, and interference with health-related quality of life as the most important dimensions. None of the assessment tools covered these dimensions satisfactorily. Most items were related to interference (231) and intensity (138). Temporal pattern (which includes breakthrough pain), ranked as the second most important dimension, was covered by 29 items only. Many tools include dimensions and items of limited relevance for patients with advanced cancer. This might reduce compliance and threaten the validity of the assessment. New tools should reflect the clinical relevance of different dimensions and be user-friendly.
Pseudo-random tool paths for CNC sub-aperture polishing and other applications.
Dunn, Christina R; Walker, David D
2008-11-10
In this paper we first contrast classical and CNC polishing techniques in regard to the repetitiveness of the machine motions. We then present a pseudo-random tool path for use with CNC sub-aperture polishing techniques and report polishing results from equivalent random and raster tool-paths. The random tool-path used - the unicursal random tool-path - employs a random seed to generate a pattern which never crosses itself. Because of this property, this tool-path is directly compatible with dwell time maps for corrective polishing. The tool-path can be used to polish any continuous area of any boundary shape, including surfaces with interior perforations.
ALOG user's manual: A Guide to using the spreadsheet-based artificial log generator
Matthew F. Winn; Philip A. Araman; Randolph H. Wynne
2012-01-01
Computer programs that simulate log sawing can be valuable training tools for sawyers, as well as a means oftesting different sawing patterns. Most available simulation programs rely on diagrammed-log databases, which canbe very costly and time consuming to develop. Artificial Log Generator (ALOG) is a user-friendly Microsoft® Excel®...
Rethinking Methodology: What Language Diaries Can Offer to the Study of Code Choice
ERIC Educational Resources Information Center
Starks, Donna; Lee, Jeong
2010-01-01
The self-report questionnaire has served as the primary tool for investigating language maintenance in hundreds of communities over the past 40 years. More recently, it has been employed to investigate language shift amongst first-generation communities where one of the most useful indicators of generational change is the reported pattern of…
Tools for neuroanatomy and neurogenetics in Drosophila
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfeiffer, Barret D.; Jenett, Arnim; Hammonds, Ann S.
2008-08-11
We demonstrate the feasibility of generating thousands of transgenic Drosophila melanogaster lines in which the expression of an exogenous gene is reproducibly directed to distinct small subsets of cells in the adult brain. We expect the expression patterns produced by the collection of 5,000 lines that we are currently generating to encompass all neurons in the brain in a variety of intersecting patterns. Overlapping 3-kb DNA fragments from the flanking noncoding and intronic regions of genes thought to have patterned expression in the adult brain were inserted into a defined genomic location by site-specific recombination. These fragments were then assayed for their ability to function as transcriptional enhancers in conjunction with a synthetic core promoter designed to work with a wide variety of enhancer types. An analysis of 44 fragments from four genes found that >80% drive expression patterns in the brain; the observed patterns were, on average, comprised of <100 cells. Our results suggest that the D. melanogaster genome contains >50,000 enhancers and that multiple enhancers drive distinct subsets of expression of a gene in each tissue and developmental stage. We expect that these lines will be valuable tools for neuroanatomy as well as for the elucidation of neuronal circuits and information flow in the fly brain.
ERIC Educational Resources Information Center
Khalil, Mohammad; Ebner, Martin
2017-01-01
Massive Open Online Courses (MOOCs) are remote courses that excel in their students' heterogeneity and quantity. Owing to their massive scale, the large datasets generated by MOOC platforms require advanced tools and techniques to reveal hidden patterns for purposes of enhancing learning and educational behaviors. This publication…
NASA Astrophysics Data System (ADS)
Huett, Marc-Thorsten
2003-05-01
We formulate mathematical tools for analyzing spatiotemporal data sets. The tools are based on nearest-neighbor considerations similar to cellular automata. One of the analysis tools allows for reconstructing the noise intensity in a data set and is an appropriate method for detecting a variety of noise-induced phenomena in spatiotemporal data. The functioning of these methods is illustrated on sample data generated with the forest fire model and with networks of nonlinear oscillators. It is seen that these methods allow the characterization of spatiotemporal stochastic resonance (STSR) in experimental data. Application of these tools to biological spatiotemporal patterns is discussed. For one specific example, the slime mold Dictyostelium discoideum, it is seen how transitions between different patterns are clearly marked by changes in the spatiotemporal observables.
Design pattern mining using distributed learning automata and DNA sequence alignment.
Esmaeilpour, Mansour; Naderifar, Vahideh; Shukur, Zarina
2014-01-01
Over the last decade, design patterns have been used extensively to generate reusable solutions to frequently encountered problems in software engineering and object-oriented programming. A design pattern is a repeatable software design solution that provides a template for solving various instances of a general problem. This paper describes a new method for pattern mining that isolates design patterns and the relationships between them, together with a related tool, DLA-DNA, covering all implemented patterns and all projects used for evaluation. Based on distributed learning automata (DLA) and deoxyribonucleic acid (DNA) sequence alignment, DLA-DNA achieves better precision and recall than the other evaluated tools. The proposed method mines structural design patterns in object-oriented source code and extracts the strong and weak relationships between them, enabling analysts and programmers to determine the dependency rate of each object, component, and other sections of the code for parameter passing and modular programming. The proposed model detects design patterns, and the strengths of their relationships, better than the other available tools Pinot, PTIDEJ, and DPJF. The results demonstrate that whether the source code is built in a standard or non-standard way with respect to design patterns, the proposed method performs close to DPJF and better than Pinot and PTIDEJ. The model was tested on several source codes and compared with other related models and available tools; on average, the precision and recall of the proposed method are 20% and 9.6% higher than Pinot, 27% and 31% higher than PTIDEJ, and 3.3% and 2% higher than DPJF, respectively. The method is organized in two steps: in the first step, elemental design patterns are identified; in the second, these are composed to recognize actual design patterns.
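The DNA-alignment step can be illustrated with a minimal global alignment (Needleman-Wunsch) scorer applied to string-encoded structural signatures. The signature alphabet and the scoring parameters below are our assumptions for illustration, not the encoding used by DLA-DNA:

```python
def nw_score(a, b, match=2, mismatch=-1, gap=-2):
    """Global (Needleman-Wunsch) alignment score of two strings."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        dp[i][0] = i * gap
    for j in range(1, cols):
        dp[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            best = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(best, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[-1][-1]

# Hypothetical signature alphabet: 'C' class node, 'I' inheritance edge,
# 'D' delegation edge, as extracted from object-oriented source code.
pattern_sig = "CIDC"   # reference signature for a design pattern
candidate = "CIDC"     # code fragment matching the pattern
unrelated = "DDDD"     # code fragment that does not match
```

A candidate whose signature aligns closely with a reference pattern scores higher than an unrelated fragment, which is the basis for pattern detection by alignment.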
Supervised learning of tools for content-based search of image databases
NASA Astrophysics Data System (ADS)
Delanoy, Richard L.
1996-03-01
A computer environment, called the Toolkit for Image Mining (TIM), is being developed with the goal of enabling users with diverse interests and varied computer skills to create search tools for content-based image retrieval and other pattern matching tasks. Search tools are generated using a simple paradigm of supervised learning that is based on the user pointing at mistakes of classification made by the current search tool. As mistakes are identified, a learning algorithm uses the identified mistakes to build up a model of the user's intentions, construct a new search tool, apply the search tool to a test image, display the match results as feedback to the user, and accept new inputs from the user. Search tools are constructed in the form of functional templates, which are generalized matched filters capable of knowledge-based image processing. The ability of this system to learn the user's intentions from experience contrasts with other existing approaches to content-based image retrieval that base searches on the characteristics of a single input example or on a predefined and semantically-constrained textual query. Currently, TIM is capable of learning spectral and textural patterns, but should be adaptable to the learning of shapes, as well. Possible applications of TIM include not only content-based image retrieval, but also quantitative image analysis, the generation of metadata for annotating images, data prioritization or data reduction in bandwidth-limited situations, and the construction of components for larger, more complex computer vision algorithms.
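A functional template acts as a generalized matched filter. A minimal, pure-Python sketch of the underlying matching step - normalized cross-correlation of a small template slid over an image - is given below; the image and template are toy arrays, and real functional templates add knowledge-based scoring on top of this:

```python
def ncc(patch, template):
    """Normalized cross-correlation between two equally sized patches."""
    n = len(patch) * len(patch[0])
    mp = sum(map(sum, patch)) / n
    mt = sum(map(sum, template)) / n
    num = den_p = den_t = 0.0
    for row_p, row_t in zip(patch, template):
        for p, t in zip(row_p, row_t):
            num += (p - mp) * (t - mt)
            den_p += (p - mp) ** 2
            den_t += (t - mt) ** 2
    den = (den_p * den_t) ** 0.5
    return num / den if den else 0.0

def search(image, template):
    """Slide the template over the image; return (row, col) and score
    of the best-matching position."""
    th, tw = len(template), len(template[0])
    best, best_pos = -2.0, None
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            score = ncc(patch, template)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best

# Toy image with one diagonal feature matching the template.
template = [[1, 0], [0, 1]]
image = [[0] * 5 for _ in range(5)]
image[1][1] = image[2][2] = 5
pos, score = search(image, template)
```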
Lacunarity study of speckle patterns produced by rough surfaces
NASA Astrophysics Data System (ADS)
Dias, M. R. B.; Dornelas, D.; Balthazar, W. F.; Huguenin, J. A. O.; da Silva, L.
2017-11-01
In this work we report on the study of Lacunarity of digital speckle patterns generated by rough surfaces. The study of Lacunarity of speckle patterns was performed on both static and moving rough surfaces. The results show that the Lacunarity is sensitive to the surface roughness, which suggests that it can be used to perform indirect measurement of surface roughness as well as to monitor defects, or variations of roughness, of metallic moving surfaces. Our results show the robustness of this statistical tool applied to speckle pattern in order to study surface roughness.
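The statistic itself is simple to state: gliding-box lacunarity at scale r is the second moment of the box-mass distribution over all r x r boxes, normalized by the squared mean. A minimal sketch on a binary pattern (the patterns below are toys, not speckle data):

```python
def lacunarity(pattern, r):
    """Gliding-box lacunarity of a binary 2-D pattern at box size r:
    second moment of the box-mass distribution divided by the squared
    mean; higher values indicate a 'gappier' texture at that scale."""
    h, w = len(pattern), len(pattern[0])
    masses = []
    for i in range(h - r + 1):
        for j in range(w - r + 1):
            masses.append(sum(pattern[i + a][j + b]
                              for a in range(r) for b in range(r)))
    mean = sum(masses) / len(masses)
    second = sum(m * m for m in masses) / len(masses)
    return second / mean ** 2 if mean else float("inf")

# A translationally uniform pattern has lacunarity 1; a clumped pattern
# (all bright pixels concentrated in one corner) scores higher.
uniform = [[1] * 6 for _ in range(6)]
clumped = [[1 if i < 3 and j < 3 else 0 for j in range(6)] for i in range(6)]
```

Computing the curve over several box sizes r gives a scale-dependent texture signature of the speckle pattern.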
ALOG: A spreadsheet-based program for generating artificial logs
Matthew F. Winn; Randolph H. Wynne; Philip A. Araman
2004-01-01
Log sawing simulation computer programs can be valuable tools for training sawyers as well as for testing different sawing patterns. Most available simulation programs rely on databases from which to draw logs and can be very costly and time-consuming to develop. ALOG (Artificial LOg Generator) is a Microsoft Excel®-based computer program that was developed to...
Tenhaven, Christoph; Tipold, Andrea; Fischer, Martin R; Ehlers, Jan P
2013-01-01
Informal and formal lifelong learning is essential at university and in the workplace. Apart from classical learning techniques, Web 2.0 tools can be used. It is controversial whether a so-called net generation exists amongst people under 30. To test the hypothesis that a net generation exists among students and young veterinarians, an online survey of students and veterinarians was conducted in the German-speaking countries, advertised via online media and traditional print media. 1780 people took part in the survey. Students and veterinarians have different usage patterns regarding social networks (91.9% vs. 69%) and instant messaging (55.9% vs. 24.5%). All tools were predominantly used passively and in private, and to a lesser extent professionally and for studying. The use of Web 2.0 tools is useful; however, teaching information and media skills, preparing codes of conduct for the internet, and verifying user-generated content are essential.
Evidence Arguments for Using Formal Methods in Software Certification
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Pai, Ganesh
2013-01-01
We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example - an unmanned aircraft system - where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.
Network Sampling and Classification: An Investigation of Network Model Representations
Airoldi, Edoardo M.; Bai, Xue; Carley, Kathleen M.
2011-01-01
Methods for generating a random sample of networks with desired properties are important tools for the analysis of social, biological, and information networks. Algorithm-based approaches to sampling networks have received a great deal of attention in recent literature. Most of these algorithms are based on simple intuitions that associate the full features of connectivity patterns with specific values of only one or two network metrics. Substantive conclusions are crucially dependent on this association holding true. However, the extent to which this simple intuition holds true is not yet known. In this paper, we examine the association between the connectivity patterns that a network sampling algorithm aims to generate and the connectivity patterns of the generated networks, measured by an existing set of popular network metrics. We find that different network sampling algorithms can yield networks with similar connectivity patterns. We also find that alternative algorithms for the same connectivity pattern can yield networks with different connectivity patterns. We argue that conclusions based on simulated network studies must focus on the full features of the connectivity patterns of a network instead of on the limited set of network metrics for a specific network type. This fact has important implications for network data analysis: for instance, implications related to the way significance is currently assessed. PMID:21666773
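The core caution - that one or two metrics do not pin down a connectivity pattern - can be demonstrated with two toy generators matched on edge count and mean degree yet differing sharply in clustering. The generators and sizes below are our illustrative choices, not the algorithms studied in the paper:

```python
import random

def clustering(adj):
    """Global clustering coefficient: closed triplets / all triplets."""
    closed = triplets = 0
    for v, nbrs in adj.items():
        ns = sorted(nbrs)
        for i in range(len(ns)):
            for j in range(i + 1, len(ns)):
                triplets += 1
                if ns[j] in adj[ns[i]]:
                    closed += 1
    return closed / triplets if triplets else 0.0

def ring_lattice(n, k):
    """Each node linked to its k nearest neighbours on each side."""
    adj = {v: set() for v in range(n)}
    for v in range(n):
        for d in range(1, k + 1):
            adj[v].add((v + d) % n)
            adj[(v + d) % n].add(v)
    return adj

def random_graph(n, m, rng):
    """Uniform random graph with exactly m edges."""
    adj = {v: set() for v in range(n)}
    pairs = [(u, v) for u in range(n) for v in range(u + 1, n)]
    for u, v in rng.sample(pairs, m):
        adj[u].add(v)
        adj[v].add(u)
    return adj

# Both graphs: 100 nodes, 300 edges, mean degree 6 - the same values of
# these simple metrics, but very different connectivity patterns.
lattice = ring_lattice(100, 3)
er = random_graph(100, 300, random.Random(1))
```

The lattice clusters heavily (analytically 0.6 for this construction) while the matched random graph clusters near its edge density, so matching on mean degree alone says little about triangle structure.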
Human performance cognitive-behavioral modeling: a benefit for occupational safety.
Gore, Brian F
2002-01-01
Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.
Design Pattern Mining Using Distributed Learning Automata and DNA Sequence Alignment
Esmaeilpour, Mansour; Naderifar, Vahideh; Shukur, Zarina
2014-01-01
Context Over the last decade, design patterns have been used extensively to generate reusable solutions to frequently encountered problems in software engineering and object-oriented programming. A design pattern is a repeatable software design solution that provides a template for solving various instances of a general problem. Objective This paper describes a new method for pattern mining that isolates design patterns and the relationships between them, together with a related tool, DLA-DNA, covering all implemented patterns and all projects used for evaluation. Based on distributed learning automata (DLA) and deoxyribonucleic acid (DNA) sequence alignment, DLA-DNA achieves better precision and recall than the other evaluated tools. Method The proposed method mines structural design patterns in object-oriented source code and extracts the strong and weak relationships between them, enabling analysts and programmers to determine the dependency rate of each object, component, and other sections of the code for parameter passing and modular programming. The proposed model detects design patterns, and the strengths of their relationships, better than the other available tools Pinot, PTIDEJ, and DPJF. Results The results demonstrate that whether the source code is built in a standard or non-standard way with respect to design patterns, the proposed method performs close to DPJF and better than Pinot and PTIDEJ. The model was tested on several source codes and compared with other related models and available tools; on average, the precision and recall of the proposed method are 20% and 9.6% higher than Pinot, 27% and 31% higher than PTIDEJ, and 3.3% and 2% higher than DPJF, respectively. Conclusion The primary idea of the proposed method is organized in two steps: in the first step, elemental design patterns are identified; in the second, these are composed to recognize actual design patterns. PMID:25243670
Hybrid approach for robust diagnostics of cutting tools
NASA Astrophysics Data System (ADS)
Ramamurthi, K.; Hough, C. L., Jr.
1994-03-01
A new multisensor based hybrid technique has been developed for robust diagnosis of cutting tools. The technique combines the concepts of pattern classification and real-time knowledge based systems (RTKBS) and draws upon their strengths; learning facility in the case of pattern classification and a higher level of reasoning in the case of RTKBS. It eliminates some of their major drawbacks: false alarms or delayed/lack of diagnosis in case of pattern classification and tedious knowledge base generation in case of RTKBS. It utilizes a dynamic distance classifier, developed upon a new separability criterion and a new definition of robust diagnosis for achieving these benefits. The promise of this technique has been proven concretely through an on-line diagnosis of drill wear. Its suitability for practical implementation is substantiated by the use of practical, inexpensive, machine-mounted sensors and low-cost delivery systems.
NASA Astrophysics Data System (ADS)
Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III
2005-11-01
Model-based optical proximity correction (OPC) has become an indispensable tool for achieving wafer pattern to design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models - models that represent non-nominal process states such as would occur with a dose or focus variation - to understand and manipulate the final pattern correction into a more process-robust configuration. The study first examines and validates the process of generating an off-target model, then examines the quality of that model. Once the off-target model is proven, it is used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary indications show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.
Near-field diffraction from amplitude diffraction gratings: theory, simulation and results
NASA Astrophysics Data System (ADS)
Abedin, Kazi Monowar; Rahman, S. M. Mujibur
2017-08-01
We describe a computer simulation method by which the complete near-field diffraction pattern of an amplitude diffraction grating can be generated. The technique uses the method of iterative Fresnel integrals to calculate and generate the diffraction images. The theoretical background as well as the techniques used to perform the simulation are described. The program is written in MATLAB and can be run on any ordinary PC. Examples of simulated diffraction images are presented and discussed. The generated images in the far field, where they reduce to the Fraunhofer diffraction pattern, are also presented for a realistic grating and compared with the results predicted by the grating equation, which is applicable in the far field. The method can be used as a tool to teach the complex phenomenon of diffraction in classrooms.
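The paper's simulation is MATLAB-based and uses iterative Fresnel integrals; as an illustrative stand-in, here is a minimal Python/NumPy sketch of FFT-based Fresnel (paraxial) propagation applied to a binary amplitude grating, with all parameters hypothetical. A convenient sanity check for near-field grating diffraction is Talbot self-imaging: at z_T = 2d²/λ a periodic grating reproduces its own intensity pattern.

```python
import numpy as np

def fresnel_propagate(u0, dx, wavelength, z):
    """FFT-based Fresnel (paraxial) propagation of a 1-D complex field."""
    fx = np.fft.fftfreq(u0.size, d=dx)          # spatial frequencies
    kernel = np.exp(-1j * np.pi * wavelength * z * fx**2)
    return np.fft.ifft(np.fft.fft(u0) * kernel)

# binary amplitude grating, 50% duty cycle (hypothetical parameters)
wavelength = 633e-9                  # metres (He-Ne line)
d = 100e-6                           # grating period
samples_per_period, periods = 64, 32
dx = d / samples_per_period
idx = np.arange(samples_per_period * periods)
grating = ((idx % samples_per_period) < samples_per_period // 2).astype(complex)

# Talbot self-imaging: at z_T = 2 d^2 / wavelength the near-field
# intensity of the periodic grating reproduces itself
z_talbot = 2 * d**2 / wavelength
u = fresnel_propagate(grating, dx, wavelength, z_talbot)
print(np.allclose(np.abs(u)**2, np.abs(grating)**2, atol=1e-9))  # True
```

The transfer-function form used here is equivalent to the Fresnel integral in the paraxial regime; it is a sketch of the physics, not the paper's iterative-integral implementation.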
NeuroPG: open source software for optical pattern generation and data acquisition
Avants, Benjamin W.; Murphy, Daniel B.; Dapello, Joel A.; Robinson, Jacob T.
2015-01-01
Patterned illumination using a digital micromirror device (DMD) is a powerful tool for optogenetics. Compared to a scanning laser, DMDs are inexpensive and can easily create complex illumination patterns. Combining these complex spatiotemporal illumination patterns with optogenetics allows DMD-equipped microscopes to probe neural circuits by selectively manipulating the activity of many individual cells or many subcellular regions at the same time. To use DMDs to study neural activity, scientists must develop specialized software to coordinate optical stimulation patterns with the acquisition of electrophysiological and fluorescence data. To meet this growing need we have developed open source optical pattern generation software for neuroscience—NeuroPG—that combines DMD control, sample visualization, and data acquisition in one application. Built on a MATLAB platform, NeuroPG can also process, analyze, and visualize data. The software is designed specifically for the Mightex Polygon400; however, as an open source package, NeuroPG can be modified to incorporate any data acquisition, imaging, or illumination equipment that is compatible with MATLAB’s Data Acquisition and Image Acquisition toolboxes. PMID:25784873
Flow profiling of a surface-acoustic-wave nanopump.
Guttenberg, Z; Rathgeber, A; Keller, S; Rädler, J O; Wixforth, A; Kostur, M; Schindler, M; Talkner, P
2004-11-01
The flow profile in a capillary gap and the pumping efficiency of an acoustic micropump employing surface acoustic waves is investigated both experimentally and theoretically. Ultrasonic surface waves on a piezoelectric substrate strongly couple to a thin liquid layer and generate a quadrupolar streaming pattern within the fluid. We use fluorescence correlation spectroscopy and fluorescence microscopy as complementary tools to investigate the resulting flow profile. The velocity was found to depend on the applied power approximately linearly and to decrease with the inverse third power of the distance from the ultrasound generator on the chip. The found properties reveal acoustic streaming as a promising tool for the controlled agitation during microarray hybridization.
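The reported scaling, velocity approximately linear in applied power and falling off as the inverse third power of distance from the generator, can be checked on measurement data by fitting a power-law exponent in log-log space. A small sketch using synthetic, idealized measurements (the constant and radii are hypothetical):

```python
import math

def fit_exponent(rs, vs):
    """Least-squares slope of log(v) versus log(r): the fitted
    power-law exponent for v ~ r**n."""
    n = len(rs)
    lx = [math.log(r) for r in rs]
    ly = [math.log(v) for v in vs]
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((x - mx) * (y - my) for x, y in zip(lx, ly))
    den = sum((x - mx) ** 2 for x in lx)
    return num / den

rs = [1.0, 2.0, 3.0, 4.0]            # distances (arbitrary units)
vs = [100.0 / r**3 for r in rs]      # idealized inverse-cube decay
print(round(fit_exponent(rs, vs)))   # -3
```

On real data the fitted exponent would scatter around -3 rather than match it exactly.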
enoLOGOS: a versatile web tool for energy normalized sequence logos
Workman, Christopher T.; Yin, Yutong; Corcoran, David L.; Ideker, Trey; Stormo, Gary D.; Benos, Panayiotis V.
2005-01-01
enoLOGOS is a web-based tool that generates sequence logos from various input sources. Sequence logos have become a popular way to graphically represent DNA and amino acid sequence patterns from a set of aligned sequences. Each position of the alignment is represented by a column of stacked symbols with its total height reflecting the information content in this position. Currently, the available web servers are able to create logo images from a set of aligned sequences, but none of them generates weighted sequence logos directly from energy measurements or other sources. With the advent of high-throughput technologies for estimating the contact energy of different DNA sequences, tools that can create logos directly from binding affinity data are useful to researchers. enoLOGOS generates sequence logos from a variety of input data, including energy measurements, probability matrices, alignment matrices, count matrices and aligned sequences. Furthermore, enoLOGOS can represent the mutual information of different positions of the consensus sequence, a unique feature of this tool. Another web interface for our software, C2H2-enoLOGOS, generates logos for the DNA-binding preferences of the C2H2 zinc-finger transcription factor family members. enoLOGOS and C2H2-enoLOGOS are accessible over the web at . PMID:15980495
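The abstract does not specify how enoLOGOS converts energy measurements into a weighted logo; one plausible sketch, assuming a simple Boltzmann weighting of per-position contact energies (the energy values below are hypothetical, and this is not enoLOGOS's actual algorithm):

```python
import math

def energies_to_probs(energies, beta=1.0):
    """Boltzmann-weight per-position contact energies into base
    probabilities (lower energy -> higher probability). Assumption:
    a simple exp(-beta*E) weighting, not enoLOGOS's internals."""
    weights = {b: math.exp(-beta * e) for b, e in energies.items()}
    z = sum(weights.values())
    return {b: w / z for b, w in weights.items()}

def column_heights(probs):
    """Information content of one column (bits) split among symbols,
    as in a standard DNA sequence logo."""
    entropy = -sum(p * math.log2(p) for p in probs.values() if p > 0)
    info = 2.0 - entropy            # log2(4) for the DNA alphabet
    return {b: p * info for b, p in probs.items()}

# one alignment column described by hypothetical binding energies
probs = energies_to_probs({'A': -2.0, 'C': 0.0, 'G': 0.5, 'T': 0.5})
heights = column_heights(probs)
print(max(heights, key=heights.get))  # the dominant base: A
```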
Silk protein nanowires patterned using electron beam lithography.
Pal, Ramendra K; Yadavalli, Vamsi K
2018-08-17
Nanofabrication approaches to pattern proteins at the nanoscale are useful in applications ranging from organic bioelectronics to cellular engineering. Specifically, functional materials based on natural polymers offer sustainable and environment-friendly substitutes to synthetic polymers. Silk proteins (fibroin and sericin) have emerged as an important class of biomaterials for next-generation applications owing to excellent optical and mechanical properties, inherent biocompatibility, and biodegradability. However, the ability to precisely control their spatial positioning at the nanoscale via high-throughput tools continues to remain a challenge. In this study, electron beam lithography (EBL) is used to achieve nanoscale patterning using methacrylate-conjugated silk proteins that act as photoreactive 'photoresist' materials. Very low energy electron beam radiation can be used to pattern silk proteins at the nanoscale and over large areas, whereby such nanostructure fabrication can be performed without specialized EBL tools. Significantly, using conducting polymers in conjunction with these silk proteins, the formation of protein nanowires down to 100 nm is shown. These wires can be easily degraded by enzymatic degradation. Thus, proteins can be precisely and scalably patterned and doped with conducting polymers and enzymes to form degradable, organic bioelectronic devices.
Evaluation of 3D metrology potential using a multiple detector CDSEM
NASA Astrophysics Data System (ADS)
Hakii, Hidemitsu; Yonekura, Isao; Nishiyama, Yasushi; Tanaka, Keishi; Komoto, Kenji; Murakawa, Tsutomu; Hiroyama, Mitsuo; Shida, Soichi; Kuribara, Masayuki; Iwai, Toshimichi; Matsumoto, Jun; Nakamura, Takayuki
2012-06-01
As feature sizes of semiconductor device structures have continuously decreased, the need for metrology tools with high precision and excellent linearity over actual pattern sizes has been growing, and it has become important to measure not only two-dimensional (2D) but also three-dimensional (3D) shapes of patterns at the 22 nm node and beyond. To meet the requirements for 3D metrology capabilities, various pattern metrology tools have been developed. Among those, we consider CDSEM metrology the most qualified candidate in light of its non-destructive, high-throughput measurement capabilities, which are expected to be extended to the much-awaited 3D metrology technology. On the basis of this supposition, we have developed a 3D metrology system in which side wall angles and heights of photomask patterns can be measured with high accuracy by analyzing CDSEM images generated by multi-channel detectors. In this paper, we discuss our attempts to measure 3D shapes of defect patterns on a photomask by using Advantest's "Multi Vision Metrology SEM" E3630 (MVM-SEM E3630).
Kanbar, Lara J; Shalish, Wissam; Precup, Doina; Brown, Karen; Sant'Anna, Guilherme M; Kearney, Robert E
2017-07-01
In multi-disciplinary studies, different forms of data are often collected for analysis. For example, APEX, a study on the automated prediction of extubation readiness in extremely preterm infants, collects clinical parameters and cardiorespiratory signals. A variety of cardiorespiratory metrics are computed from these signals and used to assign a cardiorespiratory pattern at each time. In such a situation, exploratory analysis requires a visualization tool capable of displaying these different types of acquired and computed signals in an integrated environment. Thus, we developed APEX_SCOPE, a graphical tool for the visualization of multi-modal data comprising cardiorespiratory signals, automated cardiorespiratory metrics, automated respiratory patterns, manually classified respiratory patterns, and manual annotations by clinicians during data acquisition. This MATLAB-based application provides a means for collaborators to view combinations of signals to promote discussion, generate hypotheses and develop features.
2015-10-01
overview visualization to help clinicians identify patients that are changing and inserted these indices into the sepsis-specific decision support... visualization, 4) created a sepsis identification visualization tool to help clinicians identify patients headed for septic shock, and 5) generated a... Sepsis Visualization
Using Genograms Creatively to Promote Healthy Lifestyles
ERIC Educational Resources Information Center
Casado-Kehoe, Montserrat; Kehoe, Michael P.
2007-01-01
Family therapists have used genograms as an assessment tool for years to examine the interactions and relationships of family members across generations. This article discusses how a therapist can use a genogram creatively to help clients examine the impact of family relationships on healthy and unhealthy lifestyle patterns and how those…
Nicolas, F; Coëtmellec, S; Brunel, M; Allano, D; Lebrun, D; Janssen, A J E M
2005-11-01
The authors have studied the diffraction pattern produced by a particle field illuminated by an elliptic and astigmatic Gaussian beam. They demonstrate that the bidimensional fractional Fourier transformation is a mathematically suitable tool to analyse the diffraction pattern generated not only by a collimated plane wave [J. Opt. Soc. Am A 19, 1537 (2002)], but also by an elliptic and astigmatic Gaussian beam when two different fractional orders are considered. Simulations and experimental results are presented.
Self-organization of multifunctional surfaces--the fingerprints of light on a complex system.
Reinhardt, Hendrik; Kim, Hee-Cheol; Pietzonka, Clemens; Kruempelmann, Julia; Harbrecht, Bernd; Roling, Bernhard; Hampp, Norbert
2013-06-25
Nanocomposite patterns and nanotemplates are generated by a single-step bottom-up concept that introduces laser-induced periodic surface structures (LIPSS) as a tool for site-specific reaction control in multicomponent systems. Periodic intensity fluctuations of this photothermal stimulus induce spatially selective reorganizations, dewetting scenarios, and phase segregations, thus creating regular patterns with anisotropic physicochemical character featuring attractive optical, electrical, magnetic, and catalytic properties. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Hirano, Ryoichi; Iida, Susumu; Amano, Tsuyoshi; Watanabe, Hidehiro; Hatakeyama, Masahiro; Murakami, Takeshi; Suematsu, Kenichi; Terao, Kenji
2016-03-01
Novel projection electron microscope optics have been developed and integrated into a new inspection system named EBEYE-V30 ("Model EBEYE" is an EBARA model code), and the resulting system shows promise for application to half-pitch (hp) 16-nm node extreme ultraviolet lithography (EUVL) patterned mask inspection. To improve the system's inspection throughput for hp 11-nm generation defect detection, a new electron-sensitive area image sensor with a high-speed data processing unit, a bright and stable electron source, and an image capture area deflector that operates simultaneously with the mask scanning motion have been developed. A learning system has been used with the mask inspection tool to meet the requirements of hp 11-nm node EUV patterned mask inspection. Defects are identified by the projection electron microscope system using their "defectivity", derived from the characteristics of the acquired image. The learning system has been developed to reduce the labor and costs associated with adjusting the detection capability to cope with newly defined mask defects. We describe the integration of the developed elements into the inspection tool and the verification of the designed specification. We have also verified the effectiveness of the learning system, which shows enhanced detection capability for the hp 11-nm node.
Inducing any virtual two-dimensional movement in humans by applying muscle tendon vibration.
Roll, Jean-Pierre; Albert, Frédéric; Thyrion, Chloé; Ribot-Ciscar, Edith; Bergenheim, Mikael; Mattei, Benjamin
2009-02-01
In humans, tendon vibration evokes illusory sensation of movement. We developed a model mimicking the muscle afferent patterns corresponding to any two-dimensional movement and checked its validity by inducing writing illusory movements through specific sets of muscle vibrators. Three kinds of illusory movements were compared. The first was induced by vibration patterns copying the responses of muscle spindle afferents previously recorded by microneurography during imposed ankle movements. The two others were generated by the model. Sixteen different vibratory patterns were applied to 20 motionless volunteers in the absence of vision. After each vibration sequence, the participants were asked to name the corresponding graphic symbol and then to reproduce the illusory movement perceived. Results showed that the afferent patterns generated by the model were very similar to those recorded microneurographically during actual ankle movements (r=0.82). The model was also very efficient for generating afferent response patterns at the wrist level, if the preferred sensory directions of the wrist muscle groups were first specified. Using recorded and modeled proprioceptive patterns to pilot sets of vibrators placed at the ankle or wrist levels evoked similar illusory movements, which were correctly identified by the participants in three quarters of the trials. Our proprioceptive model, based on neurosensory data recorded in behaving humans, should then be a useful tool in fields of research such as sensorimotor learning, rehabilitation, and virtual reality.
Richards, Mark J; Daniel, Susan
2017-02-07
The supported lipid bilayer has been portrayed as a useful model of the cell membrane compatible with many biophysical tools and techniques, which demonstrates its appeal in learning about the basic features of the plasma membrane. However, some of its potential has yet to be realized, particularly in the area of bilayer patterning and phase/composition heterogeneity. In this work, we generate contiguous bilayer patterns as a model system that captures the general features of membrane domains and lipid rafts. Micropatterned polymer templates of two types are investigated for generating patterned bilayer formation: polymer blotting and polymer lift-off stenciling. While these approaches have been used previously to create bilayer arrays by corralling bilayer patches with various types of boundaries impenetrable to bilayer diffusion, unique to the methods presented here, there are no physical barriers to diffusion. In this work, interfaces between contiguous lipid phases define the pattern shapes, with continuity between them allowing transfer of membrane-bound biomolecules between the phases. We examine effectors of membrane domain stability, including temperature and cholesterol content, to investigate domain dynamics. Contiguous patterning of supported bilayers as a model of lipid rafts expands the application of the SLB to an area with current appeal and brings with it a useful toolset for characterization and analysis. These combined tools should be helpful to researchers investigating lipid raft dynamics and function and performing biomolecule partitioning studies. Additionally, this patterning technique may be useful for applications such as bioseparations that exploit differences in lipid phase partitioning, or the creation of membranes that bind species like viruses preferentially at lipid phase boundaries, to name a few.
Duchcherer, Maryana; Kottick, Andrew; Wilson, R J A
2010-01-01
Central pattern generators located in the brainstem regulate ventilatory behaviors in vertebrates. The development of the isolated brainstem preparation has allowed these neural networks to be characterized in a number of aquatic species. The aim of this study was to explore the architecture of the respiratory rhythm-generating site in the goldfish (Carassius auratus) and to determine the utility of a newly developed isolated brainstem preparation, the Sheep Dip. Here we provide evidence for a distributed organization of respiratory rhythm generating neurons along the rostrocaudal axis of the goldfish brainstem and outline the advantages of the Sheep Dip as a tool used to survey neural networks.
Tenhaven, Christoph; Tipold, Andrea; Fischer, Martin R.; Ehlers, Jan P.
2013-01-01
Introduction: Informal and formal lifelong learning is essential at university and in the workplace. Apart from classical learning techniques, Web 2.0 tools can be used. It is controversial whether there is a so-called net generation amongst people under 30. Aims: To test the hypothesis that a net generation among students and young veterinarians exists. Methods: An online survey of students and veterinarians was conducted in the German-speaking countries which was advertised via online media and traditional print media. Results: 1780 people took part in the survey. Students and veterinarians have different usage patterns regarding social networks (91.9% vs. 69%) and IM (55.9% vs. 24.5%). All tools were predominantly used passively and in private, to a lesser extent also professionally and for studying. Outlook: The use of Web 2.0 tools is useful, however, teaching information and media skills, preparing codes of conduct for the internet and verification of user generated content is essential. PMID:23467682
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel
2013-07-01
Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri® ArcGIS® scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shapefiles defining contaminated areas and the extent of contamination. Building stock information, including square footage, building counts, and building composition estimates, is then generated using the Federal Emergency Management Agency's (FEMA's) Hazus®-MH software. WEST then identifies outdoor surfaces by applying pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel® 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)
Protein recognition by a pattern-generating fluorescent molecular probe.
Pode, Zohar; Peri-Naor, Ronny; Georgeson, Joseph M; Ilani, Tal; Kiss, Vladimir; Unger, Tamar; Markus, Barak; Barr, Haim M; Motiei, Leila; Margulies, David
2017-12-01
Fluorescent molecular probes have become valuable tools in protein research; however, the current methods for using these probes are less suitable for analysing specific populations of proteins in their native environment. In this study, we address this gap by developing a unimolecular fluorescent probe that combines the properties of small-molecule-based probes and cross-reactive sensor arrays (the so-called chemical 'noses/tongues'). On the one hand, the probe can detect different proteins by generating unique identification (ID) patterns, akin to cross-reactive arrays. On the other hand, its unimolecular scaffold and selective binding enable this ID-generating probe to identify combinations of specific protein families within complex mixtures and to discriminate among isoforms in living cells, where macroscopic arrays cannot access. The ability to recycle the molecular device and use it to track several binding interactions simultaneously further demonstrates how this approach could expand the fluorescent toolbox currently used to detect and image proteins.
Developing user-friendly habitat suitability tools from regional stream fish survey data
Zorn, T.G.; Seelbach, P.; Wiley, M.J.
2011-01-01
We developed user-friendly fish habitat suitability tools (plots) for fishery managers in Michigan; these tools are based on driving habitat variables and fish population estimates for several hundred stream sites throughout the state. We generated contour plots to show patterns in fish biomass for over 60 common species (and for 120 species grouped at the family level) in relation to axes of catchment area and low-flow yield (90% exceedance flow divided by catchment area) and also in relation to axes of mean and weekly range of July temperatures. The plots showed distinct patterns in fish habitat suitability at each level of biological organization studied and were useful for quantitatively comparing river sites. We demonstrate how these plots can be used to support stream management, and we provide examples pertaining to resource assessment, trout stocking, angling regulations, chemical reclamation of marginal trout streams, indicator species, instream flow protection, and habitat restoration. These straightforward and effective tools are electronically available so that managers can easily access and incorporate them into decision protocols and presentations.
CodonLogo: a sequence logo-based viewer for codon patterns.
Sharma, Virag; Murphy, David P; Provan, Gregory; Baranov, Pavel V
2012-07-15
Conserved patterns across a multiple sequence alignment can be visualized by generating sequence logos. Sequence logos show each column in the alignment as stacks of symbol(s) where the height of a stack is proportional to its informational content, whereas the height of each symbol within the stack is proportional to its frequency in the column. Sequence logos use symbols of either nucleotide or amino acid alphabets. However, certain regulatory signals in messenger RNA (mRNA) act as combinations of codons. Yet no tool is available for visualization of conserved codon patterns. We present the first application which allows visualization of conserved regions in a multiple sequence alignment in the context of codons. CodonLogo is based on WebLogo3 and uses the same heuristics but treats codons as inseparable units of a 64-letter alphabet. CodonLogo can discriminate patterns of codon conservation from patterns of nucleotide conservation that appear indistinguishable in standard sequence logos. The CodonLogo source code and its implementation (in a local version of the Galaxy Browser) are available at http://recode.ucc.ie/CodonLogo and through the Galaxy Tool Shed at http://toolshed.g2.bx.psu.edu/.
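CodonLogo's key idea, treating codons as inseparable units of a 64-letter alphabet, can be sketched directly. A minimal illustration (not CodonLogo's actual code) computing per-codon-column logo heights, with information content capped at log2(64) = 6 bits:

```python
import math
from collections import Counter

def codon_logo_heights(seqs):
    """Treat each aligned coding sequence as a string of inseparable
    codons (a 64-letter alphabet) and return, per codon column, a dict
    of codon -> stack height, as in a sequence logo."""
    length = len(seqs[0]) // 3
    columns = []
    for i in range(length):
        counts = Counter(s[3 * i:3 * i + 3] for s in seqs)
        total = sum(counts.values())
        probs = {c: n / total for c, n in counts.items()}
        entropy = -sum(p * math.log2(p) for p in probs.values())
        info = math.log2(64) - entropy   # max 6 bits per codon column
        columns.append({c: p * info for c, p in probs.items()})
    return columns

# toy alignment of three 2-codon sequences
cols = codon_logo_heights(["ATGAAA", "ATGAAG", "ATGAAA"])
print(round(sum(cols[0].values()), 2))   # fully conserved ATG column: 6.0
```

A nucleotide logo over the same alignment would cap each column at 2 bits, which is why codon conservation patterns can look indistinguishable in standard logos.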
Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T
2015-01-01
Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool that operates on a text-mining algorithm to extract specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization will act as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D.
Can we use virtual reality tools in the planning of an experiment?
NASA Astrophysics Data System (ADS)
Kucaba-Pietal, Anna; Szumski, Marek; Szczerba, Piotr
2015-03-01
Virtual reality (VR) has proved to be a particularly useful tool in engineering and design. A related area of aviation in which VR is particularly significant is flight training, as it requires many hours of practice, and using real planes for all training is both expensive and dangerous. Research conducted at the Rzeszow University of Technology (RUT) showed that virtual reality can be successfully used for planning experiments during flight tests. The motivation for the study was the measurement of wing deformation of a PW-6 glider in flight using the Image Pattern Correlation Technique (IPCT), planned within the framework of the AIM2 project. The tool VirlIPCT was constructed, which permits a virtual IPCT setup to be performed on an airplane. Using it, we can test camera position, camera resolution, and pattern application. Moreover, tests performed at RUT indicate that VirlIPCT can be used as a virtual IPCT image generator. This paper presents the results of this research on VirlIPCT.
2013-09-01
to an XML file, code that Bonine [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a...
C. Bonine, M. Shing, T.W. Otani, "Computer-aided process and tools for mobile software acquisition," NPS, Monterey, CA, Tech. Rep. NPS-SE-13...C10P07R05-075, 2013.
[21] C. Bonine, "Specification, validation and verification of mobile application behavior," M.S. thesis, Dept. Comp. Science, NPS
Validation of the ROMI-RIP rough mill simulator
Edward R. Thomas; Urs Buehlmann
2002-01-01
The USDA Forest Service's ROMI-RIP rough mill rip-first simulation program is a popular tool for analyzing rough mill conditions, determining more efficient rough mill practices, and finding optimal lumber board cut-up patterns. However, until now, the results generated by ROMI-RIP have not been rigorously compared to those of an actual rough mill. Validating the...
USDA-ARS?s Scientific Manuscript database
Ultra-High Performance Liquid Chromatography-Quadrupole Time of Flight Mass Spectrometry (UHPLC-QToF-MS) profiling has become an important tool for the identification of marker compounds and the generation of metabolic patterns that can be interrogated using chemometric modeling software. Chemometric approaches can be used to ana...
USDA-ARS?s Scientific Manuscript database
We integrated classic and Bayesian phylogeographic tools with a paleodistribution modeling approach to study the historical demographic processes that shaped the distribution of the invasive ant Wasmannia auropunctata in its native South America. We generated mitochondrial Cytochrome Oxidase I seque...
Toward robust phase-locking in Melibe swim central pattern generator models
NASA Astrophysics Data System (ADS)
Jalil, Sajiya; Allen, Dane; Youker, Joseph; Shilnikov, Andrey
2013-12-01
Small groups of interneurons, known as central pattern generators (CPGs), are arranged into neural networks to generate a variety of core bursting rhythms with specific phase-locked states, on distinct time scales, which govern vital motor behaviors in invertebrates such as chewing and swimming. These movements in lower-level animals mimic motions of organs in higher animals due to evolutionarily conserved mechanisms. Hence, various neurological diseases can be linked to abnormal movement of body parts that are regulated by a malfunctioning CPG. In this paper, inspired by recent experimental studies of neuronal activity patterns recorded from a swimming-motion CPG of the sea slug Melibe leonina, we examine a mathematical model of a 4-cell network that can plausibly and stably underlie the observed bursting rhythm. We develop a dynamical systems framework for explaining the existence and robustness of phase-locked states in activity patterns produced by the modeled CPGs. The proposed tools can be used for identifying core components of other CPG networks with reliable bursting outcomes and specific phase relationships between the interneurons. Our findings can be employed for identifying or implementing the conditions for normal and pathological functioning of basic CPGs of animals and of artificially intelligent prosthetics that regulate various movements.
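The notion of a phase-locked state can be illustrated with the simplest possible model, two Kuramoto-coupled phase oscillators; this is only a pedagogical sketch, far simpler than the conductance-based 4-cell Melibe model studied in the paper, and the parameter values are illustrative:

```python
import math

def kuramoto_pair(w1, w2, k, dt=0.001, steps=20000):
    """Integrate two coupled phase oscillators with the Euler method:
    theta1' = w1 + k*sin(theta2 - theta1), theta2' = w2 - k*sin(theta2 - theta1).
    Returns the final phase difference, wrapped to (-pi, pi]."""
    th1, th2 = 0.0, 1.0
    for _ in range(steps):
        d = th2 - th1
        th1 += (w1 + k * math.sin(d)) * dt
        th2 += (w2 - k * math.sin(d)) * dt
    diff = (th2 - th1) % (2 * math.pi)
    return diff - 2 * math.pi if diff > math.pi else diff

# With |w2 - w1| < 2k the pair phase-locks at asin((w2 - w1) / (2k));
# identical frequencies lock at zero phase difference.
locked = kuramoto_pair(1.0, 1.2, 0.5)
```

For a detuning of 0.2 and coupling 0.5, the locked phase difference settles near asin(0.2) regardless of the initial phases, which is the robustness property the dynamical-systems framework above formalizes for bursting CPG rhythms.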
Computer assisted blast design and assessment tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.
1995-12-31
In general the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.
Classification of time series patterns from complex dynamic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, J.C.; Rao, N.
1998-07-01
An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.
Structural Pattern Recognition Techniques for Data Retrieval in Massive Fusion Databases
NASA Astrophysics Data System (ADS)
Vega, J.; Murari, A.; Rattá, G. A.; Castro, P.; Pereira, A.; Portas, A.
2008-03-01
Diagnostics of present day reactor class fusion experiments, like the Joint European Torus (JET), generate thousands of signals (time series and video images) in each discharge. There is a direct correspondence between the physical phenomena taking place in the plasma and the set of structural shapes (patterns) that they form in the signals: bumps, unexpected amplitude changes, abrupt peaks, periodic components, high intensity zones or specific edge contours. A major difficulty related to data analysis is the identification, in a rapid and automated way, of a set of discharges with comparable behavior, i.e. discharges with "similar" patterns. Pattern recognition techniques are efficient tools to search for similar structural forms within the database in a fast and intelligent way. To this end, classification systems must be developed to be used as indexation methods to directly fetch the most similar patterns.
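The retrieval idea described above, fetching the structural form most similar to a query, can be illustrated with a toy sliding-window matcher; this is only a hypothetical sketch, far simpler than the indexation methods the paper develops for JET-scale databases:

```python
def best_match(signal, pattern):
    """Return the offset where `pattern` best matches `signal`,
    scored by the sum of squared differences over a sliding window.
    A toy stand-in for structural-pattern retrieval in a signal."""
    m = len(pattern)
    best_i, best_d = 0, float("inf")
    for i in range(len(signal) - m + 1):
        d = sum((a - b) ** 2 for a, b in zip(signal[i:i + m], pattern))
        if d < best_d:
            best_i, best_d = i, d
    return best_i

# Example: locate a bump-shaped pattern inside a longer trace.
trace = [0.0, 0.1, 1.0, 3.0, 1.0, 0.0, 0.2, 0.0]
bump = [1.0, 3.0, 1.0]
```

A real indexation system precomputes compact descriptors of such shapes so that the scan above becomes a database lookup rather than a brute-force sweep.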
Skolimowski, Maciej; Nielsen, Martin Weiss; Emnéus, Jenny; Molin, Søren; Taboryski, Rafael; Sternberg, Claus; Dufva, Martin; Geschke, Oliver
2010-08-21
A microfluidic chip for generation of gradients of dissolved oxygen was designed, fabricated and tested. The novel way of active oxygen depletion through a gas permeable membrane was applied. Numerical simulations for generation of O(2) gradients were correlated with measured oxygen concentrations. The developed microsystem was used to study growth patterns of the bacterium Pseudomonas aeruginosa in medium with different oxygen concentrations. The results showed that attachment of Pseudomonas aeruginosa to the substrate changed with oxygen concentration. This demonstrates that the device can be used for studies requiring controlled oxygen levels and for future studies of microaerobic and anaerobic conditions.
NASA Astrophysics Data System (ADS)
Lutich, Andrey
2017-07-01
This research considers the problem of generating compact vector representations of physical design patterns for analytics purposes in the semiconductor patterning domain. PatterNet uses a deep artificial neural network to learn a mapping of physical design patterns to a compact Euclidean hyperspace. Distances among mapped patterns in this space correspond to dissimilarities among patterns defined at the time of network training. Once the mapping network has been trained, PatterNet embeddings can be used as feature vectors with standard machine learning algorithms, and pattern search, comparison, and clustering become trivial problems. PatterNet is inspired by concepts developed within the framework of generative adversarial networks as well as FaceNet. Our method enables a deep neural network (DNN) to learn the compact representation directly by supplying it with pairs of design patterns and the dissimilarity among these patterns defined by a user. In the simplest case, the dissimilarity is represented by the area of the XOR of two patterns. It is important to realize that the PatterNet approach is very different from methods developed for deep learning on image data. In contrast to "conventional" pictures, patterns in the CAD world are lists of polygon vertex coordinates. The method relies solely on the promise of deep learning to discover the internal structure of the incoming data and learn its hierarchical representations. Artificial intelligence arising from the combination of PatterNet and clustering analysis very precisely follows the intuition of patterning/optical proximity correction experts, paving the way toward human-like and human-friendly engineering tools.
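In the simplest case above, the user-defined dissimilarity is the area of the XOR of two patterns. On rasterized toy grids that reduces to counting mismatched cells; this is only an illustrative sketch, since the real tool works on polygon vertex lists rather than rasters:

```python
def xor_area(a, b):
    """XOR-area dissimilarity of two equal-sized binary grids:
    the number of cells where exactly one pattern has geometry."""
    assert len(a) == len(b) and all(len(ra) == len(rb) for ra, rb in zip(a, b))
    return sum(pa != pb for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

# Two 4x4 patterns differing in exactly two cells.
p1 = [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
p2 = [[1, 1, 0, 0], [1, 1, 0, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
```

During training, pairs of patterns together with such a dissimilarity score are what the embedding network is asked to reproduce as Euclidean distances.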
NASA Astrophysics Data System (ADS)
Mamezaki, Daiki; Harada, Tetsuo; Nagata, Yutaka; Watanabe, Takeo
2017-07-01
In extreme ultraviolet (EUV) lithography, development of review tools for EUV mask patterns and phase defects at the working wavelength of 13.5 nm is required. The EUV mask is composed of an absorber pattern (50 - 70 nm thick) and a Mo/Si multilayer (280 nm thick) on a glass substrate, so the mask pattern is a three-dimensional (3D) structure. This 3D structure modulates the EUV reflection phase, which causes focus and pattern shifts; thus, EUV phase imaging is important for evaluating this phase modulation. We have developed a coherent EUV scatterometry microscope (CSM), a simple microscope without objective optics. EUV phase and intensity images are reconstructed from diffraction images by ptychography with coherent EUV illumination. A high-harmonic-generation (HHG) EUV source was employed for the standalone CSM system. In this study, we updated the HHG system with pump-laser reduction and gas-pressure control. Two types of EUV mask absorber patterns were observed: an 88-nm lines-and-spaces pattern and a cross-line pattern were clearly reconstructed by ptychography. In addition, a natural defect with 2-μm diameter on the cross-line was well reconstructed. This demonstrates the high capability of the standalone CSM, a system that can be used in factories such as mask shops and semiconductor fabrication plants.
ERIC Educational Resources Information Center
YoussefAgha, Ahmed H.; Lohrmann, David K.; Jayawardene, Wasantha P.
2013-01-01
Background: Health eTools for Schools was developed to assist school nurses with routine entries, including height and weight, on student health records, thus providing a readily accessible data base. Data-mining techniques were applied to this database to determine if clinically significant results could be generated. Methods: Body mass index…
CRISPR/Cas9-loxP-Mediated Gene Editing as a Novel Site-Specific Genetic Manipulation Tool.
Yang, Fayu; Liu, Changbao; Chen, Ding; Tu, Mengjun; Xie, Haihua; Sun, Huihui; Ge, Xianglian; Tang, Lianchao; Li, Jin; Zheng, Jiayong; Song, Zongming; Qu, Jia; Gu, Feng
2017-06-16
Cre-loxP, as one of the site-specific genetic manipulation tools, offers a method to study the spatial and temporal regulation of gene expression/inactivation in order to decipher gene function. CRISPR/Cas9-mediated targeted genome engineering technologies are sparking a new revolution in biological research. Whether the traditional site-specific genetic manipulation tool and CRISPR/Cas9 could be combined to create a novel genetic tool for highly specific gene editing was not clear. Here, we successfully generated a CRISPR/Cas9-loxP system to perform gene editing in human cells, providing the proof of principle that these two technologies can be used together for the first time. We also showed distinct non-homologous end-joining (NHEJ) patterns from CRISPR/Cas9-mediated gene editing when the targeted sequence is located on plasmids (episomal) versus chromosomes. Specifically, the CRISPR/Cas9-mediated NHEJ pattern in the nuclear genome favors deletions (64%-68% at the human AAVS1 locus versus 4%-28% in plasmid DNA). CRISPR/Cas9-loxP, a novel site-specific genetic manipulation tool, offers a platform for the dissection of gene function and molecular insights into DNA-repair pathways. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Massive ordering and alignment of cylindrical micro-objects by photovoltaic optoelectronic tweezers.
Elvira, Iris; Muñoz-Martínez, Juan F; Barroso, Álvaro; Denz, Cornelia; Ramiro, José B; García-Cabañes, Angel; Agulló-López, Fernando; Carrascosa, Mercedes
2018-01-01
Optical tools for manipulation and trapping of micro- and nano-objects are a fundamental issue for many applications in nano- and biotechnology. This work reports on the use of one such method, known as photovoltaic optoelectronic tweezers, to orientate and organize cylindrical microcrystals, specifically elongated zeolite L, on the surface of Fe-doped LiNbO3 crystal plates. Patterns of aligned zeolites have been achieved through the forces and torques generated by the bulk photovoltaic effect. The alignment patterns with zeolites parallel or perpendicular to the substrate surface are highly dependent on the features of the light distribution and crystal configuration. Moreover, dielectrophoretic chains of zeolites with lengths up to 100 μm have often been observed. The experimental results of zeolite trapping and alignment have been discussed and compared with theoretical simulations of the evanescent photovoltaic electric field and the dielectrophoretic potential. They demonstrate the remarkable capabilities of the optoelectronic photovoltaic method to orientate and pattern anisotropic microcrystals. The combined action of patterning and alignment offers a unique tool to prepare functional nanostructures with potential applications in a variety of fields such as nonlinear optics or plasmonics.
Pattern Recognition for a Flight Dynamics Monte Carlo Simulation
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; Hurtado, John E.
2011-01-01
The design, analysis, and verification and validation of a spacecraft relies heavily on Monte Carlo simulations. Modern computational techniques are able to generate large amounts of Monte Carlo data but flight dynamics engineers lack the time and resources to analyze it all. The growing amounts of data combined with the diminished available time of engineers motivates the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their analysis efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters, and most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
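The k-nearest-neighbor step of such a tool can be sketched in a few lines; this is a minimal illustration only, whereas the actual NASA tool couples the classifier with kernel density estimation and sequential feature selection:

```python
import math

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    points. `train` is a list of (feature_vector, label) pairs;
    on a tie the label of the nearest neighbor wins."""
    neighbors = sorted((math.dist(x, query), label) for x, label in train)[:k]
    votes = {}
    for _, label in neighbors:
        votes[label] = votes.get(label, 0) + 1
    # max() keeps the first-inserted label on ties, i.e. the nearest one.
    return max(votes, key=votes.get)

# Example: two clusters of Monte Carlo cases labeled by outcome.
cases = [((0.0, 0.0), "pass"), ((0.0, 1.0), "pass"), ((1.0, 0.0), "pass"),
         ((5.0, 5.0), "fail"), ((5.0, 6.0), "fail"), ((6.0, 5.0), "fail")]
```

Ranking design parameters then amounts to asking which feature subsets let this classifier best separate the "pass" and "fail" regions of the Monte Carlo data.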
EUV mask pilot line at Intel Corporation
NASA Astrophysics Data System (ADS)
Stivers, Alan R.; Yan, Pei-Yang; Zhang, Guojing; Liang, Ted; Shu, Emily Y.; Tejnil, Edita; Lieberman, Barry; Nagpal, Rajesh; Hsia, Kangmin; Penn, Michael; Lo, Fu-Chang
2004-12-01
The introduction of extreme ultraviolet (EUV) lithography into high volume manufacturing requires the development of a new mask technology. In support of this, Intel Corporation has established a pilot line devoted to encountering and eliminating barriers to manufacturability of EUV masks. It concentrates on EUV-specific process modules and makes use of the captive standard photomask fabrication capability of Intel Corporation. The goal of the pilot line is to accelerate EUV mask development to intersect the 32nm technology node. This requires EUV mask technology to be comparable to standard photomask technology by the beginning of the silicon wafer process development phase for that technology node. The pilot line embodies Intel's strategy to lead EUV mask development in the areas of the mask patterning process, mask fabrication tools, the starting material (blanks) and the understanding of process interdependencies. The patterning process includes all steps from blank defect inspection through final pattern inspection and repair. We have specified and ordered the EUV-specific tools and most will be installed in 2004. We have worked with International Sematech and others to provide for the next generation of EUV-specific mask tools. Our process of record is run repeatedly to ensure its robustness. This primes the supply chain and collects information needed for blank improvement.
EXTATIC: ASML's α-tool development for EUVL
NASA Astrophysics Data System (ADS)
Meiling, Hans; Benschop, Jos P.; Hartman, Robert A.; Kuerz, Peter; Hoghoj, Peter; Geyl, Roland; Harned, Noreen
2002-07-01
Within the recently initiated EXTATIC project a complete full-field lithography exposure tool for the 50-nm technology node is being developed. The goal is to demonstrate the feasibility of extreme UV lithography (EUVL) for 50-nm imaging and to reduce technological risks in the development of EUVL production tools. We describe the EUV MEDEA+ framework in which EXTATIC is executed, and give an update on the status of the α-tool development. A brief summary of our in-house source-collector module development is given, and the general vacuum architecture of the α-tool is discussed. We discuss defect-free reticle handling, and we investigated the use of V-grooved brackets glued to the side of the reticle to reduce particle generation during takeovers. These takeovers do not only occur in the exposure tool, but also in multilayer deposition equipment, e-beam pattern writers, inspection tools, etc., where similar requirements on particle contamination are present. Finally, we present an update of mirror fabrication technology and show improved mirror figuring and finishing results.
Bagarinao, Epifanio; Yoshida, Akihiro; Ueno, Mika; Terabe, Kazunori; Kato, Shohei; Isoda, Haruo; Nakai, Toshiharu
2018-01-01
Motor imagery (MI), a covert cognitive process where an action is mentally simulated but not actually performed, could be used as an effective neurorehabilitation tool for motor function improvement or recovery. Recent approaches employing brain-computer/brain-machine interfaces to provide online feedback of the MI during rehabilitation training have promising rehabilitation outcomes. In this study, we examined whether participants could volitionally recall MI-related brain activation patterns when guided using neurofeedback (NF) during training. The participants' performance was compared to that without NF. We hypothesized that participants would be able to consistently generate the relevant activation pattern associated with the MI task during training with NF compared to that without NF. To assess activation consistency, we used the performance of classifiers trained to discriminate MI-related brain activation patterns. Our results showed significantly higher predictive values of MI-related activation patterns during training with NF. Additionally, this improvement in the classification performance tends to be associated with the activation of middle temporal gyrus/inferior occipital gyrus, a region associated with visual motion processing, suggesting the importance of performance monitoring during MI task training. Taken together, these findings suggest that the efficacy of MI training, in terms of generating consistent brain activation patterns relevant to the task, can be enhanced by using NF as a mechanism to enable participants to volitionally recall task-related brain activation patterns.
Image projection optical system for measuring pattern electroretinograms
NASA Astrophysics Data System (ADS)
Starkey, Douglas E.; Taboada, John; Peters, Daniel
1994-06-01
The use of the pattern-electroretinogram (PERG) as a noninvasive diagnostic tool for the early detection of glaucoma has been supported by a number of recent studies. We have developed a unique device which uses a laser interferometer to generate a sinusoidal fringe pattern that is presented to the eye in Maxwellian view for the purpose of producing a PERG response. The projection system stimulates a large visual field and is designed to bypass the optics of the eye in order to measure the true retinal response to a temporally alternating fringe pattern. The contrast, spatial frequency, total power output, orientation, alternating temporal frequency, and field location of the fringe pattern presented to the eye can all be varied by the device. It is critical for these parameters to be variable so that optimal settings may be determined for the normal state and any deviation from it, i.e. early or preclinical glaucoma. Several interferometer designs and optical projection systems were studied in order to design a compact system which provided the desired variable pattern stimulus to the eye. This paper will present a description of the clinical research instrument and its performance with the primary emphasis on the optical system design as it relates to the fringe pattern generation and other optical parameters. Examples of its use in the study of glaucoma diagnosis will also be presented.
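The stimulus described above is a sinusoidal fringe whose contrast, spatial frequency, and phase are all adjustable. A one-dimensional intensity profile of such a fringe can be sketched as follows; this is illustrative only, since the instrument generates the fringe interferometrically rather than computationally:

```python
import math

def fringe_profile(n, cycles, contrast, phase=0.0):
    """Sampled intensity of a sinusoidal fringe pattern:
    I(x) = 1 + contrast * cos(2*pi*cycles*x/n + phase), so the
    Michelson contrast (Imax - Imin) / (Imax + Imin) equals `contrast`."""
    return [1.0 + contrast * math.cos(2.0 * math.pi * cycles * i / n + phase)
            for i in range(n)]

# An 80%-contrast fringe with 5 cycles across 100 samples.
profile = fringe_profile(100, 5, 0.8)
```

Temporally alternating the pattern corresponds to stepping `phase` by pi between presentations, which is the counterphase modulation used to evoke the PERG response.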
Pattern search in multi-structure data: a framework for the next-generation evidence-based medicine
NASA Astrophysics Data System (ADS)
Sukumar, Sreenivas R.; Ainsworth, Keela C.
2014-03-01
With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. Addressing this need, we pose and answer the following questions: (i) How can we jointly analyze and explore measurement data in context with qualitative domain knowledge? (ii) How can we search and hypothesize patterns (not known apriori) from such multi-structure data? (iii) How can we build predictive models by integrating weakly-associated multi-relational multi-structure data? We propose a framework towards answering these questions. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.
TermGenie – a web-application for pattern-based ontology class generation
Dietze, Heiko; Berardini, Tanya Z.; Foulger, Rebecca E.; ...
2014-01-01
Biological ontologies are continually growing and improving from requests for new classes (terms) by biocurators. These ontology requests can frequently create bottlenecks in the biocuration process, as ontology developers struggle to keep up while manually processing these requests and creating classes. TermGenie allows biocurators to generate new classes based on formally specified design patterns or templates. The system is web-based and can be accessed by any authorized curator through a web browser. Automated rules and reasoning engines are used to ensure validity, uniqueness and relationship to pre-existing classes. In the last 4 years the Gene Ontology TermGenie generated 4715 new classes, about 51.4% of all new classes created. The immediate generation of permanent identifiers proved not to be an issue, with only 70 (1.4%) obsoleted classes. TermGenie is a web-based class-generation system that complements traditional ontology development tools. All classes added through pre-defined templates are guaranteed to have OWL equivalence axioms that are used for automatic classification and in some cases inter-ontology linkage. At the same time, the system is simple and intuitive and can be used by most biocurators without extensive training.
Printing-assisted surface modifications of patterned ultrafiltration membranes
Wardrip, Nathaniel C.; Dsouza, Melissa; Urgun-Demirtas, Meltem; ...
2016-10-17
Understanding and restricting microbial surface attachment will enhance wastewater treatment with membranes. We report a maskless lithographic patterning technique for the generation of patterned polymer coatings on ultrafiltration membranes. Polyethylene glycol, zwitterionic, or negatively charged hydrophilic polymer compositions in parallel- or perpendicular-striped patterns with respect to feed flow were evaluated using wastewater. Membrane fouling was dependent on the orientation and chemical composition of the coatings. Modifications reduced alpha diversity in the attached microbial community (Shannon indices decreased from 2.63 to 1.89) which nevertheless increased with filtration time. Sphingomonas species, which condition membrane surfaces and facilitate cellular adhesion, were depleted in all modified membranes. Microbial community structure was significantly different between control, different patterns, and different chemistries. Lastly, this study broadens the tools for surface modification of membranes with polymer coatings and for understanding and optimization of antifouling surfaces.
Passos, M H M; Lemos, M R; Almeida, S R; Balthazar, W F; da Silva, L; Huguenin, J A O
2017-01-10
In this work, we report on the analysis of speckle patterns produced by illuminating different rough surfaces with an optical vortex, a first-order (l=1) Laguerre-Gaussian beam. The generated speckle patterns were observed in the normal direction in four different planes: the diffraction plane, image plane, focal plane, and exact Fourier transform plane. The digital speckle patterns were analyzed using the Hurst exponent of the digital images, an interesting tool for studying surface roughness. We show, as a proof of principle, that the Hurst exponent of a digital speckle pattern is more sensitive to surface roughness when the speckle pattern is produced by an optical vortex and observed at the focal plane. We also show that the Hurst exponent is not very sensitive to the topological charge l. These results open new possibilities of investigation in speckle metrology, since several techniques use speckle patterns for different applications.
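As a rough illustration of the analysis tool used above, the Hurst exponent of a one-dimensional trace can be estimated by rescaled-range (R/S) analysis. The sketch below is only a minimal 1-D version under simple assumptions; the paper applies the two-dimensional analogue to digital speckle images:

```python
import math
import random
from statistics import mean, pstdev

def rescaled_range(x):
    """R/S statistic: range of the mean-adjusted cumulative sum,
    divided by the standard deviation of the series."""
    m = mean(x)
    z, cum = [], 0.0
    for v in x:
        cum += v - m
        z.append(cum)
    return (max(z) - min(z)) / pstdev(x)

def hurst(x, sizes=(8, 16, 32, 64)):
    """Estimate the Hurst exponent as the log-log slope of the
    average R/S statistic against the window size."""
    pts = [(math.log(n),
            math.log(mean(rescaled_range(x[i:i + n])
                          for i in range(0, len(x) - n + 1, n))))
           for n in sizes]
    mx = mean(p for p, _ in pts)
    my = mean(q for _, q in pts)
    return (sum((p - mx) * (q - my) for p, q in pts)
            / sum((p - mx) ** 2 for p, _ in pts))

# White noise should give H near 0.5; a smooth trend gives H near 1.
rng = random.Random(0)
noise = [rng.random() for _ in range(256)]
trend = [float(i) for i in range(256)]
```

Rougher surfaces produce less correlated speckle intensity, which this estimator reflects as a lower exponent.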
NullSeq: A Tool for Generating Random Coding Sequences with Desired Amino Acid and GC Contents.
Liu, Sophia S; Hockenberry, Adam J; Lancichinetti, Andrea; Jewett, Michael C; Amaral, Luís A N
2016-11-01
The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. In order to accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. While many tools have been developed to create random nucleotide sequences, protein coding sequences are subject to a unique set of constraints that complicates the process of generating appropriate null models. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content for the purpose of hypothesis testing. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content, which we have developed into a python package. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. Furthermore, this approach can easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs which will lead to a better understanding of biological processes as well as more effective engineering of biological systems.
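The maximum-entropy idea can be illustrated with a Boltzmann-weighted codon sampler: tilting each synonymous-codon choice by exp(beta * GC-count) is the maximum-entropy distribution under a GC constraint, with beta tuned to hit the target. This sketch uses a toy codon table and an interface invented here for illustration, not the actual NullSeq API:

```python
import math
import random

# Toy codon table covering a few amino acids (the real tool uses the
# full standard genetic code).
CODONS = {
    "M": ["ATG"],
    "K": ["AAA", "AAG"],
    "G": ["GGT", "GGC", "GGA", "GGG"],
    "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
}

def gc_frac(seq):
    """Fraction of G/C nucleotides in a sequence."""
    return sum(c in "GC" for c in seq) / len(seq)

def sample_cds(protein, beta, rng):
    """Sample a coding sequence for `protein` codon by codon.
    Each synonymous codon gets weight exp(beta * GC-count): beta=0 is
    uniform over synonyms, larger beta pushes GC content up."""
    seq = []
    for aa in protein:
        options = CODONS[aa]
        weights = [math.exp(beta * sum(c in "GC" for c in codon))
                   for codon in options]
        seq.append(rng.choices(options, weights=weights)[0])
    return "".join(seq)
```

Because the amino acid at each position is fixed and only the synonym choice is tilted, the sampled sequences preserve the protein exactly while the GC content tracks beta.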
Forecast of Future Ohio River Basin, Waterway Traffic Based on Shippers Surveys.
1979-09-01
Analytic tool: SPSS. All of the editing, generation of tables, graphing, and other computational activities were carried out through its use. [Table residue: projected traffic (000's of tons), 1976-1990, by commodity group, including Coal & Coke, Petroleum Fuels, and Aggregates.]
Mask-induced aberration in EUV lithography
NASA Astrophysics Data System (ADS)
Nakajima, Yumi; Sato, Takashi; Inanami, Ryoichi; Nakasugi, Tetsuro; Higashiki, Tatsuhiko
2009-04-01
We estimated aberrations using Zernike sensitivity analysis and found that the tolerated aberration differs with the line direction relative to the illumination: the tolerated aberration for lines perpendicular to the illumination is much smaller than that for parallel lines. We attribute this difference to the mask 3D effect and call it mask-induced aberration. For lines perpendicular to the illumination, there was a CD difference between right and left lines even without aberration. In this report, we discuss the possibility of pattern formation with an NA 0.25 generation EUV lithography tool. For patterns perpendicular to the EUV light, the dominant part of the aberration is mask-induced. In EUV lithography, pattern correction based on the mask topography effect will become more important.
Prospect of EUV mask repair technology using e-beam tool
NASA Astrophysics Data System (ADS)
Kanamitsu, Shingo; Hirano, Takashi; Suga, Osamu
2010-09-01
Currently, repair machines used for advanced photomasks rely on FIB, AFM, or EB principles. Each has specific characteristics, and each is used in the situations that suit it. For the EUV generation, however, pattern sizes are expected to shrink below 80 nm, so higher image resolution and repair accuracy are required of these machines. Because FIB machines suffer from intrinsic damage induced by Ga ions and AFM machines face a critical tip-size issue, both are difficult to apply to the EUV generation. Consequently, we focused our research on an EB repair tool. EB repair tools have reached a practical milestone for MoSi-based masks. We applied the same process used for MoSi to EUV blanks and examined its reaction, and found severe problems: the etched features are uncontrollable owing to an extremely strong reaction between the etching gas and the absorber material. Although we could etch opaque defects with the conventional method and obtain straight edges in top-down SEM views, problems such as sidewall undercut or local erosion occurred, depending on defect shape. To cope with these problems, the tool vendor developed a new process and reported it at an international conference [1]. We have evaluated this new process in detail and report the results in this paper. Experiments on repair accuracy, process stability, and other items were performed under practical conditions, assuming defects of diverse sizes and shapes. A series of actual printability tests is also included. On the basis of these experiments, we consider the applicability of EB repair to 20 nm patterns.
Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; He, Yongqun
2015-01-01
It is time-consuming to build an ontology with many terms and axioms, so it is desirable to automate the process of ontology development. Ontology Design Patterns (ODPs) provide a reusable solution to a recurrent modeling problem in ontology engineering. Because ontology terms often follow specific ODPs, the Ontology for Biomedical Investigations (OBI) developers proposed a Quick Term Templates (QTTs) process targeted at generating new ontology classes following the same pattern, using term templates in a spreadsheet format. Inspired by the ODPs and QTTs, the Ontorat web application was developed to automatically generate new ontology terms, annotations of terms, and logical axioms based on specific ODP(s). The inputs of an Ontorat execution include axiom expression settings, an input data file, ID generation settings, and an optional target ontology. The axiom expression settings can be saved as a predesigned Ontorat setting-format text file for reuse. The input data file is generated based on a template file created from a specific ODP (text or Excel format). Ontorat is an efficient tool for ontology expansion, and different use cases are described. For example, Ontorat was applied to automatically generate over 1,000 Japan RIKEN cell line terms, with both logical axioms and rich annotation axioms, in the Cell Line Ontology (CLO). Approximately 800 licensed animal vaccines were represented and annotated in the Vaccine Ontology (VO) using Ontorat. The OBI team used Ontorat to add assay and device terms required by the ENCODE project, and Ontorat was also used to add missing annotations to all existing Biobank-specific terms in the Biobank Ontology. A collection of ODPs and templates with examples is provided on the Ontorat website and can be reused to facilitate ontology development.
With ever increasing ontology development and applications, Ontorat provides a timely platform for generating and annotating a large number of ontology terms by following design patterns. http://ontorat.hegroup.org/.
Using Discursis to enhance the qualitative analysis of hospital pharmacist-patient interactions.
Chevalier, Bernadette A M; Watson, Bernadette M; Barras, Michael A; Cottrell, William N; Angus, Daniel J
2018-01-01
Pharmacist-patient communication during medication counselling has been successfully investigated using Communication Accommodation Theory (CAT). Communication researchers in other healthcare professions have used Discursis software as an adjunct to their manual qualitative analysis processes. Discursis provides a visual, chronological representation of communication exchanges and identifies patterns of interactant engagement. The aim of this study was to describe how Discursis software was used to enhance previously conducted qualitative analysis of pharmacist-patient interactions (by visualising pharmacist-patient speech patterns and episodes of engagement, and identifying CAT strategies employed by pharmacists within these episodes). Visual plots from 48 transcribed audio recordings of pharmacist-patient exchanges were generated by Discursis. Representative plots were selected to show moderate-to-high and low-level speaker engagement. Episodes of engagement were examined for pharmacist application of CAT strategies (approximation, interpretability, discourse management, emotional expression, and interpersonal control). Discursis plots allowed for identification of distinct patterns occurring within pharmacist-patient exchanges. Moderate-to-high pharmacist-patient engagement was characterised by multiple off-diagonal squares, while alternating single-coloured squares depicted low engagement. Engagement episodes were associated with multiple CAT strategies such as discourse management (open-ended questions). Patterns reflecting pharmacist or patient speaker dominance were dependent on the clinical setting. Discursis analysis of pharmacist-patient interactions, a novel application of the technology in health communication, was found to be an effective visualisation tool for pinpointing episodes for CAT analysis. Discursis has numerous practical and theoretical applications for future health communication research and training.
Researchers can use the software to support qualitative analysis where large data sets can be quickly reviewed to identify key areas for concentrated analysis. Because Discursis plots are easily generated from audio recorded transcripts, they are conducive as teaching tools for both students and practitioners to assess and develop their communication skills.
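Discursis builds conceptual recurrence plots from the transcript; the underlying idea can be sketched in a stripped-down form using plain lexical cosine similarity between speaker turns. The turns below are invented examples, and this is not Discursis's actual algorithm, which uses concept-level rather than word-level matching.

```python
import math
from collections import Counter

# Invented pharmacist/patient turns (illustrative only).
turns = [
    "how are you taking this medication",
    "I take it every morning with food",
    "every morning with food is good",
    "thanks",
]

def cosine(a, b):
    """Bag-of-words cosine similarity between two utterances."""
    va, vb = Counter(a.split()), Counter(b.split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

# Recurrence matrix: entry (i, j) scores content shared between turns i and j.
# Off-diagonal "squares" of high similarity mark episodes of engagement.
R = [[cosine(t1, t2) for t2 in turns] for t1 in turns]
for row in R:
    print(" ".join(f"{v:.2f}" for v in row))
```

Plotting R as a heatmap gives the kind of visual, chronological engagement map the abstract describes: blocks off the diagonal show one speaker echoing another's content.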
Multiclassifier information fusion methods for microarray pattern recognition
NASA Astrophysics Data System (ADS)
Braun, Jerome J.; Glina, Yan; Judson, Nicholas; Herzig-Marx, Rachel
2004-04-01
This paper addresses automatic recognition of microarray patterns, a capability that could have major significance for medical diagnostics, enabling the development of diagnostic tools for automatic discrimination of specific diseases. The paper presents multiclassifier information fusion methods for microarray pattern recognition. An input space partitioning approach is investigated, based on fitness measures that constitute an a-priori gauge of classification efficacy for each subspace. Methods for generating fitness measures, generating input subspaces, and using them in the multiclassifier fusion architecture are presented. In particular, a two-level quantification of fitness is described that accounts for the quality of each subspace as well as the quality of individual neighborhoods within the subspace. Individual-subspace classifiers are Support Vector Machine based. The decision fusion stage fuses the information from multiple SVMs along with the multi-level fitness information. Techniques for the final decision fusion stage, including weighted fusion and Dempster-Shafer theory based fusion, are investigated. It should be noted that while these methods are discussed in the context of microarray pattern recognition, they are applicable to a broader range of discrimination problems, in particular problems involving a large number of information sources irreducible to a low-dimensional feature space.
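The weighted-fusion idea can be sketched on toy data with scikit-learn. The subspace split, the fitness measure (cross-validated accuracy), and the synthetic dataset are all assumptions for illustration, not the paper's actual microarray pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC

# Toy stand-in for microarray data: 40 features split into 4 subspaces.
X, y = make_classification(n_samples=200, n_features=40, n_informative=8,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
subspaces = [slice(i, i + 10) for i in range(0, 40, 10)]

clfs, fitness = [], []
for sub in subspaces:
    clf = SVC(probability=True, random_state=0).fit(Xtr[:, sub], ytr)
    # Fitness: an a-priori gauge of each subspace's classification efficacy,
    # here estimated by cross-validated accuracy on the training data.
    fitness.append(cross_val_score(SVC(), Xtr[:, sub], ytr, cv=3).mean())
    clfs.append(clf)

w = np.array(fitness) / sum(fitness)
# Weighted decision fusion: fitness-weighted average of class probabilities.
proba = sum(wi * clf.predict_proba(Xte[:, sub])
            for wi, clf, sub in zip(w, clfs, subspaces))
fused_acc = (proba.argmax(axis=1) == yte).mean()
print(f"fused accuracy: {fused_acc:.2f}")
```

The paper's two-level scheme would additionally weight by per-neighborhood fitness, and Dempster-Shafer combination would replace the weighted average; both slot into the same structure.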
Jung, Hyunjun; Kang, Hongki; Nam, Yoonkey
2017-06-01
Light-mediated neuromodulation techniques offer great advantages for neuroscience investigation owing to their high spatial and temporal resolution. To generate a spatial pattern of neural activity, it is necessary to develop a system for patterned-light illumination of a specific area. Digital micromirror device (DMD) based patterned illumination systems have been used for neuromodulation due to their simple configuration and design flexibility. In this paper, we developed a patterned near-infrared (NIR) illumination system for region-specific photothermal manipulation of neural activity using NIR-sensitive plasmonic gold nanorods (GNRs). The proposed system had high power transmission efficiency, delivering power densities of up to 19 W/mm². We used a GNR-coated microelectrode array (MEA) to perform biological experiments with E18 rat hippocampal neurons and showed that the patterned NIR illumination could inhibit neural spiking activity in specific areas of neural circuits. This patterned NIR illumination system can serve as a promising neuromodulation tool for neuroscience investigation in a wide range of physiological and clinical applications.
Traction patterns of tumor cells.
Ambrosi, D; Duperray, A; Peschetola, V; Verdier, C
2009-01-01
The traction exerted by a cell on a planar deformable substrate can be obtained indirectly from the displacement field of the underlying layer. The usual methodology for this inverse problem exploits the Green tensor of the linear elasticity problem in a half space (the Boussinesq problem), coupled with a minimization algorithm under force penalization. A possible alternative strategy is to exploit an adjoint equation, obtained from a suitable minimization requirement. The resulting system of coupled elliptic partial differential equations is applied here to determine the force field per unit surface generated by T24 tumor cells on a polyacrylamide substrate. The shear stress obtained by numerical integration provides quantitative insight into the traction field and is a promising tool for investigating the spatial pattern of force per unit surface generated in cell motion, particularly in the case of such cancer cells.
IDEAL: Images Across Domains, Experiments, Algorithms and Learning
NASA Astrophysics Data System (ADS)
Ushizima, Daniela M.; Bale, Hrishikesh A.; Bethel, E. Wes; Ercius, Peter; Helms, Brett A.; Krishnan, Harinarayan; Grinberg, Lea T.; Haranczyk, Maciej; Macdowell, Alastair A.; Odziomek, Katarzyna; Parkinson, Dilworth Y.; Perciano, Talita; Ritchie, Robert O.; Yang, Chao
2016-11-01
Research across science domains is increasingly reliant on image-centric data. Software tools are in high demand to uncover relevant, but hidden, information in digital images, such as those coming from faster next-generation high-throughput imaging platforms. The challenge is to analyze the data torrent generated by the advanced instruments efficiently, and to provide insights such as measurements for decision-making. In this paper, we give an overview of work performed by an interdisciplinary team of computational and materials scientists, aimed at designing software applications and coordinating research efforts connecting (1) emerging algorithms for dealing with large and complex datasets; (2) data analysis methods with emphasis on pattern recognition and machine learning; and (3) advances in evolving computer architectures. Engineering tools around these efforts accelerates the analysis of image-based recordings, improves reusability and reproducibility, scales scientific procedures by reducing time between experiments, increases efficiency, and opens opportunities for more users of the imaging facilities. This paper describes our algorithms and software tools, showing results across image scales and demonstrating how our framework plays a role in improving image understanding for quality control of existing materials and discovery of new compounds.
MCAW-DB: A glycan profile database capturing the ambiguity of glycan recognition patterns.
Hosoda, Masae; Takahashi, Yushi; Shiota, Masaaki; Shinmachi, Daisuke; Inomoto, Renji; Higashimoto, Shinichi; Aoki-Kinoshita, Kiyoko F
2018-05-11
Glycan-binding protein (GBP) interaction experiments, such as glycan microarrays, are often used to understand glycan recognition patterns. However, the interpretation of glycan array experimental data often makes it difficult to identify discrete GBP binding patterns due to their ambiguity. It is known that lectins, for example, are non-specific in their binding affinities; the same lectin can bind to different monosaccharides or even different glycan structures. In bioinformatics, several tools have been developed to mine the data generated from these sorts of experiments. These tools take a library of predefined motifs, which are commonly found glycan patterns such as sialyl-Lewis X, and attempt to identify the motif(s) that are specific to the GBP being analyzed. In our previous work, as opposed to using predefined motifs, we developed the Multiple Carbohydrate Alignment with Weights (MCAW) tool to visualize the state of the glycans being recognized by the GBP under analysis. We previously reported on the effectiveness of our tool and algorithm by analyzing several glycan array datasets from the Consortium for Functional Glycomics (CFG). In this work, we report on our analysis of 1081 datasets collected from the CFG, the results of which we have made publicly and freely available as a database called MCAW-DB. We introduce this database and its usage, and describe several analysis results. We show how MCAW-DB can be used to analyze the glycan-binding patterns of GBPs amidst their ambiguity. For example, the visualizations of glycan-binding patterns in MCAW-DB show how they correlate with the concentrations of the samples used in the array experiments. Using MCAW-DB, the patterns of glycans found to bind to various GBPs are visualized, indicating the binding "environment" of the glycans. Thus, the ambiguity of glycan recognition is numerically represented, along with the patterns of monosaccharides surrounding the binding region.
The profiles in MCAW-DB could potentially be used as predictors of the affinity of unknown or novel glycans for particular GBPs by comparing how well they match the existing profiles for those GBPs. Moreover, as the glycan profiles of diseased tissues become available, glycan alignments could also be used to identify glycan biomarkers unique to a tissue. Databases of these alignments may be of great use for drug discovery.
Digital micromirror device as programmable rough particle in interferometric particle imaging.
Fromager, M; Aït Ameur, K; Brunel, M
2017-04-20
The 2D autocorrelation of the projection of an irregular rough particle can be estimated from the analysis of its interferometric out-of-focus image. We report the development of an experimental setup that creates speckle-like patterns generated by "programmable" rough particles of desired shape. It should become an important tool for the development of new setups, configurations, and algorithms in interferometric particle imaging.
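The 2D autocorrelation mentioned above can be computed from a particle's binary projection via the Wiener-Khinchin theorem. A minimal sketch on a synthetic rectangular "particle" follows; the shape and grid size are arbitrary assumptions.

```python
import numpy as np

# Binary projection of a hypothetical irregular particle on a 64x64 grid.
mask = np.zeros((64, 64))
mask[24:40, 20:44] = 1.0

# 2D autocorrelation via the Wiener-Khinchin theorem:
# the autocorrelation is the inverse FFT of the power spectrum.
spectrum = np.fft.fft2(mask)
autocorr = np.fft.fftshift(np.fft.ifft2(np.abs(spectrum) ** 2).real)
autocorr /= autocorr.max()  # normalize so the zero-lag peak equals 1

peak = np.unravel_index(autocorr.argmax(), autocorr.shape)
print(peak)  # the peak sits at the center (zero lag): (32, 32)
```

In interferometric particle imaging, this autocorrelation is what the out-of-focus speckle pattern encodes, which is why a programmable particle shape is useful for validating reconstruction algorithms.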
SU-F-T-479: Estimation of the Accuracy in Respiratory-Gated Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurosawa, T; Miyakawa, S; Sato, M
Purpose: Irregular respiratory patterns affect dose output in respiratory-gated radiotherapy, and there is no commercially available quality assurance (QA) system for it. We designed and developed a patient-specific QA system for respiratory-gated radiotherapy to estimate the irradiated output. Methods: Our in-house QA system for gating was composed of a personal computer with a USB-FSIO electronic circuit connected to the linear accelerator (ONCOR-K, Toshiba Medical Systems). The linac implements a respiratory gating system (AZ-733V, Anzai Medical). While the beam was on, 4.2 V square-wave pulses were continually sent to our system, which received and counted them. First, our system was compared against an oscilloscope to check its performance. Next, basic estimation models were generated from ionization-chamber measurements performed during gating with regular sinusoidal wave patterns (2.0, 2.5, 4.0, 8.0, 15 sec/cycle). During gated irradiation with the regular patterns, the number of pulses per gating window was measured with our system, and the correlation between the number of pulses per gating window and the dose per gating window was assessed to generate the estimation model. Finally, two irregular respiratory patterns were created and the accuracy of the estimation was evaluated. Results: Our system performed similarly to the oscilloscope. The basic models were generated with an accuracy within 0.1%. The gated irradiations with the two irregular respiratory patterns showed good agreement, within 0.4% estimation accuracy. Conclusion: Our system provides good estimates even for irregular respiration patterns and would be a useful tool to verify the output for respiratory-gated radiotherapy.
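The estimation model described (pulses per gating window versus dose per window) amounts to a calibration fit. A minimal sketch with invented calibration numbers follows; these are not the study's measured data.

```python
import numpy as np

# Hypothetical calibration data: pulses counted per gating window vs.
# ionization-chamber dose per window, for regular sinusoidal patterns.
pulses = np.array([50, 80, 120, 200, 300])          # assumed pulse counts
dose = np.array([0.25, 0.40, 0.60, 1.00, 1.50])     # assumed dose (cGy)

# Basic estimation model: least-squares line dose = a * pulses + b.
a, b = np.polyfit(pulses, dose, 1)

def estimate_dose(n_pulses):
    """Estimate dose per gating window from the counted pulses."""
    return a * n_pulses + b

print(round(estimate_dose(100), 3))  # -> 0.5 for this linear toy data
```

With such a model in hand, counting pulses during an irregular breathing trace yields a per-window dose estimate that can be compared against the delivered output, which is the study's QA idea.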
Costs of Limiting Route Optimization to Published Waypoints in the Traffic Aware Planner
NASA Technical Reports Server (NTRS)
Karr, David A.; Vivona, Robert A.; Wing, David J.
2013-01-01
The Traffic Aware Planner (TAP) is an airborne advisory tool that generates optimized, traffic-avoiding routes to support the aircraft crew in making strategic reroute requests to Air Traffic Control (ATC). TAP is derived from a research-prototype self-separation tool, the Autonomous Operations Planner (AOP), in which optimized route modifications that avoid conflicts with traffic and weather, using waypoints at explicit latitudes and longitudes (a technique supported by self-separation concepts), are generated by maneuver patterns applied to the existing route. For use in current-day operations in which trajectory changes must be requested from ATC via voice communication, TAP produces optimized routes described by advisories that use only published waypoints prior to a reconnection waypoint on the existing route. We describe how the relevant algorithms of AOP have been modified to implement this requirement. The modifications include techniques for finding appropriate published waypoints in a maneuver pattern and a method for combining the genetic algorithm of AOP with an exhaustive search of certain types of advisory. We demonstrate methods to investigate the increased computation required by these techniques and to estimate other costs (measured in terms such as time to destination and fuel burned) that may be incurred when only published waypoints are used.
Mining patterns in persistent surveillance systems with smart query and visual analytics
NASA Astrophysics Data System (ADS)
Habibi, Mohammad S.; Shirkhodaie, Amir
2013-05-01
In Persistent Surveillance Systems (PSS), the ability to detect and characterize events geospatially helps in taking pre-emptive steps to counter an adversary's actions. An interactive Visual Analytics (VA) model offers a platform for pattern investigation and reasoning to comprehend and/or predict such occurrences. Identifying and offsetting these threats requires collecting information from diverse sources, which brings increasingly abstract data. These abstract semantic data have a degree of inherent uncertainty and imprecision, and require filtering before further processing. In this paper, we introduce an approach based on the Vector Space Modeling (VSM) technique for the classification of spatiotemporal sequential patterns of group activities. The feature vectors consist of arrays of attributes extracted from sensor-generated, semantically annotated messages. To facilitate proper similarity matching and detection of time-varying spatiotemporal patterns, a temporal Dynamic Time Warping (DTW) method with a Gaussian Mixture Model (GMM) fitted by Expectation Maximization (EM) is introduced. DTW is intended for the detection of event patterns from neighborhood-proximity semantic frames derived from an established ontology. The GMM with EM, on the other hand, is employed as a Bayesian probabilistic model to estimate the probability of events associated with a detected spatiotemporal pattern. We present a new visual analytics tool for testing and evaluating group activities detected under this scheme. Experimental results demonstrate the effectiveness of the proposed approach for the discovery and matching of subsequences within the sequentially generated pattern space of our experiments.
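The DTW component can be illustrated with the classic dynamic-programming recurrence. This is a generic 1-D sketch on invented signals, not the paper's temporal DTW over semantic frames.

```python
import numpy as np

def dtw_distance(s, t):
    """Classic dynamic-programming DTW between two 1-D sequences."""
    n, m = len(s), len(t)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            # Extend the cheapest of match / insertion / deletion paths.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two activity signals with the same shape but different timing align well...
a = [0, 1, 2, 3, 2, 1, 0]
b = [0, 0, 1, 2, 3, 3, 2, 1, 0]
# ...while an unrelated pattern does not.
c = [3, 0, 3, 0, 3, 0, 3]
print(dtw_distance(a, b), dtw_distance(a, c))
```

This time-warping invariance is what lets similar group-activity sequences be matched even when they unfold at different rates.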
Dynamic analysis and pattern visualization of forest fires.
Lopes, António M; Tenreiro Machado, J A
2014-01-01
This paper analyses forest fires from the perspective of dynamical systems. Forest fires exhibit complex correlations in size, space and time, revealing features often present in complex systems, such as the absence of a characteristic length-scale, or the emergence of long-range correlations and persistent memory. This study addresses a public-domain forest fires catalogue containing information on events in Portugal during the period from 1980 up to 2012. The data are analysed on an annual basis, modelling the occurrences as sequences of Dirac impulses with amplitude proportional to the burnt area. First, we consider mutual information to correlate annual patterns, and use visualization trees, generated by hierarchical clustering algorithms, to compare and extract relationships among the data. Second, we adopt the Multidimensional Scaling (MDS) visualization tool. MDS generates maps where each object corresponds to a point, and objects perceived to be similar to each other form clusters on the map. The results are analysed in order to extract relationships among the data and to identify forest fire patterns.
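The MDS step can be sketched with scikit-learn on a small invented dissimilarity matrix, standing in for, e.g., mutual-information-based distances between annual fire patterns.

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical dissimilarities between four annual fire patterns:
# years 0 and 1 are alike, years 2 and 3 are alike, the pairs differ.
D = np.array([[0.0, 0.1, 0.9, 0.8],
              [0.1, 0.0, 0.8, 0.9],
              [0.9, 0.8, 0.0, 0.1],
              [0.8, 0.9, 0.1, 0.0]])

# MDS places each year as a point so that map distances approximate D;
# similar years end up clustered together on the resulting map.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
pts = mds.fit_transform(D)

d01 = np.linalg.norm(pts[0] - pts[1])
d02 = np.linalg.norm(pts[0] - pts[2])
print(d01 < d02)  # similar years lie closer on the map
```

Scattering the points then gives the cluster map the paper describes, with each point an annual pattern and clusters marking years with similar fire dynamics.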
Analysis of lead twist in modern high-performance grinding methods
NASA Astrophysics Data System (ADS)
Kundrák, J.; Gyáni, K.; Felhő, C.; Markopoulos, AP; Deszpoth, I.
2016-11-01
According to the quality requirements for road-vehicle shafts that bear dynamic seals, twisted-pattern micro-geometrical topography is not allowed. It is an open question whether newer grinding methods, such as quick-point grinding and peel grinding, can provide twist-free topography. According to industrial experience, twist-free surfaces can be made; however, with certain settings, some twist occurs. In this paper, a detailed chip-geometrical analysis proves that the topography generated by the new procedures is theoretically twist-patterned because of the feed motion of the CBN tool. The investigation was carried out with a single-grain wheel model and computer simulation.
Pattern Search in Multi-structure Data: A Framework for the Next-Generation Evidence-based Medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sukumar, Sreenivas R; Ainsworth, Keela C
With the advent of personalized and evidence-based medicine, the need for a framework to analyze and interpret quantitative measurements (blood work, toxicology, etc.) together with qualitative descriptions (specialist reports after reading images, biomedical knowledge bases) to predict diagnostic risks is fast emerging. Addressing this need, we pose and address the following questions: (i) How can we jointly analyze both qualitative and quantitative data? (ii) Is the fusion of multi-structure data expected to provide better insights than either type individually? We present experiments on two biomedical data sets, mammography and traumatic brain studies, to demonstrate architectures and tools for evidence-pattern search.
Kahl, Johannes; Busscher, Nicolaas; Mergardt, Gaby; Mäder, Paul; Torp, Torfinn; Ploeger, Angelika
2015-01-01
There is a need for authentication tools to verify the existing certification system. Recently, markers for the analytical authentication of organic products were evaluated. Crystallization with additives has been described as an interesting fingerprint approach, but it needs further evidence based on a standardized method and well-documented sample origin. Here, the fingerprint of wheat cultivars from a controlled field trial is generated from structure-analysis variables of crystal patterns. Method performance was tested with respect to factors such as the crystallization chamber, the day of the experiment, and the region of interest within the patterns. Two different organic treatments and two different treatments of the non-organic regime could be grouped together in each of three consecutive seasons. When the k-nearest-neighbor classification method was applied, approximately 84% of Runal samples and 95% of Titlis samples were classified correctly into organic and non-organic origin using cross-validation. Crystallization with additives offers an interesting complementary fingerprint method for organic wheat samples. When the method is applied to winter wheat from the DOK trial, organic and non-organic treated samples can be differentiated significantly based on pattern recognition. Crystallization with additives therefore seems to be a promising tool for organic wheat authentication.
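The evaluation described (k-nearest-neighbor classification assessed by cross-validation) can be sketched on synthetic two-class data standing in for the crystal-pattern structure variables. All numbers here are assumptions, not the study's data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Hypothetical stand-in for crystal-pattern texture variables:
# non-organic (label 0) and organic (label 1) samples differ slightly
# in two structure-analysis features.
n = 60
X = np.vstack([rng.normal(0.0, 1.0, size=(n, 2)),
               rng.normal(1.5, 1.0, size=(n, 2))])
y = np.array([0] * n + [1] * n)

# k-nearest-neighbor classification assessed by cross-validation,
# mirroring the paper's organic vs. non-organic evaluation.
acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```

The reported 84% and 95% accuracies correspond to this kind of cross-validated score, computed per cultivar on the real pattern features.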
COMPOSE-HPC: A Transformational Approach to Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernholdt, David E; Allan, Benjamin A.; Armstrong, Robert C.
2012-04-01
The goal of the COMPOSE-HPC project is to 'democratize' tools for automatic transformation of program source code so that it becomes tractable for the developers of scientific applications to create and use their own transformations reliably and safely. This paper describes our approach to this challenge: the creation of the KNOT tool chain, which includes tools for the creation of annotation languages to control the transformations (PAUL), for performing the transformations (ROTE), and for optimization and code generation (BRAID), which can be used individually and in combination. We also provide examples of current and future uses of the KNOT tools, which include transforming code to use different programming models and environments, providing tests that can be used to detect errors in software or its execution, and composing software written in different programming languages or with different threading patterns.
The FaceBase Consortium: A comprehensive program to facilitate craniofacial research
Hochheiser, Harry; Aronow, Bruce J.; Artinger, Kristin; Beaty, Terri H.; Brinkley, James F.; Chai, Yang; Clouthier, David; Cunningham, Michael L.; Dixon, Michael; Donahue, Leah Rae; Fraser, Scott E.; Hallgrimsson, Benedikt; Iwata, Junichi; Klein, Ophir; Marazita, Mary L.; Murray, Jeffrey C.; Murray, Stephen; de Villena, Fernando Pardo-Manuel; Postlethwait, John; Potter, Steven; Shapiro, Linda; Spritz, Richard; Visel, Axel; Weinberg, Seth M.; Trainor, Paul A.
2012-01-01
The FaceBase Consortium consists of ten interlinked research and technology projects whose goal is to generate craniofacial research data and technology for use by the research community through a central data management and integrated bioinformatics hub. Funded by the National Institute of Dental and Craniofacial Research (NIDCR) and currently focused on studying the development of the middle region of the face, the Consortium will produce comprehensive datasets of global gene expression patterns, regulatory elements and sequencing; will generate anatomical and molecular atlases; will provide human normative facial data and other phenotypes; will conduct follow-up studies of a completed genome-wide association study; will generate independent data on the genetics of craniofacial development; will build repositories of animal models and of human samples and data for community access and analysis; and will develop software tools and animal models for analyzing, functionally testing, and integrating these data. The FaceBase website (http://www.facebase.org) will serve as a web home for these efforts, providing interactive tools for exploring these datasets, together with discussion forums and other services to support and foster collaboration within the craniofacial research community. PMID:21458441
Data Mining Techniques Applied to Hydrogen Lactose Breath Test.
Rubio-Escudero, Cristina; Valverde-Fernández, Justo; Nepomuceno-Chamorro, Isabel; Pontes-Balanza, Beatriz; Hernández-Mendoza, Yoedusvany; Rodríguez-Herrera, Alfonso
2017-01-01
Purpose: To analyze a set of hydrogen breath test data using data mining tools and identify new patterns of H2 production. Methods: We applied k-means clustering to hydrogen breath test data sets from 2571 patients. Results: Six different patterns were extracted from the hydrogen breath test data, and the relevance of each of the samples taken throughout the test was established. Conclusion: Analysis of the hydrogen breath test data sets using data mining techniques identified new patterns of hydrogen generation upon lactose absorption, demonstrating the potential of applying data mining techniques to clinical data sets. These results offer promising data for future research on the relations between gut-microbiota-produced hydrogen and its link to clinical symptoms.
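A minimal k-means sketch in the spirit of the analysis follows, on invented hydrogen curves; the sample times, ppm levels, and cluster count are assumptions, not the study's data or its six extracted patterns.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical H2 breath-test curves (ppm at six successive sample times):
# flat non-producer profiles vs. rising "malabsorption-like" profiles.
flat = rng.normal(5, 1, size=(50, 6))
rising = rng.normal(5, 1, size=(50, 6)) + np.linspace(0, 40, 6)
curves = np.vstack([flat, rising])

# k-means groups patients whose hydrogen curves follow a similar pattern;
# the cluster centroids summarize each extracted production profile.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(curves)
centroids = km.cluster_centers_
print(np.round(centroids[:, -1]))  # final-sample H2 level of each profile
```

In the study, k was larger (six patterns) and each curve came from a patient's timed breath samples; the centroid curves are what clinicians would inspect as the characteristic H2 production profiles.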
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, H; Tan, J; Kavanaugh, J
Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical use, and manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was therefore developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language, C#, and application programming interfaces such as the Visualization Toolkit (VTK). The C# language served as the tool's design basis. The Accord.NET scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used for system self-updating of geometry variations of normal structures, based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Forms Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours; file loading/saving of medical images of various modalities and RT contours; and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates a supervised learning framework with image processing and graphical visualization modules for RT contour verification.
This tool has great potential for facilitating treatment planning by using an automatic contour evaluation module to spare physicians/dosimetrists unnecessary manual verification. In addition, its nature as a compact, stand-alone tool allows for future extensibility to include additional functions for physicians' clinical needs.
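The PCA-based normality check described above can be sketched as reconstruction-error scoring: contours whose feature vectors lie far from the subspace spanned by the training set's principal components are flagged as abnormal. The 2-D "contour feature" vectors below are hypothetical; the real tool trains on physician-approved RT contours via Accord.NET.

```python
import numpy as np

# PCA anomaly-scoring sketch (hypothetical 2-D feature vectors standing
# in for contour geometry descriptors).
train = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
mean = train.mean(axis=0)
# Principal directions come from the SVD of the centered training set.
_, _, vt = np.linalg.svd(train - mean)
basis = vt[:1]                      # keep the first principal component

def reconstruction_error(x):
    """Distance from x to the PCA subspace; large values flag abnormal contours."""
    centered = x - mean
    projected = centered @ basis.T @ basis
    return float(np.linalg.norm(centered - projected))

normal_err = reconstruction_error(np.array([4.0, 4.0]))   # on the training trend
outlier_err = reconstruction_error(np.array([0.0, 4.0]))  # off the training trend
```

A point continuing the training trend reconstructs almost exactly, while the off-trend point keeps a large residual, which is the signal a supervised threshold would act on.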
matK-QR classifier: a patterns based approach for plant species identification.
More, Ravi Prabhakar; Mane, Rupali Chandrashekhar; Purohit, Hemant J
2016-01-01
DNA barcoding is a widely used and highly efficient approach that facilitates rapid and accurate identification of plant species based on a short standardized segment of the genome. The nucleotide sequences of the maturase K (matK) and ribulose-1,5-bisphosphate carboxylase (rbcL) marker loci are commonly used in plant species identification. Here, we present a new and highly efficient approach for identifying a unique set of discriminating nucleotide patterns to generate a signature (i.e. a regular expression) for plant species identification. To generate molecular signatures, we used matK and rbcL loci datasets, which encompass 125 plant species in 52 genera reported by the CBOL plant working group. Initially, we performed Multiple Sequence Alignment (MSA) of all species, followed by Position Specific Scoring Matrix (PSSM) construction for both loci, to achieve a percentage of discrimination among species. Further, we detected Discriminating Patterns (DP) at the genus and species level using the PSSM for the matK dataset. Combining DP and consecutive pattern distances, we generated molecular signatures for each species. Finally, we performed a comparative assessment of these signatures against the NCBI-GenBank matK dataset with existing methods, including BLASTn, Support Vector Machines (SVM), JRip-RIPPER, J48 (the C4.5 algorithm), and Naïve Bayes (NB). Owing to the higher discrimination success obtained with matK as compared to rbcL, we selected the matK gene for signature generation. We generated signatures for 60 species based on the identified discriminating patterns at the genus and species level.
Our comparative assessment results suggest that a total of 46 out of 60 species could be correctly identified using the generated signatures, followed by the BLASTn (34 species), SVM (18 species), C4.5 (7 species), NB (4 species) and RIPPER (3 species) methods. As a final outcome of this study, we converted the signatures into QR codes and developed a software tool, matK-QR Classifier (http://www.neeri.res.in/matk_classifier/index.htm), which searches for the signatures in query matK gene sequences and predicts the corresponding plant species. This novel approach of employing pattern-based signatures opens new avenues for the classification of species. In addition to existing methods, we believe that matK-QR Classifier will be a valuable tool for molecular taxonomists, enabling precise identification of plant species.
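The core idea of a regular-expression signature per species can be sketched in a few lines. The motifs below are invented for illustration only; they are not the actual matK signatures generated by the study.

```python
import re

# Hypothetical species signatures expressed as regular expressions,
# mimicking the pattern-based matching idea (not the real matK signatures).
signatures = {
    "Species A": re.compile(r"ATG[AT]CC.{2,4}GGT"),
    "Species B": re.compile(r"TTGACC.{3}AAC"),
}

def classify(sequence):
    """Return the names of all species whose signature occurs in the query."""
    return [name for name, sig in signatures.items() if sig.search(sequence)]

hits = classify("CCATGACCAAGGTTT")
```

A query sequence matching exactly one signature yields an unambiguous call; multiple or zero matches would be handled by the tool's fallback logic.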
Frederik Doyon; Brian Sturtevant; Michael J. Papaik; Andrew Fall; Brian Miranda; Daniel D. Kneeshaw; Christian Messier; Marie-Josee Fortin; Patrick M.A. James
2012-01-01
Sustainable forest management (SFM) recognizes that the spatial and temporal patterns generated at different scales by natural landscape and stand dynamics processes should serve as a guide for managing the forest within its range of natural variability. Landscape simulation modeling is a powerful tool that can help encompass such complexity and support SFM planning....
NASA Astrophysics Data System (ADS)
Latrubesse, E. M.; Pereira, M.; Ramonell, C. G.; Szupiany, R. N.
2011-12-01
A new category of "very large" rivers was recently proposed and defined as mega-rivers: those rivers with a mean discharge (Qmean) of more than ~17,000 m3/s. This category includes the nine largest rivers on Earth, and the Paraná River is one of the members of that peculiar group. The planform adjustment of mega-rivers is a variety of anabranching patterns characterized by the existence of alluvial islands. The processes and mechanisms involved in the generation of the different anabranching styles, however, are not well understood. The Paraná channel pattern has been classified as low to moderate anabranching of low sinuosity, with a tendency to braiding and a meandering thalweg. We analyzed a reach of the middle Paraná in Argentina applying a combined multitemporal, hydraulic, sedimentologic and geomorphologic approach. Multitemporal geomorphologic maps, sedimentary descriptions of bars, islands and banks, volumetric calculations using multitemporal bathymetric charts, measurements with ADCP and bathymetric surveys with echo sounding, sediment transport estimations and hydrological analysis of available data from gauge stations were some of the tools used in our research. The evolution of the reach was studied from 1908 to the present. The reach is subdivided into two sub-reaches (named Chapeton and Curtiembre) comprised between nodal points. Chapeton remained in a more mature, quasi-equilibrium state through the 20th century, but the main channel in Curtiembre evolved from a single pattern to an anabranching pattern beginning in the 1950s. We conclude that the generation of the anabranching pattern in the studied reach depends on a combination of factors, such as the architecture of the floodplain and islands, the main role played by the morphodynamics and shifting of the thalweg, the availability and path of sandy sediment, bedform architecture, and the temporal variability of the effective discharge, among other secondary factors.
A feedback system coupling erosional/depositional processes at the decadal scale seems to be mainly responsible for the generation of the complex anabranching pattern in these sub-reaches.
Sylos-Labini, Francesca; Ivanenko, Yuri P.
2014-01-01
Reduced gravity offers unique opportunities to study motor behavior. This paper aims at providing a review of current issues concerning the known tools and techniques used for hypogravity simulation and their effects on human locomotion. Walking and running rely on the limbs' oscillatory mechanics, and one way to change their dynamic properties is to modify the level of gravity. Gravity has a strong effect on the optimal rate of limb oscillations, optimal walking speed, and muscle activity patterns, and gait transitions occur smoothly and at slower speeds at lower gravity levels. Altered center-of-mass movements and the interplay between stance and swing leg dynamics may give rise to new forms of locomotion in a heterogravity environment. Furthermore, observations made in the absence of gravity effects help to reveal the intrinsic properties of locomotor pattern generators and make evident the facilitation of nonvoluntary limb stepping. In view of that, space neuroscience research has participated in the development of new technologies that can be used as effective tools for gait rehabilitation. PMID:25247179
The capability of lithography simulation based on MVM-SEM® system
NASA Astrophysics Data System (ADS)
Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong
2015-10-01
Lithography at the 1X-nm technology node uses SMO-ILT, NTD or even more complex patterns. In mask defect inspection, defect verification therefore becomes more difficult, because many nuisance defects are detected in aggressive mask features. One key technology in mask manufacturing is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is difficult to use it to verify a hundred defects or more. We previously reported a defect verification capability based on lithography simulation with an SEM system whose architecture and software showed excellent correlation for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD and other complex pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM system. Finally, we confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.
Pattern Inspection of EUV Masks Using DUV Light
NASA Astrophysics Data System (ADS)
Liang, Ted; Tejnil, Edita; Stivers, Alan R.
2002-12-01
Inspection of extreme ultraviolet (EUV) lithography masks requires reflected light, and this poses special challenges for inspection tool suppliers as well as for mask makers. Inspection must detect all the printable defects in the absorber pattern as well as printable process-related defects. Progress has been made under the NIST ATP project on "Intelligent Mask Inspection Systems for Next Generation Lithography" in assessing the factors that impact inspection tool sensitivity. We report in this paper the inspection of EUV masks with programmed absorber defects using 257 nm light. All the materials of interest for masks are highly absorptive to EUV light as compared to deep ultraviolet (DUV) light. Residues and contamination from the mask fabrication process and handling are prone to be printable. Therefore, it is critical to understand their EUV printability and optical inspectability. Process-related defects may include residual buffer layers such as oxide, organic contaminants and possible over-etch into the multilayer surface. Both simulation and experimental results are presented in this paper.
Methodology to design a municipal solid waste generation and composition map: A case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallardo, A., E-mail: gallardo@uji.es; Carlos, M., E-mail: mcarlos@uji.es; Peris, M., E-mail: perism@uji.es
Highlights: • To draw a waste generation and composition map of a town, many factors must be taken into account. • The proposed methodology offers two different approaches, depending on the available data, combined with geographic information systems. • The methodology has been applied to a Spanish city with success. • The methodology will be a useful tool to organize municipal solid waste management. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as the number of inhabitants is not constant, nor is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies that deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available, and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with geographic information systems to present the final results in thematic maps that make them easier to interpret.
The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. To verify the methodology, it has been successfully applied to a Spanish town.
NASA Astrophysics Data System (ADS)
Lee, Hyemi; Jeong, Goomin; Seo, Kangjun; Kim, Sangchul; Kim, Changreol
2008-05-01
As mask design rules become smaller and smaller, defects have become one of the issues lowering mask yield. Furthermore, the defect size that must be controlled becomes smaller as masks are manufactured. According to the 2007 ITRS roadmap, the controlled defect size on a mask is 46 nm at the 57 nm node and 36 nm at the 45 nm node. However, inspection machine development has lagged behind the pace of photolithography development. Generally, the mask manufacturing process is divided into three parts. The first part is patterning the mask; the second part is inspecting the pattern and repairing defects on the mask. For this, transmitted-light inspection tools are normally used and have been the most trusted of the inspection tools developed to date. The final part is shipping the mask after qualifying its issue points and weak points. Issue points on a mask are qualified using AIMS (aerial image measurement system). But this system includes an inherent possibility of error, because AIMS measures the issue points based on the inspection results: it assumes that defects printed on a wafer are larger than the specific size detected by the inspection tools and that the inspection tool detects almost all defects. Even though there are no tools to detect the 46 nm and 36 nm defects suggested by the ITRS roadmap, this assumption is applied in manufacturing 57 nm and 45 nm devices. We therefore made a programmed defect mask containing various defect types, such as spots, clear extensions, dark extensions and CD variation, on L/S (line and space), C/H (contact hole) and active patterns at the 55 nm and 45 nm nodes. The programmed defect mask was inspected using a transmitted-light inspection tool and measured using AIMS 45-193i. Then the marginal defects were compared between the inspection tool and AIMS. Accordingly, we could verify whether the defect sizes that the ITRS roadmap suggests be controlled on a mask are appropriate.
These results could also suggest appropriate inspection tools for next-generation devices among transmitted-light, reflected-light and aerial-image inspection tools.
Hoffman, Hal M; Wolfe, Frederick; Belomestnov, Pavel; Mellis, Scott J
2008-09-01
Development of an instrument for characterization of symptom patterns and severity in patients with cryopyrin-associated periodic syndromes (CAPS). Two generations of daily health assessment forms (DHAFs) were evaluated in this study. The first-generation DHAF queried 11 symptoms. Analyses of results obtained with that instrument identified five symptoms that were included in a revised second-generation DHAF, which was tested for internal consistency and test-retest reliability. This DHAF was also assessed during the initial portion of a phase 3 clinical study of CAPS treatment. Forty-eight CAPS patients provided data for the first-generation DHAFs. Five symptoms (rash, fever, joint pain, eye redness/pain, and fatigue) were included in the revised second-generation DHAF. Symptom severity was highly variable during all study phases, with as many as 89% of patients reporting at least one symptom flare and percentages of days with flares reaching 58% during evaluation of the second-generation instrument. Mean composite key symptom scores (KSSs) computed during evaluation of the second-generation DHAF correlated well with the Physician's Global Assessment of Disease Activity (r=0.91, p<0.0001) and patient reports of limitations of daily activities (r=0.68, p<0.0001). Test-retest reliability and Cronbach's alpha were high (0.93 and 0.94, respectively) for the second-generation DHAF. Further evaluation of this DHAF during a baseline period and placebo treatment in a phase 3 clinical study of CAPS patients indicated strong correlations between baseline KSS and the Physician's Global Assessment of Disease Activity. Cronbach's alpha at baseline and test-retest reliability were also high. Potentially important study limitations include the small sample size, the lack of a standard tool for CAPS symptom assessment against which to validate the DHAF, and no assessment of the instrument's responsiveness to CAPS therapy.
The DHAF is a new instrument that may be useful for capturing symptom patterns and severity in CAPS patients and monitoring responses to therapies for these conditions.
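The internal-consistency statistic reported above, Cronbach's alpha, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch on hypothetical symptom scores (not the study's data) is:

```python
# Cronbach's alpha sketch on hypothetical daily symptom scores
# (rows = patients, columns = questionnaire items; toy numbers only).
def cronbach_alpha(rows):
    k = len(rows[0])                      # number of items
    def var(xs):                          # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[i] for row in rows]) for i in range(k)]
    total_var = var([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Two perfectly correlated items give the maximum alpha of 1.0.
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

Perfectly correlated items drive alpha to 1.0; uncorrelated items drive it toward 0, which is why the reported values near 0.94 indicate high internal consistency.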
2011-01-01
Background To make sense out of gene expression profiles, such analyses must be pushed beyond the mere listing of affected genes. For example, if a group of genes persistently display similar changes in expression levels under particular experimental conditions, and the proteins encoded by these genes interact and function in the same cellular compartments, this could be taken as very strong indicators for co-regulated protein complexes. One of the key requirements is having appropriate tools to detect such regulatory patterns. Results We have analyzed the global adaptations in gene expression patterns in the budding yeast when the Hsp90 molecular chaperone complex is perturbed either pharmacologically or genetically. We integrated these results with publicly accessible expression, protein-protein interaction and intracellular localization data. But most importantly, all experimental conditions were simultaneously and dynamically visualized with an animation. This critically facilitated the detection of patterns of gene expression changes that suggested underlying regulatory networks that a standard analysis by pairwise comparison and clustering could not have revealed. Conclusions The results of the animation-assisted detection of changes in gene regulatory patterns make predictions about the potential roles of Hsp90 and its co-chaperone p23 in regulating whole sets of genes. The simultaneous dynamic visualization of microarray experiments, represented in networks built by integrating one's own experimental with publicly accessible data, represents a powerful discovery tool that allows the generation of new interpretations and hypotheses. PMID:21672238
An Interactive Simulation Program for Exploring Computational Models of Auto-Associative Memory.
Fink, Christian G
2017-01-01
While neuroscience students typically learn about activity-dependent plasticity early in their education, they often struggle to conceptually connect modification at the synaptic scale with network-level neuronal dynamics, not to mention with their own everyday experience of recalling a memory. We have developed an interactive simulation program (based on the Hopfield model of auto-associative memory) that enables the user to visualize the connections generated by any pattern of neural activity, as well as to simulate the network dynamics resulting from such connectivity. An accompanying set of student exercises introduces the concepts of pattern completion, pattern separation, and sparse versus distributed neural representations. Results from a conceptual assessment administered before and after students worked through these exercises indicate that the simulation program is a useful pedagogical tool for illustrating fundamental concepts of computational models of memory.
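The Hopfield model underlying the simulation program above can be sketched in a few lines: Hebbian outer-product learning stores a pattern in the weight matrix, and iterated threshold updates complete a corrupted cue back to the stored memory. The 6-unit pattern below is an arbitrary toy example.

```python
import numpy as np

# Hopfield auto-associative memory sketch: store one pattern with the
# Hebbian outer-product rule, then recall it from a corrupted cue.
pattern = np.array([1, -1, 1, -1, 1, -1])
weights = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(weights, 0.0)             # no self-connections

def recall(state, steps=5):
    """Synchronous updates; the sign nonlinearity drives pattern completion."""
    state = state.copy()
    for _ in range(steps):
        state = np.where(weights @ state >= 0, 1, -1)
    return state

cue = pattern.copy()
cue[0] = -1                                # flip one bit of the stored pattern
recovered = recall(cue)                    # converges back to `pattern`
```

This is the pattern-completion behavior the student exercises explore: a partial or noisy cue relaxes to the nearest stored attractor.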
A strategy to discover new organizers identifies a putative heart organizer
Anderson, Claire; Khan, Mohsin A. F.; Wong, Frances; Solovieva, Tatiana; Oliveira, Nidia M. M.; Baldock, Richard A.; Tickle, Cheryll; Burt, Dave W.; Stern, Claudio D.
2016-01-01
Organizers are regions of the embryo that can both induce new fates and impart pattern on other regions. So far, surprisingly few organizers have been discovered, considering the number of patterned tissue types generated during development. This may be because their discovery has relied on transplantation and ablation experiments. Here we describe a new approach, using chick embryos, to discover organizers based on a common gene expression signature, and use it to uncover the anterior intestinal portal (AIP) endoderm as a putative heart organizer. We show that the AIP can induce cardiac identity from non-cardiac mesoderm and that it can pattern this by specifying ventricular and suppressing atrial regional identity. We also uncover some of the signals responsible. The method holds promise as a tool to discover other novel organizers acting during development. PMID:27557800
Jung, Hyunjun; Kang, Hongki; Nam, Yoonkey
2017-01-01
Light-mediated neuromodulation techniques provide great advantages for investigating the nervous system due to their high spatial and temporal resolution. To generate a spatial pattern of neural activity, it is necessary to develop a system for patterned-light illumination of a specific area. Digital micromirror device (DMD) based patterned illumination systems have been used for neuromodulation due to their simple configuration and design flexibility. In this paper, we developed a patterned near-infrared (NIR) illumination system for region-specific photothermal manipulation of neural activity using NIR-sensitive plasmonic gold nanorods (GNRs). The proposed system had high power transmission efficiency, delivering power densities up to 19 W/mm2. We used a GNR-coated microelectrode array (MEA) to perform biological experiments on E18 rat hippocampal neurons and showed that it was possible to inhibit the neural spiking activity of specific areas in neural circuits with patterned NIR illumination. This patterned NIR illumination system can serve as a promising neuromodulation tool for a wide range of physiological and clinical applications. PMID:28663912
NASA Astrophysics Data System (ADS)
Hosono, Kunihiro; Kato, Kokoro
2008-10-01
This is a report on a panel discussion organized at Photomask Japan 2008, where the challenges of "Mask Complexities, Cost, and Cycle Time in the 32-nm System LSI Generation" were addressed to survey possible solutions from the standpoints of chipmakers, commercial mask shops, DA tool vendors and equipment makers. The wrap-up is as follows. Mask complexities justify the mask cost, while the acceptable rate of increase of 32-nm mask cost differs significantly between the mask supplier and user sides. Efficiency gains from new tools and DFM have driven cycle-time reductions. Mask complexity and cost will be crucial issues ahead of cycle time, and there seems to be a linear correlation between them. Controlling complexity and cycle time requires developing a mix of advanced technologies; for cost reduction in particular, shot prices in writers and processing rates in inspection tools have been improved remarkably by tool makers. In addition, the activities of a consortium in Japan (Mask D2I) are expected to enhance the total optimization of mask design, writing and inspection. Cycle-time reduction potentially drives the lowering of mask cost; on the other hand, pattern complexity and tighter mask specifications stand in the way of the 32-nm generation, as do nano-economics and market challenges. There are still many difficult problems in mask manufacturing, and we are determined to overcome the 32-nm hurdle through advances in technologies and collaborations, not only technological but also financial.
Lee, Jongkeun; Lee, Andy Jinseok; Lee, June-Koo; Park, Jongkeun; Kwon, Youngoh; Park, Seongyeol; Chun, Hyonho; Ju, Young Seok; Hong, Dongwan
2018-05-22
Somatic genome mutations occur due to combinations of various intrinsic/extrinsic mutational processes and DNA repair mechanisms. Different molecular processes frequently generate different signatures of somatic mutations in their own favored contexts. As a result, the regional somatic mutation rate is dependent on the local DNA sequence, the DNA replication/RNA transcription dynamics and epigenomic chromatin organization landscape in the genome. Here, we propose an online computational framework, termed Mutalisk, which correlates somatic mutations with various genomic, transcriptional and epigenomic features in order to understand mutational processes that contribute to the generation of the mutations. This user-friendly tool explores the presence of localized hypermutations (kataegis), dissects the spectrum of mutations into the maximum likelihood combination of known mutational signatures and associates the mutation density with numerous regulatory elements in the genome. As a result, global patterns of somatic mutations in any query sample can be efficiently screened, thus enabling a deeper understanding of various mutagenic factors. This tool will facilitate more effective downstream analyses of cancer genome sequences to elucidate the diversity of mutational processes underlying the development and clonal evolution of cancer cells. Mutalisk is freely available at http://mutalisk.org.
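The signature-dissection step described above can be sketched as a linear decomposition: an observed mutation spectrum is modeled as a non-negative combination of known signature profiles. The 4-channel signatures below are invented toys; Mutalisk works on the standard 96 trinucleotide contexts with a maximum-likelihood fit rather than the plain least squares shown here.

```python
import numpy as np

# Toy decomposition of a mutation spectrum into known signatures
# (hypothetical 4-channel profiles; real signatures use 96 channels).
signatures = np.array([
    [0.7, 0.1, 0.1, 0.1],    # hypothetical "signature 1"
    [0.1, 0.1, 0.1, 0.7],    # hypothetical "signature 2"
]).T                          # shape: channels x signatures

true_exposure = np.array([0.75, 0.25])
spectrum = signatures @ true_exposure      # simulated observed spectrum

# Recover the exposures from the observed spectrum by least squares.
exposure, *_ = np.linalg.lstsq(signatures, spectrum, rcond=None)
```

On noise-free data the least-squares fit recovers the generating exposures exactly; real tools add non-negativity constraints and likelihood weighting for count noise.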
Zhang, Bing; Schmoyer, Denise; Kirov, Stefan; Snoddy, Jay
2004-01-01
Background Microarray and other high-throughput technologies are producing large sets of interesting genes that are difficult to analyze directly. Bioinformatics tools are needed to interpret the functional information in the gene sets. Results We have created a web-based tool for data analysis and data visualization for sets of genes called GOTree Machine (GOTM). This tool was originally intended to analyze sets of co-regulated genes identified from microarray analysis but is adaptable for use with other gene sets from other high-throughput analyses. GOTree Machine generates a GOTree, a tree-like structure to navigate the Gene Ontology Directed Acyclic Graph for input gene sets. This system provides user friendly data navigation and visualization. Statistical analysis helps users to identify the most important Gene Ontology categories for the input gene sets and suggests biological areas that warrant further study. GOTree Machine is available online at . Conclusion GOTree Machine has a broad application in functional genomic, proteomic and other high-throughput methods that generate large sets of interesting genes; its primary purpose is to help users sort for interesting patterns in gene sets. PMID:14975175
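The statistical analysis behind tools of this kind is typically a hypergeometric enrichment test per category: given how many genes in the genome carry a GO annotation, how surprising is the count observed in the input set? A minimal sketch with toy numbers (the exact test GOTM applies is not restated in this abstract) is:

```python
from math import comb

# Hypergeometric enrichment sketch: probability of observing at least
# `hits` annotated genes in a sample drawn without replacement.
def enrichment_p(population, annotated, sample, hits):
    total = comb(population, sample)
    return sum(comb(annotated, i) * comb(population - annotated, sample - i)
               for i in range(hits, min(annotated, sample) + 1)) / total

# Toy numbers: 10 genes total, 4 annotated with the category,
# 5 genes in the input set, 3 of them annotated.
p = enrichment_p(10, 4, 5, 3)
```

Small p-values flag Gene Ontology categories that are over-represented in the input gene set and therefore worth further study.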
2014-01-01
Ambulation or walking is one of the main gaits of locomotion. In terrestrial animals, it may be defined as a series of rhythmic and bilaterally coordinated movement of the limbs which creates a forward movement of the body. This applies regardless of the number of limbs—from arthropods with six or more limbs to bipedal primates. These fundamental similarities among species may explain why comparable neural systems and cellular properties have been found, thus far, to control in similar ways locomotor rhythm generation in most animal models. The aim of this article is to provide a comprehensive review of the known structural and functional features associated with central nervous system (CNS) networks that are involved in the control of ambulation and other stereotyped motor patterns—specifically Central Pattern Generators (CPGs) that produce basic rhythmic patterned outputs for locomotion, micturition, ejaculation, and defecation. Although there is compelling evidence of their existence in humans, CPGs have been most studied in reduced models including in vitro isolated preparations, genetically-engineered mice and spinal cord-transected animals. Compared with other structures of the CNS, the spinal cord is generally considered as being well-preserved phylogenetically. As such, most animal models of spinal cord-injured (SCI) should be considered as valuable tools for the development of novel pharmacological strategies aimed at modulating spinal activity and restoring corresponding functions in chronic SCI patients. PMID:24910602
Xu, Kesheng; Maidana, Jean P.; Caviedes, Mauricio; Quero, Daniel; Aguirre, Pablo; Orio, Patricio
2017-01-01
In this article, we describe and analyze the chaotic behavior of a conductance-based neuronal bursting model. This is a model with a reduced number of variables, yet it retains biophysical plausibility. Inspired by the activity of cold thermoreceptors, the model contains a persistent sodium current, a calcium-activated potassium current and a hyperpolarization-activated current (Ih) that drive a slow subthreshold oscillation. Driven by this oscillation, a fast subsystem (fast sodium and potassium currents) fires action potentials in a periodic fashion. Depending on the parameters, this model can generate a variety of firing patterns that includes bursting, regular tonic and polymodal firing. Here we show that the transitions between different firing patterns are often accompanied by a range of chaotic firing, as suggested by an irregular, non-periodic firing pattern. To confirm this, we measure the maximum Lyapunov exponent of the voltage trajectories, and the Lyapunov exponent and Lempel-Ziv complexity of the ISI time series. The four-variable slow system (without spiking) also generates chaotic behavior, and bifurcation analysis shows that this often originates from period-doubling cascades. Either with or without spikes, chaos is no longer generated when Ih is removed from the system. As the model is biologically plausible, with biophysically meaningful parameters, we propose it as a useful tool to understand chaotic dynamics in neurons. PMID:28344550
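The Lempel-Ziv complexity mentioned above quantifies how compressible a symbolized spike train is: regular firing parses into few phrases, irregular firing into many. A simple LZ78 phrase-counting proxy (the study's exact Lempel-Ziv variant may differ) can be sketched as:

```python
# LZ78 phrase-counting sketch as a complexity proxy for a symbolized
# ISI sequence: fewer phrases means a more regular (compressible) series.
def lz78_complexity(symbols):
    """Number of distinct phrases in a left-to-right LZ78 parse."""
    phrases, current = set(), ""
    for s in symbols:
        current += s
        if current not in phrases:    # new phrase: store it and restart
            phrases.add(current)
            current = ""
    return len(phrases) + (1 if current else 0)

constant = lz78_complexity("a" * 10)      # single repeated symbol -> 4 phrases
mixed = lz78_complexity("abcabcabc")      # three-symbol cycle -> 6 phrases
```

In practice the ISI series is first symbolized (e.g. binarized around its median) and the phrase count is normalized by sequence length before comparing firing regimes.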
The neuronal differentiation process involves a series of antioxidant proteins.
Oh, J-E; Karlmark Raja, K; Shin, J-H; Hengstschläger, M; Pollak, A; Lubec, G
2005-11-01
Involvement of individual antioxidant proteins (AOXP) and antioxidants in the differentiation process has already been reported. A systematic search strategy for detecting differentially regulated AOXP in neuronal differentiation, however, has not been published so far. The aim of this study was to provide an analytical tool for identifying AOXP and to generate a differentiation-related AOXP expression pattern. The undifferentiated N1E-115 neuroblastoma cell line was switched into a neuronal phenotype by DMSO treatment and used for proteomic experiments. We used two-dimensional gel electrophoresis followed by unambiguous mass spectrometric (MALDI-TOF-TOF) identification of proteins to generate a map of AOXP. Sixteen AOXP were unambiguously determined across both cell states; catalase, thioredoxin domain-containing protein 4 and a hypothetical glutaredoxin/glutathione S-transferase C terminus-containing protein were detectable in the undifferentiated cells only. Five AOXP were observed in both undifferentiated and differentiated cells, and thioredoxin, thioredoxin-like protein p19, thioredoxin reductase 1, superoxide dismutases (Mn and Cu-Zn), glutathione synthetase, glutathione S-transferase P1 and Mu1 were detected in differentiated cells exclusively. Herein a differential expression pattern is presented that reveals so far unpublished antioxidant principles involved in neuronal differentiation by a protein chemical approach, unambiguously identifying AOXP. This finding not only shows concomitant determination of AOXP but also serves as an analytical tool and forms the basis for the design of future studies addressing AOXP and differentiation per se.
Brenn, B Randall; Kim, Margaret A; Hilmas, Elora
2015-08-15
Development of an operational reporting dashboard designed to correlate data from multiple sources to help detect potential drug diversion by automated dispensing cabinet (ADC) users is described. A commercial business intelligence platform was used to create a dashboard tool for rapid detection of unusual patterns of ADC transactions by anesthesia service providers at a large pediatric hospital. By linking information from the hospital's pharmacy information management system (PIMS) and anesthesia information management system (AIMS) in an associative data model, the "narcotic reconciliation dashboard" can generate various reports to help spot outlier activity associated with ADC dispensing of controlled substances and documentation of medication waste processing. The dashboard's utility was evaluated by "back-testing" the program with historical data on an actual episode of diversion by an anesthesia provider that had not been detected through traditional methods of PIMS and AIMS data monitoring. Dashboard-generated reports on key metrics (e.g., ADC transaction counts, discrepancies in dispensed versus reconciled amounts of narcotics, PIMS-AIMS documentation mismatches) over various time frames during the period of known diversion clearly indicated the diverter's outlier status relative to other authorized ADC users. A dashboard program for correlating ADC transaction data with pharmacy and patient care data may be an effective tool for detecting patterns of ADC use that suggest drug diversion. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
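The outlier-status idea described above can be sketched with a toy metric. This is illustrative only, not the dashboard's logic: the provider names and counts are invented, and a real dashboard correlates several PIMS/AIMS metrics over multiple time frames. Here a robust modified z-score (median/MAD) flags users whose ADC transaction counts are extreme relative to peers, so a single extreme diverter does not inflate the spread estimate.

```python
import statistics

def flag_outlier_users(tx_counts, z_thresh=3.5):
    """Flag users whose transaction counts sit far above their peers.

    tx_counts: {user: count}. Uses the robust modified z-score
    0.6745 * (x - median) / MAD, a common screening heuristic.
    """
    counts = list(tx_counts.values())
    med = statistics.median(counts)
    mad = statistics.median([abs(c - med) for c in counts])
    if mad == 0:  # no spread at all: nothing to flag
        return []
    return [user for user, c in tx_counts.items()
            if 0.6745 * (c - med) / mad > z_thresh]

# Hypothetical monthly ADC dispense counts per anesthesia provider.
counts = {'prov_a': 100, 'prov_b': 110, 'prov_c': 95, 'prov_d': 105,
          'prov_e': 98, 'prov_f': 102, 'prov_g': 107, 'prov_h': 400}
```

On this made-up data only the anomalously active provider is flagged, mirroring the "outlier status relative to other authorized ADC users" finding.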
Expert2OWL: A Methodology for Pattern-Based Ontology Development.
Tahar, Kais; Xu, Jie; Herre, Heinrich
2017-01-01
The formalization of expert knowledge enables a broad spectrum of applications employing ontologies as the underlying technology. These include eLearning, the Semantic Web and expert systems. However, the manual construction of such ontologies is time-consuming and thus expensive. Moreover, experts are often unfamiliar with the syntax and semantics of formal ontology languages such as OWL and usually have no experience in developing formal ontologies. To overcome these barriers, we developed a new method and tool, called Expert2OWL, that provides efficient features to support the construction of OWL ontologies using GFO (General Formal Ontology) as a top-level ontology. This method allows a close and effective collaboration between ontologists and domain experts. Essentially, this tool integrates Excel spreadsheets as part of a pattern-based ontology development and refinement process. Expert2OWL enables us to expedite the development process and modularize the resulting ontologies. We applied this method in the field of Chinese Herbal Medicine (CHM) and used Expert2OWL to automatically generate an accurate Chinese herbology ontology (CHO). The expressivity of CHO was tested and evaluated using the ontology query languages SPARQL and DL. CHO shows promising results and can generate answers to important scientific questions, such as which Chinese herbal formulas contain which substances, which substances treat which diseases, and which are the most frequently used in CHM.
Williams, Ian; Constandinou, Timothy G.
2014-01-01
Accurate models of proprioceptive neural patterns could one day play an important role in the creation of an intuitive proprioceptive neural prosthesis for amputees. This paper looks at combining efficient implementations of biomechanical and proprioceptor models in order to generate signals that mimic human muscular proprioceptive patterns for future experimental work in prosthesis feedback. A neuro-musculoskeletal model of the upper limb with 7 degrees of freedom and 17 muscles is presented and generates real-time estimates of muscle spindle and Golgi tendon organ neural firing patterns. Unlike previous neuro-musculoskeletal models, muscle activation and excitation levels are unknowns in this application, and an inverse dynamics tool (static optimization) is integrated to estimate these variables. A proprioceptive prosthesis will need to be portable, and this is incompatible with the computationally demanding nature of standard biomechanical and proprioceptor modeling. This paper uses and proposes a number of approximations and optimizations to make real-time operation on portable hardware feasible. Finally, technical obstacles to mimicking natural feedback for an intuitive proprioceptive prosthesis, as well as issues and limitations with existing models, are identified and discussed. PMID:25009463
Using Discursis to enhance the qualitative analysis of hospital pharmacist-patient interactions
Barras, Michael A.; Angus, Daniel J.
2018-01-01
Introduction: Pharmacist-patient communication during medication counselling has been successfully investigated using Communication Accommodation Theory (CAT). Communication researchers in other healthcare professions have utilised Discursis software as an adjunct to their manual qualitative analysis processes. Discursis provides a visual, chronological representation of communication exchanges and identifies patterns of interactant engagement. Aim: The aim of this study was to describe how Discursis software was used to enhance previously conducted qualitative analysis of pharmacist-patient interactions (by visualising pharmacist-patient speech patterns, episodes of engagement, and identifying CAT strategies employed by pharmacists within these episodes). Methods: Visual plots from 48 transcribed audio recordings of pharmacist-patient exchanges were generated by Discursis. Representative plots were selected to show moderate-high and low-level speaker engagement. Details of engagement were investigated for pharmacist application of CAT strategies (approximation, interpretability, discourse management, emotional expression, and interpersonal control). Results: Discursis plots allowed for identification of distinct patterns occurring within pharmacist-patient exchanges. Moderate-high pharmacist-patient engagement was characterised by multiple off-diagonal squares, while alternating single-coloured squares depicted low engagement. Engagement episodes were associated with multiple CAT strategies such as discourse management (open-ended questions). Patterns reflecting pharmacist or patient speaker dominance were dependent on clinical setting. Discussion and conclusions: Discursis analysis of pharmacist-patient interactions, a novel application of the technology in health communication, was found to be an effective visualisation tool to pinpoint episodes for CAT analysis.
Discursis has numerous practical and theoretical applications for future health communication research and training. Researchers can use the software to support qualitative analysis where large data sets can be quickly reviewed to identify key areas for concentrated analysis. Because Discursis plots are easily generated from transcripts of audio recordings, they are well suited as teaching tools for both students and practitioners to assess and develop their communication skills. PMID:29787568
Igloo-Plot: a tool for visualization of multidimensional datasets.
Kuntal, Bhusan K; Ghosh, Tarini Shankar; Mande, Sharmila S
2014-01-01
Advances in science and technology have resulted in an exponential growth of multivariate (or multi-dimensional) datasets which are being generated from various research areas especially in the domain of biological sciences. Visualization and analysis of such data (with the objective of uncovering the hidden patterns therein) is an important and challenging task. We present a tool, called Igloo-Plot, for efficient visualization of multidimensional datasets. The tool addresses some of the key limitations of contemporary multivariate visualization and analysis tools. The visualization layout, not only facilitates an easy identification of clusters of data-points having similar feature compositions, but also the 'marker features' specific to each of these clusters. The applicability of the various functionalities implemented herein is demonstrated using several well studied multi-dimensional datasets. Igloo-Plot is expected to be a valuable resource for researchers working in multivariate data mining studies. Igloo-Plot is available for download from: http://metagenomics.atc.tcs.com/IglooPlot/. Copyright © 2014 Elsevier Inc. All rights reserved.
Cervellin, Gianfranco; Comelli, Ivan; Lippi, Giuseppe
2017-09-01
Internet-derived information has recently been recognized as a valuable tool for epidemiological investigation. Google Trends, a Google Inc. portal, generates data on geographical and temporal patterns according to specified keywords. The aim of this study was to compare the reliability of Google Trends in different clinical settings, both for common diseases with lower media coverage and for less common diseases attracting major media coverage. We carried out a search in Google Trends using the keywords "renal colic", "epistaxis", and "mushroom poisoning", selected on the basis of available and reliable epidemiological data. Besides this search, we carried out a second search for three clinical conditions (i.e., "meningitis", "Legionella pneumophila pneumonia", and "Ebola fever"), which recently received major focus from the Italian media. In our analysis, no correlation was found between data captured from Google Trends and the epidemiology of renal colics, epistaxis and mushroom poisoning. Only when searching for the term "mushroom" alone did Google Trends generate a seasonal pattern that almost overlaps with the epidemiological profile, but this was probably due mostly to searches related to harvesting and cooking rather than to poisoning. The Google Trends data also failed to reflect the geographical and temporal patterns of disease for meningitis, Legionella pneumophila pneumonia and Ebola fever. The results of our study confirm that Google Trends has modest reliability for defining the epidemiology of relatively common diseases with minor media coverage, or of relatively rare diseases with greater media attention. Overall, Google Trends seems to be more influenced by media clamor than by true epidemiological burden. Copyright © 2017 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.
Registration performance on EUV masks using high-resolution registration metrology
NASA Astrophysics Data System (ADS)
Steinert, Steffen; Solowan, Hans-Michael; Park, Jinback; Han, Hakseung; Beyer, Dirk; Scherübl, Thomas
2016-10-01
Next-generation lithography based on EUV continues to move toward high-volume manufacturing. Given the technical challenges and the throughput concerns, a hybrid approach with 193 nm immersion lithography is expected, at least initially. Due to the increasing complexity at smaller nodes, a multitude of different masks, both DUV (193 nm) and EUV (13.5 nm) reticles, will then be required in the lithography process flow. The individual registration of each mask and the resulting overlay error are of crucial importance in order to ensure proper functionality of the chips. While registration and overlay metrology on DUV masks has been standard for decades, it has yet to be demonstrated on EUV masks. Past generations of mask registration tools were not necessarily limited in their tool stability, but in their resolution capabilities. The scope of this work is an image placement investigation of high-end EUV masks together with a registration and resolution performance qualification. For this we employ a new-generation registration metrology system embedded in a production environment for full-spec EUV masks. This paper presents excellent registration performance not only on standard overlay markers but also on more sophisticated e-beam calibration patterns.
Bound and free waves in non-collinear second harmonic generation.
Larciprete, M C; Bovino, F A; Belardini, A; Sibilia, C; Bertolotti, M
2009-09-14
We analyze the relationship between the bound and the free waves in the noncollinear SHG scheme, along with the vectorial conservation law for the different components arising when two pump beams impinge on the sample at two different incidence angles. The generated power is systematically investigated by varying the polarization state of both fundamental beams, while absorption is included via the Herman and Hayden correction terms. The theoretical simulations, obtained for samples that are a few coherence lengths thick, show that the resulting polarization mapping is a useful tool to reveal the interference between bound and free waves, as well as the effect of absorption on the interference pattern.
Liao, Wen-Te; Pálffy, Adriana
2014-02-07
A setup for generating the special superposition of a simultaneously forward- and backward-propagating collective excitation in a nuclear sample is studied. We show that by actively manipulating the scattering channels of single x-ray quanta with the help of a normal incidence x-ray mirror, a nuclear polariton which propagates in two opposite directions can be generated. The two counterpropagating polariton branches are entangled by a single x-ray photon. The quantum nature of the nuclear excitation entanglement gives rise to a subangstrom-wavelength standing wave excitation pattern that can be used as a flexible tool to probe matter dynamically on the subatomic scale.
Extended generalized recurrence plot quantification of complex circular patterns
NASA Astrophysics Data System (ADS)
Riedl, Maik; Marwan, Norbert; Kurths, Jürgen
2017-03-01
The generalized recurrence plot is a modern tool for the quantification of complex spatial patterns. Its applications span the analysis of trabecular bone structures, Turing patterns, turbulent spatial plankton patterns, and fractals. Determinism is a central measure in this framework, quantifying the level of regularity of spatial structures. We show by basic examples of fully regular patterns of different symmetries that this measure underestimates the orderliness of circular patterns resulting from rotational symmetries. We overcome this crucial problem by checking additional structural elements of the generalized recurrence plot, which is demonstrated with the examples. Furthermore, we show the potential of the extended measure of determinism by applying it to more irregular circular patterns which are generated by the complex Ginzburg-Landau equation and which can often be observed in real spatially extended dynamical systems. Thus, we are able to reconstruct the main separations of the system's parameter space by analyzing single snapshots of the real part only, in contrast to the use of the original quantity. This ability of the proposed method also promises an improved description of other systems with complicated spatio-temporal dynamics typically occurring in fluid dynamics, climatology, biology, ecology, social sciences, etc.
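For readers unfamiliar with the determinism measure, the time-series version, from which the generalized spatial variant derives, can be sketched as follows. This is a minimal illustration, not the authors' extended method: determinism (DET) is the fraction of recurrence points that lie on diagonal line structures of at least a minimum length.

```python
import random

def recurrence_matrix(series, eps):
    """R[i][j] is True when states i and j are closer than eps."""
    n = len(series)
    return [[abs(series[i] - series[j]) <= eps for j in range(n)]
            for i in range(n)]

def determinism(R, lmin=2):
    """Fraction of recurrence points lying on diagonals of length >= lmin."""
    n = len(R)
    total = on_lines = 0
    for k in range(-(n - 1), n):          # walk every diagonal
        cells = [(i, i + k) for i in range(max(0, -k), min(n, n - k))]
        run = 0
        for idx, (i, j) in enumerate(cells):
            if R[i][j]:
                total += 1
                run += 1
            if not R[i][j] or idx == len(cells) - 1:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / total if total else 0.0

# A strictly periodic signal recurs along long diagonals (DET = 1),
# while uncorrelated noise produces many isolated recurrence points.
det_periodic = determinism(recurrence_matrix([0, 1] * 10, eps=0.1))
rng = random.Random(1)
det_noise = determinism(recurrence_matrix(
    [rng.random() for _ in range(30)], eps=0.05))
```

The generalized (spatial) version replaces the time index with spatial coordinates, which is where the circular-pattern bias discussed above arises.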
Using Maximum Entropy to Find Patterns in Genomes
NASA Astrophysics Data System (ADS)
Liu, Sophia; Hockenberry, Adam; Lancichinetti, Andrea; Jewett, Michael; Amaral, Luis
The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. To accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. This approach can also easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even dinucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs, which will lead to a better understanding of biological processes. National Institute of General Medical Science, Northwestern University Presidential Fellowship, National Science Foundation, David and Lucile Packard Foundation, Camille Dreyfus Teacher Scholar Award.
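The maximum-entropy construction described above can be sketched in a few lines. This is not the authors' tool: the codon table is truncated to four amino acids for illustration, and the approach shown, sampling synonymous codons with weights exp(λ·GC) where the Lagrange multiplier λ is tuned to hit the target GC fraction, is the standard max-ent form under a mean-GC constraint.

```python
import math
import random

# Illustrative synonymous-codon table for a few amino acids
# (a real tool would use the full standard genetic code).
CODONS = {
    'F': ['TTT', 'TTC'],
    'L': ['TTA', 'TTG', 'CTT', 'CTC', 'CTA', 'CTG'],
    'G': ['GGT', 'GGC', 'GGA', 'GGG'],
    'K': ['AAA', 'AAG'],
}

def gc(codon):
    return sum(base in 'GC' for base in codon)

def sample_sequence(protein, lam, rng):
    """Sample codons independently with max-ent weights exp(lam * GC)."""
    seq = []
    for aa in protein:
        opts = CODONS[aa]
        weights = [math.exp(lam * gc(c)) for c in opts]
        seq.append(rng.choices(opts, weights=weights)[0])
    return ''.join(seq)

def expected_gc(protein, lam):
    """Expected GC fraction under the max-ent codon distribution."""
    total, length = 0.0, 3 * len(protein)
    for aa in protein:
        opts = CODONS[aa]
        w = [math.exp(lam * gc(c)) for c in opts]
        z = sum(w)
        total += sum(wi * gc(c) for wi, c in zip(w, opts)) / z
    return total / length

def fit_lambda(protein, target_gc, lo=-10.0, hi=10.0):
    """Bisect for the Lagrange multiplier matching the target GC fraction
    (expected GC is monotone increasing in lambda)."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if expected_gc(protein, mid) < target_gc:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

Because codons are drawn independently with exponential-family weights, the resulting distribution is the maximum-entropy one consistent with the fixed amino acid sequence and the mean GC constraint.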
Spatial pattern recognition of seismic events in South West Colombia
NASA Astrophysics Data System (ADS)
Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber
2013-09-01
Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well-founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data of the South West Colombian seismic database. In this research, clustering tendency analysis validates whether the seismic database possesses a clustering structure. A non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to the initial positions of centroids, we propose a methodology to initialize centroids that generates stable partitions with respect to centroid initialization. As a result of this work, a public software tool provides the user with the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South West Colombia. The clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretations of seismic activity in South West Colombia.
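The clustering step described above can be sketched with a plain fuzzy c-means loop. This is a generic textbook version, not the published tool: the deterministic farthest-point seeding merely stands in for the paper's centroid-initialization methodology, and the event coordinates below are invented.

```python
import math

def init_centroids(points, k):
    """Deterministic farthest-point seeding, so repeated runs give the
    same partition (a simple stand-in for the paper's scheme)."""
    centroids = [points[0]]
    while len(centroids) < k:
        far = max(points,
                  key=lambda p: min(math.dist(p, c) for c in centroids))
        centroids.append(far)
    return centroids

def fuzzy_cmeans(points, k, m=2.0, iters=50):
    """Plain fuzzy c-means on coordinate tuples; returns centroids and
    the membership matrix u (each row sums to 1)."""
    dims = len(points[0])
    centroids = init_centroids(points, k)
    n = len(points)
    u = [[0.0] * k for _ in range(n)]
    for _ in range(iters):
        # update memberships from distances to current centroids
        for i, p in enumerate(points):
            d = [max(math.dist(p, c), 1e-12) for c in centroids]
            for j in range(k):
                u[i][j] = 1.0 / sum((d[j] / dl) ** (2 / (m - 1)) for dl in d)
        # update centroids as membership-weighted means
        for j in range(k):
            w = [u[i][j] ** m for i in range(n)]
            z = sum(w)
            centroids[j] = tuple(
                sum(w[i] * points[i][dim] for i in range(n)) / z
                for dim in range(dims))
    return centroids, u

# Two artificial event clusters (hypothetical epicentre coordinates).
events = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3), (0.3, 0.2),
          (5.0, 5.0), (5.2, 5.1), (5.1, 5.3), (5.3, 5.2)]
centroids, u = fuzzy_cmeans(events, k=2)
```

With a deterministic initialization, repeated runs return the same partition, which is the kind of stability the paper's methodology targets.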
Terrestrial ecosystems: national inventory of vegetation and land use
Gergely, Kevin J.; McKerrow, Alexa
2013-11-12
The Gap Analysis Program (GAP)/Landscape Fire and Resource Management Planning Tools (LANDFIRE) National Terrestrial Ecosystems Data represents detailed data on the vegetation and land-use patterns of the United States, including Alaska, Hawaii, and Puerto Rico. This national dataset combines detailed land cover data generated by the GAP with LANDFIRE data (http://www.landfire.gov/). LANDFIRE is an interagency vegetation, fire, and fuel characteristics mapping program sponsored by the U.S. Department of the Interior (DOI) and the U.S. Department of Agriculture Forest Service.
Jones, Stuart E.; Shade, Ashley L.; McMahon, Katherine D.; Kent, Angela D.
2007-01-01
Two primer sets for automated ribosomal intergenic spacer analysis (ARISA) were used to assess the bacterial community composition (BCC) in Lake Mendota, Wisconsin, over 3 years. Correspondence analysis revealed differences in community profiles generated by different primer sets, but overall ecological patterns were conserved in each case. ARISA is a powerful tool for evaluating BCC change through space and time, regardless of the specific primer set used. PMID:17122397
VLSI Design Tools, Reference Manual, Release 2.0.
1984-08-01
2.3 ITACV: Library of C routines … a layout … tiling. 2.4 QUILT: … 2.5 TILE: … A "patterns" package was added so that complex and repetitive digital waveforms could be generated far more easily. The recently written program MTP (Multiple…) … circuit model to estimate timing delays through digital circuits. It also has a mode that allows it to be used as a switch (gate) level simulator.
2011-06-01
… and interpret patterns of social ties among actors, either multitudinous or relatively few (de Nooy, Mrvar, & Batagelj, 2005, p. 5). An aspect of … millions of vertices [aka nodes] (Batagelj & Mrvar, 2003). With its ability to process large networks, it is considered an excellent social-network … Vladimir Batagelj and Andrej Mrvar's program package Pajek: http://vlado.fmf.uni-lj.si/pub/networks/pajek/
Fractal Landscape Algorithms for Environmental Simulations
NASA Astrophysics Data System (ADS)
Mao, H.; Moran, S.
2014-12-01
Natural science and geographical research are now able to take advantage of environmental simulations that more accurately test experimental hypotheses, resulting in deeper understanding. Experiments affected by the natural environment can benefit from 3D landscape simulations capable of simulating a variety of terrains and environmental phenomena. Such simulations can employ random terrain generation algorithms that dynamically simulate environments to test specific models against a variety of factors. Through the use of noise functions such as Perlin noise, Simplex noise, and the diamond-square algorithm, computers can generate simulations that model a variety of landscapes and ecosystems. This study shows how these algorithms work together to create realistic landscapes. By seeding values into the diamond-square algorithm, one can control the shape of the landscape. Perlin noise and Simplex noise are also used to simulate moisture and temperature. The smooth gradient created by coherent noise allows more realistic landscapes to be simulated. Terrain generation algorithms can be used in environmental studies and physics simulations. Potential studies that would benefit from such simulations include the geophysical impact of flash floods or drought on a particular region and the regional impact of global warming and rising sea levels on low-lying areas. Furthermore, terrain generation algorithms also serve as aesthetic tools to display landscapes (Google Earth) and to simulate planetary landscapes. Hence, they can be used as tools to assist science education. The algorithms used to generate these natural phenomena provide scientists with a different approach to analyzing our world. The random algorithms used in terrain generation not only generate the terrains themselves, but are also capable of simulating weather patterns.
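The corner-seeding idea mentioned above can be made concrete with a sketch of the classic diamond-square algorithm. This is a generic implementation, not code from the study; the roughness parameterization is one common choice.

```python
import random

def diamond_square(n, roughness=1.0, seed=0):
    """Generate a (2**n + 1) x (2**n + 1) heightmap with diamond-square.

    The four seeded corner values control the overall shape; `roughness`
    controls how fast the random displacement shrinks at each scale.
    """
    rng = random.Random(seed)
    size = 2 ** n + 1
    h = [[0.0] * size for _ in range(size)]
    for r in (0, size - 1):               # seed the four corners
        for c in (0, size - 1):
            h[r][c] = rng.uniform(-1.0, 1.0)
    step, scale = size - 1, roughness
    while step > 1:
        half = step // 2
        # diamond step: centers of squares get the corner average + noise
        for r in range(half, size, step):
            for c in range(half, size, step):
                avg = (h[r - half][c - half] + h[r - half][c + half] +
                       h[r + half][c - half] + h[r + half][c + half]) / 4
                h[r][c] = avg + rng.uniform(-scale, scale)
        # square step: edge midpoints get the neighbor average + noise
        for r in range(0, size, half):
            for c in range((r + half) % step, size, step):
                nbrs = [h[r + dr][c + dc]
                        for dr, dc in ((-half, 0), (half, 0),
                                       (0, -half), (0, half))
                        if 0 <= r + dr < size and 0 <= c + dc < size]
                h[r][c] = sum(nbrs) / len(nbrs) + rng.uniform(-scale, scale)
        step, scale = half, scale * 2 ** -roughness
    return h
```

Seeding the corners (or fixing the RNG seed) makes the landscape reproducible, which is what lets experimenters control the overall shape while the noise fills in realistic detail.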
Developing quartz wafer mold manufacturing process for patterned media
NASA Astrophysics Data System (ADS)
Chiba, Tsuyoshi; Fukuda, Masaharu; Ishikawa, Mikio; Itoh, Kimio; Kurihara, Masaaki; Hoga, Morihisa
2009-04-01
Recently, patterned media have gained attention as a possible candidate for use in the next generation of hard disk drives (HDD). Feature sizes on media are predicted to be 20-25 nm half pitch (hp) for discrete-track media (DTM) in 2010. One method of fabricating such a fine pattern is nanoimprinting. The imprint mold for the patterned media is created from a 150-millimeter, rounded, quartz wafer. The purpose of the process introduced here was to construct a quartz wafer mold and to fabricate line-and-space (LS) patterns at 24 nm hp for DTM. Additionally, we attempted to achieve a dense hole (HOLE) pattern at 12.5 nm hp for bit-patterned media (BPM) for use in 2012. The manufacturing process of molds for patterned media is almost the same as that for semiconductors, with the exception of the dry-etching process. A 150-millimeter quartz wafer was etched on a special tray made by carving a 6025 substrate, using the photomask tool. We also optimized the quartz etching conditions. As a result, 24 nm hp LS and HOLE patterns were manufactured on the quartz wafer. In conclusion, the quartz wafer mold manufacturing process was established. It is suggested that the etching conditions be further optimized to achieve a higher resolution of HOLE patterns.
Binary Gene Expression Patterning of the Molt Cycle: The Case of Chitin Metabolism
Abehsera, Shai; Glazer, Lilah; Tynyakov, Jenny; Plaschkes, Inbar; Chalifa-Caspi, Vered; Khalaila, Isam; Aflalo, Eliahu D.; Sagi, Amir
2015-01-01
In crustaceans, like all arthropods, growth is accompanied by a molting cycle. This cycle comprises major physiological events in which mineralized chitinous structures are built and degraded. These events are in turn governed by genes whose patterns of expression are presumably linked to the molting cycle. To study these genes we performed next generation sequencing and constructed a molt-related transcriptomic library from two exoskeletal-forming tissues of the crayfish Cherax quadricarinatus, namely the gastrolith and the mandible cuticle-forming epithelium. To simplify the study of such a complex process as molting, a novel approach, binary patterning of gene expression, was employed. This approach revealed that key genes involved in the synthesis and breakdown of chitin exhibit a molt-related pattern in the gastrolith-forming epithelium. On the other hand, the same genes in the mandible cuticle-forming epithelium showed a molt-independent pattern of expression. Genes related to the metabolism of glucosamine-6-phosphate, a chitin precursor synthesized from simple sugars, showed a molt-related pattern of expression in both tissues. The binary patterning approach unfolds typical patterns of gene expression during the molt cycle of a crustacean. The use of such a simplifying integrative tool for assessing gene patterning seems appropriate for the study of complex biological processes. PMID:25919476
Ancient DNA sequence revealed by error-correcting codes.
Brandão, Marcelo M; Spoladore, Larissa; Faria, Luzinete C B; Rocha, Andréa S L; Silva-Filho, Marcio C; Palazzo, Reginaldo
2015-07-10
A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code.
Papargyropoulou, Effie; Wright, Nigel; Lozano, Rodrigo; Steinberger, Julia; Padfield, Rory; Ujang, Zaini
2016-03-01
Food waste has significant detrimental economic, environmental and social impacts. The magnitude and complexity of the global food waste problem have brought it to the forefront of the environmental agenda; however, there has been little research on the patterns and drivers of food waste generation, especially outside the household. This is partially due to weaknesses in the methodological approaches used to understand such a complex problem. This paper proposes a novel conceptual framework to identify and explain the patterns and drivers of food waste generation in the hospitality sector, with the aim of identifying food waste prevention measures. The conceptual framework integrates data collection and analysis methods from ethnography and grounded theory, complemented with concepts and tools from industrial ecology for the analysis of quantitative data. A case study of food waste generation at a hotel restaurant in Malaysia is used as an example to illustrate how this conceptual framework can be applied. The conceptual framework links the biophysical and economic flows of food provisioning and waste generation with the social and cultural practices associated with food preparation and consumption. The case study demonstrates that food waste is intrinsically linked to the way we provision and consume food, and to the material and socio-cultural context of food consumption and food waste generation. Food provisioning, food consumption and food waste generation should be studied together in order to fully understand how, where and, most importantly, why food waste is generated. This understanding will then make it possible to draw up detailed, case-specific food waste prevention plans addressing the material and socio-economic aspects of food waste generation. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Mohedano, Rubén; Chaves, Julio; Hernández, Maikel
2016-04-01
In many illumination problems, the required beam pattern and/or certain geometrical constraints lead to very asymmetric design conditions. These asymmetries have been solved in the past by means of arrangements of rotationally symmetric or linear lamps aimed in different directions, whose patterns overlap to provide the asymmetric prescriptions, or by splitting one single lamp into several sections, each one providing a part of the pattern. The development of new design methods yielding smooth, continuous free-form optical surfaces to solve these challenging design problems, combined with the proper CAD modeling tools and the development of multiple-axis diamond turning machines, gives birth to a new generation of optics. These are able to offer high performance and other advanced features, such as efficiency, compactness, or aesthetic advantages, and can be manufactured at low cost by injection molding. This paper presents two examples of devices with free-form optical surfaces: a camera flash and a car headlamp.
Dynamic patterns in a supported lipid bilayer driven by standing surface acoustic waves.
Hennig, Martin; Neumann, Jürgen; Wixforth, Achim; Rädler, Joachim O; Schneider, Matthias F
2009-11-07
In the past decades, supported lipid bilayers (SLBs) have been an important tool for studying the physical properties of biological membranes and cells. So far, controlled manipulation of SLBs has been very limited. Here we present a new technology to create lateral patterns in lipid membranes that are controllable in both space and time. Surface acoustic waves (SAWs) are used to generate lateral standing waves on a piezoelectric substrate, which create local "traps" in the lipid bilayer and lead to a lateral modulation in lipid concentration. We demonstrate that pattern formation is reversible and does not affect the integrity of the lipid bilayer, as shown by extracting the diffusion constant of fluid membranes. The described method could possibly be used to design switchable interfaces for the lateral transport and organization of membrane-bound macromolecules, to create dynamic bioarrays and to control biofilm formation.
The neurogenetic frontier--lessons from misbehaving zebrafish.
Burgess, Harold A; Granato, Michael
2008-11-01
One of the central questions in neuroscience is how refined patterns of connectivity in the brain generate and monitor behavior. Genetic mutations can influence neural circuits by disrupting the differentiation or maintenance of component neuronal cells or by altering functional patterns of nervous system connectivity. Mutagenesis screens therefore have the potential not only to reveal the molecular underpinnings of brain development and function but also to illuminate the cellular basis of behavior. Practical considerations make the zebrafish an organism of choice for undertaking forward genetic analysis of behavior. The powerful array of experimental tools at the disposal of the zebrafish researcher makes it possible to link molecular function to the neuronal properties that underlie behavior. This review focuses on specific challenges to isolating and analyzing behavioral mutants in zebrafish.
Sinkó, József; Kákonyi, Róbert; Rees, Eric; Metcalf, Daniel; Knight, Alex E.; Kaminski, Clemens F.; Szabó, Gábor; Erdélyi, Miklós
2014-01-01
Localization-based super-resolution microscopy image quality depends on several factors: dye choice and labeling strategy, microscope quality, user-defined parameters such as frame rate and frame count, and the image-processing algorithm. Experimental optimization of these parameters can be time-consuming and expensive, so we present TestSTORM, a simulator that can be used to optimize these steps. TestSTORM users can select from among four different structures with specific patterns, dye, and acquisition parameters. Example results are shown, and the results for the vesicle pattern are compared with experimental data. Moreover, image stacks can be generated for further evaluation using localization algorithms, offering a tool for further software development. PMID:24688813
NASA Astrophysics Data System (ADS)
Silva R., Santiago S.; Giraldo, Diana L.; Romero, Eduardo
2017-11-01
Structural magnetic resonance (MR) brain images should provide quantitative information about the stage and progression of Alzheimer's disease. However, the use of MRI is limited, in practice reduced to corroborating a diagnosis already made with neuropsychological tools. This paper presents an automated strategy for extracting relevant anatomic patterns related to the conversion from mild cognitive impairment (MCI) to Alzheimer's disease (AD) using T1-weighted MR images. The process starts by representing each of the possible classes with models generated from a linear combination of volumes. The difference between models establishes the regions where relevant patterns might be located. The approach searches for patterns in a space of brain sulci, here approximated by the most representative gradients found in regions of interest defined by the difference between the linear models. This hypothesis is assessed by training a conventional SVM model with the relevant patterns found, under a leave-one-out scheme. The resulting AUC was 0.86 for the group of women and 0.61 for the group of men.
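The abstract's evaluation couples a leave-one-out scheme with an AUC readout. A minimal sketch of that evaluation loop, with a nearest-centroid scorer standing in for the SVM (the toy data and the scorer are illustrative assumptions, not from the paper):

```python
# Leave-one-out evaluation with a rank-based AUC.
# A nearest-centroid classifier stands in for the paper's SVM.

def loo_scores(X, y):
    """Score each sample using a model trained on all other samples."""
    scores = []
    for i in range(len(X)):
        Xtr = [x for j, x in enumerate(X) if j != i]
        ytr = [t for j, t in enumerate(y) if j != i]
        def centroid(label):
            pts = [x for x, t in zip(Xtr, ytr) if t == label]
            return [sum(c) / len(pts) for c in zip(*pts)]
        c0, c1 = centroid(0), centroid(1)
        d = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
        # higher score -> more likely class 1 (closer to c1 than to c0)
        scores.append(d(X[i], c0) - d(X[i], c1))
    return scores

def auc(scores, y):
    """AUC as the Mann-Whitney probability that a positive outranks a negative."""
    pos = [s for s, t in zip(scores, y) if t == 1]
    neg = [s for s, t in zip(scores, y) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

On well-separated toy data the loop yields an AUC of 1.0; on real MCI-to-AD features the score function would be the trained SVM's decision value.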
Using telephony data to facilitate discovery of clinical workflows.
Rucker, Donald W
2017-04-19
Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The aims were to build an analytic framework mapping repetitive, high-volume telephone call patterns in a large medical center to their associated clinical units, using an enterprise unified-communications server log file, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified-communications server were parsed to cross-correlate telephone call patterns and to map associated phone numbers to a cost-center dictionary. Hashed data structures were built to allow construction of edge and node files representing high-volume call patterns for display with an open-source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed, with a total duration of 23,186 hours. Approximately half of all calling/called number pairs had an average call duration under 60 seconds; among these, the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or redesign targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
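The described pipeline (parse call detail records, map phone numbers to cost centers, aggregate into an edge list for a graph tool) can be sketched as follows; the record fields, the cost-center dictionary, and the `min_calls` threshold are hypothetical, not from the paper:

```python
# Sketch: aggregate call detail records (CDRs) into a cost-center edge list
# suitable for an open-source graph network tool.
from collections import Counter

def build_edges(cdrs, cost_center, min_calls=2):
    """cdrs: iterable of (caller, callee, duration_seconds) tuples.
    cost_center: dict mapping a phone number to its clinical cost center.
    Returns (source, target, call_count, mean_duration) edges."""
    counts, durations = Counter(), Counter()
    for caller, callee, dur in cdrs:
        a = cost_center.get(caller, "unknown")
        b = cost_center.get(callee, "unknown")
        counts[(a, b)] += 1
        durations[(a, b)] += dur
    # keep only repetitive, high-volume calling/called pairs
    return [(a, b, n, durations[(a, b)] / n)
            for (a, b), n in counts.items() if n >= min_calls]
```

The resulting tuples map directly onto the edge files that graph visualization tools such as Gephi consume.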
Classification and printability of EUV mask defects from SEM images
NASA Astrophysics Data System (ADS)
Cho, Wonil; Price, Daniel; Morgan, Paul A.; Rost, Daniel; Satake, Masaki; Tolani, Vikram L.
2017-10-01
EUV lithography is starting to show more promise for patterning some critical layers at the 5nm technology node and beyond. However, there are still many key technical obstacles to overcome before bringing EUV lithography into high-volume manufacturing (HVM). One of the greatest is manufacturing defect-free masks. For pattern defect inspection in the mask shop, cutting-edge 193nm optical inspection tools have been used so far, owing to the lack of any e-beam mask inspection (EBMI) or EUV actinic pattern inspection (API) tools. The main issue with current 193nm inspection tools is their limited resolution at the mask dimensions targeted for EUV patterning. The theoretical resolution limit for 193nm mask inspection tools is about 60nm half-pitch on masks, which means that main feature sizes on EUV masks will be well beyond the practical resolution of these tools. Nevertheless, 193nm inspection tools with various illumination conditions that maximize defect sensitivity and/or main-pattern modulation are being explored for initial EUV defect detection. Because of the generally low signal-to-noise ratio of 193nm inspection imaging at EUV patterning dimensions, these inspections often yield hundreds to thousands of defects, which then need to be accurately reviewed and dispositioned. Manually reviewing each defect is difficult given the poor resolution, and the lack of a reliable aerial dispositioning system makes printability disposition very challenging. In this paper, we present the use of SEM images of EUV masks for higher-resolution review and disposition of defects. In this approach, most of the defects detected by the 193nm inspection tools are first imaged on a mask SEM tool.
These images, together with the corresponding post-OPC design clips, are provided to KLA-Tencor's Reticle Decision Center (RDC) platform, which provides ADC (Automated Defect Classification) and S2A (SEM-to-Aerial printability) analysis of every defect. First, a defect-free or reference mask SEM is rendered from the post-OPC design, and the defective signature is detected from the defect-reference difference image. These signatures help assess the true nature of the defect as evident in e-beam imaging; for example, excess or missing absorber, line-edge roughness, or contamination. Next, defect and reference contours are extracted from the grayscale SEM images and fed into the simulation engine with an EUV scanner model to generate the corresponding EUV defect and reference aerial images. These are then analyzed for printability and dispositioned using an Aerial Image Analyzer (AIA) application, which automatically measures and determines the amount of CD error. Thus, by integrating the EUV ADC and S2A applications, every detected defect is characterized for its type and printability, which is essential not only for determining which defects to repair but also for monitoring the performance of EUV mask process tools. The accuracy of the S2A print modeling has been verified against other commercially available simulators and will also be verified against actual wafer print results. With EUV lithography progressing toward volume manufacturing at the 5nm technology node, and with EBMI inspectors likely on the horizon, the EUV ADC-S2A system will continue to serve an essential role in dispositioning defects from e-beam imaging.
NASA Astrophysics Data System (ADS)
Zhang, X.; Srinivasan, R.
2008-12-01
In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD precipitation estimates using raingauge data. The tool can automatically read in raingauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD using raingauge data, and output spatial precipitation maps for distributed hydrologic models. The geostatistical methods incorporated in this tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a new geostatistical method developed by Li et al. (2008). The tool was applied in two test watersheds at hourly and daily temporal scales. Preliminary cross-validation results show that incorporating raingauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy. Using different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT). Animated video was generated to vividly illustrate the effect of different precipitation inputs on distributed hydrologic modeling. Currently, this GIS tool is implemented as an extension of SWAT, which is used as a water quantity and quality modeling tool by the USDA and EPA. The flexible, module-based design also makes the tool easy to adapt to other hydrologic models for hydrological modeling and water resources management.
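Simple Kriging with varying local means, one of the methods listed, can be sketched as kriging the gauge-minus-NEXRAD residuals and adding the kriged residual back to the NEXRAD value at the target cell; the exponential covariance model and its parameters below are illustrative assumptions, not the tool's actual settings:

```python
# Sketch: Simple Kriging with varying local means (SKlm), using the
# collocated NEXRAD estimate as the spatially varying mean.
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for the kriging system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def sk_local_means(gauges, nexrad_at_gauges, target_nexrad,
                   coords, target, sill=1.0, rng=10.0):
    # residual = gauge observation - collocated NEXRAD (the varying local mean)
    resid = [g - m for g, m in zip(gauges, nexrad_at_gauges)]
    cov = lambda a, b: sill * math.exp(-math.dist(a, b) / rng)  # exponential model
    A = [[cov(p, q) for q in coords] for p in coords]
    b = [cov(p, target) for p in coords]
    w = solve(A, b)  # simple kriging weights (no unbiasedness constraint)
    return target_nexrad + sum(wi * ri for wi, ri in zip(w, resid))
```

At a gauge location the estimate reproduces the gauge value exactly, which is the exactness property cross-validation exploits.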
WholePathwayScope: a comprehensive pathway-based analysis tool for high-throughput data
Yi, Ming; Horton, Jay D; Cohen, Jonathan C; Hobbs, Helen H; Stephens, Robert M
2006-01-01
Background Analysis of High Throughput (HTP) data such as microarray and proteomics data has provided a powerful methodology to study patterns of gene regulation at genome scale. A major unresolved problem in the post-genomic era is to assemble the large amounts of data generated into a meaningful biological context. We have developed a comprehensive software tool, WholePathwayScope (WPS), for deriving biological insights from analysis of HTP data. Results WPS extracts gene lists with shared biological themes through color cue templates. WPS statistically evaluates global functional category enrichment of gene lists and pathway-level pattern enrichment of data. WPS incorporates well-known biological pathways from KEGG (Kyoto Encyclopedia of Genes and Genomes) and Biocarta, GO (Gene Ontology) terms as well as user-defined pathways or relevant gene clusters or groups, and explores gene-term relationships within the derived gene-term association networks (GTANs). WPS simultaneously compares multiple datasets within biological contexts, either as pathways or as association networks. WPS also integrates the Genetic Association Database and the Partial MedGene Database for disease-association information. We have used this program to analyze and compare microarray and proteomics datasets derived from a variety of biological systems. Application examples demonstrated the capacity of WPS to significantly facilitate the analysis of HTP data for integrative discovery. Conclusion This tool represents a pathway-based platform for discovery integration to maximize analysis power. The tool is freely available at . PMID:16423281
Martínez-Abadías, Neus; Mateu, Roger; Niksic, Martina; Russo, Lucia; Sharpe, James
2016-01-01
How the genotype translates into the phenotype through development is critical to fully understanding the evolution of phenotypes. We propose a novel approach to directly assess how changes in gene expression patterns are associated with changes in morphology, using the limb as a case example. Our method combines molecular biology techniques, such as whole-mount in situ hybridization, with image and shape analysis, extending the use of geometric morphometrics to the analysis of nonanatomical shapes, such as gene expression domains. Elliptical Fourier and Procrustes-based semilandmark analyses were used to analyze the variation and covariation patterns of limb bud shape with the expression patterns of two genes relevant to limb morphogenesis, Hoxa11 and Hoxa13. We devised a multiple-thresholding method to semiautomatically segment gene domains at several expression levels in large samples of limb buds from C57Bl6 mouse embryos between 10 and 12 days postfertilization. Besides providing an accurate phenotyping tool to quantify the spatiotemporal dynamics of gene expression patterns within developing structures, our morphometric analyses revealed high, non-random, gene-specific variation undergoing canalization during limb development. Our results demonstrate that Hoxa11 and Hoxa13, despite being paralogs with analogous functions in limb patterning, show clearly distinct dynamic patterns, both in shape and size, and are associated differently with limb bud shape. The correspondence between our results and well-established molecular processes underlying limb development confirms that this morphometric approach is a powerful tool for extracting features of development that regulate morphogenesis. Such multilevel analyses are promising in systems where less molecular information is available and will advance our understanding of the genotype–phenotype map. In systematics, this knowledge will increase our ability to infer how evolution modified a common developmental pattern to generate a wide diversity of morphologies, as in the vertebrate limb. PMID:26377442
Flexible Micro-and Nano-Patterning Tools for Photonics
2016-03-10
AFRL-AFOSR-VA-TR-2016-0125. Flexible Micro- and Nano-Patterning Tools for Photonics, OSD STTR Phase 2 final report, contract FA9550-12-C-0082. Henry Smith, LumArray Inc., 15 Ward St., Somerville, MA. Distribution A: approved for public release.
Genovart, Meritxell; Thibault, Jean-Claude; Igual, José Manuel; Bauzà-Ribot, Maria del Mar; Rabouam, Corinne; Bretagnolle, Vincent
2013-01-01
Dispersal is critically linked to the demographic and evolutionary trajectories of populations, but in most seabird species it is difficult to estimate. Using molecular tools, we explored the population structure and spatial dispersal pattern of a highly pelagic but philopatric seabird, the Cory's shearwater Calonectris diomedea. Microsatellite fragments were analysed from samples collected across almost the entire breeding range of the species. To help disentangle the taxonomic status of the two described subspecies, the Atlantic form C. d. borealis and the Mediterranean form C. d. diomedea, we analysed genetic divergence between subspecies and quantified both historical and recent migration rates between the Mediterranean and Atlantic basins. We also searched for evidence of isolation by distance (IBD) and addressed spatial patterns of gene flow. We found low genetic structure in the Mediterranean basin. Conversely, strong genetic differentiation appeared in the Atlantic basin. Even though the species was mostly philopatric (97%), results suggest recent dispersal between basins, especially from the Atlantic to the Mediterranean (approx. 10% of migrants/generation across the last two generations). Long-term gene flow analyses also suggested a historical exchange between basins (about 70 breeders/generation). Spatial analysis of genetic variation indicates that distance is not the main factor shaping genetic structure in this species. Given our results, we recommend gathering more data before concluding whether these taxa should be treated as two species or as subspecies. PMID:23950986
Microfluidic-based patterning of embryonic stem cells for in vitro development studies.
Suri, Shalu; Singh, Ankur; Nguyen, Anh H; Bratt-Leal, Andres M; McDevitt, Todd C; Lu, Hang
2013-12-07
In vitro recapitulation of mammalian embryogenesis and examination of the emerging behaviours of embryonic structures require both the means to engineer complexity and to accurately assess phenotypes of multicellular aggregates. Current approaches to studying multicellular populations in 3D configurations are limited by the inability to create complex (i.e. spatially heterogeneous) environments in a reproducible manner with high fidelity, thus impeding the ability to engineer microenvironments and combinations of cells with complexity similar to that found during morphogenic processes such as development, remodelling and wound healing. Here, we develop a multicellular embryoid body (EB) fusion technique as a higher-throughput in vitro tool, compared to manual assembly, to generate developmentally relevant embryonic patterns. We describe the physical principles of the EB fusion microfluidic device design and demonstrate that >60 conjoined EBs can be generated overnight to emulate a development process analogous to mouse gastrulation during early embryogenesis. Using temporal delivery of bone morphogenic protein 4 (BMP4) to embryoid bodies, we recapitulate embryonic day 6.5 (E6.5) of mouse embryo development, with induced mesoderm differentiation in murine embryonic stem cells leading to expression of Brachyury-T-green fluorescent protein (T-GFP), an indicator of primitive streak development and mesoderm differentiation during gastrulation. The proposed microfluidic approach could be used to manipulate hundreds or more individual embryonic cell aggregates in a rapid fashion, thereby allowing controlled differentiation patterns in fused multicellular assemblies to generate complex yet spatially controlled microenvironments.
New approach for producing chemical templates over large area by Molecular Transfer Printing
NASA Astrophysics Data System (ADS)
Inoue, Takejiro; Janes, Dustin; Ren, Jiaxing; Willson, Grant; Ellison, Christopher; Nealey, Paul
2014-03-01
Fabrication of well-defined chemically patterned surfaces is crucially important to the development of next generation microprocessors, hard disk memory devices, photonic/plasmonic devices, separation membranes, and biological microarrays. One promising patterning method in these fields is Molecular Transfer Printing (MTP), which replicates chemical patterns with feature dimensions of the order of 10nm utilizing a master template defined by the microphase separated domains of a block copolymer thin film. The total transfer printing area achievable by MTP has so far been limited by the contact area between two rigid substrates. Therefore, strategies to make conformal contact between substrates could be practically useful because a single lithographically-defined starting pattern could be used to fabricate many replicates by a low-cost process. Here we show a new approach that utilizes a chemically deposited SiN layer and a liquid conformal layer to enable transfer printing of chemical patterns upon thermal annealing over large, continuous areas. We anticipate that our process could be integrated into Step and Flash Imprint Lithography (SFIL) tools to achieve conformal layer thicknesses thin and uniform enough to permit pattern transfer through a dry-etch protocol.
Next-generation sequencing for endocrine cancers: Recent advances and challenges.
Suresh, Padmanaban S; Venkatesh, Thejaswini; Tsutsumi, Rie; Shetty, Abhishek
2017-05-01
Contemporary molecular biology research tools have enriched numerous areas of biomedical research that address challenging diseases, including endocrine cancers (pituitary, thyroid, parathyroid, adrenal, testicular, ovarian, and neuroendocrine cancers). These tools have placed several intriguing clues before the scientific community. Endocrine cancers pose a major challenge in health care and research despite considerable attempts by researchers to understand their etiology. Microarray analyses have provided gene signatures from many cells, tissues, and organs that can differentiate healthy states from diseased ones, and even show patterns that correlate with stages of a disease. Microarray data can also elucidate the responses of endocrine tumors to therapeutic treatments. The rapid progress in next-generation sequencing methods has overcome many of the initial challenges of these technologies, and their advantages over microarray techniques have enabled them to emerge as valuable aids for clinical research applications (prognosis, identification of drug targets, etc.). A comprehensive review describing the recent advances in next-generation sequencing methods and their application in the evaluation of endocrine and endocrine-related cancers is lacking. The main purpose of this review is to illustrate the concepts that collectively constitute our current view of the possibilities offered by next-generation sequencing technological platforms, challenges to relevant applications, and perspectives on the future of clinical genetic testing of patients with endocrine tumors. We focus on recent discoveries in the use of next-generation sequencing methods for clinical diagnosis of endocrine tumors in patients and conclude with a discussion on persisting challenges and future objectives.
Evaluation of a New Digital Automated Glycemic Pattern Detection Tool.
Comellas, María José; Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg
2017-11-01
Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment, and such tools are emerging in broad variety in a more or less unevaluated manner. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying, early in the development and implementation of the tool, areas of improvement and potential safety risks. In a multicenter web-based evaluation, 37 endocrinologists were asked to assess the glycemic patterns of 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on the time spent analyzing each report and on agreement on the presence or absence of defined patterns. Manual analysis of the emminens eConecta reports took markedly longer (CSII: 18 min; MDI: 12.5 min) than the automated eDetecta analysis. Agreement between endocrinologists and eDetecta varied depending on the patterns, with a high level of agreement on patterns of glycemic variability. Further analysis of cases with low agreement identified areas where the algorithms could be improved to optimize trend pattern identification. eDetecta was a useful tool for glycemic pattern detection, helping clinicians reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study.
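Agreement between clinician and tool on whether a pattern is present can be quantified with Cohen's kappa, which corrects raw agreement for chance; the study does not state which agreement statistic it used, so this is a generic sketch with made-up pattern calls:

```python
# Sketch: chance-corrected agreement between two binary raters
# (e.g. endocrinologist vs. automated tool, pattern present = 1).

def cohen_kappa(rater_a, rater_b):
    """rater_a, rater_b: parallel lists of 0/1 pattern calls."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    pa = sum(rater_a) / n                                   # rater A's base rate
    pb = sum(rater_b) / n                                   # rater B's base rate
    pe = pa * pb + (1 - pa) * (1 - pb)                      # chance agreement
    return (po - pe) / (1 - pe)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which makes low-agreement patterns easy to flag for algorithm review.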
Venieri, Danae; Fraggedaki, Antonia; Binas, Vassilios; Zachopoulos, Apostolos; Kiriakidis, George; Mantzavinos, Dionissios
2015-03-01
Klebsiella pneumoniae is considered to be an emerging pathogen that persists under extreme environmental stress. The aim of the present study is to investigate the inactivation rates of this pathogen in water by heterogeneous photocatalytic treatment under solar irradiation, and the genetic variance induced, applying RAPD-PCR as a molecular typing tool. Novel Mn- and Co-doped TiO2 catalysts were assessed in terms of their disinfection efficiency. The reference strain of K. pneumoniae proved to be readily inactivated: disinfection occurred rapidly (i.e. after only 10 min of treatment) and low levels of bacterial regrowth were recorded in the dark and under natural sunlight. Binary-doped titania exhibited the best photocatalytic activity, verifying the synergistic effect induced by composite dopants. Applying RAPD analysis to viable cells after treatment, we concluded that increasing the treatment time led to considerable alteration of RAPD profiles, with the homology coefficient ranging between roughly 35 and 60%. RAPD-PCR proved to be a useful molecular typing tool that, under standardized conditions, yields highly reproducible results. Genetic variation among isolates increased with the period of treatment, and prolonged irradiation in each case affected the overall alteration in band patterns. RAPD patterns were highly diverse between treated and untreated isolates when disinfection was performed with the Co-doped titania. The broad spectrum of genetic variance and generated polymorphisms has the potential to increase the already significant virulence of the species.
Automating generation of textual class definitions from OWL to English.
Stevens, Robert; Malone, James; Williams, Sandra; Power, Richard; Third, Allan
2011-05-17
Text definitions for entities within bio-ontologies are a cornerstone of the effort to gain a consensus in understanding and usage of those ontologies. Writing these definitions is, however, a considerable effort and there is often a lag between specification of the main part of an ontology (logical descriptions and definitions of entities) and the development of the text-based definitions. The goal of natural language generation (NLG) from ontologies is to take the logical description of entities and generate fluent natural language. The application described here uses NLG to automatically provide text-based definitions from an ontology that has logical descriptions of its entities, so avoiding the bottleneck of authoring these definitions by hand. To produce the descriptions, the program collects all the axioms relating to a given entity, groups them according to common structure, realises each group through an English sentence, and assembles the resulting sentences into a paragraph, to form as 'coherent' a text as possible without human intervention. Sentence generation is accomplished using a generic grammar based on logical patterns in OWL, together with a lexicon for realising atomic entities. We have tested our output for the Experimental Factor Ontology (EFO) using a simple survey strategy to explore the fluency of the generated text and how well it conveys the underlying axiomatisation. Two rounds of survey and improvement show that overall the generated English definitions are found to convey the intended meaning of the axiomatisation in a satisfactory manner. The surveys also suggested that one form of generated English will not be universally liked; that intrusion of too much 'formal ontology' was not liked; and that too much explicit exposure of OWL semantics was also not liked. Our prototype tools can generate reasonable paragraphs of English text that can act as definitions. 
The definitions were found acceptable by our survey and, as a result, the developers of EFO are sufficiently satisfied with the output that the generated definitions have been incorporated into EFO. Whilst not a substitute for hand-written textual definitions, our generated definitions are a useful starting point. An on-line version of the NLG text definition tool can be found at http://swat.open.ac.uk/tools/. The questionnaire and sample generated text definitions may be found at http://mcs.open.ac.uk/nlg/SWAT/bio-ontologies.html.
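The described pipeline (collect the axioms for an entity, group them by common structure, realise each group as one English sentence, assemble a paragraph) might be sketched as follows; the axiom triples, verb lexicon, and entity lexicon are toy stand-ins, not EFO content or the authors' grammar:

```python
# Sketch: group OWL-style axioms by logical pattern and realise each group
# as one English sentence, assembling the sentences into a definition.

lexicon = {"cell_line": "cell line", "homo_sapiens": "Homo sapiens",
           "fibroblast": "fibroblast"}          # toy lexicon for atomic entities
verbs = {"subClassOf": "is a kind of",          # toy realisations of properties
         "derives_from": "derives from"}

def realise(entity, axioms):
    """axioms: list of (property, filler) pairs describing the entity."""
    groups = {}
    for prop, filler in axioms:                 # group axioms sharing a property
        groups.setdefault(prop, []).append(lexicon.get(filler, filler))
    sentences = []
    for prop, fillers in groups.items():        # one sentence per group
        s = (f"{lexicon.get(entity, entity)} {verbs.get(prop, prop)} "
             + " and ".join(fillers) + ".")
        sentences.append(s[:1].upper() + s[1:])
    return " ".join(sentences)
```

Grouping before realisation is what keeps several fillers of the same property in a single coordinated sentence rather than a stutter of near-identical ones.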
Automating generation of textual class definitions from OWL to English
2011-01-01
Background Text definitions for entities within bio-ontologies are a cornerstone of the effort to gain a consensus in understanding and usage of those ontologies. Writing these definitions is, however, a considerable effort and there is often a lag between specification of the main part of an ontology (logical descriptions and definitions of entities) and the development of the text-based definitions. The goal of natural language generation (NLG) from ontologies is to take the logical description of entities and generate fluent natural language. The application described here uses NLG to automatically provide text-based definitions from an ontology that has logical descriptions of its entities, so avoiding the bottleneck of authoring these definitions by hand. Results To produce the descriptions, the program collects all the axioms relating to a given entity, groups them according to common structure, realises each group through an English sentence, and assembles the resulting sentences into a paragraph, to form as ‘coherent’ a text as possible without human intervention. Sentence generation is accomplished using a generic grammar based on logical patterns in OWL, together with a lexicon for realising atomic entities. We have tested our output for the Experimental Factor Ontology (EFO) using a simple survey strategy to explore the fluency of the generated text and how well it conveys the underlying axiomatisation. Two rounds of survey and improvement show that overall the generated English definitions are found to convey the intended meaning of the axiomatisation in a satisfactory manner. The surveys also suggested that one form of generated English will not be universally liked; that intrusion of too much ‘formal ontology’ was not liked; and that too much explicit exposure of OWL semantics was also not liked. Conclusions Our prototype tools can generate reasonable paragraphs of English text that can act as definitions. 
The definitions were found acceptable by our survey and, as a result, the developers of EFO are sufficiently satisfied with the output that the generated definitions have been incorporated into EFO. Whilst not a substitute for hand-written textual definitions, our generated definitions are a useful starting point. Availability An on-line version of the NLG text definition tool can be found at http://swat.open.ac.uk/tools/. The questionnaire and sample generated text definitions may be found at http://mcs.open.ac.uk/nlg/SWAT/bio-ontologies.html. PMID:21624160
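The grouping-and-realisation pipeline described above can be sketched in a few lines. The axiom tuples, lexicon, and sentence templates below are illustrative assumptions, not the actual SWAT grammar or an OWL API:

```python
# Hypothetical sketch of axiom-to-English realisation. The axiom encoding,
# lexicon, and templates are invented for illustration; the real tool uses a
# generic grammar over logical patterns in OWL.

LEXICON = {"EFO:heart": "heart", "EFO:organ": "organ",
           "EFO:blood": "blood", "pumps": "pumps"}

def realise(axiom):
    """Turn one (pattern, subject, ...) axiom tuple into an English clause."""
    kind = axiom[0]
    if kind == "SubClassOf":
        _, sub, sup = axiom
        return f"A {LEXICON[sub]} is a kind of {LEXICON[sup]}"
    if kind == "SomeValuesFrom":
        _, sub, prop, filler = axiom
        return f"Every {LEXICON[sub]} {LEXICON[prop]} some {LEXICON[filler]}"
    raise ValueError(kind)

def definition(entity, axioms):
    """Collect axioms about one entity and assemble clauses into a paragraph."""
    clauses = [realise(a) for a in axioms if a[1] == entity]
    return ". ".join(clauses) + "."

axioms = [("SubClassOf", "EFO:heart", "EFO:organ"),
          ("SomeValuesFrom", "EFO:heart", "pumps", "EFO:blood")]
text = definition("EFO:heart", axioms)
```

A real implementation would additionally group axioms by shared structure before realisation, so that several fillers of one property collapse into a single coordinated sentence.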
GREAT: a web portal for Genome Regulatory Architecture Tools
Bouyioukos, Costas; Bucchini, François; Elati, Mohamed; Képès, François
2016-01-01
GREAT (Genome REgulatory Architecture Tools) is a novel web portal of tools designed to generate user-friendly and biologically useful analyses of genome architecture and regulation. The online tools of GREAT are freely accessible and compatible with essentially any operating system running a modern browser. GREAT is based on the analysis of genome layout (defined as the respective positioning of co-functional genes) and its relation to chromosome architecture and gene expression. GREAT tools allow users to systematically detect regular patterns along co-functional genomic features in an automated workflow consisting of three individual steps, each with its own interactive visualization. In addition to the complete analysis of regularities, GREAT tools enable the use of periodicity and position information to improve the prediction of transcription factor binding sites using a multi-view machine learning approach. The outcome of this integrative approach features a multivariate analysis of the interplay between the location of a gene and its regulatory sequence. GREAT results are plotted in interactive web graphs and are available for download either as individual plots, self-contained interactive pages, or machine-readable tables for downstream analysis. The GREAT portal can be reached at the following URL https://absynth.issb.genopole.fr/GREAT and each individual GREAT tool is available for download. PMID:27151196
ERIC Educational Resources Information Center
Kiliç, Çigdem
2017-01-01
In the current study, the pattern-conversion ability of 25 pre-service mathematics teachers (producing figural patterns that follow number patterns) was investigated. During the study, participants were asked to generate figural patterns based on given number patterns. The results of the study indicate that many participants could generate different…
Galbadrakh, Bulgan; Lee, Kyung-Eun; Park, Hyun-Seok
2012-12-01
Grammatical inference methods are expected to find grammatical structures hidden in biological sequences. One hopes that studies of grammar serve as an appropriate tool for theory formation. We have therefore developed JSequitur for automatically generating the grammatical structure of biological sequences within an inference framework of string compression algorithms. Our original motivation was to find any grammatical traits of several cancer genes that could be detected by string compression algorithms. Through this research we have not yet found any meaningful unique traits of the cancer genes, but we did observe some interesting traits regarding the relationship among gene length, sequence similarity, the patterns of the generated grammar, and compression rate.
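As a rough illustration of grammar inference by string compression, the following Re-Pair-style sketch repeatedly replaces the most frequent adjacent symbol pair with a new rule. It is a simplified stand-in for the Sequitur-family algorithm behind JSequitur, not the tool's actual code:

```python
# Illustrative digram-replacement grammar inference (Re-Pair style).
from collections import Counter

def infer_grammar(seq):
    """Compress a sequence into a start string plus digram rules."""
    seq, rules, next_id = list(seq), {}, 0
    while True:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:                    # stop when no digram repeats
            break
        sym = f"R{next_id}"
        next_id += 1
        rules[sym] = pair
        out, i = [], 0
        while i < len(seq):              # non-overlapping left-to-right rewrite
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(sym)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
    return seq, rules

start, rules = infer_grammar("ACGTACGTACGT")

def expand(sym):
    """Recursively expand a symbol back to the original terminals."""
    if sym in rules:
        a, b = rules[sym]
        return expand(a) + expand(b)
    return sym

restored = "".join(expand(s) for s in start)
```

The compression rate and the shape of the rule set are exactly the kinds of "grammatical traits" the abstract relates to gene length and sequence similarity.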
Spontaneous generation of frequency combs in QD lasers
NASA Astrophysics Data System (ADS)
Columbo, Lorenzo Luigi; Bardella, Paolo; Gioannini, Mariangela
2018-02-01
We report a systematic analysis of the self-generation of optical frequency combs in single-section Fabry-Perot Quantum Dot lasers using a Time Domain Travelling Wave model. We show that the carrier grating due to the standing-wave pattern (spatial hole burning) peculiar to Quantum Dot lasers, together with Four Wave Mixing, is the key ingredient explaining spontaneous Optical Frequency Combs in these devices. Our results agree well with recent experimental evidence reported for semiconductor lasers based on Quantum Dot and Quantum Dash active material and pave the way to the development of a simulation tool for the design of these comb laser sources for innovative applications in the field of high-data-rate optical communications.
An eye tracking study of bloodstain pattern analysts during pattern classification.
Arthur, R M; Hoogenboom, J; Green, R D; Taylor, M C; de Bruin, K G
2018-05-01
Bloodstain pattern analysis (BPA) is the forensic discipline concerned with the classification and interpretation of bloodstains and bloodstain patterns at the crime scene. At present, it is unclear exactly which stain or pattern properties and their associated values are most relevant to analysts when classifying a bloodstain pattern. Eye tracking technology has been widely used to investigate human perception and cognition. Its application to forensics, however, is limited. This is the first study to use eye tracking as a tool for gaining access to the mindset of the bloodstain pattern expert. An eye tracking method was used to follow the gaze of 24 bloodstain pattern analysts during an assigned task of classifying a laboratory-generated test bloodstain pattern. With the aid of an automated image-processing methodology, the properties of selected features of the pattern were quantified leading to the delineation of areas of interest (AOIs). Eye tracking data were collected for each AOI and combined with verbal statements made by analysts after the classification task to determine the critical range of values for relevant diagnostic features. Eye-tracking data indicated that there were four main regions of the pattern that analysts were most interested in. Within each region, individual elements or groups of elements that exhibited features associated with directionality, size, colour and shape appeared to capture the most interest of analysts during the classification task. The study showed that the eye movements of trained bloodstain pattern experts and their verbal descriptions of a pattern were well correlated.
GEM-TREND: a web tool for gene expression data mining toward relevant network discovery.
Feng, Chunlai; Araki, Michihiro; Kunimoto, Ryo; Tamon, Akiko; Makiguchi, Hiroki; Niijima, Satoshi; Tsujimoto, Gozoh; Okuno, Yasushi
2009-09-03
DNA microarray technology provides us with a first step toward the goal of uncovering gene functions on a genomic scale. In recent years, vast amounts of gene expression data have been collected, much of which are available in public databases, such as the Gene Expression Omnibus (GEO). To date, most researchers have been manually retrieving data from databases through web browsers using accession numbers (IDs) or keywords, but gene-expression patterns are not considered when retrieving such data. The Connectivity Map was recently introduced to compare gene expression data by introducing gene-expression signatures (represented by a set of genes with up- or down-regulated labels according to their biological states) and is available as a web tool for detecting similar gene-expression signatures from a limited data set (approximately 7,000 expression profiles representing 1,309 compounds). To support researchers in utilizing public gene expression data more effectively, we developed a web tool for finding similar gene expression data and generating its co-expression networks from a publicly available database. GEM-TREND, a web tool for searching gene expression data, allows users to search data from GEO using gene-expression signatures or gene expression ratio data as a query and retrieve gene expression data by comparing gene-expression patterns between the query and GEO gene expression data. The comparison methods are based on the nonparametric, rank-based pattern matching approach of Lamb et al. (Science 2006) with the additional calculation of statistical significance. The web tool was tested with gene expression ratio data randomly extracted from GEO and with in-house microarray data. The results validated the ability of GEM-TREND to retrieve gene expression entries biologically related to a query from GEO. 
For further analysis, a network visualization interface is also provided, whereby genes and gene annotations are dynamically linked to external data repositories. GEM-TREND was developed to retrieve gene expression data by comparing query gene-expression pattern with those of GEO gene expression data. It could be a very useful resource for finding similar gene expression profiles and constructing its gene co-expression networks from a publicly available database. GEM-TREND was designed to be user-friendly and is expected to support knowledge discovery. GEM-TREND is freely available at http://cgs.pharm.kyoto-u.ac.jp/services/network.
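The rank-based comparison can be illustrated with a simplified Kolmogorov-Smirnov-style running sum, in the spirit of the Lamb et al. approach cited above; GEM-TREND's exact scoring and significance calculation are not reproduced here:

```python
# Simplified Connectivity-Map-style signature matching. The gene names and
# scoring details are illustrative, not GEM-TREND's implementation.

def enrichment(ranked_genes, tag_set):
    """KS-like running-sum statistic of tag_set within a ranked gene list."""
    n, t = len(ranked_genes), len(tag_set)
    hit, miss = 1.0 / t, 1.0 / (n - t)
    best, run = 0.0, 0.0
    for g in ranked_genes:
        run += hit if g in tag_set else -miss
        if abs(run) > abs(best):
            best = run
    return best

# Genes ranked from most up-regulated to most down-regulated in one profile:
profile = ["g1", "g2", "g3", "g4", "g5", "g6", "g7", "g8"]
up_tags = {"g1", "g2"}        # signature's up-regulated genes
down_tags = {"g7", "g8"}      # signature's down-regulated genes

# A profile whose extremes match the signature scores highly:
score = enrichment(profile, up_tags) - enrichment(profile, down_tags)
```

A real scoring step would then estimate statistical significance, e.g. by comparing the score against a null distribution from permuted tag sets.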
NASA Astrophysics Data System (ADS)
Chan, Hau P.; Bao, Nai-Keng; Kwok, Wing O.; Wong, Wing H.
2002-04-01
The application of the Digital Pixel Hologram (DPH) as an anti-counterfeiting technology for products such as commercial goods, credit cards, identity cards, and paper banknotes is increasingly important. It offers many advantages over other anti-counterfeiting tools, including a strong diffraction effect, high resolving power, resistance to two-dimensional photocopying, and the potential for mass production of patterns at very low cost. Recently, we have succeeded in fabricating high-definition DPHs with resolution higher than 2500 dpi for anti-counterfeiting purposes by applying modern optical diffraction theory to computer pattern generation with the aid of electron beam lithography (EBL). In this paper, we introduce five levels of encryption techniques that can be embedded in the design of such DPHs to further improve their anti-counterfeiting performance at negligible added cost. The techniques, in ascending order of decryption complexity, are Gray-level Encryption, Pattern Encryption, Character Encryption, Image Modification Encryption and Codebook Encryption. A Hong Kong Special Administrative Region (HKSAR) DPH emblem was fabricated at a resolution of 2540 dpi using the facilities housed in our Optoelectronics Research Center. This emblem will be used as an illustration to discuss each encryption idea in detail during the conference.
Using bio.tools to generate and annotate workbench tool descriptions
Hillion, Kenzo-Hugo; Kuzmin, Ivan; Khodak, Anton; Rasche, Eric; Crusoe, Michael; Peterson, Hedi; Ison, Jon; Ménager, Hervé
2017-01-01
Workbench and workflow systems such as Galaxy, Taverna, Chipster, or Common Workflow Language (CWL)-based frameworks facilitate access to bioinformatics tools in a user-friendly, scalable and reproducible way. Still, the integration of tools in such environments remains a cumbersome, time-consuming and error-prone process. A major consequence is the incomplete or outdated description of tools, which are often missing important information, including parameters and metadata such as publications or links to documentation. ToolDog (Tool DescriptiOn Generator) facilitates the integration of tools - which have been registered in the ELIXIR tools registry (https://bio.tools) - into workbench environments by generating tool description templates. ToolDog includes two modules. The first module analyses the source code of the bioinformatics software with language-specific plugins and generates a skeleton for a Galaxy XML or CWL tool description. The second module is dedicated to the enrichment of the generated tool description, using metadata provided by bio.tools. This last module can also be used on its own to complete or correct existing tool descriptions with missing metadata. PMID:29333231
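The kind of tool-description skeleton such a generator emits can be sketched as follows. The bio.tools fields and the Galaxy XML layout used here are simplified assumptions, not ToolDog's actual output format:

```python
# Hypothetical sketch: build a Galaxy-like tool description skeleton from
# registry-style metadata. Element names are simplified for illustration.
import xml.etree.ElementTree as ET

def galaxy_skeleton(meta):
    """Return an XML skeleton string for one tool's description."""
    tool = ET.Element("tool", id=meta["id"], name=meta["name"],
                      version=meta.get("version", "0.1"))
    ET.SubElement(tool, "description").text = meta.get("description", "")
    ET.SubElement(tool, "command")      # left empty, to be filled by the author
    ET.SubElement(tool, "inputs")
    ET.SubElement(tool, "outputs")
    citations = ET.SubElement(tool, "citations")
    for doi in meta.get("publications", []):
        ET.SubElement(citations, "citation", type="doi").text = doi
    return ET.tostring(tool, encoding="unicode")

xml = galaxy_skeleton({"id": "samtools_sort", "name": "samtools sort",
                       "description": "Sort alignments",
                       "publications": ["10.1093/bioinformatics/btp352"]})
```

The enrichment step described in the abstract would correspond to filling such a skeleton's metadata elements from a registry record rather than writing them by hand.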
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myronakis, M; Cai, W; Dhou, S
Purpose: To design a comprehensive open-source, publicly available, graphical user interface (GUI) to facilitate the configuration, generation, processing and use of the 4D Extended Cardiac-Torso (XCAT) phantom. Methods: The XCAT phantom includes over 9000 anatomical objects as well as respiratory, cardiac and tumor motion. It is widely used for research studies in medical imaging and radiotherapy. The phantom generation process involves the configuration of a text script to parameterize the geometry, motion, and composition of the whole body and objects within it, and to generate simulated PET or CT images. To avoid the need for manual editing or script writing, our MATLAB-based GUI uses slider controls, drop-down lists, buttons and graphical text input to parameterize and process the phantom. Results: Our GUI can be used to: a) generate parameter files; b) generate the voxelized phantom; c) combine the phantom with a lesion; d) display the phantom; e) produce average and maximum intensity images from the phantom output files; f) incorporate irregular patient breathing patterns; and g) generate DICOM files containing phantom images. The GUI provides local help information using tool-tip strings on the currently selected phantom, minimizing the need for external documentation. The DICOM generation feature is intended to simplify the process of importing the phantom images into radiotherapy treatment planning systems or other clinical software. Conclusion: The GUI simplifies and automates the use of the XCAT phantom for imaging-based research projects in medical imaging or radiotherapy. This has the potential to accelerate research conducted with the XCAT phantom, or to ease the learning curve for new users. This tool does not include the XCAT phantom software itself. We would like to acknowledge funding from MRA, Varian Medical Systems Inc.
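Feature e), producing average and maximum intensity images from the voxelized phantom output, reduces to simple axis projections. A minimal numpy sketch, with array shapes invented for illustration (the GUI itself is MATLAB-based):

```python
# Average- and maximum-intensity projections from a voxelized phantom.
import numpy as np

def projections(volume, axis=0):
    """Return (average, maximum) intensity projections along one axis."""
    return volume.mean(axis=axis), volume.max(axis=axis)

phantom = np.zeros((4, 8, 8))      # stand-in for XCAT voxel output (z, y, x)
phantom[1, 2:6, 2:6] = 1.0         # a bright "lesion" slab in slice 1
aip, mip = projections(phantom, axis=0)
```

The maximum projection keeps the lesion at full intensity, while the average projection dilutes it by the number of slices, which is why both views are useful.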
Metallization and Biopatterning on Ultra-Flexible Substrates via Dextran Sacrificial Layers
Tseng, Peter; Pushkarsky, Ivan; Di Carlo, Dino
2014-01-01
Micro-patterning tools adopted from the semiconductor industry have mostly been optimized to pattern features onto rigid silicon and glass substrates, however, recently the need to pattern on soft substrates has been identified in simulating cellular environments or developing flexible biosensors. We present a simple method of introducing a variety of patterned materials and structures into ultra-flexible polydimethylsiloxane (PDMS) layers (elastic moduli down to 3 kPa) utilizing water-soluble dextran sacrificial thin films. Dextran films provided a stable template for photolithography, metal deposition, particle adsorption, and protein stamping. These materials and structures (including dextran itself) were then readily transferrable to an elastomer surface following PDMS (10 to 70∶1 base to crosslinker ratios) curing over the patterned dextran layer and after sacrificial etch of the dextran in water. We demonstrate that this simple and straightforward approach can controllably manipulate surface wetting and protein adsorption characteristics of PDMS, covalently link protein patterns for stable cell patterning, generate composite structures of epoxy or particles for study of cell mechanical response, and stably integrate certain metals with use of vinyl molecular adhesives. This method is compatible over the complete moduli range of PDMS, and potentially generalizable over a host of additional micro- and nano-structures and materials. PMID:25153326
Evaluation of a New Digital Automated Glycemic Pattern Detection Tool
Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg
2017-01-01
Background: Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment. These tools are emerging in a broad variety in a more or less nonevaluated manner. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying, early in the tool's development and implementation, areas for improvement and potential safety risks. Methods: Multicenter web-based evaluation in which 37 endocrinologists were asked to assess glycemic patterns in 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on the time spent analyzing each report and on agreement on the presence or absence of defined patterns. Results: The eDetecta module markedly reduced the time needed to analyze each case relative to nonautomated review of the emminens eConecta reports (CSII: 18 min; MDI: 12.5 min). Agreement between endocrinologists and eDetecta varied depending on the patterns, with a high level of agreement for patterns of glycemic variability. Further analysis of cases with low agreement identified areas where the algorithms could be improved to optimize trend pattern identification. Conclusion: eDetecta was a useful tool for glycemic pattern detection, helping clinicians reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study. PMID:29091477
NASA Astrophysics Data System (ADS)
Verrucci, Enrica; Bevington, John; Vicini, Alessandro
2014-05-01
A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery and statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event), and 2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDCT tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, it is clear the IDCT tools have relevance for multi-hazard risk assessment. 
When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be inferred. Lastly, this work draws attention to the use of the IDCT suite as an education resource for inspiring and training new students and engineers in the field of disaster risk reduction.
NASA Astrophysics Data System (ADS)
Yamanaka, Eiji; Taniguchi, Rikiya; Itoh, Masamitsu; Omote, Kazuhiko; Ito, Yoshiyasu; Ogata, Kiyoshi; Hayashi, Naoya
2016-05-01
Nanoimprint lithography (NIL) is one of the most promising candidates for next-generation semiconductor lithography, offering high resolution at low cost. The resolution of NIL is determined by the definition of the template: NIL faithfully transfers the template pattern to the wafer, and the cross-sectional profile of the template pattern strongly affects the resist profile on the wafer. Management of the cross-sectional profile is therefore essential. The grazing incidence small angle x-ray scattering (GI-SAXS) technique has been proposed as a method for measuring the cross-sectional profile of periodic nanostructure patterns. Incident x-rays strike the sample surface at a very low glancing angle, close to the critical angle for total reflection. The x-rays scattered from the surface structure are detected on a two-dimensional detector. The observed intensity is discrete in the horizontal (2θ) direction because of the periodicity of the structure; diffraction is observed only when the diffraction condition is satisfied. In the vertical (β) direction, the diffraction intensity pattern shows interference fringes reflecting the height and shape of the structure. Advantages of x-ray measurement are that the optical constants of the materials are well known and that a specific diffraction intensity pattern can be calculated from a given model of the cross-sectional profile. The surface structure is estimated by sequentially varying the model parameters and comparing the calculated diffraction intensity pattern with the measured one. Furthermore, the GI-SAXS technique is non-destructive, suggesting its potential as an effective tool for product quality assurance. We have developed a cross-sectional profile measurement of quartz template patterns using the GI-SAXS technique. 
In this paper, we report on the measurement capabilities of the GI-SAXS technique as a cross-sectional profile measurement tool for NIL quartz template patterns.
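The collation step - sequentially varying model parameters and comparing the calculated pattern with the measured one - can be sketched as a toy grid search. The one-parameter fringe "model" below is purely illustrative and not a real GI-SAXS scattering simulation:

```python
# Toy model-fitting loop: pick the structure height whose calculated fringe
# pattern best matches the measurement (sum of squared errors).
import math

def model_pattern(height, betas):
    """Toy interference fringes whose period depends on structure height."""
    return [math.cos(height * b) ** 2 for b in betas]

def fit_height(measured, betas, candidates):
    def sse(h):
        return sum((m - c) ** 2
                   for m, c in zip(measured, model_pattern(h, betas)))
    return min(candidates, key=sse)

betas = [i * 0.01 for i in range(200)]        # detector angles (arbitrary units)
measured = model_pattern(7.0, betas)          # synthetic "measurement"
best = fit_height(measured, betas, [h * 0.5 for h in range(2, 21)])
```

A production analysis would replace the toy model with a full scattering calculation and the grid search with a proper optimizer, but the structure of the loop is the same.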
Huan, Zhijie; Chu, Henry K; Yang, Jie; Sun, Dong
2017-04-01
Seeding and patterning of cells with an engineered scaffold is a critical process in artificial tissue construction and regeneration. To date, many engineered scaffolds exhibit simple intrinsic designs, which fail to mimic the geometrical complexity of native tissues. In this study, a novel scaffold that can automatically seed cells into multilayer honeycomb patterns for bone tissue engineering application was designed and examined. The scaffold incorporated dielectrophoresis for noncontact manipulation of cells and intrinsic honeycomb architectures were integrated in each scaffold layer. When a voltage was supplied to the stacked scaffold layers, three-dimensional electric fields were generated, thereby manipulating cells to form into honeycomb-like cellular patterns for subsequent culture. The biocompatibility of the scaffold material was confirmed through the cell viability test. Experiments were conducted to evaluate the cell viability during DEP patterning at different voltage amplitudes, frequencies, and manipulating time. Three different mammalian cells were examined and the effects of the cell size and the cell concentration on the resultant cellular patterns were evaluated. Results showed that the proposed scaffold structure was able to construct multilayer honeycomb cellular patterns in a manner similar to the natural tissue. This honeycomb-like scaffold and the dielectrophoresis-based patterning technique examined in this study could provide the field with a promising tool to enhance seeding and patterning of a wide range of cells for the development of high-quality artificial tissues.
An analysis of pilot error-related aircraft accidents
NASA Technical Reports Server (NTRS)
Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.
1974-01-01
A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: Critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable. Cluster analysis, which was an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses. Pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.
Ancient Wings: animating the evolution of butterfly wing patterns.
Arbesman, Samuel; Enthoven, Leo; Monteiro, Antónia
2003-10-01
Character optimization methods can be used to reconstruct ancestral states at the internal nodes of phylogenetic trees. However, seldom are these ancestral states visualized collectively. Ancient Wings is a computer program that provides a novel method of visualizing the evolution of several morphological traits simultaneously. It allows users to visualize how the ventral hindwing pattern of 54 butterflies in the genus Bicyclus may have changed over time. By clicking on each of the nodes within the evolutionary tree, the user can see an animation of how wing size, eyespot size, and eyespot position relative to the wing margin have putatively evolved as a collective whole. Ancient Wings may be used as a pedagogical device as well as a research tool for hypothesis generation in the fields of evolutionary, ecological, and developmental biology.
An open-access CMIP5 pattern library for temperature and precipitation: description and methodology
NASA Astrophysics Data System (ADS)
Lynch, Cary; Hartin, Corinne; Bond-Lamberty, Ben; Kravitz, Ben
2017-05-01
Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest at high latitudes (60-90° N/S). Bias and mean errors between modeled and pattern-predicted output were smaller for the linear regression method than for the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within 0.5 °C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. This paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns. The dataset and netCDF data generation code are available at doi:10.5281/zenodo.495632.
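The least squares regression method amounts to a per-grid-cell regression of local change on global mean temperature. A sketch with synthetic stand-in data (shapes and values are assumptions, not CMIP5 output):

```python
# Per-grid-cell least squares pattern: slope of local temperature against
# global mean temperature across time, in K per K of global warming.
import numpy as np

def regression_pattern(local, global_mean):
    """local: (time, lat, lon) array; returns the (lat, lon) slope pattern."""
    x = global_mean - global_mean.mean()
    y = local - local.mean(axis=0)
    # Sum over time of x*y per cell, divided by sum of x^2 (OLS slope).
    return np.tensordot(x, y, axes=(0, 0)) / (x ** 2).sum()

rng = np.random.default_rng(0)
gmt = np.linspace(0.0, 2.0, 50)                 # global mean warming (K)
true = np.array([[0.5, 1.0], [1.5, 2.0]])       # amplification per 2x2 cell
field = gmt[:, None, None] * true + rng.normal(0, 0.01, (50, 2, 2))
pattern = regression_pattern(field, gmt)
```

The delta method would instead difference two epoch means and divide by the global mean temperature change over the same interval; the regression uses all years, which is one reason its errors tend to be smaller.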
An integrated network of Arabidopsis growth regulators and its use for gene prioritization.
Sabaghian, Ehsan; Drebert, Zuzanna; Inzé, Dirk; Saeys, Yvan
2015-12-01
Elucidating the molecular mechanisms that govern plant growth has been an important topic in plant research, and current advances in large-scale data generation call for computational tools that efficiently combine these different data sources to generate novel hypotheses. In this work, we present a novel, integrated network that combines multiple large-scale data sources to characterize growth regulatory genes in Arabidopsis, one of the main plant model organisms. The contributions of this work are twofold: first, we characterized a set of carefully selected growth regulators with respect to their connectivity patterns in the integrated network, and, subsequently, we explored to what extent these connectivity patterns can be used to suggest new growth regulators. Using a large-scale comparative study, we designed new supervised machine learning methods to prioritize growth regulators. Our results show that these methods significantly improve current state-of-the-art prioritization techniques, and are able to suggest meaningful new growth regulators. In addition, the integrated network is made available to the scientific community, providing a rich data source that will be useful for many biological processes, not necessarily restricted to plant growth.
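One simple instance of connectivity-based prioritization - ranking candidates by how much their network neighbourhood overlaps that of known regulators - can be sketched as follows. The toy graph and the score are illustrative, not the paper's supervised method:

```python
# Rank candidate genes by the fraction of their neighbours shared with the
# neighbourhood of known growth regulators (seed genes). Names are invented.

network = {
    "GRF1": {"A", "B", "C"},    # known growth regulators
    "GRF2": {"B", "C", "D"},
    "cand1": {"B", "C"},        # candidates to rank
    "cand2": {"E"},
}

def prioritize(net, seeds):
    """Return candidates sorted by neighbourhood overlap with the seed set."""
    seed_nbrs = set().union(*(net[s] for s in seeds))
    cands = [g for g in net if g not in seeds]
    score = {g: len(net[g] & seed_nbrs) / len(net[g]) for g in cands}
    return sorted(cands, key=score.get, reverse=True)

ranking = prioritize(network, {"GRF1", "GRF2"})
```

A supervised approach like the paper's would instead learn which connectivity features discriminate known regulators from background genes, but the intuition - regulators look alike in the network - is the same.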
Hirsch, Jana A; Winters, Meghan; Clarke, Philippa; McKay, Heather
2014-12-12
Measuring mobility is critical for understanding neighborhood influences on older adults' health and functioning. Global Positioning Systems (GPS) may represent an important opportunity to measure, describe, and compare mobility patterns in older adults. We generated three types of activity spaces (Standard Deviation Ellipse, Minimum Convex Polygon, Daily Path Area) using GPS data from 95 older adults in Vancouver, Canada. Calculated activity space areas and compactness were compared across sociodemographic and resource characteristics. Area measures derived from the three different approaches to developing activity spaces were highly correlated. Participants who were younger, lived in less walkable neighborhoods, had a valid driver's license, had access to a vehicle, or had physical support to go outside of their homes had larger activity spaces. Mobility space compactness measures also differed by sociodemographic and resource characteristics. This research extends the literature by demonstrating that GPS tracking can be used as a valuable tool to better understand the geographic mobility patterns of older adults. This study informs potential ways to maintain older adult independence by identifying factors that influence geographic mobility.
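One of the three activity-space measures in the study, the Minimum Convex Polygon, reduces to a convex hull plus an area computation. The sketch below uses arbitrary planar coordinates rather than real GPS fixes (which would first need projection from latitude/longitude).

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(vertices):
    """Shoelace formula for a simple polygon."""
    n = len(vertices)
    s = sum(vertices[i][0] * vertices[(i+1) % n][1] -
            vertices[(i+1) % n][0] * vertices[i][1] for i in range(n))
    return abs(s) / 2.0

# Illustrative "GPS fixes": four corner points and one interior point,
# which the hull correctly ignores.
fixes = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1)]
mcp_area = polygon_area(convex_hull(fixes))
```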
A Markov game theoretic data fusion approach for cyber situational awareness
NASA Astrophysics Data System (ADS)
Shen, Dan; Chen, Genshe; Cruz, Jose B., Jr.; Haynes, Leonard; Kruger, Martin; Blasch, Erik
2007-04-01
This paper proposes an innovative data-fusion/data-mining game theoretic situation awareness and impact assessment approach for cyber network defense. Alerts generated by Intrusion Detection Sensors (IDSs) or Intrusion Prevention Sensors (IPSs) are fed into the data refinement (Level 0) and object assessment (L1) data fusion components. High-level situation/threat assessment (L2/L3) data fusion based on a Markov game model and Hierarchical Entity Aggregation (HEA) is proposed to refine the primitive prediction generated by adaptive feature/pattern recognition and capture new unknown features. A Markov (stochastic) game method is used to estimate the belief of each possible cyber attack pattern. Game theory captures the nature of cyber conflicts: determination of the attacking-force strategies is tightly coupled to determination of the defense-force strategies, and vice versa. Also, Markov game theory deals with uncertainty and incompleteness of available information. A software tool is developed to demonstrate the performance of the high-level information fusion for the cyber-network-defense situation, and a simulation example shows the enhanced understanding of cyber-network defense.
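One ingredient of the approach, maintaining a belief over candidate attack patterns as alerts arrive, can be illustrated with a plain Bayesian update. This is not the paper's Markov game model; the alert types, patterns, and likelihood values below are invented for illustration.

```python
# Prior belief over hypothetical attack patterns.
patterns = ["scan", "dos", "worm"]
belief = {p: 1 / 3 for p in patterns}

# Invented sensor model: P(alert | pattern).
likelihood = {
    "port_probe": {"scan": 0.7, "dos": 0.1, "worm": 0.2},
    "syn_flood":  {"scan": 0.1, "dos": 0.8, "worm": 0.1},
}

# Update the belief as IDS alerts stream in (two probe alerts here).
for alert in ["port_probe", "port_probe"]:
    belief = {p: belief[p] * likelihood[alert][p] for p in patterns}
    total = sum(belief.values())
    belief = {p: b / total for p, b in belief.items()}
```

After repeated probe alerts the belief concentrates on the scanning pattern; the game-theoretic layer in the paper would then couple such beliefs to defense-strategy selection.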
A three-dimensional algebraic grid generation scheme for gas turbine combustors with inclined slots
NASA Technical Reports Server (NTRS)
Yang, S. L.; Cline, M. C.; Chen, R.; Chang, Y. L.
1993-01-01
A 3D algebraic grid generation scheme is presented for generating the grid points inside gas turbine combustors with inclined slots. The scheme is based on the 2D transfinite interpolation method. Since the scheme is a 2D approach, it is very efficient and can easily be extended to gas turbine combustors with either dilution hole or slot configurations. To demonstrate the feasibility and the usefulness of the technique, a numerical study of the quick-quench/lean-combustion (QQ/LC) zones of a staged turbine combustor is given. Preliminary results illustrate some of the major features of the flow and temperature fields in the QQ/LC zones. Formation of co- and counter-rotating bulk flows and the shapes of the temperature fields can be observed clearly, and the resulting patterns are consistent with experimental observations typical of the confined slanted jet-in-cross flow. Numerical solutions show the method to be an efficient and reliable tool for generating computational grids for analyzing gas turbine combustors with slanted slots.
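The 2D transfinite interpolation building block the abstract cites blends the four boundary curves of a patch, subtracting the doubly counted corner contributions. A minimal sketch on a unit square with straight edges (a real combustor section would supply curved boundary curves):

```python
import numpy as np

def tfi(bottom, top, left, right, ni, nj):
    """2D transfinite interpolation.

    bottom/top are curves parameterised by s in [0,1]; left/right by t in
    [0,1]; each returns an (x, y) point. Returns an (ni, nj, 2) grid.
    """
    xi, eta = np.linspace(0, 1, ni), np.linspace(0, 1, nj)
    c00, c10 = np.array(bottom(0.0)), np.array(bottom(1.0))
    c01, c11 = np.array(top(0.0)), np.array(top(1.0))
    grid = np.zeros((ni, nj, 2))
    for i, s in enumerate(xi):
        for j, t in enumerate(eta):
            grid[i, j] = ((1-t) * np.array(bottom(s)) + t * np.array(top(s))
                          + (1-s) * np.array(left(t)) + s * np.array(right(t))
                          - ((1-s)*(1-t)*c00 + s*(1-t)*c10
                             + (1-s)*t*c01 + s*t*c11))
    return grid

# Straight-edged unit square: TFI reduces to a uniform Cartesian grid.
g = tfi(lambda s: (s, 0.0), lambda s: (s, 1.0),
        lambda t: (0.0, t), lambda t: (1.0, t), 5, 5)
```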
Role of Open Source Tools and Resources in Virtual Screening for Drug Discovery.
Karthikeyan, Muthukumarasamy; Vyas, Renu
2015-01-01
Advancement in chemoinformatics research, in parallel with the availability of high performance computing platforms, has made handling of large scale multi-dimensional scientific data for high throughput drug discovery easier. In this study we have explored publicly available molecular databases with the help of integrated, open-source-based in-house molecular informatics tools for virtual screening. The virtual screening literature for the past decade has been extensively investigated and thoroughly analyzed to reveal interesting patterns with respect to the drug, target, scaffold and disease space. The review also focuses on the integrated chemoinformatics tools that are capable of harvesting chemical data from textual literature information and transforming it into truly computable chemical structures, identification of unique fragments and scaffolds from a class of compounds, automatic generation of focused virtual libraries, computation of molecular descriptors for structure-activity relationship studies, application of conventional filters used in lead discovery along with in-house developed exhaustive PTC (Pharmacophore, Toxicophores and Chemophores) filters and machine learning tools for the design of potential disease specific inhibitors. A case study on kinase inhibitors is provided as an example.
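A hedged sketch of one "conventional filter used in lead discovery", Lipinski's rule of five, applied to precomputed descriptors. The molecules and descriptor values below are invented; a real pipeline would compute them with a chemoinformatics toolkit from actual structures.

```python
def passes_rule_of_five(mol):
    """Lipinski criteria: MW <= 500, logP <= 5, <= 5 H-bond donors,
    <= 10 H-bond acceptors."""
    return (mol["mol_weight"] <= 500 and mol["logp"] <= 5
            and mol["h_donors"] <= 5 and mol["h_acceptors"] <= 10)

# Hypothetical candidate library with precomputed descriptors.
library = [
    {"name": "cand_1", "mol_weight": 320.0, "logp": 2.1,
     "h_donors": 1, "h_acceptors": 4},
    {"name": "cand_2", "mol_weight": 612.0, "logp": 5.9,
     "h_donors": 6, "h_acceptors": 12},
]
hits = [m["name"] for m in library if passes_rule_of_five(m)]
```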
NASA Technical Reports Server (NTRS)
Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert
2005-01-01
Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.
An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin
Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90° N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5 °C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. As a result, this paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.
An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology
Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin; ...
2017-05-15
Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90° N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5 °C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. As a result, this paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.
Burgess, K E V; Borutzki, Y; Rankin, N; Daly, R; Jourdan, F
2017-12-15
Metabolomics frequently relies on the use of high resolution mass spectrometry data. Classification and filtering of this data remain a challenging task due to the plethora of complex mass spectral artefacts, chemical noise, adducts and fragmentation that occur during ionisation and analysis. Additionally, the relationships between detected compounds can provide a wealth of information about the nature of the samples and the biochemistry that gave rise to them. We present a biochemical networking tool, MetaNetter 2, based on the original MetaNetter, a Cytoscape plugin that creates ab initio networks. The new version supports two major improvements: the generation of adduct networks and the creation of tables that map adduct or transformation patterns across multiple samples, providing a readout of compound relationships. We have applied this tool to the analysis of adduct patterns in the same sample separated under two different chromatographies, allowing inferences to be made about the effect of different buffer conditions on adduct detection, and the application of the chemical transformation analysis to both a single fragmentation analysis and an all-ions fragmentation dataset. Finally, we present an analysis of a dataset derived from anaerobic and aerobic growth of the organism Staphylococcus aureus, demonstrating the utility of the tool for biological analysis. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
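The ab initio networking idea can be sketched as mass-difference matching: connect detected masses whose difference matches a known chemical transformation within a tolerance. The peak masses and the two-entry transformation list below are illustrative, not MetaNetter's actual tables.

```python
# Monoisotopic mass differences (Da) for two common transformations.
TRANSFORMS = {"dehydration (-H2O)": 18.0106, "methylation (+CH2)": 14.0157}
TOL = 0.002  # matching tolerance in Da

def build_network(masses):
    """Return edges (m1, m2, transformation) for matching mass differences."""
    edges = []
    for i, m1 in enumerate(masses):
        for m2 in masses[i + 1:]:
            for name, delta in TRANSFORMS.items():
                if abs(abs(m2 - m1) - delta) <= TOL:
                    edges.append((m1, m2, name))
    return edges

# Illustrative peaks: a sugar-like mass, its methylated and dehydrated forms.
peaks = [180.0634, 194.0791, 162.0528]
net = build_network(peaks)
```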
NASA Astrophysics Data System (ADS)
Lovette, J. P.; Duncan, J. M.; Band, L. E.
2016-12-01
Watershed management requires information on the hydrologic impacts of local to regional land use, land cover and infrastructure conditions. Management of runoff volumes, storm flows, and water quality can benefit from large scale, "top-down" screening tools, using readily available information, as well as more detailed, "bottom-up" process-based models that explicitly track local runoff production and routing from sources to receiving water bodies. Regional scale data, available nationwide through the NHD+, and top-down models based on aggregated catchment information provide useful tools for estimating regional patterns of peak flows, volumes and nutrient loads at the catchment level. Management impacts can be estimated with these models, but have limited ability to resolve impacts beyond simple changes to land cover proportions. Alternatively, distributed process-based models provide more flexibility in modeling management impacts by resolving spatial patterns of nutrient source, runoff generation, and uptake. This bottom-up approach can incorporate explicit patterns of land cover, drainage connectivity, and vegetation extent, but are typically applied over smaller areas. Here, we first model peak flood flows and nitrogen loads across North Carolina's 70,000 NHD+ catchments using USGS regional streamflow regression equations and the SPARROW model. We also estimate management impact by altering aggregated sources in each of these models. To address the missing spatial implications of the top-down approach, we further explore the demand for riparian buffers as a management strategy, simulating the accumulation of nutrient sources along flow paths and the potential mitigation of these sources through forested buffers. We use the Regional Hydro-Ecological Simulation System (RHESSys) to model changes across several basins in North Carolina's Piedmont and Blue Ridge regions, ranging in size from 15 to 1,130 km2.
The two approaches provide a complementary set of tools for large area screening, followed by smaller, more process based assessment and design tools.
NASA Astrophysics Data System (ADS)
Vaudor, Lise; Piegay, Herve; Wawrzyniak, Vincent; Spitoni, Marie
2016-04-01
The form and functioning of a geomorphic system result from processes operating at various spatial and temporal scales. Longitudinal channel characteristics thus exhibit complex patterns which vary according to the scale of study, might be periodic or segmented, and are generally blurred by noise. Describing the intricate, multiscale structure of such signals, and identifying at which scales the patterns are dominant and over which sub-reach, could help determine at which scales they should be investigated, and provide insights into the main controlling factors. Wavelet transforms aim at describing data at multiple scales (either in time or space), and are now exploited in geophysics for the analysis of nonstationary series of data. They provide a consistent, non-arbitrary, and multiscale description of a signal's variations and help explore potential causalities. Nevertheless, their use in fluvial geomorphology, notably to study longitudinal patterns, is hindered by a lack of user-friendly tools to help understand, implement, and interpret them. We have developed a free application, The Wavelet ToolKat, designed to facilitate the use of wavelet transforms on temporal or spatial series. We illustrate its usefulness describing longitudinal channel curvature and slope of three freely meandering rivers in the Amazon basin (the Purus, Juruá and Madre de Dios rivers), using topographic data generated from NASA's Shuttle Radar Topography Mission (SRTM) in 2000. Three types of wavelet transforms are used, with different purposes. Continuous Wavelet Transforms are used to identify in a non-arbitrary way the dominant scales and locations at which channel curvature and slope vary. Cross-wavelet transforms, and wavelet coherence and phase are used to identify scales and locations exhibiting significant channel curvature and slope co-variations. 
Maximal Overlap Discrete Wavelet Transforms decompose data into their variations at a series of scales and are used to provide smoothed descriptions of the series at the scales deemed relevant.
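A toy single level of a Haar wavelet transform conveys the idea behind these multiscale decompositions: a spatial series is split into a smooth (coarse-scale) part and the fine-scale detail removed by the smoothing. This is a minimal sketch, not the Wavelet ToolKat's implementation, which uses continuous, cross-, and maximal-overlap transforms.

```python
import numpy as np

def haar_level(x):
    """One Haar decomposition level for an even-length series.

    Returns (smooth, detail): pairwise averages and pairwise half-differences.
    """
    x = np.asarray(x, dtype=float)
    smooth = (x[0::2] + x[1::2]) / 2.0   # coarse-scale approximation
    detail = (x[0::2] - x[1::2]) / 2.0   # variation at this scale
    return smooth, detail

# Illustrative "longitudinal channel" series with one fine-scale wiggle.
signal = np.array([1.0, 1.0, 5.0, 5.0, 2.0, 4.0, 0.0, 0.0])
smooth, detail = haar_level(signal)
```

Applying `haar_level` recursively to `smooth` yields descriptions at successively coarser scales, which is the spirit of the smoothed series the toolkit produces.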
Campana, Lorenzo; Breitbeck, Robert; Bauer-Kreuz, Regula; Buck, Ursula
2016-05-01
This study evaluated the feasibility of documenting patterned injury using three dimensions and true colour photography without complex 3D surface documentation methods. This method is based on a generated 3D surface model using radiologic slice images (CT) while the colour information is derived from photographs taken with commercially available cameras. The external patterned injuries were documented in 16 cases using digital photography as well as highly precise photogrammetry-supported 3D structured light scanning. The internal findings of these deceased were recorded using CT and MRI. For registration of the internal with the external data, two different types of radiographic markers were used and compared. The 3D surface model generated from CT slice images was linked with the photographs, and thereby digital true-colour 3D models of the patterned injuries could be created (Image projection onto CT/IprojeCT). In addition, these external models were merged with the models of the somatic interior. We demonstrated that 3D documentation and visualization of external injury findings by integration of digital photography in CT/MRI data sets is suitable for the 3D documentation of individual patterned injuries to a body. Nevertheless, this documentation method is not a substitution for photogrammetry and surface scanning, especially when the entire bodily surface is to be recorded in three dimensions including all external findings, and when precise data is required for comparing highly detailed injury features with the injury-inflicting tool.
A linguistic rule-based approach to extract drug-drug interactions from pharmacological documents.
Segura-Bedmar, Isabel; Martínez, Paloma; de Pablo-Sánchez, César
2011-03-29
A drug-drug interaction (DDI) occurs when one drug influences the level or activity of another drug. The increasing volume of the scientific literature overwhelms health care professionals trying to keep up to date with all published studies on DDI. This paper describes a hybrid linguistic approach to DDI extraction that combines shallow parsing and syntactic simplification with pattern matching. Appositions and coordinate structures are interpreted based on shallow syntactic parsing provided by the UMLS MetaMap tool (MMTx). Subsequently, complex and compound sentences are broken down into clauses from which simple sentences are generated by a set of simplification rules. A pharmacist defined a set of domain-specific lexical patterns to capture the most common expressions of DDI in texts. These lexical patterns are matched with the generated sentences in order to extract DDIs. We have performed different experiments to analyze the performance of the different processes. The lexical patterns achieve a reasonable precision (67.30%), but very low recall (14.07%). The inclusion of appositions and coordinate structures helps to improve the recall (25.70%); however, precision is lower (48.69%). The detection of clauses does not improve the performance. Information Extraction (IE) techniques can provide an interesting way of reducing the time spent by health care professionals on reviewing the literature. Nevertheless, no integral approach to extracting DDIs from texts had previously been carried out. To the best of our knowledge, this work proposes the first integral solution for the automatic extraction of DDIs from biomedical texts.
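The lexical-pattern step can be sketched as regular-expression matching over simplified sentences. The pattern and sentences below are invented for illustration; the paper's patterns were defined by a pharmacist and applied after parsing and simplification.

```python
import re

# One toy lexical pattern: "<drug> increases/decreases the level(s) of <drug>".
pattern = re.compile(
    r"(\w+)\s+(?:increases|decreases)\s+the\s+levels?\s+of\s+(\w+)")

# Hypothetical simplified sentences, as produced by the simplification rules.
sentences = [
    "Ketoconazole increases the levels of midazolam",
    "Aspirin was administered after breakfast",
]
ddis = [m.groups() for s in sentences
        for m in [pattern.search(s)] if m]
```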
Gravel, P; Tremblay, M; Leblond, H; Rossignol, S; de Guise, J A
2010-07-15
A computer-aided method for the tracking of morphological markers in fluoroscopic images of a rat walking on a treadmill is presented and validated. The markers correspond to bone articulations in a hind leg and are used to define the hip, knee, ankle and metatarsophalangeal joints. The method allows a user to identify, using a computer mouse, about 20% of the marker positions in a video and interpolate their trajectories from frame-to-frame. This results in a seven-fold speed improvement in detecting markers. This also eliminates confusion problems due to legs crossing and blurred images. The video images are corrected for geometric distortions from the X-ray camera, wavelet denoised, to preserve the sharpness of minute bone structures, and contrast enhanced. From those images, the marker positions across video frames are extracted, corrected for rat "solid body" motions on the treadmill, and used to compute the positional and angular gait patterns. Robust Bootstrap estimates of those gait patterns and their prediction and confidence bands are finally generated. The gait patterns are invaluable tools to study the locomotion of healthy animals or the complex process of locomotion recovery in animals with injuries. The method could, in principle, be adapted to analyze the locomotion of other animals as long as a fluoroscopic imager and a treadmill are available. Copyright 2010 Elsevier B.V. All rights reserved.
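The frame-to-frame interpolation step can be illustrated simply: a user clicks a marker in roughly 20% of frames, and intermediate positions are filled in. Linear interpolation is used below as a placeholder; the abstract does not specify the interpolation scheme, and the frame indices and pixel values are invented.

```python
import numpy as np

frames = np.arange(10)                      # all video frames
known_frames = np.array([0, 4, 9])          # frames where the user clicked
known_x = np.array([10.0, 18.0, 8.0])       # marker x-position (pixels)

# Interpolate the marker's x-trajectory across unclicked frames.
x_track = np.interp(frames, known_frames, known_x)
```

The same would be done for y, per marker; joint angles then follow from the interpolated marker positions.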
Effken, Judith A.; Carley, Kathleen M.; Gephart, Sheila; Verran, Joyce A.; Bianchi, Denise; Reminga, Jeff; Brewer, Barbara
2011-01-01
Purpose We used Organization Risk Analyzer (ORA), a dynamic network analysis tool, to identify patient care unit communication patterns associated with patient safety and quality outcomes. Although ORA had previously had limited use in healthcare, we felt it could effectively model communication on patient care units. Methods Using a survey methodology, we collected communication network data from nursing staff on seven patient care units on two different days. Patient outcome data were collected via a separate survey. Results of the staff survey were used to represent the communication networks for each unit in ORA. We then used ORA's analysis capability to generate communication metrics for each unit. ORA's visualization capability was used to better understand the metrics. Results We identified communication patterns that correlated with two safety (falls and medication errors) and five quality (e.g., symptom management, complex self-care, and patient satisfaction) outcome measures. Communication patterns differed substantially by shift. Conclusion The results demonstrate the utility of ORA for healthcare research and the relationship of nursing unit communication patterns to patient safety and quality outcomes. PMID:21536492
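A simplified sketch of one kind of metric such a network tool reports: communication network density, the fraction of possible ties actually present. The who-communicates-with-whom survey data below is made up; ORA computes many richer metrics than this.

```python
# Undirected communication ties reported in a hypothetical unit survey.
ties = {("rn1", "rn2"), ("rn2", "rn3"), ("rn1", "rn3"), ("rn3", "rn4")}

nodes = {n for tie in ties for n in tie}
possible = len(nodes) * (len(nodes) - 1) / 2   # undirected, no self-ties
density = len(ties) / possible                 # 4 ties out of 6 possible
```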
NASA Astrophysics Data System (ADS)
Gerik, A.; Kruhl, J. H.
2006-12-01
The quantitative analysis of patterns as a geometric arrangement of material domains with specific geometric or crystallographic properties such as shape, size or crystallographic orientation has been shown to be a valuable tool with a wide field of applications in geo- and material sciences. Pattern quantification allows an unbiased comparison of experimentally generated or theoretical patterns with patterns of natural origin. In addition to this, the application of different methods can also provide information about different pattern forming processes. This information includes the distribution of crystals in a matrix - to analyze, e.g., the nature and orientation of flow within a melt - or the governing shear strain regime at the point of time the pattern was formed, as well as the nature of fracture patterns at different scales, all of which are of great interest not only in structural and engineering geology, but also in material sciences. Different approaches to this problem have been discussed over the past fifteen years, yet only few of the methods were applied successfully, at least to single examples (e.g. Velde et al., 1990; Harris et al., 1991; Peternell et al., 2003; Volland & Kruhl, 2004). One of the reasons for this has been the high expenditure of time that was necessary to prepare and analyse the samples. To overcome this problem, a first selection of promising methods has been implemented in a growing collection of software tools: (1) The modifications that Harris et al. (1991) have suggested for the Cantor's dust method (Velde et al., 1990) and which have been applied by Volland & Kruhl (2004) to show the anisotropy in a breccia sample. (2) A map-counting method that uses local box-counting dimensions to map the inhomogeneity of a crystal distribution pattern. Peternell et al. (2003) have used this method to analyze the distribution of phenocrysts in a porphyric granite. 
(3) A modified perimeter method that relates the directional dependence of the perimeter of grain boundaries to the anisotropy of the pattern (Peternell et al., 2003). We have used the resulting new possibilities to analyze numerous patterns of natural, experimental and mathematical origin in order to determine the scope of applicability of the different methods, and we present these results along with an evaluation of their individual sensitivities and limitations. References: Harris, C., Franssen, R. & Loosveld, R. (1991): Fractal analysis of fractures in rocks: the Cantor's Dust method - comment. Tectonophysics 198: 107-111. Peternell, M., Andries, F. & Kruhl, J.H. (2003): Magmatic flow-pattern anisotropies - analyzed on the basis of a new 'map-counting' fractal geometry method. DRT Tectonics conference, St. Malo, Book of Abstracts. Velde, B., Dubois, J., Touchard, G. & Badri, A. (1990): Fractal analysis of fractures in rocks: the Cantor's Dust method. Tectonophysics 179: 345-352. Volland, S. & Kruhl, J.H. (2004): Anisotropy quantification: the application of fractal geometry methods on tectonic fracture patterns of a Hercynian fault zone in NW-Sardinia. Journal of Structural Geology 26: 1499-1510.
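The local box-counting dimension underlying the map-counting method can be sketched directly: cover a binary pattern with boxes of decreasing size and regress log(count) against log(1/size). A trivially space-filling pattern is used below as a sanity check (it should give a dimension near 2); this is an illustration, not the abstract's software.

```python
import numpy as np

def box_count_dimension(image, sizes):
    """Estimate the box-counting dimension of a 2D boolean pattern."""
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, image.shape[0], s):
            for j in range(0, image.shape[1], s):
                if image[i:i + s, j:j + s].any():   # box touches the pattern
                    n += 1
        counts.append(n)
    # Slope of log(count) vs log(1/size) is the dimension estimate.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

pattern = np.ones((64, 64), dtype=bool)   # filled square: dimension ~ 2
dim = box_count_dimension(pattern, [1, 2, 4, 8, 16])
```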
A bifractal nature of reticular patterns induced by oxygen plasma on polymer films
NASA Astrophysics Data System (ADS)
Bae, Junwan; Lee, I. J.
2015-05-01
Plasma etching was demonstrated to be a promising tool for generating self-organized nano-patterns on various commercial films. Unfortunately, the dynamic scaling approach toward a fundamental understanding of the formation and growth of the plasma-induced nano-structure has not always been straightforward. The temporal evolution of self-aligned nano-patterns may often evolve with an additional scale-invariance, which leads to breakdown of the well-established dynamic scaling law. The concept of a bifractal interface is successfully applied to reticular patterns induced by oxygen plasma on the surface of polymer films. The reticular pattern, composed of nano-size self-aligned protuberances and an underlying structure, develops two types of anomalous dynamic scaling characterized by super-roughening and intrinsic anomalous scaling, respectively. The diffusion and aggregation of short-cleaved chains under the plasma environment are responsible for the regular distribution of the nano-size protuberances. Remarkably, it is found that the dynamic roughening of the underlying structure is governed by a relaxation mechanism described by the Edwards-Wilkinson universality class with a conservative noise. The evidence for the basic phase, characterized by the negative roughness and growth exponents, has been elusive since its first theoretical consideration more than two decades ago.
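The central quantity in such scaling analyses is the interface width W(L), the rms height fluctuation over windows of size L; for a self-affine surface, the roughness exponent is the slope of log W against log L. The sketch below estimates it for a synthetic random-walk profile (expected exponent near 0.5); the data is invented and unrelated to the paper's measurements.

```python
import numpy as np

def width(h, L):
    """Mean rms height fluctuation over non-overlapping windows of size L."""
    return np.mean([np.std(h[i:i + L])
                    for i in range(0, len(h) - L + 1, L)])

rng = np.random.default_rng(1)
h = np.cumsum(rng.normal(size=4096))      # random-walk profile, alpha ~ 0.5

Ls = [8, 16, 32, 64, 128]
alpha, _ = np.polyfit(np.log(Ls),
                      np.log([width(h, L) for L in Ls]), 1)
```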
Shen, Li; Shao, Ningyi; Liu, Xiaochuan; Nestler, Eric
2014-04-15
Understanding the relationship between the millions of functional DNA elements and their protein regulators, and how they work in conjunction to manifest diverse phenotypes, is key to advancing our understanding of the mammalian genome. Next-generation sequencing technology is now used widely to probe these protein-DNA interactions and to profile gene expression at a genome-wide scale. As the cost of DNA sequencing continues to fall, the interpretation of the ever increasing amount of data generated represents a considerable challenge. We have developed ngs.plot - a standalone program to visualize enrichment patterns of DNA-interacting proteins at functionally important regions based on next-generation sequencing data. We demonstrate that ngs.plot is not only efficient but also scalable. We use a few examples to demonstrate that ngs.plot is easy to use and yet very powerful to generate figures that are publication ready. We conclude that ngs.plot is a useful tool to help fill the gap between massive datasets and genomic information in this era of big sequencing data.
2014-01-01
Background Understanding the relationship between the millions of functional DNA elements and their protein regulators, and how they work in conjunction to manifest diverse phenotypes, is key to advancing our understanding of the mammalian genome. Next-generation sequencing technology is now used widely to probe these protein-DNA interactions and to profile gene expression at a genome-wide scale. As the cost of DNA sequencing continues to fall, the interpretation of the ever increasing amount of data generated represents a considerable challenge. Results We have developed ngs.plot – a standalone program to visualize enrichment patterns of DNA-interacting proteins at functionally important regions based on next-generation sequencing data. We demonstrate that ngs.plot is not only efficient but also scalable. We use a few examples to demonstrate that ngs.plot is easy to use and yet very powerful to generate figures that are publication ready. Conclusions We conclude that ngs.plot is a useful tool to help fill the gap between massive datasets and genomic information in this era of big sequencing data. PMID:24735413
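Conceptually, the enrichment profiles ngs.plot draws are averages of read coverage in windows centred on features of interest. The toy below averages synthetic coverage around invented TSS positions; ngs.plot itself works from sequencing alignments and genome annotations.

```python
import numpy as np

# Synthetic genome-wide coverage with enrichment near three "TSS" positions.
coverage = np.zeros(1000)
tss_positions = [200, 500, 800]
for t in tss_positions:
    coverage[t - 10:t + 10] += 5.0        # enriched signal around each TSS

# Average profile in a window centred on each TSS ("metagene" average).
window = 50
profile = np.mean([coverage[t - window:t + window] for t in tss_positions],
                  axis=0)
```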
Design, analysis and testing of a new piezoelectric tool actuator for elliptical vibration turning
NASA Astrophysics Data System (ADS)
Lin, Jieqiong; Han, Jinguo; Lu, Mingming; Yu, Baojun; Gu, Yan
2017-08-01
A new piezoelectric tool actuator (PETA) for elliptical vibration turning has been developed based on a hybrid flexure hinge connection. Two double parallel four-bar linkage mechanisms and two right circular flexure hinges were chosen to guide the motion. The stiffnesses along the two input displacement directions were modeled using the principle of virtual work, and the kinematic analysis was conducted theoretically. Finite element analysis was used to carry out static and dynamic analyses. To evaluate the performance of the developed PETA, off-line experimental tests were carried out to investigate the step responses, motion strokes, resolutions, parasitic motions, and natural frequencies of the PETA along the two input directions. The relationship between input displacement and output displacement, as well as the tool tip's elliptical trajectory at different phase shifts, was analyzed. By using the developed PETA mechanism, micro-dimple patterns were generated as a preliminary application to demonstrate the feasibility and efficiency of PETA for elliptical vibration turning.
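The tool tip's elliptical trajectory arises from two harmonic inputs with a phase shift: x = A sin(ωt), y = B sin(ωt + φ). The amplitudes and phase below are illustrative, not the paper's measured values.

```python
import numpy as np

A, B = 10.0, 6.0                 # hypothetical input amplitudes (micrometres)
phi = np.pi / 2                  # phase shift between the two inputs

t = np.linspace(0, 2 * np.pi, 400)
x = A * np.sin(t)
y = B * np.sin(t + phi)          # phi = 90 deg gives an upright ellipse
```

Varying `phi` tilts and flattens the ellipse, which is how different phase shifts reshape the tool-tip path.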
TSVdb: a web-tool for TCGA splicing variants analysis.
Sun, Wenjie; Duan, Ting; Ye, Panmeng; Chen, Kelie; Zhang, Guanling; Lai, Maode; Zhang, Honghe
2018-05-29
Collaborative projects such as The Cancer Genome Atlas (TCGA) have generated various -omics and clinical data on cancer. Many computational tools have been developed to facilitate the study of the molecular characterization of tumors using data from the TCGA. Alternative splicing of a gene produces splicing variants, and accumulating evidence has revealed its essential role in cancer-related processes, implying the urgent need to discover tumor-specific isoforms and uncover their potential functions in tumorigenesis. We developed TSVdb, a web-based tool, to explore alternative splicing based on TCGA samples with 30 clinical variables from 33 tumors. TSVdb has an integrated and well-proportioned interface for visualization of the clinical data, gene expression, usage of exons/junctions and splicing patterns. Researchers can interpret the isoform expression variations between or across clinical subgroups and estimate the relationships between isoforms and patient prognosis. TSVdb is available at http://www.tsvdb.com, and the source code is available at https://github.com/wenjie1991/TSVdb. TSVdb will inspire oncologists and accelerate isoform-level advances in cancer research.
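The "isoform usage" idea such a tool visualizes can be sketched as each splicing variant's share of total gene expression per sample. The sample names and expression values below are invented; TSVdb computes these from TCGA quantifications.

```python
# Hypothetical isoform-level expression for one gene in two samples.
samples = {
    "tumor_1":  {"isoform_a": 30.0, "isoform_b": 70.0},
    "normal_1": {"isoform_a": 90.0, "isoform_b": 10.0},
}

def usage(expr):
    """Fraction of the gene's total expression carried by each isoform."""
    total = sum(expr.values())
    return {iso: v / total for iso, v in expr.items()}

tumor_usage = usage(samples["tumor_1"])    # isoform_b dominates in the tumour
```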
Machine learning and data science in soft materials engineering
NASA Astrophysics Data System (ADS)
Ferguson, Andrew L.
2018-01-01
In many branches of materials science it is now routine to generate data sets of such large size and dimensionality that conventional methods of analysis fail. Paradigms and tools from data science and machine learning can provide scalable approaches to identify and extract trends and patterns within voluminous data sets, perform guided traversals of high-dimensional phase spaces, and furnish data-driven strategies for inverse materials design. This topical review provides an accessible introduction to machine learning tools in the context of soft and biological materials by ‘de-jargonizing’ data science terminology, presenting a taxonomy of machine learning techniques, and surveying the mathematical underpinnings and software implementations of popular tools, including principal component analysis, independent component analysis, diffusion maps, support vector machines, and relative entropy. We present illustrative examples of machine learning applications in soft matter, including inverse design of self-assembling materials, nonlinear learning of protein folding landscapes, high-throughput antimicrobial peptide design, and data-driven materials design engines. We close with an outlook on the challenges and opportunities for the field.
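Of the tools surveyed, principal component analysis is the simplest to de-jargonize: it is an eigendecomposition of the data covariance, computed here via SVD of the mean-centered data. A minimal NumPy sketch with invented toy data:

```python
import numpy as np

def pca(data, n_components):
    """Project data (n_samples x n_features) onto its leading
    principal components via SVD of the mean-centered matrix."""
    centered = data - data.mean(axis=0)
    # Rows of vt are the principal axes, ordered by singular value.
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained_var = s**2 / (len(data) - 1)
    return centered @ vt[:n_components].T, explained_var

rng = np.random.default_rng(0)
# Toy data: 200 points scattered mostly along a single direction.
x = rng.normal(size=200)
data = np.column_stack([x, 0.5 * x + 0.1 * rng.normal(size=200)])
scores, var = pca(data, 1)
```

The explained variances fall off sharply when, as here, the data effectively occupy a lower-dimensional subspace.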
Application of bioinformatics in chronobiology research.
Lopes, Robson da Silva; Resende, Nathalia Maria; Honorio-França, Adenilda Cristina; França, Eduardo Luzía
2013-01-01
Bioinformatics and other well-established sciences, such as molecular biology, genetics, and biochemistry, provide a scientific approach for the analysis of data generated through "omics" projects that may be used in studies of chronobiology. The results of studies that apply these techniques demonstrate how they significantly aided the understanding of chronobiology. However, bioinformatics tools alone cannot eliminate the need for an understanding of the field of research or the data to be considered, nor can such tools replace analysts and researchers. It is often necessary to conduct an evaluation of the results of a data mining effort to determine the degree of reliability. To this end, familiarity with the field of investigation is necessary. It is evident that the knowledge that has been accumulated through chronobiology and the use of tools derived from bioinformatics has contributed to the recognition and understanding of the patterns and biological rhythms found in living organisms. Building on this foundation, chronobiology research is expected to yield new and important applications in the near future.
Tan, Amanda; Tan, Say Hoon; Vyas, Dhaval; Malaivijitnond, Suchinda; Gumert, Michael D.
2015-01-01
We explored variation in patterns of percussive stone-tool use on coastal foods by Burmese long-tailed macaques (Macaca fascicularis aurea) from two islands in Laem Son National Park, Ranong, Thailand. We catalogued variation into three hammering classes and 17 action patterns, after examining 638 tool-use bouts across 90 individuals. Hammering class was based on the stone surface used for striking food, namely face, point, and edge hammering. Action patterns were discriminated by tool material, hand use, posture, and striking motion. Hammering class was analyzed for associations with material and behavioural elements of tool use. Action patterns were not, owing to insufficient instances of most patterns. We collected 3077 scan samples from 109 macaques on Piak Nam Yai Island's coasts, to determine the proportion of individuals using each hammering class and action pattern. Point hammering was significantly more associated with sessile foods, smaller tools, faster striking rates, smoother recoil, unimanual use, and more varied striking direction, than were face and edge hammering, while both point and edge hammering were significantly more associated with precision gripping than face hammering. Edge hammering also showed distinct differences depending on whether such hammering was applied to sessile or unattached foods, resembling point hammering for sessile foods and face hammering for unattached foods. Point hammering and sessile edge hammering corresponded to prior descriptions of axe hammering, while face and unattached edge hammering corresponded to pound hammering. Analysis of scans showed that 80% of individuals used tools, each employing one to four different action patterns. The most common patterns were unimanual point hammering (58%), symmetrical-bimanual face hammering (47%) and unimanual face hammering (37%). Unimanual edge hammering was relatively frequent (13%), compared to the other thirteen rare action patterns (<5%).
We compare our findings with those from other stone-using primates, and discuss implications for further research. PMID:25970286
Novel two channel self-registering integrated macro inspection tool
NASA Astrophysics Data System (ADS)
Aiyer, Arun A.; Meloni, Mark; Kueny, Andrew; Whelan, Mike
2005-05-01
After Develop Inspection (ADI) of every wafer in a lot is quite appealing, since that provides an opportunity to rework defective wafers instead of scrapping them later on. To achieve this level of inspection in manufacturing, automated macro inspection tools with higher throughput, better detection sensitivity and repeatability are needed. Moreover, such an inspector will have to be located within the Coater Developer track. To have a smaller footprint inspector, one might consider spiral-scan of the wafer surface using an off-axis illumination beam. In product wafers, one comes across Manhattan geometry with L/S patterns that are usually smaller than or comparable to the illumination wavelength. Since the reflectance of such a surface depends on the incident polarization and the pattern orientation with respect to the plane of incidence, the acquired wafer surface image will have dark and bright regions. Occurrence of this type of inhomogeneity in the surface image is referred to as the bow tie effect. The bow tie feature degrades the S/N ratio of the acquired image and therefore reduces the inspector's detection sensitivity. In this paper we will describe a macro inspection tool based on a fast spiral-scan technique that eliminates the bow tie effect by propagating the illumination beam in two orthogonal planes of incidence. In addition, by employing two counter-propagating beams, the tool is shown to have the ability to generate real time defect images that are immune to noise from die-to-die thickness variations, die-to-die alignment errors, and under layer contributions.
Method of fabricating a 3-dimensional tool master
Bonivert, William D.; Hachman, John T.
2002-01-01
The invention is a method for the fabrication of an imprint tool master. The process begins with a metallic substrate. A layer of photoresist is placed onto the metallic substrate and a first image pattern mask is then aligned to the substrate. The mask pattern has opaque portions that block exposure light and "open" or transparent portions which transmit exposure light. The photoresist layer is then exposed to light transmitted through the "open" portions of the first image pattern mask and the mask is then removed. A second layer of photoresist can then be placed onto the first photoresist layer and a second image pattern mask may be placed on the second layer of photoresist. The second layer of photoresist is exposed to light, as before, and the second mask removed. The photoresist layers are developed simultaneously to produce a multi-level master mandrel upon which a conductive film is formed. A tool master can then be formed onto the conductive film. An imprint tool is then produced from the tool master. In one embodiment, nickel is electroplated onto the tool master to produce a three-dimensional imprint tool.
NASA Astrophysics Data System (ADS)
Hector, Scott
2005-11-01
The extension of optical projection lithography through immersion to patterning features with half pitch <=65 nm is placing greater demands on the mask. Strong resolution enhancement techniques (RETs), such as embedded and alternating phase shift masks and complex model-based optical proximity correction, are required to compensate for diffraction and limited depth of focus (DOF). To fabricate these masks, many new or upgraded tools are required to write patterns, measure feature sizes and placement, inspect for defects, review defect printability and repair defects on these masks. Beyond the significant technical challenges, suppliers of mask fabrication equipment face the challenge of being profitable in the small market for mask equipment while encountering significant R&D expenses to bring new generations of mask fabrication equipment to market. The total available market for patterned masks is estimated to be $2.5B to $2.9B per year. The patterned mask market is about 20% of the market size for lithography equipment and materials. The total available market for mask-making equipment is estimated to be about $800M per year. The largest R&D affordability issue arises for the makers of equipment for fabricating masks where total available sales are typically less than ten units per year. SEMATECH has used discounted cash flow models to predict the affordable R&D while maintaining industry accepted internal rates of return. The results have been compared to estimates of the total R&D cost to bring a new generation of mask equipment to market for various types of tools. The analysis revealed that affordability of the required R&D is a significant problem for many suppliers of mask-making equipment. Consortia such as SEMATECH and Selete have played an important role in cost sharing selected mask equipment and material development projects. Governments in the United States, in Europe and in Japan have also helped equipment suppliers with support for R&D. 
This paper summarizes the challenging business model for mask equipment suppliers and highlights government support for mask equipment and materials development.
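The discounted cash flow reasoning can be made concrete: at a given internal rate of return, the affordable R&D spend is bounded by the net present value of the program's projected cash inflows. All figures below are invented for illustration; they are not SEMATECH's numbers.

```python
def npv(rate, cash_flows):
    """Net present value: cash_flows[t] is received at the end of
    year t + 1 and discounted at the given annual rate."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

# Hypothetical mask-tool program: five years of net inflows ($M)
# from a market of fewer than ten units sold per year.
inflows = [4.0, 6.0, 6.0, 5.0, 3.0]
hurdle = 0.15  # assumed industry-accepted internal rate of return
affordable_rd = npv(hurdle, inflows)  # upper bound on R&D spend
```

When the total R&D cost of a new tool generation exceeds this bound, the project fails the hurdle rate, which is exactly the affordability problem the analysis identifies.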
Web Tools: The Second Generation
ERIC Educational Resources Information Center
Pascopella, Angela
2008-01-01
Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, abilities that fall under 21st-century skills. The second-generation tools are growing in popularity…
Inoue, Yukiko U; Morimoto, Yuki; Hoshino, Mikio; Inoue, Takayoshi
2018-07-01
Pax6 encodes a transcription factor that plays pivotal roles in eye development, early brain patterning, neocortical arealization, and so forth. Visualization of Pax6 expression dynamics in these events could offer numerous advantages to neurodevelopmental studies. While the CRISPR/Cas9 system has dramatically accelerated one-step generation of knock-out mice, establishment of gene-cassette knock-in mice via zygote injection has been considered insufficient due to its low efficiency. Recently, an improved CRISPR/Cas9 system for effective gene-cassette knock-in has been reported, in which the native forms of guide RNAs (crRNA and tracrRNA) assembled with recombinant Cas9 protein are directly delivered into mouse fertilized eggs. Here we apply this strategy to insert an IRES-EGFP-pA cassette into the Pax6 locus and achieve efficient targeted insertion of the 1.8 kb reporter gene. In the Pax6-IRES-EGFP mouse we have generated, EGFP-positive cells reside in the eyes and cerebellum as endogenous Pax6-expressing cells at postnatal day 2. At the early embryonic stages when the embryos are transparent, EGFP-positive regions can be easily identified without PCR-based genotyping, precisely recapitulating the endogenous Pax6 expression patterns. Remarkably, at E12.5, the graded expression patterns of Pax6 in the developing neocortex become recognizable in our knock-in mice, serving as a sufficiently sensitive and useful tool to precisely visualize neurodevelopmental processes. Copyright © 2018 Elsevier B.V. and Japan Neuroscience Society. All rights reserved.
Students' Problem Solving as Mediated by Their Cognitive Tool Use: A Study of Tool Use Patterns
ERIC Educational Resources Information Center
Liu, M.; Horton, L. R.; Corliss, S. B.; Svinicki, M. D.; Bogard, T.; Kim, J.; Chang, M.
2009-01-01
The purpose of this study was to use multiple data sources, both objective and subjective, to capture students' thinking processes as they were engaged in problem solving, examine the cognitive tool use patterns, and understand what tools were used and why they were used. The findings of this study confirmed previous research and provided clear…
Tool making, hand morphology and fossil hominins.
Marzke, Mary W
2013-11-19
Was stone tool making a factor in the evolution of human hand morphology? Is it possible to find evidence in fossil hominin hands for this capability? These questions are being addressed with increasingly sophisticated studies that are testing two hypotheses; (i) that humans have unique patterns of grip and hand movement capabilities compatible with effective stone tool making and use of the tools and, if this is the case, (ii) that there exist unique patterns of morphology in human hands that are consistent with these capabilities. Comparative analyses of human stone tool behaviours and chimpanzee feeding behaviours have revealed a distinctive set of forceful pinch grips by humans that are effective in the control of stones by one hand during manufacture and use of the tools. Comparative dissections, kinematic analyses and biomechanical studies indicate that humans do have a unique pattern of muscle architecture and joint surface form and functions consistent with the derived capabilities. A major remaining challenge is to identify skeletal features that reflect the full morphological pattern, and therefore may serve as clues to fossil hominin manipulative capabilities. Hominin fossils are evaluated for evidence of patterns of derived human grip and stress-accommodation features. PMID:24101624
Mullen, Jillian; Ryan, Stacy R; Mathias, Charles W; Dougherty, Donald M
2015-11-09
Alcohol use patterns that are hazardous to one's health are prevalent among DWI (driving while intoxicated) offenders and are a key predictor of recidivism. The aim of this program evaluation was to determine the feasibility and usability of implementing a computer-assisted screening, brief intervention and referral to treatment (SBIRT) program for DWI offenders to enable the identification of those in need of treatment services soon after arrest. Our treatment program consisted of a web-based, self-guided screening tool for assessing alcohol use patterns and generating a personalized feedback report that is then used to deliver a brief motivational intervention and, if needed, a referral to treatment. Between August and November 2014, all DWI offenders attending orientation for pre-trial supervision were assessed for eligibility. Of the 129 eligible offenders, 53.5 percent enrolled and the first 50 were asked to complete a usability and satisfaction questionnaire. The results demonstrated that the majority of those screened reported at-risk alcohol use patterns requiring referral to treatment. Clients reported high ratings of usability and satisfaction with the screening tool and personalized feedback report, which did not significantly differ depending on alcohol use patterns. There were relatively few technical difficulties, and the majority of clients reported high levels of satisfaction with the overall SBIRT program. Results of this program evaluation suggest that computer-assisted SBIRT may be successfully implemented within the criminal justice system for DWI offenders soon after arrest; however, further research is required to examine its effects on treatment utilization and recidivism.
NASA Astrophysics Data System (ADS)
Ohnuma, Hidetoshi; Kawahira, Hiroichi
1998-09-01
An automatic alternating phase shift mask (PSM) pattern layout tool has been newly developed. The tool is dedicated to embedded DRAM in logic devices, shrinking gate line width while improving line width controllability in lithography processes with design rules below 0.18 micrometers using KrF excimer laser exposure. The tool can create Levenson-type PSMs used in combination with a binary mask in a double exposure method for positive photoresist. Using graphs, the tool automatically creates alternating PSM patterns without introducing any phase conflicts. By applying it to actual embedded DRAM in logic cells, we have produced 0.16 micrometer gate resist patterns in both random logic and DRAM areas. The patterns were fabricated using two masks with the double exposure method. Gate line width has been well controlled under a practical exposure-focus window.
Extracting patterns of database and software usage from the bioinformatics literature
Duck, Geraint; Nenadic, Goran; Brass, Andy; Robertson, David L.; Stevens, Robert
2014-01-01
Motivation: As a natural consequence of being a computer-based discipline, bioinformatics has a strong focus on database and software development, but the volume and variety of resources are growing at unprecedented rates. An audit of database and software usage patterns could help provide an overview of developments in bioinformatics and community common practice, and comparing the links between resources through time could demonstrate both the persistence of existing software and the emergence of new tools. Results: We study the connections between bioinformatics resources and construct networks of database and software usage patterns, based on resource co-occurrence, that correspond to snapshots of common practice in the bioinformatics community. We apply our approach to pairings of phylogenetics software reported in the literature and argue that these could provide a stepping stone into the identification of scientific best practice. Availability and implementation: The extracted resource data, the scripts used for network generation and the resulting networks are available at http://bionerds.sourceforge.net/networks/ Contact: robert.stevens@manchester.ac.uk PMID:25161253
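The co-occurrence networks described here can be sketched with the standard library alone: every pair of resources mentioned in the same paper contributes one count to an undirected edge. The tool names below are illustrative, not data from the study:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_network(papers):
    """Build weighted undirected edges from per-paper resource lists:
    each co-mention of a pair adds one to that pair's edge weight."""
    edges = Counter()
    for tools in papers:
        # Sort so (a, b) and (b, a) collapse into one undirected edge.
        for pair in combinations(sorted(set(tools)), 2):
            edges[pair] += 1
    return edges

papers = [
    ["BLAST", "ClustalW", "PhyML"],
    ["BLAST", "PhyML"],
    ["BLAST", "ClustalW"],
]
network = cooccurrence_network(papers)
```

Comparing snapshots of such networks over time is what reveals the persistence of existing software and the emergence of new tools.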
Bursting Transition Dynamics Within the Pre-Bötzinger Complex
NASA Astrophysics Data System (ADS)
Duan, Lixia; Chen, Xi; Tang, Xuhui; Su, Jianzhong
The pre-Bötzinger complex of the mammalian brain stem plays a crucial role in respiratory rhythm generation. Neurons within the pre-Bötzinger complex have been found experimentally to yield different firing activities. In this paper, we study the spiking and bursting activities related to the respiratory rhythms in the pre-Bötzinger complex based on a mathematical model proposed by Butera. After deriving a one-dimensional first-recurrence map from the dynamical characteristics of the differential equations, we investigate the different bursting patterns of pre-Bötzinger complex neurons and their transitions, and we obtain conditions under which these transitions occur. These analytical results were verified through numerical simulations. We conclude that the one-dimensional map reproduces the rhythmic patterns of the Butera model and can be used as a simpler modeling tool to study fast-slow models such as the pre-Bötzinger complex neural circuit.
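As an illustration of the approach (using the logistic map as a generic stand-in, since the actual first-recurrence map must be derived from the Butera equations), iterating a one-dimensional map and measuring the period of its attractor is enough to classify rhythmic patterns:

```python
def attractor_period(f, x0=0.2, transient=2000, max_period=64, tol=1e-6):
    """Iterate a 1D map past its transient, then report the period of
    the orbit it settles onto (None if none found up to max_period)."""
    x = x0
    for _ in range(transient):
        x = f(x)
    orbit = [x]
    for _ in range(max_period):
        orbit.append(f(orbit[-1]))
    for p in range(1, max_period + 1):
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return None

# Stand-in map: the logistic map, whose period-doubling as r grows
# is loosely analogous to transitions between bursting patterns.
def logistic(r):
    return lambda x: r * x * (1 - x)
```

Sweeping the map's parameter and watching the detected period change mirrors the bursting-pattern transitions analyzed in the paper.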
PatScanUI: an intuitive web interface for searching patterns in DNA and protein data.
Blin, Kai; Wohlleben, Wolfgang; Weber, Tilmann
2018-05-02
Patterns in biological sequences frequently signify interesting features in the underlying molecule. Many tools exist to search for well-known patterns. Less support is available for exploratory analysis, where no well-defined patterns are known yet. PatScanUI (https://patscan.secondarymetabolites.org/) provides a highly interactive web interface to the powerful generic pattern search tool PatScan. The complex PatScan patterns are created in a drag-and-drop aware interface, allowing researchers to do rapid prototyping of the often complicated patterns useful for identifying features of interest.
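PatScan's pattern language is far richer (gaps, complements, weight matrices), but the core of a sequence pattern scan can be sketched as a sliding window allowing a bounded number of mismatches; the sequence and motif below are invented:

```python
def find_motif(sequence, motif, max_mismatches=1):
    """Return start positions where motif matches sequence with at
    most max_mismatches substitutions (Hamming distance)."""
    hits = []
    m = len(motif)
    for i in range(len(sequence) - m + 1):
        window = sequence[i:i + m]
        if sum(a != b for a, b in zip(window, motif)) <= max_mismatches:
            hits.append(i)
    return hits

seq = "ACGTTGACATGACGT"  # invented sequence
hits = find_motif(seq, "TGACA", max_mismatches=1)
```

Allowing mismatches is what makes such scans useful for exploratory analysis, where the motif itself is still uncertain.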
150-nm generation lithography equipment
NASA Astrophysics Data System (ADS)
Deguchi, Nobuyoshi; Uzawa, Shigeyuki
1999-07-01
Lithography by step-and-scan exposure is expected to be the mainstream for semiconductor manufacturing below 180 nm resolution patterns. We have developed a scanner for 150 nm features on either 200 mm or 300 mm wafers. For this system, the synchronous stage system has been redesigned, which makes it possible to improve imaging performance and overlay accuracy. A new 300 mm wafer stage enhances productivity while weighing almost the same as the stage for 200 mm wafers. The mainbody mechanical frame incorporates a reactive force receiver system to counter the inertial energy and vibrational issues associated with high speed wafer and reticle stage scanning. This report outlines the total system design, new technologies and performance data of the Canon FPA-5000ES2 step-and-scan exposure tool developed for 150 nm generation lithography.
The art and science of hyperbolic tessellations.
Van Dusen, B; Taylor, R P
2013-04-01
The visual impact of hyperbolic tessellations has captured artists' imaginations ever since M.C. Escher generated his Circle Limit series in the 1950s. The scaling properties generated by hyperbolic geometry are different to the fractal scaling properties found in nature's scenery. Consequently, prevalent interpretations of Escher's art emphasize the lack of connection with nature's patterns. However, a recent collaboration between the two authors proposed that Escher's motivation for using hyperbolic geometry was as a method to deliberately distort nature's rules. Inspired by this hypothesis, this year's cover artist, Ben Van Dusen, embeds natural fractals such as trees, clouds and lightning into a hyperbolic scaling grid. The resulting interplay of visual structure at multiple size scales suggests that hybridizations of fractal and hyperbolic geometries provide a rich compositional tool for artists.
NASA Astrophysics Data System (ADS)
Manousaki, D.; Panagiotopoulou, A.; Bizimi, V.; Haynes, M. S.; Love, S.; Kallergi, M.
2017-11-01
The purpose of this study was the generation of ground truth files (GTFs) of the breast ducts from 3D images of the Invenia™ Automated Breast Ultrasound System (ABUS) system (GE Healthcare, Little Chalfont, UK) and the application of these GTFs for the optimization of the imaging protocol and the evaluation of a computer aided detection (CADe) algorithm developed for automated duct detection. Six lactating, nursing volunteers were scanned with the ABUS before and right after breastfeeding their infants. An expert in breast ultrasound generated rough outlines of the milk-filled ducts in the transaxial slices of all image volumes and the final GTFs were created by using thresholding and smoothing tools in ImageJ. In addition, a CADe algorithm automatically segmented duct like areas and its results were compared to the expert’s GTFs by estimating true positive fraction (TPF) or % overlap. The CADe output differed significantly from the expert’s but both detected a smaller than expected volume of the ducts due to insufficient contrast (ducts were partially filled with milk), discontinuities, and artifacts. GTFs were used to modify the imaging protocol and improve the CADe method. In conclusion, electronic GTFs provide a valuable tool in the optimization of a tomographic imaging system, the imaging protocol, and the CADe algorithms. Their generation, however, is an extremely time consuming, strenuous process, particularly for multi-slice examinations, and alternatives based on phantoms or simulations are highly desirable.
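The true positive fraction used to compare the CADe output against the expert GTFs is simply the fraction of ground-truth voxels that the detection also covers. A sketch with toy voxel sets (the coordinates are invented):

```python
def true_positive_fraction(detected, ground_truth):
    """Fraction of ground-truth voxels that the detection overlaps."""
    if not ground_truth:
        raise ValueError("empty ground truth")
    return len(detected & ground_truth) / len(ground_truth)

# Toy 2D "voxel" sets standing in for duct segmentations.
gt = {(x, y) for x in range(4) for y in range(4)}      # 16 voxels
cad = {(x, y) for x in range(2, 6) for y in range(4)}  # shifted detection
tpf = true_positive_fraction(cad, gt)  # only x in {2, 3} overlaps
```

Representing segmentations as voxel-coordinate sets keeps the overlap computation independent of image size.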
Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke
2011-10-24
The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies. PMID:22024447
GREAT: a web portal for Genome Regulatory Architecture Tools.
Bouyioukos, Costas; Bucchini, François; Elati, Mohamed; Képès, François
2016-07-08
GREAT (Genome REgulatory Architecture Tools) is a novel web portal for tools designed to generate user-friendly and biologically useful analysis of genome architecture and regulation. The online tools of GREAT are freely accessible and compatible with essentially any operating system which runs a modern browser. GREAT is based on the analysis of genome layout, defined as the respective positioning of co-functional genes, and its relation with chromosome architecture and gene expression. GREAT tools allow users to systematically detect regular patterns along co-functional genomic features in an automatic way consisting of three individual steps and respective interactive visualizations. In addition to the complete analysis of regularities, GREAT tools enable the use of periodicity and position information for improving the prediction of transcription factor binding sites using a multi-view machine learning approach. The outcome of this integrative approach features a multivariate analysis of the interplay between the location of a gene and its regulatory sequence. GREAT results are plotted in web interactive graphs and are available for download either as individual plots, self-contained interactive pages or as machine readable tables for downstream analysis. The GREAT portal can be reached at the following URL https://absynth.issb.genopole.fr/GREAT and each individual GREAT tool is available for downloading. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
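The regularity detection at the heart of such layout analysis can be illustrated with a simple periodicity score: map gene positions onto a circle of circumference P and measure how tightly they cluster (the mean resultant length). The positions and period range below are invented, not GREAT's actual algorithm:

```python
import math

def period_score(positions, period):
    """Mean resultant length of positions wrapped onto a circle of
    circumference `period`: 1.0 means perfectly periodic placement."""
    angles = [2 * math.pi * (p % period) / period for p in positions]
    c = sum(math.cos(a) for a in angles) / len(angles)
    s = sum(math.sin(a) for a in angles) / len(angles)
    return math.hypot(c, s)

# Co-functional genes spaced roughly every 100 kb (positions in kb).
genes = [0, 100, 199, 301, 400, 502]
best = max(range(50, 151), key=lambda p: period_score(genes, p))
```

Scanning candidate periods and keeping the best-scoring one recovers the regular spacing even when individual positions jitter.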
A novel method for repeatedly generating speckle patterns used in digital image correlation
NASA Astrophysics Data System (ADS)
Zhang, Juan; Sweedy, Ahmed; Gitzhofer, François; Baroud, Gamal
2018-01-01
Speckle patterns play a key role in Digital Image Correlation (DIC) measurement, and generating an optimal speckle pattern has been a goal for decades. The usual method of generating a speckle pattern is to manually spray paint on the specimen. However, this makes it difficult to reproduce the optimal pattern for maintaining identical testing conditions and achieving consistent DIC results. This study proposed and evaluated a novel method using an atomization system to repeatedly generate speckle patterns. To verify the repeatability of the speckle patterns generated by this system, simulation and experimental studies were systematically performed. The results of both studies showed that the speckle patterns and, accordingly, the DIC measurements are highly accurate and repeatable using the proposed atomization system.
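The matching criterion that makes speckle quality matter can be illustrated with the zero-normalized cross-correlation (ZNCC) commonly used in DIC to compare a reference subset against a deformed one. This is a generic sketch, not the authors' code, and the intensity values are invented:

```python
import math

def zncc(f, g):
    """Zero-normalized cross-correlation between two equally sized
    intensity subsets (flattened to lists): 1.0 = identical up to
    brightness/contrast changes, ~0 = uncorrelated."""
    n = len(f)
    fm = sum(f) / n
    gm = sum(g) / n
    num = sum((a - fm) * (b - gm) for a, b in zip(f, g))
    den = math.sqrt(sum((a - fm) ** 2 for a in f) *
                    sum((b - gm) ** 2 for b in g))
    return num / den

ref = [10, 50, 30, 80, 20, 60]
deformed = [25, 105, 65, 165, 45, 125]    # same pattern, scaled + offset
print(round(zncc(ref, deformed), 6))      # → 1.0
```

Because ZNCC is invariant to affine lighting changes, subset matching succeeds only when the speckle texture itself is distinctive — hence the interest in reproducible patterns.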
Number Sense Made Simple Using Number Patterns
ERIC Educational Resources Information Center
Su, Hui Fang Huang; Marinas, Carol; Furner, Joseph
2011-01-01
This article highlights investigating intriguing number patterns utilising an emerging technology called the Square Tool. Mathematics teachers of grades K-12 will find the Square Tool useful in making connections and bridging the gap from the concrete to the abstract. Pattern recognition helps students discover various mathematical concepts. With…
Nelson, Carl A; Miller, David J; Oleynikov, Dmitry
2008-01-01
As modular systems come into the forefront of robotic telesurgery, streamlining the process of selecting surgical tools becomes an important consideration. This paper presents a method for optimal queuing of tools in modular surgical tool systems, based on patterns in tool-use sequences, in order to minimize time spent changing tools. The solution approach is to model the set of tools as a graph, with tool-change frequency expressed as edge weights in the graph, and to solve the Traveling Salesman Problem for the graph. In a set of simulations, this method has shown superior performance at optimizing tool arrangements for streamlining surgical procedures.
Making Scientific Data Usable and Useful
NASA Astrophysics Data System (ADS)
Satwicz, T.; Bharadwaj, A.; Evans, J.; Dirks, J.; Clark Cole, K.
2017-12-01
Transforming geological data into information that has broad scientific and societal impact is a process fraught with barriers. Data sets and tools are often reported to have poor user experiences (UX) that make scientific work more challenging than it needs to be. While many other technical fields have benefited from ongoing improvements to the UX of their tools (e.g., healthcare and financial services), scientists are faced with using tools that are labor-intensive and not intuitive. Our research team has been involved in a multi-year effort to understand and improve the UX of scientific tools and data sets. We use a User-Centered Design (UCD) process that involves naturalistic behavioral observation and other qualitative research methods adopted from Human-Computer Interaction (HCI) and related fields. Behavioral observation involves having users complete common tasks on data sets, tools, and websites to identify usability issues and understand their severity. We measure how successfully users complete tasks and diagnose the cause of any failures. Behavioral observation is paired with in-depth interviews in which users describe their process for generating results (from initial inquiry to final results). By asking detailed questions we unpack common patterns and challenges scientists experience while working with data. We have found that tools built using the UCD process can have a large impact on scientists' workflows and greatly reduce the time it takes to process data before analysis. It is often challenging to understand the organization and nuances of data across scientific fields. By better understanding how scientists work, we can create tools that make routine tasks less labor-intensive, make data easier to find, and solve common issues with discovering new data sets and engaging in interdisciplinary research.
There is a tremendous opportunity for advancing scientific knowledge and helping the public benefit from that work by creating intuitive, interactive, and powerful tools and resources for generating knowledge. The pathway to achieving that is through building a detailed understanding of users and their needs, then using this knowledge to inform the design of the data products, tools, and services scientists and non-scientists use to do their work.
Hierarchy of orofacial rhythms revealed through whisking and breathing
Moore, Jeffrey D.; Deschênes, Martin; Furuta, Takahiro; Huber, Daniel; Smear, Matthew C.; Demers, Maxime; Kleinfeld, David
2014-01-01
Whisking and sniffing are predominant aspects of exploratory behavior in rodents, yet the neural mechanisms that generate their motor patterns remain largely uncharacterized. We use anatomical, behavioral, electrophysiological, and pharmacological tools to demonstrate that these patterns are coordinated by respiratory centers in the ventral medulla. We delineate a distinct region in the ventral medulla that provides rhythmic input to the facial motoneurons that drive protraction of the vibrissae. Neuronal output from this region is reset at each inspiration by direct input from the preBötzinger complex, such that high frequency sniffing has a one-to-one coordination with whisking while basal respiration is accompanied by intervening whisks that occur between breaths. We conjecture that the respiratory nuclei, which project to other premotor regions for oral and facial control, function as a master clock for behaviors that coordinate with breathing. PMID:23624373
Teixidó, Meritxell; Belda, Ignasi; Zurita, Esther; Llorà, Xavier; Fabre, Myriam; Vilaró, Senén; Albericio, Fernando; Giralt, Ernest
2005-12-01
The use of high-throughput methods in drug discovery allows the generation and testing of a large number of compounds, but at the price of providing redundant information. Evolutionary combinatorial chemistry combines the selection and synthesis of biologically active compounds with artificial intelligence optimization methods, such as genetic algorithms (GA). Drug candidates for the treatment of central nervous system (CNS) disorders must overcome the blood-brain barrier (BBB). This paper reports a new genetic algorithm that searches for the optimal physicochemical properties for peptide transport across the blood-brain barrier. A first generation of peptides has been generated and synthesized. Because of the high content of N-methyl amino acids in most of these peptides, their syntheses were especially challenging, involving over-incorporations, deletions and diketopiperazine (DKP) formation. Distinct fragmentation patterns during peptide cleavage have been identified. The first generation of peptides has been studied by evaluation techniques such as immobilized artificial membrane chromatography (IAMC), a cell-based assay, log P(octanol/water) calculations, etc. Finally, a second generation has been proposed. (c) 2005 European Peptide Society and John Wiley & Sons, Ltd.
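A minimal genetic algorithm of the kind described (evolving peptide sequences toward target physicochemical properties) might look like the sketch below. The fitness function is a toy stand-in that targets a mean Kyte-Doolittle hydropathy value; the actual work optimized BBB-transport-related properties with experimentally informed scoring:

```python
import random

random.seed(1)
AA = "ACDEFGHIKLMNPQRSTVWY"
# Kyte-Doolittle hydropathy values, used here purely as a toy property.
HYDRO = {"A": 1.8, "C": 2.5, "D": -3.5, "E": -3.5, "F": 2.8, "G": -0.4,
         "H": -3.2, "I": 4.5, "K": -3.9, "L": 3.8, "M": 1.9, "N": -3.5,
         "P": -1.6, "Q": -3.5, "R": -4.5, "S": -0.8, "T": -0.7, "V": 4.2,
         "W": -0.9, "Y": -1.3}

def fitness(pep, target=2.0):
    """Toy fitness: closeness of mean hydropathy to a target value
    (a stand-in for 'optimal physicochemical properties')."""
    mean = sum(HYDRO[a] for a in pep) / len(pep)
    return -abs(mean - target)

def evolve(n=30, length=7, generations=40):
    pop = ["".join(random.choice(AA) for _ in range(length)) for _ in range(n)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: n // 2]                      # elitist selection
        children = []
        while len(children) < n - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)        # one-point crossover
            child = list(a[:cut] + b[cut:])
            if random.random() < 0.3:                # point mutation
                child[random.randrange(length)] = random.choice(AA)
            children.append("".join(child))
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, round(-fitness(best), 3))
```

In the paper's setting, "evaluating fitness" means synthesizing and assaying each generation, which is why the number of generations and population size must stay small.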
Pattern-set generation algorithm for the one-dimensional multiple stock sizes cutting stock problem
NASA Astrophysics Data System (ADS)
Cui, Yaodong; Cui, Yi-Ping; Zhao, Zhigang
2015-09-01
A pattern-set generation algorithm (PSG) for the one-dimensional multiple stock sizes cutting stock problem (1DMSSCSP) is presented. The solution process contains two stages. In the first stage, the PSG solves the residual problems repeatedly to generate the patterns in the pattern set, where each residual problem is solved by the column-generation approach, and each pattern is generated by solving a single large object placement problem. In the second stage, the integer linear programming model of the 1DMSSCSP is solved using a commercial solver, where only the patterns in the pattern set are considered. The computational results of benchmark instances indicate that the PSG outperforms existing heuristic algorithms and rivals the exact algorithm in solution quality.
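The "single large object placement problem" that generates each pattern is, in essence, an unbounded knapsack over one stock piece. Below is a minimal sketch of that pricing step; item lengths and values are illustrative, and in a full column-generation solver the values would be LP dual prices:

```python
def generate_pattern(stock_len, item_lens, values):
    """Pricing step of column generation for 1D cutting stock: find the
    single-stock cutting pattern (counts per item) of maximum total
    value. Solved as an unbounded-knapsack dynamic program."""
    best = [0.0] * (stock_len + 1)         # best[c] = max value in length c
    take = [-1] * (stock_len + 1)          # -1 = leave this unit as waste
    for c in range(1, stock_len + 1):
        best[c] = best[c - 1]
        for i, (L, v) in enumerate(zip(item_lens, values)):
            if L <= c and best[c - L] + v > best[c]:
                best[c], take[c] = best[c - L] + v, i
    counts = [0] * len(item_lens)          # reconstruct the pattern
    c = stock_len
    while c > 0:
        if take[c] == -1:
            c -= 1                         # waste one unit of length
        else:
            counts[take[c]] += 1
            c -= item_lens[take[c]]
    return counts, best[stock_len]

# Stock of length 100; pieces of length 45, 36, 31 valued at their length.
pattern, value = generate_pattern(100, [45, 36, 31], [45, 36, 31])
print(pattern, value)   # → [0, 1, 2] 98.0  (one 36 + two 31s, 2 waste)
```

The 1DMSSCSP repeats this over every stock size; the PSG's second stage then selects among the accumulated patterns with an integer program.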
Performance and stability of mask process correction for EBM-7000
NASA Astrophysics Data System (ADS)
Saito, Yasuko; Chen, George; Wang, Jen-Shiang; Bai, Shufeng; Howell, Rafael; Li, Jiangwei; Tao, Jun; VanDenBroeke, Doug; Wiley, Jim; Takigawa, Tadahiro; Ohnishi, Takayuki; Kamikubo, Takashi; Hara, Shigehiro; Anze, Hirohito; Hattori, Yoshiaki; Tamamushi, Shuichi
2010-05-01
In order to support complex optical masks today and EUV masks in the near future, it is critical to correct mask patterning errors with a magnitude of up to 20nm over a range of 2000nm at mask scale caused by short-range mask process proximity effects. A new mask process correction technology, MPC+, has been developed to achieve the target requirements for the next-generation node. In this paper, the accuracy and throughput performance of MPC+ technology is evaluated using the most advanced mask writing tool, the EBM-7000, and high-quality mask metrology. The accuracy of MPC+ is achieved by using a new comprehensive mask model. The results of through-pitch and through-linewidth linearity curves and error statistics for multiple pattern layouts (including both 1D and 2D patterns) are demonstrated and show post-correction accuracy of 2.34nm 3σ for through-pitch/through-linewidth linearity. By implementing faster mask model simulation and more efficient correction recipes, full-mask-area (100cm2) processing run time is less than 7 hours for the 32nm half-pitch technology node. From these results, it can be concluded that MPC+, with its higher precision and speed, is a practical technology for the 32nm node and future technology generations, including EUV, when used with advanced mask writing processes like the EBM-7000.
Xu, Yi-Hua; Manoharan, Herbert T; Pitot, Henry C
2007-09-01
The bisulfite genomic sequencing technique is one of the most widely used techniques for studying sequence-specific DNA methylation because it unambiguously reveals DNA methylation status at single-nucleotide resolution. One characteristic feature of the bisulfite genomic sequencing technique is that a number of sample sequence files are produced from a single DNA sample. The PCR products of bisulfite-treated DNA samples cannot be sequenced directly because they are heterogeneous in nature; they must therefore be cloned into suitable plasmids and then sequenced. This procedure generates an enormous number of sample DNA sequence files and also adds extra plasmid-derived bases to the sequences, which causes problems in the final sequence comparison. Finding the methylation status of each CpG in each sample sequence is not trivial, and CpG PatternFinder was developed for this purpose. The main functions of CpG PatternFinder are: (i) to analyze the reference sequence to obtain CpG and non-CpG C-residue position information; (ii) to tailor sample sequence files (deleting insertions and marking deletions) based on a ClustalW multiple-alignment configuration; (iii) to align sample sequence files with a reference file to obtain bisulfite conversion efficiency and CpG methylation status; and (iv) to produce graphics, highlighted aligned sequence text and a summary report that can be easily exported to the Microsoft Office suite. CpG PatternFinder is designed to operate cooperatively with BioEdit, freely available software. It can handle up to 100 sample DNA sequence files simultaneously, and the total CpG pattern analysis can be finished in minutes. CpG PatternFinder is an ideal software tool for DNA methylation studies that determine differential methylation patterns across a large number of individuals in a population.
Previously, we developed the CpG Analyzer program; CpG PatternFinder is our further effort to create software tools for DNA methylation studies.
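The core bookkeeping such a tool automates can be sketched as follows: given a reference and an already-aligned bisulfite-converted read, call each CpG and estimate conversion efficiency from non-CpG cytosines. This is a schematic re-implementation, not CpG PatternFinder's actual code, and the sequences are invented:

```python
def cpg_status(reference, sample):
    """Call methylation at each CpG of an aligned bisulfite read.
    Bisulfite converts unmethylated C -> T, while methylated CpG
    cytosines stay C. Non-CpG cytosines should all read T; their
    conversion rate estimates bisulfite efficiency. Assumes `sample`
    is aligned 1:1 to `reference` (no indels)."""
    calls, converted, non_cpg_c = {}, 0, 0
    for i, base in enumerate(reference):
        if base != "C":
            continue
        in_cpg = i + 1 < len(reference) and reference[i + 1] == "G"
        if in_cpg:
            calls[i] = "methylated" if sample[i] == "C" else "unmethylated"
        else:
            non_cpg_c += 1
            converted += sample[i] == "T"
    efficiency = converted / non_cpg_c if non_cpg_c else 1.0
    return calls, efficiency

ref = "ACGTCCGATCGAC"
# CpGs at positions 1 and 5 stay C (methylated); the CpG at 9 reads T.
read = "ACGTTCGATTGAT"
calls, eff = cpg_status(ref, read)
print(calls, round(eff, 2))
```

The real tool additionally tailors indels using a ClustalW alignment before this per-position comparison, and aggregates calls across up to 100 clones.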
ERIC Educational Resources Information Center
Papa, Frank; And Others
1990-01-01
In this study an artificial intelligence assessment tool used disease-by-feature frequency estimates to create disease prototypes for nine common causes of acute chest pain. The tool then used each subject's prototypes and a pattern-recognition-based decision-making mechanism to diagnose 18 myocardial infarction cases. (MLW)
UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces
NASA Technical Reports Server (NTRS)
Shiffman, Smadar; Degani, Asaf; Heymann, Michael
2004-01-01
In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.
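One simplified notion of interface "adequacy" in this line of work: the interface is inadequate if two machine states that are displayed identically respond to the same user action with outcomes that are displayed differently, so the user cannot predict the machine's behavior. The sketch below implements only that check; the machine, events and display names are hypothetical, and the actual verification procedure is richer:

```python
def interface_adequate(machine, abstraction):
    """Check a display abstraction of a state machine: inadequate if two
    identically displayed states react to the same event with
    differently displayed successors. `machine` maps
    (state, event) -> next_state; `abstraction` maps state -> display."""
    seen = {}   # (display, event) -> display of the successor state
    for (state, event), nxt in machine.items():
        key = (abstraction[state], event)
        shown_next = abstraction[nxt]
        if seen.setdefault(key, shown_next) != shown_next:
            return False   # same look + same action, different outcomes
    return True

# A two-mode device whose display hides the difference between modes.
machine = {("idle", "press"): "on",
           ("armed", "press"): "fault",
           ("on", "press"): "idle",
           ("fault", "press"): "idle"}
display = {"idle": "OFF", "armed": "OFF", "on": "ON", "fault": "ERROR"}
print(interface_adequate(machine, display))  # → False: OFF + press is ambiguous
```

Exposing the hidden mode on the display (e.g., showing "ARMED" instead of "OFF") makes the same machine pass the check, which is the kind of repair an interface-generation component would aim for.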
A survey on annotation tools for the biomedical literature.
Neves, Mariana; Leser, Ulf
2014-03-01
New approaches to biomedical text mining crucially depend on the existence of comprehensive annotated corpora. Such corpora, commonly called gold standards, are important for learning patterns or models during the training phase, for evaluating and comparing the performance of algorithms, and for better understanding the information sought by means of examples. Gold standards depend on human understanding and manual annotation of natural language text. This process is very time-consuming and expensive because it requires high intellectual effort from domain experts. Accordingly, the lack of gold standards is considered one of the main bottlenecks for developing novel text mining methods. This situation has led to the development of tools that support humans in annotating texts. Such tools should be intuitive to use, should support a range of different input formats, should include visualization of annotated texts and should generate an easy-to-parse output format. Today, a range of tools implementing some of these functionalities are available. Here, we present a comprehensive survey of tools for supporting annotation of biomedical texts. Altogether, we considered almost 30 tools, 13 of which were selected for an in-depth comparison. The comparison was performed using predefined criteria and was accompanied by hands-on experience whenever possible. Our survey shows that current tools can support many of the tasks in biomedical text annotation in a satisfying manner, but also that no tool can be considered a true comprehensive solution.
A New Individually Addressable Micro-LED Array for Photogenetic Neural Stimulation.
McGovern, B; Berlinguer Palmini, R; Grossman, N; Drakakis, E; Poher, V; Neil, M A A; Degenaar, P
2010-12-01
Here, we demonstrate the use of a micro light emitting diode (LED) array as a powerful tool for complex spatiotemporal control of photosensitized neurons. The array can generate arbitrary, 2-D, excitation patterns with millisecond and micrometer resolution. In particular, we describe an active matrix control address system to allow simultaneous control of 256 individual micro LEDs. We present the system optically integrated into a microscope environment and patch clamp electrophysiology. The results show that the emitters have sufficient radiance at the required wavelength to stimulate neurons expressing channelrhodopsin-2 (ChR2).
Electrical coupled Morris-Lecar neurons: From design to pattern analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Binczak, S.; Behdad, R.; Rossé, M.
2016-06-08
In this study, an experimental electronic neuron based on the Morris-Lecar model is presented, intended to serve as an experimental unit for studying the collective behavior of robustly coupled neurons. The circuit design is derived from the ionic currents of this model. Weak coupling of such neurons, simulated in Multisim software, can generate clusters that depend on the boundary conditions of the neurons and their initial conditions. For this study, we work in the region close to the fold bifurcation of limit cycles. In this region two limit cycles exist: one stable and one unstable.
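For readers unfamiliar with the underlying model, the two Morris-Lecar ODEs can be integrated directly. The forward-Euler sketch below uses a commonly cited textbook parameter set, not the circuit's component values, and the stimulus current is illustrative:

```python
import math

# Morris-Lecar parameters (a standard illustrative set; units: uF/cm^2,
# mS/cm^2, mV). Not the values realized in the paper's circuit.
C, g_L, g_Ca, g_K = 20.0, 2.0, 4.0, 8.0
V_L, V_Ca, V_K = -60.0, 120.0, -84.0
V1, V2, V3, V4, phi = -1.2, 18.0, 2.0, 30.0, 0.04

def step(V, w, I, dt=0.05):
    """One forward-Euler step of the two Morris-Lecar ODEs."""
    m_inf = 0.5 * (1 + math.tanh((V - V1) / V2))   # instantaneous Ca gate
    w_inf = 0.5 * (1 + math.tanh((V - V3) / V4))   # K recovery steady state
    tau_w = 1 / math.cosh((V - V3) / (2 * V4))
    dV = (I - g_L * (V - V_L) - g_Ca * m_inf * (V - V_Ca)
          - g_K * w * (V - V_K)) / C
    dw = phi * (w_inf - w) / tau_w
    return V + dt * dV, w + dt * dw

# Integrate 2 s of model time under a constant stimulus current.
V, w = -60.0, 0.0
trace = []
for _ in range(40000):
    V, w = step(V, w, I=90.0)
    trace.append(V)
print("V range (mV):", round(min(trace), 1), "to", round(max(trace), 1))
```

The fold-of-limit-cycles regime the authors exploit appears for particular parameter/current combinations; locating it requires a bifurcation analysis rather than this single simulation.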
FaceIt: face recognition from static and live video for law enforcement
NASA Astrophysics Data System (ADS)
Atick, Joseph J.; Griffin, Paul M.; Redlich, A. N.
1997-01-01
Recent advances in image and pattern recognition technology, especially face recognition, are leading to the development of a new generation of information systems of great value to the law enforcement community. With these systems it is now possible to pool and manage vast amounts of biometric intelligence, such as face and fingerprint records, and conduct computerized searches on them. We review one of the enabling technologies underlying these systems, the FaceIt face recognition engine, and discuss three applications that illustrate its benefits as a problem-solving technology and an efficient, cost-effective investigative tool.
King, M; Rauch, H G; Stein, D J; Brooks, S J
2014-11-15
Handgrip is a ubiquitous human movement that was critical in our evolution. However, the differences in brain activity between grip type (i.e. power or precision) and pattern (i.e. dynamic or static) are not fully understood. In order to address this, we performed Activation Likelihood Estimation (ALE) analysis between grip type and grip pattern using functional magnetic resonance imaging (fMRI) data. ALE provides a probabilistic summary of the BOLD response in hundreds of subjects, which is often beyond the scope of a single fMRI experiment. We collected data from 28 functional magnetic resonance data sets, which included a total of 398 male and female subjects. Using ALE, we analyzed the BOLD response during power, precision, static and dynamic grip across a range of forces and ages in right-handed healthy individuals without physical impairment or cardiovascular or neurological dysfunction, using a variety of grip tools, feedback and experimental training. Power grip generates unique activation in the postcentral gyrus (areas 1 and 3b), and precision grip generates unique activation in the supplementary motor area (SMA, area 6) and precentral gyrus (area 4a). Dynamic handgrip generates unique activation in the precentral gyrus (area 4p) and SMA (area 6); of particular interest, both dynamic and static grip share activation in area 2 of the postcentral gyrus, an area implicated in the evolution of handgrip. According to effect size analysis, precision and dynamic grip generate stronger activity than power and static grip, respectively. Our study demonstrates specific differences between grip type and pattern. However, there was a large degree of overlap in the pre- and postcentral gyri, SMA and areas of the frontal-parietal-cerebellar network, which indicates that other mechanisms are potentially involved in regulating handgrip.
Further, our study provides empirically based regions of interest, which can be downloaded here within, that can be used to more effectively study power grip in a range of populations and conditions. Copyright © 2014 Elsevier Inc. All rights reserved.
Cork is used to make tooling patterns and molds
NASA Technical Reports Server (NTRS)
Hoffman, F. J.
1965-01-01
Sheet and waste cork are cemented together to provide a tooling pattern or mold. The cork form withstands moderately high temperatures under vacuum or pressure with minimum expansion, shrinkage, or distortion.
A linguistic rule-based approach to extract drug-drug interactions from pharmacological documents
2011-01-01
Background A drug-drug interaction (DDI) occurs when one drug influences the level or activity of another drug. The increasing volume of the scientific literature overwhelms health care professionals trying to keep up to date with all published studies on DDI. Methods This paper describes a hybrid linguistic approach to DDI extraction that combines shallow parsing and syntactic simplification with pattern matching. Appositions and coordinate structures are interpreted based on shallow syntactic parsing provided by the UMLS MetaMap tool (MMTx). Subsequently, complex and compound sentences are broken down into clauses from which simple sentences are generated by a set of simplification rules. A pharmacist defined a set of domain-specific lexical patterns to capture the most common expressions of DDI in texts. These lexical patterns are matched against the generated sentences in order to extract DDIs. Results We have performed different experiments to analyze the performance of the different processes. The lexical patterns achieve a reasonable precision (67.30%) but very low recall (14.07%). The inclusion of appositions and coordinate structures helps to improve the recall (25.70%); however, precision is lower (48.69%). The detection of clauses does not improve the performance. Conclusions Information Extraction (IE) techniques can provide an interesting way of reducing the time spent by health care professionals on reviewing the literature. Nevertheless, no previous approach had been developed to extract DDIs from texts. To the best of our knowledge, this work proposes the first integral solution for the automatic extraction of DDIs from biomedical texts. PMID:21489220
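The lexical-pattern matching stage can be sketched with regular expressions over simplified sentences. The patterns below are illustrative stand-ins for the pharmacist-defined ones, and the drug names are examples:

```python
import re

# Domain-style lexical patterns for drug-drug interactions, modeled on
# (but not copied from) the kind of expressions the paper describes.
PATTERNS = [
    r"(?P<d1>\w+) (?:significantly )?increase[sd]? the (?:level|effect)s? of (?P<d2>\w+)",
    r"(?P<d1>\w+) should not be (?:used|administered) with (?P<d2>\w+)",
    r"interaction between (?P<d1>\w+) and (?P<d2>\w+)",
]

def extract_ddis(sentence):
    """Return (drug1, drug2) pairs matched by any lexical pattern."""
    hits = []
    for pat in PATTERNS:
        for m in re.finditer(pat, sentence, flags=re.IGNORECASE):
            hits.append((m.group("d1").lower(), m.group("d2").lower()))
    return hits

print(extract_ddis("Ketoconazole significantly increases the levels of midazolam."))
# → [('ketoconazole', 'midazolam')]
print(extract_ddis("Aspirin should not be used with warfarin."))
# → [('aspirin', 'warfarin')]
```

The paper's contribution is precisely that raw patterns like these have low recall on real prose, which is why appositions, coordination, and clause simplification are resolved first.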
Multiscale musculoskeletal modelling, data–model fusion and electromyography-informed modelling
Zhang, J.; Heidlauf, T.; Sartori, M.; Besier, T.; Röhrle, O.; Lloyd, D.
2016-01-01
This paper proposes methods and technologies that advance the state of the art for modelling the musculoskeletal system across the spatial and temporal scales; and storing these using efficient ontologies and tools. We present population-based modelling as an efficient method to rapidly generate individual morphology from only a few measurements and to learn from the ever-increasing supply of imaging data available. We present multiscale methods for continuum muscle and bone models; and efficient mechanostatistical methods, both continuum and particle-based, to bridge the scales. Finally, we examine both the importance that muscles play in bone remodelling stimuli and the latest muscle force prediction methods that use electromyography-assisted modelling techniques to compute musculoskeletal forces that best reflect the underlying neuromuscular activity. Our proposal is that, in order to have a clinically relevant virtual physiological human, (i) bone and muscle mechanics must be considered together; (ii) models should be trained on population data to permit rapid generation and use underlying principal modes that describe both muscle patterns and morphology; and (iii) these tools need to be available in an open-source repository so that the scientific community may use, personalize and contribute to the database of models. PMID:27051510
The FaceBase Consortium: a comprehensive resource for craniofacial researchers
Brinkley, James F.; Fisher, Shannon; Harris, Matthew P.; Holmes, Greg; Hooper, Joan E.; Wang Jabs, Ethylin; Jones, Kenneth L.; Kesselman, Carl; Klein, Ophir D.; Maas, Richard L.; Marazita, Mary L.; Selleri, Licia; Spritz, Richard A.; van Bakel, Harm; Visel, Axel; Williams, Trevor J.; Wysocka, Joanna
2016-01-01
The FaceBase Consortium, funded by the National Institute of Dental and Craniofacial Research, National Institutes of Health, is designed to accelerate understanding of craniofacial developmental biology by generating comprehensive data resources to empower the research community, exploring high-throughput technology, fostering new scientific collaborations among researchers and human/computer interactions, facilitating hypothesis-driven research and translating science into improved health care to benefit patients. The resources generated by the FaceBase projects include a number of dynamic imaging modalities, genome-wide association studies, software tools for analyzing human facial abnormalities, detailed phenotyping, anatomical and molecular atlases, global and specific gene expression patterns, and transcriptional profiling over the course of embryonic and postnatal development in animal models and humans. The integrated data visualization tools, faceted search infrastructure, and curation provided by the FaceBase Hub offer flexible and intuitive ways to interact with these multidisciplinary data. In parallel, the datasets also offer unique opportunities for new collaborations and training for researchers coming into the field of craniofacial studies. Here, we highlight the focus of each spoke project and the integration of datasets contributed by the spokes to facilitate craniofacial research. PMID:27287806
Haptic feedback can provide an objective assessment of arthroscopic skills.
Chami, George; Ward, James W; Phillips, Roger; Sherman, Kevin P
2008-04-01
The outcome of arthroscopic procedures is related to the surgeon's skills in arthroscopy. Currently, evaluation of such skills relies on direct observation by a surgeon trainer. This type of assessment, by its nature, is subjective and time-consuming. The aim of our study was to identify whether haptic information generated from arthroscopic tools could distinguish between skilled and less skilled surgeons. A standard arthroscopic probe was fitted with a force/torque sensor. The probe was used by five surgeons with different levels of experience in knee arthroscopy performing 11 different tasks in 10 standard knee arthroscopies. The force/torque data from the hand and tool interface were recorded and synchronized with a video recording of the procedure. The torque magnitude and patterns generated were analyzed and compared. A computerized system was used to analyze the force/torque signature based on general principles for quality of performance using such measures as economy in movement, time efficiency, and consistency in performance. The results showed a considerable correlation between three haptic parameters and the surgeon's experience, which could be used in an automated objective assessment system for arthroscopic surgery. Level II, diagnostic study. See the Guidelines for Authors for a complete description of levels of evidence.
NASA Astrophysics Data System (ADS)
Son, Yurak; Kamano, Takuya; Yasuno, Takashi; Suzuki, Takayuki; Harada, Hironobu
This paper describes the generation of adaptive gait patterns using new Central Pattern Generators (CPGs), including motor dynamic models, for a quadruped robot under various environments. The CPGs act as flexible oscillators of the joints and produce the desired joint angles. The CPGs are mutually interconnected, and the sets of their coupling parameters are adjusted by a genetic algorithm so that the quadruped robot can realize stable and adequate gait patterns. As a result, suitable CPG networks are obtained not only for a straight walking gait pattern but also for rotation gait patterns. Experimental results demonstrate that the proposed CPG networks are effective in automatically adjusting adaptive gait patterns for the tested quadruped robot under various environments. Furthermore, target tracking control based on image processing is achieved by combining the generated gait patterns.
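The flavor of a CPG network of mutually coupled oscillators locking to a gait's phase relationships can be sketched with Kuramoto-style phase oscillators. Here the target phase offsets are supplied directly rather than tuned by a genetic algorithm, and all parameters are illustrative:

```python
import math

def cpg_gait(phase_offsets, steps=5000, dt=0.01, freq=1.0, k=2.0):
    """Four phase oscillators (one per leg) coupled so their phase
    differences lock to `phase_offsets` (fractions of a cycle). A
    stand-in for a GA-tuned CPG network: the coupling targets are
    given directly instead of being evolved."""
    n = len(phase_offsets)
    theta = [0.1 * i for i in range(n)]      # arbitrary initial phases
    for _ in range(steps):
        new = []
        for i in range(n):
            # Kuramoto coupling in offset-shifted coordinates.
            coup = sum(math.sin((theta[j] - 2 * math.pi * phase_offsets[j])
                                - (theta[i] - 2 * math.pi * phase_offsets[i]))
                       for j in range(n))
            new.append(theta[i] + dt * (2 * math.pi * freq + k * coup))
        theta = new
    # Report each leg's locked phase lag relative to leg 0, in cycles.
    return [round(((t - theta[0]) / (2 * math.pi)) % 1, 2) for t in theta]

# Walk-like gait: four legs a quarter-cycle apart.
print(cpg_gait([0.0, 0.5, 0.75, 0.25]))
```

In the paper, the oscillator outputs drive joint-angle references through motor dynamic models, and the coupling parameters themselves are what the genetic algorithm searches over.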
Genetic testing in congenital heart disease: A clinical approach
Chaix, Marie A; Andelfinger, Gregor; Khairy, Paul
2016-01-01
Congenital heart disease (CHD) is the most common type of birth defect. Traditionally, a polygenic model defined by the interaction of multiple genes and environmental factors was hypothesized to account for different forms of CHD. It is now understood that the contribution of genetics to CHD extends beyond a single unified paradigm. For example, monogenic models and chromosomal abnormalities have been associated with various syndromic and non-syndromic forms of CHD. In such instances, genetic investigation and testing may potentially play an important role in clinical care. A family tree with a detailed phenotypic description serves as the initial screening tool to identify potentially inherited defects and to guide further genetic investigation. The selection of a genetic test is contingent upon the particular diagnostic hypothesis generated by clinical examination. Genetic investigation in CHD may carry the potential to improve prognosis by yielding valuable information with regards to personalized medical care, confidence in the clinical diagnosis, and/or targeted patient follow-up. Moreover, genetic assessment may serve as a tool to predict recurrence risk, define the pattern of inheritance within a family, and evaluate the need for further family screening. In some circumstances, prenatal or preimplantation genetic screening could identify fetuses or embryos at high risk for CHD. Although genetics may appear to constitute a highly specialized sector of cardiology, basic knowledge regarding inheritance patterns, recurrence risks, and available screening and diagnostic tools, including their strengths and limitations, could assist the treating physician in providing sound counsel. PMID:26981213
GO Explorer: A gene-ontology tool to aid in the interpretation of shotgun proteomics data.
Carvalho, Paulo C; Fischer, Juliana Sg; Chen, Emily I; Domont, Gilberto B; Carvalho, Maria Gc; Degrave, Wim M; Yates, John R; Barbosa, Valmir C
2009-02-24
Spectral counting is a shotgun proteomics approach comprising the identification and relative quantitation of thousands of proteins in complex mixtures. However, this strategy generates bewildering amounts of data whose biological interpretation is a challenge. Here we present a new algorithm, termed GO Explorer (GOEx), that leverages the gene ontology (GO) to aid in the interpretation of proteomic data. GOEx stands out because it combines data from protein fold changes with GO over-representation statistics to help draw conclusions. Moreover, it is tightly integrated within the PatternLab for Proteomics project and thus lies within a complete computational environment that provides parsers and pattern recognition tools designed for spectral counting. GOEx offers three independent ways to query data: an interactive directed acyclic graph, a specialist mode where keywords can be searched, and an automatic search. Its usefulness is demonstrated by applying it to help interpret the effects of perillyl alcohol, a natural chemotherapeutic agent, on glioblastoma multiforme cell lines (A172). We used a new multi-surfactant shotgun proteomic strategy and identified more than 2600 proteins; GOEx pinpointed key sets of differentially expressed proteins related to the cell cycle, alcohol catabolism, the Ras pathway, apoptosis, and stress response, to name a few. GOEx facilitates organism-specific studies by leveraging GO and providing a rich graphical user interface. It is a simple-to-use tool, specialized for biologists who wish to analyze spectral counting data from shotgun proteomics. GOEx is available at http://pcarvalho.com/patternlab.
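The GO over-representation statistic that such tools combine with fold changes is typically a hypergeometric tail probability. A minimal sketch follows; the counts are invented, and GOEx's exact procedure may differ:

```python
from math import comb

def go_enrichment_p(hits, picked, annotated, total):
    """Hypergeometric upper-tail probability of seeing >= `hits`
    term-annotated proteins when `picked` proteins are selected from
    `total`, of which `annotated` carry the GO term. The standard
    over-representation statistic (before multiple-testing correction)."""
    p = 0.0
    for k in range(hits, min(picked, annotated) + 1):
        p += (comb(annotated, k) * comb(total - annotated, picked - k)
              / comb(total, picked))
    return p

# Invented counts: 12 of 50 differentially expressed proteins carry a
# term that annotates 80 of the 2600 identified proteins overall.
p = go_enrichment_p(12, 50, 80, 2600)
print(f"{p:.3g}")
```

A real analysis would repeat this across all GO terms and correct for multiple testing, and GOEx additionally weighs in the direction and size of each protein's fold change.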
Systems Prototyping with Fourth Generation Tools.
ERIC Educational Resources Information Center
Sholtys, Phyllis
1983-01-01
The development of information systems using an engineering approach that combines traditional programming techniques and fourth-generation software tools is described. Fourth-generation application tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)
NASA Astrophysics Data System (ADS)
Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.
2016-11-01
Modelling the family of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, aims to determine the generation quality for an imposed kinematics of the relative motion between tool and blank. In this way, the geometrical generation error can be determined as a component of the total error. Modelling the generation process highlights potential errors of the generating tool so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, the method of “relative generating trajectories”. The analytical foundations are presented, together with applications to known models of rack-gear type tools used on Maag gear-cutting machines.
Parenting Styles and Youth Well-Being across Immigrant Generations
ERIC Educational Resources Information Center
Driscoll, Anne K.; Russell, Stephen T.; Crockett, Lisa J.
2008-01-01
This study examines generational patterns of parenting styles, the relationships between parenting styles and adolescent well-being among youth of Mexican origin, and the role of generational parenting style patterns in explaining generational patterns in youth behavior (delinquency and alcohol problems) and psychological well-being (depression…
rVISTA 2.0: Evolutionary Analysis of Transcription Factor Binding Sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loots, G G; Ovcharenko, I
2004-01-28
Identifying and characterizing the patterns of DNA cis-regulatory modules represents a challenge that has the potential to reveal the regulatory language the genome uses to dictate transcriptional dynamics. Several studies have demonstrated that regulatory modules are under positive selection and therefore are often conserved between related species. Using this evolutionary principle we have created a comparative tool, rVISTA, for analyzing the regulatory potential of noncoding sequences. The rVISTA tool combines transcription factor binding site (TFBS) predictions, sequence comparisons and cluster analysis to identify noncoding DNA regions that are highly conserved and present in a specific configuration within an alignment. Here we present the newly developed version 2.0 of the rVISTA tool, which can process alignments generated by both the zPicture and PipMaker alignment programs or use pre-computed pairwise alignments of seven vertebrate genomes available from the ECR Browser. The rVISTA web server is closely interconnected with the TRANSFAC database, allowing users to either search for matrices present in the TRANSFAC library collection or search for user-defined consensus sequences. The rVISTA tool is publicly available at http://rvista.dcode.org/.
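Consensus-based TFBS matching of the kind applied to conserved regions can be sketched by expanding IUPAC degeneracy codes into a regular expression. The consensus and sequence below are illustrative, not TRANSFAC entries, and only a subset of the IUPAC alphabet is shown.

```python
import re

# Minimal IUPAC-to-regex table (subset; the full code includes K, M, B, D, H, V).
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "[AG]", "Y": "[CT]",
         "W": "[AT]", "S": "[CG]", "N": "[ACGT]"}

def scan(sequence, consensus):
    """Return start positions of (non-overlapping) consensus matches."""
    pattern = "".join(IUPAC[b] for b in consensus)
    return [m.start() for m in re.finditer(pattern, sequence)]

hits = scan("TTGACGTCATTTGACGTCAA", "TGACGTCA")  # a CREB-like palindromic site
```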
Aggregation Tool to Create Curated Data albums to Support Disaster Recovery and Response
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Kulkarni, A.; Maskey, M.; Li, X.; Flynn, S.
2014-12-01
Economic losses due to natural hazards are estimated to be around 6-10 billion dollars annually for the U.S., and this number keeps increasing every year. This increase has been attributed to population growth and migration to more hazard-prone locations. As this trend continues, in concert with shifts in weather patterns caused by climate change, it is anticipated that losses associated with natural disasters will keep growing substantially. One of the challenges disaster response and recovery analysts face is to quickly find, access and utilize the vast variety of relevant geospatial data collected by different federal agencies. Often, analysts are familiar with a limited set of specific datasets and are unaware of or unfamiliar with a large quantity of other useful resources. Finding airborne or satellite data useful for a natural disaster event often requires a time-consuming search through web pages and data archives. The search process could be made much more efficient and productive if a tool could go beyond a typical search engine and provide not just links to web sites but actual links to specific data relevant to the natural disaster, parse unstructured reports for useful information nuggets, and gather other related reports, summaries, news stories, and images. This presentation will describe a semantic aggregation tool developed to address a similar problem for Earth Science researchers. This tool provides automated curation and creates "Data Albums" to support case studies. The generated "Data Albums" are compiled collections of information related to a specific science topic or event, containing links to relevant data files (granules) from different instruments; tools and services for visualization and analysis; information about the event contained in news reports; and images or videos to supplement research analysis. An ontology-based relevancy-ranking algorithm drives the curation of relevant data sets for a given event.
This tool is now being used to generate a catalog of case studies focusing on hurricanes and severe storms.
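An ontology-based relevancy ranking of the kind described above can be sketched as a weighted overlap between a resource's ontology terms and the event's terms. The terms, weights and resource names below are invented for illustration, not the tool's actual ontology.

```python
# Hypothetical event profile: ontology terms with relevance weights.
EVENT_TERMS = {"hurricane": 3.0, "precipitation": 2.0, "wind": 2.0, "flood": 1.5}

def relevance(resource_terms):
    """Score a resource by the summed weight of its terms that match the event."""
    return sum(EVENT_TERMS.get(t, 0.0) for t in resource_terms)

resources = {
    "TRMM rainfall granules": {"precipitation", "flood"},
    "News summary": {"hurricane"},
    "Aerosol index": {"aerosol"},
}
ranked = sorted(resources, key=lambda r: relevance(resources[r]), reverse=True)
```

A real implementation would traverse the ontology to credit related (not just identical) terms; the weighted-overlap core is the same idea.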
NASA Astrophysics Data System (ADS)
Nagahara, Seiji; Carcasi, Michael; Shiraishi, Gosuke; Nakagawa, Hisashi; Dei, Satoshi; Shiozawa, Takahiro; Nafus, Kathleen; De Simone, Danilo; Vandenberghe, Geert; Stock, Hans-Jürgen; Küchler, Bernd; Hori, Masafumi; Naruoka, Takehiko; Nagai, Tomoki; Minekawa, Yukie; Iseki, Tomohiro; Kondo, Yoshihiro; Yoshihara, Kosuke; Kamei, Yuya; Tomono, Masaru; Shimada, Ryo; Biesemans, Serge; Nakashima, Hideo; Foubert, Philippe; Buitrago, Elizabeth; Vockenhuber, Michaela; Ekinci, Yasin; Oshima, Akihiro; Tagawa, Seiichi
2017-03-01
A new type of Photosensitized Chemically Amplified Resist (PSCAR), "PSCAR 2.0," is introduced in this paper. PSCAR 2.0 is composed of a protected polymer, a photo acid generator which can be photosensitized (PS-PAG), a photo-decomposable base (quencher) which can be photosensitized (PS-PDB) and a photosensitizer precursor (PP). With PSCAR 2.0, a photosensitizer (PS) is generated by an extreme ultraviolet (EUV) pattern exposure. Then, during a subsequent flood exposure, the PS selectively photosensitizes the EUV-exposed areas through the decomposition of PS-PDB in addition to the decomposition of PS-PAG. Because these pattern-exposed areas gain additional acid and a reduced quencher concentration, the initial quencher loading in PSCAR 2.0 can be increased to reach the same target critical dimensions (CD). The quencher loading is optimized simultaneously with the UV flood exposure dose to achieve the best lithographic performance and resolution. In this work, PSCAR performance with different quenchers is examined by simulation and by exposure experiments with 16 nm half-pitch (HP) line/space (L/S, 1:1) patterns. In our simulations comparing resists with different quencher types, the best performance was achieved by PSCAR 2.0 using PS-PDB, which gave the highest chemical gradient and thus the lowest line width roughness (LWR). PSCAR 2.0 performance has furthermore been confirmed on ASML's NXE:3300 with TEL's standalone pre-alpha flood exposure tool at imec. The initial PSCAR 2.0 patterning results on the NXE:3300 showed accelerated photosensitization with PS-PDB. From these results, we conclude that the dual sensitization of PS-PAG and PS-PDB in PSCAR 2.0 has the potential to realize significantly improved resist performance in EUV lithography.
The Neural Basis of Mark Making: A Functional MRI Study of Drawing
Yuan, Ye; Brown, Steven
2014-01-01
Compared to most other forms of visually-guided motor activity, drawing is unique in that it “leaves a trail behind” in the form of the emanating image. We took advantage of an MRI-compatible drawing tablet in order to examine both the motor production and perceptual emanation of images. Subjects participated in a series of mark making tasks in which they were cued to draw geometric patterns on the tablet's surface. The critical comparison was between when visual feedback was displayed (image generation) versus when it was not (no image generation). This contrast revealed an occipito-parietal stream involved in motion-based perception of the emerging image, including areas V5/MT+, LO, V3A, and the posterior part of the intraparietal sulcus. Interestingly, when subjects passively viewed animations of visual patterns emerging on the projected surface, all of the sensorimotor network involved in drawing was strongly activated, with the exception of the primary motor cortex. These results argue that the origin of the human capacity to draw and write involves not only motor skills for tool use but also motor-sensory links between drawing movements and the visual images that emanate from them in real time. PMID:25271440
QuEST for malware type-classification
NASA Astrophysics Data System (ADS)
Vaughan, Sandra L.; Mills, Robert F.; Grimaila, Michael R.; Peterson, Gilbert L.; Oxley, Mark E.; Dube, Thomas E.; Rogers, Steven K.
2015-05-01
Current cyber-related security and safety risks are unprecedented, due in no small part to information overload and skilled cyber-analyst shortages. Advances in decision support and Situation Awareness (SA) tools are required to support analysts in risk mitigation. Inspired by human intelligence, research in Artificial Intelligence (AI) and Computational Intelligence (CI) has provided successful engineering solutions in complex domains, including cyber. Current AI approaches aggregate large volumes of data to infer the general from the particular, i.e. inductive reasoning (pattern-matching), and generally cannot infer answers not previously programmed. Humans, by contrast, though rarely able to reason over large volumes of data, have reached the top of the food chain by inferring situations from partial or even partially incorrect information, i.e. abductive reasoning (pattern-completion): generating a hypothetical explanation of observations. To achieve an engineering advantage in computational decision support and SA, we leverage recent research on human consciousness, the role consciousness plays in decision making, and the modeling of qualia, the units of subjective experience that generate consciousness. This paper introduces a novel computational implementation of a Cognitive Modeling Architecture (CMA) which incorporates concepts of consciousness. We apply our model to the malware type-classification task. The underlying methodology and theories are generalizable to many domains.
Skin tissue generation by laser cell printing.
Koch, Lothar; Deiwick, Andrea; Schlie, Sabrina; Michael, Stefanie; Gruene, Martin; Coger, Vincent; Zychlinski, Daniela; Schambach, Axel; Reimers, Kerstin; Vogt, Peter M; Chichkov, Boris
2012-07-01
For the aim of ex vivo engineering of functional tissue substitutes, Laser-assisted BioPrinting (LaBP) is under investigation for the arrangement of living cells in predefined patterns. So far three-dimensional (3D) arrangements of single or two-dimensional (2D) patterning of different cell types have been presented. It has been shown that cells are not harmed by the printing procedure. We now demonstrate for the first time the 3D arrangement of vital cells by LaBP as multicellular grafts analogous to native archetype and the formation of tissue by these cells. For this purpose, fibroblasts and keratinocytes embedded in collagen were printed in 3D as a simple example for skin tissue. To study cell functions and tissue formation process in 3D, different characteristics, such as cell localisation and proliferation were investigated. We further analysed the formation of adhering and gap junctions, which are fundamental for tissue morphogenesis and cohesion. In this study, it was demonstrated that LaBP is an outstanding tool for the generation of multicellular 3D constructs mimicking tissue functions. These findings are promising for the realisation of 3D in vitro models and tissue substitutes for many applications in tissue engineering. Copyright © 2012 Wiley Periodicals, Inc.
The Gene Construction Kit: a new computer program for manipulating and presenting DNA constructs.
Gross, R H
1990-06-01
The Gene Construction Kit is a new tool for manipulating and displaying DNA sequence information. Constructs can be displayed either graphically or as formatted sequence. Segments of DNA can be cut out with restriction enzymes and pasted into other sites. The program keeps track of staggered ends, notifies the user of incompatibilities and offers a choice of ligation options. Each segment of a construct can have its own defined thickness, pattern, direction and color. The sequence listing can be displayed in any font and style, in user-defined groupings. Nucleotide positions can be displayed, as can restriction sites and protein sequences. The DNA can be displayed as either single- or double-stranded. Restriction sites can be readily marked. Alternative views of the DNA can be maintained and the history of the construct automatically stored. Gel electrophoresis patterns can be generated and used in cloning project design. Extensive comments can be stored with the construct and searched rapidly for key words. High-quality illustrations showing multiple editable constructs with added graphics and text information can be generated for slides, posters or publication.
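The staggered-end compatibility check described above can be sketched as follows: two ends ligate when one 5' overhang is the reverse complement of the other. This is a minimal illustration with a handful of real enzymes, not the program's actual data model.

```python
def revcomp(s):
    """Reverse complement of a DNA string."""
    return s.translate(str.maketrans("ACGT", "TGCA"))[::-1]

# 5' overhangs left by each enzyme (blunt cutters leave an empty overhang).
OVERHANGS = {"EcoRI": "AATT", "BamHI": "GATC", "BglII": "GATC", "SmaI": ""}

def compatible(enzyme_a, enzyme_b):
    """Ends ligate when the overhangs can anneal, i.e. one is the
    reverse complement of the other (blunt ends trivially match)."""
    return OVERHANGS[enzyme_a] == revcomp(OVERHANGS[enzyme_b])
```

BamHI and BglII ends, for example, are compatible because both leave a GATC overhang, even though the regenerated site is cut by neither enzyme.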
Inferring a District-Based Hierarchical Structure of Social Contacts from Census Data
Yu, Zhiwen; Liu, Jiming; Zhu, Xianjun
2015-01-01
Researchers have recently paid attention to social contact patterns among individuals due to their useful applications in such areas as epidemic evaluation and control, public health decisions, chronic disease research and social network research. Although some studies have estimated social contact patterns from social networks and surveys, few have considered how to infer the hierarchical structure of social contacts directly from census data. In this paper, we focus on inferring an individual's social contact patterns from detailed census data, and generate various types of social contact patterns such as hierarchical-district-structure-based, cross-district and age-district-based patterns. We evaluate the newly generated contact patterns derived from detailed 2011 Hong Kong census data by incorporating them into a model and simulation of the 2009 Hong Kong H1N1 epidemic. We then compare the newly generated social contact patterns with the mixing patterns that are often used in the literature, and draw the following conclusions. First, the generation of social contact patterns based on a hierarchical district structure allows for simulations at different district levels. Second, the newly generated social contact patterns reflect individuals' social contacts. Third, the newly generated social contact patterns improve the accuracy of the SEIR-based epidemic model. PMID:25679787
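How a contact matrix enters an SEIR-type model can be sketched with a deterministic two-group discrete-time step. The contact matrix and rate constants below are illustrative, not the Hong Kong estimates.

```python
def seir_step(S, E, I, R, C, beta=0.05, sigma=0.5, gamma=0.25, dt=1.0):
    """One Euler step of a group-structured SEIR model.
    C[g][h] is the mean daily contacts a member of group g has with group h."""
    n = len(S)
    N = [S[g] + E[g] + I[g] + R[g] for g in range(n)]
    nS, nE, nI, nR = [], [], [], []
    for g in range(n):
        # force of infection on group g from contacts with every group h
        lam = beta * sum(C[g][h] * I[h] / N[h] for h in range(n))
        new_inf = lam * S[g] * dt
        nS.append(S[g] - new_inf)
        nE.append(E[g] + new_inf - sigma * E[g] * dt)
        nI.append(I[g] + sigma * E[g] * dt - gamma * I[g] * dt)
        nR.append(R[g] + gamma * I[g] * dt)
    return nS, nE, nI, nR

C = [[8.0, 2.0], [2.0, 5.0]]  # hypothetical contacts/day between two districts
state = ([990.0, 990.0], [0.0, 0.0], [10.0, 10.0], [0.0, 0.0])
for _ in range(50):
    state = seir_step(*state, C)
```

Replacing C with a hierarchical-district-based matrix is exactly the substitution the study evaluates: the dynamics change while the epidemic model itself stays fixed.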
Performance Measurement, Visualization and Modeling of Parallel and Distributed Programs
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Sarukkai, Sekhar R.; Mehra, Pankaj; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
This paper presents a methodology for debugging the performance of message-passing programs on both tightly coupled and loosely coupled distributed-memory machines. The AIMS (Automated Instrumentation and Monitoring System) toolkit, a suite of software tools for measurement and analysis of performance, is introduced and its application illustrated using several benchmark programs drawn from the field of computational fluid dynamics. AIMS includes (i) Xinstrument, a powerful source-code instrumentor, which supports both Fortran77 and C as well as a number of different message-passing libraries including Intel's NX, Thinking Machines' CMMD, and PVM; (ii) Monitor, a library of timestamping and trace-collection routines that run on supercomputers (such as Intel's iPSC/860, Delta, and Paragon, and Thinking Machines' CM5) as well as on networks of workstations (including Convex Cluster and SparcStations connected by a LAN); (iii) Visualization Kernel, a trace-animation facility that supports source-code clickback, simultaneous visualization of computation and communication patterns, as well as analysis of data movements; (iv) Statistics Kernel, an advanced profiling facility, that associates a variety of performance data with various syntactic components of a parallel program; (v) Index Kernel, a diagnostic tool that helps pinpoint performance bottlenecks through the use of abstract indices; (vi) Modeling Kernel, a facility for automated modeling of message-passing programs that supports both simulation-based and analytical approaches to performance prediction and scalability analysis; (vii) Intrusion Compensator, a utility for recovering true performance from observed performance by removing the overheads of monitoring and their effects on the communication pattern of the program; and (viii) Compatibility Tools, that convert AIMS-generated traces into formats used by other performance-visualization tools, such as ParaGraph, Pablo, and certain AVS/Explorer modules.
CRISPR-Cas9 and CRISPR-Cpf1 mediated targeting of a stomatal developmental gene EPFL9 in rice.
Yin, Xiaojia; Biswal, Akshaya K; Dionora, Jacqueline; Perdigon, Kristel M; Balahadia, Christian P; Mazumdar, Shamik; Chater, Caspar; Lin, Hsiang-Chun; Coe, Robert A; Kretzschmar, Tobias; Gray, Julie E; Quick, Paul W; Bandyopadhyay, Anindya
2017-05-01
The CRISPR-Cas9/Cpf1 system, with its unique gene-targeting efficiency, could be an important tool for the functional study of early developmental genes through the generation of successful knockout plants. Systems biology approaches have identified several genes involved in early plant development, and a robust tool is required for the functional validation of the candidate genes thus obtained. The development of the CRISPR-Cas9/Cpf1 genome editing system has provided a convenient tool for creating loss-of-function mutants for genes of interest. The present study utilized CRISPR/Cas9 and CRISPR-Cpf1 technology to knock out the rice orthologue of an early developmental gene, EPFL9 (Epidermal Patterning Factor like-9, a positive regulator of stomatal development in Arabidopsis). Germ-line mutants that were generated showed edits that were carried forward into the T2 generation, in which Cas9-free homozygous mutants were obtained. The homozygous mutant plants showed more than an eightfold reduction in stomatal density on the abaxial leaf surface of the edited rice plants. Potential off-target analysis showed no significant off-target effects. This study also utilized CRISPR-LbCpf1 (Lachnospiraceae bacterium Cpf1) to target the same OsEPFL9 gene to test the activity of this class-2 CRISPR system in rice, and found that Cpf1 is also capable of genome editing, with edits transmitted through generations and phenotypic changes similar to those seen with CRISPR-Cas9. This study demonstrates the application of CRISPR-Cas9/Cpf1 to precisely target genomic locations and develop transgene-free homozygous heritable gene edits, and confirms that loss-of-function analysis of candidate genes emerging from systems-biology-based approaches can be performed with this system, which therefore adds value to gene function validation studies.
GenePattern | Informatics Technology for Cancer Research (ITCR)
GenePattern is a genomic analysis platform that provides access to hundreds of tools for the analysis and visualization of multiple data types. A web-based interface provides easy access to these tools and allows the creation of multi-step analysis pipelines that enable reproducible in silico research. A new GenePattern Notebook environment allows users to combine GenePattern analyses with text, graphics, and code to create complete reproducible research narratives.
Fine pattern replication on 10 x 10-mm exposure area using ETS-1 laboratory tool in HIT
NASA Astrophysics Data System (ADS)
Hamamoto, K.; Watanabe, Takeo; Hada, Hideo; Komano, Hiroshi; Kishimura, Shinji; Okazaki, Shinji; Kinoshita, Hiroo
2002-07-01
Using the ETS-1 laboratory tool at the Himeji Institute of Technology (HIT), fine patterns were replicated with a Cr mask in static exposure over a 10 mm by 2 mm area: line-and-space patterns 60 nm wide, isolated lines 40 nm wide, and hole patterns 150 nm wide. By synchronously scanning the mask and wafer with the EUVL laboratory tool at the NewSUBARU facility, whose reduction optics consist of three aspherical mirrors, 60 nm line-and-space patterns were formed over a 10 mm by 10 mm exposure region. From the exposure characteristics of positive-tone resists for KrF and EB, the KrF chemically amplified resist showed better characteristics than the EB chemically amplified resist.
BioCluster: tool for identification and clustering of Enterobacteriaceae based on biochemical data.
Abdullah, Ahmed; Sabbir Alam, S M; Sultana, Munawar; Hossain, M Anwar
2015-06-01
Presumptive identification of different Enterobacteriaceae species is routinely achieved based on biochemical properties. Traditional practice includes manual comparison of each biochemical property of the unknown sample with known reference samples and inference of its identity based on the maximum similarity pattern with the known samples. This process is labor-intensive, time-consuming, error-prone, and subjective. Therefore, automating the sorting and similarity calculation would be advantageous. Here we present a MATLAB-based graphical user interface (GUI) tool named BioCluster. This tool was designed for automated clustering and identification of Enterobacteriaceae based on biochemical test results. In this tool, we used two types of algorithms, i.e., traditional hierarchical clustering (HC) and the Improved Hierarchical Clustering (IHC), a modified algorithm that was developed specifically for the clustering and identification of Enterobacteriaceae species. IHC takes into account the variability in the results of the 1-47 biochemical tests within the Enterobacteriaceae family. This tool also provides different options to optimize the clustering in a user-friendly way. Using computer-generated synthetic data and some real data, we have demonstrated that BioCluster has high accuracy in clustering and identifying enterobacterial species based on biochemical test data. This tool can be freely downloaded at http://microbialgen.du.ac.bd/biocluster/. Copyright © 2015 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.
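The similarity-pattern matching that such a tool automates can be sketched as the fraction of matching biochemical test results, ignoring variable reactions. The profiles below are invented ('1' positive, '0' negative, 'v' variable/ignored), not the tool's reference panel.

```python
def similarity(a, b):
    """Fraction of matching test results, skipping positions where
    either profile is variable ('v')."""
    pairs = [(x, y) for x, y in zip(a, b) if "v" not in (x, y)]
    return sum(x == y for x, y in pairs) / len(pairs)

# Match an unknown isolate against reference profiles (all invented).
references = {"E. coli": "110101", "Proteus": "001110"}
unknown = "110001"
best = max(references, key=lambda r: similarity(references[r], unknown))
```

A hierarchical clustering step would then merge isolates in order of pairwise similarity; the matrix of `similarity` values is its input.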
Computer generated maps from digital satellite data - A case study in Florida
NASA Technical Reports Server (NTRS)
Arvanitis, L. G.; Reich, R. M.; Newburne, R.
1981-01-01
Ground cover maps are important tools to a wide array of users. Over the past three decades, much progress has been made in supplementing planimetric and topographic maps with ground cover details obtained from aerial photographs. The present investigation evaluates the feasibility of using computer maps of ground cover from satellite input tapes. Attention is given to the selection of test sites, a satellite data processing system, a multispectral image analyzer, general purpose computer-generated maps, the preliminary evaluation of computer maps, a test for areal correspondence, the preparation of overlays and acreage estimation of land cover types on the Landsat computer maps. There is every indication to suggest that digital multispectral image processing systems based on Landsat input data will play an increasingly important role in pattern recognition and mapping land cover in the years to come.
PLAN-IT-2: The next generation planning and scheduling tool
NASA Technical Reports Server (NTRS)
Eggemeyer, William C.; Cruz, Jennifer W.
1990-01-01
PLAN-IT is a scheduling program which has been demonstrated and evaluated in a variety of scheduling domains. The capability enhancements being made for the next generation of PLAN-IT, called PLAN-IT-2, are discussed. PLAN-IT-2 represents a complete rewrite of the original PLAN-IT, incorporating major changes suggested by application experiences with the original. Among the enhancements described are additional types of constraints, such as states and resettable depletables (e.g., batteries); dependencies between constraints; multiple levels of activity planning during the scheduling process; pattern-constraint searching for opportunities, as opposed to just minimizing the number of conflicts; additional customization features for displaying and handling diverse multiple time systems; and a reduction in both the size and the complexity of the knowledge base needed to address different problem domains.
Forecasting techno-social systems: how physics and computing help to fight off global pandemics
NASA Astrophysics Data System (ADS)
Vespignani, Alessandro
2010-03-01
The crucial issue when planning for adequate public health interventions to mitigate the spread and impact of epidemics is risk evaluation and forecast. This amounts to anticipating where, when and how strongly the epidemic will strike. In the last decade, advances in computer performance, data acquisition, statistical physics and complex networks theory have allowed the generation of sophisticated simulations on supercomputer infrastructures to anticipate the spreading pattern of a pandemic. For the first time we are in a position to generate real-time forecasts of epidemic spreading. I will review the history of the current H1N1 pandemic, the major roadblocks the community has faced in its containment and mitigation, and how physics and computing provide predictive tools that help us to battle epidemics.
Method and apparatus for characterizing and enhancing the dynamic performance of machine tools
Barkman, William E; Babelay, Jr., Edwin F
2013-12-17
Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include dynamic one axis positional accuracy of the machine tool, dynamic cross-axis stability of the machine tool, and dynamic multi-axis positional accuracy of the machine tool.
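One measurable merit named above, dynamic single-axis positional accuracy, can be sketched as an RMS following error between commanded and measured slide positions during repetitive short displacements. The traces below are synthetic, not measurements from the patented apparatus.

```python
import math

# Commanded vs. measured slide positions (mm) over one excitation cycle.
commanded = [0.0, 0.1, 0.2, 0.1, 0.0, -0.1, -0.2, -0.1]
measured = [0.0, 0.09, 0.19, 0.11, 0.01, -0.09, -0.19, -0.11]

# RMS following error: a single scalar merit of dynamic tracking accuracy.
rms_error = math.sqrt(
    sum((c - m) ** 2 for c, m in zip(commanded, measured)) / len(commanded)
)
```

Comparing this scalar before and after a tuning change gives a direct quantification of whether the machine's dynamic response improved.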
van Strien, Maarten J; Slager, Cornelis T J; de Vries, Bauke; Grêt-Regamey, Adrienne
2016-06-01
Many studies have assessed the effect of landscape patterns on spatial ecological processes by simulating these processes in computer-generated landscapes with varying composition and configuration. To generate such landscapes, various neutral landscape models have been developed. However, the limited set of landscape-level pattern variables included in these models is often inadequate to generate landscapes that reflect real landscapes. In order to achieve more flexibility and variability in the generated landscape patterns, a more complete set of class- and patch-level pattern variables should be implemented in these models. These enhancements have been implemented in Landscape Generator (LG), software that uses optimization algorithms to generate landscapes that match user-defined target values. Although LG was developed for small-scale participatory spatial planning, we enhanced its usability and demonstrated how it can be used for larger-scale ecological studies. First, we used LG to recreate landscape patterns from a real landscape (i.e., a mountainous region in Switzerland). Second, we generated landscape series with incrementally changing pattern variables, which could be used in ecological simulation studies. We found that LG was able to recreate landscape patterns that approximate those of real landscapes. Furthermore, we successfully generated landscape series that would not have been possible with traditional neutral landscape models. LG is a promising novel approach for generating neutral landscapes and enables testing of new hypotheses regarding the influence of landscape patterns on ecological processes. LG is freely available online.
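The optimization at the heart of such a landscape generator can be sketched as a hill climb that mutates a raster until a pattern variable matches a user-defined target. Here only the class-1 proportion is targeted; LG handles a much richer set of class- and patch-level variables, so this is a bare-bones analogue, not its algorithm.

```python
import random

random.seed(0)  # reproducible toy run
SIZE, TARGET = 20, 0.30  # 20x20 raster, target proportion of class 1
grid = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]

def proportion(g):
    """Fraction of cells belonging to class 1."""
    return sum(map(sum, g)) / (SIZE * SIZE)

for _ in range(2000):
    before = abs(proportion(grid) - TARGET)
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    grid[r][c] = 1 - grid[r][c]  # mutate one cell
    if abs(proportion(grid) - TARGET) > before:
        grid[r][c] = 1 - grid[r][c]  # revert: mutation moved away from target
```

With several competing pattern targets (e.g. patch count, edge density), the same accept/revert loop optimizes a weighted sum of deviations instead of a single error.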
Sparsity enables estimation of both subcortical and cortical activity from MEG and EEG
Krishnaswamy, Pavitra; Obregon-Henao, Gabriel; Ahveninen, Jyrki; Khan, Sheraz; Iglesias, Juan Eugenio; Hämäläinen, Matti S.; Purdon, Patrick L.
2017-01-01
Subcortical structures play a critical role in brain function. However, options for assessing electrophysiological activity in these structures are limited. Electromagnetic fields generated by neuronal activity in subcortical structures can be recorded noninvasively, using magnetoencephalography (MEG) and electroencephalography (EEG). However, these subcortical signals are much weaker than those generated by cortical activity. In addition, we show here that it is difficult to resolve subcortical sources because distributed cortical activity can explain the MEG and EEG patterns generated by deep sources. We then demonstrate that if the cortical activity is spatially sparse, both cortical and subcortical sources can be resolved with M/EEG. Building on this insight, we develop a hierarchical sparse inverse solution for M/EEG. We assess the performance of this algorithm on realistic simulations and auditory evoked response data, and show that thalamic and brainstem sources can be correctly estimated in the presence of cortical activity. Our work provides alternative perspectives and tools for characterizing electrophysiological activity in subcortical structures in the human brain. PMID:29138310
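Why sparsity makes an underdetermined inverse problem solvable can be illustrated with an L1-penalized inverse (iterative soft thresholding) on a toy forward model. The matrix below is synthetic, not an M/EEG lead field, and the generic ISTA solver is a sketch, not the authors' hierarchical method.

```python
def ista(A, y, lam=0.05, step=0.2, iters=2000):
    """Iterative soft thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        resid = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        grad = [sum(A[i][j] * resid[i] for i in range(m)) for j in range(n)]
        z = [xi - step * gi for xi, gi in zip(x, grad)]  # gradient step
        # soft threshold: shrink toward zero, setting small entries exactly to 0
        x = [(abs(zi) - step * lam) * (1.0 if zi > 0 else -1.0)
             if abs(zi) > step * lam else 0.0 for zi in z]
    return x

# Underdetermined forward model: 2 "sensors", 4 "sources".
A = [[1.0, 0.2, 0.3, 0.1],
     [0.2, 1.0, 0.1, 0.3]]
y = [0.2, 1.0]  # measurements produced by the sparse truth x = [0, 1, 0, 0]
x = ista(A, y)
```

Least squares alone cannot pick among the infinitely many solutions of this 2-equation, 4-unknown system; the L1 penalty recovers the single active source (with the usual small shrinkage bias).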
Methodology to design a municipal solid waste generation and composition map: a case study.
Gallardo, A; Carlos, M; Peris, M; Colomer, F J
2014-11-01
Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists of defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to characterize these factors first. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies that deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing the MSW collection routes, including selective collection. To verify the methodology, it has been successfully applied to a Spanish town. Copyright © 2014 Elsevier Ltd. All rights reserved.
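The direct-way calculation behind a generation map can be sketched as a per-district estimate from inhabitant counts and a per-capita generation rate; the district names and figures below are invented, not from the Spanish case study.

```python
# Assumed per-capita generation rate (kg per inhabitant per day).
PER_CAPITA_KG_DAY = 1.2

# Hypothetical inhabitants per district.
districts = {"Centre": 12000, "North": 8500, "Harbour": 4300}

# Tonnes of MSW generated per district per day, ready to join to a GIS
# layer of district polygons for a thematic map.
generation = {d: round(n * PER_CAPITA_KG_DAY / 1000, 2)
              for d, n in districts.items()}
```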
NASA Astrophysics Data System (ADS)
Nurhayati, E.; Koesmaryono, Y.; Impron
2017-03-01
Rice Yellow Stem Borer (YSB) is one of the major insect pests of rice, with high attack intensity in rice production centers, especially in West Java. This pest is a holometabolous insect that damages rice in the vegetative phase (deadheart) as well as the generative phase (whitehead). Climate is one of the environmental factors that influence the pattern of population dynamics. The purpose of this study was to develop a predictive model of YSB population dynamics under climate change scenarios (2016-2035 period) using the Dymex Model in the Indramayu area, West Java. The YSB model required two main components, namely climate parameters and the lower temperature threshold for YSB development (To), to describe the YSB life cycle in every phase. Calibration and validation of the model yielded coefficients of determination (R2) between predictions and observations of the study area of 0.74 and 0.88, respectively; the model was able to describe development, mortality, and transfer of individuals from one life stage to the next, as well as YSB fecundity and reproduction. Under baseline climate conditions, population abundance peaks (outbreaks) tended to occur when rainfall intensity changed during the transition from the rainy season to the dry season, or under the opposite conditions. Under both climate change scenarios, the model outputs predicted the pattern of YSB population dynamics well, with an increasing trend in population numbers and generations per season, and a shift in the timing of peak abundance under future climatic conditions. These results can be adopted as a tool to predict outbreaks and to give early warning to control the YSB pest more effectively.
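Development models driven by a lower temperature threshold (To) typically accumulate heat units above that threshold until a stage requirement is met. A minimal degree-day sketch; To and the stage requirement below are illustrative values, not the study's calibrated parameters:

```python
def degree_days(daily_temps, t_lower):
    """Accumulated degree-days above the lower development threshold."""
    return sum(max(t - t_lower, 0.0) for t in daily_temps)

def days_to_complete(daily_temps, t_lower, dd_required):
    """Days until the accumulated heat units reach the stage
    requirement; returns None if never reached."""
    total = 0.0
    for day, t in enumerate(daily_temps, start=1):
        total += max(t - t_lower, 0.0)
        if total >= dd_required:
            return day
    return None

# Hypothetical values: To = 10 C, 150 degree-days for the stage.
temps = [28, 29, 27, 30, 31, 29, 28, 30, 29, 27]
days = days_to_complete(temps, 10.0, 150.0)
```

Warmer future scenarios shorten the time to complete each stage, which is one mechanism behind the projected increase in generations per season.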
Barreda, Santiago; Kidder, Ian J; Mudery, Jordan A; Bailey, E Fiona
2015-03-01
Neonates at risk for sudden infant death syndrome (SIDS) are hospitalized for cardiorespiratory monitoring; however, monitoring is costly and generates large quantities of averaged data that serve as poor predictors of infant risk. In this study we used a traditional autocorrelation function (ACF), testing its suitability as a tool to detect subtle alterations in respiratory patterning in vivo. We applied the ACF to chest wall motion tracings obtained from rat pups in the period corresponding to the mid-to-end of the third trimester of human pregnancy. Pups were drawn from two groups, nicotine-exposed and saline-exposed, at each age (i.e., P7, P8, P9, and P10). Respiratory-related motions of the chest wall were recorded in room air and in response to an arousal stimulus (FIO2 14%). The autocorrelation function was used to determine measures of breathing rate and respiratory patterning. Unlike alternative tools such as Poincaré plots, which depict an averaged difference in a measure from breath to breath, the ACF applied to a digitized chest wall trace yields an instantaneous sample of data points that can be used to compare points at the same time in the next breath or in any subsequent number of breaths. The moment-to-moment evaluation of chest wall motion detected subtle differences in respiratory pattern between rat pups exposed to nicotine in utero and age-matched saline-exposed peers. The ACF can be applied online as well as to existing data sets and requires comparatively short sampling windows (∼2 min). As shown here, the ACF could be used to identify factors that precipitate or minimize instability and thus offers a quantitative measure of risk in vulnerable populations. Copyright © 2015 Elsevier B.V. All rights reserved.
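A minimal version of the ACF-based period estimate can be sketched as follows; the sampling rate, breathing frequency, and noise level are assumptions, not the study's recordings:

```python
import numpy as np

def autocorr(x):
    """Normalized autocorrelation function of a zero-meaned signal."""
    x = np.asarray(x, float) - np.mean(x)
    r = np.correlate(x, x, mode='full')[x.size - 1:]
    return r / r[0]

def breath_period(x, fs):
    """Breathing period (s) estimated as the lag of the first ACF peak:
    descend into the first trough, then climb to the next maximum."""
    r = autocorr(x)
    k = 1
    while k + 1 < r.size and r[k + 1] <= r[k]:
        k += 1          # descend into the first trough
    while k + 1 < r.size and r[k + 1] >= r[k]:
        k += 1          # climb to the first peak
    return k / fs

# Synthetic ~2 min chest-wall trace: 2 Hz breathing plus measurement
# noise, sampled at fs = 100 Hz (illustrative values).
fs = 100.0
t = np.arange(0.0, 120.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 2.0 * t) \
    + 0.1 * np.random.default_rng(1).standard_normal(t.size)
period = breath_period(trace, fs)
```

Irregular breathing flattens and broadens the first ACF peak, which is the kind of subtle patterning change the study quantifies.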
Du, Lijuan; Zhou, Amy; Patel, Akshay; Rao, Mishal; Anderson, Kelsey; Roy, Sougata
2017-07-01
Fibroblast growth factors (FGF) are essential signaling proteins that regulate diverse cellular functions in developmental and metabolic processes. In Drosophila, the FGF homolog branchless (bnl) is expressed in a dynamic and spatiotemporally restricted pattern to induce branching morphogenesis of the trachea, which expresses the Bnl-receptor, breathless (btl). Here we have developed a new strategy to determine bnl-expressing cells and study their interactions with the btl-expressing cells in the range of tissue patterning during Drosophila development. To enable targeted gene expression specifically in the bnl-expressing cells, a new LexA-based bnl enhancer trap line was generated using CRISPR/Cas9-based genome editing. Analyses of the spatiotemporal expression of the reporter in various embryonic stages, larval or adult tissues and in metabolic hypoxia confirmed its target specificity and versatility. With this tool, new bnl-expressing cells, their unique organization and functional interactions with the btl-expressing cells were uncovered in a larval tracheoblast niche in the leg imaginal discs, in larval photoreceptors of the developing retina, and in the embryonic central nervous system. The targeted expression system also facilitated live imaging of simultaneously labeled Bnl sources and tracheal cells, which revealed a unique morphogenetic movement of the embryonic bnl-source. Migration of bnl-expressing cells may create a dynamic spatiotemporal pattern of the signal source necessary for the directional growth of the tracheal branch. The genetic tool and the comprehensive profile of expression, organization, and activity of various types of bnl-expressing cells described in this study provide an important foundation for future research investigating the mechanisms underlying Bnl signaling in tissue morphogenesis. Copyright © 2017 Elsevier Inc. All rights reserved.
Frank, Steven A.
2010-01-01
We typically observe large-scale outcomes that arise from the interactions of many hidden, small-scale processes. Examples include age of disease onset, rates of amino acid substitutions, and composition of ecological communities. The macroscopic patterns in each problem often vary around a characteristic shape that can be generated by neutral processes. A neutral generative model assumes that each microscopic process follows unbiased or random stochastic fluctuations: random connections of network nodes; amino acid substitutions with no effect on fitness; species that arise or disappear from communities randomly. These neutral generative models often match common patterns of nature. In this paper, I present the theoretical background by which we can understand why these neutral generative models are so successful. I show where the classic patterns come from, such as the Poisson pattern, the normal or Gaussian pattern, and many others. Each classic pattern was often discovered by a simple neutral generative model. The neutral patterns share a special characteristic: they describe the patterns of nature that follow from simple constraints on information. For example, any aggregation of processes that preserves information only about the mean and variance attracts to the Gaussian pattern; any aggregation that preserves information only about the mean attracts to the exponential pattern; any aggregation that preserves information only about the geometric mean attracts to the power law pattern. I present a simple and consistent informational framework of the common patterns of nature based on the method of maximum entropy. This framework shows that each neutral generative model is a special case that helps to discover a particular set of informational constraints; those informational constraints define a much wider domain of non-neutral generative processes that attract to the same neutral pattern. PMID:19538344
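The attraction described above can be demonstrated numerically: summation preserves information only about the mean and variance of the components, so the aggregate converges on the Gaussian pattern regardless of the component distribution. A minimal sketch using uniform components:

```python
import numpy as np

rng = np.random.default_rng(42)

# Aggregate n iid uniform variables. The uniform is far from Gaussian
# (excess kurtosis -1.2), but aggregation washes out everything except
# mean and variance, attracting the sum to the Gaussian pattern.
n, samples = 30, 200_000
agg = rng.uniform(0.0, 1.0, size=(samples, n)).sum(axis=1)

z = (agg - agg.mean()) / agg.std()
skew = np.mean(z ** 3)                 # 0 for a Gaussian
excess_kurtosis = np.mean(z ** 4) - 3  # 0 for a Gaussian
```

The same experiment with exponential or discrete components gives the same limiting shape, which is the neutrality argument in miniature: many distinct microscopic models attract to one macroscopic pattern.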
A Statistically Representative Atlas for Mapping Neuronal Circuits in the Drosophila Adult Brain.
Arganda-Carreras, Ignacio; Manoliu, Tudor; Mazuras, Nicolas; Schulze, Florian; Iglesias, Juan E; Bühler, Katja; Jenett, Arnim; Rouyer, François; Andrey, Philippe
2018-01-01
Imaging the expression patterns of reporter constructs is a powerful tool to dissect the neuronal circuits of perception and behavior in the adult brain of Drosophila, one of the major models for studying brain functions. To date, several Drosophila brain templates and digital atlases have been built to automatically analyze and compare collections of expression pattern images. However, there has been no systematic comparison of performances between alternative atlasing strategies and registration algorithms. Here, we objectively evaluated the performance of different strategies for building adult Drosophila brain templates and atlases. In addition, we used state-of-the-art registration algorithms to generate a new group-wise inter-sex atlas. Our results highlight the benefit of statistical atlases over individual ones and show that the newly proposed inter-sex atlas outperformed existing solutions for automated registration and annotation of expression patterns. Over 3,000 images from the Janelia Farm FlyLight collection were registered using the proposed strategy. These registered expression patterns can be searched and compared with a new version of the BrainBaseWeb system and BrainGazer software. We illustrate the validity of our methodology and brain atlas with registration-based predictions of expression patterns in a subset of clock neurons. The described registration framework should benefit brain studies in Drosophila and other insect species.
NASA Astrophysics Data System (ADS)
Wang, Xin; Gao, Jun; Fan, Zhiguo; Roberts, Nicholas W.
2016-06-01
We present a computationally inexpensive analytical model for simulating celestial polarization patterns in variable conditions. We combine both the singularity theory of Berry et al (2004 New J. Phys. 6 162) and the intensity model of Perez et al (1993 Sol. Energy 50 235-245) such that our single model describes three key sets of data: (1) the overhead distribution of the degree of polarization as well as the existence of neutral points in the sky; (2) the change in sky polarization as a function of the turbidity of the atmosphere; and (3) sky polarization patterns as a function of wavelength, calculated in this work from the ultra-violet to the near infra-red. To verify the performance of our model we generate accurate reference data using a numerical radiative transfer model and statistical comparisons between these two methods demonstrate no significant difference in almost all situations. The development of our analytical model provides a novel method for efficiently calculating the overhead skylight polarization pattern. This provides a new tool of particular relevance for our understanding of animals that use the celestial polarization pattern as a source of visual information.
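For orientation, the single-scattering Rayleigh pattern underlying such sky models is simple to compute. The sketch below is not the combined Berry/Perez model of the paper; the dop_max factor is a crude stand-in for depolarizing effects such as turbidity:

```python
import numpy as np

def rayleigh_dop(gamma, dop_max=0.8):
    """Degree of polarization at angular distance gamma (radians) from
    the sun in the single-scattering Rayleigh model. dop_max < 1
    crudely absorbs depolarization (turbidity, multiple scattering).
    The e-vector lies perpendicular to the sun-observer scattering plane."""
    return dop_max * np.sin(gamma) ** 2 / (1.0 + np.cos(gamma) ** 2)
```

The pattern is maximal 90 degrees from the sun and vanishes toward the solar and antisolar points, which is the gross structure animal polarization compasses exploit.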
Self-similar transmission patterns induced by magnetic field effects in graphene
NASA Astrophysics Data System (ADS)
Rodríguez-González, R.; Rodríguez-Vargas, I.; Díaz-Guerrero, D. S.; Gaggero-Sager, L. M.
2018-07-01
In this work we study the propagation of Dirac electrons through Cantor-like structures in graphene. Specifically, we consider structures with magnetic and electrostatic barriers arranged in a Cantor-like fashion. The Dirac-like equation and the transfer matrix approach are used to obtain the transmission properties. We find self-similar patterns in the transmission probability, or transmittance, once the magnetic field is incorporated. Moreover, these patterns can be connected with others at different scales through well-defined scaling rules. In particular, we have found two scaling rules that become a useful tool to describe the self-similarity of our system: the first expression is related to the generation, and the second to the length, of the Cantor-like structure. As far as we know, this is the first time that a special self-similar structure in conjunction with magnetic field effects gives rise to self-similar transmission patterns. It is also important to remark that, to our knowledge, it is fundamental to break some symmetry of graphene in order to obtain self-similar transmission properties; in our case, time-reversal symmetry is broken by the magnetic field.
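The transfer matrix machinery can be illustrated with a simplified, nonrelativistic stand-in: rectangular electrostatic barriers placed on the intervals of a middle-third Cantor pre-fractal, without the magnetic vector potential or the Dirac dispersion of the actual study. Units are hbar = 1, m = 1/2 (so k = sqrt(E)); all values are illustrative:

```python
import numpy as np

def interface(k1, k2, x0):
    """Transfer matrix relating plane-wave amplitudes (A, B) just left
    of x0 to those just right of it, from continuity of the
    wavefunction and its derivative at a potential step."""
    p, m = 1.0 + k2 / k1, 1.0 - k2 / k1
    return 0.5 * np.array([
        [p * np.exp(1j * (k2 - k1) * x0), m * np.exp(-1j * (k2 + k1) * x0)],
        [m * np.exp(1j * (k2 + k1) * x0), p * np.exp(-1j * (k2 - k1) * x0)],
    ])

def cantor_intervals(generation):
    """Barrier intervals of the middle-third Cantor pre-fractal on [0, 1]."""
    segs = [(0.0, 1.0)]
    for _ in range(generation):
        segs = [s for a, b in segs
                for s in ((a, a + (b - a) / 3.0), (b - (b - a) / 3.0, b))]
    return segs

def transmission(E, V, barriers):
    """Transmission probability through rectangular barriers of height V."""
    k = np.sqrt(complex(E))
    q = np.sqrt(complex(E - V))          # imaginary inside the barrier if E < V
    M = np.eye(2, dtype=complex)
    for a, b in barriers:
        M = M @ interface(k, q, a) @ interface(q, k, b)
    return 1.0 / abs(M[0, 0]) ** 2

T3 = transmission(E=1.0, V=2.0, barriers=cantor_intervals(3))
```

Scanning E for successive generations produces transmission curves whose fine structure repeats across scales, which is the kind of self-similarity the paper quantifies with its scaling rules.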
Informational Aspects of Isotopic Diversity in Biology and Medicine
NASA Astrophysics Data System (ADS)
Berezin, Alexander A.
2004-10-01
Use of stable and radioactive isotopes in biology and medicine is intensive, yet informational aspects of isotopes as such are largely neglected (A.A.Berezin, J.Theor.Biol., 1992). Classical distinguishability ("labelability") of isotopes allows for pattern generation dynamics. Quantum mechanically, the advantages of isotopicity (diversity of stable isotopes) arise from the (almost perfect) degeneracy of various isotopic configurations; this in turn allows for isotopic sweeps (hoppings) by resonance neutron tunneling (Eccles mechanism). Isotopic variations of the de Broglie wavelength affect quantum tunneling, diffusivity, magnetic interactions (e.g. by the Lorentz force), etc. The ergodicity principle (all isoenergetic states are eventually accessed) implies the possibility of fast scanning of a library of morphogenetic patterns (cf. metaphors of a universal "Platonic" Library of Patterns: e.g. J.L.Borges, R.Sheldrake) with subsequent Darwinian reinforcement (e.g. by targeted mutations) of evolutionarily advantageous patterns and structures. Isotopic shifts in organisms, from viruses and protozoa to mammals (e.g. DNA with enriched or depleted C-13), are tools to elucidate the possible informational (e.g. Shannon entropy) role of isotopicity in genetic (e.g. evolutionary and morphological), dynamical (e.g. physiological and neurological) as well as medical (e.g. carcinogenesis, aging) aspects of biology and medicine.
Energy landscapes for a machine-learning prediction of patient discharge
NASA Astrophysics Data System (ADS)
Das, Ritankar; Wales, David J.
2016-06-01
The energy landscapes framework is applied to a configuration space generated by training the parameters of a neural network. In this study the input data consists of time series for a collection of vital signs monitored for hospital patients, and the outcomes are patient discharge or continued hospitalisation. Using machine learning as a predictive diagnostic tool to identify patterns in large quantities of electronic health record data in real time is a very attractive approach for supporting clinical decisions, with the potential to improve patient outcomes and reduce waiting times for discharge. Here we report some preliminary analysis to show how machine learning might be applied. In particular, we visualize the fitting landscape in terms of locally optimal neural networks and the connections between them in parameter space. We anticipate that these results, and analogues of thermodynamic properties for molecular systems, may help in the future design of improved predictive tools.
Advances in EPG for treatment and research: an illustrative case study.
Scobbie, James M; Wood, Sara E; Wrench, Alan A
2004-01-01
Electropalatography (EPG), a technique which reveals tongue-palate contact patterns over time, is a highly effective tool for speech research. We report here on recent developments by Articulate Instruments Ltd. These include hardware for Windows-based computers, backwardly compatible (with Reading EPG3) software systems for clinical intervention and laboratory-based analysis for EPG and acoustic data, and an enhanced clinical interface with client and file management tools. We focus here on a single case study of a child aged 10+/-years who had been diagnosed with an intractable speech disorder possibly resulting ultimately from a complete cleft of hard and soft palate. We illustrate how assessment, diagnosis and treatment of the intractable speech disorder are undertaken using this new generation of instrumental phonetic support. We also look forward to future developments in articulatory phonetics that will link EPG with ultrasound for research and clinical communities.
Measuring multielectron beam imaging fidelity with a signal-to-noise ratio analysis
NASA Astrophysics Data System (ADS)
Mukhtar, Maseeh; Bunday, Benjamin D.; Quoi, Kathy; Malloy, Matt; Thiel, Brad
2016-07-01
Java Monte Carlo Simulator for Secondary Electrons (JMONSEL) simulations are used to generate expected imaging responses of chosen test cases of patterns and defects with the ability to vary parameters for beam energy, spot size, pixel size, and/or defect material and form factor. The patterns are representative of the design rules for an aggressively scaled FinFET-type design. With these simulated images and resulting shot noise, a signal-to-noise framework is developed, which relates to defect detection probabilities. Additionally, with this infrastructure, the effects of detection-chain noise and frequency-dependent system response can be modeled, allowing the best recipe parameters to be targeted for multielectron beam inspection validation experiments. Ultimately, these results should lead to insights into how such parameters will impact tool design, including the doses necessary for defect detection and estimates of the scanning speeds needed to achieve high throughput for high-volume manufacturing.
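The signal-to-noise framework can be sketched for the shot-noise-limited case: with mean detected-electron counts n_bg and n_def for background and defect pixels, the counts are Poisson-distributed and the SNR grows as the square root of dose. The 3-sigma threshold and Gaussian approximation below are illustrative assumptions, not the paper's recipe:

```python
import math

def shot_noise_snr(n_bg, n_def):
    """Signal-difference-to-noise ratio for Poisson-limited counts:
    the mean count difference over the combined shot noise."""
    return abs(n_def - n_bg) / math.sqrt(n_def + n_bg)

def detection_probability(snr, threshold=3.0):
    """Probability the measured difference exceeds a threshold set at
    `threshold` sigma, under a Gaussian approximation to the noise."""
    return 0.5 * math.erfc((threshold - snr) / math.sqrt(2.0))

# Quadrupling the dose doubles the SNR, the basic dose/throughput
# trade-off for multielectron beam inspection.
snr_1x = shot_noise_snr(100.0, 144.0)
snr_4x = shot_noise_snr(400.0, 576.0)
```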
Holographic photolysis of caged neurotransmitters
Lutz, Christoph; Otis, Thomas S.; DeSars, Vincent; Charpak, Serge; DiGregorio, David A.; Emiliani, Valentina
2009-01-01
Stimulation of light-sensitive chemical probes has become a powerful tool for the study of dynamic signaling processes in living tissue. Classically, this approach has been constrained by limitations of lens–based and point-scanning illumination systems. Here we describe a novel microscope configuration that incorporates a nematic liquid crystal spatial light modulator (LC-SLM) to generate holographic patterns of illumination. This microscope can produce illumination spots of variable size and number and patterns shaped to precisely match user-defined elements in a specimen. Using holographic illumination to photolyse caged glutamate in brain slices, we demonstrate that shaped excitation on segments of neuronal dendrites and simultaneous, multi-spot excitation of different dendrites enables precise spatial and rapid temporal control of glutamate receptor activation. By allowing the excitation volume shape to be tailored precisely, the holographic microscope provides an extremely flexible method for activation of various photosensitive proteins and small molecules. PMID:19160517
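Holographic patterning of this kind rests on computing an SLM phase mask whose far-field (Fourier) intensity approximates a target illumination pattern; the Gerchberg-Saxton algorithm is the classic iterative approach. A minimal sketch, where the grid size and spot positions are illustrative and the paper's LC-SLM pipeline is more involved:

```python
import numpy as np

def gerchberg_saxton(target_intensity, n_iter=50, seed=0):
    """Iteratively compute a phase-only mask whose Fourier-plane
    intensity approximates the target pattern."""
    rng = np.random.default_rng(seed)
    target_amp = np.sqrt(target_intensity)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(n_iter):
        far = np.fft.fft2(np.exp(1j * phase))          # propagate to focal plane
        far = target_amp * np.exp(1j * np.angle(far))  # impose target amplitude
        near = np.fft.ifft2(far)                       # back to the SLM plane
        phase = np.angle(near)                         # phase-only SLM: keep phase
    return phase

# Hypothetical target: two illumination spots (e.g. two dendrites).
target = np.zeros((64, 64))
target[16, 16] = 1.0
target[40, 48] = 1.0
mask = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * mask))) ** 2
```

Because the mask modulates only phase, some light leaks into ghost orders, but most of the energy lands on the requested spots.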
Gravitational Waves From Ultra Short Period Exoplanets
NASA Astrophysics Data System (ADS)
Cunha, J. V.; Silva, F. E.; Lima, J. A. S.
2018-06-01
In the last two decades, thousands of extrasolar planets were discovered based on different observational techniques, and their number must increase substantially in virtue of the ongoing and near-future approved missions and facilities. It is shown that interesting signatures of binary systems from nearby exoplanets and their parent stars can also be obtained measuring the pattern of gravitational waves that will be made available by the new generation of detectors including the space-based LISA (Laser Interferometer Space Antenna) observatory. As an example, a subset of exoplanets with extremely short periods (less than 80 min) is discussed. All of them have gravitational luminosity, LGW ˜ 1030erg/s, strain h ˜ 10-22, frequencies fgw > 10-4Hz, and, as such, are within the standard sensitivity curve of LISA. Our analysis suggests that the emitted gravitational wave pattern may also provide an efficient tool to discover ultra short period exoplanets.
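The quoted strains follow from the standard quadrupole formula for a circular binary, with the GW frequency equal to twice the orbital frequency. A sketch with illustrative system parameters (not taken from the article):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # kg
M_JUP = 1.898e27   # kg
PC = 3.086e16      # m

def chirp_mass(m1, m2):
    """Chirp mass (kg) of a binary."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

def gw_strain(m1, m2, p_orb, d):
    """Characteristic strain h of a circular binary at distance d;
    the quadrupole GW frequency is 2 / p_orb."""
    f_gw = 2.0 / p_orb
    mc = chirp_mass(m1, m2)
    return 4.0 * (G * mc) ** (5.0 / 3.0) * (math.pi * f_gw) ** (2.0 / 3.0) / (C ** 4 * d)

# Hypothetical system: Jupiter-mass planet around a solar-mass star,
# 40 min orbital period, 10 pc away.
h = gw_strain(M_SUN, M_JUP, 40 * 60.0, 10 * PC)
```

For these illustrative numbers h comes out near the 1e-22 level cited in the abstract, with f_gw approximately 8e-4 Hz, inside the LISA band.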
Fast parallel 3D profilometer with DMD technology
NASA Astrophysics Data System (ADS)
Hou, Wenmei; Zhang, Yunbo
2011-12-01
Confocal microscopy has been a powerful tool for three-dimensional profile analysis, but single-spot confocal microscopes are limited by scanning speed. This paper presents a 3D profilometer prototype of a parallel confocal microscope based on a DMD (Digital Micromirror Device). In this system the DMD takes the place of the Nipkow disk, a classical parallel scanning scheme, to realize parallel lateral scanning. Operated with an appropriate pattern, the DMD generates a virtual pinhole array which separates the light into multiple beams. The key parameters that affect the measurement (pinhole size and lateral scanning distance) can be configured conveniently by the patterns sent to the DMD chip. To avoid interference between virtual pinholes working at the same time, a scanning strategy is adopted. Depth response curves, both axial and abaxial, were extracted. Measurement experiments have been carried out on a structured silicon sample, and an axial resolution of 55 nm was achieved.
Understanding mutagenesis through delineation of mutational signatures in human cancer
Petljak, Mia; Alexandrov, Ludmil B.
2016-05-04
Each individual cell within a human body acquires a certain number of somatic mutations during the course of its lifetime. These mutations originate from a wide spectrum of both endogenous and exogenous mutational processes that leave distinct patterns of mutations, termed mutational signatures, embedded within the genomes of all cells. In recent years, the vast amount of data produced by sequencing of cancer genomes was coupled with novel mathematical models and computational tools to generate the first comprehensive map of mutational signatures in human cancer. To date, >30 distinct mutational signatures have been identified, and etiologies have been proposed for many of them. This paper provides a brief historical background on the examination of mutational patterns in human cancer, summarizes the knowledge accumulated since the introduction of the concept of mutational signatures, and discusses their potential future applications and perspectives within the field.
Data Auditor: Analyzing Data Quality Using Pattern Tableaux
NASA Astrophysics Data System (ADS)
Srivastava, Divesh
Monitoring databases maintain configuration and measurement tables about computer systems, such as networks and computing clusters, and serve important business functions, such as troubleshooting customer problems, analyzing equipment failures, planning system upgrades, etc. These databases are prone to many data quality issues: configuration tables may be incorrect due to data entry errors, while measurement tables may be affected by incorrect, missing, duplicate and delayed polls. We describe Data Auditor, a tool for analyzing data quality and exploring data semantics of monitoring databases. Given a user-supplied constraint, such as a boolean predicate expected to be satisfied by every tuple, a functional dependency, or an inclusion dependency, Data Auditor computes "pattern tableaux", which are concise summaries of subsets of the data that satisfy or fail the constraint. We discuss the architecture of Data Auditor, including the supported types of constraints and the tableau generation mechanism. We also show the utility of our approach on an operational network monitoring database.
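A toy version of tableau generation illustrates the idea: enumerate attribute-value patterns (with wildcards) and keep those whose matching tuples satisfy the constraint with high confidence. The table, attributes, and thresholds below are hypothetical, not Data Auditor's actual interface:

```python
from itertools import product

def tableau(rows, attrs, predicate, min_conf=0.9, min_support=2):
    """Enumerate candidate patterns over `attrs` ('*' = wildcard) and
    keep those whose matching rows satisfy `predicate` with confidence
    >= min_conf -- a toy hold-tableau generator."""
    domains = [sorted({r[a] for r in rows}) + ['*'] for a in attrs]
    out = []
    for combo in product(*domains):
        match = [r for r in rows
                 if all(v == '*' or r[a] == v for a, v in zip(attrs, combo))]
        if len(match) >= min_support:
            conf = sum(predicate(r) for r in match) / len(match)
            if conf >= min_conf:
                out.append((combo, len(match), conf))
    return out

# Hypothetical monitoring table: poll records with a latency check.
rows = [
    {'router': 'r1', 'region': 'east', 'ok': True},
    {'router': 'r1', 'region': 'east', 'ok': True},
    {'router': 'r2', 'region': 'east', 'ok': True},
    {'router': 'r3', 'region': 'west', 'ok': False},
    {'router': 'r3', 'region': 'west', 'ok': False},
]
pats = tableau(rows, ['router', 'region'], lambda r: r['ok'])
```

Here the constraint holds for the 'east' region but not globally, so the wildcard-only pattern is excluded; that concise summary of where a constraint holds or fails is exactly what a pattern tableau provides.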
Estimating the Size of Onion Epidermal Cells from Diffraction Patterns
NASA Astrophysics Data System (ADS)
Groff, Jeffrey R.
2012-10-01
Bioscience and premedical profession students are a major demographic served by introductory physics courses at many colleges and universities. Exposing these students to biological applications of physical principles will help them to appreciate physics as a useful tool for their future professions. Here I describe an experiment suitable for introductory physics where principles of wave optics are applied to probe the size of onion epidermal cells. The epidermis tissue is composed of cells of relatively uniform size and shape (Fig. 1) so the tissue acts like a one-dimensional transmission diffraction grating. The diffraction patterns generated when a laser beam passes through the tissue (Fig. 2) are analyzed and an estimate of the average width of individual onion epidermal cells is calculated. The results are compared to direct measurements taken using a light microscope. The use of microscopes and plant-cell tissue slides creates opportunities for cross-discipline collaboration between physics and biology instructors.
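The analysis reduces to the grating equation d sin(theta) = m lambda, with sin(theta) recovered from the fringe position on the screen. A short sketch; the laser wavelength, screen distance, and fringe spacing below are illustrative numbers, not the article's data:

```python
import math

def cell_width(wavelength, screen_distance, fringe_spacing, order=1):
    """Average cell width d from d*sin(theta) = m*lambda, with
    sin(theta) recovered exactly from the fringe position x and the
    screen distance L (no small-angle approximation needed)."""
    x = order * fringe_spacing
    sin_theta = x / math.hypot(x, screen_distance)
    return order * wavelength / sin_theta

# Illustrative numbers: 632.8 nm HeNe laser, screen at 1.00 m,
# first-order maxima 1.5 mm from the central spot.
d = cell_width(632.8e-9, 1.00, 1.5e-3)
```

For these values d is about 0.42 mm, a plausible onion epidermal cell width, and at such small angles the result matches the small-angle estimate lambda*L/x almost exactly.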
Santos, Ana Paula; Ferreira, Liliana J.; Oliveira, M. Margarida
2017-01-01
The spatial organization of chromosome structure within the interphase nucleus, as well as the patterns of methylome and histone modifications, represent intersecting layers that influence genome accessibility and function. This review is focused on the plastic nature of chromatin structure and epigenetic marks in association with stress situations. The use of chemical compounds (epigenetic drugs) or T-DNA-mediated mutagenesis affecting epigenetic regulators (epi-mutants) is discussed as an important tool for studying the impact of deregulated epigenetic backgrounds on gene function and phenotype. The inheritability of epigenetic marks and chromatin configurations across successive generations is interpreted as a way for plants to “communicate” past experiences of stress sensing. A mechanistic understanding of chromatin and epigenetic plasticity in the plant response to stress, including tissue- and genotype-specific epigenetic patterns, may help to reveal the epigenetic contributions to genome and phenotype regulation. PMID:28275209
Dumonceaux, Tim J.; Green, Margaret; Hammond, Christine; Perez, Edel; Olivier, Chrystel
2014-01-01
Phytoplasmas (‘Candidatus Phytoplasma’ spp.) are insect-vectored bacteria that infect a wide variety of plants, including many agriculturally important species. The infections can cause devastating yield losses by inducing morphological changes that dramatically alter inflorescence development. Detection of phytoplasma infection typically utilizes sequences located within the 16S–23S rRNA-encoding locus, and these sequences are necessary for strain identification by currently accepted standards for phytoplasma classification. However, these methods can generate PCR products >1400 bp that are less divergent in sequence than protein-encoding genes, limiting strain resolution in certain cases. We describe a method for accessing the chaperonin-60 (cpn60) gene sequence from a diverse array of ‘Ca.Phytoplasma’ spp. Two degenerate primer sets were designed based on the known sequence diversity of cpn60 from ‘Ca.Phytoplasma’ spp. and used to amplify cpn60 gene fragments from various reference samples and infected plant tissues. Forty three cpn60 sequences were thereby determined. The cpn60 PCR-gel electrophoresis method was highly sensitive compared to 16S-23S-targeted PCR-gel electrophoresis. The topology of a phylogenetic tree generated using cpn60 sequences was congruent with that reported for 16S rRNA-encoding genes. The cpn60 sequences were used to design a hybridization array using oligonucleotide-coupled fluorescent microspheres, providing rapid diagnosis and typing of phytoplasma infections. The oligonucleotide-coupled fluorescent microsphere assay revealed samples that were infected simultaneously with two subtypes of phytoplasma. These tools were applied to show that two host plants, Brassica napus and Camelina sativa, displayed different phytoplasma infection patterns. PMID:25551224
NASA Astrophysics Data System (ADS)
Xin, YANG; Si-qi, WU; Qi, ZHANG
2018-05-01
Beijing, London, Paris, and New York are representative world cities, so a comparative study of the green patterns of these four cities is important for identifying gaps and strengths and for learning from one another; it also provides a basis and new ideas for the development of metropolises in China. Against the background of big data, API (Application Programming Interface) systems can provide extensive and accurate basic data for studying urban green patterns in different geographical environments, both domestic and foreign. On this basis, the Average Nearest Neighbor, Kernel Density, and Standard Ellipse tools of the ArcGIS platform can process and summarize the data and enable quantitative analysis of green patterns. The paper summarizes the uniqueness of the four cities' green patterns and the reasons for their formation on the basis of numerical comparison.
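Of the ArcGIS tools mentioned, the Average Nearest Neighbor statistic is easy to sketch from first principles: the observed mean nearest-neighbor distance is compared with the value expected for a spatially random pattern of the same density, 0.5/sqrt(n/A). A minimal version with illustrative point sets:

```python
import math

def ann_ratio(points, area):
    """Average Nearest Neighbor ratio: observed mean nearest-neighbor
    distance over the expectation 0.5/sqrt(n/A) for a random pattern.
    Ratio < 1 suggests clustering; > 1 suggests dispersion."""
    n = len(points)
    d_obs = sum(
        min(math.dist(p, q) for q in points if q is not p)
        for p in points
    ) / n
    d_exp = 0.5 / math.sqrt(n / area)
    return d_obs / d_exp

# Illustrative green-space centroids in a unit study area:
grid = [(0.25, 0.25), (0.25, 0.75), (0.75, 0.25), (0.75, 0.75)]
r_grid = ann_ratio(grid, area=1.0)   # evenly spread: ratio > 1
```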
Wang, Jian; Anania, Veronica G.; Knott, Jeff; Rush, John; Lill, Jennie R.; Bourne, Philip E.; Bandeira, Nuno
2014-01-01
The combination of chemical cross-linking and mass spectrometry has recently been shown to constitute a powerful tool for studying protein–protein interactions and elucidating the structure of large protein complexes. However, computational methods for interpreting the complex MS/MS spectra from linked peptides are still in their infancy, making the high-throughput application of this approach largely impractical. Because of the lack of large annotated datasets, most current approaches do not capture the specific fragmentation patterns of linked peptides and therefore are not optimal for the identification of cross-linked peptides. Here we propose a generic approach to address this problem and demonstrate it using disulfide-bridged peptide libraries to (i) efficiently generate large mass spectral reference data for linked peptides at a low cost and (ii) automatically train an algorithm that can efficiently and accurately identify linked peptides from MS/MS spectra. We show that using this approach we were able to identify thousands of MS/MS spectra from disulfide-bridged peptides through comparison with proteome-scale sequence databases and significantly improve the sensitivity of cross-linked peptide identification. This allowed us to identify 60% more direct pairwise interactions between the protein subunits in the 20S proteasome complex than existing tools on cross-linking studies of the proteasome complexes. The basic framework of this approach and the MS/MS reference dataset generated should be valuable resources for the future development of new tools for the identification of linked peptides. PMID:24493012
Understanding and reduction of defects on finished EUV masks
NASA Astrophysics Data System (ADS)
Liang, Ted; Sanchez, Peter; Zhang, Guojing; Shu, Emily; Nagpal, Rajesh; Stivers, Alan
2005-05-01
To reduce the risk of EUV lithography adaptation for the 32nm technology node in 2009, Intel has operated a EUV mask Pilot Line since early 2004. The Pilot Line integrates all the necessary process modules, including common tool sets shared with current photomask production as well as EUV-specific tools. This integrated endeavor ensures a comprehensive understanding of any issues, and development of solutions for the eventual fabrication of defect-free EUV masks. Two enabling modules for "defect-free" masks are pattern inspection and repair, which have been integrated into the Pilot Line. This is the first time we are able to look at real defects originating from multilayer blanks and the patterning process on finished masks over the entire mask area. In this paper, we describe our efforts in qualifying the DUV pattern inspection and electron beam mask repair tools for Pilot Line operation, including inspection tool sensitivity, defect classification and characterization, and defect repair. We discuss the origins of each of the five classes of defects seen by the DUV pattern inspection tool on finished masks, and present solutions for eliminating and mitigating them.
Generation 1.5 Written Error Patterns: A Comparative Study
ERIC Educational Resources Information Center
Doolan, Stephen M.; Miller, Donald
2012-01-01
In an attempt to contribute to existing research on Generation 1.5 students, the current study uses quantitative and qualitative methods to compare error patterns in a corpus of Generation 1.5, L1, and L2 community college student writing. This error analysis provides one important way to determine if error patterns in Generation 1.5 student…
cuBLASTP: Fine-Grained Parallelization of Protein Sequence Search on CPU+GPU.
Zhang, Jing; Wang, Hao; Feng, Wu-Chun
2017-01-01
BLAST, short for Basic Local Alignment Search Tool, is a ubiquitous tool used in the life sciences for pairwise sequence search. However, with the advent of next-generation sequencing (NGS), whether at the outset or downstream from NGS, the exponential growth of sequence databases is outstripping our ability to analyze the data. While recent studies have utilized the graphics processing unit (GPU) to speed up the BLAST algorithm for searching protein sequences (i.e., BLASTP), these studies use coarse-grained parallelism, where one sequence alignment is mapped to only one thread. Such an approach does not efficiently utilize the capabilities of a GPU, particularly due to the irregularity of BLASTP in both execution paths and memory-access patterns. To address the above shortcomings, we present a fine-grained approach to parallelize BLASTP, where each individual phase of sequence search is mapped to many threads on a GPU. This approach, which we refer to as cuBLASTP, reorders data-access patterns and reduces divergent branches in the most time-consuming phases (i.e., hit detection and ungapped extension). In addition, cuBLASTP optimizes the remaining phases (i.e., gapped extension and alignment with traceback) on a multicore CPU and overlaps their execution with the phases running on the GPU.
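The hit-detection phase named in the abstract can be illustrated with a deliberately simplified serial sketch: find all fixed-length words shared between a query and a subject sequence. This is not cuBLASTP's GPU code, and real BLASTP also scores neighboring words against a substitution matrix; this toy uses exact matches only.

```python
# Simplified sketch of BLASTP-style "hit detection": find all length-w words
# shared between a query and a subject sequence. Real BLASTP also admits
# high-scoring neighboring words; this toy counts exact w-mer matches only.

def find_hits(query, subject, w=3):
    """Return (query_pos, subject_pos) pairs where a w-mer matches exactly."""
    # Index every w-mer of the query: word -> list of query offsets
    index = {}
    for i in range(len(query) - w + 1):
        index.setdefault(query[i:i + w], []).append(i)
    # Scan the subject and look each w-mer up in the index
    hits = []
    for j in range(len(subject) - w + 1):
        for i in index.get(subject[j:j + w], []):
            hits.append((i, j))
    return hits

hits = find_hits("MKTAYIAKQR", "LDTAYIAKQC", w=3)  # → [(2, 2), (3, 3), (4, 4), (5, 5), (6, 6)]
```

Hits lying on the same diagonal (constant i - j) then seed ungapped extension; the irregular, data-dependent loop structure above is exactly the kind of divergence cuBLASTP reorders when mapping these lookups to many GPU threads.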
Reticles, write time, and the need for speed
NASA Astrophysics Data System (ADS)
Ackmann, Paul W.; Litt, Lloyd C.; Ning, Guo Xiang
2014-10-01
Historical data indicates reticle write times are increasing node-to-node. The cost of mask sets is increasing driven by the tighter requirements and more levels. The regular introduction of new generations of mask patterning tools with improved performance is unable to fully compensate for the increased data and complexity required. Write time is a primary metric that drives mask fabrication speed. Design (Raw data) is only the first step in the process and many interactions between mask and wafer technology such as OPC used, OPC efficiency for writers, fracture engines, and actual field size used drive total write time. Yield, technology, and inspection rules drive the remaining raw cycle time. Yield can be even more critical for speed of delivery as it drives re-writes and wasted time. While intrinsic process yield is important, repair capability is the reason mask delivery is still able to deliver 100% good reticles to the fab. Advanced nodes utilizing several layers of multiple patterning may require mask writer tool dedication to meet image placement specifications. This will increase the effective mask cycle time for a layer mask set and drive the need for additional mask write capability in order to deliver masks at the rate required by the wafer fab production schedules.
RipleyGUI: software for analyzing spatial patterns in 3D cell distributions
Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik
2013-01-01
The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition, the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface, making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons, which is important for a detailed study of structure-function relationships. For example, neocortex, which can be subdivided into six layers based on cell density and cell types, can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
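The estimator at the core of this kind of analysis can be sketched in a few lines. The following is a naive 3D Ripley's K implementation in Python/NumPy, ignoring the edge corrections that RipleyGUI itself provides; under complete spatial randomness K(r) approaches (4/3)πr³, and departures indicate clustering or regularity.

```python
import numpy as np

def ripley_k_3d(points, radii, volume):
    """Naive 3D Ripley's K estimate (no edge correction).

    points: (n, 3) array of coordinates; volume: volume of the study region.
    Under complete spatial randomness, K(r) is close to (4/3)*pi*r**3.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # All pairwise distances between points
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # exclude self-pairs from the counts
    # K(r) = V / (n(n-1)) * number of ordered pairs closer than r
    return np.array([volume * np.sum(d <= r) / (n * (n - 1)) for r in radii])

rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(500, 3))  # CSR point pattern in a unit cube
k = ripley_k_3d(pts, radii=[0.1, 0.2], volume=1.0)
```

For clustered neuronal positions the empirical K(r) would exceed the CSR reference at the relevant radii; edge correction matters near the boundary of the sampled tissue block, which is why the real software implements it.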
Mala, Sankeerti; Rathod, Vanita; Pundir, Siddharth; Dixit, Sudhanshu
2017-01-01
The unique pattern and structural diversity of fingerprints, lip prints, and palatal rugae, and their occurrence in different patterns among individuals, raise the question of whether they are completely unique even within a family hierarchy. Do the patterns repeat across generations, or is this mere chaos theory? The present study aims to assess pattern self-repetition of fingerprints, lip prints, and palatal rugae across three generations of ten different families. The study was conducted at Rungta College of Dental Science and Research, Bhilai, India. Only participants born in Chhattisgarh were included in the study. Thirty participants from three consecutive generations of ten different families were briefed about the purpose of the study, and their fingerprints, lip prints, and palatal rugae impressions were recorded and analyzed for pattern self-repetition. Multiple comparisons among the generations and a one-way analysis of variance test were performed using SPSS 20 (trial version). Among the primary palatal rugae patterns, 10% showed repetition in all three generations. Thirty percent showed repetition of the thumb fingerprint pattern in all three generations. For the lip print pattern in the middle third of the lower lip, 20% showed repetition in alternating generations. The evaluation of fingerprints, lip prints, and palatal rugae showed fractal dimensions, with variations in dimension according to the complexity of each structure. Although a minute self-repetition in the patterns of lip, thumb, and palate was observed among the three consecutive generations within a family, given the sample size these results need to be confirmed in a larger sample, either to establish the role of chaos theory in forensic science or to identify a particular pattern of the individual within his family hierarchy.
Heterogeneous Mobile Phone Ownership and Usage Patterns in Kenya
Wesolowski, Amy; Eagle, Nathan; Noor, Abdisalan M.; Snow, Robert W.; Buckee, Caroline O.
2012-01-01
The rapid adoption of mobile phone technologies in Africa is offering exciting opportunities for engaging with high-risk populations through mHealth programs, and the vast volumes of behavioral data being generated as people use their phones provide valuable data about human behavioral dynamics in these regions. Taking advantage of these opportunities requires an understanding of the penetration of mobile phones and phone usage patterns across the continent, but very little is known about the social and geographical heterogeneities in mobile phone ownership among African populations. Here, we analyze a survey of mobile phone ownership and usage across Kenya in 2009 and show that distinct regional, gender-related, and socioeconomic variations exist, with particularly low ownership among rural communities and poor people. We also examine patterns of phone sharing and highlight the contrasting relationships between ownership and sharing in different parts of the country. This heterogeneous penetration of mobile phones has important implications for the use of mobile technologies as a source of population data and as a public health tool in sub-Saharan Africa. PMID:22558140
Light Therapy and Alzheimer’s Disease and Related Dementia: Past, Present, and Future
Hanford, Nicholas; Figueiro, Mariana
2012-01-01
Sleep disturbances are common in persons with Alzheimer’s disease or related dementia (ADRD), resulting in a negative impact on the daytime function of the affected person and on the wellbeing of caregivers. The sleep/wake pattern is directly driven by the timing signals generated by a circadian pacemaker, which may or may not be perfectly functioning in those with ADRD. A 24-hour light/dark pattern incident on the retina is the most efficacious stimulus for entraining the circadian system to the solar day. In fact, a carefully orchestrated light/dark pattern has been shown in several controlled studies of older populations, with and without ADRD, to be a powerful non-pharmacological tool to improve sleep efficiency and consolidation. Discussed here are research results from studies looking at the effectiveness of light therapy in improving sleep, depression, and agitation in older adults with ADRD. A 24-hour lighting scheme to increase circadian entrainment, improve visibility, and reduce the risk of falls in those with ADRD is proposed, and future research needs are discussed. PMID:23099814
Light therapy and Alzheimer's disease and related dementia: past, present, and future.
Hanford, Nicholas; Figueiro, Mariana
2013-01-01
Sleep disturbances are common in persons with Alzheimer's disease or related dementia (ADRD), resulting in a negative impact on the daytime function of the affected person and on the wellbeing of caregivers. The sleep/wake pattern is directly driven by the timing signals generated by a circadian pacemaker, which may or may not be perfectly functioning in those with ADRD. A 24-hour light/dark pattern incident on the retina is the most efficacious stimulus for entraining the circadian system to the solar day. In fact, a carefully orchestrated light/dark pattern has been shown in several controlled studies of older populations, with and without ADRD, to be a powerful non-pharmacological tool to improve sleep efficiency and consolidation. Discussed here are research results from studies looking at the effectiveness of light therapy in improving sleep, depression, and agitation in older adults with ADRD. A 24-hour lighting scheme to increase circadian entrainment, improve visibility, and reduce the risk of falls in those with ADRD is proposed, and future research needs are discussed.
Kaplan, Jonas T.; Man, Kingson; Greening, Steven G.
2015-01-01
Here we highlight an emerging trend in the use of machine learning classifiers to test for abstraction across patterns of neural activity. When a classifier algorithm is trained on data from one cognitive context, and tested on data from another, conclusions can be drawn about the role of a given brain region in representing information that abstracts across those cognitive contexts. We call this kind of analysis Multivariate Cross-Classification (MVCC), and review several domains where it has recently made an impact. MVCC has been important in establishing correspondences among neural patterns across cognitive domains, including motor-perception matching and cross-sensory matching. It has been used to test for similarity between neural patterns evoked by perception and those generated from memory. Other work has used MVCC to investigate the similarity of representations for semantic categories across different kinds of stimulus presentation, and in the presence of different cognitive demands. We use these examples to demonstrate the power of MVCC as a tool for investigating neural abstraction and discuss some important methodological issues related to its application. PMID:25859202
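The logic of MVCC can be illustrated with a minimal synthetic example: train a classifier on patterns from one cognitive context and test it on patterns from another; above-chance transfer implies a context-invariant code. The sketch below uses a nearest-centroid classifier on simulated "voxel" patterns; all data, dimensions, and context labels are invented for illustration, not taken from the reviewed studies.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_patterns(mean_a, mean_b, n=50, dim=20, noise=1.0):
    """Synthetic multivoxel patterns for two stimulus classes."""
    a = mean_a + noise * rng.standard_normal((n, dim))
    b = mean_b + noise * rng.standard_normal((n, dim))
    X = np.vstack([a, b])
    y = np.array([0] * n + [1] * n)
    return X, y

dim = 20
mu_a, mu_b = rng.standard_normal(dim), rng.standard_normal(dim)

# Context 1 (e.g. perception) and context 2 (e.g. memory retrieval) share the
# same class means here, i.e. this toy region carries a context-invariant code.
X_train, y_train = make_patterns(mu_a, mu_b)
X_test, y_test = make_patterns(mu_a, mu_b)

# Nearest-centroid classifier: fit on context 1, test on context 2 (MVCC)
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=-1)
pred = dists.argmin(axis=1)
accuracy = (pred == y_test).mean()  # above chance => cross-context transfer
```

If the two contexts used different class means, accuracy would fall to chance (0.5), which is the null result an MVCC analysis interprets as evidence against an abstract, shared representation.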
Heterogeneous mobile phone ownership and usage patterns in Kenya.
Wesolowski, Amy; Eagle, Nathan; Noor, Abdisalan M; Snow, Robert W; Buckee, Caroline O
2012-01-01
The rapid adoption of mobile phone technologies in Africa is offering exciting opportunities for engaging with high-risk populations through mHealth programs, and the vast volumes of behavioral data being generated as people use their phones provide valuable data about human behavioral dynamics in these regions. Taking advantage of these opportunities requires an understanding of the penetration of mobile phones and phone usage patterns across the continent, but very little is known about the social and geographical heterogeneities in mobile phone ownership among African populations. Here, we analyze a survey of mobile phone ownership and usage across Kenya in 2009 and show that distinct regional, gender-related, and socioeconomic variations exist, with particularly low ownership among rural communities and poor people. We also examine patterns of phone sharing and highlight the contrasting relationships between ownership and sharing in different parts of the country. This heterogeneous penetration of mobile phones has important implications for the use of mobile technologies as a source of population data and as a public health tool in sub-Saharan Africa.
The power of fission: yeast as a tool for understanding complex splicing.
Fair, Benjamin Jung; Pleiss, Jeffrey A
2017-06-01
Pre-mRNA splicing is an essential component of eukaryotic gene expression. Many metazoans, including humans, regulate alternative splicing patterns to generate expansions of their proteome from a limited number of genes. Importantly, a considerable fraction of human disease causing mutations manifest themselves through altering the sequences that shape the splicing patterns of genes. Thus, understanding the mechanistic bases of this complex pathway will be an essential component of combating these diseases. Dating almost to the initial discovery of splicing, researchers have taken advantage of the genetic tractability of budding yeast to identify the components and decipher the mechanisms of splicing. However, budding yeast lacks the complex splicing machinery and alternative splicing patterns most relevant to humans. More recently, many researchers have turned their efforts to study the fission yeast, Schizosaccharomyces pombe, which has retained many features of complex splicing, including degenerate splice site sequences, the usage of exonic splicing enhancers, and SR proteins. Here, we review recent work using fission yeast genetics to examine pre-mRNA splicing, highlighting its promise for modeling the complex splicing seen in higher eukaryotes.
NASA Astrophysics Data System (ADS)
Nagahara, Seiji; Carcasi, Michael; Nakagawa, Hisashi; Buitrago, Elizabeth; Yildirim, Oktay; Shiraishi, Gosuke; Terashita, Yuichi; Minekawa, Yukie; Yoshihara, Kosuke; Tomono, Masaru; Mizoguchi, Hironori; Estrella, Joel; Nagai, Tomoki; Naruoka, Takehiko; Dei, Satoshi; Hori, Masafumi; Oshima, Akihiro; Vockenhuber, Michaela; Ekinci, Yasin; Meeuwissen, Marieke; Verspaget, Coen; Hoefnagels, Rik; Rispens, Gijsbert; Maas, Raymond; Nakashima, Hideo; Tagawa, Seiichi
2016-03-01
This paper proposes a promising approach to break the resolution (R), line-edge-roughness (LER), and sensitivity (S) trade-off (RLS trade-off) relationships that limit the ultimate lithographic performance of standard chemically amplified resists (CAR). This is accomplished in a process that uses a Photosensitized Chemically Amplified Resist (PSCAR) in combination with a flood-exposure in an in-line track connected to a pattern exposure tool. PSCAR is a modified CAR which contains a photosensitizer precursor (PP) in addition to other standard CAR components such as a protected polymer, a photo acid generator (PAG) and a quencher. In this paper, the PSCAR concept and the required conditions in resist formulation are carefully explained. In the PSCAR process, the sensitivity improvement is accomplished by PAG decomposition to selectively generate more acid at the pattern exposed areas during the flood exposure. The selective photosensitization happens through the excitation of the photosensitizer (PS) generated by the deprotection of the PP at the pattern exposed areas. A higher resist chemical gradient which leads to an improved resolution and lower LER values is also predicted using the PSCAR simulator. In the PSCAR process, the improved chemical gradient can be realized by dual acid quenching steps with the help of increased quencher concentration. Acid quenching first happens simultaneously with acid catalytic PP to PS reactions. As a result, a sharpened PS latent image is created in the PSCAR. This image is subsequently excited by the flood exposure creating additional acid products at the pattern exposed areas only. Much the same as in the standard CAR system, unnecessary acid present in the non-pattern exposed areas can be neutralized by the remaining quencher to therefore produce sharper acid latent images. 
EUV exposure results down to 15 nm half-pitch (HP) line/space (L/S) patterns using a PSCAR resist indicate that the use of PSCAR has the potential to improve the sensitivity of the system while simultaneously improving the line-width roughness (LWR) with added quencher and flood exposure doses. In addition, improved across-wafer critical dimension uniformity (CDU) is realized by the use of a PSCAR in combination with a flood exposure using a pre-α UV exposure module.
Miyawaki, Christina E
2016-03-01
This study is a cross-sectional investigation of caregiving practice patterns among Asian, Hispanic and non-Hispanic White American family caregivers of older adults across three immigrant generations. The 2009 California Health Interview Survey (CHIS) dataset was used, and 591 Asian, 989 Hispanic and 6537 non-Hispanic White American caregivers of older adults were selected. First, descriptive analyses of caregivers' characteristics, caregiving situations and practice patterns were examined by racial/ethnic groups and immigrant generations. Practice patterns measured were respite care use, hours and length of caregiving. Three hypotheses on caregiving patterns based on assimilation theory were tested and analyzed using logistic regression and generalized linear models by racial/ethnic groups and generations. Caregiving patterns of non-Hispanic White caregivers supported all three hypotheses regarding respite care use, caregiving hours and caregiving duration, showing less caregiving involvement in later generations. However, Asian and Hispanic counterparts showed mixed results. Third generation Asian and Hispanic caregivers used respite care the least and spent the most caregiving hours per week and had the longest caregiving duration compared to earlier generations. These caregiving patterns revealed underlying cultural values related to filial responsibility, even among later generations of caregivers of color. Findings suggest the importance of considering the cultural values of each racial/ethnic group regardless of generation when working with racially and ethnically diverse populations of family caregivers of older adults.
Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit
2016-03-01
Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. 
Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.
Hybrid strategies for nanolithography and chemical patterning
NASA Astrophysics Data System (ADS)
Srinivasan, Charan
Remarkable technological advances in photolithography have extended patterning to the sub-50-nm regime. However, because photolithography is a top-down approach, it faces substantial technological and economic challenges in maintaining the downward scaling trends of feature sizes below 30 nm. Concurrently, fundamental research on chemical self-assembly has enabled the path to access molecular length scales. The key to the success of photolithography is its inherent economies of scale, which justify the large capital investment for its implementation. In this thesis research, top-down and bottom-up approaches have been combined synergistically, and these hybrid strategies have been employed in applications that do not have the economies of scale found in semiconductor chip manufacturing. The specific instances of techniques developed here include molecular-ruler lithography and a series of nanoscale chemical patterning methods. Molecular-ruler lithography utilizes self-assembled multilayered films as a sidewall spacer on initial photolithographically patterned gold features (parent) to place a second-generation feature (daughter) in precise proximity to the parent. The parent-daughter separation, which is on the nanometer length scale, is defined by the thickness of the molecular-ruler resist. Analogous to protocols followed in industry to evaluate lithographic performance, electrical test-pad structures were designed to interrogate the nanostructures patterned by molecular-ruler nanolithography, failure modes creating electrical shorts were mapped to each lithographic step, and subsequent lithographic optimization was performed to pattern nanoscale devices with excellent electrical performance. The optimized lithographic processes were applied to generate nanoscale devices such as nanowires and thin-film transistors (TFTs). 
Metallic nanowires were patterned by depositing a tertiary generation material in the nanogap and surrounding micron-scale regions, and then chemically removing the parent and daughter structures selectively. This processing was also performed on silicon-on-insulator substrates and the metallic nanowires were used as a hard mask to transfer the pattern to the single crystalline silicon epilayer resulting in a quaternary generation structure of single-crystalline silicon nanowire field-effect transistors. Additionally, the proof of concept for patterning nanoscale pentacene TFTs utilizing molecular-rulers was demonstrated. For applications in sub-100-nm lithography, the limitations on the relative heights of parent and daughter structures were overcome and processes to integrate molecular-ruler nanolithography with existing complementary metal-oxide-semiconductor (CMOS) processing were developed. Pattern transfer to underlying SiO2 substrates has opened a new avenue of opportunities to apply these nanostructures in nanofluidics and in non-traditional lithography such as imprint lithography. Additionally, the molecular-ruler process has been shown to increase the spatial density of features created by high-resolution techniques such as electron-beam lithography. A limitation of photolithography is its inability to pattern chemical functionality on surfaces. To overcome this limitation, two techniques were developed to extend nanolithography beyond semiconductors and apply them to patterning of self-assembled monolayers. First, a novel bilayer resist was devised to protect and to pattern chemical functionality on surfaces by being able to withstand conditions necessary for both chemical self-assembly and photooxidation of the Au-S bond while not disrupting the preexisting SAM. In addition to photolithography, soft-lithographic approaches such as microcontact printing are often used to create chemical patterns. 
In this work, a technique for the creation of chemical patterns of inserted molecules with dilute coverages (≤10%) was implemented. As part of the research in chemical patterning, a method for characterizing chemical patterns using scanning electron microscopy has been developed. These tools are the standard for metrology in nanolithography, and thus are readily accessible as our advances in chemical patterning are adopted and applied by the lithography community.
Arora, Naveen Kumar; Khare, Ekta; Singh, Sachin; Tewari, Sakshi
2018-01-01
Pigeon pea ( Cajanus cajan ) is one of the most important legumes grown in the northern province of Uttar Pradesh, India. However, its productivity in Uttar Pradesh is lower than the average yield of adjoining states. During the present study, a survey of pigeon pea growing agricultural fields was carried out, and it was found that 80% of plants were inadequately nodulated. The study aimed to evaluate the pigeon pea symbiotic compatibility and nodulation efficiency of root-nodulating bacteria isolated from various legumes, and to explore the phenetic and genetic diversity of the rhizobial population nodulating pigeon pea grown in fields of Uttar Pradesh. Of the 96 isolates, 40 showed nodulation in pigeon pea. These 40 isolates were further characterized by phenotypic, biochemical and physiological tests. Intrinsic antibiotic resistance patterns were used to generate a similarity matrix, revealing 10 phenons. The study shows that most of the isolates nodulating pigeon pea in this region were rapid growers. The dendrogram generated using the NTSYSpc software grouped RAPD patterns into 19 clusters. The high degree of phenetic and genetic diversity encountered is probably a consequence of a history of mixed cropping of legumes. The assessment of diversity is an important tool and can be used to improve the nodulation and quality of the pigeon pea crop. It is also concluded that the difference between the phenetic and RAPD clustering patterns indicates that the rhizobial diversity of pigeon pea is not yet completely understood and settled.
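The similarity-matrix step described above (binary intrinsic antibiotic resistance profiles scored into phenons) can be sketched with a simple matching coefficient. The isolate names and profiles below are hypothetical placeholders, and the real study scored many more isolates and antibiotics before clustering them in NTSYSpc.

```python
import numpy as np

# Hypothetical binary resistance profiles (1 = resistant) of four isolates
# against five antibiotics; invented for illustration only.
profiles = {
    "PP1": [1, 0, 1, 1, 0],
    "PP2": [1, 0, 1, 0, 0],
    "PP3": [0, 1, 0, 0, 1],
    "PP4": [0, 1, 0, 1, 1],
}

names = list(profiles)
data = np.array([profiles[n] for n in names])

# Simple matching coefficient: fraction of antibiotics with identical response
n = len(names)
sim = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        sim[i, j] = np.mean(data[i] == data[j])
# A UPGMA dendrogram built from this matrix (as NTSYSpc produces) would then
# group isolates with high mutual similarity into phenons.
```

Here PP1 and PP2 agree on 4 of 5 antibiotics (similarity 0.8) while PP1 and PP3 agree on none, so a cut through the resulting dendrogram would place {PP1, PP2} and {PP3, PP4} in separate phenons.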
Pe'er, Guy; Zurita, Gustavo A.; Schober, Lucia; Bellocq, Maria I.; Strer, Maximilian; Müller, Michael; Pütz, Sandro
2013-01-01
Landscape simulators are widely applied in landscape ecology for generating landscape patterns. These models can be divided into two categories: pattern-based models that generate spatial patterns irrespective of the processes that shape them, and process-based models that attempt to generate patterns based on the processes that shape them. The latter often tend toward complexity in an attempt to obtain high predictive precision, but are rarely used for generic or theoretical purposes. Here we show that a simple process-based simulator can generate a variety of spatial patterns including realistic ones, typifying landscapes fragmented by anthropogenic activities. The model “G-RaFFe” generates roads and fields to reproduce the processes in which forests are converted into arable lands. For a selected level of habitat cover, three factors dominate its outcomes: the number of roads (accessibility), maximum field size (accounting for land ownership patterns), and maximum field disconnection (which enables field to be detached from roads). We compared the performance of G-RaFFe to three other models: Simmap (neutral model), Qrule (fractal-based) and Dinamica EGO (with 4 model versions differing in complexity). A PCA-based analysis indicated G-RaFFe and Dinamica version 4 (most complex) to perform best in matching realistic spatial patterns, but an alternative analysis which considers model variability identified G-RaFFe and Qrule as performing best. We also found model performance to be affected by habitat cover and the actual land-uses, the latter reflecting on land ownership patterns. We suggest that simple process-based generators such as G-RaFFe can be used to generate spatial patterns as templates for theoretical analyses, as well as for gaining better understanding of the relation between spatial processes and patterns. 
We suggest caution in applying neutral or fractal-based approaches, since spatial patterns that typify anthropogenic landscapes are often non-fractal in nature. PMID:23724108
Pe'er, Guy; Zurita, Gustavo A; Schober, Lucia; Bellocq, Maria I; Strer, Maximilian; Müller, Michael; Pütz, Sandro
2013-01-01
Landscape simulators are widely applied in landscape ecology for generating landscape patterns. These models can be divided into two categories: pattern-based models that generate spatial patterns irrespective of the processes that shape them, and process-based models that attempt to generate patterns based on the processes that shape them. The latter often tend toward complexity in an attempt to obtain high predictive precision, but are rarely used for generic or theoretical purposes. Here we show that a simple process-based simulator can generate a variety of spatial patterns including realistic ones, typifying landscapes fragmented by anthropogenic activities. The model "G-RaFFe" generates roads and fields to reproduce the processes in which forests are converted into arable lands. For a selected level of habitat cover, three factors dominate its outcomes: the number of roads (accessibility), maximum field size (accounting for land ownership patterns), and maximum field disconnection (which enables field to be detached from roads). We compared the performance of G-RaFFe to three other models: Simmap (neutral model), Qrule (fractal-based) and Dinamica EGO (with 4 model versions differing in complexity). A PCA-based analysis indicated G-RaFFe and Dinamica version 4 (most complex) to perform best in matching realistic spatial patterns, but an alternative analysis which considers model variability identified G-RaFFe and Qrule as performing best. We also found model performance to be affected by habitat cover and the actual land-uses, the latter reflecting on land ownership patterns. We suggest that simple process-based generators such as G-RaFFe can be used to generate spatial patterns as templates for theoretical analyses, as well as for gaining better understanding of the relation between spatial processes and patterns. 
We suggest caution in applying neutral or fractal-based approaches, since spatial patterns that typify anthropogenic landscapes are often non-fractal in nature.
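The idea of a simple process-based landscape generator can be conveyed with a toy sketch. The following is loosely inspired by G-RaFFe's road-and-field mechanism but is not its algorithm: it carves straight roads through a forest grid, then grows square fields around already-converted cells until a target habitat cover remains (collapsing the model's road-attachment and field-disconnection rules into one simplification).

```python
import numpy as np

def toy_landscape(size=40, n_roads=3, max_field=6, habitat_cover=0.6, seed=1):
    """Toy process-based fragmentation generator, loosely inspired by G-RaFFe
    (not its actual algorithm): 1 = forest, 0 = converted (road or field)."""
    rng = np.random.default_rng(seed)
    grid = np.ones((size, size), dtype=int)
    # Roads: straight vertical lines at random columns (the accessibility factor)
    road_cols = rng.choice(size, size=n_roads, replace=False)
    grid[:, road_cols] = 0
    target = int(habitat_cover * size * size)
    # Fields: square patches grown around already-converted cells until only
    # the requested forest cover remains
    for _ in range(10_000):
        if grid.sum() <= target:
            break
        converted = np.argwhere(grid == 0)
        zr, zc = converted[rng.integers(0, len(converted))]
        side = int(rng.integers(1, max_field + 1))
        r0, c0 = max(0, int(zr) - side // 2), max(0, int(zc) - side // 2)
        grid[r0:r0 + side, c0:c0 + side] = 0
    return grid

land = toy_landscape()
cover = land.mean()  # realized forest cover, at or just below habitat_cover
```

Because conversion spreads outward from roads, the remaining forest forms contiguous blocks between road corridors rather than the salt-and-pepper scatter a neutral model produces, which is the qualitative point of the paper's pattern-versus-process comparison.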
Maskless micro-ion-beam reduction lithography system
Leung, Ka-Ngo; Barletta, William A.; Patterson, David O.; Gough, Richard A.
2005-05-03
A maskless micro-ion-beam reduction lithography system is a system for projecting patterns onto a resist layer on a wafer with feature size down to below 100 nm. The MMRL system operates without a stencil mask. The patterns are generated by switching beamlets on and off from a two electrode blanking system or pattern generator. The pattern generator controllably extracts the beamlet pattern from an ion source and is followed by a beam reduction and acceleration column.
Producing Production Level Tooling in Prototype Timing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mc Hugh, Kevin Matthew; Knirsch, J.
A new rapid solidification process machine will be able to produce eight-inch diameter by six-inch thick finished cavities at the rate of one per hour - a rate that will change the tooling industry dramatically. Global Metal Technologies, Inc. (GMTI) (Solon, OH) has signed an exclusive license with the Idaho National Engineering and Environmental Laboratory (INEEL) (Idaho Falls, ID) for the development and commercialization of the rapid solidification process (RSP tooling). The first production machine is scheduled for delivery in July 2001. The RSP tooling process is a method of producing production level tooling in prototype timing. The process' inventor, Kevin McHugh, describes it as a rapid solidification method, which differentiates it from the standard spray forming methods. RSP itself is relatively straightforward. Molten metal is sprayed against the ceramic pattern, replicating the pattern's contours, surface texture and details. After spraying, the molten tool steel is cooled at room temperature and separated from the pattern. The irregular periphery of the freshly sprayed insert is squared off, either by machining or, in the case of harder tool steels, by wire EDM.
Generating DEM from LIDAR data - comparison of available software tools
NASA Astrophysics Data System (ADS)
Korzeniowska, K.; Lacka, M.
2011-12-01
In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data based on these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
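The raster comparison described here (per-cell differences and RMSE between each tool's DEM and the manually filtered reference) can be sketched roughly as follows; the grids and values are invented for illustration, and a real workflow would first co-register and resample the rasters to a common grid:

```python
import numpy as np

def compare_dems(dem, reference):
    """Difference statistics between a generated DEM and a reference DEM.

    Both inputs are 2-D elevation arrays on the same grid; NaN marks
    no-data cells. Returns the (min, max, mean) difference and the RMSE,
    the four measures reported in the study.
    """
    diff = dem - reference
    valid = diff[~np.isnan(diff)]  # ignore no-data cells
    rmse = float(np.sqrt(np.mean(valid ** 2)))
    return float(valid.min()), float(valid.max()), float(valid.mean()), rmse

# Toy example: a DEM with a uniform +0.5 m bias against its reference
reference = np.zeros((100, 100))
generated = reference + 0.5
print(compare_dems(generated, reference))  # (0.5, 0.5, 0.5, 0.5)
```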
Design space exploration for early identification of yield limiting patterns
NASA Astrophysics Data System (ADS)
Li, Helen; Zou, Elain; Lee, Robben; Hong, Sid; Liu, Square; Wang, JinYan; Du, Chunshan; Zhang, Recco; Madkour, Kareem; Ali, Hussein; Hsu, Danny; Kabeel, Aliaa; ElManhawy, Wael; Kwan, Joe
2016-03-01
In order to resolve the causality dilemma of which comes first, accurate design rules or real designs, this paper presents a flow for exploring the layout design space to identify, early on, problematic patterns that will negatively affect the yield. A new random layout generation method called the Layout Schema Generator (LSG) is reported in this paper; it generates realistic, design-like layouts without any design rule violations. Lithography simulation is then used on the generated layout to discover the potentially problematic patterns (hotspots). These hotspot patterns are further explored by randomly inducing feature and context variations in the identified hotspots through a flow called the Hotspot Variation Flow (HSV). Simulation is then performed on this expanded set of layout clips to identify further problematic patterns. These patterns are then classified into forbidden patterns that should be included in the design rule checker and legal patterns that need better handling in the RET recipes and processes.
Understanding Adherence and Prescription Patterns Using Large-Scale Claims Data.
Bjarnadóttir, Margrét V; Malik, Sana; Onukwugha, Eberechukwu; Gooden, Tanisha; Plaisant, Catherine
2016-02-01
Advanced computing capabilities and novel visual analytics tools now allow us to move beyond traditional cross-sectional summaries to analyze longitudinal prescription patterns and the impact of study design decisions. For example, design decisions regarding gaps and overlaps in prescription fill data are necessary for measuring adherence using prescription claims data. However, little is known regarding the impact of these decisions on measures of medication possession (e.g., the medication possession ratio). The goal of the study was to demonstrate the use of visualization tools for pattern discovery, hypothesis generation, and study design. We utilized EventFlow, novel discrete-event-sequence visualization software, to investigate patterns of prescription fills, including gaps and overlaps, using large-scale healthcare claims data. The study analyzes data from individuals who had at least two prescriptions for one of five hypertension medication classes: ACE inhibitors, angiotensin II receptor blockers, beta blockers, calcium channel blockers, and diuretics. We focused on those members initiating therapy with diuretics (19.2%) who may have concurrently or subsequently taken drugs in other classes as well. We identified longitudinal patterns in prescription fills for antihypertensive medications, investigated the implications of decisions regarding gap length and overlaps, and examined the impact on the average cost and adherence of the initial treatment episode. A total of 790,609 individuals are included in the study sample, 19.2% (N = 151,566) of whom started on diuretics first during the study period. The average age was 52.4 years and 53.1% of the population was female. When the allowable gap was zero, 34% of the population had continuous coverage and the average length of continuous coverage was 2 months.
In contrast, when the allowable gap was 30 days, 69% of the population showed a single continuous prescription period with an average length of 5 months. The average prescription cost of the period of continuous coverage ranged from US$3.44 (when the maximum gap was 0 days) to US$9.08 (when the maximum gap was 30 days). The treatment of overlaps had a smaller impact on the results. This proof-of-concept study illustrates the use of visual analytics tools in characterizing longitudinal medication possession. We find that prescription patterns and associated prescription costs are more influenced by allowable gap lengths than by the definition and treatment of overlaps. Research using medication gaps and overlaps to define medication possession in prescription claims data should pay particular attention to the definition and use of gap lengths.
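The sensitivity to the allowable gap can be illustrated with a small sketch of the episode-construction logic implied above. The rule shown (merge successive fills whose gap does not exceed the allowable maximum, absorbing overlaps) is our simplified reading of such study designs, not EventFlow's actual algorithm:

```python
from datetime import date, timedelta

def continuous_episodes(fills, allowable_gap_days):
    """Merge prescription fills into continuous-coverage episodes.

    `fills` is a list of (fill_date, days_supply) tuples sorted by date.
    A fill extends the current episode if it starts no more than
    `allowable_gap_days` after the previous supply runs out; overlapping
    fills are absorbed. Returns a list of (start, end) date pairs.
    """
    episodes = []
    start, end = None, None
    for fill_date, days_supply in fills:
        supply_end = fill_date + timedelta(days=days_supply)
        if start is None:
            start, end = fill_date, supply_end
        elif (fill_date - end).days <= allowable_gap_days:
            end = max(end, supply_end)  # extend episode; overlaps absorbed
        else:
            episodes.append((start, end))
            start, end = fill_date, supply_end
    if start is not None:
        episodes.append((start, end))
    return episodes

# Two 30-day fills separated by a 10-day gap in coverage
fills = [(date(2015, 1, 1), 30), (date(2015, 2, 10), 30)]
print(len(continuous_episodes(fills, 0)))   # 2 (gap breaks the episode)
print(len(continuous_episodes(fills, 30)))  # 1 (gap is tolerated)
```

The same fill history yields one episode or two depending only on the allowable gap, which is exactly why the study finds gap length to dominate the cost and adherence estimates.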
Cell force mapping using a double-sided micropillar array based on the moiré fringe method
NASA Astrophysics Data System (ADS)
Zhang, F.; Anderson, S.; Zheng, X.; Roberts, E.; Qiu, Y.; Liao, R.; Zhang, X.
2014-07-01
The mapping of traction forces is crucial to understanding the means by which cells regulate their behavior and physiological function to adapt to and communicate with their local microenvironment. To this end, polymeric micropillar arrays have been used for measuring cell traction force. However, the small scale of the micropillar deflections induced by cell traction forces results in highly inefficient force analyses using conventional optical approaches; in many cases, cell forces may be below the limits of detection achieved using conventional microscopy. To address these limitations, the moiré phenomenon has been leveraged as a visualization tool for cell force mapping due to its inherent magnification effect and capacity for whole-field force measurements. This Letter reports an optomechanical cell force sensor, namely, a double-sided micropillar array (DMPA) made of poly(dimethylsiloxane), on which one side is employed to support cultured living cells while the opposing side serves as a reference pattern for generating moiré patterns. The distance between the two sides, which is a crucial parameter influencing moiré pattern contrast, is predetermined during fabrication using theoretical calculations based on the Talbot effect that aim to optimize contrast. Herein, double-sided micropillar arrays were validated by mapping mouse embryo fibroblast contraction forces, and the resulting force maps were compared to conventional microscopy image analyses as the reference standard. The DMPA-based approach precludes the requirement for aligning two independent periodic substrates, improves moiré contrast, and enables efficient moiré pattern generation. Furthermore, the double-sided structure readily allows for the integration of moiré-based cell force mapping into microfabricated cell culture environments or lab-on-a-chip devices.
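The Talbot-effect calculation mentioned above rests on the standard self-imaging distance z_T = 2p^2/lambda for a periodic pattern of pitch p illuminated at wavelength lambda; a minimal sketch, with pitch and wavelength values that are purely hypothetical rather than taken from the Letter:

```python
def talbot_length(period, wavelength):
    """Talbot self-imaging distance z_T = 2 p^2 / lambda.

    `period` and `wavelength` must be in the same length units; the
    result is returned in those units. Placing the reference pattern at
    (fractions of) this distance maximizes moire fringe contrast.
    """
    return 2.0 * period ** 2 / wavelength

# Hypothetical 10 um pillar pitch under 550 nm (0.55 um) illumination
print(round(talbot_length(10.0, 0.55), 1))  # 363.6 (um)
```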
User's manual for EZPLOT version 5.5: A FORTRAN program for 2-dimensional graphic display of data
NASA Technical Reports Server (NTRS)
Garbinski, Charles; Redin, Paul C.; Budd, Gerald D.
1988-01-01
EZPLOT is a computer applications program that converts data resident on a file into a plot displayed on the screen of a graphics terminal. This program generates either time history or x-y plots in response to commands entered interactively from a terminal keyboard. Plot parameters consist of a single independent parameter and from one to eight dependent parameters. Various line patterns, symbol shapes, axis scales, text labels, and data modification techniques are available. This user's manual describes EZPLOT as it is implemented on the Ames Research Center, Dryden Research Facility ELXSI computer using DI-3000 graphics software tools.
Global Ionospheric Perturbations Monitored by the Worldwide GPS Network
NASA Technical Reports Server (NTRS)
Ho, C. M.; Mannucci, A. T.; Lindqwister, U. J.; Pi, X. Q.
1996-01-01
Based on the delays of these Global Positioning System (GPS) signals, we have generated high-resolution global ionospheric TEC (Total Electron Content) maps at 15-minute intervals. Using a differential method comparing storm-time maps with quiet-time maps, we find that the ionospheric TEC during this storm increased significantly (the percentage change relative to quiet times is greater than 150 percent). These preliminary results, together with others in the paper, indicate that the differential mapping method, which is based on GPS network measurements, appears to be a useful tool for studying the global pattern and evolution process of the entire ionospheric perturbation.
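The differential quantity described, the percentage change of storm-time TEC relative to the quiet-time map, reduces to a one-line grid operation; the TEC values below are hypothetical, chosen only to illustrate an enhancement above the 150% level noted in the abstract:

```python
import numpy as np

def differential_tec(storm_map, quiet_map):
    """Percent change of storm-time TEC relative to a quiet-time map.

    Both maps are 2-D grids (lat x lon) of TEC in TEC units (TECU).
    Values above 150 correspond to the strong storm-time enhancements
    reported in the study.
    """
    return 100.0 * (storm_map - quiet_map) / quiet_map

quiet = np.full((5, 5), 20.0)  # hypothetical quiet-time TEC grid (TECU)
storm = np.full((5, 5), 52.0)  # hypothetical storm-time grid (TECU)
pct = differential_tec(storm, quiet)
print(float(pct[0, 0]))  # 160.0 -> exceeds the 150% threshold
```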
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul, Sanjoy; Ellman, Brett, E-mail: bellman@kent.edu; Singh, Gautam
We describe a tool for studying the two-dimensional spatial variation in electronic properties of organic semiconductors: the scanning time-of-flight microscope (STOFm). The STOFm simultaneously measures the transmittance of polarized light and time-of-flight current transients with a pixel size <30 μm, making it especially valuable for studies of the correlations of structure with charge generation and transport in liquid crystalline organic semiconductors (LC OSCs). Adapting a previously developed photopolymerization technique, we characterize the instrument using patterned samples of a LC OSC bounded by a non-semiconducting polymer matrix.
Technology acceptance perception for promotion of sustainable consumption.
Biswas, Aindrila; Roy, Mousumi
2018-03-01
Economic growth in the past decades has resulted in changes in consumption patterns and the emergence of a tech-savvy generation with an unprecedented increase in the usage of social network technology. In this paper, the technology acceptance value gap, adapted from the technology acceptance model, has been applied as a tool supporting social network technology usage and the subsequent promotion of sustainable consumption. The data generated through the use of structured questionnaires have been analyzed using structural equation modeling. The validity of the model and path estimates signifies the robustness of the technology acceptance value gap in adjudicating the efficiency of social network technology usage in the augmentation of sustainable consumption and awareness. The results indicate that the subjective norm gap, ease-of-operation gap, and quality-of-green-information gap have the most adverse impact on social network technology usage. Ultimately, social networking technology usage was identified as a significant antecedent of sustainable consumption.
Free-surface tracking of submerged features to infer hydrodynamic flow characteristics
NASA Astrophysics Data System (ADS)
Mandel, Tracy; Rosenzweig, Itay; Koseff, Jeffrey
2016-11-01
As sea level rise and stronger storm events threaten our coastlines, increased attention has been focused on coastal vegetation as a potentially resilient, financially viable tool to mitigate flooding and erosion. However, the actual effect of this "green infrastructure" on near-shore wave fields and flow patterns is not fully understood. For example, how do wave setup, wave nonlinearity, and canopy-generated instabilities change due to complex bottom roughness? Answering this question requires detailed knowledge of the free surface. We develop easy-to-use laboratory techniques to remotely measure physical processes by imaging the apparent distortion of the fixed features of a submerged cylinder array. Measurements of surface turbulence from a canopy-generated Kelvin-Helmholtz instability are possible with a single camera. A stereoscopic approach similar to Morris (2004) and Gomit et al. (2013) allows for measurement of waveform evolution and the effect of vegetation on wave steepness and nonlinearity.
NASA Astrophysics Data System (ADS)
Kubis, Michael; Wise, Rich; Reijnen, Liesbeth; Viatkina, Katja; Jaenen, Patrick; Luca, Melisa; Mernier, Guillaume; Chahine, Charlotte; Hellin, David; Kam, Benjamin; Sobieski, Daniel; Vertommen, Johan; Mulkens, Jan; Dusa, Mircea; Dixit, Girish; Shamma, Nader; Leray, Philippe
2016-03-01
With shrinking design rules, the overall patterning requirements are getting aggressively tighter. For the 7-nm node and below, allowable CD uniformity variations are entering the Angstrom region (ref [1]). Optimizing inter- and intra-field CD uniformity of the final pattern requires holistic tuning of all process steps. In previous work, CD control with either litho cluster or etch tool corrections has been discussed. Today, we present a holistic CD control approach, combining the correction capability of the etch tool with the correction capability of the exposure tool. The study is done on 10-nm logic node wafers, processed with a test vehicle stack patterning sequence. We include wafer-to-wafer and lot-to-lot variation and apply optical scatterometry to characterize the fingerprints. Making use of all available correction capabilities (lithography and etch), we investigated the single application of exposure tool corrections and of etch tool corrections, as well as combinations of both, to reach the lowest CD uniformity. Results of the final pattern uniformity based on single and combined corrections are shown. We conclude with the application of this holistic lithography and etch optimization to 7-nm high-volume manufacturing, paving the way to ultimate within-wafer CD uniformity control.
Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N
2017-01-01
Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
IndeCut evaluates performance of network motif discovery algorithms.
Ansariola, Mitra; Megraw, Molly; Koslicki, David
2018-05-01
Genomic networks represent a complex map of molecular interactions which are descriptive of the biological processes occurring in living cells. Identifying the small over-represented circuitry patterns in these networks helps generate hypotheses about the functional basis of such complex processes. Network motif discovery is a systematic way of achieving this goal. However, a reliable network motif discovery outcome requires generating random background networks which are the result of a uniform and independent graph sampling method. To date, there has been no method to numerically evaluate whether any network motif discovery algorithm performs as intended on realistically sized datasets; thus it was not possible to assess the validity of resulting network motifs. In this work, we present IndeCut, the first method to date that characterizes network motif finding algorithm performance in terms of uniform sampling on realistically sized networks. We demonstrate that it is critical to use IndeCut prior to running any network motif finder for two reasons. First, IndeCut indicates the number of samples needed for a tool to produce an outcome that is both reproducible and accurate. Second, IndeCut allows users to choose the tool that generates samples in the most independent fashion for their network of interest among many available options. The open source software package is available at https://github.com/megrawlab/IndeCut. megrawm@science.oregonstate.edu or david.koslicki@math.oregonstate.edu. Supplementary data are available at Bioinformatics online.
Comparative analytics of infusion pump data across multiple hospital systems.
Catlin, Ann Christine; Malloy, William X; Arthur, Karen J; Gaston, Cindy; Young, James; Fernando, Sudheera; Fernando, Ruchith
2015-02-15
A Web-based analytics system for conducting inhouse evaluations and cross-facility comparisons of alert data generated by smart infusion pumps is described. The Infusion Pump Informatics (IPI) project, a collaborative effort led by research scientists at Purdue University, was launched in 2009 to provide advanced analytics and tools for workflow analyses to assist hospitals in determining the significance of smart-pump alerts and reducing nuisance alerts. The IPI system allows facility-specific analyses of alert patterns and trends, as well as cross-facility comparisons of alert data uploaded by more than 55 participating institutions using different types of smart pumps. Tools accessible through the IPI portal include (1) charts displaying aggregated or breakout data on the top drugs associated with alerts, numbers of alerts per device or care area, and override-to-alert ratios, (2) investigative reports that can be used to characterize and analyze pump-programming errors in a variety of ways (e.g., by drug, by infusion type, by time of day), and (3) "drill-down" workflow analytics enabling users to evaluate alert patterns—both internally and in relation to patterns at other hospitals—in a quick and efficient stepwise fashion. The formation of the IPI analytics system to support a community of hospitals has been successful in providing sophisticated tools for member facilities to review, investigate, and efficiently analyze smart-pump alert data, not only within a member facility but also across other member facilities, to further enhance smart pump drug library design. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
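An override-to-alert ratio of the kind charted by the IPI portal reduces to simple aggregation over alert records. The sketch below assumes a minimal (drug, overridden) record format, which is our invention for illustration rather than the IPI schema:

```python
from collections import defaultdict

def override_to_alert_ratios(events):
    """Aggregate smart-pump alert events into per-drug ratios.

    `events` is an iterable of (drug, was_overridden) pairs, a
    simplified stand-in for uploaded alert records. Returns a dict
    mapping drug -> (alert_count, override_ratio).
    """
    counts = defaultdict(lambda: [0, 0])  # drug -> [alerts, overrides]
    for drug, overridden in events:
        counts[drug][0] += 1
        if overridden:
            counts[drug][1] += 1
    return {d: (alerts, overrides / alerts)
            for d, (alerts, overrides) in counts.items()}

events = [("heparin", True), ("heparin", True), ("heparin", False),
          ("insulin", False)]
ratios = override_to_alert_ratios(events)
print(ratios["heparin"])  # 3 alerts, override ratio of 2/3
```

A high override ratio for a drug is one signal that its alerts may be nuisance alerts, which is the kind of pattern the cross-facility comparisons are meant to surface.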
Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.
Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E
2015-01-01
Infants recovering from anesthesia are at risk of life-threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorers' ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness).
Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for longitudinal and multicenter studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vakili, Hajar; Rahvar, Sohrab; Kroupa, Pavel, E-mail: vakili@physics.sharif.edu
Shell galaxies are understood to form through the collision of a dwarf galaxy with an elliptical galaxy. Shell structures and kinematics have been noted to be independent tools to measure the gravitational potential of shell galaxies. In this work we theoretically compare the formation of shells in Type I shell galaxies in different gravity theories, a comparison so far missing in the literature. We include Newtonian gravity plus a dark halo, and two non-Newtonian gravity models, MOG and MOND, in identical initial systems. We investigate the effect of dynamical friction, which by slowing down the dwarf galaxy in the dark halo models limits the range of shell radii to low values. Under the same initial conditions, shells appear on a shorter timescale and over a smaller range of distances in the presence of dark matter than in the corresponding non-Newtonian gravity models. If galaxies are embedded in a dark matter halo, then the merging time may be too rapid to allow multi-generation shell formation as required by observed systems, because of the large dynamical friction effect. Starting from the same initial state, the observation of small bright shells in the dark halo model should be accompanied by large faint ones, while for the case of MOG, the next shell generation patterns iterate with a specific time delay. The first shell generation pattern shows a degeneracy with the age of the shells in different theories, but the relative distance of the shells and the shell expansion velocity can break this degeneracy.
Using the Colored Eco-Genetic Relationship Map with children.
Driessnack, Martha
2009-01-01
The Colored Eco-Genetic Relationship Map (CEGRM) is a hybridized assessment tool that combines the ecomap, the family genogram, and the genetic pedigree to produce a unique, participant-generated picture of an individual's social networks, information exchange patterns, and sources of support. To date, the CEGRM has been used successfully with adults, providing insights into their social networks and the communication patterns they use in the update and exchange of health-related information. To explore the feasibility and the utility of adapting elements of the CEGRM for use with children. Twenty children, 7 to 10 years of age, distributed by gender, socioeconomic status, and geographic heritage, participated in one-on-one sessions in which they created modified CEGRMs using adapted art directives. A qualitative descriptive design and approach to analysis were used. Children were able to create a modified CEGRM, and resultant discussions provided considerable insights. A focused analysis revealed a kaleidoscope of social networks being accessed by today's children as well as surprising information exchange sources and patterns. Although all the children included one parent, family composition varied. Extended family, other adults, peers, and media sources were not only prevalent but also often preferred over the nuclear family as sources of health information. Of particular interest, mothers were rarely identified as children's primary source of health-related information. Elements of the CEGRM are adapted easily for use with children using children's drawings and may prove to be an effective, adjunctive assessment and interventional tool for parents, researchers, educators, and providers working with young children.
Rapid SAW Sensor Development Tools
NASA Technical Reports Server (NTRS)
Wilson, William C.; Atkinson, Gary M.
2007-01-01
The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools developed have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling, and Finite Element Analysis (FEA). This paper presents the SAW design, modeling, analysis, and automated layout generation tools.
NASA Astrophysics Data System (ADS)
Copur, Hanifi; Bilgin, Nuh; Balci, Cemal; Tumac, Deniz; Avunduk, Emre
2017-06-01
This study aims at determining the effects of single-, double-, and triple-spiral cutting patterns; the effects of tool cutting speeds on the experimental scale; and the effects of the method of yield estimation on cutting performance by performing a set of full-scale linear cutting tests with a conical cutting tool. The average and maximum normal, cutting and side forces; specific energy; yield; and coarseness index are measured and compared in each cutting pattern at a 25-mm line spacing, at varying depths of cut per revolution, and using two cutting speeds on five different rock samples. The results indicate that the optimum specific energy decreases by approximately 25% with an increasing number of spirals from the single- to the double-spiral cutting pattern for the hard rocks, whereas generally little effect was observed for the soft- and medium-strength rocks. The double-spiral cutting pattern appeared to be more effective than the single- or triple-spiral cutting pattern and had an advantage of lower side forces. The tool cutting speed had no apparent effect on the cutting performance. The estimation of the specific energy by the yield based on the theoretical swept area was not significantly different from that estimated by the yield based on the muck weighing, especially for the double- and triple-spiral cutting patterns and with the optimum ratio of line spacing to depth of cut per revolution. This study also demonstrated that the cutterhead and mechanical miner designs, semi-theoretical deterministic computer simulations and empirical performance predictions and optimization models should be based on realistic experimental simulations. Studies should be continued to obtain more reliable results by creating a larger database of laboratory tests and field performance records for mechanical miners using drag tools.
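The two yield-estimation routes compared above translate into a short calculation: specific energy is the cutting work (mean cutting force times cut length) divided by the volume of rock removed, with that volume taken either from the theoretical swept area or from weighing the collected muck. The numbers below are illustrative, not results from the tests:

```python
def specific_energy(mean_cutting_force_kN, cut_length_m, volume_m3):
    """Specific energy of a linear cutting test, in MJ/m^3.

    Work = mean cutting force x cut length (kN.m = kJ); dividing by the
    removed volume and by 1000 converts kJ/m^3 to MJ/m^3.
    """
    return mean_cutting_force_kN * cut_length_m / (volume_m3 * 1000.0)

def volume_from_swept_area(spacing_m, depth_of_cut_m, cut_length_m):
    """Theoretical yield: line spacing x depth of cut x cut length."""
    return spacing_m * depth_of_cut_m * cut_length_m

def volume_from_muck(muck_mass_kg, rock_density_kg_m3):
    """Yield estimated by weighing the muck collected after the cut."""
    return muck_mass_kg / rock_density_kg_m3

# Hypothetical cut: 25 mm line spacing, 9 mm depth of cut, 1 m length,
# 5 kN mean cutting force
v = volume_from_swept_area(0.025, 0.009, 1.0)
print(round(specific_energy(5.0, 1.0, v), 1))  # 22.2 MJ/m^3
```

The study's observation that the two yield estimates give similar specific energies (at the optimum spacing-to-depth ratio) amounts to `volume_from_swept_area` and `volume_from_muck` returning nearly the same volume for the same cut.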
Aspects of ultra-high-precision diamond machining of RSA 443 optical aluminium
NASA Astrophysics Data System (ADS)
Mkoko, Z.; Abou-El-Hossein, K.
2015-08-01
Optical aluminium alloys such as 6061-T6 are traditionally used in ultra-high precision manufacturing for making optical mirrors for aerospace and other applications. However, the optics industry has recently witnessed the development of more advanced optical aluminium grades that are capable of addressing some of the issues encountered when turning with single-point natural monocrystalline diamond cutters. The advent of rapidly solidified aluminium (RSA) grades has generally opened up new possibilities for the ultra-high precision manufacturing of optical components. In this study, experiments were conducted with single-point diamond cutters on rapidly solidified aluminium RSA 443 material. The objective of this study is to observe the effects of depth of cut and feed rate at a fixed rotational speed on the tool wear rate and the resulting surface roughness of diamond-turned specimens. This is done to gain further understanding of the rate of wear on the diamond cutters versus the surface texture generated on the RSA 443 material. The diamond machining experiments yielded machined surfaces which are less reflective but have consistent surface roughness values. The cutting tools were examined for wear using scanning microscopy; a relatively low wear pattern was evident on the diamond tool edge. The highest tool wear was obtained at the higher depth of cut and increased feed rate.
Bacteria-powered battery on paper.
Fraiwan, Arwa; Choi, Seokheun
2014-12-21
Paper-based devices have recently emerged as simple and low-cost paradigms for fluid manipulation and analytical/clinical testing. However, there are significant challenges in developing paper-based devices at the system level, which contain integrated paper-based power sources. Here, we report a microfabricated paper-based bacteria-powered battery that is capable of generating power from microbial metabolism. The battery on paper showed a very short start-up time relative to conventional microbial fuel cells (MFCs); the paper substrates eliminated the time traditional MFCs require to accumulate and acclimate bacteria on the anode. Only four batteries connected in series provided the desired current and potential to power an LED for more than 30 minutes. The battery featured (i) a low-cost paper-based proton exchange membrane directly patterned on commercially available parchment paper and (ii) paper reservoirs for holding the anolyte and the catholyte for an extended period of time. Based on this concept, we also demonstrate the use of paper-based test platforms for the rapid characterization of electricity-generating bacteria. This paper-based microbial screening tool does not require external pumps or tubing and represents the most rapid test platform (<50 min) compared with traditional screening tools (up to 103 days) and even recently proposed MEMS arrays (<2 days).
Malyshev, A Y; Roshchin, M V; Smirnova, G R; Dolgikh, D A; Balaban, P M; Ostrovsky, M A
2017-02-15
Optogenetics is a powerful technique in neuroscience that has enabled great progress in studying brain functions over the last decade. Further progress in optogenetics crucially depends on the development of new molecular tools. The light-activated cation-conducting channelrhodopsin-2 has been widely used for the excitation of cells since the emergence of optogenetics. In 2015, a family of natural light-activated chloride channels, GtACRs, was identified, which appeared to be a very promising cell silencer for optogenetics experiments. Here we examined the properties of the GtACR2 channel expressed in rat layer 2/3 pyramidal neurons by means of in utero electroporation. We found that, despite strong inhibition, light stimulation of GtACR2-positive neurons can surprisingly lead to the generation of action potentials, presumably initiated in the axonal terminals. Thus, when using GtACR2 in optogenetics experiments, its ability to induce action potentials should be taken into account. Our results also open an interesting possibility of using GtACR2 as both a cell silencer and a cell activator in the same experiment by varying the pattern of light stimulation. Copyright © 2017 Elsevier B.V. All rights reserved.
Sharma, Neeraj; Sosnay, Patrick R.; Ramalho, Anabela S.; Douville, Christopher; Franca, Arianna; Gottschalk, Laura B.; Park, Jeenah; Lee, Melissa; Vecchio-Pagan, Briana; Raraigh, Karen S.; Amaral, Margarida D.; Karchin, Rachel; Cutting, Garry R.
2015-01-01
Assessment of the functional consequences of variants near splice sites is a major challenge in the diagnostic laboratory. To address this issue, we created expression minigenes (EMGs) to determine the RNA and protein products generated by splice site variants (n = 10) implicated in cystic fibrosis (CF). Experimental results were compared with the splicing predictions of eight in silico tools. EMGs containing the full-length Cystic Fibrosis Transmembrane Conductance Regulator (CFTR) coding sequence and flanking intron sequences generated wild-type transcript and fully processed protein in Human Embryonic Kidney (HEK293) and CF bronchial epithelial (CFBE41o-) cells. Quantification of variant induced aberrant mRNA isoforms was concordant using fragment analysis and pyrosequencing. The splicing patterns of c.1585−1G>A and c.2657+5G>A were comparable to those reported in primary cells from individuals bearing these variants. Bioinformatics predictions were consistent with experimental results for 9/10 variants (MES), 8/10 variants (NNSplice), and 7/10 variants (SSAT and Sroogle). Programs that estimate the consequences of mis-splicing predicted 11/16 (HSF and ASSEDA) and 10/16 (Fsplice and SplicePort) experimentally observed mRNA isoforms. EMGs provide a robust experimental approach for clinical interpretation of splice site variants and refinement of in silico tools. PMID:25066652
A strip chart recorder pattern recognition tool kit for Shuttle operations
NASA Technical Reports Server (NTRS)
Hammen, David G.; Moebes, Travis A.; Shelton, Robert O.; Savely, Robert T.
1993-01-01
During Space Shuttle operations, Mission Control personnel monitor numerous mission-critical systems such as electrical power; guidance, navigation, and control; and propulsion by means of paper strip chart recorders. For example, electrical power controllers monitor strip chart recorder pen traces to identify onboard electrical equipment activations and deactivations. Recent developments in pattern recognition technologies coupled with new capabilities that distribute real-time Shuttle telemetry data to engineering workstations make it possible to develop computer applications that perform some of the low-level monitoring now performed by controllers. The number of opportunities for such applications suggests a need to build a pattern recognition tool kit to reduce software development effort through software reuse. We are building pattern recognition applications while keeping such a tool kit in mind. We demonstrated the initial prototype application, which identifies electrical equipment activations, during three recent Shuttle flights. This prototype was developed to test the viability of the basic system architecture, to evaluate the performance of several pattern recognition techniques including those based on cross-correlation, neural networks, and statistical methods, to understand the interplay between an advanced automation application and human controllers to enhance utility, and to identify capabilities needed in a more general-purpose tool kit.
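As a minimal illustration of the cross-correlation technique evaluated in the prototype (an independent sketch, not the NASA tool kit itself; the step template and current trace below are synthetic), a normalized cross-correlation can locate an equipment-activation step in a noisy pen-trace signal:

```python
import numpy as np

def normalized_xcorr(signal, template):
    """Slide `template` over `signal`, returning the Pearson correlation
    between the template and each window (range -1..1)."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    scores = np.empty(len(signal) - n + 1)
    for i in range(len(scores)):
        w = signal[i:i + n]
        sd = w.std()
        scores[i] = 0.0 if sd == 0 else np.sum(t * (w - w.mean()) / sd)
    return scores

# Synthetic current trace: an equipment activation appears as a step up.
rng = np.random.default_rng(0)
trace = np.concatenate([np.full(50, 1.0), np.full(50, 3.0)])
trace += rng.normal(0, 0.05, trace.size)

step = np.concatenate([np.zeros(10), np.ones(10)])  # step-shaped template
scores = normalized_xcorr(trace, step)
onset = int(np.argmax(scores))
print(onset)  # best match puts the template's edge on the step (offset 40)
```

Because the score is normalized, a fixed threshold (e.g. 0.9) can flag activations regardless of the absolute current levels, which is one reason cross-correlation is attractive for telemetry monitoring.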
A Statistically Representative Atlas for Mapping Neuronal Circuits in the Drosophila Adult Brain
Arganda-Carreras, Ignacio; Manoliu, Tudor; Mazuras, Nicolas; Schulze, Florian; Iglesias, Juan E.; Bühler, Katja; Jenett, Arnim; Rouyer, François; Andrey, Philippe
2018-01-01
Imaging the expression patterns of reporter constructs is a powerful tool to dissect the neuronal circuits of perception and behavior in the adult brain of Drosophila, one of the major models for studying brain functions. To date, several Drosophila brain templates and digital atlases have been built to automatically analyze and compare collections of expression pattern images. However, there has been no systematic comparison of performances between alternative atlasing strategies and registration algorithms. Here, we objectively evaluated the performance of different strategies for building adult Drosophila brain templates and atlases. In addition, we used state-of-the-art registration algorithms to generate a new group-wise inter-sex atlas. Our results highlight the benefit of statistical atlases over individual ones and show that the newly proposed inter-sex atlas outperformed existing solutions for automated registration and annotation of expression patterns. Over 3,000 images from the Janelia Farm FlyLight collection were registered using the proposed strategy. These registered expression patterns can be searched and compared with a new version of the BrainBaseWeb system and BrainGazer software. We illustrate the validity of our methodology and brain atlas with registration-based predictions of expression patterns in a subset of clock neurons. The described registration framework should benefit brain studies in Drosophila and other insect species. PMID:29628885
Automatic Generation of English-Japanese Translation Pattern Utilizing Genetic Programming Technique
NASA Astrophysics Data System (ADS)
Matsumura, Koki; Tamekuni, Yuji; Kimura, Shuhei
There are many structural differences between English and Japanese phrase templates, which often makes translation difficult. Moreover, the phrase templates and sentences to be referred to are numerous and varied, and it is not easy to prepare a corpus that covers them all. It is therefore very significant to generate translation patterns automatically, from the viewpoints of both the translation success rate and the capacity of the pattern dictionary. To realize the automatic generation of translation patterns, this paper proposes a new method based on the genetic programming (GP) technique. The technique tries to automatically generate translation patterns for sentences not registered in the phrase-template dictionary by applying genetic operations to the parse trees of basic patterns; the trees, consisting of English-Japanese sentence pairs, are generated as the initial population. Parse-tree databases of 50, 100, 150 and 200 pairs were prepared as initial populations, and the system was executed on an English input of 1,555 sentences. As a result, the number of parse trees increased from 200 to 517, and the accuracy rate of the translation patterns improved from 42.57% to 70.10%. Moreover, 86.71% of the generated translations were successful, with meanings that were acceptable and understandable. This technique thus appears to be a promising way to raise the translation success rate while limiting the growth of the parse-tree database.
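The core genetic operation on parse trees can be sketched compactly. The following is an illustrative sketch only, not the authors' system: parse trees are represented as nested tuples, the grammar labels and sentences are invented, and the crossover points are chosen deterministically here for clarity (a GP run would pick them at random).

```python
# Subtree crossover on parse trees represented as nested tuples
# ("label", child, child, ...); leaves are plain strings.

def get(tree, path):
    """Follow a path of child indices down to a subtree."""
    for i in path:
        tree = tree[i]
    return tree

def replace(tree, path, new):
    """Return a copy of `tree` with the subtree at `path` replaced by `new`."""
    if not path:
        return new
    i = path[0]
    return tree[:i] + (replace(tree[i], path[1:], new),) + tree[i + 1:]

def crossover(a, b, path_a, path_b):
    """Swap the subtree of `a` at path_a with the subtree of `b` at path_b,
    producing two offspring patterns."""
    sa, sb = get(a, path_a), get(b, path_b)
    return replace(a, path_a, sb), replace(b, path_b, sa)

p1 = ("S", ("NP", "I"), ("VP", ("V", "eat"), ("NP", "sushi")))
p2 = ("S", ("NP", "she"), ("VP", ("V", "reads"), ("NP", "books")))
c1, c2 = crossover(p1, p2, (2, 2), (2, 2))  # swap the object NPs
print(c1)  # ('S', ('NP', 'I'), ('VP', ('V', 'eat'), ('NP', 'books')))
```

In a full GP loop, offspring patterns would then be scored against a reference corpus and the fittest retained, which is how new translation patterns accumulate beyond the initial population.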
On the role of the reticular formation in vocal pattern generation.
Jürgens, Uwe; Hage, Steffen R
2007-09-04
This review is an attempt to localize the brain region responsible for pattern generation of species-specific vocalizations. A catalogue is set up, listing the criteria considered to be essential for a vocal pattern generator. According to this catalogue, a vocal pattern generator should show vocalization-correlated activity, starting before vocal onset and reflecting specific acoustic features of the vocalization. Artificial activation by electrical or glutamatergic stimulation should produce artificially sounding vocalization. Lesioning is expected to have an inhibitory or deteriorating effect on vocalization. Anatomically, a vocal pattern generator can be assumed to have direct or, at least, oligosynaptic connections with all the motoneuron pools involved in phonation. A survey of the literature reveals that the only area meeting all these criteria is a region, reaching from the parvocellular pontine reticular formation just above the superior olive through the lateral reticular formation around the facial nucleus and nucleus ambiguus down to the caudalmost medulla, including the dorsal and ventral reticular nuclei and nucleus retroambiguus. It is proposed that vocal pattern generation takes place within this whole region.
Big Data, Global Development, and Complex Social Systems
NASA Astrophysics Data System (ADS)
Eagle, Nathan
2010-03-01
Petabytes of data about human movements, transactions, and communication patterns are continuously being generated by everyday technologies such as mobile phones and credit cards. This unprecedented volume of information facilitates a novel set of research questions applicable to a wide range of development issues. In collaboration with the mobile phone, internet, and credit card industries, my colleagues and I are aggregating and analyzing behavioral data from over 250 million people from North and South America, Europe, Asia and Africa. I will discuss a selection of projects arising from these collaborations that involve inferring behavioral dynamics on a broad spectrum of scales, from risky behavior in a group of MIT freshmen to population-level behavioral signatures, including cholera outbreaks in Rwanda and wealth in the UK. Access to the movement patterns of the majority of mobile phones in East Africa also facilitates realistic models of disease transmission as well as slum formation. This vast volume of data requires new analytical tools; we are developing a range of large-scale network analysis and machine learning algorithms that we hope will provide deeper insight into human behavior. However, ultimately our goal is to determine how we can use these insights to actively improve the lives of the billions of people who generate this data and the societies in which they live.
Molchanova, Svetlana M; Huupponen, Johanna; Lauri, Sari E; Taira, Tomi
2016-08-01
Direct electrical coupling between neurons through gap junctions is prominent during development, when synaptic connectivity is scarce, providing additional intercellular connectivity. However, functional studies of gap junctions are hampered by the lack of specificity of the available pharmacological tools. Here we have investigated gap-junctional coupling between CA3 pyramidal cells in the neonatal hippocampus and its contribution to early network activity. Four different gap junction inhibitors, including the general blocker carbenoxolone, decreased the frequency of network activity bursts in the CA3 area of the hippocampus of P3-6 rats, suggesting the involvement of electrical connections in the generation of spontaneous network activity. In CA3 pyramidal cells, spikelets evoked by local stimulation of stratum oriens were inhibited by carbenoxolone, but not by inhibitors of glutamatergic and GABAergic synaptic transmission, signifying the presence of electrical connectivity through axo-axonic gap junctions. Carbenoxolone also decreased the success rate of antidromic action potential firing in response to stimulation, and changed the pattern of spontaneous action potential firing of CA3 pyramidal cells. Altogether, these data suggest that electrical coupling of CA3 pyramidal cells contributes to the generation of early network events in the neonatal hippocampus by modulating their firing pattern and synchronization. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gellis, B. S.; McElroy, B. J.
2016-12-01
PATTERNS across Wyoming is a science and art project that promotes new and innovative approaches to STEM education and outreach, helping to re-contextualize how educators think about creative knowledge and how to reach diverse audiences through informal education. The convergence of art, science and STEM outreach efforts is vital to increasing the presence of art in the geosciences, developing multidisciplinary student research opportunities, expanding creative STEM thinking, and generating creative approaches to visualizing scientific data. A major goal of this project is to train art students to think critically about the value of scientific and artistic inquiry. PATTERNS across Wyoming makes science tangible to Wyoming citizens through K-14 art classrooms, and promotes novel maker-based art explorations centered around Wyoming's geosciences. The first PATTERNS across Wyoming scientific learning module (SIM) is a fish-tank-sized flume that recreates natural patterns in sand as a result of fluid flow and sediment transport. It will help promote the understanding of river systems found across Wyoming (e.g. the Green, Yellowstone, and Snake). This SIM, and the student artwork inspired by it, will help to visualize environmental-water changes in the central Rocky Mountains and will provide the essential inspiration and tools for Wyoming art students to design biologically driven creative explorations. Each art class will receive different fluvial-system conditions, allowing for greater understanding of river system interactions. The artwork will return to the University of Wyoming for a STE{A}M Exhibition inspired by Wyoming's varying fluvial systems. It is our hope that new generations of science and art critical thinkers will not only explore questions of `why' and `how' scientific phenomena occur, but also `how' to better predict, conserve and study invaluable artifacts, and visualize conditions which allow for better control of scientific outcomes and public understanding.
NASA Astrophysics Data System (ADS)
Mihanovic, H.; Vilibic, I.
2014-12-01
Herein we present three recent oceanographic studies performed in the Adriatic Sea (the northernmost arm of the Mediterranean Sea), in which the Self-Organizing Map (SOM) method, an unsupervised neural network approach capable of recognizing patterns in various types of datasets, was applied to environmental data. The first study applied the SOM method to a long (50-year) series of thermohaline, dissolved oxygen and nutrient data measured over the deep (1200 m) Southern Adriatic Pit, in order to extract characteristic deep water mass patterns and their temporal variability. Low-dimensional SOM solutions revealed that the patterns were not sensitive to nutrients but were determined mostly by temperature, salinity and DO content; therefore, the water masses in the region can be traced without nutrient data. The second study encompassed the classification of surface current patterns measured by HF radars over the northernmost part of the Adriatic, by applying the SOM method to the HF radar data and to surface wind fields from an operational mesoscale meteorological model. The major output of this study was the high correlation found between characteristic ocean current distribution patterns with and without wind data introduced to the SOM, implying dominant wind-driven dynamics on the local scale. This nominates the SOM method as a basis for generating very fast real-time forecast models over limited domains, based on existing atmospheric forecasts and basin-oriented ocean experiments. The last study classified sea ambient noise distributions in a habitat area of the bottlenose dolphin, connecting them to man-made noise generated by different types of vessels. Altogether, the usefulness of the SOM method has been recognized in different aspects of basin-scale ocean environmental studies, and it may be a useful tool in future investigations aimed at understanding multi-disciplinary dynamics over a basin, including the creation of operational environmental forecasting systems.
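The SOM algorithm underlying all three studies can be sketched in a few lines. This is a generic minimal sketch, not the software used in the studies; the two "water mass" clusters in (temperature, salinity) space are synthetic, and the grid size and decay schedule are illustrative choices.

```python
import numpy as np

def train_som(data, rows, cols, iters=3000, lr0=0.5, seed=0):
    """Train a small self-organizing map: for each sample, find the
    best-matching unit (BMU) and pull it and its grid neighbours toward
    the sample, with learning rate and neighbourhood radius decaying."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    # initialise unit weights from randomly chosen data samples
    weights = data[rng.choice(len(data), rows * cols)].reshape(rows, cols, dim).astype(float)
    grid = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"))
    sigma0 = max(rows, cols) / 2
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=2)),
                               (rows, cols))
        decay = 1 - t / iters
        lr, sigma = lr0 * decay, sigma0 * decay + 1e-3
        gdist = np.linalg.norm(grid - np.array(bmu), axis=2)
        h = np.exp(-gdist**2 / (2 * sigma**2))      # neighbourhood kernel
        weights += lr * h[:, :, None] * (x - weights)
    return weights

# Two synthetic "water mass" clusters in (temperature, salinity) space.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal([13.0, 38.5], 0.1, (100, 2)),
                  rng.normal([14.5, 38.9], 0.1, (100, 2))])
units = train_som(data, 2, 2)
# each cluster should attract at least one unit's weight vector
```

After training, each unit's weight vector is a characteristic pattern, and assigning each observation to its BMU gives the low-dimensional classification described in the abstracts.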
Sociality influences cultural complexity.
Muthukrishna, Michael; Shulman, Ben W; Vasilescu, Vlad; Henrich, Joseph
2014-01-07
Archaeological and ethnohistorical evidence suggests a link between a population's size and structure, and the diversity or sophistication of its toolkits or technologies. Addressing these patterns, several evolutionary models predict that both the size and social interconnectedness of populations can contribute to the complexity of its cultural repertoire. Some models also predict that a sudden loss of sociality or of population will result in subsequent losses of useful skills/technologies. Here, we test these predictions with two experiments that permit learners to access either one or five models (teachers). Experiment 1 demonstrates that naive participants who could observe five models, integrate this information and generate increasingly effective skills (using an image editing tool) over 10 laboratory generations, whereas those with access to only one model show no improvement. Experiment 2, which began with a generation of trained experts, shows how learners with access to only one model lose skills (in knot-tying) more rapidly than those with access to five models. In the final generation of both experiments, all participants with access to five models demonstrate superior skills to those with access to only one model. These results support theoretical predictions linking sociality to cumulative cultural evolution.
Knowledge discovery from data as a framework to decision support in medical domains
Gibert, Karina
2009-01-01
Introduction: Knowledge discovery from data (KDD) is a multidisciplinary field which appeared in 1996 for the "non-trivial process of identifying valid, novel, potentially useful and ultimately understandable patterns in data". Pre-treatment of data and post-processing are as important as the data exploitation (data mining) itself. Different analysis techniques can be properly combined to produce explicit knowledge from data. Methods: Hybrid KDD methodologies combining artificial intelligence with statistics and visualization have been used to identify patterns in complex medical phenomena: experts provide prior knowledge (pK), which biases the search for distinguishable groups of homogeneous objects; support-interpretation tools (CPG) assisted experts in the conceptualization and labelling of discovered patterns, consistently with the pK. Results: Patterns of dependency in mental disabilities supported decision-making on the legislation of the Spanish Dependency Law in Catalonia. Relationships between the type of neurorehabilitation treatment and patterns of response for brain damage are assessed. Patterns of perceived QOL over time are used in spinal cord lesion to improve social inclusion. Conclusion: Reality is more and more complex, and classical data analyses are not powerful enough to model it. New methodologies are required that embrace multidisciplinarity and stress the production of understandable models. Interaction with experts is critical to generate meaningful results that can really support decision-making; it is particularly convenient to transfer the pK to the system, as well as to interpret results in close interaction with the experts. KDD is a valuable paradigm, particularly when facing very complex domains that are not yet well understood, like many medical phenomena.
Gomez-De-Leon, Patricia; Santos, Jose I.; Caballero, Javier; Gomez, Demostenes; Espinosa, Luz E.; Moreno, Isabel; Piñero, Daniel; Cravioto, Alejandro
2000-01-01
Genomic fingerprints of 92 capsulated and noncapsulated strains of Haemophilus influenzae from Mexican children with different diseases and from healthy carriers were generated by PCR using the enterobacterial repetitive intergenic consensus (ERIC) sequences. A cluster analysis by the unweighted pair-group method with arithmetic averages, based on the overall similarity estimated from the characteristics of the genomic fingerprints, was conducted to group the strains. A total of 69 fingerprint patterns were detected among the H. influenzae strains. Isolates from patients with different diseases were represented by a variety of patterns, which clustered into two major groups. Of the 37 strains isolated from cases of meningitis, 24 shared patterns and were clustered into five groups within a similarity level of 1.0. One fragment of 1.25 kb was common to all meningitis strains. H. influenzae strains from healthy carriers presented fingerprint patterns different from those found in strains from sick children. Isolates from healthy individuals were more variable and were distributed differently from those from patients. The results show that ERIC-PCR provides a powerful tool for determining the distinctive pathogenicity potentials of H. influenzae strains and encourage its use in molecular epidemiology investigations. PMID:10878033
A method to generate the surface cell layer of the 3D virtual shoot apex from apical initials.
Kucypera, Krzysztof; Lipowczan, Marcin; Piekarska-Stachowiak, Anna; Nakielski, Jerzy
2017-01-01
The development of the cell pattern in the surface layer of the shoot apex can be investigated in vivo using time-lapse confocal images showing the naked meristem in 3D at successive time points. However, how this layer originates from apical initials and develops through the growth and divisions of their descendants remains unknown, leaving an open area for computer modelling. A method to generate the surface cell layer is presented, using the example of a 3D paraboloidal shoot apical dome. In the model, the layer originates from three apical initials that meet at the dome summit and develops through growth and cell divisions under isotropic surface growth defined by the growth tensor. The cells, described as polyhedrons, divide anticlinally along the smallest division plane, which, depending on the mode used, passes through the cell centre or through a point found randomly near this centre. The formation of the surface cell pattern is described with attention to the activity of the apical initials and the fates of their descendants. The computer-generated surface layer, which included about 350 cells, required about 1200 divisions of the apical initials and their derivatives. The derivatives were arranged into three roughly equal clonal sectors composed of cellular clones of different ages. Each apical initial renewed itself 7-8 times to produce its sector, and the successive divisions of each initial were manifested in the shape and location of the cellular clones. Applying the random factor resulted in a more realistic cell pattern than the pure mode. The cell divisions were analyzed statistically in top view: when all division walls were considered, their angular distribution was uniform, whereas the distribution limited to the apical initials alone showed some preferences related to their arrangement at the dome summit. A realistic surface cell pattern was thus obtained.
The present method is a useful tool for generating the surface cell layer, studying the activity of initial cells and their derivatives, and examining how cell expansion and division are coordinated during growth. We expect its further application to clarify the questions of the number and permanence or impermanence of initial cells, and of a possible relationship between their shape and oriented divisions, both within the growth tensor approach.
NASA Astrophysics Data System (ADS)
Godfrey, B.; Majdalani, J.
2014-11-01
This study relies on computational fluid dynamics (CFD) tools to analyse a possible method for creating a stable quadrupole vortex within a simulated, circular-port, cylindrical rocket chamber. A model of the vortex generator is created in the SolidWorks CAD program and the grid is then generated using the Pointwise mesh generation software. The non-reactive flowfield is simulated using an open-source computational program, Stanford University Unstructured (SU2). Subsequent analysis and visualization are performed using ParaView. The vortex generation approach that we employ consists of four tangentially injected monopole vortex generators arranged symmetrically with respect to the center of the chamber in such a way as to produce a quadrupole vortex with a common downwash. The present investigation focuses on characterizing the flow dynamics so that future investigations can be undertaken with increasing levels of complexity. Our CFD simulations help to elucidate the onset of vortex filaments within the monopole tubes and the evolution of quadrupole vortices downstream of the injection faceplate. Our results indicate that the quadrupole vortices produced using the present injection pattern can quickly become unstable, to the extent of dissipating soon after being introduced into the simulated rocket chamber. We conclude that a change in the geometrical configuration will be necessary to produce more stable quadrupoles.
NASA Astrophysics Data System (ADS)
Siarto, J.
2014-12-01
As more Earth science software tools and services move to the web, the design and usability of those tools become ever more important. A good user interface is now expected, and users are increasingly intolerant of websites and web applications that work against them. The Earthdata UI Pattern Library attempts to give these scientists and developers the design tools they need to build usable, compelling user interfaces without the overhead of a full design team. The patterns are tested, functional user interface elements targeted specifically at the Earth science community, and include web layouts, buttons, tables, typography, iconography, mapping, and visualization/graphing widgets. These UI elements have emerged from extensive user testing, research, and software development within the NASA Earthdata team over the past year.
Generation of Customizable Micro-wavy Pattern through Grayscale Direct Image Lithography
He, Ran; Wang, Shunqiang; Andrews, Geoffrey; Shi, Wentao; Liu, Yaling
2016-01-01
With the increasing amount of research work in surface studies, a more effective method of producing patterned microstructures is highly desired due to the geometric limitations and complex fabricating process of current techniques. This paper presents an efficient and cost-effective method to generate customizable micro-wavy pattern using direct image lithography. This method utilizes a grayscale Gaussian distribution effect to model inaccuracies inherent in the polymerization process, which are normally regarded as trivial matters or errors. The measured surface profiles and the mathematical prediction show a good agreement, demonstrating the ability of this method to generate wavy patterns with precisely controlled features. An accurate pattern can be generated with customizable parameters (wavelength, amplitude, wave shape, pattern profile, and overall dimension). This mask-free photolithography approach provides a rapid fabrication method that is capable of generating complex and non-uniform 3D wavy patterns with the wavelength ranging from 12 μm to 2100 μm and an amplitude-to-wavelength ratio as large as 300%. Microfluidic devices with pure wavy and wavy-herringbone patterns suitable for capture of circulating tumor cells are made as a demonstrative application. A completely customized microfluidic device with wavy patterns can be created within a few hours without access to clean room or commercial photolithography equipment. PMID:26902520
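The central idea, a grayscale dose profile smoothed by a Gaussian kernel that models light scattering during polymerization, can be sketched in one dimension. This is a simplified illustrative model, not the authors' calibration; the field size, wavelength, amplitude and kernel width below are invented example parameters.

```python
import numpy as np

def wavy_profile(width_um, wavelength_um, amplitude_um, sigma_um, step_um=1.0):
    """1-D sketch: a sinusoidal grayscale mask sets the local UV dose;
    convolving it with a Gaussian kernel (modelling scattering during
    polymerization) gives the smoothed cure depth, i.e. the wavy surface."""
    x = np.arange(0, width_um, step_um)
    dose = 0.5 * (1 + np.sin(2 * np.pi * x / wavelength_um))  # grayscale, 0..1
    k = np.arange(-3 * sigma_um, 3 * sigma_um + step_um, step_um)
    kernel = np.exp(-k**2 / (2 * sigma_um**2))
    kernel /= kernel.sum()
    depth = amplitude_um * np.convolve(dose, kernel, mode="same")
    return x, depth

# e.g. a 100 um wavelength, 10 um amplitude pattern over a 500 um field
x, depth = wavy_profile(500, 100, 10, sigma_um=5)
# the Gaussian slightly attenuates the sinusoid but preserves its period
```

Because the Gaussian only attenuates each spatial frequency (by exp(-(2*pi*sigma/lambda)^2/2) for wavelength lambda), the designed wavelength survives while sharp pixel edges are smoothed away, which is why the blur can be exploited rather than treated as an error.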
Experiences on developing digital down conversion algorithms using Xilinx system generator
NASA Astrophysics Data System (ADS)
Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi
2013-07-01
The Digital Down Conversion (DDC) algorithm is a classical signal processing method widely used in radar and communication systems. In this paper, the DDC function is implemented on an FPGA using the Xilinx System Generator tool. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. that makes it very convenient for programmers to manipulate a design and debug its function, especially for complex algorithms. The development of the DDC function based on System Generator shows that System Generator is a very fast and efficient tool for FPGA design.
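A software model of the classical DDC chain (complex NCO mixing, low-pass FIR filtering, decimation) clarifies what the FPGA blocks compute. This Python sketch is illustrative only, not System Generator output; the sample rate, tone frequency and filter length are example assumptions.

```python
import numpy as np

def ddc(samples, fs, f_center, decim, num_taps=101):
    """Basic digital down-conversion: mix the input against a complex NCO
    at f_center, low-pass filter with a windowed-sinc FIR, then decimate."""
    n = np.arange(len(samples))
    nco = np.exp(-2j * np.pi * f_center * n / fs)  # numerically controlled oscillator
    baseband = samples * nco                       # shifts f_center down to 0 Hz
    cutoff = fs / (2 * decim)                      # anti-alias cutoff for decimation
    m = np.arange(num_taps) - (num_taps - 1) / 2
    taps = np.sinc(2 * cutoff / fs * m) * np.hamming(num_taps)
    taps /= taps.sum()
    return np.convolve(baseband, taps, mode="same")[::decim]

# A 201 kHz tone sampled at 1 MHz, mixed with a 200 kHz NCO and decimated
# by 10, should come out as a 1 kHz complex tone at the 100 kHz output rate.
fs, decim = 1_000_000, 10
t = np.arange(5000) / fs
x = np.cos(2 * np.pi * 201_000 * t)
y = ddc(x, fs, 200_000, decim)
spectrum = np.abs(np.fft.fft(y))
f_peak = abs(float(np.fft.fftfreq(len(y), decim / fs)[int(np.argmax(spectrum))]))
print(f_peak)  # expected near 1000.0 Hz
```

On the FPGA, the same three stages map onto a DDS/NCO block, a multiplier, and a decimating FIR (or CIC plus compensation filter); the floating-point model above is the reference against which the fixed-point hardware design would be checked.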
Engineering the Intracellular Micro- and Nano-environment via Magnetic Nanoparticles
NASA Astrophysics Data System (ADS)
Tseng, Peter
Single cells, despite being the base unit of living organisms, possess a high degree of hierarchical structure and functional compartmentalization. This complexity exists for good reason: cells must respond efficiently and effectively to its surrounding environment by differentiating, moving, interacting, and more in order to survive or inhabit its role in the larger biological system. At the core of these responses is cellular decision-making. Cells process cues internally and externally from the environment and effect intracellular asymmetry in biochemistry and structure in order to carry out the proper biological responses. Functionalized magnetic particles have shown to be a powerful tool in interacting with biological matter, through either cell or biomolecule sorting, and the activation of biological processes. This dissertation reports on techniques utilizing manipulated magnetic nanoparticles (internalized by cells) to spatially and temporally localize intracellular cues, and examines the resulting asymmetry in biological processes generated by our methods. We first examine patterned micromagnetic elements as a simple strategy of rapidly manipulating magnetic nanoparticles throughout the intracellular space. Silicon or silicon dioxide substrates form the base for electroplated NiFe rods, which are repeated at varying size and pitch. A planarizing resin, initially SU-8, is used as the substrate layer for cellular adhesion. We demonstrate that through the manipulations of a simple external magnet, these micro-fabricated substrates can mediate rapid (under 2 s) and precise (submicron), reversible translation of magnetic nanoparticles through cellular space. Seeding cells on substrates composed of these elements allows simultaneous control of ensembles of nanoparticles over thousands of cells at a time. We believe such substrates could form the basis of magnetically based tools for the activation of biological matter. 
We further utilize these strategies to generate user-controllable (time-varying and localizable), massively parallel forces on arrays of cells, mediated by coalesced ensembles of magnetic nanoparticles. The above process is simplified and adapted for single-cell analysis by precisely aligning fibronectin-patterned cells to a single flanking micromagnet. The cells are loaded with magnetic-fluorescent nanoparticles, which are then localized to uniform positions at the internal edge of the cell membrane over large arrays of cells using large external fields, allowing us to conduct systematic studies of cellular response to force. By applying forces approaching the yield tension (5 nN/μm) of single cells, we are able to generate highly coordinated responses in cellular behavior. We discover that increasing tension generates highly directed, PAK-dependent leading-edge-type filopodia that increase in intensity with rising tension. In addition, we find that our generated forces can simulate cues created during cellular mitosis, as we are consistently able to generate significant (45 to 90 degree) biasing of the metaphase plate during cell division. Large sample size and rapid sample generation also allow us to analyze cells at an unprecedented rate: a single sample can simultaneously stimulate thousands of cells for high statistical accuracy in measurements. We believe these approaches have potential not just as a tool to study single-cell response, but as a means of cell control, potentially through modifying cell movement, division, or differentiation. More generally, once approaches to release nanoparticles from endosomes are implemented, the technique provides a platform to dynamically apply a range of localized stimuli arbitrarily within cells. Through the bioconjugation of proteins, nucleic acids, small molecules, or whole organelles, a broad range of questions should be accessible concerning molecular localization and its importance in cell function.
NASA Astrophysics Data System (ADS)
Bowen, Brian Hugh
1998-12-01
Electricity utilities in the Southern African region are conscious that gains could be made from more economically efficient trading but have had no tools with which to analyze the effects of a change in policy. This research is the first to provide transparent quantitative techniques for assessing the impacts of new trading arrangements in this region. The study presents a model of the recently formed Southern African Power Pool, built with the collaboration of the region's national utilities to represent each country's demand and generation/transmission system. The multi-region model includes commitment and dispatch from diverse hydrothermal sources over a vast area. Economic gains are determined by comparing the total costs under free-trade conditions with those from the existing fixed-trade bilateral arrangements. The objective function minimizes the production costs needed to meet total demand, subject to each utility's constraints for thermal and hydro generation, transmission, load balance and losses. Linearized thermal cost functions are used along with linearized input-output hydropower plant curves and hydrothermal on/off status variables to formulate a mixed-integer programming problem. Results from the modeling show that moving to optimal trading patterns could save between 70 million and 130 million per year. With free-trade policies the quantity of power flow between utilities is doubled and maximum usage is made of the hydropower stations, thus reducing costs and fuel use. In electricity-exporting countries such as Zambia and Mozambique, gains from increased trade equal 16% and 18%, respectively, of the value of their total manufactured exports. A sensitivity analysis is conducted on the possible effects of derating generation, derating transmission and reducing water inflows, but gains remain large.
Maximum economic gains from optimal trading patterns can be achieved by each country allowing centralized control through the newly founded SAPP coordination center. Using standard mixed integer programming solvers makes the cost of such modeling activity easily affordable to each utility in the Southern African pool. This research provides the utilities with the modeling tools to quantify the gains from increased trade and thereby furthers a move towards greater efficiency, faster economic growth and reduced use of fossil fuels.
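The core intuition behind the free-trade result can be sketched with a toy merit-order dispatch. All numbers below are invented for illustration; the actual study solves a full mixed-integer program with transmission limits, hydro inflows and unit-commitment constraints:

```python
# Toy illustration (invented numbers) of why pooled, free-trade dispatch
# lowers total cost: cheap hydro is used to the hilt before costly thermal.
def dispatch_cost(units, demand):
    """Merit-order dispatch: fill demand from the cheapest units first.

    units: list of (capacity_MW, cost_per_MWh); assumes capacity suffices.
    """
    cost = 0.0
    for cap, price in sorted(units, key=lambda u: u[1]):
        take = min(cap, demand)
        cost += take * price
        demand -= take
        if demand <= 0:
            break
    return cost

# Two countries, each with (capacity MW, cost per MWh) units and own demand.
a_units, a_demand = [(500, 5.0), (400, 40.0)], 700   # hydro-rich exporter
b_units, b_demand = [(600, 60.0)], 300               # thermal-only importer

# Autarky: each utility serves only its own load from its own plants.
autarky = dispatch_cost(a_units, a_demand) + dispatch_cost(b_units, b_demand)
# Free trade: one pooled merit order serves the combined load.
pooled = dispatch_cost(a_units + b_units, a_demand + b_demand)
```

In this toy case the pooled dispatch displaces expensive thermal generation in the importing country with the exporter's spare low-cost capacity, which is the qualitative mechanism the full model quantifies.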
PDS4: Harnessing the Power of Generate and Apache Velocity
NASA Astrophysics Data System (ADS)
Padams, J.; Cayanan, M.; Hardman, S.
2018-04-01
The PDS4 Generate Tool is a Java-based command-line tool developed by the Cartography and Imaging Sciences Node (PDSIMG) for generating PDS4 XML labels from Apache Velocity templates and input metadata.
Mask replication using jet and flash imprint lithography
NASA Astrophysics Data System (ADS)
Selinidis, Kosta S.; Jones, Chris; Doyle, Gary F.; Brown, Laura; Imhof, Joseph; LaBrake, Dwayne L.; Resnick, Douglas J.; Sreenivasan, S. V.
2011-11-01
The Jet and Flash Imprint Lithography (J-FIL™) process uses drop dispensing of UV-curable resists to assist high-resolution patterning for subsequent dry-etch pattern transfer. The technology is actively being used to develop solutions for memory markets, including Flash memory and patterned media for hard disk drives. It is anticipated that the lifetime of a single template (for patterned media) or mask (for semiconductors) will be on the order of 10^4-10^5 imprints. This suggests that tens of thousands of templates/masks will be required to satisfy the needs of a manufacturing environment. Electron-beam patterning is too slow to feasibly deliver these volumes, but it can instead provide a high-quality "master" mask which can be replicated many times with an imprint lithography tool. This strategy has the capability to produce the required supply of "working" templates/masks. In this paper, we review the development of the mask form factor, imprint replication tools and the semiconductor mask replication process. A Perfecta™ MR5000 mask replication tool has been developed specifically to pattern replica masks from an e-beam written master. Performance results, including image placement, critical dimension uniformity, and pattern transfer, are covered in detail.
Adekanmbi, Oluwole; Olugbara, Oludayo; Adeyemo, Josiah
2014-01-01
This paper presents an annual multiobjective crop-mix planning as a problem of concurrent maximization of net profit and maximization of crop production to determine an optimal cropping pattern. The optimal crop production in a particular planting season is a crucial decision making task from the perspectives of economic management and sustainable agriculture. A multiobjective optimal crop-mix problem is formulated and solved using the generalized differential evolution 3 (GDE3) metaheuristic to generate a globally optimal solution. The performance of the GDE3 metaheuristic is investigated by comparing its results with the results obtained using epsilon constrained and nondominated sorting genetic algorithms—being two representatives of state-of-the-art in evolutionary optimization. The performance metrics of additive epsilon, generational distance, inverted generational distance, and spacing are considered to establish the comparability. In addition, a graphical comparison with respect to the true Pareto front for the multiobjective optimal crop-mix planning problem is presented. Empirical results generally show GDE3 to be a viable alternative tool for solving a multiobjective optimal crop-mix planning problem. PMID:24883369
Identification of nuclear weapons
Mihalczo, J.T.; King, W.T.
1987-04-10
A method and apparatus for non-invasively identifying different types of nuclear weapons is disclosed. A neutron generator is placed against the weapon to generate a stream of neutrons, causing fissioning within the weapon. A first particle detector detects the generation of the neutrons and produces a signal indicative thereof. A second particle detector located on the opposite side of the weapon detects the fission particles and produces signals indicative thereof. The signals are converted into a detected pattern, and a computer compares the detected pattern with known patterns of weapons and indicates which known weapon has a substantially similar pattern. Either a time-distribution pattern or a noise-analysis pattern, or both, is used. Gamma-neutron discrimination and a third particle detector for fission particles adjacent to the second particle detector are preferably used. The neutrons are generated by either a decay neutron source or a pulsed-neutron particle accelerator.
Generation of shape complexity through tissue conflict resolution
Rebocho, Alexandra B; Southam, Paul; Kennaway, J Richard; Coen, Enrico
2017-01-01
Out-of-plane tissue deformations are key morphogenetic events during plant and animal development that generate 3D shapes, such as flowers or limbs. However, the mechanisms by which spatiotemporal patterns of gene expression modify cellular behaviours to generate such deformations remain to be established. We use the Snapdragon flower as a model system to address this problem. Combining cellular analysis with tissue-level modelling, we show that an orthogonal pattern of growth orientations plays a key role in generating out-of-plane deformations. This growth pattern is most likely oriented by a polarity field, highlighted by PIN1 protein localisation, and is modulated by dorsoventral gene activity. The orthogonal growth pattern interacts with other patterns of differential growth to create tissue conflicts that shape the flower. Similar shape changes can be generated by contraction as well as growth, suggesting tissue conflict resolution provides a flexible morphogenetic mechanism for generating shape diversity in plants and animals. DOI: http://dx.doi.org/10.7554/eLife.20156.001 PMID:28166865
GEsture: an online hand-drawing tool for gene expression pattern search.
Wang, Chunyan; Xu, Yiqing; Wang, Xuelin; Zhang, Li; Wei, Suyun; Ye, Qiaolin; Zhu, Youxiang; Yin, Hengfu; Nainwal, Manoj; Tanon-Reyes, Luis; Cheng, Feng; Yin, Tongming; Ye, Ning
2018-01-01
Gene expression profiling data provide useful information for the investigation of biological function and process. However, identifying a specific expression pattern in extensive time-series gene expression data is not an easy task. Clustering, a popular method, is often used to group genes with similar expression; however, genes with a 'desirable' or 'user-defined' pattern cannot be efficiently detected by clustering methods. To address these limitations, we developed an online tool called GEsture. Users can draw or graph a curve using a mouse instead of inputting abstract parameters of clustering methods. Given a gene expression curve as input, GEsture explores genes showing similar, opposite and time-delayed expression patterns in time-series datasets. We present three examples that illustrate the capacity of GEsture for gene hunting while following users' requirements. GEsture also provides visualization tools (such as expression pattern figures, heat maps and correlation networks) to display the search results. The outputs may provide useful information for researchers to understand the targets, function and biological processes of the genes involved.
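The curve-matching idea can be sketched roughly as follows (a hypothetical simplification, not GEsture's actual scoring): correlate the drawn curve with each gene's profile, treating strong negative correlation as an "opposite" pattern and strong lagged correlation as a "time-delay" pattern:

```python
# Sketch of the curve-matching idea behind a GEsture-style search
# (illustrative only; function names and thresholds are assumptions).
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation; returns 0.0 for degenerate (flat) inputs."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def classify(query, profile, lag_range=2, threshold=0.9):
    """Label a gene profile as 'similar', 'opposite', 'time-delay' or None."""
    r = pearson(query, profile)
    if r >= threshold:
        return "similar"
    if r <= -threshold:
        return "opposite"
    # shift the profile to look for a delayed match
    for lag in range(1, lag_range + 1):
        if pearson(query[:-lag], profile[lag:]) >= threshold:
            return "time-delay"
    return None
```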
Optimization of RET flow using test layout
NASA Astrophysics Data System (ADS)
Zhang, Yunqiang; Sethi, Satyendra; Lucas, Kevin
2008-11-01
At advanced technology nodes with extremely low-k1 lithography, it is very hard to achieve image fidelity requirements and process window for some layout configurations. Quite often these layouts are within simple design rule constraints for a given technology node. It is important to have these layouts included during early RET flow development. Most RET development is based on layouts shrunk from the previous technology node, which is often not good enough. A better methodology for creating test layouts is required for optical proximity correction (OPC) recipe and assist feature development. In this paper we demonstrate the application of programmable test layouts in RET development. Layout pattern libraries are developed and embedded in a layout tool (ICWB). Assessment gauges are generated together with patterns for quick correction-accuracy assessment. Several groups of test pattern libraries have been developed based on learning from product patterns and a layout DOE approach. The interaction between layout patterns and OPC recipes has been studied. Correction of a contact layer is quite challenging because of poor convergence and low process window. We developed a test pattern library with many different contact configurations. Different OPC schemes are studied on these test layouts. The worst process-window patterns are pinpointed for a given illumination condition. Assist features (AF) are frequently placed according to pre-determined rules to improve the lithography process window. These rules are usually derived from lithographic models and experiments. Direct validation of AF rules is required in the development phase. We use the test layout approach to determine rules in order to eliminate AF printability problems.
Pattern and Process in the Comparative Study of Convergent Evolution.
Mahler, D Luke; Weber, Marjorie G; Wagner, Catherine E; Ingram, Travis
2017-08-01
Understanding processes that have shaped broad-scale biodiversity patterns is a fundamental goal in evolutionary biology. The development of phylogenetic comparative methods has yielded a tool kit for analyzing contemporary patterns by explicitly modeling processes of change in the past, providing neontologists tools for asking questions previously accessible only for select taxa via the fossil record or laboratory experimentation. The comparative approach, however, differs operationally from alternative approaches to studying convergence in that, for studies of only extant species, convergence must be inferred using evolutionary process models rather than being directly measured. As a result, investigation of evolutionary pattern and process cannot be decoupled in comparative studies of convergence, even though such a decoupling could in theory guard against adaptationist bias. Assumptions about evolutionary process underlying comparative tools can shape the inference of convergent pattern in sometimes profound ways and can color interpretation of such patterns. We discuss these issues and other limitations common to most phylogenetic comparative approaches and suggest ways that they can be avoided in practice. We conclude by promoting a multipronged approach to studying convergence that integrates comparative methods with complementary tests of evolutionary mechanisms and includes ecological and biogeographical perspectives. Carefully employed, the comparative method remains a powerful tool for enriching our understanding of convergence in macroevolution, especially for investigation of why convergence occurs in some settings but not others.
Couvin, David; Zozio, Thierry; Rastogi, Nalin
2017-07-01
Spoligotyping is one of the most commonly used polymerase chain reaction (PCR)-based methods for identification and study of genetic diversity of the Mycobacterium tuberculosis complex (MTBC). Despite its known limitations if used alone, the methodology is particularly useful when used in combination with other methods such as mycobacterial interspersed repetitive units - variable number of tandem DNA repeats (MIRU-VNTRs). At a worldwide scale, spoligotyping has allowed identification of information on 103,856 MTBC isolates (corresponding to 98,049 clustered strains plus 5,807 unique isolates from 169 countries of patient origin) contained within the SITVIT2 proprietary database of the Institut Pasteur de la Guadeloupe. The SpolSimilaritySearch web-tool described herein (available at: http://www.pasteur-guadeloupe.fr:8081/SpolSimilaritySearch) incorporates a similarity search algorithm allowing users to get a complete overview of similar spoligotype patterns (with information on the presence or absence of 43 spacers) in the aforementioned worldwide database. This tool allows one to analyze the spread and evolutionary patterns of MTBC by comparing similar spoligotype patterns, to distinguish between widespread, specific and/or confined patterns, and to pinpoint patterns with large deleted blocks, which play an intriguing role in the genetic epidemiology of M. tuberculosis. Finally, the SpolSimilaritySearch tool also provides the country distribution patterns for each queried spoligotype. Copyright © 2017 Elsevier Ltd. All rights reserved.
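The underlying comparison is easy to sketch: a spoligotype is a presence/absence pattern over 43 spacers, so similarity reduces to counting agreeing positions. The function names and the minimum-shared-spacers cutoff below are illustrative, not the tool's API:

```python
# Hypothetical sketch of spacer-pattern similarity search: a pattern is a
# 43-character string of '1' (spacer present) / '0' (spacer absent).
SPACERS = 43

def similarity(a: str, b: str) -> int:
    """Number of spacers (out of 43) on which two patterns agree."""
    assert len(a) == len(b) == SPACERS
    return sum(x == y for x, y in zip(a, b))

def similar_patterns(query, database, min_shared=41):
    """Return (label, similarity) pairs from the database, best first."""
    hits = [(sit, similarity(query, pat))
            for sit, pat in database.items()
            if similarity(query, pat) >= min_shared]
    return sorted(hits, key=lambda h: -h[1])
```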
Designed tools for analysis of lithography patterns and nanostructures
NASA Astrophysics Data System (ADS)
Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann
2017-03-01
We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time-consuming, requiring manual tuning, and lacking robustness and user-friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nanoprocesses at the research and development level by accelerating access to knowledge and hence speeding up implementation in product lines.
NASA Astrophysics Data System (ADS)
Radziszewski, Kacper
2017-10-01
The following paper presents the results of research in the field of machine learning, investigating the scope of application of artificial neural network algorithms as a tool in architectural design. The computational experiment used the backpropagation-of-errors method to train an artificial neural network on the geometry of the details of the Roman Corinthian order capital. During the experiment, a combination of five local geometry parameters gave the best results as the input training set: Theta, Phi and Rho in a spherical coordinate system centered on the capital volume centroid, followed by the Z value in the Cartesian coordinate system and the distance from vertical planes derived from the capital's symmetry. Additionally, the optimal count and structure of the network's hidden layers was found, giving errors below 0.2% for the aforementioned input parameters. Once successfully trained, the artificial network was able to mimic the detail composition on any other geometry type. Despite calculating the transformed geometry locally and separately for each of thousands of surface points, the system could create visually attractive, diverse and complex patterns. The designed tool, based on the supervised-learning branch of machine learning, offers the possibility of generating new architectural forms free of the bounds of the designer's imagination. Implementing the broad computational methods of machine learning, or artificial intelligence in general, could not only accelerate and simplify the design process but also offer an opportunity to explore unpredictable, never-before-seen forms as well as solutions for everyday architectural practice.
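The five per-point input features described above can be sketched as follows. This is a minimal illustration; the coordinate conventions and the single-symmetry-plane handling are assumptions, not the paper's exact formulation:

```python
# Hypothetical sketch of the per-point input features: spherical
# coordinates about the volume centroid, Cartesian z, and distance to a
# vertical symmetry plane (here assumed to be the plane x = const).
import math

def point_features(p, centroid, symmetry_plane_x=0.0):
    """Return (rho, theta, phi, z, distance-to-symmetry-plane) for point p."""
    dx, dy, dz = (p[i] - centroid[i] for i in range(3))
    rho = math.sqrt(dx * dx + dy * dy + dz * dz)   # radial distance
    theta = math.atan2(dy, dx)                     # azimuth angle
    phi = math.acos(dz / rho) if rho else 0.0      # polar angle
    dist_plane = abs(p[0] - symmetry_plane_x)      # symmetry-plane distance
    return (rho, theta, phi, p[2], dist_plane)
```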
Evaluation of the efficiency and reliability of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1994-01-01
There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.
Test-Case Generation using an Explicit State Model Checker Final Report
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Gao, Jimin
2003-01-01
In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tool infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test-case generation. We have completed a translator from our source language RSML-e to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.
An automated graphics tool for comparative genomics: the Coulson plot generator
2013-01-01
Background: Comparative analysis is an essential component of biology. When applied to genomics, for example, analysis may require comparisons between the predicted presence and absence of genes in a group of genomes under consideration. Frequently, genes can be grouped into small categories based on functional criteria, for example membership of a multimeric complex, participation in a metabolic or signaling pathway, or shared sequence features and/or paralogy. These patterns of retention and loss are highly informative for the prediction of function, and hence possible biological context, and can provide great insights into the evolutionary history of cellular functions. However, a standard spreadsheet is a poor visual means from which to extract such patterns within a dataset. Results: We devised the Coulson plot, a new graphical representation that exploits a matrix of pie charts to display comparative genomics data. Each pie describes a complex or process from a separate taxon and is divided into sectors corresponding to the number of proteins (subunits) in the complex/process. The predicted presence or absence of proteins in each complex is delineated by occupancy of a given sector; this format is visually highly accessible and makes pattern recognition rapid and reliable. A key to the identity of each subunit, plus hierarchical naming of taxa and coloring, is included. A Java-based application, the Coulson plot generator (CPG), automates graphic production, taking a tab- or comma-delimited text file as input and generating an editable PDF or SVG file. Conclusions: CPG software may be used to rapidly convert spreadsheet data to a graphical matrix pie chart format. The representation essentially retains all of the information from the spreadsheet but presents a graphically rich format, making comparisons and identification of patterns significantly clearer.
While comparative genomics was its original purpose, the Coulson plot format can be used to visualize any dataset where entity occupancy is compared between different classes. Availability: CPG software is available at sourceforge http://sourceforge.net/projects/coulson and http://dl.dropbox.com/u/6701906/Web/Sites/Labsite/CPG.html PMID:23621955
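The input format lends itself to a very small parser. The sketch below (hypothetical, not CPG's actual code) turns comma-delimited presence/absence data into the occupancy values a pie sector encodes:

```python
# Hypothetical sketch of the Coulson-plot data model: rows are taxa,
# columns are subunits of one complex, cells mark predicted presence.
import csv
import io

def read_occupancy(text):
    """Parse comma-delimited presence/absence data into {taxon: [bool, ...]}."""
    rows = list(csv.reader(io.StringIO(text)))
    header, body = rows[0][1:], rows[1:]
    return header, {r[0]: [c.strip() == "1" for c in r[1:]] for r in body}

def sector_fill(occupancy, taxon):
    """Fraction of the pie filled for one taxon (subunits present / total)."""
    present = occupancy[taxon]
    return sum(present) / len(present)
```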
ESAS Deliverable PS 1.1.2.3: Customer Survey on Code Generations in Safety-Critical Applications
NASA Technical Reports Server (NTRS)
Schumann, Johann; Denney, Ewen
2006-01-01
Automated code generators (ACG) are tools that convert a (higher-level) model of a software (sub-)system into executable code without the need for a developer to implement the code by hand. Although both commercially supported and in-house tools have been used in many industrial applications, little data exists on how these tools are used in safety-critical domains (e.g., spacecraft, aircraft, automotive, nuclear). The aims of the survey, therefore, were threefold: 1) to determine whether code generation is primarily used as a tool for prototyping, including design exploration and simulation, or for flight/production code; 2) to determine the verification issues with code generators relating, in particular, to qualification and certification in safety-critical domains; and 3) to determine perceived gaps in the functionality of existing tools.
Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology
ERIC Educational Resources Information Center
Basawapatna, Ashok
2016-01-01
Simulation and modeling activities, a key point of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high level pattern-based phenomenological approach to bringing rapid simulation…
Patterns of Propaganda and Persuasion.
ERIC Educational Resources Information Center
Rank, Hugh
Because children are exposed to highly professional sales pitches on television and because the old material produced by the Institute of Propaganda Analysis is outdated and in error, a new tool for the analysis of propaganda and persuasion is called for. Such a tool is the intensify/downplay pattern analysis chart, which includes the basic…
Semi-automated ontology generation within OBO-Edit.
Wächter, Thomas; Schroeder, Michael
2010-06-15
Ontologies and taxonomies have proven highly beneficial for biocuration. The Open Biomedical Ontology (OBO) Foundry alone lists over 90 ontologies mainly built with OBO-Edit. Creating and maintaining such ontologies is a labour-intensive, difficult, manual process. Automating parts of it is of great importance for the further development of ontologies and for biocuration. We have developed the Dresden Ontology Generator for Directed Acyclic Graphs (DOG4DAG), a system which supports the creation and extension of OBO ontologies by semi-automatically generating terms, definitions and parent-child relations from text in PubMed, the web and PDF repositories. DOG4DAG is seamlessly integrated into OBO-Edit. It generates terms by identifying statistically significant noun phrases in text. For definitions and parent-child relations it employs pattern-based web searches. We systematically evaluate each generation step using manually validated benchmarks. The term generation leads to high-quality terms also found in manually created ontologies. Up to 78% of definitions are valid and up to 54% of child-ancestor relations can be retrieved. There is no other validated system that achieves comparable results. By combining the prediction of high-quality terms, definitions and parent-child relations with the ontology editor OBO-Edit we contribute a thoroughly validated tool for all OBO ontology engineers. DOG4DAG is available within OBO-Edit 2.1 at http://www.oboedit.org. Supplementary data are available at Bioinformatics online.
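The statistical term-generation step can be caricatured as a frequency-ratio test against a background corpus. This is a deliberately simplified sketch; DOG4DAG's actual scoring of noun phrases is more elaborate, and all names and thresholds here are assumptions:

```python
# Sketch of frequency-based term generation: keep candidate terms whose
# relative frequency in a domain corpus well exceeds their (smoothed)
# background frequency. Illustrative only, not DOG4DAG's algorithm.
from collections import Counter

def candidate_terms(domain_tokens, background_tokens, ratio=5.0, min_count=2):
    d, b = Counter(domain_tokens), Counter(background_tokens)
    nd, nb = len(domain_tokens), len(background_tokens)
    terms = []
    for w, c in d.items():
        if c < min_count:
            continue
        bg = (b[w] + 1) / (nb + 1)          # add-one smoothed background freq
        if (c / nd) / bg >= ratio:          # domain-to-background ratio test
            terms.append(w)
    return sorted(terms)
```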
NASA Astrophysics Data System (ADS)
Hauschild, Dirk; Homburg, Oliver; Mitra, Thomas; Ivanenko, Mikhail; Jarczynski, Manfred; Meinschien, Jens; Bayer, Andreas; Lissotschenko, Vitalij
2009-02-01
High-power laser sources are used in various production tools for microelectronic products and solar cells, including annealing, lithography, edge isolation as well as dicing and patterning applications. Besides the right choice of laser source, suitable high-performance optics for generating the appropriate beam profile and intensity distribution are of great importance for the right processing speed, quality and yield. Equally important for industrial applications is an adequate understanding of the physics of the light-matter interaction behind the process. Simulations of tool performance in advance can minimize technical and financial risk as well as lead times for prototyping and introduction into series production. LIMO has developed its own software, founded on the Maxwell equations, taking into account all important physical aspects of the laser-based process: the light source, the beam-shaping optical system and the light-matter interaction. Based on this knowledge, together with a unique free-form micro-lens array production technology and patented micro-optics beam-shaping designs, a number of novel solar cell production tool sub-systems have been built. The basic functionalities, design principles and performance results are presented with a special emphasis on resilience, cost reduction and process reliability.
Gustaf: Detecting and correctly classifying SVs in the NGS twilight zone.
Trappe, Kathrin; Emde, Anne-Katrin; Ehrlich, Hans-Christian; Reinert, Knut
2014-12-15
The landscape of structural variation (SV) including complex duplication and translocation patterns is far from resolved. SV detection tools usually exhibit low agreement, are often geared toward certain types or size ranges of variation and struggle to correctly classify the type and exact size of SVs. We present Gustaf (Generic mUlti-SpliT Alignment Finder), a sound generic multi-split SV detection tool that detects and classifies deletions, inversions, dispersed duplications and translocations of ≥ 30 bp. Our approach is based on a generic multi-split alignment strategy that can identify SV breakpoints with base pair resolution. We show that Gustaf correctly identifies SVs, especially in the range from 30 to 100 bp, which we call the next-generation sequencing (NGS) twilight zone of SVs, as well as larger SVs >500 bp. Gustaf performs better than similar tools in our benchmark and is furthermore able to correctly identify size and location of dispersed duplications and translocations, which otherwise might be wrongly classified, for example, as large deletions. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
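A naive caricature of split-alignment SV typing illustrates how the geometry of two aligned segments of one read implies an SV class. This is not Gustaf's algorithm (which handles multi-split chains and base-pair-precise breakpoints); the representation and labels below are assumptions for illustration:

```python
# Toy two-segment SV classification: a split read aligns as two reference
# segments (ref_start, ref_end, strand). Gaps, order reversals and strand
# flips between the segments hint at the SV type.
def classify_sv(seg1, seg2):
    """Naively classify the SV implied by two ordered read segments."""
    s1, e1, o1 = seg1
    s2, e2, o2 = seg2
    if o1 != o2:
        return "inversion"                       # strand flip between parts
    if s2 > e1:
        return ("deletion", s2 - e1)             # reference gap skipped
    if s2 < s1:
        return ("duplication/translocation", s1 - s2)  # segment order reversed
    return "unknown"
```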
Non-competitive inhibition by active site binders.
Blat, Yuval
2010-06-01
Classical enzymology has been used for generations to understand the interactions of inhibitors with their enzyme targets. Enzymology tools have enabled prediction of the biological impact of inhibitors as well as the development of novel, more potent ones. Experiments designed to examine the competition between the tested inhibitor and the enzyme substrate(s) are the tool of choice to identify inhibitors that bind in the active site. Competition between an inhibitor and a substrate is considered strong evidence for binding of the inhibitor in the active site, while the lack of competition suggests binding to an alternative site. Nevertheless, exceptions to this notion do exist. Active-site-binding inhibitors can display non-competitive inhibition patterns. This unusual behavior has been observed with enzymes utilizing an exosite for substrate binding, isomechanism enzymes, enzymes with multiple substrates and/or products, and two-step binding inhibitors. In many of these cases, the mechanisms underlying the lack of competition between the substrate and the inhibitor are well understood. Tools like alternative substrates, testing the enzyme reaction in the reverse direction and monitoring the time dependence of inhibition can be applied to distinguish between 'badly behaving' active site binders and true exosite inhibitors.
Halim, Zahid; Abbas, Ghulam
2015-01-01
Sign language provides hearing- and speech-impaired individuals with an interface to communicate with other members of society. Unfortunately, sign language is not understood by most people. A gadget based on image processing and pattern recognition can therefore provide a vital aid for detecting and translating sign language into a vocal language. This work presents a system for detecting and understanding sign language gestures with a custom-built software tool and then translating the gestures into a vocal language. To recognize a particular gesture, the system employs a Dynamic Time Warping (DTW) algorithm, and an off-the-shelf software tool is employed for vocal language generation. Microsoft® Kinect is the primary tool used to capture the video stream of a user. The proposed method successfully detects gestures stored in the dictionary with an accuracy of 91%. The proposed system also has the ability to define and add custom-made gestures. Based on an experiment in which 10 individuals with impairments used the system to communicate with 5 people with no disability, 87% agreed that the system was useful.
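The DTW matching named in the abstract can be sketched as follows. This is a minimal one-dimensional version for illustration; the actual system compares multi-dimensional Kinect joint trajectories, and the dictionary below is hypothetical.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    # cost[i][j] = best cumulative cost aligning a[:i] with b[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(query, dictionary):
    """Label of the dictionary gesture with the smallest DTW distance."""
    return min(dictionary, key=lambda label: dtw_distance(query, dictionary[label]))
```

Because DTW warps the time axis, the same gesture performed faster or slower still aligns to its template at low cost, which is why it suits free-form gesture input.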
Molecular epidemiology of African sleeping sickness.
Hide, G; Tait, A
2009-10-01
Human sleeping sickness in Africa, caused by Trypanosoma brucei spp., raises a number of questions. Despite the widespread distribution of the tsetse vectors and animal trypanosomiasis, human disease is only found in discrete foci which periodically give rise to epidemics followed by periods of endemicity. A key to unravelling this puzzle is a detailed knowledge of the aetiological agents responsible for different patterns of disease--knowledge that is difficult to achieve using traditional microscopy. The science of molecular epidemiology has developed a range of tools which have enabled us to accurately identify taxonomic groups at all levels (species, subspecies, populations, strains and isolates). Using these tools, we can now investigate the genetic interactions within and between populations of Trypanosoma brucei and gain an understanding of the distinction between human- and nonhuman-infective subspecies. In this review, we discuss the development of these tools, their advantages and disadvantages, and describe how they have been used to understand parasite genetic diversity, the origin of epidemics, the role of reservoir hosts and the population structure. Using the specific case of T. b. rhodesiense in Uganda, we illustrate how molecular epidemiology has enabled us to construct a more detailed understanding of the origins, generation and dynamics of sleeping sickness epidemics.
Kumar, Rajendra; Sobhy, Haitham
2017-01-01
Hi-C experiments generate data in the form of large genome contact maps (Hi-C maps). These maps show that chromosomes are arranged in a hierarchy of three-dimensional compartments. To understand how these compartments form and how much they affect genetic processes such as gene regulation, biologists and bioinformaticians need efficient tools to visualize and analyze Hi-C data. However, this is technically challenging because these maps are very large. In this paper, we address this problem, in part by implementing an efficient file format, and present the genome contact map explorer platform. Apart from tools to process Hi-C data, such as normalization methods and a programmable interface, we provide a graphical interface that lets users browse, scroll and zoom Hi-C maps to visually search for patterns in the Hi-C data. The software also supports browsing several maps simultaneously and plotting related genomic data. The software is openly accessible to the scientific community. PMID:28973466
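The core operation behind zooming out in such a contact-map browser is binning: summing fixed-size blocks of the matrix to produce a coarser map. A minimal sketch, assuming a dense square matrix whose size is divisible by the factor (real Hi-C maps are sparse and far larger):

```python
import numpy as np

def coarsen(contact_map, factor):
    """Down-sample a square Hi-C contact matrix by summing
    factor x factor blocks (zooming out by `factor`)."""
    n = contact_map.shape[0] // factor
    # Reshape splits each axis into (bin index, within-bin offset),
    # then summing over the offset axes pools each block.
    return contact_map.reshape(n, factor, n, factor).sum(axis=(1, 3))
```

Storing several pre-coarsened resolutions in the file, rather than recomputing them per zoom event, is one common way such browsers stay responsive on big maps.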
Stanislawski, Larry V.; Survila, Kornelijus; Wendel, Jeffrey; Liu, Yan; Buttenfield, Barbara P.
2018-01-01
This paper describes a workflow for automating the extraction of elevation-derived stream lines using open source tools with parallel computing support and testing the effectiveness of procedures in various terrain conditions within the conterminous United States. Drainage networks are extracted from the US Geological Survey 1/3 arc-second 3D Elevation Program elevation data having a nominal cell size of 10 m. This research demonstrates the utility of open source tools with parallel computing support for extracting connected drainage network patterns and handling depressions in 30 subbasins distributed across humid, dry, and transitional climate regions and in terrain conditions exhibiting a range of slopes. Special attention is given to low-slope terrain, where network connectivity is preserved by generating synthetic stream channels through lake and waterbody polygons. Conflation analysis compares the extracted streams with a 1:24,000-scale National Hydrography Dataset flowline network and shows that similarities are greatest for second- and higher-order tributaries.
Zhang, Fan; Liu, Runsheng; Zheng, Jie
2016-12-23
Linking computational models of signaling pathways to predicted cellular responses such as gene expression regulation is a major challenge in computational systems biology. In this work, we present Sig2GRN, a Cytoscape plugin that is able to simulate time-course gene expression data given user-defined external stimuli to the signaling pathways. A generalized logical model is used to model the upstream signaling pathways. A Boolean model and a thermodynamics-based model are then employed to predict the downstream changes in gene expression based on the simulated dynamics of transcription factors in the signaling pathways. Our empirical case studies show that Sig2GRN simulations can predict changes in gene expression patterns induced by DNA damage signals and drug treatments. As a software tool for modeling cellular dynamics, Sig2GRN can facilitate studies in systems biology by supporting hypothesis generation and wet-lab experimental design. http://histone.scse.ntu.edu.sg/Sig2GRN/.
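The Boolean layer of such a model can be sketched as synchronous updates over a small rule set. The three-node pathway below (stimulus → transcription factor → gene) is hypothetical, chosen only to mirror the DNA-damage example in the abstract; Sig2GRN's own models are richer.

```python
def step(state, rules):
    """Synchronously update all nodes: each rule maps the
    current state dict to the node's next Boolean value."""
    return {node: rule(state) for node, rule in rules.items()}

def simulate(state, rules, n_steps):
    """Return the time course [state_0, state_1, ..., state_n]."""
    trajectory = [dict(state)]
    for _ in range(n_steps):
        state = step(state, rules)
        trajectory.append(dict(state))
    return trajectory

# Hypothetical toy pathway: a DNA-damage signal activates a
# transcription factor TF, which switches a target gene on.
rules = {
    "damage": lambda s: s["damage"],  # external stimulus, held fixed
    "TF":     lambda s: s["damage"],  # TF activated by the signal
    "gene":   lambda s: s["TF"],      # gene expressed when TF is active
}
```

Running the simulation shows the signal propagating one step per update: the gene turns on two steps after the stimulus, which is the kind of time-course output the plugin produces.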
The centrality of RNA for engineering gene expression
Chappell, James; Takahashi, Melissa K; Meyer, Sarai; Loughrey, David; Watters, Kyle E; Lucks, Julius
2013-01-01
Synthetic biology holds promise as both a framework for rationally engineering biological systems and a way to revolutionize how we fundamentally understand them. Essential to realizing this promise is the development of strategies and tools to reliably and predictably control and characterize sophisticated patterns of gene expression. Here we review the role that RNA can play towards this goal and make a case for why this versatile, designable, and increasingly characterizable molecule is one of the most powerful substrates for engineering gene expression at our disposal. We discuss current natural and synthetic RNA regulators of gene expression acting at key points of control – transcription, mRNA degradation, and translation. We also consider RNA structural probing and computational RNA structure prediction tools as ways to study RNA structure and ultimately function. Finally, we discuss how next-generation sequencing methods are being applied to the study of RNA and to the characterization of RNA's many properties throughout the cell. PMID:24124015
NASA Astrophysics Data System (ADS)
Yang, Yang; Pan, Yayue; Guo, Ping
2017-04-01
Creating orderly periodic micro/nano-structures on metallic surfaces, or structural coloration, for control of surface apparent color and optical reflectivity has been an exciting research topic over the years. The direct applications of structural coloration include color marking, display devices, and invisibility cloak. This paper presents an efficient method to colorize metallic surfaces with periodic micro/nano-gratings using elliptical vibration texturing. When the tool vibration is coupled with a constant cutting velocity, controlled periodic ripples can be generated due to the overlapping tool trajectory. These periodic ripples with a wavelength near visible spectrum can act as micro-gratings to introduce iridescent colors. The proposed technique also provides a flexible method for color marking of metallic surfaces with arbitrary patterns and images by precise control of the spacing distance and orientation of induced micro/nano-ripples. Theoretical analysis and experimental results are given to demonstrate structural coloration of metals by a direct mechanical machining technique.
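The link between ripple spacing and apparent color follows the standard grating equation, d(sin θm − sin θi) = mλ: the spacing d set by the vibration texturing determines which wavelength λ is reinforced at a given viewing angle. A small illustrative calculation (angles and spacing are example values, not from the paper):

```python
import math

def diffracted_wavelength(spacing_nm, order, theta_i_deg, theta_m_deg):
    """Wavelength (nm) reinforced by a grating of the given spacing.

    Grating equation: d * (sin(theta_m) - sin(theta_i)) = m * lambda,
    with incidence angle theta_i and diffraction angle theta_m.
    """
    return spacing_nm * (math.sin(math.radians(theta_m_deg))
                         - math.sin(math.radians(theta_i_deg))) / order
```

With a 700 nm ripple spacing, normal incidence and a 45° viewing angle, the first-order reinforced wavelength falls near 495 nm (blue-green), illustrating why sub-micron ripple spacings yield visible iridescence.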
Deloizy, Charlotte; Bouguyon, Edwige; Fossum, Even; Sebo, Peter; Osicka, Radim; Bole, Angélique; Pierres, Michel; Biacchesi, Stéphane; Dalod, Marc; Bogen, Bjarne; Bertho, Nicolas; Schwartz-Cornil, Isabelle
2016-12-01
Pig is a domestic species of major importance in the agro-economy and in biomedical research. Mononuclear phagocytes (MNP) are organized in subsets with specialized roles in the orchestration of the immune response and new tools are awaited to improve MNP subset identification in the pig. We cloned pig CD11c cDNA and generated a monoclonal antibody to pig CD11c which showed a pattern of expression by blood and skin MNP subsets similar to humans. We also developed a porcine XCL1-mCherry dimer which specifically reacted with the XCR1-expressing dendritic cell subset of the type 1 lineage in blood and skin. These original reagents will allow the efficient identification of pig MNP subsets to study their role in physiological and pathological processes and also to target these cells in novel intervention and vaccine strategies for veterinary applications and preclinical evaluations.
NASA Astrophysics Data System (ADS)
Lewis, Jared; Bodeker, Greg E.; Kremser, Stefanie; Tait, Andrew
2017-12-01
A method, based on climate pattern scaling, has been developed to expand a small number of projections of fields of a selected climate variable (X) into an ensemble that encapsulates a wide range of indicative model structural uncertainties. The method described in this paper is referred to as the Ensemble Projections Incorporating Climate model uncertainty (EPIC) method. Each ensemble member is constructed by adding contributions from (1) a climatology derived from observations that represents the time-invariant part of the signal; (2) a contribution from forced changes in X, where those changes can be statistically related to changes in global mean surface temperature (Tglobal); and (3) a contribution from unforced variability that is generated by a stochastic weather generator. The patterns of unforced variability are also allowed to respond to changes in Tglobal. The statistical relationships between changes in X (and its patterns of variability) and Tglobal are obtained in a training phase. Then, in an implementation phase, 190 simulations of Tglobal are generated using a simple climate model tuned to emulate 19 different global climate models (GCMs) and 10 different carbon cycle models. Using the generated Tglobal time series and the correlation between the forced changes in X and Tglobal, obtained in the training phase, the forced change in the X field can be generated many times using Monte Carlo analysis. A stochastic weather generator is used to generate realistic representations of weather which include spatial coherence. Because GCMs and regional climate models (RCMs) are less likely to correctly represent unforced variability compared to observations, the stochastic weather generator takes as input measures of variability derived from observations, but also responds to forced changes in climate in a way that is consistent with the RCM projections. This approach to generating a large ensemble of projections is many orders of magnitude more computationally efficient than running multiple GCM or RCM simulations. Such a large ensemble of projections permits a description of a probability density function (PDF) of future climate states rather than a small number of individual story lines within that PDF, which may not be representative of the PDF as a whole; the EPIC method largely corrects for such potential sampling biases. The method is useful for providing projections of changes in climate to users wishing to investigate the impacts and implications of climate change in a probabilistic way. A web-based tool, using the EPIC method to provide probabilistic projections of changes in daily maximum and minimum temperatures for New Zealand, has been developed and is described in this paper.
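The three-part construction of an ensemble member can be sketched directly: observed climatology, plus a forced pattern scaled by the change in global mean temperature, plus stochastic unforced variability. All inputs below are illustrative stand-ins (a Gaussian noise field substitutes for the weather generator), not EPIC's own data or regression model.

```python
import numpy as np

rng = np.random.default_rng(0)

def epic_member(climatology, pattern, delta_T_global, noise_std):
    """One illustrative ensemble member of climate field X.

    climatology    : observed time-invariant field
    pattern        : forced change in X per degree of Tglobal change
                     (from a training-phase regression)
    delta_T_global : change in global mean surface temperature
    noise_std      : amplitude of the stand-in unforced variability
    """
    forced = pattern * delta_T_global
    unforced = rng.normal(0.0, noise_std, size=climatology.shape)
    return climatology + forced + unforced
```

Drawing many Tglobal trajectories from the simple climate model and calling this per trajectory is what makes the ensemble cheap relative to full GCM/RCM runs: only the scalar Tglobal is simulated dynamically.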
Lakbub, Jude C; Su, Xiaomeng; Zhu, Zhikai; Patabandige, Milani W; Hua, David; Go, Eden P; Desaire, Heather
2017-08-04
The glycopeptide analysis field is tightly constrained by a lack of effective tools that translate mass spectrometry data into meaningful chemical information, and perhaps the most challenging aspect of building effective glycopeptide analysis software is designing an accurate scoring algorithm for MS/MS data. We provide the glycoproteomics community with two tools to address this challenge. The first tool, a curated set of 100 expert-assigned CID spectra of glycopeptides, contains a diverse set of spectra from a variety of glycan types; the second tool, Glycopeptide Decoy Generator, is a new software application that generates glycopeptide decoys de novo. We developed these tools so that emerging methods of assigning glycopeptides' CID spectra could be rigorously tested. Software developers or those interested in developing skills in expert (manual) analysis can use these tools to facilitate their work. We demonstrate the tools' utility in assessing the quality of one particular glycopeptide software package, GlycoPep Grader, which assigns glycopeptides to CID spectra. We first acquired the set of 100 expert assigned CID spectra; then, we used the Decoy Generator (described herein) to generate 20 decoys per target glycopeptide. The assigned spectra and decoys were used to test the accuracy of GlycoPep Grader's scoring algorithm; new strengths and weaknesses were identified in the algorithm using this approach. Both newly developed tools are freely available. The software can be downloaded at http://glycopro.chem.ku.edu/GPJ.jar.
Culture, Interface Design, and Design Methods for Mobile Devices
NASA Astrophysics Data System (ADS)
Lee, Kun-Pyo
Aesthetic differences and similarities among cultures are obviously one of the very important issues in cultural design. However, ever since products became knowledge-supporting tools, the visible elements of products have become more universal so that the invisible parts of products such as interface and interaction are getting more important. Therefore, the cultural design should be extended to the invisible elements of culture like people's conceptual models beyond material and phenomenal culture. This chapter aims to explain how we address the invisible cultural elements in interface design and design methods by exploring the users' cognitive styles and communication patterns in different cultures. Regarding cultural interface design, we examined users' conceptual models while interacting with mobile phone and website interfaces, and observed cultural difference in performing tasks and viewing patterns, which appeared to agree with cultural cognitive styles known as Holistic thoughts vs. Analytic thoughts. Regarding design methods for culture, we explored how to localize design methods such as focus group interview and generative session for specific cultural groups, and the results of comparative experiments revealed cultural difference on participants' behaviors and performance in each design method and led us to suggest how to conduct them in East Asian culture. Mobile Observation Analyzer and Wi-Pro, user research tools we invented to capture user behaviors and needs especially in their mobile context, were also introduced.
NASA Astrophysics Data System (ADS)
Lee, J.; Jeong, S.
2017-12-01
Glaciers have often been considered a symbol of climate change, and their mass change is a major contributor to sea level rise. Dynamic discharge is one of the mechanisms by which marine-terminating outlet glaciers lose mass, and its trend consists of seasonal, annual and secular patterns. These patterns, along with other climate parameters, can inspire music composition and thus be expressed and conveyed through musical media. Here we present 'Threatened by,' a piece of electronic music accompanied by an animation of glacier movement, which represents an attempt to frame the sound of the glacier in freer ways vis-à-vis acoustic music. To shape the sound, music production tools such as Pro Tools, Sound Forge Pro, Logic Pro X and Max/MSP are used to modify and combine a variety of sounds generated by a melting glacier. After adding impact by way of EQ, reverberation, distortion, delay, reverse, etc., we created a two-channel stereo piece of approximately 7 minutes. Alongside the music, we also present a video clip whose visual features correspond to glacial properties or events. We expect this work will raise awareness of glacier behaviour among the general public, while presenting an example of scientists and artists working collaboratively on an artwork with social implications.
Penttinen, Kirsi; Siirtola, Harri; Àvalos-Salguero, Jorge; Vainio, Tiina; Juhola, Martti; Aalto-Setälä, Katriina
2015-01-01
Comprehensive functioning of Ca2+ cycling is crucial for excitation–contraction coupling of cardiomyocytes (CMs). Abnormal Ca2+ cycling is linked to arrhythmogenesis, which is associated with cardiac disorders and heart failure. Accordingly, we have generated spontaneously beating CMs from induced pluripotent stem cells (iPSC) derived from patients with catecholaminergic polymorphic ventricular tachycardia (CPVT), which is an inherited and severe cardiac disease. Ca2+ cycling studies have revealed substantial abnormalities in these CMs. Ca2+ transient analysis performed manually lacks accepted analysis criteria, and has both low throughput and high variability. To overcome these issues, we have developed a software tool, AnomalyExplorer based on interactive visualization, to assist in the classification of Ca2+ transient patterns detected in CMs. Here, we demonstrate the usability and capability of the software, and we also compare the analysis efficiency to manual analysis. We show that AnomalyExplorer is suitable for detecting normal and abnormal Ca2+ transients; furthermore, this method provides more defined and consistent information regarding the Ca2+ abnormality patterns and cell line specific differences when compared to manual analysis. This tool will facilitate and speed up the analysis of CM Ca2+ transients, making it both more accurate and user-independent. AnomalyExplorer can be exploited in Ca2+ cycling analysis to study basic disease pathology and the effects of different drugs. PMID:26308621
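A basic building block of any Ca2+ transient analysis is locating transients in the fluorescence trace. The thresholded peak detector below is a toy illustration of that first step; AnomalyExplorer's interactive classification of abnormal transient patterns is far richer than this.

```python
def find_transients(trace, threshold):
    """Indices of local maxima above `threshold` in a Ca2+
    fluorescence trace (a list of intensity samples)."""
    peaks = []
    for i in range(1, len(trace) - 1):
        # A peak: above threshold, rising into i, not rising out of it.
        if trace[i] > threshold and trace[i - 1] < trace[i] >= trace[i + 1]:
            peaks.append(i)
    return peaks
```

Quantities derived from the detected peaks (amplitude, rise time, inter-transient interval and their variability) are the kind of features that distinguish normal beating from the abnormal CPVT patterns described in the abstract.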
Holmquist-Johnson, C. L.
2009-01-01
River-spanning rock structures are being constructed for water delivery, to enable fish passage at barriers, and to provide or improve aquatic habitat for endangered fish species. Current design methods are based upon anecdotal information applicable to a narrow range of channel conditions. The complex flow patterns and performance of rock weirs are not well understood. Without an accurate understanding of their hydraulics, designers cannot address the failure mechanisms of these structures. Flow characteristics such as jets, near-bed velocities, recirculation, eddies, and plunging flow govern scour pool development. These detailed flow patterns can be replicated using a 3D numerical model. Numerical studies inexpensively simulate a large number of cases, broadening the range of applicability, in order to develop design tools and predictive capability for analysis and design. The analysis and results of the numerical modeling, laboratory modeling, and field data provide a process-based method for understanding how structure geometry affects flow characteristics, scour development, fish passage, water delivery, and overall structure stability. Results of the numerical modeling allow designers to determine the appropriate geometry for generating desirable flow parameters. The end product of this research will be tools and guidelines for more robust structure design or retrofits based upon predictable engineering and hydraulic performance criteria. © 2009 ASCE.
Wei, Ning-Ning; Hamza, Adel
2014-01-27
We present an efficient and rational ligand/structure shape-based virtual screening approach combining our previous ligand shape-based similarity method SABRE (shape-approach-based routines enhanced) with the 3D shape of the receptor binding site. Our approach exploits the pharmacological preferences of a number of known active ligands to take advantage of their structural diversities and chemical similarities, using a linear combination of weighted molecular shape density. Furthermore, the algorithm generates a consensus molecular-shape pattern recognition that is used to filter and place the candidate structure into the binding pocket. The descriptor pool used to construct the consensus molecular-shape pattern consists of four-dimensional (4D) fingerprints generated from the distribution of conformer states available to a molecule and the 3D shapes of a set of active ligands computed using the SABRE software. The virtual screening efficiency of SABRE was validated using the Directory of Useful Decoys (DUD) and the filtered version (WOMBAT) of 10 DUD targets. The ligand/structure shape-based similarity SABRE algorithm outperforms several other widely used virtual screening methods that use data fusion of multiple screening tools (2D and 3D fingerprints) and demonstrates a superior early retrieval rate of active compounds (EF(0.1%) = 69.0% and EF(1%) = 98.7%) from a large ligand database (∼95,000 structures). Therefore, our similarity approach can be of particular use for identifying active compounds that are similar to reference molecules and for predicting activity against other targets (chemogenomics). An academic license of the SABRE program is available on request.
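Early-retrieval metrics like those quoted above measure how strongly actives concentrate at the top of a ranked screen. The function below computes the classic enrichment factor (actives found in the top fraction relative to random expectation); note the abstract reports its EF values as percentages, which is a different convention, so this is shown only to illustrate the underlying idea.

```python
def enrichment_factor(ranked_labels, fraction):
    """Classic enrichment factor at a given fraction of a ranked screen.

    ranked_labels : the screen ordered best-score first, True = active
    fraction      : top fraction to inspect, e.g. 0.01 for EF(1%)
    Returns (hit rate in the top fraction) / (hit rate overall).
    """
    n = len(ranked_labels)
    n_top = max(1, int(n * fraction))
    hits_top = sum(ranked_labels[:n_top])
    hits_total = sum(ranked_labels)
    return (hits_top / n_top) / (hits_total / n)
```

An EF of 1.0 means the method ranks no better than random; the maximum possible value at a given fraction is bounded by 1/fraction, which is why EF(0.1%) can reach far higher values than EF(1%).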
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGonegle, David, E-mail: d.mcgonegle1@physics.ox.ac.uk; Wark, Justin S.; Higginbotham, Andrew
2015-08-14
A growing number of shock compression experiments, especially those involving laser compression, are taking advantage of in situ x-ray diffraction as a tool to interrogate structure and microstructure evolution. Although these experiments are becoming increasingly sophisticated, there has been little work on exploiting the textured nature of polycrystalline targets to gain information on sample response. Here, we describe how to generate simulated x-ray diffraction patterns from materials with an arbitrary texture function subject to a general deformation gradient. We will present simulations of Debye-Scherrer x-ray diffraction from highly textured polycrystalline targets that have been subjected to uniaxial compression, as may occur under planar shock conditions. In particular, we study samples with a fibre texture, and find that the azimuthal dependence of the diffraction patterns contains information that, in principle, affords discrimination between a number of similar shock-deformation mechanisms. For certain cases, we compare our method with results obtained by taking the Fourier transform of the atomic positions calculated by classical molecular dynamics simulations. Illustrative results are presented for the shock-induced α–ϵ phase transition in iron, the α–ω transition in titanium and deformation due to twinning in tantalum that is initially preferentially textured along [001] and [011]. The simulations are relevant to experiments that can now be performed using 4th generation light sources, where single-shot x-ray diffraction patterns from crystals compressed via laser-ablation can be obtained on timescales shorter than a phonon period.
Semi-autonomous remote sensing time series generation tool
NASA Astrophysics Data System (ADS)
Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher
2017-10-01
High spatial and temporal resolution data is vital for crop monitoring and phenology change detection. Due to limitations of satellite architecture and frequent cloud cover, the availability of daily high-spatial-resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework for a Geographic Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. First, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then, two main frameworks are created: one to perform all the pre-processing steps on various satellite data and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks, or they can be combined to perform both processes in one go. The tool can handle most known geo data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a good interface that provides many functionalities to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.
The New England travel market: generational travel patterns, 1979 to 1996
Rod Warnick
2002-01-01
Generations of travelers who select New England as a primary destination are examined over time from the years of 1979 through 1996 and the analysis serves to update an earlier review of generational travel patterns of the region (Warnick, 1994). Changes in travel patterns are noted by overall adjusted annual change rates by demographic and geographic regions of...
Westcott, Nathan P; Pulsipher, Abigail; Lamb, Brian M; Yousaf, Muhammad N
2008-09-02
An expedient and inexpensive method to generate patterned aldehydes on self-assembled monolayers (SAMs) of alkanethiolates on gold, with control of density for subsequent chemoselective immobilization from commercially available starting materials, has been developed. Utilizing microfluidic cassettes, primary alcohol oxidation of tetra(ethylene glycol) undecane thiol and 11-mercapto-1-undecanol SAMs was performed directly on the surface, generating patterned aldehyde groups with pyridinium chlorochromate. The precise density of surface aldehydes generated can be controlled and characterized by electrochemistry. For biological applications, fibroblast cells were seeded on patterned surfaces presenting biospecific cell-adhesive Arg-Gly-Asp (RGD) peptides.
Pan, Deyun; Sun, Ning; Cheung, Kei-Hoi; Guan, Zhong; Ma, Ligeng; Holford, Matthew; Deng, Xingwang; Zhao, Hongyu
2003-11-07
To date, many genomic and pathway-related tools and databases have been developed to analyze microarray data. In published web-based applications to date, however, complex pathways have been displayed with static image files that may not be up-to-date or are time-consuming to rebuild. In addition, gene expression analyses focus on individual probes and genes with little or no consideration of pathways. These approaches reveal little information about pathways that are key to a full understanding of the building blocks of biological systems. Therefore, there is a need to provide useful tools that can generate pathways without manually building images and allow gene expression data to be integrated and analyzed at pathway levels for such experimental organisms as Arabidopsis. We have developed PathMAPA, a web-based application written in Java that can be easily accessed over the Internet. An Oracle database is used to store, query, and manipulate the large amounts of data that are involved. PathMAPA allows its users to (i) upload and populate microarray data into a database; (ii) integrate gene expression with enzymes of the pathways; (iii) generate pathway diagrams without building image files manually; (iv) visualize gene expressions for each pathway at enzyme, locus, and probe levels; and (v) perform statistical tests at pathway, enzyme and gene levels. PathMAPA can be used to examine Arabidopsis thaliana gene expression patterns associated with metabolic pathways. PathMAPA provides two unique features for the gene expression analysis of Arabidopsis thaliana: (i) automatic generation of pathways associated with gene expression and (ii) statistical tests at pathway level. The first feature allows for the periodical updating of genomic data for pathways, while the second feature can provide insight into how treatments affect relevant pathways for the selected experiment(s).
Automated branching pattern report generation for laparoscopic surgery assistance
NASA Astrophysics Data System (ADS)
Oda, Masahiro; Matsuzaki, Tetsuro; Hayashi, Yuichiro; Kitasaka, Takayuki; Misawa, Kazunari; Mori, Kensaku
2015-05-01
This paper presents a method for generating branching pattern reports of abdominal blood vessels for laparoscopic gastrectomy. In gastrectomy, it is very important to understand the branching structure of the abdominal arteries and veins that feed and drain specific abdominal organs, including the stomach, the liver and the pancreas. In current clinical practice, a surgeon creates a diagnostic report of the patient's anatomy that summarizes the branching patterns of the blood vessels related to the stomach, and on this basis decides the actual operative procedure. This paper shows an automated method to generate such a branching pattern report for abdominal blood vessels based on automated anatomical labeling. The report contains a 3D rendering showing the important blood vessels and descriptions of the branching pattern of each vessel. We applied this method to fifty cases of 3D abdominal CT scans and confirmed that the proposed method can automatically generate branching pattern reports of the abdominal arteries.
Development of Anthropometric Analogous Headforms. Phase 1.
1994-10-31
shown in figure 5. This surface mesh can then be transformed into polygon faces that are able to be rendered by the AutoCAD rendering tools. Rendering of...computer-generated surfaces. The material removal techniques require the programming of the tool path of the cutter and in some cases require specialized...tooling. Tool path programs are available to transfer the computer-generated surface into actual paths of the cutting tool. In cases where the
Power Plant Model Validation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool automates the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input, automatically adjusts all required EPCL scripts, and interacts with GE PSLF in batch mode. Its main features include: interaction with GE PSLF; use of the GE PSLF Play-In function for generator model validation; databases of projects (model validation studies), historic events, and power plants; advanced visualization capabilities; and automatic report generation.
Wang, Dongwen
2017-01-01
We analyzed four interactive case simulation tools (ICSTs) from a statewide online clinical education program. Results have shown that ICSTs are increasingly used by HIV healthcare providers. Smart phone has become the primary usage platform for specific ICSTs. Usage patterns depend on particular ICST modules, usage stages, and use contexts. Future design of ICSTs should consider these usage patterns for more effective dissemination of clinical evidence to healthcare providers.
Laser induced Erasable Patterns in a N* Liquid Crystal on an Iron Doped Lithium Niobate (Postprint)
2017-10-12
be applied selectively to erase these patterns. Thus, a promising method is reported to generate reconfigurable patterns, photonic motives, and touch sensitive devices in a... Approved for public release (PA): distribution unlimited. ...loss of the patterns inscribed. Possible motives are not limited to graphics. It should be also possible to write
Application of Multimedia Design Principles to Visuals Used in Course-Books: An Evaluation Tool
ERIC Educational Resources Information Center
Kuzu, Abdullah; Akbulut, Yavuz; Sahin, Mehmet Can
2007-01-01
This paper introduces an evaluation tool prepared to examine the quality of visuals in course-books. The tool is based on Mayer's Cognitive Theory of Multimedia Learning (i.e. Generative Theory) and its principles regarding the correct use of illustrations within text. The reason to generate the tool, the development process along with the…
Signatures of Sex-Antagonistic Selection on Recombining Sex Chromosomes
Kirkpatrick, Mark; Guerrero, Rafael F.
2014-01-01
Sex-antagonistic (SA) selection has major evolutionary consequences: it can drive genomic change, constrain adaptation, and maintain genetic variation for fitness. The recombining (or pseudoautosomal) regions of sex chromosomes are a promising setting in which to study SA selection because they tend to accumulate SA polymorphisms and because recombination allows us to deploy the tools of molecular evolution to locate targets of SA selection and quantify evolutionary forces. Here we use coalescent models to characterize the patterns of polymorphism expected within and divergence between recombining X and Y (or Z and W) sex chromosomes. SA selection generates peaks of divergence between X and Y that can extend substantial distances away from the targets of selection. Linkage disequilibrium between neutral sites is also inflated. We show how the pattern of divergence is altered when the SA polymorphism or the sex-determining region was recently established. We use data from the flowering plant Silene latifolia to illustrate how the strength of SA selection might be quantified using molecular data from recombining sex chromosomes. PMID:24578352
NASA Astrophysics Data System (ADS)
Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten
2014-03-01
Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy regarding the resulting textures.
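The "fully developed speckle" model is one common choice for such simulations: the complex echo field is circular Gaussian, so the amplitude is Rayleigh distributed and the intensity exponential. The sketch below (an assumption about a plausible speckle model, not the authors' actual phantom code) generates such textures with a different mean backscatter per tissue class.

```python
import numpy as np

rng = np.random.default_rng(0)

def speckle_texture(shape, mean_intensity=1.0):
    """Fully developed speckle: intensity of a circular complex Gaussian
    echo field (Rayleigh amplitude, exponential intensity)."""
    field = rng.normal(size=shape) + 1j * rng.normal(size=shape)
    intensity = np.abs(field) ** 2
    return mean_intensity * intensity / intensity.mean()

# Two hypothetical tissue classes differing only in mean backscatter.
muscle = speckle_texture((128, 128), mean_intensity=1.0)
fat = speckle_texture((128, 128), mean_intensity=2.5)

# For fully developed speckle the intensity SNR (mean/std) is close to 1,
# one of the standard quantitative checks of speckle realism.
snr = float(muscle.mean() / muscle.std())
print(round(snr, 2))
```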
Stanford, Michael G.; Lewis, Brett B.; Iberi, Vighter O.; ...
2016-02-16
Focused helium and neon ion (He(+)/Ne(+)) beam processing has recently been used to push resolution limits of direct-write nanoscale synthesis. The ubiquitous insertion of focused He(+)/Ne(+) beams as the next-generation nanofabrication tool-of-choice is currently limited by deleterious subsurface and peripheral damage induced by the energetic ions in the underlying substrate. The in situ mitigation of subsurface damage induced by He(+)/Ne(+) ion exposures in silicon via a synchronized infrared pulsed laser-assisted process is demonstrated. The pulsed laser assist provides highly localized in situ photothermal energy which reduces the implantation and defect concentration by greater than 90%. The laser-assisted exposure process is also shown to reduce peripheral defects in He(+) patterned graphene, which makes this process an attractive candidate for direct-write patterning of 2D materials. In conclusion, these results offer a necessary solution for the applicability of high-resolution direct-write nanoscale material processing via focused ion beams.
Levin, Michael; Pezzulo, Giovanni; Finkelstein, Joshua M
2017-06-21
Living systems exhibit remarkable abilities to self-assemble, regenerate, and remodel complex shapes. How cellular networks construct and repair specific anatomical outcomes is an open question at the heart of the next-generation science of bioengineering. Developmental bioelectricity is an exciting emerging discipline that exploits endogenous bioelectric signaling among many cell types to regulate pattern formation. We provide a brief overview of this field, review recent data in which bioelectricity is used to control patterning in a range of model systems, and describe the molecular tools being used to probe the role of bioelectrics in the dynamic control of complex anatomy. We suggest that quantitative strategies recently developed to infer semantic content and information processing from ionic activity in the brain might provide important clues to cracking the bioelectric code. Gaining control of the mechanisms by which large-scale shape is regulated in vivo will drive transformative advances in bioengineering, regenerative medicine, and synthetic morphology, and could be used to therapeutically address birth defects, traumatic injury, and cancer.
Possibility expectation and its decision making algorithm
NASA Technical Reports Server (NTRS)
Keller, James M.; Yan, Bolin
1992-01-01
The fuzzy integral has been shown to be an effective tool for the aggregation of evidence in decision making. Of primary importance in the development of a fuzzy integral pattern recognition algorithm is the choice (construction) of the measure which embodies the importance of subsets of sources of evidence. Sugeno fuzzy measures have received the most attention due to the recursive nature of the fabrication of the measure on nested sequences of subsets. Possibility measures exhibit an even simpler generation capability, but usually require that one of the sources of information possess complete credibility. In real applications, such normalization may not be possible, or even desirable. In this report, both the theory and a decision making algorithm for a variation of the fuzzy integral are presented. This integral is based on a possibility measure where it is not required that the measure of the universe be unity. A training algorithm for the possibility densities in a pattern recognition application is also presented with the results demonstrated on the shuttle-earth-space training and testing images.
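The possibility-measure variant of the Sugeno integral described above can be sketched directly: sort the sources by their evidence and take the sup-min of the evidence against the running possibility measure of the top-ranked subset. The numbers below are hypothetical, and the densities are deliberately left unnormalized, as the report allows.

```python
def possibility_integral(h, g):
    """Sugeno-style fuzzy integral over a possibility measure.

    h[i] is the evidence supplied by source i for the class; g[i] is the
    possibility density (importance) of source i. The measure of a subset
    is the max of its densities and, as in the report, the measure of the
    whole universe need not equal 1."""
    order = sorted(range(len(h)), key=lambda i: h[i], reverse=True)
    best = 0.0
    measure = 0.0  # possibility measure of the top-k sources seen so far
    for i in order:
        measure = max(measure, g[i])          # g(A_k) = max density in A_k
        best = max(best, min(h[i], measure))  # sup-min form of the integral
    return best

# Hypothetical three-source example.
h = [0.9, 0.6, 0.3]  # evidence from each source
g = [0.5, 0.8, 0.2]  # possibility densities, not summing or maxing to 1
print(possibility_integral(h, g))  # → 0.6
```

In a pattern recognition setting one such integral would be computed per class, and the class with the largest integral value chosen.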
Profile of new green fluorescent protein transgenic Jinhua pigs as an imaging source
NASA Astrophysics Data System (ADS)
Kawarasaki, Tatsuo; Uchiyama, Kazuhiko; Hirao, Atsushi; Azuma, Sadahiro; Otake, Masayoshi; Shibata, Masatoshi; Tsuchiya, Seiko; Enosawa, Shin; Takeuchi, Koichi; Konno, Kenjiro; Hakamata, Yoji; Yoshino, Hiroyuki; Wakai, Takuya; Ookawara, Shigeo; Tanaka, Hozumi; Kobayashi, Eiji; Murakami, Takashi
2009-09-01
Animal imaging sources have become an indispensable material for biological sciences. Specifically, gene-encoded biological probes serve as stable and high-performance tools to visualize cellular fate in living animals. We use a somatic cell cloning technique to create new green fluorescent protein (GFP)-expressing Jinhua pigs with a miniature body size, and characterize the expression profile in various tissues/organs and ex vivo culture conditions. The GFP-transgenic pigs thus born demonstrate an organ/tissue-dependent expression pattern. Strong GFP expression is observed in the skeletal muscle, pancreas, heart, and kidney. At the cellular level, bone-marrow-derived mesenchymal stromal cells, hepatocytes, and islet cells of the pancreas also show sufficient expression with a unique pattern. Moreover, the cloned pigs demonstrate normal growth and fertility, and the introduced GFP gene is stably transmitted to pigs in subsequent generations. The new GFP-expressing Jinhua pigs may be used as new cellular/tissue light resources for biological imaging in preclinical research fields such as tissue engineering, experimental regenerative medicine, and transplantation.
Statistical process control using optimized neural networks: a case study.
Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid
2014-09-01
The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two aspects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen to recognize the CCPs. Second, a hybrid heuristic recognition system based on the cuckoo optimization algorithm (COA) is introduced to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
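A minimal sketch of the feature-extraction idea, assuming a typical set of CCP shape/statistical features (mean, standard deviation, least-squares slope, mean crossings) rather than the paper's exact feature set. The least-squares slope, for instance, cleanly separates an upward-trend pattern from an in-control one:

```python
import random

random.seed(42)

def ccp_features(x):
    """A few shape/statistical features of the kind used for CCP recognition
    (this particular set is illustrative, not the paper's exact one)."""
    n = len(x)
    m = sum(x) / n
    std = (sum((v - m) ** 2 for v in x) / n) ** 0.5
    t_mean = (n - 1) / 2
    # Least-squares slope against time: separates trend from normal patterns.
    slope = (sum((t - t_mean) * (v - m) for t, v in enumerate(x))
             / sum((t - t_mean) ** 2 for t in range(n)))
    # Mean crossings: high for in-control noise, low for shifts and trends.
    crossings = sum(1 for a, b in zip(x, x[1:]) if (a - m) * (b - m) < 0)
    return {"mean": m, "std": std, "slope": slope, "crossings": crossings}

normal = [random.gauss(0, 1) for _ in range(60)]           # in-control pattern
trend = [0.1 * t + random.gauss(0, 1) for t in range(60)]  # upward-trend CCP

print(ccp_features(normal)["slope"], ccp_features(trend)["slope"])
```

Such feature vectors would then be fed to the classifier module (MLP, PNN, or RBF network in the study).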
Ingrosso, Chiara; Panniello, AnnaMaria; Comparelli, Roberto; Curri, Maria Lucia; Striccoli, Marinella
2010-01-01
The unique size- and shape-dependent electronic properties of nanocrystals (NCs) make them extremely attractive as novel structural building blocks for constructing a new generation of innovative materials and solid-state devices. Recent advances in materials chemistry have allowed the synthesis of colloidal NCs with a wide range of compositions, with precise control of size, shape and uniformity as well as specific surface chemistry. By incorporating such nanostructures in polymers, mesoscopic materials can be achieved and their properties engineered by choosing NCs differing in size and/or composition, properly tuning the interaction between the NCs and the surrounding environment. In this contribution, different approaches will be presented as effective opportunities for conveying colloidal NC properties to nanocomposite materials for micro- and nanofabrication. Patterning of such nanocomposites either by conventional lithographic techniques or by emerging patterning tools, such as ink-jet printing and nanoimprint lithography, will be illustrated, pointing out their technological impact on developing new optoelectronic and sensing devices.
Short term load forecasting of anomalous load using hybrid soft computing methods
NASA Astrophysics Data System (ADS)
Rasyid, S. A.; Abdullah, A. G.; Mulyadi, Y.
2016-04-01
Load forecast accuracy has a direct impact on generation cost: more accurate forecasts make generation more economical. Electrical energy use by consumers on holidays tends not to follow the load pattern of a normal day; such a load is defined as an anomalous load. In this paper, a hybrid ANN-Particle Swarm method is proposed to improve the accuracy of forecasting the anomalous loads that often occur on holidays. The proposed methodology has been used to forecast the half-hourly electricity demand for power systems in the Indonesia National Electricity Market in the West Java region. Experiments were conducted by testing various learning rates and learning data inputs, and the performance of the methodology was validated against real data from the national electricity company. The results show that the proposed method is very effective for short-term load forecasting in the case of anomalous loads. Hybrid ANN-Particle Swarm is relatively simple and easy to use as an analysis tool by engineers.
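A compact sketch of the hybrid ANN-particle-swarm idea: particle swarm optimization searches the weight space of a tiny feedforward network fitted to a toy half-hourly load curve with a holiday-style dip. The network size, PSO constants, and load curve are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy half-hourly "daily load" curve with a holiday-style dip (hypothetical).
t = np.linspace(0, 1, 48)
load = 1.0 + 0.5 * np.sin(2 * np.pi * t) - 0.3 * (t > 0.6)

def predict(w, t):
    """Tiny 1-4-1 tanh network; w packs all 13 weights."""
    w1, b1, w2, b2 = w[:4], w[4:8], w[8:12], w[12]
    return np.tanh(np.outer(t, w1) + b1) @ w2 + b2

def mse(w):
    return float(np.mean((predict(w, t) - load) ** 2))

# Plain global-best PSO over the 13-dimensional weight vector.
n_particles, dim = 30, 13
pos = rng.normal(size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_err = np.array([mse(p) for p in pos])
gbest = pbest[pbest_err.argmin()].copy()
for _ in range(300):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    err = np.array([mse(p) for p in pos])
    better = err < pbest_err
    pbest[better], pbest_err[better] = pos[better], err[better]
    gbest = pbest[pbest_err.argmin()].copy()

print(round(mse(gbest), 4))
```

Replacing gradient training with PSO in this way is one common reading of "hybrid ANN-Particle Swarm"; the paper may combine the two differently.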
Formal Validation of Fault Management Design Solutions
NASA Technical Reports Server (NTRS)
Gibson, Corrina; Karban, Robert; Andolfato, Luigi; Day, John
2013-01-01
The work presented in this paper describes an approach used to develop SysML modeling patterns to express the behavior of fault protection, test the model's logic by performing fault injection simulations, and verify the fault protection system's logical design via model checking. A representative example, using a subset of the fault protection design for the Soil Moisture Active-Passive (SMAP) system, was modeled with SysML State Machines and JavaScript as Action Language. The SysML model captures interactions between relevant system components and system behavior abstractions (mode managers, error monitors, fault protection engine, and devices/switches). Development of a method to implement verifiable and lightweight executable fault protection models enables future missions to have access to larger fault test domains and verifiable design patterns. A tool-chain to transform the SysML model to jpf-Statechart compliant Java code and then verify the generated code via model checking was established. Conclusions and lessons learned from this work are also described, as well as potential avenues for further research and development.
Maasz, G; Takács, P; Boda, P; Varbiro, G; Pirger, Z
2017-12-01
Besides food quality control of fish or cephalopods, novel mass spectrometry (MS) approaches could be effective and beneficial methods for the investigation of biodiversity in ecological research. Our aims were to verify the applicability of MALDI-TOF MS for the rapid identification of closely related species, and to develop it further for sex determination in phenotypically similar fish, focusing on the low mass range. For MALDI-TOF MS spectra analysis, ClinProTools software was applied, and the observed classification was also confirmed by a Self Organizing Map (SOM). To verify the wide applicability of the method, brains from invertebrate and vertebrate species were used to detect species-related markers in two mayflies and eight fish, as well as sex-related markers within bleak. Seven Ephemera larva species-related markers and sixty-one fish species-related markers were observed, and nineteen sex-related markers were identified in bleak. Similar patterns were observed between individuals within one species. In contrast, there were markedly diverse patterns between the different species and sexes, visualized by SOMs. Two different Ephemera species and male or female fish were identified with 100% accuracy. The various fish species were classified into 8 species with a high level of accuracy (96.2%). Based on the MS data, a dendrogram of the different fish species was generated using ClinProTools software. This MS-based dendrogram shows relatively high correspondence with the phylogenetic relationships of both the studied species and orders. In summary, MALDI-TOF MS provides a cheap, reliable, sensitive and fast identification tool for closely related species, using mass spectra acquired in a low mass range to define specific molecular profiles. Moreover, we present evidence for the first time for the determination of sex within one fish species using this method. 
We conclude that it is a powerful tool that can revolutionize ecological and environmental research. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witte, D.R.
1978-11-01
It is believed that sferics, a word that stands for atmospheric electromagnetic radiation, can be correlated to the genesis of tornadoes and severe weather. Sferics are generated by lightning and other atmospheric disturbances that are not yet entirely understood. The recording and analysis of the patterns in which sferic events occur, it is hoped, will lead to accurate real-time prediction of tornadoes and other severe weather. Collection of this data becomes cumbersome when correlation between at least two stations is necessary for triangulation; however, the advent of microprocessors has made the task of data collection and massaging inexpensive and manageable.
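The two-station triangulation mentioned above reduces, on a flat plane, to intersecting two bearing rays. A sketch with hypothetical station positions and bearings (real sferics direction finding must also handle measurement error and the Earth's curvature):

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two bearing rays (degrees clockwise from north) measured
    from two stations on a flat plane: the classic two-station fix."""
    # Direction vectors: north = +y, east = +x.
    d1 = (math.sin(math.radians(bearing1)), math.cos(math.radians(bearing1)))
    d2 = (math.sin(math.radians(bearing2)), math.cos(math.radians(bearing2)))
    # Solve p1 + t*d1 = p2 + s*d2 for t via the 2x2 cross-product form.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Hypothetical stations 100 km apart, both bearing on the same sferic burst.
x, y = triangulate((0, 0), 45.0, (100, 0), 315.0)
print(round(x, 1), round(y, 1))  # → 50.0 50.0
```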
Social networks as embedded complex adaptive systems.
Benham-Hutchins, Marge; Clancy, Thomas R
2010-09-01
As systems evolve over time, their natural tendency is to become increasingly more complex. Studies in the field of complex systems have generated new perspectives on management in social organizations such as hospitals. Much of this research appears as a natural extension of the cross-disciplinary field of systems theory. This is the 15th in a series of articles applying complex systems science to the traditional management concepts of planning, organizing, directing, coordinating, and controlling. In this article, the authors discuss healthcare social networks as a hierarchy of embedded complex adaptive systems. The authors further examine the use of social network analysis tools as a means to understand complex communication patterns and reduce medical errors.
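A toy illustration of the social network analysis idea: degree centrality over a hypothetical patient-handoff communication network identifies the staff member sitting on the most communication channels (a candidate bottleneck or error-propagation point). The roles and links below are invented for illustration.

```python
# Hypothetical patient-handoff communication links among unit staff.
edges = [
    ("nurse_A", "nurse_B"), ("nurse_A", "physician_1"),
    ("nurse_B", "pharmacist"), ("physician_1", "pharmacist"),
    ("nurse_A", "pharmacist"), ("nurse_C", "nurse_B"),
    ("nurse_B", "physician_1"),
]

# Degree centrality: count of communication channels per person.
degree = {}
for a, b in edges:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1

hub = max(degree, key=degree.get)
print(hub, degree[hub])
```

Dedicated tools compute many richer measures (betweenness, clustering, embeddedness across hierarchy levels), but all start from this kind of edge list.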
Applications of diatoms as potential microalgae in nanobiotechnology.
Jamali, Ali Akbar; Akbari, Fariba; Ghorakhlu, Mohamad Moradi; de la Guardia, Miguel; Yari Khosroushahi, Ahmad
2012-01-01
Diatoms are single cell eukaryotic microalgae whose presence in nearly every water habitat makes them ideal tools for a wide range of applications such as oil exploration, forensic examination, environmental indication, biosilica pattern generation, toxicity testing and eutrophication of aqueous ecosystems. Essential information on diatoms is reviewed and discussed with respect to the impacts of diatoms on biosynthesis and bioremediation. In this review, we present the recent progress in this century on the application of diatoms in waste degradation, synthesis of biomaterials, biomineralization, and the evaluation of toxicity and toxic effects of mineral elements. Diatoms can be considered as metal toxicity bioindicators and they can be applied for biomineralization, synthesis of biomaterials, and degradation of wastes.
Opportunities for Fluid Dynamics Research in the Forensic Discipline of Bloodstain Pattern Analysis
NASA Astrophysics Data System (ADS)
Attinger, Daniel; Moore, Craig; Donaldson, Adam; Jafari, Arian; Stone, Howard
2013-11-01
This review [Forensic Science International, vol. 231, pp. 375-396, 2013] highlights research opportunities for fluid dynamics (FD) studies related to the forensic discipline of bloodstain pattern analysis (BPA). The need for better integrating FD and BPA is mentioned in a 2009 report by the US National Research Council, entitled ``Strengthening Forensic Science in the United States: A Path Forward''. BPA aims for practical answers to specific questions of the kind: ``How did a bloodletting incident happen?'' FD, on the other hand, aims to quantitatively describe the transport of fluids and the related causes, with general equations. BPA typically solves the indirect problem of inspecting stains in a crime scene to infer the most probable bloodletting incident that produced these patterns. FD typically defines the initial and boundary conditions of a fluid system and from there describe how the system evolves in time and space, most often in a deterministic manner. We review four topics in BPA with strong connections to FD: the generation of drops, their flight, their impact and the formation of stains. Future research on these topics would deliver new quantitative tools and methods for BPA, and present new multiphase flow problems for FD.
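One classic quantitative link between FD and BPA arising from drop impact is the stain-ellipse relation: a drop striking a surface at angle α leaves an approximately elliptical stain with sin α = width / length. A minimal sketch:

```python
import math

def impact_angle_deg(width, length):
    """Classic BPA relation: a drop impacting at angle alpha leaves an
    elliptical stain with sin(alpha) = width / length."""
    return math.degrees(math.asin(width / length))

# A stain twice as long as it is wide implies a ~30 degree impact angle.
print(round(impact_angle_deg(5.0, 10.0), 1))  # → 30.0
```

The fluid-dynamics research opportunities discussed in the review concern, among other things, when and why this simple geometric relation breaks down (spreading, splashing, surface wetting).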
Laser Processing of Multilayered Thermal Spray Coatings: Optimal Processing Parameters
NASA Astrophysics Data System (ADS)
Tewolde, Mahder; Zhang, Tao; Lee, Hwasoo; Sampath, Sanjay; Hwang, David; Longtin, Jon
2017-12-01
Laser processing offers an innovative approach for the fabrication and transformation of a wide range of materials. As a rapid, non-contact, and precision material removal technology, lasers are natural tools to process thermal spray coatings. Recently, a thermoelectric generator (TEG) was fabricated using thermal spray and laser processing. The TEG device represents a multilayer, multimaterial functional thermal spray structure, with laser processing serving an essential role in its fabrication. Several unique challenges are presented when processing such multilayer coatings, and the focus of this work is on the selection of laser processing parameters for optimal feature quality and device performance. A parametric study is carried out using three short-pulse lasers, where laser power, repetition rate and processing speed are varied to determine the laser parameters that result in high-quality features. The resulting laser patterns are characterized using optical and scanning electron microscopy, energy-dispersive x-ray spectroscopy, and electrical isolation tests between patterned regions. The underlying laser interaction and material removal mechanisms that affect the feature quality are discussed. Feature quality was found to improve both by using a multiscanning approach and an optional assist gas of air or nitrogen. Electrically isolated regions were also patterned in a cylindrical test specimen.
pyLIDEM: A Python-Based Tool to Delineate Coastal Watersheds Using LIDAR Data
NASA Astrophysics Data System (ADS)
O'Banion, R.; Alameddine, I.; Gronewold, A.; Reckhow, K.
2008-12-01
Accurately identifying the boundary of a watershed is one of the most fundamental and important steps in any hydrological assessment. Representative applications include defining a study area, predicting overland flow, estimating groundwater infiltration, modeling pollutant accumulation and wash-off rates, and evaluating effectiveness of pollutant mitigation measures. The United States Environmental Protection Agency (USEPA) Total Maximum Daily Load (TMDL) program, the most comprehensive water quality management program in the United States (US), is just one example of an application in which accurate and efficient watershed delineation tools play a critical role. For example, many impaired water bodies currently being addressed through the TMDL program drain small coastal watersheds with relatively flat terrain, making watershed delineation particularly challenging. Most of these TMDL studies use 30-meter digital elevation models (DEMs) that rarely capture all of the small elevation changes in coastal watersheds, leading to errors not only in watershed boundary delineation, but in subsequent model predictions (such as watershed runoff flow and pollutant deposition rate predictions) for which watershed attributes are key inputs. Manually delineating these low-relief coastal watersheds using expert knowledge of local water flow patterns often produces watershed boundaries that are as accurate as, and often more accurate than, those generated from the 30-meter DEMs. Yet manual delineation is a costly and time-consuming procedure and is therefore rarely chosen. There is therefore a growing need, particularly for the ongoing needs of the TMDL program (and similar environmental management programs), for software tools that can use high-resolution topography data to delineate coastal watersheds more accurately. 
Here, we address this need by developing pyLIDEM (python LIdar DEM), a Python-based tool which processes bare earth high-resolution Light Detection and Ranging (LIDAR) data, generates fine scale DEMs, and delineates watershed boundaries for a given pour point. Because LIDAR data are typically distributed in large sets of predefined tiles, our tool is capable of combining only the minimum number of bare earth LIDAR tiles required to delineate a watershed of interest. Our tool then processes the LIDAR data into Triangulated Irregular Networks, generates DEMs at user-specified cell sizes, and creates the required files needed to delineate watersheds within ArcGIS. To make pyLIDEM more accessible to the modeling community, we have bundled it within an ArcGIS toolbox, which also allows users to run it directly from an ArcGIS platform. We assess pyLIDEM functionality and accuracy by delineating several impaired small coastal watersheds in the Newport River Estuary in Eastern North Carolina using LIDAR data collected for the North Carolina Flood Mapping Program. We then compare the pyLIDEM-based watershed boundaries with those generated manually and with those generated using the 30-meter DEMs, and find that the pyLIDEM-based boundaries are more accurate than those from the 30-meter DEMs, and provide a significant time savings compared to manual delineation, particularly in cases where multiple watersheds need to be delineated for a single project.
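The core delineation step such tools automate can be illustrated with a minimal D8 (steepest-descent) watershed extraction on a toy DEM; the grid values, pour point, and single-flow-direction rule are simplifying assumptions relative to LIDAR-scale processing.

```python
# Toy DEM: a left basin draining to the pour point, a high ridge column,
# and a separate right basin (values and layout are hypothetical).
dem = [
    [9, 9, 9, 9, 5],
    [9, 5, 4, 9, 4],
    [9, 4, 3, 9, 3],
    [9, 3, 2, 9, 2],
    [9, 9, 1, 9, 1],
]
rows, cols = len(dem), len(dem[0])
neigh = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_downhill(r, c):
    """Steepest-descent (D8) neighbor of a cell, or None for a pit."""
    best, drop = None, 0
    for dr, dc in neigh:
        rr, cc = r + dr, c + dc
        if 0 <= rr < rows and 0 <= cc < cols:
            d = dem[r][c] - dem[rr][cc]
            if d > drop:
                best, drop = (rr, cc), d
    return best

flows_to = {(r, c): d8_downhill(r, c) for r in range(rows) for c in range(cols)}
pour = (4, 2)  # hypothetical pour point at the left basin's outlet

def drains_to_pour(cell):
    """Follow the D8 flow path until the pour point, a pit, or a loop."""
    seen = set()
    while cell is not None and cell not in seen:
        if cell == pour:
            return True
        seen.add(cell)
        cell = flows_to[cell]
    return False

# The watershed of the pour point is every cell whose path reaches it.
watershed = [c for c in flows_to if drains_to_pour(c)]
print(len(watershed))
```

Production tools add pit filling, flow-accumulation thresholds, and TIN-based DEM generation on top of this basic flow-routing idea.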
Connectivity Measures in EEG Microstructural Sleep Elements.
Sakellariou, Dimitris; Koupparis, Andreas M; Kokkinos, Vasileios; Koutroumanidis, Michalis; Kostopoulos, George K
2016-01-01
During Non-Rapid Eye Movement (NREM) sleep the brain is relatively disconnected from the environment, while connectedness between brain areas is also decreased. Evidence indicates that these dynamic connectivity changes are delivered by microstructural elements of sleep: short periods of environmental stimulus evaluation followed by sleep-promoting procedures. The connectivity patterns of the latter, among other aspects of sleep microstructure, have yet to be fully elucidated. We suggest here a methodology for assessing and investigating the connectivity patterns of EEG microstructural elements, such as sleep spindles. The methodology combines techniques at the preprocessing, estimation, error-assessment, and visualization levels to allow detailed examination of connectivity (levels and directionality of information flow) over frequency and time with notable resolution, while dealing with volume conduction and EEG reference assessment. The high temporal and frequency resolution of the methodology will allow association between the microelements and the dynamically forming networks that characterize them, and consequently may reveal aspects of the EEG microstructure. The proposed methodology is initially tested on artificially generated signals for proof of concept and subsequently applied to real EEG recordings via a custom-built MATLAB-based tool developed for such studies. Preliminary results from 843 fast sleep spindles recorded in whole-night sleep of 5 healthy volunteers indicate a prevailing pattern of interactions between centroparietal and frontal regions. We hereby demonstrate a first, to our knowledge, attempt to estimate the scalp EEG connectivity that characterizes fast sleep spindles via the "EEG-element connectivity" methodology we propose. Application of this methodology, via the computational tool we developed, suggests it can investigate the connectivity patterns related to the occurrence of EEG microstructural elements. Network characterization of specific physiological or pathological EEG microstructural elements can potentially be of great importance for understanding, identifying, and predicting health and disease.
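The abstract above describes estimating levels and directionality of information flow between scalp regions. As a minimal, hypothetical sketch of directionality estimation (not the authors' actual time-frequency methodology), one can find the lag at which the cross-correlation between two channels peaks; the channel names and the 5-sample delay below are illustrative assumptions:

```python
import numpy as np

def lead_lag(x, y):
    """Estimate the lag (in samples) at which y best matches x.

    A positive lag means x leads y, i.e. information appears to
    flow x -> y. Signals are mean-centered first.
    """
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    n = len(x)
    c = np.correlate(y, x, mode="full")   # c[k] ~ sum_t y[t+k] * x[t]
    lags = np.arange(-(n - 1), n)
    return lags[np.argmax(c)]

rng = np.random.default_rng(0)
frontal = rng.standard_normal(1000)
# Hypothetical centroparietal channel: echoes frontal 5 samples later.
centroparietal = np.roll(frontal, 5)
print(lead_lag(frontal, centroparietal))   # 5: frontal leads
```

Real spindle connectivity measures must additionally handle volume conduction and the EEG reference, which this toy lag estimate ignores.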
Bristow, Tony; Constantine, Jill; Harrison, Mark; Cavoit, Fabien
2008-04-01
Orthogonal-acceleration quadrupole time-of-flight (oa-QTOF) mass spectrometers, employed for accurate mass measurement, have been commercially available for well over a decade. A limitation of the early instruments of this type was the narrow ion abundance range over which accurate mass measurements could be made with a high degree of certainty. Recently, a new generation of oa-QTOF mass spectrometers has been developed, and these allow accurate mass measurements to be recorded over a much greater range of ion abundances. This development has resulted from new ion detection technology and improved electronic stability, or from accurate control of the number of ions reaching the detector. In this report we describe the results of experiments performed to evaluate the mass measurement performance of the Bruker micrOTOF-Q, a member of the new generation of oa-QTOFs. The relationship between mass accuracy and ion abundance was extensively evaluated, and mass measurement accuracy remained stable (±1.5 milli-m/z units) over approximately 3-4 orders of magnitude of ion abundance. The second feature of the Bruker micrOTOF-Q evaluated was the SigmaFit function of the software. This isotope pattern-matching algorithm provides an exact numerical comparison of the theoretical and measured isotope patterns as an identification tool complementary to accurate mass measurement: the smaller the value, the closer the match between theoretical and measured isotope patterns. This information is then employed to reduce the number of potential elemental formulae produced from the mass measurements. A relationship between the SigmaFit value and ion abundance has been established. The results of the study, for both mass accuracy and SigmaFit, were employed to define the performance criteria for the micrOTOF-Q, providing increased confidence in the selection of elemental formulae resulting from accurate mass measurements.
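SigmaFit itself is proprietary and its formula is not given in the abstract; the following is a generic, hypothetical analogue of an isotope pattern-matching score, in which 0 is a perfect match and larger values indicate a worse match:

```python
import math

def isotope_match_score(measured, theoretical):
    """Root-sum-square deviation between two isotope patterns.

    Both patterns are lists of relative intensities at the same m/z
    positions; each is normalized to unit total intensity first, so
    0.0 is a perfect match and larger values mean a worse match.
    """
    m_total, t_total = sum(measured), sum(theoretical)
    return math.sqrt(sum((m / m_total - t / t_total) ** 2
                         for m, t in zip(measured, theoretical)))

# Illustrative pattern for a 1-chlorine compound (M, M+1, M+2 intensities)
theory = [100.0, 8.0, 32.0]
good = [98.0, 8.5, 31.0]   # hypothetical clean measurement
bad = [100.0, 8.0, 2.0]    # hypothetical pattern missing the Cl isotope

print(isotope_match_score(good, theory) < isotope_match_score(bad, theory))  # True
```

Such a score can be combined with the mass-accuracy window to filter candidate elemental formulae, mirroring the workflow the abstract describes.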
The mission events graphic generator software: A small tool with big results
NASA Technical Reports Server (NTRS)
Lupisella, Mark; Leibee, Jack; Scaffidi, Charles
1993-01-01
Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal computer-based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software, which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.
Stable Epigenetic Variants Selected from an Induced Hypomethylated Fragaria vesca Population.
Xu, Jihua; Tanino, Karen K; Robinson, Stephen J
2016-01-01
Epigenetic inheritance was transmitted through selection over five generations of extreme early, but not late, flowering time phenotypic lines in Fragaria vesca. Epigenetic variation was initially induced artificially using the DNA demethylation reagent 5-azacytidine (5-azaC). This is the first report to explore epigenetic variant selection and phenotypic trait inheritance in strawberry. Transmission frequency of these traits was determined across generations. The early flowering (EF4) and late stolon (LS) phenotypic traits were successfully transmitted through meiosis across five and three generations, respectively. Stable mitotic transmission of the early flowering phenotype was also demonstrated using clonal daughters derived from the 4th-generation (S4) mother plant. To further explore the DNA methylation patterns underlying the early flowering trait, the standard MSAP method using the isoschizomers Hpa II/Msp I and a newly modified MSAP method using the isoschizomers Tfi I/Pfe I, which together detect DNA methylation at CG, CHG, and CHH sites, were applied to two early flowering lines, EF line 1 (P2) and EF line 2 (P3), and a control line (P1). A significant reduction in the number of fully methylated bands was detected in P2 and P3 compared with P1 using the novel MSAP method. In the standard MSAP, symmetric CG and CHG methylation was maintained over generations in the early flowering lines, based on the clustering of P2 and P3, whereas the novel MSAP approach revealed that the asymmetric CHH methylation pattern was not maintained over generations. This study provides evidence of stable selection of phenotypic traits, particularly early flowering, through both meiosis and mitosis, which is meaningful to both breeding programs and commercial horticulture. The maintenance of CG and CHG methylation over generations suggests the early flowering phenotype might be related to DNA methylation alterations at CG or CHG sites. Finally, this work provides a new approach for studying the role of epigenetics in complex quantitative trait improvement in strawberry, as well as a tool to expand phenotypic diversity and expedite potential new horticultural cultivar releases through either seed or vegetative propagation.
Morgan, Kevin T; Pino, Michael; Crosby, Lynn M; Wang, Min; Elston, Timothy C; Jayyosi, Zaid; Bonnefoi, Marc; Boorman, Gary
2004-01-01
Toxicogenomics is an emerging multidisciplinary science that will profoundly impact the practice of toxicology. New generations of biologists, using evolving toxicogenomics tools, will generate massive data sets in need of interpretation. Mathematical tools are necessary to cluster and otherwise find meaningful structure in such data. The linking of this structure to gene functions and disease processes, and finally the generation of useful data interpretation remains a significant challenge. The training and background of pathologists make them ideally suited to contribute to the field of toxicogenomics, from experimental design to data interpretation. Toxicologic pathology, a discipline based on pattern recognition, requires familiarity with the dynamics of disease processes and interactions between organs, tissues, and cell populations. Optimal involvement of toxicologic pathologists in toxicogenomics requires that they communicate effectively with the many other scientists critical for the effective application of this complex discipline to societal problems. As noted by Petricoin III et al (Nature Genetics 32, 474-479, 2002), cooperation among regulators, sponsors and experts will be essential for realizing the potential of microarrays for public health. Following a brief introduction to the role of mathematics in toxicogenomics, "data interpretation" from the perspective of a pathologist is briefly discussed. Based on oscillatory behavior in the liver, the importance of an understanding of mathematics is addressed, and an approach to learning mathematics "later in life" is provided. An understanding of pathology by mathematicians involved in toxicogenomics is equally critical, as both mathematics and pathology are essential for transforming toxicogenomics data sets into useful knowledge.
Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka
2016-05-05
A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health.
Frick, Melissa A; Vachani, Carolyn C; Bach, Christina; Hampshire, Margaret K; Arnold-Korzeniowski, Karen; Metz, James M; Hill-Kayser, Christine E
2017-11-01
The survivorship needs of patients living with chronic cancer (CC) and their use of survivorship care plans (SCPs) have been overlooked and underappreciated. A convenience sample of 39,088 SCPs completed for cancer survivors with an Internet-based SCP tool was examined; it included 5847 CC survivors (15%; CC was defined as chronic leukemia and/or recurrent/metastatic cancer of another nature). Patient-reported treatment effects and follow-up care patterns were compared between CC survivors and survivors treated with curative intent (CI). Responses from a follow-up survey regarding SCP satisfaction and use were reviewed. CC survivors had greater odds of experiencing multiple treatment-related effects than survivors treated with CI; these effects included fatigue, cognitive changes, dyspnea, peripheral neuropathy, lymphedema, and erectile dysfunction. Nearly half of CC survivors were managed by an oncologist alone, and they were less likely than CI patients to be comanaged by a primary care provider and an oncologist. Fewer SCPs were generated by health care providers (HCPs) for CC survivors versus CI survivors. A smaller proportion of CC users versus CI users rated their experience and satisfaction with the SCP tool as very good or excellent, and CC users were less likely to share the HCP summary with their health care team. A substantial number of CC survivors, often considered incurable but treatable, seek survivorship support. Tools to facilitate participation, communication, and coordination of care are valuable for these patients, and future iterations of SCPs should be designed to address the particular circumstances of living with CC. Cancer 2017;123:4268-4276. © 2017 American Cancer Society.
OPC model data collection for 45-nm technology node using automatic CD-SEM offline recipe creation
NASA Astrophysics Data System (ADS)
Fischer, Daniel; Talbi, Mohamed; Wei, Alex; Menadeva, Ovadya; Cornell, Roger
2007-03-01
Optical and process correction at the 45nm node requires an ever higher level of characterization. The greater complexity drives a need for automation of the metrology process, allowing more efficient, accurate, and effective use of engineering resources and metrology tool time in the fab and helping to satisfy what seems an insatiable appetite for data by lithographers and modelers charged with developing 45nm and 32nm processes. The scope of the work referenced here is a 45nm design cycle "full-loop automation", starting with a GDS-formatted target design layout and ending with the necessary feedback of one- and two-dimensional printed wafer metrology. In this paper the authors consider the key elements of software, algorithmic framework, and Critical Dimension Scanning Electron Microscope (CD-SEM) functionality necessary to automate recipe creation. We evaluate specific problems with the methodology of the former art, "on-tool on-wafer" recipe construction, and discuss how the implementation of design-based recipe generation improves the overall metrology process. Individual target-by-target construction, a one-size-fits-all pattern recognition template, blind navigation to the desired measurement feature, lengthy on-tool recipe-construction sessions, and a limited ability to determine measurement quality in the resultant data set are each discussed with respect to how the state-of-the-art Design Based Metrology (DBM) approach addresses them. The offline-created recipes have shown pattern recognition success rates of up to 100% and measurement success rates of up to 93% for line/space as well as 2D minimum/maximum measurements, without manual assists during measurement.
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, even those sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of a growing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Patterning roadmap: 2017 prospects
NASA Astrophysics Data System (ADS)
Neisser, Mark
2017-06-01
Road mapping of semiconductor chips has been underway for over 20 years, first with the International Technology Roadmap for Semiconductors (ITRS) and now with the International Roadmap for Devices and Systems (IRDS). The original roadmap was mostly driven bottom-up and was developed to ensure that the large numbers of semiconductor producers and suppliers had good information on which to base their research and development. The current roadmap is generated more top-down: the customers of semiconductor chips anticipate what will be needed in the future, and the roadmap projects what will be needed to fulfill that demand. The More Moore section of the roadmap projects that advanced logic, rather than memory chips, will drive higher-resolution patterning. Potential solutions for patterning future logic nodes can be derived as extensions of 'next-generation' patterning technologies currently under development. Advanced patterning has made great progress, and two 'next-generation' patterning technologies, EUV and nanoimprint lithography, have the potential to be in production as early as 2018. The potential adoption of two different next-generation patterning technologies suggests that patterning technology is becoming more specialized. This is good for the industry in that it lowers overall costs, but it may lead to slower progress in extending any one patterning technology in the future.
An experimental method for the assessment of color simulation tools.
Lillo, Julio; Alvaro, Leticia; Moreira, Humberto
2014-07-22
The Simulcheck method for evaluating the accuracy of color simulation tools in relation to dichromats is described and used to test three color simulation tools: Variantor, Coblis, and Vischeck. A total of 10 dichromats (five protanopes, five deuteranopes) and 10 normal trichromats participated in the current study. Simulcheck includes two psychophysical tasks: the Pseudoachromatic Stimuli Identification task and the Minimum Achromatic Contrast task. The Pseudoachromatic Stimuli Identification task allows determination of the two chromatic angles (h(uv) values) that generate a minimum response in the yellow-blue opponent mechanism and, consequently, pseudoachromatic stimuli (greens or reds). The Minimum Achromatic Contrast task requires the selection of the gray background that produces minimum contrast (near zero change in the achromatic mechanism) for each pseudoachromatic stimulus selected in the previous task (L(R) values). Results showed important differences in the colorimetric transformations performed by the three evaluated simulation tools and in their accuracy levels. Vischeck accurately implemented the algorithm of Brettel, Viénot, and Mollon (1997). Only Vischeck appeared accurate (similarity in h(uv) and L(R) values between real and simulated dichromats) and, consequently, could render reliable color selections. It is concluded that Simulcheck is a consistent method because it provided an equivalent pattern of results for h(uv) and L(R) values irrespective of the stimulus set used to evaluate a simulation tool. Simulcheck was also considered valid because real dichromats provided the expected h(uv) and L(R) values when performing the two psychophysical tasks included in this method. © 2014 ARVO.
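The Minimum Achromatic Contrast task is psychophysical (observers choose the gray), but the selection criterion it embodies can be sketched computationally; the luminance values below are illustrative assumptions, not data from the study:

```python
def weber_contrast(l_stim, l_bg):
    """Weber contrast of a stimulus against its background luminance."""
    return abs(l_stim - l_bg) / l_bg

def min_contrast_gray(l_stim, gray_luminances):
    """Pick the candidate gray background giving minimum achromatic contrast."""
    return min(gray_luminances, key=lambda l_bg: weber_contrast(l_stim, l_bg))

# Hypothetical pseudoachromatic stimulus luminance and candidate grays (cd/m^2)
grays = [10.0, 20.0, 30.0, 40.0, 50.0]
print(min_contrast_gray(31.0, grays))   # 30.0
```

The gray selected this way plays the role of the L(R) value: for an accurate simulation tool, real and simulated dichromats should converge on similar minima.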
Combining the Bourne-Shell, sed and awk in the UNIX Environment for Language Analysis.
ERIC Educational Resources Information Center
Schmitt, Lothar M.; Christianson, Kiel T.
This document describes how to construct tools for language analysis in research and teaching using the Bourne-shell, sed, and awk, three search tools, in the UNIX operating system. Applications include: searches for words, phrases, grammatical patterns, and phonemic patterns in text; statistical analysis of text in regard to such searches,…
ERIC Educational Resources Information Center
Kuna, Aruna Sai
2012-01-01
The study identified the association between student interaction patterns and academic performance in online graduate courses delivered by the Department of Agricultural Education and Studies at Iowa State University. In addition, the study investigated which online course tools were perceived by students to be most useful in learning. The study…
Real cell overlay measurement through design based metrology
NASA Astrophysics Data System (ADS)
Yoo, Gyun; Kim, Jungchan; Park, Chanha; Lee, Taehyeong; Ji, Sunkeun; Jo, Gyoyeon; Yang, Hyunjo; Yim, Donggyu; Yamamoto, Masahiro; Maruyama, Kotaro; Park, Byungjun
2014-04-01
Until recent device nodes, lithography has been struggling to improve its resolution limit. Even though next-generation lithography technology now faces various difficulties, several innovative resolution enhancement technologies, based on the 193nm wavelength, were introduced and implemented to keep up the trend of device scaling. Scanner makers keep developing state-of-the-art exposure systems that guarantee higher productivity and meet ever more aggressive overlay specifications. "The scaling reduction of the overlay error has been a simple matter of the capability of exposure tools. However, it is clear that the scanner contributions may no longer be the majority component in total overlay performance. The ability to control correctable overlay components is paramount to achieve the desired performance."(2) In a manufacturing fab, the overlay error determined by a conventional overlay measurement, using an overlay mark based on IBO or DBO, often does not represent the physical placement error in the cell area of a memory device. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern. Pattern distortion caused by etching or CMP can also be a source of the mismatch. Therefore, the requirement for a direct overlay measurement in the cell pattern gradually increases in the manufacturing field, and also at the development level. In order to overcome the mismatch between conventional overlay measurement and the real layer-to-layer placement error in the cell area of a memory device, we suggest an alternative overlay measurement method utilizing a design-based metrology tool. A basic concept of this method is shown in figure 1. A CD-SEM measurement of the overlay error between layers 1 and 2 could be the ideal method, but it takes too long to extract a large amount of data at wafer level. An E-beam-based DBM tool provides high speed to cover the whole wafer with high repeatability, enabled by using the design as a reference for overlay measurement and a high-speed scan system. In this paper, we demonstrate that direct overlay measurement in the cell area can identify the mismatch exactly, instead of using an overlay mark. This experiment was carried out for several critical layers in DRAM and Flash memory, using the DBM (Design Based Metrology) tool NGR2170™.
Genetic landscapes GIS Toolbox: tools to map patterns of genetic divergence and diversity.
Vandergast, Amy G.; Perry, William M.; Lugo, Roberto V.; Hathaway, Stacie A.
2011-01-01
The Landscape Genetics GIS Toolbox contains tools that run in the Geographic Information System software, ArcGIS, to map genetic landscapes and to summarize multiple genetic landscapes as average and variance surfaces. These tools can be used to visualize the distribution of genetic diversity across geographic space and to study associations between patterns of genetic diversity and geographic features or other geo-referenced environmental data sets. Together, these tools create genetic landscape surfaces directly from tables containing genetic distance or diversity data and sample location coordinates, greatly reducing the complexity of building and analyzing these raster surfaces in a Geographic Information System.
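The toolbox builds genetic landscape surfaces from tables of genetic distances and sample coordinates inside ArcGIS. A common construction for such surfaces (not necessarily the one the toolbox implements) places each pairwise genetic distance at the pair's geographic midpoint and interpolates onto a raster; here is a minimal sketch assuming inverse distance weighting:

```python
import numpy as np

def genetic_landscape(coords, dists, grid_x, grid_y, power=2.0):
    """Interpolate pairwise genetic distances onto a raster surface.

    Each pair of sample sites contributes its genetic distance at the
    pair's geographic midpoint; grid cells are then filled by inverse
    distance weighting over those midpoints.
    """
    coords = np.asarray(coords, float)
    n = len(coords)
    mids, vals = [], []
    for i in range(n):
        for j in range(i + 1, n):
            mids.append((coords[i] + coords[j]) / 2.0)
            vals.append(dists[i][j])
    mids, vals = np.array(mids), np.array(vals)
    surface = np.zeros((len(grid_y), len(grid_x)))
    for r, gy in enumerate(grid_y):
        for c, gx in enumerate(grid_x):
            d = np.hypot(mids[:, 0] - gx, mids[:, 1] - gy)
            if d.min() < 1e-9:          # cell sits exactly on a midpoint
                surface[r, c] = vals[d.argmin()]
            else:
                w = 1.0 / d ** power
                surface[r, c] = (w * vals).sum() / w.sum()
    return surface
```

High ridges on such a surface flag geographic zones of elevated genetic divergence, which can then be overlaid on environmental layers in a GIS.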
LandEx - Fast, FOSS-Based Application for Query and Retrieval of Land Cover Patterns
NASA Astrophysics Data System (ADS)
Netzel, P.; Stepinski, T.
2012-12-01
The amount of satellite-based spatial data is continuously increasing, making the development of efficient data search tools a priority. The bulk of existing research on searching satellite-gathered data concentrates on images and is based on the concept of Content-Based Image Retrieval (CBIR); however, available solutions are not efficient and robust enough to be put to use as deployable web-based search tools. Here we report on the development of a practical, deployable tool that searches classified images rather than raw imagery. LandEx (Landscape Explorer) is a GeoWeb-based tool for Content-Based Pattern Retrieval (CBPR) within the National Land Cover Dataset 2006 (NLCD2006). The USGS-developed NLCD2006 is derived from Landsat multispectral images; it covers the entire conterminous U.S. at a resolution of 30 meters/pixel and depicts 16 land cover classes. The size of NLCD2006 is about 10 gigapixels (161,000 x 100,000 pixels). LandEx is a multi-tier GeoWeb application based on open source software. Its main components are GeoExt/OpenLayers (user interface), GeoServer (OGC WMS, WCS, and WPS server), and GRASS (calculation engine). LandEx performs search using a query-by-example approach: the user selects a reference scene (exhibiting a chosen pattern of land cover classes), and the tool produces, in real time, a map indicating the degree of similarity between the reference pattern and all local patterns across the U.S. A scene's pattern is encapsulated by a 2D histogram of classes and sizes of single-class clumps, and pattern similarity is based on the notion of mutual information. The resulting similarity map can be viewed and navigated in a web browser, or downloaded as a GeoTIFF file for more in-depth analysis. LandEx is available at http://sil.uc.edu
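A scene signature of the kind described, a 2D histogram over classes and single-class clump sizes, can be sketched as follows. Note this is a simplified stand-in: LandEx's similarity is based on mutual information, whereas this sketch uses plain histogram intersection, and the area weighting and bin edges are illustrative assumptions:

```python
import numpy as np
from collections import deque

def clump_sizes(raster):
    """(class, size) of each 4-connected single-class clump in a class raster."""
    seen = np.zeros(raster.shape, bool)
    sizes = []
    for start in zip(*np.where(~seen)):
        if seen[start]:
            continue
        cls, size, q = raster[start], 0, deque([start])
        seen[start] = True
        while q:                      # breadth-first flood fill
            r, c = q.popleft()
            size += 1
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < raster.shape[0] and 0 <= nc < raster.shape[1]
                        and not seen[nr, nc] and raster[nr, nc] == cls):
                    seen[nr, nc] = True
                    q.append((nr, nc))
        sizes.append((int(cls), size))
    return sizes

def signature(raster, n_classes, size_bins):
    """Area-weighted 2D histogram over (class, clump-size bin), summing to 1."""
    hist = np.zeros((n_classes, len(size_bins) + 1))
    for cls, size in clump_sizes(raster):
        hist[cls, np.searchsorted(size_bins, size)] += size
    return hist / hist.sum()

def similarity(sig_a, sig_b):
    """Histogram intersection: 1.0 for identical signatures, 0.0 for disjoint."""
    return float(np.minimum(sig_a, sig_b).sum())
```

Sliding this signature over the full raster and comparing against the reference scene's signature yields a similarity map of the kind LandEx serves.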
Iterating between Tools to Create and Edit Visualizations.
Bigelow, Alex; Drucker, Steven; Fisher, Danyel; Meyer, Miriah
2017-01-01
A common workflow for visualization designers begins with a generative tool, like D3 or Processing, to create the initial visualization, and proceeds to a drawing tool, like Adobe Illustrator or Inkscape, for editing and cleaning. Unfortunately, this is typically a one-way process: once a visualization is exported from the generative tool into a drawing tool, it is difficult to make further, data-driven changes. In this paper, we propose a bridge model to allow designers to bring their work back from the drawing tool to re-edit in the generative tool. Our key insight is to recast this iteration challenge as a merge problem, similar to when two people are editing a document and the changes between them need to be reconciled. We also present a specific instantiation of this model, a tool called Hanpuku, which bridges between D3 scripts and Illustrator. We show several examples of visualizations that are iteratively created using Hanpuku in order to illustrate the flexibility of the approach. We further describe several hypothetical tools that bridge between other visualization tools to emphasize the generality of the model.
Direct generation of abruptly focusing vortex beams using a 3/2 radial phase-only pattern.
Davis, Jeffrey A; Cottrell, Don M; Zinn, Jonathan M
2013-03-20
Abruptly focusing Airy beams have previously been generated using a radial cubic phase pattern that represents the Fourier transform of the Airy beam. The Fourier transform of this pattern is formed using a system length of 2f, where f is the focal length of the Fourier transform lens. In this work, we directly generate these abruptly focusing Airy beams using a 3/2 radial phase pattern encoded onto a liquid crystal display. The resulting optical system is much shorter. In addition, we can easily produce vortex patterns at the focal point of these beams. Experimental results match theoretical predictions.
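A phase-only mask of the kind described, combining a 3/2-power radial term with an azimuthal vortex term, might be generated as below; the normalized coordinates, scale factor `a`, and topological charge are illustrative assumptions, not values from the paper:

```python
import numpy as np

def radial_phase_mask(n=512, a=50.0, charge=3):
    """Phase-only mask phi = a * rho**1.5 + charge * theta (mod 2*pi).

    The r**(3/2) radial term produces an abruptly autofocusing beam
    directly (no 2f Fourier-transform arrangement needed), and the
    azimuthal term charge * theta embeds an optical vortex at the focus.
    """
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]   # normalized aperture coords
    rho = np.hypot(x, y)
    theta = np.arctan2(y, x)
    return np.mod(a * rho ** 1.5 + charge * theta, 2 * np.pi)
```

In practice such an array would be quantized to the gray levels of the liquid crystal display used to encode the phase.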
Automated Defect and Correlation Length Analysis of Block Copolymer Thin Film Nanopatterns
Murphy, Jeffrey N.; Harris, Kenneth D.; Buriak, Jillian M.
2015-01-01
Line patterns produced by lamellae- and cylinder-forming block copolymer (BCP) thin films are of widespread interest for their potential to enable nanoscale patterning over large areas. In order for such patterning methods to effectively integrate with current technologies, the resulting patterns need to have low defect densities, and be produced in a short timescale. To understand whether a given polymer or annealing method might potentially meet such challenges, it is necessary to examine the evolution of defects. Unfortunately, few tools are readily available to researchers, particularly those engaged in the synthesis and design of new polymeric systems with the potential for patterning, to measure defects in such line patterns. To this end, we present an image analysis tool, which we have developed and made available, to measure the characteristics of such patterns in an automated fashion. Additionally we apply the tool to six cylinder-forming polystyrene-block-poly(2-vinylpyridine) polymers thermally annealed to explore the relationship between the size of each polymer and measured characteristics including line period, line-width, defect density, line-edge roughness (LER), line-width roughness (LWR), and correlation length. Finally, we explore the line-edge roughness, line-width roughness, defect density, and correlation length as a function of the image area sampled to determine each in a more rigorous fashion. PMID:26207990
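Several of the measured characteristics, line-edge roughness (LER), line-width roughness (LWR), and mean line width, can be computed from detected edge positions. The sketch below uses the common 3-sigma convention, which may differ in detail from the definitions used in the authors' tool:

```python
import numpy as np

def line_metrics(left_edges, right_edges):
    """LER and LWR (as 3-sigma values) from detected edge positions.

    left_edges / right_edges: per-scanline positions (e.g. in nm) of a
    line's two edges. LER is reported per edge; LWR comes from widths,
    so perfectly correlated edge wiggles give LER > 0 but LWR = 0.
    """
    left = np.asarray(left_edges, float)
    right = np.asarray(right_edges, float)
    widths = right - left
    return {"LER_left": 3.0 * left.std(),
            "LER_right": 3.0 * right.std(),
            "LWR": 3.0 * widths.std(),
            "mean_width": float(widths.mean())}
```

Edge positions themselves would come from thresholding intensity profiles in the SEM image, a step omitted here.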
Range pattern matching with layer operations and continuous refinements
NASA Astrophysics Data System (ADS)
Tseng, I.-Lun; Lee, Zhao Chuan; Li, Yongfu; Perez, Valerio; Tripathi, Vikas; Ong, Jonathan Yoong Seang
2018-03-01
At advanced and mainstream process nodes (e.g., 7nm, 14nm, 22nm, and 55nm process nodes), lithography hotspots can exist in layouts of integrated circuits even if the layouts pass design rule checking (DRC). Existence of lithography hotspots in a layout can cause manufacturability issues, which can result in yield losses of manufactured integrated circuits. In order to detect lithography hotspots existing in physical layouts, pattern matching (PM) algorithms and commercial PM tools have been developed. However, there is still a need to use DRC tools to perform PM operations. In this paper, we propose a PM synthesis methodology, which uses a continuous refinement technique, for the automatic synthesis of a given lithography hotspot pattern into a DRC deck, which consists of layer operation commands, so that an equivalent PM operation can be performed by executing the synthesized deck with the use of a DRC tool. Note that the proposed methodology can deal with not only exact patterns, but also range patterns. Also, lithography hotspot patterns containing multiple layers can be processed. Experimental results show that the proposed methodology can accurately and efficiently detect lithography hotspots in physical layouts.
Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools
NASA Astrophysics Data System (ADS)
Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.
2011-12-01
Sophisticated 3D models of particle acceleration and transport in solar flares, both currently available and forthcoming, require a new level of user-friendly visualization and analysis tools that allow quick and easy adjustment of model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of the art of these tools, which are in development and have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. The various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating at user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/non-thermal particle distribution models. By default, the application integrates IDL-callable DLLs and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows these default libraries to be interchanged with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments.
We discuss further anticipated developments of the tools needed to accommodate temporal evolution of the magnetic field structure and/or fast electron population implied by the electron acceleration and transport. This work was supported in part by NSF grants AGS-0961867, AST-0908344, and NASA grants NNX10AF27G and NNX11AB49G to New Jersey Institute of Technology, by a UK STFC rolling grant, STFC/PPARC Advanced Fellowship, and the Leverhulme Trust, UK. Financial support by the European Commission through the SOLAIRE and HESPE Networks is gratefully acknowledged.
Test pattern generation for ILA sequential circuits
NASA Technical Reports Server (NTRS)
Feng, Yu; Frenzel, James F.; Maki, Gary K.
1993-01-01
An efficient method of generating test patterns for sequential machines implemented using one-dimensional, unilateral, iterative logic arrays (ILA's) of BTS pass transistor networks is presented. Based on a transistor level fault model, the method affords a unique opportunity for real-time fault detection with improved fault coverage. The resulting test sets are shown to be equivalent to those obtained using conventional gate level models, thus eliminating the need for additional test patterns. The proposed method advances the simplicity and ease of the test pattern generation for a special class of sequential circuitry.
Duchateau, Nicolas; Butakoff, Constantine; Andreu, David; Fernández-Armenta, Juan; Bijnens, Bart; Berruezo, Antonio; Sitges, Marta; Camara, Oscar
2017-01-01
Electro-anatomical maps (EAMs) are commonly acquired in clinical routine for guiding ablation therapies. They provide voltage and activation time information on a 3-D anatomical mesh representation, making them useful for analyzing the electrical activation patterns in specific pathologies. However, the variability between the different acquisitions and anatomies hampers the comparison between different maps. This paper presents two contributions for the analysis of electrical patterns in EAM data from biventricular surfaces of cardiac chambers. The first contribution is an integrated automatic 2-D disk representation (2-D bull's eye plot) of the left ventricle (LV) and right ventricle (RV) obtained with a quasi-conformal mapping from the 3-D EAM meshes, which allows analysis of cardiac resynchronization therapy (CRT) lead positioning and interpretation of global (total activation time) and local indices (local activation time (LAT), surrogates of conduction velocity, inter-ventricular, and transmural delays) that characterize changes in the electrical activation pattern. The second contribution is a set of indices derived from the electrical activation: speed maps, computed from LAT values, to study the electrical wave propagation, and histograms of isochrones to analyze regional electrical heterogeneities in the ventricles. We have applied the proposed methods to look for the underlying physiological mechanisms of left bundle branch block (LBBB) and CRT, with the goal of optimizing the therapy by improving CRT response. To better illustrate the benefits of the proposed tools, we created a set of synthetically generated and fully controlled activation patterns, where the proposed representation and indices were validated. Then, we used the proposed analysis tools to analyze EAM data from an experimental swine model of induced LBBB with an implanted CRT device.
We analyzed and compared the electrical activation patterns at baseline, LBBB, and CRT stages in four animals: two without any structural disease and two with an induced infarction. By relating the CRT lead location with electrical dyssynchrony, we evaluated current hypotheses about lead placement in CRT and showed that optimal pacing sites should target the RV lead close to the apex and the LV one distant from it. PMID:29164019
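The speed maps mentioned above invert the spatial gradient of local activation time. A toy version on a regular 2-D grid (the real method works on a 3-D EAM mesh; the grid, spacing, and units here are assumptions) might look like:

```python
def speed_map(lat, dx=1.0):
    """Toy conduction-speed surrogate on a regular 2-D grid of local activation
    times (ms): speed = 1 / |grad(LAT)|, via central differences. Grid spacing
    dx (mm) is an assumed parameter; boundary cells are left as None."""
    ny, nx = len(lat), len(lat[0])
    out = [[None] * nx for _ in range(ny)]
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            gy = (lat[i + 1][j] - lat[i - 1][j]) / (2 * dx)
            gx = (lat[i][j + 1] - lat[i][j - 1]) / (2 * dx)
            g = (gx * gx + gy * gy) ** 0.5
            out[i][j] = 1.0 / g if g > 1e-9 else float("inf")
    return out

# A planar wave advancing 1 mm every 2 ms gives 0.5 mm/ms everywhere.
lat = [[2.0 * j for j in range(5)] for _ in range(5)]
speeds = speed_map(lat)
```

Histograms of isochrones then bin the same LAT values by time interval to expose regional heterogeneity.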
Neu, Thomas R; Kuhlicke, Ute
2017-02-10
Microbial biofilm systems are defined as interface-associated microorganisms embedded into a self-produced matrix. The extracellular matrix represents a continuous challenge in terms of characterization and analysis. The tools applied in more detailed studies comprise extraction/chemical analysis, molecular characterization, and visualisation using various techniques. Imaging by laser microscopy became a standard tool for biofilm analysis, and, in combination with fluorescently labelled lectins, the glycoconjugates of the matrix can be assessed. By employing this approach, a wide range of pure-culture biofilms from different habitats were examined using the commercially available lectins. From the results, a binary barcode pattern of lectin binding can be generated. Furthermore, the results can be fine-tuned and transferred into a heat map according to signal intensity. The lectin barcode approach is suggested as a useful tool for investigating the biofilm matrix characteristics and dynamics at various levels, e.g. bacterial cell surfaces, adhesive footprints, individual microcolonies, and the gross biofilm or bio-aggregate. Hence, fluorescence lectin bar-coding (FLBC) serves as a basis for a subsequent tailor-made fluorescence lectin-binding analysis (FLBA) of a particular biofilm. So far, the lectin approach represents the only tool for in situ characterization of the glycoconjugate makeup in biofilm systems. Furthermore, lectin staining lends itself to other fluorescence techniques in order to correlate it with cellular biofilm constituents in general and glycoconjugate producers in particular.
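The binary barcode and the intensity heat map described above are both simple reductions of a lectin-vs-signal table. A sketch under assumed thresholds (the cutoff and bin edges below are illustrative, not values from the study):

```python
def lectin_barcode(intensities, threshold=0.1, bins=(0.1, 0.4, 0.7)):
    """Sketch of fluorescence lectin bar-coding: a binary bind/no-bind pattern
    plus a coarse intensity class per lectin. `threshold` and `bins` are
    illustrative assumptions."""
    barcode = {lec: int(v >= threshold) for lec, v in intensities.items()}

    def klass(v):
        # number of bin edges exceeded -> heat-map class 0..3
        return sum(v >= edge for edge in bins)

    heat = {lec: klass(v) for lec, v in intensities.items()}
    return barcode, heat

# Normalized signal intensities for three common lectins (hypothetical data).
barcode, heat = lectin_barcode({"ConA": 0.85, "WGA": 0.05, "PNA": 0.45})
```

Comparing such barcodes across strains or habitats is what makes the approach a screening tool before a tailored FLBA.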
Self-Assembly of Human Serum Albumin: A Simplex Phenomenon
Thakur, Garima; Prashanthi, Kovur; Jiang, Keren; Thundat, Thomas
2017-01-01
Spontaneous self-assemblies of biomolecules can generate geometrical patterns. Our findings provide an insight into the mechanism of self-assembled ring pattern generation by human serum albumin (HSA). The self-assembly is a process guided by kinetic and thermodynamic parameters. The generated protein ring patterns display a behavior which is geometrically related to an n-simplex model and is explained through thermodynamics and chemical kinetics. PMID:28930179
Computational Tools and Facilities for the Next-Generation Analysis and Design Environment
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)
1997-01-01
This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.
Classification and assessment tools for structural motif discovery algorithms.
Badr, Ghada; Al-Turaiki, Isra; Mathkour, Hassan
2013-01-01
Motif discovery is the problem of finding recurring patterns in biological data. Patterns can be sequential, mainly when discovered in DNA sequences. They can also be structural (e.g. when discovering RNA motifs). Finding common structural patterns helps to gain a better understanding of the mechanism of action (e.g. post-transcriptional regulation). Unlike DNA motifs, which are sequentially conserved, RNA motifs exhibit conservation in structure, which may be common even if the sequences are different. Over the past few years, hundreds of algorithms have been developed to solve the sequential motif discovery problem, while less work has been done for the structural case. In this paper, we survey, classify, and compare different algorithms that solve the structural motif discovery problem, where the underlying sequences may be different. We highlight their strengths and weaknesses. We start by proposing a benchmark dataset and a measurement tool that can be used to evaluate different motif discovery approaches. Then, we proceed by proposing our experimental setup. Finally, results are obtained using the proposed benchmark to compare available tools. To the best of our knowledge, this is the first attempt to compare tools solely designed for structural motif discovery. Results show that the accuracy of discovered motifs is relatively low. The results also suggest a complementary behavior among tools where some tools perform well on simple structures, while other tools are better for complex structures. We have classified and evaluated the performance of available structural motif discovery tools. In addition, we have proposed a benchmark dataset with tools that can be used to evaluate newly developed tools.
Beigh, Mohammad Muzafar
2016-01-01
Humans have suspected a relationship between heredity and disease for a long time, but only at the beginning of the last century did scientists begin to discover the connections between different genes and disease phenotypes. Recent trends in next-generation sequencing (NGS) technologies have brought great momentum to biomedical research, which in turn has remarkably augmented our basic understanding of human biology and its associated diseases. State-of-the-art next-generation biotechnologies have started making huge strides in our current understanding of the mechanisms of various chronic illnesses such as cancers, metabolic disorders, and neurodegenerative anomalies. We are experiencing a renaissance in biomedical research primarily driven by next-generation biotechnologies such as genomics, transcriptomics, proteomics, metabolomics, and lipidomics. Although genomic discoveries are at the forefront of next-generation omics technologies, their implementation in the clinical arena has been painstakingly slow, mainly because of high reaction costs and the unavailability of the requisite computational tools for large-scale data analysis. However, rapid innovations and the steadily lowering cost of sequence-based chemistries, along with the development of advanced bioinformatics tools, have lately prompted the launch and implementation of large-scale massively parallel genome sequencing programs in fields ranging from medical genetics and infectious disease biology to the agricultural sciences. Recent advances in large-scale omics technologies are taking healthcare research beyond the traditional "bench to bedside" approach towards a continuum that will include improvements in public healthcare and will be primarily based on the predictive, preventive, personalized, and participatory (P4) medicine approach.
Recent large-scale research projects in genetic and infectious disease biology have indicated that massively parallel whole-genome/whole-exome sequencing, transcriptome analysis, and other functional genomic tools can reveal a large number of unique functional elements and/or markers that would otherwise go undetected by traditional sequencing methodologies. Therefore, the latest trends in biomedical research are giving birth to a new branch of medicine commonly referred to as personalized and/or precision medicine. Developments in the post-genomic era are expected to completely restructure the present clinical pattern of disease prevention and treatment, as well as methods of diagnosis and prognosis. The next important step towards the precision/personalized medicine approach should be its early adoption in clinics for future medical interventions. Consequently, in coming years, next-generation biotechnologies will reorient medical practice towards disease prediction and prevention rather than curing diseases at later stages of their development and progression, even at the wider population level for the general public healthcare system. PMID:28930123
R-CMap-An open-source software for concept mapping.
Bar, Haim; Mentch, Lucas
2017-02-01
Planning and evaluating projects often involves input from many stakeholders. Fusing and organizing many different ideas, opinions, and interpretations into a coherent and acceptable plan or project evaluation is challenging. This is especially true when seeking contributions from a large number of participants, particularly when not all can participate in group discussions, or when some prefer to contribute their perspectives anonymously. One of the major breakthroughs in the area of evaluation and program planning has been the use of graphical tools to represent the brainstorming process. This provides a quantitative framework for organizing ideas and general concepts into simple-to-interpret graphs. We developed a new, open-source concept mapping software called R-CMap, which is implemented in R. This software provides a graphical user interface to guide users through the analytical process of concept mapping. The R-CMap software allows users to generate a variety of plots, including cluster maps, point rating and cluster rating maps, as well as pattern matching and go-zone plots. Additionally, R-CMap is capable of generating detailed reports that contain useful statistical summaries of the data. The plots and reports can be embedded in Microsoft Office tools such as Word and PowerPoint, where users may manually adjust various plot and table features to achieve the best visual results in their presentations and official reports. The graphical user interface of R-CMap allows users to define cluster names, change the number of clusters, select rating variables for relevant plots, and importantly, select subsets of respondents by demographic criteria. The latter is particularly useful to project managers in order to identify different patterns of preferences by subpopulations. R-CMap is user-friendly, and does not require any programming experience.
However, proficient R users can add to its functionality by directly accessing built-in functions in R and sharing new features with the concept mapping community. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hayasaki, Yoshio
2017-02-01
Femtosecond laser processing is a promising tool for fabricating novel and useful structures on the surfaces of and inside materials. An enormous number of pulse irradiation points is required to fabricate actual structures at the millimeter scale, and therefore the throughput of femtosecond laser processing must be improved for practical adoption of this technique. One promising method to improve throughput is parallel pulse generation based on a computer-generated hologram (CGH) displayed on a spatial light modulator (SLM), a technique called holographic femtosecond laser processing. The holographic method has advantages such as high throughput, high light-use efficiency, and variable, instantaneous, and 3D patterning. Furthermore, the use of an SLM gives the ability to correct unknown imperfections of the optical system and inhomogeneity in a sample through in-system optimization of the CGH. The CGH can also adaptively compensate for dynamic, unpredictable mechanical movements, air and liquid disturbances, and shape variation and deformation of the target sample, as well as provide adaptive wavefront control for environmental changes. It is therefore a powerful tool for processing biological cells and tissues, because they have free-form, variable, and deformable structures. In this paper, we present the principle and experimental setup of holographic femtosecond laser processing and effective ways of processing biological samples. We demonstrate femtosecond laser processing of biological materials and its processing properties.
CGDV: a webtool for circular visualization of genomics and transcriptomics data.
Jha, Vineet; Singh, Gulzar; Kumar, Shiva; Sonawane, Amol; Jere, Abhay; Anamika, Krishanpal
2017-10-24
Interpretation of large-scale data is very challenging, and there is currently a scarcity of web tools that support automated visualization of a variety of high-throughput genomics and transcriptomics data for a wide variety of model organisms, along with user-defined karyotypes. A circular plot provides holistic visualization of high-throughput large-scale data, but it is very complex and challenging to generate, as most of the available tools require informatics expertise to install and run. We have developed CGDV (Circos for Genomics and Transcriptomics Data Visualization), a webtool based on Circos, for seamless and automated visualization of a variety of large-scale genomics and transcriptomics data. CGDV takes the output of analyzed genomics or transcriptomics data in different formats, such as vcf, bed, xls, tab-delimited matrix text files, raw CNVnator output, and raw gene fusion output, to plot a circular view of the sample data. CGDV takes care of generating the intermediate files required for Circos. CGDV is freely available at https://cgdv-upload.persistent.co.in/cgdv/ . The circular plot for each data type is tailored to give the best biological insights into the data. The inter-relationships between data points, homologous sequences, genes involved in fusion events, differential expression patterns, sequencing depth, the types and sizes of variations, and the enrichment of DNA-binding proteins can be seen using CGDV. CGDV thus helps biologists and bioinformaticians to visualize a variety of genomics and transcriptomics data seamlessly.
Individually Coded Telemetry: a Tool for Studying Heart Rate and Behaviour in Reindeer Calves
Eloranta, E; Norberg, H; Nilsson, A; Pudas, T; Säkkinen, H
2002-01-01
The aim of the study was to test the performance of a silver wire modified version of the coded telemetric heart rate monitor Polar Vantage NV™ (PVNV) and to measure heart rate (HR) in a group of captive reindeer calves during different behaviour. The technical performance of PVNV HR monitors was tested in cold conditions (-30°C) using a pulse generator and the correlation between generated pulse and PVNV values was high (r = 0.9957). The accuracy was tested by comparing the HR obtained with the PVNV monitor with the standard ECG, and the correlation was significant (r = 0.9965). Both circadian HR and HR related to behavioural pattern were recorded. A circadian rhythm was observed in the HR in reindeer with a minimum during night and early morning hours and maximum at noon and during the afternoon, the average HR of the reindeer calves studied being 42.5 beats/min in February. The behaviour was recorded by focal individual observations and the data was synchronized with the output of the HR monitors. Running differed from all other behavioural categories in HR. Inter-individual differences were seen expressing individual responses to external and internal stimuli. The silver wire modified Polar Vantage NV™ provides a suitable and reliable tool for measuring heart rate in reindeer, also in natural conditions. PMID:12564543
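The validation step quoted above (r = 0.9957 against a pulse generator, r = 0.9965 against ECG) is a Pearson correlation between paired readings. A pure-Python sketch of that computation, with hypothetical readings:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired series, as used to
    validate the modified HR monitor against a reference signal."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical monitor vs. reference readings; a perfect linear offset gives r = 1.
r = pearson_r([40, 50, 60, 70], [41, 51, 61, 71])
```

Values near 1, as reported for the PVNV monitor, indicate the device tracks the reference linearly even if it carries a constant offset.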
The heparanome--the enigma of encoding and decoding heparan sulfate sulfation.
Lamanna, William C; Kalus, Ina; Padva, Michael; Baldwin, Rebecca J; Merry, Catherine L R; Dierks, Thomas
2007-04-30
Heparan sulfate (HS) is a cell surface carbohydrate polymer modified with sulfate moieties whose highly ordered composition is central to directing specific cell signaling events. The ability of the cell to generate these information rich glycans with such specificity has opened up a new field of "heparanomics" which seeks to understand the systems involved in generating these cell type and developmental stage specific HS sulfation patterns. Unlike other instances where biological information is encrypted as linear sequences in molecules such as DNA, HS sulfation patterns are generated through a non-template driven process. Thus, deciphering the sulfation code and the dynamic nature of its generation has posed a new challenge to system biologists. The recent discovery of two sulfatases, Sulf1 and Sulf2, with the unique ability to edit sulfation patterns at the cell surface, has opened up a new dimension as to how we understand the regulation of HS sulfation patterning and pattern-dependent cell signaling events. This review will focus on the functional relationship between HS sulfation patterning and biological processes. Special attention will be given to Sulf1 and Sulf2 and how these key editing enzymes might act in concert with the HS biosynthetic enzymes to generate and regulate specific HS sulfation patterns in vivo. We will further explore the use of knock out mice as biological models for understanding the dynamic systems involved in generating HS sulfation patterns and their biological relevance. A brief overview of new technologies and innovations summarizes advances in the systems biology field for understanding non-template molecular networks and their influence on the "heparanome".
NetList(+): A simple interface language for chip design
NASA Astrophysics Data System (ADS)
Wuu, Tzyh-Yung
1991-04-01
NetList(+) is a design specification language developed at MOSIS for rapid-turnaround cell-based ASIC prototyping. By using NetList(+), a uniform representation is achieved for the specification, simulation, and physical description of a design. The goal is to establish an interfacing methodology between design specification and independent computer-aided design tools. Designers need only specify a system by writing a corresponding netlist. This netlist is used for both functional simulation and timing simulation. The same netlist is also used to drive the low-level physical tools that generate layout. Another goal of using NetList(+) is to generate parts of a design by running it through different kinds of placement and routing (P and R) tools. For example, some parts of a design may be generated by standard-cell P and R tools; other parts may be generated by a layout tiler, i.e., a datapath compiler, RAM/ROM generator, or PLA generator. Finally, all the different parts of a design can be integrated into a single chip by general block P and R tools. The NetList(+) language can thus act as an interface among tools. Section 2 shows a flowchart illustrating the NetList(+) system and its relation to other design tools. Section 3 shows how to write a NetList(+) description from the block diagram of a circuit. Section 4 discusses how to prepare one or several cell libraries for a design system. Section 5 gives a few designs written in NetList(+) and shows their simulation and layout results.
Video Games as a Training Tool to Prepare the Next Generation of Cyber Warriors
2014-10-01
[Report front matter and table-of-contents residue; the recoverable section headings are: The Cybersecurity Workforce Shortage; Greater Cybersecurity Education; How Video Games Can Be Effective Learning Tools.]
The Exercise: An Exercise Generator Tool for the SOURCe Project
ERIC Educational Resources Information Center
Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios
2016-01-01
The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…
Slade, Brendan; Parrott, Marissa L.; Paproth, Aleisha; Magrath, Michael J. L.; Gillespie, Graeme R.; Jessop, Tim S.
2014-01-01
Captive breeding is a high profile management tool used for conserving threatened species. However, the inevitable consequence of generations in captivity is broad scale and often-rapid phenotypic divergence between captive and wild individuals, through environmental differences and genetic processes. Although poorly understood, mate choice preference is one of the changes that may occur in captivity that could have important implications for the reintroduction success of captive-bred animals. We bred wild-caught house mice for three generations to examine mating patterns and reproductive outcomes when these animals were simultaneously released into multiple outdoor enclosures with wild conspecifics. At release, there were significant differences in phenotypic (e.g. body mass) and genetic measures (e.g. Gst and F) between captive-bred and wild adult mice. Furthermore, 83% of offspring produced post-release were of same source parentage, inferring pronounced assortative mating. Our findings suggest that captive breeding may affect mating preferences, with potentially adverse implications for the success of threatened species reintroduction programmes. PMID:25411380
Emergent mechanics of biological structures
Dumont, Sophie; Prakash, Manu
2014-01-01
Mechanical force organizes life at all scales, from molecules to cells and tissues. Although we have made remarkable progress unraveling the mechanics of life's individual building blocks, our understanding of how they give rise to the mechanics of larger-scale biological structures is still poor. Unlike the engineered macroscopic structures that we commonly build, biological structures are dynamic and self-organize: they sculpt themselves and change their own architecture, and they have structural building blocks that generate force and constantly come on and off. A description of such structures defies current traditional mechanical frameworks. It requires approaches that account for active force-generating parts and for the formation of spatial and temporal patterns utilizing a diverse array of building blocks. In this Perspective, we term this framework “emergent mechanics.” Through examples at molecular, cellular, and tissue scales, we highlight challenges and opportunities in quantitatively understanding the emergent mechanics of biological structures and the need for new conceptual frameworks and experimental tools on the way ahead. PMID:25368421
A generative spike train model with time-structured higher order correlations.
Trousdale, James; Hu, Yu; Shea-Brown, Eric; Josić, Krešimir
2013-01-01
Emerging technologies are revealing the spiking activity in ever larger neural ensembles. Frequently, this spiking is far from independent, with correlations in the spike times of different cells. Understanding how such correlations impact the dynamics and function of neural ensembles remains an important open problem. Here we describe a new, generative model for correlated spike trains that can exhibit many of the features observed in data. Extending prior work in mathematical finance, this generalized thinning and shift (GTaS) model creates marginally Poisson spike trains with diverse temporal correlation structures. We give several examples which highlight the model's flexibility and utility. For instance, we use it to examine how a neural network responds to highly structured patterns of inputs. We then show that the GTaS model is analytically tractable, and derive cumulant densities of all orders in terms of model parameters. The GTaS framework can therefore be an important tool in the experimental and theoretical exploration of neural dynamics.
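The thinning-and-shift construction at the heart of the GTaS model can be sketched in a few lines: spikes of a "mother" Poisson process are copied into each daughter train with some probability and then shifted in time, yielding correlated trains with Poisson marginals. The rates, probabilities, and shifts below are illustrative assumptions, and this minimal two-train version omits the higher-order structure of the full model:

```python
import random

def gtas_sketch(rate=20.0, T=10.0, p=(0.5, 0.5), shifts=(0.0, 0.005), seed=0):
    """Minimal thinning-and-shift sketch in the spirit of the GTaS model:
    each spike of a mother Poisson process (rate in Hz, duration T in s) is
    copied into daughter train k independently with probability p[k] and
    shifted by shifts[k] (s). Thinning a Poisson process leaves it Poisson,
    so each daughter is marginally Poisson with rate p[k] * rate."""
    rng = random.Random(seed)
    t, mother = 0.0, []
    while True:
        t += rng.expovariate(rate)       # exponential inter-spike intervals
        if t > T:
            break
        mother.append(t)
    trains = [[] for _ in p]
    for s in mother:
        for k, pk in enumerate(p):
            if rng.random() < pk:
                trains[k].append(s + shifts[k])
    return mother, trains

mother, trains = gtas_sketch()
```

Shared mother spikes produce cross-correlations at the lag set by the shift differences, which is how the model shapes temporal correlation structure.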
Microprocessors as a tool in determining correlation between sferics and tornado genesis: an update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witte, D.R.
1980-09-01
Sferics - atmospheric electromagnetic radiation - can be directly correlated, it is believed, to the genesis of tornadoes and other severe weather. Sferics are generated by lightning and other atmospheric disturbances that are not yet entirely understood. The recording and analysis of the patterns in which sferics events occur, it is hoped, will lead to accurate real-time prediction of tornadoes and other severe weather. Collection of the tremendous amount of sferics data generated by one storm system becomes cumbersome when correlation between at least two stations is necessary for triangulation. Microprocessor-based computing systems have made the task of data collection and manipulation inexpensive and manageable. The original paper on this subject delivered at MAECON '78 dealt with hardware interfacing. Presented were hardware and software tradeoffs, as well as design and construction techniques to yield a cost-effective system. This updated paper presents an overview of where the data comes from, how it is collected, and some current manipulation and interpretation techniques used.
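The two-station triangulation the abstract alludes to reduces, in the simplest planar approximation, to intersecting two bearing rays. This toy sketch ignores Earth curvature and measurement error, and the coordinates and bearing convention are assumptions for illustration:

```python
import math

def triangulate(p1, b1, p2, b2):
    """Locate a sferics source from two stations' bearings (planar sketch):
    station positions in km, bearings in degrees counter-clockwise from the
    +x axis. Returns None when the bearings are parallel (no fix)."""
    x1, y1 = p1
    x2, y2 = p2
    d1x, d1y = math.cos(math.radians(b1)), math.sin(math.radians(b1))
    d2x, d2y = math.cos(math.radians(b2)), math.sin(math.radians(b2))
    det = d1x * d2y - d1y * d2x
    if abs(det) < 1e-12:
        return None
    # distance along station 1's bearing ray to the intersection
    t = ((x2 - x1) * d2y - (y2 - y1) * d2x) / det
    return (x1 + t * d1x, y1 + t * d1y)

# Stations at (0, 0) and (100, 0) both sighting a source at (50, 50).
fix = triangulate((0.0, 0.0), 45.0, (100.0, 0.0), 135.0)
```

The data-volume problem the paper describes comes from running this intersection for every sferics event across all station pairs in real time.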
Using GIS to generate spatially balanced random survey designs for natural resource applications.
Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B
2007-07-01
Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring designs demands survey designs that achieve statistical rigor and efficiency yet remain flexible to the inevitable logistical or practical constraints of field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers with a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
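The flavor of quadrant-recursive spatial balancing can be sketched as follows (a loose illustration of hierarchical randomized addressing; this is not the published Reversed Randomized Quadrant-Recursive Raster algorithm, and the grid size and sample size are invented):

```python
import random

rng = random.Random(1)

def make_orders(levels):
    """One shuffled quadrant order per hierarchy level, shared by all cells
    (illustrative randomized quadrant-recursive addressing)."""
    return [rng.sample(range(4), 4) for _ in range(levels)]

def quadrant_address(x, y, size, orders):
    """Hierarchical address of cell (x, y); sorting by these addresses
    interleaves the quadrants, spreading a systematic sample in space."""
    addr = []
    half = size // 2
    for order in orders:
        quad = (1 if x >= half else 0) + 2 * (1 if y >= half else 0)
        addr.append(order[quad])
        x, y = x % half, y % half
        half //= 2
    return tuple(addr)

# Address every cell of an 8x8 grid, sort, and take a systematic sample.
size, levels, n_sample = 8, 3, 8
orders = make_orders(levels)
cells = [(x, y) for x in range(size) for y in range(size)]
ordered = sorted(cells,
                 key=lambda c: quadrant_address(c[0], c[1], size, orders))
step = len(ordered) // n_sample
sample = ordered[rng.randrange(step)::step][:n_sample]
```

Because cells of the same quadrant are contiguous in the sorted order at every level of the hierarchy, a systematic draw from that order spreads the sample evenly in space (here, exactly two cells per top-level quadrant).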
Modeling hippocampal neurogenesis using human pluripotent stem cells.
Yu, Diana Xuan; Di Giorgio, Francesco Paolo; Yao, Jun; Marchetto, Maria Carolina; Brennand, Kristen; Wright, Rebecca; Mei, Arianna; McHenry, Lauren; Lisuk, David; Grasmick, Jaeson Michael; Silberman, Pedro; Silberman, Giovanna; Jappelli, Roberto; Gage, Fred H
2014-03-11
The availability of human pluripotent stem cells (hPSCs) offers the opportunity to generate lineage-specific cells to investigate mechanisms of human diseases specific to brain regions. Here, we report a differentiation paradigm for hPSCs that enriches for hippocampal dentate gyrus (DG) granule neurons. This differentiation paradigm recapitulates the expression patterns of key developmental genes during hippocampal neurogenesis, exhibits characteristics of neuronal network maturation, and produces PROX1+ neurons that functionally integrate into the DG. Because hippocampal neurogenesis has been implicated in schizophrenia (SCZD), we applied our protocol to SCZD patient-derived human induced pluripotent stem cells (hiPSCs). We found deficits in the generation of DG granule neurons from SCZD hiPSC-derived hippocampal NPCs with lowered levels of NEUROD1, PROX1, and TBR1, reduced neuronal activity, and reduced levels of spontaneous neurotransmitter release. Our approach offers important insights into the neurodevelopmental aspects of SCZD and may be a promising tool for drug screening and personalized medicine.
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
NASA Astrophysics Data System (ADS)
Van Den Broeke, Douglas J.; Laidig, Thomas L.; Chen, J. Fung; Wampler, Kurt E.; Hsu, Stephen D.; Shi, Xuelong; Socha, Robert J.; Dusa, Mircea V.; Corcoran, Noel P.
2004-08-01
Imaging contact and via layers continues to be one of the major challenges to be overcome for 65nm node lithography. Initial results of using ASML MaskTools' CPL Technology to print contact arrays through pitch have demonstrated the potential to further extend contact imaging to a k1 near 0.30. While there are advantages and disadvantages to any potential RET, the benefits of not having to solve the phase assignment problem (which can lead to unresolvable phase conflicts), of being a single-reticle, single-exposure technique, and of applying to multiple layers within a device (clear field and dark field) make CPL an attractive, cost-effective solution to low-k1 imaging. However, real semiconductor circuit designs consist of much more than regular arrays of contact holes, and a method to define the CPL reticle design for a full-chip circuit pattern is required for this technique to be feasible in volume manufacturing. Interference Mapping Lithography (IML) is a novel approach for defining optimum reticle patterns based on the imaging conditions that will be used when the wafer is exposed. Figure 1 shows an interference map for an isolated contact simulated using ASML /1150 settings of 0.75NA and 0.92/0.72/30deg Quasar illumination. This technique provides a model-based approach for placing all types of features (scattering bars, anti-scattering bars, non-printing assist features, phase-shifted and non-phase-shifted) to enhance the resolution of the target pattern, and it can be applied to any reticle type, including binary (COG), attenuated phase-shifting mask (attPSM), alternating-aperture phase-shifting mask (altPSM), and CPL. In this work, we investigate the application of IML to generate CPL reticle designs for random contact patterns typical of 65nm node logic devices.
We examine the critical issues of using CPL with Interference Mapping Lithography, including control of side-lobe printing, contact patterns with odd symmetry, forbidden-pitch regions, and reticle manufacturing constraints. Multiple methods for deriving the interference map used to define reticle patterns for various RETs will be discussed. CPL reticle designs created by implementing automated algorithms for contact pattern decomposition using MaskWeaver will also be presented.
TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, P; Patankar, A; Etmektzoglou, A
Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory-log-based QA/beam delivery analyzer. Methods: The research beam builder can generate a TrueBeam-readable XML file either from scratch or from pre-existing DICOM-RT plans. A DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify or add control points. The delivered beam can be verified by reading the generated images and analyzing the trajectory log files. The image reader handles both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of the MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages over standalone software are i) no software installation or maintenance is needed, ii) easy accessibility across all devices, iii) seamless upgrades, and iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points, and kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app hosting a pool of research tools that facilitate research from conceptualization to verification, aimed at providing a platform for research and collaboration. I am a full-time employee of Varian Medical Systems, Palo Alto.
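As a sketch of what such a beam-builder backend might do (the element and attribute names below are purely hypothetical placeholders, not the actual TrueBeam Developer Mode XML schema), control points can be serialized with Python's standard xml.etree.ElementTree:

```python
import xml.etree.ElementTree as ET

def build_beam(control_points):
    """Serialize a list of control-point dicts to an XML string.
    NOTE: tag names (Beam, ControlPoint, GantryRtn, Mu) are invented
    for illustration and do not match any real TrueBeam schema."""
    beam = ET.Element("Beam")
    cps = ET.SubElement(beam, "ControlPoints")
    for cp in control_points:
        node = ET.SubElement(cps, "ControlPoint")
        ET.SubElement(node, "GantryRtn").text = str(cp["gantry"])
        ET.SubElement(node, "Mu").text = str(cp["mu"])
    return ET.tostring(beam, encoding="unicode")

xml_text = build_beam([{"gantry": 0.0, "mu": 0.0},
                       {"gantry": 90.0, "mu": 50.0}])
```

A real converter would additionally map each DICOM-RT control point's axes (MLC leaves, collimator, couch) into whatever schema the delivery system expects.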
Stobbe, Nina; Westphal-Fitch, Gesche; Aust, Ulrike; Fitch, W. Tecumseh
2012-01-01
Artificial grammar learning (AGL) provides a useful tool for exploring rule learning strategies linked to general purpose pattern perception. To directly compare the performance of humans with that of other species having different memory capacities, we developed an AGL task in the visual domain. Presenting entire visual patterns simultaneously instead of sequentially minimizes the amount of required working memory. This approach allowed us to evaluate performance levels of two bird species, kea (Nestor notabilis) and pigeons (Columba livia), in direct comparison to human participants. After being trained to discriminate between two types of visual patterns generated by rules at different levels of computational complexity and presented on a computer screen, birds and humans received further training with a series of novel stimuli that followed the same rules but differed in various visual features from the training stimuli. Most avian and all human subjects continued to perform well above chance during this initial generalization phase, suggesting that they were able to generalize the learned rules to novel stimuli. However, detailed testing with stimuli that violated the intended rules regarding the exact number of stimulus elements indicates that neither bird species was able to successfully acquire the intended pattern rule. Our data suggest that, in contrast to humans, these birds were unable to master a simple rule above the finite-state level, even with simultaneous item presentation and despite intensive training. PMID:22688635
NASA Astrophysics Data System (ADS)
Baccar, D.; Söffker, D.
2017-11-01
Acoustic Emission (AE) is a suitable method for monitoring the health of composite structures in real time. However, AE-based failure mode identification and classification remain complex to apply because AE waves are generally released simultaneously from all AE-emitting damage sources. Hence, advanced signal processing techniques must be combined with pattern recognition approaches. In this paper, AE signals generated from laminated carbon fiber reinforced polymer (CFRP) subjected to an indentation test are examined and analyzed, and a new pattern recognition approach involving a number of processing steps suitable for real-time implementation is developed. Unlike common classification approaches, only Continuous Wavelet Transform (CWT) coefficients are extracted as relevant features. First, the CWT is applied to the AE signals; a dimensionality reduction process using Principal Component Analysis (PCA) is then carried out on the coefficient matrices. The PCA-based feature distribution is analyzed using Kernel Density Estimation (KDE), allowing the determination of a specific pattern for each fault-specific AE signal. Moreover, the waveform and frequency content of the AE signals are examined in depth and compared with fundamental assumptions reported in this field. A correlation between the identified patterns and failure modes is achieved. The introduced method improves damage classification and can be used as a non-destructive evaluation tool.
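The CWT → PCA → KDE pipeline can be sketched as follows, using synthetic stand-ins for the fault-specific AE bursts (the data, wavelet choice, and scales are illustrative assumptions, not the paper's settings):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet, used here as a stand-in mother wavelet."""
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_features(signal, scales):
    """CWT coefficient matrix: one row per scale, via direct convolution."""
    return np.array([np.convolve(signal, ricker(10 * a, a), mode="same")
                     for a in scales])

# Synthetic stand-ins for two classes of damped AE bursts (10 Hz and 40 Hz).
signals = [np.sin(2 * np.pi * f * np.linspace(0, 1, 256)) *
           np.exp(-5 * np.linspace(0, 1, 256)) +
           0.05 * rng.standard_normal(256)
           for f in (10, 10, 40, 40)]
scales = np.arange(1, 9)
X = np.array([cwt_features(s, scales).ravel() for s in signals])

# PCA via SVD of the centered feature matrix (dimensionality reduction).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:2].T          # first two principal components

# KDE over the leading component characterizes the feature distribution.
kde = gaussian_kde(pcs[:, 0])
```

In this sketch the two burst classes separate along the first principal component, so their KDE modes act as class-specific signatures, mirroring the fault-specific patterns described above.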
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool for performing probabilistic risk assessment, uncertainty quantification, and verification and validation. It is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data; post-processing and analyzing such data can, in some cases, take longer than the initial software runtime. Data mining algorithms help in recognizing and understanding patterns in the data, and thus in discovering knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes such as RAVEN: RAVEN introduces both deterministic and stochastic elements into the simulation, while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is analyzing the large number of scenarios generated; data mining techniques are typically used to better organize and understand the data, i.e., to recognize patterns in it. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms and the application of these algorithms to different databases.
Simulation of California's Major Reservoirs Outflow Using Data Mining Technique
NASA Astrophysics Data System (ADS)
Yang, T.; Gao, X.; Sorooshian, S.
2014-12-01
The outflow of a reservoir is controlled by its operators and thus differs from the upstream inflow; for downstream water users, the outflow matters more than the inflow. To simulate the complicated operations of California's 12 major reservoirs and extract their outflow decision-making patterns, we built a data-driven, computer-based ("artificially intelligent") reservoir decision-making tool using a decision regression and classification tree approach, a well-developed statistical and graphical modeling methodology in the field of data mining. A shuffled cross-validation approach is also employed to extract the outflow decision-making patterns and rules based on the selected decision variables (inflow amount, precipitation, timing, water year type, etc.). To show the accuracy of the model, a verification study compares the model-generated outflow decisions ("artificially intelligent" decisions) with those made by reservoir operators (human decisions). The simulation results show that the machine-generated outflow decisions are very similar to the real operators' decisions, a conclusion based on statistical evaluations using the Nash-Sutcliffe test. The proposed model is able to detect the most influential variables and their weights when reservoir operators make an outflow decision. While the proposed approach was first applied and tested on California's 12 major reservoirs, the method is universally adaptable to other reservoir systems.
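A minimal sketch of the classification-and-regression-tree idea, fit to a toy operating rule (the rule, variables, and thresholds are invented for illustration and are not the study's data or model):

```python
import numpy as np

def fit_tree(X, y, depth=3, min_leaf=5):
    """Minimal CART-style regression tree: greedy variance-reduction
    splits on single features (illustrative sketch only)."""
    if depth == 0 or y.size < 2 * min_leaf:
        return ("leaf", y.mean())
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[1:]:
            left = X[:, j] < t
            if left.sum() < min_leaf or (~left).sum() < min_leaf:
                continue
            sse = (y[left].var() * left.sum() +
                   y[~left].var() * (~left).sum())
            if best is None or sse < best[0]:
                best = (sse, j, t, left)
    if best is None:
        return ("leaf", y.mean())
    _, j, t, left = best
    return ("node", j, t,
            fit_tree(X[left], y[left], depth - 1, min_leaf),
            fit_tree(X[~left], y[~left], depth - 1, min_leaf))

def predict(tree, x):
    while tree[0] == "node":
        _, j, t, lo, hi = tree
        tree = lo if x[j] < t else hi
    return tree[1]

# Toy "operating rule": outflow tracks inflow, capped in the wet season.
rng = np.random.default_rng(0)
inflow = rng.uniform(0, 100, 400)
month = rng.integers(1, 13, 400)
outflow = np.where((month <= 3) & (inflow > 60), 60.0, inflow * 0.8)
X = np.column_stack([inflow, month])
tree = fit_tree(X, outflow, depth=4)
```

The fitted tree recovers threshold-style rules ("if wet-season and inflow high, cap the release") of the kind the abstract extracts from the operators' historical decisions.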